Independent review of information security — ISO 27001 control 5.35

Organizations that treat their Information Security Management System (ISMS) as a set-and-forget system accumulate blind spots that auditors and attackers exploit. Controls degrade, policies drift from practice, and gaps compound unnoticed until a failed audit or security incident forces a reckoning. Independent review is the mechanism that forces objectivity into a system prone to self-assessment bias, and ISO 27001 control 5.35 is where that mechanism is formalized.

What 5.35 requires

ISO/IEC 27001:2022 Annex A control 5.35 requires organizations to schedule independent reviews of their ISMS at planned intervals. These reviews must assess the continuing suitability, adequacy, and effectiveness of the organization’s approach to managing information security, covering people, processes, and technology.

The word “independent” carries specific weight here. Reviewers must have no authority over or direct involvement in the ISMS areas they are assessing. A Chief Information Security Officer (CISO) reviewing the security program they built and manage does not meet this requirement. Independence means structural separation between the reviewer and the subject of the review. This is often where organizations struggle most with 5.35, particularly smaller teams where security responsibilities are concentrated in a few individuals. The standard doesn’t prescribe a specific organizational structure, but it does require that whoever conducts the review can assess the ISMS without bias or conflict of interest.

Reviews should occur at least annually as a baseline, and the standard also requires trigger-based reviews when significant changes occur. These triggers include new regulatory requirements, major security incidents, business restructuring, mergers or acquisitions, and infrastructure overhauls such as cloud migrations. Organizations that only review on a fixed calendar miss the point. The ISMS needs reassessment whenever the risk landscape shifts materially.

The review scope should cover the full breadth of the ISMS, including information security policies, risk assessment methodology and results, control implementation and operational effectiveness, incident response procedures, access management practices, and change management processes.

Once the review is complete, results must be reported to top management with documented findings, corrective actions, assigned owners, and remediation timelines. This reporting requirement is not optional. It ensures that governance failures surface at the level where resources can be allocated to fix them.

Why 5.35 matters

Organizations that skip or delay independent reviews develop institutional blind spots. Controls that were effective at certification gradually degrade as the business evolves. Policies drift from practice, access reviews lapse, and the documented ISMS no longer reflects operational reality.

An organization passes its initial ISO 27001 certification audit with a well-designed ISMS. Over the next 18 months, the company migrates core infrastructure to a public cloud provider, onboards three new Software as a Service (SaaS) vendors with access to sensitive data, and restructures its engineering team. No independent review is triggered by any of these changes. By the time the next surveillance audit arrives, the ISMS documents a security posture that no longer exists. Risk assessments reference on-premises infrastructure that has been decommissioned. Access control policies don’t account for the new cloud Identity and Access Management (IAM) model. The gap between documentation and reality has widened into a governance failure.

This is the risk class that 5.35 addresses — undetected control degradation caused by the absence of objective, independent assessment. Without it, organizations rely on the same people who built the system to evaluate whether it still works. That structural conflict of interest is precisely what independent review is designed to eliminate. The consequence isn’t theoretical. Certification bodies regularly raise nonconformities against organizations that cannot demonstrate independent review activity during surveillance audits, and major nonconformities left unresolved within the corrective action period can result in certification suspension.

What attackers exploit

When independent reviews don’t happen, specific weaknesses accumulate that threat actors target:

  • Stale risk assessments: Risk registers that no longer reflect the actual threat landscape leave organizations defending against yesterday’s threats while today’s go unaddressed.
  • Policy-practice gaps: Controls that exist in documentation but aren’t enforced in practice create a false sense of security that attackers exploit directly.
  • Privilege creep: Unreviewed access privileges accumulate over time, giving compromised accounts far more reach than they should have.
  • Untested incident response: Procedures that haven’t been exercised or updated since initial implementation fail when they’re needed most.
  • Unvetted third-party integrations: Vendor connections added after the last review may bypass existing controls entirely, creating unmonitored entry points.
  • Incomplete audit trails: Gaps in logging and evidence collection that an independent reviewer would have flagged go unnoticed, limiting forensic capability after a breach.

How to implement 5.35

For your organization (first-party)

Building an effective independent review program requires both structural decisions and operational discipline.

1. Establish a formal review schedule. Set annual reviews as a baseline, and define trigger events that mandate additional reviews. Document these triggers in your ISMS governance framework so they aren’t left to subjective judgment. An ISO 27001 implementation checklist can help ensure these governance elements are captured from the start. Common triggers include regulatory changes, major incidents, organizational restructuring, and significant technology deployments.
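As an illustration, the annual baseline plus trigger events can be expressed as a small scheduling check. This is a hypothetical sketch, not part of the standard: the trigger names and the `review_due` helper are assumptions you would adapt to your own ISMS governance framework.

```python
from datetime import date, timedelta

# Hypothetical trigger events; adapt the names to your documented governance triggers.
TRIGGER_EVENTS = {
    "regulatory_change",
    "major_incident",
    "restructuring",
    "major_technology_deployment",
}

def review_due(last_review: date, today: date, recent_events: set[str]) -> bool:
    """A review is due when the annual baseline has elapsed or a trigger event fired."""
    annual_baseline_elapsed = today - last_review > timedelta(days=365)
    trigger_fired = bool(recent_events & TRIGGER_EVENTS)
    return annual_baseline_elapsed or trigger_fired
```

Encoding the triggers this way keeps them out of subjective judgment: a restructuring logged in June forces a review even if the last one was only five months ago.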

2. Define independence criteria. Reviewers must not have authority over or involvement in the ISMS areas they review. Document what qualifies as independent in your context, because the answer varies by organization size and structure.

3. Select qualified reviewers. Three common models work in practice:

  • Internal audit team: Reports directly to the board or audit committee, not to the CISO or IT leadership. This is the strongest internal option but requires a sufficiently resourced audit function.
  • Cross-departmental peers: Staff from other departments with security training can provide independence, though their depth of expertise may be limited.
  • External consultants or certified auditors: Particularly valuable for smaller organizations that lack internal audit capacity. Look for ISO 27001 Lead Auditor certification or equivalent credentials.

4. Define the review scope. Map the review to the full ISMS, including policies, risk assessments, control implementation, incident response, access management, and change management. Use ISO/IEC 27007 (guidelines for ISMS auditing) and ISO/IEC TS 27008 (guidelines for assessment of information security controls) as reference frameworks for structuring the review methodology. ISO 27002 implementation guidance can also help reviewers understand the intent behind specific controls.

5. Conduct the review. This involves interviewing process owners, reviewing evidence such as logs, policies, and system configurations, and performing gap analysis between documented controls and actual implementation. Effective reviews go beyond document checks. Reviewers should verify that controls are operating as described by testing samples, observing processes, and comparing documented procedures against what actually happens on the ground. A review that only confirms policies exist without checking whether they’re followed provides minimal assurance.
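The gap analysis step can be sketched as a comparison between the controls documented in the ISMS and the controls actually verified during evidence sampling. The `gap_analysis` function and the control IDs in the example are illustrative assumptions, not a prescribed method:

```python
def gap_analysis(documented: set[str], verified: set[str]) -> dict[str, set[str]]:
    """Compare control IDs documented in the ISMS against controls verified in the field."""
    return {
        # Documented but no operating evidence found — the policy-practice gap.
        "documented_not_verified": documented - verified,
        # Operating in practice but absent from ISMS documentation.
        "verified_not_documented": verified - documented,
    }

gaps = gap_analysis(
    documented={"5.1", "5.35", "8.34"},
    verified={"5.1", "8.16"},
)
```

Here `documented_not_verified` surfaces the false-sense-of-security controls, while `verified_not_documented` flags practices the ISMS has never captured. Both sets belong in the findings report.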

6. Report findings to top management. Deliver a formal report with findings, risk ratings, corrective actions, assigned owners, and target remediation dates. Management review meetings (required under ISO 27001 Clause 9.3) are the natural venue for presenting these results. The report should distinguish between major and minor findings, provide clear recommendations for each, and assign accountability to specific individuals rather than teams or departments.

7. Track remediation. Log all findings in a corrective action register and follow up in subsequent reviews. A finding without tracked remediation is a finding ignored.
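A minimal corrective action register might look like the following sketch. The `Finding` record and `overdue` helper are hypothetical; in practice the register usually lives in a GRC tool or issue tracker, but the fields shown are the ones 5.35 reporting expects.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Finding:
    finding_id: str
    owner: str          # accountable individual, not a team or department
    target_date: date   # agreed remediation deadline
    resolved: bool = False

def overdue(register: list[Finding], today: date) -> list[str]:
    """Return IDs of unresolved findings past their target remediation date."""
    return [f.finding_id for f in register
            if not f.resolved and f.target_date < today]

register = [
    Finding("F-01", "alice", date(2024, 3, 1)),
    Finding("F-02", "bob", date(2024, 9, 1)),
    Finding("F-03", "carol", date(2024, 2, 1), resolved=True),
]
```

Running `overdue(register, date(2024, 6, 1))` would surface F-01 for escalation at the next management review, while the resolved F-03 and the not-yet-due F-02 stay off the list.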

Common mistakes to avoid:

  • Assigning the review to the security team who built the ISMS. This is the most frequent independence failure. If the CISO’s team designed and operates the ISMS, they cannot independently review it.
  • Treating the review as a checkbox exercise. Reviews without documented findings, evidence review, or follow-up add no value and will not satisfy auditors.
  • Reviewing policies without verifying implementation. A paper audit that confirms policies exist without checking whether they’re followed misses the entire purpose of 5.35.
  • Failing to trigger reviews after significant changes. Annual cadence alone is insufficient when the business undergoes material changes between scheduled reviews.
  • Not reporting results to top management. Findings that never reach decision-makers with budget authority cannot drive meaningful remediation.

For your vendors (third-party assessment)

When assessing whether vendors comply with 5.35, focus on evidence of both the review process and its outcomes. Understanding the full scope of third-party risk requirements under ISO 27001 provides additional context for vendor assessments.

An ISO 27001 vendor questionnaire can formalize this process. Key questions to ask:

  • “Do you conduct independent reviews of your ISMS at planned intervals?”
  • “Who performs these reviews, and what is their relationship to the security function?”
  • “Can you provide the most recent independent review report or executive summary?”

Evidence to request:

  • Most recent independent review report (dated within the last 12 months)
  • Audit programme register showing planned and completed reviews
  • Corrective action register with remediation status for identified findings
  • Reviewer credentials or certifications demonstrating competence

Red flags that indicate weak compliance:

  • No review report within the last 12 months. This suggests reviews are either not happening or not being documented.
  • The reviewer manages the ISMS. A reviewer who both designs and evaluates the system is not independent.
  • No corrective action tracking. Findings without remediation plans indicate the review had no operational impact.
  • Consistently clean findings. Reports that note zero issues with no detail are more likely to reflect a superficial review than a mature security program.
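These red flags lend themselves to a simple screening pass over the evidence a vendor provides. The sketch below is an assumption-laden illustration: the flag names, the `red_flags` helper, and the 365-day staleness threshold are all choices you would tune to your own vendor risk program.

```python
from datetime import date

def red_flags(report_date: date, today: date,
              reviewer_manages_isms: bool,
              findings_count: int,
              has_corrective_tracking: bool) -> list[str]:
    """Screen vendor-supplied review evidence for the red flags above."""
    flags = []
    if (today - report_date).days > 365:          # no report within ~12 months
        flags.append("stale_report")
    if reviewer_manages_isms:                     # self-review is not independent
        flags.append("reviewer_not_independent")
    if not has_corrective_tracking:               # findings with no remediation plan
        flags.append("no_corrective_action_tracking")
    if findings_count == 0:                       # zero issues suggests a superficial review
        flags.append("suspiciously_clean_findings")
    return flags
```

Any non-empty result is a prompt for follow-up questions, not an automatic rejection; a vendor may have a reasonable explanation, but it should be documented.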

Verification beyond self-attestation: Request the vendor’s ISO 27001 certificate and confirm it was issued by an accredited certification body. Check accreditation against recognized schemes such as UKAS (United Kingdom), ANAB (United States), or JAS-ANZ (Australia/New Zealand). Ask for the most recent surveillance audit results, which provide third-party validation that the ISMS remains effective between certification cycles.

Audit evidence for 5.35

Auditors assessing compliance with 5.35 look for specific artifacts that demonstrate the review process is established, executed, and acted upon.

| Evidence type | Example artifact |
| --- | --- |
| Independent review policy | Policy defining review scope, frequency, independence criteria, and reporting requirements |
| Audit programme register | Schedule of planned internal and external audits with dates, scope, and assigned reviewers |
| Independent review report | Most recent report including scope, methodology, findings, and recommendations |
| Reviewer independence declaration | Signed statement confirming reviewers have no conflict of interest with areas under review |
| Corrective action register | Log of findings with assigned owners, remediation plans, target dates, and resolution status |
| Management review minutes | Meeting records showing top management received and acted on independent review findings |

The strongest compliance posture combines all six artifact types. Missing any one of them gives auditors reason to raise an observation or nonconformity, depending on the gap’s severity.

Cross-framework mapping

Control 5.35 maps to equivalent requirements across several major frameworks, making evidence reusable for organizations operating under multiple compliance obligations.

| Framework | Equivalent control(s) | Coverage |
| --- | --- | --- |
| NIST 800-53 | CA-02(01) (Independent Assessors) | Full |
| NIST CSF 2.0 | GV.OC (Organizational Context), GV.OV (Oversight) | Partial |
| SOC 2 Trust Services Criteria | CC4.1 (COSO Principle 16, Monitoring Activities) | Partial |
| CIS Controls v8.1 | 18.1 (Penetration Testing Program) | Partial (overlaps on independent assessment, narrower scope) |
| DORA (EU) | Article 6 (ICT risk management framework review) | Partial |
| CPS 230 (APRA) | Independent review of operational risk management | Partial |

Organizations already mapped to NIST 800-53 will find the closest alignment with CA-02(01), which specifically requires independent assessors for control assessments. The SOC 2 and CIS Controls mappings are partial because they address monitoring and testing respectively, but don’t fully replicate the governance and reporting requirements of 5.35. For organizations subject to DORA in the EU or CPS 230 under APRA in Australia, demonstrating compliance with 5.35 provides supporting evidence for those frameworks’ own independent review requirements, reducing duplicate effort across regulatory obligations.

Control 5.35 connects functionally to several other Annex A controls. Understanding these relationships helps ensure the independent review covers interdependent areas and that findings from the review feed into the right remediation processes.

| Control ID | Control name | Relationship |
| --- | --- | --- |
| 5.1 | Policies for information security | Reviews assess whether policies are implemented and current |
| 5.2 | Information security roles and responsibilities | Reviewer independence depends on clear role separation |
| 5.3 | Segregation of duties | Prevents self-review by ensuring duties are separated |
| 5.24 | Information security incident management planning and preparation | Review findings may trigger incident management improvements |
| 5.36 | Compliance with policies, rules, and standards | Reviews verify compliance; findings feed into 5.36 assessments |
| 5.37 | Documented operating procedures | Reviews check whether documented procedures match actual practice |
| 8.34 | Protection of information systems during audit testing | Ensures review activities don't compromise systems under assessment |

Frequently asked questions

What is ISO 27001 5.35?

ISO 27001 Annex A 5.35 is the control requiring organizations to conduct scheduled, independent assessments of their ISMS covering people, processes, and technology. The reviews must evaluate the continuing suitability, adequacy, and effectiveness of the organization’s information security approach and be performed by reviewers with no involvement in the areas being assessed.

What happens if 5.35 is not implemented?

Without independent reviews, control degradation goes undetected, governance blind spots accumulate, and the ISMS gradually drifts from operational reality. During surveillance or recertification audits, auditors will flag the absence of independent review evidence as a nonconformity, which can jeopardize ISO 27001 certification if not remediated within the required timeframe.

How do you audit 5.35?

Auditors verify that a formal review schedule exists, that reviews are conducted by individuals independent of the ISMS areas under assessment, and that findings are documented and reported to top management. Key artifacts include the independent review report, audit programme register, corrective action register with remediation tracking, and management review minutes showing leadership engagement with findings.

How UpGuard helps

Independent reviews are more effective when reviewers have access to current, comprehensive data rather than stale snapshots assembled weeks before the assessment. The UpGuard platform provides continuous visibility across your attack surface and vendor ecosystem, supporting the evidence collection and ongoing monitoring that independent reviews require.

  • Breach Risk: Continuous external attack surface monitoring gives reviewers real-time visibility into exposed assets, vulnerabilities, and configuration issues rather than point-in-time scan results.
  • Vendor Risk: Automated vendor assessments and continuous monitoring provide up-to-date evidence of third-party security posture, supporting both first-party review readiness and vendor compliance evaluation under 5.35.

When your ISMS is backed by continuous monitoring, independent reviewers spend less time collecting evidence and more time analyzing whether controls are actually working. The gap between reviews shrinks from a potential vulnerability window into a continuously observed environment.

Ready to strengthen your review readiness? Learn more about the UpGuard platform.
