Performance Evaluation and Monitoring

The Check phase of the PDCA cycle asks one essential question: is the management system actually working? Clause 9 of ISO 27001 is the structured answer — a set of requirements that force the organization to look honestly at its own performance, subject its practices to independent scrutiny, and bring the results before top management for review and decision.

The three components of Clause 9 are distinct but interconnected. Monitoring and measurement (9.1) provides the data. Internal audit (9.2) provides the independent verification. Management review (9.3) provides the governance response. All three are required. All three must produce documented evidence. And all three are among the most heavily scrutinized elements in a certification audit — because they are the components that show whether the ISMS is a living management system or a one-time documentation exercise.

This article covers each component with the depth practitioners need: what the standard actually requires, how to build programs that satisfy both auditors and organizational needs, and the common failure patterns that generate audit findings in each area.

Clause 9 at a Glance

Clause 9 contains three sub-clauses that form the Check phase of the ISMS's operational cycle:

9.1 Monitoring & Measurement: track ISMS performance against defined criteria.

9.2 Internal Audit: provide independent, objective evidence of conformance.

9.3 Management Review: top management evaluates the continuing suitability of the ISMS.

The outputs of all three sub-clauses feed into Clause 10 — the improvement phase. Monitoring results identify control gaps. Audit findings produce corrective actions. Management review decisions drive ISMS changes and improvements. Clause 9 is the feedback mechanism that makes the ISMS self-correcting.

Clause 9.1 — Monitoring, Measurement, Analysis, and Evaluation

Clause 9.1 requires the organization to determine what needs to be monitored and measured, the methods for doing so, when the monitoring will be performed, when results will be analyzed and evaluated, and who is responsible. The results must be retained as documented information.

The standard is deliberately flexible — it does not prescribe specific metrics or measurement methods. This flexibility is both an opportunity and a trap. Organizations that design their monitoring program thoughtfully produce a set of metrics that give genuine visibility into ISMS health. Organizations that populate their metrics register with measures that are easy to collect but not informative produce dashboards that look complete but tell management nothing useful.

The Distinction Between Monitoring and Measurement

ISO 27001 draws a distinction between monitoring (observing a system to detect changes) and measurement (assigning values to characteristics). In ISMS practice, monitoring typically covers continuous or periodic observation of security controls and events — access logs, vulnerability scan results, incident alerts. Measurement covers the quantification of ISMS performance — completion rates, response times, finding counts. Both are required because monitoring without measurement produces data without insight, and measurement without monitoring misses real-time signals.

ISMS KPI Reference: Metrics by Domain

The following tables provide a reference set of ISMS KPIs organized by domain, including targets, measurement frequency, and data sources. These represent a practical starting point — organizations should adapt targets to their risk tolerance and baseline performance:

Risk Management

Metric | Target | Frequency | Data source
% of high/critical risks with treatment plan on schedule | > 95% | Monthly | Risk treatment plan tracker
Number of unaccepted risks above acceptance threshold | 0 | Monthly | Risk register
Mean time from risk identification to risk owner assignment | < 5 working days | Monthly | Risk register timestamps
Risk register last reviewed / updated | < 90 days ago | Monthly | Risk register version history

Control Effectiveness

Metric | Target | Frequency | Data source
% of applicable Annex A controls with implementation evidence | > 98% | Quarterly | SoA implementation status
Critical vulnerability remediation within SLA | > 95% within 30 days | Monthly | Vulnerability management tool
Access review completion rate (quarterly user access reviews) | 100% | Quarterly | Access review records
% of privileged accounts with MFA enabled | 100% | Monthly | Identity management system

People & Awareness

Metric | Target | Frequency | Data source
Annual security awareness training completion rate | > 98% (all staff) | Monthly tracking | LMS completion records
Phishing simulation click-through rate | < 5% (trending down) | Quarterly simulation | Phishing simulation platform
Security incidents caused by human error | Trending down YoY | Quarterly | Incident register
New staff security onboarding within 30 days | 100% | Monthly | HR onboarding records

Incident Management

Metric | Target | Frequency | Data source
Mean time to detect (MTTD) — security incidents | < 24 hours (target) | Per incident + monthly average | Incident log
Mean time to respond (MTTR) — security incidents | < 4 hours for P1/P2 | Per incident + monthly average | Incident log
% of incidents with post-incident review completed | 100% for P1/P2; > 80% for P3 | Monthly | Post-incident review records
Regulatory notification compliance (UU PDP 14-day obligation) | 100% on-time | Per applicable incident | Notification log

ISMS Program Health

Metric | Target | Frequency | Data source
Internal audit findings closed within target timeframe | > 90% within 60 days | Monthly | Corrective action register
Management review conducted as scheduled | 100% on schedule | Per review cycle | Management review calendar / minutes
Supplier security reviews completed on schedule | > 95% | Quarterly | Supplier review tracker
ISMS documents reviewed within the document review schedule | 100% within schedule | Monthly | Document control register

What Makes Metrics Meaningful

 

A metric is meaningful if it tells management something they can act on. 'Number of security policies in place' is a count, not a metric — it communicates nothing about security effectiveness. 'Percentage of critical vulnerabilities remediated within SLA' is a metric — it shows whether the vulnerability management program is functioning and creates accountability for improvement.
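A metric like 'percentage of critical vulnerabilities remediated within SLA' is also cheap to compute once discovery and remediation timestamps are available. A minimal sketch — the records below are invented for illustration, and the 30-day SLA mirrors the KPI table above; adapt both to your vulnerability tool's export format:

```python
from datetime import datetime, timedelta

# Illustrative records only: each tuple is (discovered, remediated) for one
# critical vulnerability. The 30-day SLA is the example target from the
# KPI reference above, not a prescribed value.
SLA = timedelta(days=30)

vulns = [
    (datetime(2026, 1, 5), datetime(2026, 1, 20)),   # closed in 15 days
    (datetime(2026, 1, 10), datetime(2026, 2, 25)),  # closed in 46 days -> SLA breach
    (datetime(2026, 2, 1), datetime(2026, 2, 14)),   # closed in 13 days
]

within_sla = sum(1 for found, fixed in vulns if fixed - found <= SLA)
pct_within_sla = 100.0 * within_sla / len(vulns)

print(f"{pct_within_sla:.1f}% of critical vulnerabilities remediated within SLA (target: > 95%)")
```

Because the value is derived rather than self-reported, the same calculation can run every month against fresh data and feed the trend line management review expects.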

 

The best ISMS metrics are: specific enough to be unambiguous, measurable without excessive manual effort, tied to ISMS objectives or risk treatment priorities, reported consistently so trends are visible, and connected to decision-making — when the metric goes outside target, someone has authority and responsibility to act.
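That last property — someone with authority to act when a metric goes outside target — can be wired directly into the reporting pipeline when targets are machine-checkable. A sketch; the metric names, values, thresholds, and owners here are invented for illustration, loosely echoing the KPI tables above:

```python
from dataclasses import dataclass
from typing import Callable

# Design point: every metric carries a machine-checkable target and a
# named owner with responsibility to act when the target is breached.
@dataclass
class Metric:
    name: str
    value: float
    within_target: Callable[[float], bool]
    owner: str

metrics = [
    Metric("Privileged accounts with MFA (%)", 97.0, lambda v: v >= 100.0, "Head of IT"),
    Metric("Awareness training completion (%)", 99.1, lambda v: v > 98.0, "HR Manager"),
    Metric("Phishing click-through rate (%)", 6.2, lambda v: v < 5.0, "CISO"),
]

breaches = [m for m in metrics if not m.within_target(m.value)]
for m in breaches:
    print(f"OUT OF TARGET: {m.name} = {m.value} -> escalate to {m.owner}")
```

The escalation output is the kind of artifact that demonstrates Clause 9.1 analysis and evaluation actually occurred, rather than data merely being collected.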

 

Bitlion GRC metrics integration: Bitlion's platform aggregates ISMS performance data from connected tools — vulnerability scanners, access management systems, training platforms, incident management tools — into a unified ISMS dashboard. Metrics are calculated automatically from operational data, eliminating manual reporting overhead and ensuring that KPI data is available in real time for management review preparation.

 

Clause 9.2 — Internal Audit

 

Clause 9.2 requires the organization to conduct internal audits at planned intervals to provide information on whether the ISMS conforms to the organization's own requirements and to the requirements of ISO 27001, and whether the ISMS is effectively implemented and maintained.

 

Internal audit is perhaps the most misunderstood requirement in ISO 27001. Some organizations treat it as a pre-audit before the external certification audit — a rehearsal for the real thing. Others treat it as a documentation review, checking whether required policies and procedures exist. Neither approach captures what ISO 27001 actually requires: an independent, systematic evaluation of whether the ISMS is operating as designed and whether it is effective.

 

Building an Effective Internal Audit Program

 

An effective internal audit program is not a single event — it provides systematic coverage of the entire ISMS over time, prioritized by risk. The six components of a complete audit program are mapped below to what each requires and the evidence it produces:

 

Audit Universe
What it requires: Define what can be audited — all clauses 4–10, all applicable Annex A control domains, all in-scope business units, and all in-scope supplier relationships.
Output / evidence: Documented audit universe register listing all auditable areas and their associated ISMS requirements.

Risk-Based Scheduling
What it requires: Higher-risk areas should be audited more frequently. Areas with previous nonconformities should be prioritized. New or changed processes warrant early audit. Not all areas need to be audited every cycle.
Output / evidence: Annual internal audit schedule showing which areas will be audited when, the rationale for scheduling decisions, and the audit objectives for each area.

Auditor Independence
What it requires: Auditors must be objective and impartial — they cannot audit their own work. In small organizations, this may require using a different staff member for different audit areas, or engaging an external auditor for areas where internal independence cannot be achieved.
Output / evidence: Documented auditor assignments showing independence from audited areas; auditor competence records.

Audit Execution
What it requires: Auditors collect evidence through document review, staff interviews, system observation, and log sampling. Evidence must be sufficient to support conclusions. Findings must be documented with specific clause or control references.
Output / evidence: Audit working papers, evidence samples, and draft findings for auditee review before report finalization.

Audit Reporting
What it requires: Audit reports must communicate findings clearly — distinguishing major and minor nonconformities from observations and opportunities for improvement. Reports must be issued to management and the relevant auditee.
Output / evidence: Formal audit report with finding classification, clause references, evidence citations, and recommendations, issued within 5 working days of audit completion.

Corrective Action Follow-up
What it requires: Auditors do not just identify nonconformities — they verify that corrective actions are adequate and implemented. Follow-up audits or evidence reviews confirm closure. Unresolved findings escalate to management review.
Output / evidence: Corrective action register entries linked to audit findings; closure verification records; escalation records for overdue CARs.
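The risk-based scheduling component can be sketched as a simple prioritization over the audit universe. The area names, weights, and scoring formula below are illustrative assumptions, not a prescribed method — the point is that scheduling decisions have a documented, repeatable rationale:

```python
# Each auditable area carries a risk rating, the months since its last
# audit, and its count of open nonconformities. All values are invented
# for illustration.
audit_universe = [
    # (area, risk rating 1-5, months since last audit, open nonconformities)
    ("Access control (Annex A identity controls)", 5, 14, 2),
    ("Supplier security", 4, 10, 0),
    ("HR / awareness", 2, 8, 1),
    ("Physical security", 2, 20, 0),
]

def priority(risk, months_since, open_ncs):
    # Higher risk, staler coverage, and unresolved findings all raise
    # priority. The weights are assumptions to be tuned per organization.
    return risk * 2 + months_since / 6 + open_ncs * 3

schedule = sorted(audit_universe, key=lambda a: priority(a[1], a[2], a[3]), reverse=True)
for rank, (area, *_rest) in enumerate(schedule, start=1):
    print(f"{rank}. {area}")
```

Retaining the scores alongside the annual schedule gives an external auditor exactly what Clause 9.2 asks for: the rationale behind which areas are audited when.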

 

Audit Finding Classification

 

How audit findings are classified matters — both for how they are communicated to management and for how they affect certification decisions. The four finding types used in ISO 27001 audits are:

 

Major Nonconformity
Definition: A failure to meet a requirement that results in the ISMS's inability to achieve its intended outcomes, a systemic breakdown of a management system requirement, or the total absence of a required element.
Examples: No risk assessment has ever been performed. The Statement of Applicability does not exist. No management reviews have been conducted. Top management has no involvement in the ISMS.
Consequence for certification: Certification cannot be issued or is suspended until the major NC is resolved and verified through a follow-up audit. Corrective action plan required within 30 days.

Minor Nonconformity
Definition: An isolated lapse in meeting a requirement that does not indicate a systemic failure and does not prevent the ISMS from achieving its intended outcomes.
Examples: Access review completed 4 months ago instead of on the quarterly schedule in the policy. One of 47 staff members has not completed this year's awareness training. A single risk register entry has no risk owner assigned.
Consequence for certification: Corrective action required and documented. Closure verification at the next surveillance audit. Does not prevent certification if it is an isolated finding.

Observation / Opportunity for Improvement
Definition: Not a nonconformity — a situation where the auditor believes the organization could improve its ISMS effectiveness or reduce future risk of nonconformity.
Examples: Risk treatment plan format could be more structured to improve tracking. Management review agenda could benefit from a threat intelligence input. Supplier review cycle could be shortened for critical providers.
Consequence for certification: No mandatory corrective action required. The organization should consider whether to address it; unaddressed observations may become nonconformities at future audits.

Positive Finding
Definition: An area of notable good practice that exceeds standard requirements — documented by auditors as exemplary and worth sharing.
Examples: ISMS integrated into the product development lifecycle with automated security gates in CI/CD. Real-time risk dashboard accessible to all risk owners. Threat intelligence subscription actively feeding risk assessments.
Consequence for certification: No action required. Recognition of best practice; may be cited in the audit report as an example for other audit areas or organizations.

 

A common internal audit mistake: Organizations that conduct internal audits exclusively as documentation reviews — checking whether policies exist, whether records are filed, whether forms are complete — consistently miss operational nonconformities. The most significant ISMS failures are operational: controls that were implemented but are no longer functioning, processes that are documented but not followed, risks that have been identified but whose treatment is stalled. Effective internal audits include operational evidence testing — not just document review, but observation, staff interviews, system log sampling, and configuration verification.

 

Internal vs. External Audit: Understanding the Relationship

 

Internal audits and external certification audits serve different purposes and have different standing. Internal audits are conducted by or on behalf of the organization — they are a management tool for self-assessment. External audits are conducted by an accredited certification body — they are an independent third-party verification.

 

External auditors review the internal audit program and its outputs. They are looking for evidence that the internal audit program is genuine — that it covers the full scope, produces real findings, and drives corrective action. An internal audit program that consistently produces zero findings, that never audits the areas where external auditors find problems, or that produces findings that are closed without adequate corrective action signals that the program is not functioning as intended. Experienced external auditors adjust their focus based on what the internal audit has and has not found.

 

Clause 9.3 — Management Review

 

Clause 9.3 requires top management to review the organization's ISMS at planned intervals to ensure its continuing suitability, adequacy, effectiveness, and alignment with the strategic direction of the organization. The review must consider eight specific inputs and produce documented outputs covering continual improvement decisions, ISMS change decisions, and resource decisions.

 

Management review is the highest-level governance activity in the ISMS. It is where top management acts on the evidence produced by monitoring and audit — making decisions that determine the ISMS's direction, resource allocation, and improvement priorities. It is not a status report meeting. It is not an opportunity for the CISO to present a dashboard and receive a nod. It is a structured governance decision-making process, and the minutes must reflect that.

 

Required Inputs: What Must Be Addressed

 

ISO 27001 is specific about what the management review must address. All eight required inputs are mandatory — management reviews that skip any of them are nonconforming. Each required input is mapped below to what it must cover and the supporting evidence that should be available:

 

Status of actions from previous reviews (9.3.2(a))
What to cover: Open action items from the last review — completion status, overdue items, and any changes to action owners or timelines. This creates continuity between reviews.
Supporting evidence: Action log from previous minutes; updated status for each item.

Changes in external and internal issues (9.3.2(b))
What to cover: Significant changes to the context identified in Clause 4.1 — new regulations, significant business changes, major new threats or incidents in the sector. Changes that have affected, or should affect, the ISMS scope or risk assessment.
Supporting evidence: Updated context analysis or issue log; regulatory update briefing; risk assessment update records.

Information security performance — trends and metrics (9.3.2(c))
What to cover: The monitoring and measurement results from Clause 9.1 — KPI dashboard, control effectiveness metrics, audit findings, incident statistics, awareness training outcomes. Presented as trends, not one-time snapshots.
Supporting evidence: ISMS performance dashboard; monthly metrics reports; incident register summary.

Fulfillment of information security objectives (9.3.2(d))
What to cover: Progress against the objectives set in Clause 6.2 — are we on track? What obstacles exist? Should any objectives be revised given changed circumstances?
Supporting evidence: Objectives register with current status; action plan completion rates; milestone achievement records.

Feedback from interested parties (9.3.2(e))
What to cover: Client feedback on security matters, regulatory communications, supplier security incidents, and findings from client or regulatory audits of the organization's security posture.
Supporting evidence: Client security questionnaire responses; regulatory correspondence; supplier audit results; client satisfaction data.

Results of risk assessment and status of treatment plan (9.3.2(f))
What to cover: Summary of risk register status — new high/critical risks identified, treatment plan execution progress, residual risks above the acceptance threshold, risk owner acceptance status.
Supporting evidence: Risk register executive summary; treatment plan status report; risk owner acceptance records.

Results of internal audit (9.3.2(g))
What to cover: Audit findings from all internal audits conducted since the last management review — nonconformities, observations, closure status of corrective actions, and recurring findings that warrant management attention.
Supporting evidence: Internal audit reports; corrective action register; finding trend analysis.

Opportunities for continual improvement (9.3.2(h))
What to cover: Proposed improvements to the ISMS — from audit observations, staff feedback, lessons learned from incidents, benchmarking against peer organizations, or management's own assessment of ISMS maturity.
Supporting evidence: Improvement proposals; post-incident lessons learned; management observations documented in minutes.
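Because all eight inputs are mandatory, a pre-review completeness check on the draft agenda is a cheap safeguard against the most common nonconformity. A sketch — the clause-reference tags and the deliberately incomplete agenda below are hypothetical; map them to your own minutes template:

```python
# Input labels follow the eight-input list in this article; the agenda
# set is an invented, deliberately incomplete example.
REQUIRED_INPUTS = {
    "9.3.2(a)": "Status of actions from previous reviews",
    "9.3.2(b)": "Changes in external and internal issues",
    "9.3.2(c)": "Information security performance trends and metrics",
    "9.3.2(d)": "Fulfillment of information security objectives",
    "9.3.2(e)": "Feedback from interested parties",
    "9.3.2(f)": "Risk assessment results and treatment plan status",
    "9.3.2(g)": "Results of internal audit",
    "9.3.2(h)": "Opportunities for continual improvement",
}

minutes_agenda = {"9.3.2(a)", "9.3.2(c)", "9.3.2(d)", "9.3.2(g)"}

missing = sorted(set(REQUIRED_INPUTS) - minutes_agenda)
for ref in missing:
    print(f"MISSING INPUT {ref}: {REQUIRED_INPUTS[ref]}")
```

Running such a check before the review is circulated converts a recurring audit finding into a solved problem.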

 

Required Outputs: What the Review Must Decide

 

A management review that concludes without documented decisions is a discussion, not a governance activity. Clause 9.3.3 requires specific outputs — and those outputs must be retained as documented information. The required outputs are defined below, with examples of what a meaningful minute entry for each looks like:

 

Decisions on continual improvement opportunities
What it must contain: Specific improvements approved with resource allocation, responsible owners, and target dates. Not vague commitments — actionable decisions.
Example from minutes: Approved: implement automated access review tooling by Q3 2026. Owner: Head of IT. Budget: approved IDR 85M.

Decisions on any need for changes to the ISMS
What it must contain: Changes to scope, policy, objectives, risk assessment methodology, or controls — approved through management review as the Clause 6.3 change planning mechanism.
Example from minutes: Approved: expand ISMS scope to include the mobile banking application from Q4 2026. Scope statement to be updated by the ISMS Manager.

Resource needs identified and decisions made
What it must contain: Where the review identifies resource gaps — staffing, tooling, budget — decisions on how to address them must be documented with management approval.
Example from minutes: Approved: hire a dedicated ISMS Analyst to support growing compliance requirements. HR to open a requisition by end of month.

 

How Often to Hold Management Reviews

 

The standard requires management reviews 'at planned intervals' — it does not specify the frequency. Annual management reviews are common and typically sufficient for stable organizations. More frequent reviews are appropriate when the ISMS is newly implemented, when significant changes are occurring, when audit findings are numerous, or when the regulatory environment is changing rapidly.

 

For Indonesian organizations in 2026 — navigating active UU PDP implementation, evolving OJK IT governance requirements, and a dynamic threat landscape — quarterly management reviews are increasingly defensible as standard practice, with annual reviews representing a minimum. Regulators who investigate incidents may ask when the last management review was conducted and what risk topics were discussed. 'Twelve months ago' is a difficult answer if a significant regulatory change occurred six months ago.

Management review as regulatory defense: In the event of a regulatory investigation following a security incident, the management review records serve as evidence of governance due diligence. Minutes that show top management regularly reviewing risk status, discussing threat intelligence, allocating resources for control implementation, and making documented improvement decisions provide a materially stronger defense posture than the absence of such records. Treat management review minutes as a governance artifact, not an administrative formality.

Required Outputs from Clause 9

All four Clause 9 outputs are explicitly required by the standard's text — every one is a major nonconformity if absent:

9.1 Monitoring and measurement results (EXPLICIT)
What auditors examine: Documented results of ISMS performance monitoring — KPI data, control effectiveness evidence, incident statistics. Must be retained as documented information.

9.2 Internal audit program (EXPLICIT)
What auditors examine: The documented audit program showing which areas will be audited when, audit objectives, and the rationale for scheduling. Must be maintained and available.

9.2 Internal audit reports and findings (EXPLICIT)
What auditors examine: Formal audit reports for every audit conducted — including evidence, finding classifications, and recommendations. Retained as documented information.

9.3 Management review records / minutes (EXPLICIT)
What auditors examine: Evidence that management reviews were conducted — including attendance, all required inputs addressed, and decisions made. One of the most scrutinized documents in an external audit.

Note that all four are marked EXPLICIT — Clause 9 has the highest density of explicitly required documented information of any clause in ISO 27001. This reflects the standard's philosophy: the Check phase must produce durable, auditable evidence that the evaluation actually occurred.

Common Clause 9 Nonconformities

Monitoring metrics that are never reviewed or acted upon

An ISMS metrics dashboard that is populated but never presented at management review, never discussed by the security team, and never used to inform risk assessment updates is not monitoring — it is data collection. Clause 9.1 requires analysis and evaluation of monitoring results. Evidence of this analysis is not a dashboard; it is records showing that someone reviewed the data, drew conclusions, and took (or decided not to take) action based on those conclusions.
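What such an analysis record might look like as a structure — the field names below are assumptions, not a prescribed template; the point is that the record captures who reviewed the data, what they concluded, and what action was (or was not) decided:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative record structure: the evidence Clause 9.1 analysis
# requires beyond the dashboard itself. All values are invented.
@dataclass
class MetricReviewRecord:
    metric: str
    period: str
    reviewed_on: date
    reviewed_by: str
    conclusion: str
    action_decided: str  # may legitimately record "no action required"

record = MetricReviewRecord(
    metric="Phishing simulation click-through rate",
    period="2026-Q1",
    reviewed_on=date(2026, 4, 8),
    reviewed_by="ISMS Manager",
    conclusion="Click-through rose from 4.1% to 6.2%; target (< 5%) breached.",
    action_decided="Run targeted refresher training for affected departments by May.",
)
print(f"{record.metric}: {record.action_decided}")
```

A register of records like this, one per metric per review cycle, is the documented evaluation the clause asks for — even when the decision is explicitly to take no action.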

Internal audit that never finds nonconformities

In a mature ISMS, an internal audit program that occasionally produces zero nonconformities can be credible. In a first-cycle ISMS that is 18 months old, zero nonconformities signals that the audit is not being conducted rigorously. Real first-cycle ISMS audits find things — incomplete training records, controls not yet fully implemented, documentation with version control gaps. If the internal audit finds nothing and the external certification audit subsequently finds multiple nonconformities, the internal audit program itself becomes a finding.

 

Management reviews that cover some inputs but not all eight

It is surprisingly common for management reviews to address some required inputs — usually performance metrics and audit findings — while omitting others. Interested party feedback, risk assessment status, and opportunities for continual improvement are frequently absent from management review agendas. Clause 9.3 requires all eight. Minutes that do not address all eight inputs are nonconforming regardless of how thorough they are on the items they do cover.

Management review minutes that record no decisions

Minutes that record what was presented but not what was decided fail the output requirements of Clause 9.3.3. 'The CISO presented the risk register status' is a record of a presentation. 'Management approved the risk treatment plan update and allocated an additional IDR 120M for MFA implementation, with Head of IT responsible for delivery by Q3 2026' is a management review output. The distinction between a record of discussion and a record of decision is exactly what auditors look for.

The nominal management review pattern: The most pervasive management review failure in Indonesian mid-market organizations is the review that occurs in name but not in substance — the CEO attends, the CISO presents a prepared summary, the CEO approves the summary without detailed review, and the minutes record 'ISMS continues to operate effectively, no changes required.' This satisfies the formal requirement but not the governance intent. Auditors probe this by asking attendees specific questions about what was discussed — if the CEO cannot describe the top three risks reviewed or the improvement decisions made, the review did not function as intended.

 

Clause 9 as the ISMS's Self-Correcting Mechanism

The three components of Clause 9 work together as a self-correcting governance mechanism. Monitoring detects signals. Internal audit verifies them independently. Management review interprets them and decides how to respond. When all three are functioning genuinely — not just formally — the ISMS has the capacity to improve continuously rather than just maintain compliance.

Organizations that invest in Clause 9 properly — that build monitoring programs that inform decisions, internal audit programs that find real problems, and management reviews that produce real governance choices — consistently report that their second and third certification cycles are qualitatively different from their first. The ISMS matures because the feedback mechanisms are working. Controls that are not effective are found and fixed. Risks that have grown are identified and treated. Gaps between documented intent and operational reality are systematically closed.

Clause 9 is not just evidence collection for auditors. It is the organizational intelligence system that keeps the ISMS aligned with the real world.