The gap assessment is the most consequential analytical activity in the entire ISO 27001 implementation. It is the point where the organization compares what it actually has against what ISO 27001 requires and produces an honest inventory of the work ahead.
Done well, a gap assessment delivers three things simultaneously: a clear picture of current security maturity, a prioritized list of what needs to be built or improved, and a resource-estimated implementation roadmap that allows accurate planning and realistic commitment to a certification timeline. Done poorly — rushed, superficial, or conducted with a bias toward finding a favorable result — it produces false confidence that leads to failed certification attempts and wasted resources.
This article covers the complete gap assessment process: how to structure the assessment, what evidence to collect, how to score gaps consistently, how to present findings to management, and how to build the remediation roadmap that will drive Phases 3 and 4 of the implementation.
What a Gap Assessment Is — and Is Not
A gap assessment is not a risk assessment. It does not quantify information security risks or select controls. It assesses the maturity of the organization's current practices against the requirements of ISO 27001:2022 — identifying where requirements are not met and estimating what is needed to meet them.
A gap assessment is not a compliance checklist. Checking whether a policy document exists is not the same as assessing whether the policy is current, approved, communicated, and being followed in practice. The best gap assessments test both the existence and the operational reality of controls and processes.
A gap assessment is not a one-time document. The findings feed directly into the risk assessment (which assets have inadequate controls?), the risk treatment plan (which gaps need to be closed?), the SoA (which Annex A controls are partially implemented?), and the implementation roadmap (what needs to be done, by whom, and by when?). The gap assessment is the foundation on which the rest of the implementation is built.
| THE HONEST ASSESSMENT PRINCIPLE | The most common gap assessment failure is confirmation bias — finding what is expected rather than what is actually there. An honest gap assessment tests claims rather than accepting them, samples evidence rather than reviewing summaries, and scores a practiced but undocumented control as a gap rather than giving credit for informal practice. |
Choosing the Right Assessment Approach
There is no single prescribed methodology. Different approaches suit different organizational contexts. The table below describes four main approaches, when each is most appropriate, and what they produce:
| Approach | Best for | How it works | Output | Effort |
| Clause-by-Clause Review | First-cycle implementations with limited prior ISMS experience | Work through Clauses 4–10 systematically. For each sub-clause: does evidence exist? Is it adequate? Is it current? Score 0–3. | Clause conformance matrix with maturity scores and remediation priorities. | 3–5 days |
| Annex A Controls Assessment | Organizations with existing security programs wanting to map current controls | Work through all 93 Annex A controls. For each: applicable? Implemented? Effective? Document evidence. | Pre-SoA control coverage matrix showing which controls are partially or fully in place. | 4–8 days |
| Evidence-Based Walkthrough | Organizations claiming existing controls that need independent verification | Request evidence for claimed controls and test them: review access configs, pull audit logs, sample training records, test procedures. | Verified control inventory distinguishing 'controls that exist' from 'controls that function'. | 5–10 days |
| Combined Assessment (recommended) | Most practical for first-cycle implementations | Combine clause review with Annex A mapping. Clause review assesses management system maturity; Annex A mapping covers technical controls. | Integrated gap report with clause gaps, control gaps, priority heat map, and remediation roadmap. | 6–12 days |
For most first-cycle implementations, the Combined Assessment approach produces the most useful output. It takes more time, but the integrated view — showing both management system gaps and control gaps — gives the implementation team and executive sponsor a complete picture that supports better planning decisions.
The Maturity Scoring Framework
Consistent scoring transforms a gap assessment from narrative observation into a usable planning tool. A 0–3 maturity scale — simple enough to apply consistently, granular enough to distinguish meaningfully different states — works well for ISO 27001 gap assessments:
| Score | Level | Definition | Typical characteristics | Implementation action |
| 0 | Not Started | No evidence the requirement has been considered. | No risk assessment performed. No IS policy exists. | Build from scratch. Full documentation and implementation required. |
| 1 | Initial / Ad Hoc | Requirement partially addressed but informally — undocumented practice or tools without process. | Some access controls exist but no formal policy or review process. | Formalize and document. Assign ownership. Establish evidence collection. |
| 2 | Documented & Defined | Requirement addressed through documented, approved processes mostly followed in practice. | IS policy exists and signed. Risk methodology documented. Access reviews defined and usually followed. | Focus on consistency, evidence collection, and closing document-to-practice gaps. |
| 3 | Managed & Effective | Requirement fully met. Consistent operation, evidence retained, controls effective, actively monitored. | Risk register current and reviewed regularly. Access reviews on schedule with documented evidence. | Maintain. Include in internal audit sampling to verify continued effectiveness. |
| Scoring discipline: The most common mistake is scoring a 2 (Documented & Defined) when evidence only supports a 1 (Initial / Ad Hoc). If the policy exists but is unapproved and not followed, that is a 1. If the access review is documented but 14 months overdue against a quarterly policy, that is a 1 or low-2. Honest scoring is more useful than optimistic scoring — it produces a remediation plan that is adequately resourced. |
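To make the scale concrete, the scoring model can be captured in a few lines of code. This is an illustrative sketch only — the `GapFinding` structure, domain names, and `domain_maturity` helper are hypothetical, not part of ISO 27001 or any assessment tool:

```python
from dataclasses import dataclass

# The 0-3 maturity scale from the table above.
MATURITY_LABELS = {
    0: "Not Started",
    1: "Initial / Ad Hoc",
    2: "Documented & Defined",
    3: "Managed & Effective",
}

@dataclass
class GapFinding:
    requirement: str   # e.g. "6.1 Risk assessment"
    domain: str        # e.g. "Planning" (hypothetical grouping)
    score: int         # 0-3, scored against evidence, not claims

    def __post_init__(self):
        if self.score not in MATURITY_LABELS:
            raise ValueError(f"score must be 0-3, got {self.score}")

def domain_maturity(findings):
    """Average maturity per domain -- the basis of an executive summary view."""
    by_domain = {}
    for f in findings:
        by_domain.setdefault(f.domain, []).append(f.score)
    return {d: round(sum(s) / len(s), 2) for d, s in by_domain.items()}

findings = [
    GapFinding("6.1 Risk assessment", "Planning", 0),
    GapFinding("6.2 IS objectives", "Planning", 0),
    GapFinding("5.2 IS policy", "Leadership", 1),
]
print(domain_maturity(findings))  # → {'Planning': 0.0, 'Leadership': 1.0}
```

The validation in `__post_init__` enforces the scoring-discipline point above: a finding that cannot be placed on the 0–3 scale against evidence is not a finding yet.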
Evidence Collection: What to Gather and How
A gap assessment is only as good as its evidence base. Evidence collection uses four techniques: document review, interviews, observation, and technical testing. Before the assessment begins, request a pre-assessment evidence package from the organization. The checklist below covers the standard categories:
| Governance & Leadership |
| Risk & Planning |
| People & Awareness |
| Technical Controls |
| Supplier & Third-Party |
| Incident & Operations |
Interview Approach
Interviews are the most efficient way to identify gaps that documents do not reveal. The critical question is not 'do you have a process for X?' but 'walk me through what happens when X occurs'. The first question tests documentation; the second tests operational reality.
Key interview subjects: ISMS Lead / CISO (management system maturity), Head of IT / Infrastructure (technical control status), a software developer (secure development practices), HR representative (onboarding and offboarding process), a business unit manager (risk owner awareness), and a non-technical staff member (awareness level, policy knowledge).
| The non-technical staff interview is the most revealing test of awareness effectiveness. Ask a customer service representative: 'If you received an urgent password reset email, what would you do?' and 'If you found a USB drive in the car park, what would you do?' The answers tell you more about security culture than all training completion records combined. |
The Gap Heat Map: Sample Assessment Output
The gap heat map is the primary output of the assessment phase — a single view of the organization's maturity across ISO 27001 requirements, combined with priority and effort ratings. The sample below reflects a realistic assessment for an Indonesian fintech company at an early implementation stage:
| Requirement area | Score | Priority | Effort | Gap narrative and action |
| 4.1 Context of organization | 1 | High | Medium | Issues register exists informally; needs documentation and regulatory mapping to UU PDP, POJK, PBI |
| 4.2 Interested parties | 0 | High | Low | No formal register. Quick win — create stakeholder register within 2 weeks |
| 4.3 ISMS scope | 2 | Medium | Low | Verbal scope exists; needs formal documented scope statement |
| 5.1 Leadership commitment | 1 | Critical | Medium | CEO aware of ISMS project but not formally engaged. Requires management review structure and policy sign-off |
| 5.2 Information security policy | 1 | Critical | Medium | Draft policy from 2022 — outdated, unapproved, not covering UU PDP. Needs full redraft |
| 5.3 Roles and responsibilities | 1 | High | Low | CISO role exists; no RACI, no risk owner assignments, no formal ISMS role documentation |
| 6.1 Risk assessment | 0 | Critical | High | No formal risk assessment performed. Highest priority gap — drives all subsequent control selection |
| 6.2 IS objectives | 0 | High | Low | No documented objectives. Set after risk assessment is complete |
| 7.1 Resources | 2 | Medium | Low | Security budget exists; documentation of resource allocation and adequacy assessment needed |
| 7.2 Competence | 1 | High | Medium | Team competent technically; no competence framework, no training records retained |
| 7.3 Awareness | 1 | High | Medium | Informal security briefings; no formal program, no completion records, no phishing simulation |
| 7.5 Documented information | 1 | High | High | Documents exist but no version control, no approval records — full document control system needed |
| 8.1 Operational controls | 2 | Medium | High | Controls exist informally; supplier contracts lack security addenda; change management has no security gate |
| 9.1 Monitoring & measurement | 0 | High | Medium | No KPIs, no metrics program, no systematic performance monitoring |
| 9.2 Internal audit | 0 | High | Medium | No internal audit program. Must be operational before Stage 2 audit |
| 9.3 Management review | 0 | Critical | Low | No management review conducted. Must have at least one completed before Stage 2 |
| Annex A — Org controls (5.x) | 1 | High | High | Many controls partially in place; information classification, threat intelligence, supplier mgmt need formalization |
| Annex A — Tech controls (8.x) | 2 | Medium | High | MFA partial (privileged only), SIEM under-monitored, vulnerability scanning needs scheduling |
Reading this heat map, a clear pattern emerges: the management system foundations — risk assessment, management review, IS objectives, interested parties register — are at 0. These must be addressed in Phase 3. The technical controls are in better shape — mostly 1 or 2 — meaning there is a foundation to build on. This pattern is extremely common: technical controls grow organically, but the management system infrastructure never gets built because nobody has led that work.
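A heat map like the one above can also be turned into a first-pass remediation ordering mechanically. The sketch below is a toy heuristic, not a prescribed method: the rank tables and the sort key (priority first, then low effort for quick wins, then lowest maturity) are hypothetical choices:

```python
# Illustrative prioritisation heuristic for heat-map rows: critical gaps first;
# within the same priority, low-effort items first (quick wins); then lowest score.
PRIORITY_RANK = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}
EFFORT_RANK = {"Low": 0, "Medium": 1, "High": 2}

def remediation_order(rows):
    """rows: list of (requirement, score, priority, effort) tuples."""
    return sorted(rows, key=lambda r: (PRIORITY_RANK[r[2]], EFFORT_RANK[r[3]], r[1]))

heat_map = [
    ("4.2 Interested parties", 0, "High", "Low"),
    ("6.1 Risk assessment",    0, "Critical", "High"),
    ("9.3 Management review",  0, "Critical", "Low"),
    ("4.3 ISMS scope",         2, "Medium", "Low"),
]
for requirement, *_ in remediation_order(heat_map):
    print(requirement)
```

A pure sort is only a starting point: dependency sequencing still overrides it — the risk assessment, for example, must come early regardless of effort because later deliverables depend on it.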
| Bitlion assessment module: Bitlion's platform includes a structured gap assessment workflow guiding the assessor through all Clause 4–10 requirements and all 93 Annex A controls, capturing maturity scores, gap narratives, and evidence references. The platform generates the heat map automatically and exports a formatted report for executive presentation. |
Common Gap Patterns in Indonesian Organizations
Across gap assessments conducted in Indonesian regulated industries, a consistent set of gaps appears with high frequency. Understanding these patterns helps implementation teams allocate effort appropriately:
| Common gap | Severity | Frequency | Timeline implication | Implementation fix |
| No formal risk assessment | Critical | Very common | Blocks all Annex A control selection. Must be the first Phase 3 deliverable. Expect 4–6 weeks. | Prioritize risk methodology as Phase 3 starting point. All documentation depends on it. |
| Policy documents outdated or unapproved | Major | Common | Existing policies (often 2–4 years old) cannot be patched — require full redraft aligned to current scope and regulations. | Treat existing policies as reference only. Draft new policy suite aligned to current scope and regulatory context. |
| No management review history | Major | Very common | Stage 2 requires evidence of at least one management review. Must be scheduled 3–6 months before Stage 2. | Schedule first management review no later than 3 months before planned Stage 2 date. |
| Supplier contracts lack security addenda | High | Very common | Adding security addenda to critical supplier agreements takes 4–12 weeks due to negotiation cycles. | Identify top 5–8 critical suppliers early in Phase 3. Begin contract amendment in parallel with policy development. |
| No internal audit competence | Major | Common | Internal audit must complete before Stage 2. Auditor training takes 3–5 days. Program design takes 2–3 weeks. | Schedule internal auditor training early in Phase 3. Design audit program immediately after training. |
| Access reviews never conducted | High | Common | Evidence of at least one completed review cycle required before Stage 2. First review may uncover significant stale access. | Conduct first access review as part of Phase 4. Build recurring calendar entry. |
| No security awareness training program | High | Common | Completion records for all in-scope staff required for Stage 2. LMS setup and delivery takes 4–6 weeks. | Select and deploy LMS early in Phase 4. Set completion target 4 weeks before Stage 2 audit. |
| Technical controls partially deployed | Variable | Very common | MFA admin-only, scanning unconfigured, logging without alerts — partial deployment means partial evidence. | Gap assessment must specifically test 'is this control deployed AND functional?' not just 'does it exist?' |
These gaps — particularly the absence of a formal risk assessment, outdated or unapproved policies, and no management review history — appear in the majority of first-cycle Indonesian implementations. They are the predictable result of security programs that grew organically without an ISMS framework. The gap assessment makes them visible. The remediation roadmap closes them systematically.
From Gap Assessment to Remediation Roadmap
The gap assessment report is an analytical output. The remediation roadmap is the planning output — translating gap findings into a sequenced, resourced action plan. A good roadmap sequences by dependencies (risk assessment before SoA), front-loads quick wins (building momentum), estimates effort realistically, and defines clear outputs for each wave that can be verified.
| Wave 1 · Weeks 1–4 — Quick wins and critical blockers |
| Wave 2 · Weeks 5–10 — Foundation build — risk and documentation infrastructure |
| Wave 3 · Weeks 11–18 — Core ISMS build — risk assessment and policy suite |
| Wave 4 · Weeks 19–30 — Control implementation — close technical and process gaps |
| Wave 5 · Weeks 31–42 — Pre-certification — audit, review, certify |
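The "sequence by dependencies" rule above is a topological ordering problem, and the standard library can express it directly. The dependency map below is a hypothetical simplification for illustration; real roadmaps carry many more items and constraints:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical dependency map: each deliverable lists what it depends on.
# Mirrors the sequencing rule above, e.g. risk assessment before the SoA.
deps = {
    "Risk methodology":    set(),
    "Risk assessment":     {"Risk methodology"},
    "Risk treatment plan": {"Risk assessment"},
    "SoA":                 {"Risk treatment plan"},
    "Policy suite":        {"Risk assessment"},
    "Internal audit":      {"SoA", "Policy suite"},
    "Management review":   {"Internal audit"},
}

# static_order() yields a valid sequence: every dependency before its dependents.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is a useful sanity check when a roadmap has been assembled by several people.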
Presenting Gap Findings to the Executive Sponsor
The gap assessment report is a technical document. Before it goes to the executive sponsor, translate it into a management briefing answering three questions: where do we stand, what does it cost to close the gaps, and when can we realistically certify?
The executive briefing should cover: a maturity summary by domain (not a 100-point gap list), the three to five most critical gaps and their business risk implications, the resource and timeline implications of the remediation roadmap, and a recommended go/no-go decision on proceeding to Phase 3 with the proposed scope and resource level.
| Gap assessment as scope validator: A well-conducted gap assessment sometimes reveals that the provisional scope needs adjustment — a system assumed in-scope is not connected to production, or a supplier relationship discovered during assessment is processing in-scope personal data. The gap assessment is the right moment to validate and refine scope — before Phase 3 documentation work begins. |
The Gap Assessment as a Foundation of Confidence
A thorough, honest gap assessment is an act of organizational confidence, not an admission of weakness. Every organization starting an ISO 27001 implementation has gaps — the question is how significant they are and how long they take to close. An organization that knows its gaps accurately can plan effectively, resource appropriately, and commit to a certification timeline with genuine confidence.
The organizations that go into Phase 3 with a clear, honestly scored gap assessment consistently execute better and certify faster than those that conduct a superficial assessment and discover real gaps during the internal audit or Stage 2 certification audit. The assessment investment — typically 6–12 days of focused work — pays returns many times over in the quality of the implementation plan it enables.
Know your gaps. Score them honestly. Plan to close them systematically. That is how certification happens on schedule.