Gap Analysis and Current State Assessment

The gap assessment is the most consequential analytical activity in the entire ISO 27001 implementation. It is the point where the organization looks honestly at what it has, compares it to what ISO 27001 requires, and produces a candid inventory of the work ahead.

Done well, a gap assessment delivers three things simultaneously: a clear picture of current security maturity, a prioritized list of what needs to be built or improved, and a resource-estimated implementation roadmap that allows accurate planning and realistic commitment to a certification timeline. Done poorly — rushed, superficial, or conducted with a bias toward finding a favorable result — it produces a false confidence that leads to failed certification attempts and wasted resources.

This article covers the complete gap assessment process: how to structure the assessment, what evidence to collect, how to score gaps consistently, how to present findings to management, and how to build the remediation roadmap that will drive Phases 3 and 4 of the implementation.

What a Gap Assessment Is — and Is Not

A gap assessment is not a risk assessment. It does not quantify information security risks or select controls. It assesses the maturity of the organization's current practices against the requirements of ISO 27001:2022 — identifying where requirements are not met and estimating what is needed to meet them.

A gap assessment is not a compliance checklist. Checking whether a policy document exists is not the same as assessing whether the policy is current, approved, communicated, and being followed in practice. The best gap assessments test both the existence and the operational reality of controls and processes.

A gap assessment is not a one-time document. The findings feed directly into the risk assessment (which assets have inadequate controls?), the risk treatment plan (which gaps need to be closed?), the SoA (which Annex A controls are partially implemented?), and the implementation roadmap (what needs to be done, by whom, and by when?). The gap assessment is the foundation on which the rest of the implementation is built.

The honest assessment principle: The most common gap assessment failure is confirmation bias — finding what is expected rather than what is actually there. An honest gap assessment tests claims rather than accepting them, samples evidence rather than reviewing summaries, and scores a practiced but undocumented control as a gap rather than giving credit for informal practice.

Choosing the Right Assessment Approach

There is no single prescribed methodology. Different approaches suit different organizational contexts. The table below describes four main approaches, when each is most appropriate, and what they produce:

| Approach | Best for | How it works | Output | Effort |
|---|---|---|---|---|
| Clause-by-Clause Review | First-cycle implementations with limited prior ISMS experience | Work through Clauses 4–10 systematically. For each sub-clause: does evidence exist? Is it adequate? Is it current? Score 0–3. | Clause conformance matrix with maturity scores and remediation priorities. | 3–5 days |
| Annex A Controls Assessment | Organizations with existing security programs wanting to map current controls | Work through all 93 Annex A controls. For each: applicable? Implemented? Effective? Document evidence. | Pre-SoA control coverage matrix showing which controls are partially or fully in place. | 4–8 days |
| Evidence-Based Walkthrough | Organizations claiming existing controls that need independent verification | Request evidence for claimed controls and test them: review access configs, pull audit logs, sample training records, test procedures. | Verified control inventory distinguishing 'controls that exist' from 'controls that function'. | 5–10 days |
| Combined Assessment (recommended) | Most practical for first-cycle implementations | Combine clause review with Annex A mapping. Clause review assesses management system maturity; Annex A mapping covers technical controls. | Integrated gap report with clause gaps, control gaps, priority heat map, and remediation roadmap. | 6–12 days |

For most first-cycle implementations, the Combined Assessment approach produces the most useful output. It takes more time, but the integrated view — showing both management system gaps and control gaps — gives the implementation team and executive sponsor a complete picture that supports better planning decisions.

The Maturity Scoring Framework

Consistent scoring transforms a gap assessment from narrative observation into a usable planning tool. A 0–3 maturity scale — simple enough to apply consistently, granular enough to distinguish meaningfully different states — works well for ISO 27001 gap assessments:

| Score | Level | Definition | Typical characteristics | Implementation action |
|---|---|---|---|---|
| 0 | Not Started | No evidence the requirement has been considered. | No risk assessment performed. No IS policy exists. | Build from scratch. Full documentation and implementation required. |
| 1 | Initial / Ad Hoc | Requirement partially addressed but informally — undocumented practice or tools without process. | Some access controls exist but no formal policy or review process. | Formalize and document. Assign ownership. Establish evidence collection. |
| 2 | Documented & Defined | Requirement addressed through documented, approved processes mostly followed in practice. | IS policy exists and is signed. Risk methodology documented. Access reviews defined and usually followed. | Focus on consistency, evidence collection, and closing document-to-practice gaps. |
| 3 | Managed & Effective | Requirement fully met. Consistent operation, evidence retained, controls effective, actively monitored. | Risk register current and reviewed regularly. Access reviews on schedule with documented evidence. | Maintain. Include in internal audit sampling to verify continued effectiveness. |

Scoring discipline: The most common mistake is scoring a 2 (Documented & Defined) when evidence only supports a 1 (Initial / Ad Hoc). If the policy exists but is unapproved and not followed, that is a 1. If the access review is documented but 14 months overdue against a quarterly policy, that is a 1 or low-2. Honest scoring is more useful than optimistic scoring — it produces a remediation plan that is adequately resourced.
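The conservative scoring rule in the note above can be made mechanical. The sketch below is illustrative only — the evidence attributes (`documented`, `approved`, `practiced`, and so on) are assumed simplifications of what an assessor actually observes, not part of the standard:

```python
from dataclasses import dataclass

@dataclass
class GapItem:
    requirement: str   # e.g. "9.2 Internal audit"
    documented: bool   # a written document exists
    approved: bool     # formally signed off
    practiced: bool    # followed in day-to-day operation
    evidenced: bool    # retained records prove operation
    monitored: bool    # effectiveness is actively checked

def maturity_score(item: GapItem) -> int:
    """Map observed evidence onto the 0-3 scale, scoring conservatively:
    a documented-but-unapproved or documented-but-unfollowed requirement
    stays at 1, per the scoring-discipline note."""
    if all([item.documented, item.approved, item.practiced,
            item.evidenced, item.monitored]):
        return 3   # Managed & Effective
    if item.documented and item.approved and item.practiced:
        return 2   # Documented & Defined
    if item.documented or item.practiced:
        return 1   # Initial / Ad Hoc
    return 0       # Not Started

# An unapproved draft policy that nobody follows scores 1, not 2.
draft_policy = GapItem("5.2 IS policy", documented=True, approved=False,
                       practiced=False, evidenced=False, monitored=False)
print(maturity_score(draft_policy))  # 1
```

Encoding the rule this way removes assessor-to-assessor drift: two people looking at the same evidence reach the same score.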

Evidence Collection: What to Gather and How

A gap assessment is only as good as its evidence base. Evidence collection uses four techniques: document review, interviews, observation, and technical testing. Before the assessment begins, request a pre-assessment evidence package from the organization. The checklist below covers the standard categories:

Governance & Leadership
  • Information security policy (any version — even draft)
  • Organizational chart showing security function reporting lines
  • Minutes or records of any security-related management meetings
  • Budget records showing security spend
  • Evidence of executive communication on security topics
Risk & Planning
  • Any existing risk registers, risk assessments, or vulnerability assessments
  • Previous audit reports (internal or external)
  • Regulatory correspondence referencing IT security requirements
  • Business continuity or disaster recovery documentation
  • Previous gap assessments or security reviews
People & Awareness
  • HR policies covering security (acceptable use, clean desk, etc.)
  • Training records for any security-related training
  • Onboarding checklists
  • Background check or screening policy / evidence
  • Staff awareness communications (emails, posters, briefings)
Technical Controls
  • Network architecture diagrams and asset inventory
  • Access control policy and IAM configuration records
  • Firewall and network security configurations
  • Patch management records or vulnerability scan reports
  • Backup and recovery policy and test records
  • Encryption standards documentation
  • Logging and monitoring configuration evidence
Supplier & Third-Party
  • Contracts with cloud providers, SaaS vendors, and payment processors
  • Data processing agreements (DPAs) if any exist
  • Supplier security questionnaire responses received
  • Supplier compliance certificates on file
Incident & Operations
  • Incident log or records of security events
  • Incident response procedure (if documented)
  • Change management records or procedure
  • Post-incident reviews (if conducted)
  • Any regulatory breach notifications filed
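A large evidence request is easier to manage when collection status is tracked per category. The sketch below is a hypothetical tracker, not a prescribed tool; the category and item names echo the checklist above, and the percentage rule is one simple convention:

```python
# Hypothetical tracker for the pre-assessment evidence package:
# each category maps an artifact to True once it has been received.
evidence_request = {
    "Governance & Leadership": {
        "Information security policy": True,
        "Org chart with security reporting lines": False,
        "Security management meeting minutes": False,
    },
    "Technical Controls": {
        "Network architecture diagrams": True,
        "Access control policy / IAM records": True,
        "Vulnerability scan reports": False,
    },
}

def completeness(request: dict) -> dict:
    """Return per-category collection percentage for status reporting."""
    return {
        category: round(100 * sum(items.values()) / len(items))
        for category, items in request.items()
    }

print(completeness(evidence_request))
# e.g. {'Governance & Leadership': 33, 'Technical Controls': 67}
```

A daily completeness report during the collection window keeps the assessment on schedule and makes non-responsive document owners visible early.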

Interview Approach

Interviews are the most efficient way to identify gaps that documents do not reveal. The critical question is not 'do you have a process for X?' but 'walk me through what happens when X occurs'. The difference tests documentation versus operational reality.

Key interview subjects: ISMS Lead / CISO (management system maturity), Head of IT / Infrastructure (technical control status), a software developer (secure development practices), HR representative (onboarding and offboarding process), a business unit manager (risk owner awareness), and a non-technical staff member (awareness level, policy knowledge).

The non-technical staff interview is the most revealing test of awareness effectiveness. Ask a customer service representative: 'If you received an urgent password reset email, what would you do?' and 'If you found a USB drive in the car park, what would you do?' The answers tell you more about security culture than all training completion records combined.

The Gap Heat Map: Sample Assessment Output

The gap heat map is the primary output of the assessment phase — a single view of the organization's maturity across ISO 27001 requirements, combined with priority and effort ratings. The sample below reflects a realistic assessment for an Indonesian fintech company at early implementation stage:

| Requirement area | Score | Priority | Effort | Gap narrative and action |
|---|---|---|---|---|
| 4.1 Context of organization | 1 | High | Medium | Issues register exists informally; needs documentation and regulatory mapping to UU PDP, POJK, PBI |
| 4.2 Interested parties | 0 | High | Low | No formal register. Quick win — create stakeholder register within 2 weeks |
| 4.3 ISMS scope | 2 | Medium | Low | Verbal scope exists; needs formal documented scope statement |
| 5.1 Leadership commitment | 1 | Critical | Medium | CEO aware of ISMS project but not formally engaged. Requires management review structure and policy sign-off |
| 5.2 Information security policy | 1 | Critical | Medium | Draft policy from 2022 — outdated, unapproved, not covering UU PDP. Needs full redraft |
| 5.3 Roles and responsibilities | 1 | High | Low | CISO role exists; no RACI, no risk owner assignments, no formal ISMS role documentation |
| 6.1 Risk assessment | 0 | Critical | High | No formal risk assessment performed. Highest priority gap — drives all subsequent control selection |
| 6.2 IS objectives | 0 | High | Low | No documented objectives. Set after risk assessment is complete |
| 7.1 Resources | 2 | Medium | Low | Security budget exists; documentation of resource allocation and adequacy assessment needed |
| 7.2 Competence | 1 | High | Medium | Team competent technically; no competence framework, no training records retained |
| 7.3 Awareness | 1 | High | Medium | Informal security briefings; no formal program, no completion records, no phishing simulation |
| 7.5 Documented information | 1 | High | High | Documents exist but no version control, no approval records — full document control system needed |
| 8.1 Operational controls | 2 | Medium | High | Controls exist informally; supplier contracts lack security addenda; change management has no security gate |
| 9.1 Monitoring & measurement | 0 | High | Medium | No KPIs, no metrics program, no systematic performance monitoring |
| 9.2 Internal audit | 0 | High | Medium | No internal audit program. Must be operational before Stage 2 audit |
| 9.3 Management review | 0 | Critical | Low | No management review conducted. Must have at least one completed before Stage 2 |
| Annex A — Org controls (5.x) | 1 | High | High | Many controls partially in place; information classification, threat intelligence, supplier mgmt need formalization |
| Annex A — Tech controls (8.x) | 2 | Medium | High | MFA partial (privileged only), SIEM under-monitored, vulnerability scanning needs scheduling |

Reading this heat map, a clear pattern emerges: the management system foundations — risk assessment, management review, IS objectives, interested parties register — are at 0. These must be addressed in Phase 3. The technical controls are in better shape — mostly 1 or 2 — meaning there is a foundation to build on. This pattern is extremely common: technical controls grow organically, but the management system infrastructure never gets built because nobody has led that work.
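The heat map also yields a mechanical remediation ordering. The sketch below uses a handful of rows from the sample above; the ranking rule (priority band first, then lowest maturity score) is one reasonable convention, not a requirement of the standard:

```python
# Rank order for the priority labels used in the heat map.
PRIORITY_RANK = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

# (requirement, maturity score, priority) — subset of the sample heat map.
heat_map = [
    ("6.1 Risk assessment",      0, "Critical"),
    ("9.3 Management review",    0, "Critical"),
    ("5.2 IS policy",            1, "Critical"),
    ("4.2 Interested parties",   0, "High"),
    ("8.1 Operational controls", 2, "Medium"),
]

# Sort by priority band first, then by lowest maturity within each band.
remediation_order = sorted(
    heat_map, key=lambda row: (PRIORITY_RANK[row[2]], row[1])
)
for requirement, score, priority in remediation_order:
    print(f"{priority:<8} score={score}  {requirement}")
```

Running this puts the Critical score-0 gaps (risk assessment, management review) at the top, exactly the pattern the narrative above identifies.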

Bitlion assessment module: Bitlion's platform includes a structured gap assessment workflow guiding the assessor through all Clause 4–10 requirements and all 93 Annex A controls, capturing maturity scores, gap narratives, and evidence references. The platform generates the heat map automatically and exports a formatted report for executive presentation.

Common Gap Patterns in Indonesian Organizations

After conducting gap assessments across Indonesian regulated industries, a consistent set of gaps appears with high frequency. Understanding these patterns helps implementation teams allocate effort appropriately:

| Common gap | Severity | Frequency | Timeline implication | Implementation fix |
|---|---|---|---|---|
| No formal risk assessment | Critical | Very common | Blocks all Annex A control selection. Must be the first Phase 3 deliverable. Expect 4–6 weeks. | Prioritize risk methodology as Phase 3 starting point. All documentation depends on it. |
| Policy documents outdated or unapproved | Major | Common | Existing policies (often 2–4 years old) cannot be patched — require full redraft aligned to current scope and regulations. | Treat existing policies as reference only. Draft new policy suite aligned to current scope and regulatory context. |
| No management review history | Major | Very common | Stage 2 requires evidence of at least one management review. Must be scheduled 3–6 months before Stage 2. | Schedule first management review no later than 3 months before planned Stage 2 date. |
| Supplier contracts lack security addenda | High | Very common | Adding security addenda to critical supplier agreements takes 4–12 weeks due to negotiation cycles. | Identify top 5–8 critical suppliers early in Phase 3. Begin contract amendment in parallel with policy development. |
| No internal audit competence | Major | Common | Internal audit must complete before Stage 2. Auditor training takes 3–5 days. Program design takes 2–3 weeks. | Schedule internal auditor training early in Phase 3. Design audit program immediately after training. |
| Access reviews never conducted | High | Common | Evidence of at least one completed review cycle required before Stage 2. First review may uncover significant stale access. | Conduct first access review as part of Phase 4. Build recurring calendar entry. |
| No security awareness training program | High | Common | Completion records for all in-scope staff required for Stage 2. LMS setup and delivery takes 4–6 weeks. | Select and deploy LMS early in Phase 4. Set completion target 4 weeks before Stage 2 audit. |
| Technical controls partially deployed | Variable | Very common | MFA admin-only, scanning unconfigured, logging without alerts — partial deployment means partial evidence. | Gap assessment must specifically test 'is this control deployed AND functional?' not just 'does it exist?' |

These gaps — particularly the absence of a formal risk assessment, outdated or unapproved policies, and no management review history — appear in the majority of first-cycle Indonesian implementations. They are the predictable result of security programs that grew organically without an ISMS framework. The gap assessment makes them visible. The remediation roadmap closes them systematically.

From Gap Assessment to Remediation Roadmap

The gap assessment report is an analytical output. The remediation roadmap is the planning output — translating gap findings into a sequenced, resourced action plan. A good roadmap sequences by dependencies (risk assessment before SoA), front-loads quick wins (building momentum), estimates effort realistically, and defines clear outputs for each wave that can be verified.
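The dependency-first sequencing principle is, at bottom, a topological sort. The sketch below is illustrative: the task names and dependency edges are assumptions chosen to mirror the sequencing rules stated above (risk assessment before SoA, SoA before control deployment, audit before management review), not a definitive roadmap:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical dependency map: each task lists its prerequisite tasks.
dependencies = {
    "risk_assessment":    {"risk_methodology", "asset_inventory"},
    "soa":                {"risk_assessment"},
    "control_deployment": {"soa"},
    "internal_audit":     {"control_deployment"},
    "management_review":  {"internal_audit"},
}

# static_order() yields tasks so every prerequisite precedes its dependents;
# a cycle in the map would raise graphlib.CycleError at this point.
print(list(TopologicalSorter(dependencies).static_order()))
```

Whether the ordering is computed or drawn on a whiteboard, the point is the same: a roadmap that schedules the SoA before the risk assessment is complete will be redone.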

Wave 1 · Weeks 1–4  —   Quick wins and critical blockers
  • Create interested parties register (4.2) — 2 days
  • Document ISMS scope statement (4.3) — 1 day
  • Draft information security policy for CEO approval — 5 days
  • Assign risk owners to all business units — 1 day
  • Initiate supplier contract security addenda negotiations (top 5 suppliers)
  • Schedule first management review (target: 3 months before Stage 2)
Wave 2 · Weeks 5–10  —   Foundation build — risk and documentation infrastructure
  • Define and document risk assessment methodology — 3 days
  • Complete context analysis document (4.1) — 3 days
  • Establish document control framework and register — 2 days
  • Build asset inventory for risk assessment — 5 days
  • Define competence framework for ISMS roles — 2 days
  • Enrol ISMS team in ISO 27001 internal auditor training
Wave 3 · Weeks 11–18  —   Core ISMS build — risk assessment and policy suite
  • Conduct full information security risk assessment — 10 days
  • Produce risk register with risk owner review and acceptance
  • Complete initial Statement of Applicability from risk assessment
  • Draft and approve supporting policy suite (access control, incident management, supplier security, cryptography, classification)
  • Define IS objectives and action plans (6.2)
  • Design security awareness training program and select LMS
Wave 4 · Weeks 19–30  —   Control implementation — close technical and process gaps
  • Deploy MFA across all accounts per risk treatment plan
  • Configure centralized logging and SIEM alerting rules
  • Implement formal vulnerability management schedule and first scan
  • Conduct first formal access review with documented evidence
  • Launch security awareness training — achieve >95% completion
  • Formalize change management with security assessment gate
  • Finalize and sign all critical supplier security addenda
Wave 5 · Weeks 31–42  —   Pre-certification — audit, review, certify
  • Conduct internal audit covering all clauses 4–10 and key Annex A domains
  • Address internal audit findings — corrective actions and evidence
  • Conduct pre-certification management review
  • Compile and submit Stage 1 documentation package
  • Stage 1 audit — address documentation findings
  • Stage 2 certification audit

Presenting Gap Findings to the Executive Sponsor

The gap assessment report is a technical document. Before it goes to the executive sponsor, translate it into a management briefing answering three questions: where do we stand, what does it cost to close the gaps, and when can we realistically certify?

The executive briefing should cover: a maturity summary by domain (not a 100-point gap list), the three to five most critical gaps and their business risk implications, the resource and timeline implications of the remediation roadmap, and a recommended go/no-go decision on proceeding to Phase 3 with the proposed scope and resource level.

Gap assessment as scope validator: A well-conducted gap assessment sometimes reveals that the provisional scope needs adjustment — a system assumed in-scope is not connected to production, or a supplier relationship discovered during assessment is processing in-scope personal data. The gap assessment is the right moment to validate and refine scope — before Phase 3 documentation work begins.

The Gap Assessment as a Foundation of Confidence

A thorough, honest gap assessment is an act of organizational confidence, not an admission of weakness. Every organization starting an ISO 27001 implementation has gaps — the question is how significant they are and how long they take to close. An organization that knows its gaps accurately can plan effectively, resource appropriately, and commit to a certification timeline with genuine confidence.

The organizations that go into Phase 3 with a clear, honestly scored gap assessment consistently execute better and certify faster than those that conduct a superficial assessment and discover real gaps during the internal audit or Stage 2 certification audit. The assessment investment — typically 6–12 days of focused work — pays returns many times over in the quality of the implementation plan it enables.

Know your gaps. Score them honestly. Plan to close them systematically. That is how certification happens on schedule.