Purpose of the Gap Assessment
A formal gap assessment is the foundation of realistic implementation planning. It answers a critical question: where is the organization today relative to ISO 20000 requirements? Organizations that skip a formal gap assessment typically underestimate implementation effort by 50-100%, leading to compressed timelines, incomplete documentation, and SMS infrastructure that is not genuinely operational.
The gap assessment produces: (1) an objective understanding of current state, (2) identification of remediation priorities, (3) preliminary estimates of implementation effort, and (4) the raw material for the implementation project plan. A gap assessment conducted carefully in Phase 1 prevents costly rework and false starts later.
Two Dimensions of Gap Assessment
The gap assessment must cover two distinct domains: Management System Governance gaps (Clauses 4-7, 9-10) and Service Management Practice gaps (Clause 8).
Management System Governance gaps address whether the organization has established the foundational SMS infrastructure: Do we have a documented service management policy? Is there a service management plan? Do we have defined roles and responsibilities? Is there a documented approach to risk management? Do we have an internal audit program? These governance gaps often require behavioral and organizational changes, not just documentation.
Service Management Practice gaps address whether the organization is executing the twelve Clause 8 practices: Do we have incident management? Is it documented? Is it followed consistently? Are records maintained? Are we measuring practice effectiveness? These practice gaps often can be closed faster than governance gaps because they are more operational and less dependent on organizational change.
Gap Assessment Methodology
Three complementary assessment approaches are typically combined to produce a comprehensive gap assessment:
Document review: examine existing policies, procedures, process documentation, and other formally documented information against ISO 20000 requirements. Document review identifies whether procedures exist and what they describe, but does not assess whether procedures are actually followed.
Interviews: speak with process owners, service management practitioners, and operations leadership about how processes actually work versus how they are documented. Interviews reveal gaps between documented procedure and actual practice, and identify workarounds and informal processes.
Observation and sampling: review a sample of operational records (incident records, change records, problem records, etc.) to assess whether practices are being executed as designed and whether records are complete and accurate. Sampling reveals whether the organization has the discipline to maintain the management system infrastructure that ISO 20000 requires.
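The sampling step can be sketched as a simple completeness check over a set of pulled records. This is an illustrative sketch only, not a prescribed method: the field names (`category`, `severity`, `resolution`, and so on) are assumptions standing in for whatever fields the organization's own incident procedure mandates.

```python
# Illustrative sketch: score a sample of operational records for completeness.
# The required-field list below is an assumption, not an ISO 20000 requirement;
# substitute the fields your documented procedures actually mandate.
REQUIRED_INCIDENT_FIELDS = ["id", "category", "severity", "opened", "resolution"]

def completeness_rate(records, required=REQUIRED_INCIDENT_FIELDS):
    """Fraction of sampled records with a non-empty value for every required field."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(str(r.get(f, "")).strip() for f in required)
    )
    return complete / len(records)
```

A low completeness rate on a modest sample (for example, 20-30 records per practice) is usually enough to justify an Amber or Red rating before any deeper review.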
Gap Assessment Rating Scale
Organizations typically use a standardized rating scale to assess each requirement or clause. Common approaches include:
Traffic light system: Red (not in place / significant gaps), Amber (partially in place / moderate gaps), Green (substantially or fully in place / minimal gaps).
Numerical maturity scale: 0 (not in place), 1 (initiated), 2 (developing), 3 (substantially in place), 4 (fully in place and optimized).
Gap category approach: Not in place, Partially in place, Substantially in place, Fully in place.
Whichever scale is chosen, consistency across the assessment is important. Each rating must be accompanied by evidence--a reference to documentation reviewed, comments from interviews, or a note about the sampling result that supports the rating.
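One way to make the "every rating carries evidence" rule enforceable is to encode it in the assessment tooling. The sketch below is illustrative only: it uses the four-step gap category scale, and the class and field names are our own assumptions, not terms from ISO 20000.

```python
from dataclasses import dataclass
from enum import Enum

class Rating(Enum):
    """The four-step gap category scale described in the text."""
    NOT_IN_PLACE = 0
    PARTIALLY_IN_PLACE = 1
    SUBSTANTIALLY_IN_PLACE = 2
    FULLY_IN_PLACE = 3

@dataclass(frozen=True)
class ClauseRating:
    """One rated requirement; construction fails without supporting evidence."""
    clause: str      # e.g. "8.2"
    rating: Rating
    evidence: str    # document reference, interview note, or sampling result

    def __post_init__(self):
        if not self.evidence.strip():
            raise ValueError(f"rating for clause {self.clause} lacks supporting evidence")
```

Rejecting evidence-free ratings at data-entry time keeps the assessment honest by design, which matters given the self-certification trap described below.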
| KEY CONCEPT | The gap assessment is a diagnostic, not a self-certification. The goal is honest identification of gaps, not a favorable picture. Organizations that rate themselves too highly on gap assessments (inflating the "Green" ratings to make their current state look better) consistently fail Stage 1 or encounter major Stage 2 nonconformities. An objective gap assessment is a more valuable document than an optimistic one. |
Clause-by-Clause Gap Analysis
A structured gap assessment works through Clauses 4-10 systematically. For each sub-clause, the assessment should document:
Current state: what is the organization currently doing (or not doing) to meet this requirement? What documentation exists? How is this practiced operationally?
Gap identification: what is missing or incomplete? What does the organization need to do to meet the requirement fully?
Evidence needed: what documented information, records, or evidence will auditors need to see to verify that the organization meets this requirement?
Remediation effort estimate: how much work will be required to close this gap? Estimate the effort in days, note whether internal resources, external support, or both are needed, and identify any dependencies on other gaps.
Prioritizing Remediation
Not all gaps are equal. Some gaps pose high audit risk (likely to generate a nonconformity), while others are lower risk. Some gaps can be closed quickly, while others require significant effort or organizational change. Prioritization based on multiple dimensions produces a more realistic remediation plan.
Audit risk: How likely is it that an auditor will find a nonconformity related to this gap? Governance gaps (missing policy, no internal audit program, no management review process) typically pose higher audit risk than operational gaps in individual practices.
Implementation effort: How much work is required to close this gap? Governance infrastructure gaps often require more effort than practice-level gaps because they touch the entire organization.
Dependencies: Some gaps must be closed before others. For example, establishing document control procedures must happen before service management procedures can be finalized and controlled. Change management procedures must be in place before changes to service management procedures can be managed as formal changes.
Business criticality: Is this gap affecting current service delivery quality or customer satisfaction? High-criticality gaps may need remediation even if they are lower audit risk.
The prioritization matrix looks at all four dimensions and produces a ranked list of gaps ordered by: (1) high-risk gaps, (2) high-effort gaps with dependencies (these should start early to avoid delaying other work), and (3) lower-risk/lower-effort gaps.
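A minimal sketch of this prioritization logic, under stated assumptions: audit risk and business criticality are already judged per gap, "high effort" is taken as 20 or more days (an arbitrary illustrative threshold), and a "high-effort gap with dependencies" is interpreted as one that other gaps depend on and that should therefore start early.

```python
from dataclasses import dataclass, field

@dataclass
class Gap:
    clause: str
    audit_risk: str                    # "high" | "medium" | "low" (assessor judgment)
    effort_days: int                   # remediation effort estimate
    depends_on: list = field(default_factory=list)  # clauses that must close first
    business_critical: bool = False

def prioritize(gaps, high_effort_days=20):
    """Rank gaps into the three tiers described above, longest work first within a tier."""
    # Clauses that at least one other gap depends on (i.e. blocking work).
    blocking = {c for g in gaps for c in g.depends_on}

    def tier(g):
        if g.audit_risk == "high" or g.business_critical:
            return 0  # tier 1: high audit risk or business-critical
        if g.effort_days >= high_effort_days and g.clause in blocking:
            return 1  # tier 2: high-effort work other gaps depend on
        return 2      # tier 3: lower-risk / lower-effort gaps

    return sorted(gaps, key=lambda g: (tier(g), -g.effort_days))
```

The threshold and tier rules are knobs, not doctrine; the point is that a repeatable scoring function makes the ranked remediation list auditable rather than anecdotal.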
Building the Implementation Plan from Gap Assessment
The implementation project plan is built by mapping gap remediation activities to the implementation roadmap phases. Each gap becomes one or more work packages in the project plan, with:
Clear ownership: who is responsible for closing this gap?
Clear deliverables: what will be the output when this gap is closed? A documented procedure? A policy? A record format?
Clear success criteria: what evidence will demonstrate that the gap is closed?
Dependencies: what other gaps or activities must be completed before this gap remediation can begin?
Timeline: in which phase will this gap be addressed?
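The work-package fields above can be captured in a simple structure and sanity-checked, for example by verifying that no package is scheduled before the work it depends on. This is an illustrative sketch: the names and phase semantics are assumptions, and dependencies within the same phase are assumed to be sequenced inside that phase.

```python
from dataclasses import dataclass, field

@dataclass
class WorkPackage:
    gap_id: str
    owner: str                 # who is responsible for closing this gap
    deliverables: list         # e.g. ["Incident management procedure"]
    success_criteria: str      # evidence that demonstrates the gap is closed
    depends_on: list = field(default_factory=list)  # gap_ids that must finish first
    phase: int = 1             # implementation roadmap phase

def validate_plan(packages):
    """Return a list of scheduling problems: unknown or later-scheduled dependencies."""
    phase_of = {p.gap_id: p.phase for p in packages}
    problems = []
    for p in packages:
        for dep in p.depends_on:
            if dep not in phase_of:
                problems.append(f"{p.gap_id}: unknown dependency {dep}")
            elif phase_of[dep] > p.phase:
                problems.append(
                    f"{p.gap_id} (phase {p.phase}) depends on {dep} "
                    f"scheduled later (phase {phase_of[dep]})"
                )
    return problems
```

Running a check like this each time the plan changes catches the classic failure mode where, say, procedure work is scheduled before document control exists to govern it.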
Gap Assessment for Organizations with Existing ISO 27001
Organizations that already hold ISO 27001 certification have an existing management system infrastructure. The gap assessment for ISO 20000 should include a comparative analysis: what parts of the ISO 27001 ISMS infrastructure can be leveraged for ISO 20000 (document control, management review, internal audit, top management commitment)? What must be built additionally? Where do the two standards have differing requirements that need reconciliation?
Common finding: organizations with ISO 27001 have strong governance infrastructure (policy, management review, internal audit) that can be extended to ISO 20000 service management practices. This typically reduces Phase 2 design effort by 30-40%. However, they may have weak incident and problem management practices relative to ISO 20000 expectations, requiring additional practice-level work.
Common Gap Patterns in Indonesian IT Organizations
Incident management capability is often strong--most organizations have a service desk and incident tracking. However, incident classification, severity assessment, and escalation procedures are often informal.
Problem management is frequently weak or non-existent. Organizations resolve incidents without systematically investigating root cause or managing problems.
Change management processes exist in most organizations, but records are incomplete. Change documentation is often done in multiple systems or spreadsheets rather than in a centralized, auditable change management system.
Release management is often conflated with change management. Organizations struggle to separate release planning from operational change control.
Configuration management database: many organizations either have no CMDB or have a CMDB that is significantly inaccurate and not actively maintained.
Service portfolio: services and SLAs are often documented informally in customer contracts rather than in a centralized, managed service portfolio.
Management system governance: the internal audit program is frequently non-existent or very basic. Service management policy may not be formally documented. Roles and responsibilities for service management are often unclear.
| IMPORTANT | Management system governance gaps (Clauses 4-7, 9-10) take longer to close than practice gaps (Clause 8) because they require behavioral and cultural change, not just documentation. When building the implementation timeline, allocate more time and resources to governance gaps than to practice gaps. Auditors will assess governance maturity carefully; poor governance infrastructure will generate nonconformities even if practices are well-documented. |
The Gap Assessment Report
The gap assessment is documented in a formal report that typically includes: (1) executive summary of findings and major gap categories, (2) detailed clause-by-clause assessment with ratings and evidence, (3) prioritized list of remediation activities, (4) effort and timeline estimates for remediation, (5) recommendations for external support needs, and (6) high-level roadmap for closing gaps across the implementation phases.
The gap assessment report is the primary input to the implementation project plan. It is also the baseline by which the organization can measure progress during implementation. After remediation activities are completed, a follow-up assessment (often conducted just before internal audit) provides objective evidence of progress and identifies remaining gaps.
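The baseline-versus-follow-up comparison can be sketched as a small diff over clause ratings. The scale labels follow the gap category approach from earlier; the function and variable names are illustrative assumptions.

```python
# Ratings ordered from worst to best, matching the gap category approach.
SCALE = ["Not in place", "Partially in place", "Substantially in place", "Fully in place"]

def progress_report(baseline, follow_up):
    """Compare two assessments (dicts of clause -> rating).

    Returns (still_open, regressed): clauses not yet at least
    'Substantially in place', and clauses rated worse than at baseline.
    """
    rank = {r: i for i, r in enumerate(SCALE)}
    closed_threshold = rank["Substantially in place"]
    still_open = [c for c, r in follow_up.items() if rank[r] < closed_threshold]
    regressed = [
        c for c in follow_up
        if c in baseline and rank[follow_up[c]] < rank[baseline[c]]
    ]
    return still_open, regressed
```

A non-empty `regressed` list is worth investigating before internal audit: it usually signals practices that were stood up for the assessment but not sustained.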
| BITLION INSIGHT | The Bitlion GRC gap assessment toolkit includes a pre-built ISO 20000 questionnaire aligned to Clause 4-10 requirements, automated gap heat map generation, and remediation planning templates. Organizations using the Bitlion gap assessment tools report reducing assessment time from 6-8 weeks to 3-4 weeks while increasing assessment accuracy and confidence. |
Gap Assessment Rating Scale and Implications
| Rating | Definition | Audit Risk | Typical Remediation Timeframe |
|---|---|---|---|
| Red / Not in place | Requirement not addressed; no documentation or evidence of practice | Very high--likely nonconformity at audit | 8-16 weeks (governance) or 4-8 weeks (practice) |
| Amber / Partially in place | Some evidence of addressing requirement; partial documentation or inconsistent practice | High--likely observation or nonconformity | 4-8 weeks (governance) or 2-4 weeks (practice) |
| Green / Substantially in place | Requirement substantially addressed; documentation in place, mostly consistent practice | Low--unlikely to generate audit finding | 1-2 weeks for full compliance |
| Fully in place | Complete compliance; fully documented, consistent practice, evidence available | Minimal--no remediation typically needed | None |
Common Gap Patterns by Clause
| Clause | Typical Gap in Indonesian IT Orgs | Priority Level | Remediation Approach |
|---|---|---|---|
| Clause 4-5: Context and Leadership | No formal service management policy; top management commitment not documented | High | Draft and approve service management policy; document top management commitment |
| Clause 6: Planning | No service management plan or plan not updated; objectives informal or missing | High | Develop SMP with explicit objectives, roles, resources, timeline |
| Clause 7: Support | Lack of formal competence management; no resource allocation to SMS | Medium | Define competence requirements; establish competence plan and training schedule |
| Clause 8.1: Incident management | Process exists but lacks rigor; categories and severity not standardized; records incomplete | Medium | Standardize incident classification; establish incident severity matrix; improve record completeness |
| Clause 8.2: Problem management | Weak or non-existent; no root cause analysis; no problem records | High | Establish problem management process; train on root cause analysis; implement problem tracking |
| Clause 8.3: Service request management | Often called "low-priority incidents"; not separated from incident management | Medium | Define service request category; establish service request SLA; separate from incident management |
| Clause 8.4: Change management | Process exists; records incomplete; change approval authority unclear | Medium | Improve change record completeness; clarify change approval authority; establish change advisory board |
| Clause 8.5: Release management | Conflated with change management; no separate release plan or release records | Medium | Separate release management from change management; establish release planning process |
| Clause 8.6: Configuration management | CMDB missing or inaccurate; CI scope not defined; verification process non-existent | High | Define CI scope; populate/reconcile CMDB; establish CI verification process |
| Clause 8.7: Service portfolio | Services documented in contracts but not in central portfolio; no service review process | Medium | Create centralized service portfolio; document service definitions; establish quarterly service review |
| Clause 8.8-12: Other practices | Relationship, continuity, availability, capacity, and security aspects are often informal or absent | Medium | Develop procedures; establish records; assign ownership; integrate with operational systems |
| Clause 9-10: Monitoring and improvement | No internal audit program; management review non-existent; no action tracking | High | Establish internal audit schedule and auditor competence; establish management review calendar; implement action tracking |