Domain 8 is the largest of the four Annex A domains — 34 controls covering the complete technical security architecture of an organization. It contains the highest density of controls that directly prevent, detect, and respond to attacks: the MFA deployment that blocks credential compromise, the vulnerability scanning that closes the gaps before attackers find them, the SIEM that detects the anomaly before it becomes a breach, and the secure coding standard that prevents the vulnerability from existing in the first place.
The 2022 revision added 7 new controls to Domain 8 — more than any other domain. These 7 controls (configuration management, information deletion, data masking, data leakage prevention, monitoring activities, web filtering, and secure coding) reflect the security challenges that have become central to the threat landscape in the decade since 2013: cloud infrastructure drift, personal data exposure in non-production environments, and the dramatic shift of attack surface into application-layer vulnerabilities. For Indonesian organizations navigating UU PDP enforcement, controls 8.10 (deletion), 8.11 (masking), and 8.12 (DLP) are directly tied to regulatory obligations.
This article provides the complete 34-control reference for Domain 8, deep-dive implementation guidance for all 7 new 2022 controls, and the top 7 audit findings in this domain with specific prevention and fix guidance.
Domain 8 Sub-Group Structure
The 34 technological controls fall into 10 logical sub-groups. Understanding the sub-groups helps with ownership assignment — different teams own different technical domains:
| Sub-group | Controls | Coverage |
| --- | --- | --- |
| Access and identity | 8.1–8.5 | Endpoint device management, privileged access rights, information access restriction, source code access, and secure authentication. The technical implementation of the access governance policies set in Domain 5 (5.15–5.18). |
| Operational controls | 8.6–8.8 | Capacity management, malware protection, and vulnerability management. Core operational security controls that run continuously. A.8.8 vulnerability management is one of the most frequently audited controls in Stage 2. |
| System hardening and config | 8.9 | Configuration management — NEW in 2022. Requires documented, monitored baseline security configurations for all in-scope systems. One of the most under-implemented new controls. |
| Data protection | 8.10–8.12 | Information deletion (NEW), data masking (NEW), and data leakage prevention (NEW). Three new 2022 controls directly tied to UU PDP personal data obligations. |
| Resilience and backup | 8.13–8.14 | Information backup and redundancy of information processing facilities. Fundamental resilience controls with direct BCM implications. |
| Logging and monitoring | 8.15–8.17 | Logging, monitoring activities (NEW 2022), and clock synchronization. SIEM-adjacent controls that generate the evidence trail for incident investigation and audit. |
| Operational security | 8.18–8.22 | Privileged utilities, software installation controls, network security, network services security, and network segmentation. Infrastructure security hygiene. |
| Web filtering | 8.23 | Web filtering — NEW in 2022. Requires filtering of access to external websites to protect against malicious content and enforce acceptable use policy. |
| Cryptography | 8.24 | Use of cryptography — policy, key management, and appropriate algorithm selection. Direct relevance to UU PDP encryption obligations for personal data. |
| Secure development | 8.25–8.34 | Secure development lifecycle, application security requirements, secure architecture, secure coding (NEW), security testing, outsourced development, environment separation, change management, test data management, and protection during audit. The complete secure SDLC control group. |
Complete Control Reference: All 34 Technological Controls
The reference table below lists all 34 controls organized by sub-group (★ marks controls new in the 2022 revision). Requirement summaries for each sub-group appear in the structure table above; the seven new controls are covered in depth, with evidence requirements and Indonesian regulatory context, in the next section:

| Ref | Control name |
| --- | --- |
| **8.1–8.5 — Access and identity** | |
| 8.1 | User endpoint devices |
| 8.2 | Privileged access rights |
| 8.3 | Information access restriction |
| 8.4 | Access to source code |
| 8.5 | Secure authentication |
| **8.6–8.8 — Operational security** | |
| 8.6 | Capacity management |
| 8.7 | Protection against malware |
| 8.8 | Management of technical vulnerabilities |
| **8.9 — Configuration management** | |
| 8.9 ★ | Configuration management |
| **8.10–8.12 — Data protection** | |
| 8.10 ★ | Information deletion |
| 8.11 ★ | Data masking |
| 8.12 ★ | Data leakage prevention |
| **8.13–8.14 — Resilience** | |
| 8.13 | Information backup |
| 8.14 | Redundancy of information processing facilities |
| **8.15–8.17 — Logging and monitoring** | |
| 8.15 | Logging |
| 8.16 ★ | Monitoring activities |
| 8.17 | Clock synchronization |
| **8.18–8.22 — Infrastructure security** | |
| 8.18 | Use of privileged utility programs |
| 8.19 | Installation of software on operational systems |
| 8.20 | Networks security |
| 8.21 | Security of network services |
| 8.22 | Segregation of networks |
| **8.23 — Web filtering** | |
| 8.23 ★ | Web filtering |
| **8.24 — Cryptography** | |
| 8.24 | Use of cryptography |
| **8.25–8.34 — Secure development** | |
| 8.25 | Secure development life cycle |
| 8.26 | Application security requirements |
| 8.27 | Secure system architecture and engineering principles |
| 8.28 ★ | Secure coding |
| 8.29 | Security testing in development and acceptance |
| 8.30 | Outsourced development |
| 8.31 | Separation of development, test and production environments |
| 8.32 | Change management |
| 8.33 | Test information |
| 8.34 | Protection of information systems during audit testing |
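Teams often mirror this register in machine-readable form so that SoA status and evidence tracking can be scripted against it. A minimal Python sketch built from the sub-group ranges above (the dict layout is an illustrative convention, not a prescribed format):

```python
# Minimal machine-readable register of Domain 8, derived from the
# sub-group ranges above. The structure is an illustrative convention.
SUBGROUPS = {
    "Access and identity": (1, 5),
    "Operational security": (6, 8),
    "Configuration management": (9, 9),
    "Data protection": (10, 12),
    "Resilience": (13, 14),
    "Logging and monitoring": (15, 17),
    "Infrastructure security": (18, 22),
    "Web filtering": (23, 23),
    "Cryptography": (24, 24),
    "Secure development": (25, 34),
}
NEW_2022 = {"8.9", "8.10", "8.11", "8.12", "8.16", "8.23", "8.28"}

def controls():
    """Expand the sub-group ranges into a flat, ordered list of control IDs."""
    return [f"8.{n}" for lo, hi in SUBGROUPS.values() for n in range(lo, hi + 1)]

assert len(controls()) == 34        # the full Domain 8 control count
assert NEW_2022 <= set(controls())  # the 7 new 2022 controls are all in range
```

From here, each control ID can be joined to SoA applicability status and evidence locations, which is also the basis for the evidence dry run discussed later in this article.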
The 7 New 2022 Controls: Implementation Deep Dive
The 7 new controls in Domain 8 represent the areas where most first-cycle ISMS implementations have the largest gaps. Unlike established controls (MFA, vulnerability scanning, backup) where implementation is well understood, the new controls often require organizations to build processes from scratch. The table below provides specific implementation steps, maturity gap assessment, and evidence requirements for each:
| Control | Name | Maturity gap | Implementation steps | Evidence | Regulatory link |
| --- | --- | --- | --- | --- | --- |
| 8.9 ★ | Configuration management | Most organizations have configurations — few have documented, monitored baseline configurations with drift detection. | 1. Inventory all in-scope systems. 2. Define secure baseline configuration per system type (OS hardening guides: CIS Benchmarks are excellent starting points). 3. Document baselines as controlled documents. 4. Implement automated compliance scanning (OpenSCAP, Nessus, CIS-CAT). 5. Remediate deviations. 6. Include configuration change in change management process. | CIS Benchmark hardening documents adapted for your environment. Automated compliance scan output. Configuration change records linked to change management tickets. | BSSN SNI hardening guidelines; POJK system configuration security |
| 8.10 ★ | Information deletion | Most organizations have data retention policies. Far fewer have documented deletion procedures with evidence of execution — especially for cloud storage and SaaS platforms. | 1. Define data retention periods by data category (personal data, transactional data, log data). 2. Define deletion triggers (retention period expiry, data subject erasure request, contract end). 3. Define secure deletion methods per storage medium (NIST SP 800-88 guidance). 4. Implement automated deletion where possible. 5. Record deletion evidence — date, data category, deletion method, approver. | Data retention and deletion policy. Deletion schedule per data category. Deletion execution records. Secure deletion tool output or signed manual deletion certificates. | UU PDP Article 28 right to erasure; UU PDP Article 67 data minimization; KOMINFO retention guidelines |
| 8.11 ★ | Data masking | Organizations that use production data in development or testing without masking are highly exposed under UU PDP. This is one of the most common post-UU PDP enforcement findings. | 1. Identify all non-production environments that access or receive production data. 2. Classify the personal data fields in production datasets. 3. Implement data masking before extraction to non-production (masking tools: Oracle Data Masking, Delphix, open-source options). 4. Document the masking method for each field type. 5. Verify that masked data is functionally usable for testing purposes. | Data masking policy. Evidence that masked data is used in dev/test (extract records showing masked values). Masking configuration or procedure documentation. | UU PDP personal data protection in non-production contexts; POJK test data security |
| 8.12 ★ | Data leakage prevention | DLP implementations range from email gateway content filtering (basic) to endpoint DLP with behavioral analytics (advanced). Most first-cycle ISMSs start with email and cloud upload controls. | 1. Define sensitive data categories requiring DLP protection (personal data, financial data, proprietary data). 2. Select DLP coverage points (email gateway, cloud upload, endpoint). 3. Configure detection policies per data category. 4. Establish alert response process. 5. Tune policies to reduce false positives while maintaining sensitivity. 6. Log DLP events as evidence. | DLP policy document. DLP tool configuration showing monitored categories. DLP alert log and response records. False positive tuning records. | UU PDP unauthorized disclosure prevention; POJK data leakage controls; OJK data security obligations |
| 8.16 ★ | Monitoring activities | Many organizations collect logs but do not analyze them. 8.16 requires active monitoring — alerting on anomalous behavior, not just passive log storage. | 1. Deploy SIEM or equivalent log aggregation and alerting platform. 2. Define alert rules per threat scenario (failed login threshold, privileged account after-hours access, bulk data export, malware detection). 3. Assign monitoring ownership with response SLA. 4. Document response procedure for each alert type. 5. Review alert quality quarterly — tune to reduce noise. | SIEM configuration showing active alert rules. Alert response records (minimum 1 quarter). Monitoring coverage report. Alert rule documentation per threat scenario. | BSSN MSSP and SOC guidelines; POJK IT security monitoring requirements |
| 8.23 ★ | Web filtering | Web filtering is widely deployed but often not formally documented as an ISMS control with defined categories, coverage, and evidence. | 1. Define acceptable use policy (the policy basis for filtering). 2. Select filtering platform (DNS-based, proxy-based, or endpoint agent — each with different coverage scope). 3. Configure blocked categories (malware, phishing, adult, gambling minimum; extend based on risk profile). 4. Ensure coverage extends to remote workers and mobile devices. 5. Log blocked access attempts as evidence. | Web filtering policy. Filter configuration screenshot showing blocked categories. Coverage report showing remote/mobile device inclusion. Blocked access attempt log. | POJK acceptable use requirements; BSSN malicious website blocking guidance |
| 8.28 ★ | Secure coding | Organizations that develop software vary enormously in secure coding maturity — from no SAST at all to advanced DevSecOps pipelines. ISO 27001 requires documented standards and training as a minimum. | 1. Publish a Secure Coding Standard aligned to your technology stack (OWASP Top 10 minimum). 2. Train all developers on secure coding (annual minimum, with SAST tool training). 3. Integrate SAST in CI/CD pipeline with quality gate blocking critical findings from merging. 4. Include security in code review checklist. 5. Track and remediate findings by severity SLA. | Secure Coding Standard document (version-controlled). Developer training records. SAST tool CI/CD integration configuration. Sample scan output showing findings and remediation. Code review security checklist. | POJK software security requirements; OJK application security; BSSN secure development guidelines |
**UU PDP alignment of new data controls:** Controls 8.10 (information deletion), 8.11 (data masking), and 8.12 (data leakage prevention) together form the technical implementation layer of UU PDP's personal data protection obligations. Organizations implementing these three controls to ISO 27001 standards will simultaneously satisfy UU PDP obligations on data minimization, erasure rights, and unauthorized disclosure prevention. The practical recommendation: treat the implementation of these three controls as a joint ISO 27001 + UU PDP workstream rather than a purely ISMS activity. The evidence produced serves both compliance programs.
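The masking step in 8.11 can be sketched in a few lines of Python. The field names below (`nik`, `email`) are hypothetical examples, and a salted-hash pseudonym is only one of several masking methods: it keeps values stable across tables so test joins still work, but it is pseudonymization, not irreversible anonymization.

```python
import hashlib

# Illustrative sketch only: deterministic masking of personal-data fields
# before a production extract reaches dev/test. Real deployments would
# typically use a dedicated masking tool, as noted in the table above.
PERSONAL_FIELDS = {"nik", "email", "name"}  # classified per 8.11 step 2

def mask_value(value: str, salt: str = "per-environment-secret") -> str:
    """Replace a value with a stable pseudonym (same input, same output)."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()[:12]
    return f"MASKED-{digest}"

def mask_record(record: dict) -> dict:
    """Return a copy with personal-data fields masked, other fields intact."""
    return {k: mask_value(v) if k in PERSONAL_FIELDS else v
            for k, v in record.items()}

row = {"nik": "3174000000000001", "email": "a@example.com", "order_total": "150000"}
masked = mask_record(row)  # 'nik' and 'email' masked; 'order_total' unchanged
```

Because the pseudonym is deterministic, referential integrity survives the extract, which is what makes the masked data "functionally usable for testing purposes" (step 5). The salt should be environment-specific and kept out of non-production.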
Top 7 Domain 8 Audit Findings
Domain 8 generates more Stage 2 audit findings than any other Annex A domain — because it contains the controls most testable through direct evidence: MFA enrollment reports, vulnerability scan outputs, SIEM alert logs, backup restoration records. The 7 findings below account for the majority of technical control nonconformities across first-cycle audits:
| Common audit finding | Control | Frequency | What auditors find | Prevention and fix |
| --- | --- | --- | --- | --- |
| MFA not deployed for all accounts | 8.5 | Very common | MFA deployed for privileged accounts but not for all staff. Remote access users without MFA. Service accounts with weak passwords and no MFA equivalent. Auditors request the IAM MFA enrollment report — partial coverage is an immediate finding. | Deploy MFA for 100% of user accounts (not just admins). Use conditional access to enforce MFA for remote access as a minimum. For service accounts: use certificate-based auth or managed identities rather than passwords. |
| Vulnerability scans not run or critical findings not remediated | 8.8 | Very common | No documented vulnerability scan schedule. Scans run irregularly. Critical findings identified months ago with no remediation evidence. SLA (30 days for critical) not tracked. Auditors request the last two scan reports and remediation records. | Define scan schedule (monthly for critical systems). Define remediation SLA per severity. Track open findings in the CAR register. Report vulnerability KPIs at management review: critical finding count and average time-to-remediation. |
| Logging configured but not monitored | 8.15, 8.16 | Common | Logs are collected but no alert rules exist. SIEM deployed but alert responses not documented. Logging coverage report cannot be produced. Clock synchronization not verified — log timestamps inconsistent across systems. | Implement alert rules per threat scenario. Assign monitoring ownership with documented response SLA. Produce a monitoring coverage report. Verify NTP synchronization across all in-scope systems. |
| Development environments use unmasked production data | 8.11, 8.33 | Common | Personal data from production databases copied to development and test environments without masking. Auditors ask: 'How do you ensure real personal data is not used in non-production environments?' Absence of a masking procedure is a UU PDP gap as well as an Annex A finding. | Implement data masking before extraction to non-production. Document the masking procedure. For existing dev/test environments: audit what data is present and execute a one-time cleanse. Going forward: automated masking as part of the refresh process. |
| Secure coding declared implemented without evidence | 8.28 | Common | SoA shows 8.28 implemented. Auditor asks: 'Show me your secure coding standard.' No document exists. 'Show me your SAST integration.' Not in CI/CD. 'Show me developer training records.' Generic security awareness training only. All three elements of 8.28 are missing. | Implement all three: (1) Secure Coding Standard document, (2) SAST tool in CI/CD with enforced quality gate, (3) developer-specific training. Do not declare 8.28 implemented in the SoA until all three have evidence. |
| Network segmentation not demonstrated | 8.22 | Common | Network segmentation policy exists but auditor asks for architecture diagram and firewall rules showing that production, development, and test are actually separated. Flat network with no VLAN separation fails this control regardless of documentation. | Produce a current network architecture diagram showing zone separation. Show firewall rules enforcing inter-zone controls. If the network is flat: implement VLAN segmentation as a risk treatment action. This is an infrastructure change, not a documentation fix. |
| No backup restoration testing | 8.13 | Common | Backups are running per the backup schedule. But when auditors ask for backup restoration test records, none exist. Backups that have never been tested are unproven — they may not restore correctly when needed. | Conduct a documented restoration test for each critical system at least annually. Record: what was restored, from which backup date, to which environment, result, and time taken. Compare RTO achieved vs. target. |
**The evidence quality signal:** In Domain 8, the gap between 'we have this control' and 'we can evidence this control' is where most audit findings originate. MFA can be deployed — but if you cannot produce the IAM enrollment report during the audit, the auditor cannot verify deployment. Vulnerability scanning can be running — but without documented scan schedules, scan reports, and remediation records, the control cannot be verified. Before Stage 2, conduct an evidence dry run for every Domain 8 control: for each control, attempt to retrieve the evidence that an auditor would ask for. Identify gaps and address them. This exercise consistently surfaces the cases where controls are running but evidence is not being captured.
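The dry run itself can be scripted. A sketch, assuming a hypothetical `evidence/<control-id>/` folder convention with illustrative file-name stems; substitute whatever layout your evidence repository actually uses:

```python
from pathlib import Path

# Hedged sketch of an evidence dry run. The folder layout and file-name
# stems below are assumed conventions, not prescribed by ISO 27001.
EXPECTED = {
    "8.5": "mfa-enrollment-report",
    "8.8": "vulnerability-scan-report",
    "8.13": "restoration-test-record",
    "8.15": "logging-coverage-report",
}

def dry_run(root: Path, expected: dict = EXPECTED) -> dict:
    """For each control, check whether any matching evidence file exists."""
    return {ctrl: any((root / ctrl).glob(f"{stem}*"))
            for ctrl, stem in expected.items()}

# Controls whose evidence cannot be retrieved are the pre-Stage 2 to-do list.
gaps = [ctrl for ctrl, ok in dry_run(Path("evidence")).items() if not ok]
```

Extending `EXPECTED` to all 34 controls turns the dry run into a repeatable pre-audit check rather than a one-off exercise.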
Domain 8 and the Indonesian Regulatory Landscape
Several Domain 8 controls have specific relevance to Indonesian regulatory requirements beyond generic ISO 27001 compliance. Understanding these connections prevents the mistake of treating ISO 27001 certification and Indonesian regulatory compliance as separate workstreams:
Bank Indonesia payment system controls and Domain 8
Bank Indonesia's payment system regulation (PBI 23/6/2021) specifies technical security requirements for payment system operators — requirements that map closely to Domain 8 controls. MFA for payment authorization (8.5), network segmentation separating payment systems (8.22), encryption of payment data in transit and at rest (8.24), and logging of all payment transactions (8.15) are all required by BI regulation independently of ISO 27001. Organizations certifying under ISO 27001 while also navigating BI payment system licensing should map their Domain 8 implementation against the BI technical requirements to ensure both are satisfied simultaneously.
POJK IT governance and Domain 8
POJK 11/2022 (IT governance for financial institutions) specifies controls in vulnerability management (8.8), access management (8.2, 8.3, 8.5), change management (8.32), software development security (8.25–8.29), and monitoring (8.15, 8.16). The overlap is substantial — an ISO 27001-compliant implementation of Domain 8 satisfies the majority of POJK 11/2022 technical controls. The key gaps to specifically address: POJK requires formal IT risk management reporting to the board (beyond what ISO 27001 requires in management review), and POJK has specific requirements for IT vendor management that require supplementary supplier security controls beyond what Domain 8 alone addresses.
UU PDP technical controls and Domain 8
UU PDP Article 35 requires that controllers and processors implement appropriate technical and organizational measures to protect personal data. The technical measures in Domain 8 that directly satisfy this article include: encryption (8.24), access controls (8.2, 8.3, 8.5), data masking (8.11), DLP (8.12), deletion (8.10), logging (8.15), and monitoring (8.16). Organizations that implement Domain 8 thoroughly will find that they have satisfied the majority of UU PDP's 'appropriate technical measures' requirement — with the key additional element being the privacy impact assessment and data mapping obligations that ISO 27001 does not directly require but UU PDP does.
Domain 8 Implementation Sequencing
For organizations implementing Domain 8 for the first time, the sequencing of control implementation matters. The following order builds from the highest-impact, most-evidenced controls to the more nuanced and complex:
- Phase 1 — Foundations (Months 1–3): MFA everywhere (8.5), vulnerability scanning with documented schedule (8.8), endpoint MDM and encryption (8.1), logging from all critical systems (8.15), backup testing (8.13).
- Phase 2 — Operations (Months 3–6): Monitoring and alerting with SIEM (8.16), network segmentation documented and evidenced (8.22), privileged access management (8.2), change management with security gate (8.32), configuration baselines for critical systems (8.9).
- Phase 3 — Data and Development (Months 6–9): DLP implementation (8.12), data masking for non-production environments (8.11), deletion procedure and schedule (8.10), secure coding standard and SAST in CI/CD (8.28), web filtering (8.23).
- Phase 4 — Architecture and Assurance (Months 9–12): Cryptography policy and key management (8.24), secure SDLC procedure (8.25–8.27), penetration testing for critical applications (8.29), environment separation verification (8.31).
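Kept as data, the phase plan above can be checked rather than eyeballed: which controls are scheduled, and which of the 34 remain to be slotted into later scheduling decisions. A minimal sketch (the structure is illustrative):

```python
# The four phases above, tracked as data so coverage over all 34 Domain 8
# controls can be computed instead of eyeballed.
PHASES = {
    "Phase 1": ["8.5", "8.8", "8.1", "8.15", "8.13"],
    "Phase 2": ["8.16", "8.22", "8.2", "8.32", "8.9"],
    "Phase 3": ["8.12", "8.11", "8.10", "8.28", "8.23"],
    "Phase 4": ["8.24", "8.25", "8.26", "8.27", "8.29", "8.31"],
}
ALL_CONTROLS = {f"8.{n}" for n in range(1, 35)}

def unscheduled() -> list:
    """Controls not yet assigned to a phase, so nothing is silently dropped."""
    scheduled = {c for phase in PHASES.values() for c in phase}
    return sorted(ALL_CONTROLS - scheduled, key=lambda c: int(c.split(".")[1]))
```

Running `unscheduled()` makes the residual explicit, which supports the point above: the sequencing deliberately front-loads the most audit-tested controls, and the remaining controls still need an owner and a date before Stage 2.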
This sequencing prioritizes the controls most likely to be tested at Stage 2 and most likely to generate major NCs if absent, while building operational discipline before addressing more nuanced implementation areas.