Article 32 of the GDPR requires controllers and processors to implement a level of security appropriate to the risk, taking into account the state of the art, the costs of implementation, and the nature, scope, context and purposes of the processing. It names four categories of measures to consider (pseudonymisation and encryption; ensuring confidentiality, integrity, availability and resilience; restoring availability after an incident; and regular testing and evaluation) but intentionally stops short of prescribing specific technical controls. This design reflects the diversity of processing contexts: a risk-based standard is more durable than a prescriptive checklist that becomes obsolete as technology evolves.
Article 32’s risk-based approach means that compliance requires genuine risk analysis, not just control implementation. An organisation that deploys a checklist of security controls without assessing whether those controls are proportionate to the specific risks of its processing activities is not fulfilling the Article 32 obligation. The standard is proportionality, not merely the presence of controls.
The Article 32 Risk Assessment
Before selecting and implementing security measures, Article 32(2) directs controllers and processors to account for the risks presented by the processing, in particular those arising from accidental or unlawful destruction, loss, alteration, or unauthorised disclosure of or access to personal data. This risk assessment is not the same as a DPIA (which is required only for high-risk processing under Article 35): it is the security risk assessment that informs the appropriate level of technical and organisational measures for every processing activity.
ARTICLE 32 SECURITY RISK ASSESSMENT — FRAMEWORK
| Risk Dimension | Assessment Question | Higher Risk Indicators |
|---|---|---|
| Data sensitivity | How sensitive is the personal data being processed? What harm could result from unauthorised access or disclosure? | Special category data (health, biometric, racial or ethnic origin); financial credentials; data about children; large volumes of data enabling identity fraud |
| Processing scale and scope | How many individuals’ data is involved? How many systems process this data? How complex is the processing? | Mass-scale processing (millions of records); complex multi-system processing; processing involving many processors and sub-processors |
| Threat landscape | What specific threats apply to this processing activity? What attack vectors are relevant to this type of data? | External cyber threats (phishing, ransomware, web application attacks); insider threats; third-party supply chain attacks; physical theft |
| Potential consequences of breach | What would be the consequences for individuals if this data were breached? What would be the consequences for the organisation? | Severe individual harm (physical safety, financial loss, discrimination); large regulatory fine; reputational damage; operational disruption |
| Current control effectiveness | Are existing controls adequate for the identified risks? Are there gaps between risk level and control maturity? | Controls below state-of-the-art for data sensitivity; controls not tested; gap between documented controls and operational reality |
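The framework above lends itself to a simple scoring aid. The sketch below is illustrative only: the dimension names, the 1 to 5 scale, and the thresholds are assumptions made for this example, not anything Article 32 prescribes.

```python
# Illustrative risk-scoring aid for the five dimensions in the framework
# above. Dimension names, scale, and thresholds are assumptions for this
# sketch, not requirements of Article 32.

DIMENSIONS = (
    "data_sensitivity",
    "scale_and_scope",
    "threat_landscape",
    "breach_consequences",
    "control_gap",
)

def assess_risk(scores: dict[str, int]) -> str:
    """Map per-dimension scores (1 = low, 5 = high) to an overall level."""
    missing = set(DIMENSIONS) - scores.keys()
    if missing:
        raise ValueError(f"unscored dimensions: {sorted(missing)}")
    # One severe dimension (e.g. special category data scored 5) is enough
    # to demand stronger measures, so the worst score dominates the average.
    worst = max(scores[d] for d in DIMENSIONS)
    average = sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)
    if worst >= 5 or average >= 4:
        return "high"
    if worst >= 3 or average >= 2.5:
        return "medium"
    return "low"
```

Under this scheme an activity scoring 5 on data sensitivity is rated high regardless of its other dimensions, reflecting the weight the framework gives to the harm a breach could cause.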
Implementing the Four Article 32 Measures
ARTICLE 32(1)(a) — PSEUDONYMISATION AND ENCRYPTION
| Measure | Implementation Standard | Evidence |
|---|---|---|
| Encryption at rest | AES-256 for storage; database encryption for production databases; field-level encryption for special category data; full-disk encryption for all endpoints | Encryption policy; system configuration records; encryption audit; endpoint MDM enforcement reports |
| Encryption in transit | TLS 1.2 minimum, TLS 1.3 preferred for all data transmission; HTTPS enforced on all web applications; encrypted file transfer for bulk data | Certificate inventory; TLS scan results; HTTPS enforcement records; secure transfer procedure |
| Pseudonymisation | Replace direct identifiers in analytics, testing, and development environments; separate storage of key/mapping table; key access restricted | Pseudonymisation implementation documentation; separate key store access controls; test environment policy |
| Key management | Encryption keys managed separately from encrypted data; customer-managed keys (CMEK) for cloud-hosted sensitive data; key rotation schedule; access to keys restricted and logged | Key management policy; HSM or KMS configuration; key rotation records; key access audit log |
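As a concrete illustration of the pseudonymisation row, a minimal sketch using a keyed hash (HMAC-SHA-256), where the key stands in for the separately stored mapping secret. The function and key names are hypothetical; note that under the GDPR, pseudonymised data remains personal data for as long as the key exists.

```python
import hashlib
import hmac

def pseudonymise(identifier: str, key: bytes) -> str:
    """Deterministic pseudonym: same identifier + key -> same token."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The key must never be deployed to the analytics, test, or development
# environment; without it the tokens cannot be reversed there, and
# re-identification requires access to the separately controlled key store.
key = b"fetched-from-a-separate-key-store"   # illustrative placeholder
token = pseudonymise("jane.doe@example.com", key)
```

Because the mapping is deterministic, the same individual keeps the same token across datasets, so joins in analytics still work without exposing the direct identifier.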
ARTICLE 32(1)(b) — CONFIDENTIALITY, INTEGRITY, AVAILABILITY, AND RESILIENCE
| Property | Key Controls | Minimum Standard |
|---|---|---|
| Confidentiality | Access control (RBAC, PAM, MFA); data loss prevention; network segmentation; audit logging of access | MFA on all systems with personal data; RBAC implemented and reviewed; access logs retained minimum 12 months |
| Integrity | Change management; database integrity controls; checksums and hash verification; audit logging of modifications | Immutable audit logs; change approval process; database transaction logging; version control for critical data |
| Availability | Redundancy and failover; load balancing; business continuity planning; SLA monitoring for critical systems | RTO and RPO defined and tested; no single points of failure for critical personal data systems; BCP tested annually |
| Resilience | Ability to withstand disruption including cyber attacks, hardware failure, and natural events; DDoS protection for internet-facing services | DDoS mitigation in place; multi-zone or multi-region deployment for critical systems; infrastructure resilience tested |
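The checksum control in the integrity row can be sketched as follows; the file paths and the storage of the expected digest are assumptions for illustration.

```python
import hashlib
import hmac

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 so large files need no extra memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_integrity(path: str, expected_digest: str) -> bool:
    """True if the file still matches the digest recorded when it was written."""
    return hmac.compare_digest(sha256_of(path), expected_digest)
```

Recording the digest at write time (ideally in an immutable audit log, per the table above) and re-verifying before the data is trusted again gives a detectable signal for any unauthorised modification.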
ARTICLE 32(1)(c) — RESTORE AVAILABILITY AND ACCESS AFTER INCIDENT
| Control | Requirement | Minimum Standard |
|---|---|---|
| Backup regime | All personal data systems backed up; backup frequency appropriate to RTO/RPO; backups stored separately from primary systems; backups encrypted | Daily backup of production personal data; off-site or cloud backup; backup encryption confirmed; retention period defined |
| Recovery testing | Backup restoration tested regularly to confirm recovery is possible; test results documented; gaps remediated | Full restoration test at least annually; partial restoration test quarterly; disaster recovery test including personal data systems |
| Disaster recovery plan | Documented plan for restoring personal data processing after major incident; roles and responsibilities defined; escalation path to DPO for GDPR-relevant incidents | DRP covers all systems in scope for personal data; GDPR breach assessment built into recovery procedure; RTO/RPO specific to personal data systems |
| Incident response procedure | Defined process for detecting, containing, and recovering from security incidents; personal data incident assessment integrated into IR procedure | IR procedure includes GDPR breach assessment checklist; DPO notified within 12 hours of personal data incidents; 72-hour notification path documented |
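A partial restoration test of the kind required above can be automated. The sketch below assumes a tar backup archive and a manifest of SHA-256 digests captured at backup time; both the archive layout and the manifest format are illustrative.

```python
import hashlib
import pathlib
import tarfile
import tempfile

def restore_and_verify(archive: str, manifest: dict[str, str]) -> list[str]:
    """Restore a backup into a scratch directory and return the files that
    failed verification (an empty list means the test passed)."""
    failures = []
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)
        for name, expected in manifest.items():
            restored = pathlib.Path(scratch, name)
            if not restored.is_file():
                failures.append(name)     # file missing from the backup
                continue
            digest = hashlib.sha256(restored.read_bytes()).hexdigest()
            if digest != expected:
                failures.append(name)     # file restored but corrupted
    return failures
```

Running a check like this on a schedule, and filing the output as the test results evidence, turns "backups exist" into "recovery is demonstrably possible", which is what Article 32(1)(c) actually asks for.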
ARTICLE 32(1)(d) — REGULAR TESTING, ASSESSING, AND EVALUATING
| Testing Activity | Frequency | Evidence Required |
|---|---|---|
| Penetration testing (external) | At least annually; after major system changes; after significant infrastructure changes | Penetration test scope and methodology; findings report; remediation tracking; retest confirmation |
| Vulnerability scanning | At least monthly for internet-facing systems; weekly preferred; ad hoc for new deployments | Scan reports; vulnerability register; remediation SLAs by severity; trend analysis |
| Security control audit | At least annually; reviews whether documented controls are in place and operating effectively | Control audit report; gap findings; management action plan; control effectiveness evidence |
| Phishing simulation / social engineering test | At least quarterly; follow-up training for those who fail | Simulation results; click rates; training completion for simulation failures; trend over time |
| Access review | At least annually; quarterly for privileged access; at every joiner/mover/leaver event | Access review records; privileged account review; certification by data owners; any access revocations actioned |
| Backup and DR testing | Full backup restoration test annually; DR test annually; partial restoration test quarterly | Test results; RTO/RPO achieved in test; gaps identified; remediation plan |
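Behind the "TLS scan results" evidence row, even a minimal in-house check can confirm the negotiated protocol version. The sketch below uses Python's standard ssl module; host names are illustrative, and a real programme would use a dedicated scanner that also covers cipher suites and certificate validity.

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443, timeout: float = 5.0) -> str:
    """Connect to a host and report the TLS version actually negotiated."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()   # e.g. "TLSv1.2" or "TLSv1.3"

def meets_minimum(version: str) -> bool:
    """TLS 1.2 minimum, per the encryption-in-transit standard above."""
    return version in ("TLSv1.2", "TLSv1.3")
```

Looping this over the certificate inventory and flagging any host where `meets_minimum` is false gives a lightweight monthly check between full scans.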
Security Governance: Making Article 32 Operational
Technical controls are only effective if they are governed by clear organisational structures, policies, and accountability. Security governance translates Article 32’s requirements into an operational security programme with defined ownership, review cycles, and escalation paths.
SECURITY GOVERNANCE STRUCTURE
| Governance Element | Requirement | GDPR Linkage |
|---|---|---|
| Information security policy | Board-approved security policy; reviewed annually; covers all major control areas; communicated to all staff | Demonstrates Art. 32 commitment at leadership level; part of TOM (technical and organisational measures) documentation for enterprise questionnaires |
| Security ownership | Named CISO or security lead; clear accountability for control implementation; reporting to board or senior leadership | Accountability principle requires identifiable ownership; security owner coordinates with DPO on Art. 32 compliance |
| Security risk register | Documented security risks with likelihood, impact, and treatment; reviewed quarterly; updated on new threats | Risk-based approach to Art. 32; demonstrates proportionality; informs DPIA risk sections |
| Supplier security requirements | Security requirements flowed down to processors and sub-processors via DPA and supplementary security schedules | Art. 28 requires processors to implement security per Art. 32; security schedule in DPA operationalises this |
| Security incident procedure | Incident classification; escalation to DPO; GDPR breach assessment SLA; notification procedure | Connects Art. 32 security incident response to Art. 33/34 breach notification obligations |
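One way to make the security risk register concrete is a structured entry with a quarterly review clock, as in the sketch below. The field names, the 1 to 5 scales, and the 91-day review window are assumptions for illustration, not a mandated format.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RiskEntry:
    """One row of the security risk register from the governance table."""
    risk_id: str
    description: str
    likelihood: int        # 1 (rare) to 5 (almost certain)
    impact: int            # 1 (negligible) to 5 (severe)
    treatment: str         # e.g. "mitigate", "accept", "transfer"
    last_reviewed: date

    @property
    def score(self) -> int:
        """Simple likelihood x impact score for ranking risks."""
        return self.likelihood * self.impact

    def review_overdue(self, today: date) -> bool:
        """Quarterly review cycle (~91 days), per the governance table."""
        return today - self.last_reviewed > timedelta(days=91)
```

Sorting entries by `score` and flagging any where `review_overdue` is true gives the DPO and security owner a ready agenda for the quarterly review.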
| BITLION INSIGHT | Article 32 compliance is not a snapshot — it is a continuous process. Security measures that were appropriate when a system was first deployed may be inadequate two years later as the threat landscape evolves, the volume of data processed grows, and the ‘state of the art’ advances. An annual security review that reassesses the risk profile of each significant processing activity against current threat intelligence and the currently available security tooling — and updates the TOM schedule accordingly — is the mechanism that keeps Article 32 compliance current rather than historical. |