Security of Processing

Article 32 of the GDPR requires controllers and processors to implement a level of security appropriate to the risk, taking into account the state of the art, the costs of implementation, and the nature, scope, context, and purposes of the processing. It identifies four categories of measures that must be considered but intentionally stops short of prescribing particular technical controls. This design reflects the diversity of processing contexts: a risk-based standard is more durable than a prescriptive checklist that becomes obsolete as technology evolves.

Article 32’s risk-based approach means that compliance requires genuine risk analysis, not just control implementation. An organisation that deploys a checklist of security controls without assessing whether those controls are proportionate to the specific risks of its processing activities is not fulfilling the Article 32 obligation. The standard is proportionality, not merely the presence of controls.

 

The Article 32 Risk Assessment

Before selecting and implementing security measures, Article 32(2) requires that the assessment of the appropriate level of security take account of the risks presented by the processing, in particular the risks from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data. This risk assessment is not the same as a DPIA (which is required for high-risk processing under Article 35); it is the security risk assessment that informs the appropriate level of technical and organisational measures for every processing activity.

ARTICLE 32 SECURITY RISK ASSESSMENT — FRAMEWORK

| Risk Dimension | Assessment Question | Higher Risk Indicators |
| --- | --- | --- |
| Data sensitivity | How sensitive is the personal data being processed? What harm could result from unauthorised access or disclosure? | Special category data (health, biometric, racial origin); financial credentials; data about children; large volumes of data enabling identity fraud |
| Processing scale and scope | How many individuals’ data is involved? How many systems process this data? How complex is the processing? | Mass-scale processing (millions of records); complex multi-system processing; processing involving many processors and sub-processors |
| Threat landscape | What specific threats apply to this processing activity? What attack vectors are relevant to this type of data? | External cyber threats (phishing, ransomware, web application attacks); insider threats; third-party supply chain attacks; physical theft |
| Potential consequences of breach | What would be the consequences for individuals if this data were breached? What would be the consequences for the organisation? | Severe individual harm (physical safety, financial loss, discrimination); large regulatory fine; reputational damage; operational disruption |
| Current control effectiveness | Are existing controls adequate for the identified risks? Are there gaps between risk level and control maturity? | Controls below state-of-the-art for data sensitivity; controls not tested; gap between documented controls and operational reality |
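The dimensions in this framework can be combined into a simple scoring model. The sketch below is illustrative only: the 1–5 scales, the max-based aggregation, the control-maturity offset, and the example activity are assumptions of this example, not values prescribed by Article 32.

```python
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    name: str
    data_sensitivity: int      # 1 (low) .. 5 (special category, children)
    scale: int                 # 1 (small) .. 5 (mass-scale, multi-system)
    threat_exposure: int       # 1 .. 5 (external, insider, supply chain)
    breach_consequences: int   # 1 .. 5 (severity of harm to individuals)
    control_maturity: int      # 1 (weak) .. 5 (state of the art, tested)

def residual_risk(a: ProcessingActivity) -> int:
    """Inherent risk is driven by the worst dimension; mature,
    tested controls reduce it (floor of 1)."""
    inherent = max(a.data_sensitivity, a.scale,
                   a.threat_exposure, a.breach_consequences)
    return max(1, inherent - (a.control_maturity - 1))

def risk_band(score: int) -> str:
    return "high" if score >= 4 else "medium" if score >= 3 else "low"

# Hypothetical activity used only to exercise the model.
crm = ProcessingActivity("CRM marketing analytics", 2, 4, 3, 3, 4)
print(risk_band(residual_risk(crm)))  # → low
```

The point of even a crude model like this is that it forces the proportionality judgement Article 32 requires: the same control set can leave one activity low risk and another high risk.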

 

Implementing the Four Article 32 Measures

ARTICLE 32(1)(a) — PSEUDONYMISATION AND ENCRYPTION

| Measure | Implementation Standard | Evidence |
| --- | --- | --- |
| Encryption at rest | AES-256 for storage; database encryption for production databases; field-level encryption for special category data; full-disk encryption for all endpoints | Encryption policy; system configuration records; encryption audit; endpoint MDM enforcement reports |
| Encryption in transit | TLS 1.2 minimum, TLS 1.3 preferred for all data transmission; HTTPS enforced on all web applications; encrypted file transfer for bulk data | Certificate inventory; TLS scan results; HTTPS enforcement records; secure transfer procedure |
| Pseudonymisation | Replace direct identifiers in analytics, testing, and development environments; separate storage of key/mapping table; key access restricted | Pseudonymisation implementation documentation; separate key store access controls; test environment policy |
| Key management | Encryption keys managed separately from encrypted data; customer-managed keys (CMEK) for cloud-hosted sensitive data; key rotation schedule; access to keys restricted and logged | Key management policy; HSM or KMS configuration; key rotation records; key access audit log |
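As a concrete illustration of pseudonymisation with the key held separately, the sketch below replaces an email address with a keyed hash (HMAC-SHA-256). The key value and record fields are placeholders: in practice the key would live in a KMS or HSM, separate from the pseudonymised dataset, and re-identification would rely on a separately stored mapping table with restricted, logged access.

```python
import hashlib
import hmac

# Placeholder only — in production this key is fetched from a KMS/HSM,
# never hard-coded alongside the data.
SECRET_KEY = b"replace-with-key-from-kms"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash.

    Deterministic, so the same person maps to the same token across
    datasets; without the key (and a mapping table) the token cannot
    be linked back to the individual.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "purchases": 3}
safe = {**record, "email": pseudonymise(record["email"])}
```

A keyed hash rather than a plain hash matters here: unkeyed hashes of low-entropy identifiers such as email addresses can be reversed by brute force, which would undermine the pseudonymisation claim.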

ARTICLE 32(1)(b) — CONFIDENTIALITY, INTEGRITY, AVAILABILITY, AND RESILIENCE

| Property | Key Controls | Minimum Standard |
| --- | --- | --- |
| Confidentiality | Access control (RBAC, PAM, MFA); data loss prevention; network segmentation; audit logging of access | MFA on all systems with personal data; RBAC implemented and reviewed; access logs retained minimum 12 months |
| Integrity | Change management; database integrity controls; checksums and hash verification; audit logging of modifications | Immutable audit logs; change approval process; database transaction logging; version control for critical data |
| Availability | Redundancy and failover; load balancing; business continuity planning; SLA monitoring for critical systems | RTO and RPO defined and tested; no single points of failure for critical personal data systems; BCP tested annually |
| Resilience | Ability to withstand disruption including cyber attacks, hardware failure, and natural events; DDoS protection for internet-facing services | DDoS mitigation in place; multi-zone or multi-region deployment for critical systems; infrastructure resilience tested |
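The checksum-and-hash-verification control in the integrity row can be illustrated with a minimal sketch: store a digest alongside a record and verify it before use. Field names are invented for the example; production systems would combine this with database-level integrity controls and immutable audit logs.

```python
import hashlib
import json

def digest(record: dict) -> str:
    """SHA-256 over a canonical (sorted-key) JSON serialisation,
    so logically equal records always hash the same."""
    canonical = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

record = {"subject_id": "123", "consent": True}
stored_digest = digest(record)

# Later, before processing, confirm the record is unmodified.
assert digest(record) == stored_digest

# Any change — here, simulated tampering — breaks verification.
record["consent"] = False
assert digest(record) != stored_digest
```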

ARTICLE 32(1)(c) — RESTORE AVAILABILITY AND ACCESS AFTER INCIDENT

| Control | Requirement | Testing Standard |
| --- | --- | --- |
| Backup regime | All personal data systems backed up; backup frequency appropriate to RTO/RPO; backups stored separately from primary systems; backups encrypted | Daily backup of production personal data; off-site or cloud backup; backup encryption confirmed; retention period defined |
| Recovery testing | Backup restoration tested regularly to confirm recovery is possible; test results documented; gaps remediated | Full restoration test at least annually; partial restoration test quarterly; disaster recovery test including personal data systems |
| Disaster recovery plan | Documented plan for restoring personal data processing after major incident; roles and responsibilities defined; escalation path to DPO for GDPR-relevant incidents | DRP covers all systems in scope for personal data; GDPR breach assessment built into recovery procedure; RTO/RPO specific to personal data systems |
| Incident response procedure | Defined process for detecting, containing, and recovering from security incidents; personal data incident assessment integrated into IR procedure | IR procedure includes GDPR breach assessment checklist; DPO notified within 12 hours of personal data incidents; 72-hour notification path documented |
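The notification timelines in the incident response row can be expressed as a small deadline calculator. Note the distinction the sketch encodes: the 72-hour supervisory authority window comes from Article 33 and runs from awareness of the breach, whereas the 12-hour DPO escalation is an internal SLA of the kind this framework describes, not a GDPR deadline.

```python
from datetime import datetime, timedelta, timezone

def breach_deadlines(aware_at: datetime) -> dict:
    """Compute incident clock deadlines from the moment the
    organisation becomes aware of a personal data breach."""
    return {
        "dpo_escalation_by": aware_at + timedelta(hours=12),       # internal SLA
        "supervisory_authority_by": aware_at + timedelta(hours=72),  # Art. 33(1)
    }

aware = datetime(2024, 3, 1, 9, 0, tzinfo=timezone.utc)
d = breach_deadlines(aware)
print(d["supervisory_authority_by"])  # 2024-03-04 09:00:00+00:00
```

Using timezone-aware timestamps is deliberate: a 72-hour clock computed in naive local time can silently drift across DST changes or multi-region incident teams.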

ARTICLE 32(1)(d) — REGULAR TESTING, ASSESSING, AND EVALUATING

| Testing Activity | Frequency | Evidence Required |
| --- | --- | --- |
| Penetration testing (external) | At least annually; after major system changes; after significant infrastructure changes | Penetration test scope and methodology; findings report; remediation tracking; retest confirmation |
| Vulnerability scanning | At least monthly for internet-facing systems; weekly preferred; ad hoc for new deployments | Scan reports; vulnerability register; remediation SLAs by severity; trend analysis |
| Security control audit | At least annually; reviews whether documented controls are in place and operating effectively | Control audit report; gap findings; management action plan; control effectiveness evidence |
| Phishing simulation / social engineering test | At least quarterly; follow-up training for those who fail | Simulation results; click rates; training completion for simulation failures; trend over time |
| Access review | At least annually; quarterly for privileged access; at every joiner/mover/leaver event | Access review records; privileged account review; certification by data owners; any access revocations actioned |
| Backup and DR testing | Full backup restoration test annually; DR test annually; partial restoration test quarterly | Test results; RTO/RPO achieved in test; gaps identified; remediation plan |

 

Security Governance: Making Article 32 Operational

Technical controls are only effective if they are governed by clear organisational structures, policies, and accountability. Security governance translates Article 32’s requirements into an operational security programme with defined ownership, review cycles, and escalation paths.

SECURITY GOVERNANCE STRUCTURE

| Governance Element | Requirement | GDPR Linkage |
| --- | --- | --- |
| Information security policy | Board-approved security policy; reviewed annually; covers all major control areas; communicated to all staff | Demonstrates Art. 32 commitment at leadership level; part of TOM documentation for enterprise questionnaires |
| Security ownership | Named CISO or security lead; clear accountability for control implementation; reporting to board or senior leadership | Accountability principle requires identifiable ownership; security owner coordinates with DPO on Art. 32 compliance |
| Security risk register | Documented security risks with likelihood, impact, and treatment; reviewed quarterly; updated on new threats | Risk-based approach to Art. 32; demonstrates proportionality; informs DPIA risk sections |
| Supplier security requirements | Security requirements flowed down to processors and sub-processors via DPA and supplementary security schedules | Art. 28 requires processors to implement security per Art. 32; security schedule in DPA operationalises this |
| Security incident procedure | Incident classification; escalation to DPO; GDPR breach assessment SLA; notification procedure | Connects Art. 32 security incident response to Art. 33/34 breach notification obligations |
BITLION INSIGHT

Article 32 compliance is not a snapshot; it is a continuous process. Security measures that were appropriate when a system was first deployed may be inadequate two years later as the threat landscape evolves, the volume of data processed grows, and the ‘state of the art’ advances. An annual security review that reassesses the risk profile of each significant processing activity against current threat intelligence and the security tooling currently available, and updates the TOM schedule accordingly, is the mechanism that keeps Article 32 compliance current rather than historical.