A Data Protection Impact Assessment (DPIA) is GDPR’s structured mechanism for identifying and managing high-risk processing before it begins. Article 35 makes DPIAs mandatory for processing activities likely to result in a high risk to the rights and freedoms of natural persons, and Article 35(3) specifies three categories of processing for which a DPIA is always required. For every other processing activity that might be high risk, regulatory guidance endorsed by the EDPB sets out nine assessment criteria that determine whether a DPIA is required.
A DPIA is not a bureaucratic formality. It is a systematic analysis of the risks that a specific processing activity poses to data subjects, followed by a structured process for reducing those risks to an acceptable level. Where residual risk remains high despite mitigation, the controller must consult the supervisory authority before the processing begins. A DPIA that identifies unacceptable risks and does nothing about them is worse than no DPIA at all: it creates documented evidence of a known, unmitigated risk.
When a DPIA Is Mandatory
Article 35(1) establishes the general principle: a DPIA is required for any processing likely to result in a high risk. Article 35(3) identifies three categories of processing that always require one. For everything else, the Article 29 Working Party, endorsed by the EDPB, developed nine criteria; where two or more are met, a DPIA is likely required.
ARTICLE 35(3) — ALWAYS REQUIRES A DPIA
| Category | Description | Common Examples |
|---|---|---|
| Systematic and extensive profiling with significant effects | Automated processing including profiling that produces legal or similarly significant decisions about individuals | Credit scoring; insurance risk assessment; recruitment AI; behavioural advertising targeting |
| Large-scale special category processing | Processing of special categories (Art. 9) or criminal records (Art. 10) at large scale | Hospital patient records systems; health app platforms; insurance medical underwriting at scale |
| Systematic monitoring of publicly accessible areas | Large-scale surveillance of areas where the public is present | City-wide CCTV; retail footfall analytics; workplace monitoring systems |
EDPB NINE-CRITERIA DPIA SCREENING (DPIA LIKELY IF 2+ CRITERIA MET)
| Criterion | Description | Example |
|---|---|---|
| 1. Evaluation or scoring | Profiling or predicting aspects of an individual’s performance, personality, interests | Credit scoring; employee performance AI; behavioural profiling |
| 2. Automated decision-making with legal/significant effects | Decisions that significantly affect individuals made without human review | Automated loan decisions; automated employment screening |
| 3. Systematic monitoring | Processing used to observe, monitor, or control data subjects | Workplace monitoring; IoT devices; network monitoring of employees |
| 4. Sensitive data or highly personal data | Special categories; criminal data; financial data; location data; biometric data | Health app; genetic testing; banking analytics |
| 5. Large-scale processing | Large number of data subjects; large volume; wide geographic range; long duration | National health database; global CRM; city-wide transport data |
| 6. Matching or combining datasets | Combining data from multiple sources in ways that exceed reasonable expectations | Cross-referencing purchase data with social media; data broker enrichment |
| 7. Data concerning vulnerable individuals | Children; employees; patients; elderly; asylum seekers; others in power imbalance | Children’s app; employee monitoring; care home management system |
| 8. Innovative use or application of new technologies | New technology creating novel risks not yet fully understood | Facial recognition; AI decision support; smart home devices; IoT |
| 9. Prevents exercise of rights or access to service/contract | Processing that could result in individuals being denied services or rights | Fraud blacklists; credit bureau data; tenant screening databases |
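The screening test above lends itself to a simple checklist implementation. The sketch below is illustrative only, assuming the criterion identifiers and the two-criteria threshold from the tables; it is not an official EDPB tool, and a one-criterion result should still be documented with reasoning.

```python
# Illustrative DPIA screening sketch. Criterion identifiers and the
# two-criteria threshold mirror the tables above; all names are assumptions.

EDPB_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_significant_effects",
    "systematic_monitoring",
    "sensitive_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "prevents_rights_or_service_access",
}

ARTICLE_35_3_CATEGORIES = {
    "systematic_extensive_profiling_significant_effects",
    "large_scale_special_category_processing",
    "systematic_monitoring_public_areas",
}

def dpia_screening(art_35_3_matches: set, criteria_met: set) -> tuple:
    """Return (dpia_indicated, reason) for a proposed processing activity."""
    unknown = criteria_met - EDPB_CRITERIA
    if unknown:
        raise ValueError(f"Unknown screening criteria: {sorted(unknown)}")
    # Article 35(3) categories always require a DPIA, regardless of criteria count.
    if art_35_3_matches & ARTICLE_35_3_CATEGORIES:
        return True, "Article 35(3) category: DPIA always required"
    if len(criteria_met) >= 2:
        return True, f"{len(criteria_met)} EDPB criteria met: DPIA likely required"
    if len(criteria_met) == 1:
        return False, "One criterion met: record the reasons if no DPIA is done"
    return False, "No criteria met: DPIA not indicated"
```

For example, a credit-scoring feature that matches both `evaluation_or_scoring` and `automated_decision_significant_effects` crosses the two-criteria threshold and would return a positive screening result.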
The DPIA Process: Five Systematic Steps
Article 35(7) specifies the minimum content of a DPIA. The Article 29 Working Party’s guidelines on DPIAs (WP248 rev.01), endorsed by the EDPB, elaborate the methodology. The process can be structured as five sequential steps that build the DPIA document systematically.
DPIA METHODOLOGY — FIVE-STEP PROCESS
| Step | What It Involves | Output |
|---|---|---|
| 1. Processing description | Document the processing activity in detail: purposes, data categories, data subjects, recipients, retention, transfers, technical architecture | Processing description section of DPIA |
| 2. Necessity and proportionality assessment | Assess whether the processing is necessary for the purpose; whether less intrusive means could achieve the same purpose; whether the legal basis is appropriate | Necessity analysis; basis documentation |
| 3. Risk identification | Identify the risks to data subjects: unauthorised access, excessive collection, function creep, inability to exercise rights, discrimination. Assess likelihood and severity before mitigation. | Risk register with pre-mitigation ratings |
| 4. Risk mitigation | For each identified risk, document the measures taken to reduce it: technical controls, organisational measures, anonymisation, access restrictions, contractual protections. Assess residual risk after mitigation. | Mitigated risk register; residual risk ratings |
| 5. Consultation and approval | DPO reviews DPIA; if residual risk remains high, consult supervisory authority before processing. Document DPO advice and, if applicable, SA consultation outcome. | DPO sign-off; SA opinion (if consulted) |
Step 1: Processing Description — More Detail Than You Think
The processing description is the foundation of the DPIA. It must be sufficiently detailed that a reader who is unfamiliar with the system can understand precisely what personal data is involved, how it flows, who has access, where it is stored, and what decisions it enables or informs. A vague description — ‘we process customer data to provide our service’ — is not a processing description for DPIA purposes.
The processing description should include: the specific purposes of the processing; a detailed description of the processing operations (collection, storage, analysis, sharing, deletion); the categories and approximate volumes of data subjects; the categories of personal data processed; any special categories or criminal records data; the technical architecture (which systems handle the data, where they are located, who operates them); the data flows (how data enters, moves within, and exits the system); the processors and sub-processors involved; and the retention periods and deletion mechanisms.
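The checklist in the paragraph above can be captured as a structured record so that gaps are machine-detectable before DPO review. The dataclass below is an illustrative sketch, not a regulatory template; the field names are assumptions drawn from the list above.

```python
# Illustrative structured record for a DPIA processing description.
# Field names mirror the checklist above; the class is an assumption.
from dataclasses import dataclass, field, fields

@dataclass
class ProcessingDescription:
    purposes: list = field(default_factory=list)
    operations: list = field(default_factory=list)  # collection, storage, analysis, sharing, deletion
    data_subject_categories: list = field(default_factory=list)
    approximate_subject_volume: str = ""
    personal_data_categories: list = field(default_factory=list)
    special_category_data: list = field(default_factory=list)  # may legitimately be empty
    technical_architecture: str = ""
    data_flows: str = ""
    processors_and_subprocessors: list = field(default_factory=list)
    retention_and_deletion: str = ""

    def missing_fields(self) -> list:
        """Names of fields still empty; special-category data may be empty by design."""
        optional = {"special_category_data"}
        return [f.name for f in fields(self)
                if f.name not in optional and not getattr(self, f.name)]
```

A description that still returns a non-empty `missing_fields()` list is, by the standard in the paragraph above, not yet a processing description for DPIA purposes.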
Step 3: Risk Identification — Thinking Like a Regulator
The risk identification step requires the organisation to systematically consider what could go wrong — for data subjects, not for the organisation. GDPR’s risk framework focuses on risks to the rights and freedoms of natural persons, not on regulatory or reputational risk to the controller.
DPIA RISK CATEGORIES — RISKS TO DATA SUBJECTS
| Risk Category | Description | Example Sources |
|---|---|---|
| Confidentiality breach | Unauthorised access to or disclosure of personal data | Cyberattack; insider threat; misconfigured access controls; accidental disclosure |
| Integrity failure | Inaccurate or incomplete personal data leading to wrong decisions about individuals | Data corruption; failure to update records; incorrect automated decision-making |
| Availability failure | Inability to access personal data when needed, including by the data subject | System outage; data loss without backup; disaster recovery failure |
| Excessive collection | More data collected than necessary, creating unnecessary risk | Scope creep in development; over-broad API permissions; unnecessary fields |
| Function creep | Data used for purposes beyond those for which it was collected | Analytics repurposing; AI training without consent; secondary profiling |
| Rights obstruction | Data subject unable to exercise their GDPR rights | No SAR process; no erasure mechanism; no opt-out for direct marketing |
| Discrimination or harm | Processing results in discriminatory treatment or direct harm to individuals | Biased AI scoring; health data exposure leading to insurance denial; re-identification |
| Loss of control | Data subject loses meaningful control over their data | Data sold to third parties; excessive third-party sharing; opaque profiling |
Assessing Risk: Likelihood and Severity
Each identified risk must be assessed on two dimensions before and after mitigation: the likelihood that the risk will materialise, and the severity of the impact on data subjects if it does. The combination of likelihood and severity determines the overall risk level for that risk, which in turn drives the mitigation response.
RISK RATING MATRIX
| | Low Severity | Medium Severity | High Severity |
|---|---|---|---|
| Low Likelihood | Low — monitor | Low — monitor | Medium — mitigate |
| Medium Likelihood | Low — monitor | Medium — mitigate | High — mitigate or consult SA |
| High Likelihood | Medium — mitigate | High — mitigate | Very High — consult SA or do not proceed |
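The matrix above reduces to a lookup from a (likelihood, severity) pair to a risk level and response. A minimal sketch, with the labels taken from the table and an otherwise assumed interface:

```python
# The risk rating matrix above as a lookup table. Levels and responses
# are taken from the table; the function interface is illustrative.

RISK_MATRIX = {
    ("low", "low"): ("low", "monitor"),
    ("low", "medium"): ("low", "monitor"),
    ("low", "high"): ("medium", "mitigate"),
    ("medium", "low"): ("low", "monitor"),
    ("medium", "medium"): ("medium", "mitigate"),
    ("medium", "high"): ("high", "mitigate or consult SA"),
    ("high", "low"): ("medium", "mitigate"),
    ("high", "medium"): ("high", "mitigate"),
    ("high", "high"): ("very high", "consult SA or do not proceed"),
}

def rate_risk(likelihood: str, severity: str) -> tuple:
    """Return (risk level, response) for a likelihood/severity pair."""
    key = (likelihood.lower(), severity.lower())
    if key not in RISK_MATRIX:
        raise ValueError(f"Unknown likelihood/severity pair: {key}")
    return RISK_MATRIX[key]
```

Applying `rate_risk` to every entry in the risk register, before and after mitigation, yields the pre-mitigation and residual ratings that steps 3 and 4 of the methodology require.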
Prior Consultation: When Residual Risk Is Unacceptable
Article 36 requires the controller to consult the supervisory authority prior to processing where a DPIA indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk. In other words: where the organisation has conducted a DPIA, has implemented all reasonable mitigations, and the residual risk to data subjects remains high, the SA must be consulted before processing begins.
The prior consultation procedure involves submitting the DPIA to the SA along with the information specified in Article 36(3): the respective responsibilities of controllers, joint controllers, and processors; the purposes and means of the intended processing; the measures and safeguards provided to protect the rights and freedoms of data subjects; the contact details of the DPO; and any other information the SA requests. The SA has up to eight weeks to respond, extendable by a further six weeks (fourteen in total) where the intended processing is complex.
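The Article 36(2) response clock (eight weeks, extendable by six for complex processing) can be computed straightforwardly. The function below is an illustrative sketch, not a compliance tool; among other things, it ignores that the clock may be suspended while the SA awaits requested information.

```python
# Sketch of the Article 36(2) consultation clock: up to eight weeks for
# the SA's written advice, extendable by a further six weeks for complex
# cases. Function name and interface are assumptions.
from datetime import date, timedelta

def consultation_deadline(submitted: date, extended: bool = False) -> date:
    """Latest date for the SA's written advice under Article 36(2)."""
    weeks = 8 + (6 if extended else 0)
    return submitted + timedelta(weeks=weeks)
```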
If the SA considers the proposed processing to be in violation of GDPR, it will provide written advice within the consultation period and may use its corrective powers. Processing that proceeds in disregard of an SA’s negative opinion in the prior consultation procedure is a serious GDPR violation.
| IMPORTANT | Prior consultation is not optional where the DPIA identifies unmitigable high risk. Organisations that conduct a DPIA, identify high residual risk, and proceed without consulting the SA have not complied with Article 36 — and the DPIA record creates documented evidence that they knowingly proceeded with a high-risk processing activity without regulatory clearance. This materially worsens the enforcement position. |
The DPO’s Role in the DPIA
Article 35(2) requires controllers to seek the advice of the DPO, where designated, when carrying out a DPIA. The DPO must be involved in the DPIA process, not just informed of its outcome. In practice, this means the DPO should: advise on whether a DPIA is required; review the processing description for accuracy and completeness; validate the risk identification methodology; assess the adequacy of proposed mitigations; and sign off on the final DPIA document or document their concerns if they do not endorse it.
For organisations without a mandatory DPO, the equivalent function — independent review of the DPIA by a qualified privacy professional — should still be performed. A DPIA reviewed only by the team responsible for the processing being assessed lacks independence and is less likely to identify risks that the project team has a stake in undervaluing.
Integrating DPIA into the Product Development and Procurement Lifecycle
A DPIA is most useful when it is conducted before design decisions are finalised — when there is still time to modify the architecture, reduce the data scope, or implement privacy-enhancing measures without incurring significant rework costs. A DPIA conducted after a system is built and ready to launch may identify risks that are technically and commercially impractical to mitigate at that point.
DPIA TRIGGER POINTS IN THE PRODUCT LIFECYCLE
| Lifecycle Stage | DPIA Action | Who Is Responsible |
|---|---|---|
| New product or feature concept | Screen against DPIA criteria (Art. 35(3) and nine criteria) | Product owner + Privacy team |
| Architecture design phase | Initiate DPIA if screening triggers; begin processing description | Engineering lead + Privacy team |
| Pre-development sign-off | Complete DPIA steps 1–4; obtain DPO review | DPO / Privacy team |
| Launch approval | Confirm DPIA is current; SA consultation complete if required | DPO; Senior management |
| Material change to processing | Re-screen; update or reopen DPIA if scope changes significantly | Product owner + Privacy team |
| Annual review | Review all active DPIAs for currency; update where processing has changed | DPO / Privacy team |
Procurement of third-party systems that will process personal data also requires DPIA consideration. A controller that deploys a new SaaS platform processing special category data at scale may need to conduct a DPIA even though it is not building the system itself — the DPIA obligation attaches to the processing activity, not to whether the controller built the processing infrastructure. Due diligence on third-party systems should include an assessment of whether the procurement triggers a DPIA requirement.
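One way to operationalise the trigger points above is a gate in the intake workflow that blocks budget approval until DPIA screening has been recorded, for new products and third-party procurement alike. The sketch below is purely illustrative; the request types and status values are assumptions, not terms from any regulation or standard workflow tool.

```python
# Hypothetical intake gate: no budget approval without recorded DPIA
# screening. Request types and status values are illustrative assumptions.

REQUIRES_SCREENING = {"new_product", "new_feature", "third_party_procurement"}

def intake_gate(request_type: str, screening_recorded: bool,
                dpia_status: str) -> tuple:
    """Return (approved, reason) for a budget-approval request."""
    if request_type in REQUIRES_SCREENING and not screening_recorded:
        return False, "DPIA screening must be recorded before budget approval"
    if dpia_status == "required_not_started":
        return False, "Screening indicates a DPIA is required; initiate it first"
    return True, "No outstanding DPIA actions"
```

The design choice matters: placing the gate before budget approval, rather than before launch, is what preserves the option to change the architecture or data scope cheaply.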
| BITLION INSIGHT | Organisations that build DPIA triggering into their intake process for new products and third-party procurement — as a mandatory step before budget approval — consistently report lower post-launch risk discoveries and fewer enforcement interactions. The DPIA is not a cost imposed on product development. It is the process that prevents a €1 million fine from being the consequence of a product decision that could have been made differently at the design stage for a fraction of the cost. |