Article 25 of GDPR codifies a principle that data protection advocates had long argued for but few regulations had previously mandated: privacy must be built in, not bolted on. Privacy by design requires organisations to embed data protection into the architecture of their systems, processes, and products from the outset. Privacy by default requires that the most privacy-protective settings be the default — that systems do not collect or share more data than is necessary unless the user actively chooses to share more.
For technology companies, product teams, and engineering organisations, Article 25 is one of the most operationally consequential provisions of GDPR. It means that privacy is a design requirement, not a legal review that happens after launch. For organisations that have traditionally treated data protection as a compliance overlay, Article 25 requires a significant cultural and operational shift.
What Article 25 Actually Requires
Article 25(1) requires controllers to implement appropriate technical and organisational measures designed to implement the data protection principles effectively and integrate the necessary safeguards into the processing, in order to meet GDPR’s requirements and protect the rights of data subjects. These measures must be implemented both at the time of determining the means of processing (design stage) and at the time of processing itself (operational stage).
The standard is ‘appropriate’ — taking into account the state of the art, the cost of implementation, the nature, scope, context, and purposes of processing, and the risks of varying likelihood and severity for the rights and freedoms of natural persons. Privacy by design is therefore not a fixed set of technical controls. It is a risk-proportionate, context-sensitive obligation to build privacy into systems by design.
Article 25(2) addresses privacy by default specifically: the controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage, and their accessibility.
| KEY IDEA | Privacy by default does not mean the most restrictive possible setting in every case. It means the most privacy-protective setting that is consistent with the processing purpose. A user can always choose to share more — but sharing more should require an active choice, not an active opt-out. Default settings should reflect the minimum necessary, not the maximum possible. |
The Seven Foundational Principles of Privacy by Design
The privacy by design framework, originally developed by Ann Cavoukian and widely cited in regulatory guidance on Article 25, articulates seven foundational principles that give practical content to the Article 25 obligation. These are not GDPR requirements in themselves, but they provide the most widely accepted framework for implementing privacy by design in practice.
SEVEN FOUNDATIONAL PRINCIPLES OF PRIVACY BY DESIGN
| Principle | What It Means | Practical Application |
|---|---|---|
| 1. Proactive, not reactive | Anticipate and prevent privacy risks before they occur | Privacy review in design sprints, not post-launch audits |
| 2. Privacy as the default | Protect privacy automatically, without data subject action | Minimum collection by default; opt-in to share more |
| 3. Privacy embedded into design | Integrated into system architecture, not added afterwards | Data flows designed with minimisation from the start |
| 4. Full functionality — positive-sum | Avoid privacy vs. functionality trade-offs where possible | Privacy-enhancing features as product advantages |
| 5. End-to-end security | Protect data through its full lifecycle, then securely destroy | Encryption in transit and at rest; automated deletion |
| 6. Visibility and transparency | Keep operations and data uses open to external verification | Accurate privacy notices; audit logs; user dashboards |
| 7. Respect for user privacy | Keep it user-centric; strong defaults; user control | Consent management; preference centres; access tools |
Privacy by Design in the Product Development Lifecycle
For organisations that build software products or digital services, Article 25 creates a direct obligation to integrate privacy into the product development process. This means privacy review is not a stage that happens after features are built — it is a requirement that must be satisfied before processing design is finalised.
PRIVACY BY DESIGN — INTEGRATION ACROSS THE SDLC
| SDLC Phase | Privacy by Design Requirements | Key Questions |
|---|---|---|
| Requirements & planning | Define data collection scope; specify purposes; identify special categories; assess DPIA need | What data is collected? Why? Is a DPIA needed? |
| Architecture & design | Design data flows with minimisation; define retention periods; specify access controls; plan deletion | Is the data architecture privacy-preserving by default? |
| Development | Implement pseudonymisation/anonymisation; enforce access controls in code; implement logging; apply encryption | Are privacy controls implemented, not just specified? |
| Testing & QA | Test with anonymised/synthetic data; verify data minimisation; test deletion; review access log coverage | Do tests use real personal data? (They shouldn’t.) |
| Launch & deployment | Verify default settings are minimum-data; confirm consent flows; check privacy notice accuracy | Are defaults set correctly? Is the notice up to date? |
| Operations & maintenance | Monitor for scope creep; review data access logs; enforce retention schedules; update DPIA on changes | Has the data scope changed since design? Is the DPIA current? |
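The "requirements & planning" and "testing & QA" rows above can be partially automated. The sketch below shows a minimal CI-style data minimisation gate that compares the fields a feature declares against an approved minimum-data schema; the feature names, field names, and manifest shape are illustrative assumptions, not a standard format.

```python
# Sketch of a CI privacy gate for data minimisation. APPROVED_FIELDS stands in
# for the schema agreed at privacy intake; field and feature names are
# hypothetical examples.

APPROVED_FIELDS = {
    "registration": {"email", "display_name"},   # minimum necessary for the purpose
    "checkout": {"email", "shipping_address"},
}

def check_minimisation(feature: str, declared_fields: set[str]) -> list[str]:
    """Return violations: fields a feature declares that are not in its
    approved minimum-data schema. An empty list means the gate passes."""
    approved = APPROVED_FIELDS.get(feature)
    if approved is None:
        return [f"no approved schema for '{feature}' -- privacy intake required"]
    return sorted(f"'{f}' not approved for '{feature}'"
                  for f in declared_fields - approved)

# A feature that quietly adds date_of_birth fails the gate:
violations = check_minimisation("registration",
                                {"email", "display_name", "date_of_birth"})
```

Running a check like this on every merge request gives the development process a verifiable, repeatable minimisation step rather than relying on manual review alone.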
Privacy by Default: Translating Principle into Settings
Privacy by default under Article 25(2) has the most concrete and immediately verifiable implications. It requires that for every system, application, and service that processes personal data, the default configuration collects and processes the minimum necessary — and that broader collection or sharing requires an active choice by the user, not an active opt-out.
PRIVACY BY DEFAULT — EXAMPLES ACROSS PRODUCT CONTEXTS
| Product Context | Non-Compliant Default (Too Much) | Compliant Default (Minimum) |
|---|---|---|
| User registration form | Collects date of birth, phone, gender, address alongside email and name | Collects only what is necessary; optional fields clearly marked |
| Mobile app permissions | Requests location, contacts, microphone on install | No permissions requested until feature requires it; explained in context |
| Analytics tracking | Full behavioural tracking enabled for all users by default | No tracking by default; user opts into analytics |
| Profile visibility | User profile visible to all platform members by default | Profile visible only to connections/friends by default |
| Data retention | Data retained indefinitely unless user deletes account | Data deleted after defined retention period automatically |
| Marketing preferences | All marketing channels opted in by default | All marketing channels opted out by default; user opts in |
| Third-party data sharing | Data shared with partner network by default | No third-party sharing by default; explicit consent required |
| Session data | Long-lived persistent sessions; no inactivity timeout | Short sessions; inactivity timeout; re-authentication required |
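The compliant defaults in the table above can be encoded directly in code, so that the privacy-protective configuration is the one a new account receives unless the user actively widens it. This is a minimal sketch; the setting names and retention period are illustrative assumptions, not prescribed values.

```python
from dataclasses import dataclass

# Article 25(2) expressed as code: every privacy-affecting setting defaults to
# the minimum necessary, and widening any of them requires an explicit user
# action. Field names and values are hypothetical examples.

@dataclass
class PrivacySettings:
    analytics_tracking: bool = False         # opt-in, never on by default
    marketing_email: bool = False            # all channels opted out
    third_party_sharing: bool = False        # explicit consent required
    profile_visibility: str = "connections"  # not "everyone"
    retention_days: int = 365                # defined period, not indefinite

def widen(settings: PrivacySettings, setting: str, value: object) -> PrivacySettings:
    """Apply a user's active choice to share more. Nothing in the default
    code path calls this; only an explicit UI action does."""
    setattr(settings, setting, value)
    return settings
```

Keeping the defaults in one typed structure also makes them auditable: the "default settings documentation" evidence discussed later in this article can be generated from the code itself.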
Technical Measures That Implement Privacy by Design
The EDPB’s Guidelines 4/2019 on Article 25 identify specific technical measures that implement privacy by design. The list is not exhaustive, but it provides a concrete foundation for technology teams translating the legal obligation into engineering practice.
TECHNICAL PRIVACY BY DESIGN MEASURES
| Measure | What It Does | Relevant Principle |
|---|---|---|
| Pseudonymisation at rest | Separates identifying data from processing data; reduces breach impact | Minimisation; integrity & confidentiality |
| Data masking in non-prod environments | Prevents real personal data exposure in development and testing | Minimisation; confidentiality |
| Automated retention enforcement | Deletes or anonymises data automatically at end of retention period | Storage limitation; accountability |
| Granular access controls (RBAC) | Limits data access to roles with genuine need; enforces least privilege | Confidentiality; minimisation |
| Audit logging of data access | Creates traceable record of who accessed what personal data and when | Accountability; integrity |
| Encryption in transit and at rest | Protects data from interception and unauthorised access | Integrity & confidentiality |
| Consent management platform | Records consent state per user per purpose; enables withdrawal | Lawfulness; transparency |
| Data lineage tracking | Maps where data came from, where it goes, and what transforms it | Purpose limitation; accountability |
| Privacy-enhancing computation (PEC) | Techniques like differential privacy, federated learning that enable analysis without exposing individual data | Minimisation; integrity |
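As a concrete illustration of the pseudonymisation row above, the sketch below replaces identifiers with keyed HMAC-SHA256 pseudonyms. The approach is one common pattern, not the only one: the key must be stored separately from the processing dataset (key management is assumed, not shown), and the key name used here is hypothetical.

```python
import hmac
import hashlib

def pseudonymise(identifier: str, key: bytes) -> str:
    """Deterministic keyed pseudonym: the same identifier and key always
    yield the same token, so records can still be joined across datasets
    without exposing the raw identifier. Without the key, the pseudonym
    cannot be reversed or recomputed."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Illustrative only -- in practice the key lives in a secrets manager,
# separate from the system that holds the pseudonymised records.
key = b"held-in-a-separate-key-store"
record = {"user": pseudonymise("alice@example.com", key), "purchases": 3}
```

Because the pseudonym is deterministic, analytics and deduplication still work on the pseudonymised data, while a breach of the processing dataset alone does not expose identities.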
Organisational Measures: The Governance Layer
Technical measures alone do not satisfy Article 25. Organisational measures — the policies, processes, training, and governance structures that ensure privacy is systematically considered in design decisions — are equally required. Article 25(1)’s reference to both ‘technical and organisational measures’ is deliberate.
Key organisational measures include: a privacy intake process for new products and features that flags data processing implications before development begins; a formal privacy review stage in the product development lifecycle; a data protection impact assessment trigger that activates DPIA requirements before high-risk processing is implemented; staff training for engineers and product managers on data minimisation, purpose limitation, and privacy by design principles; and a data classification framework that ensures different handling standards are applied to different categories of personal data.
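The DPIA trigger mentioned above can be made mechanical at the intake stage. The sketch below checks a feature's self-declared processing characteristics against a trigger list loosely modelled on the Article 35(3) criteria; the characteristic names are assumptions, and the authoritative list should come from the DPO.

```python
# Hypothetical DPIA-trigger check for a privacy intake form. The trigger
# names below are illustrative labels, loosely based on the Article 35(3)
# criteria (systematic profiling with legal effect, large-scale special
# categories, systematic monitoring of publicly accessible areas).

DPIA_TRIGGERS = {
    "systematic_profiling_with_legal_effect",
    "large_scale_special_categories",
    "systematic_public_monitoring",
}

def dpia_required(processing_characteristics: set[str]) -> bool:
    """Any overlap between a feature's declared characteristics and the
    trigger list routes it to a formal DPIA before development begins."""
    return bool(processing_characteristics & DPIA_TRIGGERS)
```

Wiring a check like this into the intake form means no high-risk feature reaches development without the DPIA question having been asked and answered.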
| BITLION INSIGHT | Organisations that integrate privacy review as a ‘gate’ in their sprint or feature delivery process — requiring sign-off from a privacy professional before launch — consistently report fewer post-launch compliance issues, simpler data deletion processes, and lower DPIA volumes than those that treat privacy as a post-launch review. The cost of fixing a privacy problem in design is a fraction of the cost of fixing it in production. |
Demonstrating Compliance with Article 25
The accountability principle requires controllers to be able to demonstrate compliance with Article 25. This is one of the more challenging accountability obligations because privacy by design compliance is partly evidenced by what is not there — data that was not collected, fields that were not added, permissions that were not requested — which is inherently harder to document than what is present.
The EDPB’s guidelines identify several forms of evidence that demonstrate Article 25 compliance. Privacy reviews conducted at the design stage and documented in the system file demonstrate that privacy was considered before implementation. Data flow diagrams that show minimum data collection and appropriate routing demonstrate that the architecture reflects minimisation. DPIA records show that high-risk processing was assessed before it was deployed. Privacy by design checklists completed for new features show that the development process systematically addresses privacy requirements.
ARTICLE 25 COMPLIANCE EVIDENCE
| Evidence Type | What It Demonstrates | Who Maintains It |
|---|---|---|
| Privacy intake forms for new features | Privacy was considered at design stage, not retrospectively | Product / Privacy team |
| Data flow diagrams | Architecture reflects minimisation and purpose limitation | Engineering / Architecture |
| DPIA records for high-risk features | High-risk processing was assessed before deployment | DPO / Privacy team |
| Default settings documentation | Current defaults are minimum necessary for purpose | Product / Engineering |
| Privacy by design checklist (per sprint/release) | Privacy review was systematically applied in development | Product / Privacy team |
| Staff training records | Engineers and product staff have received privacy by design training | HR / Learning & Development |
| Retention enforcement logs | Automated deletion is functioning as configured | Engineering / IT Operations |
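The "retention enforcement logs" row above pairs naturally with the automated retention measure discussed earlier: the same job that deletes expired data should emit the evidence that it ran. A minimal sketch, assuming an in-memory record list and a 365-day period (production systems would delete from a real datastore and ship log entries to a tamper-evident store):

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention job: drops records older than the retention period
# and produces audit-log lines evidencing that deletion actually occurred.
# The record shape ({"id": ..., "created": ...}) and period are assumptions.

RETENTION = timedelta(days=365)

def enforce_retention(records: list[dict], now: datetime) -> tuple[list[dict], list[str]]:
    """Return (surviving records, audit-log lines for each deletion)."""
    kept, log = [], []
    for r in records:
        if now - r["created"] > RETENTION:
            log.append(f"{now.isoformat()} deleted record {r['id']} (retention expired)")
        else:
            kept.append(r)
    return kept, log
```

The returned log lines are exactly the artefact the evidence table calls for: proof, per run, that automated deletion is functioning as configured.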
Article 25’s requirements do not stop at launch. Privacy by design is an ongoing obligation: as systems evolve, as data scopes change, and as new features are added, the Article 25 assessment must be revisited. A DPIA or privacy review conducted at initial launch does not cover subsequent material changes. Building privacy review into change management and feature development processes is the only sustainable way to maintain Article 25 compliance across a product’s lifecycle.