Privacy by Design and Default

Article 25 of GDPR codifies a principle that data protection advocates had long argued for but few regulations had previously mandated: privacy must be built in, not bolted on. Privacy by design requires organisations to embed data protection into the architecture of their systems, processes, and products from the outset. Privacy by default requires that the most privacy-protective settings be the default — that systems do not collect or share more data than is necessary unless the user actively chooses to share more.

For technology companies, product teams, and engineering organisations, Article 25 is one of the most operationally consequential provisions of GDPR. It means that privacy is a design requirement, not a legal review that happens after launch. For organisations that have traditionally treated data protection as a compliance overlay, Article 25 requires a significant cultural and operational shift.

 

What Article 25 Actually Requires

Article 25(1) requires controllers to implement appropriate technical and organisational measures designed to implement the data protection principles effectively and integrate the necessary safeguards into the processing, in order to meet GDPR’s requirements and protect the rights of data subjects. These measures must be implemented both at the time of determining the means of processing (design stage) and at the time of processing itself (operational stage).

The standard is ‘appropriate’ — taking into account the state of the art, the cost of implementation, the nature, scope, context, and purposes of processing, and the risks of varying likelihood and severity for the rights and freedoms of natural persons. Privacy by design is therefore not a fixed set of technical controls. It is a risk-proportionate, context-sensitive obligation to build privacy into systems by design.

Article 25(2) addresses privacy by default specifically: the controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage, and their accessibility.

KEY IDEA: Privacy by default does not mean the most restrictive possible setting in every case. It means the most privacy-protective setting that is consistent with the processing purpose. A user can always choose to share more — but sharing more should require an active choice, not an active opt-out. Default settings should reflect the minimum necessary, not the maximum possible.
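The key idea can be sketched in code. A minimal illustration in Python, with hypothetical setting names: every sharing option starts off, and turning one on is an explicit user action.

```python
from dataclasses import dataclass

# Hypothetical settings object: every data-sharing option is off unless the
# user actively enables it, mirroring Article 25(2)'s minimum-by-default rule.
@dataclass
class PrivacySettings:
    analytics_tracking: bool = False   # no behavioural tracking by default
    marketing_emails: bool = False     # opt-in, never opt-out
    profile_public: bool = False       # visible to connections only
    third_party_sharing: bool = False  # explicit consent required

    def enable(self, option: str) -> None:
        """Record an active user choice to share more."""
        if not hasattr(self, option):
            raise ValueError(f"unknown setting: {option}")
        setattr(self, option, True)

# A fresh account starts at the minimum; sharing more is an explicit act.
settings = PrivacySettings()
settings.enable("marketing_emails")  # active opt-in, not a pre-ticked box
```

The point of the sketch is structural: the type system makes the minimum the default, so a developer has to write code to share more, never to share less.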

 

The Seven Foundational Principles of Privacy by Design

The privacy by design framework, originally developed by Ann Cavoukian and referenced in GDPR’s implementation guidance, articulates seven foundational principles that provide practical content for the Article 25 obligation. These are not GDPR requirements in themselves, but they provide the most widely accepted framework for implementing privacy by design in practice.

SEVEN FOUNDATIONAL PRINCIPLES OF PRIVACY BY DESIGN

| Principle | What It Means | Practical Application |
| --- | --- | --- |
| 1. Proactive, not reactive | Anticipate and prevent privacy risks before they occur | Privacy review in design sprints, not post-launch audits |
| 2. Privacy as the default | Protect privacy automatically, without data subject action | Minimum collection by default; opt-in to share more |
| 3. Privacy embedded into design | Integrated into system architecture, not added afterwards | Data flows designed with minimisation from the start |
| 4. Full functionality — positive-sum | Avoid privacy vs. functionality trade-offs where possible | Privacy-enhancing features as product advantages |
| 5. End-to-end security | Protect data through its full lifecycle, then securely destroy it | Encryption in transit and at rest; automated deletion |
| 6. Visibility and transparency | Keep operations and data uses open to external verification | Accurate privacy notices; audit logs; user dashboards |
| 7. Respect for user privacy | Keep it user-centric; strong defaults; user control | Consent management; preference centres; access tools |

 

Privacy by Design in the Product Development Lifecycle

For organisations that build software products or digital services, Article 25 creates a direct obligation to integrate privacy into the product development process. This means privacy review is not a stage that happens after features are built — it is a requirement that must be satisfied before processing design is finalised.

PRIVACY BY DESIGN — INTEGRATION ACROSS THE SDLC

| SDLC Phase | Privacy by Design Requirements | Key Questions |
| --- | --- | --- |
| Requirements & planning | Define data collection scope; specify purposes; identify special categories; assess DPIA need | What data is collected? Why? Is a DPIA needed? |
| Architecture & design | Design data flows with minimisation; define retention periods; specify access controls; plan deletion | Is the data architecture privacy-preserving by default? |
| Development | Implement pseudonymisation/anonymisation; enforce access controls in code; implement logging; apply encryption | Are privacy controls implemented, not just specified? |
| Testing & QA | Test with anonymised/synthetic data; verify data minimisation; test deletion; review access log coverage | Do tests use real personal data? (They shouldn't.) |
| Launch & deployment | Verify default settings are minimum-data; confirm consent flows; check privacy notice accuracy | Are defaults set correctly? Is the notice up to date? |
| Operations & maintenance | Monitor for scope creep; review data access logs; enforce retention schedules; update DPIA on changes | Has the data scope changed since design? Is the DPIA current? |
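One practical element of the Testing & QA phase is generating synthetic records so that test suites never touch real personal data. A minimal sketch, with illustrative field names (not taken from any particular schema):

```python
import random
import string

# Synthetic-data fixture for testing: records are structurally realistic but
# contain no real personal data. Seeding makes the fixture deterministic, so
# tests are repeatable without storing any dataset.
def synthetic_user(seed: int) -> dict:
    rng = random.Random(seed)
    name = "user_" + "".join(rng.choices(string.ascii_lowercase, k=8))
    return {
        "id": seed,
        "name": name,
        "email": f"{name}@example.test",  # reserved test TLD, never routable
        "age_band": rng.choice(["18-24", "25-34", "35-49", "50+"]),  # banded, not exact DOB
    }

fixtures = [synthetic_user(i) for i in range(100)]
```

Note the minimisation carried into the fixture itself: an age band rather than a date of birth, because the test (like the product) rarely needs the exact value.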

 

Privacy by Default: Translating Principle into Settings

Privacy by default under Article 25(2) has the most concrete and immediately verifiable implications. It requires that for every system, application, and service that processes personal data, the default configuration collects and processes the minimum necessary — and that broader collection or sharing requires an active choice by the user, not an active opt-out.

PRIVACY BY DEFAULT — EXAMPLES ACROSS PRODUCT CONTEXTS

| Product Context | Non-Compliant Default (Too Much) | Compliant Default (Minimum) |
| --- | --- | --- |
| User registration form | Collects date of birth, phone, gender, address alongside email and name | Collects only what is necessary; optional fields clearly marked |
| Mobile app permissions | Requests location, contacts, microphone on install | No permissions requested until feature requires it; explained in context |
| Analytics tracking | Full behavioural tracking enabled for all users by default | No tracking by default; user opts into analytics |
| Profile visibility | User profile visible to all platform members by default | Profile visible only to connections/friends by default |
| Data retention | Data retained indefinitely unless user deletes account | Data deleted after defined retention period automatically |
| Marketing preferences | All marketing channels opted in by default | All marketing channels opted out by default; user opts in |
| Third-party data sharing | Data shared with partner network by default | No third-party sharing by default; explicit consent required |
| Session data | Long-lived persistent sessions; no inactivity timeout | Short sessions; inactivity timeout; re-authentication required |
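The data retention row is one of the easiest defaults to automate. A minimal sketch of automated retention enforcement, assuming a simple in-memory record store and an illustrative one-year period:

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention period; in practice this comes from the retention
# schedule defined at the architecture & design stage.
RETENTION = timedelta(days=365)

def enforce_retention(records: list[dict], now: datetime) -> list[dict]:
    """Return only records still within their retention period; the rest
    are dropped automatically, without waiting for user action."""
    return [r for r in records if now - r["created_at"] < RETENTION]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "created_at": datetime(2022, 1, 1, tzinfo=timezone.utc)},  # expired
    {"id": 2, "created_at": datetime(2024, 5, 1, tzinfo=timezone.utc)},  # current
]
kept = enforce_retention(records, now)  # only record 2 survives
```

In a real system the same logic runs as a scheduled job against the production store, and its output feeds the retention enforcement logs discussed later as accountability evidence.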

 

Technical Measures That Implement Privacy by Design

The EDPB’s Guidelines 4/2019 on Article 25 identify specific technical measures that implement privacy by design. These measures are not exhaustive but provide a concrete foundation for technology teams translating the legal obligation into engineering practice.

TECHNICAL PRIVACY BY DESIGN MEASURES

| Measure | What It Does | Relevant Principle |
| --- | --- | --- |
| Pseudonymisation at rest | Separates identifying data from processing data; reduces breach impact | Minimisation; integrity & confidentiality |
| Data masking in non-prod environments | Prevents real personal data exposure in development and testing | Minimisation; confidentiality |
| Automated retention enforcement | Deletes or anonymises data automatically at end of retention period | Storage limitation; accountability |
| Granular access controls (RBAC) | Limits data access to roles with genuine need; enforces least privilege | Confidentiality; minimisation |
| Audit logging of data access | Creates traceable record of who accessed what personal data and when | Accountability; integrity |
| Encryption in transit and at rest | Protects data from interception and unauthorised access | Integrity & confidentiality |
| Consent management platform | Records consent state per user per purpose; enables withdrawal | Lawfulness; transparency |
| Data lineage tracking | Maps where data came from, where it goes, and what transforms it | Purpose limitation; accountability |
| Privacy-enhancing computation (PEC) | Techniques such as differential privacy and federated learning that enable analysis without exposing individual data | Minimisation; integrity |
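Pseudonymisation, the first measure in the table, can be sketched with a keyed hash: the processing dataset carries a token instead of the identifier, and only the holder of the separately stored key can regenerate the mapping. The key value below is illustrative; a real one lives in a secrets store, apart from the data it protects.

```python
import hmac
import hashlib

# Keyed pseudonymisation: HMAC-SHA256 over the identifier. Without the key,
# the token cannot feasibly be reversed; with it, records can be re-linked.
SECRET_KEY = b"example-key-kept-in-a-separate-vault"  # illustrative only

def pseudonymise(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymise("alice@example.com")
# Deterministic: the same identifier always maps to the same token, so joins
# and counts still work on the pseudonymised dataset.
assert token == pseudonymise("alice@example.com")
```

Determinism is the design trade-off here: it preserves analytical utility (joins, deduplication) at the cost of linkability within the dataset, which is why pseudonymised data remains personal data under GDPR.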

 

Organisational Measures: The Governance Layer

Technical measures alone do not satisfy Article 25. Organisational measures — the policies, processes, training, and governance structures that ensure privacy is systematically considered in design decisions — are equally required. Article 25(1)’s reference to both ‘technical and organisational measures’ is deliberate.

Key organisational measures include:

- a privacy intake process for new products and features that flags data processing implications before development begins;
- a formal privacy review stage in the product development lifecycle;
- a data protection impact assessment trigger that activates DPIA requirements before high-risk processing is implemented;
- staff training for engineers and product managers on data minimisation, purpose limitation, and privacy by design principles; and
- a data classification framework that ensures different handling standards are applied to different categories of personal data.
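The DPIA trigger can be sketched as a simple intake check that runs when a feature is proposed. The criteria below are illustrative paraphrases of common Article 35 indicators, not a complete or authoritative list; an organisation's actual trigger list should come from its DPO.

```python
# Hypothetical privacy-intake gate: a new feature is flagged for a DPIA
# before development proceeds. Flag names are illustrative assumptions.
def dpia_required(feature: dict) -> bool:
    triggers = (
        feature.get("special_categories", False),      # health, biometric, etc.
        feature.get("large_scale_monitoring", False),  # systematic monitoring
        feature.get("automated_decisions", False),     # legal/significant effects
        feature.get("new_technology", False),          # novel processing methods
    )
    return any(triggers)

# A feature touching health data is flagged before a line of code is written.
needs_dpia = dpia_required({"special_categories": True})
```

The value of encoding the gate is less the logic (which is trivial) than the placement: the check runs at intake, so "assess before you build" becomes a process step rather than a reminder.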

BITLION INSIGHT: Organisations that integrate privacy review as a ‘gate’ in their sprint or feature delivery process — requiring sign-off from a privacy professional before launch — consistently report fewer post-launch compliance issues, simpler data deletion processes, and lower DPIA volumes than those that treat privacy as a post-launch review. The cost of fixing a privacy problem in design is a fraction of the cost of fixing it in production.

 

Demonstrating Compliance with Article 25

The accountability principle requires controllers to be able to demonstrate compliance with Article 25. This is one of the more challenging accountability obligations because privacy by design compliance is partly evidenced by what is not there — data that was not collected, fields that were not added, permissions that were not requested — which is inherently harder to document than what is present.

The EDPB’s guidelines identify several forms of evidence that demonstrate Article 25 compliance. Privacy impact assessments conducted at the design stage and documented in the system file demonstrate that privacy was considered before implementation. Data flow diagrams that show minimum data collection and appropriate routing demonstrate that the architecture reflects minimisation. DPIA records show that high-risk processing was assessed before it was deployed. Privacy by design checklists completed for new features show that the development process systematically addresses privacy requirements.

ARTICLE 25 COMPLIANCE EVIDENCE

| Evidence Type | What It Demonstrates | Who Maintains It |
| --- | --- | --- |
| Privacy intake forms for new features | Privacy was considered at design stage, not retrospectively | Product / Privacy team |
| Data flow diagrams | Architecture reflects minimisation and purpose limitation | Engineering / Architecture |
| DPIA records for high-risk features | High-risk processing was assessed before deployment | DPO / Privacy team |
| Default settings documentation | Current defaults are minimum necessary for purpose | Product / Engineering |
| Privacy by design checklist (per sprint/release) | Privacy review was systematically applied in development | Product / Privacy team |
| Staff training records | Engineers and product staff have received privacy by design training | HR / Learning & Development |
| Retention enforcement logs | Automated deletion is functioning as configured | Engineering / IT Operations |

Article 25’s requirements do not stop at launch. Privacy by design is an ongoing obligation: as systems evolve, as data scopes change, and as new features are added, the Article 25 assessment must be revisited. A DPIA or privacy review conducted at initial launch does not cover subsequent material changes. Building privacy review into change management and feature development processes is the only sustainable way to maintain Article 25 compliance across a product’s lifecycle.