Article 5 of GDPR is the regulation’s spine. Everything else — the lawful bases, the rights framework, the security obligations, the accountability documentation — gives effect to seven foundational principles that govern how personal data must be handled. These principles are not aspirational values or ethical guidelines. They are binding legal requirements. Violating them directly falls into the higher tier of infringements under Article 83(5), carrying fines of up to €20 million or 4% of global annual turnover, whichever is higher.
Understanding each principle precisely — what it requires, what it prohibits, and what constitutes a violation — is the foundation of a GDPR compliance programme. This article provides that understanding, together with the practical implementation requirements and the most common failure patterns that supervisory authorities have identified in enforcement actions since 2018.
ARTICLE 5 AT A GLANCE
| Principle | Article | The Core Requirement | Common Failure |
|---|---|---|---|
| Lawfulness, Fairness & Transparency | 5(1)(a) | Process data legally, fairly, and openly | No valid lawful basis; opaque privacy notices |
| Purpose Limitation | 5(1)(b) | Collect for specified purposes; don’t repurpose incompatibly | Repurposing data for analytics without new basis |
| Data Minimisation | 5(1)(c) | Collect only what is necessary for the purpose | Collecting everything ‘just in case’ |
| Accuracy | 5(1)(d) | Keep data accurate and up to date | Outdated records; no correction processes |
| Storage Limitation | 5(1)(e) | Don’t retain data longer than necessary | No retention schedule; indefinite storage |
| Integrity & Confidentiality | 5(1)(f) | Implement appropriate security measures | Inadequate encryption; poor access controls |
| Accountability | 5(2) | Demonstrate compliance with all principles | No documentation; no evidence of controls |
Principle 1: Lawfulness, Fairness, and Transparency
Article 5(1)(a) requires that personal data be processed lawfully, fairly, and in a transparent manner in relation to the data subject. These three elements are distinct requirements that must each be satisfied independently.
Lawfulness means every processing activity must have a valid lawful basis under Article 6 — one of consent, contract, legal obligation, vital interests, public task, or legitimate interests. Processing that lacks a lawful basis is unlawful regardless of how well it is documented, how securely the data is held, or how beneficial the purpose might be. Article 1.4 of this knowledge hub covers lawful basis selection in detail.
Fairness is a broader, more values-based concept. Processing is fair when it meets the reasonable expectations of the data subject. It is unfair when it deceives, when it uses data in ways that harm or disadvantage the individual without justification, or when it exploits power imbalances to extract consent or data that the individual would not freely provide. Supervisory authorities have applied the fairness principle to practices including: processing data for purposes that conflict with the stated purpose; using nudge techniques in consent interfaces to push users toward data-sharing; and processing data about individuals in ways that produce discriminatory outcomes.
Transparency means individuals must be informed about how their data is used, in clear and plain language, at or before the point of collection. The right to be informed — fulfilled through privacy notices under Articles 13 and 14 — is the operational expression of the transparency principle. Transparency failures are among the most commonly penalised GDPR violations: opaque privacy notices, missing mandatory information, confusing or deliberately complex language designed to obscure the scope of processing.
| KEY IDEA | Transparency is not just about having a privacy notice. It is about ensuring that the information in that notice is genuinely accessible, comprehensible, and complete. A 40-page privacy policy written in legal language satisfies the letter of GDPR’s information requirements only if it also meets the standard of clear and plain language. Most don’t. Layered privacy notices — a short summary with links to fuller detail — are the DPA-recommended approach. |
TRANSPARENCY — WHAT MUST BE DISCLOSED
| Information Required | Where Disclosed | Article |
|---|---|---|
| Controller identity and contact details | Privacy notice at collection | 13(1)(a) |
| DPO contact details (where applicable) | Privacy notice at collection | 13(1)(b) |
| Purposes and lawful bases for processing | Privacy notice at collection | 13(1)(c) |
| Legitimate interests relied on (if applicable) | Privacy notice at collection | 13(1)(d) |
| Recipients or categories of recipients | Privacy notice at collection | 13(1)(e) |
| International transfer details and safeguards | Privacy notice at collection | 13(1)(f) |
| Retention periods | Privacy notice at collection | 13(2)(a) |
| Existence of data subject rights | Privacy notice at collection | 13(2)(b) |
| Right to withdraw consent (if consent is the basis) | Privacy notice at collection | 13(2)(c) |
| Right to lodge a complaint with a DPA | Privacy notice at collection | 13(2)(d) |
| Whether provision of data is a requirement and consequences | Privacy notice at collection | 13(2)(e) |
| Existence of automated decision-making, including profiling | Privacy notice at collection | 13(2)(f) |
Principle 2: Purpose Limitation
Article 5(1)(b) requires that personal data be collected for specified, explicit, and legitimate purposes and not further processed in a manner that is incompatible with those purposes. Purpose limitation has two dimensions: the initial specification of purpose at the point of collection, and the prohibition on incompatible further processing.
The initial specification requirement means that the purpose for which data is collected must be identified with precision before collection begins, communicated to data subjects, and documented in the RoPA. Vague purpose descriptions — ‘improving our services’, ‘analytical purposes’, ‘business development’ — do not satisfy the specified and explicit elements. The purpose must be concrete enough that the data subject understands what their data will actually be used for, and specific enough that it creates a genuine boundary on subsequent use.
The compatibility test for further processing is not a binary prohibition on any secondary use. Article 5(1)(b) acknowledges that further processing may be compatible with the original purpose. Article 6(4) provides the compatibility assessment framework: the link between original and new purposes; the context in which the data was collected; the nature of the data; the consequences of further processing for data subjects; and whether appropriate safeguards (encryption, pseudonymisation) are in place. Where further processing is incompatible, a new lawful basis must be established for the new purpose — which in practice usually means consent, since the other bases are tied to specific activities.
| IMPORTANT | Purpose limitation is the principle most frequently violated by organisations that treat personal data as a general business asset. Data collected to fulfil a customer order cannot be repurposed for marketing without a separate lawful basis. Data collected to provide a service cannot be used to train AI models without a fresh compatibility assessment and, where the processing is incompatible, explicit consent. The collected-for-one-thing, used-for-everything approach is a purpose limitation violation at scale. |
Two exceptions to purpose limitation are explicitly recognised. Further processing for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes is considered compatible with the original purpose, provided appropriate safeguards are in place. These are the same exceptions that apply to retention and erasure under Article 17.
Principle 3: Data Minimisation
Article 5(1)(c) requires that personal data be adequate, relevant, and limited to what is necessary in relation to the purposes for which it is processed. Data minimisation is the principle that prevents organisations from collecting more personal data than they actually need for their stated purpose.
Adequate means the data is sufficient to fulfil the purpose — collecting too little data would prevent the purpose from being achieved. Relevant means there is a direct connection between the data and the purpose — data points with no bearing on the purpose should not be collected. Limited to what is necessary means the organisation collects the minimum data required to achieve the purpose — not a broader set that might be useful, might be interesting, or might be used for purposes that have not yet been identified.
In practice, data minimisation requires organisations to conduct a purpose-led data collection assessment before building any system that collects personal data. The assessment asks: What is the purpose? What is the minimum data required to achieve it? Are we proposing to collect anything beyond that minimum? If yes, what additional justification exists for the excess? This assessment is particularly important for product teams and developers who tend to default to collecting all available data as a hedge against future requirements.
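The assessment above can be embedded directly into a development workflow. The following is a minimal sketch, not a prescribed GDPR format: the `FieldProposal` structure, function name, and example fields are all illustrative assumptions, but the logic mirrors the principle — every proposed field must carry a documented necessity justification, and anything without one is flagged before the system is built.

```python
# Illustrative purpose-led data collection review. The structure and field
# names are hypothetical; the rule is the real one: no justification, no field.

from dataclasses import dataclass


@dataclass
class FieldProposal:
    name: str
    justification: str  # why this field is necessary for the stated purpose ("" if none)


def review_collection(purpose: str, fields: list[FieldProposal]) -> list[str]:
    """Return the names of proposed fields lacking a documented necessity justification."""
    return [f.name for f in fields if not f.justification.strip()]


# Example: an order-fulfilment form proposing one field beyond the minimum.
proposed = [
    FieldProposal("email", "needed to send the order confirmation"),
    FieldProposal("delivery_address", "needed to ship the order"),
    FieldProposal("date_of_birth", ""),  # no link to the purpose -> flag for removal
]
flagged = review_collection("order fulfilment", proposed)
```

A review gate like this forces the "might be useful later" fields into the open, where they must either acquire a concrete justification or be dropped.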
| BITLION INSIGHT | Privacy by design — Article 25 — operationalises data minimisation at the system level. Organisations that embed data minimisation review into their product development lifecycle, requiring explicit justification for every personal data field collected, consistently collect less data, have smaller breach exposures, and have simpler data deletion and rights fulfilment processes. Data minimisation is not a privacy constraint on product development. It is a risk management discipline that makes products cheaper to run and easier to secure. |
Principle 4: Accuracy
Article 5(1)(d) requires that personal data be accurate and, where necessary, kept up to date. Every reasonable step must be taken to ensure that personal data that is inaccurate, having regard to the purposes for which it is processed, is erased or rectified without delay.
The accuracy principle has both a passive dimension — not collecting or recording inaccurate data in the first place — and an active dimension — taking steps to identify and correct inaccuracies as they arise or are discovered. The active dimension requires the organisation to have processes for receiving and processing rectification requests from data subjects (the right to rectification under Article 16), for updating data as circumstances change, and for propagating corrections to processors and other recipients to whom the data has been disclosed.
The ‘where necessary’ qualifier acknowledges that not all data needs to be kept current. Historical transaction records do not need to be updated as the data subject’s circumstances change. But customer contact data, medical records, financial data used for ongoing credit assessments, and any data where currency directly affects decisions made about the data subject must be kept up to date.
ACCURACY — IMPLEMENTATION CHECKLIST
| Action Required | Who Is Responsible | Frequency |
|---|---|---|
| Provide a rectification channel for data subjects | Compliance / Product | Ongoing |
| Process rectification requests within 1 month | Data owners | Per request |
| Notify processors and third parties of corrections | Data owners / DPO | Per correction |
| Review contact and profile data for currency | CRM / IT | Annually or per trigger |
| Validate data at collection point (e.g. format checks) | Engineering / Product | At system build |
| Remove or correct data identified as inaccurate in audits | Compliance / IT | Per audit cycle |
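The collection-point validation row above (the ‘passive’ accuracy dimension) can be sketched in a few lines. The rules below are illustrative examples of format checks, not an exhaustive or mandated set — the point is that obviously malformed values are rejected before they ever enter the record.

```python
# A minimal sketch of collection-point format validation. The specific rules
# (coarse email shape check, ISO date parsing) are illustrative assumptions.

import re
from datetime import date

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # coarse shape check only


def validate_contact(email: str, dob_iso: str) -> list[str]:
    """Return a list of validation errors; an empty list means the record may be stored."""
    errors = []
    if not EMAIL_RE.match(email):
        errors.append("email: invalid format")
    try:
        dob = date.fromisoformat(dob_iso)
        if dob > date.today():
            errors.append("date_of_birth: in the future")
    except ValueError:
        errors.append("date_of_birth: not an ISO date")
    return errors
```

Validation at entry is cheaper than correction downstream: every inaccuracy caught here is one fewer rectification request, processor notification, and audit finding later.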
Principle 5: Storage Limitation
Article 5(1)(e) requires that personal data be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed. Personal data may be stored for longer periods insofar as it will be processed solely for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes, subject to appropriate safeguards.
Storage limitation requires every processing activity to have a defined retention period. That period must be the minimum necessary to achieve the stated purpose — not the maximum that could be justified, not an indefinite period, and not a period chosen for operational convenience. Retention periods must be documented in the RoPA and communicated to data subjects in the privacy notice.
In practice, retention period definition requires input from legal (what are the minimum statutory retention obligations?), business operations (what is the minimum period for which data is genuinely needed?), and IT (how is data physically deleted at end of retention, across all systems including backups?). The combination of legal minimum and operational maximum creates a defensible retention window. Any retention beyond that window requires specific justification.
| IMPORTANT | Indefinite retention is the most common storage limitation violation. Organisations frequently retain personal data ‘just in case’ or because deletion requires effort that has not been prioritised. Supervisory authorities treat indefinite retention — or retention periods with no documented justification — as a clear Article 5(1)(e) violation. Building automated deletion routines tied to retention schedules is the only scalable solution for organisations handling data at volume. |
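An automated deletion sweep of the kind described above can be sketched as follows. It assumes each record carries a data category and the date that starts its retention clock; the categories and periods shown are illustrative placeholders — in a real deployment they come from the documented retention schedule in the RoPA.

```python
# A sketch of a retention-schedule-driven deletion sweep. Category names and
# retention periods are illustrative; real values come from the retention schedule.

from datetime import date, timedelta

RETENTION_DAYS = {
    "customer_service": 365 * 2,  # e.g. 2 years (dispute resolution / QA)
    "cctv": 90,                   # e.g. 90 days (security / incident review)
    "applicant_cv": 182,          # e.g. 6 months post-decision
}


def due_for_deletion(records, today=None):
    """Yield the ids of records whose retention period has elapsed."""
    today = today or date.today()
    for rec in records:
        keep_until = rec["clock_start"] + timedelta(days=RETENTION_DAYS[rec["category"]])
        if today > keep_until:
            yield rec["id"]


sample = [
    {"id": "cctv-001", "category": "cctv", "clock_start": date(2024, 1, 1)},
    {"id": "cs-042", "category": "customer_service", "clock_start": date(2024, 1, 1)},
]
# On 1 June 2024, the CCTV record (90-day period) has expired; the
# customer-service record (2-year period) has not.
expired = list(due_for_deletion(sample, today=date(2024, 6, 1)))
```

A sweep like this, run on a schedule and extended to cover backups per the backup deletion policy, is what turns a retention schedule from a document into an enforced control.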
EXAMPLE RETENTION PERIODS BY DATA CATEGORY
| Data Category | Typical Retention Driver | Example Period | After Period |
|---|---|---|---|
| Customer transaction records | Tax/accounting law | 5–7 years post-transaction | Delete or anonymise |
| Employee payroll records | Employment/tax law | 7 years post-employment | Delete |
| Customer service interactions | Dispute resolution / QA | 1–2 years | Delete |
| Job applicant CVs (unsuccessful) | Legitimate interests | 6 months post-decision | Delete or seek consent to retain |
| Marketing consent records | Consent management | 3 years post-last-interaction | Delete consent record |
| CCTV footage | Security / incident review | 30–90 days | Overwrite |
| Website access logs | Security monitoring | 90 days–1 year | Delete or anonymise |
| Medical records (private sector) | Healthcare law / clinical need | Per regulatory requirement | Delete per clinical protocol |
Principle 6: Integrity and Confidentiality
Article 5(1)(f) requires that personal data be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction, or damage, using appropriate technical or organisational measures.
This principle is the foundation of GDPR’s security obligations, elaborated in Article 32. The standard is ‘appropriate’ — proportionate to the nature and volume of data processed, the state of the art in available security technology, the cost of implementation, and the risks to data subjects if a breach occurs. There is no single set of measures that satisfies Article 5(1)(f) for all organisations — a hospital processing health records faces a different risk landscape and must implement correspondingly more robust controls than a small business with a basic customer mailing list.
The integrity dimension covers protecting data against unauthorised modification and ensuring that data remains accurate and complete throughout its lifecycle. The confidentiality dimension covers protecting data against unauthorised access and disclosure — which encompasses both external threats (cyber attacks, unauthorised access) and internal threats (staff accessing data beyond their role, accidental disclosure in communications).
INTEGRITY & CONFIDENTIALITY — KEY MEASURES
| Measure Category | Examples | GDPR Reference |
|---|---|---|
| Encryption | AES-256 at rest; TLS 1.2+ in transit; encrypted backups | Art. 32(1)(a) |
| Pseudonymisation | Tokenisation of identifiers; key separation; data masking | Art. 32(1)(a) |
| Access control | Role-based access; least privilege; MFA; privileged access management | Art. 32(1)(b) |
| Availability & resilience | Redundant infrastructure; disaster recovery; backup testing | Art. 32(1)(b) |
| Incident response | Breach detection; 72-hour notification; post-incident review | Art. 32(1)(c) |
| Regular testing | Penetration testing; vulnerability scanning; security audits | Art. 32(1)(d) |
| Organisational measures | Staff training; clean desk policy; data classification; NDA | Art. 32(1) |
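The pseudonymisation row above — tokenisation with key separation — can be illustrated with a keyed hash. This is a minimal sketch using HMAC-SHA-256 from the Python standard library; the key value shown is a placeholder, and real key handling (a separate key store, rotation, access controls) is out of scope here. The essential property is that the secret key is held apart from the pseudonymised dataset, so whoever holds the data alone cannot reverse or regenerate the mapping.

```python
# A minimal sketch of pseudonymisation by keyed hashing (HMAC-SHA-256).
# Key handling is simplified for illustration: in practice the key lives in
# a separate key store, away from the pseudonymised dataset ('key separation').

import hashlib
import hmac


def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a stable keyed token."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


key = b"example-key-held-in-a-separate-key-store"  # illustrative placeholder
token = pseudonymise("jane.doe@example.com", key)
```

Because the same input and key always yield the same token, records remain linkable for analytics and deduplication — which is exactly why pseudonymised data is still personal data under GDPR, and why the key must never travel with the dataset.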
Principle 7: Accountability — The Binding Principle
Article 5(2) states: ‘The controller shall be responsible for, and be able to demonstrate compliance with, paragraph 1 (‘accountability’).’ The accountability principle does not introduce new substantive obligations — it requires the controller to be able to prove that it meets the obligations already established in Article 5(1). It transforms GDPR from a self-reported compliance regime into a documented, evidence-based one.
Accountability is satisfied not by asserting compliance but by maintaining the evidence that demonstrates it: a complete and current RoPA showing lawful bases; documented LIAs for legitimate interests processing; consent records for consent-based processing; DPAs with all processors; completed DPIAs for high-risk activities; security policies and records of security testing; staff training records; data subject rights response logs; and breach notification records. Together, these documents constitute the ‘accountability portfolio’ that an organisation must be able to produce in response to a supervisory authority investigation.
The accountability principle also drives the internal governance requirements: the appointment of a DPO where required, the integration of data protection into organisational governance structures, the designation of data owners and risk owners, and the implementation of privacy by design in product and system development. Accountability is not a documentation exercise — it is the evidence that a genuine data protection culture exists within the organisation.
| BITLION INSIGHT | Supervisory authorities assess accountability not by reviewing a compliance checklist but by examining whether the organisation’s documentation reflects actual practice. A comprehensive privacy policy paired with no staff training records, no DPAs with processors, and no DPIA for a high-risk processing activity demonstrates paper compliance, not genuine accountability. The test is whether the documentation and the practice are aligned. |
Common Failures Across All Principles
Enforcement actions since 2018 reveal consistent patterns of Article 5 failure. Understanding these patterns is the fastest way to identify the highest-risk gaps in any compliance programme.
MOST COMMON ARTICLE 5 VIOLATIONS IN DPA ENFORCEMENT
| Failure Pattern | Principle Violated | Enforcement Examples |
|---|---|---|
| No valid lawful basis documented for key processing activities | 5(1)(a) Lawfulness | Multiple DPA investigations 2019–2025 |
| Privacy notices incomplete, inaccessible, or in complex legal language | 5(1)(a) Transparency | WhatsApp €225M; Google €50M (CNIL, France) |
| Data repurposed for analytics/AI training without new basis | 5(1)(b) Purpose limitation | Clearview AI; multiple ad-tech cases |
| Excessive data collection beyond what purpose requires | 5(1)(c) Minimisation | Instagram €405M (children’s data) |
| Outdated customer/employee records; no correction process | 5(1)(d) Accuracy | Routine DPA findings in investigations |
| No retention schedule; indefinite storage of personal data | 5(1)(e) Storage limitation | Identified in majority of DPA investigations |
| Inadequate security leading to breach; poor access controls | 5(1)(f) Integrity/confidentiality | British Airways £20M; Marriott £18.4M |
| No RoPA; no DPAs; no DPIA; no evidence of compliance | 5(2) Accountability | Standard finding in enforcement actions |
Article 5’s principles are the lens through which every GDPR obligation should be assessed. When evaluating a new processing activity, the first questions are always: Is this lawful? Is it fair? Are we being transparent? Is the purpose specified? Are we collecting only what we need? Is the data accurate? Will we delete it when it is no longer needed? Is it adequately protected? And can we prove all of this? Every subsequent article in this knowledge hub addresses one or more of these questions in operational detail.