Policy & Law

GDPR in Practice: What European Privacy Law Actually Protects (and What It Doesn't)

April 28, 2026 · 8 min read · Haven Team

GDPR is frequently cited as the gold standard of privacy regulation. It is also frequently misunderstood — both by companies that treat compliance as a checkbox exercise and by users who assume it provides protections it was never designed to give. Neither extreme is useful.


The General Data Protection Regulation came into force in May 2018, replacing the 1995 Data Protection Directive. It applies to any organization that processes personal data of EU residents, regardless of where the organization itself is based — a jurisdictional reach that has made it the most globally influential privacy law to date. Understanding what it actually does requires separating the rights it grants from the enforcement it delivers.

What GDPR Requires: The Core Rights

GDPR grants EU residents a set of specific, legally enforceable rights over their personal data:

Right of access (Article 15): confirm whether your data is being processed and obtain a copy of it.

Right to rectification (Article 16): have inaccurate data corrected.

Right to erasure (Article 17): the "right to be forgotten," subject to exceptions such as legal retention obligations.

Right to restriction of processing (Article 18): limit how your data is used while a dispute is resolved.

Right to data portability (Article 20): receive your data in a structured, machine-readable format.

Right to object (Article 21): opt out of processing based on legitimate interests, including direct marketing.

Rights around automated decision-making (Article 22): not to be subject to decisions with legal or similarly significant effects based solely on automated processing.

GDPR also imposes obligations on data controllers: they must have a lawful basis for processing (consent, contract, legal obligation, vital interests, public task, or legitimate interests), they must practice data minimization, and they must notify the relevant supervisory authority within 72 hours of becoming aware of a data breach that poses a risk to individuals.
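The 72-hour breach window is a hard clock, so compliance tooling often computes the notification deadline directly from the detection timestamp. A minimal sketch (function names are our own; this is an illustration, not legal advice):

```python
from datetime import datetime, timedelta, timezone

# Article 33(1) GDPR: notify "without undue delay and, where feasible,
# not later than 72 hours after having become aware" of the breach.
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(detected_at: datetime) -> datetime:
    """Latest moment the supervisory authority must be notified."""
    return detected_at + BREACH_NOTIFICATION_WINDOW

def is_overdue(detected_at: datetime, now: datetime) -> bool:
    """True if the notification window has already closed."""
    return now > notification_deadline(detected_at)

detected = datetime(2026, 4, 1, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(detected))  # 2026-04-04 09:00:00+00:00
```

Timestamps are kept timezone-aware throughout; the clock starts at awareness of the breach, not at the breach itself.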

The Consent Banner Problem

The cookie consent banner you click "Accept All" on to make it disappear is a GDPR compliance mechanism. It is not a privacy protection. For most users in practice, it is the opposite: a dark pattern that extracts consent through friction. Clicking "Reject All" — when that option exists and is not buried — is the rights-preserving choice.

Where GDPR Does Not Apply

GDPR is a data protection regulation, not a comprehensive privacy shield. Several significant gaps are worth understanding:

Intelligence and national security — Article 2(2) explicitly excludes processing by EU member states for national security purposes. Law enforcement has a separate framework (Law Enforcement Directive, LED) with weaker individual rights. If your threat model includes government surveillance, GDPR is not your protection.

Purely personal or household activities — GDPR doesn't apply to processing done for purely personal reasons: a private address book, family photos, personal correspondence. The exemption is narrower than it sounds, though. In Lindqvist, the CJEU held that publishing personal data on a publicly accessible website falls outside it, so a personal blog post about you is generally still covered; in practice, enforcement against private individuals is rare.

Data that has already left the EU — GDPR restricts transfers to third countries without adequate protections, but once data has been transferred unlawfully, GDPR enforcement can't retrieve it. The regulation creates accountability for future behavior; it doesn't provide a retrieval mechanism.

Non-EU users — If you are outside the EU and the company is outside the EU, GDPR does not apply to your relationship. Many companies apply GDPR-level protections globally as a policy choice, but that's voluntary.

The Enforcement Problem

GDPR's headline penalty — 4% of global annual turnover or €20 million, whichever is higher — is eye-catching. Enforcement reality is more complicated.

Enforcement is handled by national Data Protection Authorities (DPAs). Ireland's DPA, which handles many major tech companies because they're headquartered there, has been criticized for slow enforcement and light penalties relative to the violations involved. Meta has received large fines (over €1 billion across multiple cases), but years elapsed between complaint and fine, and the fines represent a small fraction of the revenue generated by the contested practices during that period.

Small and medium organizations face different dynamics: the DPA of a smaller country may lack the resources to investigate complex cross-border cases. The result is uneven enforcement across the EU, with significant variation by country, sector, and the size of the organization involved.

| GDPR Provision | Works Well | Works Poorly |
|---|---|---|
| Subject access requests | Large, compliant companies | Small companies, companies that ignore requests |
| Data breach notification | Mature compliance teams | Companies that fail to detect breaches at all |
| Consent requirements | Clear opt-in scenarios | Dark patterns, "legitimate interests" overuse |
| Right to erasure | Straightforward account deletion | Backup retention, third-party sharing already done |
| Cross-border enforcement | Major cases with political will | Most cases; coordination delays measured in years |

The Consent Dark Pattern Loophole

GDPR requires consent to be freely given, specific, informed, and unambiguous. In practice, most websites offering consent banners violate at least one of these requirements — they bury the reject button, make rejection require more clicks than acceptance, use deceptive language ("Accept to continue" vs. no equivalent framing for rejection), or claim consent for purposes the user didn't actually agree to.
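The four validity conditions can be made concrete as a schematic check. The record fields below are hypothetical illustrations (real consent logs vary widely), but the logic mirrors Articles 4(11) and 7: if any one condition fails, the consent is not valid for the claimed purpose.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    # Hypothetical fields for illustration; real consent logs differ.
    freely_given: bool        # no bundling, no detriment for refusing
    purposes: list[str]       # consent must be specific to each purpose
    informed: bool            # controller identity and purposes disclosed
    affirmative_action: bool  # unambiguous: pre-ticked boxes don't count

def consent_is_valid(record: ConsentRecord, purpose: str) -> bool:
    """All four conditions must hold for the specific purpose claimed."""
    return (record.freely_given
            and record.informed
            and record.affirmative_action
            and purpose in record.purposes)

# A dark-pattern banner typically fails "freely given" or "unambiguous".
# Here, a pre-ticked checkbox means there was no affirmative action:
pre_ticked = ConsentRecord(freely_given=True, purposes=["analytics"],
                           informed=True, affirmative_action=False)
print(consent_is_valid(pre_ticked, "analytics"))  # False
```

Note that the same record can be valid for one purpose and invalid for another; purpose-specificity is exactly what blanket "Accept All" banners tend to blur.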

The IAB Europe's Transparency and Consent Framework — a system used by thousands of ad-tech companies to pass consent signals — was ruled unlawful by the Belgian DPA in 2022, but continued operating during appeals. The gap between GDPR's consent requirements and the ad-tech industry's actual practices remains substantial.

Consent under GDPR should mean "yes, you may." Much of the ad-tech industry treats it as a box to tick so processing can continue on a claimed lawful basis that regulators will take years to challenge.

"Legitimate interests" — one of the six lawful bases for processing — is similarly overused. It's intended for genuine balancing exercises where an organization's interests outweigh individual privacy interests. It's frequently invoked as a fallback when consent would clearly be denied.

What Good Privacy Protection Actually Requires

GDPR is a regulatory floor. It establishes minimum requirements for how personal data must be handled. But compliance with GDPR says nothing about whether a service is actually private or secure — only that it has a documented lawful basis for the processing it does, notifies you of breaches, and honors your access and deletion requests within the defined timelines.

A service can be fully GDPR-compliant while collecting extensive behavioral data, sharing it with dozens of "partners" under legitimate interests, retaining it for years, and having it technically accessible to its own staff. The regulation constrains how data is handled, not whether it's collected in the first place — as long as a lawful basis exists.

The strongest privacy protections come from architectural choices that make data collection unnecessary, not from regulatory compliance that permits it. Services that use end-to-end encryption and client-side key derivation genuinely cannot access your message contents — not because a regulation forbids it, but because they structurally lack the keys. That's a different category of protection than a privacy policy and a DPA registration.
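To make "structurally lacks the keys" concrete, here is a toy sketch of client-side key derivation. Everything here is illustrative: the function names are our own, and the HMAC-based keystream stands in for a real AEAD cipher such as AES-GCM or XChaCha20-Poly1305 — do not use it in production.

```python
import hashlib, hmac, os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Client-side key derivation: the key is computed on the device
    # from the user's passphrase and never sent to the server.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)

def xor_keystream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy stream cipher built from an HMAC-SHA256 keystream — NOT
    # production crypto; real clients use a vetted AEAD construction.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

salt, nonce = os.urandom(16), os.urandom(16)
key = derive_key("correct horse battery staple", salt)
ciphertext = xor_keystream(key, nonce, b"meet at noon")

# The server stores only (salt, nonce, ciphertext). Without the
# passphrase it cannot reconstruct the key, so it cannot read the
# message — an architectural guarantee, not a policy promise.
assert xor_keystream(key, nonce, ciphertext) == b"meet at noon"
```

The point of the sketch is the data flow: the passphrase and derived key stay on the client, so a subpoena, a breach, or a curious employee at the server side yields only ciphertext.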

GDPR matters — it gives users meaningful rights, creates real accountability, and has meaningfully changed how large tech companies document and justify their data practices. But it operates on the surface of a system that was designed to collect and profit from personal data, and it cannot fully constrain what that system does when regulators are underfunded, enforcement is slow, and dark patterns remain cheaper than genuine compliance.

Try Haven free for 15 days

Encrypted email and chat in one app. No credit card required.

Get Started →