The term "dark patterns" was coined by UX designer Harry Brignull in 2010 to describe user interface tricks that benefit a company at a user's expense. The concept has since become a recognized category in design ethics research and, increasingly, in regulatory enforcement. When applied to privacy settings, dark patterns are the mechanism by which GDPR consent has become largely theater: technically compliant, deliberately circumvented.
Recognizing them is the first step to not being manipulated by them. This is a catalog of the most common ones, how they work, and what they're designed to achieve.
Cookie Consent: The Classic Case Study
The GDPR requires freely given, specific, informed, and unambiguous consent for non-essential cookies. Most cookie consent banners are architecturally designed to fail at least two of these requirements while remaining technically defensible.
The most common violations:
- Visual asymmetry — "Accept All" is a filled button in the site's primary color. "Reject" or "Manage" is a greyed-out link in smaller text. Users click the prominent option. Studies consistently find acceptance rates 3–4× higher when the accept button is visually prominent than when both options are presented with equal weight.
- Reject buried under layers — clicking "Manage Preferences" reveals a modal with dozens of toggles, all pre-checked. Clicking "Reject All" requires finding a second "Save" button after unchecking everything. The accept path is two taps; the reject path is a small UX project.
- Implied consent — banners that say "By continuing to use this site, you agree..." without requiring any action. The GDPR is explicit that inaction cannot constitute consent; courts have found this pattern unlawful. It persists anyway because enforcement is slow.
- Infinite partner lists — the "our advertising partners" list on some consent banners runs to hundreds or thousands of companies. This is technically transparent. It is practically unreadable, and it's designed to be.
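The visual-asymmetry and buried-reject patterns are mechanical enough to check for. Below is a hypothetical TypeScript sketch of the kind of comparison a consent-banner audit might run; the `ControlStyle` shape, field names, and checks are assumptions for illustration, not any real auditing tool's API.

```typescript
// Hypothetical audit sketch: compare simplified style descriptors for the
// accept and reject controls of a consent banner and list the asymmetries
// described above. All names and fields are illustrative assumptions.

interface ControlStyle {
  label: string;
  isButton: boolean;        // rendered as a button vs. a plain text link
  fontSizePx: number;
  clicksToComplete: number; // user actions needed to finish this path
}

function consentAsymmetries(accept: ControlStyle, reject: ControlStyle): string[] {
  const findings: string[] = [];
  if (accept.isButton && !reject.isButton) {
    findings.push("reject is a plain link while accept is a button");
  }
  if (reject.fontSizePx < accept.fontSizePx) {
    findings.push("reject text is smaller than accept text");
  }
  if (reject.clicksToComplete > accept.clicksToComplete) {
    findings.push(
      `reject takes ${reject.clicksToComplete} steps vs. ${accept.clicksToComplete} for accept`
    );
  }
  return findings;
}
```

Run against a typical banner ("Accept All" as a one-tap filled button, "Manage Preferences" as a small link opening a multi-step modal), all three checks fire; a banner with equal-weight buttons returns an empty list.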
The French data protection authority (CNIL) fined Google €150 million and Facebook €60 million in January 2022 specifically for consent interfaces that made refusing cookies more difficult than accepting them. The Irish DPC fined Meta €390 million in January 2023 partly over consent mechanisms. These are large numbers; they have not stopped the practices.
Mobile App Permission Dark Patterns
iOS and Android both require apps to request permission before accessing sensitive resources — location, microphone, camera, contacts. The permission prompt comes from the operating system and is standardized. What isn't standardized is how the app primes you before the system prompt appears.
Common patterns:
- Pre-permission priming — the app shows its own screen before the system dialog: "To give you the best experience, we need access to your location." The custom screen can use any framing it wants. The system dialog that follows shows a binary choice. Users primed with a compelling rationale accept at much higher rates.
- Permission bundling — requesting multiple permissions in sequence during onboarding, relying on users to click through a permission flow without reading each one carefully. Apps that need location for a core feature sometimes request microphone in the same flow with no explanation.
- Feature hostaging — making a feature unavailable unless a permission is granted, even when the feature doesn't actually require the permission. A social app that requires "allow contacts" to post content doesn't need contacts to post content.
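Priming works because the system dialog is effectively one-shot: on iOS, an app that triggers the prompt and gets a denial cannot show that prompt again, while its own custom primer screen can be re-shown indefinitely. A hedged TypeScript sketch of that flow logic (the function and type names are illustrative, not any platform API):

```typescript
// Hypothetical sketch of pre-permission priming: the app only spends its
// single system prompt when the user has already said yes to the framing
// screen. Names and types are illustrative assumptions.

type Decision = "granted" | "denied" | "not-asked";

function primedPermissionFlow(
  userAcceptsPrimer: boolean,
  showSystemDialog: () => Decision
): Decision {
  if (!userAcceptsPrimer) {
    // Declining the in-app primer costs the app nothing: the real system
    // prompt was never shown, so the primer can be repeated later.
    return "not-asked";
  }
  // Only users pre-filtered by the primer ever see the binary system
  // dialog, which is why acceptance rates at this step are so high.
  return showSystemDialog();
}
```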
The defense is simple but requires pausing during onboarding: deny permissions by default and grant them only when you understand why a specific feature needs them. Most permissions can be denied without breaking the app's core function.
Account Deletion and Data Portability Friction
The GDPR grants users the right to have their personal data erased (Article 17) and to receive a copy of it in a portable format (Article 20). These rights are real. The implementation is often designed to make exercising them as tedious as possible.
| Dark Pattern | How It Manifests | Goal |
|---|---|---|
| Roach motel | Easy to sign up (one click), deletion requires emailing support, waiting, confirming via a separate link | Prevent account deletion through friction |
| Deactivation vs. deletion | Settings offer "deactivate" prominently and "delete" buried or absent from UI; deactivation retains all data | Retain user data even after they try to leave |
| Data export delay | Export takes "up to 30 days" (roughly the GDPR's standard one-month response deadline) even when it could be near-instant | Reduce likelihood users complete export before leaving |
| Guilt-tripping | Deletion flow shows photos of friends you'll "lose touch with," "X people will miss you" messages | Emotional manipulation to abandon deletion |
Privacy Settings Designed to Be Found Last
Privacy settings are not a regulated UI location. Where a company puts them is a product decision. Reliably, privacy-reducing settings are in prominent locations (notifications, personalization, convenience features) and privacy-protecting settings are in Settings → Privacy → Advanced → More Options → scroll down.
The architecture of a settings menu reveals the company's priorities as clearly as its privacy policy. Features that benefit the user are easy to find; features that benefit the company's data collection are easy to leave on.
What to look for when you first set up a service or device:
- Search the settings for "data," "tracking," "personalization," "targeted," "analytics," and "diagnostic." These are the common keyword clusters for optional data collection.
- Look for toggles that are on by default that you didn't consciously enable. Defaults are the most important design choice a company makes — they determine what happens for the majority of users who never visit settings.
- Check whether "off" means actually off, or means "limited" — some "off" states still permit first-party analytics.
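The keyword sweep above can be sketched as a small function. This is a hypothetical illustration: the `Setting` shape and the keyword list mirror the bullets, not any platform's real settings API.

```typescript
// Hypothetical sketch: scan a flat list of setting names and default
// states, and surface default-on toggles whose names match the common
// keyword clusters for optional data collection.

interface Setting {
  name: string;
  enabledByDefault: boolean;
}

const PRIVACY_KEYWORDS = [
  "data", "tracking", "personalization", "targeted", "analytics", "diagnostic",
];

function settingsToReview(settings: Setting[]): Setting[] {
  return settings.filter(
    (s) =>
      s.enabledByDefault &&
      PRIVACY_KEYWORDS.some((kw) => s.name.toLowerCase().includes(kw))
  );
}
```

The filter deliberately ignores default-off settings: the point of the sweep is the toggles you never consciously enabled.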
What Regulation Is Actually Changing
The GDPR's approach to dark patterns has been clarified through enforcement decisions and guidance from the European Data Protection Board (EDPB), whose 2022 guidelines on deceptive design patterns specifically address:
- Requiring equal visual prominence for accept and reject options
- Scrutinizing "consent or pay" models, where refusing tracking requires paying (whether these can ever produce freely given consent is still contested)
- Treating repeated consent dialogs after a user has already refused as coercive
In the US, the FTC has taken action on dark patterns under its Section 5 "unfair or deceptive practices" authority. The FTC's 2022 report "Bringing Dark Patterns to Light" documented categories of dark patterns and signaled enforcement intent. California's CCPA and CPRA regulations include opt-out mechanism requirements that address some consent design issues.
Regulatory action is real but slow, and enforcement actions lag the practices they target by years. The practical limits of GDPR enforcement mean users can't rely on regulation alone. Recognizing the patterns — the asymmetric buttons, the buried rejections, the pre-permission priming — and treating them as what they are (deliberate design choices against your interests) is the more reliable defense.
Designing Honestly: What It Looks Like
Some products treat consent as something to be genuinely sought rather than gamed. The signals are consistent: reject is as easy to find as accept; the default is the option that collects less; settings are organized by what the user cares about, not by what the company wants buried.
Haven is designed on these principles — data you share stays within your encrypted communications, and settings for limiting collection are at the top level of preferences, not at the bottom of an advanced submenu. We mention this not as a plug but as an illustration that the alternative is technically feasible; it's a product choice, not a technical constraint. Companies that bury their privacy controls are making a deliberate decision about whose interests the product serves.