Dark patterns boost this quarter’s metrics—then bill you next quarter’s trust.

I’ve been tracking the fallout from the “growth hacks” that probably felt like cracking some secret code until the process servers start showing up. The pattern is eerily consistent across industries.

Take Amazon’s internal “Project Iliad”—named after Homer’s epic about a decade-long war. (Flair for drama, much?) The FTC alleges Amazon designed a complex cancellation process to deter Prime subscribers from unsubscribing, using what the agency described only slightly hyperbolically as a “four-page, six-click, fifteen-option cancellation process.” Amazon’s case is still working through federal court.

Then there’s Epic Games—hit with $245 million in refunds for using dark patterns that tricked Fortnite players into unwanted purchases. The FTC distributed $72 million in December 2024 and another $126 million in June 2025 to affected users.

But the bigger shift? Regulators aren’t just slapping wrists anymore. The UK’s DMCC Act—in effect since April 6, 2025—now allows the CMA to impose fines of up to 10% of global annual turnover for consumer law breaches, putting dark patterns in the same penalty range as antitrust violations.

Here’s what teams ship when they think they’re being clever:
→ Roach motels: Easy to get in, maze to get out
→ Drip pricing: When the $19 advertised price becomes $47 at checkout
→ Fake urgency: Countdowns that reset every hour
→ Hidden exits: Burying free/cheaper plans and the $0 tip option

But there’s a bigger cost:
👎🏼 Short-term conversion bumps followed by support ticket floods
👎🏼 Refund programs that dwarf the original “gains”
👎🏼 Legal exposure that makes product-market fit irrelevant
👎🏼 Brand damage that takes years to repair

The most efficient teams I’ve worked with ask one question before shipping: “Would users choose this if everything were perfectly transparent?”

Swipe below for ethical alternatives that simply work better long-term.

If you’re banking on dark patterns to hit your numbers, you don’t have a conversion problem—you have a value problem.

Comment “DARK UX” if you want me to send you this PDF.

I’m curious: What’s the last dark UX pattern you encountered that made you question a brand’s integrity?

#ethicaldesign #uxdesign #darkpatterns #designethics #darkux

⸻

👋🏼 Hi, I’m Dane—your source for UX and career tips.
❤️ Was this helpful? A 👍🏼 would be thuper kewl.
🔄 Share to help others (or for easy access later).
➕ Follow for more like this in your feed every day.
Privacy Dark Patterns
Explore top LinkedIn content from expert professionals.
Summary
Privacy dark patterns are deceptive design strategies used in websites and apps to trick users into sharing more personal data or subscribing unintentionally, or to make opting out difficult—often sacrificing user trust for short-term business gains. As regulators and consumers become more aware, businesses are being called out and even penalized for these manipulative tactics.
- Prioritize transparency: Make key choices—like opting in or out—clear and easy to understand so users feel confident about how their information is used.
- Respect user intent: Avoid hiding unsubscribe links, pre-checking boxes, or using confusing language that pressures people into unwanted actions.
- Build trust, not traps: Design experiences that make it as simple to leave or decline an offer as it is to join, showing users that your brand values fair treatment over short-term numbers.
-
Dark Patterns & the DSA – First Higher Court Judgment under Art. 25 DSA
#DarkPatterns #DSA #DigitalRegulation

One of the first—if not the first—higher court decisions interpreting Art. 25 Digital Services Act (DSA) has been issued by the Higher Regional Court of Bamberg (OLG Bamberg). This ruling marks a significant step in clarifying the boundaries of manipulative design practices under the DSA.

The case involved an additional ticket insurance offered on a platform when purchasing concert tickets. The court examined the following design practices:
(1) The insurance was prominently highlighted.
(2) If consumers chose not to opt in, they had to affirmatively reject it by clicking a button labeled “I bear the full risk.”

The court held:
(1) The Art. 25(2) DSA exemption applies if the practice falls under the scope of the Unfair Commercial Practices (UCP) Directive, not only if it is an infringement of the UCPD.
(2) But: a breach of the DSA’s standard of care (Art. 25) also constitutes a breach of Art. 5(2)(a) UCP Directive.
(3) The highlighted offer alone constitutes #framing, but not an infringement.
(4) The repeated prompt qualifies as soft #nagging, but not as an infringement per se.
(5) However, the #combination of dark patterns—especially with the misleading implication of the “I bear the full risk” button (which ignores the buyer’s right to a refund in cases such as event cancellation)—amounts to an infringement of Art. 25 DSA.

📄 Case Reference: OLG Bamberg (3. Zivilsenat), Judgment of 05.02.2025 – 3 UKI 11/24 e
📝 Full text (in German, open access): https://lnkd.in/eyUKZpBD

For further reading and academic context:
- my interpretation of Art. 25 DSA in: Hofmann/Raue, DSA article-by-article commentary (2023 German edition / 2024 English edition)—some of these arguments were reflected in the court’s reasoning.
- Dregelies, MMR 2023, p. 243 (German).

Martin Husovec Alberto De Franceschi Christoph Busch João Pedro Quintais 🟥Joris van Hoboken Michael Denga Max Dregelies Prof. Dr. Mario Martini Katharina Kaesling
-
#30DaysOfGRC - 1

Most people think they’re in control of their data because they clicked “Accept.” But what if the design of that button was never meant to give them a real choice?

That’s the power of dark patterns, subtle tricks in interface design that nudge users into sharing more than they intended. Tiny things. A grayed-out opt-out box. A pre-selected checkbox buried in a paragraph. A popup that makes “No” harder to find than “Yes.”

This isn’t just a design problem. It’s a governance problem. When teams are pressured to hit engagement numbers or drive data collection, the ethical line gets blurry. You can technically be compliant and still exploit user behavior.

Now layer that with consent fatigue: the endless cookie banners, privacy popups, and setting toggles that show up everywhere. People get tired. They stop reading. They click just to move on. And that’s where privacy fails.

Good governance means designing systems that prioritize real choice. It means looking at how data is collected, not just what the law says, but what’s fair and transparent. If you want to build trust, start by removing the friction from doing the right thing.

#grc #dataprivacy #aigovernance #privacyengineering #cybersecurity #techpolicy #uxethics #darkpatterns #30daychallenge #complianceculture #governancework
-
Solidcore, please do better: If you need to squint to find the “Unsubscribe” link in an email, that’s not clever design – it’s a dark pattern.

Dark patterns are deceptive design choices that intentionally trick or manipulate users into doing things they might not otherwise do, like making it hard to unsubscribe, signing up for something by accident, or adding unwanted items to a cart. They prioritize short-term business metrics over long-term trust.

In the email below, Solidcore hides their Unsubscribe link in tiny, dark gray text against a black background, nearly invisible to most users. This is a textbook dark pattern meant to reduce unsubscribes. It may drive slightly better short-term engagement, but at what cost?

The best brands make opting out just as easy as opting in. Because great product and marketing teams build relationships based on mutual value, not manipulation.

If your content is valuable, users will want it. If it isn’t, no amount of hidden links will keep them around. Let’s build better.
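A quick way to see why hidden unsubscribe links fail users: WCAG 2.1 defines a measurable contrast ratio, and normal-size body text is expected to reach at least 4.5:1. The minimal TypeScript sketch below applies that formula; the hex values are assumptions chosen to approximate dark gray text on a black background, not colors pulled from the actual Solidcore email.

```typescript
// Minimal WCAG 2.1 contrast check. Color values are illustrative
// assumptions, not taken from any real email.

function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const r = channel((n >> 16) & 0xff);
  const g = channel((n >> 8) & 0xff);
  const b = channel(n & 0xff);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Dark gray on black: roughly 2.4:1, well below the 4.5:1 AA minimum
// for body text, i.e. close to invisible by design.
console.log(contrastRatio("#4a4a4a", "#000000").toFixed(2));

// The same link in white passes easily at about 21:1.
console.log(contrastRatio("#ffffff", "#000000").toFixed(2));
```

A link that measures around 2.4:1 is not an accident of branding; it is text engineered not to be seen.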
-
🔮💻 Have you ever been tricked online into doing something you didn’t initially intend? Maybe you ended up subscribing to an email newsletter you didn’t want or found yourself unable to easily cancel a subscription. If so, you’ve been a victim of what’s known as ‘Dark Patterns’ in User Experience (UX) Design. 🕸️🎭

Dark Patterns are deceptive techniques used in websites and apps, deliberately designed to make users do things they wouldn’t typically choose to do. This could be anything from signing you up for recurring bills to making it difficult to delete an account or surreptitiously adding items to your shopping cart. While these techniques might increase short-term metrics (like conversion rates), they do so at the expense of user trust and long-term customer loyalty. It’s just like a mouse trap for every user. 📉👥

As UX designers, it is our responsibility to advocate for the user and ensure that we are designing ethically. This means prioritizing transparency, honesty, and respect in our designs. 👩💻🔎🎨

Next time you’re designing an interface, ask yourself:
1️⃣ Is this choice architecture helping users make the best decision for them, or is it pushing them towards a decision that benefits the business?
2️⃣ Are we making it easy for users to understand what they’re opting into?
3️⃣ Are we respecting users’ time and attention?

I challenge you to be part of the solution, to use your design skills to create experiences that respect and empower users, not manipulate them. 💪🌟

Share your thoughts below on how you ensure ethical decision-making in your design process! Let’s learn from each other and collectively make the digital world a better place. 🌐🤝💬

#uxdesign #darkpatterns #ethicsindesign #design #designcommunity
-
UX has a dark side: deceptive patterns—UI patterns intentionally designed to deceive or confuse users in order to help the company achieve its own goals, without delivering value to the user or respecting their trust.

Here, we see a simple on-off switch for a cookie consent dialog. But what is on, and what is off? In the Apple HIG and Google’s Material Design, the switch is on when it’s flipped to the right, and the user gets further confirmation that the switch is on from a color added to the switch’s track. But this switch does the opposite of the industry leaders—and it does so in a way that can conceivably deceive users into consenting to trackers collecting their information when they think they’re opting out.

Companies believe decisions like this are beneficial to the business, but this actually increases the risk of the company getting sued—and lawsuits about this kind of thing happen often. Having clear, easy-to-understand design isn’t just beneficial to users; it protects the business as well.
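For illustration only (this is not the dialog from the post), one way to keep a consent switch from misrepresenting its own state is to derive the label, the accessible state, and the styling from a single boolean that defaults to off. A minimal browser-side TypeScript sketch, with invented class names and copy:

```typescript
// Minimal sketch, assuming a browser environment. Class names, copy,
// and the category list are invented for illustration.

type ConsentCategory = "analytics" | "advertising";

function renderConsentSwitch(category: ConsentCategory, container: HTMLElement): void {
  let enabled = false; // privacy-protective default: tracking starts OFF

  const toggle = document.createElement("button");
  toggle.setAttribute("role", "switch");

  const render = () => {
    // Accessible state, visual state, and label all derive from the
    // same boolean, so they cannot contradict one another.
    toggle.setAttribute("aria-checked", String(enabled));
    toggle.classList.toggle("switch--on", enabled); // colored track only when ON
    toggle.textContent = enabled
      ? `${category} tracking is ON`
      : `${category} tracking is OFF`;
  };

  toggle.addEventListener("click", () => {
    enabled = !enabled;
    render();
  });

  render();
  container.appendChild(toggle);
}

// Usage: renderConsentSwitch("analytics", document.body);
```

Because everything reads from the same `enabled` flag, the visual position, the announced state, and the stored consent value cannot drift apart the way they do in the reversed switch described above.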
-
#Addictive #patterns in the processing of #personaldata - interesting report issued by the Spanish DPA that highlights how providers implement misleading and addictive design patterns to prolong the time users stay on their services or to increase their level of engagement and the amount of personal data collected about them.

"#Deceptive patterns are considered as interfaces and user experiences implemented on #socialmedia platforms that lead users to make unintended, unwilling and potentially harmful decisions regarding processing their personal data. Addictive patterns will be defined in this document as design features, attributes or practices that determine a particular way of using #digitalplatforms, #applications and services intended to make users spend much more time using them or with a greater degree of commitment than what is expected, convenient or healthy for them. Both characteristics of a design pattern, its deceptive and addictive nature, are closely related, although they are not the same.

Adding such operations implementing addictive patterns to personal data processing has implications for several #dataprotection aspects, like the #lawfulness of the processing (in particular over the #consent conditions or the prohibition to processing special categories of personal data), #fairness and #transparency, purpose limitation, data #minimization, data protection by design and by default, and #accountability."

The paper is available at https://lnkd.in/dtu-YtC4.

#gdpr #privacy
-
I hate Dark Patterns! They are manipulative UX tricks that pressure, trick, or overwhelm users into giving consent, sharing data, or spending money.

Today’s offender: the Blind app, a gossip platform for tech workers. I saw a push notification about a new post, so I clicked on it, and a screen popped up asking for consent to share data. The video shows what happens next:

1. A consent screen pops up with only “Agree” prominently displayed.
2. A “More Options” link sits in the top right.
3. Clicking it gives two options: AGREE TO ALL or SAVE & EXIT — but all categories are preselected, so both mean agree to all.
4. To opt out, you have to manually unselect each category.
5. The consent partners list has 1,500+ entries, all preselected, which must be unchecked one by one 😡
6. You can’t use the app at all unless you agree or manually unselect what you don’t want out of the 1,500+.

From 1-10, how shady would you rate this example?
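For contrast, here is a minimal sketch of what a symmetric, opt-in default could look like in code. The category names and helper functions are invented for illustration; nothing here is taken from Blind's actual implementation.

```typescript
// Minimal sketch of an opt-in default with symmetric choices.
// Category names and function names are invented for illustration.

type ConsentChoices = Record<string, boolean>;

const CATEGORIES = ["analytics", "advertising", "personalization"];

function defaultChoices(): ConsentChoices {
  // Nothing is preselected: silence means "no".
  return Object.fromEntries(CATEGORIES.map((c): [string, boolean] => [c, false]));
}

function acceptAll(): ConsentChoices {
  return Object.fromEntries(CATEGORIES.map((c): [string, boolean] => [c, true]));
}

function rejectAll(): ConsentChoices {
  // One tap, exactly as cheap as acceptAll -- no unchecking
  // 1,500+ partner entries by hand.
  return defaultChoices();
}
```

The key property is that rejecting everything costs exactly one action, the same as accepting everything.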
-
Are you up to date on your #DarkPatterns booster shots? Though subverting user choice can be a problem in any context, it's increasingly important to avoid when it comes to #privacy choices.

Today, the California Privacy Protection Agency posted a new "enforcement advisory" reminding businesses to "carefully review and assess their user interfaces" when offering privacy choices to "ensure that they are offering symmetrical choices and using language that is easy for consumers to understand." Press release here: https://lnkd.in/dHHqvMcc

For California, #SymmetricalChoices is what my legal writing professor would call the “Phrase that Pays.” The enforcement advisory’s focus on this concept does not offer any new guidance, but reiterates the requirements under the CCPA and the implementing regulations. Together, these clarify the meaning of "dark patterns" under California's privacy law, where consent is considered invalid if based on an agreement obtained through the use of dark patterns.

The advisory comes on the heels of the intergovernmental report released this summer by the FTC and two international consumer protection networks, which "showed a large percentage of the websites and mobile apps examined may use dark patterns, digital design techniques that may manipulate consumers into buying products or services or giving up their privacy." That review was not limited solely to privacy choices, but highlighted the need for businesses to reexamine their approach to individual choice across contexts: https://lnkd.in/dUr-Fuka

This is the second advisory from the CPPA’s enforcement division. The first one focused on applying data minimization to consumer requests.
-
The Kids Online Safety Act (KOSA) is about design, not content.

Example 8, What is a dark pattern? “Dark patterns on online interfaces of online platforms are practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them.” https://lnkd.in/g2i_NfCv

In other words: manipulation, deception, and subverting and/or impairing user autonomy and choice.

Last year we opened some Snapchat test accounts and found one dark pattern right away. Here are some screenshots. Check out the “Enable app permissions to make sign up easy” and “Tap ‘allow’ when prompted” messaging that fades into the background when Snap asks for access to your contacts. That’s a dark pattern.

But Snap didn’t stop there … when we said no, it asked us again, and then again. That’s harassment, combined with dark patterns. We shouldn’t need a law to tell for-profit companies that they don’t get to subvert or impair user autonomy or choice, and yet, here we are. These images all came from a single account opening.

If you still think that the Kids Online Safety Act is about content, please read it again. KOSA is about product design. In fact, a covered entity could take down no content at all and still fully comply with KOSA, or take down everything anyone might find offensive and still be in violation.

#PassKOSA #PassKOSPA #NoMore #NoChildLostToSocialMedia #SMVLC
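On the repeated contact-permission prompts specifically: a respectful flow asks once and treats a denial as final. A small hedged sketch in browser-style TypeScript; the storage key and the prompt callback are invented placeholders, and this is in no way Snapchat's actual code.

```typescript
// Minimal sketch: ask for a permission once and treat "no" as final.
// The storage key and the prompt callback are invented placeholders;
// this is not any real app's permission flow.

const CONTACTS_DECISION_KEY = "permissions.contacts.decision";

type Decision = "granted" | "denied" | null;

async function requestContactsOnce(
  showPrompt: () => Promise<boolean>
): Promise<boolean> {
  const prior = localStorage.getItem(CONTACTS_DECISION_KEY) as Decision;

  if (prior === "denied") return false;  // a previous "no" is final: do not nag
  if (prior === "granted") return true;  // already answered: do not re-ask

  const granted = await showPrompt();
  localStorage.setItem(CONTACTS_DECISION_KEY, granted ? "granted" : "denied");
  return granted;
}
```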