Deceptive patterns are designs that trick or manipulate users into taking actions that are not in their best interest. They are widespread on the web because they are phenomenally effective at boosting conversions. However, their use is unethical and legally problematic.

This article provides an overview of how deceptive patterns can manifest and discusses how to identify them in your designs.

What Are Deceptive Patterns?

Definition: A deceptive pattern is a design pattern that prompts users to take an action that benefits the company employing the pattern, by deceiving, misdirecting, or shaming users, or by obstructing their ability to make a different (less profitable) choice.

Deceptive patterns harm users by causing financial loss, loss of privacy, and loss of legal rights or control. Furthermore, deceptive patterns are more likely to succeed with vulnerable users, such as time-poor users or users with lower literacy and digital-literacy levels, which makes the practice even more troubling. Designers should avoid deceptive patterns and actively argue against their use.

Origin and Evolution

The concept of deceptive patterns (originally called “dark patterns”) was coined by Harry Brignull in 2010. The name was a play on the term “design pattern,” which refers to a common user-interface design for a specific component or experience.

Sludge is a concept related to deceptive patterns.

Definition: Sludge is the opposite of a nudge, the term behavioral economists use for a tactic that encourages a beneficial behavior, such as saving for retirement.

Richard Thaler and Cass Sunstein define sludge as “any aspect of choice architecture consisting of friction that makes it harder for people to obtain an outcome that will make them better off (by their own lights).” Some deceptive patterns are sludges: for instance, a cookie-consent dialog that makes the user click through many screens to reject targeting or marketing cookies. However, deceptive patterns go beyond friction; they can also involve deceit, trickery, and emotional manipulation.
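
One way to make the notion of friction concrete is to compare the interaction cost of each outcome in a flow. The following is a minimal sketch of such an audit; the flow, type names, and step counts are hypothetical, invented purely for illustration:

```typescript
// Minimal sludge audit: compare the interaction cost of each outcome.
// All names and step counts here are hypothetical.

interface FlowPath {
  outcome: "acceptAll" | "rejectAll";
  steps: number; // clicks/screens required to reach this outcome
}

// A hypothetical cookie-consent dialog: accepting takes one click, while
// rejecting requires opening settings, toggling categories, and confirming.
const consentFlow: FlowPath[] = [
  { outcome: "acceptAll", steps: 1 },
  { outcome: "rejectAll", steps: 4 },
];

// Ratio of the cost of rejecting to the cost of accepting.
function sludgeRatio(paths: FlowPath[]): number {
  const accept = paths.find((p) => p.outcome === "acceptAll")!.steps;
  const reject = paths.find((p) => p.outcome === "rejectAll")!.steps;
  return reject / accept;
}

console.log(sludgeRatio(consentFlow)); // 4 -- rejecting costs 4x the effort
```

A ratio well above 1 suggests that the less profitable outcome has been deliberately burdened with friction.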

Deceptive patterns are prevalent on the web and in digital design. A 2019 study by researchers from Princeton University and the University of Chicago found deceptive design patterns on over 10% of a sample of roughly 11,000 popular ecommerce sites. A 2020 study by researchers from the University of Zurich found deceptive patterns in 95% of 240 free, trending apps sampled from the Google Play store; on average, each of these apps contained about 7 deceptive patterns.

The prevalence of deceptive patterns is partly due to two common phenomena:

  1. A/B testing and the focus on driving conversions: The rise in popularity of A/B and multivariate testing for optimizing conversions has driven the adoption of deceptive design patterns: when conversion rate is the only success metric, a variant that tricks users can look like a winner (see the sketch after this list).
  2. Copycat designs: Many companies imitate the designs of their competitors. While this practice has helped to establish and standardize design patterns, it has also resulted in the proliferation and perceived legitimization of deceptive design patterns.
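
To illustrate the first point, below is a minimal sketch of how a conversion-only decision rule can crown a deceptive variant. The variant names and numbers are hypothetical, not drawn from any real experiment:

```typescript
// Hypothetical A/B-test readout; variant names and numbers are invented.
interface VariantResult {
  name: string;
  conversionRate: number; // share of visitors who purchased
  refundRate: number;     // share of buyers who later refunded or complained
}

const results: VariantResult[] = [
  { name: "honest-checkout", conversionRate: 0.031, refundRate: 0.01 },
  { name: "preselected-addons", conversionRate: 0.042, refundRate: 0.3 },
];

// A conversion-only rule declares the deceptive variant the winner...
const byConversion = [...results].sort(
  (a, b) => b.conversionRate - a.conversionRate
)[0];
console.log(byConversion.name); // "preselected-addons"

// ...while a rule that discounts conversions lost to refunds does not.
const netRate = (v: VariantResult) => v.conversionRate * (1 - v.refundRate);
const byNetOutcome = [...results].sort((a, b) => netRate(b) - netRate(a))[0];
console.log(byNetOutcome.name); // "honest-checkout"
```

The problem is the metric, not the math: if an experiment never measures downstream harm such as refunds, complaints, or churn, deceptive variants will keep winning.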

Examples of Deceptive Patterns

There are many types of deceptive patterns, and there is extensive literature from UX practitioners, scholars, and legal experts on classifying deceptive patterns. Since there are too many deceptive patterns (and differing nomenclatures) to note in this article, some prominent examples are given below.

  • Obstruction: Making it difficult for users to make a choice that doesn’t benefit the company, for example, by increasing the interaction cost of finding the relevant information or capability. In 2023, the US Federal Trade Commission sued Amazon partly over the obstruction customers faced when trying to unsubscribe from Prime online.
  • Visual or wording tricks: When a design exploits the limits of human cognition, takes advantage of cognitive biases, or subverts users’ expectations by breaking common, well-practiced design patterns. Examples include misleading wording for questions and answer options (like using double negatives) or moving a Close button far outside a modal.
  • Nagging: Repeatedly pestering users to agree to something, even though they may have already declined the request. For example, if a mobile-app permission request is triggered again and again, users may give up rejecting it because doing so repeatedly is too effortful.
  • Emotionally manipulative designs: When designs scare, guilt, or shame users into making a choice that the business favors. For example, many ecommerce websites rely on modal dialogs asking users to provide an email address in exchange for a discount code. The design would be emotionally manipulative if the link to decline read No thanks, I don’t like saving money. (Such a link is called a manipulink, and the practice has become known as “confirmshaming.”)
  • Sneaking or preselection: When a purchase funnel automatically adds extra items to the basket or preselects options that are not required. In these cases, users must notice the added option and remove or uncheck it (see the sketch after this list).
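
As a concrete illustration of sneaking, the sketch below (hypothetical names and prices) contrasts a preselected add-on with an opt-in default, and shows why order totals should reflect explicit user choices rather than defaults:

```typescript
// Hypothetical add-on model: the same item as a preselected (sneaky)
// default versus an opt-in default.
interface AddOn {
  id: string;
  label: string;
  price: number; // USD
  defaultChecked: boolean;
}

// Sneaking/preselection: charged unless the user notices and unchecks it.
const sneaky: AddOn = {
  id: "insurance",
  label: "Shipping insurance",
  price: 4.99,
  defaultChecked: true,
};

// Fairer alternative: the identical add-on, but strictly opt-in.
const fair: AddOn = { ...sneaky, defaultChecked: false };

// A naive total that honors defaults silently charges for the sneaky add-on.
function naiveTotal(base: number, addOns: AddOn[]): number {
  return addOns.reduce((s, a) => s + (a.defaultChecked ? a.price : 0), base);
}
console.log(naiveTotal(25.0, [sneaky])); // 29.99

// A fair total counts only add-ons the user explicitly chose.
function orderTotal(base: number, addOns: AddOn[], chosenIds: Set<string>): number {
  return addOns.reduce(
    (sum, a) => sum + (chosenIds.has(a.id) ? a.price : 0),
    base
  );
}
console.log(orderTotal(25.0, [fair], new Set<string>())); // 25
```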

Many deceptive patterns are now illegal under various consumer- and data-protection laws. For example, processing a user’s data without explicit consent violates the EU’s General Data Protection Regulation (GDPR). Thankfully, governments and regulators are catching up with deceptive designs and outlawing specific patterns.

What Makes Something a Deceptive Pattern?

Sometimes, designers struggle to distinguish a persuasive design from a deceptive one. Take, for example, the signup page for LinkedIn Premium. The design utilizes social proof (by highlighting the user’s connections who use the service) and plays into the cognitive bias of anchoring (the regular price is crossed out to highlight the one-month free trial). Assuming the content is accurate and not fabricated, the design is persuasive, not deceptive.

LinkedIn Premium’s signup page shows persuasive content showcasing the advantages of Premium and the features provided. The design also showcases the profiles of connections using Premium, with the tagline “Navneet and millions of other members use Premium.” Additionally, a free trial is offered: the usual price of $19.99 is crossed out and replaced with a 1-month free trial.
LinkedIn’s design utilizes social proof (by showing the user their connections who use the product) and anchoring (through the crossed-out pricing, highlighting the free trial).

Booking.com (below) also uses social proof (through featured reviews that appear on top of the hotel images) and scarcity (by showing how many rooms are left to book). Assuming that the number of remaining rooms and the reviews are authentic, and that the company hasn’t employed shady business practices (like releasing room availability in blocks or paying people for positive reviews), the design isn’t deceptive but persuasive (although some users might perceive it as “salesy”).

A hotel-overview page from Booking.com. Featured reviews appear at the top of the page, over the hotel images, and room availability is shown; text next to each room reads “Only 5 rooms left on our site.” The screenshot is annotated to show that these elements use social proof and scarcity.
Booking.com utilizes the marketing tactics of scarcity and social proof to encourage conversions.

On closer inspection, however, Booking.com’s design does appear to involve an element of deception. The carousel of positive reviews at the top of the page looks complete and isn’t clickable. Users who want to read more of Jennifer’s review need to know to scroll down to the review section, where her review appears under See what guests loved the most, now accompanied by a Read more link. You’d be forgiven for thinking that the Read more link would reveal more of a positive review. It doesn’t; it reveals the negative comments that Booking.com doesn’t want you to see easily.

Two screenshots from Booking.com. The first shows a partial review of a hotel from a customer called Jennifer. The second shows her full review, which appears in a modal after the user selects the Read more link. The partial review contains only positive comments, but the full review includes both positive and negative comments.
Booking.com’s full reviews are hard to find. From the hotel’s page, the reviews appear mostly positive; users must click Read more or Read All Reviews to see full reviews rather than just guests’ positive comments.

It’s hard to imagine that Booking.com didn’t deliberately design its website this way to encourage bookings, especially since customers are asked to provide feedback in an unusual manner (in separate spaces for the good and the bad aspects of their stay). Regardless of whether the design was intentional, some mild deception is at play. This mild deception, blended with scarcity and availability bias (good reviews are readily available), can lead time-poor users to make a booking without careful consideration, which is precisely what the business wants them to do.

In the example above, the deception became apparent only after careful consideration of what users might expect and of how the design works. Some deceptive designs are hard to spot immediately; identifying and preventing them requires deliberate thought and questioning.

How to Avoid Deceptive Patterns

Testing your designs regularly with users can help identify obvious issues or blatant deceptive patterns. Researchers have found that obvious deceptive patterns often elicit anger and distaste from users, so if your design employs them, you’ll probably hear about it from your usability-testing participants. However, don’t wait for users to identify deceptive patterns: as researchers Jamie Luguri and Lior Strahilevitz found, mild deceptive patterns often go unnoticed, especially by users with lower education levels.

We must scrutinize our designs to ensure that they are ethical. A structured way of doing this is to run an amended cognitive walkthrough: walk through the design and ask yourself a series of questions, such as the following:

  • Could users spend more or provide more of their data than they intended or needed?
  • When users consent to something in exchange for a capability, product, or experience, is the exchange fair and appropriate?
  • Is the information presented about each choice factually correct?
  • Given how the information or options are presented, could users easily misinterpret the choices (or availability of choice)?
  • Could users miss another choice in the interface (for example, because it is obscured or in a location the user might not expect)?
  • Could users miss an essential piece of information that would assist them in making a choice?
  • Can users quickly access all the information about each choice available to them?
  • Can users quickly implement a choice they want to make (or do many unnecessary steps block them)?
  • Are users rushed into making a decision?
  • Are users unfairly pressured or emotionally manipulated when making a choice?
  • Could users feel ashamed, nervous, or guilty when declining a choice?

To create empathy and get your team to adopt a user-centered mindset, ground these questions by referring to a persona of your most vulnerable user.
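
If it helps your team run the walkthrough consistently, the questions can also be captured as a simple checklist artifact. The sketch below is one hypothetical encoding; the types, helper, and condensed question set are invented for illustration:

```typescript
// Hypothetical encoding of the walkthrough questions as a review checklist.
// For each item, one answer signals a potential deceptive pattern.
interface WalkthroughItem {
  question: string;
  riskyAnswer: "yes" | "no";
}

const checklist: WalkthroughItem[] = [
  { question: "Could users spend more or share more data than intended?", riskyAnswer: "yes" },
  { question: "Is the information about each choice factually correct?", riskyAnswer: "no" },
  { question: "Could users miss another choice or an essential piece of information?", riskyAnswer: "yes" },
  { question: "Are users rushed, pressured, or shamed into a decision?", riskyAnswer: "yes" },
];

// Return the items whose answers indicate a potential problem.
function flaggedItems(answers: ("yes" | "no")[]): WalkthroughItem[] {
  return checklist.filter((item, i) => answers[i] === item.riskyAnswer);
}

// Example review session: the second and fourth answers flag potential issues.
console.log(flaggedItems(["no", "no", "no", "yes"]).map((i) => i.question));
```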

Conclusion

As UX practitioners, our job is to balance the needs of the business that employs us with the interests of our users. Designers should call out deceptive patterns when they are proposed or implemented and seek fairer, more ethical ways of helping businesses and users achieve their goals.

References

Arunesh Mathur et al. 2019. Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (2019), 1–32. https://doi.org/10.1145/3359183

Harry Brignull. 2023. Deceptive Patterns: Exposing the Tricks Tech Companies Use to Control You (1st ed.). Testimonium Ltd.

Jamie Luguri and Lior Jacob Strahilevitz. 2021. Shining a Light on Dark Patterns. Journal of Legal Analysis 13, 1 (2021), 43–109. https://doi.org/10.1093/jla/laaa006

John Newman, Amy Ritchie, and Staff in the Bureau of Competition & Office of Technology. 2023. FTC Takes Action Against Amazon for Enrolling Consumers in Amazon Prime Without Consent and Sabotaging Their Attempts to Cancel. (June 2023). Retrieved September 25, 2023 from https://www.ftc.gov/news-events/news/press-releases/2023/06/ftc-takes-action-against-amazon-enrolling-consumers-amazon-prime-without-consent-sabotaging-their

Linda Di Geronimo, Larissa Braz, Enrico Fregnan, Fabio Palomba, and Alberto Bacchelli. 2020. UI Dark Patterns and Where to Find Them: A Study on Mobile Applications and User Perception. In CHI Conference on Human Factors in Computing Systems (CHI ’20), April 25–30, 2020, Honolulu, HI, USA. ACM, New York, NY, USA, 14 pages. https://doi.org/10.1145/3313831.3376600

Richard H. Thaler and Cass R. Sunstein. 2022. Nudge: The Final Edition. Penguin Books, London.