“Dark Patterns: Past, Present, and Future. The evolution of tricky user interfaces.”
Communications of the ACM, September 2020, Vol. 63 No. 9, Pages 42-47
By Arvind Narayanan, Arunesh Mathur, Marshini Chetty, Mihir Kshirsagar
“Design is power. In the past decade, software engineers have had to confront the fact that the power they hold comes with responsibilities to users and to society. In this decade, it is time for designers to learn this lesson as well.”
Dark patterns are user interfaces that benefit an online service by leading users into making decisions they might not otherwise make. Some dark patterns deceive users while others covertly manipulate or coerce them into choices that are not in their best interests. A few egregious examples have led to public backlash recently: TurboTax hid its U.S. government-mandated free tax-file program for low-income users on its website to get them to use its paid program; Facebook asked users to enter phone numbers for two-factor authentication but then used those numbers to serve targeted ads; Match.com knowingly let scammers generate fake messages of interest in its online dating app to get users to sign up for its paid service. Many dark patterns have been adopted on a large scale across the Web.
Figure 1 shows a deceptive countdown timer dark pattern on JustFab. The advertised offer remains valid even after the timer expires. This pattern is a common tactic—a recent study found such deceptive countdown timers on 140 shopping websites.
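The mechanism is simple enough to sketch. The following is a hypothetical illustration (not JustFab's actual code) of how such a timer might work: the displayed countdown manufactures urgency, while the offer's validity is entirely independent of it.

```javascript
// Hypothetical sketch of a deceptive countdown timer.
// The label counts down to create urgency, but expiry has no
// effect on the offer: it remains valid indefinitely.
function makeDeceptiveTimer(durationSeconds) {
  let remaining = durationSeconds;
  return {
    // Called once per second by the page to update the display.
    tick() {
      if (remaining > 0) remaining -= 1;
      return remaining;
    },
    // The "urgent" label shown to the user.
    label() {
      return `Offer ends in ${remaining}s!`;
    },
    // The offer itself never actually expires.
    offerStillValid() {
      return true;
    },
  };
}
```

A real page would drive `tick()` from an interval timer and re-render the label; the tell-tale sign of the dark pattern is that nothing in the checkout flow ever consults the timer's state.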
The research community has taken note. Recent efforts have catalogued dozens of problematic patterns such as nagging the user, obstructing the flow of a task, and setting privacy-intrusive defaults, building on an early effort by Harry Brignull (darkpatterns.org). Researchers have also explained how dark patterns operate by exploiting cognitive biases, uncovered dark patterns on more than 1,200 shopping websites, shown that more than 95% of popular Android apps contain dark patterns, and provided preliminary evidence that dark patterns are indeed effective at manipulating user behavior.
Although they have recently burst into mainstream awareness, dark patterns are the result of three decades-long trends: one from the world of retail (deceptive practices), one from research and public policy (nudging), and the third from the design community (growth hacking).
Figure 2 illustrates how dark patterns stand at the confluence of these three trends. Understanding these trends, and how they have collided with one another, helps us appreciate what is actually new about dark patterns, demystifies their surprising effectiveness, and shows why they will be difficult to combat. We end this article with recommendations for ethically minded designers.
Deception and Manipulation in Retail
The retail industry has a long history of deceptive and manipulative practices that range on a spectrum from normalized to unlawful (Figure 3). Some of these techniques, such as psychological pricing (that is, making the price slightly less than a round number), have become normalized. This is perfectly legal, and consumers have begrudgingly accepted it. Nonetheless, it remains effective: consumers underestimate prices when relying on memory if psychological pricing is employed.
More problematic are practices such as false claims of store closings, which are unlawful but rarely the target of enforcement actions. At the other extreme are bait-and-switch car ads such as the one by a Ford dealership in Cleveland that was the target of an FTC action.
The Origins of Nudging
In the 1970s, the heuristics and biases literature in behavioral economics sought to understand irrational decisions and behaviors—for example, people who decide to drive because they perceive air travel as dangerous, even though driving is, in fact, orders of magnitude more dangerous per mile. Researchers uncovered a set of cognitive shortcuts used by people that make these irrational behaviors not just explainable but even predictable.
For example, in one experiment, researchers asked participants to write down an essentially random two-digit number (the last two digits of each participant's social security number), then asked if they would pay that number of dollars for a bottle of wine, and finally asked the participants to state the maximum amount they would pay for the bottle. They found that willingness to pay varied approximately threefold based on the arbitrary number. This is the anchoring effect: lacking knowledge of the market value of the bottle of wine, participants anchored their estimates to the arbitrary reference point. This study makes it easy to see how businesses might nudge customers to pay higher prices by anchoring their expectations to a high number. In general, however, research on psychological biases has not been driven by applications in retail or marketing. That would come later.
About the Authors:
Arvind Narayanan is an associate professor of computer science at Princeton University, Princeton, NJ, USA, where he leads the Princeton Web Transparency and Accountability Project to uncover how companies collect and use our personal information.
Arunesh Mathur is a graduate student in the department of computer science at Princeton University, Princeton, NJ, USA.
Marshini Chetty is an assistant professor in the department of computer science at the University of Chicago, IL, USA.
Mihir Kshirsagar leads the Tech Policy Clinic at Princeton University’s Center for Information Technology Policy, Princeton, NJ, USA.