Dark Patterns Cannot Stay in the Dark

In this Saturday Seminar, we collect scholarship on manipulative online practices that deceive consumers.

TurboTax reportedly tricked consumers into paying for tax filing services even when they were entitled to file their taxes for free.

And Google reversed a design change after receiving backlash for allegedly making search results and advertisements indistinguishable, leading more consumers to click on paid links.

More companies are attracting attention for using dark patterns, online design strategies that steer people into doing things they did not intend. These dark patterns do not accidentally frustrate or distract online users; they often manipulate consumers by design.

Harry Brignull coined the term “dark patterns” in 2010. His original list of 12 dark patterns includes trick questions worded to confuse consumers, adding hidden costs at the last step of a checkout process, and charging credit cards automatically when a free trial service ends. Over the past decade, scholars have developed extensive descriptions and taxonomies of other dark patterns that harm consumers.

Dark patterns present a problem because companies leverage these online strategies to promote their own interests at the expense of consumers. Dark patterns harm individuals by manipulating them into spending more money, disclosing more personal information, and expending more time, energy, and attention than they originally intended. Most dark patterns undermine individual autonomy and lead people to make choices they would not have otherwise made. In terms of collective welfare, dark patterns can also hurt competition, price transparency, and trust in markets.

Recognizing these harms, California, Colorado, and Connecticut now ban companies from using dark patterns to obtain consumer consent for collecting personal data. At the federal level, the bipartisan DETOUR Act would prohibit online platforms with over 100 million users from using dark patterns.

Even without legislation targeting dark patterns, the Federal Trade Commission (FTC) has brought successful enforcement actions against them using its authority to regulate “unfair or deceptive acts or practices.” To examine dark patterns further, the FTC held a workshop in April 2021 that discussed definitions, potential rules, and disproportionate impacts on consumers based on race, income, and age.

In this week’s Saturday Seminar, we collect scholarship that focuses on the prevalence of dark patterns, their effects, and options for further regulation.

  • Websites frequently use dark patterns, Arunesh Mathur of Princeton University and his coauthors assert in an article in the Proceedings of the ACM on Human-Computer Interaction. Mathur and his coauthors found that 11.1 percent of the websites in their study used dark patterns, and that more popular websites were more likely to use them. They explain how dark patterns exploit cognitive biases to affect decision-making, such as by making an item seem scarce or popular. Mathur and his coauthors describe some dark patterns that make false representations, which fall under the FTC’s authority to combat deceptive dark patterns.
  • In an article in Harvard Law School’s Journal of Legal Analysis, Jamie Luguri, a law clerk on the United States Court of Appeals for the Seventh Circuit, and Lior Strahilevitz of the University of Chicago Law School demonstrate the power of dark patterns. In their experiments, they found that users exposed to mild dark patterns were twice as likely to sign up for a service, while aggressive dark patterns nearly quadrupled acceptance rates. They assert that mild dark patterns are more insidious because they influence people without generating backlash or annoyance. After surveying the FTC’s success litigating dark patterns thus far, Luguri and Strahilevitz urge more systematic legal interventions.
  • In an article in the Harvard Journal of Law & Technology, Lauren E. Willis of Loyola Law School Los Angeles argues that dark patterns and deception are inevitable when companies use artificial intelligence to personalize and optimize online experiences. Willis explains that online deception is harder to prove because of questions regarding whether machines can have “intent” to deceive and whether a “reasonable consumer” would be deceived by a customized web experience. Instead of litigating deception, Willis proposes arguing that dark patterns are unfair because they are likely to cause substantial injury and consumers cannot reasonably avoid them.
  • Dark patterns can lead internet users to disclose large amounts of private information even if they have strong privacy concerns. In an article in Current Opinion in Psychology, Ari Ezra Waldman of the Northeastern University School of Law argues that this privacy paradox does not result from consumer carelessness, but from companies using predatory dark patterns to induce disclosure. Waldman explains that this paradox rests on a “myth of rational disclosure.” Because cognitive biases make it difficult to make rational decisions about privacy online, Waldman argues that legislation should impose on online platforms a duty of care, confidentiality, and loyalty to their users.
  • Antitrust law can also regulate dark patterns and protect decisional privacy. In an article in the Alabama Law Review, Gregory Day of the University of Georgia Terry College of Business and Abbey Stemler of the Indiana University Kelley School of Business contrast traditional antitrust law, which relies on high prices to measure consumer welfare, with the new approach needed in digital markets, where companies offer low prices and free services at the expense of consumer privacy and welfare. Using the antitrust theory of coercion, Day and Stemler suggest that regulators should prohibit companies from building market power with dark patterns that coerce users to spend valuable attention, data, and money.
  • “Regulatory pluralism” is the best way to regulate dark patterns because it employs the strengths of both data protection and consumer protection regimes, Mark Leiser of Leiden University asserts in a recent paper. Leiser posits that data protection laws alone cannot address dark patterns because they regulate data controllers but not other companies that influence user experiences online. Leiser proposes using unfairness principles from both data protection laws and consumer protection laws to cover the entire environment of online transactions where dark patterns occur.

The Saturday Seminar is a weekly feature that aims to put into written form the kind of content that would be conveyed in a live seminar involving regulatory experts. Each week, The Regulatory Review publishes a brief overview of a selected regulatory topic and then distills recent research and scholarly writing on that topic.