If you’ve ever found yourself fruitlessly clicking through multiple web or app pages while trying to figure out how to access account features or contact customer service about a specific issue, you may have been caught in a “dark pattern.” The term describes a design practice that intentionally makes navigation to elements publishers would prefer users not access, such as account deletion or requests for personal information, opaque and confusing. A change to California’s state privacy law is the first regulation to directly take on this practice, threatening civil penalties brought under the state’s existing unfair competition laws.
Dark patterns no longer welcome in California
Dark patterns can be applied both proactively and reactively. They might be used to create a maze that dissuades end users from taking a particular action, or they might attempt to trick someone into handing over valuable personal information.
California Attorney General Xavier Becerra announced that the change was part of a set of new regulations developed for the existing California Consumer Privacy Act (CCPA). The new regulations forbid publishers from using “confusing language,” “unnecessary steps” or requirements that users listen to a sales pitch aimed at changing their minds about opting out before they are allowed to access the relevant function. In a statement to the press, Becerra said that the new addition to the state privacy law would “ensure that consumers will not be confused or misled when seeking to exercise their data privacy rights.”
The regulations also require that the relevant icons or buttons for these services be visible and clear. To that end, the state has created a highly visible blue “Privacy Options” icon that can be used to ensure compliance whenever consumers are presented with information or choices about their rights.
If the state identifies a dark pattern, the organization will be given a 30-day cure notice in which the situation must be remedied. If the offending element is not brought back into compliance in that time, the state threatens civil fines (of unspecified amounts) brought under the authority of its unfair competition laws.
Becerra also provided a specific example of a blunter, more straightforward dark pattern that could get an organization in trouble: being suddenly directed to a subscription page while in the midst of browsing a website or watching something in an app.
The concept of dark patterns was referenced as far back as 2010, but became a prominent discussion topic in 2018 with the release of a report from the Norwegian Consumer Council called “Deceived by Design.” The report describes dark patterns as privacy-intrusive default settings and options, misleading wording, elements that give users the illusion of control, and take-it-or-leave-it choices forced on users. It cited several examples from Google’s and Facebook’s services.
The CCPA now specifies that consent obtained via dark patterns is not legally valid. Businesses must also now count the steps it takes to opt in to a service, as the amendment requires that opting out take no more steps than opting in.
Privacy law update leaves room for interpretation, will require precedent
Though the new privacy law amendments are now finalized, there is expected to be a “feeling out” period of precedent decisions determining what exactly constitutes confusing language and unacceptable interface designs.
The requirement that end users not be forced to “listen to reasons why they shouldn’t opt out” could prove relevant to Facebook’s current plans to counter Apple’s new privacy requirements. Reports indicate that Facebook plans to have users view a video extolling the virtues of opting in to targeted advertising before the mandatory pop-up that will soon appear in every app on Apple’s App Store. If Facebook forges ahead with this plan, it will be interesting to see whether California considers it a violation of the new privacy law amendments.
Privacy law that specifically addresses dark patterns is very new; this is the first such measure ever passed in the United States. A federal bill naming dark patterns was introduced in the Senate in 2019; it would have banned them on any platform with over 100 million users. The bill was ultimately never voted on.