More often than not, a user visits a website with a specific task in mind – to browse, to buy, or to learn. But how much of what that user does on the website is done of their own volition?
Companies’ use of “dark patterns” can – often very effectively – steer a consumer toward a decision on their websites contrary to what the consumer actually wants. Some dark patterns misdirect consumers by accident, while others intentionally and coercively “force” them to act in a certain way. These patterns can take the form of trick questions, misleading information, or even hidden information, as well as color blocks and fonts positioned on the page in ways that skew consumer behavior. Even the unintentional use of dark patterns makes decision-making confusing for consumers and erodes trust.
While the Federal Trade Commission has intensified enforcement of consent decrees against companies that strategically implement dark patterns, companies can still violate both federal and state laws by failing to proactively avoid deceptive and unfair practices. The FTC’s test defines an “unfair” act or practice as one that (i) causes or is likely to cause substantial injury to consumers, (ii) is not reasonably avoidable by consumers themselves, and (iii) is not outweighed by countervailing benefits to consumers or competition.
In deciding what constitutes a deceptive or unfair practice, the FTC looks at whether consumers end up with goods, services, or contract terms they did not willingly agree to. Additionally, under basic contract law principles, where there is no express consent to form a contract, the contract will be deemed invalid.
So how can privacy professionals, particularly attorneys, effectively counsel their clients and organizations away from this all too prevalent practice?
To start, there are moral and ethical considerations around the use of dark patterns. Depending on a company’s values, it may or may not care how it makes its money. Conversely, companies that care about their reputation – that focus on ethically attracting customers and on the quality of the customer experience – will undoubtedly build an audience and customer base that respects their transparency.
Moreover, research shows most consumers would be more willing to share information about themselves if companies were transparent and honest about their data practices. Consumers are attracted to, and would rather spend their money with, companies offering honest, plainly worded contracts without the gimmicks.
The question then becomes: how can well-intentioned companies avoid employing dark patterns by mistake? Fortunately, multiple tools are available to help privacy counsel guide their organizations away from misleading or confusing consumers.
Data protection impact assessments
The first is a data protection impact assessment. Under the GDPR, a DPIA is required where “high-risk” projects involving personal information are in play – obligatory for organizations that exceed the law’s risk threshold and collect personal information about individuals through defined fields on the website or through backend tracking. Even in jurisdictions where DPIAs are not required by law, similar processes and evaluations can help identify potential issues. How can you spot potential dark patterns during these reviews for your client and head off future harm?
By law, a DPIA must contain a description of the processing operations, the purposes of processing, and the controller’s legitimate interest in the processing. It also includes a test of the proportionality of the processing against its necessity. Most importantly, a DPIA covers the risks to individuals’ rights and freedoms, including potential dark patterns that can inhibit one’s ability to fully exercise those rights and freedoms. These assessment procedures, even informal ones outside the GDPR, can help you ask the right questions to ensure you’re not engaged in the use of dark patterns.
Some specific features to look for during a review include disguised ads, forced continuity, hidden costs, and price comparison prevention. Colors and color blocking schemes can also mislead or confuse consumers into thinking their choices are different than they actually are or are limited to what they see right in front of them. Using color to push a consumer toward a specific action – buying a plan, adding more money to an account, or accepting terms they may not want to accept – is something to avoid.
The position of EU regulators on dark patterns is clear. For example, France’s CNIL fined Google for building a maze of privacy preferences, and it has published a dedicated guide informing organizations how to avoid dark patterns.
The concerns with dark patterns also apply to the “Do Not Sell My Personal Information” requirement for companies subject to the California Consumer Privacy Act. Effective January 2020, the law does not allow companies to “hide” options through manipulative design. Specifically, the regulations state:
“A business shall not use a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out.” Cal. Code Regs. tit. 11 § 999.315
The law also provides examples of these patterns, including:
“(1) The business’s process for submitting a request to opt-out shall not require more steps than that business’s process for a consumer to opt-in to the sale of personal information after having previously opted out. The number of steps for submitting a request to opt-out is measured from when the consumer clicks on the “Do Not Sell My Personal Information” link to completion of the request. The number of steps for submitting a request to opt-in to the sale of personal information is measured from the first indication by the consumer to the business of their interest to opt-in to completion of the request.
“(2) A business shall not use confusing language, such as double-negatives (e.g., “Don’t Not Sell My Personal Information”), when providing consumers the choice to opt-out.”
“(3) Except as permitted by these regulations, a business shall not require consumers to click through or listen to reasons why they should not submit a request to opt-out before confirming their request.
“(4) The business’s process for submitting a request to opt-out shall not require the consumer to provide personal information that is not necessary to implement the request.”
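The “equal steps” requirement in example (1) above lends itself to a simple self-audit during a design review: count the steps in each flow and compare. A minimal sketch follows; the function name and the example flows are hypothetical illustrations, not real product data or an official compliance test.

```python
# Hypothetical sketch of the CCPA "equal steps" comparison from
# Cal. Code Regs. tit. 11 s. 999.315: the opt-out flow must not
# require more steps than the corresponding opt-in flow.

def violates_equal_steps_rule(opt_out_steps: list[str], opt_in_steps: list[str]) -> bool:
    """Return True if the opt-out flow requires more steps than the opt-in flow."""
    return len(opt_out_steps) > len(opt_in_steps)

# Illustrative flows. Per the regulation, opt-out is counted from the
# "Do Not Sell My Personal Information" link click to completion;
# opt-in from the first indication of interest to completion.
opt_out = [
    "click 'Do Not Sell My Personal Information' link",
    "confirm identity",
    "dismiss retention warning",
    "confirm request",
]
opt_in = [
    "click 'resume personalized offers' toggle",
    "confirm request",
]

print(violates_equal_steps_rule(opt_out, opt_in))  # True -> flow needs redesign
```

A review like this is obviously no substitute for legal analysis, but walking the two flows step by step with the product team is an effective way to surface asymmetries before launch.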
Another tool for privacy counsel is the product review. The use of psychological tricks to get a consumer to agree to terms of service or goods they never wanted in the first place is best addressed during the ideation and design phases. Product designers, by nature, are hired to help attract more customers and users. During product reviews, privacy counsel should ask tough questions about the user journey and UX design choices: How are we ensuring the user has the information and the ability to give informed, freely given consent for how their data will be used? Product teams working alongside privacy professionals can help companies remove obscurity and trickery from their designs to meet compliance requirements and provide better user experiences.
Although users can change their privacy settings after the fact when those controls are made available, they most likely will not, due to usability challenges or inconvenience. Leveling with the product team to create a platform without tricks or obscurity – and with the integrity to be honest about what it is asking for and why – can take a company far.
Consumers are notoriously influenced by the influx of ads, bait-and-switch features on website landing pages, and misleading design schemes. However, they are also increasingly catching on to the game and judge companies harshly when they perceive dishonesty or manipulation. With new privacy laws and regulations emerging every day, privacy professionals can be hopeful of an environment where they have the tools and capability to effectively counsel their own organizations away from dark patterns and toward greater brand trust.