New Senate Bill Targets Dark Patterns Used by Big Tech Giants

By Nicole Lindsey

For years, the biggest tech companies in Silicon Valley – including Facebook and Google – have deployed techniques that essentially “trick” users into handing over their personal data or consenting to give up their personal privacy. Now a bipartisan group of lawmakers in the U.S. Senate is fighting back against these so-called “dark patterns.” The new Deceptive Experiences To Online Users Reduction (DETOUR) Act, if signed into law, would make it illegal for the nation’s biggest tech companies (defined as those with over 100 million monthly active users) to “design, modify or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”

Crackdown on dark patterns

There’s obviously a lot to unpack here. The bill, co-authored by Senators Mark Warner (D-Virginia) and Deb Fischer (R-Nebraska), is squarely aimed at some of the deceptive and manipulative practices uncovered over the past year. In fact, the DETOUR Act comes almost exactly one year after Facebook CEO Mark Zuckerberg was called before Congress to testify about his company’s business practices as they pertained to user privacy. And in the 12 months that followed, a series of articles in the mainstream media uncovered more of these deceptive practices at companies such as Google.

In the online world, these deceptive practices are collectively known as “dark patterns.” The term, coined by user interface (UI) and user experience (UX) expert Harry Brignull, refers to the myriad ways that companies obtain the personal data of users, or make it close to impossible for users to remove themselves from a service. For years, social media platforms have used them to ensnare users. One of the dark patterns highlighted by Brignull is known as “Privacy Zuckering”: any tactic that tricks users into sharing more information than they intended to. This might include default settings that are intentionally designed to share as much data as possible – making it incumbent upon users to scroll through a confusing series of screens in order to protect their privacy.
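To make the default-settings tactic concrete, here is a minimal, hypothetical sketch in TypeScript. The setting names and navigation paths are invented for illustration and do not describe any particular company’s product; the point is simply that every sharing option starts switched on and each opt-out is buried on a different screen.

```typescript
// Hypothetical "Privacy Zuckering" defaults: every sharing option starts
// enabled, so maximal data sharing happens unless the user objects.
interface SharingSettings {
  shareProfileWithAdvertisers: boolean;
  shareLocationHistory: boolean;
  shareContactList: boolean;
  allowSearchEngineIndexing: boolean;
}

const defaultSettings: SharingSettings = {
  shareProfileWithAdvertisers: true,
  shareLocationHistory: true,
  shareContactList: true,
  allowSearchEngineIndexing: true,
};

// Each opt-out lives at the end of a different navigation path, so turning
// everything off means hunting through four separate settings screens.
const optOutPaths: Record<keyof SharingSettings, string[]> = {
  shareProfileWithAdvertisers: ["Settings", "Ads", "Ad Personalization"],
  shareLocationHistory: ["Settings", "Account", "Activity", "Location History"],
  shareContactList: ["Settings", "Privacy", "Connections", "Contact Upload"],
  allowSearchEngineIndexing: ["Settings", "Privacy", "Discoverability"],
};

// A privacy-respecting design would flip the defaults to false and surface
// all of these choices on a single, clearly worded consent screen.
console.log(defaultSettings, optOutPaths);
```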

Other dark patterns are even more controversial. For example, the “Roach Motel” is a tactic in which companies make it very easy to sign up for a service, but close to impossible to leave it. Moreover, some companies carefully craft online experiences geared at convincing users to hand over their personal data. For example, they might use a very cheerful “Click OK to continue” button, with no other warning or hint that the user has just given consent. They might disguise ads as ordinary content, or use forced continuity to quietly keep billing users once a free trial ends. Through misdirection, they might make price comparison nearly impossible and thereby undermine customer choice. They might use “friend spam” as a way to get other users to sign up. Or they might bury certain practices deep in their Terms of Service, hoping users never read them.
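The asymmetry behind the Roach Motel can be illustrated with a short, hypothetical sketch. The function names and cancellation hurdles below are invented for illustration; they simply show how sign-up can be a single effortless step while cancellation is gated behind one obstacle after another.

```typescript
// Hypothetical "Roach Motel" flow: one friendly call to join, a gauntlet to leave.
function signUp(email: string): void {
  // A single cheerful "Click OK to continue" also records consent to data sharing.
  console.log(`Account created for ${email}; data-sharing consent recorded by default.`);
}

function cancelAccount(email: string, confirmedByPhone: boolean, retentionOfferDeclined: boolean): void {
  // Cancellation only succeeds once the user has cleared every extra hurdle.
  if (!confirmedByPhone) {
    console.log("To cancel, please call customer support during business hours.");
    return;
  }
  if (!retentionOfferDeclined) {
    console.log("Before you go, here is a special offer just for you...");
    return;
  }
  console.log(`Account for ${email} scheduled for deletion in 30 days.`);
}

signUp("user@example.com");                       // effortless
cancelAccount("user@example.com", false, false);  // blocked at the first hurdle
```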

In other cases, apps and websites find ways to access contacts, photos, or other sensitive information. Apps are among the worst offenders at creating interfaces that intentionally limit understanding. (One notable example is the case of Cambridge Analytica, in which a simple Facebook quiz unlocked the personal data of tens of millions of other users.) Clearly, the lawmakers want to crack down on manipulative user interfaces. Once an app has access to your contacts, it can engage in “friend spam,” and even make continued use of the app contingent on getting other users to sign up.

Implications of the DETOUR Act for dark patterns

If the new Senate bill becomes law, it would essentially outlaw these dark patterns. According to Senator Warner, the goal is to “instill a little transparency” in the various ways that companies like Facebook and Google try to obtain user data. In addition, the DETOUR Act would crack down on dark patterns specifically designed to ensnare or addict users under the age of 13. For example, auto-playing the next video as soon as one finishes is one way to keep young users watching video after video after video.

And, finally, the DETOUR Act provides for the creation of professional standards bodies that would work alongside the Federal Trade Commission (FTC) to crack down on the most egregious dark patterns. The thinking here is that tech giants would need to submit their UI and UX experiments to a panel of experts and get approval before moving forward with new behavioral or psychological experiments designed to sign up or convert users.

It’s here, however, that things get a little interesting. That’s because all tech companies – both large and small – routinely engage in what is known as A/B testing: they run a quick online experiment to see whether Version A or Version B works better at accomplishing a certain business goal. On a first reading, the DETOUR Act would make A/B testing illegal – or, at least, very difficult to do quickly without first obtaining consent. The Internet world moves very fast, and by the time a professional standards body or review board has signed off on a practice, it may already be too late.
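For readers unfamiliar with the mechanics, a basic A/B test can be sketched in a few lines. The hashing scheme, variant names and conversion metric below are illustrative assumptions rather than any company’s actual implementation; the idea is just that each user is deterministically assigned to one of two versions and the chosen metric is compared between them.

```typescript
// Minimal A/B test sketch: each user is deterministically bucketed into
// variant "A" or "B" by hashing their user ID, and a business metric
// (here, sign-up conversions) is tallied per variant.
function hashUserId(userId: string): number {
  // Simple, stable string hash (not cryptographic) so the same user always
  // sees the same variant across visits.
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  return hash;
}

function assignVariant(userId: string): "A" | "B" {
  return hashUserId(userId) % 2 === 0 ? "A" : "B";
}

const conversions: Record<"A" | "B", number> = { A: 0, B: 0 };

function recordSignUp(userId: string): void {
  conversions[assignVariant(userId)] += 1;
}

// After enough traffic, the variant with the higher conversion rate "wins".
recordSignUp("user-123");
recordSignUp("user-456");
console.log(conversions);
```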

Broad bipartisan support for the DETOUR Act

As might be expected, the DETOUR Act has broad bipartisan support in the U.S. Congress – an exceedingly rare occurrence in today’s fractured and frayed political scene. Congressional lawmakers appear to be tired of hearing Silicon Valley executives explain exactly how they manipulate users into giving up their data through what appear to be exploitative or deceptive practices. And, for a public that is growing weary of the shenanigans at companies like Facebook and Google, it’s important for lawmakers to show that they are doing something to solve the problem.

In fact, Senator Warner says that the DETOUR Act is just the first of several bills that he will be introducing in the U.S. Senate. The next bill will likely focus on data transparency and the various ways that large Silicon Valley tech companies buy, sell and trade user data in a shadowy ecosystem of data brokers. The message is clear: if the Web’s large online operators cannot regulate themselves, then government will regulate them.

Setting the table for a federal privacy bill

However, while passing a bill like the DETOUR Act is a great first step, what about a comprehensive federal privacy bill for the United States? In many ways, the DETOUR Act and its focus on dark patterns can be seen as just the first, small piece of a much larger puzzle that will bring the U.S. closer to European GDPR standards.

And, as might be expected, big tech and social media companies have thus far been very quiet in their reaction to the DETOUR Act. Microsoft is the only tech giant that has spoken out in favor of the bill, with Facebook notably silent on the matter. Just like the lawmakers, who see the DETOUR Act as the first in a series of legislative steps, tech companies are probably preparing for a long, arduous battle in 2019 and beyond. It appears that the days of self-regulation on the Internet are coming to an end, and it is now time for Facebook and Google to focus on a compromise solution that will not cause too much disruption to their business models and overall profitability.