I woke up on May 24 to the internet proclaiming the privacy downfall of DuckDuckGo.
An external auditor reported on a “secret data flow list” that enables data sharing with Microsoft for third-party advertising. The audit describes how DuckDuckGo’s web browser did not block data transfers to Microsoft-owned ad platforms—LinkedIn and Bing—even when the auditor was visiting a site that was not a Microsoft property. The audit is nuanced, and I think the auditor’s own commentary is the best way to relay the findings simply. One main takeaway is this: DuckDuckGo intentionally left certain third-party trackers unimpeded while many users assumed the product would block them.
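To make the finding concrete, here is a minimal, purely illustrative sketch of how a carve-out inside a tracker blocker can work. None of this is DuckDuckGo’s actual code; the domain names, the exception list, and the function shape are assumptions chosen only to show the pattern the audit describes: a blocklist that quietly skips certain third-party domains.

```typescript
// Illustrative sketch only: a simplified third-party request filter with a
// hypothetical carve-out list. Not DuckDuckGo's implementation; real blockers
// match registrable domains and use far larger, maintained blocklists.

const TRACKER_DOMAINS = new Set([
  "bat.bing.com",
  "ads.linkedin.com",
  "doubleclick.net",
]);

// A quiet exception list like this is exactly what users cannot see
// unless it is disclosed or the source code is open to inspection.
const EXCEPTED_DOMAINS = new Set(["bat.bing.com", "ads.linkedin.com"]);

function shouldBlock(requestUrl: string, pageUrl: string): boolean {
  const requestHost = new URL(requestUrl).hostname;
  const pageHost = new URL(pageUrl).hostname;

  // First-party requests are not treated as third-party trackers.
  if (requestHost === pageHost) return false;

  // The carve-out: known trackers on this list are let through anyway.
  if (EXCEPTED_DOMAINS.has(requestHost)) return false;

  return TRACKER_DOMAINS.has(requestHost);
}

// On a non-Microsoft page, a request to a Microsoft-owned ad domain sails
// through, while another known tracker is blocked:
console.log(shouldBlock("https://bat.bing.com/p.js", "https://example.com"));     // false
console.log(shouldBlock("https://doubleclick.net/ad.js", "https://example.com")); // true
```

The point of the sketch is not the code itself but its visibility: a two-line exception buried in a shipped binary is invisible to users, while the same two lines in an open repository invite exactly the kind of scrutiny this audit performed.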
Branding itself a privacy champion, DuckDuckGo has grown in reputation and size over the past decade, offering a privacy-minded search alternative to Google alongside a recent mobile browser app and an even more recent desktop browser app. Especially in the past few months, I’ve seen that growth firsthand: billboards (“Tired Of Being Tracked Online? We Can Help”), airport posters, and radio ads. The messaging touts how DuckDuckGo will block the tracking that runs rampant on other search engines and web browsers.
There is a great temptation to shame DuckDuckGo. Indeed, the company’s internal agreements contradicted what its users expected, some of whom adopted DuckDuckGo specifically because they are subject to surveillance and manipulation. The data sharing directly opposes the principle of “Predictability” in the National Institute of Standards and Technology’s framework for privacy engineering. Yet I want to take a different perspective here.
The DuckDuckGo disclosure is an opportunity to reflect, for individuals and companies alike. As an individual, who defines what privacy means to you? As an organization, is your internal definition of “user privacy” consistent with what your users expect? The disclosure is also the consequence of a deeper issue: right now in most of the United States, our definitions of privacy—when it is respected, when it is violated—largely come from companies that stand to profit from their proprietary notions of privacy. It is a potent call for comprehensive consumer privacy legislation, as a means to codify privacy rules through a public institution rather than a business that is itself playing the game.
Identifying who defines privacy
For the countless people who drove past the DuckDuckGo billboards, walked by the airport posters, or listened to the radio ads, these encounters may have shaped their definitions of privacy. “Tired Of Being Tracked Online?” Yes, I am! the passerby says. It’s not a stretch to see how people develop notions of privacy around this rhetoric: So my privacy is protection from online tracking, from my data being sold. These definitions of privacy become associated with the organizations that offer the terms: DuckDuckGo, in this case. We develop our definitions of privacy from a variety of sources: law (if you live in a place like California, where privacy is a constitutional right), our social environments, and privacy-branded products like DuckDuckGo.
The variation in privacy definitions is understandable, particularly given the lack of comprehensive consumer privacy law in the U.S. Instead of a public institution determining what constitutes a privacy right or a privacy violation, the most vocal arbiters of privacy are companies selling their own notions of it for profit. I think of the Android Privacy Sandbox and, released just a couple of days before I wrote this, Apple’s high-profile television ad for its App Tracking Transparency feature. Companies are offering privacy because people want it. Companies are selling their own definitions of privacy—even when it’s not all it’s cracked up to be—because no external authority in the U.S. has stepped in to level the playing field.
To paraphrase Proton CEO Andy Yen’s remarks at a 2021 Web Summit talk on this trend of companies defining privacy for themselves: we risk losing not only our privacy but also our ability to define privacy for ourselves. If we want privacy to be defined by the people, for the good of the people, we should start by establishing shared legal standards for privacy. Otherwise, we continue in a vicious cycle: companies defining privacy on their terms, users trusting those companies with their data, reputational harm upon disclosure of unfair data practices, and users seeking out a new privacy product to define and protect privacy for them once again.
When you define privacy for someone else, you’re inviting them to trust you
What are our methods for collectively building trustworthy technology? In tandem with U.S. federal legislation, companies should reflect on whether they are actually being transparent with users about their data practices. I’ll highlight two opportunities for growth: open-source development and the responsibility that comes with being a privacy-focused brand.
One way to make transparency more than a statement is by supporting the development of open-source software: software projects with source code that anyone can review, collaborate on, and distribute. Open-source software is key to developing fair and privacy-respecting technologies for several reasons. Any developer can inspect the actual code defining data flows and privacy standards. Developers can also collaborate openly to workshop and refine those standards. With the mindset of “Trust, but verify,” people outside a software company don’t have to take that company’s word that it’s following the rules; they can see for themselves.
Open-source software is not a cure-all for transparency issues. As its CEO notes, DuckDuckGo’s browser app is open-source even though the company as a whole remains partly closed-source. Still, open-source development offers a promising route toward technologies that can be independently audited and improved by the community. Crucially, these reviews typically come from organizations and people who do not stand to profit from behind-the-scenes data flows. Alongside my disappointment with the audit findings, I am grateful to the auditor for bringing these practices to light. Ideally, the audit never would have been necessary, but it needed to be done.
In addition to implementing approaches like open-source software development, companies must consider their own communications with respect to the current privacy vacuum in the U.S. Developing tools or products with any mention of “privacy” means the stakes are high. Calling yourself a privacy champion is an invitation for users to deeply trust what you offer. Users develop high standards, and when those standards are broken, the backlash is visceral, as I’ve observed in the hours since the disclosure. For those of us who are building privacy technologies—whether your customers are people or businesses—we need to recognize the responsibility that comes with branding ourselves “privacy” companies. People are trusting you to do what you say and—importantly—to not do what you don’t say. Tucking data practices into legalese does not protect you from real reputational damage.
If you are working in privacy, you should be able to recognize the benefits of shared baselines for privacy. If you can’t, I would respectfully ask, out of genuine curiosity: Why are you in your role?
Multi-pronged privacy
Regarding a fuller disclosure of the data sharing that DuckDuckGo permits, the company’s CEO said on May 24 that the company “will likely have something up by the end of the day.” The browser app now carries updated language in app stores, where it specifies that it blocks “most trackers.” The CEO has insisted that the company never promised online anonymity to users. And I can respect that technicality. But the billboards, the airport posters, the radio ads—they depict DuckDuckGo as a privacy savior. Whether or not that depiction is accurate, companies working in privacy should own up to their responsibility in communicating about privacy when federal baselines are so lacking. Ask yourself: based on what a typical user encounters when they use your product, will they be caught off guard by any of your data practices? This is a difficult question to answer, but avoiding it could lead to a backlash like the one DuckDuckGo has faced.
To see the power of federal privacy legislation, just look at the Federal Trade Commission’s action last week to enforce the Children’s Online Privacy Protection Act in the context of ed tech. Enacted laws do not eliminate all future wrongdoing, but they give companies and consumers alike a clear set of expectations.
Until federal privacy legislation is enacted—and not just a bill that’s been watered down by Big Tech lobbyists—I want to take a page from DuckDuckGo’s CEO. He describes DuckDuckGo as “multi-pronged privacy protection,” going on to list the product’s privacy features. I’d take a different approach to that idea: users and businesses alike stand to benefit from multi-pronged privacy in which the prongs are not product features but reflections of privacy as a complex and necessary social issue.
Privacy challenges are technical, legislative, and cultural. Our privacy solutions need to be similarly interdisciplinary. This can look like developing open-source privacy tools, fostering norms where independent audits of technology are healthy and welcome, and enacting comprehensive federal privacy legislation in the U.S.