On the Internet, it is becoming nearly impossible to avoid personalization wherever you go. Everything you do online – especially the specific terms you search for – can be compiled, analyzed and fed into a sophisticated algorithm so that you only see the things you are most likely to click on. More disturbingly, even when you are logged out of a website or app, you may still be tracked and monitored so that companies can deliver personalized content.
To highlight this point, the privacy-focused search engine DuckDuckGo recently conducted a study of Google search results to see just how much personalization its users actually experience, and to check whether simply logging out of Google and browsing in “Incognito” mode would yield different results. In theory, the personalized results you see while logged into Google should differ sharply from the results you see while logged out, right? After all, the whole point of private browsing in “Incognito” mode is to visit websites anonymously. But that is not what DuckDuckGo found. Instead, Google users remained trapped within their own “filter bubble,” whether they were logged in or not.
The filter bubble concept
The concept of the filter bubble has been around for nearly a decade. Back in 2010, Internet activist Eli Pariser coined the term “filter bubble” to describe the way websites like Google and social networks like Facebook deliver results to users based on factors like search history. These sites have a built-in incentive to show you what they think you want to see: personalized content you will like, with the things you won’t like quietly hidden. The problem is that this creates a filter bubble – a form of intellectual isolation in which you believe you are getting the big-picture view, when in fact you are only seeing views similar to your own because the Internet is hiding certain content from you.
Pariser stumbled across the filter bubble concept when he noticed that he and his friends could type the exact same search term into Google and get different results based on their personal information. It went further than that: the personalized results seemed to reflect the political biases and orientations of each user, almost as if Google knew everything about them and was showing them what they wanted to hear. In the political realm this can have dangerous consequences, because it can harden opinions around certain topics rather than opening the floor to new discourse. In the 2016 presidential election, this dynamic seemed to play out just as Pariser had described five years earlier in his famous filter bubble TED Talk: hard-core supporters of Donald Trump saw only pro-Trump and anti-Clinton content, while hard-core supporters of Hillary Clinton saw only pro-Clinton and anti-Trump content, whether they searched Google, browsed news sites, or visited social media networks.
DuckDuckGo revisits the filter bubble controversy
What DuckDuckGo sought to show in its study of search results is that nothing has really changed since 2011: no matter how hard you try, you cannot escape the Google filter bubble. DuckDuckGo even created a standalone website, “Don’t Bubble Us,” to showcase exactly how this plays out in practice, the factors that can lead to filter bubble creation, and why Google users might want to reconsider which search engine they use if they want to avoid personalized search results.
The design of the study was relatively simple: DuckDuckGo asked 87 people located around the United States to carry out the same searches at the same time, for terms like “gun control,” “immigration” and “vaccination.” The volunteers first conducted logged-out searches, then immediately logged in and repeated the same search. Of the 87 volunteers, 76 used desktops and 11 used mobile devices. The idea was to control for as many variables as possible while showing just how personalized the results were. The search for “gun control,” for example, returned results drawn from 19 different domains, arranged in 31 distinct configurations. Depending on your past Google search history, you would see a different configuration of results.
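The per-query metric described above – how many distinct domains show up across participants, and in how many distinct configurations – can be sketched with a few lines of Python. The data and the `personalization_summary` function here are illustrative assumptions, not actual data or code from the DuckDuckGo study:

```python
from itertools import chain

# Hypothetical sample: each participant's top results for one query,
# represented as an ordered list of domains (illustrative names only).
results_by_participant = [
    ["nytimes.com", "wikipedia.org", "cnn.com"],
    ["wikipedia.org", "nytimes.com", "cnn.com"],
    ["nytimes.com", "wikipedia.org", "cnn.com"],
    ["foxnews.com", "nytimes.com", "wikipedia.org"],
]

def personalization_summary(result_sets):
    """Count distinct domains across all participants, and distinct
    orderings (one per unique configuration of results)."""
    distinct_domains = set(chain.from_iterable(result_sets))
    distinct_orderings = {tuple(r) for r in result_sets}
    return len(distinct_domains), len(distinct_orderings)

domains, orderings = personalization_summary(results_by_participant)
print(f"{domains} domains in {orderings} distinct configurations")
# → 4 domains in 3 distinct configurations
```

With no personalization at all, every participant would see the same configuration, so the count of distinct orderings would be 1; higher counts suggest results are varying per user.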
The real takeaway, though, is not that the results were personalized for people who were logged in; it is that the results also appeared to be personalized for people who were logged out. The conclusion, according to DuckDuckGo, is that Incognito mode does not mean anonymity: Google still knows who you are, and adjusts its search results accordingly.
Different interpretations of the same DuckDuckGo study
As might be expected, Google hotly contested the results of the study. Google called the findings “flawed” because they rested on the overarching assumption that any difference in search results must be the result of personalization. A number of other factors, Google argued, could have influenced the results, including time, location, and a catchall it calls “contextualization” (which looks at signals such as IP addresses). Top stories, for example, would be expected to change frequently, simply because of the way news works. DuckDuckGo countered that it had attempted to control for as many of these variables as possible, including both time and location.
Of course, the fact that DuckDuckGo is a direct competitor of Google also needs to be taken into consideration. Doesn’t DuckDuckGo have every incentive in the world to show people that Google can’t be trusted, and that Google’s personalization follows you everywhere? Google is currently being battered on several fronts by privacy advocates, making this a remarkably opportune moment for DuckDuckGo to release its results – and even launch a website to publicize them.
A new era of privacy-first products and services
One thing is certain: the DuckDuckGo study on the Google filter bubble is surely going to generate more buzz and momentum around “privacy-first” products and services. While some amount of personalized content on the web seems to be inevitable, people are finally waking up to the fact that they have more power than they think to take their data privacy into their own hands. Small steps – such as abandoning one search engine for another, based on privacy concerns alone – hint at a future in which data privacy becomes a source of competitive advantage for companies that build privacy into their products and services from the very beginning.