Eli Pariser made a big splash in a 2010 TED talk where he discussed a phenomenon he called the “filter bubble.” According to Pariser, modern technology – and especially Google search – was putting us at risk. The threat, of course, was the worst one of all: ourselves. Thanks to the personalization of search results, it’s likely (Pariser argues) that we will only get results that agree with what we already believe.
This aligns with a psychological tendency known as “confirmation bias,” whereby we tend to ignore things we don’t agree with and dig into things we do. To see whether confirmation bias plays a role in search engine results, and to gauge just how strong the Google filter bubble really is, columnist Alexander Zwissler conducted an experiment with a group of friends.
The experiment was simple: He chose a group of friends who were very similar in most measurable demographics, with one notable exception. Each person chosen held a very different political view, ranging (as Zwissler expressed it) “from progressive left to conservative right and all points in between.” Zwissler theorized that, thanks to the filter bubble, searches on the selected terms (“global warming,” “Oakland,” and “mountain biking”) would yield very different results reflecting each searcher’s political slant.
To Zwissler’s surprise, the search engine results pages looked nearly identical; the variations, it turns out, were based more on the location of the individuals than on any deeply held political bent. The order changed slightly, he indicated, but the results were basically the same. The most substantial differences actually appeared for the term “mountain biking,” where location played a far greater role. So how strong is Google’s filter bubble, really? Apparently, not nearly as strong as some initially believed.
[Sources include: Alexander Zwissler]