Facebook and Google’s personalized content algorithms create social echo chambers

As internet information platforms like Google and Facebook continue to grow in prominence and ubiquity, they have begun to replace traditional media intermediaries. Bozdag (2013) argues that this transition has made Google and Facebook ‘the gatekeepers of our society’. However, because Google and Facebook (among other online content intermediaries) disseminate information according to personalized content strategies, they have come to produce ‘echo chambers’ and ‘filter bubbles’ (ibid). As informational gatekeepers, Google and Facebook pose considerable risks to the health of democracy and to the fabric of online society.

A filter bubble is defined more formally as a “personal ecosystem of information that’s been catered by these algorithms” (ibid). An Internet user’s profile is constructed from their past browsing and search history, as they indicate interest in topics by “clicking links, viewing friends, putting movies in [their] queue, reading news stories” (ibid). An Internet firm aggregates this information to target advertising to the user or to make certain types of information appear more prominently in search results pages. Since 2004, Google’s search engine has used Google Personalized Search, which is based on browser cookie records. The algorithm draws on more than 50 variables (‘signals’), including search history and location; these variables and their differing weights personalize and adapt the results of a search to each user (Portent Team 2014).
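To make the signal-weighting idea concrete, the sketch below shows how a ranker that combines query relevance with weighted personalization signals can reorder the same results differently for different users. It is a minimal illustration only: the signal names, weights, and scoring formula are invented for this example and are not Google’s actual Personalized Search signals.

```python
# Toy sketch of signal-weighted personalized ranking.
# The signals, weights, and scoring formula are invented for illustration;
# they are NOT Google's actual Personalized Search algorithm.

from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Interests inferred from past clicks/searches, e.g. {"politics": 0.9}
    topic_affinity: dict = field(default_factory=dict)
    location: str = "US"

@dataclass
class Result:
    url: str
    topic: str
    base_relevance: float   # query-document relevance, before personalization
    region: str = "US"

# Hypothetical weights for each personalization signal.
SIGNAL_WEIGHTS = {
    "topic_affinity": 0.6,   # how strongly past interest boosts a result
    "location_match": 0.2,   # boost results local to the user
}

def personalized_score(user: UserProfile, result: Result) -> float:
    """Combine base relevance with weighted personalization signals."""
    score = result.base_relevance
    score += SIGNAL_WEIGHTS["topic_affinity"] * user.topic_affinity.get(result.topic, 0.0)
    score += SIGNAL_WEIGHTS["location_match"] * (1.0 if result.region == user.location else 0.0)
    return score

def rank(user: UserProfile, results: list[Result]) -> list[Result]:
    """Two users issuing the same query can see different orderings."""
    return sorted(results, key=lambda r: personalized_score(user, r), reverse=True)

if __name__ == "__main__":
    results = [
        Result("news-a.example/story", topic="politics", base_relevance=0.50),
        Result("sports-b.example/story", topic="sports", base_relevance=0.55),
    ]
    politics_fan = UserProfile(topic_affinity={"politics": 0.9})
    sports_fan = UserProfile(topic_affinity={"sports": 0.9})
    print([r.url for r in rank(politics_fan, results)])  # politics story first
    print([r.url for r in rank(sports_fan, results)])    # sports story first
```

Because the personalization boost can outweigh small differences in base relevance, two users entering the same query see differently ordered results, which is the mechanism by which a filter bubble forms.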

The danger of filter bubbles is that they are more or less invisible; many people are completely unaware that they even exist. The Guardian reported that “more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed” (Hern 2017). Filter bubbles and echo chambers also reinforce confirmation bias, presenting users with information they have previously signalled agreement with. The architecture of a personalized-content system actively dissuades and prevents users from encountering information that they might find disagreeable or unpleasant. Facebook’s News Feed algorithm has been found to reduce politically cross-cutting content by 5 percent for conservatives and 8 percent for liberals (Bleiberg and Darrel 2017). This has dramatic repercussions for the health of social movements and societal dialogue, as well as for the speed at which political radicalization can take place.
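The toy simulation below suggests how an engagement-driven feed could end up suppressing cross-cutting content without any explicit intent to do so. The stance labels, engagement model, and numbers are invented for illustration; this is not Facebook’s News Feed algorithm, only a sketch of the general mechanism.

```python
# Toy illustration: ranking a feed by predicted engagement, under the
# assumption that users engage more with stance-congruent posts, crowds
# cross-cutting content out of the visible feed.

import random

random.seed(0)

def simulate_feed(user_stance: str, posts: list[dict], feed_size: int = 10) -> list[dict]:
    """Rank posts by (hypothetical) predicted engagement, then truncate."""
    def predicted_engagement(post: dict) -> float:
        congruent = post["stance"] == user_stance
        # Assumed behaviour: congruent posts earn higher engagement on average.
        return (0.8 if congruent else 0.4) + random.uniform(0.0, 0.1)
    return sorted(posts, key=predicted_engagement, reverse=True)[:feed_size]

# 100 candidate posts, roughly half from each side of a political divide.
posts = [{"id": i, "stance": random.choice(["left", "right"])} for i in range(100)]
feed = simulate_feed("left", posts)

cross_cutting = sum(1 for p in feed if p["stance"] != "left")
print(f"Cross-cutting posts shown: {cross_cutting} / {len(feed)}")
# A purely chronological feed of the same posts would show roughly half
# cross-cutting content; the engagement-ranked feed shows far less.
```

The point of the sketch is that no rule says “hide opposing views”; optimizing for predicted engagement alone is enough to narrow what users see.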

Sometimes the filter bubble can also embed biased or discriminatory assumptions, derived from the reproduction of patterns associated with specific social groups or profiles. A well-known example is the role of the Google search engine in the 2012 US presidential election: analysis reported by The Wall Street Journal suggested that Google’s search results favored Barack Obama, with searches related to Obama being more heavily customized than the corresponding searches for Romney (Boutin 2011).