YouTube's Content Recommendation Algorithm Promotes Political Radicalization

Content recommendation algorithms, acting as information promoters and gatekeepers online, play an increasingly important role in shaping today's society, politics, and culture. Given that YouTube is estimated to draw the second most web traffic after Google and that 70% of the videos YouTube users watch are suggested by its algorithm, it is safe to say that YouTube's recommendation system commands much of the world's attention (Hao 2019).

From this position of power, the algorithm has drawn serious criticism, with critics asserting that the recommendation system leads users down rabbit holes of content and systematically exposes them to ever more extreme material (Roose 2019). Because YouTube's algorithm is built to engage users and keep them on the platform, it often suggests content similar to what users have already expressed interest in. The unintended but problematic consequence of this feedback loop, sketched in the toy example below, is "that users consistently migrate from milder to more extreme content" on the platform (Ribeiro et al. 2019). In part because of its content recommendation algorithm, YouTube has become, in the words of Data & Society researcher Rebecca Lewis, the "single most important hub by which an extensive network of far-right influencers profit from broadcasting propaganda to young viewers" (Lewis 2018). By rewarding radical content, YouTube's algorithm has given political extremists fertile soil in which to plant their ideas. The problems facing YouTube's content recommendation system echo larger questions about the state of the internet and democracy: when profit-driven websites control an outsized share of the world's attention and information, how can democracy survive?
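
To make the feedback loop concrete, consider the following toy sketch in Python. It models a hypothetical recommender that always suggests the most engaging video among those similar to what the user just watched. The catalog, the numeric "extremity" scores, and the assumption that more extreme videos hold attention slightly longer are all illustrative inventions, not a description of YouTube's actual system.

    # Toy model of an engagement-driven feedback loop. Every name and number
    # here is hypothetical; this illustrates the dynamic described above,
    # not YouTube's actual recommender.

    def expected_watch_time(extremity):
        """Assumed engagement model: more extreme videos hold attention longer."""
        return 1.0 + 0.005 * extremity

    def recommend(last_watched, catalog, window=10):
        """Among videos similar to the last one watched, pick the most engaging."""
        nearby = [v for v in catalog if abs(v - last_watched) <= window]
        return max(nearby, key=expected_watch_time)

    catalog = list(range(101))   # videos indexed by an "extremity" score, 0-100
    history = [5]                # the user starts with a mild video

    for _ in range(10):
        history.append(recommend(history[-1], catalog))

    print(" -> ".join(str(v) for v in history))
    # 5 -> 15 -> 25 -> ... -> 95 -> 100: each suggestion is only slightly more
    # engaging than the last, yet the sequence drifts steadily toward the most
    # extreme content in the catalog.

Under these admittedly simplified assumptions, no single recommendation looks radical on its own; the migration toward extreme content emerges from repeatedly optimizing for engagement one step at a time.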