Google search harming minorities

Google’s search algorithm has inflicted serious harm on minority groups. The search engine has been found to feature the autofill query “are Jews evil” and to tag African-Americans as “gorillas” in its images section. Google search has also been criticized for promoting Islamophobia by suggesting offensive and dangerous queries through its autofill function. For example, if a user typed “does Islam” into the Google search bar, the algorithm’s first autofill suggestion was “does Islam permit terrorism” (Abdelaziz 2017). This has troubling effects in the real world, given that research has demonstrated a clear correlation between anti-Muslim searches and anti-Muslim hate crimes (Soltas and Stephens-Davidowitz 2015).

Google has announced that it will continue to remove hateful and offensive content and to tweak its algorithm in order to rid itself of the problem permanently, but experts are not optimistic that this is entirely possible. With millions of new pages coming online every day, Google faces the Sisyphean task of content moderation. Heidi Beirich, a project director at the Southern Poverty Law Center, argues that “Google’s algorithm is seriously flawed and it’s a scary thing because millions of people around the world are using it. It’s a fundamental problem with how search works.” She points to the story of white supremacist Dylann Roof, who went “from being someone who was not raised in a racist home to someone so steeped in white supremacist propaganda that he murdered nine African-Americans during a Bible study” (Abdelaziz 2017). Roof found such propaganda by using Google and clicking on websites from the first page of search results. The Southern Poverty Law Center has communicated these problems to Google, but meaningful action remains unseen.