Sexist predictive searches in Google Search

In 2014, Emily McManus discovered what she thought was a sexist bug. After typing “English major who taught herself calculus” into the Google search bar, she received the suggestion “Did you mean: English major who taught himself calculus?” (Sinha 2014). Outraged, McManus tweeted a screenshot of the result. Fellow Twitter users explained that the search engine’s algorithm had not malfunctioned but had been programmed to suggest that phrasing because it was a more frequently searched item (ibid.). Google’s search engine offers autocomplete and alternative suggestions for queries that resemble popularly searched ones. According to the company, the phrase “taught himself calculus” was suggested because it appears more often online than “taught herself calculus”, which is why Google’s algorithm assumed it was the intended query (ibid.). In other words, a longstanding structural bias in society was replicated through the search engine.
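The mechanism at work here can be illustrated with a toy sketch: a suggester that knows nothing about gender and simply proposes a near-identical query if that query has been searched more often. All names, counts, and the similarity heuristic below are hypothetical illustrations, not Google’s actual system.

```python
from typing import Optional

# Hypothetical corpus of query frequencies. The algorithm never sees
# "himself" vs "herself" as gendered words, only as strings with counts.
query_counts = {
    "english major who taught himself calculus": 5200,
    "english major who taught herself calculus": 340,
}

def _one_word_apart(a: str, b: str) -> bool:
    """Toy similarity test: same length, exactly one word differs."""
    wa, wb = a.split(), b.split()
    if len(wa) != len(wb):
        return False
    return sum(x != y for x, y in zip(wa, wb)) == 1

def suggest_alternative(query: str) -> Optional[str]:
    """Return a similar but more frequent query, or None if the input wins."""
    best = query
    for candidate, count in query_counts.items():
        if _one_word_apart(query, candidate) and count > query_counts.get(best, 0):
            best = candidate
    return best if best != query else None

print(suggest_alternative("english major who taught herself calculus"))
# → english major who taught himself calculus
```

The code reproduces the bias not through any explicit rule about gender but purely through the skewed frequencies it was given, which is exactly the dynamic the episode above describes.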
