Gender bias in Google Translate

In an effort to combat gender bias in translations, Google Translate now shows gender-specific translations for many languages (Lee 2018). Because the algorithm learns from hundreds of millions of translated texts across the web, it inadvertently replicates the gender biases prevalent in society. As a result, words like “strong” or “doctor” are often translated as masculine, while words like “nurse” or “beautiful” are translated as feminine (ibid). Since the training data itself is biased, the algorithm’s gender bias problem is longstanding, and it remains to be seen whether the new feature can resolve it entirely.