LinkedIn’s search bar has been criticized for discriminating against women. The Seattle Times (Day 2016) reported that a search for a female contact on the website often yielded empty results or a prompt asking whether the searcher meant a similar-looking man’s name. At least a dozen of the most common female names in the U.S., paired with placeholder last names, triggered LinkedIn’s suggestion to replace them with male names: a search for “Andrea Jones” prompted “Andrew Jones,” “Danielle” prompted “Daniel,” “Michaela” prompted “Michael,” and “Alexa” prompted “Alex.” While searches for female names suggested male counterparts, searches for the 100 most common male names in the U.S. brought up no prompts asking whether users meant female names (ibid.).
LinkedIn commented that its suggested results are generated automatically by an analysis of the tendencies of past searchers (ibid.). Kieran Snyder, chief executive of Textio, a company that builds software to combat algorithmic bias, explains: “It really comes down to, are you putting the correct training data into the system. A broader set of people (working on the software) would have figured out how to get a broader set of data in the first place” (ibid.).
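The Seattle Times account does not describe LinkedIn’s actual algorithm, but the mechanism LinkedIn cites — suggestions derived from the tendencies of past searchers — can be sketched with a minimal frequency-based suggester. The sketch below is purely illustrative: all names, the prefix-matching rule, and the frequency threshold are assumptions, not LinkedIn’s implementation. The point it demonstrates is that logic containing no reference to gender still reproduces the skew of its training log.

```python
from collections import Counter

def build_suggester(past_queries):
    """Return a 'did you mean' function trained on a log of past searches.

    A query that is rare in the log triggers a suggestion: the most
    frequent logged query sharing the same short prefix. The rule is
    gender-blind, but a skewed log yields skewed prompts.
    """
    counts = Counter(past_queries)

    def suggest(query, min_count=3):
        # Common queries get no prompt at all.
        if counts[query] >= min_count:
            return None
        # Rare query: propose the most frequent query with the same 4-char prefix.
        prefix = query[:4].lower()
        candidates = [q for q in counts
                      if q.lower().startswith(prefix) and q != query]
        if not candidates:
            return None
        best = max(candidates, key=lambda q: counts[q])
        return best if counts[best] >= min_count else None

    return suggest

# A hypothetical log dominated by searches for male names.
log = (["Andrew Jones"] * 40 + ["Andrea Jones"] * 2
       + ["Daniel Lee"] * 30 + ["Danielle Lee"])
suggest = build_suggester(log)
print(suggest("Andrea Jones"))  # "Andrew Jones" — the log's skew becomes a prompt
print(suggest("Andrew Jones"))  # None — the majority query is never questioned
```

Snyder’s point about training data maps directly onto `log`: with a more balanced log, “Andrea Jones” would clear the frequency threshold and no replacement would be suggested.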