Facebook’s algorithms perpetuate existing inequalities in the housing market

The United States Department of Housing and Urban Development (HUD) has sued Facebook over discrimination (Tobin, n.d.). According to HUD, Facebook has violated the Fair Housing Act, which bans discrimination on the basis of protected characteristics, including race. HUD’s lawsuit accuses Facebook of enabling advertisers to filter out people on the basis of their gender, race, and other characteristics, and of deploying algorithms that discriminate against people based on those characteristics. The charge came two years after ProPublica reported on the issue (Angwin and Parris 2016). Shortly before HUD’s complaint was lodged, fair housing groups also sued Facebook for the same reasons (Angwin and Tobin 2018). The United States Attorney for the Southern District of New York filed a statement of interest supporting the National Fair Housing Alliance (ibid.).

This case raises not only privacy concerns but also unease about algorithmic discrimination based on race and other protected attributes. HUD stated the following in its charge against Facebook:

Even if an advertiser tries to target an audience that broadly spans protected class groups, Respondent’s ad delivery system will not show the ad to a diverse audience if the system considers users with particular characteristics most likely to engage with the ad. If the advertiser tries to avoid this problem by specifically targeting an unrepresented group, the ad delivery system will still not deliver the ad to those users, and it may not deliver the ad at all. This is so because Respondent structured its ad delivery system such that it generally will not deliver an ad to users whom the system determines are unlikely to engage with the ad, even if the advertiser explicitly wants to reach those users regardless. (HUD v. Facebook, [2019])