Automated Tenant Background Checks Incorrectly Deny Housing

An increase in rentership across the United States, coupled with the rise of a data-based economy, has created a new industry of algorithmically processed tenant background checks. Approximately 9 out of 10 landlords across the country buy cheap, fast background checks from screening companies like RealPage, CoreLogic, TransUnion, and RentGrow, yet those products can be plagued with inaccuracy (Kirchner and Goldstein 2020). The consequence of that inaccuracy is the wrongful denial of housing to tenants around the US. When journalists Lauren Kirchner and Matthew Goldstein reviewed hundreds of federal lawsuits filed against screening companies over the past 10 years, they found that an industry built on cheap, fast results often hastily misidentifies tenants as criminals and sex offenders, erroneously denying them any chance at securing a place to live (ibid).

Due to careless and faulty screening algorithms, Samantha Johnson, a 31-year-old Oregon resident with two traffic infractions on her record, has been turned down for rentals countless times (ibid). In 2018, a landlord received a background check alleging that Johnson had committed “Burglary and domestic assault in Minnesota. Selling meth and jumping bail in Kentucky. Driving without insurance in Arkansas. Disorderly conduct. Theft. Lying to a police officer. Unspecified ‘crimes.’ Too many narcotics charges to count” (ibid). None of the charges was actually hers; rather, they belonged to other Samantha Johnsons (and to people with names closely matching “Samantha Johnson”) around the country. Casting a wide net, the automated screening tool aggregated information from a variety of criminal databases and miscategorized the charges under a single identity.
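To see how this kind of misattribution can arise, the sketch below simulates a screening pipeline that merges records from several databases using loose, name-only matching. The records, similarity threshold, and function names are hypothetical illustrations rather than details from the reporting; the point is only that requiring an exact identifier such as date of birth alongside the name keeps other people's charges off the report.

```python
from dataclasses import dataclass
from difflib import SequenceMatcher

@dataclass
class CriminalRecord:
    name: str            # name as it appears in the source database
    state: str           # jurisdiction of the charge
    date_of_birth: str   # often ignored by bulk name-matching tools
    charge: str

# Hypothetical records pulled from several state databases.
# Only the first belongs to the applicant; the rest belong to other
# people who share, or nearly share, her name.
records = [
    CriminalRecord("Samantha Johnson", "OR", "1988-04-02", "Traffic infraction"),
    CriminalRecord("Samantha Johnson", "MN", "1979-11-15", "Burglary"),
    CriminalRecord("Samantha Jonson",  "KY", "1991-06-30", "Bail jumping"),
    CriminalRecord("Samantha Johnson", "AR", "1985-01-22", "Driving without insurance"),
]

def loose_match(applicant_name: str, record: CriminalRecord) -> bool:
    """Name-only fuzzy matching: attribute any record whose name is
    'close enough' to the applicant's, regardless of other identifiers."""
    similarity = SequenceMatcher(
        None, applicant_name.lower(), record.name.lower()
    ).ratio()
    return similarity >= 0.9

def strict_match(applicant_name: str, applicant_dob: str,
                 record: CriminalRecord) -> bool:
    """Require an exact name AND date-of-birth match before attributing
    a record to the applicant."""
    return record.name == applicant_name and record.date_of_birth == applicant_dob

applicant, dob = "Samantha Johnson", "1988-04-02"

loose_report = [r.charge for r in records if loose_match(applicant, r)]
strict_report = [r.charge for r in records if strict_match(applicant, dob, r)]

print("Loose matching:", loose_report)    # four charges, three of them wrong
print("Strict matching:", strict_report)  # only the applicant's own record
```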

Faulty automated background checks underscore how algorithmic processing can be a cost-efficient, scalable solution for companies while imposing enormous costs on innocent people when it errs. Even an error rate of 1%, like the rate many screening companies report for their algorithms, translates at a scale of millions of screenings into hundreds of thousands of people affected (ibid). Rigorous internal auditing must take place before algorithms are given free rein over critical resources like housing.
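The back-of-the-envelope arithmetic behind that claim can be made concrete; the annual screening volume below is an assumption chosen for illustration, not a figure from the reporting.

```python
# How a "small" error rate scales with volume.
error_rate = 0.01                  # 1% of reports contain a serious error
screenings_per_year = 25_000_000   # assumed annual tenant screenings (illustrative)

people_affected = error_rate * screenings_per_year
print(f"{people_affected:,.0f} erroneous reports per year")  # 250,000
```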