Uruguay’s Ministry of the Interior Invests in Predictive Policing

Near the end of 2013, Uruguay’s Ministry of the Interior acquired a license for PredPol, a popular predictive policing software package. PredPol is a proprietary algorithm used by police forces around the world, including the Oakland Police Department in California and Kent Police in England. Trained on historical crime data sets, the software relies on a machine learning algorithm that analyzes three variables (crime type, location, and date/time) to predict crime ‘hot spots’ (Ortiz Freuler and Iglesias 2018). From this analysis, PredPol generates custom maps that direct police attention to 150-by-150-meter ‘hot spots’ where crime is statistically likely to occur.
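PredPol’s actual model is proprietary (it is reportedly based on a self-exciting point process fitted to those three variables), so the sketch below is only a deliberately simplified illustration of the general approach: bin reported incidents into 150-meter grid cells and flag the cells with the most recent reports as ‘hot spots’. All function names and data here are invented for illustration.

```python
from collections import Counter
from datetime import datetime, timedelta

CELL_M = 150  # grid cell size in meters (PredPol reportedly uses ~150 m boxes)

def cell_of(x_m, y_m):
    """Map a point (in meters on a local grid) to its grid-cell index."""
    return (int(x_m // CELL_M), int(y_m // CELL_M))

def predict_hot_spots(incidents, now, window_days=180, top_k=3):
    """Rank grid cells by counts of recently *reported* incidents.

    incidents: iterable of (crime_type, x_m, y_m, timestamp) tuples.
    Returns the top_k (cell, count) pairs within the time window.
    """
    cutoff = now - timedelta(days=window_days)
    counts = Counter(
        cell_of(x, y) for _ctype, x, y, ts in incidents if ts >= cutoff
    )
    return counts.most_common(top_k)

# Hypothetical reported incidents: three cluster in one cell.
now = datetime(2013, 12, 1)
incidents = [
    ("burglary", 120, 90, now - timedelta(days=2)),
    ("burglary", 140, 60, now - timedelta(days=10)),
    ("theft", 130, 75, now - timedelta(days=30)),
    ("theft", 900, 900, now - timedelta(days=5)),
]
print(predict_hot_spots(incidents, now))  # [((0, 0), 3), ((6, 6), 1)]
```

Note that a model like this never sees crime itself, only reported crime, a distinction that matters for the criticisms that follow.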

PredPol—and the field of predictive policing in general—has been criticized for its opacity and capacity for discrimination.

First, PredPol’s deployment in Uruguay was trained on crime data sets that the Ministry of the Interior has classified, so citizens cannot inspect the algorithmic inputs even if they want to audit the system. The system also operates as a proprietary black box, meaning that its algorithm is protected intellectual property and that its outputs cannot be explained through causal experimentation, even if its inputs were public data (ibid).

Second, PredPol poses a risk to public safety and justice through its susceptibility to discrimination by class and race, reproducing entrenched structures of inequality. These biases can be attributed both to the quality of the input data and to the design of the system. While PredPol learns from historical crime data sets, it is important to note that such data sets only include reported crimes, and that some crimes are disproportionately reported more than others (for example, nuisance/petty crime is over-reported compared to white collar/financial crime) (ibid). The input data can thus be considered biased, making the entire system susceptible to producing unintended discriminatory effects. The system was also designed to evaluate location, a variable that is not legally considered a suspect class but that correlates directly with income and race. Compounding this design flaw, a 2016 analysis by the Human Rights Data Analysis Group demonstrated that greater police presence in an area increases the likelihood that crime will be reported in that zone (ibid). In this way, by directing police attention toward locations that have historically seen petty crime, PredPol installs a feedback loop (a process in which a system’s outputs are reused as inputs) that over-polices vulnerable communities, ultimately reifying inequality along the lines of class and race.
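To make the feedback-loop mechanism concrete, the toy simulation below (not drawn from the HRDAG analysis; all rates are invented) models two districts with identical true crime rates. Whether a crime gets reported depends partly on patrol presence, and the patrol is always sent wherever the accumulated reports are highest, so the ‘data’ diverge sharply even though the underlying crime does not.

```python
import random

random.seed(0)

TRUE_RATE = [10, 10]   # identical underlying daily crime in both districts
BASE_DETECT = 0.3      # chance a crime is reported with no extra patrols
PATROL_BOOST = 0.4     # extra detection probability where police are sent

reported = [0, 0]      # cumulative reported crime, fed back as "data"
for day in range(50):
    # Send the patrol to whichever district the data say is "hotter".
    target = 0 if reported[0] >= reported[1] else 1
    for d in (0, 1):
        p = BASE_DETECT + (PATROL_BOOST if d == target else 0.0)
        reported[d] += sum(random.random() < p for _ in range(TRUE_RATE[d]))

# District 0 ends up with roughly twice the reports of district 1,
# despite identical true crime rates.
print(reported)
```

Because these skewed reports would then be fed back in as training data, each iteration makes the initial imbalance look more like ground truth, which is the runaway dynamic the HRDAG researchers warned about.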