In 2015, the Oakland Police Department (OPD) implemented an algorithm designed by the American firm PredPol to predict future drug crimes. Like other predictive policing software, the system predicts the geographic areas where crime is most likely to occur based on crime reports from historical data sets. The Human Rights Data Analysis Group (HRDAG) found that the algorithm was biased against neighborhoods inhabited primarily by low-income people and minorities (Lum 2016). They attribute this bias to the fact that most drug crimes had previously been registered in these neighborhoods, so the algorithm directed police officers to already over-policed communities. Because of this, HRDAG argues that the algorithm failed: it did not unlock data-driven insights into drug use previously unknown to police; rather, it reinforced established inequalities surrounding drug policing in low-income and minority neighborhoods (ibid.).
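The feedback loop HRDAG describes can be illustrated with a minimal simulation. This sketch is purely hypothetical (it is not PredPol's or HRDAG's actual model): two neighborhoods, A and B, have identical true rates of drug activity, but A begins with more recorded incidents because it was historically over-policed. Patrols are allocated in proportion to past reports, and new reports are generated only where officers are present, so the initial disparity is perpetuated rather than corrected.

```python
import random

random.seed(0)

TRUE_RATE = 0.3               # identical true rate of drug activity in both areas
reports = {"A": 80, "B": 20}  # biased historical record: A was over-policed
PATROLS_PER_DAY = 10

for day in range(365):
    total = sum(reports.values())
    # Predictive step: allocate patrols in proportion to historical reports.
    patrols_a = round(PATROLS_PER_DAY * reports["A"] / total)
    patrols_b = PATROLS_PER_DAY - patrols_a
    # Observation step: crime is recorded only where patrols are present,
    # even though the underlying rate is the same in both neighborhoods.
    reports["A"] += sum(random.random() < TRUE_RATE for _ in range(patrols_a))
    reports["B"] += sum(random.random() < TRUE_RATE for _ in range(patrols_b))

share_a = reports["A"] / sum(reports.values())
print(f"Share of reports attributed to A after one year: {share_a:.2f}")
```

Although both neighborhoods have the same true rate, neighborhood A's share of recorded crime remains near its inflated starting point: the system never gathers the observations from B that would reveal the bias in its training data.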