The Metropolitan Police of London has implemented an automated system to assess the potential harm that gang members pose to public safety. Since 2012, the initiative has used algorithmic processing to collect, identify, and exchange data about individuals linked to or belonging to gangs (Amnesty International 2018). The system uses the gathered information to calculate two scores: 1) the probability that an individual will join a gang and 2) their level of violence.
However, the Metropolitan Police has not disclosed the specific metrics and criteria used to assign these “harm scores,” which raises questions about transparency. Amnesty International has pointed out that the 2012 “Ending Gang and Youth Violence Strategy for London” offers some insight into how the scores are computed. According to it, each gang member would be scored based on the number of crimes they had committed in the past three years, “weighted according to the seriousness of the crime and how recently it was committed” (ibid).
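Since the actual formula has never been published, the strategy document's description can only be illustrated under assumptions. The sketch below is a minimal, hypothetical reading of it: each offence contributes its seriousness weight, discounted the older it is within a three-year window. The weights, decay rate, and function name are all assumptions, not the Met's method.

```python
# Hypothetical sketch of a seriousness-and-recency weighted score.
# NOT the Metropolitan Police's actual formula, which is undisclosed.

def harm_score(offences, window_years=3.0, decay=0.5):
    """Sum the seriousness weights of offences committed within the
    window, linearly discounted by how long ago they occurred.

    offences: list of (seriousness_weight, years_ago) pairs.
    decay: assumed fraction of weight lost by the end of the window.
    """
    return sum(
        weight * (1 - decay * (years_ago / window_years))
        for weight, years_ago in offences
        if years_ago <= window_years  # older offences drop out entirely
    )

# Example: a serious offence (weight 10) one year ago and a minor one
# (weight 4) two and a half years ago.
score = harm_score([(10, 1.0), (4, 2.5)])
```

Under this reading, recent serious offences dominate the score, and anything older than three years is ignored, which matches the strategy document's stated window.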
Amnesty’s report draws attention to various issues, the most important being the risk the system poses to young people from vulnerable communities and to their privacy. The system has also been criticized as highly susceptible to racial bias. For example, Dez Brown, a black man convicted of manslaughter at age 17, claims he was erroneously labeled a ‘gangster’ in court even though he did not belong to any organized gang at the time of the crime (Francis 2018). He states,
“I wasn’t in a gang. There was an affiliation in the area and I had friends, but we weren’t working as a collective and I hadn’t pledged allegiance to any group. It’s unfair to say all people involved in organized crime are a gang, lots are being exploited, some are just trying to get by. But in court, they are labeled as part of the gang and tarnished with the same brush.”