This App is not for you: a summary of the presentation event

On 5 June we presented the results of the research This App is not for you: Bias and externalities in police/community interaction tools. The Catalan case, conducted with the support of the Open Society Foundations for Europe. With this study (which you can read and download here) we wanted to understand how community policing uses technology today, and whether this use of digital applications strengthens or weakens the aims of the community policing approach.

We wanted to identify the technologies used most often (primarily mobile applications, social networks, devices such as tablets, drones and video surveillance cameras, as well as databases). The study focused on the case of community policing in Catalonia, looking at the mechanisms of adoption and the legitimisation processes behind the implementation of technology. Could these technologies change the relationship between police and community?

Methodology

To this end, we carried out 2 focus groups with middle- and high-ranking police officers, 2 focus groups with human rights and neighborhood organizations, and semi-structured interviews with representatives of both groups (police and NGOs). We also ran an online survey which, thanks to the collaboration of the 3 police officers on the project’s board and of the Catalan Association of Criminologists, reached 53 of the 230 existing local police units in Catalonia, covering both large cities and small municipalities. Finally, we conducted a digital ethnography, added during the research process, devoted to understanding the activity of parapolice groups in Barcelona: how they act on social networks and to what extent their dynamics disrupt the objectives of community policing.

In the research we wanted to know whether ICT could make the relationship between community and police more fluid and trustworthy, thus benefiting community security purposes. We started from 3 research questions:

1- Given the distance that exists between the police and the community, could technology widen that distance or bring the two actors closer together?

2- To what extent could ICT contribute to the stigmatisation of certain social groups when used as a mechanism for proximity relationships, and how might technology function in the fields of surveillance and intelligence within community policing?

3- Could technology run counter to core objectives of community policing, such as social cohesion or the exchange of information? Could it be a disruptive instrument, in the sense that the community itself uses it as a parapolice mechanism?

What we learned from the fieldwork…

  • According to the police, the police-community distance tends to decrease with technological intervention. For most of the social organizations interviewed, however, this distance can actually be useful, for different reasons, especially in the case of vulnerable groups.
  • Among the police, the predominant view is that technologies can have a positive impact on the community and that, to a large extent (about 90% in the case of the survey), they are not instruments that can lead to discrimination against social groups subject to stigmatization.
  • The organizations nuance these views. Depending on each organization’s objectives, a presence in these digital media can carry various disadvantages; the perceived usefulness is relative to the type of surveillance that can be exercised (the surveillance potential); and further issues arise from the digital divide, depending on the degree of exclusion experienced by each social group.
  • With regard to surveillance linked to the stigmatization of specific groups, cases vary. In general, this distancing from the police stems from the fact that trust, a basic element of community policing, does not seem to be present. Faced with this lack of trust, the use of technologies is only seen as useful where it makes it possible to identify police abuse.
  • Among the actors who view technological implementation more positively are those who demand more community policing and mechanisms adapted to already excluded groups, as in the case of the Raval neighborhood and others, for example regarding gender violence or older people, so that these technologies can include them better, combined at the same time with a greater police presence in the streets.
  • On disruption: analysing the most followed Twitter account, Helpers (@bcnhelpers), the tag cloud alone shows that the prevailing view is one of crime prosecution from a perspective of criminalisation and, to some extent, stigmatisation of certain groups. At the same time, the institutional stance towards such groups (like Helpers) has shifted in recent years, and they are now also being denounced by the police authorities.

 

Insights by the organizations invited to the presentation

We are very grateful to all of them, and to all the other organizations and police forces involved in this project, for their collaboration; without their help this study would not have been possible.

The importance of the “human factor”

Jose L. Diego, Valencia Local Police

For this expert in technology and community policing from the Valencia Local Police, technology is a tool. He emphasizes that, as local police, they are already closer to the community; that they have been working in the streets as community police since the 1990s; and that since 2005 they have participated in European research projects on community policing. He believes that technologies (such as social networks, where they have a large audience) have been incorporated into a deeper process of integration into the community and civil society. This, he says, “is not only done through new technologies”, which can nourish and help; the basis is face-to-face work: getting to know, and taking an interest in, the contexts of the different neighbourhoods and the problems of their people.

The digital divide may push community policing further away from the most vulnerable groups

Beatriz Fernández Gensana, Arrels Fundació

She largely agrees with the report as regards entities and groups in vulnerable situations: what these groups require is a human factor, and the fear is that this human factor will be replaced by technology, that is, “that a person in front of you will be replaced by an algorithm”. A further reticence relates to the digital divide: for homeless people, making appropriate use of these tools, or having even the basic training needed to use them, is a real difficulty. Finally, she emphasizes that many groups are already stigmatized and that new technologies carry the risk of increasing this: “having mobile applications or databases that identify a certain person belonging to a certain group can make them go from being a victim to being a potential criminal”.

Fear of a greater surveillance over vulnerable communities

Kaire Ba Dejuan, Pareu de Parar-me, SOS Racisme – Catalunya

Communities that are already subject to greater control fear that the implementation of technology will increase that control. She reflects that the concern is that mechanisms are being established to monitor the communities but not to evaluate the police: “We have insisted a lot on whether, for example, some kind of police evaluation inspired by the PIPE project (Effective Police Identification) could be established at the level of some municipalities to see who is identified and thus be able to analyze this data and see if there was disproportionality. And it’s really been very difficult, we’ve come across a lot of ‘no’s’, but then we’ve seen that when it comes to setting up evaluation mechanisms for communities, then yes”. For Kaire, it would be interesting to evaluate the police work that sometimes generates and reproduces this social stigma, since intervention is also needed to eradicate certain collective processes and dynamics.

Predictive technology can lead us to create algorithms that deepen stigmas and discriminations

Andrés G. Berrio, Irídia

He thinks there is a great need to introduce mechanisms and projects that bring the problems of certain communities closer, and this is something positive that technology can offer. At the same time, the trend towards criminal actuarialism, which seeks to predict criminal risk, “can lead us to build algorithms that reproduce positions held by people and that can end up deepening certain stigmas, discriminations, etc.”. He also notes that community policing confronts problems that are structurally rooted and should be addressed as such by institutions and society as a whole. For community policing to work well, he believes, crime prevention must be analysed from a much broader social perspective: “without truly social policies that can address a range of situations and perspectives it is going to be very difficult to generate effective links”.

 

You can watch the presentation event in this video (in Spanish):