Introducing “Weakening of democratic practices” as a type of social impact of algorithms in the OASI Register

OASI stands for Observatorio de Algoritmos con Impacto Social (Observatory of Algorithms with Social Impact): the fact that the use and abuse of algorithmic systems can have negative social impacts is one of the main reasons the Eticas Foundation developed OASI in the first place.

Therefore, “Social impact” is one of the main categories we use to catalogue and typify algorithms in the OASI Register. When we launched an updated version of the Register in October 2021, we included these types of social impact:

  • Racial discrimination
  • Gender discrimination
  • Socioeconomic discrimination
  • Religious discrimination
  • State surveillance
  • Social polarisation / radicalisation
  • Threat to people’s privacy
  • Manipulation / behavioural change
  • Generating addiction
  • Disseminating misinformation
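
To make the structure of the Register more concrete, here is a minimal sketch of how entries in a register like OASI could be modelled in code, with each algorithm tagged with one or more of these social impact types. This is purely illustrative and is not the actual OASI schema: all the names in it (SocialImpact, RegisterEntry and the example values) are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum


class SocialImpact(Enum):
    """Hypothetical enumeration of the social impact types listed above."""
    RACIAL_DISCRIMINATION = "Racial discrimination"
    GENDER_DISCRIMINATION = "Gender discrimination"
    SOCIOECONOMIC_DISCRIMINATION = "Socioeconomic discrimination"
    RELIGIOUS_DISCRIMINATION = "Religious discrimination"
    STATE_SURVEILLANCE = "State surveillance"
    POLARISATION_RADICALISATION = "Social polarisation / radicalisation"
    THREAT_TO_PRIVACY = "Threat to people's privacy"
    MANIPULATION = "Manipulation / behavioural change"
    ADDICTION = "Generating addiction"
    MISINFORMATION = "Disseminating misinformation"
    # Added in December 2021, as the rest of this post explains:
    WEAKENING_DEMOCRATIC_PRACTICES = "Weakening of democratic practices"


@dataclass
class RegisterEntry:
    """Hypothetical model of one entry in a register like OASI."""
    name: str
    used_by: str
    social_impacts: set[SocialImpact] = field(default_factory=set)


# Example entry, based on the Valencia IA4Covid case discussed below.
entry = RegisterEntry(
    name="Valencia IA4Covid",
    used_by="Valencian regional government (Spain)",
    social_impacts={SocialImpact.WEAKENING_DEMOCRATIC_PRACTICES},
)
print(sorted(impact.value for impact in entry.social_impacts))
```

One advantage of modelling the impacts as a set rather than a single value is that a given algorithm can be tagged with several types of social impact at once, which matches how the Register actually classifies systems.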

We decided on this categorisation based on our knowledge at the time, and because we found it useful and relevant for classifying and analysing the different types of algorithmic systems being developed and used around the world. You can read how we define those categories on the page about the social impact of algorithms on the Eticas Foundation's site.

As we keep adding algorithms to the Register, and as our knowledge of the algorithmic landscape grows, we may need to review and update this typology of social impacts. That is exactly what we did in December 2021, when we introduced “Weakening of democratic practices” as a new type of social impact of algorithms.

This is how we define it:

In functioning democracies, the decisions taken by governments, public administrations and other state bodies should be transparent and explainable, so that those bodies can be held accountable and their decisions can be reviewed and contested through democratic mechanisms. However, if some decisions are based on algorithmic systems that are not publicly known, transparent or explainable, democratic practices may be weakened: it becomes unclear, or even impossible to know, how a decision was taken, who is accountable for it, and how it can be reviewed and democratically contested.

Since then, we have added “Weakening of democratic practices” as a potential negative social impact of two algorithmic systems in the OASI Register: Valencia IA4Covid, used by the Valencian regional government in Spain to decide which social restrictions to impose during the Covid-19 pandemic; and Prometea, used by the Public Prosecutor's Office in Buenos Aires, Argentina, to automate many administrative tasks, sometimes leaving humans out of processes that are completely automated.

The use of such algorithms and automated decision-making processes by governments and other public authorities has the potential to help busy civil servants and elected representatives work more efficiently. But it also has the potential to make their work more opaque (perhaps even to themselves) and less accountable to the public.

That is why we think adding “Weakening of democratic practices” to the OASI Register will be useful and will help us produce a more accurate picture of how algorithmic systems affect public life.

OASI is a work in progress, and we aim to keep adding entries to the Register. If you know of an algorithm that is not in the Register, you can let us know and help us in our effort to contribute to an informed and responsible public conversation about the use (and abuse) of algorithmic systems in our societies.

Jose Miguel Calatayud