Eticas Foundation will externally audit the risk-assessment algorithm used to determine protection measures for women at risk of gender violence

In 2015, the Spanish Government Delegation against Gender Violence carried out a macro-survey on violence against women. According to its results, 10.3% of women aged 16 or over had suffered physical violence at some point in their lives, a figure that rises to 25.4% for psychological violence. This survey, together with earlier analyses, led the Secretary of State for Security of the Ministry of the Interior to develop ‘VioGén’, the Comprehensive Monitoring System for Cases of Gender Violence.

Eticas Foundation is a non-profit organisation working to protect people in technology processes. Among its fields of expertise are algorithmic auditing (see its Guide to Algorithm Auditing) and AI bias, and it has been following the use of VioGén closely. In 2018 it published the first report on VioGén, reviewing the system’s technical elements, exploring a series of hypotheses about bias and false negatives, and proposing a methodology for auditing VioGén to measure its impact on different cultural, socioeconomic and geographical groups. Eticas Foundation even offered to conduct a pro-bono internal audit of the system.

In a context of increased awareness of the impact of gender violence, the organization has decided to undertake an external audit of the system, in alliance with the Ana Bella Foundation, both to bring transparency to how the VioGén algorithm works and to train and empower civil society organizations to open the black box of AI.

External algorithmic audits are audits conducted without access to a system’s code. They use a range of methods to reverse engineer systems and understand their functioning through known inputs and/or outputs, relying on administrative data, interviews, reports or design scripts to gather outputs at scale.
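To make this concrete, here is a minimal sketch, in Python, of one such method: comparing a system’s outputs across demographic groups using only gathered records of inputs and outputs. The record schema, group labels and field names below are assumptions for illustration, not VioGén’s actual data format.

```python
from collections import Counter

# Hypothetical records gathered from administrative data, interviews or
# reports: each pairs a demographic group with the risk level the system
# assigned. Neither field reflects VioGén's real data format.
records = [
    {"group": "A", "risk_level": "Not appreciated"},
    {"group": "A", "risk_level": "Low"},
    {"group": "B", "risk_level": "Low"},
    {"group": "B", "risk_level": "Medium"},
]

def low_risk_rate_by_group(records, low_levels=("Not appreciated", "Low")):
    """Share of cases per group that received a low risk assessment."""
    totals, lows = Counter(), Counter()
    for record in records:
        totals[record["group"]] += 1
        if record["risk_level"] in low_levels:
            lows[record["group"]] += 1
    return {group: lows[group] / totals[group] for group in totals}

# A large gap between groups would flag a potential disparity to investigate.
print(low_risk_rate_by_group(records))  # {'A': 1.0, 'B': 0.5}
```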

In July 2021, Eticas launched an External Algorithmic Auditing project, which over the following 18 months will reverse engineer systems in the fields of criminal justice (gender violence and recidivism), employment (recruitment platforms), social media and banking (credit scoring). Its team of experts, with the help of an international advisory board of engineers, lawyers and practitioners, has been developing and validating different methodologies to understand how algorithms work, whether they are biased and how they affect different groups of people.

In 2022, this work will be compiled into an External Auditing Guide which, like Eticas’ pioneering Algorithmic Auditing Guide, will be the first document to compile and explain in practical terms how to externally reverse engineer different kinds of AI systems. The guide will help those affected by algorithmic decisions understand how those decisions were made and identify whether they may have been treated unfairly by such systems.


On VioGén, the Spanish Government’s algorithm to protect women against gender violence

The VioGén system has been in operation since 2007, responding to gender violence through tools developed by the National Police. Regulated by Organic Law 1/2004, of December 28, on Comprehensive Protection Measures against Gender Violence, the system has undergone several modifications. At present, it distinguishes five risk levels: “Not appreciated”, “Low”, “Medium”, “High” and “Extreme”, according to Instruction 4/2019 of the Secretariat of State for Security, which came into force on March 13, 2019.

Each level triggers a series of mandatory protection and monitoring measures, which vary in intensity according to the level of risk. VioGén also includes a mechanism called 3A, “Notice, Alert, Alarm” (from the Spanish Aviso, Alerta, Alarma), which provides agents and complainants with a permanently updated estimate of the victim’s risk level, called the “Permanent Estimation of Risk Evolution” (or EPER, “Estimación Permanente de Evolución del Riesgo”).

Its main objectives are to collect information on reported cases in a coordinated manner, to assess risk, and to support the security forces in preventing gender violence by issuing warnings through its “Automated Notification Subsystem” whenever a risk to the victim is detected.
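As a purely hypothetical illustration of this notification logic, the sketch below re-estimates a case’s risk level as new information arrives and emits a warning whenever the level changes. The class and method names are invented; this is not VioGén’s actual implementation.

```python
RISK_ORDER = ["Not appreciated", "Low", "Medium", "High", "Extreme"]

class CaseMonitor:
    """Tracks one case and warns when its assessed risk level changes."""

    def __init__(self, case_id: str, level: str = "Not appreciated"):
        self.case_id = case_id
        self.level = level

    def update(self, new_level: str) -> None:
        # Emit a warning only when the assessment actually changes,
        # distinguishing escalations from de-escalations.
        if new_level != self.level:
            escalated = RISK_ORDER.index(new_level) > RISK_ORDER.index(self.level)
            kind = "ALERT" if escalated else "NOTICE"
            print(f"{kind}: case {self.case_id} moved from "
                  f"{self.level!r} to {new_level!r}")
            self.level = new_level

monitor = CaseMonitor("case-001", level="Low")
monitor.update("High")  # ALERT: case case-001 moved from 'Low' to 'High'
```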

Following these premises, two questionnaires were designed and implemented across almost all of Spain to measure the risk of suffering domestic violence again: the VPR (Valoración Policial del Riesgo, or Police Risk Assessment) and the VPER (Valoración Policial de Evolución del Riesgo, or Police Assessment of Risk Evolution). Currently, around 30,000 people have access to the system, most of them members of law enforcement agencies, alongside other authorized actors such as prison centers, victim care organizations and 392 municipalities. It should also be noted that since its implementation in 2007, more than 500,000 complaints have been recorded, of which around 10% remain active (according to the Ministry of the Interior, 2020).
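To illustrate the general shape of such a questionnaire-based assessment, the sketch below maps yes/no answers to one of the five risk levels through a weighted score. The indicators, weights and thresholds are invented for illustration; the actual VPR items and scoring are not described here.

```python
RISK_LEVELS = ["Not appreciated", "Low", "Medium", "High", "Extreme"]

# Invented yes/no indicators with illustrative weights; the real VPR
# questionnaire items and their weighting are not public in this text.
WEIGHTS = {
    "previous_violence": 3,
    "threats": 2,
    "access_to_weapons": 3,
    "recent_separation": 1,
}

# Illustrative minimum scores for each level, lowest to highest.
THRESHOLDS = [0, 2, 4, 6, 8]

def assess_risk(answers: dict) -> str:
    """Sum the weights of affirmative answers and map the total to a level."""
    score = sum(weight for item, weight in WEIGHTS.items() if answers.get(item))
    level = RISK_LEVELS[0]
    for threshold, name in zip(THRESHOLDS, RISK_LEVELS):
        if score >= threshold:
            level = name
    return level

print(assess_risk({"previous_violence": True, "threats": True}))  # "Medium"
```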