Automating (in)justice?: an adversarial audit of RisCanvi
A tool designed to assess inmates’ risk of recidivism should be robust and reliable. This adversarial audit, the first conducted on an AI system used in the criminal justice system in Europe, makes some shocking discoveries.
Lawmaker or Lawbreaker? How FaceNet Got It Wrong
FaceNet’s errors reveal AI’s potential for misidentification, highlighting cases where even prominent figures were incorrectly flagged. This article discusses the implications for privacy and security in facial recognition technology.
Bias on a Silver Platter: AI’s Struggle with Gender Roles
Examining AI’s reinforcement of gender roles, this piece highlights how bias in automated systems perpetuates stereotypes and impacts social norms.
Name Your Bias: AI’s Fairness Challenge in Hiring
Exploring AI’s role in hiring, this article delves into the bias challenges within automated recruitment tools and their impact on fair hiring practices.
FemTech: My body, my data, their rules
Exploring the privacy risks in femtech, this article reveals how personal health data in menstrual and fertility tracking apps is often exploited, raising concerns over data ownership, consent, and regulatory gaps.
BadData: The High Cost of Poor Data Quality
Uncovering the hidden risks of flawed data: how errors, biases, and outdated information can distort decision-making, fueling predictions that may shape lives in unexpected and sometimes irreversible ways.
The case of VioGén: Can AI solve gender violence?
VioGén is an algorithm used in Spain to assess the level of risk faced by a victim of gender-based violence and to establish her protection measures.
Auditing TikTok. Social media’s treatment of migrants
What is the impact of social media on the representation and voice of migrants and refugees in Europe?
Auditing YouTube. Social media’s treatment of migrants
What is the impact of social media on the representation and voice of migrants and refugees in Europe?
Invisible No More: The Impact of Facial Recognition on People with Disabilities
An In-Depth Audit of Biases in Facial Recognition Technology Impacting Individuals with Disabilities