Public-Interest Audits
Testing AI to protect people, the environment and democracy
Public-Interest Audits (PIAs) are projects in which we bring AI impact data and issues into the spotlight, not in response to a request from an impacted community, as we do with Community-Led Audits (CLAs), but because the Eticas team or our partners believe a specific issue deserves prompt attention, or as part of our broader research into AI impacts.
While all our audits serve the public interest, this program allows us to undertake more experimental projects that leverage different methods and approaches, to bring the Eticas team and Board together around research that is relevant to us and our partners, and to respond quickly to new events and AI applications.
List of PIAs already published:
Lawmaker or Lawbreaker? How FaceNet Got It Wrong
Bias in a Silver Platter: AI’s Struggle with Gender Roles
Name Your Bias: AI’s Fairness Challenge in Hiring
FemTech: My body, my data, their rules
BadData: The High Cost of Poor Data Quality
Our Public-Interest Audits go beyond traditional assessments, providing actionable insights and benchmarks that drive AI transparency and accountability across industries. By holding AI systems to higher standards, Eticas Foundation empowers organizations to make responsible improvements that serve and protect the public interest, fostering a digital future that benefits society as a whole.
Do you know of a system we should audit?
At Eticas, we are committed to digital transparency and justice. If you have information about applications or systems you believe should be investigated, reach out to us.
Donate Now*
Your support helps us fight AI bias, empower communities, and hold systems accountable. Together, we can build a fair and transparent future for all.
*Eticas is a 501(c)(3) nonprofit organization. Your donation is tax-deductible to the fullest extent allowed by law.