
Fighting for justice and transparency in the age of AI

As tech, data and AI increasingly permeate our lives, transforming the way we live, work, and communicate, trust, safety and oversight in AI outcomes remain alarmingly inadequate.

While innovations in other commercial fields, from healthcare to automobiles, aviation and food, undergo rigorous checks before reaching the public, AI systems are often deployed with minimal scrutiny. This oversight gap can perpetuate biases, especially against marginalized communities, and undermine hard-won rights. In our experience, current AI dynamics are the single greatest threat to advances in equality, democracy, sustainability and public accountability.

At Eticas Foundation, our mission is to promote better AI for all through three core pillars: Auditing, Empowering, and Advocating.

Founded in 2012

Since the day we started, we have been dedicated to protecting people and the environment in AI processes by building the socio-technical tools needed to ensure that AI works for everyone. We fulfill our mission by working with communities affected by AI to reverse-engineer the systems that shape their lives, build their technical capacity, and help them develop their own voice and agenda in the AI debate.

We focus our efforts through three core pillars:

We Audit

Our community-led AI audits and Public-Interest Audits are essential for opening the AI black box:

Transparency:

We assess AI systems to identify biases and inefficiencies, generating empirical data that illustrates their impact on specific communities.

Empowerment:

Collaborating with civil society organizations, we actively engage affected groups, enhancing their technical capacity and amplifying their voices in the AI discourse.

Informed advocacy:

The results of our audits inform campaigns and legal actions, advocating for regulatory frameworks that prioritize fairness and accountability in AI development.

We Empower

We believe that knowledge is key to navigating the complexities of AI:

Research initiatives:

Our foundation leads research that examines the implications of data and AI processes on underrepresented groups, highlighting the societal impacts of these technologies.

Awareness campaigns:

We aim to demystify data's role in everyday life, empowering citizens to understand the challenges and opportunities of the digital future.

Collaborative engagement:

Fostering a culture of collaboration, we unite various stakeholders to comprehensively address the societal challenges posed by AI.

We Advocate

Our advocacy efforts are pivotal in shaping the future of responsible AI:

Legal framework development:

We strive to strengthen regulations that protect society, ensuring accountability and effectiveness in policy implementation.

Promoting change:

By providing critical insights into AI bias and inefficiencies, we equip policymakers to enact reforms that safeguard vulnerable communities.

Integrating robust audits:

We advocate for the incorporation of thorough audits and impact metrics into compliance standards, holding the AI industry accountable for its practices.

Our Team

Arancha Cienfuegos

Tech Lead, Data Systems

Alvaro Giorgio

Tech Lead, Data Systems


Join the team!

Since our inception, the team at Eticas Foundation has been committed to the belief that responsible innovation is the only innovation worth pursuing.

Join us in our mission for better AI for all. If you share our passion for creating a fair and accountable technological landscape, explore our career opportunities or reach out to us at careers@eticas.ai.

Stay Informed, Stay Ahead

Sign up for our newsletter to receive updates on AI accountability, our latest projects, and how you can make an impact. Your details will be securely stored, and we’ll reach out as soon as we’re ready to share exclusive insights.