Auditing algorithmic society

If you are using or planning to deploy a new algorithm, or if you are concerned about a third-party algorithm, we have great news: we can help you! We are experts in the development of Algorithmic Impact Assessments, which minimize an algorithm's negative social impacts and address its main sources of bias both internally (when you own the algorithm) and externally (when you are concerned about, or would like to expose, bias in an algorithm you have not developed).

Social desirability

The desirability of an algorithmic system depends on both its capacity to meet the legal requirements of its specific context (data protection and privacy) and its capacity to anticipate and mitigate undesired social impacts while fulfilling the purpose for which it was created.

Bias-prevention methodology

Sources of bias are a recurring risk that spans the entire algorithmic life-cycle. That is why it is so important to watch for them at every stage of the process.
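As an illustration of the kind of check that can be run at the evaluation stage of that life-cycle, the sketch below computes a disparate-impact ratio (the "four-fifths rule") over the decisions of a hypothetical binary classifier. The group data, function names, and 0.8 threshold are illustrative assumptions, not a description of any specific assessment methodology.

```python
# Minimal sketch: disparate-impact ratio ("four-fifths rule") as one
# possible bias check during an algorithm's evaluation stage.
# All group labels and decision data below are hypothetical.

def selection_rate(decisions):
    """Fraction of positive (favourable) decisions in a group."""
    return sum(decisions) / len(decisions)

def disparate_impact(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.
    Ratios below 0.8 are commonly flagged for review (four-fifths rule)."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high

# Hypothetical decisions (1 = favourable outcome) for two groups.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # selection rate 0.75
group_b = [1, 0, 0, 1, 0, 0, 1, 0]   # selection rate 0.375

print(disparate_impact(group_a, group_b))  # 0.5 -> below 0.8, worth reviewing
```

A single metric like this never settles the question on its own; it is one signal among many that an assessment would weigh at each stage.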

Interested in our work?

You can collaborate with the project by sharing with us algorithms that are being implemented around you or by using the information in this directory to foster changes in your community.