Auditing algorithmic society
If you are using or planning to deploy a new algorithm, or if you are concerned about a third-party algorithm, we have good news: we can help you. We are experts in developing Algorithmic Impact Assessments that minimize an algorithm's social impact and address its main sources of bias, both internally (when you own the algorithm) and externally (when you want to expose bias in an algorithm you have not developed).
Preventing bias methodology
Bias is a recurring risk that spans the entire algorithmic life-cycle. That is why it is so important to watch for it at every stage of the process.
Interested in our work?
You can collaborate with the project by sharing algorithms being deployed around you, or by using the information in this directory to foster change in your community.