Whether you are using or planning to deploy a new algorithm, or you are concerned about a third-party algorithm, we have good news: we can help you. We are experts in developing Algorithmic Impact Assessments that minimize an algorithm's social impacts and address its main sources of bias, both internally (when you own the algorithm) and externally (when you are concerned about, or would like to expose, bias in an algorithm you have not developed).
The desirability of an algorithmic system depends on both its capacity to meet the legal requirements of its specific context (data protection and privacy) and its capacity to anticipate and mitigate undesired social impacts while fulfilling the purpose for which it was created.
Sources of bias are a recurring risk that spans the entire algorithmic life cycle, which is why it is so important to remain aware of them at every stage of the process.
You can collaborate with the project by sharing with us algorithms being implemented around you, or by using the information in this directory to foster change in your community.