Most algorithms are trained on large volumes of historical data, and unfortunately the recommendations, predictions and expectations they render inherit the biases embedded in that data.
Race and gender first come to mind when we think of bias, but let’s not forget disability and age. A lesser-known form of bias arises when present data suddenly stops looking like past data: current algorithms cannot cope when confronted with a sudden pandemic, a financial crisis or another rare, extreme event.
Note that no developer intentionally discriminates. However, because developers build from their own unique knowledge base, unintentional forms of discrimination occur. The harm then amplifies when there is no redress, because at first glance the algorithm appears to be functioning as expected.
Fortunately, the first step towards redress and change is awareness. The Eticas Foundation’s goal with the observatory is to highlight biased algorithms and their social impact.
Through this directory, you can browse algorithms and filter them by location, type of discrimination produced and/or sector impacted.
Interested in our work?
You can collaborate with the project by sharing algorithms being implemented around you, or by using the information in this directory to foster change in your community.