Looking inside Uber's algorithm

As AI systems proliferate, the need to understand how they work and how they impact communities, and specifically vulnerable groups, becomes more and more urgent. However, because these systems are often “black box” technologies that lack transparency, auditing algorithms is not always easy, nor is it always welcomed by those who deploy them. Since the mission of Eticas Foundation is to protect people in technological processes, we came up with another way of opening the “black box”: reverse-engineering. We measure the social impact of algorithmic systems on people.

Uber, together with other ride-hailing apps, has changed the way we move around cities. The algorithms powering these platforms can offer precise ETAs and a rather personalised service, but have you ever wondered whether a custom service could actually mean that some individuals or groups are being discriminated against?

Well, we have, but to test this hypothesis we need data. To be honest, lots of data. This is why we are asking you to help us compile a database of trips (forget about personal information; we just want to look at the general patterns). The information we ask you to share consists only of your trip's duration, its price, its start and end points, and other general details that by no means would allow us to know much about you. Beyond that, we will not be able to link this information to names or emails, and we will never know to whom it belongs.
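To make the request more concrete, here is a minimal sketch of what an anonymised trip record of this kind might look like. The field names, units and values below are illustrative assumptions for this example, not the actual fields of our submission form.

```python
from dataclasses import dataclass

@dataclass
class TripRecord:
    """Hypothetical anonymised trip record (illustrative only)."""
    duration_min: float   # total trip duration in minutes
    price: float          # fare paid, in local currency
    start_area: str       # coarse pickup area (e.g. a neighbourhood), never an exact address
    end_area: str         # coarse drop-off area
    start_time: str       # rounded timestamp, e.g. "2021-06-14T18:30"

# Nothing in this record identifies the rider: no name, no email, no account ID.
example = TripRecord(
    duration_min=22.0,
    price=14.5,
    start_area="Eixample",
    end_area="Gràcia",
    start_time="2021-06-14T18:30",
)
```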

We have tried to make it simple for you, so if you are willing to collaborate, follow this link and help us make the box a little less black!