Invisible No More: The Impact of Facial Recognition on People with Disabilities
An In-Depth Audit of Biases in Facial Recognition Technology Impacting Individuals with Disabilities
Key figures:
- Deviations of up to 21 years between predicted and actual ages
- 7.19% age prediction inaccuracy for participants with Down Syndrome
- 4.45% age prediction inaccuracy for participants without Down Syndrome
As advances in artificial intelligence (AI) continue to reshape various aspects of society, the promise of technologies like facial recognition holds significant potential. However, as these advancements unfold, it becomes increasingly crucial to consider the broader implications and potential risks they pose, particularly for marginalized communities.
This adversarial audit report investigates the intersection of facial recognition technology and disability, shedding light on potential biases and challenges faced by individuals with disabilities.
Key recommendations include adopting transparent bias mitigation strategies, prioritizing accessibility by design, collaborating with disability organizations, and investing in research on AI and disability. The findings underscore the need for a more inclusive and equitable approach to technological advancement.
01
Our analysis of age predictions revealed a clear gender bias. Women were consistently underestimated, in some cases predicted to be as young as 5 or 8 years old, while men tended to be overestimated, widening the gender divide. These findings expose an entrenched gender bias within Azul’s algorithm, one whose ramifications extend across the lives of individuals with Down Syndrome.
Azul’s algorithm demonstrated moderate efficacy in BMI prediction for individuals with Down Syndrome. However, it showed a propensity to overestimate BMI values, especially for women. This trend raises concerns about equity in insurance pricing.
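The gender pattern described above (underestimation for women, overestimation for men) can be quantified with a mean signed error per group. The following is a minimal sketch with hypothetical records; the record layout and values are illustrative assumptions, not the audit’s data or schema:

```python
from collections import defaultdict

def mean_signed_error(records):
    """Mean of (predicted - actual) per gender; negative values mean underestimation."""
    sums, counts = defaultdict(float), defaultdict(int)
    for gender, predicted, actual in records:
        sums[gender] += predicted - actual
        counts[gender] += 1
    return {g: sums[g] / counts[g] for g in sums}

# Hypothetical (gender, predicted_age, actual_age) triples.
records = [
    ("F", 8, 24), ("F", 19, 31),   # underestimated
    ("M", 40, 29), ("M", 35, 30),  # overestimated
]
print(mean_signed_error(records))  # {'F': -14.0, 'M': 8.0}
```

The same signed-error grouping works for BMI by substituting predicted and actual BMI values; a consistently positive group mean would indicate the overestimation trend described above.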
02
The comparison between Azul and DeepFace brought to the forefront the intricate challenges surrounding age prediction for individuals with Down Syndrome. Azul exhibited deviations of up to 21 years between predicted and actual ages. DeepFace fared similarly: its predicted ages for Down Syndrome participants deviated from actual ages by -16 to +23 years. These parallel disparities highlight how difficult age prediction remains for individuals with Down Syndrome.
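A deviation range like the -16 to +23 years reported for DeepFace corresponds to the minimum and maximum signed difference between predicted and actual ages across participants. A minimal sketch of that metric, with hypothetical ages chosen purely to illustrate the computation:

```python
def deviation_range(predicted, actual):
    """Return (min, max) signed deviation (predicted - actual) in years."""
    deviations = [p - a for p, a in zip(predicted, actual)]
    return min(deviations), max(deviations)

# Hypothetical predicted/actual ages, not the audit's data.
predicted = [30, 14, 52]
actual = [25, 30, 29]
print(deviation_range(predicted, actual))  # (-16, 23)
```

In practice, the predicted ages would come from the model under audit; for instance, the deepface Python package exposes `DeepFace.analyze(img_path, actions=["age"])`, which returns an estimated age per detected face.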
03
- Comprehensive Evaluation: In light of biases in facial recognition, stakeholders must assess its suitability and reliability before deployment.
- Accessibility by Design: All FR models, including Azul’s, should prioritize universal design for an inclusive experience from the start.
- Disability Advocacy Collaboration: Companies like Azul should partner with disability organizations to develop ethical AI solutions.
- Regular Third-Party Audits: Azul and others should conduct independent audits to ensure transparency, fairness, and user protection.