In 2010, a Taiwanese-American family discovered what seemed to be a malfunction in their Nikon Coolpix S630 camera: every time they took a photo of each other smiling, a message flashed across the screen asking, “Did someone blink?” No one had, so they assumed the camera was broken (Rose, 2010).
Face detection, one of the latest smart technologies to trickle down to consumer cameras, is supposed to make taking photos more convenient. Some cameras with face detection are designed to warn you when someone blinks; others are triggered to take the photo when they see that you are smiling (Rose, 2010). Nikon’s camera was not broken, but its face detection software was wildly inaccurate, unless, that is, you are Caucasian. Nikon has declined to comment on the error in its software, but Adam Rose of TIME magazine has offered an explanation: the algorithm probably learned to recognize ‘blinking’ from a dataset of mostly Caucasian faces (ibid.). Because the software had not been trained on images of Asian eyes, it mistook open eyes for blinking ones. This error underscores that algorithms must be consciously designed and tested with everyone in mind, because if they are not, bias can slip into the picture.