Can Apple's iPhone X Beat Facial Recognition's Bias Problem? | WIRED

Joy Buolamwini once built a robot that could play peekaboo. But there was just one problem: It couldn't see her. Buolamwini is black, and the facial-recognition software she used couldn't recognize her face. The software worked well enough with lighter-skinned people, so Buolamwini moved on to other projects. "[I] figured, you know what, somebody else will solve this problem," she explained in a TEDx talk about her work.

But it didn't get solved, at least not right away. Buolamwini continued to encounter facial-recognition software that just couldn't see her. Hers was not an isolated example. In 2009, two co-workers created a video that went viral showing how an HP webcam designed to track people's faces as they moved followed the white worker, but not her black colleague. In 2015, web developer Jacky Alciné tweeted a screenshot that showed Google Photos labeling a picture of him and a friend as gorillas.

On Tuesday, Apple introduced its own facial-recognition system, Face ID, which will unlock its new iPhone X. Now we will learn whether Apple was able to overcome such problems.

Apple, which did not respond to an interview request, has had years to learn from the mistakes of previous systems. There are some indications it is applying those lessons. Face ID uses an infrared camera to create three-dimensional models of its users’ faces, which, in theory, could prove more nuanced than previous two-dimensional systems. Its website for the new iPhone X shows Face ID working with a person of color. During its two-hour new-product event, the company showed another face-detection feature—part of its automated portrait-lighting mode—working with people with a variety of skin tones. But we won't know for sure how well Face ID works in the real world until enough iPhone Xs are in the hands of customers.

Solving these problems matters, and not just for Apple. As the use of facial recognition technology by law enforcement expands, the consequences of malfunctions will be more severe. "My friends and I laugh all the time when we see other people mislabeled in our photos," Buolamwini said during her TEDx talk. "But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties."

There are technical reasons that previous facial-recognition systems failed to recognize black people correctly. In a blog post, HP blamed the lighting conditions in the viral video for its camera's failure. In an article for Hacker Noon, Buolamwini points out that a camera's default settings can affect how well it's able to process images of different skin tones. But Buolamwini argues that these issues can be overcome.

Facial-recognition software works by training algorithms with thousands, or preferably millions, of examples, and then testing the results. Researchers say the problematic facial-recognition systems were likely trained on too few images of black faces, and so can identify them only under ideal lighting conditions. Stanford University computer science professor Andrew Ng, who helped build Google's artificial-intelligence platform Google Brain, and Michigan State professor and machine-vision expert Anil Jain say facial-recognition systems need to be trained with more diverse samples of faces.
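The kind of imbalance Ng and Jain describe shows up directly when a trained model's accuracy is measured per demographic group rather than in aggregate. Here is a minimal sketch of that kind of audit; the group labels, numbers, and function name are all hypothetical, chosen only to illustrate the measurement, not taken from any real system.

```python
from collections import defaultdict

def accuracy_by_group(results):
    """Compute recognition accuracy per demographic group.

    `results` is a list of (group, was_recognized) pairs, e.g. the
    outcome of running a face-recognition model over a labeled test set.
    A model trained on unrepresentative data can score well overall
    while failing badly on an underrepresented group.
    """
    totals = defaultdict(int)
    correct = defaultdict(int)
    for group, was_recognized in results:
        totals[group] += 1
        if was_recognized:
            correct[group] += 1
    return {g: correct[g] / totals[g] for g in totals}

# Hypothetical evaluation results: 100 test faces per group.
results = (
    [("lighter-skinned", True)] * 95 + [("lighter-skinned", False)] * 5 +
    [("darker-skinned", True)] * 70 + [("darker-skinned", False)] * 30
)
print(accuracy_by_group(results))
# {'lighter-skinned': 0.95, 'darker-skinned': 0.7}
```

An aggregate accuracy of 82.5 percent would hide the 25-point gap between the two groups, which is why per-group evaluation matters.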

Researchers call this type of problem, when underlying biases influence the resulting technology, "algorithmic bias." Other examples include photo sets used to train image-recognition algorithms that identify men in kitchens as women, job-listing systems that show more high-paying jobs to men than women, or automated criminal-justice systems that assign higher bail or longer jail sentences to blacks than whites. Buolamwini founded a group called the Algorithmic Justice League to raise awareness of algorithmic bias, collect examples, and ultimately solve the problem.

Apple's use of infrared will make Face ID less susceptible to lighting problems. But the technology alone can't overcome the potential for algorithmic bias. "The face recognition system still has to be trained on faces of different demographic types," Jain says.

If Apple's software proves more capable than facial recognition systems of the past, it will be because the company took this into account while training it.

https://www.wired.com/story/can-apples-iphone-x-beat-facial-recognitions-bias-problem/