Computers

Study finds some facial recognition systems only accurate for white male faces

The study found extraordinary error rates in picking up gender in dark-skinned female subjects

A new study from MIT and Stanford University researchers has found that three commercial facial analysis programs demonstrated significant error rates in determining the gender of any subject who wasn't white and male. The study highlights an ongoing problem in the field of machine learning: the datasets these algorithms are trained on appear to contain a disproportionately high volume of white male faces compared to other genders and skin tones.

The research originated several years ago when Joy Buolamwini, at the time a graduate student at MIT Media Lab, was using a commercial facial analysis system for a multimedia installation and discovered that it consistently failed to recognize darker-skinned faces. To test these systems, Buolamwini and research partner Timnit Gebru assembled a set of 1,200 facial images with even representation of men, women and different skin tones.

The study used gender identification to test all three systems, as it is a binary decision that allows for straightforward statistical assessment, and the results were dramatic. Across all three tested facial recognition systems, the error rates for identifying the gender of white males were less than one percent. But for darker-skinned female subjects, that error rate ballooned to between 20 percent and 35 percent. For the darkest-skinned female faces, two of the tested systems generated extraordinary error rates of around 46 percent when trying to identify gender.
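To illustrate the kind of arithmetic behind these figures, the sketch below shows one simple way to compute per-subgroup error rates for a binary gender classifier. This is not the study's actual evaluation code; the function name and the sample predictions, labels and subgroup tags are hypothetical.

```python
# Minimal sketch (not the study's code): per-subgroup error rates
# for a binary classifier, given hypothetical prediction/label/subgroup lists.
from collections import defaultdict

def error_rate_by_subgroup(predictions, labels, subgroups):
    """Return the fraction of misclassified samples within each subgroup."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for pred, label, group in zip(predictions, labels, subgroups):
        totals[group] += 1
        if pred != label:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical example: predicted gender, true gender, and subgroup per face
preds  = ["male", "female", "male", "male"]
truth  = ["male", "female", "female", "male"]
groups = ["lighter-skinned male", "darker-skinned female",
          "darker-skinned female", "lighter-skinned male"]

print(error_rate_by_subgroup(preds, truth, groups))
# {'lighter-skinned male': 0.0, 'darker-skinned female': 0.5}
```

Breaking the error rate out per subgroup, rather than reporting a single overall accuracy, is what exposes the disparities the study describes: a classifier can look highly accurate on aggregate while failing badly on an under-represented group.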

"To fail on one in three, in a commercial system, on something that's been reduced to a binary classification task, you have to ask, would that have been permitted if those failure rates were in a different subgroup?" says Buolamwini. "The other big lesson ... is that our benchmarks, the standards by which we measure success, themselves can give us a false sense of progress."

Despite the study's limited focus on gender identification, the researchers suggest the same racial and gender biases would probably be reflected in the results these facial analysis systems produce for other tasks. The problems raised by the study are concerning given that facial recognition systems are used by law enforcement agencies and health care departments.

"What's really important here is the method and how that method applies to other applications," says Buolamwini. "The same data-centric techniques that can be used to try to determine somebody's gender are also used to identify a person when you're looking for a criminal suspect or to unlock your phone."

The paper will be presented at the upcoming Conference on Fairness, Accountability, and Transparency in New York.

Take a deeper look at the research in the video below.

Source: MIT News

Gender Shades

2 comments
Daishi
If gender and race are social constructs how can they be determined by facial recognition software anyway?
christopher
Amusing, until you realize that idiots use this same software for security. FaceID, TouchID, etc etc - all hacked and bypassed easily, but that doesn't stop the snake-oil vendors making millions from exploiting us with fake claims.