We’ve Been Over This
Amazon apparently still hasn’t fixed the problems with its facial recognition software, Rekognition.
The software, which is used by police departments and Immigration and Customs Enforcement, has drawn criticism for frequent errors, like when it falsely matched members of Congress against photos from a mugshot database. Now, new research shows that Rekognition generally fares well on white men's faces but struggles to correctly classify the gender of light-skinned women and of anyone with darker skin, according to The Verge.
Gender Bias
Presented with the faces of light-skinned women, Rekognition incorrectly labeled 19 percent of them as men, according to new research from MIT and the University of Toronto; women with dark skin were mislabeled as men 31 percent of the time. The research will be presented this weekend at the AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society.
Amazon responded to the research much as it has dismissed past criticisms of Rekognition: the company argued that the researchers hadn't used the latest version of the software, according to The Verge. Amazon also pushed back by noting that facial analysis, the system that estimates attributes like facial expressions and gender, is separate from facial recognition, the system police might use to compare a person's face to a mugshot database. Fair points, but neither addresses the actual problem the researchers detected.
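To make that distinction concrete, here is a minimal sketch of how the two Rekognition APIs differ, assuming the boto3 SDK and configured AWS credentials: DetectFaces performs facial analysis (the gender estimation the researchers audited), while CompareFaces performs facial recognition (the kind of matching police might run). The image file names are hypothetical placeholders.

```python
# Minimal sketch; assumes boto3 is installed and AWS credentials are
# configured. The image file names below are hypothetical placeholders.
import boto3

client = boto3.client("rekognition")

def load_image(path: str) -> dict:
    """Read a local image into the format Rekognition expects."""
    with open(path, "rb") as f:
        return {"Bytes": f.read()}

# Facial *analysis*: DetectFaces estimates attributes such as gender.
# This is the system the researchers tested.
analysis = client.detect_faces(
    Image=load_image("headshot.jpg"),  # hypothetical file
    Attributes=["ALL"],
)
for face in analysis["FaceDetails"]:
    gender = face["Gender"]
    print(f"Predicted gender: {gender['Value']} "
          f"(confidence {gender['Confidence']:.1f}%)")

# Facial *recognition*: CompareFaces matches one face against another,
# the kind of lookup police might run against a mugshot.
match = client.compare_faces(
    SourceImage=load_image("headshot.jpg"),  # hypothetical file
    TargetImage=load_image("mugshot.jpg"),   # hypothetical file
    SimilarityThreshold=80,
)
for m in match["FaceMatches"]:
    print(f"Match similarity: {m['Similarity']:.1f}%")
```

The separation is real at the API level, but both endpoints sit on top of the same service, which is why critics argue that bias found in the analysis system still matters for how the product is marketed and deployed.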
Source: Amazon's Facial Recognition Struggles With Darker Skin