Researchers say Amazon face-detection technology shows bias
January 25, 2019
NEW YORK (AP) — Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.
Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop selling the service, fearing it leaves Amazon vulnerable to lawsuits.
The researchers said that in their tests, Amazon's technology labeled darker-skinned women as men 31 percent of the time.