Source: Washington Post
Technology
Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds
By Drew Harwell
January 25 at 11:01 AM
Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, new research released Thursday says.
Researchers with M.I.T. Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools.
Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said.
The problem, AI researchers and engineers say, is that the vast sets of images the systems have been trained on skew heavily toward white men. The research shows, however, that some systems have rapidly grown more accurate over the past year after greater scrutiny and corporate investment into improving the results.
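The disparity the researchers measured can be illustrated with a small audit sketch: compute the gender-classification error rate per demographic subgroup and report the gap between the best- and worst-served groups. This is a minimal illustration of the auditing idea, not the MIT Media Lab code; the group names and the audit records below are hypothetical, loosely mirroring the roughly 0 percent and 30 percent error rates reported in the article.

```python
# Sketch of a subgroup error-rate audit for a gender classifier.
# All data below is illustrative, not from the MIT study.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label).
    Returns a dict mapping each group to its misclassification rate."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit set: 100 faces per group, with error counts
# chosen to echo the disparity described in the article.
audit = (
    [("lighter_male", "male", "male")] * 100          # no errors
    + [("darker_female", "female", "female")] * 70
    + [("darker_female", "female", "male")] * 30      # 30 misclassifications
)

rates = error_rates_by_group(audit)
gap = max(rates.values()) - min(rates.values())
print(rates, round(gap, 2))
```

Running the audit across many subgroups and reporting the worst-case gap, rather than a single aggregate accuracy number, is what surfaces the kind of bias the researchers describe.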
....
Drew Harwell is a national technology reporter for The Washington Post specializing in artificial intelligence. He previously covered national business and the Trump companies. Follow https://twitter.com/drewharwell
Read more: https://www.washingtonpost.com/technology/2019/01/25/amazon-facial-identification-software-used-by-police-falls-short-tests-accuracy-bias-new-research-finds/
David Fahrenthold Retweeted
https://twitter.com/Fahrenthold
Amazon's facial-recognition software, marketed 2 law enforcement as a powerful crime-fighting tool, struggles 2 pass basic tests of accuracy, raising concerns abt how biased results could tarnish AI's exploding use by police & surveillance. By @drewharwell
McCamy Taylor
(19,240 posts)
without actually having the complicated human right brain that allows us to recognize faces. For instance, such software reads me as "juvenile male" from the nose up. And I ain't a juvenile male! Savvy makeup artists will be applying the touches that they know will fool the computer.
durablend
(7,468 posts)
It was Mechanical Turk that "trained" the AI, clearly doing a shitty job of it.