
mahatmakanejeeves

(57,756 posts)
Fri Jan 25, 2019, 02:09 PM Jan 2019

Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

Source: Washington Post

Technology
Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds

By Drew Harwell
January 25 at 11:01 AM

Facial-recognition software developed by Amazon and marketed to local and federal law enforcement as a powerful crime-fighting tool struggles to pass basic tests of accuracy, such as correctly identifying a person's gender, new research released Thursday says.

Researchers with M.I.T. Media Lab also said Amazon's Rekognition system performed more accurately when assessing lighter-skinned faces, raising concerns about how biased results could tarnish the artificial-intelligence technology's use by police and in public venues, including airports and schools.

Amazon's system performed flawlessly in predicting the gender of lighter-skinned men, the researchers said, but misidentified the gender of darker-skinned women in roughly 30 percent of their tests. Rival facial-recognition systems from Microsoft and other companies performed better but were also error-prone, they said.

The problem, AI researchers and engineers say, is that the vast sets of images the systems have been trained on skew heavily toward white men. The research shows, however, that some systems have rapidly grown more accurate over the past year after greater scrutiny and corporate investment into improving the results.
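Audits like the MIT Media Lab study measure this kind of disparity by computing the error rate separately for each demographic subgroup rather than one overall accuracy number. A minimal sketch of that idea, using made-up data (the numbers below only illustrate the reported pattern and are not the study's actual results):

```python
from collections import defaultdict

def error_rates_by_group(predictions):
    """Compute the misclassification rate per demographic subgroup.

    predictions: iterable of (group, predicted_label, true_label) tuples.
    Returns a dict mapping each group to its fraction of wrong predictions.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in predictions:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit results mirroring the disparity described above:
sample = (
    [("lighter-skinned male", "male", "male")] * 100        # all correct
    + [("darker-skinned female", "male", "female")] * 30    # misclassified
    + [("darker-skinned female", "female", "female")] * 70  # correct
)
rates = error_rates_by_group(sample)
# rates["lighter-skinned male"] -> 0.0
# rates["darker-skinned female"] -> 0.3
```

Reporting accuracy per subgroup is what exposes a system that looks "flawless" on average while failing roughly 30 percent of the time for one group.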
....

Drew Harwell is a national technology reporter for The Washington Post specializing in artificial intelligence. He previously covered national business and the Trump companies. Follow https://twitter.com/drewharwell

Read more: https://www.washingtonpost.com/technology/2019/01/25/amazon-facial-identification-software-used-by-police-falls-short-tests-accuracy-bias-new-research-finds/



David Fahrenthold Retweeted

https://twitter.com/Fahrenthold

Amazon's facial-recognition software, marketed 2 law enforcement as a powerful crime-fighting tool, struggles 2 pass basic tests o/accuracy, raising concerns abt how biased results could tarnish AI's exploding use by police & surveillance. By @drewharwell


Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds (Original Post) — mahatmakanejeeves, Jan 2019
1. It would be so easy to fool a computer since it is programmed to check a few signs — McCamy Taylor, Jan 2019
2. Keep in mind — durablend, Jan 2019

McCamy Taylor

(19,240 posts)
1. It would be so easy to fool a computer since it is programmed to check a few signs
Fri Jan 25, 2019, 02:12 PM Jan 2019

without actually having the complicated human right brain that allows us to recognize faces. For instance, such software reads me as "juvenile male" from the nose up. And I ain't a juvenile male! Savvy makeup artists will be applying the touches that they know will fool the computer.
