AI researchers tell Amazon not to sell 'flawed' facial recognition


Studies show Amazon's facial recognition technology often has higher error rates for minorities


Artificial intelligence researchers from Google, Facebook, Microsoft and several leading universities have called on Amazon to stop selling its facial recognition technology to law enforcement.


In an open letter published today, researchers say studies have repeatedly shown that Amazon's algorithms are flawed, with higher error rates for darker-skinned faces and female faces.

Researchers say that if police adopt such technology, it has the potential to amplify racial discrimination, create cases of mistaken identity and encourage intrusive surveillance of marginalized groups.

"FLAWED FACIAL ANALYSIS TECHNOLOGIES ARE REINFORCING HUMAN BIASES."

That warning comes from Scheuerman, a Ph.D. student at the University of Colorado Boulder and one of 26 signatories to the letter. Scheuerman says such technologies "may be appropriated for malicious purposes ... in a way that the companies that supply them have no knowledge of."

Other signatories include Timnit Gebru, a researcher at Google whose work has highlighted failures in facial recognition algorithms; Yoshua Bengio, an AI researcher who recently received the Turing Award; and Anima Anandkumar, a Caltech professor and former principal scientist at Amazon's AWS subsidiary.

Anandkumar says via email that she hopes the letter will open a "public dialogue on how we can evaluate facial recognition," adding that technical frameworks are needed to evaluate this technology.

"Government regulation can only happen once we have established technical frameworks to evaluate these systems," says Anandkumar.

Amazon is a leading provider of facial recognition technology, and its algorithms have repeatedly come under such scrutiny.

A study released earlier this year showed that the company's software has more difficulty identifying the gender of darker-skinned men and women, while a 2018 test conducted by the ACLU found that Amazon's Rekognition software incorrectly matched photos of 28 members of Congress with police mugshots.

[Photo caption: In China, police officers wear sunglasses with built-in facial recognition to detect criminals in public places.]

Amazon has defended its technology, and much of the letter released today offers a point-by-point rebuttal of the company's arguments.

The authors point out, for example, that although Amazon says it has received no reports of police misusing its facial recognition, that statement is not meaningful, since there are no laws under which such use could be audited.

As studies find more flaws in this technology, protests from researchers, shareholders, and technology employees are becoming more frequent.

Google has declined to sell facial recognition software because of its potential for abuse, while Microsoft has called for government regulation. Exactly how this technology should be monitored, however, remains a difficult question.

"These technologies must be developed in a way that does not harm human beings, or perpetuate harm to already historically marginalized groups."

says Scheuerman. He stresses that, to do so, there needs to be more dialogue between those developing facial recognition technology and those criticizing it. 

"This is an interdisciplinary problem that requires an interdisciplinary approach."
