Amazon to stop use of its facial recognition software by cops

In a major policy shift for the company, Amazon is placing a one-year moratorium on police use of its facial recognition software. For years the company has been a fierce defender of the controversial software.

The company will suspend law enforcement use of the program to give policymakers room to establish a regulatory framework for a technology that has sparked controversy for years and shone an uncomfortable spotlight on Amazon’s cloud computing division. The step comes in the wake of protests over police violence and racism after a police officer killed George Floyd, an unarmed black man. Research has repeatedly found that facial recognition technology often has trouble recognizing individuals with darker skin, and for advocacy groups the software recalls earlier policy overreach that infringed on civil liberties.

Amazon Web Services, the company’s cloud computing group, released Rekognition in 2016, a software service designed to identify objects in images and videos, including the ability to match a face against images in a database without having to compare them manually.
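For a sense of how customers consume these capabilities, here is a minimal sketch using boto3, AWS’s Python SDK, against Rekognition’s public label-detection and face-search APIs. The bucket, image, and collection names are placeholders, and a face collection would have to be created and populated beforehand; this illustrates the service model described above, not any particular deployment.

```python
import boto3

# Rekognition is consumed as a web API; boto3 is AWS's official Python SDK.
rekognition = boto3.client("rekognition", region_name="us-west-2")

# Label detection: identify objects and scenes in an image stored in S3.
# ("example-bucket" and "street.jpg" are placeholder names.)
labels = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "street.jpg"}},
    MaxLabels=10,
    MinConfidence=80,
)
for label in labels["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))

# Face search: match a probe face against a pre-built face collection,
# the capability that drew scrutiny when used by police.
# ("example-collection" is a placeholder; the collection must already exist.)
matches = rekognition.search_faces_by_image(
    CollectionId="example-collection",
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "probe.jpg"}},
    FaceMatchThreshold=90,
    MaxFaces=5,
)
for match in matches["FaceMatches"]:
    print(match["Face"]["FaceId"], round(match["Similarity"], 1))
```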

Rekognition isn’t the only software of this sort. Amazon rivals such as Microsoft Corp. and Google have similar capabilities. But Amazon’s software became the focus of an intense debate about the potential for powerful new software to undermine human rights after the American Civil Liberties Union called out the risks of misidentifying people with such tools. The group highlighted Amazon’s relationships with a sheriff’s office in Oregon and the city of Orlando, two customer relationships that Amazon had promoted in its marketing materials.

“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” said Nina Lindsey, an Amazon spokeswoman. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

House and Senate Democrats included a provision in a sweeping police-reform bill introduced Monday that would block real-time facial recognition analysis of federal police body camera footage. Amazon said other organizations will continue to use the software, including groups that use facial recognition to fight human trafficking.

Rekognition runs on Amazon servers and is delivered to customers as an internet service, which in principle makes it relatively easy for Amazon to suspend access for police users. How many law enforcement agencies use Rekognition remains unknown. In an interview for a PBS Frontline investigation that aired earlier this year, AWS chief Andy Jassy said he didn’t know the total number of police departments using Rekognition.
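Because access runs through AWS’s standard identity and access management layer, cutting a customer off is, in principle, a policy change rather than a software recall. As a hypothetical illustration of that access-control model (not Amazon’s internal mechanism), an AWS account administrator could block all Rekognition calls for a given IAM user with an explicit deny policy; the user and policy names below are made up.

```python
import json
import boto3

iam = boto3.client("iam")

# Deny every Rekognition API action for one IAM user.
# An explicit Deny overrides any Allow the user might otherwise have.
deny_rekognition = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Deny", "Action": "rekognition:*", "Resource": "*"}
    ],
}

iam.put_user_policy(
    UserName="example-analyst",          # placeholder user name
    PolicyName="DenyRekognitionAccess",  # placeholder policy name
    PolicyDocument=json.dumps(deny_rekognition),
)
```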

“It’s sort of the first, real, meaningful concession we’ve seen from Amazon allowing that use of facial recognition by police might not be good for communities” harmed by biased policing, said Shankar Narayan, who expressed concerns about Rekognition to Amazon officials while at the ACLU of Washington, which he left earlier this year. “The move shows that Amazon is vulnerable to public pressure and optics,” said Narayan, a co-founder of MIRA, an organization working to give civil society groups a greater say in how new technologies are used.

Amazon, which has long been reluctant to bow to external pressure on public policy issues, claimed such studies did not accurately reflect its software’s capabilities. The company has said there have been no documented incidents of law enforcement misusing Rekognition, but Amazon’s ability to monitor how the software is used is limited by AWS security and privacy practices that restrict it from analyzing customer data.

Pressure on Amazon stepped up after a January 2019 study by two AI researchers, Inioluwa Deborah Raji and Joy Buolamwini, showed the software made more mistakes when used on people with darker skin, especially women. Amazon disputed the paper’s conclusions and methodology, leading some of the top AI scientists, including Turing Award winner Yoshua Bengio, to criticize both Amazon’s sale of the product to police and its treatment of Raji and Buolamwini. In a separate test, the ACLU ran the software on members of Congress and found it falsely matched 28 of them with mugshots, disproportionately selecting minority lawmakers.

“We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future,” Matt Wood, an executive in Amazon’s machine learning group, said in a 2018 blog post. “The world would be a very different place if we had restricted people from buying computers because it was possible to use that computer to do harm.”

Source: https://tech.hindustantimes.com/tech/news/amazon-to-pause-use-of-facial-recognition-software-by-cops-71591848562092.html