Phil Ting doesn’t have a criminal record. Yet the California State Assemblyman is one of 26 lawmakers falsely matched to a mugshot in Amazon Rekognition’s public 25,000-photo database, the American Civil Liberties Union announced Tuesday. Amazon sells its facial recognition tool to law enforcement for use in body cameras, but Ting and the ACLU want to change that.
“I wasn’t surprised that I was falsely identified, but the stakes are much higher for ordinary Californians,” Ting told Forbes. “I don’t think it’s okay to use the technology to determine if someone is innocent or should be arrested. We’re innocent until proven guilty. Facial recognition would force individuals to prove that they’re innocent.”
In May, the California State Assembly approved Ting’s bill to ban law enforcement’s use of facial recognition and biometric scanners, which identify things such as how a person walks, in body cameras. When the bill goes before the State Senate in two weeks, California stands to be the largest state to prohibit facial recognition on officer-worn body cameras and would join New Hampshire and Oregon, both of which passed similar bans in 2017.
To illustrate the inaccuracy of Amazon Rekognition, the ACLU ran photos of 120 California lawmakers, including Ting, against the database; 26 of them, or 21%, were falsely matched with arrest photos. Amazon says the technology is continually updated with new data, but that has done little to curb errors: in a similar test against a larger pool of subjects in 2018, the ACLU found that 28 of 535 members of Congress (just over 5%) were falsely matched to mugshots. Though people of color made up 20% of Congress, they accounted for 40% of the bad matches.
“It’s not just whether the false matches in this exact test were people of color,” Matt Cagle, a technology and civil liberties attorney at the ACLU, told Forbes. “The reality is these cameras are going to be in communities that are already over-policed and where body cameras are more present than in other whiter or more affluent communities.”
In response to the ACLU’s findings, Amazon said in a statement that it continues to advocate for regulation of facial recognition technology, but that the ACLU’s testing misused the tool.
“The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines,” the company’s statement reads. “When used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking.”
Advocates of Ting’s Assembly Bill 1215 say that if the bill does not pass, the over-policing of communities of color in California means people of color would be scanned disproportionately often. “People getting scanned are going to be disproportionately people of color, the inputs are going to be skewed, potentially even higher than the portion of people in the legislature,” Cagle said.
Researchers at MIT and the Georgetown Center on Privacy and Technology have found that Amazon Rekognition is less accurate at identifying people who are not white men.
Despite bipartisan concern over privacy and the use of facial recognition and biometric technology, there remains scant federal oversight. Advocates hope the passage of Ting’s bill in California will inspire similar calls for oversight in other states.
A March 2019 study conducted by David Binder Research, a San Francisco-based research firm, found that 82% of likely November 2020 California voters somewhat or strongly disagree that the government should be able to monitor and track citizens’ identities and whereabouts using biometric data.
San Francisco recently became the first city in the nation to ban the use of facial recognition technology by police and government agencies. Rosenberg does worry about an uptick in neighborhood surveillance, she said, but is pleased that the bill would stop facial recognition technology from misidentifying people as criminal suspects in real-time policing.
“Right now, the California legislature has a decision to make,” Cagle says. “Do they add facial recognition to body cameras and break the promise of these tools that were offered to communities, or do they draw a line in the sand with a flawed technology and say these tools should be solely used to keep officers accountable?”