
Mass. bill targets police misuse of facial recognition

Proposals from Reps. Orlando Ramos and Dave Rogers and Senate Majority Leader Cindy Creem aim to curb the racial disparities tied to facial recognition technology

[Photo: People walking by a camera. Steffi Loos/Getty Images]

Police departments in Massachusetts would be significantly curtailed in their ability to rely on facial recognition technology under revived legislation, touted by civil rights activists, that is meant to address racial disparities in the technology's use, which have led to people of color being wrongfully suspected of crimes.

Proposals from Reps. Orlando Ramos and Dave Rogers and Senate Majority Leader Cindy Creem (H 1728/S 927) would centralize the use of the technology — which identifies people based on physical characteristics of their face, head and body — within the Massachusetts State Police. That consolidation ensures that 351 local police departments are not each conducting their own facial recognition searches with different types of software and levels of expertise, Kade Crockford of the ACLU of Massachusetts said at a legislative briefing on Thursday.

"We limit the number of entities that could potentially be violating the public's trust, using this technology in a way that doesn't comply with the law," Crockford, director of the ACLU's Technology for Liberty Project, told Beacon Hill staffers and reporters. "Nobody can promise perfect accountability with any statute that's on the books. The intent here is to really narrow the possibility for abuse and misuse and increase the possibility for effective accountability and oversight."

Crockford served on the Facial Recognition Commission — composed of lawmakers, public safety officials, attorneys and other experts — which last year published recommendations, endorsed by then-Attorney General Maura Healey, that the proposals would put into law. The bills have been awaiting a hearing before the Joint Committee on the Judiciary since February.

Rahsaan Hall, president and CEO of the Urban League of Eastern Massachusetts, repeatedly condemned the "faulty, unregulated, racially biased technology" at the briefing, which he said "overwhelmingly, disproportionately misidentifies darker skinned people and women." Hall read a list of names to underscore the stories of people of color wrongfully jailed because of the software, including a Detroit man who was held in custody for 30 hours after police used grainy security footage to link him to a shoplifting case.

"Each of these individuals was wrongly accused, in some instances of heinous acts and others simple criminal offenses, but the disruption to their lives was so significant," Hall said.

Creem said this is her third time filing legislation tied to facial recognition technology.

Senate Democrats gave the bill initial approval last session but never fully passed it over to the House. The House had overwhelmingly adopted a Ramos facial recognition amendment to a bond bill, though the policy didn't survive conference committee talks with the Senate.

"The truth is the third time is the charm," Creem said at the briefing. "I know it's going to work. It's very important."

Under the legislation, local law enforcement officials who want to use facial recognition technology to investigate felonies would need to obtain a warrant from a judge and then bring it to the State Police, who would run the search, Crockford said. In emergency situations, such as an immediate threat to health or safety, no warrant would be needed, Crockford said.

The bills prohibit mass surveillance of citizens, a practice that Crockford noted is used by the Chinese government, and contain due process protections for defendants who have been identified through facial recognition searches.

Rogers said the legislation is needed to strengthen the 2020 police reform bill, particularly facial recognition provisions that he said former Gov. Charlie Baker had "weakened."

"I really hope we can move it forward because while we have made progress to have a law on the books, it has some significant weaknesses," Rogers said. "And now is the time to rectify those weaknesses."

Erik Learned-Miller, a computer science professor at the University of Massachusetts Amherst, said people, including police officers, have put too much trust in the technology.

Learned-Miller, who has developed facial recognition software, said artificial intelligence can make mistakes — even when the tools are advertised as 99% accurate. For example, he said, software that works well on passport photos, in which people are expressionless and brightly lit, may falter on grainy parking garage footage shot late at night.
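A rough back-of-the-envelope sketch shows why a headline accuracy figure can mislead in police work. The numbers below are illustrative assumptions, not figures from the briefing: if a matcher falsely matches 1% of the faces it compares (the flip side of "99% accurate") and a single probe image is searched against a gallery of one million mugshots, the expected number of false leads is simply the product of the two.

    # Illustrative sketch (assumed numbers, not from the briefing): why a
    # "99% accurate" matcher can still flood investigators with false leads
    # when one probe image is searched against a large gallery.

    def expected_false_matches(gallery_size: int, false_match_rate: float) -> float:
        # Treat each gallery entry as an independent chance of a false match,
        # so the expected count is gallery_size * false_match_rate.
        return gallery_size * false_match_rate

    GALLERY_SIZE = 1_000_000   # assumed: a one-million-image mugshot database
    FALSE_MATCH_RATE = 0.01    # assumed: "99% accurate" per comparison

    print(expected_false_matches(GALLERY_SIZE, FALSE_MATCH_RATE))
    # -> 10000.0 potential false leads from a single search

Under those assumed numbers, one search would surface on the order of 10,000 false candidates, a scale of error that is harmless when sorting vacation photos but not when generating criminal suspects.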

"There are different applications like sorting photos on your personal computer, unlocking your smartphone and other things in which an error can produce fairly benign errors — but police work is really different," Learned-Miller, who served on the facial recognition commission as a Baker appointee, said. "It's important to tailor legislation to the application that you're talking about ... This legislation makes the right balance, and I fully, fully support what we're trying to do here."

Copyright State House News Service