FBI’s facial recognition database is dangerously inaccurate
It returns false positives 15 percent of the time, more often with minorities and women.
Despite law enforcement’s attempts to conceal its existence, it’s no secret that half of Americans over the age of 18 — 117 million people in total — are part of a massive facial recognition database, their personal information culled from DMV files in 18 states. A staggering 80 percent of the people in the database don’t have any sort of arrest record. Yet the system’s recognition algorithm misidentifies subjects during criminal searches 15 percent of the time, with black women misidentified most often, the House Committee on Oversight and Government Reform heard last week.
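To put those figures in rough perspective, here is a minimal back-of-envelope sketch in Python. The 117 million enrollment, the 80 percent no-arrest share, and the 15 percent error rate come from the article; the annual search volume is a made-up assumption purely for illustration.

```python
# Back-of-envelope math on the article's figures.
ENROLLED = 117_000_000        # people in the database (article figure)
NO_ARREST_SHARE = 0.80        # share with no arrest record (article figure)
ERROR_RATE = 0.15             # wrong-subject rate per search (article figure)
SEARCHES_PER_YEAR = 100_000   # hypothetical volume, NOT from the article

innocents_enrolled = ENROLLED * NO_ARREST_SHARE
wrong_hits_per_year = SEARCHES_PER_YEAR * ERROR_RATE

print(f"Enrolled with no arrest record: {innocents_enrolled:,.0f}")
print(f"Expected wrong-subject hits/yr: {wrong_hits_per_year:,.0f}")
```

Under those assumptions, roughly 93.6 million people with no arrest record are searchable, and every 100,000 searches would implicate about 15,000 wrong subjects.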
“Facial recognition technology is a powerful tool law enforcement can use to protect people, their property, our borders, and our nation,” said the committee chair, Jason Chaffetz. “But it can also be used by bad actors to harass or stalk individuals. It can be used in a way that chills free speech and free association by targeting people attending certain political meetings, protests, churches, or other types of places in the public.”
The database got its start back in 2010 with the FBI’s Next Generation Identification system, which was designed to supplement the agency’s existing fingerprint database. The problem was that the FBI didn’t bother to tell the public about the new system, or to publish the legally required privacy impact assessment, until 2015. The way the system was populated also differed significantly from the fingerprint database: photos were collected proactively rather than gathered from crime scenes.
“No federal law controls this technology, no court decision limits it. This technology is not under control,” Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law, told the committee. This lack of oversight and regulation has dangerous consequences.
The Government Accountability Office (GAO) did audit the FBI’s facial recognition algorithms last year and found the system sorely lacking in accountability, accuracy, and oversight. “It doesn’t know how often the system incorrectly identifies the wrong subject,” explained the GAO’s Diana Maurer. “Innocent people could bear the burden of being falsely accused, including the implication of having federal investigators turn up at their home or business.” The GAO found that black people, especially women, were more likely to be subjected to a facial recognition search and more likely to be misidentified than whites.
“I think the issue goes beyond the First Amendment concerns that were expressed,” Rep. Paul Mitchell (R-MI) said during the hearing. “I don’t want to just protect someone if they’re in a political protest from being identified; the reality is we should protect everybody unless there is a valid, documented criminal justice action.” There’s no word on whether the FBI is complying with the GAO’s recommendations, or whether the House committee’s tongue-lashing will spur the agency to action.
IMAGE: Gary Cameron / Reuters
For more on this story and video, go to: https://www.engadget.com/2017/03/27/fbis-facial-recognition-database-is-dangerously-inaccurate/