Facial recognition surveillance in Congress’s crosshairs
Too often, the government takes a “surveil first, ask permission later” approach to new and ultimately dangerous tools. For facial recognition, it’s time to change that and rein in an out-of-control system.
Amid a congressional environment where Democrats and Republicans are often loudly at odds, last week’s start to a series of hearings on facial recognition surveillance featured members whose only conflict seemed to be who could express the most outrage about the technology.
House Oversight Committee Chairman Elijah Cummings (D-Md.) said, “there are virtually no controls …. Whatever walk of life you come from, you may be a part of this [surveillance] process.”
The committee’s top Republican, Rep. Jim Jordan (R-Ohio), declared, “It’s time for a time out” on government use of the technology.
We’re likely to see a similar united front next week, when the committee brings in the FBI and other government agencies to examine the technology’s efficacy.
As lawmakers on both sides of the aisle call for robust limits and even a potential ban, it is clear that police departments’ unrestricted use of facial recognition is unsustainable. Fortunately, there are common-sense reforms that can gain consensus and fix this broken system.
Pervasive facial recognition surveillance is not just something out of a sci-fi dystopia: It’s already here. The FBI’s facial recognition system conducts an average of over 4,000 scans per month, and at least one in four state or local police departments can run facial recognition searches, either directly or via a partnering agency. This poses a staggering array of risks to civil rights, civil liberties and public safety.
Facial recognition offers the government unprecedented power to monitor our activities, threatening privacy and First Amendment rights. In China, facial recognition is a key component of a surveillance system that puts Big Brother’s surveillance apparatus to shame. Using a national database of “face prints” (think fingerprints, scanned from a distance, without notice or consent) and a blanket network of cameras, China tracks millions of individuals, logs their activities and targets minorities and dissidents.
As the recent hearing witnesses noted, there are no federal statutes to prevent facial recognition surveillance from taking a similar form in the United States. Indeed, in 2015, Baltimore police used it to scan individuals at a protest and arrest people with outstanding warrants “directly from the crowd.”
Moreover, facial recognition has a well-documented accuracy problem. Its use raises serious civil rights concerns, including the potential for improper investigations and other police actions. Poor datasets give facial recognition systems what Massachusetts Institute of Technology researcher and Algorithmic Justice League founder Joy Buolamwini calls a “coded gaze.” As she explained during the hearing, they are far more prone to inaccuracy when scanning women and people of color. Multiple studies by Buolamwini and others, including an FBI expert, confirm this.
Accuracy issues also endanger public safety. Multiple “real-time” facial recognition systems, which scan crowds and generate alerts for matches against a watchlist, register false matches over 90 percent of the time. Such a plainly dysfunctional system won’t help public safety. It will waste law enforcement resources, hamper community-police relations, and endanger innocent individuals who may be improperly “recognized” as they’re walking down the street. Yet major cities, including Detroit and Orlando, have begun implementing these systems.
Luckily, Congress appears ready to act swiftly. Several lawmakers suggested a moratorium on government use of the technology while its problems are addressed and proper limits are developed. This is reasonable; you don’t leave your hand in a pot of boiling water while trying to cool it down.
Meanwhile, broad consensus is emerging on necessary safeguards. There are viable solutions, including requiring police to obtain a warrant before using the technology, limiting its use to serious crimes, mandating human review of matches, and establishing independent testing and accuracy standards.
Jake Laperruque is senior counsel for The Constitution Project at the Project On Government Oversight. The organization recently proposed recommendations on facial recognition surveillance, drafted by law enforcement officials, technologists and privacy experts.