Privacy advocates fear that the growing use of facial recognition software in the private sector could lead to increased public surveillance and compromise civil liberties.
As cruise lines, NFL teams, airlines and retailers like Walmart begin to test and use facial recognition software for their own security systems, experts worry that the technology is ushering in a new erosion of personal privacy.
“Biometric surveillance creep is going on in both the government and the private sector,” said Adam Schwartz, an attorney at the Electronic Frontier Foundation, a digital civil liberties group.
Schwartz and other privacy advocates worry that the increased collection of biometric data, especially through facial recognition software, poses a danger to the public.
They fear that the increased use of facial recognition software will make mass surveillance easier to accomplish. And the ubiquity of surveillance cameras, which can easily be retrofitted with facial recognition software and fed into facial databases, means that there’s already an apparatus in place for large-scale monitoring.
“This is a really big concern for us,” Schwartz said. “What makes facial recognition different from other biometrics is that it’s very easy to collect from a person without their noticing.”
Facial recognition software poses a special privacy risk because data can be collected on a massive scale without consent from the people under surveillance.
Fingerprint scans require an individual to place their finger on a scanner, while DNA collection is hard to accomplish without being noticed. By comparison, facial scans can be done anywhere with cameras linked to facial recognition software, whether or not the subject knows their face is being captured.
Experts also worry that biometric data, like fingerprints, DNA or facial imagery, is even more sensitive than important but alterable personal information like credit card and Social Security numbers. Databases containing biometric data pose huge cybersecurity concerns because they are increasingly valuable to hackers.
“Once your biometrics are stolen, hackers can do things with them,” Schwartz said. “If someone gets their hands on your thumbprint, perhaps that can be weaponized against you, and someone can get into things you were using it to get into.”
Schwartz points out that once such information is stolen, a fingerprint or face can’t be changed to keep a hacker from using a victim’s biometric data.
These issues are becoming more urgent as the use of facial recognition technology begins to grow.
The FBI already has an enormous facial recognition database with access to as many as 411.9 million images. State and local law enforcement agencies also use the technology in their policing.
But facial recognition is no longer confined to government agencies, which are at least on some level accountable to the public.
In 2001, the NFL used it during the Super Bowl to monitor the crowd for criminals without attendees’ knowledge or consent. This year, facial recognition tech could return to football: two NFL teams contracted with security company IDEMIA in 2017 to create a TSA PreCheck-style security system for fans entering their stadiums.
Like TSA PreCheck, which IDEMIA also provides, the system lets fans register for entrance into games. Stadiumgoers registered with the program can then move through security more quickly, cleared through biometric measures like fingerprint and face scanners. IDEMIA said that it will use biometric data for its system but is still deciding whether it will use fingerprint scanning, facial recognition software or a combination of the two.
IDEMIA is also using facial recognition technology to create similar systems for the Royal Caribbean cruise line. The firm is also working with other companies on deals that haven’t been announced yet, according to IDEMIA North American President Robert Eckel.
IDEMIA’s use of facial recognition software is different from the NFL’s previous stab at the technology. Individuals’ facial data is collected only after they give consent to IDEMIA — a step that security experts appreciate.
But while IDEMIA’s facial recognition system is opt-in, advocates are still wary of what more widespread use of facial recognition technology could lead to.
“There’s a fear we have of increased normalization of this,” Schwartz said. “Once people start doing facial recognition on an airplane, they’ll get used to it in a supermarket. And then all of sudden our lives become more and more on display.”
The kind of facial recognition systems Schwartz is concerned about are already being tested at British grocery stores.
Even if IDEMIA is doing its part to get consent from individuals, though, some advocates fear that consent can be blurred if opting for facial recognition is the more convenient option.
“Consent can be manipulated,” said Jay Stanley, a senior policy analyst at the American Civil Liberties Union. “I’ve raised questions about whether or not the TSA is doing that.”
Stanley and Schwartz note that the line between genuine and coerced consent is thin and can be influenced by factors like line speed. They question whether consent is really consent if one line is moving faster than another, prompting individuals to simply pick the easier option.
“You can put more workers in some lines and make them move more quickly and make others move more slowly,” Stanley said.
That becomes a complicated issue, though, since IDEMIA’s technology is meant in part to move individuals through lines faster.
IDEMIA’s Eckel said that individuals who have opted in appreciate the results of facial recognition technology so far, regardless of experts’ concerns.
“People are very excited about how frictionless it is and how simple it is,” Eckel said. “They just come and they’re waved through.”