With Facial Recognition Technology, Govt Use is One Concern of Many
BOSTON – Last week, the American Civil Liberties Union, through its Massachusetts chapter, filed a lawsuit in federal court, asking the court to order the U.S. Department of Justice (DOJ), the Federal Bureau of Investigation (FBI), and the Drug Enforcement Administration (DEA) to produce records the ACLU has already sought through a Freedom of Information Act (FOIA) request.
In its lawsuit, the ACLU notes that it filed its request last January, seeking information on “policies, contracts, and other records relating to the Defendants’ use of face recognition programs and other biometric identification and tracking technology,” noting that to date, “none of the Defendants has released any record responsive to the Request.”
The ACLU argues that production of the records is “important to assist the public in understanding the government’s use of highly invasive biometric identification and tracking technologies,” adding that the two technologies “have the potential to enable undetectable, persistent, and suspicionless surveillance on an unprecedented scale.”
“Such surveillance would permit the government to pervasively track people’s movements and associations in ways that threaten core constitutional values,” the ACLU asserts in its lawsuit.
However you might feel about the idea of law enforcement using facial recognition technology in its pursuit of suspects and investigation of leads, the fact that we know so little about how the government employs such technologies is troubling, as Kade Crockford, the Director of the Technology for Liberty Program at the ACLU of Massachusetts, observes in a recent blog post about the lawsuit.
“Because of the FBI’s secrecy, little is known about how the agency is supercharging its surveillance activities with face recognition technology,” Crockford writes. “But what little is known from public reporting, the FBI’s own admissions to Congress, and independent tests of the technology gives ample reason to be concerned.”
Crockford observes that the FBI recently testified that the agency “does not need to demonstrate probable cause of criminal activity before using its face surveillance technology” and that witnesses at a recent hearing “also could not confirm whether the agency is meeting its constitutional obligations to inform criminal defendants when the agency has used the tech to identify them.”
Adding to the ACLU’s concern over facial recognition technology are questions about the efficacy and accuracy of the technology itself.
“This lack of transparency would be frightening enough if the technology worked,” Crockford writes. “But it doesn’t: Numerous studies have shown face surveillance technology is prone to significant racial and gender bias…. When our freedoms and rights are on the line, one false match is too many.”
While Amazon, the company behind the facial recognition software reportedly being used by the FBI, has disputed the notion that its software produces as many false positives as the ACLU claims, the Washington Post notes that “local agencies are free to disregard those thresholds in their searches.” The Post adds that a sheriff’s office in Oregon that used the ‘Rekognition’ software told the Post earlier this year that “each facial-recognition search returned five possible results, whether the system was highly confident in the match or not.”
The ACLU is not alone in harboring reservations about law enforcement’s use of facial recognition technology and its databasing of images of American citizens. In September, a bipartisan group of eight Senators and Representatives sent a letter to Kevin McAleenan, the Acting Secretary of the U.S. Department of Homeland Security, requesting information about use of facial recognition technologies by Immigration and Customs Enforcement (ICE) and the FBI.
Noting that ICE and the FBI “have accessed millions of Americans’ photographs without their knowledge or consent through state driver’s license databases,” the Congressmen further observe that “almost all of the photographs maintained in these databases and reviewed by ICE or the FBI are individuals not suspected of or charged with criminal activity.”
“We write to request information about how ICE and the FBI use those databases, and specifically how those agencies use facial recognition technology to scan individuals’ photographs and personal information for criminal investigative purposes.”
The letter is signed by Senators Ron Johnson (Rep.), Gary Peters (Dem.), Christopher Coons (Dem.), Rand Paul (Rep.), Richard Blumenthal (Dem.) and Mike Lee (Rep.), as well as Representatives Bennie Thompson (Dem.) and Jerrold Nadler (Dem.), who serve as chairmen or ranking members of committees with oversight responsibility for the Department of Homeland Security and the federal judiciary, underscoring the degree of Congressional concern on this topic.
While the ACLU’s lawsuit and the letter from the Senators and Representatives serve to sound the alarm over governmental use (and lack of transparency about that use) of facial recognition technology, anyone who values their privacy ought to be concerned about private and corporate use of facial recognition, as well.
Adult performers got a small taste of what might stem from unfettered application of facial recognition technologies in an adult industry context earlier this year, when a Weibo user claimed to have built a database by indexing images from adult websites, enabling users to identify social media users who had also performed in porn.
While the developer soon announced he’d canceled his project after facing widespread backlash, it’s likely only a matter of time before someone else offers such a ‘service’ – and sticks by their offering, regardless of any blowback from the online community. For that matter, given that the international private sector isn’t subject to the same sort of scrutiny the ACLU and Congress seek to impose on American law enforcement agencies, it’s quite possible that some developer, company or foreign state actor has already deployed the technology for its own purposes, whatever those purposes may be.
It’s not hard to picture how private entities armed with facial recognition technology could wreak havoc on the adult industry (or on society at large, naturally). Imagine a situation in which a company deploys facial recognition technology to snoop on its own employees, seeking out potentially problematic images of them on social media – or perhaps discovering that one of its employees is a cam performer on the side, or a former adult performer whose prior career had hitherto avoided detection by the company.
Now that it has been developed, facial recognition technology is likely here to stay, part of the landscape of both law enforcement and private use. No doubt, it’s a potentially useful tool, particularly for law enforcement. The challenge will be preventing abuse of its increasingly potent capabilities. How that challenge will (or can) be met is another question altogether.