Update from LitLand: The ACLU Sues the Government Over the Use of Facial Recognition Technology
LitLand is a monthly feature that reviews developments in litigation as they relate to privacy matters and highlights any past, current, and future cases about which you should know.
The American Civil Liberties Union (ACLU) is no stranger to suing the United States government for perceived violations of civil rights, and its newest lawsuit opens the latest battleground for individual rights. The lawsuit, filed on October 31, 2019, in federal district court in Massachusetts, seeks enforcement of a Freedom of Information Act (FOIA) request for records related to the government’s use of facial recognition technology.
The ACLU alleges that the Department of Justice (DOJ), the Drug Enforcement Administration (DEA), and the Federal Bureau of Investigation (FBI) failed to release records in response to a FOIA request filed by the ACLU in January 2019.
ACLU Eyes Info on FBI Facial Recognition Unit and Database
The ACLU’s request "sought policies, contracts, and other records relating to the [agencies’] use of facial recognition programs and other biometric identification and tracking technology." Specifically, the ACLU seeks documents related to an investigative unit known as FACE (Facial Analysis, Comparison, and Evaluation) and an FBI database of over 30 million photos, known as the Next Generation Identification-Interstate Photo System. Moreover, the FBI is allegedly in the process of purchasing or developing technology that will assist in the collection of other types of biometric data, including "voice prints, gait prints, and other forms of biometric identification."
The ACLU seeks an injunction prohibiting the government "from charging Plaintiffs search, review, processing, and duplication fees in connection with responding to the Request," as well as the immediate production of documents responsive to the Request.
What Is Facial Recognition Technology?
In the simplest terms, facial recognition technology uses algorithms and software to create templates of a face, which can then be compared against a repository of photographs, images, or other templates to identify an individual. The multifaceted technology can be used in a variety of ways, from innocuous uses, like helping you organize your phone’s photo gallery into groups based on the people in the pictures, to deployments that affect the public, like helping law enforcement identify criminals and curb potential crime.
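To make the template-and-comparison idea concrete, the sketch below shows, in simplified Python, how a probe template might be matched against a gallery of enrolled templates. It is purely illustrative and does not describe any government system: the gallery names, the 128-dimensional vectors, and the similarity threshold are all invented placeholders standing in for the output of a real face-embedding model.

```python
import numpy as np

# Illustrative sketch only: a real system would derive templates from a
# trained face-embedding model; random vectors stand in for them here.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return the cosine similarity between two face templates."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Compare a probe template against every enrolled template and return
    the best match above the threshold, or None if nothing is close enough."""
    best_name, best_score = None, -1.0
    for name, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# Hypothetical gallery of enrolled templates (128-dimensional placeholders).
rng = np.random.default_rng(0)
gallery = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}

# A probe that is a slightly noisy copy of person_a's template should match it.
probe = gallery["person_a"] + rng.normal(scale=0.05, size=128)
print(identify(probe, gallery))
```

The threshold in a sketch like this is where many of the accuracy concerns discussed below come into play: set it too loosely and the system returns false matches, a risk that studies have found falls unevenly across demographic groups.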
Though the technology is becoming more ubiquitous—almost every new smartphone can be unlocked with a glance—it has faced challenges due to privacy concerns, claims of potential constitutional violations, and ethical issues with respect to how it may adversely affect minorities and women. These concerns have led certain cities to increase regulation of the technology.
For example, the ACLU lawsuit lists a number of municipalities that have already banned or restricted the use of facial recognition by law enforcement and other government entities, including the cities of San Francisco, Oakland, and Berkeley in California; and Brookline, Springfield, and Cambridge in Massachusetts. The ACLU also cites recent efforts in the U.S. House of Representatives to curb the use of the technology in federally funded public housing and to prohibit the use of federal funds for purchasing or funding the technology.
As AI Proliferates, So Too Will Legal Challenges
Underlying the ACLU’s FOIA request and lawsuit is a litany of oft-cited concerns: that the technology is "highly invasive" and undetectable, "threaten[s] core constitutional values," and is inherently unreliable, especially as it relates to recognizing people of color; and that the public has an interest in knowing how the government is using the technology and what safeguards are in place to prevent abuse.
Irrespective of the outcome of the ACLU’s lawsuit, the technology will continue to be used, and to be challenged, at the same time. The path forward, however, will likely include some regulation of or limitation on governmental use of the technology.