Acting FTC Chair Signals Potential Enforcement Priorities: Privacy and AI
In opening remarks at the Future of Privacy Forum, Acting Federal Trade Commission (FTC) Chairwoman Rebecca Kelly Slaughter previewed enforcement priorities under the Biden Administration. The remarks were Slaughter's first major public address as acting chair, and she highlighted the particular technologies and activities likely to receive the FTC's attention.
Support for Federal Data Privacy Legislation
Noting the privacy and data risks raised by Americans' increased reliance on the internet for work, education, and socialization during COVID-19, Slaughter reminded the audience of her support for federal privacy legislation.
While an omnibus privacy law would likely give the FTC explicit enforcement authority, Slaughter emphasized that the FTC has been creative in its efforts to protect consumers' privacy and data even in the absence of such a law. She is encouraging FTC staff to be "innovative and creative" in their use of "the full panoply of tools available to the FTC."
Considering Relief: Disgorgement, Consumer Notice
Slaughter emphasized her focus on obtaining strong relief for consumers; she dissented in recent major FTC cases, such as the YouTube settlement, because she believed better outcomes should have been secured for consumers. She highlighted "meaningful disgorgement" and "effective consumer notice" as two remedies she wants the FTC to seek in future enforcement actions:
- Disgorgement: Slaughter referenced the FTC's recent settlement with Everalbum as an example of the potential application of disgorgement relief to privacy and security cases. If companies unlawfully collect and/or use consumers' data, the FTC should require disgorgement of both the improperly obtained data and any benefits from that data, Slaughter suggested.
In the Everalbum case, the company destroyed wrongfully collected data and the resulting models and algorithms. As we discussed after the settlement, this "fruit-of-the-poisonous-tree"-type remedy has significant potential ramifications for developers of artificial intelligence (AI) and machine-learning technologies.
- Consumer notice: Slaughter suggested that requiring proper consumer notice regarding privacy and security practices enables consumers to make informed choices about their data and "vote with their feet" to protect themselves.
She pointed to the FTC's recent settlement with Flo, a fertility-tracking app that allegedly violated its commitment not to share sensitive information with third parties; the settlement requires the company to notify consumers of the false promises it had made.
Slaughter's remarks also touched on antitrust and competition, as she noted that the FTC has a dual mandate to address both competition and consumer protection issues, including privacy. She suggested the FTC should view enforcement in these areas as complementary.
Pandemic-Related Risks
The FTC has a significant role to play in the COVID-19 response, Slaughter said, particularly as the pandemic raises consumer protection concerns. She emphasized the importance of protecting children who use educational technology (ed-tech) to learn from home; the FTC is currently reviewing the Children's Online Privacy Protection Act (COPPA) Rule, and Slaughter stated emphatically that COPPA applies to ed-tech.
Health apps, contact tracing, and other telehealth solutions raise privacy concerns as well, Slaughter noted. She is asking that FTC "staff take a close look" at these issues and also consider how "the importance of reliable Internet has grown" such that consumers need some "transparency regarding the privacy practices of broadband providers." Slaughter expects the FTC to issue a report on broadband privacy practices in 2021.
Racial Equity
Slaughter tied the FTC's role in addressing systemic racism to four areas: the digital divide exacerbated by COVID-19, AI and algorithmic decision-making, facial recognition technology, and the use of location data from mobile apps.
- COVID-19 and the digital divide: Although racial inequity is not a new phenomenon in the United States, the pandemic is widening existing gaps. Slaughter cited the "digital divide" between students who have the equipment and access to learn from home and those who do not, as well as the "outsized consequences" of privacy violations for communities at the margins.
- Algorithmic discrimination: Slaughter also raised concerns about algorithmic discrimination and the potential risks of artificial intelligence and algorithmic decision-making exacerbating racial disparities. She asked the FTC staff to investigate bias in algorithms and discriminatory outcomes.
- Facial recognition technology: Facial recognition technologies also present equity concerns, Slaughter explained, pointing to inaccuracies in the technologies' identification of non-white faces. The FTC will "redouble" its attention on violations of law related to facial recognition technology out of concern for discrimination and the "obvious" privacy implications of tools that identify otherwise unknown individuals.
- Geolocation data: Finally, Slaughter mentioned the use of geolocation information from mobile apps to locate Black Lives Matter protestors last summer. She said, "I'm concerned about misuse of location data generally but, in particular, as it applies to tracking Americans engaged in constitutionally protected speech."
An Aggressive Enforcement Future?
Slaughter's remarks may portend an active FTC that takes an aggressive stance on consumer privacy, data use, and technology deployments. We expect the FTC to consider issuing civil investigative demands on these issues in the coming months and years.