Privacy Triage: Five Tips to Identify Key Privacy Risks of New Products and Services
Thermal scanners alert businesses if anyone at their establishment has a higher than normal temperature. Apps tell a business when a customer who has placed an order has arrived at the parking lot for a curbside pick-up. Services stream online story time to children stuck at home.
In the true spirit of American consumerism, the pandemic has quickly inspired companies to roll out new products and services, not just to fight COVID-19, but to capitalize on new consumer needs and desires in a changed world.
New products and services, however, create new data workflows and new privacy headaches for the businesses seeking to implement or buy them. Striking the right balance between privacy and convenience (or in the context of COVID-19, public safety) is challenging in normal circumstances. Under the pressure of accelerated rollout cycles, the need to “do something,” and hardball sales tactics, privacy easily can—but should not—become an afterthought.
This short checklist identifies the privacy and security issues that should always be part of your risk analysis.
1. Privacy policies must accurately describe the organization’s processing of personal information
This is Privacy 101, but still represents a challenge for many businesses. A business’s disclosures and policies must accurately describe its collection, use, storage, and disclosure of personal information. Inaccurate disclosures create a risk of an enforcement action by the Federal Trade Commission (FTC) and/or state attorneys general for deception or unfairness.
Inaccurate or unclear disclosures also create a risk of bad press, which can damage an organization’s reputation and force it to expend resources to mitigate the problem. The state of North Dakota recently discovered this when it offered to the public a COVID-19 contact tracing app called Care19. The State’s privacy notice said that it did not share location data with third parties, but a researcher discovered that the app’s developer enabled it to send information to a company known for doing marketing based on location. As a result, the State’s representatives had to scramble to make the necessary updates to their privacy policy and app.
2. Organizations should clearly understand other parties' collection, use, storage, and disclosure of personal and confidential information
Laws like the California Consumer Privacy Act (CCPA) and the European Union’s General Data Protection Regulation (GDPR) require that organizations formally define what, if any, restrictions apply when they share data with vendors and partners. The GDPR, for example, requires “joint controllers” and “processors” to abide by certain data protection terms in contracts, and the CCPA requires businesses to make partners “service providers” by restricting their ability to use information for other purposes if the businesses want to avoid a “sale” of personal information.
Clarity is important regardless of exposure to GDPR and CCPA, as it allows an organization to define the privacy experience it is providing to its customers. But many privacy policies fail to distinguish between external parties (or differently branded and managed affiliates) who are receiving personal information to provide a service to the business, and those who receive it for independent commercial purposes. This leads to consumer confusion and allows the external party to be sloppy about how it manages the data.
Many organizations seeking to mitigate the risks of COVID-19 are looking to adopt temperature-monitoring technology that includes facial recognition to track and trace employee health. Organizations that use third-party apps for this purpose must conduct diligence regarding compliance with applicable biometrics laws (see No. 4, below), how the app developer intends to collect and use personal information, with whom the app developer will share the data, and whether everything aligns with the organization’s privacy notice.
If the app developer says that it “anonymizes” personal data for its own use, an organization should determine what steps the developer takes to disassociate the data from the individual to whom it relates and whether the proposed method and format meet legal standards for de-identified data. An organization should also consider contractual controls regarding use and disclosure of personal information (and, if it is subject to the GDPR, whether it is required to put such controls in place).
3. Beware of precise location data
Precise location collected via GPS or Wi-Fi connections can be sensitive information by itself. The FTC expects companies to obtain affirmative, opt-in consent to collect precise location, app stores require you to obtain consent via their terms, and users expect a high degree of transparency.
The value and utility of location data is likely to increase over time, and COVID-19 appears to have accelerated that trend. For example, DoorDash recently announced a geo-fencing feature that will alert restaurants when customers who have placed orders arrive at the establishment for curbside pick-up, if the customer opts in to share location.
Before using features that collect location data, organizations must review the workflow and their privacy policies to ensure that customers receive clear notice regarding the collection of such information—both generally and in any California-specific notice. Organizations should also consider setting a short retention period for location data: it loses its value quickly after the transaction is complete, it may be subject to data access and deletion rights, and a large database of such information can raise consumer concerns.
4. Review collection of biometric information carefully
The collection, use, and retention of biometric information, such as face prints and iris scans, is regulated. For example, the Illinois Biometric Information Privacy Act (BIPA) requires private entities to obtain consent to collect biometric information; create retention schedules that ensure deletion at appropriate times; not sell, lease, trade, or profit from it; not disclose it unless the entity has obtained consent or another exception applies; and protect it from unauthorized disclosure using a reasonable standard of care. BIPA’s private right of action has engendered numerous class action lawsuits since its passage in 2008.
Many companies are offering kiosks and apps that integrate temperature scanning with facial recognition technology to store a record of readings for a particular person. If your organization plans to implement advanced temperature monitoring that uses facial recognition to log temperatures, you should first review applicable biometric information processing laws. You may need to work with your chosen vendor to implement a notice and consent framework, as well as data governance policies, that address legal requirements.
5. Recognize restrictions on children's personal information
Children’s personal information is deemed sensitive, and failing to handle it properly can put organizations at a high risk of an enforcement action and negative publicity. As we have explained in a previous post, collection of personal information from children under 13 online may require an organization to obtain “verifiable parental consent” under the Children’s Online Privacy Protection Act (COPPA). Under the CCPA, the “sale” of personal information of children under 16 requires opt-in consent from the child or the parent, depending on the child’s age. An organization that is subject to both the CCPA and COPPA will face operational challenges in addressing both of these obligations, so taking the time to ensure that your organization is prepared in advance is a worthwhile investment.
As parents with small children know, keeping children occupied while working from home can be a job unto itself, and online streaming services are positioned to capitalize on the need and demand for entertainment. Organizations catering to children should undertake a careful review of the types of information they collect because navigating the intersection of various laws can be tricky, even for issues as seemingly straightforward as the child’s age. For example, COPPA applies to children under 13, but the CCPA applies to children under 16 and has different rules for children under 13. In the EU, the GDPR requires parental consent for children under 16 but allows individual EU member states to set the cutoff age between 13 and 16.
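For teams building the technical age gate behind these rules, the overlapping cutoffs can be encoded as a simple lookup. The sketch below is illustrative only: the function name, the regime labels, and the decision to use the GDPR's default cutoff of 16 are our assumptions, not a statement of any particular product's legal obligations, and it is no substitute for counsel's analysis.

```python
# Hypothetical sketch of an age-gating check encoding the consent-age
# thresholds discussed above. Regime labels and thresholds are
# illustrative assumptions, not legal advice.

def parental_consent_required(age: int, regime: str) -> bool:
    """Return True if parental/opt-in consent is required before
    collecting (or, for CCPA, selling) a child's personal information."""
    if regime == "COPPA":         # verifiable parental consent under 13
        return age < 13
    if regime == "CCPA_SALE":     # opt-in consent for "sale" under 16
        return age < 16
    if regime == "GDPR_DEFAULT":  # Article 8 default cutoff of 16;
        return age < 16           # member states may lower it to 13
    raise ValueError(f"unknown regime: {regime}")

# A 14-year-old illustrates the divergence: no COPPA parental consent
# is needed, but a CCPA "sale" still requires opt-in consent.
print(parental_consent_required(14, "COPPA"))      # False
print(parental_consent_required(14, "CCPA_SALE"))  # True
```

In practice an organization subject to several of these regimes typically applies the strictest applicable threshold, which is why mapping the rules out explicitly, as above, is a useful first step.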
Our most important tip
Take a deep breath and approach your risk analysis methodically. These five issues represent areas of high risk, but there may be other privacy issues lurking in the background. It is much harder to remediate privacy violations—and the attendant public relations fallout—than to address them proactively.
The facts, laws, and regulations regarding COVID-19 are developing rapidly. Since the date of publication, there may be new or additional information not referenced in this advisory. Please consult with your legal counsel for guidance.
DWT will continue to provide up-to-date insights and virtual events regarding COVID-19 concerns. Our most recent insights, as well as information about recorded and upcoming virtual events, are available at www.dwt.com/COVID-19.