Publications

Public Ends From Private Means: Privacy Rights and Benevolent Use of Personal Data

April 1, 2020 Blog

With the explosion of COVID-19 cases worldwide, companies and governments have expanded their interest in using the vast stores of available consumer data. Even where such collection and use of personal data is ostensibly for the public good, the privacy rights and legal requirements applicable to such data must be considered carefully.[i]

In recent weeks, a plethora of private companies have introduced programs and applications to monitor and assess the spread of the disease, as well as to assess the effectiveness of the public health measures (e.g., social distancing) aimed at stopping it. 

Some companies or technology developers seek voluntary participation through self-reporting by members of the public. The COVID-19 Symptom Tracker from health science company ZOE Global Ltd. aims to gather data about the symptoms of, spread of, exposure to, and groups most at risk from COVID-19, using an app-based study. As of last week, it was the third most downloaded app in Apple's UK App Store, and the second most downloaded of Google Play's new releases in the UK. Because consumers voluntarily provide data, opting in to its collection and to its use for the specific purpose of studying COVID-19, there are fewer legal restrictions on using their data for that purpose.[ii]

Other companies have sought to re-tool their existing products or data in order to help fight the disease. One such company is Unacast, a marketing analytics company out of Norway that tracks human movements using GPS data received from other companies' apps (largely where those companies incorporate tracking functionality into their apps using Unacast's software development kit, though such mobile location data can also be gathered through Wi-Fi connection information, IP addresses, and other sources). Unacast has already built a business out of using and sharing its substantial set of (anonymized) consumer data to help retail, marketing, real estate, and travel companies assess consumer needs and behaviors. Now, the company is using that same data to introduce an interactive Social Distancing Scoreboard, which aims to “empower organizations to measure and understand the efficacy of social distancing initiatives at the local level.” While Unacast appears to share only anonymized data, the data as collected by the company likely includes personal information. According to Unacast’s website, it collects data only after providing proper disclosures and obtaining opt-in, and it respects device settings. Unacast’s use and sharing of that personal data must adhere to the permissions garnered at the time of collection.

Where such aims are pursued using anonymized data collected through third-party apps, and the data does not include any Personally Identifiable Information (and/or proper disclosures and permissions were obtained prior to collection), such data collection and use is unlikely to create significant liability for its users. However, at a time when everyone is eager to dive in and contribute to a solution, it is important to be mindful of the risks that can arise from the collection and use of personal information, no matter how commendable the aims for which it is collected. The California Consumer Privacy Act (CCPA) would require that the types of data being collected, the purposes of the collection, and the categories of companies with whom the data is shared be disclosed to the data subject at the time of collection (likely accomplished through a linked-to privacy policy[iii]). Europe’s General Data Protection Regulation (GDPR) would further require opt-in to such collection and use. Such laws seek to prevent unwanted use of personal information, including location tracking, and failure to comply with them could result in significant regulatory liability (as well as, for companies collecting information from European data subjects, liability pursuant to a consumer lawsuit[iv]).

Governments have a similar interest in tracking the development and spread of COVID-19, as well as in the effectiveness of measures to slow or stop the disease. China, Singapore, South Korea, and other nations have stepped up technological surveillance to ensure compliance with quarantine orders and to track the disease within their borders. A recent report similarly indicates that the U.S. government hopes to partner with major technology companies to track whether Americans are adequately complying with social distancing orders and to fight the pandemic. However, many have expressed concern about such tracking, and some Western companies have indicated reluctance, or even refusal, to comply (companies in China and similar countries are less likely to have the option of taking such stances).

As companies and governments attempt to track and combat COVID-19 through data collection and monitoring, we are likely to see interesting developments in privacy law in the United States, and around the world.


[i] While health information is of course a core privacy concern, this article is focused on general personally identifiable information, and does not address the separate issues for protected health information arising under HIPAA and/or other laws.

[ii] That said, whether in Europe or the U.S., the purposes for which personal data can be used and the third parties with whom it can be shared would be limited to those disclosed at the time of collection/opt-in. On the other hand, such a company could use and share de-identified collections of the same data without limitation.

[iii] The CCPA and GDPR require parental opt-in for certain minors, so the software would have to include some kind of age-gating.

[iv] While the CCPA also includes a private right of action, such right only arises in connection with data security failures and does not apply to violations of the data collection/disclosure requirements, enforcement of which is solely the domain of the California Attorney General.
