Publications

Public Ends From Private Means: Privacy Rights and Benevolent Use of Personal Data

April 1, 2020 Blog

With the explosion of COVID-19 cases worldwide, companies and governments have expanded their interest in the use of the vast stores of consumer data. Even where such collection and use of personal data is ostensibly for the public good, the privacy rights and legal requirements applicable to such data must be considered carefully.[i]

In recent weeks, a plethora of private companies have introduced programs and applications to monitor and assess the spread of the disease, as well as to assess the effectiveness of the public health measures (e.g., social distancing) aimed at stopping it. 

Some companies or technology developers seek voluntary participation through self-reporting by members of the public. The COVID-19 Symptom Tracker from health science company ZOE Global Ltd. aims to gather data about the symptoms of, spread of, exposure to, and groups most at risk from COVID-19, using an app-based study. As of last week, it was the third most downloaded app in Apple’s UK App Store and the second most downloaded of Google Play’s new releases in the UK. Because consumers are voluntarily providing data and opting in to its collection and use for the specific purpose of studying COVID-19, there are fewer legal restrictions on using their data for that purpose.[ii]

Other companies have sought to re-tool their existing products or data in order to help fight the disease. One such company is Unacast, a marketing analytics company out of Norway that tracks human movements using GPS data received from other companies’ apps (largely where those companies incorporate tracking functionality into their apps using Unacast’s software development kit, though such mobile location data can also be gathered through Wi-Fi connection information, IP addresses, and other sources). Unacast has already built a business out of using and sharing its substantial set of (anonymized) consumer data to help retail, marketing, real estate, and travel companies assess consumer needs and behaviors. Now, the company is using that same data to introduce an interactive Social Distancing Scoreboard, which aims to “empower organizations to measure and understand the efficacy of social distancing initiatives at the local level.” While Unacast appears to share only anonymized data, the data as collected by the company likely includes personal information. According to Unacast’s website, it collects data only after providing proper disclosures and obtaining opt-in consent, and it respects device settings. Unacast’s use and sharing of that personal data must adhere to the permissions garnered at the time of collection.

Where such aims are pursued using anonymized data collected through third-party apps, and the data does not include any personally identifiable information (and/or proper disclosures and permissions were acquired prior to collection), such data collection and use is unlikely to create significant liability for its users. However, at a time when everyone is eager to dive in and contribute to a solution, it is important to be mindful of the risks that can arise from the collection and use of personal information, no matter how commendable the aims for which it is collected. The California Consumer Privacy Act (CCPA) would require that the types of data being collected, the purposes of the collection, and the categories of companies with whom the data is shared be disclosed to the data subject at the time of collection (which would likely be accomplished through a linked-to privacy policy[iii]). Europe’s General Data Protection Regulation (GDPR) would further require opt-in consent to such collection and use. Such laws seek to prevent unwanted use of personal information, including location tracking, and failure to comply with them could result in significant regulatory liability (as well as, for companies collecting information from European data subjects, liability pursuant to a consumer lawsuit[iv]).

Governments have a similar interest in tracking the development and spread of COVID-19, as well as in the effectiveness of measures to slow or stop the disease. China, Singapore, South Korea, and other nations have stepped up technological surveillance to ensure compliance with quarantine orders and to track the disease within their borders. A recent report similarly indicates that the U.S. government hopes to partner with major technology companies to track whether Americans are adequately complying with social distancing orders and to fight the pandemic. However, many have expressed concern about such tracking, and some Western companies have indicated a reticence, or even a refusal, to comply (companies in China and similar countries are less likely to have the option of taking such stances).

As companies and governments attempt to track and combat COVID-19 through data collection and monitoring, we are likely to see interesting developments in privacy law in the United States, and around the world.


[i] While health information is of course a core privacy concern, this article is focused on general personally identifiable information, and does not address the separate issues for protected health information arising under HIPAA and/or other laws.

[ii] That said, whether in Europe or the U.S., the purposes for which personal data can be used and the third parties with whom it can be shared would be limited to those disclosed at the time of collection/opt-in. On the other hand, such a company could use and share de-identified collections of the same data without limitation.

[iii] The CCPA and GDPR require parental opt-in for certain minors, so the software would have to include some kind of age-gating.

[iv] While the CCPA also includes a private right of action, such right only arises in connection with data security failures and does not apply to violations of the data collection/disclosure requirements, enforcement of which is solely the domain of the California Attorney General.
