
Public Ends From Private Means: Privacy Rights and Benevolent Use of Personal Data

April 1, 2020 Blog

With the explosion of COVID-19 cases worldwide, companies and governments have expanded their interest in the use of the vast stores of consumer data. Even where such collection and use of personal data is ostensibly for the public good, the privacy rights and legal requirements applicable to such data must be considered carefully.[i]

In recent weeks, a plethora of private companies have introduced programs and applications to monitor and assess the spread of the disease, as well as to assess the effectiveness of the public health measures (e.g., social distancing) aimed at stopping it. 

Some companies and technology developers seek voluntary participation through self-reporting by members of the public. The COVID-19 Symptom Tracker from health science company ZOE Global Ltd. aims to gather data about the symptoms of, spread of, exposure to, and groups most at risk from COVID-19, using an app-based study. As of last week, it was the third most downloaded app in Apple’s UK App Store, and the second most downloaded of Google Play’s new releases in the UK. Because consumers are voluntarily providing data and opting in to its collection and use for the specific purpose of studying COVID-19, there are fewer legal restrictions on using their data for that purpose.[ii]

Other companies have sought to re-tool their existing products or data in order to help fight the disease. One such company is Unacast, a marketing analytics company out of Norway that tracks human movements using GPS data received from other companies’ apps (largely where such companies incorporate tracking functionality into their apps using Unacast’s software development kit, though such mobile location-related data can also be gathered through Wi-Fi connection information, IP addresses, and other sources). Unacast has already built a business out of using and sharing its substantial set of (anonymized) consumer data to help retail, marketing, real estate, and travel companies assess consumer needs and behaviors. Now, the company is using that same data to introduce an interactive Social Distancing Scoreboard, which aims to “empower organizations to measure and understand the efficacy of social distancing initiatives at the local level.” While Unacast appears to share only anonymized data, it is likely that the data as collected by the company includes personal information. According to Unacast’s website, it collects data only after providing proper disclosures and obtaining opt-in consent, and it respects device settings. Unacast’s use and sharing of that personal data must adhere to the permissions garnered at the time of collection.

Where such aims are pursued using anonymized data collected through third-party apps, and the data does not include any Personally Identifiable Information (and/or proper disclosures and permissions have been acquired prior to collection), such data collection and use is unlikely to create significant liability for its users. However, in a time when everyone is eager to dive in and contribute to a solution, it is important to be mindful of the risks that can arise from the collection and use of personalized information, no matter how commendable the aims for which it is collected. The California Consumer Privacy Act (CCPA) would require that the types of data being collected, the purposes of the collection, and the categories of companies with whom the data is shared be disclosed to the data subject at the time of collection (which would likely be accomplished through a linked-to privacy policy[iii]). Europe’s General Data Protection Regulation (GDPR) would further require opt-in consent to such collection and use. Such laws seek to prevent unwanted use of personal information, including location tracking, and failure to comply with them could result in significant regulatory liability (as well as, for companies collecting information from European data subjects, liability pursuant to a consumer lawsuit[iv]).

Governments have a similar interest in tracking the development and spread of COVID-19, as well as in the effectiveness of measures to slow or stop the disease. China, Singapore, South Korea, and other nations have stepped up technological surveillance to ensure compliance with quarantine orders and to track the disease within their borders. A recent report similarly indicates that the U.S. government hopes to partner with major technology companies to track whether Americans are adequately complying with social distancing orders and to fight the pandemic. However, many have expressed concern about such tracking, and some Western companies have indicated a reluctance, or even refusal, to comply (companies in China or similar countries are less likely to have the option to take such stances).

As companies and governments attempt to track and combat COVID-19 through data collection and monitoring, we are likely to see interesting developments in privacy law in the United States, and around the world.


[i] While health information is of course a core privacy concern, this article is focused on general personally identifiable information, and does not address the separate issues for protected health information arising under HIPAA and/or other laws.

[ii] That said, whether in Europe or the U.S., the purposes for which personal data can be used, and the third parties with whom it can be shared, would be limited to those disclosed at the time of collection/opt-in. On the other hand, such a company could use and share de-identified collections of the same data without limitation.

[iii] The CCPA and GDPR require parental opt-in for certain minors, so the software would have to include some form of age-gating.

[iv] While the CCPA also includes a private right of action, that right arises only in connection with data security failures and does not apply to violations of the data collection/disclosure requirements, enforcement of which is solely the domain of the California Attorney General.
