Public Ends From Private Means: Privacy Rights and Benevolent Use of Personal Data

April 1, 2020 Blog

With the explosion of COVID-19 cases worldwide, companies and governments have expanded their interest in using the vast stores of consumer data at their disposal. Even where such collection and use of personal data is ostensibly for the public good, the privacy rights and legal requirements applicable to such data must be considered carefully.[i]

In recent weeks, a plethora of private companies have introduced programs and applications to monitor and assess the spread of the disease, as well as to evaluate the effectiveness of the public health measures (e.g., social distancing) aimed at stopping it.

Some companies or technology developers seek voluntary participation through self-reporting by members of the public. The COVID-19 Symptom Tracker from health science company ZOE Global Ltd. aims to gather data about the symptoms of, spread of, exposure to, and groups most at risk from COVID-19, using an app-based study. As of last week, it was the third most downloaded app in Apple’s UK App Store, and the second most downloaded of Google Play’s new releases in the UK. Because consumers are voluntarily providing data and opting in to such data collection and to its use for the specific purpose of studying COVID-19, there are fewer legal restrictions on using their data for that purpose.[ii]

Other companies have sought to re-tool their existing products or data in order to help fight the disease. One such company is Unacast, a marketing analytics company out of Norway that tracks human movements using GPS data received from other companies’ apps (largely where such companies incorporate tracking functionality into their apps using Unacast’s software development kit, though such mobile location data can also be gathered through Wi-Fi connection information, IP addresses, and other sources). Unacast has already built a business out of using and sharing its substantial set of (anonymized) consumer data to help retail, marketing, real estate, and travel companies assess consumer needs and behaviors. Now, the company is using that same data to introduce an interactive Social Distancing Scoreboard, which aims to “empower organizations to measure and understand the efficacy of social distancing initiatives at the local level.” While Unacast appears to share only anonymized data, it is likely that the data as collected by the company includes personal information. According to Unacast’s website, it collects data only after providing proper disclosures and obtaining opt-in consent, and it respects device settings. Unacast’s use and sharing of that personal data must adhere to the permissions granted at the time of collection.

Where such aims are pursued using anonymized data collected through third-party apps, and the data does not include any personally identifiable information (and/or proper disclosures and permissions have been obtained prior to collection), such data collection and use is unlikely to create significant liability for its users. However, in a time when everyone is eager to dive in and contribute to a solution, it is important to be mindful of the risks that can arise from the collection and use of personal information, no matter how commendable the aims for which it is collected. The California Consumer Privacy Act (CCPA) would require that the types of data being collected, the purposes of the collection, and the categories of companies with whom the data is shared be disclosed to the data subject at the time of collection (which would likely be accomplished through a linked-to privacy policy[iii]). Europe’s General Data Protection Regulation (GDPR) would further require opt-in consent to such collection and use. Such laws seek to prevent unwanted use of personal information, including location tracking, and failure to comply with them could result in significant regulatory liability (as well as, for companies collecting information from European data subjects, liability pursuant to a consumer lawsuit[iv]).

Governments have a similar interest in tracking the development and spread of COVID-19, as well as in the effectiveness of measures to slow or stop the disease. China, Singapore, South Korea, and other nations have stepped up technological surveillance to ensure compliance with quarantine orders and to track the disease within their borders. A recent report similarly indicates that the U.S. government hopes to partner with major technology companies to track whether Americans are adequately complying with social distancing orders and to fight the pandemic. However, many have expressed concern about such tracking, and some Western companies have indicated a reluctance, or even a refusal, to comply (companies in China or similar countries are less likely to have the option to take such stances).

As companies and governments attempt to track and combat COVID-19 through data collection and monitoring, we are likely to see interesting developments in privacy law in the United States and around the world.


[i] While health information is of course a core privacy concern, this article is focused on general personally identifiable information, and does not address the separate issues for protected health information arising under HIPAA and/or other laws.

[ii] That said, whether in Europe or the U.S., the purposes for which personal data can be used and the third parties with whom it can be shared are limited to those disclosed at the time of collection/opt-in. On the other hand, such a company could use and share de-identified collections of the same data without limitation.

[iii] The CCPA and GDPR require parental opt-in for certain minors, so the software would have to include some kind of age-gating.

[iv] While the CCPA also includes a private right of action, such right only arises in connection with data security failures and does not apply to violations of the data collection/disclosure requirements, enforcement of which is solely the domain of the California Attorney General.
