Publications

Electric Fence: Protecting Proprietary Rights in Collected Energy Data

November 10, 2020

Like companies in other industries, a growing number of modern energy-related companies are focusing their efforts on data collection and analysis. For example, Enphase – an energy technology company – regularly tracks data about how and when its customers’ homes are using or generating solar energy.[1] As such data collection becomes more widespread, and the resulting data is aggregated, its holders will gain the ability to perform critical, wide-reaching assessments about entire communities’ power use – including power generation, distribution and consumption – an ability which may be very valuable in a competitive marketplace, and may attract interest from competitors.

But how are companies to protect these important data collections? Because the companies involved are merely collecting the data – not “inventing” it (which patent law would protect) or “authoring” it (which would enable copyright protection) – they cannot rely on traditional registration-based intellectual property protections. Instead, such companies will likely have to rely on trade secret law to protect this valuable asset from unauthorized third-party use.

Two key factors affect whether such information may qualify as a trade secret.

1. Confidentiality. Trade secret protectability stems, in part, from the fact that the subject information is kept secret, and is not easily discoverable by competitors. This means that companies must avoid sharing their information with third parties, and must take steps to ensure that such information is protected. 

Here, maintaining confidentiality is complicated by the fact that, in many cases (such as the Enphase example above), some of the data is already available to the individual consumers who use the product or service. However, trade secret protections may still apply to the aggregated data that companies have compiled about patterns of use by large groups of people.

In order to demonstrate the data’s confidentiality, companies generally must show that they have taken reasonable steps to protect it from misuse, by both employees and outside parties. What counts as reasonable, however, is open to interpretation, and is evolving with the ever-changing technological environment in which the data resides.

While the analysis seems somewhat circular (i.e., an asset can be protected as a trade secret only if it is kept secret), the point is that an unauthorized disclosure or use will be actionable only if steps were taken to protect the data in the first place.

2. Nature and value of the information. Not all confidential information possessed by a company is a trade secret. Confidential data qualifies for trade secret protection only if it has “independent economic value” as a result of its secrecy. When assessing whether such value is present, California courts generally consider (1) the extent to which the owner derives value from the information’s secrecy; (2) the extent to which others could obtain economic benefits from the data; (3) the amount of time, money, or labor that the owner spent to develop the information; and (4) the amount of time, money, or labor that a competitor would save by using the information. See CACI 4412; Altavion, Inc. v. Konica Minolta Sys. Lab., Inc. (2014) 226 Cal.App.4th 26, 62.

Thus, companies claiming that certain information is a trade secret must be able to demonstrate not only that they spent time and effort developing, aggregating, or assembling that information, but also that they and their competitors would find that aggregated information to have economic value.

As energy companies, like many of their peers in other industries, begin to perform more data collection and analysis, it will be important to keep the above factors in mind, to determine how such data can be protected. Many different measures are available, including but not limited to:

  • allowing employee access to the data only on a need-to-know basis;
  • using employee nondisclosure agreements with defined consequences (including procedural consequences) for unauthorized sharing;
  • ending access for departing employees and for employees who no longer need access;
  • including confidentiality protections in contracts under which the data is shared;
  • using technology-based gatekeeping measures; and
  • employing other methods to ensure secrecy and protection from disclosure.

But the “reasonable” steps to be taken are not one-size-fits-all. Indeed, what is reasonable for a small company will almost necessarily differ from what is reasonable for a multinational corporation. Moreover, the nature of the data itself could further affect the definition of reasonable measures. Data incorporating consumer personal information, for example, could require a different level of protection (though, as noted previously, the distinct privacy issues related to such data are beyond the scope of this article).

And while the battle over a trade secret may be won or lost well before any claim is brought against an unauthorized user, any disclosure in breach of a contractual obligation requires immediate action (likely in court, using sealed pleadings) if there is any hope of rebottling the genie.

In the end, as energy (and other) companies become more focused on capitalizing on data, they will need to take steps to protect their data collections, and to limit their exposure. Companies seeking to understand and spot potential (or actual) risks, and to demonstrate the reasonableness of their security measures, should act to define and track the ways in which such information is being used, and their processes to protect confidentiality. Those processes should factor in industry guidance, and the evolving body of court decisions that will continue to define the “reasonable” security measures necessary to protect the value of a company’s trade secret data collections.


[1] This type of data collection also implicates interesting tangential privacy issues not addressed in this article.