Section 230 Immunity Won’t Protect You: State and Federal Lawmakers Take Aim at Social Media Companies With Proposed Legislation Creating Affirmative Duties to Act to Prevent Harm to Users

March 24, 2022

Three new bills, one introduced in the California Assembly and two in the US Senate, are taking aim at online social media platforms. If adopted, these bills would significantly alter existing duties to prevent or mitigate certain harms to users, particularly minors.

The bills come in the wake of widespread concern generated by testimony and documents from former Facebook employee Frances Haugen. Although the testimony and documents related primarily to Facebook's internal research finding negative impacts on the mental health of teen girls using its Instagram social media platform, the hearings gave rise to a broader concern: that social media companies intentionally design their algorithms to make platforms addictive, including to children, who are especially vulnerable to the mental health impacts of addictive social media use.

Since Haugen’s revelations, Facebook (now Meta) has been targeted by multiple investigations, including by state attorneys general for violations of consumer protection and securities laws, and lawsuits, including a wrongful death lawsuit initiated in Connecticut after the suicide death of an 11-year-old girl.

The legislation introduced in California and in the US Senate would go well beyond these current remedial legal efforts, imposing significant new affirmative duties on online platform service providers and significantly altering the legal landscape currently defined by Section 230 of the Communications Decency Act.

Section 230

Section 230 of the Communications Decency Act was premised on the idea that online service providers are intermediaries, not publishers or content creators, and that as such they are not responsible for policing content posted on their platforms by users. Under Section 230, online service providers are not required to monitor content posted on their platforms and are immune from liability for damages arising from content posted by users. The Good Samaritan provisions of Section 230 further provide that, to the extent an online service provider acts to restrict or limit content, that action will also be immune from liability, as long as it is taken in good faith.

California’s “Social Media Platform Duty to Children Act” (AB 2408)

California legislators have introduced a bill known as the "Social Media Platform Duty to Children Act" (AB 2408) that, if adopted, would significantly alter the immunity landscape set out in Section 230. The California bill, which is directed at large social media platform companies (platforms controlled by companies earning more than $100 million in annual revenue), would impose an express duty of care requiring online social media platform companies to affirmatively act to prevent and mitigate harm to minor users. Significantly, the bill also creates a private civil right of action, allowing parents and/or guardians of any child injured by a company's violation of that duty of care to sue (either individually or as a class action) for injunctive relief and recovery of actual damages, civil penalties, attorneys' fees, and punitive damages. The liability provisions of the bill are extremely broad. Companies would be liable if they "knowingly or negligently cause or contribute to addiction through any act or omission or any combination of acts or omissions." Addiction is defined as use of a social media platform that indicates a preoccupation with the platform, including difficulty withdrawing from or reducing use, and that causes physical, emotional, developmental, or other harm to a child.

Although the California bill includes a safe harbor from penalties for social media platforms that take steps to avoid harm to minors, the broadly worded liability provisions and the lack of clear safe harbor guidance are likely to create a minefield of potential liability for even the best-intentioned social media company.

Companies targeted by civil actions under the proposed statute also are unlikely to find immunity protection available under Section 230. Existing provisions of Section 230 expressly preempt conflicting state laws, but AB 2408 carefully avoids this preemptive effect by targeting companies for their own products' practices and features, including algorithms and design, and by expressly stating that it will not be construed as imposing liability for content created by others. The distinction drawn is consistent with the Ninth Circuit's recent interpretation of the applicable scope of Section 230 immunity in Lemmon v. Snap. As such, civil suits brought pursuant to the California statute (if adopted) are very likely to be viewed by courts as falling outside the scope of Section 230 immunity and not preempted by federal law.

Federal “Kids Online Safety Act” and Proposed Federal Reforms to Section 230

Federal lawmakers have also recently introduced legislation aiming to increase protections for minors using social media platforms. In contrast to California’s approach, these efforts are more narrowly tailored and do not add an independent civil right of action.

The Kids Online Safety Act would require online social media platform companies, regardless of size, to provide certain settings to help facilitate parents’ efforts to protect their children from harmful content and potential addiction, and to publish third-party audits outlining the risks of their platforms for minors. Similar to the proposed California bill, the Kids Online Safety Act also affirmatively states that a covered platform has a duty to prevent and mitigate the heightened risk of harms to minors posed by materials on, or engagement with, the platform, including addiction-like behaviors. Unlike the California bill, however, no new civil right of action would be created by the legislation.

Separate federal legislation also has been introduced that seeks to reform the existing immunity provisions of Section 230. That legislation, titled the Justice Against Malicious Algorithms Act, would limit the immunity provided by Section 230 in certain circumstances. In particular, the bill would not immunize from liability any online platform company that “knowingly or recklessly” used an algorithm or other technology to recommend content that materially contributed to physical or severe emotional injury.

Conclusion

Online social media platform companies should prepare now for new legislation that will require significantly more robust Trust & Safety efforts to protect minors who use their platforms. Although strong public policy considerations support existing Section 230 immunity provisions, the growth of social media platforms and the increasing engagement of minors with those platforms present competing public policy concerns that social media companies will be required to address either proactively or, almost inevitably, in response to legislation. Although adoption of the California legislation as currently drafted is unlikely, the proposed federal legislation could receive bipartisan support. It is also likely that there will be continued efforts, in California and other jurisdictions, to provide legal vehicles for holding platforms liable for algorithms and design features that promote a level of engagement creating a significant risk of addiction and mental or physical harm, particularly for minors and perhaps, ultimately, for all users.
