Section 230 Immunity Won’t Protect You: State and Federal Lawmakers Take Aim at Social Media Companies With Proposed Legislation Creating Affirmative Duties to Act to Prevent Harm to Users
Three new bills, one introduced in the California Assembly and two in the US Senate, are taking aim at online social media platforms. If adopted, these bills would significantly alter existing duties to prevent or mitigate certain harms to users, particularly minors.
The bills come in the wake of widespread concern generated by testimony and documents from former Facebook employee Frances Haugen. Although that testimony and those documents related primarily to Facebook’s internal research finding negative impacts on the mental health of teen girls using its Instagram platform, the hearings gave rise to a broader concern: that social media companies intentionally design their algorithms to make their platforms addictive, including to children, who are especially vulnerable to the mental health impacts of addictive social media use.
Since Haugen’s revelations, Facebook (now Meta) has been the target of multiple investigations, including inquiries by state attorneys general into potential violations of consumer protection and securities laws, as well as lawsuits, including a wrongful death suit filed in Connecticut after the suicide of an 11-year-old girl.
The legislation introduced in California and in the US Senate would go well beyond these current remedial legal efforts, imposing significant new affirmative duties on online platform service providers and significantly altering the legal landscape currently defined by CDA 230.
Section 230 of the Communications Decency Act was premised on the idea that online service providers host content created by others and should not be treated as the publishers of that content or held responsible for policing it. Under Section 230, online service providers are not required to monitor content posted on their platforms and are immune from liability for damages arising from content posted by users. The statute’s Good Samaritan provisions further provide that, to the extent an online service provider acts to restrict or limit content, that action is also immune from liability, as long as it is taken in good faith.
California’s “Social Media Platform Duty to Children Act” (AB 2408)
California legislators have introduced a bill known as the “Social Media Platform Duty to Children Act” (AB 2408) that, if adopted, would significantly alter the immunity landscape set out in Section 230. The California bill, which is directed at large social media platforms (those controlled by companies earning more than $100 million in revenue), would impose an express duty of care requiring those companies to act affirmatively to prevent and mitigate harm to minor users. Significantly, the bill also creates a private civil right of action, allowing parents and guardians of any child who suffers injury as a result of a company’s violation of that duty of care to sue, individually or as a class, for injunctive relief, actual damages, civil penalties, attorneys’ fees, and punitive damages. The liability provisions of the bill are extremely broad: companies would be liable if they “knowingly or negligently cause or contribute to addiction through any act or omission or any combination of acts or omissions.” Addiction is defined as use of a social media platform that indicates a preoccupation with the platform, including difficulty withdrawing from or reducing use, and that causes physical, emotional, developmental, or other harm to a child.
Although the California bill includes a safe harbor from penalties for social media platforms that take steps to avoid harm to minors, the broadly worded liability provisions and the lack of clear safe harbor guidance are likely to create a minefield of potential liability for even the best-intentioned social media company.
Companies targeted by civil actions under the proposed statute also are unlikely to find immunity protection available under Section 230. Existing provisions of Section 230 expressly preempt conflicting state laws, but AB 2408 carefully avoids this preemptive effect by targeting companies for their own product’s practices and features, including algorithms and design, and expressly stating that it will not be construed as imposing liability for content created by others. The distinction drawn is consistent with the Ninth Circuit’s recent interpretation of the applicable scope of Section 230 immunity in Lemmon v. Snapchat. As such, civil suits brought pursuant to the California statute (if adopted) are very likely to be viewed by courts as falling outside of the scope of Section 230 immunity and not preempted by federal law.
Federal “Kids Online Safety Act” and Proposed Federal Reforms to Section 230
Federal lawmakers have also recently introduced legislation aimed at increasing protections for minors using social media platforms. In contrast to California’s approach, these efforts are more narrowly tailored and do not create an independent private right of action.
The Kids Online Safety Act would require online social media platform companies, regardless of size, to provide certain settings to help facilitate parents’ efforts to protect their children from harmful content and potential addiction, and to publish third-party audits outlining the risks of their platforms for minors. Similar to the proposed California bill, the Kids Online Safety Act also affirmatively states that a covered platform has a duty to prevent and mitigate the heightened risk of harms to minors posed by materials on, or engagement with, the platform, including addiction-like behaviors. Unlike the California bill, however, no new civil right of action would be created by the legislation.
Separate federal legislation also has been introduced that seeks to reform the existing immunity provisions of Section 230. That legislation, titled the Justice Against Malicious Algorithms Act, would limit the immunity provided by Section 230 in certain circumstances. In particular, the bill would not immunize from liability any online platform company that “knowingly or recklessly” used an algorithm or other technology to recommend content that materially contributed to physical or severe emotional injury.
Online social media platform companies should prepare now for legislation requiring significantly more robust Trust & Safety efforts to protect minors who use their platforms. Although strong public policy considerations support the existing Section 230 immunity provisions, the growth of social media platforms and the increasing engagement of minors with those platforms present competing public policy concerns that social media companies will be required to address either proactively or, almost inevitably, in response to legislation. Although adoption of the California legislation as currently drafted is unlikely, the proposed federal legislation could receive bipartisan support. Continued efforts are also likely, in California and other jurisdictions, to create legal vehicles for holding platforms liable for algorithms and design features that promote a level of engagement creating a significant risk of addiction and of negative mental or physical harm, particularly for minors and perhaps, ultimately, for all users.