No Quarter: What Claims Doesn’t Section 230 of the Communications Decency Act Protect Platform Companies Against?

August 30, 2021
The Recorder

Depending on what you read or who you talk to, Section 230 of the Communications Decency Act (47 U.S.C. § 230) (CDA) is either a tool of censorship, a shield that allows Big Tech to profit from content harmful to third parties, or the only thing that enables the internet to work.

Section 230 distinguishes between an "information content provider" that creates content and the "interactive computer service" that hosts the content, and immunizes the hosting platform from responsibility for content created by its users: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."[1]  "A defamation claim is perhaps the most obvious example of a claim that seeks to treat a website or smartphone application provider as a publisher or speaker, but it is by no means the only type of claim that does so."[2]  The "Good Samaritan" provision of Section 230 also gives platforms the freedom, but not the obligation, to restrict "objectionable" content posted on them by third parties.[3]

By releasing platforms from liability for their users' content, Section 230 eliminates the need to review user content and remove potentially unlawful items.  It spares platforms from making subjective decisions about the propriety of user content and promotes tolerance of user speech that would otherwise likely be removed out of an abundance of caution.  Thus, many argue it is vital for protecting free expression online.  Internet companies claim that without the protections of Section 230, the costs of guarding against such potential liability would make it impossible to do business, and the internet as we know it could not exist.  But many others see Section 230 as ill-suited for addressing some of the harms that arise in our modern platform-based economy, particularly when one third party uses online platforms to post content that harms another third party, and the victim has little power to have the content removed quickly and permanently.

Whatever the assessment, Section 230 is still the law of the land, and it has protected interactive computer services from most claims arising out of third-party content posted on their services since its passage in 1996.

If your company provides a platform for users to create or merely post content, you likely have a good sense of the legal claims against which Section 230 immunizes you.  You also likely know that you can probably win quickly on a motion to dismiss if those claims are brought against you, because courts generally consider the applicability of immunity at the earliest stage of litigation to avoid needless costs to the courts and the parties involved.[4]

But there are certain kinds of claims – defined by statute or developed by case law – against which Section 230 provides no immunity.  Companies should be aware of these exceptions to identify and address the highest risks associated with user content appearing on their platforms. We discuss below some of the most important types of claims for which Section 230 provides no quarter.

Federal Criminal Law Violations.  Section 230 provides no immunity against prosecutions for violations of federal criminal statutes.[5]

Platform companies should be aware that Section 230 is a broad shield against civil liability for torts, but not against criminal prosecutions, especially those based on criminal statutes concerning online trafficking in obscenity, stalking, and harassment.[6]

Promotion or Facilitation of Sex Trafficking.  There is no immunity for platforms that violate the "Allow States and Victims to Fight Online Sex Trafficking Act" (FOSTA).  FOSTA expressly creates an exception to Section 230 immunity, and platforms can be held responsible for behavior that facilitates or promotes sex trafficking or prostitution via federal criminal prosecutions, state criminal prosecutions, and even civil claims.[7]

Companies that run platforms allowing users to post personals-style advertisements, which might include ads by sex workers, would be well-advised to consult with counsel to ensure that they are minimizing their risk in this regard.  Indeed, after the passage of FOSTA, several websites suspended certain interpersonal functions.  For example, Craigslist took down its Personals and Therapeutic Services sections and blocked the reposting of advertisements previously listed in the Therapeutic Services section to other sections because it did not want to risk liability under this law and jeopardize its other services.[8]

Intellectual Property Infringement.  Many attack Section 230 because it supposedly relieves platforms of any duty to review, moderate, or take down content posted by their users.  That is not entirely true: platform companies do have some affirmative obligations to remove user-posted content, and failing to meet them can expose a company to legal liability.

A court may not construe Section 230 in a manner that would either "limit or expand any law pertaining to intellectual property."[9]  Therefore, Section 230 does not immunize platform companies from federal intellectual property claims, which include at least those based on the copyright, patent, and trademark laws.[10]  So, for example, a company is not free from liability for copyright claims arising from users publishing copyrighted material on its platform unless it adheres to the Digital Millennium Copyright Act's notice-and-takedown provisions.[11]

Additionally, whether platform companies are immune from claims based on state intellectual property rights is unsettled.[12]  For example, the United States Court of Appeals for the Ninth Circuit has held that Section 230 barred a state right of publicity claim, while the United States District Court for the District of New Hampshire declined to dismiss a plaintiff's right of publicity claim under New Hampshire law, holding that it was an "intellectual property" right and that Section 230's exception to immunity for intellectual property claims does not distinguish between federal and state intellectual property rights.[13]  This issue is currently pending before the United States Court of Appeals for the Third Circuit.[14]

Platform companies are thus well-advised to assume that they may need to respond not only to requests to take down copyrighted, trademarked, or patented material from their platforms, but also to requests to take down content that may be subject to other protections under state law, such as a person's name, image, or likeness.

Electronic Communications Privacy Act.  A court may not construe Section 230 "to limit the application of the Electronic Communications Privacy Act of 1986...or any similar State law."[15]  The ECPA prohibits, among other things, the actual or attempted interception, use, or disclosure of wire, oral, or electronic communications and protects the contents of files stored by service providers.[16]  That said, transmitting user communications is often an incident of the services that platforms provide, and so that activity alone does not give rise to liability under the ECPA.[17]

Product Design Flaws.  In Lemmon v. Snap, Inc., plaintiffs sued Snap over a Snapchat video filter that displayed the speed at which a user was traveling.[18]  The complaint alleged that the filter led users to drive recklessly in order to post what they deemed to be impressive feats, and that Snap should have known this would be the case and that its filter would lead to injuries when the reckless activities led to accidents.[19]

The court distinguished between blaming the platform for content that third parties generated and posted, and blaming the company for design decisions in its product that may have encouraged dangerous behavior by those using it.[20]  For Section 230 to provide a defendant immunity against a claim, the claim must seek to treat the defendant as the "publisher or speaker" of content provided by a third party, and the plaintiffs' negligent design claim did not seek to treat Snap as such.[21]

Platform providers thus should not assume that Section 230 immunizes them from liability for every application they make available to users to create and post content.  If a claim does not seek to treat the platform as a publisher or speaker of third-party content, then Section 230 provides no immunity.

Failure To Warn.  In Doe v. Internet Brands, the court found that Section 230 provided no immunity for a platform against a failure-to-warn claim where third parties used the platform to lure aspiring models to fake auditions at which they were drugged and raped, because the complaint alleged that the service knew of this criminal activity from an outside source.[22]

Again, the claim could proceed because it did not seek to hold the platform liable as the "publisher or speaker" of any third-party content.  If a platform thus learns that it is being used for criminal activity, particularly of a type that may place third parties in danger of harm, it should consult with counsel before turning a blind eye to such conduct, despite the broad leeway provided by the "Good Samaritan" provision to choose either to moderate, or not to moderate, user content.

Transactions Arising from Third-Party Posted Content.  Section 230 does not immunize platforms from claims based on laws prohibiting transactions that may arise from third-party content posted on them.  For example, claims based on the processing of payments for short-term rentals advertised in third-party posts will not be barred where the transactions violate local ordinances prohibiting online platforms from processing bookings for unlawful short-term rentals.[23]  While the host site would not be subject to liability for the third-party advertisement of the prohibited services appearing on its site, its processing of the violative transaction is not protected.

Thus, platforms seeking to process transactions arising from third-party postings must ensure that the transactions themselves are legal within all the jurisdictions in which they operate.

Anticompetitive Conduct.  Although the "Good Samaritan" provision gives platforms broad discretion in moderating the third-party content available on them, Section 230 does not provide immunity against claims that a platform removed the content of, or otherwise blocked, direct competitors for anticompetitive purposes, and Section 230(e)(3) therefore permits the enforcement of state-law claims against such behavior.[24]

Accordingly, platforms should consult with counsel to ensure that any content-moderation policies are not animated by a desire to harm the platforms' competitors.

Tortious Content Created By a Platform.  Section 230 will not bar liability where a platform participated in the creation of the tortious content.  Examples from the case law are instructive.  There is no immunity where the platform requires users to answer pre-populated questions, the resulting information is published, and the information is unlawfully discriminatory;[25] where a site creates false dating profiles to retain customers;[26] or where the platform deceptively mistranslates members' actions, such as by converting a user's click of a "Like" button on a company's page into the words "Plaintiff likes [Brand]," and further combining that text with the plaintiff's photograph, the company's logo, and the label "Sponsored Story."[27]

The more actively the platform is involved in framing, directing, developing, editing, or repurposing its users' information and activity on the platform into the final content that is published, the greater the risk it runs of losing its immunity under Section 230 if that content is in any way illegal.  Any plans around such efforts should therefore be run past counsel before implementation.

To be sure, Section 230 offers important and strong protection for platforms that enable the creation and posting of third-party content.  Understanding the limits of that protection, however, is integral to ensuring that a platform adequately assesses and addresses its risk of liability associated with user content, or at least the cost of defending a suit on the merits related to such content.


[1] 47 U.S.C. § 230(c)(1).

[2] Lemmon v. Snap, Inc., 995 F.3d 1085, 1091 (9th Cir. 2021).

[3] 47 U.S.C. § 230(c)(2).

[4] See, e.g., Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250, 255 (4th Cir. 2009).

[5] 47 U.S.C. § 230(e)(1); Gonzalez v. Google LLC, 2 F.4th 871, 890 (9th Cir. 2021).

[6] 47 U.S.C. § 230(b)(5); Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 23 (1st Cir. 2016).

[7] 47 U.S.C. § 230(e)(5).

[8] See Woodhull Freedom Found. v. United States, 948 F.3d 363, 368–69 (D.C. Cir. 2020).

[9] 47 U.S.C. § 230(e)(2).

[10] Perfect 10, Inc. v. CCBill LLC, 488 F.3d 1102, 1118 (9th Cir. 2007); Almeida v. Amazon.com, Inc., 456 F.3d 1316, 1322 (11th Cir. 2006).

[11] See 17 U.S.C. § 512.

[12] Compare, e.g., Perfect 10, 488 F.3d 1102, with Doe v. Friendfinder Network, Inc., 540 F. Supp. 2d 288, 298-304 (D.N.H. 2008); see also Atl. Recording Corp. v. Project Playlist, Inc., 603 F. Supp. 2d 690, 704 (S.D.N.Y. 2009) (Section 230 does not provide immunity against federal and state intellectual property claims).

[13] Id.

[14] Hepp v. Facebook, et al., No. 20-2885 (3d Cir.).

[15] 47 U.S.C. § 230(e)(4).

[16] 18 U.S.C. §§ 2511, 2701.

[17] Doe v. GTE Corp., 347 F.3d 655, 658–59 (7th Cir. 2003).

[18] 995 F.3d 1085, 1087 (9th Cir. 2021).

[19] Id.

[20] Id. at 1090-95.

[21] Id.

[22] 824 F.3d 846 (9th Cir. 2016).

[23] Homeaway.com, Inc. v. City of Santa Monica, 918 F.3d 676 (9th Cir. 2019).

[24] Enigma Software Grp. USA, LLC v. Malwarebytes, Inc., 946 F.3d 1040, 1052 (9th Cir. 2019), cert. denied, 141 S. Ct. 13, 208 L. Ed. 2d 197 (2020).

[25] Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1166 (9th Cir. 2008).

[26] Anthony v. Yahoo Inc., 421 F. Supp. 2d 1257, 1262–63 (N.D. Cal. 2006).

[27] Fraley v. Facebook, Inc., 830 F. Supp. 2d 785, 801 (N.D. Cal. 2011).


Reprinted with permission from the August 30, 2021 issue of The Recorder. ©2021 ALM Media Properties, LLC. Further duplication without permission is prohibited.  All rights reserved.