One Pending Supreme Court Case Could Change the Internet as We Know It: Gonzalez v. Google and Tech Platforms’ Liability
The Supreme Court granted certiorari in Gonzalez v. Google, a high-stakes appeal from the Ninth Circuit concerning the scope of the protection that Section 230 of the Communications Decency Act affords technology companies against liability for the content on their platforms. The case could fundamentally change the modern landscape of the internet and online speech.
At issue is a provision of Section 230 that reads, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Under this law, websites like Facebook, Twitter, and YouTube, as well as internet search engines, have since 1996 enjoyed broad immunity from liability for claims like defamation based on content their users post to their platforms, often defeating such claims on early motions to dismiss. The Gonzalez family aims to hold YouTube’s parent company, Google, responsible for the death of their family member, Nohemi Gonzalez, who was killed in a terrorist attack in Paris in 2015. The family’s argument centers on YouTube’s algorithm, which “recommends” additional videos for a given user to watch based on videos the user has already watched. According to Nohemi’s family, YouTube “recommended” terrorist group ISIS’s radicalizing videos to some viewers, and these “recommendations” contributed to Nohemi’s death. The Gonzalez family argues that Section 230(c)(1) protects only a platform’s exercise of traditional editorial functions over user-generated content, such as deciding whether to alter, withhold, publish, or withdraw the content. In their view, it does not extend to “recommending” user-generated content, because that exceeds traditional editorial functions: recommendations involve communication by the platform itself, and therefore go beyond “publishing” (i.e., providing the public with content supplied by others). To illustrate, petitioners analogize to publishing a book: were a member of the Court to recommend a book, they would not by doing so be transformed into the publisher of the book.
Google, on the other hand, argues that an online service acts as a publisher within the meaning of Section 230 when it curates and displays third-party content of potential interest to each user, because pairing one piece of third-party content with another does not change the underlying content. Brief in Opposition, at 21. Google reasons that “actively bringing [a speaker’s] message to interested parties . . . falls within the heartland of what it means to be the ‘publisher’ of information.” Id., at 3.
Google also argues that, unlike in many cases the Supreme Court hears, there is no split of authority among the circuit courts of appeals here. Rather, both the Second and Ninth Circuits have held that Section 230 protects targeted recommendations by interactive computer services, although with divergent reasoning. See, respectively, Force v. Facebook, Inc., 934 F.3d 53, 68 (2d Cir. 2019); Dyroff v. Ultimate Software Group, Inc., 934 F.3d 1093, 1098 (9th Cir. 2019). In Force, the Second Circuit held that Facebook’s recommendations remain traditional editorial functions because distributing third-party information inherently connects users to content, whether recommended or not. 934 F.3d at 67. In Dyroff, the Ninth Circuit found that holding the defendant liable for making such recommendations would “inherently require the court to treat the defendant as the ‘publisher or speaker’” of content provided by another, and would therefore run counter to the intent of Section 230. 934 F.3d at 1098. The Ninth Circuit’s ruling below in Gonzalez v. Google followed Dyroff in finding that targeted recommendations do indeed fall within the scope of Section 230’s protection. 2 F.4th 871, 895 (9th Cir. 2021). The Gonzalez family counters that the relevant conflict lies elsewhere: this case, Dyroff, and Force all conflict with decisions of other circuit courts of appeals holding that the protections of Section 230 apply only to traditional editorial functions.
Judges in both the Second and Ninth Circuits have raised concerns that the scope of the immunity granted by Section 230 needs clarification. Three judges in particular – Katzmann, Berzon, and Gould – have argued that the majority interpretations of Section 230 run counter to the law’s text and intent. Force, 934 F.3d at 77-84 (Katzmann, J., concurring in part and dissenting in part); Gonzalez, 2 F.4th at 913-917 (Berzon, J., concurring); id. at 920 (Gould, J., concurring in part and dissenting in part). This division among circuit judges will now be resolved by the nation’s highest court.
Now, the Supreme Court, in its first-ever case interpreting Section 230, will determine whether the provision immunizes “interactive computer services” (i.e., websites like YouTube) when they make targeted recommendations of content provided by another “information content provider” (i.e., their users). At stake in Gonzalez is the very manner in which online content is delivered to users. Currently, every big-name internet platform – Facebook, Google, Twitter – uses algorithms that recommend content based on users’ online behavior. If the Court were to find that interactive computer services can be held liable for their recommendations, it could fundamentally change how those platforms operate. As Google points out, interactive computer services must constantly choose what information to display, and how, in order to help users navigate the sheer volume of data online. Users accustomed to platforms “curating” content for them may dislike having to hunt through mountains of random content to find what they want to see if these platforms stop making recommendations altogether. If platforms continue to make recommendations, they may face significantly increased costs: more stringent recommendation algorithms, AI filtering of recommended content, and larger human-reviewer headcounts to avoid recommending content that could expose them to liability. They would also have to increase their legal spending to litigate on the merits cases they previously could have defeated on a motion to dismiss.
But Nohemi’s family responds that recommended content takes control over the information received away from users and places it in the hands of internet companies. Petition, at 32. The stated policy of Section 230 is to “encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools” who use the internet and other computer services. Id. As Nohemi’s family reasons, these algorithms are at best highly influential – and at worst downright dangerous. Petition, at 19.
Given the huge potential ramifications for the tech industry, this case is one to watch this term.