Scores of briefs were filed last month in a closely watched Supreme Court case by parties with a stake in what has been called “the most important law in tech.” Nearly 80 briefs by more than 20 U.S. companies, 14 industry bodies, 40 civil society organizations, 30 individuals, 20 current and former Senators and members of Congress, 28 States, and the U.S. Government were filed in Gonzalez v. Google. Gonzalez is the first Supreme Court case involving Section 230 of the federal Telecommunications Act, and the future of the internet as it is currently known may rest on the outcome.
Recent U.S. Government figures put the U.S. digital economy at more than 10% of the total U.S. economy, and show it growing even when the overall economy contracted. All of this economic growth and activity can be traced to what was once regarded as an obscure provision of the 1996 Telecommunications Act, a law known as “Section 230.” Section 230 is important not only to the U.S. digital economy, but to every single website with comments or third-party content, making it a critical provision for free expression online. As Senator Wyden, one of the law’s authors, put it: “I wrote this law to protect free speech.”
This month, Section 230 is going before the Supreme Court for the first time in Gonzalez v. Google, a case brought by the family of an American victim of the 2015 Paris ISIS attacks, advancing the novel claim that the video service YouTube is responsible for the attack.
Because of the sweeping ramifications of this case for websites of all sizes and the users that rely on them, there was overwhelming support for preserving Section 230. While there were briefs on the other side, a diverse set of industry, civil society, and government stakeholders filed in support of Google, and even briefs in support of Gonzalez or of neither party acknowledged the importance of Section 230 to the internet.
As MIT Technology Review explained in an article entitled “How the Supreme Court ruling on Section 230 could end Reddit as we know it”: “In short, the court could change Section 230 in ways that won’t just impact big platforms; smaller sites like Reddit and Wikipedia that rely on community moderation will be hit too.”
There were nearly 80 briefs filed in this case, most by amici curiae, literally “friends of the court.”1 These briefs explain the effect the case would have on others who are not parties before the Court. Of these briefs, 48 supported Google, 19 supported Gonzalez, and 11 supported neither party. The United States also weighed in, expressing concerns with the decision below and requesting to participate in oral argument.
Many of the briefs fall into three main buckets, with the majority in the first bucket:
- Section 230 is working and should not be changed
- the Court shouldn’t touch Section 230, but Congress can
- the Court should weigh in on Section 230 in this case
Broadly speaking, companies, industry groups, and trade associations fall into the first camp: Section 230 is working and should not be changed. The executive branch, at both the federal and state levels, wants the law to change, but argues this should be done through Congress, not the Court. Lastly, most legislators who submitted briefs, along with the state of Texas, want the Court to weigh in through this case. Experts and civil society are fairly evenly distributed among the buckets, with some focusing on issues well outside the extremist content at issue in this case.
The fate of the internet may hang on this Supreme Court case because without Section 230, websites and apps would need to approve everything people say and post online. Some companies would over-sanitize their services, as lawyers reject everything remotely controversial. Others would abdicate responsibility and stick their heads in the sand. The result is that sites would turn into either kindergarten or chaos: Sesame Street or Bourbon Street. And neither one is what most users — or most advertisers — want.
Want to learn more? Here’s a deeper dive:
-
This case was brought in June 2016 by Reynaldo Gonzalez, Jose Hernandez, Rey Gonzalez, and Paul Gonzalez – relatives of Nohemi Gonzalez, an American student studying abroad in Paris who was killed in the November 2015 attacks carried out by a group recruited by ISIS. Gonzalez et al. sued Google, alleging that YouTube aided ISIS in recruiting members by recommending the videos ISIS posted on YouTube to other viewers. The district court dismissed their complaints in October 2017 and August 2018, finding that Section 230 covered YouTube’s recommendation of the videos because they were produced and posted by third parties. Although the Ninth Circuit affirmed the decision in June 2021, the panel was split over whether Section 230 actually did protect Google’s activities. Gonzalez then petitioned the Supreme Court for certiorari in April 2022.
In their July 2022 reply brief, Gonzalez argued that this case must be heard in order for the Court to decide Twitter v. Taamneh, because the Court would “moot the dispute between the parties” by not granting cert in this case. In their November 2022 merits brief, Gonzalez reframed their question presented, from asking generally whether content recommendations fall under the traditional editorial functions protected by Section 230 to asking specifically when, if ever, Section 230(c)(1) immunity applies to the “recommendations of third party content.”
In their February 2023 reply brief, Gonzalez reiterated their earlier position that the recommendation tools Google uses do not always fall within the traditional editorial functions of a publisher, nor do they always involve recommending only third-party content, and that some claims therefore fall outside the scope of Section 230. Gonzalez also emphasized and expanded on their previous argument that recommending content does not fit within the language of Section 230, as recommending unasked-for content does not fit the dictionary understanding of an “interactive computer service.” Gonzalez further cautioned that a ruling in their favor would not result in “dire consequences,” as other defenses, such as the First Amendment, would still be available.
-
In their July 2022 brief opposing cert, Google asserted that no circuit split exists, noting that the Ninth and Second Circuits are the only courts to have addressed this question and that both decided Section 230-protected publication includes algorithm-generated recommendations. Google further argued that, as the Ninth Circuit held, allegations that YouTube failed to prevent ISIS from using its service themselves show that Google falls squarely into the category of publishers protected under Section 230, emphasizing that algorithm-generated recommendations are “neutral tools” used to “deliver content in response to user inputs” and that under circuit precedent YouTube is not the “content creator or developer.”
Google’s January 2023 merits brief emphasizes that Section 230(c)(1) would bar Gonzalez’s claims. Google argues contrary interpretations lack support and risk upending the modern internet because the way the internet operates today would not be viable if the law treated every website and user as the publisher or speaker of the third-party content they disseminated. Lastly, Google claims a reversal in Taamneh would resolve this case. If no cause of action exists in Taamneh, no cause of action exists for Section 230(c)(1) to insulate, and the Court need not reach the question of Section 230’s scope.
-
The Biden Administration’s Department of Justice filed a brief in support of vacating the Ninth Circuit’s opinion. The Administration claims that Section 230(c)(1)’s text is most naturally read to prohibit courts from holding a website liable for failing to block or remove third-party content, but not to immunize other aspects of the site’s own conduct. The Administration also argues that Section 230(c)(1) bars Gonzalez’s claims to the extent they are premised on YouTube’s alleged failure to block or remove ISIS videos from its site, but does not bar claims based on YouTube’s alleged targeted recommendations of ISIS content. Ultimately, it claims the Court should give Section 230(c)(1) a fair reading, with no thumb on the scale in favor of either a broad or a narrow construction. The Acting Solicitor General also filed a motion for leave to participate in oral argument and for divided argument, which the Court granted.
A bipartisan group of 26 states and the District of Columbia, led by Tennessee, filed a brief in support of Gonzalez. They argue content moderation is an issue for state regulation and that the “states’ ability to remedy internet-related wrongs has been severely hampered by the judicial expansion of the internet ‘publisher’ immunity under Section 230.” They claim Section 230 does not contain “unmistakably clear language” to permit such expansive immunity because Congress merely wanted to protect interactive computer services from defamation liability. Texas filed separately to address concerns related to NetChoice & CCIA v. Paxton, in which it claims the current interpretation of Section 230 preempts a Texas state law that “seeks to preserve free speech on the internet by preventing the biggest social-media platforms from censoring users based on viewpoint.”
The authors of Section 230, Senator Wyden (D-OR) and former Congressman Cox (R-CA), filed in support of Google, clearly explaining their intent in enacting the law, including that they contemplated algorithms: “recommending systems that rely on such algorithms are the direct descendants of the early content curation efforts that Congress had in mind when enacting Section 230.”
Senator Hawley (R-MO), a longtime critic of Section 230 who has co-sponsored several bills to amend or even repeal it, argued that Section 230(c)(1) merely protects interactive computer service providers from liability as publishers of unlawful content posted by a third party when editing or removing that unlawful content. A brief in support of neither party filed by Senator Cruz (R-TX) and 16 other Republican members of Congress, including members of the Senate and House Judiciary Committees, argued for a similar textual interpretation of Section 230(c)(1): that it should merely “preclude courts from treating internet service providers as the speaker or publisher of third-party content on their websites.”
-
More than 20 internet companies filed briefs in support of Google — some filing on their own, and others in joint briefs with other companies and/or industry organizations. Amici included websites and online businesses of all sizes, representing many different types of online marketplaces, forums, communities, and platforms for blogging, software, and user reviews. Overall, online service companies share the view that Section 230 is paramount for the operation of the internet as we know it today and that any alteration to the law would jeopardize this technology, both for emerging businesses and companies operating at scale.
These companies echoed the importance of Section 230 to a wide range of businesses and their users, including Automattic, craigslist, Internet Works (joined by Etsy, Glassdoor, Pinterest, Roblox, Scribd, Skillshare, Tripadvisor, and Vimeo), Meta, Microsoft, Reddit, Twitter, Yelp, ZipRecruiter and Indeed, and cPanel, Identity Digital, Texas.Net, and Tucows. These services enable users to post third-party content, including comments, and, as craigslist puts it, “would not exist without Section 230.”
Two of these companies, Twitter and Meta, are parties in a companion Supreme Court case being argued on February 22, Twitter v. Taamneh. Twitter explains that excluding “recommendations” of particular content from Section 230’s ambit would be highly impractical because it would curtail the selective filtering and organizing of content, without which the volume of content would be overwhelming and therefore useless to users. Twitter says the Court should reverse Taamneh, which would also resolve the claims in this case. However, if the Court nonetheless addresses Section 230(c)(1), it should affirm, because Section 230 is “pivotal to the modern internet.”
Similarly, Meta notes that “decisions about how to organize and display third-party content fall within the heartland of §230’s protection.” Further, it warns that Gonzalez’s “proposal to protect removing content but not ‘recommending’ it would rewrite the statute and create incentives to remove content that Congress never intended.” Meta also highlights its robust anti-terrorism policies, which rely on algorithms for enforcement.
The brief from Reddit, joined by several volunteer Reddit moderators, discusses how “Section 230 protects Reddit, as well as Reddit’s volunteer moderators and users,” adding that “without robust Section 230 protection, internet users—not just companies—would face many more lawsuits from plaintiffs claiming to be aggrieved by everyday content-moderation decisions. Reddit’s moderators have been sued in the past, and Section 230 was the tool that allowed them to quickly and inexpensively avoid frivolous litigation.”
-
The Court also received briefs from more than a dozen trade associations and other industry groups representing startups and mid-sized technology companies, marketplaces, internet infrastructure services, and product manufacturers. Many of these briefs explained the significance of Section 230 to industry and individual users, and provided additional context on Congressional intent, content moderation, and the use of algorithms.
The vast majority of trade group briefs supported Google. These include ACT, CCIA, NetChoice, DiMA, ITI, IAB, TechNet, the Chamber of Commerce, Developers Alliance, Engine, i2Coalition, Marketplace Industry Association, Product Liability Advisory Council, SIIA, and others. Their briefs, in general, emphasize that eliminating protection for algorithmic curation would especially harm marginalized speakers and suppress significant amounts of free speech.
CCIA, NetChoice, DiMA, ITI, IAB, and TechNet emphasize that digital services increasingly use algorithms to organize content and present it to users in a useful way. That organizing function is at the core of what digital services do, and what Section 230 protects. They argue that accepting Gonzalez’s view that organizing content amounts to making an unprotected “recommendation” would render Section 230 meaningless and leave many digital services less usable and less useful.
In addition to representing companies, several of the industry briefs were co-signed by interested individuals, including four volunteer Reddit moderators and Chris Riley, who runs a social media service, the Mastodon instance https://techpolicy.social, and moderates its community of users. These briefs highlighted community moderation and human review, in addition to algorithms, and the role of Section 230 in ensuring this type of moderation can work effectively.
-
A range of scholars and other experts also weighed in, including economists, legal scholars, technical experts, and national security experts. Whereas companies and industry groups overwhelmingly supported Google, more legislators favored Gonzalez, and the executive branch at the federal and state levels sought a “third way,” experts’ opinions are spread fairly evenly among the different approaches to Section 230.
Economists favored Google. Ginger Zhe Jin, Steven Tadelis, Liad Wagman and Joshua Wright argued that weakening the protections Congress established in Section 230 would potentially harm online free expression, stifle internet growth, and impair innovation. The brief from Fellows of Technology Policy Institute states that the targeting of search results, advertising, and other content to users fuels the overall digital economy and generates massive benefits for all consumers of information, goods, and services and for the firms that use the platforms to reach them.
Prof. Eric Goldman wrote about how Section 230 is a speech-enhancing statute, adding substantive and procedural speech protections to the First Amendment, and how Gonzalez’s requested outcomes would eliminate Section 230’s procedural benefits. A group of 19 internet law scholars focused on how automated recommendations are consistent with the text and purpose of Section 230. A group of 7 scholars of civil rights and social justice explained how “Section 230 amplifies underserved and marginalized voices,” and “undermining Section 230 would harm, rather than protect, the civil rights of marginalized people.”
Cyber Civil Rights Initiative and 2 legal scholars filed in support of Gonzalez. In their view, Section 230(c)(1) is a narrow limitation on liability that applies only to speech actions, and even more specifically only to such actions that attempt to impose liability on a provider or user of an interactive computer service as though it were the original author of a third-party’s speech.
There were several briefs from technical experts with experience studying algorithms that recommend content online. A group of 7 information science scholars declared their support for Google. Their brief argued that, contrary to Gonzalez’s position, Section 230 does not draw a distinction between computer systems that rely on explicit user requests for information and those that rely on implicit requests via a user’s actions. CDT and 6 technologists echo that argument, stating that “recommendation is functionally indistinguishable from selecting and ordering or ranking items for display,” something every provider must do. In that sense, both argue that should Gonzalez’s position prevail, internet users would be less able to speak freely and everyone would have less access to information, because internet service providers would reduce the type of content displayed to their users to minimize risk.
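To make the technologists’ point concrete, here is a minimal, hypothetical sketch (not drawn from any brief or any actual service; the item fields and scoring weights are invented for illustration) of how a “recommendation” is, functionally, the same scoring-and-sorting step a service performs whenever it selects and orders items for display:

```python
# Hypothetical sketch: a "recommendation" is just a ranking of third-party
# items for display. Field names and weights are invented for illustration
# and do not describe any particular service.
from dataclasses import dataclass

@dataclass
class Item:
    title: str        # third-party content created by a user
    relevance: float  # match against an explicit request (e.g., a search query)
    affinity: float   # match against implicit signals (e.g., viewing history)

def rank(items, weight_affinity=0.0, top_k=3):
    """Score and sort items; every service must do this to display anything."""
    return sorted(
        items,
        key=lambda i: i.relevance + weight_affinity * i.affinity,
        reverse=True,
    )[:top_k]

catalog = [
    Item("Cooking basics", relevance=0.9, affinity=0.2),
    Item("Guitar lesson", relevance=0.4, affinity=0.8),
    Item("Travel vlog", relevance=0.6, affinity=0.5),
]

# "Search results": rank purely on the explicit request.
print([i.title for i in rank(catalog, weight_affinity=0.0)])

# "Recommendations": the same ranking step, with implicit signals weighted in.
print([i.title for i in rank(catalog, weight_affinity=1.0)])
```

Under this reading, the only difference between displaying content and “recommending” it is which signals feed the ranking, which is why these amici argue the two cannot sensibly be separated for Section 230 purposes.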
Adopting a position in support of neither party, the CITP Tech Policy Clinic urges the Court to consider the salient features of recommendation engines in determining what platform conduct to immunize. Also in support of neither party was the brief from the Integrity Institute and AlgoTransparency, who claim the key inquiry under Section 230 is not whether an algorithm was involved, but whether the platform’s conduct is illegal.
A group of 10 national security experts filed a brief in support of Google, explaining that “limiting Section 230 immunity would discourage online platform providers from removing or downranking dangerous content,” and given the national security threat from hostile foreign governments, online services’ participation is necessary.
There was also a brief from 21 former national security officials, including several current and former members of Congress, in support of neither party. They argued that the 2016 Justice Against Sponsors of Terrorism Act implicitly repeals Section 230 to the extent that it shields Google from liability for Gonzalez’s claims based on the algorithmic amplification of terrorist content.
-
Another former Republican Congressman and 20 libertarian and free market groups filed in support of Google, including a coalition of organizations based in Colorado, Florida, Idaho, Illinois, Louisiana, Minnesota, New Mexico, Oklahoma, Tennessee, and Utah.
TechFreedom, along with a joint brief from the Cato Institute, R Street Institute, and Americans for Tax Reform, argued that narrowing the scope of Section 230 would have dire consequences for online speech, creating more space for misinformation and fewer online spaces for marginalized voices. A brief joined by former Senator Rick Santorum (R-PA) went along the same lines, claiming that changing the statute would strip away “any meaningful organization of user content” and chill speech.
American Action Forum urged the Court to “honor Congress’s express deregulatory intent, give effect to every word of Section 230, safeguard a competitive marketplace, and promote innovation and competition in the technology sector” by affirming the lower court’s decision. Reason Foundation argued that the purpose of Section 230 calls for a broad reading “in the event of any ambiguity,” not a narrow one. This tracks closely with the view of the Washington Legal Foundation, which explained that Section 230’s language preempts state law and that, because Congress has the authority to “regulate interstate and international commerce,” internet regulation falls within this broad authority.
Finally, a coalition of free market groups from 10 different states, comprising the Center for Growth and Opportunity, Americans for Prosperity Foundation, Beacon Center of Tennessee, Freedom Foundation of Minnesota, Illinois Policy, Independence Institute, James Madison Institute, Libertas Institute, Mountain States Policy Center, Oklahoma Council of Public Affairs, Pelican Institute for Public Policy, and Rio Grande Foundation, argued that Section 230 already encompasses and protects Google’s actions because the recommendation algorithms are merely “enabling tools” that providers are allowed to use under the statute and do not qualify them as publishers. As an aside, the brief argued that, contrary to some conservative viewpoints, Section 230 does not need to be modified, as there is no factual evidence of political bias.
27 progressive and nonpartisan groups, including advocates for creators, trust & safety, libraries, press freedom, and human rights, filed in support of Google. Wikimedia Foundation states that in seeking to hold Google liable in this case, Gonzalez presses a theory of Section 230(c)(1) that jumbles its language, disregards canons of construction, and would lead to absurd results.
Chamber of Progress, Global Project Against Hate and Extremism, HONR Network, Information Technology and Innovation Foundation, IP Justice, LGBT Tech Institute, and Multicultural Media, Telecom and Internet Council asserted that Section 230 was designed to protect and foster First Amendment protections online. Bipartisan Policy Center argued that the Court should exercise judicial restraint in the case and preserve the status quo by affirming the prevailing interpretation of Section 230.
Authors Alliance and a host of Internet Creators asserted that Google’s recommendation function helps content creators “share content” and grow their businesses, and that narrowing Section 230 to exclude this function would make platforms “less likely to host and promote independent creators’ content,” chilling speech and harming independent creators, which runs counter to Congress’s original intention. Trust & Safety Foundation discussed the integral role trust and safety teams play in enforcing content moderation policies and how any limits to Section 230 would be “extremely burdensome” for these teams.
ACLU, ACLU of Northern California, and Daphne Keller filed an amicus brief in support of Google, commenting that “plaintiffs here seek to hold YouTube liable for third-party content where it did no more than list that content as potentially of interest.” Article 19: Global Campaign For Free Expression and the International Justice Clinic at the University of California, Irvine School of Law argue that Section 230 “promotes the same values of access to information and freedom of expression that are guaranteed by international human rights law, in particular Article 19 of the International Covenant on Civil and Political Rights.”
-
Briefs in support of Gonzalez also ranged from right-wing to left-wing groups and focused on what they view as the excessive immunity granted under Section 230, mainly seeking to defend the rights of citizens online. In general, they urge the Supreme Court to reframe the understanding of the law so that online service providers are immunized only for decisions that meet the statutory preconditions.
America’s Future, U.S. Constitutional Rights Legal Defense Fund, and Conservative Legal Defense and Education Fund argued that Section 230 was not intended to immunize providers that regulate content in concert with the government to promote the “official government narrative” on a subject, even while recognizing that this piece of legislation was singularly responsible for the creation of a robust and unregulated internet, without which the current system would degrade if not collapse. Free Speech for People claims the Ninth Circuit’s ruling expands Section 230 immunity far beyond the text and purpose of the statute. The American Association for Justice follows the same lines, saying Section 230 confers narrow protection on internet providers, not blanket immunity, and urging the Court to adopt a narrow interpretation of Section 230 immunity.
Other briefs, from members of the U.S. military, the Liberty Justice Center, the National Police Association, and others, all urge the Court to reassess the scope of Section 230, saying the lower courts’ interpretation departed from the statute’s original reading. Some argue the law should not be read to immunize violations of the Antiterrorism Act, as doing so would eliminate a powerful incentive for algorithm designers to be vigilant in the fight against terrorism.
Some briefs in support of Gonzalez focused on other types of dangerous content involving children, including child sexual abuse material (CSAM). National Center on Sexual Exploitation, the National Trafficking Sheltered Alliance, and RAINN argued that the broad-immunity interpretation of Section 230 has vitiated access to justice for CSAM survivors, saying that internet companies have a de facto immunity for knowing violations of federal law concerning CSAM. CHILD USA similarly urged the Court to interpret Section 230 consistently with its text and child safety purposes to avoid further injustice and to give victims an avenue for meaningful redress. Common Sense Media and Frances Haugen claimed that Google’s activities create a receptive and captive audience by recommending and steering adolescents to ISIS videos promoting terrorism and other harmful content.
-
Similarly, some amici took the opportunity of the first Section 230 case being before the Supreme Court to file briefs in support of neither party in order to express their views on the intersection of Section 230 with other types of dangerous content besides terrorism, like hate speech, civil rights, and children’s mental health and safety online.
EPIC urges the Court to return Section 230 to its original meaning, claiming that over the decades the provision has been contorted to grant internet companies unprecedented immunity from civil liability for their own harmful conduct. Free Press Action, on the other hand, says Section 230 does not preclude classifying Google as a distributor and obligating it to remove content it knows to be unlawful, so Section 230 does not prevent holding Google liable for knowing violations of the Anti-Terrorism Act (ATA). ADL follows the same argument, contending that the Court should make clear that Section 230 does not immunize a platform from the consequences of its own tortious conduct. The Institute for Free Speech and Adam Candeub argue that determining whether YouTube “creat[ed] or develop[ed]” targeted recommendations using its algorithms would require factual development currently missing from the record.
Lawyers’ Committee for Civil Rights Under Law, Asian Americans Advancing Justice, National Coalition on Black Civic Participation, National Council of Negro Women, Office of Communication of the United Church of Christ, and Take Creative Control argue that when interpreting Section 230, the Court must adopt a balanced approach that neither impedes enforcement of civil rights laws nor increases censorship of people of color and other historically underserved communities, urging the Court to preserve the benefits of Section 230 without allowing it to become a “get-out-of-jail-free card” for civil rights violations.
Giffords Law Center to Prevent Gun Violence uses its brief to examine the role of social media in three recent hate-motivated mass shootings in the United States. Fairplay claims American youth are experiencing mental health crises resulting from products and practices employed by social media companies. The Children’s Advocacy Institute at the University of San Diego School of Law uses its brief to explore how AI-driven social media recommendation engines like YouTube’s work, and claims social media platforms are the only entities that can prevent or reduce the risk of injuries that might happen online.
1 CCIA Law Clerks Sam Carswell, Amber Grant, Lauren Lehner, and Lukas Ruthes Gonçalves were instrumental in helping with this project, as was Michael Kwun’s excellent resource.