This month, the Supreme Court marked a turning point in Internet history. The court agreed to hear Gonzalez v. Google, the first case to interpret Section 230 — the once-obscure law now widely regarded as having “created the internet” and debated by politicians on both sides of the aisle.
Section 230 states that providers of online services shall not "be treated as the publisher or speaker" of any content provided by a third party, such as users posting on the companies' websites. Passed by Congress in 1996 as part of the otherwise ill-fated Communications Decency Act, the law provides legal immunity to actors like Google, Twitter and Facebook for content shared by users on their platforms.
The law protects companies that provide platforms for other people's speech from the constant threat of defamation lawsuits, while giving them the power to remove objectionable content. This combination allowed for the robust, often contradictory discourse that defines the internet today. What could the Supreme Court's intervention mean for its future?
The Gonzalez case, now before the court, arose after a young woman, Nohemi Gonzalez, was killed in an Islamic State attack in Paris. Her estate and family claim that Google violated the Anti-Terrorism Act by allowing the terrorist organization to post content on YouTube (which Google owns) that furthered its mission. They also claim that Google's algorithms promoted Islamic State by recommending its content to users.
To date, the two courts that have heard the case have ruled that Section 230 immunity extends to alleged violations of the Anti-Terrorism Act. But in other Section 230 decisions involving different statutes, the 9th Circuit Court of Appeals, which hears West Coast cases, has interpreted the law's protection more narrowly than other courts have. The possibility that the same law means different things depending on where someone lives in the United States sits uneasily with the rule of law. Reconciling such inconsistencies is a common reason for the Supreme Court to take a case, and it may explain the current court's interest in Gonzalez, as may the new questions the case raises about algorithmic recommendations. Justice Clarence Thomas has also expressed interest in revisiting Section 230 in previous opinions.
The court could simply affirm the broad view of Section 230's protections, which reduces platforms' incentives to review the content they carry. If the court takes a narrower view, however, its decision would push platforms toward more content moderation.
Proponents of the narrow position may argue that while broad liability protection was appropriate when the industry was first emerging, it is less so now that Internet companies are large and dominant. Tighter rules would place greater responsibility on companies to exercise discretion over the content they host, which can reach millions of people.
On the other hand, proponents of Section 230's broad immunity argue that limiting protection to certain types of content would lead companies to eliminate anything remotely troubling rather than take on the difficult, contentious task of deciding on which side of the line a given piece of content falls. The result would be the loss of a significant amount of online discourse, including anything likely to create even the slightest risk of liability.
History provides good reason to worry that shrinking immunity could chill or stifle speech. In 2018, Congress amended Section 230 so that it no longer applies to content that violates laws prohibiting sex trafficking. Two days after the law took effect, Craigslist shut down its personals section entirely rather than try to determine which listings were related to trafficking. Other companies followed suit with similarly broad responses. The experience shows that limiting immunity can reduce the amount of speech available online. It may even lead content curators to abandon current efforts to strengthen their controls, since the more they moderate their content, the more scrutiny they invite.
But the Supreme Court may approach the Gonzalez case quite differently, focusing less on content moderation than on how platforms are designed. Section 230 expressly allows companies to remove certain types of objectionable content. What is less clear is whether the law provides similar protection for algorithmic decisions to promote illegal content. That question is the crux of the Gonzalez plaintiffs' challenge to YouTube's algorithms. Any online curator must decide how to present its content to users, and courts could limit platforms' ability to use algorithms to recommend content, a practice that is now central to these companies' business models and on which all users depend.
The Supreme Court's decision in the Gonzalez case will likely represent the most consequential update to Section 230 in the foreseeable future. Last year's congressional hearings on issues raised by the statute reflected a partisan divide between Democrats who called for more content to be taken down and Republicans who demanded less, suggesting legislative consensus is unlikely anytime soon. If the Supreme Court sticks to its expected schedule, we'll know whether it decides to reshape the future of the internet by the end of June.
Christopher S. Yoo is a professor of law and founding director of the Center for Technology, Innovation and Competition at the University of Pennsylvania.