This fall, the Supreme Court marked a turning point in Internet history when it agreed to hear Gonzalez v. Google, the first case asking the Court to interpret Section 230.
Section 230 states that online companies are not to be “treated as publishers” of content provided by third parties, such as postings on their websites. Enacted by Congress in 1996 as part of the ill-fated Communications Decency Act, the law gives parties such as Google, Twitter and Facebook a degree of legal immunity for the content users share on their platforms.
The law protects companies that provide a platform for others to speak from the constant threat of defamation lawsuits, while giving them the power to remove content they deem objectionable. This has allowed for the robust and often dissonant discourse that defines today’s Internet. What does the Supreme Court’s intervention mean for its future?
The Gonzalez case reached the courts after an Islamic State attack in Paris killed a young woman, Nohemi Gonzalez. Her estate and family allege that Google violated anti-terrorism laws by allowing a terrorist organization to post content on YouTube (which Google owns) that furthered its mission. They also claim that Google’s algorithms promoted IS by recommending its content to users.
The two courts that have heard the case so far have ruled that Section 230 immunity covers the alleged violations of anti-terrorism laws. In other decisions involving Section 230, however, the Ninth Circuit Court of Appeals, which has jurisdiction over West Coast cases, has construed the law’s protections more narrowly than other courts have. The possibility that the same law could mean different things depending on where someone lives in the United States offends the rule of law. That concern may explain the Supreme Court’s interest in Gonzalez, along with the novel questions raised by algorithmic recommendations.
If the Court adopts a broad view of the protections, platforms will have little incentive to review the content they host. If it adopts a narrower view, platforms will face pressure to moderate content more aggressively.
Proponents of the narrow stance might argue that while broad liability protection was adequate when the industry first emerged, it is less justified now that Internet companies have become largely dominant.
On the other hand, those who favor maintaining Section 230’s broad immunity argue that limiting the protection to certain types of content would force companies to make difficult and contested judgments about which content crosses the line. Rather than doing that painstaking work, they claim, companies would simply remove anything remotely troublesome. As a result, a significant amount of online discourse would be lost, including much that poses little risk of liability.
History provides good reason to worry that narrowing the immunity would suppress speech. In 2018, Congress enacted an amendment providing that Section 230 does not apply to content that violates laws against sex trafficking. Two days after that law went into effect, Craigslist removed its personals section entirely rather than try to determine which content actually related to prostitution. Other companies followed suit with similarly drastic measures.
But the Supreme Court could approach the Gonzalez case in a very different way, focusing less on content moderation than on platform design. Section 230 expressly allows companies to remove certain types of objectionable content. What is less clear is whether the law provides similar protection for algorithmic decisions to promote content, the question at the heart of the Gonzalez plaintiffs’ challenge to YouTube’s algorithm. The justices could limit platforms’ ability to recommend content using algorithms, a practice now central to these companies’ business models and one that users rely on.
The Supreme Court’s decision in Gonzalez could be the most significant development for Section 230 in the foreseeable future. Last year’s congressional hearings on the issues raised by the law reflected a partisan divide, with Democrats wanting more content removed and Republicans wanting less, leaving legislative consensus out of reach. If the Supreme Court proceeds on its usual schedule, we’ll know by the end of June whether it will reshape the future of the Internet.
Christopher S. Yoo is a law professor and founding director of the Center for Technology, Innovation, and Competition at the University of Pennsylvania. ©2022 Los Angeles Times. Distributed by Tribune Content Agency.