
The U.S. Supreme Court this week heard oral arguments in a pair of cases that may go a long way toward determining the future of online speech, according to legal scholars and tech analysts.

The two cases — Gonzalez v. Google and Twitter v. Taamneh — concern victims of terrorist attacks and content hosted by online platforms that the plaintiffs in each case allege helped fuel the attacks.

The plaintiffs in Gonzalez v. Google are the parents of Nohemi Gonzalez, a 23-year-old American student at California State University, Long Beach, who was killed during an ISIS terrorist attack at the La Belle Équipe bistro in November 2015. Gonzalez was studying abroad in Paris at the time.

The parents sued Google under federal anti-terrorism laws, alleging YouTube, which is owned by Google, “is liable for damages because it failed to remove ISIS terrorist videos” from its platform, and that the algorithm it uses “even boosted the videos by recommending them to some users,” the New York Post reported.

The Gonzalez v. Google case specifically “questions whether Section 230 immunity should extend to user-created content that platforms recommend or promote — including via algorithms, which channel the majority of content viewed on YouTube and across the internet.”

Section 230, passed in 1996 as part of the Communications Decency Act, gives internet providers legal protections for hosting, moderating and removing most user content.

Twitter v. Taamneh concerns the death of Nawras Alassaf during a terrorist attack at an Istanbul nightclub in 2017. Alassaf’s family sued Twitter, Facebook and Google, “alleging the companies contributed to ISIS’s growth and that they could have taken more aggressive enforcement action to combat pro-ISIS content,” according to The Hill.

The case also pertains to content published on an online platform, although it does not directly focus on liability protections conferred by Section 230.

According to Politico:

“Platforms say if the liability shield doesn’t protect their use of targeted algorithms to recommend and promote content, some companies would more aggressively remove users’ speech or bar the discussion of more controversial topics for fear of being sued.”

The New York Post reports:

“There is bipartisan support in Congress for rewriting Section 230, which prohibits ‘interactive computer services’ from being treated as the ‘publisher or speaker’ of ‘information provided by outside users.’”

However, the arguments from the two sides differ. Democrats argue that “tech firms have used the statute to allow the unfettered spread of disinformation,” while “conservatives say that internet companies have abused the law to curtail right-leaning speech,” according to the New York Post.

Matthew Spitzer, J.D., Ph.D., professor emeritus of law at Northwestern University and former director of the Searle Center on Law, Regulation and Economic Growth, told The Defender that in hearing these two cases, the Supreme Court “has to worry — silently — about how their decision will affect negotiations in Congress,” reiterating that Democrats and Republicans disagree on what needs to be changed:

“If they did agree, we would already have had legislation (as we did in the case of online sex trafficking a few years ago), and we likely would have no case in court. But they can’t agree, so we’re in court.

“The court will not be able to interpret 230 to require complete political neutrality as an element of finding a website is not a publisher (as the Republicans would like) without ignoring both the text and the legislative history of Section 230.”

The contentious nature of Section 230 is on display in the Gonzalez case. A myriad of actors, including the American Civil Liberties Union, the Anti-Defamation League, the Electronic Frontier Foundation (EFF), the National Police Association, Big Tech companies, tech industry groups, members of Congress, U.S. states and even school districts have filed briefs in the case.

Congress so far has not acted, apart from passing a carveout to Section 230 pertaining to content promoting sex trafficking.

Are recommendations by algorithms protected under Section 230?

According to the New York Post, Section 230 was designed to prevent internet companies from being treated as publishers by shielding them from lawsuits by anyone claiming to be wronged by content posted by another user — even though the platforms typically engage in moderation of user-posted content.

Section 230 “allow[s] social media platforms to moderate their content by removing posts that violate the services’ own standards, so long as they are acting in ‘good faith.’”

According to the EFF, Section 230 does not provide absolute protections to online platforms. For instance, it does not protect companies that violate federal criminal law or create illegal or harmful content.

In Gonzalez v. Google, the parents of Nohemi Gonzalez argued Section 230 “doesn’t protect YouTube’s use of algorithms to recommend content,” Politico reported.

The plaintiffs argued that YouTube’s automated algorithms “provided material assistance to, and had aided and abetted, ISIS,” in violation of the Anti-Terrorism Act, a law enabling Americans to file lawsuits to recover damages related to an “act of international terrorism.”

The videos helped ISIS incite violence and recruit new members, they said. The lawsuit accuses Google of:

“Knowingly permitt[ing] ISIS to post on YouTube hundreds of radicalizing videos inciting violence and recruiting potential supporters to join the ISIS forces [in the Middle East] … and to conduct terrorist attacks in their home countries. …

“[Google] ‘recommended ISIS videos to users’ … based on what Google knew about each of the millions of YouTube viewers, targeting users whose characteristics indicated that they would be interested in ISIS videos.”
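The targeting the complaint describes, matching videos to users whose traits suggest interest, is easier to see in code. The sketch below is a minimal, hypothetical illustration of content-based recommendation; the function names, topic tags and scoring rule are invented for this example and do not depict Google’s actual system:

```python
# Hypothetical sketch of content-based recommendation, for illustration only.
# This is not Google's algorithm; the names and scoring rule are invented.

def score(video_topics: set, user_interests: set) -> float:
    """Score a video by the share of its topics matching the user's interests."""
    if not video_topics:
        return 0.0
    return len(video_topics & user_interests) / len(video_topics)

def recommend(videos: dict, user_interests: set, k: int = 3) -> list:
    """Return the k video IDs whose topic tags best match the user profile."""
    ranked = sorted(videos, key=lambda v: score(videos[v], user_interests),
                    reverse=True)
    return ranked[:k]

# A user whose history suggests certain interests keeps getting more of the
# same -- the feedback loop the plaintiffs describe.
videos = {
    "cooking_101": {"food", "howto"},
    "news_recap": {"politics", "news"},
    "extremist_clip": {"politics", "violence"},
}
print(recommend(videos, user_interests={"politics", "violence"}, k=2))
# ['extremist_clip', 'news_recap']
```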

Google countered that recommendations made by its algorithms are protected by Section 230, and claimed the plaintiffs did not provide evidence that any of the ISIS attackers responsible for the Paris attack were inspired by YouTube videos.

Google also said a ruling against it would have negative ramifications for the entire internet, writing in its brief:

“This court should not undercut a central building block of the modern internet.

“Eroding Section 230’s protection would create perverse incentives that could both increase removals of legal but controversial speech on some websites and lead other websites to close their eyes to harmful or even illegal content.”

Halimah DeLaine Prado, an attorney for Google, told the New York Post, “Congress was clear that Section 230 protects the ability of online services to organize content. … Eroding these protections would fundamentally change how the internet works, making it less open, less safe, and less helpful.”

Lower courts agreed with Google, resulting in the plaintiffs’ appeal to the Supreme Court.

Supreme Court weighing ‘unintended consequences’

According to CNN, “Supreme Court justices appeared broadly concerned Tuesday about the potential unintended consequences of allowing websites to be sued for their automatic recommendations of user content.”

Eric Schnapper, LL.B., attorney for the plaintiffs, “argued that narrowing Section 230 … would not lead to sweeping consequences for the internet,” CNN reported. However, “A big concern of the justices seems to be the waves of lawsuits that could happen if the court rules against Google.”

The justices also asked how the court should treat recommendation algorithms, how individual internet users might be affected by potential changes to Section 230, what lawmakers originally intended in crafting the statute and whether users and platforms alike could abuse the liability shield it provides.

According to CNN, while the justices at times appeared skeptical of the plaintiffs’ arguments, they also appeared to question Google’s claims that changes to Section 230 would lead to the “collapse” of the internet if tech companies were obligated to remove harmful content.

Scott McCollough, J.D., an Austin-based internet and telecommunications lawyer, told The Defender the arguments at hand — and the applicability of Section 230 to today’s internet — are more nuanced than often claimed.

He said the Gonzalez case focuses on subsection (c)(1) of Section 230, not the more controversial subsection (c)(2).

“In my opinion, Google could easily be found to be operating in a mixed role, in that it was probably doing more than merely posting ‘information provided by another information service provider,’” McCollough said. “There was probably some ‘publishing’ too, and for that they can be held liable.”

McCollough referred to a recent legal analysis published by Philip Hamburger, J.D., a constitutional scholar and professor of law at Columbia University, who wrote:

“The statutory problem is textual. According to Google and the rest of Big Tech, YouTube enjoys protection as a ‘publisher’ under Section 230(c)(1) for its ‘editorial functions,’ whether in sharing and recommending videos or in blocking them. But that’s not what the section says.

“It says such companies shall not be ‘treated as the publisher’ of information provided by others. So Google, comically, is seeking to be treated as a publisher under a section that says it shall not be treated as a publisher.”

“The facts matter,” McCollough told The Defender, “and the case probably should go forward so plaintiffs can do discovery and find out precisely how purposeful Google was in its algorithmic pushing of the information based on perceived user preferences.”

Section 230 ‘protects Americans’ freedom of expression,’ proponents say

Politico warned that while “large platforms such as YouTube could afford the liability risks” in the event of a ruling against Google in the Gonzalez case, it would be “financially crippling for small businesses and startups.”

“Among those most affected by any ruling against Google could be smaller internet companies and individual website users, like volunteer moderators for Reddit,” wrote Politico.

Sen. Ron Wyden (D-Ore.), one of the original authors of Section 230, told Politico, “If you harm the little guys and you harm moderation, you’re going to reduce innovation, competition and opportunities, and give the big guys … even more of the online market.”

Section 230 was inspired by successful mid-1990s lawsuits against online service providers such as Prodigy over “defamatory” statements hosted on their bulletin boards. Those rulings led Wyden, then a House member, and Rep. Christopher Cox (R-Calif.) to draft Section 230.

A separate Politico report noted that one of the parties that filed a brief in Gonzalez v. Google is the nonprofit Authors Alliance. The group, which includes prominent online influencers, video bloggers and video game streamers, came out in support of Google — without disclosing its “direct financial ties to Google.”

The EFF, which also filed a brief in the Gonzalez case, writes on its website, “The free and open internet as we know it couldn’t exist without Section 230,” which “protects Americans’ freedom of expression online by protecting the intermediaries we all rely on” and without which, platforms “would intensively filter and censor user speech.”

However, the EFF appears to contradict itself when it says “Section 230 allows for web operators, large and small, to moderate user speech and content as they see fit. This reinforces the First Amendment’s protections for publishers to decide what content they will distribute” — though platforms have argued they are not publishers under this law.

Although opposition to Section 230 is often attributed to the political right, some Republican lawmakers have come out in support of the law. In a brief filed with the Supreme Court in the Gonzalez case, former Sen. Rick Santorum (R-Pa.) said narrowing Section 230 would suppress speech.

Section 230 gives tech companies too much power, critics argue

According to Law and Crime, Section 230 “has vanishingly few friends today outside of Silicon Valley and free-speech activists.”

The New York Post reported that “critics of the law say it has been interpreted in the courts well beyond its original intent and has handed private tech companies the keys to the modern-day version of the public square,” adding that “Politicians on both sides of the aisle agree Big Tech has immense power that needs to be reined in and regulated.”

Opponents of Section 230 include what Politico described as “populist” and “anti-big business” conservatives.

On May 28, 2020, then-President Donald Trump signed an executive order aimed at curtailing Section 230, arguing that the law allows Big Tech platforms to restrict freedom of speech.

The order states:

“Twitter, Facebook, Instagram, and YouTube wield immense, if not unprecedented, power to shape the interpretation of public events; to censor, delete, or disappear information; and to control what people see or do not see.

“We must seek transparency and accountability from online platforms, and encourage standards and tools to protect and preserve the integrity and openness of American discourse and freedom of expression.”

Prior to signing the executive order, Trump said, “We’re here today to defend free speech from one of the greatest dangers it has faced in American history.” He added:

“A small handful of powerful social media monopolies had unchecked power to censor, restrict, edit, shape, hide, alter virtually any form of communication between private citizens or large public audiences.”

Prior to leaving office, Trump vetoed a $740 billion defense bill in part because it did not include a clause repealing Section 230.

Other Republicans have also sided against Section 230. Sen. Chuck Grassley (R-Iowa) argues that a ruling in favor of Google would benefit terrorism sympathizers, and Sen. Josh Hawley (R-Mo.) called on the Supreme Court to narrow its interpretation of Section 230 — as did Sen. Ted Cruz (R-Texas) and 16 other Republican lawmakers.

In 2020, Ajit Pai, then-chairman of the Federal Communications Commission, said social media companies “do not have a First Amendment right to a special immunity denied to other media outlets,” in announcing plans to “clarify the meaning” of Section 230.

President Biden has also spoken out against Section 230 on several occasions. In a January 2020 interview with The New York Times, Biden, then a presidential candidate, said “Section 230 … immediately should be revoked,” on the basis that it enabled “propagating falsehoods they [online platforms] know to be false.”

In September 2022, Biden called on Congress to “hold social media companies accountable” by repealing Section 230 protections and “get[ting] rid of special immunity for social media companies” by imposing “much stronger transparency requirements.”

In a Jan. 11, 2023, Wall Street Journal op-ed, Biden repeated these calls:

“We need Big Tech companies to take responsibility for the content they spread and the algorithms they use. That’s why I’ve long said we must fundamentally reform Section 230 of the Communications Decency Act, which protects tech companies from legal responsibility for content posted on their sites.

“We also need far more transparency about the algorithms Big Tech is using to stop them from discriminating, keeping opportunities away from equally qualified women and minorities, or pushing content to children that threatens their mental health and safety.”

The Biden administration also filed a brief in the Gonzalez case, calling for the case to be sent back to the 9th U.S. Circuit Court of Appeals and arguing that Section 230 does not provide immunity to YouTube when its algorithms recommend ISIS content.

Section 230 ‘needs work,’ but requires nuanced approach

Many opponents of Section 230 are looking to Justice Clarence Thomas to potentially lead the Supreme Court toward a ruling curtailing it. According to Politico, Thomas “has written two dissents urging his colleagues to take a case reviewing what he sees as the lower courts’ overly broad interpretation of the law in favor of tech companies.”

Justice Thomas has also been credited with “likely” leading “the push to review the [Gonzalez] case.” In a statement issued in March 2022 about the denial of certiorari in another Section 230-related case, Jane Doe v. Facebook, he wrote that it’s “hard to see” why Section 230 is protecting online platforms from their “own acts and omissions.”

McCollough told The Defender, “The thing that I find the most interesting is, from the ’70s through the ’80s [and up to] ’96, they were trying to eliminate monopolies and have competition.”

He added:

“The problem that we have now is that whereas we have sort of dealt with the problem of the monopolies … at the physical layer, where communication services were delivered over pipes … the internet today is nothing like it was before ’95 and certainly in 1996.”

McCollough said that these differences include the establishment of major Big Tech platforms after 1996, which are “endpoints” that provide content and use the internet as a communication service, and the business models that developed later, where online user activity is monetized.

He argued for a nuanced approach:

“Equally important, people need to understand that the focus today is on algorithms at consumer-facing applications. Algorithms are also used throughout the internet protocol stack. Without them the internet would not work.

“So we have to distinguish between the portals operating above the application layer (like YouTube here) and all the layers below them. The problem with 230 is it covers not just the portals on top but all the players below since they are all within the definition of ‘interactive computer service.’

“From an internet and communications network perspective, YouTube is actually an endpoint, an internet user. They are not ‘the internet’; they are on the internet.”

“We should not just claim ‘algorithms bad,’” McCollough continued. “It is just code, and it does what it is told to do. So the question is what was it told to do by people? You have to get past motion to dismiss to find out, and [the Gonzalez] case was disposed on a motion to dismiss, so no discovery was allowed.”

“There’s algorithms all the way down and they’re very important. If you start messing with the algorithms in the internet, you’re going to destroy the internet,” he added. “What we have to do is segregate the internet from the actors that are on top of it, like these social media platforms, because they have a completely different business model.”
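McCollough’s point that an algorithm “is just code” doing “what it is told to do by people” can be made concrete with a toy example. In the hypothetical sketch below, every name and weight is invented for illustration: the ranking arithmetic itself is neutral, but the weights people assign to each signal determine which content is amplified:

```python
# Toy ranking objective, for illustration only. The WEIGHTS dictionary is the
# human policy choice McCollough refers to; no real platform's values are shown.

WEIGHTS = {"relevance": 1.0, "engagement": 2.5, "recency": 0.5}

def rank_score(item: dict) -> float:
    """Combine signals into one score; changing WEIGHTS changes what rises."""
    return sum(WEIGHTS[signal] * item[signal] for signal in WEIGHTS)

items = [
    {"id": "measured_report", "relevance": 0.9, "engagement": 0.2, "recency": 0.8},
    {"id": "outrage_clip", "relevance": 0.4, "engagement": 0.9, "recency": 0.9},
]

# With engagement weighted heavily, the provocative item ranks first. That
# outcome was chosen by whoever set the weights, not by "the algorithm" alone.
for item in sorted(items, key=rank_score, reverse=True):
    print(item["id"], round(rank_score(item), 2))
# outrage_clip 3.1
# measured_report 1.8
```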

Section 230 “needs work,” McCollough said. He suggested Congress “recognize the different players and distinguish between them based on role,” “get rid of the reference to ‘publisher treatment,’” “radically change section (c)(2)” because “it is what has allowed all the censorship,” and “let state law operate much more freely.”

Robert Kozinets, Ph.D., professor of journalism at the USC Annenberg School for Communication and Journalism, and Jon Pfeiffer, J.D., an entertainment trial attorney and adjunct professor of law at Pepperdine University, wrote in January that “change in Section 230 is coming — and is long overdue.” They argued:

“We support free speech, and we believe that everyone should have a right to share information.

“When people who oppose vaccines share their concerns about the rapid development of RNA-based COVID-19 vaccines, for example, they open up a space for meaningful conversation and dialogue. They have a right to share such concerns, and others have a right to counter them.”

They called for “verification triggers” that would apply when a “platform begins to monetize content related to misinformation”; “transparent caps,” or “coherent standards” added to Section 230 that, if met, would limit a platform’s liability; and an institution akin to a “Twitter court,” where “neutral arbitrators” would “adjudicate claims” relating to controversial content.

Other legal experts have called for a “duty of care” provision to be added to Section 230, creating a legal obligation to stop harmful activity on online platforms, or have suggested that Section 230 is itself unconstitutional.

McCollough told The Defender “there is a lot of battlefield prepping going on,” as the Gonzalez and Taamneh cases are “being used to set the narrative for the Florida and Texas anti-censorship cases that will be heard next year.”

Those cases pertain to similar laws passed by the two states. Texas’ HB20, which applies to social media platforms with at least 50 million active users and defines “censorship” as blocking, removing or demonetizing content, was upheld by the 5th Circuit but remains on hold pending Supreme Court review. Florida’s SB 7072, blocked by the 11th Circuit, bars platforms from banning users based on their political ideology.

Legal scholars believe that the Supreme Court will rule on the Gonzalez and Taamneh cases “in tandem,” with a decision expected in late June or early July.