
Supreme Court Justice Clarence Thomas Wants a Section 230 Reckoning

The Supreme Court justice recently hinted that Section 230—‘the twenty-six words that created the internet’—could soon be in the highest court’s crosshairs.


Clarence Thomas has done it again. Never one to shy away from the Big Tech debate, the conservative Supreme Court justice recently hinted that Section 230—“the twenty-six words that created the internet”—could soon be in the highest court’s crosshairs.

In the simplest of terms, Section 230 of the 1996 Communications Decency Act grants immunity from civil liability for third-party content hosted on “interactive computer services.” On March 7, Thomas identified two foundational issues with the application of this statute in its current form. First, the courts’ broad interpretations have led to sweeping immunity for today’s tech platforms. Second, Big Tech companies never fail to abuse the privilege.

Thomas asserts that expansive interpretations of Section 230 from 1996 onward may clash with the original text. He contends that arguments favoring this broad immunity “rest largely on ‘policy and purpose’” instead of the statute’s plain text.

This isn’t the first time Thomas has made this claim. In his October 2020 statement on Malwarebytes, Inc. v. Enigma Software, he challenged previous decisions by the courts, writing that their interpretations have “long emphasized nontextual arguments,” leaving “questionable precedent” in their wake. But it is his impugnment of the tech companies themselves—who use the statute as a shield—that deserves attention.

There is a reason Thomas opened his March statement on Jane Doe v. Facebook with a gob-smacking narration, briskly detailing how “an adult, male sexual predator used Facebook to lure 15-year-old Jane Doe to a meeting, shortly after which she was repeatedly raped, beaten, and trafficked for sex.” Thomas later asks whether immunity from such abuses is really “what the law demands.” The question is a good one.

Should Big Tech companies continue to hide behind their “capacious” immunity both for failing to apprise users of defects in their products and for neglecting to protect them from the “malicious or objectionable activity of their users”? The answer, according to Thomas, appears to be no. For it is difficult to see why the law “should protect Facebook from liability for its own ‘acts and omissions.’” The platforms should incur blame—and attendant consequences—for their actions.

These acts and omissions take myriad forms. Tech companies often turn a blind eye to sex trafficking, deliberately draw in children as a key demographic, and wantonly apply their “community standards” based on political winds.

Without the deterrent effect of private lawsuits, tech companies are likely emboldened to target younger audiences, exposing them to a litany of abuses. Big Tech’s penchant for seeking out younger and younger children to lure in with highly addictive content is a testament to this.

For instance, internal Facebook documents published by The Wall Street Journal in 2021 revealed that the company considered “tweens” to be a “valuable but untapped market” and formed a team devoted to coaxing them onto the platform. As of early 2021, a quarter of all Americans on the Chinese Communist Party-beholden app TikTok were teenagers or younger. And Twitter is deliberately attempting to compete for that demographic through key hires aimed at attracting young people.

Most galling of all, these companies are aware of their deleterious effects on the next generation, yet continue their efforts to court them. Facebook itself concluded that a correlation between Instagram and teen suicidal ideation (among other teen mental health issues) exists, but still believes building an Instagram platform for children under 13 is “the right thing to do.”

Human traffickers, foreign Islamist terrorists and their propagandists, and drug cartels also proliferate on these platforms, often shielded by Section 230’s expansive protections. As Facebook’s vice president of state public policy noted in 2021, Facebook “[allows] people to share information about how to enter a country illegally or request information about how to be smuggled.”

Thomas cited related dereliction this week as well, saying that in spite of Facebook’s awareness of the crimes facilitated on its platform, the company was:

afforded publisher immunity even though Facebook allegedly ‘knows its system facilitates human traffickers in identifying and cultivating victims,’ but has nonetheless ‘failed to take any reasonable steps to mitigate the use of Facebook by human traffickers’ because doing so would cost the company users—and the advertising revenue those users generate.

Other noxious “acts and omissions” by these companies include marketing themselves as democratizers of information while enforcing a two-tiered justice system that punishes and excludes a certain set of thinkers. For instance, last Thursday, Facebook announced it would temporarily lift its ban on violent speech if that speech is directed against Russians and Russian soldiers. (Facebook’s initial justification for the suspension of former President Donald Trump’s account was that his posts contributed to “the risk of ongoing violence.”)

Additional examples of this uneven application of community standards by Big Tech companies are legion, from Covid-19 misinformation suspensions to the suppression of the Hunter Biden laptop story to the denial of biological realities to their tolerance of Vladimir Putin and Chinese Communist Party activity on these platforms. Clearly, Big Tech’s abuse of Section 230 covers all manner of sins.

The implications of Justice Thomas’s thinking for Big Tech are stark. He presents two routes the government can take.

First, he defers to Congress’s role in clarifying the statute, stating that Congress “may soon resolve the burgeoning debate” over whether federal courts have interpreted Section 230 correctly. In his April 2021 statement on Biden v. Knight, Thomas emphasizes the legislative body’s agency in this realm, noting that we got into this mess when Congress granted digital platforms immunity from certain lawsuits without imposing corresponding responsibilities, like nondiscrimination.

Second, “assuming Congress does not step in to clarify Section 230’s scope,” Thomas suggests the Supreme Court should fill the now-gaping void of inaction.

It may not have to. Energy on Capitol Hill behind proposals to clarify Section 230 may yield concrete results in the next few years. From significant overhauls and carve-outs to narrower reforms on the right, to targeted exceptions to immunity on the left, Congress is poised to address the evolution of its ’90s-era legislation and clear up the scope of the law.

But should Congress fail, Thomas’s most recent attempt to rally his confreres on the court may be exactly what Americans need to compel a reckoning with Big Tech. There is no better man to lead the charge.