In a blow to Big Tech’s claims of complete immunity from liability under Section 230, the Texas Supreme Court ruled last week that Facebook can face state civil claims over sex trafficking on its platform.
Far more Americans are familiar with Section 230 now than when it was created by Congress in 1996, as it has become a key battlefield in the policy wrangling over Big Tech. The tiny provision was designed to “clean up the internet” by incentivizing tech platforms to freely moderate content without being subject to liability for content posted by users.
Although much of the furor centers on Section 230’s role in platforms’ speech censorship and content bias, expansive judicial interpretations of Section 230 have also allowed Big Tech to escape accountability for some of the truly horrific conduct that its platforms facilitate.
Big Tech’s Role in Enabling Trafficking
In this particular case, Facebook claimed Section 230 meant the platform bore no responsibility for its services being used to traffic young girls into sex slavery. In three combined cases, 14- and 15-year-old girls were contacted on Facebook (or on Instagram, which Facebook owns) by adult users.
In one instance, an adult male told a 15-year-old she was “pretty enough to be a model,” and suggested she pursue a modeling career. After the young girl confided in him about a recent argument with her mother, the user proposed they meet in person.
Shortly after meeting him, the plaintiff was photographed and her pictures posted on the now-defunct Backpage.com, advertising her for prostitution. She was “raped, beaten … and forced into further sex trafficking.”
The other two cases present similar fact patterns: girls were contacted through Facebook properties, groomed, and lured to in-person meetings, only to be repeatedly raped and ensnared in sex trafficking operations. In one example, after a 14-year-old was rescued, traffickers continued to use her profile to lure other children into trafficking. When the girl’s mother reported these activities to Facebook “through multiple channels,” the platform never responded.
Facebook argued in court that, under Section 230, it should face no responsibility whatsoever for its role in these cases — and therefore the cases should be dismissed outright. Regarding the plaintiffs’ claims against Facebook for negligence and product liability, the court agreed, citing “the uniform view of federal courts” on Section 230 precedent.
The court did not agree, however, that Section 230, particularly in light of its recent amendments, lets the platform entirely off the hook. Instead, it found the plaintiffs’ civil claims, pursued under Texas trafficking law, could proceed. “Section 230,” wrote Justice James Blacklock, “does not withdraw from states the authority to protect their citizens from internet companies whose own actions — as opposed to those of their users — amount to knowing or intentional participation in human trafficking.”
Have the Courts Been Wrong All Along?
The outcome of this case is encouraging to the many victims of trafficking and dangerous harassment who have been unable to achieve even a modicum of accountability from the platforms that facilitate such behavior. But the ruling is interesting in another way: the credence the Texas court gives to the fault lines emerging around the country over Section 230’s long-standing judicial interpretation.
While the tech companies have argued successfully in court for decades that their immunity is bulletproof, there is a growing chorus of skepticism inside and outside of Congress, and even on the U.S. Supreme Court, that this interpretation is flawed — and never what Congress actually intended.
An amicus brief filed by the Alliance to Counter Online Crime and the National Center on Sexual Exploitation argued Section 230 was passed “with the express purpose of limiting pornography and other online material harmful to children.” Since that time, the brief noted, the courts have “expanded the statute’s protections well beyond its text and Congress’s intent” by stretching Section 230’s original publisher immunity to cover distributors. The result has been a claim to “extraordinary, unprecedented immunity — legal protection from any liability whatsoever relating to third-party content,” even extending to knowing facilitation of child sex trafficking.
This echoes a line of argument recently made by Justice Clarence Thomas. Writing in late 2020, Justice Thomas suggested “the sweeping immunity courts have read into” Section 230 should be reconsidered. Broad interpretation by the courts, he said, has not only conflated publisher and distributor liability, but prevented plaintiffs from bringing cases “on alleged product design flaws — that is, the [platform’s] own misconduct.”
The Texas court agreed Justice Thomas’s argument was “plausible,” and found both the broad and narrow sense of the word “publisher” represented viable readings of the statute. “Section 230,” the opinion notes, “is no model of clarity, and there is ample room for disagreement about its scope.”
Even in recognizing the merits of arguments challenging the current tide of Section 230 interpretation, the court acknowledged the weight of prior precedent, and, oddly, Facebook’s economic expectations. “We are not interpreting Section 230 on a clean slate,” wrote Blacklock, “and we will not put the Texas court system at odds with the overwhelming federal precedent supporting dismissal of the plaintiffs’ common-law claims.”
Congress Must Act Where the Courts Will Not
Although the court did not upend the existing statutory interpretation of Section 230, it is a remarkable statement from a state supreme court that a supposedly settled body of law may have, in fact, been settled in error.
It is also a warning for lawmakers that the courts (outside of the U.S. Supreme Court), bound as they are by the uniform jurisprudential views on the issue, are unlikely to act in ways that diverge significantly. Until the Supreme Court or Congress steps in, tech companies will continue to rely on judicially bloated immunity rendered bulletproof by years of bad precedent.
This has only added to the arrogance of platforms like Facebook, which defended its suspension of Donald Trump due to the “high probability of imminent harm,” while simultaneously claiming no responsibility for its role in the actual harm — rape and trafficking — of children. Moreover, the courts appear cowed in the face of Facebook’s entrenched economic dominance. Blacklock cited the “expectations of those who operate and use … Facebook” as one of the reasons the court chose not to engage the statute’s interpretation.
When the courts have distorted a statute, or when it becomes clear circumstances have evolved to warrant reconsideration, our self-government should act. Unlike the courts, Congress is not constrained by legislative history or intent.
Given the proliferation of Section 230 proposals in both the House and Senate, it seems increasingly likely the statute will receive some amending, if not upending, in the near future. Cases like this one should only add further urgency to this mission.