
Section 230 Needs To Be Fixed So Internet Companies Can’t Feature Child Pornography


In the lead-up to the Communications Decency Act of 1996, America was concerned about the Internet exposing children to pornography. The July 1995 cover of Time magazine, titled “Cyberporn,” depicted a child staring at a computer with this caption: “Exclusive: A new study shows how pervasive and wild it really is. Can we protect our kids — and free speech?”

Today, our biggest problem is not children who are exposed to pornography. It’s children who are involved in pornography or child sexual abuse material — CSAM, as it’s known. When victims of CSAM seek justice in the courts, however, section 230 of the Communications Decency Act — a law that protects digital platforms from liability for third-party content — often blocks their lawsuits.

An Actual Problem: Online Child Porn

CSAM online is a very real crisis. The National Center for Missing and Exploited Children’s CyberTipline received 21.6 million reports of CSAM in 2020, up from 16.9 million in 2019. In 2019, a New York Times podcast revealed that both the FBI and the Los Angeles Police Department had to prioritize CSAM reports involving infants and toddlers because they could not effectively respond to reports involving older children.

But even if a digital platform like Twitter runs afoul of existing CSAM laws, section 230 can shield it from accountability. Section 230 contains a blanket carveout for federal criminal law, but that carveout has an important quirk.

As an illustration, if Twitter violates federal CSAM laws, then federal law enforcement can file criminal charges. If victims sue Twitter for the exact same conduct, though, section 230 can and does block that civil claim. In 2021 alone, lawsuits against Twitter and Reddit ran into this exact problem.

One proposed bill, the EARN IT Act (for Eliminating Abusive and Rampant Neglect of Interactive Technologies), would fix this problem by carving federal and state CSAM laws out of section 230. If victims sued under those CSAM laws, section 230 would no longer affect their lawsuit.

What Would EARN IT Actually Do?

It often feels like there are just two sides in the section 230 debate: one that blames all of society’s problems on the law, and another that cries foul if you so much as breathe on it.

So what would happen if EARN IT became law? Would tech companies become strictly liable for every single piece of CSAM on their websites? No existing law imposes such strict liability (and such a law would be unconstitutional anyway). What would really happen is that these online companies would have to obey any existing CSAM laws, without any special protection from section 230. In other words, their accountability would be the same as any offline company’s.

The debate about protecting child victims of CSAM, however, has been hijacked by various “experts” who portray EARN IT as an apocalyptic threat to encryption.

Encryption is the practice of encoding information. For example, when Alice sends Bob a message via the Signal app, Signal uses end-to-end encryption to prevent others from spying on that message; not even Signal can see Alice’s message.
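The idea can be sketched with a toy one-time pad in Python. This is only an illustration of the concept, not Signal’s actual protocol, which uses far more sophisticated key exchange and ciphers; the point is that a relay server holding only the ciphertext, and not the shared key, cannot read the message.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with a key byte (a toy one-time pad)."""
    return bytes(d ^ k for d, k in zip(data, key))

# Alice and Bob share a random secret key; the relay server never sees it.
message = b"Hi Bob, it's Alice."
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)    # all the server can observe
recovered = xor_bytes(ciphertext, key)  # only a key holder can undo it

assert recovered == message
```

Scanning messages for CSAM on the server is impossible in this arrangement precisely because the server only ever handles the unreadable ciphertext.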

Critics say that encryption — especially end-to-end encryption — will be at risk if EARN IT passes. Some even allege that EARN IT’s overt goal is to pressure companies to scan messages and photos for CSAM; such scanning is impossible if end-to-end encryption is used.

But EARN IT cannot directly threaten encryption. Only a CSAM law can directly threaten encryption. In the past, section 230 would likely have neutralized this hypothetical threat, unless CSAM laws were carved out of section 230.

To address this concern, EARN IT would replace the overly broad shield of section 230 — which neutralizes both credible lawsuits from victims and potential threats to encryption — with a narrower shield specifically for encryption. In spite of that good-faith effort, though, many still brand EARN IT as anti-encryption. Some have even outright said that it is “unlikely that any amendment” will sufficiently protect encryption. But is this shield for encryption a solution in search of a problem?

It is easy to find tomes of “expert” analysis for the solution — EARN IT’s language for encryption — but it is much harder to find solid analysis for the problem: which CSAM law actually threatens encryption? Even when the critics cite specific laws, they often raise hypothetical concerns over a state law that does not even mention encryption. In some cases, they just invent hypothetical future state laws.

One proposal is to remove EARN IT’s shield for encryption altogether — not on anti-encryption grounds, but on the grounds that Congress should legislate against actual problems, not against hypothetical problems. If we agreed to shelve the encryption debate for a bill that has little to do with encryption, we could bring this debate back to the victims of CSAM and the actual problems they face.