US Legal Framework for Content Moderation
Overview of Section 230 & Internet Speech
The Historical Context
Who is responsible for illegal speech?
Historically:
- Publishers → Legally responsible for content
- Distributors → Generally not legally responsible unless they knew of the illegal content (e.g., bookstores, newsstands)
The Challenge:
- Rise of online services blurred these clear lines
- Early Internet lacked clear legal framework
- Courts relied on outdated precedents that fit the new medium poorly
Three Pillars of US Content Moderation Law
The US legal approach to online content moderation rests on three primary elements:
- Constitutional Protection for Free Speech → (First Amendment)
- Statutory Protection for Free Speech → (Section 230)
- Limits on State Regulation → (Dormant Commerce Clause)
The First Amendment Framework
The First Amendment provides strong and wide-ranging protections against government censorship of speech:
- Levels of Scrutiny → Different standards of analysis for different types of speech restrictions
- Limited Exclusions → Narrowly defined categories (e.g., CSAM, obscenity, incitement, defamation)
- Protected Speech → Many categories regulated internationally are protected in the US
Protected Speech Categories
These content types are Constitutionally protected in the US:
- “Hate speech” → Unless it fits excluded categories, odious views about immutable characteristics are protected
- “Cyberbullying” → Including name-calling, dehumanizing references, brigading, doxing
- “Terroristic” content → Content about terrorist organizations generally protected
- “Pornography” → Non-obscene, non-CSAM sexual content protected (detailed in next slide)
- “Misinformation” → Including false political statements and health misinformation
Pornography and the Miller Test
The First Amendment protects most sexually explicit content with important exceptions:
- Protected Sexual Content:
- Adult pornography (non-obscene)
- Explicit written descriptions
- Artistic and educational nudity
- Sexual speech and discussions
Miller Test for Obscenity
(Miller v. California, 1973) — a work is obscene only if all three prongs are met:
- Whether the average person, applying contemporary community standards, would find the work appeals to the prurient interest
- Whether the work depicts or describes sexual conduct in a patently offensive way
- Whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value
CSAM Exception: Child Sexual Abuse Material is unprotected and illegal under all circumstances
Platform Challenge: Legal pornography must be distinguished from illegal obscenity, creating complex moderation requirements
Internet Services as “Publishers”
When Internet services perform editorial or curatorial functions, the First Amendment protects those functions to the same degree it would protect offline content publishers:
- Miami Herald v. Tornillo precedent → Government cannot force publication decisions
- Unique Internet Treatment → Supreme Court rejected reduced protections for Internet
- Editorial Rights → Services can make decisions about what content to include, exclude, moderate, filter, label, restrict, or promote
What Is Section 230?
Section 230 is what remains of the Communications Decency Act of 1996, most of which was struck down as unconstitutional (Reno v. ACLU, 1997).
It does two things:
- Defines who counts as the publisher of content (the “information content provider” who creates or develops it)
- Offers broad liability protection for services and users who display speech created by someone else
Section 230(c)(1):
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Section 230(c)(2): Good Samaritan Protection
Section 230(c)(2):
- Protects online services from liability for “good faith” efforts to moderate content
- Shields platforms when they remove objectionable content
- Covers actions to restrict access to material the provider considers obscene, lewd, violent, harassing, or “otherwise objectionable”
- Protection applies whether or not such material is constitutionally protected
Section 230: Definitions
Interactive Computer Service
Any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
Information Content Provider
Any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
Why Was Section 230 Created?
Two big cases created chaos:
- Cubby v. CompuServe (1991) → CompuServe did not moderate user content, was treated as a mere distributor, and was not liable
- Stratton Oakmont v. Prodigy (1995) → Prodigy moderated user content, was treated as a publisher, and was held liable for a user’s post
The Problem Section 230 Solved
- This case law punished moderation: services that tried to clean up content faced more liability than services that did nothing
- Section 230 removed that perverse incentive by protecting services whether or not they moderate
How Section 230 Works
Courts use a three-pronged test:
- Defendant must be a “provider or user of an interactive computer service”
- Defendant must not be an “information content provider” of the content in question
- Plaintiff’s claim must treat defendant as the “publisher or speaker” of the content
If all three conditions are met, Section 230 protection applies.
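As a rough illustration only (not a legal tool), the conjunctive structure of the test can be sketched as a boolean check. The function name and yes/no inputs here are hypothetical simplifications of what are, in practice, fact-intensive legal questions:

```python
# Hypothetical sketch of how courts apply the Section 230(c)(1) test.
# Each prong is reduced to a boolean for illustration; real cases turn
# on contested facts, not clean true/false inputs.

def section_230_applies(is_ics_provider_or_user: bool,
                        created_or_developed_content: bool,
                        claim_treats_as_publisher: bool) -> bool:
    """Immunity applies only if all three prongs are satisfied."""
    return (is_ics_provider_or_user               # prong 1: "interactive computer service"
            and not created_or_developed_content  # prong 2: not the content's creator
            and claim_treats_as_publisher)        # prong 3: claim targets publishing role

# Zeran v. AOL: AOL was a service provider, did not create the posts,
# and the claim treated it as a publisher -> immunity applies.
print(section_230_applies(True, False, True))   # -> True

# If an operator "materially contributed" to the content's illegality
# (the Jones v. Dirty World question), prong 2 fails -> no immunity.
print(section_230_applies(True, True, True))    # -> False
```

Note that the test is conjunctive: failing any single prong defeats the immunity.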
Practical Implications of Section 230
What this means in practice:
- Being defined as the “publisher” of a message opens organizations up to civil liability
- Not being defined as the “publisher” permits individuals to repost anything, without regard to its truth or legality
- Not being defined as a “publisher” permits organizations to avoid liability for content published on their sites
- Results in “intermediary power,” or broad protection for intermediaries and distributors
Zeran v. America Online (4th Cir. 1997)
The foundational Section 230 case.
Facts: After the Oklahoma City bombing, an anonymous user posted offensive T-shirt advertisements on AOL with Zeran’s phone number. Zeran received death threats and harassment. He alleged AOL unreasonably delayed removing the posts, refused to publish retractions, and failed to screen for similar postings.
AOL’s Defense: Section 230 immunity.
Zeran’s Argument: Section 230 “leaves intact liability for interactive computer service providers who possess notice of defamatory material posted through their services.”
Zeran: The Court’s Analysis
The Fourth Circuit’s holding:
“By its plain language, § 230 creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service. Specifically, § 230 precludes courts from entertaining claims that would place a computer service provider in a publisher’s role. Thus, lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions – such as deciding whether to publish, withdraw, postpone or alter content – are barred.”
Zeran: Congressional Intent
Why Congress enacted Section 230:
“The purpose of this statutory immunity is not difficult to discern. Congress recognized the threat that tort-based lawsuits pose to freedom of speech in the new and burgeoning Internet medium. The imposition of tort liability on service providers for the communications of others represented, for Congress, simply another form of intrusive government regulation of speech.”
Congress recognized that, with the tools available at the time, requiring services to police every post would be too onerous. It also wanted to encourage self-policing where possible.
Zeran: The Notice Liability Problem
Why “liability upon notice” was rejected:
“If computer service providers were subject to distributor liability, they would face potential liability each time they receive notice of a potentially defamatory statement – from any party, concerning any message. Each notification would require a careful yet rapid investigation of the circumstances surrounding the posted information, a legal judgment concerning the information’s defamatory character, and an on-the-spot editorial decision whether to risk liability by allowing the continued publication of that information.”
This would defeat the dual purposes of Section 230: promoting free speech and encouraging self-regulation.
Zeran: Exceptions and Limits
What Section 230 does NOT protect:
“None of this means, of course, that the original culpable party who posts defamatory messages would escape accountability.”
Congress also found it U.S. policy “to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.”
Therefore:
- Criminal statutes may still apply
- IP enforcement actions may apply
- BUT it remains difficult to prove what an “interactive computer service” knows (e.g., Silk Road, Napster)
Jones v. Dirty World (6th Cir. 2014)
Testing the limits of Section 230 immunity.
Facts:
- Nik Lamas-Richie operates TheDirty.com, posting user contributions with his own editorial comments
- The website solicited uploads from users; staff selected and reviewed submissions
- Posts were edited to remove obscenity and nudity but otherwise posted as written
The conflict:
- Sarah Jones (a teacher and cheerleader) was defamed in anonymous posts
- Jones requested removal; Richie refused
- Defense: Section 230 bars liability
Jones: The Material Contribution Test
The Sixth Circuit’s rule:
“Section 230(c)(1)’s grant of immunity is not without limits, however. It applies only to the extent that an interactive computer service provider is not also the information content provider of the content at issue.”
The distinction:
“A website operator can simultaneously act as both a service provider and a content provider. If a website displays content that is created entirely by third parties, then it is only a service provider with respect to that content – and thus is immune from claims predicated on that content. But if a website operator is in part responsible for the creation or development of content, then it is an information content provider as to that content – and is not immune from claims predicated on it.”
Jones: Applying the Test
The court adopted the material contribution test from Fair Housing Council of San Fernando Valley v. Roommates.com, LLC (9th Cir. 2008):
- Material Contribution → Actions that materially contribute to the alleged illegality of the content
- “Development” means something more involved than merely displaying or allowing access to third-party content
- Simply reviewing and posting does not amount to material contribution, without more
Result: Richie and Dirty World were protected by Section 230 because they did not materially contribute to the defamatory statements themselves.
Doe v. MySpace (W.D. Tex. 2007)
Extending Section 230 beyond defamation.
Facts:
- A 13-year-old lied about her age to create a MySpace profile
- A 19-year-old contacted her through the platform
- She was subsequently sexually assaulted after meeting him in person
The claim:
- Parents sued MySpace for negligence, claiming the platform knew sexual predators were using the system
- Plaintiffs’ Argument: Section 230 only applies to defamatory statements in tort, not other torts like negligence
Doe v. MySpace: The Holding
The court disagreed:
Prior cases (including Doe v. Bates, 2006) demonstrated that Section 230 bars:
- Negligence
- Negligence per se
- Intentional infliction of emotional distress
- Invasion of privacy
- Civil conspiracy
- Civil claims against platforms based on distribution of child pornography
Observation: No matter how artfully plaintiffs pleaded their claims, the court viewed them as directed toward MySpace in its publishing, editorial, and/or screening capacities.
Limitations of Section 230
Section 230 does NOT protect online services from:
- Federal criminal law violations
- Intellectual property law violations
- Federal and state sex-trafficking law (after FOSTA-SESTA amendment in 2018)
- Enforcement of state laws that are consistent with Section 230 (inconsistent state laws are preempted)
FOSTA: A New Federal Crime
The Allow States and Victims to Fight Online Sex Trafficking Act (2018):
18 U.S.C. § 2421A:
“Whoever owns, manages, or operates an interactive computer service … or conspires or attempts to do so, with the intent to promote or facilitate the prostitution of another person shall be [punished].”
(Affirmative defense: the promotion or facilitation of prostitution is legal in the jurisdiction where it was targeted)
Additional enforcement mechanisms:
- Civil actions by victims
- State prosecution by attorneys general
- Federal civil actions by the Attorney General
What Section 230 Has Provided
- Enabled platforms that rely on user-generated content to exist
- Protected knowledge-sharing sites like Wikipedia
- Shielded small blogs and forums from expensive lawsuits
- Enabled product and business review sites
- Protected social media platforms while allowing content moderation
What’s Not Required for Section 230
Section 230(c)(1) immunity has several unique features:
- No Prerequisites: Automatic immunity (unlike DMCA safe harbor)
- Knowledge Irrelevant: A service may know about problematic third-party content and still benefit
- Takedown Notices: Services can ignore takedown notices with no legal consequence
- Financial Interest: Applies even when service profits from content
- Content Moderation: Protects services regardless of how vigorously they moderate content
Section 230 vs. First Amendment
The relationship between statutory and constitutional protections:
- Both Protect Services: But Section 230 provides a “procedural fast lane”
- Procedural Benefits: Allows early case dismissal without Constitutional determinations
- Substantive Effects: Encourages Internet services to moderate content without fearing the price tag of each decision
- “Lawful-but-Awful” Content: Revising Section 230 cannot create incentives to remove content the First Amendment protects
Promissory Estoppel: A Possible Exception?
Could contract claims survive Section 230?
- Section 230 addresses treatment as a “publisher or speaker”
- Contract claims may not implicate publisher functions
- Potential theories:
- Promissory estoppel based on platform promises
- Misrepresentation claims
- False advertising claims
Open question: If a content provider does more than provide content, is that additional activity similarly barred under Section 230?
Account Terminations and Removals
Internet services usually have unrestricted editorial discretion to terminate accounts or remove content without incurring liability:
- Users have brought dozens of such cases and lost; there are no reported plaintiff victories
- Multiple winning defense theories:
- Private services aren’t state actors
- Terms of service reserve termination rights
- Section 230 protection
- Prima facie elements not satisfied
When Section 230 Doesn’t Apply
In cases outside Section 230 protection:
- Prima Facie Elements: Plaintiffs must establish basic claim elements
- First Amendment: Still protects services from strict liability
- Copyright Claims: DMCA Safe Harbor provides separate protection
- Trademark Claims: Courts generally follow notice-and-takedown principles
- Varied State IP Claims: Too diverse to summarize
State Regulation Limitations
The Dormant Commerce Clause restricts states’ Internet regulation:
- Conflicting Laws: Can’t make simultaneous compliance impossible
- Extraterritorial Regulation: Can’t govern activity wholly outside state
- Interstate Barriers: Can’t erect barriers to out-of-state services
- Limited Success: These challenges are infrequently advanced and succeed only occasionally
Transparency as Regulation
An emerging regulatory approach:
- Alternative Strategy: Focus on disclosure rather than dictating moderation decisions
- Requirements May Include:
- Publishing editorial policies
- Providing explanations for moderation decisions
- Publishing moderation statistics
- Constitutional Questions: The constitutionality of mandatory editorial transparency requirements remains unresolved
Why Do We Care Now?
Contemporary challenges to the Section 230 framework:
- “Curation” has drawn scrutiny, as some high-profile users have been barred from platforms entirely
- Some viewpoints on some platforms are given more screen time, others not
- Some features may enable indiscriminate silencing (e.g., blocking features)
- Information shared may not be factually accurate or even originate from an actual person (‘bots’)
- Most platforms now involve some form of human or algorithmic evaluation
Emerging Concerns
- Content may cast individuals in a false light, triggering defamation lawsuits, but the actual originator often cannot be found
- Increasingly, virtual concerns are becoming physical
- It is difficult to ascertain internal practices and thus evaluate from a “material contribution” perspective
- Rule 12(b)(6) motions to dismiss, combined with heightened (“plausibility”) pleading standards, make it harder for claims to survive
Current Debates
- Some argue Section 230 is too broad, shielding bad actors
- Critics claim it removes incentives to police harmful content
- Others worry it gives platforms too much freedom to silence voices
- Most objections about political speech relate more to First Amendment than Section 230
- Ongoing legislative proposals seek to amend the law
Political Dimension
Content moderation has become politically divisive:
- Democratic Perspective: Tend to want platforms to remove more content, even if Constitutionally protected
- Republican Perspective: Tend to want platforms to publish more content, even potentially harmful material
- No clear solution: These positions advance two radically different visions of the Internet’s future
- Challenging Environment: Creates difficulty for developing balanced policy approaches