Understanding the US Legal Framework for Content Moderation
The following slides provide a review of the key concepts from your assigned reading on US platform regulation. Since you’ve already read the material, use these slides to reinforce and organize your understanding. Pay particular attention to how Section 230 creates a unique environment for content moderation in the United States.
These slides are designed to help you consolidate your understanding of:
- The historical context of the publisher vs. distributor liability distinction
- The First Amendment’s role in protecting digital speech
- How Section 230 functions as both a “shield” and “sword” for platforms
- The many exceptions and limitations to Section 230 immunity
As you review these materials, consider how these different legal structures create incentives and constraints for modern content moderation practices.
In May 2025, President Donald Trump signed the TAKE IT DOWN Act into law.
Although it is not clear how this new law will be enforced, its provisions pertain directly to platform providers and Section 230 immunity.
This area of the law is developing rapidly, but you can read more about the TAKE IT DOWN Act (2025) and its impact below.
Debates over whether Section 230 should continue to govern online content moderation have persisted for years. To help you engage with this ongoing conversation, experts in the field have compiled a collection of essays from legal scholars and policy advocates debating the future of Section 230. These essays present a range of viewpoints on whether Section 230 should be:
- Preserved in its current form
- Modified with targeted reforms
- Significantly overhauled or replaced
As you read these perspectives, consider:
- What assumptions do different stakeholders make about how moderation should work?
- How might changes to Section 230 affect different types of online platforms?
- What values and priorities seem to motivate different reform proposals?
- How could psychological research contribute to this ongoing debate?
This debate connects directly to the simulation exercise you will complete later, as the legal framework fundamentally shapes what content moderation actually looks like in practice.
If you’re looking for a quick refresher on what content moderation is and how it differs from censorship, try watching this short video:
If you’re interested in this topic and want to explore the non-legal side of content moderation further, I highly encourage you to check out this digital module from the Social Futures Lab at UW.
If you’d like additional context on the United States’ approach to “platform” regulation beyond the assigned reading, the following guide provides a helpful overview.
Once you are ready, move on to the psychology section on the next page.