
🔎 What Was Asked and Why
This research investigates when speech on social media is judged toxic enough to warrant platform intervention. Platforms set the rules and depend heavily on user reports, yet little is known about what everyday users actually consider unacceptable and which moderation actions they prefer.
🧾 How the evidence was collected
📊 Key Findings
🔍 Why It Matters
These results have direct implications for platforms, policymakers, and democratic discourse. Recognizing that users' appetite for moderation is limited, even when they are shown toxic content, helps explain the challenges platforms face when relying on user reporting, and it highlights the political and policy trade-offs involved in designing moderation systems.

*Toxic Speech and Limited Demand for Content Moderation on Social Media* was authored by Franziska Pradel, Jan Zilinsky, Spyros Kosmidis, and Yannis Theocharis. It was published in the *American Political Science Review* (Cambridge University Press) in 2024.
