Communications Decency Act — Section 230
Foundational internet law granting online platforms immunity from liability for third-party content, which prevents victims of digital harm from suing platforms over harmful user-generated material they host. Central to ongoing debates about platform liability for algorithmic amplification, child exploitation content, and extremist radicalization. Enacted in 1996; the subject of reform proposals since 2019.
Related Incidents
Same harm domain — actors and location may differ
14-year-old girl groomed via social media by Sydney private school teacher leading to child abuse material charges
12-year-old girl sexually groomed via TikTok leading to out-of-state assault in Binghamton, New York
9-year-old girl dies after attempting blackout challenge on YouTube
13-year-old Louisiana girl exposed to AI-generated nude deepfake images leading to expulsion and federal lawsuit against school district
12-year-old girl groomed and coerced into self-harm and producing child sexual abuse materials via social media in New Jersey
Related Legislation
Other policies covering the same harm domain
Linked Litigation
20 legal cases linked to this policy