Digital Services Act investigation into Snapchat's child safety practices
The European Commission has opened a formal Digital Services Act investigation into Snapchat's child safety practices, focusing on age restrictions, safeguards against grooming, and protections against illegal content. The investigation could result in regulatory penalties and compliance costs for Snap. The action reflects intensifying global scrutiny of social media platforms and their child safety measures.
Related Incidents
Same harm domain — actors and location may differ
14-year-old girl groomed via social media by Sydney private school teacher leading to child abuse material charges
12-year-old girl sexually groomed via TikTok leading to out-of-state assault in Binghamton, New York
9-year-old girl dies after attempting blackout challenge on YouTube
13-year-old Louisiana girl exposed to AI-generated nude deepfake images leading to expulsion and federal lawsuit against school district
12-year-old girl groomed and coerced into self-harm and producing child sexual abuse materials via social media in New Jersey
Related Legislation
Other policies covering the same harm domain
Linked Litigation
1 legal case linked to this policy