Snapchat Investigated in Europe Over Child Safety Policies
European regulators are investigating Snapchat over its child safety policies, focusing on its age-verification system and on whether the platform adequately protects minors from inappropriate content and interactions. Snapchat is accused of steering younger users toward content that may be unsuitable for their age group. The regulatory action is part of broader EU efforts to enforce stricter child safety standards on social media platforms.
Related Incidents
Same harm domain; actors and location may differ
14-year-old girl groomed via social media by Sydney private school teacher leading to child abuse material charges
12-year-old girl sexually groomed via TikTok leading to out-of-state assault in Binghamton, New York
9-year-old girl dies after attempting blackout challenge on YouTube
13-year-old Louisiana girl exposed to AI-generated nude deepfake images leading to expulsion and federal lawsuit against school district
12-year-old girl groomed and coerced into self-harm and producing child sexual abuse materials via social media in New Jersey
Related Legislation
Other policies covering the same harm domain