TAKE IT DOWN Act
The TAKE IT DOWN Act is the first U.S. federal law to criminalize the publication of nonconsensual intimate imagery, including AI-generated explicit deepfakes. It targets the growing problem of AI-generated, nonconsensual sexual images and videos, which are increasingly prevalent in schools and among minors. Despite the law's enactment, the article highlights a lack of awareness and training among educators and students, underscoring the need for additional resources and guidance to implement it effectively in educational settings.
Related Incidents
Same harm domain; actors and locations may differ
Punjab Kings cricketer targeted by AI-generated deepfake video leading to reputational harm and public confusion
Adult film actors Fann Wong and Christopher Lee defrauded via AI-generated forged images
Veteran actress Yeom Hye Ran defrauded of likeness via unauthorized AI-generated video on YouTube
Renowned South Korean actress has AI deepfake portrait rights violated through unauthorized image generation technology
5 million Abceed users exposed to AI voice cloning fraud risk via misconfigured cloud storage
Related Legislation
Other policies covering the same harm domain