Stop AI-Enabled Sexual Abuse of Children
Members of Congress
Artificial intelligence (AI) is rapidly changing the landscape of what’s possible online. Unfortunately, this new technology is increasingly being used to facilitate child sexual abuse and exploitation. Congress has a responsibility to understand the scope of this problem and address it: it must establish a commission to study how AI is being used to harm children and how legislation and regulation can prevent this abuse.
Attorneys General from every state recently sent Congress a letter expressing concern about AI-supported child sexual abuse and asking lawmakers to study the problem. Criminals are using AI to generate new images of child sex abuse victims, make innocent images of real children sexual, and even de-age celebrities to create sexual images of them as children. All these types of deepfakes, as well as realistic voice clones, are being used by scammers, blackmailers, and kidnappers to trick kids into sharing personal information, sexual images, or money.
This technology is only getting more realistic, making it harder for law enforcement to identify and help real child victims. It’s also making it harder for platforms to identify, report, and remove illegal and policy-violating content. With kids spending more and more time on social media, nearly 5 hours a day on average, we can’t afford to ignore this growing problem. We need Congress to first understand this problem and then take action to prevent it. Please establish a commission to study AI-enabled child sexual abuse now.
To:
Members of Congress
From:
[Your Name]
Dear Congress,
As someone who cares deeply about children, I support the recent demands of Attorneys General from every state in their letter about AI-supported child sexual abuse. Criminals are using AI to generate new images of child sex abuse victims, make innocent images of real children sexual, and even de-age celebrities to create sexual images of them as children. These deepfakes, as well as realistic voice clones, are being used by scammers, blackmailers, and kidnappers to trick kids into sharing personal information, sexual images, or money. This technology is only getting more realistic, making it harder for law enforcement to identify and help real child victims.
Parents were already overwhelmed trying to prevent child sexual abuse online, and new AI tools are making that job even harder. We need you to understand this problem and then take action to prevent it. Please establish a commission to study AI-enabled child sexual abuse now.