OpenAI: Report Child Sexual Abuse Materials Created On Your Platforms

Sam Altman, CEO, OpenAI


Generative AI tools are increasingly being used to create sexual images of children. The National Center for Missing & Exploited Children (NCMEC) relies on reports from AI companies to identify which child sexual abuse images are AI-generated and which depict real children who need help. OpenAI, the creator of ChatGPT and DALL·E, is one of the most prominent and recognizable leaders in the generative AI space. But according to recent data from NCMEC, they are not properly reporting the CSAM being created with their products: NCMEC received just 329 reports of CSAM from OpenAI in 2023.

The numbers don’t add up. OpenAI must begin to properly report all CSAM on their platforms.

The hyperrealistic images created on OpenAI's platforms have already had real-world consequences for families. Criminals are using AI-generated CSAM to groom and extort real children.

OpenAI has a responsibility and an obligation to properly police their platforms and report all cases of CSAM. This includes detecting CSAM images, reporting all findings to NCMEC, and including all of the necessary information within the reports for authorities to take action.

Sign the petition to tell OpenAI they need to take their responsibility as leaders in the AI space seriously and do their part to end online child sexual abuse.


To: Sam Altman, CEO, OpenAI
From: [Your Name]

Dear Mr. Altman,

OpenAI’s tools have created new opportunities for predators to produce child sexual abuse material (CSAM). These images can depict real child victims who may be in active danger, as well as any child whose innocent photos have been posted on the internet. That’s why detecting and reporting CSAM to NCMEC is so important, and why we, as parents, grandparents, and concerned citizens, are calling on you to take immediate action to detect and report CSAM that’s being created and facilitated by your products.

Families are already reporting that they and their minor children are being extorted using AI-generated images. AI is a fast-moving field, and you know better than anyone how quickly this technology evolves. It is imperative that you immediately address how your products are being used to create CSAM and take appropriate steps to report it.

We ask OpenAI to take the following steps:
- Actively detect child sexual abuse material (CSAM) on all of your platforms
- Report all detections to the National Center for Missing & Exploited Children (NCMEC)
- Create full reports with sufficient data for authorities to take necessary action