PC Mag: FBI: AI-Generated Child Porn Is Illegal
The FBI is telling the public that AI-generated child porn is illegal, amid questions over whether the US is doing enough to stop the illicit images. On Friday, the FBI issued a public service announcement to explicitly warn the public against using generative AI tools to create child sexual abuse materials (CSAM).
“Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of any CSAM, including realistic computer-generated images,” the agency said.
It’s no secret that the rise of generative AI has sparked a surge in AI-produced porn. But the same technology also means that “even the least technical users can generate realistic artwork, images, and videos—including CSAM—from text prompts,” the FBI added.
The agency issued the statement after Rep. Anna Paulina Luna (R-Fla.) said during a House subcommittee hearing that the FBI is struggling to prosecute the creators of AI-generated CSAM “because technically, a child is not hurt in the process, because it is a generated image.”
The National Center for Missing and Exploited Children also says it received 4,700 reports concerning AI-generated CSAM last year. “Furthermore, users of the technology to create this material have used the argument that, ‘At least I didn’t hurt a real child’ and ‘It’s not actually a child,’” the center added.
However, the FBI’s PSA is unequivocal on the issue. The agency cites two men who were convicted last year for possessing CSAM images. In one case, the defendant used an AI program to create images of minors. In the other, the defendant “possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.”
In addition, the FBI’s PSA points out that “computer generated images indistinguishable from an actual minor” can be prosecuted under US federal law against child pornography. Still, others including the National Center for Missing and Exploited Children say the US needs to update its laws to explicitly outlaw AI-generated CSAM.
Last September, a group of state attorneys general also urged Congress to act “by expanding existing restrictions on CSAM to explicitly cover AI-generated CSAM [to] ensure prosecutors have the tools they need to protect our children."