
HSI: Generative AI now used to produce child sexual abuse material; FBI/HSI cases cited

May 7, 2025


AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Homeland Security Investigations told viewers that adult offenders now use generative AI to create child sexual abuse material from ordinary photographs, citing a case in which an offender had thousands of AI-created images and investigators found links between hands-on offenses and AI content.

During a public livestream, Homeland Security Investigations said that offenders increasingly use generative AI to create child sexual abuse material (CSAM) from ordinary photos, and presented a recent federal case as an example.

Special Agent Dennis Fetting described investigations in which offenders combined hands-on offenses with AI-generated material. He cited a case investigated in Tampa in which the defendant, identified on the livestream as Justin Colmo, pleaded guilty to charges related to hands-on abuse and to possessing thousands of AI-generated CSAM images; agents found more than 8,000 AI-created images on the defendant's devices, Fetting said.

"What we have now is the nexus between the hands-on offender and the generative AI to create entirely new content from real children that weren't even in a sexual situation," Fetting said. He explained that some offenders used regular pictures (for example, of children at public places) as inputs to generate abusive imagery, then used that material to threaten or extort victims.

Fetting said the cases present both new investigative challenges and new harms: generated imagery can depict a child in sexualized content even if the child was never abused in that way, and the imagery can be used for sextortion or distributed among offender communities. He noted that the Tampa defendant had pleaded guilty and was awaiting sentencing at the time of the livestream.

The segment emphasized that law enforcement treats juvenile victims as victims rather than criminal suspects, and that state and local authorities often handle peer-on-peer incidents under juvenile justice and school disciplinary processes. Fetting urged parents and schools to report concerning incidents to local law enforcement and to national hotlines so investigators and victim services can act.

The livestream framed AI-generated CSAM as an emerging threat that expands the capacity of offenders to produce and weaponize images, and it highlighted the need for platform and investigative responses as well as for parents to preserve evidence and report tips to agencies such as HSI and NCMEC.