An alleged pedophile has been accused of using a GoPro to film kids at Disney World so he could create thousands of AI-generated child abuse images to spread on the dark web.
Justin Ryan Culmo allegedly admitted to recording his underage victims at the Orlando, Fla., theme park and at least one middle school in recent years, Forbes reported, citing FBI sources.
Culmo allegedly used the AI model Stable Diffusion to turn the footage into realistic images depicting child abuse, and then fired them off online under the screen names “Avalanche” and “TheRealAvalanche,” the feds said.
He was eventually tracked down after investigators identified one of his alleged victims and traced the manipulated images back to him, according to the feds.
When he was nabbed, law enforcement found a trove of child abuse images on his devices and five spy cameras in a locked drawer in his desk, the report said.
The suspect, who was busted late last year, has been indicted on a slew of child exploitation crimes, including abusing his two daughters, secretly recording minors and distributing child sexual abuse imagery online, Forbes said.
Culmo — who had been on the feds’ radar since at least 2012 — hasn’t been charged with creating AI child sexual abuse materials.
He has pleaded not guilty to the charges he was slapped with and is set to face trial next month.
“This case starkly highlights the ruthless exploitation that AI can enable when wielded by someone with the intent to harm,” said Jim Cole, an ex-Department of Homeland Security agent who previously worked on the case.
“This is not just a gross violation of privacy, it’s a targeted attack on the safety of children in our communities.”
Disney, though, said law enforcement never contacted it about Culmo’s alleged crimes.