The suspects in this latest case of alleged “deepfake” pornography include several minors, police said.
They knew their alleged victims, who were all from the southwestern province of Seville.
Officers began their investigation after receiving a complaint from the parents of one of the girls.
“Once the images were manipulated, they were disseminated through groups on well-known social media, causing significant emotional and social harm to the victims,” they said.
#OperacionesGC | Five young people investigated for creating and distributing nude images of 20 minors using #InteligenciaArtificial in the province of #Sevilla.
▶️The images made it very difficult to distinguish between the authentic ones and those that had been modified
🔗https://t.co/Vwy8ik7f0l pic.twitter.com/uhAeXCi8Rf
— Guardia Civil (@guardiacivil) August 5, 2024
“The realism of the images made it very difficult to distinguish between those that were authentic and those that had been modified.”
The five suspects face possible charges of child pornography and breach of privacy laws.
As AI has boomed, so has so-called deepfake pornography.
AI enables users to generate hyper-realistic images and videos with minimal effort and money, often by taking fully clothed photos of victims lifted from their social media accounts and manipulating them to make the victims appear naked.