
When they returned to school after the summer holidays, more than twenty girls from Almendralejo, a town in southern Spain, received nude photos on their phones.
None of them had taken the photos, yet the images looked completely real. They had been taken from the girls' Instagram accounts, altered with an artificial intelligence app and then shared in WhatsApp groups.
The teens were fully clothed in the original photos, but the app generated nudity that appeared convincing. Now parents and prosecutors are asking whether a crime has been committed and whether the images can be considered child pornography.
The mothers have organized to report what happened, and the police have opened an investigation and already identified several minors suspected of involvement in the case.
Although the nude photos are not real, the mothers say the girls' distress is. "One of them told my daughter that he had done 'things' with her photo," one of the mothers told the Spanish newspaper El País.
But can deepfakes (videos or photos in which a person's face or body has been digitally altered to make them appear to be someone else, typically used maliciously or to spread false information) be legally punished?
Some experts say the issue is not whether the photo is 100% real, but whether it appears to be real. Distributing the photos could be prosecuted as child pornography, as a crime against moral integrity, or as the distribution of images with sexual content.
However, there is currently a legal gap: using a minor's face in a photograph affects their privacy, but crimes involving the sharing of intimate images are defined around the image as a whole infringing on that privacy. Because the nudity is generated by a deepfake, the victim's actual intimacy is not, strictly speaking, exposed. The effect on the victim can be very similar to that of a real nude photo, but the law remains a step behind.