Last month, a detective in a small town outside of Lancaster, Pennsylvania, invited dozens of high school girls and their parents to the police station to undertake a difficult task: one by one, the girls were asked to confirm that they were depicted in hundreds of AI-generated deepfake pornographic images seized by law enforcement.
In a series of back-to-back private meetings, Detective Laurel Bair of the Susquehanna Regional Police Department slid each image out from under the folder’s cover, so only the girl’s face was shown, unless the families specifically requested to see the entire uncensored image.
“It made me a lot more upset after I saw the pictures because it made them so much more real for me,” one Lancaster victim, now 16, told Forbes. “They’re very graphic and they’re very realistic,” her mother said. “There’s no way someone who didn’t know her wouldn’t think: ‘that’s her naked,’ and that’s the scary part.” There were more than 30 images of her daughter.
The photos were part of a cache of images allegedly taken from 60 girls’ public social media accounts by two teenage boys, who then created 347 AI-generated deepfake pornographic images and videos, according to the Lancaster County District Attorney’s Office. The two boys have now been criminally charged with 59 counts of “sexual abuse of children” and 59 counts of “possession of child pornography,” among other charges, including “possession of obscene materials depicting a minor.”
At the same time, allowing 60 children to have child porn generated with their likenesses and then passed around to other children and the internet at large also seems wrong, even if it's AI-generated CSAM rather than actual CSAM.
I would personally consider generating porn of a child and then giving it to their peers an act of child sexual assault, even when committed by other children. That needs a legal response.
I think the argument in this case isn't that a crime wasn't committed, but rather that charging a minor with CSAM possession is inappropriate (particularly when the images are fake). Perhaps a different law is needed for these highly specific cases, since the existing CSAM laws typically carry very hefty sentences that don't seem entirely appropriate in a case like this.
This has always been one of the problems with CSAM laws. There have been a number of cases now where minors were charged with CSAM possession for either naked pictures of themselves, or consensual pictures of their girlfriend/boyfriend who was also a minor. There's also the broader discussion about what exactly qualifies as CSAM, with some jurisdictions taking a more maximalist approach that considers things like drawings (even highly unrealistic or stylized ones) of real people or even fictional characters to be CSAM. Some jurisdictions don't even require the photo or drawing to depict the minor naked or engaging in a sexual act; they instead define it as pornography if the person in possession of it derives some kind of sexual gratification from it. So, for instance, a photo of a minor who is fully clothed and just standing there could actually be considered CSAM.
The problem is that it's hard to draw firm lines about what does or doesn't qualify without leaving loopholes that can be exploited. This is why many jurisdictions opt for a maximalist approach and then leave it to the discretion of police and prosecutors which cases to pursue, but that has the flaw of being entirely arbitrary, putting a lot of power in the hands of prosecutors and police over something widely regarded as an extremely serious crime.
Yes. Let's not pretend children aren't people too; they are going to take pictures of themselves or their partners, and that is both normal and illegal right now.
So you would rather they get off unscathed instead?
In a similar deepfake porn case at a school in New Jersey, no charges were brought, and the alleged perpetrator does not appear to have suffered any academic or legal penalties. Dorota Mani, a mother of one of the New Jersey victims, outlined her frustration with the leadership at her daughter’s school in approximately three pages of written testimony to Congress published in March. In that document, she described Westfield High School’s tepid response as “not only disheartening but also dangerous, as it fosters an environment where female students are left to feel victimized while male students escape necessary accountability.”
I'm a bit confused about the process here. Do the victims really have any special ability to identify the fake pictures as depicting them that any other reasonable person wouldn't have? It seems needlessly traumatizing to me.