“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.
“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail...so he would get a punishment for what he actually did,” McAdams told CNN.
There's a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.
“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”
This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.
“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim's family asks for it,” Cruz said. “Elliston's Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”
BS
It's been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.
Real or fake pornography including unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it's extra illegal.
Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat's rules and would have been taken down:
We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
Is revenge porn illegal federally? Not that it would really matter; a state could still not have a law and thus have no way to prosecute it.
The fact that some state only recently passed a revenge porn law makes it clear you're just wrong.
On Snapchat's ToS:
Luckily I never ran into the first point personally, but as a teenager I heard about it happening quite a bit.
The second point is literally not enforced at all, to the point where they recommend some sort of private Snapchats that are literally just porn made by models.
I looked it up before posting. It's illegal in 48 states, including California where most of these companies are headquartered, and every state where major cloud data centers are located. This makes it effectively illegal by state laws, which is the worst kind of illegal in the United States when operating a service at a national level because every state will have slightly different laws. No company is going to establish a system that allows users in the two remaining states to exchange revenge porn with each other except maybe a website established solely for that purpose. Certainly Snapchat would not.
I've noticed recently there are many reactionary laws to make illegal specific things that are already illegal or should already be illegal because of a more general law. We'd be much better off with a federal standardization of revenge porn laws than a federal law that specifically outlaws essentially the same thing but only when a specific technology is involved.
I apologize for the inappropriate behavior and bans by @[email protected] in this thread. I've removed them as a mod here, banned them, and unbanned the people who they inappropriately banned.
Note: if they get unbanned in the near future, it's because of our consensus procedure, which requires us admins to take a vote.
Odd that there is no mention of the parents contacting the police and working through them to get the images down. Technically and legally, the photos would be considered child porn. Since it's over the Internet, it would bring Federal charges, even though there may also be State charges.
Some things were handled wrong if all the kid is getting is probation.
They aren't photos. They're photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.
There isn't any actual private information about the girls being disclosed. The algorithms, for example, do not and could not know about and produce an unseen birthmark, mole, tattoo, piercing, etc. A photograph would have that information. What is being shown is an approximation of what similar-looking girls in the training set look like, with the girls' faces stitched on top. That is categorically different from something like revenge porn, which is purely private information specific to the individual.
I'm sure it doesn't feel all that different to the girls in the pics, or to the boys looking at it for that matter. There is some degree of harm here without question. But we must tread lightly because there is real danger in categorizing algorithmic guesswork as reliable which many authoritarian types are desperate to do.
This is the other side of the same coin. We cannot start treating the output of neural networks as facts. These are error prone black-boxes and that fact must be driven hard into the consciousness of every living person.
For some, I'm sure purely unrelated, reason, I feel like reading Philip K. Dick again...
They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.
Most phone cameras alter the original image with AI shit now; it's really common, and they apply all kinds of weird corrections to make it look better. Plus, if it's social media, there's probably a filter somewhere in there. At what point does this become the Ship of Theseus?
My point here is that if we're arguing that AI images are, semantically, not photos, then most photos on the internet that include people would also, arguably, not be photos to some degree.
Technically and legally the photos would be considered child porn
I don't think that has been tested in court. It would be a reasonable legal argument to say that the image isn't a photo of anyone. It doesn't depict reality, so it can't depict anyone.
I think at best you can argue it's a form of photo manipulation, and the intent is to create a false impression about someone. A form of image based libel, but I don't think that's currently a legal concept. It's also a concept where you would have to protect works of fiction; otherwise you've just made the visual effects industry illegal if you're not careful.
In fact, that raises an interesting analogy. We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked. We allow images of human physical abuse as long as they are faked. Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them. The resulting "works of art" are not under such limitations as far as I'm aware.
What's the line here? Parental consent? I think that could lead to some very concerning outcomes. We all know abusive parents exist.
I say all of this, not because I want to defend anyone, but because I think we're about to set some really bad legal precedents if we're not careful. Ones that will potentially do a lot of harm. Personally, I don't think the concept of any image, or any other piece of data, being illegal holds water. Police people's actions, not data.
And even if it hadn't, that's no excuse not to start.
It would be a reasonable legal argument to say that the image isn't a photo of anyone. It doesn't depict reality, so it can't depict anyone.
It depicts a real child and was distributed intentionally because of who it depicts. Find me the legal definition of pornography that demands that pornography be a "depiction of reality". Where do you draw the line with such a qualifier?
I think at best you can argue it's a form of photo manipulation, and the intent is to create a false impression about someone.
It is by definition "photo manipulation", but the intent is to sexually exploit a child against her will. If you want to argue that this counts as a legal form of free speech (as libel is, FYI), you can fuck right on off with that.
A form of image based libel, but I don't think that's currently a legal concept.
Maybe actually know something about the law before you do all this "thinking".
It's also a concept where you would have to protect works of fiction otherwise you've just made the visual effects industry illegal if you're not careful.
Oh no, not the sLiPpErY sLoPe!!!
We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked.
Little girls are the same as animals, excellent take. /s
Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them.
What kind of horror films are you watching that have naked children in sexual situations?
What's the line here?
Don't sexually exploit children.
Parental consent?
What the living fuck? Parental consent to make porn of their kids? This is insane.
I say all of this, not because I want to defend anyone, but because I think we're about to set some really bad legal precedents if we're not careful.
The bad legal precedent of banning the creation and distribution of child pornography depicting identifiable minors?
Personally, I don't think the concept of any image, or any other piece of data, being illegal holds water.
Yes. It's very tiring having to constantly fight this battle. Unfortunately, that's what they want, because if enough of us are too tired to care then eventually it slips through and we never get back what we lost.
In this case, yes. Anything visually indistinguishable from a photo is considered CSAM. We don't need any new laws about AI to get these assholes. Revenge porn laws and federal CSAM statutes will do.
It should be considered illegal if it was used to harm/sexually abuse a child, which in this case it was.
Whether it should be classed as CSAM or something separate, I tend to think probably something separate: a revenge-porn-type law that still allows for distinguishing between this and, say, a girl whose uncle groomed and sexually abused her while filming it. While this is awful, it can be (and often seems to be) the product of foolish youth, rather than the offender and everyone involved being very sick, dangerous, and actually violent offending adult pedophiles victimizing children.
Consider the following:
1. Underage girl takes a picture of her own genitals. Unfortunately this gets classified under the unhelpful and harmful term "child porn", and she can be charged and registered as a sex offender, but it's not CSAM and *shouldn't* be considered illegal material or a crime (though it is, because the West has a vile fixation on puritanism which hurts survivors of childhood sexual trauma as well as adults).
2. Underage girl takes a picture of her genitals and sends it to her boyfriend. Again, this *shouldn't* be CSAM (though she may unfortunately be charged similarly); she consented, and we can assume there wasn't any unreasonable level of coercion. What it is unfortunately bound by are certain notions of puritanism that are very American.
3. From 2, the boyfriend shares it with other boys. Now it's potentially CSAM, or at the least revenge porn of a child, as she didn't consent and it could be used to harm her, but punishment has to be modulated by the fact that the offender is likely a child himself and not fully able to comprehend his actions.
4. Underage boy cuts out a photo of an underage girl he likes, only her face and head, glues it atop a picture of a naked porn actress (maybe a petite one), and uses it for his own purposes in private. Not something I think should be classed as CSAM.
5. Underage boy uses AI to do the same as above but more believably. Again, I think it's kind of creepy, but if he keeps it to himself and doesn't show anyone or spread it around, it's just youthful weirdness, though really he probably shouldn't have easy access to those tools.
6. Underage boy uses AI to do the same as 4-5, but this time he spreads it around, defaming the girl. She and her friends find out, people say mean things about her, and she has to go to school with a bunch of people who are looking at and pleasuring themselves to fake but realistic images of her against her consent, which is violating and makes one feel unsafe. Worse, she's probably being bullied for it: mean things, being called the s-word, etc.
Kids are weird and do dumb things, though unfortunately boys in our culture especially have a propensity to do things that hurt girls far more than the inverse, to the point that it's not even really worth talking about girls being creepy or sexually abusive towards peer-aged boys in adolescence and young adulthood. To address this you need to address patriarchy and misogyny on a cultural level, teach boys empathy and respect for girls and women, and frankly do away with all this abusive pornography that's super prevalent and popular, which encourages and perpetuates abusive actions and mentalities towards women and girls. This will never happen in the US, however, because it's structurally opposed to being able to do such a thing. It also couldn't hurt to peel back the stigma and shame around sexuality and nudity in the US, which stems from its reactionary Christian culture, but again I don't think that will ever happen in the US as it exists, not this century anyway.
Obviously not getting into adults here as that doesn't need to be discussed, it's wrong plain and simple.
Bottom line, I think companies need to be strongly compelled to quickly remove revenge-porn-type material, which this definitely is, regardless of the age of the victim (though children can't cope with this kind of thing as well as adults, so the risk of suicide or other self-harm is much higher and it should be treated as a higher priority). It's abusive and unacceptable, and companies should fear the credit card companies coming down on them hard and destroying them if they don't aggressively remove it, ban it, and report those sharing it. It should be driven off the clear web once reported; there should be an image-hash data-set like the one used for CSAM (but separate) for such material, and major services should use it to stop the spread.
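Purely as an illustration of that image-hash idea, here's a minimal sketch of how a service might match re-uploads against a shared list of reported images. It assumes the Python Pillow and imagehash libraries; the blocklist entry and distance threshold are made-up examples, not any real dataset.

```python
# Minimal sketch: perceptual-hash matching against a shared blocklist,
# in the spirit of the hash data-sets used for known CSAM.
# The hash value and threshold below are illustrative assumptions only.
from PIL import Image
import imagehash

# Perceptual hashes of previously reported images (hypothetical values).
BLOCKED_HASHES = [imagehash.hex_to_hash("ffd8ffe000104a46")]

def is_known_reported_image(path: str, max_distance: int = 5) -> bool:
    """Return True if the image is perceptually close to a reported one."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects gives the Hamming distance, so
    # near-duplicates survive resizing and re-encoding.
    return any(candidate - blocked <= max_distance for blocked in BLOCKED_HASHES)
```

A real deployment would presumably quarantine matches for human review rather than act on them automatically, since perceptual hashes can collide.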
I believe that in the US, for all intents and purposes, it is, especially if it was sourced from a minor, because you don't really have an argument against that one.
I'm not sure where you're going with that? I would argue that yes, it is. As it's sexual material of a child, with that child's face on it, explicitly made for the purpose of defaming her. So I would say it sexually abused a child.
But you could also be taking the stance of "AI trains on adult porn and is merely recreating child porn. No child was actually harmed during the process." Which, as I've said above, I disagree with, especially in this particular circumstance.
Apologies if it's just my reading comprehension being shit
The teen’s phone was flooded with calls and texts telling her that someone had shared fake nude images of her on Snapchat and other social media platforms.
Berry, now 15, is calling on lawmakers to write criminal penalties into law for perpetrators to protect future victims of deepfake images.
“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.
The mom and daughter say legislation is essential to protecting future victims, and could have meant more serious consequences for the classmate who shared the deep-fakes.
“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail...so he would get a punishment for what he actually did,” McAdams told CNN.
“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said.
The original article contains 585 words, the summary contains 205 words. Saved 65%. I'm a bot and I'm open source!
The girl looks very much like a younger Christina Model.
Even before the time of AI, photoshopping one person's face on another's body was totally within the realm of possibility... I wonder if there are any laws about that, which could be used as precedent.
That's all well and good to remove them, but removal alone solves nothing. At this point every easily accessible AI I'm aware of is kicking back any prompts with the names of real-life people; they're already anticipating real laws. Preventing the images from being made in the first place isn't impossible.
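As a toy illustration of that prompt-side refusal (the name list and the simple substring check here are my own assumptions, far cruder than whatever real services actually run):

```python
# Toy sketch: refuse any prompt mentioning a name on a denylist of real people.
# The names and the substring check are illustrative assumptions; real services
# use much more sophisticated entity detection than this.
BLOCKED_NAMES = {"jane doe", "john smith"}  # hypothetical entries

def should_refuse(prompt: str) -> bool:
    """Return True if the prompt mentions a blocked real person's name."""
    lowered = prompt.lower()
    return any(name in lowered for name in BLOCKED_NAMES)
```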
A friend in high school made nude drawings of another mutual friend. It was weird that he showed me, but he was generally an artsy guy, I knew he was REALLY into this girl, and it was kind of in the context of showing me his artwork. I reconnected with the girl years later and talked about this, and while she said it was weird, she didn't really think much of it. Rather, the creepy part to her was that he showed people.
I don't think we can stop horny teens from making horny content about their classmates; heck, I know multiple girls who wrote erotic stories featuring classmates. The sharing (and the realism) is what turns the creepy but kind of understandable teenage behavior into something we need to deal with.
Idk, with real people the determination of whether someone is underage is based on their age and not their physical appearance. There are people who look unnaturally young who could legally do porn, and underage people who look much older but aren't allowed. It's not about their appearance, but how old they are.
With drawn or AI-generated CSAM, how would you draw that line of what's fine and what's a major crime with lifelong repercussions? There's not an actual age to use, the images aren't real, so how do you determine the legal age? Do you do a physical developmental point scale and pick a value that's developed enough? Do you have a committee where they just say "yeah, looks kinda young to me" and convict someone for child pornography?
To be clear, I'm not trying to defend these people, but trying to determine what counts as legal or non-legal for fake images seems like a legal nightmare. I'm sure there are cases where this would be more clear-cut (if they AI-generate with a specific age, try to do deepfakes of a specific person, etc.), but a lot of it seems really murky when you try to imagine how to actually prosecute over it.
If you really must, you can simply have the AI auto-delete NSFW images; several already do this. And if the argument is that you can't simply never generate or give out NSFW images, you can also gate NSFW content generation behind any number of hindrances that are highly effective against anonymous use or underage use.
The current method is auto-deleting NSFW images. It doesn't matter how you got there: if it detects NSFW, it dumps it, and you never get an image. Beyond that, there's gating NSFW content generation behind a paywall or ID wall. It would stop a lot of teenagers; not all, but it would put a dent in it. There are also AI models that will allow some NSFW if it's clearly in an artistic style, like a watercolor painting, but will kick back NSFW realism or photography, rendered images, that sort of thing. These checks usually apply both in prompt mode and in paint-in/out and image-reference modes: on generation of likely NSFW images, and after generation, an NSFW check before delivering the image. AI services are anticipating full-on legal consequences for allowing any NSFW, or any realistic, photographic, or CGI image of a living person without their consent; it's easy to see that's what they are prepared for.
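To make that "detect it, dump it" flow concrete, here's a rough sketch of a post-generation gate. generate_image and nsfw_score are hypothetical stand-ins rather than any real service's API, and the threshold is likewise invented.

```python
# Rough sketch of a post-generation NSFW gate: generate, score, and silently
# drop anything that trips the filter so the user never receives it.
# generate_image() and nsfw_score() are hypothetical stand-ins for whatever
# generator and classifier a real service wires in.
from typing import Optional

NSFW_THRESHOLD = 0.7  # illustrative cutoff, not taken from any real service

def generate_image(prompt: str) -> bytes:
    raise NotImplementedError("stand-in for the actual image generator")

def nsfw_score(image: bytes) -> float:
    raise NotImplementedError("stand-in for an NSFW classifier returning 0.0-1.0")

def safe_generate(prompt: str) -> Optional[bytes]:
    """Generate an image, but withhold it if the NSFW check trips."""
    image = generate_image(prompt)
    if nsfw_score(image) >= NSFW_THRESHOLD:
        return None  # the image is dumped; the user never sees it
    return image
```

The same kind of check can run again server-side before anything is delivered, which is roughly the "NSFW check before delivering the image" step described above.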
Depending on the AI developers to stop this on their own is a mistake. As is preemptively accepting child porn and deepfakes as inevitable rather than attempting to stop or mitigate it.