Image uploads are now disabled on lemm.ee due to malicious users
Sorry for the short post, I'm not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:
Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.
It will not be possible to upload any new avatars or banners while this limit is in effect.
I'm really sorry for the disruption, it's a necessary trade-off for now until we figure out the way forward.
Yeah, the admins deserve all our support on this. Not only to protect themselves as server owners, but to stop the spread. Hopefully a long-term solution will be found soon.
I think this is a great move until we have something rock solid to prevent this. There are tons of image hosting sites you can use (most of which have the resources to already try to prevent this stuff) so it shouldn’t really cause much inconvenience.
I know there are automated tools that exist for detecting CSAM. Given the challenges the fediverse has had with this issue, it really feels like it'd be worthwhile for the folks developing platforms like Lemmy and Mastodon to start thinking about how to integrate those tools into their platforms, to better support moderators and folks running instances.
Perfectly fine. People can upload images elsewhere and then just link to them. Most image upload sites will have all those protections in place already. A good stopgap until Lemmy gets those mod tools
This is really sad and disgusting. It affects the whole platform but especially smaller instances that can't keep up. Despite being a lemm.ee user, I was particularly upset about thegarden.land shutting down because of that spam. It had my favourite gardening community on here.
I really hope this gets sorted out, and the spammers end up where they belong.
I'd really love to start a small instance just to play host to a couple of niche interests I don't see around yet, but yeah, hearing about this fucked up behavior is making me hold off.
It has a real chilling effect on users, which is so unfortunate for a platform that is mostly made up of well-meaning people.
If someone enjoys the twisted pain inflicted on children, then showing off their plunder to inflict pain that makes most adults want to reach for the eye bleach is, to them, well-executed revenge on the people they don't like.
The issue is that you really can’t. The laws are written specifically to prevent plausible deniability. Because pedos would be able to go “lol a troll sent it to me” and create some doubt in a jury. Remember that (at least in America) the threshold for conviction is supposed to be “beyond a reasonable doubt.” So if laws were focused on intent, all the pedos would need to do is create reasonable doubt, by arguing that they never intended to view/own the CSAM.
This was particularly popular in the Napster/Limewire days, when trolls would upload CSAM under innocuous titles, so people looking for the newest episode of their favorite show would find CSAM instead. You could literally find CSAM titled things like “Friends S10E9” because trolls were going for the shock factor of an innocent person opening a video only for it to end up being hardcore CSAM. Lots of actual pedos tried using the “I downloaded it by accident” defense.
So instead, the laws are written to close that loophole. It doesn’t matter why you have the CSAM. All that matters is you have it. The feds/courts won’t give a fuck if it was due to you seeking it out or if it was due to a bad actor sending it to you.
I'm going to go out on a limb and say they and all the other instances that were hit with this attack probably did. Which authorities, I don't know. If this instance is hosted in Estonia then probably Estonian authorities, but it's probably being hosted on the cloud so is it REALLY hosted in Estonia? There are a ton of American and EU users so hopefully the FBI and whatever the EU equivalent is. But honestly cybercrimes can get confusing because of the nature of people and hosting being spread out all over the world and it can be hard to even figure out who to report to.
I don't think any made it onto this server; with the 100 kB upload limit in place, the risk was already rather low. It's a preventive measure. So far lemmy.world was the one deliberately targeted.
This is a very good decision. I've worried about this problem from the very moment I first learned about the Fediverse.
Research definitely needs to be done to find CSAM detection tools that integrate with Lemmy; perhaps we could make a separate bridge repo that plugs a tool like that into the codebase easily.
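As a rough illustration of the kind of integration being discussed, the simplest possible upload filter just compares a file's digest against a vetted blocklist before accepting it. This is only a sketch: the `KNOWN_BAD_HASHES` set below is a placeholder (seeded with the empty-file digest purely for demonstration), and real detection tools rely on perceptual hashes like PhotoDNA or PDQ rather than exact digests, since trivial re-encoding defeats exact matching. The vetted hash lists themselves are access-restricted and distributed by organizations like NCMEC, not bundled with software.

```python
import hashlib
from pathlib import Path

# Placeholder blocklist for demonstration only; the single entry here is
# the SHA-256 digest of an empty file. Real deployments would load a
# vetted, access-restricted hash list, typically of perceptual hashes.
KNOWN_BAD_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def sha256_of(path: Path) -> str:
    """Hash the file in 64 KiB chunks so large uploads don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_reject(path: Path) -> bool:
    """Return True if the upload's digest matches the blocklist."""
    return sha256_of(path) in KNOWN_BAD_HASHES
```

A bridge like this could sit in Lemmy's upload path (or in front of pict-rs) and reject matches before anything is stored or federated, though exact-hash matching alone would only catch unmodified re-uploads.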
I hope every disgusting creature that uploads that shit gets locked up
There was a user who posted a tool they had already been working on, written in Python, to automate detection/deletion of potential CSAM on Lemmy servers, which admins could implement until better tools come along. Unfortunately, I don't remember who posted it or where I saw it in my travels yesterday.
That's disgusting! You did the right thing. Sorry you admins and mods have to put up with that shit; I hope the instance owners being attacked are reporting it to local authorities.
Thank you, sir. I appreciate the dedication to the community it takes to subject yourself to that moderation. Hopefully we can squash this before it goes too far, or farther than it already has, anyway.
I'm no expert on this, but I'd assume that it is sometimes easy to track them down, and sometimes very hard. Easy if they just do a direct upload from their home internet with a fixed IP address, using a regular Lemmy account that they also use for day-to-day stuff. But hard if, for example, they upload from some coffee shop wifi connection with a throwaway account using some tor / proxy / VPN shenanigans.
It's not just about lemm.ee; the upside (and downside) of federation is that stuff from lemm.ee gets copied to all the other federated instances and vice versa. So lemm.ee's region aside, this move tries to help protect the fediverse at large by removing a major distribution hub. It's not a full solution in that regard, but it makes a bad situation incrementally less bad. Other popular instances may end up doing this as well until better tools come about.
Seems like this is still disabled; is it now in place for the foreseeable future? What about setting a time frame during which no images can be posted until it passes? Or maybe requiring approval for adding an avatar or banner?
Personally, I would love it if all NSFW pictures were banned. There's deepai.org's NSFW detection bot; no idea if its API costs money to use or not.
Not a single Lemmy user or admin, be it .world, .ml, HB, Exploding Heads, or dbzer0, would benefit from other instances being attacked like this. Spamming normal porn or gore could maybe pass as a joke or retaliation, but using CSAM is very threatening to the whole fediverse, even when it's done against a single small instance.
Better shut the internet down, then. This will only continue to worsen now that anybody can generate whatever images they want with AI assistance. Such image hashes will not be in CSAM databases (if AI-generated imagery even counts as CSAM).