Transparency report: broken images and federated CSAM attack
Images posted within the last 48 hours will appear as broken. This is expected and intended.
Yesterday, 2023-08-27, a community on the lemmy.world instance received multiple posts containing CSAM (child sexual abuse material, more commonly known as CP), which then spread throughout the federation. We ended up becoming involuntary hosts of that content as well.
Due to the severely limited nature of Lemmy's moderation tools, removing or purging the offending posts from the admin UI wasn't sufficient: it didn't actually remove the images from our server. Because of this, a nuclear option was required. I have deleted every image saved by our server during the last 48 hours.
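For admins facing the same situation, the mass deletion above boils down to a single `find` invocation. This is a minimal sketch; it assumes cached media is stored as plain files on disk (the pict-rs path in the example is an assumption, adjust it to your deployment):

```shell
# Remove every media file written within the given look-back window.
# Assumes the media cache is a plain directory of files on disk.
delete_recent_media() {
    dir="$1"     # media storage directory
    minutes="$2" # look-back window in minutes
    # -mmin -N matches files modified within the last N minutes;
    # -delete removes each match after -print lists it
    find "$dir" -type f -mmin "-$minutes" -print -delete
}

# 48 hours = 2880 minutes; the path is a placeholder for your own setup:
# delete_recent_media /var/lib/pictrs/files 2880
```

Note this is a blunt instrument: it deletes legitimate uploads from the same window too, which is exactly the collateral damage described below.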
Unfortunately this also includes a post on [email protected], as well as multiple posts on [email protected]. Authors of the affected posts can fix them by re-uploading their images, without needing to recreate the posts.
We are sorry for the inconvenience, but hosting CSAM content is highly illegal and we simply can't take any risks on this front.
I am currently discussing with the other admins whether further measures are necessary, to prevent this from happening in the future. We'll keep you posted if we have any updates.
EDIT [2023-08-28 10:00 UTC]:
The attack is still ongoing. I have now blocked the community and further deleted the last 15 minutes of images.
Lemmy needs an option for admins to not cache federated media, and to only deliver it from its originating instance on request. That way, the offending server can handle it, or be defederated if they're complicit, without this systemic, network-wide cleanup job. As it stands, it's a huge security vulnerability for the network as a whole.
That's actually likely to come in the next update (pull request). It was originally proposed as a measure to save space rather than to fix a security issue, but it does fix it nonetheless.
> only deliver it from its originating instance on request
I don't think I agree with this. It's not easy, as an admin, to identify content of this kind being posted on your instance, or to know that you have trolls among your userbase. That kind of thing is hard to predict, and once it has happened it's already too late. As an alternative, I would propose:
- Fix the damn "purge" button: we actually do have a button in the admin interface to delete content from the database. The only problem is that it doesn't currently cover image caches; those stay around indefinitely unless manually deleted from the terminal, as I did today. This is a MAJOR flaw and should be fixed ASAP.
- Someone in the admin Matrix chat proposed federated purges. It's somewhat controversial, but it basically means that the content would get automatically deleted on all instances when the admin of the instance where it was originally posted presses the purge button. That way the lemmy.world admins could have automatically solved the problem for every other instance.
For federated purges, would it make sense for a purging instance to send an automated notice of contaminated content to all federated instances? Admins could then choose whether to handle it manually or automatically.
But it seems like it should be possible to purge recent posts from a particular instance only.
At this moment in time, yes they are. The lemmy.world team took down the community that was being targeted, which means the attack has stopped (even though whoever was posting that shit got their way). I'm bummed about having done these mass deletions, but I was quite scared and that was the easiest thing to do.
Actually, if such an issue were to resurface in the future, I have found a way to delete the offending content more selectively. The only side effect is that I have to look at those pictures myself to grab their IDs; and Lord, that shit can be disturbing at times.
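For other admins, a minimal sketch of that selective cleanup, again assuming the media cache is a plain directory of files (the alias and the pict-rs path in the comment are placeholders, not real values):

```shell
# Remove a single cached image by its alias (the random filename that
# appears in the image URL). Assumes media is cached as plain files.
delete_media_by_alias() {
    dir="$1"
    alias_name="$2"
    # match the alias with any extension and delete only those files
    find "$dir" -type f -name "${alias_name}*" -print -delete
}

# Example (path and alias are placeholders, adjust to your deployment):
# delete_media_by_alias /var/lib/pictrs/files 1a2b3c4d5e6f
```

This leaves every other upload untouched, which avoids the collateral damage of the time-window approach.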
Thank you for your patience and sorry about having destroyed your posts.
Sorry you had to go through that. People are messed up, and we have to expect this sort of thing from time to time. I, too, like the "store images only on the originating server" idea. It puts responsibility where it belongs. And wouldn't that allow for damage control by blocking the offending instance while they get their shit together?