
Stanford researchers find Mastodon has a massive child abuse material problem

www.theverge.com /2023/7/24/23806093/mastodon-csam-study-decentralized-network

Not a good look for Mastodon - what can be done to automate the removal of CSAM?


214 comments
  • I'm not actually going to read all that, but I'm going to take a few guesses that I'm quite sure are going to be correct.

    First, I don't think Mastodon has a "massive child abuse material" problem at all. I think it has, at best, a "racy Japanese-style cartoon drawing" problem or, at worst, an "AI-generated smut meant to look underage" problem. I'm also quite sure there are monsters operating in the shadows, dogwhistling and hashtagging to each other to find like-minded people to set up private exchanges (or instances) for actual CSAM. This is no different from any other platform on the Internet, Mastodon or not. This is no different from the golden age of IRC. This is no different from Tor. This is no different from the USENET and BBS days. People use computers for nefarious shit.

    All that having been said, I'm equally sure this "research" claims that some algorithm has found "actual child porn" on Mastodon, verified by some "trusted third part(y|ies)" that may or may not be named. I'm also sure this "research" spends an inordinate amount of time pointing out the "shortcomings" of Mastodon (i.e., no built-in "features" that would allow corporations/governments to conduct what is essentially dragnet surveillance on traffic) and how this has to change "for the safety of the children."

    How right was I?
