
"Upload Moderation" - The EU's Latest Name For Messaging Surveillance

EU governments might soon endorse the highly controversial Child Sexual Abuse Regulation (CSAR), known colloquially as “chat control,” based on a new proposal by Belgium’s Minister of the Interior. According to a leak obtained by Pirate Party MEP and shadow rapporteur Patrick Breyer, this could happen as early as June.

The proposal would require users of communication apps to agree to have all images and videos they send automatically scanned and potentially reported to the EU and to the police.

This agreement would be obtained through terms and conditions or pop-up messages. To facilitate this, secure end-to-end encrypted messenger services would have to build in monitoring backdoors, effectively ending truly private messaging. The Belgian proposal frames this as “upload moderation” and claims it differs from “client-side scanning.” Users who refuse to consent would still be able to send text messages but would be barred from sharing images and videos.
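To make the consent mechanism concrete, here is a minimal sketch of how such a gate might look inside a messaging app. It is illustrative only, assuming a simple boolean consent flag; the names (`User`, `accepted_scanning_terms`, `can_send`) are invented for this example and do not come from the proposal.

```python
from dataclasses import dataclass

@dataclass
class User:
    # Would be set when the user accepts the terms-and-conditions pop-up.
    accepted_scanning_terms: bool

def can_send(user: User, attachment_type: str) -> bool:
    """Text always goes through; images and videos require scanning consent."""
    if attachment_type == "text":
        return True
    return user.accepted_scanning_terms

# A user who declined the pop-up can still text but cannot share media.
alice = User(accepted_scanning_terms=False)
print(can_send(alice, "text"))   # True
print(can_send(alice, "image"))  # False
```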

The scanning technology, which relies on artificial intelligence, is intended to detect known child sexual abuse material (CSAM) and to flag new images and videos it deems suspicious. The proposal drops the previously suggested scanning of text messages for signs of grooming and does not cover the scanning of audio communication, which has never been put into practice.
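The architectural point at the heart of the dispute is easier to see in a sketch: the check runs on the sender’s device before end-to-end encryption is applied. The code below is a simplified stand-in, assuming an exact SHA-256 lookup in place of the perceptual hashing real systems use and a dummy classifier for new material; every name and threshold here is invented for illustration.

```python
import hashlib
from typing import Callable

KNOWN_HASHES = {"0" * 64}   # placeholder database of hashes of known material

def looks_suspicious(image: bytes) -> float:
    """Stand-in for an AI classifier scoring previously unknown images (0.0-1.0)."""
    return 0.0               # dummy: never flags anything

def scan_before_send(image: bytes, threshold: float = 0.9) -> bool:
    """Return True if the image may be sent, False if it would be flagged.

    The scan happens client-side, before encryption, which is why critics
    argue that "upload moderation" is client-side scanning by another name.
    """
    if hashlib.sha256(image).hexdigest() in KNOWN_HASHES:
        return False         # match against known material
    return looks_suspicious(image) < threshold

def send_image(image: bytes, encrypt_and_transmit: Callable[[bytes], None]) -> None:
    if scan_before_send(image):
        encrypt_and_transmit(image)   # normal end-to-end encrypted delivery
    else:
        # Under the proposal, flagged media would be reported to an EU centre
        # and potentially to police rather than simply blocked.
        print("image blocked and queued for reporting")

# Example usage with a stand-in transport function.
send_image(b"holiday photo", lambda data: print("sent", len(data), "bytes"))
```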

The proposal, first introduced on 8 May, has surprisingly gained support from several governments that were initially critical. It will be revisited on 24 May, and EU interior ministers are set to meet immediately after the European elections to potentially approve the legislation.

Patrick Breyer, a staunch opponent of chat control, expressed serious concerns. “The leaked Belgian proposal means that the essence of the EU Commission’s extreme and unprecedented initial chat control proposal would be implemented unchanged,” he warns. “Using messenger services purely for texting is not an option in the 21st century. And removing excesses that aren’t being used in practice anyway is a sham.”

Breyer emphasizes the threat to digital privacy, stating, “Millions of private chats and private photos of innocent citizens are to be searched using unreliable technology and then leaked without the affected chat users being even remotely connected to child sexual abuse – this would destroy our digital privacy of correspondence. Our nude photos and family photos would end up with strangers in whose hands they do not belong and with whom they are not safe.”

He also points out the risk to encryption, noting that “client-side scanning would undermine previously secure end-to-end encryption to turn our smartphones into spies – this would destroy secure encryption.”

Breyer is alarmed by the shifting stance of previously critical EU governments, which he fears could break the blocking minority and push the proposal forward. He criticizes the lack of a legal opinion from the Council on this fundamental rights issue. “If the EU governments really do go into the trilogue negotiations with this radical position of indiscriminate chat control scanning, experience shows that the Parliament risks gradually abandoning its initial position behind closed doors and agreeing to bad and dangerous compromises that put our online security at risk,” he asserts.
