Disruptive Competition Project

Protecting Kids While Respecting Europeans’ Rights: How To Navigate the Safety-Privacy Conundrum 

Credit: nycshooter

Main takeaways

  1. The Belgian EU Presidency’s latest proposal to force a breakthrough makes a fundamental error by failing to differentiate between known CSAM, unknown CSAM, and grooming
  2. It also risks undermining encryption, which is key to safeguarding Europeans’ privacy
  3. Introducing a framework for proactive detection would make online services safer 

Two years have passed since the European Commission published its initial proposal to prevent and combat child sexual abuse (CSA). Nevertheless, negotiations on Europe’s long-term regulatory framework to fight CSA still struggle to make progress.

What’s at stake, you ask? Well, on the one hand, the protection and safety of children online. Everyone will agree that this is extremely important. On the other hand, there’s something just as important: the right of users to privacy of their communications. 

Reaching a deal on the CSA Regulation cannot come at any price, as EU co-legislators need to make sure that both child protection and Europeans’ privacy are properly balanced. If EU Member States want to advance negotiations and adopt a common position, without upsetting this delicate balance, there are four key things they need to take into account. 

Recognise challenges in detecting different types of CSAM

Detection of CSA material (CSAM) is challenging given its unique and sensitive nature. The latest ‘compromise’ tabled by the Belgian Presidency of the EU Council, however, sadly fails to adequately account for the challenges of detecting different types of CSA content. It makes a fundamental mistake by proposing to treat known CSAM, unknown CSAM, and the solicitation of children (also known as ‘grooming’) in the same way, even though each requires very different treatment.

While perceptual hashing technologies are quite accurate for known CSA content and the risk of false positives is negligible, detecting unknown CSAM and grooming requires providers of online services to rely on artificial intelligence technologies, which are still much less accurate and carry a higher risk of false positives. 
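To illustrate why matching known content is so reliable, here is a toy sketch of the average-hash (‘aHash’) idea behind perceptual hashing. This is not the proprietary algorithm (such as Microsoft’s PhotoDNA) that services actually use, and the images and pixel values below are invented for illustration:

```python
# Toy average-hash ("aHash") sketch: one bit per pixel, set when the
# pixel is brighter than the image's mean brightness. Real deployments
# use far more robust, proprietary algorithms; this only shows why
# re-encoded copies of a *known* image still match its stored hash.

def average_hash(pixels):
    """Hash a grayscale image given as a flat list of 0-255 ints."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A hypothetical "known" 4x4 image and a slightly re-encoded copy of it
# (every pixel nudged by compression noise).
known = [10, 200, 30, 220, 15, 210, 25, 230, 12, 205, 28, 225, 11, 215, 27, 218]
copy  = [12, 198, 31, 219, 14, 212, 24, 229, 13, 204, 29, 224, 10, 216, 26, 217]

h_known, h_copy = average_hash(known), average_hash(copy)
# The small per-pixel noise leaves every bit of the hash unchanged,
# so the copy matches the known-content hash list exactly.
print(hamming(h_known, h_copy))  # → 0
```

Detecting *unknown* material has no such reference hash to compare against, which is why it falls back on statistical classifiers and their error rates.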

Whatever solution EU negotiators agree on will have to be applied at massive scale. This means that even a small error rate translates into an enormous number of mistakes once applied to the billions of messages exchanged by Europeans every day. The risks of the flawed Belgian proposal are not only a high number of false positives and the overblocking of innocent users; it would also flood law enforcement’s reporting systems. That, in turn, would lead to actual CSAM cases falling through the cracks of the judicial system because of a flawed one-size-fits-all approach. 
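A back-of-the-envelope calculation shows why scale matters. The figures below are purely illustrative assumptions, not numbers drawn from the proposal or from any real detection system:

```python
# Illustrative false-positive arithmetic. Both figures are hypothetical:
# a seemingly tiny error rate, applied to a realistic message volume,
# still produces a flood of wrongly flagged reports to triage.

messages_per_day = 10_000_000_000   # assumed daily message volume (EU-wide)
false_positive_rate = 0.001         # an apparently "good" 0.1% error rate

flagged_in_error = messages_per_day * false_positive_rate
print(f"{flagged_in_error:,.0f} innocent messages flagged per day")
# → 10,000,000 innocent messages flagged per day
```

And because genuine CSAM is rare relative to total traffic, the overwhelming majority of flags at such error rates would be false alarms, which is exactly what would overwhelm reporting systems.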

I think we can all agree this is not something any of us wants to happen, as the aim of the proposal – first and foremost – is to protect children. For the future CSA Regulation to be truly effective it is vital that the new framework makes a clear distinction between the different types of CSAM and how they should be detected.

Properly safeguard encryption

At the risk of sounding like a broken record, let me reiterate that striking the right balance between protecting children and safeguarding users’ right to privacy is fundamental. In that respect, encryption (including end-to-end encryption of private messages) has a key role to play in the provision of secure communications, both for minors and adults. It also helps to protect private and confidential information contained within files and other repositories. 

With the Council Presidency’s compromise now suggesting that users’ devices be scanned before any image or video can be uploaded to an online service (including to end-to-end encrypted communications), the latest Belgian proposal would de facto introduce client-side scanning and thereby undermine encryption. 

Users who refuse to consent to their device being scanned would simply no longer be able to share images, videos, or even hyperlinks. This despite the fact that the importance of encryption to protecting our privacy is acknowledged in EU legislation already in force, and was recently highlighted again by over 300 leading scientists and researchers.

Hence, the final CSA Regulation can only provide a lasting solution if it focuses on building the right framework for innovative detection technologies, while in parallel upholding the right to privacy. This means allowing online services that use encryption to protect users’ communication (including end-to-end encryption) to combat CSA without forcing them to open up the content of people’s private messages. 

Allow proactive detection of CSA

Preventing harm from happening in the first place is crucial when we are talking about the safety of children. That is why service providers should be allowed to proactively scan their online services for known CSAM and patterns of suspicious conduct, even before receiving a detection order from authorities. 

It is well established that the broad range of proactive measures already put in place by online services provides a flexible, fast, and effective way to prevent CSA from happening and to lower the risk of further dissemination. These measures include, for example, the removal of suspicious users, technologies that prevent unknown adults from contacting minors, and initiatives aimed at empowering young people by improving their (social) media literacy. 

To incorporate such measures into the final CSA Regulation, or at least ensure they remain compatible with it, online services should be guaranteed that they can continue to build on existing EU legislation (including the GDPR, DSA, and Terrorist Content Online Regulation) to design proactive solutions that are lawful and proportionate. Such a framework has been examined and endorsed by several recently published legal studies, including those by Timelex and Professor Barczentewicz.

Don’t prescribe technology choices

Finally, lawmakers should avoid prescribing which technology or solution has to be used to detect (the dissemination of) CSAM. In order for detection to be effective, different types of services need to be able to apply different technologies. 

For example, certain providers deliberately have no knowledge of the content they host in order to preserve users’ privacy – something inherent to products like cloud storage – which means they would face a number of technical and legal constraints if the EU imposed on them technology designed for public-facing online platforms. 

We all know that technology evolves at a much more rapid pace than laws do. A prescriptive technological approach would therefore not only stifle innovation – since there would be no incentive to make improvements – but could also prevent the deployment of more accurate detection measures in the future altogether. Rather than prescribing a particular technology available today, we need to make sure that the best solutions continue to be deployed – now and in the future – so that children can enjoy a safe online experience. 


This debate has been extremely polarised from the beginning. It is disappointing to see that, two years after the proposal was presented, everyone seems to have become further entrenched: they either want to ensure the safety of children or preserve the right to privacy, but hardly anyone seems able to acknowledge that we need to do both.

Isn’t it about time that we start looking at these two rights as complementary, rather than diametrically opposed? We can only hope that EU policymakers will see the light soon – adopting a balanced approach that effectively protects children while ensuring everyone’s right to privacy.
