Urgent action is needed to protect victims of online abuse

23 November, 2020

Matt Burns & Dave Ranner


As of 2020, the National Center for Missing & Exploited Children (NCMEC) has received over 82 million reports of child sexual exploitation. That figure does not include the incidents that go unreported, unnoticed, and unsolved. Child sexual exploitation (CSE), grooming, and online abuse are not issues people like to talk about, but they are undeniably prevalent, and the rate at which they are escalating is a reality that cannot be ignored.

Despite this, a privacy law coming into force this year will remove one of the primary means of investigating these crimes. From 21st December, the EU’s ePrivacy Directive will make it illegal for tech companies to use technology to detect online child sexual exploitation.

Currently, there are no exemptions in this proposed regulation allowing investigators to prioritise the privacy rights of vulnerable children over those of malicious actors.

This would represent an unthinkably damaging, backwards step for child protection efforts. Organisations working against the spread of this material would be halted, investigations slowed, and an unfathomable number of victims would be left unidentified and unprotected.

If you agree that this consequence cannot be tolerated, please sign the petition to allow technology companies to continue using their tools to search for CSE imagery online before 7th December here: change.org/childsafetyfirst.

What you need to know

Why is this regulation being made?

It’s been three and a half years since this regulation was proposed, and until now, EU governments have failed to reach an accord.

The aim of this law is to stop technology companies, from messaging platforms to social media operators, from encroaching on people’s privacy. However, many of these organisations perform significant work to stop CSE material being shared and to identify victims of abuse by searching their online platforms for known illegal imagery. This work directly enables law enforcement to pursue millions of cases every year that would otherwise go undetected.

“Microsoft technology known as PhotoDNA has enabled tech giants like Facebook, Twitter and Google to track down millions of illegal child abuse images on their platforms.” - NCMEC

Perhaps inadvertently, this law threatens the safety of children online and of existing victims of abuse by restricting law enforcement investigations and preventing illegal images from being recognised and removed by technology solutions.

The impact of this regulation on how our industry battles CSE online directly contradicts the position taken by EU officials at a May 2020 meeting, where they discussed the need to do more to protect children online in the wake of the increased digital threats introduced by COVID-19.

What implications would it have for CSE investigations?

“9 in 10 webpages identified by the IWF showing videos and images of children suffering sexual abuse, rape, and torture are hosted on servers in Europe.” – Internet Watch Foundation (IWF)

The impact of this regulation would set the industry back decades. All of the progress we’ve made in technology development and investigative tools, progress that has accelerated the fight against criminal activity, would be rendered useless. Once again, a huge proportion of crimes would go undetected and unsolved.

Ultimately, we expect to see an explosion of material shared online if this regulation passes. Without technology-based detection, it will take much longer for material to be recognised and taken down, by which time it could be too late for the victims in question. This is especially harmful because most victims do not report their own abuse material, making automated detection one of the only ways it would ever be found.

Empowered by this knowledge, there is a real and frightening possibility that offenders will take advantage of their freedoms and brazenly increase their illegal activities on platforms hosted in the EU.

Why privacy isn’t being violated by this technology

The fear that has spurred on this regulation is that tech companies can inspect all user content with no barriers to, or respect for, privacy. But this simply isn’t the case. The algorithms that search for and identify child abuse images online are incapable of recognising anything other than known illegal images, which are matched by their image hashes.

Think of them like virus scanners on your computer. They don’t spy on user content or record activity; they only identify and flag specific material in order to protect your digital environment. Technologies like PhotoDNA work the same way: they find and flag known abuse imagery online, and nothing else.
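To make that comparison concrete, here is a minimal, purely illustrative Python sketch of hash-list matching. The hash list, function names, and use of SHA-256 are assumptions made for readability; production systems such as PhotoDNA rely on proprietary perceptual hashing and curated hash lists supplied by organisations like NCMEC, not on code like this.

```python
import hashlib

# Hypothetical list of fingerprints of known illegal images, as would be
# supplied to a platform by a clearinghouse such as NCMEC.
# The value below is a placeholder, not a real hash.
KNOWN_ABUSE_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint of the image content.

    Real systems such as PhotoDNA use a perceptual hash that survives
    resizing and re-compression; a plain SHA-256 digest is used here
    only to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def is_known_abuse_image(image_bytes: bytes) -> bool:
    """Flag an upload only if its fingerprint matches a known-bad hash.

    Nothing else about the content is read, stored, or classified:
    an image that does not match is simply ignored.
    """
    return fingerprint(image_bytes) in KNOWN_ABUSE_HASHES
```

The point of the sketch is that the check is a lookup against a fixed list of known-bad fingerprints, not an analysis of what any given photo depicts.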

Despite these tools being designed for the sole purpose of protecting children online and safeguarding victims of abuse, they are now being obstructed to protect the privacy of criminals.

We wholeheartedly agree that everyone is entitled to privacy. But we also believe that once the act of exercising your freedom encroaches upon the freedoms of another person, you forfeit your own right to privacy.

What you can do

There’s still time to do something. The European Parliament is currently debating an exception to this law under which technology companies would be able to use the tools in question until 2025, while a more permanent solution is determined.

NCMEC is hosting a petition to the European Parliament to accept the interim regulation at change.org/childsafetyfirst. The deadline for responses is 7th December, so if you want to take action against this potentially catastrophic decision, please sign now and share this message with your friends, family, and professional connections.

You can also take an additional step. If you have any connections to MEPs, please take two minutes to write a letter explaining why this regulation would be extremely damaging to victims of CSE and online abuse, and to the overall safety of children online and on social platforms. It’s not too late to urge these figures to support an exception for this worthy cause and help safeguard potential and existing victims.

Get in touch if you want to know what more you can do, or more about what this legislation means for victims, investigators and technology developers.
