Reflecting on 2023 at CameraForensics
By Matt Burns
Content warning: Self-harm and suicide, child abuse, sexual exploitation
With over 30 years of experience in Kent Police and the National Crime Agency (15 of them in child protection), our Engagement Manager Rob Chitham is a seasoned law enforcement professional and an expert in online and offline child protection and online crime.
Recently, we sat down with him to understand the threat of online child abuse in more detail. Below, we discuss the rising trend of self-generated imagery, blackmail, and sextortion involving children and explore the challenges faced by law enforcement in addressing emerging technology.
It’s definitely a complex area that’s challenging to fully understand, largely because there are so many different types of linked offending patterns.
When people used to talk about self-generated imagery, it was regarded as low-level imagery from a relationship that made its way onto the wider web. Now, the majority of first-generation images and videos fall under this blanket term of self-generated. In reality, there are many reasons for this, including grooming, catfishing, capping, sextortion and blackmail, with motivations that are both sexual and financial.
Everything is there for the taking on the internet. Offenders are finding new ways to reach and extort victims – especially via social media. Emerging threats are only making this easier and more dangerous still. I’m thinking of AI and deepfakes here, which are being used to generate imagery for blackmailing and extorting victims for financial gain or sexual gratification.
The dark web presents another challenge, where we’ve seen organised crime groups becoming involved with the selling of ‘premium content’. Whether this is new content or not, the message is clear – there’s an audience for this imagery, and organised crime groups are profiting off the victimisation of others at an extremely high level.
I say that these cases are crucial for two main reasons. The first is that you need to put the victim first. The more effective your investigations, the quicker you can identify them and put the necessary safeguarding procedures in place. Blackmail, abuse, and extortion can severely harm anyone – especially a minor – and it’s not rare to find a victim suffering from significant psychological trauma, a pattern of self-harm, or worse. The goal of any investigator is to get there first and to give these victims the support they need. That’s why it’s crucial.
But, if we look at the challenge again, the second reason investigations need to move fast is to keep pace with rapidly evolving technology. There’s always a new piece of technology that brings a new challenge to investigators. Right now, that piece of technology is AI.
Tools like Stable Diffusion have truly transformed the online landscape and have brought with them completely new challenges for investigators. For example, generated images are getting more and more realistic, which can create a real challenge for investigators trying to determine whether or not there’s a victim in need of safeguarding.
I think it’s fair to say that in the UK we have some way to go until we reach the standard of North America in looking after our law enforcement officers’ mental health. It’s a demanding, taxing, and high-pressure role that can harm anyone’s health, so giving officers the resources they need to continue doing what they’re committed to doing is vital.
It’s also important to continue investing in new technology. Investment isn’t going to be forthcoming unless online child protection is deemed to be a priority here in the UK. Sadly, we’ve seen some resource loss nationally in this area – which translates to a reduction in staffing and IT resources. This is a task that needs cutting-edge tools to stay ahead of a global challenge – both from organised crime and from individuals – and this investment could mean all the difference.
Definitely. When I think of support, I naturally think about cooperation. This could be cooperation with online platforms, or it could be cooperation with other countries. Both are important.
In the past, we’ve seen that certain platforms and hosting sites have been reluctant to cooperate with investigators. It could be damaging to their reputation to admit that they played a role in harm, but that’s where I think you need to remember your responsibility as a platform.
The same notion applies to countries as well. Countries that are not in full partnership or agreement with the UK are sadly not as collaborative as we need them to be. This includes the difficulty of sharing intelligence with countries that have complex human rights records. Now I don’t want to give the impression that that’s always the case, but international cooperation is definitely an area that needs to change.
Ultimately, what we need is a far more joined-up approach to how we deal with the threat. The difficulty lies in a huge difference of opinion and resources, but while we’re fixating on that, sextortion, blackmail, and abuse schemes are still operating and targeting our children.
There is an impact on the process. Previously, victims were targeted via adult-themed platforms and induced to perform sexual acts on camera that were then used as blackmail material. Now, because of the accessibility of social media, we’ve seen an explosion of cases as this offence type moves onto these platforms, impacting far more children.
Victims can potentially turn to self-harm and suicide because they’re being blackmailed and abused, while offenders gain financial and sexual gratification from the process. These are dangerous people, and their ability to operate at scale makes them very difficult to identify and prosecute. In the meantime, while we’re attempting to locate an offender, there are more and more victims worldwide.
I think that constantly improving mobile data and internet accessibility and speeds in developing countries and continents are also going to have a marked impact on this threat area. That’s why I firmly believe in the need to invest in our investigators and technical capabilities now.
This will be a continually worsening problem while we operate on an uncooperative, borderless internet. The only thing that can assist us is an investment in education, technology, collaboration, and tactics.
I think that my main concerns – the difficulty of victim identification and the need for investment – have been voiced.
With the right tactics and partnerships in place, there are real opportunities to take a more proactive approach. In an era where platforms and individuals are stripping Exif and other metadata, there’s a real opportunity to get into the right groups and collect imagery that opens new lines of enquiry that wouldn’t normally be there. For that to happen, there needs to be a joined-up approach on both the dark and the clear web, and a confident victim identification network within mainstream policing.
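To illustrate what “stripping Exif” means in practice: Exif metadata (camera model, timestamps, sometimes GPS) lives in a JPEG’s APP1 segment, and many platforms remove it on upload. The sketch below is purely illustrative – it is not CameraForensics tooling – and shows how a standard-library check can tell whether a JPEG byte stream still carries an Exif segment at all.

```python
import struct

def has_exif(jpeg_bytes: bytes) -> bool:
    """Return True if this JPEG byte stream contains an Exif APP1 segment."""
    if jpeg_bytes[:2] != b"\xff\xd8":  # SOI marker: not a JPEG at all
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:      # lost sync with marker structure
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:             # SOS: compressed image data begins
            break
        # Segment length is big-endian and includes its own two bytes
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True                # APP1 segment holding Exif data
        i += 2 + length                # skip to the next marker
    return False
```

An image that comes back `False` here has had its Exif removed (or never had any), which is exactly why investigators increasingly rely on visual and contextual clues rather than embedded metadata.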
Conversations like this fuel our drive to help safeguard victims worldwide. To see our other insights on all things open-source intelligence, AI-generated imagery, and image forensics, head on over to our blog today.