Exploring the core image forensics challenges in 2024

14 May, 2024

The CameraForensics Team

In 2024, online media has become more prevalent, and more easily accessible, than previously thought possible. Popular apps like TikTok and Instagram have signalled the continued domination of short-form content, while lightning-fast mobile connectivity ensures that new content is only a screen tap away.

These advancements have revolutionised the way we consume information and connect with others, offering unprecedented convenience and accessibility. However, amidst this digital revolution lies a darker challenge: the proliferation of harmful content, including child sexual abuse material (CSAM) and illicit media.

While image forensics techniques have been able to streamline online investigations and connect investigators with new intelligence, evolving technology threatens to bring new challenges to previously efficient processes. Below, we’re taking a deeper look at the ongoing battle against online child exploitation, exploring several key challenges affecting image forensics techniques.

At a glance, these include:

  • Increasing media volume and complexity
  • Continually shifting concerns around user privacy
  • Evolving criminal sophistication
  • An increasing rate of realistic AI-generated CSAM

Read on to learn more.

Increasing media volume and complexity

On an average day, more than 6.9 billion images are shared on WhatsApp alone, with a further 3.8 billion on Snapchat and 2.1 billion on Facebook. As more applications continue to gain popularity, these trends show no signs of slowing down.

Online media continues to thrive, increasing in both volume and format complexity. The result is a severe, time-consuming backlog: more imagery is being created than investigators can effectively review, overwhelming teams and slowing down safeguarding efforts.

Even with the aid of automated tools, processing this volume efficiently and effectively is a formidable task, as investigators must maintain the highest level of accuracy throughout. However, several solutions are in the works, with approaches ranging from wider international automated classification to early mitigation efforts on the minor's own device.
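One widely used form of automated classification is matching files against hash lists of previously identified material. As a rough illustration only (not our production tooling), the Python sketch below assumes a hypothetical known_hashes set supplied by an investigator and simply splits a folder into hash-list hits and items still needing manual review:

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Cryptographic hash of a file's raw bytes."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def triage(folder, known_hashes):
    """Split seized files into hash-list hits and items needing manual review."""
    known, review = [], []
    for item in Path(folder).iterdir():
        if not item.is_file():
            continue
        # known_hashes is a hypothetical set of hex digests of previously classified files
        (known if sha256_of(item) in known_hashes else review).append(item)
    return known, review
```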

Discover more about the new mitigation efforts being developed in our full Q&A with SafeToNet’s Tom Farrell.

Continually shifting concerns around user privacy

Balancing the need for privacy with the objective of combatting CSAM is a delicate and ongoing challenge, one that’s no stranger to controversy.

Creating effective image forensics tools involves indexing large amounts of metadata from image-sharing sites like Flickr. Images on these sites aren't 'saved' to platforms like CameraForensics in the traditional sense; instead, they are catalogued, with a record of their metadata documented should an investigator need it. It's this indexing that lets investigators quickly find leads, such as images taken with the same camera or uploaded by the same username.
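As a simplified illustration of the idea (not our actual pipeline), the sketch below uses the Pillow library to pull a few EXIF fields and build a toy index that can be queried by camera make and model, or by the uploading username. The MetadataIndex class and its fields are purely hypothetical:

```python
from collections import defaultdict
from PIL import Image, ExifTags

def exif_fields(path):
    """Read the base EXIF tags (Make, Model, DateTime, ...) as a name -> value dict."""
    exif = Image.open(path).getexif()
    return {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

class MetadataIndex:
    """Toy inverted index: search key -> list of image references."""
    def __init__(self):
        self.by_camera = defaultdict(list)
        self.by_username = defaultdict(list)

    def add(self, image_path, username):
        fields = exif_fields(image_path)
        self.by_camera[(fields.get("Make"), fields.get("Model"))].append(image_path)
        self.by_username[username].append(image_path)

    def same_camera(self, make, model):
        """All indexed images whose EXIF reports the same camera make and model."""
        return self.by_camera.get((make, model), [])

    def same_username(self, username):
        return self.by_username.get(username, [])
```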

However, this indexing process relies on image metadata remaining untouched or 'uncleansed.' Most of the world's biggest platforms, including WhatsApp and Facebook, cleanse the metadata of the images they host.
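In practice, cleansing usually means re-encoding the pixel data and discarding everything else. A minimal sketch of the effect (a simplification, not how any specific platform implements it) might look like this:

```python
from PIL import Image

def cleanse(src_path, dst_path):
    """Re-encode only the pixel data, discarding EXIF and other embedded metadata."""
    img = Image.open(src_path)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))
    clean.save(dst_path)  # the copy carries no Make, Model or serial to index
```

Run an index like the one above over a cleansed copy and there is simply nothing left to key on.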

Metadata cleansing and encryption are essential tools for protecting user privacy and security. But they also create a barrier for law enforcement and forensic tools, removing many potentially valuable images from an investigator's inventory. That's why we're so passionate about big tech and law enforcement collaborating as closely as possible. Together, we can find a solution that fully respects user privacy while still supplying investigators with the intelligence needed to safeguard those most in need.

Evolving criminal sophistication

As image forensics tools continue to develop, becoming more effective than ever for investigators, bad actors are also employing various tactics to evade detection.

These tactics include not only the use of encryption, but also sophisticated image manipulation techniques such as steganography (hiding data within other files), deepfakes (creating AI-generated fake images or videos), and morphing (blending multiple images to create a new one).

At CameraForensics, we’re hard at work creating tools that can identify modified images and connect them with their native files, but evasion techniques are already evolving. We fully understand that image forensics tools need to continuously evolve to detect and counter these new methods, which is why we’re holding regular user summits and continuing to explore new possibilities with dedicated R&D efforts.
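To illustrate the general principle of linking a modified copy back to its source (and only the principle; this is not the technique we use in production), here's a tiny 'average hash', a very basic perceptual hash. Resized or re-compressed copies of the same picture tend to produce hashes that differ by only a few bits:

```python
from PIL import Image

def average_hash(path, hash_size=8):
    """Basic perceptual hash: shrink, greyscale, threshold each pixel on the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return int("".join("1" if p > mean else "0" for p in pixels), 2)

def bit_distance(hash_a, hash_b):
    """Number of differing bits; a small distance suggests the same source image."""
    return bin(hash_a ^ hash_b).count("1")
```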

An increasing rate of realistic AI-generated CSAM

“While the potential as a tool for new art can appear harmless, the accessibility and ease of use of these very sophisticated tools grant anyone the power to create or modify an image exactly how they want, with no accountability.” – Shaunagh Downing, Research and Development Engineer.

It’s no secret that AI is disrupting every industry, and ours is no exception.

As AI continues to evolve, a core challenge facing investigators is the volume of increasingly realistic generated imagery. These images and videos add to an already mounting backlog, as investigators must treat every image as if it were genuine.

Generative AI images also make it easier than ever for bad actors to create and access illicit imagery. We recently sat down with Dr Shaunagh Downing to talk more about the very latest AI risks. Read the full interview today.

Determined to help LEAs worldwide counter the CSAM threat

We’re committed to creating the tools needed to help investigators combat the threat of illicit online child sexual abuse and exploitation.

That’s why our ongoing R&D efforts remain dedicated to finding new image forensics processes and solutions in the face of new challenges, while our collaborative efforts help ensure that we have access to the knowledge and insights needed to continue making a real difference.

Visit our blog today to learn more about our mission and to discover other related insights.

