
Sounding the Alarm: Why the Fight Against CSAM Can’t Wait

Our Rescue
Published on April 7, 2025 | 6 min read

ARLINGTON, Texas — Sex crimes against children are surging online, creating a backlog of digital evidence that piles up faster than police across the US can sort through it. Cyber tips for child sexual abuse material (CSAM) cases have tripled nationwide since 2020, leaving police departments facing a logjam of investigations with scarce resources.

In Arlington, Texas, Sgt. Tarik Muslimovic and his team in the Human Exploitation Trafficking Unit face thousands of images and videos that document unspeakable crimes. Each image represents a voiceless child who needs help.

“To some, it may just be an image or a video, but to us, that’s a child’s worst day of their life. If law enforcement doesn’t act on these tips, what message are we sending to offenders?” said Sgt. Muslimovic. 

To date, Google has reported 2.8 million instances of CSAM to the National Center for Missing & Exploited Children, which relays the reports to law enforcement agencies worldwide. The FBI defines CSAM as material depicting sexually explicit activity involving a child under 18 years old, including images, videos, and any other media that exploits or abuses minors.

The Department of Justice underscores the gravity of the crime, noting that “underlying every sexually explicit image or video of a child is abuse, rape, molestation and exploitation.” The National Center for Missing & Exploited Children adds that every such image is a permanent record of the crime, and each time it is viewed, the child is revictimized.

Pandemic Pushes Predators Behind the Screen

Sgt. Muslimovic says the majority of CSAM cases—approximately 90 percent—originate from cyber tips, with up to 100 tips per month at his department alone. Even after filtering out cases that cannot be prosecuted, more than 50 cases remain each month, far exceeding the agency’s capacity to investigate each one.  

“Before COVID-19, many offenders would seek out children in person. Now, they’ve realized they can stay behind a screen, share material, and face little consequence. That shift has contributed to the exponential rise in CSAM cases,” said Sgt. Muslimovic. 

His department devotes a full-time digital forensic investigator to easing the bottleneck of cases. Processing a suspect’s phone within hours is crucial to identifying and rescuing survivors. The other key to fighting this crime is finding more digital evidence for rapid processing.

Further east in Wilkes-Barre, Pennsylvania, Sgt. Chaz Balogh recently retired from the Luzerne County District Attorney’s Office after spending nearly two decades as an Internet Crimes Against Children (ICAC) investigator. He says the sad reality for ICAC investigators is that they often have to prioritize cases with suspected live survivors, leaving other online CSAM cases unaddressed and adding to the overwhelming volume.

Sgt. Balogh’s partner in crime fighting is an electronic storage detection (ESD) dog, Spike. Armed with a search warrant, the detective and dog duo work together at suspected crime scenes to locate hidden electronic devices containing CSAM, often finding devices that police investigators had missed. Spike not only assists investigations but also provides emotional support to investigators and survivors alike. 

“Viewing child sexual abuse material on a daily basis is a horrific thing for every investigator, but at the same time, it motivates us to go after those individuals that are sexually exploiting children,” said Sgt. Balogh. “Each cyber tip sitting in that pile represents a real child living at home. It could be your son or daughter, and every one of them needs our help.” 

AI in the Fight: A Double-Edged Sword 

The rapid evolution of artificial intelligence (AI) is a double-edged sword complicating the fight against sex trafficking and child exploitation. While AI-generated CSAM is rising, AI also offers law enforcement agencies innovative tools to stay steps ahead. AI can identify new, never-before-seen exploitation images in ways traditional methods cannot, according to John Trenary, vice president of mission for Asia, Africa, and the Middle East at Our Rescue. 

“AI-powered investigative tools can recognize new CSAM by analyzing patterns, similar to how your phone categorizes pictures of buildings or pets. Even if the image was taken yesterday, AI can flag it. That’s a huge advancement, but law enforcement needs more staffing and funding to keep up. This problem isn’t going away. It’s evolving every single day,” said Trenary. 

Trenary, who has 25 years of experience in law enforcement—including digital forensics and cybercrime investigations supporting the FBI, Homeland Security Investigations, the Secret Service, and the Internet Crimes Against Children Task Force—explains how perpetrators use AI in two main ways. They either mass-produce AI-generated child exploitation material featuring fictional but realistic-looking children, or they target known children. 

“Maybe the child of a friend on Facebook. They download images from the parent’s social media, train AI on that child’s face and stature, and use AI to place the child into explicit imagery or even video,” said Trenary. “Every society on the planet agrees on one thing: You don’t sexualize young kids. But AI is now testing that boundary by normalizing exploitation in hidden online communities.” 

The Internet Watch Foundation has identified a growing threat in AI technology being exploited to produce CSAM. A 2023 report found 20,000 AI-generated images posted to a dark web forum in a single month, more than 3,000 of which depicted criminal child sexual abuse activity.

AI is also being weaponized to create disturbingly realistic deepfake videos of real children. Perpetrators can manipulate explicit content, making it harder for law enforcement to distinguish between fabricated and real abuse. In response, agencies are deploying AI-driven detection tools such as image recognition software to uncover new CSAM and identify survivors faster. 

“Even poorly made, these videos can cause real harm. Sometimes, it’s difficult to even tell that an image is AI-generated, but we can still charge offenders under child sexual abuse material (CSAM) laws or possession of child pornography,” said Sgt. Muslimovic. “Thankfully, new laws are being written to address these evolving threats.” 

Halting CSAM in Our Nation’s Capital 

In March 2025, lawmakers in Washington, D.C., rallied support for the Stop CSAM Act to combat online child sexual abuse material. A first-of-its-kind hearing featured testimony from survivors, advocates, and law enforcement experts.

The bill, introduced by Sen. Josh Hawley, R-Mo., and Sen. Dick Durbin, D-Ill., would allow parents and survivors to sue tech companies.

“Our bill takes a comprehensive approach to stemming the tide of online child sexual exploitation, and significantly, it pierces the broad immunity granted to Big Tech by Section 230,” said Sen. Durbin. 

Funding This Fight

From our nation’s capital to our state capitals, we are attacking these dark and dirty deeds with legislative fixes. But more resources are needed to turn the tide: more investigators, more cutting-edge technologies, more money.   

With your support, we are waging war against cyber criminals and finding survivors. In 2024, Our Rescue assisted nearly 230 agencies with training and equipment for digital forensics in trafficking investigations, resulting in 1,025 arrests and 173 rescues. Still, more resources are needed to keep pace, because every click or download of CSAM revictimizes a child. Until those resources arrive, the backlog grows, along with the number of children waiting to be saved.
