
What is CSAM?

Posted by Our Rescue | Published on April 10, 2025 | 8 min read

Understanding CSAM: Definition and Terminology 

Child sexual abuse material (CSAM), commonly known as “child pornography,” refers to any form of media that depicts the sexual exploitation or sexual abuse of children, including images, videos, and live streams that show minors being sexually abused or exploited. The term “CSAM” is legally and ethically more precise than “child pornography”: calling the material what it is, child sexual abuse material, accurately names the harm and the abuse involved, rather than implying consent or normalizing the exploitation.

Legal Definition of CSAM 

It’s important to understand that U.S. federal law still uses the term “child pornography.” The PROTECT Act of 2003, which revised the Child Protection and Obscenity Enforcement Act of 1988, defines it as any material that visually depicts a minor engaging in sexually explicit conduct, or material intended to convey such a depiction, even if it involves computer-generated imagery (CGI) or simulated scenes. The definition is broad: it covers not only real photographs and videos but any material, whether produced digitally or by other means, that meets these criteria.

Why the Term CSAM Is Preferred Over Older Terminology 

Distinguishing between “child pornography” and “CSAM” is not just semantics. The term “child pornography” carries connotations of consensual adult material, which is far from what child sexual abuse material involves; referring to it as “pornography” risks downplaying the violence and exploitation children endure. That’s why advocates and lawmakers now use “CSAM,” a term that never normalizes these acts and always emphasizes their severity.

Impact on Child Survivors 

The impact of CSAM on the children it depicts is profound and long-lasting. Survivors often suffer from severe psychological, emotional, and physical trauma. The abuse can lead to long-term mental health issues, such as depression, anxiety, PTSD, and feelings of shame and isolation. Even if survivors escape the abuse, the ongoing existence of their images on the internet can cause them immense harm as they grow older. 

Moreover, the abuse of children for CSAM is not just a one-time event; the material lives on indefinitely in the digital world, subjecting these children to ongoing exploitation and harm. 

In 2017, the Canadian Centre for Child Protection released findings from its survey of CSAM survivors. Nearly 70% of respondents indicated that they worry constantly about being recognized by someone who has seen images of their abuse, and 30% reported having been identified by a person who had viewed the child sexual abuse imagery.

One survivor was quoted as saying, “My child sexual abuse imagery is out there for anyone to see, I will forever be taken advantage of. It’s not something that will ever go away.”

It’s also important to recognize that many children who become victims of CSAM may never report their abuse due to fear, shame, or lack of understanding of the legal and emotional ramifications. 

Types of Content Classified as CSAM 

CSAM includes several forms of media, such as: 

  1. Photographs and Videos: Real images or videos that show children being sexually abused or exploited. 
  2. Live Streams: In some cases, CSAM is produced and distributed through live-streaming platforms, which can facilitate real-time abuse. 
  3. AI-Generated Imagery: With advances in artificial intelligence, perpetrators can now create realistic depictions of child abuse without involving real children. These AI-generated images and videos are also classified as CSAM, and producing, distributing, or possessing them remains a crime. 
  4. Drawings and Illustrations: Digital artwork, cartoons, and illustrations that depict minors in sexualized contexts can also be treated as CSAM under the law. 

The Scope and Impact of CSAM 

The reach of CSAM is global and widespread. The issue isn’t confined to one country or region; it affects children across the world. Every day, law enforcement agencies, online platforms, and advocacy groups confront the devastating impact of CSAM on children and society at large. 

Our Rescue works in five regions and supports 29 countries around the world, and every team has helped authorities in cases involving CSAM.

Global Statistics and Prevalence 

The scale of CSAM production and distribution has reached alarming levels. According to the National Center for Missing & Exploited Children (NCMEC), the number of CSAM-related reports has skyrocketed in recent years. 

In 2023, NCMEC’s CyberTipline received more than 35.9 million reports of suspected CSAM, up from 29,309,106 in 2021, and more than 90% of those reports involved CSAM uploaded by users outside the United States. This crime is only growing in our digital age of unlimited access. 

These numbers reveal the frightening scope of CSAM, demonstrating that child exploitation is a global epidemic. 

Digital Proliferation Challenges 

One of the most pressing challenges in combating CSAM is the digital proliferation of such material. The internet makes it easier than ever for perpetrators to share and distribute CSAM globally. A photo or video taken in one country can be spread to every corner of the world within seconds, making it incredibly difficult to track and remove. Additionally, as technology evolves, so do the methods perpetrators use to create and distribute CSAM. This ever-changing landscape presents unique obstacles for law enforcement agencies. 

That’s why, at Our Rescue, we support our global law enforcement partners by assessing their investigative and forensic needs, implementing advanced technical training, providing case mentoring, and expanding their forensic capabilities with specialized computers and artificial intelligence (AI)-enabled forensic software to help in the fight against CSAM.

Federal and State Laws 

In the United States, federal law provides the backbone for prosecuting CSAM cases. 18 U.S.C. § 2252A makes it illegal to produce, distribute, receive, or possess CSAM anywhere in the country. Offenders face severe penalties, including lengthy prison sentences and hefty fines. Many states have also enacted their own laws to combat CSAM and support survivors.

We recommend staying up to date on pending bills that aim to further protect children online. You can track federal legislation at https://www.congress.gov/

International Regulations 

Internationally, several treaties and conventions address CSAM. Article 34 of the United Nations Convention on the Rights of the Child (UNCRC) commits all signatories to take appropriate measures to protect children from the “exploitative use of children in pornographic performances and materials.” 

Laws regarding CSAM differ around the world, and many countries differ on the definition of “child” and the age of consent. Even so, CSAM is illegal in 187 of 195 countries.

How to Report CSAM 

Reporting CSAM is critical to stopping its proliferation. If you come across CSAM, you must report it to the appropriate authorities. If it’s an emergency, dial 911 immediately.  

In the United States, NCMEC provides a confidential, anonymous way to report suspected CSAM through its CyberTipline. Submit a report here: report.cybertip.org

Additionally, you can call the Know2Protect Tipline at 833-591-KNOW (5669). 

Digital Safety Guidelines 

Parents and caregivers can take several steps to keep children safe online, including setting clear rules about internet use, monitoring online activities, and discussing the potential dangers of sharing personal information. Using parental control software and ensuring devices have appropriate privacy settings can also help mitigate risks. 

Learn about 11 apps to watch out for when it comes to keeping your kids safe online: https://ourrescue.org/education/statistics/online-apps-know-the-dangers-for-kids 

Resources for Parents and Educators 

There are many resources available to help parents and educators teach children about online safety and prevent exploitation. The best thing caregivers can do is start talking to children about what is and isn’t okay early on. Have open conversations so your child feels comfortable telling you when something is wrong.  

“Internet safety should be incorporated into general safety conversations that starts at 3, 4, and 5 years old in developmentally appropriate ways,” says Dr. Jordan Greenbaum, former Medical Director of the Global Initiative for Child Health and Well-Being at the International Centre for Missing & Exploited Children. “If we wait until sextortion is starting to happen at 11-13-years-old, it’s too late.” 

To learn how to start those conversations and to find more information on internet safety, check out our guide Start Talking: A Guide to Keeping Children Safe Online, available at https://ourrescue.org/files/OURRescue_Start-Talking.pdf 

Technology Tools and Safeguards 

Technology companies play a critical role in protecting children from CSAM. Many platforms now use automated tools to detect and remove abusive content. But as offenders continue to exploit evolving technology, these tools must constantly adapt to meet new challenges, including the rise of encrypted platforms and AI-generated abuse imagery. 
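
As a simplified illustration of how such automated detection can work, here is a minimal sketch in Python of hash-list matching: the platform computes a cryptographic fingerprint of each uploaded file and checks it against a list of fingerprints of known, previously verified CSAM, such as the hash lists NCMEC shares with industry. This is a hypothetical sketch, not any specific company’s implementation; the hash value is a placeholder and the function names are illustrative.

```python
import hashlib

# Placeholder standing in for a hash list of known, verified CSAM,
# as distributed to platforms by a clearinghouse such as NCMEC.
# The value below is just the SHA-256 of an empty file, used as a dummy.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of_file(path: str) -> str:
    """Compute a file's SHA-256 digest, reading in chunks to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_hash(path: str) -> bool:
    """Return True if the file's digest appears in the known-hash list."""
    return sha256_of_file(path) in KNOWN_HASHES
```

Exact cryptographic hashing like this only catches byte-identical copies, which is why production systems favor perceptual hashes (such as Microsoft’s PhotoDNA) that still match after resizing or re-compression. When a match is confirmed, the platform removes the content, and U.S. providers are required by law to report it to NCMEC’s CyberTipline.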

In addition to technological innovation, public policy is also stepping up. Two nonpartisan bills currently before Congress, the Take It Down Act and the Kids Online Safety Act (KOSA), aim to give children and families more control and protection online. The Take It Down Act would strengthen the mechanisms for removing explicit images of minors from the internet, while KOSA seeks to hold platforms accountable for harmful content and design features that put children at risk. Both bills represent important steps in the fight to make digital spaces safer for children.

Together, legislation and technology must work hand in hand to address the urgent and evolving threat of CSAM in the digital age. 

The fight against CSAM requires the combined efforts of law enforcement, governments, technology companies, and individuals. By understanding the legal definitions, recognizing the widespread impact, reporting violations, and taking preventive measures, we can help protect children from this horrific abuse and work toward a world without CSAM. 
