Child sexual abuse material, commonly referred to as CSAM, is one of the most disturbing and fastest-growing categories of online crime in the world, and one that far too few people outside law enforcement and advocacy circles understand in any real depth. The images and videos that fall under this category document the actual abuse of real children, which means that every time the material is shared or viewed, the harm to the victim is compounded and extended far beyond the original crime.
The scale of the problem is staggering. Reports submitted to the CyberTipline run by the National Center for Missing & Exploited Children (NCMEC) surpassed 36 million in 2023 alone, a figure that reflects both the volume of material circulating online and the increasing capacity of platforms to detect and report it. But those numbers also represent real children whose abuse has been documented and distributed, often repeatedly and across multiple jurisdictions, making both identification and removal enormously complex. NCMEC operates the CyberTipline as the central reporting system for this material in the U.S., and its resources at missingkids.org/gethelpnow/cybertipline explain how reports are reviewed, how victims are identified, and how the public can submit tips about suspected material.
Technology is both part of the problem and part of the solution. The same platforms that have made it easier for CSAM to spread have also invested, under significant legal and public pressure, in detection tools that flag and remove abusive material before it reaches large audiences. Hash-matching technology, AI-based content classifiers, and collaborative databases shared between platforms have made meaningful progress in reducing the volume of material that circulates unchecked. But gaps remain, particularly on encrypted messaging platforms and in newer online spaces where oversight is limited and reporting mechanisms are weak or nonexistent.
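To make the idea of hash-matching a little more concrete, here is a minimal illustrative sketch in Python; it is not drawn from any platform's actual system. It computes an exact SHA-256 fingerprint of a file and checks it against a placeholder set standing in for a shared database of hashes of previously identified material. Real deployments rely on perceptual hashing (such as Microsoft's PhotoDNA) and industry hash-sharing programs so that re-encoded or slightly altered copies still match, which plain cryptographic hashes cannot do.

```python
import hashlib

# Placeholder standing in for a shared database of hashes of previously
# identified material; the entry below is a fake value for illustration only.
KNOWN_HASHES = {"0" * 64}

def file_sha256(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: str) -> bool:
    """Return True if the file's fingerprint appears in the shared database."""
    return file_sha256(path) in KNOWN_HASHES
```

The sketch only shows the basic flow of fingerprint-and-compare; the choice of hash function, the contents of the shared database, and what happens after a match are all handled very differently, and far more carefully, in production systems.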
For parents, it is important to understand that CSAM is not just a law enforcement problem but a child safety problem that can touch any family. Children who are manipulated into producing explicit images through sextortion or online grooming become victims whose material may be shared as CSAM without their knowledge or consent. Teaching children about digital safety, healthy boundaries, and how to recognize and report manipulative behavior is a direct form of prevention, and the FBI's dedicated resources for parents and caregivers on protecting kids online offer practical, specific guidance that is well worth reviewing.
If you ever encounter suspected CSAM online, do not download, share, or attempt to investigate it yourself. Report it immediately to the CyberTipline at cybertipline.org, where trained analysts will review the report and route it to the appropriate law enforcement agencies. That reporting matters — each tip contributes to a larger picture that helps investigators identify victims, locate offenders, and disrupt networks distributing this material.
Fighting CSAM requires technology companies, law enforcement, nonprofits, governments, and informed members of the public to each play their part. The children whose abuse is documented in this material deserve every tool available to stop the spread of what has been done to them, and awareness is the foundation on which every other effort rests.