Rising Online Risks to Children: The Alarming Increase in Self-Generated Child Sexual Abuse Material (CSAM) and Risky Interactions with Adults

Children today are facing unprecedented online risks, as highlighted in a recent report from Thorn, a technology nonprofit. This article aims to shed light on the concerning rise in certain online threats to children, including the surge in self-generated child sexual abuse material (CSAM) and risky interactions with adults. Additionally, we will explore the challenges posed by novel technologies employed by online predators, as well as the vital role of hashing and matching technology in combating these risks. Lastly, we’ll introduce Safer, a proactive CSAM detection tool developed by Thorn, offering a comprehensive database of known CSAM hash values.

Increase in Self-Generated CSAM

One distressing trend is the growing number of minors taking and sharing sexual images of themselves, whether consensually or under coercion. This behavior has significant implications for the well-being of these children: it not only exposes them to exploitation but can also have a lasting impact on their mental and emotional health. Cases of self-generated CSAM rose 9% from 2021 to 2022, underscoring the urgent need for intervention and support for vulnerable youth.

Risky Online Interactions with Adults

The Thorn report also reveals that children are reporting a rise in risky online interactions with adults. These interactions can range from inappropriate conversations to grooming for sexual exploitation. The dangers involved in such encounters cannot be overstated, as they expose children to manipulation, coercion, and potential harm. It is crucial for parents, educators, and law enforcement agencies to remain vigilant and take active steps to protect children from these dangerous online engagements.

Surge in Reports of Child Sexual Abuse Material (CSAM)

The statistics from the National Center for Missing and Exploited Children (NCMEC) paint a distressing picture. Over the past five years, NCMEC has witnessed a staggering 329% increase in reported CSAM files; in 2022 alone, it received 88.3 million CSAM files. These alarming numbers underscore the urgent need for concerted efforts by tech companies, law enforcement, and policymakers to combat the proliferation of child sexual exploitation material.

Novel Technologies Used by Online Predators

Online predators are continuously evolving their tactics to ensnare vulnerable children. One concerning development is the deployment of chatbots to scale their enticement. These chatbots simulate real conversations, targeting unsuspecting children and luring them into dangerous situations. The scalability and effectiveness of these tactics pose a significant challenge, necessitating innovative solutions to counteract and neutralize these threats.

Rise in Reports of Online Enticement

The NCMEC report also reveals an alarming 82% increase in reports of online enticement of children for sexual acts between 2021 and 2022. This surge in predatory behavior further emphasizes the pressing need for enhanced preventive measures and decisive action to protect children online. Efforts must focus not only on identifying and apprehending predators but also on empowering children with the knowledge and skills to recognize and respond to potential threats effectively.

The Importance of Hashing and Matching Technology

In the fight against online exploitation, hashing and matching technology plays a crucial role. A hash is a compact digital fingerprint computed from a file, which allows two files to be compared without anyone viewing their content. Tech companies can leverage this to protect users and platforms by identifying and blocking content whose hash matches known CSAM hash values. By comparing uploaded content against a comprehensive database of known CSAM hashes, potential instances of exploitation can be flagged and addressed proactively. This technology serves as a critical tool in preventing the dissemination of such material and the re-victimization of the children depicted in it.
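To make the idea concrete, here is a minimal sketch of hash-based matching using Python's standard `hashlib` library. The `KNOWN_HASHES` set and its placeholder entry are hypothetical stand-ins for a real known-CSAM hash database (which, in practice, is maintained by organizations such as NCMEC and Thorn and contains millions of entries). Note that a cryptographic hash like SHA-256 only matches byte-identical files; production systems such as Safer also employ perceptual hashing so that resized or slightly altered copies can still be matched.

```python
import hashlib

# Hypothetical database of known hash values (hex digests).
# The entry below is a placeholder for illustration only.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming in chunks
    so large uploads never need to be held fully in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_match(path: str) -> bool:
    """Flag an uploaded file if its fingerprint appears in the database."""
    return sha256_of_file(path) in KNOWN_HASHES
```

Because only fingerprints are compared, a platform can screen every upload against the database without storing or inspecting the underlying imagery, which is part of what makes this approach practical at scale.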

Safer: A Proactive Tool for CSAM Detection

Thorn’s Safer is a remarkable tool designed to proactively detect CSAM by providing access to a vast database aggregating over 29 million known CSAM hash values. By automatically scanning and comparing uploaded content against this extensive repository, Safer can swiftly identify and flag potentially harmful material. This technology offers a powerful resource for tech companies, law enforcement agencies, and other stakeholders committed to combating the exploitation of children online.

The rise in self-generated CSAM and risky online interactions with adults highlights the urgent need for heightened awareness, intervention, and preventive measures to protect children in the digital realm. The alarming increase in reports of CSAM and online enticement further underscores the necessity for multi-stakeholder collaboration and decisive action. Harnessing the potential of innovative technologies and tools like hashing and matching, as demonstrated by Safer, can pave the way for safer online spaces for children. It is our collective responsibility to prioritize the well-being and safeguard the innocence of our children by effectively addressing and combating these growing online risks.
