Watchdog claims child sexual abuse material rising on X



Summary

A growing problem

An analysis by NBC News suggests child sexual abuse material is increasing on X.

Severed ties

Thorn, a nonprofit that provides detection software to social media companies, says it ended its contract with X over lack of payment.

X response

X says it has begun using its own internal system to locate and remove the illegal content.


Full story

A nonprofit organization that aids social media platforms in detecting child sexual abuse material says the illegal content is proliferating on X. The California-based organization, Thorn, says it recently terminated its contract with X after it stopped receiving payment.

As first reported by NBC News, Cassie Coccaro, head of communications at Thorn, said the termination came after “months and months of outreach” failed to persuade X to continue paying for its services.


X worked with Thorn in 2024 to test a tool designed to proactively detect text-based child sexual exploitation. However, Pailes Halai, Thorn’s senior manager of accounts and partnerships, said it’s unclear whether X fully implemented the tool. He also said the now-severed relationship raises concerns about X’s ability to limit the spread of child abuse material.

“It was very much a last-resort decision for us to make,” Halai said of the terminated contract. “We provided the services to them. We did it for as long as we possibly could, exhausted all possible avenues and had to terminate, ultimately, because, as a nonprofit, we’re not exactly in the business of helping to sustain something for a company like X, where we’re actually incurring huge costs.”

A growing problem

An analysis by NBC News indicates the amount of child sexual abuse material on X may be increasing. Hashtags advertising the sale of such material are widespread and use terms that have been prevalent since at least 2023.

Other aspects, NBC News said, are new, such as the use of bots to flood ads not only onto the main feed but also into groups under the platform’s “Communities” section. Reviewing such hashtags, the Canadian Centre for Child Protection (C3P) identified within minutes accounts that posted images of previously identified victims as young as 7.

Ads often direct users to other platforms, such as Telegram, to view content or make a purchase. Lloyd Richardson, director of information technology at C3P, called X’s response “woefully insufficient.”

“It seems to be a little bit of a game of Whac-A-Mole that goes on,” Richardson said. “There doesn’t seem to be a particular push to really get to the root cause of the issue.” 

X says enforcement capabilities strengthened

After questioning by NBC News, X said it had initiated additional efforts to detect child sexual abuse material through a method known as hash matching. Hashing assigns a unique digital fingerprint to known illegal content, allowing platforms to scan uploads for child sexual abuse material without having to possess or view the material itself.
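
As a rough illustration of the idea, the sketch below shows hash matching in miniature. It is a simplified, hypothetical example: it uses a plain cryptographic hash (SHA-256), which matches only byte-identical files, and a placeholder list of known fingerprints. Deployed systems typically rely on perceptual hashes (such as Microsoft’s PhotoDNA or Meta’s PDQ) that also catch resized or re-encoded copies, matched against databases maintained by child-safety organizations; nothing here reflects X’s actual implementation.

```python
import hashlib

# Placeholder fingerprints standing in for a database of previously
# identified illegal media. Real systems match against hash lists
# maintained by child-safety organizations, not a local set; this
# value is the SHA-256 of arbitrary sample bytes, for illustration.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}


def fingerprint(media_bytes: bytes) -> str:
    """Compute a fixed-length digital fingerprint of uploaded media.

    SHA-256 is used here for simplicity; it matches only byte-identical
    files, whereas production systems use perceptual hashes that also
    match visually similar copies.
    """
    return hashlib.sha256(media_bytes).hexdigest()


def is_known_match(media_bytes: bytes) -> bool:
    """Check an upload's fingerprint against the known list.

    Only fingerprints are compared, so the scanning platform never
    has to possess or view the flagged material itself.
    """
    return fingerprint(media_bytes) in KNOWN_HASHES
```

Because only fingerprints are compared, a platform can screen uploads against known material without retaining or viewing that material, which is the privacy property X’s statement alludes to.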

“This system allows X to hash and match media content quickly and securely, keeping the platform safer without sacrificing user privacy,” X said in a post. “This is enabled by the incredible work of our safety engineering team, who have built state of the art systems to further strengthen our enforcement capabilities.”

However, NBC News said material depicting the sexual abuse of children has only become more prevalent since Elon Musk bought the platform, then known as Twitter, in 2022. The outlet said it identified 23 hashtags related to the sale of child sexual abuse material; of those it looked up, only two were blocked by X.

Ultimately, Richardson said he believes industry-wide efforts to rely on artificial intelligence and automation to block illicit content are fundamentally flawed.

“There should be an actual incident response when someone is selling child sexual abuse material on your service, right?” Richardson said. “We’ve become completely desensitized to that. We’re dealing with the sale of children being raped. You can’t automate your way out of this problem.”

Alan Judd (Content Editor) and Ally Heath (Senior Digital Producer) contributed to this report.

Why this story matters

The decision by nonprofit Thorn to cut ties with X over unpaid services for detecting child sexual abuse material underscores growing concerns about the platform's capacity and commitment to addressing online child exploitation.

Child exploitation detection

Ensuring effective detection of child sexual abuse material is crucial for protecting vulnerable individuals and complying with legal and ethical standards online.

Platform responsibility

The termination of Thorn’s contract and subsequent expert criticisms raise questions about X’s responsibility and ability to respond robustly to the harmful use of its platform.

Challenges of automation

Reliance on automated and AI-driven methods to address illegal content is debated, with experts like Lloyd Richardson of C3P asserting that such approaches may be insufficient to handle the scale and seriousness of child exploitation online.