New study investigates illegal child sexual abuse material and anonymity on Tor network

Rochester Institute of Technology cybersecurity professor Billy Brumley is helping shed light on dark corners of the internet.

Brumley and a multidisciplinary group of researchers from around the globe are investigating the availability of child sexual abuse material (CSAM) online, in order to better understand and combat its widespread dissemination. Their findings provide crucial insights into harmful behavior on anonymous services and highlight potential strategies for public health intervention.

The study, published in Nature’s Scientific Reports, offers a comprehensive analysis of the availability of CSAM on the Tor network, as well as related search behavior and user demographics. The team also reached CSAM users directly with a survey, uncovering new ways to interfere with their activities.

Every day, millions of people use Tor, often known as the dark web, to encrypt communications and make internet browsing untraceable. Anonymity has both legal and illegal uses, and via the Tor network, a vast amount of illegal CSAM is accessible.

“The cybersecurity research field largely ignores the CSAM epidemic. It is categorically the head-in-the-sand ostrich syndrome,” said Brumley, the Kevin O’Sullivan Endowed Professor in Cybersecurity at RIT. “Security and privacy scholars tend to focus on purely technical aspects of anonymity and anti-censorship.

“Whenever anyone proposes technologies that might threaten anonymity or provide content filtering, it’s often met with ignorant knee-jerk reactions.”

The study cites a 2022 U.S. Congress report finding that, despite evidence of the growing prevalence and severe consequences of CSAM accessible through the Tor network, computer science research on CSAM remains limited and anonymous services have not taken action.

“We have a duty to the public,” continued Brumley. “Our technologies are not created in a vacuum. Yes, it’s sometimes easier to produce research results by ignoring the social consequences. But that doesn’t make it ethical.”

The study examined more than 176,000 domains operating through the Tor network between 2018 and 2023. Researchers found that in 2023 alone, one in five of these websites shared CSAM. The material was also easily available on 21 of Tor’s 26 most popular search engines, four of which even advertised CSAM on their landing page.