Cybersecurity & Tech

Tor Hidden Services Are a Failed Technology, Harming Children, Dissidents and Journalists

Brian Levine, Brian Lynn
Friday, January 17, 2020, 8:27 AM

Tor hidden services are intended to help dissidents and whistleblowers. Instead, they have provided a false sense of security to users and created a platform for child sexual exploitation.

Published by The Lawfare Institute

A recent series of New York Times articles reported on the deeply disturbing amount of child sexual exploitation material that is available on the Internet. The articles discuss personal accounts of children who have been targeted, how tech companies have provided platforms for perpetrators of what is commonly called child pornography and how law enforcement has gone underfunded for years. In October 2019, Alan Rozenshtein commented on Lawfare that law enforcement’s efforts to combat this issue will become increasingly complicated when Facebook and other platforms roll out end-to-end encryption.

But the New York Times articles give just passing mention to the troublesome role already played by darknets in crimes against children. Darknets were created by computer scientists with the intention of increasing privacy and free speech. Unfortunately, even after decades of research, darknets are causing much more harm than good in practice. They have allowed perpetrators of many crimes, not just child sexual abuse, to organize like never before. And they have placed those who need free speech most—whistleblowers, dissidents and journalists—in danger. These are concerns that projects supporting darknets fail to acknowledge, and these organizations need to change their ways.

Darknets promise an anonymous exchange of information on the internet by masking identities, and they are rising in popularity for reasons of increased privacy and free speech. Every day, billions of people relinquish information about themselves through the use of social media, search engines, mobile phones and other technologies. Industries that collect and manage our data are under few regulations, and there are strong economic incentives for companies not to protect our privacy; darknets have positioned themselves as a solution. The Tor Project develops the most popular of the darknet technologies, with an estimated 2 million users per day. There are two Tor software products. The first, the Tor hidden services software, hides the internet location of a website from those visiting it. Hidden services, recently rebranded as “onion services,” are used for child sexual exploitation sites and for online markets selling addictive drugs, counterfeit documents, stolen credit cards, spam bots and the services of contract killers. At the same time, hidden services have tangible benefits; among them are enabling freer access to information in repressive countries and offering anonymity to whistleblowers.

The second product, the Tor Browser, enables an individual to access the internet anonymously. For example, the owner of a website would not be able to identify visitors who use the Tor Browser. To be clear, our concern is only with hidden services; the Tor Browser provides an important anonymity service for individuals.

When technology results in one harm, yet its unavailability might cause another, addressing the dilemma requires careful inquiry. In the case of hidden services, we ask: In the pursuit of greater privacy, are we sufficiently considering the cost to society? Similarly, are we overestimating the benefits of these technologies?

The primary benefit of hidden services is the security provided. But how secure are they? A system is secure when the cost of compromising it exceeds the value of doing so. For a hidden service, compromising its security means identifying the location of the hidden website, which could then be used to identify those responsible for the site. It is somewhat expensive to deanonymize a hidden service, but it is possible. Researchers have found many vulnerabilities in Tor's designs that allow deanonymization, and the Tor developers have fixed flaws when feasible. Some vulnerabilities, however, are not bugs or oversights; they remain because of deliberate trade-offs that Tor’s developers have made between security and a performant user experience. In other words, hidden services do not provide a high level of security against those with resources, as takedowns of sites offering abhorrent content have demonstrated. For example, the FBI took down the hidden service called Playpen, which has resulted in the rescue of 296 children from sexual abuse and 898 arrests of perpetrators to date. Playpen was a community of more than 150,000 users and hosted imagery of the sexual abuse of infants, toddlers and prepubescent children.

Hidden services are intended to help dissidents and whistleblowers, but they provide a false sense of security to users. Governments and corporations have immense resources to protect their self-interests, and certainly enough to breach Tor. The Tor Project website, which advertises itself as a solution for dissidents, irresponsibly does not spell out the risks to its intended users. Similarly, many news sources—including the New York Times and the Washington Post—offer SecureDrop hidden services for dissidents and whistleblowers to anonymously and securely submit documents. It's reckless that the SecureDrop websites do not explicitly state that the system is not secure against resourceful adversaries. Breaking modern encryption ciphers would take a quantum computer, but Tor can be defeated quite simply, and in a number of ways, with today's technology. Insights into and explanations of Tor's vulnerabilities are buried in academic papers that are intelligible to experts alone, while users remain unaware.

Hidden services amplify the harms of our society's worst perpetrators many times over. The distribution of abhorrent content is merely the first problem. Hidden services bring together communities of perpetrators, allowing them to train each other and normalize their criminal behaviors to each other. And the content is used to normalize the abuse to the child victims (a process called grooming). Hidden services are creating resilient archives of child sexual abuse images that inflict pain long into the victims' adulthood. Because the internet has no physical boundaries, hidden services extend the reach of illicit content—including child pornography and illegal drugs—into all communities.

And unlike other technology, hidden services avoid public scrutiny. When other internet forums have made difficult decisions about the balance of free speech and hate or abuse, public discussion has catalyzed or at least informed their actions. Facebook eventually began removing misinformation; Reddit eventually disallowed so-called revenge pornography and speech that incites violence; and the internet service provider Cloudflare eventually dropped 8chan, the community that nurtured three mass shooters. Free speech bastions such as 4chan, 8chan and Voat all have one policy in common—no sexual images of children—while the Tor Project explicitly defends its tolerance of child sexual abuse. A study found that 83 percent of hidden services traffic involves child sexual abuse sites. The Tor Project recently took steps that prevent new measurements of the popularity of hidden services sites, making scrutiny harder still. They refuse even to keep track of the problem.

For several reasons, the cost to privacy would be minimal if the Tor Project halted work on hidden services. First, state-of-the-art privacy research shows that hidden services have no practical security advantage over publishing information via the Tor Browser to a traditional, overt website. For example, using the Tor Browser, a dissident could operate a web server in a foreign country with more permissive speech laws; hidden services are not required. Individuals can hide their identity with the Tor Browser to access sites run by advocacy organizations and even highly permissive sites such as 8chan. And whistleblowers can just as securely contact and access the New York Times, Facebook or the BBC World Service via the Tor Browser. Second, research has found that operators of illicit hidden services do not identify themselves, but operators of hidden services with legitimate uses do; for example, we all know who operates the New York Times's SecureDrop hidden service. Third, as stated above, hidden services are not secure in the first place for whistleblowers, journalists and dissidents.

We believe in the principle that a measure of our society's greatness is how well we protect its members—particularly our most vulnerable members—from online harms, privacy violations and discrimination. Unfortunately, these persons are not protected well by hidden services. Instead, hidden services are allowing perpetrators to organize into communities to sell our personal data, operate black markets and inflict harms on the victims of child sexual abuse. Hidden services shatter the privacy owed to these victims, while overpromising protection to their intended users.

Thus, much important research and development remains to be done to ensure that everyone has both a right to the internet and a right to privacy. This task is urgent but must be performed more responsibly going forward. Computer scientists have an obligation to disclose and undo or mitigate any negative consequences of our technology. Our field invariably names the benefits of research advances, but negative consequences—if acknowledged at all—are too often labeled as an acceptable price or a problem that can be solved down the road.

It is time for our field and our society to stop labeling the Tor Project’s hidden services as a helpful technology whose benefits outweigh its costs. Minimally, the Tor Project must clearly quantify the resources required by an adversary to defeat the software. But, ideally, because the sites that run on hidden services cause enormous and continual harm to victims of sexual abuse and other crimes, it's time to move on: Tor must halt the failed hidden services project and start researching a different technology for enabling free speech.

Brian Levine is a Professor in the College of Computer and Information Sciences at the University of Massachusetts Amherst and Director of the UMass Cybersecurity Institute. His research is focused on network security, privacy, and forensics, including investigations of online child exploitation.
Brian Lynn is on staff in the College of Computer and Information Sciences at the University of Massachusetts Amherst. His focus is on researching and developing technologies to combat online child exploitation.
