
FOSTA: The New Anti-Sex-Trafficking Legislation May Not End the Internet, But It’s Not Good Law Either

Danielle Citron, Quinta Jurecic
Wednesday, March 28, 2018, 1:00 AM


Published by The Lawfare Institute in Cooperation With Brookings

Amid the chaos of the last week, one of the most significant pieces of internet legislation of the last two decades went relatively unnoticed. Most people likely had no idea that Congress was moving full steam ahead on altering a law that some credit for “why we have the internet.” And so it did: On March 21, the Senate passed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA).

Although the president has yet to sign the legislation, the bill’s effects are already being felt. FOSTA (known in a previous form as SESTA, or the Stop Enabling Sex Traffickers Act) amends Section 230 of the Communications Decency Act, which provides tech companies immunity from most liability for publishing third-party content. Currently, only federal criminal law, intellectual property laws, and the Electronic Communications Privacy Act fall outside the immunity provision. In the years since its enactment in 1996, Section 230 has been characterized as “the Magna Carta of the internet,” as Alan Rozenshtein wrote recently on Lawfare. For its supporters, Section 230 immunity is credited with enabling the growth of online platforms as safe havens for speech, even speech that platforms would be responsible for if it were expressed offline.

For the first time in twenty years, FOSTA carves out an additional statutory exception to that immunity. The idea is that online platforms should face the same liability for enabling illegal sex trafficking as offline outlets do. According to the bill’s oddly phrased “Sense of Congress” introduction, Section 230 was “never intended to provide legal protection to websites . . . that facilitate traffickers in advertising the sale of unlawful sex acts with sex trafficking victims.” That provision continues, “[i]t is the sense of Congress that websites that promote and facilitate prostitution have been reckless in allowing the sale of sex trafficking victims and have done nothing to prevent the trafficking of children and victims of force, fraud, and coercion.”

FOSTA then goes on to provide that technology companies will not be shielded from civil liability if they knowingly assist, support, or facilitate advertising activity that violates federal sex-trafficking law, specifically 18 USC 1591. (Section 230 does not immunize platforms from federal criminal liability.) Currently, advertisers are liable under Section 1591(a)(2) if they knowingly benefit from outlawed ads. FOSTA not only carves out an exception to Section 230 immunity for violations of Section 1591, but it also redefines what constitutes a Section 1591 violation to include “knowingly assisting, supporting, or facilitating” advertising.

Under the new law, state attorneys general, as parens patriae, can seek civil penalties for such activity. Additionally, technology companies could face state criminal charges if the conduct charged would constitute a violation of Section 1591. Companies could also be criminally liable in state court for violations of 18 USC 2421A, a new section added by FOSTA to the Mann Act, which prohibits transporting a person across state lines “with intent that such individual engage in prostitution” or other criminal sexual activity. Section 2421A criminalizes the ownership, management, or operation of a platform “with the intent to promote or facilitate” prostitution.

The law’s status as the first legislative incursion against Section 230 led to intense controversy over its drafting and passage. Initially, the arguments proceeded predictably. Anti-sex-trafficking advocates offered their strong support, portraying technology companies like Google as “allies” of human traffickers. Leading the opposition, internet freedom advocates argued that limiting Section 230 immunity would “endanger … free expression and innovation online.” Sex workers and some advocates for sex trafficking victims and survivors voiced concerns that clamping down on advertisements for sex online could place women in danger by forcing them to find work on the street and limiting their ability to vet clients.

At first, tech companies lined up in lockstep against the bill. But then the seemingly impossible happened: The platforms’ opposition receded. Perhaps because the climate on the Hill was increasingly inhospitable to the major social media companies given their role in L’Affaire Russe and perhaps because the writing was on the wall, the Internet Association (which represents Facebook, Google, Microsoft, and other big tech companies) endorsed the legislation after Senate staff changed the bill to head off some of its excesses.

As FOSTA awaits the president’s signature, major platforms are already doing damage control. Two days after the bill passed the Senate, Craigslist closed its personal ads section, which was often used to post solicitations for sex. Pointing to FOSTA, the advertising hub wrote, “Any tool or service can be misused. We can't take such risk without jeopardizing all our other services…” Likewise, Reddit announced a new sitewide policy prohibiting users from “solicit[ing] or facilitat[ing] any transaction or gift involving certain goods and services, including … [p]aid services involving physical sexual contact.” Reddit continued to invoke the typical catch-all denial of responsibility for transactions users might undertake, adding, with apparent irony, “Always remember: you are dealing with strangers on the internet.” Sex workers posting on forums and blogs have been tallying lists of websites that have shut down or removed U.S.-based advertisements. The sex ad market, however, has not collapsed: other websites remain open, at least for now.

Though neither Reddit nor the shuttered adult-advertising websites cited FOSTA, the timing hints at the legislation’s influence. House Judiciary Committee Chairman Bob Goodlatte, who pushed FOSTA forward in the House of Representatives, appeared to claim credit for the culling.

On the other end of the spectrum, as soon as the bill was passed, the Electronic Frontier Foundation announced, “Today was a dark day for the Internet.” The VPN service Private Internet Access took out full-page ads in The New York Times pleading with President Trump: “If you sign this bill, every website will be affected ... Free speech dies.” And Sen. Ron Wyden, one of Section 230’s drafters and FOSTA’s loudest opponent in Congress, issued a similar warning.

Critics of the bill’s approach to Section 230 argue that FOSTA creates exactly the “moderator’s dilemma” that Section 230 sought to avoid. By immunizing platforms both for under-filtering (under Section 230(c)(1)) and for “good faith” over-filtering (under Section 230(c)(2)), Congress aimed to incentivize self-regulation by shielding companies from liability for incomplete moderation. (Whether courts have interpreted Section 230 in line with this legislative history is another matter.)

FOSTA’s detractors (see, for example, Mike Godwin, Eric Goldman, and Emma Llanso) argue that FOSTA’s unclear “knowingly facilitating” language could perversely push platforms to engage in no moderation at all. Companies might sit on their hands for fear that their incomplete removal of ads for sex trafficking would become evidence that they “knowingly facilitated” the distribution of that content. In their view, the “moderator’s dilemma” would push platforms to simply avoid all moderation so that they could disclaim any “knowledge,” especially because it’s not clear what would constitute “knowing facilitation” under current law.

The flip side is that companies could instead engage in over-the-top moderation to prove their anti-sex-trafficking bona fides and strengthen their argument that they did not knowingly facilitate such activity in any given case. Overly aggressive moderation could mean the use of machine-learning algorithms to filter and block anything that relates to sex, including activities that have nothing to do with illegal sex trafficking. Given the bad PR that could result from a total abdication of responsibility to moderate, this seems to us to be the most likely danger.

The law’s critics are onto something. FOSTA isn’t executed artfully—perhaps because the bill went through so many different revisions and so many different offices had a hand in drafting it. The language is confusing, especially the “knowing facilitation” standard, which makes it hard to imagine that state and local prosecutors will be eager to expend scarce resources on enforcement. Even the Justice Department voiced concerns about the bill in a letter to Goodlatte.

But more troubling is that FOSTA endorses a piecemeal approach to a problem that should be solved more comprehensively. The legislation deals with sex trafficking but doesn’t touch on the numerous other ills that Section 230 has shielded technology companies from having to grapple with, like harassment, defamation, use of platforms by terrorist organizations, or other illegal activity. Along with Benjamin Wittes, one of us (Citron) has proposed a broader but more balanced legislative fix, under which platforms would enjoy immunity from liability if they could show that their response to unlawful uses of their services was reasonable. The determination of what constitutes a reasonable standard of care would take into account differences among online entities, reducing opportunities for abuse without interfering with the further development of a vibrant internet or unintentionally turning innocent platforms into involuntary insurers for those injured through their sites.

It’s too soon to say with any certainty what FOSTA’s long-term effects on the internet ecosystem will be. For the moment, the most significant thing about the law may simply be the fact that it was passed at all. With FOSTA on the books, proposals like the above, while unlikely to go anywhere, are no longer laughably out of the question. Though there’s still broad disagreement over whether law can or should do more to incentivize platforms to moderate content, the cultural and political mood has shifted over the last year toward the sense that technology companies have failed to live up to their moral responsibility—the result of months of bad press over the use of major technology platforms by Russian trolls to spread disinformation during the 2016 election; mounting frustration with the failure of Twitter and YouTube to curb abusive or disturbing content; and, most recently, the snowballing crisis over Facebook’s failure to handle the Cambridge Analytica data leak.

Even Wyden, who has been one of the fiercest defenders of Silicon Valley against government regulation, sounded a warning in his closing remarks against FOSTA. “Sites like Facebook, YouTube, and Tumblr ... have an undeniable role to play in fostering a civil environment,” he said from the Senate floor. “Their failure to do so could very well mean the internet looks very different ten years from now.”


Danielle Citron is a Professor of Law at Boston University School of Law and a 2019 MacArthur Fellow. She is the author of "Hate Crimes in Cyberspace" (Harvard University Press 2014).
Quinta Jurecic is a fellow in Governance Studies at the Brookings Institution and a senior editor at Lawfare. She previously served as Lawfare's managing editor and as an editorial writer for the Washington Post.
