
Foreign Influence Operations and the 2020 Election: Framing the Debate

Josh A. Goldstein, Renée DiResta
Friday, October 23, 2020, 1:01 PM

Introducing a series from the Stanford Internet Observatory on assessing the threat of foreign influence operations targeting the United States.

A sign warning of misinformation. (Peter Hogan; CC BY-ND 2.0)

Published by The Lawfare Institute

Editor's Note: The authors will be taking questions live at 1 p.m. Eastern time on Friday, Oct. 30, as a part of our new project, Lawfare Live. Events under the Lawfare Live umbrella will be offered exclusively to Lawfare supporters on Patreon. Sign up now for a chance to talk to Goldstein and DiResta and to join an exciting community of Lawfare readers and podcast listeners. You can watch the event here.

Researchers of online manipulation and influence operations generally agree that foreign actors—Russia, China and Iran, to name a few—engage in overt and covert efforts to manipulate American public opinion in the service of political ends. But that consensus falls apart quickly when it comes to one seemingly simple question: To what extent do such efforts have a significant impact?

Take a recent Rand Corp. report on foreign interference. The authors highlight four principal objectives of Russia's active measures in the United States:

  1. Polarize and disrupt societal cohesion by exacerbating important and divisive issues, such as race, social class, and gender.
  2. Undermine public confidence in democratic institutions and processes.
  3. Spread confusion, generate exhaustion, and create apathy.
  4. Gain strategic influence over U.S. political decisionmaking and public opinion.

Some readers will perceive the threat described by Rand Corp. as grave. They might point to how foreign actors can run disinformation campaigns on the cheap—weakening social cohesion and undermining trust in news and fact. As National Counterintelligence and Security Center Director William Evanina argued in July 2020, “Foreign efforts to influence or interfere with our elections are a direct threat to the fabric of our democracy.” What’s more, an increasingly accessible and wide range of tools is available for such campaigns. Propagandists can leverage state media outlets with large followings; they can use hack-and-leak operations to redirect journalists and traditional news media; and intelligence agencies or public relations firms for hire can act as agents of influence, reaching out to American voters directly—and deceitfully—on social media.

But other readers may be more skeptical and argue that the threat of foreign influence operations is frequently exaggerated. After all, they might argue, it is often difficult to measure a campaign’s effect—but it is easy to conflate effort, or engagement on social media, with impact. Weakening social cohesion, in this view, is not primarily a product of foreign actors but, rather, a result of internal problems that widen existing domestic fault lines. Focusing on foreign disinformation distracts from bigger problems at hand. For those who fall into this camp, treating foreign influence operations as grave significantly overstates the threat posed by such operations; to do so is to blame foreign boogeymen, write off real domestic political opposition, and “do their job for them.” In the words of former National Security Council aide Fiona Hill, “The biggest risk to the election is not the Russians. It’s us.”

The disagreement is not merely one of semantics. Underestimating the threat of foreign influence may diminish the political will needed to deter state actors from continuing to wage such operations. But exaggerating the threat of foreign influence operations risks undermining public trust in the news and fostering an ecosystem in which anyone you disagree with starts to look like a Russian troll.

The two viewpoints are not mutually exclusive, but they call for different framing and place different emphases on what is urgent. They are diagnoses that lead to different prescriptions. If the threat of influence operations is understated, technology companies must urgently enhance deterrence-by-denial—by making it more difficult to influence American citizens—and the U.S. government must enact deterrence-by-retaliation—by credibly threatening costly punishment for interference. If the threat is overstated, by contrast, the solution might be a public education campaign, run by researchers and elected representatives alike, to put foreign interference into perspective.

Given the stakes of the 2020 election and the looming discussion of foreign interference, we believe that the public conversation benefits from laying our cards on the table. Instead of assuming a threat is high or low, researchers, journalists and government officials should describe the how and the why. What effect do foreign influence operations have? And are the effects commensurate with public discussion of them?

So, in partnership with Lawfare, the Stanford Internet Observatory asked leading researchers of information operations and online manipulation to do just that. Each contributor was asked to write a short piece answering the question: “Is the threat of interstate influence operations overblown?” Drawing on their diverse methodological and disciplinary backgrounds, the contributors take different approaches to their ultimate conclusions. Their arguments touch on topics including the relative effect of domestic versus foreign disinformation, how to think about the impact of an influence operation, and Russia’s social media efforts in 2016.

We hope this series will add analytical rigor to the public conversation and allow readers to assess well-reasoned arguments incorporating the best available evidence to ultimately draw their own conclusions.

Arguing that the threat is not overblown:

Arguing that the threat is overblown:

Reflecting on the future of influence operations:

Josh A. Goldstein is a research fellow at Georgetown University's Center for Security and Emerging Technology (CSET), where he works on the CyberAI Project.
Renée DiResta is the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching, and policy engagement for the study of abuse in current information technologies. Her work examines the spread of narratives across social and media networks; how distinct actor types leverage the information ecosystem to exert influence; and how policy, education, and design responses can be used to mitigate manipulation.
