Security, Privacy and the Coronavirus: Lessons From 9/11

Peter Swire
Tuesday, March 24, 2020, 2:46 PM

In this moment of true national emergency, how does the public know whether new surveillance programs are necessary?

The National Guard operates a mobile testing center in New Rochelle, New York. (Source: Flickr/The National Guard by Sgt. Amouris Coss)

Published by The Lawfare Institute

A few days after Sept. 11, 2001, I entered the Mott House on Maryland Avenue, not far from the Senate. The national D.C. office of the ACLU was hosting a somber meeting of privacy and civil liberties experts from across the political spectrum. We all recognized that everything on our agenda had changed drastically. We were in a new era, where security was paramount, and privacy a policy afterthought at best.

This week, nearly 20 years later, Greg Nojeim of the Center for Democracy and Technology convened and ably led a similar meeting—online, of course—to consider surveillance in the context of the coronavirus. The world is once again at a moment where myriad practices must change, especially for the public health system, and many new laws will be considered. The Justice Department has already asked Congress for new powers, including the ability to ask a judge to detain people indefinitely during emergencies.

In this moment of true national emergency, how does the public know whether new surveillance programs are necessary? In some ways, everything that people are facing right now seems unprecedented and thus open to change. Every legal limit can appear, in the crisis of the moment, a relic of the prepandemic era.

Perhaps the biggest surveillance debates will occur about a type of data that is far more pervasive than in 2001—location data. The number of mobile phone subscriptions in the U.S. has roughly quadrupled since 2001, exceeding 400 million subscriptions, more than one per person in the country. In addition, since the introduction of the iPhone in 2007, many users run multiple location-tracking apps, which record location with varying degrees of precision.

The surveillance risks from using location data vary enormously. Some proposals are privacy-preserving, reporting only aggregated, anonymized data, as Facebook and Google have announced. Other countries have already announced a range of responses, including intrusive programs in China and in Israel, where the government reportedly bases actual quarantine orders on a person’s cellphone location data.

To assess new location and other surveillance proposals, it’s worth examining some particular lessons from the post-2001 period. The meeting in 2001 resulted in the 10 provisions of “In Defense of Freedom at a Time of Crisis.” The 128 groups that signed this document came from all political viewpoints, ranging from Amnesty International and the Electronic Frontier Foundation to Phyllis Schlafly’s Eagle Forum and Grover Norquist’s Americans for Tax Reform.

In reading the 10 provisions, consider how well many of them translate to our current crisis:

“In Defense of Freedom at a Time of Crisis”

1. On September 11, 2001 thousands of people lost their lives in a brutal assault on the American people and the American form of government. We mourn the loss of these innocent lives and insist that those who perpetrated these acts be held accountable.

2. This tragedy requires all Americans to examine carefully the steps our country may now take to reduce the risk of future terrorist attacks.

3. We need to consider proposals calmly and deliberately with a determination not to erode the liberties and freedoms that are at the core of the American way of life.

4. We need to ensure that actions by our government uphold the principles of a democratic society, accountable government and international law, and that all decisions are taken in a manner consistent with the Constitution.

5. We can, as we have in the past, in times of war and of peace, reconcile the requirements of security with the demands of liberty.

6. We should resist the temptation to enact proposals in the mistaken belief that anything that may be called anti-terrorist will necessarily provide greater security.

7. We should resist efforts to target people because of their race, religion, ethnic background or appearance, including immigrants in general, Arab Americans and Muslims.

8. We affirm the right of peaceful dissent, protected by the First Amendment, now, when it is most at risk.

9. We should applaud our political leaders in the days ahead who have the courage to say that our freedoms should not be limited.

10. We must have faith in our democratic system and our Constitution, and in our ability to protect at the same time both the freedom and the security of all Americans.

Many members of the Lawfare community worked after 9/11 on security, surveillance and related topics. Each can offer their own insights based on how the nation responded. Here are some points that came to me while discussing the new challenges, including relevant terms and concepts that emerged from the earlier period, such as data mining, false positives, security theater and warrantless wiretaps.

Question whether the data is actually accurate and actionable. In my experience, the odds that a dataset turns out not to be useful are high. One of the relevant aphorisms is “garbage in, garbage out.” Another is the “streetlight effect,” the tendency of people to search for something where it is easiest to look. It comes from the story of the drunk person searching for lost keys under a city streetlight, when the keys were actually lost somewhere else. When asked why, the drunk replies: “That’s where the light is.” Applied to new surveillance proposals, consider whether the data that is available—that is, under the streetlight—would actually help. For instance, many databases of location information lack the completeness or precision to provide meaningful utility for tracking the virus.

Be cautious about claims regarding the capabilities of “data mining.” Post-2001, the term “data mining” had the kind of semimagical properties that today many observers attribute to “machine learning” (ML) or “artificial intelligence” (AI). Proponents of a new surveillance program assert large benefits. Critics ask detailed questions, identifying holes in the proponents’ claimed benefits and explaining why the system may not work as advertised. Proponents then answer, “We can solve that with data mining (or ML or AI).” My recommendation in such instances is to work through the details, to see which subset of the proponents’ claims logically has a chance of succeeding. There may be promising ways to use ML or AI to address the coronavirus, such as screening potential drug therapies more quickly, but the existence of advanced statistical techniques does not mean that relevant, accurate data is available or that novel datasets should go into the hands of government agencies.

The peak example of proposed data mining after 9/11 was Total Information Awareness (TIA), which aimed to combine numerous data sources and find the terrorist needles in the nonterrorist haystack. Its leader, Adm. John Poindexter, said: “We must become much more efficient and more clever in the ways we find new sources of data, [and] mine information from the new and old.” Faced with criticisms about efficacy and privacy, Congress cut off all funding to TIA in 2003. A subsequent Department of Defense report catalogued numerous problems with government data mining used in TIA and proposed safeguards to address those problems. Today, the new crisis provides a new opportunity for agencies to similarly expand their reach. In the words of Maya Wang of Human Rights Watch, “The coronavirus outbreak is proving to be one of those landmarks in the history of the spread of mass surveillance in China.”

Avoid inaccurate data and false positives. Suppose the government knew for certain that an individual was carrying the virus and going to crowded places. In that event, there would be compelling public health reasons to identify the person and take steps to isolate him or her, perhaps even requiring a period of quarantine. In the real world, however, data is rarely that accurate. At this moment in the United States, in the absence of effective testing, the data on who is infected is stunningly incomplete. Any current map of reported cases thus bears little resemblance to reality.

In addition to such problems of data inaccuracy and incompleteness, the experience after 9/11 taught policymakers the importance of “false positives.” Security expert Bruce Schneier explained in detail why data mining “isn’t tenable” for uncovering future terrorist plots. For credit cards, no algorithms existed for pinpointing the tiny number of terrorist transactions among the vast sea of nonterrorist transactions. If investigators ever did get a hint of a correlation, there could easily be hundreds or thousands of false positives, sending investigators off on one fruitless search after another.

A broader problem is that commercial databases are rarely configured to provide the sort of accuracy needed to, say, conclude whether any residents in an apartment building are currently infected. A typical online advertising campaign counts it a success if it sends out 100 advertisements and gets as many as three responses. If public health authorities used the same targeted marketing approaches to identify relevant medical cases, however, they would have to visit 97 healthy people to find the three who are actually ill.
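The base-rate arithmetic behind this point can be made concrete with a short sketch. All numbers below are illustrative assumptions chosen to mirror the 3-in-100 ad-campaign analogy, not real screening or public health figures:

```python
# Illustrative base-rate arithmetic for false positives in screening.
# The population size, sensitivity, and false-positive rate are
# hypothetical assumptions, not real data.

def false_positive_count(population: int, true_cases: int,
                         sensitivity: float, false_positive_rate: float):
    """Return (true positives flagged, false positives flagged)."""
    healthy = population - true_cases
    true_positives = true_cases * sensitivity          # ill people correctly flagged
    false_positives = healthy * false_positive_rate    # healthy people wrongly flagged
    return true_positives, false_positives

# 100 actual cases in a population of 100,000; the screen catches 99%
# of real cases but also flags 3% of healthy people:
tp, fp = false_positive_count(population=100_000, true_cases=100,
                              sensitivity=0.99, false_positive_rate=0.03)
precision = tp / (tp + fp)
print(f"True positives flagged:  {tp:.0f}")
print(f"False positives flagged: {fp:.0f}")
print(f"Share of flagged people actually ill: {precision:.1%}")
```

Even with a highly sensitive screen, the rarity of true cases means roughly 97 percent of the people flagged are false positives, which is the same 3-in-100 yield as the advertising analogy.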

Security theater. Schneier gets credit for coining the term “security theater” in his 2003 book “Beyond Fear.” The idea is that government agencies and others have an incentive to show they are “doing something” to fight the crisis, even if a security measure doesn’t actually improve security. A similar idea arises in the sixth provision, quoted above, of “In Defense of Freedom”: “We should resist the temptation to enact proposals in the mistaken belief that anything that may be called anti-terrorist will necessarily provide greater security.”

Applied to the coronavirus pandemic, Americans should be alert that certain proposals may actually be “public health theater.” For instance, officials may be tempted to have visible teams spray disinfectant on city streets, to show the government’s commitment to stopping infection. The science seems to show, however, that such disinfection has very little effect and may even backfire. In short, government officials have great incentives to show the public they are doing something, anything, to address the crisis. It is up to constructive critics in the press, academia and elsewhere to call out any such government action when they see it.

More specifically, it may become tempting for government agencies in the U.S. to seek access to location databases as a claimed way to battle the coronavirus. The usefulness of databases for tracking contacts, however, is open to serious doubt. The underlying data is often inexact. Even where two dots on the map appear close to each other, they may be on different floors of an apartment building or on opposite sides of a wall. In addition, there are innumerable other reasons why contagion may not occur between the owners of two phones in apparent proximity to each other. In short, absent an empirical showing of accuracy and actionability that does not exist to date, calls for such location tracking quite possibly are security theater rather than actual security.

Consider how the actions will look in retrospect. Soon after 9/11, major telecom providers such as AT&T and Verizon agreed to provide vast amounts of call detail records to the federal government. The companies’ decisions were understandable in context, to help respond to the newly prominent terrorist threat. By 2006, the secret programs were described in USA Today. By 2008, amid sharp public criticism, the companies faced tens of billions of dollars in legal liability for unlawful actions. Congress eventually stepped in to provide the companies immunity from those lawsuits, but this history is a vivid lesson to company executives that actions taken during a crisis can subsequently come under harsh criticism.

Joel Brenner first used the term “declining half-life of secrets” in a 2011 speech. (I subsequently wrote about the topic, as applied especially to secret government programs.) The experience of telecom providers, and the difficulty in preserving the secrecy of surveillance programs, are strong reasons for transparency about novel databases and surveillance programs during the current crisis. In addition, such transparency will better support user trust than another round of secret surveillance followed by press revelations.

Warrantless wiretaps. Many details have now emerged about the aggressive, unprecedented surveillance programs that the government undertook after 9/11 in the war on terrorism. One vivid example was “warrantless wiretaps”—wiretaps conducted without the legally required judicial oversight. Over time, courts ruled that a number of these new programs violated the law. After reviewing this history, the President’s Review Group on Intelligence and Communications Technologies, on which I served, recommended numerous reforms, including cancellation of the call detail program discussed above. Congress, with bipartisan support, adopted this and many other new checks and balances on government surveillance in the 2015 USA Freedom Act.

I have no basis for saying that any such illegal surveillance is occurring today. The experience after 9/11, however, teaches the importance of keeping watch for the possibility of lawless government action.


While there are many differences between the crises of 9/11 and today, there are hard-won lessons about security and privacy that can be gleaned from the recent past. Many urgent measures are needed to confront the coronavirus pandemic. Let’s do so mindful of the criteria for which measures actually make sense.

Peter Swire is the J.Z. Liang Chair in the Georgia Tech School of Cybersecurity and Privacy, and Professor of Law and Ethics in the Georgia Tech Scheller College of Business. He is Senior Counsel to Alston & Bird LLP, and Research Director of the Cross-Border Data Forum. He served as one of five members of President Obama’s Review Group on Intelligence and Communications Technologies.
