
Contact-Tracing Apps: What’s Needed to Be an Effective Public Health Tool

Susan Landau
Tuesday, January 19, 2021, 11:39 AM

Many have discussed the shortcomings of contact-tracing apps during the pandemic. The real problem is the lack of adequate social and public health infrastructure in the U.S.

The main entrance to the Washington Hospital Center in D.C. (Medill DC; CC BY 2.0)

Published by The Lawfare Institute

Jane Bambauer and Brian Ray recently argued that “Covid-19 Apps Are Terrible—They Didn’t Have to Be.” In their view, a false sense of priorities—most notably, a “fetishized notion of individual privacy”—kept the contact-tracing apps from working well and protecting us. But I’d argue the problem is not that simple.

Bambauer and Ray have much of the big picture right. They’re correct that once the pandemic ends, policymakers around the world will need to revisit what was done right and what was done wrong. As Bambauer and Ray note, policymakers will need to carefully consider trade-offs between individual privacy and public health. (I have more to say on this topic in a book coming out in April.) But the authors miss the fundamental reasons why contact-tracing apps have failed to take hold in the United States and in Europe, and they end up scapegoating privacy protections as the reason the apps failed.

Let’s start with South Korea, which is where Bambauer and Ray’s paper begins. They note that South Korea avoided lockdowns and claim that the country’s use of cell site location information (CSLI), closed-circuit television (CCTV), credit card data and old-fashioned interviews enabled better contact tracing than the U.S. managed. That’s correct—but it’s only part of the reason why digital tools for contact tracing have been more effective in South Korea than in the U.S. or Europe.

South Korea’s first reported case of the coronavirus was recorded the same day the first reported case cropped up in the United States. South Korea’s use of contact tracing worked well. It also led to some serious invasions of privacy. Initial efforts, for example, involved public websites detailing the movement patterns of infected individuals. Later on, the government walked back some of the more privacy-invasive measures. The U.S., by contrast, did not draw from all those different streams of information; contact-tracing apps and human tracers were left to do the work without CSLI, CCTV and credit card data. But digital support of contact tracing was not the only difference between South Korea and the U.S.

From the start, South Korea took the coronavirus seriously. This was in part because of its experience with MERS, in which a single ill person infected 28 others and caused illness in four hospitals, leading to a total of 185 infections. Coronavirus testing was ramped up quickly (far faster than in the U.S.), as was contact tracing. Post-MERS, the South Korean government changed regulations so that in the event of a new infectious disease, diagnostic testing equipment could be approved rapidly. The nation also changed its laws. While the 2011 Personal Information Protection Act bans the collection, use, and disclosure of personal information without consent, the 2015 Infectious Disease Control and Prevention Act allows South Korea’s Ministry of Health and Welfare to quickly access CSLI and credit card records during an infectious outbreak.

A nondigital aspect of South Korea’s response was also quite important: People in the country exhibited a willingness to isolate, avoid crowds and wear masks. Early on in the pandemic, months before the medical establishment fully understood that the coronavirus spreads from person to person through airborne transmission, 50 percent of South Koreans reported postponing or canceling social events, 42 percent avoided crowded places and 63 percent wore masks outside the home. Even today, after hundreds of thousands of Americans have died from the illness, there isn’t widespread adoption of similar behavior in the U.S.

Bambauer and Ray point to South Korea’s response, one that steamrolled over potential privacy protections, and blame Google, Apple and privacy advocates for creating a situation in which the apps didn’t provide sufficient information to public health authorities. But that argument has some problems.

For one, the authors get some facts wrong. They say the two companies have reputational problems because of their “aggressive and well-documented use of personal information.” As a general matter, Google does have a privacy problem (full disclosure: I worked at Google as a senior staff privacy analyst in 2013-2014); but Apple largely does not (a source cited by the authors gives Apple an A+ on privacy).

But more importantly, the authors’ broader critique fails to tackle the most significant problem for contact-tracing apps. The apps were always playing catch-up against the backdrop of the health equity issues that the pandemic sharply exposed.

First of all, contact tracing works based on trust. Whether it is contact tracing for HIV/AIDS in the United States in the 1980s or 2010s, or Ebola in Monrovia in 2015, establishing trust is critical for contact tracing to be effective. Contact tracers do their jobs by asking infected patients what they need, how they can be helped, whether they can safely isolate, and so on. Contact tracers will arrange for someone to get tested, have social workers visit, or have food provided if needed. Apps don’t provide those services. Establishing that personal connection and offering to provide help builds trust—and that enables contact tracing, which is privacy invasive, to work.

The U.S. is particularly prone to significant trust issues, especially among disenfranchised communities. Racist actions by the government—including redlining neighborhoods and placing polluting facilities near Black neighborhoods—and by the medical profession have resulted in poor health outcomes for Black Americans. This has understandably led to the Black community’s distrust of government and medical experts—and it may well lead to the same for the apps.

Data from other countries indicates that contact-tracing apps struggled to gain traction in minority communities. In the U.K., for example, a study of the use of contact-tracing apps showed a markedly lower uptake in the racially mixed neighborhood of Newham than in the largely white Isle of Wight. It’s possible that this was because people on the Isle of Wight had had experience with a previous contact-tracing app and thus could make better use of it. But the disparate results may also stem from distrust of government and the medical establishment among Newham’s significant Black, Asian and ethnically diverse population. As Christy Lopez, Laura Moy and I discussed earlier in the pandemic, there are multiple reasons why contact-tracing apps may not serve marginalized communities particularly well—and thus why uptake may be low.

To work well, contact tracing requires intimate knowledge about a community’s habits, and this is something the apps aren’t able to take into account. Contact tracers at the Fort Apache Indian Reservation, for example, know that families live in multigenerational homes but children often visit their other grandparents who live outside of the home. The contact tracers helped to keep death rates low at the reservation by asking about “the other grandparents” and then visiting them to check their health. The apps have not been built to take into account the different ways different cultures live and the different ways the disease might spread across these diverse communities.

Most crucially, the U.S. apps also ran up against an intractable problem: a broken domestic health-care system. For a contact-tracing app to be efficacious, it needs to exist within a well-functioning public health system. An app notifying of potential exposure is useful only if the notified person can isolate quickly and easily. If a Swiss citizen receives an exposure notification from the SwissCovid app, for example, the Swiss government will subsidize staying home from work during the isolation period. The U.S. doesn’t offer similar support, and so faced with a positive exposure notification, many low-income workers would go to work and then infect others. The problem isn’t with the app; it’s with the lack of an underlying social infrastructure to support potentially infected people.

Bambauer and Ray do not discuss health equity issues at all, yet these are a crucial underlying aspect of this disease. I agree with Bambauer and Ray that we need to reexamine our laws and policies so that we’re better equipped to handle the next pandemic; South Korea’s excellent response is a result, in part, of the changes it made after its MERS debacle. But we should be careful not to put too great an expectation on technology in preventing disease spread. The real problem is the country’s lack of appropriate social and public health infrastructure.

Susan Landau is Bridge Professor in The Fletcher School and the Tufts School of Engineering, Department of Computer Science, Tufts University, and is founding director of Tufts’ MS program in Cybersecurity and Public Policy. Landau has testified before Congress and briefed U.S. and European policymakers on encryption, surveillance, and cybersecurity issues.
