
The U.K.’s Plan for Electronic Eavesdropping Poses Cybersecurity Risks

Susan Landau, Matt Blaze, Steven M. Bellovin
Thursday, January 8, 2026, 10:30 AM
The U.K. government’s latest attempt to access encrypted cloud backups could allow adversarial actors to gain access to sensitive data.
An Apple iPhone on a table. (Thom, https://tinyurl.com/mrpdpat6; Unsplash license, https://unsplash.com/license)

About a year ago, the world learned of extensive intrusions into U.S. telecommunications networks, ultimately attributed to China. That was only the beginning of an investigation that led to the discovery that the Chinese government had penetrated the networks of at least 80 nations around the globe. Not only did China access phones used by the Trump and Harris presidential campaigns in 2024; it also indiscriminately collected information on U.S. citizens. One former senior FBI official estimated that China had collected data on virtually every American.

In response, the Australian, Canadian, New Zealand, and United States governments issued communications guidance that recommended, among other measures, using end-to-end encryption, a method that secures communications so that only the message’s sender and receiver can view the unencrypted contents. One nation notably abstained from issuing this guidance: the United Kingdom.

Indeed, the U.K. appears to be more preoccupied with criminal actors than with the People’s Republic of China and other sophisticated, adversarial nation-states. Or so it would seem given the most recent salvo by the U.K. government regarding the use of Apple’s Advanced Data Protection (ADP) for iCloud, a technology that secures data in the cloud so that no one but the user—not Apple, and not law enforcement—has unencrypted access to it. As one of us wrote last year, the Washington Post reported that, operating under an Investigatory Powers Act Technical Capability Notice (TCN), “UK security officials demanded that Apple provide access to encrypted iCloud material regardless of the data’s location.”

The attempt was met with strong objections. Technologists observed that such an action would undo the important security advances that Apple had built in, not just for the user being targeted, but for all ADP users. Civil libertarians decried the order as an unreasonable invasion of privacy and security. And the U.S. government objected on the basis of the aforementioned criticisms, as well as out of concern over U.K. government secret orders against U.S. companies.

In response, the U.K. government backed off—but only somewhat. In September, the Financial Times reported, the U.K. government ordered Apple to enable “access to encrypted cloud backups of U.K. citizens.” This order is more limited than the original, which sought such access for anyone for whom the U.K. made a legal request. But the new order—which on the surface appears more reasonable—is not.

Apple is not allowed to report that it has received a TCN, so details of the actual order are not public; the Financial Times piece provides the most definitive reporting currently available. But there is a troubling lack of clarity that makes it difficult to understand exactly what the U.K. government is requiring. 

It’s useful to unpack the Apple security technology at issue. ADP, which allows users to store encrypted data in the cloud in a way that is decryptable only on the user’s devices, is based on end-to-end encryption. ADP is Apple’s way of extending the end-to-end encryption model from communications to data storage. In storing data in iCloud, a user is effectively sending an encrypted message to themselves and then receiving it sometime later on one of their Apple devices. Meanwhile, only encrypted data sits in iCloud, ready to be delivered back to the user who put it there in the first place.

In the ADP system, one of the user’s devices chooses the encryption key, a security protection that prevents anyone but the user’s devices from decrypting protected data. (If the user loses access to the key, they are out of luck. No one else has a key to enable decrypting the message, a situation that occurred recently in a different use of such systems.) It is this aspect of the system to which the U.K. government objects and wants changed: They want access to the key, too.
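To make the encrypt-to-self model concrete, here is a minimal sketch in Python. It is a toy, not Apple’s implementation: the hash-based stream cipher stands in for the authenticated encryption a real system would use, and a dictionary stands in for iCloud. The structural point is what matters: the key is generated on the user’s device, the cloud holds only ciphertext, and decryption requires the device-held key.

```python
# Toy model of end-to-end encrypted cloud storage (NOT Apple's actual ADP
# design): the device generates a key that never leaves it, and the cloud
# stores only ciphertext.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key + counter (toy cipher, not for real use)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

# The user's device chooses the key; the cloud operator never sees it.
device_key = secrets.token_bytes(32)

# The "cloud" holds only ciphertext.
cloud = {"backup.bin": encrypt(device_key, b"contacts, photos, messages")}

# Another of the user's devices, holding the same key, recovers the data;
# the cloud operator (or a government serving it with an order) cannot.
assert decrypt(device_key, cloud["backup.bin"]) == b"contacts, photos, messages"
assert cloud["backup.bin"] != b"contacts, photos, messages"
```

This is why the U.K. demand cannot be satisfied by serving Apple with a warrant: in this architecture, Apple holds nothing it could decrypt.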

There are a few things to note about ADP. First, in reality, ADP matters most for mobile devices: iPhones, iPads, and laptops. If all you have is an Apple desktop, you don’t have much reason to use iCloud, since it is most useful for synchronizing and sharing content between devices. Mobile devices are, of course, most likely to cross borders, and therefore most likely to back up their contents to iCloud, making them the primary focus. Second, ADP is not a property of an individual device but, rather, is linked to what is known as an Apple Account (formerly known as an Apple ID): a single email address that links all your devices and purchases. This is a requirement to allow all your devices to have access to the encrypted content. This in turn means that you can’t enable ADP for, say, your Apple desktop but not your iPhone, or vice versa.

The vague language in the Financial Times article raises a series of questions: Does the U.K. government really want to impose its requirement on citizens, as opposed to residents? Does the order mean that if a U.K. citizen buys a device outside the U.K., they must declare having done so when they return to the U.K., thus triggering a notice to Apple to disable ADP for all of the user’s devices? Does it mean that if an American buys a device in the U.K., they would have to show their passport in order for ADP to be enabled? What’s the requirement for people with dual citizenship? Or for a U.K. citizen living abroad? Do noncitizen residents, to whom this order seemingly doesn’t apply, have more rights in the U.K. than British citizens? And if I sell my used device to a U.K. citizen, who is responsible for ensuring that the device now complies with the TCN?

Answers to these questions affect how the order would be implemented. In this piece, we assume for the sake of simplicity that the TCN order is intended to apply to U.K. residents; this avoids many of the complexities and ambiguities discussed above. iPhones can, after all, infer where their owners live by noting where they spend a significant amount of time.

There are two possible approaches the U.K. government could take to achieve access to device content in unencrypted form. The first is what is generically known as “exceptional access” (sometimes called “key escrow”), in which copies of users’ decryption keys are held by a “trusted” third party for potential future use by the government. The other is simpler: disabling ADP access for U.K. residents’ phones, possibly wherever they are in the world. Both pose difficult, though different, issues.

Third-party exceptional access mechanisms have sometimes been proposed for enabling government access to encrypted communications. But it has been known for decades—ever since the era of the Clipper Chip in the early 1990s—that this is simply not a viable or practical solution. It becomes especially problematic from an international standpoint. Imagine trying to implement different exceptional access restrictions imposed by each of the 195 nations in the world—or even just 15. Even if the conflicting requirements inherent to such a system could be accurately specified and then resolved, the complexity of the software and supporting mechanisms required to implement it would be enormous and fraught with the potential for error.

Such a central key repository is also very likely to introduce serious security problems, for several reasons. One reason is that the database of decryption keys then becomes a hugely attractive target, especially to state actors. Salt Typhoon and Volt Typhoon, the hacks by Chinese government actors into worldwide telecommunication networks and U.S. critical infrastructure, demonstrate how powerful such attackers are.
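The single-point-of-failure structure can be sketched in a few lines. This is a deliberately simplified, hypothetical escrow (the names are illustrative, not from any real system): a copy of each user’s key is deposited centrally, so one breach of the repository yields every user’s key at once, with no per-user effort.

```python
# Hypothetical key-escrow sketch showing why a central key repository is a
# single point of failure. Nothing here models a real deployment.
import secrets

users = ["alice", "bob", "carol"]

# Each user's device generates its own decryption key...
device_keys = {user: secrets.token_bytes(32) for user in users}

# ...but under escrow, a copy of every key is also deposited centrally.
escrow_db = dict(device_keys)

# An attacker who breaches the escrow database needs no per-user attack:
# one compromise yields every user's key simultaneously.
stolen = dict(escrow_db)
assert all(stolen[user] == device_keys[user] for user in users)
```

Contrast this with the non-escrowed design, where an attacker must compromise each user’s device individually; the escrow database collapses that cost to a single target.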

Just adding the U.K. requirements dangerously increases complexity. A special government-only decryption key is a deliberate breach of the security barrier that encryption provides; no one knows how to do this in a way that ensures only legally authorized parties have access to the data. Because of the complexity of designing such a system, multiple studies have recommended that before mandating escrowed encryption keys, governments should pilot such systems at scale to learn whether they can work, and what the problems might be. But the U.K. government has made no such effort.

The second approach is to block U.K. residents from enabling ADP—it is (currently) disabled by default, so users have to explicitly request it. Someone in the U.K. purchasing an iPhone—phones can tell where users are in several different ways—would not be able to turn on ADP. The iPhone might simply not show that button, or the ability to activate the capability would be disabled. This could be based on a previous phone’s knowledge of where its user’s home is (many iPhones are simply replacements for older ones) or on where the iPhone is physically when its Apple Account is created. So far, so good. But what if a visitor buys an emergency replacement phone because their iPhone broke while traveling? Do they have to show their passport to be allowed to enable ADP? If so, to whom?

Or suppose a visitor plans an extended stay in the U.K. Does ADP get disabled when the visitor has been in the new location long enough (however long that is)? Or is it disabled the first time their devices go online in the U.K.? The latter might seem desirable to the government, since it would handle the case of visits by foreign terrorists (at least those who didn’t realize they could simply disable iCloud backups in the first place rather than risk government access to their data). But it would affect all visitors, including Americans, something the U.S. government has objected to. It could even affect foreign journalists visiting a liberal democracy whose behavior, in this respect, looks more like that of Russia or the People’s Republic of China.
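To see why these ambiguities matter at the engineering level, consider a sketch of the gating logic such an order seems to require. Nothing here is Apple’s code; the field and function names are hypothetical. The point is that every branch encodes a policy choice the reported order leaves unstated, and the software cannot be built until someone decides each one.

```python
# Hypothetical sketch of region-gating logic for the ADP toggle. All names
# and rules are illustrative assumptions, not Apple's implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Account:
    inferred_home_region: Optional[str]  # carried over from an older phone, if known
    creation_region: str                 # where the account/device was set up

def adp_toggle_visible(account: Account) -> bool:
    # Hide the ADP toggle if either signal points at the U.K.; each branch
    # below is a policy decision the reported order does not settle.
    if account.inferred_home_region == "GB":
        return False  # U.K. resident, even while temporarily abroad?
    if account.inferred_home_region is None and account.creation_region == "GB":
        return False  # new phone bought in the U.K.: resident, or stranded visitor?
    return True

# A phone set up in the U.K. with no history loses the toggle, even if the
# buyer is a visitor replacing a broken phone.
assert not adp_toggle_visible(Account(None, "GB"))

# A U.S. resident's replacement phone bought in London keeps ADP here, but
# only because this sketch chose to privilege the home-region signal.
assert adp_toggle_visible(Account("US", "GB"))
```

Each assertion above reflects one answer to the questions posed earlier; a different, equally defensible answer flips the behavior, which is exactly the problem.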

In either case, users would want ADP to be automatically reenabled when they left the U.K. Doing so isn’t simple. Per Apple’s documentation, enabling ADP requires setting up a recovery method: a “recovery key” that can be printed out, or a recovery contact whose Apple Account can help restore access. In other words, manual steps are required. One could argue that the U.K. government should have an Apple Account holding the recovery key for all phones. That isn’t feasible, because the recovery key can be created only when ADP is turned on; it isn’t accessible otherwise.

But even if it were workable, imagine the international situation where there’s a separate recovery key holder for every country the user visited. This has all the weaknesses of a central exceptional access database: It becomes a magnet for spies and other attackers. And every such country the user has ever visited—because when would such records get deleted, and how?—would be a possible target.

Matters grow even worse if the TCN does invoke citizenship. What of people with dual citizenship, who could simply evade the restriction by showing a non-U.K. ID when purchasing an iPhone? Of course, the U.K. could play the “whack-a-mole” game by requiring every purchaser of an iPhone to certify that they are not a U.K. citizen. Such a solution would do little to prevent criminals, terrorists, and other bad actors from circumventing the U.K. limitation on access to encrypted iCloud data by presenting IDs from other countries; fake passports are part of the tradecraft of every major intelligence agency. It also fails to handle the case of U.K. citizens living abroad. Solving that one is largely infeasible, as it is highly unlikely that the U.K. could impose such requirements on iPhone sellers and resellers outside the U.K.

The United Kingdom’s first attempt to undermine ADP was issuing a Technical Capability Notice ordering Apple to provide access on demand to encrypted iCloud material regardless of the data’s location. This was an extraordinary extraterritorial request. In other words, if the U.K. government were investigating a person anywhere in the world, Apple was supposed to be able to return unencrypted iCloud material. The U.S. company responded by shutting off ADP within the U.K., first for new users and then for all.

Undoubtedly, that change makes it easier for U.K. investigators to gather evidence on unsophisticated criminals, but it does little against capable criminals and nation-state actors. The U.K. action makes it more difficult for individuals, businesses, civil society, and others within the nation to protect themselves—a serious mistake given the increasing threats they face.

U.S. government objections to the U.K. government’s initial demand induced His Majesty’s government to scale back its request. But for the reasons we’ve already stated, this new policy attempt makes little more sense than the original. Sometimes, the most reasonable action for a government is to retreat and acknowledge a mistake in the face of a security situation that turns out to be more complex than originally understood. The U.S. government did so belatedly in the late 1990s with the Clipper Chip; let’s hope the U.K. government can turn around much more quickly than that.


Susan Landau is Professor of Cyber Security and Policy in Computer Science, Tufts University. Previously, as Bridge Professor of Cyber Security and Policy at The Fletcher School and School of Engineering, Department of Computer Science, Landau established an innovative MS degree in Cybersecurity and Public Policy joint between the schools. She has been a senior staff privacy analyst at Google, distinguished engineer at Sun Microsystems, and faculty at Worcester Polytechnic Institute, University of Massachusetts Amherst, and Wesleyan University. She has served at various boards at the National Academies of Science, Engineering and Medicine and for several government agencies. She is the author or co-author of four books and numerous research papers. She has received the USENIX Lifetime Achievement Award, shared with Steven Bellovin and Matt Blaze, and the American Mathematical Society's Bertrand Russell Prize.
Matt Blaze is the McDevitt Chair of Computer Science and Law at Georgetown University, where he studies problems at the intersection of technology, law, and public policy. His research interests include surveillance, election systems, security, cryptography, and large scale systems and networks. Prior to joining Georgetown, he was a professor of computer and information Science at the University of Pennsylvania, and prior to that a Distinguished Member of Technical Staff at AT&T Labs - Research. He holds a PhD in computer science from Princeton.
Steven M. Bellovin is the Percy K. and Vidal L. W. Hudson Professor Emeritus of Computer Science at Columbia University and former affiliate faculty member at Columbia Law School. He is currently a Senior Affiliate Scholar at Georgetown University's Institute for Technology Law & Policy.
