Published by The Lawfare Institute
There is a bill moving rapidly through the U.K. Parliament that poses a significant threat to data security and privacy in the U.K. and beyond. It is ill considered and should be amended substantially before it moves forward.
The bill is flawed in several respects, as some observers have pointed out. This piece focuses on certain elements that we think will stifle innovation and substantially hinder the efforts of private companies to enhance, or even maintain, core security and privacy products, features, and architecture, especially with respect to the use of encryption. To be sure, governments in democratic countries face challenges in accessing the content of communications of spies, terrorists, and other threat actors. They need help. But these purported solutions in the bill aren’t the right way to do it.
Specifically, the proposed amendments to the 2016 Investigatory Powers Act would give the U.K. government, at the sole discretion of the secretary of state for the Home Department (Home Office), the power to require a company to notify the government of new or changed products or features before the company could launch them. This mandate could be issued without consulting privacy regulators or anyone else in a position to opine on proportionality or other considerations, let alone any judicial review.
Following receipt of a “Notification Notice” (yes, that’s actually what it is called), the U.K. government could use existing powers to require that the company meet surveillance capability demands as a condition of making a product or feature available. Those demands are left to the government’s discretion and could include, for example, disabling security and privacy protections such as encryption and user access controls. If the government’s demands are not met, the company may have no choice but to abandon the product or feature launch, giving the government what amounts to a veto over how companies innovate and improve their products. (The government could even block a company from deprecating a service or deleting data.) All of this happens in secret, with the company prohibited from disclosing it unless the government permits. The 2016 act already purports to be enforceable against non-U.K. companies, and the amendments extend that extraterritorial reach to data retention notices and these new notification notices, exacerbating the challenges companies face. Paired with the gag order that accompanies each notice, this means, among other things, that a non-U.K. company cannot alert its home government to a demand, even one that violates the home government’s law, foreclosing any diplomatic assistance.
The Home Office has been very explicit that the purpose of the amendments is to “ensure continuity of lawful access to data against a background of changing technology.” It’s understandable that the U.K. intelligence and law enforcement agencies would like to know about a company’s research and business plans, and have a say in whether and how a company makes a change that has serious implications for their weighty missions. Both of us have worked in law enforcement, and we know how important, and how difficult, the jobs of public safety officials are. There’s no reason to think that the intentions behind the bill are anything but noble. This proposed power, however, goes too far and is counterproductive.
First, no case has been made that this extraordinary power would solve any existing problem. Most providers are quite transparent about product launches, feature additions, and removals. Many companies hold entire conferences to loudly trumpet what is coming, or at least issue announcements through blog posts and press releases. In addition, there’s no shortage of dialogue between the U.K. government and technology providers. In October 2023, U.K. security officials and their Five Eyes partners (the United States, Canada, Australia, and New Zealand) made a high-level and highly publicized visit to meet with technology companies in Palo Alto, California, to discuss a range of security topics, including espionage threats from China. On top of there being no clear problem to solve, the amendments could chill companies from engaging with the government in this otherwise healthy exchange about technological innovations, for fear of inviting a notification notice. The open, cooperative dynamic is at risk of being replaced by one that is defensive and adversarial.
Second, this new product approval regime could harm British users and other users around the world. A company that ultimately must capitulate to the surveillance demands of the government may end up offering services that are less secure generally and susceptible to compromise by bad actors, state sponsored or otherwise. The U.K. might, as a result, have its narrow surveillance needs met at a particular moment in time, but at a great cost to those users specifically and to cybersecurity generally. One of us has testified to Congress, and the other has written at length, about the importance of encryption, for example, in enhancing cybersecurity for society, while also working to find a more effective path forward for everyone. This bill, if enacted, could easily be used to stifle the increased use of encryption to protect data security and privacy.
Third, enacting this bill would seemingly legitimize this heavy-handed approach for countries less steeped in the rule of law and with a lower regard for human rights. Should the current version of the amendments pass, even if U.K. authorities adhere in exemplary fashion to human rights and privacy concerns, other security services, especially in authoritarian-leaning countries, will not. They could endeavor to replicate the U.K.’s secretive power in order to undermine product security for their own aims, not only to surveil users but also to censor their communications. No country should assume it will be the sole beneficiary of this new power to control and direct product development. The power is purportedly designed by the U.K. and for the U.K., but the resulting insecurities will be there for any actor to exploit, if they can find them.
The proposal also runs counter to other efforts by numerous governments—including the U.K.’s—to urge the private sector to find better ways to substantially enhance cybersecurity on a more sustainable basis. Instead of doing that, the bill as currently drafted jeopardizes data security and privacy in pursuit of the understandable goal of advancing law enforcement and intelligence agencies’ legitimate objectives. But no one needs a law that could limit future progress on much-needed security enhancements, such as the increased use of encryption. The bill needs to be fixed.