Published by The Lawfare Institute
in Cooperation With
On Tuesday, Deputy Attorney General Rod Rosenstein gave remarks about encryption at the U.S. Naval Academy. His speech as prepared for delivery can be read here.
It’s an interesting speech for a variety of reasons, not least of which are his comments on “sunshine patriotism” and the rule of law. But the central theme of his remarks takes up the mantle of former FBI Director Comey on the subject of “Going Dark,” where law enforcement has the legal authority, but not the technical ability, to access encrypted communications. Unsurprisingly, Comey’s premature departure did not resolve law enforcement’s difficulty in accessing encrypted devices; apparently the FBI alone confronted 7,500 such devices over the past year.
But while the thrust of Rosenstein’s message is similar to Comey’s, his tone is markedly less conciliatory. Rosenstein explains:
The approach taken in the recent past — negotiating with technology companies and hoping that they eventually will assist law enforcement out of a sense of civic duty — is unlikely to work. Technology companies operate in a highly competitive environment. Even companies that really want to help must consider the consequences. Competitors will always try to attract customers by promising stronger encryption.
That explains why the government’s efforts to engage with technology giants on encryption generally do not bear fruit. Company leaders may be willing to meet, but often they respond by criticizing the government and promising stronger encryption.
Of course they do. They are in the business of selling products and making money.
Rosenstein is right. Nobody ever sold a smartphone on the promise that it is more accessible to U.S. law enforcement with a court order than the one sitting next to it in the store. Technology companies have a significant financial interest, both independently and collectively as an industry, in being publicly seen to thwart U.S. law enforcement, especially if they intend to sell devices overseas.
As Rosenstein also notes, these same companies typically have little problem complying with significant and illiberal restrictions around the globe, or building capabilities that are technically equivalent to a “backdoor” when it is in their business interests to do so. Vociferous objections to U.S. law enforcement feel somewhat less principled when the same company is facilitating censorship in China, building access mechanisms secured with master keys to deliver security updates, and complying with data-localization laws that let foreign governments surveil users in those countries by housing their data in in-country datacenters.
But while technology companies have been disingenuous in their public reasons for opposing proposed “Going Dark” solutions, government officials in the U.S. and elsewhere do not do themselves any favors by allowing significantly different topics to be bundled under the same banner of “encryption” writ large, and thus allowing the discussion to become deeply confused.
For example, “device encryption,” used to prevent access to a device without the user’s PIN code, does not operate in a similar way to end-to-end communications encryption applications, which prevent active wiretapping of communications. Neither is similar to the specialist anonymizing software used by criminals to access hidden child pornography and drug markets on the web. And HTTPS, the encryption used to keep communications with ordinary websites secure, is fundamentally different again.
To be sure, each of these uses “encryption” at some superficial level, and all cause different headaches for law enforcement. Yet each category poses very different challenges to investigations, provides dissimilar security benefits to users, and has surprisingly unrelated options, alternatives, and trade-offs for any proposed path forward for law enforcement or technology companies to adapt to their respective challenges.
Consider the common argument levied against government-mandated access schemes that the government could instead use software vulnerabilities to “hack past” the encryption. This might be a solution of sorts for end-to-end communications encryption, and is probably the most workable answer to anonymizing browsers used for accessing hidden illicit sites. But “hacking past” simply isn’t an alternative for device encryption. If a device is powered off next to a dead body at a crime scene, there is simply no way to bypass the encryption and look for clues on the device as to the perpetrator of the crime against its owner. It cannot be hacked past. The device is inaccessible: no amount of money, deep thinking, zero-day vulnerabilities or supercomputers will open it.
Similarly, arguments that government-mandated access schemes will force criminals to use other apps may hold some water in the context of end-to-end encryption applications. There are lots of communications apps; it is easy to install and switch between them, and many of their developers are small businesses overseas that the U.S. government cannot efficiently regulate.
In the context of device encryption, the reverse is true. Device encryption is not an app; it is a feature of the operating system. It is impractical for most people to change operating systems on a smartphone, and in practice nobody ever does. There are only a handful of key operating system manufacturers in the world, and all of them are multibillion-dollar tech companies based in the United States.
In short, discussions that blend different categories of encryption together inevitably end up deeply confused, with arguments that apply to one category that make no sense when applied to other categories of encryption.
There are safe, technical solutions to specific, narrow categories of technologies within the “Going Dark” umbrella that pose specific, narrow problems to law enforcement. For example, regulation in the narrow domain of device encryption could be achieved with trivial security trade-offs and with cryptographically enforced transparency to provide oversight and prevent misuse. It could also be done in a way that allows devices to be decryptable only by law enforcement in the country in which the device was sold, thus resolving the bulk of the “international” question. But regulating encryption writ large, in a way that would encompass technologies like HTTPS, would be unworkable, unnecessary, and grossly unsafe for users. By allowing the conversation to take place at the 10,000-foot level, the government only sabotages its ability to drive some topics within “Going Dark” to a reasonable, safe, and accessible solution.
So Rosenstein is right. He shouldn’t expect Silicon Valley to help the government. The financial incentives stack up for private industry to oppose the government loudly and publicly on any issue involving law enforcement access to their devices, completely independent of any real security or civil-liberty concerns on the topic. But if the conversation is to move forward without them, law enforcement needs to begin by recognizing that this isn’t one debate about encryption. It’s a series of different debates about specific technologies, each with surprisingly different options for paths forward and different costs and benefits to users and law enforcement.
If Rosenstein wants the “Going Dark” debate to move forward, he’ll need to start by deciphering the debate itself.