Encryption, Biometrics, and the Status Quo Ante

Paul Rosenzweig
Monday, July 6, 2015, 10:29 AM

As the recent indirect debate between FBI Director James Comey and co-blogger Susan Landau makes clear, the underlying premises of the encryption issue are highly contested. The Senate will continue the debate this week with a hearing before the Judiciary Committee on July 8 at which our own Herb Lin (among others) will testify and another hearing before the Intelligence Committee.

Published by The Lawfare Institute
in Cooperation With

In general, it seems accurate to say that encryption is today a strong component of cybersecurity and, as those who wrote Congress noted in a letter I was pleased to sign, there is little prospect of an immaculate back door – that is, a door that only the FBI can open with appropriate legal process. Thus, we begin any discussion with a healthy dose of technological caution about the efficacy of what the government seems to be asking for.

But saying that does not close the discussion. Rather, I think, it sets the terms of the conversation by giving it a technological ground – that perfect back doors are not possible. But there is a technological flip side to the conversation that pro-encryption advocates too often disregard – the reality that properly implemented encryption is, for all practical purposes, uncrackable. The advent of public key encryption has effectively changed the public policy dynamic – never before in human history have we seen a means of communication that was theoretically immune from government interception and access. Even the vaunted Enigma machine yielded, in the end, to the analysis of Bletchley Park. But public key encryption is, in theory, beyond mathematical analysis. It is impenetrable. That’s a real change.

But today’s debate isn’t even about this technology. It’s about human nature. The change in encryption technology is now to be married to a verity of human behavior – the pervasiveness of human laziness. I mean that not as a joke but as a serious comment about the value of default rules and how they interact with human behavior. The significance of the change that we are seeing is NOT in the development of public key encryption – that has been around for decades. Rather, the change is in the default rule – it is the difference between having encryption always “off” as a default (such that people must turn it on for it to be effective) and always “on” (such that it works unless you turn it off). I have no data on how that change will affect government access to evidence of criminality, but my guess is that a default “on” rule will, once implemented and the transition complete, put something on the order of 80-90% of data at rest in an encrypted state.

To this must be added the growing body of law that protects encryption passwords against compulsory disclosure, as a consequence of the Fifth Amendment. In other words, the courts say (rightly, in my view) that you cannot be compelled to tell the government your password. So now, even when the data storage device can be seized, it cannot be decrypted, since the subject of the investigation is (presumptively) the only one who knows the passphrase and s/he has a privilege against revealing it. As a result, data about criminal activity will become increasingly beyond any means of securing through lawful process – that is, under a court order of some sort (whether a warrant or a subpoena). [Which, of course, does not mean that other methods of access – black bag jobs, espionage, or, in other countries, torture – aren’t available; it only means that you can’t get the data by going to a court for an order or a warrant, thereby weakening judicial control of law enforcement.]

And finally, there is a fourth change that lies behind the current discussion – a change not only in how information is stored, but where. With pervasive cloud storage of data, information that used to be kept in the privacy of the home is now maintained by third-party service providers (whether in encrypted or unencrypted form). As a consequence, the government may now go to the cloud provider to secure evidence, and the circumstances in which the government’s investigative efforts give notice to the subject of the investigation have narrowed – data collection that used to require notice to the subject (e.g., through service of a subpoena) can now occur elsewhere, without it.

What we see from this is a transformation along more than one axis of policy and law. Consider the state of play in, say, 1995, before the growth of the network and before the widespread deployment of strong encryption:

  • Data relating to criminal activity could not be maintained in an undecryptable format;
  • Access to data relating to criminal activity could be achieved through lawful process (either a search warrant or a subpoena); and
  • Data relating to criminal activity was maintained in the possession of the subject of the investigation, so access brought with it notice.

Today, the paradigm is shifting in all three dimensions:

  • Data can be maintained in an undecryptable format;
  • Data access through lawful process is often not possible, given Fifth Amendment constraints; and
  • When data is successfully accessed in the cloud, the subject does not get notice of the investigation.

How then to deal with this change in technology? One answer, of course, is to do nothing – I suspect that is what many of the encryption advocates would want. But that would be a changed circumstance – which is why Director Comey has reacted so strongly in public. Still, if an immaculate backdoor is technologically impossible, what then? Here’s one possible solution:

Mandatory biometric encryption.

In other words, allow strong encryption for data at rest, just as encryption advocates want, but require manufacturers to activate the encryption using a biometric key (such as a fingerprint or iris) rather than a pass phrase. The plan would have some real advantages:

First, biometrics are superior to passphrases as a form of access control. Everyone from the White House to Wired magazine agrees. Thus, as a general matter, data protected by a biometric would actually be more secure.

Second, biometrics, unlike passphrases, are almost certainly not protected by the Fifth Amendment. Hence the use of biometrics would restore the government’s ability to secure evidence of criminality through lawful process.

Third, access to data through a biometric would be far more likely to require direct interaction with the subject of the investigation, restoring the notice that formerly accompanied data access.

To be sure, neither law enforcement nor privacy advocates would fully approve this proposal. Law enforcement would lose the ability to gain surreptitious access to data through third party providers. Privacy advocates would lose the absolute secrecy that comes from perfect encryption. And American businesses may prefer not to have the obligation imposed upon them for competitiveness reasons (though they would, I think, be free to market non-biometric devices overseas). But most citizens would, I think, welcome restoring the traditional balance between privacy and security and a return to a system where access to evidence of criminality is achieved through legal means subject to judicial controls.

The proposal would, in effect, return us to the status quo ante – to a time when absolute data security was not technologically feasible, but when the government could not gain criminal evidence of this sort by surreptitious means. For myself, I dare say that effectively returning to the practical state of policy in 1995 would amount to adopting a rule of technical neutrality.

One final note of limitation: I am speaking here exclusively about the problem of endpoint encryption, or encryption of data at rest. The problem of data in transit poses different issues that require different solutions. In general, I am less concerned about default adoption of encryption in transit, because it conflicts with service providers’ business models and is therefore less likely to become pervasive.

Paul Rosenzweig is the founder of Red Branch Consulting PLLC, a homeland security consulting company and a Senior Advisor to The Chertoff Group. Mr. Rosenzweig formerly served as Deputy Assistant Secretary for Policy in the Department of Homeland Security. He is a Professorial Lecturer in Law at George Washington University, a Senior Fellow in the Tech, Law & Security program at American University, and a Board Member of the Journal of National Security Law and Policy.
