
Apple v. FBI Shows That Lawyers and Tech Speak Different Languages on Privacy

Timothy Edgar
Thursday, March 17, 2016, 3:35 PM

I recently appeared on the Brown University Cybersecurity News Podcast to discuss bridging the lawyer and technology divide in the debate between Apple and the FBI. Interested Lawfare readers can listen to the full audio.


Lawyers and technologists mean different things when they say the word “privacy.” For American lawyers, privacy begins with Louis D. Brandeis. Privacy is the “right to be let alone.” “Unjustified” intrusions on privacy are a violation of the Fourth Amendment. The Fourth Amendment protects against unreasonable searches, i.e., those without a warrant or some exception to the warrant requirement.

For technologists, privacy means the ability to have a secure conversation. In their paper explaining the first practical public-key cryptosystem, Ron Rivest, Adi Shamir and Len Adleman define privacy to mean that “Alice” and “Bob” can talk with a technical guarantee that no one else can listen in. Privacy is defined by logic and math. There is no wiggle room; a system is either secure or it is broken.

We see this disconnect mirrored in the language of the FBI Director James Comey, who speaks of legal process and rights, and that of Tim Cook, Apple’s CEO, who stresses security and the danger of undermining it.

The full transcript is here.

ALAN USAS: Tim, do you see a fundamental conflict at the heart of Apple v. FBI arising from the different ways lawyers and technologists view privacy? Fill us in on that conflict.

TIM EDGAR: One problem that we're seeing in the whole debate over access to encrypted communications is the different ways in which lawyers and technologists use words. And here, I'd like to talk about the word privacy. What does a lawyer mean when he or she says privacy, and what does a technologist mean when he or she says privacy? Because they mean two very different things, even though they're using the same word.

So, when I talk about this in my classes, I like to use the example of Louis Brandeis, who is the great Supreme Court justice, the author of The Right to Privacy [in the Harvard Law Review,] one of the greatest thinkers in American law on this topic. His famous dissent in Olmstead v. United States was all about this concept. In that case, the Supreme Court had to decide whether wiretapping was an invasion of a right protected under the Fourth Amendment. And that was in the late 1920s, [when] telephones were a new technology.

And most of the court just couldn't get their mind around the idea that this was a search, to listen into somebody's telephone conversation, because they said, well, you're not invading their home or searching their body. There's nothing physical being invaded. And this really rubbed Louis Brandeis the wrong way. Justice Brandeis said, no, no, an intangible invasion of privacy is a search under the Fourth Amendment, just like a tangible, physical search is. And it went all the way back to his earlier work, when he wrote that law review article, "The Right to Privacy."

So he talked about how privacy was the most comprehensive of rights. It was the right most valued by civilized people. It was the right to be left alone.

But then when he gets into talking about, what does that actually mean, he starts using wiggle words, which is something that lawyers always do. They talk about reasonableness. They talk about whether something is justifiable. He said, unjustifiable intrusions by the government on the privacy of the individual must be deemed a violation of the Fourth Amendment.

Well, what does he mean when he says unjustifiable? Well, of course, what he means is that the Fourth Amendment gives you a right against unreasonable searches-- not against all searches, just against those searches that the law deems unreasonable. And the most important thing when it comes to a search, for a lawyer is, has there been a warrant issued to justify the search? So that's what Louis Brandeis was talking about.

Now then I wanted to think about, well, what's an example for technologists of the word privacy being used in a similar, sort of seminal way? And I went back to a very famous paper published in 1978, by three famous technologists-- Adi Shamir, Ron Rivest, and Len Adleman. This is the famous RSA paper, the paper that made it possible to have public key cryptography in a usable and practical way. It had been already developed in theory, but Shamir, Rivest, and Adleman made it something that was really practical and usable. It's what underlies all of the privacy on the internet today.

So, they defined in their paper what they meant by privacy. They said, encryption is the standard means of rendering a communication private. And they introduced the famous characters of Alice and Bob. This is the first paper that talks about Alice and Bob.

So they said, how can Bob send a private message to Alice? That's the question. Without Eve listening in. And they defined privacy to mean that an intruder listening in on the channel cannot decipher any message, because it is not possible to derive the decryption key from the encryption key. That's what a technologist means when they think something is private. In other words, the system is either secure, in which case it's private-- Eve can't listen in on Alice and Bob-- or it's breakable, in which case it's not private.

So you see, there's no wiggle room in that definition. It's not a lawyerly definition. They're not talking about, well, can we prevent unjustifiable listening in on Alice and Bob? Well, we don't care whether something is justifiable or not. If you can listen in, it's not private. If you can't, then it is private.
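The property Edgar describes can be sketched in a few lines of code. This is a toy illustration of the RSA idea, not the paper's construction at real key sizes: the primes here are tiny and the scheme is trivially breakable, so it is for intuition only. Anyone can encrypt with Alice's public key, but decryption requires the private exponent, which is derived from the secret prime factors.

```python
# Toy RSA sketch (insecure, tiny primes -- for illustration only).
# The public key is (e, n); the private key is (d, n). Deriving d from e
# requires factoring n, which is what makes real RSA hard to break.

p, q = 61, 53              # two small primes, kept secret by Alice
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient of n: 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

def encrypt(m: int) -> int:
    """Bob (or anyone) encrypts with Alice's public key (e, n)."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only Alice, holding d, can decrypt."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message
```

In Edgar's terms, the definition is binary: if Eve cannot compute `d`, the channel is private; if she can (here, by factoring the tiny `n`), it is not. There is no "reasonable" middle ground in the math.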

So, what does that mean for FBI v. Apple? Well, you see this exact same dichotomy happening. You look at the FBI director, Jim Comey. He says, we're just trying to search somebody's phone with a warrant. We want Apple's help to do that. We're not looking for a back door. We're looking for a front door. We're going to go in with legal process, a judge's order, and say, please help us unlock this phone.

But then Apple looks at it and they say, oh, if we do that, we set a precedent that our phone can be broken into. It's no longer private. If we do it for you, Mr. Comey, even in the most justifiable of circumstances-- well, how do we draw the line when some other government asks us to do the same thing, maybe under a circumstance where we don't think it's justifiable? Once we've broken our phone, it's broken and we can be asked or forced to use that key over and over again.

So you see this fundamental clash playing out in Apple v. FBI, between the lawyer who says privacy is when you're able to protect your communications legally, where you need a judge's order and a warrant to invade that privacy. But technologists say, no, privacy is when you can protect your communications technically, where they can't be broken into. And the strongest definition of [technical] privacy-- actually, Apple fails [that definition.] Because in the strongest definition of privacy, even Apple can't break into it.

So technologists look at this case and they say, the problem here is that Apple created a phone that they could break. And the solution is for Apple to create a phone that they can't break. A lawyer would look at this and say, the problem here is that we don't have a way to break into communications when society needs to. But if we create that way, maybe we create insecurity for everybody else, and so this is a policy dilemma that we have. And so the two communities are, in some ways, talking past each other.

Timothy H. Edgar defended privacy as an ACLU lawyer before going inside America’s growing surveillance state as an intelligence official in both the Bush and Obama administrations – a story he tells in Beyond Snowden: Privacy, Mass Surveillance and the Struggle to Reform the NSA. In 2013, Edgar left government to become a Senior Fellow at Brown University’s Watson Institute and helped put together Brown’s Executive Master in Cybersecurity. Edgar also serves on the advisory board of Virtru, an encryption software company. Edgar’s work has also appeared in the Wall Street Journal, the Guardian, Foreign Affairs, and Wired. Edgar is a graduate of Harvard Law School and Dartmouth College.
