
Encryption Legislation: Critics Blinded by Outrage are Blinded to the Lessons

Susan Hennessey
Thursday, April 21, 2016, 3:05 PM


Published by The Lawfare Institute

As readers are well aware, Senators Richard Burr and Dianne Feinstein released draft legislation on encryption, the “Compliance with Court Orders Act” (CCOA), the other day. Judging by the apocalyptic tenor of responses to both the released discussion draft and to an earlier leaked version, it is reasonable to expect many in the technology industry and advocacy organizations to oppose this legislation the way they might oppose, say, the rule of Satan. The vehemence of the response is not surprising, as the law is similar to what many critics consider the worst case scenario—a CALEA-style generalized obligation to provide decrypted data or technical assistance to law enforcement.

Barring a dramatic and unforeseeable turn of events, this bill is not going to see the Senate floor any time soon. Yet that has not stopped critics from investing considerable energy in decrying the legislation as technologically illiterate, technophobic, evil, and undemocratic. In the hysteria, encryption advocates are missing an opportunity to gain genuine insight into the way people on the other side of the issues think about constructing workable regulatory regimes.

In order to move toward a constructive dialogue and an understanding of the animating values, let’s take a moment to consider what this legislation attempts to accomplish, why, and how.

How Much of the Problem Do You Want to Solve?

To begin, consider the available legislative options to achieve Burr and Feinstein’s stated goal of facilitating lawful access to data and information. Ben has previously discussed five general approaches that Congress might pursue in regulating encryption technology. But for the SSCI’s purposes, there are really only two meaningful legislative options: technical assistance or a performance standard.

Under a technical assistance approach, the law would define the scope of the obligation to provide technical assistance under Title III, FISA, and the All Writs Act. Congress could set forth the specific parameters of what kind of help companies must provide to the government in executing a court order while still allowing companies to design their systems in such a way that might ultimately thwart government access. In essence, this approach adopts “lawful hacking” as advocated by some in the tech community, but adds the twist that companies have to help the government hack when they are able to do so. Legislation aimed at clarifying the obligations to provide technical assistance has a number of benefits—it can be narrowly tailored and presents a more palatable compromise—but it has one major drawback in that it only solves a small part of the problem.

WhatsApp recently announced it was moving to end-to-end encryption for all data—and similar messaging applications have followed. That move was a timely reminder to Congress that companies will increasingly work to position themselves such that they can offer no meaningful assistance to the government, even if they want to and even if required by a court order. The Justice Department hinted at this in its Motion to Compel in San Bernardino, which noted that “Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data …. Despite its efforts, Apple nonetheless retains the technical ability to comply with the Order, and so should be required to obey it.”

Relying exclusively on defining obligations to provide technical assistance as a solution to Going Dark solves a fixed—and ever-diminishing—sliver of the problem. As companies move toward stronger systems, they will inevitably reach a point where they cannot meaningfully help at all, or cannot do so within a time frame that is responsive to law enforcement needs. While partial solutions may have virtues, technical assistance is not a comprehensive fix now or in the future.

Recognizing this, Burr and Feinstein have apparently decided that if they are going to solve the problem, they are going to solve as much of it as they reasonably can. Thus, CCOA is a form of what Ben calls “the Full Comey”—legislation which sets a performance standard of being able to produce and decrypt information when subject to a particular type of court order. The broader performance standard is then supplemented by an alternative obligation to provide technical assistance to facilitate access to data encrypted by some other party.

It’s actually a pretty straightforward legislative solution. Certainly some quantity of information subject to a court order will nonetheless remain inaccessible, but the bill covers as much of the terrain as is practicable. And contrary to what the echo chamber of criticism has convinced itself, this legislation is not technologically illiterate. Rather, it is rationally constructed to achieve the goals of its drafters. It may be fun to convince yourself that your opponents are illiterate and stupid, but the reason for the disconnect here is not brains; it’s values.

Another Word for “Breadth” is Certainty.

Paul Rosenzweig noted the “incredible breadth” of the bill—as illustrated by this useful graphic—and characterized it as problematic. Indeed, a broad range of entities are covered. In addition to covering remote computing services and electronic communications services, CCOA applies to any person or entity who provides a product or method to facilitate a communication or the processing or storage of data.

But breadth is not necessarily an unintended consequence or careless drafting; there are significant advantages to comprehensive regimes. In order to function, private companies must clearly understand the rules and whether they are covered by a given statute or regulation. Ambiguity creates both economic and enforcement problems. The central issues of Apple v. FBI were ones of uncertainty, both genuine and manufactured. Has Congress spoken? Does the All Writs Act appropriately apply? If All Writs does apply, which kinds of activity can be compelled under it? Alternatively, does CALEA apply? What entities and activities are governed by the various provisions of CALEA?

This legislation answers the persisting uncertainties. If it were to pass, Congress would have spoken. Notwithstanding any other law—sweeping away the attendant uncertainty of CALEA and AWA—entities would know what is required of them in order to comply with the statute. And there would be relatively little ambiguity as to who is covered—primarily because essentially every relevant actor falls within the law.

The Bill Empowers the Courts, not the Executive.

There is a significant facial limitation on an otherwise broad act insofar as it would only govern the responses to “court orders.” The court would, of course, still be constrained by law in issuing these orders. And the act would further limit the relevant court orders to those issued to investigate or prosecute a defined number of serious offenses—those resulting in death or serious bodily harm, espionage and terrorism, exploitation of and threats to a minor, serious violent felonies, federal drug crimes, and state equivalents. Furthermore, the act does not apply to everyone subject to a court order, but only to covered entities as defined.

These limitations are certainly cold comfort to covered entities (and recall that entire industries are covered). A company cannot know in advance when or if it will receive a court order related to one of the enumerated offenses, nor which data might be determined to be responsive. Therefore, companies would be forced to maintain the ability to access and decrypt any and all of the data within their control, and potentially to be prepared to provide technical assistance for some additional set of data beyond their immediate control. But note that this is how a performance standard operates. In electing to pursue the “solve as much as we can” path, a generalized dictate is the alternative to a piecemeal solution.

There are two requirements that a covered entity that is served with a court order for information or data might face. Either the entity (1)(A) “shall provide such information or data in an intelligible format” or (1)(B) the entity shall “provide such technical assistance as is necessary to obtain such information or data in an intelligible format or to achieve the purpose of the court order.” “Intelligible” means data that has either “never been encrypted, enciphered, encoded, modulated, or obfuscated” or, if it has, has been returned to “its original form.”

The bill—like CALEA—cannot be used to mandate or prohibit the adoption of any particular design or operating system. Companies will determine the best methods. Many technology companies and advocacy groups insist it is not possible to offer encryption at all under the act. But that isn’t true. Preserving the technical mechanisms for access or decryption might compromise ideal security—maybe even basic security—but it’s clearly possible.

This is not a question of “magic rainbow unicorn” law; it is a stark policy choice with which a great many critics disagree.

This is an important point for those who want to dismiss the legislation as ignorant of mathematical principles. The drafters do not reject the premise that it is impossible to both have third-party access or decryption capability and also have an ideally secure system. Rather, recognizing that this kind of access would result in some degree of real or theoretical lessened security, they have determined that there are ways to manage these risks. Advocates who wish to change the minds of those who support this bill must be convincing on this point—that there is, on balance, no feasible way to manage the risk. And thus far, Congress isn’t convinced.

The bill’s critics have made much of the fact that it includes no articulated penalties for noncompliance. But as a general matter, where there is no specific statutory penalty, courts possess discretionary mechanisms to ensure compliance with orders and to punish noncompliance—namely civil and criminal contempt charges.

As with the broader structural choices of the bill, here too the drafters faced a preliminary decision. They could vest discretionary enforcement in the judiciary’s existing authorities. Or they could adopt a provision similar to CALEA—which authorizes the court to issue fines of $10,000 per day for failure to comply—by creating some set of penalties. But CALEA applies to narrowly defined, robust, and largely monolithic telecommunications carriers. These companies have similar compliance capacities, and the financial penalty can be calculated to have approximately the same incentive impact on each. But adopting this approach to encryption regulation, where there are far more diverse actors and incentives, risks creating penalties that are at once too broad and too narrow.

Here, the bill empowers courts to exercise discretionary authority to impose equitable penalties—punitive, compensatory, or coercive. That flexibility is important to this kind of regulatory regime.

In a sort of elegant irony—considering the procedural history thus far—the power of federal courts “to punish by fine or imprisonment, at the discretion of said courts, all contempts of authority in any cause or hearing before the same” is derived from the Judiciary Act of 1789—the very same which produced the All Writs Act.

The contempt power has some complex constitutional dimensions. Under this bill, courts would likely need to resolve questions of the application of contempt to situations in which entities or individuals intentionally constructed conditions that made compliance impossible, limiting the effectiveness of coercive mechanisms. While this difficulty might be surfaced under CCOA, it is not created by the act, which in no way alters the judicial authority to address those who defy court orders.

Ambiguities Persist.

Despite the aim of creating a broadly applicable bill to clarify performance standards and assistance obligations, there are a number of significant ambiguities in this draft.

For example, who determines whether the 1(A) obligation to provide information in an intelligible format or the 1(B) obligation to provide technical assistance applies? Presumably, the court makes the determination. But courts have limited technical expertise, and government agencies requesting the assistance might have a different understanding than a company regarding what qualifies as necessary, reasonable, or even possible.

Similarly, the 1(A) obligation to provide information or data in intelligible form—as opposed to the obligation to provide technical assistance—presents definitional problems. The obligation to provide information or data only applies to data which has been made unintelligible by a “feature, product, or service” which is “owned, controlled, created, or provided” by the entity or a third-party on its behalf.

The scope of the requirement represents an expansion of the decryption assistance obligations incumbent upon entities subject to CALEA. In CALEA, “[a] telecommunications carrier shall not be responsible for decrypting, or ensuring the government's ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.” The entities covered by CCOA are only responsible for decrypting encryption they themselves provided, and they are obligated to retain the capability to decrypt. For data encrypted by other parties, the entity is only responsible for providing technical assistance.

The CCOA provision means that a device manufacturer, like Apple, could not be required to provide decrypted data from a third-party app, like WhatsApp. This is a sensible limitation, since there is likely no way for a company to provide such data in an intelligible form. However, the entity is responsible not only for its own products and services but also for those “owned, controlled, created, or provided” by it. This language may create difficulties. Is Facebook required to provide WhatsApp messages in plaintext, since Facebook owns WhatsApp? As large companies purchase various apps which they incorporate into their services—Facebook also owns Instagram, for example—the obligations of parent companies and subsidiaries become more difficult to discern. And the relationship of parent and subsidiary in retaining decryption capability grows more complex across international markets.

Another potential concern with the language is that the scope of the 1(A) requirement applies to data that has been made unintelligible by a “feature, product, or service.” “Feature” is defined in the act as a “property or function” of a device or software application. The inclusion of “property” might extend the obligation to cases in which data is altered or modified as an unintended or collateral matter, which risks making the onus to preserve the ability to return the data or information to its original form overbroad.

There are legitimate reasons to oppose this legislation as a general matter, or to note the problematic ambiguities inherent in vesting enforcement in judicial discretion. But before critics dismiss the problem as a government which can’t do math or is relying on magic, they should try writing a law.

Obstructionism is easy, effective, and emotionally satisfying in the short term. But it is not a long-term strategy. When faced with a problem that is not going away, if you want to beat a bad idea, you have to come up with a better one.

Susan Hennessey was the Executive Editor of Lawfare and General Counsel of the Lawfare Institute. She was a Brookings Fellow in National Security Law. Prior to joining Brookings, Ms. Hennessey was an attorney in the Office of General Counsel of the National Security Agency. She is a graduate of Harvard Law School and the University of California, Los Angeles.
