
PATCH: Debating Codification of the VEP

Mailyn Fidler, Trey Herr
Wednesday, May 17, 2017, 5:46 PM

Today a bipartisan group of lawmakers introduced in both the House and Senate a bill that would formalize the Vulnerability Equities Process (VEP) into law. The proposed legislation, the Protecting our Ability To Counter Hacking (PATCH) Act, is sponsored by Senators Brian Schatz (D-Hawai‘i), Ron Johnson (R-Wis.), and Cory Gardner (R-Colo.) (all members of the Senate Committee on Commerce, Science, and Transportation) and Representatives Ted Lieu (D-Calif.) and Blake Farenthold (R-Texas).

Published by The Lawfare Institute

Established in accordance with a 2008 presidential directive and currently overseen by the National Security Council, the VEP is an interagency process used by the government to decide whether to disclose vulnerabilities or hold them for potential exploitation. Vulnerabilities are flaws in software or hardware that could be exploited to give an outside party access to a computer system. Because vulnerabilities can be fixed and patched if disclosed to vendors, but not if they are kept secret, the VEP affects the security of software and of the greater cyber ecosystem. This process also affects law enforcement and intelligence activities, as a vulnerability kept secret remains useful until widely known, but once disclosed can be patched and thereby rendered ineffectual. The VEP is designed to oversee the disclosure of vulnerabilities by the intelligence community, law enforcement, and other government actors, weighing the reasons to keep them secret for operational use against the benefits of disclosing them so they can be fixed.

The first of its kind, this bill formally kicks off the debate over whether and how to codify the VEP, which presently exists only as a function of administration policy. The VEP has roots in Bush-era policies and saw significant refinement during Obama’s tenure. Snowden’s revelations in 2013 and the 2014 Heartbleed security vulnerability scandal resuscitated interest in codifying the process to facilitate accountability and continuity between administrations.

This post, based on the Senate version of the bill, highlights several key ideas in the recently introduced bill and evaluates the broader debate over codification. Regardless of the bill’s fate, its introduction is significant because it marks the first time that Congress will be actively involved in meaningful discussion about government disclosure of vulnerabilities. This is a positive development, particularly given the wide-ranging implications of that debate for law enforcement and intelligence operations. The legislators’ willingness to introduce a bill also demonstrates the public prominence of this issue. Especially as law enforcement actors increasingly turn to vulnerabilities to circumvent encryption, effective and standard oversight is warranted.

The VEP is a small piece of the much larger puzzle of how to secure software. The process is not designed to incentivize private sector behavior or radically change the way companies handle software security. The VEP is important as a narrow oversight function over the government’s disclosure of vulnerabilities, which, if kept secret, could lead to public harm. Much of what is proposed in this bill is not new. But codifying these criteria into law is meant to address concerns the VEP is at the mercy of administration politics and lacks sufficient transparency and oversight for outside actors to trust that the process is being run as intended.

Breaking Down the Bill

Overall, the bill employs a light-touch approach. It first formalizes the existence of a vulnerability equities review board and names both permanent and ad hoc members. This roster itself is significant, as the question of appropriate board membership has been a key point of debate in the VEP discussion. The current NSC process includes the intelligence community and law enforcement, but lacks effective representation from agencies capable of representing industry and citizen concerns. In the new bill, the Secretary of Homeland Security, who heads a civilian agency, replaces the National Security Council as chair. The board also includes the Secretary of Commerce as well as the Directors of National Intelligence, the CIA, the FBI, and the NSA. Ad hoc members can include the Departments of the Treasury, Energy, and State and the FTC as relevant, and the board can also request the participation of NSC members. Inclusion of the FBI is a particularly positive step. Law enforcement’s use of vulnerabilities presents some different challenges from those encountered by the intelligence community, such as trickle-down to state agencies, and these deserve adequate consideration as part of a robust VEP.

The board’s purpose is to oversee the VEP and set more detailed policy on vulnerability disclosure on “whether, when, how, and to what degree” information about a non-public vulnerability should be disclosed. The bill requires that the resulting policy be made public where possible. To that end, the bill spells out some minimum criteria for this policy, many of which mirror existing policy as set out by former White House Cyber Coordinator Michael Daniel in his 2014 blog post about the contemporary VEP. The bill’s criteria include the following:

  • Which information technology products or systems are subject to the vulnerability, with particular attention paid to products or systems used in core Internet infrastructure, critical infrastructure, the U.S. economy, or in national security systems?
  • What is the risk of leaving the vulnerability unpatched or unmitigated?
  • What is the likelihood someone other than the government will discover the vulnerability?

The decision-making criteria would also consider the needs and capabilities of the federal government, looking at:

  • The government’s need to exploit the vulnerability.
  • The need for this vulnerability in an ongoing or imminent intelligence operation or law enforcement investigation.
  • The government’s ability to obtain the information being sought through other means than the vulnerability.
  • The ability of the government to discover if another actor has discovered and is using the vulnerability.

The criteria also include international-facing considerations—that is, the potential harm that an adversary of the United States could inflict with the vulnerability if independently discovered, and the risks that U.S. use of the vulnerability could pose to “foreign countries and the people of foreign countries.”

Each federal agency would be required to use this process upon “obtaining information about” a non-public vulnerability, except where the agency deems that information releasable without adjudication by the formal process. Homeland Security would coordinate disclosure, a change from the current practice of delegating disclosure to the agencies with relevant jurisdiction or experience. The bill applies to all vulnerabilities that are not publicly known, rather than just those “newly discovered,” another substantial change from current practice.

Reporting Requirements and Periodic Review

Overall, the bill provides for several straightforward oversight measures and mandates a review process without presenting overly burdensome statutory requirements. The bill’s biggest substantive departure from the present system is that it requires review of all non-public vulnerabilities rather than just those newly discovered. This expansion could be controversial within the intelligence community—it may be criticized as overly broad—but it also allows for periodic review of vulnerabilities retained for operational use. This periodic review could allow the government to use vulnerabilities for a time, while still allowing for disclosure at a later date. Scholars have argued for such periodic review on the grounds that it helps ensure ongoing accountability of government possession of vulnerabilities and relieves pressure on a one-time, all-in or all-out initial disclosure decision.

The bill also institutes welcome reporting measures, requiring an annual classified report to three Senate and four House committees, as well as an unclassified public report. The reporting requirements as written in the bill are detailed and necessary; hopefully they remain a part of the final bill. There is also a provision for the Privacy and Civil Liberties Oversight Board (PCLOB) to review all of these reports, although PCLOB’s future remains uncertain.

Agency Concerns and Considerations

The bill is by no means settled, and some federal agencies and outside actors may resist the changes it advances. For instance, transfer of the chairmanship from the NSC to the Department of Homeland Security may be contentious among the agencies acquiring vulnerabilities. The VEP was originally run by an Executive Secretariat within the NSA’s Information Assurance Directorate and then moved to the White House National Security Council. Further shifting leadership to a civilian agency could provoke claims of a biased process or questions about the oversight body’s ability to understand technical and operational details. Agencies may also oppose having Commerce and Homeland Security coordinate disclosure to companies, and the corresponding loss of organizational discretion.

Requiring consideration of the interests of foreign countries and foreign people is another substantial change in policy that might frustrate U.S. government agencies as overbroad. On the other hand, Five Eyes countries may welcome this change, since their national and personal computer security may be taken into greater consideration than in the past. However, the bill itself does not specify consideration of allies only, meaning it likely extends beyond allies to broader human rights considerations around vulnerability disclosure, a change that would be welcomed by human rights advocates. The scope of this required consideration deserves further clarification and specification by the review board, once created, so that its intended effects are clear.

The bill explicitly allows tailored approaches to different kinds of vulnerabilities and exempts some kinds from the process altogether, a change likely to be welcomed by relevant agencies. These are common-sense concessions that go some way toward making the VEP work for its various stakeholders. For instance, the bill permits agencies to immediately disclose vulnerabilities deemed “presumptively shareable or releasable,” bypassing the review process. This clause allows the U.S. government, upon finding a vulnerability that is not particularly useful operationally, to disclose it immediately without running the full VEP, avoiding needless delay. Additionally, the bill exempts vulnerabilities that become publicly known from continued VEP review.

Private Sector Concerns

The private sector can be expected to have a more fragmented set of opinions, including on whether the VEP should be law at all. Firms that discover and sell vulnerabilities to the government will likely oppose codification and the challenges it might bring to their industry. Other companies, including many that build and help maintain the software products in question, such as Mozilla, support some formalization of the VEP.

Private sector actors may be particularly concerned about the bill’s definition of a vulnerability (“a design, configuration, or implementation weakness in a technology product, system, or application that can be exploited or triggered to cause unexpected or unintended behavior”). The bill’s language takes the positive step of reflecting the definition developed by the National Institute of Standards and Technology (NIST). The NIST definition has some private sector support, in contrast to the existing VEP definition of a vulnerability, which has created consternation. The definitional approach shared by NIST and the bill—an emphasis on exploitation rather than penetration—may address some private sector concerns.
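To make that definition concrete for non-technical readers, a purely illustrative toy example (not drawn from the bill or from NIST materials) shows an implementation weakness that can be “exploited or triggered to cause unexpected or unintended behavior”: a check that trusts attacker-controlled data.

```python
# Illustrative only: a toy implementation weakness of the kind the
# bill's NIST-style definition covers. The function trusts a
# client-supplied field instead of consulting a server-side
# access-control list, so anyone who controls the record can
# "trigger" unintended behavior (privilege escalation).

def is_admin(user_record: dict) -> bool:
    # The vulnerability: authorization decided from untrusted input.
    return user_record.get("role") == "admin"

# An attacker simply claims the role they want:
attacker_supplied = {"name": "mallory", "role": "admin"}
print(is_admin(attacker_supplied))  # True -- unintended behavior
```

Once such a flaw is disclosed, the vendor can patch it (here, by checking a server-side permission store); kept secret, it remains exploitable, which is exactly the trade-off the VEP adjudicates.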

Additionally, the bill indicates that the disclosure review process must take into account whether a private company “has a publicly disclosed policy for reporting and disclosing vulnerabilities.” This clause likely means government disclosure to private sector actors would depend, in part, on a company’s ability and willingness to patch vulnerabilities (although the language sets a fairly minimal disclosure-based standard). The inclusion of this language, however, suggests movement towards conditioning government disclosures on company security practices. This movement likely comes in response to government frustration about disclosing bugs that subsequently don’t get fixed, but it also adds another potentially problematic barrier to disclosure.

Why Care?

Public discussion of codification is welcome and positive, but a statutory approach can risk being either too restrictive or overly broad. Too many requirements and a bill becomes a straitjacket on the VEP, potentially harming the quality of decision-making and reducing agency buy-in. Excessively broad provisions, on the other hand, could make the process worse, allowing agencies more degrees of freedom in disclosure decisions than they currently have in practice. Well-crafted legislation, like the current draft, leaves ample room for specification of particular rules and processes by the board, within a minimum framework established by law, with input from all affected stakeholders.

However, even for those in favor of the VEP, a “Goldilocks” statutory approach will leave some questions unanswered. Disagreements may emerge about the requirements for annual reports and the lack of an appeal process. The bill’s decision-making criteria are also not weighted by importance, which could reduce the influence of criteria less popular with agencies. In a similar vein, it is not clear whether the bill covers vulnerabilities known to government contractors. The bill requires agencies to enter the process if they “obtain information about” a vulnerability, but does that include acquisition of a limited set of information by contract, or other forms of bureaucratic creativity? If a statutory approach moves forward, this element would be worth clarifying, potentially in conjunction with separate legislation.

This bill could have taken a range of forms, but the draft overall represents a solid middle ground approach. Statutory approaches can make people queasy, with good reason, but the current draft makes significant progress towards having a clear and flexible oversight process, an improvement over “policy by blog post.”

Mailyn Fidler is an Assistant Professor at the University of New Hampshire Franklin Pierce School of Law and a Faculty Fellow at the Berkman Klein Center for Internet & Society. Her research focuses on the intersection of criminal law, technology, and speech. Before entering academia, she served as a clerk on the Tenth Circuit Court of Appeals and worked in strategic litigation at the intersection of the First and Fourth Amendments.
Trey Herr is Assistant Professor of cybersecurity and policy at American University’s School of International Service and director of the Cyber Statecraft Initiative at the Atlantic Council. At the Council his team works on the role of the technology industry in geopolitics, cyber conflict, the security of the internet, cyber safety and growing a more capable cybersecurity policy workforce.
