Published by The Lawfare Institute
Over the past few months, a team at Mozilla has been looking closely at the remote hacking cases currently winding their way through the courts. Because the cases involve the possible disclosure of a potential Firefox vulnerability, we wanted to understand both the best outcome for our users and, more generally, the circumstances in which courts would be the appropriate venue for such disclosures.
Our analysis regarding the technical arguments in these cases is largely consistent with Nick Weaver and Susan Hennessey’s conclusions; that is, the information disclosed by the FBI is probably sufficient to determine the authenticity of evidence collected without additional disclosures regarding the vulnerability to the defendant. If that analysis is generalizable to other lawful hacking cases—a big if at this point, because law enforcement hacking is likely to grow more complex—this would suggest that there is a path forward whereby defendants can obtain the information required to mount a fair defense without jeopardizing law enforcement’s sensitive collection methods.
Nonetheless, the outcome in these cases thus far should leave everyone dissatisfied. To explain why, we want to provide more detail about our argument and the outcome in the Michaud case.
Advance Court Disclosure
The relevant issue in Michaud, and in a number of related cases, is an alleged vulnerability in the Tor Browser exploited by the government. The Tor Browser is based substantially on our Firefox browser codebase, and some have speculated that the vulnerability might exist in a portion of that code. At this point, no one outside the government (including us) knows whether that is the case.
In May, we filed a brief in the case asking the court to ensure that, if our code is implicated in a security vulnerability, the government disclose the vulnerability to us before it is disclosed to any other party. At that time, the judge had already ordered that the full exploit be disclosed to the defense team—a decision he later reversed, in part. We thought this raised additional security risks that needed to be addressed and that it was our responsibility to our users to note those concerns with the court.
The more people who know about a vulnerability, the more risk there is that it will be weaponized and threaten a product’s users. This is a concern if, for example, the vulnerability is procured from the grey market, indicating that a larger number of less reputable individuals might already be in possession of knowledge about the vulnerability. The act of exploiting a vulnerability against a large number of targets also expands the potential pool of people who may know about it. Similarly, disclosing a vulnerability in court changes the risk profile, both by expanding the number of individuals with knowledge and by potentially moving the vulnerability outside of the government’s strict system of rules and processes for protecting sensitive information. These factors should inform a decision regarding whether to disclose a vulnerability to an affected company, prior to permitting broader disclosure.
In the Michaud case, the crux of our concern was that the vulnerability might find its way into the hands of others beyond the defense team, possibly putting our users in jeopardy. In our view, a protective order was not sufficient to mitigate this risk.
The implications of such an exposure are significant. The Firefox browser has hundreds of millions of users. If the vulnerability exists in our code and if it were to leak, those users could all quickly be put at risk. Attackers could use this exploit to take control of a Firefox user's computer and use it, for example, to steal the user's personal information. Such a compromise could have catastrophic effects on individuals, corporations, and governments. Once such a leak occurs, Mozilla would face a race against time to protect our users from ongoing harm. Often vulnerabilities can be exploited by bad actors more quickly than software companies can develop, test, and release patches, so companies like Mozilla start the race at a disadvantage against bad actors.
And this assumes that we would even be aware that the vulnerability had leaked in the first place. In reality, it is unlikely that Mozilla or the government would realize a third party had received information about the vulnerability before potentially widespread damage had already occurred.
For these reasons, we asked the judge in the Michaud case to require the government to disclose the vulnerability to the affected technology companies first, whether Mozilla or other parties, so that it could be patched. That is a practice that we think should apply whenever vulnerabilities are going to be disclosed in court, and it is an example of the types of policies that need to be developed to deal with what is likely to be the government’s growing use of lawful hacking tools. Court-ordered disclosure should follow the best practice of advance disclosure that is standard across the security research community: companies should be given an opportunity to fix a vulnerability before it can threaten their broader user bases.
The judge in this case ultimately reversed his order and decided that the government was not required to disclose the vulnerability to the defense team. That determination rendered our argument somewhat moot. Our participation in the case was predicated on the judge’s disclosure order and its implications for our users. We weren’t siding with the government or the defendant. The judge’s reversal eliminated at least the immediate security risk we had identified. In light of this decision, Judge Bryan stated that, “It appears that Mozilla’s concerns should be addressed to the United States and should not be part of this criminal proceeding.”
Leaving Internet Users Out to Dry
Of course, this still leaves the underlying security issues—the fact that the government may know about a vulnerability that could affect millions of users—unaddressed.
We took seriously the judge’s guidance that we address our concerns to the United States government and have attempted to speak with law enforcement officials about this issue. In fact, we took this approach prior to our entry in the Michaud case and reached out to the government to inquire whether the vulnerability impacted Firefox. Government officials have thus far not been willing to sit down and hear our concerns. That approach is understandable, but ultimately shortsighted.
We can certainly appreciate the government’s desire to protect its sources and methods. And we understand that there are some circumstances where non-disclosure of a vulnerability is appropriate. At the same time, the government needs to take seriously its obligations to protect the safety and security of hundreds of millions of people online, and doing so requires a partnership with technology companies. Considering the ever-fraying relationship between the tech industry and the national security community, the challenges associated with lawful hacking and vulnerability disclosure offer a promising opportunity for more productive collaboration as we work together to find the best way to protect online and offline security.
Court cases like these are rarely the ideal venue to protect the interests of average Internet users. In criminal proceedings, the defense and prosecution interests are most directly at stake, and the security equities of our users are implicated directly only when a judge orders disclosure. The problem, as illustrated by this case, is that there is no transparent, accountable venue whatsoever to address those interests. We can avail ourselves of neither the courts nor the executive branch. Nor can we have confidence that the factors identified above that should inform a disclosure decision have been adequately considered. So while Nick and Susan’s analysis suggests it is possible to protect the interests of both defendants and the government, at the end of the day average Internet users may still be hung out to dry.