
Cybersecurity’s Third Rail: Software Liability

Jim Dempsey
Thursday, March 2, 2023, 3:57 PM

The Biden administration’s cybersecurity strategy calls for placing responsibility for insecure software on those best positioned to secure it.

The New York State Court of Appeals, where Benjamin Cardozo held Buick Motor Company liable for a faulty wheel supplied by another firm. (Daniel Case, https://tinyurl.com/23xddw6x; CC BY-SA 3.0, https://creativecommons.org/licenses/by-sa/3.0/deed.en)

Published by The Lawfare Institute
in Cooperation With
Brookings

Well, they’ve done it. The Biden administration’s new National Cybersecurity Strategy takes on the third rail of cybersecurity policy: software liability. For decades, scholars and litigators have been talking about imposing legal liability on the makers of insecure software. But the objections of manufacturers were too strong, concerns about impeding innovation were too great, and the conceptual difficulties of the issue were just too complex. So today software licenses and user agreements continue to disclaim liability, whether the end user is a consumer or an operator of critical infrastructure. With this new strategy, the administration proposes changing that.

The strategy’s discussion of the issue starts with an incontrovertible point: “[M]arket forces alone have not been enough to drive broad adoption of best practices in cybersecurity and resilience.” Indeed, the strategy goes on to note, market forces often reward those entities that rush to introduce vulnerable products or services into our digital ecosystem. Problems include the shipping of products with insecure default configurations or known vulnerabilities and the integration of third-party software with unvetted or unknown features. End users are left holding the bag, and the entire ecosystem suffers, with U.S. citizens ultimately bearing the cost.

We must begin, the administration says, to shift liability onto those who should be taking reasonable precautions to secure their software. This will require three elements, according to the strategy: preventing manufacturers and service providers from disclaiming liability by contract, establishing a standard of care, and providing a safe harbor to shield from liability those companies that do take reasonable measures to secure their products and services. Together, the three points are based on a recognition that the goal is not perfect security but, rather, reasonable security.

Some software companies will likely object. But in urging that responsibility be placed on those best positioned to reduce risk, the administration is merely applying an old principle to the now-matured software sector. Early in the 20th century, the automobile industry was about where the computer software industry is today. Automobile makers then, as software developers do now, disclaimed liability for any flaws in their products. We sell to dealers, not to consumers, they argued, so end users don’t have the “privity of contract” with us needed to sue. And anyhow, we’re not liable for the tires or the brakes or any of the other components, since we didn’t make those. We just assembled the car.

In 1916, then-state court judge Benjamin Cardozo, who went on to serve on the U.S. Supreme Court, rejected the auto makers’ arguments in an opinion that set off a chain of law reform across the country. He held that the defendant, Buick Motor Company, was responsible for the finished product. His words are remarkably relevant today. As a manufacturer of automobiles, Buick “was not at liberty to put the finished product on the market without subjecting the component parts to ordinary and simple tests.” The obligation to inspect, Cardozo acknowledged, must vary with the nature of the thing to be inspected. The more probable the danger, the greater the need of caution. As Tom Wheeler and David Simpson argued in a recent paper on liability in the telecommunications sector, the lessons of the case are clear: Neither the consumer nor the local dealership had meaningful insight into or control over the manufacturing process or material supply chain—but Buick did. Cardozo’s decision “firmly placed the risk assessment and mitigation responsibility with the corporation in the best position to know details regarding assembled sub-systems and to control the processes that would address risk factors.”

In calling for responsibility on those in the software supply chain best positioned to know their product and control the processes that would address risk factors, the administration is saying it is time for software development and services to catch up with the rest of the legal and economic framework. Lessons from other sectors—on how to define a standard of care and measure compliance with that standard—may well inform the next steps.


Jim Dempsey is a lecturer at the UC Berkeley Law School and a senior policy advisor at the Stanford Program on Geopolitics, Technology and Governance. From 2012-2017, he served as a member of the Privacy and Civil Liberties Oversight Board. He is the co-author of Cybersecurity Law Fundamentals (IAPP, 2024).
