
Does Equifax Owe Victims a Duty of Care?

Merritt Baer, Chinmayi Sharma
Tuesday, September 12, 2017, 11:30 AM


Published by The Lawfare Institute
in Cooperation With

Last week, credit reporting agency Equifax disclosed that it was subject to a massive hack of personally identifiable information that may have compromised the data of as many as 143 million Americans. Unlike in many other high-profile data breaches, many of the individuals affected may never have used Equifax or viewed or consented to its data retention policies. Nevertheless, their data was accessible and is now potentially compromised.

Last month, we wrote about theories of harm in data breach cases. Typically, you can sue only a party that failed to uphold a duty it owed you. This component of standing applies to data breach cases: does the defendant owe the plaintiff a duty of care?

A duty of care may arise from a special relationship between the defendant and the plaintiff or from a statute governing a particular activity. For example, the Fair Credit Reporting Act (FCRA) establishes statutory standards of care for how consumer reporting agencies (like Equifax) use or limit access to your information in the normal course of business. Below, we describe these two mechanisms and their application to data breach cases.

I. Special Relationships

A special relationship, like that between a provider and consumer, employer and employee, or fiduciary and beneficiary, is usually codified in a contract. In most data breach cases, plaintiffs allege that the defendant breached an explicit or implied contractual promise to take reasonable measures to protect personal information. To bring suit for breach of a contract-based duty of care, plaintiffs must show:

  1. the existence of a binding agreement;
  2. the non-breaching party fulfilled its obligations, if it had any;
  3. the breaching party failed to fulfill obligations;
  4. the lack of a legal excuse; and
  5. the existence of damages sustained due to the breach.

Some courts have been willing to read in an obligation absent explicit language, whether under a contractual agreement between two parties or under a privacy policy the company maintained. Occasionally, as with the breach of Toronto-based dating site Ashley Madison, the government has even found that the lack of an articulable privacy policy was itself grounds for enforcing a duty of care. As far back as 2005, in In re JetBlue Airways Corp. Privacy Litigation, the court treated the company’s data policy as an enforceable contract in assessing a motion to dismiss.

When consumers seek to enforce a breached duty of care claim, courts struggle to establish coherently when and between whom the duty existed. Until the last decade or so, most people had a fairly knowable set of relationships in their daily lives. These often arose through contracts, and the relationships were few and well-established. Individuals had a single employer, a single landlord, and so on. Today, our daily transactions are intricately webbed and can involve many parties we might not even know exist. We constantly interact with companies, and their contractors, that don’t necessarily have a direct relationship with us as individuals. In turn, these complexities frustrate courts, which often look to the directness of a relationship between two parties as a persuasive factor in establishing duty.

A recent line of cases illustrates how this problem applies to findings on the existence of a duty of care. In Willingham v. Global Payments, Inc., the court held that a payment processor owed no duty to consumers using the company’s platform to send funds to merchants. Similarly, in In re, Inc., the court declined to treat a company statement about its security policy as an enforceable contract and also denied the existence of an implied contract to safeguard the data.

Last year’s Nobel Prize in Economics went to two contract theorists. From the gig economy to supply-chain untraceables, we have innumerable daily transactions and interactions at every level, from the individual to the corporate. These relationships are constantly fragmenting and, practically speaking, can be impossible for any individual to control. We are in a new landscape of networks and obligations in data breach liability, and absent a specific relationship, courts have been hesitant to find that a duty of care exists. Sometimes, plaintiffs can leverage terms and warranties, both implied and express, to hold contracting parties accountable in data breach incidents. So far, courts have held contractors liable either through an explicit contractual standard of care or through general industry standards of care, for example, being on notice through the FAR and DFARS regulations that mandate compliance with NIST SP 800-171 as the standard for system security and information protection.

In Ruiz v. Gap, Inc., the court assumed the existence of a duty between job applicants and a prospective employer. Gap, Inc. had failed to exercise reasonable care to prevent the theft of laptops with unencrypted information about the job applicants. Although there was no express contract, the court found that Ruiz had standing to bring the suit based on an increased risk of identity theft, but held that Ruiz failed to present evidence of actual damages.

These factors may be even more complex in the context of a government contractor. In June of this year, in McDowell v. CGI Federal Inc., a district court allowed a plaintiff to proceed with claims against a third-party government contractor following a data breach incident under the third-party beneficiary theory, treating the company’s privacy policy as a binding agreement. The government had already sued the contractor for breaching express and implied responsibilities by failing to secure sensitive personal data of federal employees. But the workers were also allowed to bring suit because the breached information security provisions of the privacy policy were included specifically to protect their data, so the court held that they were the intended beneficiaries of the contract.

II. Statutes as the basis for a claim

Absent a special relationship, there may still be a claim for breach of a duty of care. And, as we have described, the landscape of special relationships is convoluted when it comes to the collection and storage of data. In this climate, we may see increasing attempts to rely on statutory or other avenues to establish an owed duty.

First, a duty may arise from common law principles governing negligence liability generally. In In re Hannaford, a third party stole a grocery store’s debit and credit card data, and the court recognized an implied contractual duty for a vendor to protect customer information. Although the case was remanded to the district court in 2013 and has not been finally resolved, the duty question was settled at the appellate level: the case has since reached class certification, and the district court upheld the theory of duty.

Second, a statute may impose a duty, either in express terms or as a result of judicial reliance on the statute as the proper expression of the standard of care. For example, the FCRA, mentioned at the beginning of this article, imposes a specific statutory duty in a specific context.

Third, there may be a duty under the law of misrepresentation, which imposes a general duty to update statements, including statements related to data security, that are the basis for pending or continuing reliance by the recipient. Last year, Ashley Madison agreed to settle FTC and state charges that it deceived consumers and failed to protect 36 million users’ account and profile information in connection with a massive July 2015 data breach of its network.

Finally, failure-to-act rules may require the exercise of reasonable care to avoid or minimize damages when an entity holding data has created a continuing risk of harm. In Bell v. Blizzard Entertainment Inc., the court upheld a claim of unjust enrichment because Blizzard had knowingly sold a game to consumers even though the company had been hacked several times in the previous five years and had taken no additional measures to improve its security. When the consumers’ information was stolen, the court held that the vendor had known there was a strong likelihood of a breach and was therefore impermissibly passing to the plaintiff the cost of upgrading its own information security.

These avenues give plaintiffs potential arguments for standing, but they are not always effective:

Courts may refuse to find a duty based on violation of a statute when the statute was not intended to protect the plaintiff against a particular type of harm. In Willingham, mentioned above, the court also found that the plaintiffs were not owed a duty of breach notification, even though there was an applicable Georgia breach disclosure statute, because the plaintiffs were not Georgia residents and therefore not the class the statute was enacted to protect.

And in Dittman v. Univ. of Pitts. Med. Ctr., the Pennsylvania Superior Court affirmed a finding that the University of Pittsburgh Medical Center did not owe a common law duty to protect the sensitive information of 64,000 employees. Calling UPMC “victims of the same criminal activity,” the Superior Court went so far as to say, “We find it unnecessary to require employers to incur potentially significant costs to increase security measures when there is no true way to prevent breaches altogether.”


Clearly, previous understandings of legal duty take on a new color when we assign blame in the data breach context. There need not be malice for there to be significant harm in a data breach (for more on the types of harms that can survive a standing challenge, see our earlier article). And the extent to which entities owe a duty to individuals in this new context is still poorly defined. The combination of the splintering of traditional transactional relationships and the proliferation of increasingly intimate data makes for a complex landscape of liability.

*The views expressed herein are the personal views of the authors and do not necessarily represent the views of the FCC or the US government, for whom one of the authors works.

Merritt is an expert in emerging technology and cybersecurity. Merritt has experience in all three branches of government, most recently in the Department of Homeland Security where she worked in the Office of Cybersecurity & Communications, often termed the United States’ “cyber firehouse.” Before joining the government, Merritt started a business advisory / legal practice, working with emerging tech companies at early stages of growth. Merritt speaks regularly on emerging areas including the future of the Internet, artificial intelligence and robots, current cybersecurity issues in 5G, cloud, mobile, IoT and ICS, corporate interactions with government cyber, women in tech, entrepreneurism and innovation. Her insights on business strategy and tech have been published in Forbes, The Baltimore Sun, The Daily Beast, Talking Points Memo and ThinkProgress. She continues to publish academic articles on Internet questions, and her work has appeared in the journals of Temple, Georgetown, Santa Clara, and the University of Virginia. Merritt is a graduate of Harvard Law School and Harvard College. She is admitted to the Bars of New York, the United States Court of Appeals for the Armed Forces, and the United States Supreme Court. Based in Washington, DC, she is a Fellow at the East-West Institute, founder of women’s tech advocacy group Tech & Roses, Adjunct Professor of Cybersecurity at the University of Maryland, and an amateur boxer.
Chinmayi Sharma is an Associate Professor at Fordham Law School. Her research and teaching focus on internet governance, platform accountability, cybersecurity, and computer crime/criminal procedure. Before joining academia, Chinmayi worked at Harris, Wiltshire & Grannis LLP, a telecommunications law firm in Washington, D.C., clerked for Chief Judge Michael F. Urbanski of the Western District of Virginia, and co-founded a software development company.
