
GoodRx, Health Data Brokerage, and the Limits of HIPAA

Justin Sherman
Friday, March 3, 2023, 12:52 PM

The Federal Trade Commission Building in Washington, D.C. (Carol M. Highsmith, https://tinyurl.com/3bjevc9v; Public Domain, https://tinyurl.com/4958s587)


On Feb. 1, the Federal Trade Commission (FTC) announced an enforcement action against prescription drug discount provider and telehealth company GoodRx for “failing to notify consumers and others of its unauthorized disclosures of consumers’ personal health information to Facebook, Google, and other companies.” Samuel Levine, the director of the FTC’s Bureau of Consumer Protection, said in the press release that “the FTC is serving notice that it will use all of its legal authority to protect American consumers’ sensitive data from misuse and illegal exploitation.” Along with a $1.5 million penalty, the proposed order would, among other safeguards, prohibit GoodRx from sharing its users’ health information with third parties for advertising purposes.

The FTC’s enforcement action against GoodRx is significant in and of itself. It is, for example, the first time the commission has taken action under its Health Breach Notification Rule, which requires certain personal health record vendors and their business associates to notify consumers of a breach of unsecured information. But the action also speaks to the broader limits of U.S. health privacy protections. Those limits have many dimensions, of course, such as law enforcement’s potential access to data related to pregnancy and health conditions. This article focuses on companies’ ability to collect, share, and broker access to individuals’ health data.

In short, while the Health Insurance Portability and Accountability Act (HIPAA) imposes some controls on the collection, use, and distribution of “protected health information” by covered entities, such as health care providers, it is narrowly scoped and leaves a vast ocean of health data unprotected. Hence, as underscored by the FTC’s GoodRx action, current U.S. privacy regulation allows many companies to legally collect, share, and sell—in other words, broker access to—Americans’ health data, from surgical histories to drug prescriptions. This lack of regulation creates a range of privacy risks to individuals, particularly marginalized and at-risk populations, and raises urgent questions about Congress’s ability and willingness to respond.

Data Brokerage and the Limits of HIPAA

HIPAA is often referred to as the U.S.’s health privacy law. But, as University of Connecticut law professor Carly Zubrzycki remarked at Duke’s recent Data Privacy Day conference, “It’s easy to forget that the ‘P’ in HIPAA is for portability, not for privacy.” Without getting too much into the law, a core focus of the original 1996 legislation was the establishment of rules around the collection, use, and sharing of health data by certain covered entities, with an emphasis on electronic health records and electronic health information, even though paper records were still in widespread use. Privacy came into the picture several years later, when the Department of Health and Human Services issued the HIPAA Privacy Rule and the HIPAA Security Rule. These regulations established national privacy and security standards related to electronic health information.

Nonetheless, HIPAA and its requirements apply only to certain covered entities. That list of covered entities is composed of health care providers, health plans, and health care clearinghouses, which process or facilitate the processing of health information or transactions from one entity to another. It also includes covered entities’ business associates that perform covered functions (like a company handling health data for a hospital). HIPAA’s covered entities list does not include websites or apps developed by third parties with no links to a covered entity—even when those websites and apps collect and use health data. It does not include social media platforms. It also does not include data brokers or advertising technology companies. This means that the protections within HIPAA and its associated regulations do not apply to a wide range of companies that are collecting, analyzing, aggregating, and sharing or selling health data.

In the recent GoodRx case, for example, it was the FTC, rather than the Department of Health and Human Services, that took action, precisely because GoodRx is not covered by HIPAA; instead, the FTC has jurisdiction over the company’s privacy practices. As the FTC alleged, GoodRx repeatedly “represented, directly or indirectly, expressly or by implication, that GoodRx is a HIPAA-covered entity, and that its privacy and information practices were in compliance with HIPAA’s requirements,” when “in truth and in fact, GoodRx is not a HIPAA-covered entity, and its privacy and information practices did not comply with HIPAA’s requirements.” The FTC alleged that GoodRx’s misrepresentation of its coverage by HIPAA—for example, by displaying a seal of compliance on its telehealth services homepage—was deceptive to consumers. The FTC also alleged that GoodRx failed to implement measures to prevent the unauthorized disclosure of users’ health information—and that it did not obtain users’ “affirmative express consent” to share their health data with Facebook and other companies. Many other allegations, including the allegation that GoodRx falsely claimed it was compliant with Digital Advertising Alliance principles, relate to the broader issue of deceptiveness.

By taking this enforcement action against GoodRx, the FTC is continuing to link the brokerage of Americans’ data to consumer harm. It sued data broker Kochava in August 2022, alleging that the company sells individually linkable location data in ways that are harmful to consumers. (The lawsuit is ongoing.) While GoodRx does not self-identify as a data broker, the data-sharing practices described by the FTC—that is, providing consumers’ personal health information to Facebook, Google, and other companies without their affirmative express consent—amount to GoodRx brokering access to individuals’ health data. Yet the fact remains that the FTC centered its enforcement action against GoodRx on deceptiveness, unfairness, and the company’s failure to notify consumers of a “breach” of their data—because brokering health data is not itself illegal. GoodRx is not covered by HIPAA and therefore not bound by it. The FTC further underscored this fact by invoking the Health Breach Notification Rule, which explicitly “does not apply to HIPAA-covered entities, or to any other entity to the extent that it engages in activities as a business associate of a HIPAA-covered entity.” It is completely legal, in and of itself, for GoodRx to collect and sell individuals’ health data, so long as it does not mislead consumers about that activity.

The GoodRx case is just one illustration of the limits of HIPAA and the lack of regulation around health data brokerage. In a new report from our data brokerage research team at Duke’s Sanford School of Public Policy, former student researcher Joanne Kim examined the wide availability of mental health data on the open market. Joanne contacted 37 data brokers with inquiries related to health data, and 11 of them ultimately indicated that they were willing and able to sell mental health data to her. The advertised data included data on Americans “with depression, attention disorder, insomnia, anxiety, ADHD, and bipolar disorder as well as data on ethnicity, age, gender, zip code, religion, children in the home, marital status, net worth, credit score, date of birth, and single parent status.” One data broker advertised data that included the “names and postal addresses of individuals with depression, bipolar disorder, anxiety issues, panic disorder, cancer, PTSD, OCD, and personality disorder, as well as individuals who have had strokes and data on those people’s races and ethnicities.” Prices ranged from $275 for 5,000 aggregated counts of people’s mental health conditions to upwards of $100,000 for yearlong access to a database of individuals’ mental health conditions.

This has been a problem for years: When Pam Dixon, the founder of the World Privacy Forum, testified before Congress a decade ago on data brokerage, she pointed to data brokers selling lists of people “suffering from mental health diseases, cancer, HIV/AIDS, and hundreds of other illnesses,” as well as health-related information such as, disturbingly, a list of “rape sufferers”—what Dixon then described as “an unjustifiable outrage that sacrifices a rape victim’s privacy for 7.9 cents a name.” Health data brokerage also encompasses data that may not traditionally be considered health data per se but can reveal health information. A prime example: In May 2022, Joseph Cox at Motherboard reported that data broker SafeGraph was selling the location data of people visiting abortion clinics. (After the report broke, SafeGraph said it had stopped selling the data.) I have previously written for Lawfare about a data broker that secretly tracked the phones of people visiting abortion clinics, geofenced the areas around the clinics, and then sold access to those devices to anti-abortion groups—who then ran manipulative, anti-abortion messages to women literally sitting in clinic waiting rooms.

The lack of legal and regulatory controls around the commercial collection, selling, and sharing of health data creates numerous privacy risks, including vis-a-vis data related to mental illnesses, chronic health conditions, pregnancy, trauma, and drug prescriptions. Health insurance companies buy data from the data brokerage ecosystem—including data on a person’s race, educational level, marital status, and net worth—to profile consumers and, apparently, to make determinations about insurance pricing. Law enforcement can buy this data without a warrant. For-profit entities and malicious individuals can surveil and target people seeking access to reproductive health care without their knowledge. Data about individuals taking antidepressants, veterans suffering from post-traumatic stress disorder, and people with Alzheimer’s and dementia is likewise available for purchase on the open market. FTC Commissioner Christine Wilson mentioned this last example recently in her concurring statement on the GoodRx enforcement action. She wrote, “Health data in the hands of the wrong entities can be used in pernicious ways—for example, consider a data broker that compiles a list of Alzheimer’s patients which a fraudster then uses to scam them.”

As with all matters of privacy and surveillance, the harms are likely to fall hardest on those already marginalized and vulnerable in society, such as Black and brown communities afflicted by racism in health care quality, access, and outcomes—or individuals suffering from a mental health condition who are already, as Joanne Kim wrote in her report, adversely impacted by mental health stigma.

What’s to Be Done?

The FTC is taking necessary and important action against companies that are engaged in unfair or deceptive uses of consumer data, including in the realm of health data brokerage. But so long as companies not covered by HIPAA are legally permitted to collect, aggregate, analyze, share, and sell Americans’ health data, the brokerage of health data will remain systemic.

There has been some congressional movement in this area. In June 2022, Sen. Elizabeth Warren introduced the Health and Location Data Protection Act, which would ban data brokers from selling health data and location data and would require the FTC to establish implementation rules, including exceptions for HIPAA-covered entities and carve-outs for First Amendment-protected speech. Rep. Mary Gay Scanlon introduced the same legislation in the House in October 2022. Importantly, both proposals define a data broker as “a person that collects, buys, licenses, or infers data about individuals and then sells, licenses, or trades that data.” This definition departs from several state laws and bills around data brokerage, which often confine the definition of a “data broker” to third parties that have no direct business relationship with the consumer. By using this broader definition in the Health and Location Data Protection Act, legislators ensured the bill would cover the numerous first-party entities that collect health data on their own customers and then broker it—such as non-HIPAA-covered health apps or, in the recent case, GoodRx. However, this broader definition also means the bill is likely to face more resistance from data brokers, advertising technology firms, and other companies.

The American Data Privacy and Protection Act (ADPPA)—the larger privacy bill introduced in 2022 and likely to be reintroduced in some form—also has some treatment of health data and data brokerage. Its definition of “sensitive covered data” includes “any information that describes or reveals the past, present, or future physical health, mental health, disability, diagnosis, or healthcare condition or treatment of an individual.” Large data holders must in some cases conduct an algorithmic impact assessment, which must include “a detailed description of steps the large data holder has taken or will take to mitigate potential harms from the covered algorithm to an individual or group of individuals,” including “making or facilitating advertising for, or determining access to, or restrictions on” the use of health care.

ADPPA has a section (206) on “third-party collecting entities,” its term for data brokers, but it would mainly require these companies to submit information to a public registry. The bill would not actually place significant controls on the brokerage of data itself. Additionally, the bill’s definition of “third-party collecting entity” is confined, among other limitations, to companies with no direct relationship with consumers, leaving out first-party collectors that broker data. In doing so, ADPPA fails to target the source of much brokered health data—companies collecting and then sharing or selling health data on their own customers, who typically have no idea their health information is being shared and sold once it is gathered.

Congress needs to act, but some of the currently proposed measures—such as ADPPA and other bills that take the broken consumer “consent” approach to data collection—fail to appropriately address the privacy issues and consumer harms associated with the data brokerage ecosystem. Policymakers must start by understanding the pervasiveness of the brokerage of Americans’ health data and the range of documented and looming harms. Then, they must recognize that overlooking first-party data collection in order to focus only on third parties fails to regulate how so much of Americans’ health data, from mental health conditions and drug prescriptions to chronic health conditions, trauma, and pregnancy, is made available for sale in the first place.

The author thanks Jolynn Dellinger and Luke Schwartz for their comments on an earlier version of this article.


Justin Sherman is a contributing editor at Lawfare. He is also the founder and CEO of Global Cyber Strategies, a Washington, DC-based research and advisory firm; a senior fellow at Duke University’s Sanford School of Public Policy, where he runs its research project on data brokerage; and a nonresident fellow at the Atlantic Council.
