The Lawfare Podcast: Justin Sherman on Senator Wyden’s Investigation of Near Intelligence Inc.
Published by The Lawfare Institute
On Feb. 13, Senator Ron Wyden released a letter documenting an investigation his office has been conducting into the activities of Near Intelligence Inc., a data broker that allegedly enabled an anti-abortion organization to target anti-abortion messaging and ads to people visiting 600 Planned Parenthood clinics across the United States. Lawfare Senior Editor Stephanie Pell sat down with Justin Sherman, CEO of Global Cyber Strategies and a Senior Fellow at Duke University’s Sanford School of Public Policy, to discuss this investigation. They talked about the various players in the data broker ecosystem that enable these invasive practices, the lack of federal legislation governing and preventing these activities, and what actions the FTC might be able to take against Near Intelligence Inc.
Click the button below to view a transcript of this podcast. Please note that the transcript was auto-generated and may contain errors.
Transcript
[Audio Excerpt]
Justin Sherman: We, in the United States, have a general sense that if we visit the emergency room for a cut on our hand, or if we stop by the urgent care clinic because we have a sore throat, that our doctor is not going to go sell that information on the street corner, right? That is a confidential set of activities that's protected by law. That's, of course, not the case here when you have companies that are gathering location data from your device and selling that information such that other organizations and individuals can learn that you're visiting a medical facility.
[Main Podcast]
Stephanie Pell: I'm Stephanie Pell, Senior Editor at Lawfare, and this is the Lawfare Podcast, February 28th, 2024.
On February 13th, Senator Ron Wyden released a letter documenting an investigation his office has been conducting into the activities of Near Intelligence Inc., a data broker that allegedly enabled an anti-abortion organization to target anti-abortion messaging and ads to people visiting 600 Planned Parenthood clinics across the United States. I sat down with Justin Sherman, CEO of Global Cyber Strategies and a Senior Fellow at Duke University's Sanford School of Public Policy, to discuss this investigation. We talked about the various players in the data broker ecosystem that enable these invasive practices, the lack of federal legislation governing and preventing these activities, and what actions the FTC might be able to take against Near Intelligence, Inc.
It's the Lawfare Podcast, February 28th: Justin Sherman on Senator Wyden's investigation of Near Intelligence, Inc.
Justin, you're back to talk about another data broker allegedly selling data that reveals sensitive places that people have visited. Senator Ron Wyden recently wrote to the Chair of the Federal Trade Commission and the Chair of the Securities and Exchange Commission about this data broker, Near Intelligence Inc., which we will call Near, because of an investigation his office undertook in May of 2023 after the Wall Street Journal revealed that an anti-abortion organization used location data from Near to target anti-abortion messaging and ads to people who had visited reproductive health clinics. As Senator Wyden explains in his letter, the Veritas Society, a nonprofit established by Wisconsin Right to Life, allegedly hired the advertising agency Recrue Media to place these ads from November 2019 through the summer of 2022. And one of the interesting things we learned through this letter and Senator Wyden's investigation is more about both the data broker ecosystem and how anti-abortion ads and messaging can be delivered to people visiting places that provide reproductive health services. Can you pick up the narrative at this point and tell us about what Senator Wyden's staff learned about this particular targeting campaign and who they learned it from?
Justin Sherman: Senator Wyden's team learned quite a bit during the course of their investigation through a bunch of different mechanisms, including interviews with the involved organizations. And just to preface this for those who perhaps have not listened to or read any previous Lawfare work on data brokers, this is a really opaque industry. And as you noted, it is very notable anytime we learn something new about how the data broker ecosystem operates, and so there's a lot of interesting information in here.
As you mentioned, the Wall Street Journal had reported about midway through last year that there was an anti-abortion group using cell phone location data to target ads to people who were visiting Planned Parenthood clinics. And there wasn't a ton of information in that reporting. It did name the Veritas Society, as you mentioned, and a couple other specific pieces of information about the campaign. But overall, there was still a lot in the dark. It wasn't really known the scope of the campaign. There was a lack of information at the time about exactly how the location data tracking had operated. And so essentially, after this happened, Senator Wyden's office had started an investigation, and I'll go through the main couple pieces that we've learned from the Senator's office and a letter they published about this investigation.
The first piece is that they discovered that the Veritas Society, this Wisconsin right-to-life nonprofit, had hired an advertising agency called Recrue Media. And this was the company that had placed advertisements from November 2019 through the summer of 2022, both before and after the Supreme Court issued the Dobbs decision. And when Senator Wyden's team spoke with this company, Recrue Media, they said that they had used this location data broker to draw a line literally around Planned Parenthood clinics in the U.S. and their parking lots and use that line that they drew to say, “Within this area we're going to run targeted advertisements based on location data.” And so that was an interesting finding, right, that this is how this happened. After that, Senator Wyden's office spoke with the second main involved company, this company, Near Intelligence. And this is a location data broker. And the Chief Privacy Officer of this company had said that until 2022, the company had zero controls in place to prevent anyone buying this location data from targeting people who visited sensitive facilities to include reproductive health clinics. We can get more into that, but I'll just say it's an interesting look at how organizations can tap into data collected by location data brokers to target people with advertisements, to track people, to geofence particular areas. And then they can also involve other kinds of companies, such as advertising technology companies to deliver advertisements to those people. All told, the Veritas Society, the anti-abortion nonprofit behind this campaign, said that in 2020 in Wisconsin alone, so in that specific year, in that specific state, it delivered 14.3 million ads to people who visited abortion clinics, and delivered those ads across social media, Facebook, Instagram, Snapchat. 
And then the big number which was reported out in Politico of the entire national scope of the campaign was that this nonprofit used the location data to target people visiting 600 Planned Parenthood locations in 48 states. A quite sweeping targeting campaign.
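[Editor's note: For readers curious about the mechanics, the geofencing described above, drawing a line around a clinic and its parking lot and checking which devices fall inside it, can be sketched in a few lines of Python. The polygon, coordinates, and device IDs below are hypothetical illustrations for this explanation, not Near's actual system or data.]

```python
# A minimal sketch of geofence targeting, under stated assumptions:
# the "fence" is a polygon drawn around a facility, and each device
# reports location pings that are tested against that polygon.

def point_in_polygon(point, polygon):
    """Ray-casting test: does `point` (x, y) fall inside `polygon`,
    given as a list of (x, y) vertices?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray from the point through this edge.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical fence around one clinic and its parking lot (arbitrary units).
clinic_fence = [(0, 0), (10, 0), (10, 6), (0, 6)]

# Hypothetical device pings: only devices inside the fence would be targeted.
pings = {"device_a": (4, 3), "device_b": (12, 3)}
targeted = [d for d, loc in pings.items() if point_in_polygon(loc, clinic_fence)]
```

Real systems run this kind of test on latitude/longitude polygons over continuous streams of location pings, across hundreds of sites and millions of devices, which is what makes a 600-clinic campaign feasible.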
Stephanie Pell: And Justin, as I understand it, these ads were delivered to people on their personal devices, potentially while they were at various Planned Parenthood locations.
Justin Sherman: That's correct. On Facebook, on Instagram, on Snapchat, and potentially other social networking sites. Those were the three that were named by the Senator's office and by the Wall Street Journal, but it is written such that there could be others as well. On those platforms and apps on people's devices, they would get this anti-abortion messaging. Since this news came out, all three of those networks I just named, Facebook, Instagram, and Snapchat, have said that such ads violate their policies and that they will reject these kinds of campaigns if anyone attempts to run them in the future.
They would run these ads, though, to these apps, to these platforms, on people's devices, when they were within the vicinity of a Planned Parenthood clinic that we just described, right? That could be the parking lot as well for these hundreds of different clinics. And the targeted ads would reference the fact that people did not have to make a particular medical decision, for example. One ad that was served to people's devices on Facebook said, quote, "Took the first pill at the clinic. It may not be too late to save your pregnancy," unquote. They would run these kinds of ads to the person's social media platform on their device. Then if you clicked on the ad, it would then take you to a site that was registered to the Veritas Society, where it would provide other messaging designed to influence medical decisions, such as "I want to undo the abortion pill," or "I am thinking about the abortion pill." And then there was additional tracking and targeting on that site. The sort of tracing of data about people while they're at health clinics continues. But they were delivered right to personal devices, and as we had said, the use of the brokered location data is what enabled it to be focused on people who were literally at clinics or entering clinics.
Stephanie Pell: And Justin, while this should be readily apparent, can you speak to how privacy invasive such practices are?
Justin Sherman: This is highly invasive for a couple of reasons, right? So one bucket is related to the fact that this is medical information and medical decision-making, and people don't know this tracking is going on. So that's the first bucket. The second bucket is related to the location data. I'll take those in order. The first is that it should go without saying, but people visit a Planned Parenthood clinic for a variety of reasons: that could be to get a particular kind of test, that could be for an abortion, that could be to get educational materials, that could be to drop off or pick up a family member, right? Statistically, most people going to these locations are engaged in a wide variety of activities that are not related to abortion. There are a lot of different reasons for people to visit a Planned Parenthood clinic. That is, of course, a very private and sensitive set of decisions, right? We, in the United States, have a general sense that if we visit the emergency room for a cut on our hand, or if we stop by the urgent care clinic because we have a sore throat, our doctor is not going to go sell that information on the street corner, right? That is a confidential set of activities that's protected by law. That's the general sense most Americans have. That's, of course, not the case here, when you have companies that are gathering location data from your device and selling that information such that other organizations and individuals can learn that you're visiting a medical facility. That in and of itself, I think, is invasive, because it violates that expectation of privacy that people have. And sometimes you'll hear companies pooh-pooh that, or advertisers dismiss it, arguing that saying something's creepy or unexpected doesn't mean it's harmful. But I would disagree with that.
If you do not expect a certain decision to be subject to surveillance, or if you don't think that information about you in a medical setting is going to be tracked and disclosed, that is an important expectation. So that's the first bucket of why this is invasive is people have no expectation this is happening, and they also don't interact with the companies doing this tracking that much, right? People are not interacting with Near Intelligence on a daily basis in terms of going to the website or sending emails to people, right? So there's a lack of understanding of that tracking.
And the second bucket for why this is so invasive for people's privacy relates to location data specifically. So I always say location data is really sensitive for at least three reasons. One is that it is an incredibly sensitive piece of information because it's the equivalent of literally following somebody around with a notepad. You can actually trace where people are going in real time, and that's very dangerous, right? Knowing where someone is at a given moment can allow you to track them, to target them, to stalk them, right? That's also related to the second reason it's so sensitive. It's not just knowing where someone is at a particular point in time. Location data also allows you to infer a lot about somebody based on what they're doing. If I follow someone around and they go to a bank that specializes in loans to low-income families, or they go to a gay bar, or they go to a mental health treatment center, or, in this case, somebody goes to a Planned Parenthood clinic, it's not just knowing that they're there; you can learn a lot or infer a lot from that, right? You can start to make inferences, as these companies and data brokers were doing, about people visiting a Planned Parenthood. And we can get into how accurate that is, right? As you said, people visit Planned Parenthoods for a wide variety of reasons, and maybe a broker is bucketing them in a certain way. But the third main reason, just to wrap here on this point, that location data is so sensitive is that it's incredibly difficult to anonymize. And we can get into that more, but if you're attempting to trace an individual device, as in this case, or you're attempting to track a device's location so you can run a targeted ad, as this anti-abortion group did, the conversation about quote unquote "anonymization" is often a smokescreen, because it's really difficult to do anyway.
And if you can trace and run ads specifically to one person's device based on their location, there are no effective privacy protections in place.
Stephanie Pell: Now you've done a lot of work on the need for federal law regulating data broker practices. For folks who may not appreciate how this kind of targeting and tracking can happen, can you give us a sense of the void that exists in the law, and what, in your view, needs to occur to prevent these kinds of practices from continuing to happen?
Justin Sherman: There is a pretty significant gap in the law federally, and also at the state level in most cases, when it comes to the collection and sale of location data. Data brokers largely are not regulated. There's gaps in health, there's gaps in lots of different areas. In this case, primarily talking about location and somewhat about health, the main federal law implicated in this area is HIPAA, the Health Insurance Portability and Accountability Act. This is often referred to as the U.S.'s federal health privacy law, but as I just mentioned, the P is "portability". So there are some privacy controls in place for a couple categories of entities that are covered under that law. In other words, if you're a hospital, or you're a healthcare clearinghouse, or some others, you're covered by that law.
But lots of data brokers, mobile apps, social media companies, advertising technology firms, and telehealth services have no coverage under HIPAA, and so they can collect and sell what would otherwise be considered health information with no restrictions. And then in the location data space, there's basically no restriction at all on the ability to collect and sell data. And so what happens here in practice is that thousands and thousands of mobile apps on people's phones, whether you use an Android phone, or an iPhone, or a Huawei, or something else, collect location data. This is not a surprise to most listeners, right? We use a ride-sharing app and we want the driver to know where to pick us up. Or we're navigating ourselves and want to pull up GPS. Or maybe you're in a new city and your built-in weather app automatically updates the weather to your new location. Lots of apps collect location data.
What most of us don't know is that a lot of those apps then sell the location data. And they can take aggregated information about where people travel, all the way to real time, as in, exactly where you are at this moment, and then five seconds from now, and then five seconds from then, and sell that tied directly to your device, sometimes with your name, sometimes with a special advertising ID just for you. It's a huge market for selling this, but again, it's partly because there's a gap in law and regulation that does not limit the collection and sale of location data in this way. And as I noted earlier with the Wall Street Journal reporting, right, the social media platforms, such as Facebook and Snapchat, mentioned that this kind of manipulative advertising campaign to influence people's medical decisions violated their platform policies, which, if that's true, is a good thing, but it certainly doesn't violate existing laws. And so that really is a huge gap in this area.
Stephanie Pell: And is there any activity going on in the states, which again, obviously doesn't have nationwide impact, but is at least improving the situation?
Justin Sherman: Yes. There is a little bit of action at the state level focused specifically on the sale of location data. And currently, I think the real leader in this area is Massachusetts. There is a bill in the Massachusetts legislature, dubbed the Location Shield Act, that would essentially ban companies completely from selling, leasing, trading, or renting location data. It's pretty encompassing in how it talks about the ways that people can broker information about our phone location. It would force companies to get users' explicit consent before collecting location data in the first place. And it would have some carveouts, such as responding to emergencies. There are certainly services for different communities, including to increase accessibility of 911 call centers that might use location data. That's carved out of the bill. It's overwhelmingly targeted at this kind of practice, where consumers pull up their phone, they walk down the street, they go to a medical facility, they go to a voting booth, they go to their kid's school, they go to a military base, and they have no expectation that their phone app that has access to their GPS location is then selling that to a data broker, who then sells that to a bunch more people. It's an interesting bill. It's been moving along in the legislature. It's a really great model, I think, for other states who might look at this case we're talking about and say, "This is very concerning." Or think about all the other ways it could be used, right? We're, importantly, right now, talking about this specific instance with Planned Parenthood, but it certainly goes far beyond that. And some of the examples I just mentioned, whether it's tracking children, including people under the age of 13 years old, that people sometimes mistakenly believe have a ton of privacy protection. It could be tracking people visiting a gay bar, it could be tracking military service members, right? 
There's a lot of harm when it comes to location data, so Massachusetts has an interesting bill that other states might want to look at in this area.
Stephanie Pell: Now, Near Intelligence was also the subject of a second Wall Street Journal investigation, which the Wyden letter discusses. Can you tell us about this investigation?
Justin Sherman: Yes. After the May 2023 reporting by the Journal about the Planned Parenthood targeting, there was a second article, also by the Wall Street Journal, so credit again to those reporters, that came out in October of 2023. And this time it was focused on Near Intelligence selling location data to the U.S. government. In particular, they reported that Near was selling location data to defense contractors first, and then the defense contractors were reselling that data to the Defense Department and to intelligence agencies. So after this reporting came out, Senator Wyden's office had another call with Near Intelligence and their Chief Privacy Officer, and the Chief Privacy Officer confirmed that they had been selling location data to this defense contractor for three years. And this privacy officer added that he joined the company in the middle of this period of selling this location data to the U.S. government, that he had reviewed the company's practices, and that he had determined that Near Intelligence was selling this location data without user consent. And there were all these kinds of questions that came up then about how this data broker was handling the location data. Is it permitting its customers to just use the data for their own purposes? Which might already be concerning, right? But then, can they resell it again? What are the controls there? It provoked all these questions that Senator Wyden's office got into in their letter, and it speaks to an interesting set of issues around privacy risks in this area, the identifiability of location data, and the extent to which this business also involves location data brokers selling data to the government, such as to the Department of Defense and its various contractors.
Stephanie Pell: And the Wyden letter appeared to indicate that the actual practices here contradicted statements that were on Near's website. Is that your understanding of some of what Wyden uncovered?
Justin Sherman: Yes. There's a lot of names to keep track of, so I'm trying to minimize a little bit, but Near Intelligence, this location data broker, sold data to this defense contractor, and despite doing so, Near Intelligence had put on its website, as you just alluded to, that the company did not sell data to defense entities or governments. And after this privacy officer joined in the middle of this roughly three-year window of selling the location data to the Defense Department, this person pushed the company to remove those statements from the website, because it was, of course, I would argue, a deceptive practice to be saying, "We are not doing X," and then to be doing precisely X, right? So that was certainly an important detail in this instance, and it is not unheard of, I'll just say, generally in the location data space, where some companies might intentionally be deceptive and other companies might not put as much care into their public statements and websites as they should. And so maybe they just don't even notice that, hey, we have this language on the website directly contradicting our data practices. That doesn't excuse it, but they neglect the fact that it's there.
Stephanie Pell: And this investigation by Wall Street Journal and Senator Wyden also was very interesting insofar as it showed how all of the various actors interact in this data broker ecosystem, such that location data gets from your phone, sold somewhere, ads are delivered to your phone. There was a particular reference to the fact that an advertising exchange cut off Near's access to their services. First of all, what are advertising exchanges, why was Near cut off, and what perspective does this give us about how all of these actors interrelate in this ecosystem?
Justin Sherman: Yeah, advertising exchanges are essentially the online automated auctions where companies can bid on getting ad access to people's devices. In real time, companies basically sit, for lack of a better word, in these automated auctions, and they are provided offers: "Oh, you can reach X many thousands of consumers who are in this age range and have these interests and have spent this much money on cars and alcohol, and here's the cost of running ads." And then all these companies automatically bid, at auction, on getting access to those consumers. That's, especially for a techie, right, a very simple explanation. There are certainly great explainers out there for folks who want to dig in more, but that's really what it is, right? This is at the core of how a lot of digital advertising happens, how ads appear on our phones. This is a lot of the backend of how companies reach people with messaging.
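[Editor's note: The real-time bidding flow described above can be sketched roughly as follows. The segment fields, prices, and bidder names are hypothetical illustrations; real exchanges run OpenRTB-style protocols with far more fields, in milliseconds.]

```python
# A minimal sketch of one ad-exchange auction, under stated assumptions.
# The exchange describes an ad slot and the audience segment attached to it.
bid_request = {
    "segment": {"age_range": "25-34", "interests": ["cars"], "location": "near_site_x"},
    "floor_price": 0.50,  # minimum acceptable bid, dollars per thousand impressions
}

# Each bidder (an advertiser's automated agent) responds with a price.
bids = {"advertiser_a": 1.20, "advertiser_b": 2.75, "advertiser_c": 0.40}

# The exchange drops bids under the floor and awards the slot to the highest bid.
eligible = {name: b for name, b in bids.items() if b >= bid_request["floor_price"]}
winner = max(eligible, key=eligible.get)  # "advertiser_b" wins this auction
```

The relevant point for this discussion is that the exchange broadcasts detailed audience attributes to every participating bidder, which is the stream of data a broker sitting in the auction can collect.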
So what's interesting here is these advertising exchanges were a source of location data for this data broker, for Near Intelligence. And I had mentioned mobile apps earlier. Those are a very significant source of data for location data brokers. There would be a much smaller market for everybody's phone location data, and where are we going, and our movement patterns, and all of this, if mobile apps were not selling our location data, right? So that is a big part of the market. There's also, which is even more opaque, this area where brokers basically can sit in on those advertising exchanges. They can get lots of data about people through them, and then, which often violates the terms of service of these exchanges, can resell the data, such as Near Intelligence did.
All to say, this is what the Wall Street Journal story in October of 2023 also revealed: okay, we know this location data broker has been assisting with targeting people visiting Planned Parenthoods. We know it's been selling people's location data to defense companies. Now we're also learning that they are going to these online ad exchanges and getting real-time data about people's locations, and that's what they are using to facilitate their data broker business. And the advertising exchanges, after this was reported, cut Near Intelligence off, because it was a violation of the terms of service to take the data out of the exchange and then go sell it to other people. And so that, I think, was a major finding here. It did not stop Near Intelligence from selling people's location data, but certainly the ad exchanges were not happy that practice was occurring.
Stephanie Pell: And it appears then that it's up to each organization to make its own rules to govern the practices of the customers or associates that get access to this data. This really illustrates the significant gaps in the regulation of the collection and sale of location data.
Justin Sherman: This is correct. So there are lots of conversations to be had about pros and cons and issues with online advertising. Some online advertising companies, whether or not I agree with their privacy practices, do have some efforts to say, "Here are guardrails, here are controls, here's what we're going to allow and not allow." That's much less the case in the location data broker space. That's why it's so interesting, as you pointed out, that you get these cases where an ad exchange is getting data on probably millions of Americans, all kinds of stuff. What are they viewing? What are their interests, age, religion, you name it, right? Tons and tons of sensitive data. They're providing that to thousands of companies through these automated systems. Companies can then bid on access to those people. And yet, for all of those activities going on, what some of these exchanges don't want happening, in part because it might undercut them, in part because it looks bad and other reasons, is people sitting in there just to steal and resell data, basically, or to siphon off and resell data. And yet, some data brokers do that. And so, it absolutely underscores the lack of regulation, which we 100 percent need at the federal and state level. It also underscores a lack of best practices. And I always emphasize regulation should be first, but every industry, I think, should also have just standards and best practices. And when so many location data companies engage in these kinds of activities to get access to millions of Americans' phone locations, to figure out which bar people are visiting, which medical center are they going to, do they stop at a school before they head off to work in the morning, that entire ecosystem really also needs industry standards that say, "We're not going to do things like sit in ad exchanges and hoard all the data." 
But again, that's probably an unrealistic goal, because that's exactly what these companies need to do to make money and get access to that location data.
Stephanie Pell: And as we noted earlier, Senator Wyden addressed his letter to the chair of the FTC and the chair of the SEC. Based on these revelations, does it appear to you that there is a basis for the FTC to investigate Near Intelligence?
Justin Sherman: I think so. The FTC's two authorities in this area are the ability to investigate business practices that are deceptive and the ability to investigate business practices that are unfair. And so in the first case, as we had mentioned, Near Intelligence had published statements on its website--allegedly, I guess, since I have not looked at the archive--but allegedly published statements on its website explicitly stating that it was not providing location data to governments or to defense entities, yet it was doing so. So that is a classic example of something the FTC might investigate to say, "Consumers who bothered to look at your website," I don't know who would even do this, but, "Consumers who bothered to look at your website had an expectation that you were not selling location data to defense organizations and government organizations. Because you said that, yet you were doing so anyway, that's deceiving consumers." So there's an interesting deceptiveness claim there.
The second piece relates to unfairness. And this is a much trickier question. Folks can go listen to the last podcast I did about this on X-Mode, a different location data broker, but the FTC lately has been advancing an argument that selling location data about Americans who visit sensitive locations, such as a kid's school, a place of worship, or a medical facility, while they're being tracked by these location data brokers, is an unfair practice if the data is clearly linked to that person. And so what could happen here, I would imagine, is the FTC might say, "This company was gathering location data about people who did not know it was doing so. It was gathering this location data about people visiting health clinics, and it was clearly linking that location data to specific devices because it was then allowing people to run manipulative messaging to those phones." And, who knows, right? We can't say what they are or are not doing. But I do think that the FTC has some basis to at least look into this issue and to potentially pursue some kind of enforcement action.
Stephanie Pell: And we should probably also note then that in bringing the matter to the attention of the chair of the SEC, the Securities and Exchange Commission, Senator Wyden was urging the SEC to look into whether some of the statements Near made to investors about Senator Wyden's own investigation may constitute securities fraud.
Justin Sherman: This is correct, and I won't go much in depth here. I'll stay in my lane. I'm by no stretch an SEC expert, but this was, as you mentioned, a core piece of the letter as well, is that Senator Wyden was essentially saying that some of the statements Near Intelligence had made to investors could constitute securities fraud. For instance, information related to its privacy practices, information related to individuals and organizations to whom they were selling data. Again, hard to say what will come out of this, but there at least is a call for these two organizations to look into Near Intelligence and the location data and privacy practices.
Stephanie Pell: In your view, is this a matter we need to continue to watch?
Justin Sherman: I think so. We've seen some data brokers in this space get covered in one or two news articles, maybe, and then there's not much more there, or there's an FTC action and then there's not much more there. Near Intelligence is up there in the location data space in terms of the number of articles and investigations and different things learned about this company. As we mentioned, this started with the Wall Street Journal reporting about a right-wing group hiring an ad firm to use this data to target people visiting Planned Parenthood clinics. We then learned, again from the Journal, that this company was also selling data to defense contractors, and that they were violating the terms of service of ad exchanges to get all this location data on people and resell it. We learned from Politico and Senator Wyden's office that it was, in fact, 600 clinics in 48 states that were targeted, and that there are all these other issues and deceptive practices going on. I do think it will be a matter to continue to watch. And the same goes for whether the FTC is going to do anything here. The FTC has been pumping out cases and, among its different privacy areas, focusing a fair amount on location data, which I think is great, right? Because it's, again, a very risky and sensitive type of data. And so there's an open question as well whether there's going to be any kind of follow-up investigation or an order for the company to cease these practices. And the reason I say this is because Near Intelligence declared bankruptcy late last year in 2023, but is reportedly, and I have been digging around a lot on this, in the midst of being purchased by a venture capital firm. And so this has happened before. This happened to X-Mode, where X-Mode got a bunch of public scrutiny, there was some restructuring, some name-changing, this kind of thing. And the company basically continued business as usual.
So even though Near Intelligence, technically, as that legal entity, is bankrupt and so on, it is highly likely that this entity continues operating, because the venture capital firm is quickly moving to purchase all of its assets.
Stephanie Pell: Very interesting. Anything else you'd like to share with our listeners?
Justin Sherman: Stay on the lookout for state privacy laws in this area. I think that, obviously, there are a lot of really smart and dedicated folks in and out of government who've spent a long time fighting for a federal privacy law that continues to stall, right? But there's a lot of action at the state level, and that's comprehensive privacy laws, that's giving residents of the state specific privacy rights. For anyone listening who's concerned about this, whether it's medical targeting or targeting generally, as I said, of military members, children, or places of worship, I would check out the Massachusetts Location Shield Act, because it's a great starting template for what other states might want to implement.
Stephanie Pell: We'll have to leave it there for today then. Thanks so much for joining me.
The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters. Please rate and review us wherever you get your podcasts.
Look out for our other podcasts, including Rational Security, Chatter, Allies, and The Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6th. Check out our written work at lawfaremedia.org.
The podcast is edited by Jen Patja. And your audio engineer this episode was Cara Shillenn of Goat Rodeo. Our music is performed by Sophia Yan. As always, thank you for listening.