
The Lawfare Podcast: Justin Sherman on the FTC Settlement with Location Data Broker X-Mode

Eugenia Lostri, Justin Sherman, Jen Patja
Friday, January 19, 2024, 8:00 AM
What are the implications of the recent FTC action against data brokers?

Published by The Lawfare Institute in Cooperation With Brookings

Last week, the Federal Trade Commission (FTC) reached a settlement with location data broker X-Mode Social. X-Mode collects over 10 billion location data points from all over the world every day, and sells that data to clients in a range of industries, like advertisers, consulting firms, and private government contractors. The FTC argued that the data broker was conducting unfair business practices, including selling people’s sensitive location data.

To discuss the FTC settlement and its implications, Lawfare's Fellow in Technology Policy and Law Eugenia Lostri sat down with Justin Sherman, Founder and CEO of Global Cyber Strategies and a Senior Fellow at Duke University’s Sanford School of Public Policy. They talked about the FTC’s groundbreaking decision to list sensitive locations about which X-Mode cannot sell data, the likelihood that we will see further FTC action against data brokers, and the persistent need for comprehensive privacy legislation to better address harms.

Below is a transcript of this podcast. Please note that the transcript was auto-generated and may contain errors.

Transcript

[Introduction]

Justin Sherman: This is very specific, so there's no ambiguity for X-Mode in terms of what do we need to do internally to make sure that if someone emails one of our salespeople and says, hey, we'd love data about this children's health clinic, that salesperson, theoretically, should look at that list and say, hey, this hits two of the boxes here. I cannot sell this to you. From that standpoint, I think, you know, it's a really important piece of the order. And again, it's unprecedented. So, you know, the FTC is forging a new path here in thinking about location data harm.

Eugenia Lostri: I am Eugenia Lostri, Lawfare’s Fellow in Technology Policy and Law, and this is the Lawfare Podcast, January 19, 2024.

Last week, the Federal Trade Commission, or FTC, reached a settlement with location data broker X-Mode Social. X-Mode collects over 10 billion location data points from all over the world every day and sells that data to clients in a range of industries, like advertisers, consulting firms, and private government contractors.

The FTC argued that the data broker was conducting unfair business practices, which included selling people's sensitive location data. To discuss the FTC's settlement and its implications, I sat down with Justin Sherman, founder and CEO of Global Cyber Strategies and a senior fellow at Duke University's Sanford School of Public Policy.

We talked about the FTC's groundbreaking decision to list sensitive locations about which X-Mode cannot sell data, the likelihood that we will see further FTC action against data brokers, and the persistent need for comprehensive privacy legislation to better address harms. It's the Lawfare Podcast for January 19: Justin Sherman on the FTC Settlement with Location Data Broker X-Mode.

[Main Podcast]

Eugenia Lostri: So Justin, can you start by walking us through the services that are provided by the data broker in question, X-Mode Social, and its successor, Outlogic? Just a quick note here to the audience that, following the language used in the FTC complaint, we are going to be using X-Mode throughout the episode, but we're referring to both companies.

Justin Sherman: Yeah, and that's how people refer to the company anyway, is X-Mode. Yeah, I mean, X-Mode is one of the most notorious location data brokers actually in the data broker industry. And part of that is because X-Mode is one of the larger data brokers in the country that sells geolocation data, but it's also because X-Mode has received a lot of media attention before.

Listeners, for instance, may recall a news story a few years back that there was a Muslim prayer app that was basically giving users location data over to a data broker, which was in turn selling it to U.S. military contractors. So that story related to X-Mode. So there's been a few of these stories in the media about how X-Mode is gathering quite a lot of location data on millions and millions of Americans and then selling it to its customers.

And so that's what the FTC talked about in its complaint. It talks about how X-Mode sells location data on consumers to hundreds of different clients in all kinds of industries. They sell to real estate companies, they sell to financial firms. They also sell to government contractors, and they call themselves, in their marketing material, the second-largest location data broker in the U.S.

So when I say location data, what does that actually mean? Well, this company, as with many other location data brokers, will approach mobile app developers. They will pay them to put a software development kit, a piece of code, in the app. And basically what this does is, when you open up an app to maybe check the weather or get directions or something like that, you think you're only giving your location data to just that app, but the app is in fact profiting by passing it off to the broker, who then sells it. So this is the core X-Mode business model. And so the FTC talks about this practice.
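To make that data flow concrete, here is a minimal sketch in Python of how an embedded location SDK of this kind could behave. Everything here, the class name, the method names, the app and advertising identifiers, is a hypothetical illustration of the general pattern described above, not X-Mode's actual code.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class LocationPing:
    advertising_id: str  # persistent device identifier (e.g., a mobile ad ID)
    latitude: float
    longitude: float
    timestamp: float
    app_id: str          # which host app produced the ping

class EmbeddedLocationSDK:
    """Hypothetical SDK a broker pays an app developer to embed."""

    def __init__(self, app_id: str, advertising_id: str):
        self.app_id = app_id
        self.advertising_id = advertising_id
        self._queue: list[LocationPing] = []

    def on_location_update(self, lat: float, lon: float) -> None:
        # Called whenever the host app reads the device location, say,
        # to show a weather forecast. The same fix is silently queued
        # for the broker as a side effect.
        self._queue.append(LocationPing(
            advertising_id=self.advertising_id,
            latitude=lat,
            longitude=lon,
            timestamp=time.time(),
            app_id=self.app_id,
        ))

    def flush_to_broker(self) -> str:
        # A real SDK would POST this batch to the broker's ingest
        # endpoint; here we just serialize it to show what leaves the phone.
        payload = json.dumps([asdict(p) for p in self._queue])
        self._queue.clear()
        return payload

# The user only ever asked the app for the weather...
sdk = EmbeddedLocationSDK(app_id="weather.example", advertising_id="3f2a-9c1d")
sdk.on_location_update(38.8977, -77.0365)
print(sdk.flush_to_broker())  # ...but the broker receives the ping anyway.
```

The point of the sketch is the side channel: the app's legitimate location request and the broker's data collection are the same call, which is why consumers never see the transfer happen.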

And the second thing that X-Mode does on top of selling the actual location data, is it sells segments of consumers based on the location information. So it will look and see whether or not certain people are visiting a military base. Are they visiting a particular, I don't know, children's school, right? And then they will piece that information into lists of people who fit into that category. So those are kind of the two data activities that the FTC focused on in this particular complaint.

Eugenia Lostri: Right, and this might not be the most important part of the complaint, but I did find it interesting that not only are they getting this data from the companies that they're approaching, these other apps, I think the complaint says like 300 apps, they also have two of their own, and it's not necessarily clear to the consumer that these apps are owned by X-Mode.

And I did find it a little bit icky that one of them, you know, offers you breadcrumbs of where you've been when you were drunk, and if you need to retrace your steps and all this while they're benefiting from, you know, your kind of vulnerable circumstances just to get all of your location data.

Justin Sherman: Exactly. And just to say the names of these apps because they are I think both strange and perhaps amusing that you just referenced, right? One of these apps that X-Mode built was called Drunk Mode. As you just said, it has a number of features to, you know, basically walk back where you've been when you perhaps were intoxicated and don't remember, I don't know, which route you took home from the bar and you dropped your keys.

There's also other things that come up on the app store if you search for ‘Drunk Mode,’ like there's a keyboard purportedly to stop you from sending unwanted texts, you know. So there's all kinds of things going on there. And the second is Walk Against Humanity is the name of this second X-Mode app. And this app, they brand as a fitness app.

But to your point, it's a similar situation where you might use these apps. And like we all do, they pop up a privacy policy, you click through it, you don't look at it. But then in fact, this app that's purporting to help you retrace your steps, or has some GPS tracking associated with your fitness, your running, is then taking that and packaging it and selling it to retailers and selling it to military contractors and figuring out which kind of building you went to, and now we're going to put you in this particular type of consumer category. So, you know, definitely creepy, but it is interesting because a lot of location data brokers don't have that second piece where they build their own app. And so it's certainly an interesting feature of this litigation.

Eugenia Lostri: Right, so the FTC complaint alleges that some of X-Mode's business practices violate Section 5(a) of the FTC Act, which prohibits unfair or deceptive acts or practices in or affecting commerce, and charges X-Mode with eight counts. So to set the scene, could you describe what are the practices that concern the FTC?

Justin Sherman: Sure, and as you referenced, this is really a key authority for the FTC, this unfairness test, as well as deceptive practices. And in this case, it was not that they were arguing that X-Mode was deceptive, but that its practices were unfair. And there are a couple core pieces of unfairness: it causes substantial injury to consumers, it's not something you can reasonably avoid, and there's no countervailing benefit to consumers or competition that outweighs it.

So, here the FTC zoomed in on X-Mode selling sensitive data. So, some of the data that X-Mode was selling could be tied to medical facilities, to reproductive health centers in particular, to places of religious worship, to temporary shelters. So this could include shelters for people who are homeless, shelters for survivors of domestic or gendered violence, and so on. And so this claim was basically saying, well, this is very sensitive information. And in this case, the consumers are being tracked all over the place, right? This is an unfair practice.

There was also a count that X-Mode was not fairly honoring consumer privacy choices, because there were cases in which consumers, in their phone settings, right, you know, you have your 'do not track' settings, people had turned on limiting of location tracking in their phone settings and X-Mode was still gathering data. And that's a pretty obvious unfairness claim, in my opinion, to say that that was an unfair practice.

And then there are several others. The FTC says that merely collecting data in this way, where you don't really have consumers' full affirmative express consent, is not fair. And when you take that location data and then produce sensitive categories off of it, around health conditions or, you know, places of worship, that is additionally producing sensitive data, and it's only for marketing purposes. That's also unfair.

Eugenia Lostri: So can you briefly describe what are the harms that could come to consumers from these practices and why the FTC was interested in, in looking at these unfair practices?

Justin Sherman: There are lots of harms that come from location data. I always say there's kind of a couple reasons for this.

One is that location data from your phone is the equivalent of someone following you around 24/7 with a notepad, writing down everywhere you go. Did you stop walking for 10 seconds at that intersection to tie your shoe? Did you take a left or a right? Did you, you know, go into the second floor of the shopping mall or the first floor, right? There are even companies that track that, which is kind of creepy. So the first piece, right, is just the fact that it generates a lot of data about you. And with that information, you can learn a lot about a person, including where they sleep at night, where they go to work, places they frequent, whether that's a supermarket or school or a bar.

The second reason location data is so sensitive is you can learn a lot of additional information off of it. Right, so as we've talked about already, it's not just the pings of where your phone is at any particular minute. If you think of a street map, and we're going to make a dot with a marker every minute where your phone is, right? It's not just those pings, but it's also the places to which you're traveling. Are you visiting a medical facility that specializes in treating people suffering from cancer or from HIV, right? That is actually a really, really sensitive piece of information that can be learned from your location data. So there's that second piece where it can expose additional things about you, such as religion, such as if you have children, such as even if you're having an affair, such as if you have a particular medical condition, your sexual orientation, even something like immigration status. So that's quite sensitive.

And the third piece is that location data relates to you and where you go. And if you have a full map of where someone is all day, it's pretty easy to figure out who that person is, right? Because there are a lot of movements that are unique to us, whether that's where we go at the end of the day, whether that's where we spend a lot of our time. Are there other phones in the house? Do we live with housemates? Do you have family members? That kind of thing. So there's all these reasons location data is really sensitive. And so companies and others can do very harmful stuff with that. Companies have bought location data to learn about all kinds of purchasing practices, travel habits, medical conditions, other things about consumers. Law enforcement can buy data about Americans' locations without needing a warrant or a court order. And so there's a whole host of civil liberties issues there. So there's really a lot of harm tied up in this space.
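As a toy illustration of that third point, here is a short Python sketch, with made-up coordinates and a hypothetical function name, showing why raw pings identify people even with no name attached: the grid cell where a device sits overnight, night after night, is usually its owner's home.

```python
from collections import Counter
from datetime import datetime

# Toy dataset: (ISO timestamp, latitude, longitude) pings for a single
# advertising ID. The coordinates are illustrative, not real.
pings = [
    ("2024-01-08T01:12:00", 38.9001, -77.0401),
    ("2024-01-08T02:47:00", 38.9002, -77.0399),
    ("2024-01-08T13:05:00", 38.8977, -77.0365),
    ("2024-01-09T00:30:00", 38.9000, -77.0400),
    ("2024-01-09T23:55:00", 38.9003, -77.0402),
]

def likely_home(pings):
    """Most frequent ~100m grid cell the device occupies overnight."""
    cells = Counter()
    for ts, lat, lon in pings:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:  # rough "overnight" window
            # Rounding to 3 decimal places bins pings into ~100m cells.
            cells[(round(lat, 3), round(lon, 3))] += 1
    return cells.most_common(1)[0]

print(likely_home(pings))  # ((38.9, -77.04), 4): a probable home location
```

Once a probable home cell is known, cross-referencing it against property or voter records is often enough to put a name on the device.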

And I think that's why, to answer your last question, the FTC is so interested in this area. It's a place where the benefits to consumers are marginal, if they exist at all, right? Some of the most compelling, quote unquote, things that companies will say for why we need location data is, you know, delivering you a better ad while you're passing, you know, Starbucks or Dunkin Donuts, on your web browser. Versus all of the harms, right, all the things we just mentioned, the scales are pretty tipped. You know, between this and some of the other ongoing cases the FTC has announced, I think location data will remain a big focus for the coming years.

Eugenia Lostri: When you speak about the potential for you to be identified as a person because of these pings for, you know, where we are, it's hard not to think back to one of the conversations that we had on the podcast some time ago, I believe it was November, where we talked about the harms that can come from data brokers that collect and sell publicly available information, right? And so, if this location data is not anonymized, it can be purchased basically by anyone, and there are no limits to what is sensitive and what they can purchase, you know, it is really likely that you could gain a comprehensive picture of someone's full life, including this very private and sensitive information, right? So, how does the settlement from the FTC deal with the risk of this?

Justin Sherman: Yeah, it's an interesting connection point. I do agree with you. There is a lot of overlap between what we had talked about on that past episode, which was about data brokers and public records and government records and stalking and gendered violence. And what's different here, right, is that we're not talking actually about government records, because these location datasets are not something that is exempted from privacy laws. They're not something that, you know, the U.S. legal system has deemed to be super important for things like press reporting.

So, you can actually take a lot of measures, as the FTC has done here, to ensure that this is taken down. And in this particular instance, the FTC did, I think, a good job talking about what it means for data to be linked to a particular person. The reality is that, especially with location data, the actual datasets don't necessarily have things like name in them. Right, they might not say, you know, here's all of these phone pings all day and then this is Justin Sherman's phone, but companies will attach other, what technologists call persistent identifiers, to that data so that you can still track the person. So all of us who have a phone are handed what's called a mobile advertising ID, and that's how advertisers track people across devices.

So the FTC mentioned a bunch of those persistent identifiers in its settlement with X-Mode, and I think that's really good, right, because it shows that the FTC is very much tracking all the ways that companies link data back to specific people. And they're very clear that, yes, you can take a name out of a dataset. Well, guess what? That doesn't mean it's de-identified. That doesn't mean you can't take it and, as you said, link that information to a lot of sensitive characteristics about someone's life, their health, their religion, and so forth. And so that's a really important part of the order.
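To see why a persistent identifier defeats name removal, consider this minimal sketch. The records, field names, and identity dataset are all invented for illustration; the point is that re-identification reduces to a simple join.

```python
# "De-identified" location records: the name is gone, but the mobile
# advertising ID (MAID) persists across datasets.
location_records = [
    {"maid": "3f2a-9c1d", "lat": 38.9001, "lon": -77.0401, "ts": "2024-01-08T01:12"},
    {"maid": "3f2a-9c1d", "lat": 38.8977, "lon": -77.0365, "ts": "2024-01-08T13:05"},
]

# A second dataset, bought elsewhere, maps the same MAID to an identity.
identity_records = [
    {"maid": "3f2a-9c1d", "name": "Jane Doe", "email": "jane@example.com"},
]

# Re-identification is just a lookup on the shared identifier.
identities = {r["maid"]: r for r in identity_records}
for rec in location_records:
    person = identities.get(rec["maid"])
    if person:
        print(f'{person["name"]} was at ({rec["lat"]}, {rec["lon"]}) at {rec["ts"]}')
```

This is why the order's technical and contractual safeguards target re-identification itself rather than trusting name removal alone.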

The other piece that's in there is the FTC has a bunch of requirements that X-Mode ensure that customers don't mess with that. So if data is de-identified, quote unquote, it has to meet a bunch of different requirements under the order, such as implementing technical safeguards to prohibit re-identifying the data to people, and implementing contractual requirements, which is a really important measure, so that if X-Mode sells data that's de-identified, right, there are requirements in there that the person buying it can't just take the data and then re-identify it once they have it.

So, there are a bunch of those measures, I think, in the order that speak to anonymization and re-identification as core concerns for the commission, and as they should be, because this is a space where data brokers all the time will claim that things are anonymized, and that's not always in fact the case.

Eugenia Lostri: When it comes to the contractual obligations between the data broker and the third-party apps, how does the FTC approach the concern that third parties were not necessarily informing their users correctly about who was going to be receiving this data? I think in particular, many were not disclosing that this would end up, you know, being purchased by private government contractors for national security purposes. How is that addressed?

Justin Sherman: That's addressed by the FTC requiring that X-Mode implement what they call a supplier assessment program. And this measure is to address that very problem, to address the fact that X-Mode has been gathering location data from a variety of people who X-Mode says consented to the sale of their data, but who absolutely did not knowingly and fully and affirmatively expressly consent to the sale of their data.

And so, basically what X-Mode now has to do within 90 days of the order, so the clock is ticking, is to create a program within the company to assess whether or not all of their data suppliers have received affirmative express consent from consumers to share the data. So, in other words, if I'm X-Mode and I'm getting data from five mobile apps today, I need to make sure that those mobile apps got affirmative express consent from their users to give me their location data.
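A sketch of what such a supplier assessment could look like in practice follows. The checklist fields and names are hypothetical; the order describes the program's goal, not its implementation.

```python
from dataclasses import dataclass

@dataclass
class Supplier:
    """A mobile app that feeds location data to the broker."""
    app_id: str
    # Hypothetical evidence of affirmative express consent:
    consent_flow_reviewed: bool            # broker audited the app's consent screen
    discloses_sale_to_third_parties: bool  # disclosure names the data sale
    consent_records_on_file: bool          # per-user consent records retained

def assess_supplier(s: Supplier) -> bool:
    # Ingest data only if every consent requirement is documented.
    return all([
        s.consent_flow_reviewed,
        s.discloses_sale_to_third_parties,
        s.consent_records_on_file,
    ])

suppliers = [
    Supplier("weather.example", True, True, True),
    Supplier("flashlight.example", True, False, False),
]
for s in suppliers:
    status = "OK to ingest" if assess_supplier(s) else "SUSPEND ingestion"
    print(f"{s.app_id}: {status}")
```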

And this is really big, I think, because there are a lot of problems, in my opinion, with the way that the U.S. legal system approaches this notion of consent when it comes to data collection, right? We still have this ridiculous, you know, basically factually baseless idea that people read privacy policies and terms of service agreements and that when they click on things, it means they consented.

And, you know, I fully am aware of the fact that lots of laws say that, but in practice, all of the research and polling shows that the vast majority of people do not read this stuff. And I think plenty of listeners might agree that they don't read stuff. And I certainly don't have time to read stuff. And the research shows that no one has time.

So there are these really big problems with lots of apps giving away location data, selling location data without people's consent. But the FTC can't fix these massive legislative problems. What the FTC can do is use its existing authorities to protect privacy against these particular abuses. And so I think what they're doing here is fantastic. It's not, you know, if you're a congressional staffer, this is not the comprehensive solution to pursue. But I think for the FTC, it's really great to say, you're X-Mode, you have to make sure that consumers clearly affirmatively expressly consented and know that they are having their location data, you know, sent to you to be ultimately sold.

Eugenia Lostri: Justin, don't worry, we are approaching the part of the podcast where I ask you about what should Congress do? But before we get there, I want to keep on the settlement for a little bit longer. And one of the big items is that the FTC defines a list of sensitive locations to limit the data that X-Mode can collect and sell. So talk a little bit more about that and if you could maybe tell us what are the pros and cons of this approach.

Justin Sherman: Yeah, so one of the things that stands out most to me about this settlement is exactly what you just mentioned, that basically the FTC has created a list of sensitive locations, and I'll run through those in two seconds, and it has stipulated that X-Mode cannot sell location data about those sensitive locations at all going forward. And in and of itself, that is huge, right? To actually prohibit the sale completely of location data about particular locations. And so how has the FTC scoped this, right? 'Cause that's kind of an interesting idea. What's sensitive, what's not sensitive, what makes the list, what doesn't?

And I'll rattle them off here. There are seven categories that the FTC has said are, in this order, part of the sensitive locations list. The first is medical facilities, and that's quite comprehensive. They list mental health, reproductive health, outpatient, substance abuse, etc. So the first is medical facilities. The second is locations linked with religious organizations. Third is correctional facilities. Fourth is labor union offices. Fifth is locations of entities that predominantly provide education or childcare services to minors. So that would, of course, include things like daycares, elementary schools. The sixth on this list is places held out to the public as mainly providing racial or ethnic origin-based services. So, you know, one can imagine if there's a community center for a particular ethnic group, or if there's an organization that focuses on medical disparities towards the Black community or something, right? That could be covered. And the last thing on the FTC's sensitive locations list is temporary shelters and social services. And they explicitly include here centers to help people who are homeless, survivors of domestic violence, refugees, and immigrants.

So it's a really interesting list, and I do want to praise the FTC for getting this in the order. Again, this is huge. This is unprecedented. And so this is a big deal. And I do think there are a lot of benefits to having this in here. The FTC has come up with a list of some of the most at-risk people and locations to get protections. I say some because, of course, it's not a complete list. We could easily come up with others that are not on here that should be, but the FTC, I think, has done a good job in trying to think about this from a prioritization standpoint and identify the areas in most need of protection.

I think it also makes sense from a data broker compliance standpoint, right? Because rather than say to X-Mode, you can't sell data about places that are quote unquote sensitive and kind of leave that open-ended, right, which, I'm not defending X-Mode, but you know, practically speaking, that is a complicated thing to implement, right? How do I understand what they mean by that? This is very specific. So there's no ambiguity for X-Mode in terms of what do we need to do internally to make sure that if someone emails one of our salespeople and says, hey, we'd love data about this children's health clinic, that salesperson theoretically should look at that list and say, hey, this hits two of the boxes here. I cannot sell this to you. So from that standpoint, I think, you know, it's a really important piece of the order. And again, it's unprecedented. So, you know, the FTC is forging a new path here in thinking about location data harm.
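The kind of screen described here is easy to picture as code. Below is a hedged sketch; the category tags are our own shorthand for the order's seven categories, and the review function is hypothetical, not anything the order prescribes.

```python
# The seven sensitive-location categories from the FTC order, paraphrased;
# the tag names are invented shorthand.
SENSITIVE_CATEGORIES = {
    "medical_facility",
    "religious_organization",
    "correctional_facility",
    "labor_union_office",
    "education_or_childcare_for_minors",
    "racial_or_ethnic_origin_services",
    "temporary_shelter_or_social_services",
}

def review_sale_request(venue_name: str, venue_categories: set) -> str:
    """Pre-sale screen a salesperson (or automated system) might run."""
    hits = venue_categories & SENSITIVE_CATEGORIES
    if hits:
        return (f"BLOCKED: '{venue_name}' matches sensitive categories "
                f"{sorted(hits)}; location data cannot be sold.")
    return f"OK: '{venue_name}' is not on the sensitive-locations list."

# A children's health clinic "hits two of the boxes," as in the example above.
print(review_sale_request(
    "Example Children's Health Clinic",
    {"medical_facility", "education_or_childcare_for_minors"},
))
print(review_sale_request("Example Shopping Mall", {"retail"}))
```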

Eugenia Lostri: Taking a step back and maybe thinking about the full settlement and with the understanding, as you mentioned before, that there is a limit to what can be accomplished solely under FTC authority. How effective do you think all these measures will be?

Justin Sherman: I'll say two things. One is that I think this draws very clearly on the FTC's authorities. The fact that X-Mode was selling precise geolocation data with persistent identifiers to a variety of clients where they clearly were not doing good controls and checks on who was buying data. There were cases in the complaint where X-Mode found out that customers were misusing data that they purchased and they weren't even sure which customers it was. So, you know, the FTC clearly is drawing on its authorities. And I think what X-Mode was doing clearly is an unfair practice. And so in that sense, it's effective because it's rooted in existing law and regulation.

The second thing is compliance. And now that this order is in place, which includes X-Mode setting up a privacy training program for employees, which includes X-Mode deleting some of its existing data, which includes, as we said, a ban on selling data about sensitive locations. There's a lot of stuff in there. The question is, how does the FTC make sure that X-Mode is in fact doing these things, right? Because, you know, sometimes companies are told to do these things and absolutely do them, and they dot every i and cross every t, and other times they fall short or they kind of half ignore what they're told to do.

And so that I think will be the second piece of making sure this is effective, which we won't know yet, right? Because a lot of these requirements haven't even come into effect yet. But once they do, the FTC does have authority from the order to submit written requests to X-Mode to get updates. And X-Mode has 14 days to file a sworn response to all of those requests, and there are some other reporting requirements in there. So I think the other big thing is: can the FTC, you know, as it often does, do a very good job tracking what X-Mode's doing and making sure that it's fulfilling all of its obligations under this settlement.

Eugenia Lostri: Yeah, that was going to be my next question, right? What are the consequences if X-Mode fails to comply with all of this? What are the teeth in the settlement for the FTC to ensure that X-Mode will change these unfair business practices?

Justin Sherman: Yeah, there are lots of points in here where the FTC could sink its teeth in if X-Mode is not complying, right?

As we just said, there are a bunch of compliance requirements, or compliance reporting requirements, rather, in the order where X-Mode has to keep the FTC apprised of what's happening, it has to respond to inquiries about what's happening, it has to keep records for all of these programs it's implementing.

There are all kinds of requirements. If you violate that, as has happened with some social media companies that the FTC has investigated, then the FTC can bring additional action against you. So you might get sued, you might have to go through another settlement process, you might have to pay fines.

So, it's really not good if you're a company to violate an agreement with the FTC. And so, you know, in this case, I would be surprised if X-Mode colors too far outside the lines, but it will be important for the FTC to make sure that they're sticking to the schedule and they're really comprehensively implementing each part of this order, because it really is a pretty sweeping set of requirements.

Eugenia Lostri: So beyond X-Mode, do you expect this decision to have an effect beyond, like, this one data broker, and maybe have a deterrent effect across the industry, maybe change some of the most egregious practices to avoid, you know, a repeat?

Justin Sherman: The data broker industry is very, very large, and when you dig into it, it's pretty diversified, right? So some data brokers, like we're talking about now, specialize in geolocation data. There are brokers who specialize in contact information. There are brokers who do everything. There are brokers who couldn't care less which individual person lives where and, you know, entirely build their business around selling neighborhood- or city-level risk data. Where are there floods? Where are there fires? Where are there murders?

So, it's a diversified industry, and so I don't think that this order necessarily sends a signal to brokers that don't sell location data, because location data is just a very unique kind of case. But I do think it sends a clear message to location data brokers, which is we reached a settlement with a very prominent and sizable geolocation data broker engaged in the exact same stuff that a lot of other companies are doing and found that they were violating the FTC Act by engaging in unfair acts or practices and got them into a settlement where they have to, you know, they have to spend all this time and effort on all of these different privacy requirements. So I do think it sends a strong signal.

Unfortunately, is it going to stop other location data brokers from doing what they're doing? I don't think so. I think that some companies might choose to pursue this kind of sensitive locations list idea, so that if they get media scrutiny or if they get regulatory scrutiny or congressional scrutiny, they can say, hey, we don't sell data on Planned Parenthood clinics. We don't sell data on military bases, right? We focus on retail locations or we focus on parks or beaches or something else, right? So it's possible that that has a deterrent effect. But, you know, let's say a broker is going to get into the same conversation with the FTC a year from now and then has to comply with the same order. From that broker's perspective, the way they look at it, right, you could still have another year of making money doing all this.

So yeah, I don't know if that even is a clear answer. I guess, like I said, there is a strong signal being sent to the data broker industry, to the location data broker industry. I again want to commend the FTC and commend the FTC attorneys who worked on this, because this is a sweeping and really clearly legally grounded and unprecedented order, but I just think it also takes a lot more to deter data brokers and, you know, as we'll get to, some of that probably is more legislative than regulatory at this point.

Eugenia Lostri: So do you anticipate that there will be further FTC action against location data brokers along these lines?

Justin Sherman: I think so, yes. I think that if you look at the cases that the FTC has been pursuing, they're pretty clearly centered around a few certain types of populations or data, right? I mean, health data has very clearly been a priority for the FTC in the last couple of years, including, last year, the first two ever enforcements of the FTC's Health Breach Notification Rule, where a couple companies were selling and sharing really sensitive health data that consumers had no idea was being shared or sold. Location, right, there's this X-Mode order that just came out. The FTC is in an ongoing lawsuit with Kochava, another location data broker.

So yeah, health and location, I think if you look at the priority areas, it's pretty clear what they are, and location is one of them. And location also touches the others, because people sell location data about kids' schools, they sell location data about hospitals, so it also wraps up medical information, it wraps up children's privacy, which of course is a really important issue as well. So I have no doubt that if there aren't already ongoing investigations into other location data brokers, we'll be hearing more like this in the coming months and years.

Eugenia Lostri: Now, unfortunately, and it feels kind of cliche to say at this point, many of these conversations lead to this: you know, what's missing here? It's comprehensive privacy legislation, right?

You've already hinted at this, we've talked about this in a previous podcast. So, you know, I'm sorry to have to ask, but how likely do you think comprehensive privacy legislation is this year? Is this kind of action helping move the issue forward? Will we remain with nothing? What do you think? Considering that we're still in January, I guess I can ask what your predictions for 2024 will be.

Justin Sherman: Right. The problem with predictions is, you know, they're predictions. And I suppose, right, we did also talk about this, for folks who have not heard it. It's a great listen. You and I and Paul Rosenzweig also talked about some of this at the Lawfare year in review at the end of 2023.

And so where is comprehensive privacy legislation at? Nowhere currently. There was a lot of conversation around this time last year of, how are we going to take the American Data Privacy and Protection Act, ADPPA, as it's called for short, from 2022, reintroduce it into the new Congress, get it passed?

So there were a million hearings. There was actually one on data brokers that I was at as well. And so there was a lot of momentum last year in conversation about this, and then it kind of went nowhere. And so I don't think we're going to get any new comprehensive bill introduced this year, especially because it's an election year and Congress just has other priorities.

They're already thinking about the NDAA, the military defense bill for the end of the year, because of the election. So I don't think we're going to get anything comprehensive. I will say it is continually baffling to me why members do not want to pass anything piecemeal to protect people's privacy in the meantime. I understand that once you start identifying, oh, we're going to go after ad tech targeting children, or we're going to go after location data brokers or something, plenty of other people pile in. And there are all these totally fair questions about why do we not focus on this other thing and why is this thing the most important.

But it's also frustrating when you see activities like this going on, companies surreptitiously gathering hundreds of millions of Americans' location data, linking it to specific people, creating additional inference data off of the data and selling it. And as I've talked about many times, the research team I run at Duke has found time and time again that there are some brokers that have good controls on who they sell to, but a lot of data brokers will sell to anybody with a credit card and an email. And so I just don't understand. It's just frustrating looking at this and some other privacy abuses that others work on, you know, facial recognition, what have you, and not having something done in the meantime, especially because this is such a bipartisan issue.

But maybe what we need, I'll just throw in here, is a new version of the Video Privacy Protection Act from the late 1980s, where basically, you know, a reporter went to a movie rental store and found out everything that Robert Bork, who was then a U.S. Supreme Court nominee, was renting from this store and everything his family was renting from the store, and wrote a story about it. And immediately Congress passed the Video Privacy Protection Act in '88, prohibiting companies from disclosing that information about what videos people were renting.

And so I don't know what the version of that with data brokers is, but it seems to be that sometimes the best solution to this problem is journalists and others sort of pointing out to members of Congress how their privacy can be invaded through this, this kind of activity.

Eugenia Lostri: Well, that's kind of grim. So, before we start wrapping up, Justin, if you have any last remarks, final thoughts, anything that we didn't get to talk about that you wish we had discussed, this is your time.

Justin Sherman: Yeah, I again just want to underscore, you know, this is the first time the FTC has done this kind of settlement with a location data broker, so it is a big deal. And I again say, and plenty of other folks, including some former FTC personnel, Ashkan Soltani is one, talk about this all the time: the privacy team at the FTC is about 40 people. And I think that is just a massive legislative failure, that in a country with hundreds of millions of people, we have about 40 people enforcing a good portion of our privacy regulations.

So, yeah, so it really is a great order and a lot of improvement in the X-Mode case will come from this settlement. That said, there are a few challenges that still remain. We mentioned the legislative challenge that other companies can still do this, that plenty of companies will not be deterred from selling location data about abortion clinics, or schools, or places of worship, and so on, just because this order came into effect. And there's also the economics of the problem.

Something that is interesting about X-Mode is it's one of the few location data brokers where there's actually public information about how much they pay mobile apps to get their location data. And there was reporting a few years ago that you could be an individual developer and X-Mode might pay you $10,000 or more a month to get your location data. So, if you're building a bunch of games or something, or this is even just a side activity for you to build an app, it's actually quite lucrative, and there are lots of reasons why you would want to turn over your own users' location data to a company like X-Mode. So those challenges persist.

And again, like you said, it seems funny to kind of drumbeat on the same point, but I'm going to drumbeat on the same point: that we need comprehensive privacy legislation, that the FTC needs far more resources, because it's ridiculous that individual European countries have, you know, way fewer people and a hundred times as many privacy regulator staff. But it really is an important settlement, and I think something that anyone looking at location data privacy and the privacy and national security risks should dig into.

Eugenia Lostri: Justin, thank you so much for joining me.

Justin Sherman: Thanks for having me.

Eugenia Lostri: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get an ad-free version of this and other Lawfare podcasts by becoming a Lawfare material supporter at patreon.com/lawfare. You'll also get access to special events and other content available only to our supporters.

Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Chatter, Allies, and The Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6. Check out our written work at lawfaremedia.org. The podcast is edited by Jen Patja Howell, and your audio engineer this episode was Noam Osband of Goat Rodeo. Our music is performed by Sophia Yan. As always, thank you for listening.


Eugenia Lostri is a Senior Editor at Lawfare. Prior to joining Lawfare, she was an Associate Fellow at the Center for Strategic and International Studies (CSIS). She also worked for the Argentinian Secretariat for Strategic Affairs, and the City of Buenos Aires’ Undersecretary for International and Institutional Relations. She holds a law degree from the Universidad Católica Argentina, and an LLM in International Law from The Fletcher School of Law and Diplomacy.
Justin Sherman is a contributing editor at Lawfare. He is also the founder and CEO of Global Cyber Strategies, a Washington, DC-based research and advisory firm; a senior fellow at Duke University’s Sanford School of Public Policy, where he runs its research project on data brokerage; and a nonresident fellow at the Atlantic Council.
Jen Patja is the editor and producer of the Lawfare Podcast and Rational Security. She currently serves as the Co-Executive Director of Virginia Civics, a nonprofit organization that empowers the next generation of leaders in Virginia by promoting constitutional literacy, critical thinking, and civic engagement. She is the former Deputy Director of the Robert H. Smith Center for the Constitution at James Madison's Montpelier and has been a freelance editor for over 20 years.
