Lawfare Daily: Big Tech and Law Enforcement, with Lukas Bundonis

Published by The Lawfare Institute
in Cooperation With
On today's episode, Lawfare's Fellow in Technology Policy and Law Eugenia Lostri speaks with Senior Privacy Engineer at Netflix and former Army Reserve intelligence officer, Lukas Bundonis. They talked about the relationship between law enforcement and tech companies, what that relationship looks like in the U.S. and other countries, and the different ways in which that communication can be politicized.
To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://
Click the button below to view a transcript of this podcast. Please note that the transcript was auto-generated and may contain errors.
Transcript
[Intro]
Lukas Bundonis: We build our products to the best of our ability as like big tech engineers to make sure that customers have the most control and context and decision making that we can afford them. But also that like when we have to comply with government requirements, provided that the government respects and there's like an established agreed definition that they respect the rule of law, we're going to answer their request to the most specific degree of our ability.
Eugenia Lostri: It's the Lawfare Podcast. I'm Eugenia Lostri, Lawfare's Fellow in Technology Policy and Law, with Senior Privacy Engineer at Netflix and former Army Reserve intelligence officer Lukas Bundonis.
Lukas Bundonis: Like when the news says it's a tough time out there for data protection, it is. Partially because of machine learning, partially because of like a vague slide towards populism and authoritarianism in many parts of the world, it doesn't mean we should stop trying.
Eugenia Lostri: Today we're talking about the relationship between law enforcement and tech companies, what that relationship looks like in the U.S. and other countries, and the different ways in which that communication can be politicized.
[Main Podcast]
So Lukas, you sit in a very interesting intersection of law and engineering and technology. So can you maybe start by just describing what it is you do and what your day-to-day looks like?
Lukas Bundonis: Yeah, sure. Thanks, Eugenia. My day-to-day is mostly about being a bridge between engineers, whether they're data or software engineers, and lawyers and other legal professionals. Most of the nexus of like data portability and law enforcement response is answering requirements from legal about what to add to subject access requests or how to manage requests from customers or law enforcement about data. And it's been that way ever since I, I joined the field about four years ago.
Eugenia Lostri: So tell us a little bit how you joined the field. You know, what were the skills that allowed you to, to do this, this work? Because you're not a lawyer, right?
Lukas Bundonis: I'm also not an engineer either.
Eugenia Lostri: Also not an engineer, yeah.
Lukas Bundonis: It's kind of, it's kind of one of those things where it happened by accident. A lot of careers start by happy accident. I had some intelligence experience from the military. I had a policy background from school and writing some papers for academia and for research outfits. And I took a lot of classes that focused first on cybersecurity because I was really interested in cybersecurity policy.
And then I, I started getting interested in emerging tech and kind of like that lent itself pretty well to first working in privacy engineering. And honestly, like I spun some experience working in storage systems into like a project that focused on law enforcement response at Google. And that project ended up being my segue into helping their law enforcement response teams just sort of improve the ability and kind of like protections inherent in their products to keep customers safe while answering requirements from government and law enforcement.
So it was a very, very happy accident, and something that, I think, lends itself well to my mix of skills, non-skills, whatever you would call it; it happens to be in that bucket.
Eugenia Lostri: I feel like that's usually how it happens. Happy accidents and being able to seize opportunities when, when they come your way, at least for us for being maybe, I wouldn't say the, you know, early in the, in the field, but when I started and I think when you started, there wasn't a path to do this work. You just kind of learned how to do it.
Lukas Bundonis: Yeah, not at all. Like, I mean, some of it's like, especially 'cause some of these technologies are like emerging, or they're attached to something that like, governments have trouble understanding. And then you get into this weird territory where even the companies that make this technology don't really understand it super well.
So it's like there, there's nobody that really knows how to grapple with it. I mean, some of the engineers developing the tech, whether it's machine learning or something else, know, like, how to innovate in the field, but they're not really sure how to add protections for it because it hasn't been invented yet. And that's, that's true of cybersecurity. That's true of ML. That's true of a lot of different disciplines.
Eugenia Lostri: Yeah. And then crucially, this bridge function that you're describing, that you described kind of from the get go where you may understand the technology, but you may not understand what the legal regime around it is.
Lukas Bundonis: Oh, absolutely. It's, it's up to like first line engineers and legal professionals to be able to make informed decisions about risk, ownership, acceptance, and kind of forecasting because they either build the tech or they they know the law. A lot of times you'll hear privacy engineers specifically talk about knowing a little bit about each of those categories, but not being, not being sort of like the first line expert in like any specific one of them.
We are necessarily privacy experts, but it's challenging to even define privacy in the context of emerging technology. It's really challenging actually.
Eugenia Lostri: So one of the reasons why I was excited about our conversation is I think you bring such an interesting perspective to this, at least for me, when I think, and when I talk to people about the relationship between tech companies and law enforcement, I tend to hear it more from the law enforcement side or from researchers who are looking at this and studying this from the outside.
But you know, maybe it's a little bit less common to hear from someone who's actually inside the, the big tech companies. At least that's, that's been the case for me. So can you characterize that relationship between law enforcement and the companies from that unique perspective that you bring?
Lukas Bundonis: Absolutely. So with, with it in mind that it's like a small field and, you know, there are definitely sort of better spokespeople for like legal and risk positions. Like from an engineering standpoint, you always want a system to do what it's supposed to do. Like if you build it and then it has a certain failover tolerance and like you generally try to do your best to provide the best product.
And as is required by not just GDPR, but now CCPA and a bunch of laws that are coming out across the planet as well as in the U.S., you have to have some measure of data portability baked into every product. Like, customers now expect to have some type of granular control over what data is available to them.
The tension arises when there's also sort of like this access question or this access expectation among law enforcement and governments. I mean, I have seen dozens upon dozens of headlines that revolve around different countries' levels of expectation that they have access to consumer data via a tech company.
It's something that I, I understand some folks at Lawfare are very interested in, some stuff that I'm interested in, in terms of these tech companies acting as functional intermediaries for this surveillance. It's not that we conduct it ourselves, though there's lots of different positions floating around the government now as to whether or not there's explicit surveillance being conducted by companies.
It's more along the lines of we build our products to the best of our ability as like big tech engineers to make sure that customers have the most control and context and decision making that we can afford them. The ability to delete data, all those different things that are baked into GDPR, but also that like when we have to comply with government requirements, provided that the government respects and there's like an established agreed definition that they respect the rule of law, we're going to answer their request to the most specific degree of our ability there.
There are some constraints on that ability, like lack of specificity. So if a bit of legal process comes in and it's not specific enough to target an account, or maybe it's for data that we don't have, there are occasions that any sane tech company would outright reject the request or push back and say, you need to be more specific.
Another constraint is if the country or government does not reflect what we would consider the rule of law. And that's generally a consensus position that isn't really spoken for by any specific tech company. But you'll see a common pattern where, like, one company will trumpet their response above another, but they're all functionally doing the same thing. They're, they're making it so that there is a precedent established.
And it's not, it's not a formal legal precedent. No one's actually writing new law on this. No one's writing papers. I mean, some people are writing papers, hopefully soon myself to be included in that company. But generally, like, that's fascinating to me because it's, it's still in the spirit of honoring multiple different users of that ecosystem: making sure these requests reflect, like, only the degree of accuracy that they provide, that the requests don't result in a legal data production that's over or under the mark, and that it reflects like legal reality and like the reality of the product, like availability of data, things like that.
It's a world that most people don't know. Like, they, like you said, they do know it exists from the perspective of law enforcement wanting to unlock a phone, or like even then they, they do it on the basis of encryption, and it becomes an encryption debate, and there's some kind of, I wouldn't, I wouldn't call it silly, but I would call it like a very ham-fisted debate about whether or not encryption should exist in the first place. This is a little bit different and sometimes more nuanced.
Eugenia Lostri: So I, I'm hoping to get to this later, but since you kind of brought it up, I feel like I need to ask it now.
You know, of course this is not just about responding to requests from U.S. agencies or U.S. law enforcement.
Lukas Bundonis: Nope.
Eugenia Lostri: This is spanning all over the world because, you know, big tech companies have operations everywhere and you know, I, I would love to hear a little bit more about what it's like to answer to these different regimes. What are the different things that you need to keep in mind?
But also, I, I was just reminded of the recent testimony to Congress that we saw from, from Microsoft, and they stated, for example, that when there are requests from the Chinese government, because that hearing focused a lot on, on China and Microsoft's presence there, they said, you know, they maybe just don't respond to those requests. Is that, you know, typical? I, I know that there's concern around the access of foreign governments to Americans' data.
Lukas Bundonis: It's very typical, so you'll see an industry standard where a lot of major tech companies either don't have business operations in China and each, each company has their own reasons for not doing that.
But when requests come in from, say, China or Russia, typically, and for different reasons, they will just not align to the request. I think maybe in the early 2000s or early 2010s, there was a bit of a gray area where a couple of the big names in Big Tech, like, were trying to do like a hybrid business model or, you know, comply with CCP censorship, but like most, most folks have gotten out for, for like, kind of the same reasons.
It's, it's one of the weird ways in which they're all very harmonious. They don't like to answer requests from governments that they strongly believe would misuse the data to, like, target businesses and stuff like that.
Eugenia Lostri: So what about other, you know, governments? What, what's the range of comfort between, you know, yes, we will likely reply to this within the margins of the law, to, we're just not going to have operations in your territory?
Lukas Bundonis: Well, I mean, it can be, so like some of the risks that come without mentioning any specific countries, some of the risks that get considered have been things like degree of fraud and corruption in submitting these requests, like how easy is it to submit fake requests, regardless of the subject matter or the type of request, so that's a big consideration.
It's not always a logistical consideration, whether like a given group or like, you know, geographic region of like policing is like willing to use the tools that you provide them, or like how corrupt a given court system is. But like that's all pretty much wrapped up into, into the overall picture.
And then sometimes it's like who gets elected into leadership and how do they use the information that they glean from tech companies? And there are a couple of standouts in the last like 10 to 15 years that would fit that bill, whether it's in Europe or in Asia. But like generally like, there's no one size fits all way to decide that a country you used to honor requests from no longer follows the rule of law.
It's tricky because then a lot of companies will have to contend with their, their business in that country too. Like, do they, do they want to catch smoke from a government that, like, you know, toes the line, but they toe the line successfully? It's really challenging. As opposed to governments that are a, a joy to work with, that have really robust regulatory agencies that flag instances of overproduction. I, again, without sort of getting into specifics, it's just very, it's, it's a, it's a big scale. Like, to, to say, like, what is the scale? It's very wide.
Eugenia Lostri: It's every single country. It's, it's all, it's every single country that's, like, not Russia, China, and a handful of others.
Lukas Bundonis: Yeah, yeah, yeah.
Eugenia Lostri: So an interesting trend is this, you know, requirement in legislation that you as a company must have an office and personnel present in the country where you're operating, right? And there's been a lot of concerns that that can lead to applying pressure, right?
Because it's easier to say, well, we're not going to respond to, to your request, if everyone that works for you is in another country; it's far away, there isn't really much that they can do. But not only do you need to have a presence, it is considered an offense, maybe, to not respond to the government's requests. There's leverage there for, for the government. Do you think that's going to affect how we're seeing operations being maintained or, or not?
Lukas Bundonis: I hate to give the cop-out answer, but again, it depends on the country and it depends on the company. I have sort of like personal anecdotal experience that unfortunately I can't share that like generally there are creative ways to still base your office and still do things with like your people and your data to be able to protect them successfully.
And that's been, that is again, something that is surprisingly common across big tech companies. If the risk gets too great, so if not just lawyers but your security staff or your campus staff say like, hey, there is an imminent risk to people in this country, or like the person that just got into office is like not gonna honor their word to keep the campus safe. That is an easy, like contact the, the State Department type, get 'em out situation.
But beyond that, the, the safest thing is to withdraw. The medium ground is to just be able to apply a little bit of reverse pressure, because, like, most of these companies, whatever the company is, it is so valuable to have in that country that, like, I can think of even, even cases where like certain companies still do business in Russia, or they do business in like a, a country that's like toeing the line on authoritarianism.
It can sometimes be a weird bluff, 'cause they, they need the company to provide those services. They're, like, really valuable for the economy, or, like, people can get jobs doing that work. So the, the companies have more leverage than it seems like they do.
Eugenia Lostri: Since we're speaking about the international part of this, the UN is in the process of negotiating a new cybercrime convention, and it has been criticized, you know, me amongst the people who have criticized it, for, among other things, its very broad scope and, honestly, the insufficient protections that it provides to human rights.
But the convention would also create, you know, some new obligations and create some challenges for tech companies. So could you maybe speak a little bit about what these changes would be and how worried you are about them?
Lukas Bundonis: So I am not like super well read on the current state of negotiations. So if there's like later context that you can provide that I missed, like please feel free.
But generally one of the most interesting things that I saw in the considerations for like the treaty negotiations was this concept of traffic data because it was like listed by like the ICO and some of the language in the, the draft sessions as like core to cyber crime investigations to be able to share data between stuff.
There's another act like that on electronic evidence; there's a lot of, like, talk in the EU at least, specifically, rather than the UN, to, like, be able to share stuff across, like, share evidence for terrorist investigations and intelligence investigations. But this idea that somehow ambient traffic data, which to me just sounds like communications metadata, we've been talking about that in the States for like 30 years, longer actually, is really fascinating, because a lot of times, I don't know that, like, having every actor that is responsible for each endpoint of, like, a major breach having access to the same, like, telemetry and investigations data is actually going to speed up the investigation.
Like, normally there's like a couple of key actors, or there's like one person that pushed a bad update or something like that, that does most of the lifting. And, and it's necessarily because they have access to that telemetry; like, is giving it to a bunch of different government agencies going to help? Does that mean I'm a critic of the treaty negotiations? Not really. I just think that, like, that concept of traffic data is really fuzzy, and I thought that was interesting.
To answer your question directly, like, do I believe that that would have, like, meaningful changes for the way that tech companies do things for, like, cybersecurity investigations? I think one of the biggest tensions I see is that tech companies still have a really fraught relationship sharing, like, information about a vulnerability with the government, because it's loosely assumed that, like, the government, whether it's any government in the world, as you, like, are very well aware from your own expertise, like, they just don't trust that they're not actively using them, that they're not actively, like, using something that's later gonna get patched.
So the way that ties in with traffic data to me is it's just, I'm glad that the UN is thinking about it and some of the drafters are thinking about it, but I don't know that it's gonna compel companies to rethink the way they collect that telemetry or share it for that matter. Unless there's something about the negotiations that I'm missing, like, please, please take it away and tell me something else that you, you found interesting about the same topic, because I'm happy to respond.
Eugenia Lostri: No, no, I, I think that's, that's interesting. You know, the part that I focused more on is the fact that there are insufficient protections in place and that the scope is just way too broad, honestly, literally everything would become a cybercrime and, and I think that that's dangerous. But you know that that always poses interesting challenges for everyone who actually needs to, to operationalize a treaty.
But I, I wanna bring us back to the U.S. It's been an interesting year for privacy and surveillance. There's some recent reporting that Schumer is expecting the Child Online Safety bills to clear the chamber soon. So, you know, again, going back to the theme of how is this going to change your job and how big tech companies operate: do you see these bills creating significant change to, to your day-to-day or, or not?
Lukas Bundonis: Child safety is tricky, and while sort of like acknowledging that the law enforcement response world necessarily touches child safety in, like, sort of like a, a meaningful, like, relationship, it, it's challenging because a lot of, like, the requirements for, like, international agreements, for domestic law enforcement, are pretty ironclad for, like, how you deal with that subject and, like, different, like, you know, issues that crop up on the internet.
I don't know that it would so much change my job, just because where I now work has less data than Google. I think overall the landscape may get tricky in countries, just like the cybercrime negotiations, that use like broad collection apparatuses to like go after crimes in a really weirdly fungible way.
Like, you know, to say like, hey, I'm writing this bill, or I'm gonna send this legal process from government X to company Y to target a, a crime that involves like putting children at risk, or I'm going to do it because it's a cybercrime. To me, that's a really like broad continuum that has similarities in the sense that they just use very broad government vehicles to target the same thing. In the U.S. I don't necessarily see that happening, but sometimes child safety bills do include like weird provisions.
Eugenia Lostri: Let's talk about something that is I think maybe a little bit more squarely in your realm, which is the renewal of FISA 702. I'm sure you follow that closely, so, so I would like to hear just your thoughts on, on that entire process, how did you experience all of that?
Lukas Bundonis: Yeah, I mean, I watched a couple of the debates and I read the commentary, not just here on Lawfare, but elsewhere on the internet. I talked about it with my coworkers. I talked about it with some legal folks. And generally like 702 renewal always seems pretty cut and dried, like it's sort of a necessary element of how the bureau targets suspects and intelligence investigations and law enforcement investigations.
The challenge becomes, it seems like I'm a broken record at this point, but when people add like weird provisions to the renewal. So the iteration that I saw was, like, the Reforming Intelligence and Securing America Act, that folks from different civil liberties groups argued, in some commentary, that, like, it, it would, it would sort of like give a broader, you know, ability to federal law enforcement to compel companies to surrender data, or, like, that there's like some type of, like, more pressure under the current renewal that they can apply.
The weird part about the pressure that law enforcement applies to companies is, you know, they can like send a bunch of requests or they can send national security letters. They can send actual like legal process that compels companies to like set up, you know, broader and broader legal productions. I couldn't even, after reading the text of the bill, find stuff that like expanded that requirement very meaningfully.
I think the challenge is gonna come when that involves, like, the, the most interesting wrinkle doesn't actually come from domestic surveillance for me, because for, like, U.S. persons, 702 provides that the intelligence community can't target any U.S. persons, including if it's, like, targeting a foreign person for the purposes of targeting a domestic person.
The challenge comes when, like, those people are abroad, or, like, there are other weird entanglements with, like, cross-border data sharing, and, like, folks have, like, other meaningful relationships with, like, other data repositories and other regulations in different countries. Like, I think that that's where the compulsion becomes stronger, a lot of the protections break down, and that's my biggest concern with, like, changes to FISA that have gone forward through the renewal.
I don't like what you would call, like, backdoor inclusions. They're, they're not great. And I, coming from the intelligence community myself, I respect the need and the sort of, yeah, the need for constancy and, like, a, a degree of specificity, a degree of aggressiveness in every intelligence collection law.
But the relationship that intelligence law has with companies, let's, for now, let's say it's fraught. I, I don't, I don't like the, the landscape, especially, as I mentioned before, when it gets into, like, silly debates about encryption and stuff like that. I, I hesitate to sort of, like, say that, like, that's, that's a good trend for the future.
Eugenia Lostri: It's interesting, not that long ago, not to self-plug, but I will do it regardless, I, I had this really great conversation with Joseph Cox, who wrote the book “Dark Wire,” about the FBI kinda building out their own hardened cell phone company so that they could sell that to drug dealers. And they just managed the entire company, and that's how they had access to, to the communications.
And you know, like while you cannot expect that type of operation all the time, it just seems like maybe a better approach than requiring all these companies to break or weaken their encryption just to, just to be able to see what's going on.
Lukas Bundonis: Yeah, I mean, I've, I've actually had a couple recent conversations with, like, former law enforcement professionals, and I think the general argument is, is based on a difference in, in mission rather than, like, technology.
Where, like, they wake up basically every morning thinking, like, how can they not only, like, drive up a statistic like prosecution rate, or, like, conviction rate, or the amount of dangerous people put in jail, but, like, how to break up networks, how to break up gangs, how to break up terrorists. And generally, if they can do it while meaningfully going above and beyond the requirements of the law, they are doing a good job. But that is, that is a, a corollary to their primary mission, which is just, like, keeping people safe.
And I've met tons, tons of people who really do believe in this mission and do a good job at it. Encryption always seems to be this thing that when I talk about it with them, it's like, well, like of course people deserve privacy, but I'm trying to catch terrorists. And that's like, that's kind of, that's it. That's like the end of the discussion most of the time.
For, for encryption advocates, as I'm sure many of the folks that you've already spoken to and, like, written about, like, it's the exact opposite. Like, the, the main goal for driving protections in the end-to-end encryption space is just that the people who use Signal, WhatsApp, Telegram, Wickr, whatever, you know, they're dissidents, they're marginalized groups. Sure, we're willing to admit that, like, you know, military and, like, spies use these too. But, like, also, like, most of these groups are protected, and they can't, like, if they have that encryption weakened or broken in any significant way, they will be rendered just as vulnerable as the terrorists or the criminals would be to the FBI.
And it's an unacceptable trade-off that you're willing to make here. With regards to, like, the subject of, like, “Dark Wire” or, like, that whole managed company: I think the American government has a long history of creating, like, you know, legally sound, like, legal snares to catch people and, like, subvert the definition of entrapment to be able to break people up and get their comms broken down.
I think the value of still asking companies for permission is that the companies can serve, however strong or weak that could be at any given point, as a meaningful intermediary check on government surveillance. Going direct to the customer still feels a little, yeah, I know I'm not a lawyer, it's, it's legally not entrapment, but it does feel like it's sort of abrogating the process through which you're able to conduct surveillance fair and square.
I think the simple definition of surveillance is literally, like, observing someone for the purpose of, of collecting evidence. And how do you know that what you're collecting is going to go into evi-? Like, how do you even submit the process for, like, a warrant if you're just snarfing up all their comms?
Like that's not, there's no evidentiary basis for the collection, I mean even in intelligence. I mean, there's lots of hair raising stories and you know, vague or explicit human rights violations in the history of the global intelligence services. But at the same time, there's a basis for every, there's supposed to be a basis for every interrogation, for every collection, for every like thing that you submit to your agency head to your lead, to your commander, in order to be able to gather that intelligence.
So that's why those plans, those, like, company things, like, they always sort of, like, like I said, I'm sure they're, they're airtight, but as a, as a newly minted privacy person going on five years, that stuff just gives me the creeps.
Eugenia Lostri: I'm failing right now to remember which proposed legislation this was, but not that long ago there was a piece of legislation being proposed about encryption that basically said, you know, we want you to have a way to access that is proven to not weaken general protections, right? So basically give access, but make it still be good.
And it just seems like such an interesting example of that disconnect or the lack of a bridge between policy makers and actual technologists. Because sometimes like that technology doesn't exist. Like there isn't a way to do that if I understand it correctly. So like how do you even get around that if there are requirements that just you cannot meet in the law?
Lukas Bundonis: I mean, I hate to be, like, tchotchke or precious with this comment, but, like, just make them work harder. Like, if you can, you know, it's proven that, like, the National Security Agency can study Tor endpoints and then find out the computers that the traffic was on by looking at the endpoints. I mean, theoretically you could do that with endpoint devices. You could do computers, you could do phones.
Does it always work? No. Is it always in time to stop terrorists? No. But that's sort of the imperfect solution that meaningfully comports with the burden placed on intelligence and law enforcement agencies and cops, like, you know, anywhere up and down the chain. Encryption exists for a reason. Cops earn their salary, and they do a great job catching bad guys when they find workarounds that keep people who wanna use encrypted chats safe.
Like, that's kind of a weird one. You know, people do take sides on that, very pointed sides. And somebody can listen to this and think, well, this guy's ludicrous, because it's only one or the other. But I really do think that making both sides of the community work harder at sort of what they do has a benefit for everyone. Unfortunately, yeah, there's also a benefit for criminals and terrorists if you just keep strong encryption, but not to the extent that they still can't be caught.
Eugenia Lostri: So something that I've been quite interested in, especially looking at some of the recent developments over the last year, is how politicized communication between government agencies and tech companies has become. And I think we see that most clearly with social media companies, because the communication between, you know, DOJ, FBI, and companies about misinformation efforts or how to maintain election security has been criticized very publicly.
There's a new report the DOJ Inspector General released evaluating the efforts to coordinate information sharing about foreign malign influence threats to U.S. elections, and the kind of three big findings were: there's a lack of policy on how you do this, there are First Amendment implications, and there's also a lack of strategy guiding the interactions.
You know, I know your field is not necessarily social media, but you do handle a lot of the relationship between the tech company that you work for and law enforcement. So do you think these concerns translate at all to your field? Have you seen any kind of backlash about the way that the two sides work together, or has it been kind of its own little pocket that is just not touched?
Lukas Bundonis: I think they're very related. I hate to be disappointing when I say that whichever way that topic goes, it doesn't specifically affect law enforcement response, because folks in government and cops will always need data for investigations. They're always gonna wanna know about something.
I think the dichotomy I'd like to briefly examine is the assumption that Democrats will go after tech companies, like they will just hold them accountable, because, you know, they're harming consumers, or an open society requires snarled companies, and they don't pay enough taxes, and they're building a bunch of evil machine learning, and then Republicans will not touch them.
And I think that is false. And, you know, even if you would agree with me on its face that that's false: if Republicans win in November, they generally will go after tech companies that they don't like or that didn't give them money. They're a little more personal these days. Democrats don't always go after tech companies successfully, nor do they go after them evenly. And even depending upon how different cases in regulatory agencies proceed, it's one of those things where it's never quite so cut and dried.
I think, in the general pullback of government from tech, with social media and just that relationship, it's important to look at how search companies don't derank conservatives because they post conservative stuff. It's because it's generally associated with misinformation at, like, a 10 to 40% degree of reliability depending on the site. I just made that up, that's not a real statistic.
But generally, overall, the biggest thing that concerns me about that pullback is a failure to keep relationships strong. So whether it's in something that you know very well, like cybersecurity, or something that I used to work in, like machine learning, that lack of understanding is gonna produce really silly and ineffective regulation. Like, if they don't know how the products are made, they don't know what they're capable of.
I was a big fan of reading the non-binding AI bill of rights that the White House put out recently, but generally, it doesn't have any teeth if the pace of innovation is gonna outstrip it in, like, a year or two or less, knowing machine learning these days. So whether it's social media, cybersecurity, or machine learning, I think that pullback doesn't have as much implication for my, you know, small subfield, or for the extent to which the American government is gonna be willing to regulate its big tech companies.
But just that it won't be able to do so effectively. I really do worry about that breakdown in relationship, let alone if you wanted to have some type of meaningful relationship for defense or for govtech. All of that interstitial stuff matters. It shouldn't be overly cozy. I mean, I think any sort of free-thinking American citizen should be concerned if it gets too cozy, because that also produces bad regulation. So you have to have something in the middle, I think.
Eugenia Lostri: I wanna pull on two threads that you presented in that answer. The first one: you hinted at the differences between Democrats and Republicans and their relationship with big tech.
This is an election year. We just saw the change, you know, Biden stepping down, Kamala Harris taking over as the Democratic nominee. So I was wondering if you could maybe tell us about what it looks like if either, you know, either Trump or Harris actually wins. What can we expect to be different or remain the same?
Lukas Bundonis: To re-pull on the part of it that I introduced: there is a big expectation that if a Democrat wins, the tech companies are in trouble, a they're-not-gonna-be-able-to-do-anything type of thing.
I don't know, at least right now, whether or not we have a chance of seeing, like, toothy, strong, sensible legislation from Congress on either privacy or machine learning that is national, even if Kamala Harris does win. Because the companies are in the driver's seat. Like, I don't know how to emphasize that with an exclamation point enough.
But there are definitely sort of admirable efforts, whether it's from regulatory bodies or NIST or anything like that. It's good-faith effort, and there are lots of cool policies being kicked around in D.C. that are meaningful, that include data privacy as one of their precepts. And I love that, right? But the top four or five AI companies have already scraped the entire, you know, whatever it is. Like, they're gonna run outta data in 10 years or something like that. So I don't know how you meaningfully eat at that advantage.
Especially since there are other things that a Democratic administration would have to contend with, whether it's immigration, security, like, two-plus active, really big war zones, and, you know, buckets of human rights issues. All kinds of things that will just pull away from the narrative it seems like they wanna craft, that they're not pulling away from the world, that they're rejecting Republican isolationism.
And now you're supposed to regulate really fast, really powerful, really wealthy machine learning companies, and also protect user privacy, and fight the encryption fight the way Democrats should. That seems like a bad quagmire, to be honest. I think they're gonna have to pick, and if they do win, I wish 'em luck.
For Republicans, it's also kind of equally interesting. I mentioned that some of the rejection of tech regulation, or the going after specific tech companies, is personal to whether they're being deranked. It's more nuanced than that. I think that Republicans overall are pretty famous for engaging meaningfully in culture wars, or for having certain companies now be conservatively aligned.
To mention some of the recent backers who, it came out, gave money to the Trump campaign: they are no less jazzed about AI. I mean, I framed it as, like, Democrats are gonna regulate machine learning, but really both political parties love that stuff. They love the dynamics of it. I think that conventionally Democrats have done a better job of sort of safeguarding the worker.
Like, generally, many Democrats are pro-union, or they're after worker protections and privacy protections. I think for Republicans you'll see a lot of fights that revolve around the culture war, whether it's search ranking, or AI and art, AI and UGC, AI on social media, data privacy for all of those same things.
But data privacy in a way that's sort of a weird cross section, where they care a lot about data privacy if somebody is, you know, sort of called out for posting a bunch. Nobody wants to get canceled for posting stuff on social media about, you know, gun rights, or for being part of a conservative right-wing site.
So they do care about privacy. It's a complicated thing to explain. I think generally, though, they are very happy to bring the fight, like, kinda home. I mean, they are full tilt. Trump and Vance are en route to, you know, take the country back 200 years to Andrew Jackson.
You know, they wanna focus on domestic issues: tech regulation, billionaires that support them, culture war stuff, gun rights and immigration, misinformation on social media. Whether it's meaningfully different for me as an employee in this space, I couldn't tell you. I can't predict that. I just know that they're very comfortable bringing things domestic, and that's kind of what I see for them.
Eugenia Lostri: And the final thread that you've mentioned several times is machine learning, artificial intelligence. You know, I don't think we can go through a tech podcast nowadays without talking about it.
You know, we saw that Meta had to halt the use of AI in Brazil because of the data protection authority ban. Do you think there is a trend towards preventing artificial intelligence deployment, or not? What's gonna happen with all of that? How do you see it from where you sit?
Lukas Bundonis: I admire when a government believes they can. It reminds me of, like, the European withdrawal of some of the key, really powerful features of the models due to, like, data protection requirements.
Nothing is stopping this train: the people, the processes, the tech, the companies that I've engaged with. I would say I've been a loose adherent and follower of the ML community for, like, two and a half years now, 'cause honestly, when we were at Fletcher, I read about it, I was covering it, got excited. I wrote a little paper about it.
But there is this headlong fascination with birthing, like, superintelligence. There are adherents all throughout Silicon Valley, there are billionaires, there are politicians that just wanna see this thing rocket. And I don't know why. It's like they've never seen The Matrix, or they've never seen a movie about this stuff, but they will strip away the capacity of most modern governments to contend with how powerful that crap is.
Just by continuing to scrape data, generating synthetic data to replace it, you know, signing deals with publishers and stuff like that. And I do appreciate, you know, that I'm painting a little bit of an apocalyptic picture.
I think that, overall, it's really good when governments, whether it's Brazil, the U.S., or Europe, try to meaningfully snarl what I would consider just crappy and mildly to moderately disrespectful AI practices and development, because that's how you tease apart a good baseline and restrictions for regulation.
Not too dissimilar from cybersecurity. Maybe just a little bit more juice, a little more powerful. That being said, it's not gonna change my job because if anything, more ML means more data and more data types. So that's all I'll say about that.
Also, it's not gonna change that headlong fascination. That's really how I'd describe it. Like, people, God, they just cannot get enough of this concept. Even the fricking safety people, the people that work in safety and protection, they're getting fired or removed because they're getting concerned about normal things like data training practices, let alone whether something scary like this is sentient.
So I don't think it's gonna stop anything. I think it's gonna keep going. It's gonna be up to data protection professionals like us to give a crap about putting in the guardrails, 'cause otherwise the development's gonna continue unimpeded. If you were to stop deployment in one country, or stop deployment of a feature, it would just go unimpeded in another, and that's how it's gonna keep going.
Eugenia Lostri: Lukas, if there's anything, you know, any last thoughts that you wanna leave us with, or something that you wish we'd covered but we didn't? You have the floor.
Lukas Bundonis: Yeah, so I tend to be a rambler and kind of pessimistic about most tech, despite the fact that it is my livelihood, generally. I think that whether it's law enforcement response, data protection, privacy, cybersecurity, or machine learning, I'd like to say that there are a lot of core, really good professionals, with various degrees of investment in the hype cycle, that are doing a damn good job every day making sure that users are protected in the best way that they're currently able to.
A lot of decisions about frontline protections, again, in any of the fields that we've discussed on this episode, come down to the fact that we all apply pressure. We have a responsibility to apply pressure as data protection professionals, or data whatever, whatever the bucket is that you'd wanna group all this stuff into.
And I think that when the news says it's a tough time out there for data protection, it is. Partially because of machine learning, partially because of a vague slide towards populism and authoritarianism in many parts of the world. It doesn't mean we should stop trying. I think it's super important to encourage data protection folks to just keep their heads up and not submit to the pessimism and some of the rancor that some of these trends have produced in the industry, because it's important to do our jobs.
Eugenia Lostri: Okay. That is actually a cheery note to end with. So Lukas, thank you so much for joining me. This was a great conversation.
Lukas Bundonis: Likewise. Thanks for having me.
Eugenia Lostri: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters.
Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Chatter, Allies, and The Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6th. Check out our written work at lawfaremedia.org. The podcast is edited by Jen Patja, and your audio engineer this episode was Noam Osband of Goat Rodeo. Our theme song is from Alibi Music. As always, thank you for listening.