
Lawfare Daily: How Technologists Can Help Regulators with Erie Meyer and Laura Edelson

Justin Sherman, Erie Meyer, Laura Edelson, Jen Patja
Tuesday, October 14, 2025, 7:00 AM
What tasks are technologists best-suited to help regulators with?

Published by The Lawfare Institute
in Cooperation With
Brookings

Erie Meyer, Senior Fellow at Georgetown Law’s Institute for Technology Law & Policy and Senior Fellow at the Vanderbilt Policy Accelerator, and Laura Edelson, Assistant Professor of Computer Science at Northeastern University, who are coauthors of the recent toolkit, “Working with Technologists: Recommendations for State Enforcers and Regulators,” join Lawfare’s Justin Sherman to discuss how state enforcers and regulators can hire and better work with technologists, what technologists are and are not best-suited to help with, and what roles technologists can play across the different phases of enforcer and regulator casework. They also discuss how to best attract technologists to enforcement and regulation jobs; tips for technologists seeking to better communicate with those lawyers, compliance experts, and others in government with less technology background; and how this all fits into the future of AI, technology, and state and broader regulation.

To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.

A transcript of this podcast appears below. Please note that the transcript was auto-generated and may contain errors.

 

Transcript

[Intro]

Erie Meyer: Having a technologist on your team can help the conversation go from ‘here is the sample that we have designed to prove to you that we're not doing anything wrong’ to ‘we will just simply take all of the data.’ Something like ‘sampling is for ice cream’ was the rule of the road in a lot of law enforcement offices once they had a technologist in-house who could help them really get the information they needed to investigate conduct.

Justin Sherman: It's the Lawfare Podcast. I'm Justin Sherman, contributing editor at Lawfare and CEO of Global Cyber Strategies, with Erie Meyer, senior fellow at Georgetown Law’s Institute for Technology Law & Policy, and Laura Edelson, assistant professor of computer science at Northeastern University, who are co-authors of a recent toolkit on bringing technologists into regulatory bodies.

Laura Edelson: Regulators looked at the early days of, call it, Web 2.0, and thought, you know, well, we just don't know how to tackle this yet, so we'll just wait and see how everything shakes out in 10 years. I think that went very badly. And I think if regulators decide again that they will just see how all of this shakes out in 10 years, then probably some very bad, possibly illegal patterns will be entrenched.

Justin Sherman: Today we're talking about how technologists can help state enforcers and regulators at all levels and how to make it happen.

[Main Podcast]

Why don't you start by telling us more about yourselves: briefly, your respective backgrounds, as well as what you are working on currently.

Laura Edelson: Hi, my name is Laura Edelson. Currently, I'm an assistant professor of computer science at Northeastern University. I had a career as a software engineer for quite a few years before deciding to become an academic. Then I did my dissertation basically on the digital ad tech stack; that might be a good shorthand for it.

Then, when I was asked to go to serve at the Department of Justice as the chief technologist of the Antitrust Division, it was just too interesting an offer to turn down. So, I went there and I helped them think about both the digital ad tech stack as well as a whole bunch of other interesting problems, like what collusion might look like inside of an algorithm and other things like, you know, when is a search algorithm anti-competitive? And all of that stuff was a lot of fun and interesting. And so I went back to academia for a year, but then they, they lured me back and I did a second tour with the Civil Rights Division exploring other sorts of interesting questions about algorithms and civil rights there.

Erie Meyer: And I am Erie Meyer. I am a senior fellow at the Vanderbilt Policy Accelerator, as well as at the Georgetown Institute for Tech and Society. Before that, I have had a career in both technology and policy. I worked in the Ohio Attorney General's office during the financial crisis using open-source software and human-centered design to help address people's consumer problems, including just really tough stuff during the mortgage crisis, then ended up being part of the team that stood up the Consumer Financial Protection Bureau in response to that crisis.

I did a stint at the Office of Science and Technology Policy working on open data. I founded the U.S. Digital Service, now known as DOGE, and then went to Code for America to work with their national network of technical people across the country fighting to make their cities work better. I then went to the Federal Trade Commission, during the first Trump administration, to work for Commissioner Chopra as a technologist on both consumer protection and antitrust cases, and then served as chief technologist for Chair Khan at the Federal Trade Commission, and as head of the Office of Policy Planning for a short period of time.

And then went to the Consumer Financial Protection Bureau again, this time as chief technologist and senior advisor to the director, where my work was really to take a hard look at big tech lurching into consumer finance and make sure that we were ready to keep consumers safe.

Justin Sherman: A great segue, keeping consumers safe.

We're going to talk today, as we heard in the introduction, about a new toolkit you have co-authored along with two other of your colleagues titled, “Working with Technologists: Recommendations for State Enforcers and Regulators,” which is published by Georgetown Law’s Institute for Technology Law & Policy.

So we're going to get into all that. Let's, let's start sort of by setting the scene. First, are state enforcers and regulators already working with technologists, and how much or how little, et cetera? And what's the state of affairs say for your typical state AG’s office, you know, doing privacy or a regulator working on financial issues?

Erie Meyer: So, today there are still technologists at the Consumer Financial Protection Bureau. They have been fired, unfired, fired, unfired, et cetera a number of times since the Trump administration started, but folks are still there right now. Our understanding from court records is that folks are not being allowed to do their work on things like supervising big tech companies that are in the financial space, which is tough, but they're still there.

There is a technologist team still at the Federal Trade Commission. They have a chief technologist and a number of technologists who were hired under Biden and are still there, and I now believe that there are maybe 20 state attorneys general that have some technical staff in-house.

Sometimes it's what we on the podcast would call a technologist; sometimes it's an investigator who happened to have a very technical background and is able to really dig in on investigations in a wonderful way.

So, I would say, you know, things are a little strained at the federal level. But we're really seeing very impressive growth at the state level with technologists in enforcement agencies.

Justin Sherman: What are some of the reasons, then, that state enforcers and regulators, you mentioned some federal regulators as well, might need to work with technologists? Or perhaps not even realize they need to work with technologists, but they really should be doing that?

And when you say working with technologists in quotes, what exactly does that mean here?

Laura Edelson: So, there's a few questions inside there. I'm gonna start with the first one you asked. I think the reason that quite a few regulators probably need to be working with technologists is that there are important questions that they need to answer to do their jobs, where the answers to those questions are inside technical systems.

And those technical systems can be pretty inscrutable to people who don't have, in some cases, quite specific expertise. So I'm talking about things like just the ability to read source code. Sometimes the answer to a regulator's question is written down somewhere very, very clearly, very explicitly. But in order to read that answer, you need to be able to read Python or something like that.

Other times, an answer to a regulatory question is embedded inside a machine learning model. And in order to get the answer to an important legal question, you need to be able to test how that model performs under a range of conditions. So it's a little bit regulator-dependent and it's a little bit question-dependent, but I think there are more and more questions like this that are relevant to just a huge number of regulators.
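As an illustrative aside to Laura's point about testing how a model performs under a range of conditions: the kind of probe she describes can be sketched in a few lines. Everything here is hypothetical; `loan_model` is a toy stand-in for a system under investigation, and a real examination would exercise the actual system through its own interface rather than a made-up function.

```python
# Illustrative sketch: probing a model's behavior across conditions.
# `loan_model` is a hypothetical stand-in, deliberately encoding a suspect
# rule so the probe below has something to find.

def loan_model(income: float, zip_code: str) -> bool:
    """Hypothetical approval model under examination."""
    return income > 40_000 and not zip_code.startswith("902")

def probe(model, incomes, zip_codes):
    """Approval rate for each zip code, holding the tested incomes fixed."""
    results = {}
    for z in zip_codes:
        approvals = [model(i, z) for i in incomes]
        results[z] = sum(approvals) / len(approvals)
    return results

rates = probe(loan_model, incomes=[30_000, 50_000, 80_000],
              zip_codes=["90210", "10001"])
print(rates)  # approval rate per zip code at identical incomes
```

Comparing outcomes across groups while holding other inputs fixed is the core move; a real test suite would sweep many more conditions and use the system's actual inputs and outputs.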

Erie Meyer: Two anecdotes I'll share. I was––I won't say which agency, but I was once working on a case, and part of the way that we were calculating the fine was the number of people that had seen a certain page. The attorneys I was working with said, well, there's no way to tell; we don't even know if they know how many people have seen this page.

And I said, I think they probably have website analytics. And they said, there's no way to know. And then I opened the developer tools in Chrome and showed them the analytics string and their jaws dropped. And that's just like a mini example of the kinds of things that are happening on cases––you know, that's just calculating a fine, but imagine trying to figure out, you know, which data to request in a subpoena, how to design a remedy, how to even understand what's happening in a market and where you really need to be watching closely.

Another example is, I remember when Cambridge Analytica was in the news and working with an attorney who said, well, you know, if Facebook just stopped tracking people on facebook.com, that would really solve it.

And then I talked about pixels, tracking pixels, and they hadn't really––this was before our friends at The Markup made that beautiful Blacklight tool to help people understand some of the tracking technology. So I ended up having to go to this lawyer's tiny, tiny, tiny small-town newspaper website. And they even said, like, oh, I'm sure it's not on this website.

We can go to a bigger newspaper. I'm like, no, I'm sure it's here. And again, just opening up dev tools and showing them literally what a tracking pixel is and some of the types of data that were being transferred. Again, jaws on the ground. And this is how everything works now. So having people, not just as an expert for a trial, but really integrated as part of the case team, is just critical.
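As an aside, the kind of tracker check Erie describes can be partially automated. This is a hedged sketch: the tracker list below is a tiny, illustrative sample rather than a complete catalog, and a real audit (like The Markup's Blacklight) loads the live page and watches network requests rather than pattern-matching static HTML.

```python
import re

# Small, non-exhaustive illustrative list of third-party tracker hostnames.
TRACKER_PATTERNS = {
    "Google Analytics": r"google-analytics\.com|googletagmanager\.com",
    "Meta Pixel": r"connect\.facebook\.net|facebook\.com/tr",
}

def find_trackers(html: str) -> list[str]:
    """Return names of known trackers referenced in a page's HTML source."""
    return [name for name, pat in TRACKER_PATTERNS.items() if re.search(pat, html)]

# Invented sample page for illustration.
sample_page = """
<html><head>
<script src="https://www.googletagmanager.com/gtag/js?id=G-XXXX"></script>
</head><body>
<img height="1" width="1" src="https://www.facebook.com/tr?id=123&ev=PageView"/>
</body></html>
"""
print(find_trackers(sample_page))  # → ['Google Analytics', 'Meta Pixel']
```

Static scanning misses scripts injected at runtime, which is why browser-based inspection (dev tools, as in the anecdote) remains the more complete method.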

Justin Sherman: Blacklight is definitely a great example of some of what we're saying. So related to that, your toolkit, given all these needs, lays out a series of explanations for how state enforcers and regulators can leverage technologists, to what ends, and different ways to do it. And for anyone listening who is in those populations, I'll also say it's great that there are template formats: you can actually just copy this material and adapt it to exactly what you need to do, with the framework already there. So in that vein, though, what kinds of functions can technologists help with at state enforcement or regulatory bodies? And in this context, when we talk about technologists here, what are the functions they are not intended to be helping with as well?

Erie Meyer: We've talked a little bit about their work in investigations and helping understand conduct or even measuring conduct. Another place they can be really important is around data acquisition. Law enforcement offices of olde would sometimes be offered a sample of data from a defendant to look at, to see exactly what was going on.

With modern data tools and infrastructure, having a technologist on your team can help the conversation go from ‘here is the sample that we have designed to prove to you that we're not doing anything wrong’ to ‘we will just simply take all of the data,’ you know. Something like ‘sampling is for ice cream’ was the rule of the road in a lot of law enforcement offices once they had a technologist in-house who could help them really get the information they needed to investigate conduct.

Another thing that an in-house technologist can do is really think about implementing government programs. Imagine the way that a whistleblower gets in contact with an office, or how a consumer or a small business reports either an antitrust violation or a consumer complaint to an office. You know, my office did an extraordinary report on chatbots and consumer finance that was really fueled by the type of information we were able to get from the public. And technologists can be critical in getting that information from the public.

Then there are things like working with third parties who are technical. You know, if you're working with a standards body, and again, some of my best friends are lawyers, but it can really slow down translation if the person trying to get that expertise has to start from zero and isn't really conversant with those third-party areas. And then there's also just policy development.

You know, I, I think industry would really like extremely complicated rules that only the biggest firms could comply with. And one of the benefits of having technical folks in the room is that you can really draw bright lines that, you know, startups or, or entrepreneurs can better adhere to and take advantage of without having to hire the lobbying-industrial complex of people who are going through a revolving door between these agencies and these firms.

Having crisp policy development with technical folks in the room can alleviate a lot of pain and leave room for innovators to really be able to get cool things done.

Laura Edelson: I just want to add a little bit, actually, to both of the points about building out infrastructure and capacity and also policy. One of the things that I'm really proud that I did when I was at Antitrust is I helped them develop a technology modernization plan because they had so many cases that were very data intensive, and this is just a way the world has changed.

Everything in the world, everything about businesses, has become so much more data-driven. We have so many more emails. We have so much more customer data. All of these things exist. And so, the Antitrust Division just did not have the capacity to do the kinds of investigations they wanted to do.

I'm not even talking about the tech sector; I mean in, like, agriculture, because every single business has become a very data-heavy business. And so, you know, I met with case teams, I met with people across the division, to really try to understand how cases worked and what they thought cases would look like in the future.

And then we built, you know, a plan to modernize their resources, the kind of long-term technology assets they would have. And I think it's going to be really transformative, but obviously that's going to be a multi-year process. That's something that we started back in 2023, and they're maybe about halfway through it. But I think that's the kind of thing that's going to pay dividends for a long time.

And it's the kind of thing that, you know, technologists can, you know, can come in, they can consult, you know, with people inside the agency to understand their workflow and then figure out a long-term strategy that will help folks make the most of the time and the resources that they have.

On the policy point, I think that when people think about technology as a black box and they just think about ‘this is a thing that has this current outcome,’ and they don't really understand how it works, because there isn't a technologist on the team, they just don't understand the range of possibilities that are available to them.

And these are kind of just basic questions of fact that help you think more accurately about the future and help you think more creatively when you're trying to figure out policy solutions, when you're trying to think about the range of outcomes you want to drive. And I think, at this point, every policy team should at least have a consulting technologist because again, there are just fewer and fewer aspects of life that technology does not touch on.

And if there's a technological system involved, it really, really pays to have someone who can explain to the broader team exactly how that system works and what the other considerations might be.

Erie Meyer: To your question about what a technologist doesn't do, sometimes when the first technical person shows up in an office, you get questions about whether or not you're going to be fixing the printer.

And so Laura was being a little modest. She won a $44 million program to help modernize the Department of Justice with her leadership and envisioning what the future of antitrust work would be, which is going to be very, you know, heavily involved in tech. She did not, you know, go from office to office updating people's Microsoft 365.

So it's really not IT support. Those things are super important, but those are sort of the traditional teams that are already in government. These are more of your policy teams, as opposed to the IT folks.

Justin Sherman: That's a really important clarification. And as you were saying it, as you both know much better than I do, you could have a technologist working on cases who is not building anything per se, but just interpreting, and you could have another next to them who's actually building a tool to do the enforcement.

So, in that vein, practically speaking, there's also a tremendously valuable component which you've referenced already, which you talk about in your toolkit related to the skills and the knowledge that technologists bring to the table.

I'll talk about the flip side in a second, but what are some of those skills on the state enforcement and regulatory side, where you do have a lot of experts, but who are perhaps lawyers or compliance experts not specialized in tech? What are some of those big gap areas that you both have seen that technologists can help close?

Erie Meyer: One of the things I love to highlight is that when we say technologist, sometimes I mean a very fancy computer scientist like Laura. Sometimes I mean a very impressive product manager like Kevin Mori from the CFPB. Sometimes I mean a designer like Steph Nguyen from the FTC. Your technologists can have a varied background, and the types of skills that could come forward are, for example: if we're investigating an unfair or abusive dark pattern, let's get the KPIs for the team that's working on the project.

If you're investigating AI sycophancy, let's look at the design methodology in the A/B testing and how things performed. It's the ability to know not just what is produced, but how it is produced, and being able to get records and data and information that can reveal some of the underlying thinking and conduct, that's really important both to a judge and to a successful investigation.

Laura Edelson: I do think there's one other specific thing that a lot of different technologists can bring. One of the most regular things that I do when I engage, whether it's going to DOJ for an extended stint or whether it's just a little bit of help to a state regulator, is decoding tech jargon.

There's so much work to be done to do things like help regulators make document requests using the right words, the right jargon, that will actually get the documents they're looking for produced. And then there's also the decoding of the way that language is sometimes genuinely abused. It's this other dialect of how we talk about products, how we talk about how they work, and all the language of systems that is really unfamiliar to lawyers and policymakers in many cases, and is sometimes, I think, deliberately opaque.

And just bringing in someone who knows that jargon and is familiar with it, that actually can be very helpful.

Erie Meyer: To Laura's point, Stephanie Nguyen and I just hosted a training for government workers on AI, and as part of that training we played a game of Bullshit Bingo, where we talked about the different ways that regulators especially, but everyone in government, are fed B.S., whether it's part of sales presentations or to try to obfuscate conduct, and some of the tells that even non-technical government teams can use to ask better follow-up questions. But yeah, I totally agree with Laura.

Justin Sherman: Conversely––I said we'd get to the flip side, so now is this moment––I know this wasn't as directly covered in your guidance, but just 'cause it's related: much like lawyers may need a new tech skill set, technologists who attempt to work with lawyers or policymakers or business leaders, and who have not done that much before, might need additional skill sets, or to strengthen skill sets, around how to best communicate with attorneys and how to best work within a bureaucratic agency, especially if they're coming from the private sector or a nonprofit.

So, if either of you had any guidance for technologists looking to these kinds of roles, what would you say in terms of, you know, best practices for communicating effectively in these environments and with these kinds of, of coworkers?

Laura Edelson: That is such a good question. I think there is an almost universal problem for anyone who is a specialist in a certain area: figuring out how to communicate to people about their area without using all of their specialized language. How to communicate to a general audience.

I have spent a lot of time thinking about de-jargoning my own language and there's just some code switching, because as soon as I'm back, you know, with my computer science people, I, you know, obviously I need to go back to that jargon because that's how we communicate. But I put a lot of time into thinking about, what is the, what is the core intuition that I need to communicate?

You know, if I'm, if I need to get this room full of lawyers to understand the thing that is important about that technology, I start by figuring out, like, what is the core idea I need to communicate? And then, what's the thing about that that they need to know? Like what is relevant to them and to the thing that they need to take away?

And once I stop trying to explain how a system works, but instead I am starting with ‘what is the core insight’ and I can work backwards, that tends to go much better. Especially when it's been de-jargoned.

The other thing that I have started doing, and maybe this is like a cultural difference: at least in the technology spaces that I am very used to, and certainly in academia, if you say something and someone doesn't agree with it or they don't follow it, they'll say, wait, wait, wait, slow down, explain that. Or, I don't understand that, or, is that right?

Someone will stop you every single sentence if they do not understand the thing that you have said. And it really took me a beat. But I have come to realize that I need to, I need to be the one to stop and to say, okay, did that make sense?

Can you tell me what you heard me say, or what does that mean to you? I have to actively encourage people to really be sure that they understand something, and if not, to ask a question, because that is much less of a norm for lawyers. In my interactions with them early on, they wouldn't necessarily stop me if they didn't understand something, because they would think that somehow it was their problem.

And it's like, no, no, no, no, if you don't understand something, that's my problem. So we're going to solve that together. But I really started being very aggressive in asking people that question.

Justin Sherman: That's a really important point. Your guide, building on many of these things you've mentioned, wraps up with helpfully talking about, you know, here are the different stages, generally, of state enforcer and regulator casework, from pre-investigation and discovery to actual investigation to litigation and, and settlement. And you explain how technologists can potentially add value and help at each of those different stages.

So, loosely, in those earlier casework stages, pre-investigation, discovery, what are some of the ways that technologists can help, or, of course, things you can talk about and ways you've seen them help in your own experiences?

Erie Meyer: In earlier casework stages, one of the ways that technologists can be really helpful is discovering potential cases. I think an anti-pattern in law enforcement offices is when cases are only driven by someone in the office being a friend of a friend who had the issue. Technologists, you know, err towards the systemic, and are able to discover places where things are broken in interesting ways.

I had our team, as part of their regular work, pitch cases as regularly as they could, because they can also see patterns that otherwise might not be immediately obvious. And this is so critical for discovery: having technical people in the room for the discovery process, including in negotiated discovery, where a firm will hire, you know, the former chair of the Federal Trade Commission to sit in a negotiation with me and my team and say, oh, well, you know, we don't have those records, or we wouldn't have any way to produce them. It helps to have someone on our team who can say, great, what do you all use for version control? and then follow up from there to actually get the data and the records that we really need for these cases.

It's been really helpful to have technical folks with different disciplines in those early discussions to make sure we're not just slapping a bandaid on at the end, but that we have technical expertise baked in throughout the process.

Laura Edelson: I just want to plus-one two things that Erie said. First of all, one of the things that I think is not well understood outside of tech is that if we're talking about a piece of software, or a model, or anything like that, there are truly exhaustive records kept; lawyers almost have trouble fathoming the level of detail that we keep.

You refer to version control. I have had multiple conversations where I explain to lawyers what version control is and how it works.

When I write a piece of code, I might check in that code, so that there is a permanent record of its state, multiple times per day. And that's a very normal way of working. And at minimum, I'll be able to look back through history to see what the state of the code was every single time it was built and deployed.

And I will be able to know exactly what code was deployed where, at what time. Again, truly rigorous, exhaustive logs are kept for years, and that is very, very normal. The idea that code history might not exist is ludicrous. And the idea that system logs going back for months or years would not exist, that is ludicrous.

And this is actually, I think, a structural advantage for regulators who want to answer some factual question. If you're talking about a factual question inside a technological system, the data to answer your question almost certainly exists. The question is just, can you get it?
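To sketch Laura's point that such records can answer a factual question directly: assuming a deployment log of the general shape below (the format, service names, and entries are invented for illustration), answering "what code was live at time T" is a few lines of parsing.

```python
from datetime import datetime

# Hypothetical deployment log, of the kind Laura describes systems retaining.
LOG = """\
2023-04-01T09:12:03Z deploy service=checkout commit=a1b2c3d env=prod
2023-04-02T14:30:45Z deploy service=checkout commit=d4e5f6a env=prod
2023-04-03T08:05:10Z deploy service=search commit=0f9e8d7 env=prod
"""

def deployed_commit(service, at):
    """Which commit of `service` was live at time `at`, per the log?"""
    live = None
    for line in LOG.splitlines():
        ts_str, _, *fields = line.split()
        record = dict(f.split("=") for f in fields)
        ts = datetime.fromisoformat(ts_str.replace("Z", "+00:00"))
        if record["service"] == service and ts <= at:
            live = record["commit"]  # most recent deploy at or before `at` wins
    return live

when = datetime.fromisoformat("2023-04-02T00:00:00+00:00")
print(deployed_commit("checkout", when))  # the commit live at that moment
```

Real systems record this in version control and deployment tooling rather than a single text blob, but the investigative move is the same: the answer is written down, and the question is access.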

And then also, on this point that Erie's making about case origination and systemic thinking, I think that's also a hundred percent right. Technologists: it is our job to think about how systems work and to try to infer and develop patterns. That's all we spend our time thinking about.

And something that is like a reflexive pattern I have: when I interact with any kind of user-facing system, I'm usually trying to figure out, well, what's the pipeline that got this final result in front of me? And how can I shift how I'm interacting with this system to try to expose some piece that might be in the middle? And very often we can. And we can develop ways to do that.

That is something that technologists will be doing all of the time. And so, you know, you might think, well, it's impossible to tell what's in the middle between point A and point C. And sometimes it's hard, but it's possible more often than you'd think.

Justin Sherman: Given these different roles technologists can play, I, I wanna talk more about how to actually get folks into these roles and some of the, the ways to navigate the challenges.

So, you know, Erie, as you've mentioned, you've hired a number of technologists into different federal agencies, in addition to serving yourself. Laura, you've also had a number of these roles, and led some of these efforts as well.

What are some of the most important steps to take, or enticements to create, to get those technologists from outside government, be they still in college, at a company, working at an NGO, deciding their next move, or laid off or terminated, to enter into public service? And how do you really make the case for why all the work you're talking about here is so important?

Erie Meyer: So, hilariously, every time I have posted a job for a technologist who would work on protecting consumers or helping small businesses, I get like a thousand job applicants. People are desperate to do this work.

Think about people in trust and safety at firms. They're working towards a set of goals; they're not working on obligations under the law. When you say to them, ‘hey, what if instead of a goofball board, instead of, you know, fighting everyone to nudge towards doing the right thing, you were reporting to the taxpayers, to the American people, and defending civil rights?’

You know, people fought and died for these things to be on the books in some cases. My mom couldn't have a bank account in her own name when she was a young woman. And I think when you say to somebody, if they're a computer scientist or a designer or a product manager, ‘hey, you can fight to protect your neighbors, and you can do it with, with dignity and, and to make the whole country better,’ they jump at the opportunity.

Laura Edelson: I think that's completely right. I think that personally, and I think this is true of a lot of other people, the thing that I found really appealing about working in government was that it was going to be really impactful work.

It was work that was actually going to make a difference, hopefully a positive one, for Americans. There are so many jobs that you can do in tech that will make you a lot of money and might have pleasant working conditions, and there are very few jobs that I can think of that would allow you to have more positive impact on the world than working as a technologist in government. Certainly, that can't be the only thing, but I actually think that it's an incredibly powerful draw.

Justin Sherman: Certainly. And given that, what do you think about the challenges, right?

Because of course again, none of us are, are saying there aren't any, how do you see those both at the state level and federally in getting technologists into government? I'll say––you can combine them or whatever––but generally, right, as a general matter, say, someone's making a ton more in the private sector or they have certain perceptions that might be incorrect about government bureaucracy or something.

And then second, with respect to the current moment we're in and views folks may have about, let's say, at a state level, right? Unrelated to federal developments, but you know, just the general, the challenges of, of recruiting technologists now in that way.

Erie Meyer: I think the average American would be floored when they found out how under-resourced state-level enforcers are. It is shocking that there's like, one dude, in many states, leading all of the consumer protection and privacy work. So the biggest challenge that we're having is on purpose, which is starving the teams of the resources they would really need to hold these companies accountable.

Laura Edelson: The other thing that I would say is that there's a whole range of technologists that are doing a whole range of things that is not maximizing the amount of money that they earn. I had an industry job and career, and it made me really sad. And so I went into academia, which is not known for, for being the best-paying.

And there are plenty of other folks that want to find ways to use systems thinking and their ability to build things in the world and would like to do it. They would like to, you know, earn enough to support, you know, to support their families. But they don't need $10 million worth of stock options. What they need is to be able to look at themselves in the mirror.

And I think that, you know, I, I absolutely understand the concern about the, the high salaries that technologists command, but one, I don't think that government has to compete with the private sector because I think they're offering something very different to potential workers. And the other thing is that, the reality is, I really think technologists are really, really cost-efficient.

I know that compared to what the government paid me, I saved the government a lot of money. And I think that, you know, this isn't going to be equally true for every worker and in every role, but I think while technologists do tend to be on the higher end of the pay scale, they are super, super efficient and they really deliver a lot of value.

Justin Sherman: 100%. And, and that has also been my experience as well. So you, you've both worked––just to make sure, of course, we hit, hit the AI topic––you've both worked at length on, on issues related to artificial intelligence technologies, or perhaps sometimes we can say so-called artificial intelligence technologies.

How do you see the recent developments and focus on AI right now fitting into this picture in terms of how states and regulators could be leveraging and perhaps could not or should not be leveraging technologists?

Erie Meyer: It seems like state leaders really know that they have to nail these issues. I think regulators believe that no matter what's being used to harm people, they should be able to understand it and address it.

And I think if you lack the in-house technical expertise to know what AI is, or what a chatbot is, or how kids are using it, you're behind the eight ball. And so I think integrating technologists into the enforcement and regulatory work as key, core members gives the regulators a chance to actually keep pace with things like the, to your point, gigantic investments in AI going on in a lot of these firms.

And it also helps us get to the core issues of why those investments are being made. I think following the money and making sure you understand not just what's being built, but why it is being built is really critical.

Laura Edelson: I just think we are in the, the Wild West era of all of these generative AI and large language model-backed and VLM-backed tools. And I think that there are a lot of companies that are throwing spaghetti at the wall. And sometimes that's fine, and sometimes it's, I think, probably either creating, you know, unfair, deceptive trade practices, or they're having models that, that maybe are behaving in ways that are illegally discriminatory.

And it's just because we're in this phase where a lot of people think no regulation exists so that they, you know, where they're just going to try things. And I do think that regulators are at least a little bit on the back foot in the AI arena because they, they look at these systems, they don't have someone on their teams who can actually explain what's going on under the covers, and they don't even know where to start.

And so I do think that regulators need to figure out how they're going to actually do their jobs and embrace their mandates in this new technology era. But I, I just think it was such a mistake when regulators looked at the early days of, call it Web 2.0, and thought, you know, well, we just don't know how to tackle this yet, so we'll just wait and see how everything shakes out in 10 years.

I think that went very badly. And I think if regulators decide again that they will just see how all of this shakes out in 10 years, then probably some very bad, possibly illegal patterns will be entrenched. And so I, I do think regulators have to, you know, whether it’s picking up the phone and talking to folks at their state universities, or whether it's creating job openings for technologists who can help them understand how their regulatory mandate intersects with AI systems, I, I think they, they just need to figure out the answers to those questions.

Justin Sherman: If you could recommend that a member of Congress or perhaps a state legislator listening, or someone who can hire a technologist and use this toolkit is listening to this, if there's one thing you could tell them to do to change how their organization or the organizations they influence work with technologists, what would that––for each of you in whatever order—what would that one thing be and why?

Erie Meyer: Yeah, to do a callback to Laura's earlier comment, I would say having technologists in-house is cheaper, I promise. So, get resources to the teams that you want to be effective in regulating the companies that are using this technology. It is outrageous that you would have a team of lawyers going up against well-funded corporations with more power and data than ever imagined in human history and not even have a single technical person on their side. So, my one thing would be resource the teams, get them in-house.

Laura Edelson: The other thing that I would say is I think it pays to think broadly, at least initially, about where technologists can contribute.

I found that, like, I was not expecting, when I came in, that an important thing for me to do would just be to figure out how we would create an organization that had the technical ability to do all the law enforcement that we needed to do. It just wasn't, it wasn't something that anyone had asked me about ahead of time.

It was something that I discovered once I got there that they just like didn't have the systems that they needed and they didn't have the skillsets they needed to do some of the things that they said they wanted to do. When I was asked to join, they sort of knew that there was probably some policy things that they might want to talk to me about.

But then when I actually got there and started talking to people about the different issues that were coming up, you know, I had to learn about treaties, because there were treaties that we were negotiating with other countries that had, that had important competition questions and important technology questions.

It took a minute to figure out what made sense to be in my portfolio, and thinking broadly, initially, about what that was let me be as effective as possible in the different scenarios that I wound up being in.

Justin Sherman: That's all the time we have. Erie, Laura, thanks very much for joining.

Laura Edelson: Thanks Justin.

Erie Meyer: Thanks. This was fun.

Justin Sherman: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters.

Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Allies, The Aftermath, and Escalation, our latest Lawfare Presents podcast series about the war on Ukraine. Check out our written work at lawfaremedia.org.

The podcast is edited by Jen Patja and our audio engineer this episode was Goat Rodeo. Our theme song is from ALIBI Music. As always, thank you for listening.


Justin Sherman is a contributing editor at Lawfare. He is also the founder and CEO of Global Cyber Strategies, a Washington, DC-based research and advisory firm; the scholar in residence at the Electronic Privacy Information Center; and a nonresident senior fellow at the Atlantic Council.
Erie Meyer is a senior fellow at the Vanderbilt Policy Accelerator where she focuses on the intersection of technology, artificial intelligence, and regulation.
Laura Edelson is an Assistant Professor of Computer Science at Northeastern University, where she co-directs the Cybersecurity for Democracy project. She studies the spread of misinformation and other forms of harmful content in large online networks.
Jen Patja is the editor of the Lawfare Podcast and Rational Security, and serves as Lawfare’s Director of Audience Engagement. Previously, she was Co-Executive Director of Virginia Civics and Deputy Director of the Center for the Constitution at James Madison's Montpelier, where she worked to deepen public understanding of constitutional democracy and inspire meaningful civic participation.

Subscribe to Lawfare