
Lawfare Daily: Digital Forgeries, Real Felonies: Inside the TAKE IT DOWN Act

Renee DiResta, Mary Anne Franks, Becca Branum, Adam Conner, Jen Patja
Tuesday, May 6, 2025, 8:00 AM
What is the TAKE IT DOWN Act?

Published by The Lawfare Institute
in Cooperation With
Brookings

The TAKE IT DOWN Act is the first major U.S. federal law to squarely target non‑consensual intimate imagery (NCII) and to include a component requiring tech companies to act. Long handled through a patchwork of state laws, NCII is now criminalized at the federal level—both authentic images and AI-generated digital forgeries—and platforms must remove reported NCII within 48 hours of notification by a victim or victim's representative. TAKE IT DOWN passed with wide bipartisan support—unanimously in the Senate, and 409-2 in the House. Melania Trump championed it, and President Trump is expected to sign it. And yet, some of the cyber civil rights organizations that have led the fight to mitigate the harms of NCII over many years have serious reservations about the bill as passed. Why?

Lawfare Contributing Editor Renée DiResta sits down with Mary Anne Franks, President and Legislative & Technology Policy Director at the Cyber Civil Rights Initiative, and Eugene L. and Barbara A. Bernard Professor in Intellectual Property, Technology, and Civil Rights Law at the George Washington University Law School; Becca Branum, Deputy Director of the Free Expression Project at the Center for Democracy & Technology; and Adam Conner, Vice President, Technology Policy at the Center for American Progress, to unpack what the bill does, why it suddenly cruised through on a rare bipartisan wave of support, and whether its sweeping takedown mandate will protect victims or chill lawful speech. This is a nuanced discussion; some of the guests support specific aspects of the bill while disagreeing about the implementation of others. Expect clear explanations, constructive disagreement, and practical takeaways for understanding this important piece of legislation.


To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.

Please note that the transcript below was auto-generated and may contain errors.

 

Transcript

[Intro]

Mary Anne Franks: And so inadvertently, I assume, this bill actually wouldn't apply to deepfakes—that's one—so that seems like a really big problem for underinclusivity, and it does apply to all kinds of non-consensual visual depictions that are not, in fact, what we would define as NCII according to the rest of the criminal statutes. So those seem like really, really big problems.

Renee DiResta: It's the Lawfare Podcast. I'm Renee DiResta, contributing editor here at Lawfare and associate research professor at Georgetown University McCourt School of Public Policy. With us are Mary Anne Franks, President and Legislative & Technology Policy Director of the Cyber Civil Rights Initiative, and professor at George Washington Law School; Becca Branum, Deputy Director of the Free Expression Project at the Center for Democracy and Technology; and Adam Conner, Vice President of Technology Policy at the Center for American Progress.

Becca Branum: Something I'm concerned about and I think is worth paying attention to is this model getting exported to other kinds of content where there might not be as much unanimous agreement that it deserves to be taken down, and sort of replicating this across wider swaths of content that are even harder than NCII to identify.

Renee DiResta: I'm here today with three distinguished guests to talk about the TAKE IT DOWN Act. The act is sitting on President Trump's desk and he's expected to sign it, so it's important to understand what it does. This is the bill that penalizes non-consensual intimate imagery at the federal level, which nearly everyone agrees is a good thing, but it also has some provisions requiring that platforms take it down, which even some strong supporters of NCII penalization are concerned could lead to censorship or overenforcement.

[Main podcast]

Let's start by helping the listener understand the various provisions of the bill. So, TAKE IT DOWN is actually, as with many things in Congress, an acronym. So it is—I'm gonna read this—the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act. And it generally prohibits the non-consensual online publication of intimate visual depictions of individuals, both authentic and computer-generated—so, touching on AI, sometimes what's known as deepfake content—and requires certain online platforms to promptly remove such depictions upon receiving notice of their existence.

And I guess we've got sort of four key aspects to this bill. We have criminalization of non-consensual intimate imagery, so making it a federal crime to knowingly publish or threaten to publish intimate images without the subject's consent, again, both authentic and AI-generated deepfakes. We have platform responsibilities that are now attached to this: a requirement to remove reported non-consensual intimate imagery within 48 hours of notification by the person depicted or a representative, and also, I believe, steps required to eliminate duplicates of the content.

There are now criminal penalties for that, including fines and imprisonment. Then there is an enforcement component where the FTC is tasked with enforcing the act, treating violations of, I believe, that notice and take down process as deceptive trade practices.

So a whole lot of stuff in there, and Mary Anne, as you mentioned, things that people have tried, have had implemented, I think at the state level, and now this is sort of the first time that we've seen this brought up to the federal level.

I dunno, maybe let's start with part one, I guess—start with the criminalization component. I dunno if you wanna talk a little bit about what that is doing, the sort of comparison between what this is doing at the federal level versus the very patchwork approach that the states had taken previously.

Mary Anne Franks: Yeah. So when we started—and I say we, meaning the Cyber Civil Rights Initiative—started calling for legislative reform back in 2013, there were only three states that criminalized image-based sexual abuse; there are now 49. If you're wondering who the holdout is, it's South Carolina.

And there are, there are lots of different definitions, there are lots of different penalties, categorizations, and so one of the reasons why we have been so insistent on pushing for a federal criminal law in addition to the state laws, is that you really do want to have a uniform definition of the offense. You want to have the sort of weight of the kind of deterrence effect, right, of something being a federal crime.

And so we, we were thrilled to some extent to see that this was getting pickup because the, the language that is folded into TAKE IT DOWN is language that was known as a standalone bill called the SHIELD Act for the last few years; before that, it was called a few other things. The first version of it was called the Intimate Privacy Protection Act.

And you know, we were pretty happy with that criminal provision when it comes to authentic depictions, and we had also worked with multiple members of Congress on a deepfakes provision because our view is that it's a slightly different matter and it really needs to be phrased somewhat differently. And roughly speaking, both of those sections have made it into TAKE IT DOWN and they're basically good.

I do have a couple of reservations. One of them is that there's a massive exception written into the bill that says these criminal penalties don't apply if the person distributing or possessing the image is the person in the image. Now, what I think they were trying to say is that if you have a selfie and you distribute it of yourself, you won't be criminalized. But that's not what the exception says. It basically says if that person is in it as well, right—it's not limited to a depiction of only themselves. So I think that's a really terrible loophole, and it was something we raised repeatedly with the sponsors, and for some reason it did not get addressed. So I think that's a bad thing.

But apart from that, I'd say that the criminal provisions are pretty good. They're better than most of the state provisions. They're, they're very narrow, they're very specific, they're very clear. So I think that part of it is the win that we want to celebrate because that is something that we have needed for, for over a decade, and it is good that we finally have that criminal provision.

Adam Conner: Mary Anne, can I just ask a quick question about this? Obviously this involves what's in the law, and obviously I know non-consensual intimate imagery is the term we should be using there. Is there a preferred term we should be using for what the law calls a digital forgery, or a deepfake? Is there a preferred term we should use as we discuss this bill?

Mary Anne Franks: Thanks for that question. That is something I was quite gratified to see in the bill—that they did adopt the language we have suggested, which is sexually explicit digital forgery as opposed to deepfake. I've never really been comfortable with that term, given that, you know, that's not a real term, and it is named after a user who kind of created this problem. So that's not great.

So I do think digital forgery gets closer to what we're trying to describe. So we tend to say sexually explicit digital forgery, and/or inauthentic or synthetic non-consensually distributed intimate imagery, which is kind of a mouthful.

Adam Conner: Thank you.

Renee DiResta: Why do you think they chose to pursue this legislation now? What do you think was the motivation for the, the momentum that we just saw here?

Mary Anne Franks: This is something that gives me pause, because, as I mentioned, it's been a really long time since we first started calling for this bill. We've had a bill in place—a better one, I would say—back in 2016, introduced almost every single year since then, and almost every single year it got very close to passage, and then there were objections, sometimes from the ACLU and other civil liberties groups, and sometimes from Republicans.

Notably, when we were trying to promote deepfakes legislation, the pushback was mostly coming from Republicans saying, this sounds like you're trying to crack down on misinformation, and we don't, we, we think that misinformation isn't weird,

Renee DiResta: Right, yes, so this is, this is one of the reasons why I'm asking.

Mary Anne Franks: Yes, yes. And so when this momentum started to pick up last year, we were a bit concerned, because we weren't convinced that this was a genuine concern about the underlying issue, which is the sexual exploitation of the people being targeted, but that instead this was becoming politicized in a way that made us uneasy.

And I think that is a big part of why it got the steam that it did, which is not to say that the sponsors and the people who are supporting it and voting for it are necessarily acting in bad faith. I don't think that that's true. I think they are trying to do the right thing, but I think the fact that it suddenly became a consensus, notably last year, was a little bit strange.

And I think it's important to note, too, that this bill got into the continuing resolution and it got stripped out basically on Elon Musk's orders, right? So what has changed between December and now is also a question, and the fact that the First Lady has decided to take this on as her issue, I think, raises some questions.

So I think we do want to wonder why, after having had better versions of this bill that were much more narrow, that did not include that takedown provision—why was there never a chance, realistically, of getting that passed? And yet now that you have that combined with all of these other things that will at least potentially become a weapon to be used in a political or ideological way, why was this the vehicle that people suddenly got excited about?

Becca Branum: I think that's exactly right, and I think it's important to situate this bill in the broader tech regulation landscape and the fact that Congress has been struggling for years to come up with narrowly tailored, constitutional, and sort of constitutionally defensible ways to regulate platforms and address very real online harms.

I think what's gratifying to me about this bill is Congress is being responsive to a very real problem. The way that they've gone about it—we'll talk about this more when we get to the takedown section, because it's less relevant for the criminal provisions—but I find it very interesting that this bill got over the finish line where other ones have failed.

So, for example, in the last Congress, the bill on everyone's mind was the Kids Online Safety Act, which would require platforms to pretty fundamentally restructure the way that they operate and, from CDT's view, would have similar, although not exactly the same, effects on people's speech online.

And really one way to think of how these bills are different—there are very real harms for kids online, and NCII is obviously a very real harm for kids and adults—but the difference between the two, or at least one of them, is that the TAKE IT DOWN Act, as it relates to the takedown provisions, is probably a lot easier and cheaper to implement than the Kids Online Safety Act and others might be. And I do wonder if that's why we saw the tech industry itself throw its weight behind this one rather than some other tech regulation bills that we've seen.

And so I think this is a story both of—I think, Mary Anne, you've described this as a bittersweet moment, and that really resonated with me. It's sweet for me because it's a recognition of this as a real harm and, I think, the culmination of well over a decade of civil rights advocates trying to bring attention to this issue. And so it's a sweet moment for that to get over the finish line, but it's a little bitter in that that work is culminating and coming to fruition along with a takedown mechanism that's going to have some pretty serious implications for user speech.

Renee DiResta: And we did have one other guest who I was hoping to have on who was very supportive. And so maybe I'll play the role of advocating for it a little bit, I guess, just to get that viewpoint out. I mean, there are several hundred orgs on this list of civil society orgs that did support it, including, for example, NCMEC and some of the other child safety orgs.

I guess one of the kind of obvious questions is, are people reflexively responding to it in part because Melania Trump picked it up? We can talk about—I wanna move into the notification piece and the concerns about overenforcement and platforms in a second—but how do we respond to that concern?

Adam Conner: I'll add to that too, but I think there are kind of two points that are worth highlighting and sharing.

You know, I think one is just the kind of context in which the bill did, you know, make it through, which is obviously work that started, that really picked up steam, last Congress. And you have to remember, last Congress the Senate was controlled by Democrats, and Ted Cruz was the ranking member, now chair, of Senate Commerce, but it is something that started from a more bipartisan kind of place.

Elon Musk and Donald Trump did strip it out of the CR at the end of the year; they stripped everything out, so it was not just that they targeted this bill. I think it is probably important context to know there was a ton of stuff—pediatric cancer research and other things—in that CR. So this was a kind of consequence of that; it was not, I think, solely about taking this bill out. I think that's helpful context.

You know, I do think that helps explain on some level the ability for this bill to move quickly: it looked like it could become a law in a split Congress, and so obviously in a unified Congress, just from a pure dynamics point of view and the simple mechanics, I think that is just a piece there.

And I think the other is that advocates—and I know Becca and Mary Anne have been working on this issue for years—this is a hugely serious issue. I think bittersweet is certainly a reasonable way to describe it, but these are advocates whose stories, when you hear them, are very moving, and I think it is a very real problem, and Congress sometimes feels bad when it can't solve problems.

And I think it can be hard for them to hear constantly that every solution they have is, you know, mixed, right? And I think that can at times be a very difficult place when they're thinking about how to weigh these kinds of pros and cons.

I will say that, you know, part of the way this moved so quickly is that, to take the KOSA example again, it is narrower than KOSA in that sense, right? It is a piece of a problem, not a broad problem, which I think contributes in a less nefarious sense to some of this. Which isn't to say that there isn't a danger of it being abused, but there are, I think, benign explanations for why this became a law that we sometimes forget in a world where most things don't become laws, because Congress rarely legislates anymore. This has a series of more benign legislative factors that can also help explain it.

Mary Anne Franks: I just wanna add to that, though, for historical context: when we had multiple years where we presented SHIELD or the Intimate Privacy Protection Act—again, much narrower—the complaints about it were that this is too broad. The complaints were that, you know, you shouldn't criminalize this at all, this is going to ruin free speech. And we're talking about a provision that was just about narrowly prohibiting authentic visual depictions that were disclosed without consent.

And there was always pushback from the left and from the right, and it didn't matter that there were compelling stories. These victims have been speaking out, and it was much harder to speak out 10 years ago than it is now. Victims would speak out, and all the complaints about how, oh no, this is going to be censorship were just taken seriously by people who probably should have known better. So I don't find that to be a compelling explanation for why now, because we had all of those factors in place before, and a much better, more constitutional, much more narrow bill.

And suddenly people who had been criticizing criminalization, criticizing, oh, the chilling effects or what have you, are now supporting a bill that is much, much broader than anything we ever proposed in the past, and that has a takedown provision with direct and serious implications for the way that tech platforms operate and that is so prone to arbitrary enforcement—I find that very suspicious.

Renee DiResta: Let's talk about the enforcement provisions. Let's go into helping the listener understand what the enforcement provisions are. So we have a requirement to remove within 48 hours upon notification by the victim or a representative of the person depicted, I believe—I hope I got that language correct. Becca, do you wanna describe a little bit about the provision?

Becca Branum: Sure. So, within a year of the president signing the bill covered platforms—which is a definition we should chat a little bit about—but covered platforms will be required to set up a notice and take down system where either victims themselves will be able to submit a complaint to a platform, or people, third parties acting on behalf of victims will be able to submit a complaint, requesting that images and any known copies be taken down within 48 hours.

You talked a little bit about hoping to have some advocates for these provisions on the show, and I'm going to step into that role, because the notice and takedown provision itself is something that, you know, CDT—and I, just on a personal level—really wanted to get to a place where we could actively support, because it's such an important tool for users to have. A lot of platforms have this right now, where you can request the removal.

We also know that platforms don't always respond in a timely manner; there's not always great transparency into whether and when they're going to take things down. And so the idea of empowering people right then and there, when they find an image of themselves or someone lets them know, to just say, hey, no, this is me, I'm taking back control over my image and likeness, and I want this taken down now—and having platforms have to respond to that—is extraordinarily powerful.

From CDT's view, the problem comes—it gets tricky—as it relates to the definitions and the ambiguity. And I think the best way to think about it is, really, I don't have any concerns with NCII coming down, right? None whatsoever. If the provision operates as intended, where it applies to non-consensual imagery and that imagery has to come down, I think most people would agree that's a really good thing and an important tool for people to have.

It's really the effects on everything that's not non-consensual imagery that CDT has concerns with, and unfortunately, the way the bill's drafted, it's ripe for sweeping in a lot of speech that the authors really didn't intend to sweep in. And I think it's gonna have some pretty negative effects for users.

Adam Conner: I think just one small point there—and Becca, feel free to correct me if I'm wrong—but one of the things that is interesting about this bill, just in the broad context, is that obviously a lot of rhetoric, particularly from Republicans in Congress, is very anti-big tech right now, and kind of understandably.

What is interesting about this bill is it is very broad, right? It does not just target, you know, large social media platforms. It targets basically the whole internet with a few exceptions, which I think Becca and others will talk about in a second.

But, you know, it is not that this is not a problem on major social media platforms; it is generally a problem they have some existing mechanisms for. So I think it was interesting to just kind of note that targeting big tech was not necessarily the primary driving motivator, although certainly many of them are regulated by it and many ended up supporting the bill. But I think it is interesting, and it's why I think there are speech concerns, right? It is much broader, you know, kind of sweeping in all parts of the internet with some limited exceptions.

Renee DiResta: So, just to note and highlight what you just said there: several of the big tech platforms did support it.

Also, I think the comparison that a lot of people make is to the Digital Millennium Copyright Act notice and take down system, which has existed for a long time. They do comply.

There's a lot of concern about overcompliance—about, again, overuse of the DMCA to take down speech that should not be taken down, sort of the abuse that goes into the DMCA system. Again, I'm not a lawyer nor a DMCA compliance officer—Adam, maybe you wanna weigh in on this, you were at a platform—but there's the extent to which entities will file, and companies will default to taking content down immediately to avoid being held liable, and then sometimes things will go back up if the noticed party files a complaint or a response. But there is this ability to abuse the DMCA to take down speech and content that you do not like. So that is one of the things that has been raised as an objection to this particular notice and takedown provision.

Adam Conner: Yeah, I'll just say, when we were setting up content moderation at Facebook, there was like a bucket of all these really hard problems, and then, like, we hired somebody to deal with DMCA 'cause it's the law. And I think, you know, it's a job a lot of us did want, and technology has obviously played a bigger role in that.

I would say kind of two things—I'm not a DMCA expert. I think there's a wide body of evidence on both its shortcomings and also on whether it's better to have something than nothing, but I will defer to others on that.

I think that, kind of in this context for this bill, the critical aspect as it relates to implementation is that there are no appeal processes or any sort of ability to contest this. I think that is part of the concern—and again, Mary Anne and Becca, correct me if I'm wrong—it's not necessarily a constitutional concern, although maybe that's an additional one; it is more of an implementation, expression-of-speech concern, particularly if your content is swept in, you know, incorrectly.

Mary Anne Franks: I'd wanna add to that—you know, I'm not a DMCA expert either, but the things that I do know about the DMCA process is that you have to attest on penalty of perjury that what you are alleging–

Renee DiResta: Right.

Mary Anne Franks: –you have a good faith belief that that is true. Now, there's a good faith requirement that's mentioned in TAKE IT DOWN, but there's no perjury attestation. And the DMCA says that you will be subject to liability if you are making knowingly false statements about whether, in fact, you are entitled or authorized to be the person asking for this removal, or if you are knowingly misleading, providing misinformation about whether this is in fact violative, right?

Nothing like that is in the provision here. So literally any kind of complaint that alleges a non-consensual visual depiction supposedly would have to be investigated fully, which—you know, one of the problems we have with this is that it seems like it would just unleash a torrent of bad faith complaints.

So, as Becca was saying, if we were actually talking about this bill doing what it thinks it's doing—that this is just going to be a laser-targeted focus on the actual NCII and it's gonna come straight down—that would be amazing, right? But I don't see how you get that from this. And that's not only bad because of what it sweeps in. If that is what platforms are having to deal with on a daily basis, are they ever going to get to the actual NCII complaints, and how are they supposed to sift through the bad faith complaints versus the ones that are genuine? So I'm worried about it actually being counterproductive, not just overly broad.

And then the other thing that I think is worth pointing out is that the definition of what is supposed to come down does not match the criminal definition of what is unlawful, and that seems like a really big problem, right? So you have this incredibly narrow criminal definition—rightly so—really based on the statutory language we've been promulgating for some time to comport with the First Amendment. And then in the takedown provision, you just have this general term, non-consensual visual depiction, with none of the exceptions, none of the restrictions—it just basically says something that is sexually explicit and allegedly non-consensual.

And, you know, that is both really, really broad but also underinclusive, as I've mentioned in some of the analyses that I've done. It's interesting: the bill says this is a deepfakes bill, or it's gonna take down deepfakes, but the term intimate visual depiction comes from an existing civil federal statute, and that statute refers to authentic images. And so inadvertently, I assume, this bill actually wouldn't apply to deepfakes—that's one, so that seems like a really big problem for underinclusivity—and it does apply to all kinds of non-consensual visual depictions that are not, in fact, what we would define as NCII according to the rest of the criminal statute. So those seem like really, really big problems.

Becca Branum: That all sounds right to me. And part of what we think about here at CDT is the reality of how these bills get implemented, and so we look at the statutory text and agree entirely with what has been mentioned about the shortcomings and ambiguities there.

But also, particularly given the statute's requirement that these images come down within 48 hours—there are different approaches to intermediary liability, but when things like this are implemented, there's nothing in the bill that requires platforms to implement these provisions competently, right? We hope they will; we hope they will take care to investigate and ensure that things that are submitted are actually NCII, or even actually include nude imagery, right, and aren't just sort of unflattering pictures I happen to find of myself on the internet. We hope they will do that, but there's no requirement that the platforms do that. And so the law is lacking in that way.

And also, just thinking about sort of the economic incentives that platforms have: if the choice is between, you know, risking FTC enforcement or just taking down a piece of content that a user posts, that individual piece of content doesn't really make a platform all that much money, right? Every individual piece of content isn't particularly important to platforms, even if they do want to facilitate speech. And so when you implement these broad liability requirements and a really strict deadline, given tough choices and resource constraints—or choices to create constraints on resources and content moderation—we can expect that it might not be implemented in the best way possible, which will really exacerbate the ambiguities in the law.

Adam Conner: And I think just two small points. One, to Mary Anne's original point, right, there's no perjury component or anything there. And my understanding, at least from having worked on this—and I think others might have a better sense here—is that that is the kind of barrier to reporting NCII or other kinds of horrifying or embarrassing imagery. That's often a kind of balancing act, I know, for platforms and others as they try to understand that, and I think that is different, you know, if you lower the barriers when you're doing it voluntarily versus having to have something in law.

But I do think—please correct me if I'm wrong—that's a balancing equation, where you can see, if you read the bill text, right, that the requirements for identifying it are fairly minimal and, you know, likely presented maybe with checkboxes, right: you gotta attest to it, you've gotta identify the piece of content, which is obviously necessary, and a brief statement about it, but it doesn't, you know, have any kind of strong additional details there.

I mean, it'll be interesting to see in the implementation here—and it is not required in any way, shape, or form—if the FTC will give any sort of guidance. It's certainly not required in the bill, but it's certainly something they could do, and one hopes they might consider aspects of that as they implement it, but I think that's part of it.

And then the other thing I'll say, to Becca's point, as she pointed out, about the potential abuses, right—there's kind of a melding of potential abuses as I understand the concerns, and it is worth saying that they kind of come together, but they're also kind of separate: will people act in bad faith and just lie to the form? Will platforms check it? Will platforms bother to check which reports are real versus bad faith—will they even check the content involved? And then will somebody flagrantly abuse it, as I think has been suggested?

They kind of all meld together, but they are different, separate risk vectors in terms of implementation. And you could see, for instance, some platforms saying, we're small, right, and we're just gonna take everything down, whereas larger platforms might, you know, use existing systems. And I think what's probably most likely true here is that the kind of content that is most vulnerable first is content that would be adjacent to NCII, right? So this would be voluntary intimate imagery or something like that, or pornography, or things that are legal, things like that. And so you could also imagine a company specializing in those things maybe putting more effort into it than other platforms. As I mentioned, the benefit and the downside of this bill is its broad applicability, and so I think that is obviously why Becca and others have concerns.

Renee DiResta: Well, I think for a brief shining moment in time there was some transparency around DMCA takedowns also, right? There was the Lumen database, so outsiders could go and see to what extent abuse of the DMCA was happening. Remember, that was actually how people saw that X was taking down content in response to requests from the government of India, and then, as soon as that article was written, X stopped contributing to the Lumen database.

But I know that there have been some concerns about FTC enforcement, as the body has felt a bit more political with the firings of some of the Democratic-appointed FTC commissioners. On that question of who is the best entity to handle enforcement of platform compliance, I'm curious what you all think about that decision—whether it would've been better to have some sort of specialized privacy agency or, for the child safety component, maybe a child safety agency or some other entity handle that piece of it.

Mary Anne Franks: It's such a tough one because it's hard to know who, which entity would actually have the competence–

Renee DiResta: Right.

Mary Anne Franks: –for this, especially if we're talking about the FTC as it is. You know, there are two problems, I think, in this current moment that we would need to worry about in terms of the FTC. One is overpoliticization and weaponization for political purposes; the other is stripping away any kind of budget or resources to do actual good work.

And so there's an ineffectual kind of concern on the one hand, and then there's an overly effective but really badly politicized kind of concern that I would have—you know, not only with the purported firings of the Democratic commissioners, but also that you have the head of the FTC openly saying, I want to go after tech companies that Trump doesn't like, right? I mean, that's effectively what he's saying.

And that big tech, little tech divide is, is a concerning one for me because so many of the platforms that are the worst offenders when it comes to distribution networks of intimate imagery are little tech, right? So this kind of rhetoric about how we need to go after the big shots or what have you is, is very distracting.

And on top of all this, you have this complete entanglement that Trump has given us with the appointment of Elon Musk in whatever weird role he's in. Realistically, if you're X, do you think that you are worried about the FTC coming after you, given the fact that Elon Musk is Trump's right-hand man at the moment, right? Who actually has to worry about being investigated by the FTC? And I think that the implications of this are pretty unnerving.

One of the additional reasons why I'm worried about this is this very odd provision that says—you know, normally the FTC, when it does its unfair and deceptive practices kinds of investigations and exercises its powers, is limited to investigating commercial entities that are operating for profit. That makes sense. It's the Federal Trade Commission; that's kind of their job. But this provision in TAKE IT DOWN says, oh, we are not going to adhere to that limitation; we can go after nonprofits as well.

And I just have to wonder what that is actually about, because, again, we're a victim- and survivor-centered organization. We are very familiar with the landscape of who the worst offenders and distributors are. I can't think of a nonprofit that is on that list. And so this seems like a very odd place to do that kind of expansion of jurisdiction.

Renee DiResta: How should we think about the enforcement piece in particular? This is one area where we should see, I would think, some improvement on the false positives front. Platforms love to tout their sophisticated AI enforcement on this front.

Adam, I don't know if you wanna weigh in on this one: do we think that the 48-hour deadline is going to lead to the proliferation of false positives that people are concerned about? Do we think that it will be better, you know, because they support it? I am curious where you think we're gonna come down on that.

Adam Conner: Yeah, so I think two things. One, just to finish off Mary Anne's point about the FTC and our, you know, certainly very significant concern with the firing, or attempted firing, of commissioners not only at the FTC but in other places, which implicates all sorts of aspects.

I think it is really important to note that a critical part of—and there are very few successful tech bills that have made it into law—but a critical part of the congressional tech policy landscape has been a relatively safe bipartisan agreement point that you could do enforcement through the FTC for a variety of things. And that's just because, as an independent agency, in theory, with bipartisan commissioners, it was a place that kind of wasn't the fight.

And I think, unfortunately, the longer-term impact of these illegal firings of independent commissioners by President Trump is that it's really gonna damage that aspect—a place that could enforce privacy laws, could enforce, you know, various other laws that we have. There's any number of bills out there, some of which these groups would support, some of which they would oppose, but, you know, people kind of just looked at it as a relatively safe and competent place to administer these things, and that's a real lasting damage.

This may be the last time we see significant bipartisan support for a bill that adds FTC enforcement powers, just because of the politicization of the agency and of independent agencies writ large. And obviously there are the kinds of concerns raised by Chair Ferguson, you know, agreeing with the president, as a condition of his employment, that this is no longer an independent agency and he will not operate it like an independent agency. And his remarks the other day, that they will, you know, sternly continue to prosecute the Facebook case up until the moment Donald Trump tells him to stop, are, I think, a good indication.

I think more broadly, too, as you get into the kind of questions of overenforcement—I think there are significant concerns about abuse there and other pieces—it also puts the FTC and platforms in a weird place because of this rhetoric. Because obviously we have seen from Chair Carr, and we have seen from the president, both in his first term with his EO, you know, attacking speech, and in his current kind of embodiments of attacking the First Amendment, the concern that they will weaponize this to take down speech they don't like, right?

But that is kind of difficult as you write broader rules, and you can imagine in this case, right, people who might weaponize TAKE IT DOWN—as folks are worried about the flood of, you know, false reports. Those aren't gonna be limited to one side of the ideological spectrum if it ends up targeting content that is not just, you know, intimate visual depictions.

And you get to a weird place where, if you start to be overly broad in taking down all content and then, oh, a lot of that content is pro-Trump content, for instance, you know, that's a vulnerability now for platforms—that's something the FTC and Trump might be mad about. And so you also get into this weird place where platforms, by the law, might not have to care too much about the content, but they might have to start thinking about sorting through it more closely because they don't want to be yelled at for taking down too much Trump content. And whether they will tilt that one way or another is obviously the question. But it adds an extra layer of complication that you don't find in the law but that you see in the operating reality platforms are working in, in the Trump era.

And I think, as it gets implemented, you know, the companies that are most positioned to comply with this law—I think this is why you see broad social media platforms that have these existing systems feeling relatively confident about it. A lot of them—Facebook is a good example, right—don't allow intimate imagery under their rules anyway, so it's kind of coming down regardless of, you know, this law or not. I think it's, again, a broader question for the 48-hour requirement with smaller platforms, or platforms that might not have the resources or care; that is where, you know, you will see some really interesting questions.

Renee DiResta: I guess what, we have a few minutes left. I wanna get at the question of is this overall net beneficial?

It sounds like there's been fairly widespread agreement that the victim-centered criminal provisions are on fairly firm ground, that they fill a pretty clear, genuine federal gap. The notice and takedown provisions are where that potential chilling effect, or overenforcement, or First Amendment litigation is possibly gonna land—maybe where we'll see some litigation, rulemaking, court challenges, though not so dire that the whole law collapses. I'm curious what y'all think about that.

I've been curious about the balance between like the sort of slippery slope concerns I've seen expressed from civil libertarians and some tech policy folks versus, again, the overwhelming support and the real harms that this is attempting to backstop, and, you know, my, my personal feeling, my personal bias, that it is overall net beneficial. And so I'm curious, you know, where you all are coming down on that.

Becca Branum: So this bill's a bit of a heartbreaker for me, because I really was excited to have the opportunity for Congress to weigh in on this issue, which is very worthy of attention and response, and to empower people with a tool that they can use to try and stop the proliferation of those images online.

I think, for me, what I'm turning to now is trying, to the extent possible, to make this a net benefit, right—working with platforms to minimize opportunities for abuse, working to advocate for interpretations of the law that will be user protective and privacy protective.

In its current form, obviously, as one of those civil libertarians myself, I have a tendency to focus on some of the, the problems, but there, there are benefits to this bill that I want to see accrue to, to users and to, to victims themselves. And I think there is a world in which Congress could have passed a constitutional and privacy protective take down mechanism to help victims.

And I am really looking forward to helping the FTC and the courts interpret this bill in a way that is consistent with that vision because I think on net—stepping back from this bill—it is absolutely a net positive for Congress to be addressing these issues and for victims to be able to, to respond and for us to also commit to reducing the prevalence of image-based sexual abuse. And so I, I am excited to make it a net benefit, even if in its current form it might not be.

Adam Conner: Real quick, Mary Anne, could you also—'cause I think we skipped over this—could you just highlight what is the unconstitutional aspect you think is strongest in these cases? 'Cause it's not quite the implementation of the enforcement, it's the broader concept, right, of the takedown provision?

Mary Anne Franks: I mean, I think, yes, there are straightforward overbreadth problems under the First Amendment for the takedown provision, because it's incredibly broad, it's vague. It's, as I mentioned, over- and underinclusive. It's difficult for any entity to know how to conform its behavior with this; the dictates are just not clear. So I think there is a very key overbreadth challenge—an underinclusiveness challenge is maybe weaker, but the overbreadth for sure, the vagueness. And so I do think, just sort of straightforwardly, the takedown provision raises some very serious constitutional problems.

Renee DiResta: Do you think that the platforms supported it assuming that court challenges would significantly narrow it, and that this was more of an optics-driven support?

Mary Anne Franks: I do wonder a little bit about that. I wonder—I think what we're going to see fairly soon, I mean, for instance, I could imagine the bill gets signed and you immediately see a coalition of tech groups asking for an injunction against the takedown provision, and I think that would be a good thing.

I actually hope that that is what happens—whether that's why they did it, or whether they're just living in the reality that not only is this what the Trump administration kind of dictates, but it's also one where, you know, from every other side, people have been saying for a long time, why aren't you doing something about this issue, right? Who wants to be the platform that now says, in this environment, that they don't care about revenge porn or deepfakes? Right? Five years ago they didn't care, and 10 years ago they definitely didn't care. Now, I guess they do, at least up to a point.

And now there's political pressure, and this is one win where they can say, well, sure, this is one thing that we should have been doing probably anyway, and the fact that the Trump administration is claiming that it cares about it is a great win—everybody's happy. Maybe that's why they did it, or they're just confident, right, as I think many companies would be, that it's not going to hit them, because they know that as long as they keep Trump happy, they don't have to worry about getting any kind of enforcement action against them.

Renee DiResta: I guess let's go with one final question. In a few years, do you think this will have evolved into a model for a, you know, really serious global NCII framework, or a cautionary tale of overcensorship, or just something that really has no impact?

Mary Anne Franks: Small question. I mean, I, I guess I wanna link–

Renee DiResta: Lightning round.

Mary Anne Franks: I wanna link this to your previous question about, you know, whether this is overall a net benefit, because I think they're related questions, right? What is going to be the lasting impact of this bill?

And, you know, when I wrote that this was not just bittersweet, I characterized the takedown provision as a poison pill. It's actually because I support the criminal provision so much—I mean, obviously we do, because we have literally been calling for it for over a decade—but for it to be joined with what I think is a very damaging provision, not only unconstitutional but actually counterproductive for victims.

The thing that I worry about the most—I don't even know how to express my despair or my disappointment at the idea. What victims have been advocating for and sacrificing for, including the founder of my organization, Holly Jacobs, is for this issue to be taken seriously, because it's not a political issue. It is just about individual privacy, dignity, anti-exploitation. And the idea that it could be harnessed to this kind of clearly, to me, politicized and I think bad faith provision—or at least one that I'm fairly certain is going to be used that way—that you're going to confuse this issue, that you are going to link these two things; that there are at least some actors, I think, who are using the cover of, we care about this issue, so that they can achieve their political ends. That is the thing that makes me extremely upset on behalf of the survivors who have really fought for this, for the criminalization.

So for me, the criminal provisions—if we could fix that terrible loophole about the exception for the person who appears in the image disclosing it—those were the provisions we've been asking for; those are good provisions. One thing that has happened in the course of these last few years is people have come to consensus on that, which I am still surprised by, because it never seemed to be true in previous years. If we could come to consensus around this, fix the problems in it that are fairly easy to fix, and just get rid of the takedown provision—I am much more supportive of the idea of real Section 230 reform, of real structural changes to how we try to impose accountability on the tech industry, rather than trying to do this: how many hours is it, which kind of image is it, who has to sign off on it—which I think is kind of an exercise in futility as much as it makes promises to victims that they're going to be taken care of.

That's the other piece of it: I really worry that we're giving false hope to victims. This isn't a process that we know how to make work yet. And the idea that you're gonna tell victims this is gonna fix all of your issues—I think one of the things we're inevitably going to see in the next few years, when this law goes into effect, is that actually it doesn't fix these things. And so now what?

Becca Branum: I think that's right. And something to keep an eye on moving forward—I can't predict who might end up challenging the bill—but to the extent it survives intact, something I'm concerned about and think is worth paying attention to is this model getting exported to other kinds of content where there might not be as much unanimous agreement that it deserves to be taken down, and sort of replicating this across wider swaths of content that are even harder than NCII to identify and where the equities aren't as clear about it needing to come down at the request of the people depicted.

And so that's worth considering, because, again, back to the broader tech regulation landscape: everyone wants there to be reform, whether they're on the left or the right, but on the types of content they're concerned about and the ways they wanna go about it, they've yet to reach agreement, right? And what people define as harm and what people define as harmful content really depends on their worldview. And I do worry about this model getting extended beyond NCII to areas where it will be even harder, and more likely, for lawful content to be censored through government action.

Adam Conner: And I'll just say two things. One—and I understand why this is so bittersweet for Mary Anne—you know, I think one of the reasons why this is here now is because of the effect of advocacy and also the proliferation, right? The thing we have heard over and over again is that people understand, with AI now, that this is something they're seeing a lot more of, not just on the real side but on the digital forgery side; they're seeing stories in their schools. And so I think it's obviously your effective advocacy, even if the solution here is not necessarily all of the one you might want.

But I think that to ask why now—I think it is kind of hitting a point where it is now no longer something that people kind of look at distantly, but I think increasingly understand just as a broad public, that this is a real thing and it really happens and, and it may need to be dealt with, which I think in part explains some of the congressional motivation.

You know, I think I would say two things about this. My general sense here is that this law, and the takedown provision, if it survives, will help take down NCII, and it will help take down digital forgeries of intimate imagery. I think it is very clear that that is true—that the kind of little tech places Mary Anne Franks mentioned that help proliferate this aren't necessarily the big platforms—and I think that is a net good.

Whether or not all the other kinds of content that could be affected by it are worth it is, I think, really what we will look back and ask about in the future, and see if it can change. But I do think it is worth saying that it will help some of that come down, and so there is some good in here, which I think is why people have mixed feelings about it.

I think the other thing I would just say is: why was Congress willing to do this? Obviously, you know, the piece there was, one, we've heard a lot—and I know from my own time—that people who are victims of NCII don't all want to take criminal action or, you know, action in the courts. Sometimes they just want it down, and they want it to go away. And whether or not that's possible, or even constitutional, I think that kind of speaks to the reason why this provision was in the bill.

And it speaks to the reason, quite candidly, why I think members of Congress were willing to roll the dice on maybe something that's not as constitutional, because it feels so powerless to say, we can't tell them to take it down. Now, obviously there are legal issues and all sorts of abuse issues, but I think, if you look at it fundamentally, that's how it happened.

And I think, unfortunately, plenty of members of Congress, particularly female ones, have experienced this kind of assault and abuse, and I think that helped drive a very human reaction to it. And obviously now it's up to the courts, the implementation, and the platforms, but I think that's why it will have a legacy. What survives, you know, the courts and the implementation will determine what that legacy is.

Renee DiResta: Thanks so much, everybody, for the really thoughtful and nuanced discussion here. I think it's a really important moment. And I guess now we see if it gets signed before this comes out.

But no, I, I really appreciate all of your expertise and I hope that this really kind of helps explain to the audience some of the questions that I had. I really benefited from, from hearing all the nuance that y'all brought to this conversation. So thank you so much for sharing your expertise with us today.

Mary Anne Franks: Thank you.

Renee DiResta: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter at our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters.

Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Allies, The Aftermath, and Escalation, our latest Lawfare Presents podcast series about the war in Ukraine. Check out our written work at lawfaremedia.org.

This podcast is edited by Jen Patja, and our audio engineer this episode was Hazel Hoffman of Goat Rodeo. Our theme song is from Alibi Music. As always, thank you for listening.


Renée DiResta is an Associate Research Professor at the McCourt School of Public Policy at Georgetown. She is a contributing editor at Lawfare.
Dr. Mary Anne Franks is the Eugene L. and Barbara A. Bernard Professor in Intellectual Property, Technology, and Civil Rights Law at the George Washington University Law School. She is the President and Legislative and Tech Policy Director of the Cyber Civil Rights Initiative, a nonprofit organization dedicated to combating online discrimination and abuse, and an Affiliate Fellow of the Yale Law School Information Society Project. She is the author of The Cult of the Constitution: Our Deadly Devotion to Guns and Free Speech (Stanford 2019).
Becca Branum is the Deputy Director of the Free Expression Project at the Center for Democracy & Technology.
Adam Conner is the vice president of Technology Policy at the Center for American Progress.
Jen Patja is the editor and producer of the Lawfare Podcast and Rational Security. She currently serves as the Co-Executive Director of Virginia Civics, a nonprofit organization that empowers the next generation of leaders in Virginia by promoting constitutional literacy, critical thinking, and civic engagement. She is the former Deputy Director of the Robert H. Smith Center for the Constitution at James Madison's Montpelier and has been a freelance editor for over 20 years.
