Lawfare Daily: TikTok Ban at the Supreme Court

Published by The Lawfare Institute in Cooperation With Brookings
In a live conversation on January 10, Lawfare Tarbell Fellow in Artificial Intelligence Kevin Frazier talked to Lawfare Senior Editor Alan Rozenshtein and Senior Staff Attorney at the Knight Institute Ramya Krishnan about the Supreme Court oral arguments over the legislation passed by Congress that bans TikTok unless its parent company ByteDance divests from the app, the arguments made by the different sides, and their predictions about how the Court might rule.
To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/
Click the button below to view a transcript of this podcast. Please note that the transcript was auto-generated and may contain errors.
Transcript
[Intro]
Ramya Krishnan: You could imagine a world in which the Court says that TikTok loses, but its creators win because it's more convinced that the creators have a First Amendment interest at stake and that there are, you know, at least doubts over key questions when it comes to TikTok U.S.
Kevin Frazier: It's the Lawfare Podcast. I'm Kevin Frazier, Lawfare Tarbell Fellow in Artificial Intelligence, with Lawfare Senior Editor Alan Rozenshtein and Senior Staff Attorney at the Knight Institute Ramya Krishnan.
Alan Rozenshtein: An injunction is only supposed to issue if the Court, certainly when you have a duly enacted act of Congress, if the Court believes that there is a likelihood, I think maybe even a high likelihood, that the requester will win on the merits, which I don't think anyone thinks. So I think an injunction is just not on the table.
Kevin Frazier: In a live recording on January 10, we discussed the Supreme Court oral arguments over the constitutionality of the TikTok divest or ban legislation, the arguments made in front of the Court, and how the Court may rule.
[Main podcast]
So, let's get into the weeds a little bit here. I guess we're all content manipulators now. We saw that thrown out there. Whose speech is it anyways? Does anyone have eyes on Jeff Bezos’s family members? I'm concerned for their health and wellbeing. All that and more, we have a lot to discuss.
So, let's kick it to Alan first. Alan, for the folks who haven't been in the weeds of this case, you're kind of Mr. TikTok now. Can you quickly summarize the provisions of the TikTok bill and perhaps give us a sense of what the government's claimed motivations were for passing the bill?
Alan Rozenshtein: Sure. I do think that Mr. TikTok is maybe the meanest thing you've ever called me.
Kevin Frazier: Oh, well, clearly you don't know what I've said about you, but we'll address that.
Alan Rozenshtein: Meanest thing I've ever heard you call me.
So earlier last year, Congress passed the, and I'm going to try to get the name of the bill right, the Protecting Americans from Foreign Adversary Controlled Applications Act, or PAFACA–real disappointment that they didn't do their usual backronym–in which Congress by name singled out TikTok and its Chinese parent company ByteDance. And it stated that unless ByteDance divested its ownership stake in the company (bringing any foreign adversary ownership below the law's 20 percent threshold) within 270 days of the enactment of the law, which is January 19–the day before Donald Trump is inaugurated for a second term–TikTok would be banned in the United States.
And what a ban specifically is in this case is that app stores like Apple and Google are no longer permitted to distribute the app. Cloud service providers like Oracle, which is TikTok's main cloud provider in the United States, are no longer permitted to host TikTok. And domain name services may also–there's a little bit of uncertainty under the law–may also be forbidden from essentially mapping tiktok.com to the IP address, which would be a problem if you're trying to access TikTok through your web browser.
The law does not restrict individuals' access to TikTok, and it does not actually require internet service providers to restrict access, but it would be a pretty steep barrier to the vast majority of TikTok's 170 million American users. And the law further imposes pretty steep penalties on the affected companies, so $5,000 per user in the case of, you know, the app stores and the cloud service providers.
So that's what the law does. There's actually a second portion of the law, which is quite interesting, but outside the scope of this particular litigation, which would allow the president to impose similar divestment-or-ban requirements on companies that are controlled by entities from adversary nations, defined as China, Russia, North Korea, or Iran. But that's a topic for another day.
So Congress passed this law. TikTok immediately sued in the D.C. Circuit, which is an odd place to initially sue. But the law requires all litigation to occur in the first instance in the D.C. Circuit. The D.C. Circuit heard oral argument on this, I believe, in September.
That oral argument went quite poorly for TikTok. We had a conversation about this after that happened. And then in December last month, the D.C. Circuit unanimously 3-0 ruled against TikTok. TikTok then appealed to the Supreme Court asking for injunctive relief. The Supreme Court, instead of granting that relief basically treated it like a cert petition, accepted that cert petition, and then very quickly within weeks, basically scheduled oral argument. Oral argument happened earlier today.
The other procedural thing that happened in the meantime is that President-elect Trump submitted a friend of the court brief asking the Supreme Court to pause the law. It's not clear if it was an injunction or an administrative stay. It's not clear exactly what Trump was asking for–basically to pause the law for some indeterminate amount of time to give the incoming president a chance to exercise his famous deal making abilities and try to put a divestment in place.
So we heard oral argument today, and I'll just quickly say what the government's main arguments were and then we'll, we'll jump to, I think Ramya, for the other side of this.
But throughout this argument, Congress, and then the executive branch in defending this law, have pointed to two main justifications for this law. One is a data privacy justification. And the concern here is that TikTok–like all social media companies, to be fair–collects massive amounts of data on Americans.
But unlike most social media companies, TikTok is owned by ByteDance, and ByteDance is in principle controlled by the Chinese government, or can be controlled by the Chinese government, because it's a Chinese company. And that presents, according to the government, a really profound data security threat, or intelligence threat and espionage threat.
And then the second concern that the government put forward is a content manipulation concern. This is the idea that again, because TikTok is owned by ByteDance, and ByteDance in principle is ultimately controllable by the Chinese government and that TikTok uses the ByteDance algorithm, the Chinese government could manipulate that algorithm, potentially covertly, in a way that would advance Chinese interests.
As this law was going through Congress, there were some questions about whether this was just a general content manipulation concern. There was also a question, at least from some members of Congress, about whether or not TikTok was already doing this with respect to speech that those Congress people did not like, you know, pro-Palestine speech or anti-America speech, which obviously raises First Amendment concerns.
But basically this, this duo of concerns, data privacy and content manipulation, were what Congress cited in justifying the law. And the D.C. Circuit upheld the law on both of those grounds. And we didn't really see I think any new arguments from the government in part because the case has been, I think, pretty well ventilated. And also because the government got basically two weeks to put this case together for the Supreme Court. So actually no one had any time to really come up with new arguments.
Kevin Frazier: And Alan, before we dive into the other side, just highlighting for the listeners the D.C. Circuit opinion: what the sort of grounding was there, especially with respect to national security and perhaps the prioritization of those interests over First Amendment interests.
And also, if you could give us a little bit of insight, as you've written about, for Lawfare – yes, shameless plug for you. Can you tell us about the weird quirk about some of these national security interests being redacted or some of the information about that being redacted and how that played a part in argument today?
Alan Rozenshtein: Sure, sure. So the D.C. Circuit ruled 3-0 against TikTok, but there were actually two opinions. One written by senior Judge Ginsburg and joined by Judge Rao. And then another concurring opinion, I think concurring in the judgment written by Chief Judge Srinivasan.
So this case, as many First Amendment cases do, hinges, at least in theory, on the question of what tier of scrutiny you apply: no scrutiny, intermediate scrutiny, strict scrutiny, super duper strict scrutiny. There are kind of a million versions of this. And this is actually a very hard case to figure out exactly which tier to apply.
And so Judge Ginsburg, his opinion was, we're just going to assume that strict scrutiny might apply, but even under strict scrutiny, we think that the national security rationale is really strong. And not only that, but this law actually, in a sense, advances First Amendment interests because part of the First Amendment is not just having access to a platform, but making sure that that platform is not being covertly manipulated. That's what Judge Ginsburg said.
Judge Srinivasan, Chief Judge Srinivasan, said, actually, I think we should take a position on what level of scrutiny should apply. It should be intermediate scrutiny, which actually, in some sense, is even worse for TikTok than what Judge Ginsburg said. And under intermediate scrutiny, the law easily passes.
Now, it's true that when Congress was debating this law, it got classified briefings from the intelligence community. And then when the Justice Department was defending this case in the D.C. Circuit, and then I guess in the Supreme Court as well, it included some classified material that only the judges were able to see, and obviously not the parties–not TikTok and not the public. We don't know obviously what was in those briefings.
TikTok was extremely unhappy that it had to deal with secret evidence, which is understandable. But interestingly, the D.C. Circuit, in its opinion, expressly said, we are not relying on any of this secret evidence, so we're doing it entirely on the public record. So I think that's probably why we didn't hear a lot of that today in the Supreme Court.
I assume that the Supreme Court, when it got the D.C. Circuit's materials, also got to read the classified record. But that didn't really come up except when Justice Gorsuch got extremely grumpy, I think understandably so, and kind of subtweeted, or actually really just threw Congress under the bus, saying, Congress, can you please pass a law that creates an avenue for parties to be able to get classified evidence in these cases, or at least to be able to have some sort of CIPA-like access to it.
Again, I don't think this is going to actually, in the end, make any difference in the disposition of the case, except that there will be a very grumpy – understandably so, I will enjoy reading that–grumpy opinion from Justice Gorsuch saying, come on, get it together, Congress. This is not, this is no way to run a hot dog stand.
Kevin Frazier: Not a way to run a hot dog stand. Okay, we've heard from Alan. Ramya, it sounds like TikTok was facing somewhat of an uphill battle here. Obviously the loss at the D.C. Circuit, maybe this general notion of national security, kind of having a pervasive gloss over how the justices were thinking about this issue.
So, what did TikTok USA try to argue? And if you could give us a sense in particular, what your read of the justices was with respect to this question of whose speech is it anyways, right? Who's speaking? What speech interests are implicated here? What level of scrutiny should be applied?
Ramya Krishnan: Sure. Yeah. So, I mean, I think TikTok faced an uphill battle before the Supreme Court, much as they did before the D.C. Circuit.
And of course, we had two advocates on TikTok's side. We had Noel Francisco, who was arguing for TikTok itself, and then we had Jeff Fisher, who was arguing for TikTok's creators. And they were invoking different First Amendment interests, but I think ultimately there was a lot of overlap in their First Amendment arguments.
And, you know, both of them argued, and unsurprisingly I agree with this, that this was an easy First Amendment case. So on the one hand, you had TikTok make the argument that it's a U.S. subsidiary, and so it has the same First Amendment rights as any U.S.-owned platform. And so while ByteDance, which of course is, you know, headquartered in China and is a foreign-owned company, may lack First Amendment rights, TikTok itself has First Amendment rights in the curation of its platform.
The TikTok creators, on the other hand, obviously asserted their own First Amendment interests in being able to speak on the platform, in being able to hear ideas, content, speech from other TikTok users, U.S. and abroad, and in being able to work with their, you know–this was language that Jeff Fisher used again and again–their editor of choice which is TikTok itself with input from ByteDance.
So they both argued that they had a pretty strong claim to the First Amendment in this case. And so the question then was what level of scrutiny applies? And they argued that strict scrutiny applies, that this is a clearly content-based law. It's content based because it singles out a subset of companies here and, you know, notably, they said, excludes, for example, platforms whose primary purpose is to host reviews.
But it's also content based in purpose. And here they hinged a lot on, you know, the content manipulation rationale that the government was advancing. So there was a lot of talk about to what extent the government's rationale is about covert content manipulation versus content manipulation itself.
That said, both advocates argued that the sort of invocation of covert was really like a post hoc kind of rationalization and was weak, ultimately, because if the government's interest is in protecting against covert content manipulation–that is, China having potential secret influence over the way in which TikTok is moderated–then the answer to that is disclosure. It's making clear to the American public that China is already, or may influence in the future, the way in which TikTok or content on TikTok is moderated.
And then when it comes to content manipulation itself, which they said was the true rationale behind this ban, they both argued, and I think quite rightly, that this is an illicit purpose under the First Amendment. And I think Jeff Fisher said this at one point: ideas are not a national security threat. The First Amendment is built around an anti-paternalism principle, which says that the answer to dangerous ideas or dangerous speech is not suppression of those ideas and speech, but trusting people to come to their own conclusions and ultimately trusting in the marketplace of ideas, because any other answer would be worse than that.
So that was ultimately their argument. And on data security, data protection–which, as Alan noted, was the other rationale that the government invoked to shore up this law–you know, both of them argued, look, this really isn't an independent justification here. Congress likely would not have passed this law based solely on data protection, and that's evident from the, you know, mountain of statements made by legislators in passing this ban. It's also evident from the government's briefing here, where they seem to put most stock in the content manipulation rationale, the covert content manipulation rationale.
But even accepting that that was an interest that truly motivated the law, it simply cannot sustain this broad prohibition on speech, or forced divestiture, if that's how you want to look at it. And that's because it's woefully underinclusive, but also because there are clear, less restrictive alternatives if you want to protect Americans' sensitive data.
And in particular, they both referenced extending the data broker law that was actually passed as part of the same package as PAFACA. So the law essentially prohibits data brokers from sharing Americans' personally identifying information with foreign adversaries, including China. They said that could easily be extended to companies like TikTok.
Of course, you can pass other kinds of data privacy laws too that would also get at the same problem. So those were, I think, essentially the arguments that they made and why they thought this was an easy First Amendment case.
Kevin Frazier: Very comprehensive. Thank you very much for that. So I want to play justice for a second. Chief Justice Roberts raised some interesting counterpoints, one of which was, why isn't this just a matter of regulating corporate structure, right? We're saying this is just a regulation on ByteDance. We're ordering a divestiture. Why are we even implicating the First Amendment here? So, Ramya, can you give us the response there and whether you thought that was persuasive, and then, Alan, we can kick it back to you.
Ramya Krishnan: Yeah, sure. So, I mean, I think, and you know, Roberts wasn't alone in, I think, making this point. I think a number of the justices, you know, particularly when you had the advocates for TikTok and TikTok's creators up there, you know, made the point that, look, isn't this law just targeting, you know, foreign ownership or foreign adversary control? It really doesn't say anything about expression. You know, as Alan mentioned, the law doesn't mention creators.
So, you know, maybe at most, this law incidentally burdens the speech of TikTok's users, but it doesn't directly go after it here. So why, why do we, maybe why is the First Amendment implicated at all? And I think, you know, I mean, I think that TikTok had a good answer to this, which is, you know, essentially like, I mean, I think there's like an air of unreality almost pervading this questioning–you know, the question obviously is, well, why, why were they interested in foreign control anyway here?
And the answer is, I think, you know, the substance of, you know, the content that TikTok is currently carrying or that it might carry in the future, or how it's currently being moderated or how it might be moderated in the future. And so it ultimately comes down to this fear that Americans are going to be exposed to ideas that the government doesn't like or fears Americans may be persuaded by, and that's generally the kind of thing that the First Amendment doesn't allow.
Kevin Frazier: And one other thing before we switch back to Alan. I'm curious your take. Justice Barrett, again, raised briefly something she flagged in Moody, which was this concern about whether an algorithm can be expressive.
And that didn't carry a ton of conversation necessarily at oral argument, but I'm keen to hear if you thought that got a little bit more fleshed out and whether you think that will find its way into the ultimate decision here.
Ramya Krishnan: Yeah, I, I'm not, I'm not sure. I mean, I think that there was more debate over sort of, you know, to what extent, and, and you, we saw this in the D.C. Circuit argument as well.
You know, there are some factual disputes here, I think about the extent to which, you know, TikTok is exercising its own independent judgment in deciding on its algorithm, customizing it, and ultimately deciding, you know, engaging in the kind of editorial decision making that Moody says is protected versus ByteDance really calling the shots here because, you know, ByteDance is a foreign owned company that doesn't have First Amendment rights.
And so I think there's, you know, kind of a dispute about that in the case. I'm not ultimately sure how much it will matter. And ultimately, you know, one possible result–though I'm not sure how likely it is, and we can get to sort of predictions later–is you could imagine a world in which the Court says that TikTok loses, but its creators win, because it's more convinced that the creators have a First Amendment interest at stake and that there are, you know, at least doubts over key questions when it comes to TikTok U.S.
Kevin Frazier: All right, Ramya, thanks so much for outlining all that. Alan, tons of fodder here. What do you have to say, good sir?
Alan Rozenshtein: Yeah, what I have to say is that my head hurts and I'm glad I took a nap after this argument and, and, and, you know, part of this is being glib, but part of this is also just, the First Amendment is rough.
I mean, it is, it is–constitutional doctrine is, is bad enough, and then the First Amendment is just this, like, scholastic exercise in these extremely fine distinctions. And I just think this is a perfect example where you can just go round and round and round and almost describe this conduct or this law in like a million different ways.
Because on the one hand, the chief justice is correct. Like this is, this is in a sense, a restriction on corporate ownership. And therefore you would think that at most the O'Brien test would apply with, you know, incidental effects on speech and are there alternate channels of communication? And then you have Alito kind of, I thought hilariously comparing TikTok to like a pair of old pants, I think was, was maybe what he said.
Ramya Krishnan: Like a shirt, Alan
Alan Rozenshtein: A really comfortable shirt, like one of those like L.L. Bean flannel shirts that you like laundered a bunch of times. It's really soft, right. And I, I guess in this analogy, like YouTube shorts is like a nicer shirt that you'll move to. Which, I don't know, I find hilarious. And so, in that view, like, this is not that big of a deal.
But then, of course, Ramya is also totally correct that there's this air of unreality. Because, of course, like, the whole reason you care about the corporate ownership is because of potential content manipulation. But then you also have, well, is content manipulation different than content based? And so, is it actually a bad motivation that poisons it?
And then you have this other motivation as well, data privacy. But also, what even is a motivation when you're talking about Congress? Because I, I think there's also Alito's point, Congress is a they, not an it, right? There are hundreds of Congresspeople whose views you're trying to aggregate. And, and so, you know–and we haven't even gotten to the question, right, that I know, Kevin, you wanted to tee up, so I'm just going to steal it if that's okay from you.
But is this even a speech issue at all, right? Might this really be an associational issue? Which was sort of Justice Jackson's kind of interesting curveball question about is this really a question of speech? Because of course, if you're a TikTok user, you can speak as much as you want. The question is, do you get to choose whom you associate with?
Which is to say, do you get to choose–and I thought this was where Fisher at some point said something really provocative in a way that I appreciate his candor, but I think does not help him ultimately–which is that what he thinks is at stake is the right of TikTok users to be able to choose a platform that is owned by ByteDance.
Which on the one hand is true, and I get where he's coming from. But it's also not a great argument, I think, for a Supreme Court that I think is quite sensitive to the concerns, the national security concerns, of China, right? At some point when I was like live skeeting this, which I think is what you're supposed to say on Bluesky, I, I found a Bart Simpson –
Kevin Frazier: Are we sure that's the right phraseology?
Alan Rozenshtein: Not great.
Kevin Frazier: Okay, we should maybe revisit. Justice Kagan might have a thing to say about that.
Alan Rozenshtein: Yeah, exactly. Yeah, Justice Kagan, if you're interested in Bluesky, please workshop this.
I found a Bart Simpson meme generator of just Bart Simpson writing 'but China' over and over and over again, right? Which I think is really what this case is about. And, and more seriously, I mean, there are cases like Holder v. Humanitarian Law Project that I think make this associational argument quite difficult to uphold.
I'm actually not sure what I'm trying to say, except I think this is an absolute mess of sort of overlapping layers of First Amendment concerns, many of which point in opposite directions. It's completely unclear how you do the math, right? Like, you know, you get three First Amendment points for not, you know, specifically targeting content. You get minus four First Amendment points for all those dudes in Congress complaining about pro-Palestine speech. And how do you do all that arithmetic?
And so, this is all to say that I think at the end of the day, right, what is going to happen is that the Court will assume some degree of heightened scrutiny. Right? I think they will essentially do, and University of Chicago law professor Genevieve Lakier I think basically said this on Bluesky earlier today, so I'm stealing from her, but I think it's right.
They're going to essentially do some kind of a version of what the D.C. Circuit did, which is they're going to assume that some heightened level of scrutiny applies, and then they're going to say, but China is a very scary, unusual case. And so we're just going to say on these facts, we think–especially given the need to defer to the political branches–that there are multiple grounds for this law, right?
We are not comfortable setting aside the considered judgments of the political branches, but we're not going to rule broadly. We're going to keep this very, very narrow on the facts, because there are a bunch of other First Amendment issues we're also trying to think through with the internet–see, for example, NetChoice from last year, and the Paxton Texas pornography age-verification law, and so forth and so forth, right–that we just don't want to get into.
Which, which I think, you know, for people like Ramya and me is going to be kind of intellectually very unsatisfying, but otherwise it's going to require the Court to answer, you know, 10 legal issues, any one of which is an absolute brain teaser.
Kevin Frazier: And hence again, why I'm okay not being a U.S. Supreme Court clerk, just for today. We'll set that aside for today.
Ramya, when we're trying to tally up all these points that Alan discussed, one big variable that we haven't discussed was the line of questioning centered on feasibility of divestiture, and also whether or not Congress adequately considered alternatives.
And here, I think, is a really interesting argument about whether Congress did the homework that Fisher and Francisco were asking for: exploring, well, what if we just did put up a big banner on TikTok that said, China might be stealing your data, beware? Or what if we instead had other disclosure requirements? Would that have been some viable alternative that should have been on the congressional agenda a little bit more frequently? Or did Congress have justification in just going more or less straight to this bill?
What did you make of that? How do you add up those points in that regard?
Ramya Krishnan: Yeah, well, I mean, I think that there are lots of comments and questions from the justices which pointed in different directions.
So again, yeah, very difficult to sort of tally up. I think on, you know, the consideration of alternatives, you know, I think at least on the covertness we saw a bit more skepticism come through, I think, when General Prelogar got up to defend this rationale. But, you know, in particular from, I think, Kagan and Gorsuch, who expressed, I think, you know, real sort of, yeah, doubt, I think, about the covert content manipulation rationale, because they seem to really buy this idea that if what you're concerned about is that the American people just don't know that China may be manipulating what you see on TikTok, then disclosure is a less restrictive alternative.
And so I think at least on that score, you know, there was an acknowledgment from at least some of the justices that that rationale was weaker because of the availability of less restrictive alternatives.
It was interesting when, you know, data, data protection/security came up, you know, there were a lot of questions about are we able to treat data protection here as an independent sort of like a freestanding justification?
I think many of the justices, maybe all of them, would have much preferred if this law had come to them with just the data protection rationale because that is obviously a compelling interest that doesn't turn on, on content or viewpoint, so it's a lot more First Amendment friendly.
But I was surprised that there wasn't sort of more skepticism about data protection when it comes to tailoring because, again—and I mentioned this earlier—there seem to be some pretty clear less restrictive alternatives on the table that sound more squarely in data privacy than, you know, a forced divestiture/ban of TikTok.
On feasibility, you know, I didn't hear much sympathy for TikTok's claims that a divestiture here, at least within the timeline provided, and maybe, you know, regardless of timeline, would be infeasible. And this was, like, part of, you know, my frustration with the argument too, which is not, like, necessarily, like, oh, I think TikTok deserves a lot of sympathy here. But a number of justices made this argument that, you know, nothing in the law requires TikTok to go dark, that ByteDance could always allow the sale and/or transfer of its algorithm.
And if TikTok can't use its algorithm, that's a problem of ByteDance’s or China's own making. TikTok can always sort of come up with its own algorithm, and if users lose their platform in the process, well, you know, sometimes you have to, you know, throw out the baby with the bath water, and -
Alan Rozenshtein: You just have to divest the baby. There's no, this is just a divestment. This is just a divestment or defenestration of the baby law.
Ramya Krishnan: It's a great prescription for any new parent. I'm sure. But yeah, so I mean, I think, sure, I mean, I, I understand on the one hand taking this kind of like formalist approach of like, well, you know, TikTok can always just come up with its own algorithm, but obviously that's far fetched.
And obviously the motivation behind the law was, you know, members of Congress–and obviously it's always, you know, dicey kind of talking about them as some kind of uniform entity here–but there's a lot of evidence from legislative statements, but also from the government's own defense of this law, that this bill happened because they don't like how content on TikTok is being edited, or compiled, or they're fearful of how those things might, you know, how that might be shaped in the future by, by China.
And so, yeah, again, I just found like the sort of willingness to turn a blind eye to like actually the, the most practical operation here pretty frustrating.
Kevin Frazier: Lots, lots of different points going in different directions there, Ramya. It's a tough one though. So Alan, any clarity there? I know for example, one argument that was interesting that got brought up. I believe it was Fisher who said, well, it's not as though TikTok can just go find new engineers to dream up some new algorithm.
We've got a global set of engineers. How are we going to work with all of them? How are we going to source – I believe they cited Irish content. I don't know why they went to Ireland. Maybe they're watching Say Nothing on Hulu or something. I don't know. But for you, what does the feasibility look like here? How does it come out in your point scoring?
Alan Rozenshtein: Yeah, I mean, so there's a feasibility question, and then there's the question of alternatives, which is kind of what I want to address first. So I am perhaps–I think the government actually did a better job defending the kind of least restrictive means part of this than Ramya suggested. And I thought that Solicitor General Prelogar did actually a very nice job here on sort of a couple of dimensions.
So first, right, there was this question of, well, if the concern is covert manipulation, why not just put a disclaimer, right, somewhere, and say, you know, TikTok is owned by ByteDance, and ByteDance is a Chinese company, so don't trust anything you see.
And I thought Prelogar did a nice job explaining why that wouldn't work, which is, the point is not informing users that, in some general sense, this is all potentially controlled by China. It's on a content by content basis, is this particular piece of content being served to you because of some CCP directed modification of the algorithm?
There's a very striking analogy that I thought she did a nice job doing, which is a disclosure in a store that sells a thousand products that one of them causes cancer is not actually helpful. The whole question is which of these products is the one that causes cancer, right?
And a law that required, right, ByteDance or TikTok to include on every piece of content, you know, a notice about whether this particular piece of content is being manipulated by the Chinese government? Well, A, it would be unenforceable, because of course, like, the Chinese government wouldn't let TikTok know, and it would also actually raise its own First Amendment issues regarding compelled speech and so forth.
So I think there's actually, you know, if you think that avoiding covert content manipulation is a legitimate government interest–and of course, I think Fisher, who was arguing for the users, was most powerful in saying, no, I just don't think that that's a legitimate interest, because that's paternalistic and the point is counter-speech and Lamont and, and users have a right to receive foreign propaganda, all that sort of stuff, which I think is a very fair point.
But if you think that it is a compelling government interest, it's actually very hard to figure out a way of dealing with that that is both effective and does not do a lot of damage to the First Amendment interests at stake here, right? And so I'm actually not convinced by the alternative of forcing the government to get extremely involved in the day-to-day algorithmic decisions of TikTok–which, by the way, was the main remedy under so-called Project Texas, this many-years-long negotiation between TikTok and CFIUS, the U.S. government, that failed, or that Congress thought was not working, and therefore it enacted PAFACA. I always thought that Project Texas would have had those First Amendment problems as well.
And then as to the data security side, I actually thought Prelogar also had a nice response to the argument that Noel Francisco—the former Solicitor General; there was a lot of impressive legal talent in that argument—who was arguing for TikTok, raised: well, if the concern is data privacy, just write a law that says TikTok cannot share data with anyone or everyone's going to jail. Prelogar said, but that would basically be infeasible for the exact same reason you guys are saying that a divestment isn't feasible, which is that if you want TikTok to be part of this global ByteDance-run network, it needs to transfer data—that's the whole point.
So I think the problem here is that if you accept the government's compelling interests, which you don't have to do, but if you do, I think it actually becomes very difficult to tailor this in a more narrow way. And you might say, well, how can it be harder to tailor something more narrow than a ban? A ban seems very untailored. Well, yes, but the point of tailoring is not that you have to make the remedy very small; it's rather that you have to make it no larger than is necessary to achieve the objective.
And I just think in this case, once you've accepted the objective is legitimate, if you have, it's actually extremely hard. There are few tools smaller than a sledgehammer that will accomplish the goal, is what I will say.
Kevin Frazier: Alright well, I know burning on the back of the minds of a lot of listeners has to be, what the heck's gonna happen? What are y'all's predictions? We'll get there in just one second.
One really interesting plot twist that came up during the oral argument was, let's assume the Supreme Court upholds the D.C. Circuit's judgment, doesn't side with TikTok, right? The law goes into effect, and then come January 20th, we have a new president, and the president opts not to enforce that law.
Do we see that coming? What was the justices’ response? Did anything surprise you all about this exchange that we saw? Ramya, I'll let you kick off the conversation here.
Ramya Krishnan: Yeah. So, I mean, I was, I was interested to see – I wouldn't say I was surprised, but there was a, a colloquy with General Prelogar about, you know, well, what if this ban goes into effect, Trump comes into power. Can he then sort of, you know, extend or, you know, approve a qualified divestiture, or even just kind of sit on his hands, extend the time for divestiture? Even though, you know, the ban is already meant to have gone into effect under the law, and she seemed to think that he would be able to do that.
So, you know, that, that is on the table. I mean, there was also an interesting colloquy; I think Alito's the one who asked the question about whether the Court had the authority to grant an administrative stay that would essentially postpone the January 19 deadline until after Trump takes office. And I think she said something along the lines of, you know, well, I can't point to any formal authority, but you know, of course the Court can sort of allow itself more time to resolve the case.
So I, what I took her to mean is that the Court could, could do this. So I think that that option is like potentially on the table and for my money, I think either, you know, Genevieve slash Alan's, you know, prediction is the most likely, or potentially even just like an administrative stay, punting, is likely too.
Kevin Frazier: Let's be sure to have Alan flesh out that prediction in a second.
I do just want to say though, on that colloquy about enforcement, the fact that it wasn't just a one off of might President Trump not enforce the law? And Prelogar could have said yes. But instead, they got into the weeds of, okay, but what would the reliance interests be of, for example, Google and Apple? Would they be able to then bring a due process claim if, later, we saw President Trump reverse himself and start to enforce the law?
That's pretty mind blowing. You don't talk about, well, what if the president just flip flops and what are the due processes – so that whole exchange was pretty mind blowing. And in fact, the justices themselves at one point said, well, maybe we should stop talking about this and move on. So.
Ramya Krishnan: Well, can I just add, you know, just, it's, it's funny because usually courts are in the business of presuming compliance with the law. And yeah, so it was, it was funny – let's, let's go into this imaginary where we assume people are not complying with the law.
Alan Rozenshtein: I gotta say jokes aside that, that bummed me out because I, you know, look, I, I don't know why Justice Kavanaugh started that whole thing and maybe he was musing, and fair enough. But I certainly think Justice Sotomayor was not amused because she – I think pretty strongly, I couldn't tell if she was rebuking like Trump or maybe Justice Kavanaugh, but she did not like the fact that we were all seriously talking about what if the president just, you know, YOLO, just doesn't do it? Because why not? And then you rely on what was like estoppel by laches or something?
I don't know, I got like horrible CivPro PTSD flashbacks in that moment. But to be clear, like I do think that's just a real possibility, that that's actually exactly what happens, what incoming President Trump does right? He either tells Attorney General Bondi to not enforce this law and then there's this like weird oral argument dicta that Kavanaugh has said, though there's now counter dicta by Sotomayor – it's very weird.
If you're the Apple general counsel I mean, I don't feel bad for you, you're doing fine generally in life, but this is stressful actually. Like it's very unclear what you do in this situation.
Or maybe Trump just says hey, I'm just going to announce that there's a divestiture and therefore none of this applies. Or he says I extend 90 days, even though under the statute there needs to be a legal agreement that there's a divestiture happening. But again, maybe we're just in YOLO world, like nothing matters.
This question of an administrative stay–Ramya, I actually want to understand what anyone's talking about. Because as far as I can tell, there are two ways that the Supreme Court could punt this guy.
One thing it could do–and this I actually understand, although I don't think it works–is an injunction, right? Where the Supreme Court says, no, no, no, we are enjoining this law until a later date at which point we will decide the merits.
But the problem with an injunction is that it's pretty clear what the standard is, right? And Steve Vladeck has written great stuff about this for his “One First” newsletter, which people should read, about TikTok and President Trump's request regarding TikTok, which is that an injunction is only supposed to issue if the Court–certainly when you have a duly enacted act of Congress–if the Court believes that there is a likelihood, I think maybe even a high likelihood that the requester will win on the merits, which I don't think anyone thinks.
So I think an injunction is just not on the table. But then there's just something called administrative stay which is not an injunction, but I will admit I'm not sure I fully understand the difference, and I'm also not sure what the administrative stay would be for.
Like, as far as I understand, and Ramya, please educate me, administrative stay is this like, it's like a cleanup thing. It's like, hey, there's just some stuff we have to do to kind of clean stuff up, so let's just not do anything until we clean it up. But you're not, there's nothing to clean up here. Like, there's a question, it must be answered.
And Donald Trump may be doing something at some point because he's, like, good at deals. I don't think any of this works, but I am legitimately confused and I was not pleased that we were going down that route because that would seriously bum me out. If this all ends in an administrative stay, please, please help me, Ramya.
Ramya Krishnan: I can't help you, Alan. I see your pleas for help, but I can't help you. Yeah, I mean, to be clear, I'm not saying that issuing an administrative stay would make sense in any way, that it's, you know, justified, or that they clearly have authority to do this. I mean, I think, you know, Prelogar got up against the wall–like, also, maybe she wasn't up against the wall–but she generally is not in the business of telling the Court they can't do things that they want to do.
And so it was like, if you want to do this, and notably not for the reason that the president-elect has said that he's a good deal maker and there's a deal to be made here, but because you, the Court think that you need more time to consider and adjudicate the weighty questions at stake in this case.
So I don't know that there's any precedent for it doing that. And it worries me too that this might be on the table, but it seems to be on the table. And to the extent that there was a lot of, sort of, divergence, you know, perhaps it's, it's clear, I think, from the argument that the weight of opinion is with the government, not with TikTok or, or its users.
But I think that there is still potentially a lot of disagreement about the particular pathway for the government to win on the merits. And so that's why I say maybe the administrative stay is still on the table for that reason.
Another possibility–again, I have no idea how likely this is or if there's any precedent for this kind of thing happening–but, you know, the motion for a stay is still pending. It's conceivable, at least to me, that the Court formally denies the request for the stay, but then just says that we're going to deal with this in the ordinary course and not, like, on an expedited basis.
Alan Rozenshtein: But what is the stay for? What are you trying to stay? Because I don't think you can stay a law; you have to enjoin a law, right? Or, if you're trying to stay the D.C. Circuit opinion, that of course doesn't help TikTok, because then the D.C. Circuit opinion is stayed but the law still goes into effect. So what are we staying exactly?
Ramya Krishnan: You, it's a great question. It's a, it's a fantastic question. I mean, I agree. I agree. It doesn't make sense.
But the other option I was flagging is that it actually denies the request for a stay, but then just makes clear that it's not actually going to act before the January 19 deadline. And, you know, I see that as a possibility, maybe because it makes more sense than actually granting the stay, because granting the stay doesn't seem to make, make sense, because what are you actually staying?
But also because there were a bunch of questions asking, you know, like, look, what is really going to happen like the day after January 19 for the users? Like, aren't these users still going to be able to use TikTok on their phones, which is how most people use TikTok, for some amount of time? And I think the government made an argument along these lines in its opposition to the request for a stay.
And so, for that reason, I actually, I think that there is maybe a majority of justices on the court who are not convinced that like, actually, TikTok goes dark, in the full sense of darkness, you know, the day after January 19th, and so maybe they, maybe they do that. Which has the same effect as like, granting a stay maybe.
Kevin Frazier: So, there's a lot to sort out, and I know Alan's in need of another nap, so we're gonna have to wrap this up pretty soon here.
One thing I want to point out is that in the Steel Seizure Case—which all con law nerds know and love—they heard oral argument on May 12th and still took about three weeks to issue a decision, in June of 1952. So even three weeks is a pretty rapid turnaround.
They've got nine days. So my two-part prediction question for our two wonderful panelists is, first, what day do we get a decision? And second, what is the decision? No pressure on either of you. But of course, because I'm kind, I'm going to make Alan go first.
Alan Rozenshtein: Nice. So, okay. Oh boy.
So, so here's what I think is going to happen. I think, so, so the, the Court has its next business day on Monday, and it's going to issue orders. I think on Monday they are going to issue an order denying TikTok's request for a stay. And then they'll say nothing else, right? They'll say, well, we've denied the stay.
We've heard the oral argument, okay. So now we just are gonna wait and issue our opinion in due course.
What that effectively will mean is that the law will go into effect. It will go into effect because the justices have done like a straw vote and they're pretty confident that they're gonna uphold the law, but they're not sure why. And they know that it's gonna take them months and months and months to write the opinions and deal with like the 17 meta issues and kind of angels dancing ahead of a pin – first on the doctrine issues, they need to figure this out.
And so, they're giving themselves flexibility because they're not yet actually ruling on anything. But everyone kind of understands that the only reason they're letting the law go into effect is because they're pretty confident that they're gonna rule to uphold it.
But also then to Ramya's point, it'll also give them a few months to see what's actually going to happen. Because although Francisco, and I'm kind of surprised about this, seemed actually pretty unsure whether TikTok was going to go dark or not. He seemed to say that TikTok is going to go dark, but TikTok is not going to, doesn't have to go dark, right?
Like, as long as they move their cloud infrastructure abroad–and they must already have foreign cloud infrastructure providers, because of course TikTok is global–and the apps are smart enough to, you know, find those new servers, then for several weeks, if not months, you will be able to use TikTok on a kind of slightly screwed-up basis, but you'll still use it. Which then kind of gives everyone breathing room and allows the Court to kind of give Trump the relief he wants, because then Trump can still use a couple of months to make the quote unquote deal and preserve TikTok that way.
So that's what I think is going to happen. Probably on Monday, certainly next week. And I think TikTok, I think, I think TikTok might get three votes. I think they might get Gorsuch and Sotomayor and maybe, if they're having a very good day, Kagan. But I just don't see more than three votes for TikTok at the end of the day here. That's a very long answer, but I'm also very confused.
Kevin Frazier: It’s a detailed prediction. The best thing is, come Monday, I'll be able to tell you if you're right or wrong in some regard. And that makes my weekend exciting. Ramya, can you give us your prediction, please?
Ramya Krishnan: Yeah, well, I mean, I'm gonna be boring here. Having flagged and talked about this possibility myself, I'll further flesh it out here. I think I've, like, convinced myself into believing that maybe this is the most likely scenario–that they deny the stay, but really in the hopes that maybe something might happen to moot the case when the Trump administration comes in, or at the very least, give them time to sort out their differences of opinion.
Now, if we do end up having an actual decision on the merits, then to my mind, the second most likely option is that we get a ruling for the government before January 19. And I think if that happens, it will likely be, we're going to assume that heightened scrutiny applies and that it's satisfied, and it will essentially be a Holder v. Humanitarian Law Project mark two kind of situation, where you see a lot of sort of the national security deference and an emphasis on the sort of China concerns behind this law.
And so I think, you know, if the Court gets to the merits, that's the most likely place they'll land. Because I didn't actually hear, and obviously this might change as the justices go off to further consider the case, but I didn't hear, you know, five votes for the First Amendment just doesn't come into play here. And I'm not even sure there was a critical mass on, is this, you know, content neutral and intermediate scrutiny applies, or is this definitely content based?
And so, for that reason, they might, in order to get a sort of, you know, one opinion, a majority opinion, sort of converge on, let's just assume for the sake of argument that strict scrutiny applies, à la the D.C. Circuit.
Kevin Frazier: Well, folks, we've got some predictions, wild times ahead, no doubt, and you'll certainly hear from us one way or another whether Alan's right or not. Who knows? We'll find out soon. But for now, thanks Ramya and Alan for joining us and talk to you all soon.
Ramya Krishnan: Thanks for having me.
Alan Rozenshtein: Thanks a lot.
Kevin Frazier: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters.
Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Chatter, Allies, and The Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6th. Check out our written work at lawfaremedia.org.
The podcast is edited by Jen Patja. Our theme song is from Alibi Music. As always, thank you for listening.