Cybersecurity & Tech

Lawfare Daily: Wikipedia, Ref-Working, and the Battle Over Reality

Renée DiResta, Jimmy Wales, Jen Patja
Tuesday, December 9, 2025, 7:00 AM
What happens when reliable sources become a battleground for power?

Wikipedia is more than an encyclopedia. It’s a key part of the internet’s information infrastructure—shaping what people know, what AI models learn, and what the public sees as true. But in an era of geopolitical conflict, AI disruption, and fracturing trust, Wikipedia has come under attack.

In this episode, Renée DiResta talks with Wikipedia founder Jimmy Wales about his new book, “The Seven Rules of Trust,” and about how Wikipedia has managed to remain one of the most trusted sites on the internet. They explore the principles that helped build that trust and the outside pressure it’s come under—from American congressmen, to Russian censorship campaigns, to Elon Musk’s Grokipedia. 

What does it take to make institutions trustworthy in a low-trust era? What happens when reliable sources become a battleground for power? And how does a community continue to build shared knowledge while partisans are redefining the rules of truth?

To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/lawfare-institute.

Please note that the transcript below was auto-generated and may contain errors.

Transcript

[Intro]

Jimmy Wales: So one of the things we look for when we're looking at sources is, oh, when they do get something wrong, because everybody does, what do they do about it?

You know, how transparent are they about what happened and how they're going to fix it?

Renée DiResta: It's the Lawfare Podcast. I'm Renée DiResta, contributing editor at Lawfare, here with Jimmy Wales, founder of Wikipedia and author of “The Seven Rules of Trust.”

Jimmy Wales: So the way Wikipedia works is very different from social media, for example, where you can just flood the zone. You know, like you can have 10,000 bots all saying similar things and just flood the zone with it. That's going to get you nowhere in Wikipedia.

[Main episode]

Renée DiResta: Today we're talking about what it takes to keep an open, collaborative platform trustworthy in a time of deep distrust and political pressure.

So, I love the fact that you opened your book with Stephen Colbert's joke about Wikiality back when Wikipedia was seen as kind of, you know, chaotic and maybe unreliable, right? He tells this joke about elephants, and how we can change what elephants are just by editing the page.

But Wikipedia has now become the sort of scaffolding of the internet. It trains AI, it powers search, it shapes how billions of people understand the world. When did you realize that it was becoming that kind of infrastructure?

Jimmy Wales: It's a good question. I mean, it sort of emerged over a period of time. I mean, I remember a few specific moments. We had the John Seigenthaler incident. So John Seigenthaler, Sr. was a very well-known journalist who stumbled across his Wikipedia entry, which had a terrible error, got in touch with me, and I, you know, immediately fixed it. But then he wrote a scathing editorial about Wikipedia in USA Today.

That was the first time I was like, oh wow. Like actually, people are paying attention to this. Like, it's actually newsworthy, you know, like what Wikipedia says. So that was one of the early incidents.

But, you know, it's over time, you know, there's all kinds of funny things. I mean, even like Stephen Colbert making a joke about Wikipedia was like, oh wow. Like, I'm noticing this.

Renée DiResta: You're on TV.

Jimmy Wales: Yeah. Crazy.

Renée DiResta: And I remember, because I was, at the time, this would've been I guess around 2018 or so, right when Susan Wojcicki was sitting on stage at one of the conferences. Maybe it was Recode, I think, one of the major tech conferences. And she says oh yeah, we're going to use Wikipedia for fact checking, was how she framed it.

This was during the days of conspiracy theories, you know, this was the fake news era, when fake news meant, you know, stories that weren't true. It was before that term got kind of co-opted. And she kind of announced on stage that Wikipedia was going to be used for fact-checking.

And I thought this was very interesting. I was studying conspiracy theories quite a bit at the time and writing about them. And I think I wrote about this for Wired. I was just, you know, contributing at the time, thinking, like, this is very interesting, right? Because, as I recall, I think I reached out to somebody at Wikimedia. And they were like, nobody told us about that.

Do you want to talk a little bit about what that was like?

Jimmy Wales: Yeah, no, I mean, that was interesting. I mean, I think Susan mentioned it to me, but yeah, it wasn't like a formal deal or anything. You know, I mean, one of the interesting things about Wikipedia is it's freely licensed, so everything in Wikipedia can be copied, modified, and redistributed, including modified versions, commercially or noncommercially. Which will be relevant when we talk about AI in a little while.

And so it was sort of no surprise that they might want to take some snippets from Wikipedia to put underneath videos or you know, whatever. But it's kind of cool.

Renée DiResta: It was a very interesting approach, I think because YouTube rather notoriously, recently, you know, they wrote a letter to Jim Jordan saying, we don't fact check, we've never fact checked.

And Jim Jordan was like, you're never going to fact check. But this became a thing, right? Because it was actually this moment back in 2018 when they said, we're going to use crowdsourcing, crowdsourced information. We're going to point to Wikipedia, and whatever Wikipedia says is going to be the thing that they point to.

And it was almost a, almost a Community Notes model in a sense, even back in the olden days when they were relying on this consensus of the crowd to sort of go after things like the Earth is round and vaccines don't cause autism. These things that we're now re-litigating in 2025.

One of the things about that, actually, kind of connecting it maybe to things like Community Notes and consensus, is that Wikipedia has always made that process, its process, I should say, very visible: the disagreement, the debate, the evidence, whereas many institutions tend to keep that sausage-making behind the scenes.

You write about this a lot in your book.

Jimmy Wales: Mm-hmm.

Renée DiResta: What have you learned from making that process so visible? You write a lot about it in the context of transparency to help build trust.

Jimmy Wales: Yeah, no, I mean, I think it's really really important. It's important for the Wikipedia process. It's important for trust.

And it is something that I think I'd like to see happen in more places. So, you know, perhaps unusually for somebody who's quite anti-misinformation and disinformation, I actually wasn't that unhappy when Facebook decided to stop their fact-checking program.

Not because, you know, I think, oh, it's fine, post nonsense. That's not my point. But it was like, it was not transparent enough. And it was a bit top down, and, you know, these organizations that were doing the fact checking were mainly doing a good job. But people didn't trust that process. It didn't feel authentic in certain ways.

I mean, I always joke, you know how on Wikipedia it says, you know, the neutrality of this article has been disputed, or, the following section doesn't cite any sources. And I always say, you know, I wish the New York Times would do this. Sometimes, you know, just put a little note at the top of an article saying, you know, we had a big fight in the newsroom. We weren't sure we were going to run this. We think it's important enough to run, but just be warned, a couple of the journalists don't think this is firmed up enough yet.

Wow, that's kind of cool. That's actually amazing. And it's—now I can read it with that understanding. And there is that old school traditional journalism sort of voice of God, you know, like it's true because we're telling you it's true.

And people kind of can see through that. And you know, some of the older ways of dealing with problems around that are still valid: run a correction. That's actually really important, as one of the things we look for when we're looking at sources is, oh, when they do get something wrong, because everybody does, what do they do about it?

You know, how transparent are they about what happened and how they're going to fix it and so forth?

Renée DiResta: That's an interesting point. I think that question of legitimacy is really very closely tied into the transparency piece. Have you ever encountered a moment where you think oh, that kind of backfired?

Or have you found it to be universally a positive?

Jimmy Wales: Well, I mean, one of the funny issues is that it's often very difficult for outside people, journalists, for example, to actually read a discussion page in Wikipedia. Because all the voices look completely equal, but without any understanding of who's saying what and how that's going to be perceived by the other people in the conversation, it's actually really hard to interpret.

And that's not a criticism of journalism. It's just, it's a hard thing to do. And so what happens is sometimes you'll see, you know, a story like, massive controversy breaks out on Wikipedia. When really it's the same two trolls who are always complaining about everything.

It's not really, that one's not a real controversy in Wikipedia. And, you know, recently I just, on my talk page, casually mentioned some ideas about how we might use AI to support our work. And one person said, this is making me really angry. I can't believe you're saying this.

And I was like, I'm sorry you're angry. And you know, well, it was just a conversation. It was a conversation. And I would say there were mixed feelings about my ideas. And some people thought, oh, good idea. Some people thought not so much. But the news headline was, community pushes back on Jimmy Wales's plan to shove AI down their throats or something.

You know, like I'm exaggerating, but. And so that, that's okay, right. We are just trying to have a little conversation here, like you would in any organization sort of around the water cooler as they used to say. I don't, do people have water coolers anymore?

But yeah. So sometimes it's a little funny because the transparency just means, well, it's all out there, so.

Renée DiResta: Right. And sometimes I think it is hard to follow. You have the edit history comments. Then you have the talk page comments. Then sometimes things wind up on the talk pages of the editors themselves when there's a dispute. I've followed it a few times as I've looked at contentious articles, both as a person who follows it, as a person who writes about it sometimes, and also as a person who becomes the subject of controversy sometimes, you know, these days.

And so it has been interesting to try to figure out where the different sense-making, so to speak, happens.

And so it's been very interesting. Have you followed the emergence of—I don’t know if you spend any time on X Community Notes?

Jimmy Wales: Yeah. Actually, it's the only thing I still like about X, I would say. I don't spend as much time there, I try to avoid it these days. I find it's just gotten so toxic and it's just really not the same.

It's not fun. I have a few friends on there and interesting people, so that's still valid. But when I go on I typically just, I see a request to do a community note thing, and I do it. I'm like, oh, this is kind of fun and interesting and yeah.

Renée DiResta: So you're a contributor to Community Notes?

Jimmy Wales: Yeah. Yeah, yeah. I feel like it's a valid and useful thing to do. And I mean, it's interesting because I do try to come at it like a Wikipedian. And so it's often sort of fun for me to say, wow, I massively disagree with this person's comment. However, this community note is not actually very good.

Like it's sort of—I'm one of those people who are always like, take it to the, you know, post your own comment. Like, you're just disagreeing with the opinion. You're not fact checking, you know. This is not a clarification or whatever. You're just debating the person.

It's, yeah. Go debate the person. That's fine. So anyway, I do enjoy it, and I actually think it's something I'd like to see them do more with, and more of that. Because, you know, I mean, one of the longstanding problems of Twitter—Twitter, X—is the algorithmic amplification of really toxic people, toxic comments.

But also, as I always remind people about Usenet, which was before the World Wide Web: we didn't even need algorithms to be toxic to each other.

That's a human thing. And so already you take that tendency, you add a layer of algorithmic promotion on top of it, and you get this cesspool that's completely not living up to its purpose, if it has a purpose.

So yeah, I think Community Notes is a good thing.

Renée DiResta: No, I was just saying, in trust and safety research, the problem with social media is people. So, for what that's worth.

Jimmy Wales: Yeah. When I was young, a teenager, I worked in a grocery store for a while. I was working the night shift stocking shelves, and I got to help open a new store. They'd opened a brand new store, and the week before, we just stocked the entire grocery store beforehand, and it was like the platonic essence of a grocery store the morning it opened. Every can, everything was absolutely perfect.

And I said, you know, this job would be a lot more pleasing if it weren't for the customers.

Renée DiResta: There you go.

Jimmy Wales: So it's yeah, absolutely. The only, yeah. The hard part about community management is managing the community.

Renée DiResta: So, right. So there's, there's two thoughts I have.

One is, I don't know if you've followed some of the research on Community Notes that's very interesting, which is that—and maybe this kind of branches between AI versus where I want to take it with some of the debates about sources, but let me stick with the Community Notes piece here—have you seen any of the interesting research on what's called the Habermas Machine?

Jimmy Wales: I have not, no.

Renée DiResta: It makes this argument, it's very interesting research done by Google DeepMind, that actually one of the incredible uses of AI is that it can produce a community note, or, you know, a sort of a snippet, a distillation, in very neutral language that people actually like quite a bit.

Because one of the problems with Community Notes is that oftentimes the notes don't clear, so to speak. The way that a note shows up on a tweet is that the bridging algorithm requires that people on the left and people on the right all agree that the note is helpful in order for it to show up.

That's how the algorithm works. And in terms of writing in that neutrality—which I think ties into Wikipedia and neutral point of view, right?—how do you express something in kind of a neutral language that people find palatable? This is the thing that has to happen on Community Notes.

On Wikipedia, you know, you're sort of expressing a neutral point of view. You have more language to work with. But on Community Notes, what people are finding is that the machine actually can write in that tone that more people like. And so it clears.

And so you're starting to see AI used to assist with scaling, producing more notes in a neutral tone. And then the voting is what goes to the people. So the consensus process is kind of down—

Jimmy Wales: Oh, that's really interesting. Mm-hmm.

Renée DiResta: —to the humans, right? So the humans have to do the voting, the humans have to decide a note is needed, per your point. This isn't just an opinion debate. That consensus still has to happen between right and left. But that's where people are starting to see a use case for AI in this kind of consensus making process. It helps just write in that tone that people find palatable.

I was curious if you'd seen this, or what you'd heard about that?

Jimmy Wales: I hadn’t seen it. It sounds very plausible and very interesting. I mean, I use large language models all the time for fun. And, you know, just, I find it an absolutely fascinating thing. And I can see that, that sounds likely to me.

And what I also like about it generally—and this is my view about the future of AI and Wikipedia, large language models and all that—is that for AI to help by making suggestions is a much more effective use than trying to take something over from humans. Because the level of nuance required to really make a decision is often quite difficult, and you just really do want that assurance and reassurance.

On the other hand, a starting point, like a helpful, you know, oh here's an idea of something, yeah. Actually quite interesting.

Renée DiResta: So how are you thinking about AI and Wikipedia?

Jimmy Wales: Well, I mean, there are a lot of different elements there. So I would say one of the first things is, we are not envisioning, and, you know, we really don't think it's a great idea, to have AI producing content that goes directly to readers. Just because the hallucination problem is still quite bad and so on, and, you know, that doesn't work. It doesn't feel right to us. It doesn't feel necessary.

But I'm interested in how AI might help us do certain things at scale that already happen, but that are a little bit too manual, so to speak.

So I was talking to a French Wikipedian this summer. We have an annual conference, Wikimania. We were in Nairobi, which was awesome. Florence, an old-school Wikipedian, said, oh, I don't have time to edit Wikipedia as much as I used to. But I have a hobby: I go into French Wikipedia and I find an older page that has dead links, because links go dead over time.

And then I see, what was the link supporting? And then I go find a new source that supports that and I add the source. And I'm like, oh, okay. Yeah. Great. And it's just a little thing she does and it's nice and you know, that's very typical. Wikipedia knitting we call it. You know, just sort of something to do.

But I said, okay, what would you think about this idea: finding a 404 not found link, you don't even need AI for that. You just find a broken link, and then an AI reads the article and says, oh, this is what that link was supporting. And then it goes to some usual suspect sources. It vets them a bit for you, and it makes a suggestion.

And she said, oh, that sounds useful. Because the interesting part is making the judgment: does this support that? That's the human piece. What's not interesting is finding a dead link. That's a lot of clicking. And what's not interesting is Googling and throwing out like seven sources that don't pan out, where she thought one would mention the point but it didn't, whatever.

So it's, oh wow, that could just speed up the work, and it would be great. So that's a simple little thing, but it's the sort of thing that you can start to think about being able to do now that we have a technology that can do textual analysis in a really kind of human-approachable way.
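As a sketch of the division of labor Wales describes: the mechanical step (finding dead links) needs no AI at all, the language-model step only drafts a suggestion, and the human makes the judgment call. This is a hypothetical illustration, not a Wikimedia tool; the function names are invented, and the LLM call is stubbed out with the prompt it would send.

```python
# Hypothetical sketch of the dead-link workflow described above; not an
# actual Wikimedia tool. The LLM step is stubbed out.
import re
import requests

def find_dead_links(wikitext: str) -> list[str]:
    """The non-AI part: collect cited URLs that now return errors."""
    urls = re.findall(r"https?://[^\s\]|}<]+", wikitext)
    dead = []
    for url in urls:
        try:
            resp = requests.head(url, timeout=10, allow_redirects=True)
            if resp.status_code >= 400:
                dead.append(url)
        except requests.RequestException:
            dead.append(url)  # unreachable counts as dead
    return dead

def draft_replacement_suggestion(article_text: str, dead_url: str) -> str:
    """The AI part (stubbed): ask a model what claim the dead link was
    supporting and which sources might support it instead. A human
    editor still vets whatever comes back."""
    return (
        f"This citation is dead: {dead_url}\n"
        f"Article context: {article_text[:2000]}\n"
        "What claim was this link supporting, and what candidate "
        "replacement sources would support it?"
    )
```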

You know, other things would be cross-language comparisons. I've played around with this a little bit myself. I realized I started doing it because I was interested in, you know, controversial topics and how we deal with them in different cultures, different languages. But I actually think it's just as valid to think about it for not so controversial topics. Although frankly, as a Wikipedian, I can tell you almost anything can be controversial if you scratch the surface.

But you know, like my simple idea here is, what about articles about French wine in English Wikipedia versus French Wikipedia? Well, they're probably quite similar. They're both actually quite good on the topic of wine, because I guess there's a lot of wine hobbyists who are interested. But I bet you that if you just sort of, you know, ran a script over a lot of articles, you would find articles that are markedly different for no good reason. And it would be really interesting to have something that does just that, a little friendly bot.

It raises its hand, like, oh, hey, I found these two articles and they're quite different, even though they're supposed to be about the same subject. And then a human would go, oh great, let me look into that. That sounds fun. That sounds interesting. Because one of the things that people do is they come to Wikipedia as Wikipedians.

And this is the best way to come to Wikipedia. Not as a culture warrior for your thing that you're uptight about, right? But you come as a Wikipedian, and you're just like, oh, what am I going to do today? I want to help Wikipedia, I want to do—I want to do something interesting.

And maybe AI can help us find things that are interesting and productive. You know, what's a popular article that has a neutrality warning that's been there for too long? Oh, if you're looking for a tough challenge, that might be good.

So anyway, that's where I think about AI as like supporting the community.
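The cross-language "friendly bot" Wales sketches above can be prototyped against the public MediaWiki API. A rough sketch follows, using article length as a crude stand-in for a real content comparison (a real bot would compare what the articles actually say, not just their size), with an arbitrary 3x divergence threshold:

```python
# Rough prototype of the cross-language comparison idea: flag article
# pairs whose language editions differ sharply. Uses the public MediaWiki
# API; the 3x length ratio is an arbitrary illustrative threshold.
import requests

API = "https://{lang}.wikipedia.org/w/api.php"

def extract_length(lang: str, title: str) -> int:
    """Return the plain-text length of an article in one language edition."""
    resp = requests.get(API.format(lang=lang), params={
        "action": "query", "prop": "extracts", "explaintext": 1,
        "titles": title, "format": "json",
    }, timeout=30)
    page = next(iter(resp.json()["query"]["pages"].values()))
    return len(page.get("extract", ""))

def flag_divergence(title_en: str, title_fr: str, ratio: float = 3.0) -> None:
    """Raise a hand when one edition is several times longer than the other."""
    en = extract_length("en", title_en)
    fr = extract_length("fr", title_fr)
    if max(en, fr) >= ratio * max(1, min(en, fr)):
        print(f"Flag: '{title_en}' (en: {en} chars) vs "
              f"'{title_fr}' (fr: {fr} chars) look markedly different")

flag_divergence("Château Margaux", "Château Margaux")
```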

Renée DiResta: I guess this bridges into Grok quite nicely. Have you looked at your Grokipedia bio?

Jimmy Wales: I have not. I have not. I've actually, I've barely had time to look at Grokipedia. I have looked at it a little bit. I, mainly I've seen news stories about it and so forth, and it doesn't sound great.

I really do need to do a deep dive. And I think I've got some time like late next week to actually sit for six hours and do a bunch of comparisons. I might actually get AI to help me compare them.

Renée DiResta: Yeah. I can recommend an article by Alexios Mantzarlis over at Indicator Media. I linked it in my recent analysis also.

I read my bio. I'll give a—

Jimmy Wales: Yeah.

Renée DiResta: I was, you know, one of the lucky 855,000 to get a page in the first pass.

No, actually in all seriousness, you know, I study adversarial abuse online. That's my job.

I was very intrigued. The first half, I have to say was very good. It really just sort of trawled the web and all these random profiles that I've had in the past, random interviews I've done with people where, you know, they start with what was your childhood like?

And you tell these random anecdotes about, oh, my dad taught me how to code, and all these random things that a random person would never know. Like, it actually did scrape and aggregate all that stuff. And so the opening is actually quite rich with these random, you know, stories and things.

And then it goes off the rails, you know?

Jimmy Wales: Yeah yeah yeah.

Renée DiResta: And it has a whole controversy section. And I think that it really is tilted towards controversies. That's kind of what he wants, that's, you know, sort of like—when you have a product that's created out of spite, you really can see what it could be versus what it is.

And that, I think, is actually the great tragedy of Grokipedia—so Alexios’s analysis, which does do a diff between Wikipedia and Grokipedia, kind of finds that about 55% of the articles are essentially cribs. They're basically identical.

But in the controversial topics is where you see this significant deviation, where there's this significant overemphasis on controversy. And that's where my experience was that it pulled from a congressional report written by Jim Jordan that was just complete bullshit, and then from congressional testimonies by nut jobs, you know, the real politically motivated stuff.

And even as it used the politically motivated testimonies, it hallucinated facts that were not even in them. So it was sort of one degree past even the conspiracy theories. It went one degree further. So I thought, well, this is very interesting.

And so there's no talk page. There's no talk page to fight with the bot. There's just a box. So I fought with it in the box, by the way. Turns out everything you fight with it about in the box becomes public. They do in fact post that, which is good. I didn't—that's good. I was a little bit direct, but that language is out there. That's alright.

Whatever. It is what it is. So it turns out, two and a half weeks later, after I write about this for The Atlantic, it does in fact go and correct the bio. And you can actually see the bot's reasoning as it goes and compares my complaint about the source, saying, this isn't even in here, you're hallucinating.

It does in fact go and look, and it does in fact make the correction, right. It does realize that what it is saying is just not true, right? It's just not true. Yeah. It winds up writing a very long, winding—I mean, honestly, it's kind of garbage. It's very long, winding, it doesn't make any sense really.

But it does in fact go and edit it. And so it's very interesting this process of trying to fight with a robot to make a change. But it did, after two and a half weeks, go and make this change.

So I wrote about that too. You know, I wrote about it on my Substack.

Jimmy Wales: Yeah.

Renée DiResta: But it's a really interesting experience to look at, again, this process of that initial scrape, that very comprehensive crawl that it does. I thought, oh, you know, if you were to actually say, hey, I want to go write a bio of somebody, particularly when you have these initiatives that Wikipedia does, where it's, oh, we want to write about women or people who are not necessarily represented, or these various projects, you could see it as an interesting way to gather sources that don't show up on Google and things like this.

But then again, since it's motivated by spite, it very heavily leans into controversies. And what you see in Alexios’s—this kind of ties into the other thing I want to talk about: source comparison.

That's the thing that really hits as the major difference here, which is a significant percentage of the sources are things that I think the average person would not consider to be reliable. And by that, I don't mean, I don't even mean right wing media or left wing media. I mean Stormfront, I mean Infowars.

So we're really way far off in the realm of what people would consider to be, I think, the fringe of the fringe. LifeSiteNews, you know, these sorts of things that are way out there.

So that's the sort of analysis, when you look at the diffs, that is very interesting. And I'm kind of curious, bringing it back to the sort of source wars here, how you think about that fight over sources that is playing out now? You know, you're winding up getting letters from members of Congress, the sort of ref-working that's happening here.

How are you guys thinking about holding the line there, or you know, how you're going to engage the community and members of Congress about that war that's happening?

Jimmy Wales: Yeah, so I mean, it's super interesting. So, I mean, the first thing to say is, you know, I think it would be completely intellectually irresponsible not to make judgements about sources. That's something you really have to do. And you can't just say, you know, like, how dare you? It's biased if you think this social media influencer is not as valid as the New England Journal of Medicine.

No, it isn't. It isn't at all biased. That's just paying attention to quality and the facts of reality and so forth. And then at the same time, in the Wiki world, you have to be very thoughtful about this: how do you think about sources, and when do you deprecate, which is a term we use. It doesn't mean it's banned as a source, it means you just prefer a better source.

And you know, that gets really tricky when we are in an era where, quite frankly, there's a large number of new media sources that are quite low quality. And they do tend to be right wing. I would say that's been a big growth area.

I mean, one of the things that I've been saying is, if there are right wing billionaires who are upset about the state of the culture, please fund some high-quality news sources. And it's not about the political leaning, it's really about, are you just posting populist crazy things that have no basis in truth?

Do you do error corrections? How thoughtful is it? That's really important. And so that's, you know, currently I think an issue in the world. It's not an issue for Wikipedia. I think we have to bend over backwards as best we can to say, you know, okay, let's be very careful that we aren't cherry-picking sources. Because that's also a very natural, easy human tendency, you know?

And I think we're pretty good on that front. You know, if there's a genuine scientific controversy, and it also has some politicized element to it, well, we should ignore the politicized element and just cover the genuine aspect of the controversy.

And yeah, I think we mostly do that. I mean, obviously you can quibble on this, that, and the other, always. So, yeah, it's really tricky. And then I just have to say, it's not quite what you asked me, but you gave me an excuse for a little rant about this.

As you know as well as anybody, there is a real trope on the right that the Biden administration was putting pressure on, you know, social media and potentially Wikipedia to censor. And when people found out, yes, we had conversations with the Biden administration, it was all like, ‘aha,’ you know. This was mainly about COVID.

And I'm like, no, hold on a second. Like, we talk to governments all the time. We explain how Wikipedia works. We'll never change anything because the government wants us to change it. Also, we never got any pressure from the Biden administration to change anything. Like, that just wasn't a thing. As they were researching and trying to learn about what's going on in the information ecosystem around COVID and things like this, obviously they talked to us.

Very interesting. And yes, we have gotten pressure from the U.S. government, but from the Republicans. And I think that's extremely problematic. You know, we had this letter from the interim U.S. attorney in D.C.

Who didn't get the permanent job, so great.

Renée DiResta: Ed Martin.

Jimmy Wales: Yeah. Ed Martin.

Yeah. And I mean, frankly, you know, it's really good that we do have fantastic, very calm professional staff who just basically answered the letter in a very matter-of-fact way. Because I would've just said, go fuck yourself, and footnoted the First Amendment. None of this is any of your business. It's completely absurd.

But you know, all right, you've got questions like, here's the answers to the questions. And you know, this sort of hinting at our nonprofit status, and saying things like, basically, you know, ‘it's come to our attention you let foreigners edit Wikipedia.’

It's like ‘yes, we’re global,’ you know, ‘what are you even talking about?’ But there we are. And so I just think that is unfortunate. And it's actually something I do think quite sincerely and quite genuinely: there are people on the right who are upset and concerned about freedom of expression and various negative things that are potentially happening to freedom of expression.

I mean, here, I live in the UK, where freedom of expression is not nearly as protected as the U.S. and a lot of people have concerns. I think that's great.

And if you have those concerns, then you on the right should say, actually it's outrageous that Ed Martin is sending like nasty letters to Wikipedia.

That is not the role of government whatsoever. That's my rant.

Renée DiResta: No, I, well, I got subpoenaed by Jim Jordan, I think around the same time that he was sending letters to Wikipedia. So I'm, yeah, I'm right there with you. Yeah. I think that that question, they've also, I believe, wanted names of editors and things like that.

So we've definitely seen—I've written about, I think, some of the requests. There's also been, I think, actual state censorship from foreign governments that has targeted Wikipedia. I think you were fined by Russia. Russia fined the Wikimedia Foundation for refusing to take down certain content about the war in Ukraine.

So you've also experienced this from non-US governments too. Do you want to, do you have any other—

Jimmy Wales: Definitely, yeah. There's a few examples. So yes, there have been fines issued in Russia, which we're never going to pay, by the way. You know, that's just not a thing we're going to do.

Yeah. Just, no. Well, we're currently blocked in China. We've had an on/off sort of situation in China. We have a standing offer from China that we could be open and accessible in China if only we would let a Chinese university manage the content and make sure it's legal in China. No, sorry.

That's not something we're going to do. We were banned in Turkey for about three years. And we—I'm very proud of this one—we fought all the way to the Supreme Court in Turkey and won.

And now we're unblocked in Turkey. And it was a landmark decision for freedom of expression in Turkey. And that's great. And you know, I think one of the reasons, so we have many reasons.

I mean, partly we're just quite ideological about the fundamental human right to participate in discourse and in something like Wikipedia. You know, this isn't hate speech. This isn't threatening people. It isn't all the borderline things. This is Wikipedia. And even if it's biased in places, which of course it is sometimes, in some places, we try not to be, that's okay. That's also perfectly part of legitimate discourse and so forth.

But also, I think of it as a very practical matter. Like, one of the issues is, if we decided to, you know, in Turkey, we could have gotten unblocked by just blocking a certain page from being visible in Turkey. Just don't send that page to Turkey and it's all going to be fine.

Once you start doing that, then they come out of the woodwork. You know, I think a lot of governments really understand it's kind of all or nothing. Like, you could block Wikipedia, but you're going to block all of Wikipedia, because they're not going to cave in. They don't play ball. And I think that's really important.

And, you know, I've criticized Elon, had a little sort of thing with him about this. In Turkey, they decided to take down certain tweets, even though, you know, he loves to talk a lot about freedom of expression. And he sort of made the typical kind of reason and excuse, which, by the way, other than him posturing about the issue so flamboyantly as he does, I think is a hard problem.

I think it's a really—it's not a hard problem for us, right. But for commercial enterprises with a business model that's different from ours, I'm like, I get it. It's, you know, are you going to lose your entire access to the Turkish market over one tweet?

Okay, right. That's a tough business decision to make. For us, it's actually—I mean, this is one of the great things about our business model. So we're a charity, but our business model is support from small donors. And our donors would be very angry at us if we participated in censorship. So it actually would cost us money, you know?

So it's like, great, our incentives are aligned with our values. That's really wonderful. So anyway, you know, I think it's a hard decision to make for a lot of the companies. Some are better than others about it. Some just, like, they do whatever. They just take down whatever without even a fight.

Others, I think YouTube typically tries to fight the good fight where they can. Yeah.

Renée DiResta: How do you think about sort of state-sponsored edit warring? I know that, so this again maybe ties into the Wikipedia as information infrastructure that really heavily influences AI downstream. So not AI as a creation tool for Wikipedia, but Wikipedia as a training dataset for AI.

So one of the things that we see is LLMs and other content really heavily use sources to produce reality downstream, we might say. And this means that state-sponsored actors in particular are interested in influencing it.

This is of course why Congress is also interested in what's on Wikipedia. But this question: Russia in particular, right, let's just say, is very interested in shaping narratives on Wikipedia, knowing that this is going to influence Google snippets, right? AI answer engines are going to draw very heavily from it. If you shape reality on a Wikipedia page, you're going to influence training data downstream. This is going to shape what is communicated.

How do you guys think about that, in terms of, it takes a lot for your community to be on top of all of the pages? It is a massive endeavor.

There's a colleague of mine at Lawfare who wrote a really excellent article a couple years back about the fighting in one guy's bio, and I don't remember which—it was a Ukrainian poet, a writer, I'm trying to remember the specifics. But it was even just trying to erase Ukrainian identity, where they kept trying to go in, take out Ukrainian, and replace it with Russian.

So these sorts of small-scale, little information war kind of dynamics, where someone is just trying to subtly change things around the edges to kind of tilt narratives in favor of Russia. This is the sort of thing where the community can get to it at some scale, but when you have something that's done at large scale, I'm sort of curious how you think about that reality shaping.

Those types of attacks, given the importance of Wikipedia downstream.

Jimmy Wales: Yeah. Yeah. So I, and there's a few things to say about this. So, first of all, it's quite difficult to get away with that. So the way Wikipedia works is very different from social media, for example, where you can just flood the zone, you know. Like, you can have 10,000 bots all saying similar things and just flood the zone with it.

That's going to get you nowhere in Wikipedia. Another element of Wikipedia that I think is quite important is, we have a very strong tradition against voting on content. So we'll have straw polls and things like that. We call it a not-vote, just so it's clear it's not a vote.

Because what really matters is, you know, getting the sense of the room, but also evaluating policy-based arguments for what you're saying.

And so just sheer numbers doesn't get you very far. In fact, it more than likely just gets you blocked, because people immediately go, oh, there's a hundred random people we've never heard of before, all saying the same thing. Let's just block them all. They're clearly bots and trolls and what have you.
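As an illustration of why sheer numbers backfire, here is a toy sketch of how a flood of near-identical comments from brand-new accounts might be surfaced for review. The field names and thresholds are assumptions for the example, not Wikipedia's actual anti-abuse tooling:

```python
# Toy sketch only: surface clusters of near-identical comments posted by
# brand-new accounts. Not Wikipedia's actual anti-abuse tooling; the
# thresholds and fields are illustrative assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Comment:
    user: str
    account_age_days: int
    text: str

def flag_coordinated_surges(comments, min_cluster=20, max_age_days=7):
    """Group comments by normalized text, then flag clusters dominated
    by accounts too new to have any editing history."""
    clusters = defaultdict(list)
    for c in comments:
        normalized = " ".join(c.text.lower().split())
        clusters[normalized].append(c)
    flagged = []
    for text, group in clusters.items():
        newcomers = [c for c in group if c.account_age_days <= max_age_days]
        if len(newcomers) >= min_cluster:
            flagged.append((text, [c.user for c in newcomers]))
    return flagged
```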

So it's harder than you think. And then also, I think—I was once in Russia, before the war, a long time ago, when one would go to Russia. And I was sat at dinner after a conference, next to the editor-in-chief of a major magazine.

And he said, oh, I can make Wikipedia say what I want. I just give a hundred dollars each to a few Wikipedians, and it's done. And I'm like, okay, let's talk about that. Could you really do that?

And the answer is probably not, because if they started putting in strange things, the other Wikipedians would go, what are you doing? You don't have a source. What is this about? You could enter the discourse, but you could do that for free.

You know, I said, but you're the editor-in-chief of a magazine. You can make it say whatever you want. And in fact, the government can probably make it say whatever they want, because they can make you write whatever they say, you know?

And so it's much, much harder when you've got an open system, where the decision making is based on consensus and about the editorial standards and so on and so forth. Now, that's not to say, you know, governments can't have some influence and some participation, and maybe that would be okay if they're very transparent about it.

Like, just post on the talk page and say, oh, hi, I'm from the Russian government, we've got this concern. You've not really dealt with this statistic properly. Okay, well, we'll look at it. Whatever. I mean, that's a minor point.

But the idea of sort of sneaking into Wikipedia and sock puppeting, I'm sure it happens at small scale, but I don't think it is the main thrust of things.

And in fact, one of my moments of pride: I was at a conference here in London and I met a Ukrainian journalist. And she said, oh, how is Russian Wikipedia talking about the war? And I said, well, I hear good things. But, like, you speak Russian? Yes. Fluent in Russian, fluent in Ukrainian.

I'm like, well, please go and let me know. Go and read it.

So she did. That night, she went home—went to the hotel—read it, came back. I saw her the next day, and I said, oh, well, how was it?

And she said, it was better than I thought it would be. And she said, I've got a quibble with a few things.

And I'm like, great, I've got a quibble with a lot of things. That's—I'm okay with that. And you know, it's things like, Russian Wikipedia says it's a war. It says Russia invaded Ukraine. It says the things that you might think would be difficult to say in Russia.

And by the way, a lot of the Wikipedians have taken to laying low, you know, abandoned their old account, which was tied to their real-life identity.

And now they're editing under a new account that's more anonymous, and they're using a VPN, and they're being a little more careful because it's quite difficult. You know, we've had Wikipedians arrested in various authoritarian places and they're heroes. They're amazing. And so, yeah. What I'm saying is, I'm not dismissing the question.

I'm saying I think we're okay, but we always have to take that seriously. And, you know, obviously we do technical things: looking for patterns of IP addresses, looking for sock-puppeting, things like that. Because that does happen.

But I think more often, sock-puppeting and that sort of thing is one person being a troll rather than state-sponsored activity.

Renée DiResta: That's fair. I think that question about reliable sources comes up also in the realm of, you know, the engineering of entire domains that are essentially fabricated by the state at this point. Just LLM-generated, alternate propaganda sites and realities. It makes me think a lot about this question of reliable news ecosystems, particularly when they are outside of the sort of U.S. language ecosystem, and how we know what is reliable.

Jimmy Wales: It’s absolutely, absolutely crucial. I mean, it's super important and super interesting.

I mean, one of the things—so back when, you know, we first started to get worried about fake news sites, before Donald Trump decided it means all the news he doesn't like, you know, that's a legitimate phenomenon that existed before LLMs.

And it was never a problem for Wikipedia, because the Wikipedians spend their lives debating about the quality of sources, and also they're kind of hard to fool. Like, I remember one example headline, it said, “Pope Endorses Trump”—this is Pope Francis. And, you know, you show this to—well, it was a little bit viral on social media.

It wasn't like a massive thing, but it did, it got a lot of retweets or whatever. And you can fool random people, but fooling Wikipedians is quite tough because they would go oh, hold on. Popes don't generally endorse political candidates. That's just not a thing.

And by the way, Pope Francis, and like whatever you may think about the Catholic church and all that, he seemed like a really nice guy and unlikely to be a Trump supporter.

Like, very much not likely. And so you would just go to that ‘this probably makes no sense.’ And then you go look at the website and you're like, oh.

Like, you click around a bit, and you're like, oh, there are only four pages on this website. It was just done up for social media.

But what you're talking about is a further threat, because in the past, making something like that was quite easy, but you could only make one or two pages. You couldn't generate an entire news site. And now you probably can create quite a huge amount of content and make that look a lot more plausible.

And I would say, as you've identified as well: try to fool English-speaking Wikipedia about English language media, you're not going to get very far. Try to fool Germans about German language media, you're not going to get very far. But try to fool any of us about news sites in Thailand? Ooh, it might be a little tricky. I don't know which ones are the best. I don't know the most famous papers, and I don't know if this is new or old.

I mean, it just means more diligence on our part. So if somebody wants to use a source, but it's—yeah, it's a part of the world that we have to grapple with.

Renée DiResta: I want to touch on another thing that you said in your book that I think ties back right now to this crisis of trust in institutions. Which is growing, I would say.

As you know, we've seen some really remarkable things from the CDC in the last 48 hours. We now have a page that says vaccines cause autism on cdc.gov. It’s incredible.

Jimmy Wales: Wow. Okay. Yeah.

Renée DiResta: So I'm curious how we should be thinking about public agencies trying to build trust.

And I think that, you know, I wrote a book that came out in 2024 that still sort of assumed that we would be reforming federal agencies. Now, I think, yeah, maybe we're going down to the state level at this point, but yeah.

Jimmy Wales: Yeah.

Renée DiResta: How should institutions be thinking about reforming or rebuilding trust at this point? How should they be learning the lessons that you write about in your book?

What do you think is the key thing for them to be thinking about?

Jimmy Wales: You know, giving advice in normal times is different from giving advice right now, because I very much appreciate—I mean, we know how many scientists have resigned or left because they can't, you know, like they just can't abide by it whatsoever.

But in normal times, you know, there are definitely things that, that organizations of all kinds do and can and should do to build trust. So you know, like one of the things that I think we should insist on as citizens when we get back to sanity, is the intellectual independence of the scientists at the CDC.

You know, should they also have checks and balances about things, like not being too much under the sway of big pharma? Fine. Yeah. Great. That's actually important. It's less important than the anti-vaxxers think, but yeah, that's a thing.

So all of these sort of steps to say, like, how do we ensure the integrity of the process and depoliticize it as much as possible?

You know, one of the things that I learned in the research for the book that I thought was quite interesting is, in a lot of these partisan cases, these kinds of actions reduce trust not only with the people who disagree with you, but also the people who agree with you.

So this was, you know—I talk about the Washington Post decision not to endorse presidential candidates. Which I said I think was a good decision, just very badly timed. You know, doing it just before the election made it seem like it was implicit support for Trump by Jeff Bezos or whatever, and I don't think that was it.

But more broadly, you know, the research shows that political endorsements by news organizations not only lower trust among people who don't agree with the endorsement, they actually lower trust among people who do agree with the endorsement. Because then their concern is, is the news being skewed to support their candidate? Is it a campaigning organization, or is it fact-based?

And I think that's doubly true like with journalism and news. It's a complicated matter. I think there's—personally, I think there's nothing wrong with having a right-leaning or left-leaning paper that has a view. That's fine. You've got an audience. That's fine. I think we all have to—I trust them all a little bit less because of that, but it's fine.

But I think for, you know, government agencies trying to represent all the citizens, I think they've got to really be very thoughtful about this. And you know, a piece of it would be purpose.

What is the purpose of the CDC? The purpose of the CDC is not to fight big pharma. It's not to support the president. The purpose of the CDC is to research and disseminate accurate, factual information that's important for our health.

So yeah. I'm not saying anything too radical, I think, but it sounds a bit radical at these times.

Renée DiResta: Looking ahead, what do you think is the next, you know, the next trust challenge that you see coming for Wikipedia or for the broader knowledge ecosystem?

Jimmy Wales: Yeah. Well, I mean, I do think, you know, there are a lot of the challenges we've touched on already. You know, if we are in a broader information ecosystem that has become more politicized and more fraught with risk for accuracy and so forth. I mean, in the past you would've never questioned for one second statistics coming from the CDC about chickenpox.

Right. But is that going to become politicized? If that becomes politicized, then are those statistics still valid? I mean, have they fired all the statisticians, and are they stopping collecting the data? So then suddenly the quality of that output is at risk.

So that's a risk for us. A big risk that I talk about a lot, that's not necessarily new, and not necessarily about sort of the current Trump administration and all of that, is the decline in local journalism. I think that's a real challenge for Wikipedia. It's easier to write the history of my hometown, Huntsville, Alabama, in 1980 than it is in 2020, because the local newspaper is basically dead.

And you know, the number of journalists working locally is much smaller. So that's a problem that I don't have a solution to. I wish I did. And then for us, for Wikipedians, I think one of the most important things is community health. And by community health, I mean staying curious, staying neutral, staying open.

One of the things that I have said in my book, and I've said to Elon Musk, is, if you're putting out the incorrect view that Wikipedia has been taken over by radical left-wing activists, that it's a sad shame, Wikipedia used to be great, but it's been taken over by left-wing crazies.

Well, you're doing two things. You're telling kind, thoughtful conservatives that they wouldn't be welcome in Wikipedia, so they're going to stay away. I don't want that. I want them here. I want people here to debate the issues of our time in a thoughtful and kind manner.

And you're also telling those crazy left-wing activists that Wikipedia is their new home, and then we have to deal with them. That's no fun either.

And it's, no, actually what we want are people who care more about the values of Wikipedia than about the culture wars. And that's really important. And they can come from all sides. And, you know, there's great people on all sides.

And so that's something we have to be careful about, that we don't react to all that by saying, fine. Like, I bristle a bit when people say, and I get why they say it, oh, reality has a liberal bias.

That's a cute saying and all that, but I'm like, that should not become a mantra for Wikipedia, because that doesn't really foster an intellectual openness and willingness to listen to people on all sides.

Renée DiResta: Is there any appeal or anything that we can be telling ordinary listeners or just the average person that they can be doing to contribute or, you know, help to turn the tide on this stuff?

Jimmy Wales: I hope it's fun to come and edit Wikipedia. And so I think that's always the best thing people can do.

Obviously we're a charity. We do depend on donations. It's our 25th anniversary. And yeah, of course I'm here about my book. I've got the book—so, this is the German edition, which I'm very excited to try to read. I speak a little bit of German, but not very much. It's coming out in 20 different languages, so I'm excited about that.

Renée DiResta: That's fantastic.

Jimmy Wales: Yeah.

Renée DiResta: Awesome. Well, thank you so much. I really enjoyed our conversation.

Jimmy Wales: Yeah. Fantastic.

Renée DiResta: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter at our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters.

Please rate and review us wherever you get your podcasts. Look out for our other podcasts, including Rational Security, Allies, The Aftermath, and Escalation, our latest Lawfare Presents podcast series about the war in Ukraine. Check out our written work at lawfaremedia.org.

This podcast is edited by Jen Patja and our audio engineer this episode was Cara Shillenn of Goat Rodeo. Our theme song is from ALIBI music. As always, thank you for listening.


Renée DiResta is an Associate Research Professor at the McCourt School of Public Policy at Georgetown. She is a contributing editor at Lawfare.
Jimmy Wales is a founder of Wikipedia and the author of "The Seven Rules of Trust."
Jen Patja is the editor of the Lawfare Podcast and Rational Security, and serves as Lawfare’s Director of Audience Engagement. Previously, she was Co-Executive Director of Virginia Civics and Deputy Director of the Center for the Constitution at James Madison's Montpelier, where she worked to deepen public understanding of constitutional democracy and inspire meaningful civic participation.

Subscribe to Lawfare