Cybersecurity & Tech

How Unmoderated Platforms Became the Frontline for Russian Propaganda

Samantha Bradshaw, Renee DiResta, Christopher Giles
Wednesday, August 17, 2022, 8:01 AM

An overfocus on covert networks on Facebook and Twitter misses the full expanse of the propaganda strategies that often reach more users through different communication media.

Social media applications on a phone (Jason Howie, CC BY 2.0)

Published by The Lawfare Institute

The Russian invasion of Ukraine has highlighted the evolving complexities of platform governance challenges in an increasingly decentralized information environment. Russia’s war in Ukraine has killed, injured, and displaced thousands of civilians. Russia, leading up to and throughout the conflict, brought the full scope of its propaganda apparatus to bear, leveraging overt and covert capabilities on both broadcast and social media to justify the invasion, downplay the death and destruction of families and homes, and deny human rights abuses. Social media companies have been called to make difficult, real-time decisions about content moderation with life-or-death stakes. However, even as the Western conversation focuses on Facebook, Twitter, and Alphabet’s successes and failures, state war propaganda is increasingly prevalent on platforms that offer minimal-moderation virality as their value proposition.

For several years, most of the focus on social media influence operations has homed in on disinformation and coordinated inauthentic behavior (CIB), particularly the covert campaigns with now-notorious bots and trolls run amok on Western social media, such as those that sowed discontent during the 2016 U.S. presidential election. However, this narrow focus on mainstream social media platforms and covert activities has left two glaring blind spots for ongoing information operations: overt information operations and the rise of social media propaganda on alternative platforms like Telegram.

Russia’s Full Spectrum Information Operations

Russia has a long history of “full spectrum” information operations that span the range of available communication mediums. Russian politicians, ministries, and media outlets work together to co-construct narratives that serve their geopolitical interests. Evidence of these “full spectrum” operations predates the digital era, with examples of Soviet operations leveraging newspapers, radio, and television dating back to the 1930s and continuing through the civil rights movement. Every available dissemination channel was utilized, and while the narratives were inflected for specific audiences or to highlight particular details, the overarching goal of undermining Western hegemony remained constant. In the United States during the civil rights movement, for example, incidents of racism were published and amplified by Soviet media as part of a broader strategic effort to highlight American hypocrisy and undermine the United States’ position on the world stage. These messages were spread not only through covert and unattributable channels but also quite openly, in clearly attributable “white propaganda” outlets.

Today, internet memes have replaced propaganda posters, and the full spectrum of channels now includes social media. However, state-backed media, which itself now has a presence on social platforms, continues to play an important role in shaping narratives that support Russia’s geopolitical interests. These range from “counter-hegemonic” and anti-Western coverage of NATO and other institutions of the liberal democratic order, to conspiratorial narratives about the Skripal poisoning, the 2014 downing of Malaysia Airlines flight MH17, or the Russian invasion of Crimea. In our research on Russian state-backed media coverage of the #BlackLivesMatter movement, we show how attributable state media outlets, posting on Facebook, strategically employed race and racism to simultaneously promote the movement (on some channels) and undermine it (on others). While our investigation focused on state media on Facebook, these same state outlets were running similar strategies on Instagram, TikTok, YouTube, and Twitter, reformatting their propaganda to fit the affordances and audience preferences of each platform. On TikTok in particular, editorial teams employ millennial language and meme culture that mixes Russian state political content with dance videos and life hacks.

Before the invasion of Ukraine, platform responses to state-backed media focused primarily on transparency and labeling policies. Over the course of a decade, Russia continued to evolve its overt propaganda capability. It made significant investments into the global reach of RT, with the channel operating in six different languages to reach audiences in Latin America, the Middle East, and Africa. These same properties are now playing a significant role in framing and shaping public opinion around the Ukraine invasion. Some outlets falsely claimed that the Ukrainian government was conducting a genocide against its own civilians, or that NATO countries would carry out “false flag chemical weapons attacks in Ukraine’s breakaway republics to tarnish Russia’s reputation.”

Russian state-backed media outlets like RT and Sputnik broadcast to millions of viewers in Europe on television and via social media for 10 days after the invasion, until European governments initiated sanctions against them. Some platforms, in response to the sanctions as well as the ongoing concerns around the role of state-backed media in wartime propaganda, took steps to limit the reach and impact of state-backed media on their platforms. For example, Meta announced plans to demote Russian state-backed media (including regional channels) on Facebook and Instagram, YouTube blocked Russian state-funded media channels from operating globally, Telegram limited access to RT and Sputnik channels for those users who signed up with a European phone number, and TikTok adopted some transparency and labeling policies, expediting the rollout of state-backed media labels to Russian state-backed media.

While demoting or banning state-backed media reduced the reach of some channels, outlets adjusted their strategies to evade the bans, using official embassy accounts and individual journalists’ accounts to share content. Although most state-backed media content was no longer being algorithmically recommended to users, there was nothing preventing the sharing of content from these channels. In the weeks after the ban, major channels, particularly those in the Global South, were still generating hundreds of thousands of engagements, although these were lower than prewar levels.

Full Spectrum Operations on Telegram

Telegram, which is used widely in Ukraine and Russia (and has a growing presence in the U.S. and Europe), has been central to information dynamics throughout the war. It operates in a curious space in the online platform ecosystem, often overlooked in Western media coverage. Like many alternative platforms, Telegram has an avowed commitment to doing the minimum possible content moderation. There have been a few cases, mostly related to groups engaging in or encouraging violence, that have resulted in Telegram removing accounts and content. In 2019, Telegram removed Islamic State accounts, and in 2021, it terminated two anti-vaccine groups in Italy and Germany that advocated violence against health care officials. Following pressure from the European Union, Telegram blocked RT accounts on its platform in the region.

The platform has been an essential communication tool for Ukrainians gathering information about a fast-moving conflict. But its decision to eschew content moderation has created an environment in which viral pro-Kremlin propaganda, both overt and covert, has thrived. A vast and complex network of pro-Kremlin propaganda groups and channels on Telegram share, replicate, and echo misinformation and posts promoting pro-Russian disinformation.

The Russian invasion has provided momentum for various pro-Kremlin Telegram communities to develop with different characteristics, styles, and emphases. Some channels are creators of content, while others are propagators of narratives, persistently forwarding content from dozens of channels. Making any concrete claims as to the attribution of a page’s owners is a significant challenge; unlike Facebook and Twitter, Telegram does not attempt to identify, label, or moderate Russian propaganda channels, overt or covert.

Several recent reports have assessed these networks. One investigation identified over 80 channels that were reportedly part of a pro-Kremlin Telegram network to target specific populations in Ukraine. The channels took the names of local municipalities in Ukraine and provided updates about the war alongside the promotion of Russian government propaganda narratives. Another analysis found that pro-Kremlin Telegram channels were generating more engagement than were those critical of Russia.

Pro-Kremlin channels often provide on-the-ground reporting from Ukraine, sharing graphic imagery and celebrating Russian military advancements. Some have hundreds of thousands of subscribers and are associated with Russian mercenary actors. The channels that tend to have the largest followings on Telegram, often over one million subscribers, are pro-Russian propagandists who have achieved fame or public attention via TV current affairs programs. Telegram channels are also being used to mobilize individuals. In August 2022, Meta removed a network of Instagram accounts that were artificially trying to promote the perception of widespread support for Russia’s invasion of Ukraine. These efforts were organized on the Telegram channel “Cyber Front Z,” which had over 100,000 subscribers. That channel remains undisturbed.

The question of what both mainstream and alternative social media platforms do to combat full spectrum information operations has ramifications for armed conflicts worldwide going forward. And, perhaps more importantly, the ongoing invasion of Ukraine provides a glimpse of propaganda and conflict in a more decentralized social media future, in which not all platforms can or will moderate.

Addressing the Blind Spots of Information Operations

A comprehensive strategy to combat disinformation campaigns must consider full spectrum operations that incorporate both overt and covert dynamics across a wide range of analog, digital, and alternative media, including non-Western platforms like Telegram. An overfocus on covert networks on Facebook and Twitter misses the full expanse of propaganda strategies, which often reach more users through popular local media and social media channels. It misses the fact that state actors are simply moving their content strategies and investing in audience growth on platforms that won’t moderate, investigate provenance, or contextualize state narratives.

State-backed media are some of the worst offenders when it comes to editorializing the war and denying the use of force or violent events taking place on the ground, and their increasing use of social media as a channel creates distinct complexities. While some platform policies prevent certain government entities from denying the use of force or violent events in the context of an attack against the territorial integrity of another state, state-backed media are not explicitly limited from publishing these kinds of narratives, despite the fact that they have significantly larger audiences than government or embassy accounts. One way platforms could continue to strengthen their response to overt propaganda would be to include state-backed media in this policy.

Telegram’s Pavel Durov said in July 2021 “that conspiracy theories only strengthen each time their content is removed by moderators. Instead of putting an end to wrong ideas, censorship often makes it harder to fight them.” Counterspeech, contextualization, and correction are, indeed, long-standing alternatives to censorship. However, the architecture of Telegram does not provide an equitable forum for facts to win out. Its uncapped forwarding function and the presence of highly prolific, unattributed channels create an environment in which state propaganda, overt and covert, can spread with little oversight or accountability. The platform could maintain its decision to minimize the removal of groups and accounts and yet still build out functions that create greater transparency around state-operated pages, or it could consider forwarding limits that might temper the spread.

Propaganda always has been, and always will be, a component of violent conflict. In the modern information environment, social media platforms are one of the first lines of defense on the digital battlefield. The Russia-Ukraine war has shown that platforms are not simply neutral or commercial entities: Their policies make them arbiters of geopolitics, and the decisions they make, or don’t make, in the case of Telegram, can mean life or death during times of war, conflict, and violence. Although information spaces have never been homogeneous, the invasion of Ukraine provides a preview of the policymaking challenges in a fractured and highly politicized information environment.

Samantha Bradshaw is an assistant professor in new technology and security at American University’s School of International Service.
Renée DiResta is the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching, and policy engagement for the study of abuse in current information technologies. Her work examines the spread of narratives across social and media networks; how distinct actor types leverage the information ecosystem to exert influence; and how policy, education, and design responses can be used to mitigate manipulation.
Christopher Giles is a researcher and open-source investigator at the Stanford Internet Observatory. Prior to joining Stanford University as a Knight-Hennessy Scholar, Christopher reported on disinformation for BBC News, covering the Covid-19 pandemic and the 2020 U.S. election.
