
U.S. Influence Operations: The Military’s Resurrected Digital Campaign for Hearts and Minds

Renée DiResta, John Perrino
Tuesday, October 11, 2022, 8:16 AM

In a battle the United States can’t withdraw from, a coordinated campaign of truth, authenticity, and transparency is key to victory.

(Georgia National Guard from United States, CC BY 2.0 <https://creativecommons.org/licenses/by/2.0>, https://bit.ly/3McjXAi)

Published by The Lawfare Institute
in Cooperation With
Brookings

In October 2008, the U.S. Special Operations Command published a request for proposal (RFP) seeking “rapid, on-order global dissemination of web-based influence products and tools in support of strategic and long-term U.S. Government goals and objectives.” The RFP listing, for something nondescript called a “Trans Regional Web Initiative” (TRWI), appeared at a time when the global war on terror—and the growing online presence of terrorists—was a particularly critical mission. The TRWI required a lead that could handle everything from the development of website architecture and content management systems, to the development of content “tailored to foreign audiences” in the battle for hearts and minds. 

At the time, the announcement was viewed with some skepticism. Wired, for example, warned that U.S. efforts had largely been unsuccessful at “creating cultural and/or news content that appeals to foreign audiences” and speculated whether anyone would read the websites. But in September 2009, the contract, worth $10 million for the first year with four annual renewal options that would later exceed $20 million, was awarded to one of the largest government and military contractors, General Dynamics Information Technology (GDIT). 

Five years later, in 2014, the TRWI was shut down, as Congress, too, became skeptical about whether its impact justified its price tag. But a variant would rise again—with much more alongside it—as the online propaganda battleground expanded from counterterrorism to a messaging war of all-against-all as state adversaries like Russia entered the mix. 

Nearly 14 years after the TRWI was announced, researchers at the Stanford Internet Observatory and Graphika received and analyzed a unique data set from Meta and Twitter (as participants in the platforms’ researcher consortiums). Each of our teams had, over the prior four years, analyzed networks of assets attributed to many other state-linked actors—particularly China, Russia, and Iran, but also including India, Mexico, Nigeria, and many more—that had been suspended by social media platforms for “coordinated inauthentic behavior” and similar policy violations. This was the first data set we had observed for which Twitter listed the “presumptive countries of origin” as the United States and Great Britain, while Meta said the “country of origin” was the United States. 

While most of the tactics were not novel, the data set was a complex mix of overt operations linked to the U.S. military—linked, in fact, to the old domains of the TRWI—and other activity that presented something of a puzzle. It was demonstrably pro-Western, pro-U.S., but there were few clues as to who or which actors were behind the masked activity. The Stanford Internet Observatory and Graphika wrote our joint report on only that part of the data set: the covert cluster of activity, which we described without assigning an attribution. 

Subsequent reporting in the Washington Post by Ellen Nakashima, however, found that Facebook and Twitter had in fact previously reached out to the Pentagon about inauthentic accounts that they believed were linked to the military. She additionally noted that the analysis in our Aug. 24 report on the pro-Western covert aspects of the network had spurred a review of military information operations conducted on social media. The Post cited two anonymous officials who said U.S. Central Command (CENTCOM) is among the military commands facing scrutiny after the report “elicited a flurry of news coverage and questions for the military.” 

This data set, with its combination of distinct overt and covert operational dynamics, raises important questions about how democratic countries should operate in an increasingly complex and contested online battlespace for hearts and minds. The United States must not cede the information space, but efforts should be refocused on boosting the truth in an authentic way while also exposing falsehoods and deception peddled by adversaries. However, the questions around how to engage are many and complex: Which government and military organizations should lead public affairs, or information operations? And to what extent must they coordinate for a unified voice on U.S. policy and the promotion of regional diplomacy and development efforts? Are there instances when the operation of undisclosed accounts and personas is justified, such as for the infiltration of closed terrorist groups? What rules and oversight should be in place for such a wide range of information operations? How should the success of information operations be measured to ensure accountability and the constant improvement of such efforts? To better understand the dynamics in play, this piece offers a look at the full scope of the activity in the data sets from Meta and Twitter. 

A Complicated Puzzle of Pro-Western Networks

At first glance upon receiving the data, we thought we were looking at a network connected to GDIT’s TRWI efforts from long ago, with accounts tied to the long-abandoned domains. Some of the accounts, and the websites that they connected to, openly claimed affiliation with U.S. military combatant commands (including CENTCOM) and clearly disclosed these affiliations in compliance with relevant laws and policies.

However, it quickly became clear that there was a second cluster of accounts that represented websites without U.S. military disclosures. This second cluster of activity, which we called “the covert cluster,” was reminiscent of the activities we’d seen from other state actors over the years. There were accounts related to websites that appeared to be sham media properties purporting to produce independent journalism and also fake persona accounts with profile pictures generated by artificial intelligence (AI). It was a demonstrably pro-U.S. propaganda effort, but we could not make a strong connection linking the overt TRWI-linked materials, which were demonstrably run by CENTCOM and a contractor, with this second cluster.

We wrote an extensive report summarizing the covert network cluster only, which had emerged as early as 2017 but was most active over the past three years across Twitter, Facebook, Instagram, and other social media platforms. Fake personas promoted the interests of the United States and its allies in the Middle East and Central Asia, including Iran and Afghanistan. Following Russia’s invasion of Ukraine in February, some of the persona accounts heavily criticized Russia for the deaths of innocent civilians and other atrocities. Others promoted U.S. Agency for International Development activities in the region or promoted positive views of American troops. The accounts sometimes shared articles from U.S. government-funded media outlets, such as Voice of America and Radio Free Europe, and links to regional news websites sponsored by the U.S. military. A portion of the activity also promoted anti-extremism messaging. Overall, the covert accounts had low engagement, illustrating the limitations of using inauthentic tactics to build influence online. The vast majority of posts and tweets reviewed received no more than a handful of likes or retweets. Fewer than one in five of the overt and covert accounts we analyzed had 1,000 or more followers. 

To further complicate matters, however, there was a third segment of activity with accounts linked to new media websites that did have CENTCOM disclosure statements, but the affiliation was not always clearly posted on their social media profiles. And these accounts also appeared to be linked to the old TRWI domain infrastructure.

The Short-Lived Trans-Regional Web Initiative

Under the TRWI contract, GDIT registered and operated 10 coordinated websites and corresponding social media channels targeted at regional audiences across the Middle East, North Africa, and Central Asia. The content focused primarily on issues related to regional stability, human rights, and economic growth while promoting regional partners and allies and attacking adversaries like Russia and Iran. 

The websites, which all disclosed U.S. military backing, included Info Sur Hoy, Sabahi Online, Magharebia, Mawtani, Al-Shorfa, Khabar South Asia, Southeast European Times, and Central Asia Online. All of the aforementioned websites focused on politics and news unique to their respective regions. Al-Shorfa, for example, targeted an audience in the Middle East, promoting interfaith dialogue initiatives and pointing to the destabilizing effects of terrorism and extremism across the region. Mawtani similarly promoted “greater regional stability [in the Middle East] through bilateral and multilateral cooperation” with a special emphasis on Iraq.

Based on news coverage from the period when the TRWI was active, audience engagement with the sites varied, but they seemed to achieve some reach. Coverage in Foreign Policy noted that “for a small outlet covering an obscure corner of the world, Central Asia Online does relatively well. In 2011, the site published an average of 71 stories per month,” earning a reported 168,000 article reads, 85,000 unique visitors, and 380 reader comments per month. The coverage also noted that the site’s material “seeps into local newspapers, websites, and news aggregators around the world, expanding the site’s readership.” A 2013 doctoral thesis by Roy Revie includes an in-depth analysis of media and social media dynamics of the sites—including ranking data from Alexa, a web analytics service; locations in which articles were quoted, linked to, and engaged with; and tables of social media interactions—noting that some of the sites received hundreds of engagements on articles shared to their Facebook pages while others received almost none. 

Even in 2013, however, the program was controversial. A classified Government Accountability Office (GAO) report, leaked to USA Today, faulted the program for a lack of coordination with other U.S. efforts. While the White House saw some benefit to the program, Congress cut funding for the TRWI in fiscal year 2014, citing the GAO report that emphasized failures with military information support operations—including the TRWI program, whose annual cost had risen to $22 million. The report highlighted a surge in funding that produced unclear results, as well as a failure to coordinate the military-operated websites with embassies and State Department efforts on U.S. policy positions and with regional diplomatic efforts in the target regions. State Department officials expressed concern about how audiences “sensitive to foreign military presence” in northern Africa might receive military-operated news websites, according to USA Today reporting at the time. Then-Senate Armed Services Chairman Carl Levin (D-Mich.) opposed the continued funding of the effort, with bipartisan support from senators skeptical of the military programs or of the perceived waste of funds. 

The timing of the TRWI’s termination is extraordinary when viewed in the context of the past seven years: Around the time of the program’s demise, state actors and non-state extremist groups were increasingly engaging in social media propaganda and online influence campaigns. Islamic State recruitment and propaganda activity was highly visible on Twitter. And the Internet Research Agency—the now-notorious Russian troll factory best known for interfering in American domestic affairs and the 2016 election—had already begun to operate.

In August 2014, General Dynamics Information Technology warned 61 employees in the D.C. suburbs that their positions would likely be terminated given the funding cuts for the program. In a truly remarkable turn of events, some of the laid-off GDIT contractors soon found new jobs working for the Russian government’s Sputnik news agency.

“What seems to be clear is that the anti-status quo powers in the world today—Russia, China, Iran, and the Islamic State—know the value of information warfare and invest heavily in it,” former Voice of America director and Defense Department information strategy adviser Robert W. Reilly told the AMI wire service in its 2016 reporting on the former U.S. contractors’ move to Sputnik. The former TRWI contractors had “nowhere else to go and are being picked up by Putin …. [I]t’s a powerful illustration of who takes information warfare strategy seriously and who doesn’t,” he said.

Zlatko Kovach was one of those former GDIT contractors who supported the TRWI. His seemingly abandoned LinkedIn profile continues to list his current position as “Senior Editor at General Dynamics Information Technology” with a description that he produces “content for online news websites covering the European Union, Southeast Europe and Turkey,” working with “over 60 freelance reporters and photographers” to assign, produce, and edit stories. 

He echoed Reilly’s sentiment, telling AMI that:

the nature of the game has changed …. You have media that’s shrinking. U.S. government communication efforts were being canceled. The media is evolving. There is a media space, and the question is, who’s going to fill that space? Then I had to ask, who’s offering jobs?

The Cat Came Back

Media reporting and contract data on the original TRWI domains were easy to find. But, curiously, coverage of what appears to have been a TRWI “respawn” in the 2016-2017 time frame was largely absent. These domains also had appropriate CENTCOM or other combatant command attribution language in their “About” pages, as the old TRWI sites had—however, their Instagram, Twitter, and Telegram accounts did not have obvious disclosures. They were neither covert nor particularly proactive about putting their affiliation in highly visible places. As we investigated these still-active websites, domain registrations and Google Analytics data suggested that from late 2016 to early 2017, a new cluster of websites with largely the same regional focus—and some of the same branding—had emerged to replace the TRWI. 

The rebranding, it seems, began in mid-to-late 2016.

The TRWI ran across multiple social media platforms. Websites linked to the TRWI Twitter accounts pointed to over a dozen Facebook pages and YouTube channels, all of which had also been operated as part of the program. Several of these Facebook pages and YouTube channels were still active when they were discovered but had been rebranded as new outlets covering the same regions as their predecessors. 

Mawtani rebranded as Diyaruna and continues to focus on international efforts toward regional stability in the Middle East with a focus on Iraq. Al-Shorfa, also targeting an audience in the Middle East, became Al-Mashareq. The website and its social media channels continued to promote interfaith dialogue and highlight the harms of extremism in the region. On at least one occasion, the operators behind Al-Mashareq’s Facebook page—blocked to users in the U.S.—exposed location tags from Rockville, Maryland, and Fort Worth, Texas, in posts. The websites continue to carry the appropriate CENTCOM attribution to comply with U.S. law. At least eight TRWI Twitter accounts also rebranded and operate under this as-yet-unknown program, based on unique Twitter IDs.

The re-branded Al-Mashareq Facebook page included posts with visible locations within the U.S.

Nearly concurrently with the rebranded relaunch of the TRWI outlets, the covert operation began. While we have no evidence that the operators are the same, we do observe that the covert personas boosted the overt content, linking to these overt “respawned” domains. Continued research and a deeper investigation of these activities is warranted. “With the rise of Russia and China as strategic competitors, military commanders have wanted to fight back, including online,” Nakashima wrote in Washington Post coverage of the new Pentagon review. An anonymous defense official told the Post that commanders “got really excited” when legal barriers to conducting clandestine operations were pulled back in 2019.

The Cat’s Out of the Bag. What to Do?

States engage in influence operations. This is not new, nor is it unique to social media. Cold War history is replete with examples of Western governments using overt propaganda alongside covert operations to influence nonaligned countries—and even overthrowing democratically chosen governments—through print, television, and radio communications. There is also a long history of “agents of influence” who secretly worked on behalf of rival governments—some used their real names, while others crafted elaborate personas. And, of course, online espionage and cyber operations have been a reality since the dawn of the internet. State influence activities stem from strategic objectives—they serve a purpose. No government, Western or otherwise, is simply going to stop pursuing them. 

But this data set offers a rare glimpse into pro-Western influence operations—both overt and covert—in the social media age. From what we have seen, the meager benefits of the covert operation were not worth the substantial risks. The overt activity appeared to perform slightly better, but, while our glimpses may be limited, strategy in the information domain appears scattered and incoherent.

The U.S. government and military should not run influence operations powered by inauthentic accounts or fake engagement. Operating a network of fake social media accounts with masked or AI-generated faces to target publics in adversary countries is not worth the loss of the moral high ground for what appears to be minimal upside. By creating inauthentic personas and using fake accounts and fabricated engagement to boost perspectives, the United States abandons the truthful, though selective, information campaigns that have previously defined U.S. doctrine and distinguished U.S.-backed reporting and public affairs efforts abroad. 

The online influence game has been normalized. It has expanded significantly since the notorious 2014 efforts by the Russian Internet Research Agency to convince the American public that a Louisiana chemical company was under attack, an initial volley that subsequently extended into a multiyear effort to erode American social cohesion and interfere in the 2016 U.S. election. A combination of that activity, and the actions of the highly visible Islamic State propagandists, no doubt inspired some of the U.S. military’s changing views and authorities around online influence operations over the past decade.

However, in our analysis of the data sets, we observed that this operation took a “spray and pray” approach, with flimsy personas, content that achieved only the most minimal engagement, and no clear strategic value. While the ethics may be debated, the numbers show this approach is not a successful strategy. It may enrich defense contractors, but it diminishes Western credibility, with no obvious benefit.

The threat of online propaganda is real, and the United States needs a capacity for response. While some in the national security establishment likely believe that this is “fighting fire with fire” and justified, they should keep in mind one of Russia’s key goals with its influence operations: to attack the very idea of objective truth and create a sense of nihilism among global audiences. Actions by democracies to copy Russia’s techniques create conditions favorable to Russian and Chinese fabulists, with no upside to the free world. The United States, instead, must adhere to its principles by spreading factual information to advance democratic values and shape public opinion. In the shadow of Russia’s illegal war in Ukraine and China’s imprisonment of millions of ethnic minorities, the moral gap between authoritarians and democracies has not been this clear since World War II. The United States and its allies should not make things easier for those states by muddying the water with inauthentic online propaganda. 

As the Department of Defense takes an increasingly offensive cybersecurity strategy, known as “defend forward,” a similar approach may be warranted in the information space. Instead of replicating the deceitful networks of authoritarian adversaries, U.S. and Western efforts should focus on exposing those adversarial networks with radical transparency and winning hearts and minds with an underutilized weapon: the truth.

We are encouraged by the serious response the research findings have reportedly received at the highest levels of the U.S. government and the military. These findings warrant a thorough review to ensure the U.S. does not give up the moral high ground by tearing down, instead of building up, trust with critical audiences abroad.


Renée DiResta is the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching, and policy engagement for the study of abuse in current information technologies. Her work examines the spread of narratives across social and media networks; how distinct actor types leverage the information ecosystem to exert influence; and how policy, education, and design responses can be used to mitigate manipulation.
John Perrino is a policy analyst at the Stanford Internet Observatory where he translates research and builds policy engagement around online trust, safety and security issues.
