The U.S. national security apparatus has bet big on its ability to influence Beijing’s behavior to deter or delay conflict with China. It is a risky bet—potentially blind—because there is strong evidence that foreign-directed influence does not move unsympathetic audiences. The science of influence and persuasion exists in the sales and advertising domains, but no one has developed such a science for national security.
Today, it is not known whether foreign influence campaigns work, whether they are counterproductive, or whether they have any effects whatsoever. Those questions could be answered with a serious research program. However, to this point, the United States is just playing at influence, doing this or that on a pop-psychology whim, hoping something will happen, and almost never detecting whether anything did. It is an irresponsible approach to a problem of immense stakes.
The American philosopher Richard Rorty was known for provocative statements about truth, one of which is, “Take care of freedom, and truth will take care of itself.”1 His argument was that truth gets all the protection it needs from freedom alone. Rorty held controversial views, but this one is especially unpopular today. Much of the academic and government work on information warfare not only warns against the dangers of mis- and disinformation, foreign malign influence, cognitive domain operations, or cognitive warfare, but also advocates for media literacy education to shore up Americans’ so-called cognitive security.
Research skeptical of these claims is largely unadvertised. There is no disagreement that Russia, China, and others wage disinformation campaigns to hide unfavorable narratives from, and promote false ones among, both domestic and foreign audiences. There is, however, real disagreement about whether foreign-targeted activities work.
Similar skepticism exists about interventions such as media and information literacy training designed to protect U.S. citizens from foreign influence. Scientific studies of the effects of influence campaigns form a sobering body of research, comprising academic papers with titles such as “Misinformation on Misinformation,” “Avoiding the Echo Chamber about Echo Chambers,” “The Echo Chamber Is Overstated,” and “Why Do So Few People Share Fake News? It Hurts Their Reputation,” as well as studies critical of the field itself: “A Lack of Effect Studies and of Effects: The Use of Strategic Communication in the Military Domain” and “Review of Social Science Research on the Impact of Countermeasures against Influence Operations.”2
In her testimony before Congress on Russian influence in Ukraine, Crisis Group’s Olga Oliker summed the matter up nicely: “While we can establish the presence of a sizable Russian effort in this regard, this begs the most important question: Does any of this work?”3
U.S. adversaries invest heavily in offensive “information warfare” capabilities (a term with no agreed definition).4 China’s investment in foreign information manipulation alone is estimated in the billions of dollars and increases annually.5 Beijing uses state-owned media and funds outside journalists to shape international coverage of China. It uses counterfeit accounts on YouTube, Twitter, and Facebook—many of which are regularly discovered and taken down—and produces disinformation on issues including COVID-19, People’s Liberation Army (PLA) activities in the South China Sea, the internment of Uyghurs, Russia’s invasion of Ukraine, and anything else that counters Beijing’s official narratives. Russia’s known information operations include its use of social media to divide Americans during the 2016 presidential election and attempts to undermine color revolutions in former Soviet states. Both China and Russia arrest, abduct, jail, disappear, assault, and kill sources of undesirable information.6
In essence, information warfare is simply harnessing information for one’s own warfighting advantage. Terminological and conceptual confusion around “information warfare” and related categories of influence persists, and that confusion is a common criticism of mis- and disinformation studies as a field of research.7 Within government, too, the Department of Defense frequently changes its doctrinal vocabulary for information, influence, and related concepts—dropping terms, adding new terms, modifying definitions, and even resurrecting old terms.8
Two Information Warfare Types
Broadly, there are two information warfare types: command-and-control (C2) manipulation and psychological influence. The former refers to improving and defending one’s own C2 while disrupting an adversary’s—depriving it of information or feeding it false or misleading data. This is typically accomplished through electronic signature control—hiding one’s signature, spoofing it, and finding and fixing enemy signatures. It is a back-and-forth of signature detection, jamming, antijamming, counter-antijamming, and so on. Cyber tools can defend and attack C2 as well, even with something as simple as a power outage. Importantly, this type of information warfare is used primarily at the tactical level and requires no special insight into a target’s psychology or cognitions. It may be informed by knowledge of an adversary’s institutional decision-making processes; tactics, techniques, and procedures (TTPs); and kill chains—processes that take place among individuals and across units rather than inside a single person’s head.
The other type of information warfare—psychological influence—is used primarily at the strategic and operational or campaign levels and, unlike C2 manipulation, does rely on information about the psychology and cognitions of individuals or groups (e.g., local populations, military leaders, etc.). This type of information warfare is very much in vogue.
China, Russia, Iran, and North Korea conduct information warfare externally and internally—against their own populations. In both directions, they promote truths and falsehoods that benefit them and restrict those that could harm them. This is accomplished through heavy internet controls and restrictions on speech and the press. Newspapers and websites are shut down or nationalized. Russia’s detention of Wall Street Journal reporter Evan Gershkovich and China’s arrest and detention of Apple Daily newspaper owner Jimmy Lai are two examples. Beijing seized and shuttered Apple Daily, one of the last independent media publications in Hong Kong.9 Excluding journalists, China jails more writers than any other nation, according to PEN America’s 2023 Freedom to Write Index. It is followed by Saudi Arabia; Iran and Vietnam (tied for third place); Israel (including the occupied territories); and Russia and Belarus (tied for sixth place).
These tactics are pursued internationally as well, though with less success. To influence international narratives, authoritarian states turn to clandestine influence, subversion, and sabotage. Beyond social media, they use international law, money, and politics to shape narratives—just as they use the same methods to mold the rules-based order to their advantage.
In competition, the manipulation of news through social media, state-owned media, and financially compromised foreign media appears mostly ineffective.10 While the size and number of these operations are large, their returns are few. Most followers of Chinese-run Facebook campaigns turn out to be Chinese-owned or -purchased bots.11 Alarmists warn that AI-generated news and deepfakes will soon be indistinguishable from real news, and that populations will be easily fooled or come to doubt their ability to discern truth at all—though it seems at least as likely that consumers will become more skeptical and discerning. Deepfake detectors are already available. Indeed, U.S. adversaries have invested so heavily in these ineffective influence methods that it bears considering whether we should be spending tax dollars to counter them.
What the Research Shows
Russian influence operations have little effect on the attitudes of Americans or other populations distrustful of Russia.12 And Chinese operations can be comically inept. Researchers at the Stanford Internet Observatory wrote this of Beijing’s COVID-19 disinformation campaign in 2020:
Very few of the accounts in this network achieved any sort of significant reach or engagement, and many of the narratives the accounts promoted have been previously observed in past takedowns. The continued lack of focus on plausible persona development is notable. The operation is primarily interesting from the standpoint of confirming the commitment of the CCP to leverage all of the operational capabilities at its disposal to influence the global public on matters of national importance. Particularly in the context of the coronavirus pandemic, we have now observed a full spectrum of propaganda operations spanning both overt, attributable state media, and covert social media persona accounts.13
Four years later, China’s disinformation apparatus has made little progress. The State Department Global Engagement Center’s September 2023 report detailed how China expanded its range of operations to control global information through information manipulation, overt and covert influence, the buying off of journalists and platforms, intimidation, and the development, use, and export of surveillance and censorship technologies.14 Yet, when it comes to the effects of these operations, the report warns only that “the latest developments in artificial intelligence technology would enable the PRC to surgically target foreign audiences and thereby perhaps influence economic and security decisions in its favor” (emphasis added).
In a well-known example of ineffective influence operations, Russian actors used the opening of a mosque in Texas to create opposing Facebook groups to recruit protesters and counterprotesters. Approximately 10 white nationalist protesters and 50–100 counterprotesters showed up and shouted at one another across a street. No violence occurred.15 In another event organized by a Russian Facebook group, 5,000–10,000 protesters marched to Trump Tower from Union Square after Trump’s election in November 2016.16 Evaluating the Russian operation’s influence is complicated by the fact that the march was the fourth such Trump protest in New York City that November and one of many across the country.
Two cases often claimed as evidence of disinformation’s harmful effects are the 6 January 2021 attack on the U.S. Capitol and COVID-19 vaccine hesitancy. Those examples have at least face validity, and though the evidence and arguments for them are beyond the scope of this essay, they raise questions that demonstrate the complexity of determining the conditions under which influence operations may work and the distinct causal contributions of multiple independent variables. For example, if disinformation about the 2020 presidential election causally contributed to the 6 January attack, to what degree did the effect depend on the disinformation having a domestic, rather than foreign, source? Similarly, in assessing the effects of COVID-19 mis- and disinformation, to what degree does it matter that most interactions with misinformation happen when people seek out views with which they already agree?
None of this speaks to the potential consequences of combined attacks that pair disinformation with kinetic effects. Even if disinformation’s value is short-lived, it could be of outsized benefit in a crisis or the chaotic opening hours of conflict. Today, a deepfake video of the U.S. President saying a ransomware attack has compromised coastal electric grids might do little harm before it is quickly debunked. The same video released simultaneously with cyberattacks causing blackouts at the start of actual conflict could cause substantial confusion and disarray—even delaying a defensive military response.
At present, though, the United States is engaged in competition and striving to avoid conflict. The National Security Strategy describes the moment thus: “The United States and our allies and partners have an opportunity to shape the PRC and Russia’s external environment in a way that influences their behavior even as we compete with them.”17 In other words, the National Security Strategy is largely an influence strategy. For that reason, it bears considering whether and how well influence works in the national security domain, and under what conditions.
When Psychological Influence Might Be Effective
It is an open and contentious question whether nuclear deterrence, as an influence strategy, works. Some subspecies of nuclear deterrence may work while others do not. Nuclear deterrence has not prevented other nations from obtaining nuclear weapons. If anything, it has encouraged nations to pursue nuclear weapons to deter their adversaries, many of whom are already nuclear-armed. It is possible, but difficult to show, that nuclear weapons deter conflict escalation above a certain threshold with other nuclear nations. That may be one reason the United States has not directly attacked Russia in defense of Ukraine; however, that hypothesis has only to fail once to fail for everyone.
Diplomatic relations and negotiations may be the quintessential example of influence operations and activities. They work at least sometimes, and when they do, they produce tangible outcomes such as peace deals and signed agreements—more than can be said for most instances of psychological-influence information warfare. The military exception par excellence of a tangible, psychological-influence-style information warfare outcome is the surrender ceremony—like Japan’s surrender on 2 September 1945 on the deck of the USS Missouri (BB-63) in Tokyo Bay. Apart from eradication, warfare—even the warfare of World War II—is essentially an influence activity.
Russia’s successful influence activities in Crimea—home to a large population of Russian Black Sea Fleet sailors and veterans already inclined to agree with Russian narratives—and the consensus among researchers that these are the circumstances under which influence operations can have an effect suggest that some subpopulations in the United States may be susceptible to specific instances of foreign malign influence. “Russia is our friend,” for example, was among the slogans white supremacists chanted at Charlottesville.18 But while there will always be some U.S. citizens and service members predisposed to believe adversarial messaging, mis- and disinformation interventions, such as media literacy education, are unlikely to benefit them. Theirs is a problem of values—not of defending against malign information.
A Science of Influence for National Security
Our current understanding of the foreign-malign-psychological-influence type of information warfare, including mis- and disinformation, does not justify the media, policy, and popular obsession and fear on daily display. More, and more rigorous, investigations of effects and effect sizes are needed. It should first be determined whether Chinese and Russian investments and operations pose any real threat before resources are redirected to unproven concepts such as media and information literacy. Today, there is more and better evidence for the proposition that U.S. policy-makers should encourage Russia and China to increase their investments in ineffective influence operations—leaving them fewer resources to spend on bombs and bullets. That could change, of course, with investment, time, and experience. More concerning could be the difficult-to-predict effects of combined psychological and kinetic attacks on U.S. soil; however, such scenarios have garnered little attention.
Electronic and cyber warfare, intelligence collection, and counterintelligence are not only advantageous but also necessary in modern conflict. There may not be controlled studies demonstrating their effects, but militaries relying on them know the consequences of malfunctioning radars, jammers, antijammers, and poor cyber defenses.
Studies skeptical of the efficacy of mis- and disinformation operations do not show that they have no effects. They show, rather, that we do not yet know if they are, or can be, effective—and if they can be, under what circumstances. Answering those questions requires rigorous, controlled studies that both detect effects and measure effect sizes. The National Security Strategy and National Defense Strategy state that the United States will harness foreign influence to shape the behavior of the governments of Iran, North Korea, Russia, and China. The buzzwords of influence and perception management are everywhere in Washington today, but the national security apparatus has demonstrated no scientific understanding of foreign influence, nor any desire to develop one.
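To make concrete what such a study would report, the sketch below is a purely illustrative example, with hypothetical data, groups, and scale, of one conventional way to quantify an effect size: the standardized difference in attitude change between an audience exposed to a campaign’s content and a control group, with a confidence interval around the estimate.

```python
# Illustrative only: a minimal sketch of how an effects study might summarize results.
# The data, groups, and attitude scale are hypothetical; nothing here reflects an actual study.
import numpy as np

rng = np.random.default_rng(0)

def cohens_d(exposed: np.ndarray, control: np.ndarray) -> float:
    """Standardized mean difference (Cohen's d) between two independent samples."""
    n1, n2 = len(exposed), len(control)
    pooled_var = ((n1 - 1) * exposed.var(ddof=1) + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2)
    return (exposed.mean() - control.mean()) / np.sqrt(pooled_var)

def bootstrap_ci(exposed, control, n_boot=5_000, alpha=0.05):
    """Percentile bootstrap confidence interval for Cohen's d."""
    ds = [
        cohens_d(
            rng.choice(exposed, size=len(exposed), replace=True),
            rng.choice(control, size=len(control), replace=True),
        )
        for _ in range(n_boot)
    ]
    return np.quantile(ds, [alpha / 2, 1 - alpha / 2])

# Hypothetical pre-to-post attitude shifts (arbitrary units) for people who saw a
# campaign's content (exposed) versus a matched group that did not (control).
exposed = rng.normal(loc=0.10, scale=1.0, size=500)  # small average shift
control = rng.normal(loc=0.00, scale=1.0, size=500)  # no shift

d = cohens_d(exposed, control)
low, high = bootstrap_ci(exposed, control)
print(f"Cohen's d = {d:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```

Cohen’s d is only one conventional choice; the larger point is that a serious effects claim should come with a measured estimate and its uncertainty rather than an anecdote.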
To date, the few existing effects studies suggest that attempts to leverage psychological influence against foreign audiences have limited effectiveness and depend largely on an audience’s prior attitudes toward the messenger and the message. Ukraine, for example, has successfully lobbied for international aid and support, but that success is limited to countries that already favored Ukraine—or at least opposed Russia.
Authoritarians’ domestic operations are another story entirely. For people living behind China’s Great Firewall, under constant, ubiquitous surveillance, the Communist Party controls both the information to which they have access and the information they can disseminate. While Beijing’s attempts at global information dominance have been futile, the party-state’s internal efforts are disturbingly effective. Ministry of State Security Vice Minister Yuan Yikun (formerly Yuan Peng) put it this way: “What is truth and what is a lie is already unimportant; what’s important is who controls discourse power.”19 Perhaps that is not surprising to historians of the autocratic regimes of the 20th century. What it suggests, though, is that in open, democratic societies, taking care of freedom—democratizing discourse power—is the best defense against foreign manipulation. Rorty’s claim to “take care of freedom, and truth will take care of itself” implies, in other words, “that if people can say what they believe without fear, then . . . the task of justifying themselves to others and the task of getting things right will coincide.”20

If the United States wants to use influence operations to shape Russia’s and China’s behavior, and to avoid war, it needs to get serious about the science of influence and institute a rigorous program of research that applies that science to the national security domain. The stakes are too high for anything less.
1. Robert B. Brandom, ed., Rorty and His Critics (Malden, MA: Wiley-Blackwell Publishers, 2000), 347.
2. Sacha Altay et al., “Misinformation on Misinformation: Conceptual and Methodological Challenges,” Social Media and Society 9, no. 1 (January 2023); Andrew Guess et al., “Avoiding the Echo Chamber about Echo Chambers,” Knight Foundation 2, no. 1 (2018): 1–25; Elizabeth Dubois et al., “The Echo Chamber Is Overstated: The Moderating Effect of Political Interest and Diverse Media,” Information, Communication & Society 21, no. 5 (2018): 729–45; Sacha Altay et al., “Why Do So Few People Share Fake News? It Hurts Their Reputation,” New Media & Society 24, no. 6 (2022): 1303–24; Claes Wallenius and Sofia Nilsson, “A Lack of Effect Studies and of Effects: The Use of Strategic Communication in the Military Domain,” International Journal of Strategic Communication 13, no. 5 (2019): 404–17; and Laura Courchesne, Julia Ilhardt, and Jacob N. Shapiro, “Review of Social Science Research on the Impact of Countermeasures against Influence Operations,” Harvard Kennedy School Misinformation Review 2, no. 5 (September 2021).
3. Olga Oliker, “Russian Influence and Unconventional Warfare Operations in the ‘Grey Zone’: Lessons from Ukraine,” Statement before the Senate Armed Services Committee Subcommittee on Emerging Threats and Capabilities (2017).
4. Sarah White, “The Organizational Determinants of Military Doctrine: A History of Army Information Operations,” Texas National Security Review 6, no. 1 (Winter 2022/2023): 51–78; and Mark Pomerleau, “Out: ‘Information Warfare.’ In: ‘Information Advantage,’” C4ISRnet, 29 September 2020.
5. U.S. State Department Global Engagement Center, “How the People’s Republic of China Seeks to Reshape the Global Information Environment,” 28 September 2023.
6. Kaela Malig, “How Russia’s Press Freedom Deteriorated over the Decades Since Putin Came to Power,” PBS Frontline, 26 September 2023; and Reporters Without Borders, “China,” September 2023.
7. Tim Hwang, “Deconstructing the Disinformation War,” MediaWell, Social Science Research Council (June 2020).
8. White, “The Organizational Determinants of Military Doctrine.”
9. Austin Ramzy and Tiffany May, “Hong Kong Arrests Jimmy Lai, Media Mogul, Under National Security Law,” The New York Times, 9 August 2020.
10. Zuri Linetsky, “China Can’t Catch a Break in Asian Public Opinion,” Foreign Policy, 28 June 2023.
11. Carly Miller et al., “Sockpuppets Spin COVID Yarns: An Analysis of PRC-attributed June 2020 Twitter Takedown,” Stanford University Freeman Spogli Institute for International Studies (June 2020).
12. Oliker, “Russian Influence and Unconventional Warfare Operations.”
13. Miller et al., “Sockpuppets Spin COVID Yarns.”
14. U.S. State Department Global Engagement Center, “How the People’s Republic of China Seeks to Reshape.”
15. Christopher A. Bail et al., “Assessing the Russian Internet Research Agency’s Impact on the Political Attitudes of American Twitter Users in Late 2017,” Proceedings of the National Academy of Sciences of the United States of America 117, no. 1 (September 2019): 243–50.
16. Ali Breland, “Thousands Attend Protest Organized by Russians on Facebook,” The Hill, 31 October 2017.
17. Joseph R. Biden, National Security Strategy of the United States of America (Washington, DC: The White House, October 2022).
18. David Neiwert, “Explaining ‘You Will Not Replace Us,’ ‘Blood and Soil,’ ‘Russia Is Our Friend,’ and Other Catchphrases from Torch-Bearing Marchers in Charlottesville,” Southern Poverty Law Center, October 2017.
19. Matt Pottinger and Mike Gallagher, “No Substitute for Victory,” Foreign Affairs 103, no. 3 (May/June 2024).
20. Brandom, Rorty and His Critics, 347.