Social media is a disruptive influence. While arguably it presents no greater disruption than previous historical milestones, such as email, telephones, the telegraph, or the printing press, we are still learning its impact on our lives.
Accelerated by the coronavirus pandemic, this disruptive force has had a profound impact on our physical status quo through increasingly intense competition for our digital attention. That competition permeates, and adapts to, how we navigate the social media platforms that make up our personal digital ecosystems. Whether the avatars we interact with are real or synthetic, marketing-derived artificial intelligence (AI) tools are being employed against populations with hidden agendas or unimaginable consequences, as highlighted in the Netflix documentary The Social Dilemma, or, worse, weaponized without remorse for what could happen in the real world amid the rise of “digital authoritarianism,” as highlighted in the 2020 War on the Rocks essay series.1
The Navy and military at large historically have been slow to adapt to disrupters and evolving threats. But just weathering the trend or relying on centralized public affairs postures to navigate this digital terrain only empowers our rivals and the next extremist threat. We must change our way of thinking and adapt to this new character of competition and potential warfare. Three factors frame the nature of the challenge: mind-set, organization, and institutional courage.
Mind-set
Social media warfare (SMW), for the purpose of this discussion, comprises adapting physical strategies to the digital world, focusing on how our digital manifestations (personal and command avatars) interact across the spectrum of competition, particularly in the gray zone between peace and war. Defining a strategy, employing a posture, monitoring the situation, and coordinating action and assessment across the range of interaction have always been the commander’s business, and they remain so with social media competition. This is much more than traditional public affairs posturing. Maneuvering across social media requires a more agile approach: SMW requires a mission command mind-set, empowered by maneuver, across a distributed environment.
In his 2016 book The Fourth Industrial Revolution, Klaus Schwab listed the top 10 world populations, only they were not all nation-states. Surprisingly, the top three were Facebook, China, and India.2 This shift in what constitutes a population is central to understanding the nature of the digital competition we face. We no longer are competing only between nations at sea, but across digital populations with ambiguous actors, agendas, and potentially explosive social consequences. Figure 1 presents a population comparison, updated for 2021.
The lack of borders in social media represents two sides of a proverbial coin: one being the amalgamation of viewpoints from across far-reaching audiences, the other being the clash of civilizations when those views collide amid algorithms designed to draw in and maintain user attention. Understanding the strategic mechanics under the digital hood of these platforms can inform our own SMW strategies, posture, and education efforts toward our primary audience—our sailors, Marines, civil servants, and significant others—while also helping to develop defensive and offensive strategies against authoritarian actors (state or nonstate). Learning to understand the maneuver space is key.
Organization
The second challenge is how to organize our digital war-fighting posture to compete effectively in the gray space of social media. Digital command presence is becoming just as important as the tangible. The notion that you can just turn off social media fails to appreciate the distributed nature of the environment: just because you have blinders on does not mean your primary audience does.
Fortunately, mission command and mosaic maneuver across distributed maritime operations are well suited to this environment. Deploying an effective digital command presence requires a holistic design, tailored to the commander’s strategy. What is a new recruit’s first impression of your command, formed not just at your physical quarterdeck but at your digital one?3 Is it current; do the links work; do they clearly lead to your social media platforms? Do you verify that a Google search prioritizes the links you want your primary audience to find when they search for your command; do you know how? These tasks sound daunting to the uninformed, but they really are not that hard. It is a matter of applying basic military discipline and attention to detail to our digital quarterdeck, just as we would the physical one. More often than not, the digital is now the first impression. Figure 2 presents a notional digital command design for engaging in social media maneuver.
Institutional Courage
Third, SMW requires courage. Our sailors, Marines, civil servants, and significant others know what courage means in the physical world, but do we think and talk about what it means in the digital one as well? Putting oneself out there in the social landscape, in uniform, requires courage, planning, and, most of all, a strategy.
During my command tour, we drafted a model of what we envisioned our digital presence would look like using simple Post-it notes, similar to figure 2. This was before TikTok and WhatsApp had become as mainstream as they are today. We then drew out how we wanted to use each platform. After brainstorming, we prioritized themes that warranted specific platform investment (time and creative energy). With that basic framework, principles, and strategy in hand, we set out to tackle content. In our situation, the themes were to tell the Navy story, educate recruits and staff to continually learn about their profession of arms, and communicate time-critical or routine information to the widest primary audience. We then set about delegating.
Once you have the intent and some basic guidelines, let your people run with it. Taking that leap into writing and posting substantive content takes the same type of confidence we muster for walking aboard a ship for the first time; the difference is we train and prepare our people for the latter while not really investing in the former. This needs to change—at every level. Our primary audience is critical to our digital and physical strategy, posture, and success. Their education is ever more important in the social media domain to empower a command to deal with ambiguity, uncertainty, adversity, or even hostility.
Understanding the nature and mechanics of trends can help empower collective digital courage, which leads to the next part of the discussion: What makes a trend?
Firestorms, Bots, and AI
In 2017, Air Force Lieutenant Colonel Jarred Prier presented a framework for commanding the trend in social media. Prier suggests actors command the trend through four factors:
• A message that fits an existing, even if an obscure, narrative
• A group of true believers predisposed to the message
• A relatively small team of agents or cyber warriors
• A network of automated “bot” accounts4
A logical question, then, in posturing for social media interaction is, What constitutes a threat, the trend, or an actor? To better appreciate the dimensions of this question, we need to examine the rise of online firestorms and use of bots.
Fireships traditionally were vessels “carrying combustibles or explosives sent burning among the enemy’s ships or works to set them on fire.”5 This nautical asymmetric concept aligns with a discussion of “online firestorms in social media” in the Public Relations Review. In the article, the authors define an online firestorm as “the sudden discharge of large quantities of messages containing negative word-of-mouth and complaint behavior” that, when facilitated by social media, can “gather public attention and develop into a serious reputational crisis.”6 They also describe a prelude, the “paracrisis,” as “a publicly visible crisis threat that charges an organization with irresponsible or unethical behavior,” which, for the sake of the remaining discussion, will be referred to as a digital fireship.7
In their classification of online firestorms, the authors outline three levels of crisis responsibility in the context of situational crisis communication theory: victim (natural disaster or rumor), accidental (technical failure), and preventable (human error or organizational misdeed), with the last carrying the most severe reputational threat.8 They then analyzed trends across Twitter and Weibo social media communities and identified social issues discussed in online firestorms, including illegal activities, freedom of speech, sexual harassment/crime, public safety/health, child abuse/neglect, racism, gender, LGBTQ rights, political contribution, and international relations. Table 1 presents their findings ranked by U.S. occurrence. These themes resonate with our longtime efforts to assess command climates in the fight against destructive and predatory behavior. In light of the services’ perpetual challenges in combating these recurring issues or, more recently, extremism, this research demonstrates the potential value day-to-day social media analysis could provide commanders in identifying emergent issues and potential digital fireships and preventing them from maturing into firestorms.
In an article in Information Processing and Management, the authors summarize a social media bot as “a computer algorithm that automatically produces content and interacts with humans on social media, trying to emulate and possibly alter their behavior.”9 They classify three types of bots: benign (bots that automatically post tweets of emergency alerts, chat bots, and news bots), neutral (bots that post or repost jokes or share nonsense), and malicious (generally operated by a bot master). Table 2 summarizes four classes of malicious bots.
The authors present key historic examples of how bots were used to influence public opinion across a range of Twitter-based activity. Based on their analysis, they warn of the need to anticipate bot-enabled activity or attacks during elections, social and political conflicts, marketing schemes, and popular events. Adapting this to military circles, we should likewise expect a rise in adversary bot activity when online fireship topics emerge. Potential targets could include periods of operational stress, such as preparations for deployments, unplanned extensions, or other emergent news that can be used against our primary audience.
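To make the bot taxonomy concrete, consider the notional Python sketch below. It illustrates the kind of simple, rule-based “bot-likeness” scoring a command tool might start from, using features commonly cited in the bot-detection literature (posting rate, account age, repeated content). The field names and thresholds are assumptions for illustration only, not the cited authors’ method; fielded systems rely on far richer features and machine learning.

    # Notional rule-based "bot-likeness" score; thresholds are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Account:
        posts_per_day: float             # average posting rate
        account_age_days: int            # days since the account was created
        duplicate_post_ratio: float      # share of posts repeating earlier content (0-1)
        follower_following_ratio: float  # followers divided by accounts followed

    def bot_likeness(acct: Account) -> float:
        """Return a rough 0-1 score; higher suggests automated behavior."""
        score = 0.0
        if acct.posts_per_day > 50:              # humans rarely sustain this tempo
            score += 0.35
        if acct.account_age_days < 30:           # newly created accounts are higher risk
            score += 0.20
        if acct.duplicate_post_ratio > 0.5:      # mostly repeated or amplified content
            score += 0.30
        if acct.follower_following_ratio < 0.1:  # follows many, followed by few
            score += 0.15
        return min(score, 1.0)

    # Example: a week-old account posting 80 near-identical messages a day scores 1.0
    print(bot_likeness(Account(80, 7, 0.9, 0.05)))

Such a score would not label an account definitively; it would simply prioritize suspect accounts for human review, consistent with the human-plus-machine balance discussed below.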
Tools for Commanders
While not all encompassing, the digital ecosystem in figure 2 could lead to the typical conclusion: It is unrealistic to expect commanders and staff to pore over all their requisite social media tools at all hours of the day. After all, they have a physical command to run as well! This is where current and future autonomous tools can be leveraged to complement the commander’s decision cycle, helping gain and maintain situational awareness of relevant rising digital trends based on the desired social media strategy.10
In his article “13 Ways to Track Trends on Social Media,” Max Freedman highlights existing analytic options provided by “Google Trends, TweetDeck, Hootsuite, Cyfe, Tumblr, Trending Reddit, Facebook for Media, Geofeedia, Social Mention, Top Hashtags, Personalized Facebook ads, TubeBuddy, and Feedly.”11 These web-based tools represent a rising sector of the tech industry and metadata mining: social media analysis and trend detection. Increasingly, these tools can provide timely insight into emergent social media trends and serve as an early warning to potential digital fireships.
As we look to explore the human-to-machine balance in this regard and empower decision advantage, it is helpful to consider one historical perspective. In a recent Harvard Business Review article, chess grandmaster Garry Kasparov (who famously competed against IBM's Deep Blue in 1997) summarized the research he conducted after his loss to Deep Blue into the combined benefits of human and machine intelligence. He offered that “a weak human + machine + better process was superior to a strong computer alone, and most importantly superior to a strong human + machine + inferior process.”12
So, how do we complement humans and process with machines to gain and maintain a competitive advantage over our potential adversaries?
A recent report published in Computer Science Review, “Machine Learning Algorithms for Social Media Analysis,” includes a taxonomy of tools for developing digital insight. Its definition of business intelligence (methods, architecture, and techniques to turn data into useful information that influences business activities) is important to this discussion.13 Figure 3 presents the authors’ overarching model and serves as a baseline for potential military adaptation.
With the increasing proliferation of smart device applications, a command-tailorable business intelligence application could give commanders fingertip awareness of their social environment, as well as provide alert cueing to potential fireships. Afloat options could include a locally hosted web-based interface with automated emails, similar to MyAnalytics in Microsoft Office 365, while ashore a synchronized mobile app could provide on-demand and emergent cueing.

Two questions are likely to arise, regarding intelligence secrecy and privacy ethics. On the former, there will be occasions to retain attribution awareness for ongoing law enforcement investigations or intelligence collection, but a mutually supporting objective must be to communicate timely threat information directly to commanders for trend cueing and, if warranted, tactical social response. Regarding privacy, there are two relevant points to consider. First, we must always abide by the principles of a free and open digital commons to protect our collective First Amendment rights. While we must remain steadfast in upholding our liberal values abroad, we also must accept the duality that our rivals can ignore these values, and in many cases will; this is the very nature of the gray space competition around us. The second and perhaps more important point is that the data is already in the public sector, being used by both peaceful physical and synthetic actors and a nefarious few. We must learn to identify friend from foe in this domain and invest in our commanders’ ability to do so as well.
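As a notional illustration of the alert-cueing concept, the Python sketch below compares today’s volume of negative command-related mentions against a rolling baseline and cues the commander when the volume spikes. The data source, threshold, and notification step are assumptions for illustration; a fielded tool would draw from the platforms in figure 2 and the business intelligence model in figure 3.

    # Notional digital-fireship cue: flag a spike in negative mention volume.
    from statistics import mean, stdev

    def fireship_alert(daily_counts: list[int], today: int, z_threshold: float = 3.0) -> bool:
        """Return True when today's mention count is a statistical outlier."""
        baseline_mean = mean(daily_counts)
        baseline_sd = stdev(daily_counts) or 1.0   # guard against a zero-variance baseline
        z_score = (today - baseline_mean) / baseline_sd
        return z_score >= z_threshold

    # Example: a quiet two weeks, then a sudden surge of negative mentions
    history = [4, 6, 5, 3, 7, 5, 4, 6, 5, 4, 6, 5, 7, 4]
    if fireship_alert(history, today=42):
        print("Cue the commander: possible digital fireship forming.")

The point is not the statistics; it is that a simple, command-tailored threshold can turn an always-on data stream into a timely cue for the commander’s decision cycle rather than another watchstanding burden.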
As a service, we are on the back foot when it comes to establishing our social media presence, understanding the terrain, and adapting suitable physical strategies to maneuver within it. We have great people who, with a little investment in solid processes and machines, can yet again make the difference in this new environment and seize the tactical competitive advantage. As in the physical world, we must consider the factors of time, force, and space in designing our digital command strategies for social media, and design them we must. Failure to engage with holistically balanced human and AI-assisted tools and strategy will only empower our adversaries to further exploit our digital weakness and gain competitive gray space advantage over our principal audience—our people. It’s on us to act. We can no longer take social media warfare for granted. #AreYouReady?
1. Netflix, The Social Dilemma (2020). This documentary highlights the intentional strategy behind social media applications to gain and maintain user attention through various marketing-based algorithms. See also the War on the Rocks four-part series on “digital authoritarianism” (2020). One of the series authors, Steven Feldstein, defines digital authoritarianism as the use of digital tools to “surveil, repress, and manipulate domestic and foreign populations.”
2. Klaus Schwab, The Fourth Industrial Revolution (Geneva: World Economic Forum, 2016). Original “Top 10 Populations” graphic included in the book annex “Shift 2: Our Digital Presence,” discussion derived from the McCrindle.com “Social Media and Narcissism” blog post.
3. The term “digital quarterdeck” was introduced to the author during the 2015 prospective commanding officer course at the Navy Leadership and Ethics Center in Newport, Rhode Island, during discussions on command presence.
4. Lt Col Jarred Prier, USAF, “Commanding the Trend: Social Media as Information Warfare,” Strategic Studies Quarterly (Winter 2017): 59.
5. Merriam-Webster, “Fireship.”
6. Sora Kim, Kang Hoon Sung, Yingru Ji, Chen Xing, and Jiayu Gina Qu, “Online Firestorms in Social Media: Comparative Research between China Weibo and USA Twitter,” Public Relations Review 47, no. 1 (2021): 1.
7. Kim et al., “Online Firestorms,” 2.
8. Kim et al., “Online Firestorms,” 3.
9. Mariam Orabi, Djediga Mouheb, Zaher Al Aghbari, and Ibrahim Kamel, “Detection of Bots in Social Media: A Systematic Review,” Information Processing and Management 57 (2020): 4.
10. J. Boyd, “A Discourse on Winning and Losing,” briefing slides, Document No. M-U 43947 (Maxwell Air Force Base, AL: Air University Library, 1987).
11. Max Freedman, “13 Ways to Track Trends on Social Media,” Business.com, 19 October 2020.
12. David De Cremer and Garry Kasparov, “AI Should Augment Human Intelligence, Not Replace It,” Harvard Business Review, 18 March 2021, https://hbr.org/2021/03/ai-should-augment-human-intelligence-not-replace-it.
13. T. K. Balaji, Chandra Sekhara Rao Annavarapu, and Annushree Bablani, “Machine Learning Algorithms for Social Media Analysis: A Survey,” Computer Science Review 40 (2021): 12.