In the age of digital interconnectedness, an individual service member’s data can become a pivotal factor in operational security. During World War II, the phrase “Loose Lips Might Sink Ships” began appearing on posters. It was a simple request: Keep military secrets out of the civilian domain and out of the hands of adversaries. This early effort at information security contrasts sharply with the complexity of today’s digital landscape. In World War II, secrets were manually encrypted, transmitted via radio, typed on analog typewriters, or spoken and veiled in the dark of night. In the 21st century, the information environment is global, connected, and surprisingly accessible.
Influence Campaigns
Today, an individual’s data shadow can reveal a great deal about them: Financial information, GPS location data, consumer habits, and ideological preferences can all be stitched together to create a composite picture of an individual’s life.1 Further, much of this data is available at low or no cost. The use of apps and web services and the general spread of the Internet of Things (IoT), combined with the expanding field of predictive behavior modeling, are opening new opportunities for accidental disclosure and digital influence every day. With the increasingly widespread adoption of generative artificial intelligence (AI), tailor-fitted influence campaigns are set to be the next wave of information warfare.
For instance, digital AI companions such as Replika or CrushOn could unseal the lips and fingers of lonely service members. For a deployed sailor, a digital companion may appear to be a benign form of entertainment. However, if weaponized properly, it can become a seductive honeypot. The user may disclose intimate details about their life and interests, fall prey to blackmail, or leak information about their specific functions within the military. Each interaction increases the likelihood of such disclosure. This data could then be used to generate a customer journey map, enabling A/B testing and more successful iterations with future users and making the information operation more effective with each target.
Applications like Replika also could be used to create an even sharper image of a sailor through less visible means. Using black-box application permissions and user agreements, an application could gain access to information and services on the user’s device, including social media accounts, facial recognition, microphones, and cameras, among others. Adversaries could then harvest information on a target and the target’s surroundings to help generate a tailor-made information operation aimed at that specific user.
Individual Targeting
For example, an entryway can be created if a sailor is disposed toward a specific polarizing social issue (such as abortion, guns, drugs, race, free speech, or LGBTQ rights). Through advertisements, manipulated social media feeds, deep fakes, and other forms of digital communication, an individual could be nudged by tailor-made mis/disinformation into dangerous dispositions such as antisocial behavior, ideological group-think, or, at the extreme end of the spectrum, violent action. Such dispositions could fester within online echo chambers, increasing one’s confidence in false realities. A sense of unfairness, inequality, or injustice could foster resentment. These constructed realities thrive by exploiting cognitive biases inherent in the contemporary algorithm-dominated infosphere. If left unaddressed, the user could come to believe the hyperreality generated specifically for them. From new recruit to seasoned sailor, no one is safe from this new threat.
Looking Forward
There are several routes through which the Sea Services can be kept safe from tailor-made information campaigns. The first is data education. Informing sailors and other service members of the role their data plays (both on duty and off) can help protect them from targeted attacks. More mandatory training on data discretion, privacy policies, and IoT security would be a step in the right direction. Simulations of how data can be weaponized by adversaries also would be a pragmatic approach to such an abstract concept, particularly in regard to AI.
Second, expanding low-cost access to ad blockers, VPNs, capable antivirus and antimalware software, and other data-obfuscating technologies would aid in keeping data safe.
Third, the Sea Services should support legislation creating extra protective measures for U.S. service members’ data held by technology companies. In 2023 alone, more than five billion records were breached in publicly disclosed data leaks. There is more than enough information in these records—many of them sensitive—to map, manipulate, and move individuals with tailor-made influence campaigns.
As the United States collectively moves forward, it should look back on and revise sentiments such as “Loose Lips Might Sink Ships.” Today it is not loose lips, but rather loose data, that can shape the modern war theater. Today’s slogan should be “Loose Data Steals Minds.”
1. Shoshana Zuboff, “The Age of Surveillance Capitalism,” in Social Theory Re-Wired (London, UK: Routledge, 2023), 203–13.