Editor’s note: This article is part of a series, “Compete and Win: Envisioning a Competitive Strategy for the Twenty-First Century.” The series endeavors to present expert commentary on diverse issues surrounding US competitive strategy and irregular warfare with peer and near-peer competitors in the physical, cyber, and information spaces. The series is part of the Competition in Cyber Project (C2P), a joint initiative by the Army Cyber Institute and the Modern War Institute. Read all articles in the series here.
Special thanks to series editors Capt. Maggie Smith, PhD, C2P director, and Dr. Barnett S. Koven.
Imagine a future where moral and cognitive battles are waged with well-crafted narratives delivered and manipulated by an intricate web of simple and sophisticated cyber, information, electronic, and psychological warfare tools. The modern information environment, and how we interact with it, allows perceptions to be shaped in seconds with a retweet, a share, a like, or a download. With the internet, access to information is instantaneous—until it’s not—and autocratic rulers are increasingly taking advantage of their populations’ reliance on the internet and the information environment in times of unrest and upheaval by shutting off access. Most recently, internet traffic was cut off in Kazakhstan, as authorities in the petrostate tried to quell unrest over rising fuel prices and sow confusion among protestors by disrupting communications and popular messaging platforms (e.g., Telegram and Signal). What autocratic rulers clearly understand is the power of social connection and how the internet and information environment facilitate idea mobilization and narrative affinity across populations. Ultimately, we do not have to imagine a future war to recognize that the next conflict will include an information component, and the US military will need to contribute to perception and information management in the cognitive dimension as a core element of future battles.
However, what the US military struggles to understand is how the information environment pervades modern strategic competition, and how, within the information environment and cyberspace, a myriad of roles remains underexplored and underutilized. Since its inception, the US military has conceived of a war-peace duality, but for our adversaries—Russia and China, since at least the time of Lenin and Mao—there has only been competition or open conflict. These powers recognize that this competition is principally nonkinetic and will largely (though by no means exclusively) play out in third countries (e.g., in Central Asia instead of in the US, Chinese, or Russian homelands), and they are quite comfortable with proxy engagements. Moreover, integrated nonkinetic approaches are central to their competition engagement strategies. For example, Russia’s concept of information warfare integrates cyber and information operations into military and nonmilitary activities, during both peace and war, and is aimed at eroding cohesion in target societies and undermining rival states’ leadership. China’s concept of “Three Warfares” emphasizes and integrates public opinion warfare, psychological warfare, and legal warfare.
The US military, by contrast, is just starting to play catchup. While emphasis on operations in the information environment and the cyber domain is certainly increasing, the balance of the military’s attention remains focused on force-on-force engagements during declared conflicts. Much of the time, information and cyber are given supporting roles for kinetic operations, but recently, US Army Cyber Command announced a shift in focus from information warfare to “information advantage” for “decision dominance,” and is working to bring the concepts to the forefront of how the Army fights. However, just what information is, and how it impacts the force, remains uncertain. To fully understand the threats posed by human interactions within and use of the information environment, the Army needs to demonstrate how those threats impact its legal warfighting responsibilities, and those of DoD more broadly. A large chunk of these responsibilities comes down to a simple formula: the Army must man, train, and equip a land force prepared to answer the nation’s call. This construct, it turns out, is extraordinarily useful in illustrating the Army’s—and, by extension, the US military’s—vulnerabilities in the information environment and the steps it should take toward building a cohesive and competitive information strategy.
The Army’s two primary manpower responsibilities are (1) attracting high-quality recruits and (2) retaining experienced and capable servicemembers. To effectively meet those mandates, the Army must consider the information environment and the narratives about military service that shape perceptions of it. Currently, several narratives designed to degrade public trust in the military are circulating. Worryingly, these narratives may be working—as evidenced by recent polling data showing that only 56 percent of those surveyed (down from 70 percent in 2018) have “a great deal of trust and confidence” in the military. Distrust in an institution is easy to exploit—China, for example, has spread disinformation about the origins of the COVID-19 virus, falsely claiming that the virus was created for biological warfare purposes at Fort Detrick, Maryland—the same Army facility that a Russian disinformation campaign claimed created the AIDS virus back in the 1980s. Narratives surrounding sexual assault and the recent failures to protect soldiers experiencing harassment and abuse are also exploitable—such narratives may deter a potential recruit, shape his or her views on military service, or frighten a parent into persuading a child not to enlist or commission in the first place. Even narratives around the US military’s inability to win wars have been shown to degrade recruiting and retention efforts.
Retaining experienced soldiers also requires the Army to consider and assess the information environment to truly understand the context in which soldiers make decisions about their careers. Career choices are influenced by a soldier’s unit, family, and children, and the experiences of their peers. Therefore, narratives that home in on and highlight the failures of military leadership inevitably shape the command climate and the general feeling about staying in or getting out of the military, and may stop a servicemember from reenlisting. Additionally, location and broader social narratives matter too. For example, a soldier assigned to Fort Campbell, Kentucky, has different contextual influences than one stationed at Fort Belvoir, Virginia. For the Army to achieve information advantage, it must acknowledge how external narratives influence the supply of ready and able recruits and the retention of its experienced soldiers.
However, addressing preexisting themes and narratives that impact the public perception of the military is insufficient. Our adversaries persistently use our social vulnerabilities to degrade military readiness by impacting the Army’s ability to recruit and retain qualified individuals. The Army’s reactive posture gives our adversaries first-mover advantage and impedes the Army’s ability to achieve information advantage—in part, because mis- and disinformation spread by malign influence campaigns can travel up to six times faster than the truth. To get out in front of malign narratives, the Army must devise better messaging and marketing tactics that account for and address the underlying issues and social cleavages that our adversaries exploit. Specifically, to become proactive in the information environment, the Army needs to understand and predict how and what our competitors and adversaries are going to say, and be ready to deploy solutions ahead of, and in response to, competing and malicious narratives. One solution is teaching critical-thinking skills and inoculating the force by teaching soldiers to become more thoughtful consumers of media and information, especially on social media. With respect to the general population of potential recruits, the training process is a long one, but given the Army’s investment in professional military education and training, the proposition is, in theory, feasible.
Manning is only one aspect of force readiness—soldiers must also be prepared for war. The Army training cycle requires its forces to “focus training on high-intensity conflict, with emphasis on operating in dense urban terrain, [in] electronically degraded environments, and under constant surveillance.” As such, traditional forms of training, and joint and partner exercises, are essential to long-term success on the battlefield. The secretary of the Army has already acknowledged that we will be contested from “fort to port,” identifying how logistics, and the Army’s ability to swiftly mobilize and deploy personnel and equipment around the world, is at risk. Ultimately, the Army needs to consider the effect of malign influence operations on all critical nodes—information and physical—to ensure relevant populations are receptive to a massive inflow of US military equipment and that the adversary will not be able to target local sentiments against US forces.
Examples of foreign operations in the information environment targeting training exercises already exist, and the 2015 Jade Helm exercise conducted by US Army special operations forces is one of the most startling. Jade Helm was a large, but routine, training exercise that took place in Texas and was designed to allow participants to practice “core special warfare tasks for use overseas to help protect our national security interests.” Despite being well coordinated at all echelons of the US government, the training exercise—which was held on military bases and private, state, and federal lands—was shrouded in multiple conspiracy theories about then President Barack Obama. A Russian disinformation campaign pushed false narratives claiming the operation was preparation for the president to declare martial law in Texas, eventually putting the entire exercise in jeopardy. The soldiers participating in the exercise were closely scrutinized, nearly prevented from conducting training, and harassed by conspiracy theorists. To this day, Russia’s false narratives about Jade Helm persist and show how conspiracy theories take root, grow, and become difficult to counter once they enter public discourse.
The information warfare tactics used against Jade Helm could be applied throughout the world, whenever and wherever the US military trains with partners and allies. In fact, we should assume those tactics will be used in the very locations that US servicemembers may be fighting the next war. For example, recent disinformation spread by Russian Defense Minister Sergey Shoigu, and picked up by the Russian media, alleges a provocation plot in the Donbas region of eastern Ukraine that involves US private military companies and chemical weapons. Ultimately, training with partners and allies could become all but impossible if disinformation about the US military and its role in an ongoing conflict are effective and result in undue political pressure being applied to a host nation’s government. Domestically, the same concerns apply—any mobilization of military vehicles outside a military base already stirs conspiratorial conjecturing, and even when a National Guard unit moves from an armory to an installation for training, conversation and concern are visible on social media.
Therefore, the Army needs to revise how it thinks about training its forces, in both domestic and foreign settings, to account for the information environment. Just as the US government has long understood that multinational training exercises send signals to the global community about US interests, partnerships, and capabilities, those signals can also be manipulated by adversaries. Similarly, domestic exercises that mobilize military units and federal, state, and local authorities also send signals that can be misinterpreted and exploited to sow uncertainty and distrust at home. To gain and hold information advantage, the Army must assess the information environment before, during, and after domestic exercises—just as it does internationally—to understand the narratives surrounding the training and troop movements and to predict, preempt, and ultimately prevent false narratives from taking hold. Training the force takes place within the information environment, and moving away from the mindset that domestic troop movements are somehow different from mobilizations abroad is a key first step to conducting effective training in the new information environment.
While manning and training mostly fall within the purview of the individual service components, equipping America’s armed forces relies on heavy investment from outside entities—both public and private. Weighing in on defense acquisitions is a myriad of organizations, from the intelligence community and think tanks that identify risks and threats, to defense contractors that develop and build new systems, to the elected and appointed officials who determine the defense budget and provide oversight—and all are susceptible to foreign malign influence campaigns. For example, the intelligence community could unwittingly rely on false information planted by an adversary’s military deception campaign intended to misrepresent how much that adversary is investing in new technologies. More alarmingly, general confusion about research findings can be created and exploited to push procurement in the wrong direction, leaving the Army ready to fight the wrong war. Ultimately, the military must understand foreign operations in the information environment as ongoing and as not necessarily intended to have immediate effects. Instead, information operations are part of our adversaries’ strategic efforts to obfuscate, deceive, and sabotage the US military and to degrade US power over time.
Cyber-enabled information operations against critical information systems are another persistent risk that the Army must acknowledge. With advancements in machine learning and automation, and the Army’s adoption of such technologies, adversaries could lull the end user into a sense of security while utilizing model poisoning and contamination attacks to generate confusing and inaccurate predictions. Soldiers already take their defense systems for granted because of the vast asymmetry of capabilities between US and enemy forces during the post-9/11 wars, and to fight the next war, the Army will have to relearn the fundamentals of intelligence for when those systems no longer work. The other, less likely avenue is a foreign adversary conducting influence operations that cause end users to lose trust in the systems they rely on to do their jobs. Such an attack could render the time, money, and training spent on critical systems irrelevant, and information advantage mandates an awareness of how adversaries can use the information environment to influence the lethality of the force.
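The poisoning risk described above can be illustrated in miniature. The sketch below is purely hypothetical—a toy nearest-centroid classifier on invented two-dimensional data, not any fielded defense system—but it shows the mechanism: an adversary who can inject mislabeled samples into the training data quietly shifts the model’s decision boundary, so an input the clean model classifies correctly is misread by the poisoned one while everything appears to work normally.

```python
def centroid(points):
    """Component-wise mean of a list of 2-D points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def train(samples):
    """samples: list of ((x, y), label). Returns one centroid per label."""
    by_label = {}
    for point, label in samples:
        by_label.setdefault(label, []).append(point)
    return {label: centroid(pts) for label, pts in by_label.items()}

def predict(model, point):
    """Assign the label of the nearest class centroid."""
    def dist2(label):
        cx, cy = model[label]
        return (cx - point[0]) ** 2 + (cy - point[1]) ** 2
    return min(model, key=dist2)

# Clean training data: two well-separated clusters (hypothetical values).
clean = [((0.0, 0.0), "benign"), ((0.2, 0.1), "benign"), ((0.1, 0.3), "benign"),
         ((5.0, 5.0), "threat"), ((5.2, 4.9), "threat"), ((4.8, 5.1), "threat")]

# Poisoned copy: the adversary injects points inside the "threat" cluster
# that carry the "benign" label, dragging the benign centroid toward it.
poisoned = clean + [((4.9, 5.2), "benign"), ((5.1, 5.3), "benign"),
                    ((5.0, 4.8), "benign"), ((5.3, 5.0), "benign")]

clean_model = train(clean)
poisoned_model = train(poisoned)

probe = (3.0, 3.0)  # closer to the threat cluster than to the clean benign one
print(predict(clean_model, probe))     # the clean model labels it "threat"
print(predict(poisoned_model, probe))  # the poisoned model labels it "benign"
```

The end user sees no error or crash in either case, which is precisely the point of the attack: the degradation surfaces only as subtly wrong predictions, and detecting it requires auditing the training data and the model’s behavior, not just its availability.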
Finally, even though much of the Army’s equipment is manufactured and assembled in the United States, military procurement remains reliant on foreign parts and production (including from competitors and adversaries) for subcomponents and materials. A 2016 report from the Office of the Under Secretary of Defense for Acquisitions, Technology, and Logistics highlighted how semiconductors and integrated circuits, and their availability, are at high risk. The systems that rely on these items are also put at risk—everything from weapons targeting systems to the computers military members use daily. Taiwan alone manufactures over 50 percent of the world’s most advanced, ten-nanometer chips, supplying both Chinese and American companies. And, when the United States is reliant on a specific foreign part or technology, the supplier is also at risk—our competitors and adversaries could use influence operations to manipulate the source country into shutting off the supply of key components the United States needs to fight.
In their book LikeWar, Peter W. Singer and Emerson Brooking ominously wrote that “there’s no historical analogue to the speed and totality with which social media platforms have conquered the planet.” Authoritarian governments around the globe have recognized the power inherent to information access—as depicted in the recent examples of Kazakhstan and Sudan—and America’s peer competitors, like China and Russia, spend vast resources on information and cyber operations targeting both their backyard neighbors—Ukraine and Lithuania, for example—and nations around the globe. The manipulation of the information environment and the cognitive impact of malign influence campaigns are now widely known, especially after Russia’s information and cyber operations targeting democratic elections and China’s government-sponsored disinformation about COVID-19. But, in the modern information environment, the United States is far behind our adversaries in appreciating the potential for information operations to undermine our military’s ability to man, train, and equip.
Ultimately, the Army has taken the first steps toward recognizing the vulnerabilities inherent to the ubiquity of the information environment by pivoting away from information warfare—a term that preserves the peace-war dichotomy that is irrelevant in competition—toward achieving information advantage—a term that appreciates the information environment’s moral and cognitive aspects and its relevance to military readiness. But until all Army leaders—from maneuver to logistics to cyber—understand how information can be used to degrade military effectiveness, the Army will remain vulnerable in the next conflict. China and Russia already view war as information-centric—and Iran is not far behind—where all other aspects serve to shape the information environment. Until the US military reconfigures how it mans, trains, and equips its forces to account for the information environment, the next war will be far more difficult than it needs to be.
Maj. Joe Littell is a US Army psychological operations officer and research scientist at the Army Cyber Institute at the United States Military Academy.
Capt. Maggie Smith, PhD, is a US Army cyber officer currently assigned to the Army Cyber Institute at the United States Military Academy where she is a scientific researcher, an assistant professor in the Department of Social Sciences, and an affiliated faculty of the Modern War Institute. She is also the coeditor of this series and director of the Competition in Cyberspace Project.
The views expressed are those of the authors and do not reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.
Image credit: Sgt. Dustin D. Biven, US Army