Over the weekend, millions of Americans partook in a powerful strategic myth by heading to see the next offering from the Star Wars franchise: Solo: A Star Wars Story.

Considering its generational range and cultural reach, Star Wars may be the most important story about war since Thucydides wrote The Peloponnesian War. And its influence on the military has been profound: forward-stationed soldiers call their counter-mortar protection “R2-D2,” former four-star generals let phrases like “disturbance in the Force” slip into their interviews, and a recent $10 billion Pentagon contract has been nicknamed “JEDI.” Most notably, the Strategic Defense Initiative, a Reagan-era space-based missile defense program, quickly earned a catchier nickname: “Star Wars.”

While some will inevitably dismiss the films as fairy tales or space operas, their immense grip on the American imagination means that understanding them, and other myths, is important to unlocking American attitudes toward war and strategy.

Yuval Noah Harari’s influential book Sapiens: A Brief History of Humankind noted that humanity’s rise is “rooted in common myths.” According to Joseph Campbell, myths are “stories about the wisdom of life” that have a “sociological” function in that they often validate a “certain social order.” They also can serve a “pedagogical” purpose—to teach us to live better lives.

The Star Wars storyline covers much ground in service to myth: the eternal struggle between good and evil, embodied on screen by the dark-sided Sith, the Empire, and the First Order versus the Force-wielding Jedi, the Republic, and the Rebellion. Each subplot plays out in this long “reel” war, and has at least indirectly influenced the way Americans think about real war.

Historian William McNeill once argued that such myths are at the “basis of human society.” This, he claimed, is because such stories, “based on faith more than fact,” are thought by the masses to be true and are “then acted on whenever circumstances suggest or require common response” as humanity’s “substitute for instinct.” McNeill reminded readers of the seemingly illogical British 1940 stand against the Germans, because “they ‘knew’ from schoolbook history that in European wars their country lost all the early battles and always won the last. This faith, together with a strong sense of the general righteousness of their cause, and fear of what defeat would bring, made it possible for them to persist in waging war until myth became fact once more in 1945.”

Nearly eighty years later, such myths persist on the American side of the Atlantic. At least three American strategic myths, present and pervasive in the post–Cold War era, run in parallel with myths in the Star Wars series. If held onto, they will harm the nation and warp American strategic culture, given that such myths are the “accepted narratives” that often underpin a nation’s strategic culture.

With some recommended alternatives, here are the three myths that need to go the way of the Death Star: suddenly and violently into the vacuum of space.

Technology wins wars.

It’s hard to believe now, but there was a time in the 1990s and early 2000s when the military trotted out techno-optimist buzzwords so plentiful you’d have thought they were dreamed up by a Silicon Valley startup’s sales rep: “full-spectrum dominance,” “Rapid Decisive Operations,” “shock and awe,” and “network-centric operations.” The common link was that they promised to leverage advanced technologies that would be effective at “lifting the fog of war,” in the words of retired Admiral and Vice Chairman of the Joint Chiefs of Staff Bill Owens. Stephen Biddle has summed up this tech-friendly view of war by using clichés: “‘Anything you can see, you can kill,’ and in the future we’ll be able to see everything.” Like calling an Uber, some idealists see future war as entirely “done by push-button at a distance.”

But the long slogs in Afghanistan and Iraq forced a counter-narrative to surface. Critics like former national security advisor and retired Lt. Gen. H.R. McMaster have pilloried the notion that “lightning victories could be achieved by small numbers of technologically sophisticated American forces capable of launching precision strikes against enemy targets from safe distances.”

American forces have learned, through now-seemingly-endless years in the Middle East, that wars cannot be won technologically. You can’t lift war’s fog any more than you can learn to cut clouds with knives.

The relationship between the individual and technology at war is a major plotline in the Star Wars films. It can be seen, embodied really, in the contrast between Darth Vader’s complete integration with his black, life-sustaining, full-tech bodysuit and his son Luke Skywalker’s robo-replacement hand. Here we see a struggle between one individual who has submitted almost entirely to technology and another who relies on it far more sparingly. Or, on a grander scale, the Empire’s enormous Death Star versus the Rebellion’s modest technological means. Campbell spoke to this theme in the films: individuals “should not submit” entirely to technology, but “command” it instead (though, of course, he acknowledged that the “problem” is “how” to do so).

As in the films, the modern military’s struggle over the role of technology isn’t over.

Yet, while the philosophical war over the right balance between human and machine will rage on, we should know by now that the myths the military told itself in the 1990s and 2000s about victory purely through technology must end. Of course, future wars against modern, well-equipped adversaries will require more advanced technology than the messy counterinsurgencies in Iraq and Afghanistan did. But resisting the temptation to expect technology to win our wars for us is vital. We need a new, more balanced myth: perhaps that technology provides tools, but victory and defeat are human.

One kind of war is better than another.

Without the utility of a clear and present Soviet adversary, the planning construct that emerged in the 1990s and early 2000s predicated the size of the US military on the ability to fight two major theater wars simultaneously. The unstated assumption was that if a force was prepared to fight these big wars (say, in Korea) then that same force would also be competent at fighting any and all smaller wars (like Panama). And so planning and training leaned toward titanic, World War II–style slugfests.

Of course, the key issue was that the precise size and duration of such a “major regional contingency” wasn’t very well specified. Analysts have since pointed out that both Afghanistan and Iraq “were much smaller” than the models envisioned as “major” contingencies. So, even two relatively smaller shooting wars caused real friction and raised serious questions about the post–Cold War planning model.

As the wars in Afghanistan and Iraq stressed the force, a debate emerged within the Army over the utility and efficacy of counterinsurgency doctrine. During these arguments, advocates seeking to re-orient the 1990s big-war-leaning military towards waging counterinsurgency warfare after 9/11 made some bold statements that indicated a preference for certain kinds of war.

The writers of Field Manual 3-24 told their readers that counterinsurgency was the “graduate level of war,” implying it was somehow loftier or smarter than other, more direct forms of combat. And Army Col. Robert Cassidy (since retired) wrote that counterinsurgency warfare is “more difficult than operations against enemies who fight according to the conventional paradigm.”

While long-term planning does force certain choices, military and strategic practitioners should cringe at the claim that one form of war is somehow harder or preferred over another. The Star Wars saga shows us this failing. Yoda, in leading the Jedi Council, wrongly put his faith in the minuscule Jedi Order to keep peace in the Republic. By admitting only rare-blooded individuals, the Jedi were so constrained in numbers that they were destined to be useful only in niche scenarios. Like small-unit special operations teams, Jedi were effective in hard-to-reach places at must-arrive-now times, but, ultimately, such a tiny force left the Jedi and the Republic at the mercy of the Empire’s exponential numerical advantage in clone troopers.

Wars are like natural disasters: some are rare and immensely destructive, others more common. As a young Palestinian woman recently said of Gaza, it’s “just a big volcano which [is] about to explode.” Wars obey a power law distribution, meaning there will always be a few big wars with extreme fiscal and human consequences and many smaller wars with relatively fewer consequences. Lewis Fry Richardson’s work from just after the Second World War demonstrated this long ago.
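Richardson’s power-law pattern can be sketched numerically. The snippet below samples event “sizes” from a Pareto distribution; the parameters are illustrative assumptions, not values fitted to real conflict data, and the thresholds for “small” and “extreme” are arbitrary labels for the demonstration.

```python
import random

random.seed(0)  # deterministic for the demonstration

# Illustrative power-law (Pareto) parameters -- assumptions, not fitted data.
ALPHA = 2.0   # tail exponent: larger alpha means extreme events are rarer
X_MIN = 1.0   # minimum event size

def pareto_sample(alpha: float, x_min: float) -> float:
    """Inverse-transform sample: P(X > x) = (x_min / x) ** alpha."""
    u = random.random()
    return x_min / (u ** (1.0 / alpha))

sizes = [pareto_sample(ALPHA, X_MIN) for _ in range(10_000)]

small = sum(1 for s in sizes if s < 2)    # the many "smaller wars"
extreme = sum(1 for s in sizes if s > 50)  # the very few "big wars"
print(small, extreme)  # small events vastly outnumber extreme ones
```

With these parameters, roughly three-quarters of events fall below size 2, while only a handful in ten thousand exceed size 50: many small wars, very few catastrophic ones.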

Thus, we know there’s an ongoing chance we may see some large wars in our lifetimes, that many more smaller-scale wars will erupt during that time, and that we cannot predict with precision when, and which kind, will occur on our watch. And so instead of pushing or pulling toward bigger or smaller wars, the much better guiding myth, or narrative we tell ourselves, is that all wars, no matter the size and type, may have to be fought and must always be won.

Unrivaled American economic strength equals perpetual military dominance.

Long unstated but always present in the post–Cold War years is the notion that great American wealth means unparalleled military strength. The “economic foundations of military power,” in the words of Edward Mead Earle, meant that as long as the American economy stays on top, so will its military. This story’s been told for at least a generation.

But it’s changing. In 2010, it was noted that China barely spent one-fifth as much on defense as the United States, and that the gap between American defense spending and that of its adversaries was enormous:

In 1986, US military spending was only 60% as high as that of its adversaries (taken as a group). Today, America spends more than two and one-half times as much as does the group of potential adversary states, including Russia and China. This means that if the United States were to cut its spending in half today, it would still be spending more than its current and potential adversaries—and the balance would still be twice as favorable as during the Cold War.
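The quote’s arithmetic can be checked directly, using only the two ratios the quote itself supplies:

```python
# Ratios as stated in the quote -- no outside figures introduced.
us_vs_adversaries_1986 = 0.60  # 1986: US spent 60% as much as its adversaries
us_vs_adversaries_2010 = 2.50  # "today" (2010): US spent 2.5x the adversary group

halved = us_vs_adversaries_2010 / 2  # hypothetical 50% cut in US spending

print(halved > 1.0)  # still outspends current and potential adversaries
print(round(halved / us_vs_adversaries_1986, 2))  # ~2.08: about twice the 1986 balance
```

Even halved, the 2010 ratio (1.25) stays above parity, and 1.25 / 0.60 is roughly 2.08, matching the quote’s claim that the balance would remain about twice as favorable as during the Cold War.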

It’s been said before, but in 2018 it appears the American edge is shrinking and may soon vanish entirely. Yascha Mounk and Roberto Stefan Foa recently wrote in Foreign Affairs that countries rated “not free” by Freedom House have risen to 33 percent of global income (up from 12 percent in 1990), a share as high as it was during the gathering-storm decade of the 1930s. They write,

As a result, the world is now approaching a striking milestone: within the next five years, the share of global income held by countries considered “not free”—such as China, Russia, and Saudi Arabia—will surpass the share held by Western liberal democracies. In the span of a quarter century, liberal democracies have gone from a position of economic strength to a position of unprecedented economic weakness.

Is the United States ready to fight in a world where we aren’t the richest on the battlefield? Moreover, at least since the Second World War, America has always had the qualitative advantage of world-beating industry and technology behind it when it went to war. But today, it’s an open question whether Silicon Valley will support and advance American strategic interests.

In Star Wars, the Empire was clearly wealthier and had massive technological advantages over the Rebels. As a result, at several junctures, the Empire’s leaders were overconfident, cocky even, and believed themselves to be mere inches from victory. But in several instances they were beaten by the smaller, scrappier Rebels. Just as in the real world, bigger doesn’t necessarily win.

There’s great danger in the natural arrogance that comes with having lots of stuff that’s better than what the enemy carries. Even worse is thinking you’ve got an edge in resources before the bullets fly and then finding out that’s not actually the case after the shooting starts. So the better myth would be: No matter the resources, whether it’s with sticks or nukes, we must be prepared to fight both effectively and efficiently.

Some of this myth-shedding (or myth-shifting) exercise is simply the natural result of one generation sloughing off the last generation’s ideas that no longer make sense. Like Luke pushing aside his father, or J.J. Abrams taking the helm from George Lucas, or Kylo Ren advising Rey to “let the past die. Kill it if you have to”—this is what happens when one age gives way to another. And with the Greatest Generation moving toward the great beyond, the Baby Boomers heading toward the beach, Gen Xers putting on stars, and the Millennials (and Generation Z) filling the ranks, we should expect and even embrace some new myths.

Because if we’re not careful, American strategists might be beaten by a better myth—and while speed may kill tactically, storytellers win strategically.

(And at the box office.)

 

Maj. ML Cavanaugh is a non-resident fellow with the Modern War Institute at West Point, and co-edited, with author Max Brooks, the book Strategy Strikes Back: How Star Wars Explains Modern Military Conflict, from Potomac Books.

This essay is an unofficial expression of opinion; the views expressed are those of the author and not necessarily those of West Point, the Department of the Army, the Department of Defense, or any agency of the US government.