Today’s American military is, arguably, the most tactically adept fighting force in the world—perhaps of all time. It is, without question, the best-resourced military in human history. Our technological advantage is unprecedented, as cutting-edge hardware and software platforms deliver extraordinary capabilities in areas ranging from SIGINT to targeting to command and control. Taken together, the United States wields a tactically, financially, and technologically superior warfighting machine.
Why do we have so little to show for it? What accounts for our twenty-first-century inability to translate tactical excellence, technological dominance, and near-boundless resources into durable strategic outcomes in our post-9/11 “long war”?
There is a long list of potential scapegoats. Misguided political leadership. Imperial overreach. The proliferation of complex asymmetric threats. Hyper-partisan domestic politics. The horizon-lowering influence of the twenty-four-hour news cycle.
The list goes on.
However, when confronting our recent strategic woes—from Afghanistan to Iraq and beyond—the defense establishment must step forward to shoulder its share of the burden. We are present en masse from the front lines of conflict to the inner corridors of power. We have opportunities to shape debates and affect outcomes at every level. A problem of this sort, and of this magnitude, is one that we must own and confront within our ranks.
The Intelligence Cycle is Broken
A key part of the answer lies in what’s known as the “intelligence cycle”—the process through which we investigate, analyze, and decide to act upon the world around us. This process has developed systemic, structural flaws. The platforms through which we gather information, and the processes and mediums through which we conduct analysis, are not adequately capturing ground truth. We are not integrating a granular, nuanced understanding of locality—and the potential strategic implications thereof—into the intellectual foundations of our strategic thinking. This has corrupted our ability to root strategic thinking in the realities of the battlefield. Instead, when strategic decisions are made (by men and women who are, inevitably, both physically and psychologically isolated from the front lines), debate takes place in a virtual reality that has been constructed by the intelligence cycle—and that may bear only a passing resemblance to the facts on the ground.
A confluence of factors has led us to this point:
1. The Ascendance of Technology and the Primacy of Targeting
Technological innovation has revolutionized tactical intelligence in the twenty-first century. Most dramatically, technology has driven extraordinary advances in our targeting capabilities. The global reach of our targeting platforms is unprecedented and unmatched. We are able to track the enemy with a diverse and sophisticated suite of tools, fixing him in time and space, so that we might bring our exceptional lethality to bear. However, a growing focus on targeting has drawn front-line attention away from deeper strategic concerns. This, in turn, has affected the inputs that we feed into the intelligence cycle in subtly pernicious ways.
Appreciating this dynamic is critical to understanding why our targeting excellence has not delivered comparable success at the strategic level. It is a root cause of why we appear to be engaged in a never-ending game of whack-a-mole with our enemies.
Our view of the fight at the tactical level, from counterterrorism operations in the Sahel to counterinsurgency in Afghanistan, is structured around our view of the enemy. Put another way, the operational networks of our enemies are the framework through which we see the battlefield, and the targeting process is our lens.
In and of itself, this is perfectly natural. Why shouldn’t front-line military units take an enemy-centric view of the battlefield?
The problem is twofold:
First, our current approach to network targeting takes an extremely limited view of the enemy. Our men and women on the ground are zeroed in on tactical intelligence about the enemy—the who and where. Questions of strategic intelligence—the why—are marginalized.
Technology has been a driving force behind this phenomenon. With the advent of dynamic targeting software platforms, our actions at the tactical level now center on feeding inputs into the technological tools that underpin the targeting cycle. We conceptualize the enemy in link analysis charts, and we strive to “connect the dots” and generate actionable intelligence.
The resulting reach and specificity of our knowledge is extraordinary. We are able to map out shadowy global networks with speed and precision. Yet this knowledge lacks depth and substance. Link diagrams may be geo-located, thus ostensibly connecting the enemy to locality, but our reporting processes provide little incentive to root our understanding of the enemy in meaningful local context. We excel at connecting the dots and mapping the network, but our view of the enemy is two-dimensional.
Second, this enemy-focused lens has been transposed to the strategic level, with disastrous consequences. Our two-dimensional, network-centric view of the enemy is a limitation at the tactical level, where it inhibits our ability to anticipate second- and third-order effects. It is a catastrophe at the strategic level, where it frames our worldview. Our tactical-level representation of the enemy as a “Palantir Bonsai Tree” (as opposed to as an organic outgrowth of local society) has become the intellectual framework for strategic decision-making. No wonder, then, that our efforts to prune limbs prompt new growth in unanticipated directions.
2. The Fetishization of Data
As front-line intelligence analysts have become consumed within the targeting process, strategic decision-makers have doubled down on technology and data as a means to understand the battlefield. With fewer and fewer substantive, qualitative inputs into the intelligence cycle, we have compensated by harvesting ever-larger quantities of data.
Dramatic breakthroughs in the fields of big data, predictive analytics, and artificial intelligence are pulling us further and further toward a “data-driven” understanding of the world. We break the battlefield down into measurable parts. Those parts are then measured by a diverse array of sensors. From that baseline, we then establish metrics that can be objectively assessed over time, in a process that can, increasingly, be automated.
The intellectual appeal of this approach is obvious. The defense establishment, similar to any large organization engaged in complex operations in dynamic environments, likes quantitative data. It is clean. Objective. Unambiguous. Scalable. It enables the clear measurement of progress, and the indisputable demonstration of results.
The problem, however, is that quantitative data is reductive. On the battlefield, we can quantify and measure an extraordinary range of things, from incidences of violence to the price of bread to the movement of displaced people. Yet once we quantify something, stripping away its contextual meaning and turning it into a data point, it loses all of its explanatory power. A quantitative data set cannot tell us anything about the significance of changing rates of violence, price fluctuations, or patterns of migration. Is an uptick in violence the result of the enemy’s growing strength? Is it tied to a rogue commander who has broken with the enemy’s central leadership? Is it the final death throes of an insurgent movement that has lost local support? The data cannot tell us. Interpretation requires deep, localized, contextual understanding. Yet this sort of information is not being adequately captured by the intelligence cycle at the tactical level—and, critically, only tactical-level personnel have the firsthand access to ground truth that is essential to acquire this information.
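The point can be illustrated with a toy example (a hypothetical sketch; the district labels and incident figures are invented for illustration):

```python
from statistics import mean

# Hypothetical weekly incident counts for two districts. Both series
# produce identical aggregate statistics, yet the underlying stories
# differ: an entrenched network operating at a constant tempo versus a
# sudden late escalation (perhaps a commander breaking with central
# leadership). The numbers alone cannot say which story is true.
steady = [10, 10, 10, 10, 10, 10]   # constant tempo
surging = [2, 4, 8, 12, 16, 18]     # sharp late escalation

print(sum(steady), mean(steady))     # totals and averages match...
print(sum(surging), mean(surging))   # ...so a roll-up briefing that
# reports only aggregates would describe these districts identically.
```

A summary statistic strips away exactly the contextual signal—the shape and the cause of the trend—that interpretation requires.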
As such, our strategic reliance on quantitative data to compensate for a paucity of substantive, qualitative understanding is dangerously misguided. We are asking data sets to explain what is happening on the battlefield—but the data itself has nothing to say. Strategic-level planners and policymakers are being fed vast quantities of de-contextualized data points, to which they (or those around them) are compelled to ascribe meaning.
The rapidly growing size of our data streams is particularly dangerous in this respect. The very bigness of our data imparts an illusion of understanding. If we have terabytes of data, after all, surely we must know what we are looking at? But data sets, no matter their size, can never answer the question “why?” Indeed, context-free data sets can be structured to say virtually anything—and this is where our shift toward quant goes from being an analytical limitation to a terrifying strategic liability. In a highly politicized environment where no one in the room has an intuitive feel for ground truth, and where qualitative context is largely absent from the intelligence cycle, the field is left open for mistaken assumptions, political manipulation, and old-fashioned careerist bullshitting.
3. The Cult of The Operator
Our preoccupation with targeting, and our growing reliance on technology-driven quantitative analysis, have proceeded hand in hand with a shift in tactical-level organizational culture. Following the example of our most elite units, the ideal of “the operator” has taken root among the men and women on the ground. It is now the archetype of professional competency. Tactical lethality is championed, at the inevitable, albeit unspoken, expense of sophistication and strategic effect.
Fueled by our targeting prowess, this has fed into a growing anti-intellectualism. Superficially, this is not an unexpected development. Killing the enemy is the core business of the military. The targeting process is a natural lens through which tactical elements of the military should view the world. It is logical that our most prolific kinetic targeting capabilities will command respect and admiration.
However, if no one at the tactical level is looking beyond the immediate demands of the targeting process to collect substantive and meaningful contextual detail on the enemy, then that information will never enter the intelligence cycle. Instead, it will be left to others (who lack direct access) to invent narratives that ascribe meaning to our network targeting packages and quantitative data sets.
No one, except our front-line personnel, has access to ground truth. Yet with few exceptions, front-line intelligence analysts are not asked to think strategically, to wrestle with ambiguity, and to make big-picture sense of local-level uncertainty. High-level strategy is something that is done by others, elsewhere. “Every Soldier is a Sensor,” goes the mantra, but we have calibrated our sensors to maximize the uptake of reductive, quantitative data points, while we indulge the false humility that tactical-level personnel “are just grunts” and thus beneath the plane of strategic thinking.
A Wealth of Data, A Poverty of Insight
The status quo is a recipe for yet more unproductive tactical excellence. The prevailing currents in the defense sector, meanwhile, are pulling us further toward the extremes noted above. Technology is ascendant. Big data, algorithmic processing, predictive analytics, machine learning, and artificial intelligence are the buzzwords of the day among our best and brightest. The zeitgeist tells us that the future lies with the large-scale quantification of the world around us, and we are following corporate America’s lead toward the technologization of everything.*
Total information awareness is a realistic objective, the technologists tell us, because of impending advances in the industry. Our ability to leverage technology toward tactical objectives is already the driving force behind how we operate on the battlefield. Looking ahead, the ability of technology to convey strategic understanding is central to our thinking about war and intelligence in the twenty-first century as well. Future intelligence analysis, according to this vision, will be grounded in automated data collection and analysis platforms that deliver both tactical acuity and strategic clarity, harvesting and processing unfathomable quantities of data from sources as diverse as social media platforms, classified reporting databases, and weather satellites.
The American military will continue to pursue technology-driven solutions. On the one hand, it will enable us to get better and better at targeting. All organizations like to focus on their strengths. Yet there is a looming danger that our tech-enabled excellence at the point of execution will be held up as validation that technology is the path to success at all levels, and that technology can deliver strategic results as well. On the other hand, technology will deliver an increasingly compelling illusion of situational understanding. As we are able to harvest more and more data, and then to process and visualize that data in ever-more dynamic ways, it will become increasingly tempting to believe that we simply must know what is happening—that the fundamental reductiveness and explanatory impotence of quantitative data has been transcended by sheer volume.
Yet, the fact that we have achieved so little while pursuing this track to date should give us pause. The allure of a high-tech, plug-in solution to understanding the world must be tempered by an appreciation of what technology and quantitative data can and cannot do.
What if our wholesale embrace of technology—as the lens through which we see the battlefield, the brain that processes its dynamics, and the central nervous system that guides and shapes our actions thereupon—is a root cause of our recent strategic malaise? What if a complex human environment, packed with layers of historical, cultural, and social meaning, and inextricably intertwined with political and economic systems, cannot be broken down into patterns of ones and zeroes and then reconstructed in any remotely meaningful way? What if, instead, it presents us with increasingly complex patterns of correlation that we are increasingly ill-equipped to contextualize?
How Can We Fix the System?
First, let us be clear about what we should not do. Of course, we should not (and could not) disavow technology. This is not a call for a return to an idealized past, where human intelligence reporting reigned supreme, and long-form narrative was the standard medium. Nor is it a proposal to inject reams of speculative, unstructured, qualitative text into the intelligence cycle, or to flood the intelligence community with academics.
What we must do, however, is re-structure the intelligence cycle so that contextual detail is fed into the military’s central nervous system. It is not a question of “cultural awareness.” It is not an abstract, open-ended inquiry into “local context.” It is a question of adding an essential layer of depth and meaning to what has become a two-dimensional targeting process that is, in turn, driving an increasingly reductive and de-contextualized intelligence cycle.
Our current embrace of network-centric targeting must be expanded to incorporate economic, social, and political context. We must force qualitative, granular detail into the intelligence cycle at the tactical level in a structured, methodologically consistent fashion. We must develop a three-dimensional view of the enemy from the bottom upwards, which captures a network’s connectivity to local environment. From this knowledge, we can attack not only the nodes and linkages within the enemy’s network, but also the network’s linkages to locality.
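The structural difference between a bare link diagram and the three-dimensional view argued for above can be sketched in miniature (a hypothetical illustration; every node name and attribute below is invented):

```python
# The bare link-analysis view: nodes and their connections only.
network = {
    "facilitator_A": ["courier_B", "financier_C"],
    "courier_B": ["cell_leader_D"],
    "financier_C": ["cell_leader_D"],
}

# The contextual layer: how each node is embedded in local society.
# This is the dimension a pure targeting graph omits.
local_context = {
    "facilitator_A": {"tribe": "X", "role": "marketplace elder"},
    "financier_C": {"business": "cross-border livestock trade"},
}

def describe(node):
    """Combine a node's network links with its local embedding."""
    return {
        "node": node,
        "links": network.get(node, []),
        "context": local_context.get(node, {}),
    }

print(describe("facilitator_A"))
```

The design point is simply that locality becomes a first-class attribute of the graph, so that an analyst can attack (or anticipate) the network’s linkages to its environment, not just its internal nodes.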
Technology and data have vital roles to play in this process, but they cannot deliver the needed insights by themselves. This will require contextual inputs and structural direction from a cadre of highly skilled front-line personnel who can leverage technology as a force multiplier of human expertise. Elements of this capability will be tech-centric, as we continue to reap the extraordinary tactical advantages provided by technological innovation. Yet our work must be executed with an ethnographer’s ear for meaning, and a historian’s eye for context—and situated in an organizational culture where the collectors, producers, and consumers of intelligence possess a shared understanding of the limitations of quantitative data.
This sounds academic. It may even sound pretentious. But in fact, the way ahead is straightforward. Academics have cloaked the skill sets of academia (and of the social sciences in particular) in deliberately complex language, presenting them as things that require uniquely academic expertise. Academia, as a whole, has charted a course toward obscurity, celebrating its “cult of irrelevance.” Yet academics are not priests. It is not their place to reveal or withhold sacred wisdom. The methods and the literature are open and available to us all, and both have much to offer in regard to our current challenges. There is nothing to prevent us from demystifying and utilizing academic skills for ourselves, and integrating academic methods into front-line intelligence collection and analysis.
The integration of structured, qualitative detail into the intelligence cycle is an essential step in the rehabilitation of our strategic thinking. Macro-level strategic debates cannot be allowed to proceed without connectivity to micro-level detail. Our leadership must be forced—by the nature of the inputs that we direct into the intelligence cycle—to engage with contextual nuance. The facts on the ground must be integral to strategic debate, and they must be provided by the men and women on the front lines (rather than projected over data sets by rear-echelon analysts). Technology alone cannot supply the insights that we need to make the right choices, at the right moment, for the right reasons.
* For a discussion of the downside to Corporate America’s infatuation with technology-driven solutions, see Sensemaking: The Power of the Humanities in the Age of the Algorithm, by Christian Madsbjerg (Hachette Books, 2017). Madsbjerg is a Danish strategy consultant and advocate of the value of human intuition. The book offers a range of highly relevant lessons for the defense sector, related to the limitations of quantitative data and the ways in which technology affects our view of the world around us.
I disagree. The job of an intelligence professional is to answer the commander's or policymaker's priority information requirements at the tactical level; these are referred to as PIRs. A battalion intelligence officer's job isn't to determine what the prime minister of country X is thinking. His job is to provide his commander with the information he needs to make accurate decisions on the battlefield, be that who the top insurgent is in the AO or where the enemy tank company is. National-level intelligence is where these root-cause questions should be asked.
It's not the intelligence cycle that's broken; it's the questions that are being asked of it.
So basically the problem is the DoD CCRP and its last remaining "Revolution in Military Affairs" (RMA) project: Information-Centric/Network-Centric Warfare. Concepts like "power to the edge" are cool-sounding ideas based entirely on Toffler's "Third Wave," but what we got instead is closer to the dysfunction of "Future Shock." The "strategic corporal" envisioned by the CCRP is the information equivalent of the "cult of the operator." What is produced instead is the potential to create a PVT Manning.
So what you're really calling for is ditching the last bits of RMA, and if so, I say it would not be too soon to do so. But what is needed, beyond ditching RMA, is an unbiased and complete assessment of how it was created, why it failed, and how to keep it from coming back.
The need to keep it from coming back can be seen in the cyclic nature of its recurrence: WWII strategic bombing, Vietnam-era statistical analysis, modeling, and control (command and control via spreadsheet by the "Whiz Kids"), and more recently all that wasted and counterproductive RMA stuff.
What the author describes as problems with the intelligence cycle (or intelligence in general) are not caused by intelligence information or the way it's collected and produced — they're the questions being asked. Link analysis charts and big data sifting didn't produce a "whack-a-mole" strategy…they're byproducts of that strategy. Within that strategy, intent — "why" — is largely meaningless, because, in keeping with a long tradition in U.S. policy, those making it are looking for the easy solutions. Deriving intent and the forces behind it, let alone changing it, is slow and often difficult. Targeting is so much easier….
The other misperception the author has is about what intelligence provides. It's not "ground truth"…or at least, not more than a snapshot in time and space, regardless of how it's collected. That's why a consolidated intel picture is an estimate — it's brief observations connected through analysis. Nor is it predictive…at best, analysis identifies trends, and can extrapolate from there.
The author makes frequent assertions that decision makers and intelligence analysts have become overly reliant on data and technological solutions. He seems to generalize that intelligence analysts are not incorporating local, contextual data into their assessments to get at the big picture. It's not clear what evidence he has, other than a hunch, that intelligence analysts are not incorporating contextual data. I think this is a faulty assumption. The aggregated, correlated data CAN and DOES provide essential clues and patterns that assist the analyst in coming to a specific conclusion or analytical judgment. I think most analysts–even the front-line, local-level analysts–are not the "grunts" he seems to assert them to be. He also seems to assume that analysts in front-line units, such as those supporting "operators" or intensively involved in targeting, do not take an academic approach. Again, I think this is a faulty assumption coming from someone in academia. Analysts have been incorporating the aspects of PMESII-PT into their assessments for at least the last 15 years or so. Both approaches–data-driven and academic, qualitative and quantitative–are very important to providing quality intelligence support to operations and ultimately tactical, operational, and strategic mission accomplishment.
The thematic elements of this article seem to be spot on. We do have a wealth of data, much of which lacks context, and we all want to enable the trigger puller no matter what echelon we occupy. So it seems that part of the problem not addressed here (that I could find) is the need to "be where you are" in the analytic hierarchy. The tactical analyst must analyze tactically; the "why" that analyst is looking for is not the same "why" an analyst at a national-level agency is looking for. The tactical analyst's job isn't to determine banking trends unless that points to a target that can be destroyed; similarly, the national-level analyst should not be focusing on finding the bed-down location of a bad guy unless it gives context to a wider analytic theme.
The data is, in general, available to us all, but that means that we can easily stray from our lanes into other people's business when we should be focusing on providing context in a way that they don't have time to do for themselves. It might not be as fun or rewarding being the strategic analyst who develops long term analytic estimates as it is being the target developer on the battlefield, but the former is just as important within context as the latter is to immediate operational needs.
The author claims the intel cycle is broken while never actually describing the intel cycle, nor where the intel cycle is broken, nor where it should be improved. I suspect he does not know the intel cycle well enough to describe how collection results in data, and how analysis results in a product, which contains the meaning and context he claims is missing.
The author also claims the targeting cycle is partially to blame for breaking the intel cycle, but fails to accurately describe how the intel cycle and targeting cycle currently interact. Again, I suspect he does not know either cycle well enough to articulate how they interrelate, much less how one breaks the other. His argument could be strengthened if he understood the difference between "targeting" and "lethal target engagement" and used those two distinct terms correctly.
While he makes some good points, particularly about DoD always looking for a technological solution, his claim that "the intel cycle is broken" is largely unsupported in this essay.
I've been engaged with this problem for most of my military and civilian career, and I believe this is an important subject for leadership to read and absorb, because it speaks to the necessity of accounting for the relationship of strategy and tactics throughout the intelligence cycle. Machine data integration is very good at immediate tactical representation but lousy at deeper strategic context. The question of when to execute tactical actions, and why you shouldn't or don't need to, isn't being held accountable in our systems. Moreover, strategic machine learning cannot simply be incorporated into the operation order when exclusionary discrimination built into system design prevents the necessary deeper acquisition and processing of co-relevant data, information, and knowledge. Operators expect and desire to function on absolute facts arising from data, in spite of the reality that the "truth of perception" problem is inevitably present within a noise-to-knowledge acquisition matrix; to base the intelligence cycle on any narrower formula is a misrepresentation and betrayal of the needs and expectations of the operational enterprise. This is a complex, deeply rooted Gödel problem: we ask incomplete questions and expect a complete, singular answer. The idea that there is one best solution to any given problem is a mathematical fallacy. Intelligence, like mathematics, involves many systems of logic that are interrelated and interdependent with one another. For an operational intelligence view to remain optimal, it must function as part of oppositional strategic foundations based on the best available knowledge, not just data and information. Without recognition of this incongruence, we are essentially allowing the system design of machine intelligence to decide when to pull triggers.