
Systems Thinking and National Intelligence

March 17, 2021

By Gregory F. Treverton

NOTE: The views expressed here are those of the author and do not necessarily represent or reflect the views of SMA, Inc.

I am hardly an expert on systems thinking, but I have long thought that the practice of intelligence analysis would be enriched by more explicit attention to it. After all, what intelligence is trying to understand are political systems, often complicated ones made up of equally complicated sub-systems.[1]

Table 1 presents a summary of my take on the major tenets of systems thinking and their implications for intelligence analysis.

Table 1: Summary of Principles and Practices of Systems Thinking in Intelligence

Systems Thinking | Implications for Intelligence
Always ask what is there? What isn’t there? | Looking for unknown unknowns
What holds the present in place? | How durable is the status quo or current trends? What wildcard might drive toward a very different outcome?
The answer lies in the question being asked | The answer lies in the question being asked
Focus on emergent properties and on plausibility, not likelihood: what more than when | Too much focus on likelihood and best bets, too little on consequence
Path dependence: developments that do not equal the sum of the parts | Often too linear in causality
Look for power that hasn’t been exercised yet | Look for power that hasn’t been exercised yet: Greta Thunberg
Articulate assumptions in EVERY context. Test assumptions. Watch for assumptions that something you haven’t named is at work | Assumption is the parent of f***-up. Looking for and probing anomalies is critical
Biases are always at work | Intelligence is aware of many biases, like confirmation, but others are buried: “Americans” and the search for Osama bin Laden
Traditional disciplines are just one way to sort our intellectual life. Disciplines create borders, blind spots | Ditto; important to create teams that include non-SMEs and different generations
Look for slow and fast variables. Look for movement that creeps, and for edges | Most “failures” result from mind-sets that missed creep: e.g., Ebola

Happily, many of the tenets are already observed in intelligence analysis, but perhaps not explicitly enough. Systems thinking’s admonition to look for what isn’t there translates for intelligence into Donald Rumsfeld’s “unknown unknowns,” things we don’t know we don’t know, as distinguished from “known unknowns,” those things we know we don’t know.[2] By the same token, while intelligence frequently asks what wildcards might drive the assessment dramatically away from the best bet, explicitly asking what holds the present in place would compel more attention to the present as a system, not simply as a given.

Systems theory and intelligence both understand that how the question is asked takes you much of the way to the answer; indeed, if I write another book on intelligence, I already have the title: Asking the Right Question. In the infamous 2002 National Intelligence Estimate on Iraqi weapons of mass destruction, the right question was critical but very unlikely to be asked: why might Saddam Hussein pretend he had weapons he didn’t? Asking not what is likely but what is plausible is a nice antidote to the tendency of intelligence to pay too much attention to likelihood and too little to consequence: big consequences merit attention even if they are unlikely or of unknown probability.

Intelligence pays lip service to path dependence, for instance in devising indicators that would be way-stations en route to a particular outcome. Again, more explicitness would be helpful: if the best bet is X, what would it take by way of path to result in 4X? Intelligence is pretty good at assessing power balances that exist but not good at imagining where new sources of power might emerge, and on that score subject matter experts are of little help: they are the ones who find it hardest to imagine major discontinuities. They would not have imagined that an autistic Swedish teenager, Greta Thunberg, would become a major global force on climate issues.

So, too, intelligence worries about assumptions, especially half-buried ones. When I left chairing the National Intelligence Council (NIC), I gave my closest colleagues a cut-glass paperweight I’d had made in China. In letters small enough to be inconspicuous was printed my favorite reminder for intelligence (and perhaps for life): “assumption is the parent of f***-up.” Systems thinking adds the further injunction to look particularly for factors you haven’t named that might be at work.

Intelligence is aware of many biases, like the search for information validating your favorite hypothesis, “confirmation bias” in the jargon. Yet here, too, systems thinking’s admonition that biases are always at work is helpful. While at the NIC, I started work on unconscious biases and was struck by a conversation with a woman who had been on the Central Intelligence Agency (CIA) team trying to locate Osama bin Laden. The team, she said, finally realized that a critical but unidentified source of bias affected all their work: they were all Americans.

Intelligence suffers less than academia from the dominance of disciplines, our inheritance from eighteenth-century Germany; in my stints inside intelligence agencies, I was mostly unaware of the disciplinary training of my colleagues. Intelligence does kowtow, appropriately enough, to subject matter experts (SMEs). Yet because those SMEs are the ones least likely to imagine discontinuities, intelligence has to take care to assemble teams that mix experts and non-experts, seasoned analysts and brash newcomers. That is all the more important because, while intelligence agencies are now more diverse in gender and, to a lesser extent, ethnicity, their cognitive diversity is still limited: the constraints of being investigated for a security clearance and passing a polygraph do tend to deter or eliminate wild and crazy thinkers!

Finally, if, as I have long believed, intelligence is telling and adjusting stories, systems thinking reminds us that narratives can gallop but also creep, and they have edges. A striking example comes from medicine: Ebola. The medical profession had a story about the disease: it would flare up in isolated African communities but, because it was so lethal, would kill its victims before they could spread it, and thus the isolated hot spots would die out. The trouble was that Ebola was embedded in a system in which the creep of progress had created better rural-to-urban transport. Victims no longer died before they could infect others. The story was true until it wasn’t.

[1] For a nice introduction, see Donella H. Meadows, Thinking in Systems, Earthscan, 2009.
[2] These distinctions were not new with him, but he used them famously at a press briefing while Secretary of Defense, on 12 February 2002. I have thought the category he neglected to mention, unknown knowns—those things we don’t know or don’t remember we know—is also critical, for instance that men from the Middle East had been interested in flying lessons before the summer of 2001.

Edited by Dick Eassom, CF APMP Fellow
Published on March 17, 2021, by SMA, Inc.