Now that You Know More, Would You Do It Anyway?

As indefatigable optimists, re-booters strive to improve our attitudes, techniques, and approaches, using past mistakes as a basis for learning. And, as any re-booter knows, we get things wrong all the time. People zig when we were certain they'd zag. We take wild risks, underestimate the significance of others' random rumblings, and look for confirmation of our expectations rather than watching for signs of deviation. Everybody does this. When we screw up, get it wrong, or are left dumbstruck by someone's actions, it's usually straightforward in hindsight to weave the narrative together. "Oh," we mutter, "now I see. Huh. I guess I should've realized… If only I'd known, I would've done it differently." Maybe yes, maybe no.

Any of this sound familiar?

The thing about hindsight is that having more or better information doesn’t mean that we would’ve made a different choice. Knowing in advance that your boss is an irredeemable asshole doesn’t mean you’d be any less shocked when they turn their guns on you. There are some things we just can’t bring ourselves to admit because what happens afterwards is so awful to contemplate. I guess what I’m trying to posit is that information has its limits in terms of influencing our decisions. Repeatedly, I am astonished by the murky, counter-intuitive depths of the human psyche. How much of our obtuseness is naiveté and how much is willful blindness?

The topic of intelligence failures and whether or not more knowledge would've made any difference is the subject of today's post. All too often, we say to ourselves, "If only I had known…" Yeah, right. That's what we tell ourselves, but even we can't fool ourselves all the time. Sometimes knowing more would not only have failed to change the outcome, it would've made everything harder. That's an odd thing to say, isn't it? Having more knowledge doesn't always enhance our position or alter our choices. Sometimes, we'd do it anyway, even knowing as much as we do today. Huh. We are a reckless, random bunch, we hominids.

These days, I am wading my way through a dense (and recently declassified) report on intelligence failures using the fall of the Shah of Iran and the allegations of WMDs as case studies. It’s a fascinating read in which the author deconstructs the flow of intelligence and why certain developments were ignored or glossed over. To me, such breakdowns in understanding have as much relevance to our personal lives as they do for the US Government.

"People are almost always too slow to take account of new information… sudden and dramatic events have more impact on people's beliefs than do those that unfold more slowly… people can assimilate each small bit of information to their beliefs without being forced to reconsider the validity of their basic premises. They become accustomed to a certain amount of information which conflicts with their beliefs without appreciating the degree to which it really clashes with what they think."

In his text, the author stresses the critical importance of considering alternative explanations, arguing that doing so leads to overall better analysis even if the conclusion remains the same. In other words, spending the time and energy to speculate what else might explain why something happened probably won’t change your overall conclusion, but what it will do is make that conclusion that much more solid because you’ve actually thought it through. For instance, there are going to be several ways to explain why your spouse is gone every Saturday. Taking the time and trouble to run through these possibilities gives you a more reasoned and thoughtful basis for your assessment.

Does any of what I am saying make sense?

Years ago, had I subjected my own situation to this sort of scrutiny, I would’ve been uncomfortably aware that the odds of X happening were far higher than I told myself—which is precisely why I didn’t do so. I was afraid. I was naïve. I didn’t want to deal with the implications of drawing such a conclusion. And, I might have been wrong. Instead, I opted for the Slow Boil School of Frog Cooking.

Of course, it's always easier to see patterns and problems when we have some distance from the situation, which is precisely the distance the analysts lacked when it came to reporting on the health of the Shah's regime. There were too few analysts with too much to do, and they had neither the time nor the institutional incentive to sit back and assess what was happening, which is why so few alarm bells were sounded in the summer of 1978. Afterwards, it was too late.

So, too, with you. You're swamped with carpools and deadlines and payrolls to meet. You know you've been stressed, so whatever little hiccup occurred last night was probably just momentary thoughtlessness on your (or their) part. Put it behind you; it's not important… Re-read that quote I cited above.

Does this apply to you?

I haven't finished my Intelligence Failures book yet, but it's certainly got me thinking. Would I have stayed where I was for as long as I did if I'd known what was coming? I still don't know. I still can't answer. It's sobering to subject ourselves to such scrutiny and realize we wouldn't have made any different decisions, especially when the end result was so agonizing. Fatalism is not what I'm driving at in this post. And many times, we're set on a path and there's no way around it. But what can we do with our greater base of knowledge today to avoid similar situations in the future? How can we minimize our own intelligence failures from here on out?

