Unknowns, errors and the Iraq War

Originally published in the Herald-Mail

Thomas A. Firey Mar 9, 2016

Give Donald Trump credit. He usually tells crowds whatever they want to hear, no matter what the truth might be. But at last month’s South Carolina Republican debate (Feb. 13, 2016), he boldly delivered a hard truth to a crowd that didn’t want to hear it.

“Obviously, the war in Iraq was a big, fat mistake,” he said to a chorus of boos. “We should have never been in Iraq. We have destabilized the Middle East.”[1]

Being The Donald, he then took things too far, saying of the George W. Bush administration: “They lied. They said there were weapons of mass destruction, there were none. And they knew there were none.”

Ever since U.S. forces failed to find Iraqi nuclear or chemical weapons, Bush’s supporters and critics have offered competing narratives of the war. Supporters say there were weapons, but Saddam Hussein’s regime must have hidden them or smuggled them out of the country. Critics claim Bush knew there weren’t any weapons, but he launched the war for personal gain or to benefit American oil companies.

Those narratives stoke partisan passions, but they’re both absurd. Bush supporters can’t explain why Hussein would get rid of powerful weapons on the eve of a war for his very survival, nor can they say where the weapons went or how U.S. reconnaissance failed to notice them. Bush critics can’t explain how he stood to benefit from the inevitable discovery that the weapons didn’t exist, nor can they say what American oil companies have gained from the restoration of their giant competitor, the Iraq National Oil Company.

Beyond scheming leaders and disappearing weapons, there’s a much more sensible explanation of the war—one that also sheds light on many other costly mistakes made by both government officials and ordinary citizens. It involves how people deal with unknowns.

Life is full of unknowns, of course, and many of them are especially worrisome. Will my furnace work the next time it’s cold? Will my employer pay me this week? Is my fever the result of a simple virus or something more serious?

We handle unknowns by collecting information: I test my furnace, pay attention to the financial news, and see my doctor if I continue feeling lousy. Problem is, though the new information may give me a better idea of what’s going on, I still don’t know for sure: my doctor could miss something; the newly repaired furnace might not work. I could use more resources to gather even more information, but that information would still be limited. Besides, resources—especially time and money—are scarce, and using a lot of them on one unknown means I can’t use them in other ways.

Nearly every choice we face leaves us exposed to some uncertainty and risk of error. So what to do? To answer that, think about car alarms.

Several years ago, someone stole the hubcaps off my neighbor’s car. It had an alarm that supposedly detected suspicious vibrations, but the thief was skilled enough not to trigger it. Once my neighbor replaced the hubcaps, he decided he wouldn’t suffer another theft, so he turned the alarm’s sensitivity to maximum. Now whenever the wind blows, the alarm sounds.

My neighbor’s car alarm produced two different types of errors. First, it failed to detect the thief. Analysts call this a Type-II (“false negative”) error, meaning the alarm didn’t detect something it was supposed to. Then, after my neighbor raised the sensitivity, the car began sounding a series of false alarms. These are Type-I (“false positive”) errors, meaning the alarm signals something that isn’t, in fact, there.

The problem with these errors is that attempts to reduce one usually increase the other. So when dealing with unknowns, people—either consciously or subconsciously—have to choose which error they’re more willing to accept.
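To make that tradeoff concrete, here is a minimal sketch of a toy car alarm in Python: the alarm fires whenever a vibration reading crosses a threshold. Every number and helper name below is invented purely for illustration, not drawn from any real alarm.

```python
# A toy "car alarm": sound the siren whenever a vibration reading
# exceeds a threshold. Everything here (the distributions, the
# thresholds, the simulate() helper) is invented for illustration.
import random

random.seed(0)

def simulate(threshold, trials=10_000):
    """Estimate the false-positive rate (wind trips the alarm) and the
    false-negative rate (a careful thief goes undetected)."""
    false_pos = false_neg = 0
    for _ in range(trials):
        wind = random.gauss(2.0, 1.0)   # harmless vibration (wind, passing trucks)
        thief = random.gauss(4.0, 1.0)  # vibration from an actual break-in
        if wind > threshold:
            false_pos += 1              # Type-I error: alarm, but no thief
        if thief <= threshold:
            false_neg += 1              # Type-II error: thief, but no alarm
    return false_pos / trials, false_neg / trials

for t in (2.0, 3.0, 5.0):
    fp, fn = simulate(t)
    print(f"threshold {t}: false-positive rate {fp:.2f}, false-negative rate {fn:.2f}")
```

Run with a low, middling and high threshold, the seesaw is plain: a touchier alarm misses fewer thieves but cries wolf at the wind, while a duller one stays quiet in a storm and sleeps through the theft.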

In late 2002, Bush officials faced a major unknown: whether Iraq had restarted its weapons program. They did have several worrisome indications: Iraq was caught importing suspicious materials. Hussein had grown more bellicose, especially about arms inspections. And his regime had manufactured—and used—chemical weapons in the past. All of that suggested something bad was happening in Baghdad.

But the most important factor in the Bush administration’s decision to invade was what had happened a year earlier. After 9/11, investigators discovered the U.S. government had received some tips about the terrorists prior to the attacks: a few had taken flying lessons but didn’t study landings, several had purchased one-way plane tickets, and “chatter” had increased on the al-Qaeda communications network. Those revelations prompted strong criticism that the government should have detected and thwarted the attack.

Yet those bits of information were tiny and isolated, and they came amid mountains of similar “tips” that proved benign. Still, analytically speaking, Bush officials had committed a Type-II error: their terrorism “alarm” indicated all was well. After 9/11 they turned the alarm’s sensitivity to maximum, and Hussein set it off. The result was a Type-I error: a war launched over weapons that did not exist.

Now, nearly 15 years later, many Bush supporters still can’t acknowledge that his administration made a severe mistake in invading Iraq. And many of his critics can’t accept that the war was the result of anything other than wickedness or stupidity. The supporters’ denial and the critics’ naivety prevent us from learning from Iraq and improving our decision making. As a result, more severe Type-I and Type-II errors likely lie in America’s future.

Thomas A. Firey is a senior fellow with the Maryland Public Policy Institute and a Washington County native.



[1] “Transcript of the Republican National Debate.” Nytimes.com, Feb. 14, 2016.