Conversations with Howard

Talking with Howard Kunreuther about his work—including one of his last books—always meant gaining new insights.

Talking to Howard meant learning something new every time. Every conversation was engaging and thought-provoking because of his intellect, the range of his knowledge, and his enthusiasm for exploring diverse perspectives and ideas.

To pick just one example, I gained many new ideas from my discussions with him about one of his last works, a book he co-authored with Bob Meyer entitled The Ostrich Paradox: Why We Underprepare for Disasters.

The first thing I gained from these discussions was learning far more about ostriches than I previously knew. Before I read Howard’s book, my only “knowledge” of ostriches was the belief that they bury their heads in the sand in the face of danger. From his book, I learned that this is just a myth. When faced with a potential threat, ostriches instead tend to run away or use their strong legs to defend themselves. They also have keen eyesight and hearing; their primary defense mechanism is to detect and avoid danger early rather than to bury their heads in the sand.

So, what do ostriches have to do with people underpreparing for disasters? Howard and Bob argued that just as ostriches must adapt to overcome their inability to fly, people must overcome their psychological biases to prepare adequately for future disasters.

Not only did I learn more about ostriches from talking with Howard about this book and reading it, but I also learned more about motivating disaster preparedness, a topic I have studied for years.

The six biases that they highlight—myopia, amnesia, optimism, inertia, simplification, and herding—were familiar concepts from my prior studies of psychological biases in decision making under uncertainty. However, Howard always delivered a story or an example that provided new insight.

For example, he would illustrate myopia bias—the tendency to focus only on the short term—by referring to the failure of the countries bordering the Indian Ocean to see the value in investing in a relatively low-cost tsunami warning system prior to the tragedy that befell the region in 2004. When evaluating a proposed warning system, these countries massively underestimated the impact that such a system could have. They even assigned a negative value to the system out of concern that it would dampen tourism by reminding visitors of the risk of tsunamis.

Everyone who knew Howard can imagine him relating this story with energetic frustration at the decision to forgo a critical safety investment out of reluctance to remind people of the risk of a disaster.

Another frustrating story that Howard would tell, as an example of inertia bias, involved decision makers in New Orleans. In 2004, forecasts predicted that Hurricane Ivan would hit New Orleans directly. City managers knew they could not evacuate the 100,000 residents who either did not have cars or were too fragile to move, so they suggested using the Superdome as a shelter of last resort for Hurricane Ivan.

Because the Superdome had none of the resources needed to house thousands of residents for an extended period of time, it was a poor shelter candidate. Fortunately, Hurricane Ivan changed course, averting the crisis that hurricane season.

Unfortunately, despite the clearly identified problems that would have occurred if Hurricane Ivan had hit New Orleans and the city had relied on the Superdome as a shelter, nothing was done to improve shelter options before Hurricane Katrina, which struck the following year. Again, you can imagine Howard’s frustration with the decision makers when—unsure how to proceed—they chose to do nothing.

Talking about The Ostrich Paradox with Howard also gave him the opportunity to discuss one of his particular passions: improving flood insurance in the United States. Current programs do not incentivize individuals to purchase flood insurance or to invest in cost-effective mitigation measures to protect themselves against future losses. Howard worked tirelessly to effect change in this industry, and his work was having an impact.

Howard, ever the optimist, believed that he could improve decision making around disasters to save both lives and property. He also believed that when protective decisions go awry, it is not because humans lack the innate ability to make good decisions, but because human reasoning is not designed to evaluate the risk posed by rare threats for which we have little prior knowledge. For such threats, he argued, a more analytical approach is needed.

Howard’s legacy will rest with the many future discussions by the next generation of scholars focusing on the challenges of improving disaster preparedness to reduce future consequences. These future scholars have the benefit of building on a foundation of extensive knowledge that Howard generated over his long career.

Robin Dillon-Merrill

Robin Dillon-Merrill is a Professor and Operations and Analytics Area Chair at the McDonough School of Business, Georgetown University.

This essay is part of a series celebrating the life and scholarship of Howard Kunreuther, titled “Commemorating Howard Kunreuther.”