Macroeconomic Issues Viewpoints

Why Financial and Other Serious Risks Are Such Slippery Concepts to Get Hold Of

by David Ropeik

Introduction: The "Truth" About Risk

Many years ago, as a broadcast journalist in Boston, I noticed that in any situation in which a risk was involved, people were either more afraid of the risk than circumstances warranted, or were not afraid enough, based on what the experts said about the evidence. This led to the realization that any conversation about risk has to respect the fact that there is no single knowable, factual truth concerning risk. There is only the reality that comes from our interpretation of the facts.

Cognitive neuroscience research has established that facts, by themselves, have no meaning. They are like 1s and 0s in a computer before the software is turned on. Once we give facts a meaning, however, that meaning shapes and informs our conscious view of the world.

Risk is, in its essence, the chance that something bad could happen. The chance, the probability, is calculable to some degree. But “bad” is subjective. So the key point to grasp is that the definition of risk itself confirms the idea that risk is an inescapably subjective concept. The way we interpret “badness” is heavily determined by our instincts, our evolutionary heritage, starting with the fight-or-flight wiring ingrained in our brain and mediated by all kinds of subjective learning, experience, assumptions, and hearsay.

Risk is, in other words, a messy, fuzzy concept, both subjective and instinctive. It is all about how we feel about facts, rather than about the facts themselves. So when people talk about getting risk right, “right” depends heavily on how they are conceiving of that particular risk at a specific moment in time.

If you accept these points, you immediately see the plight and the challenge facing any risk officer in a financial institution such as a bank. The risk officer, on one important level, is just another human struggling with a subjective view of risk. Yet on behalf of their organization, they are supposed to step outside of their humanity and somehow form an “objective” view of risk.

One way that corporations try to solve this dilemma is to provide risk officers with data analytic tools and mountains of “factual data.” But, of course, the data have to be interpreted, and with the interpretation comes subjectivity. Suppose that the risk officer’s data analysis shows him or her that the bank is leveraged beyond the three-times debt-to-capital ratio permitted by the banking regulator. That is an objective measure, since what constitutes bank capital and bank debt is defined by the regulator. So our risk officer flags this up to the board, who then say, “That’s fine, don’t bother about it. We’ll bring it back into line shortly.”
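The objective part of the scenario above, checking a ratio against a regulatory ceiling, is trivially mechanical; everything after the flag is raised is the subjective part. A minimal sketch of such a check, in which the three-times debt-to-capital limit comes from the text but the function names and balance-sheet figures are invented for illustration:

```python
# Hypothetical sketch of the mechanical part of a risk officer's job:
# compare a leverage ratio to a regulatory limit. The 3x ceiling is
# from the scenario in the text; the figures below are made up.

REGULATORY_LIMIT = 3.0  # the three-times debt-to-capital ceiling


def leverage_ratio(debt: float, capital: float) -> float:
    """Debt-to-capital ratio, as defined by the (hypothetical) regulator."""
    if capital <= 0:
        raise ValueError("capital must be positive")
    return debt / capital


def breaches_limit(debt: float, capital: float) -> bool:
    """True when the bank is leveraged beyond the regulatory limit."""
    return leverage_ratio(debt, capital) > REGULATORY_LIMIT


# Illustrative figures: 350 of debt against 100 of capital is a 3.5x
# ratio, so the check flags a breach.
print(breaches_limit(debt=350.0, capital=100.0))  # True
```

The code can report the breach; what it cannot do is decide whether to take the board’s word for it, which is exactly where the subjectivity re-enters.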

What does the risk officer do? Take the board’s word for it? Regard the fact that the violation has been flagged as “job done”? Say “not good enough” and risk his or her career by pushing the board to take immediate action? Go and whistle-blow to the press or the regulator? Risk, even with all the interpretive guidelines in the world, remains messy and rife with subjective dilemmas.

Fear of Radiation

Back in October 2013, I wrote an article for the New York Times entitled “Fear vs. radiation: The mismatch,” in which I pointed out how disturbing and frightening the world still finds the Fukushima nuclear disaster and events related to it, despite the fact that leading health scientists have said, again and again, that the radiation from Fukushima has been relatively harmless. This is similar to the results found after studying the health effects of Chernobyl on the local population, which, again, have proved vastly less than people feared. Our anxiety about nuclear radiation stems directly from our understandable fear of the horrors of nuclear war, coupled with fears about radioactive material damaging tissue, genes, and bone marrow.

The statistics, strangely enough, do not support the intensity of our fears about the impact of even quite high levels of radiation. As I pointed out, researchers following some 86,611 Japanese survivors of Nagasaki and Hiroshima who had been within 10 kilometers of the center of the explosions found that, for the entire population exposed, in many cases to extremely high levels of radiation, there was an excess cancer death rate of just two-thirds of one per cent. Despite evidence that high-dose exposures lead to only minimal additional mortality risk, the fear surrounding the Fukushima disaster is massive and has already helped to shape government thinking about nuclear power. Germany, for example, renounced nuclear power decisively after Fukushima, without any real analysis of the potential impact on German industry or consumers of the loss of nuclear power as an option in Germany’s power portfolio. Switzerland and Italy took similar steps.
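To give a rough sense of scale, the two figures quoted above can be multiplied together. This is illustrative arithmetic only: applying “two-thirds of one per cent” uniformly to the whole cohort of 86,611 is a simplified reading, not the study’s exact methodology.

```python
# Illustrative arithmetic from the figures quoted in the text; the
# study's actual baseline and excess counts are not reproduced here.
cohort = 86_611
excess_rate = 2 / 3 / 100  # "two-thirds of one per cent"

excess_deaths = cohort * excess_rate
print(round(excess_deaths))  # 577
```

On that simplified reading, the excess amounts to a few hundred deaths across the entire exposed cohort, which is the mismatch between statistics and fear that the article is describing.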

Precisely because our fear of radiation massively exceeds the realities of the hazards that it poses, the Environmental Protection Agency in the United States recently proposed new guidelines for communities about the kinds of protection that would be necessary for a range of nuclear accidents, ranging from a power plant accident to dispersal of radiation from a “dirty” terrorist bomb. The protection ranged from merely staying indoors, since most radiation cannot penetrate skin, much less walls or windows, to evacuation of the affected area, depending on the severity of the threat. This is a laudable attempt to help communities gain a better, less fear-driven understanding of the risks posed by what would be quite extreme radiation events.


Further reading


  • Bernstein, Peter L. Against the Gods: The Remarkable Story of Risk. New York: Wiley, 1998.
  • Damasio, Antonio. Descartes’ Error: Emotion, Reason, and the Human Brain. New York: Penguin Putnam, 1994.
  • Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
  • Pinker, Steven. The Blank Slate: The Modern Denial of Human Nature. New York: Penguin, 2003.
  • Ropeik, David. How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts. New York: McGraw-Hill, 2010.
  • Slovic, Paul. The Feeling of Risk: New Perspectives on Risk Perception. New York: Earthscan, 2010.
