Risk by Dan Gardner

Simon Perry casually posing on a steep cliff. Why isn't he scared (or is he?) on this very exposed terrain, 50 metres off the ground? Risk is all relative. At Alcasan, Stoney Middleton, Peak District; photo by Masa. (Licence: Creative Commons Attribution-ShareAlike)

We are all wired like stone-age humans!
So we react to risk just as our ancestors did when they ran on the African savannah, fleeing predators and avoiding poisonous plants and contaminated food, to maximise their chance of survival.

That might be all well and good if one lived such a life alone, relying entirely on instinct, the so-called Gut. But we don't. Modern lives are far more complicated. More importantly, we have a brain (the Head) to digest information and judge what is better and what is worse, and we have language, along with the means to record knowledge and pass it on. These are, after all, the primary reasons humans have prospered on Earth. By using our Head, we should be able to deal with risk better than with our Gut alone, taking into account all the information our Gut doesn't, or can't, care about.

Unfortunately, humans still seem to be hard-wired like cavemen in how we respond to the various risks we face. As a risk-managing climber and a trained scientist, I thought I would do better. That, however, is apparently a typical example of optimism bias, and I have bounded rationality at best. I too am, of course, a descendant of stone-age Homo sapiens. Realising this humbled me as I read the brilliant book Risk: The Science and Politics of Fear by Dan Gardner (2009).

In this post, I summarise, from a climber's perspective, some interesting facts I have learnt from Dan Gardner's Risk.

Typical (hard-wired) psychological biases

Modern psychology has revealed several risk-related biases, or rules, in the patterns of the human mind. Here is the list.

1. Anchoring Rule

An estimate is always affected by something, such as a number, that a person hears before making the judgement.


If you hear the number '4' a lot, say 4 apples, and then guess the grade of a route immediately afterwards, you are more likely to guess E4 where you would otherwise have said E5.
Apparently humans really are this easily swayed. Strack & Mussweiler (2006) found this holds even for judges passing sentence. If even professional judges are affected, we ordinary people must be too…

2. Rule of Typical Things

If one part of a (hypothetical) story feels typical and likely, a person tends to feel that the entire story is likely.


Consider two events: (1) a walker falls behind schedule, (2) a walker gets lost on a hill. Logically, the probability that both (1) and (2) happen is smaller than the probability that either one of them happens (regardless of the other). Yet people tend to feel the opposite; for example, that the probability of both happening is higher than the probability of a walker getting lost (whether or not s/he also falls behind schedule).
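The arithmetic behind this (known as the conjunction fallacy) can be sketched in a few lines of Python. The probabilities below are made up for illustration; only the inequality at the end is the general rule.

```python
# Conjunction fallacy: P(A and B) can never exceed P(A) or P(B).
# Hypothetical probabilities for a day on the hill:
p_behind_schedule = 0.30    # (1) walker falls behind schedule
p_lost_given_behind = 0.20  # chance of getting lost if already behind
p_lost_given_on_time = 0.05 # chance of getting lost otherwise

# P(lost), marginalised over both cases (behind schedule or not):
p_lost = (p_behind_schedule * p_lost_given_behind
          + (1 - p_behind_schedule) * p_lost_given_on_time)

# P(behind schedule AND lost):
p_both = p_behind_schedule * p_lost_given_behind

print(f"P(lost)          = {p_lost:.3f}")   # 0.095
print(f"P(behind & lost) = {p_both:.3f}")   # 0.060
assert p_both <= p_lost  # the conjunction is always the smaller
```

However plausible the combined story feels, the joint probability is always bounded by the probability of each event alone.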

3. Example Rule (or Availability heuristic)

If examples of something are easy to recall, Gut feels that something must be common.


If one hears that people fell off a steep cliff in the mountains and died, it is easy to imagine, so s/he tends to conclude that mountains are dangerous.

4. Good-Bad Rule (or Affect heuristic)

If people think the risk posed by something is high, they judge the benefit to be low. The reverse is also true.

Climbing is dangerous, therefore it is pointless. Walking is safe, therefore it is beneficial.
Of course, in reality there are countless things that are safe but pointless (of little benefit), and vice versa!

5. Confirmation bias

Once a belief is established, our brains will seek to confirm it.


Mary has judged, based on a guidebook, that a route is well protected. When she talks to other climbers who know the route, or reads a different guidebook, she is more likely to strengthen her belief that the route is well protected than to revise or retract it. For example, as soon as she hears someone say "I have placed Rock No. 1", she immediately takes it as confirmation that the route is well protected, perhaps without caring (or asking) how good the placement is (in reality it may be merely psychological).

6. Group polarization

When like-minded people get together and talk, their existing views tend to become more extreme.
Example 1

After a bunch of people who more or less like alpine climbing get together and talk, they will love alpine climbing even more. That is, each person's view (preference) is likely to escalate rather than average out, whether they liked alpine climbing a little or adored it before the meeting.

Example 2

Suppose a bunch of people who are more or less uncomfortable with bolting a crag get together and talk; some strictly oppose it, while others are only mildly uncomfortable. After the meeting, those who strictly opposed may now feel fiercely against it, and those who were mildly uncomfortable may now oppose it more clearly. That is, the former have strengthened their view and become more radicalised, and the latter, while still holding the less extreme position, have been pulled along by them.

7. Denominator Blindness

A report, or more often the media, mentions, for example, "X people affected [killed]" without mentioning "out of a population of Y". Logically speaking, the former is often meaningless, impossible to judge, without the latter, that is, something to compare it with. Yet people tend to be swayed, or manipulated, in whatever direction the media wants to impress upon its readers.

Example 1

"3 people died on Mt. XXX" may give you the impression that Mt. XXX is a very dangerous mountain. Not necessarily. What if 3,000,000 people visited the mountain during that period? Or what if those unfortunate 3 died of heart attacks, which could have happened anywhere?

Example 2

"The fatality rate has tripled" may make you feel uneasy. Not necessarily. If the original fatality rate was negligibly low in the first place, the tripled value is still not worth worrying about. Imagine someone finding out that the death rate from meteorite strikes is actually 3 times higher than previously thought.
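Both examples come down to dividing by the denominator before reacting to the numerator. A minimal sketch, using the hypothetical numbers from the examples above:

```python
# Denominator blindness: a raw count means little without its denominator.
deaths = 3
visitors = 3_000_000  # hypothetical visitor count for "Mt. XXX"
rate = deaths / visitors
print(f"Fatality rate: {rate * 100_000:.1f} per 100,000 visitors")
# 0.1 per 100,000: hardly a "very dangerous mountain"

# "The fatality rate has tripled": tripling a negligible rate
# still leaves a negligible rate.
base_rate = 1e-9  # hypothetical, vanishingly small baseline
tripled = 3 * base_rate
print(f"Tripled rate: {tripled:.1e}")  # 3.0e-09, still negligible
```

The habit to build is asking "out of how many?" before letting the headline number register.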

8. Personal experience

influences our perception greatly. If one has (unluckily) experienced something bad, s/he will try to avoid similar events in the future. If one has experienced something and was fine, s/he tends to feel similar events will be safe.

For climbers, I think this is a dangerous tendency. If one has traversed an avalanche-prone slope without incident, that does not mean s/he can traverse a similar slope safely next time… Personal experience, good or bad, does not change the probability of an incident in an identical situation in the future (unless some human factor contributes to the incident happening). In addition, other people's testimony about their experience should also be treated with extreme caution if the potential consequence is bad enough. Remember that you are unlikely to meet and talk to those who suffered the unfavourable consequence, because they may have died or quit the activity (climbing). In other words, the stories you hear in person are inherently biased on many levels.

9. Moral panic

Fear sells. Accordingly, the media (or corporations, politicians, etc.) promote fear, for instance by prioritising scary stories, because people are more likely to buy them than other, less punchy stories. Then confirmation bias kicks in: people confirm their concerns and get more scared. It is a spiral of anxiety.


After the 9/11 terror attacks in the US, Americans feared they might become victims of terror in the near future. No (significant) terror attacks happened in the US after 2001. Yet the level of public concern had grown worse by 2006, presumably because this moral panic took hold. Allegedly, the FBI director Robert Mueller even said publicly, "I remain very concerned about what [evidence of potential terror attacks] we are not seeing."

10. Bias to regard negative outcomes as more reliable (Negativity bias; Wikipedia)

People have a tendency to have more confidence in studies with negative outcomes than in studies showing no risk.


Before Halley's comet returned in 1910, the astronomer Camille Flammarion's claim that the cyanogen in the comet's tail "would impregnate the atmosphere and possibly snuff out all life on the planet" was published in newspapers. People bought it. Despite repeated reassurance from the scientific community that no such disaster would happen, fear spread widely among the public, and many rushed to buy gas masks and "comet pills" (see Halley's Comet Appearance in 1910 by Kevin Curran for details). Apparently a single negative claim somehow sounded more convincing than the opposing positive claims of many, which common sense says should be the more reliable source of information.

Any bright side?

It all sounds rather depressing.

However, the good news is that, while people's minds and judgement are frequently swayed by such irrational fears, past evidence shows very few people panic uncontrollably even when facing a real risk. In other words, most people are psychologically tough enough to deal with risk and avoid the worst. So, while we had better calmly examine our own minds to make sure we are not fear-driven, over-worrying is both unnecessary and harmful, in the sense that it drives yet more fear.

Dan Gardner presents the evidence (of people not panicking) in the book. The irony is that I cannot find it now: apparently I did not mark the relevant part, whereas I did mark many other parts that I regarded as noteworthy, which is how I could pick them up while writing this post and present them above… Obviously, the bias to regard negative outcomes as more reliable worked on me, at least with "reliable" (in the wording of the bias) replaced by "important". I have just confirmed how well those biases apply to me!

How to deal with these fear-driven biases

Now, having gained insight into how human psychology works, the obvious next question is how to overcome these irrational, hard-wired fears. Gardner argues:

  1. The first step in correcting our mistakes of intuition has to be a healthy respect for the scientific process.
    Or, to generalise, "scientific" in the above can be replaced with "logical".
  2. The next step in dealing with risk rationally is to accept that risk is inevitable.

    Indeed, as he claims, "It is often possible to make something safer, but safe is usually out of the question", which is of course the theme of this website, saferclimbing.org!

    Once we realise it,

  3. We must learn to think hard.

    And finally,

  4. if Head and Gut still don't match up, swallow hard and go with Head.

Climbers should know this very well. Do you remember the first time you fully relied on a rope (and gear and your belayer), such as being lowered off after a bottom-rope climb? Or your first fall? It was dead scary, wasn't it? Over time, climbers learn to trust the gear and the whole safety chain, and the fear diminishes. We cannot eliminate it entirely, though. No matter how many times one falls, falling is still scary and unwanted, and it still sends adrenaline rushing. But experienced climbers can deal with the fear better, so their judgement is less affected by it than it would otherwise be, and hence the overall risk is reduced.

I suppose how to deal with fear is a life-long theme for anyone, let alone climbers. But now that we have learnt better how our psychology works, we are a little wiser than before. Now we have hope of improving ourselves! As Winston Churchill wrote in 1958, cited by Dan Gardner as the closing remark of his book Risk:

The future is unknowable, but the past should give us hope.

My thanks to Dan Gardner for his book Risk: The Science and Politics of Fear (2009). The hyperlink is to the description of the book on his website.
