Overcoming Decision Flaws from Framing


Purpose: This article examines the harmful effects of framing and suggests ways to improve decisions.

Design / Methodology / Approach: The paper uses a number of real-life examples of catastrophic failures of decision making by obviously intelligent and experienced people and compares them with less dramatic but common errors of judgement most people make. It shows that the way a problem is presented often determines the choices we make, and that our choices are inconsistent across presentations. It explains this phenomenon using Kahneman and Tversky's Nobel Prize-winning work in behavioural economics.

Findings: All human beings are susceptible to decision flaws from framing. It is possible to recognise signs of the way we may be framing a decision situation and adopt strategies to sidestep the hazards of wrong choice.

Practical Implications: By adopting methods suggested in this paper individuals can make better decisions in everyday life. Managers can make appropriate choices rather than take too little risk or too much. Organisations can train senior managers in decision strategies and adopt group processes that reduce the risk of flawed decisions.

Originality / Value: The paper provides insight into errors all human beings are prone to, no matter how intelligent or experienced they are. Using everyday examples Indian managers are familiar with, the paper highlights the need to improve thinking.

Keywords: Choices, decision, decision frames, framing, framing effect, intuition.
Paper type: Viewpoint.
Appeared in the Journal of Indian Business Research, Year 2010, Vol. 2, Issue 1


In January 2009 Ramalinga Raju, then chairman of Satyam Computer Services, confessed to having manipulated the company's accounts for eight consecutive years. Each year the company's cash position and profits were exaggerated. "What started as a marginal gap between actual operating profit and the one reflected in the books of accounts continued to grow over the years," Raju wrote in his letter of January 7 to the Board. "It was like riding a tiger, not knowing how to get off without being eaten."

It must have been clear to Raju for quite some time that continuing down the path would lead to catastrophe. Why didn't he stop before what he was riding became a tiger? He probably didn't foresee that the road led to disaster. Why didn't he? How could such a capable, intelligent and experienced entrepreneur-manager be so irrational?

Raju may be an extreme case but we are all capable of similar irrationally risky behaviour. The evidence is everywhere. Most people stay invested in a falling share market even though it is safer to exit. We delay investing in a bullish market and once invested postpone exit in the belief it will not fall. We drive faster, sometimes recklessly, when we are late for an appointment. We ignore the fact that even a minor accident may jeopardise the very appointment we are trying to keep.

Perfectly sane people are capable of misjudgement because a considerable amount of thinking is intuitive. Intuition is an automatic, effortless process that uses rules of thumb and short cuts to simplify and quicken decision making. It tends to use information that is easily accessible and disregards other information even though it may be more relevant. Accessibility enables us to interpret information and make decisions in a split second. Why certain items of information are more accessible than others is harder to explain. It certainly has something to do with past experience, learned behaviour, salience and context. But it also seems to have a lot to do with the way a situation is framed.

Framing Effect

It may appear strange that our preferred choice varies merely because a situation has been presented differently. This occurrence is called the framing effect. It was demonstrated in an experiment by McNeil and others (Tversky and Kahneman, 1986). They presented statistical data about the effects of surgery and radiation therapy on lung cancer patients. The same information was presented to two groups in different ways.

         Problem 1 (survival frame)

Surgery: Of 100 people having surgery 90 live through the post-operative period, 68 are alive at the end of the first year, and 34 are alive at the end of five years.

Radiation therapy: Of 100 people having radiation therapy, all live through the treatment, 77 are alive at the end of one year, and 22 are alive at the end of five years.

         Problem 1 (mortality frame)

Surgery: Of 100 people having surgery 10 die during surgery or the post-operative period, 32 die by the end of the first year, and 66 die by the end of five years.

Radiation therapy: Of 100 people having radiation therapy, none die during treatment, 23 die by the end of one year, and 78 die by the end of five years.
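Though they elicit different choices, the two frames are informationally equivalent: each mortality figure is simply the complement of the corresponding survival figure. A minimal arithmetic check, using the figures from the problem statements above, makes the equivalence explicit:

```python
# Survival and mortality figures (per 100 patients) from the two frames:
# [end of treatment period, end of one year, end of five years]
survival = {"surgery": [90, 68, 34], "radiation": [100, 77, 22]}
mortality = {"surgery": [10, 32, 66], "radiation": [0, 23, 78]}

# Every pair of corresponding figures sums to 100: the frames differ
# only in presentation, not in content.
for treatment in survival:
    for alive, dead in zip(survival[treatment], mortality[treatment]):
        assert alive + dead == 100

print("The two frames encode identical statistics.")
```

A rational decision maker should therefore choose identically under either frame; the experiment shows that most people, including physicians, do not.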

In the group that was presented the problem in the survival frame, 18% favoured radiation therapy. However, among those presented the mortality frame, 44% opted for radiation therapy. Embarrassingly, the framing effect among experienced physicians was similar.

The majority of respondents preferred the riskier option when the problem was framed in terms of mortality or loss, whereas they tended to make a conservative choice in the survival or gain frame (Tversky and Kahneman, 1986). Similar effects of framing have been observed in many other experiments.

Daniel Kahneman and Amos Tversky explained this curious phenomenon as an effect of the way a decision problem is presented or framed. It occurs because human beings are, by nature, loss averse. When choices are framed between gains, people tend to be risk averse: they tend to preserve and protect what they feel they have. In choices framed between losses, people prefer to take higher risk to make good a potential loss. This shift, merely because a problem has been presented in different ways, is evidence of our bounded rationality, as Herbert Simon and, later, March and Simon proposed (Bazerman, 1986; Tversky and Kahneman, 1986): we are incapable of perfectly rational behaviour at all times. It makes us vulnerable to poor decisions as individuals and managers. The insidious part of the problem is that we are not even aware of the machinations of our intuitive minds.

Intuition – help or hindrance?

Is intuition a bad thing, then? Heavens, no! Intuition is a powerful faculty that is a product of tens of thousands of years of human evolution. Working by association and extension, it enables us to make judgements about complex situations in the blink of an eye. Without intuition we would not be able to live. But intuition is not very good at making judgements about uncertain events, especially future events. The associative process also creates other cognitive biases that blinker vision and compromise our best interests. Sometimes it leads to action that can leave observers aghast at the stupidity.

In October 2009 a young software engineer running late for his flight called the commercial manager of the airline to claim a bomb had been placed on board. Surprisingly, he had called from his own cell phone. He was arrested on arrival at the Delhi airport. He lost his job at a large and reputed Indian company and now awaits a far worse outcome in the trial that will follow. Perhaps the young man feared the cost of missing the flight and not keeping an important appointment with his boss or client. Or perhaps a couple of previous missed appointments had led to warnings and he feared much worse in this instance. It is likely that he saw the options as a choice between losses and opted for a high-risk action, oblivious to the possibility that the call would be traced to him.

If we are prone to the effects of framing, how can we improve the quality of our decisions? Unfortunately, leading behavioural scientists and cognitive psychologists have so far failed to provide definitive solutions. Fortunately, their experiments have yielded indicative directions that seem feasible.

Strategies for Better Decisions

Awareness. Knowledge of biases such as effects of framing can alert us to missteps. Greater awareness of our thoughts gives us the opportunity to bring our rational faculties to bear on the problem.

Audit. Audit of past decisions is a powerful source of learning. It serves as a feedback mechanism to reinforce processes that led to good judgements. In time these become learned behaviour and enable instances of good judgement to become more accessible to intuition. Audit of one's own thinking is what is commonly called reflection, but it includes the additional step of embedding past learning.

Consultation. The case for wise counsel can hardly be overemphasised, even though evidence of its efficacy may not be abundant in academic literature. Most people are unable to distinguish whether a preferred choice arose from intuitive or from rational thinking. This blinds us to the possibility that our judgement may have been coloured by a particular frame. A knowledgeable and impartial advisor, especially one who is schooled in the effects of framing, can suggest alternative ways of defining the problem and point out when our thoughts lean towards needless risk or an unduly conservative choice.

Group dynamics. Are two minds better than one? Does it help to discuss an issue with people in one's team? Empirical observation points to the danger of like-minded people, as in a functional group or a small business unit, thinking alike. Indeed, experiments have shown groups sometimes swing to greater extremes than individuals (Bazerman, 1986). This is usually a consequence of groupthink arising out of confirmation bias, in which homogeneous groups believe in similar assumptions or hypotheses. Empirical evidence does, however, point to bias neutralisation through debate (Bazerman, 1986). When alternative, even contrarian, perspectives are brought into the open, the shades of framing seem to fall off. Experience shows heterogeneous groups are more likely to challenge conventional wisdom.


In spite of possessing perhaps the most miraculous product of Nature, the human brain, we are destined to be imperfect. Fortunately, we can improve our strike rate of good decisions by becoming aware of our limitations and by interspersing intuition with critical thinking.


References

Bazerman, Max H. (1986), Judgement in Managerial Decision Making, John Wiley & Sons, USA, pp. 151-158; pp. 5-6 on "bounded rationality" (Simon 1957; March and Simon 1958) refer.
Outlookindia.com (2009), "Hoax Bomb Call at Delhi Airport; Man Held", October 25, 2009, http://news.outlookindia.com/item.aspx?668349 (accessed 9 January 2010).
Raju, Ramalinga (2009), letter dated January 7, 2009 to the Board of Directors of Satyam Computer Services.
Tversky, Amos and Kahneman, Daniel (1986), "Rational choice and the framing of decisions", The Journal of Business, Vol. 59, No. 4, Part 2, pp. S251-S278; the experiment by McNeil and others refers.
