
De-biasing sunk cost fallacy and escalation of commitment in organizations' decision-making process

Abstract
Powell, Lovallo and Fox (2011) believe that the gap between individual cognition and
collective strategy hinders behavioral strategy development the most. They also think
that behavioral strategy research needs to beware of making simplistic assumptions about
mental scaling, such as assuming that a company possesses the psychology of an individual
and that this person's choices reflect a collective decision. This paper is an attempt to
describe how individual psychological mechanisms affect organizations in the event of
sunk cost fallacy and escalation of commitment errors, why companies should de-bias
them, and by what means they can achieve that goal. Research in behavioral decision
theory unveils that individuals lack the cognitive capacity to make fully informed and
unbiased decisions in complex environments (Kahneman, Slovic, and Tversky, 1982;
Payne, Bettman, and Johnson, 1988).
Introduction
People put great trust in their intuition. Some of them have a nose for business.
Sometimes, however, it is not worth trusting your judgement when taking crucial
decisions. Decision making is not a mechanical job. It is risk-taking and a challenge to
judgement. The right answer is not central; central is understanding of the problem.
Decision making, further, is not an intellectual exercise: it mobilizes the vision, energies
and resources of the organization for effective action (Drucker, 1974: 480). Nowadays,
because of information overload, time pressure, simultaneous choices and other
limitations, people's decision making is likely to be remarkably biased. As the economy
becomes global at a tremendous pace, each of these biased decisions is likely to have
implications for an ever vaster part of society. Every one of us can fall prey to the so-called
sunk cost fallacy and escalation of commitment. Both effects show that we do not follow the
judicious advice of management decision theorists and economists: from a
normative point of view, the decision to authorize additional resources for a project
should be a function of incremental costs and benefits and opportunity costs (Northcraft
& Neale, 1986). Violations of the normative rule can be very costly for organizations.

What we mention below is just a drop in the ocean of the numerous examples of
companies experiencing negative implications of either sunk cost fallacy or
escalation of commitment.
The term "Concorde fallacy" is sometimes used by researchers to describe the tendency
toward sunk cost fallacy among lower animals (Arkes & Ayton, 1999). The reason is
that one of the most vivid examples of the sunk cost effect is the history of the
supersonic plane named Concorde. The project was launched by France and Great
Britain in 1962, and 14 years later the first commercial supersonic plane went into
operation (Gilman, 1977). The cost of developing the plane amounted to 1.134 billion
(Craumbaugh, 2015). This horrendous amount of money was spent on the Concorde's
construction because the British and French governments decided to keep investing even
though the plane's bleak financial prospects were known long before the project was
completed. Besides trying to preserve their prestige, they believed that at some point they
had already invested too much to quit (Teger, 1980). Concorde was retired only in 2003.
The next example is General Motors and its Saturn division, created in 1985. The
company's plan was to beat Japanese competitors with brand-new, small, affordable cars.
After the launch of the new brand, consumers initially appreciated the idea highly.
Unfortunately, prices of the Saturn cars were set below their cost of production, so
General Motors was losing money on every unit sold (Ritson, 2009). The bigger
trap for the management of General Motors, however, was that they had invested so much
in creating the Saturn division that they were reluctant to close it. Saturn tried to recover
through critical changes a few times, but none of them showed a positive effect (Taylor, 2004). In
2009 General Motors tried unsuccessfully to sell the unprofitable business to
Penske Automotive Group (Carlsson, 2011). Eventually, the bad investment was closed in
2010. This choice, however, was not made by GM management but was forced by an
external party, since closing the Saturn brand was one of the conditions given by the US
government to bail out GM (Stoll, Terlep, 2009).
Another classic example is Motorola Inc.'s Iridium project. In the 1980s, phone
coverage around the world was feeble, and the business community urgently wanted
to improve communication around the world. Motorola envisaged solving this problem
using 66 low-orbiting satellites, letting people call directly to any location around the
globe. Back when the idea was being developed, the project was technologically
sophisticated and made economic sense. It took the R&D department 15 years to take
the product from idea to market release. In the 1990s, technology evolved dramatically,
and extensive cell phone coverage around the globe eradicated a vast part of the
projected customer base for Iridium. Motorola still decided to launch Iridium as a
separate company in 1991 and then roll out the Iridium cell phone in 1998. A single
cell phone cost around $3,000, was not practical and was literally the size of a brick. No
wonder that Iridium filed for bankruptcy in 1999. Had the decision makers been paying
attention to these developments, they could have given up the project in the early 1990s.
Our paper aims to inquire into the nature of specific types of psychophysical biases and
their influence on collective decision making. We have chosen the Concorde, Saturn and
Iridium projects because we perceive them as relevant empirical examples of situations
where the failure to overcome sunk cost fallacy or escalation of commitment
turned out to be calamitous. We selected a broad range of high-quality articles and
journals to ensure the validity of the paper. In the process of preparing our theoretical
framework, we gathered a wide variety of secondary sources in the form of scientific
papers.
Theoretical framework
The examples we briefly described above lend some empirical support to Ross and Staw's
(1986) statement that evidence of escalation of commitment and sunk cost fallacy can
easily be found not just in the laboratory but in the business field as well. Nevertheless,
to better understand the essence of the topic, we will go through the most important
theoretical foundations below.
Arkes (1991) divided errors into three groups: strategy-based, association-based and
psychophysically-based errors. The last type of bias occurs due to the nonlinear relation
between external stimuli and the subjective responses to those stimuli (Arkes, 1991).
Both sunk cost fallacy and escalation of commitment appertain to the psychophysically-based
errors. Thus, the first step toward eliminating the above-mentioned biases from
decisions is to identify the psychological factors that nourish them.

Sunk cost fallacy


The finance and accounting literature is unanimous about how sunk costs in
investment decisions should be treated. Sunk costs are past, irrecoverable costs that are
unavoidable because they cannot be changed no matter what action is taken (Horngren,
Harrison & Robinson, 1996). Because they are non-redeployable, they
should be ignored (Brealey and Myers, 1996). However, many decision makers fail to
follow this advice. Instead, once they commit themselves to a particular course of
action, they have a tendency to make further investments beyond the optimal and
rational level.
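The normative rule can be sketched as a toy calculation. All figures below are hypothetical and serve only to illustrate that the sunk amount never enters the comparison:

```python
# Hypothetical figures, for illustration only.
sunk_cost = 500.0          # already spent and irrecoverable
remaining_cost = 200.0     # incremental cost needed to finish the project
expected_revenue = 150.0   # benefit if the project is completed
alternative_payoff = 60.0  # best alternative use of the remaining budget

# Normative rule: compare only incremental benefits, incremental costs
# and opportunity costs; sunk_cost appears nowhere in the comparison.
continue_value = expected_revenue - remaining_cost   # -50.0
abandon_value = alternative_payoff                   # 60.0

decision = "abandon" if abandon_value > continue_value else "continue"
print(decision)  # abandon: the 500 already spent is irrelevant
```

A decision maker prey to the sunk cost fallacy would instead mentally add `sunk_cost` to the stakes and continue; the whole point of the rule is that the comparison is unchanged whether 500 or 500 million has already been spent.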
In his research, Howard Garland (1990) unveiled that subjects' willingness to make
further investments in an endangered project is positively and linearly related to the
proportion of the budget that has already been spent. What is more, incremental costs
and benefits, two variables that might be expected to influence rational decision
making, proved to be unrelated to the pattern of results in his study.
Arkes and Blumer (1985) suggested that a major contributor to the sunk cost fallacy is a
person's desire not to appear wasteful. "Waste not, want not" is one of the
guidelines that most of us have been taught since childhood; therefore, avoiding waste
is normally advisable. However, using this rule to justify continuing a prior futile
investment is an overgeneralization of the principle (Larrick,
Morgan, Nisbett, 1990).
It has been demonstrated many times that a decision maker who carries personal
responsibility for an investment is more prone to throw good money after bad than one
who is not encumbered with personal responsibility for the initial investment decision.
Staw (1976) showed this in one of his experiments: when business school students felt
accountable for a financially unsuccessful earlier decision, they went on investing more
money in that choice than when their prior decision had been profitable. This finding
gainsays the common-sense idea that negative consequences provoke a change in a
person's course of action and can be summarised as self-justification (Brockner, 1992;
Staw & Ross, 1978). By abandoning the initial course of action, a person would admit to
having made a mistake; thus, he or she persists in investing in order to exhibit
consistency and justify the earlier choice. This weakness is intensified by increased
disappointment over the initial decision and by elevated economic consequences of
the decision.

Escalation of commitment
A few psychological factors have been distinguished that contribute to escalation of
commitment. First of all, people have a strong tendency to believe that their own
choices are the best ones. When confronted with contrary evidence, they
vigorously seek out proof that their decisions were sensible. Terminating an
underperforming investment is synonymous with admitting failure. Psychological
factors are not the only sources of commitment escalation. Social determinants concern
human beings' need to "save face" or remain credible with others. Whyte (1991, 1993)
found that escalation of commitment effects linger in individual as well as group
decision processes. Escalation of commitment in groups is related to competition.
Decision makers hate to concede defeat, admit mistakes and expose them to others,
especially in a competitive situation. We have a tendency to try to manage other
people's impressions of us. Humans also exhibit competitive irrationality, often making
decisions without being clear about their impact, especially when other people competing
with us complicate that impact. Thus, the desire to win and self-justification lead to
irrational commitment escalation in collective decision making.
Former studies have relied heavily on laboratory experiments to reveal pro-commitment
biases of individual subjects, but there has been paltry research analyzing
the structural determinants of firms' escalation behavior.
De-biasing
Over the years, researchers have come up with a broad spectrum of methods for
reducing decision biases. In 1982, Fischhoff investigated four strategies that had been
proposed as remedies for biased decision making: providing
warnings about the possibility of bias, describing the direction of a bias, giving a dose of
feedback, and offering a comprehensive program of training including feedback,
coaching, and other interventions designed to improve judgment. According to his

findings, the first three strategies led to minimal success. Even exhaustive, personalized
feedback generated only mild improvements in decision making (Bazerman and Moore,
2008).
Schwenk and Thomas (1983) introduced a framework for matching strategy problems
with decision aids such as dialectical inquiry, devil's advocacy, scenario analysis and the
Delphi procedure. Soll and Larrick (2009) showed that people tend to do better when
they average between proposed solutions than when they choose one or the other.
Thaler and Sunstein (2008) claimed that researchers should accept the fact that
decision makers are just human beings with feeble self-control and defective cognition.
Organizations need to focus on managing the psychological architecture of the choice
environment (Powell, Lovallo and Fox, 2011).
Research question and hypothesis
Even though in recent years many researchers have put great effort into proving that the
above-mentioned biases can have a positive impact on the outcomes of individuals'
choices, we argue that sunk cost fallacy and escalation of commitment are errors that
still need to be de-biased in organizations. In order to overcome the biases while taking
essential strategic decisions, it is not enough to introduce measures at the moment when
they arise. We argue that a successful de-biasing process starts long before the decision
maker exhibits sunk cost fallacy and escalation of commitment behaviour. Safeguarding
mechanisms need to be implemented long before decision makers become exposed to
making mistakes. Given the information above, we claim that de-biasing is indeed
necessary in important business decisions. What is more, we suggest that overcoming
these biases is not an individual's choice but the organization's task. Can the sunk cost
effect and escalation of commitment be overcome by means of designing an efficacious
decision-making process and choice architecture in companies? We believe so and
propose the following hypothesis: psychophysical biases can be neutralized or reduced
at the organizational level.
Propositions
Bazerman, Giuliano, and Appelman (1984) recognized that groups are less likely than
individuals to escalate commitment. The existence of multiple participants in the
decision-making process also increases the likelihood that the group will recognize the
irrationality of sunk cost fallacy and of escalating commitment to underperforming
investments. Yet, unfortunately, groups that do escalate tend to do so to a greater degree
than individuals: if a group fails to recognize an error, team dynamics reinforce support
for the initial decision and increase the level of rationalization. There are a
few ideas about how group-level de-biasing can be achieved. An organization can, for
instance, explicitly display the running costs of manufacturing equipment, or design work
spaces so that decision makers are exposed to alternative points of view (Johnson et al., 2011).
Ignoring sunk costs and resisting escalation of commitment is especially hard in situations
in which past investments are informative, reputation concerns are important, and budget
constraints are salient.
Given the substantial costs that can result from suboptimal decision making, it is crucial
to put greater effort into improving skills and knowledge about strategies that can
navigate companies toward better decisions. As overcoming these biases is a difficult task,
we propose that mechanisms implemented at the organizational level may be the remedy.
Arkes (1991) distinguished four techniques which can prove effective in overcoming
psychophysical biases; however, we will focus only on those we perceive as relevant for
collective decision making. The first relates to the value function of prospect theory
(Kahneman & Tversky, 1979; Appendix 1). As the curve relating objective gains and
losses to subjective gains and losses cannot be changed, de-biasing may be achieved
when new gains or losses are introduced alongside those currently under attention. This
can occur by informing decision makers about opportunity costs. Northcraft and
Neale (1986) showed in their experiment that subjects who were informed that
money spent on the condemned project would be unavailable for use on
much more promising ventures perceived the sunk-cost option as much less
appealing. The second option is reframing a group's gains as losses or the other way
round, as Tversky and Kahneman (1981) showed in their Asian disease example. This
solution can be called choice architecture. The goal of this method is to improve
people's decisions by carefully structuring how data and available options are presented
to them. In such a way, companies can nudge employees in a specific direction without
taking away their power to make decisions. Herbert Simon (1987) distinguished two
kinds of organizational decisions: logical and judgmental.
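The S-shaped value function mentioned above can be sketched numerically. The parametric form and the parameter values below (alpha = beta = 0.88, lambda = 2.25) come from Tversky and Kahneman's later estimates, not from this paper, and are given purely to illustrate why reframing gains as losses changes choices:

```python
def value(x: float, alpha: float = 0.88, beta: float = 0.88,
          lam: float = 2.25) -> float:
    """Subjective value of an objective gain or loss x (prospect theory)."""
    if x >= 0:
        return x ** alpha            # concave for gains
    return -lam * ((-x) ** beta)     # convex and steeper for losses

# Loss aversion: a loss looms subjectively larger than an equal-sized gain.
gain = value(100)    # roughly 57.5
loss = value(-100)   # roughly -129.5
print(gain, loss)
```

Because the curve is steeper for losses than for gains, presenting the same outcome as a loss rather than a forgone gain makes it weigh more heavily in the decision.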
Heath, Larrick, and Klayman (1998) distinguished two types of repairs for organizational
behaviour: motivational repairs, which boost enthusiasm and energy among workers, and
cognitive repairs, which can improve decision processes and prevent mental
biases. A motivational repair might be a manager who gives more autonomy to project
groups; a cognitive one, an executive who decides to invest in employees' training.
Education and instructions are obviously not enough to overcome the sunk cost fallacy
and the escalation of commitment effect, but if a group acknowledges the possibility of
bias, it may collectively rethink its choices, trigger the drive to be rational and
consequently diminish the biases.
Denrell and March (2001) supposed that biases sometimes derive from a perturbed
learning environment and would persist even if managers were completely rational. In
order to create an ideal learning culture in the organization, there must be a shift in the
way learning is perceived. Instead of being regarded as a single event, it needs to be woven
into organizational routines so that it becomes part of the culture. The errors under
discussion are not only a problem of top-level managers: all employees may
fall prey to them by sacrificing too much of their time and energy on doomed tasks.
Employees at all echelons need to feel comfortable learning and making the
mistakes required to gain knowledge. Encouraging employees to share experience,
skills and information, as well as giving feedback, establishes a stimulating, informal
learning atmosphere within the organization. When motivated to think less about
keeping promises or preventing loss, and more about considering opportunities and
promoting gains, people may increase their chance of choosing the best options for the
future.
Another method which can help organizations reduce biases and make more
objective decisions is the contingent road map (Horn, Lovallo and Viguerie, 2006). This
tool lays out signposts that navigate executives through their options at predetermined
checkpoints over the life of a project or business. Signposts indicate the points at which
key concerns have to be resolved and provide information about the oncoming choices and
possible results. For a contingent road map to be effective, certain decisions must be
assigned to each signpost before the project launch, or at least well before the project
comes near the signpost. Thanks to that, when the time of the decision arrives, it is
possible to mitigate biases by looking at previously stated commitments. What is also
important and relevant for our discussion, contingent road maps help to distinguish the
specific biases that may affect the corporate decision-making process. For instance, if
economic indicators at one signpost suggest that the business should be shut down, but
executives decide that the company has invested too many resources to quit, the sunk-cost
fallacy and escalation of commitment errors are obviously at work. If new,
influential information arrives, the initial road map can be adjusted; however,
alterations should always be made solely to future signposts, not to the current
ones. This method prevents executives from changing the decision criteria
midstream. Road maps help decision makers focus on future opportunities rather than past
performance and identify uncertainties explicitly through the use of
multiple potential paths. They lessen the impact of sunk cost fallacy and emotional
escalation in businesses. Crucially, they also relieve executives of uncomfortable
feelings of guilt about unfavorable outcomes that have been specified in
advance. The explicit recognition of issues gives a company a chance to adapt, while a
failure to recognize problems beforehand requires a change in strategy that is often
psychologically and politically arduous to explain.
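A contingent road map can be thought of as a small data structure in which each signpost couples a checkpoint with a decision rule committed to before launch. The metric names and thresholds below are hypothetical illustrations, not taken from Horn, Lovallo and Viguerie:

```python
# Each signpost fixes, in advance, what must be decided and on what grounds.
signposts = [
    {"checkpoint": "month 6",  "metric": "pilot_conversion",
     "threshold": 0.05, "if_below": "stop the rollout"},
    {"checkpoint": "month 12", "metric": "unit_margin",
     "threshold": 0.0,  "if_below": "exit the business"},
]

def evaluate(signpost: dict, observed: dict) -> str:
    """Apply the decision committed to before launch, not today's mood."""
    if observed[signpost["metric"]] < signpost["threshold"]:
        return signpost["if_below"]
    return "continue to the next signpost"

# Observed conversion at the first checkpoint is below the pre-committed
# threshold, so the pre-committed exit rule fires regardless of sunk costs.
observed = {"pilot_conversion": 0.03, "unit_margin": 0.10}
print(evaluate(signposts[0], observed))  # stop the rollout
```

The bias-mitigating work happens when the `signposts` list is written, before any money is sunk; at the checkpoint itself the executives only apply the rule they agreed to in advance.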
Edwards and von Winterfeldt (1986) once said that if the problem is important and the
tools are available, people will use them and thus get right answers. It means that if
decision makers want to be rational and minimize the probability of fatal business
outcomes caused by any kind of bias, they will use all the possible methods, but only if
those methods are available in their environment. We believe that organizations should
implement such tools in the business decision-making process and organizational
structure for the sake of being successful.
Future research
"To understand or predict what a rat will learn to do in a maze, one has to know both the
rat and the maze" (Mowrer, 1960). This quote refers to the fact that an organization has to
design the decision environments faced by decision makers in light of knowledge about the
decision environment, but also with knowledge about the characteristics of the targeted
decision makers: how they process and draw meaning from information, and what
their goals are. The right choice architecture may thus differ by these individual
characteristics. Therefore, future research should focus on the collective decision making
of organizations. What also deserves attention is choice architecture, with special
attention to overcoming the judgmental errors of individuals.
Conclusion
Escalation of commitment and sunk cost fallacy are errors that, in our opinion,
need to be de-biased in organizations' strategic decision making. Companies tend to
escalate their commitments because of their inclination to deny failure. They also tend
to consider sunk costs in future decisions even though it is irrational. Despite clear,
negative feedback indicating that it is time to leave a project, a business or even a whole
industry, executives tend to carry on and fritter resources away on unprofitable activities.
Even though abolishing a project or exiting a business can be perceived as a sign of
failure, such moves are completely normal in organizations' lives, and companies need
to realize that. Thanks to this, they can free up their resources and improve their ability
to embrace new market opportunities before it is too late. The methods examined in this
paper aim to show that by implementing certain measures in the strategic decision-making
process and enhancing some parts of organizational structure design, a company
can be prepared for de-biasing.

Appendix 1

source: http://ui-patterns.com/uploads/image/file/1682/best_1845.jpg
References
Arkes HR. 1991. The costs and benefits of judgment errors: implications for debiasing.
Psychological Bulletin 110: 486–498.
Arkes HR, Blumer C. 1985. The psychology of sunk cost. Organizational Behavior
and Human Decision Processes 35: 124–140.
Bazerman MH, Moore D. 2008. Judgment in Managerial Decision Making (7th edn).
Wiley & Sons: Hoboken, NJ.
Drucker PF. 1974. Management: Tasks, Responsibilities, Practices. Harper & Row:
New York.
Edwards, W., & von Winterfeldt, D. (1986). On cognitive illusions and their
implications. In H.R. Arkes & K. R. Hammond (Eds.) Judgment and decision making:
An interdisciplinary reader (pp. 642-679). Cambridge, England: Cambridge University
Press
Heath C, Larrick RP, Klayman J. 1998. Cognitive repairs: how organizations
compensate for the shortcomings of individual learners. Research in Organizational
Behavior 20: 1–37.
Horn J, Lovallo D, Viguerie P. 2006. Learning to let go: making better exit decisions.
McKinsey Quarterly.
Johnson EJ, Shu S, Dellaert BGC, Fox CR, Goldstein DG, Haubl G, Larrick RP, Peters
E, Payne JW, Schkade D, Wansink B. 2011. Beyond nudges: tools of a choice
architecture. Unpublished manuscript, Columbia University.
Kahneman D, Slovic P, Tversky A. 1982. Judgment Under Uncertainty: Heuristics and
Biases. Cambridge University Press: New York.
Kahneman D, Tversky A. 1979. Prospect theory: an analysis of decisions under risk.
Econometrica 47: 263–291.
Lovallo D, Sibony O. 2010. The case for behavioral strategy. McKinsey Quarterly,
March.
Molden DC, Hui CM. 2011. Promoting de-escalation of commitment: a regulatory-focus
perspective on sunk costs. Psychological Science 22: 8–12.
Northcraft GB, Neale MA. 1986. Opportunity costs and the framing of resource
allocation decisions. Organizational Behavior and Human Decision Processes 37: 348–356.
Staw BM. 1981. The escalation of commitment to a course of action. Academy of
Management Review 6: 577–587.
Teger AI. 1980. Too Much Invested to Quit: The Psychology of the Escalation of Conflict.
Pergamon Press: New York.
Whyte G. 1986. Escalating commitment to a course of action: a reinterpretation.
Academy of Management Review 11: 311–321.
http://blog.cultureiq.com/crafting-learning-culture/
