
British Journal of Anaesthesia 108 (2): 229–35 (2012)

Advance Access publication 8 December 2011. doi:10.1093/bja/aer387

CLINICAL PRACTICE

Cognitive errors detected in anaesthesiology: a literature review and pilot study

M. P. Stiegler 1*, J. P. Neelankavil 1, C. Canales 2 and A. Dhillon 1

1 Department of Anaesthesiology, David Geffen School of Medicine, UCLA, 757 Westwood Blvd-Suite 3325, Los Angeles, CA 90095-7403, USA
2 Department of Anaesthesiology and Perioperative Care, UC Irvine School of Medicine, Irvine, CA, USA
* Corresponding author. E-mail: mstiegler@mednet.ucla.edu

Editor's key points

- This paper deals with a novel subject of cognitive errors and psychology of decision-making in anaesthesia.
- An anaesthesia-specific catalogue of cognitive errors was created using qualitative methodology.
- The errors of key importance were seen in >50% of simulated emergencies.
- The study provides deeper insight into error generation and possible strategies to prevent them.

Background. Cognitive errors are thought-process errors, or thinking mistakes, which lead to incorrect diagnoses, treatments, or both. This psychology of decision-making has received little formal attention in anaesthesiology literature, although it is widely appreciated in other safety cultures, such as aviation, and other medical specialities. We sought to identify which types of cognitive errors are most important in anaesthesiology.

Methods. This study consisted of two parts. First, we created a cognitive error catalogue specific to anaesthesiology practice using a literature review, a modified Delphi method with experts, and a survey of academic faculty. In the second part, we observed for those cognitive errors during resident physician management of simulated anaesthesiology emergencies.

Results. Of >30 described cognitive errors, the modified Delphi method yielded 14 key items experts felt were most important and prevalent in anaesthesiology practice (Table 1). Faculty survey responses narrowed this to a top 10 catalogue consisting of anchoring, availability bias, premature closure, feedback bias, framing effect, confirmation bias, omission bias, commission bias, overconfidence, and sunk costs (Table 2). Nine types of cognitive errors were selected for observation during simulated emergency management. Seven of those nine types of cognitive errors occurred in >50% of observed emergencies (Table 3).

Conclusions. Cognitive errors are thought to contribute significantly to medical mishaps. We identified cognitive errors specific to anaesthesiology practice. Understanding the key types of cognitive errors specific to anaesthesiology is the first step towards training in metacognition and de-biasing strategies, which may improve patient safety.

Keywords: cognition; decision-making; diagnostic errors/prevention and control; medical mistakes; physicians/psychology

Accepted for publication: 22 August 2011

Increasingly, attention is being focused on the prevention and management of medical errors, which have been estimated to account for 44 000–98 000 deaths annually in the USA.1 Recent studies suggest that cognitive errors, a subset of medical errors involving faulty thought processes and subconscious biases, are important contributors to missed diagnoses and patient injury.2–4 Indeed, according to Groopman,5 a growing body of research shows that technical errors account for only a small fraction of incorrect diagnoses and treatments; most errors are mistakes in thinking, and part of what causes these cognitive errors is our inner feelings, feelings we do not readily admit to and often don't even recognize. Thus, the psychology of decision-making is becoming increasingly appreciated in the non-medical and medical literature.

Cognitive errors have been described in 'safety culture' industries, such as aviation and nuclear power plants,6 7 and in the literature of medical education and other medical specialities, including internal medicine, emergency medicine, family practice, radiology, neurology, and pathology.2 8–15 In the 1990s, early pioneers of the Crisis Resource Management paradigms at Harvard and Stanford introduced this topic by including 'fixation error' (also known as 'anchoring' or 'tunnel vision') in their curricula.11 16 17 Numerous additional cognitive errors have been identified, and this concept has been greatly expanded in other disciplines since then. A comprehensive review of cognitive errors or decision-making psychology has not yet been published in the contemporary anaesthesiology literature. This manuscript sought to provide an expanded introduction to the concept of cognitive errors, and to identify the cognitive errors that may be most important in anaesthesiology practice.





Review of cognitive errors

Table 1 Cognitive error catalogue

Anchoring. Definition: focusing on one issue at the expense of understanding the whole situation. Illustration: while troubleshooting an alarm on an infusion pump, you are unaware of sudden surgical bleeding and hypotension.

Availability bias. Definition: choosing a diagnosis because it is in the forefront of your mind due to an emotionally charged memory of a bad experience. Illustration: diagnosing simple bronchospasm as anaphylaxis because you once had a case of anaphylaxis that had a very poor outcome.

Premature closure. Definition: accepting a diagnosis prematurely; failure to consider a reasonable differential of possibilities. Illustration: assuming that hypotension in a trauma patient is due to bleeding, and missing the pneumothorax.

Feedback bias. Definition: misinterpretation of no feedback as positive feedback. Illustration: belief that you have never had a case of unintentional awareness, because you have never received a complaint about it.

Confirmation bias. Definition: seeking or acknowledging only information that confirms the desired or suspected diagnosis. Illustration: repeatedly cycling an arterial pressure cuff, changing cuff sizes and locations, because you do not believe the low reading.

Framing effect. Definition: subsequent thinking is swayed by leading aspects of the initial presentation. Illustration: after being told by a colleague, 'this patient was extremely anxious preoperatively', you attribute postoperative agitation to her personality rather than low blood sugar.

Commission bias. Definition: tendency toward action rather than inaction; performing un-indicated manoeuvres, deviating from protocol; may be due to overconfidence, desperation, or pressure from others. Illustration: 'better safe than sorry' insertion of additional unnecessary invasive monitors or access, potentially resulting in a complication.

Overconfidence bias. Definition: inappropriate boldness, not recognizing the need for help, tendency to believe we are infallible. Illustration: delay in calling for help when you have trouble intubating, because you are sure you will eventually succeed.

Omission bias. Definition: hesitation to start emergency manoeuvres for fear of being wrong or causing harm; tendency towards inaction. Illustration: delay in calling for chest tube placement when you suspect a pneumothorax, because you may be wrong and you will be responsible for that procedure.

Sunk costs. Definition: unwillingness to let go of a failing diagnosis or decision, especially if much time/resources have already been allocated; ego may play a role. Illustration: having decided that a patient needs an awake fibreoptic intubation, refusing to consider alternative plans despite multiple unsuccessful attempts.

Visceral bias. Definition: counter-transference; our negative or positive feelings about a patient influencing our decisions. Illustration: not troubleshooting an epidural for a labouring patient, because she is 'high-maintenance' or a 'complainer'.

Zebra retreat. Definition: a rare diagnosis figures prominently among possibilities, but the physician is hesitant to pursue it. Illustration: trying to explain away hypercarbia when MH should be considered.

Unpacking principle. Definition: failure to elicit all relevant information, especially during transfer of care. Illustration: omission of key test results, medical history, or surgical events.

Psych-out error. Definition: medical causes for behavioural problems are missed in favour of a psychological diagnosis. Illustration: an elderly patient in the PACU is combative; prescribing restraints instead of considering hypoxia.


Cognitive errors fall under the domain of the psychology of decision-making. According to Croskerry,18 who has presented the most comprehensive medical catalogue of cognitive errors to date, the literature on cognitive bias in decision-making is ample, but ponderously slow to enter medicine. Perhaps this is because cognitive errors are considerably less tangible than procedural errors. They are described as 'low-visibility'; that is, they are generally unable to be witnessed or recorded and usually occur with low awareness on the part of the thinker. Thus, they are not conducive to root-cause analysis. However, these types of thinking mistakes are thought to be potentially highly preventable.19 Cognitive errors are thought-process errors, usually linked to failed biases or heuristics. According to Groopman,5 while heuristics serve as the foundation for all mature medical decision-making, they can lead to grave errors; the doctor must be aware of which heuristics he is using, and how his inner feelings may influence them. Importantly, these are errors that are made despite the availability of the knowledge and data needed to make the correct decision. Thus, they are distinct from knowledge gaps. It is important to note that heuristics and biases are frequently useful in clinical medicine; they allow experts to arrive at decisions quickly and (usually) accurately. However, cognitive errors occur when these subconscious processes and mental shortcuts are relied upon excessively or under the wrong circumstances. Below, we introduce several examples of specific cognitive errors that are commonly described in the literature and easy to understand. Table 1 lists our catalogue of 14 cognitive errors, along with definitions and examples for each.
Premature closure describes the cognitive error of accepting the first plausible diagnosis before it has been fully verified.8 If a patient is hypotensive after induction, the anaesthetist may attribute this to the first potential explanation that comes to mind, such as the effect of deep anaesthesia without adequate stimulation, and not consider the other key items on the differential diagnosis. Just as easily, an anaesthetist may diagnose anaphylaxis and launch into epinephrine resuscitation without considering other key items that are alternative diagnoses. Importantly, to understand the concept of cognitive error, premature closure represents the thought-process mistake that occurs when a preliminary diagnosis is selected without consideration of other possibilities and without verification of the selected diagnosis. Anaesthetists may have variability in terms of what specific diagnosis comes to their minds most readily; the error lies not in the nature of the preliminary diagnosis itself, but in the premature cessation of investigation. It is worth repeating that cognitive errors are not knowledge deficits. The anaesthetist in the above example is assumed to be competent, knowledgeable, and capable of generating a list of potential diagnoses and then evaluating each and selecting the best. Yet it is the overreliance on a thought-process shortcut that leads the anaesthetist astray.

Feedback bias is another cognitive error that may be particularly pertinent to anaesthesiology. This describes the process that occurs when significant time elapses between actions and consequences, or outcome data are never reported back to the anaesthetist.18 When important information does not return to the decision maker, it is impossible to shape future decisions based upon that information. As such, the absence of feedback is subconsciously registered as positive feedback. Potential clinical examples of this abound in anaesthesiology, as we often have limited patient continuity. If a patient suffers an infection from central catheter placement, or a corneal abrasion from inadequate eye protection, or even intraoperative awareness, these issues may not be communicated back to the anaesthesiologist. The anaesthesiologist may remain ignorant of these complications indefinitely, believing the techniques used to be adequate.

Confirmation bias is an error characterized by seeking confirming evidence to support a diagnosis while subconsciously discounting contrary evidence, despite the latter often being more definitive. Sometimes this will manifest as trying to force data to fit a desired or suspected diagnosis.6 20 A simple example in anaesthesiology might be the repeating of arterial pressure measurements, changing cuff sizes and locations, in an effort to get a reassuring reading, instead of recognizing the hypotension as real. Similarly, this may occur with nearly any monitoring device, including misdiagnosing oesophageal intubation in favour of 'fixing' the capnograph. Another example might be diagnosing malignant hyperthermia based upon hyperthermia and hypercarbia, and not looking for (or discounting the absence of) metabolic acidosis. As with other cognitive errors, the fault lies in the thought process: subconsciously selecting only certain information and attempting to force those data to satisfy a chosen diagnosis. Crucial to the concept of cognitive errors is the lack of deliberate thinking. Often, clinicians must sift through extraneous data that are indeed irrelevant to a diagnosis. Upon careful consideration, they may discard them. However, confirmation bias is not conscious, deliberate selection of data; it is a largely unconscious process of (incorrect) pattern recognition.

Availability bias is an error in diagnosis due to an emotionally memorable past experience (an adverse patient outcome, or lawsuit, for example). Physicians may be subconsciously affected by these emotional experiences, causing the diagnosis to be very readily available at the forefront of the mind, which has the result of inflating the pre-test probability for that diagnosis. In the future, the physician may unintentionally ignore important differences between the current presentation and that prior experience, falling prey to the availability bias error.

Study methods

Having reviewed the literature and noted the absence of an anaesthesiology-specific set of cognitive errors, we set out to perform two tasks. First, we sought to create an anaesthesiology-specific catalogue of errors. Next, we created a pilot study to observe the prevalence of those errors during simulated anaesthesia emergencies. Institutional Review Board approval was obtained before study implementation.

Creation of an anaesthesiology-specific catalogue of cognitive errors

We performed a Medline literature search using the key words 'cognitive error', 'cognitive psychology', 'diagnostic error', 'heuristic', and 'decision making', alone and in combinations. We limited our search to articles from peer-reviewed journals written in English, screening abstracts for appropriateness. Since we were interested in the modern paradigm of thought-process errors, we limited our initial search to items published during or after 1999. We then screened all abstracts identified by Medline to be related citations of articles we found to be relevant, and we screened relevant articles cited in the articles we reviewed, including those published before 1999. In all, we found that the concept of cognitive errors was described in 28 articles, and specific cognitive errors were identified and defined in 23 articles. From all the specific cognitive errors identified, we examined the definitions and combined items that appeared to be synonyms. For example, 'anchoring', 'fixation', and 'tunnel vision' are all terms that essentially describe the process of giving exclusive attention to only one problem at the expense of fully understanding the situation, and we chose to refer to this concept as 'anchoring'. Instead of creating a new taxonomy, we chose among existing terminology options for use in this study. We listed all the separate errors, along with definitions, and we added illustrative anaesthesiology examples.

Using the expert opinion of eight faculty on the quality assurance committee (who routinely investigate errors and near-misses) and crisis simulation faculty (who have extensive experience observing the natural course of errors allowed to evolve unchecked) from the corresponding author's academic Anaesthesiology department, we narrowed the catalogue using two rounds of a modified Delphi technique. This process consisted of interviews during which the faculty reviewed the list of errors with an investigator (M.P.S.). They were asked to comment on the relevance of each error to the practice of anaesthesiology in a hypothetical manner and also in terms of personal experience and vicarious experiences. They were also asked to comment on the likelihood of the error based upon level of clinical experience; that is, they were asked whether an error is likely to be relevant to trainees only or also to experienced anaesthesiologists. Finally, they were asked whether education about a particular error and its prevention strategy would be useful to their own practice or to that of other anaesthetists. After each faculty member had been interviewed once and revisions made, the same faculty were invited to comment on the proposed exclusions and the remaining catalogue. In particular, because many cognitive errors have a degree of conceptual overlap, they were asked if any items appeared to be synonymous, or if their defining features were too subtle to differentiate. They were also asked to comment on the construction of the list and whether it was self-explanatory. Based upon this information, we narrowed the catalogue to 14 items, which are listed, defined, and illustrated in Table 1. Finally, this 14-item catalogue, including error definitions and illustrative clinical examples, was presented via e-mail to all faculty members (77 in total). They were asked to identify up to five items that they believed to be most pertinent to anaesthesiology practice. Again, they were asked to consider both their own personal experiences and those they observed or were told about from colleagues or trainees. They were also invited to comment on any item they wished, and in particular, they were asked to comment if they felt a particular error did not occur in anaesthesiology. As is discussed in the Results section, this process ultimately led to a catalogue of 10 items felt to be especially relevant to anaesthesiology.

Observation of cognitive errors

We wished to observe whether the cognitive errors identified in our catalogue indeed occurred in clinical practice. As observation of errors in the operating theatre without intervening would be unethical, we elected to observe residents in an established simulation course, where errors may be allowed to evolve unchecked and transpire along their natural course. We did not alter the standard simulation curriculum in any way for this study, and we recruited participants based upon availability. The following is a brief description of our standard simulation curriculum and environment. As part of standard training, anaesthesiology resident physicians at our institution practice managing simulated emergencies several times per year. Residents participated in high-fidelity scenarios, using the SimMan simulator (Laerdal Medical, Wappingers Falls, NY, USA), and actors (trained simulation centre staff) represented other team members (surgeon, nurse). Scenarios were selected from our library of 20 cases by the simulation faculty based on appropriateness for the resident level of training and prior exposure to scenarios. They contained scripted emergencies, including anaphylaxis, malignant hyperthermia, difficult airway (can't intubate, can't ventilate), and pulmonary embolism. Before the simulation, residents were fully oriented to the simulation environment, capabilities, and limitations. Participants were instructed to act as the lead physician managing patient care, and were also informed they could ask for help, diagnostic exams, or any resource commonly available in the operating theatre. For each case, participants were given a vignette stem and the ability to ask questions, just as they would in a preoperative evaluation. Each case was 20 min in length and was immediately followed by a debriefing that lasted 40 min. All of this is our standard educational curriculum. It should be emphasized that we did not alter the simulated events in any way to attempt to cultivate error, nor did we study any error prevention or recovery strategies. This study was solely to confirm (or refute) whether the cognitive errors identified in our catalogue appeared during clinical decision-making exercises.

Between October 2009 and February 2010, 32 resident physicians consented to participate in the study. As cognitive errors are thought to be ubiquitous processes and not situation-specific, we did not control for the simulated vignette subject matter or script, nor did we ask the faculty instructors to deviate from their usual practices. As is standard in our curriculum, resident physicians were asked to manage the emergency as if in real life. While observing the live scenarios, investigators made note of error behaviours. Those behaviours were subsequently explored during the debriefing, so that the thought processes could be elicited. The investigators recorded all cognitive errors that were identified; a resident physician could make any number of defined cognitive errors, including none or all, in a given scenario.

Expert rating

Two expert investigators evaluated each participant after the simulated case and debriefing, using a five-point Likert-style survey that queried performance of cognitive errors and non-technical skills, such as communication and teamwork behaviours. This instrument was developed using the cognitive errors in the first part of this study, was demonstrated to be reliable (Cronbach's α = 0.81), and is described in detail elsewhere.21 Each expert rater received instructions and a calibration session with the principal investigator. Experts completed the cognitive error survey after the debriefing process, so that they could ascertain both the degree of medical knowledge possessed by the resident physician and the thought process by which decisions were made. Scores were given after debriefing because a defining feature of cognitive errors is the absence of a knowledge deficit; thus, it was essential to gain insight into the decision-making thought process before scoring. As previously discussed, cognitive errors cannot be detected by observation alone.
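For readers unfamiliar with the reliability statistic quoted above, the standard psychometric definition of Cronbach's α (a general formula, not one reproduced from this paper, which describes the instrument elsewhere21) is:

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```

where k is the number of items in the instrument, \sigma^{2}_{Y_i} is the variance of scores on item i, and \sigma^{2}_{X} is the variance of total scores across completed surveys; values near 1 indicate high internal consistency.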




Statistical analysis

Expert raters indicated whether or not a particular cognitive error was made by a given resident physician using a five-point Likert scale (1, strongly disagree that error occurred; 2, disagree that error occurred; 3, neutral or unsure; 4, agree that error occurred; 5, strongly agree that error occurred). For our analyses, we averaged investigator scores for each error and created a binary variable as follows: an average of >3 indicated an error occurred, whereas an average of ≤3 indicated no error occurred. Then, using SPSS version 17 (SPSS Inc., Chicago, IL, USA), we performed frequency analysis for each error.
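To make the binarization and frequency analysis concrete, the following minimal Python sketch reproduces the logic on invented rater data (the study itself used SPSS; the error types shown and all scores here are hypothetical examples, not the study data):

```python
# Hypothetical illustration of the scoring logic described above.
# Two expert raters score each error on a 1-5 Likert scale; an
# average of >3 counts as "error occurred", <=3 as "no error".

from statistics import mean

# Invented example: (rater1, rater2) scores per simulated scenario.
rater_scores = {
    "premature_closure": [(4, 5), (2, 3), (5, 4)],
    "anchoring":         [(3, 3), (4, 4), (2, 1)],
}

for error_type, score_pairs in rater_scores.items():
    # Binary indicator per scenario: 1 if the averaged rating exceeds 3.
    occurred = [1 if mean(pair) > 3 else 0 for pair in score_pairs]
    # Frequency analysis: percentage of scenarios in which the error occurred.
    prevalence = 100 * sum(occurred) / len(occurred)
    print(f"{error_type}: observed in {prevalence:.1f}% of scenarios")
```

This is the same computation that underlies the prevalence percentages reported in Table 3.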
Results

Cognitive error catalogue

Thirty-two of the 77 faculty physicians queried responded (41.5% response rate). Our faculty consensus identified the following 10 cognitive errors as being highly relevant to the practice of anaesthesiology: anchoring, availability bias, omission bias, commission bias, premature closure, confirmation bias, framing effect, overconfidence, feedback bias, and sunk costs. The percentage of responders who identified a given error as being among the five most important is given in Table 2. Throughout the modified Delphi process, the faculty overwhelmingly affirmed their perception that these kinds of thought-process errors occur frequently and are likely to contribute to actual error behaviours and, ultimately, adverse outcomes. Certain items were noted to be less likely, and they involved patient behaviours. Psych-out error and visceral bias both depend largely on dynamic interactions with awake patients, so it is not surprising that they figured less prominently in the final catalogue, yet they were identified by our faculty members who specialize in obstetric anaesthesiology or critical care environments, including the post-anaesthesia recovery unit.

Cognitive errors chosen by more than 25% of respondents, for a total of 10 items, were selected for the pilot study. However, specifically for our simulation-based portion of the study, we omitted feedback bias, since this kind of error requires a prolonged time lapse. Thus, nine of the errors from Table 2 were assessed in the second part of the study: anchoring, availability bias, omission bias, commission bias, premature closure, confirmation bias, framing effect, overconfidence, and sunk costs.

Free-text responses by faculty overwhelmingly affirmed the importance of cognitive errors in anaesthesiology practice. Almost all responders indicated that listing only five was difficult because they all occur with significant frequency. Several responders suggested that the most important errors might vary based upon experience, with some more likely to occur among trainees and others more likely among experienced clinicians. As an example, availability bias was suggested to be more likely among experienced anaesthesiologists, since they would have more emotionally memorable experiences from which to draw. Overconfidence was also suggested to be more likely among experienced anaesthesiologists, while omission bias was suggested to be more likely among less experienced clinicians. No responder indicated that cognitive errors were unimportant or did not significantly contribute to medical errors.

Table 2 Important cognitive errors as perceived by anaesthesiology faculty. *This column represents the number of faculty, out of the 32 responders, who identified this error as being among the five most important to anaesthesiology practice

Cognitive error: Faculty selection [% (n)]*
Anchoring: 84.4 (27)
Availability bias: 53.1 (17)
Premature closure: 46.9 (15)
Feedback bias: 46.9 (15)
Confirmation bias: 40.6 (13)
Framing effect: 40.6 (13)
Commission bias: 31.3 (10)
Overconfidence bias: 31.3 (10)
Omission bias: 28.1 (9)
Sunk costs: 25 (8)
Visceral bias: 12.5 (4)
Zebra retreat: 12.5 (4)
Unpacking principle: 6.25 (2)
Psych-out error: 6.25 (2)

Observation of cognitive errors

Thirty-two resident physicians (PGY-2, 16; PGY-3, 8; PGY-4, 8) consented to participate in this pilot study. Over a period of 5 months, 38 simulated encounters were observed in real time. As is our standard practice, these encounters are also recorded; these recordings were available to study personnel if needed. Six resident physicians participated in simulation training twice over the course of the data collection, and one resident physician was scheduled three times. However, no resident physician participated in the same scenario more than once, and no education about cognitive errors was presented; thus, repeat participants had neither an advantage nor a disadvantage compared with other participants.

Table 3 lists the nine cognitive errors observed during the 38 simulation encounters, along with the corresponding prevalence data. By far, our faculty felt that anchoring was the most common error, followed by availability bias, premature closure, and feedback bias. These are errors that have been frequently identified in other fields of medicine. Seven of the nine errors we studied occurred in at least half of the simulation sessions.





Table 3 List and prevalence of cognitive errors observed during 38 anaesthesiology simulation sessions

Cognitive error: Frequency observed [% (n)]
Premature closure: 79.5 (31)
Confirmation bias: 76.9 (30)
Sunk costs: 66.7 (27)
Commission bias: 66.7 (27)
Omission bias: 61.5 (24)
Anchoring: 61.5 (24)
Overconfidence: 53.8 (21)
Framing effect: 23.7 (9)
Availability bias: 7.8 (3)


Discussion

To our knowledge, this is the first study to identify cognitive errors specific to anaesthesiology practice and to observe them prospectively. We were able to identify 10 key errors perceived to be especially pertinent to anaesthesiology practice, and were then able to observe their occurrence consistently and frequently in simulated anaesthesia scenarios. We believe that these are likely to occur in real clinical anaesthesia scenarios as well, as they do in other fields of medicine. This application of cognitive psychology is important because it follows logically that errors of thought lead to errors of behaviour, and consequently contribute to morbidity and mortality.

The observations during our pilot study were consistent with the high ratings of our faculty; premature closure and confirmation bias were the most commonly observed cognitive errors in simulated scenarios. Framing effect and availability bias were observed the least. In fact, errors perceived by faculty to be important to anaesthesiology were indeed observed frequently among trainees in a simulated environment.

Some items that were scored very highly by anaesthetists were observed relatively infrequently in simulation, and vice versa. As a general phenomenon, there are a few possible explanations for this. First, faculty were asked which errors they believed were most pertinent or important, not necessarily which were most frequent. Next, the Delphi process involved perceptions of errors, but those perceptions are likely based upon observational experience alone, without the benefit of cognitive mechanism debriefing. In contrast, during the simulated exercises, our raters were able to inquire as to thought process and identify errors with the added insight into cognitive mechanisms. Another possible explanation is that our faculty were asked to reflect on their past experience and identify the errors they thought were most important, while the investigators in the simulation portion were looking prospectively and had the assistance of prompts from the cognitive error survey. Additionally, it is possible that our faculty strongly identified with errors they felt were easier to understand and relate to, and that unfamiliarity with the others biased their responses. However, since most errors in our catalogue were observed in simulation, results from both parts of the study are likely to represent an important first look at cognitive errors in anaesthesiology practice.

Identification of the most common cognitive errors is crucial to developing appropriate training strategies for management and prevention. With this catalogue as a starting point, more research should be performed to confirm the importance of these errors and to test error prevention and recovery methods. Although a thorough treatment of metacognition and strategies for cognitive error prevention and recovery is beyond the scope of this manuscript, it deserves some brief mention here. Put simply, metacognition is 'thinking about thinking', and studies of metacognitive training have demonstrated improved decision-making processes and decision outcomes.22 Basic features of metacognitive practice include: recognition of the limitations of memory, the ability to reflect on and appreciate a broader perspective, a good capacity for self-critique and harnessing overconfidence, and the ability to select specific strategies for best decision-making.19 Practitioners should perform deliberate self-checks, asking the questions: 'Am I using the best decision-making strategy right now?' and 'Am I relying too heavily on pattern recognition or bias?'. While some urgent situations require immediate action, more accurate diagnosis and appropriate therapy may result from active generation of alternative interpretations of evidence and identification of hidden assumptions when time permits. Physicians should be deliberate in deciding when it is appropriate to 'think more' or 'think differently'.22
A limitation of this consensus is that it was based on the
practice of faculty at a single large academic centre involving
supervision of trainees. It is possible that perceived frequencies and relative importance of various errors would vary with
type of practice and level of experience. Further, regarding
the simulation portion of the study, due to our relatively
small sample size and the diversity of simulated encounters,
we cannot comment on the potential relationships between
years of training, types or prevalence of error, and clinical
outcome of simulated emergencies. Finally, it is not
known whether these same errors occur at similar rates
among experienced anaesthesiologists, nor is it known
whether these errors occur at similar rates during real clinical
emergencies. These are all areas that warrant further study.
The concept of cognitive errors is relatively new to the medical literature, and relatively novel to the anaesthesiology literature. As such, there is a large amount of work still to be done. A larger study of anaesthesiologists' opinions on cognitive errors should be done to include those in other practice models. Further, studies that seek to stratify by novice, intermediate, and experienced anaesthesiologists would help identify whether certain errors are more prevalent among certain groups. Additionally, the observation of these errors in real clinical events should be attempted, although this would present several challenges (self-reporting is generally poor, and it would be unethical to observe errors without intervening). As noted, we did not study any error prevention or recovery strategies, but simply observed for the occurrence of these errors. However, the applied psychology concepts of metacognition and de-biasing strategies have been demonstrated in other fields to be effective in cognitive error prevention and recovery.18 23 With common cognitive errors in anaesthesiology identified, specific strategies to minimize them can be developed and disseminated. Ultimately, studies addressing the effectiveness of metacognitive training and de-biasing strategies in preventing these errors among anaesthesiologists must be performed.

We advocate that anaesthesiologists must have insight into their own decision-making processes and deliberately abandon intuitive reasoning for an analytic approach when necessary. To achieve this, educational training in cognitive errors, metacognition, and de-biasing strategies is needed. However, there are still many questions regarding which errors are most important to address and which metacognitive strategies are most appropriate and effective in anaesthesiology. Further research in this area is needed to reduce decision-making errors and improve patient safety.


Acknowledgements
The authors wish to thank the UCLA Simulation Center faculty
and staff for their participation and support of this project. The
authors also wish to thank Dr Sebastian Uijtdehaage and Dr
Craig Fox for their contributions to this manuscript.

Declaration of interest
None declared.

Funding
Support for this paper was provided solely from institutional/departmental resources (David Geffen School of Medicine at UCLA and the Department of Anesthesiology at UCLA).

References
1 Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington, DC: Institute of Medicine, National Academy Press, 1999
2 Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002; 9: 1184–204
3 Mamede S, Van Gog T, Van Den Berge K, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. J Am Med Assoc 2010; 304: 1198–203
4 Tokuda Y, Kishida N, Konishi R, Koizumi S. Cognitive error as the most frequent contributory factor in cases of medical injury: a study on verdicts' judgment among closed claims in Japan. J Hosp Med 2011; 6: 109–14
5 Groopman J. How Doctors Think. Boston, MA: Houghton Mifflin, 2008
6 NEA Issue Brief: an analysis of principal nuclear issues. Report No. 2, January 1998
7 Malakis S, Kontogiannis T, Kirwan B. Managing emergencies and abnormal situations in air traffic control (part II): teamwork strategies. Appl Ergon 2010; 41: 628–35
8 Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med 2008; 121: S2–23
9 Borrell-Carrio F, Epstein RM. Preventing errors in clinical practice: a call for self-awareness. Ann Fam Med 2004; 2: 310–6
10 Fandel TM, Pfnur M, Schafer SC, et al. Do we truly see what we think we see? The role of cognitive bias in pathological interpretation. J Pathol 2008; 216: 193–200
11 Garber MB. Diagnostic imaging and differential diagnosis in 2 case reports. J Orthop Sports Phys Ther 2005; 35: 745–54
12 Harasym PH, Tsai TC, Hemmati P. Current trends in developing medical students' critical thinking abilities. Kaohsiung J Med Sci 2008; 24: 341–55
13 Kempainen RR, Migeon MB, Wolf FM. Understanding our mistakes: a primer on errors in clinical reasoning. Med Teach 2003; 25: 177–81
14 Pescarini L, Inches I. Systematic approach to human error in radiology. Radiol Med 2006; 111: 252–67
15 Vickrey BG, Samuels MA, Ropper AH. How neurologists think: a cognitive psychology perspective on missed diagnoses. Ann Neurol 2010; 67: 425–33
16 Holzman RS, Cooper JB, Gaba DM, Philip JH, Small SD, Feinstein D. Anesthesia crisis resource management: real-life simulation training in operating room crises. J Clin Anesth 1995; 7: 675–87
17 Howard SK, Gaba DM, Fish KJ, Yang G, Sarnquist FH. Anesthesia crisis resource management training: teaching anesthesiologists to handle critical incidents. Aviat Space Environ Med 1992; 63: 763–70
18 Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003; 78: 775–80
19 Croskerry P. Cognitive forcing strategies in clinical decision-making. Ann Emerg Med 2003; 41: 110–20
20 Redelmeier DA, Shafir E, Aujla PS. The beguiling pursuit of more information. Med Decis Making 2001; 21: 376–81
21 Stiegler MP. Anesthesiologists' Nontechnical and Cognitive Skills Evaluation Tool. UC Los Angeles: Center for Educational Development and Research, 2010
22 Flin R, O'Connor P, Crichton M. Safety at the Sharp End: A Guide to Non-Technical Skills. Aldershot: Ashgate, 2008
23 Bond WF, Deitrick LM, Arnold DC, et al. Using simulation to instruct emergency medicine residents in cognitive forcing strategies. Acad Med 2004; 79: 438–46


