
Protected Clinical Teaching Time and a Bedside Clinical Evaluation Instrument in an Emergency Medicine Training Program

Philip Shayne, MD, Katherine Heilpern, MD, Douglas Ander, MD, Victoria Palmer-Smith, MD, for the Emory University Department of Emergency Medicine Education Committee

From the Department of Emergency Medicine, Emory University School of Medicine, Atlanta, GA (PS, KH, DA, VP). Received December 17, 2001; revision received May 1, 2002; accepted May 13, 2002. Address for correspondence and reprints: Philip Shayne, MD, 69 Butler Street, SE, Atlanta, GA 30303. Fax: 404-616-0191; e-mail: pshayne@emory.edu.
Abstract
In a process that has evolved over the last four years, the Emory University Emergency Medicine Education Committee has developed an academic attending teaching shift incorporating a formatted lecture series with a clinical evaluation exercise (CEE). The program structures the approach to clinical teaching at the bedside, provides an objective clinical evaluation tool specific to emergency medicine residents, and provides targeted learning for medical students and residents rotating in the emergency department (ED). The CEE instrument was designed to be quick and efficient, satisfy requirements of assessment of the Accreditation Council for Graduate Medical Education (ACGME) general competencies, and incorporate the language of the Model of the Clinical Practice of Emergency Medicine. The original program called for unstructured bedside teaching three days a week, by faculty freed from clinical duties, combined with a limited series of introductory emergency medicine lectures. The program proved more successful when concentrated in a once-weekly structured educational program. The prepared, repeating lecture series has been expanded to include many of the most common ED presenting chief complaints and has significantly advanced a curriculum for medical students and visiting interns. A CEE was developed to evaluate and provide immediate feedback to residents on many of the core ACGME competencies. The CEE has been successfully used to structure the bedside educational encounter. This dedicated non-clinical teaching shift appears effective in meeting the educational goals of the authors' academic ED. This is a description of the program and its evolution; the program has not been formally evaluated. Key words: graduate medical education; clinical skills; teaching; clinical competency; emergency medicine; resident. ACADEMIC EMERGENCY MEDICINE 2002; 9:1342–1349.

In a process that has evolved over the last four years, the Emory University Emergency Medicine Education Committee has developed an academic attending teaching shift combined with a clinical evaluation exercise (CEE). From the outset, the program goals were to formalize an approach to didactic teaching and bedside clinical teaching, while providing an assessment tool for measuring resident clinical competency. In an era of high clinical burdens at academic centers, we intended to provide targeted learning and evaluation of our EM residents, rotating residents, and medical students. Initially, the program included a series of didactic lectures designed for medical students and rotating residents. The bedside component was an unstructured attending teaching shift with no clinical responsibilities. Our assessment of that program, increased clinical workloads, implementation of the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project,1 and the Model of the Clinical Practice of Emergency Medicine,2 all resulted in the evolution of the program to a more streamlined and efficient process with more clearly defined goals.
The ACGME has developed a vision of enhancing medical education by focusing on outcome assessment in residency education. The ACGME Outcome Project1 has developed six general competencies that physicians should possess at the end of their training program. The six general competencies endorsed by the ACGME are in the areas of 1) patient care, 2) medical knowledge, 3) practice-based learning and improvement, 4) interpersonal and communication skills, 5) professionalism, and 6) systems-based practice. A detailed discussion of these general competencies is available on the ACGME website at www.acgme.org. The residency review committees are currently integrating the language of the general competencies into the program requirements for each specialty as Phase One of the Outcome Project. The objective of Phase Two of the Outcome Project is the implementation of assessment tools by training programs, in order to demonstrate successful mastery of the general competencies during residency. Our program addresses Phase Two of the Outcome Project. Future phases will address validation of these methods.
To assist the educators, the ACGME has published a master "Toolbox of Assessment Methods,"3
a list of possible evaluation instruments. Each evaluation tool is ranked by its ability to adequately
assess individual competencies. Training programs
are being tasked with identifying the tools that are
appropriate for their specialty training, practice environment, and faculty talent and constraints. Little
objective data exist to suggest that any specific
training tool or evaluation method is adequate to
demonstrate mastery of the general competencies.
When designing our structured bedside evaluation tool, we combined elements from the ACGME toolbox: "checklist evaluation of live or recorded performance" and "global rating of live or recorded performance." These types of evaluation tools are
recognized by ACGME as valuable in assessing
components of five out of six general competencies.

METHODS
Study Design. This was a descriptive study of the
educational innovation and its evolution over the
last four years. Consistent with Phase Two of
the ACGME Outcome Project, we do not yet have
the data to demonstrate the validity of our program.
Study Setting and Population. Our major clinical
teaching site, Grady Memorial Hospital, has an annual emergency department (ED) volume of 105,000 high-acuity adult patient visits and 55,000 pediatric visits, plus an ambulatory clinic seeing 45,000 less acutely ill adults. The program described was instituted in the adult ED, known as the emergency care center (ECC). In an already challenging environment, compliance with HCFA (Health Care Financing Administration, now CMS, the Centers for Medicare and Medicaid Services) guidelines created an additional challenge to our bedside teaching mission. The restrictive chart documentation
policies implemented by Medicare in 1996 impacted our bedside teaching responsibilities by
adding a significant amount of documentation
time, thereby reducing time for bedside teaching.
In response to this challenge, in the fall of 1996, the
emergency medicine faculty agreed that a program
of scheduled, non-clinical teaching time for resident
and medical student education was a departmental
priority. The academic attending pilot program began in January 1997. On a rotating basis, all emergency medicine faculty were assigned an additional four hours of non-clinical teaching time in
the Grady ECC. This program occurred three times
per week, and each faculty member served as the
academic attending approximately four times per
year.
Demonstration Project. The core education faculty
developed a standing series of lectures and skills
labs. The lectures were designed to be case-based,
interactive, and oriented toward common ED presentations. Nine lectures and four skills laboratories
centered on basic concepts in emergency medicine
were developed (Table 1). The lectures were repeated on a one-month cycle, following the clinical
rotation schedule of medical students and visiting
residents from non-emergency medicine training
programs. One responsibility of the academic attending was to deliver the lecture in a small group
interactive format. The emergency medicine interns
attended the same lecture series during their first
rotation in the ECC. All participants were released
from clinical duties to attend.
Following the lecture or skills laboratory, the academic attending spent three hours in the ECC performing bedside teaching. The faculty would approach a resident to discuss an active case,
providing a greater depth of discussion and understanding than could usually be afforded in a busy
clinical environment. Faculty were expected to enter an evaluation of these residents using our Web-based clinical evaluation system. This evaluation
contains six categories (clinical skills, fund of knowledge, clinical judgment, interpersonal skills, professionalism, and procedural skills), each scored from "poor" to "excellent" on a ten-point scale.

TABLE 1. Lectures and Skills Laboratories
Emergency Medicine Lecture Series
The Approach to Advanced Airway Management: The Emergent Intubation in the ED
The Approach to the Patient with Chest Pain in the ED
The Approach to the Patient with Abdominal Pain in the ED
The Approach to Diabetic Ketoacidosis in the ED
The Approach to the Patient with Hyperthermia in the ED
The Approach to the Patient with Hypothermia in the ED
The Approach to the Patient with Altered Mental Status in the ED
The Approach to the Patient with Trauma in the ED
The Approach to Overdoses in the ED
The Approach to Allergic Emergencies in the ED
Emergency Medicine Skills Labs
Suturing
Splinting and casting
Evaluation of the red eye/Introduction to slit lamp technique
Evaluation of the joints/Low back pain evaluation
ED = emergency department.
Data Analysis. The data from this project are primarily qualitative. As an educational program
without direct impact on patient care, the study
was exempt from institutional review board review.

RESULTS
Evaluation of the academic attending program,
changes in our work environment, and publication
of the ACGME core competencies1 and the Model
of the Clinical Practice of Emergency Medicine2 all
effected an evolution of the program. Changes
included the incorporation of extra lectures for
medical students, development of a focused and
efficient clinical evaluation exercise (CEE), and reduction in the total faculty time commitment.
The initial components of the academic attending shift were the didactic lecture series and the unstructured bedside teaching. We succeeded in getting the faculty as a whole within our department to volunteer for this extra teaching time. However,
even with a large faculty group, three four-hour academic shifts per week proved to be burdensome.
At one point when faculty scheduling was tight, the
academic attending slot was suspended for eight
months. The program resumed when adequate faculty coverage was achieved. While the program was universally accepted and supported as a required part of our
teaching responsibilities, the education faculty felt
it was important to reduce the burden of the program. In response, changes were made to use the
faculty time more efficiently and the program was
condensed to a single six-hour shift per week.
In 2001, the School of Medicine added a mandatory emergency medicine clerkship to the medical student curriculum. To meet this responsibility
and further enhance the students' didactic experience, we created a second series of lectures geared
specifically to medical students (Table 2). The topics
were picked based on the Society for Academic
Emergency Medicine curriculum recommendations.4 We attempted to add topics that were either
not taught in the original didactic series or were
not typically seen by students during the clinical
rotation. The medical student lectures were designed to be taught by the academic attending,
which supplanted an hour of bedside teaching.
The bedside teaching pilot proved to be difficult
for several reasons. Some faculty felt awkward in the ED as bedside teachers without a clear clinical
or administrative role. The time was unstructured,
and accountability was difficult. Because of their intense involvement with a single patient, the academic attending often affected the management of the patient, necessitating chart documentation. This broadened the scope of the attending's involvement to a degree far greater than originally planned. It was also apparent that this teaching role could interfere with the rhythm of the resident's shift. Although one-on-one attention was appreciated and made academic sense, the time taken to discuss a single case in depth in the ECC did not aid patient flow. Finally, the evaluation tool, which was designed for the global assessment of residents by faculty, did not provide the detail or space to accurately capture the type of assessment performed by the academic attending.
These factors have led to the evolution of the current academic attending structure. Since the summer of 2001, the academic attending has been scheduled once a week, on Thursdays, which tend to be less busy. Each faculty member is scheduled as the academic attending once or twice per year for a six-hour non-clinical shift. A sample schedule of that shift is shown in Table 3. The faculty member leads a one-hour core topic discussion for interns and medical students. The topic discussion is followed by a skills laboratory, which is taught by our physician assistants. Next is a medical-student-only lecture, usually conducted by a senior emergency medicine resident on his or her academic rotation and evaluated by the academic attending. This provides senior emergency medicine residents with a valuable opportunity to teach in a small group format. The attending physician provides formal written and oral feedback. We have developed a department-wide evaluation form specifically designed to give feedback on teaching style and oral presentation skills.
TABLE 2. Medical Student Lecture Series
The Approach to Common Pediatric Emergencies
The Approach to Common Obstetric, Gynecologic, and
Genitourinary Emergencies
Injury Control and Domestic Violence Prevention
Arterial Blood Gas Interpretation
ECG Interpretation
ECG = electrocardiogram.

TABLE 3. Academic Attending Schedule
8:00 AM–9:00 AM: Clinical evaluation exercise for EM residents
9:00 AM–10:00 AM: Skills lab for medical students/rotating interns
10:00 AM–11:00 AM: Academic attending lecture
11:00 AM–1:00 PM: Medical student lecture (by senior EM resident); clinical evaluation exercise for EM residents, continued
EM = emergency medicine.



The bedside teaching component has been replaced with the CEE, a separate evaluation form
developed for this purpose (reproduced in part in
Fig. 1). Under this system, each resident is observed
at least twice a year. Our emergency medicine CEE
is an adaptation of the American Board of Internal Medicine's mini-CEX as defined on the ABIM web site.5 Like the ABIM mini-CEX, the CEE is designed to be an efficient "snapshot" of clinical performance, optimally requiring 15–20 minutes. The CEE has been adapted to reflect the unique practice features specific to emergency medicine and was
designed to address several of the ACGME general
core competencies. Much of the language specific
to emergency medicine was adapted from the
Model of the Clinical Practice of Emergency Medicine.2
In addition, our experience led us to develop a
new scoring method for the CEE. Based on a large
volume of data, we found that an assessment of
"poor" to "excellent" on a 1–10-point scale produced tightly clustered scores and poor interrater reliability. Moreover, these scores don't necessarily address the idea of "competency." Nor is it obvious how the resident's level of training is factored in by each faculty member. Our CEE is based on a three-point assessment of competence for each data point: "Below expected: Falls short of reasonable expectations," "Meets expected: Always meets and occasionally exceeds expectations," or "Outstanding: Far exceeds expectations for level of training."
We find this scoring easier to administer and follow. The CEE is designed to follow the expected
flow of a patient encounter, and the assessments are checked off during the observation. It can be performed quickly, quietly, and unobtrusively.
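Although the instrument itself is a paper form (Fig. 1), the scoring scheme lends itself to a simple electronic record for programs that wish to tally CEE results. The sketch below is purely illustrative and is not part of the published instrument; the class names, item wordings, and the rate() helper are hypothetical, and the actual CEE items are those reproduced in Figure 1.

from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Rating(Enum):
    # Three-point competency assessment described above
    BELOW_EXPECTED = "Below expected: Falls short of reasonable expectations"
    MEETS_EXPECTED = "Meets expected: Always meets and occasionally exceeds expectations"
    OUTSTANDING = "Outstanding: Far exceeds expectations for level of training"


@dataclass
class CEEItem:
    # A single observable data point, checked off during the encounter
    description: str                 # hypothetical wording, e.g., "History taking"
    rating: Optional[Rating] = None  # left unset until observed
    comment: str = ""


@dataclass
class CEEEncounter:
    # One observed resident-patient encounter (a 15-20 minute snapshot)
    resident: str
    observer: str
    items: List[CEEItem] = field(default_factory=list)
    resident_comments: str = ""      # residents may add comments when signing off

    def rate(self, description: str, rating: Rating, comment: str = "") -> None:
        self.items.append(CEEItem(description, rating, comment))


# Example use: items follow the expected flow of a patient encounter.
encounter = CEEEncounter(resident="PGY-2 resident", observer="Academic attending")
encounter.rate("History taking", Rating.MEETS_EXPECTED)
encounter.rate("Physical examination", Rating.OUTSTANDING, "Thorough and focused")
encounter.rate("Case presentation and management plan", Rating.MEETS_EXPECTED)

Storing the verbatim anchor phrases with each rating preserves the meaning of the three-point scale when results are reviewed later.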
The academic attending is pre-assigned two residents on duty to evaluate during the teaching shift. The attending silently observes a fresh patient-resident encounter, and the case, with a management plan, is then presented to the academic attending. The attending provides immediate structured feedback on the encounter in the
form of the CEE to the resident. The resident signs
off on the CEE as acknowledgement, and has the
option of adding additional comments.

Figure 1. The clinical evaluation exercise.

Based on our experience thus far, the entire CEE encounter can be accomplished in 15 to 30 minutes,
slightly longer than expected. It is performed in
parallel with the service activities of the resident
without causing significant disruption to the normal ECC flow. The academic attending shift ends
when the assigned residents' CEEs are completed.
We have found the new assessment measurement
very easy to interpret.
To date, the restructured academic attending program is progressing well. The weekly academic attending shift, assigned once or twice per year to
each full-time emergency medicine faculty member,
has become integral to the department's training
program. There are a number of advantages to this
structured educational program. We provide faculty-led didactic teaching to non-emergency medicine residents and medical students during their
one-month clinical rotation. The lecture series has
been popular and highly rated since its inception in 1997. The adoption of the entire lecture and skills
laboratory components into the medical student
elective curriculum was instrumental in convincing
the medical school to incorporate a mandatory
emergency medicine rotation into the undergraduate curriculum. All senior emergency medicine residents teach actively in this program and receive
immediate written and oral feedback from the academic attending in the audience. Several senior
residents have been inspired by the teaching opportunity to consider careers in academic medicine.
The CEE has yet to be formally evaluated, and
we can only offer our consensus opinion. It appears
that this exercise is far superior to the unstructured
bedside teaching model we previously used. The
CEE provides a specific role for the academic attending in the ECC, and the structured evaluation
form provides the instructor with a template for
feedback and teaching. Dedicated observation of
our residents at the bedside has provided us with
unique insight into their behaviors as clinicians and
professionals. The CEE has not proved disruptive
or burdensome to the residents in the ECC.
The biggest difficulty has been capturing the resident on the way in to see a new patient, so that
the entire initial encounter can be observed. Several
faculty have stayed beyond scheduled hours in order to complete their evaluations. As a result, we
reduced the schedule from three to two CEEs per
academic attending shift. Our goal is to document
two clinical observations per resident per year. This
goal has been met to date.
We would expect that there is some change in the residents' behavior when they are being observed during a clinical encounter. Despite this "Hawthorne effect,"6 it is our observation that residents rapidly fall into their usual and comfortable examination patterns in the familiar ECC setting. Deficiencies and idiosyncrasies have been observed that one might have expected would be extinguished during direct observation by a faculty member. This suggests that the "snapshot" evaluation is capable of accurately providing an assessment of the residents' true behaviors. More work
needs to be done to prove the validity of the CEE.
The residents state that the immediate feedback from the CEE is useful and interesting. They do not report feeling threatened or made uncomfortable by the presence of the academic attending during an exam. The feedback from the faculty and residents is that the CEE helps to make the clinical observation reasonably objective. Faculty appreciate the support of the structured assessment and the efficiency with which it can be done. Residents have
expressed satisfaction with the CEE and, although it takes a little extra time, do not feel that it disrupts the rhythm of their workday.

DISCUSSION
This educational program has served several important functions. The adoption and evolution of
an academic attending shift that includes didactics
and bedside teaching have improved the teaching
in our department in an era in which teaching is
difficult. The teaching burden, although still heavy
during a clinical shift, is now distributed equally
among faculty during non-clinical hours. We have
developed a CEE, which provides a thoughtful and
objective assessment of resident performance at the
bedside. It is quick and easy to implement and allows us to meet several ACGME core competency
requirements. The CEE provides focused teaching
time with individual residents, expressing value
and attention to their educational needs. It allows
for protected teaching time without significantly
slowing ECC flow. It provides for immediate face-to-face clinical feedback.
The lecture series has enabled us to develop a
platform to deliver a curriculum specific to the
needs of medical students and interns new to emergency medicine. The lectures highlight the emergency medicine approach to common patient complaints in a case-based format amenable to small
group learning. Having the lectures prepared, or "canned," allows for a consistent approach to be
shared by a large faculty, and for any faculty member to step in and lead the course with minimal
additional preparation. The medical student lectures provide senior residents the opportunity to
teach with faculty guidance and formal feedback.
Directed observation of clinical performance in
EDs has been described previously, and we drew
from the experience of Cydulka et al. at MetroHealth Medical Center in developing the CEE.7
They noted invaluable insight into resident performance, similar to our own experience. Our CEE differs in a number of significant ways. We do not
have the manpower to shadow each resident for
several hours. The CEE is performed on a single
encounter and is meant as an efficient instrument
for obtaining a snapshot of clinical performance.
Our CEE was developed after publication of the
ACGME general competencies and the Model of
the Clinical Practice of Emergency Medicine, and
incorporates language and principles from these
important documents. Other studies involving
emergency medicine faculty have noted that standardized objective evaluations provide better interrater reliability than global assessment scoring,8,9

and we have tried to incorporate that lesson as well.
The CEE clearly addresses several core competencies of the ACGME. It combines elements of two assessment tools: "checklist evaluation of live or recorded events" and "global rating of live or recorded performance."1 Referring to the ACGME's "Suggested Best Methods for Evaluation,"1 the CEE covers assessment of patient care, practice-based learning, interpersonal and communication skills, professionalism, and systems-based practice: five of the six general competencies. This accomplishes much of our requirement for Phase Two of the ACGME Outcome Project: to implement instruments for the assessment of competencies. Phase Three will involve validation testing.

LIMITATIONS
This is a descriptive narrative of a "best practices"
educational advance. We have yet to complete a
formal evaluation of our CEE tool. Objective measurement of this improvement will require interrater
reliability testing and observation of residents at
multiple times during their three years of training.
Long-term longitudinal follow-up of the residents
should provide us with the data to determine
whether this educational tool can detect and help
modify deficiencies. Since this system does not produce quantitative data, it will be difficult to prove
reliability. The evaluation of the individual faculty
member's performance as academic attending requires further development.

CONCLUSIONS
A dedicated non-clinical teaching shift can be effective in meeting the educational goals of an academic emergency department. A prepared, repeating lecture series can be useful to advance a
curriculum for medical students and visiting interns. A clinical evaluation exercise designed specifically for emergency medicine can be used to structure a bedside educational encounter, provide useful information for residents, and begin to address ACGME competency outcomes. Senior emergency medicine residents benefit from teaching a
pre-formatted medical student lecture under guidance. Further work needs to be done to evaluate
the effectiveness and power of this pilot educational tool in meeting the ACGME core competency
requirements.
The authors acknowledge the help of the Education Committee
members: Todd Berger, MD, Leon Haley, MD, MHSA, Sheryl
Heron, MD, MPH, Benjamin Holton, MD, Tammie Quest, MD,
and Varnada Karriem-Norwood, MD.

References
1. Accreditation Council for Graduate Medical Education. ACGME Outcome Project. Website: http://www.acgme.org/outcome/. Copyright 2000, ACGME.
2. Hockberger RS, Binder LS, Graber MA, et al. The Model of the Clinical Practice of Emergency Medicine. Ann Emerg Med. 2001; 37:745–70.
3. Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties. Toolbox of Assessment Methods. Website: http://www.acgme.org/outcome/assess/toolbox.asp. Copyright 2000, ACGME.
4. DeBehnke DJ, Restifo KM, Mahoney JF, Coates WC. Undergraduate curriculum. Acad Emerg Med. 1998; 5:1110–3.
5. American Board of Internal Medicine. Mini-CEX Pilot Project. Website: http://www.abim.org/Minicex/default.htm. Accessed 2001.
6. Hawthorne effect: initial improvement in a process of production caused by the obtrusive observation of that process. The effect was first noticed in the Hawthorne plant of Western Electric. Production increased not as a consequence of actual changes in working conditions introduced by the plant's management but because management demonstrated interest in such improvements. Web Dictionary of Cybernetics and Systems: http://pespmc1.vub.ac.be/asc/indexasc.html. Accessed 2001.
7. Cydulka RK, Emerman CL, Jouriles NJ. Evaluation of resident performance and intensive bedside teaching during direct observation. Acad Emerg Med. 1996; 3:345–51.
8. Burdick WP, Ben-David MF, Swisher L, et al. Reliability of performance-based clinical skill assessment of emergency medicine residents. Acad Emerg Med. 1996; 3:1119–23.
9. Ryan JG, Mandel FS, Sama A, Ward MF. Reliability of faculty clinical evaluations of non-emergency medicine residents during emergency department rotations. Acad Emerg Med. 1996; 3:1124–30.
