
Assessment Zone
Assessment and Evaluation in Health Professional Education
ISSUE 1 2013
COVER PHOTOGRAPHS:
Jodie Atkinson
FRONT ROW (from left): Deanka Preston, Kate Gray, Jodie Atkinson
MIDDLE ROW (from left): Simone Thurkle, Debra Jeavons, Lubna Al-Hasani,
Sandy Presland, Alison Creagh
BACK ROW (from left): Christopher Etherton-Beer, Elisabeth Feher
Apply now for postgraduate study.
Are you a health professional interested in developing or improving
your skills as an educator?
To meet the growing need for educators in the health professions,
The University of Western Australia's Faculty of Medicine, Dentistry
and Health Sciences is offering the following postgraduate
courses in 2014:
• Graduate Certificate in Health Professional Education
• Graduate Diploma in Health Professional Education
• Master of Health Professional Education (by thesis and coursework, or by coursework and dissertation)
These courses can be completed full-time or part-time.
Visit meddent.uwa.edu.au/healthedu or come to the course information session on Monday 11 November, 4.30 to 5.30pm.
For more information phone Caroline Martin on 6488 6881.
They're learning to become top
health professions educators.
What would you like to achieve?
CRICOS Provider Code 00126G
Contents
Editorial. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Profiles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Features
Studying at UWA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Evaluation metaphors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
When assessment goes wrong: horror stories! . . . . . . . . . . . . . . . . 11
Blueprinting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Social media . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Does your OSCE make the cut?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Direct observation checklists . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Assessment on the go . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Interviews
Dealing with difficult students (Christine Adams) . . . . . . . . . . . . . . . 12
Different perspectives on midwifery education . . . . . . . . . . . . . . . . 22
(Sadie Geraghty, Sara Bayes and Dianne Bloxsome)
Assessing physicians of tomorrow (Claire Harma) . . . . . . . . . . . . . . . 36
Learning issues
Assessing clinical competency. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Making sense of portfolios. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Assessing registered nurse performance. . . . . . . . . . . . . . . . . . . . . . 49
Puzzle page. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Crossword . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Useful websites and resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Quiz and crossword solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
The authors acknowledge the Traditional Owners of the land on which we work and live.
Informed consent was obtained from each health professional interviewed for this magazine.
Copyright the authors
Assessment Zone is licensed under a Creative Commons Attribution-Noncommercial-
No Derivative Works 2.5 Australia License (2013).
Contact: Associate Professor Zarrin Siddiqui (zarrin.siddiqui@uwa.edu.au)
Editorial
Welcome to the inaugural issue of
Assessment Zone: Assessment and
Evaluation in Health Professional Education.
The IMED 5802 student group for Semester
2, 2013, are proud to present this magazine
for our Capstone Experience Group Project.
This group project forms a major assessment
for each student undertaking this unit. It is
the culmination of eleven interprofessional
students electing to work collaboratively to
design and compile a magazine featuring
topics related to assessment and evaluation
geared towards health professional educators.
The composition, core ideas and philosophy
for this magazine align with our learning
outcomes for the unit IMED 5802 Principles
of Assessment and Evaluation. These learning
outcomes are:
• Explain the role of assessment in the learning process
• Discuss techniques of measurement
• Explain validity and reliability in assessment and evaluation
• Describe the disadvantages and limitations of different assessment tools
• Design and critique assessment and evaluation tools
• Assess the quality of an examination by looking at its constituent parts
• Explain the principles and models of evaluation
We have also committed to applying and
maintaining the UWA educational principles
throughout the magazine items and production.
The conception of this magazine has not
been without its challenges. We formed our
student group early in the semester, many
of us meeting for the first time and with
varying work and study schedules. With a
timeline in place, we met weekly after class
to garner concepts, ideas, and viewpoints
for the magazine items. Meetings were
documented, and communication external
to the classroom was posted via Facebook,
the UWA Learning Management System and
student email accounts. The magazine name,
design and format were decided upon by a
voting system. We utilised Joomag to view
magazine templates, and the Dropbox online
application as a central workspace to submit
and review individual student creations. We
completed the process by accessing the
expertise of Uniprint to assist with formatting,
design and publication of the final product.
We would like to acknowledge Associate
Professor Zarrin Siddiqui, unit coordinator, for
providing direction to us as required.
On behalf of the student group, I hope this
magazine resonates with you and your role
as an educator, and that you benefit from the
personal stories, feature articles, interviews,
and resource tools that we have produced.
Deanka Preston
Profiles
Jodie Atkinson is a
Registered Midwife and
Nurse. She currently works
as a Clinical Facilitator at
King Edward Memorial
Hospital and Edith Cowan
University, supervising
both undergraduate and
Masters Midwifery students.
Additionally she works as
a sessional lecturer, tutor
and research assistant at
ECU. Jodie is completing
her Masters in Health
Professional Education.
Alison Creagh has loved
medical teaching since 1979,
and has been employed as
Medical Educator at Family
Planning WA since 2000. She
has also worked for many
years in general practice
and women's health. Alison
is completing her Masters
in Health Professional
Education.
Christopher Etherton-Beer
is a Geriatrician and Clinical
Pharmacologist at The
University of Western
Australia and Royal Perth
Hospital. Christopher is
completing the Graduate
Certificate in Health
Professional Education.
Elisabeth Feher is a Medical
Education Registrar at SCGH
and Advanced Trainee in
Geriatric Medicine. Her
current educational projects
include running an education
program for international
medical graduates and
introducing simulation into
Basic Physician Training.
Elisabeth is completing the
Graduate Certificate in Health
Professional Education.
Kate Gray is a
Physiotherapy Coordinator
of Medical and Student
Services at Joondalup Health
Campus. She is completing the
Graduate Certificate in Health
Professional Education.
Debra Jeavons is a Staff Development Midwife responsible for the clinical experiences of student nurses and midwives (undergraduate and postgraduate) and graduate midwives, and provides professional development and orientation for all staff. She facilitates interprofessional obstetric education within the health service. Debra is completing the Graduate Certificate in Health Professional Education.
Janet Vince is a Staff Development Educator at Swan Kalamunda Health Service, with an extensive background in critical care, specialising in ICU. She has experience in nursing education and is presently working on the development of the interprofessional education project for Fiona Stanley Hospital. Janet is completing the Graduate Certificate in Health Professional Education.
Lubna Al-Hasani is an international full-time student in the Master of Health Professional Education at The University of Western Australia. She is Head of Training and Staff Development at the Royal Hospital, Muscat, Oman.
Sandy Presland is a Clinical Nurse in the Recovery Room at Fremantle Hospital. Her responsibilities include clinical supervision and team leadership of undergraduate and postgraduate nurses. Her career direction is nursing education and leadership. Sandy is completing the Graduate Certificate in Health Professional Education.
Deanka Preston has been
a Registered Nurse for 21
years and in the last 10
years has worked in the
staff development area
in metropolitan teaching
hospitals. She is currently
Acting Professional
Development Educator
for the Graduate Nurse
Program at Swan Kalamunda
Health Service. Deanka is
completing her Graduate
Certificate in Health
Professional Education.
Simone Thurkle is a Staff
Development Nurse at SCGH
Emergency Department.
She has completed a Post
Graduate Certificate in
Emergency Nursing and
is currently completing a
Graduate Certificate in Health
Professional Education.
Studying at UWA
By Lubna Al-Hasani
As an international postgraduate
student at The University of Western
Australia (UWA), my experience has been
enriched both personally and professionally.
Coming from a different background has made
me value the experience from a different
perspective. Also, embracing the cultural
differences has led me to critically analyse
and adopt the best practices.
The first thing about Australia, which I found
different from my country, was calling people
by their first name. I still struggle doing this,
even in emails.
Also, being a student with a different
background (both cultural and religious)
was difficult at the beginning. But unlike in
other countries, I felt welcomed not only in
the university, but in the community too.
However, I do find it difficult to get
Halal food on the university campus, but it is
plentiful in the community.
In terms of professional growth, I have found
the university environment collaborative and
stimulating. UWA is full of resources
to assist students in their studies, such as
the Study Smarter sessions. Nevertheless,
although they are very helpful, I think UWA
should increase the number of tutors in
sessions like Write Smarter, to cover as
many students as they can. At my previous
university, more tutors were available for
assistance, and for longer.
At UWA the teaching methods are varied:
face-to-face lectures, workshops,
and online modules. This integrated
curriculum creates a comprehensive and
fruitful learning experience. Just being in a
setting amongst a group of diverse health
care professionals with different backgrounds
and experiences is a valuable asset. Besides,
the lecturers are very supportive and
cooperative, helping the students to reach
their potential in professional growth.
The concept UWA is adopting in its teaching,
is to create future professionals who are great
team players. Most of the assignments are
performed in teams and it is this approach
that leads to professional maturity.
Moreover, I have been exposed to several
forms of formal and informal assessments.
Examples of these assessments are
assignments, student led presentations
and reflections. Interviewing professionals
from the real working environment and
critically analysing their organization is
another assessment example. I believe
that the assessments, even though they
take me out of my comfort zone, have
been helpful in discovering my capabilities
and, therefore, in acknowledging my
limitations and improving on them.
Assessments at postgraduate level are
more in-depth, comprehensive and based
on evidence-based practice. This promotes
critical thinking and develops practical
implications in the real world. Also, receiving
feedback in the middle of the units demonstrates
the transparency of the university, as it is
open to constructive feedback.
Going through this experience is not an
easy path, and I have found it a difficult,
yet an exciting experience. However, it is
the difficulties that make a person and
shape their future. This is an extraordinary
experience and it will assist me in my
journey as an educator and a health care
professional.
Evaluation metaphors: student thoughts, experiences and perceptions explained with the help of a metaphor
Learning to drive
by Sandy Presland
From the day my daughter sat behind the wheel of a car, it was not just about starting the
car and going forward in a straight line. There were road conditions to evaluate such as
traffic hazards, road works and pedestrians. We had to consider the timing of the lesson and
readiness to learn on my daughter's part. At the end of each learning session we would look
back and assess how she went, then analyse her progress. From there we would plan the next
lesson and make any changes to the teaching plan, such as practicing reverse parking or perfecting
parallel parking.
As my daughter's driving skills improved, we would give her more challenges to keep her engaged
and increase her skill. But there came a point where she reached a plateau and was not progressing.
Evaluating our teaching abilities, we realised that, as parents, we were not the best teachers. So
we brought in expert teaching skills in the form of a driving instructor. With access to improved
teaching tools, my daughter went on to pass her driving assessment.
When I supervise students and junior nurses, it's a little like learning to drive. They will often joke
that they are on their L plates. We take them on a learning journey letting them steer the wheel,
but if we do not evaluate the teaching tools used along the way, they may not progress as well.
Poor performance can be the result of an inadequate teaching program. Evaluating the teaching is
just as important as assessing the learning.
The perfect pup
By Kate Gray
One day my husband and I set out to find the perfect pup. The first thing we had to do was to research breeds that were appropriate. We assessed a variety of types in terms of size, type of hair, ease of training and temperament. After narrowing the search down, much to my husband's surprise, a toy poodle was the winner! The next thing we had to do was to assess the best place to purchase a puppy from: breeders versus pet shops. We received feedback from family and friends and came to the conclusion that a breeder would better suit our needs. Finally, when we found a breeder that suited, we had to assess which particular dog would suit us best. We looked from dog to dog and played with them one by one, spending time with them to make our decision. After playing with the hyperactive one, the yappy one, the sulky one... we found him! The little fluffy, quiet but happy and playful one: Lenny.
After a few sleepless nights, cleaning up a few surprises, puppy school and park excursions, my husband and I reflected on our purchase and how our lives had changed. We evaluated the whole experience and wholeheartedly came to the conclusion that getting Lenny was one of the best things we had done!
I consider selection of students' supervisors an important task. I look at the supervisor's experience, training and motivation and assess their ability to adequately support the student's learning. After the placement has finished, I reflect on the student's experience, the feedback they received and their final assessment results. I receive feedback specifically from the student and the supervisor and, after considering all of this information, I can evaluate the effectiveness of our student program. The ultimate outcome is that hopefully we have positively shaped the student into becoming a competent and confident health professional.
My Daughter's Journey
by Janet Vince
Two years ago my daughter decided
to move schools. She was unhappy
where she was and had some ideas
as to where she wanted to go. Together we
looked at why she wanted to move to make
some objective decisions.
Having identified various points, we listed the
qualities we wanted to see in the schools
we were to look into. From here we began
our research.
We searched the internet, contacted various
schools to review their prospectuses, liaised
with students and parents from various
schools, and finally selected a few that
we wanted to visit.
From here we were able to make a formative
assessment of what each school could
offer, guiding our choice. As I evaluated the
choice we made, I based it on continuous
assessment of my daughter's performance
and happiness.
When planning my education programs, I
am aware that everyone has a choice, just
as we had. If I don't continuously assess what I am providing and change according to feedback, my evaluation will reflect a lack of interest and engagement in the programs. I value feedback through surveys, questionnaires, evaluation of study sessions and word of mouth. This gives me vital information to change and adapt according to students' wants and needs. My evaluation is the success of overall outcomes.
Assessment horror stories
by Jodie Atkinson, Alison Creagh and Debra Jeavons
Assessment can be a terrifying experience for both the assessor and the assessee. Our group reflected on their own horrifying experiences of the past. Read on if you dare! What we recommend NOT doing to your students...
The viva is my own and several of my colleagues' biggest horror story. Different students get different assessors. In my case the assessor interacted with me and argued about one of my answers, when she was not meant to interact at all. This got me off track during the exam and meant that I did not do as well as I could have. My colleague had a similar experience, but in her case she knew the answer the assessor wanted, and which would have given her a pass, but that is not what she wanted to say or what she believed. Another colleague got so distressed by an oral exam that she froze, got flustered and failed, even though she knew the correct responses. Several assessments are done at one time with the same question; however, each assessor has their own individual philosophies and interpretation of the question. Students were aware of which assessors marked easier or higher than others. You knew that if you got certain assessors you would receive a lower mark even when giving the same answers. This assessment appears to have caused the most distress.
My most awful, and instructive, experience was at the end of my medical degree, when everything (back then) depended on passing the final exams, of which there were quite a number, both clinical and written. I saw a high proportion of my fellow students falling apart in a variety of ways. Some used lots of alcohol, others took sedatives, some others took illicit substances, and some became extremely anxious and depressed, including one who was admitted to hospital. I did learn from this: personally, my coping skills included taking a fatalist approach ('I will do my best, and if that's not enough, the worst thing that can happen is to have to repeat'); professionally I learnt, amongst other things, that continuous assessment would be better!
My horror story is about a friend of mine. She was a final year student and on prac at a hospital. She had thought a comment she had heard from a previous student, saying they would rather scrub toilets, was an exaggeration, until she experienced it for herself. There were two supervisors. They would contradict what the other one said, and even contradict their own advice the following day. They gave mainly negative feedback. From a confident, happy person, my friend became very stressed and dreaded each day, worrying that she would fail and therefore be unable to graduate. She was not told until the last day that she had passed, subject to a special consideration form from uni. Considering she got a high distinction for every other prac before and afterwards, it must say something about the experience! There were two things she learnt from this: first, that positive feedback (or at least constructive feedback) and a supportive learning environment are what students need to flourish; secondly, how not to treat students in the future. (By the way, this happened recently, not a long time ago.)

Dealing with difficult students
An interview with Christine Adams
Christine Adams is currently a Clinical Placements Officer and Lecturer for the Notre Dame University Undergraduate Nursing Program. She also works clinically as a registered nurse in a busy post-anaesthetic care unit. She previously held a position at Curtin University as a Clinical Facilitator for undergraduate nursing students.
Christine, we are writing a magazine
on assessment and evaluation in Health
Professional Education. Can you explain
to our readers your current role in
assessment and the learning process?
Primarily my role involves lecturing and
supervision of undergraduate nursing
students. I am also involved in problem
solving particular cases on site that are
difficult for our facilitating staff. I am generally
contacted when a student is struggling with
their learning objectives on practicum, or
more importantly if there has been a near
miss event such as a potential drug error.
Also I am contacted by clinical facilitators to
deal with student behaviour problems and
students failing to attend practicum or arriving
late. I generally go out
to speak to students
and address issues as
they arise. I rely on the
assessment the clinical
facilitator has made,
and the documentation
provided. I cannot stress enough the
importance of documentation and that we
have evidence of a chain of events, not just
a single record of an incident.
What issues do you experience regarding
underperforming students during clinical
placement?
Generally the issues I'm called out to can be
sorted out on site with a meeting and
developing a learning contract with some
clear learning objectives. Specific goals are
graduated through their practicum, so by mid-
way or end of practicum, the student knows
what they need to achieve. If it's a behavioural
issue, such as an attitude problem, the student
responds very quickly
to the threat of removal
from practicum. I
always keep a record
of the issue, and that
it has been addressed.
This is important in
terms of the student appealing, should they
fail their practicum. Documentation is crucial
because we have students come back and
say they didn't get a visit from anyone, they
weren't aware that there was a problem, or
they don't see why they should be failed.

I have had a couple of recent problems where
students have been involved in near miss
events involving potential administration
of the wrong medication. Any incident
that involves patient safety or an adverse
event means removal of the student from
practicum. I will inform the student that
they are to leave the health site premises
immediately and make an appointment with
me onsite at the university, and that it's
potentially a fail of the practicum and may
involve repeating the unit. This is actually
a legal requirement because as health
professionals we have a duty of care to
protect the safety of our patients.
Is there any advice you can give to our
readers regarding strategies to support
failing students?
In terms of supporting a failing student, or any
student for that matter, the most important
principle is formative assessment. The
assessor must sit down mid-way through the
student's practicum and give feedback on how
they are going, if their learning objectives are
being met and, if not, areas where they can improve. I
think the importance of formative assessment
certainly supports your final decision. At the
end of a practicum when you are grading the
student's summative assessment, you have
the backup of feedback discussions mid-way
through the practicum to support the final
assessment. I can say to the student: 'You did
not meet the learning objectives we set, nor
improve in the areas we identified as being
weak, and for that reason I am failing you,
as I am unhappy with your level of
competence.'
Without identifying and documenting a
student's weakness early, it can give the
student grounds for appeal. This is probably
the most important advice I can give readers:
to make sure you identify any areas of poor
performance early in their placement. I really
think formative assessment supports the
student in learning but also supports the
clinical supervisor or assessor in the process
of dealing with students who are
weak or failing.
How do you deal with student reactions
when they are informed they are
performing unsatisfactorily or failing?
I don't get too much into the emotional side
when it comes to informing a student they
are failing. I try to be empathetic, but if it's
a behavioural issue I'm very clear about the
expectations when they go on practicum.
All students sign a clinical practicum policy
which outlines behaviour we expect. If a
student is removed from practicum due to
poor behaviour, they are reminded they have
not met the expectations of the university.
I appreciate they may be upset and for
that reason I ask them to write a reflective
paper on how they could manage the
situation better, prior to making a follow-up
appointment to see me. In that way they are
coming back to me when the emotion
has settled down and they have had time
to reflect on their
behavioural issues.
The same principle
applies when dealing
with students removed
from practicum due
to unsafe practice.
When they are upset
they have a barrier to receiving information.
By waiting for the emotion to settle
down they are more open to receiving
information at the follow up meeting and
thats when I will formulate an action plan.
At the meeting I will emphasise it's not so
much a disciplinary measure but a learning
measure and I want the student to be a
safe and competent practitioner. I advise
the student that by reflecting on the critical
elements where they failed, and identifying
appropriate learning opportunities to meet
their objectives, they will make a better
professional down the track.
A lot has been said about failing to fail
students. Why does this happen and what
do you think the consequences are?
Contributing factors for failing to fail
students include lack of mentor experience,
time burden and unwillingness to deal
with a situation potentially fraught with
emotion. I've heard registered nurses
suggest they can't be bothered failing a
student as it involves more work identifying
weaknesses, setting objectives and
creating learning contracts. Recently a
registered nurse said to me he had signed
off an incompetent student as competent
because he wanted to avoid hurting her
feelings. I really think that is what it comes
down to a lot of the time.
It doesn't serve the
failing student to pass
them. Instead it puts
them in an arena as
a trained practitioner,
where they are under
pressure to perform.
They do not have
the skills or knowledge to deal with an
environment they are not ready for, and we
end up with nurses out there practicing in
an unsafe manner. By giving the student the
benefit of the doubt, we are not doing the
profession any service. It is worth noting that
in the United Kingdom trained nurses are
held accountable by the registration board for
signing off student competencies.
Christine, is there something you wish
you had learnt about assessment and
evaluation earlier in your career?
We are now realising that the qualities that
make a good clinical supervisor are quite
often characteristics such as personality
traits, and I think research has caught up with
this. Students are more inclined to use their
initiative and seek out learning opportunities
when their supervisor is welcoming and
makes the student feel part of the team. So
in education we are looking for people with a
passion to teach, and the tools we can teach
later. Experienced health professionals who
are open to educating students need to be
offered opportunities to develop their clinical
supervision and assessment skills. I think that
is what I really didn't get offered earlier on in
my career.
Are there any resources that have
changed your approach to assessment
and evaluation?
I was fortunate to attend the Art of Clinical
Supervision program offered by WA Health.
I found it useful, especially focusing on
different learning styles and competency
assessment. Consolidating the knowledge
and tools the program provided, I now apply
these strategies in practice. In the case of a
struggling student I look at how that student
learns so I can adapt the environment to
maximise their learning potential and apply
a structured framework for assessment of
competence.

Christine, as someone whose role
involves assessing students in clinical
practice, what might be the one most
important piece of advice you'd like to
share with our readers?
In terms of assessing students, remember
they are students, not practitioners. I think as
practitioners we might have to step back and
accept students may take longer to complete
tasks, stumble along a bit and seek support.
What I'm looking for when assessing
competence is if the student is asking
all the right questions and reflecting on
their practice. We need to assess student
competence for the level of training they
are at.
If you would like more information about the
Art of Clinical Supervision program please
contact: ACS.Admin@health.wa.gov.au.
Interview by Sandy Presland
Practical advice for supervisors:
• Provide formative assessment
• Identify problems early
• Provide opportunities for feedback
• Develop an action plan
• Set achievable learning objectives
• Document evidence to justify actions
Assessing clinical competency
by Alison Creagh
A Learning Issue is a focused question generated about issues related to assessment and evaluation. Alison has examined the issue of the reliability of the assessment of clinical competence.
Background
In 2012, I introduced an activity at a
clinicians training day at my work place. After
viewing short video clips of sexual health
consultations, each participant (doctors and
nurses providing clinical training) rated the
health professionals level of competence
in the relevant skill, using the same method
of assessment as for trainees in our clinics.
Assessment was made according to the
current four-level scale:
• Needs assistance
• Developing competence
• Competent
• Mastery
For each skill, the same list of ideal attributes
was used as in our clinics.
Amongst the 15 clinicians present, there was
an unexpectedly wide range of assessments
for each of the five video clips. This led to a
consideration of the reliability of assessments
of competence, and whether reliability could
be improved.
Discussion
Miller's pyramid¹ of health professional assessment is, from base to apex: knows, knows how, shows how, does.
It seems widely accepted² that assessing actual practice ('does' in Miller's pyramid) is more predictive of work skills than assessing performance in simulated or standardised settings (the 'shows how' level). However, assessing competence in the workplace appears more complex than assessing the lower layers of the pyramid.
A number of ways to improve the reliability
of assessments of competence have been
suggested in the literature.
Assessors can be provided with training to increase consistency between them.³ For example, all would observe a particular scenario, rate the competence of the trainee, and then discuss any differences between their assessments.
A commonly accepted strategy is to ensure that assessments are made on multiple occasions by multiple assessors.⁴ This has the effect of smoothing out both individual assessor variations and the differences in assessment resulting from different levels of complexity in the cases seen. Supervisors could identify trainees who are not having sufficient assessments performed, and facilitate further training and assessment sessions.⁴ A practical difficulty with this strategy may be limitations on the number of assessments that can be performed, as they can be time consuming and costly.
Some authors have discussed generalizability theory as a method of calculating the reliability of competency assessments. These calculations depend on a number of assessors seeing the same students performing in a number of cases, and then looking at the variation in marks due to individual assessors, the variation between students, and the variation between cases, to work out the degree of variation due to each of these factors.
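To make the variance-decomposition idea concrete, here is a minimal sketch in Python of estimating variance components, and a generalizability coefficient, for a simple fully crossed students × assessors design. The ratings are invented and the case facet is ignored for brevity; this is our own illustration, not a method taken from the studies cited in this article.

import numpy as np

# Invented ratings for illustration: rows are students, columns are assessors;
# every assessor rates every student (a fully crossed design).
scores = np.array([
    [3.0, 3.5, 3.0],
    [2.0, 2.5, 3.0],
    [4.0, 3.5, 4.0],
    [2.5, 2.0, 2.0],
])
n_s, n_a = scores.shape
grand = scores.mean()

# Sums of squares from a two-way ANOVA without replication.
ss_student = n_a * ((scores.mean(axis=1) - grand) ** 2).sum()
ss_assessor = n_s * ((scores.mean(axis=0) - grand) ** 2).sum()
ss_resid = ((scores - grand) ** 2).sum() - ss_student - ss_assessor

ms_student = ss_student / (n_s - 1)
ms_assessor = ss_assessor / (n_a - 1)
ms_resid = ss_resid / ((n_s - 1) * (n_a - 1))

# Variance components: real student differences, assessor leniency/severity,
# and residual (student-assessor interaction plus everything unexplained).
var_resid = ms_resid
var_student = max((ms_student - ms_resid) / n_a, 0.0)
var_assessor = max((ms_assessor - ms_resid) / n_s, 0.0)

# Generalizability coefficient for a score averaged over n_raters assessors:
# averaging shrinks the error term, so reliability rises with more raters.
for n_raters in (1, 3, 5):
    g = var_student / (var_student + var_resid / n_raters)
    print(f"{n_raters} assessor(s): G = {g:.2f}")

Note how the coefficient rises as scores are averaged over more assessors; this is the arithmetic behind the 'multiple occasions by multiple assessors' strategy described above.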
Using some standard rating scales, and some devised for the study, they found that more specific rating scales, aligned more closely with the assessors' ways of thinking about trainees, were more reliable than the standard, generic ones. For example, some of the generic scales included features such as whether a trainee was at or above the expected level. Assessors tended to avoid more pejorative ratings, and infrequently rated trainees as below the expected level. Scales that were found to be more reliable included those where the degree of trust in the trainee was rated, and those
which assessed the trainee's level of
independence to perform certain tasks.
Another recommendation is to consider
who the assessors should be. Should
there be 360 degree assessments, with
everyone working with a trainee providing
a point of view? It seems that the most reliable assessors are those who observe the particular skills required on a regular basis. For example, ward clerks may usefully provide assessment information on communication skills or on teamwork, but would not provide reliable assessments of clinical skills.³
Finally, what information should be provided to assessors to guide their ratings of trainees? Some organisations have developed detailed checklists in an attempt to reduce the subjectivity of assessments.³ Examples of checklist items are 'trainee introduced themselves' or 'trainee scrubbed hands correctly'. In a comparison between the use of detailed checklists and a global rating of trainees, it was found that the global rating was more reliable.³ This is concisely summarised: 'perhaps performance is more than the sum of its parts'.³
Conclusion
Assessing competency in practice is a
better predictor of actual work skills than
assessments of knowledge or of skills in
standardised situations.
In summary, useful strategies include:
• use assessors who both observe and perform the relevant skills on a regular basis
• ensure that a broad range of assessors provide feedback and assessment on multiple occasions
• train assessors in the consistent use of rating scales
• use rating scales that are specific to the type of competence being assessed, rather than generic ones
• ensure that rating scales are aligned to assessors' ways of thinking
• ask assessors to make judgements, rather than use detailed objective checklists.
As Crossley and Jolly³ say, 'scraping up the myriad evidential minutiae of the subcomponents of the task does not give as good a picture as standing back and considering the whole.'
References
1. Miller G. The assessment of clinical skills/competence/performance. Academic Medicine. 1990;65:S63-7.
2. Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: construct alignment improves the performance of workplace-based assessment scales. Medical Education. 2011;45:560-9.
3. Crossley J, Jolly B. Making sense of work-based assessment: ask the right questions, in the right way, about the right things, of the right people. Medical Education. 2012;46:28-37.
4. Homer M, Setna Z, Jha V, Higham J, Roberts T, Boursicot K. Estimating and comparing the reliability of a suite of workplace-based assessments: an obstetrics and gynaecology setting. Medical Teacher. 2013;35:684-91.
The Puzzle Page
Test your knowledge
OPTIONS
EACH OPTION MAY BE USED ONCE,
MORE THAN ONCE, OR NOT AT ALL
A. Educational impact
B. Practicality
C. Reliability
D. Standardisation
E. Student satisfaction
F. Uniformity
G. Validity
Lead in: For each assessment of an
instrument, select the characteristic of
effective assessment being measured.
STEMS
1 Item scores from a 100-item multiple
choice question paper are divided into
two sets of 50 questions each that are
compared.
2 Scores on a written examination in the
final year of the medical course are
analyzed to determine whether they are
associated with successful completion of
the intern year.
3 The feasibility and costs (including
materials and staff time) of two
instruments being considered for use in
the new MD curriculum are compared.
4 Student feedback, and overall
performance in a unit, is compared before
and after addition of a formative Objective
Structured Clinical Examination at mid-
semester.
SELECT THE CORRECT OPTION
STEM:
Concurrent validity in assessment:
OPTIONS
A measures the correct learning outcome
B measures a large enough sample of the
intended learning
C is appropriate to the learner's stage
D predicts success after graduation
E gives the same results as another test
measuring the same outcome
SELECT THE CORRECT OPTION
STEM
Kirkpatrick's model of evaluation has four
levels which measure:
OPTIONS
A. effect, knowledge, performance, outcomes
B. feedback, education, conduct, grades
C. reaction, learning, behaviour, results
D. response, outcome, behaviour, results
Solution page 60.
Created by Alison Creagh, Christopher Etherton-Beer and Deanka Preston.
Linking assessment to the curriculum: the Blueprint approach
by Lubna Al-Hasani
Blueprinting (or creating an assessment Blueprint) is an approach linking assessment with curricular content and learning outcomes.¹ The Blueprint itself is usually in tabulated format, which is then used to assist planning of teaching and assessments. One type lists learning outcomes on one axis, and on the other, the assessment methods and the timing and weighting of assessments. It often includes the timing of teaching and the type of learning to be achieved. Like learning experiences and courses, blueprints come in all shapes and sizes! To illustrate these points, here we share a Blueprint for assessment in a course for preceptors.
Course: Preceptorship Course
Unit: Providing Constructive Feedback
Students: 10 health care professionals with at least 3-4 years of experience
Course Duration: 5 days (combining classes and practical exams at the end)
Course Requirements: Pre-reading of assigned articles
Unit Duration: 3 hours / 3 days
Learning Outcomes:
At the end of this unit, the students will be able to:
1. Identify the different methods of providing constructive feedback
2. Discuss the advantages of constructive feedback
3. Construct real examples from the working environment and critique them
4. Demonstrate how to provide constructive feedback effectively
5. Analyse different situations and evaluate the best ones
Continuous Assessment | Marks | Time
1. Multiple choice questions (pre and post the unit) | 5% | 1st day of the unit
2. Student-led presentation (15 minutes per student) | 15% | 2nd day of the unit
3. Assignment | 30% | End of the course
Assessment
Continuous assessment (see table above).
Summative assessment: practical exam (50%) at the end of the course.
Course criteria: the student must pass the practical exam in order to pass the unit.
Assessment and Evaluation in Health Professional Education
2 1
Assessment Blueprint
Learning Outcomes | Recall | Interpretation | Application | Skills | Total
1. Identify the different methods of providing constructive feedback | 5% | - | - | - | 5%
2. Discuss the advantages of constructive feedback | - | 15% | - | - | 15%
3. Construct and critique real examples from the working environment | - | - | 30% | - | 30%
4. Demonstrate how to provide constructive feedback effectively | - | - | - | 50% | 50%
Total | | | | | 100%
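Because a blueprint is at heart a structured table, it can also be represented and sanity-checked programmatically. The short Python sketch below is our own illustration (the data structure and labels are hypothetical, not part of the cited blueprinting approach); it stores the blueprint above and verifies that the weightings total 100%.

# A hypothetical in-code representation of the blueprint above: each learning
# outcome maps to the cognitive level assessed and its weighting (percent).
blueprint = {
    "Identify methods of constructive feedback": ("Recall", 5),
    "Discuss advantages of constructive feedback": ("Interpretation", 15),
    "Construct and critique real examples": ("Application", 30),
    "Demonstrate constructive feedback effectively": ("Skills", 50),
}

# A blueprint should account for all of the assessment weighting, no more.
total = sum(weight for _level, weight in blueprint.values())
assert total == 100, f"Weightings sum to {total}%, expected 100%"

# Summarise the weighting per cognitive level, as in the blueprint table.
by_level = {}
for level, weight in blueprint.values():
    by_level[level] = by_level.get(level, 0) + weight
for level, weight in sorted(by_level.items(), key=lambda kv: -kv[1]):
    print(f"{level}: {weight}%")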
Unit Content:
1. What is preceptorship
2. Preceptor and preceptee roles and characteristics
3. Achieving competencies
4. Assessment
5. Clinical supervision
6. Constructive feedback
7. Evaluation
8. Conflict resolution
Teaching and learning processes for the course: pre-reading, presentations, workshops, practical exams and seminars.
References
1. McLaughlin K, Lemaire J, Coderre S. Creating a reliable and valid blueprint for the internal medicine clerkship evaluation. Medical Teacher. 2005;27(6):544-547.
Different Perspectives on Midwifery Education
Sadie Geraghty is a Midwife and Midwifery Educator. She is Co-ordinator of Postgraduate Midwifery at Edith Cowan University.
We are writing a magazine on assessment
and evaluation in health professional
education. What would be the one most
important piece of advice you'd like to
share with our readers?
I would suggest: stop taking advice and go with your gut feelings; what works for others may not work for you. Trial and error is appropriate for most people. I get up at 0500 and work until 0730 before work when I have deadlines; in summer I am a morning person. This works for me but not for everyone.
Is there something you wish you'd learnt
about assessment and evaluation earlier in
your career?
I would have benefited from knowing about rubrics and marking guides early in my career; I never knew what different lecturers wanted from assignments. A rubric or marking guide shows what is required and what marks can be obtained for work produced. I would have been a better proofreader and critical analyst if I had known this sooner.
Tell us about the structure of assessment used in postgraduate midwifery education at present.
Postgraduate midwifery uses blended learning and teaching, which is reflected in assessments. Students are assessed using a variety of methods: written exams, presentations, reflections, OSCEs, essays and creative assignments (poetry, painting, drawing, model-making). Students usually excel at one type of assessment.
Do you think these assessments
adequately determine whether a student is
competent for registration as a midwife?
They are part of the assessments for registration; the clinical assessments are equally important. The theoretical assessments complement the clinical assessments by providing knowledge and critical thinking abilities. Without both clinical and theoretical assessments a student would not be adequately assessed for registration.
What advice would you give in working
with difficult learning situations?
There is no one piece of advice that is appropriate in difficult learning situations, but I would suggest remaining grounded. Life experience helps us to deal with situations, and most of us learn from experience. Watching how others deal with difficult situations is also beneficial; we often learn by example.
Enjoy the learning experience, find ways to minimise stress, and remember: being a midwife is the best job in the world - 80% of our role is education!
What issues do you experience regarding
underperforming students, both
academically and in clinical placement?
I don't think students under-perform or over-perform. Our students are only as good as the facilitators and teachers who are involved in their learning. In my experience students are either over-confident (rare) or under-confident, and compensate to overcome this. We facilitate learning: we guide and direct them the right way to go. Bullying and punishing is not facilitating learning. However, students must have the capacity and motivation to learn in the first place.
How do you deal with student reaction
when they are informed they are
performing unsatisfactorily or failing?
I don't ever tell students they are failing;
if they fail, I am failing.
I try to find out what is going on in their lives
and attempt to help them find a solution. I
have never had a bad reaction from a student
even after years of being a clinical facilitator.
I find their strengths, which usually helps them
overcome their weaknesses. I always balance
negative comments with positive. Most
people who choose to become a midwife do
so for altruistic reasons; I can usually find
positive things to help them back on the path
of learning.
Dianne Bloxsome is a Sessional Academic teaching Nursing, Midwifery and Paramedicine, and a Clinical Supervisor of Nursing and Midwifery Students at Edith Cowan University. She is a Registered Nurse and Midwife.
Di, what would be the one most important piece of advice you'd like to share with our readers?
Believe in yourself, be confident and
prepared.
Is there something you wish you had learnt
about assessment and evaluation earlier in
your career?
No, I was given a great rubric to follow for
assessments.
Do you think these assessments
adequately determine whether a student is
competent for registration as a midwife?
No, I dont. I think a lot of students slip
through as competent and when they get out
on the wards and birth suite they cannot apply
themselves and their skills.
What advice would you give in working
with difficult learning situations?
Be confident and get support from superiors
if needed. A problem shared is a problem
halved!!
Speak with the student alone, get advice and
support from lecturer, put the student on a
learning contract and be sure to review it.
How do you deal with student reaction
when they are informed that they are
performing unsatisfactorily or failing?
I have found the students will either be
accepting or defensive; either way they
are upset and annoyed that you have
had the courage to pick them up on their
incompetence.
Any final words of wisdom?
When deciding whether to pass or fail a student in question, ask yourself: 'Would you want to work with them?' and 'Would you like them to look after you or your loved one?' If the answer is no, you fail them.
Dr Sara Bayes is currently a Senior Lecturer and Midwifery Program Director at Edith Cowan University and a Postgraduate Research Fellow at the University of Nottingham. She is a Registered Nurse and Midwife.
Sara, can you tell our readers if there is something you wish you had learnt about assessment and evaluation earlier in your career?
Well, my early experience of assessing and
evaluating students was that I was extremely
well supported by the more experienced
academics around me who constantly,
intensively and proactively checked my
marking to make sure I was doing the right
thing by the students - so my only contribution
here, if that hadn't been the case, would be
to ensure that when you're new to it, you set
yourself up with an effective mentoring system.
Are there any resources that have
changed your approach to assessment
and evaluation?
Only the other academics that I've come into
contact with and milked for their experience
and wisdom over the years!
Do you think these assessments
adequately determine whether a student is
competent for registration as a midwife?
For my setting, I do. Assessments of all
types in each theory and practicum unit
throughout each midwifery preparation
program are structured to enable the student
to demonstrate their developing competence
in a number of the NMBA Midwifery
Competencies. By the end of their course,
they will have been assessed against all of
the NMBA competencies.
What advice would you give in working
with difficult learning situations?
I think just to remain fully objective and stay right away from labelling one thing or person as responsible too quickly. The fact (usually) is that an interaction of a range of factors means that the situation is not as optimal as it could be, so it needs reflecting on dispassionately and investigating sensitively. And if there is one person who is making things tricky, there is usually a whole sub-story underneath the behaviour, and it's usually not a happy one. Other than that I would say make sure expectations and limitations are clear to everyone (and documented), and stay close and visible.
What issues do you experience regarding
underperforming students, both
academically and in clinical placement?
Not that many really, because our courses have such small numbers, so we are able to get on to under-performance issues early on. The main thing in both would be not so much a lack of intellectual capacity but a lack of time available for individual students to develop towards the milestones we expect them to reach: all students must achieve certain things by certain time points through their course, but of course everyone learns ('gets it') at a different rate! So that's a challenge.
How do you deal with student reaction
when they are informed they are
performing unsatisfactorily or failing?
I think just by staying calm and clear, and retaining an adult-adult communication pattern, but still giving time to acknowledging the student's disappointment, fear or whatever, plus all the things I said in the question regarding difficult learning situations, and remaining future-focused, i.e. staying focused on what we can do to try and remedy the situation, reinforcing the positives, and continuously bringing the conversation back to an objective appraisal of the situation and how to make it better.
Nine times out of ten that works to maintain the student's optimism that it's not the end of the world, and to galvanise their resolve and motivation to do their best to get back on track. But there will always be the case where nothing positive you say or do will get through, and the student will be angry and/or resentful and take no ownership of the remedial plan. And for that type of situation all you can do is remind yourself inwardly that you have done all you can, and that if you've acted professionally and supportively, you are blameless in how they have reacted. And debrief if you need to, with someone who is not too close to the situation.
Any final words of wisdom?
Be prepared to admit it and apologise to a
student if you get something wrong!
Interview by Jodie Atkinson
Social media: opportunities for assessment and evaluation in the Web 2.0 age
by Christopher Etherton-Beer
Welcome to the Web 2.0 age
In this digital age, we are all familiar with the
web. But for a second, think about how far
things have come in a relatively short time.
Web 2.0 describes the new Internet technologies that now allow users to actively create content, share material and interact.¹ This is in contrast to the earlier, largely static (Web 1.0) Internet sites with relatively little user content. Web 2.0 now offers educators the opportunity to design and build innovative custom applications to support teaching, learning and assessment in specific disciplines.²,³
The rise of social media has been one of the
greatest results of the popularity of Web 2.0
technologies. Few people, from students to professors, haven't at least heard of Facebook and Twitter. Popular social media sites now have an astonishing number of users (Box 1). Among the enormous numbers of users, many are avid users, checking their Facebook or Twitter accounts daily.⁵,⁶
Box 1: Popular social media sites. Are you familiar with these social media?
• Blogger (web-based journaling [blogging] site)¹
• Facebook (social networking site with 1 billion users)⁴
• PebblePad (e-portfolio application)
• Twitter (micro-blogging site for sharing of ideas and conversation, with 200 million users)⁵
• Wikipedia (encyclopedia wiki [website created, edited, and developed by a community of individuals working on collaborative projects])¹,⁴
• YouTube (video sharing site with 800 million users)⁴
Social media are particularly popular among the millennial generation (Generation Y) born
from the early 80s onwards. These young folk have grown up in a digital age and are comfortable using social media in both their private and academic lives.¹,⁶ In line with these trends, the use of social media is very high among health professionals and student populations,⁷ and it is now recommended that medical educators be familiar with social media technologies.⁵
Despite this enthusiasm for social media use, many people continue to be concerned about the boundaries, oversight and privacy of data shared via social media. In this article we will consider some of the opportunities, and potential pitfalls, associated with the use of social media in health professional education (Box 2).
Potential benefits of social media in
education and assessment
It is not just access to social media sites
themselves which has increased at a
staggering rate; it now seems that portable
electronic devices (including smartphones,
tablets and laptop computers with
the capability of wirelessly accessing the
Internet) are ubiquitous among Australian
university class attendees. Thus social media
can be used without special organization
by faculty. Compare this to more traditional
technologies designed to facilitate learner
participation (such as audience clickers,
which require specially enabled classrooms
and are limited to use in one physical site).
Combined with portable devices enabled
with Internet access, Web 2.0 technologies
offer remarkable flexibility to teachers and
learners alike.
Instead of diminishing attendance at
traditional teaching activities (such as
lectures), social media can actually enhance
participation. Social media can also be
harnessed to encourage participation
by learners who may be intimidated by
participating in traditional class settings,
but may be more comfortable initially
building their confidence by participating electronically.¹ Electronic participation can also be anonymous, when this is the learner's preference and it is appropriate in the context of a particular use of social media. For example, students may contribute to a Twitter feed during a group lecture, or respond to a poll of learners via Twitter,⁵ under a non-identifying username.
There is now a growing experience, and evaluation, of the use of social media in health professional education. The available data have been reviewed⁴,⁸ and may surprise some educators. Certainly it seems that judicious and carefully considered use of social media can facilitate collaboration and learning, with positive feedback from learners.⁴,⁹ Social media present a flexible tool (Box 3) that can be used to engage learners, facilitate collaboration and provide feedback.⁸
Box 2: Key points
• Web 2.0 technologies and social media offer exciting opportunities to promote learner participation, engagement, and effective assessment
• Use of social media can enhance traditional pedagogical approaches
• Teachers and facilitators have opportunities to model, promote and require e-professionalism among learners
Use of social media tools can be associated with improvement in the knowledge, skills and attitudes of learners.⁸
Box 3: Applications of social media in health professional education
Have you considered using social media to:
• establish a virtual community of practice
• increase participation in lectures (try displaying a live Twitter feed!)¹,⁵
• provide real-time questions, or feedback, linked to your class blog⁵,¹⁰
• informally poll learners⁵
• create a shared class blog⁹
• provide a discussion forum⁴
• create, discuss, and edit shared student articles on class topics in a wiki¹
• promote communication between tutorial group members outside of traditional classes¹⁰
• create e-portfolios of student work, allowing them to share some or all of their portfolio
• create custom Web 2.0 applications to support your teaching, learning and assessment²
Utilising social media for student assessment
There are many opportunities to support assessment and evaluation in health professional education using social media. Provision of feedback is a frequent and important use of social media in health professional education.8 Similarly, the provision of opportunities for formative self-assessment was one of the most highly rated aspects of a medical education Facebook page in anatomy.11 Twitter can also be used to provide formative feedback.5
Web 2.0 technology can be used to offer authentic assessments (i.e. assessments that reflect activities the learner will need to undertake in their actual practice). For example, a custom web-based application has been designed to facilitate the assessment of nursing students in searching and critically reviewing the literature.2
In addition to the provision of feedback and assessment activities, the extent and quality of students' electronic participation in group work or discussion can be rated by tutors or peers (mirroring the assessment of more traditional face-to-face group work). Learners can also be prompted to self-reflect on their online contributions. Similarly, electronic material authored by students can be assessed by tutor, peer or self-rating. Online environments offer the potential for combinations of self, tutor and peer rating, as material is easily shared. Provision of feedback in an online environment can give students experience in constructing and delivering constructive feedback, which in turn can be assessed formatively and summatively.
Evaluating the use of social media in your course
The social media aspects of your teaching can be assessed in tandem with other feedback collected from learners. In addition, teachers can consider using specific resources provided by some social media sites. For example, Facebook provides the Facebook Insights function, which can automatically provide page metrics (such as numbers of visitors and de-identified demographic data).11 Google Analytics offers a free service to analyse Internet site statistics (such as the number of visits to a course Web 2.0 site). Depending on the learner cohort and the particular social media application, it may also be appropriate to measure learners' attitudes towards the use of social media,12 and there is much potential for educational research in this field!
Challenges of using social media
Common challenges when using social media are technical problems and ensuring consistent user participation.8 Learners may find the array of social media options bewildering, or find the presentation of commercial advertising distracting. For these reasons, teachers may choose to focus on one or two social media tools in each class. The use of social media can present learners and teachers alike with practical, ethical and legal challenges6,13 relating to privacy and the blurring of private and professional lives. Some of the risks associated with social media use (such as breaches of patient confidentiality or of the patient-doctor relationship) have now been well publicised.7 These challenges have led many professional groups and educational institutions to consider policies regarding the use of social media. E-professionalism is defined as the attitudes and behaviours that reflect traditional professionalism paradigms but are manifested through digital media.6
References
1. McAndrew M, Johnston AE. The role of social media in dental education. J Dent Educ. 2012;76(11):1474-81.
2. Eales-Reynolds L-J, Gillham D, Grech C, Clarke C, Cornell J. A study of the development of critical thinking
skills using an innovative web 2.0 tool. Nurse Education Today. 2012;32(7):752-756.
3. Watson N, Massarotto A, Caputo L, Flicker L, Beer C. e-ageing: Development and evaluation of a flexible
online geriatric medicine educational resource for diverse learners. Australasian Journal on Ageing. 2012;
Forthcoming (online ahead of print DOI: 10.1111/j.1741-6612.2012.00622.x).
4. Hamm MP, Chisholm A, Shulhan J, Milne A, Scott SD, Klassen TP, et al. Social Media Use by Health Care
Professionals and Trainees: A Scoping Review. Acad Med. 2013;88(9):1376-1383.
5. Forgie SE, Duff JP, Ross S. Twelve tips for using Twitter as a learning tool in medical education. Med Teach. 2013;35(1):8-14.
6. Kaczmarczyk JM, Chuang A, Dugoff L, Abbott JF, Cullimore AJ, Dalrymple J, et al. e-Professionalism: a new
frontier in medical education. Teach Learn Med. 2013;25(2):165-70.
7. George DR, Green MJ. Beyond Good and Evil: Exploring Medical Trainee Use of Social Media. Teaching and
Learning in Medicine. 2012;24(2):155-157.
8. Cheston CC, Flickinger TE, Chisolm MS. Social media use in medical education: a systematic review. Acad
Med. 2013;88(6):893-901.
9. George DR, Dellasega C. Social media in medical education: two innovative pilot studies. Med Educ.
2011;45(11):1158-9.
10. Wells KM. Social media in medical school education. Surgery. 2011;150(1):2-4.
11. Jaffar AA. Exploring the use of a Facebook page in anatomy education. Anat Sci Educ. 2013; DOI 10.1002/ase.1404.
12. Wang AT, Sandhu NP, Wittich CM, Mandrekar JN, Beckman TJ. Using social media to improve continuing
medical education: a survey of course participants. Mayo Clin Proc. 2012;87(12):1162-70.
13. Cain J, Fink JL. Legal and ethical issues regarding social media and pharmacy education. Am J Pharm Educ.
2010;74(10):184.
A faculty can take many practical steps to limit the risks, and maintain e-professionalism, associated with the use of social media in their courses, such as:
- consider setting up separate pages, groups or accounts for each class
- advise students to select maximal security and privacy settings
- have separate profiles, accounts or pages for professional and private interaction
The ASSESSMENT ZONE Crossword
[Crossword grid not reproduced]
ACROSS
6 Way of matching learning outcomes and assessments
8 Systematic way of checking that learning objectives are achieved by learning activities
9 A number of stations to assess different clinical skills
10 An easy-to-mark, standardised type of assessment
11 Means of ensuring that learning has been achieved
12 Type of assessment that often determines progression to next stage
DOWN
1 Life-like learning or assessment
2 An assessment measures what it intends to measure
3 Collection of work by learner
4 Sufficient knowledge and skill to work in particular clinical scenarios
5 Results of an assessment are reproducible
7 Type of assessment primarily to provide feedback to learner
Solution on page 48
Making sense of portfolios
Sandy Presland reviews the strengths and limitations of using portfolios to assess competency and learning of nursing professionals
The use of portfolios within the nursing profession for the purpose of assessment and professional development has become widely accepted.1 Portfolios are considered a portable and visible record of professional contribution and credentials.2 They consist of an organised collection of evidence that the nursing professional can use to showcase details of professional education, practice experience, formative and summative assessments, and achievements.
Overseeing a post-anaesthetic care unit within the South Metropolitan Health Service, my role focuses on ensuring that nursing staff are effectively assessed as competent to deliver safe patient care, and on encouraging them to uphold their professional responsibility to maintain competency, knowledge and skills relevant to current practice. The present assessment structure within the unit I supervise involves nursing staff completing competency skills checklists, which are entered into a database. An annual performance review ensures the nurse is meeting professional development requirements. With the current reconfiguration within the South Metropolitan Health Service, and the opening of a new hospital in 2014, it seems timely to review current assessment practice. Portfolios may be well worth exploring for future implementation to enhance learning and promote professional development.

According to the Australian Nursing and Midwifery Council, competency is defined as a combination of skills, knowledge, attitudes, values and abilities that underpin effective performance.3 Current competence assessment methods have been reported to measure a nurse's competence only at the level of skill, not knowledge.4 However, it has been reported that portfolio assessment allows a nurse to reflect on both academic and clinical practice, thereby applying theory to practice.4 Portfolios offer a collection of evidence that demonstrates a nurse's skills, knowledge, attitudes and achievements, reflecting ongoing professional development.5 They offer an opportunity for individuals to document evidence of learning outcomes and areas of development, and to reflect upon areas that require further development.1 Portfolios are also a mandatory requirement for nurses registered with the Nursing and Midwifery Board of Australia, who must keep a record that demonstrates professional competence.3,6
Can portfolios effectively assess
competency and learning of the qualified
nursing professional?
Current literature supporting the effectiveness of portfolios for assessing competence is limited. A Queensland Nursing Council review identified a number of articles discussing the use of portfolios for continuing competence, but found no research supporting the effectiveness of portfolios in assessing competence.7 Furthermore, the report found limited evidence that portfolios can measure competence with reliability and validity.7 The nature of data collection in portfolios means the concepts of validity and reliability cannot be applied without objective criteria for grading.8 Such grading criteria would include the use of multiple sources of evidence for each competency showcased, and a system utilising several assessors or external examiners to ensure consistency between assessors.
Webb et al (2003) argue that due to the holistic nature of portfolio assessment, evidence can be 'descriptive and judgement-based rather than quantifiable',8(p600) making the testing of validity and reliability difficult. They suggest that professional judgement is often involved in assessing competence, and that it is precisely this qualitative evidence that underlies the use of portfolios.8 Webb et al conclude that nursing educators could separate the reflective aspects recorded in the portfolio from the assessment of clinical skills, and that this may be sufficient to verify quality and standards of practice.8
Whether or not portfolios are an effective measurement of nursing competence appears to be inconclusive in the nursing literature.4,7 However, portfolios can be structured to include sufficient evidence from a variety of sources, including reflective accounts and quantifiable assessments such as assessor observations and skills checklists, thereby providing evidence of professional development and learning.4
What are the strengths and limitations of
portfolios?

The strengths of portfolios lie in their ability to promote self-reflection, autonomy, and accountability in learning.7 Portfolios can record a range of competencies and personal attributes.10 They can also promote deep learning by facilitating personal management of knowledge, documenting professional growth and development, functioning as a goal-setting tool, and assisting with linking learning to practice.10
Portfolios embrace adult learning by engaging the nursing professional in self-directed learning. The onus is on the nurse to collate evidence of their learning outcomes and personal and professional development, and to reflect on areas that may need further development.1 It is reflective thinking that enables the learner to move from surface learning to deeper understanding, and this is perhaps where the value of portfolios lies.
A range of limitations creating barriers to the implementation of portfolio use has been identified in the nursing literature. As discussed, there is limited evidence that portfolios can measure competence with reliability and validity. Lack of clarity about expectations and outcomes can lead to confusion and inconsistency in the interpretation of requirements.7 The standard of portfolio output can also vary, as it is driven by each individual's motivation and experience, and the self-directed approach does not suit all learning styles.1,7 Furthermore, collecting evidence for portfolios can be time consuming, which may place an extra burden on the time-poor nursing professional. Portfolios can also be time consuming to assess due to the quantity of material, and difficult to assess due to the subjectivity of the material.11
Taking these limitations into consideration, it is evident that portfolios may be resisted by work-pressured nurses. To reduce the extra burden on nurses and increase feasibility of use, portfolio guidelines would need to stress quality, not quantity, of evidence. Provision of templates could help make explicit the type of evidence required and set limits on content.11 Adequate support and feedback would need to be provided to both nurse and assessor if the portfolio system is to be effective.
Summary
It is clear that portfolios can contribute to the professional development of the qualified nurse by enhancing deep learning through reflection, showcasing achievements, and promoting lifelong learning by engaging the individual to keep up with change. Evidence on whether portfolios can measure competency is inconclusive; however, I believe the value of the portfolio lies in documenting assessment as a process, rather than measuring it as an end product. Portfolios incorporating both evidence and reflection aligned to competency standards may be worth embracing in the future as a way of promoting improved nursing practice and meaningful learning, but the process may require considerable investment of time and effort by all parties involved.
References
1. Endacott R, Gray MA, Jasper MA, McMullan M, Miller C, Scholes J, et al. Using portfolios in the assessment
of learning and competence: the impact of four models. Nurse Education in Practice [Internet]. 2004 [cited 2013
Sept 2];4(4):250-257. Available from: ScienceDirect.
2. Meister L, Heath J, Andrews J, Tingen MS. Professional nursing portfolios: A global perspective. Medsurg
Nurs [Internet]. 2002 [cited 2013 Sept 2];11(4):177-82. Available from: ProQuest Health & Medical Complete;
ProQuest Nursing & Allied Health Source.
3. Australian Nursing and Midwifery Council. Continuing Competence Framework. 2009. Available from: http://equals.net.au/pdf/73727_Continuing_Competence_Framework.pdf.
4. McCready T. Portfolios and the assessment of competence in nursing: A literature review. Int J Nurs Stud
[Internet]. 2007 [cited 2013 Sept 5];44(1):143-151. Available from: ScienceDirect.
5. Hughes E. Nurses perceptions of continuing professional development. Nurs Stand [Internet]. Jul 6-12, 2005
[cited 2013 Sept 2];19(43):41-9. Available from: ProQuest Nursing & Allied Health Source.
6. Timmins F, Dunne PJ. An exploration of the current use and benefit of nursing student portfolios. Nurse Educ
Today [Internet]. 2009 [cited 2013 Sept 10];29(3):330-341. Available from: ScienceDirect.
7. Evans A. Competency Assessment in Nursing: A summary of literature published since 2000. National Education Framework Cancer Nursing Education Project (EdCaN), Cancer Australia [Internet]. 2008. Available from: www.edcan.org/pdf/EdCancompetenciesliteraturereviewFINAL.pdf.
8. Webb C, Endacott R, A Gray M, Jasper MA, McMullan M, Scholes J. Evaluating portfolio assessment
systems: what are the appropriate criteria? Nurse Educ Today [Internet]. 2003 [cited 2013 Sept 2];23(8):600-609.
Available from: ScienceDirect.
9. Anderson D, Gardner G, Ramsbotham J, Tones M. E-portfolios: developing nurse practitioner competence
and capability. Aust J Adv Nurs [Internet]. 2009 [cited 2013 Sept 2];26(4):70-76. Available from: Informit.
10. Redfern S, Norman I, Calman L, Watson R, Murrells T. Assessing competence to practise in nursing: a review of the literature. Research Papers in Education [Internet]. 2002 [cited 2013 Sept 4];17(1):51-77. Available from: Routledge.
11. Driessen E, Van Tartwijk J, Van Der Vleuten C, Wass V. Portfolios in medical education: why do they meet
with mixed success? A systematic review. Med Educ [Internet]. 2007 [cited 2013 Sept 4];41(12):1224-1233.
Available from: Wiley.
photo: Jodie Atkinson
Assessing physicians of tomorrow
Dr Claire Harma, Advanced Trainee in Gastroenterology with the Royal Australasian College of Physicians, shares her thoughts on the assessment of physician trainees.
Dr Harma (right) assisted by endoscopy
registered nurse, Belinda Brohman
We are writing a magazine on assessment and evaluation in health professional education. What would be the one most important piece of advice you'd like to share with our readers?
You need to make sure that it is clear to trainees at the start of a program what the learning outcomes are and what is going to be assessed. When providing feedback it needs to be FIT & ABLE (Frequent, Interactive, Timely, Appropriate for learner level, Behaviour specific and balanced, Labelled and Empathetic),1 which is a message that those interested in education need to pass on to others acting as supervisors and mentors, so that trainees get more learning out of their continuous assessment.
Is there something you wish you'd learnt about assessment and evaluation earlier in your career?
Something I wish I had more insight into earlier in my career is that all too often the trainee who is performing well in the workplace is never given detailed feedback, and often these are the individuals who actually want to find out the areas they can improve in, rather than just hearing 'yeah, good job'. More specific feedback and extension advice on how someone could improve is warranted, even for those who are performing well.

Are there any resources that have changed
your approach?
The Teaching On The Run program has had
quite a big impact on how I provide feedback
and how I structure assessment, and the
simple and very practical approaches taught
in this program have been very helpful during
the course of this year working as a medical
education registrar. Also, in setting up a mock
written exam for basic physician trainees this
year, we've been using an online question bank
tool in Moodle, which has been very helpful
in collating questions and delivering the exam
online and performing statistical analyses.
Tell us about the structure of assessment
used in physician training at the moment.
I think the best way of explaining this is that
physician training is done in two components.
We have basic physician training (BPT) and
advanced training (AT). BPT typically takes
three years and during that time, basic
physician trainees are expected to have
gained skills and knowledge in a broad
spectrum of physician areas from general
medicine through to sub-specialty areas, and
then there is a barrier exam to get through to
advanced training.
The exam is performed in two parts. There is
a written exam, which has two papers looking
at basic sciences and clinical applications.
Then everyone is ranked on a bell curve, and
if you are in the top two-thirds of candidates
sitting the exam, you are then eligible to sit
the clinical exam.
The clinical exam tries to simulate real patient interactions to get a better idea of how capable and competent the individual is in their day-to-day practice. The clinical exam includes long cases, where the candidate has an hour with the patient and has to interview and examine the patient and formulate their major medical problems, as well as short cases, where a clinical examination is performed in seven minutes while examiners watch, and a diagnosis and management plan has to be deduced from that. So they are the big exams, the big barriers that we have in physician training.
Throughout BPT and AT, there are also a
number of formative assessments performed,
and these include Mini-CEXs, case-based
discussions and learning needs analyses. In
the three years of advanced training, trainees
move into a subspecialty area, and each
subspecialty has a different assessment
structure. Some of the subspecialties
have formalised, summative examinations
throughout their advanced training, but the
vast majority have no barrier assessment
to becoming a consultant. But they do
have a series of mid-term and end-of-term
assessments and formative feedback
sessions.
Do you think this is the best method for
determining whether a trainee is ready to
become a consultant physician? And does
it assess all different facets of practice for
a consultant physician, aside from just
clinical practice?
Physician training is moving away from the
apprenticeship model toward a competency-
based model. Competency was previously
thought to be gained by spending a certain amount of time in a particular area, which is how the apprenticeship model works. In that model, however, you may only be exposed to a certain range of conditions, depending on where you are training, and so you may miss out on the broader realm of things that fall within your specialty; the competency-based model, by contrast, requires the trainee to prove their competence before going on to become a consultant. I think the barrier exam between BPT and AT determines a level of competence that seems fairly rigorous, but in moving on to become a consultant physician there is the potential for trainees working in the apprenticeship model not to have the degree of exposure they need to become a competent consultant. There is also some talk of introducing exit examinations for physician training.
The expectation of the role of the consultant has changed a lot in the last 10 years: being a good clinician isn't necessarily enough to secure employment, and as such, consultants are now expected to have experience in research, education and management. I think these are three key areas that at the moment aren't adequately assessed. Although in most specialty areas trainees are expected to perform research projects over the course of their training, there isn't necessarily any assessment associated with that or a quality standard upheld. Education and management are areas where there isn't any training or assessment at the moment.

Can you tell me a bit about the different
modalities of assessment currently used
in gastroenterology training and whether
you think that is adequate to meet the
assessment needs of the trainees?
Work-based assessment is essentially what
the RACP is trying to use, particularly in
advanced training, but I think there are a
number of limitations with this. Continuous
assessment is utilised in the mid and end
of term feedback that the trainee receives.
Depending on the interaction between the
supervisor and trainee, the quality of that
feedback may be different depending on the
supervisor and the frequency of interaction
that those two individuals have. Most
trainees will try and nominate a supervisor
that they work closely with to give them the
best possible feedback, but within some
subspecialties that is not always possible
and there may be a supervisor who doesnt
have close contact with the trainee day
to day, and so the reliability and validity of
that assessment may be diminished. In
procedural specialities, performance-based
assessment is more based on volume rather
than competence. Within my specialty,
gastroenterology, its based on the number
of procedures including gastroscopies and
colonoscopies. The training program is a
three year advanced training program, and
at the end of that you are expected to have
performed 100 colonoscopies unassisted.
Realistically, most will have performed that
many colonoscopies within the first 6-12
months of training and if over the course
of the three years youve only just made
that level, I am not convinced you would
actually be capable of going out to practice
independently. The chances of the rarer
complications cropping up in that time is
Assessment and Evaluation in Health Professional Education
3 9
limited, so you may not feel confident in
managing those complications, which would
be expected of a consultant.
One possibility would be for competence to be determined by a supervisor who has monitored you closely over a prolonged period of time, and that person would need to have undergone standardised accreditation by the college. Ideally, though, I think there needs to be a volume component as well, although the current volume requirement is probably too low.
You mentioned before the possibility of
exit exams being introduced as a barrier to
completing advanced training. What sort
of format might that take? And what are
your views on the issue?

I think with the increasing number of
trainees that are coming through, the ratio
of supervisors to trainees has the potential
to become more dilute, and as a result
that training experience may vary between
trainees, and between training locations.
With increased numbers of trainees, I think there is merit in an exit exam. You would want to assess the broadest possible range of aspects of the job. I think there needs to be a knowledge component, which would likely be in the form of a written examination. I think there needs to be a clinical component, potentially a patient-interaction, OSCE-styled component. And within procedural specialties, there may need to be a procedural simulation component as well, to demonstrate procedural competence.
I can foresee some problems with introducing an exit exam. It will be costly for the college, and ultimately for the trainees, as the cost will be passed on to them in increased training and exam fees. It will be very time consuming for both organisers and exam candidates to prepare for the exam. It will still not assess things like interprofessional communication, unless there are specific OSCE stations on these topics. And I think the biggest challenge will be what to do with the trainees who don't pass. Will they repeat another year of training, and will they ultimately be asked to leave the program? It would all add to the increased number of trainees who need to be accommodated in the system. I think all these issues need to be worked out before this could be implemented.
1. Beckman TJ, Lee MC. Proposal for a collaborative
approach to clinical teaching. Mayo Clin Proc. 2009
Apr;84(4):339-44.
Interview by Dr Elisabeth Feher
For more information about the Teaching on
the Run program, go to tellcentre.org
Does your OSCE make the cut?
by Kate Gray
OSCE Fast Facts
OSCE stands for Objective Structured Clinical Examination; it assesses mainly the cognitive and psychomotor domains of learning. OSCEs focus on the student's ability to synthesise and apply their knowledge in practical tasks.
OSCEs were first introduced into medical education in 1975 by Ronald Harden at the University of Dundee in Scotland. They were utilised as an alternative to traditional clinical examinations, which had significant limitations due to subjectivity, inconsistency and poor structure.

OSCEs examine clinical competence and aim to assess a student's communication skills and ability to think critically and problem-solve. OSCEs are structured performance tests that are standardised and objective and, if designed well, can have high levels of reliability and validity.3,4
OSCEs are used extensively across a variety of health disciplines, including nursing, medicine and allied health, in Australia, North America and the United Kingdom.
OSCEs are designed as a series of
examination stations that the students rotate
around. Each station is designed to assess
one competency by using a pre-determined
guideline or checklist.

The students are assessed not only on the results they obtain from their performance of the practical skill but also on how they go about obtaining these results.
Very clear instructions are provided to
minimise ambiguity for the student and
standardise the responses and prompts
that the examiner and simulated patient
can provide.
Assessing the quality of OSCEs
Reliability
OSCEs have been demonstrated to have much higher levels of reliability than other forms of practical assessment and have hence become the mainstay of assessing practical clinical competence in medical education.2 Moderate to good inter-rater reliability has been demonstrated when multiple assessors score one station independently of one another; caution should be used when there is only a single assessor for each station.5 A correlation of 0.6 or more has been suggested as a good level of agreement between raters.5
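To make that benchmark concrete, here is a minimal sketch in Python of how agreement between two assessors scoring the same station could be checked. The scores, and the choice of Pearson's correlation, are illustrative assumptions rather than a method prescribed in this article.

# Illustrative only: hypothetical checklist totals from two assessors
# independently scoring the same eight candidates on one station.

from statistics import correlation  # Pearson's r; Python 3.10+

rater_a = [14, 17, 11, 19, 15, 12, 18, 10]
rater_b = [13, 18, 12, 17, 16, 11, 19, 12]

r = correlation(rater_a, rater_b)
print(f"Inter-rater correlation r = {r:.2f}")
# By the benchmark cited above, an r of 0.6 or more would suggest
# good agreement between the raters.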
The ability of an OSCE score on one station to predict performance on another station has been found to be low in some studies. The number and diversity of stations affect the internal consistency of the OSCE: increasing the similarity of the skills to be tested can improve internal consistency, while reducing the number of stations, or sampling inadequately, threatens reliability.
Test-retest reliability is the consistency with which a student would receive the same score if they repeated the OSCE several times over. Using simulated patients who act in a standardised manner improves the consistency of OSCEs, as it is easier and more feasible to train actors to respond in a particular way multiple times than to train a real patient.3
Standardisation
A marking checklist is provided to all examiners, with appropriate training to ensure they score each component of the task reliably against a set of pre-determined criteria. Aspects of the student's performance of higher importance, such as safety, can determine an automatic fail grade regardless of the other aspects of the performance.6 The use of global rating scales (GRS) allows the examiner to make a judgement about performance based on aspects that are difficult to quantify with a checklist, such as interpersonal skills.2 Combining the GRS and checklist gives a more holistic and reliable assessment of performance.6
Passing a borderline student also needs to be standardised to ensure the OSCE is fair and reliable. Standards need to be set to determine whether a student who is borderline on one or more stations will pass the overall examination. Research has demonstrated that for borderline students, all station scores should be combined to determine overall clinical competence. The standard pass score for each OSCE can be determined from the expert opinions of the examiners or experts in the field, combined with the students' actual performance scores.6
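The article does not prescribe a specific calculation, but one widely used way of combining examiner judgement with actual performance scores is the borderline group method, in which the pass mark for a station is set at the mean checklist score of candidates whose global rating was 'borderline'. The sketch below is illustrative only; all data are invented.

# Illustrative sketch of the borderline group method (an assumed
# standard-setting approach; the data below are invented).

def borderline_pass_mark(results):
    """Pass mark = mean checklist score of candidates whose
    global rating was 'borderline'."""
    borderline = [score for score, rating in results
                  if rating == "borderline"]
    return sum(borderline) / len(borderline)

# (checklist score out of 20, examiner's global rating) per candidate
station_results = [
    (18, "pass"), (11, "borderline"), (15, "pass"),
    (12, "borderline"), (8, "fail"), (13, "borderline"),
]

print(f"Station pass mark: {borderline_pass_mark(station_results):.1f}")
# -> 12.0; candidates scoring at or above this would pass the station.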
Validity
Content validity can be high in OSCE exams when the outcomes being assessed are reviewed by experts in the field.3 OSCEs assess a wide variety of clinical skills across multiple assessment stations, using a range of well-trained examiners to assess students consistently.3 The marking criteria should entail only aspects of performance that are directly related to the skill being assessed.6
The correlation of an OSCE score with other examination scores on the same topics, or with a final course score, has been demonstrated to be good, especially when checklists and GRS are utilised.6,7 Statistically significant criterion validity has also been demonstrated in OSCE exams through correlation studies comparing students' OSCE results with their written examination, oral examination and clinical evaluation scores.3 This suggests that OSCE scores correlate well with other examinations taken at the same time.
The predictive validity of OSCEs has been established by comparing students' results in an OSCE with their actual clinical performance and their performance in other examinations, such as written or case-based interdisciplinary assessments.8 Students who demonstrate strong analytic and problem-solving skills, while also showing good communication and practical skills in OSCEs, will most likely perform well in a real-life clinical scenario. OSCEs can effectively identify students who may struggle with clinical experiences, and can also highlight significant concerns around a student's professionalism and ability to cope under stress.2
Educational
OSCEs can be used not only in summative assessment but also in formative assessment, as they provide feedback to students about their areas for improvement, give them an opportunity to practise clinical skills, and prepare them for final examinations.4
OSCEs are generally well received by students, who perceive them to be a realistic reflection of the real-life scenarios they encounter in clinical situations. It has been reported that students feel OSCEs can be a fair and useful assessment method. It must be noted, however, that students may report high stress and anxiety levels with this type of examination.2,3
Practicality
The cost of OSCEs has been found to be within acceptable limits when compared with other forms of practical assessment. However, OSCEs are initially costly to set up, not only in examiner time and facilities but also in examiner and simulated patient training, with some reported as requiring up to four hours of training per actor.2 These costs are generally reduced when the same subjects are used in multiple examination periods.
a success  skills for life  will impact on your practice  it is so practical  like the job
– first- and second-year nursing students' reflections on an OSCE assessment
Development of an OSCE: your checklist9,10

Establish the purpose of the test
To ensure good content validity and reliability, make sure your OSCE stations sample an adequate array of the topics covered in the learning material during the course of study. The number of stations required depends on the volume of content and the level of skills to be assessed; a formative OSCE will require fewer stations than an end-of-year or graduate-entry examination.

Define the characteristics of the target population
Ensure the OSCE assesses only the clinical skills that are required at that stage of the students' training.

Create a test blueprint
Specifically define all of the competencies you wish to assess and ensure there is good representation of all disciplines, so that all appropriate content is covered. To ensure adequate sampling, create a two-dimensional matrix outlining all of the content that will be assessed and the clinical skills or scenarios in which the student will have an opportunity to display those skills (a minimal sketch of such a matrix follows this checklist).

Define the scores
Decide upon the weighting of each criterion and make sure that the most important components are given more weighting or representation. The marking key should be very easy to follow, with very clear tasks that the examiner should see the student perform. Ensure appropriate randomisation to prevent students predicting which skills to focus on. Include a global rating scale to improve the reliability and validity of the tool.

Design the stations
Clinical stations: interaction with real or simulated patients. Use two assessors where possible.
Practical stations: assessing practical skills with a simulated device or equipment.
Static stations: students complete an assessment based on information presented to them, usually in video or graphics, and are assessed outside the OSCE station. These stations do not involve physical interactions.
Decide upon the length of each station, pre-reading time, and the time between stations to allow for relocation and set-up.

Instructions
To the student: the question should be very precise. Any misinterpretation of the question will lead to complications for the assessor and simulated patient, as their instructed responses may become invalid.
To the examiner: provide the standardised responses or cues they are permitted to give. The examiner must be very aware of all the tasks the student is expected to perform.
To the simulated patient: training should be provided to ensure consistency across stations in their responses and requirements.

Equipment
Facilities: ensure adequate space, privacy, and the availability of enough examiners and simulated patients.
Tools: consider simulated devices, an examination table, and a chair and table for the examiner.

Pilot the OSCE
Essential to ensuring that the process runs smoothly. Look specifically at the time frames for the student to complete the tasks, and at the clarity of instructions to students, examiners and simulated patients.

Report scores
Provide feedback wherever possible. Students feel one of the benefits of completing OSCEs is immediate feedback; if time permits, this is a valuable addition to the OSCE process, particularly in formative assessment.
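As promised above, here is a minimal sketch of a test blueprint matrix in Python. It is purely illustrative: the content areas, skills and station placements are invented for the example, not drawn from this article.

# A minimal, invented sketch of a two-dimensional blueprint matrix:
# content areas as rows, clinical skills as columns, with each OSCE
# station placed in the (content, skill) cell it samples.

content_areas = ["Cardiovascular", "Respiratory", "Abdominal"]
skills = ["History taking", "Examination", "Procedure", "Explanation"]

blueprint = {
    ("Cardiovascular", "History taking"): 1,
    ("Respiratory", "Examination"): 2,
    ("Abdominal", "Procedure"): 3,
    ("Cardiovascular", "Explanation"): 4,
}

# Print the matrix so gaps in sampling are easy to spot
print(f"{'':16}" + "".join(f"{s:>16}" for s in skills))
for area in content_areas:
    cells = [str(blueprint.get((area, s), "-")) for s in skills]
    print(f"{area:16}" + "".join(f"{c:>16}" for c in cells))

Rows or columns full of dashes indicate content or skills the examination would not sample, signalling where stations should be added or redistributed.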
References
1. Brosnan M, Evans W, Brosnan E, Brown G. Implementing objective structured clinical skills evaluation (OSCE)
in nurse registration programmes in a centre in Ireland: A utilisation focused evaluation. Nurse Educ Today.
2006;26(2):115-122.
2. Hodges B, Regehr G, Hanson M, McNaughton N. An objective structured clinical examination for evaluating
psychiatric clinical clerks. Acad Med. 1997;72(8):715-21.
3. Selim AA, Ramadan FH, El-Gueneidy MM, Gaafer MM. Using Objective Structured Clinical Examination
(OSCE) in undergraduate psychiatric nursing education: Is it reliable and valid? Nurse Educ Today.
2012;32(3):283-288.
4. Newble D, Reed M. Developing and Running an Objective Structured Clinical Examination (OSCE). 2012.
5. Rushforth HE. Objective structured clinical examination (OSCE): Review of literature and implications for
nursing education. Nurse Educ Today. 2007;27(5):481-490.
6. Barry M, Bradshaw C, Noonan M. Improving the content and face validity of OSCE assessment marking
criteria on an undergraduate midwifery programme: A quality initiative. Nurse Education in Practice.
2013;13(5):477-480.
7. Park RS, Chibnall JT, Blaskiewicz RJ, Furman GE, et al. Construct Validity of an Objective Structured Clinical
Examination (OSCE) in Psychiatry: Associations With the Clinical Skills Examination and Other Indicators. Acad
Psychiatry. 2004 Summer;28(2):122-8. Available from: ProQuest Education Journals; ProQuest Health & Medical
Complete.
8. Graham R, Zubiaurre Bitzer LA, Anderson OR. Reliability and Predictive Validity of a Comprehensive Preclinical
OSCE in Dental Education. J Dent Educ. 2013 February 1, 2013;77(2):161-167.
9. Newble D, Dawson B, Dauphinee D, Page G, Macdonald M, Swanson D, et al. Guidelines for assessing clinical competence. Teach Learn Med. 1994 [cited 2013 Oct 6];6(3):213-220.
10. Ben-David MF. Life beyond OSCE. Med Teach [Editorial]. 2003; 239. Available from: aph
Direct Observation Checklist
Is it the appropriate tool for the summative assessment of Advanced Life Support competence?
by Janet Vince
A common tool utilised for the assessment of competence in Advanced Life Support (ALS) skills is the direct observation checklist. The candidate is observed performing as team leader within a simulated resuscitation event in a test environment. The checklist outlines performance criteria, with 'bold points' identifying those essential to meeting the learning outcomes. Non-bold marking criteria are important to the systematic flow of the test, but are not critical to the outcome.
As a trained instructor and assessor I feel
confident with the assessing process using
a strict framework, however as a faculty
member partaking in assessment in various
centres, I find myself questioning the
reliability and validity of the checklist as an
effective tool.
The standard setting process within an
ALS program is for the course director to
provide an overview of the conditions of
assessment. The faculty review the scenario
to ensure that everyone has
the same understanding
of the requirements for a
successful outcome, to
ensure standardisation.
The scenario gives a brief summary of the patient's presentation with an outline of symptoms. A full set of observations is written in the context of the presenting condition, along with a structured pathway of progressive deterioration through to cardiac
arrest. The instructor delivers the scenario to
the candidate and then observes as the test
progresses to identify achievement of the
bold points within the checklist.
The reliability and validity of the Direct
Observation Checklist as a tool for
assessment in advanced life support is
questionable due to the ambiguity it allows
in the interpretation of the content, and
the flexibility for deviation from the script.
I reviewed it following the RSVPE process
looking at: Reliability, Standardisation,
Validity, Practical feasibility and
Effectiveness.
Reliability: would the outcome of assessment remain unchanged when applied in two different situations?
Many issues are raised in relation to the use of certain parameters or procedures which have not been covered within the content of the course. With concerns about misleading the candidate, deviations from the checklist are often agreed upon within a faculty cohort. This cohort differs between centres, with different instructors forming the faculty on each occasion. The same process of review is applied, resulting in different aspects being highlighted as concerns and therefore varying changes being made. Some centres determine it is reasonable to question the candidate immediately following the scenario; if the questions are answered correctly, even though the skills were not demonstrated, these centres pass the assessee. Other centres are rigid in following the checklist criteria.
The experience of the instructor often affects the interpretation of the checklist. A novice instructor will take the list at face value and uphold the criteria, whereas an experienced instructor may vary the outcomes to suit the participating cohort, or according to their belief as to the value of the content. Deviations have been seen where, in view of the experience of the candidate, their omission of a bold-point performance criterion was classed as a slip that wouldn't usually occur, and a pass was awarded. With the assessment being convened at various centres, the outcomes are likely to vary according to the deviations applied to the checklist and the experience of the instructors.
The reliability is therefore questionable.
Practical Feasibility: can the method cater for many assessments in a reasonable timeframe?
The assessment process is practical in that a number of candidates can be assessed within a short time frame. Several rooms can be set up, allowing concurrent testing to occur. This does, however, require greater resources in terms of instructors and the necessary equipment. The availability of these is outlined in the course requirements, and attendance numbers are governed by access to these resources.
The Practical Feasibility of this assessment process is compliant with the terms of assessment.
Effectiveness: is the assessment fair and transparent, with effective feedback?
Each course centre has a degree of flexibility to deviate from the checklist. The effectiveness of the tool in being fair and transparent in the assessment content must therefore be questioned. One centre may be lenient, whereas a different centre using the same tool and criteria may enforce more rigid adherence to the checklist, requiring a higher standard of performance to gain a pass.
The tool failed in its Effectiveness to provide a fair and transparent assessment.
Validity: does the assessment measure its intended purpose?
Content Validity: The content within the checklist has been constructed from evidence of best practice and research into the condition it represents. The symptoms displayed by the simulated patient are consistent with those of real life. The candidates, despite varying levels within their professions, have all been trained to the same level of ALS with the same content.
Content Validity of the assessment tool has been met.
Face Validity: Instructors may demonstrate variance in the interpretation of the content within the tool and deviate to suit the cohort of candidates. Interpretation may vary from centre to centre. A junior instructor may be confident that the content measures what needs to be seen in the candidate's performance, but experienced instructors may feel the information misleads the candidate.
In terms of Face Validity this tool has failed.
Predictive Validity: Mistakes made by the candidate during the scenario are permitted to be followed up with a question. This allows a pass to be awarded if they answer correctly, despite not having demonstrated the specific performance criteria. Questioning may be regarded as a prompt and, unless documented, is not permitted within the assessment process.
In terms of Predictive Validity this tool has failed.
Standardisation: does the tool measure the same outcomes and context when applied in various centres?
The tool has been standardised to accommodate a variety of different scenarios using the same marking criteria. Each scenario follows the same framework and the same process of deterioration of an unwell patient. Each follows the same pathway along the algorithms used, with bold points in context with the condition presented. Every centre must use the same tool for assessment.
The Standardisation is therefore compliant with the terms of assessment.
Conclusion
A good assessment must be fair, valid, reliable and practical to give it credibility, and it is important to ensure the method of assessment suits the context of the topic. The choice of direct observation as a summative assessment tool for competence in Advanced Life Support is valid in its ability to measure the intended purpose; however, it does not meet all the criteria of an effective assessment as a stand-alone tool. The ambiguity in the interpretation of the content, and the flexibility to deviate from the set checklist, invalidate the tool as a fair and valid assessment source. A recommendation may be to ensure a variety of sources is utilised throughout the course, with effective formative and summative assessment processes ensuring that knowledge and skill are measured and that the outcomes reflect the true competence of the candidate.
Acknowledgement
This review of the direct observation checklist applied a system covered throughout unit IMED 5802 Principles of Assessment and Evaluation, facilitated by Associate Professor Zarrin Siddiqui at The University of Western Australia (UWA). Acknowledgement is made to Zarrin for the use of the RSVPE process, drawn from her work 'From Rocky Road to Silk Route'.
THE ASSESSMENT ZONE MAGAZINE CROSSWORD SOLUTION
ACROSS: 6 BLUEPRINT; 8 EVALUATION; 9 OSCE; 10 MCQ; 11 ASSESSMENT; 12 SUMMATIVE
DOWN: 1 SIMULATION; 2 VALIDITY; 3 PORTFOLIO; 4 COMPETENCE; 5 RELIABILITY; 7 FORMATIVE
Assessing registered nurse performance
Deanka Preston reviews the effectiveness of appraisals.
Introduction
In the health care industry, and in the nursing profession in particular, appraisals are compulsory and widely used to identify and document objectives, and to assess and evaluate staff performance on an annual basis. For registered nurses in their area of clinical practice, appraisals are utilised to assess, monitor, and evaluate on-the-job performance and behaviours. This article discusses the effectiveness of appraisals as a method of assessment for registered nurses.
Discussion
The Registered Nurse Perspective
Registered nurses are often hesitant and cautious about participating in an appraisal to assess their performance. Some question the need for it; others are motivated to complete an appraisal but are often dissatisfied by the process. Spence and Wood1 comment that nurses are often disappointed by the process of performance appraisal: they believe in the potential value of performance appraisals, but seldom experience the feedback, direction, and encouragement necessary for an effective appraisal process. There is also an element of fear and concern regarding the assessor's comments and whether the nurse's self-rating matches that of their supervisor.1
The Manager Perspective
Many managers also question the efficacy of appraisals in measuring their staff members' performance. They may also consider that appraisals exist merely to satisfy organisational requirements, discount the staff member's performance, and thus fail to assist them with role development and lifelong learning.
Coens and Jenkins2 identify that the practice of performance appraisal is a mandated process for a specified period of time, whereby an employee's work performance, behaviours, or traits are individually rated, assessed, or described by a person other than the employee, and the results are kept by the organisation.2

Carson3 links managers to reasons why performance appraisal systems may fail. The manager may resist the process and view it as unnecessary paperwork, or may not know the nurse well enough to complete the appraisal. The nurse may not be adept at receiving feedback, so the manager may be afraid of the nurse's response during the process. The manager may prefer to coach the nurse rather than evaluate them, and may be anxious that they do not have the required data or evidence to support the ratings or feedback they provide to the nurse.3
The issues
There are numerous issues associated with appraisals, and they are considered problematic by the parties involved. A study by Carson3 identified the following areas of concern. The manager assumes that because the employee does one thing well, he or she must do everything well; this is referred to as the 'halo effect'.3 The manager completing the assessment remembers only that a negative event has transpired. Employees rate themselves highly, and the manager does the same because of the great team they belong to. The temperament of the manager involved may be very rigid or very lenient with their employees. The nurse and the manager may not have adequate evidence or data to complete the appraisal, so both parties guess or randomly rate the performance.3
Some groups of registered nurses view appraisal methods as unjust and have not engaged in any education programs to help them comprehend the appraisal tool. Vasset et al5 report that some nurses identified that the same performance appraisal processes were used for different employees, that there were different degrees of performance appraisal training and experience, and that follow-up conversations seldom took place.5
The possible solution
An appraisal tool needs to be user-friendly, and to provide and promote opportunity for real communication between the manager and nurse. An effective appraisal tool needs to reflect the current roles and capabilities of registered nurses. Kalb et al4 remark that appraisals need to include an assessment element that directly relates to competency standards and nursing practice. They should include measures to endorse meaningful performance feedback and address common performance concerns.4 Carson3 also explains that for an objective performance appraisal, it is important to assess and evaluate behaviour, not personal characteristics. This requires a detailed job description. The nurse's ability must be established, and information relating to their performance must be collected and recorded. The appraisal process needs to be cyclical, and the needs of both the nurse and the organisation should be considered when objectives are set and evaluated.3
It is also important to highlight that appraisal tools and processes have been in the spotlight over the years in many health care organisations, as managers and staff attempt to implement a system that is best suited to all stakeholders. Vasset et al5 claim that in recent years performance appraisals have been transformed from mere monitoring into an assessment and development tool. These tools provide adequate feedback to support employee development, serve as a basis for modifying or changing behaviours to produce more effective work for organisations, and provide useful information to supervisors.5
Registered nurses may also have higher job motivation and satisfaction when they view their appraisals as fair, trustworthy, and unbiased. The appraisal tool can be perceived as just if it is transparent, is explained sufficiently, and gathers evidence rather than expressing personal bias. It should also allow employees to present their own views and point out the elements of the appraisal that they view as unfair or unfortunate.5(p30) Spence and Wood1 also attest to the effectiveness of appraisals: they are believed to motivate registered nurses to improve their practice, provide job satisfaction, and improve staff morale.1 There is also a clear need for active participation by all stakeholders in the development and implementation of the appraisal system. Applying relevant research findings to professional development programs and appraisal tools enables the assessment elements to be fully explicable and agreed upon by appraiser and appraisee, promoting an effective process.1

Conclusion
The use of appraisals to measure registered nurses' work performance is common in many health care facilities. Whether they are an effective assessment tool remains questionable for all stakeholders involved. The key is to ensure that the appraisal tool and process are valid, objective, designed appropriately and evaluated regularly, so that the system in place meets the employees' and the organisation's requirements. Ultimately, any analysis of appraisals should be constructive in nature, to ensure the continuing evolution of their effectiveness.
References
1. Spence D, Wood E. Registered nurse participation in performance appraisal interviews. J Prof Nurs [Internet]. 2007 [cited 2013 Sept 3];23(1):55-59. Available from: Science Direct http://www.sciencedirect.com.ezproxy.library.uwa.edu.au/science/article/pii/S8755722305001687
2. Coens T, Jenkins M. Abolishing performance appraisals: why they backfire and what to do instead [Internet]. San Francisco: Berrett-Koehler; 2002 [cited 2013 Sept 3]. Available from: EBL http://reader.eblib.com.au.ezproxy.library.uwa.edu.au
3. Carson E. Do performance appraisals of registered nurses reflect a relationship between hospital size and caring? Nurs Forum [Internet]. 2004 [cited 2013 Sept 6];39(1):5-13. Available from: Academic Search Premier http://web.ebscohost.com.ezproxy.library.uwa.edu.au/ehost
4. Kalb K, Cherry N, Kauzloric J, Brender A, Green K, Miyagawa L, et al. A competency-based approach to public health nursing performance appraisal. Public Health Nurs [Internet]. 2006 [cited 2013 Sept 7];23(2):115-138. Available from: Academic Search Premier http://web.ebscohost.com.ezproxy.library.uwa.edu.au/ehost
5. Vasset F, Marnburg E, Furunes T. Employees' perceptions of justice in performance appraisals. Nursing Management UK [Internet]. 2010 [cited 2013 Sept 7];17(2):30-34. Available from: Academic Search Premier http://web.ebscohost.com.ezproxy.library.uwa.edu.au/ehost/
Assessment on the Go
Formative Assessment in the Classroom
by Debra Jeavons
We often think of student assessment as a formal process requiring exams and assignments, but there are several informal ways to gauge how a student is performing.
Classroom assessment can help teachers identify students' strengths and weaknesses, monitor students' learning and progress, and provide information to assist with planning and conducting educational sessions.1
There are different methods known as Classroom Assessment Techniques (CAT). These techniques may identify:
- How familiar the students are with important names, events, dates etc.
- How the students apply the knowledge and skills learnt in class.
- To what extent they are aware of the steps they need to go through in solving problems, and how well they can explain problem-solving steps.
- How well the students are able to use a new learning approach to master concepts and principles.2
Minute Paper
Ask the students to identify the most significant things they have learned during the session.2
Muddiest Point
Give the students 1-2 minutes to list areas of confusion in the session.
Concept Maps
Concept maps provide a graphic representation of students' knowledge, and can give insight into how students organise and represent what they know. They can be used at the beginning of a course to identify existing knowledge, or during the course to show how that knowledge is developing.
1. Create a focus question that clearly specifies the issue to be addressed.
2. Ask the students to generate a list of relevant concepts, organise them and then construct a preliminary map.
3. Give the students an opportunity to revise, rethink and reconfigure the map.3
'Linking assessment and instruction is critical to effective learning'1
Pros and Cons
Ask the students to make a quick list of pros and cons to help them consider an issue more clearly. This assesses the students' objectivity and depth of analysis.
1. Identify a decision, judgement, dilemma or issue relevant to the course.
2. Create a prompt to elicit pros and cons.
3. Decide how many pros and cons the students should list.
4. Prepare the questions by writing them on a board or on paper.2
Student Generated Test Questions
This method collects written feedback about what students think are the most important concepts discussed during the course.
Step-by-step approach:
1. Each student writes a test question, either short answer or MCQ.
2. Students quiz each other.
3. Note the level of questioning for relevance, difficulty and clarity.
4. Monitor the concepts chosen to see what students think are most important.
5. Appropriate questions could be used in the next graded assessment, so students feel they are contributing to the learning.2
Problem Recognition Tasks
Examples of common problems are presented to the students, who are asked to identify the basic type of problem represented by each example. Recognising problem types is the first step in solving a problem, and it builds students' experience in understanding the type of problem involved. With practice, this method should improve the speed and accuracy with which subsequent problems are solved. If students are found to be incorrectly classifying types of problems, corrective measures can be taken.
Application Cards
Identify concepts and ask students to come up with 1-3 applications of each concept from everyday experience.
Each technique has its own strengths and weaknesses, and students might perform better on one type than another. Using CAT tools in the classroom creates and maintains a classroom feedback loop: these quick and easy techniques allow teachers to get feedback from students on their learning, and completing the loop, by providing the students with the results of the assessment, increases the students' potential to learn.
References:
1. Linking Classroom Assessment with Student Learning. Available from: www.ETS.org
2. Using classroom assessment techniques. Eberly Center for Teaching Excellence, Carnegie Mellon University. Available from: www.cmu.edu/teaching/assessment/
3. Using concept mapping to foster critical thinking. Available from: www.nursingconceptmapping.com
USEFUL WEBSITES AND RESOURCES FOR ASSESSMENT, EVALUATION, AND STUDENT LEARNING

Teaching Resources for Health Professional Educators
Duke University, Durham, North Carolina, USA
http://guides.mclibrary.duke.edu/content.php?PID=43162&Sid=318194
A great website offering a wealth of information about health professional education. It recommends useful links, books and journals, and lists educational events in health professions education, such as conferences. Note that you must register with BMJ or with the site itself to access most of the resources.
Teaching Healthcare Professionals
Deakin University, Victoria, Australia
http://www.deakin.edu.au/itl/pd/tl-modules/teaching-approach/health-care/index.php
A very useful website for clinical educators, full of information about teaching and assessment. It also provides links to useful resources.
Teaching Effectiveness Program, Teaching and Learning Center
University of Oregon, USA
http://tep.uoregon.edu/resources/assessment/
A valuable website full of useful information about assessment, writing, learning and teaching, with comprehensive resources. It also includes advice for first-time educators on what to do before, during and after an educational session.
FAIMER
Foundation for Advancement of International Medical Education and Research
http://www.faimer.org/links/teaching-learning.html
Offers many resources in health professional education, such as websites, publications and organizations.
Halogen Software
http://www.halogensoftware.com/products/eappraisal-healthcare/
Halogen Software offers useful and practical tools to assist with administrative work such as performance appraisal, job performance and evaluation.
Business Balls
http://www.businessballs.com/trainingprogramevaluation.htm
A valuable website with detailed information about teaching and assessment. A great tool for clinical educators.
Evaluation Resource
http://www.mindtools.com/pages/article/kirkpatrick.htm
Kirkpatrick's Four-Level Training Evaluation Model: Analysing Training Effectiveness. Available as a free app and newsletter. Kirkpatrick's Four-Level Training Evaluation Model can help you objectively analyse the effectiveness and impact of your training, so that you can improve it in the future. The website looks at each of Kirkpatrick's four levels and examines how you can apply the model to evaluate training. It also looks at some of the situations where the model may not be useful.
Evaluation Resource
Flinders University, South Australia
http://www.flinders.edu.au/teaching/quality/evaluation/conducting-evaluation.cfm
This website covers:
1. Guided self-evaluation
2. Natural formative evaluation
3. Peer evaluation
4. Student evaluation
Evaluation Toolkit
John Julian, Monash University, Medicine, Nursing, and Health Sciences, Victoria
http://www.med.monash.edu.au/spppm/research/southernsynergy/workforceprogram/cluster/resources-evaluation.html
The Cluster Evaluation Toolkit provides a range of different tools that can be used for training evaluation, and this document outlines a range of options for immediate post-training evaluation. The evaluation of training needs, the trainees' beginning level (or comfort), and evaluation after the workshop also need to be considered.
Evaluation Tool
Social Learning Blog
http://www.dashe.com/blog/training-development/how-to-evaluate-learning-kirkpatrick-model-for-the-21st-century-a-revision/
Training and Performance Improvement in the Real World: How to Evaluate Learning: Kirkpatrick Model for the 21st Century.

Learning Catalytics
https://learningcatalytics.com/
http://www.youtube.com/watch?v=1-qyk-Tx2bs (YouTube introduction video)
http://www.youtube.com/watch?v=HqTGKAUlztM (YouTube: instructors delivering a module)
Learning Catalytics (LC) is a solution for managing the interactive classroom. Using LC, faculty and students are able to:
- Get real-time feedback using open-ended, authentic questions.
- Formatively assess students, even in large courses, using a wide variety of question formats: students can be asked to sketch a graph, draw a vector, highlight a passage, compose text, compute a numeric or algebraic answer, and much more.
- Use dynamic analytics to shape instruction during class.
- Engage students in productive discussion by having the system automatically group students based on their responses, to optimise discussion productivity.
- Use the seating chart, while students are working, to see instantly which students understand the concept.
- Access and contribute to a growing shared library, and facilitate collaboration by easily sharing lectures with teaching assistants or colleagues, avoiding duplicated effort.
LC may be used on smartphones and iPads.
BOOK PREVIEW

Online Educator: A Guide to Creating the Virtual Classroom
http://books.google.com.au/books?hl=en&l=&id=0OJjhvUby08C&oi=fnd&pg=PP1&dq=Online+educator+creating+a+virtual+classroom&ots=97pgaW22J-&sig=NiXWiia6R7YThD9wW_s46rycv8E#v=onepage&q=Online%20educator%20creating%20a%20virtual%20classroom&f=false
McVay Lynch M. Online Educator: A Guide to Creating the Virtual Classroom. 1st ed. London: Routledge Falmer; 2002.
The Internet is changing the way we live, and education has always played an important part in shaping our lives. It is now time for education to capitalise on the Internet's capabilities to create a new learning environment for tomorrow's students. The Online Educator provides much-needed, straightforward advice on how to create a web-based education system. From administrative planning and selecting resources to individual course development, it offers the novice:
1. clear definitions of common terms and concepts
2. a practical how-to approach with useful checklists
3. a discussion of the issues for students and teaching staff
4. links to useful websites and other resources
Based firmly on current distance learning research, yet accessible and very readable, this book will be indispensable to anyone interested in developing online education.
Game Based Learning
http://www.sciencedirect.com/science/article/pii/S036013151000223X
Game Based Learning (GBL) is a type of game play that has defined learning outcomes. Generally, game based learning is designed to balance subject matter with gameplay and the ability of the player to retain and apply that subject matter to the real world.
Triantafyllakos G, Palaigeorgiou G, Tsoukalas IA. Designing educational software with students through collaborative design games: the We!Design&Play framework. Computers and Education. 2011;56(1):227-242.
ONLINE ARTICLES
Generational changes
and their impact in the
classroom: teaching
Generation Me
http://onlinelibrary.wiley.com/doi/10.1111/j.1365-2923.2009.03310.x/full
Many faculty members believe that students today differ from those in the past. This paper reviews the empirical evidence for generational changes among students and makes recommendations for classroom teaching based on these changes. Generational changes are rooted in shifts in culture and should be viewed as reflections of changes in society.
Twenge J. Generational changes and their impact in the classroom: teaching Generation Me. Medical Education. 2009;43(5):398-405.
University of Medicine &
Dentistry of New Jersey
http://cte.umdnj.edu/student_evaluation/evaluation_constructing.cfm
This online resource is divided into
17 different links to help educators
when constructing exams and
tests including: Multiple choice
questions, Essay items, Short
answer items, Test Blueprint.
ONLINE CLIP

Assessment and Evaluation
http://www.youtube.com/watch?v=cvXS2x3UhQU
An informative video that discusses formative assessment, based on the article 'Inside the Black Box' by Paul Black and Dylan Wiliam.
Formative Assessment Techniques
http://www.youtube.com/watch?v=2w54owkmSCQ
An informative video by Matt Kloser explaining formative assessments.
ONLINE RESOURCES

The University of Sheffield Learning and Teaching Services
http://www.shef.ac.uk/lets/toolkit
This resource provides educators with the information needed to give students the best possible learning experience. Resources provided include:
- Hints, tips and practical examples
- Types of assessment
- Use of technology for assessments
- Student evaluation
Teaching & Learning Laboratory
http://tll.mit.edu/help/types-assessment-and-evaluation
An online site that provides educators with assessment and evaluation resources, including:
- The assessment and evaluation process
- Types of assessments and evaluations
- Formative assessments
- Summative assessments
- Process assessments
- Links to example evaluation forms
- Glossary of terms used in assessment and evaluation
University College Dublin
http://www.ucd.ie/teaching/resources/
This website has been created by University College Dublin to provide educators with online resources and tools in teaching and learning. The site provides links to printable and online resources, which include but are not limited to the following:
- Teaching toolkit
- Planning sessions: large and small group teaching
- Assessment
- Guide to taxonomies of learning outcomes
- Why and what should students be assessed?
The Faculty of Medicine, Dentistry and Health
Sciences is offering the following Health Professional
Education postgraduate courses in 2014:
Graduate Certificate
Graduate Diploma
Master's degree (by coursework or research)
These courses are designed to suit a wide range of
health professionals and those considering applying
are encouraged to attend our course information
session:
When: Monday 11 November, 4.30 to 5.30pm
Venue: Clinical Training and Evaluation Centre,
The University of Western Australia,
Entrance 2, Hackett Drive, Crawley
RSVP: Caroline Martin, phone 6488 6881 or
email caroline.martin@uwa.edu.au
CRICOS Provider Code 00126G
She's learning to become a top health professions educator.
What would you like to achieve?
Information compiled
by Deanka Preston and
Simone Thurkle
GLOSSARY

360 degree assessment
Evidence of progress collected from a variety of different people, who could include teachers, patients, team members or others. Often a variety of skills are assessed, including teamwork, communication skills and patient care.1
Assessment
Involves testing, measuring and collecting evidence against defined criteria, and providing feedback.1,2
Blueprint
A table to assist planning of
teaching and assessments. One
type lists learning outcomes
on one axis, and on the other,
the assessment methods and
the timing and weighting of
assessments. It often includes the
timing of teaching and the type of
learning to be achieved.
Checklist evaluation
A method in which competencies are broken into specific actions or behaviours.1 Checklists can be useful for self-assessment and for providing feedback.
Clinical Portfolio
Collection of work by a learner, relevant to their studies, to demonstrate progress. It may illustrate achievements, learning, reflections, or other items demonstrating learning.3,4
Competence
The attainment of knowledge and skills (interpersonal and technical) at a satisfactory level for clinical practice.1
Criterion-referenced assessment
Testing against an absolute standard.1
Difficulty index
Refers to the percentage of students who answer the question correctly. Questions that are too easy or too difficult are less useful for showing students' ability. Also called p value or facility index; the higher the p value, the easier the question. Ideal values are 0.3-0.8.5
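For example, if 75 of 100 students answer an item correctly, its difficulty index is 75/100 = 0.75, within the ideal range.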
Discrimination index
The ability of a test item to distinguish between students of lower and higher achievement. It is calculated from the proportions of correct answers in the upper and lower score groups. Higher values are ideal; negative values indicate a poor test item.
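As a simple illustration using the common upper-lower method: if 85% of the top-scoring group answer an item correctly but only 35% of the bottom-scoring group do, the discrimination index is 0.85 - 0.35 = 0.50.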
Distracter analysis
Comparison of the response
rates for the incorrect answers
(distracters) in an objective test
item.
DOCS - Direct observation of clinical skills
Observation of, and immediate feedback to, the student on a complete consultation. Usually multiple observations by multiple assessors are performed to enhance the reliability of this method of assessment.1
DOPS - Direct observation of
procedural skills
Analogous to DOCS, comprising
observational assessment of a
procedural task.
Evaluation
A systematic process of determining the effectiveness of learning activities by measuring achievement of learning outcomes.1 It can refer to individuals' achievement of learning outcomes, or to the effectiveness of teaching programs.
Formative assessment
The results of the assessment are intended to provide constructive feedback to the learner and to stimulate learning.2
Global rating
Assessors provide a rating on general overall performance.1
Item analysis
Quantitative methods are used to identify questions to keep, questions to alter and questions to discard. The methods can look at the ease or difficulty of individual questions, and the relationship between individual questions and the overall score.5
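As a minimal sketch of how the two indices above can be computed (an illustration only, not drawn from the cited references), the following Python fragment derives a difficulty index and an upper-lower discrimination index from a matrix of scored responses; the 27% group fraction is a common but adjustable choice:

# Illustrative item analysis: difficulty and discrimination indices.
# `responses` has one row per student, one column per item; 1 = correct, 0 = incorrect.
def item_analysis(responses, item, group_fraction=0.27):
    """Return (difficulty, discrimination) for one item (0-based index)."""
    n = len(responses)
    # Difficulty index (p value): proportion of all students answering correctly.
    difficulty = sum(row[item] for row in responses) / n
    # Rank students by total test score, then compare the top and bottom groups.
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, int(n * group_fraction))
    upper = sum(row[item] for row in ranked[:k]) / k
    lower = sum(row[item] for row in ranked[-k:]) / k
    discrimination = upper - lower  # negative values flag a poor item
    return difficulty, discrimination

# Example: five students, three items.
responses = [[1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 0, 1], [1, 0, 0]]
print(item_analysis(responses, item=0))  # prints (0.8, 0.0)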
Item shells
Templates which are used to
write objective test items, where
the content is inserted into blank
spaces.
Learning outcomes
The intended, measurable results of learning activities. The knowledge, skills and attitudes learners are expected to attain are described.
MCQ - Multiple choice questions
A written or computer based test of knowledge, containing questions with an initial stem, followed by a number of possible answers (options), only one of which is correct.
Mini CEX - Mini clinical evaluation exercise
An assessment tool in which a learner is observed engaging with a patient or client (e.g. taking a history).
Norm referenced
Setting cut-off scores by comparing a student's score with those of other students.
Options
Choices of answers in MCQ
questions.
OSCE - Objective structured clinical examination
One or more assessments of different clinical skills are made at a number of standardised stations (usually 12-20), each for a set period of time (often 10-15 minutes).1,4
OSTE - Objective structured teaching evaluation
Analogous to an OSCE, but the stations assess teachers' skills.
Peer assessment
Assessment of a learner by a
fellow student or colleague of
equal status, rank or qualification.
Performance appraisals
Assessment of a person's achievements over a set period of time, usually in an occupational role.
Performance based assessment
An evaluation of particular clinical activities. Checklists and logbooks are often used.1
Portfolio
Evidence is collected by the learner to demonstrate learning in specific areas, often including a reflective component.1
Reliability
The results of an assessment are reproducible, i.e. if a similar assessment was undertaken, the results would be similar.2
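For example, splitting a 100-item paper into two 50-item halves and comparing scores on the two halves (split-half reliability) is one common way of estimating this.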
Simulation
Assessment in as life-like a situation as practicable, usually using mannequins, role playing, standardised patients or computer based programmes.4
Standardisation
Each learner experiences a similar
assessment task, which is graded
in a similar way.
Standard setting
Defining the cut-off score for passing or failing, and for other grading.6 Standards may be criterion referenced or norm referenced: criterion referencing sets an absolute score, while norm referencing sets the cut-off score by comparison with the cohort of students.7
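For example, a criterion-referenced cut score might be fixed in advance at 65% regardless of how the cohort performs, whereas a norm-referenced cut score might be set at, say, one standard deviation below the cohort mean, moving with each cohort's performance.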
Standardised patient
A person trained to provide a consistent history or other clinical information to all students undertaking the assessment.1
Stem
Initial statement in MCQ
questions.
Summative assessment
The collection of evidence by the end of a course that demonstrates how much a student has learnt, and often determines whether or not a learner progresses to the next stage.1
Validity
Refers to how accurately an assessment measures what it intends to measure.1
Concurrent Validity
The test gives the same result as another accepted or proven test which measures the same material.1
Content Validity
The test contains a good representation of the content of the course.1
Construct Validity
Whether the assessment actually measures the intended construct. Construct validity is evidenced by better scores among learners who have mastered the subject. For example, a test given to year 10 and year 12 students in which the year 10 students scored just as highly would not have construct validity.
Face Validity
The test measures what it says it will measure.8
Predictive Validity
The ability of an assessment to predict particular outcomes.1
Workplace based assessment
An approach to authentic assessment of a learner's knowledge and clinical skills in their everyday workplace.
Alison Creagh
References
1. Wojtczak A. Glossary of Medical Education Terms: AMEE Occasional Paper No. 3. AMEE; 2003.
2. Norcini J, Anderson B, Bollela V, Burch V, Costa M, Duvivier R, et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Medical Teacher. 2011;33:206-214.
3. Friedman Ben-David M, Davis M, Harden R, Howie P, Ker J, Pippard M. Portfolios as a method of student assessment. Medical Teacher. 2001;23(6).
4. Accreditation Council for Graduate Medical Education and American Board of Medical Specialties. Toolbox of Assessment Methods. 2000.
5. Tavakol M, Dennick R. Post-examination analysis of objective tests. Medical Teacher. 2011;33:447-458.
6. Khan K, Gaunt K, Ramachandran S, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: Organisation & Administration. Medical Teacher. 2013;35:e1447-e1463.
7. Wrigley W, Van Der Vleuten C, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Medical Teacher. 2012;34:683-697.
8. Schuwirth LWT. Assessing medical competence: finding the right answers. The Clinical Teacher. 2004;1(1):14-18.
THE PUZZLE PAGE ANSWERS

Options:
EACH OPTION MAY BE USED ONCE, MORE THAN ONCE, OR NOT AT ALL
A. Educational impact
B. Practicality
C. Reliability
D. Standardisation
E. Student satisfaction
F. Uniformity
G. Validity
Lead in: For each assessment of an instrument, select the characteristic of effective assessment being measured.
Stems:
1. Item scores from a 100-item
multiple choice question paper
are divided into two sets of 50
questions each that are compared.
2. Scores on a written examination in the final year of the medical course are analysed to determine whether they are associated with successful completion of the intern year.
3. The feasibility and costs (including materials and staff time) of two instruments being considered for use in the new MD curriculum are compared.
4. Student feedback, and overall performance in a unit, are compared before and after the addition of a formative Objective Structured Clinical Examination at mid-semester.
Answer/Key
1 C
2 G
3 B
4 A
Stem:
When evaluating an assessment,
concurrent validity is when the
test:
Options:
A measures the correct learning
outcome
B measures a large enough
sample of the intended learning
C is appropriate to the learner's stage
D predicts success after
graduation
E gives the same results as
another test measuring the same
outcome
Answer/Key E
Stem:
Kirkpatricks model of evaluation
has four levels which measure:
Options:
A. effect, knowledge,
performance, outcomes
B. feedback, education, conduct,
grades
C. reaction, learning, behaviour, results
D. response, outcome, behaviour,
results
Answer/Key C