Assessment and Evaluation in Health Professional Education
ISSUE 1 2013
COVER PHOTOGRAPHS:
Jodie Atkinson
UniPrint 110387
FRONT ROW (from left): Deanka Preston, Kate Gray, Jodie Atkinson
MIDDLE ROW (from left): Simone Thurkle, Debra Jeavons, Lubna Al-Hasani,
Sandy Presland, Alison Creagh
BACK ROW (from left): Christopher Etherton-Beer, Elisabeth Feher
Apply now for postgraduate study.
Are you a health professional interested in developing or improving
your skills as an educator?
To meet the growing need for educators in the health professions,
The University of Western Australia's Faculty of Medicine, Dentistry
and Health Sciences is offering the following postgraduate
courses in 2014:
Graduate Certificate in Health Professional Education
Graduate Diploma in Health Professional Education
Master of Health Professional Education (by thesis and
coursework or by coursework and dissertation)
These courses can be completed full-time or part-time.
Visit meddent.uwa.edu.au/healthedu or come to the course
information session on Monday 11 November, 4.30 to 5.30pm.
For more information phone Caroline Martin on 6488 6881.
They're learning to become top
health professions educators.
What would you like to achieve?
CRICOS Provider Code 00126G
Contents
Editorial. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Profiles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Features
Studying at UWA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Evaluation metaphors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
When assessment goes wrong: horror stories! . . . . . . . . . . . . . . . . 11
Blueprinting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Social media . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Does your OSCE make the cut?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Direct observation checklists . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Assessment on the go . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Interviews
Dealing with difficult students (Christine Adams) . . . . . . . . . . . . . . . 12
Different perspectives on midwifery education . . . . . . . . . . . . . . . . 22
(Sadie Geraghty, Sarah Bayes and Dianne Bloxsome)
Assessing physicians of tomorrow (Claire Harma) . . . . . . . . . . . . . . . 36
Learning issues
Assessing clinical competency. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Making sense of portfolios. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Assessing registered nurse performance. . . . . . . . . . . . . . . . . . . . . . 49
Puzzle page. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Crossword . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Useful websites and resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Quiz and crossword solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
A S S E S S M E N T Z O N E
2
The authors acknowledge the Traditional Owners of the land on which we work and live.
Informed consent was obtained from each health professional interviewed for this magazine.
Copyright the authors
Assessment Zone is licensed under a Creative Commons Attribution-Noncommercial-
No Derivative Works 2.5 Australia License (2013).
Contact: Associate Professor Zarrin Siddiqui (zarrin.siddiqui@uwa.edu.au)
3
Welcome to the inaugural issue of
Assessment Zone: Assessment and
Evaluation in Health Professional Education.
The IMED 5802 student group for Semester
2, 2013, is proud to present this magazine
for our Capstone Experience Group Project.
This group project forms a major assessment
for each student undertaking this unit. It is
the culmination of eleven interprofessional
students electing to work collaboratively to
design and compile a magazine featuring
topics related to assessment and evaluation
geared towards health professional educators.
The composition, core ideas and philosophy
for this magazine align with our learning
outcomes for the unit IMED 5802 Principles
of Assessment and Evaluation. These learning
outcomes are:
Explain the role of assessment in the
learning process
Discuss techniques of measurement
Explain the validity and reliability in
assessment and evaluation
Describe the disadvantages and
limitations of different assessment
tools
Design and critique assessment and
evaluation tools
Assess the quality of an examination
by looking at its constituent parts
Explain the principles and models of
evaluation
We have also committed to applying and
maintaining the UWA educational principles
throughout the magazine items and production.
The conception of this magazine has not
been without its challenges. We formed our
student group early in the semester, many
of us meeting for the first time and with
varying work and study schedules. With a
timeline in place, we met weekly after class
to garner concepts, ideas and viewpoints
for the magazine items. Meetings were
documented, and communication external
to the classroom was posted via Facebook,
the UWA Learning Management System and
student email accounts. The magazine name,
design and format were decided upon by a
voting system. We utilised Joomag to view magazine templates, and
the Dropbox online application as a central workspace to submit
and review individual student creations. We completed the process
by accessing the expertise of UniPrint to assist with the formatting,
design and publication of the final product.
We would like to acknowledge Associate
Professor Zarrin Siddiqui, unit coordinator,
for providing direction to us as required.
On behalf of the student group, I hope this
magazine resonates with you and your role
as an educator, and that you benefit from the
personal stories, feature articles, interviews,
and resource tools that we have produced.
Deanka Preston
Editorial
4
Jodie Atkinson is a
Registered Midwife and
Nurse. She currently works
as a Clinical Facilitator at
King Edward Memorial
Hospital and Edith Cowan
University, supervising
both undergraduate and
Master's midwifery students.
Additionally she works as
a sessional lecturer, tutor
and research assistant at
ECU. Jodie is completing
her Master of Health
Professional Education.
Alison Creagh has loved
medical teaching since 1979,
and has been employed as
Medical Educator at Family
Planning WA since 2000. She
has also worked for many
years in general practice
and womens health. Alison
is completing her Masters
in Health Professional
Education.
Christopher Etherton-Beer
is a Geriatrician and Clinical
Pharmacologist at The
University of Western
Australia and Royal Perth
Hospital. Christopher is
completing the Graduate
Certificate in Health
Professional Education.
Elisabeth Feher is a Medical
Education Registrar at SCGH
and Advanced Trainee in
Geriatric Medicine. Her
current educational projects
include running an education
program for International
medical graduates and
Introducing simulation into
Basic Physician Training.
Elisabeth is completing the
Graduate Certificate in Health
Professional Education.
Kate Gray is a
Physiotherapy Coordinator
of Medical and Student
Services at Joondalup Health
Campus. Kate is completing the
Graduate Certificate in Health
Professional Education.
Profiles
5
Debra Jeavons is a Staff
Development Midwife responsible
for the clinical experiences of
student nurses and midwives
(undergraduate and postgraduate)
and graduate midwives, and
provides professional development
and orientation for all staff. She
facilitates interprofessional obstetric
education within the health service.
Debra is completing the Graduate
Certificate in Health Professional
Education.
Janet Vince is a Staff
Development Educator at Swan
Kalamunda. She has an extensive
background in critical care,
specialising in ICU, and experience
in nursing education, and is
presently working on the
development of the interprofessional
education project for Fiona
Stanley Hospital. Janet is
completing the Graduate
Certificate in Health
Professional Education.
Lubna Al-Hasani is an
international full-time student
in the Master of Health
Professional Education at
The University of Western
Australia. She is Head of
Training and Staff Development
at the Royal Hospital, Muscat, Oman.
Sandy Presland is a
Clinical Nurse in the Recovery
Room at Fremantle Hospital.
Her responsibilities include
clinical supervision and team
leadership of undergraduate
and postgraduate nurses.
Her career direction is nursing
education and leadership.
Sandy is completing the
Graduate Certificate in Health
Professional Education.
Deanka Preston has been
a Registered Nurse for 21
years and in the last 10
years has worked in the
staff development area
in metropolitan teaching
hospitals. She is currently
Acting Professional
Development Educator
for the Graduate Nurse
Program at Swan Kalamunda
Health Service. Deanka is
completing her Graduate
Certificate in Health
Professional Education.
Simone Thurkle is a Staff
Development Nurse at SCGH
Emergency Department.
She has completed a
Postgraduate Certificate in
Emergency Nursing and
is currently completing a
Graduate Certificate in Health
Professional Education.
6
Studying at UWA
By Lubna Al-Hasani

As an international postgraduate student at The University of Western
Australia (UWA), my experience has been enriched both personally and
professionally. Coming from a different background has made me value the
experience from a different perspective. Also, embracing the cultural
differences has led me to critically analyse and adopt the best practices.

The first thing about Australia which I found different from my country
was calling people by their first name. I still struggle to do this,
even in emails.

Also, being a student with a different background (both cultural and
religious) was difficult at the beginning. But unlike other countries,
I felt welcomed not only in the university but in the community too.
I did find it difficult to get halal food on the university campus,
but it is plentiful in the community.

In terms of professional growth, I have found the university environment
collaborative and stimulating. UWA is full of resources to assist students
in their studies, such as the Study Smarter sessions. Nevertheless, although
these are very helpful, I think UWA should increase the number of tutors in
sessions like Write Smarter, to cover as many students as they can. Compared
to my experience at my previous university, more tutors were available for
assistance, and for a longer time.

At UWA the teaching methods are varied, such as face-to-face lectures,
workshops and online modules. This integrated curriculum creates a
comprehensive and fruitful learning experience. Just being in a setting
amongst a group of diverse health care professionals with different backgrounds
7
and experiences is a valuable asset. Besides,
the lecturers are very supportive and
cooperative, helping the students to reach
their potential in professional growth.
The approach UWA adopts in its teaching
is to create future professionals who are great
team players. Most of the assignments are
performed in teams, and it is this approach
that leads to professional maturity.
Moreover, I have been exposed to several
forms of formal and informal assessments.
Examples of these assessments are
assignments, student-led presentations
and reflections. Interviewing professionals
from the real working environment and
critically analysing their organization is
another assessment example. I believe
that the assessments, even though they
take me out of my comfort zone, have
helped me to discover my capabilities
and, therefore, to acknowledge my
limitations and improve on them.
Assessments at postgraduate level are
more in-depth, comprehensive and based
on evidence-based practice. This promotes
critical thinking and develops practical
application in the real world. Also, receiving
feedback in the middle of the unit demonstrates
the transparency of the university, as it is
open to constructive feedback.
Going through this experience is not an
easy path, and I have found it a difficult
yet exciting experience. However, it is
the difficulties that make a person and
shape their future. This is an extraordinary
experience and it will assist me in my
journey as an educator and a health care
professional.
8
evaluation: student thoughts, experiences and perceptions explained with the help of a metaphor

Learning to drive
by Sandy Presland

From the day my daughter sat behind the wheel of a car, it was not just about starting the
car and going forward in a straight line. There were road conditions to evaluate, such as
traffic hazards, road works and pedestrians. We had to consider the timing of the lesson and
my daughter's readiness to learn. At the end of each learning session we would look
back and assess how she went, then analyse her progress. From there we would plan the next
lesson and make any changes to the teaching plan, such as practising reverse parking or perfecting
parallel parking.

As my daughter's driving skills improved, we would give her more challenges to keep her engaged
and increase her skill. But there came a point where she reached a plateau and was not progressing.
Evaluating our teaching abilities, we realised that, as parents, we were not the best teachers. So
we brought in expert teaching skills in the form of a driving instructor. With access to improved
teaching tools, my daughter went on to pass her driving assessment.

When I supervise students and junior nurses, it's a little like learning to drive. They will often joke
that they are on their L plates. We take them on a learning journey, letting them steer the wheel,
but if we do not evaluate the teaching tools used along the way, they may not progress as well.
Poor performance can be the result of an inadequate teaching program. Evaluating the teaching is
just as important as assessing the learning.
9
The perfect pup
By Kate Gray

One day my husband and I set out to find the perfect pup. The first thing we
had to do was to research breeds that were appropriate. We assessed a variety
of types in terms of size, type of hair, ease of training and temperament.
After narrowing the search down, much to my husband's surprise, a toy poodle
was the winner! The next thing we had to do was to assess the best place to
purchase a puppy from: breeders versus pet shops. We received feedback from
family and friends and came to the conclusion that a breeder would better
suit our needs. Finally, when we found a breeder that suited, we had to assess
which particular dog would suit us best. We looked from dog to dog and played
with them one by one, spending time with them to make our decision. After
playing with the hyperactive one, the yappy one, the sulky one... we found
him! The little fluffy, quiet but happy and playful one: Lenny.
After a few sleepless
nights, cleaning up a few
surprises, puppy school
and park excursions, my
husband and I reflected
on our purchase and how
our lives had changed.
We evaluated the
whole experience and
wholeheartedly came to
the conclusion that getting
Lenny was one of the best
things we had done!
I consider the selection of
students' supervisors an
important task. I look at the
supervisor's experience,
training and motivation,
and assess their ability
to adequately support
the student's learning.
After the placement has finished, I reflect
on the student's experience, the feedback
they received and their final assessment
results.
results. I receive feedback specifically from
the student and the supervisor and after
considering all of this information, I can
evaluate the effectiveness of our student
program. The ultimate outcome is that
hopefully we have positively shaped the
student into becoming a competent and
confident health professional.
10
My Daughter's Journey
by Janet Vince

Two years ago my daughter decided to move schools. She was unhappy
where she was and had some ideas as to where she wanted to go. Together we
looked at why she wanted to move, to make some objective decisions.

Having identified various points, we listed the qualities we wanted to see
in the schools we were to look into. From here we began our research.
We searched the internet, contacted various schools to review their
prospectuses, liaised with students and parents from various schools,
and finally selected a few we wanted to visit.

From here we were able to make a formative assessment of what each school
could offer, guiding our choice. As I evaluated the choice we made, I based
it on continuous assessment of my daughter's performance and happiness.

When planning my education programs, I am aware that everyone has a choice,
just as we had. If I don't continuously assess what I am providing and change
according to feedback, my evaluation will reflect a lack of interest and
engagement in the programs. I value feedback through surveys, questionnaires,
evaluation of study sessions and word of mouth. This gives me vital
information to change and adapt according to students' wants and needs.
My evaluation is the success of overall outcomes.
11
Assessment Horror Stories

Assessment can be a terrifying experience for both the assessor and the
assessee. Our group reflected on their own horrifying experiences of the past.
Read on if you dare!! What we recommend NOT doing to your students...

The VIVA is mine and several of my colleagues' biggest horror stories.
Different students get different assessors. In my case the assessor interacted
with me and argued about one of my answers when she was not meant to interact
at all. This got me off track during the exam and meant that I did not do as
well as I could have. My colleague had a similar experience, but in her case
she knew the answer the assessor wanted, which would have given her a pass,
but it was not what she wanted to say or what she believed. Another colleague
got so distressed by an oral exam that she froze, got flustered and failed,
even though she knew the correct responses. Several assessments are done at
one time with the same question; however, each assessor has their own
individual philosophies and interpretation of the question. Students were
aware of which assessors marked easier or higher than others. You knew that
if you got certain assessors you would receive a lower mark even giving the
same answers. This assessment appears to have caused the most distress.

My most awful and instructive experience was at the end of my medical degree,
when everything (back then) depended on passing the final exams, of which
there were quite a number, both clinical and written. I saw a high proportion
of my fellow students falling apart in a variety of ways. Some used lots of
alcohol, others took sedatives, some took illicit substances, and some became
extremely anxious and depressed, including one who was admitted to hospital.
I did learn from this. Personally, my coping skills included taking a fatalist
approach: I will do my best, and if that's not enough, the worst thing that
can happen is to have to repeat. Professionally I learnt, amongst other
things, that continuous assessment would be better!
My horror story is about a friend of mine. She was a final year student
and on prac at a hospital. She had thought a comment she had heard from a
previous student, saying they would rather scrub toilets, was an exaggeration
until she experienced it for herself. There were 2 supervisors. They would
contradict what the other one said, and even contradict their own advice the
following day. They gave mainly negative feedback. From a confident, happy
person my friend became very stressed and dreaded each day, worrying that she
would fail and therefore be unable to graduate. She was not told until the
last day that she had passed, subject to a special consideration form from
uni. Considering she got a high distinction in every other prac before and
afterwards, it must say something about the experience! There were 2 things
she learnt from this: first, that positive feedback (or at least constructive
feedback) and a supportive learning environment are what students need to
flourish; secondly, how not to treat students in the future. (By the way,
this happened recently, not a long time ago.)
[Figure: rating scale showing levels of developing competence, competent and mastery]
A Learning Issue is a focused question
generated about issues related to assessment
and evaluation. Alison has examined the issue
of the reliability of the assessment of clinical
competence.
Assessing
clinical
competency
by Alison Creagh
For each skill, the same list of ideal attributes
was used as in our clinics.
Amongst the 15 clinicians present, there was
an unexpectedly wide range of assessments
for each of the five video clips. This led to a
consideration of the reliability of assessments
of competence, and whether reliability could
be improved.
17
Discussion

Miller's pyramid¹ of health professional assessment is:

[Figure: Miller's pyramid, from base to apex: knows, knows how, shows how, does]

It seems widely accepted² that assessing actual practice ("does", in
Miller's pyramid) is more predictive of work skills than assessing
performance in simulated or standardised settings (the "shows how" level).
However, assessing competence in the workplace appears more complex than
assessing the lower layers of the pyramid.
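The pyramid can be pictured as a simple lookup from assessment format to the level it chiefly targets. The format-to-level mapping below is an illustrative sketch using common medical-education examples, not a classification taken from Miller's paper:

```python
# Miller's pyramid levels, from base to apex.
MILLER_LEVELS = ["knows", "knows how", "shows how", "does"]

# Illustrative only: typical formats and the level each chiefly assesses.
ASSESSMENT_FORMATS = {
    "multiple choice exam": "knows",
    "case-based written exam": "knows how",
    "OSCE station": "shows how",
    "workplace-based assessment": "does",
}

def level_rank(assessment_format: str) -> int:
    """Return the pyramid level (0 = base) a format chiefly assesses."""
    return MILLER_LEVELS.index(ASSESSMENT_FORMATS[assessment_format])

# A workplace-based assessment sits higher on the pyramid than an OSCE,
# which is the article's point about predicting actual work skills.
assert level_rank("workplace-based assessment") > level_rank("OSCE station")
```
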
A number of ways to improve the reliability of assessments of competence
have been suggested in the literature.

Assessors can be provided with training to increase consistency between
them.³ For example, all would observe a particular scenario, rate the
competence of the trainee, and then discuss any differences between their
assessments.

A commonly accepted strategy is to ensure that assessments are made on
multiple occasions by multiple assessors.⁴ This has the effect of smoothing
out both individual assessor variations and the differences in assessment
resulting from different levels of complexity in the cases seen. Supervisors
could identify trainees who are not having sufficient assessments performed,
and facilitate further training and assessment sessions.⁴ A practical
difficulty with this strategy may be limitations on the number of
assessments that can be performed, as they can be time consuming and costly.
Some authors have discussed generalizability theory as a method of
calculating the reliability of competency assessments. These calculations
depend on a number of assessors seeing the same students performing in a
number of cases, and then looking at the variation in marks due to
individual assessors, the variation between students and the variation
between cases, to work out the degree of variation due to each of these
factors.

In one study, using some standard rating scales and some devised for the
study, it was found that more specific rating scales, aligned more closely
with assessors' ways of thinking about trainees, were more reliable than
the standard, generic ones. For example, some of the generic scales included
features such as whether a trainee was at or above the expected level.
Assessors tended to avoid more pejorative ratings, and infrequently rated
trainees as below the expected level. Scales that were found to be more
reliable included those where the degree of trust in the trainee was rated,
and those
18
which assessed the trainees' level of independence to perform certain tasks.
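The variance-partitioning idea behind generalizability theory can be sketched in a few lines. This is a minimal illustration for a fully crossed students × assessors design with invented ratings, not the full machinery used in the studies cited:

```python
# Minimal generalizability sketch: partition rating variance into student,
# assessor and residual components, then compute a G coefficient.
# Ratings are invented for illustration (rows = students, cols = assessors).
ratings = [
    [7, 6, 5],
    [8, 7, 6],
    [9, 8, 7],
]

n_s, n_a = len(ratings), len(ratings[0])
grand = sum(map(sum, ratings)) / (n_s * n_a)
s_means = [sum(row) / n_a for row in ratings]
a_means = [sum(ratings[s][a] for s in range(n_s)) / n_s for a in range(n_a)]

# Mean squares from a two-way ANOVA without replication.
ms_student = n_a * sum((m - grand) ** 2 for m in s_means) / (n_s - 1)
ms_assessor = n_s * sum((m - grand) ** 2 for m in a_means) / (n_a - 1)
ms_resid = sum(
    (ratings[s][a] - s_means[s] - a_means[a] + grand) ** 2
    for s in range(n_s) for a in range(n_a)
) / ((n_s - 1) * (n_a - 1))

# Estimated variance components.
var_student = (ms_student - ms_resid) / n_a
var_assessor = (ms_assessor - ms_resid) / n_s
var_resid = ms_resid

def g_coefficient(n_assessors: int) -> float:
    """G coefficient for relative decisions with n_assessors per student."""
    return var_student / (var_student + var_resid / n_assessors)

# With these perfectly additive toy ratings the residual is zero, so the
# mean over assessors separates students perfectly (G = 1.0).
print(var_student, var_assessor, g_coefficient(3))
```

Note that `g_coefficient` grows as `n_assessors` increases whenever the residual variance is non-zero, which is the quantitative version of the "multiple assessors on multiple occasions" strategy above.
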
Another recommendation is to consider who the assessors should be. Should
there be 360-degree assessments, with everyone working with a trainee
providing a point of view? It seems that the most reliable assessors are
those who observe the particular skills required on a regular basis. For
example, ward clerks may usefully provide assessment information on
communication skills or on teamwork, but would not provide reliable
assessments of clinical skills.³
Finally, what information should be provided to assessors to guide their
ratings of trainees? Some organisations have developed detailed checklists
in an attempt to reduce the subjectivity of assessments.³ Examples of
checklist items are "trainee introduced themselves" or "trainee scrubbed
hands correctly". In a comparison between the use of detailed checklists
and a global rating of trainees, it was found that the global rating was
more reliable.³ This is concisely summarised:

"perhaps performance is more than the sum of its parts"³
Conclusion
Assessing competency in practice is a
better predictor of actual work skills than
assessments of knowledge or of skills in
standardised situations.
In summary, useful strategies include:
use assessors who both observe and
perform the relevant skills on a regular
basis.
ensure that a broad range of assessors
provide feedback and assessment on
multiple occasions.
train assessors in the consistent use of
rating scales.
use rating scales that are specific to the
type of competence being assessed,
rather than generic ones.
ensure that rating scales are aligned to
assessors ways of thinking.
ask assessors to make judgements, rather
than use detailed objective checklists.
As Crossley and Jolly³ say,

"scraping up the myriad evidential minutiae of the subcomponents of the
task does not give as good a picture as standing back and considering the
whole."
References
1. Miller G. The assessment of clinical skills/competence/performance. Academic Medicine. 1990;65:S63-7.
2. Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: construct alignment improves the
performance of workplace-based assessment scales. Medical Education. 2011;45:560-9.
3. Crossley J, Jolly B. Making sense of work-based assessment: ask the right questions, in the right way, about
the right things, of the right people. Medical Education. 2012;46:28-37.
4. Homer M, Setna Z, Jha V, Higham J, Roberts T, Boursicot K. Estimating and comparing the reliability of a suite
of workplace-based assessments: an obstetrics and gynaecology setting. Medical Teacher. 2013;35:684-91.
19
OPTIONS
EACH OPTION MAY BE USED ONCE,
MORE THAN ONCE, OR NOT AT ALL
A. Educational impact
B. Practicality
C. Reliability
D. Standardisation
E. Student satisfaction
F. Uniformity
G. Validity
Lead in: For each assessment of an
instrument, select the characteristic of
effective assessment being measured.
STEMS
1 Item scores from a 100-item multiple
choice question paper are divided into
two sets of 50 questions each that are
compared.
2 Scores on a written examination in the
final year of the medical course are
analyzed to determine whether they are
associated with successful completion of
the intern year.
3 The feasibility and costs (including
materials and staff time) of two
instruments being considered for use in
the new MD curriculum are compared.
4 Student feedback, and overall
performance in a unit, are compared
before and after the addition of a formative
Objective Structured Clinical Examination
at mid-semester.
SELECT THE CORRECT OPTION
STEM:
Concurrent validity in assessment:
OPTIONS
A measures the correct learning outcome
B measures a large enough sample of the
intended learning
C is appropriate to the learner's stage
D predicts success after graduation
E gives the same results as another test
measuring the same outcome
SELECT THE CORRECT OPTION
STEM
Kirkpatrick's model of evaluation has four
levels, which measure:
OPTIONS
A. effect, knowledge, performance, outcomes
B. feedback, education, conduct, grades
C. reaction, learning, behaviour, results
D. response, outcome, behaviour, results
Solution page 60.
Created by Alison Creagh, Christopher Etherton-
Beer and Deanka Preston.
The Puzzle Page
Test your knowledge
20
Blueprinting (or creating an assessment Blueprint) is an approach linking assessment with
curricular content and learning outcomes.¹ The Blueprint itself is usually in tabulated format,
which is then used to assist planning of teaching and assessments. One type lists learning
outcomes on one axis and, on the other, the assessment methods and the timing and weighting
of assessments. It often includes the timing of teaching and the type of learning to be achieved.
Like learning experiences and courses, blueprints come in all shapes and sizes! To illustrate these
points, here we share a Blueprint for assessment in a course for preceptors.
Course: Preceptorship Course
Unit: Providing Constructive Feedback
Students: 10 health care professionals
who have at least 3-4 years of
experience
Course Duration: 5 Days (combining
classes and practical exams at the end)
Course Requirements: Pre-reading
assigned articles
Unit Duration: 3 Hours / 3 days
Learning Outcomes:
At the end of this unit, the students will be
able to:
1 Identify the different methods of providing
constructive feedback
2 Discuss the advantages of constructive
feedback
3 Construct real examples from the working
environment and criticize
4 Demonstrate how to provide constructive
feedback effectively
5 Analyze different situations and evaluate
the best ones
Continuous Assessment                          Marks   Time
1 Multiple Choice Questions                    5%      1st day of the unit
  (pre and post the unit)
2 Student-led presentation                     15%     2nd day of the unit
  (15 minutes each student)
3 Assignment                                   30%     End of the course
Linking assessment to the curriculum
the Blueprint approach
by Lubna Al Hasani
Assessment
Continuous Assessment
Summative Assessment: Practical exam
(50%) at the end of the course.
The Course Outlines (Criteria): The
student must pass the practical exam in
order to pass the unit.
21
Learning Outcomes                              Recall   Interpretation   Application   Skills   Total
1. Identify the different methods of
   providing constructive feedback             5%                                               5%
2. Discuss the advantages of
   constructive feedback                                15%                                     15%
3. Construct and critique real examples
   from the working environment                                          30%                    30%
4. Demonstrate how to provide
   constructive feedback effectively                                                   50%      50%
Total                                                                                           100%
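A blueprint like this is straightforward to represent and sanity-check as data. The sketch below mirrors the unit's outcomes and weights; the aggregation and the 100% check are our own illustrative additions, not part of the published course:

```python
# A blueprint as data: each learning outcome carries a cognitive level
# and a weight (% of the total assessment). Illustrative sketch only.
blueprint = [
    {"outcome": "Identify methods of constructive feedback",
     "level": "Recall", "weight": 5},
    {"outcome": "Discuss advantages of constructive feedback",
     "level": "Interpretation", "weight": 15},
    {"outcome": "Construct and critique real examples",
     "level": "Application", "weight": 30},
    {"outcome": "Demonstrate constructive feedback effectively",
     "level": "Skills", "weight": 50},
]

def total_weight(bp) -> int:
    """Sum of all outcome weights; should account for 100% of marks."""
    return sum(row["weight"] for row in bp)

def weight_by_level(bp) -> dict:
    """Aggregate weights per cognitive level, e.g. to spot coverage gaps."""
    levels = {}
    for row in bp:
        levels[row["level"]] = levels.get(row["level"], 0) + row["weight"]
    return levels

# A blueprint should account for exactly 100% of the assessment.
assert total_weight(blueprint) == 100
print(weight_by_level(blueprint))
```

Keeping the blueprint in a machine-checkable form makes it easy to verify, each time the curriculum changes, that the weights still sum to 100% and that no cognitive level has been left unassessed.
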
References
1. McLaughlin K, Lemaire J, Coderre S. Creating a reliable and valid blueprint for the internal medicine clerkship
evaluation. Medical Teacher. 2005;27(6):544-547.
Unit Content:
1 What is preceptorship?
2 Preceptor and preceptee roles and
characteristics
3 Achieving Competencies
4 Assessment
5 Clinical Supervision
6 Constructive Feedback
7 Evaluation
8 Conflict Resolution
Teaching and learning processes for
the course: Pre- reading, presentation,
workshop, practical exams and seminars.
Assessment Blueprint
22
We are writing a magazine on assessment
and evaluation in health professional
education. What would be the one most
important piece of advice you'd like to
share with our readers?
I would suggest: stop taking advice and go
with your gut feelings; what works for others
may not work for you. Trial and error is
appropriate for most people. I get up at 0500
and work until 0730 before work when I have
deadlines in summer; I am a morning person.
This works for me, but not for everyone.
Is there something you wish you'd learnt about assessment and evaluation earlier in your career?

I would have benefited from knowing about rubrics and marking guides early in my career; I never knew what different lecturers wanted from assignments. A rubric or marking guide shows what is required and what marks can be obtained for work produced. I would have been a better proofreader and critical analyst if I had known this sooner.
Different Perspectives on Midwifery Education
Sadie Geraghty is a Midwife and Midwifery Educator. She is Co-ordinator of Postgraduate Midwifery at Edith Cowan University.
Tell us about the structure of assessment used in postgraduate midwifery education at present.

Postgraduate midwifery uses blended learning and teaching, which is reflected in assessments. Students are assessed using a variety of methods: written exams, presentations, reflections, OSCEs, essays, and creative assignments (poetry, painting, drawing, model-making). Students usually excel at one type of assessment.
Do you think these assessments adequately determine whether a student is competent for registration as a midwife?

They are part of the assessments for registration; the clinical assessments are equally important. The theoretical assessments complement the clinical assessments by providing knowledge and critical thinking abilities. Without both clinical and theoretical assessments a student would not be adequately assessed for registration.
What advice would you give in working with difficult learning situations?

There is no one piece of advice that is appropriate in difficult learning situations, but I would suggest remaining grounded. Life experience helps us to deal with situations, and most of us learn from experience. Watching how others deal with difficult situations is also beneficial; we often learn by example.
Enjoy the learning experience and find ways to minimise stress. Remember: being a midwife is the best job in the world - 80% of our role is education!
What issues do you experience regarding underperforming students, both academically and in clinical placement?

I don't think students under-perform or over-perform. Our students are only as good as the facilitators and teachers who are involved in their learning. In my experience students are either over-confident (rare) or under-confident, and compensate to overcome this. We facilitate learning; we guide and direct them the right way to go. Bullying and punishing is not facilitating learning. However, students must have the capacity and motivation to learn in the first place.
How do you deal with student reaction when they are informed they are performing unsatisfactorily or failing?

I don't ever tell students they are failing: if they fail, I am failing. I try to find out what is going on in their lives and attempt to help them find a solution. I have never had a bad reaction from a student, even after years of being a clinical facilitator. I find their strengths, which usually helps them overcome their weaknesses. I always balance negative comments with positive. Most people who choose to become a midwife do so for altruistic reasons, so I can usually find positive things to help them back onto the path of learning.
Di, what would be the one most important piece of advice you'd like to share with our readers?

Believe in yourself, be confident and prepared.

Is there something you wish you had learnt about assessment and evaluation earlier in your career?

No, I was given a great rubric to follow for assessments.
Do you think these assessments adequately determine whether a student is competent for registration as a midwife?

No, I don't. I think a lot of students slip through as competent, and when they get out on the wards and birth suite they cannot apply themselves and their skills.
Dianne Bloxsome
is a Sessional
Academic teaching
Nursing, Midwifery
and Paramedicine
and a Clinical
Supervisor of
Nursing and Midwifery Students at
Edith Cowan University. She is a
Registered Nurse and Midwife.
What advice would you give in working with difficult learning situations?

Be confident and get support from superiors if needed. A problem shared is a problem halved! Speak with the student alone, get advice and support from the lecturer, put the student on a learning contract, and be sure to review it.
How do you deal with student reaction when they are informed that they are performing unsatisfactorily or failing?

I have found the students will either be accepting or defensive; either way, they are upset and annoyed that you have had the courage to pick them up on their incompetence.

Any final words of wisdom?

When deciding whether to pass or fail a student in question, ask yourself: "Would you want to work with them?" and "Would you like them to look after you or your loved one?" If the answer is no, you fail them.

Dr Sara Bayes is currently a Senior Lecturer and Midwifery Program Director at Edith Cowan University and a Postgraduate Research Fellow at the University of Nottingham. She is a Registered Nurse and Midwife.
Sara, can you tell our readers if there is something you wish you had learnt about assessment and evaluation earlier in your career?

Well, my early experience of assessing and evaluating students was that I was extremely well supported by the more experienced academics around me, who constantly, intensively and proactively checked my marking to make sure I was doing the right thing by the students. So my only contribution here, if that hadn't been the case, would be to ensure that when you're new to it, you set yourself up with an effective mentoring system.
Are there any resources that have changed your approach to assessment and evaluation?

Only the other academics that I've come into contact with and milked for their experience and wisdom over the years!
Do you think these assessments adequately determine whether a student is competent for registration as a midwife?

For my setting, I do. Assessments of all types in each theory and practicum unit throughout each midwifery preparation program are structured to enable the student to demonstrate their developing competence in a number of the NMBA Midwifery Competencies. By the end of their course, they will have been assessed against all of the NMBA competencies.
What advice would you give in working with difficult learning situations?

I think just to remain fully objective and stay right away from labelling one thing or person as responsible too quickly. The fact (usually) is that an interaction of a range of factors means the situation is not as optimal as it could be, so it needs reflecting on dispassionately and investigating sensitively. If there is one person who is making things tricky, there is usually a whole sub-story underneath the behaviour, and it's usually not a happy one. Other than that, I would say make sure expectations and limitations are clear to everyone (and documented), and stay close and visible.
What issues do you experience regarding underperforming students, both academically and in clinical placement?

Not that many really, because our courses have such small numbers, so we are able to get on to under-performance issues early on. The main thing in both would be not so much a lack of intellectual capacity but a lack of time available for individual students to develop towards the milestones we expect them to reach in their own time. All students must achieve certain things by certain time points through their course, but of course everyone learns ("gets it") at a different rate! So that's a challenge.
How do you deal with student reaction when they are informed they are performing unsatisfactorily or failing?

I think just by staying calm and clear, and retaining an adult-adult communication pattern, but still giving time to acknowledging the student's disappointment, fear or whatever, plus all the things I said in the question regarding difficult learning situations, and remaining future-focused: staying focused on what we can do to try and remedy the situation, reinforcing the positives, and continuously bringing the conversation back to an objective appraisal of the situation and how to make it better.

Nine times out of ten that works to maintain the student's optimism that it's not the end of the world and to galvanise their resolve and motivation to do their best to get back on track. But there will always be the case where nothing positive you say or do will get through, and the student will be angry and/or resentful and take no ownership of the remedial plan. For that type of situation, all you can do is remind yourself inwardly that you have done all you can, and that if you've acted professionally and supportively, you are blameless in how they have reacted. And debrief, if you need to, with someone who is not too close to the situation.
Any final words of wisdom?
Be prepared to admit it and apologise to a
student if you get something wrong!
Interview by Jodie Atkinson
Social media: opportunities for assessment and evaluation in the Web 2.0 age
by Christopher Etherton-Beer

Welcome to the Web 2.0 age

In this digital age, we are all familiar with the web. But for a second, think about how far things have come in a relatively short time.

Web 2.0 describes the new Internet technologies that now allow users to actively create content, share material and interact.[1] This is in contrast to the earlier, largely static (Web 1.0) Internet sites with relatively little user content. Web 2.0 now offers educators the opportunity to design and build innovative custom applications to support teaching, learning and assessment in specific disciplines.[2,3]

The rise of social media has been one of the greatest results of the popularity of Web 2.0 technologies. Few people, from students to professors, haven't at least heard of Facebook and Twitter. Popular social media sites now have an astonishing number of users (Box 1).

Box 1: Popular social media sites
Are you familiar with these social media?
- Blogger (web-based journaling [blogging] site)[1]
- Facebook (social networking site with 1 billion users)[4]
- PebblePad (e-portfolio application)
- Twitter (micro-blogging site for sharing of ideas and conversation, with 200 million users)[5]
- Wikipedia (encyclopedia wiki [website created, edited, and developed by a community of individuals working on collaborative projects])[1,4]
- YouTube (video sharing site with 800 million users)[4]

Among the enormous numbers of users, many are avid users, checking their Facebook or Twitter accounts daily.[5,6] Social media are particularly popular among the millennial generation (Generation Y), born from the early 80s onwards. These young folk have grown up in a digital age and are comfortable using social media in both their private and academic lives.[1,6]
In line with these trends, the use of social media is very high among health professionals and student populations,[7] and it's now recommended that medical educators be familiar with social media technologies.[5] Despite this enthusiasm for social media use, many people continue to be concerned about the boundaries, oversight and privacy of data shared via social media. In this article we will consider some of the opportunities, and potential pitfalls, associated with the use of social media in health professional education (Box 2).
Potential benefits of social media in education and assessment

It is not just access to social media sites themselves which has increased at a staggering rate; it now seems that portable electronic devices (including smartphones, tablets and laptop computers with the capability of wirelessly accessing the Internet) are ubiquitous among Australian university class attendees. Thus social media can be used without special organization by faculty. Compare this to more traditional technologies designed to facilitate learner participation (such as audience clickers, which require specially enabled classrooms and are limited to use in one physical site). Combined with portable devices enabled with Internet access, Web 2.0 technologies offer remarkable flexibility to teachers and learners alike.

Instead of diminishing attendance at traditional teaching activities (such as lectures), social media can actually enhance participation. Social media can also be harnessed to encourage participation by learners who may be intimidated by participating in traditional class settings, but may be more comfortable initially building their confidence by participating electronically.[1] Electronic participation can also be anonymous, when this is the learner's preference and it is appropriate in the context of a particular use of social media. For example, students may contribute to a Twitter feed during a group lecture, or respond to a poll of learners via Twitter,[5] under a non-identifying username.
There is now a growing experience, and evaluation, of the use of social media in health professional education. The available data have been reviewed[4,8] and may surprise some educators. Certainly it seems that judicious and carefully considered use of social media can facilitate collaboration and learning, with positive feedback from learners.[4,9] Social media present a flexible tool (Box 3) that can be used to engage learners, facilitate collaboration and provide feedback.[8]

Box 2: Key points
- Web 2.0 technologies and social media offer exciting opportunities to promote learner participation, engagement, and effective assessment
- Use of social media can enhance traditional pedagogical approaches
- Teachers and facilitators have opportunities to model, promote and require e-professionalism among learners
Use of social media tools can be associated with improvement in the knowledge, skills and attitudes of learners.[8]

Box 3: Applications of social media in health professional education
Have you considered using social media to:
- establish a virtual community of practice
- increase participation in lectures (try displaying a live Twitter feed!)[1,5]
- provide real-time questions, or feedback, linked to your class blog[5,10]
- informally poll learners[5]
- create a shared class blog[9]
- provide a discussion forum[4]
- create, discuss, and edit shared student articles on class topics in a wiki[1]
- promote communication between tutorial group members outside of traditional classes[10]
- create e-portfolios of student work, allowing them to share some or all of their portfolio
- create custom Web 2.0 applications to support your teaching, learning and assessment[2]
Utilizing social media for student assessment

There are many opportunities to support assessment and evaluation in health professional education using social media. Provision of feedback is a frequent and important use of social media in health professional education.[8] Similarly, the provision of opportunities for formative self-assessment was one of the most highly rated aspects of a medical education Facebook page in anatomy.[11] Twitter can also be used to provide formative feedback.[5] Web 2.0 technology can be used to offer authentic assessments (i.e. assessments that reflect activities the learner will need to undertake in their actual practice). For example, a custom web-based application has been designed that facilitates the assessment of nursing students in searching and critically reviewing the literature.[2]

In addition to the provision of feedback and assessment activities, the extent and quality of electronic participation by students in group work or discussion can be rated by tutors or peers (mirroring the assessment of more traditional face-to-face group work). Learners can also be prompted to self-reflect on their on-line contributions. Similarly, electronic material authored by students can be assessed by tutor, peer or self-rating. On-line environments offer the potential for combinations of self, tutor and peer rating, as material is easily shared. Provision of feedback in an on-line environment can provide students with experience in constructing and delivering constructive feedback, which in turn can be assessed formatively and summatively.
Evaluating the use of social media in your course

The social media aspects of your teaching can be assessed in tandem with other feedback collected from learners. In addition, teachers can consider using specific resources provided by some social media sites. For example, Facebook provides the Facebook Insights function that can automatically provide page metrics (such as numbers of visitors and de-identified demographic data).[11] Google Analytics offers a free service to analyse Internet site statistics (such as the number of visits to a course Web 2.0 site). Depending on the learner cohort and particular social media application, it may also be appropriate to measure the learners' attitudes towards the use of social media,[12] and there is much potential for educational research in this field!
Challenges of using social media

Common challenges when using social media are technical problems and ensuring consistent user participation.[8] Learners may find the array of social media options bewildering, or find the presentation of commercial advertising distracting. For these reasons, teachers may focus on using one or two social media tools in each class. The use of social media can present learners and teachers alike with practical, ethical and legal challenges[6,13] relating to privacy and the blurring of private and professional lives. Some of the risks associated with social media use (such as breaches of patient confidentiality or the patient-doctor relationship) have now been well publicized.[7] These challenges have led many professional groups and educational institutions to consider policies regarding the use of social media. E-professionalism is defined as "the attitudes and behaviors that reflect traditional professionalism paradigms but are manifested through digital media".[6]
References
1. McAndrew M, Johnston AE. The role of social media in dental education. J Dent Educ. 2012;76(11):1474-81.
2. Eales-Reynolds L-J, Gillham D, Grech C, Clarke C, Cornell J. A study of the development of critical thinking
skills using an innovative web 2.0 tool. Nurse Education Today. 2012;32(7):752-756.
3. Watson N, Massarotto A, Caputo L, Flicker L, Beer C. e-ageing: Development and evaluation of a flexible
online geriatric medicine educational resource for diverse learners. Australasian Journal on Ageing. 2012;
Forthcoming (online ahead of print DOI: 10.1111/j.1741-6612.2012.00622.x).
4. Hamm MP, Chisholm A, Shulhan J, Milne A, Scott SD, Klassen TP, et al. Social Media Use by Health Care
Professionals and Trainees: A Scoping Review. Acad Med. 2013;88(9):1376-1383.
5. Forgie SE, Duff JP, Ross S. Twelve tips for using Twitter as a learning tool in medical education. Med Teach. 2013;35(1):8-14.
6. Kaczmarczyk JM, Chuang A, Dugoff L, Abbott JF, Cullimore AJ, Dalrymple J, et al. e-Professionalism: a new
frontier in medical education. Teach Learn Med. 2013;25(2):165-70.
7. George DR, Green MJ. Beyond Good and Evil: Exploring Medical Trainee Use of Social Media. Teaching and
Learning in Medicine. 2012;24(2):155-157.
8. Cheston CC, Flickinger TE, Chisolm MS. Social media use in medical education: a systematic review. Acad
Med. 2013;88(6):893-901.
9. George DR, Dellasega C. Social media in medical education: two innovative pilot studies. Med Educ.
2011;45(11):1158-9.
10. Wells KM. Social media in medical school education. Surgery. 2011;150(1):2-4.
11. Jaffar AA. Exploring the use of a Facebook page in anatomy education. Anat Sci Educ. 2013; DOI 10.1002/ase.1404.
12. Wang AT, Sandhu NP, Wittich CM, Mandrekar JN, Beckman TJ. Using social media to improve continuing
medical education: a survey of course participants. Mayo Clin Proc. 2012;87(12):1162-70.
13. Cain J, Fink JL. Legal and ethical issues regarding social media and pharmacy education. Am J Pharm Educ.
2010;74(10):184.
A faculty can take many practical steps to limit risks, and maintain e-professionalism, associated with the use of social media in their courses, such as:
- consider setting up separate pages, groups or accounts for each class
- advise students to select maximal security and privacy settings
- have separate profiles, accounts or pages for professional and private interaction
The ASSESSMENT ZONE Crossword

ACROSS
6 Way of matching learning outcomes and assessments
8 Systematic way of checking that learning objectives are achieved by learning activities
9 A number of stations to assess different clinical skills
10 An easy-to-mark, standardised type of assessment
11 Means of ensuring that learning has been achieved
12 Type of assessment that often determines progression to next stage

DOWN
1 Life-like learning or assessment
2 An assessment measures what it intends to measure
3 Collection of work by learner
4 Sufficient knowledge and skill to work in particular clinical scenarios
5 Results of an assessment are reproducible
7 Type of assessment primarily to provide feedback to learner

Solution on page 48
Sandy Presland reviews the strengths and limitations of using portfolios
to assess competency and learning of nursing professionals
The use of portfolios within the nursing profession for the purpose of assessment and professional development has become widely accepted.[1] Portfolios are considered a portable and visible record of professional contribution and credentials.[2] Portfolios consist of an organised collection of evidence that the nursing professional can use to showcase details of professional education, practice experience, formative and summative assessments, and achievements.
Overseeing a post anaesthetic care unit within the South Metropolitan Health Service, my
role focuses on ensuring that nursing staff are effectively assessed as competent to deliver
safe patient care and to encourage them to uphold their professional responsibility to maintain
their competency, knowledge and skills relevant to current practice. The present assessment
structure within the unit I supervise involves nursing staff completing competency skills
checklists, which are entered into a database. An annual performance review ensures the nurse
is meeting professional development requirements. With the current reconfiguration within the
South Metropolitan Health Service, and the opening of a new hospital in 2014, it seems timely
for a review of current assessment practice. Portfolios may be well worth exploring for future
implementation to enhance learning and promote professional development.
According to the Australian Nursing and Midwifery Council, competency is defined as "a combination of skills, knowledge, attitudes, values and abilities that underpin effective performance".[3] Current competence assessment methods have been reported as only measuring a nurse's competence levels when they focus on skill, not knowledge.[4] However, it has been reported that portfolio assessment allows a nurse to reflect on both academic and clinical practice, thereby applying theory to practice.[4] Portfolios offer a collection of evidence that demonstrates a nurse's skills, knowledge, attitudes and achievements, reflecting the nurse's ongoing professional development.[5] They offer an opportunity for individuals to document evidence of learning outcomes and areas of development, and to reflect upon areas that require further development.[1] Portfolios are also a mandatory requirement for nurses registered with the Nursing and Midwifery Board of Australia, to keep a record that demonstrates professional competence.[3,6]
Making sense of
portfolios
Can portfolios effectively assess competency and learning of the qualified nursing professional?

Current literature supporting the effectiveness of portfolios for assessing competence is limited. In a Queensland Nursing Council review, a number of articles discussing the use of portfolios for continuing competence were identified, but no articles were found describing research that supports the effectiveness of portfolios in assessing competence.[7] Furthermore, the report found that there is limited evidence that portfolios can measure competence with reliability and validity.[7] The nature of data collection in portfolios means the concepts of validity and reliability cannot be applied without objective criteria for grading.[8] Such grading criteria would include the use of multiple sources of evidence for each competency showcased, and a system utilising several assessors or external examiners to ensure consistency between assessors.
Webb et al (2003) argue that due to the holistic nature of portfolio assessment, evidence can be "descriptive and judgement-based rather than quantifiable",[8(p600)] making the testing of validity and reliability difficult. They suggest that professional judgement is often involved in assessing competence, and it is precisely this qualitative evidence that underlies the use of portfolios.[8] Webb et al (2003) conclude that nursing educators could separate reflective aspects recorded in the portfolio from the assessment of clinical skills, and this may be sufficient to verify quality and standards of practice.[8]

Whether or not portfolios are an effective measurement of nursing competence appears to be inconclusive in the nursing literature.[4,7] However, portfolios can be structured to include sufficient evidence from a variety of sources, including reflective accounts and quantifiable assessments such as assessor observations and skills checklists, thereby providing evidence of professional development and learning.[4]
What are the strengths and limitations of portfolios?

The strengths of portfolios lie in their ability to promote self-reflection, autonomy, and accountability in learning.[7] Portfolios can record a range of competencies and personal attributes.[10] They can also promote deep learning by facilitating personal management of knowledge, documenting professional growth and development, functioning as a goal-setting tool, and assisting with linking learning to practice.[10]

Portfolios embrace adult learning by engaging the nursing professional in self-directed learning. The onus is on the nurse to collate evidence of their learning outcomes and personal and professional development, and to reflect on areas of further learning that may need to be developed.[1] It is reflective thinking that enables the learner to move from surface learning to deeper understanding, and this is perhaps where the value of portfolios lies.
A range of limitations creating a barrier to the implementation of portfolio use has been identified in the nursing literature. As discussed, there is limited evidence that portfolios can measure competence with reliability and validity. Lack of clarity about expectations and outcomes can lead to confusion and inconsistency in the interpretation of requirements.[7] The standard of portfolio output can also vary, as it is driven by each individual's motivation and experience, and the self-directed approach does not suit all learning styles.[1,7] Furthermore, collecting evidence for portfolios can be time consuming, and this may put an extra burden on the time-deficient nursing professional. Portfolios can also be time consuming to assess due to the quantity of material, and difficult to assess due to the subjectivity of the material.[11]

Taking these limitations into consideration, it is evident that portfolios may be resisted by work-pressured nurses. To assist in reducing the burden on nurses and increasing feasibility of use, portfolio guidelines would need to stress quality, not quantity, of evidence. Provision of templates could assist with making explicit the type of evidence required and setting the limits of content.[11] Adequate support and feedback would need to be provided to both nurse and assessor if the portfolio system is to be effective.
Summary

It is clear that portfolios can contribute to the professional development of the qualified nurse by enhancing deep learning through reflection, showcasing achievements, and promoting lifelong learning by engaging the individual to keep up with change. Evidence on whether portfolios can measure competency is inconclusive; however, I believe the value of the portfolio lies in documenting assessment as a process, rather than measuring it as an end product. Portfolios incorporating both evidence and reflection aligned to competency standards may be worth embracing in the future as a way of promoting improved nursing practice and meaningful learning, but their process may require considerable investment of time and effort by all parties involved.
References
1. Endacott R, Gray MA, Jasper MA, McMullan M, Miller C, Scholes J, et al. Using portfolios in the assessment
of learning and competence: the impact of four models. Nurse Education in Practice [Internet]. 2004 [cited 2013
Sept 2];4(4):250-257. Available from: ScienceDirect.
2. Meister L, Heath J, Andrews J, Tingen MS. Professional nursing portfolios: A global perspective. Medsurg
Nurs [Internet]. 2002 [cited 2013 Sept 2];11(4):177-82. Available from: ProQuest Health & Medical Complete;
ProQuest Nursing & Allied Health Source.
3. Australian Nursing Midwifery Council Continuing Competence Framework 2009. Available from: http://equals.net.au/pdf/73727_Continuing_Competence_Framework.pdf.
4. McCready T. Portfolios and the assessment of competence in nursing: A literature review. Int J Nurs Stud
[Internet]. 2007 [cited 2013 Sept 5];44(1):143-151. Available from: ScienceDirect.
5. Hughes E. Nurses' perceptions of continuing professional development. Nurs Stand [Internet]. Jul 6-12, 2005
[cited 2013 Sept 2];19(43):41-9. Available from: ProQuest Nursing & Allied Health Source.
6. Timmins F, Dunne PJ. An exploration of the current use and benefit of nursing student portfolios. Nurse Educ
Today [Internet]. 2009 [cited 2013 Sept 10];29(3):330-341. Available from: ScienceDirect.
7. Evans A. Competency Assessment in Nursing: A summary of literature published since 2000 National
Education Framework Cancer Nursing Education Project (EdCaN), Cancer Australia [serial on the Internet]. 2008:
Available from: www.edcan.org/pdf/EdCancompetenciesliteraturereviewFINAL.pdf.
8. Webb C, Endacott R, Gray MA, Jasper MA, McMullan M, Scholes J. Evaluating portfolio assessment
systems: what are the appropriate criteria? Nurse Educ Today [Internet]. 2003 [cited 2013 Sept 2];23(8):600-609.
Available from: ScienceDirect.
9. Anderson D, Gardner G, Ramsbotham J, Tones M. E-portfolios: developing nurse practitioner competence
and capability. Aust J Adv Nurs [Internet]. 2009 [cited 2013 Sept 2];26(4):70-76. Available from: Informit.
10. Redfern S, Norman I, Calman L, Watson R, Murrells T. Assessing competence to practise in nursing: a
review of the literature. Research Papers in Education [Internet]. 2002 [cited 2013 Sept 4];17(1):51-
77. Available from: Routledge.
11. Driessen E, Van Tartwijk J, Van Der Vleuten C, Wass V. Portfolios in medical education: why do they meet
with mixed success? A systematic review. Med Educ [Internet]. 2007 [cited 2013 Sept 4];41(12):1224-1233.
Available from: Wiley.
photo: Jodie Atkinson
A S S E S S M E N T Z O N E
3 6
Assessing physicians of tomorrow
Dr Claire Harma, Advanced
Trainee in Gastroenterology with
the Royal Australasian College of
Physicians, shares her thoughts
on the assessment of physician
trainees.
Dr Harma (right) assisted by endoscopy
registered nurse, Belinda Brohman
We are writing a magazine on assessment
and evaluation in health professional
education. What would be the one most
important piece of advice you'd like to
share with our readers?
You need to make sure that it is clear to
trainees at the start of a program what the
learning outcomes are and what is going to be
assessed. When providing feedback it needs
to be FIT & ABLE (Frequent, Interactive,
Timely, Appropriate for learner level,
Behaviour specific and balanced, Labelled and
Empathetic),1 which is a message that those
interested in education need to pass on to
others acting as supervisors and mentors so
that trainees get more learning out of their
continuous assessment.
Is there something you wish you'd learnt
about assessment and evaluation earlier in
your career?
Something I wish I had a bit more insight
into earlier in my career is that all too often
the trainee who is performing well in the
workplace is never given detailed feedback,
and often these are the individuals who
actually want to find out the areas they can
improve in, rather than just hearing "yeah,
good job". More specific feedback and
extension advice for how someone could
improve is warranted, even for those people
who are performing well.
Are there any resources that have changed
your approach?
The Teaching On The Run program has had
quite a big impact on how I provide feedback
and how I structure assessment, and the
simple and very practical approaches taught
in this program have been very helpful during
the course of this year working as a medical
education registrar. Also, in setting up a mock
written exam for basic physician trainees this
year, we've been using an online question bank
tool in Moodle, which has been very helpful
in collating questions and delivering the exam
online and performing statistical analyses.
Tell us about the structure of assessment
used in physician training at the moment.
I think the best way of explaining this is that
physician training is done in two components.
We have basic physician training (BPT) and
advanced training (AT). BPT typically takes
three years and during that time, basic
physician trainees are expected to have
gained skills and knowledge in a broad
spectrum of physician areas from general
medicine through to sub-specialty areas, and
then there is a barrier exam to get through to
advanced training.
The exam is performed in two parts. There is
a written exam, which has two papers looking
at basic sciences and clinical applications.
Then everyone is ranked on a bell curve, and
if you are in the top two-thirds of candidates
sitting the exam, you are then eligible to sit
the clinical exam.
The clinical exam tries to simulate real patient
interactions to try and get a better idea of
how capable and competent the individual is
in their day to day practice. The clinical exam
includes long cases, where the candidate has
an hour with the patient, and has to interview
and examine the patient and formulate
their major medical problems; as well as
short cases where a clinical examination is
performed in seven minutes while examiners
watch and a diagnosis and management plan
has to be deduced from that. So they are the
big exams, the big barriers that we have in
physician training.
Throughout BPT and AT, there are also a
number of formative assessments performed,
and these include Mini-CEXs, case-based
discussions and learning needs analyses. In
the three years of advanced training, trainees
move into a subspecialty area, and each
subspecialty has a different assessment
structure. Some of the subspecialties
have formalised, summative examinations
throughout their advanced training, but the
vast majority have no barrier assessment
to becoming a consultant. But they do
have a series of mid-term and end-of-term
assessments and formative feedback
sessions.
Do you think this is the best method for
determining whether a trainee is ready to
become a consultant physician? And does
it assess all different facets of practice for
a consultant physician, aside from just
clinical practice?
Physician training is moving away from the
apprenticeship model toward a competency-
based model. Competency was previously
thought to be gained by a certain amount of
time in a particular area, which is what the
apprenticeship model works on, but in this
model you may only be exposed to a certain
range of conditions depending on where you
are training and so you may miss out on the
broader realms of things that fall within your
specialty; whereas the competency-based
model requires the trainee to prove their
competence before going on to become a
consultant. I think by having the barrier exam
between BPT and AT it determines a level of
competence that seems to be fairly rigorous,
but in moving on to become a consultant
physician there is the potential for trainees
working in this apprenticeship model to not
have the degree of exposure they need
to become a competent consultant. And
there is some talk of going on to having exit
examinations for physician training.
The expectation of the role of the consultant
has changed a lot in the last 10 years;
being a good clinician isn't necessarily
enough to secure employment, and as
such, consultants are now expected to
have experience in research, education and
management. I think these are three key
areas that at the moment aren't adequately
assessed. Although in most specialty areas
trainees are expected to perform research
projects over the course of their training,
there isn't necessarily any assessment
associated with that or a quality standard
upheld. Education and management are areas
where there isn't any training or assessment
at the moment.
Can you tell me a bit about the different
modalities of assessment currently used
in gastroenterology training and whether
you think that is adequate to meet the
assessment needs of the trainees?
Work-based assessment is essentially what
the RACP is trying to use, particularly in
advanced training, but I think there are a
number of limitations with this. Continuous
assessment is utilised in the mid and end
of term feedback that the trainee receives.
Depending on the interaction between the
supervisor and trainee, the quality of that
feedback may be different depending on the
supervisor and the frequency of interaction
that those two individuals have. Most
trainees will try and nominate a supervisor
that they work closely with to give them the
best possible feedback, but within some
subspecialties that is not always possible
and there may be a supervisor who doesn't
have close contact with the trainee day
to day, and so the reliability and validity of
that assessment may be diminished. In
procedural specialities, performance-based
assessment is more based on volume rather
than competence. Within my specialty,
gastroenterology, it's based on the number
of procedures including gastroscopies and
colonoscopies. The training program is a
three year advanced training program, and
at the end of that you are expected to have
performed 100 colonoscopies unassisted.
Realistically, most will have performed that
many colonoscopies within the first 6-12
months of training and if over the course
of the three years you've only just made
that level, I am not convinced you would
actually be capable of going out to practice
independently. The chances of the rarer
complications cropping up in that time are
limited, so you may not feel confident in
managing those complications, which would
be expected of a consultant.
One possibility would be to have a supervisor
who has monitored you closely over a
prolonged period of time who determines
your competence, and that person needs to
have undergone standardised accreditation
by the college. However, I think, ideally,
there needs to be a volume component to
it as well, but I think the current volume
requirement is probably too low.
You mentioned before the possibility of
exit exams being introduced as a barrier to
completing advanced training. What sort
of format might that take? And what are
your views on the issue?
I think with the increasing number of
trainees that are coming through, the ratio
of supervisors to trainees has the potential
to become more diluted, and as a result
that training experience may vary between
trainees, and between training locations.
With increased numbers of trainees, I think
that there is merit in an exit exam. You would
want to assess the broadest range of the
aspects of the job you possibly could. I think
there needs to be a knowledge component,
which would likely be in the form of a written
examination. I think there needs to be a
clinical component, so potentially a patient
interaction, OSCE-style component. And
I think within procedural specialties, there
may need to be a procedural simulation
component as well to demonstrate procedural
competence.
I can foresee some problems with introducing
an exit exam. It will be costly for the college,
and ultimately for the trainees as the cost will
be passed on to them in increased training
and exam fees. It will be very time consuming
for both organisers and exam candidates to
prepare for the exam. It will still not assess
things like interprofessional communication,
unless there are specific OSCE stations on
these topics. And I think the biggest challenge
will be what to do with the trainees who
don't pass. Will they repeat another year of
training, or will they ultimately be asked to
leave the program? It would all add to the
increased number of trainees who need to be
accommodated in the system. I think all these
issues need to be worked out before this
could be implemented.
1. Beckman TJ, Lee MC. Proposal for a collaborative
approach to clinical teaching. Mayo Clin Proc. 2009
Apr;84(4):339-44.
Interview by Dr Elisabeth Feher
For more information about the Teaching on
the Run program, go to tellcentre.org
OSCE Fast Facts
OSCE stands for Objective Structured
Clinical Examination and assesses mainly
cognitive and psychomotor domains of
learning. OSCEs focus on the student's ability
to synthesise and apply their knowledge into
practical tasks.
OSCEs were first introduced into medical
education in 1975 by Ronald Harden in
Scotland at the University of Dundee. They
were utilised as an alternative to traditional
clinical examinations which had significant
limitations due to subjectivity, inconsistency
and poor structure.
OSCEs examine clinical competence and
aim to assess a students communication
skills, ability to think critically and problem
solve. OSCEs are structured performance
tests that are standardised, objective and
if designed well, can have high levels of
reliability and validity.3,4
Does your OSCE make the cut?
by Kate Gray
OSCEs are used across a variety of health
disciplines including nursing, medicine and
allied health extensively in Australia, North
America and the United Kingdom.
OSCEs are designed as a series of
examination stations that the students rotate
around. Each station is designed to assess
one competency by using a pre-determined
guideline or checklist.
The students are assessed on not only the
results they obtain from their performance of
the practical skill but also how they go about
obtaining these results.
Very clear instructions are provided to
minimise ambiguity for the student and
standardise the responses and prompts
that the examiner and simulated patient
can provide.
Assessing the quality of OSCEs
Reliability
OSCEs have been demonstrated to have
much higher levels of reliability than other
forms of practical assessment and have
hence become the mainstay of assessing
practical clinical competence in medical
education.2
Moderate to good inter-rater reliability has
been demonstrated when multiple assessors
score one station independently of one
another. Caution should be used when there
is only a single assessor for each station.5
A correlation of 0.6 or more has been
suggested as a good level of agreement
between raters.5
The ability of an OSCE score on one station
to predict performance on another station
has been found to be low in some studies.
The number and diversity of the stations
affect the internal consistency of the OSCE.
Increasing the similarity of the skills being
tested can improve internal consistency,
while reducing the number of stations and
inadequate sampling threaten the reliability.
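The article does not name a statistic for internal consistency, but Cronbach's alpha over station scores is a common choice; the sketch below uses invented scores for five candidates across three stations.

```python
from statistics import pvariance

# Invented scores: rows are candidates, columns are OSCE stations.
scores = [
    [6, 5, 7],
    [7, 7, 8],
    [5, 4, 6],
    [8, 8, 9],
    [6, 6, 7],
]

k = len(scores[0])  # number of stations
station_variances = [pvariance(station) for station in zip(*scores)]
total_variance = pvariance([sum(candidate) for candidate in scores])

# Cronbach's alpha: k/(k-1) * (1 - sum of station variances / total variance)
alpha = k / (k - 1) * (1 - sum(station_variances) / total_variance)
print(f"Cronbach's alpha: {alpha:.2f}")  # ~0.98 for this toy data
```

Adding more, diverse stations raises k and broadens sampling, which is one reason inadequate sampling of stations threatens reliability.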
Test-retest reliability is the consistency with
which a student receives the same score in
an OSCE if they repeat it several times over.
Using simulated patients who act in a
standardised manner improves the
consistency of OSCEs, as it is easier and
more feasible to train subjects to respond in
a particular way multiple times than to train
a real patient.3
Standardisation
A marking checklist is provided to all
examiners with appropriate training to
ensure they score each component of the
task reliably against a set of pre-determined
criteria. Aspects of the student's performance
with higher importance, such as safety,
can determine an automatic fail grade
regardless of the other aspects of the
performance.6
The use of global rating scales
(GRS) can allow the examiner to make a
judgement about student performance that
is based on aspects of the performance that
are difficult to quantify with a checklist, such
as interpersonal skills.2 Combining the GRS
and checklist gives a more holistic and reliable
assessment of performance.6
Passing a borderline student also needs
to be standardised to ensure the OSCE
is fair and reliable. Standards need to be
set to determine if a borderline student in
one or more stations will pass the overall
examination. Research has demonstrated
that in borderline students, all station scores
should be combined to determine their overall
clinical competence. The standard pass
score for each OSCE can be determined by
the expert opinion of the examiners or experts
in the field, combined with all of the students'
actual performance scores.6
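The text does not specify the standard-setting procedure, but one widely used method fitting this description is the borderline group method: the station's pass mark is set at the mean checklist score of the candidates whose global rating was "borderline". A sketch with hypothetical data:

```python
from statistics import mean

# Hypothetical station results: (checklist score out of 20,
# examiner's global rating).
results = [
    (16, "pass"), (11, "borderline"), (18, "pass"),
    (9, "fail"),  (12, "borderline"), (10, "borderline"),
    (15, "pass"), (8, "fail"),
]

# Borderline group method: cut score = mean checklist score of the
# candidates the examiners rated as borderline.
borderline = [score for score, rating in results if rating == "borderline"]
cut_score = mean(borderline)
print(f"station pass mark: {cut_score:.1f}")  # 11.0
```

Candidates scoring at or above the cut pass the station; overall decisions then combine the station-level results as described above.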
Validity
Content validity can be high in OSCE exams
when the outcomes being assessed are
reviewed by experts in the field.3 OSCEs
assess a wide variety of clinical skills by using
multiple assessment stations using a range of
well trained examiners to consistently assess
the students.3 The marking criteria should
only entail aspects of performance that
are directly related to the skill that is being
assessed.6
The ability of an OSCE score to correlate
to other examination scores on the same
topics, or the OSCE score correlation to a final
course score has been demonstrated as being
good, especially when checklists and GRS
are utilised.6,7 Statistically significant criterion
validity has also been demonstrated in OSCE
exams by using correlation statistical studies
to compare students' OSCE results with
their written examination, oral examination
and clinical evaluation scores.3 This suggests
that OSCE scores correlate well to other
examinations taken at the same time.
The predictive validity of OSCEs has been
established when comparing students results
in an OSCE to their actual clinical performance
and performance in other examinations such
as written or case based interdisciplinary
assessment.8 Students who demonstrate
strong analytic and problem solving skills,
whilst also showing good communication
and practical skills in OSCEs will most likely
perform well in a real life clinical scenario.
OSCEs can effectively identify students who
may struggle with clinical experiences and
also highlight significant concerns around a
student's professionalism and ability to cope
under stress.2
Educational
OSCEs can be used not only in summative
assessment but also in formative assessment,
as they provide feedback to students about
their areas for improvement and give them
an opportunity to practise clinical skills and
prepare for final examinations.4
OSCEs are generally well received by
students, who perceive OSCEs to be a
realistic reflection of real life scenarios they
encounter in clinical situations. It has been
reported that students feel OSCEs can be
fair and useful assessment methods. It must
be noted, however, that students may report
high stress and anxiety levels with this type
of examination.2,3
Practicality
The cost of OSCEs has been found to be
within acceptable limits when compared to
other forms of practical assessment. However,
OSCEs are initially costly to set up, not only
in the amount of examiner time and facilities
but also in examiner and simulated patient
training, with some reported as requiring up
to four hours of training per actor.2 These
costs are generally reduced when the same
subjects are used in multiple examination
periods.
"…a success…" "…skills for life…" "…will impact on your practice…" "…it is so practical… like the job."
First- and second-year nursing students' reflections on an OSCE assessment.