
kirkpatrick's learning and training evaluation theory

Donald L Kirkpatrick's training evaluation model - the four levels of learning evaluation
Donald L Kirkpatrick, Professor Emeritus, University of Wisconsin (where he
achieved his BBA, MBA and PhD), first published his ideas in 1959, in a series
of articles in the Journal of the American Society of Training Directors. The
articles were subsequently included in Kirkpatrick's book Evaluating Training
Programs (originally published in 1994; now in its 3rd edition - Berrett-
Koehler Publishers).
Donald Kirkpatrick was president of the American Society for Training and
Development (ASTD) in 1975. Kirkpatrick has written several other significant
books about training and evaluation, more recently with his similarly inclined
son James, and has consulted with some of the world's largest corporations.
Donald Kirkpatrick's 1994 book Evaluating Training Programs restated and
expanded his original 1959 ideas, thereby further increasing awareness of
them, so that his theory has become arguably the most widely used and
popular model for the evaluation of training and learning. Kirkpatrick's four-
level model is now considered an industry standard across the HR and
training communities.
More recently Don Kirkpatrick formed his own company, Kirkpatrick Partners,
whose website provides information about their services and methods, etc.

kirkpatrick's four levels of evaluation model


The four levels of Kirkpatrick's evaluation model essentially measure:

 reaction of student - what they thought and felt about the training
 learning - the resulting increase in knowledge or capability
 behaviour - extent of behaviour and capability improvement and implementation/application
 results - the effects on the business or environment resulting from the trainee's performance

All these measures are recommended for full and meaningful evaluation of
learning in organizations, although their application broadly increases in
complexity, and usually cost, from level 1 through level 4.
Quick Training Evaluation and Feedback Form, based on Kirkpatrick's Learning Evaluation Model (Excel file)

kirkpatrick's four levels of training evaluation


This grid illustrates the basic Kirkpatrick structure at a glance. The second
grid, beneath this one, is the same thing with more detail.

level 1 - reaction

evaluation description and characteristics: Reaction evaluation is how the delegates felt about the training or learning experience.

examples of evaluation tools and methods: 'Happy sheets', feedback forms; verbal reaction; post-training surveys or questionnaires.

relevance and practicability: Quick and very easy to obtain; not expensive to gather or to analyse.

level 2 - learning

evaluation description and characteristics: Learning evaluation is the measurement of the increase in knowledge - before and after.

examples of evaluation tools and methods: Typically assessments or tests before and after the training; interview or observation can also be used.

relevance and practicability: Relatively simple to set up; clear-cut for quantifiable skills; less easy for complex learning.

level 3 - behaviour

evaluation description and characteristics: Behaviour evaluation is the extent of applied learning back on the job - implementation.

examples of evaluation tools and methods: Observation and interview over time are required to assess change, relevance of change, and sustainability of change.

relevance and practicability: Measurement of behaviour change typically requires the cooperation and skill of line-managers.

level 4 - results

evaluation description and characteristics: Results evaluation is the effect on the business or environment resulting from the trainee's performance.

examples of evaluation tools and methods: Measures are typically already in place via normal management systems and reporting; the challenge is to relate them to the trainee.

relevance and practicability: Individually not difficult, unlike evaluation across the whole organisation; the process must attribute clear accountabilities.
kirkpatrick's four levels of training evaluation in detail

This grid illustrates the Kirkpatrick structure in detail, and particularly the modern-day
interpretation of the Kirkpatrick learning evaluation model, its usage and implications, and
examples of tools and methods. It is the same format as the grid above but with more
detail and explanation:

level 1 - reaction

evaluation description and characteristics:
Reaction evaluation is how the delegates felt, and their personal reactions to the training or learning experience, for example:
 Did the trainees like and enjoy the training?
 Did they consider the training relevant?
 Was it a good use of their time?
 Did they like the venue, the style, timing, domestics, etc?
 Level of participation.
 Ease and comfort of experience.
 Level of effort required to make the most of the learning.
 Perceived practicability and potential for applying the learning.

examples of evaluation tools and methods:
Typically 'happy sheets'. Feedback forms based on subjective personal reaction to the training experience. Verbal reaction, which can be noted and analysed. Post-training surveys or questionnaires. Online evaluation or grading by delegates. Subsequent verbal or written reports given by delegates to managers back at their jobs.

relevance and practicability:
Can be done immediately the training ends. Very easy to obtain reaction feedback, and feedback is not expensive to gather or to analyse for groups. Important to know that people were not upset or disappointed. Important that people give a positive impression when relating their experience to others who might be deciding whether to experience the same.

level 2 - learning

evaluation description and characteristics:
Learning evaluation is the measurement of the increase in knowledge or intellectual capability from before to after the learning experience:
 Did the trainees learn what was intended to be taught?
 Did the trainees experience what was intended for them to experience?
 What is the extent of advancement or change in the trainees after the training, in the direction or area that was intended?

examples of evaluation tools and methods:
Typically assessments or tests before and after the training. Interview or observation can be used before and after, although this is time-consuming and can be inconsistent. Methods of assessment need to be closely related to the aims of the learning. Measurement and analysis are possible and easy on a group scale. Reliable, clear scoring and measurements need to be established, so as to limit the risk of inconsistent assessment. Hard-copy, electronic, online or interview-style assessments are all possible.

relevance and practicability:
Relatively simple to set up, but more investment and thought are required than for reaction evaluation. Highly relevant and clear-cut for certain training, such as quantifiable or technical skills. Less easy for more complex learning, such as attitudinal development, which is famously difficult to assess. Cost escalates if systems are poorly designed, which increases the work required to measure and analyse.

level 3 - behaviour

evaluation description and characteristics:
Behaviour evaluation is the extent to which the trainees applied the learning and changed their behaviour, and this can be measured immediately and several months after the training, depending on the situation:
 Did the trainees put their learning into effect when back on the job?
 Were the relevant skills and knowledge used?
 Was there noticeable and measurable change in the activity and performance of the trainees when back in their roles?
 Was the change in behaviour and new level of knowledge sustained?
 Would the trainee be able to transfer their learning to another person?
 Is the trainee aware of their change in behaviour, knowledge, skill level?

examples of evaluation tools and methods:
Observation and interview over time are required to assess change, relevance of change, and sustainability of change. Arbitrary snapshot assessments are not reliable, because people change in different ways at different times. Assessments need to be subtle and ongoing, and then transferred to a suitable analysis tool. Assessments need to be designed to reduce the subjective judgement of the observer or interviewer, which is a variable factor that can affect the reliability and consistency of measurements. The opinion of the trainee, which is a relevant indicator, is also subjective and unreliable, and so needs to be measured in a consistent, defined way. 360-degree feedback is a useful method and need not be used before training, because respondents can make a judgement as to change after training, and this can be analysed for groups of respondents and trainees. Assessments can be designed around relevant performance scenarios and specific key performance indicators or criteria. Online and electronic assessments are more difficult to incorporate; assessments tend to be more successful when integrated within existing management and coaching protocols. Self-assessment can be useful, using carefully designed criteria and measurements.

relevance and practicability:
Measurement of behaviour change is less easy to quantify and interpret than reaction and learning evaluation, and simple quick-response systems are unlikely to be adequate. The cooperation and skill of observers, typically line-managers, are important factors, and difficult to control. Management and analysis of ongoing subtle assessments are difficult, and virtually impossible without a well-designed system from the beginning. Evaluation of implementation and application is an extremely important assessment - there is little point in a good reaction and a good increase in capability if nothing changes back in the job - therefore evaluation in this area is vital, albeit challenging. Behaviour change evaluation is possible given good support and involvement from line managers or trainees, so it is helpful to involve them from the start, and to identify benefits for them, which links to the level 4 evaluation below.

level 4 - results

evaluation description and characteristics:
Results evaluation is the effect on the business or environment resulting from the improved performance of the trainee - it is the acid test. Measures would typically be business or organisational key performance indicators, such as: volumes, values, percentages, timescales, return on investment, and other quantifiable aspects of organisational performance, for instance numbers of complaints, staff turnover, attrition, failures, wastage, non-compliance, quality ratings, achievement of standards and accreditations, growth, retention, etc.

examples of evaluation tools and methods:
It is possible that many of these measures are already in place via normal management systems and reporting. The challenge is to identify which measures relate to the trainee's input and influence, and how. Therefore it is important to identify and agree accountability and relevance with the trainee at the start of the training, so they understand what is to be measured. This process overlays normal good management practice - it simply needs linking to the training input. Failure to link to the training input, its type and timing, will greatly reduce the ease with which results can be attributed to the training. For senior people particularly, annual appraisals and ongoing agreement of key business objectives are integral to measuring business results derived from training.

relevance and practicability:
Individually, results evaluation is not particularly difficult; across an entire organisation it becomes very much more challenging, not least because of the reliance on line-management, and the frequency and scale of changing structures, responsibilities and roles, which complicates the process of attributing clear accountability. Also, external factors greatly affect organisational and business performance, which clouds the true cause of good or poor results.
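Level 2's before-and-after testing ultimately reduces to a simple gain calculation per trainee, which can then be aggregated across a group. A minimal sketch of that arithmetic (the function names and scores are illustrative, not from the article):

```python
def learning_gains(pre_scores, post_scores):
    """Per-trainee score gain from pre-test to post-test."""
    return [post - pre for pre, post in zip(pre_scores, post_scores)]

def average_gain(pre_scores, post_scores):
    """Mean gain across the group - easy to report on a group scale."""
    gains = learning_gains(pre_scores, post_scores)
    return sum(gains) / len(gains)

# Three trainees scoring 55, 60, 70 before and 75, 80, 85 after:
# individual gains are [20, 20, 15], an average gain of about 18.3 points.
```

As the grid notes, the hard part is not this arithmetic but establishing reliable, clear scoring so that pre- and post-test results are actually comparable.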

Since Kirkpatrick established his original model, other theorists (for example
Jack Phillips), and indeed Kirkpatrick himself, have referred to a possible fifth
level, namely ROI (Return On Investment). In my view ROI can easily be
included in Kirkpatrick's original fourth level 'Results'. The inclusion and
relevance of a fifth level is therefore arguably only relevant if the assessment
of Return On Investment might otherwise be ignored or forgotten when
referring simply to the 'Results' level.
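The ROI calculation commonly associated with Phillips' fifth level expresses net programme benefit as a percentage of programme cost. A minimal sketch, assuming the monetary benefits and costs have already been isolated (which is usually the hard part, as the 'Results' discussion above makes clear):

```python
def training_roi(total_benefits, total_costs):
    """ROI (%) = (programme benefits - programme costs) / programme costs * 100."""
    net_benefit = total_benefits - total_costs
    return (net_benefit / total_costs) * 100

# A programme costing 20,000 that yields 50,000 in measurable benefits
# returns 150.0 - i.e. a 150% return on the training investment.
```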
Learning evaluation is a widely researched area. This is understandable since
the subject is fundamental to the existence and performance of education
around the world, not least universities, which of course contain most of the
researchers and writers.
While Kirkpatrick's model is not the only one of its type, for most industrial
and commercial applications it suffices; indeed most organisations would be
absolutely thrilled if their training and learning evaluation, and thereby their
ongoing people-development, were planned and managed according to
Kirkpatrick's model.
For reference, should you be keen to look at more ideas, there are many to
choose from...
 Jack Phillips' Five Level ROI Model
 Daniel Stufflebeam's CIPP Model (Context, Input, Process, Product)
 Robert Stake's Responsive Evaluation Model
 Robert Stake's Congruence-Contingency Model
 Kaufman's Five Levels of Evaluation
 CIRO (Context, Input, Reaction, Outcome)
 PERT (Program Evaluation and Review Technique)
 Alkin's UCLA Model
 Michael Scriven's Goal-Free Evaluation Approach
 Provus's Discrepancy Model
 Eisner's Connoisseurship Evaluation Models
 Illuminative Evaluation Model
 Portraiture Model
 and also the American Evaluation Association

Also look at Leslie Rae's excellent Training Evaluation and tools available on
this site, which, given Leslie's experience and knowledge, will save you the
job of researching and designing your own tools.
