
MEASURABLE OUTCOMES FROM LEARNING

CONTENTS
Learning with impact
Popular learning evaluation models
How Citrix's training made an impact across 3 continents
Why 90% of training doesn't have impact
ROI is dead
Long live ROI
Is salary-based ROI the way forward?
Happy sheets
How (not) to learn with impact
How to learn with impact
The New World Kirkpatrick Model
Evaluation at Hertfordshire County Council
Top tips
Learning transfer: how do you do it?
Fail to prepare, prepare to fail
From transaction to transformation
The future of evaluation
The evolution of evaluation

Produced by Reed Learning
Designed by Lovingly-and-CO.com

LEARNING WITH IMPACT


Learning is an investment. And like any investment, it's crucial to measure the outcomes. But because learning is often so intangible and individual, that measurement is often pretty tough. It's not like working out how much money you saved by swapping the office over to energy-saving lightbulbs: there are a whole host of factors to consider. Results are at the heart of what we do, and we think it's essential that learning is a journey with a destination, not a one-off event. That's why we've created this book in partnership with some of the leading thinkers in the L&D industry, including Training Journal, The Kite Foundation, the ASTD, Kirkpatrick Partners and many, many more. Inside you'll discover jargon demystified, what best practice looks like and some innovative ideas to guide your own learning programmes. We hope you find it useful. In fact, we hope it has real impact!

There are lots of different evaluation methods out there. Here's a quick guide to some of the most popular.

KIRKPATRICK's Model of Training Evaluation


LEVEL FOUR: RESULTS
To what degree targeted outcomes occur, as a result of learning event(s) and subsequent reinforcement


LEVEL THREE: BEHAVIOUR

To what degree participants apply what they learned during training when they are back on the job

LEVEL TWO: LEARNING

To what degree participants acquire the intended knowledge, skills, and attitudes based on their participation in the learning event

LEVEL ONE: REACTION

To what degree participants react favourably to the learning event
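To make the four levels concrete, here is a minimal sketch of how one learner's evaluation data might be recorded against them. The field names and scales below are illustrative assumptions, not part of Kirkpatrick's model.

```python
from dataclasses import dataclass

# Illustrative record of one learner's data across Kirkpatrick's four levels.
# Field names and scales are assumptions made for this sketch.
@dataclass
class KirkpatrickRecord:
    reaction: int             # Level 1: favourable reaction, e.g. a 1-5 rating
    learning_gain: float      # Level 2: post-test score minus pre-test score
    behaviour_applied: bool   # Level 3: applied on the job (manager observation)
    result_change: float      # Level 4: targeted business outcome, e.g. % change

record = KirkpatrickRecord(reaction=4, learning_gain=0.3,
                           behaviour_applied=True, result_change=0.12)
print(record)
```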

PHILLIPS' Evaluation Model


Based on Kirkpatrick's model, the Phillips model adds a fifth step, ROI, which is calculated using this seven-stage process.

1. COLLECTING PRE-PROGRAMME DATA
2. COLLECTING POST-PROGRAMME DATA
3. ISOLATING THE EFFECTS OF THE PROGRAMME
4. CONVERTING DATA TO MONETARY VALUE
5. TABULATING PROGRAMME COSTS
6. CALCULATING RETURN ON INVESTMENT
7. IDENTIFYING INTANGIBLE BENEFITS

ROI = NET PROGRAMME BENEFITS ÷ PROGRAMME COSTS
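To make the arithmetic concrete, here is a minimal sketch of steps 4-6 with made-up figures. Phillips typically expresses the result as a percentage of programme costs; everything below is illustrative.

```python
# Minimal sketch of the Phillips ROI calculation. Figures are made up.

def roi_percent(net_benefits: float, costs: float) -> float:
    """Return on investment expressed as a percentage of programme costs."""
    return net_benefits / costs * 100

monetary_benefits = 120_000   # step 4: benefits converted to a monetary value
programme_costs = 45_000      # step 5: fully loaded programme costs

net_benefits = monetary_benefits - programme_costs
print(f"Net programme benefits: {net_benefits:,}")                # 75,000
print(f"ROI: {roi_percent(net_benefits, programme_costs):.0f}%")  # step 6: ~167%
```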

BRINKERHOFF's Success Case Method


Brinkerhoff's model focuses on narratives and stories, supported by evidence:
1. Identify the goals of the learning opportunity and connect them to business needs
2. Survey participants to identify best and worst cases
3. Obtain corroborating evidence
4. Analyse the data
5. Communicate findings

CIRO Evaluation
CONTEXT: Identifying training needs and objectives
INPUT: Designing and delivering training
REACTION: Quality of trainee experience
OUTCOME:
a) Immediate: individual changes before returning to work
b) Intermediate: individual transferring changes to work
c) Ultimate: departmental or organisational results

SCRIVEN's Key Evaluation Checklist

PRELIMINARIES


I. Executive summary
II. Preface
III. Methodology

FOUNDATIONS
1. Background and context
2. Descriptions and definitions
3. Consumers
4. Resources
5. Values

SUB-EVALUATIONS
6. Process evaluation
7. Outcome evaluation
8 & 9. Comparative cost-effectiveness
10. Exportability
11. Overall significance

CONCLUSIONS
12. Recommendations and explanations
13. Responsibilities
14. Reporting and follow-up
15. Meta-evaluation

LEADING PROVIDER OF VIRTUALISATION TECHNOLOGIES

Citrix is a leading provider of cloud, networking and virtualisation technologies. Citrix products touch 75 percent of Internet users each day.

LINKING IMPROVED SATISFACTION


As well as measuring learner response, Citrix measured the overall impact of their training strategy in EMEA with two key metrics in their employee engagement survey, one of which was: "Do you feel you have access to sufficient training to improve your skills?"

In 2010 Citrix announced X-5, a goal to grow by more than 50% by 2015. Following their 2010 annual employee engagement survey, they identified that employees needed more training and opportunities to grow in order to meet this growth objective. Citrix created a revitalised L&D strategy to address this. The focus was on creating consistent, scalable and accessible learning that could be tailored to local stakeholders in 19 countries.

TO IMPROVED PERFORMANCE
These improvements correlated with a 20% growth in revenue. 20% growth from 2010-2012 puts Citrix on track to achieve their X-5 goal of 50% growth from 2010-2015.
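A quick back-of-the-envelope check of that "on track" claim, using only the growth figures quoted above (this is arithmetic, not data from Citrix):

```python
# Does 20% growth over 2010-2012 put a company on track for 50% over 2010-2015?
growth_two_years = 0.20
annual_rate = (1 + growth_two_years) ** (1 / 2) - 1        # implied annual rate
projected_five_years = (1 + annual_rate) ** 5 - 1          # same rate held to 2015

print(f"Implied annual growth: {annual_rate:.1%}")                # ~9.5%
print(f"Projected 2010-2015 growth: {projected_five_years:.1%}")  # ~57.7%, above 50%
```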

[Chart: results by year, 2010-2015]

WHY 90% OF TRAINING DOESN'T HAVE IMPACT

It's essential to realise that effective training begins long before a learning event begins!

Richard Griffin

Research suggests that only 10% of training is transferred into sustainable performance improvement!*

Before training starts


- Carry out training needs analysis, but don't forget that training is not the only option
- Ensure employees are learning ready. For example, there's no point sending someone on an eLearning programme if they can't use a computer
- Have clear, measurable training objectives
- Consider how impact will be measured: who will be interested in reviewing the results?

Transfer
- Managers should regularly ask for feedback about the training from employees
- Set training-related goals
- Ensure new learning can be applied quickly and frequently in employees' day-to-day work

Impact
- Evaluate! Not only does evaluation allow the impact of training to be assessed, it also reinforces employee learning
- Look for barriers, such as workload, that may have prevented employees from transferring their learning. Evaluation outcomes are as much about a supportive workplace as the quality of the training

Training
- Ensure it's job relevant
- Make sure that the training programme's design, content and delivery style are appropriate to the audience

*Sugrue, B., & Rivera, R. (2005). ASTD State of the Industry Report.

ROI IS DEAD

There is a well-worn joke about a stranger in a car asking for directions in the country. The local replies: "If I were you, I wouldn't start from here." This seems to be an apt description of our profession's approach to evaluation. Our inherited approach is firmly based in the concept of hierarchical evaluation. We should, we are told, proceed by capturing and analysing information on reaction (did they like the course?); learning (did they learn anything?); behaviour (did they do anything differently as a result?); and return (what were the bottom-line benefits?). In practice, as every survey ever conducted shows, we only collect information at the lowest level (reaction). This produces two effects. The first is that we beat ourselves up for not doing what we ought to do. The other is that we look for increasingly complex ways of isolating the effects of a training intervention. We must recognise that the skill set that drives value for the business has changed and is learned rather than trained.

Two of the essential attributes required in the knowledge-driven economy are technological acumen and influencing skills. We acquire both through trial and error, and peer-group support, rather than formal training. Learning in today's organisations is a diffuse activity, not a single event. Our evaluation and ROI models are products of a time when skill sets and learning methods were quite different. The information we produce using the traditional approach is designed to justify our existence. It should be no surprise that senior managers are not interested. There is still a need for trainers to ask: are we putting our efforts towards the right objectives? However, in today's economy, we need to discover a better way of finding the answers than ROI.

Martyn Sloman is a former learning advisor for the CIPD and a Teaching Fellow in the Department of Management & Organisational Psychology at Birkbeck College.

LONG LIVE ROI

The key to this debate is that we have long confused evaluation and measurement. Like Martyn, I believe that L&D departments should not just produce figures to measure things for the sake of it. But I do think that figures have their place as a tool to critically evaluate the impact of learning. ROI was introduced on the basis that it put a hard financial figure on the value of training, isolating its effects and justifying the training budget. The problem is, as Martyn points out, that nobody really cares about the ROI of a specific training intervention. What matters is the impact against defined objectives. The fault isn't with calculating the return on investment. It's that we don't define the costs, problems and contexts clearly enough to make that calculation worthwhile. All learning needs a clear purpose. Why should something be learned, and what benefit will that learning have? Once this has been established, the content,

intended results and required resources can be defined. Martyn is right: our current evaluation activities attempt to give us information no one really needs (or wants) to know. But that doesn't mean we should throw out ROI altogether. We just need to measure the things that matter. Has the whole learning programme changed how people work? What exactly has it changed? If so, how much are those changes saving us? That's where ROI becomes a useful tool, not just a number to be crunched after every intervention.

Neville Pritchard is a leading people development specialist and thinker with over 25 years of experience. He's a Fellow of the CIPD and former Learning Director for Barclays Bank.

IS SALARY-BASED ROI THE WAY FORWARD?
Whose responsibility is it to prove ROI on training? There are hundreds of articles and books on this subject, and maybe it's just gotten a little complicated. So, here's a view which puts responsibility for embedding the learning at the foot of the business. It's an approach that says the ROI should be calculated from the salary paid to the person doing the learning, not on the final bill of the learning itself. The cost of learning is an investment, but the return needs to be sought from those going on it. Our responsibility as L&D professionals is to put forward the absolute best opportunities and resources to learn. Having the ROI

focus on salaries puts the onus on the manager and individual to show the short- to long-term benefits. Otherwise we lose our chance to change performance and behaviours in the long term by getting bogged down in measuring expenditure on training rather than the impact it has. We are part of making sure that the L&D function is not seen as the place where performance is changed, but simply as where the change starts. Directly equating training outcomes to job performance through salary-based ROI is a great way to do just that.

Teresa Ewington Learning & Development Manager, Thames Water

HAPPY SHEETS

I LOVED IT!

The very mention of happy sheets will almost universally draw knowing looks and dismissive sneers from L&D professionals, despite the fact that we almost universally continue to use them.

WHAT'S GOING ON HERE? For years now we've assumed the learner experience is key to measuring the effectiveness of training. Hence the inflated importance given to enter-trainers. But learning is not just for the learners. Most of those who take part in work-based learning are doing it for the benefit of their employers. And rightly so, since they usually foot the bill. But surely a happy learner is a more motivated learner, and so more likely to transfer their learning? Perhaps, but not necessarily. Meta-study research by Sitzmann on 68,245 trainees showed that variance in learner satisfaction accounted for:
- 2% of the variance in factual knowledge
- 5% of the variance in skill-based knowledge
- 0% of the variance in training transfer
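For readers who want to see what "variance accounted for" means in practice: it is the squared correlation between satisfaction ratings and the outcome in question. A minimal sketch with made-up numbers (not Sitzmann's data):

```python
# "Variance accounted for" is the squared correlation (r squared) between
# learner satisfaction and an outcome measure. The data below is made up.
from statistics import correlation  # requires Python 3.10+

satisfaction = [5, 4, 5, 3, 4, 5, 2, 4]                  # happy-sheet ratings
transfer = [0.2, 0.7, 0.1, 0.6, 0.4, 0.3, 0.5, 0.2]      # observed behaviour change

r = correlation(satisfaction, transfer)
print(f"r = {r:.2f}, variance accounted for = {r * r:.1%}")
```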

Happy sheets are not even an accurate reflection of satisfaction. The majority of learners lie. Not because they are bad people, but because they don't want to offend the nice, fun person they've spent the day with. Happy sheet data can be manipulated: end the day with a fun exercise, give out chocolates and watch your results magically improve. Historically, HR, L&D departments and trainers have secretly had a fondness for happy sheets because they reaffirm how great we are at what we do. Happy sheets can provide valuable data if used thoughtfully as part of a more in-depth evaluation process. The problem is using them as a justification: wasting lots of time collecting and processing the happy sheet data as if it's the most important thing, when what really matters is business impact and transfer.

Ben Waldman is one of Reed Learning's L&D experts. For the last ten years he has worked as a facilitator, coach and consultant specialising in leadership development.

How (not) to learn with impact


Impact Man has been sent on a course, but he doesn't really know why. Still, he makes some friends...

...pays attention (sometimes)...

...and lets everyone know what a nice time he's had on his evaluation form.

Impact Man goes back to work, taking on the evil villain, Major Disengagement.

But because he didn't pay attention on his course, things don't go as well as they could!

How did that happen?

But it could all be so different...


How to learn with impact


Impact Man meets with his boss and discusses going on the Managing Difficult Supervillains course. He turns up with some pre-course work and objectives, and makes a note of specific actions he'll take.

With a plan in place, he's feeling confident taking on Major Disengagement...

...and he's able to apply what he's learnt to get the right outcome.

He's saved the day again!

Grrrrr!


The New World Kirkpatrick Model
By Kirkpatrick Partners

Level 1: Reaction
Engagement
Engagement is the degree to which participants are actively involved in and contributing to the learning experience.

Relevance

Relevance is the degree to which training participants will have the opportunity to use or apply what they learned in training on the job. Relevance is vital.

Customer satisfaction

Customer satisfaction is the degree to which participants react favourably to the learning event.

Level 2: Learning

The degree to which participants acquire the intended knowledge, skills and attitudes based on their participation in the learning event. Level 2 also includes:

Confidence

Confidence is the degree to which training participants think they will be able to do what they learned during training on the job.

Commitment

Commitment is the degree to which learners intend to apply the knowledge and skills learned during training to their jobs.

Level 3: Behaviour

Required drivers

Required drivers are processes and systems that reinforce, monitor, support and reward performance of critical behaviours on the job. [Diagram keywords: reinforce, encourage, reward and monitor on-the-job learning.]

Level 4: RESULTS
Leading indicators
Short-term observations and measurements suggesting that critical behaviours are on track to create a positive impact on desired results.

Desired outcomes

These are the measurable business objectives that were identified before the learning intervention.

© 2010-2012 Kirkpatrick Partners, LLC. All rights reserved. Used with permission. Visit Kirkpatrickpartners.com for more information.


EVALUATION AT HERTFORDSHIRE COUNTY COUNCIL

Reed Learning and Hertfordshire County Council have been working in partnership since 2011 to deliver a large catalogue of Leadership, Management, ICT and Personal Development solutions to a complex, diverse workforce. Measuring the quality of those solutions is vital to the organisation's success in a challenging financial climate. Before attending any Reed Learning event, Hertfordshire's employees will complete a pre-course questionnaire asking them to rate their confidence against each of the key learning indicators and the importance of these to their role. This information is passed to the trainer, who uses it to adapt the session to suit the needs of the group. Eight weeks after the event, all delegates will then be asked to complete a post-course questionnaire asking them to rate their new level of confidence against each of the key learning indicators. Along with happy sheet information, this is then used to measure whether the course was effective. Using the Reed Learning Evaluation System, Hertfordshire County Council have

been able to demonstrate a 17% improvement in learner confidence and ensure that the right people are attending the right course at the right time.
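As a rough illustration of how a pre/post confidence comparison like this could be calculated, here is a minimal sketch. The indicators and ratings are made up and are not Hertfordshire's or Reed Learning's actual data or system.

```python
# Illustrative pre/post confidence comparison for one course (made-up ratings,
# e.g. 1-10 self-assessments against each key learning indicator).
pre_course = {"delegating": 5, "giving feedback": 6, "coaching": 4}
post_course = {"delegating": 7, "giving feedback": 7, "coaching": 6}

pre_avg = sum(pre_course.values()) / len(pre_course)
post_avg = sum(post_course.values()) / len(post_course)
improvement = (post_avg - pre_avg) / pre_avg

print(f"Average confidence: {pre_avg:.1f} -> {post_avg:.1f} "
      f"({improvement:.0%} improvement)")
```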


USE SURVEYS TO MEASURE IMPACT Chris Robinson, Boost Evaluation

Impact means making a difference to business performance. It could relate to personal performance, behaviour change, colleague performance or even impact on society. But it has to be tangible and measurable. Here are some top tips for making sure you do just that.

1. Ask the right questions


Before doing any kind of survey design, get it straight in your head why you want to do the survey. The best trick is to write down all the conclusions you would want to see, e.g. "The new induction course reduces speed to competence by X%." Be ambitious! Don't restrict yourself only to what is easy to measure. Whatever you do though, choose things to measure that explore what happened, not just what people think.

2. Ask the right people


Think creatively about who the right witness to this impact is. Is it the learner? Probably not. When it comes to performance its usually a manager. When it comes to behaviour it could be a colleague or direct report.

3. Ask at the right time


Plan in advance by asking a few of your witnesses up front how long it typically takes them to notice a change in behaviour or performance. Then schedule the survey to go out at this time to optimise your results.
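One way to put that last tip into practice is to schedule the follow-up survey around the typical lag your witnesses report. A minimal sketch; the lag figures and course date below are made up:

```python
# Schedule the impact survey around the typical time witnesses say it takes
# to notice a change. The reported lags and course date below are made up.
from datetime import date, timedelta
from statistics import median

weeks_to_notice_change = [4, 6, 8, 6, 10]      # estimates from a few managers
course_end = date(2024, 3, 1)                  # hypothetical course end date

send_date = course_end + timedelta(weeks=median(weeks_to_notice_change))
print(f"Send the impact survey on {send_date}")   # six weeks after the course
```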

Learning transfer: How do you do it?


Simon Chilley

Improving the amount of learning transferred is the goal of most learning professionals. The question is: how do you do it? It comes down to three essential factors:*

Learner Characteristics: the ability, personality and motivation of the learner. The line manager is crucial here. Picking the right type of learning for each person, discussing how the new skills will be applied in the workplace and endorsing the quality of the learning before the event can all significantly increase the likelihood of learning transfer.

Learning Design: the quality of the learning itself. Relevant learning activities that closely resemble real life, including the opportunity to make mistakes and incorporate credible feedback on practical activity, are all shown to lead to increased learning transfer.

Work Environment: the support and opportunity the learner gets to apply their new skills. Again, the line manager is crucial: firstly in creating a climate in which applying new skills, and giving and receiving feedback, are the norm, and secondly in actively creating opportunities for new skills to be applied.

Learning professionals are often only responsible for one of these three things: Learning Design. Unfortunately, Learning Design is shown to be much less effective when the Learner Characteristics and Work Environment do not support learning transfer. So, if your organisation is looking to maximise learning transfer, you must engage your line managers, vary the types of learning to suit your learners and create a work climate where new skills can be practised.

Simon Chilley is a Programme Director at Reed Learning and a learning transfer specialist.


* Baldwin, T. T. and Ford, J. K. (1988), Transfer of Training: A Review and Directions for Future Research, Personnel Psychology, vol.41, no.1, pp. 63-103.

FAIL TO PREPARE, PREPARE TO FAIL

Getting results from training is down to the training company, right? You've paid good money, so you should just be able to sit back and relax. Well, not really. Even the very best training providers will struggle to deliver results if you don't do your bit. First, do the hard work of identifying what your business really needs. Get out and about in the organisation and ask the right questions. Once you've done your analysis, identify both learning and behavioural goals; managers are always more interested in the latter. Think business outcomes! Ensure that the people you are nominating for the course are the right people. In these difficult times it's not enough that someone needs training. They have to want it. Sell the course, don't just inform people about it. Explain what's in it for them and be bold in detailing what you expect them to do to deliver a return on the investment. Encourage learners to put their new capabilities to work after the learning event. It's rarely possible to offer financial incentives, so be imaginative. Try to link achievement

after training to eligibility for promotion or other recognition. Ensure that learners have immediate opportunities to use their new capabilities, and create a transfer community of other learners, trainers and colleagues to hold them to account. Last, but very definitely not least, don't forget the managers. They can make or break your efforts, so make sure that they understand their role before, during and after the course. There is no point in them agreeing to release their people to attend training if they aren't going to be actively involved in making that investment pay off.

Robert Terry is founder of The Kite Foundation, a not-for-profit think-tank that develops solutions to support the application of newly-learned skills in the workplace. www.kite-foundation.com


From Transaction to Transformation Debbie Carter

Nothing is more guaranteed to pull an audience to an event than something focused on the Holy Grail of L&D evaluation. So why do so many people find this area of expertise so difficult? Do mature L&D departments and professionals need to evaluate their work as much as they do? I suspect that other departments in organisations don't feel the need to prove their worth as we do; they see themselves as an integral part of the business, and if that business is growing and doing well they are satisfied that they are doing their job. The mature L&D professional should be more focused on understanding the business.

In the past, training and learning was viewed as something separate from the business. This separation meant that we felt the need to have the statistical evidence to prove that our work impacted on the organisation we were serving. I'm glad to say that many L&D specialists have moved on from this purely transactional relationship to something that is much more transformational, and that move brings a change in emphasis for the L&D skill set.

Debbie Carter is Director of Research at Training Journal, one of the UK's leading resources for L&D professionals.

Here are three traits you can develop to help transform your business:

COMMANDER COURAGE
COURAGE: challenge and ask questions to help the business understand itself better. Many problems are not solved by training or learning initiatives.

Captain Curiosity
Curiosity: find out what's happening inside and outside the business and talk to all your stakeholders.

FELICITY FLEX
FLEXIBILITY: nothing stays the same. Support people in discarding old practices and in embracing new ideas.


THE FUTURE OF EVALUATION

By Tony Bingham

Evaluating the impact of learning must be a priority


As business becomes more competitive and more global, a key differentiator for success will be talent. A study by ASTD and IBM found that senior executives and CEOs agree that learning is strategically valuable. Because the learning function is ultimately responsible for making sure the workforce is well-skilled and the talent pipeline is intact, leaders must measure the impact of what they achieve in ways that matter to the organisation. Many organisations have a long way to go when it comes to ensuring that learning is truly and demonstrably aligned with the business. Fortunately, most are seeing the importance of this. The future lies with learning professionals becoming as astute about

business as they are about learning. ASTD found a strong correlation between goal alignment and market performance, and a strong correlation between alignment and a learning function that is effective at meeting goals: its own and the organisation's. The future of evaluation will not be defined by a particular model or methodology. The future of evaluation is a matter of strategic relevance and impact. When training and development efforts are designed and aligned to meet business needs and goals, their impact on results will be determined in metrics that are most meaningful to the organisation.

Tony Bingham is the CEO of the American Society for Training and Development, the world's largest organisation dedicated to training & development professionals.

THE EVOLUTION OF EVALUATION

1792: William Farish is the first to use the quantitative mark (or numerical score) to assess his students.

1845: Printed formal tests are used for the first time to assess learners, in Boston, MA.

1897: Joseph Rice uses testing to find no link between time spent learning to spell and competence.

1911: Frederick Taylor publishes The Principles of Scientific Management, focusing on constant quantitative testing and efficiency.

1940: Ralph Tyler finishes an eight-year study and pioneers the idea of behavioural evaluation as an alternative to factual evaluation.

1959: Donald Kirkpatrick first publishes his ideas on training evaluation. Along with some revisions, this framework is still widely used today.

1960s-1990s: A huge range of evaluation methodologies spring up. In 1997 Worthen et al classify them into six categories of orientation: Objectives, Management, Consumer, Expertise, Adversary and Participant.

2004-Present: Fitzpatrick et al identify 12 future trends for evaluation, including greater use of technology and using qualitative and quantitative analysis together.

Ever wondered whether your learning programmes are really making a difference? Then this book is for you. In our latest Little Book, Reed Learning have partnered with Training Journal and some of L&D's top thinkers to demystify training evaluation and discover how to deliver learning with real impact. Visit us at www.reedlearning.com to find out more.
IN PARTNERSHIP WITH
