
Measuring the Impact of Training and Development in the Workplace

Boosting Skills and Productivity

Skillnets Ltd 2005
Prepared by the Impact Measurement Centre

Contents

Executive Summary
Project Summary

Part I  The Challenge and the Approach
Section 1 - Introduction
Section 2 - The Project
Section 3 - Models to be Tested
Section 4 - Implementation Strategy

Part II  The Results
Section 5 - Application and Implementation of the Models
Section 6 - Impact of the Models
Section 7 - Barriers and Enablers to Adoption of the Models
Section 8 - Training and Support for Implementation of the Models
Section 9 - Evaluation of the Process

Part III  Conclusions
Section 10 - Conclusions and Recommendations

References
Executive Summary

This pilot project tested the detailed application of the Kirkpatrick and Phillips Evaluation Models in Irish enterprises, with particular regard to the applicability and usability of the models in evaluating the impact of training in companies. The results can be summarised as:

Applicability: the models are certainly applicable. The results of the pilot project show that they are:

> Methodologically sound, comprehensive and credible and, hence;
> Acceptable in the Irish business context.

Usability: the models are usable with adequate training and support.

With regard to the usability of the models the pilot project shows that:

> Levels 1-3 are easy to use, Level 4 presents some challenges for companies and Level 5 is difficult, requiring business impact and monetary data and the use of techniques to isolate the impact of training from other factors. Further, given that relatively significant staff and time resources are needed to carry out a full Level 5 ROI evaluation, it may be easier to do so in larger firms. It is not however impossible, and certainly not less important, in smaller firms. This staffing and time commitment should however decrease as competency develops.

> In order to ensure a credible and accurate evaluation study up to and including Level 5, the organisation must commit to providing appropriate staff with ROI evaluation training. We consider 2-3 days training to be the basic requirement. We believe therefore that the skills/knowledge of managers to use the models can be developed.

> Programmes which will be evaluated to Level 5 should be chosen carefully. Criteria such as size, scope and cost of training, and also the presence of baseline and easily accessible and reliable data, are important.

> Baseline data must be gathered prior to all training programmes.

> Inadequate Training Needs Assessment prior to delivery of a training programme militates against an effective evaluation process.

> There must be strong support from senior management in the organisation for implementation of the models.

> The commitment of the person responsible for implementation is vital.

> All staff should be made aware of the basics of the model, the importance of carrying out such evaluation and the benefits to both staff and the organisation generally. In particular, the involvement of trade union representatives has been shown in this pilot to have been very beneficial.

> HRD/training objectives and outcomes must be integrated with business objectives.

> Further, the application of the evaluation process and its findings must be linked to business objectives.

Project Background
Given the growing recognition of training as an important factor in economic competitiveness, considerable attention is currently being devoted to the subject of the return on investment in training to firms. This attention is also related to the link between competitive economic pressures and the need to increase the efficiency and effectiveness of training, which positions evaluation up to the ROI level as a critical issue for most companies.

Skillnets has clearly identified that if training adoption by Irish companies is to increase beyond its current low level then it is vital to prove the benefits of training in terms of its contribution to business goals. This requires a set of evaluation tools and processes that are more advanced and credible than those now being used.

A cost-effective and rigorous ROI system, in which business strategic practices of planning and evaluation are applied to training, can greatly improve the quality, efficiency and effectiveness of training.

The focus on ROI also reflects the ongoing emphasis on improving the professionalism of both in-company and external trainers and the drive for greater accountability. Internal competition for company resources places a burden on the training department to prove training investments and provide measurable returns comparable to returns on uses of corporate capital. In an era of cost cutting, training expenditures must be made financially accountable.

Project Implementation
Following a public tendering procedure Skillnets appointed the Impact Measurement Centre to carry out a pilot project involving 18 companies across 9 networks affiliated to Skillnets to test the detailed application of the Kirkpatrick and Phillips Evaluation Models in Irish enterprises, with particular regard to the usefulness and applicability of the models in evaluating the impact of training in companies. The project is a pilot project which may lead to wider diffusion at a later stage.

Kirkpatrick's four levels of evaluation are: (1) Reaction (of the participants to the training, usually measured in surveys distributed at the end of the training session); (2) Learning (gains in skills and knowledge achieved by the participants, usually measured by pre and post tests); (3) Behaviour (focused on whether the skills and knowledge gained in training are applied and practiced; this is usually measured on the job three months or more after training); and (4) Results (or ultimate outcomes of the training in terms of company goals). Kirkpatrick's model has been amended slightly over time to include a fifth level, (5) ROI, by Dr Jack J. Phillips, measuring return on investment of level four results (Phillips 1996). (See Figure 1 below.)

The project was implemented between January and December 2004. The project involved a formal agreement between Skillnets and participating companies, the training of staff in 18 companies to use the models in their own firms, development of support tools to assist firms in using the methods, and consultant/advisor services to support the carrying out of the exercise.

Supports that facilitated the pilot were: (a) the vision of Skillnets in bringing forward the idea and committing funds towards its implementation; (b) the personal commitment of in-company personnel who received the training and implemented in-company evaluation studies - which was helped by the opportunity to be trained and certified in a cutting-edge methodology; (c) the openness of the companies and their provision of resources to complete studies - which was helped, in some cases, by management interest in the models and by the small grant incentive from Skillnets; (d) the training, support, sustained focus and drive toward completion provided by the consultants.

Constraints on implementation were: (a) the limited time available to train personnel in a completely new method and carry out adequate evaluation studies; (b) the reduced scope of the type of training programme that could be chosen for evaluation, dictated by the need to complete an evaluation within the time window of the project; (c) the challenge of building awareness of the methodology and obtaining buy-in from senior management in each company.

Project Results
The pilot shows that the models are applicable and usable in an Irish context and can readily be applied by Irish firms, though some issues arise in this regard as highlighted above. The project was strategically important, well designed and excellently managed, and the awareness of measurement and evaluation was increased. The specific objectives of the project were realised:

> to test the models (successfully done in 14 companies - see case studies)
> to train staff (successfully done both in companies and also in Skillnets networks)
> to develop support tools (successfully done both in establishing a capability, a formal training system and support materials)
> to evaluate the process and report on the findings (contained in this report).

It is quite apparent from the project outcomes, as assessed by the independent evaluation, that even with limited time and basic expertise the rigorous application of the ROI process yields a clear and credible result with a multiplicity of beneficial uses. If we look at the benchmark which academic evaluators apply to have most confidence in their results, we see that they favour comprehensive evaluation designs with components including a process evaluation, an impact analysis, analysis of participant perspectives, and a benefit-cost analysis. This approach generally yields not only a more valid evaluation but better understanding of what is going on "behind the numbers." In the business context we may not set the bar so high, but in fact the ROI process meets and even exceeds such standards and is powerfully robust as a consequence.

An independent evaluation conducted by Mr Anthony Foley, Dean of Dublin City University Business School, shows that the participants considered the project to be a success. All would recommend the programme and the teaching-content aspects received high ratings.

The Case Studies produced as part of the project will be beneficial in future training. The project has identified a range of enablers and barriers and has clearly advanced understanding of these, which include:

> company support and acceptance;
> time constraints (and hence the need for time saving tools of analysis and manuals);
> basic knowledge of the models (and hence the need for training).

It would appear to be the case that wider-spread use of measurement methodologies will not occur without the relevant training.

Figure 1. Phillips' Model for Determining the Return on Investment in Human Resource Development (HRD)
[Figure: the Phillips ROI process flows from Evaluation Planning (develop objectives of solution(s); develop evaluation plans and baseline data), through Data Collection (collect data during solution implementation - Level 1: Reaction, Satisfaction and Planned Actions; Level 2: Learning - and collect data after solution implementation - Level 3: Application/Implementation; Level 4: Business Impact), through Data Analysis (isolate the effects; convert data to monetary value; tabulate costs of solution; calculate the return on investment - Level 5: ROI; identify intangible measures - Intangible Benefits), to Reporting (generate impact study). Source: Phillips (1997)]
Project Summary

Project Sponsors: Skillnets Ltd

Steering Group: Niall Saul, John Dunne, Sean Heading, Máire Hunt, Alan Nuzum

Consultants: Impact Measurement Centre, 5 College Way, Clane, Co Kildare
www.impact-measurement-centre.com / info@impact-measurement-centre.com
Gerry Doyle, Managing Partner

Consortium: Eoghan O'Grady, Dublin Institute of Technology, Aungier Street, Dublin 2
Sean O'Sullivan, Sean O'Sullivan & Associates, 156 Hillside, Greystones, Co Wicklow

Evaluator: Anthony Foley, Dean, Dublin City University Business School

Objectives: To test the Kirkpatrick and Phillips evaluation models in Irish enterprises; to test the detailed application of the specified models in a small number of companies and/or networks with particular regard to the usefulness and applicability of the models in evaluating the impact of training in companies.

Results: The models are applicable and readily usable in Irish enterprises.

Start Date: January 2004
Finish Date: December 2004

Key Deliverables:
> Completed case studies in participating companies
> Analysis and evaluation of the usefulness of the models
> Delivery of a comprehensive training course to provide participants with the skills to implement the models
> Delivery of a consultancy/support service to assist participants in implementing the models
> Final report outlining the results of the research and recommendations
> Mid-way review workshop
> Development of tools and resources to support the process

Networks Participating:
Hospitality Management Skillnet - Michelle Whelan, Elizabeth Kennedy
South East Micro Skillnet - Karma Farrell
SLM Skillnet - Mick McHugh
Brewing Skillnet - John Findlater
NETS Skillnet - Stephen Tegart
North Mayo Skillnet - Carl Blake, Kieran Burke
Carlow Kilkenny Training Skillnet - Aisling Ward
BME Skillnet - Bill Halliden
Galway Executive Skillnet

Companies Participating:
Bord na Mona - Kevin Harte
Braun Oral B - Berna Ward
Choice Hotels - Triona Brangan, Michelle Whelehan, Jennifer Ryan
Complete Laboratory Solutions - Evelyn O'Toole, Jackie Gibney
Constructive Solutions - Mark Culleton
Diageo - Brendan Farrell, Declan Harrison
Eo Teoranta*
Glanbia (Agribusiness) - Jean O'Keeffe
Glanbia (Meats) - Peter Donohoe
Heineken - P.J. Tierney, Eilis Mulcahy
Hilton Hotels - Joanne Mulvey
Irish Biscuits - Roisin Down, Rose Clancy
Kylemore Foods*
Laepple - Michael Ryan
Lionbridge Technologies - Jimmy Nolan
Masterchefs - Emer Fennel
Novartis - Willie Herlihy
Pat the Baker - Terry Hughes, Margaret O'Brien
Pauwels-Trafo - Sharon Mitchell, Andrea Flanagan
Tara Mines - Eoghan O'Neill
TG4*

* Withdrew early in the programme due to lack of available staff to commit to the project.
Part I  The Challenge and the Approach

Section 1 - Introduction

In November 2003 Skillnets Ltd advertised a public Request for Tender (T040) entitled 'The application of best practice models for measuring the impact of Training and Development in the workplace.' The RFT sought a suitable partner to implement a pilot project to test methods of measuring the impact and return on investment (ROI) of training with a selected number of Irish companies operating within the Skillnets network through locally organised training networks. The Impact Measurement Centre was selected as a result of the tendering process to deliver the project.

Skillnets Ltd was established in 1999, charged with the development, management and administration of a Training Networks Programme. Company directors are primarily drawn from industry and represent employer interests from IBEC (Irish Business and Employers Confederation), the CCI (Chambers of Commerce of Ireland), the CIF (Construction Industry Federation), the SFA (Small Firms Association) and employee/trade union representatives from ICTU (Irish Congress of Trade Unions), as well as three appointees of the Minister for Enterprise, Trade and Employment.

Skillnets is funded under the National Training Fund through the Department of Enterprise, Trade and Employment. The networks are financed through a co-investment (rather than grant aid) strategy. The public/private funding level is agreed for each individual network and significant funding support has been given at the early stages of the networks.

Skillnets primarily facilitates an enterprise-led approach to training and aims to address the lack of investment in Human Resource Development and training by Irish business through tackling some of the real and perceived barriers to training.

The role of Skillnets is to provide a framework, technical support, advisory services, linkages, and guidance to networks where sought. Skillnets also ensures transparency, accountability and appropriate monitoring and control systems to ensure the proper management of public funds.

The Impact Measurement Centre* was established to specialise in the development of impact measurement and performance related tools and techniques to provide the public and private sector in Ireland with more effective mechanisms for programme and process evaluation at all levels. Led by Gerry Doyle (who has extensive evaluation and management experience), IMC is affiliated with the ROI Institute, whose Chairman is Dr Jack J Phillips, one of the world's leading experts in measuring return on investment (ROI) in training and performance improvement programmes.

For the purpose of delivering the Skillnets pilot project the Impact Measurement Centre consortium included participation from academia (Eoghan O'Grady of the Dublin Institute of Technology and Anthony Foley, Dean of Dublin City University Business School) and the private sector (Sean O'Sullivan, Sean O'Sullivan & Associates, Business Performance Improvement Consultants). Support was also provided by Drs Jack and Patricia Phillips and the ROI Institute.

The project was directed by a steering group comprising three members of the Board of Skillnets - John Dunne, Chief Executive of the Chambers of Commerce of Ireland; Niall Saul, Group Head of Human Resources of Irish Life & Permanent Group plc; and Sean Heading, Manager of Education and Training Services. The steering group also included the Chief Executive of Skillnets, Máire Hunt, and its Programme Manager, Alan Nuzum.

Sincere thanks to all those who participated in the project, both in the companies and in the networks. We are also grateful to Tom Calledy of Enterprise Ireland and Breda McNally for their advice in the planning stages. We are especially indebted to Dr Jack J Phillips and Dr Patricia Pulliam Phillips for their support, for making available materials and for attending the evaluation workshop.

Included with this volume is a CD Rom containing:
- Evaluation Handbook, Measuring the Impact of Training and Development in the Workplace, and
- Case Studies, Results of the Skillnets Pilot Project, Measuring the Impact of Training and Development in the Workplace.

Skillnets Ltd
Skillnets (established in 1999) is an enterprise-led support body which manages the Training Networks Programme, an initiative to support groups of companies to expand the quantity and quality of training in Ireland. Skillnets is funded under the National Training Fund through the Department of Enterprise, Trade and Employment. The networks are financed through a co-investment arrangement with the participating companies.

Skillnets stakeholders are drawn from industry and represent employer interests from IBEC (Irish Business and Employers Confederation), the CCI (Chambers of Commerce of Ireland), the CIF (Construction Industry Federation), the SFA (Small Firms Association), and employee/trade union representatives from ICTU (Irish Congress of Trade Unions).

Skillnets Ltd is funded from the National Training Fund through the Department of Enterprise, Trade and Employment.

Section 2 - The Project

2.1. Background
Given the growing recognition of training as an important factor in economic competitiveness, considerable attention is currently being devoted to the subject of the return on investment in training to firms. This attention is also related to the link between competitive economic pressures and the need to increase the efficiency and effectiveness of training, which positions ROI as a critical issue for most companies. No less important is the requirement to link strategic business objectives with training goals and outputs.

In Ireland, Skillnets has monitored these developments closely for a number of years and has been concerned at the low level of implementation of the ROI model amongst Irish companies. This has clear implications for sustained economic competitiveness.

In addition, from a policy and strategic perspective, Skillnets has clearly identified that if training adoption by Irish companies is to increase beyond its current low level then it is vital to prove the benefits of training in terms of its contribution to business goals. This requires a set of evaluation tools and processes that are more advanced and credible than those now being used. Currently, even if training evaluation is undertaken, it is usually at the easiest and lowest level - the measurement of student reactions through what are euphemistically called smile sheets. Reactions are important and these smile sheets serve a purpose, but they are not enough to back up an argument to support a need for greater investment in training, when major changes need to be made in the direction the business is taking, or when there is stiffer competition for resources. The heightened interest in evaluation of training, and ROI specifically, is also driven by internal forces within the training profession. As previously noted, market pressures have placed a premium on a skilled and flexible workforce, which in turn has significantly raised the importance of training.

The focus on ROI also reflects the ongoing emphasis on improving the professionalism of both in-company and external trainers and the drive for greater accountability. Internal competition for company resources places a burden on the training department to prove training investments and provide measurable returns comparable to returns on uses of corporate capital. In an era of cost cutting, training expenditures must be made financially accountable. Given the increasing influence of Total Quality Management and ISO-9000 registration, with their heavy emphasis on documentation and measurement, training departments must develop effective tools of benefit-cost analysis if they are to become integral to corporate strategy in the future.

Despite these problems, leading companies and training departments are making considerable progress in training measurement and evaluation, reflective of its growing importance. In Ireland early adopters have included Enterprise Ireland and the AIB Group. Many training practitioners subscribe to the idea that a "paradigm shift" is occurring in training evaluation, depicted by Phillips (1997) (see Table 2.1) as a shift from "training for activity" to "training for results."
* Impact Measurement Centre is a trading title and registered business name of Creative Change (Ireland) Ltd., an Irish consulting and training enterprise

Table 2.1. Paradigm Shift in Training Evaluation (Source: Phillips (1997))

Activity Based, characterized by:
> no business need for the program
> no assessment of performance issues
> no specific measurable objectives for application and impact
> no effort to prepare program participants to achieve results
> no effort to prepare the work environment to support application
> no efforts to build partnerships with key managers
> no measurement of results or cost benefit analysis
> planning and reporting is input-focused

Results Based, characterized by:
> program linked to specific business needs
> assessment of performance effectiveness
> specific objectives for application and impact
> results expectations communicated to participants
> environment prepared to support transfer of learning
> partnerships established with key managers and clients
> measurement of results and cost benefit analysis
> planning and reporting is output-focused

2.2. Reasons to Evaluate Training
The following are the main arguments for better evaluation of training:

To show it was the training
After a training programme, there is usually a boost in trainees' work performance. Clearly, the two events are linked. But then a manager asks the question: "How much of the improvement was caused by the training?" This familiar inquiry is rarely answered with much accuracy or credibility. Performance improvements may be linked to training, but usually non-training factors have also contributed. As most HR practitioners know, it can be difficult to show a cause-and-effect relationship between training and performance, and that is why proven and tested high level evaluation tools are essential.

To validate training as a business tool
Training is one of many actions that a company can take to improve its performance and profitability. Only if training is properly evaluated can it be compared against these other methods and expect, therefore, to be selected either in preference to or in combination with other methods.

To justify the costs incurred in training
We all know that when money is tight, training budgets are amongst the first to be sacrificed. Only by thorough, quantitative analysis can firms make the case necessary to resist these cuts.

To help improve the design of training
Training programmes should be continuously improved to provide better value and increased benefits for an organisation. Without formal evaluation, the basis for changes can only be subjective.

To help in selecting training methods
These days there are many alternative approaches available to conduct training, including a variety of classroom, on-job, on-line and self-study methods. Using comparative evaluation techniques, companies can make rational decisions about the methods to employ.

2.3. Why is ROI important?
Human Capital is now recognised as vital to thriving organisations in their drive for international competitiveness. A three-year study by the American Society for Training and Development (ASTD), Profiting from Learning: Do Firms' Investments in Education and Training Pay Off?, of 575 US companies offered the first definitive proof that investment in training directly improves business and stock market performance. The data, collected between 1996 and 1998, showed that when companies were ranked by how much they spent on training (per employee), those that spent more on training did much better than those that spent less. Firms in the top half of the ranking had a Total Shareholder Return (TSR) 45% higher than the weighted average of the S&P 500, and an astonishing 86% higher than firms in the bottom half of the ranking.

Training Magazine in the USA discovered some other arresting insights in a recent annual Top 100 ranking of companies that excel in human capital development. The survey found that the most highly regarded US firms spend 4 per cent of payroll on training (twice the industry average of 2 per cent). Interestingly, the leading companies also surpassed industry norms in measuring and evaluating their training investments. Ninety two per cent measure training effectiveness through Kirkpatrick's Level 4 (business results), compared to the industry average of 11 per cent. An astonishing 67 per cent measure ROI (Level 5), compared to industry averages of between 5 and 10 per cent. The Number 1 Top 100 firm was Pfizer Inc.

In Ireland an ESRI study (Does Training Generally Work? The Returns to In-Company Training: Alan Barrett and Philip J O'Connell, Senior Research Officers, ESRI, 2003)* has shown that training contributes directly to increased productivity, while the US research cited above demonstrates a link between top performance and companies that not only carry out training but spend the time and money to measure its results: 92% of top performing companies measure training effectiveness through Kirkpatrick's Level 4 (business results), compared to the industry average of 11 per cent, and an astonishing 67 per cent of the top performing companies measure ROI (Level 5), compared to industry averages of between 5 and 10 per cent.

Unfortunately Ireland has lagged behind in the ROI/evaluation stakes. While there are no usable statistics to hand, anecdotal evidence suggests that less than 10% of Irish firms evaluate up to Level 4 and barely 40% go even as far as Level 1. Only a handful of Irish companies have gone to Level 5 - full return on investment analysis.

This pilot project therefore was seen by Skillnets as a major attempt at building awareness and credibility for these measurement techniques, thus encouraging many more Irish companies to go down the road of measuring the return on training. Skillnets saw the task also in terms of encouraging stronger accountability, building the case for training, isolating the effects of training to show where improvement in employee and business performance is due to training, validating training as a business tool and helping in the design of training.

* The authors compared data from surveys carried out by FÁS in 1993 and again in 1996-97 to estimate the productivity effects of training and found statistically significant indicators of the impact of training on productivity.

Section 3 - Models to be Tested

3.1. Kirkpatrick and Phillips Models
In 1959, Dr Donald L. Kirkpatrick, author, PhD, consultant and past president of the ASTD, published a series of four articles called Techniques for Evaluating Training Programs. The articles described the four levels of evaluation that he had formulated, based on his work for his PhD dissertation at the University of Wisconsin, Madison. Later, Kirkpatrick wrote a book, Evaluating Training Programs: The Four Levels (Berrett-Koehler Publishers, Inc, San Francisco), now in its second edition (1998).

Kirkpatrick's four levels are: (1) Reaction (of the participants to the training, usually measured in surveys distributed at the end of the training session); (2) Learning (gains in skills and knowledge achieved by the participants, usually measured by pre and post tests); (3) Behaviour (focused on whether the skills and knowledge gained in training are applied and practiced; this is usually measured on the job three months or more after training); and (4) Results (or ultimate outcomes of the training in terms of company goals). Kirkpatrick's model has been amended slightly over time to include a fifth level, (5) ROI, by Dr Jack J. Phillips, measuring return on investment of level four results (Phillips 1996).

Kirkpatrick alluded to ROI when he created level four, linking training results to business results, but never elaborated it further. However, over time the need to measure the monetary value impact of training became so important to corporations that a fifth level became essential. Dr Phillips outlines his approach to Level Five in his book Return on Investment in Training and Performance Improvement Programs (Butterworth Heinemann Publishers, Inc, Woburn, MA, 1997). Dr Phillips has written extensively on the subject, publishing or editing dozens of books on the topic of ROI.

Ideally, according to Kirkpatrick, evaluation should be conducted at all four levels, because agreement or coherence in the findings across the levels strengthens the conclusions significantly. However, largely due to the high cost of evaluations, such a comprehensive approach is not normally implemented by firms in practice.

Despite the fact that the evaluation model introduced by Kirkpatrick is now 45 years old, its elegant simplicity has caused it to be the most widely used method of evaluating training programmes worldwide. The American Society for Training and Development (ASTD) reported feedback from almost 300 HRD executives and managers in 2000 which revealed that 67% of organisations that conduct evaluations use the Kirkpatrick model.

3.2. Return on Investment (ROI)
Calculating Return on Investment (ROI) is a practice of modern management used in the analysis of many business strategies and operations. It is a standard measure used for predictive and diagnostic evaluation of business initiatives. Perhaps the most popular application of this tool is in the analysis of purchase decisions for investments in capital equipment or technology. ROI is simply a measure of benefit versus cost. Expressed as a percentage, ROI is determined by total net present benefits divided by total net present costs. Benefits and costs are converted into present values since they usually accrue over extended periods of time. In the context of training, ROI is a measure of the monetary benefits obtained by an organisation over a specified time period in return for a given investment in a training programme.

One of the earliest methods for evaluating training and performance improvement investments was the cost-benefit analysis process. The cost-benefit analysis compares the benefits of a program to its costs through a benefit-cost ratio (BCR) (Thompson 1980; Kearsley 1982; Nas 1996; Phillips 1997b). In formula form, the BCR calculation is:

BCR = Programme Benefits / Programme Costs

A benefit-cost ratio of one means that the benefits equal the costs. A benefit-cost ratio of two, written as 2:1, indicates that for each monetary unit spent on the programme two units were returned as benefits.

ROI, on the other hand, compares the net programme benefits and costs. The ratio is usually expressed as a percent by multiplying the fractional value by 100 (Phillips 1997). ROI can be used both to justify a planned investment and to evaluate the extent to which the desired return was achieved. To calculate ROI you must first make estimates or obtain measurements of the costs and benefits associated with a training programme.

Since the 1970s two basic approaches have evolved to conducting ROI analysis of training. Each confronts a different set of challenges. Until the late 1980s the most common approach was to view the measurement of ROI as a separate, discrete function, independent of the training under evaluation. The advantages of this approach are simplicity, efficiency, and clarity in purpose and operation; it generally requires fewer resources and is less costly. However, this approach does not produce a rigorous and systemic approach to ROI.

The second approach, which is more broadly conceived and is now the most widely used, is based on the premise that ROI is most effective when designed and implemented as an integral part of the whole training process. A comprehensive framework for ROI implementation incorporates all the phases of training, from initial training needs assessment and planning, program design, benchmarks and measures, data reporting and collection, through final evaluation (Darling 1993).

3.3. Linkage Between the Five Levels
Crucial to this broad based approach is the chain of impact or linkage between the five levels. Phillips emphasises the "chain of effect" implied in the five-level evaluation model shown in Figure 3.1. Initially, it is essential to derive the measurable results of training from participants' application of new skills or knowledge on the job over a specific period of time after training is completed - a level 3 evaluation. Logically, successful on-the-job application of training content should stem from participants having learned new skills or acquired new knowledge - a level 2 evaluation. Consequently, for a business-results improvement (a level 4 evaluation), the chain of effect implies that measurable on-the-job applications (level 3) and improvement in learning (level 2) are achieved. Without this preliminary evidence, it is difficult to isolate the effect of training or to conclude that training is responsible for any performance improvements. Practically speaking, if data is collected on business results (level 4), data should also be collected at the other three levels of evaluation. This applies equally to return on investment (a level 5 evaluation).

3.4. A Systematic Evaluation Model
Despite numerous obstacles, considerable progress has been made in developing methodologies to evaluate training and to calculate its ROI, reflective of a growing recognition of the importance of training. Jack Phillips has pioneered efforts to develop, systematise, and improve the practical evaluation methods used by training professionals and managers in the field. Phillips presents the most thorough ROI model, comprising 18 steps, emphasising a systemic approach to training evaluation. (See Table 3.1.)

Table 3.1. A Systemic Evaluation Model: 18 Steps
1. Conduct a needs assessment and develop tentative objectives
2. Identify purposes of evaluation
3. Establish baseline data
4. Select evaluation method/design
5. Determine evaluation strategy
6. Finalize programme objectives
7. Estimate programme costs/benefits
8. Prepare and present proposal
9. Design evaluation instruments
10. Determine and develop programme content
11. Design or select delivery methods
12. Test programme and make revisions
13. Implement or conduct programme
14. Collect data at proper stages
15. Analyse and interpret data
16. Make programme adjustments
17. Calculate return on investment
18. Communicate programme results

Source: Phillips (1997)
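The chain of effect described in Section 3.3 can also be pictured as a simple data structure. The sketch below (in Python) is purely illustrative and is not part of the Kirkpatrick or Phillips materials; the level names follow this report, while the example instruments and field layout are our own hypothetical choices.

# Illustrative sketch of the five evaluation levels and the chain of effect.
# Level names follow the report; the example instruments are hypothetical.
EVALUATION_LEVELS = [
    (1, "Reaction, Satisfaction and Planned Actions", "end-of-course questionnaire ('smile sheet')"),
    (2, "Learning", "pre- and post-course tests"),
    (3, "Application/Implementation", "on-the-job follow-up three months or more after training"),
    (4, "Business Impact", "business records: output, quality, time, cost"),
    (5, "ROI", "benefit-cost and ROI calculation on isolated Level 4 results"),
]

def levels_to_collect(target_level):
    """Chain of effect (Section 3.3): an evaluation planned at a given level
    should also collect data at the levels below it."""
    return [name for level, name, _ in EVALUATION_LEVELS if level <= target_level]

print(levels_to_collect(4))
# ['Reaction, Satisfaction and Planned Actions', 'Learning',
#  'Application/Implementation', 'Business Impact']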

Figure 3.1. Phillips' Model for Determining the Return on Investment in Human Resource Development (HRD)
[Figure: the Phillips ROI process model, as reproduced in Figure 1 of the Executive Summary - Evaluation Planning (develop objectives of solution(s); develop evaluation plans and baseline data), Data Collection (during and after solution implementation, Levels 1-4), Data Analysis (isolate the effects; convert data to monetary value; tabulate costs of solution; identify intangible measures), calculation of the return on investment (Level 5: ROI) and Reporting (generate impact study). Source: Phillips (1997)]

Figure 3.2. Percentage of Programmes Evaluated at Each Level (Source: ROI Institute)
Level                                       Percent Evaluated
1. Reaction, Satisfaction, Planned Action   100%
2. Learning                                 60%
3. Application/Implementation               30%
4. Business Impact                          10-20%
5. ROI                                      5-10%

Phillips's ROI Methodology applies a range of unique tools and techniques that enable the practitioner to complete difficult and challenging tasks such as: identifying the business results of training and then converting them into monetary values, isolating the effects of the training from other factors that could have contributed to the results, and identifying intangible benefits.

That is done through a lengthy, painstaking and hair-splitting evaluation involving tools such as focus groups, follow-up questionnaires, trend-line analyses and "controlled" studies, where employees are split into two groups, but only one receives training, so the results can be compared.

The crucial point comes before any final calculation, when the impact of learning has to be isolated from gains in revenue, performance or productivity that might have accrued because of outside circumstances - seasonal sales variation, for instance. Phillips has established a set of guidelines so that results are standardised. These include a rule that only the most conservative data can be included in the formulas.

Phillips's formulas have taken root in at least 1,000 private and public-sector organisations in 40 countries.
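As a purely illustrative sketch of the control-group ("controlled study") approach to isolating the effects of training described above, the comparison reduces to subtracting the untrained group's improvement from the trained group's improvement. The figures below (in Python) are hypothetical and are not drawn from any of the pilot case studies.

# Hypothetical control-group isolation of a training effect (illustration only).
# Units produced per employee per week, before and after the programme.
trained_before, trained_after = 100.0, 118.0   # group that received the training
control_before, control_after = 100.0, 106.0   # comparable group that did not

trained_gain = trained_after - trained_before   # 18 units
control_gain = control_after - control_before   # 6 units (other factors, e.g. seasonal variation)

# Improvement attributable to the training = trained gain minus control gain.
attributable_gain = trained_gain - control_gain  # 12 units per employee per week
print(f"Gain attributable to training: {attributable_gain:.0f} units per employee per week")

Only the isolated gain would then be carried forward to the monetary conversion and ROI calculation, in line with Phillips's rule of using the most conservative data.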
The crucial elements in this model, for those who wish to go as far as calculating the ROI of a training programme, commence at Level 4. Phillips specifies the steps thereafter as follows:

1. Collect Level 4 evaluation data. Ask: did on-the-job application produce measurable results?
2. Isolate the effects of training from other factors that may have contributed to the results.
3. Convert the results to monetary benefits. Phillips recommends dividing training results into hard data and soft data. He says hard data are the traditional measures of organisational performance because they are objective, easy to measure, and easy to convert to monetary values. They include output (units produced, items assembled, tasks completed); quality (scrap, waste, rework); time (equipment downtime, employee overtime, time to complete projects); and cost (overhead, accident costs, sales expenses). Conversely, soft data include such things as work habits (tardiness, absenteeism); work climate (grievances, job satisfaction); attitudes (loyalty, perceptions); and initiative (implementation of new ideas, number of employee suggestions).
4. Total the costs of training.
5. Compare the monetary benefits with the costs.
6. The non-monetary benefits can be presented as additional - though intangible - evidence of the programme's success.

To define the return on investment, Phillips recommends that the following formulas should be used:

Benefits/Cost Ratio = Programme Benefits / Programme Costs
ROI (%) = (Net Programme Benefits / Programme Costs) x 100

3.5. Choosing the Right Evaluation Level
During the development stage of the project the consultants worked with the participating companies to help them decide which evaluation level to work towards, taking account of individual circumstances. Companies were advised of the need to collect data at lower levels where a higher-level evaluation is to be carried out. However, Phillips' ten Guiding Principles (No. 2) state that when an evaluation is planned for a higher level, the previous level of evaluation does not have to be comprehensive. Figure 3.2 above is a guide to the average percentage of programmes evaluated at each level.
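To make the two formulas above concrete, a minimal worked sketch with hypothetical figures (not taken from any of the pilot case studies) might look as follows.

# Hypothetical worked example of the benefit/cost ratio and ROI formulas.
programme_benefits = 90_000.0   # annual monetary value of the isolated Level 4 improvement (euro)
programme_costs = 60_000.0      # fully loaded costs: design, delivery, participants' time, evaluation

bcr = programme_benefits / programme_costs                 # 1.5, i.e. a ratio of 1.5:1
net_benefits = programme_benefits - programme_costs        # 30,000 euro
roi_percent = (net_benefits / programme_costs) * 100       # 50%

print(f"BCR = {bcr:.1f}:1")
print(f"ROI = {roi_percent:.0f}%")

On these assumed figures the programme returns 50 cent of net benefit for every euro invested; in the pilot itself the corresponding figures were drawn from each company's own case study data.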

Section 4 - Implementation Strategy

4.1. Project Objectives
Following approval of its successful tender the Impact Measurement Centre proposed to Skillnets to re-order and amalgamate its project objectives (see Section 2) into one primary objective and four subsidiary objectives, as follows:

1. Testing of the Kirkpatrick and Phillips Evaluation Models in Irish enterprises
(a) Test the detailed application of the specified models in a small number of companies and/or networks with particular regard to the usefulness and applicability of the models in evaluating the impact of training in companies.
(b) Disseminate the findings and experience of the pilot widely across Irish enterprises and agencies and other bodies involved in the funding of such activities.

2. Support Processes
Identify support processes required by networks and companies to conduct the testing, and test and implement the support processes required.

3. Key Success Factors
Arising from 1 and 2: identify key success factors for the application of impact measurement systems within the workplace, and identify issues and barriers to their adoption by (a) companies, and (b) publicly funded agencies.

4. Tools
In the process of 1 and 2: develop tools and techniques to allow the wider application of impact measurement models (beyond the pilot group) after the completion of the pilot, and identify potential for the use of such tools and techniques among Irish enterprises.

5. Awareness Raising
Develop awareness of the importance, advantages and application of impact measurement aligned to business goals.

4.2. Project Plan

Stage 1 - Identification of the Participants
A call for participation in the project was issued by Skillnets to its constituent networks in mid January 2004. This was followed up by a briefing session hosted by Skillnets and delivered by the consultants for interested networks on 2nd February, attended by representatives of 12 networks. Subsequently, completed applications were received from 9 networks and, following an initial review and report on these applications by the consultants, the Steering Group made a final selection of 20 companies from all 9 networks to participate in the project on 20th February.

Networks Chosen to Participate in the Project
Hospitality Management Skillnet
South East Micro Skillnet
SLM Skillnet
Brewing Skillnet
NETS Skillnet
North Mayo Skillnet
Carlow Kilkenny Training Skillnet
BME Skillnet
Galway Executive Skillnet

Companies Chosen to Participate in the Project
Companies were selected to take account of a good spread of size, sectoral distribution, geographic location and range of training to be evaluated. Of the original 21 companies selected, three withdrew before activity commenced because of over commitment - Kylemore Foods, Bellurgan Precision and TG4 - and one was subsequently accepted - Glanbia Meats - leaving a total of 19 participating companies. However, Eo Teoranta subsequently withdrew after the project commenced and did not commence any evaluation.

Bord na Mona
Braun Oral B
Choice Hotels
Complete Laboratory Systems
Constructive Solutions
Diageo
Glanbia Agribusiness
Glanbia Meats
Heineken
Hilton Hotels
Irish Biscuits
Laepple
Lionbridge Technologies
Masterchefs
Novartis
Pat the Baker
Pauwels-Trafo
Tara Mines

Stage 2 - Development of Objectives
Initially the consultants worked with Skillnets to agree:

> overall detailed objectives
> implementation plan
> final timeline
> reporting relationships
> deliverables

The consultants also made contact with the networks to ensure they were fully briefed on the project. All but one of the network managers participated in the training offered under the project. Some of the network managers attended meetings between the companies and the consultants, and two network managers completed a full evaluation study with one of the participating companies. All of the network managers stayed in close contact to ensure that the needs of the companies were met at all stages of the project.

First Company Visits
The consultants visited all companies during March and completed a standard report. The primary purpose of the visit was to:

1. Establish rapport and begin the process of relationship building with the company, and in particular with the project champion.
2. Introduce the project (a standard presentation was used by all consultants) to key company personnel, with the aim of ensuring that the project is understood by both the senior management and the champion.
3. Identify what training the company would like to have evaluated:
a. if training had already commenced, assess (i) whether a needs assessment was carried out and (ii) what data has been gathered to date at level 1 or 2;
b. if training is about to commence, assess (i) whether a needs assessment has been carried out, (ii) what data is or will be gathered and (iii) whether its completion date allows for evaluation up to level 4-5.
4. Agree with the company what training will be evaluated.
5. Agree with the company the appropriate levels at which the training is to be evaluated (Levels 4 or 5 preferably).
6. Discuss the evaluation instruments to use.
7. Gather the baseline data (a standard Baseline Data sheet was used).
8. Get the company to sign a standard Agreement with Skillnets.
9. Set the scene for the training course on March 30/31 at the Mullingar Park Hotel and ensure buy-in from the champion within the company. Offer a second place on the course if the company would like to take it up and get the name and details of the second person.

10. Answer any questions and ensure general satisfaction with the project on the part of the company.
11. Agree a date for the second company visit in May.

All 19 companies signed the agreement and most companies committed to carry out an evaluation up to Level 5 - full ROI.

Stage 3 - Implementation
Implementation involved five key elements: 1. the public launch of the project; 2. training for company personnel involved in the project; 3. the evaluation process itself (including the use of specific evaluation tools); 4. provision of support services by the consultants; and 5. an evaluation workshop at the end of the main operational period of the project.

1. Public Launch of the Project
The project was launched by the Minister for Labour Affairs, Frank Fahey TD, at a reception in the Mullingar Park Hotel on Thursday 31st March, 2004. The launch was attended by representatives of networks and companies participating in the project and was also addressed by John Dunne on behalf of the Steering Group and Máire Hunt, CEO, Skillnets.

2. Training for Company Personnel
The aim of the training was to develop the knowledge and skills of participants to enable them to implement a complete evaluation of a training programme at all levels 1 through 5, including the ability to conduct a basic ROI analysis.

The training was conducted over two separate two-day sessions at the Mullingar Park Hotel on 30-31st March and 1-2 June 2004. A complete analysis of the training and its impact is contained in Section 8.

3. Evaluation Process
The consultants provided job aids, tools and templates to the participants in the design and development of effective data collection instruments for use in respect of the various evaluation levels. These are included in the Evaluation Process handbook and the Evaluation Workbook produced as part of the project.

4. Provision of Support Services
The consultants provided a hands-on support service to the participants to enable them to effectively implement the evaluation process. Full details of this support service are contained in Section 8.

5. Evaluation Workshop
A half-day evaluation workshop was held at the end of the main operational part of the project, on 5th October 2004 at the Crowne Plaza Hotel, Dublin. This was attended and addressed by Drs Jack and Patti Phillips and included the presentation of six case studies from project participants.

6. Advanced Certification in the Use of the Methodology
A week-long ROI Certification workshop conducted by Drs Jack and Patti Phillips took place in Dublin from 4-8th October 2004. While this was not a part of the pilot project, the Steering Committee decided to provide a grant to cover some of the costs of nine of the project participants to allow them to take part.

7. Dissemination
The first step in the dissemination of the project outcomes was linked to the final evaluation workshop on 5th October 2004. A luncheon was hosted by Skillnets which was attended by 35 representatives from business, state agencies, trade unions, educational and training interests and the press. This group was briefed by Dr Jack Phillips on the ROI concept and given a summary of the project to date. A further dissemination process is being considered by Skillnets for 2005.

4.3. Key Deliverables
(Area of Activity | Proposed Deliverable/Output | Outcome at 30 Nov 04)

Stage 1 - Identification of Participants | Participants selected by mid January 04 | Completed on 20 Feb 04

Stage 2 - Develop Objectives | Objectives at Skillnets level agreed by January 9th; objectives at network level agreed by end January; objectives at company level agreed by end February | Completed on time; completed by end Feb; completed by 29 March 04

Stage 3 - Implementation: Training Course Part 1 | Deliver in mid-March; Training Course Manual | Delivered 30-31 March; completed (attached)

Introduction of Processing Technology | Customised version in place by mid-March; Operating Manual | Assessment complete 10 June; decided not to proceed due to cost and poor initial response from participants

Evaluation Process | Evaluation Process Handbook (including standard forms etc.), first draft mid-March | Completed by 1 June 04 and then incorporated into Evaluation Workbook

Development of Evaluation Tools | Workbook of Evaluation Tools and Processes, end Nov | Evaluation Workbook completed end June

Training Course Part 2 | Deliver in May; revise Training Manual in June | Delivered June 1-2; Training Manual 2, 1 June

Stage 4 - Review and Evaluation (Operational/Impact) | Impact/Process mid-way review; review workshop in June | First and Second Interim Progress Reports presented; review incorporated in Oct workshop

Case Studies Report | Case Study Report in December | Case Studies separate document completed 1st Dec

Evaluation Feedback Workshop | Deliver workshop in December | Delivered 5th Oct

Evaluation Report | Report in December | Incorporated at Section 9 of this Report

Final Report | Report in December | Delivered 1st Dec

Stage 5 - Dissemination | Agree dissemination plan in March; commence dissemination process in April; dissemination conference in 2005 | Recommendations contained at Section 10 of this Report

4.4. Programme Participants

There was almost an even number of male (17) and female (18) participants in the project.

Company | Network | Size of Company | Training Evaluated | Consultant
Irish Biscuits | NETS | 1000 | Technical | S O'Sullivan
Novartis | NETS | 200 | Management | G Doyle
Tara Mines | NETS | 200 | Train Trainers | E O'Grady
Bord na Mona | NETS | 500 | Technical | G Doyle
Constructive Solutions | SE Micro | 13 | Management | S O'Sullivan
Pauwels Trafo | BME | 328 | Management | E O'Grady
Pat the Baker | BME | 450 | Technical | E O'Grady
Lapple | Carlow Kilkenny Skillnet | 385 | H&S | G Doyle
Braun Oral B | Carlow Kilkenny Skillnet | 500 | Technical | G Doyle
Glanbia Ballyragget | Carlow Kilkenny Skillnet | 5000 | Recruitment Tools | S O'Sullivan
Complete Laboratory Solutions | Galway Executive | 20 | Marketing | E O'Grady
Lionbridge Technologies | North Mayo | 110 | Technical | S O'Sullivan
Diageo | Brewing Skillnet | 270 | Train the Trainer | S O'Sullivan
Heineken | Brewing Skillnet | 250 | Competencies Programme | G Doyle
Hilton Dublin | Hospitality Management | 135 | H&S | G Doyle
Masterchefs | Hospitality Management | 35 | Technical | E O'Grady
Choice Hotels | Hospitality Management | 750 | Call Centre | E O'Grady
Glanbia Meats Edenderry | SLM Skillnet | 350 | Technical and Management (2 programmes) | S O'Sullivan

Table 4.3. Breakdown of Participating Companies by County and Sector

Breakdown by County (total 18):
Dublin 5, Cork 2, Carlow 2, Galway 1, Tipperary 1, Cavan 1, Meath 1, Longford 1, Kilkenny 1, Mayo 1, Offaly 2

Breakdown by Sector (total 18):
Manufacturing Food 5, Manufacturing General 6, Brewing 2, Tourism 2, Information Technology 1, Services 2
Part II  The Results

Section 5 - Application and Implementation of the Models

5.1. Validating the Theoretical Foundation of the Models

5.1.1. The Evaluation Process
In designing the content of the training programme and introducing the participants to the models to be tested, the consultants undertook a review of the literature in relation to training evaluation in general.

What emerges from this review is that training evaluation is seen by most training practitioners and HRD managers as the most difficult part of their job. This finding is hardly surprising, since evaluation is poorly defined, having different meanings for different people in many different contexts. There is a strong dependence among training practitioners on the determination of trainee reactions to programmes as a major means of evaluation. Foxon (1989) makes the point that many trainers see the "development and delivery of training as their primary concern, and evaluation something of an afterthought." She suggests that the reliance on post-course reactions results from an inability to deal with quantitative measurement techniques and a lack of finances, time and expertise in comprehensive evaluation. Further, she suggests that training practitioners are confused by the term and do not understand what its "essential features" are nor what "purpose it should serve".

Wigley (1988) defines evaluation as "a data reduction process that involves the collection of large amounts of data which are analysed and synthesised into an overall judgement of worth or merit". The implication here is that the judgement of worth can be supported by the data. What is not clear in any of the definitions offered is what is entailed in the criteria of worth.

It has been suggested that a major problem in arriving at a definition of evaluation is confusion with related terms such as measurement, assessment and validation (Foxon, 1989, p 92). This suggestion is a reasonable one if the literature is any indication of the way the training population perceives evaluation. Only five authors (Wigley, 1988; Brinkerhoff, 1988; Birnbrauer, 1987; Bushnell, 1990; and Phillips, 1997) deal with pre and post programme evaluation techniques utilising a variety of data collection methods.

5.1.2. Evaluation in Practice
Wigley (1988) describes a "production approach" to training in which evaluation activities are seen as being isolated from the training itself. In this approach evaluation is focused on statistics that describe the number of training days per year, the number of courses per year and the number of trainees attending each course, among other things. Whilst these statistics are useful in providing data about how popular the programs offered by the training department are, they have little effect in showing whether the training department is fulfilling any useful purpose for the organisation - unless "bums on seats" is seen as a useful purpose.

The literature from the United States, Canada, and Britain supports the view that the major form of evaluation, in many cases the only form, is end-of-course trainee reactions, and that the data from these evaluations is seldom used. The trainee reaction questionnaire, often referred to as the "smile sheet," is relatively easy to construct and administer when compared to other forms of evaluation, and, if kept simple enough, it is easy to analyse and report on the findings. The data obtained can be useful in determining which trainers, training methods, aids and resources are popular and therefore likely to affect trainee motivation and participation. However, its usefulness is limited in that the data obtained is subjective and gives little or no information about whether the training program contributed to or achieved the goals and objectives of the organisation, the training department, the particular programme or the individual trainees.

Anecdotal evidence from Ireland gives the impression that, for the most part, evaluation is seen as an activity that occurs at the completion of a training course, and that the practice of evaluation is confined to a limited number of activities. None of the companies undertaking this project had previously undertaken a comprehensive approach to evaluation in which various evaluative data had been collected from the outset and continued through to follow-up and on to the final outcome. This is despite the fact that some of them invested significant sums of money in training on an annual basis, tens and hundreds of thousands of euro.

5.1.3. Purpose of Evaluation
The basic purposes of training evaluation according to Brinkerhoff (1988) are:

> to determine that an identified problem represents a training need and to determine what the real goals of the training are;
> to determine the most appropriate training strategy;
> to determine if the chosen strategy is successfully implemented;
> to determine if learning occurred and to what extent;
> to determine usage outcomes (at individual level);
> to determine impacts and worth (at organisational level).

The first two above present even greater challenges than training practitioners have previously tackled, incorporating as they do appraisal of the needs assessment tools and methods to ensure that they are truly exposing the real nature of the problem and the correct solution options. Similarly with determining the most appropriate training strategy, which could rightly be said to be the property of instructional planning or instructional design; yet in choosing the appropriate strategy one or more training strategies need to be considered, tested or analysed to ensure that the correct one is identified.

Phillips (2000) further clarifies and expands these, where he states that evaluation should:

> determine whether or not a programme is accomplishing its objectives
> identify the strengths and weaknesses in a programme
> determine the cost/benefit ratio of a programme
> identify which participants benefited most or least from the programme
> reinforce major points made to the participants
> gather data to assist in marketing future programmes
> determine if the programme was appropriate

5.1.4. Communicating Evaluation Results
Evaluation is an analytical process which can also be diagnostic and predictive. Evaluation involves the collection of subjective and objective data from a number of sources, using a variety of techniques, about a training programme, and the reduction of such data. Evaluation leads to the synthesis of the data into a report containing a summary of results and recommendations, with validated rationales, about the program being evaluated. This last element - the synthesis of a report - is one which has not been adequately addressed in the literature reviewed. Only Phillips (1997) lays emphasis on this point and even provides very detailed formats and procedures by which reporting and the communication of such reports may be implemented.

5.1.5. Learning Measurement in Practice
Kirkpatrick's Four Levels and Phillips' Fifth Level stand out in the literature review as appearing to provide an acceptable form of training measurement. The Phillips ROI Model is also a detailed approach to conducting Impact Studies to help prove or disprove whether a learning intervention was a positive influence on the organisation.

The illustration in Fig 5.1 tries to capture a balanced scorecard of learning metrics that range from the low cost/simple solution to the higher cost/complex solution. Each may be applicable for different needs. Each is explained briefly overleaf.
refer to a comprehensive approach to evaluation, > decide who should participate in future
assessment with collation and analysis of the data and
involving the collection of data from the beginning of programmes
compilation of a report with recommendations as the
programme design through to programme completion
Fig 5.1 Balanced Scorecard of Learning Metrics
Source: Knowledge Advisors, 2003
[The figure ranks the learning metrics - Satisfaction, Learning, Job Impact, Business Results and the ROI Process - in order of increasing value, and maps them against three data collection approaches: Learner Based (post event survey, follow-up survey), Manager Based (post event survey, follow-up survey, manager survey) and Analyst Based (post event survey, follow-up survey, manager survey, testing/certification, Phillips ROI module).]

(a) Learner Based:
A measurement model that captures data from training participants at two distinct points during the learning process. The first point is directly after the learning intervention (Post Event), where the main measurement focus is on Kirkpatrick's Level 1 and Level 2 to gauge satisfaction and learning effectiveness. Because there is a high response rate to these data instruments it is also critical to capture indicators for the advanced levels of learning: Level 3 - Job Impact, Level 4 - Business Results and Level 5 - ROI. These indicators are in effect forecasting or predicting the future impact the training will have on the participant and the organisation.

A second data collection point is a follow-up survey conducted a period of time after the participant has been back on the job. This survey is meant to validate the forecast and predictive indicators of Levels 3, 4 and 5 by gathering more realistic estimates now that the participant is back on the job.

(b) Manager Based:
This method has the same data collection points as the learner-based solution but adds a manager-based dimension. The manager of the participant attending training is another important data point. They can be sent an evaluation instrument timed for when the participant receives a follow-up. The manager survey focuses on Levels 3, 4 and 5 of the Kirkpatrick and Phillips models, therefore getting estimates surrounding job impact, business results and ROI from the manager's perspective. The manager survey also asks 'support' type questions to understand the on-the-job environment where the participant applied the training.

(c) Analyst Based:
This approach uses significantly more comprehensive post event, follow-up and manager surveys and also uses other analytical tactics that go beyond surveying. For example, to analytically measure Level 2 - learning effectiveness - a detailed test is designed and administered to participants. Due to the time commitment of conducting such a detailed data collection and analytical exercise, the Analyst Based approach is only used for about 5% of all training programmes in most organisations. Typically these programmes are the more strategic or visible and have the budget to afford a more costly and time-consuming measurement exercise.

5.2. Implementation Process
Following the first two-day training programme (see Section 8) the participants commenced the evaluation process on the training programme which they had identified during the first company visit by the consultants in March 2004 (see Section 4).

5.2.1. Training Programmes Evaluated
A variety of programmes were selected for evaluation: three had already been completed, thirteen were in the process of delivering the training and three were at the pre-commencement stage (see Table 5.1).

Table 5.1 Stages of Completion of the Training Programmes to be Evaluated

Stage of Completion    Number of Programmes
Completed              3
In Progress            13
In Planning            3

A total of fifteen evaluation studies were completed under the project from the eighteen participating companies (in one case two studies will be completed under Glanbia Meats, one by the HR Manager Peter Donohoe which is currently in progress and the second being an initiative of the network manager, Mick McHugh of SLM Skillnet). All fifteen completed studies will go all the way to a full ROI analysis, which is quite an achievement for the participants concerned.

The remaining four studies will not be completed due to issues unrelated to adoption of the methodology: (a) the participants in one company being absent on maternity leave, (b) serious disruption of the work of the HR department due to industrial relations issues in another company, (c) a takeover of the third company, and (d) pressure of time and organisational changes in the fourth company.

The companies were predominantly large in size, with only two companies classified as SMEs. It is interesting to note that both of the SMEs (Complete Laboratory Solutions in Galway and Constructive Solutions in Tipperary) completed full ROI studies. The issues for SMEs implementing the model are dealt with in Sections 7 and 10.

The evaluation studies cover a broad cross-section of types of training (see Table 5.3), which provided a very solid basis on which to test the models. The results from the 15 case studies provide sufficient information for the purposes of assessing the applicability and usability of the models.

Table 5.2 Anticipated Output of Evaluation Studies by Level of Evaluation

Evaluation Level    Number of Evaluation Studies
1                   15
2                   15
3                   15
4                   15
5                   15
Uncompleted         4

Table 5.3 Type of Training Programme Evaluated

Type of Training Programme Evaluated            Number of Evaluation Studies
Management/Supervisor/Leadership Development    4
Technical Skills                                7
Train The Trainers                              3
Health And Safety                               2
Introduction Of Competencies                    1
Marketing                                       1
Call Centre Training                            1

5.2.2. Extent of Use of Phillips Techniques
The following tables show the number and range of the Phillips techniques used in the evaluation studies.
Baseline Data
Since all but three of the programmes to be evaluated had either been completed or were in progress, it was possible to gather baseline data in only three cases. In the event only Emer Fennel at Masterchefs used a pre-training self-assessment form to establish baseline data. Overall the lack of baseline data presented significant challenges to completing effective evaluation studies in almost every case. However, a number of creative and valuable solutions were arrived at by the participants to overcome this gap, as evidenced in the Case Studies.

Level 1 and 2
At levels 1 and 2 there was universal use of questionnaires to determine reaction and learning. The sample questionnaires in the Evaluation Process Handbook prepared for the project were widely used.

Level 3
At level 3, questionnaires with participants were augmented by the use of questionnaires with supervisors and managers, the use of observation by supervisors/managers in 4 cases, the use of focus groups in 4 cases, the use of tests in 4 cases and the use of a mystery client by Choice Hotels. In fact Choice Hotels (Michelle Whelehan, Jennifer Ryan, and Triona Brangan) also developed a scenario/case study that the training participants analysed prior to and post training to strengthen their evaluation at levels 3 and 4.

Level 4
In Table 5.4 we can see that the companies relied heavily on the most credible source of data for collection of business impact measures at level 4 - performance records. Action planning was only a feasible option in three studies, those that were at the pre-planning stage, and one opted for this approach. Nine studies utilised follow-up questionnaires with participants and managers to identify business impact. These include all those where the training was already completed and some where the training was already in progress. Diageo (Brendan Farrell and Declan Harrison), Lionbridge (Jimmy Nolan) and Novartis (Willie Herlihy) all used composite questionnaires to capture L1, L2 and L3 data, while Willie Herlihy also used the same questionnaire to gather Level 4 data for his study on a training programme that was already completed.

Table 5.4 Data Collection Methods at Level 4

Data Collection Instruments                       Number Of Times Used In 15 Studies
Performance Monitoring Records                    15
Action Plans Of Trainees                          1
Performance Contracting                           -
Assignments Related To The Training               6
Follow-Up Questionnaires/Surveys                  9
Observations On The Job                           4
Follow-Up Interviews With End Users/Performers    3
Follow-Up Focus Groups                            3
Follow-Up Group Session                           2

In terms of the usability of the data collection methods applied, the general feedback received is detailed in Table 5.5. Of particular interest is the positive feedback from the three companies that used focus groups. This was one of the techniques given special attention by means of a mini-workshop conducted by Eoghan O'Grady during the training course conducted as part of the pilot project.

Table 5.5 Usability of Data Collection Methods

Data Collection Instruments                       Feedback
Performance Monitoring Records                    Best Way, Always Use If Possible
Action Plans Of Trainees                          Great Way To Do It, Ideal For Action Planning
Follow-Up Questionnaires/Surveys                  Good When You Use The Right Format, Always Cross-Check With Managers
Observations On The Job                           OK If You Have No Other Way!
Follow-Up Interviews With End Users/Performers    Time Consuming
Follow-Up Focus Groups                            Very Effective
Follow-Up Group Session                           Can Be Expensive, Link It To Some Follow-Up Training For Best Effect

Isolating the Effects of the Training (see also 5.4.1 below)
Not surprisingly, ten of the 15 studies used estimates from performers/supervisors as a means of isolating the impact of the training and twelve used managers' estimates. These were collected primarily via questionnaire, though at least 4 companies used focus groups of managers instead of issuing them with questionnaires. In only two cases was it possible to use the preferred method of a control group, while only one company (Laepple) was able to use a trend line analysis (see Table 5.6).

What is especially encouraging, and essential for a valid outcome, is that none of the 15 studies failed to isolate for the effects of the training and none used less than one technique for that purpose. In fact eight applied two techniques, five applied three techniques and two applied four techniques to isolate. This ensures that the ROI analysis in each of these cases is quite robust and also provides us with an excellent trial of the models.

Table 5.6 Techniques used to Isolate the Effects of Training from Other Factors

Isolate The Effects Of The Training               Number Of Times Used
Use Of Control Groups                             2
Trend Line Analysis                               1
Forecasting Methods                               -
Performers Estimate Of Impact                     11
Supervisors Estimate Of Impact                    10
Managements Estimate Of Impact                    13
Subordinates Report Of Other Factors              -
Use Of Experts/Previous Studies                   1
Calculate/Estimate The Impact Of Other Factors    -
Customer Input                                    1

When we examine feedback from participants concerning the usability of the techniques, we see that those recommended by Phillips as the first and best to use are also considered highly effective by the participants - control groups and trend line analysis. However, these can only be used in a limited number of cases where the right conditions prevail. In particular, the opportunity to use control groups may be hindered in the Irish context by the relative size of most enterprises, including those considered to be 'large' by Irish standards. This left most of the studies relying on estimates of performers, supervisors and managers. All applied the statistical error range to these estimates recommended by Phillips, which appears to have alleviated the concern expressed by many about the risk of relying solely on estimates from performers in particular. In every case where performers' estimates were used these were checked against estimates from supervisors and/or managers, which appears to have achieved an acceptable result. One point that caused some surprise was the almost universal experience of trainees rating their performance levels (in anonymous questionnaire-based estimates) lower than either their supervisors or managers.
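To make the estimation mechanics concrete, the sketch below shows - with purely hypothetical figures, not data from any of the pilot studies - how a performer's, supervisor's and manager's estimates of the training's contribution to an improvement can each be discounted by the estimator's stated confidence, and how the most conservative adjusted value is then carried forward, in line with the error-adjustment and "most conservative result" practices referred to above.

    # Illustrative only: hypothetical estimates, not figures from the pilot studies.
    # Each estimator states the share of an observed improvement they attribute to
    # the training, plus their confidence in that estimate (the error adjustment).

    observed_improvement = 20000.0  # hypothetical annual value of the improvement (euro)

    estimates = {
        "performer":  {"attributed_to_training": 0.40, "confidence": 0.75},
        "supervisor": {"attributed_to_training": 0.35, "confidence": 0.80},
        "manager":    {"attributed_to_training": 0.50, "confidence": 0.60},
    }

    adjusted = {
        source: e["attributed_to_training"] * e["confidence"]   # e.g. 40% x 75% = 30%
        for source, e in estimates.items()
    }

    # Most conservative (lowest) adjusted percentage is used
    conservative_share = min(adjusted.values())
    benefit_attributed_to_training = observed_improvement * conservative_share

    for source, share in adjusted.items():
        print(f"{source:<10} adjusted contribution: {share:.0%}")
    print(f"Most conservative share: {conservative_share:.0%}")
    print(f"Benefit attributed to training: EUR {benefit_attributed_to_training:,.0f}")

In this invented example the supervisor's adjusted estimate (28%) is the most conservative, so only that share of the improvement would be credited to the training - which also illustrates why cross-checking performer estimates against supervisors and managers reassured participants about the credibility of the approach.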
Table 5.7 Usability of Techniques used to Isolate the Effects of Training from Other Factors

Isolate The Effects Of The Training    Feedback On Usability
Use Of Control Groups                  Using Different Shifts Is Very Good. A Good Way To Avoid Cross-Contamination.
Trend Line Analysis                    Simple To Use When There Are No Other Factors Affecting The Outcome.
Performers Estimate Of Impact          Easy To Use. Concerned About Subjectivity Of Results But Interested To See How Close They Come To Managers' Estimates. Like The Idea Of Applying The Error Technique. Surprised That Trainees Rated Themselves Lower Than Their Managers.
Supervisors Estimate Of Impact         Hard To Get Them To Fill Them In.
Managements Estimate Of Impact         Using A Focus Group Is A Great Way To Get These Estimates. Managers Very Interested In This Approach.
Customer Input                         Excellent But Expensive.

Converting Data to Monetary Values
The preferred method of converting level 4 data to monetary value was converting output to either marginal profit or, in most cases, to savings. Converting savings in employees' time was also widely used. Most participants experienced considerable difficulties establishing standard values for units of data from within their own company structures. In some cases internal HR or accounting departments held standard values for employee time or general units of production, but when participants sought to identify standard values for smaller items or departmental outputs they encountered problems. In no case did a participant work to establish a standard value where none existed. Instead, as they usually had a number of data items that could be converted, they relied on one for which data was readily available.

Table 5.8 Techniques used to Convert Data to Monetary Value

Converting Data To Monetary Value                           Number Of Times Used
Converting Output Data To Contribution (Profit/Savings)     14
Calculating The Cost Of Quality                             -
Converting Employee's Time                                  10
Using Historical Costs/Savings Data                         1
Using Internal And External Experts                         2
Using Data Values From External Databases                   -
End User/Performer Estimates                                -
Supervisors' And Managers' Estimates                        -
Using HRD Staff Estimates                                   -

Applying Standards
The consultants were rigorous in the application of Phillips' Ten Guiding Principles in all aspects of the pilot. This was most helpful in enabling a standard approach across all evaluation studies, providing a set of clear and standard answers to problems or issues, allowing the evaluation studies to be compared with best international practice, and providing a logical and credible framework in which to anchor the practice of the pilot project.

Tabulating Programme Costs
All of the studies were able to produce fully-loaded costs of the training programmes being evaluated without much difficulty. This was facilitated by an intensive mini-workshop delivered as part of the project training by Sean O'Sullivan. Costs can be categorised into three types: one-time costs (such as needs analysis and design); costs per training event (such as facility rental and the instructor's salary); and costs per participant (such as meals, notebooks and coffee breaks). Fixed costs, such as needs analysis and training design, should be spread over the life of the training programme. The trickier cost calculation procedures, which created some challenges for participants during the programme, involved assessing the impacts of the rate of obsolescence of the skill; employee turnover; and the opportunity costs of trainers' and trainees' time (ideally including fringe benefits and any other charges on payroll).

Identifying Intangible Benefits
The identification of intangible benefits of the training being evaluated appears to have presented no difficulty for the participants. In most cases this was picked up at level 4, usually by means of a question to the trainee or their superiors.

5.3. Summary of Support Services Provided
Impact Measurement Centre delivered a comprehensive support service to the local networks and the participating companies throughout the project. The primary focus of this support was to ensure effective implementation of the project and to build in-company capacity to conduct evaluations up to full ROI level if desired. This support included:

Training
The training course as described in Section 8 was delivered to representatives of all participating companies and networks. A total of 1,424 training hours were delivered to 42 trainees under the project.

Coaching
On-site and off-site (telephone, e-mail) coaching support was provided to designated company representatives throughout the project on either (a) an on-call basis depending on the needs of the participants, or (b) a scheduled basis with some participants where this was deemed necessary by the consultant and the participant. The amount of such coaching varied depending on the managerial level of the participant, the type of programme being evaluated and the relative work pressure in the company concerned.

In-Company Consultancy
This took the form of three scheduled visits to the companies.

Visit 1: At the objective-setting stage (February-March 04), which included assisting the company with the items specified above in Section 4 and in particular assisting the company representative(s) to:

> Brief company personnel on the project;
> Undertake a needs assessment (where a comprehensive needs assessment has not been carried out);
> Identify appropriate training solutions;
> Identify variables to be influenced by the solutions;
> Develop and record/gather baseline data;
> Determine the Levels at which solutions are to be evaluated (Levels 1 to 5);
> Determine evaluation instruments to use;
> Determine data sources to use.

Visit 2: Carried out approximately 1-2 months following the initial training course (April-May 04) to:

> Follow up on the training course;
> Check that data collection procedures are in place;
> Support the company around training delivery;
> Coach the company representative around key aspects of the methodology;
> Gather specific information;
> Brief personnel at all levels regarding the project.

Visit 3: Carried out approximately 6 months after the initial training course (Sept 2004) to:

> Support the company around data analysis and ROI calculations;
> Support in-company initial evaluation of the project;
> Gather specific information.

Peer Support
IMC tried to ensure that good peer-to-peer learning took place during the training course and facilitated effective networking between the participants at all events and even on a one-to-one basis.

eNewsletter
An eNewsletter was produced three times during the project by IMC for all network participants. This was designed to keep them up to date on developments in the project, provide simple short back-up articles on key topics and encourage peer interaction with stories and best practice examples from across the project.
Web
IMC assisted Skillnets to set up a special web page on the ROI project as part of the Skillnets website. This was aimed at providing general information to non-participants in the Skillnets networks. The proposal to provide a special members' area for project participants was not implemented for technical reasons.

Technology Support
IMC placed importance on testing appropriate web-based technology for processing evaluation data (see 5.4.10 below). However, there was a general reluctance among the participating companies to go down this road at this stage and consequently, apart from providing information and briefings on what was available, no further action was taken in this regard.

Evaluation Tools and Support Processes Workbook
IMC has prepared an Evaluation Process Handbook and an Evaluation Workbook containing all of the various tools and job aids used to facilitate the participants in implementing the models. These are available as two separate documents.

5.4. Key Issues Arising from the Application of the Models

5.4.1. Challenges of the ROI Process
The applicability, usability and benefits of the Phillips ROI process model are confirmed by this pilot project and the response of the participants (see Section 9). However, in conducting this pilot the consultants were keen to identify any shortcomings in the model and be satisfied that it bore no fundamental flaws which would render it inoperable in an Irish context. No such flaws were discovered, but the following issues arose from the pilot:

> Ease of Use: Levels 1-3 are easy to use, Level 4 presents some challenges and Level 5 is difficult in the absence of impact and monetary data;
> Staff resources and time required to implement a full ROI can be considerable;
> Skills and knowledge needed require a commitment to at least 3 days training;
> Absence of baseline data and inadequate needs analysis militates against an effective ROI process;
> Linkage between training goals and strategic business objectives is often absent in Irish companies.

One area which the consultants and participants alike found difficult was trying to obtain accurate, unbiased information from stakeholders, and others providing data, with regard to the isolation of the effects of training from instruments like questionnaires.

Isolating the Effects of Training
Phillips' ROI model is quite clear in establishing a preference, with regard to isolation, for the use of (a) control groups, (b) trend line analysis, and (c) forecasting methods, in that order. The use of control groups in quasi-experimental designs is widely regarded as the most effective way to isolate the effects of training, yet it is very difficult to identify equivalent groups, or groups that may be appropriate for comparison. Also, contamination can occur if the trained group members interact with control group members and transfer what they are learning, thereby undermining the experiment. A successful study requires full cooperation from managers and workers, yet if the effort appears too research-oriented or academic, it may be resisted. Trend line analysis, which projects outputs and compares them with actual data after training, can only feasibly be used when no change is caused by other influences, and hence it is not very often possible to utilise. Similarly, a forecasting model (more analytical than trend line analysis, using least squares statistical methods) can estimate training effects by comparing forecasted values without training with actual data.

It is only when it is not possible, practical or cost effective to use these techniques that evaluators may, according to Phillips, use subjective estimation from trainees, supervisors and/or managers. This method involves estimation of the percentage improvement due to training by participants, supervisors, senior managers or experts (subordinates in the case of training for supervisors and managers). This is adjusted by a statistically recognised process of the estimator stating their level of confidence in the estimate (this percentage is then multiplied by the estimate to calculate an adjusted percentage). Estimators may also start by identifying other influences first (followed by error adjustment), and the remaining percentage is what is attributed to the impact of the training as a residual factor.

During the training course provided under this pilot, this process caused some concern among a minority of the participants. This was alleviated to some degree by the balancing techniques which Phillips provides, primarily his technique for applying a statistical error adjustment to discount the estimates. Phillips' Guiding Principles also specify that the most conservative result must always be used when analysing such data.

What has emerged from the use of this method in practice (12 of the fourteen studies used estimates in one form or another) is that this process is in no way different from most forms of economic analysis, which rely on necessary assumptions. In fact it was discovered that managers were quite comfortable with the process since they spend much of their time working with and making estimates. Neither had they any difficulty in accepting estimates from trainees and supervisors as being a credible source.

It had been expected that trainees would over-rate their own performance improvement and over-rate the impact of the training on that improvement. In fact the opposite was the case, and supervisors seem more likely to over-rate than do their subordinates.

The pilot project shows that estimation is workable, credible and in many cases the only way to get to an ROI since, according to Phillips, failure to use at least one method to isolate the effects of the training from other factors renders any ROI calculation invalid. The pilot shows that when using estimation it is important to be clear and explicit about what assumptions were made and how they were derived. What is crucial to business managers is how well-reasoned and credible the assumptions are. In this regard, it is best to make use of the practical wisdom of the individuals closest to the work. Another good general practice is to be conservative in making assumptions about benefits and thorough in accounting for costs. The use of estimates from a variety of sources (trainees, supervisors, HR staff, senior managers, etc.) to create a complete picture is also desirable. The use of statistical error adjustment is a very useful additional measure.

5.4.2. How Much Evaluation Is Enough?
One of the key issues that arose in the early stages of the project was how far each company should go in their evaluation study - all the way to level 5 or somewhere in between. This created a good deal of dialogue between the companies and the consultants and forced the companies to consider a number of issues around the evaluation process. While the objectives of the pilot project tended to drive most of the evaluations all the way to level 5, it was clear from this dialogue that the proper level of post-training evaluation differs from situation to situation, and even evaluation at level 1 alone is a valid outcome if that is what the company intended from the outset. This is a key learning outcome of this project.

The Kirkpatrick/Phillips model quantifies five levels of training evaluation, moving from the simplest and least illuminating to the most complex and in-depth. The first level in the model involves collection and assessment of training participants' reactions using evaluation or feedback forms. Everyone who has ever been to a seminar or training course has been asked to fill one of these out at the close of the session. Usually these short questionnaires focus on the delivery of the training programme and on how well participants liked the course.

The second level tells us about the actual learning that took place during the training. This means that, in most cases, participants must take some form of knowledge quiz both before and after training. These can take the form of simulations, skill practice exercises or projects, the outcomes of which provide data on the quality of learning that has occurred. If the training is designed properly, much of this data can be collected during the training event itself, which reduces the administrative burden and costs.

Frequently, training is just one component of an overall change management process designed to modify behaviours. Level three training evaluations help us know if participants actually apply the knowledge and skills they learned. Usually performed 60 to 90 days after training, this type of evaluation requires follow-up visits and observations at the job site, or surveys to the trainees and their supervisors.
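Levels 4 and 5, which the following paragraphs describe as the hardest to reach, ultimately come down to a short calculation once business impact has been expressed in money and the programme costs have been fully loaded. The sketch below illustrates that end-point with entirely hypothetical figures - not data from the pilot studies - using the benefit/cost ratio and the ROI percentage as the two Level 5 summary measures.

    # Illustrative Level 5 arithmetic with hypothetical figures (not pilot data).

    # Monetary benefits already isolated and converted (e.g. output gains, time savings)
    annual_benefits = 5600.0 + 12000.0          # euro

    # Fully-loaded programme costs
    one_time_costs = 4000.0                     # needs analysis, design (spread over programme life)
    cost_per_event = 2500.0                     # facility, instructor
    cost_per_participant = 150.0                # meals, materials
    participants = 20
    total_costs = one_time_costs + cost_per_event + cost_per_participant * participants

    bcr = annual_benefits / total_costs                                  # benefit/cost ratio
    roi_percent = (annual_benefits - total_costs) / total_costs * 100    # ROI (%)

    print(f"Total benefits: EUR {annual_benefits:,.0f}")
    print(f"Fully-loaded costs: EUR {total_costs:,.0f}")
    print(f"BCR: {bcr:.2f}  ROI: {roi_percent:.0f}%")

A BCR above 1 (equivalently, an ROI above 0%) indicates that the measured, training-attributable benefits exceed the fully-loaded costs; whether that margin is judged sufficient remains a decision for the company concerned.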
According to Kirkpatrick, if the end result is to change attitudes, improve skill levels or increase knowledge, then only levels one (reaction) and two (learning) need be performed. However, if one wishes to use the training to change behaviour, then he recommends the use of level three (behaviour) and possibly level four (results).

While measuring attitudinal or behavioural changes can be challenging, by far the most difficult aspect of training effectiveness to measure is return on investment (ROI), levels 4 and 5. ROI calculations attempt to draw a causal connection between business impact and the actual financial indicators of the company.

Depending on which variables one decides to measure - improvements in work output, sales turnaround, cost savings, increases in sales, quality and so on - evaluating ROI can be time consuming, expensive and sometimes ambiguous. Statistically nailing down the notion that a specific training event "caused" a specific sales result requires the use of Phillips' techniques for isolating the effect of the training from other factors. What Phillips also makes quite clear in his Guiding Principles is that if an attempt is to be made to calculate an ROI then data must be collected at the lower levels, though in such cases the previous levels of evaluation do not have to be comprehensive.

On the other end of the spectrum, by far the easiest feedback to retrieve is the trainee's reaction to the training. Did trainees like the course, the instructor, class material, methodology, content and the training environment? Were the objectives met and was the training useful to them? These questions can be answered with a simple questionnaire. Unfortunately, the answers don't give much insight into questions such as "Will this training change the way you do your job for the better?"

What is crucial then is that companies need to be sufficiently well informed to be able to make the right choices about which level to evaluate at. There is no right or wrong answer. Very few programmes will be evaluated at level 5, but a lot (if not all) can easily be evaluated at levels 1 and 2. This pilot shows that it takes much more information, education and real practice than would otherwise have been thought necessary for the company decision makers to achieve this level of informed decision-making capacity in relation to this subject.

5.4.3. Evaluation is Not an Add-on
Among the most common mistakes made by trainers is to think of training evaluation as something one "adds on at the end." Evaluation should be planned and integrated into the process from the beginning. This project shows that it is possible to conduct evaluations after the event, but it would be completely misreading the outcomes if this was to be taken as a licence to drop evaluation from training programme planning. What we have clearly learned is that the absence of baseline data, for instance, makes it extremely difficult to get a result at levels 4 and 5 at all, and even when one is possible it affects the credibility of that result.

It is also clear, however, that some misconceptions exist about what constitutes baseline or benchmark data and this has inhibited trainers from addressing the issue heretofore. Put simply, if a group of trainees has some knowledge of the software they are being trained to use, it is critical to establish a baseline of understanding prior to the training event. This requirement can be simply fulfilled by a quiz or questionnaire filled out prior to the event by participants. The same tool can be used very effectively for soft skills training or training in leadership and managerial skills, as evidenced in this project by Novartis and Masterchefs.

The results of this pre-event benchmarking give trainers two important tools: a means to evaluate the effects of training and an idea of what level of knowledge the curriculum itself should assume.

On the other side of the coin, too much money can be spent unnecessarily on pre-event benchmarking. This occurs in circumstances where employees are going to be taught something completely new - where we can safely assume existing knowledge levels among students are zero. In such a circumstance, only a post-test need be given to measure learning.

5.4.4. Core Issues Around Evaluation Planning
Deciding beforehand what data to collect and analyse is very important. As Mark Twain once said, "Collecting data is like collecting garbage, you must know in advance what you are going to do with the stuff before you collect it."

Prior to embarking on any level of evaluation, the training staff and management should ask themselves a simple question: what is our intent in collecting this data? Even if the only data collected are participant reactions, and there is no intent to analyse, communicate and use the data collected, then the entire effort is for naught. The organisation will benefit more from conserving valuable resources than it will by using resources on data collection design and implementation and doing nothing with the data.

5.4.5. Non-transferability of ROI Results
Just because a training programme succeeded in one company does not necessarily mean that it will work at another; ROI measures too many factors that are unique to the company. Converting benefits to monetary value is a key element in any ROI calculation and is almost always dependent on value as determined by the unique operating systems, cost structure, financial data and market position of the company.

5.4.6. When is a Company Ready for ROI?
ROI is not for every training programme nor indeed for every company. Because of the resources required for the process, most training programmes do not include ROI calculations. Therefore, organisations must determine the appropriate level of evaluation. There is no prescribed formula, and the number of ROI impact studies depends on many variables, including:

> staff expertise on evaluation,
> the nature and type of HRD programmes,
> resources that can be allocated to the process,
> the support from management for training and development,
> the company's commitment to measurement and evaluation, and
> pressure from others to show ROI calculations.

Other variables specific to the company may enter the process. Using a practical approach, most companies settle on evaluating one or two of their most popular programmes. This pilot bears out the advice that, for companies implementing the ROI concept for the first time, only one or two training programmes be selected for an initial calculation, as a learning process.

5.4.7. ROI in Small and Medium Enterprises
This pilot shows that SMEs are perfectly capable, given the right training and support, of implementing evaluation of training right up to level 5. Complete Laboratory Solutions from Rosmuc in Co. Galway and Constructive Solutions from Clogheen in Co. Tipperary completed two excellent ROI studies that may be seen in the book of Case Studies accompanying this report. However, there is no doubting the challenge that the ROI process presents for SMEs with limited staff resources. Successful implementation of the model in an SME requires more time and attention from a more senior level of staff than would be required in a large enterprise. Undoubtedly ROI is not for everyone or every company. Some companies, large and small, may lack the trust and supportive climate that ROI requires. No matter what the size of the firm, the key ingredient is a committed champion who must be willing to learn, change and try new things - using ROI as a process improvement tool. Without this attitude and approach, it may be best not to try.

5.4.8. Setting Evaluation Targets
The pilot pointed up an important misconception among training practitioners that adoption of the ROI model requires a company to evaluate almost all programmes to the highest possible level. For many companies who now barely even evaluate at level 1 this is a truly daunting and off-putting prospect. The importance of establishing evaluation targets cannot therefore be over-stated. Targets allow companies to manage the complexity of moving up the chain of evaluation levels.

A target for an evaluation level is the percentage of training programmes measured at that level. Repeat sessions of the same programme are counted in the total. For example, at Level 1, where it is easy to measure reaction, companies achieve a high level of activity, with many companies requiring 100% evaluation. In these situations, a generic questionnaire is administered at the end of each programme.
Level 2, Learning, is another relatively easy area to measure and the target is high, usually in the 50-70% range. This target depends on the company, based on the nature and type of programmes. At Level 3, Application, the percentage drops because of the time and expense of conducting follow-up evaluations. Targets in the range of 25-35% are common. Targets for Level 4, Business Impact, and Level 5, ROI, are relatively small, reflecting the challenge of comprehending any new process. Common targets are 10% for Level 4 and 5% for Level 5. An example of evaluation targets recommended by Phillips is shown in Table 5.9. In this example, half of Level 4 evaluations are taken to Level 5, the ROI.

Establishing evaluation targets has two major advantages. First, the process provides objectives for the training staff to clearly measure accountability progress for all programmes or any segment of the training process. Second, adopting targets also focuses more attention on the accountability process, communicating a strong message about the extent of commitment to measurement and evaluation.

Table 5.9. Evaluation Targets

Level                                 Percent of Courses
Level 1 Participant Satisfaction      100%
Level 2 Learning                      60%
Level 3 Applications (Behaviour)      30%
Level 4 Results                       10%
Level 5 Return on Investment          5%

5.4.9. Communication Process Model
If there was one key shortcoming in the training and support offered under this pilot project, it is to be found in the under-emphasis on a communication process model for ensuring effective communication both during and after the process. Participants who conducted good studies failed to get senior management understanding and further buy-in for the ROI process because of a lack of attention to effective communication of the results. Phillips provides comprehensive guidelines in this area but, primarily due to time constraints, this matter was not thoroughly dealt with during the training course nor adequately followed up in the consultant site visits.

5.4.10. Use of Software to Automate the Process
Part of the pilot project involved identifying and, if appropriate, introducing software for automating all or part of the evaluation/ROI process. Information was obtained on Knowledge Advisors' (Metrics that Matter) web-based software, which is approved by both Kirkpatrick and Phillips. This allows companies to streamline the learning/evaluation process, gain access to real-time data on learning and performance, obtain industry benchmark comparisons, use actionable intelligence (the ability to quickly gauge how effective learning is, and make decisions accordingly), implement needs assessment and testing tools, automatically analyse skill gaps, pinpoint improvement opportunities, identify alignment to best practices, and use ROI calculators and forecasting of ROI. See a demo at www.knowledgeadvisors.com. Participating companies were briefed on the software but to date none have opted to take it, primarily it seems because of the high buy-in cost. It may be possible to negotiate a special group deal (for 10 or more companies) which allows the use of a set of baseline tools for one year at a significantly reduced cost, and this will be further pursued.

Section 6
Impact of the Models

6.1. Expected Benefits to the Companies
The following benefits are expected for the companies implementing the models:

Improved Capability
Development of evaluation capability, in terms both of the process of evaluation and of understanding the 5 levels of evaluation, as well as data collection needs and methods. Equally important to the measured impact of this programme are the processes used in measuring it. Impact of training and learning needs to be embedded over time in other Glanbia Meats processes and strategies.

Establishing Priorities
Calculating ROI in different areas helps companies to determine which programmes contribute the most, allowing priorities to be established for high impact learning. Successful programmes can be expanded into other areas - if the same need is there - ahead of other programmes. Inefficient programmes can be re-designed and re-deployed. Ineffective programmes may be discontinued.

Focus on Results
The ROI methodology is a results-based process that brings a focus on results with all programmes, even for those not targeted for an ROI calculation. The process requires designers, trainers, participants and the HR/training staff to concentrate on measurable objectives: what the programme is attempting to accomplish - which has the added benefit of improving the effectiveness of all learning and development programmes.

Alter Management Perceptions of Training
The ROI methodology, when applied consistently and comprehensively, can convince management that learning is an investment and not an expense. Managers begin to see training as making a viable contribution to their objectives, thus increasing the respect for the function. This is an important step in building a partnership with management and increasing management support for training.

Value of Human Capital
The ROI model enhances understanding of human capital at all levels in the firm as both a significant factor of productivity growth and a critical variable in the process of technological change. The main benefit of participating in this project was the chance to confirm cross-skilling's positive contribution to the business.

Align Programmes to Needs
Such ROI calculations align programmes to organisational needs (Novartis and Diageo), show the contributions of selected programmes, and earn the respect of senior management and administrators.

Build Staff Morale
Showing the results of training in clear and tangible ways can build staff morale, justify or defend budgets, improve support for training development and enhance design and implementation processes.

Reduce Inefficiency
They can also identify inefficient programmes that need to be redesigned or eliminated and identify successful programmes that can be implemented in other areas (Bord na Mona).

Broader Impact
Companies that dovetailed action planning with the evaluation process realised added benefits from both, as both appear to feed off each other - knowledge by the participant that the action plan forms part of the evaluation seems to assist in a better outcome of the plan, which in turn strengthens the evaluation.

General Benefits
> Help training personnel ensure that courses are working.
> Help operations people identify barriers that are preventing skills from being applied.
> Help the company be confident that the hours employees spend in training and the money that it invests in its people are time and money well spent.
> Increased ability to manage the day-to-day activity of projects, leading in turn to increased productivity.
> Less supervision needed from, for example, Project Managers, leading to increased planning effectiveness from Project Managers.
> Increased interaction and communication with peers.
> More motivated and valuable employees, leading to a better working environment.

6.2. Expected Benefits to the Participants
The following benefits are expected for the participants implementing the models:

> Increase in their skill level and ability to handle management situations, resulting in more opportunities for the participants.
> Ability to use proven tools, strategies and techniques for successfully measuring the results of training.
> Provide a process for collaboratively working with key members in their company to increase their support and help ensure that training adds value to the business.
> Ability to confidently present the benefits and challenges of each measurement level or technique and provide creative ways for improving their success. Berna Ward (Braun/Oral B) made a presentation on ROI to her senior management team - and interestingly most of the discussion centred on the non-financial or intangible benefits of training. Michael Ryan of Laepple has been invited to explain ROI to a meeting of the Carlow Chamber of Commerce.
> Earn respect of senior management. Developing the ROI information is one of the best ways to earn the respect of the senior management team. Senior executives have a never-ending desire to see ROI. They appreciate the efforts to connect training to business impact and show the actual monetary value. It makes them feel comfortable with the process and makes their decisions much easier. Management who often support, approve or initiate training see the ROI as a breath of fresh air, as they actually see the value of the training in terms they understand and appreciate.

These key benefits, inherent with almost any type of impact evaluation process, make the ROI process an attractive challenge for the human resource development function.

Intangible Benefits for the Participants
Apart from the monetary benefits, there have been significant intangible benefits in this programme that make it stand out as a positive investment:

> Enhanced opportunity for different types of work across departmental divides within the company.
> Understanding the complexities of analysing problems and making management decisions, resulting in less support time being required from senior managers.
> Increase in their performance appraisal rankings.

Participants Feedback on Action
In feedback to the consultants the participants identified that:

> The post-training evaluation helped us identify where our training could be improved.
> It helped us identify areas that people didn't understand or didn't find of value or benefit.
> We found out that some of the teams were not using the skills taught in the training, and it helped us understand why.
> We discovered that people wanted ongoing training, rather than one-off courses.
> The evaluation results not only helped us improve the training, but also gave us an understanding of the system change we needed to make the whole thing work.
> Identify more specific evaluation instruments for each T & D activity; focus on business impact in greater detail and with more clarity.
> Discuss with my company the ROI process, develop an overall evaluation programme, bring key personnel on side within the company.
> Clearly identify each level and its criteria, get buy-in of champion and colleagues, identify a pilot group to commence/test.
> Take a more systematic approach to training, place more emphasis on data collection, involve people more in the planning of programmes.
> Set more objectives for T & D; evaluate at least to level 3, evaluate some to level 5.
> Evaluate all company training to level 3, evaluate appropriate training to level 4 and level 5, make my fellow senior managers aware of ROI.
> Carry out all levels of evaluation and not just milestones; follow through on training.
> Evaluate all training, ensure everyone on courses completes smile sheets, observe performance afterwards, use questionnaires to assess the output.
> Complete data collection, implement ROI in my company.
> Continue ROI on an ongoing basis; make it part of company development and training practice.
> Design and use a pre-course skills form as baseline data, systematic issue of L1/L2/L3 at monthly intervals, write an evaluation plan.
> Redesign the organisation's 'happy sheet', explain before evaluations what the purpose is, look at business impact as opposed to training need.
> Revisit feedback (smile) sheets.
> I will do before-and-after analysis at my own courses, like this a lot, plan training and costings better, will evaluate at all levels.
> Revisit the evaluation process for every course at levels 1 and 2, start level 3 where possible, get senior management on board - a high level of support is needed, have resources assigned to the project, evaluate just one programme not two as planned, try not to get too carried away on the detail/paperwork.
> I will discuss the selected training programme with participants in advance of their taking the training course; I will discuss the need for support from senior management.
> Will apply to current evaluation assignments, will modify evaluation plans, will carry out a level 5 report on at least one training programme.
> Decide on a preference, inform my manager, explain the need for commitment.
> Talk to management, set our action plan, get buy-in from relevant departments.
> More focus planned on the impact training has on business.
> Will test out this process on a huge development programme.
> Complete pre-skills assessments before courses, questionnaires after training, monitoring business impact.
> Carry out evaluation at all levels.
> Develop evaluation procedures for all training programmes, commence collecting data for all.
> Carry out levels 1, 2 and 3 for all training; before embarking on training initiatives set objectives, involve trainees and managers in the levels, communicate ROI to colleagues.
> Put more analysis on costs and returns on training, introduce evaluation forms.
> ROI plan for selected training programmes.
> Set specific targets for training programmes, gain baseline data before the programme.
> I would now feel comfortable, under the right circumstances, having an effective system to collect and assess performance data both prior to and post training.
> I feel equipped to carry out ROI calculations on training programmes implemented within my own and the North Mayo Skillnet remit.
> The benefits to both Lionbridge and the North Mayo Skillnet are a greater understanding of the ROI process, the importance of the integrity of the data within this process and the complexity of collecting the data at the relevant times. I would also be in a position to define objectives and measurement metrics and build systems which would allow for the collection of baseline data based on the type of course or programme to be evaluated.

6.3. Expected Benefits for Skillnets
Skillnets has greatly increased its standing as a body engaged in the promotion and support of business-led training in Ireland by piloting these models on such a scale. The ROI process has been tested before in other settings in Ireland, most notably by Enterprise Ireland, but there is no published information on these studies. Thus, this broad-based and comprehensive study is the first of its kind in Ireland.

At the conclusion of this study Skillnets will have a number of resources which it can use and share regarding the Evaluation/ROI models, including:

> A Case Book of 14 case studies of implementing ROI in Irish companies.
> A report setting out the key barriers and enablers to the adoption of this model among Irish companies.
> An Evaluation Process Handbook and the Evaluation Workbook with tools, techniques, guidelines, job aids and checklists which illuminate the steps needed to ensure successful implementation.
> A body of knowledge and best practice with regard to ROI unequalled in Ireland, the UK or most of Europe.
> A group of committed ROI practitioners in companies and networks open and enthusiastic about sharing their knowledge and experience.
> A set of examples of early adopters of the model.
> Considerable up-grading of the competence and ability of Skillnets' own programme advisory and support staff regarding implementation of the model.

ROI to Skillnets
The issue of how to use the ROI methodology to assess the ROI of a broad-based funding programme such as Skillnets was raised with Dr Jack Phillips during the project evaluation workshop by Mr Henry Murdoch, a member of the Skillnets Board. Dr Phillips replied that he didn't think it was possible to apply the methodology across such a broad-based programme because ROI is essentially a measure unique to the company concerned. Much more work needs to be done to arrive at a usable and valid formula, but this pilot opens the door to possibilities in this regard which did not previously exist. The Recommendations suggest a way forward in this regard.

A critical issue that needs to be addressed in any such serious research design includes the choice of an appropriate unit of analysis: micro (firm) or macro (network or sector or all of Skillnets) level. A study which produces ROI data at the micro or firm level is the most valuable to businesses. Macro-level studies will essentially produce a different type of evaluation, which may need to be more rigorous and academically defensible, but is unlikely to yield as rich and informative an analysis as do micro-level or firm-level studies. Industry-level data cannot provide the detailed analysis of relationships between training and performance in individual firms needed for input to decisions by managers. But can macro-level studies be built on micro-level studies, or is it necessary to have two different forms of evaluation or two different processes in place? These questions need further elaboration and a possible way forward is suggested in the Recommendations contained in Section 10.2.

6.4. Expected Benefits to Skillnets Networks
Nine Skillnets networks took part in this pilot project and are expected to gain the following benefits:

> Increase in the ability of the Network Managers.
> Providing both a process and a basis for collaboratively working with network members to increase their use of best practice tools.
> Ability to confidently present the benefits and challenges of ROI to network members.
> Ability to promote the benefits of training to enterprises in a more tangible way and one that companies will really understand and relate to.
> Earn respect of network members by having a much greater understanding of key business drivers and being able to bring significant added value to the training staff function in network member companies.

Feedback from Network Managers
In feedback to the consultants the Network Managers identified that:

> Work with network members to implement ROI, communicate the ROI concept to the steering committee.
> Raise awareness about ROI / Phillips with other members of the network.
> Inform the network about the project and consider a programme for levels 4 and 5.
> Customise Level 1 for the network, negotiate input programme requirements.
> Look at my evaluation process, update level 2-4 forms.
> Encourage all trainers to use this to some level.

6.5. Expected Benefits for Training and Development Function
While specific benefits, as noted above, have accrued and will accrue to companies, participants, networks and Skillnets, the project also raises a number of issues of general import to training and development strategies in Ireland, as follows:

> These models can provide critical information for addressing the serious problem of poor transfer of knowledge and skills from the training room to the workplace.
> Evaluation can be a means to bring continuous improvement of training through greater emphasis on documentation, measurement and feedback loops.
> ROI analysis can bring greater accountability and improved efficiency to the training function of firms.
> This process highlights the best approaches for measuring training effectiveness given organisational constraints, discusses common pitfalls and provides strategies for overcoming obstacles.
> Effective evaluation can help companies make sound training investment decisions.
> Knowing precisely the quality of instruction that is being delivered can be critical to change management success. In this respect, the need for training evaluation goes beyond simply justifying the use of training - or justifying its continuation. In cases like these, post-training evaluation provides a critical feedback loop for ongoing improvements to a company-wide training programme.

Chain of Impact and Importance of Level 3
In Level 1 the initial reaction to the training is measured. This is the first and most basic level, usually involving a post-training survey assessing the quality of the class. In the case of the financial company, the employees were asked if the material was easy to follow, if the instructor was familiar with the subject matter and if the facility was conducive to learning.

At Level 2 we analyse the learning. There are a number of ways to do this, such as testing, simulations and instructor evaluations. What matters most is finding out if the employees actually absorbed the class content. These first two steps are early indicators. What is crucial to note here is that without positive feedback and good marks on the testing, a positive ROI is unlikely.

In Level 3 we analyse the skills gained over the long term. The goal of this step is to assess whether on-the-job behaviour changed after the training. This data should be collected over a longer period of time. During this phase, data is also gathered
thus providing a focus and encouragement for
skill level to implement the ROI methodology. workplace. on how participants have used their new skills.
others to follow suit.
> Ability of the Network Manager to use proven tools,
strategies and techniques for successfully
measuring the results of training.
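For reference, the Level 5 calculation that this chain of impact ultimately feeds rests on the two standard Phillips formulas, the benefit-cost ratio and the return on investment:

\[ \mathrm{BCR} = \frac{\text{Programme Benefits}}{\text{Programme Costs}} \qquad\qquad \mathrm{ROI}\,(\%) = \frac{\text{Programme Benefits} - \text{Programme Costs}}{\text{Programme Costs}} \times 100 \]

As a purely illustrative example (the figures are hypothetical, not drawn from the pilot studies), a programme with fully loaded costs of €40,000 and monetary benefits of €60,000 would show a BCR of 1.5 and an ROI of 50%; where benefits fall short of costs the ROI is negative, as happened in two of the pilot studies.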
In a number of the pilot studies the Level 3 data revealed very important information - what worked, and what didn't, in the training programme. If the trainees are not using the skills, it is essential to find out why. What the pilot studies show is that there may often be barriers that prevent transfer of learning, such as lack of opportunity to use the new skills, issues with the work environment, lack of support, or maybe the wrong skills; or, it might not be what is needed at the time. This information is crucial and can only be obtained by a Level 3 analysis.

When to Evaluate?
There are many training practitioners who believe there is no need to evaluate a once-off training course. This raises an obvious question, however: how do you decide to offer training only once when you haven't determined how effective the training was, and whether it needs to be offered again? This pilot project lends credence to the view that some level of evaluation should always occur - the only question is at what level. The pilot project participants appear to strongly favour evaluating every programme at Levels 1 and 2, and many at Level 3. The key lies in finding the right level of evaluation required. The interest in evaluating everything at Level 1/2 may arise from the skills the participants have acquired and the ease with which they can now gather Level 1 data.

Cost of Not Evaluating
What is abundantly clear from the pilot is that not conducting an evaluation/ROI analysis of training can cost companies valuable input that is necessary for improving results. In addition, evaluation can bolster a company's intention to continue with or further expand a programme which their 'gut feeling' told them was working but which, in the absence of credible data to show the ROI, they were reluctant to proceed with.

Section 7 - Barriers and Enablers to Adoption of the Models
This pilot shows that it is possible, with training and support, to quickly implement the ROI methodology in most firms. In the process it has been possible to identify the major barriers which inhibit the implementation of the concept and the enablers which allow it to happen speedily and with a high success rate.

7.1. Barriers
Some of these barriers are realistic while others are actually based on false perceptions and misapprehensions. The following problems were the most common barriers to the feasibility and diffusion of ROI analysis of training which were experienced during the pilot project:

Lack of Skills
Many HR and training staff do not understand ROI, nor do they have the basic skills necessary to apply the process within their scope of responsibilities. Measurement and evaluation is not usually part of the preparation for their job. Also, the typical training programme does not focus on results, but more on learning outcomes. Staff members attempt to measure results by measuring learning. Consequently, a tremendous barrier to implementation is the change needed in the overall orientation, attitude and skills of the HRD and training staff. This was overcome during this pilot by providing four days' training in the models, backed up with consultancy advice and support. It would appear to be the case that wider-spread use of measurement methodologies will not occur without the relevant training. It is also clear from the project that the use of consultants contributed greatly to the undertaking and finalisation of the case studies. This is probably not financially viable on a continuing basis and is, in the evaluator's view, a significant finding or output of the pilot project.

Cost and Time
The ROI methodology inevitably adds additional costs and time to the evaluation process of training programmes, although the added amount may not be as excessive as previously thought. It is possible this barrier alone stops many ROI implementations early in the process. A comprehensive ROI process can be implemented for 3 to 5 percent of the overall training budget (Phillips 2000). This must be considered in the light of Irish experience, which shows that there is often no specific resource allocation for evaluation; and where it exists, as evidenced by some of the large companies in this pilot, it relates only to the lower levels and typically costs no more than 1% of the total training cost. The additional investment in ROI could perhaps be offset by the additional results achieved from these programmes and the elimination of unproductive or unprofitable programmes. The time issue was identified by most of the participants in the pilot and may have been exacerbated in this instance by the short time frame to complete the pilot. It was alleviated to some extent by providing time-saving tools and techniques to facilitate both gathering data and analysing it.

Getting Buy-In from Senior Management
Winning the support, participation and understanding of colleagues and senior management posed a major barrier to many participants. During the training some template briefing tools were given to participants, but in general participants found this a difficult task early in the process. Towards the end they expressed much greater confidence in providing briefings as they had the experience of carrying out the process behind them. More time devoted to internal selling of the concept should be a major feature of the introduction of the concept and should be included in future programmes.

Absence of Needs Assessment and Baseline Data
Many of the programmes in the study do not have an adequate needs assessment, and some had none at all. Some of these programmes may have been implemented for the wrong reasons, based on management requests or efforts to chase a popular fad or trend in the industry. If the programme is not needed, or not focused on the right needs, the benefits from the programme will be minimal. An ROI calculation for an unnecessary programme will likely yield a negative value. Trying to assess value at the back end of a programme without a clear delineation of needs and baseline data is very difficult. This is a realistic barrier for many programmes and one which was clearly evident in this pilot.
Misapprehension
The fear of obtaining a negative ROI and fear of the unknown (where will this process lead us?) was prevalent for a number of participants at the outset. Fear of failure appears in many ways. Designers, developers, trainers and those who commissioned or authorised the training programme may be concerned about the consequence of a negative ROI. They fear that ROI will be a performance evaluation tool instead of a process improvement tool. Also, the ROI process will stir up the traditional fear of change. This fear, often based on unrealistic assumptions and a lack of knowledge of the process, becomes a realistic barrier to many ROI implementations. To date two studies have returned negative ROIs and the reaction of senior management has, in fact, been favourable. Instead of reflecting poorly on the players involved, the process has confirmed that the training was good and the trainees learned what was taught; in one case the problem arose from lack of support for application on the job, and in the other case from the time-lag to achieve business results.

Lack of Effective Planning
A successful ROI implementation requires much planning and a disciplined approach to keep the process on track. Implementation schedules, evaluation targets, ROI analysis plans, measurement and evaluation policies, and follow-up schedules are required. This pilot showed that much of the effort of the consultants was geared to keeping the participants on course, albeit to unusually tight timeframes. Without this, a lack of careful planning and of determined and sustained focus on implementation becomes a barrier, particularly when there are no immediate pressures to measure the return. In addition, where the current senior management group is not requiring ROI, the training staff may not allocate time for planning and coordination. Also, other pressures and priorities often eat into the time necessary for ROI implementation. Only carefully planned implementation will be successful.

False Assumptions
A number of the participants approached the pilot with an open mind and were able to enthusiastically get into the process quickly with optimum success. However, others came to the task with false assumptions about the ROI process, which hindered them, at least in the early stages, from getting to grips with the methodology. These false assumptions form realistic barriers that impede the progress of ROI implementation. Typical of these false assumptions were the following:

> The financial impact of soft skills training is often subjective, and difficult to quantify and convert to monetary value. (The pilot proved that it can be done by testing the model on 5 soft skills programmes.)
> We deliver professional, competent training so it will be a dangerous precedent for us to have to justify the effectiveness of our programmes in financial terms. (The pilot showed that proving ROI enhances the credibility of the training function and is not a burden to carry.)
> Training is a complex but necessary activity, is influenced by the vagaries of human participation, and should not therefore be subjected to an accountability process. (The pilot used tools and techniques which discount for the human variable and showed that accountability is not optional.)

Is it Worth the Effort?
Some training staff question whether the costs of tracking and measuring the costs and benefits of training are really worth the effort, especially when the training is offered only occasionally.

Does ROI Really Work?
Some participants expressed an initial lack of confidence in the evaluation of training in general and, in particular, a lack of trust in the ROI model of evaluation and its ability to measure monetary return in real terms. It was notable that, despite considerable evidence being presented of the application of this model worldwide over two decades, the sceptics remained among a handful of participants. In fact, it wasn't until the latter stages of the project, when the participants had implemented an ROI process for themselves, that their confidence in the model became firmly established.

Time Lag Effect
While the costs of training are known up front, before training, and are relatively easy to quantify, the benefits may accrue slowly over time and may depend on such unpredictable factors as turnover rates among workers who are trained, or market pressures or opportunities which allow the skills taught to be fully utilised. The latter was the experience of one of the studies in this pilot, which effectively prohibited a valid ROI calculation as the benefits occurred outside of the 12-month window from the conclusion of the training which is allowed for the calculation.

Setting Training Objectives
It was clearly evident in the pilot project that training staff set very limited and unspecific objectives for training programmes, and in some cases the objectives are either not set down in writing or not clearly specified at all. Objectives of training are often murky, though management, when asked, is always able to specify what they understood the objectives to be. It is very difficult to measure the rate of return if the meaning of return is not defined in quantifiable terms up-front.

Difficulty Understanding the ROI Methodology
For some participants elements of the methodology were difficult to grasp and implement. Working through the actual process with the support of the consultants enabled the participants to surmount these barriers. These included attribution of the effects to training, and the challenge of attributing causation to the training from a before-and-after comparison without the availability of accurate baseline data. Attribution of effects to training is very difficult due to the influence on firm performance of a complex myriad of other factors. Variables such as markets, revenues, costs, interest rates and many other factors enter into profit determination, making it difficult to isolate the impact of any incremental training expenditure. Most ROI figures aren't precise, though they tend to be as accurate as many other estimates that companies routinely make.

Cultural Issues
Though not significant barriers in any real sense to the implementation of the models under review, a certain cultural resistance could be discerned among the companies concerned, though not among the participants as such. In some instances ROI studies can just be seen by management as promotion and marketing by the training department, and it can happen that "best practice" companies often are among the most resistant, given their position that the value of training is an article of faith; their managers see little point in spending money to seek affirmation of this accepted view.

Absence of Data Collection Systems
Another cultural factor, at least that's what it seems to be, is the relative paucity of comprehensive recording systems in the training departments of companies, large and small alike. The evaluation process was seriously hindered in some cases by problems of data collection and measurement.

Absence of Cost Data
The project also identified a major problem in most firms around converting business impact data to financial values, along with obtaining accurate measures of the full costs of training. To be able to arrive at these numbers without too much effort on the part of the training staff, one has to obtain standard values and accurate and timely data on cost factors from the appropriate sources in the company. This was rarely forthcoming without a great deal of coaxing, and in some cases the data simply wasn't available.

Lack of Irish Case Studies
Many participants found the lack of Irish case studies during the training particularly disappointing. In fact, until this pilot there has been only one published Irish ROI case study and this proved generally inadequate for the purposes of the training. This gap will now be filled with the publication of case studies arising from the pilot.

Common Misconceptions
> The popular misconception that ROI should be left to accountants or mathematicians has been scotched by this pilot, where only one participant was an accountant and most had only a passing acquaintance with accountancy or mathematics.
> ROI analysis requires an in-depth understanding of your company's strengths and weaknesses, strategies, and extensive goal setting. To some extent this may be true but how deep the
knowledge required is an open question. Effective ROI processes were conducted under this pilot where staff had only a general knowledge of the company's strategic goals. Without such linkage it is likely that adoption of the ROI methodology will not be as widespread as expected.
> That perception of complexity has deterred many from attempting ROI studies in the past, but, in fact, doing a training ROI is a relatively simple and uncomplicated process once you have the basic knowledge and skills to use the tools provided.
> High ROI values can be achieved with programmes on leadership, team building, management development and supervisor training. This seems to stem from a view that value begets value: these programmes are usually the most expensive so they should be likely to give a high return. This may be true, but the two negative ROIs so far reported under this pilot are concerned with management development, while some excellent ROIs have been obtained from basic skills programmes.
> Full evaluation of a basic skills programme is difficult because of the problems in identifying and separating data for higher level evaluation. Three of the pilot project studies deal with basic skills and the participants have had little difficulty in identifying data for conversion to monetary value at Levels 4 and 5.
> It's not possible to evaluate a programme up to Level 5 that has no needs analysis and that's already finished. In this pilot very credible ROI results have been obtained on two such programmes.

7.2. Enablers
The factors that have helped to make this pilot project a success were:

> The participants' commitment to training and their desire to make this process work as a useful tool in their drive to improve the quality of training.
> The participants' personal commitment to learning, which was most evident during the two-part training course.
> The majority of participants approached the pilot with an open mind and were able to enthusiastically get into the process quickly with optimum success.
> The positive response of management in virtually all participating companies.
> The additional (and personal) input by the Network Managers and support staff at Skillnets.
> The ROI Institute certification procedures focussed the programme on achievable learning objectives, which might otherwise have been vague, and enabled the participants to obtain a recognised qualification as part of the involvement - the Foundations Certificate in ROI-Impact Measurement.
> The Skillnets input of driving this model of evaluation through the support of financial resources and personnel helped overcome many of the barriers above.
> The consultants' input, which allowed participants to have a sounding board, guide and mentor during the process and helped keep them on track.
> The identification of a champion within some of the companies who backed the process and gave it credibility in the eyes of senior management.
> The tools and job aids provided, which made the process easier and facilitated planning, setting of goals and keeping to a tight timeframe.
> The type of programme selected for evaluation - especially where a topical or business-relevant programme was selected.
> The FETAC certification process, at the lower evaluation levels, was helpful in the discipline it imposed on companies to retain data.

7.3. Supports and Constraints on Project Implementation
In implementing the pilot project the following supports and constraints were encountered:

Supports that facilitated the pilot were: (a) the vision of Skillnets in bringing forward the idea and committing funds towards its implementation; (b) the personal commitment of in-company personnel who received the training and implemented in-company evaluation studies - which was helped by the opportunity to be trained and certified in a cutting-edge methodology; (c) the openness of the companies and their provision of resources to complete studies - which was helped, in some cases, by management interest in the models, and by the small grant incentive from Skillnets; (d) the training, support, sustained focus and drive toward completion provided by the consultants.

Constraints on implementation were: (a) the limited time available to train personnel in a completely new method and carry out adequate evaluation studies; (b) the reduced scope of the type of training programme that could be chosen for evaluation, dictated by the need to complete an evaluation within the time window of the project; (c) the challenge of building awareness of the methodology and obtaining buy-in from senior management in each company.

7.4. Quotes
The negative ROI of -34% bears out the general view within the company that the training did not realise the potential which it had to bring about the expected change. A managerial review based on this ROI analysis has pointed to a number of improvements that are now being used in the design and delivery of future programmes:

> comprehensive needs analysis
> training must be linked to the actual job situation
> better buy-in by trainees and their superiors
> support follow-up systems in place
> impact to be determined before training
> training to have clear and measurable ROI targets
> trainees to have personal performance targets linked to training
> managers to be involved at all stages of training design, implementation and follow-up

Despite the negative ROI there was agreement that:

> most trainees obtained benefit from their participation
> the training was well conducted
> the coaching was particularly helpful for those who used it
> there were a number of important intangible benefits, the effect of which should not be underestimated

The main enablers for attempting this evaluation have been the openness of Lionbridge's management to get involved in the process of learning ROI fundamentals and principles, the willingness of Skillnets to support such an initiative, and the support given by the Consultancy team in guiding the participants through the learning process.

The main barriers were on the data collection side, due to the lack of performance metrics and the fact that the programme selected had commenced before the decision to evaluate it, so that baseline performance data could not be obtained. This caused most difficulty when trying to decide what was the overall percentage increase in performance due to the training.

If we were to evaluate more programmes or courses at this level, then the objectives, the metrics for measurement and the performance levels of those metrics before training commenced would need to be available to carry out a complete and valuable ROI assessment calculation.
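The baseline-data problem described in the quote above is exactly what trend-line analysis - one of the isolation techniques covered in the pilot training - is designed to address. The sketch below is purely illustrative: the monthly figures and the performance measure are invented, not drawn from any of the pilot studies. It shows how baseline data collected before training lets an evaluator project what performance would probably have been without training and credit only the gap above that projection to the programme.

# Illustrative sketch only: hypothetical monthly figures, not data from the pilot.
# Trend-line analysis: project the pre-training trend forward and attribute to the
# training only the performance above that projection.

baseline = [112, 115, 117, 118, 121, 123]   # units processed per month, before training
actual_post = [131, 134, 138]               # months immediately after training

def fit_trend(values):
    """Ordinary least-squares slope and intercept for evenly spaced periods."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values)) / \
            sum((x - x_mean) ** 2 for x in xs)
    return slope, y_mean - slope * x_mean

slope, intercept = fit_trend(baseline)

# Project the pre-training trend into the post-training months.
projected = [slope * (len(baseline) + i) + intercept for i in range(len(actual_post))]

# Only the performance above the projection is credited to the training.
attributable = [a - p for a, p in zip(actual_post, projected)]
print("Projected without training:", [round(p, 1) for p in projected])
print("Improvement attributable to training:", [round(d, 1) for d in attributable])

In the Phillips methodology the attributable improvement would then be converted to a monetary value and, where estimates rather than hard records are involved, further discounted before entering the ROI calculation.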
Section 8 - Training and Support for Implementation of the Models
The primary support provided to the participants during the pilot was a four-day training course divided into two parts which provided a comprehensive introduction to the two evaluation models as well as helping the participants to focus on their own data collection plan and on project deliverables.

8.1. Objectives of the Training Course
The objectives for the entire training programme (Parts 1 and 2) were:

Level 1 - Reaction Objectives
The practicality of the methodology
The usefulness of the material
Relevance of the methodology to present work
The amount of new information about evaluation
Intent to use the methodology
Recommendation to others to attend from similar job situations

Level 2 - Learning Objectives
Identify the drivers and benefits of evaluation
Identify and describe the major steps in the methodology
Develop a detailed evaluation plan
Identify and describe at least three ways to isolate the effects of a programme
Identify at least two ways to convert data to monetary values
Identify and analyse intangible measures
Calculate the benefit cost ratio and the ROI

Level 3 - Application Objectives
Build a credible case for ROI in the organisation
Develop a detailed evaluation plan for a specific programme
Select appropriate data collection methods
Utilise at least two ways to isolate the effects of programmes
Utilise at least two ways to convert data to monetary values in impact studies
Calculate the return on investment for selected programmes
Present evaluation data in a compelling and convincing way

Level 4 - Impact Objectives
Improve the effectiveness and efficiency of training programmes
Expand the implementation of successful programmes
Discontinue or redesign ineffective programmes
Enhance the influence of, and respect for, your function and its contribution to the organisation

8.2. Training Programme Content
1. Welcome and Introduction
2. Background to the Pilot Project
2.1. Skillnets
2.2. Pilot Project Objectives
2.3. Development Process
2.4. Respective Stakeholder Roles
2.5. Project Outcomes
3. Overview of Evaluation of Training
3.1. Why Evaluate Training
3.2. Benefits of effective evaluation
3.3. Trends in training evaluation
3.4. Kirkpatrick and Phillips
3.6. Exercise 1
4. Training and Development Programmes
4.1. Needs Analysis
4.2. Design
4.3. Creating a Training Programme
4.4. What makes good training?
4.5. Adult Learning Theory
4.6. Exercise 2
5. The Levels of Evaluation and How They Work
5.1. Level 1
5.2. Level 2
5.3. Level 3
5.4. Level 4
5.5. Level 5
5.6. Exercise 3
6. The ROI Process
6.1. Specific Criteria
6.2. ROI Process Model
6.3. Collecting Post-Programme Data
6.4. Exercise 1
6.5. Isolating the Effects of Training
6.6. Exercise 2
6.7. Converting Data to Monetary Values
6.8. Exercise 3
6.9. Calculating Costs and the ROI
6.10. Exercise 4
6.11. Identifying Intangible Benefits
6.12. Barriers to ROI Implementation
7. Pilot Project Implementation
7.1. Overall Project Plan
7.2. Company Project Plan
7.3. Evaluation Data Collection
7.4. Support Tools
7.5. Next Steps
8. Collecting Data during Training (Level 1 - 2)
8.1. Collecting Data after Training (Level 3 - 4)
10. How to conduct a Focus Group
11. How to carry out Trend Line Analysis
12. How to design questionnaires and use participant estimates
13. How to calculate fully loaded costs of training
14. Isolating the effects of Training
15. Converting Data to Monetary Values
16. Calculating ROI
17. Identifying intangible benefits
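As a companion to items 13 and 16 of the course content above, the sketch below illustrates, with entirely hypothetical figures and category names, what a fully loaded cost build-up and the resulting benefit-cost ratio and ROI calculation look like in practice. The cost categories follow the usual Phillips convention of including participants' time off the job and the cost of the evaluation itself.

# Hypothetical illustration only: all figures and category names are invented.
costs = {
    "needs_analysis":        1_500,
    "design_and_materials":  3_000,
    "facilitator_fees":      6_000,
    "facilities_and_travel": 2_000,
    "participant_salaries":  9_500,   # salary cost of time spent off the job
    "evaluation":            1_200,   # the evaluation study itself is included
}
fully_loaded_cost = sum(costs.values())

# Monetary benefits after isolating the effect of training and converting
# impact data to money (Levels 4-5 of the process).
annual_benefit = 34_000

bcr = annual_benefit / fully_loaded_cost
roi_percent = (annual_benefit - fully_loaded_cost) / fully_loaded_cost * 100

print(f"Fully loaded cost: {fully_loaded_cost:,}")
print(f"Benefit-cost ratio: {bcr:.2f}")
print(f"ROI: {roi_percent:.0f}%")

The point of the fully loaded approach is conservatism: by capturing all costs and only defensible benefits, a positive ROI, where it emerges, is far more credible to senior management.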
8.3. Training Hours

Table 8.1: Training Hours

Network Name | No. of Companies | No. of Trainees | No. of Training Hours | No. of ROI Studies
BME | 2 | 5 | 128 | 1
Brewing | 2 | 5 | 160 | 2
SLM | 1 | 2 | 48 | 1
Carlow Kilkenny | 3 | 4 | 112 | 2
Galway Executive | 1 | 2 | 64 | 1
North Mayo | 1 | 2 | 64 | 1
SE Micro | 1 | 2 | 64 | 1
NETS | 4 | 6 | 128 | 4
Hospitality | 3 | 7 | 208 | 3
Skillnets Staff | - | 5 | 160 | 0
ROI Diploma | 8 | 9 | 288 | 0
Total | 26 | 49 | 1424 | 16

8.4. Evaluation of Training Course Part 1
36 people attended the first part of the training course at the Mullingar Park Hotel on 30-31st March 2004. This number included representatives from 17 companies, eight networks and three Skillnets staff. Irish Biscuits and Eo Teo could not attend and the consultants are arranging a one-day briefing session for each of the personnel involved to enable them to attend the second part of the training course.

A feedback form was completed by all participants at the end of Day 1 and this led to a shift in emphasis in the format during Day 2 which proved effective. The consultants will make a number of amendments to the content and format of the second training course to take account of the feedback from the participants.

The course provided a comprehensive introduction to the two evaluation models as well as helping the participants to focus on their own data collection plan and on project deliverables.

Overall Evaluation
Overall the participants had a very positive reaction (Level 1 evaluation) to the programme. 17% rated it as Excellent, 56% as Better than Expected, 22% as Satisfactory and 6% (2 participants) as Below Average. In terms of a satisfaction rating between 1 (min) and 4 (max) these figures translate to an overall average rating of 2.8 out of a possible 4 (71%). All except 2 participants rated the course at point 3 (max) in terms of being worthwhile. The average overall rating for whether the course met participants' needs was 2.7 out of a possible 3. The rating as to whether the course had equipped them to carry out evaluations was 2.8 out of a possible 3.

Increase in Skill/Knowledge
The relative change in the level of skill and knowledge of the participants as a result of attending the course (Level 2 evaluation) was assessed by way of a separate anonymous questionnaire before the start of the course and again (using the same questions) at the end. The average ratings (based on 34 respondents) are shown in Table 8.2. Since this was only Part 1 of a two-part course the overall increase of 82% is encouraging. The ratings show that participants understood the material presented, and the highest increases in knowledge occurred in the most difficult parts of the process (B, E, G and H).

Table 8.2: Level of Skill/Knowledge of the Participants Before and After the Training (Minimum = 1, Maximum = 5)

Item | Before Training | After Training | Percent Increase
A. Know the reasons for, and uses of, evaluation | 2.9 | 4.2 | 43%
B. Designing an ROI Action/Implementation Plan | 2.0 | 3.5 | 95%
C. Setting objectives which reflect ultimate outcomes | 2.4 | 3.5 | 43%
D. Determining the timing of data collection | 2.0 | 3.5 | 74%
E. Selecting appropriate data collection instruments | 2.0 | 3.6 | 79%
F. Identifying factors which influence results | 2.1 | 3.4 | 58%
G. Selecting appropriate strategies for Isolating the Effects of Training | 1.7 | 3.1 | 83%
H. Analysing data including calculating the actual return on investment | 1.7 | 3.3 | 96%
I. Implementing the ROI Impact Measurement process | 1.0 | 2.9 | 85%
J. Identifying and removing barriers to implementation of the ROI process | 1.5 | 3.0 | 94%
K. Providing internal consulting on ROI | 1.2 | 2.6 | 85%
L. Leading discussion on ROI Impact Measurement issues and cases | 1.3 | 3.0 | 75%
Overall Level of Skill/Knowledge | 1.8 | 3.3 | 82%

Facilitators
The ratings given by the participants for the effectiveness of the facilitators are shown in Table 8.3.

Table 8.3: Effectiveness Rating of the Facilitators (4 = Very Effective, 1 = Not Effective)

A. Knowledge of subject | 3.4
B. Organisation and preparation | 3.2
C. Facilitation skills | 3.0
D. Responsiveness to participants | 3.1
E. Creating appropriate learning climate | 3.0
Overall Rating | 3.1

Delivery Methods
75% of participants thought there was a good balance between lecture, discussion and exercises, while 25% thought there was too much of a lecture format. All except 2 participants found the exercises and case studies to be either very helpful or helpful.

Programme Content
66% found there was a proper balance of theory and practice, while 28% found it very theoretical and 11% found it very practical. Most participants (86%) found the programme length just right. 18 (50%) found the course difficult, 15 found it suitable, 1 found it very difficult, 1 found it easy and 1 found it too easy.

Increase in Effectiveness
As a result of the course the average self-estimate of the increase in participants' effectiveness to develop and implement measurement and evaluation strategies was 48%. Apart from 2 participants who recorded 0%, the lowest increase in effectiveness was 20% - the highest 70%.
8.5. Evaluation of Training Course Part 2
27 people attended the second part of the training course at the Mullingar Park Hotel on 1-2 June 2004. This number included representatives from 16 companies, five networks and five Skillnets staff. The course provided a comprehensive introduction to the two evaluation models as well as helping the participants to focus on their own data collection plan and on project deliverables.

Overall Evaluation
Overall the participants had a very positive reaction (Level 1 evaluation) to the programme. 48% rated it as Excellent, a further 48% as Better than Expected, and 4% (1 participant) as Satisfactory. All participants rated the course at point 3 (max) in terms of being worthwhile. 96% of participants reported that the course met their needs. 100% said the course had equipped them to carry out evaluations.

Increase in Skill/Knowledge
The relative change in the level of skill and knowledge of the participants as a result of attending the course (Level 2 evaluation) was assessed by way of a separate anonymous questionnaire before the start of the course and again (using the same questions) at the end of both the first and second courses. The average ratings (based on 34 respondents) are shown in Table 8.4.

Table 8.4: Level of Skill/Knowledge of the Participants Before and After the Training (Minimum = 1, Maximum = 5)

Item | Before Training | After 1st Training | After 2nd Training
A. Know the reasons for, and uses of, evaluation | 2.9 | 4.2 | 4.3
B. Designing an ROI Action/Implementation Plan | 2.0 | 3.5 | 4.0
C. Setting objectives which reflect ultimate outcomes | 2.4 | 3.5 | 4.1
D. Determining the timing of data collection | 2.0 | 3.5 | 3.8
E. Selecting appropriate data collection instruments | 2.0 | 3.6 | 4.2
F. Identifying factors which influence results | 2.1 | 3.4 | 4.0
G. Selecting appropriate strategies for Isolating the Effects of Training | 1.7 | 3.1 | 4.0
H. Analysing data including calculating the actual return on investment | 1.7 | 3.3 | 4.0
I. Implementing the ROI Impact Measurement process | 1.0 | 2.9 | 3.6
J. Identifying and removing barriers to implementation of the ROI process | 1.5 | 3.0 | 3.7
K. Providing internal consulting on ROI | 1.2 | 2.6 | 3.3
L. Leading discussion on ROI Impact Measurement issues and cases | 1.3 | 3.0 | 3.6
Overall Level of Skill/Knowledge | 1.8 | 3.3 | 3.9

Facilitators
The ratings given by the participants for the effectiveness of the facilitators are shown in Table 8.5.

Table 8.5: Effectiveness Rating of the Facilitators (5 = Very Effective, 1 = Not Effective)

Item | Avg. Overall Rating 1st Course | Avg. Overall Rating 2nd Course
A. Knowledge of subject | 3.4 | 4.7
B. Organisation and preparation | 3.2 | 4.7
C. Facilitation skills | 3.0 | 4.6
D. Responsiveness to participants | 3.1 | 4.7
E. Creating appropriate learning climate | 3.0 | 4.6
Overall Rating | 3.1 | 4.7

Delivery Methods
25 of the 27 participants thought there was a good balance between lecture, discussion and exercises. All participants found the exercises and case studies to be either very helpful or helpful.

Programme Content
20 participants found there was a proper balance of theory and practice, while the remaining 7 found it very practical. 25 participants found the programme length just right, with 2 finding it too long. 16 (18 on the first course) found the course difficult, 11 found it suitable and 1 found it easy - none found it very difficult.

Increase in Effectiveness
As a result of the course the average self-estimate of the increase in participants' effectiveness to develop and implement measurement and evaluation strategies was 69% (48% after the first course).

Return on Investment
10 of the participants listed a monetary value arising from their participation in the programme which amounted to a total of €264,000 (after adjusting for a statistical confidence level of 68%). Reasons for this return were listed as:

> I will be more specific with my training providers to ensure that evaluation is part and parcel of the training intervention;
> On completion will have a more focused approach for future programmes;
> Better evaluation of programmes, better objective setting to realise value of programme, drop programmes not adding value;
> Not running courses which are irrelevant;
> Reduce costs;
> Better objectives and organisation;
> As a % of my time spent on ROI networks;
> Sole trader so just personal efficiency;
> Will not run certain courses as justification now very suspect;
> Saving on course costs;
> Training budget of network up 10%.
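The 68% adjustment applied to the total above appears to follow the standard Phillips practice of discounting participant estimates by the participants' own confidence in them, so that only the conservative portion of a claimed benefit enters the total:

\[ \text{adjusted value} = \text{estimated value} \times \text{confidence level} \]

As a hypothetical worked line, a benefit estimated at €10,000 and held with 70% confidence would be carried as €7,000; where the benefit must also be isolated from other influences, a further attribution percentage is applied in the same multiplicative way.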
8.6. External Evaluation of the Training Courses
Mr Anthony Foley of Dublin City University attended parts of both training courses and later conducted an independent evaluation of the entire training programme. The evaluation questionnaire responses from fifteen of the eighteen participating companies indicate a strong degree of satisfaction with the training delivery, the advisory delivery and the overall programme.

In particular the evaluation revealed that:

> There were no disagreements with the view that the location / facilities were good and conducive to learning.
> There were no disagreements with the view that the topics were appropriate.
> The lecturers were well thought of, with 11 strongly agreeing they were generally good and no disagreements.
> In relation to the questions dealing with various training aspects - amount, timing, quality, supporting material and benefit - there were no disagreements or negatives. All answers were agreements. The range of strong agreements was 9 (prepared well for ROI) to 4 (each session was good).
> Whether there should be more training got a mix of answers: 6 agreed (4 strongly), 3 were neutral and 6 disagreed. There was not, therefore, a consensus view on having more training.

Section 9 - Evaluation of the Process
A separate and independent evaluation of the entire pilot programme has been conducted by Dr Anthony Foley, Dean of Dublin City University Business School. This evaluation is included below:

1. Introduction
This is an evaluation of the Skillnets Pilot Programme on Measuring Return on Investment from training. The project is well conceived from a needs and strategic perspective. It is recognised generally that competitiveness in a high wage economy is underpinned by, among other factors, high quality skills, education and training.

At present training levels by firms are low relative to strategic needs. Skillnets has correctly recognised that if enterprises are to significantly increase their training levels it must be shown, as precisely as possible, that training produces a clear contribution to success. Quantifying the impact of training has not received much attention in Ireland in the past. This project is a significant attempt to remedy that omission.

In essence, this project is a strategically important attempt to test out methods of measuring the impact of training and the return on investment in training, in an action research approach with a focus on real world situations and enterprises.

In summary, the project is strategically important, well conceived and timely.

Particularly attractive features are the focus on enterprises, actual training activities and actual ROI assessments.

The purpose of the evaluation is to assess the project relative to its targets and to examine its implementation. The evaluation questionnaire included questions on the merit of ROI but it is not a primary objective to evaluate critically the ROI methodology.

We are grateful to the project manager and the participants for their time and views.

2. The Project
The project is a pilot project which may lead to wider diffusion at a later stage. The aims are:

> To test the Kirkpatrick and Phillips models for evaluating training to show that training is worthwhile and value for money.
> To publish the findings to encourage other firms to take up the methods.
> To train staff in 20 companies to use the models in their own firms.
> To develop support tools to assist firms in using the methods.

The project was designed as five stages:

> Select companies and networks
> Agree objectives with companies
> Test model/method in companies
> Evaluate results
> Build awareness

The project involved a formal agreement between Skillnets and participating companies, training sessions on the evaluation/assessment models for the people earmarked to carry out the assessments, and consultant/advisor services to support the carrying out of the exercise. The company/individual focus of the advisors is a very substantial and important part of the project. Unfortunately, it is also a costly element of the overall package.

Overall, it is clear that the project was well designed, had a clear implementation plan and had precise targets. As will be shown throughout the evaluation, the specific targets of assessing the models, training staff, developing support materials and publishing results have all been broadly achieved.

It is less clear that there will be widespread (or wider-spread) take-up of the use of ROI, but this was not a specific objective.
3. Methodology
The main element of the evaluation methodology was the evaluation questionnaire completed by the participating companies. Fifteen useable questionnaires were returned by the eighteen participating companies.

The other elements of the evaluation were (i) attendance at some of the project's events and observation of, and participation at, some sessions; (ii) discussions with project participants and officers at events; (iii) assessment of the project documents, procedures and implementation systems.

The project manager involved the evaluator in all stages of the project, provided access to all documents, facilitated discussions and participation, and included the evaluator on the project circulation list.

4. Evaluation Questionnaire and Results
The evaluation questionnaire (see the end of this section) is divided into three sections, reflecting the nature of the project. Section A (questions 1-16) deals with the training to prepare for and implement ROI. Section B (questions 17-27) deals with the individual support/advisory service of the project provided to firms undertaking ROI assessments. Section C (questions 28-43) deals with views on the overall programme and the ROI concept.

Questions 14-16, 26-27 and 37-38 are open-ended (write-in) questions. Relatively few respondents completed these. The main comments are outlined later in this section. The full details of the fifteen responses are presented below. As will be seen below, the results are quite good and encouraging for the project, with high levels of satisfaction and a particularly favourable view of the personal advisors element.

Answers to the open-ended questions (26 & 27) on the individual advisory element included:

Q. 26 Gerry was extremely helpful and I would not change anything.
Q. 27 Having the support element really motivated me.
Q. 26 No changes. Eoghan was great.
Q. 26 The advisory support was excellent overall.
Q. 26 No changes.
Q. 26 Can't think of any changes.
Q. 26 Very helpful in implementing first ROI.
Q. 26 Overall I found the advisor good.

The only less than fully positive write-in comment was that a little more prior notice should be given on project deadlines. The strong position of the advisory element is reinforced by the answers to the other questions on the topic and by the discussions that the evaluator had with participants at project sessions. There was general agreement that the advisory element was essential and that the advisors were very good.

Q. 14 dealt with possible changes to improve the training. The comments included:

More practical work / case studies.
Earlier and increased emphasis on choice of programme on which to base ROI - too late when halfway through the programme to relate to business impact measures.
Use any new Irish case studies.
More group work in first session.
More practical sessions on isolating the effects of training.
More time for group exercise.
First module went too fast.
More time for each session.

The desirability of more case studies was discussed with the project manager during the implementation. One of the useful outputs of the pilot project is that several new Irish case studies have been created which can be used in later programmes. Overall, the above comments reflect a desire for more material; this indicates a degree of enthusiasm for the project and the ROI concept. The time difficulty is a frequent issue in training. Participants want more depth but this increases the number of training days, which then makes the programme less attractive.

It is also noticeable that, in answer to Q. 13 (there should be more training), six respondents disagreed, three were neutral and only six agreed/strongly agreed. The responses to question 5 (the amount of training was appropriate) were 15 agreed/strongly agreed and there were no disagreements. These answers would not indicate a desire for more training.

Based on discussions with participants, Level 4 would be the most difficult aspect. There appears to be a case for more rapidly moving through the earlier stages and devoting time earlier to Level 4 (and 5) issues. The new case studies would facilitate this.

The Mullingar Session 2 was most identified as the most important session (Q. 15) but this was based on a plurality, not a majority. There were very few answers given to question 16 (any other comments on training). There was one negative comment:

I think this is the first time course run in this format and you knew it.

However, even this respondent would recommend the programme to others but agreed it should be run with changes.

Other general comments on Q. 16 were:

Generally very informative
Break up into separate sessions was very practical and suitable
Great to have the Phillips

The remaining open-ended questions dealt with the barriers to implementing ROI in the company (Q. 37) and changes to the overall programme (Q. 38). In Q. 37 the issues identified were:

> Time (identified seven times)
> Participation and understanding of colleagues & company (five times)
> Manpower (once)
> Relevance (once)

Clearly, for those who identified barriers, the time needed to implement ROI was a critical issue, followed by the lack of understanding of, and participation by, company colleagues. As people become more proficient at the topic the speed of implementation will increase. Future programmes might include a short pro-forma briefing document or presentation that ROI users could use in persuading colleagues. A little time devoted to internal selling of the concept should be included on future programmes.

For Q. 38 (changes to the overall programme) there were very limited responses. All except one were either blank or stated no changes. The exception suggested a slower pace at the beginning.

These answers are a little at variance with the specific closed questions. Question 42 stated that the programme should be run again but with changes. Five respondents agreed with this. Question 41 said it should run broadly as is, and only 1 respondent disagreed. Overall, there would seem to be an indication of very limited strong feelings for change and a broad acceptance of continuing as it is.

Detailed Results for Each Question
The detailed results for each question are shown below and are now dealt with. The open-ended questions have already been discussed.

Issue 1 (question 1) dealt with the impression of the training. There were no disagreements with the contention that the training was well organised, beneficial, achieved its objectives, was challenging and prepared people for implementing ROI. All except one case were agreements. The only departure from this was two neutral responses on challenging.
Overall, Q. 1: a positive evaluation of the training.

There were no disagreements with the view that the location/facilities were good and conducive to learning (Q. 2).

There were no disagreements with the view that the topics were appropriate (Q. 3).

The lecturers were well thought of, with 11 strongly agreeing they were generally good and no disagreements (Q. 4).

Questions 5-12 dealt with various training aspects: amount, timing, quality, supporting material and benefit. There were no disagreements or negatives. All answers were agreements. The range of strong agreements was 9 (prepared well for ROI) to 4 (each session was good).

Question 13 (there should be more training) got a mix of answers: 6 agreed (4 strongly), 3 were neutral and 6 disagreed. There was not a consensus view on having more training.

Section B dealt with the support/advisory service. Q. 17-Q. 24 dealt with organisation, efficiency, role in ROI implementation, amount of service, quality of advisor, timing and understanding of problems. There were no negatives in this section. Overall the rating of the advisor/support service was high, ranging from 12 strongly agreeing that the advisor understood ROI and that "my advisor was good", down to 9 strongly agreeing that the timing was appropriate. There were no disagreements.

Q. 25, stating there should have been more support input, produced a mix of answers: 4 agreeing, 3 neutral and 8 disagreeing.

Overall: the advisory/support input was highly rated (Q. 17-25).

Section C dealt with the overall programme and the ROI concept (Q. 28-43). Overall there is strong support for the programme with a few soft areas:

All respondents would recommend the programme to others (Q. 30).

All respondents felt it was well delivered and achieved its objectives (Q. 29).

All respondents felt the programme prepared them to implement ROI (Q. 28).

2 of the 15 respondents had a neutral view on the programme being well designed.

13 of 15 respondents felt the programme should be run again as it is. There was 1 disagreement.

5 of 15 felt it should be run again but with changes.

No respondent disagreed with the proposition that the programme was value for money, but there was 1 neutral answer.

12 of 15 agreed it was one of the most valuable programmes ever undertaken. There were 2 neutrals and 1 disagreement.

The evaluation questionnaire also dealt with the ROI concept (Q. 31-36, 40). There was strong support for the concept in some cases but weaker evidence of ongoing use:

13 of 15 agreed that ROI is an excellent business tool (there were no disagreements).

13 of 13 respondents were confident in using ROI.

Only 8 of 15 agreed they would use ROI on an ongoing basis. However, there were no disagreements; there were 7 neutrals.

On the other hand, 10 respondents (of 12) disagreed that they were unlikely to use ROI.

12 of 15 respondents agreed that ROI will improve the efficiency of their company.

12 of 15 respondents agreed that their company had supported them in implementing the ROI project in house - only 1 agreed that their company was not supportive.

Given that the responses relate to the period immediately after the programme, it remains to be seen to what extent the participating companies will use ROI in the future.

The evaluation questionnaire responses indicate a strong degree of satisfaction with the training delivery, the advisory delivery and the overall programme. This would be consistent with the face-to-face discussions with participants and the observations of the evaluator at sessions. All respondents would recommend the programme.

5. Management and Implementation
The management and implementation activities were effectively done. The generally expected indicators were all present:

> A steering group of appropriate personnel was established.
> There was a clear and precise implementation plan which was broadly carried out according to plan.
> There was appropriate reporting to the steering group through the first and second progress reports.
> The responsibilities of the various participants and stakeholders and the project delivery team were clearly defined. In fact, based on my experience, it was a very well documented project.
> There was continuing communication between the project manager and the participants.
> The evaluator was fully facilitated.
> Feedback was sought, evaluations were carried out and feedback material has been made available to the evaluator.
> The project team responded to the feedback where appropriate and possible.
> Individual participant care was provided where participants missed elements of the programme.

Based on observation and participation, and supported by the evaluation results, it was clear that the project manager energetically worked to ensure smooth delivery. Overall, the project was well designed, well managed, well delivered, implemented according to plan and well documented.

One caveat would be that the end of the programme should have been designed to include presentation of all the case studies. Clearly, evaluation of the publication, awareness, diffusion and ongoing uptake cannot take place at present.

6. Other Outputs
In addition to the specific project outputs of knowledge of ROI and other measurement techniques, case studies and awareness of the methods and the need for measurement, the project has also facilitated the emergence of a capacity to provide the Phillips methodology and diploma on an ongoing basis. It has effectively pump-primed a solid capacity to provide professional-level training on the topic.

The evaluator has seen several of the final case studies. They are very useful works and will be beneficial in future training. The evaluator has not seen the final document on the identification of enablers and barriers, but the project has clearly advanced understanding of these, which include:

> Company support and acceptance
> Time constraints (and hence the need for time-saving tools of analysis and manuals)
> Basic knowledge of the models (and hence the need for training)

It would appear to be the case that wider-spread use of measurement methodologies will not occur without the relevant training.

It is also clear from the project that the (expensive) use of advisors contributed greatly to the undertaking and finalisation of the case studies. This is probably not financially viable on a continuing basis and is, in the evaluator's view, a significant finding or output of the pilot project.
7. Overall Assessment and Recommendations
The objective evaluation through surveys of respondents clearly indicates that participants considered the project to be a success. In the evaluator's survey all would recommend the programme and the teaching/content aspects got high ratings. This is further confirmed by the analysis of the project manager's own evaluation systems as reported in the interim reports and by the evaluator's own observations and discussions at sessions. The project manager's use of before/after skills-level evaluation systems is a very good feature of his approach.

The specific objectives of the project were realised:

> To test models (successfully done in the case studies).
> To train staff (successfully done both in companies and in Skillnets networks).
> To develop support tools (successfully done, both in establishing a capability and formal training system and in support materials).
> To publish the findings (done at the October workshop but also intended for more extensive publications).

The project was strategically important, well designed and excellently managed, and the awareness of measurement was increased. It is clear that the project achieved its proximate objectives.

Of more fundamental importance are the lessons to be learned for ongoing programmes. Based on the evaluations and discussions it seems that the personal role/visits of the advisor were critical in the successful completion of the measurement exercise, especially to Levels 4 and 5. Because of cost this would probably not be a continuing feature and the training has to be designed to succeed without the individual advisor support. This would probably require a longer, more gradual and appropriately phased training programme than that designed for the pilot project. This could be a series of half-day or one-day sessions while the specific case study (action learning experience) is being developed. The advisory role would be fulfilled by the lecturer.

We also recommend that groups be kept small and confined to people eventually intending to use the model, to facilitate the transfer of what are difficult concepts, especially at the Level 4 stage. The primary objective is to get people evaluating training rather than simply to create an awareness of the concept.

Where other personnel need ROI training of a broader nature, it could be provided specifically for them as a group or groups.

The training should be rolled out and targeted separately at training advisors and actual users of the models/methods.

The project was a success. It was well designed, managed and implemented. The most critical finding from our evaluation is that the advisor was a significant determinant of the eventual successful completion of the measurement exercise. This will not continue in the future and training programmes must be designed to cope with this.

Questionnaire With Answers
Section A: Training to prepare for and implement ROI

Key: S = Strongly Agree, A = Agree, N = Neutral, D = Disagree, DS = Strongly Disagree

Please circle ONE of the letters for each part of the questions below.

1. The training was
Well organised 9S 6A N D DS
Beneficial 8S 7A N D DS
Achieved its objectives 7S 8A N D DS
Challenging 9S 4A 2N D DS
Got me well prepared for implementing ROI 8S 7A N D DS

2. The location/facilities were
Good 7S 8A N D DS
Conducive to learning 4S 10A N D DS

3. The topics were appropriate relative to the ROI concept 7S 8A N D DS

4. The lecturers/trainers were
Generally good 11S 4A N D DS
Facilitated discussion 9S 6A N D DS
Knew the ROI area well 10S 5A N D DS

5. The amount of training was appropriate 5S 10A N D DS

6. The timing of the training was appropriate 7S 5A 3N D DS

7. The quality of the training was good 7S 8A N D DS
Questionnaire With Answers (contd)

8. The training prepared me well for using ROI 9S 5A N D DS

9. The supporting material for the training was good 8S 6A N D DS

10. Each training session was good 4S 10A N D DS

11. Each training session was beneficial in terms of ROI 7S 7A N D DS

12. I would recommend the training element 11S 4A N D DS

13. There should have been more training 4S 2A 3N 6D DS

14. What changes, if any, would you recommend to improve the training

15. Which session was the most important to you

16. Any other comments on training

Section B: Individual Support/Advisory Service

17. The individual support/advisory service was
Well organised 10S 5A N D DS
Beneficial 10S 5A N D DS
Achieved its objectives 11S 4A N D DS
Efficient 11S 4A N D DS
Helped in implementing ROI 11S 4A N D DS

18. The amount of individual support/advice was satisfactory 11S 4A N D DS

19. My advisor was good 12S 3A N D DS

20. The individual advice/support component was essential in implementing ROI 11S 4A N D DS

21. The timing of the advice/support was appropriate 9S 6A N D DS

22. My advisor understood ROI well 12S 3A N D DS

23. My advisor understood my problems 10S 4A 1N D DS

24. I would recommend the advisory/support component 11S 4A N D DS

25. There should have been more individual advisory/support input 2S 2A 3N 4D 4DS

26. What changes, if any, would you recommend to improve the individual advisory/support element

27. Any other comments on the individual advisory/support element

Section C: Overall Programme and ROI Concept

28. The programme prepared me well to implement ROI 7S 8A N D DS

29. The programme was
Well designed 7S 6A 2N D DS
Well delivered 9S 6A N D DS
Achieved its objectives 9S 6A N D DS

30. I would recommend the programme to others 8S 7A N D DS

31. The ROI model is an excellent business tool 9S 4A 2N D DS

32. I will use ROI as an ongoing aspect of my work 3S 5A 7N D DS

33. My company supported me in implementing the ROI project in-house 4S 8A 2N 1D DS

34. My company was not supportive in implementing the ROI project in-house 1S A 5N 4D 5DS

35. I am confident in using ROI 4S 9A N D DS

36. I am unlikely to use ROI as an ongoing part of my job S 1A 1N 8D 2DS

37. What are the main barriers to using ROI in your company (if any)

38. What changes, if any, would you make to the overall programme

39. The programme was value for money 9S 5A 1N D DS

40. The ROI model will improve the efficiency of my company 6S 6A 3N D DS

41. The programme should be run again broadly as it is presently operated 10S 3A 1N 1D DS

42. The programme should be run again but with changes 1S 4A 5N 5D DS

43. This has been one of the most valuable training programmes I have undertaken 2S 10A 2N 1D DS
Section 10 - Conclusions and Recommendations

10.1. Conclusions
This pilot project tested the detailed application of the Kirkpatrick and Phillips Evaluation Models in Irish enterprises with particular regard to the applicability and usability of the models in evaluating the impact of training in companies. The results can be summarised as:

Applicability - the models are certainly applicable. The results of the pilot project show that they are:

> Methodologically sound, comprehensive and credible and, hence;

> Acceptable in the Irish business context.

Usability - the models are usable with adequate training and support. With regard to the usability of the models the pilot project shows that:

> Levels 1-3 are easy to use, Level 4 presents some challenges for companies and Level 5 is difficult, requiring business impact and monetary data and the use of techniques to isolate the impact of training from other factors. Further, given that relatively significant staff and time resources are needed to carry out a full level 5 ROI evaluation, it may be easier to do so in larger firms. It is not however impossible, and certainly not less important, in smaller firms. This staffing and time commitment should however decrease as competency develops.

> In order to ensure a credible and accurate evaluation study up to and including level 5, the organisation must commit to providing appropriate staff with ROI evaluation training. We consider 2-3 days training to be the basic requirement. We believe therefore that the skills/knowledge of managers to use the models can be developed.

> Programmes which will be evaluated to level 5 should be chosen carefully. Criteria such as size, scope and cost of training and also the presence of baseline and easily accessible and reliable data are important.

> Baseline data must be gathered prior to all training programmes.

> Inadequate Training Needs Assessment prior to delivery of a training programme militates against an effective evaluation process.

> There must be strong support from senior management in the organisation for implementation of the models.

> The commitment of the person responsible for implementation is vital.

> All staff should be made aware of the basics of the model, the importance of carrying out such evaluation and the benefits to both staff and the organisation generally. In particular the involvement of trade union representatives has been shown in this pilot to have been very beneficial.

> HRD/training objectives/outcomes must be integrated with business objectives.

> Further, the application of the evaluation process and its findings must be linked to business objectives.

The ROI Process
10.1.1. It is quite apparent from the project outcomes, as assessed by the independent evaluation, that even with limited time and basic expertise the rigorous application of the Kirkpatrick/Phillips Evaluation/ROI process yields a clear and credible result with a multiplicity of beneficial uses. If we look at the benchmark which academic evaluators apply to have most confidence in their results, we see that they favour comprehensive evaluation designs with components including a process evaluation, an impact analysis, analysis of participant perspectives, and a benefit-cost analysis. This approach generally yields not only a more valid evaluation but better understanding of what is going on "behind the numbers." In the business context we may not set the bar so high, but in fact the ROI process meets and even exceeds such standards and is powerfully robust as a consequence.

10.1.2. The Evaluation/ROI process incorporates a multilevel approach to evaluating training, which includes (1) a survey of trainee reactions to the training, (2) assessment of learning gains achieved, (3) validation that the learning has been applied or changes behaviour in the workplace, (4) documentation of results or outcomes in terms of company goals, and (5) a ROI analysis. Although the challenges to implementing such a comprehensive analysis of any training programme are formidable, they are not insurmountable. Business people aren't looking for unassailable scientific proof. They only want sufficient evidence to develop a credible and realisable business case. Moreover, the effective deployment of ROI analysis to investments in training offers considerable added benefits.

10.1.3. The Evaluation/ROI approach is transparent, practical, and relatively easy to implement, and can provide immediate feedback. The business ROI approach is based on benefit-cost analyses, which makes variables and their relationships to outcomes clear and explicit. Where it must rely on estimation by participants, supervisors, and/or managers, empirical rigour is retained by applying the qualifying techniques contained in the Phillips methodology.

10.1.4. Adoption of the Evaluation/ROI process by Irish companies is a realistic prospect, though the pace and extent of such adoption remains to be seen. The pilot shows that there are cultural and size variations between Irish and American companies that make the implementation of a full ROI Level 5 analysis in many companies in Ireland more challenging. The cultural difference stems mainly from the somewhat broader approach to business growth in Irish firms, where the numbers pressure imposed by the stock market is not as prevalent. This means that the unrelenting emphasis on ROI at all levels is not as strong and, crucially for effective ROI data collection, the systems and recording methods in Irish companies are not as comprehensive. It is also clear from the pilot that Irish management place as much importance on intangible benefits of training as they do on the financial return. Most Irish companies are in fact small by American standards, even if considered large in Ireland. This makes it more difficult to implement some of the key ROI process techniques like control groups.

10.1.5. Adoption of the Evaluation/ROI methodology is enhanced by its inclusion of business-friendly approaches that emphasise practicality, simplicity, transparency, and efficiency. What the Phillips ROI model brings that is new and important is a universally standardised approach. As can be seen in this pilot project, a logical, rational approach to training evaluation has the following attributes: simplicity; reliable sampling; inclusion of all relevant factors; buy-in from management and workers; and clear communication of results. Furthermore, the ROI process should be economical (implemented easily as a routine part of training), credible, scientifically rigorous, applicable to a variety of training categories, flexible (applied either before or after training), and applicable with all types of data.

10.1.6. The ROI process is not a difficult concept to grasp, though it does require training, but the application of the ROI model has to be in line with relevant systems to collect the data required, especially in areas where the training concentrates on skills which are support based rather than production based.

10.1.7. Barriers to the implementation of ROI do exist. But there are ways to address and eliminate these barriers and achieve success with ROI. The ROI methodology will require additional time, costs, and new skills to bring change to the design, development, and implementation of solutions. But when the payoff of ROI is considered, the benefits exceed the investment, usually by large amounts. For many Irish companies adoption of a full level 5 ROI analysis may seem to be a daunting objective. In many cases it may be appropriate, initially, to encourage companies to extend evaluation from level 1 to 2 to 3 and then to 4 and 5. Raising the bar in a steady but gradual way may be more effective. Nonetheless, it will be necessary for company personnel to gain the skills and knowledge of the full ROI process to enable this progression to occur in time.

10.1.8. The experience of this pilot suggests the following questions to ask when planning an evaluation:

> How will we measure whether the training objectives have been met?

> Are the training objectives written so that we can measure whether they've been met?

> Who and what will be evaluated?

> What is the purpose of the training?
> Is the training designed to increase knowledge, improve skills, change attitudes or change behaviour? The answer to this question can determine what levels of evaluation you perform.

> When will the evaluations occur?

> Is a pre-training baseline study necessary?

> If evaluations will require analysis of behaviour changes, what data will we collect to measure those changes?

> What types of information do we need to know? Is it enough to know whether the participants enjoyed or understood the course material?

> What do we need to know about the attitudes of training participants?

> How will we know whether more training is necessary?

10.1.9. In today's marketplace, constant change must be met with constant learning and growth. To meet that need, a consistently high investment in training is necessary. Common sense suggests that the companies that do the best job of deploying those investments will succeed over the long term. We see from this pilot that relatively small investments in training evaluation can pay big dividends. Effective evaluation of training efforts can tell companies some important things about what they still need to learn. In the case of Novartis a completely new approach to the identification of training needs, setting training objectives and monitoring results has emerged from the study undertaken under this pilot.

The Project
10.1.10. The independent evaluation (Section 9) states that the project was a success. It was well designed, managed and implemented. The responses to the evaluator's survey indicate that participants considered the project to be a success. In the evaluator's survey all participants would recommend the programme and the teaching/content aspects got high ratings. This is further confirmed by the analysis of the project's own evaluation systems as reported in the interim reports and by the evaluator's own observations and discussions at sessions. The project's use of before/after skills levels evaluation systems is a very good feature of this approach.

10.1.11. The specific objectives of the project were realised, as verified by the independent evaluator:

> To test models (successfully done in the case studies).

> To train staff (successfully done both in companies and also in Skillnets networks).

> To develop support tools (successfully done both in establishing a capability, formal training system and support materials).

> To publish the findings (done at the October workshop but also intended for more extensive publication later).

10.1.12. The project was strategically important and, according to the independent evaluation, well designed and excellently managed, and the awareness of measurement was increased. It is clear that the project achieved its proximate objectives.

Implementing the ROI Process
10.1.13. Based on observation and participation, and supported by the evaluation results, it was clear that the project team energetically worked to ensure smooth delivery. Overall, the project was well designed, well managed, well delivered, implemented according to plan and well documented. One caveat put forward by the independent evaluator was that the end of the programme should have been designed to include presentation of all the case studies. Clearly evaluation of the publication, awareness, diffusion and ongoing uptake cannot take place at present.

10.1.14. In addition to the specific project outputs of knowledge of ROI and other measurement techniques, case studies and awareness of the methods and the need for measurement, the project has also facilitated the emergence of a capacity to provide the Phillips methodology and diploma on an ongoing basis. It has effectively pump-primed a solid capacity to provide professional level training on the topic.

10.1.15. The case studies are considered by the independent evaluator to be very useful works and will be beneficial in future training. It is also clear from the project that the use of consultants contributed greatly to the undertaking and finalisation of the case studies. This is probably not financially viable on a continuing basis and is, in the evaluator's view, a significant finding or output of the pilot project.

10.1.16. Overall, it is clear that the project was well designed, had a clear implementation plan and had precise targets. As will be shown throughout the evaluation, the specific targets of assessing the models, training staff, developing support materials and publishing results have all been broadly achieved.

10.1.17. The most critical finding from the independent evaluation is that the consultant advisors were a significant determinant of the eventual successful completion of the measurement exercise. This will not continue in the future and training programmes must be designed to cope with this.

General Conclusions
10.1.18. The project has highlighted a need for a stronger focus on business benefits in training needs analysis and a generally more focused and robust training needs analysis.

10.1.19. The strategy driven four stage training cycle of identify need, plan, deliver and evaluate is inappropriately applied, with resources targeted at the delivery to the detriment of the other three stages.

10.1.20. This pilot shows that SMEs are perfectly capable, given the right training and support, of implementing evaluation of training right up to level 5. However, there is no doubting the challenge that the ROI process presents for SMEs with limited staff resources. Successful implementation of the model in an SME requires more time and attention from a more senior level of staff than would be required in a large enterprise. Undoubtedly ROI is not for everyone or every company.

10.1.21. The project shows that the returns to training are dependent upon several important factors and that training is best understood in the larger context of a firm's entire set of managerial and production strategies, functions and practices. Business returns to training are optimised when this entire set forms a coherent "organisational logic" supportive of training. Thus, a comprehensive ROI evaluation of training should take into account such factors as managerial leadership, company/project culture, work practices (e.g., teaming and multi-skilling), and incentive systems (e.g., compensation and recognition for performance).

10.1.22. It would appear to be the case that wider-spread use of measurement methodologies will not occur without the relevant training and awareness building.

10.1.23. The task of conducting rigorous and reliable ROI evaluation of training exceeds the resources and expertise of most companies. A significant aspect of this research project was the extensive training and support offered to the companies. Some element of this would appear to be necessary for rapid adoption of the model.

10.1.24. While senior management needs to take the initiative by requiring an ROI study, the task itself can be left to a staff member or, depending on how thoroughly a company wants the study conducted, to an outside consultant. Either way, the continuing involvement and support of senior management will have important implications for the successful outcome of the process.

10.1.25. Evaluation at this level of comprehensiveness, however, is more than just collecting and analysing data. Comprehensive evaluation requires a means to ensure that results are credible and that the evaluation process can be replicated. The lack of credibility of results can quickly diminish any evidence of programme impact. A variety of individuals should be able to use the same process to evaluate the same training programme with similar results. Without this potential for replication, the evaluation process itself loses credibility. The entire measurement and evaluation process has to be assessed to ensure that it incorporates all of the key elements necessary to provide a credible process and credible results. That is why the use of the Phillips model, which has such widespread adoption worldwide and has been subjected to rigorous and ongoing academic and business trials, is a worthwhile exercise.

10.1.26. Just doing the ROI calculation isn't enough; firms must commit to using the information to make necessary adjustments.
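By way of illustration of the calculation referred to in 10.1.26 and the benefit-cost analysis described in 10.1.3, a worked example is set out below. The figures are hypothetical and are not drawn from the pilot case studies; the adjustment used to isolate the effect of the training follows the conservative participant-estimate convention of the Phillips methodology.

Annual improvement in the business measure, in monetary terms: €50,000
Proportion of the improvement attributed to the training (participant estimate): 60%
Confidence placed in that estimate: 80%
Benefit attributable to the training: €50,000 x 0.60 x 0.80 = €24,000
Fully loaded programme costs: €8,000

Benefit-Cost Ratio (BCR) = €24,000 / €8,000 = 3:1
ROI (%) = (€24,000 - €8,000) / €8,000 x 100 = 200%

Read in this way, every euro invested in the programme is recovered and returns a further two euro in net benefit. A figure of this kind is only as credible as the isolation, data collection and costing steps that precede it.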
10.1.27. Action Planning appears to be a very appropriate business tool with which to gather ROI data. Both methodologies appear to be complementary and work well together.

10.1.28. Some of the evaluations at level 3 carried out under this pilot show the need for greater emphasis on ensuring transfer of learning to the work setting. They also show the importance of including supervisors and managers in the setting of objectives of training, since they will be required to provide an opportunity for trainees to use the new skills acquired when they return to the job.

10.2. Recommendations
10.2.1. The independent evaluator identified some key issues with regard to the possible further use of the model in Skillnets, as follows:

Based on the evaluations and discussions it seems that the personal role/visits of the advisor was critical in the successful completion of the measurement exercise, especially to levels 4 & 5. Because of cost this would probably not be a continuing feature and the training has to be designed to succeed without the individual advisor support. This would probably require a longer, more gradual and appropriately phased training programme than that designed for the pilot project. This could be a series of half or one day sessions while the specific case study (action learning experience) is being developed. The advisory role would be fulfilled by the lecturer.

We also recommend that groups be kept small and confined to people eventually intending to use the model, to facilitate the transfer of what are difficult concepts, especially at the level 4 stage. The primary objective is to get people evaluating training rather than simply to get an awareness of the concept.

Where other personnel need ROI training of a broader nature, it could be provided specifically for them as a group or groups.

The training should be rolled out and targeted separately at training advisors and actual users of the models/methods.

10.2.2. Training Needs Analysis
One of the most striking issues to emerge from the pilot was the inadequate training needs analysis and limited baseline data around skills and outputs in the companies. This is an essential ingredient in both a good ROI process and training outcome, and steps to improve practice in this area by Skillnets are recommended. At the very least some basic criteria should be set down by Skillnets as to what constitutes a satisfactory training needs analysis, and the following might usefully be included in such criteria:

> Context Analysis. An analysis of the business needs or other reasons the training is desired. The important questions being answered by this analysis are who decided that training should be conducted, why a training programme is seen as the recommended solution to a business problem, and what the history of the firm has been with regard to employee training and other management interventions.

> User Analysis. Analysis dealing with potential participants and instructors involved in the process. The important questions being answered by this analysis are who will receive the training and their level of existing knowledge on the subject, what is their learning style, and who will conduct the training.

> Work Analysis. Analysis of the tasks being performed. This is an analysis of the job and the requirements for performing the work. Also known as a task analysis or job analysis, this analysis seeks to specify the main duties and skill level required. This helps ensure that the training which is developed will include relevant links to the content of the job.

> Content Analysis. Analysis of documents, laws and procedures used on the job. This analysis answers questions about what knowledge or information is used on this job. This information comes from manuals, documents, or regulations. It is important that the content of the training does not conflict with or contradict job requirements. An experienced worker can assist (as a subject matter expert) in determining the appropriate content.

> Training Suitability Analysis. Analysis of whether training is the desired solution. Training is one of several solutions to employment problems. However, it may not always be the best solution. It is important to determine if training will be effective in its usage.

10.2.3. Implementation of Learning in the Workplace
A number of the case studies point to another issue of crucial importance to an effective ROI and training investment experience: was the learning gained subsequently applied on the job? The solution to this issue lies in the design, delivery and implementation of the training and again will benefit from some focused attention by Skillnets.

Training that is not directly job related will show a poor ROI. Training that is job related but faces obstacles in the workplace due to limited involvement by supervisors and other stakeholders in the design, delivery and implementation of training will also show a poor return on investment. It is recommended that Skillnets undertakes some further work in this area with a view to presenting companies/networks/trainers with best practice examples of what effective transfer of learning and implementation of learning in the workplace entails.

10.2.4. All ROI studies should, from the planning of the programme right through its implementation, identify and involve all stakeholders - such as Learners, Supervisors, Managers and Trainers - to ensure ownership, buy-in and commitment to the process.

10.2.5. Since the full level 5 ROI evaluation process is very comprehensive it may only be appropriate in about 5 percent to 10 percent of training programmes. Such programmes would probably have a very long life cycle, be very closely tied to operational goals, strategically focused, very expensive and have a high visibility.

10.2.6. Evaluation/ROI should always be sold, identified and implemented as a process improvement tool, not as a performance evaluation tool.

10.2.7. Evaluation/ROI should be used to strengthen/improve the learning/education process. Because it requires measurements before, during and after training, finding the ROI of a particular training programme is not a discrete event but should be part of an ongoing process. The focus on pre-training measurements also helps align a programme with the overall training and business strategy.

10.2.8. Addressing the training needs of the current and future Irish workforce can be best accomplished, as evidenced by Skillnets, through collaboration across industry and between industry and other players. In the same way, greater collaboration between Skillnets, other state agencies, academic institutions, companies, trades unions, and professional training and HR bodies will have the best chance of success.

10.2.9. It is recommended that Skillnets proceed to publish the two key resource materials produced under the project - the ROI Evaluation Process Handbook and Evaluation Workbook and the book of Case Studies. These books contain tools, job aids, templates, and techniques, and serve as a resource for application and implementation.

10.2.10. It is recommended that Skillnets undertake a dissemination of the outcomes of the current project which would aim to build a greater awareness of the benefits of the Evaluation/ROI process. This could be built around publication of the case studies and the outcomes of this pilot.
Final Word
The Evaluation/ROI process as related to training has its origins in the United States, primarily in the manufacturing sector - the logical birthplace of any process improvement. It quickly moved to the service sector, then to non-profits and healthcare, and on to government organisations. It is now starting to be used in the educational sector, where schools and universities struggle to show the value of their programmes.

Where initially the methodology was employed to show the impact of supervisor training, now it is used in all types of training programmes, from highly technical programmes to long-term executive development. Applications include coaching and management development programmes, such as business coaching, mentoring, and career development. Human resources programmes - such as orientation, compensation systems, recruiting strategies, employee relations initiatives, and retention solutions - have been evaluated successfully with the ROI methodology.

Finally, the number of individuals who have attended formal training in the methodology is staggering. Over 6,000 specialists and managers have attended almost 500 workshops conducted in major cities throughout the world. A thousand individuals have been prepared to implement this internally through an ROI certification process.

Now, in Ireland, the Skillnets pilot project has added a further 44 ROI practitioners to those previously trained under the aegis of Enterprise Ireland and AIB Group. This is a substantial leap forward in less than a year. Already the demand for greater implementation of this model from all corners of Irish enterprise is being heard.

This report has reviewed the implementation of this state-of-the-art method in conducting cost-effective, rigorous ROI analysis of training in eighteen Irish companies. The return on investment in training is attracting increasing attention from both academic and business researchers and policy makers, but evaluations still lag seriously behind the accelerated prominence training has attained over the last decade as a crucial strategic variable in the Irish economy.

Our investigation reveals a need for further attention to training and its evaluation in Irish industry. It confirms that the Kirkpatrick/Phillips model not only works in the way its creators say it should but is without doubt the leading edge technology of its kind in the world today. Shortcomings in present research on employer-led training across the entire economy present Skillnets with a historic opportunity to take a leadership role in this area of national concern. This kind of leadership has the potential to push out the frontier of our understanding of the relationship between training and firm/industry performance and thereby contribute significantly to the development of a world-class training system in Ireland.

References

Barrett, A. & O'Connell, P.J. (2003). Does Training Generally Work? The Returns on In-Company Training. Dublin: ESRI.

Birnbrauer, H. (1987). Evaluation techniques that work. Training and Development Journal, July, 53-55.

Bowsher, J. (1990). Making the call on the COE. Training and Development Journal, May, 65-66.

Brinkerhoff, R. O. (1988). An integrated evaluation model for HRD. Training and Development Journal, February, 66-68.

Bumpass, S. & Wade, D. (1990). Measuring participant performance - an alternative. Australian Journal of Educational Technology, 6(2), 99-107.

Bushnell, D. S. (1990). Input, process, output: A model for evaluating training. Training and Development Journal, March, 41-43.

Erkut, S. & Fields, J. P. (1987). Focus groups to the rescue. Training and Development Journal, October, 74-76.

Foxon, M. (1989). Evaluation of training and development programs: A review of the literature. Australian Journal of Educational Technology, 5(1), 89-104.

Hewitt, B. (1989). Evaluation: a personal perspective. Training and Development in Australia, 16(3), 23-24.

Kirkpatrick, D. L. (1994). Evaluating Training Programs: The Four Levels. San Francisco: Berrett-Koehler Publishers.

Lombardo, C. A. (1989). Do the benefits of training justify the costs? Training and Development Journal, December, 60-64.

Newstrom, J. W. (1987). Confronting anomalies in evaluation. Training and Development Journal, July, 56-60.

O'Donnell, J. M. (1988). Focus groups: A habit-forming evaluation technique. Training and Development Journal, July, 71-73.

Phillips, J. J. (1994). Measuring Return on Investment: Volume 1. Alexandria, VA: American Society for Training and Development.

Phillips, J. J. (1996). ROI: The search for best practices. Training & Development, 50(2), 42-47.

Phillips, J. J. (1997a). Handbook of Training Evaluation. Third edition. Houston, TX: Gulf Publishing.

Phillips, J. J. (1997b). Measuring Return on Investment: Volume 2. Alexandria, VA: American Society for Training and Development.

Phillips, J. J. (1997c). Return on Investment in Training and Performance Improvement Programs. Houston, TX: Gulf Publishing.

Poulet, R. & Moult, G. (1987). Putting values into evaluation. Training and Development Journal, July, 62-66.

Robinson, D. G. & Robinson, J. C. (1989). Training for Impact: How to Link Training to Business Needs and Measure the Results. San Francisco: Jossey-Bass Publishers.

Senge, P. M. (1990). The Fifth Discipline: The Art and Practice of the Learning Organization. New York, NY: Currency Doubleday.

Training (1996). Industry report. Training, 33(10), 36-79.

Weatherby, N. L. & Gorosh, M. E. (1989). Rapid response with spreadsheets. Training and Development Journal, September, 75-79.

Wigley, J. (1988). Evaluating training: Critical issues. Training and Development, 15(3), 21-24.
Skillnets Ltd
Level 1, Frankfort Court
Dundrum Road, Dublin 14
Phone +353 1 207 9630
Fax +353 1 207 9631
Email info@skillnets.com
www.skillnets.com

The Training Networks Programme is an initiative of Skillnets Ltd and is funded from the National Training Fund through the Department of Enterprise, Trade and Employment.
