
Occasional Papers: Issues in Training

Paper No. 10

Assessing Training Impact: IRRI's New Evaluation Approach


P.L. Marcotte, R. Bakker-Dhaliwal, and M. Bell

January 2002
Training Center
International Rice Research Institute (IRRI)

FOREWORD
Occasional Papers: Issues in Training is a series of papers for circulation among IRRI scientists interested in and working with the Training Center. The paper series is a fast and flexible means of presenting issues; presenting plans for training; providing information, results, and impact of training efforts; and providing a forum for discussion of methods, approaches, and dynamics of training events and materials.

These papers have not been edited and are works in progress. It is intended that interested readers will respond directly to the authors, providing comments, suggestions, and professional critique.

PAUL L. MARCOTTE Head, Training Center IRRI

Assessing Training Impact: IRRI's New Evaluation Approach


Paul Marcotte, Reena Bakker-Dhaliwal, and Mark Bell*

A paper prepared for

The Standing Panel on Impact Assessment Consultative Group on International Agricultural Research (CGIAR) and The International Maize and Wheat Improvement Center (CIMMYT)

International Conference on Impacts of Agricultural Research and Development: Why Has Impact Assessment Research Not Made More of a Difference?
San Jose, Costa Rica, 4-7 February 2002

* Director of the Training Center, Training Evaluation Specialist, and Director of the International Programs Management Office, respectively, at the International Rice Research Institute, Los Baños, Philippines

Table of Contents

Abstract
1. Introduction: The Importance of Training and Delivery
   A Brief History
   The Strategic Plan for Delivery for Impact
2. Theory of Evaluation
   Generalized Model
   Level 1: Provision of Training: Event Analysis
   Level 2: Completion of Training: Personal Acquisition
   Level 3: Resumption of Job-Related Activities: Follow-Up
   Level 4: Changed Organizational Performance: Impact
3. Level 1 Summary/Analysis of Training Events-2001
4. Level 3 Case Examples of Training Assessment
   Follow-up: Thailand
   Delivery and Follow-Up: Nepal Case
5. Conclusions
Bibliography

Abstract
Paul Marcotte, R. Bakker-Dhaliwal, and M.A. Bell

Since 1962, over 13,000 rice professionals from all over Asia have benefited from in-house training offered at IRRI, while several thousand more have been trained by IRRI staff in-country. Although training events have been regularly evaluated, such evaluation has been inconsistent and lacked a conceptual model for long-term impact assessment. There is currently no complete institutional memory of training, and the baseline data available for comparison are incomplete, making long-term impact difficult to assess. The IRRI Training Center's (ITC) new approach to evaluation, instituted in 2000, establishes evaluation as a proactive process rather than a reactive one conducted at the end of training. The objectives of the new evaluation structure are to: a) standardize and systematize the monitoring of IRRI training; b) generate consistent baseline evaluation data for all training; c) use evaluation results, both during and after an event, to modify courseware to meet client needs for improved delivery; d) use the baseline data for follow-up evaluations; and e) ultimately progress sequentially to higher levels of evaluation to assess impact at individual, institutional, and national levels.

ITC's new model for evaluation comprises four levels, based originally on a process proposed by Donald L. Kirkpatrick 40 years ago. Although the approach is not "new," its systematic implementation at ITC is. At Level 1, training events are evaluated to gauge participant satisfaction with course content, presentations, and usefulness to the participants' work, both during (for courses longer than 2 weeks) and at the completion of each event. Level 2 focuses on personal acquisition of knowledge, skills, and attitudes, using pre- and post-tests. Level 3 requires follow-up 3 months to a year after an event to assess the extent to which skills and knowledge have been applied to job-related activities. At Level 4, evaluations are conducted 1 to 3 years after a training event to assess organizational change resulting from skill/knowledge transfer, using a baseline comparison built from the previous three levels.

Although the ITC has a new and comprehensive framework for impact evaluation, it is recognized that institutional, human resource, and practical limitations have so far prevented its full implementation. Adding to this challenge, the ITC is expanding its training opportunities with numerous multimedia technologies, which demand a modified and in some cases a completely different approach to evaluation. This paper focuses on some of the difficulties, the successes, and the remaining challenges in fully implementing a systematic and comprehensive evaluation program.

Keywords: Impact Evaluation, Training Evaluation, IRRI, Evaluation Methodology

1. Introduction: The Importance of Training and Delivery


A Brief History

IRRI has been conducting training since the inception of the institute more than 40 years ago. The inscription on the plaque of the cornerstone reads:

International Rice Research Institute
An educational and research center
Dedicated principally to the study and
Improvement of rice:
The world's major food crop.

This cornerstone was set in 1961 and signed by representatives of the Ford and Rockefeller Foundations and the President of the Philippines. IRRI has remained steadfast in its attempts to achieve its goal. IRRI's goal today is the same as the original goal, in fact and in spirit: "To improve the well-being of present and future generations of rice farmers and consumers, particularly those with low incomes."

The training effort has been consistent with the goals and objectives of the institute. Despite changes in the organization, its funding base, the continual adjustment to changing clients and their needs, changing physical, political, social, and economic conditions, and the alignment of training within the organization, IRRI's training program has remained true to its original mandate. Today, the IRRI training objectives are:

- To generate and disseminate rice-related knowledge and technology of short- and long-term environmental, social, and economic benefit
- To enhance national rice research systems.

Since its inception, IRRI has always considered training to be one of the key mechanisms in disseminating its accumulated knowledge and technology to a broad audience. Through its degree and non-degree programs, it has developed human resource capacity in the rice research sector. Its distance learning program and materials are models for adult learning. Its on-campus and regional training have literally changed the approach and method of rice production in the world. Training has played a major part in the process historically and remains an essential technique for multiplying IRRI's impact. IRRI has defined training as the process that enables individuals to acquire knowledge, skills/tools and abilities that will allow them to fulfill the requirements of their job, achieve their career aspirations and attain the goals of the organization. Thus, training is multifunctional as it plays a vital role in individual staff development,

provides essential scientific information to organizations, and ultimately enhances productivity, income, and livelihood (Marcotte et al., 2000).

The Strategic Plan for Delivery for Impact

To ensure the success of this process, in 2000 a small group of IRRI trainers and scientists began to review the delivery programs, needs assessments, and materials for training. What became immediately clear was that planning of delivery had taken a back seat to the actual delivery. Courses had been delivered successfully for years, but materials had not been updated; new scientific findings had not been incorporated; choices for the range of courses delivered were ad hoc, dependent on the scientists' time and interest rather than on the needs of clients; and the evaluative process was vague and piecemeal at best.

As a result of this review, a strategic plan was written (Marcotte et al., 2001). This plan incorporated all aspects of strategic planning: Phase 1, a diagnostic phase, in which a vision was created, a stakeholder analysis was performed, and an assets analysis (SWOT) was conducted; Phase 2, in which priorities were set; Phase 3, in which an implementing process was established; and Phase 4, in which a maintenance and evaluative process was begun.

By the fall of 2000, an Evaluation and Monitoring Unit was established and operational within the Training Center. This newly established unit was charged with identifying and assessing the impact of the IRRI training program. The unit had two primary activities: 1. to evaluate training events; and 2. to assess the impact of country projects. The remainder of this paper analyzes the first of these activities, the training event evaluation process.

2. Theory of Evaluation
Evaluation is a process that is neither simple nor easy. The process must start with the establishment of a baseline and assessment of skills, knowledge, and delivery, followed by follow-up and then determination of impact. Thus a simple question such as "Why has impact assessment research not made more of a difference?" is not easy to answer, and in many cases may be impossible to answer. Based on the Strategic Plan "Delivery for Impact: The Dissemination of Rice-Related Knowledge and Technology" (Marcotte et al., 2001), the IRRI Training Center established a process so that, ultimately, impact could be assessed. Without the entirety of this process, the essential question of the impact of research and dissemination cannot be answered. While the approach itself may not be novel (in accordance with the guidelines of this conference), the accomplishment of the entire process necessary to assess impact appears to be novel in the history of the CG.

Generalized Model
IRRI has adopted a generalized model for training event design and delivery. This model is composed of 8 essential elements, beginning with needs assessment and progressing through the setting of an agenda, construction of modules, testing and validation, redesign, materials production, and delivery. The process is interactive, with evaluation validating individual elements and circumscribing the entire process.

Figure 1: The Design and Delivery of a Training Event

[Figure 1 depicts the development sequence: Assessment of Needs -> Writing Prospectus/Agenda -> Module Construction -> Testing and Validation -> Redesign/Modification -> Material Production -> Delivery, with Evaluation surrounding the cycle and providing validation/evaluation feedback at each step.]

Evaluation is the final step in the process and confirms (or denies) the approach. Unfortunately, all too often, the following quote, while relatively old, is still accurate: "Millions for training, but not one cent for evaluation. By design or happenstance, this is an all-too-common occurrence. In many instances it is assumed that training programs have been effective because participants enjoyed the presentations" (Cascio and Awad, 1981: 307).

Given this understanding and approach, the IRRI Training Center established a continuous evaluation of its program. The conceptual model and explanation of the various levels follow:

Figure 2: Evaluation Type/Sequence

Level 1: Training Event Evaluation. Assesses satisfaction of trainees. Instrument: event assessment questionnaire. Timing: completion of training.

Level 2: Skills/Knowledge Acquisition. Assesses change in knowledge, skills, and attitudes. Instrument: pre-test/post-test. Timing: completion of training.

Level 3: Skills/Knowledge Transfer. Assesses extent of application of skills/knowledge to job-related activities. Instrument: survey (interview and/or questionnaire). Timing: 3 to 6 months after training.

Level 4: Organizational Performance Change. Assesses organizational change as a result of skills/knowledge transfer and incorporation. Instrument: baseline comparison. Timing: 1 to 3 years after training.
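The four-level sequence above can also be expressed as a small data structure, which is convenient when scheduling evaluations across many events. The Python sketch below simply restates the model; the class and function names are ours for illustration, not part of any IRRI system:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EvaluationLevel:
    """One level of the four-level evaluation model (after Kirkpatrick)."""
    level: int
    focus: str
    instrument: str
    timing: str

# The four levels as summarized in Figure 2.
EVALUATION_LEVELS = [
    EvaluationLevel(1, "Trainee satisfaction with the event",
                    "Event assessment questionnaire", "Completion of training"),
    EvaluationLevel(2, "Change in knowledge, skills, attitudes",
                    "Pre-test/post-test", "Completion of training"),
    EvaluationLevel(3, "Application of skills/knowledge on the job",
                    "Survey: interview and/or questionnaire", "3-6 months after training"),
    EvaluationLevel(4, "Organizational change from skills/knowledge transfer",
                    "Baseline comparison", "1-3 years after training"),
]

def instrument_for(level: int) -> str:
    """Look up the data-collection instrument for a given level."""
    return next(l.instrument for l in EVALUATION_LEVELS if l.level == level)
```

Encoding the model as data rather than prose makes it straightforward to generate, for each training event, the list of evaluation activities and their due dates.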

Level 1: Provision of Training: Event Analysis The evaluation of Level 1 focuses on the training event itself as it assesses the satisfaction of trainees with the trainers, the facilities, and the content of the program. These are 'reaction' criteria that measure the trainee's impressions and/or feelings about the program. This is commonly accomplished with an evaluation form administered at the end of the event, or at properly sequenced intervals during an extended training event.

While IRRI has used a variety of formats for this level in the past, the Training Center now has a standard data collection format so that the information collected on events is comprehensive and comparable across events. This data collection is a necessary step in understanding the learning and success of an event and, more importantly, is essential to the training program evaluation. The recommended reporting format includes the following four parts:

Part 1: Report of Participants' Event Evaluation Form
   a. Review of the workshop on general reactions such as overall rating, meeting of objectives, and strengths and weaknesses
   b. Assessment of topics' content, presentations, usefulness, and time allotment
   c. Features
   d. Additional topics
   e. General comments
Part 2: Summary of Participants' Evaluation
Part 3: Conclusions/Recommendations
Part 4: Annexes

Level 2: Completion of Training: Personal Acquisition


Level 2 evaluation is conducted at the completion of the training event and measures the increased knowledge and changed skills and attitudes trainees acquired as a result of the event. This refers to the learning criteria, which are rigorous measures of training outputs such as exams or performance tests. This level is not required or appropriate for all events; however, for certain types of targeted skill training, pre-tests and post-tests on the skills offered during the course provide an assessment of the acquisition that occurred as a result of the course. Since IRRI courses are normally of short duration, contamination factors such as history, maturation, testing, and instrumentation will not seriously affect the measurement of change.
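As a minimal illustration of the pre-test/post-test idea, the sketch below summarizes paired scores for a hypothetical group of trainees; the data and the 0-100 scale are invented for the example:

```python
def knowledge_gain(pre: list[float], post: list[float]) -> dict:
    """Summarize Level 2 acquisition from paired pre-/post-test scores.

    pre[i] and post[i] are the same trainee's scores (0-100 scale assumed).
    """
    if len(pre) != len(post) or not pre:
        raise ValueError("pre and post must be equal-length, non-empty lists")
    gains = [b - a for a, b in zip(pre, post)]   # per-trainee change
    n = len(gains)
    mean_gain = sum(gains) / n
    improved = sum(1 for g in gains if g > 0)    # trainees who scored higher
    return {"n": n, "mean_gain": round(mean_gain, 2),
            "share_improved": round(improved / n, 2)}

# Hypothetical scores for four trainees:
summary = knowledge_gain(pre=[40, 55, 60, 35], post=[70, 65, 80, 50])
# summary["mean_gain"] -> 18.75, summary["share_improved"] -> 1.0
```

Pairing each trainee's pre- and post-scores, rather than comparing group averages, is what lets the gain be attributed to the course itself.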

Level 3: Resumption of Job-Related Activities: Follow-Up


Level 3 evaluation is ex post, conducted three to six months after trainees have returned to their jobs and resumed their activities. This level evaluates the application of the newly acquired knowledge, skills, and attitudes as they relate to job activities. These are behavioral criteria that indicate a positive transfer from training to the job. This evaluation requires systematic observation of behavior, which should be accomplished by a supervisor and supplemented by self-reports from the trainee. These reports should be submitted to the trainers, the training units, and appropriate NARES and funding organizations.


Level 4: Changed Organizational Performance: Impact


Level 4 evaluation assesses organizational change as a result of the training received. While the extent to which training has improved organizational performance is the ultimate measure of the success of training, it is the most difficult level to assess because change is difficult to attribute. It is essential at the outset that the expected result is clearly defined and a tangible product is possible, so that empirical evidence can be collected and measured against some anticipated or expected outcome. Sections 3 and 4 below present evidence of our current efforts in implementing and documenting the various evaluative levels.

3. Level 1 Summary/Analysis of Training Events-2001


During 2001, a total of 13 events underwent the Level 1 evaluation process. Five content courses on nutrient management, rice production, and seed health and production, three support courses, three workshops, and one pilot-study online course were included in the Level 1 event evaluation (Table 1). Some of these courses and events were conducted at IRRI headquarters in Los Baños, Philippines, while others were implemented regionally in Thailand and China. As indicated in Table 1 and Figure 3, 92% of the events had satisfaction ratings of good to excellent. The satisfaction scores ranged from 4.67 to 3.69 (between excellent and above average). All the content and methodology-related courses rated 4.0 (good) or better. These courses are targeted primarily at mid- to high-level researchers and scientists, with the exception of the two general Rice Production courses. Support courses were organized primarily for in-house skills building: the PowerPoint courses were delivered to a diverse group of IRRI staff from various divisions, while the cross-cultural training event was planned solely for Training Center staff. The workshop sub-category includes events organized in the field, either with farmers and/or extension agents.


Table 1. Event satisfaction rating of 308 participants for 13 events in 2001.

| Event | Acronym | Duration | Participants | Overall Satisfaction (Wt. Avg.)* | Std. Dev. |
|---|---|---|---|---|---|
| Content Courses | | | | | |
| 1. Strategic Research in Integrated Nutrient Management | INM | 4 weeks | 15 | 4.67 | 0.47 |
| 2. Rice Production I | RP I | 2 weeks | 27 | 4.61 | 0.57 |
| 3. Rice Production II | RP II | 2 weeks | 31 | 4.41 | 0.65 |
| 4. Rice Seed Health for Crop Management | Seed Health | 8 weeks | 6 | 4.33 | 0.47 |
| 5. High Yield Seed Production of Hybrid Rice | Hybrid Rice | 4 weeks | 24 | 4.00 | 0.77 |
| Methodology Courses | | | | | |
| 6. Training of Trainers | TOT | 4 weeks | 22 | 4.44 | 0.60 |
| 7. Multi-Agent Systems for NRM | MAS | 2 weeks | 16 | 4.27 | 0.56 |
| Support Courses | | | | | |
| 8. PowerPoint I | PowerPoint I | 1 day | 15 | 4.57 | 0.49 |
| 9. PowerPoint II | PowerPoint II | 2 days | 13 | 4.15 | 0.64 |
| 10. Cross-cultural Awareness and Sensitivity | Cross-cultural | 2 days | 21 | 3.69 | 0.72 |
| Workshops | | | | | |
| 11. Valuing and Promoting Indigenous Rice Varieties in Cagayan Valley (extension agents) | Cagayan (Extension) | 2 days | 30 | 4.56 | 0.50 |
| 12. Valuing and Promoting Indigenous Rice Varieties in Cagayan Valley (farmers) | Cagayan (Farmers) | 2 days | 59 | 4.33 | 0.47 |
| 13. Farm Walk (farmer-led Iloilo farm demonstration program) | Farm Walk | 2 days | 29 | 3.91 | 0.65 |
| Totals and Averages | | | 308 | 4.31 | 0.62 |

*5=Excellent, 4=Good, 3=Average, 2=Fair, 1=Poor
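The overall satisfaction column in Table 1 is a weighted average of 1-5 ratings (5 = Excellent through 1 = Poor). A minimal sketch of how such a score and its standard deviation can be computed from response counts follows; the counts used in the example are hypothetical, not taken from the table:

```python
def likert_summary(counts: dict[int, int]) -> tuple[float, float]:
    """Weighted average and population std. dev. of 1-5 Likert ratings.

    counts maps a rating (5=Excellent ... 1=Poor) to the number of
    participants who gave that rating, mirroring how the per-course
    scores in Table 1 are derived.
    """
    n = sum(counts.values())
    if n == 0:
        raise ValueError("no responses")
    mean = sum(r * c for r, c in counts.items()) / n
    var = sum(c * (r - mean) ** 2 for r, c in counts.items()) / n
    return round(mean, 2), round(var ** 0.5, 2)

# Hypothetical response counts for one course:
mean, sd = likert_summary({5: 10, 4: 4, 3: 1, 2: 0, 1: 0})
# mean -> 4.6, sd -> 0.61
```

Reporting the standard deviation alongside the mean, as Table 1 does, distinguishes a course everyone rated "good" from one that split opinion between "excellent" and "average."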


Figure 3: Overall course satisfaction ratings for courses delivered in 2001

However, evidence of participant satisfaction alone does not demonstrate the efficacy or value of a training event, nor does it help improve the courseware for better delivery and increased impact in the future. To increase the usefulness of the data collected, the TC Level 1 evaluations also collect data on the process (How were the presentations? Were objectives met? How was the organization and management of the course?) and on the specific content and its usefulness to work, topic by topic. For courses of 2-4 weeks, this content-specific evaluation is administered weekly; for courses of 4-8 weeks, it is administered bi-weekly. This is also partly a function of where the course is delivered, i.e., at the IRRI Training Center or regionally. The results of this more rigorous approach are summarized in Table 2 and Figure 4.
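The scheduling rule above (weekly interim evaluations for courses of 2-4 weeks, bi-weekly for courses of 4-8 weeks) can be sketched as a small helper. The treatment of courses shorter than 2 weeks and of the exact 4-week boundary is our assumption, since the text does not spell them out:

```python
def interim_evaluation_weeks(course_weeks: int) -> list[int]:
    """Weeks at which interim Level 1 evaluations fall.

    Rule sketched from the text: weekly for 2-4 week courses,
    bi-weekly for longer courses; shorter events get only the
    end-of-event evaluation (our assumption). A course of exactly
    4 weeks is treated as weekly (also our assumption).
    """
    if course_weeks < 2:
        return []                      # end-of-event evaluation only
    step = 1 if course_weeks <= 4 else 2
    return list(range(step, course_weeks + 1, step))

# An 8-week course would be evaluated at weeks 2, 4, 6, and 8:
# interim_evaluation_weeks(8) -> [2, 4, 6, 8]
```

Making the cadence explicit like this is what allows the content-specific feedback to be compared across courses of different lengths.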


Table 2. Course ratings for content, usefulness to work, and presentations for courses delivered in 2001.

| Event | Content (Wt. Avg.) | Usefulness (Wt. Avg.) | Presentations (Wt. Avg.) | Content (Std. Dev.) | Usefulness (Std. Dev.) | Presentations (Std. Dev.) | Overall Average* |
|---|---|---|---|---|---|---|---|
| Content | | | | | | | |
| INM | 4.24 | 4.02 | 4.19 | 0.61 | 0.77 | 0.66 | 4.15 |
| RP I | 4.28 | 3.94 | 4.16 | 1.04 | 1.01 | 1.03 | 4.13 |
| Hybrid Rice | 4.16 | 3.93 | 3.98 | 0.70 | 0.74 | 0.68 | 4.02 |
| RP II | 4.18 | 3.70 | 3.97 | 0.69 | 0.90 | 0.74 | 3.95 |
| Seed Health | 4.12 | 3.59 | 4.04 | 0.41 | 0.40 | 0.34 | 3.92 |
| Methodology | | | | | | | |
| MAS for NRM | 4.20 | 4.16 | 4.10 | 0.60 | 0.60 | 0.56 | 4.15 |
| TOT | 4.40 | 4.16 | 4.35 | 0.64 | 0.80 | 0.63 | 4.30 |
| Support | | | | | | | |
| PowerPoint I | 4.33 | 4.10 | 4.31 | 0.67 | 0.78 | 0.64 | 4.25 |
| PowerPoint II | 4.27 | 3.99 | 4.16 | 0.72 | 0.82 | 0.73 | 4.14 |
| Cross-cultural | 3.53 | 3.44 | 3.39 | 0.73 | 0.68 | 0.72 | 3.45 |
| Workshops | | | | | | | |
| Cagayan (Extension) | 4.22 | 3.80 | 3.96 | 0.68 | 0.80 | 0.75 | 4.00 |
| Iloilo farm walk | 3.89 | 3.93 | 3.89 | 0.79 | 0.50 | 0.79 | 3.90 |
| Cagayan (Farmers) | 3.94 | 3.38 | 3.87 | 0.84 | 0.75 | 0.87 | 3.73 |
| Totals and Average | 4.16 | 3.86 | 4.06 | 0.68 | 0.74 | 0.68 | 4.03 |

*5=Excellent, 4=Good, 3=Average, 2=Fair, 1=Poor

The summaries in Table 2 and Figure 4 indicate that although participants are generally quite satisfied with the topics' content and the presentations, they are more likely to question the usefulness of the information delivered to their work. The lower scores for usefulness prompt the TC to reflect on questions such as: Do we, the deliverers of information, truly understand the jobs and needs of the participants? Are we failing to clearly emphasize the usefulness of the information we deliver? How can we re-sequence course topics so that their usefulness is more apparent? Are our training needs assessments really identifying participant needs? Are IRRI or our NARES partners targeting and sending the wrong individuals to a particular course? Although these are broad questions that will need to be addressed at an institutional level, practical results that aid in restructuring a course are also acquired during the weekly evaluations. Uncertain results are often clarified or confirmed in the open-ended sections of the evaluation form or in focused discussions at the end of a training event.


Figure 4: Course ratings for Content, Usefulness to work, and presentation for courses delivered in 2001.

4. Level 3 Case Examples of Training Assessment


Follow-up: Thailand

In January 2001, a visit to Thailand was completed with the following objectives:
1) Conduct follow-up interviews (Level 3 evaluation) with individuals who had taken courses in 2000.
2) Facilitate focus groups for participants from previous years, for a) pre-testing the Impact Assessment form (Level 4), which they had received prior to our arrival, and b) pre-testing the Information and Communication Technology Capability and Assessment form.

Interviews and focus groups were arranged at Ubon Ratchathani University, Ubon Rice Research Center, Khon Kaen Rice Research Center, Pathum Thani Rice Research Center, the IRRI Bangkok Office/Rice Research Institute, and Kasetsart University. The following case study details some of the results of the Level 3 assessments (objective 1).

A total of 12 participants from Thailand attended IRRI training courses in 2000. Of these, 8 were available for follow-up interviews. Interviews lasted approximately 30 minutes on average. Table 3 summarizes the courses for which follow-up was conducted and their respective numbers of Thai participants. Both quantitative and qualitative data were gathered regarding each course.


Table 3: Year 2000 courses that underwent Level 3 (follow-up) assessments and their respective number of Thai participants
| Course Name | # Thai Participants |
|---|---|
| Genetic Evaluation and Utilization | 1 |
| Multi-Agent System for NRM | 4 |
| Rice Seed Health | 1 |
| Use of IT in Reaching Farmers | 1 |
| Genetic and Environment | 1 |
| Total | 8 |

Figure 5. Course ratings for design/delivery and content usefulness

In response to a general question regarding overall course usefulness in retrospect, all the interviewees rated the courses as good (4.0) on a scale of 1 to 5, where 1 is poor, 2 is fair, 3 is average, 4 is good, and 5 is excellent. However, when asked more specific questions related to content/topic usefulness and design/delivery of the course, the results were more variable, as presented in Figure 5. Once presented with the course syllabus, the interviewees were able to be fairly specific about which topics were useful in their current work and which were not. Of the 8 interviewees, 5 indicated that they had not yet used the knowledge and skills gained from their training in their current work, although all expected to use them in the future. Reasons given for not yet applying the new technology in their current work included: 1) it was still too soon after the training, 2) the person was still a new researcher, and 3) other job responsibilities interfered with applying the new technology. Of the 3 interviewees who were already using the new technologies, all had academic or job-related responsibilities that could easily and directly incorporate the newly acquired information.

In terms of qualitative data, some interesting comments related to issues that the Training Center frequently encounters and struggles with, such as a) gender imbalance within a training event, b) domination of one national group within a regional course (i.e., when 50% or more of the participants represent a single country), and c) English language capabilities. Many suggested that the IRRI Training Center change its admission policy regarding trainees. One interviewee summed up the general opinion as follows: "Many people (especially field technicians) in my workplace would benefit greatly from IRRI's training but are ineligible to apply because of their formal education level. Since field technicians tend to do the actual work in the field/lab/etc., many of IRRI's trainings would be of the most practical use to them. The training will also provide them with an opportunity to move up professionally; the field technician position is usually very stagnant."

The Thai follow-up experience was useful for planning a framework for future Level 3 assessments. Although in-person follow-up is more expensive than methods such as mail, phone, or email, it may be more appropriate to the diverse Asian environment within which the Training Center functions. During the interviews, the response and accuracy rates were higher, as questions were easily clarified; many of our interviewees found it difficult to express their thoughts in writing. In addition, the Thai translator observed that interviewees gave more elaborate answers when the questionnaire questions were asked verbally, whereas they left many of the open-ended questions unanswered on the written form, even when encouraged to write in Thai.

Delivery and Follow-Up: Nepal Case

In addition to applying the evaluative process to training courses, the same process is being extended to technology transfer activities and workshops in the field.
In the spring of 2001, a small team of IRRI scientists, composed of J.K. Ladha (Crops, Soil and Water Systems Division), V. Balasubramanian (CREMNET), and Paul Marcotte (Head, Training Center), conducted a workshop for researchers and extension agents of the Nepal Agricultural Research Council (NARC). The content of the workshop was nutrient management, specifically the Leaf Color Chart (LCC). The LCC is a simple tool for determining the nutrient needs of rice plants. While simple in application, the LCC is based on sophisticated and complex science, and it has been field-tested extensively in Indonesia, Vietnam, and the Philippines. The basic concept is that the color of the rice plant's leaves indicates the nutritional inputs required for a healthy stand and maximum yield. Research in these countries has indicated that inputs and their costs are minimized, and yields maximized, because nutrient uptake is most efficient when inputs are applied at the correct time rather than at prescribed intervals. In other words, inputs are based on readings indicating needs in the actual field rather than on laboratory or generalized recommendations.


Twelve researchers and extension agents attended the mini-workshop held at the NARC facilities on the outskirts of Kathmandu. The workshop took half a day and included lectures, discussions, and hands-on practice. The researchers were taught to use the LCC and to design a field-based research experiment. Each was supplied with a package of materials, including the LCC, instructions for use, and a guide for the design and collection of baseline information.

Several months later, the team re-visited Nepal. The purpose of the follow-up visit was to review the sites selected for field experiments. The first of these was a site designed with local farmers by Raj Schresta, a NARC researcher, in the foothills north of Kathmandu near Nagarkot. Experiments were arranged on the terraces to compare the traditional farmer practice of nutrient inputs with standard recommendations, with the LCC, and with a control field. Once the experiments were designed, the analysis and input decisions were up to the farmers, who had been trained in the LCC approach. Preliminary visual inspection of the fields indicated that the LCC fields were outperforming all others: the plants were healthier, the canopy was fuller, and the prediction was that the LCC field would out-yield the others by 20%. The farmers instructed Mrs. Kamba, Chair of the IRRI Board of Trustees, in the LCC methodology (see left photo below). The team then proceeded to one of the participating farmers' homes for a discussion about the experiments.

A Nepalese farmer shows Mrs. Kamba how to use the LCC.

Researchers show Mrs. Kamba their demonstration plots.


A second trip within Nepal was made to Parwanipur in the Terai, along the Indian border. Local researcher and IRRI PhD scholar Regmi had arranged for several experiments to be conducted in the area. The results were the same as above: all the LCC experiments outperformed traditional practice, both at the experiment station and in farmers' fields. As above, the team met with local farmers to discuss the outcomes of the trials and experiments. This LCC technology transfer workshop and its follow-up show direct potential impact on farmers' livelihoods. However, experiences of this type need to be more systematically recorded for future Level 4 assessments.

5. Conclusions

There are a number of conclusions that can be drawn about the evaluative process that has been established. Some are specific to the process, and some are lessons learned about specific training interventions.

The Process:

With respect to Levels 1 and 2, these are time consuming, and the staff must be dedicated and vigilant. Trainers must be briefed on the need for and heuristic value of the documents, and the summaries must be prepared in a short time frame for immediate use. When designed and administered appropriately, Level 1 evaluations are the best of the four levels with respect to cost and efficiency. Because L1 evaluations depend on participant responses, data collection is quick and inexpensive. In addition, the information collected is useful, and the immediate return on investment (ROI) can be significant.

With respect to Level 2, it must be understood that pre- and post-tests are not appropriate for all courses. When appropriate, they are powerful measures of skills and information acquisition directly related to training.

With respect to Level 3, it is a very difficult and costly proposition that requires a great deal of cooperation. This is especially true with the interview format, when the monitoring and evaluation staff must visit multiple trainees in a number of countries. However, Level 3 provides essential information on the use value of the learning received in the training events. Incorporating on-site supervisors in the process streamlines the information flow and creates a vested interest in the training received. Follow-up (Level 3) interviews are useful instruments for gauging the application of a new technology to an individual's work, and therefore impact assessment at a personal and organizational level. But due to their associated expense, they need to be carefully targeted to specific countries or groups of people as a function of IRRI's relationship priorities with the NARES.

With respect to Level 4, there is a need for clearly defined expected results so that there can be unequivocal findings and attribution. Unfortunately, it is often not possible to show that personal or organizational improvement can be attributed to the training. However, if the skills learned are not being used, then the training may be considered a failure. This may be the ultimate danger of conducting impact evaluations: it may be found that, despite the positive results and apparent success of the training, the skills and information are not in use or effective.

Specific Training Events:

Follow-up in the Nepal case indicated that extension agents understood the workshop content and were able to replicate the workshop, and that local farmers were able to understand and use the technology successfully. A simple tool like the LCC, based on sound science and field-tested for performance, was adopted by farmers and became part of their analytic, systems-based approach to production. In this case, farmers saved money on inputs and produced higher yields for consumption and the market.

With respect to the Thailand follow-up, there were indications that the selection and targeting of participants may not have been the most appropriate for the courses. This is a constant concern: often, participants are selected on the basis of reward rather than need. One effort to remedy this is the incorporation of In-Country Liaison Scientists' recommendations on participants.

Although we have yet to complete the whole evaluation process, from Level 1 to Level 4, in its entirety, we have started to discriminate between what is feasible and what is useful, and to learn how to carry out the process successfully while obtaining the necessary baseline data for long-term impact analyses.


Bibliography

Cascio, W.F. and E.M. Awad. 1981. Human Resource Management. Reston, Virginia: Reston Publishing Company, Inc.

Kirkpatrick, D.L. 1998. Evaluating Training Programs: The Four Levels. 2nd ed. San Francisco, California: Berrett-Koehler Publishers.

Marcotte, P.L. 2000. Guidelines for the Dissemination and Impact of IRRI Information (GNI-05). IRRI, Los Baños, Philippines.

Marcotte, P.L., M. Bell, M. Quiamco, G. Castillo, and S. Morin. 2001. Delivery for Impact: The Dissemination of Rice-Related Knowledge and Technology. IRRI, Los Baños, Philippines.


OCCASIONAL PAPERS: ISSUES IN TRAINING

Paper No. 1: Report on the Think Tank Meeting on the Use of ICT to Support IRRI's Training Program. R.T. Raab (April 1999)
Paper No. 2: Web-based Technology: Creating Access to Rice Science or Widening the Digital Divide? P.L. Marcotte, M.B. Quiamco, and L. Norman (October 2000)
Paper No. 3: Research for Development: IRRI's Strategy for Enhancing Research Relevance and Application. S. Morin, P.L. Marcotte, M.A. Bell, V. Balasubramanian, and F. Palis (October 2000)
Paper No. 4: Report of IT Sessions at the Expert Consultation on Training. Compiled by D. Shires (February 2001)
Paper No. 5: The Training Center Contribution to the Strategic Marketing of IRRI. D. Shires (March 2001)
Paper No. 6: A Strategy and Implementation Plan for the Use of New Approaches and Technologies in Training. D. Shires (April 2001)
Paper No. 7: IT Applications in Training and Delivery: The IRRI Experience. M.B. Quiamco (May 2001)
Paper No. 8: IRRI's Computer Based Information Delivery System in Training Agricultural Researchers. R. Bakker-Dhaliwal, P.L. Marcotte, S. Morin, M. Bell, and P. Comia (July 2001)
Paper No. 9: The Missing Last Mile in the Delivery of Knowledge to the Agricultural Sector. T. George, S. Morin, and J. Quiton (December 2001)
Paper No. 10: Assessing Training Impact: IRRI's New Evaluation Approach. P. Marcotte, R. Bakker-Dhaliwal, and M. Bell (January 2002)
