

Medical Teacher 2014, 36: S24–S29
DOI: 10.3109/0142159X.2014.886012

Research methodology workshops evaluation using the Kirkpatrick's model: Translating theory into practice

HAMZA MOHAMMAD ABDULGHANI (1), SHAFFI AHAMED SHAIK (1), NEHAL KHAMIS (1), ABDULMAJEED ABDULRAHMAN AL-DREES (1), MOHAMMAD IRSHAD (1), MAHMOUD SALAH KHALIL (1), ALI IBRAHIM ALHAQWI (2) & ARTHUR ISNANI (1)

(1) King Saud University, Saudi Arabia and (2) King Saud Bin Abdul Aziz University, Saudi Arabia

Correspondence: Hamza Mohammad Abdulghani, MBBS, DPHC, ABFM, FRCGP (UK), MMed Ed (Dundee), Head of the Assessment & Evaluation Centre, Department of Medical Education, College of Medicine, King Saud University, P.O. Box 230155, Riyadh 11321, Saudi Arabia. Fax: 0096114671967; E-mail: hamzaabg@gmail.com

Abstract

Background: Qualitative and quantitative evaluation of academic programs can enhance the development, effectiveness, and dissemination of comparative quality reports as well as quality improvement efforts.

Objectives: To evaluate five research methodology workshops by assessing participants' satisfaction, knowledge and skills gain, and impact on practices, using the Kirkpatrick's evaluation model.

Methods: The four-level Kirkpatrick's model was applied for the evaluation. Training feedback questionnaires, pre- and post-tests, learner development plan reports and behavioral surveys were used to evaluate the effectiveness of the workshop programs.

Results: Of the 116 participants, 28 (24.1%) liked the programs with appreciation, 62 (53.4%) liked them with suggestions and 26 (22.4%) disliked them. Pre- and post-test MCQ mean scores showed a significant improvement of relevant basic knowledge and cognitive skills by 17.67% (p ≤ 0.005). Pre- and post-test scores on the workshops' sub-topics also improved significantly for data collection (p = 0.031), biostatistics and SPSS, whereas the improvement in manuscript writing (p = 0.834) and proposal writing (p = 0.404) did not reach significance. As for the impact, 56.9% of participants started research and 6.9% published their studies. Participants' performance revealed overall positive feedback, and 79% of participants reported transfer of training skills to their workplace.

Conclusion: The achievement of the course outcomes and the suggestions given for improvement offer encouraging and very useful insight into the program. Encouraging a research culture and work-based learning are probably the most powerful determinants of research promotion. These findings therefore encourage the faculty development unit to continue its training and development in research methodology.

Introduction

Practice points
- Research methodology workshops significantly improve participants' research-related knowledge and skills.
- Some important skills, such as manuscript writing and proposal writing, seem to develop slowly and possibly require multiple and repeated interventions.
- Encouraging a research culture and work-based learning are probably among the most powerful determinants of research promotion.
- Seeking and using learners' feedback is a valuable tool to promote enthusiasm and progression in research methodology workshops.

Healthcare professionals prefer to use a logical and systematic approach to gather useful information on particular health problems (Mensik 2011). This information is applied to find solutions to scientific and social problems through experiment, observation, analysis, comparison and reasoning (Polgar 2008). The essential procedures that healthcare professionals or researchers follow in their research work are called research methodology (Kothari 2004). Research methodology is the way used to systematically solve a research problem and add new contributions to the existing knowledge and understanding of the issue investigated. Researchers therefore need to understand the assumptions underlying various research techniques, and they need to know the criteria by which they can decide which techniques and procedures are applicable to particular problems (Gandhi 2011).

A research methodology workshop intends to help participants who have had minimal or no previous research experience, who have just started working towards formulating a research question or topic, or who are already doing empirical research (Johal 2012). Evaluation is one of the essential elements of the educational process, and evaluation of a workshop program is an effort to determine whether program objectives have been achieved by gathering information to assess the efficiency of the program (Musal et al. 2008). There are also growing pressures to evaluate academic programs for the significant achievement of program objectives (Morrison 2003). Workshop organizers are responsible not only for determining whether individual trainees have met educational objectives but also for ensuring the quality of the training program itself (Durning et al. 2007).


Several evaluation models have been proposed for the evaluation of academic programs. However, Donald Kirkpatrick's model (Kirkpatrick & Kirkpatrick 2006) has served as the primary organizing design for the evaluation of training for thirty years (Bates 2004). Kirkpatrick's evaluation strategies represent one of the most comprehensive strategies for evaluating organizational training. The model comprises four essential levels of evaluation, and each level has an impact on the next (Figure 1). The first level focuses on the participants' perceptions of the training program, measures participants' satisfaction, and is used to collect information on how the participants felt about the training they received (Kirkpatrick & Kirkpatrick 2005). The second level is the learning level, at which participants are most likely to acquire knowledge and skills and therefore to change their attitudes and behaviors (Knowles et al. 1998; Ehlers & Schneckenberg 2010). Assessors can use pre-tests and post-tests if a training program is measuring what knowledge and cognitive skills were learned. The third level measures whether the learned knowledge, skills and attitudes were transferred to the workplace, reflected in positive changes in behavior and job performance. The fourth level is the most important: it looks at the main outcomes of a project, that is, the results arising from the improved performance of the participants. This is the answer most sought by stakeholders and certainly the most challenging to provide, since there are many reasons beyond faculty performance that lead to organizational performance (Figure 1) (Kirkpatrick & Kirkpatrick 2007).

[Figure 1. Kirkpatrick model for program evaluation (modified from Kirkpatrick & Kirkpatrick 2006). The four levels are: Level 1, Reaction (how did participants feel about the workshop program?); Level 2, Learning (to what extent did participants improve knowledge and skills and change attitudes as a result of training?); Level 3, Behaviors (to what extent did participants change their behavior in the workplace as a result of training?); Level 4, Results (what is the benefit to the organization as a result of training?).]

In addition, educational institutions have been advised to weigh the advantages and disadvantages of different evaluation models and approaches in order to develop an institution-specific evaluation model that meets their particular needs (Fitzpatrick et al. 2004). It has also been suggested that program evaluation should emphasize both educational processes and outcomes (Musal et al. 2008). An ideal evaluation method should be reliable, valid, acceptable, and inexpensive. In addition, evaluation may involve subjective and objective measures and qualitative and quantitative approaches. Hence, outcomes of assessment can be useful for establishing participants' learning achievements in academic programs (Morrison 2003). However, evaluations of educational interventions that make changes in the workplace are mostly published in developed countries (Harden et al. 1999; Steinert et al. 2006), whereas few reports are available from developing countries (Malki et al. 2003). To our knowledge, this is one of the few published evaluations of a faculty development program in the area of research in the Kingdom of Saudi Arabia. The aim of this article is to evaluate five research methodology workshops, using the Kirkpatrick model, in terms of the satisfaction of the participants, the improvement of their relevant conceptual knowledge and cognitive skills, participants' behavioral changes, and the main outcomes in the form of publications. The format and approach of the evaluation should offer valuable guidance for educational planners intending to improve their workshops.

Methods

Study context

The faculty development unit (FDU) in the College of Medicine, King Saud University (KSU) has developed methods and tools for monitoring and evaluating workshop organization, conduct and outcome, including a workshop organization and planning monitoring checklist, pre- and post-tests, feedback questionnaires on perceived gains in knowledge and skills, and organization and delivery evaluation. The FDU regularly conducts about 35–45 workshops and courses on different educational themes annually for faculty and health care professionals. The evaluation of the research methodology workshop project was conducted from 2010 to 2013 to identify the most appropriate content, structure and benefits for participants. Over the last three years, five research methodology workshops have been conducted to help faculty members and staff with their research projects. The workshops provide the participants with the knowledge and skills required for selecting a research topic, reviewing the literature, writing a research proposal, collecting and analyzing data, compiling results and writing a manuscript for publication, while considering the ethical issues in research and publication.

Evaluation tools

The four-level Kirkpatrick model was applied to evaluate the workshops in the form of formative and summative evaluation. The formative evaluation was used to investigate the implementation of the program and to identify its weaknesses and strengths regarding publicizing (announcement of the workshop), organization, delivery and staffing, workshop content quality, availability of educational resources, the balance between knowledge building and applicability, the interaction between participants and staff, and feedback on participants' performance. The summative evaluation was performed to evaluate not only the impact of the program on the participating members, especially the acquisition of knowledge and skills and the change in research practices, but also the extent to which participants introduced changes in their workplaces and published. It was also intended to detect which components of the program were especially effective in these respects.
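The correspondence between the instruments just described and the four Kirkpatrick levels can be summarized programmatically. The sketch below is illustrative only: it is not a tool the authors describe, the names (EvaluationLevel, KIRKPATRICK_PLAN) are invented for this example, and the assignment of each instrument to a level simply restates Figure 1 and the Methods text.

```python
# Illustrative sketch only: one way to organize the four Kirkpatrick levels and the
# evaluation instruments described in this paper. The level questions follow Figure 1;
# the instrument-to-level mapping reflects the Methods text, not any tool the authors used.
from dataclasses import dataclass, field

@dataclass
class EvaluationLevel:
    level: int
    name: str
    question: str
    instruments: list = field(default_factory=list)

KIRKPATRICK_PLAN = [
    EvaluationLevel(1, "Reaction",
                    "How did participants feel about the workshop program?",
                    ["daily feedback questionnaires", "overall workshop evaluation questionnaire"]),
    EvaluationLevel(2, "Learning",
                    "To what extent did participants improve knowledge, skills and attitudes?",
                    ["pre-test MCQs (50 items)", "post-test MCQs (50 items)"]),
    EvaluationLevel(3, "Behaviors",
                    "To what extent did participants change their behavior in the workplace?",
                    ["follow-up survey by e-mail, phone or face-to-face contact"]),
    EvaluationLevel(4, "Results",
                    "What is the benefit to the organization as a result of training?",
                    ["research projects started", "articles published"]),
]

for lvl in KIRKPATRICK_PLAN:
    print(f"Level {lvl.level} ({lvl.name}): {lvl.question}")
    for tool in lvl.instruments:
        print(f"  - {tool}")
```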
Data collection

Both quantitative and qualitative methods were used for data collection. During each workshop, participants were asked to complete several workshop evaluation tools: a pre-test of MCQs, daily feedback questionnaires and, finally, an overall workshop evaluation questionnaire and a post-test. The aim of the pre- and post-tests was to assess the changes in participants' knowledge, understanding and application of research methodology, proposal and manuscript writing, and basic concepts in biostatistics. The pre-test and post-test MCQs were developed from the workshops' objectives and contents blueprint in the single-best-answer format. The total number of MCQs was 50 in each test, and they tested basic applied knowledge of research methodology.

A 5-point Likert scale questionnaire was used to collect data from the participants on their general satisfaction.

Another 5-point Likert scale questionnaire was handed out to the participants to evaluate their ability and attitude towards research. It covered their ability to come up with a research question, their willingness to start a research project, and their ability to develop a suitable research tool and analyze data.

After the end of each workshop, the organizing committee surveyed the participants' research activities via e-mail, over the phone, or by face-to-face communication.

Ethical approval for the study was obtained from the Research Committee Board at the College of Medicine, KSU, Riyadh, Saudi Arabia.

Statistical analysis

The study design for participants' acquisition of knowledge and skills was quasi-experimental. Participants' performance was measured before and after the sessions, and the difference between the pre-test and post-test was used to estimate the effect of the intervention. Data were analyzed with several specific methods, owing to the diversity of the collected data, and significance values were calculated using SPSS software. The t-test was used to compare the difference between the means of the pre- and post-tests.
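For readers who want to reproduce this kind of pre/post comparison outside SPSS, the following is a minimal sketch in Python. It assumes a paired t-test on the scores of the same participants before and after a workshop; the score arrays are hypothetical placeholders, not study data.

```python
# Illustrative sketch only: the authors ran their analysis in SPSS; this shows an
# equivalent pre/post comparison in Python, assuming a paired t-test on each
# participant's scores. The arrays below are made-up placeholders, not study data.
import numpy as np
from scipy import stats

pre_scores = np.array([22, 25, 19, 30, 27, 21, 24, 26])   # hypothetical pre-test MCQ scores (max 50)
post_scores = np.array([31, 33, 27, 36, 35, 29, 30, 34])  # hypothetical post-test scores, same participants, same order

t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)  # paired comparison

print(f"pre mean  = {pre_scores.mean():.2f} +/- {pre_scores.std(ddof=1):.2f}")
print(f"post mean = {post_scores.mean():.2f} +/- {post_scores.std(ddof=1):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A paired test is the natural choice here because each participant contributes both a pre-test and a post-test score; an unpaired comparison would ignore that pairing.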
Results

Profile of participants

Of the 116 participants who attended the five workshops, 21 (18.1%) were in the first workshop, 17 (14.7%) in the second, 26 (22.4%) in the third, 21 (18.1%) in the fourth and 31 (26.7%) in the fifth. The participants came from multi-disciplinary backgrounds including medicine, nursing, pharmacy, dentistry, biomedical technology, physical therapy, laboratory science and a clinical research unit (Table 1). There were 65 (56.1%) female participants; 82 (70.7%) were clinicians, 20 (17.2%) academicians and 14 (12.1%) researchers. The majority of the participants (n = 102, 87.9%) were from health professions education colleges of the KSA.

Table 1. Demographic characteristics of workshop participants.

Participants                Total
Gender
  Males                        51
  Females                      65
Areas
  Clinical                     82
  Academic                     20
  Research                     14
Affiliation
  KSU staff                   102
  Non-KSU staff                14
Departments
  Medicine                     79
  Clinical Research            16
  Laboratory sciences           8
  Dental                        5
  Nursing                       3
  Biomedical Technology         2
  Pharmacy                      2
  Physical Therapy              1
Total                         116

Participants' perception

Data collected during and on the last day of each workshop showed mixed reactions among the participants at the first Kirkpatrick level (Table 2). Of the 116 persons who attended the workshops, 90 (77.6%) completed the pre-test and post-test, which covered basic research knowledge and cognitive skills for the different research sub-topics. Twenty-six (22.4%) participants did not complete the pre-test and were therefore not included in the analysis of the post-test. Twenty-eight (24.1%) participants were highly satisfied, whereas 62 (53.4%) liked the program but made suggestions for its improvement at different levels.

Table 2. Participants' reactions with remarks.

Categories                                                      Participants (%)
Liked the workshop                                              90 of 116 (77.6%)
Liked with appreciation of certain activities:                  28 (24.1%)
  Small group sessions                                          12 (10.3%)
  Workshop organization                                          6 (5.2%)
  Topics are relevant and beneficial                             6 (5.2%)
  Practical application of SPSS                                  4 (3.4%)
Liked with suggestions for improvement of certain activities:   62 (53.4%)
  More exercises and small group sessions, fewer lectures       28 (24.1%)
  Extend workshop from 3 days to 5 days                         17 (14.7%)
  More hands-on training in SPSS                                 7 (6.0%)
  Improve time management                                        4 (3.4%)
  More facilitators                                              3 (2.6%)
  Improve materials                                              3 (2.6%)
Disliked the workshop for the following reasons:                26 of 116 (22.4%)
  Low voice and thick accents of speakers                       10 (8.6%)
  Short time, long topics                                        9 (7.8%)
  Workshop organization                                          4 (3.4%)
  Improper time management of sessions                           3 (2.6%)

Basic knowledge and cognitive skills of research methodology

For the second Kirkpatrick level, participants were asked to answer an MCQ test of basic research knowledge and cognitive skills derived from the workshops' objectives. At baseline (pre-test), mean scores ranged from 20.12 ± 16.6 to 25.17 ± 10.5, whereas post-test mean scores ranged from 24.76 ± 14.9 to 35.12 ± 09.5, showing a significant improvement of post-test scores over baseline (overall p = 0.005; Table 3). The participants' gains in basic knowledge and cognitive skills scores were 19.91%, 20.35%, 18.50%, 20.31% and 9.28% for the first, second, third, fourth and fifth workshops, respectively (Table 3). The average score over all five workshops was 23.30 ± 12.1 at baseline and increased to 32.14 ± 11.0 at post-test.

Table 3. Mean scores obtained by the participants in the MCQ examination (N = 90).

Workshops      N    Pre-test Mean ± SD    Post-test Mean ± SD    p Value
1st workshop   19   25.17 ± 10.5          35.12 ± 09.5           0.001
2nd workshop   15   22.77 ± 10.6          32.94 ± 11.3           0.003
3rd workshop   20   24.00 ± 11.8          33.25 ± 07.4           0.001
4th workshop   17   24.47 ± 11.2          34.63 ± 12.1           0.005
5th workshop   19   20.12 ± 16.6          24.76 ± 14.9           0.108
Overall        90   23.30 ± 12.1          32.14 ± 11.0           0.005
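A note on how the percentage gains appear to be computed: the reported figures are consistent with taking the pre-to-post difference in mean scores and expressing it as a percentage of the 50-item test maximum. This definition is inferred from the numbers rather than stated explicitly in the text; the short check below, using the Table 3 means, reproduces the reported gains, including the overall 17.67% quoted in the Abstract, to within rounding.

```python
# Quick check, assuming the reported gain is the pre-to-post difference in mean scores
# expressed as a percentage of the 50-item MCQ maximum (a definition inferred from the
# reported figures, not stated explicitly in the paper).
MAX_SCORE = 50
means = {  # (pre-test mean, post-test mean) from Table 3
    "1st workshop": (25.17, 35.12),
    "2nd workshop": (22.77, 32.94),
    "3rd workshop": (24.00, 33.25),
    "4th workshop": (24.47, 34.63),
    "5th workshop": (20.12, 24.76),
    "Overall":      (23.30, 32.14),
}

for name, (pre, post) in means.items():
    gain_pct = (post - pre) / MAX_SCORE * 100
    # prints 19.90, 20.34, 18.50, 20.32, 9.28 and 17.68, matching the reported
    # 19.91%, 20.35%, 18.50%, 20.31%, 9.28% and 17.67% to within rounding
    print(f"{name}: gain = {gain_pct:.2f}%")
```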
Table 4 shows the participants' basic knowledge and cognitive skills scores categorized by research-relevant subtopics. At baseline (pre-test), the mean scores for correct answers were, in descending order, data collection (32.60 ± 12.5), writing proposals (31.25 ± 22.4), writing manuscripts (29.15 ± 25.2) and qualitative research (29.15 ± 15.6). Participants, on the other hand, had difficulties with biostatistics and SPSS, where they scored only about 50% or less. When the same participants were tested again at the end of the workshop sessions using the same set of questions, there was a statistically significant improvement in study design, data collection, biostatistics, SPSS and qualitative research, and only a slight improvement in writing manuscripts and writing proposals. Comparison of baseline and post-test scores showed significant improvements in SPSS (p = 0.001), biostatistics (p = 0.015) and data collection (p = 0.031), whereas the improvements in writing manuscripts (p = 0.834) and writing proposals (p = 0.404) did not reach statistical significance.

Table 4. Mean scores obtained by the participants in subtopics (N = 90).

Topics                 Pre-test Mean ± SD    Post-test Mean ± SD    p Value
Study design           22.50 ± 11.6          27.35 ± 11.7           0.001
Writing manuscripts    29.15 ± 25.2          29.90 ± 21.8           0.834
Writing proposals      31.25 ± 22.4          33.35 ± 24.6           0.404
Biostatistics          20.45 ± 11.8          25.85 ± 11.1           0.015
Data collection        32.60 ± 12.5          38.35 ± 12.8           0.031
SPSS                   25.00 ± 10.0          34.60 ± 11.3           0.001
Qualitative research   29.15 ± 15.6          35.40 ± 15.2           0.011

Main outcomes of the workshops at the 3rd and 4th levels of the Kirkpatrick model

Post-workshop follow-up revealed that, of the 116 participants, 66 (56.9%) had started research, whereas 8 (6.9%) had already published research articles. There are no follow-up data for the persons who did not take the pre- and post-tests.

Discussion

The study results revealed positive findings for the research methodology workshops at all four levels of the Kirkpatrick model. The present workshop program made a valuable contribution to research methodology training, and one outcome of the workshops was the identification of individuals willing to undertake a research program. Previous studies have shown that such hands-on training programs also support learners in setting up their own research communities and in using the learning opportunities provided by the social aspects of their work environment (Fuller et al. 2005) to reinforce changes at the institutional level (Nestel et al. 2004).

The participants attending the research methodology workshops evaluated in the current study belong to eight different specialties and have different professions and research backgrounds. The inter-professional nature of the participants was beneficial in exchanging views from different perspectives as well as in promoting a research culture. This culture is important in spreading quality management in professional education (Al-Shehri 2012). Similar benefits of such inter-professional training have been reported in the literature, including peer support groups' facilitation of learning (Seibert et al. 2001), promotion of reflective interactions (Lawrence 2011) and generation of creative innovation (Prideaux & Bligh 2002).
Kirkpatrick's first level of evaluation assesses participants' reactions to the course instructor, setting, materials and learning activities. The majority of the participants were satisfied with the research methodology workshops, and the majority of those who liked the program also offered suggestions for its improvement at various levels. Most of these suggestions were considered and applied in subsequent workshops. They included more exercises, more small group discussions and shorter lectures. A small number of participants reported dissatisfaction with the low voice and thick accents of some speakers, with topics too long for the workshop duration, and with the organization and time management. The workshop organizing committee should not simply record participants' dissatisfaction in a neutral way; it needs to make use of that dissatisfaction by asking for suggestions for improvement. Participants' notes of dissatisfaction are very useful for the organizing committee in sorting out possible issues, because they offer specific and actionable proposals (Staples 2004). In view of this and of the recommendation of Taylor et al. (2000), the workshop organizing committee was keen to review the participants' comments and satisfaction rates and to take immediate action on issues showing unsatisfactory responses or comments. For example, a suggestion noted on the first day of the second workshop was that small group discussions should be increased; this was followed in the next days with more room in the schedule for small group work, which was accordingly mentioned among the areas participants liked by the third day of the workshop. As reported in the literature, this motivates participants tangibly and psychologically and also gives them a sense of personal recognition (Elonen & Artto 2003). Actions towards continuous improvement from one workshop to the next also motivate the participants to achieve success towards organizational objectives and goals (Heising 2012). Thus, when participants in the second workshop commented that they needed more sessions for applying the concepts learnt in lectures to their own research proposals, a session for discussing individual research proposals and data collection tools was introduced in the third workshop. Another interactive presentation session on data patterns and data handling was also added to the third workshop to serve as an introduction to statistics, in response to second-workshop participants' comments that they needed to understand the basic concepts before going into the details of statistical associations. In the fifth workshop, the structure of the program was extensively modified to represent a smooth flow from one step of research proposal writing to the next, followed by statistics and SPSS basics.

Many organizations use the first level of Kirkpatrick's model as the sole means of program evaluation (Morgan & Casper 2000). However, positive satisfaction figures do not ensure learning and the subsequent application of program content (Baldwin & Ford 1988). Kirkpatrick's second level evaluates the extent of learning among the participants (Ehlers & Schneckenberg 2010). At baseline, the first workshop participants' mean score for relevant basic knowledge and cognitive skills was 25.17 ± 10.5, which rose to 35.12 ± 09.5 by the end of the workshop. Similarly, the second, third, fourth and fifth workshop participants' mean scores also improved. Overall, workshop participants' relevant basic knowledge and cognitive skills improved significantly (p ≤ 0.005), except in the fifth workshop, where the increase in participants' scores did not reach statistical significance (p = 0.108). Categorized by topic, the post-test scores for the study design and data collection subtopics showed a statistically significant increase over the pre-test scores, and the mean scores for the SPSS, qualitative research and biostatistics subtopics also increased significantly. Hence, the differences between baseline and post-workshop knowledge and cognitive skills draw attention to the importance of the learning transfer process in making training truly effective. Post-training improvements were found in participants' knowledge of the principles of research and in the cognitive skills needed to apply these principles. This improvement is similar to findings reported in previous studies (Ajuwon & Kass 2008) and may be attributable to the interactive nature of the workshop and facilitation by experienced resource persons.

Evaluation at Kirkpatrick's third and fourth levels is always challenging for any program organizing committee and should not be conducted before the level one and level two evaluations are completed (Smidt et al. 2009). These levels were mostly assessed by survey over the phone, by e-mail, by letter, or by other means. In the current study, of the 116 participants included, 66 (56.9%) started research work and, among them, 8 (6.9%) published their research work. Training effectiveness at this level is based solely on outcome measures. However, it has been reported that participants may possess the knowledge, skills and attitudes taught in a course and yet there is no guarantee of their application on the job (Baldwin & Ford 1988; Rouse 2011). Although our workshop program showed that the majority of the participants considered themselves capable of managing a research project independently and reported transferring their training skills to the workplace, these changes cannot be attributed solely to the workshops. They could also reflect confounding factors, such as the previous backgrounds of the participants and other factors that helped them to publish and to continue with other research projects. Nevertheless, research skills are expected to contribute to improving the research culture within the institution, as previously concluded by Bates et al. (2006). The findings from our study possibly support theories from developed countries concerning the process of social learning at work (Engestrom 2001; Bates et al. 2007). Work-based learning was enhanced by group activities that promote reflective practice and higher-order thinking (Spalding 1999).

The study results also showed that participants' opinions of an educational program can lead to amendments to that program, which is necessary for providing feedback and consequently for improving organizational performance.

Conclusion

Participants' feedback is important and useful for improving the conduct of research methodology workshops. In addition, these workshops were found to be effective and to meet international standards for quality education. The workshops met the needs of individual learners, who considered themselves to have become confident and competent in research, which is expected in turn to increase the institution's research capacity. The nature of the support provided to trainees may vary, reflecting the diverse settings in which the training program will eventually be implemented. The suggestions given for improvement offer insight into the program and were very useful for approaching those whose expectations we did not fully meet.
Notes on contributors

HAMZA MOHAMMAD ABDULGHANI, MBBS, DPHC, ABFM, FRCGP (UK), MMed Ed (Dundee), Associate Professor, Head of the Assessment & Evaluation Centre, Department of Medical Education, College of Medicine, King Saud University, Riyadh, KSA.

SHAIK SHAFI AHAMED, MSc, PhD, Associate Professor, Consultant Biostatistician, Department of Family & Community Medicine, College of Medicine, King Saud University, Riyadh, KSA.

NEHAL KHAMIS, MD, PhD, MHPE, Assistant Professor of Medical Education, Consultant of Histopathology, Head of Faculty Development Unit, College Curriculum Committee Member, Clinical Skills and Simulation Center Advisory Committee Member, College of Medicine, King Saud University, Riyadh, KSA & Suez Canal University, Egypt.

ABDULMAJEED ABDULRAHMAN AL-DREES, MSc, PhD, Associate Professor, Deputy Chairman, Department of Medical Education/Department of Physiology, College of Medicine, King Saud University, Riyadh, KSA.

MOHAMMAD IRSHAD, BEdu, PhD, Research Consultant, Department of Medical Education, College of Medicine, King Saud University, Riyadh, KSA.

MAHMOUD SALAH KHALIL, MD, PhD, MHPE, Assistant Professor, Department of Medical Education, College of Medicine, King Saud University, KSA.

ALI IBRAHIM ALHAQWI, ABFM, FRCGP (UK), PhD (Med Ed), Associate Professor, Department of Family Medicine, College of Medicine, King Saud Bin Abdulaziz University, Riyadh, KSA.

ARTHUR ISNANI, MD, Clinical Skills Lab Tutor, Department of Medical Education, College of Medicine, King Saud University, KSA.

Acknowledgements

The authors would like to thank all the workshop participants who took part in this study. We also thank Professor Samy Azer of the Department of Medical Education for reviewing and editing the whole manuscript.

The publication of this supplement has been made possible with the generous financial support of the Dr Hamza Alkholi Chair for Developing Medical Education in KSA.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and the writing of this article. This work was funded by the College of Medicine Research Centre, Deanship of Scientific Research, King Saud University, Riyadh, Saudi Arabia.

References

Ajuwon AJ, Kass N. 2008. Outcome of a research ethics training workshop among clinicians and scientists in a Nigerian university. BMC Med Ethics 9(1):1–9.
Al-Shehri AM. 2012. Quality management and medical education in Saudi Arabia. Chap. 5. In: Ng K-S, editor. Quality management and practice. Rijeka, Croatia: InTech. pp. 67–86.
Baldwin T, Ford J. 1988. Transfer of training: A review and directions for future research. Pers Psychol 41(1):63–105.
Bates I, Akoto AYO, Ansong D, Karikari P, Bedu-Addo G, Critchley J, Agbenyega T, Nsiah-Asare A. 2006. Evaluating health research capacity building: An evidence-based tool. PLoS Med 3(8):1224–1229.
Bates I, Ansong D, Bedu-Addo G, Agbenyega T, Akoto AYO, Hsiah-Asare A, Karikari P. 2007. Evaluation of a learner-designed course for teaching health research skills in Ghana. BMC Med Educ 7(18):1–9.
Bates R. 2004. A critical analysis of evaluation practice: The Kirkpatrick model and the principle of beneficence. Eval Program Plann 27:341–347.
Durning SJ, Hemmer P, Pangaro LN. 2007. The structure of program evaluation: An approach for evaluating a course, clerkship, or components of a residency or fellowship training program. Teach Learn Med 19(3):308–318.
Ehlers UD, Schneckenberg D. 2010. Changing cultures in higher education: Moving ahead to future learning. Heidelberg, New York: Springer.
Elonen S, Artto KA. 2003. Problems in managing internal development projects in multi-project environments. Int J Proj Manag 21(6):395–402.
Engestrom Y. 2001. Expansive learning at work: Towards an activity theoretical reconceptualization. J Educ Work 14(1):133–156.
Fitzpatrick JL, Sanders JR, Worthen BR. 2004. Program evaluation: Alternative approaches and practical guidelines. 3rd ed. Boston: Allyn and Bacon.
Fuller A, Hodkinson H, Hodkinson P, Unwin L. 2005. Learning as peripheral participation in communities of practice: A reassessment of key concepts in workplace learning. Br Educ Res J 31(1):49–68.
Gandhi P. 2011. Clinical research methodology. Indian J Pharm Educ Res 45(2):199–209.
Harden A, Peersman G, Oliver S, Mauthner M, Oakley A. 1999. A systematic review of the effectiveness of health promotion interventions in the workplace. Occup Med 49(8):540–548.
Heising W. 2012. The integration of ideation and project portfolio management: A key factor for sustainable success. Int J Proj Manag 30(5):582–595.
Johal S. 2012. Assessing the impact of workshops promoting concepts of psychosocial support for emergency events. PLoS Curr 4:e4.
Kirkpatrick DL, Kirkpatrick JD. 2005. The transfer of learning to behavior: Using the four levels to improve performance. San Francisco: Berrett-Koehler.
Kirkpatrick DL, Kirkpatrick JD. 2006. Evaluating training programs: The four levels. 3rd ed. San Francisco: Berrett-Koehler.
Kirkpatrick DL, Kirkpatrick JD. 2007. Implementing the four levels. San Francisco: Berrett-Koehler.
Knowles MS, Holton EF, Swanson RA. 1998. The adult learner. 5th ed. Houston: Gulf Publishing Company.
Kothari CR. 2004. Research methodology: Methods and techniques. 2nd ed. New Delhi, India: New Age International.
Lawrence BS. 2011. Careers, social context and interdisciplinary thinking. Hum Relat 64(1):59–84.
Malki AA, Al-Bareeq JM, Al-Halili. 2003. Evaluation of research writing workshop. Bahrain Med Bull 25(3):1–8.
Mensik JS. 2011. Understanding research and evidence-based practice: From knowledge generation to translation. J Infus Nurs 34(3):174–178.
Morgan R, Casper W. 2000. Examining the factor structure of participant reactions to training: A multidimensional approach. Hum Resour Dev 3:301–317.
Morrison J. 2003. ABC of learning and teaching in medicine: Evaluation. Br Med J 326:385–387.
Musal B, Taskiran C, Gursel Y, Ozan S, Timbil S, Velipasaoglu S. 2008. An example of program evaluation project in undergraduate medical education. Educ Health 21(1):1–7.
Nestel D, Taylor S, Spender Q. 2004. Evaluation of an inter-professional workshop to develop a psychosocial assessment and child-centered communication training programme for pediatricians in training. BMC Med Educ 4(25):1–10.
Polgar ST. 2008. Introduction to research in the health sciences. 5th ed. Edinburgh: Churchill Livingstone.
Prideaux D, Bligh J. 2002. Research in medical education: Asking the right questions. Med Educ 36:1114–1115.
Rouse DN. 2011. Employing Kirkpatrick's evaluation framework to determine the effectiveness of health information management courses and programs. Perspect Health Inf Manag 8(Spring):1c–5c.
Seibert SE, Kraimer ML, Liden RC. 2001. A social capital theory of career success. Acad Manage J 44(2):219–237.
Smidt A, Balandin S, Sigafoos J, Reed VA. 2009. The Kirkpatrick model: A useful tool for evaluating training outcomes. J Intellect Dev Disabil 34(3):266–274.
Spalding B. 1999. How effective is group work in enhancing work-based learning? An evaluation of an education studies course. J Further High Educ 23(1):119–125.
Staples L. 2004. Roots to power: A manual for grassroots organization. 2nd ed. Westport: Praeger.
Steinert Y, Mann K, Centeno A, Dolmans D, Spencer J, Gelula M, Prideaux D. 2006. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach 28(6):497–526.
Taylor R, Reeves B, Ewings P, Binns S, Keast J, Mears R. 2000. A systematic review of the effectiveness of critical appraisal skills training for clinicians. Med Educ 34:120–125.
