
Running head: Assignment 3

Assignment 3 Presented in Partial Fulfillment

Of the Requirements for

EDID 6504 - Programme Evaluation and Course Assessment Methods

Trimester II, 2014

University of the West Indies

By

Meredith Connor - 20053571

meredith.hodge@my.open.uwi.edu

Course Coordinator: David Subran



Abstract
The purpose of this paper is to develop a process evaluation plan for the Job Link-Up Programme
in Anguilla. This programme is an initiative of the Department of Youth and Culture (DYC)
which targets youths between the ages of 15 and 25 years who are at risk of becoming
unemployed due to their socio-economic background or other hindering factors. The programme
presents a holistic approach to development by offering individually tailored packages of work
experience, training, guidance and counseling to the participants based on their strengths,
personal interests and current skill levels. The programme is delivered over the course of six
months. The DYC partners with learning institutions, other government departments and youth
development centers across the island (referral agencies), along with counselors, mentors and
employers. The plan will outline the steps to evaluate the processes from the planning to the
reporting stage of the programme. The information derived will bring to light the value or merit
of the programme (Morra Imas & Rist, 2009, p. 9). It will assess whether the activities were
implemented as planned and whether the expected outputs were actually produced, and it will
facilitate decision making that results in programme improvement.

Essential steps

The essential steps in planning the evaluation are to identify the stakeholders; plan the
evaluation with the involvement of those stakeholders; decide what information is needed, when
it is needed, and what its sources are; and develop the methods and techniques that will be used
in the assessment (guided by the type of information required). The evaluation team is then
selected and trained so that its members understand the steps of the evaluation process, which
will help them to implement the plan effectively. The programme is implemented and data are
collected according to the plan. The results are then analyzed and shared with stakeholders.

Description of the programme to be evaluated

The Population and Housing Census conducted in 2001 revealed that the highest rate of
unemployment was among the 15-25 age group. Since then there has been an upsurge of crime
within that same age group. The Ministry of Social Development conducted qualitative and
quantitative research and concluded that there is a link between unemployment and the increase
in crime. Subsequently a policy was developed to reduce unemployment and crime. As a direct
result of that policy the Department of Youth and Culture (DYC) developed the Job Link-Up
Programme (JLP), which was implemented in 2009. This programme is a job education and
skills development initiative targeted at youths (between the ages of 15 and 25 years) who are at
risk of becoming unemployed due to their socio-economic background or other hindering factors.
The objective of the programme is to facilitate the entry into the workforce of young people who
are at risk of becoming unemployed. Youths in the at-risk age group are provided with
individually tailored work experience, training, guidance and counseling. The concept behind the
JLP is that by the completion of each six-month cycle participants will have acquired the
knowledge necessary to become fully integrated into the job market.

To participate in the JLP, individuals must be verified as having at least three risk
factors that may lead to unemployment, and they should be permanent residents or belongers of
Anguilla. Some of these risk factors are low levels of numeracy and literacy, non-completion of
high school education, homelessness, and regular contact with the justice, probation, child
welfare or social assistance systems. Participants are generally directed to the programme by
referral agencies such as the Department of Social Development and Welfare, the Labour
Department, the Education Department, the Department of Probation, the Royal Anguilla Police
Force, Her Majesty's Prison and the Youth Development Centers across the island.

The DYC has partnered with various corporate bodies, concerned citizens and educational
institutions to provide academic, skills, financial and employment support to these youths in the
hope of fulfilling the overall objective.

Purpose of the Process Evaluation

The purpose of the process evaluation is to obtain information on the performance of the
programme with regard to its coverage and processes. This information will provide an
understanding of what is happening and afford oversight. The process evaluation will help to
ensure compliance, accountability and transparency; build, share and manage knowledge;
contribute to the development and improvement of the programme; and determine programme
relevance, efficiency, effectiveness, reach and sustainability (Morra Imas & Rist, 2009, p. 12). It
will in essence provide the contextual information that will support analyses of the programme's
outcomes, impact and costs.

Log Frame

Table 1 below identifies some of the resources to be used in the delivery of the
programme. It also shows the activities to be conducted, the expected outputs and outcomes, and
the anticipated impact that these activities will have on the participants.

Table 1. Log Frame

Resources/Inputs
 Staff
 Counselors
 Partners
 Funding
 Facilities
 Equipment
 Travel

Activities to be conducted
 Identify stakeholders
 Conduct stakeholder meetings
 Plan the design
 Conduct a workshop for all stakeholders
 Review referral applications submitted
 Appoint core evaluation team
 Present career counseling
 Develop a personal action plan
 Conduct orientation and personal development sessions
 Provide continuing education training
 Attachment to institutions of learning and training
 Job placement
 Monitor participants (interviews, site visits, questionnaires etc.)

Outputs
 Trained evaluators
 Individually tailored packages of work experience, training, guidance and counseling
 Agreed evaluation plan
 Approved list of participants

Outcomes/Benefits
 Well-rounded participants with increased knowledge, skills and experience
 Opportunities to become employed
 Client satisfaction with the programme
 Positive outlook on life

Impact
 Participants are more competent and confident
 Participants are employed
 Participants engage in positive behaviours
 Participants are emotionally more stable

The Process Evaluation Design


A mixed evaluation approach will be used since both qualitative and quantitative data are
required. This approach was chosen because it adds value to the evaluation design. The use of
both quantitative and qualitative methods such as interviews, observations, surveys and
secondary data provides alternative means of verifying, corroborating and clarifying the
information collected. These address reliability and validity concerns about the evaluation.
Quantitative data are excellent for identifying patterns and trends, while qualitative data help to
answer the "what" and "why" questions about the programme.

Evaluation Questions

Coverage

1. What proportion of at-risk youths is actually receiving the programme's services?


2. Are the persons who are receiving the service the intended clients?
3. What are the demographics of the clients?
4. What proportion of the clients completed the programme?
5. What were the characteristics of those who dropped out of the programme?
6. How many clients are accepted into the programme each year?
7. Are there different types of clients at each cycle of the programme?

Process

8. How did clients of the Job Link-Up Programme learn about it?
9. Is there a waiting list for clients wishing to enter the programme? If yes, how long do
they have to wait?
10. What programme intervention activities are actually taking place?
11. Who is conducting the intervention activities?
12. How well are the programme activities implemented?
13. What resources have been allocated for the programme?
14. What are the strengths and weaknesses of the programme?
15. What areas of the programme need improvement?
16. Did the clients' required knowledge and skills improve? If yes, by how much did they
improve?
17. What internal and external factors influence programme delivery?
18. Is it feasible to continue the programme?

Selection of Evaluation Team

The number of members of the core evaluation team for the Job Link-Up Programme
depends on the number of participants and the number of different referral agencies. To ensure
that the credibility of the evaluation is not called into question, each of the partners will select a
representative who has not been directly involved in the delivery of the programme. This
ensures that the core evaluation team maintains a level of independence and that the evaluation
process remains transparent. The criteria for nomination are that representatives should be
competent, have the basic skills for conducting evaluation studies, and be able to remain
impartial and unbiased. To avoid conflicts of interest, each member of the evaluation team is
required to declare an interest if (a) a relative is part of the sample group being evaluated, or (b)
they have been previously involved in the delivery of the programme. Additionally, each
member of the team is required to sign an agreement that outlines the requirements and
conditions of the engagement so that there are no misgivings. The head of the core evaluation
team will be a representative of the Department of Youth and Culture. It should be noted that
parents of the at-risk youths will be present at each stage of the evaluation.

The Job Link-Up Programme is evaluated by a core team consisting of:


 Representatives from within the Ministry of Social Development
 1 member of staff of the Department of Youth and Culture (DYC) – (Head of the
evaluation team)
 1 member from each of the referral agencies (Department of Social Development,
Department of Probation, Department of Labour, Department of Education, The
Royal Anguilla Police Force, the 5 Youth Development Centers on Anguilla and
Her Majesty’s Prison).

The Data Collection and Analysis Plan

Data will be collected through interviews, questionnaires, observation and document
reviews. Facts will be gathered about the following areas: (a) the programme context,
(b) programme design and objectives, (c) programme implementation, (d) programme
components or services, (e) outreach, intake and assessment, (f) client characteristics,
(g) project staffing and staff development, (h) changes in outcomes, (i) costs and (j) programme
replicability.

At the beginning of the programme the field officer will be responsible for interviewing
administrators and other staff who are knowledgeable about the initial start of the
programme and its current operations. Sponsoring organizations will provide primary and
secondary data to be used by the evaluators on programme history and funding, as well as
reports on the performance, participation and challenges of the participants attached to their
establishments. Before, during and after the delivery of the programme, participants will be
interviewed to find out their particular interests, how they came to know about the
programme, and to get their views on the programme's effects. Participants will also
complete monthly self-assessments regarding their behaviour and knowledge. Guidance
counselors and mentors will assist the participants in setting their personal development goals
and identifying the points of focus. Counselors and mentors will produce monthly progress
reports for each participant until the end of the programme. Secondary data about the
participants will be accessed and retrieved by the programme evaluators from existing case
files and from assessment and tracking reports at the beginning, middle and end of the
programme. Participant demographics will also be obtained from this source by the
administrators and staff upon entry into the programme. Site visits to the locations where the
apprenticeships are taking place will be carried out by the field officer at the DYC, who will
take notes based on a developed checklist and provide a written report to the Head of the
Department of Youth and Culture after each visit. Other programme documents such as
funding proposals, directives from funding agencies, flyers and memoranda, as well as other
planning documents, will be used to determine the budgeted costs and whether the
programme is being delivered according to the plan.

A key consideration in planning for data collection and analysis is knowing the intended
audience and the level of analytical sophistication and precision that is necessary. Qualitative
data such as interview notes, questionnaire responses and other programme documentation
provide insight into how the programme's policies work, or fail to work, and more compelling
accounts of success or failure (Rogers & Goodrick, 2010). All data collected will be ordered and
organized so that meaningful information can be extracted. The core evaluation team will
analyze the data. This will allow them to detect trends in client characteristics, service delivery
and utilization. Charts, graphs and textual write-ups will be used to present the data so that it is
easily understood. The evaluation team will summarize their findings and, with the support of
the documents, make recommendations for programme adjustment or cessation.
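Summaries of the kind described above (trends in client characteristics, completion and referral counts) can be tabulated with a short script. The following Python sketch is purely illustrative: the field names (age, referred_by, completed) and the sample records are invented for the example and stand in for the programme's actual case files.

```python
# Illustrative sketch only: tabulating hypothetical participant records of
# the kind the core evaluation team might analyze. Field names and values
# are assumptions, not the JLP's actual data schema.
from collections import Counter

participants = [
    {"age": 17, "referred_by": "Education Department", "completed": True},
    {"age": 22, "referred_by": "Labour Department", "completed": False},
    {"age": 19, "referred_by": "Education Department", "completed": True},
]

# Proportion of clients who completed the programme (evaluation question 4).
total = len(participants)
completed = sum(1 for p in participants if p["completed"])
completion_rate = completed / total

# How many clients each referral agency directed to the programme.
by_agency = Counter(p["referred_by"] for p in participants)

print(f"Completion rate: {completion_rate:.0%}")
for agency, count in by_agency.most_common():
    print(f"{agency}: {count}")
```

In practice the records would be read from the case files and tracking reports rather than hard-coded, and the resulting counts would feed the charts, graphs and textual write-ups mentioned above.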

The Report Format

The format of the report will be as follows:

 Title page - The nature of the evaluation, the title of the programme, the phase, the
duration, the identification of the author, and the date of submission.
 Table of contents - Main headings and sub-headings and an index of tables, figures and
graphs.
 Executive summary - An overview of the entire report in no more than five pages and a
discussion of the strengths and weaknesses of the chosen evaluation design.
 Introduction - Description of the programme in terms of the needs and objectives, and
systems of delivery. The background of the programme and its environment. The purpose
of the evaluation, its scope and main evaluation questions. Description of similar studies
which were done.
 Research methodology - Research design, implementation of the research, and the
collection and analysis of data.
 Evaluation results - findings, conclusions and recommendations; and
 Appendixes - Terms of reference for the evaluation, resources and sources, methodology
applied for the study (samples, methods etc).

At varying intervals reports will be sent to the respective stakeholders to give information
about:

 Decisions with regard to evaluation and design activities - An inception report detailing
the activities leading up to implementation, the updated action plan and the confirmed
number of clients referred to the programme will be sent to the Ministry of Social
Development (MSD) no later than one week after commencement.

 Provide information about upcoming programme activities and programme evaluation
progress - A progress report will be provided three months after the programme begins,
along with a report on the resources used to implement phase 1.
 Present interim findings - No later than four weeks after phase 1, a report detailing the
interim findings will be presented to the management team.
 Present final findings - After feedback is received from the main partners (training
institutions, employers, counselors and mentors, and participants), the information will be
evaluated by the core team and the final findings will be presented no later than 10
working days after receipt of the information.
 Present formal report - The final report will be presented to all stakeholders in both print
and electronic format, containing text, tables and all appendices, using Microsoft Word
and Excel.

Ethical issues

The evaluation team has an ethical obligation to abide by the laws of the land and the
guiding principles and policies of the organization (Morra Imas & Rist, 2009, p. 496).
Legitimate stakeholders should be involved in the planning process. High levels of competency,
integrity, honesty, fairness, consistency and objectivity must be maintained at all times to ensure
public confidence in the evaluation process. Information obtained for the purpose of the
evaluation must not be misused. The process and product of the evaluation should be accurate,
complete and reliable, and there should be transparency. As such, persons participating in the
programme should know why they are being evaluated and how the data collected will be used.
The evaluation must protect and respect the dignity and privacy of the people involved.
Information requested for evaluation purposes should not infringe on the personal values, culture
or standards of the individual and should be provided willingly. Members of the evaluation
team should avoid being unduly influenced and should report conflicts of interest to avoid
bringing the credibility of the evaluation into question (Morra Imas & Rist, 2009, p. 510). They
should not accept bribes or favors, should use the training they receive to avoid questionable
behaviors, and should report allegations of corruption or fraud to the appropriate authority for
investigation (Morra Imas & Rist, 2009, p. 499).

References

Morra Imas, L. G., & Rist, R. C. (2009). The road to results: Designing and conducting effective
development evaluations. Washington, DC: The World Bank.

Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2007). Designing effective
instruction (5th ed.). New Jersey: John Wiley & Sons.

Rogers, P. J., & Goodrick, D. (2010). In Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (Eds.),
Handbook of practical program evaluation (3rd ed., pp. 429-453). San Francisco, CA:
Jossey-Bass.
