General Concepts
What is Evaluation?
A process that attempts to determine as systematically and objectively as possible the relevance, effectiveness, and impact of activities in light of their objectives.
What is Evaluation?
Effective programme evaluation is a systematic way to improve and account for public health actions by involving procedures that are useful, feasible, ethical, and accurate.
General Concept
PLANNING
IMPLEMENTATION
EVALUATION
General Concept
[Diagram: the community programme cycle: Input → Process (performance) → Output → Effects]
Hierarchy of Objectives
[Diagram: hierarchy of objectives. The general goal (improve health) sits within a contextual medical/social framework; the practical programme activities (the intervention) sit within an operational framework; process evaluation links the two levels.]
EVALUATION
Making a value judgement about something.
A critical assessment of the good and bad points of an intervention, and how it can be improved.
Evaluation showed that risk behaviours, such as fat consumption and smoking, declined more dramatically in North Karelia than in the rest of Finland. This change in behaviour was matched by a reduction in risk factors for coronary heart disease (CHD), such as mean serum cholesterol and blood pressure, which again was greater than for the rest of Finland. The population reported improvements in their health and general well-being. There was a greater reduction in the death rate from CHD in North Karelia than for Finland as a whole.
SOME DEFINITIONS
Evaluation is the process of assessing what has been achieved (whether the specified goals, objectives and targets have been met) and how it has been achieved.
SOME TERMS
Effectiveness: whether the intended outcome (the objectives) has been achieved.
Efficiency: how the outcome has been achieved, and how good the process is (value for money, use of time and other resources).
WHY EVALUATE?
1. To assess results and to determine if objectives have been met.
5. To improve our own practice by building on our successes and learning from our mistakes.
6. To determine the effectiveness and efficiency of different methods of Health Promotion. This helps in deciding the best use of resources.
7. To win credibility and support for Health Promotion.
8. To inform other health promoters so that they don't have to reinvent the wheel. This helps others to improve their practice.
WHAT TO EVALUATE?
1. WHAT has been achieved - the outcome
2. HOW it has been achieved - the process
Types of Evaluation
- Process (performance measures)
- Impact
- Outcome ("doing the right things")
Evaluation Framework
- Programme (process): facilitators? content? methods? time allotments? materials?
- Behaviour (impact): knowledge gain? attitude change? habit change? skill development?
- Health (outcome): mortality? morbidity? disability? quality of life?
TYPES OF EVALUATION
1. Process Evaluation
2. Impact Evaluation
3. Outcome Evaluation
1. PROCESS EVALUATION
The process refers to what happens between the input and the outcome.
Process evaluation (PE) is concerned with assessing the process of programme implementation and how the programme is performing as implementation takes place.
1. PROCESS EVALUATION
It is ongoing: a method of quality control. It monitors the progress of the programme and whether the planned activities are carried out efficiently, cost-effectively, and as scheduled.
Process Evaluation
Characteristics
- Shorter-term feedback on programme implementation, content, methods, participant response, practitioner response
- Shows what is working and what is not working
- Uses quantitative or qualitative data
Considerations
Process Evaluation
1. Sources of data: programme data
2. Limitations of data: completeness
3. Time frame
4. Costs
Example
e.g., number of low-income women screened through a mammography programme
2. IMPACT EVALUATION
Impact refers to immediate effects of the intervention or short-term outcome. It is carried out at the end of the programme.
Characteristics
Impact Evaluation
- Long- or short-term feedback on behaviours, knowledge, attitudes, beliefs
- Uses quantitative or qualitative data
- Also called summative (external) evaluation
- Addresses whether we are accomplishing our intermediate objectives
Impact Evaluation
Considerations
1. Sources of data: surveillance or programme data
2. Limitations of data (validity and reliability)
3. Time frame
4. Costs
Example
e.g., mammography screening rates
3. OUTCOME EVALUATION
Outcomes are the long-term consequences; they are usually the ultimate goals of a programme.
Outcome evaluation involves an assessment of the long-term effects of a programme. It is more difficult and time-consuming to implement.
Characteristics
Outcome Evaluation
- Long-term feedback on health status, morbidity, mortality, disability, quality of life
- Uses quantitative data
- Also called summative (external) evaluation
- Often used in strategic plans
- Asks: are we making a difference in long-term health outcomes?
Considerations
Outcome Evaluation
Example
e.g., cancer death rates
Evaluation Framework
Framework Development
Response to a problem:
Evaluation not applied consistently across programme areas
Evaluation not well-integrated into the day-to-day management of most programmes
What aspects of the programme will be considered when judging programme performance?
What standards must be reached for the programme to be considered successful?
Standards
Utility, Feasibility, Propriety, Accuracy
2. Evaluation preview
Action steps and elements/considerations:
- Use standards to form the basis for making judgments about programme performance.
- Effects, after the programme has occurred: identify and account for intended and unintended programme consequences.
Methods
- Experimental: random assignment to compare the effect of an intervention with otherwise equivalent groups
- Quasi-experimental: compare nonequivalent groups, or use multiple waves of data to set up a comparison (e.g. time series)
Standards
- Utility: ensures that an evaluation will serve the information needs of intended users
- Propriety standards
- Accuracy standards
HOW TO EVALUATE?
Depending on its purpose, there may be a number of questions that an evaluation attempts to answer and it is important that these are made explicit in advance.
The choice of such questions will affect the methodology chosen. The questions essentially look at whether an activity achieved its purpose, and, if not, why not.
EVALUATION QUESTIONS
- Questions related to INPUTS
- Questions related to OUTPUTS
- Questions related to OUTCOMES
[Diagram: major programme components and their relationship: Input (resources) → Process (activities) → Output (utilization of services, access to services; programme-based results) → Outcome (population-based results)]
Failures to achieve the objectives can occur at any of the three levels:
INPUTS, OUTPUTS or OUTCOMES.
INPUTS Questions:
- Did the inputs (resources) planned arrive?
- Were they sufficient to provide the health promotion planned?
- Were the resources transformed into the programme plan (adequate management and adequate direction)?
Inputs = Resources
(Human, Equipment and Commitment)
OUTPUT Questions:
- Service quantity
- Service quality
- Efficiency of services
- Socio-economic effects
- Technical difficulties
PROCESS
It is about assessing the operating procedures and the effect of the implementation process. For example, you might have a very successful IRS (indoor residual spraying) campaign, but if it comes at the expense of people being bullied into opening their houses and admitting the spraymen, then the procedure would be evaluated negatively.
Process evaluation gives useful insights into the implementation process, for example, how interventions are interpreted and responded to by different groups.
In general, process evaluation is most useful in conjunction with outcome evaluation rather than on its own.
The 3 Cs:
- Completeness
- Comprehensiveness
- Continuity
The 7 Ss:
- Strategy
- Structure
- Systems
- Shared values
- Skills
- Style
- Staff
PEST:
- Political factors
- Environmental factors
- Socio-economic factors
- Technological factors
- What are or were the objectives of the activity being evaluated?
- Were the objectives set achieved?
- Were any improvements observed a direct result of the activity?
OUTCOME Effects:
In some respects, impact evaluation and outcome evaluation may seem synonymous.
Outcome evaluation assesses whether a health promotion strategy has influenced people's knowledge, behaviour, or health in the desired direction.
Sometimes evaluation is difficult; for example, some effects take a long time to appear, so what is evaluated falls in the area of process rather than impact or outcome.
PROCESS EVALUATION
1. Measuring the programme inputs, i.e. the resources expended in implementing the programme, in order to determine whether the programme was worthwhile (efficient and cost-effective).
2. Using performance indicators (PIs) to measure activity. PIs provide a quantifiable measure of activity. Examples are:
Number of health educational materials produced and distributed.
3. Obtaining feedback from other people e.g. colleagues and other staff.
4. Obtaining feedback from the clients or participants of HP programmes: their reactions, perceptions and suggestions. Methods include observation, interviews, or questionnaires.
5. Documentation, e.g. reports, checklists, diaries, video-taping, slides, etc.
PIs need to be identified at the planning stage. Monitoring PIs helps you to determine how well your programme is progressing.
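Monitoring PIs against their targets can be sketched in a few lines; the indicators, targets, and figures below are purely hypothetical:

```python
# Hypothetical performance indicators for a health education programme:
# (indicator name, planned target, actual achieved to date)
indicators = [
    ("pamphlets distributed", 5000, 4200),
    ("school talks delivered", 20, 18),
    ("staff trained", 30, 31),
]

for name, target, actual in indicators:
    progress = 100 * actual / target  # percent of target achieved
    status = "on track" if progress >= 90 else "behind schedule"
    print(f"{name}: {actual}/{target} ({progress:.0f}%, {status})")
```

Reviewing such a table at regular intervals is what turns PIs into an early-warning system rather than an end-of-programme verdict.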
IMPACT EVALUATION
1. Assess behaviour:
- Observing what clients do.
- Recording behaviour, e.g. number of people attending exercise sessions, health screening, stop-smoking sessions, etc.
- Interviews or questionnaires.
3. Evaluate policy changes: introduction of pro-health policies in schools, workplaces, etc., such as safety policies, healthy food, exercise, no smoking, etc.
Less/no littering.
Creation of no-smoking zones/areas.
OUTCOME EVALUATION
This is the preferred evaluation method because it measures sustained and significant changes which have stood the test of time.
Uses hard evidence and quantitative methods.
Examples of policy changes: lead-free petrol, a ban on indirect tobacco advertising, compulsory use of bicycle helmets and rear seat belts, gazetting of No Smoking Areas, establishment of Safety and Health Committees in all workplaces, etc.
Examples of environmental changes: jogging tracks and playgrounds in housing areas, an improved public transportation system, better housing facilities, clean air and water, provision of separate motorcycle lanes on all major roads and highways, etc.
Examples of health outcomes: reduction in morbidity, disability and mortality rates; improved life expectancy; reduced prevalence of risk factors.
1. Compare the target group's health-related behaviour before and after the intervention.
- Some change will occur with time anyway.
- Confounding factors are difficult to eliminate.
2. Compare the target group's behaviour to another group with similar characteristics (demographic, socio-economic) who were not given the programme. The control group is necessary to avoid attributing all behaviour change to the HP programme and thereby overestimating its achievement.
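The logic of the control-group comparison can be illustrated with a simple difference-in-differences calculation; the prevalence figures below are hypothetical:

```python
# Hypothetical smoking prevalence (%) before and after a health promotion
# programme, in the target district and in a comparable control district.
target = {"before": 30.0, "after": 24.0}
control = {"before": 31.0, "after": 29.0}

# The raw before/after change in the target group overstates the programme's
# effect, because some change occurs over time regardless of the programme.
raw_change = target["after"] - target["before"]        # -6.0 points

# The control group's change estimates the background (secular) trend;
# subtracting it leaves the change attributable to the programme.
secular_trend = control["after"] - control["before"]   # -2.0 points
programme_effect = raw_change - secular_trend          # -4.0 points

print(f"raw change: {raw_change:+.1f}; adjusted effect: {programme_effect:+.1f}")
```

The same subtraction is what "adjusting for secular trends" means in practice: only change over and above the control group's trend is credited to the programme.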
CHALLENGES IN EVALUATION
1. Some objectives are difficult to measure, e.g. attitudes and behaviours. Need to select appropriate evaluation criteria and performance indicators (specific, sensitive, relevant, etc.).
2. Contamination of HP Outcome
HP is a long term process and can be influenced by many extraneous situational factors. How to adjust for these confounding factors? Difficult to ensure that any change detected is only due to the programme input and not to any outside influence.
3. When to Evaluate?
The timing of evaluation affects the assessment of the overall success or failure of a programme due to time effects.
Delay of impact
The effects of a programme may not be immediate e.g. behaviour change. Immediate evaluation might not yield positive results.
Decay of impact
Changes due to the programme are not sustained, and after some time the situation reverts to pre-programme levels. Late evaluation will not yield results.
Adjusting for secular trends
Many factors are already changing in the desired direction even in the absence of a HP programme. Only those changes over and above the general trend may be attributed to the programme.
Summary
Evaluation Aim
Evaluation is not an aim in itself.
- The aim is to learn from the experience and make programme improvements for effective national work.
- Collaboration with national partners is key.
- Communication is important.
- Evaluation is often the last thing considered or funded.
- Programme personnel may feel threatened by an evaluation; evaluators need to maintain objectivity.
Organizational Issues
- Trade-off between a comprehensive evaluation and nothing at all
- Apply the 10% rule (commonly, about a tenth of programme resources reserved for evaluation) as you design and implement programmes
Generating evidence
Accountability
Making the case for health promotion: we are competing against illness treatment.
[Diagram: mapping the planning hierarchy to evaluation types]
- Goal → Outcome evaluation
- Determinants and Objectives → Impact evaluation
- Strategies → Process evaluation
- Evaluate at specific project stages (i.e. after sessions, at monthly intervals, and/or at programme completion).
- Long-term outcome measures: incidence/prevalence of health conditions, changes in mortality, improvements in quality of life, long-term changes in behaviour (e.g. smoking rates).
- Objective: is it SMART?
- What are the key questions that the evaluation should answer?
- Be strategic, rather than evaluating every single intervention within a programme.
- What information do we need in order to answer these questions?
- How will we get this information: who, when, how?
- Plan for analysis, reporting and dissemination.
- Budget realistically: what can we afford to do? What does the government/donor expect for its investment?
Why Evaluate?
- Make sound decisions (should the programme be continued, scaled back, discontinued, enhanced?)
- Justify programmes and explain accomplishments
- Promote programmes, products, and services
- Gain funding
- Guide programme development
The idea of using evaluation data to improve a programme or make decisions is central to evaluation. Evaluations should be conducted with a specific use for, and user of, the evaluation in mind.
Importance of Use
Formative evaluation asks what is working, what needs to be improved, and how it can be improved; it is generally used internally and often occurs in the early stages of programme development. Summative evaluation informs decisions about a programme's future; it is used internally and externally by key decision-makers (programme staff, supervisors, funders) and often occurs later in programme development.
Formative vs Summative:
"When the cook tastes the soup, that's formative evaluation; when the guest tastes it, that's summative evaluation."
4. Collect data
5. Analyze data and interpret results
6. Communicate and use the results to improve the program or make decisions
Focusing Your Evaluation
A. Identify the purpose for your evaluation (clarify uses and users)
Logic Model
SITUATION → INPUTS → OUTPUTS → OUTCOMES
INPUTS
What we invest
Resources and contributions Staff, volunteers, time, money, materials, equipment, technology, partners, facilities
OUTPUTS
What we do and who we reach
Activities (training, recruitment, workshop etc.) and Products (activity guide, exhibit, curriculum, poster, etc.)
OUTCOMES
The results
Skills, opinions, motivations, actions, policies
External Factors
Context in which the program is situated and external conditions which influence the success of the program, such as:
Underlying Assumptions
Beliefs we have about the programme, the way we think it will work, and about the participants.
Faulty assumptions lead to poor results: Are your assumptions realistic and sound?
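The logic model components above can be collected into a simple data structure when planning an evaluation; the programme and all its details below are hypothetical:

```python
# A minimal logic model for a hypothetical community walking programme.
logic_model = {
    "situation": "low physical activity among adults in the district",
    "inputs": ["2 health educators", "community hall", "modest budget"],
    "outputs": {
        "activities": ["weekly walking sessions", "pedometer loan scheme"],
        "reach": "150 adults enrolled (target)",
    },
    "outcomes": {
        "short_term": "knowledge of physical activity guidelines",
        "medium_term": "regular walking habit",
        "long_term": "reduced cardiovascular risk",
    },
    "external_factors": ["weather", "competing programmes"],
    "assumptions": ["participants can attend weekday evenings"],
}

# A quick completeness check: every component of the model should be filled in
# before evaluation questions and indicators are attached to it.
for component, content in logic_model.items():
    assert content, f"logic model component '{component}' is empty"
    print(f"{component}: {content}")
```

Writing the model down this explicitly makes the external factors and assumptions visible, so faulty assumptions can be challenged before they lead to poor results.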
Evaluation Plan (matrix columns):
- Evaluation Questions: what do you want to know?
- Indicators: how will you know it?
- Sources of Info.: who can provide the info?
- Data Collection Tools: what will you use to gather the info?
- Design/Sampling: from whom and when will the info be gathered?
Evaluation Questions
Good evaluation questions should:
- Provide information that addresses the intended use for the evaluation results
- Contribute information that is not already known
- Be of continuing interest
- Be feasible in terms of time, money, and skill
Issues can emerge that require new or revised questions; be flexible, yet do not chase every new or interesting direction that emerges
Indicators
Evidence or information that represents the phenomena of interest:
- What would indicate this programme objective/outcome has been achieved?
- What does success look like?
Indicators help you know something; they are usually specific and measurable.
For each aspect you want to measure, ask: What would it look like? What kind of information is needed?
Sources of Info
- Participants
- Non-participants
- Key informants (parents, teachers, previous participants)
- Programme staff, administrators, or partners
- Programme documents (logs, records, minutes of meetings)
Tools
Qualitative: descriptive, narrative, rich in explanation. Often collected from a smaller set of participants. Depth.
Quantitative: numerical measurement. Often collected from a larger set of participants. In some cases, can be generalizable to a population. Breadth.
Design
When would you need to collect data & from whom if you want to show:
a gain or change in participants' knowledge?
Sampling
Sampling decisions are based on population size, what you want to know, and the resources available.
First questions to ask: What is the population of interest and is sampling needed? If the population is small, you likely will include all its members.
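If the population is large, a simple random sample keeps data collection feasible; a minimal sketch, with a hypothetical sampling frame:

```python
import random

# Hypothetical sampling frame: all 400 programme participants.
population = [f"participant_{i:03d}" for i in range(400)]

random.seed(42)  # fix the seed so the drawn sample is reproducible
sample = random.sample(population, k=50)  # simple random sample, no repeats

print(f"{len(sample)} of {len(population)} participants selected")
```

Because every member of the frame has an equal chance of selection, results from the sample can be generalized to the participant population within sampling error.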
Interviews or Surveys?
- Interviews: designed to elicit thoughts, feelings, experiences, and stories from respondents; no response options are provided; qualitative.
- Surveys: closed questions; quantitative.
Interviews
- Create an interview guide
- How many respondents? It is an iterative process.
Conducting Interviews
- Be consistent with questions and cues!
- Reassure participants that you will protect their identity.
Focus Groups
- Group interviews that encourage participants to build on each other's responses
- Guidelines for developing interviews apply to focus groups as well
- Last roughly 1-2 hours and involve 6-10 people
Surveys
- Process large amounts of data
- Best for generalizing results to a larger population
- Can be used for planning, formative, and summative types of evaluation
- Always pilot test your surveys!
Collect Data
First, review your logic model, evaluation focus, and planning matrix to ensure that you are on target with the data you collect
Look for existing data that may meet your needs before designing new data collection protocol
Data Collection
- Maintain consistency!
- Data collection methods and instruments need to be pilot tested
- Standardize data collection
Implementing a Survey
How will you (or will you) entice people to complete your survey?
Analysing the data
Surveys:
- Enter data on a spreadsheet.
- Reduce the data to summary statements and relate these back to your evaluation questions.
Focus groups:
- Transcription is not always necessary; notes are often sufficient.
- Identify similarities and patterns.
- Group themes into meaningful categories.
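Reducing survey data to summary statements can be done with nothing more than the standard library; the survey item and ratings below are hypothetical:

```python
from statistics import mean

# Hypothetical post-programme survey item, rated on a 1-5 scale:
# "How confident are you in preparing healthy meals?"
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

summary = {
    "n": len(responses),
    "mean": mean(responses),
    "percent_4_or_5": 100 * sum(r >= 4 for r in responses) / len(responses),
}

# The summary is then related back to the evaluation question, e.g.
# "Did participants' confidence improve after the programme?"
print(f"n={summary['n']}, mean={summary['mean']:.1f}, "
      f"high ratings={summary['percent_4_or_5']:.0f}%")
```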
Explain results
Draw Conclusions
- Review open-ended questions for insights.
- Don't ignore unexpected results.
- Consider unsolicited comments.
- Avoid claiming causality (use "the data suggest" or "the results indicate").
Make Recommendations
- Focus on the most important, data-based conclusions.
- Be practical.
List specific action items
Formats for communicating results: PowerPoint, press release, web site.
Pilot Testing
You may want to do a pilot test in order to evaluate the effect of your programme. (A pilot test is a practice run using a small group who are similar to your target audience.)
Pilot Testing
You gather feedback about the programme from this test group. You evaluate the pilot programme and make needed changes in your programme before you carry it out with the wider audience.
Pilot Testing
Pilot testing lets you find out whether there are any major problems before you commit yourself to the programme, and gives you an idea of possible evaluation results.