
EVALUATION MODELS EME 505

MODEL | PROPONENT | FEATURES | STEPS | STRENGTHS | WEAKNESSES


1. CIPP Evaluation Model
   Proponent: Daniel Stufflebeam
   Features:
   - Guides both evaluators and stakeholders in posing relevant questions and conducting assessments at the beginning of a project, while it is in progress, and at its end
   - Both formative and summative
   Steps:
   - Context Evaluation (Goals)
   - Input Evaluation (Plans)
   - Process Evaluation (Actions)
   - Product Evaluation (Outcomes)
   Strengths:
   - Not designed with any specific program or solution in mind; it can therefore be easily applied to multiple evaluation situations
   - Its comprehensive approach to evaluation can be applied from program planning through program outcomes and fulfillment of core values
   - Well established, with a long history of applicability
   - Combines formative evaluation with summative evaluation
   - All sections are considered, so no part of the program is overlooked
   - Clear format for all stakeholders and evaluators to follow
   - Procedure is very specific
   Weaknesses:
   - Could be said to blur the line between evaluation and other investigative processes such as needs assessment
   - Not as widely known and applied in the performance improvement field as other models
   - Needs very careful planning or it will not succeed
   - Multiple data collection techniques are needed to address each type of data or evaluation question

2. Stake's Countenance Model
   Proponent: Robert Stake
   Features:
   - Description and judgment
   - Intents and observations
   - Antecedents, transactions, and contingencies
   Strengths:
   - Provides evaluators with an opportunity to compare the desired outcome with the actual outcome
   - The curriculum developers themselves set the criteria of evaluation; all the evaluators do is determine whether the curriculum performed in a manner consistent with the developers' ideas
   Weaknesses:
   - Too ideal; focuses on the ideal of the program
   - Does not emphasize the needs and issues of the program
3. Tyler's Rationale for Program Evaluation
   Proponent: Ralph Tyler
   Features:
   - Linear: involves a definite order or sequence of steps from beginning to end
   - Deductive: starts from the general (examining the needs of society) and moves to the specific (specifying instructional objectives)
   Steps:
   1. Stating the objectives
   2. Selecting learning experiences related to the objectives
   3. Organizing learning experiences
   4. Evaluating the curriculum
   Strengths:
   - Involves active participation of the learner
   - Objectives are clearly defined in the purposes, and these purposes are translated into educational objectives
   - Simple linear approach to the development of behavioral objectives
   Weaknesses:
   - Objectives are narrowly interpreted (limited to "acceptable" verbs)
   - Construction of behavioral objectives is difficult and time consuming
   - Curriculum is restricted to a constricted range of student skills and knowledge
   - Learning experiences are individual and are not totally within the teacher's power to select
4. Provus's Discrepancy Evaluation Model
   Proponent: Malcolm Edwin Provus
   Features:
   - Aims to improve, maintain, or terminate the program
   - Formal or formative
   - Built around program definition and program installation
   Steps:
   1. Program Definition
   2. Program Installation
   3. Program Process
   4. Program Product
   5. Program Comparison
   Strengths:
   - The evaluator is a facilitator; the program staff are the ones who evaluate
   - Provides valid information for decision making
   - Emphasizes self-evaluation and improvement of programs
   - Provides room to change the standards or the program execution, or to discontinue the program
   - All personnel participate in the process
   - The model is operational and reality oriented
   - Provides continuous feedback that strengthens the process and facilitates program monitoring
   - Allows corrections to be made throughout the change
   - Helps identify areas of concern
   Weaknesses:
   - The information obtained does not provide a solid basis for judging the value of a program in its entirety
   - Ignores, to some extent, the total evaluation, as it emphasizes partial evaluation by stages
   - The development and application of the criteria used to observe the program are not clearly specified
   - Deprives us of the evaluative judgments of the evaluator, since he only submits the discrepancies to the person making the decisions
   - The recommendations of the evaluator (the expert) could be ignored at the other levels

5. Scriven's Formative and Summative Evaluation
   Proponent: Michael Scriven
   Features:
   - Goal-free evaluation (GFE): any evaluation in which the evaluator conducts the evaluation without particular knowledge of, or reference to, stated or predetermined goals and objectives
   - Focuses on unintended effects
   Steps (qualitative methods):
   - Unstructured interview
   - Participant observation
   Strengths:
   - Controls goal-orientation-related biases
   - Uncovers side effects
   - Avoids the rhetoric of "true goals"
   - Adapts to contextual or environmental changes
   - Aligns goals with actual program activities and outcomes
   Weaknesses:
   - There is a chance that some of the most important effects will be missed
   - The model fails to come to grips with the question of what effects to look at and what needs to be assessed
   - This approach can lead to poor planning
   - Seen as a threat by many program designers
   - Does not provide explicit directions for developing and implementing the model
   - Not necessarily a practical model
   - The evaluator does not get rid of all goals, but replaces the goals of the project staff with more global goals based on societal needs and basic standards of morality

6. Course Improvement Through Evaluation
   Proponent: Lee J. Cronbach
   Features (uses of evaluation):
   - Course improvement
   - Decisions about individuals
   - Administrative regulation
   Steps: Course/program curriculum
7. Ochave's ABCD Evaluation Model
   Proponent: Jesus Arce Ochave
   Features:
   - Combination of different classical evaluation models
   - Comprehensive, flexible, and allows the evaluator to trace causal factors
   Steps:
   - A: respondents
   - B: the program/operations
   - C: the effects
   - D: social impact
8. Metfessel and Michael's Paradigm Involving Multiple Criterion Measures for the Evaluation of the Effectiveness of School Programs
   Proponents: Newton S. Metfessel and William B. Michael (influenced by Ralph Tyler)
   Features:
   - Objective-oriented evaluation model
   - Possible alternative instruments (multiple criterion measures)
   Steps:
   1. Involve the total school community as participants
   2. List specific objectives in hierarchical order, from general to specific desired outcomes
   3. Translate the specific behavioral objectives into a communicable form applicable to facilitating learning in the school environment
   4. Select or construct a variety of instruments that will furnish measures from which inferences can be drawn about the effectiveness of programs in meeting planned objectives
   5. Carry out periodic observations, using the varied instruments, to gauge the extent of behavioral change that is valid with respect to the selected objectives
   6. Analyze the data provided by the measures of change through the use of appropriate statistical methods
   7. Interpret the data relative to the specified objectives in terms of particular judgmental standards and values considered appropriate to desirable levels of performance
   8. Make recommendations that provide a basis for further implementation, modification, and revision of broad goals and specific objectives, with the purpose of program improvement
   Strengths:
   - Simple and easily understood
   - Easy to follow and implement
   - Produces information relevant to the goals and mission
   - Involves stakeholders as facilitators of program evaluation
   Weaknesses:
   - Judgmental decisions are involved throughout all phases of the evaluation
   - Measures may yield indications of false gains or false losses that are correlated with:
     - experiences in and outside of the school's environment;
     - uncontrolled differences in the facilitating effects of teachers and other school personnel;
     - inaccuracies in collecting, reading, analyzing, collating, and reporting data;
     - errors in research design and statistical methodology

9. Steinmetz's Discrepancy Evaluation Model
   Proponent: Andres Steinmetz
   Steps:
   - Creating the standard (S)
   - Collecting performance information (P)
   - Feeding back the discrepancy (D)
   Strengths:
   - Pragmatic, systematic approach to a wide variety of evaluation needs
   - Provides well-informed decision making
   - Emphasis on self-evaluation and systematic program improvement
   Weaknesses:
   - Too broad
   - Too complex to apply
   - Few evaluators apply this model
10. Connoisseurship and Criticism Evaluation
    Proponent: Elliot Eisner
    Features:
    - Descriptive
    - Interpretative
11. Kerrigan Evaluation Model
    Proponent: John Kerrigan
    Features (four levels):
    - Reaction
    - Learning
    - Applications
    - Results
    Strengths:
    - Good evaluation tool for training programs and short courses
    Weaknesses:
    - Restricted to application in training programs
12. Impact Evaluation
    Weaknesses:
    - Costly and needs a lot of time
