
Review: Introduction

 Define Evaluation

 How do formal/informal evaluation differ?

 What are two uses of evaluation in education?

 What are the pros/cons of using an external evaluator?
Alternative Approaches to Evaluation
Alternative Approaches
 Stakeholders: individuals and groups who have a direct interest in, and may be affected by, the evaluation; they should be involved early, actively & continuously

 Program: activities that are provided on a continuing basis; typically what is evaluated

 There are a variety of alternative, often conflicting, views of what evaluation is and how it should be carried out
Why so many alternatives?

 The way one views evaluation directly impacts the type of activities/methods used

 Alternative models stem from differences in:
 Philosophical & ideological beliefs
 Methodological preferences
 Practical choices
Philosophical & Ideological Beliefs
 Epistemologies (philosophies of knowing)
 Objectivism (rooted in social science empiricism; emphasizes replicability)
 Subjectivism (experientially based; draws on tacit knowledge)
 Pros/Cons of each?
 Principles for assigning value (parallel obj/subj)
 Utilitarian: focus on group gains (avg scores); greatest
good for the greatest number
 Intuitionist-pluralist: value is individually-determined
 Room for both or are these dichotomous?
 Philosophical purists are rare (impractical?)
 Choose the methods right for THAT evaluation
 Understand assumptions/limitations of different
approaches
Methodological Preferences
 Quantitative (numerical)
 Qualitative (non-numerical)

 Evaluation is a transdiscipline; crosses paradigms


 “Law of the instrument” fallacy
 With hammer/nails, all appears to need hammering

 Identify what is useful in each evaluation approach, use it wisely & avoid being distracted by approaches designed to deal with different needs
Practical Considerations
 Evaluators disagree over whether the intent of evaluation is to render a value judgment
 Should decision makers or the evaluator render the judgment?

 Evaluators differ in views of evaluation’s political role
 Authority? Responsibility? These dictate evaluation style

 Influence of evaluators’ prior experience

 Who should conduct the evaluation and the nature of expertise needed to do so

 Desirability (?) of having a wide variety of evaluation approaches
Classification Schema for Evaluation Approaches
Conceptual approaches to evaluation, NOT techniques
 Objectives-oriented: focus on goals/objectives &
degree to which they are achieved
 Management-oriented: identifying and meeting
informational needs of decision makers
 Consumer-oriented: generate information to guide
product/service use by consumers
 Expertise-oriented: use of professional expertise to
judge quality of evaluation object
 Participant-oriented: stakeholders centrally involved in
process
 See Figure 3.1 (p. 68)
Objectives-oriented Approach
 Purposes of some activity are specified and
then evaluation focuses on the extent to which
these purposes are achieved
 Ralph W. Tyler popularized this approach in education (criterion-referenced testing)
 Tylerian models
 Metfessel & Michael’s paradigm (enlarged vision of
alternative instruments to collect evaluation data)
 Provus’s Discrepancy Evaluation Model (agree on standards; determine whether a discrepancy exists between performance and standards; use discrepancy information to decide whether to improve, maintain, or terminate the program)
 Logic models
 Determine long-term outcomes & backtrack to today
Objectives-oriented Steps
 Establish broad goals or objectives tied to
mission statement
 Classify the goals or objectives
 Define objectives in behavioral terms
 Find situations where achievement of objectives
can be shown
 Select/develop measurement techniques
 Collect performance data
 Compare data with behaviorally stated
objectives
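
As a rough illustration of the final comparison step, here is a minimal sketch in the spirit of Provus’s discrepancy logic. The function name, scores, and tolerance cutoff are assumptions for illustration only, not part of any published model:

```python
# Compare collected performance data with a behaviorally stated objective
# and suggest a program decision based on the size of the discrepancy.
# All names, scores, and the tolerance cutoff are hypothetical.

def discrepancy_decision(performance: float, standard: float,
                         tolerance: float = 5.0) -> str:
    """Return a suggested decision given observed performance and the
    agreed-upon standard."""
    discrepancy = standard - performance
    if discrepancy <= 0:
        return "maintain"   # standard met or exceeded
    if discrepancy <= tolerance:
        return "improve"    # small gap: revise and continue
    return "terminate"      # large gap: rethink or end the program

# Example: the objective called for a mean score of 80; the observed mean is 72.
print(discrepancy_decision(performance=72.0, standard=80.0))  # -> terminate
```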
Objectives-oriented Pros/Cons

 Strengths: simplicity; easy to understand, follow, and implement; produces information relevant to the mission

 Weakness: can lead to tunnel vision
 Ignores outcomes not covered by objectives
 Neglects the value of the objectives themselves
 Neglects the context in which evaluation takes place
Goal-Free Evaluation
This is the opposite of objectives-oriented evaluation, but the two supplement one another
 Purposefully avoid awareness of goals; goals should not be taken as given but should themselves be evaluated
 Predetermined goals not allowed to narrow focus of
evaluation study
 Focus on actual outcomes rather than intended
 Evaluator has limited contact with program
manager and staff
 Increases likelihood of seeing unintended outcomes
Management-oriented Approach
 Geared to serve decision makers
 Identifies decisions administrator must make
 Collects data re: +/- of each decision alternative
 Success based on teamwork between evaluators
and decision makers
 Systems approach to education in which
decisions are made about inputs, processes,
and outputs
 The decision maker is always the audience to whom the evaluation is directed
CIPP Evaluation Model (Stufflebeam)
 Context Evaluation: planning decisions
 Needs to address? Existing programs?
 Input Evaluation: structuring decisions
 Available resources, alternative strategies?
 Process Evaluation: implementing decisions
 How well is plan being implemented? Barriers to
success? Revisions needed?
 Product Evaluation: recycling decisions
 Results? Needs reduced? What to do after program
has ‘run its course’?
CIPP Steps
 Focusing the Evaluation
 Collection of Information
 Organization of Information
 Analysis of Information
 Reporting of Information
 Administration of Evaluation (timeline, staffing, budget, etc.)
Context Evaluation
See Table 5.1

 Objective: define the institutional context, identify the target population, and assess their needs

 Method: system analysis, surveys, hearings, interviews, diagnostic tests, Delphi technique (expert consensus)

 For deciding upon the setting to be served, the goals associated with meeting needs, and the objectives for solving problems
Input Evaluation
 Objective: identify and assess system capabilities,
procedural designs for implementing the strategies,
budgets, schedules

 Method: inventory human and material resources; assess feasibility and economics via literature review; visit exemplary programs

 For selecting sources of support and solution strategies in order to structure change activities, and to provide a basis for judging implementation
Process Evaluation
 Objective: identify or predict defects in the process or
procedural design, record/judge procedural events

 Method: monitoring potential procedural barriers, continual interaction with and observation of the activities of the staff

 For implementing and refining the program design and procedures (i.e., process control)
Product Evaluation
 Objective: collect descriptions and judgments of outcomes, relate them to context, input, and process information, and interpret worth/merit

 Methods: measure outcomes, collect stakeholder information, analyze data

 For deciding to continue, terminate, modify, or refocus an activity, and to document the effects (whether intended or unintended)
Uses of Management-oriented Approaches to Evaluation
 CIPP has been used in school districts, state
and federal government agencies

 Useful guide for program improvement

 Accountability

 Figure 5.1 (p. 94)
 Formative and summative aspects of CIPP
Management-oriented Pros/Cons
 Strengths: appealing to many who like rational, orderly approaches; gives focus to the evaluation; allows for formative and summative evaluation

 Weaknesses: preference given to top management; can be costly and complex; assumes important decisions can be identified in advance of the evaluation
REVIEW/Qs
 Why are there so many alternative approaches
to evaluation?
 What two conceptual approaches to evaluation
did we discuss tonight? What are their +/-?
 Which, if either, of these approaches do you
think will work for your evaluation object?

 Identify your most likely evaluation object
