
Evaluation is for making it work.

If it works ...
Notice and nurture.
If it doesn't work ...
Notice and change.

Design Manual

Program Evaluation
A PRACTITIONER'S GUIDE FOR
TRAINERS AND EDUCATORS
ROBERT O. BRINKERHOFF
DALE M. BRETHOWER
TERRY HLUCHYJ
JERI RIDINGS NOWAKOWSKI

Kluwer-Nijhoff Publishing
Boston The Hague Dordrecht Lancaster
a member of the Kluwer Academic Publishers Group

Distributors for North America:


Kluwer-Nijhoff Publishing
Kluwer Boston, Inc.
190 Old Derby Street
Hingham, Massachusetts 02043, U.S.A.
Distributors outside North America:
Kluwer Academic Publishers Group
Distribution Centre
P.O. Box 322
3300 AH Dordrecht, The Netherlands
Library of Congress Cataloging in Publication Data
Main entry under title:
Program evaluation: a practitioner's guide for trainers and educators (a design manual).
(Evaluation in education and human services)
1. Evaluation research (Social action programs) - Handbooks, manuals, etc. 2. Educational accountability - Handbooks, manuals, etc. I. Brinkerhoff, Robert O. II. Series.
HV11.P74 1983    361.6'1    82-16208
ISBN-13: 978-0-89838-122-1

e-ISBN-13: 978-94-009-6667-3

DOI: 10.1007/978-94-009-6667-3

Copyright 1983 by Evaluation Center, Western Michigan University. No part of this book may be reproduced in any form by print, photoprint, microfilm, or any other means, without written permission from the publisher.

Fourth printing 1986

Contents

ABOUT THESE MATERIALS
INTRODUCTION
GETTING STARTED
PRODUCTS
1: Evaluation Preview
2: Outline of Evaluation Questions
3: Information Collection Plan
4: Analysis and Interpretation Plan
5: Report Plan
6: Management Plan
7: Plan to Evaluate the Evaluation
APPENDICES
A: Selecting What (an Object) to Evaluate
B: An Example of an Evaluation Design
C: Extra Worksheets

About these Materials


Introduction to the Package

Program Evaluation: A Practitioner's Guide was developed by the Evaluation Training Consortium (ETC) project at the Evaluation Center, Western
Michigan University. The ETC project was funded by the U.S. Office of
Special Education from 1972 to 1982; it has developed program evaluation
procedures for use by teacher educators and delivered training to thousands
of professionals across the United States. The mission of the ETC has been
to improve the evaluation capabilities of projects and programs engaged in
preparing personnel to work with special and regular education clients and
pupils. This package of materials is intended to carry forward that mission,
and help educators to help themselves improve educational practice.
This set of materials is for use in training, teacher education, and other
professional development programs and projects in private and public
agencies, public schools and colleges and universities. They are designed to
help individuals or groups in their own work, and they can be used to train
others.
The package has the following parts:
(1) Sourcebook, which contains chapters of guidelines, resources and
references for each of 7 key evaluation functions.
(2) Casebook (bound together with Sourcebook), which is a collection of
twelve stories about evaluation applied to real-life projects and programs in different settings. These show people planning, conducting and
using evaluation.
(3) Design Manual, which contains a programmed set of directions,
worksheets, examples, and checklists to help you design an evaluation
for a particular use.

Conceptual Basis
These materials are about designing, conducting, and using evaluation,
but their underlying assumption is that evaluation should be useful for
improving current and/or future training efforts. While these materials are
meant to help you do evaluation well, we believe that evaluation is not
worth doing at all unless you can use it to make training better, or to better
invest training resources.
Good training, whether preservice or inservice, must satisfy four conditions:

(1) Training must be directed toward worthwhile goals.


(2) Training strategies must be theoretically sound, reflect good practice,
be feasible, and make optimum use of available resources.
(3) Implementation of training must be efficiently managed and responsive
to emerging problems and changing conditions.
(4) Recycling decisions (i.e., to terminate, continue, curtail or expand
training) should be based on knowledge of impacts of training, the
extent to which training outcomes are in use, and the worth of training.
These decisions should be responsive to continuing and emerging needs
and problems.
These criteria are not independent; each is important to the others. Training designs must not only be potent, they must be directed toward worthwhile goals; good designs can serve as guides to implementation, and implementation is facilitated by good design; and well-implemented training is most likely to have positive outcomes. Also, these criteria are functionally related in a cycle which repeats as training programs grow and develop:
Cycle of Training Functions: Identify Goals -> Design Training Strategy -> Implement Training -> Make Recycling Decisions -> (and back to Identify Goals)

Evaluation activities are what tie these training functions together.


Different kinds of evaluations are done during each of these training
function stages to ensure that the function is carried out as well as it can
be.
Table 1 shows the different kinds of evaluation we have defined and
portrayed in these materials. The Casebook provides examples of these
different uses; the Sourcebook will help you learn about options and guidelines
for doing these different kinds of evaluation. The Design Manual can help you
design an evaluation to serve one or more of these evaluation purposes.


Table 1. Evaluation Purposes Related to the Key Training Program Functions

Key Training Function: 1. Identify worthwhile training goals
Evaluation Purposes and Uses: Assess needs, validate goals, prioritize goals, identify constraints and problems related to goals for training

Key Training Function: 2. Design effective training strategies
Evaluation Purposes and Uses: Assess alternative strategies, compare training designs, identify criteria to judge designs, determine feasibility and potential for success

Key Training Function: 3. Effectively implement training
Evaluation Purposes and Uses: Monitor and control program operation, identify problems and revision needs, determine whether objectives are achieved, document costs and activities

Key Training Function: 4. Decide whether to terminate, continue, curtail or expand training
Evaluation Purposes and Uses: Determine usage and application, identify emerging and continuing needs, determine benefits of training, identify problems and revision needs to enhance training usage

How the Materials Are Organized


The Sourcebook is organized by major evaluation function:
(1) focusing an evaluation and clarifying its purpose
(2) designing an evaluation
(3) collecting information
(4) analyzing information
(5) reporting: interpreting and using evaluation findings
(6) managing evaluation activities
(7) evaluating evaluation efforts

Each function is defined, then the several key decisions needed to complete
the function are explained. The Sourcebook contains examples, guidelines,
criteria and checklists you can use to do more effective evaluation. It also
includes references to other books and resources that can be useful in
evaluating training programs.
The Casebook contains twelve case-examples. Each is a story about
evaluation within a particular training program. The case examples,
designed to portray evaluation applications of different types in different
settings, were contributed by field practitioners and written in conjunction
with ETC staff. They are fictional accounts but based on actual programs
and uses of evaluation. Each case-example is annotated to highlight the
seven major evaluation functions as set forth in the Sourcebook. This is done
to show how these functions differ according to particular program needs
and settings. Following each case is a set of review and discussion questions
to help extend the lessons available in the case-example.


The Design Manual contains worksheets, directions, and guidelines for designing an evaluation. Its organization is similar to the Sourcebook, as it
helps you produce different parts of an overall evaluation design. Each
section presents an example of the design product needed; gives you
worksheets, directions, and aids for producing that document; and provides
a checklist for assessing your work.
You can use the Design Manual to produce:
(1) an evaluation overview
(2) an outline of evaluation questions
(3) an information collection plan
(4) an analysis and interpretation plan
(5) a management plan
(6) a report plan
(7) a plan for evaluating your evaluation

Suggestions for Using the Materials


There is no one particular order in which these materials are meant to be
used. You could begin in any of the three parts, using them alone or in
combination. Where you begin and how you use the materials depends on
what you want to use them for. We'll suggest some possible options and
applications here. You could follow one or more of these, or simply look
through the materials and make up your own way of using them.
Remember that you can use these materials by yourself or in conjunction
with a group. Or, you could use these materials to train other people in
evaluation.
Some Options

1. To learn about how evaluation could be used to help with particular

problems, you could read some of the case-examples. Use the guide
below to see which cases relate to certain problems.

some common uses/problems and relevant case-examples (numbers listed are from the Casebook Table of Contents):

-putting together an inservice workshop or program: L-1, L-2, C-2
-designing, conducting a needs assessment: L-1, C-1, C-5
-looking at child-change as a result of inservice (worth of training): L-4
-managing a new project: C-2
-evaluating services provided from an agency: S-3
-improving curriculum and courses: C-1, C-5
-proposal evaluation: S-2
-monitoring programs, improving services: S-1, S-3, L-3
-looking for evidence of impact and worth: L-4, S-3, C-3, L-2
-improving an evaluation: C-4

2. To learn more about evaluation applications in your setting: read the

case-examples for your setting (state agency, local school, college).


3. To learn more about evaluation in general and how it fits in with training:
read the Sourcebook and discuss a few cases with yourself or others.
4. To complete some evaluation design work for a program you're working
on: use the Design Manual.
5. To become more knowledgeable and proficient in evaluation: read the
Sourcebook (then try some more evaluation work!).
6. To train others:
(1) Read the Sourcebook yourself and use it as the basis for training.
(2) Select some case-examples from the Casebook for participants to read
and discuss.
(3) Have participants work on evaluation designs using the Design
Manual.

Introduction
Please glance over the questions that follow and read the answers to those that are of
interest.

Q: What does this manual do?
A: This manual guides the user through designing an evaluation.

Q: Who can use it?
A: Anyone interested or involved in evaluating professional training or inservice education programs. The primary users will be staff members who are doing their own program evaluation, maybe for the first time. (Experienced evaluators or other professional educators can find useful guides and worksheets in it.)

Q: If I work through this manual, what will I accomplish?
A: You will develop one or more evaluation designs, and perhaps you'll also use the designs to evaluate something to make it better or to document its current value.

Q: What is an evaluation design?
A: An evaluation design is a conceptual and procedural map for getting important information about training efforts to people who can use it, as shown in the graphic below.

The Evaluation Cycle: People need to know something about a training effort -> Information is collected about the training effort -> Information is interpreted and valued -> People use information to change the training or themselves -> (and the cycle begins again)

Q: If I use this manual to develop a design, what will I have when I am done?
A: You will have a complete evaluation design: a set of products relevant to seven major evaluation decisions.

DECISIONS and PRODUCTS

1. What is the general focus of the evaluation? What are you trying to prove, improve, or discover? What are you evaluating? Who cares? For whom will the evaluation be done? What constraints are there?
Product: 1. Evaluation Preview

2. What are the questions you are trying to answer? (e.g., Is a training program implemented properly? Is it doing what it is supposed to?)
Product: 2. Outline of Evaluation Questions

3. How will you collect the information needed to answer the questions? (e.g., Will you examine records? Interview people? Observe the activities? Who will do the actual work?)
Product: 3. Information Collection Plan

4. How will you analyze and interpret the information? (e.g., Will you have objective criteria? Rely on expert judgment? Compare to some standard or comparison group?)
Product: 4. Analysis and Interpretation Plan

5. How will you communicate about the evaluation? (e.g., Will you keep people informed as you go? What data will you provide, when, and to whom?)
Product: 5. Report Plan

6. How will you manage the work? (e.g., How much time, money, and person power can you use? What are the time lines and deadlines?)
Product: 6. Management Plan

7. How will you tell if your evaluation work is any good? (e.g., Will you compare it to exemplary evaluation standards? Look to see if it has a positive impact?)
Product: 7. Plan for Evaluating the Evaluation

Q: Is this the only way to do a design?
A: No. It's just one way that works reasonably well.
Q: If I do my design this way, will I be assured of having a good one?
A: Not completely. You'll be well on your way to having a good design
because you'll be dealing with the right kind of information. Product 7
helps you apply evaluation to your evaluation work so you can improve as
you get more experience.
Q: What if I don't have very much time available?
A: The manual is designed with busy people in mind. We set up the materials
so that you don't have to spend a lot of time before you get something you
can use. You work on a real project from the outset. If it's a simple project
(or you are exceptionally clever), you'll have a good design in twenty
minutes or so. More likely you'll have a not-so-elegant design by then that
you'll improve later.
Q: Can the manual be used by someone without very much evaluation experience?
A: Sure. We use a "try it, then improve it" strategy. You work simply at first,
then check your work and make it better.

Q: Why not just show how to do it right the first time?


A: Doing it right the first time only works for very simple tasks. Complex tasks
are learned by a "try, then improve" strategy. Think back over your efforts
to learn to walk, to write an essay, to teach a lesson, to write an objective, to
develop a personal relationship, to understand a textbook, to ride a bicycle,
to figure out what you want to do with your life. How often did you get it
right the first try? (Do you have it right yet?)
Expert evaluators don't get it right the first time either. The first try of the
expert evaluator might be a lot more elegant than yours would be, but it
will still have to be recycled. (Sometimes the expert's first try is no good
because it's too elegant and needs to be made practical.)
Q: How about more expert evaluators? How would they use this manual?
A: Some people who use the manual will have nearly all of the knowledge and
skills necessary to do good evaluation planning. They just need to know
what the steps are and see an example of each. Their "first drafts" will be
very close to final products. Other people will need more help from the
materials and need more recycling to do good work. Setting up the
materials this way gives you more control over how much instruction you
get from the materials. (It's a form of unobtrusive individualization.)
Q: So you call this a "try it, then improve it" strategy?
A: That's right. We don't attempt to get you to get it right the first time
because: (a) it doesn't work; (b) it's not part of the process you are learning;
and (c) doing it this way gives help when you need it but doesn't hold you
back.

GETTING STARTED:

DEVELOPING A NOT-SO-ELEGANT DESIGN


PRODUCT OVERVIEW

1. The first step in getting started is deciding what you'll evaluate. A program? A workshop? A manual? Something else? If you're unsure about what you might use this manual to develop an evaluation design for, turn to Appendix A, "Selecting What (an Object) to Evaluate."
2. A not-so-elegant design is a first, quick overview of a complete evaluation design. It tells: what you'll evaluate, for whom and why; the questions you have to pursue; how you'll pursue them; what you'll do with information you collect; and how you'll determine how well you did all those things.
3. A not-so-elegant design might be all the evaluation design work you need to do.
4. By doing a not-so-elegant design, you not only get a start (perhaps a finish) on a design, but you'll learn how this manual works to guide you through the design process.

TASK STEPS

Prepare
Procedure: 1. Study the example of a not-so-elegant design.
Objective: Knowledge of its key parts.

Try
Procedure: 2. Take about 15 minutes and use the worksheet to do your own not-so-elegant design.
Objective: A not-so-elegant design for your own work.

Improve
Procedure: 3. Read about tactics for further planning and involving others in helping improve your design. Use the checklist provided to decide what refinements your design needs.
Objective: Decisions about how to involve others and how much more design work to do.

Move On
Procedure: 4. Browse through the rest of the manual to identify some tips, aids, and worksheets.
Objective: Decision about what parts of the manual you'll use to develop your evaluation design.


EXAMPLE
A NOT-SO-ELEGANT DESIGN

Product 1. Evaluation Preview (purpose, audience, object, constraints)
"I'd like to know what some of our retired citizens think of one of our inservice workshops. We school board members need to know what they think as we consider the next millage campaign and make decisions about budgets."

Product 2. Outline of Questions (evaluation questions and subquestions)
"I'd like to know whether they think what we're teaching makes sense and if it's worth the money."

Product 3. Information Collection Plan (information collection procedures)
"We could invite four or five of them to sit in on a couple of the inservices. We'd try to get one who has grandchildren in school and one who lives in the senior citizens home, that sort of thing. They'd sit in on a typical inservice or two, and fill out a questionnaire, then some high school kids could write it all up."

Product 4. Analysis and Interpretation Plan (information analysis, criteria for judging)
"Nothing fancy. Just tabulate the results and we'd see if they agreed with one another and how they liked it. We'd have to consider other natural biases, but if more than half of them are negative, I'd really begin to worry."

Product 5. Report Plan (content, format, schedule, audience)
"Why have the high school students interview and write a report? Wouldn't it be better to bring the folks here? Good idea! Then we'd get first hand reports and we could ask 'em anything we wanted to."

Product 6. Management Plan (tasks, personnel, resources)
"Bob, why don't you see if we could get a high school social studies or civics class to handle the whole thing. They could select the people and the inservices, contact everybody, and make up the questionnaire. They could interview the old folks and write a report. Let's see if we can get that going next term."

Product 7. Plan to Evaluate the Evaluation
"I'm not sure those high school kids would do a good job. Let's have them develop their plan, then we'll get someone who knows how to do this sort of thing to review their work."


WHAT TO NOTICE ABOUT THE EXAMPLE


1. It includes sketchy outlines of seven (7) evaluation design products. Though the description comprising each product is very sketchy, and perhaps needs more work, the design is complete; all seven design decisions are represented.
2. Taking an idea and thinking it through at about this level of detail is a good way to start developing a good evaluation design. Some designs, especially to evaluate very simple objects, don't need a lot more than this brief start. Others will need a lot more development for each product.
3. The not-so-elegant design has the same parts and products as a longer design. The manual gives worksheets, tips, and aids for further developing a more complete evaluation design. Though the whole design may be longer and more detailed, it will still contain these seven (7) parts.
4. The not-so-elegant design provides a basis for integrating the evaluation design. Even though there's not a lot of detail, it's possible to make some judgments about how good an evaluation design this is, and where it needs revision and more work. Would it provide useful and accurate information? Could it be pulled off? Should it be?


WORKSHEET
A NOT-SO-ELEGANT DESIGN
PRODUCT (YOUR RESPONSE: answer in your head or write notes here)

1. EVALUATION PREVIEW
why will the evaluation be done?
for whom will it be done?
what will be evaluated? (See Appendix A if you're undecided)
what constraints are known?

2. OUTLINE OF EVALUATION QUESTIONS
what questions and subquestions will the evaluation address for each audience?

3. INFORMATION COLLECTION PLAN
what sources of information will be used?
how will information be collected?

4. ANALYSIS AND INTERPRETATION PLAN
how will the information be analyzed?
what criteria will be used to judge the object?
what procedures will be used to make those judgments?

5. REPORT PLAN
what reports will be made?
what should their contents be?
when will they be given?
to whom will they be made?
what format will be used for them?

6. MANAGEMENT PLAN
what tasks need to be accomplished?
who will do them?
when will they be done?
what resources will be needed to do them?

7. PLAN TO EVALUATE THE EVALUATION
how will you check your design?
how will you know if it's going ok?
what can you use to judge the overall evaluation effort?


AID #1
THE PARTS OF AN EVALUATION DESIGN

1. Evaluation Preview (object description, purpose, audiences, constraints)
Where it comes from: The evaluation begins with a training effort (object) about which someone(s) (audiences) has an interest or other information need, then sets about to think of what evaluation can do (purpose) to meet needs and interests. This evaluation work will get done within some actual setting, in a certain time, etc. (constraints).

2. Outline of Evaluation Questions
Where it comes from: The evaluator works with audiences and descriptions of the object to decide just what questions the evaluation should aim to address in order to meet its purpose(s).

3. Information Collection Plan
Where it comes from: Information about the object's context and performance will be collected to meet audiences' needs and interests.

4. Analysis and Interpretation Plan
Where it comes from: Information will be processed and compared against expectations, values, criteria and other referents in order to weigh its meaning.

5. Report Plan
Where it comes from: The evaluator keeps key audiences informed about the evaluation's intentions, progress and results, and works with them to help understand its implications.

6. Management Plan
Where it comes from: The evaluator and others think through who will do what, when, where, how and with what resources in order to make the evaluation work well.

7. Plan to Evaluate the Evaluation
Where it comes from: The evaluation's design, progress and results are investigated to help revise them and determine how well they're working, or worked.

NOTE: These parts are not always produced in the order shown. Emergent and more naturalistic evaluations employ a different order (e.g., evaluation questions might come after information is collected and analyzed). And, some evaluations go through several cycles of some of these steps. But whatever design approach one uses, virtually any evaluation has all of the parts described above.


TIPS
There are two major decisions about your evaluation that you should
consider now, before moving further into a design:
1. How much, and how, will you involve others?
2. How much planning should you do in advance?

Here are some thoughts on each:


Tactics for Involving Others

Tactic 1: Do the whole design by yourself, informally and in your head rather than written or shared with others.
Benefits and Risks: Low effort and commonly done. Risks misunderstandings and overlooking something critical.

Tactic 2: Do a rough draft of each part of the design yourself. Check it for accuracy as you interview others. Then write up the product and get people to sign off on it.
Benefits and Risks: Moderate effort. Reduces risks by providing a written agreement up front. Risks overlooking something if people sign without really thinking it through.

Tactic 3: Identify key persons from the evaluation audience for your project. Convene them for a planning session in which they (a) brainstorm through the process of doing a product, (b) break into subgroups to write parts of it, and (c) come back together to share, improve, and approve the resulting document.
Benefits and Risks: Moderate to large effort. Shares risks, obtains full and early involvement of key persons. Lower risk of overlooking something. Risks (a) not being able to get people to commit themselves that much or (b) uncovering/encountering serious conflicts early in the process. (That sometimes is a benefit rather than a risk!)

Tactic 4: Do a rough draft of each product. Use it as input into a session similar to that described for the third tactic. Get them to edit, modify, and sign off on it in the group session.
Benefits and Risks: Moderate to large effort. Obtains early involvement but can be perceived as manipulative, if badly handled. Reduces risks of getting embroiled in conflict.

Tactic 5: Form an advisory group that represents all audiences and involve them throughout.
Benefits and Risks: Moderate to large effort. Sustains involvement and promotes acceptance and use; however, it can be volatile and slow the process. Use of advisory groups is sometimes mandated.

Tactic 6: Tailor tactics to your situation. (Use elements similar to those above and/or based upon other techniques you know.)
Benefits and Risks: Matches your setting and abilities.


Planning Approaches

Approach: Plan in great detail before you begin. Specify carefully each step in the evaluation: what questions you'll address, how, who will get what information, when, and so on. Plan to follow your plan unless you absolutely have to deviate.
Benefits: People know what to expect. The plan can be used like a contract, to hold people accountable. Costs and time can be predicted and planned for.
Drawbacks: Assumes a greater degree of knowledge and control than may exist. Doesn't allow for response to changes. Limits the conclusions to what can be predicted and prescribed.

Approach: Wing it. Plan as you go, following the evaluation as it leads you. Begin only with a general purpose, and don't commit yourself to particular questions, methods or interim objectives.
Benefits: Can capitalize on opportunity and respond to changing needs and conditions. Is more compatible with how programs and general experience operate. Mimics life.
Drawbacks: Makes people nervous, which may affect cooperation. Hard to staff and budget. Difficult to get approval from administrators and participants who want more certainty.

Approach: Take the "middle road". Recognize the two planning errors: (1) It's a mistake to not have a plan. (2) It's a mistake to follow your plan completely. Plan as specifically as you can, but recognize, admit and plan for deviations. Be ready to take advantage of changes and respond to emerging needs.
Benefits: Reduces anxiety among parties to the evaluation because direction and general procedures are known, allows allocation of resources, yet maintains and recognizes legitimacy of deviation, spontaneity. Encourages ongoing contact with audiences. Represents a rational, humane approach.
Drawbacks: Is hardest to do well. Becomes easy to get committed to plans and blind to needs for change. Requires tolerance for ambiguity.


CHECKLIST
EVALUATION DESIGN ADEQUACY
DIRECTIONS: Use this checklist to identify areas of your evaluation design that
need more development. Then work through the corresponding products in this
manual. This is a master checklist. Each product section in the remainder of the
manual has its own, more detailed, checklist. Use these also.

A. Clarity of evaluation focus


1. Is there an adequate description of what (program, context, functions,

products, etc.) is to be evaluated? (object)

2. Is the evaluation object relatively stable and mature? Do you know what

kind of evaluation it can withstand? (object, purposes)

3. Are the reasons for the evaluation specified and defensible? (purpose)
4. Is it clear what planning, implementing, redesign, judging or other decisions
and interests are to be served by the evaluation? (purpose)
5. Are all relevant evaluation audiences described? (audiences)
6. Are the criteria, values and expectations that audiences will bring to bear in
interpreting information known and described? (audiences)
7. Have the events in the setting that are likely to influence the evaluation been
identified? (constraints)
8. Is someone available to do the evaluation who has some basic skills in

conducting evaluations? (constraints)

9. Is the setting conducive to or supportive of evaluation (e.g., political


support)? (constraints)
10. Would disruptions caused by the evaluation be tolerable? (constraints)

B. Evaluation questions
1. Have key stakeholders' needs and questions been identified?

2. Are questions for the evaluation important and worth answering?


3. Are questions sufficiently comprehensive? If addressed, would they meet the
evaluation's purpose?
C. Information Collection
1. Are there procedures available to answer the evaluation questions?


2. Are the kinds of information to be collected logically related to the

information needs?

3. Are the information collection procedures appropriate for the kinds of


information sought?
4. Is the evaluation likely to provide accurate information?
5. Is the evaluation likely to provide timely information?
6. Is the evaluation likely to provide information sufficient to meet its
purposes?
7. Is the evaluation likely to provide useful information to each audience?
8. Are the procedures compatible with the purposes of the evaluation?
9. Will information collection be minimally disruptive?
D. Analysis and interpretation
1. Are information organization, reduction, and storage procedures appropriate for the information to be collected?

2. Are information analysis procedures specified and appropriate?

3. Are methods and/or criteria for interpreting evaluation information known


and defensible?
E. Reporting
1. Are report audiences defined? Are they sufficiently comprehensive?

2. Are report formats, content, and schedules appropriate for audience

needs?

3. Will the evaluation report balanced information?


4. Will reports be timely and efficient?
5. Is the report plan responsive to rights for knowledge and information with
respect to relevant audiences?
F. Management
1. Does the design provide for adequate protection of human privacy and other

rights?

2. Are personnel roles specified and related to information collection and

management requirements?


3. Is the evaluation likely to be carried out in a professional and responsible manner?
4. Is the evaluation likely to be carried out legally?
5. Has sufficient time been allocated for evaluation activities (instrument development, data collection, analysis, reporting, management)?
6. Are sufficient fiscal, human, and material resources provided for?
7. Are personnel qualified to carry out assigned responsibilities?
8. Are intended data sources likely to be available and accessible?
9. Are management responsibilities and roles sufficient to support the evaluation?
10. Is it feasible to complete the evaluation in the allotted time?
11. Can a written agreement (memo or contract) be negotiated to document the considerations about evaluation goals, responsibilities, resources, and criteria?
12. Are there provisions for redesigning or redirecting the evaluation as experience over time may indicate?

G. Evaluating the evaluation
1. Is there a commitment to implement sound evaluation?
2. Is there agreement about criteria to judge the success of the evaluation?
3. Will the evaluation's credibility be jeopardized by evaluator bias?
4. Are there procedures planned to assess the quality of the evaluation's design, progress, and results?
5. Are there provisions for disseminating, reporting, interpreting, and otherwise utilizing the results and experience of the evaluation?


MOVING ON
Now you must make an important decision: Is your not-so-elegant design complete enough to guide you toward achieving your purposes in evaluating?

1. The decision is important but probably not difficult. Unless your project is unusually simple (or you are unusually competent and confident), you should use the examples, worksheets, and checklists in the manual to develop a more complete design.
2. Use the checklist on the previous page to help you decide what parts of your design need further development.
3. We ask you to develop a not-so-elegant design not because we expect you to use it as is, but because it gives you perspective. Experienced evaluators tend to do a short form in their head to orient them to the whole task. That way they don't get lost in the details. You need to do the same.
4. Wait! Before you make a final decision, browse through the manual to see some of what it has to offer. Be sure to look at Appendix B. It's an example of a moderately complex design that will show you the sort of thing you would generate by working on through the manual. After you've looked through the book and studied the example in Appendix B, you'll be able to make an informed decision about whether to stop with your not-so-elegant design, to improve upon it on your own, or to improve it by using the worksheets, aids, tips, and checklists in each of the seven (7) product sections that follow in the manual.
5. If you're unsure about how evaluation might be used in your work, read some of the examples in the Casebook.

SOURCEBOOK REFERENCES: Designing Evaluation

Key Issue 1: What are some alternative ways to design an evaluation?
Task: Determine the amount of planning, general purpose, and degree of control. (Pages 37-42)

Key Issue 2: What does a design include?
Task: Overview evaluation decisions, tasks, and products. (Pages 43-58)

Key Issue 3: How do you go about constructing a design?
Task: Determine general procedures for the evaluation. (Pages 59-60)

Key Issue 4: How do you recognize a good design?
Task: Assess the quality of the design. (Pages 64-71)

PRODUCT 1
EVALUATION PREVIEW

What you'll evaluate
Why you'll evaluate it
Who you'll involve and report to
What constraints will impinge on your work

[x] 1. Evaluation Preview
[ ] 2. Outline of Evaluation Questions
[ ] 3. Information Collection Plan
[ ] 4. Analysis and Interpretation Plan
[ ] 5. Report Plan
[ ] 6. Management Plan
[ ] 7. Plan to Evaluate the Evaluation


PRODUCT OVERVIEW

1. Evaluation Previews are developed by considering four key items: object,

audience, purpose, and constraints.

2. The object (defined as whatever is being evaluated) can be evaluated in

terms of its goals, components, activities, resources, etc.


3. The audience is everyone who has a specified interest in the object
evaluated or the purposes of the evaluation.
4. Purposes are the major reasons and uses for the evaluation.
5. Constraints are the known limitations and demands (e.g., budget,
timelines) that the evaluator accepts as givens.
6. The Evaluation Preview guides and gets refined by later steps in
developing the total evaluation design.
TASK STEPS

Prepare
Procedure: 1. Review the definition. Read the two examples of Evaluation Previews. Review the example using the "what to notice" list.
Objective: 1. Knowledge of the major parts of a preview.

Try
Procedure: 2. Work through four (4) worksheets, one for each of the major elements of a preview. Then pull these together into a preview statement like one of the examples.
Objective: 2. Content of a preview that specifies object, audience, purposes and constraints for your evaluation.

Improve
Procedure: 3. Use tips and checklists.
Objective: 3. Improved clarity, accuracy, or agreement about the content of the preview.

Move On
Procedure: 4. Reflect back on what you've done, check it out with others, and look ahead to next steps.
Objective: 4. Decisions about your next action steps.


EXAMPLE
AN EVALUATION PREVIEW

MEMO
FROM: Pat
TO: Larry, Stella, and Ed
RE: Evaluation of the Parents' Handbook

At our last staff meeting we talked about doing an evaluation of the


Parents' Handbook in its draft form. We all know there are rough spots, and that there are probably more than we know. The evaluation sounds
like a good way to detect those problems and help us with our last
round of revisions.
I'd like to have a meeting next week to talk more about the evaluation,
but I thought I'd get a head start on that through this memo by writing
my initial thoughts.
The major purpose for evaluating the Parents' Handbook is to determine
whether the pilot version meets the purposes for which it was intended,
so we can decide whether revisions are needed before it is more widely
distributed. More specifically, how does the handbook stand in relation
to three criteria:
extent to which it is easily used by and meets needs of parents for
whom we wrote it
clarity and comprehensiveness of the information presented
appropriateness of the cost for producing and distributing it
A secondary purpose is to provide information for the department's
overall evaluation effort regarding the status of its various projects. As
you know, the time for the end-of-the-year report is near; we'll have to
answer some questions about our progress. We should put together the
evaluation design-since we're the ones who are most interested in the
results. Then I'll show it to Dr. Wayland to see whether the design will
get him the answers to questions he has about our project.
You're probably wondering how we're going to manage to do the
evaluation, since we're so short on funds. Like so many other activities,
this one will have to be cheap, and we'll have to do it ourselves. But if
Dr. Wayland likes the idea, we may be able to get some funding for
travel to do some data collection, if that seems important to do, e.g., if

we decide to interview parents. Until we know more about the


questions we need to ask, I don't want to decide on specific strategies.
However, we do know that the evaluation must be completed by the
end of November so we can make revisions to the handbook and have
it ready for the printer in February.
Next week let's talk about the design for the evaluation and go over the
handbook carefully to see what specific parts we'll want to evaluate in
addition to the document as a whole. I'll bring a copy of our needs
assessment report since that will have lots of good leads about what
should be included in this evaluation.


EXAMPLE
ANOTHER EVALUATION PREVIEW

The Evaluation Associates


1102 Washington Boulevard
Randolph, New Hampshire
George Chandler
Project Proof
River Valley Intermediate School District
Breda, New Hampshire
Dear Mr. Chandler:
Based on our meeting yesterday, I've outlined what I see as our initial
understanding about the evaluation of Project Proof. Let's use it as a
starting point for our next meeting.
Object of the evaluation: Project Proof is one of several components in the
Office of Special Projects. During the past four years it's been through
several rounds of revision. Its purpose is to provide a system, to
improve the educational planning process, the quality of plans, and the
educational growth of the students. To help me learn more about it, I'll
read the documents you gave me, including those year-end reports,
which look particularly useful. Here's my picture of the project. Let's
refine it next week.

[Diagram: organization of the River Valley ISD, showing Administrative Offices (planning, budgeting) and Management Services linked to Building Consultation Services, the Vocational and Special Education Program, Audio-Visual Services and Resource Library, the Reading Resource Project, the Math Skills Project, Workshop Admin. Support, and the Catalog.]

Evaluation audiences: Given your concerns for efficient management, you

are the primary audience for my work. Others are your assistant, the
teachers who must use the project (and who I understand are skeptical
about time demands), and finally, the superintendent who must
approve an ongoing budget.
Purposes: The first purpose is to determine whether the project has met

intended needs and goals; the second is to help you decide whether the
project ought to be continued.
Other notes: The evaluation must be conducted within the next eight

months. You and others in your office will do some of the information
collection, but it seems likely that an outside evaluator will be a good
choice to enhance the credibility of the findings.
See you next Wednesday.
Sincerely,

Sam Quinn


WHAT TO NOTICE ABOUT THE EXAMPLES


1. A variety of formats may be used. The choice depends on your personal
preference for organizing the ideas and on your assumption about what
style will appeal to readers and will help them understand the context.
2. The preview includes a description of the object (what will be evaluated), the audience (for whom), and purpose (why the evaluation will be done). Each of these elements helps focus the evaluation and will guide your work on other parts of the design.
3. A preview also describes known constraints on the evaluation. Because it serves as an initial outline of the evaluation, the overview should include other information that will shape and guide the evaluation, such as budget or timeline limitations.
4. The level of detail will depend on how much is known about the evaluation when the preview is written, how much needs to be specified, and who will read it. For
example, the memo in the first example is intended to be informal and
for internal staff communication. It contains little detail about the
object, because it is familiar to the memo recipients. The second example
is a formal letter from an external evaluator. He tries to specify what he
knows about the evaluation focus so it can be verified by the client. In
neither case is the preview inflexible; although it can serve as an initial
agreement, any of its elements may change as the evaluation is further
designed and implemented. A preview serves as documentation of the
agreement and as a basis for negotiating any changes.
5. Major value and criteria considerations are included. Evaluation culminates with
judgments that will employ particular value references and criteria. The
preview should begin to identify and forecast these. Each major audience
brings with it a set of values, expectations, and criteria. Notice in the first
example that the staff wants a useful product. Notice in the second
example that the director values efficiency, teachers want practicality and
ease of use, the superintendent is concerned with cost.
6. The object, audience, and purposes guide the development of evaluation questions.
This in turn influences the selection of procedures for collecting and
analyzing information. Constraints shape the design by introducing
factors that influence feasibility.


WORKSHEET
DESCRIBING THE OBJECT (Defining "What" You'll Evaluate)
Directions: Use Aid #2 as a guide to describe what you will evaluate (the object) in the spaces below. You may want to write a narrative description and/or pictorial representation to answer the questions.

List: Who is involved in the object.

Note: Why it exists; what are its goals, objectives?

Describe or list: The functional elements of the object. What are its sub-parts and pieces?

Explain: When it did, or does, take place (or how frequently).

Describe: Where it exists.


AID #2
SOME ELEMENTS OF AN OBJECT DESCRIPTION

Who (Actors): Who's involved? Who are the key decision makers and leaders, the funders, the personnel implementing the program, and those being served by the program; who are the advocates and adversaries, interested luminaries?

Why (Goals): What goals and objectives do they appear to be pursuing? Are these conflicting goals? What needs are being addressed?

What (Components): Is there a model or description of the object; what and how many separate components are included and how do they interact?

What (Activities): What kinds of activities are included; what services are being provided; what maintenance or internal administrative services are provided?

What (Resources): What are the available benefits and opportunities? Consider budget, personnel, facilities, use of volunteer time, expert review or guidance, use of materials, machinery, communication systems, personnel, etc.

What (Problems): Generally, what appear to be the biggest issues or problems in the eyes of key stakeholders (department chair, faculty, students, dean)?

When (Timeline): How long has the object been around? What does its history look like, and what kind of future is it anticipating? How long does it have to accomplish short and long range goals?

Where (Setting): What in the setting influences the object? Where does it fit into the larger organization, the political network? What and who can it be generally influenced by?


TIPS
DESCRIBING THE OBJECT

Describing the evaluation object can be a good staff development activity. It


encourages people to clarify intents, roles, and values.
Do not go overboard on this activity. It is tempting to analyze the object and its
context in such detail that it becomes more confusing than helpful.
Some documents you might read to help you do this task are proposals, year-end and
interim reports, and brochures.
Because the object will change over time, make provisions for tracking those
changes and keeping a record of them. This may help you explain some evaluation
results later.
You need help for this task! Talk with key audiences and schedule interviews.
Do not believe everything you read. Proposals, especially, are notorious for not
representing reality.
Get multiple viewpoints. Things look different-and are different-from different
angles.
Sometimes the purpose of an evaluation is to describe the object. So you might not
know much now but that's alright.
Your own preconceptions and biases will always influence your description. That's
why a lot of checking with others is a good idea.


CHECKLIST
DESCRIBING THE OBJECT
Answer the following questions and revise the worksheet until you can answer "yes"
to all of them.

YES

NO

Does the description emphasize the key characteristics of the


object and its setting?

Is your description clear enough that others could understand the


description?

Have you included processes or aspects of the object that could


have major influences on the object as a whole if they
malfunctioned? (i.e., have you got the critical parts?)

Have you verified (or do you plan to verify) the factual accuracy
of your description-e.g., through independent judgment, direct
observation?

Do you have a process for keeping the description up to date, e.g., a log for recording critical events or a schedule to gather
new information?

Are you satisfied that you know enough about what you're
evaluating to proceed with an evaluation design?

Have you included all the parts or dimensions of the object that
are obviously related to the purposes of the evaluation? (Come
back to this after you've written purposes.)

Have you specified the processes or aspects of the object and


setting that are of most concern to your evaluation audiences?
(Come back to this after identifying audiences.)

Done all you can for now?


Move on to "AUDIENCES"


WORKSHEET
ANALYZING AUDIENCES

List the audiences for your evaluation (use Aid #3).

Identify persons/spokespersons for each audience.

Describe the particular values, interests, expectations, etc. that may play a key role as criteria in the analysis and interpretation stage of your evaluation.


AID #3
SOME TYPICAL AUDIENCES FOR TRAINING EVALUATIONS
An audience is a person or group who ...
makes decisions based on the
evaluation findings

provides information for the


evaluation

is involved in planning or creating


the program being evaluated

has a responsibility for the group


being studied

might be affected by the evaluation

is paying for the evaluation

has legal rights to know about the


evaluation

runs the program being evaluated

sponsors or commissions the


evaluation

advocates the program being


evaluated

approves or criticizes the program


being evaluated
Roles of persons or groups who might be audiences
principal
superintendent
community group
project director
registrar
foundation
board of education
legislator
taxpayer
compliance office
licensing or certification agency
project staff
faculty
student
training recipient
parent group

department chairperson
assistant superintendent
dean
political group
pupil personnel officer
federal office
news media
professional association
accrediting agency
contract administrator
busybody
friend
enemy
classroom teacher
union
client


TIPS
ANALYZING AUDIENCES
Don't worry about listing everyone you know as an audience. You may create
confusion by involving more people than can realistically be involved. Know who
the key audiences are. Establish levels of involvement.
If you leave a particular audience out of the evaluation, think through the
consequences of doing that.
One audience may lead you to others. Talk to or think of a few known audiences: go
from them to others.
Talking to members of audience groups is a good way to learn their perspectives on
the evaluation of the object.
You may need to add audiences in the course of the evaluation.
In your enthusiasm to involve audiences, don't promise them the world. Let them
know what your constraints are.
Throughout the evaluation, you will have to return to these audiences, as you try to
accommodate them. Establishing a good rapport at the beginning is important.
Think about: who could sink this evaluation? Who could make it a winner? Are these
people or groups on your list?
Remember that evaluation always has consequences, and sometimes "fallout." Try
to forecast these consequences, and include those who might be affected.


CHECKLIST
ANALYZING AUDIENCES

Answer the following questions and revise the worksheet until you can answer "yes"
to all of them.

YES   NO

Have you considered including at least a representative of everyone who:
will be involved in the evaluation?
will be affected by the evaluation?
will use the evaluation?

Do you have spokespersons for each audience or know how to


identify them?

Have you specified the value perspectives, concerns and interests?

Have you identified current or potential conflicting interests by


several audiences?

Have you considered any special problems in working with each


audience?

Have you (or will you) talk to audience members to discover


their concerns or interests?

And now ................................................. Move On to Purposes


WORKSHEET
IDENTIFYING PURPOSES

Purpose: Write each purpose below. Use Aid #4 as a guide.

Rank: Use a number to rank each purpose (if more than one).

Interested Audiences: List the people or groups who are primarily interested in each purpose.


AID #4
SOME EVALUATION PURPOSES

Columns: purposes related to training program functions (row labels); purposes for "new objects" (being planned or recently implemented); and purposes for "old objects" (been around awhile, perhaps through several rounds of revision). In each row below, the first group of purposes applies to new objects and the second group to old objects.

Goal or Need
Identification

to establish goals or needs


to evaluate the soundness of
goals
to rank goals or needs
to validate needs

to determine whether goals


and needs have been met
to evaluate the soundness of
goals
to identify goals and needs
that guided the object

Program Design

to select a design
to clarify roles or resolve
conflicts
to compare alternative
designs
to determine the adequacy of
a given design

to determine the adequacy of


the design that was used
to assess how well the design
was developed
to compare the design to
alternatives not used

Implementation
and Process

to guide the staff during


implementation
to help staff make
incremental improvements
to diagnose problems
to determine whether the
design is being run as
planned
to document what actually
takes place

to determine whether it was


implemented as planned
to identify and describe
problems that developed
to examine the relationship
between what occurred and
observed outcomes

Products and
Outcomes

to determine immediate
outcomes and initial effects
to determine what outcomes
are appropriate
to determine the quality of
intermediate products

to assess the quality of


outcomes
to determine what outcomes
were achieved
to determine whether
intended outcomes were
achieved
to uncover side effects

Recycling
Decisions

to improve the object for


future implementation
to determine how worthwhile
it is
to determine whether it is
what it is intended to be
to determine whether it is
worth the resources it will
consume

to determine whether it was


worthwhile
to determine whether it was
what it was intended to be
to determine whether it was
worth the resources it
consumed
to compare actual outcomes
against needs


TIPS
IDENTIFYING PURPOSES

Watch for conflicts if there are multiple purposes.


Clarifying purposes is a very important step; the rest of the design must be directed
toward meeting them.
If you are planning a long-term evaluation having many purposes, consider dividing
your evaluation into cycles, e.g., the evaluation of the design, of the first
implementation, etc.
Watch out for questionable though unstated purposes, e.g., to justify decisions
already made, to meet the demand for evaluation, to make the object look
good.
Don't believe everything you read. Dig, dig, dig.
Get all the purposes out. You don't want to be surprised later.
A neat trick for identifying purposes: Write (or ask your client to write) a one-paragraph "Summary of Conclusions," as if the evaluation were successfully completed.


CHECKLIST
IDENTIFYING PURPOSES

Answer the following questions and revise the worksheet until you can answer "yes"
to all of them.

YES

NO

Do your purposes appear specific enough to guide the


evaluation?

Are they general enough to allow a flexible approach to the


evaluation, if that's needed?

Will they help reduce unnecessary fears or anxieties?

Will they clarify what you are doing and why so you'll probably
get cooperation rather than resistance?

Have you considered "hidden purposes" and how they may


influence the evaluation?

Is the overall tone positive and realistic?

Is each purpose relevant to at least one of your audiences?

Do the purposes imply how the evaluation information will be


used?

Do you have a procedure for recording changes in evaluation


purposes?

Do the purposes cover as many of your audiences' interests as


possible? (You may want to come back to this item after
identifying constraints)
Done? Constraints next


WORKSHEET
IDENTIFYING CONSTRAINTS

Directions: Think about constraints for each of the categories below; use aid #5.
Then, list the constraints you will need to attend to.
Some guide questions and categories
Outline of Evaluation Questions
Is there a particular evaluation model
that's supposed to be used to guide
the evaluation?
Are there evaluation questions
required by a major audience?
Information Collection Plan
Are there information collection
procedures or instruments that must
be used?
Is there a group of persons who must
be involved as respondents in the
information collection?
What information is not available?
Analysis and Interpretation Plan
Have criteria for judging the object
been set?
Is there a procedure that's already
been selected for judging the object?
Do some audiences have particular
biases that should be recognized?
Report Plan
Is there a particular report that must
be made?
Are there mandatory report audiences?
Is there a fixed report schedule?
Management Plan
Must the evaluation be completed by a
particular time?
Is the evaluator already determined?
Is there a budget ceiling for the
evaluation?

Write your constraints here


Plan to Evaluate the Evaluation
Is there an auditor, third party, or "meta-evaluation" required?
Are there funds to evaluate the evaluation?
Other
Are there any other constraints on the evaluation?

Write your constraints here


AID #5
SOME INFLUENCES ON EVALUATION THAT CREATE CONSTRAINTS

INFLUENCES AND THEIR EVALUATION IMPLICATIONS

Organizational Politics: Is there political support for the evaluation? From whom? Are there opponents? How secure is the object within the organization?

Program Leadership: Who has control over the program, formally and informally; what goals do they have for the program's future? How does the evaluation fit those goals?

Professional Influences: How supportive are professional groups of the evaluation? Will you need to deal with union representatives? What will their agenda be?

History: How mature and stable is the object to be evaluated? Has there been a tradition of self-appraisal and evaluation use? Is the object stable enough to withstand evaluation? What information already exists?

Organizational Setting: Where does the program fit into the larger organizational network? Which decision makers can impact it? What kind of information could jeopardize the object?

Economics: How secure is the fiscal support system for the program and the evaluation? Have funds been allocated? Will a written commitment of fiscal support be forthcoming?

Social Patterns: How much disaffection (interpersonal conflict) is likely to result? Is the evaluation controversial to staff; are there apparent factions emerging as a result of its being discussed? What does the "normal" social pattern look like?

Legal Guidelines: Are there legal restrictions (rights of human subjects) that will limit collection of desired information? Are there professional or institutional rulings that affect evaluation procedures? Will the object be affected by pending legislation?

Resources: Will there be available resources to support the evaluation, e.g., skilled personnel, facilities, time, supportive climate, access to support services, access to personnel? Is there likely to be a change in resources that will affect the program?


TIPS
IDENTIFYING CONSTRAINTS
To uncover serious constraints, try listing (or having a colleague list) reasons why the
evaluation can't be done. These may alert you to potential problems or constraints
that won't necessarily make the evaluation impossible, but which certainly need to
be handled.
Some constraints may not become apparent until the evaluation is implemented.
You may identify some hidden ones by talking to key audiences now, particularly
those who control critical resources for the evaluation, e.g., funding, personnel,
and information.
Be sure to look for opportunities at the same time you look for limitations.
Don't be discouraged by the constraints you identify. Look hard for ways to design
the evaluation in spite of them. But don't proceed with the evaluation if you know
it's doomed.
Carefully review prior evaluations of the object and talk to people who conducted
them. These may highlight constraints you can expect.


CHECKLIST
IDENTIFYING CONSTRAINTS
Answer the following questions and revise the worksheet until you can answer "yes" to all of them.

                                                                        YES   NO
Have you asked the person who is responsible for the evaluation about constraints?
Have you been careful not to assume constraints where they don't exist?
Have you checked for funding agent, government or other demands the evaluation must meet?
Do you know what budget provisions and limitations exist?
Do you know what time lines must be met?
Are you satisfied that you've uncovered the major constraints likely to affect your work?
Can you do the job given these constraints?


MOVING ON

Check to see if you need to do any of the following things before going on to
develop evaluation questions.
___ 1. Write up the content of the object, audience, and purpose and constraint worksheets. You should do that if the tactics you decided on for involving others or for preplanning call for getting ideas or agreements from others early in your work. You should always do it for large or sensitive or complex evaluations. See the examples for suggestions of how to do this.
___ 2. Get additional information, suggestions, agreements, or guidance from people.
___ 3. Refer to other sources of information about evaluation per se (e.g., the Sourcebook, Casebook, or textbooks or other references).
___ 4. Circulate drafts of your work to key persons to get a "sign-off" on the Preview before you go further.
___ 5. Sometimes constraints and so forth are such that evaluation shouldn't be done. If that's the case with yours: Stop!

SOURCEBOOK REFERENCES

Focusing the Evaluation

Key issue 1: What will be evaluated? Task: Investigate what is to be evaluated. (Sourcebook pages 7-15)
Key issue 2: What is the purpose for evaluating? Task: Identify and justify purpose(s). (pages 16-19)
Key issue 3: Who will be affected by or involved in the evaluation? Task: Identify audiences. (pages 20-22)
Key issue 4: What elements in the setting are likely to influence the evaluation? Task: Study setting. (pages 23-26)
Key issue 5: Does the evaluation have the potential for successful implementation? Task: Decide whether to go on with evaluation. (pages 31-36)
PRODUCT 2

OUTLINE OF EVALUATION QUESTIONS

The questions you'll "answer" with information you collect
The aspects of your evaluation object that should be investigated
More specific definition of your evaluation purpose

[X] 1. Evaluation Preview
[X] 2. Outline of Evaluation Questions
[ ] 3. Information Collection Plan
[ ] 4. Analysis and Interpretation Plan
[ ] 5. Report Plan
[ ] 6. Management Plan
[ ] 7. Plan to Evaluate the Evaluation

PRODUCT OVERVIEW

1. Evaluation questions are the questions the evaluation will "answer" (address). They are questions someone (an audience) needs information about; they direct evaluation inquiry.
2. Evaluation questions are a more detailed breakdown of the purposes specified in the Preview.
3. Evaluation questions are divided into subquestions that provide specific guidance in deciding what information should be collected.
4. Evaluation questions and subquestions may get revised, clarified, or divided into more detailed questions later on.
TASK STEPS

Prepare
  Procedure: 1. Study the example of evaluation questions and subquestions
  Objective: 1. Knowledge about properties of evaluation questions

Try
  Procedure: 2. First, you draft evaluation questions in several categories. Then you rank these and write subquestions on a final worksheet
  Objective: 2. A list of questions and subquestions

Improve
  Procedure: 3. Use tips and a checklist
  Objective: 3. Improved and clarified questions

Move On
  Procedure: 4. Reflect back upon what you've done, check it out with others, and look ahead to next steps
  Objective: 4. Decisions about next steps


EXAMPLE
A SET OF EVALUATION QUESTIONS FOR A HYPOTHETICAL TRAINING WORKSHOP

1. Was the workshop implemented as planned?
   Subquestions: Did it go according to schedule? Were resource materials used effectively?
   Audience: Inservice director
   Why the question is important: Director needs to modify the plan where it doesn't work, or better manage staff.

2. What are the characteristics of participants?
   Subquestions: What are their positions in schools? Number of years in district? Special education certification status?
   Audience: Inservice staff with publicity or design responsibilities; funding agent who sponsors this for particular clients
   Why the question is important: The funding agent sponsors this for certain client types. We need to market the session better to nonattendees.

3. Did the workshop facilitate learning?
   Audience: Inservice staff; participants
   Why the question is important: Participants need to know what they learned. Staff needs to know workshop effects.

4. How appropriate were the workshop objectives?
   Audience: Administrators; staff
   Why the question is important: Workshop must respond to current and perceived needs, or it won't be attended or refunded.

5. How satisfied were participants with the workshop?
   Subquestions: Satisfied with: timing, length, staff performance, topics
   Audience: Participants; administrators; staff
   Why the question is important: Training won't work if participants don't like it and won't get involved. Positive reactions indicate utility and worth to participants.

6. What problems arose during the planning and delivery of the workshop?
   Audience: Inservice staff; director
   Why the question is important: Workshop still needs debugging, better preparation, more efficient design, etc.

7. How well did the staff perform their roles?
   Subquestions: Roles: as discussion leaders, inservice providers
   Audience: Inservice director
   Why the question is important: The director trains the staff to make sure they can, and do, fulfill their roles; can't afford a poor performance.


WHAT TO NOTICE ABOUT THE EXAMPLE

1. The degree of specificity can vary. Some questions require information about several subquestions or variables; others are "small" enough so that the evaluators choose not to break them into smaller subquestions.
2. How the questions are phrased can imply how they should be answered. For example, the investigation of staff performance in #5 requires finding out if participants are satisfied with staff, while #7 might, but does not necessarily, direct the information collection that way. It could involve other ways of collecting information, such as observation by supervisors or peer ratings.
3. As each question or subquestion is identified, some potential meanings are omitted. For example, question 2 has three subquestions. There are others which could have been added, e.g., age and number of years of teaching experience. In the process of analyzing the questions only the most important meanings are kept. The trick is to be careful to avoid eliminating some meanings prematurely and thereby inappropriately narrowing the evaluation.
4. Questions may ask for value judgments, for descriptive information, or for both. For example, #4 asks for a judgment about "appropriateness," while #2 asks for description of characteristics. Both types of questions are fine, but note that even #2 will be used to make value judgments about the program because the evaluation audience that interprets the findings will undoubtedly have in mind some notions about what the "appropriate" characteristics are, depending on their intended audience for the workshop.
   Another way of phrasing question #2 is to ask: "Did the characteristics of the participants at the workshop match those of the target audience?" This requires both description and judgment. If the workshop is in early stages of development, it's possible that judgments of what's appropriate will not be made about the workshop participants. One might ask, "Who came to the workshop?" coupled with "Which participants benefited the most?" The information could be used in the design of future workshops to specify who should attend and to focus the design.
5. There are a variety of types of questions. Questions can ask about many features of the object, for example (see also aid #2):
   goals and needs
   context
   design
   process
   outcomes
   potential problem areas
   controversial areas
   critical functions
   accountability issues
   costs
   potential market
   preconditions
   discussions
6. "Why the question is important" helps to determine criteria and values, as well as the nature of reporting. It shows what will be done with information collected for the evaluation questions, by explaining who needs the "answer," and/or what might be done with an "answer."


WORKSHEET
DRAFT EVALUATION QUESTIONS

Directions: In the spaces below write potential evaluation questions in each of the six categories, as pertinent to your evaluation project. Use aids #6 and #7.
Questions pertinent to key object functions:

Q?
Q?
Q?
Q?
Q?

Questions derived from theoretical models:

Q?
Q?
Q?
Q?
Q?

Questions based on expertise and experience:

Q?
Q?
Q?
Q?
Q?

Questions responsive to audience concerns and interest:

Q?
Q?
Q?
Q?
Q?

Questions defined from evaluation purpose:

Q?
Q?
Q?
Q?
Q?

"Bonus" (and other) questions:

Q?
Q?
Q?
Q?
Q?


AID #6
SIX METHODS OF DEFINING EVALUATION QUESTIONS

1. Analysis of the object: Identify key functions in the object and their interdependencies to highlight critical paths, major milestones, dependencies, etc. Evaluation questions are keyed to critical junctures, potential weak points, areas of staff concern, points of critical function, or key objectives and goals.
   Example: The evaluator did a systems analysis of the workshop, defining components in terms of their inputs, processes, and outputs. The staff reviewed this analysis, then generated evaluation questions in 3 categories: (a) Where are breakdowns most likely to occur? (b) Where is there the most disagreement as to the soundness of the design? (c) What are the most important objectives?

2. Use of theoretical frameworks: The object of evaluation is interpreted in light of a particular theoretical model, such as a change model, an evaluation model, learning theory, etc. Evaluation questions are derived from the model's key points and assumptions.
   Example: A review of training and evaluation literature turned up two "models" that seemed especially relevant. One* posed 4 major questions: (1) Did the participants like it? (2) Did they learn it? (3) Did they use what they learned? (4) Did using what they learned make a difference? Stufflebeam's CIPP model** suggested a different set of concerns: (1) Are the goals valid? (2) Is the design a good one? (3) Is the design well implemented? (4) Were the goals achieved? Using these models, the evaluator came up with evaluation questions pertinent to the two-day workshop.

3. External expertise and experience: Experts in the area of the evaluation object identify evaluation questions of importance; literature review of similar evaluations; review of similar programs.
   Example: The evaluator called a consultant friend who recommended three evaluation reports from similar workshops. These were reviewed and suggested some good evaluation questions.

4. Interaction with key audiences: Discuss the evaluation with audience members. Ask what questions they want answered, or what they believe is most important to investigate.
   Example: The evaluator interviewed several key audience members: the training director, the superintendent, a school board member and a few potential participants. Based on their interests and needs, some key evaluation questions were defined.

5. Definition of the purpose for evaluation: Do a logical, definitional analysis of the purposes for the evaluation. Identify the set of questions which, if addressed, would meet each purpose.
   Example: A staff meeting was held to brainstorm evaluation questions that seemed related to the purposes for the evaluation. This list was then synthesized to remove overlap and duplication. Then a Q-sort technique was used to assemble a set of questions that most agreed defined the purpose of the evaluation.

6. "Bonus" questions: Given that you're going to do an evaluation anyway, are there some questions you can pursue that will be worthwhile, perhaps for research, public relations, marketing, etc.?
   Example: The evaluator and project director discussed some opportunities presented by the evaluation. They decided it might be useful to explore whether participants who enrolled in order to meet recertification requirements were more or less successful than "volunteer" attendees, as these requirements were currently under state scrutiny.

* Brethower, Karen S. and Rummler, Geary, "Evaluating Training," Improving Human Performance Quarterly, Vol. 5, 1977, pp. 107-120.
** Stufflebeam, Daniel L. in Worthen & Sanders, Educational Evaluation: Theory and Practice. Jones Publishing, Ohio, 1973, pp. 128-150.


WORKSHEET
EVALUATION QUESTIONS AND SUBQUESTIONS

Directions: Review, organize, and revise your draft evaluation questions. Write them, their subquestions, audiences, and why they're important in the spaces below. Use the example of evaluation questions for the hypothetical training workshop and aid #7.

Evaluation Questions
Sub-Questions
Audience (Who cares?)
Why is the question important?


AID #7
EXAMPLE OF EVALUATION QUESTIONS (BASED ON A 2-DAY INSERVICE WORKSHOP ABOUT SCHWARTZIAN THEORY)

Goal or need identification
  For a "new" object (i.e., being planned or recently implemented):
    What problems are teachers having with handicapped children?
    What kinds of training do teachers want?
    What time and other constraints exist among the target population?
    To what extent do teachers value the planned workshop objectives?
  For an "old" object (i.e., been around awhile, perhaps through several rounds of revision):
    Are the identified needs still valid?
    Did the workshop address the three goals?
    What needs do attendees have?

Program design
  For a "new" object:
    Is Design A more practical than Design B?
    Is Design C any good?
    Are there sufficient resources for the chosen design?
    Are these exercises needed to meet the learning goals?
    What do teachers think of the design?
  For an "old" object:
    Is the design specific and practical enough for replication by satellite sites?
    What elements of the workshop are least productive?
    Is the new shorter design feasible? As potent?

Implementation or processes
  For a "new" object:
    Did trainers use Schwartzian Methods correctly?
    Who attended the session?
    Were Day 1 objectives met?
    What problems did trainers encounter?
  For an "old" object:
    Why don't other teachers attend the sessions?
    What problems are being encountered by new training staff?
    Who attends?
    Are attendance rates consistent across times and locations?

Products or outcomes
  For a "new" object:
    How many people attended optional sessions?
    Did teachers' knowledge of Schwartz's Theory increase?
    How well can teachers use Schwartz Techniques?
    Are graduates using Schwartz Techniques in their classrooms?
  For an "old" object:
    Does the Day 2 afternoon session help with problem solving?
    What are effects on pupils of teachers' use of new methods?
    What other uses or misuses are being made of workshop-acquired methods?
    Do people who receive less workshop training perform less well than others?

Recycling decisions
  For a "new" object:
    How much did the session cost?
    Do graduates consider the session worthwhile?
    Are principals supporting graduates' efforts to use new methods?
  For an "old" object:
    What are the costs and benefits to attendees of using the new methods?
    To what extent is the workshop reaching the population?
    What continuing needs and problems exist despite the workshop benefits?


TIPS
OUTLINING EVALUATION QUESTIONS
Some ways of doing the task:
You or a committee draft a set of questions, then circulate to relevant others for
revising and ranking;
take draft set to meeting at which others work on them;
share questions with staff, consultants, or others.
Don't worry now about "measurability" of questions. Wait until you have a set that
will meet your purposes, then check for feasibility of answering. Don't throw good
questions away until you're sure you must.
You will probably use the six methods (see Aid #6) of defining questions
simultaneously, so don't worry about deciding the "source" of each question.
Although it's important to be as specific as you can in writing evaluation questions,
don't force specificity. As questions are further defined and subdivided, some
potential meaning is lost. Make conscious decisions when omitting meanings.
In the course of the evaluation you may find that changes in your situation (e.g., level
of resources or availability of information) or your experiences in answering some
questions lead you to reconsider that set of questions. Some may become
outdated.
If your evaluation is exploratory-i.e., you want to do a long-term assessment of the
object, but aren't sure about all you want to investigate-consider asking just a few
evaluation questions, then adding others later.
If you need an "answer," it's a good question!
Naturalistic approaches to evaluation will generate evaluation questions as they go.
Your initial questions may be few and quite general, just enough to get you
pointed in the right direction.


CHECKLIST
OUTLINING EVALUATION QUESTIONS
Answer the following questions and revise the worksheet until you can answer "yes" to all of them.

                                                                        YES   NO
Is it feasible to answer the question, given what you know about the resources for the evaluation?
Is each question one that some member of the audience wants answered or is interested in?
If you omit a question, have you considered the consequences and who will be upset?
Is the wording of each question sufficiently clear and unambiguous?
Do you and those interested in the question have a common understanding about what possible "answers" might be?
Is each question specific enough to give you direction about how to go about addressing it?
Is each question specific without inappropriately limiting the scope of the evaluation?
Do they constitute a sufficient set to achieve the purpose(s) of the evaluation?
Do you know why each question is important?
Is each question worth the expense of answering it? (You may come back to this one after you complete an Information Collection Plan.)

Done all you can for now? ............................................. Move on


MOVING ON
Check to see if you need to do any of the following things before moving on to
develop an Information Collection Plan.
___ Take a break.
___ Check your evaluation questions with audience members to get their
suggestions or agreements.
___ Rewrite and tidy up your list of evaluation questions and subquestions. (You
can do that right on a worksheet that appears in the Information Collection
Plan section.)
___ Get more information about the object or the context (e.g., data from prior
evaluations, ongoing descriptive data) before you complete your questions.
___ Interview some audience members to make sure you've got their key questions included.

____ Renegotiate purposes to resolve conflicts.


___ Refer to other sources of information about evaluation (e.g., a colleague
who's expert, some of the references below) to get more, or better,
questions.
SOURCEBOOK REFERENCES

Focusing the Evaluation

Key issue 5: What are the critical evaluation questions? Task: Identify major questions. (Sourcebook pages 27-30)

PRODUCT 3

INFORMATION COLLECTION PLAN

What kind of information you'll collect
How you'll collect information
What kinds of information will "answer" which questions

[X] 1. Evaluation Preview
[X] 2. Outline of Evaluation Questions
[X] 3. Information Collection Plan
[ ] 4. Analysis and Interpretation Plan
[ ] 5. Report Plan
[ ] 6. Management Plan
[ ] 7. Plan to Evaluate the Evaluation

PRODUCT OVERVIEW

1. An Information Collection Plan specifies

what information is needed to answer each evaluation question,


where or from whom you'll get the information, and
how you'll get the information
2. Developing an Information Collection Plan helps you
provide an overall picture of the collection effort
encourage thorough and efficient information collection
identify a variety of sources of information
balance needs for complete and accurate information against needs for
practicality and avoiding disruption of ongoing educational processes
develop needed data collection instruments or procedures, and
refine and clarify evaluation questions
TASK STEPS

Prepare
  Procedure: 1. Study the example of an Information Collection Plan
  Objective: 1. Knowledge about formats for relating evaluation questions to information sources and data collection techniques

Try
  Procedure: 2. Use worksheet A to design and tinker with an overall information collection plan. Then, use worksheet B to specify the details of each information collection procedure.
  Objective: 2. An Information Collection Plan (A) and Schedule (B)

Improve
  Procedure: 3. Use tips and checklists
  Objective: 3. An improved and clarified plan

Move On
  Procedure: 4. Reflect back on what you've done, check it out with others, and look ahead to next steps
  Objective: 4. Decisions about next steps


EXAMPLE
INFORMATION COLLECTION PLAN (A)

Evaluation Questions:
1. Was workshop implemented as planned?
2. What are characteristics of participants?
3. Did workshop facilitate learning?
4. How appropriate were objectives?
5. Were participants satisfied?
6. What problems arose?
7. How well did staff perform their roles?

Information Collection Procedures:
A. Trainers maintain a checklist of activities and problems
B. Participants complete a registration form
C. Selected "key informants" maintain a log and respond to questionnaire
D. All participants complete a brief end-of-session questionnaire
E. Staff administer a self-scored and anonymously reported learning test
F. Analysis of selected workshop products

(In the matrix, an "X" in a cell shows which procedures address which questions.)


EXAMPLE
HOW EACH PROCEDURE WORKS (B)

A. Trainers maintain checklist
   Evaluation questions addressed: #1, #6
   Schedule: One form completed at end of each day
   Respondents: Group trainers
   Sample: Each one
   Instruments used: Checklist with space for problem notation

B. Participants complete registration form
   Evaluation questions addressed: #2
   Schedule: Filled in at registration
   Respondents: Participants
   Sample: Each who registers
   Instruments used: Checklist, fill-in-the-blank form

C. Key informants maintain logs and complete a questionnaire
   Evaluation questions addressed: #1, #3-7
   Schedule: Logs completed daily on-the-spot; questionnaire twice daily
   Respondents: Key informants
   Sample: Two selected from each group
   Instruments used: Log forms with guide questions; feedback questionnaire

D. All participants complete a questionnaire
   Evaluation questions addressed: #3-7
   Schedule: Complete before lunch on last day of session
   Respondents: Participants
   Sample: All who remain until lunch on last day
   Instruments used: Two-page Likert-type rating scale with comments option

E. Learning test
   Evaluation questions addressed: #3
   Schedule: Administer and self-score on morning of last day; discuss results
   Respondents: Participants
   Sample: All
   Instruments used: Multiple-choice test

F. Analysis of products
   Evaluation questions addressed: #3
   Schedule: Selected products are analyzed the day after session, then all products are returned to participants
   Respondents: Workshop evaluator
   Sample: Random; 3 products per group
   Instruments used: Work sample quality checklist


WHAT TO NOTICE ABOUT THE EXAMPLE

1. The Plan has two parts. The matrix form (A) shows which procedures will address which evaluation questions. The schedule (B) shows how and when each procedure works.
2. This plan depends on your having at least one evaluation question. Because your choice of procedures and decisions about how they will be applied depends on the kind of information you need, you must have selected at least one evaluation question to get started on your collection plan. Certainly, you may have more than one, even a set of questions which define the entire scope of the evaluation. Nevertheless, even with just a few questions you can begin to decide how to collect information in an efficient and comprehensive way. You can then add to your plan as the evaluation proceeds.
3. The matrix plan shows which procedures (rows) are intended to collect information for which questions (columns); "X"s in the cells show this.
4. Some procedures (like C and D) respond to more than one question.
5. Some evaluation questions (#1, for example) are addressed by more than one procedure.
6. A good plan blends economy with thoroughness. That is, the plan reflects an effort to use only a few procedures to answer many questions (economy), as well as an interest in obtaining complete information for each question by using a variety of sources (thoroughness).
7. There are many types of procedures for collecting information. Some general sources of information are persons, products, and processes. For each source there are a number of specific ways of collecting information, for example, questionnaires, tests, record analysis, interviews, and observations. (See aid #8)
8. In addition to specifying how information will be collected (procedure), the plan indicates who will collect it (administrator), when it will be collected (schedule), from whom (respondent and sample), and what instruments are needed.
9. Samples of information sources vary in kind, in size, and in the proportion of the population accessed. It's often wise to use samples. Small purposive samples (see aid #10) are especially useful in training evaluations.
10. The plan investigates several different kinds of information. Behavior, perceptions, demographic characteristics, and products are used as indicators (see aid #9) of workshop progress and effects.
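If you keep the matrix in a spreadsheet or a small script, the economy and thoroughness checks in points 3 through 6 can be made mechanically. The short Python sketch below is only one illustration of how you might record Plan A; the procedure labels are shortened from the example above, and the checks simply list questions no procedure covers and questions that rest on a single source.

import sys  # only used so the sketch runs as a stand-alone script

# Map each information collection procedure to the evaluation questions it addresses
# (the question numbers follow the example Plan A/B above).
plan_a = {
    "A. Trainer checklist":     {1, 6},
    "B. Registration form":     {2},
    "C. Key informant logs":    {1, 3, 4, 5, 6, 7},
    "D. End-of-session survey": {3, 4, 5, 6, 7},
    "E. Learning test":         {3},
    "F. Product analysis":      {3},
}
all_questions = set(range(1, 8))  # evaluation questions 1-7

# Thoroughness check: every question should be covered by at least one procedure.
covered = set().union(*plan_a.values())
print("Uncovered questions:", sorted(all_questions - covered))

# Economy/corroboration check: flag questions that depend on a single procedure.
for q in sorted(all_questions):
    sources = [name for name, questions in plan_a.items() if q in questions]
    if len(sources) == 1:
        print(f"Question {q} depends on a single procedure: {sources[0]}")

The same kind of tally works just as well on newsprint with tick marks; the point is simply that the matrix makes gaps and redundancies easy to see.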


WORKSHEET
OVERALL INFORMATION COLLECTION PLAN

Write evaluation questions here (see Product #2)

List information collection procedures here (use aids #8 and #9)

(In the cells place an "X" to show where a procedure addresses a question.)


AID #8
SOME KINDS OF INFORMATION COLLECTION PROCEDURES

Accretion, Erosion Analysis
  What it measures or records: Apparent wear or accumulation on physical objects
  Example: Learning center materials are inventoried before and after a workshop to determine usage or removal

Artifacts Analysis
  What it measures or records: Residues or other physical by-products are observed
  Example: Waste-basket contents are inventoried after workshop to see what material was thrown away

Case Studies
  What it measures or records: The experiences and characteristics of selected persons in a project
  Example: A few graduates from each degree program are visited at their jobs. Their colleagues are interviewed

Interviews, Group or Individual
  What it measures or records: Person's responses and views
  Example: Department chair interviews students about course adequacy

Panels, Hearings
  What it measures or records: Opinions, ideas
  Example: A panel of teachers reviews the needs assessment survey data to give interpretations

Records Analysis
  What it measures or records: Records, files, receipts
  Example: Resource Center receipts are analyzed to detect trends before and after inservice

Logs
  What it measures or records: Own behavior and reactions are recorded narratively
  Example: Practicum students maintain a log of activities

Simulations, "In Baskets"
  What it measures or records: Persons' behaviors in simulated settings
  Example: Students are video-taped introducing a simulated inservice session

Sociograms
  What it measures or records: Preferences for friends, work and social relationships
  Example: An IEP committee pictures their interdependence for conducting meetings

Systems Analysis
  What it measures or records: Components and subcomponents and their functional interdependencies are defined
  Example: An evaluator interviews staff about the program and depicts these perceptions in a systems analysis scheme

Advisory, Advocate Teams
  What it measures or records: The ideas and viewpoints of selected persons
  Example: Teams are convened to judge the merit of two competing inservice plans

Judicial Review
  What it measures or records: Evidence about activities is weighed and assessed
  Example: A "jury" reviews the data collected on a new practicum to decide if it should be repeated

Behavior Observation Checklist
  What it measures or records: Particular physical and verbal behaviors and actions
  Example: Record how frequently teachers use a new questioning technique

Interaction Analysis
  What it measures or records: Verbal behaviors and interactions
  Example: Observers code faculty classroom interactions

Inventory Checklist
  What it measures or records: Tangible objects are checked or counted
  Example: School bulletin boards are checked for inservice related materials

Judgmental Ratings
  What it measures or records: Respondent's ratings of quality, effort, etc.
  Example: Experts rate the adequacy of the college's curriculum

Knowledge Tests
  What it measures or records: Knowledge and cognitive skills
  Example: Faculty are tested on knowledge of special education laws

Opinion Survey
  What it measures or records: Opinions and attitudes
  Example: Superintendents are asked to rate their attitudes toward PL 94-142

Performance Tests and Analysis
  What it measures or records: Job-related and specific task behaviors
  Example: Principals are observed and rated on how they conduct an interview

Q-sorts, Delphi
  What it measures or records: Perceived priorities
  Example: Parents rank teacher inservice needs

Self-Ratings
  What it measures or records: Respondents rate their own knowledge or abilities
  Example: Students rate how well they can administer different diagnostic devices

Survey Questionnaire
  What it measures or records: Demographic characteristics, self-reported variables
  Example: Teachers report how frequently they use certain resource center materials

Time Series Analysis
  What it measures or records: Data on selected variables are compared at several time points
  Example: Frequencies of key practicum behaviors of students are charted over the course of a new semester-long seminar


AID #9
SOME SOURCES OF INFORMATION

When
Evaluation
Question Asks
About ...

Some Potential Indicators Are ...



Needs and
goals

characteristics of
job descriptions,
proposals, plans,
reports
policies, rules
demographic data
beliefs, values
normative data

current skill,
knowledge
levels, amount
of training
patronage patterns
expert opinions
criteria, laws,
guidelines
nature and
frequency of
problems

rates of use,
production,
incidences
kind of clients
served
consumer
preferences,
wants
perceived priorities

Training
strategies
and designs

characteristics of
plans, proposals,
user's guides,
instructor's
manuals
records of
resources
available,
programs in use
training literature

data about current


services
people's schedules,
positions, jobs,
rates
expert opinions
results of
feasibility
studies, pilot
tests, research

reports from
commissions,
task groups
demographic data
user/trainee
preferences,
convenience,
needs
recommendations
from task
groups, leaders

Implementation
of training

attendance rates
and patterns
usage of materials,
resources
perceptions of
observers

trainer behaviors
perceptions of
trainees
transactions
(verbal, other)
wear and tear on
materials

trainee behaviors
perceptions of
trainers
discard rates and
nature

Immediate
outcomes

materials produced
in training
trainer ratings
observer ratings

knowledge (i.e.,
test scores)
trainee ratings
self-report ratings

performance in
simulated tasks
pre/post changes
in test scores

On-job usage
of training
outcomes

nature and
frequency of
usage
peer opinions
records of use,
behavior

trainee perceptions
observed behavior
performance
ratings

supervisor opinions
quality of work
samples
test scores
transactions of
trainees with
others


Impact (worth) of training


changes in
policies, rules,
organization
perceptions of
clients
patterns of use
rates
cost/benefit
analyses

performance
ratings
performance
of clients
(e.g., test scores)
opinions of
experts, visitors,
observers

promotions records
perceptions of
clients, peers,
relatives
consumer opinions
quality of work
samples
treatment, sales
records


WORKSHEET
HOW EACH PROCEDURE WORKS

Directions: Fill in each column below for each procedure listed in worksheet A.

Procedure
Evaluation question addressed
Schedule (when, how, where)
Respondents
Sample (kind and size, see aid #10)
Instruments used


AID #10
SOME KINDS OF SAMPLING METHODS USEFUL IN EVALUATION OF TRAINING

Random Methods:

Straight random sampling: One selects, via random method (such as a random numbers table), a predetermined portion of a population. The proportion of the population sampled determines the level of precision of the generalization to the larger population. The larger the sample, the more precise the generalization.

Quota sampling: Samples are drawn within certain population categories and can be made in proportion to the relative size of the category. The quota sample ensures that the sample will include access to low-incidence subpopulations who would likely not be drawn in a straight random sample.

Stratified samples: Samples are drawn for each of several "strata," such as freshmen, sophomores, juniors or seniors; or workers, supervisors and managers. Stratified samples are useful when you have more, or a different, interest in one particular stratum than another. You identify strata of greater interest, then take larger samples from them. Each stratum is considered a population.

Matrix samples: This method samples both respondents from a defined population and items from an instrument. When the respondent pool is sufficiently large and there are many instrument items, it is more efficient to have each respondent respond to only a certain subset of the items. If item subsets are randomly generated and respondents randomly drawn, generalization is possible to the entire population and the entire instrument. Useful in broad-scale surveys or testing programs, but only where an individual's scores are not needed.

Purposive Methods:

Key informants: This method is employed to access those persons with the most information about particular conditions or situations. Union representatives or de-facto leaders among teachers could be a prime source of teacher attitudes and opinions; community leaders and other respected individuals could yield rich information on community issues, and so forth.

Expert judges: This method involves sampling those persons with exceptional expertise about certain conditions or factors of interest. When information about best practices or recent advances is sought, an hour interview with an expert can short-cut many hours of literature review and reading.

Extreme groups: This intentionally seeks out conflicting or extreme viewpoints. Whereas the random methods aim to account for bias and converge on the average, or typical case, the extreme group sample purposely ignores the middle ground or common viewpoint to learn about the atypical extreme view.

Grapevine sampling: This entails a growing sample, where each successive sample member is determined by clues or explicit directions from the prior members. One might ask a principal, for instance, to be directed to the most voluble (or negative or positive or reticent, etc.) teacher in the school. That person would be interviewed, then asked to recommend another person to the interviewer, and so forth, until the interviewer is satisfied that a sufficient sample has been obtained. Direction to the next case can be asked for explicitly or derived from each case's notes. The same method can be used in a survey questionnaire, much the same as a chain-letter operates.


TIPS
COLLECTING INFORMATION
You will notice a tension between having an efficient and practical plan and having
one which is thorough. That's common in evaluation. Strike a balance. Make
conscious decisions and record rationale. Discuss with client/audience. If good
compromise can't be made, consider not doing evaluation or refocusing it.
You may want to go back to review the evaluation questions at this point, e.g., revise,
rearrange, add to, or eliminate. Before you eliminate because you can't think of a
procedure to collect information to answer it, talk to others experienced in data
collection.
Here's a good place for consultant help!
If you've been working closely with audiences, it may be a good idea to go back now
and show them this plan. Or you may wait until you've done an Analysis and
Interpretation Plan and show them the two together, since they are closely
related.
Use existing data where it's available. It's cheap, and less likely to be biased by
collecting for evaluation purposes.
Be sure you've thought about information collection procedures that are already
available in your program-e.g., tests, annual surveys, etc., that could be useful for
this evaluation. And look at records that can be analyzed. Caution: don't include
those procedures if they aren't useful.
Because you may want to rearrange and revise the plan as you go, it may be easier to
use the format of the worksheet on a larger sheet of paper. Newsprint is a handy
size, particularly if you're working with a group.
Note that each procedure row of a worksheet defines the content of an instrument.
For more information about drafting and piloting instruments, see the

Sourcebook.


CHECKLIST
COLLECTING INFORMATION

Answer the following questions and revise the worksheet until you can answer "yes" to all of them.

                                                                        YES   NO
Are the procedures clearly specified?
Are the procedures specific enough to guide information collection?
Are the procedures general enough to avoid limiting the information collection when it ought to be flexible?
Is it likely that the information you collect will be used only as you intend?
Are the procedures legal and ethical, i.e., do not violate rights of privacy or laws or regulations?
Will the cost of any procedure be worth it, given the amount and kind of information it will provide (consider staff time, implementation costs, etc.)?
Will the plan avoid undue disruption to the program?
Can the procedures be carried out within the time constraints of the evaluation?
Will the information collected be reliable?
Do you have multiple information sources for questions that require them?
Have you eliminated any superfluous procedures?
Do you plan to make maximum use of existing data?
Will the procedures allow you to make valid interpretations of the information? (You may come back to this after you've completed an Analysis and Interpretation Plan.)


MOVING ON
Check to see whether you need to do any of the following things before
moving on to develop an Analysis and Interpretation Plan.
___ Check over the information you'll be collecting to see if you'll also be
able to answer some other important questions with the information.
___ Check back with your audience (or your advisory committee or a key
person or two) to reconsider evaluation questions. (You can put the
revised questions on a worksheet that appears in the Analysis and
Interpretation Plan section.)
___ Do a more careful description of the object if you have trouble
thinking about what kinds of information are available and most
appropriate.
_______ Do a rough cost estimate and reconsider purposes. (If you suspect
you need to do this, look in the Management Plan section for
ideas.)
___ Refer to other sources of information (e.g., a local expert or some of
the references below).
SOURCEBOOK REFERENCES

Collecting Information

Key issue 1: What kinds of information should you collect? Task: Determine the information sources you will use. (Sourcebook pages 77-83)
Key issue 2: What procedures should you use to collect needed information? Task: Decide how you'll collect information. (pages 84-88)
Key issue 3: How much information should you collect? Task: Decide whether you need to sample and, if so, how. (pages 89-94)
Key issue 4: Will you select or develop instruments? Task: Determine how precise your information must be and design a means to collect it. (pages 95-99)
Key issue 5: How do you establish reliable and valid instrumentation? Task: Establish procedures to maximize validity and reliability. (pages 100-107)
Key issue 6: How do you plan the information collection effort to get the most information at the lowest cost? Task: Plan the logistics for an economical information collection procedure. (pages 108-115)

PRODUCT 4

ANALYSIS AND INTERPRETATION PLAN

How you'll handle collected information
Aggregation, coding and storage methods
Analysis techniques
What criteria you'll use to make interpretations

[X] 1. Evaluation Preview
[X] 2. Outline of Evaluation Questions
[X] 3. Information Collection Plan
[X] 4. Analysis and Interpretation Plan
[ ] 5. Report Plan
[ ] 6. Management Plan
[ ] 7. Plan to Evaluate the Evaluation

PRODUCT OVERVIEW

1. The Analysis and Interpretation Plan

describes how information will be analyzed to address each evaluation


question.
specifies the criteria for judging results and describes methods for
obtaining judgments.
2. Analysis includes checking over the collected information and preparing
it for analysis.
3. Analyzing and interpreting well involves trying to minimize measurement biases in the data and capricious value biases in their interpretation.
4. Consideration of measurement problems and value biases can lead to refining evaluation questions and subquestions into progressively smaller and more readily interpretable subquestions.
TASK STEPS

Prepare
  Procedure: 1. Study the example of an Analysis and Interpretation Plan
  Objective: 1. Knowledge about formats and characteristics of Analysis and Interpretation Plans

Try
  Procedure: 2. Two worksheets are used for this product. You plan how to handle and prepare returned information on worksheet A. Then, using worksheet B, you plan analysis procedures and criteria for judgments relative to each evaluation question.
  Objective: 2. An Analysis and Interpretation Plan

Improve
  Procedure: 3. Use tips and checklist
  Objective: 3. An improved and clarified plan

Move On
  Procedure: 4. Reflect back on what you've done, check it out with others, and look ahead to next steps
  Objective: 4. Decisions about next steps


EXAMPLE
ANALYSIS AND INTERPRETATION PLAN (A):
PRELIMINARY HANDLING AND ANALYSIS STEPS (EXCERPT ONLY)

Participants complete registration form
  Verification & problem-checking: Checked after first day; if incomplete, will be returned to participant for completion
  Preliminary analysis, coding, etc.: Coded by project secretary; add workshop # code to each set; maintain running summary
  Storage and retrieval: Filed by workshop, alphabetically by participant

Self-scored multiple choice test
  Verification & problem-checking: Retain for post-workshop scoring only tests with participant's original (not changed or tampered with) responses. Discard tests with more than 20% of items left blank
  Preliminary analysis, coding, etc.: Compute item frequency distribution. Compute mean score per small group and per workshop. Maintain item analysis on cumulative basis re all workshops
  Storage and retrieval: File by workshop

"Key informants" logs
  Verification & problem-checking: Content analyze only logs with at least 200-word entries for each day
  Preliminary analysis, coding, etc.: Prepare log summaries
  Storage and retrieval: File by workshop
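The preliminary analysis named for the self-scored test amounts to a tally and a few averages, and it can be done by hand, on a spreadsheet, or with a few lines of code. The Python sketch below is only an illustration; the record layout and the responses are invented. Each record is (workshop, small group, list of item responses, total score).

from collections import Counter, defaultdict
from statistics import mean

records = [
    ("W1", "A", ["b", "c", "a", "d"], 3),
    ("W1", "A", ["b", "a", "a", "d"], 2),
    ("W1", "B", ["b", "c", "c", "d"], 4),
]

# Item frequency distribution: how often each response was chosen, item by item.
item_freq = defaultdict(Counter)
for _, _, answers, _ in records:
    for item_number, answer in enumerate(answers, start=1):
        item_freq[item_number][answer] += 1

# Mean score per small group and per workshop.
by_group, by_workshop = defaultdict(list), defaultdict(list)
for workshop, group, _, score in records:
    by_group[(workshop, group)].append(score)
    by_workshop[workshop].append(score)

print({item: dict(counts) for item, counts in item_freq.items()})
print({key: mean(scores) for key, scores in by_group.items()})
print({key: mean(scores) for key, scores in by_workshop.items()})

Keeping the cumulative item analysis across workshops is then just a matter of adding each new workshop's counts to the running totals.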


EXAMPLE
ANALYSIS AND INTERPRETATION PLAN (B):
ANALYSIS AND INTERPRETATION PROCEDURES (EXCERPT ONLY)

1. What are the characteristics of workshop participants?
   Information collection procedures: Registration form administered to participants at the beginning of the workshop
   Analysis procedure: Frequencies and percentages by position, certification status and number of years in the district; total number of participants
   Evaluation criteria: Characteristics of target audiences. At least 70% of teachers and administrators; at least 60% of the non-certified teachers; equal distribution across seniority levels
   Procedures for interpretation: Inservice director compares findings with criteria

2. Were participants satisfied?
   Information collection procedures: Key informant logs; participant questionnaires
   Analysis procedure: Content analysis of logs; frequency analysis of selected questionnaire items
   Evaluation criteria: No key informants should consider the entire workshop as unsatisfactory. Dissatisfaction should be limited to minor workshop elements. Mean satisfaction ratings should be greater than 3 on a 5-point scale
   Procedures for interpretation: Discussion of findings with key informants; review of satisfaction items by director and staff; comparison of mean rating to criterion

3. What problems arose during the delivery of the workshop?
   Information collection procedures: Participant questionnaires; interviews with key informants; staff checklist
   Analysis procedure: Content analysis of notes, logs and relevant items on questionnaires to identify cited problems, how they developed and their effects
   Evaluation criteria: "Problems" such as: unproductive diversion from schedule; inadequate facility (e.g., space, A-V equipment); materials not prepared; staff unable to answer questions or make appropriate referrals
   Procedures for interpretation: Inservice director reviews findings at staff meeting when plans for workshop revision are discussed
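Where a criterion is as explicit as "mean satisfaction ratings should be greater than 3 on a 5-point scale," the comparison itself is trivial to carry out and worth writing down so the audience can see exactly how the judgment was reached. The Python sketch below is only an illustration; the ratings and the key-informant judgments are invented.

from statistics import mean

satisfaction_ratings = [4, 3, 5, 2, 4, 4, 3, 5]  # selected items on a 5-point scale
whole_workshop_judged_unsatisfactory = [False, False]  # one entry per key informant

CRITERION_MEAN = 3.0
mean_rating = mean(satisfaction_ratings)

print(f"Mean satisfaction rating: {mean_rating:.2f} (criterion: greater than {CRITERION_MEAN})")
print("Mean-rating criterion met:", mean_rating > CRITERION_MEAN)
print("Any key informant judged the entire workshop unsatisfactory:",
      any(whole_workshop_judged_unsatisfactory))

The numbers answer only part of the question; the discussion of findings with key informants and staff remains the interpretation step.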


WHAT TO NOTICE ABOUT THIS EXAMPLE

1. The example shown is an excerpt from the whole plan. There isn't enough space to show you the entire plan. The excerpt is probably enough to give you the idea.
2. The product has two parts. The first shows how information from each collection procedure is verified, prepared and stored. The other example shows how information from several procedures is analyzed and interpreted to "answer" evaluation questions.
3. There are a variety of criteria that can be used to make judgments. They include professional standards, goals or objectives, laws, past performance, needs, or performance of a comparison group. Because of this diversity of possibilities, it's important to know what (and whose) criteria are being applied.
4. Several criteria are often used. That, of course, may make the evaluation more complicated, but having different value perspectives can also strengthen it.
5. A variety of procedures for making value interpretations are possible. Individuals or groups may be involved; the evaluator may make the judgments or audiences may reserve that right. Because this is a critical part of the evaluation, it's important to be clear about how the judgments will be made and by whom.
6. The analysis and interpretation steps may be distinct or may blend, depending on what kind of evaluation question is asked. When the question asks for descriptive information (e.g., question #1), the two steps are relatively separate. That is, one can answer the question by analyzing the findings without referring to values or making judgments about it. Then evaluative criteria are applied in order to interpret the findings and make them useful for program planning. Question #2 combines those two steps because the analysis involves not only manipulation of the collected (raw) information, but also some comparison of it to some criteria for judging its implications for the worth of the object.
7. The criteria may be specified in detail before information collection begins or may be defined generally at first, then in more detail when findings are known. The preferred practice is to specify as much as possible, as soon as possible. The criteria can become part of the agreement between the evaluator and the audiences and can help direct the evaluation in useful ways. However, it may not be possible to determine detailed criteria for some questions. For example, question #3 has some criteria listed on the table, but problems not anticipated there should also be identified if the answer to the question is to be useful for improving the workshop. Thus, the analysis procedure and the criteria need to be open enough to detect them.
8. To "answer" some evaluation questions, it is necessary to draw on the answers to other questions. For example, the answer to a general question about how well the workshop was run could depend on answers to questions about each of the workshop modules. When answering other questions, too, you may find that information you didn't intend to use (i.e., it's not in your collection plan) can and should be applied. Looking for and making necessary revisions to the plan is, of course, important.
9. Carrying out a good Analysis and Interpretation Plan requires attention to information handling, storage and retrieval. If those systems haven't been considered, the plan is likely to have problems.
10. Preparing data for analysis (verification, etc.) makes sure that information is complete enough to be worth the trouble of analysis.


WORKSHEET
PRELIMINARY HANDLING & ANALYSIS PLAN

List information collection procedures

Describe how problems will be identified, e.g., incomplete information and errors

Describe preliminary analysis

Describe how the information will be stored and retrieved; who will have access


WORKSHEET
ANALYSIS & INTERPRETATION PLAN

List evaluation questions

List information collection procedures for each question

Describe analysis procedure(s)

Describe procedures for making the judgments (see aid #11)


AID #11
SOME COMMONLY USED CRITERIA AND METHODS
FOR INTERPRETING INFORMATION ABOUT TRAINING PROGRAMS

Criteria

Expert judgments, opinions, viewpoints


Staff expectations, program established standards
Public (democratic) viewpoints (e.g., as determined from a consensus or voting
procedure)
Special interest viewpoints (e.g., a handicapped parents group)
Research studies and reports (e.g., linking interest to learning)
Institutional standards, guidelines (e.g., your agency or state department)
Commonly accepted practice
Regulations, laws (e.g., federal guidelines)
Accreditation or licensing standards (e.g., AMA, NCATE, CEC, ASHA)
Probabilities, rates of occurrence, predictable likelihood
Norms, expected scores, cutting scores
Methods of comparison, interpretation

Statistical tests of significance


Consensus methods (e.g., Delphi, nominal group technique, Q-sorts)
Voting, democratic procedures
Discussions, meetings, reviews
Hearings, panel reviews
Debates
"J ury" trial
Individual opinions
Expert judgments


TIPS
ANALYZING AND INTERPRETING INFORMATION
When you have trouble identifying analysis criteria or procedures for an evaluation question, try breaking the question into "smaller" (finer, more specific) questions. This will resolve ambiguity or conflicts (but may not be easy!).
It's not too late to review other parts of the design:
Evaluation questions:
do you need to subdivide some or add others based on the analysis you want to
do?
audiences: should audiences be added to provide balance in value bases or
criteria?
sources of information: do you need to add sources in order to answer the
evaluation questions?
Look at the matrix worksheet in the Information Collection Plan section to help you organize your thoughts about analysis.
Each column (evaluation question) should represent information you will use to address that evaluation question.
Preliminary analyses are done for each row (information collection procedure).
Seek persons with skills in information analysis and interpretation.
As you decide what criteria will be applied, realize there are many options. The point
is to have good rationale for selecting the basis on which judgments will be made
and to record those decisions.
This plan should contain your first crack at what will be done. Additional analysis
may later seem worthwhile to explore the information further.
Analysis procedures you choose do not have to involve complex statistics. In fact,
they rarely do in program evaluations. Choose procedures that make sense in your
situation. If you do use complex procedures, be sure you explain them in your
reports for readers who are not familiar with them.


CHECKLIST
ANALYZING AND INTERPRETING INFORMATION
Answer the following questions and revise the worksheet until you can answer "yes" to all of them.

                                                                        YES   NO
Will your preliminary handling checks reveal biases or other problems in the information (e.g., because of incomplete responses)?
Will they direct you to look at problems with how the information was actually collected?
Will they help with how information is prepared for storage (e.g., codings, transcriptions)?
Will your storage system allow you to maintain confidentiality and other safeguards?
Will the information stored be accessible only to those authorized to get it?
Have you taken precautions to prevent loss of information?
Are the interpretation criteria you'll use carefully described and clear?
Will the interpretation plans yield credible answers to your questions?
Are the evaluation criteria acceptable to the audiences?
Are your analysis plans likely to help you answer your evaluation questions?
Will your analysis and interpretation plans allow for identification of unanticipated outcomes?
Do your plans include opposing viewpoints and alternative criteria where appropriate?


MOVING ON
Check to see if you need to do any of the following before moving on to
develop a Report Plan.
___ Check over your worksheets to find the areas where you are
uncertain about how to analyze and interpret information you
collect.
Reconsider the evaluation questions associated with the problem areas. Break those questions and subquestions into sets of smaller questions. (You might want to review the "What to Notice" and "Example" in the section on Outline of Evaluation Questions to see how to do that.) With a little ingenuity (and consultation with audience members) you can probably arrive at a set of small questions that are relatively easy to answer. (It is possible to break any complex question into a set of questions that can be answered by "yes" or "no." Analysis, after all, means breaking things down into simpler components.)
___ Seek help from persons with greater (or different) skills in data analysis. Expert evaluators seek such help regularly, so please consider it all right to do.
___ Refer to other sources of information (e.g., an expert in values
clarification or research literature in your area).
SOURCEBOOK REFERENCES

FUNCTION: Analyzing and Interpreting (Evaluation)

1. Key issue: How will you handle returned data?
   Task: Aggregate and code data if necessary. (Pages 119-122)
2. Key issue: Are data worth analyzing?
   Task: Verify completeness and quality of raw data. (Pages 123-126)
3. Key issue: How will you analyze the information?
   Task: Select & run defensible analyses. (Pages 127-144)
4. Key issue: How will you interpret the results of analyses?
   Task: Interpret the data using pre-specified and alternative sets of criteria. (Pages 145-147)

",

"

"-

.........

.........

'\

"-

'\

'\

'\

"-

" '\
'\

"-

...........
.............
~

"'-

'"

--

PRODUCT 5

---

'\
REPORT PLAN

~
~
~
~
~

o
o

What you'll report


To whom you'll report
How you'll report
When you'll report
1. Evaluation Preview
2. Outline of Evaluation Question

3.
4.
5.
6.
7.

Information Collection Plan


Analysis & Interpretation Plan
Report Plan
Management Plan
Plan to Evaluate the Evaluation


PRODUCT OVERVIEW

1. A Report Plan describes how information will be shared with key

members of the evaluation audience.

2. A good reporting plan can contribute greatly to the value of the

evaluation by assuring that audience members get a chance to tell you


whether the information is timely, understandable, believable, and
useful.
3. Some of the most important reporting is built into the evaluation process
as audience members are consulted (and informed or educated) about the
purposes of evaluation, the questions to be answered, the information to
be collected, and the ways the information is to be analyzed and
interpreted.

TASK STEPS

Prepare
  Procedure: 1. Study the example
  Objective: 1. Knowledge about formats and strategies for reporting
Try
  Procedure: 2. Use the worksheet and aid #12 to decide who you'll report to, how, what you'll report, & when
  Objective: 2. A Report Plan
Improve
  Procedure: 3. Use tips and checklists
  Objective: 3. An improved or clarified plan
Move On
  Procedure: 4. Reflect back, check with others, look ahead
  Objective: 4. Decisions about next steps


EXAMPLE
REPORT PLAN
Audience: Project staff
1. Progress to date; next steps; problems needing attention
   Format: Memorandum and presentation. Date/frequency: Beginning of each month.
   Event: Presentation at staff meeting, with one-page written summary.

Audience: Department Head
1. Progress update
   Format: Meeting. Date/frequency: Bimonthly. Event: Meeting with Project Director.
2. Review of design, findings, interpretations, implications
   Format: Written report. Date/frequency: 60 days after project ends. Event: Final report.

Audience: Department Staff
1. Evaluation design
   Format: Meeting. Date/frequency: 30 days after project begins. Event: Department meeting.
2. Periodic informal updates on progress
   Format: Informal discussions; oral. Date/frequency: Unscheduled, concurrent with information collection (see Management Plan), and quarterly. Event: Interviews; department meeting.
3. Review of findings and implications
   Format: Presentation with executive summary. Date/frequency: 60 days after project ends. Event: Faculty meeting.

Audience: Advisory Committee
1. Evaluation design, progress
   Format: Memorandum. Date/frequency: Quarterly. Event: Presentation at committee meetings; one-page summary.
2. Review of design, findings, interpretations, implications
   Format: Written report. Date/frequency: 60 days after project ends. Event: Final report.

Audience: Funding Agent
1. Progress, resources used, problems, next steps, implications
   Format: 5-page written report. Date/frequency: Quarterly. Event: Report is mailed; follow-up phone call.
2. Same as other final report, with more detailed design and findings included
   Format: Written report with instruments, etc. Date/frequency: 90 days after project ends. Event: Final report with technical material.


WHAT TO NOTICE ABOUT THE EXAMPLE


1. Evaluation reports are defined here as any instance of communication with audiences about the evaluation plans, progress, or findings. This broadens the scope of reporting activities well beyond the preparation of a single end-of-the-evaluation report.
2. The communication may be written or oral. A report can be as simple and informal as a five-minute presentation at a meeting.
3. A report is made each time there is formal contact with an audience. Thus, in some cases those reports are concurrent with information collection efforts or other activities not focusing on the reporting function.
4. Brevity without losing important information is the general rule. That means that in some instances a one-page summary of the report is used, even though you know there's much more complete information in another document available to those who are interested in it. This strategy tailors report length to individual readers.
5. There are a variety of audiences. Successful evaluation depends on a broad base of informed support. Reporting to a broad array of audiences, just to inform them, can maintain support and commitment to the evaluation effort.
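
Because a report plan is just structured information (audience, content, format, date, event), it can also be kept as a small data file and queried. The Python sketch below is illustrative only: the entries are adapted loosely from the example above, and the dates and helper function are invented.

# A sketch of a report plan kept as data, with a helper to list what is due soon.
from datetime import date, timedelta

report_plan = [
    {"audience": "Project staff",   "content": "Progress, problems", "format": "memo + presentation",   "due": date(1983, 3, 1)},
    {"audience": "Department Head", "content": "Progress update",    "format": "meeting",               "due": date(1983, 3, 15)},
    {"audience": "Funding Agent",   "content": "Quarterly progress", "format": "5-page written report", "due": date(1983, 4, 1)},
]

def reports_due(plan, today, within_days=30):
    """Return plan entries due within the next `within_days` days."""
    horizon = today + timedelta(days=within_days)
    return [r for r in plan if today <= r["due"] <= horizon]

for r in reports_due(report_plan, today=date(1983, 3, 1)):
    print(f'{r["due"]}: {r["format"]} to {r["audience"]} ({r["content"]})')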


WORKSHEET
PLANNING REPORTS

List evaluation audiences

Describe content of reports (use aid #12)

Identify format of report to be used (use aid #12)

List date or frequency (use aid #12)

Identify event associated with report

AID #12
EXAMPLES OF EVALUATION REPORT CONTENT,
SCHEDULE BASES AND FORMATS

Content Options
Evaluation design, intents
Work accomplished
Problems encountered
Revisions to plans
Budget
Resources used
Future activities
Findings
Instruments used
Information collection procedures
Interpretations
Recommendations
Schedule Bases
Calendar periods (e.g., quarterly)
Key decision and other audience
events (e.g., a board meeting)
Stages of evaluation progress
(e.g., design, completion of analysis)
Key program events (e.g., after each
workshop)
Opportunistically (e.g., an invited
speech)

Formats
Written documents
technical reports
interim reports
conference proceedings
memoranda
letters
professional journals
publications
Media releases
press
tv
radio
Meetings
small group discussions
presentations
luncheons
hearing
panel reviews
Leaflets
newsletters
pamphlets
Audio-visual
films
filmstrips
tapes
overhead transparencies


TIPS
PLANNING REPORTS
Before you simply copy the list of audiences from previous worksheets, consider
whether there are others that should be added or some that should be
eliminated.
Although there may be many informal communications about the evaluation, it will
be worthwhile to plan those that are particularly important.
Although you may be reluctant to make reports to some audiences, remember that the consequences of failing to report can be more damaging: your critics may draw unwarranted conclusions.
Be sure your Management Plan describes who will write reports.
The Analysis and Interpretation Plan should specify who has access to information.
(Be sure your report writers do.)
All audiences with a right to know about the evaluation should be listed.
When planning the content of the report aim for
balance
clarity
objectivity


CHECKLIST
PLANNING REPORTS
YES   NO

Will all the audiences receive some kind of communication about the evaluation?
If the plan is put into operation, will the audiences receive the information they want and should have?
Is the format appropriate for each audience, considering interest and involvement?
Will the reports in the plan be brief enough to keep the audience's interest and long enough to cover the necessary information?
Do the reports fit the natural schedule and progression of the evaluation and/or the evaluation object?
Will reports be made at times convenient for audiences?
Will reports be timely (e.g., meet deadlines for decision making)?


MOVING ON
Check to see if you need to do any of the following before moving on to develop a Management Plan.
___ Look back at the worksheet on audiences and their interests. See
whether the data you'll collect, the criteria you'll use to interpret it,
and the way you'll report it will be responsive to those interests.

___ If there are conflicts of interests or values that you don't know how
to resolve yet, consider adding a debate (by audience members) to
your Report Plan so the conflicting groups can fight it out among
themselves rather than attack your work.
___ Refer to other sources of information (e.g., an expert on communications or report writing).
SOURCEBOOK REFERENCES

FUNCTION: Reporting

1. Key issue: Who should get an evaluation report?
   Task: Identify who you will report to. (Pages 151-153)
2. Key issue: What content should be included in a report?
   Task: Outline the content to be included. (Pages 154-158)
3. Key issue: How will reports be delivered?
   Task: Decide whether reports will be written, oral, etc. (Pages 159-164)
4. Key issue: What is the appropriate style and structure for the report?
   Task: Select a format for the report. (Pages 165-167)
5. Key issue: How can you help audiences interpret and use reports?
   Task: Plan post-report discussions, consultation, follow-up activities. (Pages 168-169)
6. Key issue: When should reports be scheduled?
   Task: Map out the report schedule. (Pages 170-173)

PRODUCT 6
MANAGEMENT PLAN

How you'll schedule evaluation tasks
Who will do what
When things get done
Budget: what it will cost

1. Evaluation Preview
2. Outline of Evaluation Questions
3. Information Collection Plan
4. Analysis and Interpretation Plan
5. Report Plan
6. Management Plan
7. Plan to Evaluate the Evaluation

PRODUCT OVERVIEW

1. A Management Plan helps in keeping track of all the details involved in evaluating and in achieving a reasonably smooth and efficient flow of activities.
2. A Management Plan can be a simple personal or project calendar or a complex set of documents describing budgets, time lines, work plans, etc.
TASK STEPS

Prepare
  Procedure: 1. Study the examples
  Objective: 1. Knowledge about formats
Try
  Procedure: 2. The Management Plan has two parts: a task chart and a budget. You use three (3) worksheets to produce them. First, you use worksheet A to plot out key tasks, noting who does them, resources needed, and when they occur. When you've tinkered with this to your satisfaction, you complete a task chart (worksheet B) and a budget (worksheet C).
  Objective: 2. A Management Plan
Improve
  Procedure: 3. Use tips and checklists
  Objective: 3. An improved and expanded plan
Move On
  Procedure: 4. Reflect back, involve others, and look ahead
  Objective: 4. Decisions about next steps


EXAMPLE
A MANAGEMENT PLAN IN TWO PARTS

I. EVALUATION WORKPLAN

A. Delineation of Information
1. Site visits
2. Gather information
3. Interact with evaluation members
4. Interview legislative personnel
5. Write needs assessment instrument

B. Needs Assessment
1. Select sample
2. Administer needs assessment instrument
3. Analyze data
4. Write priorities report
5. Distribute findings
6. Obtain comments
7. Write final report

C. Summative Evaluation
1. Select sample
2. Develop summative instrument
3. Administer summative instrument
4. Analyze data
5. Conduct site visits
6. Analyze site visit data
7. Write summative report

D. Project Administration
1. Negotiate contract with Evaluation Committee
2. Recruit/train evaluation Associates & Assistants
3. Organize communication procedures
4. Revise evaluation design as needed
5. Supervise operations
6. Synthesize findings and prepare interim and final reports
7. Communicate findings to key audiences

Key to Personnel Involved: 1. Project Director, 2. Research Associates, 3. Research Assistants

(In the original chart, each task also carries the code of the person(s) responsible and X's marking when it occurs on a timeline.)


II. PROJECT BUDGET

Personnel
  A. Project Director (1) (45% of $22,000/yr x 1/2 year)                        $ 4,950
  B. Research Associates (2) (2 x 15 days x $100/day)                             3,000
  C. Research Assistants (3) (3 x 15 days x $50/day)                              2,250
  D. Secretary (25% of $8,000/yr x 1/2 year)                                      1,000
  Personnel Subtotal                                                            $11,200

Fringe (22.1% of A, B, D)                                                          1,977

Travel and Lodging
  A. 3 site visits @ 200 miles roundtrip @ $.17/mile                                 102
  B. Lunch for 6 persons during initial site visits @ $3.50 per person                21
  C. Travel to Springfield
     2 trips to the Eval. Bd. (160 @ .17 x 2)                                         54
     1 trip for interviews (160 @ .17)                                                27
     1 trip to Huntsville (400 @ .17)                                                 68
  D. Per diem for 2 persons for interviews in Lansing ($50/day) for two days         200
  E. Site Visits
     1. Air fare (2 trips x $120 x 2)                                                480
     2. Car travel (17 trips @ 400 mile ave. x .17/mile)                           1,156
     3. Per diem (22 trips @ $15/day x 1 day x 2)                                    660
        (11 trips @ $35/day hotel x 2)                                               770
        (11 trips @ $7.50/meal x 2)                                                  165
     4. Ground transportation (2 trips x 15)                                          30
  Travel and Lodging Subtotal                                                   $ 3,733

Materials and Supplies
  A. Office supplies ($10/month x 6 months)                                           60
  B. Telephone ($100/month x 6 months)                                               600
  C. Postage
     (3 mailings and 2 return mailings @ $.30 x 400/mailing)                         600
     (office mailing $5/month x 6 months)                                             30
  D. Copying
     ($100/month x 6 months)                                                         600
     (instruments: 400 x .25 x 2)                                                    200
  Materials and Supplies Subtotal                                               $ 2,090

Total Direct Costs                                                              $19,000
Indirect Costs                                                                      -0-
TOTAL                                                                           $19,000

Indirect costs of $6,771 are considered as cost sharing by the University on this project.
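
A quick way to check a budget like this is to tally each category and compare the subtotals and grand total with the figures shown. The Python sketch below does that with the amounts from the example above; the grouping of line items is only for illustration.

# Tallying the example budget to check the subtotals and the grand total
# (amounts copied from the example; the item names are just labels).
budget = {
    "Personnel": {"Project Director": 4950, "Research Associates": 3000,
                  "Research Assistants": 2250, "Secretary": 1000},
    "Fringe": {"Fringe (22.1% of A, B, D)": 1977},
    "Travel and Lodging": {"Initial site visits": 102 + 21,
                           "Springfield trips": 54 + 27 + 68,
                           "Lansing per diem": 200,
                           "Site visits": 480 + 1156 + 660 + 770 + 165 + 30},
    "Materials and Supplies": {"Office supplies": 60, "Telephone": 600,
                               "Postage": 600 + 30, "Copying": 600 + 200},
}

grand_total = 0
for category, items in budget.items():
    subtotal = sum(items.values())
    grand_total += subtotal
    print(f"{category:<24} ${subtotal:,}")
print(f"{'Total Direct Costs':<24} ${grand_total:,}")   # $19,000, matching the example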


WHAT TO NOTICE ABOUT THE EXAMPLE


1. The Management Plan has two parts: (1) a breakdown of tasks, timelines and
2.

3.

4.

5.
6.

responsibilities, and (2) a budget showing resources needed for evaluation tasks.
The Plan shows the major activities ofthe evaluation and when they will occur. This
makes it easy to see the sequence of events and what has to be done in
preparation for certain activities. An alternative is to categorize by major
types-such as reporting or staff training. This may be preferable in
evaluations when those categories are performed by distinct units or
when a budget has those same categories and you want to be able to use
the documents together.
The level of detail regarding activities varies. For some activities there are subactivities which serve as a reminder of what needs to be done and as a way
of distinguishing responsibilities of individuals who are involved in the
same general activity. Often the management plan has only a general list
of far future activities, while the current or near future ones are specified
in more detail.
The budget includes basic categories of expenditures: personnel, travel, materials
and supplies, and indirect costs. Depending on the size of the program,
you may want to have separate categories for consultants, materials and
supplies. Another alternative is to organize the budget by major program
activities or components and show the breakdown by those same general
categories in each, followed by a summary across components at the
end.
The example shown is not the only way to record a management plan. Pert charts,
Gantt charts and other devices work, too. But any plan needs to have
details about tasks, timelines, responsibilities and resources.
Management plans should be roughly as complex as the evaluation effort. The
example portrays a relatively large and complex evaluation effortprobably more costly than many. A simpler evaluation would be cheaper
and more simple (but it wouldn't let us show off the forms as we11!).


WORKSHEET
TASK ANALYSIS

Column headings:
  List general tasks and sub-tasks (see aid #13)
  Task begin/end date
  Name the person(s) who will do each sub-task
  Estimate personnel cost
  List other resources needed
  Estimate cost for other resources

Task:
Task:
Task:

AID #13
SOME TYPICAL EVALUATION TASKS & SUB-TASKS;
AND SOME USUAL COST ITEMS
TASKS & SUB-TASKS

Focusing/Designing
site visits
meetings with clients
drafting designs
review meetings, panels
contract negotiation
legal review
planning meetings
review literature

Reporting
plan reports
prepare media
write reports
conduct meetings
schedule presentations
draw charts, figures, etc.
proofread
conduct trial sessions

Information Collection
inventory available information
determine sample sizes, parameters
draw/create samples
select instruments
pilot test instruments, procedures
train observers, raters, analyzers, etc.
conduct interviews, observations,
tests, etc.
distribute forms, instruments, (mail,
etc.)
conduct site visits (travel, etc.)

Management
recruit, train staff
negotiate contract
supervise
conduct meetings
public relations tasks
bookkeeping

Analysis & Interpretation


train coders
verify, sort, "clean" data
code, aggregate information
preliminary analysis (compute
frequencies, etc.)
conduct analysis
develop/ select analysis methods
conduct meetings, reviews, etc. for
interpretation
design or select computer programs

Evaluate the Evaluation


review/select standards
select/hire consultants
conduct analysis
review documents
conduct visits

COST ITEMS

personnel services
typing
copying
space
equipment
computer time

coding, verification
access fees
rental
refreshments
AV rental
graphics


COST ITEMS (continued)


travel
lodging
consultant fees
supplies
telephone

programming
computer cards
postage
airfare
mileage

WORKSHEET
REVISED TASK CHART

List tasks for the evaluation

Identify person(s) responsible

Specify when the task will be done


WORKSHEET
BUDGET
Directions: Complete the following budget, using the categories of expenditures.
Personnel

Subtotal
Fringe (institutional percentage of cost)
Travel

Materials & Supplies

Communications

Total direct costs

Total indirect costs

TOTAL


TIPS
PLANNING THE MANAGEMENT PLAN
Don't shy away from a Management Plan because you think your evaluation is too
simple for one. A Management Plan can be as simple as a set of dates in your
calendar. It's important to have some timeline for the evaluation and to know who
will do what and what resources are needed.
Like the other products, this one will need to evolve. As tasks approach, you can be
more specific about how they will be done and by whom-and what their sub-tasks
are.
To write your time line, try starting with the last task(s). When does it have to be done (e.g., when is the final report due; when does a decision based on this evaluation have to be made)? Then plot the tasks backward from there to the beginning of the time line.
After drafting the Management Plan, review the entire evaluation design:
is it worth the cost?
must you modify the evaluation design to make it consistent with the available
funds?
is it necessary to terminate the evaluation because the scaled down version will be
useless?
Although this plan may have some of the same elements as an evaluation contract (budget, responsibilities of personnel, tasks), it should not be substituted for one.
When writing general tasks, consider the following categories:
products needed
evaluation questions to be answered
audiences to be served
objectives of the evaluation
phases of the program
stages of the evaluation: designing, collecting, and analyzing information, and
reporting
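
The backward-planning tip above can be carried out on paper or, just as easily, with a few lines of code. The Python sketch below is a minimal illustration: the due date and task durations are invented, and it simply works backward from the final deadline to show when each task must start.

# A sketch of "plot backward from the last task": given a final due date and
# rough durations (both invented), compute when each task must start.
from datetime import date, timedelta

final_due = date(1983, 8, 31)               # e.g., when the final report is due
tasks = [                                   # listed from last task to first
    ("Write final report",            14),  # duration in days
    ("Interpret and review findings", 10),
    ("Analyze information",           15),
    ("Collect information",           30),
    ("Develop instruments",           20),
]

deadline = final_due
for name, days in tasks:
    start = deadline - timedelta(days=days)
    print(f"{name:<30} start by {start}  finish by {deadline}")
    deadline = start                        # the previous task must finish by this date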


CHECKLIST
PLANNING THE MANAGEMENT PLAN
Answer the following questions and revise the worksheet until you can answer
"yes" to all of them.
YES   NO

Have you listed all the tasks that are preconditions of other tasks?
Is the level of detail of tasks sufficient to distinguish among important activities?
If all those tasks are carried out, will you have completed the evaluation?
Have you included tasks related to the major functions of evaluation: design, information collection, analysis, reporting, management and evaluating the evaluation?
Can you survive errors by people in other departments whose work you don't control (i.e., do they have the time and skills)?
Can the tasks be accomplished in the times allotted?
Will you be able to handle multiple tasks when they occur simultaneously?
Will resources be sufficient to do a quality job?
Is the resource list realistic (not inflated)?
Are the budget totals within the limits you've been given?
Have you considered the potential problem spots, and have you accounted for them? Have you thought about what will happen if a task isn't done on time?
Do the personnel have the necessary skills?
Will the evaluators be viewed as credible?
Have conflict of interest biases been identified and minimized?
If this plan is implemented, will the evaluation purposes be met?
Will audiences get needed information on time?
Does the plan correspond to the planned-for scope of the evaluation? Are you in over your head?
Do you have a procedure for keeping records of expenditures and revenues and of personnel responsibilities and time use?


MOVING ON
Check to see if you need to do any of the following before moving on to
looking into a Plan for Evaluating the Evaluation.
___ Renegotiate purposes. Ask key audience members, "Is this really
worth the effort? Now that we see in detail what's involved, do we
want to go ahead with it?"
___ Consider dropping (but check with audiences) some of the evaluation
purposes and questions if things are looking too costly. Maybe you
can do something less, but still plenty worthwhile.
___ Write up a clear (or revised) preview and share it with others.
______ Check around again to see if any of the information you need is
already available somewhere.
______ Prepare more management products (e.g., Pert chart, Gantt chart,
timeline).
____ Get a good administrator to review your Management Plan.
___ Refer to some of the references below.
SOURCEBOOK REFERENCES

FUNCTION: Management

1. Key issue: Who should run the evaluation?
   Task: Select, hire, and/or train the evaluator. (Pages 175-180)
2. Key issue: How should evaluation responsibilities be formalized?
   Task: Draw up a contract or letter of agreement. (Pages 181-186)
3. Key issue: How much should the evaluation cost?
   Task: Draft the budget. (Pages 187-190)
4. Key issue: How should evaluation tasks be organized and scheduled?
   Task: Draft a time/task strategy. (Pages 191-196)
5. Key issue: What kind of problems can be expected?
   Task: Monitor the evaluation and anticipate problems. (Pages 197-200)

PRODUCT 7
PLAN TO EVALUATE THE EVALUATION

How you'll check your Evaluation Design
How you'll know if things are going OK
How you'll assess and learn from the whole effort

1. Evaluation Preview
2. Outline of Evaluation Questions
3. Information Collection Plan
4. Analysis & Interpretation Plan
5. Report Plan
6. Management Plan
7. Plan to Evaluate the Evaluation


PRODUCT OVERVIEW

1. An evaluation can be evaluated-just as any other object.


2. The steps in designing an evaluation of an evaluation (sometimes called
meta-evaluation) are like those for evaluating other objects.
3. The process of evaluating evaluations is aided by the fact that the Joint
Committee on Standards for Educational Evaluation has developed a set
of criteria or standards that can be used as a basis for judgment. These are

The Standards for Evaluations of Educational Programs, Projects and Materials

(McGraw-Hill, 1981).
4. An evaluation design can be evaluated before it is implemented, while it
is being implemented, or after the evaluation is completed.
TASK STEPS

Prepare
  Procedure: 1. Study the example of an evaluation of an evaluation
  Objective: 1. Confirmation that the format and process are similar to those used in evaluating other objects
Try
  Procedure: 2. Think through why and how you might want to evaluate your evaluation. Use the worksheet to plan meta-evaluation activities to use before, during, and after your evaluation
  Objective: 2. A Plan for Evaluating the Evaluation
Improve
  Procedure: 3. Use tips and checklists
  Objective: 3. An improved or clarified plan
Move On
  Procedure: 4. Reflect, involve others, and look ahead
  Objective: 4. A decision about next steps


EXAMPLE
A PLAN FOR EVALUATING AN EVALUATION
From: Evaluator
To:
Director
Here, as I understand it, is what we agreed on for our "meta-evaluation".
A. Evaluation of the Evaluation Design
1. Purpose: To revise design as needed.
2. Methods: Dr. Schwartz from the University will prepare a written critique,
then present this in a review session with the project evaluator and
workshop staff.
3. Criteria: Dr. Schwartz will base her review on the Joint Committee
Standards. The evaluation design must also be judged to be
absolutely no more costly than necessary to meet its purposes as
specified by the Superintendent.
B. Evaluating During the Evaluation
1. Purpose: To ensure adherence to proper and ethical standards for information collection.
2. Methods: The two consultant visits planned have been dropped due to cost
problems. The evaluator will attend one of the workshops to
monitor evaluation activities. Also, we will devote at least part of
two staff meetings to discuss how the evaluation is going.
3. Criteria: Good information collection practices; efficiency (see Standards).

C. Evaluation of Our Work When It's Complete


1. Purpose: To "certify" our conclusions and to learn how we might proceed
more effectively when we evaluate in the future.
2. Methods: Dr. Bobb from Kermit Associates will prepare a meta-evaluation
report based on the Standards. He will receive a copy of our final
report as soon as it is in final draft form. This will be discussed at an
open meeting in July.
3. Criteria: Joint Committee Standards.

Standards for Evaluations of Educational Programs, Projects, and Materials, McGraw-Hill, 1981.


WHAT TO NOTICE ABOUT THE EXAMPLE

1. The meta-evaluation is planned in three parts: before, during and after the evaluation.
2. Methods used can be quite formal or informal. In general, where more external credibility is needed, you would want to incorporate a more complete design (i.e., preview, evaluation questions, etc.) and have formal reporting, like a written report.
3. An outside evaluator is often hired to do this kind of evaluation. Although an insider (who worked on the evaluation that is the object) could do it, questions about that person's credibility are likely to be raised. This certainly doesn't excuse the insider from assessing the evaluation at stages along the way, but it does mean that having another perspective is very useful, particularly when that assessment is done on the total evaluation.
4. One purpose for meta-evaluation is to learn from experience. Doing an evaluation of the evaluation helps audiences and evaluators learn more about evaluation.
5. Other products could be developed to provide more detail about the design. The same steps for those products could be followed, using the worksheets, aids, tips, and checklists in this manual.
6. The evaluation design is phased to match major "checkpoints." The two checkpoints in this example are the completion of the design and the end of the evaluation. Other evaluations may have additional checkpoints, for example, when a particularly important information collection procedure has been implemented.


WORKSHEET
PLANNING THE EVALUATION OF THE EVALUATION

Directions: Use aids #14 and #15 to complete the worksheet.

Elements
of Plan

Evaluate your
evaluation design

Evaluate processes
(during) your
evaluation

Evaluate the
results/ products
of your evaluation

Purpose

Method

Persons &
resources
needed
-----------------------------------------

Relevant
standards
& criteria


AID #14
PARTS OF THE EVALUATION THAT CAN BE EVALUATED

Below is a grid which describes some of the tasks that a meta-evaluator might be asked to do at any point in an evaluation. The important point is that it is never too soon or too late to question the soundness and the worth of your evaluation. For each evaluation function, the grid shows the focus of meta-evaluation at three points: evaluating evaluation designs, evaluating evaluation in progress, and evaluating evaluation after its completion.

Focusing Evaluation
  Designs: to assess and help refine the evaluation purpose and questions, investigate setting and identify audiences
  In progress: to determine whether selected questions and purposes are being pursued; to evaluate how worthwhile they are
  After completion: to evaluate the soundness and worth of the evaluation purpose and the questions addressed

Designing Evaluation
  Designs: to evaluate and refine design strategies or to provide information about options and aid in designing
  In progress: to evaluate the effectiveness of the design being implemented; to help monitor or revise if necessary
  After completion: to determine whether the evaluation design was sound, implemented properly, and useful for audience(s)

Collecting Information
  Designs: to evaluate or help design or select instruments and collection strategy
  In progress: to observe and evaluate the collection of information
  After completion: to assess the quality and relevance of information collected and methods used to collect it

Analyzing Information
  Designs: to guide the primary evaluator in selecting possible analysis strategies and consider who will interpret and how
  In progress: to evaluate the analysis process and how effectively data are being aggregated, sorted and analyzed
  After completion: to evaluate the adequacy and the accuracy of analyses and the interpretations of analyses

Reporting Information
  Designs: to evaluate report strategy and suggest format, audiences to consider, and report contents
  In progress: to read and evaluate report drafts, discuss alternative reports, refine technical or lay reports
  After completion: to evaluate the evaluation reports, their balance, timeliness, adequacy and ensuing use

Managing Evaluation
  Designs: to evaluate and refine the management plan, budget, and contract
  In progress: to evaluate how adequately the management plan is being monitored and the appropriateness of the contract and budget
  After completion: to evaluate how well the evaluation was managed and budgeted; to determine whether costs were reasonable and agreements upheld

AID #15
SOME METHODS FOR META-EVALUATION
To evaluate evaluation designs
use of checklists, aids
formal written critiques
consultant expert reviews
staff reviews
field tests, pilot studies
conferences, hearings, panels
presentation at professional meetings
comparison to other evaluations
To evaluate evaluations in progress
follow-up data collection
verification studies
re-analysis
monitor procedures (e.g., by observer)
spot checks of data analyses, information collection procedures, etc.
interviews with respondents
staff reviews
instrument pilot studies
document reviews
To evaluate evaluation results, uses and products
use of checklists, aids
formal written critiques
consultant expert reviews
staff reviews
field tests, pilot studies
conferences, hearings, panels
publication in journals with critiques
presentation at professional meetings


TIPS
PLANNING THE EVALUATION OF THE EVALUATION

The scope of your meta-evaluation plan depends on the scope of your primary evaluation. If it's small and just beginning, you may do very simple and limited meta-evaluation; but if your evaluation is very large and expensive, more meta-evaluation may be called for.
Note that one way of evaluating your evaluation design is to use the checklist with each product or the summary design checklist on pages 8-10.
A number of criteria can be used to judge an evaluation:
intended goals or purposes
mandates
client's or audiences' expectations
adherence to the design
criteria established by experienced evaluators
(See especially Joint Committee, Standards for Evaluations of Programs, Projects and Materials. McGraw-Hill, 1981.)
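
However you choose your criteria, it helps to record a judgment for each one and summarize where the evaluation falls short. The Python sketch below is a minimal illustration of such a tally; the four entries are invented stand-ins, not the Joint Committee's standards.

# A sketch of "determine compliance": record a yes/no judgment for each
# criterion and summarize (entries here are illustrative only).
judgments = {
    "Purposes and questions are worthwhile": True,
    "Design was followed as planned":        True,
    "Information is accurate and credible":  False,
    "Reports were timely and balanced":      True,
}

met = [c for c, ok in judgments.items() if ok]
unmet = [c for c, ok in judgments.items() if not ok]

print(f"{len(met)} of {len(judgments)} criteria judged met")
for c in unmet:
    print(f"  needs attention: {c}")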


MOVING ON
Check to see if you need to do any of these things before implementing your
evaluation design.

___ Evaluate the design by taking the evaluation questions you have
about it to an expert evaluator.
___ Take the questions you have about your design to key members of
your audience and see what they have to say.
___ Perform the role of expert evaluator yourself by studying the Joint
Committee Standards and applying them to your design.
SOURCEBOOK REFERENCES

FUNCTION: Meta Evaluation

1. Key issue: What are some good uses of meta-evaluation?
   Task: Determine whether you need to meta-evaluate; if so, when. (Pages 205-207)
2. Key issue: Who should do the meta-evaluation?
   Task: Select a meta-evaluator. (Pages 208-209)
3. Key issue: What criteria or standards should you use to evaluate the evaluation?
   Task: Select or negotiate standards. (Pages 210-217)
4. Key issue: How do you apply a set of meta-evaluation criteria?
   Task: Rank order standards, determine compliance. (Pages 218-220)

APPENDIX A
SELECTING WHAT
(AN OBJECT) TO EVALUATE
PRODUCT OVERVIEW

1. An object of evaluation can be anything you want to evaluate - course, lesson, program, procedure, resource utilization, results, etc.

2. The choice of an evaluation project is influenced by organizational

constraints, by the personal or professional confidence and competence


of the evaluator, and by intelligent guesses about the inherent difficulty
of the work.

TASK STEPS

Prepare
  Procedure: 1. Study some examples of possible evaluation topics/objects
  Objective: 1. Knowledge about and a sampler of potential objects or "things to evaluate"
Try
  Procedure: 2. Use an aid and a worksheet
  Objective: 2. A "first try" selection of an object for your evaluation project
Improve
  Procedure: 3. Use some tips and a checklist
  Objective: 3. A confirmation or rethinking of your choice
Move On
  Procedure: 4. Turn to Product 1: Evaluation Preview
  Objective: 4. Readiness to use the workbook as an aid in doing a worthy evaluation project


EXAMPLES
POSSIBLE EVALUATION OBJECTS

I. "Things" or Products or Outputs or Outcomes
1. A curriculum
2. A program
3. A school
4. A course
5. A unit or module
6. A lesson plan
7. Pre-post learning gains
8. A design for a workshop
9. An intensive workshop
10. A project
11. A teacher's tenure credentials
12. An administrator's accomplishments
13. A student's school records
14. Graduates of a program
II. Processes or Procedures
1. Student advising procedures
2. A teacher's interactive teaching
3. Procedures for selecting inservice topics
4. A principal's style of running meetings
5. Regular education students' interactions with special education
students
6. Course evaluation procedures
7. System-wide achievement testing procedures
8. Parent-teacher interactions during planning conferences
9. Procedures for gathering personnel planning data
10. Procedures for monitoring regulatory compliance
III. Resources or Tools or Inputs
1. Entering skills of students
2. Instructional media services to teachers
3. Physical facilities and equipment
4. A job applicant's credentials
5. Released time for planning, inservice training, etc.
6. Community support for curricular innovation
7. Health problems of teachers
8. Level of state reimbursement to a local district
9. Skill levels of teachers with respect to multiply handicapped
students
10. Classroom supplies


EXAMPLES
TWO OBJECTS IN MORE DETAIL
Example #1 - Object Name: procedures for helping students set short-term goals
Comments:
What it is-techniques used with 8-12 year-old children having learning
disabilities
How it works-a peer tutor who has mastered the technique helps them
What it's supposed to do-facilitate independence and make individualization
more feasible
Why you might want to evaluate it-some people object to it as "the blind leading
the blind." Others believe it builds self-esteem and should be used more widely.
Example #2 - Object Name: Values Clarification Inservice
Comments:
What it is-a "one shot" three-hour workshop
How it works-a university professor comes in on inservice day and runs it
What it's supposed to do-clarify teachers' values and show them how to run
values clarification exercises in their classes
Why you might want to evaluate it-It has been run four times and people seem to
like it, but we don't know much about how effective it is


WHAT TO NOTICE ABOUT THE EXAMPLES


1. The objects of evaluation are inputs, processes, and outputs. By object of evaluation we mean anything you could conceive of evaluating: resources, procedures, and outcomes, or tools, processes, and products.
2. An object of evaluation doesn't look a particular way; it doesn't have to
be something you can point to. However, you do have to be able to
identify it in some way, e.g., by its effects, by a set of specified
characteristics.
3. Whether something is product, process, or resource depends, sometimes, on the perspective from which you view it. All products are results
of processes or procedures and require resources if they are to be
produced (e. g., a lesson plan is a product of a planning process and an
input to an instructional process). Sometimes an important part of
evaluation of processes is to see what products they produce or what
resources they use (e.g., does a principal's style of running meetings lead
to educational planning decisions or merely consume valuable time?)
4. We could have listed the objects in one list and asked you to think about
each object in terms of resources/inputs, processes/procedures, and
products/outcomes and/or in terms of design, implementation, and
impact. At one point or another, it becomes useful to think of an object
in all of those ways.


WORKSHEET
SELECTING AN EVALUATION OBJECT
Directions: Using the previous examples as aids, select one or more objects to evaluate.

Possibility #1    Object Name:
Comments:
  What it is -
  How it works -
  What it's supposed to do -
  Why you might want to evaluate it -

Possibility #2    Object Name:
Comments:
  What it is -
  How it works -
  What it's supposed to do -
  Why you might want to evaluate it -

Possibility #3    Object Name:
Comments:
  What it is -
  How it works -
  What it's supposed to do -
  Why you might want to evaluate it -


TIPS
SELECTING AN EVALUATION OBJECT
Now you should select from among the alternatives you've listed (or confirm the
wisdom of the choice you've already made).
Use the tips below and the checklist on the next page.
Consider the importance of what you are evaluating
Is the program, course, procedure, etc., one that would have a major impact if it is
effective? ineffective?
Is it so unimportant that it doesn't much matter?
Consider the visibility of the object or the evaluation effort
Is it (or would it be) highly visible and/or a matter of widespread interest from
many persons or groups?
Is it (or would it be) almost totally out of the limelight?
Consider the technical complexity or scope
Is the object a very complex one so that many different variables, questions, issues,
interest groups, and data collection and analysis techniques would need to be
considered?
Is the object a very simple one so that only a very few matters would be
involved?
Consider humanitarian issues
Does the evaluation (or does the object) have the potential for having a major
impact on the livelihood, working conditions, or value and belief systems of
people?
Does the evaluation (or does the object) specifically not deal with and/or avoid
such matters?
Consider the duration of the evaluation project
Would it need to be a long term or longitudinal effort extending over a period of
several years?
Can it be completed in a few hours, days, or weeks?
Consider organizational boundaries
Is the object (or would the evaluation be) concerned with issues that cut across
several organizational boundaries or administrative responsibilities, thereby
requiring widespread coordination or support to do?
Is the object (or would the evaluation be) concerned only with issues almost
entirely within one area of responsibility?
Consider your professional position, style, and competence
Are you ready for or in a position to take on an important, highly visible, complex,
sensitive, and lengthy project that cuts across many boundaries?
Are you cautious or in a position where you should take care to keep a low profile
and not rock the boat?
Consider whether you should try to negotiate a change in an evaluation project
you've been assigned (or change the one you've selected)
After considering some of the tips, do you believe you should try to take on a more
ambitious project?
Do you believe you should try to take on a less ambitious project?


CHECKLIST
SELECTING AN EVALUATION OBJECT
Try out alternative possibilities for projects and consider them until you can give a tentative "yes" answer to the questions below.

YES   NO

Is the project important enough to be worth the effort to do?
Is the project visible enough to be noticed but not overly constrained by the glare of publicity?
Is the project complex enough to be interesting, yet simple enough so that you can probably handle it (with a little help from your friends and the manual)?
Is it likely that you'll be able to deal with humanitarian issues that arise?
Is the project likely to be reasonably short so that you can learn from the experience prior to doing other evaluation projects?
Will you probably be able to get support from the persons or coalitions of persons whose cooperation you'll need to do the work?
Is the project appropriate for you given your position, style, and competence?


MOVING ON
At this point you should have made a decision about what you will evaluate.
The decision was probably made "with reservations." Whether you are an
experienced evaluator or someone doing a first project, you lack information
about what you'll run into and whether you'll be able to cope with unexpected or
unknown problems.
Take comfort from the fact that-as with all the "early" decisions in an evaluation
project-you can change your mind later.
So-Please turn back to Product 1 and start working on an Evaluation Preview.

APPENDIX B

AN EXAMPLE OF AN EVALUATION
DESIGN
INTRODUCTION

This appendix contains a complete (all 7 products) evaluation design. The


example used is an evaluation design for a three-day training workshop. To
get extra mileage from this example, the three-day workshop is an
evaluation training workshop-in which the hypothetical participants learn
about evaluation and produce evaluation designs using this Design Manual
(it's a lot like a workshop the authors used to conduct).
So, this example gives you a look at all seven (7) products. It might also
give you some ideas how you could use the Design Manual to train
others.


PRODUCT 1
EVALUATION PREVIEW
What is to be Evaluated
The object of the evaluation is a workshop developed and delivered by the ETC Project. It is a three day workshop
which gives participants intensive training in evaluation and time to work on their own evaluation designs.
Participants are from professional development and teacher preparation programs in colleges and universities and
local and state educational agencies. The project received funding from the federal Office of Special Education to
develop the materials used in the workshop, but must rely on registration fees to cover some delivery costs. The
Project is based at a University Evaluation Center which is interested in insuring quality and coordinating the Project
with its other activities.
HOW THE WORKSHOP WORKS

Staff prepare for and publicize workshop
1.0 Participants attend 3-day workshop: lecture and demonstration; simulation with exercises on evaluation design materials
2.0 Participants work on evaluation designs for own projects; staff serve as consultants
3.0 Participants return to jobs and use evaluation products and principles

WORKSHOP AGENDA

Day One
9:00-9:30    Introduction
9:30-10:30   Evaluation design exercise
10:30-12:00  Discussion: Review of Decision Areas from Sourcebook
12:00-1:00   Lunch
1:00-2:30    Participants read selected case
2:30-3:30    Small group exercise: Participants analyze case using Decision Areas
3:30-4:30    Summary Review

Day Two
9:00-9:30    Introduction to the Design Manual
9:30-12:00   Participants (each with own Design Manual) complete Products #1 and #2
12:00-1:00   Lunch
1:00-2:30    Exercise and lecture: Measurement Planning
2:30-4:30    Participants complete Products #3 and #4

Day Three
9:00-10:00   Lecture and demonstration: Reporting
10:00-11:00  Participants complete Product #5
11:00-12:00  Panel discussion: Management
12:00-1:00   Lunch
1:00-3:30    Participants complete Products #6 and #7
3:30-4:00    Wrap-up

Evaluation Purpose
The primary purpose is to produce information which can be used to redesign subsequent versions of the workshop. A
secondary purpose is to provide impact and other accountability information to external audiences.
Audiences for the Evaluation
The primary audience is the staff, who want to conduct good, efficient training. Other audiences include: (1) the
Federal Office whose primary interest is to see their funds well used to support a quality effort, and (2) the University
Evaluation Center and Administration, who hope to promote coordination with other efforts and high quality, visible
efforts.
Constraints
The project has a small amount of funds set aside for an internal evaluator (a part-time student). The staff prefer an
internal evaluator with whom they can work closely but see the need for credibility of their self-evaluation work. The
evaluation must involve participants (federal regulation) and be completed before the end of the funding period.


PRODUCT 2
OUTLINE OF THE EVALUATION QUESTIONS

1. Who attended the workshops?
   Subquestions: What are their: number? positions? organizational affiliations? evaluation experience?
   Audiences: OSE (funders), University, Staff, Project Director
   Why the question is important: The project will change organizations only if key leaders (e.g., deans, chairs) attend.

2. Did participants think it was worthwhile?
   Subquestions: Was it: interesting? useful?
   Audiences: OSE (funders), Staff
   Why the question is important: The "word-of-mouth" network is strong among participants; and, if they don't like it, they won't learn it.

3. Were the learning objectives met?
   Audiences: OSE, Staff, Project Director
   Why the question is important: Needed to revise subsequent workshops and to guide any follow-up.

4. What problems arose?
   Subquestions: What problems were related to: preparation? delivery?
   Audiences: Staff, Project Director
   Why the question is important: Staff expect some rough spots; the Director will base staff training on problem areas.

5. How costly was it?
   Subquestions: What did it cost? Are there savings possibilities?
   Audiences: University, Project Director, OSE
   Why the question is important: The project was funded on a per-participant estimate that cannot be exceeded over the entire workshop series.

6. What uses were made of the training?
   Subquestions: What were the: job applications? benefits and effects?
   Audiences: OSE, Staff, Project Director
   Why the question is important: This will serve as impact data and will be used as needs data in next year's proposal.

7. Is the workshop content sound?
   Subquestions: How sound is it from the point of view of: evaluation methodology? instructional design?
   Audiences: Staff, Project Director, University, OSE
   Why the question is important: Needed for revision. And, OSE and the University expect to see high quality efforts.

8. Is the workshop responsive to needs?
   Audiences: OSE, University, Staff, Project Director
   Why the question is important: The entire workshop proposal is based on identified needs for evaluation improvement.


PRODUCT 3
INFORMATION COLLECTION: OVERALL PLAN (A)

Evaluation Questions and Sub-Questions

Information
Collection
Procedures

1. Who
attended?
a. number?
b. position?
c. organiza-

tion?

2. Did they

think it was
worthwhile?

a. interesting?

b. useful?

3. Were

learning
objectives
met?

4. What
problems
arose?

a. preparation?

b. delivery?

5. How
costly was
it?

6. What

uses 'were

made?
a. actual

a. costs?
b. potential uses?
savings?
b. benefits?

7. Is the

content sound?
a. evaluation
methods
8. Is it
b. instructional responsive
to needs?
design

--------------------- -

A. Participants

(P's) complete registration

forms at
beginning
of
workshop
B. P's
complete
brief questionnaire
at end of
first day

C. Sample of
P's (key respondent)
discuss
workshop

X (b)

X (b)

at end; staff
members
take notes
D. Staff keep
notes on

P's use of
materials,

questions
asked.
problems
E. External

reviewers

rate

samples of
evaluation
designs
produced at
workshop


Evaluation Questions and Sub-Questions

Information
Collection
Procedures

1. Who
attended?
a. number?
b. position?
c. organiza-

tion?

2. Did they

think it was
worthwhile?
a. interesting?
b. useful?

3. Were
learning
objectives
met?

4. What
problems
arose?
a. preparation?
b. delivery?

5. How
6. What
costly was
uses were
it?
made?
a. costs?
a. actual
b. potential uses?
savings?
b. benefits?

7. Is the
content sound?
a. evaluation
methods
8. Is it_
b. instructional responsive
design
to needs?

F. At postworkshop
meetings,

staff discuss
procedures
for developing and
producmg

X (a)

X (b)

materials

and making

arrangements

G. Evaluator
compiles
cost and
registration

fees
H. Staff
members
telephone
interview
sample of
P's after
training
-

-------

I. Selected
evaluation
and instructional design
experts

review

workshop
materials
-- - - - - - -------

]. Evaluation
and instructional design
experts
observe


PRODUCT 3
INFORMATION COLLECTION: HOW EACH PROCEDURE WORKS (B)

Procedure
A. Participants (P's)
complete registration
forms at beginning of
workshop
B. P's complete brief
questionnaire at end of
first day

Evaluation
Questions
Addressed

Schedule
for Collection

Sample

Beginning of
workshop at
registration

Workshop
participants

All

Registration
Questionnaire

End of each of
three days
during
workshop
Afternoon of
last day of
workshop

Workshop
participants

All

Reaction Form

Workshop
participants

8-12 selected

Continuous
during
workshop

Staff

All

Staff notes and


Key
Respondents
Guide Sheet
Staff Daily Log

Ratings made
two weeks after
workshop

Evaluation
consultant

Reviewer
Rating Form

4a

Continuous
during
preparation and
delivery of
workshop

Staff

All

Staff Daily Logs


and other notes

1 week after
workshop

N.A.

N.A.

None

Approximately
1/3 of participants stratified
by type of job
setting
3 of each type

Interview Guide

C. Sample of P's (key


respondent) discuss
workshop at end; staff
members take notes
D. Staff keep notes on P's
use of materials,
questions asked,
problems
E. External reviewers rate
sample of evaluation
designs during
workshop
F. At post-workshop
meetings, staff discuss
procedures for
developing and
producing materials
and making
arrangements
G. Evaluator compiles
cost and registration
fees
H. Staff members
telephone interview
sample of P's after
training

2. 3, 4b, 8

6, 8

2 months after

Workshop
participants

I. Selected evaluation
and instructional
design experts review
workshop materials

7, 8

Materials sent 1
week after
workshop;
replies
completed in 3
weeks
During
workshop

Expert
reviews

]. Evaluation and
instructional design
experts observe

Instrument(s)
Used

Respondents

3, 4b, 8

workshop

Observers

by staff

1 of each type

Reviewer's
Guide
Questions

None
(observers take
own notes)


PRODUCT 4
ANALYSIS AND INTERPRETATION PLAN
Evaluation
Questions

Collection
Procedure

Analysis
Procedure

Evaluation
Criteria

Procedures for
Making Judgments

1. Who attended the

A. Participants (P's)
complete registration forms at
beginning of
workshop.

Analyze
questionnaire
items regarding
the,four subquestions to
determine
frequencies.

Evaluator
compares findings
to criteria.

2. Did P's think it was

B. P's complete brief


questionnaire at
end of first day.

Analyze relevant
questionnaire
items to determine ratings of
workshop
elements. Content analyze
staff notes taken
during the
discussion,

Number of P's
needed to cover
costs; 90% of
participants
should match
characteristics
of intended
participants.
A verage ratings
of 3-0 or less on
5-pt. scale are
considered very
low.

workshop?
a. number?
b. positions?
c. organizational
affilia tion?
d. evaluation expert?

worthwhile?
a. interesting?
b. useful?

C. Sample ofP's (key


respondents)
discuss workshop
at end; staff
members take
notes.
3. Were the learning
objectives met?

C above
D. Staff kept notes
on P's use of
materials, questions asked,
problems.

Content analyze
staff notes to
identify evidence
that objectives
used were not
met,
Summarize
reviewers' rating
sheets.

List of learning
objectives
ranked by
importance.
All major
objectives should
be achieved.

Evaluator
compares all
findings to criteria
and presents own
summary;
reviewers' ratings
also presented
separately. Project
Director makes
final determination.

Content analyze
notes from staff
logs and discussion to identify problems,
how they
developed, and
their effects.

Problems such as
confusions about
materials
inadequate
facility
unproductive
diversion from
schedule.

Evaluator
summarizes information; Project
Director and staff
review ita t staff
meeting, Consensus of staff
sought.

E. External reviewers
rate sample of
evaluation designs
during workshop.
4. What problems arose?
a. preparation?
b. delivery?

Comparison of
summarized
findings with
those from
previous
workshops,

C above for "b"


D above for "b"

Evaluation
Questions
------

-- -

- - --------

5. How costly was it?


a. cost?
b. savings possibilities?

Collection
Procedure

-------

F. At post-workshop
meeting, staff
discuss procedures
for developing
and producing
materials and
making arrangements. (for "a")
G. Evaluator
compiles cost and
registration fees.

Analysis
Procedure

Evaluation
Criteria

Were there
unusual or
unjustified
expenditures?

6. What uses were made of H. Staff members


the training?
interview sample
a. job applications?
of P's after
b. benefits/effects?
training.

Analyze items
from interview
schedule regarding uses of
materials;
determine types
of uses and
apparent effects.

Summary
presented. No
pre- set criteria
established.

7. Is the workshop content I.


sound?
a. evaluation point of
view?
b. instructional design
point of view?

Compare
workshop content to design
criteria.
Compare workshop
operation to
design criteria.

Experts selected
so that one is
familiar with
project and at
least one is
nationally
recognized but
with no association with
project or staff
members.

Content analysis
of staff notes and
reports of expert
reviews.

All major needs


(identified when
project began).

J.

8. Is the workshop

responsive to needs?

Evaluation and
instructional
design experts
observe workshops and make
reports to staff.
C above
D above
H above
I above

Procedures for
Making Judgments

-~~----

Compare
expenditures to
budget and to
income from
fees.

Selected
evaluation and
instructional
design experts
review workshop
materials.

Evaluator presents
findings to
Director who
determines
savings
possibili ties
based upon
comparisons to
similar
activities.
Staff discuss,
reach consensus
about adequacy,
as compared to
needs data. (Information reported
to OSE for any
judgments they
choose to make.)
Evaluator
summarizes comparison of its
content to criteria
to identify
strengths and
weaknesses.

Evaluator
compares findings
to Needs Report.


PRODUCT 5
REPORT PLAN
Audience: OSE
Content: Description of Project activities and plans; answers to questions 1-3, 5-8
Format: Written report
Date/Frequency: 60 days after funding year
Event: End-of-the-year report

Audience: University
Content: Description of Project activities and budget; answers to questions 1, 5, 7, 8
Format: Written report
Date/Frequency: 30 days after funding year
Event: End-of-the-year report

Audience: Staff
Content: Evaluation design
Format: Meetings with written summary
Date/Frequency: 2 months before workshop
Event: Staff meeting

Audience: Staff
Content: Review of findings and implications; answers to questions 1-4, 6
Format: Presentation by evaluator
Date/Frequency: 2 weeks after workshop
Event: Staff meeting

Audience: Staff
Content: Review of findings and implications; answers to questions 7, 8
Format: Presentation by evaluator
Date/Frequency: 2 1/2 months after workshop
Event: Staff meeting

Audience: Project Director
Content: Same as for staff
Format: (see above)
Date/Frequency: (see above)
Event: (see above)

Audience: Project Director
Content: Progress, problems, and next steps
Format: Informal discussion
Date/Frequency: Every 2 weeks
Event: Meeting

Audience: Project Director
Content: Answers to questions 1-8
Format: Written report
Date/Frequency: 2 1/2 months after workshop
Event: Meeting


PRODUCT 6
MANAGEMENT PLAN
Evaluation Workplan
(Person responsible shown in parentheses; the schedule ran from February through August.)

A. Design the evaluation
   draft the design (Evaluator)
   review (Director and staff)
   present to staff (Evaluator)
   revise (Evaluator)
   have reviewed by consultant (Director and consultant)

B. Develop procedures and instruments
   draft registration form (Proc. A), questionnaire (Proc. B), and guidelines for expert review (Procs. E and J) (Evaluator)
   review (Director and staff)
   revise (Evaluator)
   produce (Secretary)
   train staff for keeping notes on workshop process (Procs. C and D) (Evaluator)
   develop interview schedule (Proc. H) (Evaluator)

C. Collect information
   during workshop: Procs. A, B, C, D (Staff and evaluator)
   following workshop (Evaluator):
      Proc. E: send designs to reviewers; reviews due
      Proc. F: post-workshop meeting
      Proc. G: compile budget information
      Proc. H: interview a sample of P's
      Proc. I: send material to reviewers; reviews due

D. Analyze information
   to answer questions 1-5, 7, 8 (Evaluator)
   to answer question 6 (Evaluator)

E. Reports
   prepare summaries (Evaluator)
   staff meetings to report findings (Evaluator and Director)
   meetings with Director (Evaluator and Director)
   write reports for Director's use in year-end reports (Evaluator)
   prepare meta-evaluation report (Consultant)

Budget

Personnel
   Evaluator (25% of $4,000 x 1/2 year): $500
   Consultant fees for 2 workshop observations ($100/day x 2 days x 2): $400
   Consultant fees for reviews ($100 x 4): $400
      subtotal: $1,300

Travel and Lodgings
   To workshop for:
   Evaluator (carfare = $15; per diem = $50 x 3 = $150): $165
   Consultant (2) (carfare = $50 x 2; per diem = $60 x 3 x 2 = $360): $460
      subtotal: $625

Material and Supplies
   Office supplies: $50
   Copying: $100
   Postage: $20
      subtotal: $170

TOTAL: $2,095
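
For readers who want to double-check the figures, the subtotals and total can be recomputed from the line items. The short sketch below is illustrative only; the dictionary layout and abbreviated labels are ours, not the manual's, and it simply sums each category and the grand total.

```python
# Minimal check of the example evaluation budget (illustrative only).
budget = {
    "Personnel": {
        "Evaluator (25% of $4,000 x 1/2 year)": 500,
        "Consultant fees for 2 workshop observations": 400,
        "Consultant fees for reviews": 400,
    },
    "Travel and Lodgings": {
        "Evaluator (carfare $15 + per diem $150)": 165,
        "Consultants (carfare $100 + per diem $360)": 460,
    },
    "Material and Supplies": {
        "Office supplies": 50,
        "Copying": 100,
        "Postage": 20,
    },
}

total = 0
for category, items in budget.items():
    subtotal = sum(items.values())   # category subtotal
    total += subtotal
    print(f"{category}: subtotal ${subtotal:,}")
print(f"TOTAL: ${total:,}")  # prints $2,095, matching the plan
```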


PRODUCT 7
META-EVALUATION PLAN

Evaluation of Evaluation Design


Purpose:
To demonstrate a credible and defensible design to funding agent, and
to revise evaluation design as necessary.
Method:
Send evaluation design to external consultant not affiliated with
project; meet with consultant to review design.
Resources:
Consultant fees, meeting time and space, checklist.
Criteria:
Joint Committee Standards.
Evaluation of Progress
Purpose:
To revise evaluation as necessary.
Method:
Staff will meet with evaluator before, during and after workshop to
discuss evaluation instruments and data collection.
Resources:
None extra.
Criteria:
Utility and accuracy of information.

Evaluation of Completed Evaluation
Purpose:
To "certify" evaluation report and determine how to revise future
evaluation work.
Methods:
1. Send evaluation report to external consultant who will append a
Meta-evaluation Report.
2. Conduct meeting with staff and invited others to review the
evaluation report, design and uses; discuss utility and worth.
Resources:
Consultant fees; meeting time, promotion and space.
Criteria:
Joint Committee Standards; utility and economy.


APPENDIX C
EXTRA WORKSHEETS


WORKSHEET
A NOT-SO-ELEGANT DESIGN

PRODUCT
YOUR RESPONSE (Answer in your head or write notes here)

1. EVALUATION PREVIEW
   why will the evaluation be done?
   for whom will it be done?
   what will be evaluated? (See Appendix A if you're undecided)
   what constraints are known?

2. OUTLINE OF EVALUATION QUESTIONS
   what questions and subquestions will the evaluation address for each audience?

3. INFORMATION COLLECTION PLAN
   what sources of information will be used?
   how will information be collected?

4. ANALYSIS AND INTERPRETATION PLAN
   how will the information be analyzed?
   what criteria will be used to judge the object?
   what procedures will be used to make those judgments?

5. REPORT PLAN
   what reports will be made?
   what should their contents be?
   when will they be given?
   to whom will they be made?
   what format will be used for them?

6. MANAGEMENT PLAN
   what tasks need to be accomplished?
   who will do them?
   when will they be done?
   what resources will be needed to do them?

7. PLAN TO EVALUATE THE EVALUATION
   how will you check your design?
   how will you know if it's going ok?
   what can you use to judge the overall evaluation effort?


WORKSHEET
DESCRIBING THE OBJECT (Defining "What" You'll Evaluate)
Directions: Use aid #2 as a guide to describe what you will evaluate (the object) in
            the spaces below. You may want to write a narrative description
            and/or pictorial representation to answer the questions.

List: Who is involved in the object

Note: Why it exists; what are its goals, objectives?

Describe or list: The functional elements of the object. What are its sub-parts and pieces?

Explain: When it did, or does, take place (or how frequently)

Describe: Where it exists


WORKSHEET
ANALYZING AUDIENCES

List the audiences for your evaluation (use aid #3)

Identify persons/spokespersons for each audience

Describe the particular values, interests, expectations, etc. that may play a key role as criteria in the analysis and interpretation stage of your evaluation


WORKSHEET
IDENTIFYING PURPOSES

Purpose: Write each purpose below. Use aid #4 as a guide.

Rank: Use a number to rank each purpose (if more than 1).

Interested Audiences: List the people or groups who are primarily interested in each purpose.


WORKSHEET
IDENTIFYING CONSTRAINTS

Directions: Think about constraints for each of the categories below; use aid #5.
Then, list the constraints you will need to attend to.
Some guide questions and categories
Write your constraints here

Outline of Evaluation Questions
   Is there a particular evaluation model that's supposed to be used to guide the evaluation?
   Are there evaluation questions required by a major audience?

Information Collection Plan
   Are there information collection procedures or instruments that must be used?
   Is there a group of persons who must be involved as respondents in the information collection?
   What information is not available?

Analysis and Interpretation Plan
   Have criteria for judging the object been set?
   Is there a procedure that's already been selected for judging the object?
   Do some audiences have particular biases that should be recognized?

Report Plan
   Is there a particular report that must be made?
   Are there mandatory report audiences?
   Is there a fixed report schedule?

Management Plan
   Must the evaluation be completed by a particular time?
   Is the evaluator already determined?
   Is there a budget ceiling for the evaluation?

Plan to Evaluate the Evaluation
   Is there an auditor, third party, or "meta-evaluation" required?
   Are there funds to evaluate the evaluation?

Other
   Are there any other constraints on the evaluation?


WORKSHEET
DRAFT EVALUATION QUESTIONS

Directions: In the spaces below write potential evaluation questions in each of


the (6) categories, as pertinent to your evaluation project. Use aids
#6 and #7.
Questions pertinent to key object functions:

Q?
Q?
Q?
Q?
Q?

Questions derived from theoretical models:

Q?
Q?
Q?
Q?
Q?

Questions based on expertise and experience:

Q?
Q?

Q?
Q?
Q?

Questions responsive to audience concerns and interest:

Q?
Q?
Q?
Q?
Q?

Questions defined from evaluation purpose:

Q?
Q?
Q?
Q?
Q?

"Bonus" (and other) questions:

Q?
Q?
Q?
Q?
Q?

"Continued" heading in case you need it set below


WORKSHEET
EVALUATION QUESTIONS AND SUBQUESTIONS

Directions: Review, organize, and revise your draft evaluation questions. Write
them, their sub-questions, audiences and why they're important in
the spaces below. Use the example (page xxx) and aid #7.

Evaluation
Questions

Sub-Questions

Audience
(Who cares?)

Why is the question important?


WORKSHEET
OVERALL INFORMATION COLLECTION PLAN

Write evaluation questions here (see Product #2)

List information collection procedures here (use aids #8 and #9)

(In the cells, place an "X" to show where a procedure addresses a question.)

WORKSHEET
HOW EACH PROCEDURE WORKS

Directions: Fill in each column below for each procedure listed in worksheet A.
Procedure

Evaluation question addressed

Schedule (when, how, where)

Sample (kind and size, see aid #10)

Respondents

Instruments used


WORKSHEET
PRELIMINARY HANDLING & ANALYSIS PLAN

List information
collection
procedures

Describe how
problems will be
identified,
e.g., incomplete
information and
errors

Describe
preliminary
analysis

Describe how the information will be stored and retrieved, and who will have access


WORKSHEET
ANALYSIS & INTERPRETATION PLAN

List evaluation
questions

List information collection procedures for each question

Describe analysis procedures

Describe
procedures for
making the
judgments
(see aid #11)


WORKSHEET
PLANNING REPORTS

List
evaluation
audiences

Describe
content of
reports
(use aid #12)

Identify format to be used (use aid #12)

List date or frequency of report (use aid #12)

Identify event
associated with
report


WORKSHEET
TASK ANALYSIS

List general tasks and sub-tasks (see aid #13)

Task begin/end date

Name the person(s) who will do each sub-task

Estimate personnel cost

List other resources needed

Estimate for other resources

Task:

Task:

Task:


WORKSHEET
REVISED TASK CHART

List tasks for the evaluation

Identify person(s) responsible

Specify when the task will be done


WORKSHEET
BUDGET
Directions: Complete the following budget, using the categories of expenditures.

Personnel
Subtotal
Fringe (institutional percentage of cost)
Travel

Materials & Supplies

Communications

Total direct costs


Total indirect costs

TOTAL


WORKSHEET
PLANNING THE EVALUATION OF THE EVALUATION

Directions: Use aids #14 and #15 to complete the worksheet.


Evaluate your evaluation design

Evaluate processes (during) your evaluation

Evaluate the results/products of your evaluation

Elements of Plan:
   Purpose
   Method
   Persons & resources needed
   Relevant standards & criteria


WORKSHEET
SELECTING AN EVALUATION OBJECT
Directions: Using the previous examples as aids, select one or more objects to
evaluate.
Possibility #1
Object Name
Comments:
   What it is-
   How it works-
   What it's supposed to do-
   Why you might want to evaluate it-

Possibility #2
Object Name
Comments:
   What it is-
   How it works-
   What it's supposed to do-
   Why you might want to evaluate it-

Possibility #3
Object Name
Comments:
   What it is-
   How it works-
   What it's supposed to do-
   Why you might want to evaluate it-
