If it works ...
Notice and nurture.
If it doesn't work ...
Notice and change.
Design Manual
Program Evaluation
A PRACTITIONER'S GUIDE FOR
TRAINERS AND EDUCATORS
ROBERT O. BRINKERHOFF
DALE M. BRETHOWER
TERRY HLUCHYJ
JERI RIDINGS NOWAKOWSKI
Kluwer-Nijhoff Publishing
Boston The Hague Dordrecht Lancaster
a member of the Kluwer Academic Publishers Group
e-ISBN-13: 978-94-009-6667-3
DOI: 10.1007/978-94-009-6667-3
Contents

ABOUT THESE MATERIALS  vi
INTRODUCTION  xi
GETTING STARTED  1
PRODUCTS
1: Evaluation Preview  13
2: Outline of Evaluation Questions  39
3: Information Collection Plan  53
4: Analysis and Interpretation Plan  69
5: Report Plan  81
6: Management Plan  91
7: Plan to Evaluate the Evaluation  105
APPENDICES
A: Selecting What (an Object) to Evaluate  115
B: An Example of an Evaluation Design  123
C: Extra Worksheets  137
Conceptual Basis
These materials are about designing, conducting, and using evaluation,
but their underlying assumption is that evaluation should be useful for
improving current and/or future training efforts. While these materials are
meant to help you do evaluation well, we believe that evaluation is not
worth doing at all unless you can use it to make training better, or to better
invest training resources.
Good training, whether preservice or inservice, must satisfy four conditions:
1. Identify worthwhile training goals
2. Design effective training strategies
3. Effectively implement training
4. Decide whether to terminate, continue, curtail or expand training

[Figure: the training cycle - design training strategy, implement training, make recycling decisions]
Each function is defined, then the several key decisions needed to complete
the function are explained. The Sourcebook contains examples, guidelines,
criteria and checklists you can use to do more effective evaluation. It also
includes references to other books and resources that can be useful in
evaluating training programs.
The Casebook contains twelve case-examples. Each is a story about
evaluation within a particular training program. The case examples,
designed to portray evaluation applications of different types in different
settings, were contributed by field practitioners and written in conjunction
with ETC staff. They are fictional accounts but based on actual programs
and uses of evaluation. Each case-example is annotated to highlight the
seven major evaluation functions as set forth in the Sourcebook. This is done
to show how these functions differ according to particular program needs
and settings. Following each case is a set of review and discussion questions
to help extend the lessons available in the case-example.
an evaluation overview
an outline of evaluation questions
an information collection plan
an analysis and interpretation plan
a management plan
a report plan
a plan for evaluating your evaluation
For help with particular problems, you could read some of the case-examples. Use the guide below to see which cases relate to certain problems.

[Table: common problems keyed to relevant case-examples (numbers refer to the Casebook Table of Contents), including C-1 through C-5, S-1 through S-3, and L-2 through L-4]
Introduction
Please glance over the questions that follow and read the answers to those that are of
interest.
[Figure: the evaluation cycle - information is collected about the training effort; the information is interpreted and valued; people use the information to change the training or themselves]
Q: If I use this manual to develop a design, what will I have when I am done?
A: You will have a complete evaluation design: a set of products relevant to seven major evaluation decisions:
1. Evaluation Preview
2. Outline of Evaluation Questions
3. Information Collection Plan
4. Analysis and Interpretation Plan
5. Report Plan
6. Management Plan
7. Plan for Evaluating the Evaluation
GETTING STARTED

TASK STEPS
[Prepare, Try, Improve, with procedures and objectives leading to a not-so-elegant design]
EXAMPLE
A NOT-SO-ELEGANT DESIGN
Product
1. Evaluation Preview
2. Outline of Questions
5. Report Plan
(content, format, schedule,
audience)
6. Management Plan
(tasks, personnel, resources)
WORKSHEET
A NOT-SO-ELEGANT DESIGN
PRODUCT
1. EVALUATION PREVIEW
AID #1
THE PARTS OF AN EVALUATION DESIGN
THE PRODUCT AND ITS PARTS
1. Evaluation Preview: object description; purpose; audiences; constraints
2. Outline of Evaluation Questions
3. Information Collection Plan
4. Analysis and Interpretation Plan
5. Report Plan
6. Management Plan
7. Plan to Evaluate the Evaluation
NOTE: These parts are not always produced in the order shown. Emergent and more naturalistic evaluations
employ a different order (e.g., evaluation questions might come after information is collected and analyzed).
And, some evaluations go through several cycles of some of these steps. But whatever design approach one
uses, virtually any evaluation has all of the parts described above.
TIPS
There are two major decisions about your evaluation that you should
consider now, before moving further into a design:
1. How much, and how, will you involve others?
2. How much planning should you do in advance?
[Tables: tactics for involving others; planning approaches]
CHECKLIST
EVALUATION DESIGN ADEQUACY
DIRECTIONS: Use this checklist to identify areas of your evaluation design that
need more development. Then work through the corresponding products in this
manual. This is a master checklist. Each product section in the remainder of the
manual has its own, more detailed, checklist. Use these also.
2. Is the evaluation object relatively stable and mature? Do you know what
3. Are the reasons for the evaluation specified and defensible? (purpose)
4. Is it clear what planning, implementing, redesign, judging or other decisions
and interests are to be served by the evaluation? (purpose)
5. Are all relevant evaluation audiences described? (audiences)
6. Are the criteria, values and expectations that audiences will bring to bear in
interpreting information known and described? (audiences)
7. Have the events in the setting that are likely to influence the evaluation been
identified? (constraints)
8. Is someone available to do the evaluation who has some basic skills in
B. Evaluation questions
1. Have key stakeholders' needs and questions been identified?
Management
1. Does the design provide for adequate protection of human privacy and other rights?
2. Is there agreement about criteria to judge the success of the evaluation?
3. Will the evaluation's credibility be jeopardized by evaluator bias?
4. Are there procedures planned to assess the quality of the evaluation's design, progress, and results?
5. Are there provisions for disseminating, reporting, interpreting, and otherwise utilizing the results and experience of the evaluation?
MOVING ON
Now you must make an important decision: Is your not-so-elegant design
complete enough to guide you toward achieving your purposes in
evaluating?
1. The decision is important but probably not difficult. Unless your project is unusually simple (or you are unusually competent and confident), you should use the examples, worksheets, and checklists in the manual to develop a more complete design.
2. Use the checklist above to help you decide what parts of your design need further development.
3. We ask you to develop a not-so-elegant design not because we expect you to use it as is, but because it gives you perspective. Experienced evaluators tend to do a short form in their head to orient them to the whole task. That way they don't get lost in the details. You need to do the same.
4. Wait! Before you make a final decision, browse through the manual to see some of what it has to offer. Be sure to look at Appendix B. It's an example of a moderately complex design that will show you the sort of thing you would generate by working on through the manual. After you've looked through the book and studied the example in Appendix B, you'll be able to make an informed decision about whether to stop with your not-so-elegant design, to improve upon it on your own, or to improve it by using the worksheets, aids, tips, and checklists in each of the seven (7) product sections that follow in the manual.
5. If you're unsure about how evaluation might be used in your work, read some of the examples in the Casebook.
SOURCEBOOK REFERENCES

Designing Evaluation

KEY ISSUES
What should a design include? How do you go about constructing a design? How do you recognize a good design?

TASKS (Sourcebook pages)
1. Determine the amount of planning, general purpose, and degree of control (37-42)
2. Overview evaluation decisions, tasks, and products (43-58)
3. Determine general procedures for the evaluation (59-60)
4. Assess the quality of the design (64-71)
PRODUCT 1

EVALUATION PREVIEW
PRODUCT OVERVIEW
[Task steps: Prepare, Try, Improve, with procedures and objectives, including the parts of a preview]
EXAMPLE
AN EVALUATION PREVIEW
MEMO
FROM: Pat
TO: Larry, Stella, and Ed
RE: Evaluation of the Parents' Handbook
EXAMPLE
ANOTHER EVALUATION PREVIEW
'"
Administrative
Offices
Vocational
and Special
Education
Program
-planning
-budgeting
Audio- Visual
Services and
Resource
Library
Management ~
Services
Reading
Resource
Project
"-./
,
I
I
./
Workshop
Admin.
Support
II
Catalog
are the primary audience for my work. Others are your assistant, the
teachers who must use the project (and who I understand are skeptical
about time demands), and finally, the superintendent who must
approve an ongoing budget.
Purposes: The first purpose is to determine whether the project has met
intended needs and goals; the second is to help you decide whether the
project ought to be continued.
Other notes: The evaluation must be conducted within the next eight
months. You and others in your office will do some of the information
collection, but it seems likely that an outside evaluator will be a good
choice to enhance the credibility of the findings.
See you next Wednesday.
Sincerely,
Sam Quinn
WORKSHEET
DESCRIBING THE OBJECT (Defining "What" You'll Evaluate)
Directions: Use aid #2 as a guide to describe what you will evaluate-the object-in
the spaces below. You may want to write a narrative description
and/or pictorial representation to answer the questions.
List
Who is involved in
the object
Note
Describe or list
The functional
elements of the
object. What are its
sub-parts and pieces?
Explain
Describe
Where it exists
AID #2
SOME ELEMENTS OF AN OBJECT DESCRIPTION
Who
Actors
Why
Goals
What
Components
Activities
Resources
Problems
When
Timeline
How long has the object been around? What does its
history look like, and what kind of future is it
anticipating? How long does it have to accomplish short
and long range goals?
Where
Setting
TIPS
DESCRIBING THE OBJECT
CHECKLIST
DESCRIBING THE OBJECT
Answer the following questions and revise the worksheet until you can answer "yes"
to all of them.
YES
NO
Have you verified (or do you plan to verify) the factual accuracy
of your description-e.g., through independent judgment, direct
observation?
Do you have a process for keeping the description up to date, e.g., a log for recording critical events or a schedule to gather new information?
Are you satisfied that you know enough about what you're
evaluating to proceed with an evaluation design?
Have you included all the parts or dimensions of the object that
are obviously related to the purposes of the evaluation? (Come
back to this after you've written purposes.)
WORKSHEET
ANALYZING AUDIENCES
Identify persons/
spokespersons for each
audience
AID #3
SOME TYPICAL AUDIENCES FOR TRAINING EVALUATIONS
An audience is a person or group who ...
makes decisions based on the
evaluation findings
department chairperson
assistant superintendent
dean
political group
pupil personnel officer
federal office
news media
professional association
accrediting agency
contract administrator
busybody
friend
enemy
classroom teacher
union
client
TIPS
ANALYZING AUDIENCES
Don't worry about listing everyone you know as an audience. You may create
confusion by involving more people than can realistically be involved. Know who
the key audiences are. Establish levels of involvement.
If you leave a particular audience out of the evaluation, think through the
consequences of doing that.
One audience may lead you to others. Talk to or think of a few known audiences: go
from them to others.
Talking to members of audience groups is a good way to learn their perspectives on
the evaluation of the object.
You may need to add audiences in the course of the evaluation.
In your enthusiasm to involve audiences, don't promise them the world. Let them
know what your constraints are.
Throughout the evaluation, you will have to return to these audiences, as you try to
accommodate them. Establishing a good rapport at the beginning is important.
Think about: who could sink this evaluation? Who could make it a winner? Are these
people or groups on your list?
Remember that evaluation always has consequences, and sometimes "fallout." Try
to forecast these consequences, and include those who might be affected.
CHECKLIST
ANALYZING AUDIENCES
Answer the following questions and revise the worksheet until you can answer "yes"
to all of them.
YES NO
Have you considered including at least a representative of everyone who:
will be involved in the evaluation?
WORKSHEET
IDENTIFYING PURPOSES
Purpose
Rank
Interested Audiences
Use a number to
rank each
purpose (if more
than 1)
AID #4
SOME EVALUATION PURPOSES

Purposes related to training program functions | Purposes for "new objects" (being planned or recently implemented)

Goal or Need Identification

Program Design
to select a design
to clarify roles or resolve conflicts
to compare alternative designs
to determine the adequacy of a given design

Implementation and Process

Products and Outcomes
to determine what outcomes are appropriate
to determine immediate outcomes and initial effects
to determine the quality of intermediate products

Recycling Decisions
TIPS
IDENTIFYING PURPOSES
CHECKLIST
IDENTIFYING PURPOSES
Answer the following questions and revise the worksheet until you can answer "yes"
to all of them.
YES
NO
Will they clarify what you are doing and why so you'll probably
get cooperation rather than resistance?
WORKSHEET
IDENTIFYING CONSTRAINTS
Directions: Think about constraints for each of the categories below; use aid #5.
Then, list the constraints you will need to attend to.
Some guide questions and categories
Outline of Evaluation Questions
Is there a particular evaluation model
that's supposed to be used to guide
the evaluation?
Are there evaluation questions
required by a major audience?
Information Collection Plan
Are there information collection
procedures or instruments that must
be used?
Is there a group of persons who must
be involved as respondents in the
information collection?
What information is not available?
Analysis and Interpretation Plan
Have criteria for judging the object
been set?
Is there a procedure that's already
been selected for judging the object?
Do some audiences have particular
biases that should be recognized?
Report Plan
Is there a particular report that must
be made?
Are there mandatory report audiences?
Is there a fixed report schedule?
Management Plan
Must the evaluation be completed by a
particular time?
Is the evaluator already determined?
Is there a budget ceiling for the
evaluation?
AID #5
SOME INFLUENCES ON EVALUATION THAT CREATE CONSTRAINTS
INFLUENCES
EVALUATION IMPLICATIONS
Organizational
Politics
Program
Leadership
Professional
Influences
History
Organizational
Setting
Economics: How secure is the fiscal support system for the program and the evaluation? Have funds been allocated? Will a written commitment of fiscal support be forthcoming?

Social Patterns

Legal Guidelines: Are there legal restrictions (rights of human subjects) that will limit collection of desired information? Are there professional or institutional rulings that affect evaluation procedures? Will the object be affected by pending legislation?
Resources
TIPS
IDENTIFYING CONSTRAINTS
To uncover serious constraints, try listing (or having a colleague list) reasons why the
evaluation can't be done. These may alert you to potential problems or constraints
that won't necessarily make the evaluation impossible, but which certainly need to
be handled.
Some constraints may not become apparent until the evaluation is implemented.
You may identify some hidden ones by talking to key audiences now, particularly
those who control critical resources for the evaluation, e.g., funding, personnel,
and information.
Be sure to look for opportunities at the same time you look for limitations.
Don't be discouraged by the constraints you identify. Look hard for ways to design
the evaluation in spite of them. But don't proceed with the evaluation if you know
it's doomed.
Carefully review prior evaluations of the object and talk to people who conducted
them. These may highlight constraints you can expect.
CHECKLIST
IDENTIFYING CONSTRAINTS
Answer the following questions and revise the worksheet until you can answer
"yes" to all of them.
YES
NO
Have you asked the person who is responsible for the evaluation
about constraints?
MOVING ON
Check to see if you need to do any of the following things before going on to
develop evaluation questions.
SOURCEBOOK REFERENCES

Focusing the Evaluation

TASKS (Sourcebook pages)
1. Investigate what is to be evaluated (7-15)
2. Identify and justify purpose(s) (16-19)
3. Identify audiences (20-22)
4. Study setting (23-26)
5. Decide whether to go on with the evaluation (31-36)
PRODUCT 2

OUTLINE OF EVALUATION QUESTIONS
PRODUCT OVERVIEW
[Task steps: Prepare, Try, Improve, with procedures and objectives, including knowledge about properties of evaluation questions]
EXAMPLE
A SET OF EVALUATION QUESTIONS FOR A HYPOTHETICAL TRAINING WORKSHOP

1. Was the workshop implemented as planned?
Subquestions: Did it go according to schedule? Were resource materials used effectively?
Audience: Inservice director
Why the question is important: The director needs to modify the plan where it doesn't work, or better manage staff.

2. What are the characteristics of participants?
Subquestions: What are their positions in schools? Number of years in district? Special education certification status?
Audience: Inservice staff
Why the question is important: The funding agent sponsors this for certain client types. We need to market the session better to nonattendees.

3. Did the workshop facilitate learning by participants?
Audience: Participants
Why the question is important: Participants need to know what they learned. Staff needs to know workshop effects.

4. How appropriate were the workshop objectives?
Audience: Administrators, Staff
Why the question is important: The workshop must respond to current and perceived needs, or it won't be attended or refunded.

5. Were participants satisfied with the workshop?
Subquestions: Satisfied with timing? length? staff performance? topics?
Audience: Participants, Administrators, Staff
Why the question is important: Training won't work if participants don't like it and won't get involved. Positive reactions indicate utility and worth to participants.

6. What problems arose?
Audience: Inservice staff, Ad director
Why the question is important: The workshop still needs debugging, better preparation, more efficient design, etc.

7. How well did staff fulfill their roles (as discussion leaders, as inservice providers)?
Audience: Inservice director
Why the question is important: The director trains the staff to make sure they can, and do, fulfill their roles; a poor performance can't be afforded.
WHAT TO NOTICE

1. Some evaluation questions are "large" enough to be broken into several subquestions or variables; others are "small" enough so that the evaluators choose not to break them into smaller subquestions.
2. How the questions are phrased can imply how they should be answered. For example, the investigation of staff performance in #5 requires finding out if participants are satisfied with staff, while #8 might, but does not necessarily, direct the information collection that way. It could involve other ways of collecting information, such as observation by supervisors or peer ratings.
3. As each question or subquestion is identified, some potential meanings are omitted. For example, question 2 has three subquestions. There are others which could have been added, e.g., age and number of years of teaching experience. In the process of analyzing the questions only the most important meanings are kept. The trick is to be careful to avoid eliminating some meanings prematurely and thereby inappropriately narrowing the evaluation.
4. Questions may ask for value judgments, for descriptive information, or for both. For example, #4 asks for a judgment about "appropriateness," while #2 asks for description of characteristics. Both types of questions are fine, but note that even #2 will be used to make value judgments about the program, because the evaluation audience that interprets the findings will undoubtedly have in mind some notions about what the "appropriate" characteristics are, depending on their intended audience for the workshop. Another way of phrasing question #2 is to ask: "Did the characteristics of the participants at the workshop match those of the target audience?" This requires both description and judgment. If the workshop is in early stages of development, it's possible that judgments of what's appropriate will not be made about the workshop participants. One might ask "Who came to the workshop?" coupled with "Which participants benefited the most?" The information could be used in the design of future workshops to specify who should attend and to focus the design.
5. There are a variety of types of questions. Questions can ask about many features of the object, for example (see also aid #2): goals and needs, context, design, process, outcomes, potential problem areas, costs, controversial areas, critical functions, accountability issues, discussions, potential market, preconditions.
6. "Why the question is important" helps to determine criteria and values, as well as the nature of reporting. It shows what will be done with information collected for the evaluation questions, by explaining who needs the "answer," and/or what might be done with an "answer." (A brief sketch after this list shows one way an entry in such an outline can be recorded.)
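For readers who keep their planning notes on a computer, the following is a minimal, hypothetical sketch (in Python; it is not part of the original materials, and the field names are illustrative only) of how one entry from the example outline might be recorded:

    # Hypothetical record structure for one evaluation question; field names are
    # illustrative, not prescribed by the manual.
    from dataclasses import dataclass, field

    @dataclass
    class EvaluationQuestion:
        text: str
        subquestions: list = field(default_factory=list)
        audiences: list = field(default_factory=list)
        why_important: str = ""

    q1 = EvaluationQuestion(
        text="Was the workshop implemented as planned?",
        subquestions=["Did it go according to schedule?",
                      "Were resource materials used effectively?"],
        audiences=["Inservice director"],
        why_important="The director needs to modify the plan where it doesn't work.",
    )
    print(q1.text, q1.audiences)

Keeping each question together with its audiences and rationale makes it easier, later, to check that every planned collection procedure serves someone's question.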
WORKSHEET
DRAFT EVALUATION QUESTIONS

[Worksheet grid of blank "Q?" spaces for listing draft evaluation questions]
AID #6
SIX METHODS OF DEFINING EVALUATION QUESTIONS

[Table of methods; one method is to ask audiences what questions they want answered, or what they believe is most important to investigate.]
Brethower, Karen S., and Rummler, Geary. "Evaluating Training." Improving Human Performance Quarterly, Vol. 5, 1977, pp. 107-120.
Stufflebeam, Daniel L., in Worthen & Sanders, Educational Evaluation: Theory and Practice. Jones Publishing, Ohio, 1973, pp. 128-150.
WORKSHEET
EVALUATION QUESTIONS AND SUBQUESTIONS
Directions: Review, organize, and revise your draft evaluation questions. Write them, their subquestions, audiences, and why they're important in the spaces below. Use the earlier example and aid #7.
Evaluation
Questions
Sub-Questions
Audience
(Who cares?)
AID #7
EXAMPLE OF EVALUATION QUESTIONS (BASED ON
A 2-DAY INSERVICE WORKSHOP ABOUT SCHWARTZIAN THEORY)
Questions
related to
program functions
Goal or need
identification
Program design
Questions for a
"new" object
(i.e., being planned or
recently implemented)
What problems are teachers
having with handicapped
children?
What kinds of training do
teachers want?
What time and other
constraints exist among the
target population?
To what extent do teachers
value the planned workshop
objectives?
Is Design A more practical
than Design B?
Is Design C any good?
Implementation
or processes
Products
or outcomes
Questions for an
"old" object
(i.e., been around awhile
perhaps through several
rounds of revision)
Are the identified needs still
valid?
Did the workshop address
the three goals?
What needs do attendees
have?
Questions
related to
program functions
Questions for a
"new" object
(i.e., being planned or
recently implemented)
How well can teachers use
Schwartz Techniques?
Are graduates using
Schwartz Techniques in
their classrooms?
Recycling decisions
Questions for an
"old" object
(i.e., been around awhile
perhaps through several
rounds of revision)
What other uses or misuses
are being made of workshop
acquired methods?
Do people who receive less
workshop training perform
less well than others?
What are the costs and
benefits to attendees of
using the new methods?
To what extent is the
workshop reaching the
population?
What continuing needs and
problems exist despite the
workshop benefits?
TIPS
OUTLINING EVALUATION QUESTIONS
Some ways of doing the task:
You or a committee draft a set of questions, then circulate to relevant others for
revising and ranking;
take draft set to meeting at which others work on them;
share questions with staff, consultants, or others.
Don't worry now about "measurability" of questions. Wait until you have a set that
will meet your purposes, then check for feasibility of answering. Don't throw good
questions away until you're sure you must.
You will probably use the six methods (see Aid #6) of defining questions
simultaneously, so don't worry about deciding the "source" of each question.
Although it's important to be as specific as you can in writing evaluation questions,
don't force specificity. As questions are further defined and subdivided, some
potential meaning is lost. Make conscious decisions when omitting meanings.
In the course of the evaluation you may find that changes in your situation (e.g., level
of resources or availability of information) or your experiences in answering some
questions lead you to reconsider that set of questions. Some may become
outdated.
If your evaluation is exploratory-i.e., you want to do a long-term assessment of the
object, but aren't sure about all you want to investigate-consider asking just a few
evaluation questions, then adding others later.
If you need an "answer," it's a good question!
Naturalistic approaches to evaluation will generate evaluation questions as they go.
Your initial questions may be few, and quite general- just enough to get you
pointed in the right direction.
CHECKLIST
OUTLINING EVALUATION QUESTIONS
Answer the following questions and revise the worksheet until you can answer
"yes" to all of them.
YES
NO
MOVING ON
Check to see if you need to do any of the following things before moving on to
develop an Information Collection Plan.
___ Take a break.
___ Check your evaluation questions with audience members to get their
suggestions or agreements.
___ Rewrite and tidy up your list of evaluation questions and subquestions. (You
can do that right on a worksheet that appears in the Information Collection
Plan section.)
___ Get more information about the object or the context (e.g., data from prior
evaluations, ongoing descriptive data) before you complete your questions.
___
Interview some audience members to make sure you've got their key
questions included.
SOURCEBOOK REFERENCES

Focusing
Task 5: Identify major questions (Sourcebook pages 27-30)
PRODUCT 3

INFORMATION COLLECTION PLAN
PRODUCT OVERVIEW
[Task steps: Prepare, Try, Improve, with procedures and objectives, including an Information Collection Plan]
EXAMPLE
INFORMATION COLLECTION PLAN (A)

Evaluation Questions
1. Was the workshop implemented as planned?
2. What are the characteristics of participants?
3. Did the workshop facilitate learning?
4. How appropriate were the objectives?
5. Were participants satisfied?
6. What problems arose?
7. How well did staff perform their roles?

Information Collection Procedures
A. Trainers maintain a checklist of activities and problems
B. Participants complete a registration form
C. Selected "key informants" maintain a log and respond to a questionnaire
D. All participants complete a brief end-of-session questionnaire
E. Staff administer a self-scored and anonymously reported learning test
F. Analysis of selected workshop products

["X"s in the matrix show which procedures address which questions; the schedule in Example (B) shows how each procedure works.]
EXAMPLE
HOW EACH PROCEDURE WORKS (B)

A. Trainers maintain checklist
Evaluation questions addressed: #1, #6
Schedule: One form completed at end of each day
Respondents: Group trainers (each one)
Instruments used: Checklist with space for problem notation

B. Participants complete registration form
Evaluation questions addressed: #2
Schedule: Filled in at registration
Respondents: Participants (each who registers)

C. Key informants
Evaluation questions addressed: #1, #3-7
Schedule: Logs completed daily on the spot; questionnaire twice daily
Respondents: Key informants (two selected from each group)

D. All participants complete a questionnaire
Evaluation questions addressed: #3-7
Schedule: Complete before lunch on last day of session
Respondents: Participants (all who remain until lunch on last day)
Instruments used: Two-page Likert-type questionnaire

E. Learning test
Evaluation questions addressed: #3
Schedule: Administer and self-score on morning of last day; discuss results
Respondents: Participants (all)
Instruments used: Multiple-choice test

F. Analysis of products
Evaluation questions addressed: #3
Schedule: Selected products are analyzed the day after the session, then all products are returned to participants
Respondents: Workshop evaluator (random sample; 3 products per group)
Instruments used: Work sample quality checklist
WHAT TO NOTICE

1. The plan has two parts. The matrix (A) shows which collection procedures address which evaluation questions. The schedule (B) shows how and when each procedure works.
2. This plan depends on your having at least one evaluation question. Because your choice of procedures and decisions about how they will be applied depends on the kind of information you need, you must have selected at least one evaluation question to get started on your collection plan. Certainly, you may have more than one, even a set of questions which define the entire scope of the evaluation. Nevertheless, even with just a few questions you can begin to decide how to collect information in an efficient and comprehensive way. You can then add to your plan as the evaluation proceeds.
3. The matrix plan shows which procedures (rows) are intended to collect information for which questions (columns); "X"s in the cells show this. (A brief sketch after this list shows one way to record such a matrix.)
4. Some procedures (like C and D) respond to more than one question.
5. Some evaluation questions (#1, for example) are addressed by more than one procedure.
6. A good plan blends economy with thoroughness. That is, the plan reflects an effort to use only a few procedures to answer many questions (economy), as well as an interest in obtaining complete information for each question by using a variety of sources (thoroughness).
7. There are many types of procedures for collecting information. Some general sources of information are persons, products, and processes. For each source there are a number of specific ways of collecting information, for example, questionnaires, tests, record analysis, interviews, and observations. (See aid #8.)
8. In addition to specifying how information will be collected (procedure), the plan indicates who will collect it (administrator), when it will be collected (schedule), from whom (respondent and sample), and what instruments are needed.
9. Samples of information sources vary in kind, in size, and in the proportion of the population accessed. It's often wise to use samples. Small purposive samples (see aid #10) are especially useful in training evaluations.
10. The plan investigates several different kinds of information. Behavior, perceptions, demographic characteristics, and products are used as indicators (see aid #9) of workshop progress and effects.
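As a rough illustration of the matrix idea, here is a minimal sketch (in Python; it is not part of the original materials) that records which procedures address which questions, using abbreviated procedure names and the question numbers from the example plan, and prints the "X" matrix:

    # Each procedure (row) maps to the set of evaluation questions (columns) it addresses.
    # The mapping follows Example (B); names are abbreviated for illustration.
    procedures = {
        "A. Trainer checklist":         {1, 6},
        "B. Registration form":         {2},
        "C. Key informant logs":        {1, 3, 4, 5, 6, 7},
        "D. End-of-session survey":     {3, 4, 5, 6, 7},
        "E. Self-scored learning test": {3},
        "F. Product analysis":          {3},
    }

    questions = range(1, 8)
    for name, covered in procedures.items():
        row = " ".join("X" if q in covered else "." for q in questions)
        print(f"{name:<30} {row}")

    # Thoroughness check: every question should be addressed by at least one procedure.
    uncovered = [q for q in questions if not any(q in c for c in procedures.values())]
    print("Questions not yet covered:", uncovered or "none")

A listing like this makes the economy-versus-thoroughness trade-off visible at a glance: sparse rows suggest economy, and empty columns flag questions that still need a source.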
WORKSHEET
OVERALL INFORMATION COLLECTION PLAN
Write evaluation
questions here (see
Product #2)
List information
collection procedures here
(Use aids #8 and 9)
AID #8
SOME KINDS OF INFORMATION COLLECTION PROCEDURES
Procedure
Accretion, Erosion
Analysis
What it Measures
or Records
Apparent wear or
accumulation on physical
objects
Example
Learning center materials
are inventoried before and
after a workshop to
determine usage or removal
Artifacts Analysis
Case Studies
Interviews, Group or
Individual
Department chair
interviews students about
course adequacy
Panels, Hearings
Opinions, ideas
Records Analysis
Logs
Practicum students
maintain a log of activities
Simulations, "In
Baskets"
Persons' behaviors in
simulated settings
Sociograms
Systems Analysis
Components and
subcomponents and their
functional
interdependencies are
defined
An evaluator interviews
staff about program and
depicts these perceptions in
a systems analysis scheme
Procedure
What it Measures
or Records
Advisory, Advocate
Teams
Judicial Review
Behavior Observation
Checklist
Interaction Analysis
Inventory Checklist
Judgmental Ratings
Respondent's ratings of
quality, effort, etc.
Knowledge Tests
Opinion Survey
Performance Tests
and Analysis
Q-sorts, Delphi
Perceived priorities
Self- Ratings
Survey Questionnaire
Demographic
characteristics,
self-reported variables
Frequencies of key
practicum behaviors of
students are charted over
the course of a new
semester-long seminar
Example
AID #9
SOME SOURCES OF INFORMATION
When
Evaluation
Question Asks
About ...
Needs and
goals
characteristics of
job descriptions,
proposals, plans,
reports
policies, rules
demographic data
beliefs, values
normative data
current skill,
knowledge
levels, amount
of training
patronage patterns
expert opinions
criteria, laws,
guidelines
nature and
frequency of
problems
rates of use,
production,
incidences
kind of clients
served
consumer
preferences,
wants
perceived priorities
Training
strategies
and designs
characteristics of
plans, proposals,
user's guides,
instructor's
manuals
records of
resources
available,
programs in use
training literature
reports from
commissions,
task groups
demographic data
user/trainee
preferences,
convenience,
needs
recommendations
from task
groups, leaders
Implementation
of training
attendance rates
and patterns
usage of materials,
resources
perceptions of
observers
trainer behaviors
perceptions of
trainees
transactions
(verbal, other)
wear and tear on
materials
trainee behaviors
perceptions of
trainers
discard rates and
nature
Immediate
outcomes
materials produced
in training
trainer ratings
observer ratings
knowledge (i.e.,
test scores)
trainee ratings
self-report ratings
performance in
simulated tasks
pre/post changes
in test scores
On-job usage
of training
outcomes
nature and
frequency of
usage
peer opinions
records of use,
behavior
trainee perceptions
observed behavior
performance
ratings
supervisor opinions
quality of work
samples
test scores
transactions of
trainees with
others
When
Evaluation
Question Asks
About ...
Impact (worth)
of training
performance
ratings
performance
of clients
(e.g., test scores)
opinions of
experts, visitors,
observers
promotions records
perceptions of
clients, peers,
relatives
consumer opinions
quality of work
samples
treatment, sales
records
WORKSHEET
HOW EACH PROCEDURE WORKS
Directions: Fill in each column below for each procedure listed in worksheet A.

Procedure | Evaluation question addressed | Schedule (when, how, where) | Sample (kind and size; see aid #10) | Respondents | Instruments used
AID #10
SOME KINDS OF SAMPLING METHODS
USEFUL IN EVALUATION OF TRAINING
Random Methods:
Purposive Methods:
TIPS
COLLECTING INFORMATION
You will notice a tension between having an efficient and practical plan and having
one which is thorough. That's common in evaluation. Strike a balance. Make
conscious decisions and record rationale. Discuss with client/audience. If good
compromise can't be made, consider not doing evaluation or refocusing it.
You may want to go back to review the evaluation questions at this point, e.g., revise,
rearrange, add to, or eliminate. Before you eliminate because you can't think of a
procedure to collect information to answer it, talk to others experienced in data
collection.
Here's a good place for consultant help!
If you've been working closely with audiences, it may be a good idea to go back now
and show them this plan. Or you may wait until you've done an Analysis and
Interpretation Plan and show them the two together, since they are closely
related.
Use existing data where it's available. It's cheap, and less likely to be biased by
collecting for evaluation purposes.
Be sure you've thought about information collection procedures that are already
available in your program-e.g., tests, annual surveys, etc., that could be useful for
this evaluation. And look at records that can be analyzed. Caution: don't include
those procedures if they aren't useful.
Because you may want to rearrange and revise the plan as you go, it may be easier to
use the format of the worksheet on a larger sheet of paper. Newsprint is a handy
size, particularly if you're working with a group.
Note that each procedure row of a worksheet defines the content of an instrument.
For more information about drafting and piloting instruments, see the
Sourcebook.
CHECKLIST
COLLECTING INFORMATION
Answer the following questions and revise the worksheet until you can answer
"yes" to all of them.
YES
NO
Will the cost of any procedure be worth it, given the amount and
kind of information it will provide (consider staff time,
implementation costs, etc.)?
MOVING ON
Check to see whether you need to do any of the following things before
moving on to develop an Analysis and Interpretation Plan.
___ Check over the information you'll be collecting to see if you'll also be
able to answer some other important questions with the information.
___ Check back with your audience (or your advisory committee or a key
person or two) to reconsider evaluation questions. (You can put the
revised questions on a worksheet that appears in the Analysis and
Interpretation Plan section.)
___ Do a more careful description of the object if you have trouble
thinking about what kinds of information are available and most
appropriate.
_______ Do a rough cost estimate and reconsider purposes. (If you suspect
you need to do this, look in the Management Plan section for
ideas.)
___ Refer to other sources of information (e.g., a local expert or some of
the references below).
SOURCEBOOK REFERENCES

Collecting Information (Sourcebook pages 77-99 and 108-115)

KEY ISSUES
1. What kinds of information should you collect?
2. What procedures should you use to collect needed information?
3. How much information should you collect?
4. How do you develop instruments?
5. How do you organize the information collection effort to get the most information at the lowest cost?

TASKS
1. Determine the information sources you will use
2. Collect information
3. Design an economical information collection procedure
PRODUCT 4

ANALYSIS AND INTERPRETATION PLAN
PRODUCT OVERVIEW
[Task steps: Prepare, Try, Improve, with procedures and objectives, including an Analysis and Interpretation Plan]
EXAMPLE
ANALYSIS AND INTERPRETATION PLAN (A):
PRELIMINARY HANDLING AND ANALYSIS STEPS (EXCERPT ONLY)

Participants complete registration form
Preliminary analysis, coding: Coded by project secretary; add workshop # code to each set; maintain running summary
Storage and retrieval: Filed by workshop, alphabetically by participant

Self-scored multiple-choice test
Preliminary analysis, coding: Compute item frequency distribution; compute mean score per small group and per workshop; maintain item analysis on a cumulative basis for all workshops
Storage and retrieval: File by workshop

"Key informants'" logs
Preliminary analysis, coding: Content analyze only logs with at least 200-word entries for each day; prepare log summaries
Storage and retrieval: File by workshop
EXAMPLE
ANALYSIS AND INTERPRETATION PLAN (B):
ANALYSIS AND INTERPRETATION PROCEDURES (EXCERPT ONLY)

1. What are the characteristics of workshop participants?
Information collection procedures: Registration form administered to participants at the beginning of the workshop
Analysis procedure: Frequencies and percentages by position, certification status, and number of years in the district; total number of participants
Evaluation criteria: Characteristics of target audiences; at least 70% of teachers and administrators; at least 60% of the non-certified teachers; equal distribution across seniority levels
Procedures for interpretation: Inservice director compares findings with criteria

2. Were participants satisfied?
Information collection procedures: Key informant logs; participant questionnaires
Analysis procedure: Content analysis of logs; frequency analysis of selected questionnaire items
Evaluation criteria: No key informants should consider the entire workshop unsatisfactory; dissatisfaction should be limited to minor workshop elements; mean satisfaction ratings should be greater than 3 on a 5-point scale
Procedures for interpretation: Discussion of findings with key informants; review of satisfaction items by director and staff; comparison of mean rating to criterion

3. What problems arose during the delivery of the workshop?
Information collection procedures: Participant questionnaires; interviews with key informants; staff checklist
Analysis procedure: Content analysis of notes, logs, and relevant items on questionnaires to identify cited problems, how they developed, and their effects
Evaluation criteria: "Problems" such as unproductive diversion from schedule; inadequate facility (e.g., space, A-V equipment); materials not prepared; staff unable to answer questions or make appropriate referrals
Procedures for interpretation: Inservice director reviews findings at staff meeting when plans for workshop revision are discussed
WHAT TO NOTICE

1. The example is an excerpt; there is not enough space to show you the entire plan. The excerpt is probably enough to give you the idea.
2. The product has two parts. The first shows how information from each collection procedure is verified, prepared and stored. The other example shows how information from several procedures is analyzed and interpreted to "answer" evaluation questions.
3. There are a variety of criteria that can be used to make judgments. They include professional standards, goals or objectives, laws, past performance, needs, or performance of a comparison group. Because of this diversity of possibilities, it's important to know what (and whose) criteria are being applied.
4. Several criteria are often used. That, of course, may make the evaluation more complicated, but having different value perspectives can also strengthen it.
5. A variety of procedures for making value interpretations are possible. Individuals or groups may be involved; the evaluator may make the judgments or audiences may reserve that right. Because this is a critical part of the evaluation, it's important to be clear about how the judgments will be made and by whom.
6. The analysis and interpretation steps may be distinct or may blend, depending on what kind of evaluation question is asked. When the question asks for descriptive information (e.g., question #1), the two steps are relatively separate. That is, one can answer the question by analyzing the findings without referring to values or making judgments about it. Then evaluative criteria are applied in order to interpret the findings and make them useful for program planning. Question #2 combines those two steps because the analysis involves not only manipulation of the collected (raw) information, but also some comparison of it to some criteria for judging its implications for the worth of the object.
7. The criteria may be specified in detail before information collection begins or may be defined generally at first, then in more detail when findings are known. The preferred practice is to specify as much as possible, as soon as possible. The criteria can become part of the agreement between the evaluator and the audiences and can help direct the evaluation in useful ways. However, it may not be possible to determine detailed criteria for some questions. For example, question #3 has some criteria listed on the table, but other, unanticipated problems should be identified if the answer to the question is to be useful for improving the workshop. Thus, the analysis procedure and the criteria need to be open enough to detect them. (A brief sketch after this list illustrates applying stated criteria to findings.)
8. To "answer" some evaluation questions, it is necessary to draw on the answers to other questions. For example, the answer to a general question about how well the workshop was run could depend on answers to questions about each of the workshop modules. When answering other questions, too, you may find that information you didn't intend to use (i.e., it's not in your collection plan) can and should be applied. Looking for and making necessary revisions to the plan is, of course, important.
9. Carrying out a good Analysis and Interpretation Plan requires attention to information handling, storage and retrieval. If those systems haven't been considered, the plan is likely to have problems.
10. Preparing data for analysis (verification, etc.) makes sure that information is complete enough to be worth the trouble of analysis.
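To make the analysis-versus-interpretation distinction concrete, here is a minimal sketch (in Python, with invented data; it is not part of the original materials) that analyzes two kinds of findings and then applies criteria like those in the example plan:

    from collections import Counter

    # Descriptive question: What are the characteristics of workshop participants?
    # Analysis step: frequencies and percentages by position (data invented for illustration).
    positions = ["teacher", "teacher", "administrator", "teacher", "aide"]
    counts = Counter(positions)
    percentages = {p: 100 * n / len(positions) for p, n in counts.items()}
    print(percentages)

    # Judgmental question: Were participants satisfied?
    # Criterion from the example plan: mean rating greater than 3 on a 5-point scale.
    ratings = [4, 5, 3, 4, 2, 5]
    mean_rating = sum(ratings) / len(ratings)
    print(f"Mean satisfaction = {mean_rating:.2f}; criterion met: {mean_rating > 3.0}")

The first computation only describes; the second becomes an interpretation the moment the mean is compared with the agreed criterion.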
WORKSHEET
PRELIMINARY HANDLING & ANALYSIS PLAN
List information
collection
procedures
Describe how
problems will be
identified,
e.g., incomplete
information and
errors
Describe
preliminary
analysis
WORKSHEET
ANALYSIS & INTERPRETATION PLAN
List evaluation
questions
List information
collection
procedures for
each question
Describe analysis
procedure(s)
Describe
procedures for
making the
judgments
(see aid #11)
AID #11
SOME COMMONLY USED CRITERIA AND METHODS
FOR INTERPRETING INFORMATION ABOUT TRAINING PROGRAMS
Criteria
TIPS
ANALYZING AND INTERPRETING INFORMATION
When you have trouble identifying analysis criteria or procedures with an evaluation
question, try breaking the question into "smaller" (finer, more specific) questions.
This will resolve ambiguity or conflicts (but may not be easy!).
It's not too late to review other parts of the design:
Evaluation questions:
do you need to subdivide some or add others based on the analysis you want to
do?
audiences: should audiences be added to provide balance in value bases or
criteria?
sources of information: do you need to add sources in order to answer the
evaluation questions?
Look at the matrix worksheet in the Information Collection Plan section to help you organize your thoughts about analysis.
Each column (evaluation questions) should represent information you will use to
address each evaluation question.
Preliminary analyses are done for each row (information collection procedure).
Seek persons with skills in information analysis and interpretation.
As you decide what criteria will be applied, realize there are many options. The point
is to have good rationale for selecting the basis on which judgments will be made
and to record those decisions.
This plan should contain your first crack at what will be done. Additional analysis
may later seem worthwhile to explore the information further.
Analysis procedures you choose do not have to involve complex statistics. In fact,
they rarely do in program evaluations. Choose procedures that make sense in your
situation. If you do use complex procedures, be sure you explain them in your
reports for readers who are not familiar with them.
CHECKLIST
ANALYZING AND INTERPRETING INFORMATION
Answer the following questions and revise the worksheet until you can answer
"yes" to all of them.
YES NO
Are your analysis plans likely to help you answer your evaluation questions?
MOVING ON
Check to see if you need to do any of the following before moving on to
develop a Report Plan.
___ Check over your worksheets to find the areas where you are
uncertain about how to analyze and interpret information you
collect.
___ Reconsider the evaluation questions associated with the problem areas. Break those questions and subquestions into sets of smaller questions. (You might want to review the "What to Notice" and "Example" in the section on Outline of Evaluation Questions to see how to do that.) With a little ingenuity (and consultation with audience members) you can probably arrive at a set of small questions that are relatively easy to answer. (It is possible to break any complex question into a set of questions that can be answered by "yes" or "no." Analysis, after all, means breaking things down into simpler components.)
___ Seek help from persons with greater (or different) skills in data analysis. Expert evaluators seek such help regularly, so please consider it all right to do.
___ Refer to other sources of information (e.g., an expert in values
clarification or research literature in your area).
SOURCEBOOK REFERENCES

Analyzing and Interpreting Evaluation Information (Sourcebook pages 119-147)
Tasks include verifying the completeness of returned data, analyzing the information, and interpreting it.
",
"
"-
.........
.........
'\
"-
'\
'\
'\
"-
" '\
'\
"-
...........
.............
~
"'-
'"
--
PRODUCT 5
---
'\
REPORT PLAN
~
~
~
~
~
o
o
3.
4.
5.
6.
7.
81
82
A DESIGN MANUAL
PRODUCT OVERVIEW
[Task steps: Prepare, Try, Improve, with procedures and objectives, including a Report Plan and an improved or clarified plan]
EXAMPLE
REPORT PLAN

Audience: Department Head
1. Progress update - memorandum and presentation; beginning of each month; presentation at staff meeting, with one-page written summary; also a bimonthly meeting with the Project Director
2. Review of design, findings, interpretations, implications - written report; 60 days after project ends; final report

Audience: Department Staff
1. Evaluation design - meeting; 30 days after project begins; department meeting
2. Periodic, informal updates on progress - informal discussions and interviews; unscheduled and concurrent with information collection (oral; see Management Plan); also a quarterly department meeting
3. Review of findings and implications - presentation with executive summary; 60 days after project ends; faculty meeting

Audience: Advisory Committee
1. Evaluation design, progress - memorandum with one-page summary; quarterly; presentation at committee meetings
2. Review of design, findings, interpretations, implications - written report; 60 days after project ends; final report

Audience: Funding Agent
1. Progress, resources used, problems, next steps - 5-page written report; quarterly; report is mailed, with a follow-up phone call
2. Same as other final report, with more detailed design and findings included, plus implications - written report with instruments, etc.; 90 days after project ends; final report with technical material
WHAT TO NOTICE

1. A report is any communication with an audience about the evaluation plans, progress, or findings. This broadens the scope of reporting activities well beyond the preparation of a single end-of-the-evaluation report.
2. The communication may be written or oral. A report can be as simple and informal as a five-minute presentation at a meeting.
3. A report is made each time there is formal contact with an audience. Thus, in some cases those reports are concurrent with information collection efforts or other activities not focusing on the reporting function.
4. Brevity without losing important information is the general rule. That means that in some instances a one-page summary of the report is used, even though you know there's much more complete information in another document available to those who are interested in it. This strategy tailors report length to individual readers.
5. There are a variety of audiences. Successful evaluation depends on a broad base of informed support. Reporting to a broad array of audiences, just to inform them, can maintain support and commitment to the evaluation effort. (A brief sketch after this list shows one way to keep such a reporting schedule.)
WORKSHEET
PLANNING REPORTS
List evaluation audiences
Describe content of reports (use aid #12)
Identify format of report to be used (use aid #12)
List date or frequency (use aid #12)
Identify event associated with report
AID #12
EXAMPLES OF EVALUATION REPORT CONTENT,
SCHEDULE BASES AND FORMATS
Content Options
Evaluation design, intents
Work accomplished
Problems encountered
Revisions to plans
Budget
Resources used
Future activities
Findings
Instruments used
Information collection procedures
Interpretations
Recommendations
Schedule Bases
Calendar periods (e.g., quarterly)
Key decision and other audience
events (e.g., a board meeting)
Stages of evaluation progress
(e.g., design, completion of analysis)
Key program events (e.g., after each
workshop)
Opportunistically (e.g., an invited
speech)
Formats
Written documents
technical reports
interim reports
conference proceedings
memoranda
letters
professional journals
publications
Media releases
press
tv
radio
Meetings
small group discussions
presentations
luncheons
hearing
panel reviews
Leaflets
newsletters
pamphlets
Audio-visual
films
filmstrips
tapes
overhead transparencies
TIPS
PLANNING REPORTS
Before you simply copy the list of audiences from previous worksheets, consider
whether there are others that should be added or some that should be
eliminated.
Although there may be many informal communications about the evaluation, it will
be worthwhile to plan those that are particularly important.
Although you may be reluctant to make reports to some audiences, remember that the consequences of failing to report can be more damaging: your critics may draw unwarranted conclusions.
Be sure your Management Plan describes who will write reports.
The Analysis and Interpretation Plan should specify who has access to information.
(Be sure your report writers do.)
All audiences with a right to know about the evaluation should be listed.
When planning the content of the report aim for
balance
clarity
objectivity
CHECKLIST
PLANNING REPORTS
YES
NO
If the plan is put into operation, will the audiences receive the
MOVING ON
Check to see if you need to do any of the following before moving on to
develop a Management Plan
___ Look back at the worksheet on audiences and their interests. See
whether the data you'll collect, the criteria you'll use to interpret it,
and the way you'll report it will be responsive to those interests.
___ If there are conflicts of interests or values that you don't know how
to resolve yet, consider adding a debate (by audience members) to
your Report Plan so the conflicting groups can fight it out among
themselves rather than attack your work.
___ Refer to other sources of information (e.g., an expert on communications or report writing).
SOURCEBOOK REFERENCES

Reporting (Sourcebook pages 151-173)
Key issues include what should be included in an evaluation report, how reports should be delivered, and how reporting should be scheduled. Tasks include planning post-report discussions, consultation, and follow-up activities.
:\-------,
I '\.
'
I.-----L-----,~----'
I
I
I
I
~-----\
PRODUCT 6
MANAGEMENT PLAN
~
~
~
~
~
~
1. Evaluation Preview
2. Outline of Evaluation Question
3.
4.
5.
6.
7.
\
\---------------------~
91
92
A DESIG N MANUA L
1. A Management Plan helps in keeping track of all the details involved in evaluating and in achieving a reasonably smooth and efficient flow of activities.
2. A Management Plan can be a simple personal or project calendar or a complex set of documents describing budgets, time lines, work plans, etc.

TASK STEPS
[Prepare, Try, Improve, with procedures and objectives]
EXAMPLE
A MANAGEMENT PLAN IN TWO PARTS

I. EVALUATION WORKPLAN

A. Delineation of Information
1. Site visits
2. Gather information
3. Interact with evaluation [committee] members
4. Interview legislative personnel
5. Write needs assessment instrument

B. Needs Assessment
1. Select sample
2. Administer needs assessment instrument
3. Analyze data
4. Write priorities report
5. Distribute findings
6. Obtain comments
7. Write final report

C. Summative Evaluation
1. Select sample
2. Develop summative instrument
3. Administer summative instrument
4. Analyze data
5. Conduct site visits
6. Analyze site visit data
7. Write summative report

D. Project Administration
1. Negotiate contract with Evaluation Committee
2. Recruit/train evaluation Associates & Assistants
3. Organize communication procedures
4. Revise evaluation design as needed
5. Supervise operations
6. Synthesize findings and prepare interim and final reports
7. Communicate findings to key audiences

[For each activity, the workplan also shows the person(s) responsible and an "X" in the months during which the activity occurs.]
II. EVALUATION BUDGET

Personnel
A. ... $4,950
B. ... 3,000
C. ... 2,250
D. Secretary (25% of $8,000/yr x 1/2 year) 1,000
Personnel Subtotal $11,200
Fringe (22.1% of A, B, D) 1,977

Travel and Lodging
A. ... 102
B. Lunch for 6 persons during initial site visits @ $3.50 per person 21
C. Travel to Springfield
   2 trips to the Eval. Bd. (160 mi @ .17) 54
   1 trip for interviews (160 mi @ .17) 27
   1 trip to Huntsville (400 mi @ .17) 68
D. ... (x 2) 200
E. Site Visits
   1. Air fare (2 trips x $120 x 2) 480
   2. Car travel (17 trips @ 400 mile ave. x .17/mile) 1,156
   3. Per diem (22 trips @ $15/day x 1 day x 2) 660
      (11 trips @ $35/day hotel x 2) 770
      (11 trips @ $7.50/meal x 2) 165
   4. Ground transportation (2 trips x $15) 30
Travel and Lodging Subtotal $3,733

Materials and Supplies
A. ... 60
B. ... 600
C. Postage
   (3 mailings and 2 return mailings @ $.30 x 400/mailing) 600
   (office mailing $5/month x 6 months) 30
D. Copying
   ($100/month x 6 months) 600
   (instruments: 400 x $.25 x 2) 200
Materials and Supplies Subtotal $2,090

Total Direct Costs $19,000
Indirect Costs ($6,771) -0-
TOTAL $19,000
Indirect costs of $6,771 are considered as cost sharing by the University on this project.
1. The example Management Plan has two parts: (1) an evaluation workplan showing tasks, schedules, and
responsibilities, and (2) a budget showing resources needed for evaluation tasks.
2. The Plan shows the major activities of the evaluation and when they will occur. This
makes it easy to see the sequence of events and what has to be done in
preparation for certain activities. An alternative is to categorize by major
types, such as reporting or staff training. This may be preferable in
evaluations when those categories are performed by distinct units or
when a budget has those same categories and you want to be able to use
the documents together.
3. The level of detail regarding activities varies. For some activities there are sub-activities, which serve as a reminder of what needs to be done and as a way
of distinguishing responsibilities of individuals who are involved in the
same general activity. Often the management plan has only a general list
of far-future activities, while the current or near-future ones are specified
in more detail.
4. The budget includes basic categories of expenditures: personnel, travel, materials
and supplies, and indirect costs. Depending on the size of the program,
you may want to have separate categories for consultants, materials, and
supplies. Another alternative is to organize the budget by major program
activities or components, show the breakdown by those same general
categories in each, and follow with a summary across components at the
end. (A small sketch of this kind of budget roll-up follows these notes.)
5. The example shown is not the only way to record a management plan. PERT charts,
Gantt charts, and other devices work, too. But any plan needs to have
details about tasks, timelines, responsibilities, and resources.
6. Management plans should be roughly as complex as the evaluation effort. The
example portrays a relatively large and complex evaluation effort, probably more costly than many. A simpler evaluation would have a cheaper
and simpler plan (but it wouldn't let us show off the forms as well!).
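The roll-up arithmetic described in note 4 can be sketched in a few lines of code. This is only an illustration, not part of the manual's procedure; the category names, line items, and dollar figures below are hypothetical, and the 22.1% fringe rate is simply borrowed from the example above.

```python
# Minimal sketch of budget roll-up: line items grouped by category, a fringe
# rate applied to the personnel subtotal, and a total of direct costs.
# All names and amounts are hypothetical.

FRINGE_RATE = 0.221  # assumed institutional fringe-benefit rate

budget = {
    "Personnel": [("Evaluator", 4950), ("Associate", 3000), ("Secretary", 1000)],
    "Travel and Lodging": [("Site visits", 1156), ("Per diem", 660)],
    "Materials and Supplies": [("Postage", 600), ("Copying", 600)],
}

def category_subtotal(items):
    """Sum the dollar amounts in one budget category."""
    return sum(amount for _, amount in items)

personnel = category_subtotal(budget["Personnel"])
fringe = round(personnel * FRINGE_RATE)          # fringe applies to personnel only
direct_costs = fringe + sum(category_subtotal(items) for items in budget.values())

for category, items in budget.items():
    print(f"{category:<22} {category_subtotal(items):>8,}")
print(f"{'Fringe':<22} {fringe:>8,}")
print(f"{'Total direct costs':<22} {direct_costs:>8,}")
```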
WORKSHEET
TASK ANALYSIS
Task
Begin/end
Date
Task:
/
/
/
/
/
/
/
Task:
/
/
/
/
/
/
Task:
/
/
/
/
/
/
/
Name the
person(s) who will
do each sub-task
Estimate
personnel
cost
List other
resources
needed
Estimate cost
for other
resources
AID #13
SOME TYPICAL EVALUATION TASKS & SUB-TASKS;
AND SOME USUAL COST ITEMS
TASKS & SUB-TASKS
Focusing/Designing
site visits
meetings with clients
drafting designs
review meetings, panels
contract negotiation
legal review
planning meetings
review literature
Reporting
plan reports
prepare media
write reports
conduct meetings
schedule presentations
draw charts, figures, etc.
proofread
conduct trial sessions
Information Collection
inventory available information
determine sample sizes, parameters
draw/create samples
select instruments
pilot test instruments, procedures
train observers, raters, analyzers, etc.
conduct interviews, observations,
tests, etc.
distribute forms, instruments, (mail,
etc.)
conduct site visits (travel, etc.)
Management
recruit, train staff
negotiate contract
supervise
conduct meetings
public relations tasks
bookkeeping
COST ITEMS
personnel services
typing
copying
space
equipment
computer time
coding, verification
access fees
rental
refreshments
AV rental
graphics
programming
computer cards
postage
airfare
mileage
WORKSHEET
REVISED TASK CHART
Identify person( s)
responsible
WORKSHEET
BUDGET
Directions: Complete the following budget, using the categories of expenditures.
Personnel
Subtotal
Fringe (institutional percentage of cost)
Travel
Communications
TOTAL
TIPS
PLANNING THE MANAGEMENT PLAN
Don't shy away from a Management Plan because you think your evaluation is too
simple for one. A Management Plan can be as simple as a set of dates in your
calendar. It's important to have some timeline for the evaluation and to know who
will do what and what resources are needed.
Like the other products, this one will need to evolve. As tasks approach, you can be
more specific about how they will be done and by whom-and what their sub-tasks
are.
To write your time line, try starting with the last task(s). When does it have to be
done (e.g., when is the final report due; when does a decision based on this
evaluation have to be made)? Then plot the tasks backward from there to the
beginning of the time line.
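As a small illustration of working backward from the last task, the sketch below starts from an assumed report due date and plots tasks back toward a start date. The task names, durations, and the due date are hypothetical.

```python
# Minimal sketch of backward scheduling: the last task ends on the due date,
# and each earlier task ends when the next one begins.
from datetime import date, timedelta

REPORT_DUE = date(2024, 6, 30)  # assumed final deadline

# Tasks in the order they must happen; durations in days (rough guesses).
tasks = [
    ("Design evaluation", 10),
    ("Develop instruments", 15),
    ("Collect information", 20),
    ("Analyze and interpret", 10),
    ("Write final report", 10),
]

end = REPORT_DUE
schedule = []
for name, days in reversed(tasks):
    start = end - timedelta(days=days)
    schedule.append((name, start, end))
    end = start

for name, start, finish in reversed(schedule):
    print(f"{name:<22} {start} -> {finish}")
```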
After drafting the Management Plan, review the entire evaluation design:
is it worth the cost?
must you modify the evaluation design to make it consistent with the available
funds?
is it necessary to terminate the evaluation because the scaled down version will be
useless?
Although this plan may have some of the same elements as an evaluation contract
(budget, responsibilities of personnel, tasks), it should not be substituted for one.
When writing general tasks, consider the following categories:
products needed
evaluation questions to be answered
audiences to be served
objectives of the evaluation
phases of the program
stages of the evaluation: designing, collecting, and analyzing information, and
reporting
CHECKLIST
PLANNING THE MANAGEMENT PLAN
Answer the following questions and revise the worksheet until you can answer
"yes" to all of them.
YES
NO
Have you listed all the tasks that are preconditions of other
tasks?
If all those tasks are carried out, will you have completed the
evaluation?
Are the budget totals within the limits you've been given?
MOVING ON
Check to see if you need to do any of the following before moving on to
looking into a Plan for Evaluating the Evaluation.
___ Renegotiate purposes. Ask key audience members, "Is this really
worth the effort? Now that we see in detail what's involved, do we
want to go ahead with it?"
___ Consider dropping (but check with audiences) some of the evaluation
purposes and questions if things are looking too costly. Maybe you
can do something less, but still plenty worthwhile.
___ Write up a clear (or revised) preview and share it with others.
______ Check around again to see if any of the information you need is
already available somewhere.
______ Prepare more management products (e.g., Pert chart, Gantt chart,
timeline).
____ Get a good administrator to review your Management Plan.
___ Refer to some of the references below.
SOURCEBOOK REFERENCES

FUNCTION: Management (Sourcebook pages 175-200; tasks at 175-180, 181-186, 187-190, 191-196, and 197-200)

KEY ISSUES include who should manage the evaluation and how much the evaluation should cost.

TASKS include: 2. Draw up a contract or letter of agreement (181-186); 4. Draft a time/task strategy (191-196).
PRODUCT 7
PLAN FOR EVALUATING THE EVALUATION
PRODUCT OVERVIEW
(McGraw-Hill, 1981).
4. An evaluation design can be evaluated before it is implemented, while it
is being implemented, or after the evaluation is completed.
TASK STEPS
Procedure
Objectives
Prepare
Try
Improve
3. An improved or clarified plan for the evaluation of an evaluation
EXAMPLE
A PLAN FOR EVALUATING AN EVALUATION
From: Evaluator
To:
Director
Here, as I understand it, is what we agreed on for our "meta-evaluation".
A. Evaluation of the Evaluation Design
1. Purpose: To revise design as needed.
2. Methods: Dr. Schwartz from the University will prepare a written critique,
then present this in a review session with the project evaluator and
workshop staff.
3. Criteria: Dr. Schwartz will base her review on the Joint Committee
Standards. The evaluation design must also be judged to be
absolutely no more costly than necessary to meet its purposes as
specified by the Superintendent.
B. Evaluating During the Evaluation
1. Purpose: To ensure adherence to proper and ethical standards for information collection.
2. Methods: The two consultant visits planned have been dropped due to cost
problems. The evaluator will attend one of the workshops to
monitor evaluation activities. Also, we will devote at least part of
two staff meetings to discuss how the evaluation is going.
3. Criteria: Good information collection practices; efficiency (see Standards).
Standards for Evaluations of Educational Programs, Projects, and Materials, McGraw-Hill, 1981.
1. The meta-evaluation is planned in three parts: before, during and after the
evaluation.
2. Methods used can be quite formal or informal. In general, where more external
WORKSHEET
PLANNING THE EVALUATION OF THE EVALUATION
Elements
of Plan
Evaluate your
evaluation design
Evaluate processes
(during) your
evaluation
Evaluate the
results/ products
of your evaluation
Purpose
Method
Persons &
resources
needed
Relevant
standards
& criteria
AID #14
PARTS OF THE EVALUATION THAT CAN BE EVALUATED
Below is a grid describing some of the tasks a meta-evaluator might be asked to do at any point in an
evaluation. The important point is that it is never too soon or too late to question the soundness and the worth
of your evaluation.

Focus of Meta-Evaluation (columns):
Evaluating Evaluation Designs
Evaluating Evaluation in Progress
Evaluating Evaluation after Its Completion

Evaluation Functions (rows):
Focusing Evaluation
Designing Evaluation
Collecting Information
Analyzing Information
Reporting Information
Managing Evaluation

Sample entries from the grid:
Focusing Evaluation (evaluation in progress): to determine whether selected questions and purposes are being pursued; to evaluate how worthwhile they are.
Designing Evaluation (evaluation in progress): to evaluate the effectiveness of the design being implemented; to help monitor or revise it if necessary.
AID #15
SOME METHODS FOR META-EVALUATION
To evaluate evaluation designs
use of checklists, aids
formal written critiques
consultant expert reviews
staff reviews
field tests, pilot studies
conferences, hearings, panels
presentation at professional meetings
comparison to other evaluations
To evaluate evaluations in progress
follow-up data collection
verification studies
re-analysis
monitor procedures (e.g., by observer)
spot checks of data analyses, information collection procedures, etc.
interviews with respondents
staff reviews
instrument pilot studies
document reviews
To evaluate evaluation results, uses and products
use of checklists, aids
formal written critiques
consultant expert reviews
staff reviews
field tests, pilot studies
conferences, hearings, panels
publication in journals with critiques
presentation at professional meetings
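The first method in each list above is the use of checklists or aids. As a rough sketch of what a checklist review can produce, the items below are invented stand-ins (they are not the Joint Committee Standards), recorded as simple yes/no judgments.

```python
# Minimal sketch of a checklist review of an evaluation design.
# The checklist items and judgments are hypothetical.

checklist = {
    "Evaluation questions cover all key audiences": True,
    "Information collection is feasible within the budget": True,
    "Analysis criteria are stated before data are collected": False,
    "Report dates match the decisions they are meant to inform": True,
}

passed = sum(checklist.values())
print(f"{passed} of {len(checklist)} checklist items satisfied")
for item, ok in checklist.items():
    print(("  [x] " if ok else "  [ ] ") + item)
```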
TIPS
PLANNING THE EVALUATION OF THE EVALUATION
The scope of your meta-evaluation plan depends on the scope of your primary
evaluation. If it's small and just beginning, you may do a very simple and limited
meta-evaluation; but if your evaluation is very large and expensive, more meta-evaluation may be called for.
Note that one way of evaluating your evaluation design is to use the checklist with
each product or the summary design checklist on pages 8-10.
A number of criteria can be used to judge an evaluation:
intended goals or purposes
mandates
client's or audiences' expectations
adherence to the design
criteria established by experienced evaluators
(See especially Joint Committee, Standards for Evaluations of Educational Programs, Projects, and
Materials, McGraw-Hill, 1981.)
MOVING ON
Check to see if you need to do any of these things before implementing your
evaluation design.
___ Evaluate the design by taking the evaluation questions you have
about it to an expert evaluator.
___ Take the questions you have about your design to key members of
your audience and see what they have to say.
___ Perform the role of expert evaluator yourself by studying the Joint
Committee Standards and applying them to your design.
SOURCEBOOK REFERENCES

FUNCTION: Meta-Evaluation (Sourcebook pages 205-220; tasks at 205-207, 208-209, 210-217, and 218-220)

KEY ISSUES include what criteria or standards you should use to evaluate the evaluation, and the uses of meta-evaluation.

TASKS include: 2. Select a meta-evaluator (208-209); 3. Select or negotiate standards (210-217).
APPENDIX A
SELECTING WHAT
(AN OBJECT) TO EVALUATE
PRODUCT OVERVIEW
TASK STEPS
Procedure
Objectives
Prepare
Try
Improve
3. A confirmation or rethinking of your choice
A sampler of potential objects or "things to evaluate"
EXAMPLES
POSSIBLE EVALUATION OBJECTS
EXAMPLES
TWO OBJECTS IN MORE DETAIL
Example #1-Object Name: procedures for helping students set short-term goals
Comments:
What it is-techniques used with 8-12 year-old children having learning
disabilities
How it works-a peer tutor who has mastered the technique helps them
What it's supposed to do-facilitate independence and make individualization
more feasible
Why you might want to evaluate it-some people object to it as "the blind leading
the blind." Others believe it builds self-esteem and should be used more widely.
Example #2-Object Name: Values Clarification Inservice
Comments:
What it is-a "one shot" three-hour workshop
How it works-a university professor comes in on inservice day and runs it
What it's supposed to do-clarify teachers' values and show them how to run
values clarification exercises in their classes
Why you might want to evaluate it-It has been run four times and people seem to
like it, but we don't know much about how effective it is
WORKSHEET
SELECTING AN EVALUATION OBJECT
Directions: Using the previous examples as aids, select one or more objects to
evaluate.
Possibility #1
Object Name
Comments:
What it is-
How it works-
What it's supposed to do-
Why you might want to evaluate it-

Possibility #2
Object Name
Comments:
What it is-
How it works-
What it's supposed to do-
Why you might want to evaluate it-

Possibility #3
Object Name
Comments:
What it is-
How it works-
What it's supposed to do-
Why you might want to evaluate it-
TIPS
SELECTING AN EVALUATION OBJECT
Now you should select from among the alternatives you've listed (or confirm the
wisdom of the choice you've already made).
Use the tips below and the checklist on the next page.
Consider the importance of what you are evaluating
Is the program, course, procedure, etc., one that would have a major impact if it is
effective? ineffective?
Is it so unimportant that it doesn't much matter?
Consider the visibility of the object or the evaluation effort
Is it (or would it be) highly visible and/or a matter of widespread interest to
many persons or groups?
Is it (or would it be) almost totally out of the limelight?
Consider the technical complexity or scope
Is the object a very complex one so that many different variables, questions, issues,
interest groups, and data collection and analysis techniques would need to be
considered?
Is the object a very simple one so that only a very few matters would be
involved?
Consider humanitarian issues
Does the evaluation (or does the object) have the potential for having a major
impact on the livelihood, working conditions, or value and belief systems of
people?
Does the evaluation (or does the object) specifically not deal with and/or avoid
such matters?
Consider the duration of the evaluation project
Would it need to be a long term or longitudinal effort extending over a period of
several years?
Can it be completed in a few hours, days, or weeks?
Consider organizational boundaries
Is the object (or would the evaluation be) concerned with issues that cut across
several organizational boundaries or administrative responsibilities, thereby
requiring widespread coordination or support to do?
Is the object (or would the evaluation be) concerned only with issues almost
entirely within one area of responsibility?
Consider your professional position, style, and competence
Are you ready for or in a position to take on an important, highly visible, complex,
sensitive, and lengthy project that cuts across many boundaries?
Are you cautious or in a position where you should take care to keep a low profile
and not rock the boat?
Consider whether you should try to negotiate a change in an evaluation project
you've been assigned (or change the one you've selected)
After considering some of the tips, do you believe you should try to take on a more
ambitious project?
Do you believe you should try to take on a less ambitious project?
CHECKLIST
SELECTING AN EVALUATION OBJECT
Try out alternative possibilities for projects and consider them until you can give
a tentative "yes" answer to the questions below.
YES
NO
Is the project appropriate for you given your position, style, and
competence?
MOVING ON
At this point you should have made a decision about what you will evaluate.
The decision was probably made "with reservations." Whether you are an
experienced evaluator or someone doing a first project, you lack information
about what you'll run into and whether you'll be able to cope with unexpected or
unknown problems.
Take comfort from the fact that-as with all the "early" decisions in an evaluation
project-you can change your mind later.
So, please turn back to Product 1 and start working on an Evaluation Preview.
APPENDIX B
AN EXAMPLE OF AN EVALUATION
DESIGN
INTRODUCTION
PRODUCT 1
EVALUATION PREVIEW
What is to be Evaluated
The object of the evaluation is a workshop developed and delivered by the ETC Project. It is a three day workshop
which gives participants intensive training in evaluation and time to work on their own evaluation designs.
Participants are from professional development and teacher preparation programs in colleges and universities and
local and state educational agencies. The project received funding from the federal Office of Special Education to
develop the materials used in the workshop, but must rely on registration fees to cover some delivery costs. The
Project is based at a University Evaluation Center which is interested in insuring quality and coordinating the Project
with its other activities.
HOW THE WORKSHOP WORKS
WORKSHOP AGENDA

Day One
9:00-9:30    Introduction
9:30-10:30   Evaluation design exercise
10:30-12:00  Discussion: Review of Decision Areas from Sourcebook
12:00-1:00   Lunch
1:00-2:30    Participants read selected case
2:30-3:30    Small group exercise: Participants analyze case using Decision Areas
3:30-4:30    Summary Review

Day Two
9:00-9:30    Introduction to the Design Manual
9:30-12:00   Participants (each with own Design Manual) complete Products #1 and #2
12:00-1:00   Lunch
1:00-2:30    Exercise and lecture: Measurement Planning
2:30-4:30    Participants complete Products #3 and #4

Day Three
9:00-10:00   Lecture and demonstration: Reporting
10:00-11:00  Participants complete Product #5
11:00-12:00  Panel discussion: Management
12:00-1:00   Lunch
1:00-3:30    Participants complete Products #6 and #7
3:30-4:00    Wrap-up
Evaluation Purpose
The primary purpose is to produce information which can be used to redesign subsequent versions of the workshop. A
secondary purpose is to provide impact and other accountability information to external audiences.
Audiences for the Evaluation
The primary audience is the staff, who want to conduct good, efficient training. Other audiences include: (1) the
Federal Office whose primary interest is to see their funds well used to support a quality effort, and (2) the University
Evaluation Center and Administration, who hope to promote coordination with other efforts and high quality, visible
efforts.
Constraints
The project has a small amount of funds set aside for an internal evaluator (a part-time student). The staff prefer an
internal evaluator with whom they can work closely but see the need for credibility of their self-evaluation work. The
evaluation must involve participants (federal regulation) and be completed before the end of the funding period.
PRODUCT 2
OUTLINE OF THE EVALUATION QUESTIONS
(columns: Evaluation Questions, Subquestions, Audiences, and why each question matters)

Did they think the workshop was worthwhile?
Subquestions: Was it interesting? useful?
Audiences: OSE (funders), Staff
Why it matters: The "word-of-mouth" network is strong among participants; and, if they don't like it, they won't learn it.

Is the workshop responsive to needs?

Other notes on why questions matter: needed to revise the subsequent workshop and to guide any follow-up; staff expect some rough spots, and the Director will base staff training on problem areas.

Audiences listed for the various questions include OSE (funders), the University, the Staff, and the Project Director.
PRODUCT 3
INFORMATION COLLECTION: OVERALL PLAN (A)

Evaluation questions (columns):
1. Who attended? (a. number? b. position? c. organization?)
2. Did they think it was worthwhile? (a. interesting? b. useful?)
3. Were learning objectives met?
4. What problems arose? (a. preparation? b. delivery?)
5. How costly was it? (a. costs? b. potential savings?)
6. What uses were made? (a. actual uses? b. benefits?)
7. Is the content sound? (a. evaluation methods? b. instructional design?)
8. Is it responsive to needs?

Information collection procedures (rows):
A. Participants complete registration forms at the beginning of the workshop
B. P's complete brief questionnaire at end of first day
C. Sample of P's (key respondents) discuss workshop at end; staff members take notes
D. Staff keep notes on P's use of materials, questions asked, problems
E. External reviewers rate samples of evaluation designs produced at workshop
F. At post-workshop meetings, staff discuss procedures for developing and producing materials and making arrangements
G. Evaluator compiles cost and registration fees
H. Staff members telephone-interview a sample of P's after training
I. Selected evaluation and instructional design experts review workshop materials
J. Evaluation and instructional design experts observe the workshop

An "X" in a cell shows where a procedure addresses a question; "X (a)" or "X (b)" marks a particular subquestion.
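The overall plan above is essentially a matrix of collection procedures against evaluation questions. A minimal sketch of holding that matrix in code might look like the following; the question assignments are illustrative rather than a transcription of the example.

```python
# Minimal sketch of a procedure-by-question matrix: each "X" in the chart
# becomes an entry mapping a collection procedure to the questions it addresses.
# The assignments below are illustrative.

plan = {
    "A. Registration forms":          {"1"},
    "B. End-of-day questionnaire":    {"2", "3"},
    "H. Follow-up telephone interview": {"2", "3", "6", "8"},
    "I. Expert review of materials":  {"7", "8"},
}

def procedures_for(question):
    """Return the procedures whose cells carry an X for this question."""
    return [proc for proc, questions in plan.items() if question in questions]

print(procedures_for("8"))   # procedures that speak to "Is it responsive to needs?"
```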
PRODUCT 3
INFORMATION COLLECTION: HOW EACH PROCEDURE WORKS (B)

For each procedure the plan records the evaluation questions addressed, the schedule for collection, the respondents, the sample, and the instrument(s) used. For example:

A. Participants (P's) complete registration forms at beginning of workshop: collected at registration at the beginning of the workshop; respondents are all workshop participants; instrument is the registration form.
B. P's complete brief questionnaire at end of first day.
I. Selected evaluation and instructional design experts review workshop materials: addresses questions 7 and 8; materials sent 1 week after the workshop, replies completed in 3 weeks.
J. Evaluation and instructional design experts observe: during the workshop; observers take their own notes (no instrument).

Other entries from the chart:
Evaluation questions addressed include 2, 3, 4a, 4b, 6, 7, and 8.
Schedules include the end of each of the three workshop days, the afternoon of the last day, continuous during the workshop, continuous during preparation and delivery of the workshop, ratings made two weeks after the workshop, one week after the workshop, and two months after the workshop.
Respondents include workshop participants, staff, an evaluation consultant, and expert observers.
Samples include all participants, 8-12 selected participants, approximately 1/3 of participants stratified by type of job setting, 3 of each type, 1 of each type, and N.A.
Instruments include a questionnaire, a reaction form, an interview guide, a reviewer rating form, and reviewer's guide questions.
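One of the sampling rules above calls for interviewing roughly one third of participants, stratified by type of job setting. A minimal sketch of drawing such a sample follows; the participant list and job settings are hypothetical.

```python
# Minimal sketch of a stratified sample: about 1/3 of people from each setting.
import random

participants = [
    {"name": f"P{i}", "setting": setting}
    for i, setting in enumerate(
        ["college", "college", "university", "state agency", "local agency",
         "local agency", "university", "college", "state agency"]
    )
]

def stratified_third(people, seed=0):
    """Draw about one third of people from each job setting."""
    rng = random.Random(seed)
    by_setting = {}
    for p in people:
        by_setting.setdefault(p["setting"], []).append(p)
    sample = []
    for group in by_setting.values():
        k = max(1, round(len(group) / 3))   # keep at least one per stratum
        sample.extend(rng.sample(group, k))
    return sample

for p in stratified_third(participants):
    print(p["name"], "-", p["setting"])
```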
PRODUCT 4
ANALYSIS AND INTERPRETATION PLAN
Evaluation
Questions
Collection
Procedure
Analysis
Procedure
Evaluation
Criteria
Procedures for
Making Judgments
A. Participants (P's)
complete registration forms at
beginning of
workshop.
Analyze questionnaire items regarding the four subquestions to determine frequencies.
Evaluator
compares findings
to criteria.
Analyze relevant
questionnaire
items to determine ratings of
workshop
elements. Content analyze
staff notes taken
during the
discussion,
Number of P's
needed to cover
costs; 90% of
participants
should match
characteristics
of intended
participants.
Average ratings of 3.0 or less on a 5-pt. scale are considered very low.
1. Who attended the workshop?
a. number?
b. positions?
c. organizational affiliation?
d. evaluation expert?
2. Did they think it was worthwhile?
a. interesting?
b. useful?
C above
D. Staff kept notes
on P's use of
materials, questions asked,
problems.
Content analyze staff notes to identify evidence that objectives were not met.
Summarize
reviewers' rating
sheets.
List of learning
objectives
ranked by
importance.
All major
objectives should
be achieved.
Evaluator
compares all
findings to criteria
and presents own
summary;
reviewers' ratings
also presented
separately. Project
Director makes
final determination.
Content analyze
notes from staff
logs and discussion to identify problems,
how they
developed, and
their effects.
Problems such as
confusions about
materials
inadequate
facility
unproductive
diversion from
schedule.
Evaluator summarizes information; Project Director and staff review it at a staff meeting. Consensus of staff is sought.
E. External reviewers
rate sample of
evaluation designs
during workshop.
4. What problems arose?
a. preparation?
b. delivery?
Comparison of
summarized
findings with
those from
previous
workshops,
APPENDIX B
Evaluation Questions
Collection Procedure
F. At post-workshop
meeting, staff
discuss procedures
for developing
and producing
materials and
making arrangements. (for "a")
G. Evaluator
compiles cost and
registration fees.
Analysis
Procedure
Evaluation
Criteria
Were there
unusual or
unjustified
expenditures?
Analyze items
from interview
schedule regarding uses of
materials;
determine types
of uses and
apparent effects.
Summary
presented. No
pre- set criteria
established.
Compare
workshop content to design
criteria.
Compare workshop
operation to
design criteria.
Experts selected
so that one is
familiar with
project and at
least one is
nationally
recognized but
with no association with
project or staff
members.
Content analysis
of staff notes and
reports of expert
reviews.
8. Is the workshop responsive to needs?
J. Evaluation and instructional design experts observe workshops and make reports to staff.
C above
D above
H above
I above
Procedures for
Making Judgments
Compare
expenditures to
budget and to
income from
fees.
Selected
evaluation and
instructional
design experts
review workshop
materials.
Evaluator presents findings to Director, who determines savings possibilities based upon comparisons to similar activities.
Staff discuss,
reach consensus
about adequacy,
as compared to
needs data. (Information reported
to OSE for any
judgments they
choose to make.)
Evaluator
summarizes comparison of its
content to criteria
to identify
strengths and
weaknesses.
Evaluator
compares findings
to Needs Report.
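The analysis plan above compares average questionnaire ratings to a fixed cutoff (averages of 3.0 or less on a 5-point scale are considered very low). A minimal sketch of that check follows; the workshop elements and ratings are hypothetical.

```python
# Minimal sketch of the "very low average rating" check from the analysis plan.
# Element names and scores are hypothetical; the 3.0 cutoff is from the example.

ratings = {
    "Lectures":             [4, 5, 4, 3, 5],
    "Small-group exercise": [3, 2, 3, 3, 2],
    "Materials":            [5, 4, 4, 5, 4],
}

CUTOFF = 3.0  # averages at or below this are considered very low

for element, scores in ratings.items():
    avg = sum(scores) / len(scores)
    flag = "VERY LOW" if avg <= CUTOFF else "ok"
    print(f"{element:<22} {avg:.1f}  {flag}")
```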
PRODUCT 5
REPORT PLAN
Audience
Content
Format
Date/Frequency
Event
OSE
Written report
60 days after
funding year
End-of-the-year
report
Written report
30 days after
funding year
End-of-the-year
report
Meetings with written summary
2 months
before
workshop
2 weeks after
workshop
Staff meeting
University
Staff
Project
Director
Presentation by
evaluator
Presentation by
evaluator
2 1/2 months
Staff meeting
(see above)
(see above)
(see above)
Informal
discussion
Written report
Every 2 weeks
Meeting
2 1/2 months
Meeting
after workshop
after workshop
Staff meeting
PRODUCT 6
MANAGEMENT PLAN

Evaluation Workplan
(columns: task, person(s) responsible, and a schedule of X's marking when each task occurs)

A. Design the evaluation
   draft the design
   review
   present to staff
   revise
   have reviewed by consultant

B. Develop procedures and instruments
   draft registration form (Proc. A), questionnaire (Proc. B), and guidelines for expert review (Procs. E and J)
   review
   revise
   produce
   train staff for keeping notes on workshop process (Procs. C and D)
   develop interview schedule (Proc. H)

C. Collect information
   during workshop: Procs. A, B, C, D
   following workshop:
      Proc. E: send designs to reviewers; reviews due
      Proc. F: post-workshop meeting
      Proc. G: compile budget information
      Proc. H: interview a sample of P's
      Proc. I: send material to reviewers; reviews due

Persons responsible named in the chart include the Evaluator, the Director and staff, the Director (and consultant), the Staff and evaluator, the Evaluator and Director, and a Consultant.

Budget

Personnel
   Evaluator (25% of $4,000 X 1/2 year)                                   $ 500
   Consultant fees for 2 workshop observations ($100/day X 2 days X 2)      400
   Consultant fees for reviews ($100 X 4)                                   400
   Subtotal                                                               1,300

Travel and Lodgings
   To workshop for:
      Evaluator (carfare = $15, per diem = $50 X 3 = $150)                  165
      Consultants (2) (carfare = $50 X 2, per diem = $60 X 3 X 2 = $360)    460
   Subtotal                                                                 625

Other items: 50, 100, 20
   Subtotal                                                                 170

TOTAL                                                                    $2,095
PRODUCT 7
META-EVALUATION PLAN
Evaluation of Completed Evaluation
Purpose:
Methods:
Resources:
Criteria:
APPENDIX C
EXTRA WORKSHEETS
WORKSHEET
A NOT-SO-ELEGANT DESIGN
PRODUCT 1. EVALUATION PREVIEW
WORKSHEET
DESCRIBING THE OBJECT (Defining "What" You'll Evaluate)
Directions: Use aid #2 as a guide to describe what you will evaluate (the object) in
the spaces below. You may want to write a narrative description
and/or a pictorial representation to answer the questions.
List
Who is involved in
the object
Note
Explain
Describe
Where it exists
WORKSHEET
ANALYZING AUDIENCES
Identify persons/
spokespersons for each
audience
WORKSHEET
IDENTIFYING PURPOSES
Purpose
Rank
Interested Audiences
Use a number to
rank each
purpose (if more
than 1)
WORKSHEET
IDENTIFYING CONSTRAINTS
Directions: Think about constraints for each of the categories below; use aid #5.
Then, list the constraints you will need to attend to.
Some guide questions and categories
Analysis and Interpretation Plan
Other
Are there any other constraints on the
evaluation?
WORKSHEET
DRAFT EVALUATION QUESTIONS
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
Q?
WORKSHEET
EVALUATION QUESTIONS AND SUBQUESTIONS
Directions: Review and organize and revise your draft evaluation questions. Write
them, their sub-questions, audiences and why they're important in
the spaces below. Use the example (page xxx) and aid #7.
Evaluation
Questions
Sub-Questions
Audience
(Who cares?)
WORKSHEET
OVERALL INFORMATION COLLECTION PLAN
Write evaluation questions here (see Product #2) →
List information collection procedures here (use aids #8 and #9) ↓
(In the cells place an "X" to show where a procedure addresses a question.)
WORKSHEET
HOW EACH PROCEDURE WORKS
Directions: Fill in each column below for each procedure listed in worksheet A.
Procedure
Evaluation question addressed
Schedule (when, how, where)
Sample (kind and size, see aid #10)
Respondents
Instruments used
WORKSHEET
PRELIMINARY HANDLING & ANALYSIS PLAN
List information
collection
procedures
Describe how
problems will be
identified,
e.g., incomplete
information and
errors
Describe
preliminary
analysis
WORKSHEET
ANALYSIS & INTERPRETATION PLAN
List evaluation questions
List information collection procedures for each question
Describe analysis procedures
Describe procedures for making the judgments (see aid #11)
WORKSHEET
PLANNING REPORTS
List evaluation audiences
Describe content of reports (use aid #12)
List date or frequency (use aid #12)
Identify format of report to be used (use aid #12)
Identify event associated with report
WORKSHEET
TASK ANALYSIS
Task
Begin/end Date
Name the person(s) who will do each sub-task
Estimate personnel cost
List other resources needed
Estimate cost for other resources
Task:
/
/
/
/
/
/
/
Task:
/
/
/
/
/
/
Task:
/
/
/
/
/
/
/
WORKSHEET
REVISED TASK CHART
Identify person( s)
responsible
WORKSHEET
BUDGET
Directions: Complete the following budget, using the categories of expenditures.
Personnel
Subtotal
Fringe (institutional percentage of cost)
Travel
Communications
TOTAL
WORKSHEET
PLANNING THE EVALUATION OF THE EVALUATION
Evaluate your
evaluation design
Evaluate processes
(during) your
evaluation
Purpose
Method
Persons &
resources
needed
Relevant
standards
& criteria
Evaluate the
results/products
of your evaluation
WORKSHEET
SELECTING AN EVALUATION OBJECT
Directions: Using the previous examples as aids, select one or more objects to
evaluate.
Possibility #1
Object Name
Comments:
What it is-
How it works-
What it's supposed to do-
Why you might want to evaluate it-

Possibility #2
Object Name
Comments:
What it is-
How it works-
What it's supposed to do-
Why you might want to evaluate it-

Possibility #3
Object Name
Comments:
What it is-
How it works-
What it's supposed to do-
Why you might want to evaluate it-