
Evaluation of Determining Instructional Purposes


J. Raphael Holmes

Far West Laboratory for Educational Research and Development (FWR) has requested
evaluation proposals for its training program Determining Instructional Purposes (DIP). This
document is an evaluation proposal by San Carlos Education Consulting (SCEC).
Description of the Program Being Evaluated
The DIP training program was created to train administrators and graduate students in
educational administration in planning effective school programs. The package includes three
separate units and a Coordinator's Handbook. Each unit consists of 4 to 6 modules with training
on several objectives. The modules include reading material correlated to the module skills,
individual and group activities to practice the skills, and feedback for the practice activities. The
trainees are divided into planning teams to work on realistic problems in a fictitious school district.
The design of the units allows administrators to choose a single unit, any combination of two, or all three, so potential purchasers can access only the unit they require or use the complete three-part instructional sequence. Participants work step by step through the materials and activities in each unit to acquire the intended outcomes. The training can be delivered either in a concentrated short-term format or spread out over a few days or weeks. Each unit lasts 10 to 18 hours.
A coordinator should be familiar with the material before the training begins, but no prior knowledge of the subject area is required. The coordinator organizes, guides, and monitors the activities in which trainees use the materials and procedures in each of the units. All important information regarding the coordinator's role is described in the Coordinator's Handbook.

All materials come in print form. The units are between 155 and 259 pages long and cost $8.95 per copy; the Coordinator's Handbook is available for $4.50 per copy.
Evaluation Method
To help FWR decide whether or not to move forward with the DIP program, the proposed evaluation will consist of the following components:
- Expert evaluation of content - Content will be evaluated by one in-house subject matter
expert (SME) together with two outside experts. They will provide feedback on the
accuracy and quality of the content, for individual modules, individual units, and
combinations of units. Our in-house expert will give feedback in the form of a written
report, and the two other experts will be independently interviewed after they are given
the opportunity to familiarize themselves with the materials. Besides helping FWR
decide whether to disseminate the DIP program, this component of our evaluation can
help fulfill the secondary purpose of the evaluation, acting as a review and guide for
interested school administrators.

- Trial runs - SCEC will arrange for multiple trial runs of DIP components, in either a Bay Area graduate school or school district setting. Each trial run will be observed by an SCEC consultant, and if possible, each trial run will include one full unit of the DIP program.
The total target observation time is 40 hours, with 6-14 trainees per trial and a
minimum of 25 individual participants. Trials will be preceded by pre-tests and followed
by both post-tests and surveys.

- Tests and surveys - Our pre- and post-tests will measure trainees' mastery of the targeted skills through hypothetical administrative planning scenarios based on the instructional materials. Surveys will seek the participants' opinions of the program's format, appeal, and applicability, as well as gauging their interest in continuing to participate in the program. Quantitative, Likert-scale components of the surveys will be used to describe overall reactions and opinions, while SCEC staff will compile responses to qualitative questions into meaningful overall trends.

- Coordinator interviews - SCEC will also conduct guided telephone interviews with trial run coordinators to determine their opinions of the program. The guiding focus of information gathering will be to determine whether trainees have benefited and, if so, what tangible form those benefits have taken. The primary purpose of these trials is to give FWR information to inform its decision regarding deployment of the program, but they may also yield material useful in communicating with potential clients.

- Attitudes of potential clients - SCEC will take advantage of its nationwide connections to K-12 educators and graduate educational administration instructors to distribute electronic surveys and receive completed surveys from a minimum of 40 potential clients. These instruments will include items assessing the need for a program like DIP, the expected presentation modality, budget constraints, and degree of fit with current practices in school training or graduate school instruction. Survey data will again be both quantitative and qualitative.
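A minimal sketch of how the quantitative portions of the analyses above (pre-/post-test gains and Likert-scale survey items) might be summarized; all scores and responses below are invented placeholders, not collected data:

```python
from statistics import mean

# Hypothetical pre-/post-test scores (percent correct) for the same trainees
pre = [55, 60, 48, 72, 65]
post = [70, 78, 66, 85, 80]
gains = [after - before for before, after in zip(pre, post)]
print("Mean gain:", mean(gains))  # prints Mean gain: 15.8

# Hypothetical responses to one 5-point Likert survey item
responses = [4, 5, 3, 4, 5]
print("Item mean:", mean(responses))  # prints Item mean: 4.2
```

In practice, each Likert item would be summarized this way across all respondents, while qualitative responses would be coded into themes by SCEC staff as described above.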

The projected start date is October 2, 2011, and the evaluation should be completed by July 10, 2012.
Task | Responsibility | Deadline Date
1. *Meet with Far West staff to discuss DIP evaluation proposal and revisions. | Far West | October 2, 2011
2. Provide feedback for implementation into evaluation plan. | Far West | October 5, 2011
3. Submit revised evaluation plan (including program description) to Far West. | SCEC | October 14, 2011
4. Expert evaluation of content. | SCEC | October 14, 2011
5. Submit data collection documents, including all surveys and interview protocols for all participating parties, to Far West for review and approval. | Far West | October 14, 2011
6. Provide feedback for revision. | Far West | October 19, 2011
7. Identify potential participating administrators and graduate students; mail out sample training packages and information documents. | SCEC | October 24, 2011
8. Follow up with recipients of the mail-out and confirm a minimum number in each sample group (25 administrators and 25 graduate students). The test group will also be selected at this time. | SCEC | October 31, 2011
9. Delivery of training program to participants. | SCEC | November 1, 2011
10. Distribution of electronic survey to potential clients. | SCEC | November 1, 2011
11. Survey information collected and telephone interviews conducted with administrators and graduate students. | SCEC | November 21, 2011
12. Interview of test group coordinator. | SCEC | November 21, 2011
13. Collection and analysis of data from potential clients. | SCEC | November 21, 2011
14. *Formative assessment report to Far West based on data gathered to this point in the evaluation. | Far West | November 25, 2011
15. Revisions to data collection instruments and recommendations for changes to the evaluation for the trial runs. | Far West | December 2, 2011
16. Pretest delivered to trial run participants. | SCEC | January 4, 2012
17. Test group trial run of program begins. | SCEC | January 6, 2012
18. Completion of trial run and collection of data. | SCEC | February 3, 2012
19. Interviews with trial run coordinators. | SCEC | February 3, 2012
20. *Meeting with Far West to discuss data and analysis. | Far West | February 15, 2012
21. *Submission of final report and recommendations to Far West. | Far West | February 29, 2012
22. *Updated report and recommendations to Far West. | Far West | July 10, 2012
*Requires meeting between two parties in person or through video conferencing.

San Carlos Educational Consulting has been providing evaluation and guidance to educational
institutions since 2002. Our clients include schools, school districts, post-secondary institutions,
and creators of educational and training resources. Two of our most recent projects were an
evaluation of the NSF-funded high school pre-service physics teacher training camp (PPTTC) and
a large-scale evaluation of the practicum component of San Francisco State University's
practitioner degree program. We have ample experience evaluating instructional packages, and
we are eager to furnish references from our many satisfied clients.

Lead Project Personnel
Dr. Barry Janzen is our chief evaluator and the founder of SCEC. He will supervise the evaluation
process and act as the primary point of contact with FWR. He holds an Ed.D. degree from Boise
State University, and his work has been published in numerous educational journals. Dr. Janzen
has authored a highly regarded textbook on educational evaluation, now in its 4th edition.

Michaela Pacesova will design our data collection instruments and analyze our findings. She is
an evaluation specialist with 10 years of experience, including multiple evaluations of
educational training programs, and she has been working with SCEC since 2004.

J. Raphael Holmes is our in-house subject matter expert on school administration training, and
he will also be in charge of observing trial runs. He has been working with SCEC since 2005. He
holds an MS in educational administration from Boise State University, and he has worked as an
educational administration instructor at Rutgers University.

The proposed budget of $25,590 appears on the following page. Required DIP materials have been tentatively priced at half the sale price detailed by FWR in the RFP, but this is subject to recalculation at FWR's discretion. The first $8,000 installment is due by 10/14 and the completion of task 3, the next $8,000 installment is due prior to the 2/15 videoconference, and the remainder is due upon submission of the completed evaluation report.
Dr. Barry Janzen: 30 days at $450/day $13,500
Michaela Pacesova: 20 days at $300/day $6,000
J. Raphael Holmes: 10 days at $250/day $2,500
Secretary: 15 days at $120/day $1,800
Travel and Per Diem
3-day round trip for two: San Francisco to Portland $600
Accommodations in Portland (includes per diem) $380
Estimated miscellaneous travel expenses $150

ICT, Supplies, and Communication
Electronic survey tools $100
DIP materials (5 Coordinator's Handbooks, 20 full sets of units) *$260
Postage $150
Miscellaneous Material Expenses $150
Total ICT, Supplies, and Communication $660

*As previously noted, DIP materials have been tentatively priced at half the sale price detailed by FWR in the RFP, subject to recalculation at FWR's discretion.
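As a quick arithmetic check, the budget line items above sum to the proposed total of $25,590; the grouping below follows the personnel, travel, and ICT sections of the budget:

```python
# Personnel: Janzen, Pacesova, Holmes, secretary
personnel = 13_500 + 6_000 + 2_500 + 1_800   # $23,800
# Travel and per diem: round trip, accommodations, miscellaneous
travel = 600 + 380 + 150                     # $1,130
# ICT, supplies, and communication: survey tools, DIP materials, postage, misc.
ict = 100 + 260 + 150 + 150                  # $660
total = personnel + travel + ict
print(total)  # prints 25590
```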