A Pragmatic Model for a Comprehensive Public School Research and Evaluation System

WILLIAM J. WEBSTER
ROBERT L. MENDRO
Dallas Independent School District

The Journal of Educational Research, Vol. 69, No. 4 (Dec., 1975), pp. 156-159

ABSTRACT

A model for research and evaluation in a large urban school district is presented. The relationship between research, evaluation,
planning, and development is discussed in light of major functions including context, input, process, and product evaluation, as well
as applied and basic research. The role of these functions in supplying the information base necessary for educational planning and
decision-making is highlighted.

IN RECENT YEARS there has been increased emphasis on evaluation in the public schools. Many districts have established comprehensive systems of performance monitoring based on the continuous assessment of program and project outcomes (1, 5, 6, 14). Typically, they are structured so as to provide information to decision-makers, but are largely limited to evaluation rather than research functions (4). Consequently, the information provided by these systems is of limited utility.

The restricted scope of these systems is undoubtedly due, in part, to the strained finances of many school systems and to the critical lack of competent personnel trained to conduct and implement systematic research programs in the public schools. A further significant factor, however, is the lack of information detailing the integration of evaluation and research operations in a real-world environment. More directly, the fundamental works underlying the current renaissance of educational evaluation have failed to emphasize the critical role of research in objective decision-making (7, 9, 10, 11).

Evaluation alone, regardless of its comprehensiveness, cannot supply sufficient information for educational decision-making. Rather, systematic evaluation programs must be carefully integrated with applied and basic research programs to provide the data essential for educational renewal.

This paper provides a model for a comprehensive evaluation and research system, highlighting the interrelationships between evaluation and research and indicating the role of both in developing and implementing educational programs. The CIPP model, proposed by Stufflebeam et al. (11), was chosen as a reasonable general-purpose evaluation model to be modified into the more comprehensive combined model.

Figure 1 presents a flowchart outlining the essential details of the revised system. The flowchart depicts an operational research and evaluation system which begins with yearly context evaluation, moves through input, process, and interim product evaluation stages, and completes the annual cycle with summative product evaluation and applied research information. The process outlined in Figure 1 is discussed below.

A prerequisite to improvement is knowledge of existing performance levels. Thus, the backbone of any renewal system must be a comprehensive context evaluation program. Stufflebeam et al. define context evaluation as the provision of baseline information that defines the environment of interest, describes desired and actual conditions pertaining to that environment, identifies unmet needs and unused opportunities, and diagnoses the problems that prevent needs from being met and opportunities from being used. An adequate context evaluation system is founded on a longitudinal data base and provides periodic reports on such variables as student dropout, attendance, achievement levels, drug usage, demographic and vocational patterns, community socioeconomic status and dominant value patterns, and teacher academic and demographic characteristics. Thus, a context evaluation system provides the basis for formulating change objectives by identifying needs and, in some cases, outlining practical constraints in identified problem areas.
Once the context evaluation system has identified needs, decision-makers must set priorities among them and focus on reducing the discrepancy between desired and existing conditions by establishing goals for the needs that receive highest priority. It is at this point that input evaluation information is used. Stufflebeam et al. define input evaluation as the provision of information for determining methods of resource utilization for accomplishing program goals. In a functioning evaluation system, there are four sources of input information:

1. previous product evaluation information
2. basic research information
3. applied research information
4. non-research and evaluation information.

Product evaluation information concerns the extent to which specific project or program goals are achieved. When product evaluation information is available for a given program with goals similar to those identified in response to context evaluation information, that information helps decision-makers judge the probability that the program would reduce the identified discrepancy between desired and existing conditions.

Basic research information pertains to fundamental relationships that affect student learning. Before deciding to implement a given program, decision-makers should be apprised of the extent to which that program is or is not consistent with the principles established by basic research in learning.

Applied research information concerns the interaction between student characteristics, teacher characteristics, and instructional systems. Applied research differs from basic research in that the information provided is more closely related to specific decisions in an applied educational setting. Decision-makers need information on the types of students (e.g., high-anxiety versus low-anxiety) that function best in given instructional systems implemented by specific types of teachers. Applied research must be implemented in the public schools by public school research departments. Basic research should probably be implemented in the universities in conjunction with the public schools, because many local boards of education are too impatient for results to wait for a long-term, systematic, longitudinal basic research program to pay off.

Finally, non-research and evaluation information must enter into most educational decisions. Such information as the capabilities of staff members, costs, political feasibility of program implementation, and existing facilities must be taken into account.
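The four input sources, together with the non-R&E facts just described, amount to a single bundle of planning information. The following sketch is ours, not the authors'; the class and field names are hypothetical. The gap check reflects the observation below that districts often hold sufficient material resources but insufficient information resources.

```python
from dataclasses import dataclass, field

@dataclass
class InputInformation:
    """Illustrative bundle of the four input sources named in the text."""
    prior_product_evaluations: list = field(default_factory=list)  # outcomes of similar programs
    basic_research: list = field(default_factory=list)             # fundamental learning relationships
    applied_research: list = field(default_factory=list)           # student/teacher/treatment interactions
    non_rne: dict = field(default_factory=dict)                    # costs, staff, facilities, politics

    def information_gaps(self) -> list:
        """Name the research and evaluation sources that are still empty."""
        sources = ("prior_product_evaluations", "basic_research", "applied_research")
        return [name for name in sources if not getattr(self, name)]
```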
After the collection of relevant input information, feeding a preliminary program planning stage, decision-makers determine whether or not sufficient resources are available to make the desired instructional changes. Quite often sufficient resources are not available and some compromise is necessary. In most cases, the lack of resources is not limited to the realm of cost and political feasibility, but rather stems from an insufficient base of research information. Thus, educators are often in the position of having sufficient material resources but insufficient information resources.

If sufficient material resources are not available, the system may have to exist for some period of time in a state of enlightened persistence. Periodic context evaluation will continue to highlight the extent of discrepancy between that which is desired and that which exists. If the problem results from insufficient information resources, programs are often implemented without sufficient support data, and an information base is built through a series of systematic evaluation and applied research studies.

If sufficient material and information resources are available, or if sufficient material and minimal information resources are available, the program planning phase is entered. The evaluator's role in program planning involves making all relevant, available input information available to program planners and ensuring that stated program objectives are measurable. Out of the program planning sessions, the evaluator develops a detailed program evaluation design specifying the criteria by which the program will be judged. The development of this evaluation design must necessarily involve continuous interaction between program personnel and the evaluator in order ultimately to produce maximally effective information. Obviously, the evaluator must be independent of program management to ensure the optimum objectivity of evaluation results.

Once the program implementation phase is entered, the evaluator's role becomes one of providing continuous formative evaluation reports relative to program implementation. These reports fall primarily into two categories: process evaluation and interim product evaluation. Process evaluation has three major objectives: (1) the detection or prediction of defects in procedural design or its implementation during program implementation stages, (2) the provision of information for programmed decisions, and (3) the maintenance of a record of the implementation procedure as it occurs (11). Thus, process evaluation information keeps program management informed of the extent to which program implementation conforms to specifications and, from an evaluation standpoint, guards against the evaluation of a fictitious event.

Interim product evaluation provides periodic feedback to program management relative to the attainment of specific sub-objectives during the implementation phase. Thus, process and interim product evaluation reports inform program management as to implementation and goal attainment levels while program adjustments are still feasible.

Upon completion of a given cycle of program implementation, a summative product evaluation report is prepared. This report generally addresses three areas of concern:

1. the extent to which program objectives were achieved relative to some specific set of criteria
2. the extent to which system objectives were achieved relative to alternative instructional strategies
3. the cost-effectiveness of the program relative to alternative instructional strategies.

It should be obvious that information relative to these three areas of concern must be interpreted in light of process and interim product evaluation information. Without information about program implementation, product evaluation information is of little use. A detailed discussion of an operational project evaluation process can be found in Webster (12).

Most evaluation systems stop at the provision of product evaluation information. These data generally bear upon the performance of different groups of students under varying treatment configurations. Unfortunately, as a result, most product evaluation reports have focused on the search for single best treatments for all learners, i.e., main effects. In order to provide needed information for educational decision-makers, applied research studies involving the systematic investigation of aptitude-treatment and trait-trait interactions must be undertaken. Such studies would be expected to provide important information relative to replicable relationships between student, teacher, and program characteristics (2, 3, 8, 13).

Once context, input, process, and product evaluation information, as well as applied research data, are available, non-research and evaluation information once more is brought to bear on the decision-making process.


Figure 1.-Flowchart for an Integrated Research and Evaluation System

[Flowchart: Yearly Context Evaluation leads to a Need? decision; identified needs feed Preliminary Program Planning, which draws on input information (previous product evaluation, basic research information, applied research information, and non-R&E information). A Resources? decision routes either to Enlightened Persistence, looping back to context evaluation, or to Program Planning and Operational Program Implementation, during which Process Evaluation and Interim Product Evaluation feed back to management. After Program Completed, Product Evaluation and Applied Research, together with non-R&E information, determine the Fate of Program: expand (via Additional Context Evaluation), continue, discontinue, or revise. A legend distinguishes research and evaluation functions from non-research and evaluation functions.]
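Read operationally, Figure 1 is a loop over a small set of stages with two decision points (Need? and Resources?). The transition function below is a hypothetical sketch of that loop under our own names, not the authors' implementation.

```python
from enum import Enum, auto

class Stage(Enum):
    CONTEXT_EVALUATION = auto()
    PRELIMINARY_PLANNING = auto()
    ENLIGHTENED_PERSISTENCE = auto()
    PROGRAM_PLANNING = auto()       # evaluator fixes judgment criteria here
    IMPLEMENTATION = auto()         # process and interim product evaluation run here
    PRODUCT_EVALUATION = auto()     # summative report prepared
    FATE_DECISION = auto()          # continue, expand, discontinue, or revise

def next_stage(stage: Stage, *, need: bool = True, resources: bool = True) -> Stage:
    """Advance one step through the Figure 1 loop (one illustrative reading)."""
    if stage is Stage.CONTEXT_EVALUATION:
        # No identified need: yearly context evaluation simply repeats.
        return Stage.PRELIMINARY_PLANNING if need else Stage.CONTEXT_EVALUATION
    if stage is Stage.PRELIMINARY_PLANNING:
        # Insufficient material resources force the waiting state.
        return Stage.PROGRAM_PLANNING if resources else Stage.ENLIGHTENED_PERSISTENCE
    if stage is Stage.ENLIGHTENED_PERSISTENCE:
        # Periodic context evaluation keeps restating the discrepancy.
        return Stage.CONTEXT_EVALUATION
    if stage is Stage.PROGRAM_PLANNING:
        return Stage.IMPLEMENTATION
    if stage is Stage.IMPLEMENTATION:
        return Stage.PRODUCT_EVALUATION
    if stage is Stage.PRODUCT_EVALUATION:
        return Stage.FATE_DECISION
    # The fate decision feeds back into next year's context evaluation.
    return Stage.CONTEXT_EVALUATION
```

For example, next_stage(Stage.PRELIMINARY_PLANNING, resources=False) yields the enlightened-persistence state described in the text.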

It would be naive to expect educational decisions to be made purely on the basis of research and evaluation data. Once again, information such as absolute program costs, the capabilities of program staff members, the political feasibility of program implementation, and existing facilities and resources must be considered by decision-makers in rendering a final decision.

In determining the fate of a given program, four primary choices are available to decision-makers. First, they can choose to continue the program in its current setting. If this alternative is chosen, the summative product evaluation report and the applied research data become the context evaluation information for the next implementation phase, and program implementation commences. This alternative generally occurs when decisions are to be made on the basis of longitudinal studies, i.e., where results are not expected to be in evidence after a relatively short implementation period.

A second alternative is to discontinue the program. This is usually done after product evaluation studies demonstrate the failure of the program to meet its objectives, or in those cases where the program is simply not cost-effective. (Failure to meet objectives is often a necessary but not sufficient condition for program discontinuation.) Once a program is discontinued, the system returns to context evaluation and once again applies the needs assessment and orientation phases.

Third, if the product evaluation and applied research information are favorable, and it is practically and politically feasible, the program may be expanded to serve additional students. Prior to the decision to expand, additional context evaluation information must be examined to determine whether similar needs exist in other settings. If such needs are demonstrated, the program may be expanded to those settings and the program planning stage entered to extend the implementation. Settings eligible for expansion may include entire schools or specific sub-populations (e.g., highly motivated students) as indicated by applied research data. If such needs do not exist, the program is continued with the original target population, or with a reduced target population based on the results of the applied research studies. The extent of continued evaluation under either the expansion or continuation alternative is determined by decision-makers under advisement from evaluation personnel.

A fourth alternative involves program revision. Much program revision should be accomplished on the basis of process and interim product evaluation reports. Often, however, summative product evaluation and applied research reports highlight weaknesses in portions of programs that would otherwise appear to be functional. In this instance, the summative product evaluation and applied research reports become the context evaluation information for the next program planning cycle.
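Taken together, the four alternatives can be read as one decision rule. The function below is a hypothetical sketch of that rule; the Boolean flags are our shorthand for the evaluative findings described above, and the first branch encodes the parenthetical caveat that failing objectives alone does not force discontinuation.

```python
from enum import Enum, auto

class Fate(Enum):
    CONTINUE = auto()
    DISCONTINUE = auto()
    EXPAND = auto()
    REVISE = auto()

def decide_fate(met_objectives: bool, cost_effective: bool, feasible: bool,
                needs_elsewhere: bool, fixable_weaknesses: bool) -> Fate:
    """One reading of the four primary choices open to decision-makers."""
    # Failure to meet objectives is necessary but not sufficient; the text
    # pairs it with a lack of cost-effectiveness before discontinuing.
    if not met_objectives and not cost_effective:
        return Fate.DISCONTINUE
    # Favorable results, practical/political feasibility, and demonstrated
    # needs in other settings justify expansion.
    if met_objectives and feasible and needs_elsewhere:
        return Fate.EXPAND
    # Summative reports exposing weak portions of an otherwise functional
    # program send it back through program planning.
    if fixable_weaknesses:
        return Fate.REVISE
    # Default: continue in the current setting; the summative report becomes
    # next cycle's context evaluation information.
    return Fate.CONTINUE
```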
Figure 1 outlined a model for an operational research and evaluation system. However, data, no matter how comprehensive, are of limited use if not disseminated to decision-makers. Table 1 outlines the information requirements of decision-makers at different management levels.


Table 1.-Information Requirements of Various Management Levels that Should Be Met Through Systematic Research and Evaluation Programs

Board of Education (public):
- Product evaluation information (generally summative) on the effects of district projects and programs.
- Context evaluation information on the general state of the educational environment.

Upper Management:
- Product evaluation information (generally summative) on the effects of district projects and programs.
- Context evaluation information on the general state of the educational environment.
- Basic research information (mostly literature reviews) on selected fundamental relationships in learning.
- Input evaluation information relative to alternative strategies for meeting specific needs.
- Applied research information on the interactions between student characteristics, teacher characteristics, and program success in all district programs.

Project Management:
- Product evaluation information of a formative and summative nature on the effects of specific projects.
- Context evaluation information on the baseline state of the project environment.
- Process evaluation information on a continuing basis relative to the extent of program implementation.
- Applied research information on the interaction between student characteristics, teacher characteristics, and program success.
- Basic research information on selected fundamental relationships in learning.

Building Management:
- Product evaluation information (generally summative) on the effects of district projects and programs.
- Context evaluation information on the general state of building teachers and students and the sponsoring community.
- Input evaluation information relative to alternative strategies for meeting specific needs.
- Formative and summative product evaluation and applied research information on specific projects implemented in the building.

Teachers:
- Product evaluation information of a formative and summative nature on the effects of specific programs in which the teacher is involved.
- Context evaluation information relative to student entry level.
- Process evaluation information on the degree of program implementation for specific programs in which the teacher is involved.
- Applied research information relative to the types of instructional strategies that are most effective with different learner characteristics, and specific validated instruments and instructions for their use in student diagnosis.

Parents/Students:
- Feedback on individual student performance.
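Table 1 is, in effect, a report-routing specification. The dictionary below paraphrases it as data (our rendering; the keys and report labels are shorthand, not district terminology), so that a dissemination step could look up which levels should receive a given report.

```python
# Paraphrase of Table 1: report types each management level should receive.
REPORT_ROUTING = {
    "Board of Education": {"product (summative)", "context"},
    "Upper Management": {"product (summative)", "context", "basic research",
                         "input", "applied research"},
    "Project Management": {"product (formative+summative)", "context",
                           "process", "applied research", "basic research"},
    "Building Management": {"product (summative)", "context", "input",
                            "applied research"},
    "Teachers": {"product (formative+summative)", "context", "process",
                 "applied research"},
    "Parents/Students": {"individual student feedback"},
}

def recipients(report_type: str) -> list:
    """Management levels whose information requirements include this report type."""
    return [level for level, kinds in REPORT_ROUTING.items() if report_type in kinds]
```

Under this encoding, recipients("process") returns Project Management and Teachers, matching the table's rows.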
The establishment of effective research and evaluation systems in the public schools is essential for educational renewal. To assume that public school systems should be passive recipients of externally developed products, often unaccompanied by sufficient empirical evidence of success or accompanied by empirical support based on restricted student or teacher populations, is ludicrous. Often adjustments must be made in externally developed programs to meet local requirements, or, where externally developed programs do not exist, strategies must be locally developed for meeting specific needs. Evaluation, in the absence of applied and basic research data, is not capable of providing the necessary information for meeting the needs of diverse student populations.

REFERENCES

1. Barker, L. W., "Research and Evaluation in the Louisville Public Schools," Journal of Research and Development in Education, 5:79-97, 1972.
2. Bracht, G. H., "The Relationship of Treatment Tasks, Personological Variables, and Dependent Variables to Aptitude-Treatment Interactions," unpublished doctoral dissertation, University of Colorado, 1969.
3. Cronbach, L. J.; Snow, R., Individual Differences in Learning Ability as a Function of Instructional Variables, Final Report, U.S. Office of Education (Contract No. OEC 4-6-061269-1217), Stanford University School of Education, March, 1969.
4. Dziuban, C. D.; Moser, R. P., "Some Functional Dimensions of Large City Public School Research," Journal of Educational Research, 66:471-478, 1973.
5. Jacobs, J., "A Model for Program Development and Evaluation," Theory into Practice, 13:15-21, 1974.
6. Merriman, H. O., "DOPI CIPP," paper presented to the Greater Cities Research Council, Washington, D.C., November, 1969.
7. Provus, M., Discrepancy Evaluation, McCutcheon Publishing Company, Berkeley, California, 1971.
8. Salomon, G. S., "Heuristic Models for the Generation of Aptitude-Treatment Interaction Hypotheses," Review of Educational Research, 42:327-343, 1972.
9. Scriven, M., "The Methodology of Evaluation," in Worthen and Sanders (eds.), Educational Evaluation: Theory and Practice, Charles Jones Publishing Company, Worthington, Ohio, 1973.
10. Stake, R. E., "The Countenance of Educational Evaluation," Teachers College Record, 68:523-540, 1967.
11. Stufflebeam, D. L., et al., Educational Evaluation and Decision-Making, Peacock Publishers, Inc., Itasca, Illinois, 1971.
12. Webster, W. J., "The Dallas Approach to Systematic Research and Evaluation in Development," Abstracts of Research and Evaluation Reports, 1972-73, Dallas Independent School District, August, 1973, pp. 165-199.
13. Webster, W. J.; Mendro, R. L., "The Investigation of Aptitude-Treatment Interactions as an Integral Part of Program Evaluation," Journal of Experimental Education, 43:86-91, Fall 1974.
14. Webster, W. J.; Schuhmacher, C. C., "A Unified Strategy for System-wide Research and Evaluation," Educational Technology, 13:68-72, 1973.
