
Running head: EVALUATION PLAN, DATA ANALYSIS

Evaluation Plan, Data Analysis & Recommendations


Uchendu Okeke
Keiser University
Dr. Janet Wynn
EDL751: Leadership: Assessment and Program Evaluation
February 14, 2016

Warranted Evaluation Model


As previously stated, the evaluation approach that the author feels would be
appropriate for assessing the Master's Degree program in Post-Secondary Education, which is
offered by the College of Education at Troy University, is an objective-oriented
(or goal-oriented) evaluation approach. Janicak (2015) asked, "What is the process
of designing an objective-oriented evaluation process? In the objective-oriented evaluation
approach, the purposes of some educational activity are specified, and the evaluation process
focuses on the extent to which the objectives have been achieved" (p. 179).
With regard to goal-centered/objective-oriented evaluation, Ferlie and Lynn (2007)
stated the following:
In goal-oriented evaluation, activities are evaluated on the basis of whether they help
achieve formally stated goals. More often than not, this school of thought takes for
granted that it is meaningful to talk about public activities such as programs in terms of
inputs, processes, outputs, and outcomes. Outcomes are most often seen as the
ultimate justification of an activity. Programs are, in this light, merely tools, even
expendable tools, which help us achieve certain outcomes. When the evaluator is
considering the construction or selection of specific outcome measures, he or she starts
with the program's official goals. In fact, program goals are only one way of
understanding a program, but in goal-oriented evaluation approaches, they constitute the
only legitimate source of criteria for judging the program. In other words, if a particular
value, criterion, or standard cannot be demonstrably rooted in an official statement about
program goals, it can safely be left out of an evaluation of that program. (p. 624)

Ferlie and Lynn (2007) further noted that "the beauty of this simple principle should not be
underestimated; many is the situation in which evaluators are torn between different value
positions when evaluating a program, and sometimes, more or less knowingly, find their own
values interfering in the game. Goal-oriented evaluation claims to offer a way out of this misery.
All that is needed is for the evaluator to describe official program goals thoroughly and from
here deduce specific outcome measures."
Evaluation Questions
One of the primary goals of the Master's Degree program in Post-Secondary Education
offered by the College of Education (at Troy University) is to improve graduate
education by providing expansive, comprehensive proficiencies to students who partake in the
program. Program evaluation (by means of an objective-oriented evaluation approach) offers a
way of achieving that goal by providing guidance to improve and bolster the specified program.
Rossi and Lipsey (2003) mentioned, "It is generally wise for the evaluator to develop a
written summary of the specific questions that will guide the evaluation design. This provides a
useful reference to consult while designing the evaluation and selecting research procedures" (p.
69). Guerra-López (2008) stated, "With objectives clarified, the overarching questions that will
drive the evaluation process, as well as the purpose of the evaluation, should also become clear,
well articulated, and ready to be agreed on" (p. 87). In addition, Guerra-López noted, "Guiding
evaluation questions come from various perspectives and stakeholder groups. Evaluators who
cannot obtain consensus on a set of questions will be far less likely to obtain consensus about the
usefulness of the evaluation report." Since the Master's Degree program in Post-Secondary
Education (at Troy University) is being evaluated, the following evaluation questions
will be used during the evaluation.

Survey items (each rated on a scale from 1 = Poor to 5 = Good):

1. Rate the quality of the faculty offering the degree program.
2. Rate the academic standards of the program.
3. The program has kept pace with contemporary trends/developments in your specified field.
4. Rate the sufficiency of professional training opportunities in the program.
5. Rate the adequacy of the facilities and equipment accessible to graduate students.
6. Rate your contentment with the guidance you received during the program.
7. Rate the preparation you received in the program to be an effective educator (of your discipline).
8. Rate the overall worth of the degree program.

Sampling Technique Utilized


The sampling technique that will be utilized is simple random sampling. Tashakkori and Teddlie
(2002) stated, "A simple random sample is one in which each person or unit in the clearly defined
population has an equal chance of being included in the sample" (p. 278). In addition,
Tashakkori and Teddlie noted that to select a simple random sample, the technique can be as
straightforward as drawing names out of a hat or as complex as using a computerized table of
random numbers to draw a sample from the population at large.
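The "computerized" approach Tashakkori and Teddlie describe can be sketched in a few lines of Python. This is an illustrative sketch only; the roster names and sample size are assumptions, not actual program data.

```python
import random

def simple_random_sample(population, n, seed=None):
    """Draw n units without replacement so that each unit in the
    clearly defined population has an equal chance of inclusion
    (the computerized analogue of drawing names out of a hat)."""
    rng = random.Random(seed)  # seeding makes the draw reproducible
    return rng.sample(list(population), n)

# Hypothetical roster of program students (illustrative names only).
roster = [f"student_{i}" for i in range(1, 201)]
sample = simple_random_sample(roster, 30, seed=42)
print(len(sample))  # 30 sampled units
```

Fixing the seed is optional; it is useful when the evaluator needs to document and reproduce exactly how the sample was drawn.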
Data Collection Method
The data will be collected by means of the survey method. Surveys are widely used
and advantageous instruments in the educational realm for carrying out an evaluation.

The survey method can be used as a means of collecting data. Once the data are collected (by
means of the survey), the figures are then carefully assessed and analyzed.
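One simple way to assess the collected figures is to tabulate each survey item's mean rating. A minimal sketch, using abbreviated item names and made-up responses (not actual evaluation data):

```python
from statistics import mean

# Hypothetical responses on the 1 (Poor) to 5 (Good) scale;
# item labels and scores are illustrative only.
responses = {
    "Quality of faculty": [4, 5, 3, 4],
    "Academic standards": [5, 4, 4, 5],
    "Overall worth of program": [3, 4, 4, 3],
}

def mean_ratings(responses):
    """Return each survey item's mean rating, rounded to 2 decimals."""
    return {item: round(mean(scores), 2) for item, scores in responses.items()}

print(mean_ratings(responses))
```

Tabulated means like these give the evaluator a quick per-item summary before any formal statistical testing is applied.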
Analysis of Data
The data collected by means of the survey will be analyzed using a t-test.
Yamaguchi (2008) mentioned, "The t-test is a method that uses the t-distribution in a statistical
verification process. The t-distribution shows bilateral symmetry like the normal distribution and
changes in the peak of the distribution according to the number of cases" (p. 276). Additionally,
Yamaguchi stated that the t-test can also be used to verify a possible difference in average values
between two target groups, and that it distinguishes between the cases of independent
sampling and dependent sampling.
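For two independently sampled groups, the t statistic Yamaguchi describes can be computed directly. The sketch below uses only the Python standard library and implements the pooled-variance (Student's) independent-samples t-test; the cohort labels and ratings are made-up illustrative data, not results from the actual evaluation.

```python
from math import sqrt
from statistics import mean, variance

def independent_t(group_a, group_b):
    """Student's t statistic and degrees of freedom for two independent
    samples, using pooled variance (assumes roughly equal group variances)."""
    na, nb = len(group_a), len(group_b)
    ma, mb = mean(group_a), mean(group_b)
    # Pooled estimate of the common variance across both groups.
    pooled = ((na - 1) * variance(group_a) + (nb - 1) * variance(group_b)) / (na + nb - 2)
    t = (ma - mb) / sqrt(pooled * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical ratings from two cohorts (illustrative data only).
cohort_a = [3, 4, 3, 5, 4, 3]
cohort_b = [4, 5, 4, 5, 5, 4]
t, df = independent_t(cohort_a, cohort_b)
```

The resulting t value would then be compared against the t-distribution with `df` degrees of freedom to decide whether the difference in average ratings between the two groups is statistically significant.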
Possible Ethical Issues
Possible (but not probable) ethical issues implicated by the evaluation of the program
mostly include disruptions to the participants' lives. With regard to possible ethical issues in the
midst of a program evaluation, Newman and Brown (1996) mentioned "protection of human
rights, which includes issues related to confidentiality, informed consent, and participants' right
to terminate" (p. 56). Furthermore, Newman and Brown noted "freedom from political
interference, which includes pressures from interest groups and conflicts related to misuse of
evaluation data." Last but not least, Newman and Brown cited "evaluator technical competency,
which includes openness about evaluators' technical abilities and assurance of objectivity."
Strategies for Implementing Recommendations
The success or failure of the Master's Degree program in Post-Secondary Education, which
is offered by the College of Education (at Troy University), depends on the faculty (staff) and the
student population. Depending on the results of the evaluation, if the students are receiving
insufficient instruction from the instructor(s), such that the students receive low grades, the
instructor(s) will be required to take a workshop to further bolster his or her teaching skills. In
contrast, if the results of the evaluation (with regard to the Master's Degree program in
Post-Secondary Education at Troy University) indicate that students are performing poorly on
their own (through no fault of the instructor), the student(s) in the program shall be required to
take tutoring classes or college success classes to bolster their grades.
Communicating Results and Recommendations to Stakeholders
With regard to communicating results and recommendations to stakeholders, Torres and
Preskill (2004) stated, "Consider grouping recommendations into categories. When there are a
large number of recommendations, you might want to organize the recommendations into
various categories" (p. 294). Additionally, Torres and Preskill noted that, for example,
recommendations could be sorted according to program area, cost, time frame, approval or
further input needed, and organizational functions.

References

Ferlie, E., & Lynn, L. (2007). The Oxford handbook of public management. Oxford, UK: Oxford
University Press.
Guerra-López, I. (2008). Performance evaluation: Proven approaches for improving program
and organizational performance. San Francisco, CA: Jossey-Bass.
Janicak, C. (2015). Safety metrics: Tools and techniques for measuring safety performance.
Blue Ridge Summit, PA: Bernan Press.
Newman, D., & Brown, R. (1996). Applied ethics for program evaluation. Thousand Oaks, CA:
Sage Publications.
Rossi, P., & Lipsey, M. (2003). Evaluation: A systematic approach. Thousand Oaks, CA: Sage
Publications.
Tashakkori, A., & Teddlie, C. (2002). Handbook of mixed methods in social & behavioral
research. Thousand Oaks, CA: Sage Publications.
Torres, R., & Preskill, H. (2004). Evaluation strategies for communicating and reporting:
Enhancing learning in organizations. Thousand Oaks, CA: Sage Publications.
Yamaguchi, T. (2008). Practical aspects of knowledge management. Berlin, DE: Springer.
