
Portfolio Project

EDUC 765: Trends and Issues in Instructional Design
By: Lisa Berkland

Submitted February 24, 2016

PROJECT PROPOSAL MODULE 2


Project Title
Program Evaluation

Sponsoring Organization
ISU Extension and Outreach 4-H Youth Development
4-H empowers youth to reach their full potential through youth-adult
partnerships and research-based experiences.
4-H youth programs provide opportunities for youth to develop skills they
can use now and throughout their life. Iowa 4-H builds upon a century of
experience as it fosters positive youth development that is based on the
needs and strengths of youth, their families, and communities.
Iowa 4-H follows the principles of experiential learning, and draws on the
knowledge base of Iowa State University and other institutions of higher
education in cooperation with the United States Department of Agriculture.
The Iowa 4-H Program's vision and mission statements clearly view youth as
partners working with staff and volunteers, and as full participants in
planning and working for individual and community change.

Project Description
Public institutions are being held to a higher level of accountability than ever
before. To respond to funders' demands to show results, staff must do a
better job of evaluating programs and sharing their outcomes.

Aim
Improve the ability of staff to implement quality program evaluation,
including communicating program results.

Target Audience
County employed youth staff (approximately 100 staff).
ISU Extension and Outreach paid 4-H Youth Development Specialists
(approximately 30 staff). These specialists serve multiple counties around
the state. Some of them cover the entire state.

Delivery Options
Instruction for this project will likely be delivered in an online format. In a
recent survey of 4-H staff (both county and university paid), blended learning
was selected by 52% of the respondents. We do, however, tend to have
nearly a 30% yearly turnover rate among our county youth staff, so delivering
the training virtually will help keep training opportunities
available in a timely fashion. The training will, however, have to incorporate
interactive components in order to get staff to complete the course.

FRONT-END ANALYSIS: INSTRUCTIONAL NEED MODULE 3


Instructional Need
According to my supervisor, who made the assignment, we have a need for
reliable and accessible training modules on program evaluation and
communicating with public value. I think the training needs lie with county
youth staff and with ongoing professional development for youth program
specialists (whether in the field or on campus).
In further conversation, it was assumed that if staff can more frequently
implement quality evaluation, the organization will be more effective at
communicating with funders. In terms of Rossett's (1999) four
opportunities, the need for instruction on evaluation is most closely related
to employee growth, so that staff can more effectively contribute to the
organization.
Additional surveying of staff revealed more specific needs. There were 55
responses to the survey out of a total possible of approximately 160 staff.
This number is approximate because some staff levels experience heavy
turnover and the count is in constant flux. Thirty-six of the respondents
(65%) self-identified as county paid youth staff. Twelve identified as ISUEO
Youth Specialists (22%). Seven indicated "Other" as the category that best
described them. This seems fairly reflective of the distribution in our
organization.
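As a quick sanity check on the figures above, the reported percentages can be recomputed from the raw response counts. The snippet below is a minimal sketch in Python; the category labels are paraphrased from the survey and are not official designations.

```python
# Respondent counts by self-identified category, taken from the survey
# results reported above (labels paraphrased).
counts = {
    "County paid youth staff": 36,
    "ISUEO Youth Specialist": 12,
    "Other": 7,
}

total = sum(counts.values())  # 55 responses of ~160 possible staff
for category, n in counts.items():
    # Round to whole percentages, matching how the report states them.
    print(f"{category}: {n} of {total} ({100 * n / total:.0f}%)")
```

The rounded shares (65%, 22%, and roughly 13% for "Other") reconcile with the 55 total responses reported.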
Among a list of 16 aspects of evaluation, the 8 skills in which staff
self-reported the lowest skill levels were:
1. Write/contribute to a logic model.
2. Use a logic model.
3. Write/select/design the best type of evaluation approach for the
program.
4. Know when human subjects review is required.
5. Select appropriate evaluation indicators.
6. Articulate public value.
7. Connect evaluation with public value.
8. Find data that indicates program need related to public value.
Staff were also asked to identify which of the same 16 skills they were
most interested in improving. The top 8, in order, were:
1. Write/select/design the best type of evaluation approach for the
program.
2. Select appropriate evaluation indicators.
3. Find data that indicates program need related to public value.
4. Articulate public value.
5. Write/select appropriate evaluation questions.
6. Connect evaluation with public value.
7. Collect data using a variety of methods.
8. Conduct evaluation overall.
Several resources exist to support staff in their evaluation efforts. The
survey revealed that most staff were familiar with resources among our own
campus staff and on our own website. Staff were less familiar with other
university Extension resources and the National 4-H Organization, yet both
were still familiar to over 50% of respondents. The same pattern held when
staff were asked whether they had utilized those same resources.
When asked about preferred methods of training, 51% of respondents
indicated a preference for blended training, using both online and in-person
learning experiences. Twenty-four percent (24%) preferred face-to-face
training, eleven percent (11%) indicated Camtasia, and nine percent (9%)
indicated webinars as their preference. Only 5% had a preference for
exclusively online learning.
While it is clear that there is an instructional need among county and state
paid 4-H staff, the survey also revealed some non-instructional issues
related to the topic that the organization might want to consider.
1. A request for evaluation and public value help sheets was made.
2. One suggestion was to consider departmental focus on one aspect
of public value each year. Another closely related suggestion was to
select one or two major programming efforts to evaluate across the
state each year.
3. A suggestion was made that one evaluation expert be assigned to
this topic rather than expecting everyone to be proficient at it.
4. Time to allocate to evaluation and creating public value statements
was suggested as an issue/barrier.

FRONT-END ANALYSIS: LEARNER CHARACTERISTICS MODULE 3

Learner Analysis
Primary Audience
4-H Staff County Paid
4-H Staff University Paid (field or state office)
Secondary Audience
There could be some value to other Extension departments,
depending on how general the instruction ends up being.
General Learner Characteristics
Learners vary greatly in the number of hours they are paid to do their
job; anywhere from half-time to full-time employment is within the
norm. No matter the number of hours allotted, staff struggle to find
time for professional development.
Learners are predominantly white females. That is not to say that
there are no males or people of other races/ethnicities.
Learners run the age span from 22 to 65. They are adult learners
and likely express many adult learner characteristics:
o They are self-directed.
o They need feedback.
o They need opportunities for interaction with others, not just the
facilitator.
o They need the learning to be immediately applicable.
Turnover among county youth staff runs 30-40% annually. University
(state) paid youth staff are more stable, but several positions that
have been on hold are now being filled. Staff range in years of
experience from 0 to 30+.
Budgets for county offices vary greatly. Some are extremely tight
and others not quite so much, i.e., training fees must be reasonable
if the goal is to reach a majority of staff. Bringing staff together at a
central location for training is an added expense, as counties pay
both mileage and staff time while staff are on the road, and some
would have to travel over 4 hours one way.
Educational attainment of staff ranges from high school graduate to
Master's plus, although it is not uncommon today for county staff
positions to list a Bachelor's degree as preferred.
Entry Characteristics

Many offices that staff work in have only 2-3 staff members. Finding
time for learning without interruption can be extremely difficult.
County staff are supervised by elected Extension Councils, which
are 9-member boards. These boards and their staff may or may not
align with State 4-H Leadership. Alignment, therefore, can be an
issue, causing support for participation in state-level training to vary.
In a recent survey, 74% of staff indicated that they fully understand
why we must evaluate our work. An additional 24% indicated having
a basic understanding of why we evaluate our work. Only 2%
indicated they really don't understand why we evaluate our work.
Knowing that the large majority of staff already know why we
evaluate our work is an important foundation/prerequisite for this
instruction.

Contextual Analysis
Orienting Context
Goals of the learners will include increasing their skill level with
evaluation and learning something that will be useful to them in
performing their job.
While learners understand why evaluation is a necessary part of their
job, there is some resistance to it. An example is expressed in this
comment by a staff member: "Perhaps it shouldn't be expected that
everyone be an expert at doing this, and that there are designated
folks that have this responsibility that are available to county, field
staff and can ultimately work with those staff to complete a high level
evaluation/public value statement." The usability of the content of this
instruction will need to be proven early and often or staff members
won't complete the instruction.
Some staff members will be content with an overview of evaluation,
while others will desire a greater depth of knowledge.
Staff members paid by County Extension Councils are held accountable
by those councils. They are typically held less accountable for
professional development than University paid staff. Depending on the
value administration perceives in the training, my department may
mandate participation or allow staff to self-select into various staff
development opportunities. University employed staff are required to
evaluate their work and provide 2 success stories communicating
public value each year.
Some learners may believe that post program surveys are the only way
to collect evaluation data.
Information on evaluation is already readily available to staff.
Unfortunately, staff rarely take the time on their own to seek it out
and refine their skills.

Instructional Context
It is difficult to find a date in our organization that works for everyone.
Summer months are usually avoided for professional development of
4-H staff due to county fair season.
Lighting: This varies by office location. Natural and artificial lighting
may be available. For any face to face part of the instruction, there are
usually nice meeting rooms available with necessary lighting controls,
especially if using the state 4-H Building meeting room.
Room arrangement: Again, this varies by office. Some staff have
offices with full walls and doors that can be closed for quiet and
privacy. Other staff have cubicle-type offices, where distraction from
noise by other staff and clientele is highly likely.
Temperature: Most offices have reasonably controlled temperature
settings.
If staff are participating from their desk location, the chair is likely a
typical office chair on rollers.
For virtual learning, accommodations are less of an issue.
All ISU Extension offices meet at least a minimum requirement for Wi-Fi
service. They are all connected to a statewide server. Internet is
typically quite fast. All staff have an individually assigned computer in
their office. Computers are of the PC type and have, at a minimum,
Microsoft Office Professional loaded and access to Internet Explorer,
Firefox, or Chrome browsers.
Obviously transportation is not an issue for the virtual part of the
training.
Technology Inventory
All staff are provided computers upon employment. Support for these
computers comes from the Extension IT department, which is usually
very efficient. All computers are loaded with Microsoft Office products
including Outlook. Staff are comfortable with webinars using Connect.
Lynda tutorials are available to staff.
Transfer Context
Instruction on evaluation should easily be transferable to any type of
program staff desire to evaluate. Tools and techniques supplied to the
learner should be readily usable.
County paid staff are involved in programming with either youth or
adult audiences on a weekly basis. Typically we encourage evaluation
on larger, significant programming efforts. Staff usually have these
opportunities at least 4-6 times per year. University staff are not
usually involved in direct delivery of content, but rather consult with
county staff. They are the most likely ones to create program
evaluations and analyze the collected data.

There is administrative support to transfer knowledge to programming,


especially for University paid staff. Once elected officials get used to
seeing results communicated to them, they will likely be even more
supportive of their (county) staff doing it.

INSTRUCTIONAL IMPACT BASED UPON LEARNER CHARACTERISTICS

Application of Learning Theories
Since this audience is an adult audience, adult learning theories do apply.
1. Staff will not commit to participating in a learning experience unless
they are either mandated to participate or they see the objectives up
front and are convinced the instruction will be worth their time.
2. Staff in my organization also want to know the benefits of the
instruction. The benefits can be either personal or professional.
3. While staff understand that sometimes there are technology glitches,
they do expect learning to begin and end on time. They are always
grateful if instruction ends a bit early, but some will feel slighted if it
ends too early.
4. The learners may not expect the facilitator to know everything, but
they do expect the facilitator to be prepared. An unprepared facilitator
will lose an audience very quickly in my organization.
5. As in most adult learning environments, the learners come in with a
variety of experiences and knowledge. Allowing them to share that will
have great appeal.
6. Our staff is used to facilitative approaches to learning. An instructor
who comes across too directive will be labeled a dictator and avoided.
7. With the variety of educational attainment and experience among our
staff, it isn't unusual to have staff who are extremely knowledgeable and
operate at a high level of professionalism. Giving them opportunities
to make choices throughout the instruction will increase their
perceived value of it.
8. If our staff are asked to make changes, they will resist things that they
don't understand or that contradict past practices without clear
rationale. Attempts to influence change with weak logic or biased data
presented as unbiased will result in a lack of trust and minimal effort at
change.
9. And lastly, how we do love to hear ourselves talk! Small groups will be
beneficial, whether online or in the classroom in order to get every
voice heard.

Application of Motivational Theories

In applying Keller's ARCS model, the following components need to be
considered in the instructional design:
Attention: Use inquiry arousal to stimulate curiosity by posing an initial
situation that has relevance but also demonstrates a need to learn.
This will grab the initial attention of the learner. In order to keep the
learner's attention, a variety of instructional strategies will need to be
employed.

Relevance: Use of examples that learners can relate to will be
important to the learning. Application of the learning to an actual
program that learners are working with will be helpful, as well.
Confidence: Opportunities for practice and feedback will be critical for
staff to be able to increase their level of skill with evaluation.
Satisfaction: Sustaining skills learned will be important for learner
satisfaction. Exploration of setting up formal and informal feedback
and support systems would be helpful. This will need to be discussed
with administration and the professional development department to
determine what long-term support might be available.

Impact of a Diverse Audience on Instruction


Cultural diversity is currently quite limited in our organization. Most diversity
comes by way of age, educational attainment, and previous work
experience. Educational level and content knowledge will probably have the
most impact on this instruction. For example, one would expect that staff
who have Master's degrees will be more familiar with evaluation tools and
analysis than those who have limited post-high school education.
Attempting to make instruction meet the needs of both audiences may
require test-out options or more self-directed learning options.

TASK/GOAL/PERFORMANCE ANALYSIS MODULE 5


Goal Analysis
Instructional Need/Aim
Improve the ability of staff to implement quality program evaluation,
including communicating program results.

Step 1 - Write down the goals.
Original goals:
Staff are able to implement effective program evaluation.
Step 2 - Write down everything a learner would have to say or do for you
to agree that the learner has achieved the goal. This is not a list of what
you will need to do as the instructional designer or teacher.

Describe the program.
Define the purpose of the program.
Determine use/users.
Determine key questions.
Select Indicators.
Determine evaluation design.
Identify sources of data.
Select data collection methods.
Set the collection schedule.
Pilot test the evaluation.
Collect data.
Process the data.
Analyze the data.
Interpret the data.
Identify learnings.
Acknowledge limitations.
Share findings and lessons learned.
Use in decision making.
Determine next programming steps.


Step 3 - Sort the items listed in step 2.

FOCUS
Describe the program.
Define the purpose of the program.
Determine use/users.
Determine key questions.
Select Indicators.
Determine evaluation design.
COLLECT DATA
Identify sources of data.
Select data collection methods.
Set the collection schedule.
Pilot test the evaluation.
Collect data.
ANALYZE & INTERPRET
Process the data.
Analyze the data.
Interpret the data.
Identify learnings.
Acknowledge limitations.
APPLICATION
Share findings and lessons learned.
Use in decision making.
Determine next programming steps.
Step 4 - Write a complete sentence to describe each of the items on your
final list.

FOCUS - Staff will answer key questions that will bring focus to their
evaluation efforts.
Describe the program.
Define the purpose of the program.
Determine use/users.
Determine key questions.
Select Indicators.
Determine evaluation design.
COLLECT DATA - Staff will develop and implement a plan for collecting
data.
Identify sources of data.
Select data collection methods.
Set the collection schedule.
Pilot test the evaluation.
Collect data.
ANALYZE & INTERPRET - Staff will analyze and interpret collected
data.


Process the data.
Analyze the data.
Interpret the data.
Identify learnings.
Acknowledge limitations.
APPLICATION - Staff will apply key learnings to make program
decisions and determine next programming steps.
Share findings and lessons learned.
Use evaluation results in decision making.
Determine next programming steps.

INSTRUCTIONAL OBJECTIVES MODULE 5


Project (Instructional) Goal
The goal of this course is for County and University 4-H staff to fully utilize
the program evaluation process by focusing their evaluations, collecting
data, analyzing and interpreting data and using their findings.

Terminal Objectives and Enabling Objectives


Terminal Objective: Using a self-selected topic, staff will define and focus the
program evaluation, resulting in selection or development of indicators and
determination of evaluation design. Criteria:
Indicators and evaluation design decisions are congruent with the
articulated focus of the evaluation. (Cognitive)
Enabling Objectives:
Using a self-selected topic, staff will describe in one sentence what they
intend to evaluate from a list of possibilities. Examples of those possibilities
include: logistics, teaching style, satisfaction, behavioral changes, outcomes,
impacts. (Cognitive)

Using a self-selected topic, staff will define the purpose of the program
evaluation and record it on their worksheet. Criteria: the definition will be
clear to other participants. Ex. Evaluating for program improvement, to meet
funder requirements, to improve teaching skills, etc. (Cognitive)

Using a self-selected topic, staff will identify potential users of the evaluation
as well as how those users might utilize the information and record it on their
worksheet. Criteria:
o Users should have a stake in the outcome of the evaluation.
o Includes what those users might want to know.
o Includes where information about users was procured. (Cognitive)

Using a self-selected topic, staff will list questions to be answered by the
evaluation and record those questions on their worksheet. Criteria:
o Questions are directly related to the purpose of the evaluation stated
earlier.
o Questions will supply answers needed for the intended end-user of
evaluation results. (Cognitive)

Using a self-selected topic, staff will determine indicators of
accomplishments, changes or progress that can be measured. Criteria:
o Indicators selected are directly related to the evaluation purpose,
end-users and questions identified previously.
o Indicators are measurable. (Cognitive)

Using a self-selected topic, staff will select qualitative, quantitative, or both
for their evaluation design and record their decision on their worksheet along
with a brief explanation for their choice. Criteria:
o The explanation should logically connect the evaluation design choice
to the purpose, end-users, questions, and indicators previously
selected. (Cognitive)

Terminal Objective: Using a self-selected topic, staff will create a plan for
collecting data that identifies the sources, method and schedule for collecting
data. The plan will also articulate the pilot testing process. Criteria:
o The plan is manageable and within the workload capacity of the
organization and staff.
o The plan is fiscally manageable, i.e., the funds are available to carry
through on the plan.
o The plan includes data sources, specific methodology and a timetable
for data collection.
o The pilot testing process addresses who, when and how the evaluation
will be tested. (Cognitive)


Terminal Objective: Using data provided, staff will analyze and interpret the
data, resulting in articulated conclusions, recommendations and limitations.
Criteria:
o Analysis and interpretation logically follow from the collected data.
o Conclusions and recommendations directly connect with the analysis
and interpretation of the data.
o Limitations correspond to the data collection or analysis methodology.
(Cognitive)

Terminal Objective: Using the conclusions, recommendations and limitations
identified previously, staff will develop a plan to use the evaluation results.
Criteria:
o The plan identifies potential ways to share findings.
o The plan uses findings in decision-making.
o The plan articulates next steps for programming. (Cognitive)

ENABLING OBJECTIVES MATRIX & SUPPORTING CONTENT MODULE 6

The matrix below lists each enabling objective with its level on Bloom's
Taxonomy*; its content type (fact, concept, principle, rule, procedure,
interpersonal, or attitude); the learner activity (what learners would do to
master the objective); and the delivery method (group presentation/lecture,
self-paced, or small group).

Enabling Objective 1: Using a self-selected topic, staff will describe
(indicate) in one sentence what they intend to evaluate from a list of
possibilities. Examples of those possibilities include: logistics, teaching
style, satisfaction, behavioral changes, outcomes, impacts. (Cognitive)
Level on Bloom's Taxonomy: Comprehension
Content type: Procedure
Learner Activity: Learners will explore different possibilities for what might
be evaluated (ex. Prezi). Then learners will identify a program they would
like to evaluate and indicate on their Evaluation worksheet what they want
to evaluate.
Delivery Method: Self-paced

Enabling Objective 2: Using a self-selected topic, staff will define (explain)
the purpose of the program evaluation and record it on their worksheet.
Criteria: the definition will be clear to other participants. Ex. Evaluating for
program improvement, to meet funder requirements, to improve teaching
skills, etc. (Cognitive)
Level on Bloom's Taxonomy: Comprehension
Content type: Procedure
Learner Activity: Learners will examine various purposes for evaluation.
Examples of a variety of purposes will be shared. They will select a purpose
and record it on their worksheet. Then they will share with a partner/mentor
to receive feedback on clarity.
Delivery Method: Self-paced

Enabling Objective 3: Using a self-selected topic, staff will identify potential
users of the evaluation as well as how those users might utilize the
information and record it on their worksheet. Criteria: Users should have a
stake in the outcome of the evaluation. Includes what those users might
want to know. Includes where information about users was procured.
(Cognitive)
Level on Bloom's Taxonomy: Application
Content type: Procedure
Learner Activity: Learners will watch a short video clip that describes the
importance of keeping the end user in mind. Learners then brainstorm a list
of end users for their evaluation project and fill in the worksheet.
Delivery Method: Self-paced

Enabling Objective 4: Using a self-selected topic, staff will list (create a list
or formulate) questions to be answered by the evaluation and record those
questions on their worksheet. Criteria: Questions are directly related to the
purpose of the evaluation stated earlier. Questions will supply answers
needed for the intended end-user of evaluation results. (Cognitive)
Level on Bloom's Taxonomy: Synthesis
Content type: Procedure
Learner Activity: Learners will read about creating questions to be answered
by the evaluation. Examples will be shared in a case study format. Then the
learners will create their own list of questions.
Delivery Method: Self-paced

Enabling Objective 5: Using a self-selected topic, staff will determine
(select/develop) indicators of accomplishments, changes or progress that
can be measured. Criteria: Indicators selected are directly related to the
evaluation purpose, end-users and questions identified previously.
Indicators are measurable. (Cognitive)
Level on Bloom's Taxonomy: Application/Synthesis
Content type: Procedure
Learner Activity: Indicators will be defined. Examples of indicators will be
provided. A quiz with immediate feedback on measurable and
non-measurable indicators will be given. Resources for indicators will be
shared. Learners will either select indicators for their evaluation project or
develop appropriate indicators.
Delivery Method: Self-paced

Enabling Objective 6: Using a self-selected topic, staff will select qualitative,
quantitative, or both for their evaluation design and record their decision on
their worksheet along with a brief explanation for their choice. Criteria: The
explanation should logically connect the evaluation design choice to the
purpose, end-users, questions, and indicators previously selected.
(Cognitive)
Level on Bloom's Taxonomy: Application
Content type: Procedure
Learner Activity: Definitions of qualitative and quantitative evaluation will be
given with benefits/reasons for each. Learners will finish filling out the focus
section of their worksheet and share the worksheet with the instructor for
feedback.
Delivery Method: Self-paced

REFERENCES
Taylor-Powell, Ellen, Steele, Sara, & Douglah, Mohammad. (1996). Planning a
Program Evaluation. University of Wisconsin-Extension, Madison, WI.

Taylor-Powell, Ellen, Steele, Sara, & Douglah, Mohammad. (2006). Planning a
Program Evaluation: Worksheet. University of Wisconsin-Extension, Madison, WI.

