
Evaluation of Education Development Projects

Barb Anderegg, Connie Della-Piana, and Russ Pimmel
National Science Foundation

FIE Conference
October 29, 2006

Caution

The information in these slides represents the opinions of the individual program offices and not an official NSF position.

Framework for the Session

• Learning situations involve prior knowledge
  – Some knowledge correct
  – Some knowledge incorrect (i.e., misconceptions)
• Learning is
  – Connecting new knowledge to prior knowledge
  – Correcting misconceptions
• Learning requires
  – Recalling prior knowledge – actively
  – Altering prior knowledge

Active-Cooperative Learning

• Learning activities must encourage learners to:
  – Recall prior knowledge – actively, explicitly
  – Connect new concepts to existing ones
  – Challenge and alter misconceptions
• The think-share-report-learn (TSRL) process addresses these steps

Session Format

• "Working" session
  – Short presentations (mini-lectures)
  – Group exercise
• Exercise format
  – Think → Share → Report → Learn (TSRL)
• Limited time – may feel rushed
  – Intend to identify issues & suggest ideas
  – Get you started
• No closure – no "answers", no "formulas"

Group Behavior Session Goals

• Be positive, supportive, and cooperative The session will enable you to collaborate
– Limit critical or negative comments with evaluation experts in preparing
• Be brief and concise effective project evaluation plans
– No lengthy comments
• Stay focused
– Stay on the subject It will not make you an evaluation expert
• Take turns as recorder
– Report for group not your own ideas

Session Outcomes

After the session, participants should be able to:
• Discuss the importance of goals, outcomes, and questions in the evaluation process
  – Cognitive, affective, and achievement outcomes
• Describe several types of evaluation tools
  – Advantages, limitations, and appropriateness
• Discuss data interpretation issues
  – Variability, alternate explanations
• Develop an evaluation plan with an evaluator
  – Outline a first draft of an evaluation plan

Evaluation and Project Goals/Outcomes/Questions

Evaluation and Assessment

• Evaluation (assessment) has many meanings
  – Individual's performance (grading)
  – Program's effectiveness (ABET accreditation)
  – Project's progress or success (monitoring and validating)
• Session addresses project evaluation
  – May involve evaluating individual and group performance – but in the context of the project
• Project evaluation
  – Formative – monitoring progress
  – Summative – characterizing final accomplishments

Evaluation and Project Goals/Outcomes

• Evaluation starts with carefully defined project goals/outcomes
• Goals/outcomes related to:
  – Project management
    • Initiating or completing an activity
    • Finishing a "product"
  – Student behavior
    • Modifying a learning outcome
    • Modifying an attitude or a perception

Developing Goals & Outcomes

• Start with one or more overarching statements of project intention
  – Each statement is a goal
• Convert each goal into one or more specific expected measurable results
  – Each result is an outcome

Goals – Objectives – Outcomes – Questions

• Converting goals to outcomes may involve intermediate steps
  – Intermediate steps frequently called objectives
    • More specific, more measurable than goals
    • Less specific, less measurable than outcomes
• Outcomes (goals) lead to questions
  – These form the basis of the evaluation
  – Evaluation process collects and interprets data to answer evaluation questions

Definition of Goals, Objectives, and Outcomes

Goal – Broad, overarching statement of intention or ambition
  – A goal typically leads to several objectives

Objective – Specific statement of intention
  – More focused and specific than a goal
  – An objective may lead to one or more outcomes

Outcome – Statement of expected result
  – Measurable, with criteria for success

NOTE: There is no consistent definition of these terms.

Exercise #1: Identification of Goals/Outcomes

• Read the abstract
  – Note: the goal statement has been removed
• Suggest two plausible goals
  – One focused on a change in learning
  – One focused on a change in some other aspect of student behavior

Abstract

The goal of the project is …… The project is developing computer-based instructional modules for statics and mechanics of materials. The project uses 3D rendering and animation software, in which the user manipulates virtual 3D objects in much the same manner as they would physical objects. Tools being developed enable instructors to realistically include external forces and internal reactions on 3D objects as topics are being explained during lectures. Exercises are being developed for students to be able to communicate with peers and instructors through real-time voice and text interactions. The project is being evaluated by … The project is being disseminated through … The broader impacts of the project are …

PD's Response – Goals

• Goals may focus on
  – Cognitive behavior
  – Affective behavior
  – Success rates
  – Diversity
    • Cognitive, affective, or success in targeted subgroups

PD's Response – Goals on Cognitive Behavior

GOAL: To improve understanding of
– Concepts & application in course
  • Solve textbook problems
  • Draw free-body diagrams for textbook problems
  • Describe verbally the effect of external forces on a solid object
– Concepts & application beyond course
  • Solve out-of-context problems
  • Visualize 3-D problems
  • Communicate technical problems orally

PD's Response – Goals on Affective Behavior

GOAL: To improve
– Interest in the course
– Attitude about
  • Profession
  • Curriculum
  • Department
– Self-confidence
– Intellectual development

PD's Response – Goals on Success Rates

• Goals on achievement rate changes
  – Improve
    • Recruitment rates
    • Retention or persistence rates
    • Graduation rates
• "Broaden the participation of underrepresented groups"

PD's Response – Goals on Diversity

GOAL: To increase a target group's
– Understanding of concepts
– Achievement rate
– Attitude about profession
– Self-confidence

Exercise #2: Transforming Goals into Outcomes

Write one expected measurable outcome for each of the following goals:

1. Increase the students' understanding of the concepts in statics

2. Improve the students' attitude about engineering as a career

PD's Response – Outcomes

Conceptual understanding
• Students will be better able to solve simple conceptual problems that do not require the use of formulas or calculations
• Students will be better able to solve out-of-context problems

Attitude
• Students will be more likely to describe engineering as an exciting career
• The percentage of students who transfer out of engineering after the statics course will decrease

Exercise #3: Transforming Outcomes into Questions

Write a question for these expected measurable outcomes:

1. Students will be better able to solve simple conceptual problems that do not require the use of formulas or calculations

2. In informal discussions, students will be more likely to describe engineering as an exciting career

PD's Response – Questions

Conceptual understanding
• Did the students' ability to solve simple conceptual problems increase?
• Did the students' ability to solve simple conceptual problems increase because of the use of the 3D rendering and animation software?

PD's Response – Questions (Cont.)

Attitude
• Did the students' discussions indicate more excitement about engineering as a career?
• Did the students' discussions indicate more excitement about engineering as a career because of the use of the 3D rendering and animation software?

Tools for Evaluating Learning Outcomes

Evaluation Tools

• Tool characteristics
  – Advantages and disadvantages
  – Suitability for some evaluation questions but not for others

Examples of Tools for Evaluating Learning Outcomes

• Surveys
  – Forced-choice or open-ended responses
• Interviews
  – Structured (fixed questions) or in-depth (free flowing)
• Focus groups
  – Like interviews, but with group interaction
• Observations
  – Actually monitor and evaluate behavior

Olds et al., JEE 94:13, 2005
NSF's Evaluation Handbook

Example – Comparing Surveys and Observations

Surveys
• Efficient
• Accuracy depends on subject's honesty
• Difficult to develop reliable and valid surveys
• Low response rate threatens reliability, validity, & interpretation

Observations
• Time & labor intensive
• Inter-rater reliability must be established (see the sketch below)
• Captures behavior that subjects are unlikely to report themselves
• Useful for observable behavior

Olds et al., JEE 94:13, 2005

Example – Appropriateness of Interviews

• Use interviews to answer these questions:
  – What does the program look and feel like?
  – What do stakeholders know about the project?
  – What are stakeholders' and participants' expectations?
  – What features are most salient?
  – What changes do participants perceive in themselves?

The 2002 User Friendly Handbook for Project Evaluation, NSF publication REC 99-12175
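The comparison above notes that inter-rater reliability must be established before observation data can be trusted. The slides do not prescribe a statistic, so as a minimal sketch, the code below computes Cohen's kappa, one common choice, for two observers; the observation codes and both raters' data are hypothetical.

```python
# Minimal sketch: Cohen's kappa for two observers who coded the same
# ten classroom episodes. All data here are hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

a = ["on-task", "on-task", "off-task", "on-task", "question",
     "on-task", "off-task", "on-task", "question", "on-task"]
b = ["on-task", "off-task", "off-task", "on-task", "question",
     "on-task", "off-task", "on-task", "on-task", "on-task"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.62 here; ~0.6+ is often read as substantial
```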

Concept Inventories (CIs)

Introduction to CIs

• Measures conceptual understanding
• Series of multiple-choice questions
  – Questions involve a single concept
    • Formulas, calculations, or problem solving not required
  – Possible answers include "detractors"
    • Common errors
    • Reflect common "misconceptions"
Introduction to CIs (Cont.)

• First CI focused on mechanics in physics
  – Force Concept Inventory (FCI)
• FCI has changed how physics is taught

The Physics Teacher 30:141, 1992
Optics and Photonics News 3:38, 1992

Sample CI Questions

H2O is heated in a sealed, frictionless, piston-cylinder arrangement, where the piston mass and the atmospheric pressure (Patm) above the piston remain constant. Select the best answers.

1. The density of the H2O will:
   (a) Increase (b) Remain constant (c) Decrease
2. The pressure of the H2O will:
   (a) Increase (b) Remain constant (c) Decrease
3. The energy of the H2O will:
   (a) Increase (b) Remain constant (c) Decrease

Other Concept Inventories

• Existing concept inventories
  – Chemistry
  – Statistics
  – Strength of materials
  – Thermodynamics
  – Heat transfer
  – Fluid mechanics
  – Circuits
  – Signals and systems
  – Electromagnetic waves
  – Etc.

Richardson, in Invention and Impact, AAAS, 2004

Developing Concept Inventories

• Developing a CI is involved
  – Identify difficult concepts
  – Identify misconceptions and detractors
  – Develop and refine questions & answers
  – Establish validity and reliability of the tool
  – Deal with ambiguities and multiple interpretations inherent in language

Richardson, in Invention and Impact, AAAS, 2004

Exercise #4: Evaluating a CI Tool

• Suppose you were considering an existing CI for use in your project's evaluation
• What questions would you consider in deciding if the tool is appropriate?

PD's Response – Evaluating a CI Tool

• Nature of the tool
  – Is the tool relevant to what was taught?
  – Is the tool competency based?
  – Is the tool conceptual or procedural?
• Prior validation of the tool
  – Has the tool been tested?
  – Is there information on reliability and validity?
  – Has it been compared to other tools?
  – Is it sensitive? Does it discriminate between novice and expert?
• Experience of others with the tool
  – Has the tool been used by others besides the developer? At other sites? With other populations?
  – Is there normative data?

Tools for Evaluating Affective Factors

Affective Goals

GOAL: To improve
– Perceptions about
  • Profession, department, working in teams
– Attitudes toward learning
– Motivation for learning
– Self-efficacy, self-confidence
– Intellectual development
– Ethical behavior

Exercise #5: Tools for Affective Outcomes

Suppose your project's outcomes included:

1. Improving perceptions about the profession

2. Improving intellectual development

Answer the two questions for each outcome:
• Do you believe that established, tested tools (i.e., vetted tools) exist?
• Do you believe that quantitative tools exist?

PD Response – Tools for Affective Outcomes

• Both qualitative and quantitative tools exist for both measurements

Assessment of Attitude – Example

• Pittsburgh Freshman Engineering Survey
  – Questions about perception
    • Confidence in their skills in chemistry, communications, engineering, etc.
    • Impressions about engineering as a precise science, as a lucrative profession, etc.
  – Forced choices versus open-ended
    • Multiple-choice

Besterfield-Sacre et al., JEE 86:37, 1997

Assessment of Attitude – Example (Cont.)

• Validated using alternate approaches:
  – Item analysis (see the sketch below)
  – Verbal protocol elicitation
  – Factor analysis
• Compared students who stayed in engineering to those who left

Besterfield-Sacre et al., JEE 86:37, 1997
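The validation steps above include item analysis. As a minimal sketch of one familiar item-level statistic, the code below computes Cronbach's alpha for internal consistency of a block of forced-choice attitude items; the 5-point Likert data matrix and the 0.7 benchmark are illustrative assumptions, not figures from the Pittsburgh survey.

```python
# Minimal sketch: Cronbach's alpha for a block of forced-choice attitude
# items (rows = students, columns = items). All data are hypothetical.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal consistency of a set of survey items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of students' total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")  # 0.94 for this toy data; >= 0.7 is a common benchmark
```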

Tools for Characterizing Intellectual Development

• Levels of intellectual development
  – Students see knowledge, beliefs, and authority in different ways
    • "Knowledge is absolute" versus "Knowledge is contextual"
• Tools
  – Measure of Intellectual Development (MID)
  – Measure of Epistemological Reflection (MER)
  – Learning Environment Preferences (LEP)

Felder et al., JEE 94:57, 2005

Evaluating Skills, Attitudes, and Characteristics

• Tools exist for evaluating
  – Communication capabilities
  – Ability to engage in design activities
  – Perception of engineering
  – Beliefs about abilities
  – Intellectual development
  – Learning styles
• Both qualitative and quantitative tools exist

Turns et al., JEE 94:27, 2005

Interpreting Evaluation Data

Exercise #6: Interpreting Evaluation Data

Consider the percentages for Concepts #1, #2, and #3 (table below) and select the best answer for the following statements for each question:

1. The concept tested by the question was:
   (a) easy (b) difficult (c) can't tell

2. Understanding of the concept tested by the question:
   (a) decreased (b) increased (c) can't tell

Interpreting Evaluation Data

Quest.   No. of Students      Percent with Correct Answer
         Pre      Post        Pre       Post
1        25       30          29%       23%
2        24       32          34%       65%
3        25       31          74%       85%

PD's Response – Interpreting Data

• A CI does not measure difficulty
• Probably no change in understanding of Concepts #1 and #3
• Probably an increase in understanding of Concept #2
  – Large variability makes detecting changes difficult
  – 25% is the expected value from random guessing
  – There are statistical tests for identifying significant changes (see the sketch below)
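The slide says only that statistical tests exist for identifying significant changes, without naming one. A minimal sketch of one reasonable choice, a chi-square test on a 2x2 pre/post contingency table, follows; the correct-answer counts are reconstructed from the rounded percentages in the table above, so they are approximate.

```python
# Minimal sketch: is the pre-to-post change in percent correct significant?
# Counts are reconstructed from the slide's rounded percentages.
from scipy.stats import chi2_contingency

def pre_post_change(pre_correct, pre_n, post_correct, post_n):
    """Chi-square test on a 2x2 table of correct/incorrect, pre vs. post."""
    table = [
        [pre_correct, pre_n - pre_correct],     # pre:  correct, incorrect
        [post_correct, post_n - post_correct],  # post: correct, incorrect
    ]
    chi2, p, dof, expected = chi2_contingency(table)
    return p

# Concept #2: roughly 8/24 (34%) pre vs. 21/32 (65%) post
print(pre_post_change(8, 24, 21, 32))   # p < 0.05: likely a real gain
# Concept #1: roughly 7/25 (29%) pre vs. 7/30 (23%) post
print(pre_post_change(7, 25, 7, 30))    # p >> 0.05: can't tell
```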

Exercise #7: Alternate Explanation for Change

• Data suggest that the understanding of Concept #2 increased
• One interpretation is that the intervention caused the change
• List some alternative explanations
  – Confounding factors
  – Other factors that could explain the change

PD's Response – Alternate Explanations for Change

• Students learned the concept out of class (e.g., in another course or in study groups with students not in the course)
• Students answered with what the instructor wanted rather than what they believed or "knew"
• An external event (a big test in the previous period or a "bad-hair day") distorted pretest data
• Instrument was unreliable
• Other changes in the course, and not the intervention, caused the improvement
• Student groups were not representative

Exercise #8: Alternate Explanation for Lack of Change

• Data suggest that the understanding of Concept #1 did not increase
• One interpretation is that the intervention did cause a change but it was masked by other factors
• List some confounding factors that could have masked a real change

PD's Response – Alternate Explanations for Lack of Effect

• An external event (a big test in the previous period or a "bad-hair day") distorted post-test data
• The instrument was unreliable
• Implementation of the intervention was poor
• Population was too small
• One or both student groups were not representative
• Formats were different on pre- and post-tests

Exercise #9: Evaluation Plan


• Suppose that a project’s goals are to
improve:
1. The students’ understanding of the
Evaluation Plan concepts in statics
2. The students’ attitude about
engineering as a career

• List the topics that you would address in


the evaluation plan

Evaluation Plan – PD's Responses

• Name & qualifications of the evaluation expert
• Goals and outcomes and evaluation questions
• Tools & protocols for evaluating each outcome
• Analysis & interpretation procedures
• Confounding factors & approaches for minimizing their impact
• Formative evaluation techniques for monitoring and improving the project as it evolves
• Summative evaluation techniques for characterizing the accomplishments of the completed project

Working With an Evaluator

What Your Evaluation Can Accomplish

Provide reasonably reliable, reasonably valid information about the merits and results of a particular program or project operating in particular circumstances

• Generalizations are tenuous
• Evaluation
  – Tells what you accomplished
    • Without it you don't know
  – Gives you a story (data) to share

Perspective on Project Evaluation

• Evaluation is complicated & involved
  – Not an end-of-project "add-on"
• Evaluation requires expertise
• Get an evaluator involved EARLY
  – In the proposal-writing stage
  – In conceptualizing the project

Exercise #10: Evaluator Questions

• List two or three questions that an evaluator would have for you as you begin working together on an evaluation plan

Finding an Evaluator

• Other departments
  – Education, educational psychology, psychology, administration, sociology, anthropology, science or mathematics education, engineering education
• Campus teaching and learning center
• Colleagues and researchers
• Professional organizations
• Independent consultants
• NSF sessions or projects

Question: Internal or external evaluator?

PD Response – Evaluator Questions

Project issues
– What are the goals and the expected measurable outcomes?
– What are the purposes of the evaluation?
– What do you want to know about the project?
– What is known about similar projects?
– Who is the audience for the evaluation?
– What can we add to the knowledge base?

PD Response – Evaluator Questions (Cont.)

Operational issues
– What are the resources?
– What is the schedule?
– Who is responsible for what?
– Who has final say on evaluation details?
– Who owns the data?
– How will we work together?
– What are the benefits for each party?
– How do we end the relationship?

Preparing to Work With an Evaluator

• Become knowledgeable
  – Draw on your experience
  – Talk to colleagues
• Clarify purpose of project & evaluation
  – Project's goals and outcomes
  – Questions for evaluation
  – Usefulness of evaluation
• Anticipate results
  – Confounding factors

Working With an Evaluator

Talk with the evaluator about your idea (from the start)
– Share the vision
Become knowledgeable
– Discuss past and current efforts
Define project goals, objectives, and outcomes
– Develop project logic
Define purpose of evaluation
– Develop questions
– Focus on implementation and outcomes
– Stress usefulness

Culturally Responsive Evaluations

• Cultural differences can affect evaluations
• Evaluations should be done with awareness of the cultural context of the project
• Evaluations should be responsive to
  – Racial/ethnic diversity
  – Gender
  – Disabilities
  – Language

Working With an Evaluator (Cont.)

Anticipate results
– List expected outcomes
– Plan for negative findings
– Consider possible unanticipated positive outcomes
– Consider possible unintended negative consequences
Interacting with the evaluator
– Identify benefits to the evaluator (e.g., career goals)
– Develop a team orientation
– Assess the relationship

Example of Evaluator's Tool – Project Logic Table

The project logic table links:
• Goals
• Objectives
• Activities
• Outputs & outcomes
• Measures & methods

What do I want to know about my project?

Goals | Objectives | Activities | Outputs/Outcomes | Measures
(a)   |            |            |                  |
(b)   |            |            |                  |

Human Subjects and the IRB

• Projects that collect data from or about students or faculty members involve human subjects
• Institution must submit one of these
  – Results from IRB review on the proposal's cover sheet
  – Formal statement from an IRB representative (not the PI) declaring the research exempt
  – IRB approval form
• See the "Human Subjects" section in the NSF Grant Proposal Guide (GPG)

Other Sources

• NSF's User Friendly Handbook for Project Evaluation
  – http://www.nsf.gov/pubs/2002/nsf02057/start.htm
• Online Evaluation Resource Library (OERL)
  – http://oerl.sri.com/
• Field-Tested Learning Assessment Guide (FLAG)
  – http://www.wcer.wisc.edu/archive/cl1/flag/default.asp
• Science education literature
