
Chapter 14: Assessing Learning Outcomes

Objectives:
1. Define traditional, alternative, authentic, and
performance assessments

2. Describe effective classroom assessment programs

3. Describe various methods of science assessment

4. Explain how to score assessments and assign grades


I. What Is Assessment?
A. Plays a critical role in education
1. Teachers plan to ensure students do well on assessments
2. Students are motivated to learn content to do well on assessments

B. Science Reform and Assessment
1. Traditional assessment = paper-and-pencil tests
2. Hein and Price (1994) identified reasons for assessment reform
a. Tests reveal what students don’t know; can’t assess all outcomes
b. Science education is now less about content, more about literacy
c. Accountability of schools
d. We know more about how students learn: inquiry learning is better

C. Contemporary Science Assessment
1. Alternative formats: portfolios, journaling, concept mapping, etc.
2. Authentic: real-world situations; doing what real scientists do
3. Performance: hands-on or creative tasks, rather than regurgitation
4. Choosing an appropriate assessment is a challenging task
D. Other important assessment concepts
1. Diagnostic assessment: used to find out what students know before beginning a unit; may include interviews, journals, pre-tests, etc.
2. Formative assessment: used during instruction to find out student progress and provide feedback to students
3. Summative assessment: final evaluation of learning achieved; usually comprehensive; often serves as the basis of a grade
4. Reliability: consistency or repeatability of an assessment tool; confidence in the assigned grade depends on reliable assessment
5. Validity: does the assessment measure what it is supposed to; content validity means asking the appropriate questions; form validity means having students answer in an appropriate way
6. Evaluation: assessment = collection of data; evaluation = using the data from assessment to make a decision about quality of work
II. Assessment Methods
A. Performance Tasks
1. A laboratory practical exam is a performance task
2. Using materials, equipment, models to demonstrate learning
3. Teacher assesses by observing the process
4. Logistics and management challenge the use of this assessment
a. Use of stations throughout the room can help
b. Assessing during regular lab activities
5. Checklists allow timely use of performance assessments (p. 280)
6. Computer simulations can allow for storing files of student work
B. Open-Ended Problems
1. Many and varied ways of arriving at a solution
2. Usually, solution is presented in writing
3. Often use real-world issues
4. Students may use multiple resources to find a solution
5. Examples:
a. Describe a method for removing nitrate contamination in well water
b. Experiments to determine what gave someone food poisoning
C. Inquiry-Oriented Investigations
1. Analyze problem, plan and conduct experiment, organize results,
and communicate findings (Doran, 1998)
2. Provide students with materials, directions, and safety precautions
3. Some teachers require approval of a plan before students start
4. WOWBugs example p. 282 (behavior of wasps)
5. Individual accountability (Reynolds, 1996)
a. Group report, but individual reports allowed if student disagrees
b. Group does investigation, but all students submit own report
c. Individuals don’t discuss results; prepare separate reports
6. Large scale investigations
a. Can last over weeks or even a semester
b. Assessment and instruction are intermixed
c. Students get an idea of what real scientists do
d. Students report various segments of the project over the time period
e. McPherson College Undergraduate Research program
D. Concept Maps
1. Graphical method to show learning or to infer misconceptions
2. Must provide instruction and practice before using as assessment
E. Observation
1. Science teachers continually do this anyway: write it down
2. Example: safety goggles? correct technique? correct procedure?
3. How to decide what to observe (Hein and Price, 1994)
a. Knowledge application: how students solve problems
b. Information assimilation: how students relate new information to class content
c. Vocabulary: listen to student conversation and discussion
4. Challenges
a. Many students per day
b. Many learning objectives per student
c. Checklists can help (p. 284, p. 285)

F. Interviews
1. Verbal questions from teacher to student
2. Can be used before, during, or after instruction
3. Open-Ended
a. Teacher asks fewer, broader questions
b. What do you know about…?
c. Can you explain how that is used outside of school?
4. Partially structured interview
a. Written set of questions to probe specific knowledge
b. Probing questions can help clarify the student response
c. Can be particularly helpful with writing-challenged students
d. In your own words, what is the theory of…?
e. What evidence supports the conclusion that…?
f. What did scientists learn from the study of…?
5. Challenges
a. Not practical to get lengthy interviews of all students in large classes
b. Interview a few students at a time, dispersing questions
c. Interview before or after school, at lunch, during study hall, etc.
d. Tape student-student interviews

G. Journals
1. Assess attitudes, growth, and improve writing at the same time
2. Can include free writing on a subject or specific questions
3. Often overlook spelling/grammar to get at science
4. Challenges: don’t have time to read them all
a. Read randomly selected journals each week
b. Not summative assessments
H. Drawings
1. Nonthreatening, simple, useful for reading/writing challenged
2. May include written descriptions, summaries
3. Provide evidence of conceptual change

I. Portfolios
1. Organization, synthesis, and summarization of student learning
2. Formative and summative assessments included
3. May ask for reflection by student on past assessments
4. May include student written captions describing how assessment
was used to demonstrate learning
5. Involves student in the assessment process
6. Looks at totality of the experience, rather than isolated data
7. Student often allowed to choose a limited number of items
8. Judgement
a. Holistic: on portfolio as a whole
b. Analytic: rating of individual items
III. Developing Assessments
A. Challenges to using Contemporary Assessments
1. Often harder than writing exams
2. Often must construct them yourself
3. Authenticity and complexity of tasks
4. Scoring with rubrics is more subjective than scoring exams
5. Step-by-step process (Lewin and Shoemaker, 1998)
a. Be clear about skills, knowledge, standards targeted
b. Be familiar with the traits of a strong performance
c. Use a meaningful context within which you assess
d. Write and rewrite the task clearly and concisely
e. Assign the task with step-by-step instructions
f. Provide examples of “good” work
g. Score the task and then make revisions for the next use

B. Rubrics
1. Written criteria by which student work is judged
2. Numeric scale is tied to specific performance
3. Student usually given the task and the rubric at the same time
4. Tasks can be general and scores specific or vice versa (p. 290)
5. Some teachers start with a generic rubric and adapt it to a task (p. 291)
6. Value of Rubrics
a. Communicate what students know and can do
b. Provide understandable performance targets to students
c. How will I be graded? How am I doing?

C. Resources for Assessment Tasks
1. Textbooks often include suggestions of contemporary assessments
2. Internet
a. Performance Assessment Links in Science http://pals.sri.com/
b. Links http://www.col-ed.org/smcnws/assesstask.html
3. Journals and tradebooks
a. National Science Teachers Association
b. Association for Supervision and Curriculum Development
c. The Science Teacher
d. Science Scope
e. Educational Leadership
IV. Grading and Reporting Grades
A. Importance of Grades
1. Impact lives of students by evaluating success and failure
2. Compare students with each other
3. Determine scholarships or even acceptance to further education

B. Types of Grading
1. Criterion-referenced: judged relative to established criteria
a. Allow as many A, B, C, D, F grades as students earn
b. Assumes all students can earn an A
c. Does not ensure a normal distribution of grades
2. Norm-referenced: judged relative to other members of the class
a. Assumes class should match a normal distribution
b. Particularly difficult to defend with small class sizes
c. Applied assessments rarely produce this distribution on their own
d. Curving grades = adjusting grades to meet a normal distribution
e. Doesn't work well with homogeneous classes (e.g., advanced placement)
C. Assigning Final Grades
1. Numerical average of all assessments is the traditional method
a. Some argue criterion-referenced scores shouldn't be averaged this way
b. Replace grades with statements of student attainment
c. Norm-referenced scores must use numbers in any event
2. Point and Percentage Systems
a. Assign points to each assignment reflecting its importance to the course
b. Assign grades based on total points earned or percentage of points
c. Flaw: a percentage doesn’t tell you specifically what was learned
3. Fairness, Consistency, and Communication are Key
4. Scaling: adjusting letter-grade cut-offs, e.g., 90/80/70/60 → 88/78/68/58
5. Curving: assigning grades based on a normal distribution (see the formulas and sketch below)
$\mu = \frac{1}{N}\sum x_i$, where $N$ = number of measurements and $x_i$ = an individual measurement

$\sigma = \left[\frac{\sum (x_i - \mu)^2}{N - 1}\right]^{1/2}$ = standard deviation

(Figure: normal distribution curve divided into grade regions F, D, C, B, A)
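To make the point/percentage and curving ideas concrete, here is a minimal Python sketch (not from the chapter). It converts a hypothetical list of percentage scores into letters two ways: with fixed cut-offs, as in the point/percentage and scaling items above, and with a curve built from the class mean and sample standard deviation defined by the formulas above. The score list, function names, and the ±0.5σ / ±1.5σ band boundaries are all invented for illustration.

# Minimal sketch: percentage-based vs. curved letter grades (assumed values throughout)
from statistics import mean, stdev

scores = [78, 85, 92, 64, 71, 88, 95, 59, 83, 76]  # hypothetical percentage scores

# Point/percentage grading: compare each score to fixed (possibly scaled) cut-offs.
def letter_from_cutoffs(pct, cutoffs=(90, 80, 70, 60), letters=("A", "B", "C", "D", "F")):
    for cutoff, letter in zip(cutoffs, letters):
        if pct >= cutoff:
            return letter
    return letters[-1]  # below the lowest cut-off

# Curved grading: convert each score to a z-score using the class mean (mu) and the
# sample standard deviation (N - 1 denominator, matching the formula above), then map
# z-score bands to letters. The band boundaries here are an assumption, not a rule.
def letter_from_curve(score, mu, sigma):
    z = (score - mu) / sigma
    if z >= 1.5:
        return "A"
    if z >= 0.5:
        return "B"
    if z >= -0.5:
        return "C"
    if z >= -1.5:
        return "D"
    return "F"

mu, sigma = mean(scores), stdev(scores)  # stdev() uses the N - 1 formula
for s in scores:
    print(s, letter_from_cutoffs(s), letter_from_curve(s, mu, sigma))

Running the sketch shows the flaw noted above from the other direction: the same raw score can earn different letters depending on whether fixed cut-offs or the class distribution is used, which is why fairness, consistency, and communication matter.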
