
Editors

Thomas R. Lord
Donald P. French
Linda W. Crow

Arlington, Virginia

Copyright 2009 NSTA. All rights reserved. For more information, go to www.nsta.org/permissions.
Claire Reinburg, Director
Jennifer Horak, Managing Editor
Judy Cusick, Senior Editor
Andrew Cocke, Associate Editor
Betty Smith, Associate Editor

Art and Design


Will Thomas, Jr., Director
Joseph Butera, Graphic Designer, cover and interior design

Printing and Production


Catherine Lorrain, Director
Nguyet Tran, Assistant Production Manager
Jack Parker, Electronic Prepress Technician

National Science Teachers Association


Francis Q. Eberle, PhD, Executive Director
David Beacom, Publisher

Copyright 2009 by the National Science Teachers Association.


All rights reserved. Printed in the United States of America.
12 11 10 09 4 3 2 1

Library of Congress Cataloging-in-Publication Data

College science teachers guide to assessment / Thomas Lord, Donald French, and Linda Crow, editors.
p. cm.
Includes index.
ISBN 978-1-933531-11-3
1. Science--Study and teaching (Higher)--Methodology. 2. Science--Study and teaching (Higher)--Evaluation.
3. Academic achievement--Evaluation. 4. Educational tests and measurements. I. Lord, Thomas R. II. French,
Donald P., 1952- III. Crow, Linda W.
Q181.C535 2009
507.11--dc22
2008041750

NSTA is committed to publishing material that promotes the best in inquiry-based science education. However, conditions
of actual use may vary, and the safety procedures and practices described in this book are intended to serve only as a guide.
Additional precautionary measures may be required. NSTA and the authors do not warrant or represent that the procedures
and practices in this book meet any safety code or standard of federal, state, or local regulations. NSTA and the authors
disclaim any liability for personal injury or damage to property arising out of or relating to the use of this book, including
any of the recommendations, instructions, or materials contained therein.
Permissions
You may photocopy, print, or e-mail up to five copies of an NSTA book chapter for personal use only; this does
not include display or promotional use. Elementary, middle, and high school teachers only may reproduce a
single NSTA book chapter for classroom- or noncommercial, professional-development use only. For permission
to photocopy or use material electronically from this NSTA Press book, please contact the Copyright Clearance
Center (CCC) (www.copyright.com; 978-750-8400). Please access www.nsta.org/permissions for further information
about NSTA's rights and permissions policies.

Table of Contents

Contributors ix
Preface xi

Section 1. General Assessment Topics

Chapter 1: Survey Instrument Validation: The First Commandment of Educational Research 3
Cynthia Cudaback

Chapter 2: Building a Culture of Faculty-Owned Assessment 7
Don Haviland, Karol Dean, and Eleanor D. Siebert

Chapter 3: Quantitative Assessment of Student Learning in Large Undergraduate Science Classrooms: Approaches and Caveats 15
Anneke M. Metz

Chapter 4: Means of Linking Research to Practice in Organizing a Course on Student Assessment 31
HsingChi von Bergmann

Section 2. Assessment in the College Classroom

A. Traditional Assessments

Chapter 5: Writing/Using Multiple-Choice Questions to Assess Higher-Order Thinking 35
Kerry L. Cheesman

Chapter 6: Tips on Classroom Assessment: How to Teach Our Students to Take Multiple-Choice Exams 43
Linda Tichenor

Chapter 7: Better Multiple-Choice Assessments 45
William J. Straits and Susan Gomez-Zwiep
College Science Teachers Guide to Assessment v


B. Alternative Assessments

Chapter 8: Alternative Forms of Assessment for the College Science Laboratory 49
Kerry L. Cheesman

Chapter 9: Assessment of Students' Learning Through the Creation of Scientific Posters 55
Kerry L. Cheesman and Nancy J. Swails

Chapter 10: Using Electronic Portfolios for Assessment in College Science Courses: Instructor Guidelines and Student Responses 59
Jerry A. Waldvogel

C. Teacher Education Alternative Assessments

Chapter 11: Wetscience: A Means of Assessing Math, Science, and Technology Incorporation Into a Service Learning Outreach Program 67
Mark A. Gallo

Chapter 12: Gauging the Nature of Science (NOS): An Alternate Form of Assessment 77
Thomas J. Melvin

Chapter 13: Authentic Assessment: Using 5-E Lesson Plan Development to Evaluate Science Content Learning With Preservice Teachers 81
Holly Travis
vi National Science Teachers Association


Section 3. How-To Section: Successful Classroom-Tested Practices and Instructions and Rubrics for Their Implementation

Chapter 14: Formative Assessment With Student Remotes and E-mail 87
Robert A. Cohen

Chapter 15: Peer Assessment: Value, Fears, Headaches, and Success 89
Anne Coleman

Chapter 16: Working With Student Engagement 95
Grace Eason

Chapter 17: Promoting Student Reflection on Exams 99
Grace Eason

Chapter 18: Hypothesis Modification Activity 103
Brian J. Rybarczyk and Kristen Walton

Chapter 19: Exam Corrections and Analysis, Student Perspective 107
Kathryn H. Sorensen

Chapter 20: Exam Analysis, Instructor Perspective 109
Kathryn H. Sorensen

Chapter 21: Inquiry-Based Labs: The Scientific Report 111
Bonnie S. Wood

Chapter 22: Student-Authored Book Reviews 115
Bonnie S. Wood

Chapter 23: Student-Led Teaching Models 117
Bonnie S. Wood
Section 4. General Practices to Improve Assessment

Chapter 24: Eleven Assessment Lessons Learned at the Gate 123
Mary H. Brown

Chapter 25: Developing Assessment Performance Indicators 129
Peter Kandlbinder

Chapter 26: Practices That Jeopardize Bona Fide Student Assessment 137
Thomas R. Lord

Chapter 27: Varied Assessment: A Brief Introduction 147
William J. Straits and R. Russell Wilke

Chapter 28: Assessments That Assist in Motivating Students 151
Ellen Yerger

Index 155

Contributors

Brown, Mary H.
Lansing Community College
MC 5400--Science
411 N. Grand Ave.
Lansing, MI 48901
Brownm@lcc.edu

Cheesman, Kerry L.
Capital University
Columbus, OH
kcheesma@capital.edu

Coleman, Anne
Department of Life and Physical Sciences
Cabrini College
610 King of Prussia Rd.
Radnor, PA 19087
annecoleman@cabrini.edu

Cohen, Robert A.
Department of Physics
East Stroudsburg University
East Stroudsburg, Pennsylvania
rcohen@po-box.esu.edu

Cudaback, Cynthia
Marine Earth and Atmospheric Science
Campus Box 8208
North Carolina State University
cynthia_cudaback@ncsu.edu

Dean, Karol
Mount St. Mary's College
12001 Chalon Rd.
Los Angeles, CA 90049
kdean@msmc.la.edu

Eason, Grace
University of Maine at Farmington
173 High St., Preble Hall
Farmington, ME 04938
geason@maine.edu

Gallo, Mark A.
Niagara University
DePaul Hall
Niagara University, NY 14109
mgallo@niagara.edu

Gomez-Zwiep, Susan
California State University, Long Beach
Department of Science Education
Long Beach, CA 90840
sgomezwp@csulb.edu

Haviland, Don
California State University, Long Beach
1250 Bellflower Blvd.
Long Beach, CA 90808-2201
dhavilan@csulb.edu

Kandlbinder, Peter
Institute for Interactive Media and Learning
University of Technology, Sydney
Peter.Kandlbinder@uts.edu.au

Lord, Thomas R.
Department of Biology
Indiana University of Pennsylvania
Indiana, PA 15705
trlord@iup.edu

Melvin, Thomas J.
Biology Department
Indiana University of Pennsylvania
Indiana, Pennsylvania 15705
t.j.melvin@iup.edu

Metz, Anneke M.
Department of Cell Biology and Neuroscience
Montana State University
Bozeman, MT 59717
anneke@montana.edu

Rybarczyk, Brian J.
The Graduate School
University of North Carolina/Chapel Hill
CB#4010
Chapel Hill, NC 27599-4010
brybar@unc.edu

Siebert, Eleanor D.
Mount St. Mary's College
12001 Chalon Rd.
Los Angeles, CA 90049
esiebert@msmc.la.edu

Sorensen, Kathryn H.
Biology Department
American River College
4700 College Oak Dr.
Sacramento, CA 95841
SorensKH@arc.losrios.edu

Straits, William J.
California State University, Long Beach
Department of Science Education
Long Beach, CA 90840
wstraits@csulb.edu

Swails, Nancy J.
Capital University
Columbus, OH 43209
swails@capital.edu

Tichenor, Linda
Department of Biology
University of Arkansas
Fort Smith, Arkansas 72913
lticheno@uafortsmith.edu

Travis, Holly
Department of Biology
Indiana University of Pennsylvania
Indiana, Pennsylvania 15705
holly.travis@iup.edu

von Bergmann, HsingChi
EDT 726, Faculty of Education
University of Calgary
Calgary, AB, Canada T2N 1N4
hsingchi@ucalgary.ca

Waldvogel, Jerry A.
Department of Biological Sciences
Clemson University
Clemson, SC 29634-0314
waldvoj@clemson.edu

Walton, Kristen
Department of Biology
Missouri Western State University
4525 Downs Dr.
St. Joseph, MO 64507
kwalton1@missouriwestern.edu

Wilke, R. Russell
Angelo State University
Department of Biology
ASU Station - 10890
San Angelo, TX 76909
russell.wilke@angelo.edu

Wood, Bonnie S.
University of Maine at Presque Isle
181 Main St.
Presque Isle, ME 04769
bonnie.s.wood@umpi.edu

Yerger, Ellen
Department of Biology
Indiana University of Pennsylvania
114 Weyandt Hall
Indiana, Pennsylvania 15701
ellen.yerger@iup.edu

Preface: Note From the Editors

Welcome to the sixth in the series of SCST monographs created by the members of the Executive Board of the Society for College Science Teachers in cooperation with the National Science Teachers Association. This document covers an extremely important and often controversial topic: the evaluation of student and professorial work. The jointly sponsored monograph has been three years in the making, with the initial agreement with the National Science Teachers Association taking place in the fall of 2006.

Each submission in this monograph was reviewed by at least two members of the Editorial Board, with the published authors responding to reviewers' critiques and providing the final proofing of their own entries. Articles were selected on the quality of the writing and their contribution to the value and importance of assessment in a college science setting.

The monograph examines assessment issues from several different viewpoints and is broken into several sections. The first section deals with general assessment topics such as validation of survey instruments and creating a culture of faculty-owned assessment. The second section concerns traditional and alternative forms of assessment in both the science and the science education classroom. The third section presents a series of how-to assessment practices that have been successfully utilized in the field. Finally, the fourth section provides a series of tips to enhance assessment in the college science classroom.

The editors would like to thank all the contributors to the monograph. The quality of the initiative is indicative of the time and energy they put into this work.

Thomas R. Lord, Indiana University of Pennsylvania (trlord@iup.edu)
Donald P. French, Oklahoma State University (dfrench@okstate.edu)
Linda W. Crow, Montgomery College (linda.w.crow@nhmccd.edu)

Acknowledgments
The editors wish to thank Holly Travis for her tireless effort and dedication to the construction of this document.
Thanks also to Ellen Yerger and Tom Melvin for their help in making this monograph a success.

Chapter 5
Writing/Using Multiple-Choice Questions to Assess Higher-Order Thinking

Kerry L. Cheesman
Biological Sciences
Capital University
Columbus, Ohio

Introduction
Multiple-choice exams are widely used in college science classrooms (as well as for laboratory quizzes and exams). Multiple-choice questions have many advantages; perhaps the most important is that they can be graded quickly and easily, by either human or machine. The clicker systems often used in large lecture rooms are well adapted for answering multiple-choice questions, and they can be used for instant quizzes with immediate feedback to students.

Instructor time is valuable, and in large classrooms the use of essay exams (the primary alternative) can quickly become overwhelming, causing students to wait for feedback for prolonged periods of time. Feedback on progress needs to be as rapid as possible, and essay questions do not lend themselves to that. Essay grading can also tend to be biased by any number of factors (time of day, personal biases, differences between graders, lack of openness to new interpretations, and so on).

Finally, most graduate entrance exams (including the GRE, MCAT, and DAT) are based on multiple-choice questions. Many later exams, such as the medical board exams, are also multiple choice. Therefore, it is important to make sure that students are prepared for higher-order multiple-choice exams and the reasoning that is required to answer the questions in a proficient manner. Undergraduate science instructors can

help students be well prepared by using higher-order multiple-choice questions for assessment of course material starting in the freshman year.

Assessment must match one's teaching style: inquiry teaching must be followed by assessment techniques that match the inquiry method of teaching. If one follows the learning cycle (5E or other similar models), assessment is encountered throughout the teaching and learning continuum, and that assessment must be related to the phase of the cycle (exploration, extension, etc.). Certainly higher-order questions capture the essence of exploration and extension much better than lower-order questions do.

The use of higher-order questions does not mean an end to using lower-order questions. Rather, we are referring to a shift from the 80-90% lower-order questions typically found in college science exams toward a balance between lower- and higher-order questions. The goal of undergraduate science instruction should be critical thinking rather than memorization. Many students come to the university with the assumption that science is just a lot of memorization, and college instructors often need to work hard to destroy that myth. However, that myth is often kept alive by the choice of questions used on exams. If those questions favor knowledge-style questions, then students will continue to believe that science is mostly about memorization rather than about inquiry and analysis.

Understanding Bloom's Taxonomy
Most college instructors are familiar, on some level, with Bloom's taxonomy of learning (Bloom et al. 1956). Much has been written about the use of Bloom's taxonomy in the construction of exam questions, but few instructors take to heart the need to use all of the levels, instead of just the first two, in constructing examination questions. Here is a quick review of Bloom's taxonomy as it relates to the teaching of college science.

1. Knowledge: the ability to remember/recall previously learned material.
Examples of behavioral verbs: list, name, identify, define, show
Sample learning objectives in science: know common terms, know specific facts, know basic procedures and methods

2. Comprehension (understanding): the ability to grasp the meaning of material, and to explain or restate ideas.
Examples of behavioral verbs: chart, compare, contrast, interpret, demonstrate
Sample learning objectives in science: understand facts and principles, interpret charts and graphs, demonstrate laboratory methods and procedures

3. Application: the ability to use learned material in new situations.
Examples of behavioral verbs: construct, manipulate, calculate, illustrate, solve
Sample learning objectives in science: apply concepts and principles to new situations, apply theories to practical situations, construct graphs and charts

4. Analysis: the ability to separate material into component parts and show relationships between the parts.
Examples of behavioral verbs: classify, categorize, organize, deduce, distinguish
Sample learning objectives in science: distinguish between facts and inferences, evaluate the relevancy of data, recognize unstated assumptions

5. Synthesis: the ability to put together separate ideas to form a new whole or establish new relationships.
Examples of behavioral verbs: hypothesize, create, design, construct, plan
Sample learning objectives in science: propose a plan for an experiment, formulate a new scheme for classifying, integrate multiple areas of learning into a plan to solve a problem

6. Evaluation: the ability to judge the worth or value of material against stated criteria.
Examples of behavioral verbs: evaluate, recommend, criticize, defend, justify
Sample learning objective in science: judge the way that conclusions are supported by the data

It is a common misconception that as one climbs the scale of Bloom's taxonomy, the difficulty of the questions increases. The increase in cognitive demand associated with higher-order questions refers to the complexity of the questions, not the difficulty. Higher-order questions require a different set of cognitive demands, but they are not necessarily more difficult.

Writing Multiple-Choice Questions
Higher-order multiple-choice questions can be as easy or as difficult to construct as lower-order questions. Good-quality questions are essential to being able to truly assess a student's knowledge and understanding of the subject matter in any area of science.

Before attempting to construct individual questions, think about the purpose of the questions. In general, the purpose should be to assess what students know and don't know, and how students are able to construct knowledge based on prior learning. Therefore, avoid trick questions that may confuse students who understand the material. Avoid using prepared test banks written by the author of the textbook or other contracted writers. Honest assessment must match the teaching style employed, not the style of the textbook or the style of your colleagues. Note: You cannot ask higher-order questions if your teaching style mandates only recall.

Writing good multiple-choice questions takes time; a well-constructed test can't be written in a single day. Questions need to be written, reviewed for clarity, and often revised. Questions need to be constructed in such a way that they neither reward test-wise students nor penalize those whose test-taking skills are less developed. The purpose is to assess student learning, and therefore each question needs to be clearly designed to achieve that goal. Remember that higher-order questions take longer to answer than recall questions, so plan accordingly in the construction of the test.

To construct a higher-order multiple-choice question, start by constructing the stem. The stem should pose a problem or state a question. Familiar forms include case study, premise and consequence, problem and solution, incomplete scenario, and analogy. The stem may involve pictures and diagrams or just words.

Write the stem as clearly and simply as possible. A student should be able to understand the problem without having to read it several times. Always try to state the problem in a positive form, as students often misread negatively phrased questions. Avoid extraneous language that is irrelevant to the question. While some authors believe such language helps separate those who truly understand from those who don't, too often it confuses even well-prepared students, making the question unreliable. Never use double negatives. Avoid "which of these is the best choice" unless that format is integral to the learning objectives. Be sure to include in the stem any words that are redundant to all of the answers, and use "a(n)" or "a/an" to avoid eliminating any of the answers as mismatches.

Once the stem is constructed, proceed with writing the responses. Write the correct answer first. This allows you to be sure it is well constructed and accurate, and allows you to match the remaining answers to it. Avoid verbal cues, and certainly avoid lifting phrases directly from the text or class notes. Be sure that the incorrect responses match the correct one in length, complexity, phrasing, and style. For instance, in the following example, the mismatch of the answers makes it easy to guess the correct response even if one has little knowledge of the subject material.

The term side effect of a drug:
a. refers to any action of a drug in the body
other than the one the doctor wanted the drug to have
b. is the main effect of a drug
c. additionally benefits the individual

Distracters (incorrect answers) must be incorrect yet plausible. If a recognizable key word appears in the correct answer, it should appear in some of the distracters as well. Be sure to check: will the answers help to distinguish why a student got it wrong? This is an important part of assessment that is often overlooked by instructors, but it is a critical part of helping students to learn.

Avoid microscopic distinctions between answers, unless such a distinction is a significant objective of the course. Be sure to stagger the correct responses in their order (use all answer positions as equally as possible). Limit the number of options; most authors agree that 4-5 answers is plenty, and there is no assessment advantage in using more than five. Use "all," "always," "never," "none," etc. rarely; these are answers that students have been programmed to shy away from, and they may distort the question as a valid assessment tool. Likewise, use "all of the above" and "none of the above" sparingly.

When all exam questions have been constructed, check each one to see where it falls in Bloom's hierarchy. Construct a simple table such as that shown in Table 1 to see the distribution of questions. If the questions are disproportionately distributed, then rewrite enough questions to balance the exam between lower-order and higher-order questions.

Examples of Multiple-Choice Questions at Each Level
The following examples illustrate the construction of multiple-choice questions that fit the higher levels of Bloom's taxonomy. For most, an explanation is included describing why the question fits where it does, and what a student needs to know to be able to answer it correctly.

TABLE 1. Sample table showing the distribution of questions

Topics/      | Recall         | Application   | Evaluation   | %
Objectives   | (knowledge,    | (application, | (synthesis,  |
             | comprehension) | analysis)     | evaluation)  |
a.           |                |               |              |
b.           |                |               |              |
c.           |                |               |              |
d.           |                |               |              |
e.           |                |               |              |
f.           |                |               |              |
Total        |                |               |              |
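The tally that Table 1 supports can also be automated. The following is a minimal sketch, not from the chapter: the function name, the sample exam, and the 50% threshold are all hypothetical, and the Bloom levels are grouped into the three columns shown in Table 1.

```python
from collections import Counter

# Group Bloom's six levels into the three columns of Table 1.
LEVEL_GROUPS = {
    "knowledge": "recall", "comprehension": "recall",
    "application": "application", "analysis": "application",
    "synthesis": "evaluation", "evaluation": "evaluation",
}

def distribution(questions):
    """Tally (question, bloom_level) pairs and return the percentage
    of the exam falling in each grouped column."""
    counts = Counter(LEVEL_GROUPS[level] for _, level in questions)
    total = sum(counts.values())
    return {group: 100.0 * counts.get(group, 0) / total
            for group in ("recall", "application", "evaluation")}

# Hypothetical 10-question exam that leans heavily on recall.
exam = [(f"Q{i}", "knowledge") for i in range(1, 8)] + [
    ("Q8", "analysis"), ("Q9", "synthesis"), ("Q10", "evaluation")]
pct = distribution(exam)
print(pct)  # {'recall': 70.0, 'application': 10.0, 'evaluation': 20.0}
if pct["recall"] > 50:
    print("Exam is recall-heavy; rewrite some questions at higher levels.")
```

A disproportionate distribution flagged this way signals which questions to rewrite, exactly as the text above recommends.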

Application Questions
1. Susie and Bill are both healthy and have healthy parents. Each of them has a sister with autosomal recessive cystic fibrosis. If Susie and Bill have a child, what is the probability that it will be born with cystic fibrosis?
a. 0
b. 2/3
c. 1/2
d. 1/4
e. 1/9
(To answer this question correctly, one must understand the terms autosomal and recessive, and also understand the concepts of probability as applied to human genetics. In this question the student must apply those concepts to a family situation not studied before. The incorrect responses are constructed to find misconceptions/misunderstandings about genetic probabilities.)

2. A total of 100 students at Capital University were tested for blood type. The results showed 36 were type O, 28 were type A, 28 were type B, and 8 were type AB. The frequency of the A allele is therefore:
a. 0.10
b. 0.14
c. 0.28
d. 0.56
e. 0.64
(To answer correctly, a student must know the formula for allele frequency and be able to calculate it from the given data. The answers were chosen to help find misunderstandings about allele frequency.)

3. Evolutionary forces have produced an unusual plant, the Indian Pipe, that has no chlorophyll. Therefore, the plant must:
a. make its own food
b. absorb food made by other organisms
c. photosynthesize without chlorophyll
d. respire without taking in food
e. use chlorophyll from other plants
(To answer this question, the role of chlorophyll in energy transformation must be understood. The student must apply the concepts of energy transformation/lack of chlorophyll to a logical new endpoint. All of the answers are plausible and help to distinguish where an understanding of energy transformation is incomplete.)

4. Which of the following compounds should have the highest boiling point?
a. CH3CH2CH2CH3
b. CH3NH2
c. CH3OH
d. CH2F2
(To answer correctly, a student must understand the concept of boiling point and the role of various constituent chemical groups in raising or lowering the boiling point.)

Analysis Questions
1. When a solid ball is added to a graduated cylinder with water in it, the water level rises from 20 ml to 50 ml. What is the volume of the ball?
a. 20 ml
b. 30 ml
c. 50 ml
d. 70 ml
(In this example, a student must understand the concept of volume and not get distracted by the spherical nature of the added object. The answers are designed to give the instructor a sense of the misunderstanding of volume.)

2. From the graph shown here, determine when maximal carrying capacity has been reached.
a. point A on the graph
b. point B on the graph
c. point C on the graph
d. point D on the graph
(To answer correctly, a student must be able to see the relationships among the various organisms, interpret those relationships, and evaluate the ecosystem relative to the organisms shown on the graph.)
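Two of the quantitative items above can be verified with a few lines of arithmetic. This sketch is illustrative only; the chapter does not spell out the carrier-probability reasoning, so the standard argument (each healthy parent with an affected sibling has a 2/3 chance of being a carrier, and two carriers have a 1/4 chance per child) is assumed here.

```python
from fractions import Fraction

# Application question 1: probability that Susie and Bill's child
# has autosomal recessive cystic fibrosis, using the standard
# carrier reasoning described above.
p_affected = Fraction(2, 3) * Fraction(2, 3) * Fraction(1, 4)
print(p_affected)  # 1/9, matching option (e)

# Analysis question 1: volume of the ball by water displacement.
volume_ml = 50 - 20
print(volume_ml)  # 30, matching option (b)
```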

3. During an otherwise normal pregnancy, a woman begins to experience light-headedness and a decline in energy levels near the end of the first trimester. Which of the following is the most likely cause of her symptoms?
a. lack of B vitamins due to poor diet
b. decline of blood pH due to overuse of muscles
c. decrease in blood pressure due to expanding fetal circulation
d. decline in estrogen levels due to ovarian shutdown
(All of these answers involve factors that could cause tiredness in a woman. To determine the most likely cause in this scenario, a student needs to understand the basic mechanics of pregnancy and the biochemical changes that occur during it. The answer given shows a student's ability to carefully analyze the situation and determine causality.)

4. The seeds of various plants vary in size from a fraction of a millimeter to several centimeters. The most critical factor controlling the size of seed a plant produces is:
a. size of the maternal flower
b. projected size of the animal pollinator
c. quantity of the abiotic pollinator
d. length of predicted dormancy
e. method of distribution of seed

Synthesis Questions
1. Domoic acid, produced by diatoms, has been found to bind to hippocampal glutamate receptors. If a person were to accidentally consume a lot of shellfish contaminated with this organism, what effect might be expected?
a. blindness
b. deafness
c. amnesia
d. aphasia
e. rigidity
(To answer this question, one must understand the role of the hippocampus and the role of glutamate in this area of the brain. Here a student's knowledge of the brain is used to establish a new relationship beyond those studied in class. All of the answers are logical outcomes of brain dysfunction and help the instructor to pinpoint the misunderstandings that students have.)

2. A neighbor found some mammal bones on the ground: a keeled sternum and an ulna whose olecranon occupies 30% of the length of the bone. These bones most likely came from what type of mammal?
a. flyer
b. climber
c. runner
d. digger
e. swimmer
(Here a student must understand the various bones and what their functions are. The student must then formulate a relationship between the type and formation of the bones and the activity they would promote in a mammal.)

3. What would be the most logical result of mixing X and Y, both solubilized in distilled H2O at room temperature?
a. precipitation of a solid
b. a change in color of the liquid
c. a rapid rise in temperature
d. a rapid decrease in temperature
(To answer correctly, a student must put together knowledge of X and Y as compounds with knowledge of their dissociations and the reactions of the individual components. All of the answers reflect outcomes that the student has previously experienced when two compounds are mixed.)

Evaluation Questions
1. Your fitness regimen involves jogging on the school track 2-3 miles per day with a friend. On a particular day, about 15 minutes into your jog, your friend suddenly pulls up and falls down, grasping her right calf in pain.
What should you do at that moment?
a. apply ice to the calf
b. apply heat to the calf
c. tell her to get up and walk slowly
d. get emergency help stat
(Here the student must be able to appraise the situation and evaluate the next course of action. The student's knowledge of both muscle function and injury is brought to bear in deciding which treatment to use.)

2. According to the American Heart Association, obesity plays a major role in early heart failure. Which of the answers below best describes how being severely overweight can cause the heart to falter?
a. Obesity creates the need for greater blood volume in the circulatory system.
b. Obesity creates the need for a quicker heart rate each minute.
c. Obesity creates the need for a larger volume of blood exiting the heart per beat.
d. Obesity causes the heart to grow larger than is anatomically recommended.
e. Obesity causes the heart to develop a greater cardiac vessel network.

3. Abiotic factors impact heavily on photosynthetic richness in a green plant. The basic factors that influence sugar making in plants are (1) quantity of water, (2) quantity of sunlight, (3) quantity of CO2, (4) environmental temperature, (5) movements of the air, and (6) richness of growth substrate. By moderately increasing in quantity, which of the factors would positively influence photosynthesis and which would negatively influence photosynthesis?
a. + (1), (2), (3) and - (4), (5), (6)
b. + (2), (3), (5), (6) and - (1), (4)
c. + (1), (2), (3), (4), (5) and - (6)
d. + (3), (4) and - (1), (2), (5), (6)
e. + (1), (2), (3), (4) and - (5), (6)

Conclusion
Multiple-choice questions may be used effectively to assess student learning as long as they are constructed properly and include assessment of higher-order thinking. Taking the time to construct good stems and good answers, not only on exams but also in the daily questions posed during the lesson, is well worth the effort. Practicing coming up with conclusions for critical-thinking questions is as important in creating high-level thinking in students as designing good-quality multiple-choice exams. Asking only nonchallenging questions during class is a waste of both the instructor's and students' time and does little to help assess student learning.

Reference
Bloom, B., M. Englehart, E. Furst, W. Hill, and D. Krathwohl. 1956. Taxonomy of educational objectives: The classification of educational goals. New York: McKay Publishers.

College Science Teachers Guide to Assessment 41


Copyright 2009 NSTA. All rights reserved. For more information, go to www.nsta.org/permissions.
Chapter 24
Eleven Assessment Lessons Learned at the Gate

Mary H. Brown
Lansing Community College
Lansing, Michigan

As a gateway instructor for more than 30 years, I've learned a few things about assessing the typical community college student. "Gateway" at my institution is the polite euphemism for suggesting you'll always be teaching the nonscience majors, with the slim hope that some may eventually learn to tolerate the subject. It's been nearly 20 years since I've evaluated one true science discipline major among the hundreds of students in my classes each academic year. As STEM students are nonexistent in my classes, the best I can hope for is the integrated science major in education.

"Typical community college student" is an oxymoron. There isn't one. Some of my college freshmen are older than I am, returning to college for an opportunity at a retirement career. A few freshmen are dual-enrolled high school students, 16 or 17 years old. Some of my students are parents of young children, and some are grandparents. Today's community college students come from all walks of life and include working adults and recent immigrants or refugees from foreign lands. Included in this mixture is the university's "student of choice," whose parents chose the inexpensive route for their first two years of college.

So, how does a gateway instructor assess the learning of the typical community college student? Included in this chapter are 11 assessment lessons I've learned along the journey.

1. Assessments need to be frequent and scaffolded for legitimate success. Working adults don't want only one or two assessments of their progress during the semester. They want to know exactly how well they are performing in the class at each assignment. They want feedback, personal and directed. These are practical folks! Not surprisingly, they want good grades, but they also want real success. They don't mind being challenged, especially after they've been successful. Successful gateway instructors know that a 16-week semester might have seven or eight large exams. The first exam should be the least challenging. Each subsequent exam should be more challenging. Students want their efforts to show. They want to believe they're progressing, that hard work pays off.

The experts might suggest that assessments be formative, giving frequent feedback toward the mastery of content. Classroom assessment techniques (CATs) are usually formative assessments and might include the quick think-pair-share or the one-minute paper (Angelo and Cross 1993). Students want definite feedback on all formative assessments. They want to know that you've sincerely read each and every one. Formal grades aren't required, just your attention and constructive remarks in some format.

Summative assessments are aligned with the evaluation of content mastery or the completion of instruction. Many community college students need state or national benchmarks (standards) for their instruction in their trade or vocational courses. My students see the state and national benchmarks for science education (Roseman and Koppal 2006), and they know that, like other professionals, they need to meet those standards. Unlike their certification exams for careers, my class is only the beginning of their journey toward scientific literacy. Benchmarks and standards are a goal for attainment, with the expected outcome of lifelong scientific literacy.

2. Assessments and exams are not always the same thing. Assessments come in many forms. In the assessment report I file each year with the divisional office to show that my nonscience majors' course is worthy of the title of science CORE (which means it meets the criteria of inquiry, shows the processes and limitations of scientific thought, and analyzes data), no fewer than 12 different assessment techniques are listed.

Assessments include the exams, concept maps (two varieties), Vee diagrams, laboratory reports, and capstone projects. All of these give the instructor information about the students' learning and their mastery of content. Alternative assessments that are real and targeted to the content can be more revealing than an exam. Anxiety plays a role in exam taking, but a student has control over a project. Presentations or projects that allow for research, sharing of ideas, and collaboration are valid assessments. These include contextual, problem-, case-, or performance-based assessments.

3. Embedded assessment across multiple sections has advantages and disadvantages. Community colleges are notorious for having large numbers of adjunct professors. Mine is not an exception. Subsequently, as the full-time professor responsible for reporting on multiple course offerings (even when I am not the instructor of record), my job becomes very challenging. Embedded questions on each exam allow for a logistically simple method for tracking all sections of a single course. Instructors simply provide me details of the embedded questions after each exam. That's the advantage. The disadvantage is that I have no idea why the students miss the embedded questions on particular topics. The variables are too numerous. I have a vague idea within my own classes, as I can monitor absences or recall the day in class when the topic was discussed. There is no information from the classes I didn't teach. The number is a cold statistic without any qualitative information. After a period of time, the embedded questions on exams must change in their wording. Since the statistical report depends on data from the previous year, altering the question requires a whole set of rationale, without qualitative information. That's another disadvantage.

4. Listening provides more assessment value than talking. As my students engage in their laboratory activities, I listen. I listen to their interactions, collaborations, and arguments. I walk to each lab table, and I listen. I can learn a tremendous amount about their learning and their assimilation of the content by listening. During the course of the semester, they become very accustomed to me walking to each table without saying anything as they work. I learn a lot about their thinking processes by listening. Each unit of instruction also begins with me listening. Each collaborative group is asked to list prior knowledge about topics in the unit. Together we summarize. We use the prior knowledge expressed in discussions to increase the depth of knowledge on the topics. Pre-assessment starts with listening.

Each unit of the courses I teach starts with a series of connection questions. What do you already know about the topic? How can learning this information be useful to you? What are you looking forward to learning about this topic? Each unit also ends with reflection. What did you learn about this topic? Did anything you learned in this unit change your mind? How will this information be useful for your future? Postassessment also starts with listening. I've learned that students do not necessarily answer these questions unless they are explicitly asked. If there exists a possibility that you could be called upon and expected to respond directly to a specific question, you prepare a response. Without that potential accountability, it's a rare student who prepares a response or who is introspective without the prompting questions.

5. A wrong answer has tremendous value. A well-thought-out, detailed wrong answer gives you lots of information. It provides you an opportunity to correct a misconception or to craft a discrepant event to allow the learner to construct a more scientifically accurate response. Providing the question in advance and allowing two minutes to think before calling on a student to respond yields more information than simply calling on a student. Wait time also works well (Rowe 2003). Giving students a 30-second warning before expecting a response is very powerful. "I'm going to ask <student's name> to respond to the next question" gives that individual a few extra seconds to compose an answer. The responses are more complete, even when they are wrong.

6. Assessments need to be clearly tied to outcomes, objectives, or learning targets. Both instructor and student need to clearly know the purpose of the assessment. What are we evaluating? Communication of expectations is important. Providing the format of the exam gives students an opportunity to prepare appropriately. You study differently for a written essay than for a multiple-choice exam. You prepare differently for a presentation than for a discussion.

7. Assessments that are viewed as products by students are sources of pride. I have many product assessments in my classes. I'm always surprised by college students who have told me that their perfect-score concept map was hung on the refrigerator, or that the lab report with the phrase "Well done!" was read over the dinner table. It seems it doesn't really matter how old we are: a well-done product is a source of pride. Community college students know the rewards of hard work. They work tremendously hard in the challenges of everyday life. They can do exceptional work when the assessment is viewed as a product.

8. Detailed constructive feedback on assessments is essential. It takes about six hours to correct a stack of 24 lab reports, if they are well written. A set of poorly written lab reports takes about twice as long. Each report needs carefully worded constructive feedback. Rubrics on written assignments (and oral rubrics for presentations) are given in advance, and students are expected to follow the same criteria for excellence in writing as they would in a composition class. Even on exams, common mistakes are explained. The feedback on the exam is another opportunity to teach.

9. Owning and expressing your expectations for their success is crucial. Students rise to the challenge of high expectations. When given rationale for a challenge, they accept. They will even accept the frustrations of disequilibrium if they understand the rationale. Explicit reasons for content expectations are essential. Community college students will accept "because it's on the test" but are likely to ask you why it's on the test. They want a more practical reason for learning the content. Ideally, the reason is tied to a potential career or an everyday application.

10. Assessments need to be varied, perceived as fair and attainable, and evaluated both objectively and subjectively (Mintzes, Wandersee, and Novak 1999). The brain loves novelty. It fatigues when offered routine. With 12 different types of assessments throughout the semester, fatigue is more physical than cerebral! Each exam has a variety of question types. Students create or correct concept maps; they evaluate true and false statements, correcting the false. They also write brief answers and traditional multiple-paragraph essays. Each exam also has multiple-choice questions and paragraph completions. Students analyze their exam results at the conclusion of each unit and write goals to improve weak performance areas. Students know how exams are evaluated. They know that each section is evaluated independently, without my knowledge of the test author. They also know that when the sections of the exam are totaled, I often write encouraging remarks on their progress (e.g., "Nice improvement on this multiple-choice section; keep working!"). Statistical analysis is given on the entire class performance, and the class discusses improvement strategies for the next exam. Besides the exams, alternative assessments are a near-daily occurrence.

11. Ideally, assessments inform teaching, and self-assessments can even inform the learner. Assessment is not only about evaluating the learning process. It should change the teaching process. Each assessment should inform the instructor as to needed changes in pedagogy, presentation, or missing fundamentals for conceptual understanding. Self-assessments can provide the learner with great potential to change.

The view from the gate as I encourage students to consider the science disciplines is generally positive. Together, the students and I investigate, listen to each other, and plan our journey together. Although assessment reports are needed at multiple levels (divisional, program, departmental), there is enough consistency across the requirements that only the perspective changes. Not only have I learned how to assess my typical community college students so that I know what they are learning, I've also learned how to teach better science through our shared assessments.

References

Angelo, T. A., and K. P. Cross. 1993. Classroom assessment techniques: A handbook for college teachers. 2nd ed. San Francisco, CA: Jossey-Bass Publishers.

Mintzes, J., J. Wandersee, and J. Novak, eds. 1999. Assessing science understanding: A human constructivist view. San Francisco, CA: Academic Press.

Roseman, J. E., and M. Koppal. 2006. Ensuring that college graduates are science literate: Implications of K–12 benchmarks and standards. In Handbook of college science teaching, eds. J. Mintzes and J. Leonard, 325–349. Arlington, VA: NSTA Press.

Rowe, M. 2003. Wait-time and rewards as instructional variables, their influence on language, logic, and fate control: Part one, wait-time. Journal of Research in Science Teaching Vol. S1, 19–32.

Index
Page numbers in boldface type refer to figures or tables.
A
AAHE. See American Association for Higher Education
Administrative support for system, institutions using resources, 9
ADT. See Astronomy Diagnostic Test
Advanced Placement examinations, 21
American Association for Higher Education, 9
The American Biology Teacher, 115, 117
Analysis, Bloom's taxonomy, 36, 50
Analysis questions to assess higher-order thinking, 39–40
AP examinations. See Advanced Placement examinations
Application, Bloom's taxonomy, 36, 50
Application questions to assess higher-order thinking, 39
Appropriate Workload Scale, 132
Armed Services Vocational Aptitude Battery, 21
Assessments
   college science laboratory, alternative forms, 49–54
   defined, 7
   with e-mail, 87–88
   electronic portfolios for, 59–66
   faculty-owned, building culture of, 7–13
   5-E lesson plan development, 81–84
   higher-order thinking, multiple-choice questions, 35–41
   to link research to practice, 31–32
   for motivating students, 151–153
   multiple-choice, 43–48
   nature of science, alternate form of assessment, 77–79
   patterns, 129–136
   peer, 89–93
   performance indicator development, 129–136
   practices jeopardizing, 137–145
   quantitative, in large undergraduate science classrooms, 15–30
   with student remotes, 87–88
   through creation of scientific posters, 55–57
   varied assessment, 147–149
   wetscience, 67–75
Astronomy Diagnostic Test, 18, 22
ASVAB. See Armed Services Vocational Aptitude Battery

B
Backwash effect, 129
BCI. See Biology Concept Inventory
Benchmarks for Science Literacy, 59
Bio 2010, 59
Biology Concept Inventory, 19
Biweekly exams, 151
Bloom's taxonomy, 36–37, 50–51, 109
Book reviews, student-authored, 115–116

C
Challenges in building culture of faculty-owned assessment, 8
Child, explaining to, as alternative assessment, 53
Classification as alternative assessment, 52
Classroom methodologies, quantitative assessment, 16–20
Classroom-tested practices, instructions, implementation, 85–120
Clear Goals and Standards Scale, 132
CMS. See Course management systems
College classroom assessment, 33–85
   alternative assessments, 49–66
   teacher education, 67–84
   traditional assessments, 35–48
College science laboratory, alternative forms of assessment, 49–54
   Bloom's taxonomy, 50–51
   types of assessment, 51–52
Commission on Future of Higher Education, 8
Communication, in culture of faculty-owned assessment, 10
Completion of exam evaluation
   over course of semester, question regarding, 100
   question regarding, 101
Comprehension, Bloom's taxonomy, 36, 50
Construct validity
   defined, 3–4
   in validation of survey instruments, 4
Constructive alignment, 129
Content validity, defined, 3–4
Course management systems, assessing learning via, 22–25
Create, posters, 52
Criterion-related validity, defined, 3–4
Culture of faculty-owned assessment, 7–13
   American Association for Higher Education, 9
   assessment, defined, 7
   building, obstacles to, 8
   challenges, 8
   Commission on Future of Higher Education, 8
   communication, 10
   Faculty Learning Communities, 11
   faculty time, 8
   foundations, 8
   isolation, 8
   No Child Left Behind Act, 8
   planning, 9–10
   results, 12–13
   skepticism, 8
   values
      clarification of, 9
      structure reflecting, 10–11
   varying involvement, 8
   visible, vocal leadership, 11–12

D
DAT. See Graduate entrance exams
Data-oral alternative assessment, 53
Design/build/produce as alternative assessment, 52
Diagnostic learning log, 95–97, 96
Distracters, multiple-choice assessments, 46, 46
DLL. See Diagnostic Learning Log
Documenting learning outcomes, multiple measures, 9
Donations Please project, 91

E
e-mail
   formative assessments using, 87–88
   sending grades via, 153
e-portfolios. See Electronic portfolios
Edmiston, Dr. Judith, 109
Electronic portfolios, college science course assessment, 59–66
   assessing, 61–63
   checklist for, 62
   defined, 60
   National Coalition for Electronic Portfolio Research, 60
   qualitative scoring, 63
   rubric for, 63
Embedded assessment, advantages, disadvantages, 124–125
Engagement, student, 95–97
Epistemological commitments underlying activities of science, 79
Evaluation, Bloom's taxonomy, 37, 51
Evaluation questions to assess higher-order thinking, 40–41
Exam analysis, instructor perspective, 109–110
Exam corrections, analysis, student perspective, 107–108
Expert-teaching others in class, as alternative assessment, 53–54
Explaining to child, as alternative assessment, 53

F
Face validity, defined, 3–4
Faculty Learning Communities, 11
Faculty-owned assessment, culture of, 7–13
   American Association for Higher Education, 9
   assessment, defined, 7
   challenges, 8
   Commission on Future of Higher Education, 8
   communication, 10
   Faculty Learning Communities, 11
   faculty time, 8
   foundations, 8
   isolation, 8
   No Child Left Behind Act, 8
   planning, 9–10
   results, 12–13
   skepticism, 8
   values
      clarification of, 9
      structure reflecting, 10–11
   varying involvement, 8
   visible, vocal leadership, 11–12
Faculty time as obstacle to building campuswide assessment program, 8
Failure prediction, assessment as tool for, 25–26
FCI. See Force Concept Inventory
Field-testing multiple-choice assessments, 47
5-E lesson plan development, science content learning with preservice teachers, 81–84
FLCs. See Faculty Learning Communities
FMCI. See Force and Motion Concept Evaluation
Force and Motion Concept Evaluation, 21–22
Force Concept Inventory, 18, 22, 25
Foundations of culture of faculty-owned assessment, 8
Future classes, using exam evaluation in, question regarding, 101

G
Gateway instructor, assessment lessons learned as, 123–127
Gateway instructor, assessments learned as, 123–127
General assessment topics, 132
Good Teaching Scale, 132
Graduate entrance exams, higher-order thinking, 35
GRE. See Graduate entrance exams
Group assessments, as alternative assessment, 54

H
Helpfulness of exam evaluation, question regarding, 101
Higher-order questions, multiple-choice tests using, as alternative assessment, 54
Higher-order thinking
   writing/using multiple-choice questions to assess, 35–41
      analysis questions, 39–40
      application questions, 39
      Bloom's taxonomy, 36–37
      distribution of questions, 38
      evaluation questions, 40–41
      graduate entrance exams, 35
      multiple-choice questions, 37–41
      synthesis questions, 40
Hypothesis modification activity, 103–106
   evaluation, 105–106
   group decision making process, 104
   implementation, 103–105
   student agreement on evaluation, 105

I
In-groups, out-groups, comparison of learning between, 20–22
Inquiry-based labs, scientific report, 111–114
Isolation, as obstacle to building culture of faculty-owned assessment, 8

J
Journal of College Science Teaching, 115, 117

K
Knowledge, Bloom's taxonomy, 36, 50

L
Lab exams, 51
Lab reports, 51
Laboratory investigation scientific report, grading criteria, 112–114
Large undergraduate science classrooms, quantitative assessment, 15–30
   failure prediction, assessment as tool for, 25–26
   in-groups, out-groups, comparison of learning between, 20–22
   pretest scores, final grade correlations, 25
   strugglers, capturing, 26–27
Law School Admissions Test, 21
Leadership, in culture of faculty-owned assessment, 11–12
Linking research, practice, in organizing course on student assessment, 31–32
Listening to students, importance of, 4–5, 125
LSAT. See Law School Admissions Test

M
MCAT. See Medical College Admissions Test
Medical College Admissions Test, 21, 35
Motivation, 101, 151–153
   actively proctor exam, 152
   class time, 152
   e-mail list, sending grades over, 153
   fairness of tests, 152
   frequent exams, 151
   promoting class work, 152
   relating current topics to past topics, 152
   rewarding students, 152
   short exams, 151
   source of questions, mentioning in class, 152
Multiple-choice assessments, 45–48
   analysis questions, 39–40
   application questions, 39
   to assess higher-order thinking, 35–41
   Bloom's taxonomy, 36–37
   distracters, 46, 46
   distribution of questions, 38
   evaluation questions, 40–41
   examples, 38–41
   field-testing, 47
   graduate entrance exams, 35
   scoring guide, 47
   synthesis questions, 40
   teach students to take, 43–44
   using higher-order questions, as alternative assessment, 54
   writing, 37–38

N
National Coalition for Electronic Portfolio Research, 60
National Science Education Standards, 59
National Science Foundation, established teaching fellowships, 73
Nature of science, gauging, 77–79
The New York Times Book Review, 115
No Child Left Behind Act, 8
Nonscience major, explaining to, as alternative assessment, 53

O
Open-ended written questions, 4
Organizing course on student assessment, practice, research, linking, 31–32
Outreach program, math, science, technology incorporation into service learning. See Wetscience

P
Patterns in assessment, 129–136
   Appropriate Workload Scale, 132
   backwash effect, 129
   changing, 134–136
   Clear Goals and Standards Scale, 132
   constructive alignment, 129
   evaluation tool, 132–134
   Good Teaching Scale, 132
   influence of teaching on assessment, 132
   integrating assessment items into, 131–132
   judging assessment pattern, 130–131
   measuring, 134
   student experience, assessment's impact on, 130
Peer assessment, 89–93
Planning, in culture of faculty-owned assessment, 9–10
Posters, 55–57
   as alternative assessment, 52
   assessment of students' learning through, 55–57
   scoresheet criteria, 57, 57
Practice, linking research, in organizing course on student assessment, 31–32
Practices for improving assessment, 121–153
Practices jeopardizing student assessment, 137–145
Preparing Tomorrow's Teachers to Use Technology, 72
Pretest scores, final grade correlations, 25
Principles of survey instrument development, 4
PT3. See Preparing Tomorrow's Teachers to Use Technology
Putting assessment findings into practice, 9

Q
Quizzes, weekly, 51

R
Reflection on exams, promotion of, 99–101
   completion of exam evaluation, question regarding, 100–101
   future classes, using exam evaluation in, question regarding, 101
   helpfulness of exam evaluation, question regarding, 101
   motivation to complete exam evaluation, question regarding, 101
   student survey results, 100–101
Remote, student, formative assessment with, 87–88
Rubrics, classroom-tested, implementation, 85–120

S
SALG. See Student Assessment of Learning Gains
SAT. See Scholastic Aptitude Test
Scaffolding, 124
Scholastic Aptitude Test, 21
Science for All Americans, 59
Science News, 115
Scientific American, 115
Scientific posters, 55–57
   scoresheet criteria, 57
Scientific report, inquiry-based labs, 111–114
Scientific Report of Laboratory Investigation Grading Criteria, 111–114
Show/demonstrate as alternative assessment, 52
Skepticism, as obstacle to building campuswide assessment program, 8
Staff, work on, assessment as, 9
Statistical Reasoning Assessment, 19
Structure reflecting values, in building culture of faculty-owned assessment, 10–11
Strugglers, capturing, 26–27
Student Assessment of Learning Gains, 17
Student experience, assessment's impact on, 130
Student-led teaching models, 117–120
   grading criteria, 118–120
Student remotes, formative assessment with, 87–88
Student survey results, 100–101
Students performing exam corrections, outline for, 107–108
Success, supporting, formative activity focused on, 9
Survey instrument validation, 3–6
   construct validity, 4
   definitions, 3–4
   listening to students, 4–5
   open-ended written questions, 4
   principles of survey instrument development, 4
Synthesis
   Bloom's taxonomy, 36, 50
   questions to assess higher-order thinking, 40

T
Taxonomy, Bloom's, 36–37, 50–51, 109
Technical support for system, institutions using resources, 9
Third International Mathematics and Science Study, 73
TIMSS. See Third International Mathematics and Science Study
Topics in assessment, 132
Transparent assessment, 9

V
Validation of survey instruments, 3–6
   construct validity, 4
   definitions, 3–4
   listening to students, 4–5
   open-ended written questions, 4
   principles of survey instrument development, 4
Validity, defined, 3–4
Values, clarification in building culture of faculty-owned assessment, 9
Varied assessment, 147–149
   dynamic vs. static, 148
   individual vs. group performance, 148
   individual vs. group review, 148
   product vs. process, 148
Varying involvement, as obstacle to building culture of faculty-owned assessment, 8
Views of Nature of Science Questionnaire, 78
   typical questions from, 78
Visible, vocal leadership, in culture of faculty-owned assessment, 11–12
VNOS. See Views of Nature of Science Questionnaire

W
Weekly exams, quizzes, 151
Wetscience, math, science, technology incorporation into service learning outreach program, 67–75
Writing/using multiple-choice questions to assess higher-order thinking, 35–41
   analysis questions, 39–40
   application questions, 39
   Bloom's taxonomy, 36–37
   distribution of questions, 38
   evaluation questions, 40–41
   graduate entrance exams, 35
   multiple-choice questions, 37–41
   synthesis questions, 40
Written explanation alternative assessment, 53
Written questions, open-ended, 4
