
Constructed Response in Science Toolkit


Developed by:

Pete Spencer K-5 Science Consultant St. Clair County ISD

pspencer@stclair-isd.k12.mi.us

810-364-8990x264

For downloadable CR examples go to www.geocities.com/sciencepete

Toolkit Introduction


Any good carpenter knows having the right tools makes the job at hand easier. Hopefully this toolkit will make it easier for you to “build” and use constructed response assessments in science.

The materials in the toolkit are organized so that you will first get a feel for what constructed response assessment in science is. You will see examples released directly from past science MEAP tests, actual student responses to those items, and the scores they received. And, in the toolkit appendix, you will see several other examples appropriate for various topics from kindergarten to fifth grade.

You will then be presented with evidence as to why they are important. You will see why constructed response assessments warrant the time and energy required to design and use them in your daily teaching, not just as preparation for the MEAP test.

You will learn how to design constructed response assessments for students at your grade level. There is no magic to creating a good constructed response assessment, but there are components that you should try to include in each one. You will find a template to use to ensure that you consider each of those components.

Finally, the scoring of constructed response assessments is addressed. You will see sample rubrics that can be used by you, your students, or both when scoring the assessments.

So dig in and find the right “tools” for the job of teaching and assessing science to students in an interesting, authentic way.


What is a constructed response assessment?


In a nutshell…

A constructed response assessment is one that asks students to produce an answer, not just select one.

Here are some characteristics of a constructed response assessment:

Constructed response, as its name implies, asks students to construct, create, or, in other words, “do” something to build their answer to a question or problem.

Constructed response does not ask students to choose a correct answer from several possibilities, match terms with definitions, or decide whether a statement is true or false.

The product asked for in a constructed response assessment can take many forms:

Short written response (one or two sentences)

Longer written response (a paragraph or more)

Labeled diagram

Graph

Oral response

Model

Justified selected response

Constructed response assessments may take longer for students to complete than selected response, but the benefits will be worth it.

Turn the page to see a comparison between assessing a benchmark using selected response (multiple choice) and constructed response.

Below is the elementary science benchmark, SCI.III.2.E.5, with the associated standard and the specific description of what students should be able to do.

Code: SCI.III.2.E.5

Standard: All students will analyze how parts of living things are adapted to carry out specific functions.

Description: Explain functions of selected seed plant parts.

Key Concepts: Plant parts—roots, stems, leaves, flowers, fruits, seeds.

Real-World Contexts: Common edible plant parts, such as bean, cauliflower, carrot, apple, tomato, spinach.
Questions 1–3 below are actual selected response items released from a prior 5th grade science MEAP test. These items assess student knowledge of the above benchmark.

1. The trunk of an oak tree is like what part of a bean plant?

a. seed

b. stem

c. bean

d. root

2. The part of the seed showing growth will develop into the


a. flowers of the plant.

b. fruits of the plant.

c. leaves of the plant.

d. roots of the plant.

3. What part of the bean makes food for the rest of the plant?

a. roots

b. stem

c. leaves

d. flower

You probably got all three items correct (the answers are on page 4). Review the parts of the benchmark above. Do you feel these three items adequately assess your true understanding of the benchmark? Now try the constructed response assessment on the next page for comparison.

The following is an example of a constructed response assessment that also assesses the same benchmark assessed by the items on the previous page.

The item is written like one that would be found on a science MEAP test; however, it was not part of a previous science test.

Randy and Ian were standing in the park talking on a warm spring day. Randy was standing next to a small tree. As they talked, Randy pulled leaves off of the branches. Ian told him he should stop doing that, because it could hurt the tree.

Using what you know about the function of leaves on a plant, explain why removing the leaves from a tree in the springtime could be harmful to it.

The rubric for scoring this item is found on the next page. Before looking at the rubric consider how answering this question differed from answering the three selected response items on page 2.

Does it better assess your understanding of why leaves are important to a plant? Does it require a deeper understanding of the role leaves play than the selected response items? That’s what a constructed response assessment can do very well.

Now turn to the next page to see how you did!

Scoring Rubric for “Leaf Picking” Assessment

Scoring Guide: 2 pts. Total

2 = At least two acceptable responses

1 = One acceptable response

0 = No acceptable responses

Among the acceptable responses:


- The leaves make food/sugar for the tree.

- Plants get most of their food/energy in the spring and summer.

- Leaves get energy from the sun.

- The stem and roots of the plant need food/sugar made in the leaves.

- New leaves will probably not grow until the next year.
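For teachers who track scores with a quick script, the two-point logic above is simple enough to sketch in code. The Python below counts how many distinct acceptable ideas appear in a response and caps the score at two points. The keyword lists are illustrative stand-ins for the acceptable responses, not an official scoring tool, and real responses would still need a human read:

```python
# Illustrative sketch of the 2-point rubric: one point per distinct
# acceptable idea found in a response, capped at 2 points.
# Keyword phrases below are hypothetical stand-ins, not official criteria.
ACCEPTABLE_IDEAS = [
    ("make food", "makes food", "food/sugar"),    # leaves make food for the tree
    ("energy from the sun",),                     # leaves capture sunlight
    ("stem and roots need",),                     # food travels to stem and roots
    ("not grow until", "won't grow back"),        # new leaves won't return this year
]

def score(response: str) -> int:
    """Count distinct acceptable ideas mentioned, capped at 2 points."""
    text = response.lower()
    hits = sum(any(phrase in text for phrase in idea) for idea in ACCEPTABLE_IDEAS)
    return min(hits, 2)

print(score("The leaves make food for the tree and get energy from the sun."))  # 2
```

Naive keyword matching will miss paraphrases, which is exactly why the toolkit has students and teachers score with the rubric rather than a checklist.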

Answers from selected response items on page 2:

#1 – B

#2 – D

#3 – C

The comparison of the two types of assessments and the level to which they assess a student’s understanding of the concept is a good example of why constructed response assessments are important. The next section on “The WHY” addresses that in more detail.

The following few pages provide more examples of what constructed response assessments look like. Some of the items are copied directly from the released items on the 1999 MEAP test. While targeted to fifth graders, consider how you could adapt them to your particular grade level. Students need experience with these types of problems prior to fifth grade. You will also find examples from the released model of the 2002 MEAP test.

There are many more examples of constructed response items for all grade levels in the appendix at the end of the toolbox and available

online at geocities.com/sciencepete/CR/constructed_response_home.htm.

The following released item asks students to analyze a graph showing data from an investigation. Ask your students to do the same throughout the year, whether with data you provide (like this item) or with data gathered and graphed by your students.

Sometimes students go through the mechanics of graphing data, but do not take or have the time to analyze what the graph is telling them.


Here is the rubric the state used to score student responses for this item:

SCORE POINTS:

2 = Two acceptable scientific observations about the effect of water on plant growth

1 = One acceptable scientific observation about the effect of water on plant growth

0 = No acceptable scientific observations about the effect of water on plant growth

Some acceptable scientific observations include the following:

The amount of water affects plant growth.

Just enough water will help a plant grow.

More water is good, to a certain point.

Too little water can slow growth.

Too little water can dry out a plant.

Too much water can kill/drown a plant.

Forty liters in two months produces the most growth.

NOTE: The above list is not all-inclusive.

Actual student responses and the scores they received for this item are found in the section on Scoring Constructed Response.

Here is an item that asked students to “construct” or create a graph from data provided in a table. This is a skill that students should learn and practice starting at second or third grade with appropriate simplification of numbers and scales.


Here is the rubric used to score student responses on this item:

SCORE POINTS:

3 = All three bars drawn correctly with correct labels and scale and an appropriate title

2 = At least two bars drawn correctly with correct labels

1 = One bar drawn correctly with correct labels

0 = No bars drawn correctly

NOTES: If grid lines are drawn, the bar for Bark needs to touch the 10-gram line, within a reasonable margin of error. The other bars need only come between the appropriate grid lines; they need not be exactly midway between the lines. The response must be a bar graph in order to receive a score other than 0.

Actual student responses and the scores they received for this item are found in the section on Scoring Constructed Response.

This constructed response item from the ’99 MEAP test is a good example of a question built around a prompt including both writing and diagrams.

When you give constructed response assessments to your students, occasionally provide them with prompts that appropriately challenge them in terms of reading. They will be facing such prompts on the MEAP tests and should be accustomed to them by test time.


Here is the rubric used by the state to score student responses:

SCORE POINTS:

1 = A position is taken that either the crayon did or did not change, and an adequate explanation is provided for this position

0 = A position is NOT taken, OR a position is taken but an inadequate explanation is provided for this position.

Some acceptable responses include the following:

- Yes/Charlene is right because the crayon’s shape changed.

- Yes/Charlene is right because the crayon’s state of matter changed (changed from solid to liquid when it melted OR changed from liquid to solid when it cooled).

- Yes/Charlene is right because the crayon’s flexibility changed (the original crayon or hardened puddle would snap but the liquid puddle would not).

- No/Charlene is wrong because the crayon’s color did not change.

- No/Charlene is wrong because the crayon’s chemical nature did not change (it remained wax).

- No/Charlene is wrong because the crayon’s flexibility did not change (the hardened puddle would snap just like the original crayon would).

Actual student responses and the scores they received for this item are found in the section on Scoring Constructed Response.

Here are some constructed response items from the model of the 2002 science MEAP test. One of the big differences is that these items are worth three points as opposed to two points like the previous items. They ask students to do a little bit more.


Note that students are asked to do two things: Classify the organisms as producer, consumer, or decomposer and write an explanation of the role of decomposer.

It is important to give students practice performing more than just one task based on a particular prompt. For example, give students data on a particular investigation and have them 1) graph the data and 2) draw a conclusion from the data, or 3) provide them with the conclusion and ask them for evidence from the data that supports that conclusion.

The following item asks the students to perform only one task, but it is a fairly extensive one.


One more example of what constructed response on the new MEAP might look like:


Note: The prior examples show what constructed response assessments on the MEAP test have looked like and may look like. Constructed response assessments can look very different from these, as you will see in some of the examples in the appendix and online. The above examples are provided so you can see the types of CR assessments students must answer on the 5th or 8th grade MEAP tests. Some of their assessments in the middle and upper elementary grades should look like these to adequately prepare them for the test.

Constructed Response at Lower Grades

The previous examples are just some of what constructed response items can look like, at fifth grade, in particular. Here are a few ideas of what they might look like at lower grades. [Remember, you will find more examples for all grade levels in the appendix.]

At the early elementary grades constructed response might involve students being asked to create a histogram (bar graph) using beans to represent how many of each type of bean are in a mix. The students could use lima beans for their bars. Each lima bean might represent 5 or 10 of each type of bean.
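The scale idea can be checked with quick arithmetic. This Python sketch uses made-up bean counts (the toolkit does not specify a mix): with each lima bean standing for 5 real beans, a pile of 20 kidney beans becomes a bar 4 lima beans tall.

```python
# Hypothetical counts for a classroom bean mix; each lima bean marker
# on the histogram represents 5 actual beans (the scale suggested above).
counts = {"kidney": 20, "pinto": 35, "black": 10}
SCALE = 5  # one lima bean = 5 beans of that type

bar_heights = {bean: n // SCALE for bean, n in counts.items()}
print(bar_heights)  # {'kidney': 4, 'pinto': 7, 'black': 2}
```

Choosing a scale that divides the counts evenly keeps the activity honest for young students, since partial lima beans are hard to represent.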

Students might be asked to classify objects by placing pictures of them on a Venn diagram.

Young students can also be asked to draw their ideas of certain science concepts. As their vocabulary increases they can be asked to label the drawings. An example might be to ask students to draw three or four pictures showing the stages of the life cycle of a bean plant.

In the middle elementary grades students can also create their own graphs and/or be asked to interpret simple graphs that are given to them.

Also, now that students are able to write down their ideas, they should be challenged to explain concepts and use their knowledge of the science process and concepts to solve problems.

A constructed response assessment of the elementary benchmark on simple machines for second or third grade students might ask them to draw and describe how they can use two pulleys, a rope, and some clothespins to pass notes and materials across the room. You would tell them that they must include what they know about pulleys that makes their design work (i.e., the fact that pulleys allow you to change the direction of a force).

Next in the toolkit is a look at the many reasons why constructed response assessments should be included in good science instruction.

Why use

constructed response assessments?


In a nutshell…

Constructed response assessment is a quality learning opportunity for students and a more authentic assessment tool for teachers.

The comparison of selected and constructed response assessment of the SCI.III.2.E.5 benchmark on the function of plant parts (pages 2 and 3 of the “What” section) presents a good case for the benefits of using constructed response assessment in addition to selected response.

Asking students to explain why pulling leaves from a tree in spring can harm the tree, rather than simply asking which plant part makes food for the plant offers the following added benefits:

Before answering, students must consider all that they know about leaves, trees, and plants. As they struggle to process what they know and make connections they gain a deeper understanding.

Students must demonstrate an understanding that leaves make food for the entire plant and that food is transported through the stem to the roots.

If students include a diagram, that provides another way for them to think about how plants function and for teachers to assess their understanding.

During the scoring of the assessment students are presented with several correct responses; some of which they might not have considered. This provides new information at the best time, when they can see the connection and when they have a vested interest in it.

Students scoring each other’s responses can be asked to defend their responses to the scorer. This presents a great opportunity for substantive conversation in supporting their response.

Finally, teachers are given a more accurate assessment of each student’s understanding of the function of leaves in a plant.

More on the importance of constructed response assessment:

Constructed response often entails asking students to write. Writing in any content area increases the learning that takes place. Traditionally, science, like mathematics, has been a content area where students are not expected to write as much or as often as in other content areas.

Read what Theron Blakeslee, Director of the Jackson County Mathematics & Science Center and former MDE Science Consultant, has to say about writing in science. The complete article is included at the end of this section.

Writing in science is … extremely important for learning. Writing gives students a chance to think carefully about what they have observed and how they account for their observations. When held to high expectations for clarity in writing, students have to think deeply about the terms they choose to express their thoughts, as well as the logic of their arguments. Writing gives all students a chance to think about questions posed in class, rather than just listening to others’ descriptions and explanations, providing a valuable forum for expression for those students who are reluctant to join into class discussions.

Excerpt from Helping Students with Constructed Response Items on the HSPT and MEAPs, by Theron Blakeslee

Here is more information on the benefits of writing in science from the draft copy of User-Friendly Writing to Learn for Science, by the Content Literacy Committee of the Michigan Department of Education.

WRITE TO LEARN

SCIENCE KNOWLEDGE IS INCREASED THROUGH WRITING BECAUSE WRITING…

Demands THINKING. Writing is an ACTIVE process. The students cannot remain passive.

ORGANIZES and CLARIFIES a student’s thought.

Increases RETENTION because an additional learning mode has been employed. The students are not just LISTENING; they are also KINESTHETICALLY involved.

Enables the teacher to SEE immediately WHO doesn’t understand and WHAT isn’t understood.

Helps students gain a TOTAL PICTURE of a concept or process.

How to create your own constructed response assessments

You have seen what a constructed response assessment looks like and why it is important to use them with your students.

Now it’s time to see how to create your own CR assessments for the lessons and units that you teach. Since there is no great repository for quality constructed response items for all topics at all grade levels, you need to know how to create your own.

This section provides:

1) A process to consider when designing and using a constructed response assessment,

2) A list of constructed response “products,”

3) Key components of a constructed response item,

4) What to include in a scoring rubric, and

5) A template to use to guide the development of your own constructed response assessments.

The next page outlines a process for the development and use of constructed response items. While it may seem lengthy and rather formal, it is important, as you begin to design your own assessments, that you consider all of those steps. Soon it will become second nature to you, and you will not need to consult the process page or use the template to quickly create high-quality assessments for your class.

Constructed Response Assessment

Development & Use Process

1. Clearly identify the concept or benchmark you want to assess.

2. Decide what you want your students to “construct.” What will the product be? (e.g., written response, labeled graph, diagram)

3. Design a real-world prompt connected to your students’ lives to engage them and provide a context for the assessment.

4. Create the task you will ask your students to complete. Be clear as to what the product should look like and include.

5. Create a scoring rubric.

6. Give the assessment to your students and monitor them as they work on it.

7. Score the responses using the rubric: students, teacher, or both can score it.

8. Analyze the responses, and revise your instructional strategy, if needed.

CR Products

Here are some ideas of products you can ask your students to create for their assessment. This list is certainly not all-inclusive, but it will give you some grade-appropriate ideas to get started.

K-2 Ideas

Diagrams/pictures (possibly labeled or explained to you by the students)

Explanations (oral or written) of simple histograms (bar graphs) or pictures

Venn Diagrams for comparisons and classifications

Lists (students generate)

Completing a statement or short story either in writing, orally, or with a picture

Construction of a model (with labeled parts) based on something the students have just learned about; e.g. a spider

Given objects or pictures, ask students to organize them in some way

Any of the following grades 3-5 ideas that you feel can be adapted for successful use by your students

3-5 Ideas

Written response to a prompt and question(s)

Labeled diagram

Explanation of data provided in a table or graph

Creation of a graph using data that is provided; possibly with an explanation of what they can conclude from their constructed graph

Explanation of a photograph or answer to a question based on a photo.

Solution to a problem: Students are asked how they would solve a problem using the assessed concepts.

Persuasive Letter written to a given audience using the assessed concepts as evidence to support their position.

Construction of a model that demonstrates understanding of the assessed concept.

Any of the K-2 ideas that can be adapted to appropriately assess your students.

Key Components of a Constructed Response

There are four key components of a constructed response assessment:

1. The Prompt

2. The Product

3. The Task

4. The Rubric


Each of these is described in more detail below.

The Prompt

The “prompt” should be used to “set the stage” for the assessment. It should relate to the world of your students, which will be different depending on their age and background.

It should be interesting

An interesting prompt can capture a student’s attention and interest in completing the task. Too often it seems that students “just don’t want to write.” Maybe it is because we don’t ask them to write about interesting things.

Think back to the first constructed response assessment in the toolkit (page 4) on the function of leaves on a tree. Students could have been asked to “Explain the function of leaves on a plant.” While some students would need only that to motivate them to write what they know about leaves, many other students need more.

The story about the two boys talking while one picks leaves off of a tree is something that all students can relate to. Undoubtedly all have picked at least some leaves off of a plant at least once. This may engage them and encourage them to think about how that might have impacted the plant. That prompt does not give them the knowledge needed to answer the question, but it motivates them to think about and try to answer it.

It should be local

Use local geographical features, landmarks, and industries in your prompts when they apply. For example, in St. Clair County, if you are assessing your students’ understanding of expansion and contraction of matter caused by changing temperatures, include the Blue Water Bridge in your prompt. If you are assessing students’ understanding of the water cycle, mention Lake Huron or the St. Clair River in your prompt.

It should require reading (grade appropriate amounts)

Fourth and fifth grade students should occasionally be exposed to paragraph-length prompts. Longer prompts let you set a more detailed scene and give students practice with the types of prompts they will see on the science MEAP test. Here is a prompt from a prototype of the 2002 5th grade science MEAP.


Here is the constructed response question that is asked based on the above prompt:

Shannon decided that the regular yellow popcorn was the BEST popcorn. Identify two pieces of evidence from the chart that support her decision.

Students are required to read the entire paragraph and understand the chart in order to answer the question. That prompt would be used for several other selected response questions in the cluster. Students need to be challenged by such prompts before the MEAP so they are accustomed to them by the fifth grade test.

Not all prompts need to be paragraphs, though. At the lower grades they will need to be short and to the point. You can orally explain the prompts in greater detail to better set the scene, but young students should not have to wade through so much writing that they lose track of the task.

It should be relevant

The prompt should relate to the task asked of your students. An interesting, local story that does not relate to the task may just serve to confuse them.

The Product

The types of products you can ask your students to produce have been discussed. When choosing the type of product you should consider:

Which type best helps students show me what they really know and understand? In other words, a bar graph might be a good product to help assess changes in populations in an ecosystem over time, but would probably not be a good one to assess students’ understanding of simple machines.

What is developmentally appropriate for your students?

What type of product have I not used in a while? To accommodate various learning styles and to give students practice with several types, you should mix up the product types.

The Task

The task, as defined here, refers to the actual directions to your students as to what they should do, such as…

… Create a bar graph using the data in the table and explain…

… Using what you have learned about gravity and friction explain why the car had difficulty stopping…

… Draw and label the parts of the plant used for the intake and transport of water and minerals throughout the plant.

Here are some guidelines for evaluating your tasks:

A well-written constructed response task should:

Clearly tell students what they are to do.

Clearly tell students where they are to write their response. If the students are to write their response on the assessment page, there should be sufficient space for them to respond completely.

Use simple, but authentic, vocabulary and good sentence structure.

Identify the information or material that the students should use when preparing their response (data chart, graph, lab activity report, etc.)

Clearly indicate the process that should be demonstrated (identify, explain, predict, describe, etc.)

Provide cues to students as to what the finished product might look like. (Write a paragraph, list two pieces of evidence, draw a diagram and label… etc.)

The Rubric

The rubric is an important part of the constructed response assessment. It is addressed in detail in the following section on “Scoring Constructed Response Assessment.”

Evaluating The Assessment

It is important to observe your students while they are completing the assessment to see if they understand the directions and know what is expected of them.

If, upon scoring the responses, a large percentage of your students score poorly on the assessment, it may be because of one of two reasons: 1) They do not have a good understanding of the science content and concepts you are assessing, or 2) the assessment is not designed to allow them to show their understanding.

You need to be the judge of which of the above is occurring. It is likely that by evaluating where their mistakes or omissions are, you will be able to tell whether the assessment needs to be modified the next time or whether you may need to go back and teach the concepts in a different way so that more students master them to the desired level.

On the other hand, if a large percentage of your students do very well on the assessment it may be because 1) they have a strong understanding of the concepts being assessed, or 2) the assessment is not challenging them sufficiently to express a deep understanding of the concept. If the latter is the case, the constructed response assessment is not much different than a selected response assessment that assesses at a very low level.

In either of the above cases, do not be afraid to evaluate the assessment and modify it for re-administering to your current students or make notes so that you will not use it again next year in the same form.

Constructed Response Assessment Template

Use this template to design a well-thought-out constructed response assessment. The sections below are suggested considerations only. You might want to consider other factors than these when designing your assessments.

Grade:

What is the benchmark, standard, or concept you will be assessing?
State science benchmark: (Include the description and the relevant key concepts and real-
world contexts.)

This is to be used as a(n):

Pre-assessment

Embedded assessment

Post-assessment

This assessment will ask students to “construct” a(n)…

Product ideas:

Short, written response

Labeled graph or diagram

Letter or other “real-world” product

List (evidence, procedure steps, further questions, etc.)

Oral response

How will the responses be scored?

Student (peer) only

Student with teacher review

Teacher only

How many points will the assessment be worth?

Write the complete constructed response assessment here, including all prompts (text, graph, diagram, etc.).

Keep in mind the benchmark you want to assess.

The prompt: This is the setup to the assessment. It should engage your students. Connect it to the lives of your students. Keep it local, if possible.

The Task: What is it you want the students to do? Analyze data? Write a solution to a problem? Build something? Evaluate a procedure? Provide students with enough cues so they clearly know what is expected of them.

The Rubric: If possible, write the rubric so students can use it to score the assessment. Use sample rubrics to help you.

Other: Any miscellaneous notes about time or materials needed, scheduling the assessments, follow-ups, regular teacher involvement/responsibilities, etc.

Scoring constructed response assessments

A great benefit of constructed response assessments comes during the scoring of them. Your students should be involved in some way in that process. Whether they score their own product, a neighbor’s, or just get a chance to review how you scored their work, they should be involved. It provides yet another learning opportunity for the concepts being assessed.

Such an opportunity occurs when students use a scoring rubric to score their own or their classmates’ responses. In doing so, they see what the correct response(s) are and why they did or did not get credit. Maybe they did not know the science concepts, or maybe they did not include the appropriate evidence or write down all that they knew about the question.

There will be times when you will want to score your students’ responses. In those cases you should make the students aware of the rubric either right after they have completed the assessment or when you return their papers. It is best to do it when you will have time to discuss the correct responses.

A Rubric

Constructed response assessments should be scored using some kind of rubric, or written scoring criteria. Rubrics do not need to be large, complicated tables like you sometimes see, but they should address the concepts and skills you wanted to assess with the given student tasks.

The rubric is an important part of the constructed response assessment for a couple of reasons.

First of all, unlike selected response, true/false, or matching assessments, a constructed response will not always be clearly right or wrong. A well-designed rubric makes the job of scoring CR assessments quicker and easier. The more specific the rubric, the less room there will be for “arguments” if students are scoring their own or their classmates’ responses.

Secondly, you can never predict all possible responses and with some responses it will be difficult to determine whether the student really understands the concept or not. Some of the “arguments” mentioned in the previous paragraph can prompt excellent substantive conversation that requires students to defend their answers with further explanation and evidence. That makes for a tremendous learning opportunity.

Here are some steps you can take when creating a rubric:

1. As you did when you were creating the assessment, you should consider the concept, benchmark, or standard that you are assessing.

2. You need to decide what important content ideas you want your students to know and express in their response. You might give one point for each one, or decide whether two or three out of all the possible ideas are enough for full credit. For example, in the rubric for the leaf function assessment in the What section, there are five acceptable responses. Students receive a score of 2 (the maximum score) if they include two or more of them in their response.

3. Also, consider any science skills being assessed, e.g., graphing or diagramming. You might want to assign a point (or points) for details such as a title, appropriate scale, and axis labels on a graph, or correct labels on a diagram or model. A written response would probably not have such a score as part of the rubric.

4. Evidence: Along with the content components in the response, you might want to give a point(s) for the inclusion of evidence in their answer. Not all assessments will require supporting the response with evidence, but when evidence is required, at least a point should be assigned to it.

5. Then list as many possible correct responses/products for your assessment as you can. As mentioned above, you will likely not list all possible correct responses, but that is okay. In fact, discussions about responses not in the rubric make for excellent scientific conversations.

6. You must then decide how many points the assessment will be worth overall and how many and/or which of the possible correct responses the student must include for a top score, a medium score, and so on.

Constructed response items on the MEAP tests are worth two or three points each. All constructed response items on the 2002 5th and 8th grade science tests were worth three points each. You can use whatever scale you would like, though, when designing your rubric (e.g., 100%, 90%, etc., or 20 pts., 15 pts., 10 pts.).

7. Try not to make the rubric so complicated that your students cannot understand how the item was scored or cannot use the rubric to score their own responses or those of their peers.
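If you track scores electronically, the point scheme described in steps 2 and 6 can be sketched in a few lines of Python. This is only an illustration: the function name and the list of “acceptable ideas” below are invented placeholders, not the actual leaf function rubric from this toolkit.

```python
# Hypothetical example of the scoring scheme described above: a response
# earns the top score (2) if it includes two or more acceptable ideas,
# 1 point if it includes one, and 0 otherwise. These "acceptable ideas"
# are placeholders, not an actual rubric from the toolkit.
ACCEPTABLE_IDEAS = {
    "makes food",
    "collects sunlight",
    "takes in air",
    "releases water",
    "provides shade",
}

def score_response(ideas_in_response, max_score=2):
    """Count how many acceptable ideas appear, capped at max_score."""
    matches = len(set(ideas_in_response) & ACCEPTABLE_IDEAS)
    return min(matches, max_score)

print(score_response(["makes food", "collects sunlight", "is green"]))  # 2
print(score_response(["takes in air"]))                                 # 1
print(score_response(["is green"]))                                     # 0
```

The same logic works just as well as a tally column in a grade book or spreadsheet; the point is simply that the rubric states, in advance, how many acceptable ideas earn each score.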

For the lower elementary grades, peer scoring might not be wise to do often, but exposing students to it using simple criteria is good practice. For example, if a model of a spider is the assessed product, students can check each other’s to see if it has eight legs, two eyes, etc.

For examples of upper elementary rubrics, see the released items from the 1999 MEAP included in this section. Rubrics for lower elementary grade assessments would need to be appropriately modified.

Each item includes the scoring rubric used by the state as well as actual student responses and the scores they received for each. For the most part, you will see that the state rubrics are not very complicated. Also, several other examples of rubrics are found throughout the toolkit and online (see website listed on front cover of the toolkit) with the constructed response examples.

General Scoring Rubric

There is not a generic scoring rubric that can be used for every science constructed response assessment, because each one assesses different concepts and skills. However, the state did come up with a “General Scoring Guide” that can be used to help your students construct good, complete responses. You will find the scoring guide and a much simplified “poster” version of it included in this section.

You can enlarge or re-draw the mini-poster and post it in your room. When students indicate they are finished with their response, you can point to the poster and ask them:

“Are you sure your answer is scientifically correct?”

“Is it complete? Did you answer all that it asked of you?”

“Is it clear? Will someone know exactly what you are showing or explaining in your response?”

“Have you included evidence for your response, if appropriate?”

If students get in the habit of thinking of these four things when they complete constructed response assessments, they will be more likely to include those things that they know, but often do not get down on paper.

Substantive Conversation

Substantive conversation is one of the four Teaching and Learning Standards that should be part of every lesson according to the Michigan Curriculum Framework. Learning for children and adults is much deeper when the new ideas are discussed with others.

Constructed response assessment lends itself to such discussions much more than selected response assessments like True/False or multiple-choice items. The scoring of constructed response provides a wonderful opportunity for good, substantive conversation between students and between teacher and students. Here are some suggestions to use during the scoring process:

If students are scoring their peers’ responses, students can be allowed to defend responses that are not on the list of correct responses in the rubric. (There will always be some justifiable ones that you did not think of.)

If you have scored the responses and returned them to the students with the rubric, they should be allowed to defend any responses they think should be accepted. You can set appropriate guidelines for how they can appeal; e.g., the answer must be scientifically correct, they must defend it with evidence and/or examples, and their defense must be in writing.

You can have the class brainstorm a list of acceptable responses that will be used in the scoring. Students should be required to defend the responses they provide and others should have an opportunity to challenge the correctness of them. The teacher will act as the final judge of what answers are scientifically correct and appropriate for the assessment.

When the product of the assessment is something other than a written response, such as a picture, model, or graph, the products should be posted or displayed so that students can look at how others addressed the problem. Students should have an opportunity to ask questions of other students about their product and to compliment or challenge them on their work.

Examples from the MEAP

Included in this section are several constructed response items used in the 1999 science MEAP and sample student responses. They have been released for use by teachers and students. Look them over to see what is expected of students at a fifth grade level. Teachers at all grade levels have a role to play in preparing students to complete successful constructed response items.

Fourth and fifth grade teachers could even copy the student responses and the rubric and ask their students to score them (without including the state score). Then students can see how their scoring compared with the state. This will give them another way of seeing what is required of them.

Eighth grade test released items and more elementary released constructed response items can be downloaded from this site:

http://www.meritaward.state.mi.us/mma/released.htm

Constructed Response Examples

The following pages have examples of constructed response assessments that can be used at a variety of grade levels (K-5). Some include rubrics that can be used with them, or you can create your own. A brief explanation of each is below.

Please contact Pete Spencer at pspencer@stclair-isd.k12.mi.us or 810-364-8990 x 264 if you would like more explanation of any of these, or would like them emailed to you in Word format so you can modify them to your needs.

Each entry below gives the informal title, number of pages, grade levels, and a brief description of the assessment.

Melting Ice Cream (1 page, Grades K-2)
Shows understanding of melting in a context other than ice and water. Assesses understanding of how solids retain their shape while liquids take the shape of their container. Students can first color the scoops, then show how they might mix.

Fresh Water/Salt Water (2 pages, Grades K-2)
Uses a Venn diagram to assess students’ understanding of how uses of fresh water can differ from those of salt water. Students are to glue the pictures in the appropriate area on the diagram, “My Water Venn”. Example: swimming should be in the “both” area. Depending on the level, pictures must be cut out ahead of time (there are two sets on each page), or you can give a half page to each student and have them cut out the pictures as they need them.

A Helping Hand (1 page, Grades 1-5)
You need to provide a few different types of patterned material (~1 square yard each) and hang them around the room. For lower elementary, your students can be provided with just the butterfly diagram and given oral instructions. Older students can be given the entire sheet and can cut out the butterfly. Once all butterflies have been taped to the “habitats”, have the class look at all of them and discuss which ones would be most likely to survive.

Seeds Pre-Assessment (2 pages, Grades 2-4)
To be given prior to starting a plant unit. Asks students to predict what a seed needs to begin growing and what the seed will look like when it starts to grow.

Seeds Post-Assessment (3 pages, Grades 2-4)
Similar to the above pre-assessment, but this assumes that students have observed germination in the form of baggie gardens or something similar. Also assumes that students have weighed a collection of 100 dry beans and 100 that were soaked overnight. Rubric included.

Puddle (1 page, Grades 2-4)
Good pre-assessment on evaporation and water cycle concepts. May need editing due to a reference to spring.

Soccer Puddles (1 page, Grades 2-4)
Similar to the previous assessment, but expects more from the students. A better post-assessment than the previous one.

Energy and Tree House (1 page, Grades 3-4)
This addresses forms of energy. Students are asked to design their dream tree house. They must label how they would use each of five forms of energy in or near the tree house.

Sandbox Mixture (2 pages, Grades 3-4)
Students are asked to consider how they would “clean” a sandbox in the springtime. They must consider the properties of several materials that fell into the sandbox during the winter. They are asked to draw and label a “Sandbox Cleaner”.

Lever Assessment (3 pages, Grades 3-5)
Simple machines assessment focusing on levers and one other machine. Students must free a stuck four-wheeler, then get it onto the back of a pickup truck. Rubric included.

Beetle Survival (2 pages, Grades 3-5)
Assesses the characteristics-for-survival (camouflage) concept. Students must interpret a bar graph and use their understanding of camouflage to determine what happened to a beetle population. Rubric included.

Wet Clothes (3 pages, Grades 3-5)
This is a fairly detailed assessment in that it requires students to refer to three different bar graphs to completely answer the second question. Also, there must have been some discussion of how clothes dryers work for students to understand the first question. A rubric is included that can be used by your students for peer scoring.

Sounds Cool (4 pages, Grades 3-5)
Assesses many sound-related concepts. The context is a high school band. Rubric included.

No-Slip Socks (2 pages, Grades 3-5)
Friction assessment. Students must explain the forces at work when someone slips and falls. They must consider how to invent a pair of socks that will minimize such slipping. Rubric included.

Food Web (3 pages, Grades 4-5)
Students are given a collection of about seven toy animals (insects, snake, frog, etc.) along with the assessment. You could use a set of animal pictures or do it without the critters, too. Key terms are assessed. A rubric is included.

Clues are in the Shadows (5 pages, Grades 4-5)
Assesses concepts related to shadows. Students must analyze a diagram showing a cookie jar crime scene to see “Who dunnit?” They must demonstrate understanding of most shadow concepts. Some data graphing is included. Rubric included. A good summative assessment of shadows.