
WTS 1 and 2 page 1 of 27

Interpretation of Data Skills

Tate J. Hedtke

Saint Mary's University of Minnesota

Schools of Graduate and Professional Programs

Portfolio Entry for Wisconsin Teacher Standards 1 & 2

EDUW 691 - Professional Skills Development

Caroline A. Hickethier, Instructor

February 13th, 2017



Selected Wisconsin Teacher Standard Descriptors

Wisconsin Teacher Standard (WTS) 1: Teachers know the subject they are teaching.

The teacher understands the central concepts, tools of inquiry, and structures of the disciplines she or he teaches and can create learning experiences that make these aspects of subject matter meaningful for pupils.

Knowledge. The teacher understands how students' conceptual frameworks and their misconceptions for an area of knowledge can influence their learning.

Dispositions. The teacher realizes that subject matter knowledge is not a fixed body of facts but is complex and ever-evolving. He seeks to keep abreast of new ideas and understandings in the field. The teacher has enthusiasm for the discipline he teaches and sees connections to everyday life.

Performances. The teacher can evaluate teaching resources and curriculum materials for

their comprehensiveness, accuracy, and usefulness in representing particular ideas and concepts.

Wisconsin Teacher Standard (WTS) 2: Teachers know how children grow.

The teacher understands how children with broad ranges of ability learn and provides

instruction that supports their intellectual, social, and personal development.

Knowledge. The teacher understands how learning occurs (how students construct knowledge, acquire skills, and develop habits of mind) and knows how to use instructional strategies that promote student learning for a wide range of student abilities.

Dispositions. The teacher is disposed to use students' strengths as a basis for growth, and their errors as an opportunity for learning.



Performances. The teacher assesses individual and group performance in order to design instruction that meets learners' current needs in each domain (cognitive, social, emotional, moral, and physical) and that leads to the next level of development.

Danielson Domains

Domain 1: Planning and Preparation

Component 1e: Designing Coherent Instruction

Component 1f: Assessing Student Learning

Domain 3: Instruction

Component 1e: Designing Coherent Instruction



Pre-assessments

Self-assessment of Instruction Related to WTS and Targeted Student Learning Objective(s)

For Wisconsin Teacher Standards (WTS) 1 and 2, I have chosen to focus on improving my students' ability to succeed on interpretation of data questions on standardized assessments. I teach freshman-level physical science in the pull-out setting at Medford Area Senior High. This class follows an alternative curriculum that is aligned with the regular education curriculum and standards at our school, but is taught at a slower pace and in less depth. My classroom is made up of twelve freshmen and one sophomore with varying disabilities. Eight of my students have a Specific Learning Disability (SLD), which greatly affects their ability to participate in the general education curriculum. One student with an SLD is also an English Language Learner (ELL). Two students are in a pull-out science class due to their extreme Attention Deficit Hyperactivity Disorder (ADHD), which qualifies them for special education services under the term Other Health Impairment (OHI). One student has an Emotional Behavioral Disability (EBD), and the last has Autism Spectrum Disorder (ASD). All students have reading and math levels that are far below age and grade level expectations. Recently in my high school and community, there has been discussion about how educators can bridge the achievement gap between regular and special education students, mainly in the area of state-mandated standardized assessments. It is because of this initiative and community-wide discussion that I have decided to focus my student learning objectives on improving the scores of my special education students on assessment questions that require them to interpret data.

There are two knowledge descriptors I have chosen to direct my research and implementation of best teaching practices to help improve assessment scores. The first is that "the teacher understands how students' conceptual frameworks and their misconceptions for an area of knowledge can influence their learning." Special education students often have missed instruction throughout their middle school years because low math and reading abilities require them to take intensive intervention courses. These reading and math interventions often take the place of science and social studies classes. When students reach ninth grade, some have missed so much instruction that their rudimentary scientific abilities, such as reading a simple chart or graph, are inferior to those of their peers. A successful teacher at the high school level needs to know how to teach basic skills while still delivering proper grade-level curriculum and standards.

The disposition factor for this standard is also important to note, as all too often in special education classes we are confronted with the age-old question from students: "When are we ever going to need to know this?" The disposition factor for my knowledge descriptor for WTS 1 states as follows: "The teacher realizes that subject matter knowledge is not a fixed body of facts but is complex and ever-evolving. He seeks to keep abreast of new ideas and understandings in the field. The teacher has enthusiasm for the discipline he teaches and sees connections to everyday life." One of my goals is to impress upon students the importance of being able to understand, interpret, and analyze graphically depicted data, not only to succeed in high school, but also to succeed in higher education, future jobs, and healthy living.

The knowledge descriptor chosen to address WTS 2 is the following: "The teacher understands how learning occurs (how students construct knowledge, acquire skills, and develop habits of mind) and knows how to use instructional strategies that promote student learning for a wide range of student abilities." I feel that this knowledge descriptor is the basis for the need for special education staff and specialized pull-out instruction. It is imperative that I am able to incorporate differentiated instruction in order to meet the needs of my students with varying learning abilities. Since the beginning of the school year I feel that I have been teaching the average students in my course very well, but have been leaving the students with lesser ability behind, while not appropriately challenging the more gifted students in my classroom. I hope to incorporate a variety of strategies that can stimulate academic skill growth in students of all abilities.

The performance descriptor I have chosen to focus on from WTS 1 will help guide my curriculum choices, and also remind me what I need to change when instructional goals are not met as planned: "The teacher can evaluate teaching resources and curriculum materials for their comprehensiveness, accuracy, and usefulness in representing particular ideas and concepts." As an educator, one must be able to reflect on one's practices daily and have the humility to admit when certain outcomes are not met, and also to understand why students were not able to achieve what was expected.

Vertical standards are addressed in the performance descriptor from WTS 2 I have chosen, which forces an instructor to address the individual learner's "current needs in each domain" in a way "that leads to the next level of development." No two students in my group have identical disabilities, struggles, or interests, and therefore I will have to incorporate some individualized learning strategies in order to pique the interest of reluctant learners and those with academic challenges.

Assessment of Student Performance and Environment Related to Targeted Student

Learning Objective(s)

I teach one section of physical science with twelve freshmen and one sophomore in the pull-out setting for special education students. One of the students is on the Dynamic Learning Maps assessment, which allows him to learn from an augmented curriculum based upon the extended grade band standards, while the remaining twelve students will all take the ACT as juniors in high school. This course meets four times a week; three meetings are 43 minutes long, while the fourth meeting is a blocked 83-minute session.

Our district has used the Next Generation Science Standards (NGSS) for the past four years, and although my class is taught in the pull-out setting, it is expected that my curriculum is properly aligned with these standards. The NGSS has aligned many of its standards and student outcomes with the ACT college readiness test in order to help individuals gain the skills they need to be successful in college, stating that "sixty percent of U.S. jobs are predicted to require some form of postsecondary education" (NGSS, 2013). The NGSS release also goes on to compare education statistics between the United States and other countries, showing that we are continually lagging, and discusses the implications of education for an individual's future earnings. Taking this into consideration, I have decided to use the ACT College and Career Readiness Standards (CRS) for science (ACT, 2014) to assess my students' abilities and the improvement of their interpretation of data skills.

The students entering my freshman pull-out physical science class have an array of skills, so I decided to test them at the lowest Interpretation of Data (IOD) level, 200, which states the following (ACT, 2014):



201. Select a piece of data from a simple data presentation.

202. Identify basic features of a table, graph, or diagram.

203. Find basic information in text that describes a simple data presentation.

Students who are able to perform these tasks are predicted to have an ACT score of 13-15, which falls on the low end of the spectrum for college and career readiness skills, as the benchmark score is 23 (ACT, 2015). Students at the freshman level should score at least an 18 to be considered on pace with their classmates and to score proficient when they are juniors. Therefore, I expect my students to score at least 80 percent on their assessments in order to be considered proficient. My preliminary assessment of my students' basic interpretation of data skills (Appendix A) showed that they were able to answer 61.8 percent of questions correctly on an eleven-question assignment covering some basic charts and graphs. Two students were determined to be proficient, answering 9 out of 11 questions correctly (81.8%), and 11 students were determined to be non-proficient in these skills, with the lowest performers answering 3 out of 11 questions correctly (27.3%). No students were completely unable to perform the tasks, nor did any achieve a perfect score, justifying my choice of instructional level from the CRS.

Assessment Conclusion and Essential Question to Guide Research

The self-assessment of my instructional focus, the assessment of student performance, and my students' learning environment show that their skills are not commensurate with those of their peers and need to be intensively focused upon. My research will focus on implementing techniques to enhance students' IOD skills to help ensure they are properly prepared for college, allowing for the best opportunities after high school. The essential question I will focus on is: How does intensive instruction of IOD skills affect the performance of freshmen with disabilities and their college readiness skills?



Research Summary

Now more than ever, a student's ability to understand and accurately interpret graphical data is one of the foundations of any science education. This is obvious to any science educator, as "graph interpretation and construction, and the recognition of variables displayed by these graphs are the basic skills that form the prerequisite to any meaningful understanding of concepts in science" (Tairab & Khalaf Al-Naqbi, 2004). Visual data surrounds children. Information about traditional things such as weather patterns and the stock market is displayed visually, but so too are the benefits and drawbacks of health-related activities, suicide and depression, and the costs of social media usage. Thus, the interpretation of data (IOD) skills taught in the science classroom actually transcend the discipline altogether. Information presented to graduates, college students, and workers is increasingly graphical, but have students been taught the skills needed to interpret the data (Tairab & Khalaf Al-Naqbi, 2004)? The ACT test is a graduation requirement in the state of Wisconsin covering five different subjects: English, math, science, reading, and writing. The science portion of this test contains five different sections, four of which, according to the ACT sample test provided on its website, contain questions requiring one to interpret and read graphically represented information (ACT, 2017). With the emphasis placed upon standardized tests in the current climate of education in the United States, and the increasing occurrence of graphical information being presented to youth, it has grown more important for educators to research and implement effective instruction in interpretation of data and graphical information skills.

Tairab and Khalaf Al-Naqbi performed a study in 2004 seeking to better understand how students in 10th grade science classes learned from scientific graphs. They were primarily concerned with students' abilities to interpret graphical information, their ability to represent information graphically, and "factors that could impede the process of interpretation and ability related strategies that could be used to assist students" (p. 127). Many factors were taken into consideration while formulating their study; initially the question was asked whether students viewed graphics as symbolic representations of data at all, or whether they were simply considered pictures, meaning students need to be taught the difference between data and pictures. The study focused on two general tasks: interpreting data and constructing a general understanding of what the data mean. Students performed much better overall on the interpretation of data tasks, where a student was required to read a single piece of information from graphically displayed information, but students lacked the ability to construct an understanding of the relationships shown between the information displayed, or the variables (p. 130). Lacking an understanding of independent and dependent variables prohibited the students from understanding how one variable was responsible for changing the other, the essence of typical graphics such as line graphs. Tairab and Khalaf Al-Naqbi (2004) concluded from their research that students are able to easily take single pieces of information from charts and graphs, but are unable to generate conclusions from graphic information or produce graphs of meaningful value without intensive instruction from their educators, due to lacking the necessary cognitive level.

Another area of difficulty in the United States is ensuring students with disabilities receive the same level of instruction as those without disabilities. Jimenez, Lo, and Saunders (2014) focused their research on scripted lessons and their effects on growth in students with intellectual disabilities and autism. Science education is important because an understanding of science offers students the ability to question their own lives and formulate thinking to make informed decisions (Jimenez et al., 2014). While this research did not focus solely on interpretation of data skills, it did focus on the actual structure of lessons and the benefits of scripted lessons for those with an intellectual disability (ID) and autism. The lessons consisted of intense one-on-one instruction in the pull-out special education setting and included guided notes for students, such as fill-in-the-blank note-taking, and frequent use of a hands-on, explorative approach to science. Their research was quite conclusive, showing marked gains on science quizzes even among the most disabled students in a population. The study shows quite conclusively that scripted lessons, which are available at all levels of science, including the high school level where IOD skills are stressed the most, produce gains in students and therefore can be an effective way to teach the higher-level thinking skills required for student success on secondary standardized tests.

A piece of research conducted at the Indiana University School of Education shed interesting light on the fact that even at the college level, few students enter majors with the skills needed to be successful in the reading and creation of visual data (Maltese, Harsh, & Svetina, 2015). Maltese et al. (2015) noted that three problems arise while interpreting data:

a) understanding the common graphical conventions

b) understanding the conventions of a specific representation

c) understanding the context/content related to the data.

Understanding this should excite science educators, because the first two of the three issues plaguing students in the area of IOD can be relatively easily taught. The researchers in this setting decided to look at college-aged students and split them into two categories: students majoring in STEM (science, technology, engineering, and mathematics), and non-STEM students. Students were also split into their corresponding classes so as to compare the abilities of freshmen through PhD-level individuals. The researchers hypothesized that the division between the two types of students based on major would yield results showing that STEM students would have higher scores than their counterparts on the assessments of IOD skills. Much to the surprise of the researchers, "there was a moderate correlation between the amount of STEM coursework completed by participants and their performance, but overall we were surprised that this relationship was not more strongly positive" (Maltese et al., 2015). The results showed that students who had more education, and in some cases had completed six times as many STEM classes, scored relatively similarly regarding their IOD skills. Maltese et al. concluded that:

Because of this, we believe that if any instructors have the educational goals to help

students advance their data visualization skills or if they will use a specific type of graph

repeatedly in class, then it is likely worth it to devote time in class to teaching students

explicitly how to read and create these types of displays (2015).

The implications of this support the earlier findings of Jimenez et al. and of Tairab and Khalaf Al-Naqbi: all instructors need to make it a point to intensively and frequently instruct students in how to read, create, and interpret graphical representations of data if students are to gain, retain, and master these skills.

All research noted until this point has focused on the importance and implications of students' abilities to interpret data and graphically represented information. In order to best address these needs, however, Josefina Arce, George M. Bodner, and Kelly Hutchinson, in association with Purdue University (2014), sought to explore some of the best practices, particularly the attitudes of teachers toward implementing classroom strategies that address these skills.

Arce et al. (2014) focused on the importance of the attitude of a teacher in a science classroom. Teachers in the study were organized into two categories: those possessing the constructivist mindset and those possessing the traditional mindset. The traditional teacher focuses on concrete facts, directly delivered by the teacher and received by the student. There is little opportunity for personalized learning in this setting, and little ability for these teachers to reach a student with a constructivist learning style, meaning hands-on, exploratory, and inquisitive. The traditional teachers in this study were also shockingly stubborn, and often incorrect when teaching information that was unfamiliar to them. "Traditional teachers were less likely to be aware of errors in their approach to unfamiliar problems. They tended to give what they felt was the correct answer and seldom pondered out loud whether they were right" (p. 92). These teachers are not going to have the ability or the desire to teach students with learning disabilities the skills necessary to interpret and use graphically represented information.

Opposite the traditional teacher, Arce et al. (2014) also described the constructivist individual. This educator's approach to instruction "requires a shift from someone who teaches, to someone who facilitates learning" (p. 86). These individuals create student-centered classrooms and are less concerned with possessing and directly relaying the correct answer to a question than with facilitating students' desire to find it.

Much of this study yielded strictly subjective information obtained through observation and interviews and did not include data based on student learning. Instead, data were gathered on instructors' knowledge of the subjects they were teaching and the content they were expected to deliver to students. The results show overwhelmingly that although neither group of individuals was entirely proficient in the units and skills they were expected to teach, the constructivist teachers were much more open-minded, and less arrogant about their knowledge and abilities as educators (p. 92).

Another difficulty regarding data and graphs is how an instructor assesses student learning. Unless an educator completes assessments regularly, combining formal as well as informal assessments, how can one know if students are meeting learning objectives? In 2006, Maria Araceli Ruiz-Primo and Erin Marie Furtak coupled the importance of inquiry-based science education with frequent informal formative assessment (p. 206). "If students are taught science in the context of inquiry, they will know what they know, how they know it, and why they believe it" (p. 206). This standpoint on instruction agrees with the findings of Jimenez et al., as well as Arce et al., that hands-on inquiry is the most effective way not only to teach science content, but also to mold students into lifelong inquisitive learners.

Ruiz-Primo and Furtak primarily explored the impact of informal formative assessments on student learning, which yielded clear results, ones that have important implications for students with disabilities. Their study compared four teachers and their use of informal assessments, typically held in the context of casual conversation with students regarding the learning objectives of different science-based units. The four teachers held pre-tests of their students' knowledge on a topic, taught a unit, and then held a post-test. What the researchers observed, however, was how many informal conversations were held between each teacher and their students. "The teacher whose students had the highest performance on our tests was the teacher who held the most discussions, asked the most concept-eliciting questions and employed the greatest diversity of strategies that used information she had gained about student understanding" (p. 231). Although informal assessments cannot replace formal assessments and teach all the skills needed by a 21st-century student, this study has shown that they can be not only a wonderful way of gauging student learning, but also a way of facilitating the inquisitiveness needed by students in the science classroom. Also, informal assessments can expose students with disabilities, who may have memory retention issues, to assessment questions more often, and help keep them focused on student learning outcomes.



These vastly different pieces of research together form a web of how best to teach interpretation of data and graphics in the science classroom. It truly is a holistic approach that begins with an educator's mentality, continues through the design of instruction, and ends with how the assessments are completed. Graphing and data analysis skills are not standalone abilities that can be taught at one grade level and be expected to be retained for the remainder of a student's education. These skills need to be properly taught, effectively integrated into an intensive inquiry-based science curriculum, and assessed often and informally to check for student understanding of learning objectives.

Research Implications

My research has given me a very solid foundation for approaching my research question: how do I effectively teach interpretation of data and graphical information skills? Prior to this research, I operated under the assumption that students entered high school with a basic understanding of how to read data and interpret graphs, which was proven false by the Maltese et al. (2015) research showing that even college-level and postgraduate students need reminders when looking at graphs.

Although I did not discover any new strategies for directly teaching skills such as graphing and reading data, the fact that the educational research in the science content area focuses on hands-on instruction and inquiry-based learning shows that I need to have students frequently make and read graphs as part of our daily curriculum. These skills need to be incorporated into daily lesson plans, and especially into informal assessments.

Students need to be introduced to personalized learning and have to be willing to engage in classes in an inquisitive way. This is different for many special education students who have focused on scripted lessons in the past and who have not been allowed, or willing, to be creative in the classroom. Coupled with this, students need hands-on learning and should be expected to collect data, create physical representations of data, and then engage in informal discussions of the data.

Research-based Action Plan

Action Plan Summary Outline

1. Design personalized learning activities that meet expectations of the Next Generation Science Standards and incorporate ACT-leveled science skills.

2. Engage students in informal formative assessments of data pertaining to units and lessons being taught.

3. Have students gather data, and create graphic representations of that data.

4. Conduct formal summative assessments of student learning.

Targeted Student Learning Objective(s)

1. Standardized goal: Next Generation Science Standard Practice Physical Science 3-3

Energy: Design, build, and refine a device that works within given constraints to convert

one form of energy into another form of energy.

2. Targeted learning objective: The following are the specific ACT skill goals I hope to have my students master by the end of their freshman year (note: IOD stands for Interpretation of Data):

IOD 201. Select one piece of data from a simple data presentation (e.g., a simple food web diagram)

IOD 202. Identify basic features of a table, graph, or diagram (e.g., units of measurement)

IOD 203. Find basic information in text that describes a simple data presentation

Task(s) and Essential Proficiency Criteria for Targeted Learning Objective(s)

1. Task: Students are proficient in the 200-level interpretation of data skills.

2. Criteria that Prove Proficiency in Meeting Targeted Learning Objective(s)

a. Students can accurately gather data from a simple experiment.

b. Students can create a visual representation of the data in the form of a graph or chart.

c. Students can accurately answer summative assessment questions about visually represented data.

Method(s) to Assess Progress of Proficiency for Targeted Learning Objectives(s)

1. Students will complete twice-weekly informal assessments and discussions looking at pertinent charts and graphs depicting information referencing curricular goals.

2. Students will be able to answer summative questions on an end-of-unit test based on the data they have gathered and collected.

Post-assessments

Instructional Insights Related to WTS and Targeted Student Learning Objective(s)

The targeted student learning objective was for students with varying disabilities to achieve proficiency in interpretation of data skills while looking at scientific data. The instruction I provided produced consistent gains throughout the process, but through several design errors resulted in final summative assessment data that did not yield the desired results (Artifact A).

My students, over the course of the first semester of their freshman-level physical science class, made significant gains in their abilities to read charts and graphs, as well as to effectively gather data and produce graphic representations. Students were expected to view, make, and interpret graphs throughout every unit this semester, always using data that pertained to the curriculum being taught. At other times, I introduced charts and graphs that did not pertain to the topics at hand, or that were presented in a unique way, to stretch student learning and expose them to additional representations of data that time might not otherwise have allowed us to view.

I began my instruction with an introductory quiz looking at three different types of visual representations of data: a simple line graph, a bar graph, and a pie chart. No student achieved a perfect score, and only two students performed at the proficient level (Artifacts B, C & D). Much as described in the research articles I read, several students had difficulty understanding what the questions were asking and had little to no understanding of what the visual representations portrayed.

I followed this pre-assessment with a formal lesson on creating graphs. This was an intensive lesson for the students, riddled with difficulties stemming from the students' various disabilities. Several students had difficulty understanding the steps to make a scale on the graph, or even the need for an accurate scale. Students also had trouble understanding where to put the dependent and independent variables and how to keep that consistent throughout the semester.

I conducted several informal and formal assessments throughout the semester on the various topics covered in the course. The work culminated in an individualized hands-on project in which students were required to build their own mousetrap cars and gather race data. This was a very difficult process for many of the students due to a lack of technical skills, fine motor abilities, and patience in general. Because students worked at different levels, it was quite difficult to ensure that everyone accurately followed the directions and properly built their race car. Come race day, one student's mousetrap car was incomplete due to absences, while others had incorrectly assembled their cars, gluing axles stiff or breaking other essential pieces. Despite all the difficulties, ten of the thirteen students were able to record data on a collaborative spreadsheet and then produce a graph of their data using Google Sheets (Artifacts E and F).
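The calculation students performed in Google Sheets, dividing the track distance by each car's race time to get a speed, can be sketched in a few lines. The car labels, track length, and times below are hypothetical examples, not my students' actual data.

```python
# Hypothetical mousetrap-car race data: (car label, time in seconds)
# to cover a 5.0-meter track. These values are illustrative only.
TRACK_LENGTH_M = 5.0

race_times = [
    ("Car A", 4.0),
    ("Car B", 2.5),
    ("Car C", 10.0),
]

# Speed = distance / time, the same formula entered in the spreadsheet.
speeds = {car: TRACK_LENGTH_M / t for car, t in race_times}

for car, speed in speeds.items():
    print(f"{car}: {speed:.2f} m/s")
```

Once each speed is computed in its own cell, graphing the results in Sheets is a matter of selecting the column and inserting a chart.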

There are several reasons I feel students did not achieve as I had hoped on the final assessment of this research process. First, students had completed all assessments up until this point using paper-and-pencil tests. The final assessment was completed online using the Google Forms assessment tool, which was new to these individuals. Students were expected to switch between browser tabs, moving between the quiz and their graphs, rather than having a tangible test in front of them. Also, the data used in the final assessment was more difficult than what students had seen up until this point, requiring them to read both charts and graphs.

Comparison of Student Performance Related to Targeted Student Learning Objectives

The targeted objective for my students was for all individuals to achieve proficiency in the ACT skill of interpretation of data (IOD). Students were considered proficient if they scored 80% or higher on assessments, the baseline our district uses before introducing more advanced ACT IOD skills. The goal is for all students to score a 23 on the ACT, which requires them to master 400-level IOD skills. Our district focuses on 200-level skills with freshmen, increasing as students advance in grade level or meet minimum proficiency requirements. Referring to Artifact A, only 2 of 13 students were proficient on the baseline assessment, and only 2 of 13 were proficient on the final summative assessment.
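The proficiency tally in Artifact A amounts to counting scores at or above the district's 80% cutoff. A minimal sketch, using made-up scores rather than my students' actual results:

```python
# Count how many students meet the district's 80% proficiency cutoff.
# The score list is hypothetical, not the actual Artifact A data.
PROFICIENCY_CUTOFF = 0.80

scores = [0.55, 0.82, 0.64, 0.91, 0.73, 0.45, 0.80]

proficient = sum(1 for s in scores if s >= PROFICIENCY_CUTOFF)
print(f"{proficient}/{len(scores)} students proficient")
```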

However, looking at the three mid-review assessments, 6 of 12, 5 of 12, and finally 9 of 12 students achieved proficiency. This deviation in achievement could be related to the difficulty of the topic or to the format of the assessment. There was no standard set of assessments used for this research, which in hindsight was a mistake. The assessment questions and the graphs used on tests were all created at a level I felt was consistent, but there could of course have been variations in the difficulty of the data. The methods by which students completed the assessments also changed over time. Toward the beginning of the school year, I read the assessments to my students to accommodate some of their disabilities. Over time, I gradually released that responsibility to the students and instead answered questions only as they arose.

Comparison of Learning Environment While Learning Targeted Objectives

The implementation of intensive IOD skills instruction in my freshman physical science class helped build confidence in my students with disabilities. Students have become more comfortable with charts and data, and their general dislike of these activities has decreased over time. It has never been a secret that these types of skills are difficult for students with disabilities to grasp, and although the final assessment data does not necessarily support my view of success, my students made many objective gains. I feel my ability to integrate IOD skills into lessons has improved, which will carry us through the rest of the school year and translate into the other science classes I teach as well.

Reflection of Entire Learning Process

How does intensive instruction of IOD skills affect the performance of freshmen with disabilities and their college readiness skills? I feel my research has shown that intensive instruction of IOD skills, coupled with a hands-on approach to the science curriculum as well as data collection and graph creation, is extremely beneficial for students with disabilities. Although my students may not have made the desired gains, any progress for students with disabilities can be viewed as good progress. I feel that if I continue this approach, I will give my students a much better chance of being successful on the science portion of the ACT and other standardized tests, as well as of obtaining the IOD skills needed for daily living.

What Worked and Why

1. Several students made significant gains. All research reviewed throughout this process supports the conclusion that the only way to increase these abilities is to practice them from all facets, and students have become more comfortable with gathering, creating, and reading data.

2. Students seemed much more comfortable with non-science examples and scored better on these types of assessments. I could use such examples to reinforce skills they already have and slowly introduce new ones.

What Did Not Work and Why

1. The students with the most severe disabilities made either little or inconsistent growth throughout the process. Students with intellectual disabilities or on the autism spectrum typically have gaps in their knowledge and cognitive abilities, which was apparent throughout this experiment.

2. Throughout the process, students consistently lacked the ability to make coherent graphs by hand. Students were able to collaborate as a group at the end and use Google Sheets to graph their final-project mousetrap race data, but struggled at nearly every other opportunity. I need to integrate simpler data collection and display tasks to build new skills and to reinforce the ones they already have.

My Next Steps

1. All research reviewed proposes that the key to fluency with IOD skills is practice. I need to continue finding interesting and fun ways to integrate data collection and graphing skills.

2. I need to find a way to integrate IOD skills into every unit taught throughout the year.

3. For students who have reached proficiency at the 200 level of IOD skills, I need to start creating more specifically differentiated instruction and assessments.



References

ACT. Science practice test questions. Retrieved from http://www.act.org/content/act/en/products-and-services/the-act/test-preparation/science-practice-test-questions.html?page=0&chapter=0

Arce, J., Bodner, G. M., & Hutchinson, K. (2014). A study of the impact of inquiry-based professional development experiences on the beliefs of intermediate science teachers about best practices for classroom teaching. International Journal of Education in Mathematics, Science and Technology, 2(2).

Jimenez, B. A., Lo, Y., & Saunders, A. C. (2014). The additive effects of scripted lessons plus guided notes on science quiz scores of students with intellectual disabilities and autism. The Journal of Special Education, 47(4).

Maltese, A. V., Harsh, J. A., & Svetina, D. (2015). Data visualization literacy: Investigating data interpretation along the novice-expert continuum. Journal of College Science Teaching, 45(1).

NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press.

Ruiz-Primo, M. A., & Furtak, E. M. (2006). Informal formative assessment and scientific inquiry: Exploring teachers' practices and student learning. Educational Assessment, 11(3-4).

Tairab, H. H., & Khalaf Al-Naqbi, A. K. (2004). How do secondary science students interpret and construct science graphs? Journal of Biological Education, 38(3).



Artifact A: Experiment Results

The following artifact shows student performance on all data gathered throughout the experiment. Assessment data is averaged at the bottom of the chart, above the counts of proficient and non-proficient students for each assessment. As the chart shows, scores generally increased across the first four assessments. The final assessment data does not show an overall increase; however, I would argue that this reflects mistakes by the research facilitator rather than an accurate picture of student abilities and growth.

Artifact B: Baseline High- Level Example

The high-level example shows a student who answered all questions, with only two incorrect. The test was completed quickly and confidently, with no assistance and no questions read to the student.

Artifact C: Baseline Mid-Level Example

The mid-level example was completed with four incomplete answers, all on the same graph, which could indicate a single gap in knowledge rather than an overall inability to interpret graphs.

Artifact D: Baseline Low-Level Example

This low-level example from the baseline data was completed by a student with an almost complete lack of understanding of charts and graphs. The student scored only three correct answers out of eleven, even after spending more time on the assessment than any other student.

Artifact E

A collaborative document showing student-collected data and a line graph comparing the speeds of students' mousetrap cars.

Artifact F

A collaborative document showing student-collected data and a line graph comparing the speeds of students' mousetrap cars.
