Tracey Anderson
October 15, 2016
Demonstrated Need
Benjamin Banneker's most recent College and Career Readiness Performance
Indicator (CCRPI) score, published in 2015, is 56.3; Fulton County School District's
average CCRPI score is 71.8 (Georgia Department of Education, 2016). Students at
Benjamin Banneker who are economically disadvantaged or who have disabilities
have even fewer opportunities according to their CCRPI results: 31.23 and 10.57,
respectively (Georgia Department of Education, 2016). Benjamin Banneker's
students also have the lowest average SAT scores (less than 1,200) in the Fulton
County School District (Fulton County Schools, 2016b). And according to the most
recent annual Title 1 meeting (2015), fewer than 60% of the school's students
graduate on time. Benjamin Banneker is clearly struggling to help students succeed
academically. Because quantitative data is being used more frequently to determine
funding, and even organizational control in the case of the November 2016 Georgia
ballot option for the Opportunity School District, Benjamin Banneker High School
will have to help students perform better academically.
Benjamin Banneker High School's School Plan identifies a need for "identifying
and prioritizing opportunity areas to improve student achievement results" and "a
plan to achieve specific student performance gains and meet student needs" (Fulton
County Schools, 2016a, p. 3). And in its Strategic Plan for 2017, Fulton County
Schools (2016b) recognizes the school district's previous limitations in instructional
delivery methods, instructional offerings, and instructional models, as well as the
need for more integration of technology. The school district (2016b) also has set a
100% career readiness goal for the end of this school year. And in its most recent
annual Title 1 meeting information (2015), the school identified a goal of improving
US History EOC/Milestone performance from 40% to 50% of students receiving a
meeting-or-exceeding score.
associated with more frequent and higher-order use of the internet in secondary-
grade social studies classes. Hofer and Swan (2008) as well as Beeson, Journell,
and Ayers (2014) explored the effects of TPACK. Hofer and Swan (2008) investigated
the extent to which teachers' technological pedagogical content knowledge (TPACK)
informs their instructional planning; Beeson, Journell, and Ayers (2014) wanted to
know how a teacher's TPACK affects the complexity and authenticity of civics
instruction. Journell (2008) also investigated the role the teacher plays in
facilitating asynchronous discussions. Ravitz (2010) analyzed how teachers using
project-based learning differ according to teacher culture, student culture, and
instructional reforms.
Other research focused on process. DiCamillo (2015) examined what happens
during expedition learning. And some research attempted to identify the processes
skilled readers engage in when locating and reading information on the internet, as
well as what informs the choices such readers make when engaged in this type of
digital literacy (Coiro & Dobler, 2007). Some research concentrated on what students
think about a piece of technology or practice. Thomas and Munoz (2016) investigated
students' perceptions of cell phone use for instruction and in the classroom. Snyder,
Paska, and Besozzi (2014) wanted to know what students think about screencasting;
DiCamillo (2015) wanted to see how students view expedition learning. Some studies
focused on the support teachers need. DiCamillo (2015) identified challenges
teachers face in facilitating expedition learning. La Paz, Malkus, Monte-Sano, and
Montanaro (2011) examined the relationship between the degree to which teachers
participated in professional development and student learning over the course of one
academic year, while Sugar and van Tryon (2014) wanted to know what resources a
technology coach could provide for teachers facilitating virtual learning.
Population Studied
Vetsanios, 2007), Indiana (Van Fossan & Waterson, 2008), Northern California (La
Paz, Malkus, Monte-Sano, & Montanaro, 2011), New York (DiCamillo, 2015), North
Carolina (Manfra & Lee, 2012; Beeson, Journell, & Ayers, 2014), Virginia (Journell,
2008), and Texas (Strickland, 2005).
Methodologies
In investigating the use and effect of inquiry strategies and technology, many
researchers relied on interview data from students and teachers. Shin (2006),
Doering and Vetsanios (2007), Manfra and Lee (2011), Coiro and Dobler (2007), and
Schul (2012) all interviewed students. Thomas and Munoz (2016) and Snyder, Paska,
and Besozzi (2014) surveyed students. Shin (2006), Schul (2012), Strickland and
Nazzal (2005), Hofer and Swan (2008), Coiro and Dobler (2007), and Journell (2008)
interviewed teachers. Van Fossan and Waterson (2008) gave teachers an online
questionnaire, and Sugar and van Tryon (2014) surveyed teachers. Researchers also
examined students' work. Shin (2006), Manfra and Lee (2011), Strickland and
Nazzal (2005), and Schul (2012) analyzed students' discussion posts, sketch maps,
presentations, and documentaries. And Shin (2006), Coiro and Dobler (2007), Schul
(2012), and Strickland and Nazzal (2005) conducted multiple observations of
teachers and students interacting with each other and with technology in the
classroom. Researchers trying to determine what impact technology, or a particular
use of technology, has on learning used students' pre- and post-assessments
(Strickland, 2005; La Paz, Malkus, Monte-Sano, & Montanaro, 2011; DiCamillo,
2015).
Researchers applied different strategies to analyzing the data. For qualitative
data derived from observations, students' work, and interview or survey responses,
many created some type of category coding (Sugar & van Tryon, 2014; Beeson,
Journell, & Ayers, 2014; Journell, 2008; Doering & Vetsanios, 2007; Shin, 2006;
Coiro & Dobler, 2007; Hofer & Swan, 2008; La Paz, Malkus, Monte-Sano, &
Montanaro, 2011; Snyder, Paska, & Besozzi, 2014; Ravitz, 2010). Researchers
focusing on improved student learning analyzed the quantitative data derived from
assessments (La Paz, Malkus, Monte-Sano, & Montanaro, 2011; Coiro & Dobler,
2007; DiCamillo, 2015; Strickland & Nazzal, 2005). Researchers also compared
students' and teachers' actions during observations and/or responses to interviews
or surveys (Coiro & Dobler, 2007; Strickland, 2005; Hofer & Swan, 2008). Because
Van Fossan and Waterson (2008) wanted to see how much internet use had
changed, they compared teachers' online questionnaire responses to those gathered
from an earlier study.
Lens/Theory
Many of the researchers examined teaching practices and student learning
through a constructivist lens, in which the instructor works more as a facilitator as
students engage in acts of discovery that draw on their prior knowledge (Doering &
Vetsanios, 2007). Shin (2006), Coiro and Dobler (2007), Manfra and Lee (2011), Van
Fossan and Waterson (2008), Doering and Vetsanios (2007), Beeson, Journell, and
Ayers (2014), and Snyder, Paska, and Besozzi (2014) all applied this approach. Van
Fossan and Waterson (2008) also explained how they were informed by the idea that
authorship can push students "beyond acquiring knowledge to learning about the
nature of facts, evidence, and interpretation" (p. 132). And Manfra and Lee (2011)
expanded on this attention to student-centered learning by describing a real-world
learning environment. Thomas and Munoz (2016) and Snyder, Paska, and Besozzi
(2014) implicitly addressed student-centered learning by asking for students'
perceptions about activities and practices in their learning. Schul (2012) as well as
Hofer and Swan (2008) applied TPACK (technological pedagogical content
knowledge), which is appropriate for any study of a teacher's classroom practice and
planning with technology since this approach includes all of a teacher's knowledge
in an inquiry-
environments with faster internet connections. Beeson, Journell, and Ayers (2014)
found that while two award-winning civics teachers claimed to promote the same
type of authentic, investigatory, and critical-thinking environment with technology,
one teacher used technology more as a digital textbook and notebook.
Because students are the ones using technologies, it makes sense that
researchers find out what students think. Despite the promotion of cell phones for
instructional use, 70% of the 628 high school students in Thomas and Munoz's
(2016) study indicated that they weren't convinced cell phone use actually helped
them learn more or better. And 90% of these students reported having used cell
phones for some type of learning. Although the 9th grade student participants in
Snyder, Paska, and Besozzi's (2014) study didn't speak to the level of learning
impacted by technology use, they did indicate that they support using screencasting
as a technological instructional tool. Although the student participants engaging in
a webquest in Strickland's (2005) study didn't necessarily perform as well as their
poster-making peers, they did report positive experiences with the webquest. And
the middle schoolers using Google Earth in Doering and Vetsanios's (2007) study
reported having "fun" (p. 249).
Data Review
Quantitative Data
Benjamin Banneker's most recent College and Career Readiness Performance
Indicator (CCRPI) score, published in 2015, is 56.3; Fulton County School District's
average CCRPI score is 71.8 (Georgia Department of Education, 2016). Students at
Benjamin Banneker who are economically disadvantaged or who have disabilities
have even fewer opportunities according to their CCRPI results: 31.23 and 10.57,
respectively (Georgia Department of Education, 2016). Benjamin Banneker's
students also have the lowest average SAT scores (less than 1,200) in the Fulton
County School District (Fulton County Schools, 2016b). And according to the most
recent annual Title 1 meeting (2015), fewer than 60% of the school's students
graduate on time. Benjamin Banneker is clearly struggling to help students succeed
academically. Because quantitative data is being used more frequently to determine
funding, and even organizational control in the case of the November 2016 Georgia
ballot option for the Opportunity School District, Benjamin Banneker High School
will have to help students perform better academically.
Qualitative Data
Benjamin Banneker High School's School Plan identifies a need for "identifying
and prioritizing opportunity areas to improve student achievement results" and "a
plan to achieve specific student performance gains and meet student needs" (Fulton
County Schools, 2016a, p. 3). And in its Strategic Plan for 2017, Fulton County
Schools (2016b) recognizes the school district's previous limitations in instructional
delivery methods, instructional offerings, and instructional models, as well as the
need for more integration of technology. The school district (2016b) also has set a
100% career readiness goal for the end of this school year. And in its most recent
annual Title 1 meeting information (2015), the school identified a goal of improving
US History EOC/Milestone performance from 40% to 50% of students receiving a
meeting-or-exceeding score. Because the school district and the school have
identified specific areas for improvement, any professional development program the
school engages in will have to actively and specifically integrate these goals into its
instruction and practice.
Project Goals and Objectives
Georgia world history social studies standards ask students to be able to
"Conduct short as well as more sustained research projects to answer a question
(including a self-generated question) or solve a problem; narrow or broaden the
inquiry when appropriate; synthesize multiple sources on the subject, demonstrating
understanding of the subject under investigation" (Georgia Department of Education,
2016).
Day 4
8am-9am: Group discussion of the visits: advantages and disadvantages;
possibilities and challenges for doing so with students.
9am-10am: Participant groups post and respond to other groups' inquiry-based
learning ideas.
10am-12pm: Participant groups work with graduate students to begin planning a
collaborative inquiry-based learning unit.
1pm-4pm: Participant groups begin creating the activities and assessments for their
inquiry-based learning unit.
Day 5
8am-9am: Breakfast discussion in which participants share what they've learned,
what they've thought about, and concerns they have about using inquiry-based
learning.
9am-12pm: Groups finish creating the activities and assessments for their inquiry-
based learning unit and post the unit activities and assessments on their Weebly
sites.
1pm-3pm: Participants give feedback on each group's Weebly site and inquiry-based
learning unit.
3pm-4pm: Participants complete course evaluations.
School-year Schedule
Day in August
Participants will observe inquiry-based learning classrooms in metro Atlanta.
Day in September
Participants will be observed facilitating inquiry-based learning.
Day in September
Participants will meet with the observer and fellow participants to evaluate and, if
necessary, modify inquiry-based learning activities.