
WebQuests and Achievement

Lidia Argentina Paun
Graduate MAT Student, UNC Charlotte
ENGL 6274, Spring 2013

Abstract

The questions driving this inquiry are: (1) Are WebQuests an effective way of integrating technology into instruction? (2) Will building background knowledge through WebQuests increase interest in a reading selection? (3) Can WebQuests be used to positively aid comprehension of literature? The purpose of these questions is to explore the integration and effects of technology used through a student-directed learning activity. Through the implementation of a WebQuest in a 9th-grade English Language Arts classroom, students explore background knowledge relating to a literary text as well as overarching literary concepts.

Keywords: adolescent learners, WebQuests, academic performance, secondary education

Introduction

In today's high-accountability schools, students are eager to participate in learning through technology. Using media and interacting through it is so much a part of modern life that some of us can barely function when disconnected. Many students along the K-12 spectrum and beyond are well-versed in using technology, and in some cases the students' knowledge surpasses their teachers'. For struggling readers, engagement in meaningful ways through technology is a good place to begin addressing their difficulties within a student-directed activity. Some schools have been swept up by the Bring Your Own Technology (BYOT) trend, in which students may use their personal devices for learning purposes. While this surge in classroom technology is taking off, some activities can be carried out with greater efficiency and interest than others. The WebQuest has long been a popular instructional strategy, providing teachers with an engaging tool. One way teachers can address the needs of low-performing students is by taking ready-made WebQuests and tailoring them to engage students of all kinds, from high achievers to struggling readers, as well as English Language Learner (ELL) students.

The purpose of this study is to implement a WebQuest and analyze its effect on student learning in an actual classroom setting. The goal is to learn whether students' reading comprehension increases significantly when the tool is used, resulting in a better quiz score. This research aims to understand what impact WebQuests have on the building of background knowledge and the exploration and understanding of literature. It will measure the relationship between building background knowledge, learning overarching concepts such as symbolism, and applying those concepts on an assessment. The questions below have been used to focus this inquiry:

Research Questions

Can WebQuests be used to positively aid comprehension of literature? Will building background knowledge through WebQuests increase interest in a reading selection? Are WebQuests an effective way of integrating technology into instruction? Will the use of WebQuests in the classroom result in better retention of what is read, both in the short term and the long term?

Defining Terms

WebQuest

The WebQuest is an inquiry-based technology activity designed by Bernie Dodge and Tom March at San Diego State University in 1995 (Lacina, 2007). From a single page, many types of reading material, multimedia, and links to other pages can be used to send a student on a quest for finding and building information. The material is provided through the internet, sending the student on a quest for information through interaction with other internet sources (March, 2003). A WebQuest can be designed to include different activities and scaffolding for students while integrating technology. It is a student-centered activity in which students work through the information on their own, proceeding at their own pace while building interest in any follow-up material. Because WebQuest design is based on a constructivist philosophy, with cooperative learning and scaffolding of instruction as two of its essential components, the WebQuest has become a favorite with teachers. Teachers who design or use WebQuests tend to emphasize higher-level Bloom's Taxonomy tasks, focusing on using information at the levels of analysis, synthesis, and evaluation (Lacina, 2007).

Constructivist Pedagogy

Constructivist pedagogy focuses on students constructing knowledge. From a social constructivist (and constructionist) perspective, this construction occurs primarily through social interactions (Berger & Luckmann, 1966; Vygotsky, 1978; Wertsch, 1986) (Rosen & Nelson, 2008).

Struggling Adolescent Reader

For a struggling reader, reading at grade level is a significant challenge. A reader may fall short of grade-level proficiency for a number of reasons; common areas of struggle include vocabulary, comprehension, and difficulty sounding out words. Comprehension of a text suffers when these skills have not been developed to grade-level performance, and the struggling reader may need additional support in one or more key areas of good reading. Characteristics of a struggling reader include reluctance to participate in reading, a negative attitude toward reading, unwillingness to read, low confidence, and poor accuracy and fluency (Taylor, 2012). Because of this multitude of categories, a struggling reader may grapple with any one or more of these facets. Different struggling readers have specific, personal reading needs that must be met before they can become successful at comprehending text. For struggling readers, constructing meaning from text is a difficult task, and without targeted practice in the areas where they need scaffolding, they may not improve.

Technological Literacy

In today's globalized world, being literate takes on new nuances. It is no longer enough to be literate in reading and writing; literacy stretches to other fields, with an increasing focus on technology.

As teachers, we must incorporate the teaching of 21st century skills into our repertoire in order to ensure that our students apply these skills in their learning environment. One of the most important 21st century skills for a teacher to possess is technological literacy. The ability to understand and use basic technological functions opens new options for a multitude of tasks, including finding relevant information, judging its validity, and applying that knowledge. Most youths are perceived to be technologically inclined, as many of them are digital natives. However, several studies point out that many students, especially low-income students, may be low users of technology (Facer & Furlong, 2001). These youths may interact with cyber technology only in the most basic ways, both in the classroom and at home (Lengel & Lengel, 2006; Ware & Warschauer, 2005). Therefore, if we as educators expect to prepare students adequately for present and future demands, technological literacy should be taught explicitly and integrated into content area learning in meaningful ways (Gee, 2000; Lankshear & Knobel, 2004; Lengel & Lengel, 2006) (Sox & Rubinstein-Avila, 2009).

Review of Literature

Designing WebQuests

The design of a WebQuest is particularly important, as it is the first step to piquing the target student audience's interest. WebQuests are a way to incorporate inquiry-based learning into content area instruction (Lacina, 2007). Lacina did not propose a study, but outlined the history of the creation and implementation of the WebQuest as an inquiry-based technology activity. A walk-through of WebQuest features prompts the educator audience on how to begin designing one in steps (introduction, task, resources, process, and evaluation), all based on the key components of organization, accessibility, and appropriateness of resources and visuals.
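To make these design steps concrete, the sketch below lays out the five WebQuest sections as a simple Python structure for a hypothetical quest tied to "The Scarlet Ibis." The section names follow the model described above, but every title, prompt, and link is an invented placeholder rather than the WebQuest actually used in this study.

```python
# A minimal sketch of the five-part WebQuest structure described above.
# All titles, prompts, and links are hypothetical placeholders.

webquest = {
    "introduction": "Explore the background and symbols behind 'The Scarlet Ibis'.",
    "task": "Build a short background guide on the scarlet ibis, the story's setting, and symbolism.",
    "resources": [
        "https://example.com/scarlet-ibis-bird-facts",   # placeholder link
        "https://example.com/symbolism-in-literature",   # placeholder link
    ],
    "process": [
        "Read each resource and take notes on the guiding questions.",
        "Define symbolism in your own words and list two examples.",
        "Predict what the scarlet ibis might symbolize in the story.",
    ],
    "evaluation": "Completed worksheet scored on the quality of each section.",
}

def print_outline(quest: dict) -> None:
    """Print the WebQuest outline in the order a student would work through it."""
    for section in ("introduction", "task", "resources", "process", "evaluation"):
        print(section.upper())
        content = quest[section]
        if isinstance(content, list):
            for item in content:
                print(f"  - {item}")
        else:
            print(f"  {content}")

if __name__ == "__main__":
    print_outline(webquest)
```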

Not all WebQuests are created equal. Deciding which WebQuest to choose is the first step to understanding its effectiveness and its relationship to students' responses to it; even altering an existing one is a purposeful way of adding improved differentiation. Sox and Rubinstein-Avila's (2009) rubric for WebQuest effectiveness for ELLs uses a 4-point Likert scale from ineffective (1) to very effective (4) to rate WebQuests on their potential to integrate technology, content knowledge, and comprehensible input. These facets are rated on how well they provide certain types of support for diverse learners; the chart specifically targets linguistic features, multimedia features, and organizational features, areas the authors single out for the purpose of differentiating for ELL students. A rating of 4 on the scale reflects features that should be part of any good, user-friendly WebQuest and represents strong integration of multimedia and organization. In their review of eight WebQuests, the WebQuests did not make consistent use of the elements presented for rating, leading the authors to conclude that many are not ELL-friendly. The authors provide ideas and strategies for adapting and customizing existing WebQuests to better support student needs.
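To illustrate how ratings on such a rubric might be tallied, the sketch below scores WebQuests on a 4-point scale across the three feature categories named above. The scoring function, threshold, and sample ratings are hypothetical illustrations, not Sox and Rubinstein-Avila's actual instrument.

```python
# Hypothetical tally of 4-point ratings (1 = ineffective, 4 = very effective)
# across the three feature categories described in the rubric discussion above.
# The threshold and sample data are illustrative only.

from statistics import mean

CATEGORIES = ("linguistic", "multimedia", "organizational")

def rate_webquest(ratings: dict, threshold: float = 3.0) -> tuple:
    """Return the mean rating and whether it meets an assumed ELL-friendliness threshold."""
    for category in CATEGORIES:
        if not 1 <= ratings[category] <= 4:
            raise ValueError(f"{category} rating must be between 1 and 4")
    average = mean(ratings[c] for c in CATEGORIES)
    return average, average >= threshold

# Example: two invented WebQuests rated against the three categories.
sample_ratings = {
    "Quest A": {"linguistic": 2, "multimedia": 4, "organizational": 3},
    "Quest B": {"linguistic": 4, "multimedia": 3, "organizational": 4},
}

for name, ratings in sample_ratings.items():
    avg, friendly = rate_webquest(ratings)
    print(f"{name}: mean rating {avg:.1f}, ELL-friendly: {friendly}")
```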

Beyond rubric-based selection, Thomas and Wexler (2007) suggest that the application of ten cohesive segments can aid successful performance: asking for help, supporting word recognition needs, selecting appropriate materials, developing reading fluency, fostering vocabulary acquisition, building comprehension, monitoring progress, differentiating instruction, grouping students effectively, and being flexible about reading materials.

Ikpeze and Boyd (2007) presented a ten-week study of WebQuests used to facilitate literacy and critical thinking. The study analyzed six fifth-grade students' progress through a WebQuest that included science, English Language Arts, and social science content. The WebQuests were carefully woven into instruction and supplemented with mini-lessons and activities that synthesized and connected the work with focused development of skills and critical thinking. Emphasis was successfully placed on gathering, summarizing, analyzing, synthesizing, and evaluating (Ikpeze & Boyd, 2007).

Methodology

Timeline

April 1-7: Research appropriate literature. Begin writing the review of literature and the strategy of implementation.
April 8-14: Gather resources for the reading history survey, make necessary adjustments, administer it to students, and begin writing lesson plans.
April 15-21: Find, adjust, or create a WebQuest based on "The Scarlet Ibis," introduce unit background information, and reserve the computer lab for two sessions.
April 22-28: Begin reading "The Scarlet Ibis," use the WebQuest for reading and other activities during lab time, and administer reading comprehension quizzes. Begin data analysis.
April 26-30: Collect, chart, and analyze quiz grades from the experimental and control samples. Revise and finalize the research paper.
April 30: Final paper due.

Research Site and Participants

At my urban school of more than 2,000 students, I chose two English I standard classes that I taught as my control and experimental groups. The context was not an unusual one for large-attendance schools with little space: I was hired in the middle of the semester, and my classes were formed by pulling students out of their prior classrooms. The control class contained 21 students and the experimental class contained 20 students, both with a mix of male and female students.

The control class contained two students identified for English as a Second Language (ESL) services, while the experimental class contained five. The number of students on Personal Education Plans (PEPs) in the experimental class was also greater. Over the course of the instructional year, the experimental class had consistently scored, on average, five to eight points lower than the control class on tests and quizzes. Neither group's classroom was equipped with computers for student use; the experimental group used one of the school's several computer labs, which was reserved as needed for the activity.

Data Sources

Data was acquired on three fronts: 1) a WebQuest assessment worksheet, 2) quiz scores on prior selections, and 3) quiz scores on the current selection. The reading comprehension quiz assessed students on careful reading of the passage, overarching concepts (such as symbolism), and vocabulary in context. Analyzing the prior quiz scores provided a starting point for determining to what extent one class was outperforming the other, and supplemental reading activities and lessons linked these data sources together. A WebQuest was used to provide background knowledge for the story "The Scarlet Ibis" and to help students explore and apply the concept of symbolism. The WebQuest was supplemented by mini-lessons on symbolism as well as reading scaffolding throughout the short story, and students completed a worksheet based on the activities stated in the WebQuest. The background knowledge built through these activities provided a basis for the reading and instruction that followed. The mini-lessons and supplementary materials (reading support questions, reading checks, vocabulary in context, application of concepts), along with the WebQuest, allowed me to measure through a quiz what the WebQuest's impact on the students' comprehension of the story was. The experimental classroom received the WebQuest, while the control classroom did not. The expectation was that the final quiz scores on passage comprehension would indicate whether the WebQuest was an effective way of differentiating and providing reading and information support for students.
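As one way of picturing how these three data fronts could be organized for analysis, the sketch below stores a worksheet score, a prior-selection quiz average, and a current-selection quiz score per student and computes class means. All field names and records are invented examples, not the actual class data.

```python
# Hypothetical organization of the three data sources described above:
# WebQuest worksheet score, prior-selection quiz average, current-selection quiz score.
# The records shown are invented examples, not actual student data.

from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class StudentRecord:
    group: str                  # "control" or "experimental"
    worksheet: Optional[float]  # WebQuest worksheet score (None for the control group)
    prior_avg: float            # average quiz score on prior selections
    current: float              # quiz score on the current selection

records = [
    StudentRecord("experimental", 85.0, 70.0, 78.0),
    StudentRecord("experimental", 90.0, 74.0, 80.0),
    StudentRecord("control", None, 78.0, 75.0),
    StudentRecord("control", None, 80.0, 77.0),
]

def class_means(group: str) -> tuple:
    """Return (prior average, current average) for one group."""
    members = [r for r in records if r.group == group]
    return mean(r.prior_avg for r in members), mean(r.current for r in members)

for group in ("control", "experimental"):
    prior, current = class_means(group)
    print(f"{group}: prior avg {prior:.1f}, current quiz avg {current:.1f}")
```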

Findings

In this trial, the control class's quiz scores averaged 75.47%, and the experimental class's averaged 76.90%. While in direct comparison the 1.43-point difference may not seem significant, the experimental class had typically scored an average of eight points, nearly a full letter grade, lower than the control class on tests and quizzes. The real improvement appears when this result is compared to other quizzes throughout the year. The fact that the experimental class was able to close, and slightly reverse, its usual gap with the control class suggests there may be some correlation between WebQuests and improved comprehension and scores. I believe the improvement reflects positively on the role that WebQuests can play in the classroom.

Despite the positive results, the overall findings were inconclusive and will require further testing. The sample size, while comparable between groups, is not representative of the typical population and is too small to make a definitive statement on the benefits of WebQuests. Additionally, other factors that may have skewed the results include absenteeism and classroom culture. Even though the experimental class scored better by 1.43 points and showed marked improvement relative to its usual standing against the control class, one quiz does not provide a significant measurement. While the results do support the hypothesis that WebQuests can aid reading comprehension, I knew going into the experiment that such a small sample size might be problematic.
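The reasoning behind reading the 1.43-point difference against the historical gap can be made explicit with a small calculation, sketched below using only the averages reported above. Since individual scores are not reproduced here and the eight-point gap is a rough figure, the adjusted estimate is illustrative rather than a formal statistical test.

```python
# A minimal sketch of the comparison reasoning above, using only the reported averages.
# The "typical gap" is the rough figure cited in this paper, so the adjusted change
# is an informal estimate, not a significance test.

control_avg = 75.47        # control class quiz average (%)
experimental_avg = 76.90   # experimental class quiz average (%)
typical_gap = 8.0          # points the experimental class usually trails by

observed_diff = experimental_avg - control_avg       # +1.43 points on this quiz
gap_adjusted_change = observed_diff + typical_gap    # ~9.43 points relative to the usual pattern

print(f"Observed difference on this quiz: {observed_diff:+.2f} points")
print(f"Change relative to the typical gap: {gap_adjusted_change:+.2f} points")
```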

It will require additional testing with more classes and multiple quizzes to see if the WebQuest is indeed effective. This study could be repeated in the future with a larger sample size, which would allow a more thorough analysis of the test results, and further testing would be required to reach a more balanced result. However, I find the initial result promising.

Implications for Future Research

Future research on this topic could include an in-depth analysis of how WebQuests aid different student populations, such as ESL-served students, students on Personal Education Plans (PEPs), students on Individual Education Plans (IEPs), or students on 504 Plans. A poor reading history may also indicate difficulties in reading comprehension that WebQuests could improve through scaffolding activities. It would be interesting to find out how struggling readers' or ESL students' reading histories might affect their current reading comprehension and understanding of literature. Would a scaffolded and tailored WebQuest positively aid their reading comprehension skills? Additionally, repeating this experiment with more classes in the future will provide a more robust sample from which I can gather more data to help support or disprove my initial hypothesis. One way I hope to implement this is to create a full semester's worth of WebQuest activities and monitor how students' reading comprehension and test scores are affected. I believe that the correlation I found in this initial study will hold true, and that by engaging students on a technological level I can improve interaction and learning accountability in the classroom.

References

Ikpeze, C. H., & Boyd, F. B. (2007). Web-based inquiry learning: Facilitating thoughtful literacy with WebQuests. Reading Teacher, 60(7), 644-654.

Lacina, J. (2007). Inquiry-based learning and technology: Designing and exploring WebQuests. Childhood Education, 83(4), 251.

Malin, G. (2010). Is it still considered reading? Using digital video storytelling to engage adolescent readers. Clearing House, 83(4), 121-125. doi:10.1080/00098651003774802

March, T. (2003). The learning power of WebQuests. Educational Leadership, 61(4), 42.

Rosen, D., & Nelson, C. (2008). Web 2.0: A new generation of learners and education. Computers in the Schools, 25(3/4), 211-225. doi:10.1080/07380560802370997

Sox, A., & Rubinstein-Avila, E. (2009). WebQuests for English-language learners: Essential elements for design. Journal of Adolescent & Adult Literacy, 53(1), 38-48.

Taylor, C. R. (2012). Engaging the struggling reader: Focusing on reading and success across the content areas. National Teacher Education Journal, 5(2), 51-58.

Thomas, C., & Wexler, J. (2007). 10 ways to teach and support struggling adolescent readers. Kappa Delta Pi Record, 44(1), 22-27.