
Education and Information Technology Annual 2016

A Selection of AACE Award Papers


Edited by
Theo J. Bastiaens, Ph.D.
Gary H. Marks, Ph.D.

Published by

AACE, Association for the Advancement of Computing in Education


Education and Information Technology Annual 2016: A Selection of AACE Award Papers
(ISBN 978-1-939797-22-3) is published by
AACE, PO Box 719, Waynesville, NC 28786, USA
Email: info@aace.org
Copyright 2016 by AACE
www.aace.org. Available at http://www.aace.org/bookshelf.htm

Introduction

The Association for the Advancement of Computing in Education (AACE), http://AACE.org,
founded in 1981, is an international, not-for-profit, educational organization with the mission of
advancing Information Technology in Education and E-Learning research, development,
learning, and its practical application.

AACE serves the profession with international conferences, high-quality publications, a leading-
edge Digital Library (http://EdITLib.org), a Career Center, and other opportunities for
professional growth.

We are proud to present to you this selection of 25 award-winning papers from AACE's
conferences (http://AACE.org/conf). This year's selection includes papers from the annual
conference of the Society for Information Technology & Teacher Education (SITE) in Las
Vegas, Nevada; the World Conference on Educational Multimedia, Hypermedia and
Telecommunications (Ed-Media) in Montréal, Canada; the World Conference on E-Learning in
Corporate, Government, Healthcare, and Higher Education (E-Learn) in Kona, Hawaii; and
Global Learn in Berlin, Germany. The decision to nominate a conference paper for an award was
made by peer reviewers. All authors were honored during the conference and received a
certificate that serves as testimony to their outstanding research and contribution to the
conference.

This "finest of 2016" book groups the award-winning papers into five parts. These five parts
provide a timely overview and record of topics that are of primary interest in educational
technology this year.

We hope that the reader enjoys this selection as much as we enjoyed working with these cutting-
edge scholars. This is the fifth year that we have published this collection, and we are grateful
for all the feedback and kind comments we received on the 2012, 2013, 2014 and 2015 books.
We look forward to many future editions of AACE's award papers.

Thank you very much for your support and participation in AACE events and activities.

Theo J. Bastiaens, Gary H. Marks

Part 1 Technology in Education

In part 1 of this collection the focus is on Technology in Education in general.


In Chapter 1, author Borokhovski and colleagues summarize in a meta-analysis data
from 674 independent primary studies that compared conditions with higher degrees
of technology use with technology-leaner control conditions in terms of their
effects on student learning outcomes in postsecondary education. They found an
overall average weighted effect size of g+ = 0.27 (k = 879, p < .01), indicating a low
but significant positive effect of technology integration on learning. Follow-up
analyses revealed a significant influence of educational technology used for cognitive
support, blended learning instructional settings, designed interaction treatments, and
technology integration in teacher training, especially when student-centered
pedagogical frameworks are used.
Chapter 2 presents data gathered from 195 preservice and inservice teachers at four
universities during 2014. Author Christensen and colleagues try to determine the
suitability of the newly revised Technology Proficiency Self-Assessment (TPSA
C21) for release for wide-scale distribution. Findings were that the four original
scales of the TPSA maintained respectable reliability estimates ranging from .73 to
.86, while two new scales focusing on emerging technologies yielded Cronbach's
alpha internal consistency reliability estimates of .84 and .91. While university
students at all sites generally agreed or strongly agreed that they were proficient in
most technology-related skills, one group of accelerated master's students reported
less proficiency than anticipated. The authors conclude that the TPSA C21 continues
the reliable and valid measurement tradition of the TPSA spanning more than a
decade, and extends that tradition with the addition of two scales designed to assess
educator skills regarding emerging new technologies.
Authors Khalil and Ebner look in chapter 3 at the potentials of learning analytics.
Within the evolution of technology in education, learning analytics has reserved its
position as a robust technological field that promises to empower instructors and
learners in different educational fields. The 2014 Horizon Report expects it to be
adopted by educational institutions in the near future. However, its processes and
phases, as well as its constraints, are still not deeply debated. In chapter 3 the authors
discuss the essence, objectives and methodologies of Learning Analytics and propose
a first prototype life cycle that describes its entire process. Furthermore, the authors
raise substantial questions related to challenges such as security, policy and ethics
issues that limit the beneficial application of Learning Analytics processes.

Part 2 Design and Development

Part 2 stresses the design and development of learning experiences.


In chapter 4 author McMahon and colleagues report on the first stage of the design of
software to assist educators in ensuring assessment questions meet educational
outcomes. A literature review was undertaken to establish a framework for the
classification of instructional statements according to learning complexity. It focused
on taxonomies of learning outcomes to develop an Instructional Activity Matrix that
identifies nouns and verbs used to describe the types of knowledge and cognitive
processes manifest in learning. The matrix is described in chapter 4 and will form the
basis for the software to autonomously evaluate instructional items.
Authors Schmidt and Fulton state in chapter 5 that preparing students with 21st
Century Skills through STEM-related teaching is needed, especially at the
elementary level. However, most teacher preparation programs do not focus on
STEM education. To provide an exemplary STEM unit, they transformed an
inquiry-based unit on moon phases from a traditional science activity into a
technology-rich, digital unit for pre-service teachers (PSTs). In chapter 5, they
describe lessons learned related to the development and implementation of this
STEM unit in an undergraduate elementary methods course and explore its impact
on PSTs' perceptions of inquiry-based science instruction. Findings indicate that
PSTs held absolutist beliefs and needed instruction on inquiry-based learning prior
to engaging in an ongoing inquiry.
In chapter 6 author Damnik and colleagues try to understand how technology can be
used to foster high-quality knowledge acquisition. According to Jonassen (2001),
high-quality knowledge acquisition is only achieved if learners actively interact with
technology and, thus, utilize technological devices as cognitive tools. The purpose of
their chapter is to illustrate how a typical authoring tool, TEE-machine, can be
adapted as a cognitive tool. Furthermore, results of an evaluation questionnaire
assessing knowledge acquisition with the TEE-machine during several regular
university courses will be discussed. The results show that learners think that they
acquired a lot of knowledge during the courses, and that they were able to apply this
knowledge to new situations.
In chapter 7 author Jones and colleagues analyze a collection of posts written across
a teaching year on a group blog by three teacher educators as they explored their
practice and attempted to learn how to meet the technology integration challenge. Their analysis uses a
distributed view of knowledge and learning to identify the barriers and enablers
encountered, and how the teacher educators developed their distributed TPACK
throughout the year. The chapter argues that a distributed view of TPACK offers
some interesting insights that can inform practitioners, researchers and policy makers
as they explore practice and learn how to meet the technology integration challenge.
Asked to design a technology education course for secondary preservice candidates,
the authors of chapter 8, Hathaway and Norton, sought to design a course that
focused on the interaction of technology and disciplinary teaching, not on technology
skills. Adopting a design-based research approach, in Phase 1 a review of
literature led to two course design decisions: to situate participants' study of
technology in their disciplinary teaching field and to organize modules using
disciplinary habits of mind. Phase 2 led to a third course design decision: to
structure content and activities using a design pattern approach. Chapter 8 presents
Phase 3 of the research process, examining the influence of the course design decisions
on candidates' learning experiences. Study participants' course reflections were
analyzed to understand the influence of the three course design decisions.
Author Parkes and colleagues show in chapter 9 that Action Research can provide
a useful model for learning (re)design that allows the unit redevelopment process to
take place within an inclusive research framework of benefit to both lecturers and
students. An advantage of employing an Action Research model in this way is that
it can help better balance two often-competing obligations, teaching and research,
without necessarily compromising the quality of either. The use of Action Research
in the learning (re)design process is discussed in the context of the unit "ICT in
Education".
Author Fulks Read and her colleagues describe in chapter 10 a hybrid faculty
development course for instructors who were building hybrid or fully online college
courses to be taught in upcoming semesters. The Instructional Design team used the
ADDIE instructional design model to guide the creation of the faculty development
course. Participants were generally satisfied with the course design, navigation, and
instructional activities. Participants' expectations were generally met, and they
would recommend the course to others.
Authors Aarreniemi-Jokipelto and Makino present in chapter 11 the Dialogue Design
System (DDS), which facilitates the process of guided dialogue and provides all
students an equal opportunity to participate in a learning community. The dialogue is
based on a framework called the Message Construction Cross model. In this study,
the DDS model was used to share information and construct knowledge in the middle
of a project to boost collaborative learning in an exported educational programme.
The study utilized a design science approach and suggested tailored procedures and
processes for the DDS model to accommodate different target groups and
contexts. The added value of the DDS model in collaborative knowledge
construction and the opportunities of educational technology were evaluated through
a pilot study.

Part 3 Games and Simulations

Part 3 presents state-of-the-art research in the field of games and simulations.


Chapter 12 is about the application of game elements in non-gaming situations, such
as teacher education courses. The purpose of the study by authors Figg and Jaipal-
Jamani in chapter 12 was to explore the effects of gamification on pre-service
teachers' understanding of how to teach with technology (or TPACK knowledge).
Participants were 133 pre-service teachers, and qualitative data sources included
email interviews with instructors teaching the gamified technology methods course,
researcher field notes from weekly instructor meetings, student reflections, student
artifacts, and a student survey. Findings indicated that participation in the gamified
course promoted the development of TPACK knowledge.
In Chapter 13 author Vail describes the problem that learners tend to find theory
courses dry and struggle to remain engaged with the material. Gamification has been
used in both commercial and educational settings to engage an audience. In this
chapter an Information Security Management course was selected as a case study to
investigate whether an interactive game developed by the Naval Postgraduate
School would engage learners and assist them in understanding how enterprise
policies could be followed and implemented through logical protection mechanisms,
physical security, and enforcement of procedures. It was supposed that by working
through scenarios, learners would increase their understanding of Information
Security through active learning if they were self-aware enough to see that, while
attempting to achieve the scenario objective, they were applying abstract security
concepts. The study was conducted to determine if active learning was actually
taking place while learners played CyberCIEGE.
Authors Comber and Motschnig explain in Chapter 14 that learning to program is a
challenging task. In stepping from programming learning environments to
professional integrated development environments, the complexity involved and the
effort required to successfully write programs grow. Moreover, motivation tends to
decline, especially when learners are overstrained. Video game development is
perceived as highly motivating by learners and is simultaneously suitable for
effectively delivering important concepts of computer science. In their chapter, a group
of students engaged in game programming with a specialized framework is compared
with a group of students working on student-chosen projects using Windows Forms
applications. The groups are compared based on questionnaires, text feedback, and a
brief programming skills test. Although motivation was high in both groups, the
students engaged in game development scored significantly higher on the
programming skills test.
Authors Paek and Hoffman examine in chapter 15 the impact of a simulation of
political and economic development on students' understanding of the features and
actions necessary to lead a successful country in a global, interconnected world. The
purpose of their study was to assess the potential of simulated social studies content
to prompt students to reflect on what it means to be successful locally and globally.
Using a one-group pretest-posttest design, participants experienced an online
simulation for seven 90-minute sessions. A comparison of participants' pre-simulation
and post-simulation concept maps shows quantitative and qualitative differences in
their understanding of key concepts related to economic and political development.
Chapter 16 reports on the findings of a renal patient simulation conducted with
undergraduate Pharmacy students. Author Martini and colleagues captured data
through a series of reflective questions during the simulation and by means of an
evaluative survey. Results indicate that students were motivated by the challenge
and felt the task was achievable, even though some were not clear on what it
involved at the outset. There was an overall sense of immersion and intrinsic interest
in the simulation and of fairness in the activity, and most agreed it was valuable to
their learning.
Author Bos reports in Chapter 17 that the teacher's role in bringing virtual game-
based activities into the classroom can improve the learning and motivational
effectiveness of studying for students. Pre-service and practicing teachers need
preparation to develop technological competence, curricular skills, and confidence
in facilitating game-based activities in elementary environments. Her study looks
closely at participants in an education master's program with a concentration in
mathematics and their engagement in designing mathematics curriculum using a
popular virtual game to teach geometric and measurement concepts in their
classrooms. They organized their activity using the Play Curricular activity
Reflection Discussion (PCaRD) model with Inquiry, Communication, Construction
of Knowledge and Expression (ICCE). The Game Network Analysis framework
provided the methodological framework.
The final chapter in part three is about mobile learning. A comprehensive literature
review methodology was used to examine recent literature through a critical analysis
of data-based research published on mobile learning for students with and without
disabilities in K-12 environments from 2007 to 2014. Four findings were manifested
in this review: (a) the research purpose of most M-learning studies that authors Xie
and Basham examined was to evaluate the effectiveness of M-learning on teaching
and learning; (b) the preferred methodology was empirical study, with mixed
methods, interviews and questionnaires; (c) most of the research outcomes were
positive, and M-learning showed potential to include students with diverse needs in
general education with easy access; (d) more research needs to be done on the
interaction model between M-learning technology and users and their demographic
features.

Part 4 Support and Mentoring

In part four the emphasis lies on supporting and mentoring students. The purpose of the study in
chapter 19 was to examine the effects of employing various multimedia tools in a flipped
classroom on pre-service teachers' self-efficacy and knowledge application abilities. The authors,
Watts and Ibrahim, employed a within-subject design with one independent variable, the flipped
classroom versus the lecture-based instructional model, and two dependent variables: (1)
knowledge application of various multimedia tools and (2) students' perception of self-efficacy.
The study participants included 36 senior-level elementary education majors from a medium-sized
Midwestern university. The study results showed that there were indeed differences between
students' mean test scores and self-efficacy scores and that the differences were statistically
significant in the flipped classroom model as compared to the traditional classroom model.
Author Morisse explains in chapter 20 that theoretical computer science is an obligatory module
in most computer science study programs. Like mathematics, it is a module with a high rate of
examination failure. Especially at universities of applied sciences, with their more practical
approach to education, it is a course that is not much liked by students. For many students the
formal approach of theoretical computer science is a barrier in the study program. To overcome
this barrier, Morisse implemented an inverted classroom model (ICM) for his course. The main
goal of the ICM implementation is to provide as much individual support for students as
possible, to motivate them for this very formal course, and to encourage a continuous learning
process. The implementation of the ICM uses video-based lectures, which are closely connected
to an extensive textbook; an audience response system, which is used during the contact time of
students and lecturer; as well as other tools to support the learning process. The author reports on
the progress of the implementation and the results of an evaluation of the model, gives some
prospects for the success of this model, and gives a short review of the role change of the
teaching staff from lecturer to coach by the side.
In Chapter 21 author Redmond describes a project which investigated qualitative expressions in
an online mentoring community involving secondary pre-service teachers and practising
teachers. The practising teachers acted as online mentors to the pre-service teachers who were
personally, professionally and geographically isolated due to being located in regional, rural or
remote areas. The online mentoring enabled rural and remote pre-service teachers to benefit from
the ability to engage with practising teachers for both professional and academic purposes. The
participants posts hosted in an invitational online space were coded using a content analysis
framework, and outcomes from the online mentoring project are provided.
Authors Spitzer and Ebner claim in chapter 22 that teamwork and collaboration skills are very
important for improving learning efficiency and experience. Therefore an innovative iPad app,
called Teamsketch, was developed to provide a collaborative sketch environment in which pupils
can simultaneously draw one sketch together on their devices. Up to four pupils can take part
in a session and train collaboration just by drawing a sketch. First, different features and
issues of state-of-the-art applications were evaluated. Afterwards, a prototype was implemented
from scratch using Apple's new programming language, Swift. Additionally, a web service, a
web interface and a web site were programmed in order to provide an evaluation tool for
teachers. Furthermore, pupils can upload and download their drawn sketches and profile pictures.
A first field test was carried out at the primary school Graz-Hirten. The test showed the
potential of the app for training and evaluating team and collaboration skills.

Part 5 Progress and Achievement

In part 5 the focus is on students' achievements and perceived progress. In chapter 23, author
Bordelon starts from the perception that still exists that isolation from the instructor and other
students reduces student satisfaction and achievement in an online class. It has been determined
that student-instructor interaction, student-student interaction, and student-content interaction
each exist in online classes and influence student perceived learning and satisfaction. What is
problematic is that it has yet to be determined whether statistically significant differences or
variations in student-instructor interaction, student-student interaction, or student-content
interaction are related to differences or variations in student perceived achievement or
satisfaction in online graduate courses in the field of education. This quantitative study utilized a
purposive sample of 155 K-12 educators enrolled in online graduate courses who completed an
online survey containing questions regarding each type of interaction, achievement, and
satisfaction. Multiple regression analysis was conducted to examine the role of each interaction
separate from the other interactions. Results demonstrated that student-instructor interaction,
student-content interaction, and student-student interaction were positively and significantly
associated with perceived achievement. Based on the results, it is recommended that institutions
implement requirements and guidelines for student-instructor interaction and the incorporation
of meaningful student-content interaction.
Author Lin and colleagues examine in chapter 24 the relationship between interactions and
learning outcomes among 466 high-school students who were taking online language courses in
a Midwestern virtual school. Regression analysis was employed to look into how three broad
types of interaction, learner-instructor, learner-learner, and learner-content, affected students'
perceived progress and satisfaction. After controlling for demographic information, motivation,
and learning strategies, the results of multiple regression showed that learner-instructor and
learner-content interactions had significantly positive effects on satisfaction, whereas learner-
learner interaction did not affect satisfaction. Learner-content interaction was the only factor that
affected perceived progress.
The last chapter of this book is about social presence. Author Taverna and colleagues show that a
significant body of research literature confirms that social presence is an important element of
online teaching and learning, and that students rank interactivity with peers and instructors, and
teaching presence, as very important for their learning experience. A major difference between
synchronous and asynchronous online teaching and learning platforms is the opportunity for
regularly scheduled, real-time interaction with instructor and peers. In their study, they
compare student perceptions of synchronous and asynchronous options, considering variables
that impact social presence, interactivity, engagement and satisfaction. Theories that contribute
to our understanding of instructor and student online interactions are discussed.

TABLE OF CONTENTS

PART 1 TECHNOLOGY IN EDUCATION



Chapter 1 Technology Integration in Postsecondary Education:
A Summary of Findings of a Series of Meta-Analytical Research
Eugene Borokhovski¹, Robert M. Bernard¹, Rana Tamim² & Richard F. Schmid¹
¹Centre for the Study of Learning and Performance, Concordia University, Canada
²College of Education, Zayed University, United Arab Emirates ...................................................23


Chapter 2 Measuring 21st Century Skills in Technology Educators
Rhonda Christensen, University of North Texas, USA, Rhonda.christensen@gmail.com,
Gerald Knezek, University of North Texas, USA, gknezek@gmail.com, Curby Alexander
Texas Christian University, USA, Curby.alex@tcu.edu, Dana Owens, University of Texas
at Arlington, USA, danasuzanneowens@gmail.com, Theresa Overall, University of Maine,
Farmington, USA, Theresa.overall@maine.edu, Garry Mayes, University of North Texas,
USA Garry.mayes@unt.edu ............................................................................................................................35

Chapter 3 Learning Analytics: Principles and Constraints
Mohammad Khalil, Social Learning, Information Technology Services, Graz University of
Technology, Graz, Austria, mohammad.khalil@tugraz.at, Martin Ebner, Social Learning,
Information Technology Services, Graz University of Technology, Graz, Austria
martin.ebner@tugraz.at ................................................................................................................................43


PART 2 DESIGN AND DEVELOPMENT

Chapter 4 A Classification Matrix of Examination Items to Promote
Transformative Assessment
Mark McMahon, Edith Cowan University, Australia, m.mcmahon@ecu.edu.au,
Michael Garrett, Cinglevue International, Australia, michael.garrett@cinglevue.com ......55

Chapter 5 Lessons Learned from Creation of an Exemplary STEM Unit for
Elementary Pre-Service Teachers: A Case Study
Matthew Schmidt, University of Hawaii at Mānoa, United States,
matthew.schmidt@hawaii.edu, Lori Fulton, University of Hawaii at Mānoa, United States,
fultonl@hawaii.edu...........................................................................................................................................65

Chapter 6 Fostering Active Knowledge Construction with the TEE-machine
Gregor Damnik, Antje Proske, Hermann Körndle, Psychology of Learning and Instruction,
TU Dresden, Germany, gregor.damnik@tu-dresden.de, antje.proske@tu-dresden.de,
hermann.koerndle@tu-dresden.de .............................................................................................................73

Chapter 7 TPACK as shared practice: Toward a research agenda
David Jones, University of Southern Queensland, Australia, David.Jones@usq.edu.au,
Amanda Heffernan, University of Southern Queensland, Australia,
Amanda.Heffernan@usq.edu.au, Peter R. Albion, University of Southern Queensland,
Australia, Peter.Albion@usq.edu.au .........................................................................................................79

Chapter 8 A Preservice Secondary Education Technology Course: Design
Decisions and Students' Learning Experiences
Dawn Hathaway and Priscilla Norton, George Mason University, USA,
dhathawa@gmu.edu, pnorton@gmu.edu ..............................................................................................87

Chapter 9 Applying an Action Research Model to Learning (Re)Design
Mitchell Parkes, University of New England, Australia, mparkes2@une.edu.au, Peter
Fletcher, University of New England, Australia, pfletch2@une.edu.au, Sarah Stein,
University of Otago, New Zealand, sarah.stein@otago.ac.nz ..........................................................95

Chapter 10 Using ADDIE to Design Online Courses Via Hybrid Faculty
Development
Michelle Fulks Read, PhD, Texas State University, United States,
Michelle.Read@txstate.edu, Gwendolyn Morel, MS, Texas State University,
United States, gwendolynmorel@txstate.edu, Danyelle Hennington, M.A., Texas
State University, United States, ldh82@txstate.edu ...................................................................... 105

Chapter 11 Dialogue Design System to Share Information and Construct
Knowledge
Päivi Aarreniemi-Jokipelto, HAAGA-HELIA School of Vocational Teacher Education,
Finland, paivi.aarreniemi-jokipelto@haaga-helia.fi, Yukari Makino, Faculty of
Informatics, Kansai University, Japan, makino@kansai-u.ac.jp .............................................. 111


PART 3 GAMES AND SIMULATIONS

Chapter 12 Investigating the Development of TPACK Knowledge through
Gamification
Candace Figg, Brock University, Canada, cfigg@brocku.ca, Kamini Jaipal-Jamani,
Brock University, Canada, kjaipal@brocku.ca ......................................................................................125

Chapter 13 Gamification of an Information Security Management Course
John Vail, Florida State College at Jacksonville, United States, jvail@fscj.edu ........................135

Chapter 14 Challenges and opportunities in employing game development in
computer science classes
Oswald Comber, Renate Motschnig, University of Vienna, Faculty of Computer
Science, 1090 Vienna, Austria, oswald.comber@univie.ac.at .................................................. 141

Chapter 15 Students' Conceptual Understanding of Leadership in a Global
World: Learning via a Web-based Simulation of Political and Economic
Development
Seungoh Paek, Learning Design and Technology, University of Hawaii at Mānoa,
United States, speak@hawaii.edu, Daniel L. Hoffman, Kamehameha Schools,
United States, dahoffma@ksbe.edu ........................................................................................................151

Chapter 16 Ready to Practice? Learning skills using digital simulated patients
Nataly Martini, School of Pharmacy, Faculty of Medical and Health Sciences, University
of Auckland, New Zealand, n.martini@auckland.ac.nz, Ashwini Datt, Centre for Learning
and Research in Higher Education, University of Auckland, New Zealand
a.datt@auckland.ac.nz, Anuj Bhargava, Department of Physiology, Faculty of Medical
and Health Sciences, University of Auckland, New Zealand
a.bhargava@auckland.ac.nz, Craig Webster, Centre for Medical and Health Sciences
Education, University of Auckland, New Zealand, c.webster@auckland.ac.nz .......................157

Chapter 17 Serious Mathematics Games: Making them Happen in Elementary
Schools
Beth Bos, Texas State University, USA, bethbbos@hotmail.com ...................................................163

Chapter 18 Mobile Learning for Learners With and Without Disabilities in K-12
Education Settings
Jingrong Xie, James D. Basham, Ph.D., The University of Kansas ...................................................171


PART 4 SUPPORT AND MENTORING

Chapter 19 Examining the Effects of Employing Various Multimedia Tools in a
Flipped Classroom on Pre-Service Teachers' Self-Efficacy and Knowledge
Application Abilities
Aileen J. Watts, College of Education, Arkansas Tech University, awatts4@atu.edu,
Mohamed Ibrahim, College of Education, Arkansas Tech University,
mibrahim1@atu.edu ........................................................................................................................................181

Chapter 20 Implementation of the Inverted Classroom Model for Theoretical
Computer Science
Karsten Morisse, Faculty of Engineering and Computer Science, University of Applied
Sciences Osnabrueck, Germany, k.morisse@hs-osnabrueck.de ......................................................187

Chapter 21 Online mentoring for secondary preservice teachers in regional,
rural or remote locations
Petrea Redmond, School of Teacher Education and Early Childhood, University of
Southern Queensland, Australia, redmond@usq.edu.au ...................................................................197


Chapter 22 Collaborative Learning Through Drawing on iPads
Michael Spitzer, Institute for Information Systems and Computer Media, Graz
University of Technology, Austria, michael.spitzer@student.tugraz.at, Martin Ebner,
Social Learning Computer and Information Services, Graz University of Technology,
Austria, martin.ebner@tugraz.at .............................................................................................................205


PART 5 PROGRESS AND ACHIEVEMENT

Chapter 23 Perceptions of Achievement and Satisfaction as Related to
Interactions in Online Courses
Kristi Bordelon, Ph.D., KB Consulting ........................................................................................................217

Chapter 24 Interaction, Satisfaction, and Perceived Progress in Online
Language Courses
Chin-Hsi Lin, Binbin Zheng, Yining Zhang, Michigan State University, United States,
chinhsi@msu.edu, binbinz@msu.edu, zhangy58@msu.edu ............................................................223

Chapter 25 Keeping It Real: Factors that Impact Social Presence, Feelings
of Isolation, and Interactivity in Online Learning
Franco Taverna, Ph.D., Senior Lecturer, Human Biology Program, University of Toronto,
Canada, Lena Paulo Kushnir, Ph.D., Assistant Professor, Psychology, OCAD University,
Canada, Kenneth Berry, M.Sc., Research Associate, OCAD University, Canada, Laurie
Harrison, M.Ed., Director, Online Learning, University of Toronto, Canada .............................231

Editors

Theo Bastiaens is professor of Educational Technology at the FernUniversität in Hagen,
Germany and part-time professor at the Open University, The Netherlands. He is a member of
the AACE board of directors. E-mail: Theo.Bastiaens@fernuni-hagen.de

Gary Marks is CEO and founder of AACE. E-mail: info@aace.org


DOES YOUR LIBRARY SUBSCRIBE?

LearnTechLib is the premier multimedia resource for peer-reviewed research in
Educational Technologies and E-Learning: 125,000+ articles and 16,000 dissertations
written by 230,000+ international authors covering 30+ years of advancements in IT in
Education! Content includes special topic books, conference papers, presentation
slides, conference talks, journal articles, webinars, and videos.

Individuals: $19/month or $150/year
Libraries: $2,095/year

Academic journals including:
International Journal on E-Learning
Journal of Computers in Mathematics and Science Teaching
Journal of Educational Multimedia and Hypermedia
Journal of Interactive Learning Research
Journal of Online Learning Research (NEW!)
Journal of Technology and Teacher Education
+ 2,000 more journals!

Conference proceedings including:
SITE (Society for Information Technology and Teacher Education International Conference)
EdMedia (World Conference on Educational Media & Technology)
Global Learn (Global Conference on Learning and Technology)
E-Learn (World Conference on E-Learning)
+ 800 more conferences!
Also: curated ERIC-indexed publications, and relevant content from ProQuest
Dissertations, just added.

Newest content additions:
e-Books: Practitioner's Guide to Technology, Pedagogy, and Content Knowledge (TPACK):
Rich Media Cases of Teacher Knowledge
Journals: International Journal of Research Studies in Educational Technology
Reports: Horizon Report: 2016 Higher Education Edition
... and many more titles!

Global U, PO Box 958, Waynesville, NC 28786 Email: info@aace.org aace.org


Celebrating 30+ Years of Service to the IT in Education/E-Learning Community

Invitation to Join
The Association for the Advancement of Computing in Education (AACE) is an international, non-profit
educational organization. The Association's purpose is to advance the knowledge, theory, and quality of
teaching and learning at all levels with information technology.
This purpose is accomplished through the encouragement of scholarly inquiry related to technology in
education and the dissemination of research results and their applications through AACE sponsored publications,
conferences, and other opportunities for professional growth.
AACE members have the opportunity to participate in Special Interest Groups (SIGs), high-quality peer-re-
viewed publications, and conferences.

AACE MEMBERSHIP
Join with fellow professionals from around the world to share knowledge and ideas on research, development,
and applications in information technology and education. AACE's membership includes researchers, developers,
and practitioners in schools, colleges, and universities; administrators, policy decision-makers, professional
trainers, adult educators, and other specialists in education, industry, and government with an interest in
advancing knowledge and learning with information technology in education.

Membership Benefit Highlights

Gain professional recognition by participating in AACE sponsored international conferences
Enhance your knowledge and professional skills through interaction with colleagues
from around the world
Learn from colleagues' research and studies by receiving AACE's well-respected
journals and books
Receive a subscription to the professional periodical Journal of Online Learning
Research (JOLR) [digital]
Access LearnTechLib, The Learning and Technology Library, a valuable online resource
that is fully searchable and covers 30+ years of academic journals and international
conference proceedings
Receive discounts on multiple journal subscriptions, conference registration fees,
and subscriptions
AACE Social Media enables you to connect with colleagues worldwide!

AACE Blog: blog.aace.org
AACE Facebook: facebook.com/aaceorg
AACE Twitter: twitter.com/aace
aace.org

AACE Conferences: Details for conferences are available at www.aace.org/conf

The exchange of ideas and experiences is essential to the advancement of the field and the professional
growth of AACE members. AACE sponsors conferences each year where members learn about
research, developments, and applications in their fields, have an opportunity to participate in
papers, panels, poster demonstrations and workshops, and meet invited speakers.

SITE: March 21-25, 2016, Savannah, Georgia
This conference, held annually, offers opportunities to share ideas and expertise on
all topics related to the use of information technology in teacher education and
instruction about information technology for all disciplines in preservice, inservice,
and graduate teacher education.

Global Learn: April 28 & 29, 2016, Limerick, Ireland
Global Learn is an international conference with the mission of furthering the
advancement and innovation in learning and technology. As the educational world
becomes increasingly global, new ways to explore, learn, and share knowledge are
needed.

EdMedia, World Conference on Educational Media & Technology: June 26-30, 2016,
Vancouver, British Columbia
This annual conference serves as a multidisciplinary forum for the discussion of the
latest research, developments and applications of multimedia, hypermedia and
telecommunications for all levels of education.

E-Learn: November 14-16, 2016, Washington, D.C.
E-Learn is a respected, international conference enabling E-Learning researchers and
practitioners in corporate, government, healthcare and higher education to exchange
information on research, developments and applications.
AACE Journals: Abstracts for all journal issues are available at www.LearnTechLib.org

LearnTechLib, The Learning and Technology Library (Digital)
LearnTechLib is your research and instructional source for peer-reviewed articles and
multimedia (100,000+) on the latest research, developments, and applications related
to all aspects of Educational Technology and E-Learning from thousands of journals,
dissertations, reports, ebooks and international conference proceedings.

International Journal on E-Learning (Corporate, Government, Healthcare, & Higher
Education) (IJEL), ISSN# 1537-2456, Quarterly
IJEL serves as a forum to facilitate the international exchange of information on the
current theory, research, development, and practice of E-Learning in education and
training. This journal is designed for researchers, developers and practitioners in
schools, colleges, and universities, administrators, policy decision-makers,
professional trainers, adult educators, and other specialists in education, industry,
and government.

Journal of Educational Multimedia & Hypermedia (JEMH), ISSN# 1055-8896, Quarterly
Designed to provide a multidisciplinary forum to present and discuss research,
development and applications of multimedia and hypermedia in education. The main goal
of the Journal is to contribute to the advancement of the theory and practice of
learning and teaching using these powerful and promising technological tools that
allow the integration of images, sound, text, and data.

Journal of Interactive Learning Research (JILR), ISSN# 1093-023X, Quarterly
The Journal's published papers relate to the underlying theory, design,
implementation, effectiveness, and impact on education and training of the following
interactive learning environments: authoring systems, CALL, assessment systems, CBT,
computer-mediated communications, collaborative learning, distributed learning
environments, performance support systems, multimedia systems, simulations and games,
intelligent agents on the Internet, intelligent tutoring systems, micro-worlds, and
virtual reality-based learning systems.

Journal of Technology and Teacher Education (JTATE), ISSN# 1059-7069, Quarterly
A forum for the exchange of knowledge about the use of information technology in
teacher education. Journal content covers preservice and inservice teacher education,
graduate programs in areas such as curriculum and instruction, educational
administration, staff development, instructional technology, and educational
computing.

Journal of Computers in Mathematics & Science Teaching (JCMST), ISSN# 0731-9258,
Quarterly
JCMST is the only periodical devoted specifically to using information technology in
the teaching of mathematics and science. The Journal offers an in-depth forum for the
exchange of information in the fields of science, mathematics, and computer science.

Journal of Online Learning Research (JOLR), ISSN# 2374-1473, Quarterly (NEW!)
The Journal of Online Learning Research is a peer-reviewed, international journal
devoted to the theoretical, empirical, and pragmatic understanding of technologies and
their impact on pedagogy and policy in primary and secondary (K-12) online and blended
environments.

CITE Electronic Journal (CITE), ISSN# 1528-5804, Quarterly
An electronic publication of the Society for Information Technology and Teacher
Education (SITE), established as a multimedia, interactive electronic counterpart of
the Journal of Technology and Teacher Education.
Membership Application (you can also apply online at membership.aace.org)

Membership Options

Professional Membership ($125)
Subscription to 1 AACE Journal (Digital, see journal list below)
Full online access to multiyear back issues of that journal
Discount conference registrations and proceedings
Discount subscriptions to additional journals
Access to the AACE Job Board
All the benefits of AACE Membership

Student Membership ($45)
All the same benefits of a Professional Membership
Offered at a discount for students
MUST be enrolled as a full-time student in an accredited educational institution and
provide school information below

Professional Membership PLUS LearnTechLib, The Learning and Technology Library ($175)
All the same benefits of a Professional Membership
PLUS 1-year subscription to the LearnTechLib with 100,000+ peer-reviewed journal
articles, conference papers and presentations, videos, webinars and much more

Student Membership PLUS LearnTechLib, The Learning and Technology Library ($75)
All the same benefits of a Professional Membership
PLUS 1-year subscription to the LearnTechLib
Offered at a discount for students
MUST be enrolled as a full-time student in an accredited educational institution and
provide school information below

Virtual Membership, New Option! ($395, a $600 value)
Registration as a virtual participant for the following events: EdMedia World
Conference on EdMedia & Technology (value $225) and E-Learn World Conference on
E-Learning (value $225)
Full access to LearnTechLib, the leading digital library dedicated to Education &
Information Technology (value $150)
Conference proceedings for AACE events, accessible in the LearnTechLib Education and
Information Technology Digital Library
AACE face-to-face conference registration discounts

Select Your Membership Journals
Professional & Student Memberships include a subscription to 1 AACE Journal (Digital;
see journal list under Library Subscriptions). Additional journals can be added to
your membership. Please choose from the options below:
1 Journal: $125 prof / $45 student
2 Journals: $180 prof / $70 student
3 Journals: $235 prof / $95 student
4 Journals: $290 prof / $120 student
5 Journals: $345 prof / $155 student

Journal Title(s): International Journal on E-Learning (IJEL); Journal of Educational
Multimedia and Hypermedia (JEMH); Journal of Computers in Math and Science Teaching
(JCMST); Journal of Interactive Learning Research (JILR); Journal of Technology and
Teacher Education (JTATE); Journal of Online Learning Research (JOLR, already FREE
with membership)

Applicant Information: membership extends for 1 year from the approximate date of
application. Applicants provide name, e-mail, and mailing address; students must also
provide their school/institution name and expected graduation date.

Method of Payment (USD): check (U.S. funds and bank, payable to AACE), purchase order
(must include $10 service charge), bank wire (must include $25 service charge), or
credit card (MasterCard, VISA, AMEX, Discover).

Return to: AACE, P.O. Box 719, Waynesville, NC 28786 USA. E-mail: info@aace.org
www.aace.org
PART 1 TECHNOLOGY IN EDUCATION

1. Technology Integration in Postsecondary Education: A Summary of
Findings of a Series of Meta-Analytical Research
Eugene Borokhovski
Centre for the Study of Learning and Performance, Concordia University, Canada

Robert M. Bernard
Centre for the Study of Learning and Performance, Concordia University, Canada

Rana Tamim
College of Education, Zayed University, United Arab Emirates

Richard F. Schmid
Centre for the Study of Learning and Performance, Concordia University, Canada

Introduction
When educational researchers and practitioners join in a general public discussion about the advances,
prevalence and utilitarian value of modern technology, they are expected to take at least a few steps beyond
fashionable truisms that revolve around the latest releases of cutting-edge applications opening new horizons. The real
challenge is to explore, more systematically and in depth, what works in education for the benefit of student learning,
and to effectively link ever-advancing technological functionality to educational theory and practice. Well-
established pedagogical frameworks may or may not constructively guide the educational use of various new
technologies, just as pioneering technological features may or may not make a decisive difference in supporting
learning. Research in education is intended to sort out these "may" and "may not" cases, to identify and explain when
and why the promising potential of modern technology materializes in successful educational practice. In this paper we
summarize the findings of a large-scale meta-analysis of classroom technology integration in postsecondary
education (Schmid, Bernard, Borokhovski, Tamim, Abrami, et al., 2009, 2014) and of several follow-ups that
specifically focus on sub-collections of studies addressing blended learning (Bernard, Borokhovski, Schmid,
Tamim, & Abrami, 2014), designed interaction treatments (Borokhovski, Tamim, Bernard, Schmid, &
Sokolovskaya, 2013), and pedagogical underpinnings of teacher education (Tamim, Borokhovski, Bernard, Schmid,
Abrami & Sokolovskaya, 2014).
Framework for studying educational technology
Reliance on computers in various aspects of daily life is a reality of modern society that significantly
impacts nearly everything we do, professionally and personally. Why, then, are educational researchers still
engaged in a debate over technology's effectiveness for teaching and learning? Some, like Clark (1983), insist on a rather
auxiliary role for educational technology, while others believe in its more substantial transformative effect
(e.g., Kozma, 1991). The disagreement is probably rooted in the history of educational technology itself. Originally,
it was utilized almost exclusively to deliver instructional content, and as a medium it was no more effective than a
human teacher, especially an expert one. Early media studies on distributed closed-circuit television versus live
teaching (Carpenter & Greenhill, 1955) that found no differences between live teachers and televised teachers serve
as a good example. However, even the development and introduction into educational practice of much more
sophisticated computer tools and applications (what is often referred to as "interactive," "multi-functional," "mobile,"
"smart" technology) have not impacted students' learning outcomes impressively enough to consider the issue
unequivocally resolved in favor of the proponents of educational technology. In fact, our meta-analysis contains
(among more nuanced findings) an impressive illustration of the relatively unchanged effects of technology over time:
throughout two decades, despite remarkable advances in technology itself, our ability to use it effectively for
educational purposes has not changed much (Schmid et al., 2014).
So, first we intended to address the "big question" of the effectiveness of educational technology in higher
education in general, based on the premise that what needs to be studied is not technology per se, but how it is used
for various educational purposes. We then followed this large-scale meta-analysis with a series of subsequent projects
focusing in turn on major functions of technology use, blended learning, conditions for student-student interactions,
and pedagogical approaches to teacher training. For the purposes of this meta-analysis, our definition of educational
technology is fairly inclusive and echoes Ross, Morrison and Lowther's (2010) understanding of it as "a broad variety
of modalities, tools, and strategies for learning, [whose] effectiveness, therefore, depends on how well [they] help
teachers and students achieve the desired instructional goals" (p. 19).
Bernard, Borokhovski, Schmid and Tamim (2014) provided a list of previously conducted meta-analyses of
the effectiveness of technology integration, predominantly comparing technology use in experimental conditions to
technology-free control conditions. Collectively, they depict the state of instructional technology in postsecondary
education quite reasonably. Individually, however, none of them is capable of truly summarizing the state of
research across time periods, technology types and subject matters. Also, as we demonstrated there, some of these
previously conducted meta-analyses appear to be open to various forms of methodological imperfection (i.e., bias).
In 2011, Tamim, Bernard, Borokhovski, Abrami, and Schmid published a second-order meta-analysis of 25
meta-analyses that cut across all levels of formal education, subject matters and technology types, from the 1970s to
the present. The weighted average effect size was 0.35 (p < .01), encompassing 1,055 primary studies and 109,700
participants. The authors addressed the "big question" based on studies and meta-analyses that tested the "with and
without" question. They concluded that technology does enhance learning, even if only to a relatively small degree.
However, as Cooper and Koenka (2012) point out, a second-order meta-analysis remains a limited means of
addressing the host of issues that can only be settled in a primary meta-analysis, where coding decisions can be
made by the meta-analyst and the synthesis is conducted at the more granular level of individual effect sizes. It is
these issues, plus the fact that the educational research landscape has changed to largely include "technology vs.
technology" types of instructional settings, that motivated our effort to perform a primary meta-analysis of
technology use in the postsecondary classroom.
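For context, a "weighted average effect size" such as the 0.35 reported above is, under the standard fixed-effect model, the inverse-variance weighted mean of per-study effect sizes. A minimal sketch in Python, using illustrative values rather than any of the reviewed studies' data:

```python
import math

def weighted_mean_effect(effects, variances):
    """Fixed-effect inverse-variance weighted mean effect size.

    effects   -- per-study effect sizes (e.g., Hedges' g)
    variances -- their sampling variances
    Returns (weighted mean g+, standard error, z statistic).
    """
    weights = [1.0 / v for v in variances]        # more precise studies count more
    g_plus = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))            # standard error of the pooled mean
    return g_plus, se, g_plus / se

# Illustrative values only (three hypothetical studies)
g_plus, se, z = weighted_mean_effect([0.20, 0.45, 0.35], [0.04, 0.02, 0.03])
```

A z statistic above 1.96 corresponds to p < .05, which is the sense in which a pooled effect is judged significantly different from zero.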
In summary, the current meta-analysis is intended to overcome some limitations of the previous ones and
to: 1) become the most comprehensive first-order meta-analysis of technology use in postsecondary classroom
education; 2) look at the pedagogical use of a broad range of educational technologies; 3) examine important
instructional moderator variables; and 4) encompass both studies with no technology in the control condition and
studies with various degrees of technology in each condition.
A set of subsequent (follow-up) analyses addressed several additional questions we judged to be of the
utmost research and applied importance. More specifically, we investigated the effects of various purposes of
technology use (with the focus on cognitive support for learning), technology use in blended learning, instructional
settings (interaction treatments) designed specifically to enable student collaborative work, and various pedagogical
approaches to teacher education. Below, we briefly present rationale for each of these follow-ups.
Major Purpose of Technology Use
By now everybody realizes that technology's role in education is not limited to the delivery of content, but
how do students learn through technology without being taught directly? One possible answer to this question
advanced by Cobb (1997) asserts that the most compelling role of computing in learning is its ability to afford
cognitive efficiencies to students. In most if not all learning situations there is shared cognition among the learner,
the task itself and the tools that the learner uses in the process. Further, he argues that the more learners can
distribute cognition outside of their heads, the more cognition can be devoted to the process of learning new
material. Therefore, one purpose of the current review was to explore whether and to what extent cognitive tools
promote student achievement in learning environments involving technology.
Blended learning
There is a growing literature of studies investigating blended learning, which involves a combination of
elements of face-to-face and online instruction. It is sometimes argued that blended learning is "the best of both
worlds" because it marries the best elements of the two practices. To date, there have only been two meta-
analyses devoted to blended learning (Means, Toyama, Murphy & Baki, 2013; Bernard et al., 2014, reported here).
Blended learning conditions were found to significantly outperform fully face-to-face classroom instruction (g+ =
0.35, k = 23, p = .001). Among the findings of this meta-analysis were the following. Students in both collaborative
interactive learning and teacher-directed expository instructional conditions significantly outperformed those
engaged in active self-study. In computer-mediated communications with the instructor and among students, the
asynchronous mode alone was more effective than its combination with the synchronous mode. Also, learners in
undergraduate courses seemed to benefit more from blended learning than graduate students. A balance of course
time in favor of online instruction (as compared to time spent face-to-face) produced a relatively higher weighted
average effect size, as did the opportunity to interact face-to-face with the instructor during class time rather
than before or after. Treatments of longer duration were more effective than shorter ones. These latter findings,
however, were not statistically significant.

Subsequently, our meta-analysis (Bernard et al., 2014) addressed the following research questions: What is
the impact of blended learning (i.e., courses that combine face-to-face and online learning) on the achievement of
higher education students? How do various pedagogical factors (e.g., the amount of time spent online and the
purpose of technology use) and course demographics (e.g., subject matter) moderate the overall average effect size?
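Moderator questions like these are conventionally tested with a between-groups heterogeneity statistic (Q-between), which asks whether subgroup mean effects differ by more than sampling error allows. A minimal fixed-effect sketch, with hypothetical data rather than the review's:

```python
def q_between(group_effects, group_variances):
    """Q-between test for a categorical moderator (fixed-effect model).

    group_effects / group_variances map each moderator level (e.g.,
    'undergrad', 'grad') to lists of per-study effect sizes and
    sampling variances.  Returns (Q_B, df); under the null hypothesis
    of no moderator effect, Q_B follows a chi-square distribution
    with df = (number of levels - 1) degrees of freedom.
    """
    def pooled(es, vs):
        # Inverse-variance weighted mean of one subgroup, plus its total weight
        ws = [1.0 / v for v in vs]
        return sum(w * e for w, e in zip(ws, es)) / sum(ws), sum(ws)

    groups = [pooled(group_effects[k], group_variances[k]) for k in group_effects]
    total_w = sum(w for _, w in groups)
    grand_mean = sum(m * w for m, w in groups) / total_w
    q_b = sum(w * (m - grand_mean) ** 2 for m, w in groups)
    return q_b, len(groups) - 1

# Hypothetical data: two moderator levels, two studies each
q_b, df = q_between(
    {"undergrad": [0.1, 0.2], "grad": [0.5, 0.6]},
    {"undergrad": [0.04, 0.04], "grad": [0.04, 0.04]},
)
```

With df = 1, a Q_B above 3.84 would indicate a significant moderator at p < .05.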
Designed Interaction Treatments
Institutions of higher education continue to take advantage of developments in computer and
communication technologies that are instrumental in providing students with the opportunity to interact with each
other both inside and outside of the classroom. There are three major forms of interaction that are important for
effective learning, especially in distance education (e.g., Anderson, 2003). Student-student interaction refers to
interaction among individual students or among students working in small groups. Student-teacher interaction has
traditionally focused on, but is not limited to, classroom-based dialogue between students and the instructor. Finally,
student-content interaction refers to students interacting with the subject matter under study to construct meaning,
relate it to personal knowledge, and apply it to problem solving. Meta-analytical findings support the overall
positive influence of the three types of interaction on learning outcomes with specific emphasis on student-student
interactions (Bernard et al., 2009). Cooperative and collaborative theories of learning may offer a meaningful
explanation for that. While originating from different theoretical traditions, both collaborative and cooperative
learning concepts focus on and encourage interaction among students. Collaborative learning is more of a
philosophy that emphasizes that knowledge is a social construct (Bruffee, 1993). It focuses on the adult learner and
involves the negotiation and sharing of meaning. Cooperative learning is a set of structured strategies and techniques
that help students to work with each other to maximize their learning (Johnson & Johnson, 2008). It encompasses
five main elements the most important of which are positive interdependence and individual accountability.
While Moore originally presented the three forms of interaction as an essential factor for distance
education, they could be no less important in classroom instruction, especially with the pervasiveness of
technological tools in today's higher education (e.g., Lou, Abrami, & d'Apollonia, 2001). The collaborative nature
of student-student interactions should add as much value to successful learning in the technology-enhanced classroom
as it does in distance education (Borokhovski, Tamim, Bernard, Abrami, & Sokolovskaya, 2012). In order to
substantiate this claim with research findings, we devised and carried out a meta-analytical study with the following
research question: Are designed interaction treatments (i.e., intentionally implemented collaborative instructional
conditions that are meant to increase student learning) more effective than contextual interaction treatments (i.e.,
learning setups that contain conditions for student-student interaction to occur, but are not intentionally designed to
create collaborative learning environments) in technology-enhanced classroom instruction in higher education?
Technology Use in Teacher Education
This section focuses on the subset of studies that specifically addresses technology use in educational and
teacher training programs. The main objective is to further explore the impact of technology use in this specific
context in an attempt to determine what aspects of teaching practices set education apart from STEM and non-
STEM disciplines. Moreover, it aims to investigate the nature of the most effective pedagogical frameworks
supporting successful technology integration. The focus is on how technology is used to achieve educational goals
by educational professionals.
While it is impossible to review all pedagogical frameworks available for instructors, it seems appropriate
to focus on student-centered instructional strategies. It is argued that the learner-centered approach supports learning
by emphasizing the students role in the instructional environment, thus shifting the focus from knowledge
transmission to the actual learning process (e.g., Laurillard, 2002). Regardless of technology integration, general
teaching strategies that are aligned with the student-centered philosophy include cooperative and collaborative
learning (Bruffee, 1993; Johnson & Johnson, 2008); problem-based learning (Jonassen, Howland, Moore, & Marra,
2003); and the provision of elaborate feedback (Hattie & Timperley, 2007).
Research Questions
To summarize, the meta-analysis we are reporting on and its follow-ups were designed to answer questions
about the impact of instructional technology on postsecondary students' achievement outcomes: the overall effect
as well as the effects in its various practical applications. Specifically, the research questions addressed were:
1. What is the weighted average effect size and variability for studies that investigate the impact of the
instructional uses of technology on postsecondary students' achievement outcomes?
2. Is there a difference in average effect sizes for achievement outcomes associated with major purposes of

technology use?
3. What is the weighted average effect size and variability for studies that investigate the impact of the
instructional uses of technology in so-called blended educational settings?
4. Is there a difference in average effect sizes for achievement outcomes associated with designed vs.
contextual interaction treatments?
5. Is the effectiveness of instructional technology in teacher training dependent on a particular
pedagogical approach?
Method
To help readers navigate the review, here is a brief summary of the major terms and
definitions used in the meta-analysis, followed by the set of inclusion/exclusion criteria and the key aspects of the
review methodology (for a complete description of the methodology, please see Schmid et al., 2014).
Terms and Definitions
Educational technology is understood here according to Ross et al. (2010) as quoted previously. The degree
of technology use was the primary determinant for assigning groups to either the experimental or the control
condition. This distinction is important because it specifies the +/ valence of the effect size. Two types of studies
were found, those that contained no technology in the control condition and those that contained some technology in
the control condition. In the former class of studies, the assignment of the experimental and control group
designation was clear. In the latter case, the differential use of technology in the two conditions was rated. The
condition containing the highest degree of technology use was designated the treatment condition and the
alternative condition was the control. There were three possible interpretations of the degree of technology use. The
experimental group was considered: a) to contain the longest or most frequent exposure to technology tools; b) to
contain more advanced technology (i.e., the one that enables more functions); and/or c) to employ a larger number
of technology tools.
The following major purposes of technology use were identified and analyzed in the reviewed studies:
1) To promote communication and/or facilitate the exchange of information. This category includes
technology that enables a higher level of interaction between individuals (i.e., two-way communications
among learners and between learners and the teacher).
2) To provide cognitive support for learners. This category encompasses various technologies that enable,
facilitate, and support learning by providing cognitive tools (e.g., concept maps, simulations, wikis,
different forms of elaborate feedback, spreadsheets).
3) To facilitate information search and retrieval. This type of technology is intended to enable and/or facilitate
access to additional information (e.g., web-links, search engines, electronic databases).
4) To enable or enhance content presentation (e.g., PowerPoint presentations, graphical visualizations,
computer tutorials with limited interactive features).
When more than one purpose was identified, codes indicating multiple purposes were created (e.g., cognitive
support plus presentational support). Achievement outcomes included various objective measures of academic
performance (e.g., exam/test scores), but not self-evaluation. We coded a large spectrum of methodological (e.g.,
research design), instructional (e.g., purpose of technology use, pedagogical approach), demographic (e.g., subject
matter), and publication (e.g., date, source) study characteristics to be used in subsequent moderator variable
analyses.
Inclusion/Exclusion Criteria and Review Procedure
To be included in the meta-analysis, a study under review had to have the following characteristics: (1) be
published no earlier than 1990; (2) be publicly available (or archived); (3) address the impact of computer
technology on students' achievement and/or attitudes; (4) contain at least one between-group comparison where one
group fits the definition of the experimental condition and the other group the definition of the control condition,
using the criterion of the degree of technology use; (5) be conducted in a formal postsecondary educational setting;
(6) represent classroom or blended instruction but not distance education environments; and (7) contain sufficient
statistical information for effect size extraction.
Failure to meet any of these criteria led to the exclusion of the study, with the reason(s) for rejection documented
for further reporting. Two researchers working independently rated studies, first at the abstract level, then at the full-
text level, on a scale from 1 (definite exclusion) to 5 (definite inclusion). All disagreements were discussed until
they were resolved (inviting a third opinion when necessary), and initial agreement rates were calculated as Cohen's
kappa (κ) and as Pearson's r (where appropriate). Similarly, two researchers participated in all other data extraction

procedures (i.e., effect size extraction and study features coding). Agreement rates for their independent decisions at
these stages of the review are reported as Cohen's kappa (κ).
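The inter-rater agreement statistic reported here can be illustrated with a short sketch. This is not the authors' code, and the screening decisions below are hypothetical; it simply shows how Cohen's kappa compares observed agreement against chance agreement:

```python
# Illustrative sketch of Cohen's kappa for two raters' include/exclude decisions.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions on ten abstracts (1 = include, 0 = exclude)
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
b = [1, 1, 0, 0, 1, 0, 0, 1, 0, 1]
print(round(cohens_kappa(a, b), 2))  # -> 0.6
```

With 8/10 observed agreement and 0.5 expected by chance, kappa here is (0.8 − 0.5) / (1 − 0.5) = 0.6.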
Literature Search Strategies and Data Sources
Extensive literature searches were designed to identify and retrieve primary empirical studies relevant to the
major research questions. Key terms used in search strategies, with some variations to account for specific retrieval
sources, included: (technolog*, comput*, "web-based instruction", online, Internet, "blended learning",
"hybrid course*", simulation, electronic, multimedia, OR PDAs, etc.) AND (college*, university,
"higher education", postsecondary, "continuing education", OR "adult learn*") AND (learn*, achievement*,
attitude*, satisfaction, perception*, OR motivation, etc.), AND excluding "distance education" or "distance
learning" in the subject field.
The following databases were among the sources examined: ERIC (WebSpirs), ABI/INFORM Global
(ProQuest), Academic Search Premier (EBSCO), CBCA Education (ProQuest), Communication Abstracts (CSA),
EdLib, Education Abstracts (WilsonLine), Education: A SAGE Full-text Collection, Francis (CSA), Medline
(PubMed), ProQuest Digital Dissertations & Theses, PsycINFO (EBSCO), Australian Policy Online, British
Education Index, and Social Science Information Gateway.
In addition, Google Internet searches were performed to help identify additional gray literature, including a
search for conference proceedings. Review articles and previous meta-analyses were used for branching, and the
tables of contents of major journals in the field of educational technology (e.g., Educational Technology Research &
Development) were manually searched.
Effect Size Calculation and Synthesis
A d-type standardized mean difference effect size was used as the common metric (i.e., Cohen's d), which
was then transformed into the Hedges' g metric (Hedges & Olkin, 1985) to provide the necessary correction for small
sample sizes. The random effects model (Borenstein, Hedges, Higgins & Rothstein, 2009) was the main analytical
approach for this meta-analysis. A mixed effects model was used to test the difference in levels of moderator
variables. In a mixed analysis, average effect sizes for categories of the moderator are calculated using a random
effects model. The variance component Q-Between is calculated across categories using a fixed effect model
(Borenstein, Hedges, Higgins & Rothstein, 2009). All analyses, including sensitivity and publication bias analysis,
were performed in Comprehensive Meta-Analysis 2.2.048 (Borenstein, Hedges, Higgins & Rothstein, 2008).
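The d-to-g step described here can be sketched as follows. This is an illustrative sketch rather than the authors' code, and the group summary statistics are hypothetical; the correction factor J = 1 − 3/(4·df − 1), with df = n1 + n2 − 2, follows Hedges and Olkin (1985):

```python
# Sketch of the effect size pipeline: Cohen's d from group statistics,
# then the Hedges & Olkin small-sample correction to g.
import math

def cohens_d(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference using the pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def hedges_g(d, n1, n2):
    """Apply the correction J = 1 - 3 / (4*df - 1), where df = n1 + n2 - 2."""
    df = n1 + n2 - 2
    return d * (1 - 3 / (4 * df - 1))

# Hypothetical treatment (technology) vs. control group summary statistics
d = cohens_d(78.0, 74.0, 10.0, 10.0, 30, 30)   # d = 0.40
g = hedges_g(d, 30, 30)                        # slightly smaller than d
print(round(d, 3), round(g, 3))                # -> 0.4 0.395
```

The correction matters most for small samples; for large n, g converges to d.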
Results
In this section we present findings of the meta-analysis of effects of classroom technology integration in
higher education on student achievement outcomes, first, overall, and then by individual research sub-question as
outlined earlier.
Overall Findings. The overall random-effects model results from Schmid et al. (2014) are shown in Table 1. The
total of 879 effect sizes produced a weighted average effect size of 0.27 that was significantly greater than zero. The
collection is significantly heterogeneous, based on findings from the fixed model where heterogeneity is tested in
terms of the magnitude of Q-Total (i.e., total between-study variability). An effect size of 0.27 is considered
small and represents a difference of 0.27 standard deviations between the means of the treatment and control
conditions, amounting to about an 11% difference. These results suggest that technology-supported instruction is
advantageous compared to either non-use or limited use, but that this advantage is relatively modest.

Population Estimates    k     g+     SE    Lower 95th   Upper 95th
Final Collection       879   0.27*  0.02      0.24         0.31
**Heterogeneity Analysis: QT = 3,183.10 (df = 878), p < .001, I² = 72.42
* p < .01; ** Based on the fixed effect model for k = 879
Table 1. Overall Weighted Average Effect Size (Random Effects Model)
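As a quick arithmetic check on the figures reported above (assuming the standard normal translation of effect sizes and the usual definition of I² = (Q − df)/Q; this is not the authors' code), both the roughly 11% advantage and the I² value can be recovered from the reported numbers:

```python
# Verify the interpretation of g+ = 0.27 and the reported I-squared.
from statistics import NormalDist

# Average treatment student sits at the normal CDF of 0.27 above the control mean
advantage = NormalDist().cdf(0.27) - 0.5       # about 0.106, i.e., roughly 11%

# I-squared recovered from Q-Total and its degrees of freedom
i_squared = (3183.10 - 878) / 3183.10 * 100    # about 72.42

print(round(advantage * 100, 1), round(i_squared, 2))  # -> 10.6 72.42
```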
Major Functions. The results become differentiated when effect sizes are divided by pedagogical
application. These results from Schmid et al. (2014) indicate that technology used to support student cognition
outperforms all other categories, especially support for presentation (see Table 2 for details). This effect
is interpreted primarily as a difference between technology used by students (for content understanding) and
technology used by teachers (for content delivery). Other functions, such as support for communication and a
mixture of cognitive and presentational support, fall in between.

Levels of Technology Use        k     g+    Lower 95th   Upper 95th   QB
Cognitive Support (CS)         186   0.36      0.28         0.44
Presentational Support (PS)    113   0.15      0.07         0.23
Communication Support           27   0.24      0.12         0.35
Mixture (CS plus PS)           485   0.25      0.21         0.30
Between Groups, df = 3                                               13.28, p = .004
Contrast: Cognitive Support vs. Presentational Support, z = 5.14, p < .0001
Table 2. Instructional Moderator Variable Analysis: Major Levels of Technology Use
Blended Learning. The effects of blended learning (i.e., partly in class and partly online) were derived
from the Schmid et al. (2014) database and analyzed and reported in Bernard et al. (2014). As evident from Table 3,
the random-effects weighted average effect size (g+ = 0.334, k = 117) is larger than the overall average effect of
technology use in the original meta-analysis and is in line with the findings from the only meta-analysis of blended
learning (Means et al., 2013). Apparently, there is an advantage that accrues from balancing face-to-face instruction
with online learning outside of class. The mechanisms of this effect have not been determined, so one of the
challenges for future educational technology research is to tease out the effects of variables such as the amount of
time devoted to each pattern, the most effective learning strategies in each pattern, and the teacher's role in the
online portion of blended learning.

Analytical Models       k     g+       SE    Lower 95th   Upper 95th
Random Effect Model    117   0.334*   0.04      0.26         0.41
Fixed Effect Model     117   0.316**  0.02      0.28         0.36
Heterogeneity: Q-total = 372.91, df = 116, p < .001; I-squared = 68.89%; τ² = 0.11
*z = 8.62, p < .001; **z = 15.68, p < .001.
Table 3. Weighted Average Effects for Blended Learning
Interaction Treatments. Interaction treatments were defined by Bernard et al. (2009) as instructional set-
ups in distance education that are intended to facilitate and promote interaction among students, between students
and teachers, and between students and the content. This definition was used to code studies in the Schmid et al.
(2014) database. Table 4 shows the results of this basic analysis. Conditions where interactions were greater in the
treatment group, compared to the control condition, produced results that were significantly higher than when the
control group was higher in the potential for interaction. As expected, when the two conditions were roughly the
same (i.e., g+ = 0.29, k = 703), the outcome was not significantly different from the former condition (i.e., g+ = 0.34
vs. 0.29).
Levels                      k     g+     SE    Lower 95th   Upper 95th   QB
Equal                      703   0.29   0.02      0.25         0.33
Control group higher       127   0.16   0.04      0.07         0.24
Experimental group higher   48   0.34   0.07      0.20         0.48
Between*                                                                8.93
*p (2) = .012, **p < .0
Table 4. Mixed Effects Analysis of the Degree of Student-Student Interaction
When the interaction treatments were divided further by the presence or absence of design intention, the
results strongly supported designed interaction treatments over contextual interaction treatments (Table 5). It
appears that merely providing the means for student-student interaction is not enough as a pedagogical strategy.
Some form of instructional design is needed (e.g., collaborative learning, reciprocal teaching).

Levels                  k     g+      SE    Lower 95th   Upper 95th   QB
Designed treatments    31   0.46*   0.01      0.32         0.60
Contextual treatments  17   0.07    0.02     -0.22         0.37
Between Groups, df = 1*                                             5.41**
*p < .01; **p (1) = .02
Table 5. Mixed Effects Analysis of Designed and Contextual Interaction Treatments
Technology Use in Educational Courses. In this final analysis, we investigated the effects of technology
across subject areas. Because of our special interest in teacher education (i.e., teaching future teachers), we broke
these studies out from the other Non-STEM content areas and compared their average effect size with those of STEM
(i.e., Science, Technology, Engineering, Mathematics) and the remaining Non-STEM subject areas. The results,
shown in Table 6, revealed nearly equal average effects for STEM and Non-STEM minus education, but a difference
between these areas and technology used in teacher education.
To investigate what might account for such a high average effect for teacher education, we further
classified these studies according to the underlying pedagogical basis for using technology (see
Table 7). Although the number of studies is somewhat limited after the split, we found that using technology to
provide feedback to teacher education students resulted in an unusually high average effect size of g+ = 0.75. Two
other uses were also prominent: multimedia theory and problem-based learning, each around g+ = 0.50. Notably, the
use of technology to support collaborative learning (i.e., group projects) was very low (less than 0.10) and not
significantly greater than zero. These findings may arise from the peculiarities of teaching future teachers, as
compared with teaching content, as in the other STEM and Non-STEM subject areas, but they do suggest the need to
further explore the range of technology uses in education, as well as in allied areas such as nursing education.

Levels of Subject Matter    k*    g+    Lower 95th   Upper 95th   QB
STEM Subjects              356   0.28      0.23         0.33
Non-STEM Subjects          454   0.25      0.20         0.30
Teacher Education           66   0.45      0.32         0.58
Between Groups, df = 2                                           7.78, p = .02
* Three cases of unidentified subject matter or a mixture of several subject matters were removed from the
analysis (k = 876)
Table 6. Moderator Variable Analysis for Subject Matter (STEM vs. Non-STEM vs. Education)

Framework                  k    g+     Lower 95th   Upper 95th   QB
Collaborative Learning     6   0.06     -0.24         0.35
Feedback Strategies       11   0.75      0.38         1.12
Information Processing     5   0.26     -0.17         0.70
Multimedia Theory          8   0.46     -0.10         1.01
Problem-Based Learning     9   0.56      0.16         0.96
Between Groups, df = 4                                          9.71, p = .046
Table 7. Moderator Variable Analysis: Type of the Pedagogical (Conceptual) Framework
Discussion and Implications
Overall results
Overall, these results demonstrate a consistent message of the value of technology use in education.
Whether or not these results could have been achieved through other means, as Richard E. Clark (1983) has claimed,
is rather a moot point in our view. For better or for worse, technology is with us for the long haul, and we, as a
profession and as professional researchers, are obligated to analyze and investigate it, and to make certain that its
use for pedagogical purposes achieves maximum learning benefits.

Cognitive Versus Presentational Support Tools
Based on the analyses of purposeful use of technology, we see where Clark's original assertions about the
benign nature of technology may have arisen. Technologies prior to the 1980s, and well into the 1990s, were indeed
benign because their primary purpose was to convey content to students, either through teachers' use
of presentational software or as a consequence of computer-aided instruction (CAI). Feedback, of course, was
present in these stand-alone technologies, but it often amounted to simply providing the right answer without
elaboration (Azevedo & Bernard, 1995). Only since the advent of personal computers, and especially of software and
online tools that work with the student in the learning process, have we seen changes in technology use that Clark
could not have envisioned prior to 1980, when much of his work was done.
The overall message emerging from the data presented here is that learning is best supported when the
student is engaged in active, meaningful activities via technological tools that provide cognitive support. However,
we are a long way from understanding more specifically how to design effective cognitive support tools and when
precisely to integrate them into instruction. We encourage vigorous research programs to help lessen our knowledge
gaps in these areas.
Blended Learning in Higher Education
For the longest time, there were two primary venues for learning in higher educationclassroom
instruction (with or without technology support) and distance education, now often referred to as online learning.
While there is considerable evidence that technology used to support classroom instruction is beneficial, it is a
requirement when students and teachers are in separate locations, often working asynchronously (in correspondence
education, the post office was a technology, of sorts). We are now able to marry these two environments, providing
students with some time in classrooms and some time online outside of classrooms. The results of studies of these
so-called blended or hybrid learning experiences are encouraging (see Means et al., 2013; Bernard et al., 2014; and
the summary in this report), but we are still unable to predict with confidence which variables are most influential (e.g.,
instructional strategy applied in each context) and which are trivial (e.g., time spent in each pattern), or how to
design effective blended learning given the myriad of circumstances that can arise under various conditions. We
must encourage research work in this domain, as it may turn out to be of immense value to university students,
partly because it encourages a pattern of face-to-face and online work that they will encounter throughout their
future careers.
Collaborative Interaction Treatments
In this meta-analysis we have identified a distinction between the intentional, designed use of technology in
higher education classrooms and the provision of technology without such explicit intention. The results provide
specific guidance to educators and also contribute to our ongoing understanding of how technologies do and
do not work in the domain of education, especially as it relates to technology-based student-student interaction. This
part of the meta-analysis reiterates the effectiveness of computer-supported collaborative learning (CSCL) from the
perspective of technology use and the design of instruction that aims to support and amplify interaction.
Technology Use in Teacher Education
The current analysis has moved one step beyond responding to the general question of whether technology
works or not. Findings have confirmed previous results and provided meaningful insights with regard to specific
pedagogical approaches that are successful in improving student performance. The general analyses were in line
with the findings of the overall meta-analysis (Tamim et al., 2014) indicating the importance of cognitive support
tools for successful learning. In addition, results provided further indication that moderate intensity and complexity
of technology use works better than oversaturation.
The current meta-analysis provides some input regarding pedagogical strategies that work. Specifically, the
provision of adequate and specific feedback to students in technology-supported environments greatly increases the
impact of technology use on students' performance. The current findings are in line with Hattie and Timperley's
(2007) research on feedback in education as a whole. The resulting effect size of 0.75 translates to a 27-percentile-
point gain for the average student in the experimental group in comparison to those in the control group. The gains
are significant and the implications are clear: instructors need to incorporate feedback into their technology-
enhanced instruction. Another pedagogical approach for successful utilization of technology in educational contexts
is problem-based learning (PBL), where the average effect size of 0.56 indicates a 21-percentile-point gain in
students' performance.
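The percentile-gain translation of these effect sizes follows directly from the standard normal distribution. A minimal sketch (assuming this conventional translation; not the authors' code):

```python
# Translate an effect size into the percentile-point gain of the average
# treatment student over the control-group mean (standard normal model).
from statistics import NormalDist

def percentile_gain(es):
    """Percentile points gained by the average student in the treatment group."""
    return (NormalDist().cdf(es) - 0.5) * 100

print(round(percentile_gain(0.75)))  # feedback strategies -> 27
print(round(percentile_gain(0.56)))  # problem-based learning -> 21
```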

Finally, one of the most interesting pieces of evidence offered by this study pertains to the importance of
training in successful technology integration in post-secondary education. This is in striking contrast to the implicit
belief that the newer generation is technologically savvy and does not need training.
Concluding Remarks
In conclusion, we would like to specifically highlight for the readers two particular outcomes of this entire
series of meta-analyses that, in our view, are of the utmost importance for research and practice:
1) Educational technology, no matter how advanced, sophisticated, and fashionable, hardly works beyond its
novelty effect. Well-thought-through instructional design and pedagogy (e.g., designed interaction treatments,
elaborate feedback, in-time technology training for teachers and students) create and provide the substantive
added value of using technological tools for teaching and learning.
2) Blended learning, which supposedly combines the best qualities of face-to-face and online instruction, appears
to be a strong and viable venue for applying educational technology to maximize its benefits for learning.
Nevertheless, this promise is still to be substantiated by further primary and meta-analytical research.
References
Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction.
International Review of Research in Open and Distance Learning, 4(2), 9-14. Retrieved from:
http://www.irrodl.org/index.php/irrodl/article/view/149/230
Azevedo, R., & Bernard, R. M. (1996). A meta-analysis of the effects of feedback in computer-based instruction.
Journal of Educational Computing Research, 13(2), 109-125.
Bayraktar, S. (2000). A meta-analysis study of the effectiveness of computer assisted instruction in science
education (Unpublished doctoral dissertation). Ohio State University, Columbus, OH. (UMI Number:
9980398).
Bernard, R. M., Borokhovski, E., Schmid, R. F., Tamim, R. M. & Abrami, P. C. (2014). A meta-analysis of
blended learning and technology use in higher education: From the general to the applied. Journal of
Computing in Higher Education, 26(1), 87-122. http://dx.doi.org/10.1007/s12528-013-9077-3
Bernard, R. M., Borokhovski, E., Schmid, R. F., & Tamim, R. M. (2014). An exploration of bias in meta-analysis:
The case of technology integration research in higher education. Journal of Computing in Higher
Education, 26(3), 183-209. http://dx.doi.org/10.1007/s12528-014-9084-z
Bernard, R.M., Abrami, P.C., Borokhovski, E., Tamim R., Wade, A., Surkes, M., & Bethel, E. (2009). A Meta-
Analysis of Three Types of Interaction Treatments in Distance Education. Review of Educational Research,
79(3), 1243-1289. http://dx.doi.org/10.3102/0034654309333844
Blumenfeld, P. C., Soloway, E., Marx, R. W., Krajcik, J. S., Guzdial, M., & Palincsar, A. (1991). Motivating
project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26(3-4),
369-398.
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. (2008). Comprehensive Meta-Analysis (Version
2.2.048) [Computer software]. Englewood, NJ: Biostat.
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. (2010). A basic introduction to fixed-effect and
random-effects models for meta-analysis. Research Synthesis Methods, 1, 97-111.
http://dx.doi.org/10.1002/jrsm.12
Borokhovski, E., Tamim, R. M., Bernard, R.M., Schmid, R.F., & Sokolovskaya, A. (2013). Does Educational
Technology Work Better When Designed for Collaborative Learning? A paper presented at the AERA 2013
annual meeting in San Francisco, CA in April-May, 2013.
Borokhovski, E., Tamim, R. M., Bernard, R. M., Abrami, P. C., & Sokolovskaya, A. (2012). Are contextual and
design student-student interaction treatments equally effective in distance education? A follow-up meta-
analysis of comparative empirical studies. Distance Education, 33(3), 311-329.
http://dx.doi.org/10.1080/01587919.2012.723162
Bruffee, K. (1993). Collaborative learning. Baltimore, MD: Johns Hopkins University Press.
Carpenter, C. R., & Greenhill, L. P. (1955). An investigation of closed-circuit television for teaching university
courses (Report 1). University Park, PA: The Pennsylvania State University.
Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53, 445
459.
Cobb, T. (1997). Cognitive efficiency: Toward a revised theory of media. Educational Technology Research &
Development, 45(4), 21-35. http://dx.doi.org/10.1007/BF02299681
Cooper, H., & Koenka, A. C. (2012). The overview of reviews: Unique challenges and opportunities when research

syntheses are the principal elements of new integrative scholarship. American Psychologist, 67, 446-462.
http://dx.doi.org/10.1037/a0027119
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.
Johnson, D. W., & Johnson, R. T. (2008). Cooperation and the use of technology. In J. M. Spector, M. D. Merrill,
J. V. Merrienboer & M. P. Driscoll (Eds.), Handbook of Research on Educational Communication and
Technology (Third ed.). New York, London: Lawrence Erlbaum Associates.
Jonassen, D. H., Howland, J., Moore, J., & Marra, R. M. (2003). Learning to solve problems with technology: A
constructivist perspective. Columbus: Merrill/Prentice-Hall.
Kozma, R. (1991). Learning with media. Review of Educational Research, 61, 179-221.
Larwin, K., & Larwin, D. (2011). A meta-analysis examining the impact of computer-assisted instruction on
postsecondary statistics education: 40 years of research, Journal of Research on Technology in Education,
43(3), 253-278. Available from: http://www.editlib.org/p/54098/
Laurillard, D. (2002). Rethinking university teaching: A framework for the effective use of educational technology
(Second ed.). London: Routledge.
Lee, C. (2000). A paradigm shift in the technology era in higher education. Paper presented at the World
Conference on Educational Multimedia, Hypermedia and Telecommunications.
Lou, Y., Abrami, P. C. & dApollonia, S. (2001). Small group and individual learning with technology: A meta-
analysis. Review of Educational Research, 71(3), 85-157. doi:10.3102/00346543071003449
McCombs, B. L. (2000). Assessing the role of educational technology in the teaching and learning process: A
learner-centered perspective.
McCombs, B. L., & Vakili, D. (2005). A learner-centered framework for E-learning. Teachers College Record,
107(8), 1582-1600.
Means, B., Toyama, Y., Murphy, R. F., & Baki, M. (2013). The effectiveness of online and blended learning: A
meta-analysis of the empirical literature, Teachers College Record, 115(3), 1-47.
Michko, G. M. (2007). A meta-analysis of the effects of teaching and learning with technology in undergraduate
engineering education (Unpublished doctoral dissertation). University of Houston, Houston, TX.
Moore, M. G. & Kearsley, J. P. (2005). Distance education: A systems view. Belmont, CA: Wadsworth.
Moore, M. G. (1989). Three types of interaction. American Journal of Distance Education, 3(2), 17.
Ross, S. M., Morrison, G. R., & Lowther, D. L. (2010). Educational technology research past and present:
Balancing rigor and relevance to impact school learning. Contemporary Educational Technology, 1, 1735.
Retrieved from http://www.cedtech.net/articles/112.pdf
Schenker, J. D. (2007). The effectiveness of technology use in statistics instruction in higher education: A meta-
analysis using hierarchical linear modeling (Unpublished doctoral dissertation). Kent State University,
Kent, OH.
Schmid, R.F., Bernard, R.M., Borokhovski, E., Tamim, R., Abrami, P.C., Wade A., Surkes, M.A., & Lowerison,
G. (2009). Technology's effect on achievement in higher education: A Stage I meta-analysis of classroom
applications. Journal of Computing in Higher Education, 21, 95-109. http://dx.doi.org/10.1007/s12528-
009-9021-8
Schmid, R. F., Bernard, R. M., Borokhovski, E., Tamim, R. M., Abrami, P. C., Surkes, M. A., Wade, C.
A., & Woods, J. (2014). The effects of technology use in postsecondary education: A meta-analysis
of classroom applications. Computers & Education, 72, 271-291.
http://dx.doi.org/10.1016/j.compedu.2013.11.002
Slavin, R. E. (1991). Synthesis of research of cooperative learning. Educational Leadership, 48(5), 71-82.
Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of
research says about the impact of technology on learning: A second-order meta-analysis and validation
study. Review of Educational Research, 81(3), 4-28. http://dx.doi.org/10.3102/0034654310393361
Tamim R. M., Borokhovski, E., Bernard, R. M., Schmid, R. F., Abrami, P. C., & Sokolovskaya, A. (2014).
Technology Use in Teacher Training Programs: Lessons Learned from a Systematic Review. A paper
presented at the AERA 2014 annual meeting in Philadelphia, PA in April 2014.
Tekbiyik, A., & Akdeniz, A. R. (2010). A meta-analytical investigation of the influence of computer assisted
instruction on achievement in science, Asia-Pacific Forum on Science Learning and Teaching, 11(2).
Available from http://www.ied.edu.hk/apfslt/v11_issue2/tekbiyik/index.htm.
Timmerman, C. E., & Kruepke, A. (2006). Computer-assisted instruction, media richness and college student
performance, Communication Education, 55(1), 73-104. http://dx.doi.org/10.1080/03634520500489666

Wurst, C., Smarkola, C., & Gaffney, M. A. (2008). Ubiquitous laptop usage in higher education: Effects on student
achievement, student satisfaction, and constructivist measures in honors and traditional classrooms.
Computers and Education, 51(4), 1766-1783.

2. Measuring 21st Century Skills in Technology Educators

Rhonda Christensen
University of North Texas, USA
Rhonda.christensen@gmail.com

Gerald Knezek
University of North Texas, USA
gknezek@gmail.com

Curby Alexander
Texas Christian University, USA
Curby.alex@tcu.edu

Dana Owens
University of Texas at Arlington, USA
danasuzanneowens@gmail.com

Theresa Overall
University of Maine, Farmington, USA
Theresa.overall@maine.edu

Garry Mayes
University of North Texas, USA
Garry.mayes@unt.edu

Introduction

In the competency-based environment that surrounds 21st Century education, proficiency in technology
itself has also assumed an important role, whether technology is used to enhance instruction, to communicate among
teachers, students, and parents, or to assess student learning. The ability to integrate 21st Century technology for
learning in schools is an expectation for educators. It is important to measure whether or not teachers are confident
in their ability to integrate the evolving tools in order to target professional development. Using valid and reliable
instruments to assess teachers' perceived proficiency with 21st Century technology tools is an important step in
targeting appropriate levels in school-wide training activities, as well as in planning meaningful developmental
pathways for individual teachers.
This study uses an instrument that measures educators' self-efficacy in their ability to integrate technology
into the classroom. Self-efficacy is the concept that provides the underlying rationale for the TPSA. Self-efficacy is
based on Bandura's (1977, 1986) social development theory, and is sometimes defined as the expression of
individuals' beliefs about their own capacity to perform a certain behavior (Gencturk, Gokcek, & Gunes, 2010). As
reported by Gencturk, Gokcek, and Gunes (2010), teachers with higher self-efficacy are more ambitious and
passionate in their teaching (Tuckman & Sexton, 1990), while Collis (1996) observed that these types of teachers
"shape the success or eventual lack of success in any computers-in-education initiative" (p. 22). Henson (2003)
found that teacher efficacy is an important component of a classroom teacher's success or failure. The authors of this
paper have proposed an operational definition of self-efficacy as "confidence in one's competence."

Purpose of the Study

The Technology Proficiency Self Assessment (TPSA) has been used since 1997 in its original form with
only a few minor modifications (e.g., changing Alta Vista to Google as the search engine example). The survey instrument
has retained its reliability even in the evolving arena of educational technology. An additional 14 items were added
to the original TPSA to assess newly emerging technologies such as mobile learning and Web 2.0 tools. These items
were added in response to the general trend over time that previously challenging skills became routine for most
teachers. The four TPSA subscales were used to gather data over a ten-year period from 2002 to 2011 on preservice
and inservice teachers. As shown in Figure 1, teacher confidence in technology proficiency skills increased over the

ten-year time frame, likely because of the focus on technology integration in education that grew over that period of
time.

[Figure: line graph titled "Changes in Means of the Four Subscales," plotting the email, WWW, Apps, and Tech
subscale means over the ten-year period]
Figure 1. Ten-year trends for the development of educator technology skills.

The new instrument, the Technology Proficiency Self-Assessment for 21st Century Learning (TPSA C21),
was used in this study to measure students from four different universities in educator preparation programs.

Methodology

Participants

Data were gathered from 195 students from four universities at various levels in their teacher education
programs. The majority of students were in teacher preparation programs, while a few of the students were inservice
teachers taking master's-level courses. The mean age of the participants was 24.41, with ages ranging from 18 to 57.
As is the case in most education programs in the US, the majority (78%) of the participants responding to the survey
were female. The number of students from each university is displayed in Table 1. The groups of students differed in
their level of technology instruction. Each of the groups is described in the following section.

Table 1.
Number of Participants by University Site
Site n
University 1 preservice 66
University 2 preservice 19
University 3 preservice 31
University 4 preservice 55
University 3 inservice 18
Other 6
Total 195

University 1 is a mid-size independent university in the southwestern part of the United States. The course
in which the data were collected is an Introduction to Education class that is required for every student who plans to
enroll in the College of Education. The majority of the students are freshmen and sophomores. The course provides
students with a broad overview of many aspects of the U.S. education system, including the attributes and practices
of effective teachers. The class is divided into three parts: students' participation in class activities that pertain to a
selected topic in education (e.g., philosophies of education), observation in area schools, and small-group discussion
of class content and observations. This class serves as a prerequisite for the educational technology class, so most of
the students are either concurrently enrolled in both courses or have had no coursework in educational technology.
University 2 is a small public liberal arts university in New England with close to half of its 1,700 students
in teacher preparation programs. The course in which the data were collected is a sophomore-level, 12-credit block
of classes required for all secondary/middle education majors. The classes are curriculum, instruction, and
assessment; classroom management; technology integration; and an intense field experience in which students are
placed in a classroom in their content area and participate fully with middle or high school students from 7:00 a.m.
to 2:00 p.m. After an initial two-week orientation on campus, there are six weeks of coursework and six weeks in the
field during the semester, on a three-week rotation. Prerequisites to the practicum block are a two-credit introduction
to secondary/middle education and a two-credit introduction to special education for grades 7-12, which includes a
service-learning mentoring experience at a local middle school. In the introduction courses, faculty utilize
instructional technology and require students to use a variety of technologies in their assignments.
University 3 is a large public university serving 36,000 students in a suburban area in the southwestern part
of the United States. There were two groups of participants from this university: junior- and senior-level
undergraduate preservice students enrolled in a technology integration course, and a group of inservice educators
who were enrolled in one of the first courses of an online, accelerated-schedule master's degree program. The
preservice students were at the beginning of a technology integration course that teaches the competencies measured
by the TPSA. The TPSA will be given again at the end of the semester to measure change during the semester.
University 4 is a large Hispanic-serving university of over 34,000 students located in an urban area in the
southwestern part of the United States. The participants in this study were juniors or seniors enrolled in a technology
integration course in the College of Education. With no prerequisites for the course, the range of technology skills
that students possess varies a great deal. In the course, students learn to use various educational software programs
as well as develop lesson plans that include a technology component. Participants must select the most appropriate
software and/or technology resources to include in their lesson plans. Each student must produce a website that
serves as an electronic portfolio to organize resources for teaching that they have developed during the course.

Instrumentation

The Technology Proficiency Self-Assessment (TPSA) was originally developed to measure preservice and
inservice teachers' confidence in their competence, based on a technology proficiency checklist implemented at
Michigan State University (Ropp, 1999). The instrument included 20 items measuring four subscales: (1) E-mail, (2)
World Wide Web, (3) Integrated Applications, and (4) Teaching with Technology. An additional 14 items related to
emerging technologies were added to make it a 34-item Likert-type survey. Factor analysis on a larger set of data
revealed that the 14 items formed two additional subscales: one related to teaching with emerging technologies and
a second related to emerging technology skills. Internal consistency reliability analysis produced the Cronbach's
alpha values shown in Table 2.

Table 2.
Internal Consistency Reliabilities for Six Scales on TPSA 21C
Cronbach's Alpha N of items
Email subscale .73 5
WWW subscale .73 5
Integrated Applications subscale .79 5
Teaching with Technology .86 5
Teaching with Emerging Technologies .91 8
Emerging Technologies Skills .84 6
Entire survey .96 34
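The reliabilities in Table 2 follow the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch of that computation, using made-up 5-point Likert responses rather than the study's data:

```python
def cronbach_alpha(responses):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(totals)).

    responses: one row per respondent, each row a list of item scores.
    """
    k = len(responses[0])  # number of items in the subscale

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in responses]) for i in range(k)]
    total_var = var([sum(row) for row in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Made-up 5-point responses for a hypothetical 5-item subscale
sample = [
    [5, 4, 5, 4, 5],
    [4, 4, 4, 3, 4],
    [5, 5, 5, 5, 5],
    [3, 3, 4, 3, 3],
    [4, 5, 4, 4, 4],
]
print(round(cronbach_alpha(sample), 2))  # 0.93
```

Values near or above .7, as in Table 2, are conventionally taken to indicate acceptable internal consistency.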

Procedures

The TPSA C21 was administered online to preservice and inservice teachers at four universities in two
states during the spring and fall of 2014. The 34-item Likert-type survey was found to be reliable based on
Cronbach's alpha and valid as tested using factor analysis.

Results

As shown in Table 3, the different groups varied a great deal on each of the subscales. Four out of the five
groups rated their highest level of proficiency on the E-mail subscale. As might be expected given that most of the
participants are preservice teachers, their lowest levels of proficiency were in the two subscales related to teaching
with technology (Teaching with Technology and Teaching with Emerging Technologies).
Item analysis was completed on the 34 TPSA items to look more closely at the level of difficulty
educator groups perceived for each item. As shown in Table 4, items were ordered by level of difficulty from
highest mean (least difficult) to lowest mean (most difficult). For each item, an asterisk was placed in the column
representing the factor to which it belonged. While the original subscales of E-mail (S1) and World Wide Web (S2)
contained most of the easiest items, the item rated most difficult (creating a web page) was also on the World Wide
Web subscale. Most of the difficult items were found on the Teaching with Emerging Technologies subscale (S5).
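The ordering used for Table 4 is a simple sort of items by mean confidence rating, from highest (least difficult) to lowest (most difficult). A sketch using four of the item means reported in Table 4:

```python
# Means for four TPSA C21 items as reported in Table 4; higher mean = easier.
items = {
    "TP1 send e-mail to a friend": 4.94,
    "TP14 use the computer to create a slideshow presentation": 4.86,
    "TP20 write a plan with a budget to buy technology": 3.99,
    "TP8 create my own web page": 3.87,
}

# Sort from highest mean (least difficult) to lowest (most difficult)
by_difficulty = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
for name, mean_score in by_difficulty:
    print(f"{mean_score:.2f}  {name}")
```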

Table 3.
Mean Scores for Teachers and Teacher Candidates on the TPSA C21
                                      Univ 1      Univ 2      Univ 3      Univ 4      Univ 3      All
                                      Preservice  Preservice  Preservice  Preservice  Inservice   Participants
Email subscale                        4.68 (.40)  4.74 (.33)  4.60 (.50)  4.73 (.48)  4.50 (.92)  4.68 (.50)
WWW subscale                          4.50 (.46)  4.69 (.38)  4.38 (.55)  4.78 (.42)  4.32 (.98)  4.58 (.55)
Integrated Applications subscale      4.32 (.63)  4.51 (.63)  4.10 (.72)  4.54 (.49)  4.22 (1.10) 4.37 (.68)
Teaching with Technology              4.02 (.77)  4.42 (.55)  4.09 (.83)  4.51 (.54)  4.16 (1.07) 4.24 (.76)
Teaching with Emerging Technologies   4.27 (.60)  4.34 (.92)  4.40 (.64)  4.54 (.55)  4.09 (1.19) 4.37 (.71)
Emerging Technologies Skills          4.48 (.60)  4.63 (.60)  4.55 (.49)  4.57 (.65)  4.36 (.92)  4.52 (.63)
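Each cell of Table 3 reports a mean with its standard deviation in parentheses. A sketch of how such a cell could be computed from a group's subscale scores (hypothetical data; the paper does not state whether sample or population SD was used, and this sketch uses the sample SD):

```python
from statistics import mean, stdev  # stdev = sample standard deviation

def cell(scores):
    """Format a group's subscale scores as 'mean (SD)', as in Table 3."""
    return f"{mean(scores):.2f} ({stdev(scores):.2f})"

# Hypothetical Email-subscale scores (each the mean of 5 items) for one group
email_scores = [5.0, 4.8, 4.6, 4.2, 4.8, 4.6]
print(cell(email_scores))  # 4.67 (0.27)
```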

Table 4.
TPSA C21 Items Ordered by Difficulty
Items from Easiest to Most Difficult Mean (SD) S1 S2 S3 S4 S5 S6
I feel confident that I could
TP1 send e-mail to a friend. 4.94 (.39) *
TP6 use an Internet search engine (e.g., Google) to find Web 4.92 (.39) *
pages related to my subject matter interests.
TP4 send a document as an attachment to an e-mail message. 4.91 (.42) *
TP14 use the computer to create a slideshow presentation. 4.86 (.52) *
TP32 send and receive text messages. 4.84 (.61) *
TP9 keep track of Web sites I have visited so that I can return to 4.76 (.63) *
them later. (An example is using bookmarks.)
TP33 transfer photos or other data via a smartphone. 4.75 (.66) *
TP5 keep copies of outgoing messages that I send to others. 4.74 (.63) *
TP7 search for and find the Smithsonian Institution Web site. 4.69 (.75) *
TP31 download and view streaming movies/video clips. 4.66 (.73) *
TP10 find primary sources of information on the Internet that I can 4.65 (.69) *
use in my teaching.
TP13 save documents in formats so that others can read them if 4.65 (.75) *
they have different word processing programs (e.g., saving as
Word, PDF, RTF, or text).

TP30 download and read e-books. 4.62 (.79) *
TP29 download and listen to podcasts/audio books. 4.58 (.80) *
TP22 use social media tools for instruction in the classroom. (ex. 4.54 (.86) *
Facebook, Twitter, etc.)
TP16 write an essay describing how I would use technology in my 4.48 (.86) *
classroom.
TP18 use technology to collaborate with teachers or students who 4.44 (.81) *
are distant from my classroom.
TP27 use mobile devices to connect to others for my professional 4.43 (.86) *
development.
TP3 create a "distribution list" to send e-mail to several people at 4.42 (.91) *
once.
TP2 subscribe to a discussion list. 4.39 (1.01) *
TP28 use mobile devices to have my students access learning 4.34 (.95) *
activities.
TP34 save and retrieve files in a cloud-based environment. 4.34 (1.02) *
TP21 integrate mobile technologies into my curriculum. 4.28 (.86) *
TP11 use a spreadsheet to create a bar graph of the proportions of 4.26 (1.01) *
the different colors of M&Ms in a bag.
TP25 teach in a one-to-one environment in which the students 4.25 (.94) *
have their own device.
TP26 find a way to use a smartphone in my classroom for student 4.24 (1.07) *
responses.
TP24 use online tools to teach my students from a distance. 4.20 (.96) *
TP12 create a newsletter with graphics. 4.18 (1.05) *
TP17 create a lesson or unit that incorporates subject matter 4.18 (.97) *
software as an integral part.
TP19 describe 5 software programs or apps that I would use in my 4.11 (1.05) *
teaching.
TP23 create a wiki or blog to have my students collaborate. 4.09 (1.06) *
TP20 write a plan with a budget to buy technology for my 3.99 (1.02) *
classroom.
TP15 create a database of information about important authors in a 3.92 (1.10) *
subject matter field.
TP8 create my own web page. 3.87 (1.26) *
Note: S1-Email, S2-WWW, S3-Integrated Applications, S4-Teaching with Technology, S5-Teaching with
Emerging Technologies, S6-Emerging Technology Skills

Discussion

Group mean differences found in Table 3 generally confirm a priori expectations based on known attributes
of the different university groups. For example, University 1 students were in one of their first courses for their
education degree, so it makes sense that they were proficient in E-mail (mean = 4.68) and using the World Wide
Web (mean = 4.50) but not necessarily confident in their ability to teach with technology (mean = 4.02) or teach
with emerging technologies (mean = 4.27). The University 2 students had been introduced to instructional
technology and had been required to use technology in their assignments. They rated themselves high in
proficiency for emerging technology skills (mean = 4.63) in addition to E-mail (mean = 4.74) and World Wide Web
(mean = 4.69). The University 4 students were juniors and seniors in their education program, so they had likely
been expected to use technology in their previous classes. This group of students was very high in all six
subscales.
One unexpected outcome was the relatively low reported skill level among the University 3 inservice
teachers who were enrolled in an online, accelerated-schedule master's degree program. For most of these educators
it was their first or second course in the program. This relatively low reported skill level is similar to findings
reported by Tyler-Wood, Knezek, and Christensen (2004) when comparing the technology skills of alternative
certification teacher preparation candidates to those of traditional preservice teacher candidates completing their
preparation in the context of a four-year undergraduate degree program. Tyler-Wood et al. (2004) found that
alternative certification teachers completed their required three-credit-hour technology course with technology skills
still lower, on average, than those with which traditional preservice teachers began their technology course
sequence. The implication is that teacher candidates in alternative programs may need extra training in technology
applications and teaching with technology skills.
The distribution of the asterisks in Table 4 gives clues regarding how, for example, a technology coordinator
for a school district might select skills to be targeted for mastery at a day-long teacher professional development
institute. Most of the asterisks in the top half of Table 4 are in the S1 and S2 columns on the left-hand side of the
table, while most of the asterisks in the bottom half of the table are in the S3-S6 columns to the right of the table.
This indicates that most of the easy skills already mastered by most teachers are related to e-mail (S1) and the WWW
(S2), while the ones not yet mastered by teachers in general are in the areas of integrated applications (S3), teaching
with technology (S4), emerging technologies for the classroom (S5), or emerging technology skills (S6). If a school
building were able to have all teachers complete surveys and analyze the responses before the training sessions
began, then the coordinator could anchor the skill acquisition goals slightly above the self-reported means in each
of the six areas. If the technology coordinator does not have the luxury of assessing the teachers before training
begins, then he or she might choose skill development activities in each of the six scale categories, probably
with some variety in difficulty level. For example, a technology coordinator for a large school or an entire
district might make training available for one easy skill, one average skill, and one difficult skill in each of the six
categories. Teachers could then be required to select one skill workshop from each column to attend during the
professional development day. In this way the orchestrated selection of training workshops would be likely to match
the professional development skill levels of most teachers.
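The one-easy / one-average / one-difficult selection scheme described above can be sketched by ranking each scale's items by mean confidence; the WWW-subscale items and means below are abbreviated from Table 4:

```python
def pick_workshops(scale_items):
    """Pick one easy, one average, and one difficult skill from a subscale.

    scale_items: (item, mean) pairs for one TPSA subscale; the item with the
    highest mean is treated as easiest, the lowest as most difficult.
    """
    ranked = sorted(scale_items, key=lambda kv: kv[1], reverse=True)
    return {
        "easy": ranked[0][0],
        "average": ranked[len(ranked) // 2][0],
        "difficult": ranked[-1][0],
    }

# WWW-subscale items, abbreviated, with means from Table 4
www = [
    ("use an Internet search engine", 4.92),
    ("keep track of Web sites (bookmarks)", 4.76),
    ("find the Smithsonian Institution Web site", 4.69),
    ("find primary sources on the Internet", 4.65),
    ("create my own web page", 3.87),
]
print(pick_workshops(www))
```

Run once per subscale, this yields the six columns of three workshops each that the paragraph above envisions.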

Conclusion

The presence of technology does not guarantee pedagogically sound use in the classroom. Teachers'
attitudes towards computers, their self-efficacy, and a general openness to change are among the factors that are
indicative of effective use in K-12 instructional practice (Christensen, 2002; Wang, Ertmer, & Newby, 2004;
Anderson & Maninger, 2007; Vannatta & Fordham, 2004). In light of this body of research on the various factors
that impact technology infusion, instrumentation must be carefully designed to derive an accurate
picture of technology integration into instruction. The Technology Proficiency Self-Assessment instrument has
retained high reliability over the past 15 years, as shown in various studies (Ropp, 1999; Christensen & Knezek,
2001; Morales, Knezek, & Christensen, 2008; Gencturk, Gokcek, & Gunes, 2010; Mayes, Mills, Christensen, &
Knezek, 2012). The TPSA C21 continues and extends that tradition with the addition of two scales designed to
assess educator skills regarding emerging technologies.

References

Anderson, S.E., & Maninger, R.M. (2007). Preservice teachers' abilities, beliefs, and intentions regarding
technology integration. Journal of Educational Computing Research, 37(2), 151-172.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2),
191-215.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ:
Prentice-Hall, Inc.
Christensen, R. (2002). Effects of technology integration education on the attitudes of teachers and students. Journal
of Research on Technology in Education, 34(4), 411-433.
Christensen, R. & Knezek, G. (2001). Equity and diversity in K-12 applications of information technology. Key
Instructional Design Strategies: KIDS Project Findings for 2000-2001. Denton, TX: University of North
Texas, Institute for the Integration of Technology into Teaching and Learning.
Collis, B. (1996). The internet as an educational innovation: Lessons from experience with computer
implementation. Educational Technology, 36(6), 21-30.
Gencturk, E., Gokcek, T., & Gunes, G. (2010). Reliability and validity study of the technology proficiency self-
assessment scale. Procedia Social and Behavioral Sciences, 2, (2010), 2863-2867.
Henson, R. K. (2003). Relationships between preservice teachers' self-efficacy, task analysis, and classroom
management beliefs. Research in the Schools, 10(1), 53-62.

Mayes, G., Mills, L., Christensen, R. & Knezek, G. (2012). Evolution of Technology Proficiency Perceptions:
Construct Validity for the Technology Proficiency Self Assessment (TPSA) Questionnaire from a Longitudinal
Perspective. In P. Resta (Ed.), Proceedings of Society for Information Technology & Teacher Education
International Conference 2012 (pp. 1988-1993). Chesapeake, VA: AACE.
Morales, C., Knezek, G., & Christensen, R. (2008). Self-efficacy ratings of technology proficiency among teachers
in Mexico and Texas. Computers in the Schools, 25(1), 126-144.
Ropp, M.M. (1999). Exploring individual characteristics associated with learning to use computers and their use as
pedagogical tools in preservice teacher preparation. Journal of Research on Technology in Education, 36(3),
402-424.
Tuckman, B.W. & Sexton, T.L. (1990). The relation between self-beliefs and self-regulated performance. Journal of
Social Behavior and Personality, 5, 465-472.
Tyler-Wood, T., Knezek, G. & Christensen, R. (2004). Barriers to Teaching with Technology for Alternative
Certification Teachers. In L. Cantoni & C. McLoughlin (Eds.), Proceedings of World Conference on
Educational Multimedia, Hypermedia and Telecommunications 2004 (pp. 3209-3214). Chesapeake, VA:
AACE.
Vannatta, R. A. & Fordham, N. (2004). Teacher dispositions as predictors of classroom technology use. Journal of
Research on Technology in Education, 36(3), 253-271.
Wang, L., Ertmer, P. A., & Newby, T. J. (2004). Increasing preservice teachers' self-efficacy beliefs for technology
integration. Journal of Research on Technology in Education, 36(3), 231-250.

3. Learning Analytics: Principles and Constraints

Mohammad Khalil
Social Learning, Information Technology Services
Graz University of Technology
Graz, Austria
mohammad.khalil@tugraz.at

Martin Ebner
Social Learning, Information Technology Services
Graz University of Technology
Graz, Austria
martin.ebner@tugraz.at

Introduction
In recent years, technology and the availability of the internet have evolved so rapidly that they have
changed the world of information. Education has joined this revolution and created a new phenomenon
called e-learning (also referred to as web-based education and e-teaching), in which big sets of data exist about
learners and the whole educational system (Castro et al., 2007). Learning Analytics is a fast-growing area of the
research field of online education and Technology Enhanced Learning (TEL). It includes different academic
disciplines as an intersection of various fields, e.g. education, psychology, pedagogy, statistics, machine learning,
and computer science (Dawson et al., 2014). Additionally, Knight, Buckingham Shum, and Littleton added
epistemology to the Learning Analytics areas of study (Knight et al., 2013). These social and technical connections
have been largely positive for Learning Analytics, and with a growing researcher base comes the opportunity to
influence the development of analytics in education (Siemens, 2012).
Learning is conventionally defined as the process of acquiring competence and understanding (Goodyear &
Retalis, 2010). On the other side, analysis techniques that derive information from big data, such as revealing
patterns, and apply them to the education stream are named Learning Analytics. In its initial steps of evolution,
there has been a plethora of definitions of Learning Analytics. Siemens defines it as "the use of intelligent
data, learner-produced data and analysis models to discover information and social connections, and to predict and
advise on learning" (Siemens, 2010). Elias described it as an emerging field in which "sophisticated analytic tools
are used to improve learning and education" (Elias, 2011). Learners and teachers leave many traces behind them,
which Learning Analytics can convert into benefits for the education sector (Duval, 2011). Later, the
Society for Learning Analytics Research (SoLAR) defined Learning Analytics as "the measurement, collection,
analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing
learning and the environment in which it occurs" (SoLAR, 2011). There are several factors driving the emergence
and expansion of Learning Analytics: a) the boundlessness and proliferation of the internet and
technology among all educational categories; b) the large abundance of data available from learning environments;
c) the availability of tools that can be used to manage and analyze data; and d) the increasing demand to understand
learners and improve the learning environment and its context.
A key application of Learning Analytics is monitoring and predicting students' learning performance
(Johnson et al., 2011). By using Learning Analytics and optimizing it in the learning environment, tutors, for
example, can predict students' future performance in courses, and students can improve their grades. Decision
makers in educational institutions can be involved to enhance the retention percentage of a university's graduates,
and course developers can pinpoint difficulties and weaknesses in course models. However, limited research has
been conducted so far to establish Learning Analytics as a standard approach for e-learning environments (Chatti et
al., 2012; Cooper, 2012; Greller & Drachsler, 2012). Furthermore, beyond the technically focused questions about
the related fields between e-learning environments and Learning Analytics, Prinsloo and Slade indicated in their
research that higher education institutions' policies may no longer be sufficient to address ethical and privacy issues
arising from the potential of Learning Analytics (Prinsloo & Slade, 2013). Therefore, we raise substantial questions
related to the challenges that surround Learning Analytics, such as security, privacy, policy and ethics issues.

Research Methodology
In this paper, we discuss what Learning Analytics is about and propose an approach that comprises
successive steps, starting from the learning environment and ending with the appropriate intervention. We derive
a Learning Analytics life cycle after gathering information about Learning Analytics from conference proceedings
and workshop results as well as publications in different journals over the last four years. Furthermore, we examined
the currently available frameworks and reference models. This motivated us to investigate further and propose an
approach that presents a framework as well as a life cycle. We took into account the idea of closing the cycle (Clow,
2012), the current models by Greller and Drachsler (2012) and Chatti et al. (2012), and the information with respect
to Learning Analytics in this area to date. Afterwards we modeled our approach and enhanced it by exploring
additional related studies about privacy, ethics and security issues.

Discussions
In this section, we present the Learning Analytics life cycle shown in Figure 1. It considers four main
parts: the learning environment, where stakeholders produce data; big data, which consists of massive amounts of
datasets; analytics, which comprises different analytical techniques; and the act, where objectives are achieved to
optimize the learning environment. Later on, the eight-dimensional constraints that encompass the holistic Learning
Analytics cycle will be shown in Figure 3.

Figure 1: Proposed Learning Analytics Life Cycle

Learning Environment

With ubiquitous technologies spreading through education, there is a large collection of educational and
learning environments involved, such as Personal Learning Environments (PLE), adaptive hypermedia educational
systems, Interactive Learning Environments (ILE), Learning Management Systems (LMS), Learning Content
Management Systems (LCMS), Virtual Learning Environments (VLE), Immersive Learning Simulations (ILS),
intelligent tutoring systems, and mobile learning. All these learning environments are a gold mine of data that
learners leave behind (Romero & Ventura, 2010). For example, logging a mouse click by its x and y coordinates,
the number of times menu items are clicked, or the time a student spends on a question can produce a huge amount
of data that can be analyzed to provide information about the student's motor skills (Mostow & Beck, 2006). The
learning environment has many aspects and roles, but in this proposed Learning Analytics cycle, the focus is on the
actors / stakeholders.

Stakeholders

There are different groups who are engaged in Learning Analytics. Each group can benefit according
to its vision and mission. For instance, Learning Analytics is advantageous for supporting people in clarifying and
relating information, peer learners and digital artifacts, and in pursuing their learning (Fournier et al., 2011).
Table 1 displays the stakeholders, their objectives, and some examples for each group.

Learners
  Objectives: Enhance their performance. Personalize online learning. Recommend courses.
  Examples: Students are informed about the learning process and compare their performance with others.
  Starting large assignments earlier and asking questions using applications like Signals (Arnold & Pistilli, 2012).

Instructors
  Objectives: Enhance their teaching methods. Provide real-time feedback to students.
  Examples: Monitoring the learning progress of students using applications like SNAPP (Bakharia & Dawson, 2011).

Researchers
  Objectives: Evaluate courses. Improve course models. Discover new methods of delivering educational information.
  Examples: Through visualizations, course researchers can compare Learning Analytics techniques to be able to
  recommend the persuasive one.

Educational Institutions
  Objectives: Support decision processes to achieve higher educational goals.
  Examples: Monitor higher educational perspective goals by increasing the retention rate, using applications like
  Signals and C4S (Jackson & Read, 2012).

Table 1: Learning Analytics Stakeholders

Learning Analytics involves mining learners' activities. Most Learning Analytics definitions reference
learners as the main actors of the Learning Analytics process. This has been researched in (Siemens, 2010), (Duval,
2011), (Ebner & Schön, 2013), and (Taraghi et al., 2014).

Big Data

As mentioned before, learners leave a lot of data behind them while using any learning environment. In
older educational methodologies, the learner was considered a consumer, with no possibility of being an active
actor in the education process. With Learning Analytics, on the other hand, learners are not only consumers but also
producers of data. In educational environments, there are different types of data to be processed. These data
are restricted to the educational area and therefore carry authentic semantic information (Romero & Ventura,
2010). Manyika and his colleagues defined big data as referring to "datasets whose size is beyond the ability
of typical database software tools to capture, store, manage, and analyze" (Manyika et al., 2011). While learners
use educational platforms, they generate data, which yields repositories of datasets. These datasets
include, but are not limited to:
1) Interaction data, such as data related to visualizations and forum discussions.
2) Traces, such as numbers of logins, mouse clicks, accessed resources, finished assignments, videos
accessed, documents accessed, files downloaded, questions asked, and discussions joined, as well as
social network activities such as tweets, blogs, and comments.
3) Personal data: name, date of birth, local address, email address, personal image, ID, or any other
personally related information.
4) Academic information: courses attended, grades, graduation date, exams taken, certificates, etc.
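The four dataset categories above could be captured in a minimal learner record; the field names here are illustrative, not a schema from the literature:

```python
from dataclasses import dataclass, field

@dataclass
class LearnerRecord:
    """One learner's data across the four categories; field names are made up."""
    learner_id: str                 # personal data (category 3)
    logins: int = 0                 # traces (category 2)
    resources_accessed: int = 0     # traces (category 2)
    forum_posts: int = 0            # interaction data (category 1)
    courses: list = field(default_factory=list)   # academic info (category 4)
    grades: dict = field(default_factory=dict)    # academic info (category 4)

r = LearnerRecord("s001", logins=12, forum_posts=3, courses=["LA101"])
print(r.logins, r.courses)  # 12 ['LA101']
```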
While big data includes this large amount of educational information, it must be searched, processed,
mined, and visualized in order to retrieve meaningful knowledge.

Analytics

There are different methods for analyzing data in educational settings. These analytics methods seek
to discover interesting patterns hidden in educational datasets. Learning Analytics techniques use various types
of analytical methodologies, which fall into two main categories: quantitative and qualitative analysis. In this
section, we summarize the key techniques in both fields.

Quantitative Analysis

Quantitative analysis is all about counting and statistical models. Learning Analytics quantitative methods
are:
Statistical Analysis: By applying statistics and mathematical operations, knowledge can be revealed out
of data. Statistical analysis is related to traces, in which numeric computations can be executed. We can
count the number of visits, analyze mouse clicks and calculate time spent on task. Some of the available
statistical analysis tools are: IBM SPSS and MATLAB.
Visualizations: Visualizations are useful for all Learning Analytics stakeholders. Students can display
their progress in assignments and classes. Teachers can obtain an overview of their own efforts.
Decision makers can make financial decisions upon them. Statistical information can be interpreted into
charts, flowcharts, mind mapping, heat maps, 3D plots, scatterplots, evaluation models and diagrams.
Learning Analytics lacks clarity about what exactly to measure to gain a deeper understanding of learning
progress. However, information visualization techniques can connect visualizations not only with
meaning or truth, but also with taking actions (Duval, 2011). Dashboards are an adequate and widely
accepted style of Learning Analytics visualization, as seen in Course Signals (Arnold & Pistilli, 2012).
Vozniuk and his colleagues offered a portable learning analytics dashboard that provides teachers and
learners the ability to view their progress in different scenarios (Vozniuk, 2013). We see that further
research on the development of dashboards is imperative for Learning Analytics, as they are easy to
understand, render better visibility, and offer easy insight.
Quantitative Social Network Analysis: Social Network Analysis (SNA) focuses on relationships
between entities. SNA allows detailed investigations of networks composed of actors and the relations
between them. These relationships between entities/actors are referred to as strong ties or weak ties
(Ferguson, 2012). SNA can employ learning environment data to help instructors understand the
dynamics of their classes, and to provide help for students when needed. There are several available
tools that offer social network analysis enriched by visualizations, such as SNAPP (Bakharia &
Dawson, 2011) and Cytoscape1. These tools help by observing students' contributions in class
discussions and offer faculty and decision makers the ability to identify students who are isolated (at
risk).


1 www.cytoscape.org (last access December 2014)

Qualitative Analysis

Qualitative analysis processes data into richer, more descriptive accounts. The study of Fournier et al.
(2011) highlighted that not only quantitative but also qualitative methods should be used with Learning Analytics;
in such cases, recommendations can be provided to learners based on their earlier learning
activities. Our survey harvests two main qualitative analysis methods:
Emotional Intelligence: Emotional intelligence is based on emotions, which relate to psychology and
sociology. A general way of categorizing emotions is to divide them into positive and negative ones.
The University of New England, for example, has created an early alert engine called the Automated
Wellness Engine (AWE) in order to improve retention and decrease the dropout rate. The AWE is built
on an emoticon-identification engine that is embedded into the university's student portal. Based on various
indicators, the AWE sends wellness reports which detail the reasons for withdrawal and wellness
ratings within courses (Atif et al., 2013).
Qualitative Social Network Analysis: This can take the form of virtual ethnography, including
observations, interviews and surveys (Edwards, 2010). These qualitative data are then analysed by
establishing a social network, transforming them into wider contextual findings.
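As a toy illustration of the emotion-categorization step behind engines such as the AWE, the following sketch counts positive versus negative indicator words in a forum post. The word lists and labels are invented for illustration; production systems rely on far richer indicators and trained models.

```python
# Toy sketch of positive/negative emotion categorization of a forum post.
# The indicator word lists are hypothetical illustrations only.

POSITIVE = {"great", "enjoy", "clear", "thanks", "helpful"}
NEGATIVE = {"confused", "stuck", "frustrated", "lost", "drop"}


def emotion_label(post: str) -> str:
    """Label a post by the balance of positive vs. negative words."""
    words = set(post.lower().replace(",", " ").split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"


if __name__ == "__main__":
    print(emotion_label("I am stuck and frustrated with unit 3"))
    print(emotion_label("thanks, the example was helpful"))
```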

The Act

In this stage, the analysis outcome is interpreted to achieve the objectives of Learning Analytics. The
greatest value of Learning Analytics comes from optimizing the objectives, as interventions that affect the learning
environment and its stakeholders (Clow, 2012). Learning Analytics pursues the following objectives:
Prediction: The objective of prediction is to estimate an unknown numerical/continuous value such as
performance, knowledge, score or grade (Romero & Ventura, 2010). Through prediction, learner
activities and future performance can be revealed, so that an appropriate intervention can accomplish
Learning Analytics goals.
Intervention: A convenient intervention can, for example, prevent drop-out, determine which students
might be at risk, advise students who may need additional assistance and improve students' success.
Recommendation: Learning Analytics data can be mined for recommendations based on the activities of people
(Duval, 2011). The main goal in the context of Learning Analytics is the ability to make
recommendations to students based on their activities, e.g. recommend a discussion, suggest a course or
recommend books related to what previous students consulted.
Personalization: Related to recommendation, personalization lets learners shape their personal learning
environments. The objective behind personalization is to support learning for all students, improve
educational performance and accelerate educational innovation (Pea et al., 2014). For instance, it can be
carried out by personalizing e-learning based on learners' abilities, or by supporting students through
personalized learning suggestions.
Reflection and Iteration: Reflection and iteration are defined as data clients' self-evaluation of their
own data in order to obtain self-knowledge (Greller & Drachsler, 2012). The objective behind
reflection is to evaluate past work to improve future experience and to turn it into learning through
personalization and adaptation. This iteration can benefit all Learning Analytics stakeholders in the
design of its life cycle.
Benchmarking: Benchmarking is a learning process, which identifies the best practices that produce
superior results. These practices are replicated to enhance performance outcomes (Vorhies & Morgan,
2005). Thus, through Learning Analytics, we can identify weak points in the learning environment and
in educational performance and, therefore, suggest and optimize methods to enhance
learning.
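As a minimal illustration of the prediction objective above, the following sketch fits an ordinary least-squares line relating activity counts to final grades and uses it to estimate an unseen student's grade. All data points and variable names are invented; real predictors would use richer features and models.

```python
# Illustrative prediction sketch: least-squares fit of final grade
# against logged activity counts. Data points are invented.

def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx


activities = [10.0, 20.0, 30.0, 40.0]   # e.g. LMS logins per term
grades     = [52.0, 61.0, 70.0, 79.0]   # final course scores

a, b = fit_line(activities, grades)
predicted = a * 25 + b                  # grade expected at 25 logins
print(round(predicted, 1))
```

A prediction like this would then feed the intervention objective, e.g. flagging students whose predicted grade falls below a pass threshold.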

Learning Analytics Constraints


There are constraints that affect Learning Analytics technologies. Ethical and privacy issues emerge while
applying Learning Analytics to educational datasets (Greller & Drachsler, 2012). The large scale of data collection
and analysis can lead to questions related to ownership, privacy and ethics. In this study, we introduce eight-
dimensional constraints that limit the beneficial application of Learning Analytics processes, as shown in Figure 2.

Figure 2: Learning Analytics Constraints

Privacy: In line with the main objective of Learning Analytics, namely prediction, a professor, for
instance, can identify a student who is at risk in his or her course. This can lead to the problem of labeling, whereby a learner
is branded as a 'bad' or 'good' student.
The Learning Analytics committee needs to carefully consider the potential privacy issues while analyzing
students' data. Data analysis and customization can reveal personal information, attitudes and activities of
learners. Therefore, some educators claim that educational institutions are using software that collects sensitive data
about students without sufficiently considering data privacy and how the data are eventually used (Singer, 2014).
Datasets may include sensitive information about learners. Thus, anonymization or de-identification may be required
to preserve learners' information. The student privacy law known as the Family Educational Rights and Privacy Act (FERPA)2
advocates the usage of de-identification in higher education to preserve the privacy of students' records. There are various
cryptographic solutions, anonymization techniques and statistical methods that hide the owner's identity (Fung et al.,
2010). The study of Slade & Prinsloo (2013) pointed to the requirement of de-identifying data before it
becomes available for the institution's use. This would serve the intervention of Learning Analytics based on
students' activity and behavior while assuring the anonymity of their information.
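As an illustration of de-identification before release for analysis, the following sketch replaces student identifiers with salted-hash pseudonyms and drops direct identifiers. The salt, field names and record layout are hypothetical; compliant de-identification (e.g. under FERPA) requires a full threat and re-identification review, not just this step.

```python
# Sketch of salted-hash pseudonymization before a dataset is released.
# SALT, field names and the record are hypothetical illustrations.

import hashlib
import hmac

SALT = b"institution-secret-2014"   # kept by the data steward only


def pseudonym(student_id: str) -> str:
    """Stable pseudonym: the same input always maps to the same token,
    but the token cannot be reversed without the secret salt."""
    return hmac.new(SALT, student_id.encode(), hashlib.sha256).hexdigest()[:12]


def de_identify(record: dict) -> dict:
    """Replace the id with a pseudonym and drop direct identifiers."""
    out = dict(record)
    out["id"] = pseudonym(record["id"])
    out.pop("name", None)
    return out


if __name__ == "__main__":
    rec = {"id": "s1234567", "name": "Jane Doe", "quiz_avg": 0.82}
    print(de_identify(rec))
```

Because the pseudonym is stable, analysts can still link a student's records across courses without ever seeing the real identifier.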
Regulations and laws are an important means of addressing the privacy issue. The Open University in
England has taken a first step by issuing a policy dedicated to Learning Analytics and privacy principles (OU,
2014). This is a good example that encourages other institutions to consider privacy as a fundamental element that
should not be ignored.
Access: Authentication assures that only legitimate users have permission to access and view specific
data. Data access is governed by policy regulations, which must adhere to the authentication and
authorization modules. A student should be allowed to update his/her information and have the ability to provide
additional information when needed. In order to protect students' privacy, there should be access levels for all
Learning Analytics stakeholders. A student has the access level to view and update his/her information. A teacher is
authorized to access students' data without the possibility of viewing sensitive information such as ethnic origin or
nationality. Decision makers can maintain the data in order to meet the institutional perspective, which focuses on
preventing the high dropout rate that is considered a failure of the university system (Grau-Valldosera &
Minguillón, 2011). On the other hand, there are still unanswered questions: do students have the
right to access the results of Learning Analytics, and do researchers have the moral right to view and analyze students' data?
Transparency: Disclosing information is a major challenge for information providers. Learning Analytics
methods should aim to be transparent and easily explained to staff and students. The institution can take the step of
assuring transparency by providing information regarding data collection, usage and the involvement of third parties in
analyzing students' information. Learning Analytics tools use techniques and models that seek to provide assistance
to different stakeholders. It is natural that students may want to understand how their
performance is being tracked and, based on that, how evaluations and interventions are processed. Moreover,
students or their parents may ask about their sensitive data and how much of the information is provided to the
instructor.

2 www2.ed.gov/ferpa (last access December 2014)

Transparency in Learning Analytics does not mean that data should be available to the public. Nevertheless,
we must bring together all sectors involved in Learning Analytics (educational, psychological and computer
science, together with security experts) to develop the right technical and ethical principles across the whole
Learning Analytics life cycle.
Policy: With the adoption of Learning Analytics in educational fields, institutions are required to align
their policies with the legislative framework. According to the study of Prinsloo & Slade (2013), many institutional
policies fail to fully reflect the ethical and privacy implications of Learning Analytics. Here, we list some possible
regulations that an ethical Learning Analytics policy should describe: a) the collection of personal information, for
example sex, date of birth, address, ethnicity, occupational status, qualifications and study records; b) the
usage of this information, whether it is for the benefit of the students, such as predicting students' behavior and offering
recommendations and advice based on Learning Analytics, or for research purposes towards Learning
Analytics objectives; c) the methodology of data collection, either through the student's own input or through other
services, such as browser cookies; d) security principles for keeping the data protected; e) a description of the time
period for which learners' data are kept and a definition of a deletion process. For instance, ClassDojo, a student tracking
company, announced that it would keep students' statistics for only one school year and adopt a deletion policy to
remove students' records after families complained about their children's private data (Singer, 2014).
Security: All Learning Analytics tools should follow appropriate security principles in order to keep the
analysis results and the students' records safe from any threat. The most widespread security model known to security
experts is CIA, which stands for Confidentiality, Integrity and Availability (Anciaux et al., 2006). The
confidentiality property guarantees that the data can never be accessed by unauthorized parties. The integrity property
guarantees that the data cannot be altered or tampered with. The availability property means that the data should
be available for authorized parties to access when needed. In the scope of Learning Analytics, students' information
and the analysis procedures should be kept safe and be accessible only to authorized parties. A key component of
protecting learners' information is encrypting their data in order to achieve confidentiality. Encryption
guarantees that only authorized people can use the data. Moreover, assuring confidentiality can include invoking
file permissions and granting a secure operating environment, while cryptographic hashing of datasets can assure the
integrity of students' records (Chen & Wang, 2008).
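The integrity property can be illustrated with a short sketch: a SHA-256 digest recorded when a grade file is stored lets later readers detect any alteration. The file contents below are invented for illustration; this shows the digest-comparison idea rather than a specific system's implementation.

```python
# Integrity sketch for the CIA model: compare a recorded SHA-256
# digest against the current data to detect tampering.
# The grade records below are hypothetical.

import hashlib


def digest(data: bytes) -> str:
    """Hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()


stored = b"s1001,78\ns1002,91\n"         # grades as originally written
fingerprint = digest(stored)             # kept alongside the dataset

tampered = b"s1001,98\ns1002,91\n"       # one grade silently changed

print(digest(stored) == fingerprint)     # matches: record is intact
print(digest(tampered) == fingerprint)   # mismatch: integrity violated
```

The piecewise hashing of Chen & Wang (2008) extends this idea by digesting a file in segments, so that the altered region can also be localized.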
Accuracy: As Learning Analytics is an emerging research topic in the field of Technology Enhanced
Learning and a forthcoming trend (Ebner & Schön, 2013), the accuracy and validity of its information is highly
questionable. Mistakes such as picking the wrong dataset, or not recognizing which components of the data are relevant, will
negatively affect the accuracy of the outcome (Waterman & Bruening, 2014). Therefore, a wrong selection of
an educational dataset will lead to inaccurate results. The questions we can ask here are: what if Learning Analytics
results were wrong? And what if the predictions or the interventions went wrong? Accordingly, Learning Analytics
should aim to guarantee that its analysis and data selection fit quality criteria and produce an agreed
level of accuracy.
Restrictions: Data protection and copyright laws are legal restrictions that limit the beneficial use of
Learning Analytics. Such legal restrictions include: limitations on keeping the data for longer than a specific period of
time, which are regulated differently in each country; the requirement that data be kept secure and safe from internal and
external threats; and the requirements that data be used for specific purposes and that the results of any process be as accurate as
possible. These restrictions can be stronger when personal information is involved. Applying social network analysis
as a Learning Analytics method entails the use of personal information; therefore, such methods should meet
the regulations on using individuals' information.
Ownership: There are two main perspectives on who owns the data: students and institutions. Jones et
al. (2014) concluded that neither the students nor the institutions should be granted sole ownership of the data. They
suggested a hybrid model that merges both perspectives. Institutions can use students' data for analytics,
develop new personalized learning platforms and benchmark their learning management systems. While students
want to enhance their learning and maintain their performance, they would like to ensure that their information is
kept confidential. A pressing question we would like to address here is: what if Learning Analytics methods have to
modify students' data for prediction purposes?

Learning Analytics - Principles & Constraints Framework

After the discussion of the Learning Analytics cycle and its quandaries, we present the final proposed framework
that combines the principles and constraints. Figure 3 illustrates the Learning Analytics Principles & Constraints
framework.

Figure 3: Learning Analytics - Principles & Constraints Framework

The final proposed framework shows the main sections, process flow, methodologies, and
objectives of Learning Analytics as a life cycle, with eight-dimensional constraints encompassing its central
processes. These constraints do not relate to a specific principle, but to the entire Learning Analytics
process in general.

Conclusion
Learning Analytics is a promising research field which provides tools and platforms that influence
researchers in Technology Enhanced Learning. For instance, several institutions have taken an
analytical approach when deciding to update their learning management systems (Cooper, 2013). Nonetheless, this
emerging field lacks an approach that delineates a complete overview of its processes. It could be said that
Learning Analytics is advancing quickly, but it is not yet in full bloom. This research study reviewed the definitions
of Learning Analytics and the factors that drive its emergence and expansion. We then proposed an approach that
portrays a Learning Analytics life cycle. This provides an overall view consisting of the learning environment, big
data, analytics and the interventions which are interpreted to achieve the main goals of Learning Analytics. Based on
this approach, we identified the stakeholders, introduced examples of usage, presented methodologies and discussed
the objectives. After that, we determined the challenges that surround Learning Analytics,
shed light on privacy, security and ethical issues, and anticipated questions that need further research in the near
future. In the last section, we presented a framework that combines both principles and constraints and
reflects our vision of the quintessence of Learning Analytics.

References
Anciaux, N., Bouganim, L., & Pucheral, P. (2006). Data confidentiality: to which extent cryptography and secured
hardware can help. In Annales des télécommunications (Vol. 61, No. 3-4, pp. 267-283). Springer-Verlag.
Arnold, K. E., & Pistilli, M. D. (2012). Course Signals at Purdue: Using learning analytics to increase student
success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge
(LAK12) (pp. 267-270). New York, USA: ACM.
Atif, A., Richards, D., Bilgin, A., & Marrone, M. (2013) Learning Analytics in Higher Education: A Summary of
Tools and Approaches. Retrieved 06 October 2014 from
http://www.ascilite.org/conferences/sydney13/program/papers/Atif.pdf

Bakharia, A., & Dawson, S. (2011). SNAPP: a bird's-eye view of temporal participant interaction. In Proceedings of
the 1st international conference on learning analytics and knowledge (LAK11) (pp. 168-173). New York,
USA: ACM.
Castro, F., Vellido, A., Nebot, A., & Mugica, F. (2007). Applying data mining techniques to e-learning problems. In
L. C. Jain, T. Raymond & D. Tedman (Eds.), Evolution of teaching and learning paradigms in intelligent
environment (Vol. 62, pp. 183-221). Berlin: Springer-Verlag.
Chatti, M.A., Dyckhoff, A.L., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics.
International Journal of Technology Enhanced Learning, 4(5/6), 318-331. doi:10.1504/IJTEL.2012.051815
Chen, L., & Wang, G. (2008). An efficient piecewise hashing method for computer forensics. In Knowledge
Discovery and Data Mining, 2008. WKDD 2008. First International Workshop (pp. 635-638). IEEE.
Clow, D. (2012). The learning analytics cycle: closing the loop effectively. In Proceedings of the 2nd International
Conference on Learning Analytics and Knowledge (LAK12) (pp. 134-138). New York, USA: ACM.
Cooper, A. (2012). A Framework of Characteristics for Analytics. CETIS Analytics Series, 1(7).
Cooper, A. (2013). Learning Analytics Interoperability some thoughts on a way ahead to get results sometime
soon. Retrieved 10 November 2014 from http://blogs.cetis.ac.uk/adam/wp-
content/uploads/sites/23/2013/10/LA-Interop-Some-Thoughts-v0p2.pdf
Dawson, S., Gašević, D., Siemens, G. & Joksimovic, S. (2014). Current state and future trends: a citation network
analysis of the learning analytics field. In: Proceedings of the Fourth International Conference on Learning
Analytics & Knowledge (LAK14) (pp. 231-240). New York, USA: ACM.
Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. In Proceedings of the
1st International Conference on Learning Analytics and Knowledge (LAK 11) (pp. 9-17). New York,
USA: ACM.
Ebner, M., & Schön, M. (2013). Why Learning Analytics in Primary Education Matters. Bulletin of the Technical
Committee on Learning Technology, Karagiannidis, C. & Graf, S (Ed.), 15(2), 14-17.
Edwards, G. (2010). Mixed-method approaches to social network analysis. ESRC National Centre for Research
Methods Review paper NCRM/015. National Centre for Research Methods.
Elias, T. (2011). Learning Analytics: Definitions, Processes and Potential. Retrieved 10 November 2012 from
http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf
Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology
Enhanced Learning, 4(5), 304-317.
Fournier, H., Kop, R., and Sitlia, H. (2011). The Value of learning analytics to networked learning on a Personal
Learning Environment. In: the 1st International Conference on Learning Analytics and Knowledge
(LAK11) (pp. 104-109). New York, USA: ACM.
Fung, B., Wang, K., Chen, R., & Yu, P. S. (2010). Privacy-preserving data publishing: A survey of recent
developments. ACM Computing Surveys (CSUR).
Goodyear, P., Retalis, S. (2010). Technology-Enhanced Learning, Design Patterns and Pattern Languages. Sense
Publishers. Volume 2. Retrieved 12 November 2012 from https://www.sensepublishers.com/media/1037-
technology-enhanced-learning.pdf
Grau-Valldosera, J., & Minguillón, J. (2011). Redefining dropping out in online higher education: a case study from
the UOC. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp.
75-80). ACM.
Greller, W., & Drachsler, H. (2012). Translating Learning into Numbers: A Generic Framework for Learning
Analytics. Educational Technology & Society, 15(3), 42-57.
Jackson, G. & Read, M. (2012). Connect 4 Success: A Proactive Student Identification and Support Program.
Retrieved 12 November 2014 from http://fyhe.com.au/past_papers/papers12/Papers/9B.pdf
Johnson, L., Adams Becker, S., Estrada, V., Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education
Edition. Austin, Texas: The New Media Consortium.
Johnson, L., R. Smith, H. Willis, A. Levine, and K. Haywood. (2011). The 2011 Horizon Report. Austin, TX: The
New Media Consortium.
Jones, K., Thomson, J., and Arnold, K. (2014). Questions of Data Ownership on Campus. EDUCAUSE Review.
Retrieved 3rd December 2014 from http://www.educause.edu/ero/article/questions-data-ownership-campus
Knight, S., Buckingham Shum, S., and Littleton, K. (2013). Epistemology, pedagogy, assessment and learning
analytics. In: Third Conference on Learning Analytics and Knowledge (LAK 2013) (pp. 75-84). New York,
USA: ACM.

Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Byers, A. H. (2011). Big data: The next
frontier for innovation, competition, and productivity.
Mostow, J., & Beck, J. (2006). Some useful tactics to modify, map and mine data from intelligent tutors. Natural
Language Engineering, 12(02), 195-208.
Open University of England. (2014). Policy on Ethical Use of Student Data for Learning Analytics. Retrieved 03
December 2014 from
http://www.open.ac.uk/students/charter/sites/www.open.ac.uk.students.charter/files/files/ecms/web-
content/ethical-use-of-student-data-policy.pdf
Pea, R. (2014). The Learning Analytics Workgroup: A report on building the field of learning analytics for
personalized learning at scale. Retrieved 20 November 2014 from
https://ed.stanford.edu/sites/default/files/law_report_complete_09-02-2014.pdf
Prinsloo, P., & Slade, S. (2013). An evaluation of policy frameworks for addressing ethical considerations in
learning analytics. In Proceedings of the Third International Conference on Learning Analytics and
Knowledge (LAK 2013) (pp. 240-244). New York, USA: ACM.
Romero, C., & Ventura, S. (2010). Educational data mining: a review of the state of the art. Systems, Man, and
Cybernetics, Part C: Applications and Reviews, IEEE Transactions on, 40(6), 601-618.
Siemens, G. (2010). What are Learning Analytics? Retrieved 7 November 2014 from
http://www.elearnspace.org/blog/2010/08/25/what-are-learning-analytics/.
Siemens, G. (2012). Learning Analytics: Envisioning a Research Discipline and a Domain of Practice. In
Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK 2012) (pp.
4-8). New York, USA: ACM.
Singer, N. (2014). ClassDojo adopts Deletion Policy for Student Data. Retrieved 29 November 2014 from
http://bits.blogs.nytimes.com/2014/11/18/classdojo-adopts-deletion-policy-for-student-
data/?partner=rss&emc=rss&_r=2
Slade, S., & Prinsloo, P. (2013). Learning analytics: ethical issues and dilemmas. American Behavioral
Scientist, 57(10), 1509-1528.
Society for Learning Analytics Research. (2011). Open Learning Analytics: an integrated & modularized platform
Retrieved 01 November 2014 from http://solaresearch.org/OpenLearningAnalytics.pdf
Taraghi, B., Ebner, M., Saranti, A., & Schön, M. (2014). On using markov chain to evidence the learning structures
and difficulty levels of one digit multiplication. In Proceedings of the Fourth International Conference on
Learning Analytics And Knowledge (LAK14) (pp. 68-72). New York, USA: ACM.
Vorhies, D. W., & Morgan, N. A. (2005). Benchmarking marketing capabilities for sustainable competitive
advantage. Journal of Marketing, 69(1), 80-94.
Vozniuk, A., Govaerts, S., & Gillet, D. (2013). Towards portable learning analytics dashboards. In Advanced
Learning Technologies (ICALT), 2013 IEEE 13th International Conference on (pp. 412-416). IEEE.
Waterman, K. K., & Bruening, P. J. (2014). Big Data analytics: risks and responsibilities. International
Data Privacy Law, 4(2), 89-95.

PART 2 DESIGN AND DEVELOPMENT

4. A Classification Matrix of Examination Items to Promote Transformative
Assessment
Mark McMahon
Edith Cowan University
Australia
m.mcmahon@ecu.edu.au

Michael Garrett
Cinglevue International
Australia
michael.garrett@cinglevue.com
Introduction
An important way of transforming learning is through a focus on the learning outcomes to be achieved,
which then informs the content and processes involved (Gagné, 1972). Identifying outcomes allows learning
treatments to be varied in accordance with content, instructional procedures to be related from one subject to
another, and assessment techniques to be made relevant to a given domain (Gagné, 1972). Commonly distinguished domains of
learning include the cognitive domain, affective domain, and psychomotor domain (Driscoll & Driscoll, 2005;
Jonassen, Tessmer, & Hannum, 1999), of which the cognitive domain is most relevant to this research.
is to provide tools for the classification of outcomes, which can then inform approaches to learning and assessment.
The ultimate goal is to develop a computer system that can autonomously review instructional items and provide
feedback as to their appropriateness in terms of the level and nature of the outcomes to be achieved. A taxonomy of
learning outcomes will form the basis for this system as taxonomies provide classes of overt performance or covert
cognitive attributes that characterise instructional tasks (Jonassen et al., 1999).

Taxonomies of Learning Outcomes

A taxonomy is a hierarchical classification scheme used to organise objects or phenomena into categories
(Jonassen et al., 1999). The key tenet of taxonomies is their hierarchical nature, where classifications at the top of
the taxonomy are more general, inclusive, or complex, subsuming classifications at lower levels (Jonassen &
Grabowski, 2012). Similarly, behaviours or outcomes at lower classification levels of the taxonomy are prerequisite
to, and less complex than, those at the higher levels. Educators employ taxonomies to classify the learning goals as
well as the assessment items which are used to determine whether these goals are being met across the full range of
outcomes (Jonassen & Grabowski, 2012).
In order to establish specific criteria to inform the autonomous classification of instructional statements, a
range of learning taxonomies were selected for evaluation based on their ability to accommodate a breadth of
cognitive learning outcomes. This included Bloom's Taxonomy, Gagné's Taxonomy, Merrill's Performance-Content
Matrix, the Structure of Observed Learning Outcome (SOLO) taxonomy, and the Revised Bloom's Taxonomy.

Bloom's Taxonomy of Learning Outcomes

Bloom's Taxonomy of Learning Outcomes (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956) describes a
range of cognitive behaviours involving the recall or use of knowledge found in educational testing (Jonassen &
Grabowski, 2012). The taxonomy was intended to serve as a standardised basis for adhering to specific standards
and ensuring educational congruence at unit, course, and curriculum levels, as well as a common language for
facilitating communication (Krathwohl, 2002). Bloom's Taxonomy features six major categories in the cognitive
domain which are ordered in a cumulative hierarchy from simple and concrete to complex and abstract under the
assumption that each simpler category is prerequisite to mastery of the next more complex one, as detailed in Table
1 (Bloom et al., 1956; Huitt, 2004; Jonassen & Grabowski, 2012; Krathwohl, 2002).

Table 1. Structure of cognitive classifications for Bloom's Taxonomy of Learning Outcomes

Classification Description
Knowledge Recall of facts, terminology, and methodology without any expectation of understanding or comprehension
Comprehension Elementary understanding and use of knowledge
Application Abstraction of rules or generalisations from knowledge which is applied to solve a related problem
Analysis Deconstructing a knowledge domain to identify its component elements and relationships between them
Synthesis Originating, integrating, and combining ideas into a new product, plan, or proposal
Evaluation Appraising, assessing or critiquing on a basis of specific standards or criteria

Many of the above contain sub-classifications that allow a greater level of granularity, such that Knowledge
for example can involve knowledge of facts, ways of dealing with those facts, and knowledge of universal concepts
that can be abstracted from them.
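As a first illustration of the autonomous classification this research aims at, the following sketch matches the words of an instructional statement against indicator verbs for each of Bloom's levels. The verb lists are illustrative only, not an agreed standard, and a production system would need far more sophisticated natural-language analysis than keyword matching.

```python
# Sketch of verb-based classification of objectives against Bloom's
# taxonomy. The indicator verb lists are hypothetical illustrations.

BLOOM_VERBS = {
    "Knowledge":     {"define", "list", "recall", "state"},
    "Comprehension": {"explain", "summarise", "describe"},
    "Application":   {"apply", "solve", "use", "demonstrate"},
    "Analysis":      {"analyse", "compare", "deconstruct"},
    "Synthesis":     {"design", "compose", "construct", "propose"},
    "Evaluation":    {"evaluate", "judge", "critique", "appraise"},
}


def classify(objective: str) -> str:
    """Return the Bloom level of the first indicator verb found in the
    objective, or 'Unclassified' if no verb matches."""
    words = objective.lower().replace(",", " ").split()
    for word in words:
        for level, verbs in BLOOM_VERBS.items():
            if word in verbs:
                return level
    return "Unclassified"


if __name__ == "__main__":
    print(classify("Students will design a relational database schema"))
    print(classify("List the six levels of Bloom's taxonomy"))
```

Such a matcher could serve as a baseline against which richer classifiers, built on the taxonomies reviewed below, are compared.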

Gagné's Taxonomy of Learning Outcomes

Gagné's Taxonomy of Learning Outcomes (Gagné, 1985) is a component of his Theory of Instruction,
which encompasses conditions of learning and events of instruction within an instructional psychology context. The
cognitive domain of the taxonomy identifies different levels of learning for the purpose of sequencing instruction, on
the assumption that instruction should begin with the simplest skills and proceed hierarchically to greater levels of
difficulty, as evidenced in Table 2 (Driscoll & Driscoll, 2005; Gagné, 1985; Jonassen & Grabowski, 2012; Jonassen
et al., 1999).

Table 2. Structure of cognitive classifications for Gagné's Taxonomy of Learning Outcomes

Classification Description
Verbal Information State or recall information without understanding or applying it
Concrete Concepts Discriminate between concepts without extensive awareness of the basis for classification
Defined Concepts Classify new examples of events or ideas through understanding of defining characteristics
Rules Statement of relationships between two or more concepts, often causational. Using rules implies the
application of these relationships in new situations
Higher order rules More general statements of relationships, usually referred to as principles. The use of higher order rules
is similar to problem-solving, requiring the learner to select, interpret, and apply appropriate rules
Cognitive Strategy Originate systems or techniques for solving problems or for acquiring new information, including
learning how to learn

Merrill's Performance-Content Matrix

Merrill's Performance-Content Matrix (Merrill, 1983) classifies cognitive outcomes as part of Component
Display Theory. This Theory posits that instructional outcomes, represented either by objectives or test items, can be
classified according to student performance as well as subject matter content (Merrill, 1994). Student performance
classifications include Remember-Instance, Remember-Generality, Use, and Find, while subject matter content
classifications incorporate facts, concepts, procedures, and principles. Learning outcomes can thus be classified
using two separate dimensions and in multiple cells of the Performance-Content Matrix, as detailed in Table 3
(Jonassen & Grabowski, 2012; Merrill, 1983; Merrill, 1994).

Table 3. Classifications for Merrill's Performance-Content Matrix

Performance Description
Classifications
Remember-Instance Search memory in order to reproduce or recognise an example or specific illustration of an object,
event, process, or procedure previously known
Remember-Generality Search memory in order to reproduce or recognise a statement of a definition, principle, or steps in
a procedure previously known
Use Apply some abstraction to a specific case
Find Derive or invent a new abstraction
Content Classifications
Fact Arbitrarily associated pieces of information such as a proper name, a date, an event, the name of a
place, or the symbols used to name particular objects, parts, or events
Concept Groups of objects, events, or symbols that all share common characteristics and that are identified
by the same name
Procedure Ordered sequences of steps necessary for accomplishing some goal, solving a particular class of
problems, or producing some product
Principle Cause-and-effect, causational, or correlational relationships that are used to interpret events or
processes and provide explanations or predictions

Structure of the Observed Learning Outcome

The Structure of the Observed Learning Outcome (SOLO) taxonomy (J. B. Biggs & Collis, 1982) classifies
the quality of learning outcomes according to the cognitive processes required to obtain them (Brabrand & Dahl,
2007). SOLO emphasises the observation of student learning cycles to describe the structural complexity of a
particular response to a learning situation using five different levels: prestructural, unistructural, multistructural,
relational, and extended abstract (J. Biggs, 1979; Thompson, Luxton-Reilly, Whalley, Hu, & Robbins, 2008). This
widely applicable framework is most often employed where learners must derive the meaning of a finite display of
information and make judgements about it, such as the structure of essays, answers to technical questions, medical
diagnoses, or students' accounts of their reading (J. Biggs, 1979; Imrie, 1995).
Much like the other taxonomies described previously, SOLO is a hierarchical taxonomy where each
successive classification level increases in complexity as learners integrate greater detail into their responses and
make connections between them to form structures, ultimately leading to generalisation and abstraction (Brabrand &
Dahl, 2009). Table 4 details the classifications employed by the SOLO taxonomy along with example learning tasks
which could be generated for each level (J. Biggs, 1989; J. B. Biggs & Collis, 1982; Brabrand & Dahl, 2007; Chan
et al., 2002; Ramsden, 1991).

Table 4. Structure of classifications for the SOLO taxonomy

1. Prestructural: Use of irrelevant information or no meaningful response, suggesting that the learner is distracted, misled, or has not understood the point.
2. Unistructural: Answer focuses on one relevant aspect only.
3. Multistructural: Answer focuses on several relevant features, but they are not integrated or coordinated together.
4. Relational: Several parts are integrated into a coherent, structured, and meaningful whole. Details are linked to conclusions.
5. Extended abstract: Answer generalises the structure beyond the information given. Higher order principles are used to accommodate new and more abstract features.

Revised Bloom's Taxonomy

The Revised Bloom's Taxonomy (Anderson et al., 2001) updated Bloom's Taxonomy to reflect advances in
the understanding of teaching and learning processes (Pickard, 2007). The most noticeable refinement in the Revised
Bloom's Taxonomy involves the structuring of educational objectives across two dimensions: the Knowledge
dimension and the Cognitive-Process dimension (Amer, 2006), separating the subject matter content and cognitive
performance elements of each knowledge category (Krathwohl, 2002). This division is based on the assumption that
learning outcomes are usually framed in terms of (a) some subject matter content, and (b) a description of what is to
be done with or to that content (Krathwohl, 2002). Statements of learning objectives can thus be considered in terms
of a noun or noun phrase which describes the subject matter content, and a verb or verb phrase which details the
cognitive process or processes employed (Krathwohl, 2002).
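This noun/verb decomposition can be illustrated with a minimal sketch. The function, its name, and the regular expression are our own illustrative assumptions for simply phrased objective statements; they are not part of the taxonomy itself:

```python
import re

# Illustrative sketch: split a learning-objective statement into a verb
# (cognitive process) and a noun phrase (subject matter content), assuming
# the statement is an optional "Students will" style prefix, a single verb,
# and then the content. The function name is hypothetical.
def decompose_objective(statement):
    # Strip an optional "Students/learners will ..." prefix.
    stripped = re.sub(r"^(the\s+)?(students?|learners?)\s+will\s+", "",
                      statement.strip(), flags=re.IGNORECASE)
    verb, _, noun_phrase = stripped.partition(" ")
    return verb.lower().rstrip("."), noun_phrase.rstrip(".")

print(decompose_objective("Students will classify chemical elements."))
# ('classify', 'chemical elements')
```

A real classifier would need part-of-speech tagging to handle multi-word verb phrases, but the split illustrates Krathwohl's (2002) point that the verb carries the cognitive process and the noun phrase the content.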
The Knowledge dimension of the Revised Bloom's Taxonomy embraces the distinctions of cognitive and
educational psychology that have evolved since the original framework was devised (Krathwohl, 2002). The revised
taxonomy also incorporates new learner-centred learning paradigms into its structure, such as constructivism,
metacognition, and self-regulated learning, through the addition of the Metacognitive Knowledge category (Amer,
2006). The structure of the Knowledge dimension of the Revised Bloom's Taxonomy is detailed in Table 5
accordingly (Anderson et al., 2001; Krathwohl, 2002).

Table 5. Structure of the Knowledge dimension of the Revised Bloom's Taxonomy

Factual knowledge: The basic elements that learners must know to be acquainted with a discipline or solve problems in it, including sub-classifications of knowledge such as knowledge of terminology or specific details and elements
Conceptual knowledge: The interrelationships among the basic elements within a larger structure that enable them to function together. Sub-classifications include knowledge of categories, principles and generalisations, and theories, models, and structures.
Procedural knowledge: How to do something, methods of enquiry, and criteria for using skills, algorithms, techniques, and methods. Sub-classifications of knowledge relate to subject-specific skills and algorithms, techniques and methods, and criteria for determining appropriate procedures for a given context.
Metacognitive knowledge: Knowledge of cognition in general, as well as awareness and knowledge of one's own cognition, including strategic knowledge and knowledge of cognitive tasks appropriate to a given context.

The Cognitive-Process dimension of the Revised Bloom's Taxonomy adds hierarchical levels, featuring the
same number of categories as Bloom's Taxonomy but renamed using verbs which emphasise meaningful active
processing of new knowledge (Pickard, 2007). This dimension maintains the growth in complexity of the processes
but the renaming allows sub-classifications to overlap between the categories, such that cognitive processes associated
with Understand (e.g. Explaining) can be considered more cognitively complex than some of those associated with
Apply (e.g. Executing), for example (Krathwohl, 2002; Pickard, 2007). The structure of the Cognitive-Process
dimension of the Revised Bloom's Taxonomy is detailed in Table 6 (Anderson et al., 2001; Krathwohl, 2002).

Table 6. Structure of the Cognitive-Process dimension of the Revised Bloom's Taxonomy

Remember: Retrieving relevant knowledge from long-term memory, demonstrated through sub-classifications such as recognising or recalling.
Understand: Determining the meaning of instructional messages, including oral, written, and graphical communication, shown by interpreting, exemplifying, classifying, summarising, inferring, and so on.
Apply: Executing or implementing a procedure in a given situation.
Analyse: Breaking material into its constituent parts and detecting how the parts relate to one another and to an overall structure or purpose through a process of differentiating, organising, or attributing.
Evaluate: Making judgements based on criteria and standards by checking and/or critiquing them.
Create: Putting elements together to form a novel coherent whole or make an original product. Sub-classifications include generating, planning, and producing.

With its two dimensions, the Revised Bloom's Taxonomy can be expressed in tabular form, where any
educational objective can be classified in the Taxonomy Table in one or more cells using the nouns and verbs which
constitute the subject matter content and description of what is to be done with or to that content, respectively
(Krathwohl, 2002).

Instructional Activity Matrix for the Classification of Statements Describing Instructional Activities

In deriving a conceptual foundation to inform the autonomous classification of instructional activities, it
was clear that explicit criteria were required at a granular level to accommodate a wide assortment of learning
outcomes, learning tasks, and assessment tasks. A two-dimensional classification matrix comprising a Cognitive-
Process Dimension and a Knowledge Dimension was envisioned, consistent with Merrill's Performance-Content
Matrix and the Revised Bloom's Taxonomy. The matrix structure provided a hierarchy of classifications in each
dimension, progressing from concrete to abstract, and lower order to higher order thinking respectively.

The Cognitive-Process Dimension

The Cognitive-Process Dimension of the Instructional Activity Matrix was established via synthesis of the
cognitive elements described in Bloom's Taxonomy, Gagne's Taxonomy, Merrill's Performance Content Matrix, the

SOLO taxonomy, and the Revised Bloom's Taxonomy (Anderson et al., 2001; J. Biggs, 1979, 1989; J. B. Biggs &
Collis, 1982; Bloom et al., 1956; Driscoll & Driscoll, 2005; Gagne, 1985; Imrie, 1995; Jonassen & Grabowski,
2012; Krathwohl, 2002; Merrill, 1983, 1994). As detailed in Table 7, six separate cognitive
classifications were designated: (1) Remember, (2) Understand, (3) Apply, (4) Analyse, (5) Evaluate, and (6)
Create. This was consistent with the Revised Bloom's Taxonomy but also encompassed equivalent cognitive aspects
described in the other taxonomies.

Table 7. Classifications for the Cognitive-Process Dimension of the Instructional Activity Matrix

1. Remember: Retrieve information from memory
   Revised Bloom's Taxonomy (Remember): Retrieve relevant knowledge from long-term memory
   Bloom's Taxonomy (Knowledge): Recall what has been learned
   Gagne's Taxonomy (Verbal Information): Recite or recall inert knowledge isolated in memory
   SOLO (Uni-structural): Responding to one relevant aspect in the domain, with no deeper level of understanding required
   Performance-Content Matrix (Remember): Recall some item of information already known

2. Understand: Derive meaning through understanding of defining characteristics and interrelationships between common components
   Revised Bloom's Taxonomy (Understand): Determine the meaning of instructional messages
   Bloom's Taxonomy (Comprehension): Understand learned material
   Gagne's Taxonomy (Concrete Concepts): Distinguish between members and non-members of a concept without extensive awareness of the basis for classification
   Gagne's Taxonomy (Defined Concepts): Classify new concepts through understanding of defining characteristics
   SOLO (Multi-structural): Response in terms of relevant features, but with no coordination, integration, or connection between them (structure unknown)
   SOLO (Relational): Response requires parts to be integrated together so that the whole has a coherent meaning or structure
   Performance-Content Matrix (Use): Use features to distinguish concepts; use definitions to classify new examples

3. Apply: Employ rules to solve related problems through abstraction of existing understanding
   Revised Bloom's Taxonomy (Apply): Carry out a procedure in a given situation
   Bloom's Taxonomy (Application): Use learned material in a task
   Gagne's Taxonomy (Rule): Apply statements which identify cause-and-effect relationships between two or more concepts in a new situation
   SOLO (Relational): Response requires the abstraction of structural knowledge describing component interrelationships to a related procedure
   Performance-Content Matrix (Use): Use rules to solve related problems

4. Analyse: Analyse the relationships between components to derive coherent meaning and identify generalised statements and principles
   Revised Bloom's Taxonomy (Analyse): Break material into parts and detect how the parts relate to one another and to an overall structure or purpose
   Bloom's Taxonomy (Analysis): Break up information logically into its component parts
   Gagne's Taxonomy (Higher Order Rule): Derive generalised statements of relationships and principles through analysis in order to solve new or complex problems
   SOLO (Relational): Responses require analysis of the interrelationships between components in order to identify a coherent and meaningful structure
   Performance-Content Matrix (Use): Use the nature of relationships between components to derive general principles

5. Evaluate: Evaluate the value of material using insight into related concepts
   Revised Bloom's Taxonomy (Evaluate): Make judgements based on criteria and standards
   Bloom's Taxonomy (Evaluation): Evaluate the worth of something for a given purpose
   Gagne's Taxonomy (Higher Order Rule): Employ generalisations and principles to make judgements and evaluations
   SOLO (Relational): Make judgements in comparison to the structure and meaning of relevant known concepts
   Performance-Content Matrix (Use): Use understanding of related concepts to make judgements

6. Create: Create original material through combination and extension of existing generalisations to derive new abstractions
   Revised Bloom's Taxonomy (Create): Put elements together to form a novel, coherent whole or make an original product
   Bloom's Taxonomy (Synthesis): Structure information to form a new pattern or whole
   Gagne's Taxonomy (Cognitive Strategy): Originate systems or techniques for solving problems and acquiring new information
   SOLO (Extended abstract): Interrelated elements are seen as an instance of a more general case and extended to new contexts
   Performance-Content Matrix (Find): Derive or invent a new abstraction

The Knowledge Dimension

The Knowledge Dimension of the Instructional Activity Matrix was derived by combining the knowledge
components of Merrill's Performance-Content Matrix and the Revised Bloom's Taxonomy. Five separate
classifications were designated: (a) factual knowledge, (b) conceptual knowledge, (c) procedural knowledge, (d)
principle knowledge, and (e) metacognitive knowledge, as detailed in Table 8. Elements of the conceptual
knowledge classification in the Revised Bloom's Taxonomy were separated to form a distinct principle knowledge
classification consistent with Merrill's Performance-Content Matrix. This approach allows for finer-grained
classification, where knowledge concerning principles that would otherwise fall under the conceptual knowledge
classification of the Revised Bloom's Taxonomy is instead classified using the principle knowledge classification of
the Instructional Activity Matrix. The Instructional Activity Matrix thus provides educators with the ability to
clearly distinguish between conceptual knowledge and principle knowledge, which is significant given that the
conceptual knowledge classification of the Revised Bloom's Taxonomy is the broadest of the knowledge categories
and most of the words in any language identify concepts.
Important distinctions were also made between individual parts of an entity (factual knowledge), and their
related defining characteristics (conceptual knowledge). Knowledge which concerned relationships was similarly
distinguished according to whether it concerned interrelationships between the components of an entity (conceptual
knowledge), or correlation and causation between entities (principle knowledge).

Table 8. Classifications for the Knowledge Dimension of the Instructional Activity Matrix

a. Factual
   Arbitrarily associated information, including proper names, dates, events, names of places, and the symbols used to name particular objects, parts, or events
   Individual items, parts, elements, or features that are components or constituents of a whole
   Terminology, systems of terms, and specific terms
   Specific details
b. Conceptual
   Groups, collections, assemblages, clusters, and aggregations of objects, events, or symbols that share some common defining characteristics or are related in some way, and that are identified by the same name. Most of the words in any language identify concepts.
   Classifications, nomenclatures, classes, taxonomies, categories, and general or comprehensive divisions
   Patterns, distinctive forms, and consistent or characteristic arrangements of combined qualities, acts, or tendencies
   Interrelationships amongst the basic elements within a larger structure that enable them to function together
   Structure, arrangement, or organisation of component items, parts, elements, or features
c. Procedural
   Domain-specific techniques and methods, specialised procedures, and orderly or systematic sequences of steps necessary to accomplish a goal, solve a problem, or produce a product
   Domain-specific skills, abilities, and capabilities
   Algorithms and sets of rules for solving a problem
   Standards and criteria for determining when to use appropriate procedures
d. Principle
   Fundamental, primary, or general laws, rules, or truths from which others are derived
   Cause-and-effect, causational, or correlational relationships used to interpret events or processes
   Explanations, predictions, interpretations, and expositions as to why things happen
   Generalisations, abstractions, universalities, and general statements, ideas, or principles
   Theories, general propositions, and doctrines
e. Metacognitive
   Knowledge of cognitive tasks, self-knowledge, learning how to learn, and awareness of one's own cognitive capabilities
   Strategies, plans, and methods for undertaking cognitive activities and learning information
   Contextual and conditional knowledge which guides the use of strategies and the allocation of resources

The Instructional Activity Matrix

As with any taxonomy, the Instructional Activity Matrix operates according to a number of assumptions. It
assumes that one can characterise human knowledge and activity as discrete cognitive states (Jonassen et al., 1999)
and that such states can be identified, specified, and measured reliably and validly (Jonassen & Grabowski, 2012).
The difficulty in identifying which level of cognitive complexity a learner employs when engaging with a particular
instructional task must be acknowledged (Bloom et al., 1956), along with the fact that the actual cognitive process
that underpins a specific task will depend on the capabilities and prior experience of the individual undertaking that
task (Thompson et al., 2008).
With these assumptions in mind, the Instructional Activity Matrix for the classification of statements
describing instructional activities is presented in Figure 1. It provides 30 possible individual classifications for
categorising instructional activities, augmenting the 24 possible classifications of the Revised Bloom's Taxonomy
through the addition of the principle knowledge classification within the Knowledge Dimension. Each cell within
the Instructional Activity Matrix is the intersection of the Cognitive-Process and Knowledge Dimensions that
describes the specific cognitive processes and subject-matter content involved. Table 9 provides descriptions and
associated examples for each cell in the Instructional Activity Matrix.

Columns (Cognitive-Process Dimension): 1. Remember; 2. Understand; 3. Apply; 4. Analyse; 5. Evaluate; 6. Create
Rows (Knowledge Dimension):
a. Factual: 1a. Recall association; 2a. Specify features; 3a. Utilise fact; 4a. Determine features; 5a. Check factual accuracy; 6a. Generate factual representation
b. Conceptual: 1b. Recognise concept; 2b. Characterise concept; 3b. Enlist concept; 4b. Examine concept; 5b. Consider concept; 6b. Evolve concept
c. Procedural: 1c. Recall procedure; 2c. Clarify procedure; 3c. Execute procedure; 4c. Scrutinise procedure; 5c. Critique procedure; 6c. Devise procedure
d. Principle: 1d. Recognise principle; 2d. Explain principle; 3d. Relate principle; 4d. Investigate principle; 5d. Validate principle; 6d. Discover principle
e. Metacognitive: 1e. Recognise learning fundamentals; 2e. Comprehend learning processes; 3e. Implement learning strategy; 4e. Explore cognitive processing; 5e. Assess learning performance; 6e. Develop learning abstraction

Figure 1. Instructional Activity Matrix
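The structure of Figure 1 lends itself to a simple computational representation. The following sketch is our own illustrative encoding (the variable names are assumptions, not part of the paper) showing how the 30 cell codes arise as the product of the two dimensions:

```python
from itertools import product

# Illustrative sketch: encode the two dimensions of the Instructional
# Activity Matrix and enumerate its cells. A cell code joins a
# Cognitive-Process column (1-6) with a Knowledge row (a-e), as in Figure 1.
COGNITIVE_PROCESS = ["1. Remember", "2. Understand", "3. Apply",
                     "4. Analyse", "5. Evaluate", "6. Create"]
KNOWLEDGE = ["a. Factual", "b. Conceptual", "c. Procedural",
             "d. Principle", "e. Metacognitive"]

# Each cell is the intersection of one column and one row, e.g. "1a".
cells = [col[0] + row[0] for col, row in product(COGNITIVE_PROCESS, KNOWLEDGE)]

print(len(cells))  # 30 individual classifications
print(cells[:5])   # ['1a', '1b', '1c', '1d', '1e']
```

The 30 cells correspond to the 6 x 5 grid of the matrix, one more row than the Revised Bloom's Taxonomy Table owing to the added principle knowledge classification.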

Table 9. Cell identities within the Instructional Activity Matrix

1a. Recall association: Recall specific instances of information that are associated in some way. Example: Memorise the capital of Western Australia.
1b. Recognise concept: Recall familiar concepts from memory according to their defining characteristics. Example: What are the characteristics of a eucalyptus tree?
1c. Recall procedure: Remember ordered sequences of steps for a given procedure. Example: Students will repeat the process of changing a tyre to memorise the steps involved.
1d. Recognise principle: Recall familiar principles from memory. Example: What is the name of the force which attracts mass together?
1e. Recognise learning fundamentals: Recognise fundamental aspects of learning activity. Example: Recognise different strategies for retaining information.
2a. Specify features: Summarise the set of features or components of some entity. Example: Students will examine cutaway diagrams which detail the features of a submarine.
2b. Characterise concept: Understand a concept by way of its defining characteristics and interrelationships. Example: Classify the following adhesives according to their toxicity.
2c. Clarify procedure: Clarify the way in which a procedure works to achieve its purpose. Example: What is the purpose of photosynthesis?
2d. Explain principle: Explain a principle in terms of causal relationships. Example: Understand Newton's third law of motion.
2e. Comprehend learning processes: Understand various cognitive processes involved in learning and information processing. Example: Describe how you successfully learned to ride a bike.
3a. Utilise fact: Invoke factual knowledge to undertake some task. Example: Utilise your knowledge of state capitals to label a map of Australia.
3b. Enlist concept: Use knowledge of concepts to undertake some task. Example: Apply specific tools in the construction of wooden furniture.
3c. Execute procedure: Perform a procedure in order to achieve a particular outcome. Example: Students will explore how the quadratic formula can be used to solve quadratic equations.
3d. Relate principle: Relate principles and causal rules to the undertaking of a task. Example: Relate the law of gravity to the construction of simple machines.
3e. Implement learning strategy: Employ specific learning strategies to engage with a learning task. Example: Apply learning strategies to become proficient in an unfamiliar language.
4a. Determine features: Analyse an entity to determine its features. Example: Establish a catalogue of components for a carburettor.
4b. Examine concept: Deconstruct an entity to determine its defining characteristics and the way they are interrelated, or the reason for its classification. Example: Students will dissect frog and toad specimens in order to determine the differences between them.
4c. Scrutinise procedure: Deconstruct a procedure to determine its purpose and the way it works. Example: Students will inspect the steps used to separate chemical bonds via hydrolysis to create various chemical substances.
4d. Investigate principle: Analyse the way in which a principle can account for specific outcomes or behaviours. Example: How does ionic charge facilitate the creation of chemical compounds?
4e. Explore cognitive processing: Deconstruct cognitive processes to explore their role during a particular learning process. Example: Compare and contrast your experience with behaviourist and cognitivist learning approaches.
5a. Check factual accuracy: Verify the factual accuracy of information. Example: Evaluate the accuracy of diagrammatical representations of spiders.
5b. Consider concept: Evaluate concepts, their structure, defining characteristics, and classification. Example: Justify the periodic table as a means of classifying chemical elements.
5c. Critique procedure: Evaluate the value, merit, or suitability of a particular procedure. Example: Students will evaluate different scientific approaches used by forensic anthropologists to identify fossil remains.
5d. Validate principle: Justify the existence of a principle in terms of observable evidence, behaviours, or outcomes. Example: Assess whether the following case study supports or contradicts the laws of supply and demand.
5e. Assess learning performance: Monitor and evaluate learning performance in terms of the underlying cognitive processing, task efficiency, and the learning strategies that were employed. Example: Reflect on your progress in mathematics this year.
6a. Generate factual representation: Devise new ways of representing factual information. Example: Students will develop visual media to present Australian historical events between 1950 and 1960 in an engaging and thought-provoking manner.
6b. Evolve concept: Extend concepts, interrelationships, and classifications to new contexts. Example: Devise a new classification scheme for the sub-genres of electronic music.
6c. Devise procedure: Develop a new procedure for accomplishing a task. Example: Devise a procedure for randomly assigning participants to groups during a research project.
6d. Discover principle: Discover a new principle which can account for causal relationships. Example: Discover cause-and-effect relationships by designing and undertaking scientific experiments.
6e. Develop learning abstraction: Develop strategies for extending and abstracting existing knowledge and experience to new and unfamiliar learning contexts. Example: Produce a learning portfolio containing critical analysis and discussion of your learning activities throughout the semester.

Given the above, the process for classifying an instructional statement using the Instructional Activity
Matrix begins with the identification of cognitive and knowledge components. The language used to describe the
instructional statement can be analysed to identify terms which relate to specific categories within the Cognitive-
Process Dimension and Knowledge Dimension, where verbs and interrogative terms (i.e. who, what, where, why,
when, which, and how phrased in the form of a question) correspond to the former, while nouns and adjectives are
associated with the latter. From here, a final cell classification or classifications can be determined based on the
intersection of the associated columns within the Cognitive-Process Dimension and rows of the Knowledge
Dimension.
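As a sketch of how this process might be automated, the following hedged example matches verbs and interrogative terms to columns of the Cognitive-Process Dimension and nouns to rows of the Knowledge Dimension, then returns the intersecting cells. The keyword lists and the function are illustrative assumptions on our part, not the validated linguistic analysis the authors describe:

```python
# Illustrative keyword lists only; a real system would use far richer
# linguistic analysis. Column keys 1-6 and row keys a-e follow Figure 1.
COGNITIVE = {  # Cognitive-Process Dimension (columns)
    "1": {"memorise", "recall", "recognise", "what", "who", "when", "where"},
    "2": {"classify", "explain", "describe", "summarise", "why", "how"},
    "3": {"apply", "use", "execute", "implement", "solve"},
    "4": {"analyse", "compare", "examine", "investigate"},
    "5": {"evaluate", "justify", "assess", "critique"},
    "6": {"create", "devise", "design", "develop", "produce"},
}
KNOWLEDGE = {  # Knowledge Dimension (rows)
    "a": {"date", "name", "capital", "event", "fact"},
    "b": {"concept", "characteristics", "category", "classification"},
    "c": {"procedure", "steps", "method", "process"},
    "d": {"law", "principle", "cause", "effect", "theory"},
    "e": {"learning", "strategy", "metacognition", "reflection"},
}

def classify(statement):
    # Tokenise crudely and match tokens against each dimension's keywords.
    words = set(statement.lower().replace("?", "").replace(".", "").split())
    cols = {c for c, kws in COGNITIVE.items() if kws & words}
    rows = {r for r, kws in KNOWLEDGE.items() if kws & words}
    # Cell classification(s) = intersection of matched columns and rows.
    return sorted(c + r for c in cols for r in rows)

print(classify("Memorise the capital of Western Australia"))  # ['1a']
```

An empty result would correspond to the limitation noted below: statements lacking terms associated with a column or row cannot be classified by discrete term analysis alone.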

Conclusion
The Instructional Activity Matrix has been synthesised from an exploration of the literature around
commonly used taxonomies of learning outcomes that operate within the cognitive domain. This system classifies
instructional activities in terms of their cognitive-process and knowledge components and is broadly applicable to
instructional statements which describe learning outcomes, learning tasks, and assessment tasks. While the
Instructional Activity Matrix provides an informed basis for the classification of instructional statements, it is
important to note a number of potential limitations. Instructional statements may not contain terms which can clearly
be associated with a specific column or row within the Instructional Activity Matrix, making classification difficult
or impossible. Furthermore, relying on discrete analysis of the terms contained in an instructional statement may not
give due consideration to the overall meaning of the statement or contextual cues which provide a more accurate
indication as to the cognitive processes and types of knowledge involved.
Nevertheless, the Instructional Activity Matrix provides identities and descriptions across a range of
knowledge and cognitive processes, supporting meaningful and accessible classifications. The addition of a
Principle Knowledge category also allows for finer granularity of conceptual knowledge during classification. Given
that most words in a language identify concepts at some level, this provides a means of distinguishing between
knowledge of objects, events, or symbols that are identified by the same name and share common characteristics (conceptual knowledge),
and knowledge that explains or predicts why things happen according to cause-and-effect or correlational
relationships (principle knowledge).
While the Instructional Activity Matrix is not without its limitations, carefully considered use can provide
an informed basis for the classification of instructions that constitute common assessment tasks. This allows
educators to better understand the learning requirements for a particular learning outcome, learning task, or
assessment task in terms of the knowledge that students will need and what they will need to do with this knowledge
in order to engage with the learning activity. Future publications will report on efforts to validate the Instructional
Activity Matrix as well as its application as a theoretical basis for autonomously classifying large numbers of
instructional statements using linguistic analysis techniques.

References
Amer, A. (2006). Reflections on Bloom's revised taxonomy. Electronic Journal of Research in Educational
Psychology, 4(1), 213-230.
Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., &
Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's Taxonomy of
Educational Objectives. New York: Longman.
Biggs, J. (1979). Individual differences in study processes and the quality of learning outcomes. Higher Education,
8(4), 381-394. doi:10.2307/3446151
Biggs, J. (1989). Towards a model of school-based curriculum development and assessment using the SOLO
taxonomy. Australian Journal of Education, 33(2), 151-163.
Biggs, J. B., & Collis, K. F. (1982). The psychological structure of creative writing. Australian Journal of Education,
26(1), 59-70.
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational
objectives: Handbook I: Cognitive domain. New York: David McKay.
Brabrand, C., & Dahl, B. (2007). Constructive alignment and the SOLO taxonomy: A comparative study of university
competences in computer science vs. mathematics. In Proceedings of the Seventh Baltic Sea Conference on
Computing Education Research, Volume 88 (pp. 3-17). Darlinghurst, Australia: Australian Computer Society.
Brabrand, C., & Dahl, B. (2009). Using the SOLO taxonomy to analyze competence progression of university science
curricula. Higher Education, 58(4).
Chan, C. C., Tsui, M. S., Chan, M. Y. C., & Hong, J. H. (2002). Applying the Structure of the Observed Learning
Outcomes (SOLO) taxonomy on students' learning outcomes: An empirical study. Assessment & Evaluation
in Higher Education, 27(6), 511-527.
Driscoll, M. P. (2005). Psychology of learning for instruction (3rd ed.). Boston: Allyn & Bacon.
Gagne, R. M. (1972). Domains of learning. Interchange, 3, 1-8.
Gagne, R. M. (1985). The conditions of learning (4th ed.). New York: Holt, Rinehart & Winston.
Glaser, R. (1982). Instructional psychology: Past, present, and future. American Psychologist, 37(3), 292-305.
Heer, R. (2011). A taxonomy for learning, teaching, and assessing: A revision of Bloom's Taxonomy of Educational
Objectives. Center for Excellence in Learning and Teaching, Iowa State University. Retrieved from
http://www.celt.iastate.edu/pdfs-docs/teaching/RevisedBloomsHandout.pdf
Huitt, W. (2004). Bloom et al.'s taxonomy of the cognitive domain. Retrieved 18 November, 2014 from
http://www.gestaltdialektik.com/content/Factual_Knowledge_in_Vygotskyan_Terms.pdf
Imrie, B. W. (1995). Assessment for learning: Quality and taxonomies. Assessment & Evaluation in Higher Education,
20(2), 175.
Jonassen, D. H., & Grabowski, B. L. (2012). Handbook of individual differences, learning, and instruction. New York:
Routledge.
Jonassen, D. H., Tessmer, M., & Hannum, W. H. (1999). Task analysis methods for instructional design. Mahwah,
New Jersey: Lawrence Erlbaum.
Krathwohl, D. R. (2002). A revision of Bloom's Taxonomy: An overview. Theory Into Practice, 41(4), 212-218.
Merrill, M. D. (1983). Component Display Theory. In C. M. Reigeluth (Ed.), Instructional-design theories and
models: An overview of their current status (pp. 279-334). Hillsdale, New Jersey: Lawrence Erlbaum
Associates.
Merrill, M. D. (1994). Instructional design theory. Englewood Cliffs, New Jersey: Educational Technology
Publications.
Pickard, M. J. (2007). The new Bloom's Taxonomy: An overview for family and consumer sciences. Journal of Family
and Consumer Sciences Education, 25(1), 45-55.
Ramsden, P. (1991). Learning to teach in higher education. Psychology Press.
Thompson, E., Luxton-Reilly, A., Whalley, J. L., Hu, M., & Robbins, P. (2008). Bloom's taxonomy for CS assessment.
In Proceedings of the Tenth Conference on Australasian Computing Education, Volume 78 (pp. 155-161).
Darlinghurst, Australia: Australian Computer Society. Retrieved from
http://dl.acm.org/citation.cfm?id=1379249.1379265


5. Lessons Learned from Creation of an Exemplary STEM Unit for
Elementary Pre-Service Teachers:
A Case Study
Matthew Schmidt
University of Hawaii - Manoa
United States
matthew.schmidt@hawaii.edu

Lori Fulton
University of Hawaii - Manoa
United States
fultonl@hawaii.edu

Introduction

There is a current shortfall of highly qualified STEM educators at the elementary level (Congressional
Research Service, 2006; National Research Council, 2011). This is problematic because exposing students to STEM
topics early in their schooling increases the chances that those students will maintain interest and improve their
abilities in related areas. Teacher preparation programs should focus on training teachers to meet the demand for
qualified elementary STEM teachers, but existing programs are weak in STEM areas (Epstein & Miller, 2011). This
means many practicing teachers will lack subject matter expertise (Congressional Research Service, 2006;
Greenberg, McKee, & Walsh, 2013). We need to make changes to STEM teacher preparation programs, but this
poses challenges on many levels (see Atkinson & Mayo, 2010; Epstein & Miller, 2011; National Center for
Education and the Economy, 2007; National Research Council, 2007). Top-down changes are particularly
challenging; however, bottom-up reform efforts can empower individual educators to enact change within extant
STEM teacher education programs. Bybee (2010) forwards a model in which STEM education reform is initiated by
introducing short STEM instructional units into existing programs. These units can then serve as a model of
exemplary STEM education for instructors, administrators, and parents. This empowers educators to act as agents of
change.
In this spirit, the authors of this paper collaborated to create an exemplar moon investigation unit for a
university teacher education STEM class. This article serves as a case study, which was guided by the following
questions: 1) How does using an inquiry-based approach to teach science impact pre-service teachers' learning
experience of the moon investigation unit? and 2) How does the integration of technology in the moon investigation
unit impact pre-service teachers' learning experiences? For our project, we developed rapid prototypes of
technology-enhanced instructional materials (including websites, instructional videos, and multimedia presentations)
and evaluated each phase of our implementation.
Design approach.
We opted for a rapid prototyping (RP) design approach because it allows designers to reduce development
time and costs when creating instructional products (Desrosier, 2011; T. S. Jones & Richey, 2000; Tripp &
Bichelmeyer, 1990; Wilson, Jonassen, & Cole, 1993). Rapid prototyping provides a means to quickly clarify needs
without a protracted front-end analysis, can boost creativity, can result in a final product with fewer errors and better
usability, and can influence users' acceptance of the product. This approach was appropriate for our project since we
were implementing an instructional unit in a new way and did not have sufficient experience from which to draw.
Drawing from Wilson, Jonassen, and Cole (1993), RP aligned with our overarching design objectives in that it
allowed for both "testing the effectiveness and appeal of a particular instructional strategy" (p. 21.4) and quickly
developing a model instructional unit. We performed four cycles of RP consisting of four processes: 1) studying the
problem and planning the implementation, 2) building and implementing the instructional and technological
prototypes, 3) collecting and analyzing data related to usability and student performance, and 4) reflecting on our
findings so as to improve the prototypes in the next RP cycle. Our project's RP cycles and processes are illustrated
in Figure 1 below.
[Figure 1 diagram: four columns (Cycle 1 through Cycle 4) by four process rows. Study & Plan: moon investigation in Cycle 1; devise revisions in Cycles 2–4. Build & Implement: implement rapid prototypes in Cycle 1; amend rapid prototypes in Cycles 2–4. Collect & Analyze Data: first, second, and third 3-item student surveys in Cycles 1–3; external formative evaluation by graduate students in Cycle 4. Reflect: review data & discuss in every cycle. Cycle 4 concludes with formal data analysis and write-up of findings.]
Figure 1. Iterative cycles of design, implementation, evaluation, and reflection.


Pedagogical framework.
We adopted principles commonly used in high quality STEM programs to inform our pedagogical
framework (for an overview, see Hanover Research, 2011), including 1) engaging students in solving relevant and
practical problems, 2) implementing project-based and team-based learning, 3) utilizing advanced technologies, and
4) concentrating on deep and meaningful learning as opposed to rote memorization of facts. The instructional unit
selected was the moon investigation, which was based on the popular, multi-week inquiry-based science activity
described in Abell, Appleton, and Hanuscin (2010). In this activity, pre-service teachers (PSTs) learned about the
moon individually and in groups by observing it on a regular basis over the course of the semester, while at the same
time learning best practices for teaching inquiry-based science.
Inquiry-based science.
Focusing on deep and meaningful learning as opposed to rote memorization of facts is a hallmark of quality
STEM education. Inquiry-based science is one approach to facilitating such learning. Minner, Levy, and Century
(2010) describe inquiry as having "(1) the presence of science content, (2) student engagement with science content,
and (3) student responsibility for learning, student active thinking, or student motivation with at least one component
of instruction (question, design, data, conclusion, or communication)" (p. 5). Recently, the language of science has
shifted from an inquiry-based approach to "science as a practice" (Osborne, 2014, p. 177). While engaging students
in the practices of science is essential to developing a coherent science experience, elementary teachers claim to be
uncomfortable teaching science in this way due to their limited content and pedagogical knowledge, negative
experiences with science, management issues, and/or low levels of confidence (Appleton, 2007; Banilower, Heck, &
Weiss, 2007; Davis, Petish, & Smithey, 2006; Howes, Lim, & Campos, 2009). Due to this discomfort many teachers
avoid teaching science or rely on a textbook.
This discomfort extends to the topic of the moon, as the moon phases and the relationship between the
Earth, Sun, and Moon are challenging for adults and children alike (Lelliott & Rollnick, 2010; Mulholland & Ginns,
2008; Wisitsen, 2012). Because of this, we tailored our materials and classroom instruction to have a dual
concentration that promoted PSTs' understanding of both the underlying scientific concepts governing the phases of
the moon and best practices for implementing inquiry science activities in their own classrooms.
Technological literacy.
Quality STEM education hinges on the utilization of advanced technologies, both for facilitating the
learning process and for fostering students' technological literacy. Technological literacy is:
[T]he ability to effectively use technology (i.e., any tool, piece of equipment or device, electronic
or mechanical) to accomplish required learning tasks. Technology literate people know what the
technology is capable of, they are able to use the technology proficiently, and they make
intelligent decisions about which technology to use and when to use it. (Davies, 2011, p. 47)

Students develop technological literacy by engaging in authentic problem solving that embeds technological tools.
To promote technological literacy, computers can be used as mindtools to help learners engage in complex and
authentic problem solving (Jonassen, 2000, 2011; Jonassen, Howland, Moore, & Marra, 2003). This allows students
to think deeply about the content they are studying; "if they choose to use these tools, they will facilitate their
own learning and meaning-making processes" (Jonassen, Davidson, Collins, Campbell, & Haag, 1995, p. 20).
Methods

As with most case studies, the nature of our research was qualitative (Creswell, 2012; Merriam, 2007;
Stake, 1995). The purpose of our research was 1) to investigate the impact of a model STEM instructional unit on
PSTs' perceptions of inquiry-based science instruction, and 2) to methodically explore methods and processes
for meaningfully translating a traditional paper-and-pencil instructional unit into a multimedia-rich, technology-
infused unit that could be delivered in an inverted format. The questions that guided our research were: 1) How does
using an inquiry-based approach to teach science impact pre-service teachers' learning experience of the moon
investigation unit? and 2) How does the integration of technology in the moon investigation unit impact pre-service
teachers' learning experiences? Over one semester, we rapid prototyped instructional materials, implemented them,
evaluated them, and made incremental improvements based on evaluation findings. Data from field notes, open-
ended surveys, weekly debriefs, and an external formative evaluation were used to inform our progress, identify
avenues for improvement, and establish best practices.
Research was performed in an undergraduate STEM teacher preparation course at a large Pacific
university. Participants were 35 undergraduate students enrolled in the teacher preparation program (30 female, 5
male). Ages were between 20 and 25 years old. Data sources included field notes, minutes from weekly debriefs,
and surveys. Field notes were created during design meetings and after face-to-face classes. Surveys were delivered
using online forms. An external formative evaluation was conducted by students in a graduate-level instructional
technology course as well, focusing on: 1) technology tools used, 2) design of materials, and 3) instructional
strategies. Formative evaluation results were delivered to the researchers in the form of evaluation reports.
Data from surveys were analyzed using inductive and deductive axial coding (Charmaz, 2006; Strauss &
Corbin, 1998). The survey data were stored in a spreadsheet for sorting and analysis. We developed preliminary
codes for the data, reviewed the codes, discussed their definitions and how they should be applied, applied the codes
to a subset of the data, identified ambiguous or overlapping codes, and amended them. We then individually coded
the data. We established inter-rater reliability by coding each other's data and comparing codes. Reliability was computed
using a point-by-point estimate in which agreements (A) were divided by agreements plus disagreements (A+D) and
then multiplied by 100. The averaged inter-rater reliability was 95%, indicating high agreement between coders.
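The point-by-point estimate described above amounts to a one-line calculation; a minimal sketch (our illustration, not the authors' code; the function and variable names are hypothetical):

```python
def point_by_point_reliability(coder_a, coder_b):
    """Percent agreement: agreements (A) divided by agreements plus
    disagreements (A + D), multiplied by 100."""
    if len(coder_a) != len(coder_b):
        raise ValueError("both coders must code the same items")
    agreements = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return agreements / len(coder_a) * 100

# Example: 19 of 20 codes match, giving 95% agreement.
print(point_by_point_reliability(["inq"] * 19 + ["tech"], ["inq"] * 20))  # 95.0
```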
Coded data were categorized, summarized, and finally synthesized into three broad, overarching thematic
areas. Representative artifacts from the data sources were identified and reviewed. Themes and excerpts were then
organized, summarized and crafted into narrative.

Results

Three overarching themes emerged from coding: 1) challenges and successes related to engaging in inquiry
learning, 2) the multidimensional nature of meaningful technology integration, and 3) design principles for
transforming established practices for 21st century learners.

Challenges and Successes Related to Engaging in Inquiry Learning

PSTs were asked to observe the moon three times a week and to record information in a digital science
notebook. This challenged many of the PSTs, who were uncertain of what to record due to their perceived lack of
factual knowledge. One PST stated, "Describing the moon is difficult because we do not have any knowledge about
the moon." They wanted more explicit instruction, asking specifically for guidance on "where we have to be
standing exactly, and what time." Such a request removes the responsibility of the PSTs in the inquiry process and
reduces their ability to learn this information through their own observations. Another challenging aspect for PSTs
was engaging in the inquiry process. For many, this process was new and unsettling, as they were accustomed to
receiving information, leaving them unsure whether they were "really learning anything at all through the
observations," why "this assignment was ongoing for so long," and "how this will affect my grade." Further, students
wanted more instruction in applying this activity to themselves as teachers, specifically, how to take "a cool and
awesome topic like this [and] make a lesson that is universal."
Alongside the challenges were successes. It was evident in PSTs' comments that they were acquiring scientific
understandings, such as identifying the cycle of the moon: "I know the moon cycles now! When it is waning or
waxing. Full moon by Friday!" Some PSTs were surprised to learn that the moon rises in the morning at times.
Many PSTs commented about the unusual instructional approach we took with the assignment, saying such things
as, "It is different than anything I have done so far." While PSTs struggled with inquiry, they appreciated being
"forced to actually look at the moon on a regular basis." As one PST summed it up, "I like the fact that we get to
observe the moon in depth. I discover new things every time I do this assignment."

Multidimensional Nature of Meaningful Technology Integration

Pre-service teachers used technology and applied technology skills extensively, and these uses were generally
perceived positively. PSTs commented that they liked practicing with the technology tools and having their
information "available online for all to see and comment on." Some found the active approach was "a different, more
engaging way to learn about technology" and that it facilitated them "becoming an active science learner." Others were
challenged by their lack of technological fluency. One PST stated, "I am having major technical difficulty with the
video imports. I don't know how to blog with my group at all, and I don't know how to know if it got saved and they
see my comments when I do try to blog. I still to this day don't know if I blogged...." From an instructional
perspective, we used technology extensively to create and deliver instructional materials. Materials were created
using a rapid prototyping approach, with the assumption that we would make incremental changes to the materials
as we moved through the semester. This approach allowed us to bring our materials online quickly, but caused the
side-effect that they were sometimes inconsistent and confusing. We would make changes to new materials based
on what we were learning as we moved forward, and these changes were not always reflected in older materials.
Another side effect of making continual additions and amendments was that our website navigation became
confusing.

Design Principles for Transforming Established Practices for 21st Century Learners

Formative evaluation of the moon investigation assignment indicated that "the directions for the assignment
were clear" and the examples and background information in the instructional content "[were] relevant to the
instructional goals and context." Evaluators reported the module "used a variety of multimedia that aids the
instruction," was easy to access through mobile devices, and was "excellent in nature." In terms of degree of
difficulty, the module was found to be "appropriate for the learner." The instructional techniques employed were
found to "allow teachers to step in to the role of learners of science, which will enhance their ability to teach
science," as well as to provide sufficient practice and feedback opportunities. PSTs found the topic to be of interest to
them, commenting that they enjoyed observing the moon, found the assignment intriguing, and that it had
sparked their curiosity about the moon. They also commented on the ease of the assignment, stating, "It is not
very difficult to do." Although findings indicate the module was effective in many ways, there were aspects of the
module that need improvement. Both PSTs and external evaluators felt that the number of observations was
excessive. A second area was cultural relevance, as the moon plays a significant role in the Pacific island culture in
terms of navigation, farming, and fishing. We see opportunities in the future to build off of PSTs prior knowledge
and make connections to home, cultural, and community experiences. A third area on which to focus was site
design. While rapid prototyping allowed us to quickly build materials, it also led to difficulties, such as sections
being too wordy and confusing, instructions that were excessive and located in many different places, and
navigation that was not user friendly. Site design was a main area of concern that appeared several times in the
formative evaluation report as well.
Overall, the moon module received high marks, indicating that the transformation of an established practice
for studying the moon had been effective in meeting the needs of 21st century learners. This transformation process
will be strengthened through the identification and concentration on specific areas of improvement the next time we
implement it.

Discussion

In the following sections, we discuss our findings as a series of lessons learned and provide
recommendations for future enhancements.

Lessons Learned: Inquiry

Our vision was to create an instructional unit that was entirely inquiry-based and technology mediated.
While the inquiry approach was successful in fostering PSTs knowledge and understanding of the moon, many
struggled with the novelty of the inquiry process. Because of their lack of familiarity with the pedagogical approach,
they had little background for evaluating their learning progress. The many requests to teach facts and to provide
explicit instructions for exactly what to do for observing the moon provide evidence not only that PSTs struggled
with the inquiry approach, but also gives some indication of their epistemological beliefs. Epistemological beliefs
are generally conceived of as learners personal views on the nature of knowledge and knowing (Hofer & Pintrich,
1997; Sandoval, 2005). Learners epistemological beliefs generally fall into three stages of beliefs: 1) absolutist
beliefs, 2) relativist beliefs, and 3) pluralist/evaluative beliefs (Hofer & Pintrich, 1997; Kuhn, 1999; Schraw, 2001).
Drawing from Kuhn (1999), the absolutist view conceives of knowledge as coming from an external source, to be
objective, and to be certain; the relativist view sees knowledge as subjective and a matter of opinion; and the
pluralist/evaluative stance views knowledge as an ongoing process of evaluation and argument. PSTs' desire for facts
and explicit directions suggests that students held largely absolutist epistemologies.
PSTs may express a desire for facts, but providing them with direct instruction is counterproductive to
inquiry learning. Instead of directly teaching facts about a specific phenomenon such as moon phases, attention
should be given to the inquiry-based approach itself particularly if students are unfamiliar with it. Students need
guidance to grasp the principles of inquiry-based instruction and learning. Therefore, we contend that students
should acquire an understanding of inquiry through shorter, guided inquiries and instruction on inquiry as an
approach to learning prior to engaging in an extended, more open-ended inquiry. Opportunities to carry out shorter
inquiry activities could include such experiences as creating and exploring "coffee filter parachutes" or "moon
landers" from office supplies (see Riechert & Post, 2010). Building in opportunities for meaningful reflection (Abell
et al., 2010; Anderson, 2010) throughout these experiences as well as throughout the extended moon investigation
would provide students with the guidance they desire while allowing for concrete exploration and discussion of the
concept of inquiry, based on experience gained in a short amount of time. Shorter activities will reduce uncertainty
and increase trust in the approach since PSTs will be able to see the outcome of their work more quickly before
engaging with a longer activity like the moon investigation. Also, instructors should focus on identifying and
exploring PSTs' current epistemological beliefs (Chan, 2011; Fulmer, 2014). An awareness of PSTs' current beliefs
about knowledge provides a basis for exploring the nature of science. This can lead to a positive influence on PSTs'
overall perceptions of science (Fulmer, 2014). PSTs' epistemological beliefs can impact their future teaching
(Tanase & Wang, 2010) and can be influenced by culture. An awareness of how specific cultural elements might be
influencing PSTs' beliefs could be explored as part of the instructional sequence to promote higher stages of
epistemological beliefs in culturally relevant ways.

Lessons Learned: Technology Integration

Just using technology is not an integrated experience. For our project, we wanted to use technology both as
a conduit for learning and as a means of acquiring technological literacy. Having students use digital science
notebooks is one example of a successful and appropriate use of technology for the moon investigation activity.
However, not all uses of technology were equally appropriate. For example, some PSTs would use digital cameras
in their cell phones or tablet computers for taking pictures of the moon, which were low resolution and did not
sufficiently show moon phases. Others would download apps or use Google to locate information on moon phases
or location of the moon and report that information in their digital notebooks with no real understanding of what the
information meant. These examples show a lack of technological literacy, which is not only the ability to use
technology, but also to know when to choose one technology over another or even when to not use technology at all
(Davies, 2011).
Nowadays, technology is everywhere. We have instant access to information at any time. But having access
to information with no context to interpret it is not very useful. For instance, if a PST includes the azimuth and
altitude of the moon in a journal entry, but does not know what those figures or terms mean, then the information
itself is meaningless. We took many opportunities to model and explain appropriate technology use, but we found
that modeling and explanation were not enough to change PSTs' behaviors. Even when shown low-resolution
photographs or when they were challenged to explain the meaning of information copied from a Google search,
some PSTs continued to use these ineffective strategies. Explicit examples of effective and ineffective technology
use are needed, and these examples should be presented frequently throughout the instructional sequence. Presenting
examples of appropriate and inappropriate technology use to PSTs creates opportunities to collaboratively develop
competencies. One way this could be scaffolded for PSTs would be to have them perform the first moon observation
activity together as a whole class. This would allow them to not only explore concepts related to the moon (i.e.,
phase, visibility, azimuth, altitude) but also to explore strategies for using technology to record moon observations
such as the benefits and limitations of using a camera, using apps to search for information, etc.
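To illustrate the kind of interpretation at stake (this example is ours, not part of the unit), even a rough moon-phase estimate only becomes meaningful when the learner knows what the numbers represent. A minimal sketch using the mean synodic month and a well-known reference new moon date:

```python
from datetime import date

SYNODIC_MONTH = 29.530588          # mean length of the lunar cycle, in days
REFERENCE_NEW_MOON = date(2000, 1, 6)  # a documented new moon, used as the epoch

def moon_age_days(d: date) -> float:
    """Approximate days since the last new moon (ignores orbital eccentricity)."""
    return (d - REFERENCE_NEW_MOON).days % SYNODIC_MONTH

def coarse_phase(d: date) -> str:
    """Coarse phase label (new, waxing, or waning) from the moon's age."""
    age = moon_age_days(d)
    if age < 1.0 or age > SYNODIC_MONTH - 1.0:
        return "new"
    return "waxing" if age < SYNODIC_MONTH / 2 else "waning"
```

A PST who can connect such a calculation to what they actually observe in the sky is using technology as a mindtool, rather than copying unexplained figures from an app into a notebook.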
Lessons Learned: Instructional Design

Designing new instruction is complex and requires considerable planning, both in selecting which technologies
to use and in designing the instructional materials. We had no prior knowledge or experience upon which to base
our design decisions and technology selection since we had never taught this unit before. For this reason, we opted
for an RP approach so that we could make incremental improvements as we progressed through the semester. The
rapid prototyping approach provided a means for rapidly developing and iterating our designs, but it also caused the
user-friendliness of our digital instructional materials to suffer. The class website became cluttered and lacked a
consistent look and feel. On the one hand, we see this as an indication of constant improvement to our materials and
instructional approach. That is, the design of our materials became more refined and user-friendly over time, which
is why those materials we were building out at the end of the semester looked so different from those at the
beginning of the semester. On the other hand, the lack of consistency impacted the learning experience, making it
sometimes difficult to find things and to locate the latest, most accurate information. As noted in the literature on
rapid prototyping, feature creep and constantly adding to designs can impact usability and effectiveness (Tripp &
Bichelmeyer, 1990). In line with user experience heuristics (Nielsen & Loranger, 2006), simplifying the design of
the materials and the web interface should be a priority, as should ensuring consistency. This includes reliably
presenting the same types of information and media to users in similar ways. In addition to this, support could be
provided via formal support channels for PSTs such as online discussion forums and one-on-one, helpdesk-like
support.

Conclusion

If we as educators rely on top-down policy reform to shape STEM education, we may end up waiting for
Godot. Winning the future starts today. Acting as local change agents within our locus of control is no naïve fantasy.
Quality inquiry-based STEM instructional units can be successfully implemented in existing programs, but should
not be seen as a quick-fix. Good design is complex and challenging. Quickly creating prototypes and testing them
iteratively requires significant effort in terms of planning, implementation, and execution. As the case presented here
illustrates, the process is colored by successes and challenges. While the process can result in very effective designs,
it can also result in some components that fall short of expectations. It is therefore critical to ensure that failures are
manageable (Desrosier, 2011). However, as Brown (2008) notes, "The goal of prototyping isn't to finish. It is to
learn about the strengths and weaknesses of the idea and to identify new directions that further prototypes might
take" (p. 87).

Acknowledgements

This material is based upon work supported by the grant Project Laulima, funded by the Office of Special Education
Programs (OSEP). The authors gratefully acknowledge the assistance of Amelia Jenkins and Donna Grace, Co-
Principal Investigators, without whose help this work would not have been possible.

References

Abell, S. K., Appleton, K., & Hanuscin, D. L. (2010). Designing and teaching the elementary science methods
course. Routledge.
Anderson, R. D. (2010). Inquiry as an organizing theme for science curricula. In S. K. Abell & N. G. Lederman
(Eds.), Handbook of research on science education (pp. 807–830).
Appleton, K. (2007). Elementary science teaching. In S. K. Abell & N. G. Lederman (Eds.), Handbook of research
on science education. Mahwah, NJ: Lawrence Erlbaum Associates.
Atkinson, R. D., & Mayo, M. J. (2010). Refueling the U.S. innovation economy: Fresh approaches to science,
technology, engineering and mathematics (STEM) education (SSRN Scholarly Paper No. ID 1722822).
Rochester, NY: Social Science Research Network.
Banilower, E. R., Heck, D. J., & Weiss, I. R. (2007). Can professional development make the vision of the standards
a reality? The impact of the National Science Foundation's local systemic change through teacher
enhancement initiative. Journal of Research in Science Teaching, 44(3), 375–395.
Brown, T. (2008). Design thinking. Harvard Business Review, 86(6), 84.
Bybee, R. (2010). Advancing STEM education: A 2020 vision. The Technology and Engineering Teacher,
September, 30–35.
Chan, K.-W. (2011). Preservice teacher education students' epistemological beliefs and conceptions about learning.
Instructional Science, 39(1), 87–108.
Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. London: Sage.
Congressional Research Service. (2006). Science, technology, engineering, and mathematics (STEM) education
issues and legislative options (No. RL33434). Washington, D.C.
Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among five approaches. Sage.
Davies, R. S. (2011). Understanding technology literacy: A framework for evaluating educational technology
integration. TechTrends, 55(5), 45–52.
Davies, R. S., Dean, D. L., & Ball, N. (2013). Flipping the classroom and instructional technology integration in a
college-level information systems spreadsheet course. Educational Technology Research and Development,
61(4), 563–580. doi:10.1007/s11423-013-9305-6
Davis, E. A., Petish, D., & Smithey, J. (2006). Challenges new science teachers face. Review of Educational
Research, 76(4), 607–651.
Desrosier, J. (2011). Rapid prototyping reconsidered. The Journal of Continuing Higher Education, 59(3),
135–145. doi:10.1080/07377363.2011.614881
Dove, A., & Dove, A. (2013). Students' perceptions of learning in a flipped statistics class (Vol. 2013, pp.
393–398). Presented at the Society for Information Technology & Teacher Education International Conference.
Dugger, W. E. (2001). Standards for technological literacy. Phi Delta Kappan, 82(7), 513–517.
Eisenkraft, A. (2010). Retrospective analysis of technological literacy of K-12 students in the USA. International
Journal of Technology and Design Education, 20(3), 277–303. doi:10.1007/s10798-009-9085-9
Enfield, J. (2013). Looking at the impact of the flipped classroom model of instruction on undergraduate multimedia
students at CSUN. TechTrends, 57(6), 14–27. doi:10.1007/s11528-013-0698-1
Epstein, D., & Miller, R. T. (2011). Slow off the mark: Elementary school teachers and the crisis in science,
technology, engineering, and math education. Education Digest: Essential Readings Condensed for Quick
Review, 77(1), 4–10.
Fulmer, G. W. (2014). Undergraduates' attitudes toward science and their epistemological beliefs: Positive
effects of certainty and authority beliefs. Journal of Science Education and Technology, 23(1), 198–206.
doi:10.1007/s10956-013-9463-7
Fulton, L., & Campbell, B. (2014). Science notebooks: Writing about inquiry. Portsmouth, New Hampshire:
Heinemann.
Glaser, B. (2000). The future of grounded theory. Grounded Theory Review, 1, 1–8.
Greenberg, J., McKee, A., & Walsh, K. (2013). Teacher prep review: A review of the nation's teacher preparation
programs. National Council on Teacher Quality.
Hanover Research. (2011). K-12 STEM education overview. Washington, D.C.: Hanover Research.
Hofer, B. K., & Pintrich, P. R. (1997). The development of epistemological theories: Beliefs about knowledge and
knowing and their relation to learning. Review of Educational Research, 67(1), 88–140.
Howes, E. V., Lim, M., & Campos, J. (2009). Journeys into inquiry-based elementary science: Literacy practices,
questioning, and empirical study. Science Education, 93(2), 189–217.
Ingerman, Å., & Collier-Reed, B. (2011). Technological literacy reconsidered: A model for enactment. International
Journal of Technology and Design Education, 21(2), 137–148. doi:10.1007/s10798-009-9108-6
ITEA. (2007). Standards for technological literacy: Content for the study of technology (3rd ed.). International
Technology Education Association.
Jonassen, D. H. (2000). Computers as mindtools for schools: Engaging critical thinking. Upper Saddle River, NJ:
Merrill.
Jonassen, D. H. (2011). Learning to solve problems: A handbook for designing problem-solving learning
environments. New York: Routledge.
Jonassen, D. H., Davidson, M., Collins, M., Campbell, J., & Haag, B. B. (1995). Constructivism and computer-
mediated communication in distance education. American Journal of Distance Education, 9(2), 7–26.
Jonassen, D. H., Howland, J., Moore, J., & Marra, R. M. (2003). Learning to solve problems with technology: A
constructivist perspective. Upper Saddle River, NJ: Merrill.
Jones, A. (2013). The role and place of technological literacy in elementary science teacher education. In K.
Appleton (Ed.), Elementary science teacher education: International perspectives on contemporary
issues and practice. New York: Routledge.
Jones, T. S., & Richey, R. C. (2000). Rapid prototyping methodology in action: A developmental study. Educational
Technology Research and Development, 48(2), 63-80. doi:10.1007/BF02313401
Kuhn, D. (1999). A developmental model of critical thinking. Educational Researcher, 28(2), 16-46.
Lelliott, A., & Rollnick, M. (2010). Big ideas: A review of astronomy education research 1974-2008. International
Journal of Science Education, 32(13), 1771-1799. doi:10.1080/09500690903214546
Mason, G. S., Shuman, T. R., & Cook, K. E. (2013). Comparing the effectiveness of an inverted classroom to a
traditional classroom in an upper-division engineering course. IEEE Transactions on Education, 56(4),
430-435. doi:10.1109/TE.2013.2249066
Merriam, S. B. (2007). Qualitative research and case study applications in education (2nd ed.). Jossey-Bass.
Minner, D. D., Levy, A. J., & Century, J. (2010). Inquiry-based science instruction - what is it and does it matter?
Results from a research synthesis years 1984 to 2002. Journal of Research in Science Teaching, 47(4),
474-496.
Moravec, M., Williams, A., Aguilar-Roca, N., & O'Dowd, D. K. (2010). Learn before lecture: A strategy that
improves learning outcomes in a large introductory biology class. CBE Life Sciences Education, 9(4),
473-481. doi:10.1187/cbe.10-04-0063
Mulholland, J., & Ginns, I. (2008). College moon project Australia: Preservice teachers learning about the moon's
phases. Research in Science Education, 38(3), 385-399. doi:10.1007/s11165-007-9005-8
National Center on Education and the Economy. (2007). Tough choices or tough times: The report of the New
Commission on the Skills of the American Workforce. John Wiley & Sons.
National Research Council. (2007). Rising above the gathering storm: Energizing and employing America for a
brighter economic future. Washington, DC: National Academies Press.
National Research Council. (2011). Successful K-12 STEM education: Identifying effective approaches in science,
technology, engineering, and mathematics. Washington, DC: National Academies Press.
Nielsen, J., & Loranger, H. (2006). Prioritizing web usability. Pearson Education. Retrieved from
http://books.google.com/books?hl=en&lr=&id=YQsje6Ecl4UC&oi=fnd&pg=PR1&dq=jakob+nielsen&ots
=nWMBCFVsJ_&sig=ShoEvuW5XKY-KGrdsAO8c9-bvmQ
Pearson, G., Young, A. T., & others. (2002). Technically speaking: Why all Americans need to know more about
technology. Washington, DC: National Academies Press.
Pierce, R., & Fox, J. (2012). Vodcasts and active-learning exercises in a flipped classroom model of a renal
pharmacotherapy module. American Journal of Pharmaceutical Education, 76(10), 196.
doi:10.5688/ajpe7610196
Riechert, S. E., & Post, B. K. (2010). From skeletons to bridges & other STEM enrichment exercises for high school
biology. The American Biology Teacher, 72(1), 20-22.
Sanders, M. (2009). STEM, STEM education, STEMmania. The Technology Teacher, 68(4), 20-26.
Sandoval, W. A. (2005). Understanding students' practical epistemologies and their influence on learning through
inquiry. Science Education, 89(4), 634-656.
Schraw, G. (2001). Current themes and future directions in epistemological research: A commentary. Educational
Psychology Review, 13(4), 451-464.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage Publications.
Strauss, A., & Corbin, J. M. (1998). Basics of qualitative research: Techniques and procedures for developing
grounded theory. SAGE Publications.
Szalay, A., & Gray, J. (2006). 2020 computing: Science in an exponential world. Nature, 440(7083), 413-414.
Tanase, M., & Wang, J. (2010). Initial epistemological beliefs transformation in one teacher education classroom:
Case study of four preservice teachers. Teaching and Teacher Education, 26(6), 1238-1248.
Tripp, S. D., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational
Technology Research and Development, 38(1), 31-44.
Trundle, K. C., Atwood, R. K., & Christopher, J. E. (2002). Preservice elementary teachers' conceptions of moon
phases before and after instruction. Journal of Research in Science Teaching, 39(7), 633-658.
Wilson, B. G., Jonassen, D. H., & Cole, P. (1993). Cognitive approaches to instructional design. The ASTD
handbook of instructional technology, 4, 21.1-21.22.
Wisitsen, M. (2012). Moon phases. Planetarian, 41(4), 14-22.
Worth, K., Winokur, J., Crissman, S., Heller-Winokur, M., & Davis, M. (2009). The essentials of science and
literacy: A guide for teachers. Portsmouth, NH: Heinemann.

6. Fostering Active Knowledge Construction with the TEE-machine
Gregor Damnik
Antje Proske
Hermann Körndle
Psychology of Learning and Instruction, TU Dresden
Germany
gregor.damnik@tu-dresden.de
antje.proske@tu-dresden.de
hermann.koerndle@tu-dresden.de

Introduction
According to Jonassen (2001), high-quality knowledge acquisition in computer-based learning is only achieved if
learners actively use the provided technology. Jonassen (2001) criticizes that in the most common form of learning
from computers, technology is often used as a tutoring system with limited options to interact with the system. For
example, teachers present supplementary information in a computer-based learning environment to give learners the
possibility to reread and repeat parts of a lecture. Thus, learners remain mainly passive: instead of actively
processing information, for instance by independently clarifying problems or topics, they follow only the given ideas
and explanations of the teacher. Learning with computers, on the other hand, utilizes technological devices as
cognitive tools. Cognitive tools extend human minds and foster active knowledge construction. They enable learners
to express and externalize what they have learned and, thus, to produce well-organized representations of their
knowledge (Jonassen, 2001).
The purpose of this paper is to illustrate how a typical authoring tool, developed for multimedia authoring, can be
utilized in a Learners as Designers approach (Jonassen & Reeves, 1996) to foster active knowledge construction in
university courses.

Learners as Designers
The instructional approach Learners as Designers (LaD, Jonassen & Reeves, 1996) embraces the idea of learning
with computers. LaD encourages learners to design and produce learning materials for others. Compared to
traditional learning methods, several researchers have demonstrated benefits of LaD (e.g., increased motivation,
improved information literacy, and problem-solving ability; Lehrer, Erickson, & Connell, 1994; Liu & Rutledge,
1997). Another study explicitly addressed the gains of LaD for knowledge acquisition (Damnik, Proske, Narciss, &
Körndle, 2013). A designing group, which produced a computer-based learning environment, outperformed a text-
reader group, as well as a group working with a pre-structured learning environment, on transfer tasks.
Thus, compared to traditional learning methods, designers acquired more applicable knowledge.
The process of designing digital media in LaD consists of four general phases (Liu & Rutledge, 1997). First, learners
plan their learning material. During the planning phase, learners identify and explore the topic of the material to be
produced; they search for information and read source material provided by a teacher. Moreover, they make project
management decisions (e.g., determining tasks for each team member). During the next phase, called the design
phase, learners structure information by creating an outline for their learning material. In doing so, they select
necessary information and organize or reorganize this information in order to design a representation of the
important information. In this way, the learners make preliminary decisions (a) which information they will use for
their learning material, and (b) how this information will be arranged. In the third phase, the transformation phase,
learners transform previously selected and organized information into the planned material. They also produce new
information such as texts, pictures, or other representations in order to further develop their outline into a final
product. In doing so, they integrate their prior knowledge to construct examples, analogies, or summaries which can
be included in these representations. Lastly, in the revision phase, learners need to evaluate and modify their design
product. Therefore, in each phase of the LaD approach learners are actively engaged in designing knowledge rather
than in encoding information (Perkins, 1986). However, in order to successfully cope with all the demands
accompanying the different phases, students need to be supported. One possibility of this support is the application
of cognitive tools.

TEE-machine
The Electronic Exercise (TEE-machine) is an authoring tool for creating web-based learning environments (Krauße
& Körndle, 2005). Authoring tools are designed to enable teachers to construct digital learning materials without
extensive programming knowledge (Ritter, Blessing, & Wheeler, 2003). A prototype of the TEE-machine
was developed on the basis of the theory of knowledge structures (Albert & Lukas, 1999). This tool especially
supported teachers in organizing information for learners. To accomplish this, the teachers first divided a knowledge
domain into several knowledge units. Second, they drew relations (e.g., arrows) among these units to indicate which
knowledge unit constitutes necessary prior knowledge for understanding the content of another knowledge unit (for
details see Krauße & Körndle, 2005). Afterwards, teachers filled each knowledge unit with several materials (e.g.,
text, pictures, or videos) for example, by using existing documents from a lecture. These files had to be assigned to
the respective knowledge unit. Once this process was completed, the TEE-machine compiled all materials into a
web-based learning environment which consisted of a two-dimensional table of contents. Learners could use a
detailed knowledge map for navigation through this learning environment. If a learner opened a knowledge unit by a
mouse click, the corresponding materials were presented. In this way, learners were able to reread and repeat parts
of a lecture or to prepare themselves for an exam. However, this version of the TEE-machine only embraced the idea
of learning from computers.
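The organizing principle of this first TEE version can be summarized as a directed graph of knowledge units whose arrows encode prerequisite relations. The following sketch is illustrative only; the unit names and data layout are invented and are not part of the TEE-machine implementation. It shows how such a structure determines a valid presentation order:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical knowledge units, each holding teaching materials.
materials = {
    "forces": ["forces_slides.pdf", "forces_intro.mp4"],
    "work": ["work_examples.html"],
    "energy": ["energy_slides.pdf"],
}

# prerequisites[unit] = units that constitute necessary prior knowledge.
prerequisites = {
    "forces": set(),
    "work": {"forces"},
    "energy": {"work"},
}

# Any valid path through the learning environment respects all arrows.
order = list(TopologicalSorter(prerequisites).static_order())
print(order)  # ['forces', 'work', 'energy']
```

A real authoring tool would attach editors, rendering, and navigation to such a structure; the point here is only that prerequisite relations form a directed acyclic graph over the knowledge units.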
To foster active knowledge construction, a second version of the TEE-machine, TEE+, was developed. Our aims
were to explicitly support the design and transformation phase of the LaD approach. Thus, the goal of TEE+ is not
to support teachers in presenting learning information, but rather to enable learners to represent their own knowledge
for other learners. Therefore, TEE+ does not simply guide learners through a learning process; rather it requires
learners to go further into the topic being studied than they would have without the authoring tool (Salomon, 1988).
Thus, TEE+ serves as a cognitive tool. Cognitive tools are "adapted or developed to function as intellectual partners
to enable and facilitate critical thinking and higher order learning" (Jonassen & Reeves, 1996, p. 694). If the
learners' task is to represent their knowledge with a cognitive tool, they have to select which information is
important and which is not. Moreover, they have to decide how to organize this information with the tool and how to
use prior knowledge to enrich the represented new information. These higher order thinking processes of selection,
organization, and integration require intensive information processing (Mayer, 2004). Thus, these processes can only
foster learners' understanding of a topic as well as the application of their acquired knowledge if the tool effectively
handles lower level functions of the task (e.g., the technical part of creating these representations). Therefore, many
functions of the TEE-machine were modified in order to support these higher order thinking processes in the
different phases of the LaD approach.

Figure 1: Designing digital learning material with TEE+

As mentioned above, the first step in LaD is to plan the procedure. After that, learners have to create an outline for
their learning material. Thus, learners have to be aided in dividing a knowledge domain into knowledge
units. To realize this, the original TEE functions for building knowledge units and relations were adapted.
Knowledge units can now be colored and displayed using different shapes (e.g., circles or squares). This not
only fosters orientation in the knowledge domain but also supports learners in accentuating and highlighting
particularly important information and concepts. Moreover, learners can define relations by labeling them. With
TEE+, learners are now able to specify in detail whether, for example, a concept is part of another concept, whether
an idea is associated or contrasted with another one, or whether a theory can be explained by different assumptions
(see Figure 1). Thus, these relations no longer merely show that one knowledge unit constitutes prerequisite
knowledge for another; rather, they contain further information about how two or more knowledge units are connected. These
hints are essential for the construction process of a designer, as well as for the learning process of a potential user.
By working with both new functions, the product of the design phase is now comparable to a concept map. These
TEE+ functions foster the higher order thinking processes of selection and organization (Mayer, 2004) because
learners can now make explicit how important information is arranged within their product.
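In data terms, the step from TEE to TEE+ replaces bare prerequisite arrows with typed, labeled edges plus visual attributes on the units. The following minimal illustration uses invented names and is not the TEE+ codebase:

```python
from dataclasses import dataclass

@dataclass
class KnowledgeUnit:
    name: str
    shape: str = "circle"  # e.g., "circle" or "square", for visual accent
    color: str = "white"

@dataclass
class Relation:
    source: str
    target: str
    label: str  # e.g., "is part of", "is explained by", "contrasts with"

# Hypothetical example: a labeled relation carries more information
# than a bare "is prerequisite for" arrow.
units = [
    KnowledgeUnit("self-efficacy", shape="square", color="yellow"),
    KnowledgeUnit("motivation"),
]
relations = [Relation("self-efficacy", "motivation", "influences")]

for r in relations:
    print(f"{r.source} --[{r.label}]--> {r.target}")
```

With labels on the edges, the design product is structurally a concept map rather than a plain prerequisite graph, which is exactly the shift the paragraph above describes.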
In order to support learners during the transformation phase in using their prior knowledge for creating texts that
correspond to the different knowledge units, an integrated website editor (see Figure 2) was added to the TEE-
machine. This editor can easily be used to create and edit examples, analogies, pictures, and other representations.
Thus, in contrast to traditional mind mapping or concept mapping software (e.g., CmapTools; cf. Cañas, Hill, Carff,
Suri, Lott, Eskridge, Gomez, Arroyo, & Carvajal, 2004), learners can directly provide and edit information for
potential users of the learning environment. In this way, TEE+ makes higher order thinking processes possible
which effectively support learners' understanding of a topic as well as the application of acquired knowledge.

Figure 2: Integrated website editor of the TEE+

University courses using the LaD approach and the TEE-machine


In order to pilot-test the current TEE version, TEE+ was evaluated during several regular university courses. These
courses followed a blended learning procedure in which the learners' overall task was to design and create learning
materials for classmates. Figure 3 provides an overview of the sequence of the 12 course sessions. Every phase of
the LaD approach was explicitly supported by two or three sessions.

Figure 3: Sequence of 12 course sessions.

Up to now, four different courses with a total of 59 students have been evaluated at a German university. The
evaluation questionnaire consisted of 11 questions (e.g., "I can answer typical questions about the content of the
course."). The results of those questions which primarily assessed knowledge acquisition with TEE+ are shown
in Table 1. The response scale ranged from 1 ("strongly agree") to 4 ("strongly disagree"). The results indicate that
students like to learn in such a university course combining LaD and TEE+. Moreover, they think that they have
acquired a lot of knowledge during the course, and that they were able to apply this knowledge to new situations.

                                                                                   LaD course (n = 59)
                                                                                      M        SD
I am happy with the knowledge which I have acquired during the course sessions.      1.11     0.32
I can explain the main content of the course.                                        1.74     0.67
I can explain all concepts which were discussed during the course sessions.          2.21     0.79
I can answer typical questions about the content of the course.                      1.93     0.57
I can describe similarities and differences between several models and theories
which were discussed during the course sessions.                                     2.12     0.74
I can criticize papers of the content area with the knowledge I have acquired
during the course sessions.                                                          2.03     0.94
I have achieved my learning objectives.                                              2.08     0.67
Table 1: Results of the course evaluation

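For readers who wish to reproduce or reuse such figures: the means and standard deviations reported per item are plain aggregates over the 1-4 response scale. A sketch with invented responses (not the study's data; whether the paper reports sample or population SD is not stated, so sample SD is assumed here):

```python
from statistics import mean, stdev

# Hypothetical responses to one item (1 = strongly agree ... 4 = strongly disagree).
responses = [1, 2, 2, 1, 3, 2, 2, 1]

m = mean(responses)
sd = stdev(responses)  # sample standard deviation, as typically reported
print(f"M = {m:.2f}, SD = {sd:.2f}")  # M = 1.75, SD = 0.71
```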
Discussion
By designing learning materials for others, students are expected to actively construct their own knowledge rather
than simply encode information or follow given ideas and explanations from a teacher. They search and select
relevant information, as well as organize or reorganize this information several times in order to create learning
materials. Moreover, they construct their own representations of a knowledge domain such as texts or pictures for
which they are required to integrate their prior knowledge. Thus, the LaD approach fosters higher order thinking and
learning (cf. Mayer, 2004). The results of the course evaluations are consistent with this claim. The combination of LaD
(Jonassen & Reeves, 1996) and TEE+ helped students to learn basic information and to acquire applicable
knowledge of a knowledge domain. Furthermore, TEE+ seems to be a powerful tool to implement LaD courses at
university. It supports students in creating well-organized representations of their own knowledge by handling lower
level functions of the task and, thus, embraces the idea of cognitive tools.
However, more research is needed to investigate long-term retention and applicability of the acquired knowledge
after the university courses, as well as to compare the effects of LaD to traditional learning methods. So far, it seems
that students like to work with TEE+ in order to design digital learning material in LaD university courses.

References
Albert, D., & Lukas, J. (Eds.). (1999). Knowledge Spaces: Theories, Empirical Research, and Applications.
Mahwah, NJ: Lawrence Erlbaum Associates.
Cañas, A. J., Hill, G., Carff, R., Suri, N., Lott, J., Eskridge, T., Gomez, G., Arroyo, M., & Carvajal, R. (2004).
CmapTools: A knowledge modeling and sharing environment. In A. J. Cañas, J. D. Novak, & F. M.
González (Eds.), Concept maps: Theory, methodology, technology. Proceedings of the first international
conference on concept mapping (pp. 125-133). Pamplona, Spain: Universidad Pública de Navarra.
Damnik, G., Proske, A., Narciss, S., & Körndle, H. (2013). Informal learning with technology: The effects of self-
constructing externalizations. The Journal of Educational Research, 106(6), 431-440.
Jonassen, D. H. (2001). Learning from, in, and with multimedia: An ecological psychology perspective. In S.
Dijkstra, D. H. Jonassen & D. Sembill (Eds.), Multimedia learning: Results and perspectives (pp. 41-67).
Frankfurt/Main: Peter Lang.
Jonassen, D. H., & Reeves, T. C. (1996). Learning with Technology: Using Computers as Cognitive Tools. In D. H.
Jonassen (Ed.), Handbook of Research for Educational Communications and Technology (pp. 693-719).
New York: Simon and Schuster Macmillan.
Krauße, R., & Körndle, H. (2005). TEE - The Electronic Exercise. In K. P. Jantke, K.-P. Fähnrich & W. S. Wittig
(Eds.), Marktplatz Internet: Von e-Learning bis e-Payment: Tagungsband der 13. Leipziger Informatik-
Tage (pp. 281-286). Bonn: Gesellschaft für Informatik.
Lehrer, R., Erickson, J., & Connell, T. (1994). Learning by designing hypermedia documents. Computers in the
Schools, 10(1-2), 227-254.
Liu, M., & Rutledge, K. (1997). The effect of a 'learner as multimedia designer' environment on at-risk high school
students' motivation and learning of design knowledge. Journal of Educational Computing Research,
16(2), 145-177.
Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist,
59(1), 14-19.
Perkins, D. N. (1986). Knowledge as design. Hillsdale, NJ: Lawrence Erlbaum Associates.
Ritter, S., Blessing, S., & Wheeler, L. (2003). Authoring tools for component-based learning environments. In T.
Murray, S. Blessing & S. Ainsworth (Eds.), Authoring Tools for Advanced Technology Learning
Environments (pp. 467-489). Dordrecht: Kluwer Academic Publishers.
Salomon, G. (1988). AI in reverse: Computer tools that turn cognitive. Journal of Educational Computing Research,
4(2), 123-139.

7. TPACK as shared practice: Toward a research agenda
David Jones
University of Southern Queensland, Australia
David.Jones@usq.edu.au

Amanda Heffernan
University of Southern Queensland, Australia
Amanda.Heffernan@usq.edu.au

Peter R. Albion
University of Southern Queensland, Australia
Peter.Albion@usq.edu.au
Introduction
The 30+ year goal of effective integration of technology into learning and teaching remains elusive (Brantley-Dias
& Ertmer, 2013) despite the voluminous effort, research, and literature devoted to developing
insights into how to achieve it. Belland (2009) critiqued the focus of researchers on teachers' beliefs and barriers to
adoption of Information and Communication Technologies (ICT) for learning and teaching. Instead, he suggested
that the limited uptake of ICT by teachers in their classroom practice is explained by their past
experience and its effect upon teachers' habitus, or dispositions to act in certain ways. He argued that, consistent
with the observation that teachers very often teach as they were taught, the most powerful influence on the practices
of teacher graduates would be their own experience as students in school. He cited examples of teacher educators
professing constructivist beliefs but using teacher-directed approaches to prepare teacher candidates to adopt
constructivist practices. For teacher education to be effective in overwriting the effects of conventional schooling to
graduate teachers prepared to integrate ICT it must employ the strategies it promotes and allow teacher candidates to
experience them as learners. The value of such modeling has long been recognized (LeBaron & Bragg, 1994) but
not often achieved.

Technological Pedagogical Content Knowledge (TPACK), one significant recent strand of this literature, has
become a popular framework for describing the knowledge required by teachers to successfully integrate technology
and has underpinned attempts to measure and develop the TPACK of individual teachers (Di Blas, Paolini, Sawaya,
& Mishra, 2014). As a consequence of the on-going growth in the perceived importance of ICT in education it has
increasingly been seen as essential for teachers at all levels of education to build their TPACK (Doyle & Reading,
2012). Hence, teacher educators require knowledge in the domains of content, pedagogy, technology and their
various intersections. Typically they are well prepared by education and experience in content and pedagogy but
technology, and its intersections with the other forms of knowledge, presents significant challenges. Teacher
educators are subject to their own habitus (Belland, 2009) and often have limited, if any, experience of the
application of new ICT. Lankshear, Snyder and Green (2000) observed that, where teachers have limited experience
of particular practices in the real world, they are challenged to design related authentic activities for the classroom.
The protean nature of ICT, as discussed below, continually places teacher educators in the position of needing to
model ICT practices that are not established parts of their repertoire. How then can teacher educators effectively
model ICT integration so that the habitus of teacher candidates will be transformed? How can teacher educators
develop the TPACK required for such modeling?

These were among the questions facing a group of teacher educators at the beginning of 2014. With a long history in
the provision of distance education, the University of Southern Queensland has over recent years moved into online
learning in a strategic way. The institution's strategic plan (USQ, 2012) seeks to "provide students with personalized
adaptive learning support" through the use of "emerging technologies" that "enable collaborative teaching and
individualized learning for students regardless of their location or lifestyle" (p. 6). For teacher educators this has
meant that "by 2012, up to 70% of students in the 4-year Bachelor of Education were studying at least some subjects
online" (Albion, 2014, p. 72) and struggling at times when the reality did not always match the institutional rhetoric.

At the start of 2014 a private, shared blog was set up amongst a group of six teacher educators in an attempt to
explore their shared practice and help bridge this reality/rhetoric gap. The blog provided a space to share stories

about what frustrated, pleased, confused and surprised the teacher educators as they attempted to integrate
technology into their teaching. Acting as a virtual water cooler, the blog evolved into a space where teacher
educators at different stages of their careers could share practices; seek and provide support; and learn from each
other. Between February and October 2014, the six teacher educators shared 82 posts and 150 comments.

This paper explores some experiences captured in the 77 posts and 111 comments shared by the three main
contributors to the blog, who are also co-authors of this paper. Given the situated and distributed nature of the
experience of using the shared blog, the paper draws upon a distributive view of learning and knowledge for this
analysis. It begins by looking at recent writing around TPACK with a particular focus on perceived issues with
TPACK and suggestions that a distributed view of TPACK might prove useful. Next, the paper describes four
conceptual themes of a situated and distributed view of learning and knowledge. These themes are then used to
identify and analyze the stories shared on the group blog. Finally, some initial questions for practitioners, teacher
educators, researchers and policy makers are raised in the form of a proposed research agenda.

A distributive view of learning and knowledge


As noted by Di Blas et al. (2014), TPACK (the knowledge required to effectively integrate technology) "has
consistently been conceptualized as being a form of knowledge that is resident in the heads of individual teachers"
(p. 2457). The potential limitations of this perspective led those authors to draw on distributed cognition to explore
the idea of distributed TPACK. Earlier, Putnam and Borko (2000) examined "new ideas about the nature of
knowledge, thinking and learning" (p. 4), which they labeled the "situative perspective" and which include ideas
such as situated cognition, distributed cognition and communities of practice. Phillips (2014) used these ideas in his
investigation of the development and enactment of TPACK in workplace settings, implicitly recognizing the
existence of TPACK as a form of shared practice embedded in context rather than knowledge held privately by
individuals. Dron and Anderson (2014) suggest that the situative perspective of learning shares some common
concepts and themes with a range of theoretical perspectives including heutagogy, distributed cognition, activity
theory, Actor Network Theory, complexity theory and complex adaptive systems, Communities/Networks of
Practice, and Connectivism. This range of perspectives has arisen from diverse disciplinary traditions.

Putnam and Borko (2000) developed three conceptual themes capturing the essence of these new ideas to examine
the implications they may hold for how teachers design learning experiences, and learn about new ways of teaching.
In the following we describe and use these three themes plus one other to analyze and draw implications from the
stories shared on the blog.

Situated in particular physical and social contexts

The situated theme of learning and knowledge rejects the idea that context does not matter. Instead the entire
context, understood as an interactive system including people, materials and representational systems, in which an
activity takes place becomes "a fundamental part of what is learned" (Putnam & Borko, 2000, p. 4). The situated
nature of learning means that an inappropriate context can limit transfer of learning into different contexts, a
perspective that Putnam and Borko (2000) link to teacher complaints about professional development removed from
the classroom being seen as "too removed from the day-to-day work of teaching to have a meaningful impact" (p. 6).

Social in nature

The social perspective of learning and knowing recognizes the important role of other individuals and of discourse
communities beyond encouragement and stimulation. Instead, how we think and what we know arises from our on-
going interactions with groups of people over time. The implication is that rather than learning being limited only to
instruction in particular concepts and skills, it "is as much a matter of enculturation into a community's ways of
thinking and dispositions" (Putnam & Borko, 2000, p. 5). The conception of schools as a powerful discourse
community with established ways of thinking offers a partial explanation for the resistance to fundamental change of
classroom teaching (Putnam & Borko, 2000).

Distributed across the individual, other persons and tools

The view of cognition as distributed proposes that the knowledge required to perform a task does not exist solely
within an individual person, or just within groups of people. A distributed view sees cognition as requiring a
contribution from a range of people and artifacts. The appropriate tools can enhance, transform and distribute
cognition and through this expand "a system's capacity for innovation and invention" (Putnam & Borko, 2000, p.
10). This view offers lenses for exploring how technologies may be able to support and transform teaching and
learning (Putnam & Borko, 2000).

Digital technologies are protean

Part of the argument for the addition of technology to Shulman's PCK to form TPACK was that the rise of digital
technologies had created a very different context from earlier conceptualizations of teacher knowledge, in which
technologies were "standardized and relatively stable" (Mishra & Koehler, 2006, p. 1023). The rapid and on-going
evolution of digital technologies means they never become transparent and it becomes important for teachers to
continually develop technological knowledge (Mishra & Koehler, 2006). Though it has been suggested that, as
digital technology use within schools becomes more common, "TPACK should, at least in theory, become
embedded within other aspects of teachers' knowledge" (Brantley-Dias & Ertmer, 2013, p. 117), the evolution of
digital technologies will require corresponding changes in TPACK so that it is inherently unstable knowledge. With
this theme we are seeking to explore two propositions. First, that technological knowledge should remain as a first
class component of TPACK, for reason of its constant and rapid change. Second, that there may be benefits from
changing the nature of the technological knowledge that is useful for teachers. As a result, this is a far more tentative
theme than the previous three, but it also builds on the increased role technology plays in cognition suggested by
those three themes.

Dron and Anderson (2014) quote Churchill (1943) as saying "We shape our buildings and afterwards our buildings
shape us" (p. 50). But digital technologies are different, or at least they can be. Kay (1984) described the "protean
nature of the computer" (p. 59) and suggested that it is "the first metamedium, and as such has degrees of freedom
and expression never before encountered" (p. 59). However, experiencing the full protean nature of digital
technologies requires the knowledge to manipulate them, particularly through (but not limited to) programming. If
learning and knowledge are distributed across networks of people and objects (which in contemporary classrooms
include a significant proportion of digital technologies), then the ability to modify digital technologies
appropriately would seem to be one approach to enhancing learning, especially given Shulman's (1987) view that
the distinguishing knowledge of a teacher is the capacity "to transform the content knowledge he or she possesses
into forms that are pedagogically powerful and yet adaptive to the variations in ability and background presented by
the students" (p. 15). With digital technologies it is possible and desirable that we shape our technologies, then our
technologies shape us, and then, as we learn, we shape our technologies some more.

Stories from the blog


The following analysis draws on these four conceptual themes from a distributed view of TPACK to analyze the
stories shared on the blog.

Situated

A number of discussions on the blog focused on the challenges of centralized processes or communications that
needed to be contextualized to our own circumstances. As noted previously, an inappropriate context can limit
transfer of learning, and posts from the blog demonstrated that a similar limitation was found when communications
or processes seemed less than an effective fit for our context.

An indication that the institutional units responsible for the technologies at the university are perhaps not working in
exactly the same context as teaching staff is revealed through upgrade and maintenance work occurring at perhaps
the worst possible times. For example, upgrades to the online assignment submission and management system were
scheduled in the week major assignments were due to be submitted.

Senior academics responsible for learning and teaching adopted a communiqué model to encourage effective
learning, teaching, and technology integration. This model involved the same intermittent email being sent out to all
academic staff across the whole university. Due to their less than context-specific nature, the communiqués were
limited to largely generic information. The lack of connection to the situated experience of teaching staff became a
common concern about this model. For example, a communiqué including notice of the decommissioning of the old
Computer Managed Assessment process was sent to all academic staff, regardless of whether they made use of this
function.

When considering the importance of situated knowledge and understanding of ICT, questions were raised in terms
of how we are preparing our students for their future teaching contexts. The institution's use of an LMS and other
tools that are unlikely to be found in many schools generated questions about the usefulness of these approaches in
terms of preparing students to transfer their experience and skills beyond the context of university into their own
teaching practices. Further consideration of this notion can also be found below in relation to the distributed theme.

Discussion on the blog noted, in particular, the power of our shared context as a strength of the learning
opportunities afforded by the collaborative nature of the space. Centralized support options by their very nature have
to be neutral and accessible to any academic within the many disciplines offered by the university, with the support
on offer limited to the experiences of staff who typically lack knowledge of the specifics of teacher
education. In comparison, a blog shared by a group of teacher educators enabled the sharing of expertise developed
through many years in various education sectors and systems. Moreover, that experience was grounded in
attempting to develop and harness TPACK to enhance teacher education for the same cohort of students, thereby
significantly increasing the relevance of the support provided to our own needs and those of our students. This
reflects two themes: first, as mentioned previously, that the learning opportunities within the blog were situated
in our contexts; and second, that distributed TPACK is social in nature.

Social

By its very nature the blog was an appropriate way of sharing information in ways that fit into our own schedules,
likely more so than a traditionally scheduled meeting or forum might have been. Furthermore, the approach
provided opportunities for participants to post comments and observations, or voice frustrations, at the time that they
arose. This resulted in a wider range of issues being raised than might have been discussed had we waited for a
scheduled meeting. Issues or points that might not have ordinarily warranted a specific email or phone call to other
participants were quickly and easily posted to the blog. Fellow participants could then opt in or out of the
conversation, depending on whether they were interested or experiencing a similar issue. Given that all three of us
were comfortable with the interactive or social nature of online environments, the blog functioned well as a platform
for discussion. That we contributed the majority of posts and comments on the blog does raise the question of whether
our colleagues found it an equally conducive platform.

One aspect of the blog that had significant rewards was the power of sharing practice, adding value to our teaching
as a result. As a relative newcomer to tertiary teaching, Amanda was seeking to learn more about the ways
colleagues were making use of the various functions afforded by Moodle, the LMS used by our institution. David
provided access to the online environment for one of his courses, giving Amanda the opportunity to explore the
ways of working initiated by a more experienced colleague who holds expertise in the field of ICT and education.
One of the key themes explored earlier in this paper, the notion that TPACK is situated in particular physical and
social contexts, was demonstrated by the blog in that it provided opportunities for learning that were contextualized
to our field. This was particularly evident in the result of this collaboration, where Amanda was able to see the LMS
being utilized in ways that she previously had not observed in other courses. In addition, this provided the
opportunity to engage in informed discussions with David about the impact of certain tools and approaches being
used in his course. An important aspect to note here is that this was provided in a context of teacher education
courses as opposed to other, less contextualized, opportunities that may have been available through central support
or elsewhere online. This resulted in a shift in practice for Amanda, adjustments to the online spaces, and more
efficient ways of delivering learning experiences for students in her courses.

In the same vein, another significant discussion on the blog was about the delivery of course content and learning
experiences for our students. Similar opportunities for shifts in practice resulted from discussions about processes
for releasing course content, and even preferences for ways of recording lectures or vodcasts. The key opportunity

here, of course, was the chance to discuss these approaches with people who had a shared context of teacher
education, alongside the variety of levels of experience and expertise in different areas, which added a richness to
the conversation. Interactions beyond the blog also benefited participants, with David's and Peter's interactions in
other online spaces such as Twitter providing solutions to challenges that David was facing with the LMS. The
solution provided by Peter was identified as one that would not have been found as easily without these social
networks.

Distributed

Stories from the blog are also illustrative of the notion that TPACK is distributed across the individual, other
persons, and tools. A number of the stories shared identified our responses to gaps in the complex
assemblage of people, technologies, and practices that make up learning and teaching, gaps that seemed to reduce the
overall level of TPACK available across the system. One example of this problem was a new system for managing
course documents that could support only the Harvard referencing style, even though courses in teacher
education and some other departments of the university require the use of the APA referencing style. Other
examples of such gaps included issues with gaining access to computer labs, complexities of appointing casual
teaching staff to work in courses, accessing the information necessary to create student groups, issues with the
availability and performance of the LMS at the start of semester, and processes for contacting students who did not
submit assignments, or checking in with students who had yet to engage within our courses. Discussion ranged from
commiserating with our colleagues over having similar issues, to providing solutions which each of us had
developed over time.

At times, though, these solutions became moot when systems changed unexpectedly and each of us identified some
approaches that had worked in the past but were no longer possible for various reasons, either around submission of
assessment, delivery and rollover of course content for ease of updating, or the continued use of an e-portfolio from
a previous institution. While each of these issues was solved with workarounds, this was often a time-sensitive
exercise and the end result was not necessarily ideal. For example, Peter was working to modify course content from
a previous semester and reached a solution, but it took far longer than it needed to and ended in what seemed an
illogical arrangement.

The rise of Web 2.0 and the cloud has also seen a significant increase in the availability of a range of technologies
(e.g., Google providing student email accounts) from providers outside of the institutional context. The use of
Facebook and other social media as a means of working with students was also a pertinent discussion point at
different times throughout the year. At times we discussed issues that had arisen on social media sites that were
impacting upon the course (such as misinformation being disseminated and causing confusion for students), but we
also discussed ways of trying to engage students on multiple platforms. Tensions sometimes emerged around this
integration of outside technologies. One example of note is the frustration voiced by an institutional learning
designer at the direction they were given to encourage the use of formally approved technologies only. The
discovery of a taxonomy separating technologies into core, supported, allowed and emerging is also worth noting
here and harks back to the concepts discussed earlier in the situated theme, wherein we questioned how
comprehensively we were preparing students for a world beyond our institutional tools and approaches.

This insular, institutional view of allowed systems is the reason why discussion forums on course sites are closed
just prior to, and for 72 hours after, an exam - a practice seen as preventing students from communicating
inappropriately about the examination. This seems a largely pointless practice, given that each course typically has
its own (and sometimes multiple) student-created, private Facebook groups, not to mention the other forms of
communication that the LMS provides to students.

All of these experiences with the problematic distribution of TPACK across and beyond the institution are perhaps a
significant contributor to an observation shared after a discussion with other colleagues, for whom "it was just
accepted as par for the course that there would always be fairly major problems with the technology". This
acceptance of problems not only contributes to frustration, it also raises questions about the impact on innovation.
Amanda posed the question "it makes me wonder. How many people would put [innovation] in the too-hard basket
and go back to a tried-and-true method that has worked for them before?". What impact does this have on the ability
to effectively integrate technology, if the knowledge of how to effectively integrate technology is distributed across
artifacts that are seen as always having fairly major problems? This leads directly to the next theme, which concerns

the rapidly evolving nature of technology and the importance of being able to work with/in and around limited tools.

Protean

One solution to the problematic distribution of TPACK across complex assemblages is the idea of digital
renovation. Rather than accept these problems, digital renovation draws upon the protean nature of digital
technologies to adjust or enhance rigid and problematic systems to develop solutions. Digital renovation provides
the opportunity to open up new educational possibilities, but only for those with the TPACK to engage in digital
renovation. For example, David shared a pedagogical practice during a planning day and generated some significant
interest from another teacher educator. However, the digital technologies provided by the institution do not provide
sufficient capability to make this practice feasible within the context of a largish course (300+ students). While
David was able to write Perl scripts that bridged the gap, this particular renovation was very specifically situated
within the context of the renovator and could not be easily adopted by others. So, despite the interest in an effective
practice, it could not be adopted more widely.

But not all such solutions suffer the same problem. Throughout the year the shared blog was used to share and
discuss tools and shortcuts that had been developed to work around (or within) the limitations of institutional
systems, providing timely solutions for challenges that were, at times, plaguing all of us. The blog's enabling of
timely solutions is vital here, ensuring that while teachers may be "perpetual novices" (Mueller et al., 2008) when it
comes to rapidly changing technology and tools, these solutions were developed and shared responsively, providing
collegial support and breaking down silos that may have existed. The blog, therefore, enabled us to collectively
grow our own solutions to issues as they arose.

The inability to undertake appropriate digital renovation also creates gaps at the institutional level. It has often been
reported that a major problem for students using the university's LMS was their inability to find required resources
within course sites. The reason widely accepted within the institution for this difficulty is the
inconsistent and poor layout and design of individual course sites. However, it also points to the apparent inability of
the institution to provide an effective search engine - the widely accepted method for finding resources online - that
works within the bounds of the institutional LMS. Interestingly, a project is currently underway to streamline course
sites to provide a consistent environment for students and apparently solve the discoverability issue. Discussion on
the blog raised questions about the impact of this that also relate to previous themes reflecting the importance of
context and specialized knowledge: an environment that works in one course may not be a good fit for another.

A final conversation worth noting in relation to the concept that digital technologies are protean was about the
complexities of working in a recently updated system and the subsequent requirement of teaching staff adopting
new, and as yet untested, processes. The difficulties faced in adopting these practices raised the question of whether
the digital fluency of teaching staff was sufficient. Another perspective on these difficulties arose from one
participant noting that the systems and processes being used were scarcely fit for purpose, raising questions about
the digital fluency of the institution. While it was clear that those involved had reasonable ideas about what makes
for good online learning practice, they did not always seem to have the digital fluency required to translate those
ideas into efficient and effective practice.

The blog afforded us an interactive, low-pressure space to explore these ways of improving our practice, to engage
in thoughtful and critical discussions, and to share the load of developing these understandings.

Conclusions and a research agenda


Throughout 2014 a group of six teacher educators used a group blog to share the ups and downs of trying to
effectively integrate technology into the education of pre-service teachers. The 77 posts and 111
comments shared by three of those teacher educators were analyzed using a distributed view of learning and
knowledge to extract insights into how these educators developed and shared the TPACK necessary to effectively
integrate technology into their teaching. The analysis has revealed that our TPACK was enhanced through the ability to
engage in social discussion with colleagues from within the same context. Such situated collaborations helped
overcome the limitations of organisational practices and technologies that were not always well suited to our context
and aims. Knowledge of how to leverage the protean nature of digital technologies to overcome some of these

limitations also helped.

In this light - and assuming that developing TPACK ought to be "a critical goal of teacher education" (Mishra &
Koehler, 2006, p. 1046) - what do we do next? How can a distributed view of TPACK help teacher educators model
ICT integration so that the habitus of teacher candidates is transformed? How can we develop the TPACK required
for such modeling? Table 1 provides a list of research questions that make up an initial agenda for future work that
maps out some potentially interesting directions.

Table 1: Proposed research questions for future research on TPACK as shared practice

Situated
- How will a University-wide consistent structure for course sites impact the situated nature of learning?
- How can institutional learning and teaching support engage with the situated nature of TPACK and its development?
- How can University-based systems and teaching practices be closer to, or better situated in, the teaching contexts experienced by pre-service educators?

Social
- How can the development of TPACK by teacher educators be made more social?
- How can TPACK be shared with other teacher educators and their students?

Distributed
- Is it possible to measure the digital fluency of a university, rather than focus on its teaching staff?
- How can technologies specific to teacher education (e.g. lesson and unit plan templates) be enhanced to increase the capability and learning of teacher educators?

Protean
- Can the outputs of digital renovation practices by individual staff be shared?
- How can institutions encourage innovation through digital renovation?
- What are the challenges and benefits involved in encouraging digital renovation?

In framing this research agenda, it is important to keep in mind that the distributed view of knowledge drawn upon
in this paper strongly suggests that there are significant limits to what teacher educators can achieve alone. The
knowledge required is situated, distributed, and social. Thus the success of such a research agenda will depend on
how effectively all of the people and artifacts involved in teacher education can be engaged in it.

References
Albion, P. R. (2014). From Creation to Curation: Evolution of an Authentic 'Assessment for Learning' Task. In L.
Liu, D. Gibson, V. Brown, T. Cavanaugh, J. Lee, C. Maddux, M. Ochoa, M. Ohlson, D. Slykhuis & J.
Voogt (Eds.), Research Highlights in Technology and Teacher Education 2014 (pp. 69-78). Waynesville,
NC: AACE.
Belland, B. R. (2009). Using the theory of habitus to move beyond the study of barriers to technology integration.
Computers & Education, 52(2), 353-364. doi: 10.1016/j.compedu.2008.09.004
Brantley-Dias, L., & Ertmer, P. A. (2013). Goldilocks and TPACK: Is the construct just right? Journal of
Research on Technology in Education, 46(2), 103-128.
Di Blas, N., Paolini, P., Sawaya, S., & Mishra, P. (2014). Distributed TPACK: Going Beyond Knowledge in the
Head. In M. Searson & M. N. Ochoa (Eds.), Society for Information Technology & Teacher Education
International Conference 2014 (pp. 2464-2472). Jacksonville, Florida, United States: AACE.
Doyle, H., & Reading, C. (2012). Building teacher educator TPACK: Developing leaders as a catalyst for change in
ICT Education. In M. Brown, M. Hartnett & T. Steward (Eds.), Future challenges, sustainable futures. (pp.
272-282). Wellington, NZ: ascilite.
Dron, J., & Anderson, T. (2014). Teaching crowds: Learning and social media. Athabasca University Press.
Kay, A. (1984). Computer software. Scientific American, 251(3), 53-59.
Lankshear, C., Snyder, I., & Green, B. (2000). Teachers and Technoliteracy: Managing literacy, technology and
learning in schools. Sydney: Allen and Unwin.
LeBaron, J. F., & Bragg, C. A. (1994). Practicing what we preach: Creating distance education models to prepare
teachers for the twenty-first century. The American Journal of Distance Education, 8(1), 5-19.

Mishra, P., & Koehler, M. J. (2006). Technological Pedagogical Content Knowledge: A Framework for Teacher
Knowledge. Teachers College Record, 108, 1017-1054.
Phillips, M. D. (2014). Teachers' TPACK enactment in a Community of Practice. Unpublished PhD thesis, Monash
University, Melbourne. Retrieved from
http://newmediaresearch.educ.monash.edu.au/lnmrg/sites/default/files/Teachers%27%20TPACK%20enact
ment%20in%20a%20Community%20of%20Practice.pdf
Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on
teacher learning? Educational Researcher, 29(1), 4-15.
Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57,
1-22.
USQ. (2012). USQ Strategic Plan 2013-2015. Toowoomba, Qld. Retrieved from https://www.usq.edu.au/about-
usq/about-us/plans-reports/strategic-plan

8. A Preservice Secondary Education Technology Course:
Design Decisions and Students' Learning Experiences

Dawn Hathaway and Priscilla Norton3


George Mason University
USA
dhathawa@gmu.edu
pnorton@gmu.edu




Introduction
Never was the adage "everything old is new again" truer than when we were approached by the secondary
education program coordinator after a series of focus group sessions with administrators who hire program
graduates. Administrators had endorsed the overall quality of graduates but identified three areas for improvement:
working with second language learners, accommodating students with special needs, and using technology in
instruction. The program coordinator asked us to help strengthen the program in the domain of technology
integration. We were not surprised, as his request echoed the findings of the Project Tomorrow & Blackboard (2013)
report, which stated, "Principals want new teachers to know how to use technology to create authentic learning
experiences for students (75 percent) and how to leverage technology to differentiate instruction (68 percent) before
they apply for a position at their school" (p. 5). It was also consistent with literature stating that teacher education
programs have not taught new teachers how to use technology effectively (e.g., Maddux & Cummings, 2004), that
preservice teachers lack the ability and knowledge needed to teach successfully with technology (e.g., Angeli &
Valanides, 2008; Sang, Valcke, van Braak, & Tondeur, 2010), and that teachers feel inadequately prepared to use
technology, particularly to support teaching and learning in their disciplines (e.g., Hew & Brush, 2007).
Until 2001, all candidates (elementary and secondary education) in the College of Education and Human
Development (CEHD) were required to complete a generic, predominantly skills-based, standalone technology
course. Beginning Fall 2001, the secondary education faculty removed the technology course requirement and
integrated considerations of technology in required methods courses. As the secondary education faculty had
discovered, the goal of integrating technology with other program requirements had failed to meet the needs of
schools. So, in Spring 2013, we accepted the program coordinator's challenge and began the design and then
teaching of a technology integration course for secondary education candidates. Everything old was about to be new
again.
The challenge of introducing preservice candidates to the role of technology in support of instruction has
plagued teacher education for many years. While many have recommended specific approaches, none has been
proven the most effective (Kay, 2006). Ottenbreit-Leftwich, Glazewski, and Newby (2010) concluded, "There is
still the question of how to design the most appropriate technology experiences to intentionally meet the specific
technology content goals faculty feel are necessary to prepare preservice teachers to use technology in their
classrooms . . ." (p. 23). Given the uncertainties about preservice technology education, we were left to wonder:
should we design four discipline-based courses (social studies, language arts, science, and mathematics), or should
we design one course centered on concepts and skills related to technology use and integration? As we pondered
this dilemma, it occurred to us that the answer should be yes: the heart of a well-designed course for preservice
secondary candidates should center on the interaction of technology integration concepts and discipline-specific
contents. We set out to design a course whose central goal was to focus participants' attention on this interaction.
We collaborated with four doctoral students and were guided by a central question: What is an appropriate
course design to build secondary candidates' understanding of how technology integration concepts relate to and
inform teaching practice in the disciplines? We adopted a design-based research approach. In design-based research,
learning is engineered and systematically studied in context. Design experimentation is concerned with the full,
interacting system of learning to include tasks or problems, kinds of discourse, norms of participation, tools and
related materials, and the means by which learners orchestrate relations among these elements. Design-based

3 The authors contributed equally to this work.

research has as its goal not the right answer but the ability to improve educational practice. Its promise is to provide
opportunities to explore possibilities for creating novel learning and teaching environments, to develop contextualized
theories of learning and teaching, to construct cumulative design knowledge, and to increase the human capacity for
innovation (The Design-Based Research Collective, 2003).
Recognizing that design-based research is a program of research (Bannan-Ritland, 2003) conducted in
iterative phases (Cobb, 2001), our first research phase centered on reviewing the literature related to preservice
technology education. Results of that review exceed the scope of this paper but led us to two central course design
decisions: to situate the course in specific disciplinary areas and to organize content and activities within each
disciplinary area using habits of mind. During the second research phase, we created a design pattern to structure
content and activities and an instantiation of the pattern as an online course, EDIT 504: Educational Technology
in Secondary Classrooms. We described this process and the resulting design pattern in a previous paper (Norton &
Hathaway, 2014). In this paper, we describe the third phase of the research process and present evidence related to
the influence of course design decisions on participants' learning experiences.

Phase 3 Research Process

Participants

In the CEHD, secondary education is a post-baccalaureate program serving those who seek licensure to
teach at the middle and high school levels. Some program candidates seek licensure only while others seek a
Master's degree. Candidates are able to obtain teaching licensure in mathematics, English/Language Arts (ELA),
social studies, or one of four science disciplines. Once developed, EDIT 504 was offered as a 3-credit-hour graduate
option toward the 9-credit-hour elective requirement. Twenty-six candidates initially enrolled in the course. One
candidate received permission to withdraw due to family circumstances. One candidate stopped attending, failed to
answer email inquiries, and was not included in this study. One candidate requested and received an Incomplete.
Thus, twenty-three candidates completed the course and were identified as participants in the study. Ten participants
were ELA candidates; eight were social studies candidates; four were science candidates; and one was a mathematics
candidate. Nine of the participants were male, and fourteen were female. No other demographic data were available
to the researchers. One of the authors served as the course instructor, and four doctoral students served as content
facilitators.

Analysis of Participants' Course Reflections

Participants were asked to address course content, lesson design, learning with technology, teaching with
technology, implications for their practice, and their experiences as an online learner in a written reflection during
the culminating course module. They were encouraged to make connections between what they had learned and
classroom practice. Participants' synthesis reflections were submitted to course instructors at the end of the course.
These reflections were not graded but were acknowledged as final products and served as a springboard for final
closing conversations between individual students, the instructor, and the content facilitators. Participants' end-of-
course reflections served as the source of study data. We examined participants' reflections for evidence related to
our course design decisions: a.) to situate participants' study of technology in their disciplinary teaching field, b.) to
organize modules using disciplinary habits of mind, and c.) to structure content and activities using a design pattern.
Since course reflections were narrative in form, we selected a qualitative approach to data analysis.
Qualitative analysis procedures emphasize the views of the participants and interpret the subject of study from their
perspective. This process is inductive in that themes emerge during the process of categorizing, coding, and
organizing data. We used a categorizing process identified by Maxwell (2013) as coding. In the coding process, we
independently examined and coded the reflections using our course design decisions as pre-established
organizational topics. Data were fractured (separated from their context) and rearranged into the pre-established
organizational topics. When a statement was identified by only one of us, we returned to the reflection, examined the
statement in context, and agreed to either include the statement or to eliminate it. When coding of statements
differed, we returned to the reflection, examined the statement in context, and agreed upon an appropriate category.
When agreement could not be achieved, the statement was eliminated. Collaboratively, we examined each coded

category to identify influences on participants' learning experiences. Finally, we collaboratively selected quotations
that reflected participants' voices.

Influences of Course Design Decisions

Participants were overwhelmingly upbeat and positive about the course in their end-of-course reflections.
They found the course to be relevant to their practice as well as challenging, informative, and engaging. One
science participant wrote, "The course really encouraged me to be creative, reflective and continue to be committed
to my own learning as to the learning of my students." An ELA participant stated that the course opened her eyes to
"the changing education landscape, the Common Core Standards, the need to use technology to transform education,
and an incredible number of tools . . . ."
Participants found the course to be applicable to their practice. A mathematics participant wrote, "My
reason for taking this course was to learn how to effectively integrate technology into my lessons. Luckily, I have
learned more than that . . . ." One social studies participant stated, "I can see myself referring back to the concepts I
learned in this course throughout my career over and over again." A science participant reflected on the course,
stating, "All the elements of this course - the introductory modules, design examples, design experiences, design
challenges . . . and the reflection and discussion . . . - were fundamental to understanding the attributes of an effective
teacher."

Decision 1: Situate Participants' Study of Technology in Their Disciplinary Teaching Area

When Gronseth et al. (2010) studied four-year U.S. teacher preparation programs, they found that 80% of
survey respondents indicated that all or some of their teacher education programs required a standalone educational
technology course in which personal productivity and information presentation were the most commonly taught
topics. Yet, Russell, Bebell, O'Dwyer, and O'Connor (2003) argued for a focus on specific instructional uses of
technology instead of general technology skills, and Brush et al. (2003) stated there is insufficient effort made to
align technology with discipline-specific pedagogy. We decided to situate the course design in participants'
disciplinary area and to deemphasize skill-based instruction. The final course consisted of 10 modules. The first
module introduced the course topic (technology in education), course processes, and course expectations. Modules 2
through 4 explored three universal concepts (technology integration, technology affordances, and authentic learning
and lesson design) and were completed by all participants. With the assistance of four doctoral students, each with
expertise in one of the four disciplinary content areas, five two-week modules (Modules 5 through 9) were designed
for each of the disciplinary contents.
Although we viewed the decision to situate the course in participants' disciplinary area as an important
course design decision, only one student commented on the discipline-based organization of the course. That ELA
participant wrote, "I was surprised that we were divided into content-based groups for this course. Once we
began our content-specific modules, however, I came to realize the breadth of opportunities available to us as
teachers and how critical it is to choose the technology that best suits our objectives." It is important to note that
none of the participants expressed disappointment in the absence of a skills-based focus or a desire for more
skill-based instruction. One student did, however, acknowledge the importance of the frequently embedded, optional
skill-oriented tutorials, stating, "I have a better feel for learning the technology tools through the tutorials. The
experiences with the tutorials have given me confidence that students can pick up new technology skills through the
tutorials as well."

Decision 2: Organize Modules Using Disciplinary Habits of Mind

Having made the decision to situate the study of technology in the context of participants' disciplinary area
of interest, we quickly realized that each discipline had many sub-contents. For example, social studies included
world history, U.S. history, government, and economics. Mathematics included basic math, algebra 1 and 2,
geometry, and calculus. Given the challenge of addressing all contents within a disciplinary area, we made the
decision to design each of the disciplinary modules using disciplinary habits of mind, not content topics. Using
disciplinary habits of mind to organize the modules acknowledged that a field of practice is governed by a
distinctive way of thinking, not about facts but about evidence, inquiry, and problem-solving (Tishman, Perkins, & Jay,
1995). A disciplinary approach draws attention to information not only as an end in itself but as a means to better-
informed practice (Gardner, 2009). Thus, habits of mind are the comprehensive intellectual and critical thinking
skills common to a discipline (Charbonneau, Jackson, Kobylski, Roginski, Sulewski, & Wattenberg, 2009).
Each of the disciplinary modules was organized using habits of mind selected for each discipline. We
organized the ELA modules around the five components of the ELA Common Core Standards: engaging with
complex texts (informational and fictional); using evidence in writing and research (writing); working
collaboratively and presenting ideas (speaking and listening); and developing necessary language skill areas
(language) (National Governors Association Center for Best Practices & Council of Chief State School Officers,
2010). We organized the social studies modules around the components of historical thinking: chronology,
comprehension, analysis and interpretation, research, and issue-analysis and decision-making (Westhoff & Polman,
2008; Wineburg, 2001). We organized the science modules using combinations of the cross-cutting concepts
identified in the Next Generation Science Standards: patterns; cause and effect; scale, proportion, and quantity;
systems and system models; energy and matter flows, cycles, and conservation; structure and function; and stability
and change (Achieve Inc., 2013). We organized the mathematics modules using the standards from the National
Council of Teachers of Mathematics (2000): problem solving, reasoning and proof, communication,
representations, and connections.
Analysis of participants' reflections revealed a variety of prior experiences with the concept of disciplinary
habits of mind. Some had not previously studied them; some had a general knowledge of them but lacked detail in
their understanding; and some were very familiar with them and found the course design to be reinforcing. There
was a general consensus in participants' reflections that the disciplinary habits of mind were relevant to their
secondary students' future. An ELA participant wrote, "In order to succeed in college and move forward to obtain a
successful career, students need to learn skills and competencies applicable to today's economy and digital society
throughout every class and subject they are exposed to." Participants also acknowledged habits of mind as relevant
for promoting their future secondary students' understanding of thinking in their disciplinary area. A social studies
participant stated, "One of the main concepts that I took away from this course was that, as teachers, we should be
designing lessons that teach students how to think like historians."
In addition, participants recognized that disciplinary habits of mind can serve as an instructional strategy.
For example, a science participant recognized the way in which the habits of mind could connect content with
students' lives when she wrote, "Cross-cutting Concepts . . . give the lessons a specific identity . . . a theme and
unifying principle. They also allow for students to make the connections between science and their everyday lives."
A social studies participant recognized that habits of mind could structure classroom discourse when he wrote, "By
utilizing this method, a teacher creates a far more meaningful dialogue with students . . . ." Another social studies
participant acknowledged the role that habits of mind might play in engaging students with content when he wrote,
"I learned technology and historical thinking can work together to get students engaged and excited about learning."

Decision 3: Structure Content and Activities Using a Design Pattern

Peltier, Schibrowsky, and Drago (2007) found that course content was "the number one driver of perceived
quality of the learning experience" (p. 149). They recommended that designers concentrate on course structure and
content and acknowledged that developing the right course structure must be accomplished prior to course delivery.
Acknowledging the need to structure course content, we adopted a design pattern approach. A design pattern
approach provides designers with a framework for conceptualizing and expressing design problems and solutions. In
this way, designers gain insight into their design problem and are able to capture the essence of the problem and its
solution (Hathaway & Norton, 2013). Using this approach, we developed a design pattern for the course during the
second phase of our program of research. That design pattern described a content structure created to build learners'
understanding of how general concepts and practice are connected. The design pattern specifies engaging learners
in a recurring activity structure that includes a conceptual design challenge, a design experience, analysis of design
examples, and a situated design challenge (Norton & Hathaway, 2014).
Even though the design pattern was explicitly presented to participants in the introduction to the
disciplinary area modules, only one ELA participant explicitly referred to the influence of the design pattern on her
learning experience. She wrote, "The structure of having a conceptual design challenge, design experience, design
examples, and situated design challenges has taught me about the importance of structure and consistency . . . ." One
social studies participant did, however, indirectly capture the impact of the design pattern on his learning when he
wrote,

By writing about the [historical] concepts [in the conceptual design challenge] I was able to get an
understanding of its importance; and by doing an activity related to the concept [in the design experience] I
was able to appreciate the successes and frustrations of the students. Reading about additional activities [in
the design examples] and creating my own activities related to the concepts [in the situated lesson design], I
could put my newly learned knowledge to work. Together this scaffolding allowed me to better grasp how
each historical concept could be used in the classroom.

Conceptual Design Challenge: The first activity in the design pattern was a Conceptual Design Challenge
in which a disciplinary habit of mind was conceptually linked with appropriate technology tools and resources.
Participants were challenged to produce a product that demonstrated the ways in which technology concepts and
disciplinary habits of mind inform and blend with authentic learning. For example, social studies participants were
asked to review readings and web resources to prepare a presentation for colleagues about the ways in which
technology can be integrated with historical research, one of the five habits within historical thinking.
No explicit references to the conceptual design challenge were made by participants. However, there was
indirect evidence of participants' recognition of the importance of linking technology choices and disciplinary habits
of mind. An ELA participant reflected, "It is critical that I can continue to point to these skills and highlight them . .
. as they appear and reappear throughout the year. Even beyond technological strategies, that is a lesson I know I
will remember going into the new school year." A social studies participant may have summed up the impact of this
activity best when he wrote, "This course's emphasis on the connection between building historical thinking skills
and meaningfully integrating technology has changed my perspective."
Design Experience: The second activity in the design pattern was a design experience in which participants
were asked to put on a "student's hat" and complete a sample lesson. This activity was designed to engage
participants in completing an instance of practice informed by technology integration concepts and disciplinary
habits of mind. For example, ELA participants took on the role of 11th-grade language arts students and created a
podcast as part of a series of podcasts on word usage. The influence that emerged related to the impact of the design
experiences on participants' learning experiences centered on recognition of the importance of viewing instances of
technology integration from a student's perspective. Wrote an ELA participant, "I have learned the importance of
viewing a lesson through a student's eyes to determine how effective the lesson is." A social studies participant
reflected, "I think one of the most important parts of the course were the design experiences, as we were able to put
ourselves in the shoes of our students and go through lesson plans that were designed to make connections to the
material we were discussing for the week."
Design Examples: Learners need to experience and reflect on models of practice in order to be able to
transfer formal learning to applied contexts (Bullock, 2004). Thus, the third activity in the design pattern consisted
of a series of design examples that participants analyzed and critiqued. The design examples represented case
studies of practice connecting technology concepts to habits of mind and were designed to elicit thoughtful
examination. For example, ELA participants reflected on a case study about an ELA teacher who challenged her
10th-grade students to use iAnnotate as part of a lesson on the comprehension of informational text. Participants
discussed the ways in which this case study modeled or failed to model the interaction of technology integration
concepts and the teaching of the Common Core writing standard.
Participants endorsed the process of analyzing models or scenarios of practice. They reflected that
analyzing the design examples helped them create a vision of practice. An ELA participant stated, ". . . the
scenarios provided real-life examples of face-to-face lessons. These were great practice for implementing the things
we learned even before we have face-to-face time in the classroom." A science participant wrote, "In the design
examples we were able to see things that were done well and things that we would improve on in our classrooms. . .
. the setting of the classroom was always described well so I was able to see what I wanted my future classroom to
look and sound like during hands-on activities, buzzing voices and active bodies."
In addition to promoting the development of a vision of practice, the design examples helped participants
develop strategies for analyzing lessons. Wrote an ELA participant, "By analyzing various activities . . . I learned to
analyze the overall function of a lesson. In the future I will be able to criticize my own lesson ideas to (hopefully)
strengthen my approach to a lesson."
Situated Design Challenge: The fourth activity, the situated design challenge, asked participants to use
their emerging knowledge to develop a plan for practice (a lesson design). For example, one science module asked
participants to design a biology lesson bridging technology integration concepts with stability and change standards
by focusing on the creation of a lesson to teach the interactions between organisms in an ecosystem. Feedback on
the lesson design provided a context for conversation between participants and the content facilitators.

Many participants acknowledged the relevance of the lesson design activities. An ELA participant wrote,
"The lesson plan portion of the class was by far the largest learning opportunity for me . . . ." A social studies
participant wrote, "I feel as though I am walking away with more usable material than most of the classes I have
taken so far." In completing their lesson designs, participants were able to develop an appreciation of the centrality
of lesson goals and objectives. An ELA participant wrote, "The emphasis that this course puts on learning
objectives helped me to think about lesson planning in a different way." A mathematics participant stated, "This
semester's exploration of technology taught me that no one technology is sufficient to meet all the curricular goals
and objectives. Instead teachers must choose technology based on the affordances that support the goals and
objectives of the lesson." An ELA participant might have summed it up best when she reflected,

. . . my digitizing habit was my impulse to use cool technology . . . I've learned to ask myself: What are
the standards guiding this lesson? What do I want my students to learn? Will using this particular
technology help them learn better? If not, is there a technology that will do so?

Participants also drew lessons from their lesson design experiences about the role of technology in learning.
They acknowledged in their reflections the importance of choosing and using technology to support learning, not
"just because." A mathematics participant wrote, "So, what this course has taught me is that I should only integrate
technology in a lesson plan if the affordances of the technology adds to the authenticity, outcome, or scaffolding of
the lesson." A social studies participant wrote,

I learned a great deal about how to incorporate technology into a lesson plan. The biggest lesson came not
from the utilization of technology, but rather the appropriate appearance and absence of it. . . . it must be
used as fits the lesson, not as the primary source of the lesson. If a lesson is created based on a novel
technology, the teacher risks overlooking the core learning experience for students in favor of trying to
force a unique experience.

Discussion and Recommendations


In pursuit of our central design goal - to focus participants' attention on the interaction of technology
integration concepts and discipline-specific contents - we made three course design decisions and examined
participants' end-of-course synthesis reflections to understand the influence of these decisions on participants'
learning experiences.
The first course design decision was to situate participants' study of technology in their disciplinary
teaching area rather than a more traditional focus on tools and skills. Only one participant remarked on this
decision, linking it to their realization about "the breadth of opportunities available to us as teachers and how critical
it is to choose the technology that best suits our objectives." No other participants commented on this decision. It is
possible that participants did not view this feature of the course as noteworthy because they were accustomed to a
disciplinary focus throughout their preservice learning experiences. It is also possible that their experiences with
technology in their life and work had created a sense of technology competence, and that their interest in the course
was directed toward instructional uses, not technology skills.
The second course design decision was to organize modules using disciplinary habits of mind. Participants
acknowledged the relevance of disciplinary habits of mind to their students' post-secondary future and to their
students' ability to understand disciplinary thinking. In addition, participants recognized that the disciplinary habits
of mind could serve as an instructional strategy.
The third course design decision was to structure content and activities using a design pattern - a recurring
four-part activity structure. Participants' reflections did not refer to the overall design pattern but spoke to each of
the four activities. The conceptual design challenge influenced participants' understanding of the linkage between
technology choices and disciplinary habits of mind. The design experiences influenced participants' ability to view
instances of technology integration from a secondary student's perspective and to draw instructional lessons from
their own experiences. The design examples shaped participants' vision of classroom practice and provided
analytical strategies for thinking critically about practice. The situated design challenge (lesson design) was deemed
by participants as perhaps the most relevant activity, as it provided both lesson design experience and usable
resources. Participants reported that this activity led to a deeper understanding of the importance of lesson goals and
objectives over technology use as well as providing insights into criteria for choosing and using technology.

Our course design decisions succeeded individually and collectively and contributed to participants'
understanding of the interaction of technology integration concepts and discipline-specific contents. Situating study
in their disciplinary area and organizing course content using disciplinary habits of mind shaped participants'
learning experiences in ways that led to their acknowledgment and endorsement of the importance of the interaction
between technology integration concepts and disciplinary teaching. The design pattern activities scaffolded
participants' ability to conceptually understand the interaction of technology and disciplinary learning (conceptual
design challenge), to experience the impact of combining technology and disciplinary study from a secondary
student's perspective (design experiences), to envision classroom practice that reflects this interaction (design
examples), and to apply their conceptual, experiential, and envisioned understanding to the creation of instances of
practice grounded in the interaction of technology and disciplinary learning (situated design challenge).
In Phase 3 of our design-based research approach to course design, participants' synthesis reflections
supported the efficacy of our course design decisions in promoting our goal to create a secondary preservice
technology course situated not in tools and skills but in the interaction of technology and disciplinary teaching.
Phase 4 of our research project now turns to the ways in which course content, as instantiated by our course design
decisions, influences candidates' course learning outcomes. What impact does course completion have on candidates'
attitudes and beliefs about technology and their practice? What impact does course completion have on candidates'
self-efficacy for using technology to support teaching and learning? What relevance and value do candidates assign
to course concepts presented in the first four modules as guides to their thinking about technology's role in
disciplinary teaching? Does their learning transfer to their classroom practice?

References

Achieve Inc., (2013, April). Next Generation Science Standards. Retrieved from http://www.nextgenscience.org/
Angeli, C., & Valanides, N. (2008, March). TCPK in preservice teacher education: Preparing primary education
students to teach with technology. Paper presented at the AERA annual conference, New York.
Bannan-Ritland, B. (2003). The role of design in research: The integrative learning design framework. Educational
Researcher, 32(1), 21-24.
Brush, T., Glazewski, K., Rutowski, K., Berg, K., Stromfors, C., Van-Nest, M. H., & Sutton, J. (2003). Integrating
technology in a field-based teacher training program: The PT3 @ ASU projects. Educational Technology
Research & Development, 51(1), 57-72.
Brzycki, D., & Dudt, K. (2005). Overcoming barriers to technology use in teacher preparation programs. Journal of
Technology and Teacher Education, 13(4), 619-641.
Bullock, D. (2004). Moving from theory to practice: An examination of the factors that preservice teachers
encounter as they attempt to gain experience teaching with technology during field placement experiences.
Journal of Technology and Teacher Education, 12(2), 211-237.
Charbonneau, P., Jackson, H., Kobylski, G., Roginski, J., Sulewski, C., & Wattenberg, F. (2009). Developing
students' "habits of mind" in a mathematics program. Primus: Problems, Resources, and Issues in
Mathematics Undergraduate Studies, 19(2), 105-126.
Cobb, P. (2001). Supporting the improvement of learning and teaching in social and institutional context. In S. M.
Carver & D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 455-478).
Mahwah, NJ: Erlbaum.
Gardner, H. (2009). Five minds for the future. Boston, MA: Harvard Business Review Press.
Gronseth, S., Brush, T., Ottenbreit-Leftwich, A., Strycker, J., Abaci, S., Easterling, W., . . . van Leusen, P. (2010).
Equipping the next generation of teachers. Journal of Digital Learning in Teacher Education, 27(1), 30-36.
Hathaway, D., & Norton, P. (2013). Designing an online course content structure using a design patterns approach.
Educational Technology, 53(2), 3-15.
Hew, K., & Brush, T. (2007). Integrating technology into K-12 teaching and learning: Current knowledge gaps and
recommendations for future research. Educational Technology Research & Development, 55(3), 223-252.
Kay, R. (2006). Evaluating strategies used to incorporate technology into preservice education: A review of the
literature. Journal of Research on Technology in Education, 38(4), 383-408.
Maddox, C., & Cummings, R. (2004). Fad, fashion, and the weak role of theory and research in information
technology in education. Journal of Technology and Teacher Education, 12(4), 511-533.
Maxwell, J. (2013). Qualitative research design: An interactive approach (3rd ed.). Thousand Oaks, CA: Sage
Publications.

Mishra, P., & Koehler, M. J. (2007). Technological pedagogical content knowledge (TPCK): Confronting the
wicked problems of teaching with technology. In R. Carlsen, K. McFerrin, J. Price, R. Weber, & D. Willis
(Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference
2007 (pp. 2214-2226). Chesapeake, VA: AACE.
National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics online.
Retrieved from http://www.nctm.org/standards/content.aspx?id=16909
National Governors Association Center for Best Practices, & Council of Chief State School Officers. (2010).
Common Core State Standards for English language arts and literacy in history/social studies, science, and
technical subjects. Washington, DC: Authors.
Norton, P., & Hathaway, D. (2014). Using a design pattern framework to structure online course content: Two
design cases. In Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare,
and Higher Education 2014 (pp. 1440-1449). Chesapeake, VA: AACE.
Ottenbreit-Leftwich, A., Glazewski, K., & Newby, T. (2010). Preservice technology integration course revision: A
conceptual guide. Journal of Technology and Teacher Education, 18(1), 5-33.
Peltier, J. W., Schibrowsky, J. A., & Drago, W. (2007). The interdependence of the factors influencing the perceived
quality of the online learning experience: A causal model. Journal of Marketing Education, 29, 140-153.
Project Tomorrow, & Blackboard K-12 (2013). Learning in the 21st century: Digital experiences and expectations
of tomorrows teachers. Retrieved from
http://www.tomorrow.org/speakup/tomorrowsteachers_report2013.html
Russell, M., Bebell, D., ODwyer, L., & OConnor, K. (2003). Examining teacher technology use: Implications for
preservice and inservice teacher preparation. Journal of Teacher Education, 54(4), 297-310.
Sang, G., Valcke, M., van Braak, J., & Tondeur, J. (2010). Student teachers' thinking processes and ICT integration:
Predictors of prospective teaching behaviors with educational technology. Computers & Education, 54(1),
103-112.
Tishman, S., Perkins, D., & Jay, E. (1995). The thinking classroom: Learning and teaching in a culture of thinking.
Needham Heights, MA: Allyn and Bacon.
Westhoff, L. M., & Polman, J. L. (2008). Developing preservice teachers' pedagogical content knowledge about
historical thinking. International Journal of Social Education, 22(2), 1-28.
Wineburg, S. (2001). Historical thinking and other unnatural acts: Charting the future of teaching the past.
Philadelphia, PA: Temple University Press.

9. Applying an Action Research Model to Learning (Re)Design
Mitchell Parkes University of New England, Australia mparkes2@une.edu.au
Peter Fletcher University of New England, Australia pfletch2@une.edu.au
Sarah Stein University of Otago, New Zealand sarah.stein@otago.ac.nz
Introduction
The origins of formal instructional design models such as those developed by Gagné (1973) and Dick and Carey
(1978) were primarily process-oriented. While the emphasis of these earlier models was instruction rather than
learning, student learning was, even so, regarded as important. An example of this comes from Reigeluth (1983),
who says that instructional design is "the process of deciding what methods of instruction are best for bringing about
desired changes in student knowledge and skills for a particular course content and specific student population"
(p. 7). In more recent years, a major shift has occurred to a focus on learning, as well as on the design act. According
to Horton (2012), learning design requires "selecting, organising, and specifying the learning experiences necessary
to teach somebody something" (p. 3). The simplicity of this definition has appeal. It is straightforward and precise
and matches the perspective Horton promotes about his e-learning design processes/models, variously, as being
"rapid," "minimalist," "waste-no-time," "results-focused," "speedy," and "quick" (p. 7). For Horton (2012), design is an
essential component of any good classroom learning. He states that design "is a complex act involving judgment,
compromise, trade-off, and creativity" (p. 3). While process can appear to be foregrounded in Horton's view of
learning design, his emphasis on design indicates that learning design involves a lot more than a simple series of
quick, waste-no-time steps. As Smith and Ragan (2005) stress, the process, whatever it is, is systematic and
reflective, involving the act of "translating principles of learning and instruction" (p. 4). In other words, the learning
design process is certainly not straightforward, and any model that seems to privilege process over the content of
process should be understood only as a systems model, with the output of one step being the input for the next step
(Dick, 1996).

Why do we need learning design?

Bell, Bush, Nicholson, O'Brien, and Tran (2002) argue that "learning solutions, which are developed without proper
regard to appropriate pedagogies and the needs of students, are destined to failure" (p. 2). The implication is that
those who are involved in implementing learning design processes, in addition to knowing about design and design
processes, also require sets of skills and knowledge about: learners and learning; learning, teaching and
organisational contexts; teaching and learning strategies; and media and interaction systems (Richey, Klein &
Tracey, 2011). The many decisions and judgements to be made concerning the vast range of elements that make up a
curriculum within a particular discipline context, within a particular institution, can be daunting and even
overwhelming. Learning design calls attention to critical teaching and learning issues that should be taken into
account, acknowledged explicitly, and acted upon. These teaching and learning issues include: understanding
learners and their learning context; developing goals and learning outcomes; analysing, sequencing and synthesising
content; engaging learners with the subject matter; selecting media for learning and teaching; assessing learning
outcomes and providing feedback; and evaluating the efficacy of learning and teaching (Naidu, 2013). Focusing on
the design of learning draws attention to those elements that matter, thereby emphasising student learning and
learning outcomes and helping to streamline development activities so as to make use of resources in the best
possible way.
Learning Design Models
The use of learning design models is a pragmatic way to drive, guide, monitor and even control learning design
activity. One very well-known model, ADDIE (Molenda, Pershing & Reigeluth, 1996), maps the core and essential
processes in the complex activity of learning design. It does not seem to have a single origin (Molenda, 2003), but it
is well known and widely used among instructional designers. The core sequential, as well as iterative, elements of
the ADDIE model - Analysis, Design, Development, Implementation and Evaluation - provide a conceptual
framework of instructional design (Gustafson & Branch, 2002) and can be found, in some form, within other well-
known learning design models that have emerged (e.g., Gagné, 1973; Horton, 2012). The heart of most, if not all,
accepted models of learning design developed during the last 40 years reflects the generic stages of ADDIE.
In tertiary education settings, constructive alignment (Biggs & Tang, 2011) is very well accepted as key to framing
the process of learning design and development. On one level, constructive alignment is very straightforward and
logical. Simply, a course or a curriculum that is constructively aligned is designed so that intended learning

95
outcomes align with assessment activity as well as with the learning activities. In terms of a learning design model,
the focus is on articulation of clear learning outcomes, matching those with assessment tasks and then creating
learning activities to facilitate student learning towards being able to demonstrate the learning outcomes through
assessment tasks. The less straightforward implication of constructive alignment is that there should be
alignment of not only the observable teaching and learning behaviours, practices and strategies, but also alignment
of those observable practices and strategies with underpinning theories and beliefs. Achieving alignment is therefore
not an easy task and usually requires ongoing review and reflection on the success of course design, based on
evidence about its effectiveness gathered during the implementation.
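Constructive alignment lends itself to a simple machine-checkable reading. The sketch below is purely illustrative (the course data and the `unaligned_outcomes` helper are our own hypothetical names, not from the paper): every intended learning outcome should be matched by at least one assessment task and at least one learning activity.

```python
# Hypothetical course map: outcomes, and which outcomes each assessment
# task and learning activity addresses. Illustrative data only.
course = {
    "outcomes": ["ILO1", "ILO2"],
    "assessments": {"Essay": ["ILO1"], "Project": ["ILO2"]},
    "activities": {"Workshop": ["ILO1", "ILO2"], "Forum": ["ILO2"]},
}

def unaligned_outcomes(course):
    """Return outcomes lacking either an assessment or a learning activity."""
    assessed = {o for ilos in course["assessments"].values() for o in ilos}
    practised = {o for ilos in course["activities"].values() for o in ilos}
    return [o for o in course["outcomes"] if o not in (assessed & practised)]

print(unaligned_outcomes(course))  # [] -> the course is constructively aligned
```

A non-empty result would flag outcomes that are declared but never assessed or never practised, which is the kind of misalignment the ongoing review and reflection described above is meant to catch.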
Building upon foundations
We consider learning (re)design as being distinct from learning design as it is an ongoing professional development
activity that encourages us to view curriculum as something that is not fixed, but organic and developing over time;
reflecting the changing nature of the people and the contexts in which teachers, students and courses exist. All those
things are the curriculum and are not separate from it. The (lived and experienced) curriculum is a result of all those
factors coming together in action, not just something that is designed (in theory) and documented in a book or
posted online. Curriculum is thus defined here as "a balance of informed prescription and informed professionalism" (Schneider, 2008, p. 86). Learning (re)design is also academic work insofar as it mirrors the type of inquiry/research or reflective practice activity that most academics in tertiary settings engage in and are well versed in.

Our Action Research


Action research provides a rigorous methodology for investigating one's own practice with the specific intent to improve it. Several schools of action research thought have emerged and developed across the globe, including in the United Kingdom to support curriculum reform, in the United States of America allied with the progressive education movement and the work of John Dewey, and in Australia with the movement associated with collaborative curriculum planning (Mills, 2014).
What are the benefits of Action Research?
Action Research has a number of advantages as a research framework. These include:
being collaborative and participatory;
being able to be undertaken directly in situ;
using feedback from data in an ongoing cyclical process;
tending to avoid the paradigm of research that isolates and controls variables;
being formative, such that the definition of the problem, aims and methodology may alter during the process of action research;
being methodologically eclectic (Cohen, Manion & Morrison, 2011, p. 346).

There are four core reasons why Action Research provides a useful framework for learning (re)design. First, like learning (re)design, the aim of Action Research is to produce improvement. Second, Action Research offers an inclusive approach to learning (re)design that involves the participation of all stakeholders, including students. Third, Action Research can provide a more rigorous, continuous and systematic approach to (re)design. Finally, research outcomes of the learning (re)design process can be used to inform future redevelopments and be written up as practice-based research.

A Combined Action Research Model

In our Australian context we have adopted the Kemmis and McTaggart Plan, Act/Observe and Reflect action research cycle (1988) and combined it with Maxwell's reconnaissance (2003) to assist in identifying appropriate thematic concerns and developing the action research question. We have also included an additional revisiting of the literature to assist in the planning. This combined model is represented diagrammatically in Figure 1.

Figure 1. Maxwell / Kemmis and McTaggart Combined Action Research Model

This paper reports on the Reconnaissance and the first iteration of the Planning Phases, which led to the development of an integrated suite of strategies to improve the management, delivery and student experience for a large cohort unit, ICT in Education.

Situational Analysis

ICT in Education is a 150 hour duration, one trimester core unit in the Master of Teaching, offered by the School of Education at a large regional university in New South Wales, Australia. It is a reasonably large cohort unit with enrolments of up to 400 students per trimester. The unit is offered in two of the three trimesters each year. The majority of the students (95%) study the unit off-campus, with the unit being delivered through the Learning Management System (LMS) Moodle. The unit is designed to allow students to demonstrate the learning outcomes

related to the requirements of the Australian Institute for Teaching and School Leadership in the area of Information
and Communication Technologies (AITSL, 2011). The unit aims to develop in students the ability to use ICT
critically and effectively in both their studies and in their profession as teachers. Content and skills taught in ICT in Education include: the use and creation of digital multimedia; social networking and communications; strategies for
class and faculty administration; strategies for using information from the digital media; the ethical use of digital
information; and the use and evaluation of educational software.
The unit has been offered for over 15 years and has undergone a number of changes in content, assessment and
personnel. Both formal and informal feedback indicated there were a number of issues with the unit. These included
a decline in overall student satisfaction ratings to below the School average; issues with the provision of timely and
constructive feedback; increasing student and staff workload; and a lack of resources to support the teaching of the
unit.
The baseline data revealed that all students enrolled in the unit have an undergraduate degree and come from a wide variety of backgrounds, with a wide variety of motivations to study. Many students are commencing further study after a long absence in order to return to the workforce, or are retraining as teachers as a career change. The level of ICT experience varies widely amongst students enrolled in the unit. Some students are highly experienced, having either previously been, or being currently, employed in the ICT industry, or having existing work-related ICT skills. Other students, however, have a very low level of skill and possess only a rudimentary understanding of ICT. There is also a range of apparent attitudes towards the use of ICT, ranging from the enthusiastic to the downright hostile.

Assessing Competencies

A critical component of the Reconnaissance is to review the competence of the researchers involved in the Action
Research. In the current study this comprises the four members of the unit teaching team. From a learning (re)design
perspective this is an important process as it means that professional development needs are identified through the
collection of baseline data and specific actions/activities can be embedded into the unit organisation to share and increase skills and knowledge. Part of the motivation for the unit redevelopment was to make the best use of each member's specialist skills in teaching the unit, sharing skills and knowledge across the team, and improving the
student experience.
The baseline data revealed that the lecturers teaching the unit came from Primary teaching, Secondary teaching and
Training backgrounds. While all team members had strong ICT skills, each member also had their own specialised
skills in such areas as digital media, social media, programming and web-based instruction. All team members
were interested in improving their ability to effectively and efficiently manage and teach in large online contexts.

The Literature

In this study, the term literature comprises both the formal literature (academic knowledge) and knowledge drawn from discussions with colleagues (practitioner knowledge). These two forms of literature were combined to provide the study with a grounded
contextualised foundation that initially assisted in the development of a primary action research question and was
later revisited to inform the subsequent planning. Internal and external discussions with colleagues were conducted.
Team members communicated their ideas, opinions, and experiences teaching the unit through both formal (team

meetings) and informal (emails and casual conversations) means. Initially the challenges of teaching large cohorts of
students while maintaining educational quality set within the context of an increasing time-challenged workplace
dominated the conversation. Over time the following themes emerged and were summarised as a suite of key team
professional goals and reflections:
Enhancing and developing appropriate and effective teaching, assessment and support strategies.
Developing and implementing strategies to provide responsive and timely student feedback.
Reducing workload while maintaining high levels of student satisfaction.
Projection of team members' personalities into the online environment.
Developing knowledge and skills in Action Research.
The teaching team also reflected upon the nature of the unit; students' skills and attitudes towards ICT in teaching; student backgrounds and motivations for learning; and the teaching philosophies underpinning unit delivery.
Further exploration of the formal academic literature was subsequently guided by these five goals and reflections.
To expand the conversation, discussions were facilitated with teaching teams in other disciplines in the School, in particular those who also taught large cohort units. Common issues identified were the sheer volume of
student-lecturer interactions, the challenges of providing timely feedback, and the high workload associated with
these types of units.
Both formal and informal feedback was sought from students. Formal data was obtained from Student Evaluation
Questionnaires. This included feedback on such areas as student workload, intellectual stimulation, learning
resources, timely lecturer feedback and overall student satisfaction. As flagged previously, there had been a decline
in overall student satisfaction ratings to below the School average. Student related feedback was also sought from
members of the teaching team. This included online discussion forum comments and correspondence between
students and the teaching team on matters relating to the units delivery, and personal reflections and entries in a
private-lecturer-only forum on the units Moodle site titled, Unit Issues and Improvements. This private forum acted
a central repository where all members of the teaching team could record issues, ideas and suggest improvements as
they occurred to them. Capturing this type of information on the fly was found to be more effective than soliciting
feedback from team members after the teaching period of the unit had ended.
Investigation of the formal literature revealed that teacher ICT attitudes and self-efficacy strongly influence teacher
ICT use in the classroom (Abbitt & Klett, 2007; Sánchez, Marcos, González & GuanLin, 2012). Therefore, it was
important that pre-service teaching students enrolled in ICT in Education had an experience of the unit that was both
positive and increased their self-efficacy in using ICT. As described above, many students enrolled in the unit were
already negatively disposed towards ICT and so a poor experience in an ICT unit would only act to reinforce this
negativity. Therefore redevelopment of the unit was necessary in order to improve the student experience and make
students more positively inclined to the integration of ICT in their teaching.
Foundations for a Grounded Action Research Question (ARQ)
Overall, the purpose of the Reconnaissance is to provide the required research foundations in terms of researcher competencies; the environment, resources and practices; and a connection with previous academic work
and personal experiences (Maxwell, 2003, p. 7). In these respects the Reconnaissance might be considered
analogous to the Analysis Phase in the ADDIE model of learning design. However, while fulfilling similar
functions, there are a number of important differences between the two. First, Reconnaissance collects important
baseline data that can be used to assist in the evaluation of the overall design at the end of the Action Research
cycle. Second, Reconnaissance not only involves the participation of all major stakeholders but also the inclusion of
formal and informal literature.
From the analysis of our three Reconnaissance elements (i.e., situational analysis, competence and literature) a list of issues was identified (Table 1).
Differentiating learning and assessment to cater for a diverse range of students
Large volumes of forum posts, many of which were repetitive (i.e., similar questions)
Increasing student self-efficacy
Student expectations of fast turnaround times for responses to posts
Balancing staff workload while maintaining student expectations
Making best use of staff time, skills and expertise
Ensuring consistency in staff responses
Assessment (practical and conceptual)
Demonstrating good practice
Differentiated learning - different speeds of travel through the learning
Maintaining a positive unit climate

Table 1. Identified Issues - Reconnaissance Element Summary
Further discussion led to the development of a range of possible strategies to address the identified issues. The
proposed strategies are presented in Table 2.

Table 2. Identified Aspects and associated Proposed Strategies

The Action Research Question (ARQ)

The purpose of the Reconnaissance is to produce an Action Research Question (ARQ). It is important that the Action Research Question is derived from the reconnaissance so that the research is grounded in the practicalities and realities of the context where it is being conducted (Maxwell, 2003). Maxwell (2003, pp. 8-9) identifies three criteria that need to be met in the selection of an Action Research Question, criteria that have applicability for learning (re)design as well. First, the Action Research Question needs to be specific enough so that data can be gathered about it.
From a learning (re)design perspective this is important as it means evidence can be gathered to determine whether
the proposed improvements as a result of the redesign have been successful. Second, the Action Research Question
needs to be strategic in the sense that it will make a difference to practice, the situation or both. Similarly, in
learning design there needs to be certainty that the investment in time, effort and sometimes money will be
worthwhile and that it will bring about the desired (and planned) changes. Third, the Action Research Question has
to have "do-ability" (Maxwell, 2003, p. 9), meaning that improvements can be made under the particular set of constraints of the situation. As for learning (re)design, the redevelopment proposed needs to have "do-ability" and be
achievable. Having completed the Reconnaissance, the overarching Action Research Question developed for the unit
redesign was:

Will the introduction of an integrated suite of specific online strategies improve the management, delivery and
student experience for a large cohort unit?

Adopted Action Research Cycle Structure


To more effectively facilitate professional development opportunities, the research team decided to implement a selection of smaller concurrent Action Research cycles which focused on specific aspects of the proposed strategies (Table 2) in order to commence addressing the overarching Action Research Question. This
approach provided several strategic advantages including breaking the research down into smaller, more manageable
projects, allowing team members to gain experience managing a specific research component with readily available
advice and support from other team members, and enabling the respective projects to be operated in parallel.
Three aspects and eight associated strategies were selected and planned for the unit redesign: Unit Assessment -
Timely Rich Feedback via the provision of audio feedback (described in Parkes & Fletcher, 2014); Unit Assessment -
Structured as a set of Activities via restructuring the way the unit was developed using a model known as learning
through assessment; Unit Assessment - Practically focused via the creation of more project focused assessment

tasks; Social Presence - Strategies to establish lecturer social presence via mechanisms for improving lecturer
social presence; Forum Management - Two moderators via role allocation; Forum Management - Archiving posts via a regular weekly housekeeping procedure; and two other aspects which we will report in this paper to
demonstrate how Action Research was employed using an exemplar covering Forum Organisation - Specialised
Activity Forums and Forum Organisation - Use of FAQ forums.

Forum Organisation Exemplar - Action Research Cycle

The remainder of this paper will be devoted to presenting the results associated with the aspect of Forum
Organisation to demonstrate how Action Research was employed as a mechanism for learning (re)design to improve
both the student and lecturer experience. To commence the planning process the literature was revisited and
discussion amongst team members facilitated. This review process refocused attention and highlighted a number of
issues with the way the unit's discussion forums were structured and managed. These were:
large volume of posts;
repetitive nature of many of the posts;
turnaround times for responses;
inconsistencies in lecturer responses;
increasing lecturer workload;
making best use of lecturer skills and expertise.

As the unit was delivered online, the discussion forums were the main conduits for student and lecturer interaction. In the original organisation of the forums, students were placed in groups with a lecturer and worked independently of other groupings, being able to access and post only to their respective discussion forum. This way of organising the
discussion forums mimicked the tutorial model common in face-to-face scenarios where a single lecturer/tutor
supports a group of students, coming together for teaching and learning. Although this way of organising the
discussion forums was quite typical it was considered as being largely responsible for many of the identified
discussion forum issues. The organisational structure meant duplication with lecturers interacting with students
about the same content and activities and often answering the same types of questions. The large number of
repetitive posts that were occurring as a result of the forum organisation was also identified as a contributing factor
towards four further issues: the large volume of posts; increasing lecturer workload; turnaround time for responses;
and inconsistency in lecturer responses. To align this concurrent research cycle with the overarching action research
question (ARQ) a sub-ARQ was formulated to address this aspect of the unit redesign: Will the restructure of the
unit discussion forums through the inclusion of specialised forums and FAQs improve the management, delivery and
student experience of discussion forums?

Action Plan

The planned action involved reorganising the discussion forums based upon individual tasks and
activities. Task-related forums were created for: Unit Administration, General Discussion, and Referencing. Five activity-related forums were created according to the five activities making up the two assessment tasks of the unit.
These activity-related forums were Activity One Forum, Activity Two Forum, Activity Three Forum, Activity Four
Forum, and Slowmation Forum. The Slowmation was a substantial digital media project based upon the
Slowmation concept developed by Garry Hoban (2005). An example of one specialised activity forum is given in
Figure 2 (a).

Figure 2. Example of a (a) specialised forum and (b) FAQ forum

When posting, students were asked to post
questions and comments to the relevant task or activity forum. Each lecturer was assigned the primary responsibility
of moderating and responding to a particular forum thus helping to provide response consistency. It was predicted
the volume of posts to the Activity forums (which were linked to the respective assignment activities) would vary as
students progressed through the unit. So, the Activity One Forum would rise and peak followed by a rise and peak in
the Activity Two Forum and so on as students progressively completed these Activities. It was also predicted that
organising the forums in this way would help stagger and reduce lecturer workload. This staggered workload, it was believed, would also assist lecturers in responding more quickly to student posts.
Accompanying each specialised activity forum was an associated FAQ forum (Figure 2 (b)). These read-only
forums contained advice and answers to commonly asked questions. These FAQs were compiled from two main
sources: first, from teaching team members' previous experiences in teaching the unit and knowledge of the
type of common questions that had been asked previously; and second, during each iteration of the unit when a
student commented or asked a question that was felt by team members to be of relevance to all students, this
comment or question/response was captured and preserved as a FAQ.
Before students posted to any of the specialised forums they were asked to check the associated FAQ first. It was
predicted that the provision of FAQs would help alleviate a number of the main issues associated with the discussion
forums by:
reducing the volume of posts - as many common questions were answered as FAQs;
reducing lecturer workload - by removing the need for lecturers to answer the more common questions;
reducing turnaround time in responses - if it was a common question the answer would have been already
preserved as an FAQ and so the student would not have to wait for a response. Plus, with a reduced number
of posts lecturers could answer other posts more quickly;
removing inconsistencies in responses - as FAQs provided a consistent response to a range of questions.
Act/Observe and Reflect

During the Act/Observe and Reflect cycle the planned action for reorganising the discussion forums was implemented. This section presents a brief set of results for a number of the outcomes of the implemented action.

Use of Specialised Forums

With the creation of specialised forums it was predicted that the
volume of posts to the Activity forums (which were linked to the respective assignment activities) would vary as
students progressed through the unit, thereby staggering and reducing lecturer workload. The total number of posts
made by students in the Trimester 1 and 2, 2014 offerings of the unit to the five Activity forums is presented in
Figure 3.

Figure 3. Number of student posts to forums


As shown in Figure 3, the volume of posts varied roughly as predicted, with successive peaks of posts for Activity
One (A1), Activity Two (A2), Activity Three (A3), Activity Four (A4) and Slowmation (SM) Forums over the
duration of the unit. As individual team members were responsible for the moderation of a particular forum the
workload associated with managing their respective forums also rose, peaked and then fell away. This is in contrast to the original organisation, in which students were placed in groups and lecturers were each responsible for one student forum. With that previous arrangement, lecturers had to deal with posts across all areas for the entire duration of the unit. The new arrangement reduced lecturer workload, allowing lecturers to respond to posts in a more timely manner.
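The staggered rise and fall of posts described above amounts to a per-forum, per-week tally. A minimal sketch follows; the records below are invented illustrative data, not the unit's actual Moodle logs.

```python
from collections import defaultdict

# Hypothetical (forum, teaching week) records for student posts.
posts = [
    ("Activity One", 1), ("Activity One", 1), ("Activity One", 2),
    ("Activity Two", 3), ("Activity Two", 4), ("Activity Two", 4),
    ("Slowmation", 9), ("Slowmation", 10),
]

# Tally posts per forum per week, as plotted in Figure 3.
counts = defaultdict(int)
for forum, week in posts:
    counts[(forum, week)] += 1

# Find each forum's peak week to check that the peaks occur in sequence.
peaks = {}
for (forum, week), n in counts.items():
    if forum not in peaks or n > counts[(forum, peaks[forum])]:
        peaks[forum] = week

print(peaks)  # {'Activity One': 1, 'Activity Two': 4, 'Slowmation': 9}
```

With real log exports, successive peak weeks per forum would confirm the predicted staggering of moderation workload.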
Use of FAQ forums

Frequency analysis of the number of individual hits on all the FAQ forum posts by students was undertaken using
data downloaded from the Moodle discussion forum logs. Figure 4 presents the data for 290 students enrolled in the
Trimester 2, 2014 offering of the unit.

Figure 4. Number of FAQ forums hits by students in Trimester 2, 2014


In total, there were 14,445 hits on the FAQ forums. With 290 students enrolled in the unit, this equates to approximately 50 hits per student, representing substantial use of the FAQ forums by students who might otherwise have posted questions on the specialised forums had no FAQs been available. The use of the FAQ forums was found to reduce the overall number of posts, which further acted to reduce lecturer workload.
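The roughly 50 hits per student quoted above is a simple average over the reported totals:

```python
# Totals reported in the paper for the Trimester 2, 2014 offering.
total_faq_hits = 14445
enrolled_students = 290

hits_per_student = total_faq_hits / enrolled_students
print(round(hits_per_student, 1))  # 49.8 -> roughly 50 hits per student
```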
Overall student satisfaction
Changes made to the organisation of the discussion forums, plus a range of other strategies, were implemented to, amongst other things, improve the student experience of the unit. As mentioned previously, part of the motivation
for the redevelopment of the unit was a decline in student satisfaction to below that of the average for the School.
Figure 5 provides the overall student satisfaction ratings from the Student Evaluation Questionnaire for the unit
against the average ratings for the School from 2009 to 2014 inclusive. The identified strategies were implemented
after the first teaching period of the 2011 iteration of the unit. Student satisfaction rose from 3.59 to 4.18 after initial
redevelopment of the unit and rose again to 4.65 after further refinements were made in the subsequent teaching
period. Pleasingly, student satisfaction has remained high and above the School average for six successive teaching
sessions.

Figure 5. Overall student satisfaction rating - unit versus School average 2009 - 2014

Conclusion
Action research was demonstrated to be a suitable framework for learning (re)design. Through Action Research,
members of the teaching team were able to engage in a systematic process of action enquiry for improving learning,
in order to improve practice. Importantly, by undertaking learning (re)design as Action Research, the outcomes of
this process can be presented as research to the wider education community. In any research programme, it is
important to make the work public, and to articulate its significance for new learning: this includes for one's own education; for the education of others; and for the education of social formations. Implemented in such a fashion, the application of Action Research to learning (re)design can help better balance two often competing obligations for the modern academic - teaching and research - without having to compromise the quality of either.
References
Abbitt, J. T., & Klett, M. D. (2007). Identifying influences on attitudes and self-efficacy beliefs towards technology
integration among pre-service educators. Electronic Journal for the Integration of Technology in Education, 6(1), 28-42.
AITSL (Australian Institute for Teaching and School Leadership) (2011). National Professional Standards for
Teachers. Retrieved from http://www.aitsl.edu.au/docs/default-source/default-document-
library/aitsl_national_professional_standards_for_teachers
Bell, M., Bush, D., Nicholson, P., O'Brien, D., & Tran, T. (2002). Universities Online: a survey of online education
and services in Australia. Occasional Paper Series. Retrieved from http://www.voced.edu.au/node/4717
Biggs, J. & Tang, C. (2011). Teaching for quality learning at university, Buckingham: Open University
Press/McGraw Hill.
Cohen, L., Manion, L., & Morrison, K. (2011). Research Methods in Education (7th ed.), London: Routledge.
Dick, W. (1996). The Dick and Carey model: Will it survive the decade?. Educational Technology Research and
Development, 44(3), 55-63.
Dick, W. & Carey, L.M. (1978). The systematic design of instruction. (1st ed.). New York: Harper Collins.
Gagné, R. M. (1973). Learning and instructional sequence. Review of Research in Education, 1, 3-33.
Gustafson, K. L., & Branch, R. M. (2002). What is instructional design? In R.A. Reiser & J. A. Dempsey (Eds.),
Trends and issues in instructional design and technology (pp. 16-25). Saddle River, NJ: Merrill/Prentice-
Hall.
Hoban, G. (2005). From claymation to slowmation. Teaching Science, 51(2), 26-30.
Horton, W. (2012). E-learning by design. San Francisco: Pfeiffer.
Kemmis, S. & McTaggart, R. (1988). The Action Research Planner (3rd ed.). Waurn Ponds: Deakin University.
Maxwell, T. (2003). Action Research for Bhutan? Rabsel, 3, 1-20.

Mills, G. E. (2014). Action Research: A Guide for the Teacher Researcher (5th ed.). Upper Saddle River, NJ: Pearson.
Molenda, M. (2003). In search of the elusive ADDIE model. Performance Improvement, 42(5), 34-36.
Molenda, M., Pershing, J.A., & Reigeluth, C.M. (1996). Designing instructional systems. In R. L. Craig (Ed.), The
ASTD training and development handbook (4th ed.) (pp. 266-293). New York: McGraw-Hill.
Naidu, S. (2013). Instructional design models for optimal learning. In Moore, M. G. (Ed.). Handbook of distance
education (3rd ed.) (pp. 268-281). New York: Routledge.
Parkes, M., & Fletcher, P. R. (2014, June). Talking the Talk: Audio Feedback as a Tool for Student Assessment. In
World Conference on Educational Multimedia, Hypermedia and Telecommunications (Vol. 2014, No. 1,
pp. 1606- 1615).
Reigeluth, C. M. (1983). Instructional design: What is it and why is it? In C. M. Reigeluth (Ed.), Instructional
design theories and models: An overview of their current status (pp. 3-36). Hillsdale, NJ: Erlbaum.
Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The Instructional Design Knowledge Base: Theory, Research,
and Practice. New York: Routledge.
Sánchez, A. B., Marcos, J. J. M., González, M., & GuanLin, H. (2012). In-service teachers' attitudes towards the use of ICT in the classroom. Procedia-Social and Behavioral Sciences, 46, 1358-1364.
Schneider, A. (2008). Seeing school systems through the prism of PISA. In Luke, A., Weir, K., & Woods, A. (Eds.).
Development of a set of principles to guide a P-12 syllabus framework. Brisbane: Queensland Studies
Authority. p. 86.
Smith, P. L. & Ragan, T. J. (2005). Instructional design (3rd ed.). Hoboken, NJ: John Wiley & Sons.

10. Using ADDIE to Design Online Courses Via Hybrid Faculty Development
Michelle Fulks Read, PhD, Texas State University, United States, Michelle.Read@txstate.edu
Gwendolyn Morel, MS, Texas State University, United States, gwendolynmorel@txstate.edu
Danyelle Hennington, M.A., Texas State University, United States, ldh82@txstate.edu
Introduction
As higher education institutions are turning to online teaching and learning, faculty are being asked to transition
traditionally delivered face-to-face courses to offerings delivered partly or entirely online. Professional
development programs to prepare faculty to teach online are needed, "not only to learn the technical aspects of teaching online but, more importantly, to consider new and different ways of teaching" (McQuiggan, 2012, p. 29).
Many university faculty development models take instructors through a step-by-step training process (Hinson &
LaPrairie, 2005) to facilitate faculty change in instructional practices and course design mindset. However, beliefs
and attitudes are slow to change, often doing so over time through experience (Rogers, 2003). Deeply rooted
philosophical changes are hard to gauge in singular faculty development events (Read, 2014; Veletsianos, 2011;
Cranton, 2002).
The purpose of this paper is to present faculty perceptions of effectiveness and satisfaction from a hybrid faculty
development course where participants were taught instructional design principles using the ADDIE (Analyze, Design, Develop, Implement and Evaluate) model to create their future online college courses. The study also
looked at participants' confidence and inclination to use what they had learned from the faculty development course
in the future. Participants were not specifically told they were being taught the steps in the ADDIE model or
instructional design principles.
Background
The use of online platforms for professional/faculty development is increasing (Sawchuck, 2009; Dede et al., 2009; Walsh, 2010).

The need for professional development that can fit with teachers' busy schedules, that draws on powerful resources often not available locally, and that can create an evolutionary path toward providing real-time, ongoing, work-embedded support has stimulated the creation of online teacher development programs. (Dede et al., 2009, p. 9)

Additional benefits include the opportunity for faculty to "become students while developing new technology skills" (Walsh, 2010, p. 518).

Designing an online course is difficult and time consuming. According to Puzziferro and Shelton (2014), "Designing an online course requires a systemic process that dissects the course learning objectives, presents content, interactivity and assessment" (p. 125). The ADDIE model provides a direct instructional design approach appropriate for encouraging faculty to consider the importance of pre-planning and applying a systematic approach to course design, followed by evaluation as a guide for future revisions to the course design. Instructors have to reconcile the notions, habits, and ideas they used in their face-to-face instruction and convert them to an online environment. Koehler, Mishra, Hershey, and Peruski (2004) note:

Introducing a new concept (the Web), where the rules of face-to-face teaching do not necessarily apply, challenged faculty to establish new ways of thinking about course design. It required the development of new procedures, tools, and artifacts to represent and teach their content on the Web. (p. 35)

Specific faculty concerns included student engagement with readings and their own lack of technology expertise (Koehler et al., 2004). In their 2004 study, Koehler et al. used a design approach to faculty development to help their faculty build online courses. Their approach used a collaborative model between faculty and graduate students. As faculty designed and developed, students reviewed and provided feedback in several iterations long before the courses were delivered to enrolled students. Along the way, faculty not only learned new technology skills, "they also thoughtfully considered how the technology could be leveraged to accomplish higher-order learning goals for their potential students" (Koehler et al., 2004, p. 51).

More recently, Puzziferro and Shelton (2014) described their use of ADDIE to help their instructors via a 14-week, completely online, 4-phase (concept & planning, course development, testing & revision, and final approval) faculty development program. Their model included principles of the basic ADDIE model and used a team approach comprising the faculty member, an instructional technologist, a course developer, technical support, a librarian or copyright clearance coordinator, and, as needed, additional course development team members who served as subject matter experts (SMEs), usually drawn from the faculty member's department. Faculty used a course-planning document, which included objectives, assessments, activities, and content, and followed this planning stage by designing the course in a course template. By the third week, an initial prototype for the first half of the course was under development. Once the course was approved and taught, members of the team met again to discuss issues, concerns, positive experiences, and other observations from the pilot implementation. Those discussions drove course revisions and maintenance. Although the authors have not published formal findings regarding faculty satisfaction, they note positive preliminary results.
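One possible reading of how Puzziferro and Shelton's four phases line up with the five ADDIE steps can be sketched as a small lookup table. The pairing below is our illustration, not taken from their paper:

```python
# Hypothetical mapping of the 4-phase model described above onto ADDIE;
# the alignment is an assumption for illustration, not the authors' own.
PHASE_TO_ADDIE = {
    "concept & planning": ["Analyze", "Design"],
    "course development": ["Develop"],
    "testing & revision": ["Implement", "Evaluate"],  # pilot, then revise
    "final approval":     ["Evaluate"],
}

# Confirm every ADDIE step is covered somewhere in the four phases.
covered = {step for steps in PHASE_TO_ADDIE.values() for step in steps}
```

Evaluation appears twice by design: ADDIE treats it both as a gate within iteration and as the final judgment on the course.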

By using the ADDIE instructional design model as a guide for a hybrid faculty development course aimed at instructors building their own hybrid or fully online courses, the instructional design team modeled the underlying principles of ADDIE while teaching them to participants. Throughout the course, instructional designers also explicitly modeled the transfer of initial planning and objective identification into course design and development, and guided participants as they navigated the process themselves. Modeling (Darling-Hammond & McLaughlin, 1995), particularly cognitive modeling (Read, 2014), and coaching/guidance (Darling-Hammond & McLaughlin, 1995) are effective instructional approaches in faculty development and are identified as essential elements within cognitive apprenticeship learning theory (Collins, Brown, & Holum, 1991). As a follow-up, instructional designers also helped participants run and analyze formative assessments from their courses' pilot delivery.
The Study
In Fall 2014, instructional designers at Texas State University piloted a sustained, semester-long, hybrid faculty development course for instructors charged with, or desirous of, designing hybrid or online college courses. The designers evaluated participant needs through a course application process and telephone interviews to ensure that participants could successfully complete a semester-long faculty development course during which they would build an entire online course with the aid of the instructional design team. The faculty development course comprised four face-to-face meetings in the first five weeks, followed by alternating weeks of meeting with an instructional design consultant individually and with peers in small groups for presentation and consultant/peer feedback. Throughout the course, library copyright experts and experienced online teaching faculty provided guest speaking sessions intended to offer subject matter expertise to course participants. During the last week, participants presented their completed courses to fellow participants, department chairs, and departmental colleagues.

Faculty development topics included syllabus design and learner expectation development; writing objectives; planning, designing, and building a course in Sakai; building online community; managing online discussions; active learning; formative evaluation plans; creating mini-lecture videos; using 3rd-party video content (e.g., YouTube, TED Talks); and quiz design. Opportunities were provided for individual consultations with instructional designers, small group sharing with peer feedback, in-class discussions, online discussion forums, online synchronous webinars, access to previously created online courses, and opportunities to listen to faculty who had traversed the process before and who provided expertise in specific areas, including "How to Manage a Course and Still Have a Life," "Hybrid Course Design," and the use of rubrics. Course planning templates (Planning Matrix, TRACS Design Document, Copyright Documentation, and Quiz-to-Objective Alignment) were used to help facilitate instructors' course development. Instructional design consultants reviewed participant work, dividing the semester-long course into three stages. For each stage, instructors wrote objectives, created content and assessments matched to objectives, and submitted their plan for consultant review before moving on to the TRACS Design Document, which served as a storyboard for designing the course site. (TRACS is the institutional name for Sakai, the LMS used by the university.) Once the document was completed, instructors submitted it and waited for consultant feedback before moving on to the build stage. Consultants then provided feedback on the build before instructors moved on to the next stage. During this process, instructors also attended to other course needs such as drafting a syllabus and recording mini lectures.
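The gated, three-stage review cycle described above can be sketched as a tiny workflow. Stage and artifact names below paraphrase the text; the code itself is a hypothetical illustration, not software the authors used:

```python
# Hypothetical sketch of the per-stage review gates described above:
# plan -> consultant review -> TRACS Design Document -> review -> build -> review.
ARTIFACTS = ["plan", "tracs_design_document", "build"]

def run_stage(stage, submit, review):
    """Submit each artifact in order; consultant review gates progress
    to the next artifact (and, after 'build', to the next stage)."""
    for artifact in ARTIFACTS:
        submit(stage, artifact)
        review(stage, artifact)

log = []
for stage in (1, 2, 3):
    run_stage(
        stage,
        submit=lambda s, a: log.append(("submit", s, a)),
        review=lambda s, a: log.append(("review", s, a)),
    )
```

The point of the sketch is the ordering constraint: no artifact is started before the previous one has received consultant feedback.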

During the pilot and the subsequent second delivery of the faculty development course, participants were asked to provide anonymous feedback at several points: after the first meeting, after the second meeting, at mid-course, and at the end of the course. The ongoing formative evaluations led to some changes that were immediately necessary for the course, while more substantive revisions were made for overall improvement at the systemic, global level.
During the pilot implementation, three university faculty members participated in the faculty development course, with two completing end-of-course evaluations, while nine university faculty members participated in the second course delivery and its evaluations. Faculty represented colleges and departments from across the university, including Clinical Lab Science, Communication Studies, Family Consumer Sciences, Finance, Journalism, Management, Music, Psychology, and Social Work. All participating faculty held doctorate degrees with the exception of two who held master's degrees. Experience teaching at the higher education level ranged from three to 34 years, with an average of 15.6 years. Half of the participants had some form of prior online teaching experience. Faculty participating in the second course delivery reported self-confidence and familiarity scores for the institutional learning management system (N=9, M=2.89, SD=.60), Microsoft Office products (N=9, M=3.22, SD=.44), and basic computer skills (N=9, M=3.56, SD=.73).
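The descriptive statistics reported throughout (M and SD on the 4-point Likert scale) are standard sample means and sample standard deviations. As a worked illustration on hypothetical responses (not the actual study data):

```python
from statistics import mean, stdev

# Hypothetical Likert responses on the study's 4-point scale
# (4 = Agree ... 1 = Disagree); illustrative only, not the real data.
responses = [4, 4, 4, 4, 4, 4, 4, 4, 3, 3]

m = mean(responses)
sd = stdev(responses)  # sample standard deviation (divides by n - 1)
print(f"N={len(responses)}, M={m:.2f}, SD={sd:.2f}")
```

With these invented values the code reports N=10, M=3.80, SD=0.42; the paper's rounded figures would be produced the same way from its raw responses.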
Findings
The descriptive and qualitative findings explore the individual and collective perceptions of the eleven instructors
who participated in the hybrid faculty development course. Results are presented by research question.
What is participants' overall experience of and satisfaction level with the hybrid faculty development course?

Generally, faculty development participants were happy with the learning experience (N=11, M=3.8, SD=.40) and, as one participant explained of online learning and teaching, enjoyed learning about "new possibilities." Furthermore, all of the participants chose "agree" or "tend to agree" when asked if they would recommend the course to others (N=11, M=3.9, SD=.30). Instructional design consultations and feedback appeared to be the most helpful to participants, followed by group meetings and peer feedback. One participant, calling out the help of his ID consultants, remarked, "Really, when I was feeling discouraged about my perceived lack of progress in the course design, they were able to ease my concerns and get back on track." Figure 1 displays participant satisfaction with specific aspects of the faculty development course. In terms of specific assignments, participants liked, in order of preference: drafting a syllabus (9); recording a mini lecture (6); peer review (5); creating a Learner Expectations document and forum discussion (5); and sharing their courses with others (5).

Finally, approximately two-thirds of the participants found the faculty development site easy to navigate (Figure 2) and felt they understood the expectations of the course (Figure 3). Participants responded to the open-ended question, "What did you like best about the course?", with varied answers, including: naming their specific ID consultant and the guidance received; the opportunity to design and create a real lesson/course; the flexibility and open-mindedness of the program in meeting various needs; connecting with faculty and group discussions; and the experience of feeling like a student. As one participant noted, she developed "better ideas about how students perceive my course sites and instructions." Another participant, asked what they liked least about the course, wrote, "feeling like a student again - imposed deadlines. Not something that should change, but it was an obstacle I had to overcome psychologically."

Figure 1. Faculty Development Overall Experience and Satisfaction Mean Results (N=11); Scale 4 (Agree) to 1
(Disagree)

Figure 2. Faculty development participants' agreement regarding ease of navigation through the course.

Figure 3. Faculty development participants' agreement that they understood the expectations of themselves as learners in the faculty development course.

[Figure 1 survey items: Peer feedback helpful; Group meetings helpful; Consultation and feedback helpful; Workload was manageable; Depth of info right; Pace was right; Confident to apply skills in new settings; Recommend to others; Overall satisfaction]

What are participants' intentions and levels of confidence in applying what they have learned to their future online course design? Overall, participants planned to take the skills they had developed in the faculty development course and apply them in the design and development of future courses. Figure 4 shows the degree of agreement with intentions to use ideas, skills, and materials from the faculty development course. One participant stated, "The planning matrix seemed to be most beneficial for me to clearly articulate my objectives, work flow, and assessments for each lesson and the entire course. I will be making one for other courses that I design in future." Furthermore, participants expressed confidence (N=11, M=3.45, SD=.52) in their ability to take what they had learned in the faculty development course and apply their new skills to future courses. One participant noted, "...I feel better prepared as a result of having been through the program."

Figure 4. Faculty Development Overall Post Faculty Development Intentions Mean Results (N=11); Scale 4
(Agree) to 1 (Disagree)

Conclusions

Our evaluation showed that, overall, participants were satisfied with the Advanced Online Course Design and Development faculty development course presented by instructional designers at Texas State University. Concerns with course navigation and participant expectations were present for only a few. All expressed some degree of confidence in their abilities to use what they had learned in future courses. Consultation and feedback seemed to be the most helpful design element in the faculty development course itself; however, participants valued peer feedback and group meetings as well. More participants were likely to use a planning matrix in future course designs than the TRACS Design Document, although most were likely to use both, and nearly all participants agreed or tended to agree that they would use formative evaluations as part of an instructional design approach to creating new online courses. Moreover, participants remarked positively on learning new ways of doing things and challenging their students in online learning (Koehler et al., 2004).

Several participants commented on the advantage of being a learner in an online course, because it allowed them to see how students would view and experience online learning environments, which has previously been identified as an advantage of online professional development (Walsh, 2010). While one person noted "feeling like a student" as a least-liked aspect of the course, the participant also acknowledged the advantage by adding that the imposed feeling should not be changed even though it required psychological adjustment.

Limitations of this study include its very nature as an evaluation of only two iterations of identical faculty
development events, the small number of participants thus far, and the self-reporting aspect of the survey. However,
current results suggest general success with the faculty development design, development, and implementation,
particularly the use of ADDIE as an underlying instructional guideline for teaching faculty to create their own online
courses. Future studies should continue to garner information about effective approaches to faculty development
delivery and more closely examine the confidence levels of faculty to design and teach their own online courses,
adding additional focus to the potential teaching belief changes of faculty from before, during and after the faculty
development course experience (Read, 2014; McQuiggan, 2012).
References

Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 15(3), 6–11.
Cranton, P. (2002). Teaching for transformation. New Directions for Adult and Continuing Education, 93, 63–72.
Darling-Hammond, L., & McLaughlin, M. W. (1995). Policies that support professional development in an era of reform. Phi Delta Kappan, 76(8), 597–604.
Dede, C., Ketelhut, D., Whitehouse, P., Breit, L., & McCloskey, E. M. (2009). A research agenda for online teacher professional development. Journal of Teacher Education, 60(1), 8–19. doi:10.1177/0022487108327554
Hinson, J., & LaPrairie, K. (2005). Learning to teach online: Promoting success through professional development. Community College Journal of Research and Practice, 29(6), 483–493.
Koehler, M., Mishra, P., Hershey, K., & Peruski, L. (2004). With a little help from your students: A new model for faculty development and online course design. Journal of Technology and Teacher Education, 12(1), 25–55.
McQuiggan, C. (2012). Faculty development for online teaching as a catalyst for change. Journal of Asynchronous Learning Networks, 16(2), 27–61.
Puzziferro, M., & Shelton, K. (2014). A model for developing high-quality online courses: Integrating a systems approach with learning theory. Journal of Asynchronous Learning Networks, 12(3–4), 119–136.
Read, M. F. (2014). Faculty change for disciplinary literacies instruction: Effects of cognitive modeling as an instructional approach in online professional development (Doctoral dissertation). Retrieved from University of Texas at Austin Electronic Theses and Dissertations.
Rogers, E. (2003). Diffusion of innovations (5th ed.). New York: Free Press.
Sawchuk, S. (2009, March 26). Teacher training goes in virtual directions. Education Week.
Veletsianos, G. (2011). Designing opportunities for transformation with emerging technologies. Educational Technology, 51(2), 41.
Walsh, K. (2010). Object-oriented faculty development: Training teachers with learning objects. In H. Song, T. T. Kidd, & K. E. Amaral (Eds.), Handbook of Research on Human Performance and Instructional Technology (pp. 517–532). Hershey, PA: Information Science Reference, IGI Global.

11. Dialogue Design System to Share Information and Construct Knowledge
Päivi Aarreniemi-Jokipelto
HAAGA-HELIA School of Vocational Teacher Education Finland
Paivi.aarreniemi-jokipelto@haaga-helia.fi

Yukari Makino
Faculty of Informatics Kansai University Japan
makino@kansai-u.ac.jp
Introduction
The Europe 2020 strategy recognises that education and training need to supply the new skills and competences required by future European societies (European Commission 2010). Critical thinking and problem solving are frequently mentioned as learning skills of the 21st century. There is also a need to promote active learning and provide individuals the opportunity to control and develop their own learning (Horizon 2014). Redecker et al. (2011) defined future learning strategies as personalization, collaboration, and informalization. This means, for example, team learning and an instructor working as a guide in the learning process. Students' active participation in the learning process motivates and engages them more than teacher-centred approaches do (Aarreniemi-Jokipelto 2014). While personalization promotes the integration of learning into students' work, it also boosts professional development. At the same time, educational technologies are facilitating learning, with learning moving from face-to-face settings to cyberspace and contact learning occurring through mobile devices. The use of students' own devices in the classroom has become a very popular approach and has led to the bring your own device (BYOD) concept.

The Dialogue Design System (DDS) utilized in this study aims to support 21st century learning skills, such as critical thinking and problem solving, by enforcing proper and logical reasoning. It is used in a learner-centred approach in which the assignment is personalized for a student's and his/her institution's current state and needs, and a central part of the learning is collaborative knowledge construction with peers.
Dialogue Design System (DDS)
The Dialogue Design System (DDS) was originally developed in a mass lecture class at a Japanese university (Makino 2008). It aims to develop students' identities in a learning community and to help them grow as responsible participants who will make meaningful contributions to collaborative knowledge construction. One of the features of the system is that the DDS provides all students with an equal opportunity to participate and ensures the quality of learning outcomes through a guided dialogue. A guided dialogue is one based on a theoretical framework called the Message Construction Cross (MCC) model (Makino 2005, 2007, 2008, 2009).

The MCC model describes the grammar of meaning making from the bifocal perspectives of the two crosses (see Fig. 1). The five components show the whole image of the MCC model holistically, whereas the seven elements describe the individual links precisely. The holistic view represents the fundamental principles of message construction in the shape of a cross. The horizontal axis represents the principle that meaning is created and shared through interactions between Logic and Dialogue. The vertical axis represents the principle that the value of meaning is evaluated in the Context.

The precise view, on the other hand, describes the individual elements. The right wing (Logic) consists of thesis, general, and specific. The left wing (Dialogue) consists of antithesis, synthesis, and thesis. Logic refers to traditional Aristotelian logic (demonstrative reasoning), whereas Dialogue refers to Hegelian dialectics (dialectical reasoning). The central pillar consists of issue, thesis, and opinion. In real-world situations such as problem solving and decision making, the thesis as proposition comes from questioning the issue, and becomes the basis of opinion.

Figure 1. Message Construction Cross (holistic view and precise view)
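The precise view's three axes and seven elements can be represented as a simple data structure. This is our own illustrative encoding of the description above, not part of the MCC publications:

```python
# Hypothetical sketch of the MCC cross's "precise view"; element names
# follow the text above, while the dictionary structure is our assumption.
MCC_PRECISE_VIEW = {
    "central_pillar": ["issue", "thesis", "opinion"],
    "logic_wing":     ["thesis", "general", "specific"],     # demonstrative reasoning
    "dialogue_wing":  ["antithesis", "synthesis", "thesis"],  # dialectical reasoning
}

# The seven distinct elements; "thesis" is shared, linking all three axes.
elements = sorted({e for axis in MCC_PRECISE_VIEW.values() for e in axis})
```

Encoding it this way makes the model's key structural fact visible: thesis appears on every axis, which is why it can mediate between logic, dialogue, and the central pillar.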

The philosophy of the DDS is also evident in the idea of soft argumentation for citizenship education, in which everyone feels equal and secure because power relations are reduced (Makino 2010). Generally, those who have powerful voices are active in argumentation and thus tend to dominate the process of decision making, while those with powerless voices remain inactive listeners. However, soft argumentation, in which dialogues are artificially designed based on the MCC model, respects the equality of participants and supports their growth as active contributors regardless of the power of their voices. The verbal grammar and visual structure of the MCC model facilitate the construction of a process and the analysis of a product in knowledge construction.

Metaphors of chemical reactions, such as molecules and compounds, illustrate the dynamic process of message construction described in the MCC model. For example, the photographs in Figure 2 show the visualization of chemical reactions in soft argumentation in the context of teacher education (Makino 2010). This visualization helps the participants view the process and product objectively. The idea of guided dialogue is much like a laboratory experiment: participants follow a certain procedure in order to induce chemical reactions that would not naturally occur without a catalyst. It is through a laboratory experiment that we learn and understand the process and product of chemical reactions. Likewise, we can learn and understand the process and product of collaborative knowledge construction.

In Photo 1, each group presents their molecule of thesis, general, and specific. In Photo 2, a facilitator gives
instructions. The participants use pieces of paper to write down their comments. Pink papers are for free comments
about content and blue for critical comments about logic. In this way, all participants have equal opportunity to
contribute to the process of collaboration regardless of their power relations.

In Photo 3, the participants carefully examine each of the displayed molecules. As a result, in Photo 4, some groups receive blue papers and some groups pink papers, while other groups receive no comments at all. In fact, this is the reality of natural argumentation. However, this sometimes makes it too obvious who received what. In such cases, facilitators have another option: in Photo 5, every group receives the same number of pink and blue papers, so that no one has to feel uncomfortable.

In Photo 6, the members of each group read (listen to) what others have written in the pink and blue papers. In Photo
7, each group examines their arguments based on the comments from others and develops the construction of a
molecule of antithesis, synthesis, and thesis. The cross in Photo 8 represents the compound created in this soft
argumentation as an outcome of the chemical reactions.

Figure 2. Visualization of chemical reactions in soft argumentation (Makino 2010, p.21)

The idea of an artificially guided dialogue may sound awkward. However, participants can calmly share information and critique each other without getting too emotional. In natural discussion, it usually takes a considerable amount of time for all members to speak; with guided dialogue, everyone can participate and contribute within a limited time. This laboratory experiment is a training ground for the participants to understand the process and product of argumentation and, more importantly, to develop the attitude of listening to others, particularly the voices of those who seem powerless. When they have adequately developed their perspectives and attitudes through the training, they no longer need the artificial guidance.

Although the DDS is designed based on the same philosophy of soft argumentation, the system of guided dialogue has become more sophisticated as it has been modified to accommodate a university lecture class of two hundred students. Not only is the number of participants large, but the course's expected level of knowledge acquisition is also higher. The DDS procedure includes the following six steps in order to construct a cross of the MCC within a cycle.

(1) The teacher presents a thesis that represents the content of a lecture.

(2) The participants (mostly first-year students) judge whether the thesis is true or false. They use an idea card to
describe their reasoning and objective evidence.

(3) The team of staff members (third-year students) grade these idea cards, select the excellent ones, and organize
and integrate them according to the MCC grammar.

(4) The staff team prepares for their feedback presentation. The other staff members and the teacher critique and
give advice.

(5) The staff team gives the feedback presentation in front of the participating students in the next class session.

(6) The teacher summarizes the outcome of the dialogue, and connects it to the context of the course.

This is the path of the guided dialogue (see Fig. 3). First, the participating students judge the validity of the thesis drawn from the presupposition of the lecture (issue). They support their positions with objective evidence (general and specific). Then, the staff team places the students' ideas in a dialectic order (antithesis and synthesis) and, in the feedback presentation, offers an opinion as the outcome of the dialogue between the participating students. The process is designed so that each cycle constructs a cross according to the MCC grammar. The teacher can entrust the facilitation role to the staff members; the DDS thus allows the teacher to act as a supervisor of students' dialogues and focus on overall class management and instruction.
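The six-step cycle can be sketched as an ordered pipeline. This is a hypothetical illustration; the step names paraphrase steps (1)–(6) above, and the handler functions are stand-ins for the human activities they describe:

```python
# Hypothetical sketch of the DDS guided-dialogue cycle; step names are
# our paraphrase of the six steps described in the text.
DDS_STEPS = [
    "teacher_presents_thesis",
    "participants_judge_with_idea_cards",
    "staff_grade_and_organize_cards",
    "staff_rehearse_feedback_with_critique",
    "staff_present_feedback",
    "teacher_summarizes_in_course_context",
]

def run_semester(handlers, cycles=9):
    """Repeat the six-step cycle; per the text, nine cycles per semester."""
    log = []
    for _ in range(cycles):
        for step in DDS_STEPS:
            handlers[step]()   # a human activity in the real DDS
            log.append(step)
    return log

# Trivial no-op handlers just to trace the ordering.
log = run_semester({step: (lambda: None) for step in DDS_STEPS})
```

The fixed ordering is the point: each cycle must complete a full MCC cross (issue through opinion) before the next thesis is presented.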

Figure 3. The design of guided dialogue in the DDS model (Makino & Leppisaari 2014, p.1365)

In practice, the cycle of six steps is repeated nine times during a semester (Makino & Leppisaari 2014). It prepares the participating students for subsequent activities, such as group work and entire-class discussions, where collaboration and contribution are required. The participating students have the opportunity not only to present their ideas but also to practice reasoning with the idea card. The staff students act as mediators between the participating students by grading, organizing, and integrating their idea cards. For those students who cannot describe their ideas sufficiently, the staff members act as peer tutors and help them improve their writing.

Through this continuous training, both the participating and staff students develop basic skills not only for academic writing, such as expository essays and final theses, but also for information sharing and idea development in knowledge construction. Through the guided dialogue of the DDS, they grow as responsible contributors to collaboration and as members of a learning community.

The Study
The study was conducted in the VET Teachers for the Future programme, in which 27 Brazilian vocational education teachers studied in an exported education programme in Finland. The participants came from different fields of education, and their previous competences in educational technology varied from beginners to those who had completed doctoral theses in e-learning. The study was conducted in a three-day e-learning module of the programme, in which the participants had previously studied the use of educational technology in online and face-to-face learning over a period of six days.

Design science research was the approach utilized in the study. A design scientist attempts to engineer innovative educational environments and simultaneously conducts experimental studies of those innovations (Brown 1992). The objective of this study was to gain an understanding of the DDS and MCC as models to be used in contact learning environments together with educational technology tools. The pilot phase of the DDS served as an experimental study to improve and increase understanding of the DDS model in contexts different from those in which the model had been created and used. Design science research consists of two activities: build and evaluate (Järvinen 2001). Building refers to the process of constructing an innovation, artefact, or model for a specific purpose. In the present study, building refers to the process of tailoring the DDS model for use in exported educational programmes in which a Finnish teacher teaches e-learning and educational technology to Brazilian teachers. Evaluation determines how well the innovation, artefact, or model performs. In the evaluation of the tailored model, one question is whether it is in some sense better than teaching and learning methods used previously in the same situations. The main condition of validity for a construction is that it solves the problems in question (Kasanen et al. 1993).

The main research question of this study was: Is it possible to tailor the DDS model to be used in teaching e-learning
and educational technology to Brazilian teachers in an exported education programme?

Secondary questions were:

(1) Is the DDS model able to bring added value to the learning outcomes of the programme?

(2) Can the DDS model support students equal participation in a collaborative learning process?

(3) Can the DDS model support information sharing between students during project work?

(4) Can the DDS model support knowledge construction?

(5) Can educational technology tools be used with the DDS model?

The DDS model was planned for use during the last three days of the e-learning module, during which participants worked on an individual or a collaborative e-learning project. The e-learning project phase included the following educational challenges, which were defined beforehand:

(1) The participants had a large variation in their e-learning competences, so there was a need to personalize
learning paths in the project work.

(2) The teacher was teaching Brazilian groups for the first time and was unsure if she understood the current state
and challenges of e-learning in all the vocational institutions from which the students had arrived.

(3) The students came from different fields of education and different parts of the country. Due to their heterogeneous backgrounds, their specific competences and needs differed; thus, there was a need for personalized project work.

Utilization of the DDS Model in Practice


The e-learning module of the programme ran for eight hours each day over nine days. The students had studied e-learning, m-learning, and the use of educational technology in a six-day contact phase; this was followed by the DDS pilot phase during the last three days of the module. During those three days, students worked on a project in which they utilized the experience gained at the beginning of the module, considered and applied new viewpoints from the brief lectures provided, and personalized the projects to their own and their institutions' needs and challenges. Figure 4 presents the activities in the DDS pilot phase.

Figure 4. Activities during the experimental study with the DDS model

The objective of the three days was to create a shared understanding about the subject through collaboration between
the students and the teacher. The students were seen as equal partners with the teacher in the construction of new
knowledge.

The first day of the experimental study started with an introduction to the DDS model, including the main principles
of the DDS model and the description of an idea card and the MCC model as guided dialogue tools. The objectives
and activities for the entire day were introduced, and it was shown how the purpose of the project work, a brief lecture, and the students' previous knowledge would serve as a base for the idea card process later that same day. This was
followed by a description of the project work. The objectives and requirements of the project work were defined,
and a brief lecture was given about the topic to offer new viewpoints for the project work. The students worked most
of the day with their individual or group projects, which were seen as an important foundation for the reasoning
needed in the DDS model. The concept was that the students would use both the lecture and the project work in their decision making about the validity of the theses and in reasoning the final decision in the idea card. This would provide a basis for the DDS process by applying reasons and objective evidence in the idea cards later that same day. Project work had an important role in the DDS model because
students knew more about the current situation of e-learning and project-based learning in Brazil than did the teacher
and knew what was feasible in their home country and institution. At the end of the day, students were asked to fill
in the idea cards. In this phase the idea card and its fields were explained again, and the students confirmed that they
had understood the meanings of the fields and the logical thinking behind the reasoning. The idea card form was
created with a Google Form. In addition to the idea card, students were asked to share their Google Docs file, which
they had written during the day. Figure 5 illustrates both the form and responses of the idea card.

Figure 5: The idea card form and responses, created with Google Form

The teacher graded the idea cards before the second day, but in contrast to the unrevised DDS model, the teacher
aimed to include all the information provided, not just the best ones. The result was the MCC, which was
constructed through the information received from the idea card responses. At the beginning of the second day, the
DDS model was introduced again with the help of the operational videos produced by Prof. Yukari Makino, the
developer of the DDS model (Makino & Leppisaari 2014). The objective of watching the videos was to remind
students about the logical rules in reasoning and the process of the DDS model. The operational videos were
originally made to explain how to manage the DDS in practice and are aimed at university teachers and students to
help them understand the sophisticated system and the work process for the staff team. Because students had taken
the roles of both staff students and teacher, watching the videos was an important component. Following the videos,
the teacher presented the constructed MCC (see Fig. 1). A discussion between the students then followed. There

were many conversations about the right wing of the cross (general and specific). The current conditions in different
Brazilian states and institutions vary so there was a conversation about the challenges and opportunities in different
parts of Brazil, different kinds of solutions in different fields of education, and different solutions depending on a student's experience with the subject. Because of poor logical reasoning or ambiguity in the students' English translations, some of the reasons and objective evidence presented in the idea cards were clarified and improved during the discussion. In one group there was a very long discussion on a suggested opinion, and a student suggested a modification so that it would be stated less strongly. Another student stated that the opinion suggested by
the teacher provided a truthful and comprehensive picture about the situation in Brazil as a whole, not just in a
specific institution. The conversation was lively, and the MCC was used to help create a wider picture about the
subject rather than focus on an individual project. The need to provide objective evidence for reasons forced
participants to use proper reasoning in argumentation; simply presenting an opinion without evidence was not
acceptable. This process resulted in an opinion about the suggested thesis and the subject in discussion.

To summarize, the discussion through the MCC model created broader knowledge about the subject than a single
project and a brief lecture could have provided alone. Usually the project work is presented during the last day, but
in this case the collaborative knowledge via the MCC model was available for project work

during the second day. The use of the idea card and the MCC model improved learning outcomes by forcing the use
of proper reasoning and not allowing opinions to be presented without justified evidence. The DDS facilitated equal
participation for all students through the idea card form and enabled soft argumentation. In practice, the teacher
and the students were equal partners because their combined expertise was needed to construct the new knowledge
to be utilized in Brazil.

Differences between the Original and Pilot Utilization of the DDS Model

The main differences between the original and pilot utilization of the DDS model are compared in Tab. 1.

Table 1: Main differences in the utilization of the original and pilot DDS model

In Japan, the DDS model is used in mass lectures of university courses attended by Japanese first-year students. In the present study, it was used in two exported educational programmes in which 13 or 14 Brazilian teachers were taught by a Finnish teacher, and the teaching and learning approaches were different. In the original model, the approach is learner-centred and teacher-oriented, but in this pilot study it was completely learner-centred.

Despite these differences, the reasons for using the model in the learning process are similar. In the original model
and the pilot study, the main reasons were to support information sharing and idea development for knowledge
construction; however, in the pilot study, the aim was also to support information sharing in the middle of the
project and enable collaborative knowledge construction. The original model incorporated three types of actors: students, staff students, and a professor, while the pilot had only students and a teacher, which resulted in different
actor roles. The main difference related to the tools was the idea card, which was a paper form in the original model
but an electronic form in the pilot study.

The original DDS process is much longer than that of the pilot study because the students and the context are
different. In the pilot study, watching the operation videos substituted for the six-step training of cross-construction, which served as intensive training. This was possible because the students were Brazilian teachers who had either a master's or a doctoral degree, had work experience, and were more motivated than first-year university students. Also, the
Brazilian students had already studied and lived together for over two months, so the learning community was
established, making it easier to express opinions and collaborate.

The DDS procedures include six steps that construct a cross of the MCC (see section 2). These were constructed in
the pilot as follows:

(1) The teacher presents a thesis that represents the content of a brief lecture and project work.

(2) The students judge whether the thesis is true or false. They describe their reasons and objective evidence with
the idea card. The teacher acts as their mentor.

(3) The teacher grades the students' idea cards and organizes and integrates all of them according to MCC
grammar.

(4) The teacher presents the students with a suggestion for the MCC. She/he also explains any ambiguity and illogical
reasoning noticed in the idea cards.

(5) Students correct the ambiguity and illogical reasoning and create a common understanding about the subject.
They evaluate the presented MCC and suggest changes to the MCC.

(6) The teacher summarizes the discussion between students, suggests a new MCC, and connects it to the context of
the subject.

Step 1 was slightly different compared to the original step. The presented theses represented both the brief lecture
and the students' project work because the theses were based on a combination of the teacher's and the students' knowledge, not just that of the teacher. The project work was based on e-learning knowledge gained at the beginning
of the module (six days) and on knowledge about project-based learning gained during the previous autumn in the
programme. In addition, the students used their expertise about their institutions' needs and challenges to tailor the
project to their institution. The objective of the brief lecture was to provide the participants with new ideas and
viewpoints for the project work. The only difference in step 2 was that the teacher acted as a mentor instead of staff
students. In practice, the students judged the validity of the thesis that was drawn from the presupposition of the
brief lecture and the project work (issue). They reasoned their positions with objective evidence (general and
specific). In step 3, it was the teacher who graded the students' idea cards, but the main difference compared to the
original was that all the cards were used to construct the MCC, not just the best ones. The group size was so small
that it was possible to use all the cards. There were cards that contained reasons and objective evidence, but the
meaning in English was unclear. The students were not native English speakers, so language was a challenge. Since
the idea was to support equal participation in the DDS process, hearing the voices of those who could not clearly express their reasons in a written form was important. Thus, the teacher placed the students' ideas in dialectic order
(antithesis and synthesis) and offered a suggestion for an opinion as the outcome of the dialogue in the feedback
presentation. In step 5, students corrected the ambiguity and illogical reasoning, evaluated the MCC, and suggested changes to the MCC and the proposed opinion. Step 6 included a summary of the discussion in step 5
and connected it to the subject context.

Findings
The main research question in the study was: Is it possible to tailor the DDS model to be used in teaching Brazilian
teachers in an exported educational programme? Although it was possible to tailor the DDS, changes were required
in the procedures and processes of the DDS model because the target group and the context shifted. The added value that the DDS model brought to students was broader knowledge about the subject than a single project and a brief lecture alone could have provided. The use of the modified DDS model supported students' equal participation
through the idea card and the MCC model. Because all the idea cards were used to construct the MCC, all students
whose reasoning and objective evidence needed improvement had their voices heard. Common practice is to present
project work during the last day of a course/module, but in the tailored DDS model, the knowledge, which was
created in collaboration via the MCC model, was available for project work during the second day, thus supporting
knowledge sharing between students while working on a project. Of note in the DDS procedures was step 5 in which
students re-evaluated the presented MCC and suggested changes. The discussion between students was
comprehensive and created a common understanding about the subject. Based on the findings, the idea card and
MCC message construction crosses are able to serve as a guided dialogue tool for project work, but discussions
between students and teacher are also needed. Educational technology tools can be used in idea cards and message construction crosses either in a computer classroom or through the bring-your-own-device (BYOD) concept.
Some educational tools, e.g. Google tools, are easily used with the DDS model.

Conclusions
Despite the differences between the original and the pilot study, the most significant result of the study is that the
DDS model can be successfully implemented in a new context. The DDS model supported equal opportunities for
participating in discussions on a subject, regardless of English skills, because opinions were expressed using idea
cards and reasons and objective evidence could be changed at a later time.

The DDS model enabled knowledge creation between students and a teacher during project work and resulted in a
wider understanding of the subject than what an individual project would have provided. Usually projects are
presented at the end of a course or module, but use of the DDS model enables sharing of ideas and knowledge
construction earlier and facilitates using the results of the process during the project work.

It is possible to use educational tools, such as Google tools, in the DDS model. With Google Forms
(https://forms.google.com/), it was easy for a teacher to create an electronic idea card form to be filled in by
students. Also the responses of an idea card in a Google Form can easily be used to construct an MCC. However, the
use of educational technology in the DDS model is not limited to Google tools; other social media tools, which
provide quiz features, could be used in the same way. The requirements are that the tool does not require a correct answer to be defined and that the responses are easily accessible.
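As a rough illustration of this workflow, the sketch below shows how idea-card responses exported from a form tool could be grouped by thesis judgement to help a teacher assemble an MCC. The CSV layout, column names, and sample responses are hypothetical, not taken from the study.

```python
import csv
import io
from collections import defaultdict

# Hypothetical export of idea-card responses; the real column names
# depend entirely on how the form is set up.
RESPONSES_CSV = """student,judgement,reason,evidence
Ana,true,Projects fit vocational courses,Pilot course completion rose
Bruno,false,Infrastructure is uneven,Two campuses lack reliable internet
Clara,true,Teachers are motivated,All staff finished the e-learning module
"""

def group_by_judgement(csv_text):
    """Group reasons and objective evidence by thesis judgement,
    mirroring the teacher's manual step of organizing idea cards."""
    groups = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        groups[row["judgement"]].append((row["reason"], row["evidence"]))
    return dict(groups)

mcc_input = group_by_judgement(RESPONSES_CSV)
print(len(mcc_input["true"]))   # number of cards judging the thesis true
print(len(mcc_input["false"]))  # number of cards judging it false
```

Any tool that exports responses in a tabular form could feed such a script; this is the practical meaning of the requirement that responses be easily accessible.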

Future Work
Although the findings of the study are positive and encouraging, additional experimental studies on the tailored
procedures and processes are needed. In particular, a longer pilot study during which students construct the MCC
model by themselves would be useful.

References
Aarreniemi-Jokipelto, P. (2014). Future learning and prior learning assessment and recognition in vocational teacher
education. Proceedings of International Teacher Education Conference in Dubai. (pp.532-537). Retrieved
10.12.2014 from http://ite-c.net/publications/itec2014.pdf
Brown, A. (1992). Design Experiments: Theoretical and Methodological Challenges in Creating Complex
Interventions in Classroom Settings. The Journal of the Learning Sciences. 2(2), 141-178.
European Commission. (2010). EUROPE 2020. A strategy for smart, sustainable and inclusive growth. COM 2020.
Retrieved 10.12.2014 from
http://ec.europa.eu/eu2020/pdf/COMPLET%20EN%20BARROSO%20%20%20007%20-%20Europe%202020%20-%20EN%20version.pdf
Järvinen, P. (2001). On research methods. Opinpajan kirjat, Tampereen yliopistopaino Oy. Tampere.
Kasanen, E., Lukka, K., Siitonen, A. (1993). The constructive approach in management accounting research, Journal
of management accounting research, 5, 243-264.
Makino, Y. (2005). Dynamic model of argument construction: Representation of diversity in logic with RGB colour
model. Kansai University Journal of Informatics, 22, 143-175.
Makino, Y. (2007). The third generation of e-learning: expansive learning mediated by a weblog. International
Journal of Web Based Communities, 3(1), 16-31.
Makino, Y. (2008). Design of Giron (Multi-logue): Curriculum Connecting Message and Media. Tokyo: Hituzi
Syobo.
Makino, Y. (2009). Logical-narrative thinking revealed: The message construction cross. The International Journal
of Learning, 16(2), 143-153.
Makino, Y. (2010). Development of soft argumentation workshop for citizenship education. Japan Association for
Communication, Information, & Society, 6(2), 16-25.
Makino, Y. & Leppisaari, I. (2014). Dialogue Design System in a Mass Lecture Class: Bridging the Cultural Gaps in
Pedagogy through Operation Videos. In Proceedings of World Conference on Educational Media and
Technology 2014 (pp. 1361-1370). Association for the Advancement of Computing in Education (AACE).
NMC Horizon report 2014 (2014). Higher Education Edition. Retrieved 10.12.2014 from
http://cdn.nmc.org/media/2014-nmc-horizon-report-he-EN-SC.pdf .

Redecker, C., Leis, M., Leendertse, M., Punie, Y., Gijsbers, G., Kirschner, P., Stoyanov, S., & Hoogveld, B. (2011). The Future of Learning: Preparing for Change. European Commission, Joint Research Centre (JRC) Scientific and Technical Report. EUR 24960 EN. Retrieved 10.12.2014 from http://ftp.jrc.es/EURdoc/JRC66836.pdf

PART 3 GAMES AND SIMULATIONS

12. Investigating the Development of TPACK Knowledge through
Gamification

Candace Figg
Brock University
Canada
cfigg@brocku.ca

Kamini Jaipal-Jamani
Brock University
Canada
kjaipal@brocku.ca

Introduction
The use of game elements in real world contexts is not a new phenomenon. Reward points for purchases,
accumulating frequent flyer rewards, or receiving discount coupons for customer loyalty to specific stores are reward
systems familiar to us. In educational settings, teachers reward students with gold stars to indicate student progress
on a classroom poster, or coloring squares on a chart to show completion of tasks. However, the more formal act of
gamifying instruction, or applying game mechanics to instructional tasks, is a fairly recent phenomenon (Cronk,
2012; Glover, 2013). Corcoran (2010) suggests that gamification "uses the oldest tricks in the book: providing instantaneous feedback, egging on the competition, and rewarding even tiny steps of progress. Gamification assumes that the player isn't especially motivated, at least at the beginning, and then provides barrels of incentives to ramp up that motivation" (para. 11). The use of reward systems is only one of the mechanics of game play that can be used to gamify instruction, or apply "game design thinking to non-game applications to make them more fun and engaging" (Gamification Wiki, n.d.). Educause (2011) suggests that gamification is "the application of game elements in non-gaming situations, often to motivate or influence behavior" (p. 1). Jacobs (2013) adds that gamification "is less about addition (adding pre-existing mechanics into the existing environment) and more about creation (developing a new environment from the combination of mechanics and the existing environment)" (p. 3).
Although gamification (or what we refer to as gamified learning in educational contexts) is a pedagogical strategy
instructors are beginning to use in higher education courses (Cronk, 2012; Johnson, 2012; Sheldon, 2012), most
documented studies are from business, computer science, and humanities disciplines, or K-12 schools. The use of
this pedagogical strategy with pre-service teachers has seldom been documented. This paper reports on a study that
explored how the application of game elements to course materials in a technology methods course influenced pre-service teachers' learning of TPACK knowledge (Mishra & Koehler, 2006) and their understanding of how to teach
with technology. The findings contribute to the field by providing insights on how gamification was applied to
instruction and learning in Teacher Education.

Gamification
Exporting the good aspects of video games to non-gaming educative contexts (Dominquez et al., 2013) is commonly called gamification, a recent trend in education that may be useful for enhancing motivation and engagement in digital learners (Gee, 2008; Lee & Hammer, 2011; McGonigal, 2010). As well, gamification provides
a connected learning experience by combining academic or formal learning (the content of the course) with informal
learning (choice to explore as much content, or as little, as desired) through tapping into the social preferences
(individual or collaborative) of digital learners.
A formal definition of gamification has been presented by the NMC Horizon Report: 2013 Higher Education Edition, which defines gamification as "the integration of game elements, mechanics, and frameworks into non-game situations and scenarios" (Johnson et al., 2013, p. 20). Further, gamification "attempts to harness the motivational power of games and apply it to real world problems, such as, in our case, the motivational problems of schools" (Lee & Hammer, 2011, p. 1). Sweeney (2013) extends this notion of learning through gamification as the
use of game mechanics to gamify content to engage and entice users by encouraging and rewarding use ("What is gamification?" section, para. 1). According to Sweeney (2013), successful gamification in e-learning is characterized by: a focus on the learner, not games; a focus on learner behavior, not knowledge; applying motivational techniques of games to learning that is applicable to life; and communication of goals and quick feedback of progress ("Does gamification in learning work?" section, para. 6). Lee and Hammer (2011) assert that educational gamification
proposes "the use of game-like rule systems, player experiences and cultural roles to shape learners' behaviour" (Lee & Hammer, 2011, p. 3).
The application of game mechanics can also result in negative or unsuccessful learning experiences (Lee &
Hammer, 2011). For example, gamifying certain instructional topics can teach students that learning depends on
external rewards, or learning can become perceived by players as too similar to school rules and expectations. Stott
and Neustaedter (2013) caution that simplifying the complexity of game dynamics to a few game mechanics could
not only result in less engagement of students, but also lead to an alienation of interest. Therefore, as Lee and
Hammer point out, we must carefully design gamification projects that address real challenges of schools (2011, p. 4).
Kapp (2012) and Stott and Neustaedter (2013) advise that gamification is more successful when specific game
design elements are included, and the gamified learning environment allows participants the freedom to fail without penalty, sequences learning events so that the participants' attention and interest are maintained, includes a narrative
or story that makes sense for the context and content being presented, and provides constant and frequent feedback
through rewards, levels, and completion of tasks to a goal. Daley (2012) recommended the following set of game
mechanics be used to gamify educational content to ensure game design elements are included:

Setting up tasks, quests, or challenges (tasks can always be repeated, so no penalty for failure),
Creating a narrative around the tasks through a story or knowledge map (creates a context for the
content being presented),
Earning experience points (immediate feedback),
Rewarding achievement with badges and levels (feedback, recognition of achievements, and record of
progress), and
Promoting competition through leaderboards (recognition of achievements).
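A minimal data model for these mechanics can make the list above concrete. The sketch below is purely illustrative: the class names, point values, and badge names are invented for this example and are not taken from any of the cited courses.

```python
from dataclasses import dataclass, field

@dataclass
class Player:
    """A course participant accumulating points and badges."""
    name: str
    points: int = 0
    badges: set = field(default_factory=set)

def complete_task(player, task_points, badge=None):
    """Award experience points (immediate feedback) and, optionally, a
    badge (record of progress). Tasks can be repeated without penalty,
    so repeated calls simply add more points."""
    player.points += task_points
    if badge:
        player.badges.add(badge)

def leaderboard(players):
    """Promote competition by ranking players on earned points."""
    return sorted(players, key=lambda p: p.points, reverse=True)

alice = Player("Alice")
bob = Player("Bob")
complete_task(alice, 50, badge="Digital Citizenship")
complete_task(bob, 30)
complete_task(bob, 30)  # repeating a task still earns points
print([p.name for p in leaderboard([alice, bob])])  # Bob leads with 60 points
```

Even this small model shows how the mechanics interlock: tasks feed points, points feed the leaderboard, and badges persist as a record of achievement.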

Game mechanics have been applied to some courses in higher education and studied for their impact on student participation and learning. Cronk (2012) gamified an undergraduate management information course using
the game mechanics of achievement badges and levels, leaderboards, a progress bar, reward systems, and challenges
or competitions between users. The tasks and rewards were structured to encourage student participation in preparing
for, and participating in, class discussions. Although the participant sample was small, findings suggested that
participants were receptive to small motivators such as the badges and rewards, and that there was an increase in
student engagement in class discussions as a result of the gamified learning experience. In order to encourage
students to read materials and respond in an online forum, Johnson (2012) gamified an undergraduate business
course, Social Innovation Media, using the game mechanics of quest points, badges, levels, and a leaderboard.
Participants in this study explained that the gamified learning environment provided them with an incentive to
complete projects and class postings/replies in a timely manner, and influenced course performance and the amount
of time they put into the actual learning for the course. Sheldon (2012) piloted gamification of an undergraduate
course, called Multiplayer Game Design, by assigning experience points rather than grades to tasks, challenges, and
quests, which were displayed on leader boards. Additionally, Sheldon incorporated a storyline by having students

defeat monsters (take quizzes and tests), complete quests (class presentations), and craft solutions (writing papers)
as they learned game theory and how to design video games in the course. Findings indicated that students valued
the gamified learning environment because they learned about how to design quality video games while immersed in
the simulated world created through the game mechanics of story, tasks/quests/missions, experience points, and
leaderboard.

Theoretical Framework
The TPACK model (Mishra & Koehler, 2006) highlights the different types of knowledge that arise from
the interaction of technology, pedagogy and content during instruction. The model suggests that knowledge of the
individual constructs of the model, as well as knowledge of the interactions between the three constructs, is
important for effective technology enhanced teaching. The authors (Figg & Jaipal, 2009; Figg & Jaipal, 2011; Figg
& Jaipal, 2012; Jaipal & Figg, 2010a) conducted longitudinal studies with pre-service and in-service teachers in the
field to illuminate what these theoretical constructs might look like in teaching practice. These findings led to the
development of a framework of characteristics and actions of TPACK knowledge as reflected in practice, which was
called The Framework of TPACK-in-Practice. Findings showed three components of TPACK (specifically
technological pedagogical content knowledgeTPCK, technology content knowledgeTCK, and technological
pedagogical knowledgeTPK) could be clearly recognized through specific teacher actions that promoted effective
technology-enhanced teaching. Since these actions were explicit examples of what teachers do, or use, prior to and
during the teaching of a technology-enhanced lesson, the three constructs were referred to as TPCK-in-Practice,
TCK-in-Practice and TPK-in-Practice. TPCK-in-Practice is knowledge about how to design technology-enhanced
instructional experiences for different models of teaching (e.g., problem based, direct instruction) to meet content
learning goals; TCK-in-Practice is knowledge about content-appropriate technologies (tools of a discipline and
ability to repurpose tools appropriately across disciplines), and includes teachers' ability to use the tool (personal
attitudes, skills, and comfort level with these technologies); and TPK-in-Practice is knowledge of practical teaching
competencies (e.g., classroom management, differentiation strategies, and assessment) to plan and implement
technology enhanced lessons. Tables 1, 2, 3, and 4 provide details and samples of teachers' actions in practice
representative of TPCK, TCK, and TPK (Figg & Jaipal, 2012; Jaipal & Figg, 2010a; Jaipal-Jamani & Figg, 2015).
For example, the teacher action of building technical skills in increments through the teaching of subject matter
content was categorized as sequencing, and classified as TPK because it involved knowledge and skills about lesson
implementation. In a similar way, a number of characteristics emerged that were representative of TPCK, TCK, and
TPK.

Table 1
Characteristics and Actions of TPCK-in-Practice (Jaipal-Jamani & Figg, 2015)

Characteristic: Repertoire of Technology-enhanced Activity Types Representing Content Knowledge
Teacher actions: Analyze structure of various technology-enhanced activity types to meet diverse student learning needs; select most effective technology-enhanced activity type to meet diverse student learning needs.

Characteristic: Knowledge of Content-Based Models of Teaching Appropriate for Technology-enhanced Activity Types
Teacher actions: Analyze type of knowledge to be learned; select appropriate Model of Teaching for technology-enhanced instruction.

Table 2
Characteristics and Actions of TCK-in-Practice (Jaipal-Jamani & Figg, 2015)

Characteristic: Knowledge of Content-Appropriate Technologies
Teacher actions: Matching discipline-specific tools to content; repurposing tools of other disciplines to match content.

Characteristic: Competence with Content-Appropriate Technologies
Teacher actions: Identifying technical skills needed for discipline-based tool use; identifying personal skill levels of tool use.

Table 3
Characteristics and Actions of TPK-in-Practice: Planning (Jaipal-Jamani & Figg, 2015)

Characteristic: Assessment
Teacher actions: Match assessments to technology enhanced learning activities; create assessment instruments using technology; use technology to conduct assessments.

Characteristic: Activity Choices
Teacher actions: Select activities based on subject matter learning outcomes/goals; incorporate a variety of tech activities; refine activities through collaborative review.

Characteristic: Sequencing
Teacher actions: Build technology and content skills within lesson and unit; develop technical skills in increments through content activities.

Characteristic: Differentiation for Technical Competence
Teacher actions: Introduce few technical skills in a lesson; chunk technical skills into simple procedures; adapt lesson or online activities for students; create specific learning objects for students (handouts, newsletters, videos, infographics, simulations, etc.); use technology enhanced activities with multiple modes.

Characteristic: Backup Instruction
Teacher actions: Plan alternate lesson activities; plan for alternate technologies.

Table 4
Characteristics and Actions of TPK-in-Practice: Preparation and TPK-in-Practice: Implementation (Jaipal-Jamani & Figg, 2015)

TPK-in-Practice: Preparation

Characteristic: Technology Practice
Teacher actions: Practice with technology tools in instructional settings; obtain peer feedback.

Characteristic: Digital Classroom Resources for Teacher and Student Use
Teacher actions: Collect online resources in a linklist or Diigo site.

TPK-in-Practice: Implementation

Characteristic: Modeling Technology Use To and For Students
Teacher actions: Model best practices for technology tool use; model generic functions across applications; use teacher-created exemplars; have students model technical skills.

Characteristic: Classroom Management
Teacher actions: Use grouping techniques to support technical skill and content development; use appropriate demonstration techniques in technology enhanced lessons; use techniques for engaging students with technology during lessons.

The above characteristics and actions were used to analyze the data in this study for evidence of development of
TPACK knowledge.

Context: Gamification of a Technology Methods Course


Prior to gamifying the technology methods course in the Teacher Education Program, a textbook was used
to present background information that all teachers should understand about teaching with technology. Chapters
were assigned each week, with students completing different types of digital quizzes to test understanding of the
material that was necessary to support in-class activities. Some topics forming the foundational background of TPACK
knowledge were cybersafety, copyright issues, evaluation of web sites, types of technology-enhanced activities that
could be used in lesson design, and different models of teaching that work well with technology-enhanced activities

(e.g., project-based learning, problem-based and inquiry learning, Internet research activities, and digital
storytelling). Lack of TPACK knowledge was obvious when students did not apply knowledge learned to complete
projects. For example, one of the textbook chapters focused on how teachers prepare for lesson implementation by
collecting and developing classroom resources for the lesson, which supports the development of Technological
Pedagogical Knowledge (TPK) (Jaipal & Figg, 2010b). The textbook reading that supported this knowledge included
the topics of copyright, fair use, as well as how to select and cite digital images from the Internet. The in-class
activity was for students to apply their knowledge while constructing a PowerPoint slide incorporating digital
images. However, when evaluating the pre-service teachers' products, it was noted that images used in most of the products were copyrighted, missing citations and permissions, or improperly cited.
Gamification was chosen as the instructional strategy to address the problem of engaging pre-service
teachers with background information to deepen knowledge about teaching with technology. Wordpress, a free
blogging tool, was selected to create the online interface for the gamified learning activities. The textbook reading
materials for the Teaching with Technology Methods course were chunked according to topics (later to become
Badge Missions), enhanced with video or interactive activities, and structured as tasks in the online GAMIFIED
TPACK Teacher Quest (http://www.handy4class.com/tpack-teacher-game/). Grade points were assigned to each
task, and a badging system was developed to record completion of tasks. The Quest badge missions included topics
such as gamification, TPACK, digital citizenship, problem-based learning, project-based learning, multimedia,
digital storytelling, and Web 2.0 learning objects, but badges were also earned for creating a digital portfolio,
exhibiting professional conduct during in-class collaborative/cooperative activities, successful completion of weekly
quizzes, and attendance/getting to class on time.
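The task-points-badge structure described above can be sketched as a minimal record-keeping model. This is a hypothetical illustration only (the actual course used WordPress and an LMS, not this code); the class names, mission names, and point values are assumptions, not the course's real grading scheme:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class BadgeMission:
    """One chunked topic from the Quest, worth grade points and a badge."""
    name: str
    points: int

@dataclass
class LearnerRecord:
    """Tracks which Badge Missions a pre-service teacher has completed."""
    learner_id: str
    earned: set = field(default_factory=set)

    def complete(self, mission: BadgeMission) -> None:
        # Completing a task awards the badge for that mission.
        self.earned.add(mission)

    def total_points(self) -> int:
        # Grade points accumulate from every completed mission.
        return sum(m.points for m in self.earned)

# Hypothetical point values for two of the missions named in the course
tpack = BadgeMission("TPACK", 10)
citizenship = BadgeMission("digital citizenship", 5)

record = LearnerRecord("PT-01")
record.complete(tpack)
record.complete(citizenship)
print(record.total_points())  # 15
```

Making `BadgeMission` immutable (`frozen=True`) lets completed missions live in a set, so a duplicate completion cannot double-count points.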

Methods
Data were collected from 133 pre-service teachers enrolled in six different sections of the technology
methods course in which the gamified course was implemented. Pre-service teachers were mainly Caucasian,
middle-class, and in their fifth year of university. Five different instructors taught the different sections; four of the
five instructors had previously taught the course with the online textbook. Instructors met weekly to discuss course
progress and any issues or concerns. Data sources included email interviews with the instructors, researcher field
notes from weekly instructor meetings, student reflections, course artifacts created by pre-service teachers and
shared by instructors with permission, and a Socrative survey filled in by pre-service teachers about the design of the
gamified course and using the Quest for learning. The survey was developed to assess TPACK knowledge from the
characteristics shown in Tables 1, 2, 3, and 4 (Figg & Jaipal, 2011). The survey, which consisted of five sections and
a total of 40 questions (Likert scale and open-ended), was used to collect self-reported data from pre-service teachers about their TPACK knowledge and how it should be applied in teaching practice. The two authors independently read the transcribed data and survey data and conducted a content analysis (Creswell, 2012; Miles & Huberman, 1994). The content
analysis was guided by the TPACK-in-Practice Framework. Data were coded in terms of the actions and
characteristics associated with the components of TPACK knowledge (as indicated in Tables 1, 2, 3, and 4) to
provide evidence of pre-service teachers knowledge of TPACK, as illustrated in Table 5.

Table 5
Samples of evidence of pre-service teachers' knowledge of TPCK, TCK, and TPK.

Characteristic – Action – Evidence from Survey

TCK – Knowledge of Content-Appropriate Technologies
  Action: Matching Discipline-Specific Tools to Content
  Evidence: "I would modify tech-enhanced activities that involve researching or reading by using a program such as Word Q to allow students who may be visually impaired to still participate in the activity."

TPCK – Repertoire of Technology-enhanced Activity Types Representing Content Knowledge
  Action: Analyze structure of various technology-enhanced activity types to meet diverse student learning needs; Select most effective technology-enhanced activity types to meet diverse student learning needs
  Evidence: "I had one student who struggled with using Microsoft Word to type out his work because he struggled with his spelling . . . I provided him with a program known as Dragon that gives different possibilities based on the first few letters. And, when he clicks on a word, reads it to him so that he knows which one to choose."

TPK – Classroom Management
  Action: Use grouping techniques to support technical skill and content development
  Evidence: "I would also pair students up with another student who may need assistance such as moving the mouse or typing while doing a tech-enhanced activity." "I would pair students who are unfamiliar with technology with students who are familiar with technology so they can support each other."

TPK – Differentiation for Technical Competence
  Action: Chunk technical skills into simple procedures
  Evidence: "[I would provide] smaller chunks of instruction so that fewer skills are practiced during the lesson." "I would modify or adapt my tech-enhanced activities for diverse students by giving them fewer steps in the process, as well as give them visually written instructions."
  Action: Create specific learning objects for students (handouts; newsletter; videos; infographics; simulations, etc.)
  Evidence: "I would provide students with a handout listing all the steps they will need to complete the tech-enhanced activity so they may refer to it during the lesson."

TPK – Modeling Technology Use To and For Students
  Action: Use teacher-created exemplars
  Evidence: "I prepared full math lessons with animations on the slides to demonstrate the steps in equations and showed students various exemplars." "I showed my class how to create a table using a word processor." "I have used the SMART Board in many of my lessons. I have modeled how a student would answer the question on the SMART Board, then got volunteers to come up to the board and answer a question."

Findings
The findings from the study indicated that the gamified course influenced the development of TPACK
knowledge positively. Data from the survey revealed that prior to teaching a technology-enhanced lesson, the
majority of respondents (over 50%) practiced in the instructional setting where the lesson would take place (See
Table 4: TPK/Preparation: Technology Practice), collaborated with their associate about technology-enhanced
lessons prior to teaching (See Table 3: TPK/Planning: Activity Choice), created exemplars of digital products they
expected their students to produce (See Table 4: TPK/Implementation: Modeling Technology Use To and For
Students), established groups for students prior to the lesson implementation (See Table 3: TPK/Planning:
Classroom Management), and planned alternate lesson activities in case of technical difficulties (See Table 3:
TPK/Planning: Backup Instruction). As well, the majority of respondents indicated that they incorporated the
following during their technology-enhanced teaching: using a variety of subject-based digital tools (See Table 3:
TPK/Planning: Activity Choices), modeling appropriate technical skills and best practices for digital tool use for
students (See Table 4: TPK/Implementation: Modeling), and providing students with opportunities to use digital
technologies or model effective computer techniques or skills for peers (See Table 4: TPK/Implementation:
Modeling). Participants also reported that they were implementing technology-enhanced lessons (85% occasionally,
often, or always) and even sequencing two or more technology-enhanced activities in a single lesson (63%
occasionally, often, or always), indicative of comfort level with teaching with technology. However, the majority did
not recognize the value of preparing materials, such as a handout outlining skill steps (See Table 3: TPK/Planning:

Differentiation), to support any technical instruction that needed to take place during the lesson so that elementary
school students could participate successfully in the activity. Overall, the survey indicated that the majority of
respondents were able to describe the actions of technology-enhanced teaching and how to incorporate effective
TPACK-based practices in their teaching, indicative of TPACK knowledge.
Further analysis of the open-ended questions on the survey provided additional evidence of pre-service teachers' knowledge of TPCK, TCK, and TPK (refer to Table 5 for examples).
Additionally, instructors noted that pre-service teachers were able to apply knowledge learned through the
Quest to the follow-up, in-class activities that required them to articulate orally, and in writing, explicit examples of
TPACK knowledge. For example, in their class sessions, pre-service teachers were required to construct an ebook
journal presentation that recorded their reflections and learning about TPACK, including defining TPACK
knowledge, giving examples of what TPACK looked like in classroom practice, or explaining how technology tools support specific learning activities (TCK). One instructor noted that this was the first time in five years of teaching this specific activity to pre-service teachers that students were able to articulate an appropriate definition of TPACK without reteaching. Instructors noted further examples, drawn from group-work conversations, individually constructed digital portfolio reflections, and questionnaire comments submitted by pre-service teachers, in which pre-service teachers articulated TPACK knowledge in their own words. As well, the gamified learning structure of the Quest provided pre-service teachers with specific TPACK knowledge about how to incorporate game mechanics into
their own teaching practice. The course modeled an exemplar of a gamified instructional strategy in practice. From
how to use digital badges as achievement rewards to the task recording management system set up in the LMS, pre-
service teachers experienced how various technology tools could support the application of game mechanics to a
non-game situation. Pre-service teachers indicated that this experience allowed them to visualize ways to incorporate
game mechanics into their own teaching practice.

This sort of tech-enhanced learning developed my own knowledge, but also served as a great
example of how useful gamification can be in the classroom of today. Rewarding students in
badges as a form of grades can be an effective method for motivation and student learning (Pre-
service Teacher 17).
The use of gamification in the course relied on game mechanics such as completing tasks, meeting
obstacles, being rewarded with points and collecting badges. From this experience, I will
definitely consider gamifying my future classroom to increase student engagement and motivation
to complete classroom tasks (Pre-service Teacher 18).
In terms of gamification, I saw the importance of rewarding students in order to keep them
interested in lessons. The knowledge I gained through the game has allowed me to identify with
the way my students learn and provide them with rewards (i.e. badges, point systems, etc.) to
motivate them and provide them with validation to their learning (Pre-service Teacher 19).

In this study, gamified learning motivated pre-service teachers to engage with background information that
supported in-class learning activities; it provided a visual support for how to successfully master course material; and
it provided a safe venue for experimentation with unfamiliar teaching strategies and learning environments.

Discussion and Significance


In our case, the application of game mechanics resulted in a successful learning experience for the pre-service teachers. Consistent with the recommendations of Daley (2011), creating the online teacher quest by setting up tasks and challenges, providing game points for immediate feedback, and rewarding completion of tasks with badges motivated students to engage with the background material. This was evidenced in students' enhanced TPACK
knowledge demonstrated through in-class applications and reflection responses. A limitation of the game was that it
did not use a storyline as suggested by Kapp (2012) and Stott and Neustaedter (2013), which may have disengaged
some students.
In summary, the findings of the research study conducted on the gamified technology methods course
indicated that gamified learning promoted the development of TPACK knowledge. Pre-service teachers demonstrated their understanding of how content, pedagogy, and technology interact to promote learning of subject matter
content during the Quest. Their knowledge of specific actions of TPCK, TCK, and TPK (as identified in Tables 1-5)
was impacted and evidenced through oral and written articulation (in artifacts created, survey responses, and
reflection comments).

The blending of game mechanics with the learning of TPACK knowledge is unique in that teacher
candidates experienced a model of how gaming might be useful in classroom instruction, and could then make
connections to how they could apply this instructional strategy in their teaching practice. This study also makes a
unique contribution to the literature on gamification in Teacher Education as it is one of the few examples of
gamification in the field of Teacher Education.
A limitation of the findings from this study is that it reports on how gamification was used as an
instructional strategy in a specific context with a specific population of pre-service teachers; findings are therefore
not meant to be generalized to a larger pre-service teacher population. Nevertheless, the findings point to the
potential to use gamification as an instructional strategy for motivating higher education students to engage with and
learn course content routinely provided through textbooks and background materials, which is similar to the findings of
studies in other areas of higher education (Cronk, 2012; Johnson, 2012; Sheldon, 2012).

References
Corcoran, E. (2010). Gaming education. O'Reilly Radar. Retrieved October 12, 2013 from
http://radar.oreilly.com/2010/10/gaming-education.html
Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among five approaches. Thousand Oaks,
CA: Sage.
Cronk, M. (2012). Using gamification to increase student engagement and participation in class discussion. In T.
Amiel & B. Wilson (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and
Telecommunications 2012 (pp. 311-315). Chesapeake, VA: AACE. Retrieved October 12, 2013 from
http://www.editlib.org/p/40762.
Daley, H. (2011). Education levels up! MrDaley.com. Retrieved October 12, 2013 from
http://www.mrdaley.com/wordpress/2011/07/27/education-levels-up-a-newbs-guide-to-gamifying-your-
classroom/.
EDUCAUSE Learning Initiative. (2011). 7 Things You Should Know About Gamification. Retrieved October 12,
2013 from http://net.educause.edu/ir/library/pdf/ELI7075.pdf
Figg, C., & Jaipal, K. (2009). Unpacking TPACK: TPK characteristics supporting successful implementation. In I.
Gibson et al. (Ed.), Proceedings of the 20th International Conference of the Society for Information
Technology and Teacher Education (SITE) (pp. 4069-4073). Chesapeake, VA: Association for Advancement
of Computing in Education.
Figg, C. & Jaipal, K. (2011). Developing a survey from a taxonomy of characteristics for TK, TCK, and TPK to
assess teacher candidates' knowledge of teaching with technology. In Proceedings of Society for Information
Technology & Teacher Education International Conference 2011 (pp. 4330-4339). Chesapeake, VA: AACE.
Figg, C. & Jaipal, K. (2012). TPACK-in-Practice: Developing 21st century teacher knowledge. In P. Resta (Ed.),
Proceedings of Society for Information Technology & Teacher Education International Conference 2012 (pp.
4683-4689). Chesapeake, VA: AACE.
Gamification Wiki. (n.d.). Gamification. Retrieved October 12, 2013 from http://gamification.org/wiki/Gamification
Gee, J. P. (2008). Game-like learning: An example of situated learning and implications for opportunity to learn. In
Pullin, D. C., J. P. Gee, E. H. Haertel, & L. J. Young (Eds.), Assessment, equity, and opportunity to learn (pp.
200-221). Cambridge, UK: Cambridge University Press.
Glover, I., & Latif, F. (2013). Investigating perceptions and potential of open badges in formal higher education. In
J. Herrington et al. (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and
Telecommunications 2013 (pp. 1398-1402). Chesapeake, VA: AACE. Retrieved October 12, 2013 from
http://www.editlib.org/p/112141.
Jacobs, M. (2013). Gamification: Moving from addition to creation. Paper presented at The 31st Annual CHI
Conference on Human Factors in Computing Systems. Paris: ACM. Retrieved October 12, 2013 from
http://gamification-research.org/wp-content/uploads/2013/03/Jacobs.pdf
Jaipal, K., & Figg, C. (2010a). Expanding the practice-based taxonomy of characteristics of TPACK. In C. Crawford
et al. (Eds.), Proceedings of Society for Information Technology & Teacher Education International
Conference 2010 (pp. 3868-3875). Chesapeake, VA: AACE.
Jaipal, K., & Figg, C. (2010b). Unpacking the Total PACKage: Emergent TPACK characteristics from a study of
preservice teachers teaching with technology. Journal of Technology and Teacher Education, 18(3), 415-441.
Jaipal-Jamani, K. & Figg, C. (2015) A Framework for TPACK-in-Practice: Designing Technology Professional
Learning Contexts to Develop Teacher Technology Knowledge (TPACK). In Valanides, N. & C. Angeli (Eds.)
Exploring, developing, and assessing TPCK (pp. 137-164). New York: Springer Publications.

Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., and Ludgate, H. (2013). NMC Horizon
Report: 2013 Higher Education Edition. Austin, Texas: The New Media Consortium.
Johnson, S. L. (2012). Gamification of MIS3538: Social Media Innovation. Retrieved from
http://community.mis.temple.edu/stevenljohnson/2012/05/19/gamification-of-mis3538-social-media-
innovation/
Kapp, K. M. (2012). The gamification of learning and instruction: Game-based methods and strategies for training
and education. San Francisco, CA: John Wiley & Sons.
Lee, J. J., & Hammer, J. (2011). Gamification in education: What, how, why bother? Academic Exchange Quarterly,
15(2), 1-5.
McGonigal, J. (2010). Gaming can make a better world. Retrieved from
http://www.ted.com/talks/jane_mcgonigal_gaming_can_make_a_better_world.html
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Sage.
Mishra, P., & Koehler, M. (2006). Technological Pedagogical Content Knowledge: A framework for teacher
knowledge. Teachers College Record, 108(6), 1017-1054.
Sheldon, L. (2012). The multiplayer classroom: Designing coursework as a game. Boston, MA: Course Technology.
Stott, A., & Neustaedter, C. (2013). Analysis of Gamification in Education. Technical Report 2013-0422-01.
Connections Lab, Simon Fraser University. Surrey, BC, Canada.
Sweeney, S. (2013, August 16). Game on? The use of gamification in e-learning. TrainingZone. Retrieved from
http://www.trainingzone.co.uk/blogs-post/game-use-gamification-e-learning/185137

13. Gamification of an Information Security Management Course

John Vail
Florida State College at Jacksonville
United States
jvail@fscj.edu

Florida State College at Jacksonville offers a junior-level Information Security Management course.
Information Security Management is an in-depth course that prepares the student to perform the duties and functions
of an Information Security Manager (ISM). An ISM develops, maintains, and is responsible for access control
systems and methodologies; implements and monitors security; develops and maintains business continuity plans;
selects cryptographic processes; provides guidance on security issues; recommends best practices in security
architecture; and analyzes and evaluates telecommunications, network, and Internet security. Students
concentrate on researching, presenting, and developing skills related to these technologies.
Next Generation Learners (NxGL) are challenging the traditional face-to-face method of instruction (Gates
Foundation, 2010; Taylor, 2010). They expect the pedagogy to be developed around their individual and personal
goals, they need to understand why they need to learn specific content, and they need to see the benefits of learning
(Gates Foundation, 2010; Taylor, 2010). These digital natives are acting as disruptors to the traditional methods of
instruction (Bennett, Maton, and Kervin, 2008), and are not content to sit as a container waiting to be filled with
knowledge. In addition to the NxGL as catalyst, Rhode (2009) asserts that interaction, the principle of active
intercourse with either concepts or agents, has helped to redirect pedagogy from teacher-directed to learner-centered.
Groff (2013) has found that over the past several years, education has strategically and holistically
redesigned and realigned elements in order to follow best practices and identified needs in order to achieve
quantifiable outcomes. This is due to the realization that the traditional methods and pedagogies are outdated and no
longer meet the needs of learners (2013). One of the challenges facing education is the creation of curriculum that
contains meaningful and rich learning experiences. Fink (2005) describes learning experiences as rich when they allow learners to draw together multiple domains of significant learning collectively. Meaningful
and rich learning experiences encourage learner engagement and help to foster a love of learning (Parsons & Taylor,
2011).
Papert (1991) (as quoted in Peña-Ríos, Callaghan, Gardner, & Algaddad, 2012) established that life-long learning happens as learners perform active tasks that construct objects in the world related to their personal experiences and ideas. Piaget (1973) (as quoted in Xu, Huang, Wang, & Heales, 2013) believed that learners must
construct knowledge: knowledge is not supplied by a teacher. Constructivist theory puts the learner at the center and
in charge of the active process of attaining knowledge (Dabbagh & Kitsantas, 2011; Juarros, Ibáñez, & de Benito Crosetti, 2013; Peña-Ríos, Callaghan, Gardner, & Algaddad, 2012; Saadatmand & Kumpulainen, 2013). Problem-based learning (PBL) is a student-centric constructionist pedagogy where learning occurs as one outcome of the problem-solving exercise (Peña-Ríos, Callaghan, Gardner, & Algaddad, 2012). Gamification is one way to utilize PBL to engage learners, and subsequently increase learning gains (O'Donovan, Gain, & Marais, 2013).
CyberCIEGE is a game developed by the Naval Postgraduate School as a tool to provide real-world scenarios in which a learner must take action when confronted with a situation that has computer or network security ramifications (Thompson & Irvine, 2011). CyberCIEGE was designed to expand on the concepts taught in Information Security through hands-on interaction with a network simulation in which the learner makes decisions based on actions presented in the game (Thompson & Irvine, 2011). The game is intended to show learners Information Security concepts and the relationships between policies set by the employer, the security mechanisms that are in place, and the need for the business to remain productive (Thompson & Irvine, 2011). CyberCIEGE brings context to Information Security concepts by showing learners the ramifications of security techniques and mechanisms. For example, learners must configure a firewall (the concept) in one scenario and deal with end users who complain about loss of access to critical infrastructure, or with a loss of assets because an attacker got through a badly constructed access control list (Thompson & Irvine, 2011). The understanding provided by the scenarios assists learners in seeing how enterprise policies can be followed and implemented through logical protection mechanisms, physical security, and enforcement of procedures.

Statement of the Problem

At issue is that learners seem to find the Information Security Management curricula difficult due to the
largely theoretical nature of the content being taught and their relative lack of prior knowledge on the subject
(Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010). Active learning and PBL are two ways to engage students
and increase their comprehension and retention of concepts being taught. CyberCIEGE has the potential to engage
learners and provide deeper comprehension of Information Security Management concepts. Working through
scenarios can provide learners with active learning if they are self-aware enough to see that while they are
attempting to achieve the scenario objective, they are applying abstract security concepts. A study was conducted to determine whether active learning is actually taking place while learners play CyberCIEGE.

Purpose of the Study

The purpose of this mixed-method study was to determine if CyberCIEGE promoted engagement and
learning gains. Engagement was measured through post-course learner survey instruments. Learning gains were
measured by standardized assessments of learner understanding of Information Security knowledge domains. The
sample consisted of two sections of Bachelor's-level classes taught by the same instructor, and the classes compared were taught at a similar time (evening versus afternoon) during the same school year as a reasonable attempt to curtail confounding factors (Marsh & Overall, 1981). One section acted as the control and was comprised of 20
learners, while the other section acted as the treatment and was comprised of 15 learners. These samples have a
margin of error of .24 for a confidence level of 95%.

Research Question
To determine if gamification positively impacts learner engagement, learning gains, and knowledge synthesis, the following questions were explored.
Q1. Does the introduction of gamification increase learning gains?

Q2. To what degree does gamification engage learners?

Hypotheses

H10: There is no statistical relationship between gamification and learning gains.

H1a: There is a statistical relationship between gamification and learning gains.

H20: Learner engagement is not affirmed by gamification.

H2a: Learner engagement is affirmed by gamification.

Research Methodology

Quantitative and qualitative comparative studies were conducted utilizing exploratory research to
determine if any relationships exist between gamification and engagement, and gamification and learning gains. The
research was based on testing for statistical significance and measure of association. To address the research
questions, an exploratory comparative study was designed to observe and compare engagement and learning gains
of learners playing CyberCIEGE versus learners that did not play CyberCIEGE.

Measurement

The research questions posed relied on data regarding levels of engagement and learning gains.
Determination of the levels of learning gains was measured quantitatively through the testing of learners' knowledge and comprehension of Information Security knowledge domains. The underlying assumption was that the learners who played CyberCIEGE would have greater knowledge and comprehension gains than the learners who did not play

CyberCIEGE. The research questions posed also relied on data regarding learner engagement. Learner engagement
was measured qualitatively through a survey instrument completed by the learners at the end of the course.
The research was conducted at an institution of higher education, and participants were evaluated voluntarily, after informed consent, for learning gains and knowledge synthesis, and surveyed for attitudes towards engagement. In this study, no data points were rejected. Learners' identities were coded to identify individuals across the two variables being studied. The coding was done in such a way that data cannot be linked back to any specific individual except by the researcher. All identifying information will remain with the researcher, and the data will be encrypted both in storage and in transmission.

Results
To qualitatively measure student engagement due to CyberCIEGE, a survey instrument was administered online and made available to the learners. A Likert scale was employed that utilized five ordered response levels: strongly disagree, disagree, neutral, agree, and strongly agree. Five questions were asked of the learners: CyberCIEGE helped make this class more enjoyable; I felt more engaged with the course content because of the inclusion of CyberCIEGE; CyberCIEGE helped me to learn the material presented in the course; CyberCIEGE helped me to understand and retain the concepts taught in this course; and the difficulty in setting up and using CyberCIEGE detracted from the course. Of the 20 learners that finished the course, 15 (or 75% of the learners) completed the survey.

Figure 1. Responses to the question, I felt more engaged with the course content because of the
inclusion of CyberCIEGE.

As seen in Figure 1, in response to the question, I felt more engaged with the course content because of the inclusion of CyberCIEGE, 46.67% of the respondents agreed, while 6.67% strongly disagreed, 20% disagreed, and 26.67% were neutral. Respondents to this question had a mean score of 3.13 and a standard deviation of .957. It is interesting to note that 60% of the respondents agreed that the inclusion of CyberCIEGE made the class more enjoyable, as seen in Figure 2. This could also be seen as creating engagement.
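The reported mean and standard deviation for this question can be checked against the percentages. Assuming the 15 responses break down as 1 strongly disagree, 3 disagree, 4 neutral, and 7 agree, coded 1 through 5 (a reconstruction from the reported shares, not the raw data), the mean of 3.13 follows directly, and the reported .957 matches the population form of the standard deviation:

```python
from statistics import fmean, pstdev

# Likert coding: 1 = strongly disagree ... 5 = strongly agree.
# Counts reconstructed from the reported shares of 15 respondents:
# 6.67% -> 1 response, 20% -> 3, 26.67% -> 4, 46.67% -> 7
responses = [1] * 1 + [2] * 3 + [3] * 4 + [4] * 7

print(round(fmean(responses), 2))   # 3.13
print(round(pstdev(responses), 3))  # 0.957
```

Note that `pstdev` (the population standard deviation, dividing by n) reproduces .957; the sample standard deviation (dividing by n-1) would be slightly larger, about .990.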

Figure 2. Responses to the question, CyberCIEGE helped make this class more enjoyable.

To quantitatively measure learning gains due to CyberCIEGE, student assessments of the Information Security knowledge domains were compared. The control group had an average assessment score of 78% (standard deviation 21.62), while the treatment group had an average assessment score of over 83% (standard deviation 15.51). The average assessment score for the class utilizing CyberCIEGE was higher than that of the control class (t=2.092, df=242, p<.025).
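The comparison above is a two-sample t-test. As a sketch of the method only (the reported t = 2.092 and df = 242 come from the study's own data, which are not reproduced here), Welch's t-test, which does not assume equal group variances, can be computed from two score lists as follows; the score lists shown are hypothetical:

```python
from math import sqrt
from statistics import fmean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and Welch-Satterthwaite degrees of freedom."""
    # Per-group variance of the mean (sample variance over group size)
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (fmean(a) - fmean(b)) / sqrt(va + vb)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Identical (hypothetical) score lists give t = 0 by construction.
t, df = welch_t([70, 80, 90], [70, 80, 90])
print(round(t, 3), round(df, 1))  # 0.0 4.0
```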
Summary
Engaged learning happens when learners are actively involved (Rhode, 2009). Wanstreet (2006) (as quoted
in Rhode, 2009) states that at the fundamental level, interaction is engagement in learning. A culture of learning
must be encouraged whereby teachers and learners interact in such a way that the lines that divide teachers from
learners begin to blur and all grow intellectually (Parsons & Taylor, 2011). Achievement is to be secondary to
learning and engagement (2011).
Groff (2013) has found that the traditional educational methods and pedagogies are outdated and are no
longer meeting the needs of learners, and that education has begun to strategically and holistically redesign and
realign pedagogy to follow best practices and identified needs in order to be effective. In order for learners to
become engaged, their learning experiences need to be meaningful and rich.
Historically, Information Security Management at Florida State College at Jacksonville has been taught
mostly as a theory-based curriculum with reasonable success. The active learning component had occurred at
the end of the course as students, in groups, participated in a design project, assuming the roles of a senior
management team to design a secure computer/network infrastructure and then present their recommendations,
design, policy, standards, and guidelines to the class as if they were presenting to the CEO of the target corporation.
Studies were conducted that integrated gamification into the Information Security Management curriculum in an
attempt to help students access and master the skills necessary to be successful information security managers in
work settings. H10 was rejected, since the group utilizing CyberCIEGE scored significantly higher on the
assessment of their knowledge of concepts and practices in the Information Security knowledge domains. It can
therefore be concluded that gamification can lead to learning gains.
Although the data were self-reported, the study found that students considered the class more enjoyable, and at
least 46% felt engaged due to the introduction of gamification. Quantitative measurements could be taken in the
future by recording the amount of time each learner spends in a scenario versus the time learners not utilizing
CyberCIEGE spend on non-reading activities related to Information Security concepts (it is assumed that both
control and treatment groups will be reading the text), and then measuring the degree of association between
the two variables and the covariation of the data. Although this is a preliminary study, it suggests that if
gamification is included in course curricula, learners are likely to become engaged, leading to persistence
and advanced learning outcomes (Fink, 2005; Groff, 2013; Rhode, 2009).

References
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven
research-based principles for smart teaching. John Wiley & Sons.
Bennett, S., Maton, K., & Kervin, L. (2008). The 'digital natives' debate: A critical review of the evidence. British
Journal of Educational Technology, 39(5), 775-786. doi: 10.1111/j.1467-8535.2007.00793.x
Bill & Melinda Gates Foundation. (2010). Next generation learning: The intelligent use of technology to develop
innovative learning models and personalized educational pathways. Seattle: Bill & Melinda Gates
Foundation. Retrieved from https://docs.gatesfoundation.org/Documents/nextgenlearning.pdf
Dabbagh, N., & Kitsantas, A. (2011). Personal learning environments, social media, and self-regulated learning: A
natural formula for connecting formal and informal learning. Internet and Higher Education, 15(1), 3-8.
doi: 10.1016/j.iheduc.2011.06.002
Fink, L. (2005). A self-directed guide to designing courses for significant learning. Retrieved from
http://www.deefinkandassociates.com/GuidetoCourseDesignAug05.pdf
Glover, I. (2013). Play As You Learn: Gamification as a Technique for Motivating Learners. In Jan Herrington et al.
(Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and
Telecommunications 2013 (pp. 1999-2008). Chesapeake, VA: AACE. Retrieved from
http://www.editlib.org/p/112246
Groff, J. (2013). Technology-rich innovative learning environments. Paris: The Organization for Economic Co-
operation and Development. Retrieved from http://www.oecd.org/edu/ceri/Technology-
Rich%20Innovative%20Learning%20Environments%20by%20Jennifer%20Groff.pdf
Juarros, V., Ibáñez, J., & de Benito Crosetti, B. (2014). Research results of two personal learning environments
experiments in a higher education institution. Interactive Learning Environments, 22(2), 205-220.
doi: 10.1080/10494820.2013.788031
Marsh, H., & Overall, J. (1981). The relative influence of course level, course type, and instructor on students'
evaluations of college teaching. American Educational Research Journal, 18(1), 103-112.
doi: 10.3102/00028312018001103
O'Donovan, S., Gain, J., & Marais, P. (2013) A Case Study in the Gamification of a University-level Games
Development Course. In Proceedings SAICSIT 2013, 242-251. Retrieved from
http://pubs.cs.uct.ac.za/archive/00000926/01/Gamification.pdf
Parsons, J., & Taylor, L. (2011). Student engagement: What do we know and what should we do? University of
Alberta. Retrieved from
http://education.alberta.ca/media/6459431/student_engagement_literature_review_2011.pdf
Peña-Ríos, A., Callaghan, V., Gardner, M., & Alhaddad, M. (2012). The interreality portal: A mixed reality co-
creative intelligent learning environment. In Workshop Proceedings of the 8th International Conference on
Intelligent Environments, 267-274. doi: 10.3233/978-1-61499-080-2-298
Rhode, J. (2009). Interaction equivalency in self-paced online learning environments: An exploration of learner
preferences. The International Review of Research in Open and Distance Learning, 10(1). Retrieved from
http://www.irrodl.org/index.php/irrodl/article/view/603/1178
Saadatmand, M., & Kumpulainen, K. (2013). Content aggregation and knowledge sharing in a personal
learning environment: Serendipity in open online networks. International Journal of Emerging
Technologies in Learning, 8(S1), 70-78. doi: 10.3991/ijet.v8iS1.2362
Taylor, M. (2010). Teaching generation NeXt: A pedagogy for today's learners. In A collection of papers on self-
study and institutional improvement, 192-196. The Higher Learning Commission. Retrieved from
http://www.taylorprograms.com/images/Teaching_Gen_NeXt.pdf
Thompson, M., & Irvine, C. (2011). Active learning with the cyberciege video game. In Proceedings of the 4th
conference on Cyber security experimentation and test, 10-18. Retrieved from
https://www.usenix.org/legacy/event/cset11/tech/final_files/Thompson.pdf
Xu, D., Huang, W., Wang, H., & Heales, J. (2013). Enhancing e-learning effectiveness using an intelligent agent-
supported personalized virtual learning environment: An empirical investigation. Information &
Management, 51(4), 430-440. doi: 10.1016/j.im.2014.02.009

14. Challenges and Opportunities in Employing Game Development in
Computer Science Classes

Oswald Comber, Renate Motschnig


University of Vienna
Faculty of Computer Science
1090 Vienna, Austria
oswald.comber@univie.ac.at

Introduction

Video games are very popular among teenagers. According to the Entertainment Software Association
(2014), 59 percent of Americans play video games. In a survey of the target audience for the "Game Programming
in Schools" study in Austria, 72 percent of respondents stated that they play video games regularly (Comber,
2012a). Although video games were initially regarded with suspicion by a large number of educators, many
eventually recognized the potential of these games (Squire, 2003). Serious Games (Michael & Chen, 2005) can be
employed to playfully deliver educational content to learners in an engaging way. Apart from playing a game to
learn something, the game itself can be the subject of creation. Thus, instead of spending hours working within the
boundaries of a game's mechanisms, students can create their own mechanisms and develop their own video
games.
Developing video games can be a fruitful learning experience in various school subjects, but it seems
particularly promising in computer science. Theoretically and practically, a game development project
can span an entire computer science curriculum in Austria (BMBF, 2004). A game development project can be put
in place to mediate the use of office applications, especially for project-specific tasks. Game development also
embraces the use of graphics software such as GIMP (Kimball et al., 2014) and audio software such as Audacity
(Dannenberg & Mazzoni, 2014). Most importantly, game development addresses the field of programming in a
pervasive way. The programming of a game entails more than just typing in source code. When programming a
game, you have to perform some modeling and understand at least the principles of object-oriented programming.
Besides this, for some types of games, e.g., massively multiplayer online role-playing games or build-and-conquer
games, databases are the best way to store necessary information. Overall, video game development in the
classroom provides a versatile teaching strategy in computer science classes.

Challenges in game development

Video game development is very appealing to students (Comber, 2012a; Figure 1). When introducing the
idea of game development to teenage learners in computer science classes, one of the first questions is always "Can
we develop a game like Assassin's Creed, Far Cry, or Skyrim4 this year?" That leads us to the second fact: Game
development is a very complex, time- and resource-consuming activity when pursuing the development of triple-A
titles. Grand Theft Auto 5, for example, had a budget of $265 million (International Business Times, 2013) and a
development team of approximately 1,000 persons (Mudgal, 2013). Obviously, accomplishing this in a school
setting is not feasible with approximately two hours of computer science per week. Thus, the answer to the
students' first question is "Unfortunately not, but how did you like Doodle Jump, Flappy Bird, or the original Super
Mario5?" The answer is invariably something like "(Not that mind-blowing,) but real fun!" Starting from this point,
it is easy to continue talking with the students about possible video game projects in class.


4. Replace the game titles with the titles of any spectacular, cutting-edge video games with exorbitant development
costs and resources each year to get your current students' questions.
5. Despite its age, the original Super Mario is still known by teenagers today.

Figure 1. Attitudes of students aged 13-17 regarding video game development

Basic principles to make game development more productive

For every kind of lecture, it is helpful to gauge the thoughts of the audience. After one year of game
programming in secondary school (Comber, 2012b), students' reactions were collected. Based on a reflection of the
results and the students' statements, some basic principles for how to successfully lead game programming classes
were formulated. These principles help make video game development with a class of teenagers more productive
and rewarding.
The first principle is: Set reachable goals. It is important to be firm and clear about unrealistic expectations
and to talk about realistically feasible game projects. Feasible games that qualify can be simple jump 'n' run or
platformer games6, certain role-playing games with a top view7, top-view car games8, or puzzle games9.
This principle is very important to avoid frustration in students and teachers while approaching the end of the
project. Even at the beginning of the project, unrealistic goals can lead to losing oneself in perfecting the details,
getting stuck, and wasting time even before anything is accomplished. We sum up this principle with a student's
feedback: "Game development was a good idea, but significantly too complex and not possible to accomplish in
only two hours a week."
The second principle is: Reduce complexity where possible. When it comes to learning programming, there
are two ways to reduce complexity. Students in an early year of game development, working only with C# in Visual
Studio 2010 Express and XNA (Microsoft, 2011), stated that the main source of dissatisfaction was the complexity
of game programming (Comber, 2012b). The students' feedback concerning the complexity often followed these
patterns: "Programming a game is just too complicated!" or "I find programming interesting on one hand, but on the
other hand, it is very complicated." Their concerns about the complexity of game development even pertained to
simple tasks, including loading a player's texture and moving the player around the screen with the arrow keys. This
complexity caused the students' actual motivation to lag behind their expected motivation (Figure 2).


6. e.g., Doodle Jump, Super Mario
7. e.g., Pokémon RPG
8. e.g., Bombay Taxi 2
9. e.g., Tetris, Bubble Shooter

Figure 2. Expected vs. actual motivation level

The first approach is to choose goals that are easily accomplished with the programming environment
currently being used. If you do not want to cut back your goals, the second approach is to boost the game
development process. For this purpose, the use of tools that aid in achieving the objectives, or of a specialized
programming learning environment, is of great help. There are various programming learning
languages/environments, such as Logo (Logo Foundation, 2011), Kara (Reichert, 2014), and Greenfoot (Computing
Education Group - University of Kent, 2014). For simple video game development, Scratch (Resnick et al., 2009) or
SNAP (Mönig & Harvey, 2013) are easy-to-start environments. Scratch delivers a straightforward, colorful user
interface, which allows beginners to create their programs easily via drag and drop. However, if students have been
instructed to type their own code and some physics simulation goodies are the flavor of choice, a more customized
and specialized solution is necessary. In our case, a small framework called the Framework for Game Programming
in Schools (GamePinS) was created and employed.
Be extremely well prepared is the third principle. There are two substantial reasons that being well prepared
is very important. The first reason is that approaches that might work in other lessons (for example, the teacher
setting up an environment for learning and the students doing some great project work by themselves) will fail in
computer science10 when it comes to transforming the ideas into real results. The other reason is that there can
always be technical problems in the PC lab, and it is wise to minimize these problems through advance testing.
Saying "it worked yesterday at home" does not count as being extremely well prepared. Excellent preparation
includes being an expert on what you want the students to understand and achieve. Relying on the ingenious hackers
in class is not a reasonable option. Beyond the individual qualities and principles of every lecturer or teacher, these
three principles are vital for successful video game development with teenagers in computer science.

Game development in computer science classes

Based on the abovementioned principles of setting reachable goals, reducing complexity, and preparing
very well, a second term of game programming was tackled from February to June 2014. To reduce the complexity
somewhat, GamePinS was developed. GamePinS is based on XNA and Microsoft Visual Studio Express and allows


10. Except for one or two very capable hackers in class, who might be able to achieve something by themselves.

students to quickly begin programming their games. GamePinS supports the import of maps created with Tiled Map
Editor (Lindeijer & Cook, 2013) and provides physics simulations by embedding the Farseer Physics Engine
(Weber, 2013). A typical GamePinS sample project with self-designed graphics is depicted in Figure 3. The project
can also be found online (Comber, 2013).

Figure 3. Screenshot of a GamePinS sample project with Farseer Physics for the player (elephant) and the bowls

All activities in this term began with a brief introduction to console programming with C# and Visual
Studio. After the students learned about the basics of programming, an introduction to GamePinS was delivered to
all students. The additional skills of making textures with GIMP, building maps or levels with Tiled Map Editor, and
editing sounds with Audacity were covered. In the next step, students formed project teams, because game
programming was realized not in a tutorial style but in the same manner as a typical software project. The
members of the group therefore assumed the following roles: project leader and integrative programmer, engine
programmers, graphics designers and programmers, level designers and programmers, and audio designers and
programmers. All of the designer roles in the project included programming tasks in the game, writing code
fragments for purposes such as displaying the graphics, loading and displaying the level, and playing sounds. The
project leader completed some programming tasks that were not covered by the other programmers and was
responsible for the integration of the code the others produced. The reason for assigning roles in which everyone
was responsible for some programming tasks was to ensure that everyone acquired programming skills and
advanced their computational thinking.

Research method, or "Why design-based research meets all needs!"

The group of students doing the GamePinS activities consisted of eight learners. The small group size made
it impossible to perform the extensive statistical testing that would be feasible with larger groups. At the same time,
the circumstances in the classroom did not allow the research environment to be controlled as effectively as in a laboratory.
Fortunately, there is a research methodology that is designed for the investigation of small groups in an actual
(learning) environment: design-based research (DBR). DBR is characterized by being situated in a real educational
context and focusing on the design and testing of a significant intervention (Anderson & Shattuck, 2012, p. 16).
However, DBR goes beyond merely designing and testing particular interventions; it entails constructing
cumulative design knowledge and developing contextualized theories of learning and teaching (Design-Based
Research Collective, 2003, pp. 6, 8). The idea of utilizing a mixture of research methods is also present in the notion
of DBR (Maxcy, 2003).
Due to the small group size, measuring students' performance in computer science through game development
was tackled in several ways. One source of research data was questionnaires. The questionnaires showed basic
trends and consisted of simple questions about age, gender, computer usage habits, and playing games, and, most
importantly, questions concerning students' self-estimation of their game programming experience and progress. A
second source of research data was written feedback from the students, in which the learners were asked to explain
some of their experiences. The third line of attack was to evaluate the students' programming skills in a short test on
typical programming and algorithmic challenges.
Through these research methods, it was shown that a group of students who were developing games
benefited from pervasive engagement and high motivation in a computer science class; the students also
perceived an improvement in their programming skills. This is a positive result, but this evidence alone does
not make these conclusions strong enough. Therefore, the GamePinS group was compared to another group of
eight students of the same age who were implementing student-chosen software projects (SCSP) with plain Visual
Studio. The GamePinS and SCSP groups had the same introduction to console programming with C# and Visual
Studio, in which they first learned the basics of programming. The GamePinS group subsequently continued with game
development activities, and the SCSP group carried out their projects in Visual Studio. To provide equivalent starting
conditions, the GamePinS group had an introduction to the GamePinS framework, while the SCSP group had
lectures in Windows Forms programming to the same extent.

Research and results

Development of the GamePinS projects and the student-chosen projects began at the end of February 2014
and continued until the middle of June 2014. The final questionnaires for the GamePinS group were completed by
six learners (N1 = 6) in class, and those for the SCSP group were completed by seven learners (N2 = 7). This is not a
strong basis for performing a statistical evaluation, but it is still enough to see a trend. One of the questionnaire items
asked how the students rated their improvement in writing programs. In this case, a clear trend was notable, namely
that students programming games with the GamePinS framework experienced a greater degree of self-evaluated
improvement than did the students in the SCSP group (Figure 4).

Figure 4. Self-evaluated improvement in programming skills

In the next step, the programming skills test was completed by both student groups. The test itself was
given on the neutral ground of console programming; thus, neither of the groups had an advantage. The items of the
programming skills test examined the students' abilities to understand logical conditions, their knowledge of data
types, their comprehension and use of variables, and their understanding of what an algorithm does. In the data type
question (Q1), students had to match values to their corresponding data types, e.g., "true", "456", "12.3456756",
and "Halli Hallo!" to bool, int, double, and string, respectively. The basic programming question (Q2) required the
student to fill in the logical conditions in a while loop.

Q2. (Translation)11 Hansi the squirrel is terribly hungry. Luckily, he knows a place where plenty of food is available.
The only problem is a hunting cat who is lurking around the food and has already devoured some squirrels.
Fill in the blank space in the while loop so that the squirrel collects food as long as it has not already collected
enough and there is no danger.

bool collectedEnough = false;
bool Danger = false;

while (....)
{
    // The function LookoutforDanger() returns true if danger is detected
    Danger = Hansi.LookoutforDanger();

    // The function CollectFood() lets Hansi collect food;
    // it returns true if enough food was collected
    collectedEnough = Hansi.CollectFood();
}

Possible correct answers were: while(!collectedEnough && !Danger) and while(collectedEnough==false &&
Danger==false). Any permutations like while(!Danger && !collectedEnough) were also counted as correct
answers.

A typical item from the question evaluating the students' understanding of what an algorithm does (Q3) was
determining what a piece of source code does. The students were asked to answer questions like the following.

Q3. (Translation) Take a close look at this function:

int Calculate(int p)
{
if (p > 0) return p + Calculate(p - 1);
else return 0;
}

What result does this function produce if it is called as follows?


Calculate(4);

The parameter value of 4 is an example value; during the IT skills test, random values between two and five were
used, and the correct result was automatically calculated by the quiz tool as n (n + 1) / 2.
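The recursion simply sums the integers from p down to 1, which is why the closed form applies. A quick illustrative check (the test items were posed in C#; this Python port is a sketch, not the original item):

```python
def calculate(p: int) -> int:
    # Port of the C# item: recursive sum p + (p - 1) + ... + 1
    if p > 0:
        return p + calculate(p - 1)
    return 0

# The quiz tool's closed form n * (n + 1) / 2 matches for the values used (2..5)
for n in range(2, 6):
    assert calculate(n) == n * (n + 1) // 2

print(calculate(4))  # 4 + 3 + 2 + 1 = 10
```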
In the question evaluating the students' understanding of variables, the students were asked to switch the
values of two integer variables. In one test case, the switching had to be accomplished with the use of a temporary
variable (Q4), and in the other case, without the use of a temporary variable (Q5).
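The two variants can be sketched as follows (the original items were posed in C#; this Python port is illustrative only, and the arithmetic trick shown is one common answer to the no-temporary variant):

```python
# Q4: swap the values of two integer variables using a temporary variable
a, b = 3, 7
temp = a   # keep the old value of a
a = b
b = temp
assert (a, b) == (7, 3)

# Q5: swap without a temporary variable, via addition and subtraction
x, y = 3, 7
x = x + y   # x now holds the sum of both values
y = x - y   # y becomes the original x
x = x - y   # x becomes the original y
assert (x, y) == (7, 3)
```

In idiomatic Python one would simply write a, b = b, a; the arithmetic variant mirrors the constraint of the original item, where tuple assignment is unavailable.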
The results of the programming skills test showed some interesting trends. While in the understanding of
data types (Q1) both groups (GamePinS and SCSP) were nearly equal, their performance on the other items shows a
huge gap (Figure 5). Overall, the students from the GamePinS group scored an average of 25.47 points, and those
from the SCSP group scored 10.86 points.


11. The test items were in German; therefore, a translation is provided here.

Figure 5. Scores of the programming skills test.

Overall, the cumulative score and the scores on specific questions showed the significant advantage of
taking part in game programming activities. Since the small group size made the results more like a snapshot than a
signifier of general results, a triangulation (Jick, 1979; Olsen, 2004) of the different types of gathered results was
performed to reinforce the test results. The results of the programming skills test were cross-checked with the
students' self-estimation and validated by the students' comments, both of which supported the finding that
students benefited more from game programming than from the student-chosen software projects.
The students' comments were collected by asking specific questions about game development and the
development experience. Questions like "Can game programming increase engagement in IT class?" were asked.
One answer was, "I think yes, because you understand how to program better." Students were also asked, "How
was your experience of game programming?" The feedback included "Game programming was an astonishing thing.
It was a great experience," "It was interesting to create a game," and "It was fun, and I also learned something."

Conclusion and prospects

Game development in computer science classes appeals strongly to learners and is suitable for effectively
enhancing the learning of important aspects of computer science, in particular programming. The average
programming skills test score of the game development group was more than twice that of the SCSP group. In
addition to the deliberate choices to set reachable goals and thoroughly prepare lessons, reducing complexity is a
very important strategy. This strategy was pursued by developing a small, specialized framework (the GamePinS
framework), and it yielded positive results. The motivation of students developing games with GamePinS in 2014
was substantially higher than that of students in 2012 with only Visual Studio 2010 Express and XNA. In both
years, C# was the programming language of choice.
The field of game development offers a variety of possibilities. There are some very solid programming
learning environments out there (e.g., Scratch), but in our case, where students are instructed to program by writing
source code with tools that are also used in the software industry, there is still much to explore. For this reason,
GamePinS development will continue. Preparation of detailed eLearning resources is now in progress, and since the
development of XNA has been discontinued (Rose, 2013), the next move will be migrating to another underlying
framework, preferably MonoGame. MonoGame is an open-source implementation of XNA and supports iOS,
Android, Mac OS X, Windows 8 Metro, and Linux (MonoGame Team, 2009).
Further research will include the repetition of the research cycle and administration of the test to
reproduce the results in another course of game programming. Additionally, at least one more teacher has
agreed to participate in the next research cycle. This teacher has also agreed to implement game development in one
course and to work with a second control group on other software projects. Thus, there will be two GamePinS groups
and two SCSP groups, supported with equivalent eLearning material in Moodle, to collect further results.
Finally, one important insight crystallized from the ongoing game development with students. Game
development has much to offer young learners, and with the right tools, it provides not only the potential to boost
computer science skills but also the chance for individuals to take the step from video game consumers to creators.

References

Anderson, Terry, & Shattuck, Julie. (2012). Design-Based Research: A Decade of Progress in Education Research?
Educational Researcher, 41(1), 16-25. doi: 10.3102/0013189x11428813
BMBF. (2004). Austrian Federal Ministry of Education and Women's Affairs. Curriculum of Computer Science in
Austria - Lehrplan Informatik fr die AHS Oberstufe. Retrieved 2014, August 3, from
https://www.bmbf.gv.at/schulen/unterricht/lp/lp_neu_ahs_14_11866.pdf?4dzgm2
Comber, Oswald. (2012a). Attitudes towards "Game Programming in School" - A survey amongst Secondary School
students (username: gameprog, password: 2012GamePin$). Retrieved 2012, August 8, from
http://www.comber.at/research/results2012/SurveyA_2012.htm
Comber, Oswald. (2012b). Game Programming in School - One schoolyear of game programming in Secondary
school. A survey amongst students (username: gameprog, password: 2012GamePin$). Retrieved 2012,
August 8, from http://www.comber.at/research/results2012/SurveyB_2012.htm
Comber, Oswald. (2013). GamePinS - Platformer Example. Retrieved 2014, August 10, from
http://www.comber.at/gamepins/Beispiele/GamePinS-Seitenansicht-Beispiel.zip
Computing Education Group - University of Kent. (2014). Greenfoot. Retrieved 2014, August 8, from
http://www.greenfoot.org/
Dannenberg, Roger, & Mazzoni, Dominic. (2014). Audacity, a free audio editor. Retrieved 2014, August 4, from
http://audacity.sourceforge.net/
Design-Based Research Collective. (2003). Design-Based Research: An Emerging Paradigm for Educational Inquiry.
Educational Researcher, 32(1), 5-8. doi: 10.3102/0013189x032001005
Entertainment Software Association. (2014). Essential Facts About the Computer and Video Game Industry.
Retrieved 2014, July 30, from http://www.theesa.com/facts/pdfs/ESA_EF_2014.pdf
International Business Times. (2013). 'GTA 5' Costs $265 Million To Develop And Market, Making It The Most
Expensive Video Game Ever Produced. Retrieved 2013, September 14, from http://www.ibtimes.com/gta-
5-costs-265-million-develop-market-making-it-most-expensive-video-game-ever-produced-report
Jick, Todd D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative science
quarterly, 602-611.
Kimball, Spencer, et al. (2014). GIMP - The GNU Image Manipulation Program. Retrieved 2012, August 9, from
http://www.gimp.org
Lindeijer, Thorbjørn, & Cook, Daniel. (2013). Tiled map editor. Retrieved 2013, March 19, from
http://www.mapeditor.org/
Logo Foundation. (2011). Logo. Retrieved 2014, June 23, from http://el.media.mit.edu/logo-
foundation/logo/index.html
Maxcy, S.J. (2003). Pragmatic threads in mixed methods research in the social sciences: The search for multiple
modes of inquiry and the end of the philosophy of formalism. Handbook of mixed methods in social and
behavioral research, 51-89.
Michael, David R, & Chen, Sandra L. (2005). Serious games: Games that educate, train, and inform: Muska &
Lipman/Premier-Trade.
Microsoft. (2011). XNA Game Studio 4.0. Retrieved 2013, August 22, from
http://msdn.microsoft.com/en-us/library/bb200104(v=xnagamestudio.40).aspx
Mönig, Jens, & Harvey, Brian. (2013). Snap! - formerly known as BYOB. Retrieved 2013, March 30, from
http://snap.berkeley.edu/
MonoGame Team. (2009). MonoGame - An open source implementation of Microsoft's XNA. Retrieved 2013,
April 1, from http://monogame.codeplex.com/
Mudgal, Kartik. (2013). GTA 5 dev team size more than 1000, manpower dependent on game detail. Retrieved
2014, August 5, from
http://www.gamechup.com/gta-5-dev-team-size-more-than-1000-manpower-dependent-on-game-detail/

148
Olsen, Wendy. (2004). Triangulation in social research: qualitative and quantitative methods can really be mixed.
Developments in sociology, 20, 103-118.
Reichert, Raimond. (2014). Kara. Retrieved 2014, August 8, from
http://www.swisseduc.ch/informatik/karatojava/index.html
Resnick, Mitchel, et al. (2009). Scratch: programming for all. Commun. ACM, 52(11), 60-67. doi:
10.1145/1592761.1592779
Rose, Mike. (2013). It's official: XNA is dead. Retrieved 2014, August 10, from
http://www.gamasutra.com/view/news/185894/Its_official_XNA_is_dead.php
Squire, Kurt. (2003). Video games in education. Int. J. Intell. Games & Simulation, 2(1), 49-62.
Weber, Jeff. (2013). Farseer Physics Engine. Retrieved 2013, August 13, from https://farseerphysics.codeplex.com/

15. Students' Conceptual Understanding of Leadership in a Global World:
Learning via a Web-based Simulation of Political and Economic Development

Seungoh Paek
Learning Design and Technology
University of Hawaii at Mānoa
United States
speak@hawaii.edu

Daniel L. Hoffman
Kamehameha Schools
United States
dahoffma@ksbe.edu

Introduction

Many scholars agree there is a need for increased emphasis on preparing children to be globally oriented.
According to Legrain (2002), "our lives are becoming increasingly intertwined with those of distant people and
places around the world - economically, politically, and culturally" (p. 4). In order for today's youth to be prepared
for a more intertwined future, they will need global competence. Global competence is defined as the ability to
contribute to knowledge, as well as comprehend, analyze, and evaluate its meaning in the context of an increasingly
globalized world (NASULGC, 2004).
The need to prepare children for life in a global society poses significant challenges to educators.
According to Zhao (2010), cultivating global citizenship requires teachers to have a global perspective, model
cultural sensitivity, model global citizenship, and engage students in educational activities aimed at developing
global citizenship (p. 247). In order for these perspectives, models, and activities to make it to classrooms, a
number of changes are required in the way social studies is taught in U.S. classrooms. For example, Suarez-Orozco
and Sattin (2007) suggest schools need to restructure curriculum and pedagogy such that they present
opportunities for students to understand how their actions are embedded in a larger global context and may have
implications on a more macro-level scale. Given all that cultivating global competence and citizenship
entails, researchers and educators should seek to design and implement opportunities that allow students to
contemplate the global context in K-12 settings.
One type of educational tool that might be leveraged to support student thinking about the larger global
society is the simulation. Simulations are computer-based interactive environments with an underlying model and are
often used to explore phenomena that occur over long or extremely short time periods (D'Angelo et al., 2014). In the
past 20 years, researchers have touted the ability of simulations to promote active learning in classrooms (e.g.,
Alvarez, 2008). Early work by Geban, Aşkar, and Özkan (1992) found that student problem-solving in a computer-
simulated experiment resulted in more positive attitudes towards, and greater process skills in, chemistry when
compared to conventional instruction. More recently, Chang, Chen, Lin, and Sung (2008) found that second-year
junior high school students who learned basic characteristics of an optical lens through a simulation-based lesson had
significantly higher learning outcomes than students in a laboratory learning setting.
Though often associated with the domain of science, simulations have also been used to explore a variety
of topics in social studies, including a U.S. Senate office (Lay & Smarick, 2006), the Cuban Missile Crisis (Pace,
Bishel, Beck, Holquist, & Makowski, 1990), the Holocaust (Schweber, 2004), and treaty negotiation (Gehlbach et
al., 2008). Despite these examples, Gehlbach et al. (2008) argue that relatively little is known about the extent to
which simulations improve motivational outcomes for students. They also suggest "it is unclear whether findings
from other contexts will generalize to simulations in middle school social studies classrooms" (p. 896).
However, given the need to provide social studies teachers with tools that help students realize how their
individual actions are embedded in, and have impact on, a larger global context, the current research set out to
examine the impact of a social studies-related simulation on student thinking about these topics. Specifically, the
study investigated the extent to which a simulation of political and economic development altered students'
understanding of the features and actions of successful countries in a globalized, interconnected world. It was
hypothesized that leading a country in a simulated world (details below) would provide an opportunity for students
to contemplate the ingredients of a successful nation and how a country's characteristics and behaviors are
situated within a broader context.
To explore these ideas, the authors implemented a two-week study using a pre-existing simulation. Details
of the study design and the simulation used are described in the following section.

Method

Participants & Design

Fourteen (N=14) students ranging in age from 9 to 15 participated in the study. The participants were
recruited from a university-sponsored summer program offering daylong courses in science, technology, and the
arts. The majority of participants were male (93%). The study employed a one-group, pretest-posttest design. All
participants followed the same procedure.

Procedure

The study was broken down into three parts: pre-simulation, simulation, and post-simulation. The pre-
simulation portion of the study introduced the project to all participants, each of whom had parental permission and
a signed consent form. This included an orientation presentation in which participants were introduced to the simulation and asked
about their background knowledge of simulations and computers. Participants were also given an opportunity to ask
questions about simulations in general and their specific role using the simulation. When all questions were
addressed, a series of pre-test measures were administered. These measures included surveys regarding their
perceived knowledge and achievement in the domain of social studies. They also answered questions about their
motivation to learn new social studies content. Furthermore, participants were asked to draw a concept map based on
their understanding of the features and actions of a successful country in the world. This work is part of a larger
study; only the concept maps are the focus of this paper.
The next part of the study involved the simulation. Participants were randomly assigned one simulated
country that they would lead in the project's simulated world for a period of seven days. These countries were based
on real nations such as Brazil and Ghana. In total, the simulated world consisted of 14 countries (one per
participant). Each day of the study began with a 15-minute mini-lesson presented by the researchers followed by a
90-minute play session in which participants tried to develop their countries, making a number of economic and
political decisions. All participants worked by themselves using an individual laptop computer. At the end of each
play session, the researchers led a 15-minute whole group debriefing. The post-play debriefings were designed to
prompt participants to reflect on their experience of controlling and leading their nation. Each lesson in the
simulation stage lasted two hours.
The final stage of the experiment lasted a single day. Participants completed three post assessments
designed to match those administered at the beginning of the study. These were given in the same order and
participants had the same amount of time to complete them. To bring closure to the experience, a final activity was
conducted involving a whole-class discussion about what happened in the simulated world. During this time,
participants were encouraged to talk from two perspectives: a national perspective and a global perspective.
Prompts used during the discussion included: "What strategies did you use to develop your country?" and "Did you
notice any international trends in the simulated world?" At the end of the debriefing, the study was over. The research team
thanked the students for participating in the study and said goodbye, marking the end of the data collection.

Materials

The Simulation

The simulation used in the study is called Simpolicon. This simulation models economic development,
environmental sustainability, and international relations, and is owned by the Akwaaba Foundation, a non-profit
education organization. The simulation itself is free and available online. It was chosen for its age-appropriate
portrayal of the complex processes and problems facing developing nations.
All countries in the simulated world begin with a limited amount of land, mineral deposits, unskilled labor,
and hand tools. Students must make decisions on how best to use these resources to advance their country politically
and economically. The goal of each student-led country is to create and maintain a stable, secure country with a
well-balanced and sustainable economy. The challenge for students is to provide for their country's unlimited
economic wants through advanced economic production, while at the same time limiting pollution and preparing for
natural disasters, public health issues, and other real-world problems.
The simulation is designed to give students the opportunity to make choices (as if they were political and
economic experts) that produce products to keep citizens alive and healthy, as well as provide jobs and
opportunities for education. At the same time, they must attempt to conserve resources and keep pollution levels
reasonable. Through this process, it is believed, students might learn how national decisions and actions impact
citizens locally and globally.

Concept Mapping

In an effort to measure participants' understanding of the features and actions of successful countries
operating in a global context, a concept mapping activity was employed twice: once before the seven-day simulation
experience (pre-simulation) and again immediately following the final session with the simulation (post-simulation).
Concept maps have been used to assess students' conceptual understanding of topics at a particular point in time.
They are a mechanism for structuring and organizing knowledge into hierarchies, and allow the analysis of
phenomena in the form of cause-effect relations (Kinchin & Hay, 2000; Stoyanov & Kommers, 1999). As a
measurement tool, concept maps appear to be better suited to assess inquiry or project-based learning than more
traditional assessments such as multiple-choice tests (Stoddart et al., 2000; Novak & Gowin, 1984; Markham,
Mintzes, & Jones, 1994).
After a brief introduction showing simple examples of unrelated concept maps from other domains,
participants were given a single sheet of paper and a pencil. They were asked to create a concept map based on the
following prompt: "What are the features and actions of a successful and sustainable country in a global,
interconnected world?" Participants were given 20 minutes to complete this activity.
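To make the scoring concrete, a concept map can be represented as labeled nodes joined by connectors, and the two measures compared in the Results are simply the sizes of those sets. The sketch below is illustrative only; the node labels are hypothetical and this is not the study's actual scoring code.

```python
# A concept map stored as an adjacency list: each key is a node,
# each value the list of nodes it connects to (hypothetical labels).
concept_map = {
    "successful country": ["economy", "education", "people"],
    "education": ["schools", "teachers"],
}

# Node count: every label that appears anywhere in the map.
nodes = set(concept_map) | {c for kids in concept_map.values() for c in kids}

# Connector count: total number of links drawn between nodes.
connectors = sum(len(kids) for kids in concept_map.values())
```

For this toy map, the scores would be six nodes and five connectors, the same two quantities tallied for each participant's pre- and post-simulation maps.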

Results

The purpose of this one-group pre-post study was to uncover preliminary evidence that a free, web-based
simulation of political and economic development might be a means to engage students in thinking about the
features and actions of a successful country operating in a global context. To examine the impact of using such a
simulation for seven days, participants' pre-simulation and post-simulation concept maps were compared. The results
show that both the number of nodes and the number of connectors per concept map increased significantly from pre-
to post-simulation. That is, the average number of nodes increased from 17.14 (SD = 5.08) before the simulation
experience to 20.79 (SD = 8.29) after the intervention. This difference was statistically significant: t(13) = 2.25, p =
.042. Furthermore, the average number of connectors between nodes increased from 19.50 (SD = 6.09) before the
simulation to 26.79 (SD = 14.02). Like the number of nodes, the difference in the number of connectors was
statistically significant: t(13) = 2.16, p = .05. Together these results suggest that concept maps were more detailed
and interconnected after the simulation experience. See Figure 1 for a side-by-side comparison of two participants'
pre-simulation and post-simulation concept maps.
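The paired comparisons above follow the standard paired-samples t test, which can be sketched in a few lines. The data passed in below are invented for illustration; they are not the study's node counts, and the p-value would still be looked up from a t distribution with the returned degrees of freedom.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom.

    Computes the per-participant differences, then t = mean(d) / (sd(d)/sqrt(n)).
    The p-value is obtained separately from a t distribution with df = n - 1.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

# Hypothetical node counts for four participants, pre and post:
t, df = paired_t([10, 12, 15, 9], [12, 13, 17, 11])
```

With 14 participants, as in the study, df would be 13, matching the reported t(13) values.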

Figure 1. Side-by-side comparisons of two participants' pre-simulation (left) and post-simulation (right)
concept maps.

Prompted by the quantitative changes described above, the researchers were interested in examining the
content changes in the participants' concept maps: did the concepts represented in the concept map nodes
change between pre-simulation and post-simulation? A closer examination of the content contained in the concept
maps revealed three interesting findings.
First, the most common word to appear in the pre-simulation concept maps was "tourism." Out of the 14
participants, seven (50%) listed either "Tourists" or "Tourism" as being features or actions of a successful country.
This finding is not surprising given that the participants are from a state known for tourism. From an
education perspective, knowing that tourism is an important part of a locality is positive. However, the concept all
by itself is rather superficial in that it does not connote a deeper appreciation of the impact tourism has on a local
economy and culture. It also overlooks what it takes to sustain tourism from an environmental and service-industry
perspective. For example, the pre-simulation concept map shown at the bottom of Figure 1 shows that a successful
country "has tourists." Nodes connecting to "tourists" include restaurants, music, attractions, and nice beaches. As this
example demonstrates, the participant's concept map does not use terms such as economy, environment, or
education.
A second interesting finding centered on a common word appearing on the post-simulation concept maps:
"education." Out of 14 participants, 9 (64.28%) included a node for education on their post-simulation concept map,
whereas the pre-simulation maps included education only three times (21.43%). This result suggests participants may have
realized that education is a key feature of advancing a nation. Furthermore, many of the post-simulation concept
maps included nodes comprising the building blocks of an education system, including teachers, schools, colleges, and
universities. This result is not surprising given the design of the simulation. In the simulated world, all nations begin
with unskilled labor. However, in order to advance their country socially and economically, participants quickly
realize they must build a more educated workforce, which can be done by dedicating resources to produce schools
and teachers.
A third finding related to the content of the concept maps revolved around the people of the participants'
countries. For instance, only four participants (28.57%) wrote "people" on their pre-simulation concept maps. The
nodes connected to this central node included concepts such as "make your people happy," "hurt less people,"
and "more people to come." These are relatively vague nodes compared to the post-simulation concept
maps, on which 12 participants (85.71%) listed "people" with more specific, action-oriented child
nodes, including "make people have some wealth" and "feed people." Although generalization is difficult with only 14 participants,
one might argue that the emphasis on "people" demonstrates a raised awareness of the role the citizens of a
nation play in advancing their country.
While these findings are preliminary and based on a small sample of participants, the quantitative and
qualitative differences between the pre-simulation and post-simulation concept maps suggest some changes in
participants' understanding of the features and actions of successful countries in a global world. Importantly, the
differences seem to suggest that participants may have extended and deepened their view of what is involved
in developing a country economically, socially, and politically, regardless of the geographic location of their
simulated country.

Discussion

Altogether, the results of the study suggest that providing students opportunities to lead a country in a
simulated world may be a way to extend and deepen their understanding of important aspects of life in a global
society. The simulation used in this study provided a shared experience in which students could compare and contrast
patterns of development at micro and macro levels. Despite its limitations (small sample size, one-group
experimental design), this study provides some encouraging evidence that, given the right opportunities, students are
willing and able to wrestle with the challenges of being leaders in the 21st century. The researchers feel
additional research is needed to better understand what aspects of the simulation experience contributed to student
learning: was it the simulation itself, the discussions surrounding events in the simulation, or a combination of the
two that promoted student growth? In conclusion, it seems plausible that simulations focused on political and
economic development, if carefully designed in a user-friendly, age-appropriate manner, might prove to be a
valuable resource for teachers interested in fostering global competence in their social studies classrooms.

References
Alvarez, P. (2008). Students play the notables: Testing a simulation exercise. The History Teacher, 41(2), 179–197.
Chang, K. E., Chen, Y. L., Lin, H. Y., & Sung, Y. T. (2008). Effects of learning support in simulation-based physics
learning. Computers & Education, 51(4), 1486–1498.
D'Angelo, C. D., Rutstein, D., Harris, C., Bernard, R., Borokhovski, E., & Haertel, G. (2014). Simulations for STEM
learning: Systematic review and meta-analysis. Menlo Park, CA.
Deci, E. L., Eghrari, H., Patrick, B. C., & Leone, D. R. (1994). Facilitating internalization: The self-determination
theory perspective. Journal of Personality, 62(1), 119–142.
Eccles (Parsons), J. E., Adler, T., & Meece, J. L. (1984). Sex differences in achievement: A test of alternate theories.
Journal of Personality and Social Psychology, 46(1), 26.
Eccles, J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J., & Midgley, C. (1983).
Expectancies, values and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motives:
Psychological and sociological approaches (pp. 75–146). San Francisco: W. H. Freeman.
Geban, Ö., Aşkar, P., & Özkan, İ. (1992). Effects of computer simulations and problem-solving approaches on high
school students. The Journal of Educational Research, 86(1), 5–10.
Gehlbach, H., Brown, S. W., Ioannou, A., Boyer, M. A., Hudson, N., Niv-Solomon, A., … Janik, L. (2008).
Increasing interest in social studies: Social perspective taking and self-efficacy in stimulating simulations.
Contemporary Educational Psychology, 33(4), 894–914. doi:10.1016/j.cedpsych.2007.11.002
Kinchin, I. M., Hay, D. B., & Adams, A. (2000). How a qualitative approach to concept map analysis can be used to
aid learning by illustrating patterns of conceptual development. Educational Research, 42(1), 43–57.
Lay, J. C., & Smarick, K. J. (2006). Simulating a Senate office: The impact on student knowledge and attitudes.
Journal of Political Science Education, 2(2), 131–146. doi:10.1080/15512160600668967
Legrain, P. (2002). Open world: The truth about globalization. Abacus.
Markham, K. M., Mintzes, J. J., & Jones, M. G. (1994). The concept map as a research and evaluation tool: Further
evidence of validity. Journal of Research in Science Teaching, 31(1), 91–101.
NASULGC. (2004, October). A call to leadership: The presidential role in internationalizing the university. A
report of the NASULGC Task Force on International Education. Washington, DC: Author.
Novak, J. D. (1998). Learning, creating, and using knowledge: Concept maps as facilitative tools in schools and
corporations. Mahwah, NJ: Lawrence Erlbaum.
Novak, J. D., & Gowin, D. B. (1984). Learning how to learn. Cambridge: Cambridge University Press.
Pace, D., Bishel, B., Beck, R., Holquist, P., & Makowski, G. (1990). Structure and spontaneity: Pedagogical
tensions in the construction of a simulation of the Cuban Missile Crisis. The History Teacher, 24(1), 53–65.
Ryan, R. M. (1982). Control and information in the intrapersonal sphere: An extension of cognitive evaluation
theory. Journal of Personality and Social Psychology, 43(3), 450.
Ryan, R. M., Koestner, R., & Deci, E. L. (1991). Ego-involved persistence: When free-choice behavior is not
intrinsically motivated. Motivation and Emotion, 15(3), 185–205.
Schweber, S. A. (2004). Making sense of the Holocaust: Lessons from classroom practice. New York, NY: Teachers
College Press.
Stoddart, T., Abrams, R., Gasper, E., & Canaday, D. (2000). Concept maps as assessment in science inquiry
learning: A report of methodology. International Journal of Science Education, 22(12), 1221–1246.
Stoyanov, S., & Kommers, P. (1999). Agent-support for problem solving through concept-mapping. Journal of
Interactive Learning Research, 10(3/4), 401–425.
Suarez-Orozco, M., & Sattin, C. (2007). Wanted: Global citizens. Educational Leadership, 64(7), 58–63.
Zhao, Y. (2010). Preparing globally competent teachers: A new imperative for teacher education. Journal of
Teacher Education, 61(5), 422–431. doi:10.1177/0022487110375802

16. Ready to Practice? Learning skills using digital simulated patients

Nataly Martini
School of Pharmacy, Faculty of Medical and Health Sciences
University of Auckland, New Zealand
n.martini@auckland.ac.nz

Ashwini Datt
Centre for Learning and Research in Higher Education
University of Auckland, New Zealand
a.datt@auckland.ac.nz

Anuj Bhargava
Department of Physiology, Faculty of Medical and Health Sciences
University of Auckland, New Zealand
a.bhargava@auckland.ac.nz

Craig Webster
Centre for Medical and Health Sciences Education,
University of Auckland, New Zealand
c.webster@auckland.ac.nz

Introduction
Simulation is widely held to be a powerful teaching technology in many domains (Cela-Ranilla, Esteve-
Mon, Esteve-Gonzalez, & Merce Gisbert-Cervera, 2014; Damassa & Sitko, 2010; Lyons, 2012), in particular for
improving competencies in contexts where skills practice is associated with significant risks, as in medical
education (Damassa & Sitko, 2010). The opportunity for hours of deliberate practice (McGaghie & Fisichella, 2014)
in otherwise rare real-life scenarios is one of the major strengths of simulations. Achievement through directed goals
and immediate feedback can be intrinsically motivating and hence promote deep learning (Lyons, 2012; Malone, 1981).
However, there are also significant barriers to the use of educational simulations, in particular the resources and
time required for planning, developing, and implementing them (Lean, Mozier, Towler & Abbey, 2006).
Lean et al. (2006) categorise simulation-based learning as role play, gaming, or computer simulation.
Each of these differs in design and purpose, with potential overlaps. In role plays, participants assume the roles
of different characters to interact in a controlled situation. Gaming and simulations replicate real-life scenarios
in which participants follow a set of instructions to achieve a pre-defined goal. Most importantly, since a simulation is a
proxy for a real environment, its relevance and reception depend on how able the participants are to suspend
disbelief and engage (Damassa & Sitko, 2010), irrespective of the detail in its design.
Good educational simulation designs include challenge and curiosity (Malone, 1980), which can be used to
varying degrees to motivate and engage students in deep, meaningful learning. The challenge is presented in the
uncertainty of achieving a defined goal (e.g. keeping the patient well in a patient care simulation). The provision of
selective information (e.g. patient history, drug charts) creates a level of curiosity that invokes intrinsic motivation,
often associated with deep learning. Unlike surface learning approaches, where the focus is on reproducing material
or rote learning, deep learning enables students to see interconnections between elements, or the meanings and
implications of what is learned (Biggs, Kember & Leung, 2001). Be it deep or surface, the learning approach taken
by the student can be modified by changes in the teaching situation. This may change the student's motivation,
which in turn affects the educational outcome.
This paper describes one such attempt at deep learning with the use of a 2D simulation called Ready to
Practice? in medical education. It is a screen-based case simulation that enables students to experience a critical and
challenging emergency situation in renal care. Screen-based cases, among the earliest forms of simulation, have a
proven track record, and Ready to Practice? falls within the second most common use of simulations, which is
patient care (Damassa & Sitko, 2010). Patient care cases also span the procedural and situational simulation
categories of Lyons (2012). Procedural simulations require participants to follow a correct sequence of steps to
achieve the appropriate end goal, while situational simulations prompt participants to think critically and make relevant
decisions, with constant feedback to rethink and adapt.
Ready to Practice? tests students' skills in the communication and decision making required to develop
an effective patient care plan in a safe and comfortable learning environment. The pace and interaction are learner-
driven and controlled. Consistent with the work of Cela-Ranilla et al. (2014), the Ready to Practice? simulation:
1. Offers an alternative approach to learning patient care and management plans.
2. Extends the learning in lectures and clinical practice.
3. Provides an opportunity for interaction between potential team members (portrayed as characters in the
simulation).

Simulation design
The simulation is web-based and can be logged into by multiple participants at a time from any internet-
connected computer. Each participant is presented with still photographs of a virtual patient actor and clinical staff
taken in a real clinical environment at Middlemore Hospital, South Auckland. Speech bubbles, prompts, and decision
menus appear overlaid on the photographs with a series of options from which the participant can select. A progress
bar signals how far the participant has progressed through the simulation. The simulation is divided into three parts.
Part 1 (Fig. 1) sees the nurse introduce the patient to the simulation participant, providing some basic
medical information on the patient and offering a number of options for gathering additional information to assist the
participant in reaching an accurate diagnosis. Additional information is provided in the form of a GP letter, patient
notes, the readout of a cardiac monitor, and interaction with the patient through predetermined questions and
responses. The status of the patient is illustrated by a cardiac monitor that pops up on the screen at intervals, warning
the participant if they are taking too long to reach a clinical care decision or if the patient is deteriorating.
The responses of the nurse also change depending on the care choices made by the participant. Should the participant
take longer than nine minutes to reach a diagnosis and make an appropriate treatment recommendation, a flatlined
ECG appears and the participant is informed that the patient did not survive, at which point there is an option to
restart the simulation. It is important to note that a number of distractors have been introduced into the simulation to
add realism, and to allow the participant to assess the situation and select the most important and relevant information
to reach an informed clinical decision.
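The timing behaviour described above (periodic warnings, a nine-minute limit, and a flatline ending) amounts to a simple state check on elapsed time. The sketch below is an illustrative reconstruction; the function name, the two-minute warning cadence, and the event labels are assumptions, not taken from the actual Ready to Practice? implementation.

```python
TIME_LIMIT_S = 9 * 60      # a diagnosis must be reached within nine minutes
WARNING_INTERVAL_S = 120   # assumed cadence of the cardiac-monitor pop-up

def monitor_event(elapsed_s, diagnosed):
    """Return the event the simulation raises at a given elapsed time (seconds)."""
    if diagnosed:
        return "proceed"    # diagnosis made in time: continue to Part 2
    if elapsed_s >= TIME_LIMIT_S:
        return "flatline"   # patient did not survive; offer a restart
    if elapsed_s > 0 and elapsed_s % WARNING_INTERVAL_S == 0:
        return "warning"    # cardiac monitor warns the patient is deteriorating
    return "monitor"        # normal monitor display, no intervention
```

In a real implementation this check would run on a timer tick, with the "warning" branch driving the pop-up cardiac monitor described above.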

Figure 1: Ready to Practice? Part 1
Figure 2: Ready to Practice? Part 2 drug chart

Part 2 introduces a Senior Medical Officer who poses a series of questions, each followed by a set of
predetermined answers from which the participant is required to select the correct response. Questions include why
the participant thought the patient had presented to hospital and what the best medical treatment is for the condition.
Included in this section is a series of open-ended questions, which are coded in response to the individual choices
made in the simulation by participants. For example, if the participant did not order a CT scan or neurological
examination, they are asked why they made this decision. A text box allows the participant to type in their reply.
Also in this part, participants are requested to complete a drug chart for the patient by increasing, reducing, or
stopping the dose of the medication the patient has currently been charted (Fig. 2). Correct answers are indicated
by a tick and a comment on why the selection is correct. If incorrect, there is an option to try again.

Part 3 is the Medical Handover Meeting. As in previous parts, the participant is asked a series of questions
and may select the correct response from a selection of predetermined answers. This section allows the
participant to reflect on decisions made in Parts 1 and 2. Based on the participant's responses, the
simulation codes individual responses to gather additional information on why participants made those clinical care
decisions.

Methods
With institutional ethics committee approval (Ref: 012394), participants were recruited from fourth-year
Pharmacy students enrolled in a clinical paper at the University of Auckland. Students were informed about the
simulation by the course coordinator via an announcement on a learning management system (LMS) and by clinical
lecturers in a lecture and workshops. The simulation was made available via a hyperlink in an announcement
on the course LMS from 6 October to 3 November 2014, during which time participants
were given an opportunity to complete the simulation and an evaluative survey.
To complete the simulation, students were required to enter their student ID number to guard against anyone
completing the simulation more than once. Responses to the reflective questions within the simulation were collected
by a third party (InGame Ltd, Auckland, New Zealand) on a spreadsheet after students completed and submitted the
survey. An evaluative survey (adapted from Whitton, 2007) was attached to the simulation as a separate link that
guided students to SurveyGizmo. Responses to this evaluation were sent anonymously to a co-investigator (AB).
Students who wished to enter a prize draw were given an opportunity to submit their ID numbers via a separate
page in the simulation, which was not linked to their survey responses.
Data were captured by a series of reflective questions throughout the simulation and, at the end, by means of an
evaluative survey. Evaluative questions were designed using a 5-point Likert scale from "strongly disagree" to
"strongly agree"; a series of questions to determine whether decision-making processes had improved, worsened, or
stayed the same; and a question on whether or not participants thought they were likely to be motivated by simulations.
Analysis involved tabulation and description of numeric values and the selection of exemplar quotations for
the reporting of results.

Results
Of the 97 fourth-year Pharmacy students, 19 (19.6%) completed the simulation, of whom 18 submitted the
evaluative survey; one student submitted an email evaluation instead. Not all participants answered all the questions in the
simulation. Of these participants, 16 (88.9%) were motivated to learn using simulations; 11 (61.1%) said it improved
their ability to make effective use of clinical resources; 11 (61.1%) felt that it had improved their ability to make
good decisions quickly; and 9 (50%) believed it had improved their constructive decision-making skills.

Participant evaluation of the simulation


Eighteen participants completed the survey (Table 1; five-point Likert scale data have been grouped into three
categories for the purposes of analysis).
One student questioned whether the simulation was created for pharmacy or medical students as she had
"no idea like what to do first etc. in an acute setting, nor how to read the ECGs… But definitely thought it was cool,
and a really good way to test your knowledge and think on your feet, which again we don't really get any experience
at." [Participant 19]

Table 1: Student responses to the evaluative survey (N=18)


Survey statement SD + D (%) N (%) SA + A (%)
I wanted to explore all the options available to me 3 (16.7) 3 (16.7) 12 (66.7)
I knew what I had to do to complete the activity 6 (33.3) 1 (5.6) 11 (61.1)
I felt that I could achieve the goal of the activity 2 (11.1) 16 (88.9)
I had all the things I required to complete the activity successfully 1 (5.6) 1 (5.6) 16 (88.9)
The activity was too complex 14 (77.8) 4 (22.2)
I had too many potential options available to me 9 (50) 6 (33.3) 3 (16.7)
I felt absorbed in the activity 3 (16.7) 1 (5.6) 14 (77.8)
I had to concentrate hard on the activity 5 (27.8) 3 (16.7) 10 (55.6)
Feedback I was given was useful* 1 (5.6) 2 (11.1) 15 (83.3)
SD = strongly disagree; D = disagree; N = neutral; SA = strongly agree; A = agree
* The participant who disagreed with this statement commented that some of the tasks weren't suitable for pharmacy
students and opinions did not always match what they had been taught.

Simulation analysis and reflection: Information gathering, diagnosis and treatment


Participants were warned that the patient's condition was deteriorating if they spent too long gathering
information. When asked whether they believed they could have responded more promptly, almost all students
agreed, as they recognised that the patient's condition was serious.
Yes, I got caught up with questioning the patient and trying to figure out what was going on. [Participant
16]
Only a few students did not make use of all the clinical resources available, as they believed they had gathered
sufficient information to reach a diagnosis and that time was of the essence. Those who did make use of all the
resources agreed they were sufficient to aid in a diagnosis, although one student needed additional references
for ranges of electrolyte levels. [Participant 2]
Participants were required to work through a number of distractors to reach a diagnosis. Those who clicked on the
medication prompt by interacting with the patient were asked to comment on the significance of this.
It was very significant and realistic to what limited knowledge a patient would have in reality about their
medication. [Participant 5]
We wouldn't have known that he took Nurofen for back pain, which ended up being the precipitating factor
that helped us figure out the reason for his acute renal failure. [Participant 16]
A few made the diagnosis without speaking to the patient regarding their medicines, as they had found the
information elsewhere.
The back pain was mentioned in the patient's notes. It is relevant as the NSAID given for the back pain can
precipitate renal failure when added with diuretic and ACE inhibitor. [Participant 13]
Another distractor prompted participants to request a neurology assessment or order a CT scan. None of the students
chose to do so, and when asked to explain their reasons, the majority thought there was a more probable
diagnosis, some were aware of the time restrictions, and others suggested they would not have known how to
interpret the results.
I have not learnt much about interpreting neurology examination results/CT scans so did not believe I
would gain any additional information from this. [Participant 9]
Asked to explain the key features of the patient's condition that led to their diagnosis, the majority of the students
recognised that a drug interaction had caused the injury to the patient and that the blood results indicated an acutely
dangerous situation. Half of the students selected the most appropriate drug treatment for the patient's condition
(hyperkalaemia) based on their diagnosis.

Reflection on knowledge
Upon reflection on their diagnosis, all students agreed that ibuprofen was the cause of the patient's acute
condition. Only one student correctly stopped all four of the patient's important medicines. When the others were
asked why they had not stopped or reduced the dose of the important drugs, some said they had not considered the
renal clearance of these agents.
Didn't really think about it in the acute setting. Thought he should stay on the cilazapril as
renoprotective Didn't realise atenolol or gliclazide were renally excreted/didn't cross my mind at the
time. [Participant 19]
Others thought that the medicines would need to be titrated down rather than stopped suddenly, which could
cause the patient's other conditions to deteriorate.
The Medical Handover provided an opportunity for further reflection. When asked what effect the
ibuprofen had on the kidneys, the majority of students identified the correct answer with a detailed description of the
effect of the drug on renal perfusion. They also correctly identified that the patient's high potassium level was cause
for concern, and recognised that this could affect the patient's heart, leading to life-threatening arrhythmias.
Although most students admitted to not being able to read an ECG, the majority were concerned about ECG
changes.
ECG changes meant there was an acute problem with the patient's heart function which needed to be
corrected quickly. [Participant 9]

Asked to select the most appropriate treatment for a patient with severe hyperkalaemia, almost all students selected
the correct management.
This treatment will act on his heart first which is our first priority. [Participant 1]
It is important to use calcium to stabilise the heart first as this is the most potentially life threatening
problem. [Participant 13]
Discussion
The evaluation survey was considered an appropriate method to determine the level of participant
engagement with the intended learning from the simulation. Since Pharmacy students in their final year of study are
expected to have already assimilated the knowledge and comprehension required to successfully complete clinical
tasks, the simulation aimed to test deeper learning through the use of higher-level cognitive skills.
As Whitton (2007) points out, higher levels of engagement with a learning activity result in an increase in
learning from it. She argues that engaging activities encourage and facilitate learning and that there is a link between
intrinsic motivation, engagement and educational value. Based on previous theories, Whitton identified five factors
that affect engagement, namely challenge, control, immersion, interest and purpose (Tab. 2).
Table 2: Factors that affect engagement (Whitton, 2007)

Challenge: The most complex of the factors, consisting of the motivation to undertake the challenge; clarity as to
what the challenge involves; and a perception that the challenge is achievable. Related options from the evaluation
survey: "I wanted to explore all the options available to me" (motivation); "I knew what I had to do to complete the
activity" (clarity); "I felt that I could achieve the goal of the activity" (achievability); "I had all the things I required
to complete the activity successfully" (achievability).

Control: The fairness of the activity and the level of choice over types of action. Related options: "The activity was
too complex"; "I had too many potential options available to me".

Immersion: The extent to which the individual is absorbed in the activity. Related option: "I felt absorbed in the
activity".

Interest: The intrinsic interest of the individual in the activity or its subject matter. Related option: "I had to
concentrate hard on the activity".

Purpose: The perceived value of the activity, whether it is seen as being worthwhile and whether feedback is
perceived as having value. Related option: "The feedback I was given was not useful".
Challenge is not only an important motivator that engages students with educational digital simulations but
also a factor in good simulation design. Students in this study exhibited a high level of motivation and wanted to
explore the options made available through the simulation. This reaffirms what Malone (1980) proposed and
indicates that Ready to Practice? is a well-designed digital patient simulator with potential broader appeal.
Although final year Pharmacy students at the University of Auckland will have had some previous
exposure to a hospital environment, the simulation provided access to patient data that they were unlikely to have
encountered before. This better prepares them for the real-life context, where timely decisions are required for
proper management of patient care.
Although just over half of the students had clarity on what they were expected to do in the simulation,
almost all felt that the challenge was achievable. This can be attributed to the immediate and consistent feedback
that is integrated in the simulation to guide the students through the decision making process. Feedback was deemed
to be useful by the majority of participants. One student did not feel the simulation was suitable for pharmacy
students and, as a result, was neither motivated by the simulation nor felt that it improved any skills.
The results indicate that the activity was not too complex, and half of the participants were in agreement
regarding the number of options made available to them. Only a few students did not feel immersed in the activity,
which indicates that the majority were absorbed in the simulation. This confirms Damassa & Sitko's (2010)
observation that, as a proxy for a real environment, the relevance and reception of simulations depend on how able
the participants are to suspend disbelief and engage. Over half of the participants said they had to concentrate hard
on the activity, which is said to correlate with intrinsic interest in an activity. Competency-based simulations are a
growing trend in medical education (Damassa & Sitko, 2010). If included in the core curriculum, the key

competencies of pharmacy practice can be embedded into Ready to Practice? to provide greater clarity on the
expectations, hence cementing its purpose.

Simulation
Possibly the most indicative of learning was the response to the final question in Part 3, which asked
students to select the most appropriate treatment for a patient with severe hyperkalaemia. Whereas fewer than half
accurately selected the appropriate treatment in Part 1, almost all students selected the correct drug in Part 3.

Limitations of the study


As a proof-of-concept study, the simulation was not integrated into the core Pharmacy syllabus, so the
response rate was lower than anticipated. This may be because students had competing demands on their time and
because those who took part in the study had a preference for learning using computer simulation.
Conclusion
The great majority of Pharmacy students found the digital patient simulator engaging, the feedback from
their interaction with the simulator to be useful, and felt that they had the necessary resources available to them to
achieve the task. Results indicate that students were motivated by the challenge of caring for the simulated patient
and felt that the task was achievable, even though some were not clear on what it involved at the outset. There was
an overall sense of immersion and intrinsic interest in the simulated patient, and most agreed it was valuable to their
learning. Our pilot study is evidence for proof-of-concept of digital patient simulators in healthcare education, and
we intend to use this information to extend and refine our patient simulation environment.

References
Biggs, J.B., Kember, D., & Leung, D.Y.P. (2001). The Revised Two Factor Study Process Questionnaire: R-SPQ-2F.
British Journal of Educational Psychology, 71: 133-149.
Cela-Ranilla, J., Esteve-Mon, F., Esteve-González, V., & Gisbert-Cervera, M. (2014). Developing self-management
and teamwork using digital games in 3D simulations. Australasian Journal of Educational Technology, 30(6):
634-651.
Damassa, D. A., & Sitko, T. D. (2010). Simulation technologies in Higher Education: uses, trends, and implications.
EDUCAUSE Centre for Applied Research Bulletin 3.
Lean, J., Mozier, J., Towler, M., & Abbey, C. (2006). Simulation and games: use and barriers in education. Active
Learning in Higher Education, 7(3): 227-242.
Lyons, J. (2012). Learning with technology: theoretical foundations underpinning simulations in higher education.
In M. Brown, M. Hartnett & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings ASCILITE
Wellington, pp. 582-586.
Malone, T.W. (1980). What makes things fun to learn? Heuristics for designing instructional computer games.
Proceedings of the 3rd ACM SIGSMALL Symposium and the First SIGPC Symposium on Small Systems, pp.
162-169.
Malone, T.W. (1981). Toward a theory of intrinsically motivating instruction. Cognitive Science, 4: 333-369.
McGaghie, W.C., & Fisichella, P.M. (2014). The science of learning and medical education. Medical Education, 48:
104-112.
Whitton, N.J. (2007). An Investigation into the Potential of Collaborative Computer Game-Based Learning in Higher
Education. (Doctoral thesis, Edinburgh Napier University, Edinburgh, United Kingdom). Retrieved from
http://researchrepository.napier.ac.uk/4281/1/Whitton.pdf

17. Serious Mathematics Games:
Making them Happen in Elementary Schools

Beth Bos
Texas State University
USA
bethbbos@hotmail.com

Introduction
Virtual worlds move students beyond the idea of simple games and situate them in a pliable
environment capable of delivering formal as well as contextual learning experiences to users (Lim, 2009).
The variety of interfaces, policies, and target ages of different virtual worlds gives instructors numerous
choices of the medium through which they can deliver material to students (Braman, Meiselwitz, Vincenti, &
Wang, 2011). Virtual worlds are a form of interactive virtual reality: 3-D online environments in which users can
participate and interact simultaneously. Similar to many gaming applications, virtual worlds have been very
popular, and games such as World of Warcraft or Minecraft have millions of active user accounts (Messinger et al.,
2009).
For the purpose of this paper, virtual worlds refers to their use as serious games. Serious games are
"digital games, simulations, virtual environments, and mixed reality/media that provide opportunities to engage in
activities through responsive narrative/story, gameplay or encounter to inform, influence, for well-being, and/or
experience to convey meaning" (Marsh, 2011, p. 63). This study investigated participants in an education master's
program with a concentration in mathematics and their engagement in designing serious games using Minecraft, a
popular virtual game, to teach geometric and measurement concepts in their classrooms.
In recent years, many new ways of teaching academic and professional skills to children and adults have
been tested using serious games (Kebritchi et al., 2010; Lorant-Royer et al., 2010). Although the idea of games as
effective learning tools is not new (Annetta et al., 2009), the question has recently emerged as a subject of
experimental research. Some researchers claim that games permit constructive, situated, and experiential learning,
which is enhanced by active experimentation and immersion in the game (Squire, 2008; Hainey et al., 2011). Their
work highlights the great advantage of games as compared with traditional methods such as face-to-face or pencil-
and-paper teaching. In addition, the traditional linear approach to learning appears counter-intuitive to many
students, and games allow them to escape the constraints it imposes (Tanes & Cemalcilar, 2010).
Recent research has indicated that the mere availability of technology, favorable attitudes, and school- and
policy-level expectations towards technology use do not necessarily result in teachers integrating serious games
with student-centered pedagogy (Palak & Walls, 2009; Chen, 2008; Kenny & McDaniel, 2011). The lack of an
effective model to help teachers meet curricular objectives and the demands of the school day has been a
barrier to the implementation of educational technology and serious computer games (Palak & Walls, 2009; Mishra &
Koehler, 2006). In response to the need for a methodological framework for teachers, Foster and Shah
(2012) proposed a pedagogical model for integrating serious games into classrooms: Play, Curricular activity,
Reflection, Discussion (PCaRD), and tested the framework in a yearlong study. The PCaRD model was
developed to include opportunities for learners to experience activity for inquiry, communication, construction, and
expression (ICCE) (Foster, 2012; Foster & Shah, 2014). In this study, the model was used and tested to determine its
effectiveness with elementary mathematics master teachers as they designed and implemented their games within the classroom.

Theoretical framework
ICCE experiences are based on the premise that opportunities are built on using the natural curiosities of
learners (Dewey, 1902). Inquiry includes exploration and problem-solving opportunities in which the player is
expected to be active, engaged, and involved in reasoning (Chin & Chia, 2004). Communication, according to Foster
and Shah (2014), should provide feedback and guidance that serves the player's progress towards
reaching the game objectives. Construction of knowledge is seen not only by artifacts created, built, or designed to
represent or show understanding of the experiences but also by the questions they ask when faced with challenges
by way of discussions (Chin & Chia, 2004). Expression includes the sharing of one's feelings, emotions, values and

ideas (Bruce, 1999). As stated by Chin and Chia, the learning experience should allow for meaningful identification
with the process and projection of self while in the process of gaining new insights.
The Game Network Analysis (GaNA) framework provides the methodological framework teachers require
within their environmental context to focus on the pedagogy and content of games as well as the process to use and
apply games in classrooms (Foster, 2012). The GaNA addresses some identified learning gaps and empowers
teachers in adapting game-based learning within a new or existing curriculum by guiding them in the process of
game selection and game integration in school contexts (Foster, 2012). The methodological process combines the
Technological Pedagogical Content Knowledge (TPACK) framework (Mishra & Koehler 2006) and PCaRD (Foster
& Shah 2012). TPACK helps teachers approach the game as a curriculum with constraints and affordances for
technology, pedagogy and content (Foster, Mishra & Koehler, 2012). PCaRD is a pedagogical model that aids
teachers in the systematic incorporation of games in classrooms to achieve curricular goals and facilitate student
engagement in academic domains (Foster & Shah 2012). PCaRD includes the ICCE framework, which facilitates
teacher-designed opportunities for inquiry, communication, construction, and expression experiences to foster
transformative learning anchored in the games (Shah & Foster, 2012).
Through the PCaRD pedagogical model and GaNA, pre-service teachers were able to make connections
between the game and the curriculum structure both prior to and during implementation of a game-based lesson for
ninth grade English/Language Arts focusing on cause and effect in narrative experiences (Shah, Foster, Scottoline,
& Duvall, 2014). The lesson plan was not implemented in a school setting. More studies in other disciplines are needed
to determine the effectiveness of PCaRD and GaNA.

Minecraft
In this study teachers used a popular virtual game, Minecraft, to design serious games in mathematics.
Minecraft has been called a sandbox by designers and users alike because there is no fixed objective of the game:
one is limited only by one's imagination. Minecraft is a Massively Multiplayer Online Role Playing Game
(MMORPG) with a Lego™-like environment where players can build whatever they can envision. Minecraft can be
played in survival mode with a single player or in creative mode with multiple players. The game, designed
by Markus "Notch" Persson and recently sold to Microsoft, was designed to be simple and open so that users could
interact with environments that are normally impenetrable in most other video games. When students play
serious games, teachers are providing opportunities for students to demonstrate their creativity and inventiveness
while collaborating with others. Building in survival mode requires one to develop materials by
chopping down trees, mining stone, creating a furnace to manufacture charcoal, and more. Only by learning how to
develop materials is a participant able to build and develop their Minecraft environment. In creative
mode, all materials are available in the user's inventory without having to be obtained. Minecraft is like Second Life
in that it is a real-time environment built to belong to the user, and its purpose or goal is user-defined (Oliverio &
Beck, 2009). Rather than being a totally open system, Minecraft requires the user to log on to a dedicated server,
making it safe for use in public schools.

Research Purpose
The purpose of this paper was to study a class of eleven graduate students who are practicing teachers with
no game-based learning experience as they designed mathematics lessons using the GaNA and the PCaRD models
to see if change occurred in their classrooms. The research questions were as follows: (1) How did eleven master
teacher candidates approach connecting Minecraft to teach geometry objectives? (2) What is the effect of integrating
serious game-based learning on master teachers using GaNA?

Methods
This study took place at a large university in Central Texas. The intent was to use GaNA and the PCaRD
model within a Geometry course for Elementary Master Teachers. A qualitative approach was undertaken to
investigate the characteristics of Minecraft as a learning game and integrate it in elementary lesson plans for
teaching geometry.

Participation and settings


The eleven experienced teachers enrolled in the Geometry course for Elementary Master Mathematics
Teachers had no virtual gaming experience and no experience in developing curriculum for game-based
environments. The ecological climate in their school districts and campuses was focused on state-mandated
objectives and their students' performance on state-mandated exams. None of the participants had ever considered using a virtual game like

Minecraft in their classroom, even though they had seen their students playing the game on mobile devices brought
from home.
The graduate class met once a week for three hours on an elementary campus in the teachers' school
district. The game-based unit took five weeks. Prior to exploring the game, the teachers discussed the advantages of
creating a scenario in which students design structures that help them investigate area and perimeter and expand
their depth of understanding while allowing the students to create their own structures. The GaNA and PCaRD
models were introduced as guidelines. A high school student was brought in to teach the teachers how to play the
game. Learning the game itself appeared to be the biggest hurdle at first, but agreeing to stay in creative mode
relieved some of the participants' anxiety. Participants were divided into teams of two to three teachers and limited
to the study of geometry. Once the games were completed and analyzed using the GaNA and PCaRD models as shown
in Tables 1 and 2, the activity was taught in the participants' individual classrooms. Their reactions and open
responses were recorded. Upon completion of the lesson, the participating teachers commented on their students'
reactions and provided their personal feedback about the lesson.

Data and analysis


A descriptive template, as shown in Tables 1 and 2, was used for the completion of the in-class course
assignment. Student directions to play the game were submitted along with screenshots taken to verify student use.
The study used a qualitative approach. Data analysis began by examining the transcripts and written work of each
participant, grounded in a constant comparative method of coding (Glaser & Strauss, 1997) in which
participant responses were coded with external and internal codes.

Results
The teachers were initially very hesitant to design a game for a platform with which they were unfamiliar and were
opposed to the idea that such an activity could be implemented in their classrooms, due to the time constraints of
their district-wide scope and sequence and to technical problems that might arise. Having the high school student, a
master of the game, teach the mathematics teachers, and providing the opportunity to play with the gaming aspects of
Minecraft, gave them enough confidence to begin thinking about the math topics and how they could
incorporate the game into a mathematical activity.
Group A, two teachers, decided they wanted their students to re-create a map of their school. A map of the
school was drawn to scale to help the students see what their final product would look like. The teachers proceeded
to re-create the map using Minecraft and took screenshots of the construction process. Unfortunately, they did not
allow their students to repeat the activity due to perceived time constraints. The lesson was very prescriptive and
limited their students' creativity, but it would have provided a visual model of perimeter and area. The
researcher was disappointed that the teachers in Group A did not actually implement the lesson with their students.
Group B, an ambitious group of three teachers, had watched their students play Minecraft and often let them
play it as a reward when they completed all of their work. These three teachers understood the creative affordances
of the game and believed in their students' ability. They designed a simple assignment: their students would use
Minecraft to build a mountain ski resort. Students were to construct two separate mountains with a valley between
them. They would begin by building a mountain with a base layer that was 7 m by 7 m. Each new layer would be
2 m shorter on each side. This step would continue until they reached the top of their mountain, which would be 1
m by 1 m. After completing the first mountain, the students were to build a valley next to the base of the mountain
by digging a trench 1 m wide and 2 m deep. On the other side of the valley, the students would construct a
second mountain similar (proportional) to the first one they built, creating their own base with dimensions of their
choosing. The screenshots turned in by the students included a ski lift from one mountain to the other with
individual seats for each skier. Proportionality is a big theme in Minecraft, as players must have a sense of ratio and
proportion as they expand their collection of materials and tools in survival mode. The students reflected upon
the process, discussing the proportional rows used to construct their second mountain and how they would do it
differently the next time they built a mountain to make it more three-dimensional.
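The layer arithmetic behind Group B's mountain task can be sketched as follows. This is illustrative code, not part of the original lesson: it generates the square layers from the 7 m by 7 m base, shrinking each side by 2 m, down to the 1 m by 1 m peak, and counts the blocks each layer contains.

```python
# Generate the square layers of Group B's mountain: 7x7 base, each
# successive layer 2 m shorter per side, ending at a 1x1 peak.
def mountain_layers(base=7, step=2, top=1):
    """Return a list of (side_length, blocks) for each layer."""
    layers = []
    side = base
    while side >= top:
        layers.append((side, side * side))  # blocks = area of that layer
        side -= step
    return layers

layers = mountain_layers()
print(layers)                     # [(7, 49), (5, 25), (3, 9), (1, 1)]
print(sum(b for _, b in layers))  # 84 blocks in the whole mountain
```

The second, proportional mountain can be previewed the same way by passing a different `base` (e.g. `mountain_layers(base=9)`), which is one way students could check their own dimensions before building.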
Group C, another group of three teachers, designed a lesson in which students would build a fort 5 m
by 8 m and 4 m tall, with a roof with a 1 m² hole for a ladder to pass through. The assignment was very prescriptive,
with no reflective piece. The 5th graders were able to create the fort with little effort, and the students demonstrated
little or no creativity in building their forts. The students did communicate with each other about how to build the fort
and where to put the hole in the roof, which also determined where the ladder would go. The reflective discussion
centered on the dimensions of the roof and the need for a ladder to the roof and a second ladder on the outside of the
fort to climb down. They were exploring the environment to make the assignment more realistic, but unfortunately
the instructions were too structured to allow more creative thought.
Group D, another group of three, was more interested in the comparison between perimeter and area and the
difference between a square's perimeter and a non-square rectangle's perimeter. After reviewing the
definitions of perimeter and area, and using a class set of iPods with Minecraft downloaded and installed, students
were asked to go into creative mode to build a coastal town with a pier area of 12 square meters, a bait shop with a
perimeter of 12 meters, a restaurant with an area of 24 square meters, and a square store with an area of 16 square
meters. The dimensions were carefully chosen to encourage comparison and discussion of the similarities and
differences among one another's configurations. Students found that only one set of dimensions was possible for the
store and continued making conjectures about the structures' shapes and their relationship to perimeter and area. The
students commented that the difference between perimeter and area became clearer with the visual imagery provided
by the game.
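The reasoning the students worked through can be sketched by enumerating whole-meter footprints for each target. This is illustrative code, not part of the original lesson: it lists the rectangles matching a given area or perimeter and shows why the 16-square-meter square store admits only one footprint (4 m by 4 m) while the other targets admit several.

```python
# Enumerate unordered whole-meter (width, length) footprints.
def rects_with_area(area):
    """All integer (w, l) pairs with w <= l and w * l == area."""
    return [(w, area // w) for w in range(1, int(area ** 0.5) + 1)
            if area % w == 0]

def rects_with_perimeter(p):
    """All integer (w, l) pairs with w <= l and 2 * (w + l) == p."""
    half = p // 2                       # w + l must equal half the perimeter
    return [(w, half - w) for w in range(1, half // 2 + 1)]

print(rects_with_area(12))       # pier:      [(1, 12), (2, 6), (3, 4)]
print(rects_with_perimeter(12))  # bait shop: [(1, 5), (2, 4), (3, 3)]
# Square store with area 16: only one candidate is actually square.
print([r for r in rects_with_area(16) if r[0] == r[1]])  # [(4, 4)]
```

Comparing the lists makes the students' conjecture concrete: fixing an area still leaves several perimeters possible (and vice versa), whereas adding the "square" constraint pins down a unique shape.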
Of all the mathematical activity designs, Group D's most closely exemplified the GaNA and PCaRD models,
as shown in Tables 1 and 2. Since Minecraft is an open sandbox, a designer can devise multiple tasks to be
performed. The Group D teachers focused on facilitating a deep conceptual understanding of area and perimeter,
empowering their students to compare and contrast configurations to arrive at conjectures, and then to test the
conjectures to see if they made sense in this virtual environment. The teacher's questioning and
facilitating role were fundamental to the development of deeper conceptual learning. The students
participating in this activity used inquiry, communication, and construction, and expressed their thinking. PCaRD
helped the teachers organize their ideas and set the game process in motion.
Minecraft is directly linked with play for many children. Bringing it into the classroom for students to
discover spatial relationships appears to be a natural transition, but these teachers seemed resistant to a change in
their instruction. Despite their initial resistance, rudimentary game scenarios were designed that supported, and in
some cases enhanced, content. Playing in the Minecraft sandbox enhanced the play aspects of the assignment. With
Group B's design, the students discovered that the instructions were written in two dimensions only and needed to be
revised to function in a three-dimensional world. Furthermore, the key concept of proportionality remained true in
the third dimension as well as in the two-dimensional alignment. Reflection was necessary for this discussion to
occur, and the facilitating role of the instructor proved useful. The other two groups' learning activities (Groups A
and C) used their definitions of perimeter and area to apply the concepts to constructing objects with the given units
and units squared. Both groups of teachers were more open to letting their students explore with Minecraft in their
classrooms if used in small groups. In order for the teachers to develop more effective games and feel more confident
in their ability to make serious games with Minecraft, they would need more experience in designing within a
critical perspective. Five weeks was not enough time for them to look deeply at the mathematics and adapt it to
Minecraft using the PCaRD and GaNA models.
When the teachers were asked to reflect on lessons learned, they were very vocal. Constructing the games
was not the issue of discussion; time was, namely the time on task required of the students. They felt that they would
not have that time available to them until after the state-mandated tests were given. Nevertheless, the idea was
planted, and the teachers did learn about Minecraft as a tool for experiencing mathematics. Unfortunately, they had a
very difficult time visualizing a classroom that connected mathematical content to a serious game in lieu of a
classroom of students completing mathematical problems on worksheets.

Limitations
This study on serious gaming in elementary school was limited by the testing pressures placed on teachers
and by their own self-imposed restrictions in considering a game-based environment as a viable tool for learning
mathematics conceptually rather than merely playing a game. With school district support for innovative teaching and
learning and less focus on standardized testing, these teachers would likely be more willing to let go of their
worksheets and embrace serious games as tools for learning. Their reluctance was an indicator of the pressures
placed on master teachers who oversee district initiatives. The length of the intervention also limited the study:
had the program continued longer, the teachers could have developed and refined true serious games with depth,
precision, and fidelity. The teachers looked to the game designers to put content into serious games, and failed to
see how they could design their own lessons and contribute to the design of higher-quality serious games.

Conclusion
Shah and Foster (2014) suggest that teachers' efforts at adopting game-based learning can be sustained if
the following criteria are met: (a) knowledge of games and their pedagogical possibilities is developed (Koh, Kin,
Wadhwa, & Lim, 2012); (b) skills at using games for achieving desired learning goals are facilitated (Ritzhaupt,
Gunter, & Jones, 2010); and (c) teachers have competence in navigating the contextual conditions impacting the use
of games in schools (Demirbilek & Tanner, 2010). The teachers in this study were skeptical of implementing serious
games in their schools due to testing pressures and their lack of confidence in the educational fidelity of the product
they were using. They failed to recognize the possibilities of replacing a practice worksheet with a children's digital
game. Even though GaNA and PCaRD provided the methodological framework, a pedagogical model for serious games,
and data supporting the use of serious games, the teacher-participants did not see how mathematical concepts could
be connected with such depth as to be maintained in long-term memory when using a serious game. The participants
need to realize that curiosity can replace repetition, and that instinctive problem solving can substitute for Pólya's
standard steps, which teachers interpret as underlining the sentence and highlighting key terms. With a naturalistic
gaming platform, problem solving becomes more natural and less procedural; it is based on perseverance and
unrelenting curiosity. When all is said, it comes down to teaching for a test or teaching for the overarching ideas. As
Li (2013) says in her research, there is a need for research studies that aid teachers in fully realizing and utilizing the
educational merit of game-based learning. This research makes a similar suggestion, with a twist: more research on
educating teachers about the cognitive value of serious game-based learning in elementary mathematics classrooms is
needed, along with continued professional development on how to use platforms to develop serious games suitable
for the classroom.

References
Akcaoglu, M. (2013). Using MMORPGs in classrooms: Stories vs. teachers as sources of motivation. In Y. Baek &
N. Whitton (Eds.), Cases on Digital Game-Based Learning: Methods, Models, and Strategies (pp. 15-24).
Hershey, PA: IGI Global.
Annetta, L. A., Minogue, J., Holmes, S. Y., & Cheng, M. T. (2009). Investigating the impact of video games on high
school students' engagement and learning about genetics. Computers & Education, 53, 74-85.
Braman, J., Dudley, A., Colt, K., Vincenti, V., & Wang, Y. (2011). Gaining insight into the applications of Second Life
in a computer course: Students' perspectives. Proceedings of the 14th HCI International Conference on
Online Communities and Social Computing, Orlando, FL.
ChanLin, L., Hong, J., Horng, J., Chang, S., & Chu, H. (2006). Factors influencing technology integration in
teaching: A Taiwanese perspective. Innovations in Education and Teaching International, 43(1), 1-16.
Chen, C. (2008). Why do teachers not practice what they believe regarding technology integration? The Journal of
Educational Research, 102(1), 65-75.
Chin, C., & Chia, L. G. (2004). Problem-based learning: Using students' questions to drive knowledge construction.
Science Education, 88(5), 707-727.
Demirbilek, M., & Tanner, S. L. (2010). Math teachers' perspectives on using educational computer games in math
education. Procedia - Social and Behavioral Sciences, 9, 709-716.
Foster, A. N. (2012). Assessing learning games for school content: Framework and methodology. In D. Ifenthaler,
D. Eseryel, & X. Ge (Eds.), Assessment in Game-Based Learning: Foundations, Innovations, and
Perspectives. New York, NY: Springer.
Foster, A. N., Mishra, P., & Koehler, M. (2012). Digital game analysis: Using the Technological Pedagogical
Content Knowledge framework to determine the affordances of a game for learning. In M. Khine (Ed.),
Learning to Play: Exploring the Future of Education with Video Games. New York, NY: Peter Lang
Publications.
Foster, A. N., & Shah, M. (2012). PCaRD: A model for teachers to integrate games in their classroom. Paper
presented at the Society for Information Technology & Teacher Education International Conference, 2012,
Austin, Texas, USA.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago:
Aldine.
Hainey, T., Connolly, T., Stansfield, M., & Boyle, E. A. (2011). Evaluation of a game to teach requirements
collection and analysis in software engineering at tertiary education level. Computers & Education, 56,
21-35.
Hammond, M., Fragkouli, E., Suandi, I., Crosson, S., Ingram, J., Johnston-Wilder, P., Johnston-Wilder, S., Kingston,
Y., Pope, M., & Wray, D. (2009). What happens as student teachers who made very good use of ICT
during pre-service training enter their first year of teaching? Teacher Development, 13(2), 93-116.
Kebritchi, M., Hirumi, A., & Bai, H. (2010). The effects of modern mathematics computer games on mathematics
achievement and class motivation. Computers & Education, 55, 427-443.
Kenny, R. F., & McDaniel, R. (2011). The role teachers' expectations and value assessments of video games play in
their adopting and integrating them into their classrooms. British Journal of Educational Technology, 42(2),
197-213.
Koh, E., Kin, Y. G., Wadhwa, B., & Lim, J. (2012). Teacher perceptions of games in Singapore schools. Simulation
& Gaming, 43(1), 51-56.
Li, Q. (2013). Digital games and learning: A study of preservice teachers' perceptions. International Journal of Play,
2, 1-16.
Lim, K. (2009). The six learnings of Second Life: A framework for designing curricular interventions in-world.
Journal of Virtual Worlds Research, 2(1).
Lorant-Royer, S., Munch, C., Mescle, H., & Lieury, A. (2010). Kawashima vs. Super Mario! Should a game be
serious in order to stimulate cognitive aptitudes? European Review of Applied Psychology, 60, 221-232.
Marsh, T. (2011). Serious games continuum: Between games for purpose and experiential environments for
purpose. Entertainment Computing, 2, 60-68.
Messinger, P., Stroulia, E., Lyons, K., Bone, M., Niu, R., Smirnov, K., & Perelgut, S. (2009). Virtual worlds - past,
present and future: New directions in social computing. Decision Support Systems, 47, 204-228.
Mishra, P., & Koehler, M. J. (2006). Technological Pedagogical Content Knowledge: A framework for teacher
knowledge. Teachers College Record, 108(6), 1017-1054.
Oliverio, J. C., & Beck, D. (2009). Enhanced interaction in mixed social environments. In R. E. Ferdig (Ed.),
Handbook of Research on Effective Electronic Gaming in Education (pp. 146-162). Hershey, PA:
Information Science Publishing.
Palak, D., & Walls, R. T. (2009). Teachers' beliefs and technology practices: A mixed-methods approach. Journal
of Research on Technology in Education, 41(4), 417-441.
Ritzhaupt, A. D., Gunter, E., & Jones, J. G. (2010). Survey of commercial off-the-shelf video games: Benefits and
barriers in formal educational settings. International Journal of Instructional Technology and Distance
Learning, 7, 45-55.
Shah, M., & Foster, A. (2014). Game network analysis: Developing and assessing teachers' knowledge of game-
based learning. In M. Searson & M. Ochoa (Eds.), Proceedings of Society for Information Technology &
Teacher Education International Conference 2014 (pp. 685-692). Chesapeake, VA: AACE.
Shah, M., Foster, A., Scottoline, M., & Duvall, M. (2014). Pre-service teacher education in game-based learning:
Analyzing and integrating Minecraft. In M. Searson & M. Ochoa (Eds.), Proceedings of Society for
Information Technology & Teacher Education International Conference 2014 (pp. 2646-2654).
Chesapeake, VA: AACE.
Squire, K. D. (2008). Video game-based learning: An emerging paradigm for instruction. Performance Improvement
Quarterly, 21, 7-36.
Tanes, Z., & Cemalcilar, Z. (2010). Learning from SimCity: An empirical study with Turkish adolescents. Journal
of Adolescence, 33, 731-739.

Table 1
Examples of GaNA questions for game analysis (TPACK-ICCE dimensions)

Technology
  Guiding question: What platform is used to run this game?
  Minecraft discussion: MMORPG, sandbox game.

Pedagogy (general)
  Inquiry
    Guiding question: What is the instructional style of this game? How would you describe its teaching approach?
    Minecraft discussion: Discovery, socio-constructivism, situated learning.
  Construction
    Guiding question: What prior knowledge is needed to accomplish the objective?
    Minecraft discussion: How to move the cursor; definitions of mathematical concepts such as perimeter and area. Some experience in Minecraft is required.
  Communication
    Guiding question: When does the game give feedback to the players?
    Minecraft discussion: Because the activity will be designed for the creative mode, students will use the cursor to manipulate the cubes to build structures. Each placement of the cubes provides feedback.
  Expression
    Guiding question: How can players customize their game play experience?
    Minecraft discussion: They can create their own environment using a wide variety of materials.

Content
  Guiding question: Which national and state core curriculum standards were addressed in this game?
  Minecraft discussion: CCSS.Math.Content.3.MD.D.8; CCSS.Math.Content.3.MD.C.7.a-d.

Pedagogy (specific to the teacher's area of concentration)
  Inquiry
    Guiding question: In what ways does the game allow exploration and inquiry of a particular topic/academic content?
    Minecraft discussion: You can build wherever you want with a variety of tools and use whatever materials (e.g., metal, mineral, wood) are available to you. You can construct whatever you want in the sandbox environment and show relationships between size and number.
  Construction
    Guiding question: In what ways will students possibly be able to demonstrate their understanding of the topic/academic content in the game?
    Minecraft discussion: Students can show their understanding of perimeter, area, measurement, etc.
  Communication
    Guiding question: How does the game guide/inform students in learning about the topic/academic content?
    Minecraft discussion: The guide provides instructions on how to play the game and lesson direction, as well as definitions of important terms.
  Expression
    Guiding question: What opportunities are available for players to feel connected personally, through freedom of expression, to learn, perform, and demonstrate their learning of the topic/academic content?
    Minecraft discussion: They can construct a structure of any length or width, from any element or wood they desire, as long as the structure has the prescribed perimeter or area. This means they can use various configurations for the structure and that answers can vary.

Overall
  Guiding question: What is your overall impression of this game?
  Minecraft discussion: An excellent way to show the relationship between perimeter and area.

Table 2
Examples of GaNA questions for game integration (PCaRD dimensions)

Play
  Guiding prompt: List questions that you may use to observe students as they are playing (e.g., are they gaining opportunities for ICCE?).
  Outcome with Minecraft: How do we use measurement in our lives? Which tools are used to measure length and area? What is a standard unit? Why do we need a standard unit of measurement? How do we use measurement to solve problems?

Curricular activity
  Guiding prompt: Develop a problem-based case or activity that addresses the objective of this lesson. Provide any instructions that will be given in relation to the reflection prompt or question.
  Outcome with Minecraft:
    1. Build a pier with an area of 12. Calculate the perimeter of the pier.
    2. Build a bait shop with a rectangular perimeter of 12. Calculate the area of the bait shop. Be prepared to compare your answer with your neighbor.
    3. Build a restaurant with an area of 24 blocks. Calculate the perimeter of the restaurant. Be prepared to compare your answer with others in class.
    4. Build a store with a square shape and a perimeter of 16. Calculate the area of the store. Be prepared to compare your answer with others in class.

Reflection
  Guiding prompt: Develop a reflection prompt or case that addresses the objective of this lesson. Provide any instructions that will be given in relation to the reflection prompt or questions.
  Outcome with Minecraft: Compare your answers with your neighbor. If your response is different, explain why both answers answer the question. Is it possible to have different responses? On number 4, is it possible to have different responses? Why or why not?

Discussion
  Guiding prompt: Enlist prompts that you would use to invite further discussion in relation to the lesson objective and the PCaRD parts of the day.
  Outcome with Minecraft: What patterns would help you find the perimeter and area of any square or rectangular shape? Why would you need to find the perimeter or area of an object?

Overall
  Guiding prompt: What lessons are learned from the application of PCaRD in this session for future lessons?
  Outcome with Minecraft: The reflective piece is often overlooked. I like how we put it into our lesson.
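The building tasks above turn on the fact that a fixed area admits several integer-sided rectangles with different perimeters (and vice versa), which is why students' answers can legitimately vary. As an illustrative sketch, not part of the original lesson materials, the following Python snippet enumerates the rectangles behind task 1 (an area of 12):

```python
# Enumerate integer-sided rectangles for a given area and compute their
# perimeters, illustrating why the Minecraft building tasks admit several
# correct answers. (Illustrative sketch only, not from the original lesson.)

def rectangles_with_area(area):
    """All (width, length) pairs with width <= length and width * length == area."""
    return [(w, area // w) for w in range(1, int(area ** 0.5) + 1) if area % w == 0]

def perimeter(width, length):
    """Perimeter of a rectangle with the given side lengths."""
    return 2 * (width + length)

# Task 1: build a pier with an area of 12, then calculate its perimeter.
for w, l in rectangles_with_area(12):
    print(f"{w} x {l}: perimeter = {perimeter(w, l)}")
# Prints perimeters 26 (1 x 12), 16 (2 x 6), and 14 (3 x 4).
```

The reflection prompt ("Is it possible to have different responses?") falls out directly: area 12 yields three distinct perimeters, while the square store in task 4 (perimeter 16) forces a single 4 x 4 answer.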
18. Mobile Learning for Learners With and Without Disabilities in K-12
Education Settings
Jingrong Xie
James D. Basham, Ph.D.
The University of Kansas

Mobile Learning in K-12 Education Settings

Vygotsky (1978) argued that learning occurs first through interpersonal interaction with the social
environment and then through intrapersonal internalization. Engeström (1987) viewed the relationship between the
human agent and the objects within the environment as mediated by cultural means, tools, and signs. As a
mediating tool between the social environment and intrapersonal understanding, mobile devices are the most
connected form of technology. Mobile learning (M-learning) refers to any sort of learning that takes place when the
learner is not at a fixed, predetermined location, or learning that happens when the learner takes advantage of
learning opportunities provided by mobile technologies (O'Malley et al., 2003). As such, with its focus on the
learner, M-learning is a flexible approach well suited to addressing diverse needs and achieving the various learning
objectives associated with lifelong learning. Recent figures indicated that the worldwide M-learning market reached
$3.2 billion in 2010 and was projected to reach $9.1 billion by 2015 (Liang, Yu, & Long, 2013).
Unfortunately, very little of the research on M-learning has focused on society's most diverse learners, especially
learners with disabilities. The primary purposes of this review are to examine (a) the issues that researchers
have been investigating on the topic of M-learning for teaching and learning for students with and without
disabilities in K-12 settings, (b) the trends in M-learning for teaching and learning, and (c) studies accounting for the
interaction between students with and without disabilities and other variables, such as socioeconomic status and
gender, and the effectiveness of M-learning technology for teaching and learning.
In this article, we make the case that mobile devices are underutilized as a cultural means, tool, and sign of
learning, especially for K-12 students with disabilities. We construct this understanding through a conceptual
framework based on cultural-historical activity theory, which was employed in this review to understand how
mobile technology is used as a mediating tool to facilitate teaching and learning in the current K-12 educational
community. Within this conceptual framework, M-learning consists of four elements: (a) learners with and without
disabilities; (b) mediating tools (mobile learning technology); (c) access; and (d) context (the learning environment
surrounding learners and learners' demographic environment). Through the review, we explain how technology
links this conceptual framework to current understanding about M-learning for students with and without
disabilities in K-12 settings.

M-learning and Learners with Disabilities & Lifelong Learning

As we learn more about the human condition, it is increasingly recognized that disability is shaped by cultural
practices and norms rather than by preordained and innate conceptions of existence (Thomas, 2004; Munyi, 2012).
When provided with appropriate supports and environmental considerations, learners with disabilities demonstrate
various strengths that would otherwise go unnoticed. Shogren, Wehmeyer, Palmer, Rifenbark, and Little (2015)
have shown the importance of supporting individuals with disabilities in becoming self-determined. Self-
determination has been linked to the development of the skills and attitudes that enable lifelong learning, including
the adoption of new practices and tools. As society adopts mobile learning, it is critical to consider the potential
implications for designing practices and systems that support the needs of individuals with disabilities.

Mediated Tools

Mobile technology is widely used to support students' learning inside and outside the classroom
environment. Mobile devices are portable, easily distributable, and affordable, and they provide a means to support
lifelong learning. These devices and their associated systems could potentially be widely used in education to impact
how we teach and learn in both formal and informal education settings. In a review of worldwide trends in M-learning,
UNESCO (2012) concluded that mobile devices are positioned to influence teaching and learning in a way the
personal computer never was able to do. UNESCO identified portability, low cost, and wide distribution as critical
factors that make these devices important to transforming the human learning experience. M-learning is ideal for
facilitating collaboration and communication (Farooq, Schafer, Rosson, & Carroll, 2002; Duncan-Howell & Lee,
2007).

Access

Mobile technology makes learning resources more accessible to learners with diverse needs and provides
channels for accessing digital literacy. Roschelle and Pea (2002) stated that mobile technology is easy to access,
benefits autonomous learning, motivates students to learn, promotes student collaboration and communication, and
supports inquiry-based instructional activities. M-learning can be used as a supplementary teaching technique and is
a feasible way to minimize constraints of time and place in the learning environment (Huang, Jeng, & Huang, 2009;
Huang, Lin, & Cheng, 2009).

M-learning and context

It is not enough to examine only the technological artifacts and instruments operating behind M-learning;
researchers should also pay attention to the underlying sociocultural values that contribute to the formation of these
learning practices (Jonassen & Rohrer-Murphy, 1999). Users are the center of M-learning activities (Roschelle,
2003; Sharples, Taylor, & Vavoula, 2005; Taylor & Evans, 2005), so users' learning needs should be considered and
addressed in the learning environment. Demographic information, including sociocultural values, is relevant to the
efficacy of M-learning activities for students with and without disabilities. It is necessary to re-conceptualize
learning for the mobile age: to understand the role of mobility and communication in the process of learning, the
importance of context in building meaning, and the impact of digital networks in supporting virtual communities
that cross barriers of age and culture (Sharples et al., 2005). Given the rapid advancement of mobile technology and
the lack of focus on critical demographic variables, there is a need for a more current review of the literature on the
demonstrated effects of mobile learning for students with and without disabilities.

Methods

This section describes the approach to locating and selecting the research reviewed in this paper. The
following electronic databases were searched: Google Scholar, ERIC, and SAGE. Search terms included the specific
terminology of mobile learning combined with the following words or phrases: mobile technology, special
education, inclusive education, socioeconomic status, digital literacy, gaming, computer-assisted instruction (CAI),
iPad, and tablet. Articles were selected using inclusion and exclusion criteria. In total, twenty-three articles met the
criteria for further discussion in this study; twenty are empirical studies and three are literature reviews. Two levels
of coding took place to meet the aims of the review. First, the references of the included articles were entered into a
spreadsheet, and each article was read with a focus on its key features. Next, the articles were coded by the primary
purposes of the studies. Finally, the primary findings were evaluated as a means of supporting or supplementing
educational resources and facilitating teaching and learning for learners with and without disabilities.
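The two-level coding procedure described above (a spreadsheet of references, a key-feature read, then coding by primary purpose) amounts to a simple tally over coded records. A minimal sketch, in which the article records are hypothetical placeholders rather than the review's actual data set:

```python
# A minimal sketch of the review's second-level coding step: tallying
# articles by their coded primary purpose. The records below are
# hypothetical placeholders, not the review's actual data.
from collections import Counter

articles = [
    {"ref": "Author A (2011)", "purpose": "engagement"},
    {"ref": "Author B (2012)", "purpose": "knowledge acquisition"},
    {"ref": "Author C (2013)", "purpose": "engagement"},
]

# Count how many coded articles fall under each primary purpose.
purpose_counts = Counter(a["purpose"] for a in articles)
print(purpose_counts.most_common())
# [('engagement', 2), ('knowledge acquisition', 1)]
```

The same tally, run over the real coded spreadsheet, yields the per-theme and per-method counts reported in the Results section.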

Results
The articles in this review examined the learning opportunities and affordances offered by learning on
mobile devices in both formal and informal K-12 educational settings, with a specific focus on learners with and
without disabilities. The findings showed that various types of mobile tools were used in the research (i.e.,
smartphones, iPads, personal digital assistants (PDAs), and iPods). Table 1 provides a summary of the 20 studies,
with findings on the effectiveness and objectives of M-learning. Table 2 provides a summary of studies with
findings related to users' demographic variables (i.e., SES, gender, locality, and disability type), with particular
attention to SES and locality. An overview of the demographic variables was reported in the results section of each
research study. Information on the mobile devices used in the studies is provided in Table 3. Finally, the articles
were coded by theme in Table 4.

Table 1
Example summary of findings

Huizenga, Admiraal, Akkerman, & Dam (2009): The intervention (playing the Frequency 1550 game vs. receiving the regular project-based lesson series) had a significant effect on knowledge of medieval Amsterdam (p <= .001).
Huang, Lin, & Cheng (2010): The MPLS group liked outdoor plant learning better (Q1, t(30) = 2.403, p < .05).
Fernández-López, Rodríguez-Fórtiz, Rodríguez-Almendros, & Martínez-Segura (2013): A Wilcoxon test revealed statistically significant differences in all the skills considered (p < .05).
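Reported statistics such as Huang, Lin, and Cheng's t(30) = 2.403 can be sanity-checked against the two-tailed critical value of the t distribution for df = 30 at alpha = .05, which standard t tables give as approximately 2.042 (an assumption of this sketch, not a value from the reviewed study):

```python
# Quick plausibility check of a reported t statistic against the two-tailed
# critical value. The critical value 2.042 (df = 30, alpha = .05, two-tailed)
# is assumed here from standard t tables.

T_CRIT_DF30_ALPHA05 = 2.042

def significant(t_stat, t_crit=T_CRIT_DF30_ALPHA05):
    """True if |t| exceeds the two-tailed critical value."""
    return abs(t_stat) > t_crit

print(significant(2.403))  # True: consistent with the reported p < .05
```

Since 2.403 > 2.042, the reported p < .05 is internally consistent with the test statistic and degrees of freedom.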

Table 2
Example summary of research methods and demographic variables: SES, gender, locality, and disability type

Huizenga, Admiraal, Akkerman, & Dam (2009)
  Participants: 458 secondary school students (20 classes, 5 schools), aged 12-16, with the majority aged 13.
  Locality: Amsterdam, The Netherlands.
  Disability type: N/A.
  Demographic variables (SES, gender): N/A.

Huang, Lin, & Cheng (2010)
  Participants: 32 elementary students aged 11; 16 in the experimental group and 16 in the control group.
  Locality: Taiwan.
  Disability type: N/A.
  Demographic variables (SES, gender): N/A.

Table 3
Mobile devices used in the studies

Huizenga, Admiraal, Akkerman, & Dam (2009): smartphones, video phones, GPS technology.
Huang, Lin, & Cheng (2010): the MPLS and PDAs (personal digital assistants).
Fernández-López, Rodríguez-Fórtiz, Rodríguez-Almendros, & Martínez-Segura (2013): the Picaa platform, iPad and iPod touch.
Admiraal, Raessens, & Van Zeijts (2007): videophone, gamephone, GPS receiver.
Israel, Marino, Basham, & Spivak (2013): paper prototypes of apps.
McCabe & Tedesco (2012): Quick Response (QR) codes and mobile devices.
Chang, Chen, & Hsu (2011): PDAs.
Kim et al. (2011): TeacherMate (USB port, ARM processor), mobile e-books, open Flash (GNASH), Linux.
Kim et al. (2012): TeacherMate, Linux, ARM 9 processor.

Table 4
Individual learning mode & collaborative learning mode

Huizenga, Admiraal, Akkerman, & Dam (2009)
  Content knowledge acquisition: X; Engagement: X; Motivation: X; Collaborative learning mode: N/A; Individual learning mode: N/A.

Huang, Lin, & Cheng (2010)
  Content knowledge acquisition: N/A; Engagement: N/A; Motivation: N/A; Collaborative learning mode: N/A; Individual learning mode: N/A.

Mobile devices/tools

Among the 20 empirical studies, various types of M-learning tools were actively used (see Table 3). For
example, six studies conducted research on mobile phones, including smartphones (Huizenga et al., 2009; Ehret &
Hollett, 2013; Walker, 2013; Wong et al., 2013; Nowell, 2014; Callow & Zammit, 2012).

Research design

The review also analyzed the research methods of the empirical studies. Mixed methods were the primary
approach (Huizenga et al., 2009; Huang, Lin, & Cheng, 2010; Fernández-López et al., 2013; Admiraal et al., 2007;
Israel et al., 2013; McCabe & Tedesco, 2012; Chang, Chen, & Hsu, 2011; Ng & Nicholas, 2013; Ehret & Hollett,
2013; Kim et al., 2011; Kim et al., 2012; Callow & Zammit, 2012). The reviewed studies showed that observation
was the most widely used research method (11 of 21 studies), followed by pre- and post-tests (10 of 21 studies) and
interviews (8 of 21 studies) (see Table 2).

Demographics

As depicted in Table 2, among the 20 empirical studies, participants in 3 studies were teachers working in
K-12 settings, participants in 16 studies were learners in formal K-12 education settings, and only one article
included both learners and their parents. Within these, 11 studies took place in elementary education, 7 focused
primarily on secondary education, and 2 spanned both elementary and secondary education. Of the 20 research
studies, 4 reported specific findings about learners with disabilities. Six of the articles focused on the effectiveness
of M-learning for learners while examining socioeconomic status as one of the demographic variables.

Research themes

Four main themes were discussed across the 23 articles: content knowledge acquisition, engagement,
motivation, and collaborative versus independent M-learning modes (see Table 4). This section analyzes SES and
other demographic variables in relation to specific findings within each of these research themes.

Discussion

The purpose of this literature review was to investigate issues, trends, and associated outcomes for M-
learning in K-12 education settings for learners with and without disabilities. We were interested in the variables
associated with the design of M-learning environments and supports for enhancing lifelong learning for learners
with disabilities. Generally, this review demonstrated that peer-reviewed research in the area of K-12 mobile
learning is lacking, especially relative to learners with disabilities. In studies that examined variables from a
sociocultural perspective, mobile learning technology was shown to be an effective, easily accessed mediating tool
for facilitating teaching and learning for students with diverse needs. The review also examined the effectiveness of
mobile learning in relation to other demographic variables, revealing relationships among SES, disability type,
gender, prior M-learning experience, and locality.
The means by which mobile devices are designed, used, and researched shapes findings about the
effectiveness of M-learning for teaching and learning. Rather than relying primarily on mixed-methods
experimental research to explore a broad view of M-learning in educational practice, researchers should consider
other methods to further investigate the variables and effectiveness of M-learning in a given context. Studies have
shown that M-learning technology can facilitate positive outcomes and has the potential to provide expanded
literacy exposure, advancing lifelong learning for learners living in low-SES surroundings. Overall, the findings
indicate that the design of an M-learning experience should be contextualized within the teaching and learning
situation.

Implications and limitations



More academic attention and research effort is needed on how to design and use M-learning technology to
provide inclusive education for all learners, but especially those with disabilities. Specific investigation is needed
into how to effectively address personalized learning needs. The findings imply that both individual and
collaborative learning modes matter in the design of M-learning experiences. A further implication, from an
instructional perspective, is that M-learning differs from traditional learning, and more attention should focus on the
pedagogy needed to promote learning and teaching practices when applying M-learning technology.
Certain limitations may have affected the outcomes of the review. Future reviews might include a different
range of sources to provide more detailed and representative results. In addition, the combination of keywords used
in searching may lead to different outcomes, so future reviews should revisit the keywords used to locate articles.
This process is important to ensure that all related research studies in the field are included for analysis.

Conclusions

Marino (2010) indicated that mobile technologies hold a great deal of potential for all learners, but
especially those with disabilities. The findings and implications of this review are expected to provide both teacher-
practitioners and researchers with valuable references and suggestions regarding the use of M-learning in designing
and implementing lifelong learning plans for learners with disabilities.

References

Admiraal, W., Raessens, J., & Van Zeijts, H. (2007). Technology enhanced learning through mobile technology in
secondary education. Expanding the knowledge economy: Issues, applications, case studies (Part 2),
1241-1248.
Callow, J., & Zammit, K. (2012). 'Where lies your text?' (Twelfth Night, Act I, Scene V): Engaging high school
students from low socioeconomic backgrounds in reading multimodal texts. English in Australia, 47(2), 69.
Chang, C. S., Chen, T. S., & Hsu, W. H. (2011). The study on integrating WebQuest with mobile learning for
environmental education. Computers & Education, 57(1), 1228-1239.
Chen, C.-P., Shih, J.-L., & Ma, Y.-C. (2014). Using instructional pervasive game for school children's cultural
learning. Educational Technology & Society, 17(2), 169-182.
Ciampa, K. (2014). Learning in a mobile age: An investigation of student motivation. Journal of Computer Assisted
Learning, 30(1), 82-96.
Duncan-Howell, J. A., & Lee, K. T. (2007). M-learning innovations and initiatives: Finding a place for mobile
technologies within tertiary educational settings. In R. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers
(Eds.), Ascilite 2007, 2-5 December 2007, Singapore.
Ehret, C., & Hollett, T. (2013). (Re)placing school: Middle school students' countermobilities while composing
with iPods. Journal of Adolescent & Adult Literacy, 57(2), 110-119.
Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research.
Helsinki: Orienta-Konsultit.
Fernández-López, Á., Rodríguez-Fórtiz, M. J., Rodríguez-Almendros, M. L., & Martínez-Segura, M. J. (2013).
Mobile learning technology based on iOS devices to support students with special education needs.
Computers & Education, 61, 77-90.
Huang, Y. M., Lin, Y. T., & Cheng, S. C. (2010). Effectiveness of a mobile plant learning system in a science
curriculum in Taiwanese elementary education. Computers & Education, 54(1), 47-58.
Huizenga, J., Admiraal, W., Akkerman, S., & Dam, G. T. (2009). Mobile game-based learning in secondary
education: Engagement, motivation and learning in a mobile city game. Journal of Computer Assisted
Learning, 25(4), 332-344.
Hutchison, A., Beschorner, B., & Schmidt-Crawford, D. (2012). Exploring the use of the iPad for literacy learning.
The Reading Teacher, 66(1), 15-23.
Hwang, G.-J., & Wong, L.-H. (2014). Guest editorial: Powering up: Insights from distinguished mobile and
ubiquitous learning projects across the world. Educational Technology & Society, 17(2), 1-3.
Israel, M., Marino, M. T., Basham, J. D., & Spivak, W. (2013). Fifth graders as app designers: How diverse learners
conceptualize educational apps. Journal of Research on Technology in Education, 46(1), 53-80.
Jacobs, V. A. (2002). Reading, writing, and understanding. Educational Leadership, 60(3), 58-62.
Jeng, Y.-L., Wu, T.-T., Huang, Y.-M., Tan, Q., & Yang, S. J. H. (2010). The add-on impact of mobile applications
in learning strategies: A review study. Educational Technology & Society, 13(3), 3-11.
Kim, P., Buckner, E., Kim, H., Makany, T., Taleja, N., & Parikh, V. (2012). A comparative analysis of a game-
based mobile learning model in low-socioeconomic communities of India. International Journal of
Educational Development, 32(2), 329-340.
Kim, P., Hagashi, T., Carillo, L., Gonzales, I., Makany, T., Lee, B., & Garate, A. (2011). Socioeconomic strata,
mobile technology, and education: A comparative analysis. Educational Technology Research and
Development, 59(4), 465-486.
Kissinger, J. S. (2013). The social & mobile learning experiences of students using mobile e-books. Journal of
Asynchronous Learning Networks, 17(1), 155-170.
Kukulska-Hulme, A. (2005). Mobile usability and user experience. In Mobile Learning: A Handbook for Educators
and Trainers (pp. 45-56).
Kwon, S., & Lee, J. E. (2010). Design principles of m-learning for ESL. Procedia - Social and Behavioral Sciences,
2(2), 1884-1889.
Liang, W., Yu, S., & Long, T. (2013). A study of the tablet computer's application in K-12 schools in China.
International Journal of Education and Development using Information and Communication Technology
(IJEDICT), 9(3), 61-70.
Liu, M., Scordino, R., Geurtz, R., Navarrete, C., Ko, Y., & Lim, M. (2014). A look at research on mobile learning
in K-12 education from 2007 to the present. Journal of Research on Technology in Education, 46(4),
325-372.
Marino, M. T. (2010). Defining a technology research agenda for elementary and secondary students with learning
and other high-incidence disabilities in inclusive science classrooms. Journal of Special Education
Technology, 25(1), 1-27.
Marty, P. F., Alemanne, N. D., Mendenhall, A., Maurya, M., Southerland, S. A., Sampson, V., ...& Schellinger, J.
(2013). Scientific inquiry, digital literacy, and mobile computing in informal learning environments.
Learning, Media and Technology, 38(4), 407-428.
McCabe, M., & Tedesco, S. (2012). Using QR Codes and Mobile Devices to Foster an Inclusive Learning
Environment for Mathematics Education. International Journal of Technology and Inclusive Education
(IJTIE), 1(1).
Ng, W., & Nicholas, H. (2013). A framework for sustainable mobile learning in schools. British Journal of
Educational Technology, 44(5), 695-715.
Nowell, S. D. (2014). Using disruptive technologies to make digital connections: stories of media use and digital
literacy in secondary classrooms. Educational Media International, 51(2), 109-123.
O'Malley, C., Vavoula, G., Glew, J. P., Taylor, J., Sharples, M., & Lefrere, P. (2003). MOBIlearn WP4: Guidelines
for learning/teaching/tutoring in a mobile environment. Retrieved from
http://www.mobilearn.org/download/results/guidelines.pdf
Pellerin, M. (2014). Language Tasks Using Touch Screen and Mobile Technologies:
Reconceptualizing Task-Based CALL for Young Language Learners. Canadian Journal of Learning & Technology,
40(1).

Sharples, M., Taylor, J., & Vavoula, G. (2005). Towards a theory of mobile learning. Proceedings of mLearn 2005,
1(1), 1-9.
Shogren, K. A., Wehmeyer, M. L., Palmer, S. B., Rifenbark, G. G., & Little, T. D. (2015). Relationships between
self-determination and postschool outcomes for youth with disabilities. Journal of Special Education, 53,
30-41.
Stead, G., Sharpe, B., Anderson, P., Cych, L., & Philpott, M. (2006). Emerging technologies for learning. Coventry,
UK: Becta.
Traxler, J. (2005). Defining mobile learning. In Proceedings of the IADIS International Conference: Mobile
Learning 2005 (pp. 261-266). Qawra, Malta: IADIS.
United Nations Educational, Scientific and Cultural Organization. (2012). Turning on mobile learning: Global
themes, UNESCO Working Paper Series on Mobile Learning. Retrieved from
http://unesdoc.unesco.org/images/0021/002164/216451e.pdf
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA:
Harvard University Press.
Walker, R. (2013). "I don't think I would be where I am right now." Pupil perspectives on using mobile devices for
learning. Research in Learning Technology, 21.
Wagner, E. D. (2005). Enabling mobile learning. EDUCAUSE Review, May/June 2005, 41-52.
Wong, L. H., Hsu, C. K., Sun, J., & Boticki, I. (2013). How Flexible Grouping Affects the Collaborative Patterns in
a Mobile-Assisted Chinese Character Learning Game. Educational Technology & Society, 16(2), 174-187.
Wong, L. H., & Looi, C. K. (2011). What seams do we remove in mobile-assisted seamless learning? A critical
review of the literature. Computers & Education, 57(4), 2364-2381.
Wu, W. H., Jim Wu, Y. C., Chen, C. Y., Kao, H. Y., Lin, C. H., & Huang, S. H. (2012). Review of trends from
mobile learning studies: A meta-analysis. Computers & Education, 59(2), 817-827.
Zurita, G., & Nussbaum, M. (2004). A constructivist mobile learning environment supported by a wireless handheld
network. Journal of Computer Assisted Learning, 20(4), 235-243.
Common Core State Standards. (2014). Common Core State Standards Initiative. Retrieved from
http://www.corestandards.org/in-the-states.


PART 4 SUPPORT AND MENTORING


19. Examining the Effects of Employing Various Multimedia Tools in a
Flipped Classroom on Pre-Service Teachers' Self-Efficacy and Knowledge
Application Abilities

Aileen J. Watts
College of Education
Arkansas Tech University
awatts4@atu.edu

Mohamed Ibrahim
College of Education
Arkansas Tech University
mibrahim1@atu.edu

Introduction
Sir Francis Bacon's famous statement scientia est potestas ("knowledge is power") has long been embraced as a
hallmark of academia. Unfortunately, in this 21st-century technological era, knowledge acquisition is just the tip of
the proverbial intellectual iceberg. The flipped classroom model is a paradigm shift for many traditional academics.
Instead of lecturing and sending students home to work on material or take exams, flipped classrooms use class time
to facilitate critical, collaborative discussions and engage in performance-based activities that challenge students'
understanding of the course content and enable them to collectively apply material in a variety of creative ways. In a
flipped classroom model the roles of instructor and learner are adjusted to more effectively influence students'
learning outcomes. In a licensure-based, practitioner's degree like teacher education, this need for hands-on,
performance-based activities is even more important as we work to prepare the next generation of teachers to be
effective educators in today's rigorous, technologically driven classrooms. The flipped classroom teaching strategy
stems from a large body of literature on student-centered learning, such as Piaget's work (1968) on constructivism
and collaborative learning and Vygotsky's (1978) on cooperative learning. Therefore, the purpose of this study was
to examine the effects of employing various multimedia tools in a flipped classroom on pre-service teachers' self-
efficacy and knowledge application abilities.

Theoretical Framework
Flipped Classroom

Although the lecture-discussion model has been relied upon for decades as a mechanism to help students acquire
new knowledge (Hattie, 2009; Schwerdt & Wuppermann, 2010), other researchers argue that the problem with this
instructional methodology is its limited ability to positively influence students' application and/or transfer of new
knowledge. For some students, information shared in class lectures may cover material they are already familiar
with; other students may have trouble taking in new information because of the instructional style or pacing of the
professor, or students may simply lack the prior knowledge necessary to understand the new concepts being
introduced (Goodwin & Miller, 2013). Bransford, Brown and Cocking (2000) suggest that there are three key
elements critical to improving a student's learning paradigm: "To develop competence in an area of inquiry,
students must: a) have a deep foundation of factual knowledge, b) understand facts and ideas in the context of a
conceptual framework, and c) organize knowledge in ways that facilitate retrieval and application" (p. 16).

The flipped classroom model enables students to acquire information outside the classroom through a wide variety
of formats (textbook, asynchronous video lectures, online learning modules, etc.) and then apply their newly
acquired knowledge through active, group-based, problem-solving activities in the classroom. Being able to receive
immediate feedback from their peers and instructor enables students to re-adjust misconceptions and re-organize
the new knowledge for future application (Bergmann & Sams, 2012; Bishop & Verleger, 2013; Strayer, 2007). The
flipped classroom research suggests a number of prescriptive principles, which include: short quizzes using
classroom response systems (i.e., clickers) at the start of class to make sure that students have read the content;
minimal lecture (no more than 20% of class time); and practice and hands-on activities (80% of class time). The
research suggests that having students familiar with the materials before class, either through having read the
chapters or watched the videos, enables the instructor to practice more complex levels of skills; students have the
opportunity to ask more meaningful questions in class and to work through problems with the guidance of the
instructor and the support of their peers; and there is a much stronger emphasis on collaborative, interactive
learning.

Self-efficacy

Bandura's (1994) theory of self-efficacy purports that self-efficacy is a critical element that directly influences not
only task performance but also cognitive development. He defines self-efficacy as "people's beliefs about their
capabilities to produce designated levels of performance that exercise influence over events that affect their lives.
Self-efficacy beliefs determine how people feel, think, motivate themselves and behave" (Bandura, 1994, p. 71).
This social cognitive perspective holds that successful, self-regulated learners possess higher levels of motivation
(personal influences), apply more effective learning strategies (behavioral influences) and respond more
appropriately to situational demands (environmental influences) (Pintrich, 2002). Therefore, many educators
consider students' self-efficacy an essential indicator for evaluating students' engagement in problem-solving
activities, which are a significant part of flipped classrooms, and for motivating students to adopt effective learning
strategies. The gap that remains is what role, if any, the flipped classroom methodology plays in promoting pre-
service teachers' application of specific multimedia tools and in developing their self-efficacy. While numerous
benefits of student-centered, hands-on learning environments have been documented in the literature, the actual
effects of the flipped classroom model on student self-efficacy in creating and applying multimedia projects still
require more analysis.

Multimedia in Learning

Empirical research has shown that the brain's capacity for multimodal processing (visual and auditory), when
coupled with multimedia instruction, can directly influence student learning (Baddeley, 1992; Mayer, 2003, 2005;
Miller, 2005). Recent studies have shown that the brain's working memory contains several channels, with the
visual channel typically handling less information than the auditory, but when information is presented through
both channels the overall cognitive processing capacity increases (Sweller, 2005). Along these same lines, Mayer
(2005) has found that effective multimedia presentations introduce new information in a way that utilizes students'
existing schemas to assist them in incorporating new information into their long-term memory (refer to Figure 1).

Figure 1: Visual Representation of the Cognitive Theory of Multimedia Learning (Mayer, 2005)
Research also suggests that activating prior knowledge (Kalyuga, 2005), student-directed pacing (Mayer, Dow &
Mayer, 2003), visualizing complex information through animation (Mayer & Chandler, 2001), and active
engagement with the material (Mayer, 2003) all contribute to improving the effects of multimedia on student
learning. Finally, research also states that students should be provided with opportunities to apply what they have
learned and to receive formal and informal feedback on their progress (Gee, 2005; Perkins, 1992; Mayer, 2005).

Research questions

The purpose of this research was to examine the effects of employing various multimedia tools in a flipped
classroom on pre-service teachers self-efficacy and knowledge application abilities. Following a thorough review of
the literature on flipped classrooms, multimedia and self-efficacy, this study was guided by four overarching
questions:

1. Will the use of the flipped classroom model, as an interactive, student-centered learning framework,
improve students' application of content knowledge?
2. Will the use of the flipped classroom model, as an interactive, student-centered learning framework,
improve students' self-efficacy?
3. Will the use of multimedia projects in a flipped classroom model improve students' knowledge application
and self-efficacy?
4. Is there a relationship between students' application of multimedia elements and their self-efficacy within a
flipped classroom model?
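Research question 4 asks about a relationship between two quantitative measures; such relationships are commonly quantified with a correlation coefficient. A minimal sketch (Python), using illustrative values only, not data from this study:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)  # undefined if either sample is constant

# Illustrative values only (multimedia-task scores vs. self-efficacy ratings).
task_scores = [70, 75, 80, 85, 90]
efficacy = [60, 70, 75, 85, 95]
print(round(pearson_r(task_scores, efficacy), 2))  # 0.99
```

A correlation alone does not establish direction or causation; it only indicates whether the two measures move together across participants.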

Research Methods
Guided by a review of various multidisciplinary studies on flipped classrooms, self-efficacy and multimedia
learning, the investigators hypothesized that a flipped classroom model would improve students' self-efficacy in
creating and analyzing multimedia projects between phase one and phase two. The study had one independent
variable, the teaching method (lecture-based or flipped-based), and two dependent variables: (1) knowledge
application and (2) students' perceived self-efficacy to create and analyze multimedia tools. Study participants
included 36 pre-service teachers enrolled in content literacy courses at a medium-sized, midwestern university.
Participants were non-science majors (35 female, 1 male). English was reported as the native language of all
participants. Participants' reported ages ranged from 22 to 26 years (SD = 1.213). Participants were of Caucasian,
Hispanic and Asian descent, and all were students in their senior year.

Data Collection Procedures

This mixed-methods study relied on the triangulation of qualitative data sources: classroom observation and
discussion notes, and student feedback on the post self-efficacy survey. The quantitative instruments for data
collection included a demographic survey administered through Blackboard; pre and post self-efficacy surveys to
assess students' perceived self-efficacy, based on Bandura's measure (Bandura, 2006) (refer to Table 1); case study
analysis results; and two 20-question, multiple-choice quizzes covering the content material taught during phase
one and phase two of the study.

Phase 1 Self-Efficacy Question (Pre and Post): After reading chapter 6 and completing this phase's learning
activities, how certain are you that you can identify and apply effective instructional strategies and activities,
which will help you to develop fluent readers and writers in your class?

Phase 2 Self-Efficacy Question (Pre and Post): After reading chapter 7 and completing this phase's multimedia
learning activities, how certain are you that you can identify and apply effective instructional strategies and
activities, which will help you to build children's word knowledge in your class?

Both phases: Rate your degree of confidence by recording a number from 0 to 100 using the scale given below:
0 = Cannot do at all; 10; 20; 30; 40; 50 = Moderately can do; 60; 70; 80; 90; 100 = Highly certain can do

Table 1: Self-Efficacy Questionnaire (Pre & Post) for Phase 1 (Traditional) and Phase 2 (Flipped)
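Responses on the scale in Table 1 are plain 0-100 confidence ratings, so they can be summarized directly. A minimal sketch (Python) with invented ratings, not the study's data:

```python
def summarize_efficacy(responses):
    """Mean and range of Bandura-style 0-100 confidence ratings."""
    valid = [r for r in responses if 0 <= r <= 100]
    if len(valid) != len(responses):
        raise ValueError("ratings must fall on the 0-100 scale")
    return {
        "mean": sum(valid) / len(valid),
        "min": min(valid),
        "max": max(valid),
    }

# Illustrative ratings only -- not data from the study.
pre = [60, 70, 50, 80, 70]
post = [80, 90, 70, 100, 90]
print(summarize_efficacy(pre)["mean"])   # 66.0
print(summarize_efficacy(post)["mean"])  # 86.0
```

Comparing the pre and post means per phase is exactly the summary reported later in Table 2.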


The lessons and activities used in this study were adapted from the course text, Literacy in the Early Grades by
Gail Tompkins, Third Edition (2011). Participants received instruction through two different teaching methods: a
traditional lecture-based method (Phase 1) and a flipped classroom method (Phase 2). During phase one of the
study, students participated in a traditional classroom teaching model, which comprised a large lecture-discussion
component, traditional PowerPoint slides, individual note-taking requirements, whole-class case study analysis
discussion from the chapter in the course text, and a paper-based multiple-choice quiz developed around the topic
of effective instructional strategies and activities for developing fluent readers and writers.

During phase two, students participated in a flipped classroom model, which contained a large collaborative
element in the instructional methodology with various interactive, multimedia activities. One of the multimedia
activities included lesson simulations using Gizmos, a web 2.0 simulation tool that helps students develop a deeper
understanding of challenging concepts through inquiry and exploration. Another web-based activity was a group
jigsaw activity in which the students explored several free multimedia resources available for building word
knowledge in students and then shared their findings with the group. In a third activity, the students conducted a
VoiceThread interview of their school practicum teacher and then partnered with another student in the class to
analyze possible solutions posed by the practicum teacher. Finally, using Quizlet, students took a self-paced quiz
developed around the topic of effective instructional strategies and activities for building children's word
knowledge.

Instruments

For the pre-test data collection procedures, students completed a short demographic survey, which included
questions about gender, years in college, area of specialization and age. The students' self-efficacy survey (pre)
was designed with an 11-point scale ranging from "Cannot do at all" at zero to "Highly certain can do" at 100.
Students were asked to rate how confident they were in their belief that they had the ability to engage in the
specific objective outlined for that phase (refer to Table 1). The investigators developed the self-efficacy measure
based on Bandura's "Guide to the construction of self-efficacy scales" (Pajares, 2006). The survey was tailored to
specifically assess students' self-efficacy regarding their ability to develop fluent readers and writers in their
classrooms. The post-test data collection procedures comprised two different tools: first, the re-administration of
the self-efficacy survey with the topic question for that phase, and second, a twenty-question multiple-choice quiz
for each phase, designed to measure students' content knowledge regarding the specific topic outlined in that
phase.

Instrument                    Phase 1 Mean    Phase 2 Mean
Pre Self-Efficacy Survey          67%             68%
Post Self-Efficacy Survey         83%             89%
Post Multiple Choice Quiz         84%             90%

Table 2: Summative Scores from the Quantitative Instruments
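The pre-to-post differences behind Table 2 reduce to simple arithmetic; a small sketch (Python) of that computation, using only the means reported in Table 2:

```python
# Summative mean scores from Table 2, in percent.
scores = {
    "phase_1": {"pre_efficacy": 67, "post_efficacy": 83, "quiz": 84},
    "phase_2": {"pre_efficacy": 68, "post_efficacy": 89, "quiz": 90},
}

for phase, s in scores.items():
    gain = s["post_efficacy"] - s["pre_efficacy"]
    print(f"{phase}: self-efficacy gain = {gain} points, quiz = {s['quiz']}%")
# phase_1: self-efficacy gain = 16 points, quiz = 84%
# phase_2: self-efficacy gain = 21 points, quiz = 90%
```

The flipped phase thus shows both a larger self-efficacy gain (21 vs. 16 points) and a higher quiz mean (90% vs. 84%), which is the pattern the Findings section discusses.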

Findings
An analysis of the research data suggested that the study's findings were consistent with current research
(Bergmann & Sams, 2012; Bishop & Verleger, 2013; Strayer, 2007) indicating that the flipped classroom model
improves students' perceived understanding and application of course material, such as multimedia tools, through
enhanced collaboration and self-analysis activities. The four main themes that emerged from the analysis of the
qualitative data were: 1) collaborative interaction and application, 2) metacognitive self-reflection, 3) the
instructor's theory of action, and 4) multimedia as an instructional tool for enhanced exploration and expression.

Collaborative Interaction and Application

The correlation between the hands-on collaborative components of the flipped classroom model and students'
application of the lesson material emerged as a dominant theme time and again, both in the analysis of the research
(Bergmann & Sams, 2012; Tucker, 2012) and in the study findings. Results from both the observational data and
the survey findings showed that students resoundingly preferred hands-on, collaborative group work, stating that it
enabled them to practice using the material they had learned in the chapter (and in class discussion). They also
stated that it provided the opportunity to see the inherent challenges in certain practices, and they were able to
discuss their ideas and impressions of these limitations with their classmates.

Metacognitive Self-Reflection

Another key theme that emerged was the importance of consistent, metacognitive self-reflection regarding the
learning process following each phase of the study. The role of metacognition in improving student outcomes
emerged from the literature as one way to influence learning (Schraw, Crippen, & Hartley, 2006; Tanner, 2012), a
theory that was supported in the study findings. More than 85% of the students commented that specifically
focusing on three areas: 1) what they learned, 2) how they learned it, and 3) what they needed to do in the future to
improve their learning, proved significantly more helpful than the traditional summative reflections they typically
did in other courses, which usually focused only on the first area. For example, student 23 specifically stated that
having the opportunity to reflect metacognitively at the conclusion of both phases (traditional versus flipped) really
enabled her to think about how she learned best and to make measurable goals for how she could take more
ownership of her learning in the future.

Instructors Theory of Action

Argyris and Schön's (1997) theory of action regarding espoused theory (what we say) versus theories-in-use (what
we do) was selected to illustrate the dichotomy in the findings regarding the instructor's use of power in the
classroom, and how it differed from phase 1 (traditional) to phase 2 (flipped). This concept of modeling how to use
power with versus over others emerged multiple times in the data. Students reported, in their reflections and in
classroom discussions, how much more they learned when the instructor acted as a facilitator of learning in the
class, asking questions to motivate, guide and empower them to work collaboratively with their peers, versus
simply standing at the front and lecturing from a PowerPoint. Being able to receive immediate feedback from their
peers and their instructor helped them to adjust their misconceptions and re-organize the new information for future
use (Bergmann & Sams, 2012; Bishop & Verleger, 2013; Strayer, 2007). The data described the instructor's role in
phase 2 (flipped) as hands-on: an active supporter of the learning process, consistently modeling characteristics that
the students felt were important to develop for use as future teachers with their own students.

Multimedia for Exploration and Expression

The final theme that emerged from the qualitative study findings, and that concurred with the literature review, was
the important role multimedia played in improving students' self-efficacy (Baddeley, 1992; Mayer, 2003, 2005;
Miller, 2005). Many of the students commented on how much they enjoyed working with their groups to explore
new tools like VoiceThread and Gizmos, and how much more interactive these were than just discussing a few
pages from the text. Nearly 100% of the students preferred the self-paced review opportunity (Mayer, Dow &
Mayer, 2003) that Quizlet provided for the course material. Student 31, who had given herself a 6 on the Phase 1
self-efficacy pre-test and an 8 on the post-test, earning a 77% on the summative quiz, went up to a 7 on the pre-test
during Phase 2, a 9 on the post-test, and a 100% on the summative quiz. In her post comments she stated that being
able to utilize the multimedia tools, especially VoiceThread, made the information come alive: "I really felt like I
could understand the problem more by being able to see and hear the teacher describing it. I loved using this
program!"

Conclusion and Implications of Study Findings


The main conclusion that emerged from the researchers' interpretation of the study's findings, and their
relationship to the literature, was that the flipped classroom model does in fact influence students' perceived self-
efficacy regarding their understanding and application of course material, and that multimedia tools do indeed play
a role in this development. A new question for future studies might be: Which multimedia tools have the greatest
influence on students' self-efficacy in flipped classrooms? This research contributes conceptually by theorizing
that the flipped classroom model directly influences pre-service teachers' knowledge application and self-efficacy.
It addresses a gap in the literature by identifying how the flipped classroom model is defined, introduced, and
employed in conjunction with specific experiential, multimedia learning activities and self-efficacy reflective
practices. Even though the method and sampling procedures restrict broad-based, general application, the
explanatory framework could be conceptually generalizable to a wider audience.

References
Argyris, C., & Schön, D. (1997). Organizational learning: A theory of action perspective. Reading, MA: Addison-
Wesley.
Baddeley, A. (1992). Working memory. Science, 255, 556-559.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman.
Bandura, A. (2006). Toward a Psychology of Human Agency. Perspectives on Psychological Science, 1(2), 164-
180.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school.
Washington, DC: National Academy Press.
Gee, J.P. (2005). Learning by design: Good video games as learning machines, E-Learning, 2, 5-16.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London; New
York: Routledge.
Kalyuga, S. (2005). Prior knowledge principle in multimedia learning. In R.E. Mayer (Ed.). The Cambridge
Handbook of Multimedia Learning. New York: Cambridge University Press.
Mayer, R. (2003). Learning and Instruction. Upper Saddle River, NJ: Prentice Hall.
Mayer, R. (2005). Introduction to multimedia learning in R.E. Mayer (Ed.). The Cambridge Handbook of
Multimedia Learning. New York: Cambridge University Press.
Mayer, R. & Chandler, P. (2001). When learning is just a click away: Does simple interaction foster deeper
understanding of multimedia messages? Journal of Educational Psychology, 93, 390-397.
Mayer, R., Dow, G. & Mayer, S. (2003). Multimedia learning in an interactive self-explaining environment: What
works in the design of agent-based micro-worlds? Journal of Educational Psychology, 95, 806-813.
Pajares, F., & Urdan, T. C. (Eds.). (2006). Self-efficacy beliefs of adolescents. Retrieved from
http://site.ebrary.com/id/10429529
Perkins, D. (1992). Smart schools: Better thinking and learning for every child. New York: The Free Press.
Piaget, J. (1968). On the development of memory and identity. Barre, MA: Clark University Press with Barre
Publishers.
Pintrich, P. R., & Schunk, D. H. (2002). Motivation in education: Theory, research, and applications. Upper Saddle
River, NJ: Merrill.
Schraw, G., Crippen, K., & Hartley, K. (2006). Promoting self-regulation in science education: Metacognition as
part of a broader perspective on learning. Research in Science Education, 36, 111-139.
Schwerdt, G., & Wuppermann, A. C. (2009). Is traditional teaching really all that bad? A within-student between-
subject approach. Munich: CESifo.
Sweller, J. (2005). The redundancy principle in multimedia learning. In R.E. Mayer (Ed.), Cambridge handbook of
multimedia learning (pp. 159-167) Cambridge, UK: Cambridge University Press.
Tompkins, G. (2011). Literacy in the early grades. Boston, MA: Pearson Education, Inc.
Tucker, B. (2012). The flipped classroom. Education Next, 12 (1), 82-83.
Vygotsky, L. (1978). Interaction between learning and development. In Mind in society. Cambridge, MA: Harvard
University Press.

20. Implementation of the Inverted Classroom Model for Theoretical
Computer Science

Karsten Morisse
Faculty of Engineering and Computer Science
University of Applied Sciences Osnabrueck
Germany
k.morisse@hs-osnabrueck.de

Introduction
The module "Theoretical Computer Science" (TCS), sometimes called "Theory of Computation" or
"Fundamentals of Computer Science", comprises the fundamental mathematical properties of computer hardware,
software, and certain applications thereof. Basic topics are formal languages, computational models like finite
automata or Turing machines, and the limitations of computability and decidability. The subject has connections
with engineering practice, and it also involves purely philosophical considerations. At our university, Theoretical
Computer Science is a compulsory course in the study programs Media and Computer Science
("Informatik - Medieninformatik") and Technical Computer Science ("Informatik - Technische Informatik"). It is a
second-year course, usually taught in the 4th semester of study. In each of the first three semesters there is a course
in mathematics, which is a helpful prerequisite for TCS. Usually, the approach to TCS is a very formal one. After a
definition, e.g. of a finite state machine or the Turing machine, statements are presented as lemmas or theorems
with formal proofs. Sometimes the results do not have a direct link to an application. Especially for students at a
university of applied sciences, this fact can be a challenge for deeper understanding and learning motivation.
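The computational models named above can be made concrete in a few lines of code; the following is an illustrative sketch (Python, not taken from the course materials) of a deterministic finite automaton that accepts binary words containing an even number of 1s:

```python
def run_dfa(word, delta, start, accepting):
    """Simulate a deterministic finite automaton on an input word."""
    state = start
    for symbol in word:
        state = delta[(state, symbol)]  # total transition function
    return state in accepting

# DFA over the alphabet {0, 1} accepting words with an even number of 1s.
delta = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
print(run_dfa("1001", delta, "even", {"even"}))  # True  (two 1s)
print(run_dfa("10", delta, "even", {"even"}))    # False (one 1)
```

Turing machines and the decidability results treated in the course need more machinery, but the same table-driven transition style carries over, which can help bridge the formal definitions and an engineering intuition.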
Universities of applied sciences in Germany face the major challenge that there is not one single general
qualification for university entrance; rather, there are several paths. Therefore there is a great difference in first-
year students' prior knowledge in many areas, especially in mathematics (Braun et al., 2014). This might be one
reason for the high rate of examination failure in mathematics courses. Due to the strong relationship between TCS
and mathematics, the same situation can be seen for these courses. After teaching TCS for several years as a
traditional course, we introduced the Inverted Classroom Model (Baker 2000, Lage et al. 2000, Bergmann & Sams
2012) as a teaching concept for Theoretical Computer Science in summer 2014. In this article we describe the
implementation process and first results after the second pass.

ICM and Motivational thoughts

A basic requirement for the development of university teaching is to improve the offering for students with
regard to teaching quality, possibilities for individualization, and flexibility of studying. However, if one considers
the duration of the educational-theoretical discussion of the didactical potential of new media (e.g. video-based
lectures) and compares this to actual university reality, a low practical adoption of media-based teaching concepts
in universities becomes evident (Bishop & Verleger 2013). The inverted classroom is a learner-centered course
concept in which the students prepare for the classroom session, usually by watching video lectures. Digital
technologies are thus used to move direct instruction from the classroom to the individual learning environment of
the student (out-class). The classroom time (in-class) with the lecturer is then dedicated to discussions, reflections,
individual training and problem-based learning (see Figure 1). Our course concept implements this idea and
combines various electronically supported learning modules to provide as much individual support for students as
possible and, most importantly, to encourage students in a continuous learning process during the semester. Video-
based lecture recordings play an important role in the scenario. In the past, we gathered thorough experience with
different formats, from live streaming to several on-demand video formats (Wichelhaus et al. 2008). For the ICM
implementation for TCS we chose short video sequences, which are distributed via YouTube. We decided to use
YouTube because of its broad acceptance and accessibility, even on mobile devices. An important effect of the ICM

187

concept is the fact, that the role of the lecturer is shifted from the classical role as a presenter of knowledge to the
new role as a coach by side. This must be kept in mind when one considers implementing the ICM.

Figure 1 Inverted Classroom Model vs. Traditional teaching

There are several motivations for using an ICM in general. First, the ICM frees class time for
interactive activities such as cooperative problem-based learning (Lage et al. 2000). Second, content can be
presented in several different formats, so that the students' different learning styles can be addressed (Lage et al.
2000); moreover, students can set their individual learning pace. Third, the ICM helps to impart competencies such
as communication skills, problem solving and working in multidisciplinary teams, even in courses on engineering
and computer science. These higher-order outcomes are becoming more and more important for employability
(Bishop & Verleger 2013).
The author has used the ICM in a course on Audio/Video Technology since summer 2007. The results of the
corresponding evaluation (Morisse & Ramm 2007, Wichelhaus et al. 2008a) motivated us to introduce the ICM
concept for the course TCS, a compulsory course with 5 ECTS credit points, a workload of 8 - 10 h per
week, 4 h of contact time per week, and a high examination failure rate. Instead of the usual frontal chalkboard-based
lecture, a combination of lecture recordings, live coaching sessions, teaching materials (script & video) and weekly
exercises and problems is offered. Many teachers show a certain reluctance and shyness toward video-based
teaching; especially the fact that the teacher is recorded is a major source of inhibition (Persike 2015). In our case,
the overlap between the content of other courses given by the author (technology of digital audio and video) and the
media used to support knowledge sharing was an important motivation for using video-based teaching elements for
several years and helped to overcome the naturally existing inhibition thresholds. A key objective from the students'
point of view is the promotion of a self-organized learning process. Our ICM concept consists of four modules:
- video lecture,
- live coaching,
- electronic textbook,
- a final examination.
Moreover, a learning management system (LMS; SharePoint) is used, in which the students are provided with
important information on the course organization (e.g. the time schedule), course material, and tools to support
collaborative work (forum, wiki). The use of the different building blocks and the stepwise development and
refinement of the overall concept is shown in Figure 2. TCS is a course about formal languages, automata theory
and computability, and it is not much liked by the students because of its very formal approach. It is a rather small
course with 20 - 30 students participating each semester. Following Bergmann & Sams (2012), the in-class time for
this course is enriched by several warm-up and stimulating activities to activate the students for lively
participation during these sessions.

Figure 2 Development of ICM concept

Videos or lecture recordings play an important role within the ICM. In the beginning (2005, 2006) we
made a more or less experimental use of live and on-demand streaming as well as a synchronized presentation of
speaker and material (virtPresenter, Mertens et al. (2007)). We started with complete recordings of the typical 90-
min lecture. From a technical point of view, this kind of lecture recording can be produced largely automatically
(e.g. Ketterl et al. (2006); Engelbert et al. (2013)). Informal discussions and analyses of server log
files showed that students prefer to access specific content very precisely. This correlates with Zappe et al. (2009),
who report that students prefer videos with a length of 20 - 30 min. Therefore, the 90-min videos of 2013 were
divided in a post-production process into fine-grained video episodes (each with a length between 2 and 20
min). Several studies (Hermann et al. (2006); Hermann et al. (2007); Krüger (2005)) have shown that students
perceive lecture recordings as an added value. The time and location independence, the possibility of pausing for
further research, and the option of repetition were seen as very beneficial for the learning process. The division of
the recordings into short episodes gave the students the opportunity to access the video content very precisely,
without intensive search operations in a full-length 90-min lecture recording. However, this means an enormous
production expenditure, which is difficult to automate. For lectures with a large number of students, recordings not
only offer these advantages in terms of interaction but also mitigate the problems of a crowded lecture hall (poor
acoustics, poor visibility, no seats, fear of causing a disturbance).
However, some things must be considered:
- Lecturers must not be burdened with the organisation and technical production of the lecture recordings; this has
to be organized by the university or the institution.
- Care must be taken to ensure adequate technical quality and dramaturgy. Especially when the recording is used as
a lecture replacement, its quality is an important factor in the learners' perception.
- There is a partial danger of audio/text and image shearing, i.e. of producing spoken text without image context,
which is very disruptive from a media-psychological perspective (Ballstaedt (1997), Kerres (2012)).
For the lecture Theoretical Computer Science, where much of the in-class content is usually developed
interactively at a whiteboard or chalkboard (e.g. formal proofs of mathematical theorems), we prefer screencasts
taken on a tablet computer, where the content is written onto the tablet (e.g. in MS OneNote). These were recorded
with the Matterhorn production system (Opencast (2015)) and, in a post-production step, divided into over 100 short
video sequences, which have been published via YouTube (Morisse (2013)).

The live coaching is a personal meeting between students and the lecturer, twice a week, in a classroom.
Instead of presenting the lecture content, this in-class time is used to clarify open issues, to discuss students'
questions and to work on problems. For each meeting a specific topic is announced in the learning management
system, and it is presumed that the students have prepared themselves with the videos and the textbook. As part of
the ICM, students have to prepare themselves and develop their individual questions. This leads to in-depth
discussions and deeper coverage of the content, because the prepared questions are usually of higher quality than
spontaneously arising ones. At the same time it presents new challenges to the lecturer, because she or he does not
know in advance which questions will be asked, and the discussions can reach a much greater depth than in
traditional classroom teaching. Basically, this format can be carried out with different group sizes, from small to
large groups of students. Our observations show that a certain group of students uses the coaching sessions only as
passive listeners to deepen their learning. In this sense, a larger auditorium can even be useful for having multiple
active participants. However, the classic problems of mass events also affect the live coaching: if too many
questions occur, or with a large number of participants, the influence of the individual as well as visibility and
acoustics suffer. A solution could be to prepare the meeting via email (collecting questions) and then answer the
questions systematically. The coaching module could also be carried out fully electronically (live chat, forum, wiki,
etc.). According to Bachmann et al. (2001), we would then reach a virtualization of the event. However, the
important social component of learning would then be lost, and the course would become a pure "remote event". We
have not tried these options so far because of our small group sizes. In our first implementation of the ICM, the
coaching session had more or less no fixed structure; the students were simply encouraged to ask their questions at
the beginning of the in-class time. This was changed with the implementation of the course concept for the
Theoretical Computer Science lecture. Following Bergmann & Sams (2012), we use a fixed (but flexible on
demand) schedule; for a 90-min session we use the structure shown in Table 1.

Activity                                Time          Description
Warm-up                                 5 - 10 min    e.g. 3x3 (3 groups, 3 min discussion, 3 presentations),
                                                      clicker questions, ...
Q & A time on video and text            10 min        Questions on this week's course contents (presumes the
                                                      preparation of the students)
Independent work on problems in teams   50 - 70 min   Work in small groups (2 - 4 students) on problems on
                                                      this week's content
Final discussion                        15 min        Selected discussion on problems
Overall                                 90 min
Table 1: Structure of coaching session

As can be seen from Table 1, the structure of the coaching session has changed. In contrast to the first
implementation of the ICM, this contact time between students and lecturer is now much more organized. An
important role is played by an audience response system, which is used for the warm-up and occasionally during the
session. With it we practice Peer Instruction (see Mazur (1997)) to keep the level of interaction between students,
and between students and teacher, high. Most of the time is spent working on tasks in small groups (2 - 4 students).
The role of the teacher is now to be a coach at the side of each group.
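The Peer Instruction cycle driven by the audience response system follows a simple decision rule. The sketch below is an illustration only: the threshold values (roughly 30 - 70% correct answers triggering a peer discussion round, as commonly recommended in the Peer Instruction literature) are our assumptions, not values prescribed by the course.

```python
from collections import Counter

def peer_instruction_step(votes, correct, low=0.3, high=0.7):
    """Decide the next action in a Peer Instruction cycle.

    votes   -- answer options submitted via clickers, e.g. ["A", "B", "B"]
    correct -- the correct answer option
    Returns "explain" (too few correct answers, lecturer explains again),
    "discuss" (students convince their neighbours, then re-vote), or
    "move_on" (the concept is understood, continue with the next topic).
    """
    if not votes:
        return "explain"
    share = Counter(votes)[correct] / len(votes)
    if share < low:
        return "explain"
    if share <= high:
        return "discuss"
    return "move_on"

# 12 of 20 students answer "B" correctly (60%) -> peer discussion round
print(peer_instruction_step(["B"] * 12 + ["A"] * 5 + ["C"] * 3, "B"))  # -> discuss
```

In a session, one such step runs after every clicker question; only after "move_on" does the group proceed to the next problem.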
Students are encouraged to prepare for these coaching sessions by tasks like "familiarize yourself with
acceptance by finite automata" rather than "read section 4.3 of the textbook". Thus they are not forced to use the
learning material of the course but can also use other resources; an additional reference list is published in the
LMS. Teaching material such as annotated slides, the textbook, recommended literature and online references is
available within the LMS. This material can be used to complement or to deepen the content of the lecture
recordings. In an
evaluation in our usability lab with eye-tracking facilities, we investigated the combined usage of
annotated PDF documents and lecture recording videos. While learning with the videos, students work
simultaneously with this text material, pausing the video for clarification; this form of learning is not possible with
this intensity in a traditional classroom lecture. While working on the weekly online assessments, this text material
is the main source of information, whereas other Internet sources such as Wikipedia or a Google search were used
very sparingly, contrary to our expectations (see Wichelhaus et al. (2008) for details). For the course Theoretical
Computer Science we developed a comprehensive textbook, which can be used as the leading information resource.
Besides a very detailed text, it contains targeted links to other resources; e.g. QR codes are used to link to the
lecture videos for the specific content on a page (see Figure 3). In the printed version these QR codes can be
scanned to access the relevant video with a smartphone; in the electronic version the QR-code graphics are
clickable links.
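A preparation task such as "familiarize yourself with acceptance by finite automata" can also be explored directly in code. The following minimal sketch shows word acceptance by a deterministic finite automaton; the concrete automaton (accepting binary strings with an even number of 1s) is an invented illustration, not an example taken from the course material.

```python
def dfa_accepts(word, delta, start, accepting):
    """Return True iff the DFA accepts the input word.

    delta     -- transition function as a dict: (state, symbol) -> state
    start     -- initial state
    accepting -- set of accepting states
    A word with an undefined transition is rejected.
    """
    state = start
    for symbol in word:
        if (state, symbol) not in delta:
            return False
        state = delta[(state, symbol)]
    return state in accepting

# DFA over the alphabet {0, 1} accepting words with an even number of 1s
delta = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
print(dfa_accepts("0110", delta, "even", {"even"}))  # -> True
print(dfa_accepts("1011", delta, "even", {"even"}))  # -> False
```

Students can modify the transition table to experiment with other languages, which turns the abstract acceptance definition into something testable.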

Figure 3 Interactive Textbook

Evaluation and Exam performance

We evaluated our first implementation of the ICM with a group of 24 students of media and computer science.
The evaluation had the following goals: usage, acceptance and quality of the ICM concept, and a subjective
personal review by the students. For the study setting we used a questionnaire with open questions and an
observation of the students in a usability lab while they worked on a specific problem from the theoretical field of
the course. It could be shown that the students recognize the added value of the altered teaching concept. Among
other things, the location and time flexibility, the degree of personal responsibility, the structure of the course, the
additional learning effect through repetition and pauses, and the path to an effective exam preparation were
highlighted as positive. The students state that they can successfully resolve problems or queries by using the
coaching sessions and the videos, as well as through communication with other students. The results with respect to
the usual learning environment are contradictory: while some students feared a decrease in the intensity of existing
social contacts, other students speak positively of an increased exchange with fellow students and professors.
Coaching sessions and the exchange between students promote the communication process. Some students perceive
the self-organization of their learning process as undue overhead and regard the personal responsibility as an
excessive demand. However, a majority of students recognize that the necessary individual and direct responsibility
plays an important role in ensuring that the whole concept can unfold all its added value for each individual student.
Additionally, the course organization encourages students toward a more self-organized learning process. In
summary, the teaching scenario particularly promotes the following core competencies: self-organization and self-
responsibility, continuous learning, and media literacy (handling, use and selection of new media). Details can be
found in Wichelhaus et al. (2008, 2008a, 2008b).
In April 2014 we evaluated the course TCS with a group of 16 students (14 male, 2 female). We were
interested in the preparation process for the coaching sessions and in the provided material. The reported time
investment is spread across less than 30 minutes of preparation per coaching session, up to 60 minutes, and 60 - 90
minutes, with 5 nominations each. In general, the ICM is very well liked because of the large amount of assisted
time for practical problem solving and discussion (94% report adequate problems during the coaching sessions). The
ability to set their own learning pace was mentioned by only 37% of the students; however, 87% mentioned the
option for active and self-dependent learning. The combination of the electronic textbook with directly linked videos
was mentioned positively: 69% rated the short length of the videos as positive, and 56% use the textbook
concurrently with the videos.
Students are graded in both courses, Audio/Video Technology and Theoretical Computer Science, by a final
written examination. We did not investigate exam performance formally by dividing the students into an
experimental group and a control group, but we observed a trend toward better exam results. Figure 4 shows the
deviation from the median for each semester in which the course was given by the author. We used the ICM from
summer term 2007 (SoSe 2007) to summer term 2009. In winter term 2011 (WiSe 11/12) the course was given as a
traditional class. In summer 2007 we introduced online assessments with the option to earn extra credits for the final
examination; this was done to encourage the students toward a continuous learning process. These assignments were
optional, but students were encouraged to complete them. This option was also available in winter 2007, winter
2008 and summer 2009; in summer 2008 we did not offer it. The course Theoretical Computer Science was given
just twice with a written final examination: in winter 2013 as a traditional class, in summer 2014 as an ICM. The
average grade improved from 4.4 to 3.8 (on a grade scale from 1 to 5, where 1 is best). Noticeable was the decrease
in the failure rate, which went down from 69% to 25%.
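The deviation-from-median measure shown in Figure 4 can be reproduced from the grade lists. The sketch below assumes one plausible reading of that measure, namely the deviation of each semester's median grade from the overall median across all semesters; the grade values and term labels are invented for illustration and are not the real course results.

```python
from statistics import median

def median_deviation_per_term(grades_by_term):
    """Deviation of each term's median grade from the overall median.

    grades_by_term -- dict: term label -> list of final grades.
    On the German scale (1.0 best, 5.0 worst) a negative deviation
    means the term performed better than the overall median.
    """
    all_grades = [g for grades in grades_by_term.values() for g in grades]
    overall = median(all_grades)
    return {term: median(grades) - overall
            for term, grades in grades_by_term.items()}

# Invented example data:
devs = median_deviation_per_term({
    "WiSe (traditional)": [5.0, 4.7, 5.0, 3.7, 4.0],
    "SoSe (ICM)": [3.0, 4.0, 3.3, 4.0, 2.7],
})
print({term: round(d, 2) for term, d in devs.items()})
```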

Figure 4 Examination results for AV (deviation from the median grade per semester, SoSe 03 to WiSe 11/12)

Critical reflection and transfer

Now let us discuss obstacles that we observe in everyday university life and that can be relevant for the
adaptation of the ICM to other subject-specific application areas. A crucial aspect for the applicability of the ICM is
the subject-specific topicality of the contents. In subjects with a high need for content revision, the lecture
recordings must be updated each semester. This can only be realized by recording real lectures or by a dedicated
media production. In this case it is not possible to provide the lecture recordings solely as a replacement for
attending the lecture. If the traditional lecture is held regularly, an additional coaching session is difficult to provide
due to the time constraints of the lecturer. Lecture recordings should then be made available as supplementary
material in addition to the traditional lecture, since they enjoy a high level of acceptance as additional teaching
material, in particular for exam preparation (Hermann et al. (2006)). For lectures with a very high number of
students they are also a useful measure to reduce the negative effects of overcrowded lecture halls. Under these
circumstances it would be didactically important to provide regular online exercises, since experience from previous
semesters has shown that students with the possibility of anytime, anywhere learning tend toward increased
procrastination (Wichelhaus et al. (2008)). Therefore, basic subjects with a low need for content revision are
generally more suitable for the course concept than courses or subjects with a high need for topicality.
Another subject-specific aspect is the degree of interaction within a course. If there is a high degree of
cooperative work and discussion, a lecture recording only makes sense if the contributions of the active participants
are included in the video. This means an increased effort for producing the lecture recordings (possibly an additional
camera for the auditorium, mobile microphones, and a higher complexity in post-production). Apart from the
technical production, the students' willingness to contribute actively might decrease due to inhibitions if they are
aware of the recording.
A third aspect that could lead to a modification of the teaching concept is the substantive focus of the
course. For courses in the humanities or social sciences, lecture recordings could gain importance for exam
preparation. However, online exercises in the form of multiple-choice questions or other machine-readable question
forms are often unsuitable in these subjects for an automatic correction procedure. In technical subjects, multiple-
choice questions or text fields can be used well to check learning progress. Alternative examination forms for a
knowledge survey, e.g. open-answer questions, can hardly be evaluated automatically; the necessary correction
effort would thus increase disproportionately on the instructors' side. In order not to lose the added value of online
exercises completely, the coaching sessions could in these cases be expanded in scope and modified in content, e.g.
a pure question-and-answer session could be replaced by a kind of interactive training lesson.
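The automatic correction of machine-readable question forms mentioned above reduces to comparing answer sets. The sketch below is a hypothetical illustration: the question identifiers and the extra-credit rule (a bonus for reaching 80% in a weekly assessment) are our assumptions, not the actual configuration of our online assessments.

```python
def grade_mc_quiz(answer_key, submission):
    """Score a multiple-choice quiz automatically.

    answer_key -- dict: question id -> set of correct options
    submission -- dict: question id -> set of options chosen by the student
    A question counts as correct only if the chosen set matches exactly.
    Returns (points, total number of questions).
    """
    points = sum(1 for qid, correct in answer_key.items()
                 if submission.get(qid, set()) == correct)
    return points, len(answer_key)

def extra_credit(points, total, threshold=0.8, bonus=0.3):
    """Illustrative rule: reaching 80% in a weekly quiz earns a grade bonus."""
    return bonus if total and points / total >= threshold else 0.0

# Hypothetical answer key and submission:
key = {"q1": {"a"}, "q2": {"b", "c"}, "q3": {"d"}}
pts, total = grade_mc_quiz(key, {"q1": {"a"}, "q2": {"b"}, "q3": {"d"}})
print(pts, total)                # -> 2 3 (q2 is only partially answered)
print(extra_credit(pts, total))  # -> 0.0 (below the 80% threshold)
```

For open-answer questions no such exact-match rule exists, which is precisely why manual correction effort grows in non-technical subjects.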
Although there is a desire for lecture recordings on the students' side, there are still various concerns on the
lecturers' side regarding the recording of their person and their lectures. This is partly due to the associated
transparency to which teachers expose themselves when their lectures are published. A second issue is legal in
nature: teachers are not infallible, and errors in a lecture recording might have legal consequences for students and
teachers with regard to examinations. The discussion about the quality of courses becomes more complex with the
use of lecture recordings. A certain inertia in the adoption of such forms of teaching is therefore understandable and
certainly human, even if the quality of the curriculum could be demonstrably improved. Moreover, one important
aspect is the still unresolved question of how electronic teaching methods count toward the teaching load of
lecturers. Although there are some proposals to modify the teaching load calculation in Germany (e.g. HRK (2005),
Hener et al. (2005)), at many universities the common flat-rate calculation, based on the hours taught in a classroom,
is still standard. Innovative teaching concepts in which significant parts of the course take place outside the lecture
hall are difficult to incorporate.
Our experience has shown that students struggle in particular with the self-organization of their
learning processes (Wichelhaus et al. (2008b)). The motivation for continuous learning is misunderstood as
additional overhead and has a negative connotation. However, it turned out that the added value of the concept can
only develop fully if students are aware of the need for a targeted, self-organized learning structure. Self-organized
learning also includes an open approach to individual weaknesses in understanding: the learner must have the
courage to fill his or her knowledge gaps by asking. Within the coaching sessions it must be the objective of the
teacher to create an atmosphere in which an open discussion can take place and the students are encouraged to ask
their questions. The traditional lecture formats at a university often lack a culture of open questions; therefore the
acceptance of this kind of individualized course, based on self-organized learning processes, is particularly low.
Finally, self-organized learning can only take place effectively when personal interest in a specific topic exists.
Unfortunately, such interest is sometimes viewed with a negative association of careerism or elite consciousness,
which in turn inhibits the development of an autonomous learning process. This has some consequences for the
lectures. At the beginning of a lecture, clear communication about the teaching concept is necessary: how the
students can benefit from the concept and which added values they can gain. There is a role switch for the lecturer;
the usual barriers and hierarchies must be overcome. The lecturer is more a coach at the students' side than a
"knowledge presenter"; he or she accompanies the students on their individual learning paths. Students
must be made aware that the ability to learn in a self-controlled and continuous manner represents an important core
competence (EU (2006)).
The pressure on students has increased significantly with the switch from Diploma to Bachelor degree
programs in Europe, and especially in Germany. This is accompanied by a passive attitude of students toward
learning contexts: there is a tendency to organize the learning process with little self-effort and very time-efficiently
in order to reach short-term learning goals, i.e. to pass the next exam. Only in a few cases do these degrees of
freedom spark the motivation for a continuous learning behaviour, so incentives are needed to provide this
motivation. Self-organized learning must be learned. It is therefore desirable that course concepts that focus on the
students' self-organized learning process do not remain singularities within an overall study program.

Transfer and university support structure


We implemented the ICM concept at a university with relatively small numbers of students in each
single course (usually between 30 and 60). However, the concept offers the potential to be used in courses with
larger numbers of students. This results in new opportunities and new challenges for students as well as for teachers.
Students are required to prepare the lectures autonomously and in a self-organized way. Clear and consistent
communication is required from the teacher, so that the students recognize the need for the course preparation and
its potential for their individual learning success. Our experience shows that a targeted communication concept is
essential to motivate the students, so that the independent and self-organized learning results in successful learning.
Particularly from the viewpoint of the ever-demanded lifelong learning, which will often be self-organized learning,
this self-learning skill must be trained; this teaching goal is not yet promoted widely enough. For the teaching staff
there is, on the one hand, the challenge of facing a lecture recording; on the other hand, the concept offers the
teacher the opportunity to conduct an in-depth, interactive discussion process with the students during the in-class
sessions and to shift toward a problem-based learning scenario. Because of the out-of-class preparation phase, the
students think about the content in advance and prepare questions specifically; therefore the discussion about the
content, and thus the entire session, can take place at a qualitatively higher level.
The costly media production of electronic learning material is one issue that prevents lecturers from switching
to an electronically supported learning concept. Here, the university must provide adequate teaching load allocation
models for teachers and appropriate support services. The quality assurance of such concepts is important and time-
consuming. We can identify gaps and deficits in the students' knowledge by analysing the questions raised in the
coaching sessions and the mistakes made in the online assessments. These results can be incorporated directly into
the updating and improvement of the teaching material for the purpose of quality assurance.
Regular operation of lecture concepts with intensive media support requires resources of the university
and cannot be left to the lecturer alone. Especially when a large number of high-quality lecture recordings has to be
produced and provided, a supporting institution is required to ensure regular operation. In particular, the technology
for recording and distribution is a continuous challenge: before the semester begins, the recording technology must
be maintained and the already existing material must be provided; during the lecture period, new recordings have to
be made. At our university we use trained student assistants for this recording operation. The central organization,
the training of support staff, and the maintenance and support of the technology are performed by the e-Learning
Competence Center (eLCC) of the university. Moreover, this unit is responsible for all services for consulting,
teaching staff training and media production for learning purposes.

Conclusions

We presented the development of an Inverted (or Flipped) Classroom teaching concept at a university
in a computer science study program in two courses. Building blocks of the concept are video-based lecture
recordings and coaching sessions in which the lecturer switches from a knowledge presenter to a coach for the
students' individual learning paths. Universities are asked to provide a service infrastructure to motivate more
lecturers to adopt this kind of learning scenario.

References

Bachmann, G., Dittler, M., Lehmann, T., Glatz, D., Rösel, F. (2001). Das Internetportal LearnTechNet der Uni
Basel: Ein Online-Supportsystem für Hochschuldozierende im Rahmen der Integration von eLearning in
die Präsenzuniversität. In Haefeli, O., Bachmann, G. & Kindt, M. (Hrsg.), Campus 2002 - Die Virtuelle
Hochschule in der Konsolidierungsphase (S. 87-97). Münster: Waxmann-Verlag.
Baker, J. (2000). "The Classroom Flip": Using Web Course Management Tools to Become the Guide by the Side.
Selected Papers from the 11th International Conference on College Teaching and Learning (2000): 9-17.
Ballstaedt, S. (1997). Wissensvermittlung. Weinheim: Beltz Psychologie Verlags Union.
Bergmann, J., Sams, A. (2012). Flip Your Classroom. International Society for Technology in Education.
Bishop, J., Verleger, M. (2013). The Flipped Classroom: A Survey of the Research. 120th ASEE Annual
Conference & Exposition, June 2013.
Braun, I., Ritter, S., Vasko, M. (2014) Inverted Classroom by Topic - A Study in Mathematics for Electrical
Engineering Students, International Journal of Engineering Pedagogy, Vol 4(3), 2014,
http://dx.doi.org/10.3991/ijep.v4i3.3299
Engelbert, B., Greweling, C. & Morisse, K. (2013). The Use and Benefit of a Xbox Kinect based Tracking System
in a Lecture Recording Service. In Jan Herrington et al. (Eds.), Proceedings of World Conference on
Educational Multimedia, Hypermedia and Telecommunications 2013 (pp. 179-184). Chesapeake,
VA: AACE.
EU (2006). Empfehlung des Europäischen Parlaments und des Rates zu Schlüsselkompetenzen für
lebensbegleitendes Lernen. Amtsblatt der Europäischen Union (2006/962/EG), 30.12.2006.
Hener, Y., Handel, K., Voegelin, L. (2005). Teaching Points als Maßstab für die Lehrverpflichtung und
Lehrplanung. CHE Arbeitspapier Nr. 69, Gütersloh.
HRK (2005). Entschließung des 204. Plenums der HRK vom 14.06.2005: Empfehlungen zur Sicherung der Qualität
von Studium und Lehre in Bachelor- und Masterstudiengängen.
http://www.hrk.de/de/beschluesse/109_2628.php?datum=204.+HRK-Plenum+am+14.+Juni+2005
(4.8.2008).
Hermann, C., Lauer, T., Trahasch, S. (2006). Eine lernerzentrierte Evaluation des Einsatzes von
Vorlesungsaufzeichnungen zur Unterstützung der Präsenzlehre. In M. Mühlhäuser, G. Rößling, R.
Steinmetz (Hrsg.), DeLFI 2006 - 4. e-Learning Fachtagung Informatik (S. 39-50). Bonn: Köllen Verlag.
Hermann, C., Welte, M., Latocha, J., Wolk, C., Huerst, W. (2007). Eine logfilebasierte Evaluation des Einsatzes
von Vorlesungsaufzeichnungen. In C. Eibl, J. Magenheim, S. Schubert, M. Wessner (Hrsg.), DeLFI 2007 -
5. e-Learning Fachtagung Informatik (S. 151-160). Bonn: Köllen Verlag.
Kerres, M. (2012). Mediendidaktik (3. Aufl.). München: Oldenbourg Verlag.
Ketterl, M., Mertens, R., Morisse, K. & Vornberger, O. (2006). Studying with Mobile Devices: Workflow and Tools
for Automatic Content Distribution. In Proceedings of Ed-Media 2006 - World Conference on Educational
Multimedia, Hypermedia and Telecommunications (pp. 2082-2088). Chesapeake, VA: AACE.
Ketterl, M., Schmidt, T., Mertens, R. & Morisse, K. (2006). Techniken und Einsatzszenarien für Podcasts in der
universitären Lehre. In C. Rensing (Hrsg.), Proceedings der Pre-Conference Workshops der 4. e-Learning
Fachtagung der GI (DeLFI) (S. 81-90). Berlin: Logos-Verlag.
Krüger, M. (2005). Vortragsaufzeichnungen - Ein Querschnitt über die pädagogischen Forschungsergebnisse. In H.
Horz, W. Huerst, T. Ottmann, C. Rensing, S. Trahasch (Hrsg.), eLectures - Einsatzmöglichkeiten,
Herausforderungen und Forschungsperspektiven (S. 25-30). Workshop im Rahmen der GMW- und DeLFI-
Jahrestagung 2005. Berlin: Logos-Verlag.
Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning
environment. Journal of Economic Education, 31, 30-43.
Mazur, E. (1997). Peer Instruction: A User's Manual, Series in Educational Innovation, Prentice Hall, Upper Saddle
River, NJ, 1997.
Mertens, R., Ketterl, M. & Vornberger, O. (2007). The virtPresenter lecture recording system: Automated
production of web lec- tures with interactive content overviews. International Journal of Interactive
Technology and Smart Education (ITSE), 4 (1). February 2007. Troubador publishing, UK. S. 55-66.
Morisse, K. & Ramm, M. (2007). Teaching via Podcasting: One year of One year of Experience with Workflows,
Tools and Usage in Higher Education. In Proceedings of ED-Media 2007, World Conference on
Educational Multimedia, Hypermedia and Telecommunications, 2007 (pp. 2081 - 2088). Chesapeake, VA:
AACE.
Morisse, K (2013) Playlist Theoretische Informatik, http://bit.ly/1abFjuL (30.03.2015)
Opencast (2015) Matterhorn Lecture Capture & Video Management for Education, http://opencast.org/matterhorn/
(30.03.2015)

195

Persike, M. (2015) Inverted Classroom unter der Lupe. ICM and Beyond - Lehren und Lernen im 21. Jahrhundert.
February, 2015. https://www.youtube.com/watch?v=Wy99mbVOmdI (Access: 30.03.2015)
Strayer, J. (2007) The Effects of the Classroom Flip on the Learning Environment: A Comparison of Learning
Activity in a traditional Classroom and a Flip Classroom that used an Intelligent Tutoring System. Ph.D.
Dissertation, Ohio State University, Columbus, OH, USA, 2007.
Wichelhaus, S., Schler, T., Ramm, M. & Morisse, K. (2008). More than Podcasting: An evaluation of an integrated
blended learning scenario. Proceedings of ED-Media 2008, World Conference on Educational Multimedia,
Hypermedia and Telecommunications, 2008 (pp. 4468 4475), Chesapeake, VA: AACE.
Wichelhaus, S., Schler, T., Ramm, M. & Morisse, K. (2008a). Weg von der klassischen Frontalveranstaltung
Podcasts, Live-Coaching und Onlinetests als integrale Veranstaltungselemente in der Lehre. Delfi 2008 - 6.
e-Learning Fachtagung der GI, S. 209 - 220, Lecture Notes in Informatics, Vol. 132, Gesellschaft fr
Informatik, Bonn, 2008.
Wichelhaus, S., Schler, T., Ramm, M. & Morisse, K. (2008b). Medienkompetenz und selbstorganisiertes Lernen
Ergebnisse einer Evaluation. In S. Zauchner, P. Baumgartner, E. Blaschitz, A. Weissenbck (Hrsg.):
Offener Bildungsraum Hochschule, S. 124 - 133, Waxmann-Verlag, Mnster, 2008.
Zappe, S., Leicht, R., Messner, J., Litzinger, T., and Lee, H.W. (2009) ""Flip- ping the classroom to explore active
learning in a large undergraduate course, in Proc. ASEE Conf., Austin, TX, 2009, p. AC2009-92.

196
21. Online mentoring for secondary pre-service teachers in regional, rural or
remote locations
Petrea Redmond
School of Teacher Education and Early Childhood
University of Southern Queensland, Australia
redmond@usq.edu.au

Introduction
Many teacher education programs are now offered online with pre-service teachers located in rural and
remote areas. Due to a range of circumstances, these pre-service teachers are unable to relocate to areas that would
reduce their professional isolation. Pre-service teachers in rural and remote locations often lack the
opportunity to interact and engage in professional learning dialogue with teaching professionals within their
disciplines; for example, there is often no specialist Physics teacher in a small rural high school. An online mentoring
project was established to enable rural and remote pre-service teachers to benefit from the ability to engage with
practising teachers for both professional and academic purposes. This innovative project provided online mentoring
that extended beyond the traditional educational boundaries to enhance support for pre-service educators while
developing partnerships with teachers in the field.
Darling-Hammond (2000, 2010, 2012) claimed that traditional teacher education programs are not
responsive to the present-day demands of teaching and are often distant from contemporary practice. She also
suggested that the quality of initial teacher preparation and ongoing mentoring can influence teacher
competence and reduce attrition from the profession.
The multiple reasons for beginning teachers leaving the profession are well documented (Ashiedu & Scott-
Ladd, 2012; Buchanan, 2010; Fetherston & Lummis, 2012; Schuck et al., 2011), however, the crucial reason for
leaving the profession is the lack of professional support (Buchanan, 2010; Roberts, 2004; Scheopner, 2010; Schuck
et al., 2011). The lack of support is heightened for those located in rural or remote locations where there are fewer
experienced teachers in their geographical proximity, and where access to secondary teachers with knowledge in a
similar discipline area can be more problematic.
Another significant reason causing beginning teachers to leave the teaching profession is the increased
workload and complexity of teachers' work (OECD, 2005). There is additional pressure to gain understanding of an
increased volume of knowledge and to be able to apply it in practice (Maher & Macallister, 2013), including the
expectation to modify curriculum for diverse learners' needs, and increased technology and pedagogy knowledge
and practice. These increased pressures on beginning teachers are exacerbated for those located in rural and remote
locations, where they are expected to assume additional roles and responsibilities and often teach outside their
subject area in their early years in the profession (Roberts & Lean, 2005).
Studies show that the key support required by beginning teachers includes mentoring and structured
supervision (Buchanan, 2010; Commonwealth of Australia, 2013; Joseph, 2011; Schuck et al., 2011). In their report
on attrition of recent graduates, the Queensland College of Teachers (2013) found that providing adequate support
"such as structured induction, mentoring from suitable experienced teachers and resources for all graduate teachers,
including those employed as casual/relief teachers and on temporary contracts" (p. 47) would have changed their
experience and impacted on their decision to leave the profession.
Secondary teachers in rural and remote locations have "decreased contact and networking with teachers in
the same subject area from other schools" (Roberts, 2004, p. 10). This project established discipline-specific online
mentoring for secondary pre-service teachers. The one-to-many discipline specific online mentoring program
harnessed the potential of technology to reduce feelings of isolation and to enhance access to information and
networking opportunities with experienced teachers. Pre-service teachers were able to access mentors from any
location rather than solely within their local area.

Online mentoring
Online mentoring, E-mentoring or virtual mentoring has been described as the "use of e-mail or computer
conferencing systems to support a mentoring relationship when a face-to-face relationship would be impractical"
(O'Neil, Wagner, & Gomez, 1996, p. 39). Hunt, Powell, Little and Mike (2013) suggested that E-mentoring
also facilitates "a medium of exchange between mentor and mentee that is less threatening and non-
confrontational, conducive to building a community of learners" (p. 288).
Technology now makes it possible for mentoring relationships to occur between those within different
geographical locations, at a time and place convenient for both parties. It can occur synchronously or
asynchronously. Asynchronous mentoring provides the parties with time to create a reflective response, and an
opportunity to research prior to responding rather than giving an off-the-cuff response. Asynchronous text mentoring
also provides a written and lasting record of the conversations, which can be reviewed over time.
Other advantages online mentoring has over face-to-face mentoring include (Eby, 1997; Ensher, Heun, &
Blanchard, 2003; Gutke & Albion, 2008; Mueller, 2004):
- Logistical convenience, as no travel is required and parties need not be in the same location;
- Flexibility of access, as it occurs at a time convenient to participants and can occur on a device such as the
smartphone already in their pocket or handbag;
- Status differences being less obvious in E-mentoring;
- Some degree of anonymity and a less threatening environment, which can encourage mentees to ask
questions they are less likely to ask in person;
- Reduced costs, with no travel or time away from the job;
- Scalability: infrastructure is in place irrespective of the number of participants; and
- The capacity of group online mentoring to contribute to the development of a community of learners.
Some deterrents to online mentoring include items that also constrain traditional face-to-face
mentoring (Ensher et al., 2003; Mueller, 2004):
- Availability of suitable mentors;
- Ongoing commitment from all parties;
- Ongoing access to ICT resources: relationships may need to be re-established after a long break;
- Potential for miscommunication due to the lack of non-verbal cues; and
- The need for mentoring programs to be managed, planned and implemented, including training and
ongoing contact with participants.
Many pre-service teachers study online and don't have access to a teacher with discipline expertise in their
local area. The online mentoring project was established to provide all pre-service teachers access to a discipline
expert, irrespective of their location. Stanulis and Floden's (2009) study of beginning teachers within the United
States found that mentoring advantages included "an opportunity to share ideas, resources, and advice; an
opportunity to hear from other new teachers who were going through similar struggles; and the increased openness
to try new things in their practice" (p. 119).
The mentoring in this project included peer mentoring and group mentoring with an experienced teacher.
Lateral or peer mentoring provides job related skill development rather than career related development, and can
elicit emotional support, friendship, and feedback through a reciprocal relationship (Eby, 1997). The one-to-many
mentoring within the group arrangement provided pre-service teachers an opportunity to hear from other pre-service
teachers as well as an opportunity to access the support of an established teacher, all of whom worked in the same
discipline area. Dansky (1996) suggested that an important element of group mentoring is the dynamics of the group
and the collective behaviours that are not present in a one-to-one relationship. The social dynamics include
"polarization, conformity, communication flows, and social networks" (p. 7), which, again, are less obvious in
one-to-one mentoring relationships.

Computer Conferencing and Content Analysis Conceptual Framework


Online discussions are not constrained by space, location or time. The online environment enables multiple
contributions by all learners, unlike face-to-face environments where "it would be physically impossible for all
learners to have their say" (Henri, 1992, p. 118). This adds to the richness and diversity of perspectives within the
dialogue.
To understand both the social and cognitive learning processes in online discussions, Henri (1992) developed a
Computer Conferencing and Content Analysis framework. This framework has five dimensions:
1. Participative: the number of statements made by participants;
2. Social: statements not related to the formal content;
3. Interactive: statements referring implicitly or explicitly to other statements or participants;
4. Cognitive: statements which demonstrate knowledge and skills related to content and learning processes;
and
5. Metacognitive: statements which identify personal characteristics, knowledge and skills which hinder or
enhance personal learning processes and task completion.
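In an analysis of this kind, human raters assign one of Henri's dimensions to each post and the codes are then tallied. A minimal sketch of that tallying step follows; the `tally` helper and the sample codes are hypothetical illustrations, not part of the original study:

```python
from collections import Counter

# Henri's (1992) five dimensions, as listed above.
DIMENSIONS = (
    "participative",   # number of statements made by participants
    "social",          # statements not related to the formal content
    "interactive",     # statements referring to other statements or participants
    "cognitive",       # knowledge and skills related to content and learning
    "metacognitive",   # statements about one's own learning processes
)

def tally(coded_posts):
    """Count rater-assigned dimension codes, one code per post."""
    counts = Counter(code for code in coded_posts if code in DIMENSIONS)
    return {d: counts[d] for d in DIMENSIONS}

# Hypothetical sample of rater-assigned codes for five posts:
sample = ["cognitive", "cognitive", "interactive", "social", "cognitive"]
print(tally(sample))
```

In the study itself the cognitive dimension was further split into surface and in-depth processing, which a fuller coding scheme would represent as sub-codes.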

The cognitive dimension explores the ways people are engaging and learning within the dialogue. It
includes clarification questions, making inferences or judgments, and proposing strategies or actions. Henri (1992)
suggested that there are two levels of cognition: surface and in-depth. Surface processing includes posts that:
repeat information without offering interpretation or new ideas; concur with others without adding personal
comments; and propose solutions without explanation, or offer multiple solutions without judging their
suitability. In-depth processes are more complex. They include posts that: link ideas and facts to support
interpretation and judgment; offer and elaborate on new information; propose solutions with justification; compare
and contrast ideas or solutions; provide evidence to support claims; and look at the problem from a big-picture
perspective.
This framework assists educators to understand the pedagogical and learning outcomes of online
discussions. In an effort to explore the depth of discussion within the mentoring relationships in this project, the
online discussions have been analysed using Henri's (1992) Computer Conferencing and Content Analysis
framework.

Context and Method


Teacher mentors in different disciplines were identified through the researcher's professional networks and
the professional experience office. Teachers were invited to volunteer to take on the role of mentor in their discipline
area (e.g. Mathematics, English, Science, Business, and Computing). The mentors were identified as effective
practitioners and curriculum experts by their peers or academics and came from a range of disciplines. The mentors
were either located in rural or remote schools or had past experience teaching in such locations. The role of
the mentor was to assist beginning teachers in the ongoing development of their discipline-specific pedagogical
content knowledge, to answer questions, to share their experiences of issues related to being located in rural or
remote locations, and to support the mentees as beginning professionals. The teacher mentors received a small
payment for their time from a grant and worked online for approximately one hour per week across two semesters.
The pre-service teacher mentees were volunteers from the Bachelor of Education (BEDU) Secondary
specialisation across all year levels and the Graduate Diploma of Learning and Teaching (GDTL) Secondary
specialisation, who were completing their professional experience in rural or remote areas.
Data came from archived online discussions within a Wikispace area set up for the project. The
asynchronous discussions were analysed to explore the contributions from the mentors and the mentees. After the
completion of the year and finalisation of results, the online discussion posts were downloaded and de-identified
prior to data analysis. The data was analysed using Henri's (1992) Computer Conferencing and Content Analysis
Conceptual Framework, described above. The archived postings in the community online discussion area were
coded to identify the dimensions from the framework.
A second data source was interviews with participants. Three student mentees and five teacher mentors
were interviewed. The research questions explored in this project were:
- How do pre-service teachers and teachers respond to online mentoring?
- What types of cognitive presence do pre-service teachers show within online mentoring?

Results and Discussion


During this project, mentees and mentors across 10 different curriculum areas generated 578 posts and
12,832 views over an eight-month period. There were 10 mentors and 50 mentees, which suggests an average of 9.6
posts and over 213 views of the posts for each participant. The high number of views indicates that the pre-service
teachers were very interested in the types of discussions that were occurring; however, they were hesitant to post their
own perspectives. Redmond, Devine and Bassoon (2014) revealed that "lurkers can learn vicariously by reading
the contributions of other students" (p. 123), and the data appear to support the conclusion that although participants
were not contributing, they were interested in the topics under discussion. Figure 1 presents the breakdown of the
posts over the different dimensions of Henri's (1992) framework.
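The per-participant averages quoted above follow directly from the reported totals; as a quick arithmetic check using the figures from the text:

```python
# Totals reported for the eight-month project.
posts = 578
views = 12832
participants = 10 + 50  # 10 mentors + 50 mentees

posts_per_participant = posts / participants
views_per_participant = views / participants

print(round(posts_per_participant, 1))  # 9.6 posts per participant
print(round(views_per_participant, 1))  # 213.9 views per participant
```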

The most common topics of conversation included homework expectations, student engagement and
content relevance, literacy across the curriculum, classroom management, professional experience expectations, and
pedagogical approaches, all of which came through the lens of a specific discipline. The topics under discussion were
driven by the mentees.
Less than 1% of the posts were social in nature. Given the geographical spread of the mentees and the
lack of prior relationships between mentees and mentors, this result was surprising. Perhaps the percentage was low
because the facilitator did not establish a specific forum or protocol for this purpose. That said, there were
elements of social activity within many of the posts coded at higher levels.
Interactive posts were those that responded to, or made commentary on other posts without having a
significant cognitive element. 15% of the posts were coded as interactive rather than monologic (Gunawardena,
Lowe, & Anderson, 1997) where there is no ongoing post/response cycle. This result is similar to the results of
Pawan, Paulus, Yalcin and Chang (2003) who also reported low levels of interactive posts. In contrast, McKenzie
and Murphy (2000) had 74% of posts coded at the interactive dimension.
Cognitive posts have been further broken down into surface and in-depth, which are seen as different skills
connected to the learning process, and which impact on understanding, reasoning, and critical thinking. Posts coded
as surface processing are low-level posts, as opposed to in-depth posts, which required the learner to evaluate
information, to organise it and to compare it to previous understandings. The majority of the posts (64%) were
evaluated as being cognitive surface posts, which shared ideas, experiences or opinions without offering alternatives,
justifications or explanations. The second most common type of posts (19%) were cognitive in-depth posts, which
were more complex. Again, this contrasts with McKenzie and Murphy (2000), who reported 22% of posts at the
surface level and three times as many at the deeper processing level.
When completing the data analysis the researcher, an educator within the secondary program, noted that the
contributions were at a much higher level than would normally be present within general discussion forums related
to a specific course. This raises the question of whether the higher cognitive levels of interaction displayed in the
space were due to the perception that participating mentors might be a more authentic audience.
Metacognitive posts made up only 2% of the total. These posts included elements that identified
metacognitive knowledge of the self, task or strategies, or metacognitive skills of evaluation, planning, regulation
and self-awareness. Gunawardena, Lowe, and Anderson (1997) also found low levels of metacognition; however,
McKenzie and Murphy (2000) found that 16% of the posts in their study were coded as metacognitive, perhaps
because they used prompts within the online weekly activities.
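Since the section reports the coding results as percentages of the 578 posts, the approximate per-dimension counts can be back-calculated. This is only a rough reconstruction: the exact counts appear in Figure 1, and the 0.5% stand-in for "less than 1%" social posts is an assumption:

```python
total_posts = 578

# Percentages reported in the discussion above.
reported_pct = {
    "social": 0.5,            # stand-in for "less than 1%"
    "interactive": 15,
    "cognitive surface": 64,
    "cognitive in-depth": 19,
    "metacognitive": 2,
}

approx_counts = {k: round(total_posts * p / 100) for k, p in reported_pct.items()}
print(approx_counts)  # cognitive surface works out to roughly 370 posts
```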

[Figure 1: bar chart of the number of posts at each dimension (Social, Interactive, Cognitive surface,
Cognitive in-depth, Metacognitive); y-axis: Number of Posts, 0-400.]
Figure 1: Analysis of discussion posts using Henri's (1992) content analysis framework
From the interview data it was clear that both the mentors and the mentees responded positively to the
online mentoring. Mentor A commented "I certainly did enjoy it", supported by Pre-Service Teacher X, who stated "I
thought it was a good idea". Mentor E suggested it was "an interesting approach" but was frustrated with the
number of lurkers and questioned how we might get them to participate rather than observe. The positive
comments from participants support Hew and Knapczyk's (2007) study, which found that both the mentors and the
mentees benefited from the mentoring relationship.
The pre-service teachers enjoyed the access to others, both peers and experts, who had the same discipline
focus but were located beyond their geographical area. Student X revealed "I have been learning to appreciate the
extent of experience gained from others outside my geographical area". Mentor A found that the project was "a good
way of communicating with students that you do not normally have contact with".
The online space provided the flexibility of participation at any time, in any place. Mentor E appreciated
"the flexibility of time", while Mentor B suggested "it wasn't hard to participate but it was time consuming" and
required an ongoing commitment. The pre-service teachers also commented on time, Student Y suggesting that
being "too busy with assessment and other uni related tasks made me less inclined to spend what little spare time I
had pursuing more computer work".
Some pre-service teachers and mentors engaged in the process more than others. Pre-service teacher Z
suggested "I was a bit more proactive than the mentor, I think". In contrast, Mentor C suggested that it should be
made compulsory "to require an entry every week". Having an assessment obligation would have increased the
number of posts in the mentoring space. Mentor D commented that it was "good to be able to give advice and then
have students try out suggestions on prac or reflect that they had seen others do similar things. Basically it was the
benefit of reflection and discussion". Mentor A was "a bit disappointed in terms of the amount of student response",
adding, "I expected more questions".
The participants agreed that the depth of discussion varied and depended on the topic. Mentor D
commented that "I would comment and then other students would also comment on the posts of fellow students, so
the conversation became quite focused and specific". Whiting and de Janasz (2004) also observed the importance of
asking effective questions, which should be personally meaningful and open-ended to gain the most comprehensive
responses.

Outcomes
A number of outcomes can be identified from this study, which explored online mentoring of
secondary pre-service teachers. Firstly, online mentoring can occur in an open online space with one-to-many
participants. Secondly, secondary pre-service teachers benefit from having access to discipline expertise throughout
their program, not just during their professional experience. Thirdly, like online teaching, online mentoring
benefits from focus questions or stimulus material that require the pre-service teachers to respond at high cognitive
and metacognitive levels.
The results of this study exploring online mentoring are limited to secondary teacher education in one
regional university in Australia, which provides limited opportunity for generalisation. Future study in this area
could include consideration of the personal and professional learning of both the mentors and the mentees, and
exploration of the differences between mentoring conversations in closed and open spaces.

Conclusion
In an effort to further support the ongoing learning, and reduce the attrition, of rural and remote pre-service
teachers, an online mentoring project was established. The goal was to reduce professional isolation and to
enhance the discipline pedagogical content knowledge of the pre-service teachers by providing one-to-many
mentoring in a range of discipline areas.
In answering the first research question, "How do pre-service teachers and teachers respond to online
mentoring?", pre-service teachers and practising teachers responded positively to the opportunity to be involved in
this online mentoring project. The mentoring questions and concerns were driven by the pre-service teachers rather
than by a course, an academic, or the mentors. When exploring the second research question, "What types of
cognitive presence do pre-service teachers show within online mentoring?", it was clear that the majority of the posts
by the participants within the online mentoring project were classified as cognitive at the surface level. Perhaps
higher levels of participation and depth of conversation would have occurred if stimulus or starter questions had
seeded the discussions initially, rather than having the conversation driven solely by the pre-service teachers'
interests and issues.
Mentoring can be used to build pre-service teachers' capacity and to develop competence while they
discuss, debate, give examples and theorize about their concerns as novice teachers with their peers and experienced
teachers. Secondary pre-service teachers in rural and remote areas often have limited opportunity for this interaction
at a local level, and online mentoring can provide access to experienced educators, which can leverage effective
growth and development for novice educators.

Acknowledgment
This research was supported by a University of Southern Queensland Early Career Researcher grant.

References
Ashiedu, J., & Scott-Ladd, B. (2012). Understanding Teacher Attraction and Retention Drivers: Addressing Teacher
Shortages. Australian Journal of Teacher Education, 37(11). doi: 10.14221/ajte.2012v37n11.1
Buchanan, J. (2010). May I be excused? Why teachers leave the profession. Asia Pacific Journal of Education,
30(2), 199-211. doi: 10.1080/02188791003721952
Commonwealth of Australia. (2013). Teaching and Learning maximising our investment in Australian schools. In
Senate Standing Committee on Education Employment and Workplace Relations (Ed.). Canberra,
Australia: Senate Printing Unit.
Dansky, K. H. (1996). The effect of group mentoring on career outcomes. Group & Organization Management,
21(1), 5-21. doi: 10.1177/1059601196211002
Darling-Hammond, L. (2000). How teacher education matters. Journal of Teacher Education, 51(3), 166-173. doi:
10.1177/0022487100051003002
Darling-Hammond, L. (2010). Teacher education and the American future. Journal of Teacher Education, 61(1-2),
35-47. doi: 10.1177/0022487109348024
Darling-Hammond, L. (2012). The right start: Creating a strong foundation for the teaching career. Phi Delta
Kappan, 94(3), 8-13.
Eby, L. T. (1997). Alternative forms of mentoring in changing organizational environments: A conceptual extension
of the mentoring literature. Journal of Vocational Behavior, 51(1), 125-144.
Ensher, E. A., Heun, C., & Blanchard, A. (2003). Online mentoring and computer-mediated communication: New
directions in research. Journal of Vocational Behavior, 63(2), 264-288. doi: 10.1016/S0001-
8791(03)00044-7
Fetherston, T., & Lummis, G. (2012). Why Western Australian Secondary Teachers Resign. Australian Journal of
Teacher Education, 37(4). doi: 10.14221/ajte.2012v37n4.1
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online debate and the development
of an interaction analysis model for examining social construction of knowledge in computer conferencing.
Journal of Educational Computing Research, 17(4), 397-431.
Gutke, H., & Albion, P. (2008). Exploring the worth of online communities and e-mentoring programs for beginning
teachers. In K. McFerrin, R. Weber, R. Carlsen & D. A. Willis (Eds.), Society for Information Technology
& Teacher Education International Conference (Vol. 2008, pp. 1416-1423). Chesapeake, VA: AACE.
Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative Learning
Through Computer Conferencing: The Najaden Papers (pp. 117-136). Berlin: SpringerVerlag.
Hew, K. F., & Knapczyk, D. (2007). Analysis of ill-structured problem solving, mentoring functions, and
perceptions of practicum teachers and mentors toward online mentoring in a field-based practicum.
Instructional Science, 35(1), 1-40. doi: 10.1007/s11251-006-9000-7
Hunt, J. H., Powell, S., Little, M. E., & Mike, A. (2013). The Effects of E-Mentoring on Beginning Teacher
Competencies and Perceptions. Teacher Education and Special Education, 36(4), 286-297. doi:
10.1177/0888406413502734
Joseph, D. (2011). Early Career Teaching: Learning to be a teacher and staying in the job. Australian Journal of
Teacher Education, 36(9). doi: http://dx.doi.org/10.14221/ajte.2011v36n9.5
Maher, M., & Macallister, H. (2013). Retention and Attrition of Students in Higher Education: Challenges in
Modern Times to What Works. Higher Education Studies, 3(2), 62-73. doi: 10.5539/hes.v3n2p62
McKenzie, W., & Murphy, D. (2000). "I hope this goes somewhere": Evaluation of an online discussion group.
Australian Journal of Educational Technology, 16(3), 239-257.
Mueller, S. (2004). Electronic mentoring as an example for the use of information and communications technology
in engineering education. European journal of engineering education, 29(1), 53-63. doi:
10.1080/0304379032000129304
O'Neil, D. K., Wagner, R., & Gomez, L. (1996). Online mentors: Experimenting in science class. Educational
Leadership, 54(3), 39-42.

OECD. (2005). Teachers Matter: Attracting, developing and retaining effective teachers. Retrieved from
http://www.oecd.org/education/school/34990905.pdf
Pawan, F., Paulus, T. M., Yalcin, S., & Chang, C. F. (2003). Online learning: Patterns of engagement and interaction
among inservice teachers. Language Learning and Technology, 7(3), 119-140.
Redmond, P., Devine, J., & Bassoon, M. (2014). Exploring discipline differentiation in online discussion
participation. Australasian Journal of Educational Technology, 30(2), 122-135.
Roberts, P. (2004). Staffing an Empty Schoolhouse: Attracting and Retaining Teachers in Rural, Remote and
Isolated Communities. Sydney, Australia: NSW Teachers' Federation.
Roberts, P., & Lean, D. (2005, October). What does a successful staffing system for rural, remote and isolated
schools look like? Paper presented at the 2005 Society for the Provision of Education in Rural Australia,
Darwin, Australia.
Scheopner, A. J. (2010). Irreconcilable differences: Teacher attrition in public and catholic schools. Educational
Research Review, 5(3), 261-277. doi: 10.1016/j.edurev.2010.03.
Schuck, S., Aubusson, P., Buchanan, J., Prescott, A., Louviere, J., & Burke, P. (2011). Retaining Effective Early
career Teachers in NSW Schools. The report of a project commissioned by the NSW Department of
Education and Training. Retrieved from
http://www.rilc.uts.edu.au/pdfs/Beginning_Teacher_Retention_Report.pdf
Stanulis, R. N., & Floden, R. E. (2009). Intensive mentoring as a way to help beginning teachers develop balanced
instruction. Journal of Teacher Education, 60(2), 112-122. doi: 10.1177/0022487108330553
Whiting, V. R., & de Janasz, S. C. (2004). Mentoring in the 21st century: Using the internet to build skills and
networks. Journal of Management Education, 28(3), 275-293. doi: 10.1177/1052562903252639

22. Collaborative Learning Through Drawing on iPads

Michael Spitzer
Institute for Information Systems and Computer Media
Graz University of Technology
Austria
michael.spitzer@student.tugraz.at

Martin Ebner
Social Learning - Computer and Information Services
Graz University of Technology
Austria
martin.ebner@tugraz.at

Introduction
Collaborative learning describes situations in which a group of people tries to learn something together
(Dillenbourg, 1999). This concept can be transferred to Computer-Supported Collaborative Learning (CSCL) in
general.
Several studies demonstrate the suitability of computers to support collaborative learning (Haythornthwaite, 1999;
Zurita & Nussbaum, 2004). As devices gained more computational power and decreased in size, mobile
devices were also investigated for their effectiveness in collaborative learning (Mobile Device Collaborative
Learning, MDCL) (Ebner & Kienleitner, 2014). Mobile devices support face-to-face communication in collaboration
groups because the devices are smaller than PCs and, hence, the pupils can sit close to each other (Zurita &
Nussbaum, 2004).
On the other hand, the smaller display could be a drawback because of the lack of space. Therefore, a clean and
well-considered user interface is necessary (Danesh, Inkpen, Lau, Shu & Booth, 2001). Furthermore, special
usability issues have to be taken into account, e.g. button size and screen rotation (Huber & Ebner, 2013).
Nowadays, pupils know how to handle mobile devices very well. In the United States 52% of children aged between
0-8 years have already used a mobile media platform such as smart phones and tablets but only 28% of them have
ever used educational game apps (Rideout, 2013). Even among older children (pupils at lower secondary school in
Austria, age: 10-14) only 6% of the pupils use their device frequently for learning (Grimus and Ebner, 2014, p.
1603).
This issue should be targeted with well-designed and well-considered apps, which increase the educational value. In
this research study, we would like to introduce an innovative iPad app, called Teamsketch, which tries to fulfill these
requirements. Teamsketch is intended to serve as a collaborative drawing app for iPads, which can be used by
groups consisting of 2-4 group members.
Johnson, Johnson and Holubec (2002) define three different types of collaboration groups: (1) pupils work together for a short period of time (a few minutes to one hour), (2) pupils work together for a medium period of time to reach a collaborative goal, (3) pupils work together for a long period of time.
Teamsketch targets groups that work together for a medium period of time to reach a collaborative goal: one collectively finished sketch. Before beginning the development of such an app, several aspects have to be considered to assist teachers in using Teamsketch as a collaborative learning tool. First, the group roles should be assigned. In Teamsketch, one pupil acts as the leader, starts the sketch and sets the game parameters, such as sketch time, line guides and sketch name. The others join the created sketch/session. Then the task and the positive interdependence should be explained to the pupils. A task could be a sketch theme, such as "wood", "flower meadow" or "farm". The pupils should be aware that everyone can delete the strokes of others. Therefore, they have to draw carefully and should agree on a collaborative drawing strategy. An important task for the teacher is to observe and to intervene if a group has problems or needs assistance. After the pupils have finished the sketch, the teacher should evaluate the team performance and moderate the pupils' reflection discussion, as suggested by D. W. Johnson & R. T. Johnson (2008).


Research Design
In our research study we closely follow the prototyping approach. According to Alavi (1984) as well as Larson (1986), prototyping is based on four steps: identifying basic requirements, development of a working prototype, implementation and usage (field study), and revision.
The requirements have been defined as follows: the overall aim of this project was to develop a highly responsive sketch collaboration app for iPads. Synchronization of the devices should be performed in real time without the need for an Internet connection. Additionally, a web service and a web interface were required for uploading and downloading sketches. Furthermore, teachers should be able to access the sketches of their pupils.
Our research question is whether it is possible to develop an app that allows drawing together in real time. Furthermore, we would like to investigate what the interface needs to look like in order to assist pupils in an appropriate way. Finally, we will examine how the prototype works in an educational environment.

State of the Art


After analyzing three already available collaborative sketch apps (Baiboard12 (collaboration requires an Internet connection), Whiteboard Lite13 (collaboration over Wi-Fi) and Flockdraw14 (browser-based collaboration)), the following conclusions were made:
Real-time sketch collaboration is already implemented in some applications, but screen synchronization suffers from draw-update latency and/or synchronization errors. Fixing this issue became one main target of this research project. The second one was implementing the application without requiring an Internet connection. Since the target audience of Teamsketch is school classes, requiring an Internet connection would lead to high latency because all pupils would use the same connection at once. To address this issue, Bluetooth should be available in case Wi-Fi is not working or its performance is not satisfying. Teamsketch will only provide basic drawing tools: no layers, text or customized shapes. The main focus should be on collaboration and not on multiple app features. The undo function will not be implemented because the app should encourage efficient and well-considered drawing. This assumption will be investigated during the prototype test in a primary school class. No chat function will be added because pupils will be seated at one desk while drawing and should communicate face-to-face. Furthermore, no predefined shapes (circles, polygons and letters) will be implemented because pupils should practice freehand drawing to improve their creativity and fine motor skills. Finally, drawn strokes should be optimized to look more realistic (less cornered) when users frequently change the drawing direction (no sharp directional changes). Therefore, a line smoothing and curve-fitting algorithm should be used.
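One common way to realize such stroke smoothing is Chaikin's corner-cutting algorithm, which repeatedly replaces each corner of a polyline with two points along the adjacent segments. The Python sketch below is purely illustrative; the paper does not state which smoothing or curve-fitting algorithm Teamsketch actually implements.

```python
def chaikin_smooth(points, iterations=2):
    """Smooth a polyline by corner cutting (Chaikin's algorithm).

    Each pass replaces every segment (p, q) with two new points at
    1/4 and 3/4 along the segment, rounding off sharp corners while
    keeping the stroke's start and end points fixed.
    """
    for _ in range(iterations):
        if len(points) < 3:
            return points
        smoothed = [points[0]]  # keep the stroke's start point
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            smoothed.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            smoothed.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        smoothed.append(points[-1])  # keep the stroke's end point
        points = smoothed
    return points

# A sharp right-angle stroke becomes visibly rounder after smoothing.
stroke = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
smooth = chaikin_smooth(stroke)
```

Each pass roughly doubles the number of points, so in practice the number of iterations is kept small (two passes already remove visible corners).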

Prototype
The prototype development was divided into three main parts:
Web service
Web interface
iPad app

Figure 1 shows the overall Teamsketch environment. App users as well as web interface users can log in to the Teamsketch web service to access their profile pictures and sketches. The Teamsketch web service forwards the user authentication to the already existing User Management web service.


12 http://www.baiboard.com, last accessed: 15 Oct. 2014
13 http://www.greengar.com, last accessed: 19 Oct. 2014
14 http://flockdraw.com, last accessed: 17 Oct. 2014

Figure 1 Teamsketch environment
Web Service
The web service was implemented as a Simple Object Access Protocol (SOAP) service described in the Web Services Description Language (WSDL). The Teamsketch web service forwards login requests to the User Management SOAP service, which is used to identify pupils. All pupils in the test classes use this User Management service for authentication; therefore, they only need a single username and password to log in to the various learning apps15 of Graz University of Technology (TU Graz). After authentication, the Teamsketch SOAP service provides image upload and download functionality. In addition, a web interface was built to provide sketch evaluation for teachers.
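The login exchange can be illustrated with a hand-built SOAP 1.1 request envelope. Note that the service namespace, the login operation and its field names below are invented for illustration; the actual operation names and message layout are defined by the WSDL of the TU Graz services, which is not reproduced in the paper.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace and operation name, for illustration only.
SVC_NS = "http://example.org/teamsketch"

def build_login_request(username, password):
    """Build a SOAP 1.1 envelope for a hypothetical login operation."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    login = ET.SubElement(body, f"{{{SVC_NS}}}login")
    ET.SubElement(login, f"{{{SVC_NS}}}username").text = username
    ET.SubElement(login, f"{{{SVC_NS}}}password").text = password
    return ET.tostring(envelope, encoding="unicode")

request = build_login_request("pupil01", "secret")
```

In a real client, this envelope would be POSTed to the service endpoint with a SOAPAction header, and the response body would carry a session token used for the subsequent upload and download calls.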

Web interface
The web interface (Figure 2) shows the pupil sketch view. The current implementation supports one profile picture and three sketches in two different resolutions. Teachers can access the sketches of the students in their classes; students can only see and download their own sketches. When users log in to the web interface, the system provides a download page to access the sketches drawn with the Teamsketch app. Teachers are presented with a class list after they have logged into the system. First, they select a certain class; then the student list is shown. After selecting one pupil, the student screen (Figure 2) displays that pupil's sketches.

Figure 2 Teamsketch web interface


15 http://learninglab.tugraz.at/app/, last accessed: 26 Nov. 2014


App for iPads
The app was implemented in Apple's programming language Swift and uses the Multipeer Connectivity framework to synchronize the sketches. One pupil starts a new sketch and configures some sketch parameters such as line guides, time limit and sketch name. The next task is to invite other pupils to the session. After up to four pupils are connected to one session, the host can start the sketch. Teamsketch supports Wi-Fi connections as well as Bluetooth connections, which is a significant advantage in school environments, since schools are usually not equipped with state-of-the-art wireless infrastructure. When all pupils of one class use collaboration apps over the Internet simultaneously, performance issues could occur. With two available connectivity options, this issue can be addressed efficiently.
Figure 3 shows all screen transitions and inter-device communication workflows. The app starts with the main screen (Figure 4). Users can access the user management screen (UserManagementViewController, Figure 6), start a new sketch (StartNewSketchViewController) or join an already created sketch (ServiceAdvertiserViewController).

Figure 3 Teamsketch screen transitions and inter-device communication


When a user decides to start a new sketch, some basic sketch parameters, such as line guides, sketch name and time limit, can be set (StartNewSketchViewController). Then the LookingForPlayersViewController is shown. All nearby devices in join-sketch mode (ServiceAdvertiserViewController) are listed and players can be invited to the session. As soon as at least two players have joined the session, the sketch host can start the sketch. The SketchViewController is then shown on all connected devices. AbortSketchViewController and StoreSketchViewController implement pop-up controllers to save, restart or quit the sketch.
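The join/start workflow described above can be summarized as a small state machine. The toy model below (our own naming, not the app's actual classes) captures the rules as we read them: the host creates the session with its parameters, two to four players must be connected, and nobody can join after the sketch has started.

```python
class SketchSession:
    """Toy model of the Teamsketch session workflow (illustrative only)."""

    MAX_PLAYERS = 4

    def __init__(self, host, sketch_name, time_limit_s=600, line_guides=False):
        # The host counts as the first connected player.
        self.players = [host]
        self.sketch_name = sketch_name
        self.time_limit_s = time_limit_s
        self.line_guides = line_guides
        self.started = False

    def join(self, player):
        """A nearby peer joins; late joins and oversized groups are rejected."""
        if self.started or len(self.players) >= self.MAX_PLAYERS:
            return False
        self.players.append(player)
        return True

    def can_start(self):
        return 2 <= len(self.players) <= self.MAX_PLAYERS

    def start(self):
        """Only the host may start, and only with 2-4 connected players."""
        if not self.can_start():
            raise RuntimeError("need 2-4 connected players")
        self.started = True

session = SketchSession("anna", sketch_name="farm")
session.join("ben")
session.start()
```

Once `start()` succeeds, all connected devices would switch to the drawing screen and synchronize strokes until the time limit is reached.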
Figure 5 shows the details of the Teamsketch drawing screen.

Figure 4 Teamsketch main screen

Figure 5 Teamsketch on iPad

Figure 6 Teamsketch user management screen


Field study
There are already some collaborative sketch apps on the market, but most of them suffer from synchronization errors or high latency, or they require an Internet connection. After the features and issues of state-of-the-art collaborative drawing apps had been analyzed, the app prototype was implemented in Apple's programming language Swift16. After the implementation of the prototype and the web service had been finished, a field study in a primary school was carried out. The test school class (3rd grade, age: 9 years) was a so-called iPad class with 16 pupils. They have been using iPads since first grade; hence, they are very experienced in using mobile devices in school. Each experimental group consisted of four pupils, who all sat together at one table. Figure 7 shows a typical setting for a group of two; the other group of two was sitting opposite them at the same table.

Figure 7 Setting of the field study


Before the pupils started to draw a sketch, they got a brief introduction to the pairing process. They were already experienced with the multipeer connection procedure because they had previously used other collaborative apps such as Buchstabenpost17. The field study was adapted from previous app tests in the iPad class (Kienleitner, 2013). At first, the group was divided into smaller groups of two. These smaller groups drew one sketch together to familiarize themselves with the app and the collaborative drawing concept. A MacBook was added as an observer (3rd participant) to the session to record the drawing process. The pupils got a drawing topic such as "farm", "city" or "flower meadow". After finishing the sketch, the groups of two were merged into one group of four, which received a new assignment and drew one sketch together.
After the sketches were finished, a short verbal feedback round was held, closely following the cut-off technique (Schönhart & Ebner, 2014). Pupils had to rate the app with different smileys: the happiest-looking smiley corresponded to the school mark 1 and the saddest-looking smiley to the school mark 5. Three questions had to be rated: (1) Did you like drawing together? (2) Was it easy to use the app? (3) Would you like to draw again? Each group had to agree on one rating per question. The main goal was not to obtain a precise rating but to learn what the pupils really thought about the concept and the app while they were discussing (Fischer, 2007).
The following observations could be made: during the device pairing phase, collaborative learning could easily be observed. Usually one pupil understood the pairing process first and taught the other pupils how to connect to the session. After the connection was established, all pupils started to draw immediately. After a few seconds, they realized that drawing without discussing a strategy was not goal-oriented. A few pupils got very angry because others deleted or changed their shapes without discussing the changes. Some of them even refused to continue drawing; other pupils refused to start drawing in the group at all, preferring to draw alone. During team


16 https://developer.apple.com/swift, last accessed: 25 Nov. 2014
17 https://itunes.apple.com/at/app/buchstaben-post/id736836885?mt=8, last accessed: 25 Nov. 2014

drawing, one pupil usually assumed leadership and divided up the work. All groups except one eventually divided the work by splitting the drawing space into different parts, with every pupil drawing a part separately. In some cases this approach worked very well. Figure 8 shows a sketch in which all members of the group (a group of two) drew very harmoniously; the sketch looks as if only one pupil had drawn it. Then the group of two was merged into a group of four and got a new task: draw a farm (Figure 9). They had great difficulty arranging the sketch space effectively; therefore, two of them drew the carrot field, and the other two drew the farm tractor and the cat. These two sketches show the expressiveness of collaboratively drawn sketches.

Figure 8 A harmonic group Figure 9 Group with difficulties

The behavior of one group of two girls differed significantly; they did not separate the drawing space. One girl drew the contours, the other girl colored the shapes of the houses. Figure 10 shows the finished sketch of this group.

Figure 10 Contour and fill collaboration

Additionally, significant observations could be made as far as the pupils' group roles were concerned. In some groups, two or more pupils tried to lead; this led to controversial discussions. In other groups, some pupils followed others blindly; they even asked others how to colorize their own shapes. Another interesting observation was made regarding the reset option. The first two groups of four were introduced to the quick reset
function, which clears the whole screen. When the first disagreement occurred, at least one pupil had the idea to simply clear the whole sketch and start over without discussing the issues. The following groups were not introduced to the reset option, so that they would try to solve their issues instead of just deleting their work. After a group finished drawing, a short interview was held. The pupils gave detailed feedback about the user interface. Some pupils did not understand how to adjust the felt pen width (double tap on the felt pen); others were irritated by the white felt pen. They did not realize that the white felt pen acts as a rubber, but after a short explanation, every pupil used it appropriately. Other pupils touched the felt pens (color change) in the lower area accidentally and hence changed the color unintentionally. The implemented zoom and overview features were not used by the pupils even though the functionality had been explained several times; they preferred to see the whole sketch. Others would have liked more features, such as customized shapes, clipart import and more shiny colors. These were only the main observations; many details can be revealed by investigating the audio and screen video recordings. The next section analyzes these observations.

Discussion
During the development, the question arose why pupils should draw on four iPads instead of just drawing on one big sheet of paper. The most significant advantage of the app is that all four pupils can draw on every part of the sketch simultaneously. When they draw on a sheet of paper, only one pupil can be at a certain location at once and, thus, the pupil covers the drawing area while drawing. Therefore, it is very difficult for the others to get an overview of the sketch. On the iPad, every pupil can see the whole sketch at all times. Additionally, pupils do not distract others while drawing: nobody has to push forward, everybody draws on his or her own screen. Furthermore, pupils are focused on their own shapes and it is not obvious who drew which part of the sketch. When pupils draw on a real sheet of paper, they usually observe each other and get distracted.
Pupils faced some difficulties concerning the user interface. These issues will be addressed in the next Teamsketch update: the double-tap functionality will be substituted with a swipe gesture, and the white felt pen symbol will be changed to a rubber symbol. The use of the zoom function should be evaluated again after the pupils have become more familiar with the app. During the field study, pupils liked drawing on the enlarged sketch, but they did not like being unable to see the drawings of other pupils on other parts of the sketch.
Additionally, some pupils asked for more shiny colors as well as a custom shape import feature. These additional features should not be implemented, because the focus should be on the act of collaborative drawing; more features could lead to distraction.
Collaborative drawing is more difficult than drawing alone: each individual pupil has to concentrate on his or her own work as well as on the group's teamwork (D. W. Johnson & R. T. Johnson, 1989; D. W. Johnson, R. T. Johnson & Holubec, 2002; D. W. Johnson, 2003; D. W. Johnson & R. T. Johnson, 2005; quoted from D. W. Johnson & R. T. Johnson, 2008).
Therefore, Teamsketch only provides simple drawing tools and focuses on collaboration rather than on multiple features. Additionally, the field test showed that drawing a collaborative sketch together requires highly developed skills; groups need a significant amount of time before they are able to draw together. After the field study, the teacher of the iPad class was interviewed. She has more than four years of experience using iPads in the classroom, and therefore her impressions should also be considered. She rated the collaborative learning capabilities of the Teamsketch app as very high and will continue using Teamsketch to train group skills such as leadership, teamwork and face-to-face team communication. Table 1 summarizes all findings of the field study.

Table 1 Teamsketch - challenges and positive effects

Challenges
User interface: Double tap was not intuitive; the felt pen switching process had to be explained to the pupils.
White felt pen: The white felt pen will be changed to a rubber symbol because it was not intuitive for the pupils.
Zoom function: Pupils did not use the zoom function; they preferred to see the whole sketch at all times.
App features: Pupils would like the app to have more features, but collaborative drawing should focus on teamwork and not on multiple functions, as these could distract pupils from reaching the collaborative goal.
Clear sketch: The clear sketch function should be removed because pupils used it too often. Some pupils preferred deleting the whole sketch to discussing their issues.

Positive effects
Teamwork: After a few minutes, nearly all pupils managed to draw a collaborative sketch together; they discussed their drawing strategy and found solutions without the teacher's intervention.
Evaluation: Teachers can evaluate the team skills of their pupils by observing them while they are drawing.
Group roles: Pupils can train different group roles: they can take the group leader role by starting a new sketch and managing the drawing strategy, or just join the sketch without performing any leadership tasks.
Simplicity: The concept of Teamsketch is very simple. Most pupils did not need an introduction because they were already very familiar with iPads and with drawing in general. Only the pairing process had to be explained, and shortly after the introduction, pupils who already understood the pairing concept taught others how to connect to the session.
Motivation: Most of the pupils like drawing; they don't get the feeling that they have to learn or train specific tasks and skills, they just draw a sketch. Most of the children didn't want to stop drawing after the time limit was reached.

Conclusion
Mobile devices can be used to train team and collaboration skills such as leadership, communication, discretion and conflict resolution. It can be concluded that with the Teamsketch prototype, collaborative drawing on iPads was realized without requiring an Internet connection. The optional web service and web interface provide a platform to review student sketches. Collaborative sketches are very expressive; it is easy to see which groups had difficulties while drawing. Using the app, pupils are compelled to work together: if they don't work as a team, they will not manage to finish a sketch. The field study confirmed the assumption that collaborative drawing on iPads can act as a collaborative learning tool and as an assessment instrument. Experienced observers in the fields of education, psychology and social behavior should be able to evaluate social and collaborative development very efficiently with Teamsketch. Further research has to be carried out on how collaborative drawing with Teamsketch can improve collaboration and team skills. Additionally, teachers could define specific educational scenarios to train specific team skills. In the meantime, a larger field study will be carried out.

References
Alavi, M. (1984). An assessment of the prototyping approach to information systems development. Communications of the ACM, 27(6), pp. 556-563.
Danesh, A., Inkpen, K., Lau, F., Shu, K. and Booth, K. (2001). Geney: Designing a collaborative activity for the Palm handheld computer. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '01). Seattle, Washington, USA: ACM, pp. 388-395. ISBN: 1-58113-327-8.
Dillenbourg, P. (1999). Collaborative Learning: Cognitive and Computational Approaches. 2nd revised edition. Emerald Group Publishing Limited.
Ebner, M. and Kienleitner, B. (2014). A contribution to collaborative learning using iPads for school children. In: European Immersive Education Summit 2014, Vienna, pp. 87-97.
Ebner, M., Schönhart, J. and Schön, S. (2015). Experiences with iPads in primary school. Revista PROFESORADO, accepted, in print.
Fischer, J. (2007). Detektivische Methode - Legetechnik. URL: http://wissensreise.de/Intranet/Aufgabenkultur/forschen/Seiten/dMundLegetechnik.html, last accessed: May 2014.
Grimus, M. and Ebner, M. (2014). Learning with mobile devices: Perceptions of students and teachers at lower secondary schools in Austria. In: Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2014, pp. 1600-1609.
Haythornthwaite, C. (1999). Collaborative work networks among distributed learners. In: Proceedings of the 32nd Annual Hawaii International Conference on System Sciences (HICSS-32), 5-8 Jan. 1999. doi: 10.1109/HICSS.1999.772707
Huber, S. and Ebner, M. (2013). iPad human interface guidelines for m-learning. In: Z. L. Berge and L. Y. Muilenburg (Eds.), Handbook of Mobile Learning, pp. 318-328. New York: Routledge.
Johnson, David W. (2003). Social interdependence: Interrelationships among theory, research, and practice. American Psychologist, 40(11), pp. 934-945. ISSN: 0003-066X.
Johnson, David W. and Johnson, Roger T. (1989). Cooperation and Competition: Theory and Research. Interaction Book Co.
Johnson, David W. and Johnson, Roger T. (2005). New developments in social interdependence theory. Genetic, Social, and General Psychology Monographs, pp. 285-358. ISSN: 8756-7547.
Johnson, David W. and Johnson, Roger T. (2008). Wie kooperatives Lernen funktioniert. Friedrich Jahresheft 26, pp. 16-20. ISSN: 0176-2966.
Johnson, David W., Johnson, Roger T. and Holubec, Edythe J. (2002). Circles of Learning: Cooperation in the Classroom. 5th printing. Interaction Book Company.
Kienleitner, B. (2013). A Contribution to Collaborative Learning Using iPads for School Children. Master's thesis, Graz University of Technology.
Larson, O. (1986). Information systems prototyping. In: Proceedings Interest HP 3000 Conference, Madrid, pp. 351-364. URL: www.openmpe.com/cslproceed/HPIX86/P351.pdf, last accessed: October 2014.
Rideout, V. (2013). Zero to Eight. Tech. rep. Common Sense Media Inc. URL: https://www.commonsensemedia.org/file/zero-to-eight-2013pdf-0/download, last accessed: Nov. 2014.
Zurita, G. and Nussbaum, M. (2004). Computer supported collaborative learning using wirelessly interconnected handheld computers. Computers and Education, 42(3), pp. 289-314. ISSN: 0360-1315.

PART 5 PROGRESS AND ACHIEVEMENT

23. Perceptions of Achievement and Satisfaction as Related to Interactions in
Online Courses
Kristi Bordelon, Ph.D.
KB Consulting

Introduction
Distance learning originated in the early 1800s, when correspondence courses, the original form of distance learning, were introduced (Harnar, Brown & Mayall, 2000). Initially, correspondence courses used the postal service as the information delivery and communication tool. As communication technologies improved, distance learning courses were offered via radio and television and, with the prevalence of Internet access, online (Gabrielle, 2001; Kahn, 1997). The increase in online communication tools has contributed to an increase in online course offerings.
Faculty members initially opposed distance education, insisting that learning could not occur without the instructor directing the students. These faculty members denounced distance learning with claims that the method was ineffective due to student isolation. Critics still contend that feelings of isolation result in reduced student achievement and satisfaction in online courses (Manning, 2010; Mansour & Mupinga, 2007; Reisetter, Lapointe, & Korcuska, 2007).
Initial concerns about online courses led to studies comparing student satisfaction and achievement in online and face-to-face programs. Results consistently demonstrated that students in online classes performed equivalently to or better than students in face-to-face classes (Braun, 2008; Hrastinski, 2008; Hughes et al., 2007; Lim et al., 2008; Wang & Morgan, 2008). Subsequent studies focusing on students enrolled in online courses determined that students can achieve and be satisfied in online courses (Al Jarf, 2004; Bray, Aoki, & Dlugosh, 2008; Chang & Smith, 2008; Hughes et al., 2007; Lim et al., 2008). Findings also stated that students in online classes exhibit more positive attitudes about the course and perform better when the class includes social and collaborative activities (Brownson, 2009; Durrington et al., 2006). Delving into factors that influence satisfaction and achievement, some studies focused on the impact of student-instructor interaction, student-student interaction, or student-content interaction on achievement or satisfaction (Bickle & Carroll, 2003; Moore, 1989; Thurmond & Wambach, 2004).
Moore's theory of interaction explains that three types of interaction exist in a distance learning environment: student-instructor interaction, student-student interaction, and student-content interaction (Moore, 1989). Student-instructor interaction is any communication between the student and the instructor. Student-student interaction is any communication between students. Student-content interaction is any activity in which the student interacts with the course content. Each type of interaction has positive effects on both student achievement and satisfaction (Al Jarf, 2004; Chen et al., 2009; Cox-Davenport, 2010; Drouin, 2008; Gorsky & Caspi, 2005; Hughes et al., 2007; Jiang, 2008; Jung et al., 2006; Kupczynski et al., 2008; Lapointe & Gunawardena, 2004; Lim et al., 2008; Manning, 2010; Moore, 1989; Rhode, 2009; Salmon, 2002; Schwartzman, 2007; Sher, 2008; Swan, 2002; Woods, 2002). However, no determination has been made whether all forms of interaction are equally important to student perceived learning and student satisfaction in online courses (Arbaugh & Rau, 2007). This quantitative non-experimental study expanded on previous research by assessing whether one or more types of interaction influence student perceived learning and/or student satisfaction in an online class. Each type of interaction influences student perceived learning and satisfaction.
Advances in technology combined with student demand have contributed to growth in online course and program offerings and enrollments (Allen & Seaman, 2008/2010; Coronel, 2008; Karber, 2001; Puzziferro, 2006). The increase in online course offerings creates the need to investigate the degree to which differences or variations in student-instructor interaction, student-student interaction, and student-content interaction are related to differences or variations in perceived achievement and satisfaction in online graduate courses in the field of education. Determining these relationships can increase the knowledge of best practices in online course design, thereby improving student achievement and satisfaction. To increase the effectiveness of online classes, it is necessary to determine how each type of interaction relates to student achievement and satisfaction (Grandzol & Grandzol, 2010). Researchers, including Drouin (2008), Grandzol and Grandzol (2010), and Sher (2008), have studied interactions in online class environments. However, few studies have compared the individual relationships between student satisfaction, student achievement, and each type of interaction (Chang & Smith, 2008). No determination had been made whether a statistically significant degree of differences or variations in student-instructor interaction, student-student interaction, or student-content interaction is related to statistically significant differences or variations in student perceived achievement or satisfaction in online graduate courses in the field of education. The purpose of this quantitative non-experimental study was to investigate the degree to which differences or variations in student-instructor interaction, student-student interaction, or student-content interaction are related to differences or variations in perceived achievement and satisfaction in online graduate courses in the field of education.

The Study

Various studies have measured instructional success using student attitudes, perceptions, and perceived learning (Duckworth, 2010; Grandzol & Grandzol, 2010; Gupta, 2004; Hanze & Berger, 2007; Peterson & Miller, 2004; Sher, 2008). Following the model of previous studies, this study used student attitudes, perceptions, and perceived learning to investigate the degree to which differences or variations in student-instructor interaction, student-student interaction, or student-content interaction are related to differences or variations in perceived achievement or satisfaction in online graduate courses in the field of education. A purposive sample of educators enrolled in online graduate classes in the field of education was asked to complete an online survey containing quantitative questions regarding each type of interaction, achievement, and satisfaction.
The sample consisted of approximately 300 students enrolled at the institutions during the collection period. The students were employed in K-12 schools and enrolled in courses either to earn a Master's degree or to meet recertification requirements. All courses were in the field of education, offered entirely online, used institution-owned course content, and were taught by instructors experienced in teaching online courses. During each class, students and instructors interacted via email and discussion boards. Some of the students teach in the same school or district and had a personal or professional relationship prior to enrolling in the online courses, but the majority of the students only interacted with each other in the context of completing course assignments.

Findings

A hierarchical multiple regression analysis was conducted. Initially, student-instructor interaction, student-student
interaction, and student-content interaction were selected as Model 1, Model 2, and Model 3, respectively. From
those results, the final sequence of student-instructor interaction, student-content interaction, and student-student
interaction was determined by the amount of change contributed by each construct variable (Table 1). Using the enter
method of variable selection, the variables were prioritized by the reported change in R2. The sequence that
produced the largest change in R2 over the previous model was student-instructor interaction, then student-content
interaction, then student-student interaction. Each of the interaction variables was significantly associated with
perceived student achievement. The multivariate regression equation demonstrated that student-instructor
interaction, student-content interaction, and student-student interaction were positively and significantly associated
with perceived achievement (R2 = .890, p < .05). Model 1, the regression model with only the constant and the student-
instructor interaction construct, explained the majority (85.3%) of the variability in the student perceived
achievement construct (R2 = .853, F(1, 152) = 1049.28, p < .05). Model 2, which added the student-content
interaction construct to Model 1, demonstrated a small but significant increase (2.9%) in explained variability
(R2 = .882, F(2, 151) = 601.24, p < .05). Model 3, which added the student-student interaction construct to Model 2,
demonstrated a minimal increase (0.7%) (R2 = .890, F(3, 150) = 426.06, p < .05). Although the increase was not
large, it was statistically significant, allowing rejection of the null hypothesis that the true increase was zero
(Tables 1 and 2).
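The block-entry procedure described above, in which each interaction construct is entered in turn and the increment in R2 is tested with an F-change statistic, can be sketched as follows. This is an illustrative reconstruction on simulated data, not the study's dataset; the variable names and effect sizes are assumptions, and only the sample size (n = 154, total df = 153) mirrors the study.

```python
# Illustrative sketch of hierarchical regression with R^2-change (F-change)
# tests. Data are simulated; only n = 154 mirrors the study's df.
import numpy as np

def ols_r2(y, predictors):
    """R^2 from an OLS fit of y on a constant plus the given predictor columns."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / np.sum((y - y.mean()) ** 2)

def f_change(r2_reduced, r2_full, n, k_full, k_added):
    """F statistic for the increment in R^2 when k_added predictors enter."""
    return ((r2_full - r2_reduced) / k_added) / ((1 - r2_full) / (n - k_full - 1))

rng = np.random.default_rng(0)
n = 154
instructor = rng.normal(5.0, 1.0, n)                 # student-instructor interaction
content = 0.6 * instructor + rng.normal(0, 1.0, n)   # student-content interaction
student = rng.normal(3.0, 1.0, n)                    # student-student interaction
achievement = (0.8 * instructor + 0.3 * content + 0.1 * student
               + rng.normal(0, 0.5, n))

r2_m1 = ols_r2(achievement, [instructor])
r2_m2 = ols_r2(achievement, [instructor, content])
r2_m3 = ols_r2(achievement, [instructor, content, student])

print(f"Model 1 R2 = {r2_m1:.3f}")
print(f"Model 2 R2 = {r2_m2:.3f}, F change = {f_change(r2_m1, r2_m2, n, 2, 1):.1f}")
print(f"Model 3 R2 = {r2_m3:.3f}, F change = {f_change(r2_m2, r2_m3, n, 3, 1):.1f}")
```

The same increments appear in SPSS-style model summaries such as Table 1 as the "R2 change" and "F change" columns, with df1 equal to the number of predictors added and df2 equal to the residual df of the fuller model.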

Table 1
Model Summary Student Perceived Achievement
Model R R2 Adjusted R2 SE | R2 change F change df1 df2 Sig. F change
1 .924a .853 .852 .46963 .853 889.500 1 153 .000
2 .939b .882 .881 .42162 .029 37.828 1 152 .000
3 .943c .890 .888 .40955 .007 10.093 1 151 .002

a Predictors: (constant) student-instructor interaction

b Predictors: (constant) student-instructor interaction and student-content interaction

c Predictors: (constant) student-instructor interaction, student-content interaction, and student-student interaction

Table 2
ANOVA Student Achievement
Model SS Df MS F Sig.
1 Regression 200.388 1 200.388 1049.275 .000a
Residual 29.029 152 .191
Total 229.416 153
2 Regression 203.822 2 101.911 601.243 .000b
Residual 25.595 151 .170
Total 229.416 153
3 Regression 205.321 3 68.440 426.055 .000c
Residual 24.096 150 .161
Total 229.416 153

a Predictors: (constant) student-instructor interaction

b Predictors: (constant) student-instructor interaction and student-content interaction

c Predictors: (constant) student-instructor interaction, student-content interaction, and student-student interaction

Of the three interaction variables, two were significantly associated with student satisfaction. The
multivariate regression equation demonstrated that student-instructor interaction and student-content interaction
were positively and significantly associated with student satisfaction (R2 = .897, p < .05). Model 1, the regression
model with only the constant and the student-instructor interaction construct, explained the majority of the
variability in the student satisfaction construct (R2 = .884, F(1, 152) = 1278.82, p < .05). Model 2, which added the
student-content interaction construct to Model 1, demonstrated a minimal (1.3%) increase, explaining 89.7% of the
variability in student satisfaction (R2 = .897, F(2, 151) = 673.78, p < .05). Model 3, which added the student-student
interaction construct to Model 2, demonstrated no increase in explained variability (R2 = .897; R2 change = .000,
F(1, 151) = .027, p = .869) (Tables 3 and 4).


Table 3
Model Summary Satisfaction
Model R R2 Adjusted R2 SE | R2 change F change df1 df2 Sig. F change
1 .940a .884 .884 .44598 .884 1170.508 1 153 .000
2 .947b .897 .896 .42141 .013 19.357 1 152 .000
3 .947c .897 .895 .42277 .000 .027 1 151 .869

a Predictors: (constant) student-instructor interaction

b Predictors: (constant) student-instructor interaction and student-content interaction

c Predictors: (constant) student-instructor interaction, student-content interaction, and student-student interaction

Table 4
ANOVA Student Satisfaction
Model SS Df MS F Sig.
1 Regression 235.096 1 235.096 1278.817 .000a
Residual 27.943 152 .184
Total 263.040 153
2 Regression 236.535 2 118.267 673.780 .000b
Residual 26.505 151 .176
Total 263.040 153
3 Regression 236.587 3 78.862 447.183 .000c
Residual 26.453 150 .176
Total 263.040 153

a Predictors: (constant) student-instructor interaction

b Predictors: (constant) student-instructor interaction and student-content interaction

c Predictors: (constant) student-instructor interaction, student-content interaction, and student-student interaction

Conclusions

Results of this study indicated that statistically significant relationships exist between student-instructor
interaction, student-student interaction, and student-content interaction and both student perceived achievement and
student satisfaction in online graduate courses in the field of education. Each type of interaction was related to
student satisfaction, but the multiple regression analysis and partial correlations did not indicate that student-student
interaction made a statistically significant contribution beyond the other two interaction variables. With the increase
in online course offerings, determining factors that can positively influence student achievement and satisfaction
will contribute to the quality of online courses. Findings from this study provide detail on the differences in the
relationships between each type of interaction and student perceived achievement and satisfaction. Further research
with a larger sample size is recommended to expand on the findings, and conducting the same study with a different
online student population could help determine the generalizability of the findings. In terms of practical application,
the results suggest implementing policies and procedures in online classes that emphasize the importance of
student-instructor interaction and student-content interaction.

References

Al Jarf, R. (2004). The effects of web based learning on struggling EFL college writers. Foreign Language Annals,
37(1), 49.
Arbaugh, J. B. (2008). Does the community of inquiry framework predict outcomes in online MBA courses?
International Review of Research in Open and Distance Learning, 9(2).
Arbaugh, J. & Rau, B. (2007). A study of disciplinary, structural, and behavioral effects on course outcomes in
online MBA courses. Decision Sciences Journal of Innovative Education, 5(1), 65. (Document ID:
1199097301).
Bickle, M., & Carroll, J. (2003). Checklist for quality online instruction: Outcomes for learners, the professor and
the institution. College Student Journal, 37(2), 208.
Braun, T. (2008). Making a Choice: The perceptions and attitudes of online graduate students. Journal of
Technology and Teacher Education, 16(1), 63-92.
Bray, E., Aoki, K., & Dlugosh, L. (2008). Predictors of learning satisfaction in Japanese online distance learners.
International Review of Research in Open & Distance Learning, 9(3), 1-24.
Drouin, M. (2008). The relationship between students' perceived sense of community and satisfaction, achievement,
and retention in an online course. Quarterly Review of Distance Education, 9(3), 267-284.
Duckworth, A. (2010). Cooperative learning: Attitudes, perceptions, and achievement in a traditional, online, and
hybrid instructional setting. (Doctoral dissertation). Retrieved from Dissertations & Theses: Full Text.
(Publication No. AAT 3416278).
Gabrielle, D.M. (2001). Distance Learning: An examination of perceived effectiveness and student satisfaction in
higher education. In J. Price et al. (Eds.), Proceedings of Society for Information Technology and Teacher
Education International Conference 2001. Chesapeake, VA: AACE. Retrieved from
http://www.editlib.org/index.cfm/file/paper_ca43e4fb-78fa-4d87-abd3-
0b43a298c2c8.pdf?fuseaction=PurchasePapers.DownloadFile&paper=ca43e4fb-78fa-4d87-abd3-
0b43a298c2c8&download=AFAFC4D2-F308-0A29-CCDD-04BB1B1D327F
Gorsky, P., & Caspi, A. (2005). Dialogue: A theoretical framework for distance education instructional systems.
British Journal of Educational Technology, 36(2), 137.
Grandzol C. & Grandzol J. (2010). Interaction in online courses: More is not always better. Online Journal of
Distance Learning Administration. 13(2).
Gupta, M. (2004). Enhancing student performance through cooperative learning in physical sciences. Assessment
and Evaluation in Higher Education, 29(1), 63-73. Retrieved from ProQuest Education Journals.
(Document ID: 686340261).
Hanze, M., & Berger, R. (2007). Cooperative learning, motivational effects, and student characteristics: An
experimental study comparing cooperative learning and direct instruction in 12th grade physics classes.
Learning & Instruction. 17(1), 29-41, DOI: 10.1016/j.learninstruc.2006.11.004; (AN 24139773)
Harnar, M., Brown, S. & Mayall, H. (2000). Measuring the effect of distance education on the learning experience:
Teaching accounting via PictureTel. International Journal of Instructional Media, 27(1), 37. Retrieved from
Research Library. (Document ID: 50829860).
Hrastinski, S. (2008). Asynchronous and synchronous e-learning. EDUCAUSE Quarterly, 31(4). Retrieved from
http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/Asynchronousa
ndSynchronousELea/163445
Hughes, J., McLeod, S., Brown, R., Maeda, Y., & Choi, J. (2007). Academic achievement and perceptions of the
learning environment in virtual and traditional secondary mathematics classrooms. The American Journal
of Distance Education, 21(4), 99.


Jiang, J. (2008). Exploring the relationship between the feelings of isolation among distance learners and the levels
of interaction built into the online course. (Ph.D. dissertation). Retrieved from Dissertations & Theses: Full
Text. (Publication No. AAT 3305424)
Karber, D.J. (2001). Comparisons and contrasts in traditional versus on-line teaching in management. Higher
Education in Europe, 26(4), 533-536.
Khan, B. H. (1997). Web-based instruction. Englewood Cliffs, NJ: Educational Technology Publications.
Kupczynski, L., Brown, M., & Davis, R. (2008). The impact of instructor and student interaction in internet-based
courses. Journal of Instruction Delivery Systems, 22 (1), 6-11. Retrieved from
http://www.salt.org/salt.asp?ss=m&pn=jids
LaPointe, D., & Gunawardena, C. (2004). Developing, testing and refining of a model to understand the relationship
between peer interaction and learning outcomes in computer-mediated conferencing. Distance Education,
25(1), 83-107.
Lim, J. (2005). A comparison of online instruction versus traditional classroom instruction in a wellness course.
Research Quarterly for Exercise and Sport: Research Consortium Abstracts, 76(1), A81.
Lim, J., Kim, M., Chen, S., & Ryder, C. (2008). An empirical investigation of student achievement and satisfaction
in different learning environments. Journal of Instructional Psychology, 35(2), 113-119. Retrieved from
http://vnweb.hwwilsonweb.com/hww/Journals/getIssues.jhtml?sid=HWW:OMNIS&id=02313
Manning, K. (2010). A Delphi study: Exploring faculty perceptions of the best practices influencing student
persistence in blended courses. (Doctoral dissertation). Retrieved from Dissertations & Theses: Full Text.
(Publication No. AAT 3390343).
Mansour, B., & Mupinga, D. (2007). Students' positive and negative experiences in hybrid and online classes.
College Student Journal, 41(1), 242-248. Retrieved from http://www.projectinnovation.biz/csj_2006.html
Moore, M. (1989). Three types of interaction. The American Journal of Distance Education, 3(2), 1-7.
Peterson, S., & Miller, J. (2004). Comparing the quality of students' experiences during cooperative learning and
large-group instruction. The Journal of Educational Research, 97(3). Retrieved from ABI/INFORM
Global. (Document ID: 577506911).
Puzziferro, M. (2006). Online technologies self-efficacy, self-regulated learning, and experiential variables as
predictors of final grade and satisfaction in college-level online courses. Ph.D. dissertation, New York
University, United States - New York. Retrieved from Dissertations & Theses: Full Text. (AAT 3199984).
Reisetter, M., Lapointe, L., & Korcuska, J. (2007). The impact of altered realties: implications of online delivery for
learners' interactions, expectations, and learning skills. International Journal on E-Learning, 6(1), 55-80.
Rhode, J. (2009). Interaction equivalency in self-paced online learning environments: An exploration of learner
preferences. International Review of Research in Open & Distance Learning, 10(1). Retrieved from
http://www.irrodl.org/index.php/irrodl
Salmon, G. (2002). Mirror, mirror, on my screen. Exploring online reflections. British Journal of Educational
Technology, 33(4), 379-391.
Schwartzman, R. (2007). Electronifying oral communication: Refining the conceptual framework for online
instruction. College Student Journal, 41(1), 37-49. Retrieved from
http://www.projectinnovation.biz/csj_2006.html
Sher, A. (2009). Assessing the relationship of student-instructor and student-student interaction to student learning
and satisfaction in web-based online learning environment. Journal of Interactive Online Learning, 8(2),
102-120.
Swan, K. (2002). Building learning communities in online courses: The importance of interaction. Education
Communication and Information, 2(1), 23-49.
Thurmond, V. & Wambach, K. (2004). Understanding interactions in distance education: A review of the literature.
International Journal of Instructional Technology and Distance Learning, Retrieved from
http://www.itdl.org/journal/Jan_04/article02.htm.
Wang, L., & Morgan, W. (2008). Student perceptions of using instant messaging software to facilitate synchronous
online class interaction in a graduate teacher education course. Journal of Computing in Teacher
Education, 25(1), 15-21.
Woods, R. (2002). How much communication is enough in online courses? Exploring the relationship between
frequency of instructor-initiated personal email and learners' perceptions. International Journal of
Instructional Media, 29(4), 377-395.

24. Interaction, Satisfaction, and Perceived Progress in Online Language
Courses
Chin-Hsi Lin
Binbin Zheng
Yining Zhang
Michigan State University
United States

chinhsi@msu.edu
binbinz@msu.edu
zhangy58@msu.edu

Introduction

An increasing number of K-12 schools offer online courses. Online learning programs and virtual schools
can both be described as educational organizations that offer K-12 courses through Internet- or Web-based methods,
including formal instruction and other resources (Cavanaugh & Blomeyer, 2007). Student enrollment in fully online
schools in the United States was 275,000 in 2011-12 (Watson, Murin, Vashaw, Gemin, & Rapp, 2012), and in 2012-
13 there were 740,000 enrollments in distance-education courses in U.S. K-12 school districts (Johnson, Adams
Becker, Estrada, & Freeman, 2014). The primary purposes of offering online courses for K-12 students are to
expand educational access, provide curricular choices, offer high-quality learning opportunities, and achieve
administrative efficiency (Barbour & Reeves, 2009; Berge & Clark, 2005; Cavanaugh & Blomeyer, 2007). As
Barbour and Reeves (2009) note, however, very little evidence supports the linkages between online learning and
these conjectural benefits, and more research on factors that account for K-12 student success in distance education
is especially needed.
This study aims to fill this gap by examining the relationship between interaction and learning outcomes in
online language courses. From a sociocultural perspective, language learning is an active and interactive process of
exploration and discovery, underscoring the need for mediation and social interaction in the development of
meaning (Lantolf, 2006; Vygotsky, 1978). Language learning is not merely a process of acquiring new language
forms; rather, language learners develop the capacity to regulate their own linguistic production by participating in
social activities regulated by native speakers (Lantolf, 1994, 2000). Previous studies have also demonstrated that
social interaction leads to language acquisition because it provides learners with comprehensible input, that is, input
that matches students' language proficiency so that students can understand it (Alison & Philp, 1998; Krashen, 1985; Long,
1983; Pica, 1994). Interaction is no less essential a component in online education; however, students in online
learning environments tend to have fewer opportunities to interact with their teachers and peers than they would
have in traditional face-to-face instruction. Our study examines the relationship between interaction and learning
outcomes in online language courses taken by middle- and high-school students. Using the most comprehensive
framework for interaction in distance education developed to date, that of Moore (1989), we operationalize interactions into
three types: learner-instructor, learner-learner, and learner-content. The learning outcomes we examine are
satisfaction and perceived progress.
Our research questions are as follows:
1. Do any or all of these three types of interactions affect student satisfaction?
2. Do any or all of these three types of interactions affect perceived progress?

Interaction in Online Education

Interaction is recognized as a fundamental component not only of traditional classrooms, but of distance
education as well (Gunawardena & Zittle, 1997; Hillman, Willis, & Gunawardena, 1994; Moore, 1989; Swan,
2001). In fact, most scholars unequivocally assert the importance of interaction in online education (Bernard, et al.,
2009; Kuo, Walker, Schroder, & Belland, 2014; Marks, Sibley, & Arbaugh, 2005), as interactions among students,
teachers, and content presumably occur in all types of education regardless of delivery methods.


Previous empirical studies of interaction have mainly focused on investigating it as a factor that may
influence learners satisfaction and perceived learning outcomes (e.g., Eom, Wen, & Ashill, 2006; Swan, 2001). The
amount of learners interactions with content, instructors and other learners were associated with how satisfied they
felt and how much they thought they had learned (Swan, 2001). Eom et al. (2006) found that the quantity of
interaction significantly affected distance-learning students satisfaction but not their perceived progress. However,
that study treated interaction as a single, undifferentiated phenomenon. In fact, different types of interactions,
although interrelated, are not identical (Swan, 2001).
In the following sections, we will first review the types of interaction proposed by Moore (1989) and go on
to discuss the differential effects of interaction in online learning. We chose to use Moore's typology not only
because of its popularity in the literature, but also because it provides "an easily observable, measurable variable to
evaluate the impact of interaction in online courses" (Roblyer & Wiencke, 2003, p. 80).

Learner-learner interaction

Moore (1989) characterized interaction as an instructional exchange among learner, instructor and content,
and provided a conceptual framework for three types of interaction in distance education: learner-content, learner-
instructor, and learner-learner. Learner-learner interaction refers to interactions occurring between one student and
others, or among students working in small groups. According to sociocultural theories and the theory of distributed
cognition (Gunawardena, Lowe, & Anderson, 1997; Lantolf, 2006; Salmon, 2011; Vygotsky, 1978), learner-learner
interaction is beneficial for cognitive development, language learning, and motivational support. In addition,
engaging in learner-learner interaction helps the development of deeper learning (Anderson, 2003).
Previous research on learner-learner interaction in online environments has reported mixed results, with
some studies indicating that this type of interaction does not affect student satisfaction in higher education (e.g., Kuo,
et al., 2014), and others showing that it has a small positive effect on satisfaction (Bernard, et al., 2009; Jonassen,
Davidson, Collins, Campbell, & Haag, 1995; Jung, Choi, Lim, & Leem, 2002; Marks, et al., 2005; Sherry, Fulford, &
Zhang, 1998). Anderson (2003) and Bernard et al. (2009) further assert that engaging in learner-learner interaction
helps to increase achievement.

Learner-instructor interaction

Learner-instructor interaction takes place between students and educators. The forms of learner-instructor
interaction can cover a wide spectrum of activity, ranging from formal evaluation and discipline to informal support,
guidance and encouragement (Moore, 1989; Swan, 2002). It can occur synchronously, for instance via video-
conferencing and online chats, or asynchronously via emails, discussion boards and so forth (Bernard, et al., 2009).
A high frequency and high quality of learner-instructor interaction is thought to be essential to successful distance
learning (Moore, 1989).
Several studies have provided consistent empirical evidence to support the importance of learner-instructor
interactions in online education. For example, Arbaugh (2000) found interaction between students and instructors
was the only factor that significantly affected learning in online MBA courses. Jung et al.'s (2002) findings suggested
that college students in the student-instructor interaction group outperformed students in the other two interaction groups,
and they also participated more actively. Similarly, Marks et al. (2005) found that learner-instructor interaction was
the most important among the three types of interactions, and that in graduate-level courses, its effect was twice that
of learner-learner interaction. Kuo et al. (2014) have also reported a positive effect of learner-instructor interaction
on student satisfaction in higher education.

Learner-content interaction

Learner-content interaction, as when learners read learning materials or participate in task-oriented
activities, has come to be considered the basic form of interaction in online learning (Jung, et al., 2002). For Moore
(1989), learner-content interaction was one-way communication from the subject of study to the student. In the
context of online learning, learner-content interaction refers to a variety of pedagogical tools and assignments such
as audio and video presentations, individual or group projects, links to other resources, online chats and discussion
forums. From a social-constructivist perspective, these initiate learners' didactic conversation, reinforcing students'
roles as creators of knowledge (Anderson, 2003; Moore, 1989; Moore & Kearsley, 2011).

Methods

This study was conducted in a virtual school in the Midwestern U.S., and employed a survey approach to
explore our research questions on the relationship between interactions and learning outcomes. This school meets
the definition of a virtual school proposed by Watson et al. (2013): a state-wide supplemental program in which
students take individual courses, while also being enrolled in a physical school or cyber-school within the same
state. This particular non-profit program was authorized by the state and is overseen by state agencies.

Participants

During the period covered by our study, the target school had a total of 1,593 middle- and high-school
students enrolled in various world language courses, including Chinese, French, German, Japanese, Spanish, and
Latin. All were invited to participate in the survey, and 597 did so, a response rate of 37.5%. According to school
staff members, this was a higher response rate than class evaluations in this school would normally receive. Among
the 597 participants, 131 did not provide complete data. Most of these 131 students simply entered no response to
most of the questions, but submitted their survey anyway. Staff informed us that this pattern of submission with
many missing answers was a very common phenomenon in this virtual school in regard to course evaluations and
other surveys.

Survey

A 67-item survey, requiring approximately 20 minutes to complete, was distributed to all the middle- and
high-school students taking language courses in the virtual school in late April 2014. It included questions about the
students' demographic information, motivations, self-efficacy, learning strategies, interactions, and learning
outcomes.

Results

Among the 466 students who provided complete data, 35% were male. Participants were predominantly
White (79%), with 5% Black, 3% Hispanic, 7% Asian, and 6% other. Ninety-two percent of the sample were
high-school students, with 15% in 9th grade, 31% in 10th grade, 26% in 11th grade, and 20% in 12th grade (see
Table 1). The majority of students in our sample (58%) took online language courses as electives, with another 39%
taking them to fulfill requirements and the remaining 4% doing so for credit recovery. With regard to the three
types of interaction, students reported a very limited amount of learner-learner interaction (2.81 out of 7), whereas
their learner-instructor and learner-content interactions were both above average, 4.47 and 4.16 respectively.
Students tended to be satisfied with their course (4.47 out of 7) and felt they had made above-average progress (4.75
out of 7). Intrinsic motivation was moderately correlated with extrinsic motivation and self-efficacy. In addition,
intrinsic motivation had moderate levels of correlation with meta-cognitive and language-learning strategies.
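The descriptive step reported above, computing scale scores as item means on a 7-point scale and then inspecting pairwise Pearson correlations, can be sketched as follows. The scale names, item counts, and responses are illustrative assumptions, not the study's data; only the analytic sample size (n = 466) mirrors the study.

```python
# Illustrative sketch: 7-point Likert items are averaged into scale scores,
# then Pearson correlations between scales are inspected. Simulated data.
import numpy as np

rng = np.random.default_rng(2)
n = 466  # the study's analytic sample size

# Simulated item responses (1-7 Likert) for two hypothetical 4-item scales
intrinsic_items = rng.integers(1, 8, size=(n, 4)).astype(float)
extrinsic_items = rng.integers(1, 8, size=(n, 4)).astype(float)

intrinsic = intrinsic_items.mean(axis=1)  # scale score = mean of the items
extrinsic = extrinsic_items.mean(axis=1)

r = np.corrcoef(intrinsic, extrinsic)[0, 1]  # Pearson correlation
print(round(intrinsic.mean(), 2), round(r, 3))
```

With real survey data, the same scale means would produce the descriptives reported in the text (e.g., 2.81 out of 7 for learner-learner interaction) and the correlation matrix among the motivational and strategy constructs.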

Table 1 Standardized Effects of Interaction on Satisfaction and Perceived Progress

Satisfaction Perceived Progress


Learner-learner interactions -0.01 0.02
(-0.25) (0.65)
Learner-instructor interactions 0.14*** 0.05
(3.78) (1.46)
Learner-content interactions 0.48*** 0.11**
(10.41) (2.81)
R2 0.67 0.75
Observations 466 466
Note. Control effects included in the analytic model are students' demographic information, motivations, self-
efficacy, and learning strategies.
*p<.05. **p<.01. ***p<.001.


Satisfaction

We controlled for students' demographic information, reasons for taking the course (elective, credit recovery,
or required course), intrinsic motivation, extrinsic motivation, self-efficacy, metacognitive strategies, and language-
learning strategies in comparing the three types of interactions. Table 1 shows that among the three
types of interactions, learner-instructor and learner-content interactions had significant positive effects on student
satisfaction: b = 0.14, p < 0.001 and b = 0.48, p < 0.001, respectively. While the effects of both types of interaction
on student satisfaction were positive, it should be noted that the effect size of learner-content interaction (0.48) was
more than three times that of learner-instructor interaction (0.14), and three times that of language-learning
strategies (0.16).
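The standardized effects reported in Table 1 can be illustrated with a small simulation: all variables are z-scored before fitting, so each coefficient is a beta weight comparable across predictors. The variable names, the single stand-in covariate, and the simulated effect sizes below are assumptions for illustration, not the study's data or model.

```python
# Illustrative sketch of standardized (beta) regression coefficients with a
# covariate controlled. Simulated data; 'controls' stands in for the
# demographic, motivation, and strategy covariates listed in the table note.
import numpy as np

def zscore(x):
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(1)
n = 466  # the study's analytic sample size

controls = rng.normal(size=n)
learner_learner = rng.normal(size=n)
learner_instructor = rng.normal(size=n)
learner_content = 0.5 * learner_instructor + rng.normal(size=n)
satisfaction = (0.15 * learner_instructor + 0.5 * learner_content
                + 0.2 * controls + rng.normal(size=n))

# z-score outcome and predictors, then OLS: coefficients are beta weights
Z = np.column_stack([np.ones(n)] + [zscore(v) for v in
                    (learner_learner, learner_instructor, learner_content, controls)])
betas, *_ = np.linalg.lstsq(Z, zscore(satisfaction), rcond=None)

labels = ["intercept", "learner-learner", "learner-instructor",
          "learner-content", "controls"]
for name, b in zip(labels, betas):
    print(f"{name}: {b:.3f}")
```

Because every variable is standardized, the intercept is zero and the remaining coefficients can be compared directly in size, which is what allows statements such as "the effect of learner-content interaction was more than three times that of learner-instructor interaction."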

Perceived progress

We ran a similar analysis for perceived progress and found a very similar pattern. As seen from Table 1,
among the three types of interaction, only learner-content interaction had a significant positive effect on students'
perceived progress, b = 0.11, p < 0.01. Though learner-instructor interaction had a positive effect on student
satisfaction, it did not affect students' perceived progress, b = 0.05, n.s.

Discussion

Interaction and satisfaction

Among the three types of interaction, learner-instructor and learner-content interactions had a significantly
positive effect on satisfaction, whereas learner-learner interaction did not affect it. Learner-instructor and learner-
content interactions together explained 11% of the variance in student satisfaction, and learner-content interaction
was found to be the strongest predictor of student satisfaction out of the three types of interactions.
The significant impact of learner-content interaction on student satisfaction has been consistently
reported in previous studies of higher education (Chejlyk, 2006; Keeler, 2006; Kuo, et al., 2014), and to our
knowledge, this is one of the very few studies that examine the relationship between online interaction and student
satisfaction in a K-12 environment. Our results extend the findings from research in higher-education settings to high-
school students, and provide further empirical evidence that learner-content interaction has a positive impact on
student satisfaction. The importance of learner-instructor interaction to student satisfaction is also supported by our
results. Such findings are consistent with previous research that examined online undergraduate (e.g., Kuo et al.,
2014) and graduate courses (e.g., Marks et al., 2005). Many scholars have asserted the importance of learner-learner
interaction and concluded that it has a small but positive effect on satisfaction (e.g., Anderson, 2003; Bernard et al.,
2009; Jung et al., 2002). Our results, aligning with those of Kuo et al. (2014), found no evidence to support this point
of view. One possible explanation for these differences may reside in the specific context of virtual schools. As
Kaseman and Kaseman (2000) point out, some courses in virtual schools operate like traditional correspondence
courses and provide very limited interactional opportunity for students, whereas in others, students can interact with
teachers and peers through various modes of computer-mediated communication including emails, discussion
forums, chat rooms, audio- or video-conferencing, and instant messaging. As shown in Table 1, the overall level of
learner-learner interaction reported by our sample was much smaller than the levels of learner-instructor and learner-
content interaction. From our preliminary analysis of teacher interviews, the online courses we examined are very
similar to the first type of online courses Kaseman and Kaseman described: the school purchases ready-made
courses that leave very little scope for changes to their content by teachers; this makes it difficult to promote learner-
learner interaction using project- or task-based learning. As Kuo et al. (2014) conclude, "if collaboration among
learners is not required, then learner-learner interaction may not affect student satisfaction at all" (p. 44).

Interaction and perceived progress

Among the three types of interactions, learner-content interaction is the only one that affected perceived
progress, accounting for 1% of its variation. The significant effect of learner-content interaction in predicting
students' perceived progress, alone among the three types, supports the social-constructivist perspective in which
these learning materials or tools initiate learners' thought and conversation, and students become creators of

knowledge (Anderson, 2003; Moore, 1989; Moore & Kearsley, 2011). In addition, our finding regarding the
importance of learner-content interaction reflected the changing nature of Web-based learning content. When we
think of learner-content interaction, we are no longer referring to the traditional one-way communication starting
from learning material to students. In fact, the greatly enhanced capabilities of the online courses, and of the Internet
itself, have redefined the function of student-content interaction. As Anderson (2003) points out, the revolutionary
development of student-content interaction has overturned its former reliance on text-based material, and numerous
types of computer-assisted instruction, simulations and presentation tools are steadily enriching the body of learning
content. Our results contradict Marks et al.'s (2005) finding that the majority of student-content variables were
insignificant in predicting perceived progress. We believe that the divergent characteristics of the online classes in
that study and ours may explain this discrepancy. Marks et al. conducted their study in a number of business courses
in an MBA program, whereas ours targeted courses teaching various world languages to high-school students. As
learning a language requires a tight connection between content-learning and language-teaching (Coyle, 2007), it is
perhaps not surprising to observe a heavier reliance on course content among language students than among
business students. Lastly, we would like to stress that our findings regarding the non-significance of the roles of
learner-instructor and learner-learner interaction in predicting students' perceived progress do not mean that these
two types of interaction are unimportant in distance learning. The use of off-the-shelf course content, coupled with
limited synchronous class time, may have hindered our observation of a significant effect of learner-instructor
and/or learner-learner interaction. Therefore, we would like to call for replication of this type of study in different
online learning contexts in the future.

Conclusion

This study provides empirical evidence for the extent to which each of the three main types of interaction
predicts students' progress and satisfaction. Interactions increase the variance explained by our models by 10% for
satisfaction and 1% for perceived progress, after controlling for the variance explicable by motivational variables
and learning strategies. This study adds to the body of research that examines the role of interaction in online
learning environments in higher education, and extends it to cover high-school students taking online language
courses. Its results suggest that improvements in learner-content interaction may help to increase learner satisfaction
and perceived progress.
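The incremental-variance figures above reflect a hierarchical (blocked) regression: the control variables are fit first, the interaction predictors are added in a second step, and the change in R² between steps is reported. The sketch below illustrates that procedure with synthetic data; the variable names, effect sizes, and sample size are illustrative assumptions, not the study's actual data or model.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (an intercept column is prepended)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(0)
n = 200
motivation = rng.normal(size=n)        # control block: motivational variable
strategies = rng.normal(size=n)        # control block: learning strategies
lc_interaction = rng.normal(size=n)    # added block: learner-content interaction
satisfaction = (0.4 * motivation + 0.3 * strategies
                + 0.5 * lc_interaction + rng.normal(size=n))

controls = np.column_stack([motivation, strategies])
full = np.column_stack([motivation, strategies, lc_interaction])

r2_step1 = r_squared(controls, satisfaction)   # step 1: controls only
r2_step2 = r_squared(full, satisfaction)       # step 2: controls + interaction
delta_r2 = r2_step2 - r2_step1                 # variance added by interaction
print(f"R2 step 1 = {r2_step1:.3f}, step 2 = {r2_step2:.3f}, delta = {delta_r2:.3f}")
```

Because the step-1 model is nested within the step-2 model, R² cannot decrease when the interaction block is added, so the delta isolates the variance attributable to the added predictors.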
These findings have some important implications for the design of online language courses for high-school
students. Attention must be paid to incorporating different types of interactions into the courses. An effective
starting point for doing so, as clearly demonstrated by Swan (2002), is to recognize that verbal immediacy behaviors
can support interactions among online learners. Immediacy behaviors are communication behaviors that reduce the
perceived psychological distance between communicators (Wiener & Mehrabian, 1968); examples of verbal immediacy behaviors include
giving praise, soliciting viewpoints, and use of humor. Through verbal immediacy behaviors, participants project
their identities into communications and create social presence, which promotes the sense of community, interaction,
and student engagement (Arbaugh, 2000; Battalio, 2007; Roblyer & Wiencke, 2003; Swan, 2002).
Several directions for future research are suggested by this study. First, further investigation, especially in-
depth qualitative research, is needed to advance our understanding of how different types of interaction affect
student satisfaction and progress. Second, given the significance of interaction in online learning, further research
should examine how specific course components may promote or inhibit different types of interactions. Third, the
participants in this study were drawn from online language classes at a single virtual school, so the results may not
generalize well to other types of online high-school courses or to other regions. A more diverse population should
therefore be studied in future research. Empirical assessments of interaction and measurement of effects on
achievement, as suggested by Wagner (1994), should also be pursued.

References

Mackey, A., & Philp, J. (1998). Conversational interaction and second language development: Recasts, responses,
and red herrings? The Modern Language Journal, 82(3), 338-356. doi: 10.2307/329960
Anderson, T. (2003). Modes of interaction in distance education: Recent developments and research questions. In
M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 129-144). Mahwah, NJ:
Lawrence Erlbaum Associates.
Arbaugh, J. B. (2000). How classroom environment and student engagement affect learning in Internet-based MBA
courses. Business Communication Quarterly, 63(4), 9-26. doi: 10.1177/108056990006300402


Barbour, M. K., & Reeves, T. C. (2009). The reality of virtual schools: A review of the literature. Computers &
Education, 52(2), 402-416. doi: 10.1016/j.compedu.2008.09.009
Barker, K., & Wendel, T. (2001). E-learning: Studying Canada's virtual secondary schools. Kelowna, BC: Society
for the Advancement of Excellence in Education.
Barnard-Brak, L., Lan, W. Y., & Paton, V. O. (2010). Profiles in self-regulated learning in the online learning
environment. The International Review of Research in Open and Distance Learning, 11(1), 61-80.
Battalio, J. (2007). Interaction online: A reevaluation. Quarterly Review of Distance Education, 8(4), 339-352.
Berge, Z. L., & Clark, T. A. (2005). Virtual schools: Planning for success. New York: Teachers College Press.
Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., et al. (2009). A meta-
analysis of three types of interaction treatments in distance education. Review of Educational Research,
79(3), 1243-1289. doi: 10.2307/40469094
Cavanaugh, C., & Blomeyer, R. L. (2007). What works in K-12 online learning. Eugene, OR: International Society
for Technology in Education.
Chang, S.-H., & Smith, R. A. (2008). Effectiveness of personal interaction in a learner-centered paradigm distance
education class based on student satisfaction. Journal of Research on Technology in Education, 40(4), 407-
426.
Chejlyk, S. (2006). The effects of online course format and three components of student perceived interactions on
overall course satisfaction. (Doctoral dissertation), Capella University, Ann Arbor. Available from
ProQuest (UMI No. 3213421)
Eastin, M. S., & LaRose, R. (2000). Internet self-efficacy and the psychology of the digital divide. Journal of
Computer-Mediated Communication, 6(1). Retrieved from
http://onlinelibrary.wiley.com/doi/10.1111/j.1083-6101.2000.tb00110.x/full
Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students' perceived learning outcomes and
satisfaction in university online education: An empirical investigation. Decision Sciences Journal of
Innovative Education, 4(2), 215-235. doi: 10.1111/j.1540-4609.2006.00114.x
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online debate and the development
of an interaction analysis model for examining social construction of knowledge in computer conferencing.
Journal of Educational Computing Research, 17(4), 397-431.
Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated
conferencing environment. American Journal of Distance Education, 11(3), 8-26. doi:
10.1080/08923649709526970
Hillman, D. C. A., Willis, D. J., & Gunawardena, C. N. (1994). Learner-interface interaction in distance education:
An extension of contemporary models and strategies for practitioners. American Journal of Distance
Education, 8(2), 30-42. doi: 10.1080/08923649409526853
Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC horizon report: 2014 K-12 edition. Austin,
TX: The New Media Consortium.
Jonassen, D., Davidson, M., Collins, M., Campbell, J., & Haag, B. B. (1995). Constructivism and computer
mediated communication in distance education. American Journal of Distance Education, 9(2), 7-26.
Jung, I., Choi, S., Lim, C., & Leem, J. (2002). Effects of different types of interaction on learning achievement,
satisfaction and participation in Web-based instruction. Innovations in Education and Teaching
International, 39(2), 153-162. doi: 10.1080/14703290252934603
Kaseman, L., & Kaseman, S. (2000). How will virtual schools affect homeschooling? Home Education Magazine,
(November-December), 16-19.
Keeler, L. C. (2006). Student satisfaction and types of interaction in distance education courses. (Doctoral
dissertation). Available from ProQuest (UMI No. 3233345)
Krashen, S. D. (1985). The input hypothesis. London: Longman.
Kuo, Y.-C., Walker, A. E., Schroder, K. E. E., & Belland, B. R. (2014). Interaction, Internet self-efficacy, and self-
regulated learning as predictors of student satisfaction in online education courses. The Internet and Higher
Education, 20, 35-50. doi: 10.1016/j.iheduc.2013.10.001
Lantolf, J. P. (1994). Sociocultural theory and second language learning: Introduction to the special issue. The
Modern Language Journal, 78(4), 418-420. doi: 10.2307/328580
Lantolf, J. P. (2006). Introducing sociocultural theory. In J. P. Lantolf & S. L. Thorne (Eds.), Sociocultural theory
and the genesis of second language development (pp. 1-26). New York: Oxford University Press.
Lantolf, J. P. (Ed.). (2000). Sociocultural theory and second language learning. Oxford: Oxford University Press.

Long, M. H. (1983). Native speaker/non-native speaker conversation and the negotiation of comprehensible input.
Applied Linguistics, 4(2), 126-141. doi: 10.1093/applin/4.2.126
Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online
learning. Journal of Management Education, 29(4), 531-563. doi: 10.1177/1052562904271199
Moore, M. G. (1989). Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.
Moore, M. G., & Kearsley, G. (2011). Distance education: A systems view of online learning. New York:
Wadsworth.
Noels, K. A., Pelletier, L. G., Clément, R., & Vallerand, R. J. (2000). Why are you learning a second language?
Motivational orientations and self-determination theory. Language Learning, 50, 57-85. doi: 10.1111/0023-
8333.00111
Pica, T. (1994). Research on negotiation: What does it reveal about second-language learning conditions, processes,
and outcomes? Language Learning, 44(3), 493-527. doi: 10.1111/j.1467-1770.1994.tb01115.x
Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the
motivated strategies for learning questionnaire (MSLQ). Educational and Psychological Measurement,
53(3), 801-813. doi: 10.1177/0013164493053003024
Queen, B., & Lewis, L. (2011). Distance education courses for public elementary and secondary school students:
2009-10. First look. NCES 2012-008. Washington, DC: National Center for Education Statistics.
Roblyer, M. D., & Wiencke, W. R. (2003). Design and use of a rubric to assess and encourage interactive qualities
in distance courses. American Journal of Distance Education, 17(2), 77-98. doi:
10.1207/S15389286AJDE1702_2
Salmon, G. (2011). E-moderating: The key to teaching and learning online (3rd ed.). New York: Routledge.
Sherry, A. C., Fulford, C. P., & Zhang, S. (1998). Assessing distance learners' satisfaction with instruction: A
quantitative and a qualitative measure. American Journal of Distance Education, 12(3), 4-28.
Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in
asynchronous online courses. Distance Education, 22(2), 306-331.
Swan, K. (2002). Building learning communities in online courses: The importance of interaction. Education,
Communication & Information, 2(1), 23-49. doi: 10.1080/1463631022000005016
Tseng, W.-T., Dörnyei, Z., & Schmitt, N. (2006). A new approach to assessing strategic learning: The case of self-
regulation in vocabulary acquisition. Applied Linguistics, 27(1), 78-102. doi: 10.1093/applin/ami046
Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Wagner, E. D. (1994). In support of a functional definition of interaction. American Journal of Distance Education,
8(2), 6-29. doi: 10.1080/08923649409526852
Watson, J., Murin, A., Vashaw, L., Genin, B., & Rapp, C. (2012). Keeping pace with K-12 online and blended
learning: An annual review of policy and practice. Retrieved September 20, 2013, from
http://kpk12.com/cms/wp-content/uploads/KeepingPace2012.pdf
Watson, J., Murin, A., Vashaw, L., Genin, B., & Rapp, C. (2013). Keeping pace with K-12 online and blended
learning: An annual review of policy and practice. Retrieved September 20, 2013, from
http://kpk12.com/cms/wp-content/uploads/EEG_KP2013-lr.pdf
Wiener, M., & Mehrabian, A. (1968). Language within language: Immediacy, a channel in verbal communication.
New York: Ardent Media.

25. Keeping It Real: Factors that Impact Social Presence, Feelings of
Isolation, and Interactivity in Online Learning

Franco Taverna, Ph.D., Senior Lecturer, Human Biology Program, University of Toronto, Canada
Lena Paulo Kushnir, Ph.D., Assistant Professor, Psychology, OCAD University, Canada
Kenneth Berry, M.Sc., Research Associate, OCAD University, Canada
Laurie Harrison, M.Ed., Director, Online Learning, University of Toronto, Canada

Introduction
Since the very beginning of formal education, instructors have been lecturing to students in the "sage on the
stage" format (King, 1993, p. 30). The technique dominates to this day, and students have come to expect this style
of education. Borrowing terminology from the lexicon of online learning, this might be described as synchronous;
that is, it happens in real time, and it gives the student personal contact with the professor. The social presence of
the instructor is inherent in this teaching format, and the opportunity for social interaction, with both the professor
and peers, is very important to students. A second key aspect of this social context is the availability of immediate
feedback, such as asking questions of the instructor or discussing ideas with classmates, although the level of interactivity
and active learning depends very much on the pedagogical style of the "sage on the stage" and an instructor's openness
to moving toward being a "guide on the side" (King, 1993).
Online education is not new, but recently the Massive Open Online Course (MOOC) phenomenon has
foregrounded its importance and potential for the transformation of higher education (Carey & Trick, 2013). Many
leaders within higher education are positioning online education as critical to their long-term strategy (Allen &
Seaman, 2013) and continue to explore the possibility that online learning may provide increased access (Daniel et
al., 2009) and/or cost-efficiencies (Bates & Sangra, 2011). However, an overarching concern is the importance of
maintaining the quality of the educational experience, and of ensuring that students are as actively engaged in the
learning process as they are in traditional face-to-face courses.
One of the biggest challenges for online learning is providing social presence, including interactivity with course
content as well as with peers, to promote active learning. The idea of social presence in online learning has been defined
by Garrison (2009) as "the ability of participants to identify with the community (e.g., course of study),
communicate purposefully in a trusting environment, and develop inter-personal relationships by way of projecting
their individual personalities" (p. 352). Furthermore, the type of interaction seems important; immediate interaction
has been shown to increase perception of instructor social presence (Shutt & Laumakis, 2009), and enhance learning
(Zirkin & Sumler, 1995). For many years we have had the tools to provide online synchronous teaching and
learning with all of our students online at the same time. Most webinar software platforms allow students to
participate in classes, tutorials, lectures and presentations, mimicking the face-to-face class experience. Recently,
there has been much greater focus on asynchronous teaching strategies, with podcasts, lecture recordings and
Massive Open Online Courses (MOOCs) becoming ubiquitous in the higher education landscape. Although the
online environment provides myriad methods of interaction with classmates and instructors, the most commonly
used types are not immediate (e.g., email, discussion boards). The lack of strong social presence may lead to a
sense of isolation or loneliness, which is often correlated with higher student dropout rates (Pigliapoco & Bogliolo, 2007).


Another important dimension of online education is known as teaching presence, that is, the design,
facilitation and direction of the online learning experiences (Anderson et al., 2001). Teaching presence is promoted
by the design of teaching strategies that integrate student engagement with the instructor, their peers and the course
content. Online synchronous technology tools provide affordances such as direct instruction through real-time
communication between the instructor and students, and immediate interaction with the instructor and peers for
discussion and clarification of course material.
Both social presence and teaching presence are important underpinnings that influence student experience (Lee
et al., 2011; Lyons et al., 2012) and lessen the sense of isolation in online classes (e.g., see Munoz et al., 2014).
Early research on distance education indicated that students' perceptions of the degree of interaction influence their
view of the quality of the course (Klesius et al., 1997). Both student-student and student-instructor interactions
have been shown to be significant predictors of student satisfaction (Kuo et al., 2014), and correlated with student
achievement (Zirkin & Sumler, 1995).

Methodology
We analyzed two online courses, one delivered in real time (synchronously), that is, Study 1 below, and one
delivered asynchronously, that is, Study 2 below, using teaching and learning strategies that provide social presence
and teaching presence. Social presence was promoted by offering opportunities to purposely communicate between
students and instructor. Teaching presence was fostered by using activities with lots of feedback, allowing
interaction with the instructor, course content, and interaction between students. In order to evaluate the impact of
these instructional design strategies, we assessed student perception of the course activities and their satisfaction
with their learning experience using a mixed quantitative and qualitative anonymous survey methodology. The use
of synchronous tools in the real-time course to simulate the experience of a face-to-face classroom setting may
help to create a satisfying learning experience (Salmon, 2000), while the use of asynchronous tools like video
messaging, discussion boards and interactive tools may help to increase interactivity amongst students, engagement
and satisfaction in the asynchronous course. Some of the data from Study 2 is published elsewhere (Paulo Kushnir
& Berry, 2014); it is used here to expand discussions around presence, interactivity, engagement and satisfaction.

Study 1 Synchronous Course


The synchronously delivered online course was a second-year undergraduate human biology course offered during
the winter 2012-13 semester (the final exam was completed face-to-face). The enrollment at the beginning of the
course was 135 students, with 125 completing. The course was delivered with the Blackboard Collaborate webinar
platform. Attendance and participation were required for all 135 students during the two-hour weekly online lecture and
the one-hour weekly online tutorials. Online lectures and tutorials were designed using the familiar format of a
PowerPoint-style presentation via the Blackboard Collaborate Whiteboard feature as a foundation. The instructor
was present live via webcam and audio. Students attended class by logging into the webinar software on their
local computers. All lectures and tutorials were recorded, and the videos posted for later student review. Surveys
revealed that 42% of students logged in from home, 16% from a campus residence, 34% from an on-campus
location such as a library using their own computer, and 5% from campus using a university computer.

Study 2 Asynchronous Course
The asynchronously delivered online course was an undergraduate Introductory Psychology course offered
during the summer 2013 semester (the term tests and the final exam were completed face-to-face). A total of 52
students were enrolled in this course. Students received either 30-60 minute lecture videos as part of the online
course materials (i.e., section 01), or the same lecture videos chunked into 5-15 minute segments with
embedded quizzes that popped up during the short clips to assess engagement and interactivity with course
content (i.e., section 02). All course messages and materials were delivered via the institution's learning management
system (section 01 of the same course) and, for some students, via a video messaging tool to assess any impact on
social presence (section 02 of the same course).

Data Collection and analyses


Students in both studies received an end-of-term survey. The survey provided important summative feedback
about student experience, satisfaction, and engagement, and about which course components most contributed to
their learning experience. Students were provided with open-ended questions to share feedback about course
components, delivery and overall effectiveness. Analysis of the data included response frequencies for the
quantitative survey questions. Qualitative analyses of the open-ended survey questions included response
frequencies of the students' text answers across the groups, together with weighted word lists that were calculated
and rendered as word clouds. The word clouds summarized the text that students wrote in their open-ended
answers. A word cloud visualizes information related to a specific survey question; in essence, it visually
depicts the frequency of specific topics that students wrote about in their open-ended answers. The
importance (or frequency) of specific words is displayed using font size (see Bateman et al., 2008 for an
overview of word/tag clouds).
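The frequency-to-font-size mapping that underlies a word cloud can be sketched in a few lines. This is an illustrative reconstruction, not the authors' actual analysis pipeline: the function name, the stopword list, the linear scaling, and the font-size range are all assumptions.

```python
from collections import Counter
import re

def word_cloud_weights(responses, min_font=10, max_font=48,
                       stopwords=frozenset({"the", "a", "and", "to", "of", "in", "was", "i"})):
    """Map each word's frequency across open-ended answers to a font size."""
    words = []
    for answer in responses:
        # Tokenize, lowercase, and drop common function words.
        words += [w for w in re.findall(r"[a-z']+", answer.lower())
                  if w not in stopwords]
    counts = Counter(words)
    ranked = counts.most_common()           # sorted by frequency, descending
    hi, lo = ranked[0][1], ranked[-1][1]
    span = max(hi - lo, 1)                  # avoid division by zero
    # Linearly rescale each count into the [min_font, max_font] range.
    return {w: round(min_font + (c - lo) / span * (max_font - min_font))
            for w, c in counts.items()}

sizes = word_cloud_weights([
    "the quizzes were engaging and interactive",
    "interactive lectures helped me stay engaged",
    "engaging quizzes, interactive polling",
])
```

Here the most frequent word ("interactive", three occurrences) receives the largest font size and singletons receive the smallest, which matches the paper's description of displaying word importance through font size.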

Results and Discussion


Social Presence
Getting to know an instructor can provide an effective means of increasing social presence in a course. This
may be challenging in online environments. In Study 1, the webinar tool afforded many opportunities for real-time
student-instructor interactions. For example, activities that took advantage of webinar tools such as an interactive
whiteboard and a "raise your hand" feature to ask questions in class provided a valued vehicle for social presence.
The ability to ask questions in class was also frequently mentioned as being satisfying in Study 1 (Figure 1).

Figure 1. What satisfied students most about the course. Left: Study 1; right: Study 2.

In addition, the "raise your hand"/chat box questions and the polling (yes/no answer) features (both of which provided
opportunities to interact with the instructor) were reported by 40% and 45% of students, respectively, as most
contributing to their learning experience (Figure 2). In Study 2, the lecture podcasts and embedded quizzes were


frequently chosen as components contributing to the learning experience (Figure 2). When we asked students what
satisfied them most about the course, interactivity was consistently reported in Study 1, while engaging video
lessons were commonly mentioned in Study 2 (Figure 1). In Study 2, we also asked which strategies helped
students get to know the instructor; the lecture recordings and video messaging/announcements were the two most
frequent answers (Figure 3).
These observations suggest that social presence can be fostered in both synchronous and asynchronous
environments. Video lectures that include the instructor (and not just a screen cast of lecture content) and video
messages and announcements rather than just text based announcements, enhance social presence in asynchronous
environments, while taking advantage of the myriad interaction tools available in most webinar software provides
simple means to enhance social presence in both learning spaces. Students in online courses have reported greater
satisfaction with their learning experience when there is a greater sense of social presence in the course
(Anderson et al., 2001; Garrison & Cleveland-Innes, 2005; Lyons, Reysen & Pierce, 2012; Richardson & Swan,
2003).

Figure 2. Course components that contributed most to learning experience, for Study 1 (synchronous course, left)
and Study 2 (asynchronous course, right)

Figure 3. For Study 2 (asynchronous) - Course component that helped students get to know the instructor

Teaching Presence
Opportunities for student interaction with instructors and classmates provide feelings of community in
traditional face-to-face courses (Homberg-Wright & Wright, 2012). Historically, students report their preference for
face-to-face classes because they like interacting with the instructor and their classmates (Berry & Paulo Kushnir,
2013; Daymont & Blau, 2008). Can online environments better mimic the face-to-face experience by providing
plenty of opportunities for engagement and interaction? Delivering a lecture synchronously using webinar tools
mimics the face-to-face experience in many ways. Real-time audio and video, and multiple interaction capabilities
allow engagement with immediate feedback. In Study 1, students ranked many of these capabilities as contributing
most to their learning experience (see Figure 2). The lecture webinars and the real-time quizzing and polling features
were all rated by the majority of respondents as important for their learning experience. Of note, the high level of
interactivity seemed to be unexpected for most students in Study 1. For example, one student wrote: "I was quite
skeptical at first about the concept of an online course. However, it turned out quite nicely. I find that it has a more
interactive student-instructor relationship than in normal classes." Many students shared this sentiment in the
open-ended responses.
In Study 2, the embedded quizzes within the asynchronous lecture videos, and the lecture videos showing the instructor
in action, helped students feel engaged with the instructor and course material (see Figure 4), while the peer
instruction activities were mentioned most frequently as providing engagement with classmates.

Figure 4. For Study 2, course components that helped students feel most engaged.

Challenges and Student Dissatisfaction


In both studies, specific opportunities were designed for students to interact with one another. In Study 1,
virtual breakout room sessions allowed students to interact in real time to brainstorm, problem solve, or gather
evidence from internet sources. These sessions proved to be very problematic for many students. Indeed, only 18%
chose breakout room sessions as components contributing to the learning experience (Figure 2). Comments
revealed that a lack of participation and interactivity by some students within the breakout sessions was the main
reason for the poor experience. In Study 2, webinar breakout rooms were also used to facilitate synchronous peer
activities, but they had to be abandoned due to continuous problems with the webinar tool. They were instead
replaced with asynchronous discussions that allowed students to interact with one another in peer instruction
activities, which students later reported were among the course components that helped them feel the most engaged
(Figure 4).
A common theme for both the asynchronous and synchronous online courses is the desire for more face-to-face
interaction with the instructor and classmates (Figure 5). These responses contrast with the general level of
satisfaction in both courses, and with the high level of satisfaction with the interactivity with both the instructor and
classmates, particularly in Study 1 (synchronous delivery). For example, one student from Study 1 wrote:
A pleasant surprise is how well I feel like I know my classmates. It is a lot more personal in some
ways to see their names in that chat window, and I feel as though I have gotten to know them, in
contrast to a normal offline lecture, where you never see the same face twice and there is no
connection between students. Surprisingly, online lectures feel more personal - who knew?
In Study 2, we asked students if they felt isolated participating in the online course. Students who received
the more personal and lively video messages/announcements (i.e., students in section 02) reported feeling less isolated
than students who received text announcements (i.e., students in section 01; see Figure 6).


Figure 5. Left: for Study 1, what students found most frustrating or problematic about the course. Right: for Study
2, what types of interaction students wanted more of.

Figure 6. For Study 2, students' reports of feeling isolated in the course.

In the future, activities that encourage student-student interaction should be designed to overcome these
challenges. For example, students could be placed in larger groups for group activities, ensuring that a critical
mass of students will engage and help to overcome any resistance or lack of participation by some. In addition,
tasks within student-student interactive activities could be more directed to individuals to better ensure participation
within the group.
We found one other aspect of the synchronous online delivery format that students reported as frustrating or
problematic. Many students stated that they were distracted in the webinar sessions (Figure 5). There were two
types of distraction described. The first comes from the running dialogue that occurs in the chat box
during lectures. As the course progressed, students were encouraged to overcome this distraction by only referring
to the chat box when the instructor paused for Q&A. The second concerned the online format in general, as illustrated
by this student comment: "Sometimes just paying attention. As compared to a real time, in person lecture, my
attention tends to stray a little bit more often." Again, we note this paradox in student responses: tools and activities
that are designed to be engaging and interactive, and to bring a sense of social and teaching presence to the class, were
found to be problematic by some students.

Conclusions
Online teaching and learning systems, whether synchronously or asynchronously delivered, can offer myriad
opportunities for engagement, interaction and presence. Many can provide some of the perceived advantages of the
face-to-face setting to the online learning environment. While we may not make eye contact in the literal sense,
social presence, teaching presence, and interactivity (with immediate feedback) with instructors and other students
in the online space can contribute to a positive experience for students in online courses. Coupled with the
convenience, accessibility and easy availability of learning aids, such as lecture recordings, both synchronous and
asynchronous online formats can foster teaching and learning enhancements while providing desired access and
efficiencies. Although a commonly desired type of interaction, student-to-student, is challenging to replicate in
the online space, and at least some students report dissatisfaction and isolation, this paper provides insights into
potential solutions. Both instructors and students might need to redefine what is meant by interactivity and
connectivity as we learn to learn in online learning environments.
Social interactions are often difficult to conduct online. Using social psychology theory, we can try to
understand online interactions. Many online interactions lack the social context and social cues afforded by face-to-
face interactions, making online ones more difficult and effortful than face-to-face interactions (Paulo Kushnir,
2004). Ironically, online interactions are, without a doubt, social. In trying to understand the social world of
online learning environments, it seems obvious to consider issues regarding the various ways in which we deal with
one another, influence one another and act together as a group, that is, how we interact with one another. Perhaps
less obvious (and more difficult to study), but important for understanding the social issues of online learning, would
be to consider the various ways in which we, individually, try to understand the social world around us, that is,
our social cognition. Here one can consider issues around social comparison, attitudes, impression
management, attribution, self-perception and other cognitive constructions.
There are many kinds of interactions that one can consider, but two that are particularly well suited for
understanding some of the findings and discussions in this paper are one-on-one interactions and many-on-one
interactions.

One-on-one Interactions
Some social psychologists would argue that there are two common principles that govern the way in which we
deal with one another, that is, social exchange theory and the reciprocity principle. The idea is that our interactions
have a give-and-take quality to them and when we take something, we feel that we must repay or reciprocate what
has been given. An important element of these principles is that we do not get something for nothing. This seems
like a rather negative view of how we relate to one another and dismisses evidence of genuine altruistic or helping
behaviour; but there is ample evidence that people often do not demonstrate prosocial behaviour even though it is
needed (e.g., in emergency situations). While such evidence may initially seem to support the ideas of social
exchange theory and the reciprocity principle, social psychologists have illustrated that often what seems like cold,
uncaring behaviour is really the result of what has been coined the "bystander effect" or "bystander apathy". This
is a social phenomenon that demonstrates a decreased likelihood that an individual will help someone in distress,
due to the presence of other potential helpers (Sternberg, 1994). These other potential helpers communicate
nonverbal cues that confuse or mislead a potential helper. For example, the apathetic look of other potential helpers
might cause one to be confused as to whether there really is an emergency. Also, the mere presence of others can
mislead a potential helper into believing that help has already been given and no further action is necessary.
Markey (2000) discusses the bystander effect in online learning environments and argues that the same laws
that have traditionally applied to face-to-face interaction also operate online. This author reports real-life
examples of bystander apathy in on-line news groups and chat lines, and reports empirical evidence of factors
contributing to this phenomenon in online environments. When comparing helping behaviour in online interactions,
Markey (2000) found that if students needed help, directing their query to someone specific and using that person's
name yielded help more quickly than if they did not address someone directly. Just as social psychologists have
found in face-to-face situations, Markey (2000) also found that participant group size had a significant
effect on the likelihood of one getting help in an online setting. Specifically, if the group size is small (for example,
two participants) and one asks for help without addressing the potential helper by name, then one is likely to get help
more quickly than if the group size is large (for example, 19 participants). There was no effect of group size when
one asked for help and specified the name of the person from whom they were asking help (i.e., one would always get
help quickly under such conditions).
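Markey's pattern, where group size matters only when a request is unaddressed, can be illustrated with a toy diffusion-of-responsibility model. This sketch is our own illustration, not Markey's analysis: the base helping rate `p` and the `p / n` diffusion rule are assumptions chosen to reproduce the qualitative effect, not values from the study (and the model predicts the likelihood of help, whereas Markey measured its speed).

```python
# Toy model of diffusion of responsibility (illustrative only; the
# base rate p and the p/n diffusion rule are assumptions, not
# quantities reported by Markey, 2000).

def p_anyone_helps(n_bystanders, p=0.8, addressed=False):
    """Probability that at least one person responds to a request."""
    if addressed:
        # A request that names someone makes that person fully
        # responsible, so group size becomes irrelevant.
        return p
    # Unaddressed: responsibility is diffused, so each of the n
    # bystanders helps with probability p / n, independently.
    return 1 - (1 - p / n_bystanders) ** n_bystanders
```

Under these assumptions, a small unaddressed group (n = 2) yields a higher chance of help than a large one (n = 19), while addressed requests are unaffected by group size, which matches the qualitative pattern Markey reports.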
Markey's (2000) findings have interesting implications for online students. It seems that when you ask for help
by addressing someone by name, this makes the potential helper more personally responsible. Under such
conditions, bystander apathy would not have any effect on the likelihood of that person helping since she would not
be looking to other potential helpers for information cues about the situation (e.g., cues about whether someone
really needs help, or cues about whether help has already been provided). While in our study we found that small
group sizes had a negative impact on student-student interactions, it seems important that instructors plan for quality
interactions amongst students, where they get to know one another (at least well enough to call one another by name),
to increase the chances of cooperative group behaviours.
It seems likely that, at least in an educational online setting, one might find a greater tendency of altruistic
behaviour and less bystander apathy compared to other non-educational online settings and face-to-face interactions.
We argue that this would be reasonable when one considers the social contract to which students implicitly agree
upon joining an online course; students are prepared to ask for help and give help with course tasks in order to learn
and advance course objectives.

Many-on-one Interactions
Social impact theory concerns interactions of many people with one person, that is, situations where the behaviour
of others affects one person's behaviour (usually influencing that person to conform to the group's behaviour). A
number of social psychological phenomena fall under this category of interaction, for
example, social facilitation, inhibition, conformity to groups, and blind obedience to individuals who hold positions
of authority. Social impact theorists argue that impact increases as the size of the influencing group increases. They
also argue that impact decreases as the number of people to whom the influence is targeted increases (i.e., diffusion
of social impact). A psychological phenomenon called "social loafing" is an example of diffusion of social impact.
This phenomenon shows the effects of working in a group. Specifically, it demonstrates that one's effort (i.e., total
output on a task) diminishes when one works on a similar task with a group of people, compared to working on
the same task alone (i.e., one's effort on a task increases when working alone).
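The qualitative claims above (impact grows with the number of influence sources, with diminishing returns, and is diluted when spread across more targets) are commonly formalized in social impact theory as power functions with an exponent below one, as in Latané's psychosocial law. The sketch below is illustrative only; the functional form and the parameters `s` and `t` are assumptions for exposition, not values given in the paper.

```python
# Illustrative power-law model of social impact (the scaling s and
# exponent t are assumed; the paper states only qualitative claims).

def social_impact(n_sources, n_targets=1, s=1.0, t=0.5):
    """Impact felt by each target when n_sources people exert
    influence spread across n_targets people."""
    return s * (n_sources ** t) / (n_targets ** t)

# With t < 1, adding sources raises impact at a diminishing rate,
# while adding targets dilutes it: the diffusion of social impact
# that underlies social loafing.
```

Under this assumed form, a lone target of four sources feels impact 2s, while each of four targets of the same four sources feels only s, illustrating why felt pressure (and hence effort) per person can fall as a working group grows.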
McKinlay et al. (1999) investigated the phenomenon of social loafing online and concluded that, for some
tasks, online groups are at a great disadvantage. This is consistent with what we have found elsewhere (Berry &
Paulo Kushnir, 2013). Online settings readily engender social loafing (more so than face-to-face settings) since
group members may not meet outside their virtual environment, which in turn may leave them feeling less
responsibility and accountability. These authors also conclude
that online groups are more disadvantaged than groups working face-to-face, since face-to-face groups are more
likely to engage in social compensation behaviours (i.e., where some group members will work harder than they
might usually because they have a loafer in the group) and thus make up for the loafer's lack of effort. As
McKinlay et al. (1999) see it, this is less likely to happen online since they find online participants have lower
levels of perceived group membership importance.
The implications of social loafing are very serious for online learning, as it undermines the very fabric around
which such courses are structured. McKinlay et al. (1999) also argue that, due to the nature of online
interactions, social loafing may be difficult to perceive, which only exacerbates the problem. We would argue that
this is not likely, especially in an educational online environment where participants are highly motivated to
participate and cooperate with one another to meet course objectives. It would seem likely that group participants
are able to "smell a rat" when their vested interests are particularly high. Since academic tasks usually have grades
attached to their completion, it seems just as unlikely that one would observe less social compensation
behaviour online than in a traditional face-to-face class. The trick is staying focused and on task, and completing
course work within a set time frame, as we found here in this study and elsewhere (Paulo Kushnir & Berry, 2014).
Whether meeting virtually or face-to-face, tasks must be completed, and students in both settings are likely to make
up for any loafers or slackers, if only to have their own course demands met.
Parise et al. (1996) investigated online cooperation with computer social agents. Briefly, they found that
when online students have a chance to discuss options, they cooperate (significantly) more often with one
another than if they do not have the opportunity for discussion. They also found that participants cooperated more
often with other real people (e.g., by video), next most often with text-based agents, and least with a
computerized image of a person, animal or cartoon (even though participants reported that they liked
working with these other agents).
Parise et al.'s (1996) research has interesting implications for online learning and interface design that
developers of online learning technologies should consider. These results suggest that group discussion time is
very important for cooperative completion of group tasks, and designers should be careful about making "cute" icons
for agents if people have to work cooperatively in a group, a situation that is very common in online learning
environments (e.g., avatars). While some of these studies investigating social psychological phenomena online are
somewhat dated, they serve to demonstrate that student behaviour is quite predictable whether face-to-face or online
(as evidenced by the replication of social phenomena in online settings). Knowing what we know about social
psychological theories, what instructors and course designers need to focus on are environments, course assignments
and student interaction opportunities that encourage positive group behaviours.

References
Allen, I. E., and Seaman, J. (2013). Changing Course: Ten Years of Tracking Online Education in the United States.
Sloan Consortium. PO Box 1238, Newburyport, MA 01950. Retrieved from
http://files.eric.ed.gov/fulltext/ED541571.pdf
Anderson, T., Rourke, L., Garrison, D. R., and Archer, W. (2001). Assessing teaching presence in a computer
conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1-17.
Bateman, S., Gutwin, C., and Nacenta, M. (2008). Seeing Things in the Clouds: The Effect of Visual Features on
Tag Cloud Selections. In: Proc. of the 19th ACM conference on Hypertext and Hypermedia, pp. 193–202.
ACM Press, New York.
Bates, A. T., and Sangra, A. (2011). Managing technology in higher education: Strategies for transforming teaching
and learning. Jossey-Bass.
Berry, K. and Paulo Kushnir, L. (2013). Crossing borders: A comparison of the impact of various teaching strategies
and tools in an online and face-to-face psychology course. In Jan Herrington et al. (Eds.), Proceedings of
the 25th World Conference on Educational Multimedia, Hypermedia and Telecommunications 2013 (pp.
1724-1731). Chesapeake, VA: AACE.
Bolliger, D. U. (2004). Key factors for determining student satisfaction in online courses. International Journal on
E-Learning, 3(1), 61-67.
Carey, T., and Trick, D. (2013). How Online Learning Affects Productivity, Cost and Quality in Higher Education: An
Environmental Scan and Review of the Literature. Toronto: Higher Education Quality Council of Ontario.
Retrieved from
http://www.heqco.ca/SiteCollectionDocuments/How_Online_Learning_Affects_Productivity-ENG.pdf.
Daniel, J., Kanwar, A., and Uvalić-Trumbić, S. (2009). Breaking Higher Education's Iron Triangle: Access, Cost,
and Quality. Change: The Magazine of Higher Learning, 41(2), 30–35.
Garrison, D. R. (2009). Communities of inquiry in online learning: Social, teaching and cognitive presence. In C.
Howard et al. (Eds.), Encyclopedia of distance and online learning (2nd ed., pp. 352-355). Hershey, PA:
IGI Global. Retrieved from http://sloanconsortium.org/system/files/v11n1_8garrison.pdf
Garrison, D. R., and Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is
not enough. The American Journal of Distance Education, 19(3), 133-148.
Homberg-Wright, K., and Wright, D. J. (2012). MBA and undergraduate business student perceptions of online
courses: Experienced online students versus students who have not taken an online course. Global
Education Journal, (1), 169-186.
King, A. (1993). From sage on the stage to guide on the side. College teaching, 41(1), 30-35.
Klesius, J., Homan, S., and Thompson, T. (1997). Distance education compared to traditional instruction: The
students' view. International Journal of Instructional Media, 24(3), 207-220.
Kuo Y.C., Walker, A.E., Belland, B.R., Schroder, K.E.E., and Kuo, Y.T. (2014). A Case Study of Integrating
Interwise: Interaction, Internet Self-Efficacy, and Satisfaction in Synchronous Online Learning
Environments. The International Review of Research in Open and Distance Learning, 15(1), 161-181.


Lee S.J., Srinivasan S., Trail T., Lewis D., and Lopez S. (2011). Examining the relationship among student
perception of support, course satisfaction and learning outcomes in online learning. Internet and Higher
Education 14, 158-163.
Lyons A., Reysen S., and Pierce L. (2012). Video lecture format, student technological efficacy, and social presence
in online courses. Computers in Human Behavior, 28(1), 181-186.
Markey, P.M. (2000). Bystander intervention in computer-mediated communication. Computers in Human
Behavior, 16, 183-188.
McKinlay, A., Procter, R., and Dunnett, A. (1999). An investigation of social loafing and social compensation in
computer-supported cooperative work. Proceedings of the international ACM SIGGROUP conference on
Supporting group work, 249–257. ACM. Retrieved from
http://portal.acm.org/citation.cfm?id=320297.320327
Munoz, L.R., Pellegrini-Lafont, C., and Cramer, E. (2014). Using Social Media in Teacher Preparation Programs:
Twitter as a Means to Create Social Presence. Perspectives on Urban Education, 11(2), 57-69.
Parise, S., Sproull, L., Kiesler, S., and Waters, K. (1996). My partner is a real dog: Cooperation with social agents.
Proceedings of the ACM 1996 Conference on Computer Supported Cooperative Work (CSCW'96), ACM
Press, New York, NY, 399–440.
Paulo Kushnir, L. (2004). Information overload in Computer-Mediated Communication and Education: Is there
really too much information? Implications for distance education. In J. Hewitt and I. DeCoito (Eds.), OISE
Papers in Educational Technologies, Vol. 1, pp. 54-71. Toronto, ON: University of Toronto Press.
Paulo Kushnir, L., and Berry, K. (2014). Inside, Outside, Upside Down: New Directions in Online Teaching and e-
Learning. In M. Baptista Nunes and M. McPherson (Eds.), Proceedings of the International Conference e-
Learning 2014 (ICEL 2014) (pp. 133-140). Lisbon, Portugal: IADIS Press.
Pigliapoco, E., and Bogliolo, A. (2007). The effects of the psychological sense of community in on-line and face-to-
face academic courses. Conference ICL2007, Villach, Austria, 1-16.
Richardson, J.C., and Swan, K. (2003). Examining Social Presence in Online Courses in Relation to Students'
Perceived Learning and Satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88.
Salmon, G. (2000). E-moderating: The key to teaching and learning online. London: Kogan Page.
Smith, G.G., and Ferguson, D. (2005). Student attrition in mathematics e-learning. Australasian Journal of
Educational Technology, 21(3), 323-334.
Sternberg, R. J. (1994). Allowing for thinking styles. Educational Leadership, 52(3), 36-40.
Sumler, D., and Zirkin, B. (1995). Interactive or not interactive? That is the question. Journal of Distance
Education, 10(1), 95-112.
