by
Donna Kay
Capella University
March 2011
UMI 3449387
Copyright 2011 by ProQuest LLC.
All rights reserved. This edition of the work is protected against
unauthorized copying under Title 17, United States Code.
ProQuest LLC
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106-1346
Abstract
The shift from face-to-face instruction to other technology-mediated delivery
environments brings changes to both organizational structures and instructional design
processes. Instructional designers in United States colleges and universities should
rethink methods of designing, developing, and supporting instructional design activities
within their institutions in order to best design instruction for shifting instructional
delivery media. This mixed methods study examined organizational structures and the
instructional design activities taking place within these structures. Study results
determined that the frequency with which specific instructional design activities take
place varies from one organizational structure to another. The study examined the
strengths and weaknesses of each organizational structure as perceived by those
responsible for design and development of instruction. The most common instructional
delivery method for which participants have responsibility is hybrid delivery. A shift has
taken place from traditional faculty centered structures to alternate structures providing
varying levels of instructional designer support or control over instructional design
processes. The results seemed to indicate that in a faculty centered structure the content
and quality of instruction can vary among campuses of the same institution and even
among sections of the same course on a given campus. Lack of quality in instruction
under a faculty centered structure was the greatest weakness identified.
Dedication
To my Lord and Savior, Jesus Christ. The better I know you, the better I
understand that I can do all things through Christ. Without your strengthening I would
not have made it through this process. To my parents, thank you for the years of love and
belief in me. Thank you for always telling me I can do whatever I set my mind to do. To
my daughters, Katherine and Lesa, my greatest cheerleaders. I love you and remind you
once again, you are precious in His sight and you too can do whatever you set your mind
to do.
Acknowledgments
Thank you to Dr. Sonja Irlbeck, my mentor. Your feedback and leadership in
individual classes laid the foundation for research and writing in a scholarly fashion.
Your feedback and encouragement to all of us in the dissertation courseroom and the tone
you set for encouraging one another was critical to navigating through this dissertation
journey. You created an environment in which we knew we were not alone and provided
feedback that was both timely and beneficial. This paper would never have made it to
completion without your efforts.
Dr. Betl Czerkawski and Dr. Eric Wellington, thank you for serving on my
dissertation committee. Your insights and feedback were both helpful and appreciated.
You brought an alternate viewpoint from which to see my work and brought to light gaps
in the study. You also put up with my last-minute reviews each quarter and found the
time to fit them in even though you were busy with final grading for your own classes.
To Pamela Shireman, thank you for your assistance and encouragement. Thank
you to Dr. Claudia Kittock, Dr. Kris Khthofson, and Stephanie Fells for your review and
suggestions for improvement of the interview instrument.
Table of Contents
Acknowledgments
List of Tables viii
List of Figures ix
CHAPTER 1. INTRODUCTION
Rationale 10
Research Questions 11
Definition of Terms 11
Learner Demographics 11
Online Technologies 13
Summary 37
CHAPTER 3. METHODOLOGY 38
Study Design 39
Sampling 41
Instrumentation 41
Data Analysis 42
Ethical Issues 45
Summary 46
Quantitative Data 47
Qualitative Data 64
Summary 76
Results 79
Discussion 85
Recommendations 89
Summary 93
APPENDIX C. TABLES 116
List of Tables
Table C1. Frequency and Percentage of Performance of Instructional Design Activities 116
Table C8. Comparison of Reasons for Selecting Organizational Structure Overall and by Organizational Structure 124
List of Figures
Figure 1. Participants' roles within their institution 49
Figure 4. Delivery options offered by institutions and delivery options for which participants have responsibility for instructional design 51
CHAPTER 1. INTRODUCTION
Introduction to the Problem
Colleges and universities within the seven regional accrediting bodies in the
United States are experiencing a time of rapid change. The Higher Learning Commission
of the North Central Association of Colleges and Universities (NCA-HLC) is an
association of approximately 10,000 schools and colleges located throughout 19 states.
The NCA-HLC represents the largest regional accreditation region for higher education
in the United States. This region is widely diverse in terms of its geographic, economic,
and cultural environment. Perhaps the two most widely spread and potentially enduring
factors stimulating change and affecting instructional design practices at schools within
the NCA-HLC are learner demographics and online technologies (Ayers, 2002; Jones-Kavalier & Flannigan, 2006; Saba, 2005; Yankelovich, 2005). These enduring factors,
coupled with a struggling economic environment and expanding learning theories,
promote the need to follow proper instructional design processes when designing and
developing instruction and training in colleges and universities.
Instructional designers recognize the process of analysis, design, development,
implementation, and evaluation (ADDIE) as the cornerstone of historical instructional
design (ID) processes. The foundation for instructional design models, such as those
created by Morrison, Ross, and Kemp (2007); Dick, Carey, and Carey (2009); and the
more recent Sims (2009) PD4L model, has been guided by this process.
Colleges and universities within the NCA-HLC, for example, now offer courses that are a
combination of face-to-face delivery, hybrid delivery, technology-mediated delivery, and
totally online delivery. These expanded learning environments present colleges and
universities with challenges related to design and development of courses.
Changing learner demographics in this expanding environment add to the
complexity of instructional design tasks by creating new challenges in identifying
learners and learners' needs (Ayers, 2002; Jones-Kavalier & Flannigan, 2006; Saba,
2005; Yankelovich, 2005), course interactions (Hedberg, 2006; Lanier, 2006), academic
integrity (Gearhart, 2001), and technology (Hedberg, 2006; Jones-Kavalier & Flannigan,
2006). A modification in instructional strategies often accompanies these changes as
instruction moves from a behavioral and cognitivist approach to approaches that are more
constructivist and connectivist (Albi, 2007; Baggaley, 2008; Doyle, 2009; Jonassen,
1999; Kirschner, Sweller, & Clark, 2006; Mayer, 1999; Mayer, Griffith, Jurkowitz, &
Rothman, 2008; Mayer & Johnson, 2008; Siemens, 2004, 2005). The organizational
structure under which leadership and support for instructional design activities takes
place within changing environments must be thoughtfully established (Beckman, 2009;
Braganza, Awazu, & Desouza, 2009; Brill, Bishop, & Walker, 2006; Franken, Edwards,
& Lambert, 2009; Scott, 2003). Recognizing organizational structures that effectively
support instructional design processes is important.
Instructional design activities can occur at various locations within institutions
with instructional design leadership and support being provided under a variety of
organizational structures. Instructional design leadership and support available to faculty,
trainers, managers, and others involved in design and development tasks may influence
the instructional design processes and procedures used within the institution.
Rationale
The study was conducted to provide a foundation upon which to encourage
development of best practices when creating organizational structures for supporting
instructional design at colleges and universities. Further research to compare
organizational structures on instructional design quality may be possible based on the
foundation provided by the study.
Research Questions
1. What types of instructional design activities are found in colleges and
universities using instructional designer centered, instructional designer
supported, and faculty centered models of organizational structure in the
Higher Learning Commission of the North Central Association of Colleges
and Universities?
2. What are the strengths of types of organizational structures in colleges in the
Higher Learning Commission of the North Central Association of Colleges
and Universities as perceived by instructional design professionals?
3. What are the weaknesses of types of organizational structures in colleges in
the Higher Learning Commission of the North Central Association of
Colleges and Universities as perceived by instructional design professionals?
Significance of the Study
The Internet brings many opportunities and challenges to colleges, creating fast-changing environments. How an organization handles change can have an impact
throughout the organization. The overall organizational structure, together with the
leadership and support available within the structure, guides the organization through
times of change (Beckman, 2009; Franken, Edwards, & Lambert, 2009; Fullan, 2001;
Scott, 2003; Thompson & Purdy, 2009). The placement and support of instructional
design activities within the overall organizational structure can take many forms. The
organizational structure from which instructional design activities originate and from
which support for those activities comes will affect the processes and procedures used
for designing and developing instruction and training in colleges.
Instructional designers have realized the process of placing courses online is more
than simply placing traditional (courseroom) materials online (Albi, 2007; Baggaley,
2008). Instructional designers are invaluable in creating quality learning environments
for students and helping secure quality instructional content, assessments, and delivery.
Instructional design leadership and support are needed to meet these challenges, even
when that leadership and support is structured in various ways within an organization.
This study sought to determine what organizational structures are in place and the
instructional design activities taking place within these structures.
Definition of Terms
The following definitions guided the study:
ADDIE. An acronym that stands for analysis, design, development,
implementation, and evaluation. It represents the processes in the instructional systems
design (ISD) model and can be found throughout instructional design literature (Molenda,
2003).
Classical content analysis. A method of qualitative data analysis in which the
researcher "counts the number of times each code is utilized" (Leech & Onwuegbuzie,
2007, p. 569).
Constant comparative analysis. A method of conducting qualitative data analysis
in which the researcher searches for themes throughout the data. The data is read, broken
down into chunks, and coded with meaningful labels (Leech & Onwuegbuzie, 2007, p.
565).
Distance education. "Planned learning that normally occurs in a different place
from teaching, requiring special course design and instruction techniques, communication
through various technologies, and special organizational and administrative
arrangements" (Moore & Kearsley, 2005, p. 2).
Faculty centered model. An instructional design organizational model in which
the instructor is responsible for the design, development, delivery, and facilitation of his
or her own courses. Support is typically in the form of assistance in the use of the course
delivery system rather than instructional design support.
Hybrid courses. Courses that include both online and face-to-face components.
Instructional design. "A systematic and reflective process of translating
principles of learning and instruction into plans for instructional materials, activities,
information resources, and evaluation" (Smith & Ragan, 2005, p. 7).
Instructional design activities. The individual steps performed within each
procedure identified in the instructional design process.
Instructional design leadership. The process of providing vision and direction
(Irlbeck & Pucel, 2000) for instructional design and development activities.
Instructional design process. The procedures used for design, development, and
implementation of courses within a program of study.
Instructional design theory. Prescriptive theory that "identifies instructional
conditions required for particular instructional consequences or outcomes" (Reiser &
Dempsey, 2007, p. 338).
learners are attending college or university for the first time. Many of these students
dropped out of high school or have been out of school for many years. Differences in
prior knowledge can make the task of designing instruction complex.
Learner expectations have also changed. A switch in learner views of higher
education from "a privilege to a commodity to be purchased" appears to be taking place
(Reiser & Dempsey, 2007, p. 277). Expectations of and demands on institutions of
higher learning are changing as views of education change to a consumerist viewpoint
(Dempsey, Albion, Litchfield, Harvard, & McDonald, 2007, p. 227).
Changes in learner demographics and expectations often occur because of
expanded access to the Internet and other online technologies that make instruction and
information more accessible. Online technologies create unique challenges for
instructional designers.
Online Technologies
Increasing use of online technologies to deliver instruction adds to the
complexity of ID activities. Online technology is currently being used to enhance
instruction in face-to-face delivery of instruction, to blend benefits of both face-to-face
and online delivery of instruction in hybrid courses, and to deliver fully online distance
education instruction. Instructional designers can choose the best methods for delivering
content, assessing learning, developing interactions, and evaluating effectiveness from a
wide range of learning theories, online resources, and multimedia options. The solutions
chosen must be robust enough to support increasingly diverse learner populations.
Online technology makes it possible for learners to shop for instructional options
and create a potpourri of learning. Learners have options of receiving instruction through
traditional, hybrid, or online delivery methods from a myriad of schools or other sources
(Delialioglu & Yildirim, 2008; Vernadakis, Antoniou, Giannousi, Zetou, &
Kioumourtzoglou, 2011). Reiser and Dempsey (2007) state students are "becoming real
consumers these days and often have several choices of junior colleges, four-year
colleges, and universities both in town and online" (p. 277).
No one method of delivery appears to provide definitively improved performance
in student learning. Each delivery method (face-to-face, online, web-enhanced, and
hybrid) contains both benefits and shortfalls. Vernadakis et al. (2011) studied the
benefits of hybrid learning environments and found them to "combine the best features of
online learning and traditional classroom learning" (p. 188). Delialioglu and Yildirim
(2008) state "early studies showed that technology can be a double-edged sword if not
properly planned and implemented" (p. 475). Instructional designers must recognize
what delivery options are available, be equipped to design effective instruction for each
option, and select delivery methods best suited to the learner and instructional need for
which instruction is being designed.
Options in online technologies bring many challenges to instructional design
processes. Identifying learners, learner demographics, and learner readiness provides a
challenge to instructional designers. Effectively delivering content, assessing learning,
developing interactions, and evaluating effectiveness may be more complex. Appropriate
online technologies must be selected to implement instruction designed based on a
variety of learning theories, provide quality assessment for both learner and instructor,
and create environments in which learners interact effectively with instructors, content,
and other learners.
Online technologies can make selection and use of the most appropriate learning
theory and delivery method for desired learning outcomes more complex. The
educational environment of 2011 and beyond requires the ability to design instruction
using a variety of cognitivist, constructivist, and connectivist learning theories (Albi,
2007; Baggaley, 2008; Doyle, 2009; Jonassen, 1999; Kirschner, Sweller, & Clark, 2006;
Mayer, 1999; Mayer, Griffith, Jurkowitz, & Rothman, 2008; Mayer & Johnson, 2008;
Siemens, 2004, 2005). Learning theories often suggest the necessity for a variety of
assessments, including the need to assess student readiness and learning. Online
technologies may complicate the design of assessment measures and academic integrity
issues relating to assessment (Corcoran, Dershimer, & Tichenor, 2004; Lanier, 2006).
Constructivist and connectivist learning theories often focus on learner
interactions (Jonassen, 1999; Mayer & Johnson, 2008; Siemens, 2004, 2005). Moore and
Kearsley (2005) identify three types of learner interactions: instructor-learner, learner-learner, and learner-content (pp. 140-141). Online delivery environments create
complexity in each of these interactions. Many in higher education view developing a
sense of community for learner-learner interactions in online environments as critical.
A sense of community helps learners feel they are part of the learning community and
leads to quality interaction and instruction. The online isolation factor and improving
learner engagement have been the focus of much research in the 2000s (Deloach &
Greenlaw, 2007; Dennen, 2005; Dennen, Darabi, & Smith, 2007; Jacobs & Archie, 2008;
Mann, 2005; Spatariu, Quinn, & Hartley, 2007; Woods & Ebersole, 2003). Research has
(p. xv). They also cautioned, "No single ID model is well matched to the many and
varied design and development environments in which ID personnel work" (p. xv).
Many models exist from which instructional designers can select to help manage this
diversity. Most models have the ADDIE process as a foundational framework.
Various categories of instructional design models have been identified. Ryder
(2010) gives two categories of instructional design models: modern prescriptive and
postmodern phenomenological models. Qureshi (2004) describes four categories:
design-focused, time-focused, task-focused, and learner-focused models. Fauser, Henry, and
Norman (2006) used Gustafson and Branch's (2002) categories as the focus of their
comparison of design models. This paper focuses on the three categories of ID models
identified by Gustafson and Branch (classroom-oriented, system-oriented, and
product-oriented) to classify instructional development models. Classroom-oriented models,
which are typically used by individual instructors to produce a small amount of
instruction, require minimal resources and ID skill. These usually involve little front-end
analysis and are relatively low in complexity. System-oriented models typically involve
a team effort and are used to develop an entire course or curriculum. The resource
commitment is high and greater levels of ID skills are required. System-oriented models
typically involve a substantial amount of front-end analysis and are technologically
complex. Product-oriented models are used to produce an instructional package. These
usually use a team approach and require high levels of resources and ID skills. They
typically include a moderate amount of front-end analysis and are moderately complex
(p. 14).
The Morrison, Ross, and Kemp (2007) classroom-oriented model, developed from
the work of Kemp (1971), and the Dick, Carey, and Carey (2009) system-oriented model,
first published by Dick and Carey in 1978, are examined as examples of ID models that
have been used for several years. The Sims (2009) PD4L model is examined as a
problem-solving ID model designed for online instruction in a team-based approach. The
ADDIE foundation can be clearly identified in the Morrison, Ross, and Kemp and the
Dick, Carey, and Carey models. Sims PD4L model incorporates elements of both
ADDIE and problem solving approaches to ID.
Morrison, Ross, and Kemp. Morrison, Ross, and Kemp (2007) describe ADDIE
as "a label that refers to a family of models that share common elements" (p. 13). The
Morrison, Ross, and Kemp (MRK) model includes all five parts of the ID process and is
designed to be flexible. It allows instructional designers to approach the design and
development project in a manner appropriate for the specific project being completed.
Flexibility does not mean that components of the ID process can be eliminated but that
the extent to which the components are addressed and the order of processing will vary
from project to project. This makes the MRK model an ID model that follows the
ADDIE process as well as a problem solving model that instructional designers adapt to
meet the needs of current instructional design situations.
Morrison, Ross, and Kemp (2007) state there are four fundamental planning
elements in almost every ID model. These include evaluating learner characteristics,
identifying learning objectives, establishing instructional strategies, and developing
evaluation procedures (p. 14). Development and implementation flow from design
activities and are tested using evaluation procedures. Activities within the ID process can
readily be identified when referring to the MRK model.
Seven elements are at the core of the MRK model. Each element represents a step
in the instructional design process and is represented by a circle in the MRK model.
Eight ongoing processes are also identified in the outer rings of the model (Morrison,
Ross, Kalman, & Kemp, 2011, pp. 14-19). The instructional problems circle in the MRK
model represents an analysis of the identified need to determine if it can be solved by
instructional interventions. The MRK model identifies a complex needs assessment and
a goal analysis as part of the process. This reflects the analysis phase of the ADDIE
process. Analysis also takes place in the learner characteristics and task analysis circles
of the MRK model. Instructional designers are concerned with identifying characteristics
of the learner that include general characteristics, specific entry characteristics, and
learning styles (Morrison, Ross, & Kemp, 2007, p. 55) as they relate to learner
characteristics. Identifying the contextual analysis within which learning is to take place
is also included in this circle. Contextual analysis includes the orienting context that
involves learning more about the user and his or her goals and perceptions of the learning
situation, the instructional context that seeks to understand the setting in which
instruction will take place, and the transfer context that seeks to understand how desired
learning will be applied (pp. 63-66). The third analysis activity, represented by the task
analysis circle in the MRK model, identifies content to be included in instruction. This
step carefully analyzes all of the subtle steps to be included and seeks to view content
from learners' perspectives (pp. 52-73). The importance of this step is emphasized by the
many changes taking place in learner demographics.
Design activities can be found in the MRK model in the instructional objectives,
content sequencing, and instructional strategies circles. Instructional objectives identify
what learners are to learn. According to Morrison, Ross, and Kemp (2007), instructional
objectives serve the purpose of "providing a means for designing instruction, providing a
framework for evaluating learning, and providing a method to guide the learner" (p.
104). Content sequencing involves identifying the best ways to sequence content for
optimum learning. Instructional strategies can be designed after identifying the sequence
in which content should be delivered. The general learning environment best used to
deliver instruction and the sequence and methods to achieve an objective are described
in this step (p. 146).
Designing the message includes both design and development activities.
Instructional messages may be developed concurrently with designing the message using
current technology. Additional development activities take place in the development of
instruction once messages have been designed. Development of instructional materials is
included in this step. Implementation occurs after all materials have been developed.
Students are then introduced to content and learning materials and instruction actually
begins.
Evaluation is to take place during each of the above activities. Modifications to
analysis, design, and development are on-going activities. Evaluation is conducted to
measure the effectiveness of the instructional design and materials once instruction has
been implemented.
This brief review of the MRK model shows how the ID process of ADDIE is
foundational to the MRK model. Problem solving methods are needed as the
instructional designer determines the extent to which activities are completed and the
order in which they occur as determined by need. The ID process of ADDIE can be
clearly identified as the foundation of the slightly more linear model of instructional
design provided by Dick, Carey, and Carey (2009) called the Systems Approach Model
(SAM).
Dick, Carey, and Carey. The SAM model has "ten interconnected boxes
[which] represent sets of theories, procedures, and techniques employed by instructional
designers to design, develop, evaluate, and revise instruction" (Dick, Carey, & Carey,
2009, p. 6). The ID process of ADDIE as foundational to the design of the SAM model
is evident. The SAM model incorporates problem solving methods as revision and
iteration of steps occur as needed. The SAM model begins similarly to the MRK model
with identification of instructional goals.
Analysis in the SAM model includes instructional analysis and analysis of
learners and context. The SAM model of instructional analysis includes identifying the
"skills and knowledge that should be included" (Dick, Carey, & Carey, 2009, p. 39) in
the instructional design. Analysis under this model includes understanding instructional
goals and identifying subordinate skills to be included in instruction. Entry level skills
learners should possess prior to beginning instruction are also analyzed. Learner and
context analysis follows or may take place concurrently with instructional analysis.
Learner and context analysis in changing higher education environments may uncover
unexpected results.
The design process in the SAM model begins with writing performance
objectives. The process is influenced by outcomes of previous analysis activities as well
as feedback received from ongoing evaluation. Objectives are used in design and
development of assessment instruments that influence design of instructional strategies
such as selecting delivery methods, content sequencing, and instructional strategies.
Development activities consist of creating assessment instruments and selection
and development of the actual learning environment and instructional materials based on
instructional strategies designed. Evaluation activities in the SAM model are both
formative and summative. Dick, Carey, and Carey (2009) define formative evaluation as
"the collection of data and information during the development of instruction that can be
used to improve the effectiveness of the instruction" (p. 257). Formative evaluation may
lead to revisions in design and development of instruction or a more in-depth analysis of
instructional needs or learners and contexts. Summative evaluation occurs after
instruction has been implemented and is defined as "the design of evaluation studies and
the collection of data to verify the effectiveness of instructional materials with target
learners" (p. 320).
The foundation provided by ADDIE and the problem solving approach needed
can be clearly seen in activities identified by the SAM model. The Sims (2009) Proactive
Design for Learning (PD4L) model, in contrast, does not as obviously include all aspects
of the ADDIE process. The problem solving nature of the PD4L model may lead to an
incorrect belief that parts of the ID process are eliminated from the model.
PD4L addresses a set of factors which, if applied and considered, will ensure the
integrity of the online teaching and learning environment. The first of these
relates to ensuring there is a clear definition of the strategic intent of the course:
why it is required, who it is for and what the desired outcomes are. (p. 388)
These factors clearly relate to the analysis phase of the ID process. The PD4L
model can be seen to have the ID process of ADDIE as its foundation along with a strong
basis in problem-solving.
Three of the many ID models available to instructional designers have been
analyzed in this section. The choice of model will be determined by the instructional
design setting and instructional goals. The instructional design setting includes the
instructional problem, whether instruction will take place face-to-face, online, or in a
hybrid learning environment as well as the organizational structure in which instructional
design takes place. Organizational structure, for the purpose of this study, refers to
instructional design systems used by organizations to design, develop, deliver, and
support instructional design processes within that institution.
Organizational Structure
The form of leadership and support provided for instructional design activities
within an organization is determined, in part, by the organizational structure.
Understanding various organizational structures is important in today's changing
academic environment for a variety of reasons. One reason centers around ID leadership.
ID leadership helps guide instructional design activities today and for the future. Fullan
(2001) refers to leaders as agents of change. Today's changing academic environment
requires leaders who can work with others to ensure appropriate instructional design
methods are followed to provide instruction meeting instructional objectives and learner
expectations. Marx (2006) states that working together is "the only way to improve our
chances of survival as an educational system" (p. 19). The organizational structure
provides the pattern from which leadership and support for ID activities emerge as
faculty, instructional designers, and others work together to design and develop
instruction and training.
Leadership and support for ID activities can help bridge the gap between subject
matter experts' knowledge of the subject and their knowledge of learning theories,
instructional design theories, and instructional design models. Organizational structure
helps determine from where support for course development and modification will come
and who within the organization is responsible for course design and development in
today's changing and technologically complex environments.
Various organizational structures are used in higher education today. The models
upon which organizational structures are designed and used at colleges and universities in
the NCA-HLC are relatively unknown. Three instructional design organizational
structures appear to be in use in higher education today. The main features characterizing
various organizational structures can be identified by where responsibility and support for
instructional design lies. The author has coined three terms (instructional designer
centered, instructional designer supported, and faculty centered) to identify three
organizational structures explored in this study. The following discussion describes these
three potential structures and their characteristics.
Instructional designer centered structure. The instructional designer centered
structure (IDC) most closely resembles systems analysis and design models found in
Literature or information to support this conjecture is limited. The lean team approach as
described by Moore and Kearsley (2005) is perhaps one example of an IDC structure.
Moore and Kearsley, in describing the lean team approach, state that "it has to be
recognized that no individual is a teacher in this system, but that indeed it is the system
that teaches. Even the content is not owned by a professor, but is the product of group
consensus" (pp. 107-108). The IDC structure allows faculty members who will teach or
moderate a course to have no involvement in design and development. White (personal
communication, May 5, 2010) and Clawson (personal communication, May 12, 2010)
described an IDC approach used at one private, for-profit organization, Capella
University. A summation of the conversations follows.
Capella University uses an instructional design team that includes instructional
designers, curriculum specialists, subject matter experts, course producers, and editors.
The instructional design process for Capella begins with the curriculum development
team consisting of a curriculum specialist and faculty members from the specialization
who are responsible for the analysis portion of the instructional design process. During
the curriculum development process, the team identifies outcomes and competencies for
the program or degree, professional standards are identified, audience analysis is
performed, and high-level assessment strategies are discussed and documented. Once the
analysis is completed and documented, design work begins. During the design phase, the
curriculum specialist transitions out and an instructional designer and the rest of the
course design/development team transitions in. Additional faculty members may be
added to the team at this time. Assessment strategies are developed, and faculty members
working with instructional designers select course materials and appropriate instructional
strategies to meet competencies and sub-competencies identified during the
analysis phase. Producers are called upon to build the course in the online environment
once design and development have been completed. Editors check for consistency, ADA
compliance, copyright issues, and to make sure all electronic activities work properly
after the course is built. The course is released after being thoroughly tested and
established outcomes assured. Teaching faculty members receive the course several days
prior to its release to learners so faculty can prepare to teach, confirm that all links
work, and raise concerns about any issues that need to be addressed. Instructional
designers assist with the creation of grading rubrics and activities to ensure identified
competencies and sub-competencies are included throughout the process in the design
and development of the course (S. Clawson, personal communication, May 12, 2010; N.
White, personal communication, May 5, 2010).
Another example of the IDC structure being used organization-wide can be found
at the private, for-profit university, University of Phoenix. According to Seiden (2009),
"standardization provides both consistency and faculty support" ("Business orientation,"
para. 5). Kinser (2006) further describes the Phoenix instructional design process by
stating that courses are "designed by a committee of subject-matter experts and standardized
across the system" (p. 27). This description appears to define an IDC structure in use at
the University of Phoenix, but it is possible it indicates an instructional designer
supported structure as described in the next section.
Instructional designer supported structure. The instructional designer
supported (IDS) structure is similar to the IDC structure in that the instructional designer
plays a major role in instructional design throughout the process. The IDS structure,
unlike the IDC structure, places the instructional designer in the role of supporting the
design process rather than leading the process. Merrill and Wilson (2007) describe these
as "designers-by-assignment" (p. 336). One view of this may be found in the
"instructional design generator" structure presented by Dempsey et al. (2007), which
describes the instructional designer's role as follows:
work closely with unit leaders (subject matter experts) in the initial stages to
design a blueprint outlining the key learning and implementation strategies
appropriate for the context. In close collaboration with the ID, the course leader
develops a sample or module of the course and the ID provides feedback. Once
agreed on, this provides a model for the writing of subsequent modules and
detailed ID feedback is usually not required. (p. 224)
Focus in the IDS structure is on the instructional design process and design of
appropriate learning strategies. This structure provides extensive support to instructors
for course design and development. The IDS structure often focuses on design and
development of courses involving distance learning technologies or use of course
management systems. The University of Arkansas at Little Rock's Scholarly Technology
and Resources (STaR) Center states its mission as faculty support for teaching and
learning using UALR's Blackboard learning management system (University of
Arkansas at Little Rock, 2008, Mission statement). STaR provides instructional design
services, development assistance, training, and production of multimedia solutions.
The IDS structure is a team-based structure in which faculty members work
closely with instructional designers to design and develop instructional solutions.
According to Dempsey et al. (2007), instructional designers are responsible for "keeping
up to date with ID literature and educational theory and practice . . . sharing this
knowledge with the team and negotiating appropriate application to a course context in a
team environment" (p. 227). This structure increases the knowledge of all members of
the team, providing a means to "improve our chances of survival as an educational
system" (Marx, 2006, p. 19). Institutions that use less of a team approach often require
faculty members to be responsible for design of their own courses with limited ID
leadership and support. The organizational structure at these institutions resembles what
this study terms the faculty centered structure.
Faculty centered structure. The faculty centered (FC) structure places
responsibility for instructional design and development on faculty members.
Instructional design in this structure often takes the form of helping faculty members learn
to use course management systems rather than instructional design in today's complex
and changing academic environment. Merrill and Wilson (2007) used the term
"designers-by-assignment" in describing the role of faculty in what is termed here as the
FC structure (p. 336). "Designers-by-assignment" are faculty members
"assigned to be an instructional designer . . . not trained as an instructional designer" (p.
336). Institutions using a FC structure must provide leadership and support
for faculty members who are thrust into the role of designers. Little is known about the
type of leadership and support needed or provided under this structure.
Many problems exist with this structure, including lack of training in instructional
design and teaching methods (Dempsey et al., 2007, p. 227), lack of interest in learning
new methods of teaching (p. 228), and faculty overload. Lack of training in teaching
methods and instructional design practices makes many faculty designers novice
instructional designers at best. Silber (2007), quoting Nelson and Stolterman, suggests
that as novice instructional designers, faculty members can present solutions that "waste
resources and lead to solutions that are not only ineffective, but can actually create more
difficulty" (p. 7).
Instructional design and educational literature has addressed ID challenges
occurring in today's changing, technologically complex educational environments. Much
has been written since 2003 on the need to create a sense of community (Jacobs &
Archie, 2008; Mann, 2005; Woods & Ebersole, 2003), the importance of discussion in
online courses (Deloach & Greenlaw, 2007; Dennen, 2005; Dennen, Darabi, & Smith,
2007; Spatariu, Quinn, & Hartley, 2007) and assessment and academic integrity issues
(Corcoran, Dershimer, & Tichenor, 2004; Lanier, 2006). Yet little seems to have been
written for faculty members on how to design and develop online instruction.
Wiesenberg and Stacey (2005), describing the work of Ally (2004), state, "it is the
instructional strategy, not the technology, that determines the quality of the learning
within a distance classroom . . . the design of these strategies must follow sound design
principles" (p. 388). Wiesenberg and Stacey (2005) conclude there exists a great need for
"strong institutional technical support, excellent online teaching/learning skills, as well as
an inclusive philosophy of teaching and learning online" (p. 398). FC structures
providing excellent leadership and support are needed. The level and type of leadership
and support for ID activities typically available in organizational structures following a
FC structure is unknown.
Summary
Today's instructional designers must be able to identify learners and the needs of
learners in the midst of a diverse learner population. Instructional designers must also
deliver instructional solutions to meet increasing demands of learner-consumers, create
online instruction using a variety of learning theories, provide for a variety of learning
styles, and accommodate learners with disabilities. Organizational structures must be in
place to provide leadership and support to those involved in the instructional design
process.
The ADDIE foundation of the instructional design process is enhanced by the
emergence of problem-solving approaches. Research is still warranted to determine
whether there is a significant difference in instructional design activities between the
various organizational structures. This study sought to determine what organizational
structures are currently used and the instructional design activities occurring within
these structures. The study further sought to gain a greater understanding of the strengths
and weaknesses of various instructional design organizational structures in use. The
mixed methods explanatory approach described in Chapter 3 was used for this study.
CHAPTER 3. METHODOLOGY
Changes taking place in higher education make it advantageous to understand
instructional design processes being used to design and develop instruction and training
in colleges and universities within The Higher Learning Commission of the North Central
Association of Colleges and Universities. Instructional design organizational structures
under which instructional design leadership and support are provided also need to be
identified. This mixed methods explanatory study sought to gain a greater understanding
of organizational structures being used in higher education at member colleges of the
Higher Learning Commission of the North Central Association of Colleges and
Universities (NCA-HLC) in the United States and instructional design activities taking
place. The study sought to answer the following questions:
1. What types of instructional design activities are found in colleges and
universities using instructional designer centered, instructional designer
supported, and faculty centered models of organizational structure in the
Higher Learning Commission of the North Central Association of Colleges
and Universities?
2. What are the strengths of types of organizational structures in colleges in the
Higher Learning Commission of the North Central Association of Colleges
and Universities as perceived by instructional design professionals?
3. What are the weaknesses of types of organizational structures in colleges in
the Higher Learning Commission of the North Central Association of
Colleges and Universities as perceived by instructional design professionals?
Study Design
The study used a mixed methods explanatory design. Creswell (2008) describes
an explanatory design as one in which the researcher "might seek to explain the results in
more depth in a qualitative phase of the study" (p. 566). A survey method was used to
collect quantitative data. Results were analyzed to determine if a significant difference
occurs in instructional design activities between different forms of organizational
structure. Telephone interviews were conducted after surveys had been received. The
interviews explored reasons for any differences found between organizational structures
and instructional design activities.
Participants were selected for the quantitative portion of the study from among
1,015 colleges and universities in the NCA-HLC that were listed as accredited as of May
24, 2010 (Higher Learning Commission, 2010, "Current or previously affiliated
institutions," 05/25/2010). No participants were selected from colleges listed by the
NCA-HLC with a status of inactive, merged, accredited on probation, accredited on
notice, accredited show cause, or candidate. These institutions were eliminated from
selection in an effort to ensure the sample was representative of the population being
studied (Creswell, 2008, p. 151).
After names and e-mail addresses were recorded and IRB approval received, an e-mail
was sent to selected candidates inviting them to participate in the study.
Commercial web software SurveyMonkey was used to conduct the survey. A reminder
e-mail was sent one week after the initial invitation with a second e-mail reminder
following five days later.
Sampling
A simple random sampling method was used to select potential participants from
200 of the 1,015 colleges and universities within the NCA-HLC (Creswell, 2008, p. 631).
Based on Fowler's table of confidence ranges for variability attributed to sampling (1993,
p. 31), it was determined a minimum of 50 respondents was required to provide a 95%
confidence level and a sampling error of 15%.
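As a rough illustrative sketch (not part of the original study), the worst-case margin of error behind such a table can be approximated with the standard formula for a proportion; the function name below is invented for illustration.

```python
import math

def sampling_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a proportion at 95% confidence (p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# For 50 respondents this yields about 0.139 (roughly 14%), in the same range
# as the approximately 15% sampling error cited from Fowler's table, which may
# rest on slightly different assumptions.
print(round(sampling_error(50), 3))  # → 0.139
```

Larger samples shrink the error only with the square root of n, which is why a modest minimum of 50 respondents was sufficient for the stated precision.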
The websites of the institutions from which candidates were invited to participate in the
study were reviewed to find the names and contact information for employees listed in an
instructional design role. The titles included instructional designer, faculty development
leader, curriculum specialist, course developer, distance education director, and similar
roles. The school was called in an effort to find the name of a person in one of the above
roles if a name could not be found from the website. The next school was selected from
the larger randomly generated list if a telephone call failed to produce a name. This
process spanned a period of one month, requiring approximately 40 to 50 hours to
complete.
Interview participants were selected based on availability from those who
indicated on the survey their willingness to participate in an interview. All respondents
who volunteered to be interviewed agreed to the recording of the interview (Creswell,
2008, p. 227).
Instrumentation
The quantitative portion of the study used a modified version of Vannoy's (2008)
validated, three-part survey instrument (see Appendix A). Use of this instrument
provides validity through the use of an existing, validated survey. The first part of
Vannoy's instrument was used to collect data concerning instructional design activities
and the frequency with which activities were conducted. The second part determined
respondents' reasons for excluding specific instructional design activities. The modified
instrument used for this study eliminated questions relating specifically to use of course
management systems and qualitative questions from Part 3 of Vannoy's survey.
Demographic questions were added at the beginning of the survey to identify
respondents' roles at the institution, type and academic level of the institution,
instructional delivery options offered, and instructional design organizational structure
used. The opportunity for volunteers to participate in a qualitative follow-up telephone
interview was provided at the end of the survey.
The questions and format of the telephone interview (see Appendix B) were
validated by a survey expert in social science research and two instructional designers
known to the researcher. The telephone interview was administered by the researcher.
Data Analysis
Survey data collection ended two weeks after the initial e-mail invitations were
sent. Interview data collection continued for two weeks after the close of quantitative data
collection. Quantitative data analysis began at the conclusion of quantitative data
collection. Descriptive statistics on each item in Parts I, II, and III were used to
determine measures of central tendency. Mode was used to determine central tendency
of responses for each item in Part I of the survey and both median and mode were used to
measure central tendency of responses for each item in Parts II and III of the survey
(Romano, Kromrey, Coraggio, & Skowronek, 2006, p. 4). The variable organizational
structure was used to perform a chi-square analysis to determine if a difference exists in
the type of organizational structures used at institutions that are members of the NCA-HLC.
Data was then cross-tabulated on organizational structure and measures of central
tendency calculated based on organizational structure. The Kruskal-Wallis test was used
to determine if there was a difference between organizational structures and the
frequency with which specific instructional design activities were performed (Aczel,
1999).
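The descriptive measures described above can be sketched with Python's standard library; the responses below are invented for illustration and are not the study's data.

```python
from statistics import median, mode

# Hypothetical ordinal responses to one survey item
# (1 = never ... 6 = always); NOT the study's actual data.
responses = [3, 4, 4, 2, 5, 4, 3, 4, 1, 4]

print(mode(responses))    # most frequent response → 4
print(median(responses))  # middle value → 4.0
```

Mode suits the nominal items in Part I, while median is added for the ordinal items in Parts II and III, where responses have a meaningful order.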
Qualitative data analysis began at the completion of the first telephone interview.
According to Leech and Onwuegbuzie (2007), at least two types of data analysis are
needed "for triangulation" (p. 579), and using multiple forms of data analysis can
"increase understanding of the data" (p. 563). For this study, constant comparative analysis was
used to identify themes emerging from the data. As themes emerged, classical content
analysis was used to determine which themes occurred most often. The process involved
grouping questions into five categories:
The purpose of the qualitative portion of the study was to gain a greater
understanding of instructional design activities taking place at institutions within the
NCA-HLC and the strengths and weaknesses of organizational structures as perceived by
instructional design professionals. Participants were asked a series of questions designed
to aid in this understanding. Participants were read an introductory statement about the
purpose of the interview and asked for permission to record the interview prior to being
asked any questions (see Appendix B).
Interviews took between 23 and 66 minutes with an average interview time of 38
minutes. Two digital recordings were made of each interview and transferred to
Audacity, a free audio recording program, for transcription. One outside person and the
interviewer transcribed the interviews. Completed transcripts were e-mailed to
participants for approval, review, and modification. In instances where no response was
received to approve or modify the transcript, an attempt was made to contact the
participant by telephone. For those who could not be reached by telephone, a final e-mail
was sent asking for response by a deadline if the transcript was not approved. Three
participants sent minor transcript corrections.
Two reviewers, the principal investigator and a former professional colleague,
separately reviewed the transcripts. The former professional colleague had prior experience
evaluating and identifying themes in literature and technical communications in
addition to participating in an extensive collaborative project requiring the identification
of themes. Training was provided to transfer that skill set to the identification of
emerging themes. Each reviewer identified emerging themes for each category
previously identified based on individual participant responses. Themes were then
grouped for all respondents and ranked based on frequency of occurrence. Themes and
rankings were compared between reviewers. Discrepancies between reviewers were
discussed until a consensus was achieved. Themes were then reranked to determine
which appeared most frequently in the data.
Overall organizational structure strengths and weaknesses were listed. In addition, the
strengths and weaknesses of each individual organizational structure were identified. Data analysis
was accomplished using Microsoft Word, Access, and Excel.
Ethical Issues
Potential ethical issues included informed consent and anonymity.
Informed consent was established prior to participants having access to the survey.
Information relating to informed consent was presented in introductory materials sent
with the e-mail for participation. Survey participants agreed to informed consent by
clicking on the link and acknowledging their consent to participate in the survey before
gaining access to the survey site. Confidentiality was maintained at all times in both
qualitative and quantitative portions of the study. Respondent names were not collected
until a respondent volunteered to participate in the telephone interview. Names of
respondents volunteering for telephone interviews were separated from their responses
and stored in a separate password protected file on the researcher's desktop computer.
Names of respondents' institutions were not collected. This study was approved by
Capella University's IRB (approval 101194-1), effective from September 14, 2010 through
September 14, 2011. Research began after receipt of Capella University Institutional
Review Board approval.
Summary
A mixed methods explanatory study was conducted to identify common
organizational structures and the instructional design activities occurring under each
structure. Institutions from which participants were invited were selected at random from
the NCA-HLC. An existing, validated survey was modified for use in the quantitative
portion of the study; telephone interviews, using questions and formats validated by
experts, were used for the qualitative portion.
The anticipated timeline for the study was two weeks for quantitative data
collection, four weeks to conduct qualitative interviews with a two-week overlap with
quantitative data collection, and two weeks to process and analyze data. Reporting
findings and conclusions was anticipated to take three to six weeks. Actual analysis of the
data and reporting of findings took approximately four weeks to complete. Chapter 4
discusses the data collection and analysis.
curriculum specialist, course developer, distance education director, and similar roles.
The individuals were sent an e-mail inviting them to participate in the study. The e-mail
contained the link to a survey administered by SurveyMonkey. Respondents were able
to complete the survey in an average of eight minutes with most taking between seven
and nine minutes to complete. The invitational e-mail informed potential participants
that responses were anonymous and confidential. The opening screen of the survey also
included information relating to anonymity and confidentiality.
Sixty-four (32%) of the 200 people invited to participate followed the link to the
survey. One chose not to enter the survey after clicking on the link and six chose not to
complete the entire survey. Three of those entering the survey, but not completing the
entire survey, sent e-mails indicating that after entering the survey they felt they were not
the best person on campus to respond to the survey. Each of them suggested alternate
people to contact. One person reported an inability to complete the survey due to
technical problems. A total of 87.5% (n = 56) of those responding completed the survey.
Demographics
Part I of the survey collected demographic data from participants in an effort to
gain a better understanding of the overall structure of participants' institutions, the role
and knowledge of participants, the instructional design organizational structure used at
the institution, and participants' experiences with other organizational structures.
Descriptive, nonparametric statistics of mode and mode percentage were calculated for
the data collected in response to questions in Part I. These questions collected nominal
level data that "refers to categories or classifications. The groups are simply names to
differentiate them, no order is implied" (McMillan & Schumacher, 1997, p. 205), making
the calculation of median, as the study methodology initially stated, invalid for this data.
Responses to these questions aid in understanding the organizational structure used in
participants' institutions.
Participant role. Participants identified their roles in their institution as
administrator, instructional designer, faculty developer, teaching faculty, or other (See
Figure 1). Participants identifying themselves as "other" were asked to specify their roles.
Responses included instructional technology support specialist; instructional materials
developer; outreach staff; teach and design instruction in addition to conducting research;
instructional design/curriculum coordinator; Director - College of Arts, Sciences, and
Letters online program; and coordinator.
Figure 4. Delivery options offered by institutions and delivery options for which
participants have responsibility for instructional design.
Results indicate respondents represent a variety of institutions providing
instruction to all levels of students. The results also indicate a wide variety of delivery
methods in use and that participants have instructional design responsibility under a range
of delivery methods. The high percentage of respondents, 81.0%, designing hybrid
instruction strongly indicates the need for instructional designers who can develop
instruction for both face-to-face and online instructional environments.
Organizational structure. Respondents were asked to identify the instructional
design organizational structure best describing what was used at their institutions.
Responses included instructional designer centered, instructional designer supported,
faculty centered, and other (See Figure 5). Other organizational structures identified
include: our outreach school uses the instructional designer supported model; our
teaching center uses the faculty centered model; we currently have a blend of the faculty
centered model and the instructional designer supported model; and the institution uses a
faculty centered model, but within our center we use a combination of ID centered and ID
supported models to serve a larger body of educators (beyond the institution).
The results of this analysis were compared to results obtained by Vannoy (2008)
whose survey was modified for use in this study. A comparison revealed similar findings
of the mode response for performance frequency of specific instructional design activities
between Vannoy's results and those found in this study. The comparison revealed that
six of the 11 activities, 54.5%, were rated as being performed with the same level of
frequency by participants in this study and Vannoy's study. A similar result was found in
the median response value with six of the activities, 54.5%, having the same median
value. Similar activities are identified in Table C1.
Participant responses to the frequency of performing each instructional design
activity surveyed were tested using a goodness-of-fit, one-sample, chi-square test for all
respondents. Aczel (1999) defined goodness-of-fit as "a statistical test of how well our
data support an assumption about the distribution of a population or random variable of
interest" (p. 707). The null hypothesis, H0 (no difference in the frequency with which
specific instructional design activities are performed), was tested and rejected for 10 of
the 11 instructional design activities. Performing a follow-up evaluation of training was
the only activity for which respondents reported equal consistency across the frequency
columns. This indicates that not all respondents perform the various instructional design
activities at the same frequency with the exception of performing follow-up evaluation of
training (see Table C2).
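A goodness-of-fit, one-sample, chi-square test of this kind can be sketched with SciPy; the counts below are invented for illustration and are not the study's data.

```python
from scipy.stats import chisquare

# Hypothetical counts of respondents choosing each frequency level
# for one instructional design activity (NOT the study's actual data).
observed = [2, 5, 18, 12, 10, 6]  # always ... never

# H0: responses are distributed evenly across the six categories.
stat, p = chisquare(observed)
print(p < .05)  # a small p value supports rejecting H0
```

With no expected frequencies supplied, `chisquare` tests against a uniform distribution, matching the null hypothesis of equal frequency across response categories.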
Organizational Structure
A goodness-of-fit chi-square test was performed to test the null hypothesis, H0
(there is no difference in the number of institutions using the instructional designer
instructional design activities. The study design called for use of a chi-square test for
independence to test the null hypothesis, H0 (there is no difference in instructional design
activities among the IDC, IDS, and FC organizational structures). It was
determined, based on Aczel (1999) and consultation with statisticians from both Capella
University and the University of Arkansas at Little Rock, a chi-square analysis was
inappropriate to use in this study. Frequency counts of fewer than five found in the
cross-tabulation in Table C3 and the small sample size, n = 53, were given as the reasons
for this determination. Aczel (1999) cautions against using the chi-square test when the
count in any cell is less than five (p. 710). The study had over 20% of the cells with
values of fewer than five. Instead, the SPSS nonparametric, independent samples
Kruskal-Wallis test was conducted to test the null hypothesis.
Kruskal-Wallis was determined appropriate for the data given its use of "ranks of
the observations rather than the data" (Aczel, 1999, p. 695). SPSS-reported p values of
.000 for frequencies of usually, regularly, selectively, and never and of .001 for
frequencies of always and rarely supported rejection of the null hypothesis in each case. This indicates
there is a difference between organizational structures and the frequency with which
various instructional design activities are performed.
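A Kruskal-Wallis test of this kind can be sketched with SciPy; the ratings and group sizes below are invented for illustration and are not the study's data.

```python
from scipy.stats import kruskal

# Hypothetical ordinal ratings (1 = never ... 6 = always) for one activity,
# grouped by organizational structure (NOT the study's actual data).
idc = [1, 1, 2, 1, 2, 1]
ids = [5, 4, 5, 6, 5, 4]
fc = [3, 2, 4, 3, 3, 2]

# The test ranks all observations together, then compares mean ranks
# across the three groups.
stat, p = kruskal(idc, ids, fc)
print(p < .05)  # a small p value supports rejecting H0
```

Because it operates on ranks, the test suits ordinal survey responses and small, unevenly filled groups where a chi-square test for independence would be inappropriate.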
Figure 7 and Figure 8 illustrate, respectively, the percentage of respondents in
each organizational structure who indicated the frequency with which they conduct a
needs assessment and determine if the need can be solved by training. These figures indicate an
unequal distribution of frequency of performance: respondents using an IDS
structure most often report that they usually perform these two tasks, while those from
IDC organizational structures most often report that they never perform them. Figure 9
through Figure 17 illustrate the difference in frequency with which each of the remaining
instructional design activities is performed based on organizational structure.
C4). Table C4 represents nominal-level data on which mode and percentage were
reported. A comparison of reasons for omitting instructional design activities to
those found in Vannoy's (2008) study found identical responses for six (54.5%) of the
activities. Items showing the same reason for not including an activity during the
instructional design process included the following: conduct a needs assessment,
determine if the need can be solved by training, conduct task analysis, assess trainees'
entry skills and characteristics, pilot test instruction, and do a follow-up evaluation. In
each of the five cases where responses differed between studies, respondents in this study
responded they always perform the activities. This was a choice that was not available in
the Vannoy survey. In two of the five cases (develop test items and select instructional
strategies), the Vannoy respondents selected "not applicable." The "not applicable" response
was removed from this study's survey and replaced with "always include."
The data were cross-tabulated on organizational structure and reasons for omitting.
A nonparametric Kruskal-Wallis test was performed to test the null hypothesis, Ho (there
is no difference between organizational structures in reasons for omitting specific
instructional design activities). In each case, the test failed to reject the null hypothesis,
indicating no difference between organizational structures in the reasons for not
performing specific instructional design activities.
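As a concrete illustration of the Kruskal-Wallis procedure described above, the sketch below runs scipy's implementation on made-up groups. The group values are invented stand-ins for coded "reason for omitting" responses; nothing here reproduces the study's data.

```python
# Hypothetical Kruskal-Wallis test comparing coded "reason for omitting"
# responses across the three organizational structures. All values invented.
from scipy.stats import kruskal

idc = [1, 2, 2, 3, 1]           # coded reasons, IDC respondents
ids_ = [2, 2, 3, 1, 2, 3]       # coded reasons, IDS respondents
fc = [1, 3, 2, 2, 1, 3, 2]      # coded reasons, FC respondents

h_stat, p_value = kruskal(idc, ids_, fc)

# When p >= .05 the test fails to reject the null hypothesis, mirroring
# the study's finding of no difference between structures.
print(f"H = {h_stat:.3f}, p = {p_value:.3f}")
```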
A variety of organizational structures are used in institutions within the NCA-HLC;
the frequency with which instructional design activities are performed varies from
activity to activity; and a variety of reasons exist for omitting activities. A difference
appears to exist between organizational structures in the frequency of performance of
instructional design activities, but this difference does not extend to the reasons activities
might be excluded.
The second part of the study collected qualitative data. These data were used
to gain a greater understanding of the strengths and weaknesses of the IDC, IDS, and FC
organizational structures as perceived by instructional design professionals.
Qualitative Data
The 63 respondents who chose to participate in the survey were given an
opportunity to participate in the qualitative data collection portion of the study. Thirteen
participants volunteered to participate in telephone interviews. Eleven (17.4% of all
survey respondents) actually participated in interviews. One was not available at the
scheduled time and did not respond to attempts to reschedule; the other was not available
until after the close of the time frame allowed in the study design for interviews.
Procedure
The purpose of the qualitative portion of the study was to gain a greater
understanding of instructional design activities taking place at institutions within the
NCA-HLC. The study further sought to understand strengths and weaknesses of
organizational structures as perceived by instructional design professionals.
Questions were grouped into five categories:
Respondents were asked to identify the role each person played in the
instructional design process. All respondents listed the instructor or subject matter expert
(SME), n = 11, as involved in the process. Table C6 shows the breakdown of
instructional design roles by organizational structure. The next most frequently identified
role, n = 7, was that of instructional designer. This role was identified as present in each
organizational structure except the FC structure where it was not listed among
instructional design roles. Other roles identified by participants included editors, video
developers, faculty chair, project coordinator, graphic designer, creative services director,
course production team, and others whose role in the process was that of reviewer.
Participants were asked to describe the tasks performed by each person identified.
This task identification helped to further understand the function of the various roles
identified. The two most frequently identified tasks performed by SMEs were
development of content or materials and development of assessments. Additional tasks
identified as responsibilities of the SME were the following: develop learning objectives,
design learning outcomes, and ensure proper fit within the curriculum. Three tasks were
the most frequently identified for instructional designers in the IDC and IDS
organizational structures. One was to work with faculty members in the design of the
course including development of goals and objectives. The second task was to help
faculty members in selection of learning activities and assessments. The third task was
transfer of content designed by faculty members into the delivery platform. Additional
tasks included: ownership of the overall course design process in the IDC structure,
provide technical support in a blended structure, and review learning materials in the IDS
structure.
Where a director of instructional design was identified, the tasks were primarily to
administer and advertise instructional design services and to train and consult with
faculty members. Curriculum specialists served the primary purpose of ensuring course
fit in the curriculum and tracking curricular changes. Multimedia designers created and
designed multimedia materials including pictures, digital videos, and sound tracks.
Administrators ensured academic integrity and provided a timeline for project completion.
Communication and interaction are important aspects of these roles, especially in
the IDC and IDS organizational structures. Respondents were asked to identify areas
where various people involved in the instructional design process interact with one
another. In the FC organizational structure, the SME/faculty member, a "designer by
assignment" (Merrill & Wilson, 2007, p. 336), typically worked with other faculty
members in the discipline, directors of instructional design, curriculum specialists, and
multimedia designers. The purpose of these interactions was to gain support in
performance of instructional design activities, but responsibility and leadership rested with
the SME/faculty member. In the IDS structure, the SME interacted extensively with the
instructional designer in course development. The instructional designer typically worked
with administrators and the directors of instructional design to engage the help of others
needed in the process and to ensure consistency across the curriculum and within the
delivery platform. The participant interviewed from an IDC environment indicated that
the SME interacted almost exclusively with the instructional designer during the
instructional design process. The instructional designer then communicated with
administrators and others in support of the project. Course design in the FC and IDS
structure was often, although not always, initiated by the SME. In the IDC structure,
course design or revision typically began at the administrative level.
A similarity is seen among organizational structures and roles that were used to
conduct instructional design activities within the structures. The greatest difference
seemed to be in the number of roles and interactions between people performing the
roles. Instructional design processes and the activities taking place within the processes
were identified next.
Instructional design activities. Participants were asked to describe the
instructional design process at their institutions, followed by a description of typical
instructional design activities. Instructional design process was defined for participants
as referring to the procedures used for designing, developing, and implementing courses
within a program of study or curriculum. Instructional design activities were defined as
the individual steps performed within the procedures identified as part of the instructional
design process. See Table C7 for participant responses.
The results show all participants indicated involvement in determining the
syllabus, content, assessments and materials; these tasks were performed prior to
developing course materials or placing courses into learning management systems and
opening the course for students to access. Most participants, 81.8%, indicated the
process begins with a course request or recognition of the need for instruction. This is
followed by development of goals and objectives (72.7% of participants). The Kruskal-Wallis test was performed to test the null hypothesis, Ho (there is no difference between
organizational structures and instructional design processes based on responses given by
interview participants). In each case the test failed to reject the null hypothesis,
indicating no difference in instructional design processes among organizational structures.
The next question asked participants about instructional design activities. In
answering this question, participants' responses served to further clarify responses to the
previous question about instructional design processes. This clarification provided
insight into processes and organizational structures. For example, one participant
representing an IDC organizational structure further elaborated on the first step,
requesting a course or recognizing the need for a new course.
Whenever a project comes in, there has to be a needs assessment. It is co-led by
the instructional designer and the faculty chair. The instructional designer makes
sure it is done. That way we truly know who our students are, who our audience
is, their prerequisite knowledge, what competencies do we have to meet within
this course. (Participant 3)
Participant 3 further stated that in his/her IDC organizational structure, design was an
iterative process that is not done until the end so it is never really done.
In the FC organizational structure, the activities were not clearly defined.
Participant 4 stated, "I don't really know what processes they are using if there is any
kind of formal recognizable one other than their own craft of designing courses." A
greater depth of understanding can be gained from the following explanation.
It depends on the instructor here. Some instructors may say "OK, what content
am I covering and what order am I covering it? What midterm and final am I
going to give?" That would be at the lesser end. On the greater end, hopefully
people are looking specifically at what learner outcomes we want to achieve and
the processes through which we might achieve those. (Participant 10)
In the IDS organizational structure the instructional designer often serves the
functions of review and quality control. Participant 6 explained the process at his/her
institution.
The SME uses the previous draft of the course as their starting point. Taking into
consideration the content, he drafts learning outcomes, what they expect learners
to learn. Design learning activities . . . The instructional designer does a review of
the draft looking for learning outcomes to make sure they are on the higher level
of Bloom's taxonomy.
Participant 8 further expanded on the instructional designer's role in review and quality
control by stating, "Another thing the instructional designer does is look at instructional
objectives and make sure those instructional objectives are met."
The process and activities are much the same in each organizational structure, as
determined by the Kruskal-Wallis test. The depth or quality of those
activities does appear to differ based on organizational structures.
Organizational structures. Participants were asked to identify the
organizational structure used at their institutions. Responses included the following:
IDC, n = 1;
IDS, n = 3;
FC, n = 4; and
both IDS and FC, n = 3.
Participant 1 (FC), "The faculty for the most part work in isolation. They
teach the way they understand as opposed to teaching the way the students
might learn best."
Participant 4 (FC), "The faculty have been designing their courses as the
primary people for 100 plus years at our institution. The idea of having
instructional design people is relatively new within our institution."
Participant 6 (IDS), "The design of the course really centers with the faculty
and the school. They control the content and how it is delivered. The designer
supports, reviews, and offers suggestions."
Participant 3 (IDC), "That's what we are, a virtual assembly line here. The
faculty member still owns all their content . . . They are working through a
process . . . It's really centered around the role of the instructional designer."
Participant 9 (IDS & FC), "If they [faculty] are very experienced, it would be
faculty centered. If they [faculty] are inexperienced it would be instructional
designer supported."
work on those in-common elements with their own faculty. Basically, we had a train-the-trainer. This training, along with the fact that we have loud voices for
academic freedom, caused faculty to feel they know how to design their courses, purely
from a presentational content perspective, and has brought us back to a FC structure.
From institutions supporting an IDS structure, one participant reported no change
and two participants reported change in organizational structure. Participant 2 said the
change was to split the concepts of instructional design and development from
instructional innovation. The instructional design services unit handles fully online
efforts and the instructional development unit works with faculty training. Participant 8
identified the change as a move away from FC. The change began by teaching faculty
members how to write instructional objectives and develop good testing methods. It also
incorporated a method for providing systematic feedback and course evaluation to faculty
members. He/she stated, "Originally the only people doing the instructional design work
were the instructors themselves. There was really no instructional design person here."
Participant 3 indicated the organizational structure at his/her school had been
totally faculty centered. The organizational structure evolved over time to the current
process with resources, feedback, and support available throughout the process that is
guided by the instructional design team. Of the three participants indicating a use of both
FC and IDS organizational structures, two indicated organizational structure changes.
Participant 9 stated, "I would say it's gone from having very much a learning
management system focus, how to use the learning management system, to more a
pedagogical teaching consideration." Participant 5 stated the change in his/her institution
occurred because of the number of online courses offered and the faculty members
teaching them.
All participants reporting changes in organizational structure indicated they
believed the changes were for the better. Participants explained why they believed the
current organizational structures were chosen at their institutions. Tradition was the most
commonly stated reason (see Table C8).
Four of the five participants indicating tradition as one of the primary reasons for
the existing organizational structure also indicated the organizational structure at their
institutions had not changed during the time they had been there. One indicated a change
to a combination of IDS and FC structures. Participants were next asked to discuss their
perspectives of benefits and shortfalls of their organizational structures.
Strengths and weaknesses of organizational structures. Thirty-six percent, n =
4, of participants identified scalability/efficiency and faculty expertise as major benefits
of the organizational structure used at their institutions. Responsiveness, quality, faculty
autonomy, and communication were each identified by 27.3%, n = 3, as major benefits.
Most participants identified between two and three major benefits of their organizational
structure. Additional benefits included: works for the institution, integration into campus
with training units on hybrid learning, cost containment, location, and faculty relief.
Next participants were asked to identify major shortfalls of the organizational
structure used in their institutions. Five participants, 45.5%, identified poor or
inconsistent quality as the number one shortfall of the organizational structure. Three of
the four participants from FC organizational structures identified this shortfall. The next
most commonly identified shortfalls were nonresponsiveness of the process and lack of
faculty training, each identified by 27.3%, n = 3, of participants. Additional
shortfalls included: focus is not on learning, the rate of change makes for uncertainty,
perception of the process, and lack of staffing.
A side-by-side comparison of benefits and shortfalls identified for each
organizational structure is shown in Tables D9 and D10. A variety of reasons were
given when participants were asked why they believed the shortfalls existed. Mentioned
by 27.3%, n = 3, of participants were the following: tradition, leadership, and faculty
workload. Participant 10 summarized, "I think when you have a historic pattern, it takes
a lot of momentum and a lot of intervention to change that model." Participant 3 stated,
"So many things are going on at once. You don't get to pay as much attention to an
individual course maybe as you would like." Participant 1 said, "Lack of opportunity for
collaboration . . . I think the workloads are so heavy that there is not time for people to be
innovative."
Participants were given the opportunity to suggest changes that might reduce or
eliminate shortfalls in the organizational structure. Five participants, 45.5%, stated the
need for resources and training in instructional design and teaching, and 36.4%, n = 4,
suggested reducing workloads by giving release time, reducing courseload, or hiring
additional people.
Given the opportunity to share other information about instructional design
activities and the organizational structures at their institutions, participants gave positive
closing comments. Participant 1 stated, "There are some people who are doing a
remarkable job." Participant 9 stated his/her optimism "is much greater." Others
Results
Question 1
What types of instructional design activities are found at colleges and universities
using instructional designer centered, instructional designer supported, and faculty
centered models of organizational structure within the North Central Association of
Colleges and Schools, the Higher Learning Commission?
Understanding instructional design activities taking place in each of the
organizational structures serves to aid in understanding how organizational structures
support instructional design activities. The importance of this understanding is discussed
by Bichelmeyer (2003) in the context of the difference between instruction and
instructional design. According to Bichelmeyer, instruction has as its objective to
"ensure learning to the best of each student's abilities" (p. 5) while instructional design
has as its objective to "facilitate standardization of instruction" (p. 5). It is important to
understand whether organizational structure impacts instructional design activities
supporting the similar, yet different, objectives of instructional design and instruction.
The study examined instructional design activities taking place in different organizational
structures in order to gain a greater understanding of the differences found in each
organizational structure.
Quantitative data were collected using a survey modified from the work of
Vannoy (2008). Data were analyzed to determine what organizational structures that
support design and development of instruction are used in institutions in the NCA-HLC
and the frequency with which specific instructional design activities take place. This
information was then cross-tabulated to determine the frequency with which activities
take place in each of three organizational design structures: instructional designer
centered (IDC), instructional designer supported (IDS), and faculty centered (FC).
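The cross-tabulation step described above can be sketched with pandas; the records below are fabricated, and `pandas.crosstab` is simply one plausible tool for this kind of tabulation, not a claim about the study's actual workflow.

```python
# Hypothetical cross-tabulation of activity frequency by organizational
# structure. The eight records are fabricated for illustration only.
import pandas as pd

records = pd.DataFrame({
    "structure": ["IDC", "IDS", "FC", "IDS", "FC", "FC", "IDC", "IDS"],
    "frequency": ["never", "usually", "selectively", "usually",
                  "rarely", "selectively", "never", "always"],
})

# Rows: structures; columns: frequency categories; cells: counts.
table = pd.crosstab(records["structure"], records["frequency"])
print(table)
```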
Institutional data collected shows respondents represent a combination of private,
nonprofit; private, for profit; and public schools. Schools represented varied and
included two-year technical colleges, two-year community colleges, and four-year colleges
and universities. Respondents indicated that their institutions use a broad range of course
delivery methods including face-to-face, web-enhanced, hybrid, and online. The various
methods are each utilized with similar frequencies. Respondents were also asked to
identify delivery methods for which they personally were responsible for designing
instruction. Results were similar to respondents' institutional delivery methods with the
exception that respondents most frequently reported responsibility for designing
instruction for hybrid delivery methods.
Results indicated three common instructional design organizational structures were in
place: instructional designer centered (IDC), instructional designer supported (IDS), and
faculty centered (FC). The FC structure was the most commonly identified
organizational structure in use; the IDC structure was the least commonly used.
No additional organizational structures were identified by respondents.
The frequency with which instructional design activities occur among the surveyed
institutions varied from activity to activity. Overall, the most frequently occurring
activities were identified as: write learning objectives, identify types of learning
outcomes, and select instructional strategies. The least frequently occurring activities
were: pilot test instruction, develop test items, and conduct a task analysis. Results are
similar to the findings of Vannoy (2008). Vannoy found 50% of the most frequently
chosen activities remain the same as those in Wedman and Tessmer's (1993) study.
The study further shows the frequency with which instructional design activities were
performed varies among organizational structures. For example, the most frequently
occurring response for persons using an IDS structure to the activity pilot test instruction
before completion was always. However, those using an IDC or FC structure most
frequently responded selectively or rarely. The most frequent response regarding how
often needs assessments are conducted was usually for those using an IDS
structure, selectively in the FC structure, and never in the IDC structure.
Reasons for eliminating instructional design activities were also examined. A
variety of reasons were identified and found to vary from activity to activity. Reasons for
not conducting an activity were given most often for conduct a needs assessment and
pilot test instruction before completion. Respondents indicated the most frequent reason
for eliminating a needs assessment was because the decision had already been made.
Pilot testing was most often not conducted due to lack of time. The Kruskal-Wallis test
showed organizational structure does not seem to impact reasons given for omitting
specific activities.
The findings in this study indicated the most frequently occurring instructional
design activities taking place were write/use learning objectives and identify types of
learning outcomes. The least frequently occurring activities were develop test items and
pilot test instruction before completion. Findings further indicated there was a difference
in frequencies with which instructional design activities took place between
organizational structures. The most frequently occurring instructional design activities in
across the curriculum and within delivery platforms. In the IDC environment, SMEs
interacted almost exclusively with instructional designers during the instructional design
processes. The instructional designer communicated with administrators and others in
support of the project.
These interactions guided instructional design processes and activities as part of
the processes at each institution. Responses indicated there was little or no difference
between organizational structure and instructional design processes. Participant
responses, while indicating the processes taking place may be similar, showed there was
a difference in the depth or quality of activities based on organizational structures. The
impact these differences have on instructional design and course quality overall was
addressed by participants who perceived quality as a benefit of the instructional designer
centered organizational structure and lack of quality as a shortfall of the faculty centered
organizational structures.
No one overarching major benefit was identified for any organizational structures,
but several benefits were listed. Scalability/efficiency and faculty expertise were
identified most often by participants as major benefits. Benefits solely attributed to FC
organizational structures were tradition and works for institution. The benefit attributed
only to IDS structures was integration into campus with training units on hybrid learning.
No benefits were listed as specifically unique to IDC structures.
Poor or inconsistent quality was identified most often as the major shortfall of
organizational structures used by participants. This shortfall was identified by five
participants. Three of the four participants from FC organizational structures identified
poor or inconsistent quality as a shortfall of the structure. Shortfalls identified as unique
to the FC structure were lack of focus on learning and faculty inflexibility. Unique to the
IDS structure was the rate of change making for uncertainty, and unique to the IDC
structure was perception of the process.
Faculty expertise was ultimately seen as both a major benefit and weakness of
organizational structures. It was seen as a benefit in the knowledge faculty had of the
subject but a weakness in the lack of knowledge faculty had in instructional design.
Participant 2 expressed his/her concern.
My personal belief is that instructional designers (instructional design teams,
units, whatever) have to find a way to straddle that fence [faculty autonomy vs.
quality instruction]. It's always been a struggle. We're trying to build quality
product, effective learning units, learning modules, learning courses, however
you want to look at it, but the faculty voice, the faculty ownership is a crucial
part of that. That's the $64,000 question! How do we let faculty be faculty, and
do all the things that we want to do, but yet ensure that the students that go
through that course all have a meaningful learning experience? . . . that's the real
key. That's the core to doing this.
Other participants also expressed this concern. Faculty members need to be able
to do what they do best, teach to their specialization, while ensuring quality
instructional design that enables quality instruction for learners. This concern about
quality needs to be addressed, along with recognition of other strengths of organizational
structures that were identified from information collected during interviews.
Discussion
Organizational structures used for the design, development, and support of
instructional design activities vary between institutions within the NCA-HLC. A shift
has been seen from traditional faculty centered structures where faculty have sole
responsibility for design, development, delivery, and facilitation of their own courses.
This shift includes instructional designer centered and instructional designer supported
structures that provide varying levels of instructional designer support or control over
instructional design processes. With this change comes the need to understand the
strengths and weaknesses of each structure. This understanding can help decision makers at
colleges and universities make informed choices regarding how to best provide effective
environments for development and support of instructional design activities.
This study showed a wide range of delivery methods are being utilized at
institutions within the NCA-HLC. Respondents indicated that within these delivery
methods the most common method for which they have responsibility is hybrid delivery.
According to Vernadakis et al. (2011), hybrid delivery has evolved as an effort to
"combine the best of online and face-to-face instruction" (p. 188). Those designing for
hybrid delivery need to understand how to effectively design for both environments.
Vernadakis et al. state, "E-learning technology developed around the hybrid paradigm is
beneficial for improving the quality of learning, but is useless if it is not based on
pedagogical prescriptions" (p. 189). "Faculty are steeped and highly knowledgeable in
their subject matter but have had very little educational training or support in terms of
teaching and learning theory, pedagogy, and practices of evaluation and assessment"
(Participant 4). Concerns such as those expressed by Participant 4 and others need to be
addressed. Understanding whether a difference exists between organizational structures
and strengths and weaknesses of each structure can lead to greater understanding of how
instructional design activities and organizational structures impact the quality of
instructional design. This understanding can also help decision makers understand and
systems, while others have provided training in pedagogy. Participant 9 addressed this
issue when stating that changes in the organizational structure have gone from a learning
management system focus to a pedagogical teaching consideration. Yet the shifting
training focus discussed by Participant 9 enhances the need for proper planning and
implementation of training. Referring to training, Participant 7 indicated faculty feel
overwhelmed by the amount of technology training they have been asked to absorb over
the last ten years.
The importance of selecting appropriate instructional methods was addressed by
Reigeluth (1999) in his discussion of the need to select methods to achieve desired
instructional outcomes (p. 8). Study participants indicated recognition of the need to
teach faculty how to use technology, but along with this necessity must also be sound
pedagogy. This was addressed by participants discussing the role of the instructional
designer in helping faculty learn to design instruction. The need was identified to help
faculty move from desired learning outcomes or objectives to focus on content. Focus on
content must then be enhanced to include engagement strategies and assessment
(Participant 10). Those in leadership positions at institutions of higher learning should
recognize these needs and work toward methods for addressing them. Possibilities
include reduced workload and a center for professional development, as suggested by
Participant 1, and better project planning, as suggested by Participant 10.
Study participants indicated a need for recognizing and understanding the benefits
a professional instructional designer can bring to instructional design processes. The
participants also referred to the importance of recognizing and understanding differences
in instructional needs and instructional design activities for various delivery media.
Participant 9 summed it up: "An online course, it's not just the face-to-face course put
online; there are certain implications for how you would design that course." Participants
expressed concern over the number of people designing courses who have no technology
training and/or instructional design training. The need was identified to train instructors
how to teach in the online environment (Participant 8). The lack of consistency in the
quality of instructional design may be attributed to inconsistency in faculty training and
desire for training. Participant 9 said at his/her institution there are "lots of faculty who
are jumping on board [receiving training in pedagogy] and we also have some who are
not interested." The potential impact of those not jumping on board should be
investigated to determine the effect on overall quality of education.
The study reveals one of the greatest strengths of the FC and IDS organizational
structures is faculty expertise. This expertise must be retained during the instructional
design process and should be supported by the organizational structures. Silber (2007)
discussed the importance of training faculty in instructional design principles in order to
avoid solutions "that are not only ineffective, but can actually create more difficulty" (p.
7). The organizational structure selected for use by institutions must draw on faculty
strengths and provide faculty autonomy, while providing instructional design support to
ensure quality instructional design and instruction.
Recommendations
Recommendations for Institutions
Institutions of higher learning face great challenges in today's educational
environment. Competition for students is stiff, learner demographics are changing, and
ways to help faculty understand the importance of instructional design and the benefit an
instructional designer can bring to the instructional design process. Institutional leaders
and decision makers should also examine the difference between professional
instructional designers and "designers-by-assignment" (Merrill & Wilson, 2007, p. 336). If faculty members are to continue as "designers-by-assignment" (p. 336), thought should be given to extensive training for these educators in pedagogy and instructional design principles. Also, release time should be provided to allow faculty members time to attend training and develop skill. Merrill (2007) states "95% of all instructional design is done by designers-by-assignment" (p. 336). He further explained that many instructional products currently designed for corporate America "fall far short of their potential. They are inefficient and often ineffective" (p. 337). This ineffectiveness appears to carry over
into academic instructional design. Study participants indicated a lack of quality and the
need for training faculty members in both pedagogy and technology.
Recommendations for Further Research
Further study is warranted to gather data from a larger sampling of the population.
The small sample size, while answering the study questions, limited the extent to which
data could be examined. For example, the results could not indicate the significance of differences in the frequency with which various instructional design activities are conducted within different organizational structures. This information could provide
further insight into the effectiveness of organizational structures. Of interest for future
study is a determination of whether there is a relationship between organizational structure and the instructional design model chosen. For example, do faculty centered structures tend
various structures. Overall, findings revealed the most frequently occurring instructional design activities are writing/using learning objectives, identifying types of learning outcomes, and selecting instructional strategies.
All three were identified in the most frequently occurring list for both faculty centered
and instructional designer centered organizational structures. Identifying learning
objectives was the only one of the above activities to place in the top three frequently
occurring activities in the instructional designer supported organizational structure.
The least frequently occurring instructional design activities overall include conducting a pilot test and conducting a task analysis.
Conducting a pilot test was identified as the least frequently occurring activity for all
organizational structures. The only organizational structure for which conducting a task
analysis was the least frequently occurring activity was the faculty centered structure.
The frequency with which activities occurred in the various organizational structures was
reported in Table C3. Reasons for omitting activities appeared to be consistent across
organizational structures.
The qualitative portion of the study expanded the investigation to identify
strengths and weaknesses of organizational structures as perceived by participants. Lack
of quality was identified as a potential weakness in faculty centered organizational
structures. Faculty expertise in their subject matter was recognized as an inherent
strength of the structure. Further study was recommended to delve deeper into methods
for maintaining faculty expertise and developing consistent quality. Concern was raised
regarding the responsiveness of organizational structures. The faculty centered structure
was seen to be responsive to needs of learners and content changes but potentially at the
expense of quality. The instructional designer centered structure was noted for being
responsive to change on a large scale but nonresponsive to the immediate needs of the
learners or changes in content.
The study has shown instructional designer supported and instructional designer
centered organizational structures follow team based approaches. This team based
approach benefits the process by using the expertise of the subject matter expert and the
expertise of the instructional designer in the course design process. "In that, it's a collaborative process. We are using the expertise of both" (Participant 6). The study also
shows the faculty centered structure is more often an individual effort. While this
individual effort allows the expertise of the subject matter expert to take center stage it
can lead to an instructional design disadvantage in that "faculty as a whole are not systematically prepared professionally to teach" (Participant 10).
Training and organizational restructuring can be used to reduce disadvantages
found in a faculty centered structure. Several study participants indicated a need exists
for faculty training in pedagogical skills and instructional design processes. Alongside the need for faculty training, participants recognized a need to understand the role instructional designers can play in the design and support of quality instruction at institutions of higher
education. Recommendations were made for institutions to examine their instructional
design practices in an effort to create or enhance environments supporting instructional
REFERENCES
Aczel, A. D. (1999). Complete business statistics (4th ed.). Boston: Irwin/McGraw-Hill.
Albi, R. (2007). Professors as instructional designers: Lived experiences in designing
and developing online instruction. Ph.D. dissertation, Capella University, United
States -- Minnesota. Retrieved from Dissertations & Theses @ Capella
University. (Publication No. AAT 3288703).
Ally, M. (2004). Foundations of educational theory for online learning. In T. Anderson
& F. Elloumi (Eds.), Theory and practice of online learning (p. 5). Athabasca
University, Alberta, Canada: Creative Commons.
American Psychological Association. (2010). Publication manual of the American
Psychological Association (6th ed.). Washington, DC: Author.
Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A proposed system and its
control processes. In K. Spence & J. Spence (Eds.), The psychology of learning
and motivation (Vol. 2, pp. 13-113). New York: Academic Press.
Ayers, D. F. (2002). Mission priorities of community colleges in the southern United
States. Community College Review, 30(3), 11-30.
Baggaley, J. (2008). Where did distance education go wrong? Distance Education,
29(1), 39-51.
Beckman, S. L. (2009). Introduction to a symposium on organization design. California
Management Review, 51(4), 6-10.
Best, J. W., & Kahn, J. V. (2003). Research in education (9th ed.). Boston, MA: Allyn
and Bacon.
Bichelmeyer, B. A. (2003, August). Instructional theory and instructional design theory:
What's the difference and why should we care? 2003 IDT Record. Retrieved
from http://www.indiana.edu/~idt/articles/documents/ID_theory.Bichelmeyer.html
Bichelmeyer, B. A. (2005). The ADDIE model: A metaphor for the lack of clarity in
the field of IDT. AECT 2004 IDT Futures Group Presentations. Retrieved from
http://www.indiana.edu/~idt/shortpapers/documents/IDTf_Bic.pdf
Braganza, A., Awazu, Y., & Desouza, K. C. (2009). Sustaining innovation is challenge
for incumbents. Research Technology Management, 52(4), 46-56.
Brill, J. M., Bishop, M. J., & Walker, A. E. (2006). The competencies and
characteristics required of an effective project manager: A web-based Delphi
study. Educational Technology Research and Development 54(2), 115-140.
Chen, G. (2009, January 26). Changing student demographics: Rising number of
professional students. [Online article]. Retrieved from http://www.community
collegereview.com/articles/75
Corcoran, C. A., Dershimer, E. L., & Tichenor, M. S. (2004). A teacher's guide to
alternative assessment: Taking the first steps. The Clearing House, 77(5), 213-216.
Creswell, J. W. (2008). Educational research: Planning, conducting, and evaluating
quantitative and qualitative research (3rd ed.). Upper Saddle River, NJ: Pearson.
Darvin, J. (2006). Real-world cognition doesn't end when the bell rings: Literacy
instruction strategies derived from situated cognition research. Journal of
Adolescent and Adult Literacy, 49(5), 398-407.
Day, J. C., & Jamieson, A. (2003). School enrollment 2000: Census 2000 Brief (Report
No. C2KBR-26). Retrieved from U. S. Census Bureau website: http://www.
census.gov/prod/2003pubs/c2kbr-26.pdf
Delialioglu, O., & Yildirim, Z. (2008). Design and development of a technology
enhanced hybrid instruction based on MOLTA model: Its effectiveness in
comparison to traditional instruction. Computers & Education, 51(1), 474-483.
DeLoach, S. B., & Greenlaw, S. A. (2007). Effectively moderating electronic
discussions. Journal of Economic Education, 38(4), 419-434.
Dempsey, J. V., Albion, P., Litchfield, B. C., Havard, B., & McDonald, J. (2007). What
do instructional designers do in higher education? In. R. A. Reiser & J. Dempsey
(Eds.), Trends and issues in instructional design and technology (2nd ed., pp.
221-233). Upper Saddle River, NJ: Pearson.
Dennen, V. P. (2005). From message posting to learning dialogues: Factors affecting
learner participation in asynchronous discussion. Distance Education, 26(1), 127-148.
Dennen, V. P., Darabi, A. A., & Smith, L. J. (2007). Instructor-learner interaction in
online courses: The relative perceived importance of particular instructor
actions on performance and satisfaction. Distance Education, 28(1), 65-79.
Dick, W., & Carey, L. (1978). The systematic design of instruction. Upper Saddle River,
NJ: Pearson.
Dick, W., Carey, L., & Carey, J. O. (2009). The systematic design of instruction (7th
ed.). Upper Saddle River, NJ: Pearson.
Doyle, W. (2009). Online education: The revolution that wasn't. Change: The Magazine
of Higher Learning, 41(3), 56-58.
Driscoll, M. P. (2005). Psychology of learning for instruction (3rd ed.). Boston, MA:
Pearson.
Ertmer, P. A., & Stepich, D. A. (2005). Instructional design expertise: How will we
know it when we see it? Educational Technology, 45(6), 38-43.
Fauser, M., Henry, K., & Norman, D. K. (2006). Comparison of alternative instructional
design models. [Online article]. Retrieved from
http://m.deekayen.net/comparison-alternative-instructional-design-models
Fowler, F. J. (1993). Survey research methods (2nd ed.). Newbury Park, CA: Sage.
Franken, A., Edwards, C., & Lambert, R. (2009). Executing strategic change:
Understanding the critical management elements that lead to success. California
Management Review, 51(3), 49-73.
Fullan, M. (2001). Leading in a culture of change. San Francisco, CA: Jossey-Bass.
Gearhart, D. (2001). Ethics in distance education: Developing ethical policies. Online
Journal of Distance Learning Administration, 4(1).
Gholson, B., & Craig, S. (2006). Promoting constructive activities that support vicarious
learning during computer-based instruction. Educational Psychology Review,
18(2), 119-139.
Gustafson, K. L., & Branch, R. M. (2002). Survey of instructional development models
(4th ed.). Syracuse, NY: ERIC.
Hedberg, J. G. (2006). E-learning futures? Speculations for a time yet to come. Studies
in Continuing Education, 28(2), 171-183.
Higher Learning Commission. (2010, May 25). Current or previously affiliated
institutions [Official website]. Retrieved from
http://www.higherlearningcommission.org/component/option,com_directory/Itemid,184/
Howell, C. L., & ERIC Clearinghouse for Community Colleges. (2001). Facilitating
responsibility for learning in adult community college students (Publication No.
ED451841). ERIC Digest [Electronic database]. Retrieved from
http://www.ericdigests.org/2001-4/adult.html
Irlbeck, S., Kays, E., Jones, D., & Sims, R. (2006). The phoenix rising: Emergent models
of instructional design. Distance Education, 27(2), 171-185.
Irlbeck, S. A., & Pucel, D. J. (2000). Dimensions of leadership in higher education
distance education. In Proceedings of the International Workshop on Advanced
Learning Technologies (IWALT 2000) (p. 63). doi:10.1109/IWALT.2000.890567
Jacobs, J., & Archie, T. (2008). Investigating sense of community in first-year college
students. Journal of Experiential Education, 30(3), 282-285.
Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth
(Ed.), Instructional-design theories and models: Vol. 2. A new paradigm of
instructional theory (pp. 215-239). Mahwah, NJ: Erlbaum.
Jonassen, D., Strobel, J., & Gottdenker, J. (2005). Model building for conceptual change.
Interactive Learning Environments, 13(1-2), 15-27.
Jones-Kavalier, B. R., & Flannigan, S. L. (2006). Connecting the digital dots: Literacy of
the 21st century. Educause Quarterly, 29(2), 8-10.
Kemp, J. (1971). Instructional design: A plan for unit and course development.
Belmont, CA: Fearon.
Kinser, K. (2006). What Phoenix doesn't teach us about for-profit higher education.
Change, 38(4), 24-29.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during
instruction does not work: An analysis of the failure of constructivist, discovery,
problem-based, experiential, and inquiry-based teaching. Educational
Psychologist, 41(2), 75-86.
Lanier, M. M. (2006). Academic integrity and distance learning. Journal of Criminal
Justice Education, 17(2), 244-261.
Leech, N. L., & Onwuegbuzie, A. J. (2007). An array of qualitative data analysis tools: A
call for data analysis triangulation. School Psychology Quarterly, 22(4), 557-584.
Magliaro, S., Lockee, B., & Burton, J. (2005). Direct instruction revisited: A key model
for instructional technology. Educational Technology Research and
Development, 53(4), 41-55.
Mosby (Ed.). (2006). Mosby's pocket dictionary of medicine, nursing, & health
professions (5th ed.). St. Louis, MO: Mosby.
Orszag, J. M., Orszag, P. R., & Whitmore, D. M. (2001, August). Learning and earning:
Working in college [Online article]. Commissioned by Upromise, Inc.
Retrieved from http://www.brockport.edu/career01/upromise.htm
Palloff, R. M., & Pratt, K. (2003). The virtual student: A profile and guide to working
with online learners. San Francisco, CA: Jossey-Bass.
Qureshi, E. (2004). Instructional design models. [Online article]. Retrieved from
http://web2.uwindsor.ca/courses/edfac/morton/instructional_design.htm
Reigeluth, C. M. (1999). Instructional-design theories and models: A new paradigm of
instructional theory (Vol. 2). Mahwah, NJ: Erlbaum.
Reiser, R. A., & Dempsey, J. V. (2007). Trends and issues in instructional design and
technology (2nd ed.). Upper Saddle River, NJ: Pearson.
Romano, J., Kromrey, J. D., Coraggio, J., & Skowronek, J. (2006). Appropriate statistics
for ordinal level data: Should we really be using t-test and Cohen's d for
evaluating group differences on the NSSE and other surveys? Paper presented at
the annual meeting of the Florida Association of Institutional Research, February
1-3, 2006, Cocoa Beach, FL. Abstract retrieved from
http://www.florida-air.org/romano06.pdf
Ryder, M. (2010). Instructional design models. Retrieved from
http://carbon.ucdenver.edu/~mryder/itc_data/idmodels.html
Saba, F. (2005). Critical issues in distance education: A report from the United States.
Distance Education, 26(2), 255-272.
Scott, G. (2003). Effective change management in higher education. Educause Review,
38(6), 64-80.
Seiden, M. (2009). For-profit colleges deserve some respect. Chronicle of Higher
Education, 55(41), A80.
Siemens, G. (2004). Connectivism: A learning theory for the digital age. elearnspace.
Retrieved from http://www.elearnspace.org/Articles/connectivism.htm
Siemens, G. (2005). Connectivism: Learning as network creation. elearnspace.
Retrieved from http://www.elearnspace.org/Articles/networks.htm
Wayne State University. (2010). Barrier-free elearning: eLearning: Glossary.
Retrieved from the Wayne State University College of Engineering website:
http://www.eng.wayne.edu/page.php?id=1263
Wedman, J., & Tessmer, M. (1993). Instructional designers' decisions and priorities: A
survey of design practice. Performance Improvement Quarterly, 6(2), 43-57.
Weisenberg, F., & Stacey, E. (2005). Reflections on teaching and learning online:
Quality program design, delivery and support issues from a cross-global
perspective. Distance Education, 26(3), 385-404.
Whitten, J. L., & Bentley, L. D. (1998). Systems analysis and design methods (4th ed.).
Boston, MA: McGraw-Hill.
Woods, R., & Ebersole, S. (2003). Using non-subject-matter-specific discussion boards
to build connectedness in online learning. The American Journal of Distance
Education, 17(2), 99-118.
Yang, P. (2006). UCLA community college review: Reverse transfer and multiple
missions of community colleges. Community College Review, 33, 55-70.
Yankelovich, D. (2005). Ferment and change: Higher education in 2015. Chronicle of
Higher Education, 52(14), B6-B9.
… assessment.
9. I determine if need can be solved by training.
10. I write/use learning objectives.
… characteristics.
14. I develop test items.
15. I select instructional strategies.
18. I do a follow up evaluation of the training.
Other ____________________________________
24. Assess trainees' entry skills and characteristics.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
25. Develop test items.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
26. Select instructional strategies.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
27. Select media formats for the training.
Lack expertise
Client won't support
supported, or faculty centered, which model would best describe that used at
your institution? [The interviewer may read the following definitions if asked.
Instructional designer centered model. An instructional design organizational
model in which instructional design activities are performed by a team of
instructional design experts with the subject matter expert (SME) serving as a
member of the design team during the design, development and delivery of
instructional solutions. Instructors teaching the course may or may not serve
on the design and development team. Instructional designer supported model.
An instructional design organizational model in which the instructor works
closely with an instructional designer or instructional design team in design,
development, delivery, and facilitation of his or her own courses. Support is
typically in the form of assistance in both instructional design and
development activities and the use of the course delivery system. Faculty
centered model. An instructional design organizational model in which the
instructor is responsible for the design, development, delivery, and facilitation
of his or her own courses. Support is typically in the form of assistance in the
use of the course delivery system rather than instructional design support.]
9. Why did you select this model as your answer?
10. Has the organizational structure changed during the time you have been at
your institution?
If yes, ask 10a, 10b, 10c, and 10d; if no, skip to Question 11.
10a. How has it changed?
10b. What was the organizational structure like before the change?
APPENDIX C. TABLES
Table C1. Frequency and Percentage of Performance of Instructional Design Activities

Conduct a needs assessment: Always 19.6% (11); Usually 23.2% (13)abd; Regularly 17.9% (10)ce; Selectively 23.2% (13)ab; Rarely 7.1% (4); Never 8.9% (5)
Determine if need can be solved by training: Always 10.7% (6); Usually 37.5% (21)ad; Regularly 21.4% (12)ce; Selectively 14.3% (8); Rarely 5.4% (3); Never 10.7% (6)
Write/use learning objectives: Always 50.0% (28)acde; Usually 17.9% (10); Regularly 12.5% (7); Selectively 8.9% (5); Rarely 5.4% (3); Never 5.4% (3)
Conduct a task analysis: Always 7.1% (4); Usually 30.4% (17)a; Regularly 16.1% (9)ce; Selectively 19.6% (11); Rarely 7.1% (4); Never 19.6% (11)
Identify types of learning outcomes: Always 32.1% (18)ad; Usually 28.6% (16)ce; Regularly 14.3% (8); Selectively 14.3% (8); Rarely 5.4% (3); Never 5.4% (3)
Assess trainees' entry skills and characteristics: Always 30.4% (17)a; Usually 21.4% (12)ce; Regularly 21.4% (12); Selectively 10.7% (6); Rarely 7.1% (4); Never 8.9% (5)
Develop test items: Always 23.2% (13); Usually 10.7% (6); Regularly 12.5% (7); Selectively 25.0% (14)a; Rarely 10.7% (6)c; Never 17.9% (10)
Select instructional strategies: Always 32.1% (18)ad; Usually 21.4% (12)c; Regularly 21.4% (12); Selectively 16.1% (9); Rarely 3.6% (2); Never 5.4% (3)
Select media formats for the training: Always 28.6% (16)ad; Usually 21.4% (12)c; Regularly 19.6% (11); Selectively 21.4% (12); Rarely 1.8% (1); Never 7.1% (4)
Pilot test instruction before completion: Always 17.9% (10); Usually 14.3% (8); Regularly 10.7% (6); Selectively 26.8% (15)ac; Rarely 19.6% (11); Never 10.7% (6)
Do a follow up evaluation of training: Always 19.6% (11); Usually 17.9% (10); Regularly 19.6% (11)c; Selectively 23.2% (13)a; Rarely 8.9% (5); Never 10.7% (6)

Note. a mode; b activities having two or more modes; c median; d mode responses are the same as found in the Vannoy (2008) study; e median responses are the same as found in the Vannoy study; (n) frequency count.
Table C2. Chi-Square Results by Instructional Design Activity

Activity: Chi-Square, Asymp. Sig.
Conduct a needs assessment: 12.836, .046
Determine if need can be solved by training: 28.364, .000
Write/use learning objectives: 59.164, .000
Conduct a task analysis: 19.709, .003
Identify types of learning outcomes: 30.145, .000
Assess trainees' entry skills and characteristics: 23.527, .001
Develop test items: 12.836, .046
Select instructional strategies: 32.182, .000
Select media formats for the training: 25.564, .000
Pilot test instruction before completion: 13.091, .042
Do a follow up evaluation of training: 11.055, .087*

[The 95% CI and df columns of this table did not survive conversion.]
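The chi-square values above pair each instructional design activity with a test statistic and its asymptotic significance. As a hedged illustration (a sketch, not the study's actual computation), a Pearson goodness-of-fit statistic for one activity's response counts can be computed against a uniform expected distribution; the function name and the example counts, taken from the needs-assessment row of Table C1, are choices made for this sketch.

```python
# Hedged sketch: Pearson chi-square goodness-of-fit statistic against a
# uniform expected distribution. It illustrates the kind of test reported
# above; it is not guaranteed to match the study's exact computation.

def chi_square_statistic(observed):
    """Return (statistic, degrees of freedom) for a uniform-expected fit."""
    n = sum(observed)
    k = len(observed)
    expected = n / k  # uniform: every category equally likely
    stat = sum((o - expected) ** 2 / expected for o in observed)
    return stat, k - 1

# Counts from the needs-assessment row of Table C1
# (Always, Usually, Regularly, Selectively, Rarely, Never), n = 56:
counts = [11, 13, 10, 13, 4, 5]
stat, df = chi_square_statistic(counts)
print(f"chi-square = {stat:.3f}, df = {df}")
```

The statistic is then referred to a chi-square distribution with the reported degrees of freedom to obtain the significance ("Asymp. Sig.") value, which statistical packages report directly.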
Table C3. Frequency of Performance of Instructional Design Activities by Organizational Structure

Conduct a needs assessment
  IDC: Always 0.0% (0); Usually 20.0% (1); Regularly 20.0% (1); Selectively 20.0% (1)c; Rarely 0.0% (0); Never 40.0% (2)a
  IDS: Always 20.0% (3); Usually 46.7% (7)ac; Regularly 13.3% (2); Selectively 20.0% (3); Rarely 0.0% (0); Never 0.0% (0)
  FC: Always 21.2% (7); Usually 12.1% (4); Regularly 21.2% (7)c; Selectively 24.2% (8)a; Rarely 12.1% (4); Never 9.1% (3)
Determine if need can be solved by training
  IDC: Always 0.0% (0); Usually 20.0% (1); Regularly 20.0% (1); Selectively 20.0% (1)c; Rarely 0.0% (0); Never 40.0% (2)a
  IDS: Always 20.0% (3); Usually 53.3% (8)ac; Regularly 0.0% (0); Selectively 13.3% (2); Rarely 0.0% (0); Never 13.3% (2)
  FC: Always 9.1% (3); Usually 33.3% (11)a; Regularly 30.3% (10)c; Selectively 12.1% (4); Rarely 9.1% (3); Never 6.1% (2)
Write/use learning objectives
  IDC: Always 80.0% (4)a; Usually 0.0% (0)c; Regularly 0.0% (0); Selectively 0.0% (0); Rarely 0.0% (0); Never 20.0% (1)
  IDS: Always 53.3% (8)a; Usually 33.3% (5)c; Regularly 13.3% (2); Selectively 0.0% (0); Rarely 0.0% (0); Never 0.0% (0)
  FC: Always 45.5% (15)a; Usually 12.1% (4)c; Regularly 15.2% (5); Selectively 12.1% (4); Rarely 9.1% (3); Never 6.1% (2)
Conduct a task analysis
  IDC: Always 20.0% (1)ab; Usually 20.0% (1)ab; Regularly 20.0% (1)abc; Selectively 20.0% (1)ab; Rarely 0.0% (0); Never 20.0% (1)ab
  IDS: Always 6.7% (1); Usually 46.7% (7)ac; Regularly 20.0% (3); Selectively 6.7% (1); Rarely 6.7% (1); Never 13.3% (2)
  FC: Always 6.1% (2); Usually 27.3% (9)a; Regularly 15.2% (5); Selectively 24.2% (8)c; Rarely 9.1% (3); Never 18.2% (6)
Identify types of learning outcomes
  IDC: Always 80.0% (4)a; Usually 0.0% (0)c; Regularly 0.0% (0); Selectively 0.0% (0); Rarely 0.0% (0); Never 20.0% (1)
  IDS: Always 46.7% (7)a; Usually 20.0% (3)c; Regularly 13.3% (2); Selectively 20.0% (3); Rarely 0.0% (0); Never 0.0% (0)
  FC: Always 21.2% (7); Usually 36.4% (12)ac; Regularly 15.2% (5); Selectively 12.1% (4); Rarely 9.1% (3); Never 6.1% (2)
Assess trainees' entry skills and characteristics
  IDC: Always 40.0% (2)ab; Usually 0.0% (0); Regularly 0.0% (0); Selectively 20.0% (1)c; Rarely 0.0% (0); Never 40.0% (2)ab
  IDS: Always 33.3% (5)ab; Usually 33.3% (5)abc; Regularly 6.7% (1); Selectively 13.3% (2); Rarely 6.7% (1); Never 6.7% (1)
  FC: Always 30.3% (10); Usually 18.2% (6); Regularly 33.3% (11)ac; Selectively 6.1% (2); Rarely 6.1% (2); Never 6.1% (2)
Develop test items
  IDC: Always 40.0% (2)a; Usually 0.0% (0); Regularly 20.0% (1)c; Selectively 20.0% (1); Rarely 0.0% (0); Never 20.0% (1)
  IDS: Always 6.7% (1); Usually 20.0% (3)ab; Regularly 20.0% (3)ab; Selectively 20.0% (3)abc; Rarely 13.3% (2); Never 20.0% (3)ab
  FC: Always 30.3% (10)a; Usually 9.1% (3); Regularly 9.1% (3); Selectively 27.3% (9)c; Rarely 9.1% (3); Never 15.2% (5)
Select instructional strategies
  IDC: Always 60.0% (3)a; Usually 20.0% (1)c; Regularly 0.0% (0); Selectively 20.0% (1); Rarely 0.0% (0); Never 0.0% (0)
  IDS: Always 26.7% (4); Usually 20.0% (3); Regularly 13.3% (2)c; Selectively 33.3% (5)a; Rarely 6.7% (1); Never 0.0% (0)
  FC: Always 33.3% (11)a; Usually 24.2% (8)c; Regularly 30.3% (10); Selectively 6.1% (2); Rarely 0.0% (0); Never 6.1% (2)
Select media formats for the training
  IDC: Always 40.0% (2)ab; Usually 0.0% (0); Regularly 40.0% (2)abc; Selectively 20.0% (1); Rarely 0.0% (0); Never 0.0% (0)
  IDS: Always 33.3% (5)a; Usually 20.0% (3)c; Regularly 13.3% (2); Selectively 26.7% (4); Rarely 6.7% (1); Never 0.0% (0)
  FC: Always 27.3% (9)a; Usually 24.2% (8)c; Regularly 21.2% (7); Selectively 18.2% (6); Rarely 0.0% (0); Never 9.1% (3)
Pilot test instruction before completion
  IDC: Always 0.0% (0); Usually 0.0% (0); Regularly 20.0% (1); Selectively 40.0% (2)abc; Rarely 40.0% (2)ab; Never 0.0% (0)
  IDS: Always 46.7% (7)a; Usually 13.3% (2)c; Regularly 0.0% (0); Selectively 20.0% (3); Rarely 20.0% (3); Never 0.0% (0)
  FC: Always 9.1% (3); Usually 18.2% (6); Regularly 12.1% (4); Selectively 27.3% (9)ac; Rarely 18.2% (6); Never 15.2% (5)
Do a follow up evaluation of training
  IDC: Always 0.0% (0); Usually 0.0% (0); Regularly 60.0% (3)ac; Selectively 20.0% (1); Rarely 20.0% (1); Never 0.0% (0)
  IDS: Always 26.7% (4); Usually 13.3% (2); Regularly 20.0% (3)c; Selectively 33.3% (5)a; Rarely 6.7% (1); Never 0.0% (0)
  FC: Always 18.2% (6); Usually 24.2% (8)a; Regularly 15.2% (5)c; Selectively 18.2% (6); Rarely 9.1% (3); Never 15.2% (5)

Note: The frequency of counts < 5 precludes the use of the chi-square test for independent samples on these data, n = 53. a mode; b activities having multiple modes; c median.
Table C4. Reasons for Omitting Instructional Design Activities

Columns (in order): Lack expertise; Client won't support; Decision already made; Considered unnecessary; Not enough time; Not enough money; Always include; Other.

Conduct a needs assessment: 20.0% (11); 27.3% (15); 61.8% (34)a; 45.5% (25); 49.1% (27); 30.9% (17); 7.3% (4); 3.6% (2)
Determine if need can be solved by training: 14.5% (8); 25.5% (14); 49.1% (27)a; 20.0% (11); 27.3% (15); 18.2% (10); 7.3% (4); 3.6% (2)
Write/use learning objectives: 7.3% (4); 14.5% (8); 32.7% (18); 10.9% (6); 20.0% (11); 0.0% (0); 49.1% (27)a; 10.9% (6)
Conduct task analysis: 16.4% (9); 20.0% (11); 32.7% (18); 25.5% (14); 40.0% (22)a; 7.3% (4); 14.5% (8); 9.1% (5)
Identify types of learning outcomes: 3.6% (2); 12.7% (7); 38.2% (21); 14.5% (8); 12.7% (7); 0.0% (0); 47.3% (26)a; 9.1% (5)
Assess trainees' entry skills and characteristics: 3.6% (2); 10.9% (6); 40.0% (22)a; 23.6% (13); 30.9% (17); 9.1% (5); 23.6% (13); 3.6% (2)
Develop test items: 16.4% (9); 10.9% (6); 23.6% (13); 20.0% (11); 18.2% (10); 5.5% (3); 30.9% (17)a; 9.1% (5)
Select instructional strategies: 7.3% (4); 18.2% (10); 30.9% (17); 1.8% (1); 10.9% (6); 5.5% (3); 50.9% (28)a; 5.5% (3)
Select media formats for training: 18.2% (10); 20.0% (11); 30.9% (17); 10.9% (6); 14.5% (8); 12.7% (7); 38.2% (21)a; 5.5% (3)
Pilot test instruction before completion: 0.0% (0); 16.4% (9); 12.7% (7); 20.0% (11); 52.7% (29)a; 7.3% (4); 27.3% (15); 7.3% (4)
Do a follow up evaluation of training: 1.8% (1); 9.1% (5); 9.1% (5); 20.0% (11); 41.8% (23)a; 9.1% (5); 34.5% (19); 7.3% (4)

Note: n = 55; a mode; respondents could select more than one reason per activity.
Table C5. [table title lost in conversion; the two leftmost category headers are also lost]

Categories (in order): [?]; [?]; 3 or 4; 5 or 6; 7-9; 10+; Total.

IDC: 0.0% (0); 0.0% (0); 0.0% (0); 0.0% (0); 0.0% (0); 9.1% (1)c; 9.1% (1)
IDS: 0.0% (0); 0.0% (0); 27.3% (3)c; 0.0% (0); 0.0% (0); 0.0% (0); 27.3% (3)
FC: 18.2% (2)c; 9.1% (1); 9.1% (1); 0.0% (0); 0.0% (0); 0.0% (0); 36.4% (4)b
Combination IDS and FC: 0.0% (0); 9.1% (1); 18.2% (2)c; 0.0% (0); 0.0% (0); 0.0% (0); 27.3% (3)
Total: 18.2% (2); 18.2% (2); 54.5% (6)a; 0.0% (0); 0.0% (0); 9.1% (1); 100.0% (11)

Note: Table displays count and percentage of total, n = 11. a mode, all responses; b mode, organizational structure; c mode, each organizational structure.
Table C6. Comparison of Organizational Structures and Role in the Instructional Design Process, N = 37

[The cell counts of this table did not survive conversion. Columns: SME; ID; Director of ID; Curriculum Specialist; Multimedia Designer; Admin.; Other. Rows: IDC; IDS; FC; Combination IDS and FC; Total.]

Note: Number of respondents identified by their specific instructional design roles and broken down by organizational structure, n = 11. SME = Subject Matter Expert; ID = Instructional Designer; Admin. = Administrator; Other = all other roles identified with a count of 1.
Table C7. [Instructional design process steps compared across organizational structures. Most row labels and several cells of this table did not survive conversion, so the surviving percentage groups cannot be reliably matched to process steps. Columns: All Participants (n = 11); FC (n = 4); IDS (n = 3); IDC (n = 1); IDS & FC (n = 3). Recoverable row labels: Process; Request course development/Recognize need for new course or modification of a course; Locate resources; Go Live.]
Table C8. Comparison of Reasons for Selecting Organizational Structure: Overall and by Organizational Structure

Columns (in order): All Participants; FC; IDS; IDC; IDS & FC.

Faculty autonomy: 45.5% (5); 75.0% (3); 0.0% (0); 0.0% (0); 66.7% (2)
Faculty driven: 18.2% (2); 25.0% (1); 0.0% (0); 0.0% (0); 33.3% (1)
[reason label lost]: 27.3% (3); 50.0% (2); 33.3% (1); 0.0% (0); 0.0% (0)
Integrate to campus: 9.1% (1); 0.0% (0); 33.3% (1); 0.0% (0); 0.0% (0)
Economy/scalability: 9.1% (1); 0.0% (0); 33.3% (1); 0.0% (0); 0.0% (0)
Quality: 27.3% (3); 0.0% (0); 0.0% (0); 100.0% (1); 66.7% (2)
[reason label lost]: 18.2% (2); 0.0% (0); 33.3% (1); 100.0% (1); 0.0% (0)
[reason label lost]: 18.2% (2); 50.0% (2); 0.0% (0); 0.0% (0); 0.0% (0)
Accreditation: 9.1% (1); 0.0% (0); 0.0% (0); 0.0% (0); 33.3% (1)
Quality/standardization: 9.1% (1); 0.0% (0); 33.3% (1); 0.0% (0); 0.0% (0)
Streamlining: 27.3% (3); 25.0% (1); 33.3% (1); 0.0% (0); 33.3% (1)
Competition: 9.1% (1); 0.0% (0); 0.0% (0); 0.0% (0); 33.3% (1)
Tradition: 9.1% (1); 0.0% (0); 33.3% (1); 0.0% (0); 0.0% (0)

Note: Percentages: all participants, n = 11; FC, n = 4; IDS, n = 3; IDC, n = 1; both IDS and FC, n = 3.
Table C9. [Perceived benefits of organizational structures, by structure (FC, IDS, IDC, IDS & FC); the counts in this table did not survive conversion. Benefit categories: Tradition; Cost containment; Scalability/Efficiency; Responsiveness; Quality; Faculty autonomy; Faculty expertise; Location; Improved communication; ID expertise; Faculty relief.]
Table C10. [Perceived weaknesses of organizational structures, by structure (IDS, IDC, IDS & FC; the FC column header did not survive conversion); the counts in this table also did not survive conversion. Weakness categories: Non-responsive process; Training needed; Lack of staffing; Lack of quality; Work load; Faculty inflexibility; Major shortfalls.]

Note: a indicates a trait considered a benefit in one organizational structure is considered a weakness in another.
Table C11. Comparison of Findings: Current Study, Vannoy (2008), and Wedman and Tessmer (1993)

[The placement of the x marks in this table did not survive conversion. Columns: This Study (2010); Vannoy (2008); Wedman & Tessmer (1993). Findings listed by Vannoy (2008), grouped as most frequently used design activities and least frequently used design activities: Follow up evaluation; Identifying learning outcomes; Selecting instructional strategies; Writing learning objectives; Developing test items; Selecting media.]

Note: Modified from "Table 6: Findings from both studies" (Vannoy, 2008, p. 46).