
ORGANIZATIONAL STRUCTURE AND THE INSTRUCTIONAL DESIGN PROCESS

by
Donna Kay

SONJA A. IRLBECK, EdD, Faculty Mentor and Chair


BETL CZERKAWSKI, PhD, Committee Member
ERIC WELLINGTON, PhD, Committee Member

Barbara Butts Williams, PhD, Dean, School of Education

A Dissertation Presented in Partial Fulfillment
of the Requirements for the Degree
Doctor of Philosophy

Capella University
March 2011

UMI Number: 3449387

All rights reserved


INFORMATION TO ALL USERS
The quality of this reproduction is dependent upon the quality of the copy submitted.
In the unlikely event that the author did not send a complete manuscript
and there are missing pages, these will be noted. Also, if material had to be removed,
a note will indicate the deletion.

UMI 3449387
Copyright 2011 by ProQuest LLC.
All rights reserved. This edition of the work is protected against
unauthorized copying under Title 17, United States Code.

ProQuest LLC
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106-1346

Donna Kay, 2011

Abstract
The shift from face-to-face instruction to other technology-mediated delivery
environments brings changes to both organizational structures and instructional design
processes. Instructional designers in United States colleges and universities to should
rethink methods of designing, developing, and supporting instructional design activities
within their institutions in order to best design instruction utilizing shifting instructional
delivery media. This mixed methods study examined organizational structures and the
instructional design activities taking place within these structures. Study results
determined that the frequency with which specific instructional design activities take place
varies from one organizational structure to another. The study examined the
strengths and weaknesses of each organizational structure as perceived by those
responsible for design and development of instruction. The most common instructional
delivery method for which participants have responsibility is hybrid delivery. A shift has
taken place from traditional faculty centered structures to alternate structures providing
varying levels of instructional designer support or control over instructional design
processes. The results seemed to indicate that in a faculty centered structure the content
and quality of instruction can vary among campuses of the same institution and even
among sections of the same course on a given campus. Lack of quality in instruction
under a faculty centered structure was the greatest weakness identified.

Dedication
To my Lord and Savior, Jesus Christ. The better I know you, the better I
understand that I can do all things through Christ. Without your strengthening I would
not have made it through this process. To my parents, thank you for the years of love and
belief in me. Thank you for always telling me I can do whatever I set my mind to do. To
my daughters, Katherine and Lesa, my greatest cheerleaders. I love you and remind you
once again, you are precious in His sight and you too can do whatever you set your mind
to do.


Acknowledgments
Thank you to Dr. Sonja Irlbeck, my mentor. Your feedback and leadership in
individual classes laid the foundation for research and writing in a scholarly fashion.
Your feedback and encouragement to all of us in the dissertation courseroom, and the tone
you set for encouraging one another, were critical to navigating through this dissertation
journey. You created an environment in which we knew we were not alone and provided
feedback that was both timely and beneficial. This paper would never have made it to
completion without your efforts.
Dr. Betl Czerkawski and Dr. Eric Wellington, thank you for serving on my
dissertation committee. Your insights and feedback were both helpful and appreciated.
You brought an alternate viewpoint from which to see my work and brought to light gaps
in the study. You also put up with my last minute reviews each quarter and found the
time to fit them in even though you were busy with final grading for your own classes.
To Pamela Shireman, thank you for your assistance and encouragement. Thank
you to Dr. Claudia Kittock, Dr. Kris Khthofson, and Stephanie Fells for your review and
suggestions for improvement of the interview instrument.

Table of Contents

Acknowledgments

List of Tables

List of Figures

CHAPTER 1. INTRODUCTION

    Introduction to the Problem
    Background of the Study
    Statement of the Problem
    Purpose of the Study
    Rationale
    Research Questions
    Significance of the Study
    Definition of Terms
    Assumptions and Limitations
    Nature of the Study
    Organization of the Remainder of the Study

CHAPTER 2. LITERATURE REVIEW

    Changes in Higher Education
    Learner Demographics
    Online Technologies
    Connections to Instructional Design
    Summary

CHAPTER 3. METHODOLOGY

    Study Design
    Sampling
    Instrumentation
    Data Analysis
    Ethical Issues
    Summary

CHAPTER 4. DATA COLLECTION AND ANALYSIS

    Quantitative Data
    Qualitative Data
    Summary

CHAPTER 5. RESULTS, CONCLUSIONS, AND RECOMMENDATIONS

    Results
    Discussion
    Recommendations
    Summary

APPENDIX A. INSTRUCTIONAL DESIGN ACTIVITIES SURVEY

APPENDIX B. INTERVIEW QUESTIONS

APPENDIX C. TABLES

List of Tables

Table C1. Frequency and Percentage of Performance of Instructional Design Activities

Table C2. Distribution of Frequency of Performance of Instructional Design Activities

Table C3. Frequency and Percentage of Performance of Instructional Design Activities Based on Organization Model

Table C4. Reasons Instructional Design Activities Are Omitted

Table C5. Comparison of Organizational Structures and People Involved in the Instructional Design Process

Table C6. Comparison of Organizational Structures and Role in the Instructional Design Process, N = 37

Table C7. Comparison of Instructional Design Process Based Overall and Organizational Structure

Table C8. Comparison of Reasons for Selecting Organizational Structure Overall and Organizational Structure

Table C9. Comparison of Major Benefits Among Organizational Structures

Table C10. Comparison of Major Shortfalls Among Organizational Structures

Table C11. Comparison of Findings: Current Study, Vannoy (2008), and Wedman and Tessmer (1993)

List of Figures

Figure 1. Participants' roles within their institution

Figure 2. Type of institution participants represent

Figure 3. Institutional level of participants' institutions

Figure 4. Delivery options offered by institutions and delivery options for which participants have responsibility for instructional design

Figure 5. Organizational structures in place at participants' institutions

Figure 6. Types of organizational structures with which participants have experience

Figure 7. Comparison of frequency of performance of needs assessment among organizational structures

Figure 8. Comparison of frequency of performance in determining if the need can be solved by training among organizational structures

Figure 9. Comparison of frequency of performance of writing/using learning objectives among organizational structures

Figure 10. Comparison of frequency of performance of conducting a task analysis among organizational structures

Figure 11. Comparison of frequency of performance of identifying types of learning outcomes among organizational structures

Figure 12. Comparison of frequency of performance of assessing trainees' entry skills and characteristics among organizational structures

Figure 13. Comparison of frequency of performance of developing test items among organizational structures

Figure 14. Comparison of frequency of performance of selecting instructional strategies among organizational structures

Figure 15. Comparison of frequency of performance of selecting media formats for training among organizational structures

Figure 16. Comparison of frequency of performance of pilot testing instruction before completion among organizational structures

Figure 17. Comparison of frequency of performing a follow-up evaluation of training among organizational structures

CHAPTER 1. INTRODUCTION
Introduction to the Problem
Colleges and universities within the seven regional accrediting bodies in the
United States are experiencing a time of rapid change. The Higher Learning Commission
of the North Central Association of Colleges and Universities (NCA-HLC) is an
association of approximately 10,000 schools and colleges located throughout 19 states.
The NCA-HLC represents the largest regional accreditation region for higher education
in the United States. This region is widely diverse in terms of its geographic, economic,
and cultural environment. Perhaps the two most widely spread and potentially enduring
factors stimulating change and affecting instructional design practices at schools within
the NCA-HLC are learner demographics and online technologies (Ayers, 2002; Jones-Kavalier & Flannigan, 2006; Saba, 2005; Yankelovish, 2005). These enduring factors,
coupled with a struggling economic environment and expanding learning theories,
promote the need to follow proper instructional design processes when designing and
developing instruction and training in colleges and universities.
Instructional designers recognize the process of analysis, design, development,
implementation, and evaluation (ADDIE) as the cornerstone of historical instructional
design (ID) processes. This process has guided the foundation for instructional design
models such as those created by Morrison, Ross, and Kemp (2007); Dick, Carey, and Carey
(2009); and the more recent Sims (2009) PD4L model. How often and to what extent this
process is followed in design and modification of instruction in colleges and universities
in the NCA-HLC is unknown. It is also unknown whether
instructional design activities differ under different organizational structures.
Various organizational structures exist to support design and development of
instruction. One structure commonly used in higher education relies on faculty members
to design and develop their own courses. Merrill and Wilson (2007) refer to faculty
members as "designers-by-assignment" (p. 336). Faculty members often have no prior
training or knowledge of instructional design theory or practice in this structure. In
contrast, an organizational structure supporting professional instructional designers to
design and develop instruction with instructors serving as subject matter experts (SMEs)
in the design and development process can be found at some colleges and universities.
Institutions using an organizational structure that provides professional instructional
design support to "designers-by-assignment" (p. 336) can be found between these two
extremes. Merrill and Wilson describe this as a subject matter expert being assigned to
be an instructional designer rather than being an instructional designer (p. 336). The
instructional design activities typically included in each of these organizational structures
are not yet documented in the literature. Understanding the instructional design activities
taking place under various organizational structures is important in order to gain a greater
awareness of the role organizational structure plays in the quality of instructional design.
Background of the Study
Changes in instructional delivery methods, instructional strategies, and
instructional design processes are altering higher education in the United States.

Colleges and universities within the NCA-HLC, for example, now offer courses that are a
combination of face-to-face delivery, hybrid delivery, technology-mediated delivery, and
totally online delivery. These expanded learning environments present colleges and
universities with challenges related to design and development of courses.
Changing learner demographics in this expanding environment add to the
complexity of instructional design tasks by creating new challenges in identifying
learners and learners' needs (Ayers, 2002; Jones-Kavalier & Flannigan, 2006; Saba,
2005; Yankelovish, 2005), course interactions (Hedberg, 2006; Lanier, 2006), academic
integrity (Gearhart, 2001), and technology (Hedberg, 2006; Jones-Kavalier & Flannigan,
2006). A modification in instructional strategies often accompanies these changes as
instruction moves from a behavioral and cognitivist approach to approaches that are more
constructivist and connectivist (Albi, 2007; Baggaley, 2008; Doyle, 2009; Jonassen,
1999; Kirschner, Sweller, & Clark, 2006; Mayer, 1999; Mayer, Griffith, Jurkowitz, &
Rothman, 2008; Mayer & Johnson, 2008; Siemens, 2004, 2005). The organizational
structure under which leadership and support for instructional design activities takes
place within changing environments must be thoughtfully established (Beckman, 2009;
Braganza, Awazu, & Desouza, 2009; Brill, Bishop, & Walker, 2006; Franken, Edwards,
& Lambert, 2009; Scott, 2003). Recognizing organizational structures that effectively
support instructional design processes is important.
Instructional design activities can occur at various locations within institutions
with instructional design leadership and support being provided under a variety of
organizational structures. Instructional design leadership and support available to faculty,
trainers, managers, and others involved in design and development tasks may influence
quality of instruction or training. Understanding how colleges and universities have
adapted organizational structures to support instructional design activities will help
decision makers understand and develop effective environments for leadership and
support of instructional design activities in modern academic environments.
Statement of the Problem
The shift from face-to-face instruction to other technology-mediated delivery
environments brings changes to both organizational structures and instructional design
processes. Shifting instructional delivery media requires instructional designers at
colleges and universities in the United States to rethink methods of designing,
developing, and supporting instructional design activities within their institutions. Little
is known about which instructional design activities take place within different
organizational structures. This study investigated instructional design activities taking
place within these organizational structures to help bridge this knowledge gap.
Purpose of the Study
The purpose of this study was to determine what organizational structures support
instructional design and development activities at colleges and universities in the NCA-HLC. The study sought to determine what organizational structures are in place and
the instructional design activities taking place within these structures. A better
understanding of strengths and weaknesses of organizational structures was a further goal
of the study.

Rationale
The study was conducted to provide a foundation upon which to encourage
development of best practices when creating organizational structures for supporting
instructional design at colleges and universities. Further research comparing the effects of
organizational structures on instructional design quality may be possible based on the
foundation provided by the study.
Research Questions
1. What types of instructional design activities are found in colleges and
universities using instructional designer centered, instructional designer
supported, and faculty centered models of organizational structure in the
Higher Learning Commission of the North Central Association of Colleges
and Universities?
2. What are the strengths of types of organizational structures in colleges in the
Higher Learning Commission of the North Central Association of Colleges
and Universities as perceived by instructional design professionals?
3. What are the weaknesses of types of organizational structures in colleges in
the Higher Learning Commission of the North Central Association of
Colleges and Universities as perceived by instructional design professionals?
Significance of the Study
The Internet brings many opportunities and challenges to colleges, creating fast-changing environments. How an organization handles change can have an impact
throughout the organization. The overall organizational structure, together with the
leadership and support available within the structure, guides the organization through
times of change (Beckman, 2009; Franken, Edwards, & Lambert, 2009; Fullan, 2001;
Scott, 2003; Thompson & Purdy, 2009). The placement and support of instructional
design activities within the overall organizational structure can take many forms. The
organizational structure from which instructional design activities originate, and from
which support for those activities comes, will affect the processes and procedures used
for designing and developing instruction and training in colleges.
Instructional designers have realized that the process of placing courses online is more
than simply placing traditional (courseroom) materials online (Albi, 2007; Baggaley,
2008). Instructional designers are invaluable in creating quality learning environments
for students and helping secure quality instructional content, assessments, and delivery.
Instructional design leadership and support are needed to meet these challenges, even
when that leadership and support is structured in various ways within an organization.
This study sought to determine what organizational structures are in place and the
instructional design activities taking place within these structures.
Definition of Terms
The following definitions guided the study:
ADDIE. An acronym that stands for analysis, design, development,
implementation, and evaluation. It represents the processes in the instructional systems
design (ISD) model and can be found throughout instructional design literature (Molenda,
2003).
Classical content analysis. A method of qualitative data analysis in which the
researcher counts the number of times each code is utilized (Leech & Onwuegbuzie,
2007, p. 569).
Constant comparative analysis. A method of conducting qualitative data analysis
in which the researcher searches for themes throughout the data. The data is read, broken

down into chunks, and coded with meaningful labels (Leech & Onwuegbuzie, 2007, p.
565).
Distance education. Planned learning that normally occurs in a different place
from teaching, requiring special course design and instruction techniques, communication
through various technologies, and special organizational and administrative
arrangements (Moore & Kearsley, 2005, p. 2).
Faculty centered model. An instructional design organizational model in which
the instructor is responsible for the design, development, delivery, and facilitation of his
or her own courses. Support is typically in the form of assistance in the use of the course
delivery system rather than instructional design support.
Hybrid courses. Courses that include both online and face-to-face components.
Instructional design. A systematic and reflective process of translating
principles of learning and instruction into plans for instructional materials, activities,
information resources, and evaluation (Smith & Ragan, 2005, p. 7).
Instructional design activities. The individual steps performed within each
procedure identified in the instructional design process.
Instructional design leadership. The process of providing vision and direction
(Irlbeck & Pucel, 2000) for instructional design and development activities.
Instructional design process. The procedures used for design, development, and
implementation of courses within a program of study.
Instructional design theory. Prescriptive theory that identifies instructional
conditions required for particular instructional consequences or outcomes (Reiser &
Dempsey, 2007, p. 338).

Instructional designer. Somebody who applies a systematic methodology based on
instructional theory to create content for learning events (Wayne State University, 2010,
Glossary).
Instructional designer centered model. An instructional design organizational
model in which instructional design activities are performed by a team of instructional
design experts with subject matter experts (SMEs) serving as members of the design team
during the design, development, and delivery of instructional solutions. Instructors
teaching the course may or may not be the SME and may or may not serve on the design
and development team.
Instructional designer supported model. An instructional design organizational
model in which the instructor works closely with instructional designers or an
instructional design team to design, develop, deliver, and facilitate his or her own
courses. Support is typically in the form of assistance in both instructional design and
development activities and the use of the course delivery system.
Online technologies. For purposes of this study, this term will mean technologies
used to deliver distance education via the Internet.
Organizational structure. Instructional design systems used by organizations to
design, develop, deliver, and support instructional design processes within institutions.
Assumptions and Limitations
Assumptions
1. Random sampling methods produced a sample that is representative of
instructional designers, faculty development leaders, curriculum specialists,
course developers, and others in similar roles in the NCA-HLC.

2. Participants accurately and honestly responded to survey and interview questions.
Limitations
1. Limiting the sample size to allow for time and resource constraints may have
restricted the depth of information resulting from the study.
2. Limiting the population to organizations with accredited status may have
restricted the breadth of information resulting from the study.
Nature of the Study
This mixed methods explanatory research study included a descriptive component
using a cross-sectional quantitative survey method and a qualitative component using
telephone interviews to gain greater understanding of instructional design activities
taking place within various organizational structures. The survey developed by Vannoy
(2008) as adapted from the work of Wedman and Tessmer (1993) was modified to collect
data for the study. Questions were added to the survey to collect descriptive data about
current organizational structures (Best & Kahn, 2003; Creswell, 2008; McMillan &
Schumacher, 1997) from instructional designers, faculty development leaders, curriculum
specialists, course developers, and others in similar roles at colleges and universities in
the NCA-HLC. Parts 1 and 2 of the Vannoy (2008) survey were adapted to collect data
examining instructional design activities taking place under different organizational
structures. The qualitative portion of the study was conducted using telephone interviews
with volunteers from the survey. The
telephone interviews collected information relating to the strengths and weaknesses of the
instructional design processes and the structures under which they operate.

Organization of the Remainder of the Study


A review of the literature discussing recent changes in higher education,
instructional design process, and instructional design theories will be presented with a
discussion of instructional design organizational models in Chapter 2. Chapter 3 details
the methodology used in the study for gathering both descriptive data and exploring the
strengths and weaknesses of various organizational structures. Chapter 4 provides an in-depth discussion of data collection and analysis, and Chapter 5 contains the results,
conclusions, and recommendations drawn from the study.


CHAPTER 2. LITERATURE REVIEW


Changes in Higher Education
Many changes have occurred in higher education in the 1990s and 2000s.
Technological advances that led to large numbers of online classes, use of distance
delivery methods to support or provide instruction, and a change in learner demographics
are among these changes. Most, if not all, colleges and universities in the Higher
Learning Commission of the North Central Association of Colleges and Schools (NCA-HLC) have been impacted by these changes. New learning theories have been identified
that influence the way instruction and training are designed. Strong instructional design
(ID) leadership and support are needed to effectively design instruction and training in
these changing environments. Today's instructional designers and leaders must be aware
of these changes and be prepared to embrace emerging methods and technologies that
will enhance instruction and training.
Learner Demographics
Changing learner demographics require the instructional designer to carefully
evaluate for whom instruction or training is being designed. Palloff and Pratt (2003)
state, "Online learning, in its best form, is learner-centered and learner-focused" (p. xiii).
Focus has often been on integration of technology into the curriculum rather than on
understanding learning theories appropriate for the technology available and the learners
who receive the instruction (p. xiv). Expanding diversity in learner demographics creates
unique challenges for instructional designers as they endeavor to design learner-focused
instruction.
Perhaps the first change in learner demographics that instructional designers see
today is an increasingly diverse learner age. The typical college learner can no longer be
identified as an 18-24 year old (Day & Jamieson, 2003); learners now range from high
school students attending college or university through government initiatives, to learners in
their 30s and older returning to school for mid-career changes, to retirees looking to expand
their knowledge or supplement their incomes during retirement (Chen, 2009;
Howell, 2001). Changes in learner age create a wealth of instructional design issues to
be considered, including learner experience, prior knowledge, and what is considered
meaningful to the learner (Moore & Kearsley, 2005, pp. 17-18, 183-184).
Instructional designers often encounter other changes in learner demographics in
the form of learner responsibilities and lifestyles. Today's college learner is increasingly
employed either full or part time and often married, with or without children; many are
single parents (Palloff & Pratt, 2003, p. 3). Studies and school responsibilities, while
important, are often not the first priorities for these learners. Time is stretched between
work, family, and school with learners modifying time allocations as various issues arise
or events occur in their lives. The reduced priority given to educational endeavors
impacts how students approach learning and their ability to succeed (Orszag, Orszag, &
Whitmore, 2001, Executive summary).
Increased variability in learner ages and responsibilities brings a wide variety in
educational preparation. Increasing numbers of learners are returning to higher education
for additional training after having received a college degree (Yang, 2006). Other
learners are attending college or university for the first time. Many of these students
dropped out of high school or have been out of school for many years. Differences in
prior knowledge can make the task of designing instruction complex.
Learner expectations have also changed. A switch in learner views of higher
education from a privilege to a commodity to be purchased appears to be taking place
(Reiser & Dempsey, 2007, p. 277). Expectations of and demands on institutions of
higher learning are changing as views of education change to a consumerist viewpoint
(Dempsey, Albion, Litchfield, Harvard, & McDonald, 2007, p. 227).
Changes in learner demographics and expectations often occur because of
expanded access to the Internet and other online technologies that make instruction and
information more accessible. Online technologies create unique challenges for
instructional designers.
Online Technologies
The increasing use of online technologies to deliver instruction adds to the
complexity of ID activities. Online technology is currently being used to enhance
instruction in face-to-face delivery of instruction, to blend benefits of both face-to-face
and online delivery of instruction in hybrid courses, and to deliver fully online distance
education instruction. Instructional designers can choose the best methods for delivering
content, assessing learning, developing interactions, and evaluating effectiveness from a
wide range of learning theories, online resources, and multimedia options. The solutions
chosen must be robust enough to support increasingly diverse learner populations.


Online technology makes it possible for learners to shop for instructional options
and create a potpourri of learning. Learners have options of receiving instruction through
traditional, hybrid, or online delivery methods from a myriad of schools or other sources
(Delialioglu & Yildirim, 2008; Vernadakis, Antoniou, Giannousi, Zetou, &
Kioumourtzoglou, 2011). Reiser and Dempsey (2007) state "students are becoming real
consumers these days and often have several choices of junior colleges, four-year
colleges, and universities both in town and online" (p. 277).
No one method of delivery appears to provide definitively improved performance
in student learning. Each delivery method (face-to-face, online, web-enhanced, and
hybrid) contains both benefits and shortfalls. Vernadakis et al. (2011) studied the
benefits of hybrid learning environments and found them to combine the best features of
online learning and traditional classroom learning (p. 188). Delialioglu and Yildirim
(2008) state early studies showed that technology can be a double-edged sword if not
properly planned and implemented (p. 475). Instructional designers must recognize
what delivery options are available, be equipped to design effective instruction for each
option, and select delivery methods best suited to the learner and instructional need for
which instruction is being designed.
Options in online technologies bring many challenges to instructional design
processes. Identifying learners, learner demographics, and learner readiness provides a
challenge to instructional designers. Effectively delivering content, assessing learning,
developing interactions, and evaluating effectiveness may be more complex. Appropriate
online technologies must be selected to implement instruction designed based on a
variety of learning theories, provide quality assessment for both learner and instructor,
and create environments in which learners interact effectively with instructors, content,
and other learners.
Online technologies can make selection and use of the most appropriate learning
theory and delivery method for desired learning outcomes more complex. The
educational environment of 2011 and beyond requires the ability to design instruction
using a variety of cognitivist, constructivist, and connectivist learning theories (Albi,
2007; Baggaley, 2008; Doyle, 2009; Jonassen, 1999; Kirschner, Sweller, & Clark, 2006;
Mayer, 1999; Mayer, Griffith, Jurkowitz, & Rothman, 2008; Mayer & Johnson, 2008;
Siemens, 2004, 2005). Learning theories often suggest the necessity for a variety of
assessments, including the need to assess student readiness and learning. Online
technologies may complicate the design of assessment measures and academic integrity
issues relating to assessment (Corcoran, Dershimer, & Tichenor, 2004; Lanier, 2006).
Constructivist and connectivist learning theories often focus on learner
interactions (Jonassen, 1999; Mayer & Johnson, 2008; Siemens, 2004, 2005). Moore and
Kearsley (2005) identify three types of learner interactions: instructor-learner, learner-learner, and learner-content (pp. 140-141). Online delivery environments create
complexity in each of these interactions. Many in higher education view developing a
sense of community for learner-learner interactions in online environments as critical;
such communities help learners feel they are a part of the learning community and lead to
quality interaction and instruction, yet fostering them remains a challenge. The online
isolation factor and improving learner engagement have been the focus of much research
in the 2000s (Deloach &
Greenlaw, 2007; Dennen, 2005; Dennen, Darabi, & Smith, 2007; Jacobs & Archie, 2008;
Mann, 2005; Spatariu, Quinn, & Hartley, 2007; Woods & Ebersole, 2003). Research has
addressed both how to improve instructor-learner interactions and learner-learner
interactions and the importance of these interactions to student success and retention.
Online technology and technological advances of the 2000s have led to greater options
for instructional designers when designing learner-content interaction. Advances include
simulations, virtual communities, performance support systems, and mobile technologies.
Changes in learner demographics and opportunities and challenges created by
online technologies have impacted how instructional design activities are performed.
Instructional designers and researchers have examined instructional design theories and
the impact these changes have had on instructional design processes. New instructional
design theories have been introduced to assist with instructional design processes in
today's changing academic environment.
Connections to Instructional Design
Instructional Design Theories
The advent of additional instructional design theories creates both opportunities
and challenges for instructional designers. Design of learner centered instruction requires
instructional designers to understand the purpose and use of various forms of instruction.
It is also important to select proper instructional methods for desired learning outcomes
that meet learner needs and expectations. Today, instructional designers must understand
constructivist and connectivist theories, as well as behavioral and cognitive theories as
they relate to instructional design. Knowing when to use various instructional methods
and how to apply them can be challenging.


Instructional designers evaluate instructional design theories and determine which
best fit the instructional situation. Reigeluth (1999) addresses the importance of selecting
appropriate instructional methods in order to develop instruction that produces desired
outcomes. Reigeluth stated, "An essential feature of instructional-design theories is that
the methods they offer are situational rather than universal. In other words, one method
may work best in one situation, while another may work best in a different situation" (p.
8). This indicates the task of selecting the proper method can be complex. Instructional
designers must continue to evaluate the appropriateness of instructional methods for
learners, learning situations, and desired outcomes as changes take place in higher
education and new instructional design theories are presented and tested. Evaluating new
instructional design theories should be centered in knowledge of existing theories.
Existing instructional design theories include theories based on desired behavioral,
cognitive, constructivist, or connectivist outcomes.
Behavioral instructional design theories. Behavioral instructional design
theories are centered on the need to foster a change in behavior. Behavioral ID theories
have their roots in the work of B. F. Skinner (1938, 1950, 1958). Skinner's theories have
been adapted over the years to help enhance instruction in various educational settings to
meet desired learning objectives (Driscoll, 2005; Magliaro, Lockee, & Burton, 2005).
Multiple behavioral learning objectives are not uncommon in higher education
settings. Designing instruction to meet desired behavioral learning objectives can be
quite complex in today's multifaceted and changing educational environment.


Cognitive instructional design theories. Instruction began moving away from
its behavioral focus to embrace cognitive principles and outcomes in the 1960s and
1970s. Driscoll (2005) states, "According to the cognitive information view, the human
learner is conceived to be a processor of information in much the same way a computer
is" (p. 74). According to Driscoll, the work of Atkinson and Shiffrin (1968) provides the
roots for information processing models in education (p. 7). Many instructional design
models were built around this view of learning during the 1960s and into the 1980s
(Darvin, 2006; Driscoll, 2005; Gholson & Craig, 2006; Jonassen, Strobel, & Gottdenker,
2005).
Instruction and evaluation tended to become more standardized as information
processing models of instructional design were advanced. Reigeluth (1999) states "our
current paradigm of training and education was never designed for learning; it was
designed for sorting . . ." (p. 18). Methods of instructional design that address individual
learners and their needs were desired. Development of constructivist theories of
instructional design arose from this need.
Constructivist instructional design theories. Constructivist theories developed
during the 1990s and 2000s. Constructivist theories are based on the need to add
meaning to learning and are built on the belief that learners should create their own
knowledge (Reigeluth, 1999, p. 143). Learners are responsible for constructing their
own meanings in the constructivist paradigm. Methods used for constructivist instruction
vary greatly from Mayer's (1999) instructional design theory that is intended to foster
knowledge construction in structured environments to Jonassen's (1999) instructional
design theory for ill-structured environments.
Connectivist instructional design theories. Connectivist theories began
appearing around 2004. Instructional design emphasis is placed on learners making
connections between various real-world and educational concepts to create meaning and
knowledge in the connectivist view (Siemens, 2004). McLoughlin and Lee (2008)
expand on the concept in their discussion of the use of social software.
The challenge is to enable self-direction, knowledge building, and learner control
by offering flexible options for students to engage in learning that is authentic and
relevant to their needs and to those of the networked society while still providing
necessary structure and scaffolding. (p. 2)
Instructional designers carefully select instructional design theory or theories that
are appropriate to the learning situation for which instruction is being developed.
Today's changing and complex academic environment makes this task ever more
challenging. The instructional design process has been in place for many years to assist
instructional designers with this task.
Instructional Design Process
The instructional design process of analysis, design, development,
implementation, and evaluation (ADDIE) can be found in much of the instructional
design literature. This process is often referred to as the ADDIE model or process.
Molenda (2003) indicates the acronym began to be recognized in the late 1980s and is a
"colloquial term used to describe a systematic approach to instructional development" (p.
35). Morrison, Ross, and Kemp (2007) state the ADDIE Model is "merely a colloquial
label for systematic approach to instructional development, virtually synonymous with
instructional systems development (ISD)" (p. 13). ADDIE is a foundational element in
the instructional design field regardless of its origins. Few, if any, instructional designers
are unaware of the acronym and what it means.
ADDIE plays an important role in instructional design. Bichelmeyer (2005) gives
several examples of the impact the ADDIE model has in both instructional design
training and the instructional design profession. Bichelmeyer's statement that "we should
examine the ADDIE model in more detail to help us get a better sense of the core of our
field" (p. 3) gives a clear indication of the importance of ADDIE in instructional design.
Molenda (2003) states there is a tendency to accept the ADDIE term as an umbrella
term, and then to go on to elaborate more fully fleshed-out models and narrative
descriptions (p. 36) of the instructional design process. ADDIE provides a strong
foundation around which ID models can be built.
Gustafson and Branch (2002) define the ADDIE process and its role in
instructional development.
Instructional development consists of at least five major activities: [a] analysis of
the setting and learner needs, [b] design of a set of specifications for an effective,
efficient, and relevant learner environment, [c] development of all learner and
management materials, [d] implementation of the resulting instruction, and [e]
both formative and summative evaluations of the results of the development. (p.
xiv)
Each of these activities is represented by one part of the ADDIE process.
ADDIE, while foundational to the instructional design process, came under
criticism in the mid-2000s as instructional designers and researchers began to question
the instructional design process. The instructional design process is increasingly being
considered a problem-solving process rather than a systemic process as represented by
ADDIE. Ertmer and Stepich (2005) questioned what instructional design expertise really
is. They concluded instructional design is "the ability to apply foundational principles to
problem-solving situations" (p. 42) after evaluating several dimensions of instructional
design. Silber (2007) adds to this discussion with his principle-based model of
instructional design. Silber states it is time to change: to think about ID as a set of
principles underlying designing instruction (p. 5). Instructional designers should
possess knowledge of a variety of both traditional and emerging ID models in order to
select the best ID model to use for a given instructional design project.
Instructional Design Models
Many instructional design models exist from which to choose. Understanding
where the ADDIE process and problem-solving approaches fit into a few ID models may
assist in understanding how these frameworks fit within an ID model. Irlbeck, Kays, Jones,
and Sims (2006) state
rapid changes in our understanding of learning have positioned ISD to
accommodate new ideas, with the ADDIE model providing the foundation from
which more recent instructional design models have been built that integrate
constructivism, systems thinking, forms of interaction, information processing,
and learning-centered approaches. (p. 172)
Instructional design (ID) models help instructional designers design and develop
instruction. Each ID model is designed to be used in a particular type of instructional
design setting to accomplish a specific type of instructional goal. Gustafson and Branch
(2002) give the purpose of ID models as "[to] serve as conceptual, management, and
communication tools for analyzing, designing, creating, and evaluating guided learning"
(p. xv). They also cautioned, "No single ID model is well matched to the many and
varied design and development environments in which ID personnel work" (p. xv).
Many models exist from which instructional designers can select to help manage this
diversity. Most models have the ADDIE process as a foundational framework.
Various categories of instructional design models have been identified. Ryder
(2010) gives two categories of instructional design models as modern prescriptive and
postmodern phenomenological models. Qureshi (2004) describes four categories of models:
design-focused, time-focused, task-focused, and learner-focused. Fauser, Henry, and
Norman (2006) used Gustafson and Branch's (2002) categories as the focus of their
comparison for design models. This paper focuses on the three categories of ID models
identified by Gustafson and Branch (classroom-oriented, system-oriented, and product-oriented) to classify instructional development models. Classroom-oriented models,
which are typically used by individual instructors to produce a small amount of
instruction, require minimal resources and ID skill. These usually involve little front-end
analysis and are relatively low in complexity. System-oriented models typically involve
a team effort and are used to develop an entire course or curriculum. The resource
commitment is high and greater levels of ID skills are required. System-oriented models
typically involve a substantial amount of front-end analysis and are technologically
complex. Product-oriented models are used to produce an instructional package. These
usually use a team approach and require high levels of resources and ID skills. They
typically include a moderate amount of front-end analysis and are moderately complex
(p. 14).


The Morrison, Ross, and Kemp (2007) classroom-oriented model, developed from
the work of Kemp (1971), and the Dick, Carey, and Carey (2009) system-oriented model,
first published by Dick and Carey in 1978, are examined as examples of ID models that
have been used for several years. The Sims (2009) PD4L model is examined as a
problem-solving ID model designed for online instruction in a team-based approach. The
ADDIE foundation can be clearly identified in the Morrison, Ross, and Kemp and the
Dick, Carey, and Carey models. Sims' PD4L model incorporates elements of both
ADDIE and problem solving approaches to ID.
Morrison, Ross, and Kemp. Morrison, Ross, and Kemp (2007) describe ADDIE
as "a label that refers to a family of models that share common elements" (p. 13). The
Morrison, Ross, and Kemp (MRK) model includes all five parts of the ID process and is
designed to be flexible. It allows instructional designers to approach the design and
development project in a manner appropriate for the specific project being completed.
Flexibility does not mean that components of the ID process can be eliminated but that
the extent to which the components are addressed and the order of processing will vary
from project to project. This makes the MRK model an ID model that follows the
ADDIE process as well as a problem solving model that instructional designers adapt to
meet the needs of current instructional design situations.
Morrison, Ross, and Kemp (2007) state there are four fundamental planning
elements in almost every ID model. These include evaluating learner characteristics,
identifying learning objectives, establishing instructional strategies, and developing
evaluation procedures (p. 14). Development and implementation flow from design
activities and are tested using evaluation procedures. Activities within the ID process can
readily be identified when referring to the MRK model.
Seven elements are at the core of the MRK model. Each element represents a step
in the instructional design process and is represented by a circle in the MRK model.
Eight ongoing processes are also identified in the outer rings of the model (Morrison,
Ross, Kalman, & Kemp, 2011, pp. 14-19). The instructional problems circle in the MRK
model represents an analysis of the identified need to determine if it can be solved by
instructional interventions. The MRK model identifies a complex needs assessment and
a goal analysis as part of the process. This reflects the analysis phase of the ADDIE
process. Analysis also takes place in the learner characteristics and task analysis circles
of the MRK model. Instructional designers are concerned with identifying characteristics
of the learner that include general characteristics, specific entry characteristics, and
learning styles (Morrison, Ross, & Kemp, 2007, p. 55) as they relate to learner
characteristics. Identifying the contextual analysis within which learning is to take place
is also included in this circle. Contextual analysis includes the orienting context that
involves learning more about the user and his or her goals and perceptions of the learning
situation, the instructional context that seeks to understand the setting in which
instruction will take place, and the transfer context that seeks to understand how desired
learning will be applied (pp. 63-66). The third analysis activity, represented by the task
analysis circle in the MRK model, identifies content to be included in instruction. This
step carefully analyzes all of the subtle steps to be included and seeks to view content
from learners' perspectives (pp. 52-73). The importance of this step is emphasized by the
many changes taking place in learner demographics.


Design activities can be found in the MRK model in the instructional objectives,
content sequencing, and instructional strategies circles. Instructional objectives identify
what learners are to learn. According to Morrison, Ross, and Kemp (2007), instructional
objectives serve the purpose of providing a means for designing instruction, providing a
framework for evaluating learning, and providing a method to guide the learner (p.
104). Content sequencing involves identifying the best ways to sequence content for
optimum learning. Instructional strategies can be designed after identifying the sequence
in which content should be delivered. The general learning environment best used to
deliver instruction and the sequence and methods to achieve an objective are described
in this step (p. 146).
Designing the message includes both design and development activities.
Instructional messages may be developed concurrently with designing the message using
current technology. Additional development activities take place in the development of
instruction once messages have been designed. Development of instructional materials is
included in this step. Implementation occurs after all materials have been developed.
Students are then introduced to content and learning materials and instruction actually
begins.
Evaluation is to take place during each of the above activities. Modifications to
analysis, design, and development are on-going activities. Evaluation is conducted to
measure the effectiveness of the instructional design and materials once instruction has
been implemented.
This brief review of the MRK model shows how the ID process of ADDIE is
foundational to the MRK model. Problem solving methods are needed as the
instructional designer determines the extent to which activities are completed and the
order in which they occur as determined by need. The ID process of ADDIE can be
clearly identified as the foundation of the slightly more linear model of instructional
design provided by Dick, Carey, and Carey (2009) called the Systems Approach Model
(SAM).
Dick, Carey, and Carey. The SAM model has ten interconnected boxes
[which] represent sets of theories, procedures, and techniques employed by instructional
designers to design, develop, evaluate, and revise instruction (Dick, Carey, & Carey,
2009, p. 6). That the ID process of ADDIE is foundational to the design of the SAM model
is evident. The SAM model incorporates problem solving methods as revision and
iteration of steps occur as needed. The SAM model begins similarly to the MRK model
with identification of instructional goals.
Analysis in the SAM model includes instructional analysis and analysis of
learners and context. Instructional analysis in the SAM model includes identifying
skills and knowledge that should be included (Dick, Carey, & Carey, 2009, p. 39) in
the instructional design. Analysis under this model includes understanding instructional
goals and identifying subordinate skills to be included in instruction. Entry level skills
learners should possess prior to beginning instruction are also analyzed. Learner and
context analysis follows or may take place concurrently with instructional analysis.
Learner and context analysis in changing higher education environments may uncover
unexpected results.


The design process in the SAM model begins with writing performance
objectives. The process is influenced by outcomes of previous analysis activities as well
as feedback received from ongoing evaluation. Objectives are used in design and
development of assessment instruments that influence design of instructional strategies
such as selecting delivery methods, content sequencing, and instructional strategies.
Development activities consist of creating assessment instruments and selection
and development of the actual learning environment and instructional materials based on
instructional strategies designed. Evaluation activities in the SAM model are both
formative and summative. Dick, Carey, and Carey (2009) define formative evaluation as
"the collection of data and information during the development of instruction that can be
used to improve the effectiveness of the instruction" (p. 257). Formative evaluation may
lead to revisions in design and development of instruction or a more in-depth analysis of
instructional needs or learners and contexts. Summative evaluation occurs after
instruction has been implemented and is defined as "the design of evaluation studies and
the collection of data to verify the effectiveness of instructional materials with target
learners" (p. 320).
The foundation provided by ADDIE and the problem solving approach needed
can be clearly seen in activities identified by the SAM model. The Sims (2009) Proactive
Design for Learning (PD4L) model, in contrast, does not as obviously include all aspects
of the ADDIE process. The problem solving nature of the PD4L model may lead to an
incorrect belief that parts of the ID process are eliminated from the model.


Sims. The PD4L model, designed to be used by a team in development of online
instruction, is founded on the Three-Phase Design (3PD) model of Sims and Jones (2002).
The center point of the PD4L model is the 3PD model, which clearly includes
design, development, implementation, and evaluation in a concurrent and iterative
process. Evaluation consistently takes place throughout the development process. The
3PD model is unique in that the "moderately structured problem-solving" (Silber, 2007,
p. 5) nature of the model allows instruction to be delivered after the first phase is
completed but before the learning system has been fully designed and developed. Sims
and Jones (2002) state the purpose of the first phase is "to design and create a functional
teaching and learning environment that will meet all learning outcomes as well as faculty
teaching and learning strategies . . . it is functional, and production does not try to
complete a final package at the first attempt" (p. 5).
Phase 2 takes place during instructional delivery and includes learner and teacher
feedback leading to immediate modifications of the delivery environment and instructional
materials. The PD4L model appears to follow the process Silber (2007) states is
used by expert instructional designers of doing just enough analysis to lead to a
hypothesis about a solution, which they propose, then do a prototype, see results, and
then modify (p. 8). Modification in the PD4L model takes place in both Phase 2 and
Phase 3. Phase 3 consists of ongoing course enhancements.
Analysis of instructional goals, learners, and context found in the MRK and SAM
models may at first glance appear absent from the PD4L model; however, Sims (2009)
includes this important foundational element via the description of the PD4L model.


PD4L addresses a set of factors which, if applied and considered, will ensure the
integrity of the online teaching and learning environment. The first of these
relates to ensuring there is a clear definition of the strategic intent of the course:
why it is required, who it is for and what the desired outcomes are. (p. 388)
These factors clearly relate to the analysis phase of the ID process. The PD4L
model can be seen to have the ID process of ADDIE as its foundation along with a strong
basis in problem-solving.
Three of the many ID models available to instructional designers have been
analyzed in this section. The choice of model will be determined by the instructional
design setting and instructional goals. The instructional design setting includes the
instructional problem, whether instruction will take place face-to-face, online, or in a
hybrid learning environment as well as the organizational structure in which instructional
design takes place. Organizational structure, for the purpose of this study, refers to
instructional design systems used by organizations to design, develop, deliver, and
support instructional design processes within that institution.
Organizational Structure
The form of leadership and support provided for instructional design activities
within an organization is determined, in part, by the organizational structure.
Understanding various organizational structures is important in today's changing
academic environment for a variety of reasons. One reason centers around ID leadership.
ID leadership helps guide instructional design activities today and for the future. Fullan
(2001) refers to leaders as agents of change. Today's changing academic environment
requires leaders who can work with others to ensure appropriate instructional design
methods are followed to provide instruction meeting instructional objectives and learner
expectations. Marx (2006) states that working together is the only way to improve our
chances of survival as an educational system (p. 19). The organizational structure
provides the pattern from which leadership and support for ID activities emerge as
faculty, instructional designers, and others work together to design and develop
instruction and training.
Leadership and support for ID activities can help bridge the gap between subject
matter experts' knowledge of the subject and their knowledge of learning theories,
instructional design theories, and instructional design models. Organizational structure
helps determine from where support for course development and modification will come
and who within the organization is responsible for course design and development in
today's changing and technologically complex environments.
Various organizational structures are used in higher education today. The models
upon which organizational structures are designed and used at colleges and universities in
the NCA-HLC are relatively unknown. Three instructional design organizational
structures appear to be in use in higher education today. The main features characterizing
various organizational structures can be identified by where responsibility and support for
instructional design lies. The author has coined three terms, instructional designer centered, instructional designer supported, and faculty centered, to identify the three organizational structures explored in this study. The following discussion describes these
three potential structures and their characteristics.
Instructional designer centered structure. The instructional designer centered
structure (IDC) most closely resembles systems analysis and design models found in
business. The instructional designer functions at the center of instructional design
activities in a manner similar to that of the systems analyst. Whitten and Bentley (1998) state the systems analyst "facilitates the development of information systems and computer applications" (p. 7). Similarly, Morrison, Ross, and Kemp (2007) define an instructional designer as a person "responsible for carrying out and coordinating the planning work; competent in managing all aspects of the instructional design process" (p. 18). Instructional designers work with subject matter experts, learners, administrators,
web designers, programmers, and others to ensure instructional goals are met in much the
same way systems analysts work with end-users, managers, programmers, database
administrators, vendors, and others.
Instructional designers must be able to communicate with a wide range of people, to manage a project, and to understand the skills and responsibilities of
fellow team members. Instructional designers must understand learning theory,
instructional design theory, different learning environments, and learners. The designer
must be able to select the appropriate instructional design structure for the current
instructional design project and work within that structure. Selecting needed components
and discarding those that are not necessary will lead the instructional design project team
to a completed instructional product that meets learning objectives within budget and
time constraints. Instructional designers must also be able to work within an emergent
system as described by Irlbeck, Kays, Jones, and Sims (2006) to design solutions for
situations embracing less well-structured problems.
Given this as the responsibility of instructional designers in the IDC structure, it
seems this structure may be widely used today in higher education in the United States.
Literature or information to support this conjecture is limited. The lean team approach as
described by Moore and Kearsley (2005) is perhaps one example of an IDC structure.
Moore and Kearsley, in describing the lean team approach, state "it has to be recognized that no individual is a teacher in this system, but that indeed it is the system that teaches. Even the content is not owned by a professor, but is the product of group consensus" (pp. 107-108). The IDC structure allows faculty members who will be teaching or
moderating a course to not be involved in design and development. White (personal
communication, May 5, 2010) and Clawson (personal communication, May 12, 2010)
described an IDC approach used at one private, for-profit organization, Capella
University. A summation of the conversations follows.
Capella University uses an instructional design team that includes instructional
designers, a curriculum specialist, subject matter experts, course producers, and editors.
The instructional design process for Capella begins with the curriculum development
team consisting of a curriculum specialist and faculty members from the specialization
who are responsible for the analysis portion of the instructional design process. During
the curriculum development process, the team identifies outcomes and competencies for
the program or degree, professional standards are identified, audience analysis is
performed, and high-level assessment strategies are discussed and documented. Once the
analysis is completed and documented, design work begins. During the design phase, the
curriculum specialist transitions out and an instructional designer and the rest of the
course design/development team transition in. Additional faculty members may be
added to the team at this time. Assessment strategies are developed and faculty members
working with instructional designers select course materials and appropriate instructional
strategies to be used to meet competencies and sub-competencies identified during the
analysis phase. Producers are called upon to build the course in the online environment
once design and development have been completed. Editors check for consistency, ADA compliance, and copyright issues, and make sure all electronic activities work properly after the course is built. The course is released after being thoroughly tested and its established outcomes assured. Teaching faculty members receive the course several days prior to its release to learners so faculty can prepare to teach, confirm that all links work, and raise concerns about any issues that need to be addressed. Instructional
designers assist with the creation of grading rubrics and activities to ensure identified
competencies and sub-competencies are included throughout the process in the design
and development of the course (S. Clawson, personal communication, May 12, 2010; N.
White, personal communication, May 5, 2010).
Another example of the IDC structure being used organization-wide can be found
at the private, for-profit university, University of Phoenix. According to Seiden (2009), "standardization provides both consistency and faculty support" (Business orientation, para. 5). Kinser (2006) further describes the Phoenix instructional design process by stating courses are "designed by a committee of subject-matter experts and standardized across the system" (p. 27). This description appears to define an IDC structure in use at
the University of Phoenix, but it is possible it indicates an instructional designer
supported structure as described in the next section.
Instructional designer supported structure. The instructional designer
supported (IDS) structure is similar to the IDC structure in that the instructional designer
plays a major role in instructional design throughout the process. The IDS structure,
unlike the IDC structure, places the instructional designer in the role of supporting the
design process rather than leading the process. Merrill and Wilson (2007) describe these as "designers-by-assignment" (p. 336). One view of this may be found in the instructional design generator structure presented by Dempsey et al. (2007), which describes the instructional designer's role as follows:
work closely with unit leaders (subject matter experts) in the initial stages to
design a blueprint outlining the key learning and implementation strategies
appropriate for the context. In close collaboration with the ID, the course leader
develops a sample or module of the course and the ID provides feedback. Once
agreed on, this provides a model for the writing of subsequent modules and
detailed ID feedback is usually not required. (p. 224)
Focus in the IDS structure is on the instructional design process and design of
appropriate learning strategies. This structure provides extensive support to instructors
for course design and development. The IDS structure often focuses on design and
development of courses involving distance learning technologies or use of course
management systems. The University of Arkansas at Little Rock's Scholarly Technology and Resources (STaR) Center states its mission as faculty support for teaching and learning using UALR's Blackboard learning management system (University of Arkansas at Little Rock, 2008, Mission statement). STaR provides instructional design
services, development assistance, training, and production of multimedia solutions.
The IDS structure is a team-based structure in which faculty members work
closely with instructional designers to design and develop instructional solutions.
According to Dempsey et al. (2007), instructional designers are responsible for "keeping up to date with ID literature and educational theory and practice . . . sharing this
knowledge with the team and negotiating appropriate application to a course context in a team environment" (p. 227). This structure increases the knowledge of all members of the team, providing a means to "improve our chances of survival as an educational system" (Marx, 2006, p. 19). Institutions that use less of a team approach often require
faculty members to be responsible for design of their own courses with limited ID
leadership and support. The organizational structure at these institutions resembles what
this study terms the faculty centered structure.
Faculty centered structure. The faculty centered (FC) structure places
responsibility for instructional design and development on faculty members.
Instructional design in this structure often takes the form of helping faculty members learn to use course management systems rather than instructional design for today's complex and changing academic environment. Merrill and Wilson (2007) used the term "designers-by-assignment" in describing the role of faculty in what is termed here as the FC structure (p. 336). "Designers-by-assignment" (p. 336) are faculty members "assigned to be an instructional designer . . . not trained as an instructional designer" (p. 336). Organizational structures using an FC structure must provide leadership and support
for faculty members who are thrust into the role of designers. Little is known about the
type of leadership and support needed or provided under this structure.
Many problems exist with this structure, including lack of training in instructional
design and teaching methods (Dempsey et al., 2007, p. 227), lack of interest in learning
new methods of teaching (p. 228), and faculty overload. Lack of training in teaching
methods and instructional design practices makes many faculty designers novice
instructional designers at best. Silber (2007), quoting Nelson and Stolterman, suggests
that as novice instructional designers, faculty members can present solutions that waste
resources and lead to solutions that are not only ineffective, but can actually create more
difficulty (p. 7).
Instructional design and educational literature has addressed ID challenges
occurring in today's changing, technologically complex educational environments. Much
has been written since 2003 on the need to create a sense of community (Jacobs &
Archie, 2008; Mann, 2005; Woods & Ebersole, 2003), the importance of discussion in
online courses (Deloach & Greenlaw, 2007; Dennen, 2005; Dennen, Darabi, & Smith,
2007; Spatariu, Quinn, & Hartley, 2007) and assessment and academic integrity issues
(Corcoran, Dershimer, & Tichenor, 2004; Lanier, 2006). Yet little seems to have been
written for faculty members on how to design and develop online instruction.
Wiesenberg and Stacey (2005), describing the work of Ally (2004), state, "it is the instructional strategy, not the technology, that determines the quality of the learning within a distance classroom . . . the design of these strategies must follow sound design principles" (p. 388). Wiesenberg and Stacey (2005) conclude there exists a great need for "strong institutional technical support, excellent online teaching/learning skills, as well as an inclusive philosophy of teaching and learning online" (p. 398). FC structures providing excellent leadership and support are needed. The level and type of leadership and support for ID activities typically available in organizational structures following an FC structure are unknown.
Summary
Today's instructional designers must be able to identify learners and the needs of learners in the midst of a diverse learner population. Instructional designers must also
deliver instructional solutions to meet increasing demands of learner-consumers, create
online instruction using a variety of learning theories, provide for a variety of learning
styles, and accommodate learners with disabilities. Organizational structures must be in
place to provide leadership and support to those involved in the instructional design
process.
The ADDIE foundation of the instructional design process is enhanced by the
emergence of problem solving approaches. Research is still warranted to determine
whether there is a significant difference in instructional design activities between the
various organizational structures. This study sought to determine what organizational
structures are currently used and the instructional design activities occurring within
these structures. The study further sought to gain a greater understanding of the strengths
and weaknesses of various instructional design organizational structures in use. The
mixed methods explanatory approach described in Chapter 3 was used for this study.
CHAPTER 3. METHODOLOGY
Changes taking place in higher education make it advantageous to understand
instructional design processes being used to design and develop instruction and training
in colleges and universities within The Higher Learning Commission of the North Central
Association of Colleges and Universities. Instructional design organization structures
under which instructional design leadership and support are provided also need to be
identified. This mixed methods explanatory study sought to gain a greater understanding
of organizational structures being used in higher education at member colleges of the
Higher Learning Commission of the North Central Association of Colleges and
Universities (NCA-HLC) in the United States and instructional design activities taking
place. The study sought to answer the following questions:
1. What types of instructional design activities are found in colleges and
universities using instructional designer centered, instructional designer
supported, and faculty centered models of organizational structure in the
Higher Learning Commission of the North Central Association of Colleges
and Universities?
2. What are the strengths of types of organizational structures in colleges in the
Higher Learning Commission of the North Central Association of Colleges
and Universities as perceived by instructional design professionals?
3. What are the weaknesses of types of organizational structures in colleges in
the Higher Learning Commission of the North Central Association of
Colleges and Universities as perceived by instructional design professionals?
Study Design
The study used a mixed methods explanatory design. Creswell (2008) describes
an explanatory design as one in which the researcher might seek to explain the results in
more depth in a qualitative phase of the study (p. 566). A survey method was used to
collect quantitative data. Results were analyzed to determine if a significant difference occurs in instructional design activities between different forms of organizational structures. Telephone interviews were
conducted after surveys had been received. The interviews explored reasons for any
differences found between organizational structures and instructional design activities.
Participants were selected for the quantitative portion of the study from among
1,015 colleges and universities in the NCA-HLC that were listed as accredited as of May
24, 2010 (Higher Learning Commission, 2010, Current or previously affiliated
institutions 05/25/2010). No participants were selected from colleges listed by the
NCA-HLC with a status of inactive, merged, accredited on probation, accredited on
notice, accredited show cause, or candidate. These institutions were eliminated from
selection in an effort to ensure the sample was representative of the population being
studied (Creswell, 2008, p. 151).
After names and e-mail addresses were recorded and IRB approval received, an e-mail was sent to selected candidates inviting them to participate in the study.
Commercial web software SurveyMonkey was used to conduct the survey. A reminder
e-mail was sent one week after the initial invitation with a second e-mail reminder
following five days later.
The qualitative portion of the study consisted of recorded telephone interviews.
Survey responses were screened daily to determine if any respondents had volunteered to
participate in the telephone interview. Those volunteering to participate in the interview
were contacted via e-mail within two days of submitting their surveys to schedule the
interview. Interviews began during the second week of data collection and continued for
four weeks. A minimum of one volunteer from each of the three organizational
structures of instructional designer centered, instructional designer supported, and faculty
centered was required for comparisons to be made.
The purpose of the interviews was to gain a greater understanding of instructional design activities taking place within each organizational structure than might have been evident from analysis of quantitative data alone. The qualitative portion of the study
also sought to provide a better understanding of the strengths and weakness of various
organizational structures as perceived by instructional design professionals. Interviews
were recorded (volunteers had to grant permission to record in order to participate in the interviews), and the typed transcript was sent to each participant for
validation. Participants who did not return transcripts within one week with comments or
corrections were contacted by telephone in an attempt to receive oral approval. Those
not available by telephone were contacted again by e-mail and asked to respond if the
transcript was in error or permission was not given to use the transcript information.
Three participants responded with minor corrections to their transcript. All interview
participants' responses were included in the study findings.
Sampling
A simple random sampling method was used to select potential participants from
200 of the 1,015 colleges and universities within the HLC (Creswell, 2008, p. 631).
Based on Fowler's table of confidence ranges for variability attributed to sampling (1993,
p. 31), it was determined a minimum of 50 respondents was required to provide a 95%
confidence level and a sampling error of 15%.
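For readers who wish to check the sampling-error figure, the conventional normal-approximation calculation can be sketched in a few lines of Python. This is an illustrative sketch only, not the procedure used in the study; the institution roster is a hypothetical stand-in, and p = .5 is assumed because it yields the widest (most conservative) interval.

    import math
    import random

    population = [f"institution_{i}" for i in range(1, 1016)]  # hypothetical roster of 1,015 accredited schools
    invited = random.sample(population, 200)                   # simple random sample of 200 institutions

    def sampling_error(n, p=0.5, z=1.96):
        # Approximate 95% confidence sampling error for a proportion based on n respondents.
        return z * math.sqrt(p * (1 - p) / n)

    print(round(sampling_error(50), 2))  # roughly 0.14, consistent with the approximately 15% drawn from Fowler's table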
The websites of the institutions from which candidates were invited to participate in the study were reviewed to find the names and contact information of employees listed in an
instructional design role. The titles included instructional designer, faculty development
leader, curriculum specialist, course developer, distance education director, and similar
roles. The school was called in an effort to find the name of a person in one of the above
roles if a name could not be found from the website. The next school was selected from
the larger randomly generated list if a telephone call failed to produce a name. This
process spanned a period of one month, requiring approximately 40 to 50 hours to
complete.
Interview participants were selected based on availability from those who
indicated on the survey their willingness to participate in an interview. All respondents
who volunteered to be interviewed agreed to the recording of the interview (Creswell,
2008, p. 227).
Instrumentation
The quantitative portion of the study used a modified version of Vannoy's (2008)
validated, three-part survey instrument (see Appendix A). Use of this instrument
provides validity through the use of an existing, validated survey. The first part of
Vannoy's instrument was used to collect data concerning instructional design activities and the frequency with which activities were conducted. The second part determined respondents' reasons for excluding specific instructional design activities. The modified instrument used for this study eliminated questions relating specifically to use of course management systems and qualitative questions from Part 3 of Vannoy's survey.
Demographic questions were added at the beginning of the survey to identify
respondents' roles at the institution, type and academic level of the institution,
instructional delivery options offered, and instructional design organizational structure
used. The opportunity for volunteers to participate in a qualitative follow-up telephone
interview was provided at the end of the survey.
The questions and format of the telephone interview (see Appendix B) were
validated by a survey expert in social science research and two instructional designers
known to the researcher. The telephone interview was administered by the researcher.
Data Analysis
Survey data collection ended two weeks after the initial e-mail invitations were
sent. Interview data collection continued for two weeks after the close of quantitative data
collection. Quantitative data analysis began at the conclusion of quantitative data
collection. Descriptive statistics on each item in Parts I, II, and III were used to
determine measures of central tendency. Mode was used to determine central tendency
of responses for each item in Part I of the survey and both median and mode were used to
measure central tendency of responses for each item in Parts II and III of the survey
(Romano, Kromrey, Coraggio, & Skowronek, 2006, p. 4). The variable organizational structure was used to perform a chi-square analysis to determine if a difference exists in the type of organizational structures used at institutions that are members of the NCA-HLC.
Data was then cross-tabulated on organizational structure and measures of central
tendency calculated based on organizational structure. The Kruskal-Wallis test was used
to determine if there was a difference between organizational structures and the
frequency with which specific instructional design activities were performed (Aczel,
1999).
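The descriptive portion of this plan can be illustrated with a brief, hypothetical sketch in Python using the pandas library; the column names, category labels, and responses below are invented for illustration and are not the study data or the actual survey items.

    import pandas as pd

    scale = ["never", "rarely", "selectively", "regularly", "usually", "always"]
    responses = pd.DataFrame({
        "structure": ["FC", "FC", "IDS", "IDC", "IDS", "FC"],
        "needs_assessment": ["rarely", "never", "usually", "never", "always", "selectively"],
    })
    responses["needs_assessment"] = pd.Categorical(
        responses["needs_assessment"], categories=scale, ordered=True)

    mode_response = responses["needs_assessment"].mode()[0]         # most frequent response category
    median_code = responses["needs_assessment"].cat.codes.median()  # median of the ordinal codes (0-5)
    by_structure = pd.crosstab(responses["structure"], responses["needs_assessment"])  # frequencies by structure
    print(mode_response, median_code)
    print(by_structure)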
Qualitative data analysis began at the completion of the first telephone interview.
According to Leech and Onwuegbuzie (2007), at least two types of data analysis are
needed for triangulation (p. 579), and using multiple forms of data analysis can increase
understanding of the data (p. 563). For this study, constant comparative analysis was
used to identify themes emerging from the data. As themes emerged, classical content
analysis was used to determine which themes occurred most often. The process involved
grouping questions into five categories: course description (Question 1); instructional design roles (Questions 2 through 5); instructional design activities (Questions 6 and 7); organizational structures (Questions 8 through 11); and strengths and weaknesses of organizational structures (Questions 12 through 16).
The purpose of the qualitative portion of the study was to gain a greater
understanding of instructional design activities taking place at institutions within the
NCA-HLC and the strengths and weaknesses of organizational structures as perceived by
instructional design professionals. Participants were asked a series of questions designed
to aid in this understanding. Participants were read an introductory statement about the
purpose of the interview and asked for permission to record the interview prior to being
asked any questions (see Appendix B).
Interviews took between 23 and 66 minutes with an average interview time of 38
minutes. Two digital recordings were made of each interview and transferred to
Audacity, a free audio recording and editing program, for transcription. One outside person and the
interviewer transcribed the interviews. Completed transcripts were e-mailed to
participants for approval, review, and modification. In instances where no response was
received to approve or modify the transcript, an attempt was made to contact the
participant by telephone. For those who could not be reached by telephone, a final e-mail
was sent asking for response by a deadline if the transcript was not approved. Three
participants sent minor transcript corrections.
Two reviewers, the principal investigator and a former professional colleague, separately reviewed transcripts. The former professional colleague had prior experience evaluating and identifying themes in literature and technical communications, in addition to participating in a large collaborative project that required identifying themes. Training was provided to transfer that skill set to the identification of
emerging themes. Each reviewer identified emerging themes for each category
previously identified based on individual participant responses. Themes were then
grouped for all respondents and ranked based on frequency of occurrence. Themes and
rankings were compared between reviewers. Discrepancies between reviewers were
discussed until a consensus was achieved. Themes were then reranked to determine
which appeared most frequently in the data.
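A classical content analysis of this kind amounts to tallying and ranking coded themes. The following minimal Python sketch shows the general idea; the theme labels and coded responses are hypothetical and do not reproduce the reviewers' actual codes.

    from collections import Counter

    coded_responses = [                       # hypothetical theme codes assigned to three transcripts
        ["time frame", "students/faculty"],
        ["credit/grades", "time frame"],
        ["educational objectives", "content", "students/faculty"],
    ]
    theme_counts = Counter(theme for themes in coded_responses for theme in themes)
    for theme, count in theme_counts.most_common():   # rank themes by frequency of occurrence
        print(theme, count)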
Organizational structure strengths and weaknesses were listed. In addition, the strengths and weaknesses of each individual organizational structure were identified. Data analysis
was accomplished using Microsoft Word, Access, and Excel.
Ethical Issues
Potential ethical issues included informed consent and anonymity issues.
Informed consent was established prior to participants having access to the survey.
Information relating to informed consent was presented in introductory materials sent
with the e-mail for participation. Survey participants agreed to informed consent by
clicking on the link and acknowledging their consent to participate in the survey before
gaining access to the survey site. Confidentiality was maintained at all times in both
qualitative and quantitative portions of the study. Respondent names were not collected
until a respondent volunteered to participate in the telephone interview. Names of
respondents volunteering for telephone interviews were separated from their responses
and stored in a separate password-protected file on the researcher's desktop computer.
Names of respondents' institutions were not collected. This study was approved by Capella University's IRB (approval number 101194-1), effective from September 14, 2010, through September 14, 2011. Research began after receipt of Capella University Institutional
Review Board approval.
Summary
A mixed methods explanatory study was conducted to identify common
organizational structures and the instructional design activities occurring under each
structure. Institutions from which participants were invited were selected at random from
the NCA-HLC. An existing, validated survey was modified for use in the quantitative
portion of the study; telephone interviews, using questions and formats validated by
experts, were used for the qualitative portion.
The anticipated timeline for the study was two weeks for quantitative data
collection, four weeks to conduct qualitative interviews with a two-week overlap with
quantitative data collection, and two weeks to process and analyze data. Reporting
findings and conclusions was anticipated to take three to six weeks. Actual analysis of the
data and reporting of findings took approximately four weeks to complete. Chapter 4
discusses the data collection and analysis.
CHAPTER 4. DATA COLLECTION AND ANALYSIS


The purpose of this study was to determine what organizational structures were in
place at colleges within the Higher Learning Commission of the North Central
Association of Colleges and Universities (NCA-HLC) and the instructional design
activities taking place within those structures. A further goal of the study was to gain a
better understanding of strengths and weaknesses of the following three organizational
structures: instructional designer centered (IDC), instructional designer supported (IDS),
and faculty centered (FC). The study addressed the following questions:
1. What types of instructional design activities are found in colleges and
universities using instructional designer centered, instructional designer
supported, and faculty centered models of organizational structure in the
Higher Learning Commission of the North Central Association of Colleges
and Universities?
2. What are the strengths of types of organizational structures in colleges in the
Higher Learning Commission of the North Central Association of Colleges
and Universities as perceived by instructional design professionals?
3. What are the weaknesses of types of organizational structures in colleges in
the Higher Learning Commission of the North Central Association of
Colleges and Universities as perceived by instructional design professionals?
Quantitative Data
Two hundred schools from within the NCA-HLC were selected using a simple
random sampling method. Potential participants were selected from within these 200
schools by searching the websites of the selected schools for those listed in an
instructional design role such as instructional designer, faculty development leader,
curriculum specialist, course developer, distance education director, and similar roles.
The individuals were sent an e-mail inviting them to participate in the study. The e-mail
contained the link to a survey administered by SurveyMonkey. Respondents were able
to complete the survey in an average of eight minutes with most taking between seven
and nine minutes to complete. The invitational e-mail informed potential participants
that responses were anonymous and confidential. The opening screen of the survey also
included information relating to anonymity and confidentiality.
Sixty-four (32%) of the 200 people invited to participate followed the link to the
survey. One chose not to enter the survey after clicking on the link and six chose not to
complete the entire survey. Three of those entering the survey, but not completing the
entire survey, sent e-mails indicating that after entering the survey they felt they were not
the best person on campus to respond to the survey. Each of them suggested alternate
people to contact. One person reported an inability to complete the survey due to
technical problems. A total of 87.5% (n = 56) of those responding completed the survey.
Demographics
Part I of the survey collected demographic data from participants in an effort to
gain a better understanding of the overall structure of participants' institutions, the role
and knowledge of participants, the instructional design organizational structure used at
the institution, and participants' experiences with other organizational structures.
Descriptive, nonparametric statistics of mode and mode percentage were performed on
the data collected in response to questions in Part 1. These questions collected nominal-level data that refers to categories or classifications. "The groups are simply names to
differentiate them, no order is implied" (McMillan & Schumacher, 1997, p. 205), making
the calculation of median, as the study methodology initially stated, invalid for this data.
Responses to these questions aid in understanding the organizational structure used in
participants' institutions.
Participant role. Participants identified their roles in their institution as
administrator, instructional designer, faculty developer, teaching faculty, or other (See
Figure 1). Participants identifying themselves as other were asked to specify their roles.
Responses included instructional technology support specialist; instructional materials
developer; outreach staff; teach and design instruction in addition to conducting research;
instructional design/curriculum coordinator; Director - College of Arts, Sciences, and
Letters online program; and coordinator.

Figure 1. Participants' roles within their institutions.


Institutional data. Institutional types were identified as being private, nonprofit;
private, for profit; or public (See Figure 2).
Figure 2. Type of institution participants represent.

Institutional levels included two-year technical colleges, two-year community colleges,
four-year colleges, universities, and other (See Figure 3).

Figure 3. Institutional level of participants' institutions.


Participants answering other identified their institutions as bachelor's, master's, and doctorate college; graduate school; and engineering school with undergraduate and master's degrees. When asked about delivery systems used at their institutions,
respondents reported various combinations of face-to-face, web-enhanced, hybrid, and
online. Respondents reported responsibility for designing for a variety of delivery
methods including face-to-face, web-enhanced, hybrid, and online (See Figure 4).

Figure 4. Delivery options offered by institutions and delivery options for which
participants have responsibility for instructional design.
Results indicate respondents represent a variety of institutions providing
instruction to all levels of students. The results also indicate a wide variety of delivery
methods in use and that participants have instructional design responsibility under a range
of delivery methods. The high percentage of respondents, 81.0%, designing hybrid
instruction strongly indicates the need for instructional designers who can develop
instruction for both face-to-face and online instructional environments.
Organizational structure. Respondents were asked to identify the instructional
design organizational structure best describing what was used at their institutions.
Responses included instructional designer centered, instructional designer supported,
faculty centered, and other (See Figure 5). Other organizational structures identified include: our outreach school uses the instructional designer supported model; our teaching center uses the faculty centered model; we currently have a blend of the faculty centered model and the instructional designer supported model; and the institution uses a faculty centered model, but within our center we use a combination of ID centered and ID supported models to serve a larger body of educators (beyond the institution).

Figure 5. Organizational structures in place at participants' institutions.


Respondents were given definitions for three organizational structures and asked
what structures they had experience using or whether they had experience with another
structure. Responses included instructional designer centered (IDC) only, instructional
designer supported (IDS) only, faculty centered (FC) only, both IDC and IDS, both IDC
and FC, both IDS and FC, all three, and other (See Figure 6). Those responding with
other indicated they marked this choice because they did not have experience with any
organizational structures other than the one used at their institution.
Figure 6. Types of organizational structures with which participants have experience.


Results indicate that most people, 89.5%, who had responsibility for instructional
design had experience in at least one organizational structure other than the one with
which they currently worked. Many of them, 43.9%, have experience in multiple
organizational structures. Respondents were asked next to provide information relating
to specific instructional design activities taking place at their institutions.
Instructional Design Activities
Part II of the survey collected information relating to instructional design
activities taking place within institutions. Participants were asked to select from the
following choices: always, usually, regularly, selectively, rarely, and never in response to
the frequency with which specific instructional design activities were performed.
Fifty-six participants responded to questions relating to instructional design
activities (see Table C1). (Note: The American Psychological Association (2010) requires tables placed in an appendix to be labeled with the appendix letter and table number (p. 127).) The results of this analysis were compared to results obtained by Vannoy (2008)
whose survey was modified for use in this study. A comparison revealed similar findings
of the mode response for performance frequency of specific instructional design activities
between Vannoys results and those found in this study. The comparison revealed that
six of the 11 activities, 54.5%, were rated as being performed with the same level of
frequency by participants in this study and Vannoys study. A similar result was found in
the median response value with six of the activities, 54.5%, having the same median
value. Similar activities are identified in Table C1.
Participant responses to frequency of performing each instructional design
activity surveyed were tested using a goodness-of-fit, one-sample, chi-square test for all respondents. Aczel (1999) defined goodness-of-fit as "a statistical test of how well our data support an assumption about the distribution of a population or random variable of interest" (p. 707). The null hypothesis, H0 (no difference in the frequency with which
specific instructional design activities are performed), was tested and rejected for 10 of
the 11 instructional design activities. Performing a follow-up evaluation of training was the only activity that respondents reported with equal consistency across the frequency
columns. This indicates that not all respondents perform the various instructional design
activities at the same frequency with the exception of performing follow-up evaluation of
training (see Table C2).
Organizational Structure
A goodness-of-fit chi-square test was performed to test the null hypothesis, H0
(there is no difference in the number of institutions using the instructional designer
centered organizational structure, the instructional designer supported organizational
structure, and the faculty centered organizational structure) (H0: IDS = IDC = FC) at
institutions within the NCA-HLC. Responses from the three respondents indicating their
institutions had other instructional design organizational structures were eliminated from
the test. These respondents indicated multiple organizational structures were used.
Depending on the department within the institution in which instructional design
activities were performed, the organizational structures were also varied. The survey did
not allow for respondents to specify which organizational structure their responses related
to; thus, they were eliminated from the test. The results, χ2(2, N = 53) = 22.8, p < .001,
showed that the type of organizational structures used at NCA-HLC schools were not
equally distributed across these institutions. Based on survey response, the null
hypothesis is rejected in favor of concluding a significant difference exists between
organizational structures at institutions within the NCA-HLC. The faculty centered
structure (FC) was the most frequently used organizational structure (60.3% of
respondents). The instructional designer centered structure (IDC) was the least
frequently used organizational structure (8.6% of respondents). The instructional
designer supported structure was used by 25.9% of respondents.
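A goodness-of-fit test of this form can be sketched with scipy; the observed counts below are hypothetical stand-ins chosen only to illustrate the call and will not exactly reproduce the statistic reported above.

    from scipy.stats import chisquare

    observed = [32, 5, 16]                      # hypothetical FC, IDC, and IDS counts summing to 53
    statistic, p_value = chisquare(observed)    # expected frequencies default to an equal split
    print(statistic, p_value)                   # a large statistic and small p support rejecting H0: FC = IDC = IDS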
Responses to frequency of performance of instructional design activities were
then cross-tabulated with the organizational structure used at the institution. The purpose
of the cross-tabulation was to determine what types of instructional design activities were
performed under different organizational structures (see Table C3).
The cross-tabulation shown in Table C3 gives median, mode, and percentages of
the number of respondents indicating the frequency with which they performed various
instructional design activities. The study design called for use of a chi-square test for
independence to test the null hypothesis, H0 (there is no difference in instructional design
activities among the IDC, IDS, and FC structures of organizational structure). It was
determined, based on Aczel (1999) and consultation with statisticians from both Capella
University and the University of Arkansas at Little Rock, a chi-square analysis was
inappropriate to use in this study. Frequency counts of fewer than five found in the cross-tabulation in Table C3 and the small sample size, n = 53, are given as the reasons for this
determination. Aczel (1999) cautions against using the chi-square test when the count in
any cell is less than five (p. 710). The study had over 20% of the cells with values of
fewer than five. Instead, the SPSS nonparametric, independent samples test, Kruskal-Wallis, was conducted to test the null hypothesis.
Kruskal-Wallis was determined appropriate for the data given its use of "ranks of the observations rather than the data" (Aczel, 1999, p. 695). Values of p < .001 for the frequencies of usually, regularly, selectively, and never and for the frequencies of always and rarely supported rejection of the null hypothesis in each case. This indicates there is a difference between organizational structures and the frequency with which various instructional design activities are performed.
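For one instructional design activity at a time, a comparison of this kind can be sketched as follows; the rating codes (0 = never through 5 = always) and the group samples are hypothetical, not the study responses.

    from scipy.stats import kruskal

    idc_ratings = [5, 4, 5]              # hypothetical frequency codes from IDC respondents
    ids_ratings = [4, 4, 3, 5]           # hypothetical frequency codes from IDS respondents
    fc_ratings = [1, 2, 0, 1, 2]         # hypothetical frequency codes from FC respondents

    statistic, p_value = kruskal(idc_ratings, ids_ratings, fc_ratings)
    print(statistic, p_value)            # p below the chosen alpha suggests the structures differ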
Figure 7 and Figure 8 illustrate, respectively, the percentage of respondents in
each organizational structure who indicated the frequency with which they conduct a
needs assessment and determine if the need can be solved by training. This indicates an
unequal distribution of frequency of performance because respondents using an IDS
structure report most often they usually perform these two tasks, and those from IDC
organizational structures report most often they never perform the two tasks. Figure 9
through Figure 17 illustrate the difference in frequency with which each of the remaining
instructional design activities are performed based on organizational structure.

Figure 7. Comparison of frequency of performance of needs assessment among organizational structures.

Figure 8. Comparison of frequency of performance in determining if the need can be solved by training among organizational structures.
Figure 9. Comparison of frequency of performance of writing/using learning objectives among organizational structures.

Figure 10. Comparison of frequency of performance of conducting a task analysis among organizational structures.
Figure 11. Comparison of frequency of performance of identifying types of learning outcomes among organizational structures.

Figure 12. Comparison of frequency of performance of assessing trainees' entry skills and characteristics among organizational structures.
Figure 13. Comparison of frequency of performance of developing test items among organizational structures.

Figure 14. Comparison of frequency of performance of selecting instructional strategies among organizational structures.
Figure 15. Comparison of frequency of performance of selecting media formats for training among organizational structures.

Figure 16. Comparison of frequency of performance of pilot testing instruction before completion among organizational structures.
Figure 17. Comparison of frequency of performing a follow-up evaluation of training among organizational structures.
Study findings have shown there is a difference in the organizational structures
used at institutions represented in the study, and there is a difference in the frequency
with which various instructional design activities are performed by participants. It has
also been shown there is a difference between organizational structures and the frequency
with which various instructional design activities are performed. Differences between instructional design activities and the organizational structures within which they were performed were further evaluated in comparison to reasons for eliminating specific activities during instructional design.
Reasons for Eliminating Activities
Part III of the study collected information relating to reasons for eliminating
specific activities from the instructional design process in an effort to better understand
the instructional design process and decisions instructional designers make (see Table
C4). Table C4 represents nominal-level data on which mode and percentage were
reported. A comparison of the reasons for omitting instructional design activities with those found in Vannoy's (2008) study found identical responses for six (54.5%) of the
activities. Items showing the same reason for not including an activity during the
instructional design process included the following: conduct a needs assessment,
determine if the need can be solved by training, conduct task analysis, assess trainees' entry skills and characteristics, pilot test instruction, and do a follow-up evaluation. In
each of the five cases where responses differed between studies, respondents in this study
responded they always perform the activities. This was a choice that was not available in
the Vannoy survey. In two of the five cases (develop test items and select instructional
strategies), the Vannoy respondents selected not applicable. The not applicable response was removed from this study's survey and replaced with always include.
The data was cross-tabulated on organizational structure and reasons for omitting.
A nonparametric Kruskal-Wallis test was performed to test the null hypothesis, H0 (there
is no difference between organizational structures and reasons for omitting specific
instructional design activities). In each case, the null hypothesis was accepted; this
indicates there was no difference between organizational structures and reasons for not
performing specific instructional design activities.
A variety of organizational structures are used in institutions within the NCA-HLC; the frequency with which instructional design activities are performed varies from
activity to activity; and, a variety of reasons exist for omitting activities. A difference
appears to exist between organizational structures and frequency of performance of
instructional design activities, but this difference does not extend to reasons activities
might be excluded.
The second part of the study collected qualitative data. Qualitative data was used
to gain a greater understanding of the strengths and weaknesses of the IDC, IDS, and FC
organizational structures as perceived by instructional design professionals.
Qualitative Data
The 63 respondents who chose to participate in the survey were given an
opportunity to participate in the qualitative data collection portion of the study. Thirteen
participants volunteered to participate in telephone interviews. Eleven (17.4% of all survey respondents) actually participated in interviews. One was not available at the
scheduled time and did not respond to attempts to reschedule. The other was not available
until after the close of the time frame allowed in the study design for interviews.
Procedure
The purpose of the qualitative portion of the study was to gain a greater
understanding of instructional design activities taking place at institutions within the
NCA-HLC. The study further sought to understand strengths and weaknesses of
organizational structures as perceived by instructional design professionals.
Questions were grouped into five categories: course description (Question 1); instructional design roles (Questions 2 through 5); instructional design activities (Questions 6 and 7); organizational structures (Questions 8 through 11); and strengths and weaknesses of organizational structures (Questions 12 through 16).

Discussion of each of these categories and themes follows.


Categories and Themes
Course description. The interview began by asking participants to define the
term, course. The purpose of this question was to create a foundation for questions about
course design and to verify that interviewer and participant were answering questions
about the same topic (a course). Responses were varied. For example, Participant 9
stated that a course is "an offering for credit taught by faculty and part of a curriculum." The most frequently occurring themes emerging from this question were time frame and students/faculty, each mentioned by 45.5% of participants (n = 5). Credit/grades, educational objectives, and content were each mentioned by 36.3% of participants (n = 4). Other themes
identified were the following: part of a curriculum, defined by a syllabus, assessments,
instructional strategies, unique learning situation, and delivery modes.
A number of these themes were carried forward in the interviews as they related
to instructional design activities, organizational structures, and the strengths and
weaknesses of organizational structures. For example, Participant 9 continued his/her
discussion in a later question by stating the instructional designer "works with the faculty or SME on the actual structure of the class, how to link the activities with the learning objectives, where to put reading activities and material development." This response carried forward the following themes: educational objectives (learning objectives),
content (reading activities and material development), and students/faculty (works with the faculty).
Instructional design roles. Questions 2 through 5 focused on collecting data
about organizational structure. These questions asked how many people were involved in
instructional design at the institution, the job title of each person, and what tasks each
person performs. In addition, participants were asked to identify points of interaction
between those involved in the design process. Responses showed as few as one person to
as many as 15 people were involved in the instructional design process. Table C5 shows
a comparison between organizational structures and number of people involved. The
faculty centered (FC) organizational structure was the most frequently described
structure, n = 4. The most frequently occurring number of people involved in the
instructional design process was three or four, n = 6. All respondents who used an IDS
structure, n = 3, reported having three or four people involved in the instructional design
process. The greatest range in number of people involved in the instructional design
process was found in the FC organizational structure, n = 4, where anywhere from one to
four people might be involved in the process. Others were sometimes reported in the
number involved in the instructional design process but only in a review capacity. These
people were not included in the total count. For example, Participant 11 reported an
involvement of "up to 30 people or more" but only listed three as having a direct role in
the instructional design process. The rest were identified as committee members at
various levels of curriculum or course review.
Respondents were asked to identify the role each person played in the
instructional design process. All respondents listed the instructor or subject matter expert
(SME), n = 11, as involved in the process. Table C6 shows the breakdown of
instructional design roles by organizational structure. The next most frequently identified
role, n = 7, was that of instructional designer. This role was identified as present in each
organizational structure except the FC structure where it was not listed among
instructional design roles. Other roles identified by participants included editors, video
developers, faculty chair, project coordinator, graphic designer, creative services director,
course production team, and others whose roles in the process were those of reviewers.
Participants were asked to describe the tasks performed by each person identified.
This task identification helped to further understand the function of the various roles
identified. The two most frequently identified tasks performed by SMEs were
development of content or materials and development of assessments. Additional tasks
identified as responsibilities of the SME were the following: develop learning objectives,
design learning outcomes, and ensure proper fit within the curriculum. Three tasks were
the most frequently identified for instructional designers in the IDC and IDS
organizational structures. One was to work with faculty members in the design of the
course including development of goals and objectives. The second task was to help
faculty members in selection of learning activities and assessments. The third task was
transfer of content designed by faculty members into the delivery platform. Additional
tasks included ownership of the overall course design process in the IDC structure, provision of technical support in a blended structure, and review of learning materials in the IDS structure.
Where a director of instructional design was identified, the tasks were primarily to
administer and advertise instructional design services and to train and consult with
faculty members. Curriculum specialists served the primary purpose of ensuring course
fit in the curriculum and tracking curricular changes. Multimedia designers created and
designed multimedia materials including pictures, digital videos, and sound tracks.
Administrators ensured academic integrity and provided a timeline for project completion.
Communication and interaction are important aspects of these roles, especially in
the IDC and IDS organizational structures. Respondents were asked to identify areas
where various people involved in the instructional design process interact with one
another. In the FC organizational structure, the SME/faculty member, the designer-by-assignment (Merrill & Wilson, 2007, p. 336), typically worked with other faculty
members in the discipline, directors of instructional design, curriculum specialists, and
multimedia designers. The purpose of these interactions was to gain support in
performance of instructional design activities, but responsibility and leadership rested with
the SME/faculty member. In the IDS structure, the SME interacted extensively with the
instructional designer in course development. The instructional designer typically worked
with administrators and the directors of instructional design to engage the help of others
needed in the process and to ensure consistency across the curriculum and within the
delivery platform. The participant interviewed from an IDC environment indicated that
the SME interacted almost exclusively with the instructional designer during the
instructional design process. The instructional designer then communicated with
administrators and others in support of the project. Course design in the FC and IDS
structure was often, although not always, initiated by the SME. In the IDC structure,
course design or revision typically began at the administrative level.
A similarity is seen among organizational structures and roles that were used to
conduct instructional design activities within the structures. The greatest difference
seemed to be in the number of roles and interactions between people performing the
roles. Instructional design processes and the activities taking place within the processes
were identified next.
Instructional design activities. Participants were asked to describe the
instructional design process at their institutions, followed by a description of typical
instructional design activities. Instructional design process was defined for participants
as referring to the procedures used for designing, developing, and implementing courses
within a program of study or curriculum. Instructional design activities were defined as
the individual steps performed within the procedures identified as part of the instructional
design process. See Table C7 for participant responses.
The results show all participants indicated involvement in determining the
syllabus, content, assessments and materials; these tasks were performed prior to
developing course materials or placing courses into learning management systems and
opening the course for students to access. Most participants, 81.8%, indicated the
process begins with a course request or recognition of the need for instruction.
Development of goals and objectives followed, identified by 72.7% of participants. The
Kruskal-Wallis test was performed to test the null hypothesis, H0 (there is no difference
between organizational structures and instructional design processes based on responses
given by interview participants). In each case the null hypothesis was not rejected,
indicating there was no difference between organizational structures and instructional
design processes.
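As an illustration only, the following minimal sketch shows how a Kruskal-Wallis test of this kind could be computed with the SciPy library; the response coding (1 = never through 5 = always) and the sample values are hypothetical assumptions for demonstration and are not data from this study.

import scipy.stats as stats

# Hypothetical ordinal responses (1 = never ... 5 = always) for one
# instructional design activity, grouped by organizational structure.
idc_responses = [4, 5, 4]       # instructional designer centered
ids_responses = [3, 4, 4, 5]    # instructional designer supported
fc_responses = [2, 3, 3, 2]     # faculty centered

# The Kruskal-Wallis H test compares independent groups without assuming
# normally distributed data, which suits ordinal survey scales.
h_statistic, p_value = stats.kruskal(idc_responses, ids_responses, fc_responses)

# If p_value exceeds the chosen alpha (for example, .05), the null hypothesis
# of no difference between the groups is not rejected.
print(f"H = {h_statistic:.3f}, p = {p_value:.3f}")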
The next question asked participants about instructional design activities. In
answering this question, participants' responses served to further clarify responses to the
previous question about instructional design processes. This clarification provided
insight into processes and organizational structures. For example, one participant
representing an IDC organizational structure further elaborated on the first step,
requesting a course or recognizing the need for a new course.
Whenever a project comes in, there has to be a needs assessment. It is co-lead by
the instructional designer and the faculty chair. The instructional designer makes
sure it is done. That way we truly know who our students are, who our audience
is, their prerequisite knowledge, what competencies do we have to meet within
this course. (Participant 3)
Participant 3 further stated that in his/her IDC organizational structure, design was "an
iterative process that is not done until the end so it is never really done."
In the FC organizational structure, the activities were not clearly defined.
Participant 4 stated, "I don't really know what processes they are using if there is any
kind of formal recognizable one other than their own craft of designing courses." A
greater depth of understanding can be gained from the following explanation.
It depends on the instructor here. Some instructors may say, "OK, what content
am I covering and what order am I covering it? What midterm and final am I
going to give?" That would be at the lesser end. On the greater end, hopefully
people are looking specifically at what learner outcomes we want to achieve and
the processes through which we might achieve those. (Participant 10)
In the IDS organizational structure the instructional designer often serves the
functions of review and quality control. Participant 6 explained the process at his/her
institution.
The SME uses the previous draft of the course as their starting point. Taking into
consideration the content, he drafts learning outcomes, what they expect learners
to learn. Design learning activities . . . The instructional designer does a review of
the draft looking for learning outcomes to make sure they are on the higher level
of Bloom's taxonomy.
Participant 8 further expanded on the instructional designer's role in review and quality
control by stating, "Another thing the instructional designer does is look at instructional
objectives and make sure those instructional objectives are met."
The process and activities are much the same in each organizational structure as
determined by the Kruskal-Wallis test for independence. The depth or quality of those
activities, however, does appear to differ based on organizational structure.
Organizational structures. Participants were asked to identify the
organizational structure used at their institutions. Responses included the following:
IDC, n = 1;
IDS, n = 3;
FC, n = 4;
IDS and FC combined, n = 3.

One participant reported a combined organizational structure, indicating a FC structure
was used for face-to-face courses and IDS for online courses. Participants indicated
several reasons for their response to organizational structure. Below are excerpts from
participant responses.

Participant 1 (FC), "The faculty for the most part work in isolation. They
teach the way they understand as opposed to teaching the way the students
might learn best."

Participant 4 (FC), "The faculty have been designing their courses as the
primary people for 100 plus years at our institution. The idea of having
instructional design people is relatively new within our institution."

Participant 10 (FC), "We work with a significant number of faculty but there
is a lot larger number that we don't work with."

Participant 2 (IDS), "Here at my school our policies and processes tend to be
more department centered or design centered. But our practices really kind of
bridge the gap."

Participant 6 (IDS), "The design of the course really centers with the faculty
and the school. They control the content and how it is delivered. The designer
supports, reviews, and offers suggestions."

Participant 3 (IDC), "That's what we are, a virtual assembly line here. The
faculty member still owns all their content . . . They are working through a
process . . . It's really centered around the role of the instructional designer."

Participant 9 (IDS & FC), "If they [faculty] are very experienced, it would be
faculty centered. If they [faculty] are inexperienced it would be instructional
designer supported."

Responses indicated there was a difference in organizational structures, and the
differences tended to revolve around faculty members designing their own courses in FC
structures, instructional designers supporting faculty in the design of their courses in IDS
structures, and instructional designers driving the course design processes in IDC
structures.
Participants were asked if the organizational structure had changed at their
institutions during the time they had been there. Three of the four participants from FC
organizational structures reported no change; one, Participant 11, reported moving to IDS
then back to FC. The participant stated the initial change was based upon pressure from
"our students not being able to transfer the courses completely." This led to the push for
development of an IDS structure. "In the IDS structure we basically trained faculty to
work on those in-common elements with their own faculty. Basically, we had a train the
trainer." This training, along with the fact that "at the end, we have loud voices for
academic freedom," caused faculty to feel they know how to design their courses, albeit
"purely from a presentational content perspective," and has brought the institution back
to a FC structure.
From institutions supporting an IDS structure, one participant reported no change
and two participants reported change in organizational structure. Participant 2 said the
change was to split the concepts of instructional design and development from
instructional innovation. The instructional design services unit handles fully online
efforts and the instructional development unit works with faculty training. Participant 8
identified the change as a move away from FC. The change began by teaching faculty
members how to write instructional objectives and develop good testing methods. It also
incorporated a method for providing systematic feedback and course evaluation to faculty
members. He/she stated, "Originally the only people doing the instructional design work
were the instructors themselves. There was really no instructional design person here."
Participant 3 indicated the organizational structure at his/her school had been
totally faculty centered. The organizational structure evolved over time to the current
process with resources, feedback, and support available throughout the process that is
guided by the instructional design team. Of the three participants indicating a use of both
FC and IDS organizational structures, two indicated organizational structure changes.
Participant 9 stated, "I would say it's gone from having very much a learning
management system focus, how to use the learning management system, to more a
pedagogical teaching consideration." Participant 5 stated the change in his/her institution
occurred because of the number of online courses offered and the faculty members
teaching them.
All participants reporting changes in organizational structure indicated they
believed the changes were for the better. Participants explained why they believed the
current organizational structures were chosen at their institutions. Tradition is the most
commonly stated reason (see Table C8).
Four of the five participants indicating tradition as one of the primary reasons for
the existing organizational structure also indicated the organizational structure at their
institutions had not changed during the time they had been there. One indicated a change
to a combination of IDS and FC structures. Participants were next asked to discuss their
perspectives of benefits and shortfalls of their organizational structures.
Strengths and weaknesses of organizational structures. Thirty-six percent, n =
4, of participants identified scalability/efficiency and faculty expertise as major benefits
of the organizational structure used at their institutions. Responsiveness, quality, faculty
autonomy, and communication were each identified by 27.3%, n = 3, as major benefits.
Most participants identified between two and three major benefits of their organizational
structure. Additional benefits included: works for the institution, integration into campus
with training units on hybrid learning, cost containment, location, and faculty relief.
Next participants were asked to identify major shortfalls of the organizational
structure used in their institutions. Five participants, 45.5%, identified poor or
inconsistent quality as the number one shortfall of the organizational structure. Three of
the four participants from FC organizational structures identified this shortfall. The next
most commonly identified shortfalls were nonresponsiveness of the process and lack of
faculty training, each identified by 27.3%, n = 3, of participants. Additional
shortfalls included: focus is not on learning, the rate of change makes for uncertainty,
perception of the process, and lack of staffing.
A side-by-side comparison of benefits and shortfalls identified for each
organizational structure is shown in Tables D9 and D10. A variety of reasons were
given when participants were asked why they believed the shortfalls existed. Mentioned
by 27.3%, n = 3, of participants were the following: tradition, leadership, and faculty
workload. Participant 10 summarized, "I think when you have a historic pattern, it takes
a lot of momentum and a lot of intervention to change that model." Participant 3 stated,
"So many things are going on at once. You don't get to pay as much attention to an
individual course maybe as you would like." Participant 1 said, "Lack of opportunity for
collaboration . . . I think the workloads are so heavy that there is not time for people to be
innovative."
Participants were given the opportunity to suggest changes that might reduce or
eliminate shortfalls in the organizational structure. Forty-six percent, n = 5, stated the
need for resources and training in instructional design and teaching, and 36.4%, n = 4,
suggested reducing workloads by giving release time, reducing courseload, or hiring
additional people.
Given the opportunity to share other information about instructional design
activities and the organizational structures at their institutions, participants gave positive
closing comments. Participant 1 stated, "There are some people who are doing a
remarkable job." Participant 9 stated his/her optimism "is much greater." Others
identified continued problems facing institutions and instructional designers. Participant 5
stated, "I wish I knew how to explain to the faculty what instructional design could really
do for them." Participant 9 said, "It's growing. Probably like every institution we have
lots of faculty who are jumping on board [receiving training in pedagogy] and we also
have some who are not interested."
Summary
Quantitative methods were used to investigate research question 1. The results
showed a difference in instructional design activities among three organizational
structures (IDC, IDS, and FC) as shown in Table C2 and validated by the Kruskal-Wallis
test. Table C3 shows which instructional design activities are most frequently performed
within each of the three organizational structures. Table C4 shows reasons participants
gave for excluding specific activities.
Qualitative interviews were conducted and transcripts analyzed in response to
research questions 2 and 3. The major strengths and weaknesses are listed in Tables D9
and D10. Scalability, responsiveness, and quality were identified as major strengths of
the IDC organizational structure; lack of faculty control, perception of the process, and
nonresponsiveness were listed as weaknesses. Note that responsiveness is listed as both a
strength and weakness in this structure. It was interpreted as a strength in responding
quickly to the need to "put things up," yet it was a weakness in being nonresponsive to
last minute "I'm changing my course right now" types of changes.
Table C9 shows benefits of the IDS organizational structure were identified as
integration into campus with training units on hybrid learning, cost containment,
scalability/efficiency, quality, improved communications, and providing both faculty and
instructional designer expertise. Weaknesses of this structure were seen as the rate of
change creates uncertainty, lack of faculty training in instructional design, and lack of
quality in some instances (see Table C10).
The following benefits of the FC organizational structure were identified:
tradition, works for institution, efficiency, faculty autonomy, faculty and instructional
designer expertise, and improved communications (see Table C9). Weaknesses included
lack of quality, focus is not on learning, lack of training in instructional design, and
nonresponsiveness (see Table C10).
The next chapter provides a summary of the results reported in this chapter.
Conclusions regarding the study design and findings are discussed and recommendations
for future study expressed.


CHAPTER 5. RESULTS, CONCLUSIONS, AND RECOMMENDATIONS


The shift from face-to-face instruction to other technology-mediated delivery
environments has brought changes to both organizational structures and instructional
design processes. Shifting instructional delivery media requires instructional designers at
colleges and universities to rethink methods of designing, developing, and supporting
instructional design activities within their institutions. The purpose of this study was to
determine what organizational structures support instructional design and development
activities at colleges and universities within the Higher Learning Commission of the
North Central Association of Colleges and Universities (NCA-HLC). The study also
sought to develop a better understanding of strengths and weaknesses of organizational
structures. Three research questions were posed and are discussed in the following
paragraphs.
A mixed methods explanatory study was employed to address these questions.
Both quantitative and qualitative methods were used in the study of Question 1. Primary
data were collected using quantitative methods and expanded upon with qualitative data.
Questions 2 and 3 were studied using qualitative methods. This chapter will discuss the
results obtained from the research, conclusions drawn, and recommendations for future
study.


Results
Question 1
What types of instructional design activities are found at colleges and universities
using instructional designer centered, instructional designer supported, and faculty
centered models of organizational structure within the North Central Association of
Colleges and Schools, the Higher Learning Commission?
Understanding instructional design activities taking place in each of the
organizational structures serves to aid in understanding how organizational structures
support instructional design activities. The importance of this understanding is discussed
by Bichelmeyer (2003) in the context of the difference between instruction and
instructional design. According to Bichelmeyer, instruction has as its objective "to
ensure learning to the best of each student's abilities" (p. 5) while instructional design
has as its objective "to facilitate standardization of instruction" (p. 5). It is important to
understand whether organizational structure impacts instructional design activities
supporting the similar, yet different, objectives of instructional design and instruction.
The study examined instructional design activities taking place in different organizational
structures in order to gain a greater understanding of the differences found in each
organizational structure.
Quantitative data were collected using a survey modified from the work of
Vannoy (2008). Data were analyzed to determine which organizational structures
supporting design and development of instruction are used in institutions in the NCA-HLC
and the frequency with which specific instructional design activities take place. This
information was then cross tabulated to determine the frequency with which activities
take place in each of three organizational design structures: instructional designer
centered (IDC), instructional designer supported (IDS), and faculty centered (FC).
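As an illustration only, the following minimal sketch shows how such a cross tabulation could be produced with the pandas library; the column names, activity labels, and frequency ratings are hypothetical assumptions for demonstration and do not reproduce the study's survey instrument.

import pandas as pd

# Hypothetical survey records: one row per respondent per activity.
records = pd.DataFrame({
    "structure": ["FC", "FC", "IDS", "IDS", "IDC", "IDC"],
    "activity": ["write learning objectives", "pilot test instruction",
                 "write learning objectives", "conduct needs assessment",
                 "write learning objectives", "pilot test instruction"],
    "response": ["always", "rarely", "always", "usually", "always", "selectively"],
})

# Count how often each activity receives each frequency rating within each
# organizational structure.
frequency_table = pd.crosstab(
    index=[records["structure"], records["activity"]],
    columns=records["response"],
)
print(frequency_table)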
Institutional data collected shows respondents represent a combination of private,
nonprofit; private, for profit; and public schools. Schools represented varied and
included two-year technical colleges, two-year community colleges, four-year colleges
and universities. Respondents indicated that their institutions use a broad range of course
delivery methods including face-to-face, web-enhanced, hybrid, and online. The various
methods are each utilized with similar frequencies. Respondents were also asked to
identify delivery methods for which they personally were responsible for designing
instruction. Results were similar to respondents institutional delivery methods with the
exception that respondents most frequently reported responsibility for designing
instruction for hybrid delivery methods.
Results indicated three common instructional design organizational structures in
place. These were instructional designer centered (IDC), instructional designer supported
(IDS), and faculty centered (FC). The FC structure was the most commonly identified
organizational structure in use, while the IDC structure was the least commonly used.
No additional organizational structures were identified by respondents.
The frequency that instructional design activities occur among the surveyed
institutions varied from activity to activity. Overall, the most frequently occurring
activities were identified as: write learning objectives, identify types of learning
outcomes, and select instructional strategies. The least frequently occurring activities
were: pilot test instruction, develop test items, and conduct a task analysis. Results are
similar to the findings of Vannoy (2008). Vannoy found 50% of the most frequently
chosen activities remain the same as those in Wedman and Tessmer's (1993) study.
The study further shows the frequency that instructional design activities were
performed varies among organizational structures. For example, the most frequently
occurring response for persons using an IDS structure to the activity pilot test instruction
before completion was "always." However, those using an IDC or FC structure most
frequently responded "selectively" or "rarely." The most frequent response to the frequency
that needs assessments are conducted was "usually" for those using an IDS
structure, "selectively" in the FC structure, and "never" in the IDC structure.
Reasons for eliminating instructional design activities were also examined. A
variety of reasons was identified and found to vary from activity to activity. Reasons for
not conducting an activity were given most often for two activities: conduct a needs
assessment and pilot test instruction before completion. Respondents indicated the most
frequent reason for eliminating a needs assessment was that the decision had already
been made. Pilot testing was most often omitted due to lack of time. The Kruskal-Wallis
test showed organizational structure does not seem to impact reasons given for
omitting specific activities.
The findings in this study indicated the most frequently occurring instructional
design activities taking place were write/use learning objectives and identify types of
learning outcomes. The least frequently occurring activities were develop test items and
pilot test instruction before completion. Findings further indicated there was a difference
in frequencies with which instructional design activities took place between
organizational structures. The most frequently occurring instructional design activities in
the instructional designer centered organizational structure were write/use learning
objectives and identify types of learning outcomes. In the instructional designer
supported structure the most frequently occurring activities were determine if the need
can be solved by training and write/use learning objectives. For the faculty centered
structure the most frequently performed activities were write/use learning objectives and
select instructional strategies.
Focus of the study next turned to gaining a better understanding of strengths and
weaknesses of instructional design organizational structures. Qualitative methods were
used to answer the next two research questions.
Questions 2 and 3
Eleven telephone interviews were conducted with persons responsible for
instructional design. They were asked what they felt are the strengths/weaknesses of
organizational structures used at their institutions. Interview responses provided data
about how many people were involved in the instructional design process, the role of
those involved, and benefits and limitations of organizational structures.
Participants indicated the instructional design process in the FC structure typically
involved as few as one person to as many as four, with one being the most frequently
given response. Participants from the IDS structure indicated three to four people were
typically involved in instructional design processes at their institutions; IDC structures
included 10 or more people in the process. The role most frequently identified by all
participants as involved in instructional design processes was the subject matter
expert; seven participants also identified an instructional designer. Instructional designers
were not mentioned as having involvement in instructional design processes by any
participants in the FC structure, while participants from the other structures indicated the
instructional designer was always involved in the processes. Other roles identified as
frequently occurring in instructional design processes were director of instructional
design, curriculum specialist, and multimedia designer.
The most frequently identified tasks performed by SMEs were development of
content or materials and development of assessments. Tasks of developing learning
objectives, designing learning outcomes, and ensuring proper fit in the curriculum were
also included. The most frequently identified task of instructional designers was to
work with faculty in the design of courses. Specifically, this work included development
of goals and objectives, selection of learning activities and assessments, and transfer of
faculty-designed content into the delivery platform.
who were identified as involved in instructional design processes performed various
support and administrative functions.
Participants identified a variety of interactions among those people involved in
the instructional design process. In the FC structure, interaction was typically between
SMEs and other faculty members in the discipline together with the directors of
instructional design, curriculum specialists, and/or multimedia designers. The purpose of
interactions was to gain support in performance of instructional design activities, but
responsibility and leadership rested with the SME/faculty member. In the IDS structure
SMEs interacted extensively with instructional designers in course development.
Instructional designers typically worked with administrators and directors of instructional
design to engage the help of others needed in the process. This would ensure consistency
across the curriculum and within delivery platforms. In the IDC environment, SMEs
interacted almost exclusively with instructional designers during the instructional design
processes. The instructional designer communicated with administrators and others in
support of the project.
These interactions guided instructional design processes and activities as part of
the processes at each institution. Responses indicated there was little or no difference
between organizational structure and instructional design processes. Participant
responses, while indicating the processes taking place may be similar, showed there was
a difference in the depth or quality of activities based on organizational structures. The
impact these differences have on instructional design and course quality overall was
addressed by participants who perceived quality as a benefit of the instructional designer
centered organizational structure and lack of quality as a shortfall of the faculty centered
organizational structures.
No one overarching major benefit was identified for any organizational structures,
but several benefits were listed. Scalability/efficiency and faculty expertise were
identified most often by participants as major benefits. Benefits solely attributed to FC
organizational structures were tradition and works for institution. The benefit attributed
only to IDS structures was integration into campus with training units on hybrid learning.
No benefits were listed as specifically unique to IDC structures.
Poor or inconsistent quality was identified most often as the major shortfall of
organizational structures used by participants. This shortfall was identified by five
participants. Three of the four participants from FC organizational structures identified
poor or inconsistent quality as a shortfall of the structure. Shortfalls identified as unique
to the FC structure were lack of focus on learning and faculty inflexibility. Unique to the
IDS structure was the rate of change makes for uncertainty and unique to the IDC
structure was perception of the process.
Faculty expertise was ultimately seen as both a major benefit and weakness of
organizational structures. It was seen as a benefit in the knowledge faculty had of the
subject but a weakness in the lack of knowledge faculty had in instructional design.
Participant 2 expressed his/her concern.
My personal belief is that instructional designers (instructional design teams,
units, whatever) have to find a way to straddle that fence [faculty autonomy vs.
quality instruction]. It's always been a struggle. We're trying to build quality
product, effective learning units, learning modules, learning courses, however
you want to look at it, but the faculty voice, the faculty ownership is a crucial
part of that. That's the $64,000 question! How do we let faculty be faculty, and
do all the things that we want to do, but yet ensure that the students that go
through that course all have a meaningful learning experience? . . . that's the real
key. That's the core to doing this.
Other participants also expressed this concern. Faculty members need to be able
to do what they do best, teach to their specialization, while ensuring quality
instructional design that enables quality instruction for learners. This concern about
quality needs to be addressed, along with recognition of other strengths of organizational
structures that were identified from information collected during interviews.
Discussion
Organizational structures used for the design, development, and support of
instructional design activities vary between institutions within the NCA-HLC. A shift
has been seen from traditional faculty centered structures where faculty have sole
responsibility for design, development, delivery, and facilitation of their own courses.
The shift is toward instructional designer centered and instructional designer supported
structures that provide varying levels of instructional designer support or control over
instructional design processes. With this change comes the need to understand strengths
and weaknesses of each structure. This understanding can help decision makers at
colleges and universities make informed choices regarding how to best provide effective
environments for development and support of instructional design activities.
This study showed a wide range of delivery methods are being utilized at
institutions within the NCA-HLC. Respondents indicated that within these delivery
methods the most common method for which they have responsibility is hybrid delivery.
According to Vernadakis et al. (2011), hybrid delivery has evolved as an effort to
combine "the best of online and face-to-face instruction" (p. 188). Those designing for
hybrid delivery need to understand how to effectively design for both environments.
Vernadakis et al. state, "E-learning technology developed around the hybrid paradigm is
beneficial for improving the quality of learning, but is useless if it is not based on
pedagogical prescriptions" (p. 189). Faculty "are steeped and highly knowledgeable in
their subject matter but have had very little educational training or support in terms of
teaching and learning theory, pedagogy, and practices of evaluation and assessment"
(Participant 4).
addressed. Understanding whether a difference exists between organizational structures
and strengths and weaknesses of each structure can lead to greater understanding of how
instructional design activities and organizational structures impact the quality of
instructional design. This understanding can also help decision makers understand and
develop effective environments for leadership and support of instructional design
activities in modern academic environments.
Study results determined the frequency that specific instructional design activities
take place varies from organizational structure to organizational structure. Such
differences can have impacts on the quality of instruction designed and delivered at
institutions using different organizational structures. The results seemed to indicate that
in a faculty centered structure the content and quality of instruction can vary among
campuses of the same institution and even among sections of the same course on a given
campus. The greatest weakness identified by the study was a perception of lack of
quality in instruction under a faculty centered structure. The most commonly identified
reason for this weakness seemed to be the lack of faculty training in education and
instructional design. This weakness was identified by at least one participant from each
of the roles of administrator, instructional designer, faculty developer, and teaching
faculty.
In an academic environment where increasing numbers of learners view higher
education as "a commodity to be purchased" (Reiser & Dempsey, 2007, p. 277), it is
important to take steps to consistently provide quality education for learners in
face-to-face, hybrid, and online courses. Resources are needed to help train faculty in
instructional design for this multi-delivery environment.
The study indicates there is a need to identify the type of training faculty should
receive. Delialioglu and Yildirim (2008) stated proper planning and implementation is
important in today's more technologically complex educational environment (p. 475).
Some institutions have addressed this need by providing training in learning management
systems, while others have provided training in pedagogy. Participant 9 addressed this
issue when stating changes in the organizational structure have gone from "a learning
management system focus" to "a pedagogical teaching consideration." Yet shifting
training focus as discussed by Participant 9 enhances the need for proper planning and
implementation of training. Referring to training, Participant 7 indicated faculty feel
overwhelmed by the amount of technology training they have been asked to absorb over
the last ten years.
The importance of selecting appropriate instructional methods was addressed by
Reigeluth (1999) in his discussion of the need to select methods "to achieve desired
instructional outcomes" (p. 8). Study participants indicated recognition of the need to
teach faculty how to use technology, but along with this necessity must also be sound
pedagogy. This was addressed by participants discussing the role of the instructional
designer in helping faculty learn to design instruction. The need was identified to help
faculty move from desired learning outcomes or objectives to focus on content. Focus on
content must then be enhanced to include engagement strategies and assessment
(Participant 10). Those in leadership positions at institutions of higher learning should
recognize these needs and work toward methods for addressing them. Possibilities include
reduced workload and a center for professional development, as suggested by
Participant 1, and better project planning, as suggested by Participant 10.
Study participants indicated a need for recognizing and understanding the benefits
a professional instructional designer can bring to instructional design processes. The
participants also referred to the importance of recognizing and understanding differences
in instructional needs and instructional design activities for various delivery media.
Participant 9 summed it up: "an online course, it's not just the face-to-face course put
online; there are certain implications for how you would design that course." Participants
expressed concern over the number of people designing courses who have no technology
training and/or instructional design training. The need was identified to train instructors
"how to teach in the online environment" (Participant 8). The lack of consistency in the
quality of instructional design may be attributed to inconsistency in faculty training and
desire for training. Participant 9 said at his/her institution there are "lots of faculty who
are jumping on board [receiving training in pedagogy] and we also have some who are
not interested." The potential impact of those not "jumping on board" should be
investigated to determine the effect on overall quality of education.
The study reveals one of the greatest strengths of the FC and IDS organizational
structures is faculty expertise. This expertise must be retained during the instructional
design process and should be supported by the organizational structures. Silber (2007)
discussed the importance of training faculty in instructional design principles in order to
avoid "solutions that are not only ineffective, but can actually create more difficulty" (p.
7). The organizational structure selected for use by institutions must draw on faculty
strengths and provide faculty autonomy, while providing instructional design support to
ensure quality instructional design and instruction.
Recommendations
Recommendations for Institutions
Institutions of higher learning face great challenges in today's educational
environment. Competition for students is stiff, learner demographics are changing, and
technology is changing types of delivery methods and accessibility to education and
educational products. Design and development of instruction must keep pace with
changes taking place. Based on findings in this study, a variety of organizational
structures exist to support instructional activities. Differences exist in the frequency with
which instructional design activities occur and the perceived quality of instruction
produced under differing organizational structures.
Organizations will benefit by looking at methods used to support the design and
development of instruction in their institution. A need exists to examine the consistency
of quality of instructional content, delivery, and assessments at institutions and across
courses and sections. It is recommended that leaders at institutions of higher education
engage in an evaluation of instructional design practices and training given to faculty
members responsible for designing their own courses. It is important to determine if
current training is really a matter of educating faculty to use a learning management
system or whether it provides quality information and training in pedagogy. Decisions
must also be made on how best to provide instruction to a diverse community of learners
and through diverse delivery media.
It needs to be recognized that delivering online instruction is not simply putting
face-to-face content online. Decision makers at institutions of higher learning should
evaluate methods for helping faculty members to efficiently use learning management
systems for placing content online. In addition, methods for effectively using technology
to provide pedagogically sound instruction need to be addressed.
The role instructional designers can play in ensuring quality instructional design
should be examined. Institutional leaders and instructional designers need to identify
ways to help faculty understand the importance of instructional design and the benefit an
instructional designer can bring to the instructional design process. Institutional leaders
and decision makers should also examine the difference between professional
instructional designers and "designers-by-assignment" (Merrill & Wilson, 2007, p. 336).
If faculty members are to continue as "designers-by-assignment" (p. 336), thought should
be given to extensive training for these educators in pedagogy and instructional design
principles. Also, release time should be provided to allow faculty members time to attend
training and develop skills. Merrill (2007) stated 95% of all instructional design is done
by "designers-by-assignment" (p. 336). He further explained that many instructional
products currently designed for corporate America "fall far short of their potential. They
are inefficient and often ineffective" (p. 337). This ineffectiveness appears to carry over
into academic instructional design. Study participants indicated a lack of quality and the
need for training faculty members in both pedagogy and technology.
Recommendations for Further Research
Further study is warranted to gather data from a larger sampling of the population.
The small sample size, while answering the study questions, limited the extent to which
data could be examined. For example, the results could not provide an indication of the
significance of the difference in frequency that various instructional design activities are
conducted within different organizational structures. This information could provide
further insight into the effectiveness of organizational structures. Of interest for future
study is a determination of whether there is a similarity between organizational structure
and instructional design model chosen. For example, do faculty centered structures tend
to use classroom-oriented instructional design models more often than instructional
designer centered structures? This information may assist faculty members and
instructional designers in selecting the best instructional design model to meet the needs
of specific instructional design settings and instructional goals (Gustafson & Branch,
2002).
Additional research needs to be conducted to determine whether organizational
structure differs with the level of education provided at institutions. This
information could assist educational professionals who are considering modifying their
instructional design organizational structures. It could also
provide a framework from which to better understand instructional design activities at
specific academic levels and institutional functions or missions.
Faculty designer perceptions of the instructional design process from both a
faculty centered structure and instructional designer supported structure warrant study in
comparison to those serving as subject matter experts in an instructional designer
centered structure. Examining how faculty designers perceive the purpose and process of
instructional design can shed additional light on reasons for including or excluding
instructional design activities. Increased understanding by faculty members of the need
for instructional design can aid in recognizing the strengths and weaknesses of
organizational structures and perhaps help resolve the problem of lack of quality
consistency identified in the study. Further study is also warranted to delve deeper into
methods for maintaining faculty expertise and developing consistent quality while being
responsive to changes in student needs and content.

Additional research is also warranted to further examine the strengths and
weaknesses of various organizational structures. This study revealed several benefits of
each of the three organizational structures. However, no one benefit surfaced as the
major benefit of a given structure. Similarly, several shortfalls were identified for each
organizational structure. Only lack of quality in the faculty centered structure was noted
by a majority of respondents. Specifically, a study should be conducted on methods being
used to maintain the autonomy of faculty members and improve or provide consistent
quality in instructional design.
Finally, the study should be expanded to institutions outside of the NCA-HLC.
Expansion of the study to other institutions and other accrediting bodies will provide an
expanded opportunity to fully understand organizational structures and the strengths and
weaknesses existing in these structures. The expanded knowledge can help
organizational leaders make more fully informed decisions relating to instructional
design, development, and support at their institutions.
Summary
This study sought to identify instructional design activities typically included in
three organizational structures in an attempt to gain a greater awareness of the role
organizational structure plays in the quality of instructional design. Three organizational
structures were identified: instructional designer centered, instructional designer
supported, and faculty centered. The study results indicate a difference does exist in
organizational structures and instructional design activities taking place within the
various structures. Overall, findings revealed the most frequently occurring instructional
design activities are:

identifying learning outcomes,
selecting instructional strategies,
writing learning objectives.
All three were identified in the most frequently occurring list for both faculty centered
and instructional designer centered organizational structures. Writing learning
objectives was the only one of the above activities to place in the top three frequently
occurring activities in the instructional designer supported organizational structure.
The least frequently occurring instructional design activities overall are:
conducting task analysis,
conducting pilot tests.
Conducting a pilot test was identified as the least frequently occurring activity for all
organizational structures. The only organizational structure for which conducting a task
analysis was the least frequently occurring activity was the faculty centered structure.
The frequency with which activities occurred in the various organizational structures was
reported in Table C3. Reasons for omitting activities appeared to be consistent across
organizational structures.
The qualitative portion of the study expanded the investigation to identify
strengths and weaknesses of organizational structures as perceived by participants. Lack
of quality was identified as a potential weakness in faculty centered organizational
structures. Faculty expertise in their subject matter was recognized as an inherent

94

strength of the structure. Further study was recommended to delve deeper into methods
for maintaining faculty expertise and developing consistent quality. Concern was raised
regarding the responsiveness of organizational structures. The faculty centered structure
was seen to be responsive to needs of learners and content changes but potentially at the
expense of quality. The instructional designer centered structure was noted for being
responsive to change on a large scale but nonresponsive to the immediate needs of the
learners or changes in content.
The study has shown instructional designer supported and instructional designer
centered organizational structures follow team-based approaches. This team-based
approach benefits the process by using the expertise of the subject matter expert and the
expertise of the instructional designer in the course design process. "In that, it's a
collaborative process. We are using the expertise of both" (Participant 6). The study also
shows the faculty centered structure is more often an individual effort. While this
individual effort allows the expertise of the subject matter expert to take center stage, it
can lead to an instructional design disadvantage in that "faculty as a whole are not
systematically prepared professionally to teach" (Participant 10).
Training and organizational restructuring can be used to reduce disadvantages
found in a faculty centered structure. Several study participants indicated a need exists
for faculty training in pedagogical skills and instructional design processes. Along with
the need for faculty training was recognized a need to understand the role instructional
designers can play in the design and support of quality instruction at institutions of higher
education. Recommendations were made for institutions to examine their instructional
design practices in an effort to create or enhance environments upporting instructional

95

design activities. Additional study of organizational structures and instructional design


activities was also recommended.


REFERENCES
Aczel, A. D. (1999). Complete business statistics (4th ed.). Boston: Irwin/McGraw-Hill.
Albi, R. (2007). Professors as instructional designers: Lived experiences in designing
and developing online instruction. Ph.D. dissertation, Capella University, United
States -- Minnesota. Retrieved from Dissertations & Theses @ Capella
University. (Publication No. AAT 3288703).
Ally, M. (2004). Foundations of educational theory for online learning. In T. Anderson
& F. Elloumi (Eds.), Theory and practice of online learning (p. 5). Athabasca
University, Alberta, Canada: Creative Commons.
American Psychological Association. (2010). Publication manual of the American
Psychological Association (6th ed.). Washington, DC: Author.
Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A proposed system and its
control processes. In K. Spence & J. Spence (Eds.), Vol. 2. The psychology of
learning and motivation (pp. 13 - 113). New York: Academic Press.
Ayers, D. F. (2002). Mission priorities of community colleges in the southern United
States. Community College Review, 30(3), 11-30.
Baggaley, J. (2008). Where did distance education go wrong? Distance Education,
29(1), 39-51.
Beckman, S. L. (2009). Introduction to a symposium on organization design. California
Management Review, 51(4), 6-10.
Best, J. W., & Kahn, J. V. (2003). Research in education (9th ed.). Boston, MA: Allyn
and Bacon.
Bichelmeyer, B. A. (2003, August). Instructional theory and instructional design theory:
Whats the difference and why should we care? 2003 IDT Record. Retrieved
from http://www.indiana.edu/~idt/articles/documents/ID_theory.Bichelmeyer.
html
Bichelmeyer, B. A. (2005). The ADDIE model: A metaphor for the lack of clarity in
the field of IDT. AECT 2004 IDT Futures Group Presentations. Retrieved from
http://www.indiana.edu/~idt/shortpapers/documents/IDTf_Bic.pdf
Braganza, A., Awazu, Y., & Desouza, K. C. (2009). Sustaining innovation is challenge
for incumbents. Research Technology Management, 52(4), 46-56.

Brill, J. M., Bishop, M. J., & Walker, A. E. (2006). The competencies and
characteristics required of an effective project manager: A web-based Delphi
study. Educational Technology Research and Development 54(2), 115-140.
Chen, G. (2009, January 26). Changing student demographics: Rising number of
professional students. [Online article]. Retrieved from http://www.community
collegereview.com/articles/75
Corcoran, C. A., Dershimer, E. L., & Tichenor, M. S. (2004). A teachers guide to
alternative assessment: Taking the first steps. The Clearing House, 77(5), 213-216.
Creswell, J. W. (2008). Educational research: Planning, conducting, and evaluating
quantitative and qualitative research (3rd ed.). Upper Saddle River, NJ: Pearson.
Darvin, J. (2006). Real-world cognition doesn't end when the bell rings: Literacy
instruction strategies derived from situated cognition research. Journal of
Adolescent and Adult Literacy, 49(5), 398-407.
Day, J. C., & Jamieson, A. (2003). School enrollment 2000: Census 2000 Brief (Report
No. C2KBR-26). Retrieved from U. S. Census Bureau website: http://www.
census.gov/prod/2003pubs/c2kbr-26.pdf
Delialioglu, O., & Yildirim, Z. (2008). Design and development of a technology
enhanced hybrid instruction based on MOLTA model: Its effectiveness in
comparison to traditional instruction. Computers & Education, 51(1), 474-483.
DeLoach, S. B., & Greenlaw, S. A. (2007). Effectively moderating electronic
discussions. Journal of Economic Education, 38(4), 419-434.
Dempsey, J. V., Albion, P., Litchfield, B. C., Havard, B., & McDonald, J. (2007). What
do instructional designers do in higher education? In. R. A. Reiser & J. Dempsey
(Eds.), Trends and issues in instructional design and technology (2nd ed., pp.
221-233). Upper Saddle River, NJ: Pearson.
Dennen, V. P. (2005). From message posting to learning dialogues: Factors affecting
learner participation in asynchronous discussion. Distance Education, 26(1), 127-148.
Dennen, V. P., Darabi, A. A., & Smith, L. J. (2007). Instructor-learner interaction in
online courses: The relative perceived importance of particular instructor
actions on performance and satisfaction.
Distance Education, 28(1), 65-79.

Dick, W., & Carey, L. (1978). The systematic design of instruction. Upper Saddle River,
NJ: Pearson.
Dick, W., Carey, L., & Carey, J. O. (2009). The systematic design of instruction (7th
ed.). Upper Saddle River, NJ: Pearson.
Doyle, W. (2009). Online education: The revolution that wasn't. Change: The Magazine
of Higher Learning, 41(3), 56-58.
Driscoll, M. P. (2005). Psychology of learning for instruction (3rd ed.). Boston, MA:
Pearson.
Ertmer, P. A., & Stepich, D. A. (2005). Instructional design expertise: How will we
know it when we see it? Educational Technology, (45)6, 38-43.
Fauser, M., Henry, K., & Norman, D. K. (2006). Comparison of alternative instructional
design models. [Online article]. Retrieved from http://m.deekayen.net/compariso
n-alternative-instructional-design-models
Fowler, F. J. (1993). Survey research methods (2nd ed.). Newbury Park, CA: Sage.
Franken, A., Edwards, C., & Lambert, R. (2009). Executing strategic change:
Understanding the critical management elements that lead to success. California
Management Review, 51(3), 49-73.
Fullan, M. (2001). Leading in a culture of change. San Francisco, CA: Jossey-Bass.
Gearhart, D. (2001). Ethics in distance education: Developing ethical policies. Online
Journal of Distance Learning Administration, 4(1).
Gholson, B., & Craig, S. (2006). Promoting constructive activities that support vicarious
learning during computer-based instruction. Educational Psychology Review,
18(2), 119-139.
Gustafson, K. L., & Branch, R. M. (2002). Survey of instructional development models
(4th ed.). Syracuse, NY: ERIC.
Hedberg, J. G. (2006). E-learning futures? Speculations for a time yet to come. Studies
in Continuing Education, 28(2), 171-183.
Higher Learning Commission. (2010, May 25). Current or previously affiliated
institutions 05/25/2010 [Official web site]. Retrieved from http://www.
higherlearningcommission.org/component/option,com_directory/Itemid,184/

Howell, C. L., & ERIC Clearinghouse for Community Colleges. (2001). Facilitating
responsibility for learning in adult community college students (Publication No.
ED451841). Eric Digest [Electronic Database]. Retrieved from http://www.eric
digests.org/2001-4/adult.html
Irlbeck, S., Kays, E., Jones, D., & Sims, R. (2006). The phoenix rising: Emergent models
of instructional design. Distance Education, 27(2), 171-185.
Irlbeck, S. A., & Pucel, D. J. (2000). Dimensions of leadership in higher education
distance education. International Workshop on Advanced Learning
Technologies (IWALT), 2000, p. 63. doi: IWALT.2000.890567
Jacobs, J., & Archie, T. (2008). Investigating sense of community in first-year college
students. Journal of Experiential Education, 30(3), 282-285.
Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth
(Ed.), Instructional-design theories and models: Vol. 2. A new paradigm of
instructional theory (pp. 215-239). Mahwah, NJ: Erlbaum.
Jonassen, D., Strobel, J., & Gottdenker, J. (2005). Model building for conceptual change.
Interactive Learning Environments, 13(1-2), 15-27.
Jones-Kavalier, B. R., & Flannigan, S. L. (2006). Connecting the digital dots: Literacy of
the 21st century. Educause Quarterly, 29(2), 8-10.
Kemp, J. (1971). Instructional design: A plan for unit and course development.
Belmont, CA: Fearon.
Kinser, K. (2006). What Phoenix doesn't teach us about for-profit higher education.
Change 38(4), 24-29.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during
instruction does not work: An analysis of the failure of constructivist, discovery,
problem-based, experimental, and inquiry-based teaching. Educational
Psychologist, 41(2), 75-86.
Lanier, M. M. (2006). Academic integrity and distance learning. Journal of Criminal
Justice Education, 17(2), 244-261.
Leech, N. L., & Onwuegbuzie, A. J. (2007). An array of qualitative data analysis tools: A
call for data analysis triangulation. School Psychology Quarterly, 22(4), 557-584.
Magliaro, S., Lockee, B., & Burton, J. (2005). Direct instruction revisited: A key model
for instructional technology. Educational Technology Research and
Development, 53(4), 41-55.

Mann, S. J. (2005). Alienation in the learning environment: a failure of community?
Studies in Higher Education, 30(1), 43-55.
Marx, G. (2006). Future-focused leadership: Preparing schools, students, and
communities for tomorrow's realities. Alexandria, VA: Association for
Supervision and Curriculum Development.
Mayer, R.E. (1999). Designing instruction for constructivist learning. In C. M.
Reigeluth (Ed.), Instructional-design theories and models: Vol. 2. A new
paradigm of instructional theory (pp. 141-159). Mahwah, NJ: Erlbaum.
Mayer, R. E., Griffith, E., Jurkowitz, I. T. N., & Rothman, D. (2008). Increased
interestingness of extraneous details in a multimedia science presentation leads to
decreased learning. Journal of Experimental Psychology: Applied, 14(4), 329-339.
Mayer, R. E., & Johnson, C. I. (2008). Revising the redundancy principle in multimedia
learning. Journal of Educational Psychology, 100(2), 380-386.
McLoughlin, C., & Lee, M. (2008). Future learning landscapes: Transforming
pedagogy through social software. Innovate: Journal of Online Education, 4(5).
Retrieved from http://www.innovateonline.info/pdf/vol4_issue5/Future_
Learning_Landscapes-__Transforming_Pedagogy_through_Social_Software.pdf.
(Accession No. EJ840361)
McMillan, J. H., & Schumacher, S. (1997). Research in education: A conceptual
introduction (4th ed.). New York, NY: Addison-Wesley.
Merrill, M. D., & Wilson, B. (2007). The future of instructional design
(point/counterpoint). In R. A. Reiser & J. V. Dempsey (Eds.), Trends and Issues
in Instructional Design and Technology (2nd ed., pp. 335-351). Upper Saddle
River, NJ: Prentice Hall.
Molenda, M. (2003). In search of the elusive ADDIE model. Performance Improvement,
42(5), 34-36.
Moore, M. G., & Kearsley, G. (2005). Distance education: A systems view (2nd ed.).
Belmont, CA: Wadsworth.
Morrison, G. R., Ross, S. M., & Kemp, J. E. (2007). Designing effective instruction (5th
ed.). Hoboken, NJ: John Wiley & Sons.
Morrison, G. R., Ross, S. M., Kalman, H. L., & Kemp, J. E. (2011). Designing effective
instruction (6th ed.). Hoboken, NJ: John Wiley & Sons.


Mosby (Ed.). (2006). Mosby's pocket dictionary of medicine, nursing, & health professions (5th ed.). St. Louis, MO: Mosby.
Orszag, J. M., Orszag, P. R., & Whitmore, D. M. (2001, August). Learning and earning: Working in college [Online article]. Commissioned by Upromise, Inc. Retrieved from http://www.brockport.edu/career01/upromise.htm
Palloff, R. M., & Pratt, K. (2003). The virtual student: A profile and guide to working
with online learners. San Francisco, CA: Jossey-Bass.
Qureshi, E. (2004). Instructional design models. [Online article]. Retrieved from
http://web2.uwindsor.ca/courses/edfac/morton/instructional_design.htm
Reigeluth, C. M. (1999). Instructional-design theories and models: A new paradigm of
instructional theory (Vol. 2). Mahwah, NJ: Erlbaum.
Reiser, R. A., & Dempsey, J. V. (2007). Trends and issues in instructional design and
technology (2nd ed.). Upper Saddle River, NJ: Pearson.
Romano, J., Kromrey, J. D., Corraggio, J., & Skowronek, J. (2006). Appropriate statistics for ordinal level data: Should we really be using t-test and Cohen's d for evaluating group differences on the NSSE and other surveys? Paper presented at the annual meeting of the Florida Association of Institutional Research, February 1-3, 2006, Cocoa Beach, FL. Abstract retrieved from http://www.florida-air.org/romano06.pdf
Ryder, M. (2010). Instructional design models. Retrieved from http://carbon.ucdenver.edu/~mryder/itc_data/idmodels.html
Saba, F. (2005). Critical issues in distance education: A report from the United States.
Distance Education, 26(2), 255-272.
Scott, G. (2003). Effective change management in higher education. Educause Review,
38(6), 64-80.
Seiden, M. (2009). For-profit colleges deserve some respect. Chronicle of Higher
Education, 55(41), A80.
Siemens, G. (2004). Connectivism: A learning theory for the digital age. elearnspace.
Retrieved from http://www.elearnspace.org/Articles/connectivism.htm
Siemens, G. (2005). Connectivism: Learning as Network Creating. elearnspace.
Retrieved from http://www.elearnspace.org/Articles/networks.htm


Silber, K. H. (2007). A principle-based model of instructional design: A new way of thinking about and teaching ID. Educational Technology Magazine: The Magazine for Managers of Change in Education, 47(5), 5-19.
Sims, R., & Jones, D. (2002). Continuous improvement through shared understanding:
Reconceptualising instructional design for online learning. Paper presented at the
Ascilite 2002 conference, Auckland, New Zealand. Retrieved from
http://www.ascilite.org.au/conferences/auckland02/proceedings/papers/162.pdf
Sims, R. (2009). From three-phase to proactive learning design: Creating effective online
teaching and learning environments. In J. Willis (ed.) Constructivist Instructional
Design (C-ID): Foundations, Models, and Practical Examples (pp. 379-391).
Charlotte, NC: Information Age.
Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. Englewood
Cliffs, NJ: Prentice-Hall.
Skinner, B. F. (1950). Are theories of learning necessary? Psychological Review, 57,
193-216.
Skinner, B. F. (1958). Reinforcement today. American Psychologist, 13, 94-99.
Smith, P. L., & Ragan, T. J. (2005). Instructional design (3rd ed.). Hoboken, NJ: Wiley.
Spatariu, A., Quinn, L. F., & Hartley, K. (2007). A review of research on factors that
impact aspects of online discussions quality. TechTrends, 51(3), 44-48.
Thompson, T. A., & Purdy, J. M. (2009). When a good idea isn't enough: Curricular innovation as a political process. Academy of Management Learning & Education, 8(2), 188-207.
University of Arkansas at Little Rock. (2008). Scholarly Technology and Resources. Retrieved from http://ualr.edu/star/index.php/home/about-star/our-mission/
Vannoy, K. E. H. (2008). Instructional design using course management systems (Doctoral dissertation, Capella University). Retrieved from Dissertations & Theses @ Capella University. (Publication No. AAT 3310887)
Vernadakis, N., Antoniou, P., Giannousi, M., Zetou, E., & Kioumourtzoglou, E. (2011).
Comparing hybrid learning with traditional approaches on learning the Microsoft
Office Power Point 2003 program in tertiary education. Computers & Education,
56(1), 188-199.


Wayne State University. (2010). Barrier-free elearning: eLearning glossary. Retrieved from http://www.eng.wayne.edu/page.php?id=1263
Wedman, J., & Tessmer, M. (1993). Instructional designers' decisions and priorities: A
survey of design practice. Performance Improvement Quarterly, 6(2), 43-57.
Weisenberg, F., & Stacey, E. (2005). Reflections on teaching and learning online: Quality program design, delivery and support issues from a cross-global perspective. Distance Education, 26(3), 385-404.
Whitten, J. L., & Bentley, L. D. (1998). Systems analysis and design methods (4th ed.). Boston, MA: McGraw Hill.
Woods, R., & Ebersole, S. (2003). Using non-subject-matter-specific discussion boards
to build connectedness in online learning. The American Journal of Distance
Education, 17(2), 99-118.
Yang, P. (2006). UCLA community college review: Reverse transfer and multiple
missions of community colleges. Community College Review, 33, 55-70.
Yankelovich, D. (2005). Ferment and change: Higher education in 2015. Chronicle of
Higher Education, 52(14), B6-B9.


APPENDIX A. INSTRUCTIONAL DESIGN ACTIVITIES SURVEY


From Instructional design using course management systems, by Vannoy, 2008,
Retrieved from Dissertations & Theses @ Capella University. (Publication No. AAT
3310887). Copyright 2008 by Katherine Vannoy. Adapted with permission.
This survey will gather information about your organization and the instructional design
activities instructional designers in your institution engage in when designing and
developing courses. Responses will be anonymous unless you choose to participate in the
follow-up interview. All information will remain confidential.
I understand that participating in this survey is voluntary and that all information will
remain confidential. Yes No
[Note: participants must select yes in order to move on to the survey.]
PART I: Demographic Data
Please answer the following questions about your institution and your function within the
institution.
1. My primary function at the institution is as a(n):
Administrator
Instructional designer
Faculty developer
Teaching faculty
Other (please specify) ____________
2. My institution is:
Private, not for profit
Private, for profit
Public
3. My institution is a:
Two-year technical college
Two-year community college
Four-year college
University
Other (please specify) _________________________


4. My institution offers (check all that apply):


Face-to-face classes (classes that have no Internet component)
Web-enhanced classes (classes that meet face-to-face but use Internet
enhancements)
Hybrid classes (classes that meet both face-to-face and using Internet)
Online classes (classes that have no face-to-face components)
5. I design instruction for (check all that apply):
Face-to-face classes (classes that have no Internet component)
Web-enhanced classes (classes that meet face-to-face but use Internet
enhancements)
Hybrid classes (classes that meet both face-to-face and using Internet)
Online classes (classes that have no face-to-face components)
Please use the following definitions to answer questions 6 and 7.
Instructional designer centered model. An instructional design organizational model in
which instructional design activities are performed by a team of instructional design
experts with the subject matter expert (SME) serving as a member of the design team
during the design, development and delivery of instructional solutions. Instructors
teaching the course may or may not serve on the design and development team.
Instructional designer supported model. An instructional design organizational model in
which the instructor works closely with an instructional designer or instructional design
team in design, development, delivery, and facilitation of his or her own courses.
Support is typically in the form of assistance in both instructional design and
development activities and the use of the course delivery system.
Faculty centered model. An instructional design organizational model in which the
instructor is responsible for the design, development, delivery, and facilitation of his or
her own courses. Support is typically in the form of assistance in the use of the course
delivery system rather than instructional design support.
_______________
6. The instructional design organization model used by my institution would best be
described as a(n):
Instructional designer centered model
Instructional designer supported model
Faculty centered model


Other (please describe)


___________________________________________________________
___________________________________________________________
7. I have also experienced the following instructional design organization model(s)
either at my current institution or prior institutions (please select all that apply):
Instructional designer centered model
Instructional designer supported model
Faculty centered model
Other (please describe)
___________________________________________________________
___________________________________________________________
PART II: Instructional Design Activities
Please answer the following questions based on the frequency with which you perform each task. (Use the following definitions for responses: Always: perform the task every time you design instruction; Usually: consistently perform the task, rarely skipping it; Regularly: perform the task most of the time, but skipping it is not rare; Selectively: only perform the task when considered necessary; Rarely: sometimes perform the task but usually do not; Never: never perform the task.)
Each item is rated on the scale: Always | Usually | Regularly | Selectively | Rarely | Never
8. I conduct a needs assessment.
9. I determine if need can be solved by training.
10. I write/use learning objectives.
11. I conduct task analyses.
12. I identify the types of learning outcomes.
13. I assess trainees' entry skills and characteristics.
14. I develop test items.
15. I select instructional strategies.
16. I select media formats for the training.
17. I pilot test instruction before completion.
18. I do a follow up evaluation of the training.

PART III: Reasons for excluding an instructional design activity


There may be times when you make a decision to exclude an activity during the
instructional design of a project. Please check all reasons for deciding to exclude an
activity from some projects.
19. Conduct a needs assessment.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
20. Determine if need can be solved by training.
Lack expertise

Client won't support


Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
21. Write learning objectives.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
22. Conduct task analysis.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
23. Identify the types of learning outcomes.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include

Other ____________________________________
24. Assess trainees' entry skills and characteristics.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
25. Develop test items.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
26. Select instructional strategies.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
27. Select media formats for the training.
Lack expertise
Client won't support


Decision already made


Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
28. Pilot test instruction before completion.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
29. Conduct a follow up evaluation of the training.
Lack expertise
Client won't support
Decision already made
Considered unnecessary
Not enough time
Not enough money
Always include
Other ____________________________________
30. If you are interested in participating in a follow-up telephone interview (30 minutes to 1 hour) to discuss strengths and weaknesses of organizational structures you have experienced, please provide the following contact information. Interviews will be recorded to ensure all information is correct and no important points have been missed. Your confidentiality will be maintained in any data reporting.
Name: ___________________________
Phone number: ________________


E-mail address: ______________________________


Best time to be reached: ___________________________________
This study has been approved by Capella University's IRB 101194-1, effective from
September 14, 2010 through September 14, 2011.


APPENDIX B. INTERVIEW QUESTIONS


Questions
1. Please define the term course as you understand it.
2. How many people are normally involved in the design of a course at your
institution?
3. What is the job title or job description of each of the people included in
Question 2 in the design of a course at your institution?
4. What tasks does [ask for each person identified in Question 3] perform in their
role in course design?
5. In what ways or areas of design might [ask for each person identified in
Questions 3 and 4] interact with others involved in the design of a course
during the design process?
6. [For purposes of this question instructional design process refers to the
procedures used for designing, developing, and implementing courses within a
program of study or curriculum.] Please describe the instructional design
process at your institution.
7. [For purposes of this question instructional design activities refer to the
individual steps performed within the procedures identified above.] Please
describe current instructional design activities that typically take place in your
institution.
8. Using one of the three models of instructional design organization structures
provided in the survey: instructional designer centered, instructional designer supported, or faculty centered, which model would best describe that used at
your institution? [The interviewer may read the following definitions if asked.
Instructional designer centered model. An instructional design organizational
model in which instructional design activities are performed by a team of
instructional design experts with the subject matter expert (SME) serving as a
member of the design team during the design, development and delivery of
instructional solutions. Instructors teaching the course may or may not serve
on the design and development team. Instructional designer supported model.
An instructional design organizational model in which the instructor works
closely with an instructional designer or instructional design team in design,
development, delivery, and facilitation of his or her own courses. Support is
typically in the form of assistance in both instructional design and
development activities and the use of the course delivery system. Faculty
centered model. An instructional design organizational model in which the
instructor is responsible for the design, development, delivery, and facilitation
of his or her own courses. Support is typically in the form of assistance in the
use of the course delivery system rather than instructional design support.]
9. Why did you select this model as your answer?
10. Has the organizational structure changed during the time you have been at
your institution?
If yes, ask 10a, 10b, 10c, and 10d; if no, skip to Question 11.
10a. How has it changed?
10b. What was the organizational structure like before the change?


10c. What stimulated the change?


10d. Do you think the change was for the better?
11. Why do you believe your institution chose the current structure?
12. What do you consider to be the major benefits of the organizational structure
used in your institution?
13. What do you consider to be the major shortfalls of the organizational structure
used in your institution?
14. Why do you believe these shortfalls exist?
15. What do you think are the causes of the shortfalls caused by the instructional
design organization structure, if any?
16. What changes, if any, could be made in the organizational structure to reduce
or eliminate these shortfalls?
17. What other information would you like to share about instructional design
activities and the organizational structure at your organization?


APPENDIX C. TABLES
Table C1. Frequency and Percentage of Performance of Instructional Design Activities

Activity | Always | Usually | Regularly | Selectively | Rarely | Never
Conduct a needs assessment | 19.6% (11) | 23.2% (13)abd | 17.9% (10)ce | 23.2% (13)ab | 7.1% (4) | 8.9% (5)
Determine if need can be solved by training | 10.7% (6) | 37.5% (21)ad | 21.4% (12)ce | 14.3% (8) | 5.4% (3) | 10.7% (6)
Write/use learning objectives | 50.0% (28)acde | 17.9% (10) | 12.5% (7) | 8.9% (5) | 5.4% (3) | 5.4% (3)
Conduct task analysis | 7.1% (4) | 30.4% (17)a | 16.1% (9)ce | 19.6% (11) | 7.1% (4) | 19.6% (11)
Identify types of learning outcomes | 32.1% (18)ad | 28.6% (16)ce | 14.3% (8) | 14.3% (8) | 5.4% (3) | 5.4% (3)
Assess trainees' entry skills and characteristics | 30.4% (17)a | 21.4% (12)ce | 21.4% (12) | 10.7% (6) | 7.1% (4) | 8.9% (5)
Develop test items | 23.2% (13) | 10.7% (6) | 12.5% (7) | 25.0% (14)a | 10.7% (6)c | 17.9% (10)
Select instructional strategies | 32.1% (18)ad | 21.4% (12)c | 21.4% (12) | 16.1% (9) | 3.6% (2) | 5.4% (3)
Select media formats for training | 28.6% (16)ad | 21.4% (12)c | 19.6% (11) | 21.4% (12) | 1.8% (1) | 7.1% (4)
Pilot test instruction before completion | 17.9% (10) | 14.3% (8) | 10.7% (6) | 26.8% (15)ac | 19.6% (11) | 10.7% (6)
Do a follow up evaluation of training | 19.6% (11) | 17.9% (10) | 19.6% (11)c | 23.2% (13)a | 8.9% (5) | 10.7% (6)

Note. a mode, b activities having two or more modes, c median, d mode responses are the same as found in the Vannoy (2008) study, e median responses are the same as found in the Vannoy study, (n) frequency count.


Table C2. Distribution of Frequency of Performance of Instructional Design Activities

Activity | Chi-Square | Asymp. Sig.
Conduct a needs assessment | 12.836 | .046
Determine if need can be solved by training | 28.364 | .000
Write/use learning objectives | 59.164 | .000
Conduct task analysis | 19.709 | .003
Identify types of learning outcomes | 30.145 | .000
Assess trainees' entry skills and characteristics | 23.527 | .001
Develop test items | 12.836 | .046
Select instructional strategies | 32.182 | .000
Select media formats for training | 25.564 | .000
Pilot test instruction before completion | 13.091 | .042
Do a follow up evaluation of training | 11.055 | .087*

Note: *Accept the null hypothesis, df = degrees of freedom, CI = confidence interval


Table C3. Frequency and Percentage of Performance of Instructional Design Activities Based on Organization Model

Activity / Model | Always | Usually | Regularly | Selectively | Rarely | Never

Conduct a needs assessment
IDC | 0.0% (0) | 20.0% (1) | 20.0% (1) | 20.0% (1)c | 0.0% (0) | 40.0% (2)a
IDS | 20.0% (3) | 46.7% (7)ac | 13.3% (2) | 20.0% (3) | 0.0% (0) | 0.0% (0)
FC | 21.2% (7) | 12.1% (4) | 21.2% (7)c | 24.2% (8)a | 12.1% (4) | 9.1% (3)

Determine if need can be solved by training
IDC | 0.0% (0) | 20.0% (1) | 20.0% (1) | 20.0% (1)c | 0.0% (0) | 40.0% (2)a
IDS | 20.0% (3) | 53.3% (8)ac | 0.0% (0) | 13.3% (2) | 0.0% (0) | 13.3% (2)
FC | 9.1% (3) | 33.3% (11)a | 30.3% (10)c | 12.1% (4) | 9.1% (3) | 6.1% (2)

Write/use learning objectives
IDC | 80.0% (4)a | 0.0% (0)c | 0.0% (0) | 0.0% (0) | 0.0% (0) | 20.0% (1)
IDS | 53.3% (8)a | 33.3% (5)c | 13.3% (2) | 0.0% (0) | 0.0% (0) | 0.0% (0)
FC | 45.5% (15)a | 12.1% (4)c | 15.2% (5) | 12.1% (4) | 9.1% (3) | 6.1% (2)

Conduct task analysis
IDC | 20.0% (1)ab | 20.0% (1)ab | 20.0% (1)abc | 20.0% (1)ab | 0.0% (0) | 20.0% (1)ab
IDS | 6.7% (1) | 46.7% (7)ac | 20.0% (3) | 6.7% (1) | 6.7% (1) | 13.3% (2)
FC | 6.1% (2) | 27.3% (9)a | 15.2% (5) | 24.2% (8)c | 9.1% (3) | 18.2% (6)

Identify types of learning outcomes
IDC | 80.0% (4)a | 0.0% (0)c | 0.0% (0) | 0.0% (0) | 0.0% (0) | 20.0% (1)
IDS | 46.7% (7)a | 20.0% (3)c | 13.3% (2) | 20.0% (3) | 0.0% (0) | 0.0% (0)
FC | 21.2% (7) | 36.4% (12)ac | 15.2% (5) | 12.1% (4) | 9.1% (3) | 6.1% (2)

Assess trainees' entry skills and characteristics
IDC | 40.0% (2)ab | 0.0% (0) | 0.0% (0) | 20.0% (1)c | 0.0% (0) | 40.0% (2)ab
IDS | 33.3% (5)ab | 33.3% (5)abc | 6.7% (1) | 13.3% (2) | 6.7% (1) | 6.7% (1)
FC | 30.3% (10) | 18.2% (6) | 33.3% (11)ac | 6.1% (2) | 6.1% (2) | 6.1% (2)

Develop test items
IDC | 40.0% (2)a | 0.0% (0) | 20.0% (1)c | 20.0% (1) | 0.0% (0) | 20.0% (1)
IDS | 6.7% (1) | 20.0% (3)ab | 20.0% (3)ab | 20.0% (3)abc | 13.3% (2) | 20.0% (3)ab
FC | 30.3% (10)a | 9.1% (3) | 9.1% (3) | 27.3% (9)c | 9.1% (3) | 15.2% (5)

Select instructional strategies
IDC | 60.0% (3)a | 20.0% (1)c | 0.0% (0) | 20.0% (1) | 0.0% (0) | 0.0% (0)
IDS | 26.7% (4) | 20.0% (3) | 13.3% (2)c | 33.3% (5)a | 6.7% (1) | 0.0% (0)
FC | 33.3% (11)a | 24.2% (8)c | 30.3% (10) | 6.1% (2) | 0.0% (0) | 6.1% (2)

Select media formats for training
IDC | 40.0% (2)ab | 0.0% (0) | 40.0% (2)abc | 20.0% (1) | 0.0% (0) | 0.0% (0)
IDS | 33.3% (5)a | 20.0% (3)c | 13.3% (2) | 26.7% (4) | 6.7% (1) | 0.0% (0)
FC | 27.3% (9)a | 24.2% (8)c | 21.2% (7) | 18.2% (6) | 0.0% (0) | 9.1% (3)

Pilot test instruction before completion
IDC | 0.0% (0) | 0.0% (0) | 20.0% (1) | 40.0% (2)abc | 40.0% (2)ab | 0.0% (0)
IDS | 46.7% (7)a | 13.3% (2)c | 0.0% (0) | 20.0% (3) | 20.0% (3) | 0.0% (0)
FC | 9.1% (3) | 18.2% (6) | 12.1% (4) | 27.3% (9)ac | 18.2% (6) | 15.2% (5)

Do a follow up evaluation of training
IDC | 0.0% (0) | 0.0% (0) | 60.0% (3)ac | 20.0% (1) | 20.0% (1) | 0.0% (0)
IDS | 26.7% (4) | 13.3% (2) | 20.0% (3)c | 33.3% (5)a | 6.7% (1) | 0.0% (0)
FC | 18.2% (6) | 24.2% (8)a | 15.2% (5)c | 18.2% (6) | 9.1% (3) | 15.2% (5)

Note: The frequency of counts < 5 preclude the use of chi-square test for independent samples on this data, n = 53. a mode, b activities having multiple modes, c median


Table C4. Reasons Instructional Design Activities are Omitted

Instructional Design Activity | Lack Expertise | Client Won't Support | Decision Already Made | Considered Unnecessary | Not Enough Time | Not Enough Money | Always Include | Other
Conduct a needs assessment | 20.0% (11) | 27.3% (15) | 61.8% (34)a | 45.5% (25) | 49.1% (27) | 30.9% (17) | 7.3% (4) | 3.6% (2)
Determine if need can be solved by training | 14.5% (8) | 25.5% (14) | 49.1% (27)a | 20.0% (11) | 27.3% (15) | 18.2% (10) | 7.3% (4) | 3.6% (2)
Write/use learning objectives | 7.3% (4) | 14.5% (8) | 32.7% (18) | 10.9% (6) | 20.0% (11) | 0.0% (0) | 49.1% (27)a | 10.9% (6)
Conduct task analysis | 16.4% (9) | 20.0% (11) | 32.7% (18) | 25.5% (14) | 40.0% (22)a | 7.3% (4) | 14.5% (8) | 9.1% (5)
Identify types of learning outcomes | 3.6% (2) | 12.7% (7) | 38.2% (21) | 14.5% (8) | 12.7% (7) | 0.0% (0) | 47.3% (26)a | 9.1% (5)
Assess trainees' entry skills and characteristics | 3.6% (2) | 10.9% (6) | 40.0% (22)a | 23.6% (13) | 30.9% (17) | 9.1% (5) | 23.6% (13) | 3.6% (2)
Develop test items | 16.4% (9) | 10.9% (6) | 23.6% (13) | 20.0% (11) | 18.2% (10) | 5.5% (3) | 30.9% (17)a | 9.1% (5)
Select instructional strategies | 7.3% (4) | 18.2% (10) | 30.9% (17) | 1.8% (1) | 10.9% (6) | 5.5% (3) | 50.9% (28)a | 5.5% (3)
Select media formats for training | 18.2% (10) | 20.0% (11) | 30.9% (17) | 10.9% (6) | 14.5% (8) | 12.7% (7) | 38.2% (21)a | 5.5% (3)
Pilot test instruction before completion | 0.0% (0) | 16.4% (9) | 12.7% (7) | 20.0% (11) | 52.7% (29)a | 7.3% (4) | 27.3% (15) | 7.3% (4)
Do a follow up evaluation of training | 1.8% (1) | 9.1% (5) | 9.1% (5) | 20.0% (11) | 41.8% (23)a | 9.1% (5) | 34.5% (19) | 7.3% (4)

Note: Respondents were able to select multiple responses. a = mode, n = 55


Table C5. Comparison of Organizational Structures and People Involved in the Instructional Design Process

Organizational Structure | 1 | 2 | 3 or 4 | 5 or 6 | 7-9 | 10+ | Total
IDC | 0.0% (0) | 0.0% (0) | 0.0% (0) | 0.0% (0) | 0.0% (0) | 9.1% (1)c | 9.1% (1)
IDS | 0.0% (0) | 0.0% (0) | 27.3% (3)c | 0.0% (0) | 0.0% (0) | 0.0% (0) | 27.3% (3)
FC | 18.2% (2)c | 9.1% (1) | 9.1% (1) | 0.0% (0) | 0.0% (0) | 0.0% (0) | 36.4% (4)b
Combination IDS and FC | 0.0% (0) | 9.1% (1) | 18.2% (2)c | 0.0% (0) | 0.0% (0) | 0.0% (0) | 27.3% (3)
Total | 18.2% (2) | 18.2% (2) | 54.5% (6)a | 0.0% (0) | 0.0% (0) | 9.1% (1) | 100.0% (11)

Note: Column headings give the number of people involved. Table displays count and percentage of total, n = 11. a mode all responses, b mode organizational structure, c mode each organization structure.


Table C6. Comparison of Organizational Structures and Role in the Instructional Design Process, N = 37

Roles (columns): SME | ID | Director of ID | Curriculum Specialist | Multimedia Designer | Admin. | Other
Organizational structures (rows): IDC | IDS | FC | Combination IDS and FC | Total

Note: Number of respondents identified by their specific instructional design roles and broken down by organizational structure, n = 11. SME = Subject Matter Expert, ID = Instructional Designer, Admin. = Administrator, Other = all other roles identified with a count of 1.


Table C7. Comparison of Instructional Design Process - Overall and by Organizational Structure

Process | All Participants | FC | IDS | IDC | IDS & FC
Request course development / Recognize need for new course or modification of a course | 81.8% (9) | 75.0% (3) | 100% (3) | 100% (1) | 66.7% (2)
Locate resources | 27.3% (3) | 25.0% (1) | 33.3% (1) | 0.0% (0) | 33.3% (1)
Develop goals and objectives | 72.7% (8) | 75.0% (3) | 100% (3) | 0.0% (0) | 66.7% (2)
Determine syllabus, content, assessments, materials | 100.0% (11) | 100.0% (4) | 100% (3) | 100% (1) | 100% (3)
Determine delivery method (F2F, online, etc.) | 9.1% (1) | 0.0% (0) | 0.0% (0) | 0.0% (0) | 33.3% (1)
Determine appropriate media (video, PP, etc.) | 9.1% (1) | 0.0% (0) | 0.0% (0) | 0.0% (0) | 33.3% (1)
Review and modify | 36.4% (4) | 25.0% (1) | 66.7% (2) | 100% (1) | 0.0% (0)
Develop materials/course in LMS | 100.0% (11) | 100.0% (4) | 100% (3) | 100% (1) | 100% (3)
Review and modify | 45.5% (5) | 25.0% (1) | 66.7% (2) | 100% (1) | 33.3% (1)
Pilot and modify | 9.1% (1) | 0.0% (0) | 33.3% (1) | 0.0% (0) | 0.0% (0)
Go live | 100.0% (11) | 100.0% (4) | 100% (3) | 100% (1) | 100% (3)

Table C8. Comparison of Reasons for Selecting Organizational Structure - Overall and by Organizational Structure

Reason for Selecting | All Participants | FC | IDS | IDC | IDS & FC
Faculty autonomy | 45.5% (5) | 75.0% (3) | 0.0% (0) | 0.0% (0) | 66.7% (2)
Faculty driven | 18.2% (2) | 25.0% (1) | 0.0% (0) | 0.0% (0) | 33.3% (1)
Personal choice in collaboration with team | 27.3% (3) | 50.0% (2) | 33.3% (1) | 0.0% (0) | 0.0% (0)
Integrate to campus | 9.1% (1) | 0.0% (0) | 33.3% (1) | 0.0% (0) | 0.0% (0)
Economy/scalability | 9.1% (1) | 0.0% (0) | 33.3% (1) | 0.0% (0) | 0.0% (0)
Quality | 27.3% (3) | 0.0% (0) | 0.0% (0) | 100.0% (1) | 66.7% (2)
Lack of ID/lack of ID training | 18.2% (2) | 0.0% (0) | 33.3% (1) | 100.0% (1) | 0.0% (0)
Out dated program | 18.2% (2) | 50.0% (2) | 0.0% (0) | 0.0% (0) | 0.0% (0)
Accreditation | 9.1% (1) | 0.0% (0) | 0.0% (0) | 0.0% (0) | 33.3% (1)
Quality/standardization | 9.1% (1) | 0.0% (0) | 33.3% (1) | 0.0% (0) | 0.0% (0)
Streamlining | 27.3% (3) | 25.0% (1) | 33.3% (1) | 0.0% (0) | 33.3% (1)
Competition | 9.1% (1) | 0.0% (0) | 0.0% (0) | 0.0% (0) | 33.3% (1)
Tradition | 9.1% (1) | 0.0% (0) | 33.3% (1) | 0.0% (0) | 0.0% (0)

Note: Percentages all participants, n = 11; FC, n = 4; IDS, n = 3; IDC, n = 1; both IDS and FC, n = 3


Table C9. Comparison of Major Benefits Among Organizational Structures

Major benefits identified (counts compared across FC, IDS, IDC, and IDS & FC structures):
Tradition
Works for institution
Integration into campus with training units on hybrid learning
Cost containment
Scalability / Efficiency
Responsiveness (0a)
Quality (0a)
Faculty autonomy (0a)
Faculty expertise
Location
Improved communication
ID expertise
Faculty relief


Table C10. Comparison of Major Shortfalls Among Organizational Structures

Major shortfalls identified (counts compared across FC, IDS, IDC, and IDS & FC structures):
Focus is not on learning
The rate of change makes for uncertainty
Lack of faculty control / autonomy (1a)
Perception of the process
Non-responsive process (1a)
Lack of faculty training in ID
Training needed
Lack of staffing
Lack of quality (3a)
Work load
Faculty inflexibility

Note: a indicates trait considered a benefit in one organizational structure is considered a weakness in another.


Table C11. Comparison of Findings: Current Study, Vannoy (2008), and Wedman and Tessmer (1993)

Findings listed by Vannoy (2008): most frequently used design activities and least frequently used design activities.
Instructional design activities compared: Follow up evaluation; Identifying learning outcomes; Selecting instructional strategies; Writing learning objectives; Developing test items; Selecting media; Conducting needs assessment; Conducting task analysis; Conducting pilot tests; Assessing skills and characteristics.
Studies compared: This study (2010); Vannoy (2008); Wedman & Tessmer (1993).

Modified from Table 6: Findings from both studies (Vannoy, 2008, p. 46).
