Applying Three Strategies for Integrating Quantitative and
Qualitative Databases in a Mixed Methods Study of a
Nontraditional Graduate Education Program

Vicki L. Plano Clark,1 Amanda L. Garrett,1 and Diandra L. Leslie-Pelecky2

Field Methods 22(2): 154–174
© 2010 SAGE Publications
Reprints and permission: http://www.sagepub.com/journalsPermissions.nav
DOI: 10.1177/1525822X09357174
http://fm.sagepub.com

Abstract
A central issue for mixed methods research is for researchers to effectively
integrate (or mix) the quantitative and qualitative data in their studies. Despite
extensive methodological discussions about integration, researchers using
mixed methods approaches struggle with translating these discussions into
practice and often make inadequate use of integration in their studies. The
authors examined their integration practices as they applied three literature-
based strategies within the context of one mixed methods study about a
nontraditional graduate education program. From this examination, the authors
describe the processes, products, uses, and challenges that materialized as
they merged their quantitative and qualitative databases to develop a better
understanding of participants’ perceptions of their experience in the program.

1 University of Nebraska–Lincoln, Lincoln, NE
2 University of Texas at Dallas, Richardson, TX

Corresponding Author:
Vicki L. Plano Clark, University of Nebraska–Lincoln, Office of Qualitative and Mixed Methods
Research, 114 Teachers College Hall, Lincoln, NE 68588-0345
Email: vpc@unlserve.unl.edu


Keywords
mixed methods research, integration, concurrent design, quantitative and
qualitative data, data transformation

The combination of quantitative and qualitative data as mixed methods research
has developed into a multidisciplinary and international field of inquiry
(Tashakkori and Creswell 2007). Researchers’ use of mixed methods to address
complex research questions across diverse disciplines is growing in prevalence
and acceptance (Plano Clark 2005; Bryman 2006). Within the mixed methods
literature, the issue of integrating (or mixing) quantitative and qualitative data
within mixed methods studies is receiving increased attention. Greene, Caracelli,
and Graham (1989) first raised this topic by looking for evidence of integration
in their seminal review of fifty-seven mixed methods evaluation studies. Today,
integration has emerged as central to thinking about mixed methods research
(Creswell et al. 2003; Teddlie and Tashakkori 2006). Scholars have mapped dif-
ferent integration techniques available to researchers (Caracelli and Greene
1993; Onwuegbuzie and Teddlie 2003; Bryman 2006; Creswell and Plano Clark
2007) and examined the challenges and tensions associated with integrating
quantitative and qualitative data (Sale, Lohfeld, and Brazil 2002; Bryman 2007;
Creswell, Plano Clark, and Garrett 2008).
Despite 20 years of methodological writings about integration, O’Cathain,
Murphy, and Nicholl (2007) recently concluded that researchers are still
making inadequate use of meaningful integration within their mixed methods
studies. Scholars suggest that this problem may be compounded by the lack
of literature providing guidance for researchers wanting to translate method-
ological discussions about mixed methods into practice (Bryman 2006; Morse
2006; Greene 2007). Bryman (2007) noted that the lack of exemplars of how
methods can be integrated in practice is a serious problem for researchers
wanting to implement integration strategies.
There are many possible approaches for integrating quantitative and qual-
itative data, and researchers often choose a strategy that fits the relative timing
of the collection of the two data sets. In sequential mixed methods studies,
researchers collect quantitative and qualitative data in two phases and tend to
make use of iterative integration approaches that emphasize connections between
the study's phases (Greene and Caracelli 1997). In contrast, when using
a concurrent design, researchers collect, analyze, and integrate the quantita-
tive and qualitative data within one phase of research to provide corroborating
or complementary information (Greene, Caracelli, and Graham 1989; Creswell
et al. 2003; Teddlie and Tashakkori 2009); that is, the quantitative and qualitative
data sets are directly related or compared to each other. Creswell and Plano
Clark (2007) describe integration within a concurrent design as merging the
quantitative and qualitative data. The value of integration in concurrent ap-
proaches surpasses the mere summation of qualitative and quantitative
evidence; it is in the dynamic merging of the two forms of data that they
become greater than the sum of their parts. Therefore, the value of concurrent
mixed methods designs can be realized only if researchers apply effective
merging strategies in their practice.
This methodological article is aimed at increasing our understanding of
integration practices by examining three strategies for merging quantitative
and qualitative databases within one concurrent mixed methods study. To illus-
trate the strategies, we discuss examples from a mixed methods study about
a nontraditional science, technology, engineering, and math (STEM) gradu-
ate education program. The methodological discussion reports the processes
we utilized to implement each strategy, the research products that resulted,
the different uses we found for each strategy in response to our research ques-
tions, and the challenges that emerged in our practice.

Integration within Concurrent Mixed Methods Designs


Our work builds on the existing literature about integration within concurrent
mixed methods designs. Scholars note that achieving integration in concur-
rent mixed methods designs is problematic because it is inherently difficult
to compare results from different forms of data and because the quantitative
and qualitative data may represent different phenomena (Sale, Lohfeld, and
Brazil 2002; Creswell et al. 2003). Methodologists have, however, provided
guides for integrating within concurrent approaches (Caracelli and Greene
1993; Creswell et al. 2003; Onwuegbuzie and Teddlie 2003; Bryman 2006).
Creswell and Plano Clark (2007) summarized this literature and noted three
strategies used to merge quantitative and qualitative data: (1) merging in a dis-
cussion, (2) merging with a matrix, and (3) merging by data transformation.

Merging in a Discussion
The most straightforward strategy for merging quantitative and qualitative
data is to present and interpret the two sets of results in a conclusion section
of a manuscript (Creswell and Plano Clark 2007). Reviews of published
mixed methods studies have found this to be one of the most common ap-
proaches to achieving integration (Plano Clark 2005; Hanson et al. 2005;
Plano Clark et al. 2008). Some authors using this strategy present a paragraph
reporting a quantitative result and follow it with a paragraph reporting a
corresponding qualitative result (e.g., Yount and Gittelsohn 2008). Others state
a descriptive quantitative result and then immediately include a qualitative quote
within the same paragraph to corroborate or illustrate the quantitative result (e.g.,
Mizrahi and Rosenthal 2001). Although less common, another model occurs
when authors begin with a qualitative finding and then provide a supporting
(often descriptive) quantitative result (e.g., McAuley et al. 2006).

Merging with a Matrix


A second strategy is to merge using visual displays, primarily in the form of
matrices. The use of matrices in research is not new. Researchers utilize cross-
tab tables to display quantitative results and have a wide range of visual displays
to consider for displaying qualitative information (Miles and Huberman 1994).
The use of matrices in the context of mixed methods, however, has begun to
receive increased attention (e.g., O’Cathain, Murphy, and Nicholl 2007).
Creswell and Plano Clark (2007) highlighted researchers’ use of displays for
comparing groups identified in one data set in terms of data derived from the
other data set. For example, Fetters et al. (2007) used a matrix to compare
qualitative quotes representing major themes across levels of a categorical
variable differentiating types of participants. Similarly, Lee and Greene (2007)
displayed quantitative performance scores and qualitative quotes to highlight
both convergent and discrepant responses. Wittink, Barg, and Gallo (2006)
grouped participants by themes that emerged from interviews and then used
a matrix to examine the quantitative characteristics of each group.

Merging by Data Transformation


The third merging strategy noted by Creswell and Plano Clark (2007) is data
transformation. Data transformation is a process of “quantitizing” qualitative
information or “qualitizing” quantitative information (Tashakkori and Teddlie
1998). The purpose of data transformation is to make the two sets of data
more readily comparable or to convert one type of information to the other to
facilitate further analyses (Onwuegbuzie and Teddlie 2003). Examples of data
transformation by quantitizing qualitative information are more common. Many
researchers create dichotomous variables representing the presence or absence
of a qualitative code or theme for an individual participant (e.g., Pagano et al.
2002). These new variables are used to determine descriptive results, such as
the percentage of participants who mentioned a theme, or in further statistical
analyses. Other researchers employ more sophisticated rubrics for quantitizing.
For example, Idler, Hudson, and Leventhal (1999) developed a theoretical
framework for transforming qualitative results into a 6-point scale that
preserved detail from the qualitative themes and addressed the problem of
assigning scores to individuals that mention multiple issues within their
responses.
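
As an illustration of the simpler dichotomous quantitizing described above, here is a minimal Python/pandas sketch; the participant IDs, theme labels, and coded-segment table are hypothetical stand-ins for an export from qualitative coding software, not data from the studies cited.

```python
import pandas as pd

# Hypothetical long-format export of coded segments: one row per
# (participant, theme) occurrence produced during qualitative coding.
segments = pd.DataFrame({
    "participant": ["p01", "p01", "p02", "p03", "p03", "p03"],
    "theme": ["communication", "teaching", "communication",
              "marketability", "communication", "teaching"],
})

# Quantitize: a 0/1 variable per theme marking its presence for each participant.
presence = (
    segments.assign(value=1)
            .pivot_table(index="participant", columns="theme",
                         values="value", aggfunc="max", fill_value=0)
)

# Descriptive result: percentage of participants who mentioned each theme.
pct_mentioning = presence.mean().mul(100).round(1)
print(presence)
print(pct_mentioning)
```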
Despite this literature and the published examples, there is little guidance
available for how researchers implement these strategies to answer the research
questions in their own mixed methods studies. We used the literature and cited
examples to inform how we applied the strategies within our study.

The Mixed Methods Study


In this article, we report what we learned when we applied different merging
strategies within a study about a nontraditional STEM graduate education pro-
gram. This section overviews the study to provide context for our application
of the strategies.

The STEM Graduate Education Program


The National Science Foundation’s Graduate Fellows in K–12 Education
Program (GK–12) was developed to make future STEM leaders aware of the
issues challenging K–12 education. The STEM graduate education program
involved in this study is one of 146 GK–12 programs in the United States. In
this program, STEM graduate students (who are pursuing advanced degrees
in a STEM field) work in elementary and/or middle schools in the local school
district for approximately 10 hours each week throughout the school year.
Each graduate student is paired with a lead teacher, and they work together to
address the specific needs of the school (e.g., grade-level math and science
standards and process skills). In addition, the selected graduate students are
part of a cohort that shares experiences such as attending weekly meetings,
collaborating to develop materials, and presenting at conferences.

The Study’s Purpose


This study grew out of our interest in examining how the alumni of this non-
traditional program perceive the impact of their participation. Although many
GK–12 programs are conducting research about participants during program
participation, this research investigated graduate students’ perceptions after
participation as they returned to their graduate programs, graduated, and con-
tinued into professional careers. Our primary research question was, how do
the alumni of the STEM graduate education program perceive the impact of
their participation? This question called for mixed methods because we needed
quantitative information to measure the perceived extent of the impact and
qualitative information to describe individual perceptions and experiences of
the impact. Thus, we included both types of data to develop a more complete
understanding of the participants’ perceptions.

Data Collection
The population for this study included all graduate students who had com-
pleted appointments in this nontraditional program before fall 2007. The primary
source of data collection was a Web-based questionnaire that included closed-
and open-ended items; that is, we collected the quantitative and qualitative
data concurrently. We chose to use an online questionnaire because (1) par-
ticipants were accustomed to reporting information about the program in this
format, (2) there were limited resources available for a longitudinal study
of program alumni, (3) quantitative and qualitative data would be readily
accessible, and (4) both kinds of data would be collected from all individu-
als in the sample.
The development of the questionnaire was informed by the findings of an
earlier study of the meaning of participation in the program (Buck et al. 2006)
as well as the stated goals of the GK–12 program. Topics included current
education and employment status, perceptions of program participation, beliefs
and knowledge related to being a scientist, beliefs and knowledge related to
K–12 education, and future professional plans. The questionnaire included three
items that included both closed- and open-ended parts, thirty-nine closed-
ended items, and five open-ended items (see examples in the appendix). We
invited individuals to participate via email and followed informed consent
procedures as approved by our institutional review board. The overall response
rate was 85.7% (N = 36). Although online questionnaires may produce poor
qualitative data, we felt that these individuals would take the time to provide
rich responses because they were invested in the program. In fact, the responses
to the open-ended items represented eighty pages of single-spaced text.
Therefore, although the data were not as rich as interview data, the resulting
database did allow us to explore individuals’ perceptions.

Initial Data Analysis


Consistent with recommendations for a concurrent mixed methods approach
(Tashakkori and Teddlie 1998), we began by separately analyzing the quanti-
tative data quantitatively and the qualitative data qualitatively. This was done
to obtain an initial understanding of the two databases before implementing
our merging strategies. In addition, this procedure allowed us to obtain sepa-
rate and independent results that could be compared for purposes of corroboration
and complementarity, before advancing to more integrative analyses such as
data transformation (Greene 2007; Teddlie and Tashakkori 2009). Specifically,
we analyzed the quantitative data with SPSS using descriptive statistics and
a variety of inferential procedures depending on the nature of the hypothesis
being tested and the properties of the variables. Where appropriate, we used
procedures to ensure the quality of our quantitative analyses such as examin-
ing the internal consistency of summed scores, checking underlying
assumptions before running statistical tests, and correcting our alpha value
for multiple tests.
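
The authors worked in SPSS; as a language-neutral illustration of two of these quality checks, the sketch below computes Cronbach's alpha for a summed scale and a Bonferroni-corrected significance threshold. The item data are simulated, so this shows only the mechanics (a real scale's items should correlate, which random data will not).

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency of a summed scale (one column per item)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
scale = pd.DataFrame(rng.integers(1, 6, size=(36, 4)),
                     columns=[f"item_{i}" for i in range(1, 5)])
print(f"alpha = {cronbach_alpha(scale):.2f}")

# Bonferroni correction: with m tests, compare each p value to alpha / m.
m, alpha = 10, 0.05
print(f"per-test threshold = {alpha / m:.4f}")
```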
Likewise, we used traditional procedures for analyzing the qualitative data
(Miles and Huberman 1994). These procedures included open coding, the-
matic development, triangulating multiple researchers’ perspectives, and actively
searching for contradictory evidence. We utilized the MAXqda2 software for
storing and analyzing our qualitative database. We selected this software
because of numerous features that we felt would be advantageous for our
mixed methods analyses, including capabilities to link participants’ quantita-
tive scores directly to their qualitative responses and to transform qualitative
codes into quantitative scores.
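
The linking capability described here can also be reproduced outside MAXqda wherever the quantitative and qualitative exports share a participant identifier; a minimal sketch with hypothetical tables:

```python
import pandas as pd

# Hypothetical exports: one table of scores, one of coded qualitative segments.
scores = pd.DataFrame({"participant": ["p01", "p02"],
                       "overall_impact": [5, 4]})
coded = pd.DataFrame({"participant": ["p01", "p02"],
                      "code": ["communication", "tensions"],
                      "segment": ["My public speaking improved.",
                                  "The time demands were hard."]})

# Link each participant's quantitative score to his or her coded responses.
linked = coded.merge(scores, on="participant", how="left")
print(linked)
```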

Applications of the Three Strategies to Integrate by Merging

The focus of this article is on the process that occurred as we merged the
quantitative and qualitative data from our study. We present our use of three
merging strategies: in a discussion, with a matrix, and by data transformation
(Creswell and Plano Clark 2007). We employed the strategies to answer spe-
cific questions we had about the participants’ perceptions of the impact of the
program, and the different types of questions led to a set of techniques that
emerged for the strategies. Therefore, for each strategy we delineate the steps
we used to implement it in our analyses, highlight substantive results that
emerged from its use, describe the different techniques that emerged, and
note the challenges that arose in our practice.

Merging in a Discussion
We found that merging results in a discussion was generally a straightforward
strategy to apply that was useful for many of our questions about specific topics.
Our implementation of this strategy followed five steps. First, we separately
analyzed the two sets of data to obtain separate sets of results and to explore
the available information. Second, we identified “overlap” topics within the
substantive content of the two sets of results. These overlap areas suggested
topics that would be fruitful for comparing different results. Third, we refined
the analyses in the overlap areas based on what we had learned from the two
data sets and as needed to facilitate comparisons. For example, we created a
quantitative subscale for communication skills based on its prominence in
the qualitative data. Fourth, we directly compared the quantitative and quali-
tative results for a substantive topic and assessed to what extent and in what
ways the data sets supported or illustrated each other. Finally, we wrote a para-
graph (such as illustrated in Figure 1) that (1) stated specific results from one
data set, followed by a corresponding result from the other data set and
(2) discussed our interpretations about how the result from one data set
corroborates, illustrates, or generalizes a result from the other data set. This
interpretation paragraph is the primary product from this integration strategy.
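
As a sketch of steps 3 and 4, assuming the separate analyses already produced a quantitative subscale and a set of theme quotes (all variable names are hypothetical), the merged output below is the raw material from which such an interpretation paragraph is written:

```python
import pandas as pd

# Hypothetical outputs of the separate quantitative and qualitative analyses.
scores = pd.DataFrame({"participant": ["p01", "p02", "p03"],
                       "communication_subscale": [4.5, 4.0, 4.25]})
quotes = {"communication": ["My public speaking skills have improved.",
                            "I can better articulate my research."]}

# Compare the two results for one overlap topic.
topic = "communication"
m, sd = scores["communication_subscale"].agg(["mean", "std"])
print(f"Topic: {topic}")
print(f"QUAN: M = {m:.2f}, SD = {sd:.2f}")
for q in quotes[topic]:
    print(f'QUAL: "{q}"')
```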
Different techniques. We utilized the merging in a discussion strategy for many
of our initial questions. From these questions, we used three distinct tech-
niques for merging by discussion: corroborating findings, developing a more
complete picture, and identifying divergence (see Table 1). The first tech-
nique was to develop a method by method comparison to corroborate results
about a specific topic or construct. This technique increased our confidence
in the validity and meaningfulness of conclusions based on separate results
from each database. For example, we wondered how the quantitative data
about the perceptions of the benefits derived from participating in the pro-
gram corroborated the qualitative themes about benefits. After analyzing the
two data sets, we compared the quantitative and qualitative results for specific
content areas, such as perceptions of improved communication skills. By
developing a discussion such as presented in Figure 1, we were able to both
describe a qualitative finding and indicate the extent to which it was supported
by the quantitative results on the same topic.
Merging results in a discussion also worked well to develop a more com-
plete picture by presenting two complementary sets of results about a topic.
This technique fit our online questionnaire data well, particularly when an
item included both a closed-ended and an open-ended part. For example,
participants rated their perception of the impact that participation had on their
progress toward completing their degree and described the impact. We asked,
how do the qualitative descriptions complement the quantitative ratings? By
presenting the combined results from both data sets in a discussion, we
developed a more complete understanding of the extent of the perceived
impact as well as the reasons for the ratings. For example, we were surprised
to find that twenty-one individuals (58%) indicated that their participation
had a positive impact on completing their degree despite the time and effort
required by the program. The qualitative data provided insight into the ways
that program participation positively affected individuals' degree completion,
such as increasing motivation and focus, providing social support, helping to
develop skills essential to the degree program, providing financial support,
and having better schedule flexibility than other assistantships.

[QUAL findings] The theme of improved communication skills was a major emphasis
for many participants, most often the first topic they mentioned when prompted to
discuss benefits that came to mind. Three participants briefly mentioned improved
written communication skills, such as for grant writing. Most individuals, however,
wrote about how their oral communication skills had improved during the program
experience. Some emphasized the improvement of general public speaking skills,
such as one who noted, "My public speaking skills have improved dramatically."
Most participants who discussed this theme, however, specifically noted
improvements in their ability to communicate their content topic to different
audiences. One explained, "The interactions with teachers and children helped me
to better articulate my research." Another added, "I gained valuable experience in
communicating my research with others." Some participants noted that they had
gained better appreciation of and skills for communicating with individuals
representing diverse backgrounds. As one participant wrote, "Mostly, I learned a
lot about communication... Sometimes it is just difficult to communicate, but
there are a lot of things that can be done to make it easier."
[QUAN results and comparison] Improved communication skills also emerged as a
positive benefit from the quantitative results. Participants rated the items
related to oral communication (with groups and individuals) highly (M = 4.25,
SD = .612), indicating that participants perceived these skills as improving
because of the program. As noted in the qualitative findings, the quantitative
results also indicate less of a benefit related specifically to written
communication. The item about improved written communication had the lowest mean
value (M = 3.64, SD = .683) of all items asking about different improvements,
with 17 participants indicating that this skill had not changed as a result of
their participating in the program.

Figure 1. Example of merging in a discussion
These first two merging by discussion techniques aligned well with topics
where we expected to find convergence. As suggested by Greene (2008), we
also considered whether actively seeking divergence could provide additional
information. Our third technique was to further analyze the two databases to
identify divergent and alternative perspectives about a topic. We noted from our
quantitative results that the data from all closed-ended items were positively
skewed. Although this is a positive finding in terms of evaluating the program,
we wondered whether having only positive information about the experience
was giving us a complete understanding of the program. Therefore, a new ques-
tion emerged that focused on identifying divergence (see Table 1). We analyzed
the qualitative data to reveal the negative and challenging aspects of partici-
pating in the program as a divergent perspective to the quantitative results.
Combined with the quantitative results, the qualitative themes of time demands,
programmatic tensions, negative emotions, and loss of disciplinary experiences
provided a broader range of perspectives about program participation.
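
One way to operationalize this search for divergence is to screen the closed-ended items for strong skew and treat flagged items as candidates for qualitative counterevidence; a sketch with hypothetical Likert data:

```python
import pandas as pd

# Hypothetical closed-ended responses (1-5 Likert), one column per item.
items = pd.DataFrame({
    "impact_degree": [5, 5, 4, 5, 4, 5, 3, 5],
    "oral_comm":     [4, 5, 5, 4, 5, 4, 5, 5],
})

# Flag items whose distributions are strongly skewed (in either direction);
# these are candidates for probing the qualitative codes for divergence.
skew = items.skew()
print(skew[skew.abs() > 1.0])
```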
Table 1. Merging Strategies, Different Techniques, and Their Application in the
Science, Technology, Engineering, and Math (STEM) Education Program Study

In a discussion
- Compare findings method by method for corroboration: To what extent do the
  quantitative results about the benefits of participation corroborate the
  qualitative themes about benefits?
- Develop a more complete picture by presenting two complementary sets of
  results: How do the qualitative descriptions of the impact of participation
  on completing a degree complement the quantitative ratings?
- Identify divergence and alternative perspectives across the methods: In what
  ways do the qualitative themes about negative aspects of program participation
  present alternative perspectives to the positive quantitative results?

With a matrix
- Explore differences in qualitative findings based on quantitative categories:
  How do the comments of female participants about self-confidence differ from
  those of the male participants?
- Examine differences in qualitative findings based on statistical differences
  in continuous variables: In what ways do participants' perceptions of the
  benefits of participation by disciplinary field shed light on the significant
  differences?
- Examine differences in the quantitative results based on a qualitative
  typology: How do participants' ratings of the impact of participation differ
  by the level of negative aspects experienced (as described in a qualitatively
  developed typology)?

By data transformation
- Develop a new quantitative variable based on qualitative findings to test for
  relationships with other variables: What variable indicates the amount of
  negative aspects associated with an individual's experience in the program?
  How does this variable relate to other variables?
- Consolidate quantitative and qualitative information to develop a better
  variable to use in statistical analyses: What indicator based on quantitative
  and qualitative information best measures a participant's interest in working
  with K–12 education in the future? How does this indicator correlate with
  other variables?

Methodological challenges. We faced several issues as we merged results in a
discussion. Direct comparisons could be made only for topics represented by
both quantitative and qualitative data, which has important implications for
data collection as well as data analysis. Although some scholars argue that
quantitative and qualitative data cannot represent the same phenomenon (Sale,
Lohfeld, and Brazil 2002), we successfully identified general “topics” of over-
lap and compared the results from the two data sets to examine complementary
facets of the participants’ perceptions on these topics. A key tension we ex-
perienced was between remaining true to the traditional standards of
analyzing the data sets and wanting to extract specific pieces of data for com-
parison. For example, on the topic of perceived benefits, we debated whether
to analyze our qualitative data using predetermined topic codes derived from
the closed-ended items to produce data that were easily comparable or to
conduct a separate thematic analysis to capture the richness of the qualitative
data. In the context of our exploratory purpose, we opted to conduct an initial
thematic analysis that began with open coding. This approach ensured that
we would find areas of convergence and also increased the likelihood of
uncovering areas of divergence, such as skills discussed in the qualitative
data that were not measured with the quantitative items. Another challenge
was deciding whether each comparison should be initiated by first examining
quantitative results or qualitative results. We found that being open to starting
with either data set encouraged us to think more deeply about meaningful
comparisons that could be made and the differing rhetorical effect that
resulted as we composed our discussion paragraphs.

Merging with a Matrix


Our use of matrices facilitated the integration of different but related quanti-
tative and qualitative results by comparing results by specific categories. These
group comparisons enhanced our understanding by bringing together quanti-
tative and qualitative data related to multiple content areas. After the initial
separate analyses, we first noted quantitative and/or qualitative results that
pointed to interesting differences in the data. We next identified meaningful
grouping categories based on these results (e.g., levels of a categorical
variable or a typology from the qualitative data). Third, we conducted further
analyses to develop a matrix such as illustrated in Figure 2 that related the
grouping categories to other results (e.g., qualitative themes or quantitative
means). We placed the grouping variable (e.g., STEM discipline in Figure 2)
along one dimension and identified meaningful results to place along the other
dimension (e.g., prominent themes and significant quotes). In addition to
choosing the dimensions, we had to select the type(s) of information to dis-
play within the cells of the matrix. Options included quantitative descriptive
information, thematic categories, normalized counts representing themes, and
sample quotes. Last, we interpreted how the resulting matrix display uncovered
similarities and differences among the cases along the various dimensions.

Disciplinea; QUAN comparison, improved skills M (SD); QUAL, most prominent
qualitative themes about benefits (% of statements) with sample quotes:

Biological Sciences (n = 6); 40.17 (3.06)b,c
- Improved Communication Skills (30%): "It also allowed me to translate very
  complicated concepts to children and it helped me work on my communication
  skills."
- Improved Teaching Skills (23%): "If I choose to go into academia, I feel that
  my teaching skills have improved. My evaluations have been significantly more
  positive since working with [the program]."
- Improved Marketability (23%): "[The program] has made me a more marketable
  scientist. I believe it has enhanced my CV."

Physical Sciences & Engineering (nQUAN = 15; nQUAL = 13); 36.07 (4.11)b
- Improved Communication Skills (37%): "Mostly, I learned a lot about
  communication. And it wasn't just about communicating with (teaching)
  students. I learned a lot about communicating with other professionals.
  Sometimes it is just difficult to communicate, but there are a lot of things
  that can be done to make it easier."
- Camaraderie and Friendship (22%): "I found the support of the other fellows to
  be very helpful on poor research days or weeks, since it often feels that the
  atmosphere is somewhat competitive within my own department."
- Improved Confidence (14%): "By gaining the ability to talk science with
  everyone, I am a more confident, coherent scientist in public and at work."

Mathematics (nQUAN = 14; nQUAL = 13); 34.79 (2.67)c
- Improved Teaching Skills (39%): "One of the main goals of a Ph.D. in
  Mathematics is the betterment of one's teaching ability. I simply cannot say
  enough about the positive effect of [the program] with respect to my teaching.
  Outside of raw experience, it has had the single most significant impact."
- Improved Marketability (23%): "Participating in [the program] has made me a
  more desirable candidate for a teaching position. I have heard that a
  credential such as this is highly desirable in the tenure track job market."
- Working with People Outside Field (16%): "I was able to teach material and to
  work with individuals outside my subject area. This will help me to be more
  effective in interdepartmental collaboration in the future."

a. An ANOVA test found a significant difference among groups for the "improved
skills" variable (F = 5.21, p = .011).
b. Univariate post hoc comparisons found significant differences between
Biological Sciences and Physical Sciences & Engineering (p = .048).
c. Univariate post hoc comparisons found significant differences between
Biological Sciences and Mathematics (p = .008).

Figure 2. Example of merging with a matrix
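
A sketch of how such a matrix can be assembled programmatically, assuming a scores table and a coded-segments table that share a participant ID (all column names are hypothetical; the authors built theirs across SPSS and MAXqda):

```python
import pandas as pd

scores = pd.DataFrame({
    "participant": ["p01", "p02", "p03", "p04"],
    "discipline": ["Biology", "Biology", "Math", "Math"],
    "improved_skills": [41, 39, 35, 34],
})
segments = pd.DataFrame({
    "participant": ["p01", "p02", "p03", "p04"],
    "theme": ["communication", "teaching", "teaching", "marketability"],
    "quote": ["...", "...", "...", "..."],
})

# QUAN dimension: group means (and SDs) of the continuous variable.
quan = scores.groupby("discipline")["improved_skills"].agg(["mean", "std"])

# QUAL dimension: theme frequencies as % of each group's coded statements.
seg = segments.merge(scores[["participant", "discipline"]], on="participant")
qual = (seg.groupby(["discipline", "theme"]).size()
           .groupby(level=0).transform(lambda s: 100 * s / s.sum()))

print(quan)
print(qual)
```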
Different techniques. We identified three techniques for using matrices to
merge the two data sets in response to our research questions (see Table 1).
These techniques differ in terms of how the grouping categories are derived
and the information displayed within the matrix cells. The simplest applica-
tion of this strategy was to explore differences in the qualitative findings
based on levels of a categorical variable (e.g., qualitative comments about
self-confidence by men vs. women).
We found two additional techniques for applying matrices in our study.
The second way we used a matrix was to qualitatively explore a statistically
significant difference in the quantitative results. We were interested to learn
how the perceptions of program benefits varied by certain characteristics, such
as disciplinary field. Through our quantitative analyses, we found significant
differences among disciplines (F = 5.21, p = .011), and univariate post hoc
tests indicated that the biologists’ perceptions differed significantly from those
of the physical scientists and engineers and the mathematicians. We developed
one matrix that displayed quantitative group means and qualitative themes
related to benefits as a function of discipline. The resulting matrix relates a
categorical variable, means for a continuous variable, frequency of occur-
rence of qualitative themes, and illustrative quotes (see Figure 2). This matrix
helped us understand the different ways that individuals benefited as well as
the disciplinary contexts for those perceived benefits (e.g., viewing one’s
discipline as competitive or placing a high value on teaching). A third tech-
nique emerged from situations where we examined results in terms of a grouping
variable that we derived from the qualitative data. We grouped participants
by types of negative experiences (program tensions vs. loss of disciplinary
experience) and then compared the groups in a matrix in terms of their quan-
titative scores.
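
The quantitative side of the second technique maps onto a standard one-way ANOVA with post hoc comparisons; a hedged sketch with made-up group data (the F = 5.21 reported above came from the authors' SPSS analysis, not this code, and SPSS offers several post hoc procedures beyond the Bonferroni-adjusted t tests shown here):

```python
from scipy import stats

# Hypothetical "improved skills" scores per disciplinary group.
bio = [41, 43, 38, 40, 42, 37]
phys = [36, 33, 39, 35, 34, 38, 36]
math = [34, 36, 32, 35, 33, 37]

f, p = stats.f_oneway(bio, phys, math)
print(f"ANOVA: F = {f:.2f}, p = {p:.3f}")

# Post hoc pairwise comparisons with a simple Bonferroni adjustment.
pairs = {"bio vs phys": (bio, phys), "bio vs math": (bio, math),
         "phys vs math": (phys, math)}
for name, (a, b) in pairs.items():
    t, p_raw = stats.ttest_ind(a, b)
    print(f"{name}: adjusted p = {min(1.0, p_raw * len(pairs)):.3f}")
```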
Methodological challenges. Several challenges occurred as we merged our
results in matrices. First, this strategy requires identifying a grouping variable
from one data set that corresponds to meaningful differences in the other data
set. As we weighed different possibilities, we were reminded that the deci-
sion as to what is meaningful can be made only within the context of the study’s
purpose. This became important to emphasize because so many potential com-
parisons exist between the two data sets, and we found that without care, the
ease of applying this strategy could outweigh the use of the strategy to answer
specific questions. Although straightforward to apply for categorical vari-
ables, this strategy is more challenging when applied to continuous variables.
In one context, we found it helpful to convert participants’ scores into nor-
malized scores for grouping purposes. We note that both SPSS and MAXqda
have powerful capabilities for analyzing data by grouping variables. For example,
by using the "attribute" feature of MAXqda, we were able to examine
our qualitative results in sets based on a quantitative variable, such as com-
paring the themes discussed by individuals representing each discipline. We
found, however, that summarizing the results in a matrix requires much tedious
copying and pasting among the different software programs.
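
For the continuous-variable case, one concrete option consistent with the normalized-scores idea is to standardize scores and cut them into bands that then serve as a grouping variable; a sketch with hypothetical data:

```python
import pandas as pd

scores = pd.Series([34, 41, 38, 29, 44, 36, 40], name="improved_skills")

# Convert to z scores, then bin into low / average / high groups.
z = (scores - scores.mean()) / scores.std(ddof=1)
groups = pd.cut(z, bins=[float("-inf"), -0.5, 0.5, float("inf")],
                labels=["low", "average", "high"])
print(pd.concat([scores, z.round(2), groups], axis=1,
                keys=["score", "z", "group"]))
```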

Merging by Data Transformation


Our process of merging by data transformation included several steps (see
Figure 3). This process started when insights from our initial analyses led us
to pose a question that called for a new variable. Our next step was to identify
whether we had qualitative (or a combination of qualitative and quantitative)
information that could serve as an indicator for the new variable with further
analyses. Once the information was identified, we developed a rubric for
scoring this new variable and assigned a score to each participant. Because of
concern about the validity of the new variable, we employed strategies to estab-
lish evidence for its validity. These strategies included using multiple raters
during the scoring process, ensuring a high interrater agreement (at least 90%),
checking the characteristics of the new variable by evaluating its descriptive
statistics, and using convergent and discriminant validity procedures. Once
we were satisfied that the new variable provided useful information, we used
the new quantitized variable in statistical analyses with other quantitative vari-
ables and reported the results.

Step 1. New insights call for creating a new QUAN variable. (In the study:
quantitative data were positively skewed, but qualitative data indicated that
there were negative aspects to individuals' experience in the program.)
Step 2. Identify the QUAL information that will serve as an indicator for the
new variable. (In the study: four themes that emerged from the qualitative
analysis provided indicators of negative aspects of the experience: program
tensions, negative emotions, degree took more time, and loss of disciplinary
experience.)
Step 3. Develop and apply a rubric for quantitizing the QUAL information. (In
the study: a rubric assigned each participant a "negative aspects" score based
on the number of unique statements coded within the negative themes, yielding
the new variable "negative impacts," M = 2.82, SD = 1.65.)
Step 4. Establish evidence for validity of the new QUAN variable. (In the study:
assessed interrater reliability and resolved discrepancies, achieving 100%
agreement; tested the hypothesis that "negative aspects" correlates negatively
with "encourage others to participate"; hypothesis supported, rS = -.483,
p = .004.)
Step 5. Conduct analyses and report QUAN results. (In the study: tested whether
the two groups, positive versus very positive overall impact, differed on the
negative aspects variable; a significant difference was found, Mann-Whitney
U = 22.5, p = .004.)

Figure 3. Example of merging by transforming qualitative data
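
A sketch of the scoring and agreement checks as we read them, with hypothetical rater data (the 90% threshold is from the text; simple percent agreement is shown, though chance-corrected measures such as Cohen's kappa are also common):

```python
import pandas as pd

# Each rater independently assigns the rubric score to every participant.
rater_a = pd.Series([3, 1, 0, 2, 4], index=["p01", "p02", "p03", "p04", "p05"])
rater_b = pd.Series([3, 1, 1, 2, 4], index=rater_a.index)

# Simple interrater agreement: proportion of identical scores.
agreement = (rater_a == rater_b).mean()
print(f"agreement = {agreement:.0%}")  # discuss and resolve if below 90%

# Discrepant cases to resolve through discussion before final scoring.
print(rater_a[rater_a != rater_b])
```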
Different techniques. We used data transformation for two reasons: to develop
new variables when the variables we needed did not exist and to construct better
variables when existing ones were inadequate (see Table 1). For example, when designing
the questionnaire, we expected to find a range of scores in our quantitative
data, but instead the results were positively skewed across all items. Our
qualitative analyses, however, uncovered data about some negative experi-
ences with participating in the program. Therefore, we decided to transform
qualitative themes into a new quantitative variable (“negative impacts”) to
help us further understand the perceived impact of participation in the pro-
gram (see Figure 3). Because we did not have a theory to guide our quantitizing
of the negative experiences, we chose to count the number of statements
coded into the negative impacts subthemes (e.g., programmatic tensions and
loss of disciplinary experience) as an indication of the extent of a partici-
pant’s negative experience with the program. We proceeded to relate this new
quantitized variable to other quantitative variables in correlational analyses
for validation purposes. Finally, we established that there is a significant dif-
ference in the number of negative comments made by those who rated the
overall program impact as positive versus very positive. Therefore, we con-
cluded that participants found program participation to be a positive experience
overall, but for some this experience was tempered by their perceptions of
corresponding negative impacts.
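
The two validation analyses named above correspond to standard nonparametric tests; a minimal scipy sketch with invented arrays (the reported rS = -.483 and U = 22.5 are the authors' results, not outputs of this code):

```python
from scipy.stats import spearmanr, mannwhitneyu

# Hypothetical data: quantitized negative-aspects counts and a 1-5 rating
# of "would encourage others to participate".
negative_aspects = [0, 1, 1, 2, 3, 4, 5, 2, 0, 3]
encourage_others = [5, 5, 4, 4, 3, 3, 2, 4, 5, 3]

rs, p = spearmanr(negative_aspects, encourage_others)
print(f"Spearman rS = {rs:.3f}, p = {p:.3f}")

# Compare negative-aspects scores for "very positive" vs. "positive" raters.
very_positive = [0, 1, 1, 0, 2]
positive = [3, 4, 2, 5, 3]
u, p = mannwhitneyu(very_positive, positive)
print(f"Mann-Whitney U = {u}, p = {p:.3f}")
```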
A second use for data transformation emerged when we realized that one of
our existing quantitative variables was inadequate. We included items that
asked for participants’ actions and intentions in working with K–12 education
in the past, present, and future. The data about participants’ actions in the past
indicated that many participants had misunderstood the phrasing of the item.
Therefore, we were uncertain about the validity of the scores representing
their future intentions. Caracelli and Greene (1993) describe a technique of
consolidating quantitative and qualitative information into a new variable.
This technique is similar to data transformation, with the exception that more
than one type of data is considered to generate the new variable. We created a
new variable for all participants based on their quantitative responses about
future intentions and their open-ended qualitative comments that discussed
their future intentions. We developed a rubric for examining the extent of evi-
dence for future involvement in K–12 education in both the quantitative and
the qualitative information and assigned a score (1–4) indicating the level of
evidence (none, little, moderate, extensive). We used this consolidated variable
in further analyses to identify variables that predict future involvement.
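
A sketch of the consolidation logic, combining a closed-ended rating with counted open-ended statements into the 1-4 evidence score; the thresholds and names here are hypothetical illustrations, not the authors' actual rubric:

```python
def consolidate(quant_intent: int, qual_statements: int) -> int:
    """Return 1-4 (none, little, moderate, extensive) evidence of future
    K-12 involvement, combining a 1-5 rating with a count of qualitative
    statements (hypothetical thresholds)."""
    evidence = 0
    if quant_intent >= 4:
        evidence += 2          # strong closed-ended endorsement
    elif quant_intent == 3:
        evidence += 1
    if qual_statements >= 2:
        evidence += 2          # repeated open-ended mentions
    elif qual_statements == 1:
        evidence += 1
    return min(4, 1 + evidence)  # map evidence points onto the 1-4 scale

print(consolidate(quant_intent=5, qual_statements=2))  # -> 4 (extensive)
print(consolidate(quant_intent=2, qual_statements=0))  # -> 1 (none)
```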
Methodological challenges. We found that data transformation helped us take
advantage of important insights that emerged from our analyses but that this
technique should be used with care. This strategy requires the researcher to
clearly identify the need for a transformed or consolidated variable and then
determine whether sufficient information is available on which to base the
new variable. This procedure also requires developing standard procedures for
transforming qualitative information into a quantitative score in a reliable and
valid way. One issue that emerged in our analysis was the need to set proce-
dures for handling cases that had missing qualitative data. Once the variable
was scored, we also needed to consider its characteristics to determine what
statistical tests were appropriate (e.g., using nonparametric statistics) and the
strategies that could be implemented to establish evidence for the reliability
and validity of the quantitized variable. Finally, we found that we must use cau-
tion when interpreting the statistical results from the new variable to keep from
overgeneralizing its meaning, as the variable may have limited reliability given the
unsystematic nature of the unstructured data on which it is based.

Discussion
Our goal for this article was to carefully examine our practice as we integrated
our quantitative and qualitative data within one concurrent mixed methods
study about individuals' perceptions of the impact of their participation in a
nontraditional STEM graduate education program. From this practice, we have
described how we applied three merging strategies. These descriptions included
the process steps for implementing the strategy and sample products. In addi-
tion, we delineated different techniques for using each strategy that resulted
from our study’s research questions and identified different challenges that
we confronted during our implementation. These descriptions suggest guide-
lines for using the three strategies as well as issues that can be anticipated in
the research design and analysis process.
These results contribute to our understanding of strategies available for
merging the quantitative and qualitative components of a concurrent mixed
methods study. The strategies themselves are not new (Caracelli and Greene
1993; Miles and Huberman 1994; Tashakkori and Teddlie 1998; Onwueg-
buzie and Teddlie 2003; Creswell and Plano Clark 2007). This work, however,
went beyond general descriptions to identify eight specific uses by which the
strategies contributed to one study’s integration. Researchers considering the
use of merging strategies can use this information to weigh each strategy’s
utility in this study as they select one or more strategies for their own study’s
context. That is, it is not that one strategy is better than another, but they are
each suited to address different kinds of questions.
This examination of different merging strategies highlights the complexity
inherent in most applications of mixed methods research in practice. Much
attention in the literature has focused on what is mixed in mixed methods stud-
ies. Interestingly, we came to realize that each of the merging strategies that we
utilized mixed at a different level. The process of merging by data transforma-
tion focused on merging at the level of the data. The development of a matrix
facilitated merging at the level of results. Finally, merging by discussion called
for us to merge during our larger interpretations. We also found that the strate-
gies were useful for examining one (discussions), two (matrices), or more (data
transformation) substantive topics in addition to merging the two types of data.
Adding to this complexity is the number of unique techniques that we
employed in this one, relatively simple, mixed methods study. Mixed meth-
ods writers have enumerated different techniques that call for a mixed methods
approach (Greene, Caracelli, and Graham 1989; Bryman 2006), but we found
that one individual study can effectively utilize more than one or two spe-
cific techniques. We also found that many of our research questions as listed
in Table 1 emerged after we had completed our initial analyses and went beyond
our originally planned analyses. Bryman (2006) noted that the emergence of
new questions and ways to integrate may be common within mixed methods
studies, and our work supports this conjecture.
Most discussions of mixed methods designs emphasize the timing of the
two methods and whether they are used in a sequence or concurrently (e.g.,
Morse 1991; Creswell and Plano Clark 2007; Teddlie and Tashakkori 2009).
Although we describe the overall design of this study as a concurrent approach,
it is clear that our analytic processes and interpretations drew on the databases
in both concurrent and sequential ways as the analyses moved from being more
separate (when merging in a discussion) to more integrative (when merging
with a matrix or by data transformation). Therefore, the notion of the timing within
a mixed methods design may not be as simple as one overall classification.
In light of these reflections, we also recognize limitations in our project.
For one, we considered only strategies for merging and did not consider other
integration strategies such as those used in sequential mixed methods
approaches. Our overarching purpose was exploratory, and the extent of the
two databases constrained the types of analyses that we could implement. We
had a relatively small sample size for advanced statistical analyses and a rela-
tively large sample size for in-depth qualitative exploration, a common issue
in many mixed methods studies. Although respondents provided extensive
responses for the open-ended items, our qualitative database was also limited
by the fact that we were unable to probe for further details. This limits the
depth of our understanding of the participants’ perspectives and raises con-
cern about the adequacy of the database used for the data transformation
procedures in particular. In addition, it is possible that there was an interac-
tion between the closed- and open-ended items that biased the participants’
responses, as found in another study using a questionnaire for collecting both
data forms (Vitale, Armenakis, and Field 2008). Although the generalizabil-
ity and transferability of the results to other nontraditional programs are
limited, our process of merging two data sets may be independent of the
study’s specific data. So similar procedures, questions, and issues may occur
in studies using more sophisticated quantitative and qualitative analyses.
Despite its limitations, this study gave us a better and more nuanced
understanding of the perspectives of the alumni of this one nontraditional
program. Through merging the two sets of data, we were able to examine in
greater detail both positive and negative aspects of program participation, to
develop a more complete description of how participants rate the experiences
as well as the reasons behind these ratings, and to explore differences among
the participants’ perceptions and experiences. We also identified numerous
methodological issues related to our merging strategies that warrant further
examination such as handling missing qualitative information in data trans-
formation, developing procedures for validating transformed data, and
examining techniques for identifying and utilizing divergent information. In
addition, further work needs to examine integration strategies beyond those
classified as merging.

In conclusion, this methodological discussion attempted to add to our prac-
tical knowledge about strategies for integrating within a single content study.
It related three literature-based merging strategies to specific research ques-
tions and identified a set of techniques within each of the general approaches.
We expect that the ingenuity of other researchers will lead to additional uses
for these strategies, and we hope these applications will be documented so
that our knowledge of the strategies will continue to grow. Unfortunately, as
found in a recent research review (O’Cathain, Murphy, and Nicholl 2007),
many researchers using mixed methods currently are not taking advantage of
even basic strategies for integration. As several scholars have discussed the
importance of having exemplars of translating discussions in the literature
to practice (e.g., Morse 2006; Bryman 2007; O’Cathain, Murphy, and Nicholl
2007), we hope that this discussion of our process provides a resource for
researchers who are unsure how such strategies can be applied in their mixed
methods studies.

Appendix

Here are three exemplar items from the online questionnaire. A copy of the
instrument can be requested from the authors.

A. An item that included a closed-ended and an open-ended part:
   4. My overall rating of the impact of participating in [the program]
      is . . . (very positive, positive, neutral, negative, very negative).
      Please describe the impact.
B. A closed-ended item:
   6. Indicate your agreement with the following statements:
      (j) Working with [the program] reinforced my decision to pursue
      a degree in math, science or engineering. (strongly agree, agree,
      neither agree nor disagree, disagree, strongly disagree)
C. An open-ended item:
   11. In what ways do you see your professional and scientific career
       being different because of your experiences in [the program]?

Acknowledgment
This article is based on a paper presented at the 2008 annual meeting of the American
Educational Research Association, New York. We thank our colleagues in the Office
of Qualitative and Mixed Methods Research at the University of Nebraska–Lincoln,
Abbas Tashakkori, and three anonymous reviewers for their constructive feedback on
previous drafts of this article.

Declaration of Conflicting Interests


The authors declared no potential conflicts of interest with respect to the authorship
and/or publication of this article.

Funding
The authors disclosed receipt of the following financial support for the research and/
or authorship of this article: This work was supported in part by the National Science
Foundation (DGE-0338202). The opinions, views, and conclusions expressed in this
article may not reflect those of the funding agency.

References
Bryman, A. 2006. Integrating quantitative and qualitative research: How is it done?
Qualitative Research 6 (1): 97–113.
Bryman, A. 2007. Barriers to integrating quantitative and qualitative research. Jour-
nal of Mixed Methods Research 1 (1): 8–22.
Buck, G. A., D. L. Leslie-Pelecky, Y. Lu, V. L. Plano Clark, and J. W. Creswell. 2006.
Self-definition of women experiencing a nontraditional graduate fellowship pro-
gram. Journal of Research in Science Teaching 43 (8): 852–73.
Caracelli, V. J., and J. C. Greene. 1993. Data analysis strategies for mixed-method
evaluation designs. Educational Evaluation and Policy Analysis 15 (2):
195–207.
Creswell, J. W., and V. L. Plano Clark. 2007. Designing and conducting mixed meth-
ods research. Thousand Oaks, CA: Sage.
Creswell, J. W., V. L. Plano Clark, and A. Garrett. 2008. Methodological issues in con-
ducting mixed methods research designs. In Advances in mixed methods research:
Theories and applications, ed. M. M. Bergman, 66–83. London: Sage.
Creswell, J. W., V. L. Plano Clark, M. Gutmann, and W. Hanson. 2003. Advanced
mixed methods research designs. In Handbook of mixed methods in social and
behavioral research, ed. A. Tashakkori and C. Teddlie, 209–40. Thousand Oaks,
CA: Sage.
Fetters, M. D., T. Yoshioka, G. M. Greenberg, D. W. Gorenflo, and S. Yeo. 2007.
Advance consent in Japanese during prenatal care for epidural anesthesia during
childbirth. Journal of Mixed Methods Research 1 (4): 333–65.
Greene, J. C. 2007. Mixed methods in social inquiry. San Francisco: Jossey-Bass.
Greene, J. C. 2008. Is mixed methods social inquiry a distinctive methodology?
Journal of Mixed Methods Research 2 (1): 7–22.
Greene, J. C., and V. J. Caracelli, eds. 1997. Advances in mixed-method evaluation:
The challenges and benefits of integrating diverse paradigms. New directions for
evaluation, No. 74. San Francisco: Jossey-Bass.
Greene, J. C., V. J. Caracelli, and W. F. Graham. 1989. Toward a conceptual frame-
work for mixed-method evaluation designs. Educational Evaluation and Policy
Analysis 11 (3): 255–74.
Hanson, W. E., J. W. Creswell, V. L. Plano Clark, K. S. Petska, and J. D. Creswell.
2005. Mixed methods research designs in counseling psychology. Journal of Coun-
seling Psychology 52 (2): 224–35.
Idler, E. L., S. V. Hudson, and H. Leventhal. 1999. The meanings of self-ratings of
health: A qualitative and quantitative approach. Research on Aging 21 (3): 458–76.
Lee, Y., and J. C. Greene. 2007. The predictive validity of an ESL placement test:
A mixed methods approach. Journal of Mixed Methods Research 1 (4): 366–89.
McAuley, C., N. McCurry, M. Knapp, J. Beecham, and M. Sleed. 2006. Young fami-
lies under stress: Assessing maternal and child well-being using a mixed-methods
approach. Child and Family Social Work 11 (1): 43–54.
Miles, M. B., and A. M. Huberman. 1994. Qualitative data analysis: An expanded
sourcebook. 2nd ed. Thousand Oaks, CA: Sage.
Mizrahi, T., and B. B. Rosenthal. 2001. Complexities of coalition building: Leaders’
successes, strategies, struggles, and solutions. Social Work 46 (1): 63–78.
Morse, J. M. 1991. Approaches to qualitative-quantitative methodological triangula-
tion. Nursing Research 40 (2): 120–23.
Morse, J. M. 2006. The politics of developing research methods. Qualitative Health
Research 16 (1): 3–4.
O’Cathain, A., E. Murphy, and J. Nicholl. 2007. Integration and publications as indica-
tors of “yield” from mixed methods studies. Journal of Mixed Methods Research 1
(2): 147–63.
Onwuegbuzie, A. J., and C. Teddlie. 2003. A framework for analyzing data in mixed
methods research. In Handbook of mixed methods in social and behavioral
research, ed. A. Tashakkori and C. Teddlie, 351–83. Thousand Oaks, CA: Sage.
Pagano, M. E., B. J. Hirsch, N. L. Deutsch, and D. P. McAdams. 2002. The transmis-
sion of values to school-age and young adult offspring: Race and gender differ-
ences in parenting. Journal of Feminist Family Therapy 14 (3/4): 13–36.
Plano Clark, V. L. 2005. Cross-disciplinary analysis of the use of mixed methods in
physics education research, counseling psychology, and primary care. PhD diss.,
University of Nebraska–Lincoln. Dissertation Abstracts International 66:02A.
Plano Clark, V. L., C. A. Huddleston-Casas, S. L. Churchill, D. O. Green, and A. L.
Garrett. 2008. Mixed methods approaches in family science research. Journal of
Family Issues 29 (11): 1543–66.
Sale, J. E., L. H. Lohfeld, and K. Brazil. 2002. Revisiting the quantitative-qualitative
debate: Implications for mixed-methods research. Quality & Quantity 36 (1): 43–53.
Tashakkori, A., and J. W. Creswell. 2007. The new era of mixed methods. Journal of
Mixed Methods Research 1 (1): 3–7.
Tashakkori, A., and C. Teddlie. 1998. Mixed methodology: Combining qualitative and
quantitative approaches. Thousand Oaks, CA: Sage.
Teddlie, C., and A. Tashakkori. 2006. A general typology of research designs featur-
ing mixed methods. Research in the Schools 13 (1): 12–28.
Teddlie, C., and A. Tashakkori. 2009. Foundations of mixed methods research. Thousand
Oaks, CA: Sage.
Vitale, D. C., A. A. Armenakis, and H. S. Field. 2008. Integrating qualitative and
quantitative methods for organizational diagnosis: Possible priming effects? Jour-
nal of Mixed Methods Research 2 (1): 87–105.
Wittink, M. N., F. K. Barg, and J. J. Gallo. 2006. Unwritten rules of talking to doc-
tors about depression: Integrating qualitative and quantitative methods. Annals of
Family Medicine 4 (4): 302–9.
Yount, K. M., and J. Gittelsohn. 2008. Comparing reports of health-seeking behavior
from the integrated illness history and a standard child morbidity survey. Journal
of Mixed Methods Research 2 (1): 23–62.

Bios
Vicki L. Plano Clark is director of the Office of Qualitative and Mixed Methods
Research, a research service center at the University of Nebraska–Lincoln in Lincoln,
NE.

Amanda Garrett is a doctoral student in educational psychology at the University of
Nebraska–Lincoln in Lincoln, NE.

Diandra L. Leslie-Pelecky is professor of physics at the University of Texas at Dallas
in Richardson, TX.
