
Rich Qualitative Data Collection Using Electronic Platforms

Melody J. Elrod
MAE 7910
University of South Florida
Introduction
Historically, qualitative researchers have relied heavily upon interviews as sources of
data, as we have in our Mentoring Study (Kersaint, Smith, Ellerbrock, Sears, & Elrod, 2013).
Interviews provide full interaction between researcher and participant, allowing the interviewer
to carefully probe the participant for a rich account of his/her experiences. Interviewing, though,
relies upon participants' memories and researchers' interviewing skills (James & Busher, 2009;
Ravert, Calix, & Sullivan, 2010), meaning we do not always get the whole story. In the case of
our study, participants are asked to report in a single session about their experiences from an
entire semester. It seems inevitable that many of those experiences are lost. Some of this loss
is mitigated by the use of focus groups, where participants can trigger each other's memories,
but ultimately, we lose a large portion of our participants' personal experiences.
In order to gather information in a timelier manner, many researchers use ongoing
surveys or questionnaires to capture changing data. These sources, however, are reliant upon
pre-determined questions and answers, robbing participants of the opportunity to respond freely.
Even questionnaires that include open-response items do not allow participants to share their
experiences organically. The static, pre-determined nature of questionnaire prompts and responses
leaves little room for participants to offer unanticipated thoughts.
In my work conducting and transcribing qualitative interviews, I have been frustrated by
the missed opportunities I have heard in the recordings and wanted to find a method of data
collection that would yield rich data representative of participants' experiences in a
continuous, accessible way. After discussing the merit of various data collection tools with my
professors and classmates, I suspected that electronic platforms might hold the answer to this
dilemma. This paper serves as an account of my investigation into electronic media for
conducting interviews asynchronously in order to collect rich qualitative data. In it, I will review
the available research and products and provide recommendations for moving forward with the
Mentoring Study.
Literature Review
In reviewing the literature, I journeyed from surveys and questionnaires through
synchronous and asynchronous interviews into the world of mobile applications. This section
will focus solely on findings from published research.
Pros of Electronic Data Collection
In a review of literature concerning the use of online (web-based) data collection,
researchers agreed on two major advantages of electronic platforms. First, participants are often
more forthcoming concerning sensitive subjects (Bowker & Tuffin, 2004; James & Busher,
2009; Robinson, 2001; Salmons, 2010; Shields, 2003; Suzuki, Ahluwalia, Arora, & Mattis,
2007). Social conventions and inhibitions are lowered, making it possible for participants to
relate ideas and comments that they might normally reserve when speaking to someone of a
particular ethnicity, gender, or age (Franklin & Lowry, 2001; O'Connor & Madge, 2003).
Second, the use of electronic platforms guarantees accurate transcripts since participants and
researchers type their interactions (e.g., Bowker & Tuffin, 2004; Shields, 2003; Suzuki et al.,
2007).
Researchers also identified several other advantages for electronic data collection,
especially concerning asynchronous (delayed) communications. When participants are engaged
asynchronously, they are able to respond at a time that is convenient to them (James & Busher,
2009; Salmons, 2010, 2012). This convenience may facilitate responses and follow-up questions
that are more considered and reflective (Dowling, 2012; Polkinghorne, 2005; Salmons, 2010,
2012). Also, unlike synchronous communications with a set beginning and end, asynchronous
communications can continue over the course of weeks, months, or even years, depending upon
the focus of the study (James & Busher, 2009; Salmons, 2012).
Cons of Electronic Data Collection
Electronic data collection is not without its disadvantages, though. A commonly
identified disadvantage is that of transcript authentication. Though the transcript will be
accurate, the source of the comments can remain in question without face-to-face recognition
(Bowker & Tuffin, 2004; Robinson, 2001). Another prevailing difficulty of online interactions
identified in the literature is the loss of non-verbal communication (Bowker & Tuffin, 2004;
Franklin & Lowry, 2001; O'Connor & Madge, 2003; Polkinghorne, 2005; Suzuki et al., 2007).
James and Busher (2009) noted that with the loss of non-verbal communication, it can be
difficult for the researcher to connect with the participant in a meaningful way. Researchers
should anticipate this difficulty and expect to invest more time in establishing rapport (Dowling,
2012). Without a real connection to the researcher, participants may become discouraged or
disinterested and drop out of the study (James & Busher, 2009; O'Connor & Madge, 2003).
Because the bulk of the data will be collected through the written word, it is also
important to recognize the balance of power created amongst participants and researchers
of varying literacy and typing skills. In forum settings, participants with stronger skills can
potentially dominate the conversation (Dowling, 2012; James & Busher, 2009), making it
difficult to collect data representative of the whole group. In individual settings, it is important
to preserve participants' vernacular (text speak, colloquialisms, incorrect word usage, etc.),
representing their ideas and comments truthfully without editing (James & Busher, 2009).
Rather than making assumptions about participants' words, researchers should follow up with
respectfully worded questions to clarify meanings as necessary.
Lastly, though the anonymity of online communications was cited as an advantage of
electronic platforms, that anonymity cannot be assumed. When participants know one another
well enough to recognize speech patterns and commonly used phrases, social conventions and
inhibitions may not be lowered as expected (Franklin & Lowry, 2001). Also, some formats like
email and commonly used forums or messaging systems may not be anonymous at all.
Communicating remotely does not imply anonymity (James & Busher, 2009).
Methods & Means for Electronic Data Collection
Researchers and participants can interact either synchronously (in real time) or
asynchronously (in delayed time). In order to meet the needs of continuous data collection that
is convenient to participants, I have focused my energies on asynchronous communications.
Ravert, Calix and Sullivan (2010) provide a breakdown of three types of asynchronous
communications. Fixed-interval reporting requires participants to respond at set times
during the day, week, or study (e.g., Fridays at 4 p.m.). Event-based reporting allows participants
to provide a response during or after a particular event (e.g., during coffee breaks). In interval-
based studies, participants respond when prompted (e.g., by a beeper or message). In order to
gather data from participants in a way that is organic, I gravitate towards event-based reporting.
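To make these distinctions concrete, I sketch below how a study might tag each response with the scheme under which it was collected, so the collection design can be audited later. This is a minimal illustration in Python; the class and field names are my own hypothetical inventions, not drawn from Ravert et al. or from any product reviewed here.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional


class ReportingScheme(Enum):
    """The three asynchronous reporting types described by Ravert, Calix, and Sullivan (2010)."""
    FIXED_INTERVAL = "fixed-interval"  # respond at set times (e.g., Fridays at 4 p.m.)
    EVENT_BASED = "event-based"        # respond during or after a chosen event
    INTERVAL_BASED = "interval-based"  # respond when signaled (e.g., by a beeper or message)


@dataclass
class Report:
    """One participant response, tagged so the collection scheme can be audited later."""
    participant_id: str
    scheme: ReportingScheme
    submitted_at: datetime
    text: str
    trigger: Optional[str] = None  # the event or signal that elicited the response, if any


# An event-based report logged right after a mentoring conversation (all values hypothetical).
example = Report(
    participant_id="PST-07",
    scheme=ReportingScheme.EVENT_BASED,
    submitted_at=datetime.now(),
    text="My CT and I debriefed today's lesson during planning period.",
    trigger="post-lesson debrief",
)
```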
In the initial stages of any qualitative study, researchers should provide participants with
protocols for involvement. Studies utilizing an electronic platform are no different. For
example, when collecting data via email, James and Busher (2009) asked participants to include
the message stream in each response with new text at the top of the message. Bolger and his
colleagues (2003) caution researchers to distribute prompts and questions sparingly to reduce the
possibility of overwhelming participants.
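One way a research team might operationalize that caution is a per-participant prompt budget. The following sketch is a minimal illustration (the weekly limit of two prompts is an invented placeholder, not a figure from Bolger and colleagues).

```python
from datetime import datetime, timedelta


def may_send_prompt(prompts_sent: list, now: datetime, max_per_week: int = 2) -> bool:
    """Return True only if another prompt keeps the participant under the weekly budget.

    prompts_sent: datetimes of prompts already delivered to this participant.
    """
    one_week_ago = now - timedelta(days=7)
    recent = [t for t in prompts_sent if t > one_week_ago]
    return len(recent) < max_per_week


# Example: two prompts already sent this week, so a third is held back.
now = datetime.now()
history = [now - timedelta(days=1), now - timedelta(days=3)]
print(may_send_prompt(history, now))  # False with the default budget of 2
```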
Selecting the means for collecting data electronically must be done with care. Bosnjak
and his colleagues (2010), in an effort to study participants' willingness to engage in mobile
surveys, reviewed Davis's technology acceptance model (TAM), established in the late 1980s with
respect to computer behavior. In this model, Davis cites perceived ease of use as the
mediating factor for users' attitudes, perceptions of usefulness, and intentions when engaging with
technology. This model is paired with the unified theory of acceptance and use of technology
(UTAUT), which highlights "performance expectancy (including perceived usefulness), effort
expectancy (which extends perceived ease of use), social influences (subjective norms and
image), and facilitating conditions" (Bosnjak, Metzger, & Graf, 2010, p. 353). Together, these
models caution researchers to consider how usable participants will find various modes of
communication.
The list of available communication methods is growing quickly. It includes message
boards, text messages (Ravert et al., 2010), email messages (Bowker & Tuffin, 2004; Dowling,
2012; James & Busher, 2009), journal/diary entries (Bolger, Davis, & Rafaeli, 2003), videos,
pictures, and document sharing. These avenues are accessed through websites, desktop
programs, and mobile applications (e.g., 20/20 Research, Inc., 2014; Fog Creek Software, 2000;
Penzu Inc, 2014; Salari, Sharpe, Khambay, & Gonzalez-Estrada, 2014). Because many of these
platforms have not yet been specifically investigated through the literature as a means for
qualitative data collection, I will review them in the Product Review section below.
Implications
Before moving on to the product review, however, let us synthesize some of the literature
as it applies to research design. Most importantly, asynchronous communication does have the
potential for providing rich qualitative data. Moving forward with the intention of using
electronic platforms to collect data asynchronously is a sound and credible decision.
In doing so, though, we must consider several factors.
First, because we will lose face-to-face interactions with our participants, we must be
intentional about establishing rapport by offering personal information about the researcher as it
pertains to research. For example, participants in the mentoring study might benefit from the
knowledge that the researchers were at one time preservice teachers seeking certification and that
the research being conducted will help to improve the experience for future students.
Next, a system for prompt and response must be established and distributed to
participants. Participants should have a clear understanding of what will be expected during the
study. Researchers should prepare a set of prompts or questions for the study and be flexible
enough to change those prompts as needed.
Finally, investigators must closely track participant feedback in order to follow up
quickly on vague or unclear responses. In doing so, they must be respectful of participants'
language skills and habits, preserving the personalities of participants while probing for clarity in
understanding participants' experiences.
Product Review
Though the literature provided needed information about using electronic platforms for
collecting rich qualitative data, many of those platforms are left unexplored by current studies.
In this section, I will use a mixture of available research and my own experiences with the
available products to discuss three major subdivisions: email, web-first products, and mobile
applications. I have intentionally addressed these subdivisions chronologically in the order of
their predominance in qualitative research.
Email
Email is the most commonly cited product in the literature (Bowker & Tuffin, 2004;
Dowling, 2012; James & Busher, 2009; Salmons, 2012; Suzuki et al., 2007). In general,
researchers use this medium to send questions or prompts to participants who then respond in
their own time. In this way, researchers use email threads to produce an accurate transcript
of the interview. Interviewers are urged to send questions in isolation in order to ask follow-up
questions and probe for more information in a natural way.
Unlike face-to-face interviews, however, participants may not respond immediately,
which is considered both an advantage and a disadvantage. Waiting for participants' responses
can be a nerve-wracking exercise for researchers as they wonder if participants have lost interest
or if the email has been lost in cyberspace (or the spam folder). In contrast, delayed participant
responses have the potential to produce more reflective, introspective data as participants
consider their experiences more carefully. Because of the time delay, email interviews can
continue for weeks, months, or even years unless the researcher sets specific time parameters for
the conversation.
As a commonly-used medium available on the web, in desktop applications, and as
mobile applications, email is also readily accessible to most participants and researchers. Little
training is required for email interviews, and protocols for such use can be very simple. It should
be noted, however, that to protect the privacy of participants and researchers, alternate email
addresses may be needed for some studies. If so, some training may be required to help
participants set up and access second email addresses on their devices.
Web-First Products
The second-most cited category of electronic products is web-first products, a term I
coined to describe developers' intentions for use. In this mobile world, many long-standing
programs and websites have been supplemented with mobile applications. This category is
meant to encompass those products whose primary function is/was web based. For example,
though I can access my bank account using my tablet and smartphone, remote access was
intended to be online via a web browser. Mobile capabilities are secondary and offer limited
access to the full online functions. All the products included in this discussion were created with
this premise. Also, for the sake of protecting participants' rights, all of these products require a
secure login and allow participants and researchers the option to include multi-media (pictures,
videos) and document sharing.
There are several types of web-first products. Most commonly found in the literature are
message boards and forums (Franklin & Lowry, 2001; OConnor & Madge, 2003; Salmons,
2010). An example of this kind of forum is Canvas (Instructure, Inc., 2011), an online learning
site used by the University of South Florida and more than 800 other educational institutions.
With Canvas or other online forums, participants can respond to prompts individually or in a
group setting. Many of these forums also offer limited-access mobile applications.
Journal and blogging sites are a second type of web-first product. With a product like
Penzu (Penzu Inc, 2014), participants respond in isolation or interact with one another by
commenting on established entries. As with forums, many journal and blogging sites have been
supplemented with limited-access mobile applications.
The final type of web-first product I will discuss is social media sites. Sites like
Facebook (Zuckerberg, Moskovitz, Saverin, McCollum, & Hughes, 2004) offer group and
messaging options that allow participants and researchers to interact either in forums or in
isolation. Though these products in general (and Facebook in particular) are supplemented with
full-access mobile applications, they are by definition social forums, meaning that participants
may not see them as a serious vehicle for sharing their experiences authentically.
Mobile Applications
The last category, mobile applications, is the newest and least cited in the literature.
Though solid support exists for using mobile products to gather qualitative data (Besara, 2012;
Bolger et al., 2003; Ravert et al., 2010), little research is available on the products
specifically designed for qualitative data collection and analysis. Indeed, until recently, these
products were used primarily for corporate marketing. The greatest benefit of mobile
applications is their constancy. Participants can respond in the moment from mobile devices
(tablets, smartphones, etc.). Though mobile devices generally elicit brief responses, many of
these applications are accompanied by website versions that allow for lengthier responses.
Examples of mobile products include Ethos (Salari et al., 2014), MyInsights (Mobile
Market Research, 2014), and QualBoard (20/20 Research, Inc., 2014). All three of these
applications were created with data collection purposes in mind and are equipped with tools that
allow the researcher to create transcripts from the data gathered in a variety of ways. For a
comparative analysis of these three products, see the appendix.
Implications
After studying and testing the various products available for qualitative data collection, I
drew several implications for our research. First, given the novel nature of mobile applications
and some web-first products, participants may require training in order to fully utilize the
features of these products. For example, Ethos's capability for responding to multiple prompts
simultaneously is not obvious to a first-time user. Likewise, researchers may need to call
attention to the various multi-media and document sharing capabilities of Canvas.
Next, the choice of product must be mediated by the type of qualitative data being
collected and the level of security needed to collect such data. When conducting a study
concerning the ways participants utilize social media to gather and share information, a product
like Facebook may be best. If researchers wish to investigate relationships between individuals
while protecting the individuals involved, however, Facebook may not offer the level of security
needed to protect participants.
Lastly, when considering ongoing, to-the-minute data collection, those products that can
be accessed on mobile devices offer the most active option. For example, while diary and
blogging sites have the potential to provide rich accounts of participants' experiences, they are
less likely to provide in-depth data on the go. Mobile applications that allow participants
to make brief notes during the day, to be expanded upon later, provide a greater opportunity
for full accounts of those experiences.
A Note about Cost
The products listed above are accompanied by various costs. Though some are free (like
email or Facebook), others are quite costly. The most expensive of the options explored are
undoubtedly the mobile applications. As these applications are accompanied by research-
specific tools, however, their costs are not directly comparable to those of web-first products like
Penzu and Canvas. I admit also that as a novice researcher, my understanding of qualitative
analysis is just forming. For these reasons, I do not include a cost analysis in my discussions of
the products I have reviewed. Decisions about cost must be made by someone who better
understands the qualitative research features of the mobile applications I have introduced.
Discussion
In order to make recommendations based on the literature and product reviews, I must
consider again the original dilemma: We want to collect rich representations of participants'
experiences and we want to find a platform for doing so that offers convenience and accessibility
throughout the study. Which product or products will satisfy both of these needs?
Recommendations
Given the capabilities of the mobile applications, I recognize this type of product as the
most likely to satisfy our needs. Mobile applications have the greatest field applicability, allowing
participants to respond in the moment and removing the need to rely upon participants'
memories. Also, considering the generation of our primary participants (pre-service teachers),
mobile applications will be the most attractive for continuous use (Bosnjak et al., 2010).
Of the three qualitative-specific applications introduced above, Ethos and QualBoard
stand out as allowing participants to follow up brief on-the-go
responses with lengthier responses via the accompanying websites. These two applications also
provide researchers with the best tools to collect participant-specific data and group-specific data
by segmenting participants into subject-specific or role-specific groups. The other platforms
(email and web-first products) lack these research tools.
Applications to Ongoing Research
Mentoring study. For the Mentoring Study (Kersaint et al., 2013), I would
recommend the following actions. First, all participants would be added to either Ethos or
QualBoard, including pre-service teachers (PSTs), collaborating teachers (CTs), university
faculty (USF), and program personnel (HP). Once added, participants would be subdivided in
several ways: by role (PST, CT, etc.), by school (Davidson, Williams, etc.), and by placement
CT (Talemantez, Weg, etc.). Before beginning to collect data, all participants would receive
application and website training either synchronously (e.g., a training session) or asynchronously
(e.g., videos or instructional narratives).
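The subdivision described above amounts to grouping one roster by several keys; Ethos and QualBoard expose this as built-in segmentation, but the underlying idea is simple. The sketch below illustrates it under my own assumptions (the roster entries are invented placeholders, not actual study participants).

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class Participant:
    name: str
    role: str          # "PST", "CT", "USF", or "HP"
    school: str        # e.g., "Davidson" or "Williams"
    placement_ct: str  # the collaborating teacher anchoring the placement


def segment(roster, attribute):
    """Group the same roster by any attribute: role, school, or placement CT."""
    groups = defaultdict(list)
    for person in roster:
        groups[getattr(person, attribute)].append(person.name)
    return dict(groups)


roster = [
    Participant("Participant A", "PST", "Davidson", "Talemantez"),
    Participant("Participant B", "CT", "Davidson", "Talemantez"),
    Participant("Participant C", "PST", "Williams", "Weg"),
]

print(segment(roster, "role"))          # {"PST": [...], "CT": [...]}
print(segment(roster, "placement_ct"))  # {"Talemantez": [...], "Weg": [...]}
```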
To begin collecting data, researchers would distribute an application protocol that
includes approximations for the number of prompts per week, the number of expected responses,
expected lengths for those responses, etc. This protocol would be submitted as a multiple-choice
question with the answers "Yes, I understand and will participate accordingly," "No, I don't
understand and would like more information before proceeding," and "I do not wish to
participate." The protocol will serve as a reference tool for both researcher and participants
throughout the study.
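Because the acknowledgment is a fixed question with fixed answers, it can be stored as structured data in whichever platform is chosen. One possible representation is sketched below; the field names and question wording are mine, not Ethos's or QualBoard's.

```python
# A hypothetical protocol-acknowledgment item, mirroring the choices described above.
protocol_item = {
    "question": "Do you understand and agree to the participation protocol?",
    "choices": [
        "Yes, I understand and will participate accordingly",
        "No, I don't understand and would like more information before proceeding",
        "I do not wish to participate",
    ],
    "allow_multiple": False,  # participants select exactly one answer
}
```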
Prompts for participant response will be created for three categories: (1) the mentoring
relationship (expectations, experiences, concerns, suggestions, etc.), (2) teaching and teacher
practice (beliefs about, understanding of; before, during, and after), and (3) the practicum
program (expectations, experiences, connections to coursework, concerns, suggestions, etc.).
Researchers will actively monitor participants' responses throughout the study and respond to
individuals with follow-up questions as needed to clarify participants' perspectives and
experiences.
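To keep that follow-up timely, responses could be tagged by category and very brief ones flagged for a clarifying question. The heuristic below is a crude sketch (the word-count threshold is an arbitrary placeholder of mine); in practice, a researcher reads every response.

```python
# The three prompt categories named above, keyed for tagging responses.
PROMPT_CATEGORIES = {
    1: "mentoring relationship",
    2: "teaching and teacher practice",
    3: "practicum program",
}


def flag_for_follow_up(response_text: str, min_words: int = 25) -> bool:
    """Flag very brief responses so the researcher can send a clarifying question."""
    return len(response_text.split()) < min_words


print(flag_for_follow_up("It went fine."))  # True: too short, worth a gentle probe
```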
Helios grant. By using Ethos or QualBoard to gather data for the mentoring study, we
also open avenues to supplement the purposes of the Helios grant for the Middle School STEM
Residency Program. By setting up an additional project or embedding a new project within the
mentoring study, we can use the placement groups described above to open a forum amongst
PSTs, CTs, and USF faculty. These new forums could be used to coordinate tasks assigned by
USF faculty that are meant to be applied in CTs classrooms and would benefit the Middle
School STEM Residency Program in three very important ways.
First, a collaborative forum would satisfy collaborating teachers' need to be better
connected to USF personnel. Over the course of the last two semesters, CTs have commented to
supervisors and program personnel that they wish they had more contact with the university.
They would like feedback on how their duties are being carried out. Though they have been sent
emails concerning the university-specific tasks PSTs are asked to perform in their classrooms,
they still request more access to the university. A continuous, easily accessible forum would
grant them that access.
Secondly, the forum would provide university faculty a more convenient way to follow
up with individual PSTs during unit and lesson planning tasks. Throughout the observers'
comments for the mathematics methods courses, we can see professors' intentions to meet
individually with PSTs and their frustration about not finding adequate time to do so.
Asynchronous, semester-long communication would provide a platform for productive, in-depth
follow up with each PST.
Lastly, and perhaps most importantly, an ongoing forum amongst the CTs, PSTs, and
university faculty would relieve tension for pre-service teachers who are working to reconcile the
theories they learn in their coursework with their application in their pre-established practicum
classrooms. For example, as they write and revise their lesson plans, many PSTs have expressed
frustration that their CTs will not allow them to do what their university professors insist they do.
PSTs are encouraged to implement inquiry-based instruction during their coursework, but are
unable to do so in classrooms that do not have established sociomathematical norms to support
such instruction. A forum that allows communication between CTs and faculty will make these
kinds of frustrations apparent to all parties and remove PSTs from the position of middlemen.
Closing Thoughts
As we work with the millennial generation in this mobile age, it is no longer sufficient
to rely upon interviews and observations to gather data that represents participants' experiences.
We must consider the interests and motivations of our participants in selecting a platform for
communication during qualitative studies. As digital natives, the millennial generation
conducts its life online. To capture that life, we must go online with them.

References
20/20 Research, Inc. (2014). QualBoard Mobile. Nashville, TN: 20/20 Research, Inc. Retrieved from http://www.2020research.com/about-us/
Besara, R. (2012). Apps for assessment: A starting point. The Reference Librarian, 53, 304–309.
Bolger, N., Davis, A., & Rafaeli, E. (2003). Diary methods: Capturing life as it is lived. Annual Review of Psychology, 54, 579–616.
Bosnjak, M., Metzger, G., & Graf, L. (2010). Understanding the willingness to participate in mobile surveys: Exploring the role of utilitarian, affective, hedonic, social, self-expressive, and trust-related factors. Social Science Computer Review, 28(3), 350–370.
Bowker, N., & Tuffin, K. (2004). Using the online medium for discursive research about people with disabilities. Social Science Computer Review, 22(2), 228–241.
Dowling, S. (2012). Online asynchronous and face-to-face interviewing: Comparing methods for exploring women's experiences of breastfeeding long term. In J. Salmons (Ed.), Cases in Online Interview Research (pp. 277–302). Thousand Oaks, CA: SAGE Publications, Inc.
Fog Creek Software. (2000). Trello. Retrieved July 3, 2014, from https://trello.com/
Franklin, K. K., & Lowry, C. (2001). Computer-mediated focus group sessions: Naturalistic inquiry in a networked environment. Qualitative Research, 1(2), 169–184.
Instructure, Inc. (2011). Canvas. Salt Lake City, UT: Instructure, Inc.
James, N., & Busher, H. (2009). Online Interviewing. London: SAGE Publications, Inc. Retrieved from http://srmo.sagepub.com.ezproxy.lib.usf.edu/view/online-interviewing/SAGE.xml
Kersaint, G., Smith, J. J., Ellerbrock, C., Sears, R., & Elrod, M. J. (2013). Examining the mentorship relationship between collaborating teachers and preservice teachers during early field experiences (Study protocol). Tampa, FL: University of South Florida.
Mobile Market Research. (2014). MyInsights. Amsterdam: Mobile Market Research. Retrieved from http://www.mobilemarketresearch.net/myinsights
O'Connor, H., & Madge, C. (2003). Focus groups in cyberspace: Using the internet for qualitative research. Qualitative Market Research: An International Journal, 6(2), 133–143.
Penzu Inc. (2014). Penzu.com. Retrieved from www.penzu.com
Polkinghorne, D. E. (2005). Language and meaning: Data collection in qualitative research. Journal of Counseling Psychology, 52(2), 137–145.
Ravert, R. D., Calix, S. I., & Sullivan, M. J. (2010). Research in brief: Using mobile phones to collect daily experience data from college undergraduates. Journal of College Student Development, 51(3), 343–352.
Robinson, K. M. (2001). Unsolicited narratives from the Internet: A rich source of qualitative data. Qualitative Health Research, 11(5), 706–714.
Salari, S., Sharpe, S. P., Khambay, V., & Gonzalez-Estrada, B. (2014). Ethos. Chatham, United Kingdom: Ethos. Retrieved from https://www.ethosapp.com/#home
Salmons, J. (2010). Online Interviews in Real Time. Thousand Oaks, CA: SAGE Publications, Inc.
Salmons, J. (2012). Designing and conducting research with online interviews. In J. Salmons (Ed.), Cases in Online Interview Research (pp. 1–30). Thousand Oaks, CA: SAGE Publications, Inc.
Shields, C. (2003). Giving voice to students: Using the internet for data collection. Qualitative Research, 3(3), 397–414.
Suzuki, L. A., Ahluwalia, M. K., Arora, A. K., & Mattis, J. S. (2007). The pond you fish in determines the fish you catch: Exploring strategies for qualitative data collection. The Counseling Psychologist, 35(2), 295–327.
Zuckerberg, M., Moskovitz, D., Saverin, E., McCollum, A., & Hughes, C. (2004). Facebook. Menlo Park, CA: Facebook, Inc.


Appendix

Comparative analysis of Ethos, MyInsights, and QualBoard

Features compared across the three products:

Researcher Resources
Participants can be segmented into groups in multiple ways
Prompts can be assigned to time intervals
Prompts can be assigned to categories
Ease of use

Participant Responses
Participants can respond multiple times to one prompt (mobile application; web)
Participants can change their responses (optional)
Participants can see others' responses (optional)
Participants can upload multi-media responses
Participants can respond with web-based resources
Participants can respond unprompted (mobile)
Participants can respond unprompted (web)
Possible collaboration amongst CT/PST/USF

Overall Look: Modern (M) or Somewhat Dated (D)
Application: Ethos D, MyInsights M, QualBoard M
Desktop Site: Ethos M, MyInsights D, QualBoard M

User Support: Fast/Immediate (F) or Slow (S)
Ethos F, MyInsights S, QualBoard F

App Cost

Ethos
Currently in unlimited free trial
$1.03/day for Student project
$13.71/day for In-House
$42.85/day for Non-Profit

MyInsights
Currently in a free trial
Text - $3.39/participant/week
Photo - $6.78/participant/week
Video - $10.18/participant/week

QualBoard
Currently in a one-month free trial
$600 for days 1-3
$100/day for days 4-10
$10/day for days 11+

