
Exploratory Study of Deaf Individuals' Use of Technology and Its Usability in Emergency and Everyday Situations

Jen Adam, DePaul University, School of CDM, 1 East Jackson Blvd., Chicago, IL 60604, jlynnadam@gmail.com
Valerie Fenster, DePaul University, School of CDM, 1 East Jackson Blvd., Chicago, IL 60604, vmfenster@aol.com
Jason Friedlander, DePaul University, School of CDM, 1 East Jackson Blvd., Chicago, IL 60604, jfrites@gmail.com
Jeri Herrera, DePaul University, School of CDM, 1 East Jackson Blvd., Chicago, IL 60604, jeriherrera@gmail.com

ABSTRACT
A large number of deaf and hearing-impaired people rely on mobile devices and relay services, such as smartphone applications, text and video messaging and video remote interpreting, to aid communication in everyday situations. Despite these advancements, people who are hearing impaired face unique challenges in being alerted to emergency situations and important environmental sounds, such as sirens, alarms and informational announcements. Many devices exist on the market today to alert the hearing-impaired to these and other sounds; however, the literature indicates there is still a need for improvements to alerting technology. Through interviews with hearing-impaired people, we explored the use and desired features of technology available to individuals in home and non-home environments. Participants reported that existing mobile technology tends to lack sufficient vibration strength to be effective. We also found that a majority of the people we interviewed were resistant to new alerting technology.
Author Keywords

Deaf; emergency alerts; mobile devices; videophones; flashers; vibration; video relay service; video remote interpretation; ASL.
ACM Classification Keywords

H.5.2 User Interfaces


General Terms

Human Factors; Design; Languages; Performance; Reliability.


INTRODUCTION

In the U.S., there are approximately 37 million deaf and hearing-impaired people, with an estimated one in ten living with some degree of hearing loss [3].

Sound awareness tools help hearing-impaired people know when doorbells sound, phones ring and alarm clocks go off through vibration sensing, flashing lights and visual displays [5]. Deaf users regularly use mobile text devices such as BlackBerries and Sidekicks to text message each other. Wearable technology, such as hearing aids, makes it easier for Deaf users to communicate with hearing individuals and with each other [2, 7], yet of those people in the U.S. who could benefit from wearing a hearing aid, only one out of five actually wears one [3]. Developers have also made advances in the compression of sign language video so that Deaf users can communicate over telephone lines [2] using American Sign Language (ASL). Since ASL can be communicated at the same rate as spoken language [2], hearing-impaired users can make and receive phone calls using video displays and FaceTime-style technology. Video mobile phones make it possible for Deaf people to communicate in their native sign language [2], and video recording and conferencing software allows both hearing and non-hearing individuals to communicate with one another easily. Much progress has also been made to support communication between deaf and hearing people, including automatic recognition of sign language using computer vision techniques and translation of spoken language into text signed by an avatar [5].

Despite the advancements that allow Deaf users to communicate with one another, catastrophes can still leave these individuals unaware. During 2005's Hurricane Katrina, two of the primary reasons people did not evacuate were that they had a disability themselves or were the family member of someone with a disability; about one third of those who did not leave their homes during the disaster were people with a disability. A lesson learned from Hurricane Katrina and other recent disasters is that the special needs of people with disabilities must be integrated into all aspects of emergency management [4].

Hearing-impaired people face unique challenges in emergency situations, e.g., when an emergency vehicle approaches. Because they cannot hear, they may miss important safety cues: they may inadvertently walk in front of oncoming traffic or miss an important knock at the door. In public spaces or unfamiliar places, they may be unaware of fire alarms or public service announcements. Kelchner (2012) reported that Deaf drivers must rely on their sense of sight to warn them of dangers while on the road [6].

In an effort to understand how Deaf people currently find out about emergency situations, we conducted a study to explore how assistive technology is used in home and non-home environments. This paper will show that the majority of our participants are resistant to new technology: some feel they are deaf and don't need to receive more information, while others feel the technology they already use is adequate. Nevertheless, they do not receive adequate information about emergency situations. When outside the home, they rely heavily on visual cues to stay informed about their environment. They are exceptionally cognizant of people running or people who look agitated or panicked; this generally alerts them to potentially dangerous situations and prompts them to seek more information or run for cover. Our study shows that assistive technology alone does not offer Deaf people the communication channels needed to acquire public service announcements or emergency information, and federal and local agencies do not do an adequate job of disseminating critical emergency or public service announcements to Deaf people. According to our participants, assistive technology devices do not offer this information.
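To make the sound-awareness pattern described above concrete, the following minimal Kotlin sketch monitors a phone's microphone level and fires a callback when a loud ambient sound (a siren, alarm or knock) is detected; the callback could flash a light, drive a peripheral visual display or trigger a vibration. The class name, RMS threshold and callback are illustrative assumptions rather than any system from the literature, and a real app would need the RECORD_AUDIO permission.

import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder
import kotlin.math.sqrt

// Illustrative sketch: watch the microphone level and notify the caller when a loud
// ambient sound is detected. Requires the RECORD_AUDIO runtime permission.
class AmbientSoundMonitor(
    private val onLoudSound: (Double) -> Unit,  // e.g. flash a lamp icon or trigger a vibration
    private val rmsThreshold: Double = 3000.0   // illustrative value; needs per-device tuning
) {
    private val sampleRate = 16_000
    @Volatile private var running = false

    fun start() {
        val bufferSize = AudioRecord.getMinBufferSize(
            sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT)
        val recorder = AudioRecord(
            MediaRecorder.AudioSource.MIC, sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize)
        running = true
        Thread {
            val buffer = ShortArray(bufferSize)
            recorder.startRecording()
            while (running) {
                val read = recorder.read(buffer, 0, buffer.size)
                if (read > 0) {
                    // Root-mean-square amplitude of the captured frame
                    val rms = sqrt(buffer.take(read).sumOf { s -> s.toDouble() * s.toDouble() } / read)
                    if (rms > rmsThreshold) onLoudSound(rms)
                }
            }
            recorder.stop()
            recorder.release()
        }.start()
    }

    fun stop() { running = false }
}

A production tool would use a trained sound classifier rather than a raw loudness threshold, but even this simple loop captures the alerting pattern that sound awareness tools such as those in [5] provide.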
METHODS

Participants

Part.   Age       Gender   Primary Communication Method Used
P1      21 - 31   Male     ASL
P2      32 - 42   Female   ASL
P3      43 - 53   Female   ASL
P4      13        Female   Bluetooth microphone, reading lips
P5      32 - 42   Male     Reading lips, ASL interpreter, writing, Purple

Table 1. Participant demographics and primary communication method used.

Part.   Hearing Technology Used    Assistive Technology Used
P1      None                       Video Relay Service (VRS), vLog, iPhone
P2      None                       Android, VRS, vLog, ASL interpreting, Tango, Skype
P3      None                       BlackBerry, VRS, vLog, ASL
P4      Cochlear implant           Bluetooth microphone, cell phone
P5      Hearing aid in both ears   iPhone, Bluetooth, Purple app, iMessage, Video Remote Interpreting (VRI), video mail, Tango, Skype

Table 2. Participant hearing and assistive technology used.

Three female and two male participants took part in our study. All five participants were profoundly deaf and ranged in age from 13 to 53. The 13-year-old participant used a cochlear implant. Three of the four adults did not use any type of hearing device, while the fourth used hearing aids in both ears and could not hear without them. The participant with a cochlear implant considered herself hearing-enabled and did not use American Sign Language (ASL); in school, her teachers use a Bluetooth microphone headset to transmit sound to the receiver in her implant. The four adult participants all use ASL to communicate in person. They also combine ASL with video-enabled devices to make phone calls; their video devices included videophones, iPhones, Androids and computers with applications such as Skype, Tango, video logging (vLogs) and video email. Tables 1 and 2 above summarize the demographic profile of our study sample and the hearing and assistive technology they use.

Data collection

Our team, distributed between California and the Midwest, took two different approaches to recruiting participants. Two respondents were recruited through GLAD (Greater Los Angeles Agency on Deafness). A snowball sampling technique was used to recruit the remaining three respondents through team members' friends in Boston, Milwaukee and Chicago.

The Los Angeles GLAD interviews were conducted remotely over a normal telephone call. A hearing-enabled person, fluent in ASL, interpreted for both the hearing moderator and the hearing-impaired participant: sitting with the participant, the ASL interpreter signed the moderator's questions to the participant and then spoke the participant's signed responses back through the phone to the moderator. The Boston interview was conducted remotely using AOL's Instant Messenger (AIM); in this case there was no need for an ASL interpreter or relay service, as interview questions were typed into AIM and the participant typed responses back to the moderator in the same manner. The Milwaukee interview used video remote interpreting (VRI), allowing the moderator to use a regular phone line to place the call. The participant used a videophone that displayed an ASL interpreter (the operator): as the moderator spoke, the operator signed questions to the participant, and as the participant signed responses back through the video display, the operator interpreted the ASL and spoke the answers to the moderator. The interview in Chicago was conducted via Skype; this was the interview involving the participant with a cochlear implant. The moderator and participant could hear each other, but due to technical issues between the cochlear implant and Skype, the participant's father needed to repeat questions and answers for clarification.

Research shows that the average Deaf high school graduate reads at a fourth-grade level; interview questions were therefore word-checked to ensure all respondents could easily understand each question [1]. Participants were interviewed about the alerting technologies they currently use. We used a questionnaire format consisting of open-ended, multiple-choice and priority-based preference questions. Participants were asked to describe the type(s) of assistive technology, if any, they used to communicate with both deaf and hearing-enabled people. We asked them to describe how often they are in home and non-home environments, and how they communicate with both deaf and hearing-enabled individuals in these settings. Participants were asked to describe how they are alerted to sounds, for instance doorbells and telephones, and whether they saw areas for improvement in how they are alerted to sounds today. We inquired how they currently learn about emergency situations, for example fires or severe weather, and whether they would be interested in, or had ideas for, new ways to be alerted. Finally, we asked them to rate features such as long battery life, reliability, style, mobility or being discreet as priority considerations for a new alerting technology. Because participant and moderator could not always see each other, the moderator read off the features, allowing time for the participant to write each down; the moderator then asked the participant to review the list and call out their feature preferences in priority order.
Data Analysis Procedures

Interview sessions were recorded and transcribed for data analysis. The transcriptions were used to count the frequency with which participants responded to straightforward questions, for instance the type(s) of assistive technology used, or whether or not a hearing aid was worn or ASL was used. Given the small sample size, we attempted to analyze data patterns for technology improvements and the need for new communication methods. We reviewed the transcripts using a qualitative inductive coding technique to search for commonly mentioned participant attitudes. Codes were noted and then combined to create themes, identifying positive or negative positions regarding assistive technology, alerts for home and non-home environments, and emergency situations.

RESULTS

When asked what type of assistive technology or other forms of communication they use today, four of our five participants said they use ASL to communicate; the fifth, the 13-year-old, uses a cochlear implant. The four participants using ASL also use a Video Relay Service to communicate with other deaf persons. Videophones with flashers are used in home and office environments. Smartphones such as BlackBerrys, iPhones and Androids with video calling allow callers to sign to one another. Other smartphone applications, such as text transcription and sound recognition, are used to a lesser degree because they are less reliable.

Participants reported that they live and/or work in both home and non-home environments, and four out of five stated that one or more other Deaf persons are also present. If a hearing roommate or family member is present in the home, that person generally communicates using ASL, too. We asked how participants communicate in non-home group settings, such as office environments, when hearing persons are present. The four who use ASL stated that they read lips, use pencil and paper, and use an ASL interpreter who can facilitate communication during a meeting.

When at home, participants said they use light flashers to let them know someone is ringing the doorbell. One participant asks visitors to email her before coming over so she knows when to expect a knock at the door; she also uses a flasher that turns a lamp on and off to alert her that someone is at the door. To wake our participants up in the morning, alarm clocks are connected to flashers or shake the bed. Flashers are also used on telephones to signal an incoming call.

Our participants indicated that receiving emergency alerts via text message would be helpful, but they did not indicate whether these should come through a separate emergency application or be embedded into their smartphone or another device. Three of our five participants said that, when in public places, they rely on the reactions of hearing people to alert them to emergency situations, and four out of five answered that they would not use any device at all for emergency situations. If any type of emergency-related technology were developed for their use, participants stated that their preference is to keep their hands free: they would clip devices to their clothing, hang items around their necks or keep items in a purse, and one participant mentioned that something like a watch might be of interest. Keeping their hands free may be important so that they can still communicate effectively using ASL. When asked about areas for improvement to current technology, participants said that the vibration notifications on their smartphones should be made stronger. Participants also rated reliability as the number one feature necessary for a new technology, while incorporating the technology into an existing smartphone and mobility tied for second.
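The request for stronger, more distinctive vibration can be made concrete with a small Android sketch. The example below is a hedged illustration rather than a design validated with users: the timing and amplitude values are our own choices, and the helper assumes a device running API level 26 or later (older devices fall back to the pattern-only call).

import android.content.Context
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator

// Illustrative helper: play a long, repeating, full-strength vibration pattern that is
// harder to miss than a default notification buzz. Pattern values are placeholders.
fun playEmergencyVibration(context: Context) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    val pattern = longArrayOf(0, 800, 200, 800, 200, 1500)      // off/on durations in milliseconds
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        val amplitudes = intArrayOf(0, 255, 0, 255, 0, 255)     // 0-255, where 255 is the strongest
        vibrator.vibrate(VibrationEffect.createWaveform(pattern, amplitudes, 0))  // repeat from index 0
    } else {
        @Suppress("DEPRECATION")
        vibrator.vibrate(pattern, 0)                            // legacy API: repeat from index 0
    }
}

Whether such a pattern is actually strong and distinctive enough for Deaf users is exactly the kind of question the vibration-signaling study in [8] evaluates.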
Figure 1. At the Greater Los Angeles Agency on Deafness, hallways are always visible through open glass doors (left), and fire alarms are fitted with flasher lights for emergencies (right).

Figure 2. Flasher lights on her videophone notify a Deaf person that a caller is trying to reach her (left). When the phone is answered, the caller signs hello to the recipient (right).

Figure 3. A Deaf caller using a Video Relay Service (VRS) waits for the ASL operator to appear (left). The ASL operator appears on the main video display, ready to interpret a call between deaf and hearing callers (right).

Figure 4. A Deaf caller uses her Android to make a face-to-face video call (left). An upper and lower screen display shows both callers simultaneously as they converse using ASL (right).

Figure 5. Deaf callers used teletypes before video technology was available (left). Today's smaller desktop models have replaced older floor models and can be used where video technology is unavailable (right).

DISCUSSION

From our interviews, we learned that participants rely on a combination of in-home assistive technology, mobile devices and visual acuity to alert them to potentially hazardous situations as well as important environmental information. They use assistive technology such as specialized smartphone applications, video relay services and videophones, generally equipped with light flashers or vibration to alert users to incoming information. Hearing-impaired people also rely on hearing-enabled people to alert them in certain situations, such as announcements in public places. One participant who travels frequently expressed frustration at having missed announcements alerting him to schedule changes; this causes obvious time delays, in addition to hassles such as fees for making new travel arrangements. When asked how he would have liked to receive this information, he indicated that text messages to his phone would be ideal.

We learned that deaf and hearing-impaired people are very visually oriented and rely on other people in their environment (movement, activity, panic) to alert them to dangerous or emergency situations. Ambient sounds can be very informative in non-threatening situations as well: hearing-enabled people rely on sound for situational awareness, for instance a person in another room, a siren in the distance or approaching footsteps. In reviewing work by Ho-Ching, Mankoff and Landay, we learned that very little research has been done in the area of non-speech sound recognition [5]. We feel that Deaf people may be unaware of their needs because they do not currently hear and do not recognize the criticality of not hearing, and that further research needs to be conducted. The work by Ho-Ching et al. looks promising and should be followed up with further studies to understand whether Deaf people want to be notified of ambient sounds via a non-distracting visual display [5].

A nationwide service is currently under development to alert mobile device users to emergency situations in their vicinity [8]. Research has studied the optimal strength and pattern of vibrations so that hearing-impaired users can quickly recognize the signal as a unique identifier [8]. We think that current visual devices such as smartphones, videophones, video relay services and even vLogs (YouTube, etc.) could be improved by outfitting them with direct feeds from government and law enforcement agencies that send alerts to Deaf subscribers.

It is very hard for us to draw major conclusions given the limited number of participants and a technology landscape that is still young and still being adopted. Looking at the technology's history, if people had been asked a few years ago whether they needed a calendar, a text messaging device, a camera, games and apps on a telephone, most would have said no. In many cases, it was not until we asked "what if you could invent something else, something that warned you about something you couldn't see" that our participants started to talk about something new. Only two participants seemed interested in solving our research questions; perhaps this is because these are issues they deal with every day as members of Deaf culture living in a hearing world. Our literature research and responses from at least one participant indicated that current vibration alerts on mobile devices, especially the iPhone, do not meet their needs. Although users can create custom vibration patterns for various callers and features through the accessibility settings, the vibration is often too weak to be considered reliable. A majority of our participants rated reliability as the most important requirement for any assistive device.
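One hedged sketch of how the direct-feed idea above could reach phones people already carry: assume, purely for illustration, that an agency sends emergency text messages prefixed with "[ALERT]". The receiver below escalates such a message with the strong vibration helper from the earlier sketch; the prefix, class name and routing are assumptions rather than an existing service, and a real app would need the RECEIVE_SMS permission and a manifest-registered receiver.

import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.provider.Telephony

// Illustrative receiver: treat an incoming SMS that starts with "[ALERT]" as an emergency
// broadcast and escalate it with a strong vibration (a real app would also post a
// full-screen visual notification).
class EmergencySmsReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context?, intent: Intent?) {
        if (context == null || intent == null) return
        if (intent.action != Telephony.Sms.Intents.SMS_RECEIVED_ACTION) return
        val body = Telephony.Sms.Intents.getMessagesFromIntent(intent)
            .joinToString(separator = "") { it.messageBody }
        if (body.startsWith("[ALERT]")) {
            playEmergencyVibration(context)   // helper defined in the earlier vibration sketch
        }
    }
}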
CONCLUSION

In this paper we described the interview process we completed to understand how Deaf users employ assistive technology today. We have shown that Deaf users rely on technology to assist them with alerts for everyday interactions such as doorbells, phone calls and alarm clocks. For emergency situations, there is a deficiency in reliable services for alerting Deaf users to potential danger and public health situations. We found that participants are not very receptive to technology for emergency or alerting situations and rely upon their remaining four senses, as well as visual cues taken from hearing-enabled counterparts, to aid them. The results presented here may provide preliminary direction for uncovering how Deaf users who identify with Deaf culture would embrace specific aspects of alerting technology in everyday and emergency situations. Further investigation would provide a better understanding of the attitudes of Deaf users, who do not wish to hear but would like to be alerted to specific situations.
ACKNOWLEDGMENTS

We would like to thank all of our participants. We would especially like to thank GLAD (Greater Los Angeles Agency on Deafness); GLAD's warm reception to our study was very much appreciated. GLAD offered us access to participants, gave us a guided tour of their facility and allowed us to photograph the assistive technology in their offices.
REFERENCES

1. Azbel, L. (2004). How do the deaf read? The paradox of performing a phonemic task without sound [White paper]. Retrieved from http://psych.nyu.edu/pelli/docs/azbel2004intel.pdf

2. Cherniavsky, N., Chon, J., Wobbrock, J. O., Ladner, R. E., & Riskin, E. A. (2009, October 4). Activity analysis enabling real-time video communication on mobile phones for Deaf users [White paper, ACM]. Retrieved from DePaul University, School of CDM: https://d2l.depaul.edu/d2l/lms/content/home.d2l?ou=152844

3. Greater Los Angeles Agency on Deafness. (2012). Statistics & population. Retrieved from http://www.gladinc.org/information-center/resources

4. Harkness, L. (2009). Colorado Assistive Technology Coalition. Retrieved from http://www.ucdenver.edu/academics/colleges/medicalschool/programs/atp/Documents/Assistive_Technology_and_Emergency_Management.pdf

5. Ho-Ching, F., Mankoff, J., & Landay, J. A. (2003). Can you see what I hear? The design and evaluation of a peripheral sound display for the Deaf [White paper]. Retrieved from http://www.cs.cmu.edu/~io/publications/old-pubs/469ho-ching.pdf

6. Kelchner, L. (2012). Issues that Deaf people face. Retrieved from http://www.ehow.com/list_6023189_issues-deaf-peopleface.html

7. Power, M. R., & Power, D. (2004). Everyone here speaks TXT: Deaf people using SMS in Australia and the rest of the world. Journal of Deaf Studies and Deaf Education, 9(3). Retrieved from http://jdsde.oxfordjournals.org/content/9/3/333.full.pdf

8. Harkins, J., Tucker, P. E., Williams, N., & Sauro, J. (2010, April 19). Vibration signaling in mobile devices for emergency alerting: A study with Deaf evaluators [Online journal]. Journal of Deaf Studies and Deaf Education, 15(4). Retrieved from http://jdsde.oxfordjournals.org/content/15/4/438.full
