
Katelyn Pearson

June 14, 2017


G/T Independent Research

Annotated Source List

Auer, Jr., E. T., Bernstein, L. E., Sungkarat, W., & Singh, M. (2007). Vibrotactile activation of
the auditory cortices in deaf versus hearing adults. ​Neuroreport​, ​18​(7), 645-648.
https://doi.org/10.1097/WNR.0b013e3280d943b9
This article states that deaf people experience a more intense response to vibrotactile
stimulation in the secondary auditory cortex than hearing people do.

Bavelier, D., Dye, M. W.G., & Hauser, P. C. (2006). Do deaf individuals see better? ​Trends in
Cognitive Sciences​, ​10​(11), 512-518. https://doi.org/10.1016/j.tics.2006.09.006
This journal article examines the relationship between sight and hearing loss. It states
that deaf people have better peripheral vision.

Beauchamp, M. S., Yasar, N. E., Frye, R. E., & Ro, T. (2008). Touch, sound and vision in human
superior temporal sulcus. ​NeuroImage​, ​41​(3), 1011-1020.
http://dx.doi.org/10.1016/j.neuroimage.2008.03.015
This experiment studies the superior temporal sulcus’ (STS) response to stimuli.
Previous researchers had already established that the STS responds to audio and
visuals, so this team chose to focus on its response to somatosensory stimulation,
physical feeling that can occur anywhere on the body. Two experiments were
conducted in which the participants’ brain activity was monitored by
blood-oxygen-level-dependent functional magnetic resonance imaging (BOLD fMRI).
The first applied vibrations to the participant’s left hand while audio was played in his
or her ear; the participant then responded to the stimulation by pressing a button with
the left hand. The second experiment applied only vibrations to the participants’ hands
and feet, removing the factor of a physical response. It was concluded that, contrary to
prior belief, the STS responds to the processing of stimuli rather than the performance
of a task. While the results of the first experiment reflected the accepted belief about
the purpose of the STS, the second showed that the completion of a task is not needed
to stimulate the STS.

This journal shared the discovery of a link between how vibrations, audio, and visuals
are processed in the human brain. This provides insight into why visuals have been
found to be processed in the same area of the deaf brain as sound would be. To expand
upon this knowledge, the next step would be to identify how sensitive deaf people are
to vibrations compared to hearing people, to account for any heightened perception.

Bennebach, M., Rognon, H., & Bardou, O. (2013). Fatigue of structures in mechanical vibratory
environment. from mission profiling to fatigue life prediction. ​Procedia Engineering​, ​66​,
508-521. https://doi.org/10.1016/j.proeng.2013.12.103
This article examines the effect of vibration on body parts and states that mechanical
vibrations have been shown to cause muscle fatigue.

Bottum, B. (2007). ​U.S. Patent No. 7,164,770 B2​. Washington, DC: U.S. Patent and Trademark
Office. Retrieved from
https://docs.google.com/viewer?url=patentimages.storage.googleapis.com/pdfs/US71647
70.pdf
This invention addresses the issues of speakers being too large and of the cutoff
frequency being too high. Speakers aim to produce a wide range of sounds, but this is
difficult because a speaker’s diameter correlates with the frequencies it can produce.
This means that more than one driver is necessary to create a full range of frequencies,
but it also means the speaker ends up being fairly large. This patent describes a sound
system with channel outputs at the front left, right, and center, and the back left and
right. So that the speakers stay compact while still covering a wide range of
frequencies, double-coiled woofers, the drivers largest in diameter, are used for the
front, left, and back outputs. Another woofer with two coils can be used for the right
and center outputs. The coils are connected between speakers to give a feeling of
immersion.

This document provided background on the workings of sound systems and taught the
purposes of different parts through immersion in the topic. While at first it was
confusing to determine how the parts worked together, after analysis a deeper
understanding was obtained. The knowledge gained will allow for further research of
speakers in articles where little explanation is given.

Brain anatomy differences between deaf, hearing depend on first language learned. (2014, April 14).
Retrieved April 3, 2017, from Georgetown University Medical Center website:
https://gumc.georgetown.edu/news/Brain-Anatomy-Differences-Between-Deaf-Hearing-
Depend-on-First-Language-Learned
This article studies the differences between deaf and hearing brains. It states that there
is less white matter in the auditory cortex of deaf people.

Chapter 9: Hearing: Physiology and psychoacoustics​ [PowerPoint slides]. (n.d.). Retrieved
November 14, 2016, from University of Washington website:
http://faculty.washington.edu/ionefine/S&P2009pdfs/S&P2009_chapter9.pdf
This article explains the mechanics of sound waves, frequency and pitch, hearing
range, timbre, the structure of the ear, Békésy’s Place Theory of Hearing, complex
tones, two-tone suppression, the cochlea, and auditory perception in the brain. Timbre,
made up of complex tones, consists of all the characteristics of sound except pitch and
volume. The ear can be divided into the outer ear, which helps determine the location
of sound; the middle ear, which amplifies and transmits vibrations; and the inner ear,
which translates vibrations and sends them to the brain. Békésy’s Place Theory of
Hearing states that the frequency of a sound is determined by where sound waves
create vibrations on the basilar membrane. Missing the lowest (fundamental) frequency
changes the timbre but not the perceived pitch.

A large amount of information on the biological aspects of sound perception was
provided by this article, building a strong foundation to help explain how sound is
processed by the ear and brain. This information allows for further investigation into
the differences between deaf and hearing brains as well as the different reasons deaf
people are not able to hear sound.
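The missing-fundamental effect mentioned above can be illustrated numerically: a complex
tone built only from harmonics of a fundamental still repeats at the fundamental’s period,
so its perceived pitch does not change. Below is a minimal sketch, assuming Python with
NumPy and an arbitrary 200 Hz fundamental (a value chosen for illustration, not taken from
the slides), that synthesizes such a tone and estimates its repetition period by
autocorrelation.

    import numpy as np

    fs = 44100   # sample rate (Hz)
    f0 = 200.0   # illustrative fundamental frequency (Hz), chosen for this sketch
    t = np.arange(0, 0.1, 1 / fs)

    # Complex tone containing only harmonics 2-5 of f0; the fundamental itself is absent.
    tone = sum(np.sin(2 * np.pi * n * f0 * t) for n in range(2, 6))

    # Estimate the repetition period from the strongest autocorrelation peak,
    # ignoring very small lags near zero.
    ac = np.correlate(tone, tone, mode="full")[len(tone) - 1:]
    min_lag = int(fs / 1000)
    peak_lag = min_lag + np.argmax(ac[min_lag:])
    print(f"Estimated pitch: {fs / peak_lag:.1f} Hz")  # close to 200 Hz despite the missing fundamental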

Cincotti, F., Kauhanen, L., Aloise, F., Palomäki, T., Caporusso, N., Jylänki, P., . . . Millán, J. D.
R. (2007). Vibrotactile feedback for brain-computer interface operation. ​Computational
Intelligence and Neuroscience​. http://dx.doi.org/10.1155/2007/48937
This article investigates the use of vibrations in brain-computer interfaces. When
explaining the physiological perception of vibrations, it mentions factors that affect
sensitivity, including that, based on research from 2003, sensitivity depends mainly on
age and on the location of the motor, with frequency being secondary. For example,
during this experiment subjects reported feeling discomfort when motors were placed
where a bone lay directly underneath the skin. The article also states that the physical
perception of vibrations is shaped by frequency, intensity, timbre, variation, and spatial
location, while perception that varies between individuals is based on rhythm, tempo,
and flutter. Six tests of brain-computer interface operation were performed, and it was
concluded that it would indeed be viable to use vibrations for brain-computer
interfaces, especially when a person’s sense of sight is overloaded.

This journal assisted in providing information on vibrating-motor placement even
though brain-computer interfaces do not directly relate to the research topic. The next
step is to begin the primary research process.

Department of Defense Army Research Laboratory. (2007, May). ​The tactile modality: A review
of tactile sensitivity and human tactile interfaces​ (Technical Report No. ARL-TR-4115)
(K. Myles & M. S. Binseel, Authors). Retrieved from
http://www.arl.army.mil/arlreports/2007/ARL-TR-4115.pdf
This report analyzed how tactile stimulation is perceived by the body to help in the
creation of tactile aids. It is first stated that when a mechanism only utilizes one sense,
it can result in a sensory overload creating a need for multimodal devices. The two
main uses of tactile input are to emphasize vision or hearing and to act as an
independent information source. When vibrations are being used, they are best felt on
Pacinian tissue and on hairy, bony areas rather than fleshy, smooth ones. When
placing vibrating motors it is important to not place them too close together if they are
meant to be felt as separate inputs, though the amount of space needed varies among
different parts of the body. On the head, the face is the most sensitive, followed by the
scalp, the forehead, and then the temples. The hands are also recognized as being very
sensitive; however, they are usually occupied, causing developers to look to other areas
of the body for stimulation sites.
This article provides useful information that will directly help in the creation of the
experiment, including specifics on how close motors can be to each other and
graphical analysis of vibration detection. The next step is to find more information
along these lines to create a better understanding of the human body’s reaction to
vibrations.

Eagleman, D. (2015). Sensory substitution. Retrieved June 12, 2017, from David Eagleman
website: http://www.eagleman.com/research/sensory-substitution
This website details the work of David Eagleman, specifically his vibrotactile vest
designed for deaf and hearing impaired people.

Jeong, G.-Y., & Yu, K.-H. (2016). Multi-section sensing and vibrotactile perception for walking
guide of visually impaired person. ​Sensors​, ​16​(7). http://dx.doi.org/10.3390/s16071070
This article details the creation of an Electronic Travel Aid (ETA) and explains its
features. The goal of this project was to create an aid that not only has reliable object
detection and notification, but that is also usable. The main components of the aid
include seven ultrasonic sensors for object detection, a vibrotactile notification pad,
electromagnetic brakes to prevent the user from going off a drop, and a control unit.
The sensors are arranged to detect obstacles above and in front of the user, as well as
drop-offs ahead. Once an obstacle is identified, the user is alerted by the notification
pad using whichever of the seven vibrators corresponds to the object’s location.
Vibrations were used as the signal because their intensity can easily be adjusted by
altering the voltage. When walk tests were performed, the ETA proved to be very
effective, with a perfect success rate for avoiding single obstacles and a sixty percent
success rate for avoiding double and triple obstacles.
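To make the distance-based vibration signaling described above concrete, here is a minimal
sketch, assuming Python; the 0.2-2.0 m range, the 120-degree field of view, and the function
names are illustrative assumptions, while the seven-motor count comes from the article.

    def vibration_level(distance_m, d_min=0.2, d_max=2.0):
        """Return a drive level in [0.0, 1.0]; closer obstacles give stronger vibration."""
        if distance_m >= d_max:
            return 0.0          # obstacle too far away to signal
        if distance_m <= d_min:
            return 1.0          # obstacle very close: full intensity
        return (d_max - distance_m) / (d_max - d_min)

    def select_motor(bearing_deg, num_motors=7, field_of_view_deg=120):
        """Map an obstacle bearing (-60 to +60 degrees) to one of the motors on the pad."""
        half = field_of_view_deg / 2
        clamped = max(-half, min(half, bearing_deg))
        return int((clamped + half) / field_of_view_deg * (num_motors - 1) + 0.5)

    # Example: an obstacle 0.8 m away, 25 degrees right of center.
    print(select_motor(25), round(vibration_level(0.8), 2))
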
Though this device was made to assist blind people, some of the concepts, such as
using an array of vibrations to portray the location of an object, can be applied to a
vibrotactile aid for deaf people. The next step is to continue research of different
applications of vibrotactile machinery but to find an application for deaf people.

Gerstmann’s syndrome information page. (n.d.). Retrieved from
https://www.ninds.nih.gov/Disorders/All-Disorders/Gerstmanns-Syndrome-Information-
Page
This page defines Gerstmann’s Syndrome in detail.

Good, A., Reed, M. J., & Russo, F. A. (2014). Compensatory plasticity in the deaf brain: Effects
on perception of music. ​Brain Sci​, ​4​(4), 560-574.
https://doi.org/10.3390/brainsci4040560
This article states that more of the deaf brain is used for tactile processing than the
hearing brain.

Gupta, S. (2014, July 9). Music for your skin. Retrieved October 27, 2016, from
http://www.pbs.org/wgbh/nova/next/body/haptic-hearing/
This article discusses recent research by Dr. Carmen Branje on developing “music” for
deaf people by using vibrations. Research on experiencing sound vibrations through the
skin emerged in the 1920s at a deaf school and was reintroduced in the 1970s and
1980s, but was then overshadowed by the development of cochlear implants. However,
cochlear implants are made for speech and do a poor job of conveying music. Dr.
Branje had the idea of using a device developed by Dr. Deb Fels, the “Emoti-Chair,”
to convey music through vibrations. The Emoti-Chair is a chair fitted with 16 vibrating
coils that deliver different frequencies to different parts of the body. Dr. Branje used a
special keyboard that transmitted frequencies to the Emoti-Chair, ranging from 40 to
421 Hertz, and asked composers to create “songs” for it that would utilize the
vibrations. When the songs were played, the “listeners” reported experiencing the
emotion of the song, such as feeling happy or sad. Dr. Branje’s research continues with
the goal of helping artists incorporate touch into their works.

This source gives a reasonable overview of the experiment, but more elaboration
about how the Emoti-Chair works would have been helpful. This information can be
obtained through further research that will also include looking into any other projects
of Dr. Branje’s. Dr. Branje and Dr. Fels will also be considered as future interviewees.

Kalan, J. (2013, July 31). Helping the deaf to ‘see’ and ‘feel’ sound. Retrieved October 13, 2016,
from http://www.bbc.com/future/story/20130731-helping-the-deaf-to-see-sound
This article by Jonathan Kalan explores why there are so few inventions to aid deaf
people and how this can be improved. One of the inventions discussed is the
VibroHear, a wristband that vibrates and flashes red or green lights, with the intensity
based on the volume and distance of the sound. Other inventions include Music for
Deaf People by Frederik Podzuweit, sound identifying glasses by KAIST, and the
Vibering by Kwang-seok Jeong, Min-hee Kim and Hyun-joong Kim. Unfortunately,
there is not a large enough demand for these items and they end up being too
expensive for consumers. Kalan suggests that the solution to this problem is switching
to computer programs and applications such as Vineya and ClearCaptions to increase
accessibility and provide for a wider range of people who may be looking for
assistance.

This source provides not only a number of existing inventions for deaf people and
their descriptions, but also a possible solution to the problem of low consumption.
Further exploration can be done to learn more about the programs and devices
mentioned, and more importantly a plan for a science project can be more thoroughly
delved into.

Karns, C. M., Dow, M. W., & Neville, H. J. (2012). Altered cross-modal processing in the
primary auditory cortex of congenitally deaf adults: A visual-somatosensory fMRI study
with a double-flash illusion. ​Journal of Neuroscience​, ​32​(28), 9626-9638.
https://doi.org/10.1523/JNEUROSCI.6488-11.2012
This article claims that the auditory cortex of deaf people is primarily used to process
tactile information, not visual input, as is commonly accepted.

Khoo, W. L., Knapp, J., Palmer, F., Ro, T., & Zhu, Z. (2013). Designing and testing wearable
range-vibrotactile devices. ​Journal of Assistive Technologies​, ​7​(2), 102-117.
http://dx.doi.org/10.1108/17549451311328781
This article was written by a group of researchers from the City College of New York
who teamed up to research a low-cost way to implement vibrotactile transmitters to aid
the blind in navigation. To eliminate the risk of injury, the researchers had participants
maneuver through a computer simulation of an indoor area built with Unity3D. This
precaution was taken to work the kinks out of the vibration system before people
could get hurt in a physical setting. To use the simulation, three vibrators that
responded to the location of the user’s avatar were placed on each arm of a participant.
The next step of this experiment is to apply the vibration system to a real-life test.
From what the researchers have already found, the vibrotactile system works well for
alerting people to walls and other roadblocks, but conveying the presence of obstacles
such as doors is difficult. It was also discovered that all places on the arms have the
same sensitivity to vibrations.

This article allowed the task ahead of creating a device that uses vibrations in a
similar manner to be put in perspective. Helpful information, such as that the arms are
equally sensitive to vibration everywhere, was also provided by this journal. The next
step is to look up other experiments where vibrotactile devices have been used and to
analyze the different ways they have been implemented.

Lab 2: Olfactory fatigue and memory. (2003, October 6). Retrieved from Evergreen State
College website:
http://archives.evergreen.edu/webpages/curricular/2003-2004/perception/Lab1006.htm
This source is a lab for a college science class, teaching about olfactory fatigue. The
background information was used, but the lab activity was not performed.

Ladner, R. (2010). ​Technology for deaf people​ [PowerPoint slides]. Retrieved October 21, 2016,
from University of Washington website:
https://courses.cs.washington.edu/courses/cse590w/10sp/deaf-tech10.pdf
This source is a PowerPoint presentation for an Introduction to Deaf Studies class that goes over
inventions that aid deaf people. The presentation begins with giving a brief
background on deafness including a description of deafness through different lenses
and deaf versus Deaf, and then proceeds to list technology for deaf people. Some of
the devices listed, such as video phones and email, were not made with the intention of
helping deaf people, but are still included. Notable appliances include open and closed
captioning, the ASL-STEM Forum, the video phone, the Video Relay Service (VRS),
and Video Remote Interpreting (VRI). The research current as of 2010 is then
described, including instruments that recognize and translate sign language, avatars
that are able to sign, and automatic captioning.

This lesson was very useful for its abundance of information, which provided not only
background on existing devices but also research that was being conducted in 2010
and may still be being explored. Additional resources will need to be gathered to learn
about current research. The gaps in aid that need to be filled can be brainstormed from
the facts given by this article.

Lederman, S. J., & Klatzky, R. L. (2009). Haptic perception: A tutorial. ​Attention, Perception, &
Psychophysics​, ​71​(7), 1439-1459. http://dx.doi.org/10.3758/app.71.7.1439

Levänen, S., & Hamdorf, D. (2001). Feeling vibrations: Enhanced tactile sensitivity in
congenitally deaf humans [Abstract]. ​Neuroscience Letters​, ​301​(1), 75-77.
https://doi.org/10.1016/S0304-3940(01)01597-X
This article claims that deaf people have an increased tactile sensitivity compared to
hearing people.

Livadas, G. (2011, November). Unlocking the mysteries of the deaf brain. Retrieved November
16, 2016, from Rochester Institute of Technology website:
https://www.rit.edu/research/feature/november-2011/unlocking-mysteries-deaf-brain
This article highlights key topics researched and conclusions made by Peter Hauser
and his team who focus on why the deaf brain differs from the hearing brain and how
sign language affects thought processes. Little is understood about the human brain,
but Hauser emphasizes that deaf people may require alternative diagnosis and
treatment for medical problems such as strokes. The other point Hauser emphasizes is that, growing
up, deaf children require an environment based on visual learning in order to achieve
their maximum potential. As deaf children grow older, they become more aware of
their peripheral surroundings than their hearing peers, so an education based on visuals
allows them to excel and fully utilize their heightened sight processing. Hauser has
also concluded that when deaf children grow up in hearing families, their language
skills are not as advanced, which inhibits executive functions. With his team, Hauser
wishes to encourage education strategies in the deaf community that will aid the
children’s development and optimize their potential.

This article focuses more on why the deaf brain is different from the hearing brain
rather than how. This makes it a good source to help bridge the gap between
mechanical innovations that assist deaf people and the biological differences between
deaf and hearing brains, which is the next step of the research.

Macsweeney, M., Capek, C. M., Campbell, R., & Woll, B. (2008). The signing brain: The
neurobiology of sign language. ​Trends in Cognitive Sciences​, ​12​(11), 432-440.
https://doi.org/10.1016/j.tics.2008.07.010
This journal article discusses the neurology of the deaf brain. It mentions that more
of the left parietal lobe is used for signed languages than it is for spoken languages.

Maher, R. C. (2016). Gunshot recordings from a criminal incident: Who shot first? ​Acoustical
Society of America​, ​139​(4). http://dx.doi.org/10.1121/1.4949969
These presentation proceedings explain how to determine who fired first in
circumstances where two known shooters are being examined and an audio clip is
available. When used in law, this branch of science is called audio forensics. When
analyzing the audio with a spectrogram, where time and frequency are measured,
anomalies such as echoes need to be pinpointed. Echoes can be analyzed by examining
the layout of where the shooting took place, the position of the audio recorder, the
location of the shooting, and how long after the initial gunshot the echo appears.
Depending on the placement of buildings and other large structures, a gunshot can
echo off a wall and, when reflected, be picked up by the audio recorder. The suspect
facing the building that created the echo can then be identified as the one who shot in
that part of the recording. This type of audio forensics is difficult to practice because
the audio recordings, more likely than not, will pick up sounds other than the gunshots.
Other challenges are that devices used to record audio are not meant to process sounds
as loud as gunshots and that a database of how different gunshot recordings sound is
still being developed.
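The echo-timing reasoning above reduces to simple distance arithmetic: the delay between
the direct gunshot sound and its echo corresponds to the extra path the reflected sound
travels at roughly 343 m/s in air. Below is a minimal sketch, assuming Python; the 0.25 s
delay is an invented example value, not data from the recordings Maher analyzed.

    SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

    def extra_path_from_echo(delay_s):
        """Extra distance the echoed sound traveled, given its delay after the direct sound."""
        return SPEED_OF_SOUND * delay_s

    # Example: an echo arriving 0.25 s after the direct gunshot (hypothetical value)
    # implies the reflected path was about 86 m longer than the direct path, which
    # helps tie that shot to a shooter facing a particular reflecting wall.
    print(f"{extra_path_from_echo(0.25):.1f} m")   # -> 85.8 m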

This resource was beneficial because it contributed a new outlook on how acoustics
can be utilized beyond creating speakers or headphones. By providing this different
view, new research paths, such as how to make recording devices that can record a
wide variety of sounds well and other methods of audio forensics, can be explored.

Mazzoni, A., & Bryan-Kinns, N. (2016). Mood glove: A haptic wearable prototype system to
enhance mood music in film. ​Entertainment Computing​, ​17​, 9-17.
http://dx.doi.org/10.1016/j.entcom.2016.06.002
This article explores how vibrations can be used to amplify emotions felt during
movies using a device called the Mood Glove. Influence for this project came from
sources such as Music For Bodies and vibrations used in video games. Rather than
using a chair as has been done in previous experiments, in this investigation a glove is
used as the medium to transmit vibrations. There are three steps to this experiment: the
annotation of movie clips for mood, the study of how vibration patterns are received
by participants, and the study of how the combination of vibrations and movie clips
affects the mood of participants. Through this process it was discovered that the
placement of vibrations on the hand has no effect on mood, but the intensity and
frequency do. The higher the intensity and frequency, the more pleasurable feelings are
heightened; with lower intensity and frequency, less enthusiastic emotions are felt.

Information on the effectiveness of vibrations, such as that different places on the
hand do not elicit different responses when vibrations are applied, was provided by
this article. This information will be useful when determining where to place the
vibrators. The next step is to research in more depth the sensitive locations on the
body.

Mckenzie, I., Wiggins, B. J., & Lennox, P. (2014). Hearing without ears. ​International
Conference on Auditory Display​. Retrieved from
https://www.researchgate.net/publication/267506833_Hearing_without_Ears
This paper details an experiment done in 2014 with the goal of having people “hear”
through vibrations applied to the head. To do this, a headband with vibration
transducers in five different locations was used to test how and where the vibrations
were received. When the transducer vibrated with 300 g to 350 g of exerted force, the
higher levels got a more active response than the lower transmissions, but after
exceeding 5.9 N participants reported feeling discomfort. Three tests were conducted.
The first established whether participants could distinguish the location the vibrations
originated from, the second determined whether any vibrations felt externalized, and
the third examined the effect of delayed vibration. The results included participants
being able to differentiate vibrations from the left and right, feeling to different
degrees as if there were an external source, and, due to the delayed audio,
experiencing an illusion of spatial imagery. The researchers concluded that it is viable
for such devices to create a sense of spatial imagery.

Contributions were made by this article to the background of how hearing people
experience sound, which is necessary knowledge for the comparison of how deaf
people perceive sound. Several past experiments were referenced, such as a study
done by Stanley and Walker in 2006, that will be examined to determine whether there
is any information relevant to the research topic.

Miller, J. (2010). Radio waves map matter without counting galaxies. ​Physics Today​, ​63​(9),
15-19. http://dx.doi.org/10.1063/1.3490487
During the first 380,000 years after the universe’s creation, sound was able to travel
through space through the plasma that filled it, but when the plasma cooled, atoms
were formed and left in their formations. These leftover patterns, called baryonic
acoustic oscillations (BAOs), which have dispersed over 480 million light years due to
dark matter, can be used to determine the plasma’s density from when the universe
was young. Intensity mapping is a new technique that utilizes the radio waves emitted
by hydrogen atoms throughout the universe, as opposed to visible light from stars, to
uncover the BAOs. Researchers Tzu-Ching Chang, Ue-Li Pen, and Jeffrey Peterson
tested intensity mapping by using the Robert C. Byrd Green Bank Telescope (GBT)
and focusing on two sections of space whose density had been cataloged optically by
DEEP2. The GBT was initially bombarded by cellphone radio emissions from Earth,
which were countered by removing the more polarized signals. To counter unwanted
frequencies from space, the researchers applied a principal component analysis and did
away with the stronger radio emissions. The data was then compared back to the
findings by DEEP2 in the same locations, and the analyses matched, showing that the
unwanted information had been successfully extracted.

“Radio waves map matter without counting galaxies” was an interesting read that
provided new insight into different ways to examine space. Intensity mapping is an
innovative way to map density in the universe and directly correlates to the topic of
radio waves.

Napoli, D. J. (2014). A magic touch: Deaf gain and the benefits of tactile sensation. In H.-D. L.
Bauman & J. J. Murray (Authors), ​Deaf gain: Raising the stakes for human diversity​ (pp.
211-232). Retrieved from
http://www.swarthmore.edu/SocSci/dnapoli1/lingarticles/A%20Magic%20Touch.pdf
This source explores the effects of tactile stimulation on hearing and seeing, deaf, and
deaf-blind people through different subtopics. The section about communication
through touch explains how greetings in which people touch are used around the
world and why. Touch is shown to affect infant development positively, correlating
with increased social behavior and weight gain. Cognitive effects of touch include
increased functioning in the elderly, the identification of what is underneath an object
through the detection of vibrations by apes and humans, and sustained memory. Other
applications of touch include guiding people and medical training. The plasticity of the
brain allows the section known as the auditory cortex to adapt to a lack of hearing
with increased sensitivity to sound and touch. All of this relates to “Deaf Gain”
because touch has a large impact on Deaf culture, such as feeling the vibrations of
music through a balloon or a deaf mother using touch to communicate with her baby.

This book explicitly confirms the previously unsupported assumption that deaf people
are more sensitive to tactile sensations than hearing people. A broader understanding
of how touch is used throughout different situations
and places was also provided by this article. The next step is to research how
vibrations can be mechanically produced to assist the tangible aspect of this research.

Neary, W. (2001, November 27). Brains of deaf people rewire to ‘hear’ music. Retrieved
October 13, 2016, from University of Washington website:
http://www.washington.edu/news/2001/11/27/brains-of-deaf-people-rewire-to-hear-music
/
This article discusses a study by Dr. Dean Shibata in 2001 on how the deaf feel sound.
According to Shibata’s research, when a child is deaf his or her brain rewires to
perceive vibrations where sound would have been analyzed. This was done by
scanning the brains of ten deaf volunteers and eleven hearing volunteers and applying
vibrations to their hands. While the section of the brain that usually processes
vibrations was active in both test groups, the deaf group also showed extra brain
activity. This explains how deaf people are able to appreciate and enjoy music even
though they are not able to hear it. Shibata believes neurosurgeons should take this
difference in brain structure into account when operating on a deaf person. He also
recommends exposing deaf children to music early on to stimulate this part of the
brain.

Although this source does not elaborate on how the research done can be applied in a
mechanical way, it provides useful background information on how deaf people
experience sound and why its appreciation is possible. This allows for further research
into vibration-based stimuli for the deaf and into how the Deaf community regards
these devices.

New research links acoustic and gravity waves. (2016, August). ​Bulletin of the American
Meteorological Society​, ​97​(8), 1333+. Retrieved from
http://ic.galegroup.com/ic/scic/AcademicJournalsDetailsPage/AcademicJournalsDetails
Window?disableHighlighting=false&displayGroupName=Journals&currPage=&scanId=
&query=&prodId=SCIC&search_within_results=&p=SCIC&mode=view&catId=&limite
r=&display-query=&displayGroups=&contentModules=&action=e&sortBy=&documentI
d=GALE%7CA463953350&windowstate=normal&activityType=&failOverType=&com
mentary=&source=Bookmark&u=elli85889&jsid=a76c45d2da979dcf11119ec89178a14f
While acoustic and gravity waves have immensely different properties, researchers at
the Massachusetts Institute of Technology have recently discovered a link between
them. Acoustic waves are usually created on the ocean floor by large disturbances
such as earthquakes and similar disasters. Surface gravity waves, on the other hand,
tend to travel on the surface of the ocean, as the name suggests. Researchers found
that two gravity waves propagating towards each other can give off their initial energy
as a sound wave. This was surprising because sound waves are hardly affected by
gravity waves at all. Because of the acoustic waves’ production
during major disturbances, these new discoveries will hopefully lead to the prediction
of natural disasters such as tsunamis, large storms, and landslides. This way
evacuation can be completed earlier and more lives can be protected.

Because of this discovery, acoustic waves can be linked to concepts that were initially
expected to have no connection, which prompts research into more relationships that
seem to have little correlation at first glance. The implication of being able to help
many people in the face of a disaster is also a great motivation to research the topic
further. Thanks to the simple language the article used, such as there being no
equations used as explanations, evaluation was smooth and went without frustration.

Olulade, O. A., Koo, D. S., LaSasso, C. J., & Eden, G. F. (2014). Neuroanatomical profiles of
deafness in the context of native language experience. ​The Journal of Neuroscience​,
5613-5620. https://doi.org/10.1523/JNEUROSCI.3700-13.2014

Pacinian corpuscle. (n.d.). Retrieved from Rutgers University Virtual Biology Labs website:
http://bio.rutgers.edu/~gb102/lab_5/104cm.html
This is an interactive webpage from Rutgers University that details the Pacinian
corpuscle.

Parvizi, J. (2010). Chapter 153 – nerve endings. In ​High yield orthopaedics​ (pp. 315-316).
Philadelphia: Saunders/Elsevier. https://doi.org/10.1016/B978-1-4160-0236-9.00164-4
This book was used for information on Ruffini’s corpuscles, a type of
mechanoreceptor.
Purves, D., & Williams, S. M. (2001). ​Neuroscience​ (2nd ed.). Sunderland, Mass.: Sinauer
Associates.
The information about mechanoreceptors, specifically Meissner’s corpuscles, was
used from this book. Information about the auditory cortex was also used.

Shake-n-wake vibrating alarm clock. (2017). Retrieved June 12, 2017, from Assistech website:
https://assistech.com/store/90410
The “Shake-n-Wake Vibrating Alarm Clock” is a product sold at many online stores,
in this case Assistech. It is a watch-like device that is worn on the wrist and vibrates to
wake the user up.

Shull, P. B., & Damian, D. D. (2015). Haptic wearables as sensory replacement, sensory
augmentation and trainer – a review. ​Journal of NeuroEngineering and Rehabilitation​,
12​(1). http://dx.doi.org/10.1186/s12984-015-0055-z
This article reviews and analyzes devices that use tactile stimulation to assist people
with impairments including loss of sight, loss of hearing, and amputation. For cases of
total impairment the aids aim to replace the lost sense, and for partial impairment the
aids act as a sensory helper. As a sensory replacement, tactile stimulation is commonly used for
prosthetic feedback with amputees, guidance and navigation with the blind, and
auditory detection and translation with the deaf. As a sensory supplement, tactile
stimulation is commonly used to help with standing balance, walking balance, and
rehabilitation from neurological diseases. In the text it is noted that haptic stimulation
is best received on the fingertips but the arms or back are commonly used as vehicles
for communication. The authors concluded that for further development of haptic aids
it is necessary for the devices to be practical, last longer mechanically, have multiple
sources of stimulation, have lower energy costs, and be tested for long term use.

While other journals that were annotated provide explanations of different inventions
that utilize haptic stimulation, this paper analyzes them and draws insightful
conclusions useful for future development of the subject. The next step is to continue
on this line of research and find analysis that has more focus on tactile aids.

Siple, L., Greer, L., & Holcomb, B. R. (2004). ​Deaf culture tipsheet​. Retrieved from Pepnet
website:
https://www.rit.edu/ntid/radscc/sites/rit.edu.ntid.radscc/files/file_attachments/deaf_cultur
e_tip_sheet.pdf
This document from Rochester Institute of Technology explains some quick facts
about Deaf culture that can be very useful for a hearing person.

Sparks, M. (2015, January 19). New device allows deaf people to ‘hear with their tongue’
[Newsgroup post]. Retrieved from The Telegraph website:
http://www.telegraph.co.uk/technology/news/11354541/New-device-allows-deaf-people-
to-hear-with-their-tongue.html
This article describes a vibrotactile aid made by students at Colorado State University
that applies vibrations on the tongue to alert the user of sound around them. This
invention has yet to be commercialized or published in a journal article as of June
2017.

Storrs, C. (2009, November 26). People hear with their skin as well as their ears. Retrieved
October 25, 2016, from
https://www.scientificamerican.com/article/skin-hearing-airflow-puff-sound-perception/
This article summarizes the study “Aero-Tactile Integration in Speech Perception,”
conducted by Bryan Gick and Donald Derrick, published November 26, 2009.
Because the sounds “ta” and “pa” as well as “da” and “ba” are predominantly
differentiated in the English language by a puff of air, the study had participants listen
to “pa,” “ta,” “ba,” and “da” sounds while blindfolded as puffs of air, only half as
intense as they would be in normal conversation, were released onto each person’s
hand, neck, or ear. The results show that without a puff of air “pa” was mistaken for
“ba” and “ta” for “da” about 35 percent of the time, and when the air puff was added
to “pa” and “ta,” recognition improved about 15 percent. The accuracy for recognizing
“ba” and “da” was about 80 percent, and decreased about 10 percent when puffs of air
were added. This shows the relationship between a person’s sense of touch and sound
in conversations. Gick theorizes that his research can be used in conjunction with
hearing aids and to help pilots because of the high noise level of their environments.

This article provided a better understanding of the perception of sound by hearing
people, which is needed to figure out how sound is perceived differently by deaf
people. Before brainstorming for an invention, this background knowledge is needed to
decrease problems along the way. Further research will include looking into Bryan
Gick and Donald Derrick’s other publications to uncover any other useful information.

Taylor, J. (2017, March 7). Mom’s guide 2017: Best baby monitor for deaf parents. Retrieved
June 12, 2017, from MomTricks website:
https://www.momtricks.com/baby-monitors/best-baby-monitor-deaf-parents/
This source reviews the best baby monitor for deaf parents. The device chosen, called
the “Summer Infant Babble Band,” is worn on the wrist and allows the user to set it to
vibrate, flash, or both when the child makes a noise.

Text description for use of hearing aids in 2006. (2012, October 18). Retrieved June 12, 2017,
from National Institute on Deafness and Other Communication Disorders website:
https://www.nidcd.nih.gov/health/statistics/text-description-use-hearing-aids-2006
This source publishes statistics collected on the number of adults with moderate to
severe hearing loss who used hearing aids in 2006.

2 psychoacoustics. (n.d.). Retrieved November 30, 2016, from Frost School of Music website:
http://www.music.miami.edu/programs/mue/research/mescobar/thesis/web/Psychoacousti
cs.htm#_Toc531257796
This article focuses on the mechanics of how sound is transmitted through the ear. The
topics explained are ear physiology, ear sensitivity, binaural hearing, pitch, loudness,
timbre, masking, and critical bands. Ear physiology is the mechanics of the outer,
middle, and inner ear, as well as the place theory and the auditory nerve. Ear
sensitivity states that humans have a wide hearing range, volume wise, and that low
frequencies are harder to hear when soft. Binaural hearing is the ability to identify the
location of a sound. Pitch refers to how a frequency is perceived and is measured
using a logarithmic scale. Loudness, or how the intensity of sound is heard, is related
to frequency in the sense that different frequencies at the same intensity can have
different loudnesses. Timbre is the set of characteristics of sound that deal with harmony
and style. Masking is when louder sounds overpower softer sounds and is an important
factor in audio perceptual coding. The smallest range of frequencies that activates the
same section of the basilar membrane is the critical band, which contributes to the
loudness of a sound; the closer the critical bands of multiple sounds are, the less the
loudness will increase from the original sound.
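As a concrete illustration of the logarithmic pitch scale mentioned above, the perceived
interval between two frequencies depends on their ratio rather than their difference. Below
is a rough sketch, assuming Python; the twelve-semitones-per-octave formula and the 440 Hz
reference are standard conventions, not values taken from this source.

    import math

    def semitones_between(f1_hz, f2_hz):
        """Pitch interval in semitones between two frequencies (12 semitones per doubling)."""
        return 12 * math.log2(f2_hz / f1_hz)

    # Doubling the frequency is always heard as the same interval (an octave),
    # whether it is 100 -> 200 Hz or 1000 -> 2000 Hz.
    print(semitones_between(100, 200))               # 12.0
    print(semitones_between(1000, 2000))             # 12.0
    print(round(semitones_between(440, 466.16), 2))  # about 1 semitone (A4 to A#4)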

Through its concise and informative presentation, this source helped establish the
stronger background on the biological functions of the ear that is needed before
researching how sound is perceived in the brain. The next step is to determine how the
signals sent from the ears are processed in the brain.

Tzanakis, I., Lebon, G. S. B., Eskin, D. G., & Pericleous, K. A. (2016). Characterisation of the
ultrasonic acoustic spectrum and pressure field in aluminium melt with an advanced
cavitometer. Journal of Materials Processing Technology, 229, 582–586.
doi:10.1016/j.jmatprotec.2015.10.009
This journal investigates how ultrasonic audio waves affect liquid aluminum through
acoustic cavitation. Treating metals with ultrasonic waves has been shown to improve
their structure and sturdiness. Cavitation, the formation and destruction of
bubbles in liquids, of liquid metals is a subject that has been overlooked in the past but
shows great promise. By using a cavitometer this experiment was able to compare the
pressure created by the acoustics in liquid aluminum and water. The data was analyzed
by creating a graph comparing the effects of frequency (kHz) on amplitude (dBu)
collected from the water and a separate graph with the same variables for the liquid
aluminum. It was concluded that the acoustic pressure is diminished by the liquid
aluminum and spread more evenly across the water, that in liquid aluminum the pressure
increases with higher frequencies, and that a numerical measurement could be applied
to acoustic pressure in liquid metals.

A new view on how acoustic waves can be used was provided by this article,
encouraging research on the less explored topics within acoustics. There were multiple
confusing topics that required extra inquiry, but the many visuals helped in obtaining a
more complete understanding of the investigation.
Weinstein, D., & Weinstein, S. (1964). Intensity and spatial measures of somatic sensation as a
function of body-part, laterality, and sex. ​PsycEXTRA Dataset​.
https://doi.org/10.1037/e572342012-122
This document details how sensitive multiple body parts are to vibrotactile input and
pressure.
