
Blind people: how they see through sound

How a deaf artist and a blind activist experience the world

Born deaf, TED Fellow and artist Christine Sun Kim (TED Talk: The enchanting music of sign language)
uses sound as her medium, so when she saw Daniel Kish (TED Talk: How I use sonar to navigate the
world) explain how he uses the echoes from the clicks of his tongue to navigate the world without sight, she was intrigued. The two of them got together to talk about how they each perceive
sound, and to share their thoughts on sight and sound etiquette. This conversation was translated
live from English to American Sign Language and back again by Dylan Geil and Denise Kahler.
Christine Sun Kim: What's your relationship with sound? How do you use it to perceive your
environment?
Daniel Kish: In one sense, all sound is the same, in that all sound creates pressure waves in the air.
But I distinguish between incident sounds and reflection sounds: those that are made, and those that are reflected. Any sound that is made will also reflect from surfaces around it, so I hear both, but it's the reflection of sound that provides me with a sense of the dimension of space and
the objects occupying a space.
The noisier an environment, the more difficult it is to sort echoes out from my surroundings.
Daniel Kish
CSK: You use a cane for navigation. How do you make the decision to use the cane versus
echolocation clicking?
DK: That's a bit like asking, under what circumstances would you use central vision versus
peripheral vision? The reality is that I use them both together to accomplish a wide range of tasks.
Echolocation gives me a lot of information about objects that are far away or above the ground,
whereas the cane gives me good, detailed information about objects that are at ground level or
below ground level. Sonar doesn't detect those very well, or at all. So I'd be hard-pressed to choose
between them. People who only have central vision and no peripheral vision have a very hard time.
They can read, but they can't move around very well. And people who only have peripheral vision and no central vision can move around just fine, but they can't read and they can't study faces very
well. I use a variety of skills, strategies and perceptions to acquire a rich, comprehensive image of my
environment.
CSK: Are there certain surfaces and materials that reflect sound better than others? Do you like
some surfaces or textures better than others? Or do you use the contrasts to shape your world and
understand your relationship with a space?
DK: Precisely. It's not really a matter of like or dislike, but a matter of what information is available. It's true that there are materials that are less or more reflective. Hard surfaces, such as a wall, are relatively reflective; soft materials, such as furniture, are poorly reflective, so they're more difficult to hear. But that doesn't necessarily mean that I don't prefer them. I do tend to steer clear of noisy
environments.
Christine Sun Kim, far right, in conversation with Daniel Kish, center right. The pair's conversation
was translated into American Sign Language by Dylan Geil and Denise Kahler. Photo by Ryan
Lash/TED.
CSK: If the material doesn't reflect the sound well, or if, say, you're in a space so large that it takes a
while for the reflection to get back to you, do you feel a little bit less in control?
DK: The echo characteristics of an environment don't generally cause me to feel more or less in
control; but the noise levels in an environment can. The noisier an environment, the more difficult it
is to sort echoes out from my surroundings, such as at a noisy party. I prefer not to do it, but I am
able to. It might be a bit like driving in heavy rain. People prefer not to do it, but they do what they
must.
CSK: Speaking of that, I was thinking about weather, too. Does rain affect the way sounds are, or if
it's hot outside or if there's snow on the ground?
DK: Yes, different environments do result in different effects. Rain covers everything with water,
which is highly reflective, and it removes sound diversity from surfaces, so rain tends to make
everything sound the same.
CSK: Would you say the character of the sound is monotonous?
DK: Kind of like that, yes.
CSK: Does it make the environment kind of boring?
DK: Well, it makes it more difficult to distinguish different characteristics. So where a bush would
ordinarily sound clearly like a bush, it may be more difficult to identify it when it's covered with
water.
Snow has a tendency to pad things and make them sound absorbent. But at the same time, those
surfaces that are not covered in snow are reflective, so they stand out. So take, for example, a tree
thats surrounded at the base by snow. The upper portion of the tree will stand out, sort of
highlighted against the snow. Many blind people don't like to travel in the snow, but when they
learn flash sonar, they find it much easier.
When I'm sharing a hotel room with a hearing person, I'm very conscious about everything from using the laptop to turning on the faucet, because I have no way to gauge the relative noise this might make.
Christine Sun Kim
CSK: I'm curious about social norms: sound etiquette. Do you feel uncomfortable clicking among
others? How do sighted people respond to it?
DK: There's a range. In general I find that sighted people don't really notice the clicking. They'll hear it if you point it out to them, but they don't tend to hear it incidentally. Sighted people don't tend to notice lots of sounds that blind people notice. I would expect that you notice things visually that many sighted, hearing people don't, too.
Sometimes I have to clarify this to blind people, because it can be hard for them to understand that
a sound that they can always hear clearly is not noticed by most sighted people. It helps them be less
self-conscious.
What's most noticeable to sighted people is how blind people visibly present themselves: how they move, their posture. And echolocation helps with that.
CSK: I am very conscious of sound etiquette. When I'm sharing a hotel room with a hearing person, for example, I'm very conscious about everything from using the laptop to turning on the faucet, because I have no way to gauge the relative noise this might make and whether or not it will be disturbing. Sometimes I do have an "I don't care" attitude, but when I'm in a larger group I become
much more conscious.
I'm curious about how you're perceiving me now. You're hearing me through my interpreter; are you thinking about what I look like? Or would you like to hear some type of vocalization from me? I'm wondering what's going on in the back of your mind as we talk.
DK: I am aware of your hand movements, and I find them quite intriguing. I am very comfortable
orienting my attention and directing my attention on you as a person, even though I am hearing
what you say from another direction, via the translator.
CSK: Is directing your attention towards me a sort of visual etiquette, to make things easier and
more comfortable for me?
DK: You probably have it happen to you a lot, where someone you're speaking with directs their attention to your interpreter. I have it happen a lot, that if I am with someone, someone else we encounter will often direct their attention to the person I'm with, rather than to me. So I understand
very well the misconception that leads to that mistake.
CSK: Yes, you're right. Absolutely. What about touch etiquette? I have some friends who are both deaf and blind, and they may physically grab onto someone's arm as a way of communicating. But in general, in this culture, touch is not really accepted or encouraged. I'm wanting to reach out and touch you, but I'm wondering what your usual preference is. Is that something that you're OK with, or something that's not appropriate?
DK: I think that is something that's worked out between two individuals, though of course it's
influenced by culture, as all human interaction is.
A lot of blind people would have conducted this entire interview facing straight ahead. Or maybe
even with their head down. On a stage, I'd say the majority of blind people would stand exactly in one place and just look straight ahead. Blind people often fall into this "wooden neck" syndrome, where their heads are fixed straight ahead, and so they're not really engaging people. It looks
unnatural, and yet I would say it happens that way most of the time. The unfortunate part is that
blind people are not learning visual social engagement. I suspect that a lot of the reason people
appreciate me as a speaker has as much to do with how I present physically and visually as with
what I say. I think it would be a very different experience if I didn't physically engage the audience
when I speak. I think a lot of people would feel disconnected from me.
I was wearing a Hawaiian shirt covered in brightly colored palm trees to a formal occasion.
Daniel Kish
CSK: I do the same thing, but in the opposite way. Sometimes, when I'm giving a speech to a group, I lessen my facial expression when I sign, toning it down to make myself a little bit more presentable. In rehearsals, I'll ask my interpreter, "What's the tone I'm giving out?" I want to look "hearing-friendly" to the audience, if that's the term. So I know exactly what you're doing, but I do that in the opposite way.
I wish I could just be myself, but I want to connect with people, and to do that, I have to understand
their culture, and their vision and sound etiquette. I can actually be quite a loud person. Last week, I
got a note from my neighbor who lives upstairs, complaining that I'm too loud when I arrive home: when I open and close the door, when I walk around. I thought, "But I'm at home. I want to be myself completely, feel free to do what I want to do." But even at home, I guess the floor makes a lot of sound and reverberates.
DK: Yes. It would be a bit like me being concerned about the colors I wear. I have no concern
whatsoever about it, but I know that other people do. So I have a color identifier in my phone that
tells me what colors I'm wearing. I had someone go through my closet and get rid of all of the busy
colors, anything printed or with stripes, because Im on camera a lot.
Ill tell you a funny story: one time I showed up to a formal occasion and I had a nice collared,
button-down shirt, and I wore it. It was the only shirt I brought with me. I started to approach the
venue, and a friend of mine laughed good-naturedly and said, "Oh, I really like your shirt!" And I kind of laughed and said, "Why do you like my shirt so much?" And she said, "I like all the palm trees!" So
I was wearing a Hawaiian shirt covered in brightly colored palm trees to a formal occasion.
They all laughed because they know I'm from California, and Californians are a bit strange, so no one
really took offense.
http://ideas.ted.com/how-a-deaf-artist-and-a-blind-activist-experience-the-world/

The blind boy who learned to see with sound

Daniel Kish has taught thousands of people all over the world how to "see" by using sound. But of all
the students he's taken on, none have been quite like Ethan.
On 26 January 2015, 10-year-old Ethan Loch, from Bonnybridge, near Falkirk, walked through the
doors of St Mary's Music School in Edinburgh. He was there for an audition. While his parents waited
outside, he was shown through to the music room. He sat down at the piano and began to play. He
knew his chances were slim.
He was competing against hundreds of other children for a coveted place in one of the top music
schools in the country. But his audition was unlike anyone else's. Because Ethan is blind.
I'm not walking around blind. I know I am actually blind, but I'm clicking so that I can hear echoes to
help me find the way.
Ethan Loch
"He would stand at the piano for hours when he was a toddler," his mother Larinda says. "Aged
three-and-a-half he'd worked through the entire first movement of Beethoven's Moonlight Sonata."
At the same age Ethan developed a fascination with sounds.
He would record them on dictaphones - things like vacuum cleaners, washing machines,
trampolines, hand-dryers and trains. He'd then go to the piano, and, thanks to having perfect pitch,
was able to recreate them musically. Soon, he began lessons on the piano and the accordion, and a
few years later, one of his teachers suggested this obvious piano prodigy should audition for music
school.
A month later he received the news he'd been hoping for - he had been offered a place. Ethan was to
become St Mary's first blind pupil. But there was a problem. Ethan found getting around a struggle - more, perhaps, than other blind children his age - and to get to St Mary's every day, he would have to board a train and cross two main roads.
Worried that he might not easily meet this challenge, his parents turned to someone they'd got to
know when they were living overseas - Daniel Kish.

Daniel Kish uses his echolocation skills to describe what's inside a park he's never been to before.
Like Ethan, Daniel Kish is blind. His eyes were removed when he was 13 months old because of
retinal cancer, but it had little impact on his mobility. "I'm told that the first thing I did upon
awakening from the surgery was to climb out of my crib and begin wandering around the nursery,"
Daniel says.
As he explored, Daniel would click his tongue against the roof of his mouth and the sounds would
help him work out what was around him. He was beginning to master echolocation, although he
didn't know it was called that back then. It is similar to animal echolocation, which is used by bats.
"That clicking sound bounces off surfaces throughout the environment," he says. "And it comes back
with information - distances, locations, positions, contours, densities. I can construct images from
that information."
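
The physics behind that information is straightforward: sound travels through air at roughly 343 metres per second, so the delay between a click and its returning echo encodes the distance to whatever reflected it. Here is a minimal sketch of the relationship in Python, with illustrative figures only (none of this comes from Kish's own training materials):

```python
SPEED_OF_SOUND = 343.0  # metres per second, in air at about 20 C

def echo_delay(distance_m):
    """Round-trip delay, in seconds, of an echo off a surface at distance_m."""
    return 2.0 * distance_m / SPEED_OF_SOUND

def distance_from_echo(delay_s):
    """Distance, in metres, to the surface that returned an echo after delay_s."""
    return SPEED_OF_SOUND * delay_s / 2.0

for d in (0.5, 2.0, 10.0):
    print(f"surface at {d:4.1f} m -> echo returns after {echo_delay(d) * 1000:5.2f} ms")
# surface at  0.5 m -> echo returns after  2.92 ms
# surface at  2.0 m -> echo returns after 11.66 ms
# surface at 10.0 m -> echo returns after 58.31 ms
```

The delays involved are only a few milliseconds, which is why click and echo fuse into a single percept rather than being heard as two distinct sounds.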

By the age of six, Daniel had got so good at echolocation that he could ride a bike down the road,
clicking to avoid people and cars. Neuroscientists were so fascinated by his skills that they carried
out an experiment with him. They found that when he clicks, he's activating the visual part of his
brain. He's now in his 40s, a slim, energetic man, with bright blue prosthetic eyes. He still uses a cane
- he's not suggesting that echolocation replaces it but merely enhances it - and he likes to go rock
climbing and hiking.
Daniel has taught echolocation to thousands of other blind people through his school, World Access
for the Blind, in California, and in July last year, he flew to Scotland to tutor Ethan, two months
before he was due to start at music school. Ethan wasn't completely new to the technique - many
blind people use it. But he'd never been taught as intensively as this.
Ethan began with some clicking exercises in the family's living room. Daniel held up a book a little
distance from one side of Ethan's face. Ethan clicked into the air, trying to work out from the echoes
where the book was. When he directed his clicks towards it, he heard a distinctive sound coming
back and grabbed the book. He was a fast learner and so Daniel soon progressed on to smaller and
smaller objects.
A few days later, Daniel showed him how he could use echolocation to understand the size and
shape of an unfamiliar room by "clicking for corners." Daniel spun him round until Ethan was
disorientated.
Ethan used his tongue to make clicks bounce off the walls of the room until he could hear the
particular echo they made when he directed them against the corner. He then headed straight
towards it.
With just a few clicks, Ethan could now create a sonic representation of the structure of a room in an
instant. It was going well, but when they tried it outside, it was a different story.
One morning, Daniel took Ethan on a practice run of the route to his new school. As they were leaving the train station, Ethan walked into a pole, then veered into the road. Daniel pulled him back
sharply. Then, when they arrived at St Mary's, Ethan started ploughing into the bushes. "What's
going on?" asked Daniel. "When have we ever gone through the bushes?"

It was a difficult day. But between them, Daniel and Ethan were beginning to work out what the
problem was. It was something to do with Ethan's fascination with sounds. "Sometimes I lose track if
I hear sounds in the distance," Ethan says. "I want to know where these sounds are coming from,
and so I keep going towards them. My ears keep wanting to hear them."
"You can see that he knows where something is," Daniel says. "You can tell by his head movements.
He'll be clicking right at something, he's almost there, within inches, and then something will just
draw him off course."
Ethan was even drawn to the sound of cars in the road. So instead of sounds helping Ethan to
understand the world around him - where the cars are or where the pavement is - he was getting
lost in them inside his head. It was something Daniel needed to change.
A few days later, Daniel took Ethan on a four-hour walk up rocky tracks and narrow paths. The idea
was to push him out of his comfort zone. Every few minutes, Daniel would shout: "Ethan - is your
attention external or internal?" "External," Ethan replied. After a shaky start, Ethan began to focus.
At the top of the hill, Ethan put down his cane, cupped his hands around his mouth and yelled into
the sky, listening to the way his voice returned to him after bouncing off nearby hills. "Shouting for
echoes," Daniel calls it.
Over the course of those few weeks, Daniel drilled Ethan - on the clicking, on his cane work, and on staying focused on the world around him. He left at the end of the summer. Now it was down to
Ethan.

Three months later I went to meet Ethan and his mum, Larinda, at St Mary's for Ethan's final day of
term. The difference was remarkable. As Ethan walked through the station he was much less
distracted by the sounds around him. When we got to the pedestrian crossing, he "echolocated" the
pole and navigated around it. When we reached the school, after firing clicks into the air to work out
where the building was, he walked towards it.
Later, he went to the church to play in the school concert. It would be his first performance in an
orchestra. He'd had to learn the whole piece, and memorise his solo. It's one of the difficulties of
playing in an orchestra if you're blind - there's rarely a Braille version of the music. That night, he
walked on stage, clicked to find his chair, and sat down. His solo was note perfect. At the end of it,
Larinda was in tears.
"When we came to the school this morning with Ethan I just couldn't believe what I was seeing," she
says. "He used all the wonderful strategies that Daniel's given him. The world's a big place and he's
beginning to find that out."
Ethan, too, was on a high. "I'm not walking around blind," he says. "I know I am actually blind, but
I'm clicking so that I can hear echoes to help me find the way. I just see in a different way."
Later, I heard one of Ethan's latest compositions and I thought I could hear something new in his
playing. It felt more confident, more free. Daniel Kish hadn't just taught Ethan echolocation. He'd
helped him find something more profound - independence.
http://www.bbc.com/news/disability-35550768

How blind people use batlike sonar

Blind from infancy due to retinal cancer, Daniel Kish learned as a young boy to judge his height while
climbing trees by making rapid clicking noises and listening for their echoes off the ground. No one
taught him the technique, which is now recognized as a human form of echolocation. "He just used it, without knowing that he behaved like a bat," says Lutz Wiegrebe, a neurobiologist at the Ludwig
Maximilian University in Munich, Germany.

Like Kish, a handful of blind echolocators worldwide have taught themselves to use clicks and echoes
to navigate their surroundings with impressive ease - Kish can even ride his bike down the street, as his daring YouTube videos show. A study of sighted people newly trained to echolocate now suggests that the secret to Kish's skill isn't just supersensitive ears. Instead, the entire body, neck, and head are key to seeing with sound - an insight that could assist blind people learning the skill.

Bats and other animals that rely on sounds to detect prey in the dark move their ears much like
humans use their eyes to track an object of interest, making constant adjustments to their ear
positions. When bats echolocate, they emit rapid-fire, high-frequency clicks (usually out of range of
human hearing), then swivel their ears like radar dishes to catch the echoes, a system sensitive
enough to detect objects as thin as a human hair and tiny, night-flying insects. Unlike bats' large, mobile ears, however, human ears are small and fixed - an obstacle to blind people who use their ears to see.

To test the extent to which people can compensate for this immobility, Wiegrebe and colleagues
recruited eight undergraduates with normal vision to don blindfolds and learn some basic
echolocation skills. The students were first taught to produce sharp, high-frequency clicks with their
tongues. Then they were blindfolded and led into a long, narrow corridor, where they practiced
sensing the position of the walls based on how long it took for an echoed click to reach their ears.
Although some people are more naturally talented than others at echolocation, most got quite
good after 2 to 3 weeks of training, Wiegrebe says, and could reliably orient themselves to walk
down the corridor without running into any walls using just clicks and echoes.

Next, the researchers created a virtual version of the corridor to test how important head and body
movements, rather than hearing alone, had been to the students' accuracy. Blindfolded subjects sat
in a chair wearing headphones while a computer program simulated the acoustics of the real-life
corridor when they clicked into a microphone. To ensure that the acoustics of the simulated room
were realistic, the researchers asked two blind echolocation experts to navigate it first; both were
quickly able to orient their bodies toward the center of the aisle.

The blindfolded students were also instructed to use their clicks and echoes to line up their bodies
with the center of the corridor. In one test, they were told to rotate the virtual corridor using a joystick, without making any head or body movements. In another, the corridor was fixed and
participants were allowed to swivel their chairs and heads to determine their position in the room.
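
The study's actual acoustic simulation is more sophisticated, but the cue the corridor task turns on can be sketched in a few lines: a listener off the centreline hears the near wall's echo sooner than the far wall's, and the difference between the two delays encodes the offset. A rough sketch under assumed geometry (not the researchers' code):

```python
C = 343.0  # speed of sound in air, m/s

def wall_echo_delays(width_m, offset_m):
    """Round-trip delays (s) of the left- and right-wall echoes, for a
    listener offset_m metres to the right of the corridor centreline."""
    t_left = 2.0 * (width_m / 2.0 + offset_m) / C
    t_right = 2.0 * (width_m / 2.0 - offset_m) / C
    return t_left, t_right

def offset_from_delays(t_left, t_right):
    """Recover the lateral offset (m): the delay difference is 4x/c."""
    return C * (t_left - t_right) / 4.0

t_l, t_r = wall_echo_delays(width_m=2.0, offset_m=0.3)
print(f"left wall: {t_l * 1000:.2f} ms, right wall: {t_r * 1000:.2f} ms")  # 7.58 / 4.08
print(f"recovered offset: {offset_from_delays(t_l, t_r):.2f} m")          # 0.30
```

Swivelling the head or chair, on this picture, lets a listener resample those delays from new positions, which may be part of why the fixed-body condition proved so much harder.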

The difference between the two conditions was stark, Wiegrebe says. When the participants
couldnt move their heads or torsos, they zigzagged down the virtual hall and were unable to self-
correct before hitting a wall. When the corridors position was fixed and their bodies and heads
were free to move, however, the novice echolocators soon righted themselves, the team reports
online today in the Proceedings of the Royal Society B.

"The virtual corridor is a very creative way to determine just how important body movements are to echolocation," says Lore Thaler, a psychologist at Durham University in the United Kingdom. The
findings fit well with her own recent study, which showed that head movements can enable blind
echolocation experts to sense an objects contours, she says. The research also provides a new way
of studying echolocation that can't be done in animals, she notes. After all, a bat can't use a
joystick.
Echolocation is a skill that has evolved independently several times in the animal kingdom in
response to low-visibility conditions - whether at night, as with bats and a few nocturnal birds, or in murky water, as with whales and dolphins - Wiegrebe notes. "It's not magic."
still in its early stages, he hopes that a virtual reality program similar to that used in the study will
eventually help blind people learn to use echolocation in the safety and privacy of their homes.
http://www.sciencemag.org/news/2014/11/how-blind-people-use-batlike-sonar

Some blind people 'see' with their ears, neuropsychologists show

Dr. Olivier Collignon of the University of Montreal's Saint-Justine Hospital Research Centre compared
the brain activity of people who can see and people who were born blind, and discovered that the
part of the brain that normally works with our eyes to process vision and space perception can
actually rewire itself to process sound information instead.

The research was undertaken in collaboration with Dr Franco Lepore of the Centre for Research in
Neuropsychology and Cognition and was published March 15 in the Proceedings of the National
Academy of Sciences.

The research builds on other studies which show that the blind have a heightened ability to process
sounds as part of their space perception. "Although several studies have shown occipital regions of
people who were born blind to be involved in nonvisual processing, whether the functional
organization of the visual cortex observed in sighted individuals is maintained in the rewired
occipital regions of the blind has only been recently investigated," Collignon said. The visual cortex,
as its name would suggest, is responsible for processing sight. The right and left hemisphere of the
brain have one each. They are located at the back of the brain, which is called the occipital lobe.
"Our study reveals that some regions of the right dorsal occipital stream do not require visual
experience to develop a specialization for the processing of spatial information and are functionally
integrated in the preexisting brain network dedicated to this ability."

The researchers worked with 11 individuals who were born blind and 11 who were not. Their brain
activity was analyzed via MRI scanning while they were subjected to a series of tones. "The results
demonstrate the brain's amazing plasticity," Collignon said. Plasticity is a scientific term that refers
to the brain's ability to change as a result of an experience. "The brain designates a specific set of
areas for spatial processing, even if it is deprived of its natural inputs since birth. The visually
deprived brain is sufficiently flexible that it uses a 'neuronal niche' to develop and perform functions that are sufficiently close to the ones required by the remaining senses. Such research demonstrates that the brain should be considered more as a function-oriented machine than as a pure sensory machine."

The findings raise questions regarding how this rewiring occurs during the development of blind newborn babies. "In early life, the brain is sculpting itself on the basis of experience, with some synaptic
connections eliminated and others strengthened," Collignon noted. Synaptic connections enable our
neurons, or brain cells, to communicate. "After a peak of development ending approximately at the
age of 8 months, approximately 40% of the synapses of the visual cortex are gradually removed to
reach a stable synaptic density at approximately the age of 11 years. It is possible that the
rewiring occurs as part of the maintenance of our ever changing neural connections, but this theory
will require further research," Collignon said.
Collignon's study received funding from the Fondation de l'Hôpital Sainte-Justine, the Fonds de la recherche en santé du Québec, the Canadian Institutes for Health Research, the Natural Sciences and Engineering Research Council of Canada, and the Fonds de la Recherche Scientifique of Belgium.
https://www.sciencedaily.com/releases/2011/03/110316104123.htm

The Blind Individuals Who See By Sound

Daniel Kish has been blind since he was 13 months old, but you wouldn't be able to tell. He navigates
crowded streets on his bike, camps out in the wilderness, swims, dances and does other activities
many would think impossible for a blind person. How does he do it? Kish is a human echolocator, a
real-life Daredevil.
Using a technique similar to what bats and dolphins use, human echolocators navigate using audio
cues given off by reflective surfaces in the environment. Few people know that this same technique
can work for human beings. But as a matter of fact, echolocation comes quite naturally to people
like Kish, who are deprived of visual information. "I don't remember learning this," he says. "My earliest memories were of detecting things and noting what they might have reminded me of and then going to investigate."
Kish was born with bilateral retinoblastomas, tiny cancers of the retina, which is part of the eye
responsible for sensing visual information. Tumors form early in this type of cancer, so aggressive
treatment is necessary to ensure they don't metastasize to the rest of the body.
Unfortunately, the tumors cannot be separated from the retina. Laser treatments are performed to
kill them off, followed by chemotherapy. The result is that the retina is destroyed along with the
cancer, meaning patients often are left completely blind. Kish lost his first eye at 7 months and the
other at 13 months. He has no memory of having eyesight. His earliest vivid memory is from when
he was very young, maybe 2. He climbed out his bedroom window and walked over to a chain-link
fence in his backyard. He stood over it, angled his head upward and clicked over it with his tongue,
listening for the echo. He could tell there were things on the other side. Curious as to what they
were, he climbed over the fence and spent much of his night investigating.
Like Kish, Ben Underwood was a self-taught echolocator and was also diagnosed with bilateral
retinoblastomas, in his case at the age of 2. After many failed attempts to save his vision by treating
the tumors with radiation and chemotherapy, his mother made the difficult decision to remove her
sons right eye and left retina. This left Ben completely blind.
A couple of years later, when Ben was in the back seat of the car with the window down, he
suddenly said, "Mom, do you see that tall building there?" Shocked by his statement, his mother responded, "I see the building, but do you see it?"
It turned out Ben had picked up on the differences in sounds coming from empty space versus a tall
building. When Ben was in school, he started clicking with his tongue. At first it was an idle habit, but
then he realized he could use the skill to detect the approximate shape, location and size of objects.
Soon, Ben was riding a bicycle, skateboarding, playing video games, walking to school and doing
virtually anything else an ordinary boy his age could do. He never used a guide dog, a white cane or
his hands. Very sadly, Ben passed away in 2009 after the cancer that claimed his eyes returned.
Experiencing Echolocation
For centuries, researchers have been trying to find out how blind people compensate for their loss of
vision. It was clear that some blind people occasionally were able to hear objects that were
apparently making no sounds. But no one knew exactly how blind people did this. And although bat
echolocation was documented in 1938, scientists didn't become seriously interested in the
phenomenon until the early years of the Cold War, when military funding made the research
feasible. It turns out human echolocation is akin to active sonar and the kind of echolocation used by
dolphins and bats, but less fine-grained. While bats can locate objects as small as flies, human
echolocators report that objects must be much larger - about the size of a water glass - for them
to be locatable.
Philosophers and neuroscientists often talk about phenomenology, or what it's like to have an experience. If we show you a red ball and ask you about its color, assuming you're not colorblind, it should be easy for you to answer "red." However, if we ask you to describe what it's like to see the color red, you'd have a much harder time answering. By their very nature, questions about phenomenology can be nearly impossible to answer, making it hard to discover exactly what it's like
to experience echolocation.
Indeed, Kish says that, because he has been blind for as long as he can remember, he has nothing to
compare his experience to. He can't really say whether his experience is like seeing. However, he
says he definitely has spatial imagery, which has the properties of depth and dimension. Research
indicates that the imagery of echolocation is constructed by the same neurology that processes
visual data in sighted people. The information isn't traveling down the optic pathway - the connection from the eyes to the brain - but it ends up in the same place. And some individuals who
have gone blind later in life describe the experience as visual, in terms of flashes, shadows or bright
experiences. It seems possible that echolocators have visual imagery that is similar to that of sighted
people.
Ben's case provides some evidence of this, as he consistently reported seeing the objects he could
detect, not just hearing them. And while self-reports are notoriously unreliable, there is other
evidence that Ben really could see with his ears.
Ben had prosthetic eyes that replaced his real ones, but since his eye muscles were still intact, the
prosthesis moved in different directions, much like real eyes. In the documentary Extraordinary
People: The Boy Who Sees Without Eyes, it's clear Ben's prosthetic eyes were making saccades in
several situations that require focusing quickly on different objects in the peripheral field. Saccades
are the quick, coordinated movements of both eyes to a focal point. The documentary never
discussed Ben's saccadic eye movements, but there's no doubt they occurred and that they matched
the auditory stimuli he received. For example, in one scene, he was playing a video game that
required destroying objects entering the scene. Although echolocation didn't give him the ability to see the images on a flat screen, Ben could play games based on the sound effects played through the television's speakers. Like many blind people, Ben used the sound cues to figure out where
objects were on the screen. His saccadic eye movements corresponded to the changes in location of
the virtual objects.
The primary role of saccadic eye movements is to guarantee high resolution in vision. We can see
with high resolution only when images from the visual field fall on the retina's central region, called the fovea. When images fall on the more peripheral areas of the retina, we don't see them very
clearly. Only a small fraction of an entire scene falls on the fovea at any given time. But rapid eye
movements can ensure that you look at many parts of an entire scene in high resolution. Although
it's not immediately apparent, the brain creates a persisting picture based on many individual
snapshots.
Many other factors govern eye movements, including changes in and beliefs about the environment,
and intended action. For example, your eyes move in the direction of a sudden noise. A belief that
someone is hiding in the tree in front of you makes your eyes seek out the tree. And intending to
climb a tree triggers your eyes to switch from the ground to the tree so you can inspect it properly.
Even when recalling visual imagery and there is no sensory input or external environment to process,
saccadic eye movements still occur. This happens because when the brain stores information about
the environment, it stores information about eye movements along with it. When your brain
generates a visual image, it's likely a composite of different snapshots of reality, and rapid eye
movements help keep the visual image organized and in focus. This dual storage mechanism explains
Ben's saccadic eye movements: The sound stimuli from Ben's environment triggered his brain to
generate spatial imagery matching the sound stimuli, and his saccadic eye movements helped keep
the image organized and in focus.

Seeing with Your Ears


Sighted people often use a simple form of echolocation, too, perhaps without even realizing it.
When you're hanging a picture on a wall, one way to locate a stud within it is to knock around and listen for changes in pitch. But when you tap on a hollow space in the wall, you usually don't hear an actual echo - yet you can tell somehow that the space sounds hollow.
Research shows we can perceive these types of stimuli subconsciously. When we do hear echoes, it's
from sound bouncing back off distant objects. When you click your tongue or whistle toward a
nearby object, though, the echo returns so fast that it overlaps the original sounds, making it hard to
hear an echo. But the brain unconsciously interprets the combination of the sound thrown in one
direction and the returning sound as an alteration in pitch. What makes Ben and Kish so remarkable
is that they can use what everyone's brain unconsciously detects in an active way to navigate the
world. And although Ben and Kish may seem superhuman because of their perceptual abilities,
research confirms that sighted humans can acquire echolocation, too. After all, the visual cortex
does process some sounds, particularly when the brain seeks to match auditory and visual sensory
inputs.
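That "alteration in pitch" can be made concrete with the classic repetition-pitch effect: a sound fused with its own echo, delayed by dt seconds, is comb-filtered, with spectral peaks spaced 1/dt apart, and listeners tend to report a pitch near 1/dt. A back-of-envelope sketch with illustrative numbers (the article itself gives none):

```python
C = 343.0  # speed of sound in air, m/s

def repetition_pitch_hz(distance_m):
    """Approximate pitch heard when a sound overlaps its own echo
    from a reflector at distance_m (round-trip delay dt = 2d/c)."""
    delay_s = 2.0 * distance_m / C
    return 1.0 / delay_s

for d in (0.25, 0.5, 1.0, 2.0):
    print(f"surface at {d:4.2f} m -> pitch near {repetition_pitch_hz(d):6.1f} Hz")
# surface at 0.25 m -> pitch near  686.0 Hz
# surface at 0.50 m -> pitch near  343.0 Hz
# surface at 1.00 m -> pitch near  171.5 Hz
# surface at 2.00 m -> pitch near   85.8 Hz
```

Nearby surfaces thus tint a click with an audible coloration that shifts as you approach or retreat, even when no separate echo is heard.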
American psychologist Winthrop Niles Kellogg began his human-echolocation research program
around the time of the Cuban Missile Crisis. His research showed that both blind and sighted
subjects wearing blindfolds could learn to detect objects in the environment through sound, and a
study by another researcher showed that, with some training, both blind and sighted individuals can
precisely determine certain properties of objects, such as distance, size, shape, substance and
relative motion from sound alone. While sighted individuals show some ability to echolocate, Kellogg
showed that blind echolocators seem to operate a bit differently when collecting sensory data. They
move their heads in different directions when spatially mapping an environment, while sighted
subjects don't move their heads when given the same types of tasks.
Interestingly, when sighted individuals are deprived of visual sensory information for an extended
period of time, they naturally start echolocating, possibly after only a few hours of being blindfolded.
What's more, with their newfound echolocation skills comes some visual imagery. After a week of being blindfolded, the imagery becomes more vivid. One of Kellogg's research participants said he experienced "ornate buildings of whitish green marble" and cartoon-like figures.
Is sound alone responsible for echolocators' ability to navigate the environment? Researchers
wonder whether touch cues, such as the way air moves around objects, can offer information about
the surroundings. Philip Worchel and Karl Dallenbach from the University of Texas at Austin sought
to answer these questions in the 1940s. Their experiments involved asking both blind and
blindfolded sighted participants to walk toward a board placed at varying distances. Participants
were rewarded for learning to detect the board by not walking into it face-first. After multiple trials,
both the blindfolded and blind subjects became better able to detect the obstacle. After about 30
trials, blindfolded subjects were as successful at stopping in front of the boards as blind subjects
when they were wearing hard-soled shoes. But this ability disappeared when subjects performed the
same experiments on carpet or while wearing socks, which muffled the sound their footsteps
created. The researchers concluded that the subjects relied on sound emanating from their shoes,
implying sound is responsible for navigational ability.
The increased ability to navigate via sound appears to be the result of sound processing in the brain,
not merely increased acuity of hearing. One study showed that the blind and the sighted scored
similarly on normal hearing tests. But when a recording had echoes, parts of the brain associated
with visual perception in sighted people activated in echolocators but not in sighted people. These
results showed how echolocators extracted information from sound that wasnt available to the
sighted controls.
Some reports seem to indicate that humans can't perceive objects that are very close - within 2 meters of them - through echolocation. But a 1962 study by Kellogg at Florida State University showed that blind people can detect obstacles at much shorter distances - 30 to 120 centimeters. Some participants were accurate even within 10 centimeters, suggesting that although subjects aren't consciously aware of the echo, they still can respond appropriately to echo stimuli.
The above cases show that our perceptual experiences involve a lot more than just what we're consciously aware of. Our brain is primed to accomplish the seemingly superhuman, even at the basic level of perception. It's an extraordinary organ that creates our rich experiences by turning
waves that merely strike the eardrum into complicated, phenomenal representations of our
surroundings.
Teaching the Blind to See
Perhaps what is most amazing about echolocation is that it can be taught. Kish wrote his
developmental psychology master's thesis on the subject, developing the first systematic approach
to teaching the skill. Now, through his organization, World Access for the Blind, he strives to give the
blind nearly the same freedom as sighted people by teaching blind people to navigate with their
ears.
So why don't all blind people echolocate? The problem, says Kish, is that our society has fostered
restricting training regimens. One example of this restriction is the traditional cane training method.
The military developed the cane techniques more than 60 years ago for blind veterans, people used
to living in restricted circumstances. For them, it was easy to adapt to the regimented system. But
now, the majority of blind people, including children and the elderly, learn this system.
Kish's training curriculum differs from tradition by taking an immersive approach intended to activate environmental awareness. It's a tough-love approach with very little hand-holding. He
encourages children to explore their home environment for themselves and discourages family
members from interfering unless the child otherwise could be harmed.
Kish then changes the cane his students use during walking. The student holds the cane out in front,
elbow slightly bent, so the hand is roughly at waist height. With every step, the cane tip lands about
where the student's foot will land, so the cane clears the area where the child is about to step. The student repeatedly taps the cane from left to right, a technique called "two-point touch." Although computer models support this method, Kish thinks the movement is unnatural. "We are not robots," he says. "The reality is that the biomechanics do not sustain the kind of regimented movement you have to have in order for that to work - you lose fluidity of motion. You don't have to be a physical therapist to know that's a recipe for a wrist problem."
Then the real fun begins: The students learn to echolocate by systematic stimulus differentiation.
Notice the term "detection" isn't included. No stimulus really occurs in a vacuum, so the process is not so much detection as it is distinguishing one stimulus from another and its background. The process follows a standard learning structure: Students first learn to differentiate among strong, obvious stimuli and then advance to weaker, less obvious stimuli. Kish establishes a "hook" stimulus by using a plain panel he moves around in the student's environment. Students don't need to know what they're listening for, since the stimulus is selected to be strong enough that it captures the brain's attention. But the panel doesn't work for everyone at first. In those cases, he uses a 5-gallon bucket
or something else that produces a very distinct sound quality. Once the brain is hooked on the
characteristic stimulus, he starts manipulating its features to make the effect subtler.
The next set of exercises helps students learn how to determine what the objects actually are. This
essentially involves three characteristics: where things are, how large they are and depth of
structure, which refers to the geometric nature of the object or surface. Students answer questions
such as, "Is the object coarse or smooth?" "Is it highly solid or sparse?" and "Is it highly reflective or absorbent?" Kish says all those patterns come back as "acoustic imprints." The key is to notice the changes in the sound when it comes back from when it went out. With determined practice,
students eventually learn how to differentiate among general environmental stimuli.
Although the organization's instructors are currently all blind echolocators, Kish anticipates that
sighted instructors could teach the skill as well. Several sighted instructors are in training, showing
promise of using echolocation themselves. His goal is to have sighted instructors performing just as
well as blind echolocators, though he emphasizes that although both sighted and blind individuals
can echolocate, they may have profoundly different phenomenological experiences. In the
meantime, Kish is hard at work, teaching the blind how to use their ears to see.
http://discovermagazine.com/2015/july-aug/27-sonic-vision
Device plays sounds to let blind people see

The world is a jumble of sights, sounds, and smells. While these signals may seem distinct and
independent, they actually interact and integrate within the brain's network of sensory neurons.
A new assistive device for blind people taps into this sensory network. It translates images into
sounds, allowing visually impaired people to detect their environment without the need for hours of
training or intense concentration.
https://www.youtube.com/watch?v=2w3bfmL0RXg
The work is described in a paper published in Scientific Reports.
"Many neuroscience textbooks really only devote a few pages to multisensory interaction," says Shinsuke Shimojo, a professor of experimental psychology at the California Institute of Technology (Caltech) and principal investigator on the study. "But 99 percent of our daily life depends on multisensory - also called multimodal - processing."
As an example, he says, if you are talking on the phone with someone you know very well, and they
are crying, you will not just hear the sound but will visualize their face in tears. This is an example of
the way sensory causality is not unidirectional - vision can influence sound, and sound can influence
vision.
Shimojo and postdoctoral scholar Noelle Stiles have exploited these crossmodal mappings to
stimulate the visual cortex with auditory signals that encode information about the environment.
They explain that crossmodal mappings are ubiquitous; everyone already has them. Mappings
include the intuitive matching of high pitch to elevated locations in space or the matching of noisy
sounds with bright lights. Multimodal processing, like these mappings, may be the key to making
sensory substitution devices more automatic.
How the device works
The researchers conducted trials with both sighted and blind people using a sensory substitution
device, called a vOICe device, that translates images into sound.
The vOICe device is made up of a small computer connected to a camera that is attached to
darkened glasses, allowing it to see what a human eye would. A computer algorithm scans each
camera image from left to right, and for every column of pixels, generates an associated sound with
a frequency and volume that depends upon the vertical location and brightness of the pixels.
A large number of bright pixels at the top of a column would translate into a loud, high-frequency
sound, whereas a large number of lower dark pixels would be a quieter, lower-pitched sound. A
blind person wearing this camera on a pair of glasses could then associate different sounds with
features of their environment.
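That description maps naturally onto a short program. The sketch below is a simplified reconstruction rather than the actual vOICe implementation; the pitch range, column duration, and plain sine-wave synthesis are assumptions for illustration:

```python
import numpy as np

SAMPLE_RATE = 22050            # audio samples per second (assumed)
COLUMN_DURATION = 0.02         # seconds of sound per image column (assumed)
F_LOW, F_HIGH = 500.0, 5000.0  # pitch range, bottom row to top row (assumed)

def image_to_sound(image):
    """image: 2D array of brightness values in [0, 1], row 0 at the top.
    Returns a mono waveform that scans the image left to right."""
    n_rows, n_cols = image.shape
    freqs = np.linspace(F_HIGH, F_LOW, n_rows)   # higher rows -> higher pitch
    t = np.arange(int(SAMPLE_RATE * COLUMN_DURATION)) / SAMPLE_RATE
    columns = []
    for c in range(n_cols):                      # left-to-right scan
        brightness = image[:, c]                 # each pixel's amplitude
        tone = (brightness[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        peak = np.abs(tone).max()
        columns.append(tone / peak if peak > 0 else tone)  # normalize to avoid clipping
    return np.concatenate(columns)

# A bright diagonal stripe running from bottom-left to top-right
# comes out as a steadily rising sweep.
audio = image_to_sound(np.eye(32)[::-1])
```

A bright cluster high in the frame becomes a loud, high tone at a particular moment in the scan; the same cluster lower down becomes a lower tone - exactly the brightness-to-loudness and elevation-to-pitch mapping described above.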
Hear a sound, see a color
In the trials, sighted people with no training or instruction were asked to match images to sounds, while the blind subjects were asked to feel textures and match them to sound. Tactile textures can be related to visual textures (patterns) like a topographic map - bright regions of an image translate
to high tactile height relative to a page, while dark regions are flatter.
Both groups showed an intuitive ability to identify textures and images from their associated sounds.
Surprisingly, the untrained (also called "naive") group's performance was significantly above chance, and not very different from the trained group's.
The intuitively identified textures used in the experiments exploited the crossmodal mappings
already within the vOICe encoding algorithm.
"When we reverse the crossmodal mappings in the vOICe auditory-to-visual translation, the naive performance significantly decreased, showing that the mappings are important to the intuitive interpretation of the sound," explains Stiles.
"We found that using this device to look at textures - patterns of light and dark - illustrated intuitive neural connections between textures and sounds, implying that there is some preexisting crossmodality," says Shimojo.
One common example of crossmodality is a condition called synesthesia, in which the activation of
one sense leads to a different involuntary sensory experience, such as seeing a certain color when
hearing a specific sound. "Now, we have discovered that crossmodal connections, preexisting in everyone, can be used to make sensory substitution intuitive with no instruction or training."
The researchers do not exactly know yet what each sensory region of the brain is doing when
processing these various signals, but they have a rough idea.
Why the brain cross-processes
"Auditory regions are activated upon hearing sound, as are the visual regions, which we think will process the sound for its spatial qualities and elements. The visual part of the brain, when processing images, maps objects to spatial location, fitting them together like a puzzle piece," Stiles says.
To learn more about how the crossmodal processing happens in the brain, the group is currently
using functional magnetic resonance imaging (fMRI) data to analyze the crossmodal neural network.
These preexisting neural connections provide an important starting point for training visually
impaired people to use devices that will help them see. A sighted person simply has to open their
eyes, and the brain automatically processes images and information for seamless interaction with
the environment.
Current devices for the blind and visually impaired are not so automatic or intuitive to use, generally
requiring a user's full concentration and attention to interpret information about the environment.
The Shimojo lab's new finding on the role of multimodal processing and crossmodal mappings starts
to address this issue.
What is seeing?
Beyond its practical implications, Shimojo says, the research raises an important philosophical
question: What is seeing?
"It seems like such an obvious question, but it gets complicated," says Shimojo. "Is seeing what happens when you open your eyes? No, because opening your eyes is not enough if the retina [the light-sensitive layer of tissue in the eye] is damaged.
"Is it when your visual cortex is activated? But our research has shown that the visual cortex can be activated by sound, indicating that we don't really need our eyes to see. It's very profound - we're trying to give blind people a visual experience through other senses."
The National Science Foundation, the Della Martin Fund for Discoveries in Mental Illness, and the
Japan Science and Technology Agency, Core Research for Evolutional Science and Technology funded
the work.
http://www.futurity.org/device-sounds-blind-people-1038402/
