12/21/2017 Beware emotional robots: Giving feelings to artificial beings could backfire, study suggests | Science | AAAS

Participants watched this scene play out in virtual reality and rated its eeriness.
J.-P. Stein, P. Ohler, Cognition 160 (March 2017) © Elsevier B.V.

Beware emotional robots: Giving feelings to artificial beings could backfire, study suggests
By Matthew Hutson Mar. 13, 2017 , 11:00 AM

In the recent movie Rogue One: A Star Wars Story, the face of the character Grand Moff Tarkin was
constructed digitally, as the actor who had originally played him had died. Some who knew about the
computer trickery saw his appearance as slightly unnatural, leading to a sense of unease. Their discomfort
demonstrates what the Japanese roboticist Masahiro Mori referred to in 1970 as the “uncanny valley”: Our
affinity toward robots and animations increases as they physically appear more humanlike, except for a
large dip where they are almost but not quite there.

But what happens when a character’s appearance remains the same, but observers think its mind has
become more humanlike? New research reveals that this, too, unnerves people, a finding that could have
implications for a range of human-computer interactions.

The study “pushes forward work on the uncanny valley” by showing that “it’s not simply how [something]
moves and how it looks, but also what you think it represents,” says Jonathan Gratch, a computer scientist
at the University of Southern California in Los Angeles, who was not involved with the work. “There’s going
to be a lot more human-machine interactions, human-machine teams, machines being your boss, machines
writing newspaper articles. And so this is a very topical question and problem.”


Previous work has shown that people feel discomfort with humanlike robots when they ascribe more emotions to
them. In a study published by the psychologists Kurt Gray of the University of North Carolina in Chapel Hill
and Daniel Wegner (now deceased) in 2012, participants watched a brief video of a robot’s head either
from the front, where they could see its “human” face, or from behind, where they saw electrical
components. The ones who watched its face rated it as more capable of feeling pain and fear, and as a
result they felt more “creeped out.”  

But what happens when the appearance of an artificial intelligence remains the same but its emotions
become more humanlike? To find out, Jan-Philipp Stein and Peter Ohler, psychologists at the Chemnitz
University of Technology in Germany, gave virtual-reality headsets to 92 participants and asked them to
observe a short conversation between a virtual man and woman in a public plaza. The characters discuss
their exhaustion from hot weather, the woman expresses frustration about lack of free time, and the man
conveys sympathy for the woman’s annoyance at waiting for a friend.

Everyone watched the same scene, but participants received one of four descriptions. Half were told the
avatars were controlled by humans, and half were told they were controlled by computers. Within each
group, half were told the conversation was scripted, and half were told it was spontaneous.
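The setup above is a classic 2×2 between-subjects design. As a minimal sketch (not the authors' code — the names `CONTROLLER`, `DIALOGUE`, and `assign_conditions` are hypothetical), the assignment of the 92 participants to the four cover-story conditions could look like this:

```python
import random

# The two factors of the study's 2x2 between-subjects design: every
# participant watches the same scene, only the description varies.
CONTROLLER = ["human", "computer"]       # who supposedly controls the avatars
DIALOGUE = ["scripted", "spontaneous"]   # how the conversation supposedly arose

def assign_conditions(n_participants, seed=0):
    """Randomly assign participants evenly across the four cells."""
    cells = [(c, d) for c in CONTROLLER for d in DIALOGUE]
    pool = cells * (n_participants // len(cells))
    random.Random(seed).shuffle(pool)
    return pool

groups = assign_conditions(92)  # 92 / 4 = 23 participants per cell
```

The key design choice is that the stimulus never changes; only the framing does, so any difference in eeriness ratings between cells can be attributed to what observers believe about the agents' minds.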

Those who thought they’d watched two computers interact autonomously saw the scene as more eerie
than did the other three groups. That is, natural-seeming social behavior was fine when coming from a
human, or from a computer following a script. But when a computer appeared to feel genuine frustration
and sympathy, it put people on edge, the team reports this month in Cognition.

Stein and Ohler call the phenomenon the “uncanny valley of the mind.” But whereas the uncanny valley is
normally used to describe the visual appearance of a robot or virtual character, this study finds that, given a
particular appearance, emotional behavior alone can seem uncanny. “It’s pretty neat in that they used all
the same avatars and just changed the conceptualization of it,” Gray says.

Some work shows that people are more comfortable with computers that display social skills, but this
study suggests limitations. Annoyance at waiting for a friend, for example, might feel a little too human.
With social skills, there may be not an uncanny valley but an uncanny cliff. When designing virtual agents,
Gray suggests, “keep the conversation social and emotional but not deep.”

An open question is why the volunteers who thought they were seeing two spontaneous computers felt
distressed. Stein suggests they may have felt human uniqueness was under threat. In turn, humans may

lose superiority and control over our technology. In future work, Stein plans to see whether people feel
more comfortable with humanlike virtual agents when they feel they have control over the agents’ behavior.

Gray and Gratch say next steps should include measuring not only people’s explicit ratings of creepiness,
but also their behavior toward social bots. “A lot of the creepiness may arise more from when you reflect on
it than when you’re in an interaction,” Gratch says. “You might have a nice interaction with an attractive
virtual woman, then you sit back and go, ‘Eugh.’” 

Posted in: Social Sciences, Technology


doi:10.1126/science.aal0902

Matthew Hutson
Matthew Hutson is a freelance science journalist in New York City.
