The battlefield in the Middle East teaches important lessons about the virtualisation of real-life situations: on the social bonds that humans develop with machines, and, at the other end of the spectrum, on the impossibility of totally 'virtualising' an experience: the link with the physical world always comes back to remind us of its reality.

New research from the University of Washington explores the social bonds soldiers develop with their bomb-disposal robots in Iraq and Afghanistan. They often anthropomorphise the machines that help keep them alive, assigning them human attributes and even displaying empathy toward them, to the point of refusing replacement robots unless it is 'their' machine being repaired, or holding funerals for their fallen brothers in robotic arms. One such tribute recently took place in Iraq: it involved a 21-gun salute and the awarding of a Purple Heart and a Bronze Star Medal. 'He' was a MARCbot, an R2-D2-like robot designed to disarm explosives.

The other side of this 'virtualisation' paradigm is the issue affecting drone* operators (* technically, unmanned machines guided by operators, as opposed to fully automated robots) diagnosed with post-traumatic stress disorder despite never physically facing the battlefield. This runs counter to the initial concept of drone warfare, which assumed that combat's devastating psychological effects had been mitigated. Instead, 'moral injury' is now an accepted diagnosis, representing a shift in focus from the violence done to people toward their feelings about what they have done to others. Yet 61% of Americans in the latest Pew survey support military drones because they won't risk US lives. The impacts are nonetheless serious: a 2011 survey found that 42% of drone operators reported moderate to high stress, and 20% reported emotional exhaustion or burnout.

The specific insight is that even in a highly digitised world that shields operators from direct physical action, psychological consequences persist. The very idea of a robot or an artificial intelligence fulfilling our duties unsettles us. Whilst the technology is advancing at a fast pace, the moral and ethical burdens it carries have gone largely unexamined. As in many domains, the military is at the vanguard of future civilian issues. Scientists, legal experts and philosophers are now joining forces to scrutinise the promise of intelligent systems and wrangle over their implications: after winning at Jeopardy!, IBM's supercomputer Watson will soon diagnose diseases and be used by health insurers to assess customers. But what are the implications for businesses and customers when they want to contest decisions made by 'black boxes'? The ethics of this emerging paradigm, experienced first in the extreme environment of the military, will ultimately need to be addressed by civilian organisations.
