
Human Computer Interaction in Virtual Reality

1. Introduction
The goal of virtual reality (VR) systems is to immerse the participant within a computer-generated virtual environment (VE). Interacting with the VE poses issues unique to VR. The ideal VE system would have the participant fully believe he was actually performing a task. Every component of the task would be fully replicated. The environment would be visually identical to the real task. Further, the participant would hear accurate sounds, smell identical odors, and, when they reached out to touch an object, they would be able to feel it. For example, in a VR system to examine designs for product assembly, the ideal system would present an experience identical to actually performing the assembly task. Parts and tools would have mass, feel real, and handle appropriately. The participant would interact with every object as he would if he were doing the task. The virtual objects would in turn respond to the user's actions appropriately. Training and simulation would be optimal.
Obviously, current VEs are still a ways from that ideal system. Participants use specialized equipment, such as tracked displays and gloves, to track movement, interpret actions, and provide input to the VR system. Interactive three-dimensional (3D) computer graphics and audio software generate the appropriate scenes and auditory cues. Finally, the participant receives the VE output (e.g., images, sounds, haptic feedback) through visual and audio hardware.
In this article, we focus on immersive virtual reality systems. Immersive VR is characterized (though not universally) by participant head tracking (monitoring the participant's position and orientation) and stereo imagery (providing a different view of the VE for each eye).
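The stereo-imagery requirement above can be sketched in a few lines: given the tracked head pose, the renderer derives one viewpoint per eye. The following is a minimal Python sketch, assuming a fixed interpupillary distance and a yaw-only head rotation; both are simplifications (a real system uses the full tracked orientation), and the function name and default IPD are illustrative, not from any particular system.

```python
import numpy as np

def eye_positions(head_pos, head_yaw_deg, ipd=0.064):
    """Offset the tracked head position by half the interpupillary
    distance (IPD) along the head's right vector, yielding one
    viewpoint per eye for stereo rendering."""
    yaw = np.radians(head_yaw_deg)
    # Right vector of a viewer who looks down -z after a yaw rotation
    # about the vertical (y) axis.
    right = np.array([np.cos(yaw), 0.0, -np.sin(yaw)])
    head = np.asarray(head_pos, dtype=float)
    half = 0.5 * ipd
    return head - half * right, head + half * right

left, right_eye = eye_positions([0.0, 1.7, 0.0], head_yaw_deg=0.0)
# With zero yaw the eyes are separated by exactly one IPD along x.
```

Each of the two returned positions would then feed its own view matrix, so that each eye of the HMD (or each eye of the shutter glasses in a projected display) receives a slightly different image.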
Interestingly, VR human-computer interaction (HCI) issues can be strikingly different from traditional 2D or 3D HCI:

- The participant views the virtual environment from a first-person perspective.
- VR interaction strives for a high level of fidelity between the virtual action and the corresponding real action being simulated. For example, a VR system for training soldiers in close-quarters combat must have the participant perform physical actions, and receive visual, audio, and haptic input, as similar to the actual scenario as possible.
- Some virtual actions have no real-action correlate. How do system designers provide interactions, such as deletion and selection, as naturally as possible?
- Typically most, if not all, objects in the virtual environment are virtual. That is, when a participant reaches out to grab a virtual object, there will be no physical object to give an appropriate feel. For hands-on tasks, such as assembly design verification, having nothing to feel or handle might be so detrimental to the experience as to make the VR ineffective.
Immersive VR systems that satisfy these high-fidelity interaction requirements can become an important tool for training, simulation, and education for tasks that are dangerous, expensive, or infeasible to recreate. An example of a near-perfect combination of real and virtual objects is the flight simulator. In most state-of-the-art flight simulators, the entire cockpit is real, a motion platform provides motion sensations, and the visuals of the environment outside the cockpit are virtual. The resulting synergy is so compelling and effective that it is almost universally used to train pilots.
2. VR Interaction: Technology
Trac"ing and signaling actions are the primary means of input into VEs.
2.1. Inputs
Tracking is the determination of an object's position and orientation. Common objects to track include the participant's head, the participant's limbs, and interaction devices (such as gloves, mice, or joysticks). Most tracking systems have sensors or markers attached to the objects; other devices then track and report the position and orientation of the sensors.
.ommercial trac"ing systems employ one or a com!ination of mechanical,
magnetic (&olhemus #astra" and 2scension #loc" of 3irds), optical (4orldVi) &&T,
*rdTech -i!all), acoustic (5ogitech 6+ 1ouse), inertial (Intersense I7-899), and global
position satellites (:&7) approaches. Each method has different adantages with respect
to cost, speed, accuracy, ro!ustness, wor"ing olume, scala!ility, wirelessness, and si)e.
;o one trac"ing technology handles all trac"ing situations.
+ifferent tas"s hae arying requirements on the accuracy, speed, and latency of
the trac"ing system's reports. VEs that aim for a high leel of participant sense of
presence , a measure of how much the participant !eliees they are <in the VE' , hae
stringent head trac"ing requirements. Researchers estimate that the VR and trac"ing
systems need to accurately determine the participant's pose and to display the appropriate
images in under 89 milliseconds, and prefera!ly under =9 milliseconds. If the lag is too
high, the VR system induces a >swimming? feeling, and might ma"e the participant
disoriented and hamper the quality of interactiity.
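The latency budget described above (under 90 ms, preferably under 50 ms) can be checked with simple timestamp arithmetic: compare when a pose sample was taken against when the image rendered from it reached the display. A minimal sketch, where the function names and the three-way classification are illustrative, and only the two thresholds come from the text:

```python
HARD_LIMIT_MS = 90.0   # latency ceiling cited in the text
PREFERRED_MS = 50.0    # preferred end-to-end latency

def motion_to_photon_ms(sample_time_s, display_time_s):
    """Lag between the moment a pose was sampled by the tracker and
    the moment the matching image reached the display, in ms."""
    return (display_time_s - sample_time_s) * 1000.0

def classify(lag_ms):
    """Rate a measured lag against the presence thresholds."""
    if lag_ms <= PREFERRED_MS:
        return "good"
    if lag_ms <= HARD_LIMIT_MS:
        return "acceptable"
    return "too slow"  # risks the 'swimming' feeling

# A pose sampled at t=0 s and displayed 40 ms later is within budget.
print(classify(motion_to_photon_ms(0.0, 0.040)))
```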
Trac"ing the participant's lim!s allows the VR system to @A present an avatar, a
irtual representation of the user within the irtual enironment, and /A rough shape
information of the participant's !ody pose. Researchers !eliee that the presence of an
aatar increases a participant's sense of presence. The accuracy and speed requirements
for lim! trac"ing are typically lower than that of head trac"ing.
#inally o!$ect trac"ing, usually accomplished !y attaching a sensor, allows a
irtual model of an o!$ect to !e registered with a physical real o!$ect. #or e%ample,
attaching a trac"er to a dinner plate allows an associated irtual plate to !e naturally
manipulated. 7ince each sensor reports the pose information of a single point, most
systems use one sensor per o!$ect and assume the real o!$ect is rigid in shape and
appearance.
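The rigid-body assumption above is what makes single-sensor registration workable: the whole virtual model follows the one reported pose. A minimal sketch, restricted to a yaw-only rotation for brevity (a real tracker reports full 3D orientation); the function name is illustrative:

```python
import numpy as np

def register_rigid(model_vertices, sensor_pos, sensor_yaw_deg):
    """Place a rigid virtual model at the pose reported by a single
    tracking sensor: rotate about the vertical (y) axis by the
    reported yaw, then translate to the reported position."""
    yaw = np.radians(sensor_yaw_deg)
    R = np.array([[ np.cos(yaw), 0.0, np.sin(yaw)],
                  [ 0.0,         1.0, 0.0        ],
                  [-np.sin(yaw), 0.0, np.cos(yaw)]])
    v = np.asarray(model_vertices, dtype=float)
    return v @ R.T + np.asarray(sensor_pos, dtype=float)

# A marker on the virtual plate's rim follows the tracked real plate.
rim = register_rigid([[1.0, 0.0, 0.0]],
                     sensor_pos=[0.0, 0.9, 0.0],
                     sensor_yaw_deg=90.0)
```

Because the transform is applied uniformly to every vertex, any bending or deformation of the real object (a non-rigid plate, a flexing hand) would break the registration, which is exactly why most systems assume rigidity.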
Since humans use their hands for many interaction tasks, tracking and obtaining inputs from a hand-based controller was a natural evolution for VR controllers. A tracked glove reports position and pose information of the participant's hand to the VR system. Gloves can also report pinching gestures (Fakespace PinchGlove), button presses (buttons built into the glove), and finger bends (Immersion CyberTouch). These glove actions are associated with virtual actions such as grasping, selecting, translation, and rotation. Tracked gloves provide many different kinds of inputs and, most importantly, are very natural to use. Glove disadvantages include sizing problems (most are one-size-fits-all), limited feedback (issues with haptic feedback and detecting gestures), and hygiene complications with multiple users.
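The mapping from glove readings to virtual actions is typically a small decision rule over the raw channels. A minimal sketch; the channel names, thresholds, and action names here are all illustrative, not taken from any particular glove's API:

```python
# Illustrative thresholds for mapping raw glove readings to actions.
PINCH_ON = 0.8    # normalized finger-to-thumb contact strength
GRAB_BEND = 60.0  # average finger bend (degrees) that counts as a fist

def interpret_glove(pinch_strength, avg_bend_deg, button_pressed):
    """Turn one frame of glove readings into a virtual action.
    Priority order (button, pinch, fist) is a design choice."""
    if button_pressed:
        return "select"
    if pinch_strength >= PINCH_ON:
        return "pick"
    if avg_bend_deg >= GRAB_BEND:
        return "grasp"
    return "idle"

print(interpret_glove(0.9, 10.0, False))  # a pinch gesture
```

Note the difficulty the text mentions: thresholds like these must be tuned per user, which is one face of the gesture-detection problem listed among glove disadvantages.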
The most common interaction devices are tracked mice (sometimes called bats) and joysticks. They are identical to a regular mouse and joystick, but with an integrated 3- or 6-degree-of-freedom (DOF) tracking sensor that reports the device's position and/or orientation. Tracked mice and joysticks have numerous buttons for the participant to provide input, and they are cheap, easily adaptable for different tasks, and familiar to many users. However, they might not provide the required naturalness, feel, and functionality for a given task.
A compromise that gains ease of use, numerous inputs into a system, and proper feedback is to engineer a specific device to interface with the VR system. For example, the University of North Carolina (UNC) ultrasound augmented reality surgery system attached a tracking sensor to a sonogram wand. The inputs from the sonogram wand buttons were passed to the VR system. This enabled the AR system to provide a natural interface for training and simulation. However, it required developing software and manufacturing specific cables to communicate between the sonogram machine and a PC. Creating these specific devices is time consuming, and the resulting tools are usable for only a limited set of tasks.
2.2. Outputs
Given the system inputs, the resulting VE (visuals, audio, tactile information) is output to the participant. For example, as the participant changes their head position and orientation, the tracking system passes that information to the VR system's rendering engine. 3D views of the VE are generated from the updated pose information.
The visual output is typically presented either in a head-mounted display (HMD) or a multiple-wall back-projected CAVE™ environment. HMDs are head-worn helmets with integrated display devices. The helmet has two screens located a short distance from the user's eyes. HMDs can be thought of as the participant "carrying" the display around. There are many commercial HMD solutions, including the Virtual Research V8, VFX ForteVR, and Sony Glasstron. An informal survey of commercial HMDs is available at http://www.stereo3d.com/hmd.htm.
CAVE™ environments have multiple back-projected display walls and data projectors. The virtual environment is rendered from multiple views (such as forward, right, left, and down) and projected onto the display walls. Fakespace, Inc. provides commercial CAVE™ solutions (www.fakespacesystems.com).
VR systems use either stereo headphones or multiple speakers to output audio. Given the participant's position, sound sources, and VE geometry, stereo or specialized audio is presented to the user. Common audio packages include Creative Labs' EAX and AuSIM's AuTrak.
VR haptic (tactile) information is presented to the participant through active feedback devices. Examples of force-feedback devices include a vibrating joystick (e.g., vibrating when the user collides with a virtual object) and the SensAble Phantom, which resembles a six-DOF pen. Active feedback devices can provide a high level of HCI fidelity. Two examples of effective systems are the dAb system, which simulates painting on a virtual canvas, and the Immersion CyberGrasp glove, which allows design evaluation and assembly verification of virtual models.
3. VR Interaction: Locomotion
VR locomotion, the movement and navigation of the participant within the VE, is one of the primary methods of VR interaction. VE locomotion is different from real-world locomotion because:

- The virtual space can be of an extremely different size and scale compared to the real-space tracked volume. For example, navigation in a VE on the molecular or planetary scale requires special considerations.
- The method of VE locomotion might have a physical equivalent that is difficult or undesirable to emulate. For example, consider the navigation issues in a VE that simulates emergency evacuations on an oil platform to train rescue personnel.
The most common method for locomotion is flying. When some input, such as a button press, is received, the participant is translated in the VE along some vector. Two common choices for this translation vector are the view direction of the user and a vector defined by the position and orientation of an interaction device, such as a tracked joystick or mouse. While easy to use and effective, flying is not very natural or realistic.
In walking in place, when the participant makes a walking motion (lifting their feet up and down, but not physically translating), the participant is translated in the VE along a vector. By monitoring the tracker sensors' reports, the VR system can detect walking motions.
Specific devices have been engineered to provide long-distance locomotion. Treadmills such as the Sarcos Treadport and ATR ATLAS allow the user to physically walk long distances in the VE. Sometimes a steering device is coupled to change direction in the VE. Unfortunately, treadmills do not easily handle rotations or uneven terrain, and they are growing more uncommon. Further, there are safety issues in simulating high speeds and collisions with virtual objects.
Other locomotion devices include specialized devices such as motion platforms and exercise cycles. A motion platform is a mechanical stage whose movement is controlled by the VR system. For example, flight simulators use motion platforms to physically move and rotate the participant to simulate the sensations of flight. Of course, there are limitations to the range of motions, but for many applications motion platforms provide an extremely valuable level of realism.
VR locomotion approaches have to deal with a finite (and typically quite limited) tracked space within which the participant can physically move around. Further, the participant typically has numerous wires connecting the tracking sensors and display devices to the VR system. Motorized methods for VR locomotion also raise safety issues in the speed and manner in which they move the user.
New commercial and research approaches to VR locomotion look to provide more natural and effective locomotion over larger spaces. New tracking systems, such as the WorldViz PPT, Intersense IS-900, and the 3rdTech HiBall, are scalable wide-area trackers with working volumes approaching 40' x 40'. This allows the participant to physically navigate large VE distances. Studies have shown that, for participant sense of presence, real walking is better than walking in place, which is in turn better than flying.
NRL's GAITER system is a combination of harnesses and treadmills that allows soldiers to emulate running across large distances in combat simulations. The Redirected Walking project at UNC looks to expand the physical tracking volume by subtly rotating the virtual world as the participant walks. This causes the participant to physically walk in a circle, though in the virtual world it appears as if they have walked along a straight line. This could allow a finite real-world space to provide an infinite virtual walking space.
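The core of redirected walking is an imperceptibly small rotation of the virtual world injected every frame; accumulated over many frames, it bends the physical path into a circle while the virtual path stays straight. A minimal sketch of the accumulation only; the per-frame gain here is an illustrative value, not the threshold used by the UNC project:

```python
import math

def redirect_heading(virtual_heading_deg, frames, gain_deg_per_frame=0.02):
    """Accumulate a small per-frame rotation of the virtual world.
    The gain must stay below the participant's detection threshold;
    0.02 degrees/frame is purely illustrative."""
    return virtual_heading_deg + gain_deg_per_frame * frames

# After 4500 frames (~75 s at 60 Hz) the world has turned 90 degrees,
# so a virtually straight walk has physically traced a quarter circle.
print(redirect_heading(0.0, 4500))
```

The engineering question is how large the gain can be before participants notice; the smaller the undetectable gain, the larger the physical circle, and hence the larger the tracked space still required.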
4. VR Interaction: Interacting with Virtual Objects
Training and simulation VR systems, which make up a substantial number of deployed systems, aim to recreate real-world experiences. The accuracy with which the virtual experience recreates the actual experience can be extremely important, such as in medical and military simulations.
The fundamental problem is that most things are not real in a VE. Of course, the other end of the spectrum (having all real objects) removes any advantages of using a VE, such as quick prototyping, or training and simulation for expensive or dangerous tasks. Having everything virtual removes many of the important cues that we use to perform tasks, such as motion constraints, tactile response, and force feedback. Typically these cues are either approximated or not provided at all. Depending on the task, this could reduce the effectiveness of a VE.
The participant interacts with objects in the VE, simulations, and system objects. The methods of interaction will vary with the task, participants, and equipment (hardware and software) configuration. For example, the interactions to locate 3D objects in orientation and mobility training for the vision impaired are different from those in a surgery planning simulation. Variables to consider include accuracy, lag, intuitiveness, fidelity to the actual task, and feedback.
4.1. Virtual Object Interaction
Applying 3D transformations and signaling system commands are the most common virtual object interactions. VR issues include the lack of a physical object registered with the virtual object, and the limited ways of getting inputs to the system. This poses difficulties because we rely on a combination of cues, including visual, haptic, and audio, to perform many cognitive tasks. The lack of haptic cues in a VE with purely virtual objects could hinder performance.
Given that most objects are virtual, can a system without motion constraints, correct affordances, or haptic feedback still remain effective? Is it even possible? These are some of the basic research questions being explored, and it is the system designers' job to provide interaction methodologies that do not impede system effectiveness.
4.2. VR Simulation Interaction
VR systems use simulations for a variety of tasks, from calculating physics (e.g., collision detection and response) to lighting, to approximate real-world phenomena. Most VR systems require participant interaction to control simulation objects and the simulation itself. For example, in a military soldier simulation, the participant affects a soldier's view and battlefield location and provides input such as pressing buttons to fire his weapon.
Many simulations focus on recreating realistic experiences for the participant. Having a natural means of interaction improves realism. However, this adds to the difficulty of high-quality VR interaction. We can engineer specific objects, for example a prop machine gun with the trigger sensor connected to the computer, but that increases cost and reduces generality (the prop has limited uses in other applications). On the other end of the spectrum, using a generic interaction device, such as a tracked joystick, might prove too different from the actual task to provide any benefit.
4.3. VR System Interaction
The third object of VR interaction is system objects, such as menus and dialog boxes. As in traditional 2D or desktop 3D systems, VR participants need to execute commands such as opening files, changing system settings, and accepting incoming messages. VR systems have unique issues in dealing with the following:

- The first-person perspective of the environment
- Natural methods to present the system interface
- The desire to avoid lowering the participant's sense of presence
- Accepting participant input
Most VR systems provide the system interface as virtual objects attached either to the virtual environment (world coordinate system), a tracked device (local coordinate system), or the participant (user coordinate system). Attaching the system interface to the world coordinate system (the interface would appear as a VE object, such as a computer panel) provides a way to keep interaction with the virtual environment (both virtual objects and the system interface) consistent. But for some VEs, such as a soldier simulation, the scale (large distances) or the subject matter (realistic combat) does not naturally lend itself to such a system interface.
Attaching the system interface to a tracked device, such as a participant-carried tablet or mouse, allows the system to provide a consistent virtual world. Previous studies have shown that the presence of a physical surface enhances task performance over purely virtual surface implementations.
Attaching the user interface to the user has the menus and dialog boxes appear relatively stationary to the user, even as they navigate around the world. This is similar to implementing a standard 2D desktop interface in a 3D environment. In this case, the interface is always within reach of the participant, but its appearance and integration with the rest of the VE is typically not as seamless.
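The three attachment choices above differ only in which coordinate frame the interface panel's offset is expressed in. A minimal sketch; the fixed wall position, the half-meter offset, and the function name are all illustrative:

```python
import numpy as np

def menu_world_position(attach, user_pos, device_pos,
                        offset=np.array([0.0, 0.0, -0.5])):
    """World-space position of a menu panel under the three
    attachment choices: fixed in the world, following a tracked
    device, or following the participant."""
    if attach == "world":
        return np.array([2.0, 1.5, 0.0])            # fixed wall panel
    if attach == "device":
        return np.asarray(device_pos, float) + offset
    if attach == "user":
        return np.asarray(user_pos, float) + offset  # always in reach
    raise ValueError(attach)

# A user-attached menu follows the participant as they move.
a = menu_world_position("user", user_pos=[0.0, 1.7, 0.0],
                        device_pos=[0.0, 1.0, 0.0])
b = menu_world_position("user", user_pos=[3.0, 1.7, 0.0],
                        device_pos=[0.0, 1.0, 0.0])
```

The trade-off discussed in the text falls directly out of this choice: the world-attached panel stays consistent with the VE but may be far away, while the user-attached panel is always reachable but visibly detached from the scene.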
5. Future Directions in VR Interaction
VR interaction is constantly evolving with new hardware, software, interaction techniques, and VR systems; the topics covered here are by no means a comprehensive list.
New products, such as the Immersion Haptic Workstation, provide high-quality tracking of the participant's hands coupled with force feedback that allows the participant to "feel" virtual objects. The improved interaction could enable VR to be applied to hands-on tasks that were previously hampered by poor haptic feedback.
VEs populated with multiple participants (often physically distributed over great distances) have unique interaction issues. In a University College London study, two participants, one at UCL (England) and the other at UNC at Chapel Hill (United States of America), are tasked with navigating a virtual maze while carrying a stretcher. How do the participants interact with a shared virtual space, simulation, and each other? Researchers are interested in how important audio, gestures, and facial expressions are for cooperative interaction.
Combining several interaction methods might develop into solutions that are greater than the sum of their parts. For example, the BioSimMER system seeks to train emergency-response medical personnel. The system interprets hand gestures and voice commands in conjunction with traditional interaction methods to interact with the simulation. Researchers are also investigating passive techniques that use image processing and computer vision to aid in tracking and interpreting the participant's actions and gestures.
There is also research into new types of VR systems. Hybrid environments (VEs that combine real and virtual objects) focus on providing natural physical interfaces to virtual systems as well as intuitive virtual interfaces. There exists a spectrum of environments, from augmented reality (supplementing a display of the real world with virtual objects) to mixed and augmented virtual reality (supplementing a display of the virtual world with real objects).
-y!rid systems loo" to improe performance and participant sense of presence !y
haing real o!$ects registered with irtual o!$ects. 7tudies into passie haptics had ma$or
irtual o!$ects, such as the walls and unmoa!le furniture, registered with stationary
physical o!$ects. It was found that passie haptics did improe sense of presence.
New methods to navigate and interact with virtual objects are constantly being developed, and there are movements to formalize the description and evaluation of interaction techniques (ITs). This allows VR system engineers to make interface design decisions confidently and reduces the ad hoc nature of IT creation. Formal evaluation also promotes a critical review of how and why people interact with VEs.
As the types of interactions grow more complex, higher-order interactions with simulation objects are becoming a major research focus. Interpreting the participant's facial expressions, voice, gestures, and pose as inputs could provide a new level of natural interaction. Also, participants will interact with more complex objects, such as deformable objects and virtual characters.
As the hardware, interaction technologies, and software progress, VR system designers will develop more natural and effective means for participants to interact with the VE. We believe improved HCI will allow VR to fulfill its promise of providing a new paradigm for humans to interact with digital information.
3en$amin .. 5o", #niversity of $orth %arolina at %harlotte
5arry #. -odges, #niversity of $orth %arolina at %harlotte
6. Further Reading
Baxter, W., Scheib, V., Lin, M., & Manocha, D. (2001). DAB: Interactive Haptic Painting with 3D Virtual Brushes. Proceedings of ACM SIGGRAPH 2001, 461-468.

Bowman, D., & Hodges, L. (1997). An Evaluation of Techniques for Grabbing and Manipulating Remote Objects in Immersive Virtual Environments. 1997 Symposium on Interactive 3-D Graphics, 35-38.

Fuchs, H., Livingston, M., Raskar, R., Colucci, D., Keller, K., State, A., Crawford, J., Rademacher, P., Drake, S., & Meyer, A. (1998). Augmented Reality Visualization for Laparoscopic Surgery. Proceedings of First International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI '98).

Hand, C. (1997). A Survey of 3-D Interaction Techniques. Computer Graphics Forum, 16(5), 269-281.

Hoffman, H. (1998). Physically Touching Virtual Objects Using Tactile Augmentation Enhances the Realism of Virtual Environments. Proceedings of the IEEE Virtual Reality Annual International Symposium '98, 59-63.

Hollerbach, J., Xu, Y., Christensen, R., & Jacobsen, S. (2000). Design specifications for the second generation Sarcos Treadport locomotion interface. Haptics Symposium, Proceedings of ASME Dynamic Systems and Control Division, 69(2), 1293-1298.

Höllerer, T., Feiner, S., Terauchi, T., Rashid, G., & Hallaway, D. (1999). Exploring MARS: Developing Indoor and Outdoor User Interfaces to a Mobile Augmented Reality System. Computers and Graphics, 23(6), 779-785.

Insko, B. (2001). Passive Haptics Significantly Enhances Virtual Environments. Ph.D. Dissertation, Department of Computer Science, UNC-Chapel Hill.

Lindeman, R., Sibert, J., & Hahn, J. (1999). Hand-Held Windows: Towards Effective 2D Interaction in Immersive Virtual Environments. IEEE Virtual Reality.

Lok, B. (2001). Online Model Reconstruction for Interactive Virtual Environments. Proceedings 2001 Symposium on Interactive 3-D Graphics, 69-72, 248.

Lok, B., Naik, S., Whitton, M., & Brooks, F. (2003). Effects of Handling Real Objects and Avatar Fidelity on Cognitive Task Performance in Virtual Environments. Proceedings IEEE Virtual Reality 2003.

Meehan, M., Razzaque, S., Whitton, M., & Brooks, F. (2003). Effect of Latency on Presence in Stressful Virtual Environments. Proceedings of IEEE Virtual Reality 2003.

Mine, M., Brooks, F., & Sequin, C. (1997). Moving Objects in Space: Exploiting Proprioception in Virtual-Environment Interaction. Proceedings of SIGGRAPH 97.

Razzaque, S., Kohn, Z., & Whitton, M. (2001). Redirected Walking. Proceedings of Eurographics 2001.

Rickel, J., & Johnson, W. (2000). Task-Oriented Collaboration with Embodied Agents in Virtual Worlds. Embodied Conversational Agents.

Slater, M., & Usoh, M. (1993). The Influence of a Virtual Body on Presence in Immersive Virtual Environments. Proceedings of the Third Annual Conference on Virtual Reality, 34-42.

Stansfield, S., Shawver, D., Sobel, A., Prasad, M., & Tapia, L. (2000). Design and Implementation of a Virtual Reality System and Its Application to Training Medical First Responders. Presence: Teleoperators and Virtual Environments, 9(6), 524-556.

Sutherland, I. (1965). The Ultimate Display. Proceedings of IFIP 65, 2, 506.

Templeman, J., Denbrook, P., & Sibert, L. (1999). Virtual Locomotion: Walking in Place Through Virtual Environments. Presence: Teleoperators and Virtual Environments, 8(6).

Usoh, M., Arthur, K., et al. (1999). Walking > Virtual Walking > Flying, in Virtual Environments. Proceedings of SIGGRAPH 99, 359-364.

Van Dam, A., Laidlaw, D., & Simpson, R. (2002). Experiments in Immersive Virtual Reality for Scientific Visualization. Computers and Graphics, 26, 535-555.

Welch, G., Bishop, G., Vicci, L., Brumback, S., & Keller, K. (1999). The HiBall Tracker: High-Performance Wide-Area Tracking for Virtual and Augmented Environments. Proceedings of the ACM Symposium on Virtual Reality Software and Technology 1999.

Welch, G., & Foxlin, E. (2002). Motion Tracking: No Silver Bullet, but a Respectable Arsenal. IEEE Computer Graphics and Applications, Special Issue on Tracking, 22(6), 24-38.

Zachmann, G., & Rettig, A. (2001). Natural and Robust Interaction in Virtual Assembly Simulation. Eighth ISPE International Conference on Concurrent Engineering: Research and Applications.
