
VISION BASED INTERFACE SYSTEM FOR HANDS FREE CONTROL OF AN INTELLIGENT WHEEL CHAIR

1. INTRODUCTION
To improve the quality of life of elderly and disabled people, electric-powered
wheelchairs (EPWs) have been rapidly developed over the last 20 years (Ding and Cooper, 2005;
Simpson et al., 2004). Most current EPWs are controlled by the user's hands via a joystick, which
is very difficult for elderly and disabled users whose limb movements are restricted by conditions such as
Parkinson's disease or quadriplegia. As cheap computers and sensors are embedded into
EPWs, they become more intelligent and are referred to as intelligent wheelchairs (IWs). Various
research and development efforts on IWs have been carried out in the last decade, such as the CALL Smart
Wheelchair (CALL Centre, 1994), Wheelesley (Yanco et al., 1995), OMNI (Hoyer and Borgolte,
1996), NavChair (Levine et al., 1999), the TAO projects (Gomi and Griffith, 1998), Rolland
(Röfer and Lankenau, 1998), MAid (Prassler et al., 1998), the UPenn Smart Wheelchair (Rao et al.,
2002), and SIAMO (Mazo et al., 2002).
The successful deployment of IWs relies on their high performance and low cost.
Like all other intelligent service robots, the key performance aspects of IWs include:
The autonomous navigation capability, for good safety, flexibility, mobility, and obstacle
avoidance.
The intelligent interface between the users and the IWs, including hand-based control
(joystick, keyboard, mouse, touch screen), voice-based control (audio), vision-based
control (cameras), and other sensor-based control (infrared sensors, sonar sensors,
pressure sensors, etc.).
As hands-free interfaces, head gestures and EMG signals have already been applied in
some existing IW systems, such as the NLPR wheelchair (Wei, 2004), WATSON (Matsumoto et al.,
1999; Matsumoto et al., 2001), and the OSAKA wheelchair (Nakanishi et al., 1999; Kuno et al.,
2001). However, these systems are not robust enough to be deployed in the real world and
many improvements are necessary. The new generation of head-gesture-based control of
wheelchairs should be able to deal with the following uncertainties in the practical applications of
IWs:
PVP Siddhartha Intit!t" #$ T"%hn#&#'() *an!r!) Vi+a(a,ada - . Pa"e 1
VISION BASED INTERFACE SYSTEM FOR HANDS FREE CONTROL OF AN INTELLIGENT WHEEL CHAIR
The user's head may be out of the image view, or only the profile of the face is in the captured
image;
The face color is user-dependent, and may change dramatically under varying illumination
conditions;
The user may have different facial appearances, such as a mustache or glasses; and
The background may be cluttered when the IW moves in the real world.
In this report, a novel head gesture interface (HGI) is developed for our IW, named
RoboChair, based on the integration of the Adaboost face detection algorithm (Viola and Jones,
2004) and the Camshift object tracking algorithm (Bradski, 1998). In this process, head
gesture recognition is conducted by means of real-time face detection and tracking; a rough code
sketch of this detect-then-track idea is given after the stage list below.
This report presents an Intelligent Wheelchair (IW) control system for people with various
disabilities. To accommodate a wide variety of user abilities, the system uses face-inclination
and mouth-shape information: the direction of the IW is determined by the
inclination of the user's face, while proceeding and stopping are determined by the shape of the
user's mouth. The system is composed of an electric powered wheelchair, a data acquisition board,
ultrasonic/infrared sensors, a PC camera, and a vision system. The vision system analyzes the
user's gestures in three stages: detector, recognizer, and converter.
In the detector, the facial region of the intended user is first obtained using Adaboost;
thereafter the mouth region is detected based on edge information.
The extracted features are sent to the recognizer, which recognizes the face inclination
and mouth shape using statistical analysis and K-means clustering, respectively.
These recognition results are then delivered to the converter to control the wheelchair.
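As a hedged illustration of the detect-then-track idea behind the HGI (an Adaboost/Haar cascade finds the face, then Camshift follows it frame to frame using a hue histogram), the sketch below uses OpenCV, which the report lists for camera control. It is not the authors' code: the cascade file name, skin-hue mask thresholds, and minimum face size are assumptions.

```cpp
// Sketch: Adaboost (Haar cascade) face detection handed off to CamShift tracking.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap(0);                       // PC camera
    cv::CascadeClassifier detector;
    if (!cap.isOpened() || !detector.load("haarcascade_frontalface_default.xml"))
        return 1;                                  // assumed cascade file

    cv::Rect track;                                // current face window
    cv::Mat frame, hsv, hue, mask, hist, backproj;
    int histSize = 30;
    float hueRange[] = {0, 180};
    const float* ranges = hueRange;

    while (cap.read(frame)) {
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(0, 60, 32), cv::Scalar(180, 255, 255), mask);
        hue.create(hsv.size(), CV_8U);
        int ch[] = {0, 0};
        cv::mixChannels(&hsv, 1, &hue, 1, ch, 1);

        if (track.area() == 0) {                   // (re)detect with the Adaboost cascade
            cv::Mat gray;
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            std::vector<cv::Rect> faces;
            detector.detectMultiScale(gray, faces, 1.1, 3, 0, cv::Size(60, 60));
            if (!faces.empty()) {
                track = faces[0];                  // model the face hue for CamShift
                cv::Mat roiHue = hue(track), roiMask = mask(track);
                cv::calcHist(&roiHue, 1, 0, roiMask, hist, 1, &histSize, &ranges);
                cv::normalize(hist, hist, 0, 255, cv::NORM_MINMAX);
            }
        } else {                                   // track the face with CamShift
            cv::calcBackProject(&hue, 1, 0, hist, backproj, &ranges);
            backproj &= mask;
            cv::RotatedRect pose = cv::CamShift(backproj, track,
                cv::TermCriteria(cv::TermCriteria::EPS | cv::TermCriteria::COUNT, 10, 1));
            if (track.area() > 0)                  // lost windows trigger re-detection
                cv::ellipse(frame, pose, cv::Scalar(0, 255, 0), 2);
        }
        cv::imshow("HGI face tracking (sketch)", frame);
        if (cv::waitKey(1) == 27) break;           // Esc to quit
    }
    return 0;
}
```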
2. ROLE OF IWs
2.1 WHAT IS THE VITAL ROLE PLAYED BY IWs?
With the increase in elderly and disabled people, a wide range of support devices and care
equipment has been developed to help improve their quality of life (QOL). In particular,
intelligent wheelchairs (IWs) have received considerable attention as mobility aids. Essentially,
IWs are electric powered wheelchairs (EPWs) with an embedded computer and sensors, giving
them intelligence. Figure 1 shows various IWs.
FIGURE 1: Intelligent Wheelchair (IW). (a) GRASP Laboratory Smart Chair, (b)
Wheelchair of Yutaka et al., (c) NavChair.
Two basic techniques have been required to develop IWs: 1) auto-navigation techniques
for automatic obstacle detection and avoidance, and 2) convenient interfaces that allow handicapped
users to control the IW themselves using their limited physical abilities. While it is important to
develop a system that enables the user to assist in the navigation, the system is useless if it
cannot be adapted to the abilities of the user. For example, in the case that a user cannot manipulate a
standard joystick, other control options need to be provided.
/./ RELATED RESEARCH
So far, many access methods for IWs have been developed, and they can be classified
as intrusive and non-intrusive; they are summarized in Table 1. Intrusive methods use glasses, a
headband, or cap with infrared/ultrasound emitters to measure the user's intention based on
changes in the ultrasound waves or infrared reflections. In contrast, non-intrusive methods do not
require any additional devices attached to the user's face or head.
TABLE 1: IW CONTROLS IN THE LITERATURE
Intrusive Interfaces:

INTELLIGENT WHEELCHAIR | FEATURE | DEVICE | SUPPORTING COMMANDS
Y.L. Chen et al. | Head Orientation | Tilt Sensors, Microprocessor | Go, Back, Left, Right
SIAMO Project | Eye Gaze | Electrodes | Go, Back, Left, Right
Wheelesley | Eye Gaze | Infrared Sensors, Ultrasonic Range Sensors, Electrodes | Go, Stop, Left, Right

Non-intrusive Interfaces (Voice):

INTELLIGENT WHEELCHAIR | FEATURE | DEVICE | SUPPORTING COMMANDS
SIAMO Project | Voice | Ultrasonic Sensors, Infrared Sensors, Camera and Laser Diode | Go, Back, Left, Right
ROB Chair | Voice | Ultrasonic Sensors, Infrared Sensors, Head Microphone | Go, Stop, Speed Up, Speed Down, Rotate
NavChair | Voice | DOS-based Computer, Ultrasonic Transducer, Lap Tray, Sonar Sensor | Go, Stop, Back, Left, Right
TAO Project | Voice | Sensors, 2 Processor Boxes | Go, Stop, Back, Left, Right, Speed Down

Non-intrusive Interfaces (Vision):

INTELLIGENT WHEELCHAIR | FEATURE | DEVICE | SUPPORTING COMMANDS
Yoshida | Face | Ultrasonic Sensors, 2 Video Cameras | Go, Stop, Left, Right
HGI | Head and Nose | Webcam, Ultrasonic Sensors, Data Acquisition Board | Go, Left, Right, Speed Down, Speed Up
SIAMO | Head | CCD Colour Micro-Camera | Go, Left, Right, Speed Down, Speed Up
Report IW | Face and Mouth | Web Camera, Data Acquisition Board | Single commands: Go, Left, Right, Rotate; Mixed commands: Go-Left, Go-Right
As shown in Table 1, voice-based and vision-based methods belong to the non-intrusive
methods. Voice control is a natural and friendly access method; however, the presence of other
noises in a real environment can lead to command recognition failure, resulting in safety
problems. Accordingly, much research has focused on vision-based interfaces, where
control is derived from recognizing the user's gestures by processing images or videos obtained
via a camera. With such interfaces, face or head movements are most widely used to convey the
user's intentions. When a user wishes to move in a certain direction, it is a natural action to look
in that direction; thus movement is initiated based on nodding the head, while turning is
generated by the head direction. However, such systems have a major drawback, as they are
unable to discriminate between intentional and unintentional behavior. For example, it
is natural for a user to look at an obstacle as it gets close; however, the system will then turn and go
towards that obstacle.
2.3 THE TECHNICAL IDEA
Accordingly, we develop a novel IW interface using face and mouth recognition for the
severely disabled. The main goal of the present study is to provide a more convenient and
effective access method for people with various disabilities. For accurate recognition of the user's
intention, the direction of the IW is determined according to the face inclination, while
proceeding and stopping are determined by the shape of the mouth. This format was inspired
by the operation of a car, as the user's face movements correspond to the steering wheel,
while the user's mouth corresponds to the brake and gas pedal. These mechanisms prevent an
accident in the case that the user instinctively turns their head to look at an obstacle, thereby making
the system safer. Moreover, the proposed control mechanisms require minimal user motion, making the
system more comfortable and more adaptable for the severely disabled when compared to
conventional methods.
The IW system consists of a Facial Feature Detector (Detector), a Facial Feature Recognizer
(Recognizer), and a Converter. In our system, the facial region is first obtained using the Adaboost
algorithm, which is robust to time-varying illumination. Thereafter, the mouth regions are
detected based on edge information. These detection results are delivered to the Recognizer,
which recognizes the face inclination and mouth shape. These recognition results are then
delivered to the Converter, which operates the wheelchair. To assess the effectiveness of the
proposed interface, it was tested with 34 users and the results were compared with those of other
systems. The results showed that the proposed system has superior performance to the
others in terms of accuracy and speed, and they also confirmed that the proposed system can
accurately recognize the user's gestures in real time.
3. IW ARCHITECTURE AND ITS OVERVIEW
3.1 SYSTEM (IW) ARCHITECTURE
The proposed IW is composed of an electric powered wheelchair, a data acquisition board,
a PC camera, and a vision system. The data acquisition board (DAQ board) is used to process the
sensor information and control the wheelchair. The DAQ board and the vision system are
connected via a serial port. In our system, a FUJITSU (S651) notebook is used as the vision
system to process the video stream received from a PC camera. The camera is connected to the
vision system through a USB port and is mounted on the front of the wheelchair's tray, pointing
down at approximately a 15-degree angle. The baseline between the user and the camera is 25 cm
(9.8 inches). Our system is described in Figure 2.
FIGURE 2: The Prototype of our IW
SPECIFICATION FOR THE PROPOSED IW
HARDWARE
Wheelchair: EPW (DAESE M.care Rider)
DAQ Board: Compile Technology SDQ-DA04EX
Input device: Logitech (640 × 480), up to 30 frames/sec, 24-bit True Color
Vision System: Pentium IV 1.7 GHz, 1 GB Memory
Sensors: Two ultrasonic sensors, six infrared sensors
SOFTWARE
OS: MS Windows XP
Development Languages: MS Visual C++, MS Visual Basic 6.0
Camera Control: OpenCV
3.2 OVERVIEW OF THE VISION-BASED CONTROL SYSTEM
The proposed control system receives and displays a live video stream of the user
sitting on the wheelchair in front of the computer. The proposed interface then allows the user to
control the wheelchair directly by changing their face inclination and mouth shape. If the user
wants the wheelchair to move forward, they just say "Go." Conversely, to stop the wheelchair,
the user just says "Uhm." Here, the control commands using the shape of the mouth are only
effective when the user is looking forward, thereby preventing over-recognition when the user is
talking to someone. Meanwhile, the direction of the IW is determined by the inclination
(gradient) of the user's face, instead of the direction of the head. As a result, the proposed
mechanism can discriminate between intentional and unintentional behavior, thereby preventing
potential accidents when the user instinctively turns their head to look at an obstacle.
Furthermore, the proposed control mechanisms require only minimal user motion, making the
system safer, more comfortable, and more adaptable to the severely disabled when compared to
conventional methods. A minimal sketch of this decision logic is given below.
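The fragment below is a hypothetical rendering of the gating rule just described: mouth commands are honored only while the face is roughly upright, otherwise the inclination steers the chair. The angle threshold and command names are illustrative assumptions, not values from the report.

```cpp
// Hypothetical decision logic for the vision-based interface: "Go"/"Uhm" mouth
// commands take effect only while the user is looking forward; otherwise the
// face inclination selects a turn. Thresholds are assumed for illustration.
#include <cmath>
#include <iostream>

enum class Mouth { None, Go, Uhm };
enum class Command { Stop, GoForward, TurnLeft, TurnRight, Keep };

Command decide(double faceAngleDeg, Mouth mouth) {
    const double kForwardBand = 10.0;               // |angle| below this = "looking forward"
    if (std::abs(faceAngleDeg) <= kForwardBand) {
        if (mouth == Mouth::Go)  return Command::GoForward;
        if (mouth == Mouth::Uhm) return Command::Stop;
        return Command::Keep;                       // no mouth command: keep current state
    }
    // Outside the forward band, mouth commands are ignored (prevents
    // over-recognition while talking) and the inclination steers the chair.
    return (faceAngleDeg < 0.0) ? Command::TurnLeft : Command::TurnRight;
}

const char* name(Command c) {
    switch (c) {
        case Command::Stop:      return "Stop";
        case Command::GoForward: return "Go forward";
        case Command::TurnLeft:  return "Turn left";
        case Command::TurnRight: return "Turn right";
        default:                 return "Keep current state";
    }
}

int main() {
    std::cout << name(decide(-18.0, Mouth::Go))  << "\n";  // turning: "Go" is ignored
    std::cout << name(decide(  2.0, Mouth::Uhm)) << "\n";  // looking forward: stop
}
```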
4. DESCRIPTION OF THE STAGES
Figure 3 describes the process used to recognize the user's gestures, where the recognition is
performed in three steps: Detector, Recognizer, and Converter. First, the facial region is obtained
using the Adaboost algorithm, and the mouth region is detected based on edge information.
These detection results are then delivered to the Recognizer, which recognizes the face
inclination and mouth shape using a statistical analysis and K-means clustering, respectively.
Thereafter, the recognition results are delivered to the Converter, which operates the wheelchair.
Moreover, to fully guarantee user safety, range sensors are used to detect obstacles in the
environment and avoid them. In what follows, the details of the respective components are
presented.
FIGURE 3: The Overall Architecture of the Proposed Control System
4.1 FACIAL FEATURE DETECTOR: DETECT THE USER'S FACE AND
MOUTH FROM THE PC CAMERA
For each frame of the input stream, this module localizes the facial region and mouth
region, and sends them to the Recognizer. The facial region is obtained using the Adaboost
algorithm for robust face detection, and the mouth region is obtained using edge information
within the facial region.
For application in a real situation, the face detection should satisfy the following two
requirements:
It should be robust to time-varying illumination and cluttered environments, and
It should be fast enough to support real-time processing. Thus, the Adaboost algorithm is
used to detect the facial region.
This algorithm was originally proposed by Viola and has been used by many researchers. The
Adaboost learning method is an iterative procedure for selecting features and combining
classifiers. In each iteration, the features with the minimum misclassification error are selected,
and weak classifiers are trained based on the selected features. The Adaboost learning method
keeps combining weak classifiers into a stronger one until it achieves a satisfactory performance.
To improve the detection speed, a cascade structure is adopted in each of the face detectors, to
quickly discard easy-to-classify non-faces. This process is illustrated in Figure 4.
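To make the boosting step concrete, the toy program below (not the report's training code) runs a few AdaBoost rounds on synthetic data: each round picks the decision stump with the lowest weighted error, then re-weights the samples so the next stump concentrates on the mistakes, and the final strong classifier is the weighted vote of the stumps.

```cpp
// Toy AdaBoost with decision stumps, illustrating "select feature, train weak
// classifier, re-weight, combine" on synthetic two-feature data.
#include <cmath>
#include <iostream>
#include <vector>

struct Sample { std::vector<double> f; int label; };   // label in {-1, +1}
struct Stump  { int feature; double thresh; int polarity; double alpha; };

int stumpPredict(const Stump& s, const Sample& x) {
    return (s.polarity * (x.f[s.feature] - s.thresh) >= 0) ? 1 : -1;
}

int main() {
    std::vector<Sample> data = {                        // class +1 tends to have larger f[0]
        {{0.9, 0.2}, 1}, {{0.8, 0.7}, 1}, {{0.7, 0.1}, 1}, {{0.6, 0.9}, 1},
        {{0.3, 0.8}, -1}, {{0.2, 0.4}, -1}, {{0.4, 0.6}, -1}, {{0.1, 0.3}, -1}};
    std::vector<double> w(data.size(), 1.0 / data.size());
    std::vector<Stump> ensemble;

    for (int round = 0; round < 3; ++round) {
        Stump best{}; double bestErr = 1e9;
        for (int feat = 0; feat < 2; ++feat)            // exhaustive stump search
            for (double t = 0.05; t < 1.0; t += 0.05)
                for (int pol : {-1, 1}) {
                    Stump s{feat, t, pol, 0.0};
                    double err = 0.0;
                    for (size_t i = 0; i < data.size(); ++i)
                        if (stumpPredict(s, data[i]) != data[i].label) err += w[i];
                    if (err < bestErr) { bestErr = err; best = s; }
                }
        best.alpha = 0.5 * std::log((1.0 - bestErr) / std::max(bestErr, 1e-9));
        double norm = 0.0;                              // boost the misclassified samples
        for (size_t i = 0; i < data.size(); ++i) {
            w[i] *= std::exp(-best.alpha * data[i].label * stumpPredict(best, data[i]));
            norm += w[i];
        }
        for (double& wi : w) wi /= norm;
        ensemble.push_back(best);
    }
    int correct = 0;                                    // strong classifier = weighted vote
    for (const Sample& x : data) {
        double score = 0.0;
        for (const Stump& s : ensemble) score += s.alpha * stumpPredict(s, x);
        correct += ((score >= 0 ? 1 : -1) == x.label);
    }
    std::cout << "training accuracy: " << correct << "/" << data.size() << "\n";
}
```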
Figure 5 shows some face detection results. To demonstrate its robustness, the face
detection method was tested with several standard databases, such as the VAK DB. Moreover, it was
tested on data obtained from a real environment. Figures 5(a) and 5(b) show the results for the MMI and
VAK DBs, respectively, and Figure 5(c) shows the results for online streaming data. As seen in
Figure 5, the proposed method is robust to time-varying illumination and cluttered
environments. To reduce the complexity of the mouth detection, the mouth is detected based on the
position of the facial region using the following properties:
The mouth is located in the lower region of the face, and
The mouth has a high contrast compared to its surroundings.
Thus, the mouth region is localized using an edge detector within a search region estimated
using several heuristic rules based on the facial region. The details of the search region are given
in previous work by the current authors.
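A minimal OpenCV sketch of this step is shown below. It is only an illustration of "edge detection inside a heuristic search region": the ROI fractions, Canny thresholds, and post-processing are assumptions, not the heuristics from the report or the authors' earlier work.

```cpp
// Sketch: localize the mouth as the strongest edge blob inside an assumed
// sub-window of the detected face (lower third, central 60% of the width).
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

cv::Rect locateMouth(const cv::Mat& gray, const cv::Rect& face) {
    cv::Rect search(face.x + face.width / 5, face.y + 2 * face.height / 3,
                    3 * face.width / 5,      face.height / 3);
    search &= cv::Rect(0, 0, gray.cols, gray.rows);   // clip to the image

    cv::Mat edges;
    cv::Canny(gray(search), edges, 60, 150);          // high-contrast mouth edges
    cv::morphologyEx(edges, edges, cv::MORPH_CLOSE,   // simple noise post-processing
                     cv::getStructuringElement(cv::MORPH_RECT, cv::Size(9, 3)));

    std::vector<std::vector<cv::Point>> contours;     // keep the largest edge blob
    cv::findContours(edges, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return cv::Rect();
    double bestArea = 0; cv::Rect best;
    for (const auto& c : contours) {
        cv::Rect r = cv::boundingRect(c);
        if (r.area() > bestArea) { bestArea = r.area(); best = r; }
    }
    return best + search.tl();                        // back to image coordinates
}

int main() {
    cv::Mat img = cv::imread("frame.png", cv::IMREAD_GRAYSCALE);  // hypothetical test frame
    if (img.empty()) return 1;
    cv::Rect face(100, 60, 120, 160);                 // e.g. from the Adaboost detector
    std::cout << "mouth ROI: " << locateMouth(img, face) << "\n";
}
```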
FIGURE 4: Outline of Face Detection Using the Adaboost Algorithm
FIGURE 5: Face Detection Results. (a) The Results for the MMI DB, (b) The Results for the VAK
DB, (c) The Results for Online Streaming Data
Figure 6 shows the mouth detection results. Since the detection results include both narrow edges
and noise, the noise is eliminated using post-processing.
FIGURE 6: The Mouth Detection Results. (a) Edge Detection Result, (b) Noise-Removed
Result
4.2 FACIAL FEATURE RECOGNIZER: RECOGNIZE THE FACE
INCLINATION AND MOUTH SHAPE OF THE INTENDED
USER
This module recognizes the user's face inclination and mouth shape, both of which are
continuously and accurately recognized using a statistical analysis and template matching. As a
result, the proposed recognizer enables the user to control the wheelchair directly by changing
their face inclination and mouth shape. For example, if the user wants the wheelchair to move
forward, the user just says "Go." Conversely, if the user wants the wheelchair to stop, the user
just says "Uhm." Here, these commands only have an effect when the user is looking forward,
thereby preventing over-recognition when the user is talking to someone. In addition, the direction of the
IW is determined by the inclination of the user's face instead of the direction of the user's head.
Let \(\theta\) denote the orientation of the facial region. Then \(\theta\) can be calculated by minimizing the
inertia of the region, which is defined as follows:

\[ E = \sum_{(r,c)\in R} d^{2}(r,c) \tag{1} \]

where \(A\) is the number of pixels in the region \(R\), and \(d(r,c)\) is the distance between pixel \((r, c)\) and
the axis of inertia, which passes through the centroid \((\bar{r}, \bar{c})\). We obtain these properties
by \(\bar{r} = \frac{1}{A}\sum_{(r,c)\in R} r\) and \(\bar{c} = \frac{1}{A}\sum_{(r,c)\in R} c\).
To minimize the inertia, the derivative is taken with respect to \(\theta\). Accordingly, the
orientation \(\theta\) can then be obtained by equation (2):

\[ \theta = \frac{1}{2}\arctan\!\left(\frac{2\,\mu_{rc}}{\mu_{rr}-\mu_{cc}}\right) \tag{2} \]

where \(\mu_{rr}\), \(\mu_{cc}\), and \(\mu_{rc}\) are the second central moments, defined respectively
as \(\mu_{rr} = \sum_{(r,c)\in R}(r-\bar{r})^{2}\), \(\mu_{cc} = \sum_{(r,c)\in R}(c-\bar{c})^{2}\), and
\(\mu_{rc} = \sum_{(r,c)\in R}(r-\bar{r})(c-\bar{c})\). If the value of \(\theta\) is less than zero, this
means that the user's head is inclined to the left; otherwise, it means that the user's head is
inclined to the right. Figure 7 shows the recognition results for the face inclination.
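The short sketch below computes this inclination directly from the second central moments of a binary face region, using OpenCV's moments; it is a minimal illustration of equation (2), with a synthetic tilted ellipse standing in for a real segmented face.

```cpp
// Face inclination from second central moments, as in equation (2).
#include <cmath>
#include <cstdio>
#include <opencv2/opencv.hpp>

// Returns the inclination angle in degrees; the sign distinguishes left/right lean.
double faceInclinationDeg(const cv::Mat& faceMask /* 8-bit binary region */) {
    cv::Moments m = cv::moments(faceMask, /*binaryImage=*/true);
    if (m.m00 < 1e-6) return 0.0;                       // empty region
    // mu20, mu02, mu11 are the second central moments about the centroid.
    double theta = 0.5 * std::atan2(2.0 * m.mu11, m.mu20 - m.mu02);
    return theta * 180.0 / CV_PI;
}

int main() {
    // Synthetic tilted ellipse standing in for a segmented face region.
    cv::Mat mask = cv::Mat::zeros(240, 320, CV_8U);
    cv::ellipse(mask, cv::Point(160, 120), cv::Size(60, 90), 15.0, 0, 360,
                cv::Scalar(255), cv::FILLED);
    std::printf("estimated inclination: %.1f deg\n", faceInclinationDeg(mask));
}
```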
FIGURE 7: The Recognition Results for Face Inclination. (a) The Command of Turn-
Left, (b) The Command of Turn-Right.
To recognize the mouth shape in the current frame, template matching is performed, where the
current mouth region is compared with mouth-shape templates. These templates are obtained
by K-means clustering from 114 mouth images. K-means clustering is a method of classifying a
given data set into a certain number of clusters fixed a priori. In this experiment, multiple mouth-
shape templates were obtained, consisting of six different shapes of "Go" and "Uhm."
Figure 8 shows the mouth-shape templates.
FIGURE 8: The mouth shape templates. (a) "Go" mouth shape templates and, (b) "Uhm"
mouth shape templates.
The results of comparing the templates with a candidate are represented by matching scores.
The matching score between a mouth-shape template and a candidate is calculated using the
Hamming distance, where the Hamming distance between two binary strings is defined as the
number of digits in which they differ. Here, the matching scores for all the mouth-shape templates
and a mouth candidate are calculated, and the mouth-shape template with the best matching
score is selected.
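A rough OpenCV sketch of this matching step follows. In the report, the templates come from K-means clustering over 114 training mouth images (an offline step that could, for instance, use cv::kmeans); here two synthetic binary templates stand in for those cluster centers, and the template size and labels are assumptions.

```cpp
// Sketch: classify a binary mouth image by Hamming distance to mouth-shape templates.
#include <opencv2/opencv.hpp>
#include <climits>
#include <iostream>
#include <string>
#include <vector>

struct MouthTemplate { cv::Mat pattern; std::string label; };   // pattern: 8-bit binary

// Hamming distance = number of pixels where the two binary images differ.
int hamming(const cv::Mat& a, const cv::Mat& b) {
    cv::Mat diff;
    cv::bitwise_xor(a, b, diff);
    return cv::countNonZero(diff);
}

std::string classifyMouth(const cv::Mat& candidate, const std::vector<MouthTemplate>& templates) {
    cv::Mat resized;
    cv::resize(candidate, resized, templates.front().pattern.size());
    cv::threshold(resized, resized, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);

    int bestDist = INT_MAX;
    std::string bestLabel = "none";
    for (const MouthTemplate& t : templates) {
        int d = hamming(resized, t.pattern);
        if (d < bestDist) { bestDist = d; bestLabel = t.label; }
    }
    return bestLabel;    // "Go" -> move forward, "Uhm" -> stop
}

int main() {
    // Synthetic 32x16 templates standing in for the K-means cluster centers.
    cv::Mat open  = cv::Mat::zeros(16, 32, CV_8U);   // open mouth ("Go")
    cv::Mat close = cv::Mat::zeros(16, 32, CV_8U);   // closed mouth ("Uhm")
    cv::ellipse(open, cv::Point(16, 8), cv::Size(12, 6), 0, 0, 360, cv::Scalar(255), cv::FILLED);
    cv::line(close, cv::Point(4, 8), cv::Point(28, 8), cv::Scalar(255), 2);
    std::vector<MouthTemplate> templates = {{open, "Go"}, {close, "Uhm"}};

    std::cout << classifyMouth(open.clone(), templates) << "\n";   // prints "Go"
}
```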
4.3 CONVERTER: TRANSLATE THE USER'S GESTURES INTO THE IW'S
CONTROL COMMANDS
The proposed system uses a data acquisition board as a converter to translate the user's
gestures into control commands for the IW. Similar to a general electric powered wheelchair,
which is controlled by the voltage passed to the joystick, a data acquisition board (SDQ-
DA04EX) is used to perform the ADC and DAC functions. Figure 9 shows the data acquisition
board used in our IW. The board is connected to a computer through a serial port and
programmed using Visual Basic. The programmed function then translates the user's gestures
into control commands for the IW.
FIGURE 9: Data Acquisition Board (SDQ-DA04EX)
The commands given by the user interface are passed to the control program running
the wheelchair through the serial port. The board program then controls the speed and direction
of the wheelchair by modifying the voltage passed to the wheelchair.
Table 2 shows the command map between wheelchair movement and output voltage. The
proposed system is able to control both the direction and the velocity of the wheelchair, as the
user can produce a different output voltage by changing their mouth shape or face orientation. In
addition to simple commands, such as go-forward, go-backward, turn-left, or turn-right, the
proposed system can also give a mixture of two simple commands, similar to joystick control.
For example, the wheelchair can go in a 45-degree direction by combining the go-forward and
go-right commands. A small sketch of this command-to-voltage mapping follows Table 2.
TABLE 2: OPERATION VOLTAGES OF THE INTELLIGENT WHEELCHAIR

COMMAND | OUTPUT 1 | OUTPUT 2
Go | 2.5 V – 3.7 V | 2.45 V
Back | 1.2 V – 2.45 V | 2.45 V
Left | 2.45 V | 1.2 V – 3.7 V
Right | 2.45 V | 2.45 V – 3.7 V
Stop | 2.45 V | 2.45 V
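The fragment below is a hypothetical mapping from a recognized command to the two output voltages of Table 2. The linear scaling of each range by a command "strength" (for example, how far the face is inclined), and which end of the Back/Left ranges means "faster", are assumptions; the report only specifies the voltage ranges themselves.

```cpp
// Hypothetical command-to-voltage map based on Table 2 (2.45 V = neutral).
#include <algorithm>
#include <cstdio>
#include <utility>

enum class Command { Stop, Go, Back, Left, Right };
constexpr double kNeutral = 2.45;                        // voltage at which the chair does not move

double lerp(double a, double b, double t) { return a + t * (b - a); }

// strength in [0, 1] picks a point inside the Table 2 range for each command.
std::pair<double, double> commandToVolts(Command c, double strength) {
    double s = std::clamp(strength, 0.0, 1.0);
    switch (c) {
        case Command::Go:    return {lerp(2.50, 3.70, s), kNeutral};
        case Command::Back:  return {lerp(2.45, 1.20, s), kNeutral};   // lower voltage assumed = reverse
        case Command::Left:  return {kNeutral, lerp(2.45, 1.20, s)};   // assumed direction within range
        case Command::Right: return {kNeutral, lerp(2.45, 3.70, s)};
        default:             return {kNeutral, kNeutral};              // Stop
    }
}

int main() {
    auto v = commandToVolts(Command::Go, 0.5);
    std::printf("OUTPUT 1 = %.2f V, OUTPUT 2 = %.2f V\n", v.first, v.second);
    // Writing these values to the SDQ-DA04EX DAC over the serial port would use
    // the vendor's own protocol, which is not reproduced here.
}
```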
The interface system of the IW was developed on a PC platform: the operating system is
Windows XP and the CPU is a 1.7 GHz Pentium. The camera is a Logitech, connected to the
computer via the USB port and supplying 30 color images sized 320 × 240 per second.
To assess the validity of the proposed system, it was tested on 34 participants, including
17 disabled and 17 able-bodied users. The disabled users had the following disabilities: ataxia
and quadriplegia from spinal cord injuries. The details are summarized in Table 3.
TABLE 3: TESTING GROUPS

GROUP | NUMBER | EPW USAGE | COMPUTER USAGE
Able-Bodied Users | 17 | 0% | 100% (92%)
Disabled Users | 17 | 81% | 64% (23%)
5. EXPERIMENTAL RESULTS
The experiments were performed in two steps. First, the performance of the proposed
system is presented, which was tested in various environments. The effectiveness of the proposed
system is then discussed in comparison with other methods.
5.1 EXPERIMENT I: TO MEASURE THE ACCURACY OF OUR INTERFACE
For the proposed system to be practical in real environments, it should be robust to
various illuminations and cluttered backgrounds. Thus, the proposed method was applied to
detect and recognize the user's facial features against a complex background. Figure 10 shows the
facial feature detection results. The scenes had a cluttered stationary background with varying
illumination. As seen in Figure 10, the face and mouth were accurately detected, confirming
robustness to time-varying illumination and low sensitivity to a cluttered environment.
FIGURE 10: Face and mouth detection results.
With the proposed system, the user's intention is represented by the inclination of the face
and the shape of the mouth, making accurate recognition of these features crucial.
Figures 11(a) and 11(b) show the recognition results for the face inclination and the mouth
shapes, respectively. As shown in these figures, they are continuously and accurately recognized.
FIGURE 11: Face and mouth recognition results. (a) Face inclination recognition, (b)
Mouth shape recognition.
To quantitatively evaluate the performance of the proposed system, we asked each user to
perform certain commands, such as go straight, stop, turn left, or turn right, and to repeat each action
5 times. Table 4 shows the average time taken to detect the face and facial features and then
recognize them in indoor and outdoor environments. The average time taken to
process a frame was about 62 ms, allowing the proposed system to process more than 15
frames/sec on average (16 frames/sec indoors and 14 frames/sec outdoors). Table 5 shows the
recognition rates of the proposed interface for the respective commands. The proposed system
shows a precision of 100% and a recall of 96.5% on average. Thus, these experiments proved
that the proposed system can accurately recognize the user's intentions in real time.
TABLE 4: PROCESSING TIME (ms)
STAGE | INDOOR | OUTDOOR
Face Detection | 30 | 32
Mouth Detection | 15 | 18
Face Inclination Recognition | 2 | 2
Mouth Shape Recognition | 15 | 16
Total | 62 | 68
TABLE 5: PERFORMANCE EVALUATION RESULTS

COMMAND | RECALL | PRECISION
Left Turn | 0.98 | 1.00
Right Turn | 0.94 | 1.00
Go Straight | 0.96 | 1.00
Stop | 0.98 | 1.00
Figure 12 shows some snapshots of the proposed system being applied in various
environments. Outdoor environments have time-varying illumination and more complex
backgrounds, as shown in Figures 12(b) and 12(d). However, despite these complexities, the
proposed system worked very well in both environments. In particular, when someone
comes to talk to the user (Figure 12(c)), our system can accurately discriminate between
intentional and unintentional behaviors, thereby preventing potential accidents when the user
instinctively turns their head to look at the person.
FIGURE 12: IW Control in Real Environments
5.2 EXPERIMENT II: TO COMPARE WITH OTHER INTERFACES
To prove the efficiency and effectiveness of the proposed IW interface, it was also
compared with other systems. Here, two methods were adopted: one is a headband-based method
and the other is a method using face tracking. The former belongs to the intrusive methods
and the latter belongs to the vision-based methods. In the headband-based method, going and
stopping are controlled by nodding the user's head to the front or to the rear, and the direction is
changed by nodding the user's head to the left or right side. In such a system, the head
motions are measured through a headband that includes an accelerometer sensor. On the other
hand, the face-based interface detects the user's face in the first frame and tracks it continuously,
where the user's face is detected using a skin-color model.
Figure 13 shows the control commands for the respective methods. When visually inspected,
our system requires smaller motions than the others, which indicates that our system is more comfortable
and suitable for the severely disabled.
FIGURE 13: Intelligent Wheelchair input methods
For practical use by the severely disabled, such systems should be operable in both
indoor and outdoor environments. Thus, the three systems were evaluated across indoor and
outdoor settings, changes in the time of day, and weather conditions. These conditions are summarized in
Table 6, and some test maps for the indoor and outdoor environments are shown in Figure 14.
TABLE 6: TEST ENVIRONMENTS
PLACE | TIME AND ILLUMINATION
Indoor | (Daytime, fixed illumination); (Night time, fixed illumination)
Outdoor | (Daytime, time-varying illumination and a shadow); (Night time, –)
Indoor to outdoor, or vice versa | (Daytime, time-varying illumination and shadow)
FIGURE 14: Some Examples of Test Maps. (a) Outdoor Test Map, (b) Indoor Test Maps
The participants navigated each map 10 times using the three interfaces. The performances
of the three interfaces were then evaluated in terms of accuracy and speed. In these experiments,
the face-based method was tested only in the indoor environments, due to its sensitivity to the
time-varying illumination. As mentioned above, it uses a skin-color model to extract the user's
face, so it is very sensitive to illumination changes.
Tables 7 and 8 show the summarized performance comparisons for the three methods.
Table 7 shows the average times taken to reach the destination for the three interfaces. Among the three
methods, the face-based method is the slowest, whereas there is no significant difference
between our method and the headband method. Meanwhile, Table 8 shows the recognition
accuracies of the three methods, where the proposed method produced the best performance
with an average accuracy of 96.5%, while the face-based method had an accuracy of 87.5% and
the headband-based method had an accuracy of 88%. In the experiments, the face- and headband-
based methods produced over-recognition or mis-recognition in some environments. When
travelling uphill (or downhill), the headband method often missed the go-straight and
stop commands, and the face-based method often lost the user's face so that it could not track the user's
gestures. Moreover, the face-based method cannot discriminate between intentional and unintentional behaviors,
thereby risking potential accidents when the user instinctively turns their head to look at an
obstacle.
TABLE 7: AVERAGE TIME TAKEN TO REACH THE DESTINATION

METHOD | AVERAGE TIME (s)
Report method (tested indoors and outdoors) | 48.31
Headband-based method (tested indoors and outdoors) | 48.61
Face-based method (tested indoors) | 51.23
TABLE 8: ACCURACY (%)

METHOD | PRECISION | RECALL
Report method (tested indoors and outdoors) | 100 | 96.5
Headband-based method (tested indoors and outdoors) | 89 | 88
Face-based method (tested indoors) | 87 | 87.5
Consequently, these comparisons proved the efficiency and effectiveness of the proposed
system. Moreover, as shown in Figure 13, the proposed method requires less user
motion than the headband method, making the proposed system more suitable for the severely
disabled than conventional methods.
6. ADVANTAGES
The advantages of the proposed system include the following:
Minimal user motion, making the proposed system more adaptable to the severely
disabled than conventional methods;
Robustness to cluttered backgrounds and time-varying illumination; and
Accurate recognition of user intention based on discriminating between intentional and
unintentional behavior.
To prove these advantages, the proposed system was tested with 34 users in indoor and
outdoor environments. However, to guarantee full user safety, the proposed system also needs to
be able to detect and avoid obstacles automatically; further research in this area is currently
underway.
7. CONCLUSION
In this report, a few interesting views were discussed regarding an IW system that
is adaptable, efficient, and robust for disabled people with limited physical abilities. In this system, the
direction of the IW is determined by the user's face inclination, while going and stopping are determined
by the user's mouth shape. This report describes the design and implementation of a novel hands-
free control system for IWs. This IW system provides enhanced mobility for elderly and
disabled people who have very restricted limb movements or severe handicaps.
A robust HGI is designed for vision-based head gesture recognition of the RoboChair
user. The recognized gestures are used to generate motion control commands so that the
RoboChair can be controlled according to the user's intention. To avoid unnecessary movements
caused by the user looking around randomly, the HGI is focused on the central position of the
wheelchair to identify useful head gestures.
The proposed system was tested with 34 users in indoor and outdoor environments
and the results were compared with those of other systems; the results showed that the
proposed system has superior performance to the other systems in terms of speed and accuracy.
Therefore, it was proved that the proposed system provides a friendly and convenient interface for
severely disabled people.
Future research will focus on more extensive experiments and evaluation of the
HGI in both indoor and outdoor environments, where cluttered backgrounds, changing lighting
conditions, sunshine, and shadows may bring complications to head gesture recognition.
8. REFERENCES
1. Jin Sun Ju, Yunhee Shin, and Eun Yi Kim, Journal of NeuroEngineering and Rehabilitation, 2009.
2. Y. Matsumoto, T. Ino, and T. Ogasawara, "Development of intelligent wheelchair
system with face and gaze based interface," in 10th IEEE International Workshop on Robot
and Human Communication (ROMAN 2001), 2001, pp. 262–267.
3. P. Jia, H. Hu, T. Lu, and K. Yuan, "Head gesture recognition for hands-free control of an
intelligent wheelchair," Journal of Industrial Robot 34(1), 2007, pp. 60–68.
4. T. Felzer and B. Freisleben, "HaWCoS: The 'hands-free' wheelchair control system," in
International ACM SIGACCESS Conference on Computers and Accessibility (ACM
Press, New York, 2002), pp. 127–134.
5. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC273263/
6. http://www.jneuroengrehab.com/content/6/1/33
7. http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=835952
8. http://dces.essex.ac.uk/staff/hhu/Papers/IJHR-Vol8-No4-2011-77.pdf
