Mechatronics
journal homepage: www.elsevier.com/locate/mechatronics
Dead-reckoning sensor system and tracking algorithm for 3-D pipeline mapping
Dongjun Hyun a, Hyun Seok Yang a,*, Hyuk-Sung Park b, Hyo-Jun Kim c

a Department of Mechanical Engineering, Yonsei University, 262, Seongsanno, Seodaemun-gu, Seoul 120-749, Republic of Korea
b Robogen, RM133, Yonsei Engineering Research Park, 262, Seongsanno, Seodaemun-gu, Seoul 120-749, Republic of Korea
c Department of Mechanical Engineering, Kangwon National University, KyoDong, Samcheok, Kangwon Do 245-711, Republic of Korea
Article history:
Received 11 March 2009
Accepted 30 November 2009

Keywords:
3-D pipeline mapping
Dead reckoning
Multi-sensor fusion
Extended Kalman filter
Optical navigation sensor

Abstract: A dead-reckoning sensor system and a tracking algorithm for 3-D pipeline mapping are proposed for a tap water pipeline whose diameter is small and whose inner surface is rough due to pipe scales. The goals of this study are to overcome the performance limitations of small, low-grade sensors by combining various sensors with complementary functions, and to achieve robustness against a severe environment. The dead-reckoning sensor system consists of a small, low-cost micro-electromechanical-system inertial measurement unit (MEMS IMU) and an optical navigation sensor (as used in laser mice). The tracking algorithm consists of a multi-rate extended Kalman filter (EKF), which fuses redundant and complementary data from the MEMS IMU and the optical navigation sensor, and a geometry compensation method, which reduces position estimation error using the end point of the pipeline. Two sets of experimental data have been obtained by driving a radio-controlled car equipped with the sensor system in a 3-D pipeline and on asphalt pavement. Our study can be used to estimate the path of a 3-D pipeline or of mobile robots.

© 2009 Elsevier Ltd. All rights reserved.
* Corresponding author. Tel.: +82 2 2123 2824; fax: +82 2 312 2159.
E-mail address: hsyang@yonsei.ac.kr (H.S. Yang).
doi:10.1016/j.mechatronics.2009.11.009

D. Hyun et al. / Mechatronics 20 (2010) 213–223

The dead-reckoning sensor system consists of a MEMS IMU, an optical navigation sensor, a CPU module, and an SD card. The CPU module in this study is comprised of an S3C2440 processor using an ARM920T core, 4 MB NOR flash, 64 MB NAND flash, and 64 MB SDRAM. The CPU module has an elegant, simple, low-power, fully static design for cost- and power-sensitive applications. The CPU module stores the MEMS IMU and optical navigation sensor measurements on the SD card for the post-processing of experimental data. The storage capacity of the SD card is 1 GB, and sensor measurements can be written for more than 20 h. The MEMS IMU is interfaced over RS-232 and offers a sampling period of 10 ms. The optical navigation sensor is interfaced over SPI and provides a sampling period of 5 ms or less. The dimensions of the MEMS IMU are 58 × 58 × 22 mm, and those of the optical navigation sensor are 55 × 55 × 31 mm. The dead-reckoning sensor system is compact enough to fit on a small mobile robot. Fig. 1 shows an overview of the dead-reckoning sensor system.
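The logging budget implied by these figures can be sanity-checked. Only the sampling periods, the 20 h duration, and the 1 GB capacity come from the text; the per-record byte budget is a derived quantity, and no record format is specified in the paper.

```python
# Sanity check of the SD-card logging budget described above.
# Only the periods, duration, and capacity are from the text.
HOURS = 20
IMU_PERIOD_S = 0.010      # MEMS IMU over RS-232: one sample every 10 ms
OPT_PERIOD_S = 0.005      # optical navigation sensor over SPI: every 5 ms

imu_samples = int(HOURS * 3600 / IMU_PERIOD_S)   # samples in 20 h
opt_samples = int(HOURS * 3600 / OPT_PERIOD_S)

CARD_BYTES = 1_000_000_000  # 1 GB card
# Average bytes available per stored record if the card must hold 20 h:
budget_per_record = CARD_BYTES / (imu_samples + opt_samples)
print(f"{imu_samples=}, {opt_samples=}, budget ≈ {budget_per_record:.1f} B/record")
```

With 7.2 million IMU and 14.4 million optical records, roughly 46 bytes remain per record on average, which is consistent with logging a handful of raw sensor channels plus a timestamp.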
Fig. 1. Overview of the tracking sensor system.
Fig. 3. Linear motion experimental result of a MEMS accelerometer.
Table 1
Process of the multi-rate EKF.

Kalman filtering:
if $z_{k+1}$ is available:
  $R_{k+1} = \mathrm{diag}(\tilde r_{k+1})$
  $K_{k+1} = P^-_{k+1} H^T_{k+1} \left(H_{k+1} P^-_{k+1} H^T_{k+1} + R_{k+1}\right)^{-1}$
  $x_{k+1} = \hat x^-_{k+1} + K_{k+1}\left(z_{k+1} - H_{k+1} \hat x^-_{k+1}\right)$
  $P_{k+1} = \left(I - K_{k+1} H_{k+1}\right) P^-_{k+1}$
else:
  $x_{k+1} = \hat x^-_{k+1}$
  $P_{k+1} = P^-_{k+1}$
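The Table 1 update can be sketched directly in NumPy. The two-state model below is a small stand-in for the paper's full state vector, used only to show the branch structure: a Kalman correction when a measurement arrived this period, and a pass-through otherwise.

```python
import numpy as np

def mr_ekf_update(x_pred, P_pred, z, H, r_diag):
    """One Table-1 step: Kalman-correct the prediction if a measurement
    arrived this period, otherwise pass the prediction through."""
    if z is None:                      # sensor outage or off-cycle sample
        return x_pred, P_pred
    R = np.diag(r_diag)                # R_{k+1} = diag(r~_{k+1})
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

# Toy example: 2-state [position, velocity], position measured directly.
x_pred = np.array([1.0, 0.5])
P_pred = np.eye(2)
H = np.array([[1.0, 0.0]])
x, P = mr_ekf_update(x_pred, P_pred, np.array([1.2]), H, [0.1])
x2, P2 = mr_ekf_update(x_pred, P_pred, None, H, [0.1])  # outage: prediction kept
```

The `else` branch is what makes the filter multi-rate: the prediction simply propagates whenever a sensor's sample is missing, whether because of its slower period or an outage.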
Fig. 7. Illustration of geometry compensation: (a) forward/backward estimation curves and (b) compensating procedure.
Fig. 10. 3-D pipeline mapping results of the (a) accelerometer and (b) optical navigation sensor without the sensor fusion.
slip ambiguity that plagues wheel-type odometers. However, an optical navigation sensor sometimes fails to process the image of the surface when it moves over mirrors, glass, or glossy surfaces, or moves so fast that the sensor cannot take continuous snapshots. Therefore, an optical navigation sensor cannot be used independently as an odometer. However, an outage of the optical navigation sensor can be detected clearly and treated more easily by the estimation algorithm than slip error can.

In laser mice, the optical system is comprised of a lens and a laser diode (as an illumination source) and functions only when the optical navigation sensor is 2–3 mm away from the surface. If an optical navigation sensor is fitted to a mobile robot 2–3 mm above the surface, then the robot can move only on a smooth surface, so that the sensor does not collide with a bumpy or inclined surface. Therefore, a new optical system comprising a micro lens and LEDs is designed so that the optical navigation sensor remains usable even when it is located 20–80 mm away from the surface. The micro lens is an afocal lens of the kind used in cellular-phone cameras and has infinite depth of field without autofocus. However, the optical navigation sensor with the afocal lens may not work properly at excessive height, because the image sensor of the optical navigation sensor has low resolution. Fig. 4 shows the newly designed optical system.

3. Tracking algorithm

3.1. Multi-rate EKF algorithm

The EKF is the most popular recursive state estimator for nonlinear systems, despite several shortcomings. The multi-rate sampling method was applied to the EKF for an asynchronous sensor system consisting of sensors with different sampling rates by Armesto et al. [14]. The multi-rate EKF is well suited to the sensor system proposed in this paper because the MEMS IMU and the optical navigation sensor have different sampling rates and outages of the optical navigation sensor occur irregularly. Abnormal data from an outage of the optical navigation sensor can be removed by the multi-rate sampling method to prevent it from corrupting the state estimate.
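With a 10 ms IMU and a 5 ms optical sensor, a filter running at the fastest rate sees the IMU only on every other step. The sketch below shows one way to build a per-step measurement-availability mask; the periods come from the text, while the outage set is a hypothetical example, since the paper does not describe outage bookkeeping in this form.

```python
# Which sensors contribute a measurement at each 5 ms filter step?
# Periods are from the text; the outage set is a hypothetical example.
FILTER_PERIOD_MS = 5
IMU_PERIOD_MS = 10
OPT_PERIOD_MS = 5
outage_steps = {3, 4}          # steps where the optical sensor failed

def availability(step, outages):
    """Return (imu_available, optical_available) for one filter step."""
    t_ms = step * FILTER_PERIOD_MS
    imu_ok = (t_ms % IMU_PERIOD_MS) == 0
    opt_ok = (t_ms % OPT_PERIOD_MS) == 0 and step not in outages
    return imu_ok, opt_ok

schedule = [availability(s, outage_steps) for s in range(6)]
# IMU on even steps only; optical everywhere except the outage steps.
```

Each pair then selects which rows of the stacked measurement model participate in the Table-1 update at that step.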
Fig. 11. Results of the sensor fusion.
Fig. 12. Results of the geometry compensation.
$$x_{k+1} = F_k x_k + G_k w_k \tag{4}$$

$$
F_k =
\begin{bmatrix}
I_{3\times3} & T I_{3\times3} & \frac{T^2}{2} I_{3\times3} + \frac{T^3}{6}[\omega_k] & 0_{3\times3} & 0_{3\times4} & 0_{3\times3} \\
0_{3\times3} & I_{3\times3} & T I_{3\times3} + \frac{T^2}{2}[\omega_k] & 0_{3\times3} & 0_{3\times4} & 0_{3\times3} \\
0_{3\times3} & 0_{3\times3} & I_{3\times3} + T[\omega_k] & 0_{3\times3} & 0_{3\times4} & 0_{3\times3} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times4} & 0_{3\times3} \\
0_{4\times3} & 0_{4\times3} & 0_{4\times3} & 0_{4\times3} & \cos\!\left(\frac{\|\omega_k\| T}{2}\right) I_{4\times4} & \frac{\sin\left(\|\omega_k\| T/2\right)}{\|\omega_k\|}\,\Omega_k \\
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times4} & I_{3\times3}
\end{bmatrix} \tag{5}
$$

$$
G_k =
\begin{bmatrix}
\frac{T^3}{6} I_{3\times3} & 0_{3\times3} & 0_{3\times3} \\
\frac{T^2}{2} I_{3\times3} & 0_{3\times3} & 0_{3\times3} \\
T I_{3\times3} & T[v_k] & 0_{3\times3} \\
0_{3\times3} & 0_{3\times3} & T I_{3\times3} \\
0_{4\times3} & \frac{T \sin\left(\|\omega_k\| T/2\right)}{2\,\|\omega_k\|}\,\Omega_k & 0_{4\times3} \\
0_{3\times3} & T I_{3\times3} & 0_{3\times3}
\end{bmatrix} \tag{6}
$$

$$
[\omega_k] =
\begin{bmatrix}
0 & -\omega_z & \omega_y \\
\omega_z & 0 & -\omega_x \\
-\omega_y & \omega_x & 0
\end{bmatrix}_k \tag{7}
\qquad
[v_k] =
\begin{bmatrix}
0 & -v_z & v_y \\
v_z & 0 & -v_x \\
-v_y & v_x & 0
\end{bmatrix}_k \tag{8}
$$

Fig. 13. Position error of the geometry compensation and forward estimation.
Fig. 14. Estimated velocity and optical navigation sensor.
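The skew-symmetric blocks $[\omega_k]$ and $[v_k]$ in Eqs. (5)–(8) turn cross products into matrix products, which is what keeps the model linear in the state. A brief NumPy check with arbitrary sample values:

```python
import numpy as np

def skew(a):
    """Skew-symmetric matrix [a] such that skew(a) @ b == np.cross(a, b),
    as in Eqs. (7) and (8)."""
    ax, ay, az = a
    return np.array([[0.0, -az,  ay],
                     [ az, 0.0, -ax],
                     [-ay,  ax, 0.0]])

w = np.array([0.1, -0.2, 0.3])   # arbitrary angular rate
v = np.array([1.0, 2.0, 3.0])    # arbitrary velocity
assert np.allclose(skew(w) @ v, np.cross(w, v))

# The third diagonal block of F_k, I + T*[w], then advances the
# body-frame acceleration under rotation over one sample period T.
T = 0.005
F_aa = np.eye(3) + T * skew(w)
```

Because `skew(a)` is exactly the cross-product operator, the rotation-coupling terms in $F_k$ and $G_k$ stay ordinary matrix blocks rather than state-dependent nonlinearities inside the update.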
Fig. 15. Estimated velocity and optical navigation sensor at (a) 48–52 s and (b) 60–68 s.
Fig. 16. Estimated states: (a) velocity, (b–d) acceleration and bias, (e) quaternion and (f) angular velocity.
$$
\Omega_k =
\begin{bmatrix}
-q_1 & -q_2 & -q_3 \\
q_0 & -q_3 & q_2 \\
q_3 & q_0 & -q_1 \\
-q_2 & q_1 & q_0
\end{bmatrix}_k \tag{9}
$$

The states, $x_k$, consist of position, velocity, acceleration, acceleration bias, quaternion, and angular rate. The plant noises, $w_k$, consist of jerk, angular acceleration, and the derivative of the bias. The full motion model with noise is given by Eqs. (4)–(9); detailed derivations are shown in Section 3 of [14]. $T$ is the sampling period.

$$z_k = h(x_k) + v_k = H_k x_k + v_k \tag{10}$$

$$z_k = \begin{bmatrix} z_1^T & z_2^T & z_3^T \end{bmatrix}^T_k = \begin{bmatrix} a_{Inertial}^T & \omega_{Inertial}^T & dd_{Optical}^T \end{bmatrix}^T_k \tag{11}$$
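The matrix $\Omega_k$ of Eq. (9) encodes the quaternion kinematics $\dot q = \tfrac{1}{2}\,\Omega_k\,\omega$. The NumPy check below confirms this matches the quaternion product $\tfrac{1}{2}\, q \otimes (0, \omega)$; a scalar-first Hamilton convention is assumed here, and the sample values are arbitrary.

```python
import numpy as np

def omega_mat(q):
    """4x3 matrix of Eq. (9): qdot = 0.5 * omega_mat(q) @ w for body rates w."""
    q0, q1, q2, q3 = q
    return np.array([[-q1, -q2, -q3],
                     [ q0, -q3,  q2],
                     [ q3,  q0, -q1],
                     [-q2,  q1,  q0]])

def quat_mult(p, q):
    """Hamilton product, scalar-first convention."""
    p0, pv = p[0], p[1:]
    q0, qv = q[0], q[1:]
    return np.concatenate(([p0 * q0 - pv @ qv],
                           p0 * qv + q0 * pv + np.cross(pv, qv)))

q = np.array([0.96, 0.1, -0.2, 0.15])
q = q / np.linalg.norm(q)          # unit orientation quaternion
w = np.array([0.3, -0.1, 0.2])     # arbitrary body angular rate

# Quaternion kinematics two ways: Eq. (9) form vs Hamilton product.
qdot_a = 0.5 * omega_mat(q) @ w
qdot_b = 0.5 * quat_mult(q, np.concatenate(([0.0], w)))
assert np.allclose(qdot_a, qdot_b)
```

A useful side effect is that $q^T \dot q = 0$, so this propagation preserves the quaternion norm to first order.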
Measurements, $z_k$, consist of the acceleration, $a_{Inertial}$, and angular rates, $\omega_{Inertial}$, of the MEMS IMU and the displacement, $dd_{Optical}$, of the optical navigation sensor:

$$a_{Inertial} = a_{Inertial,raw} - [q^*_{k-1}]\, g^n = a + b \tag{12}$$

$$
[q^*_{k-1}] =
\begin{bmatrix}
2q_0^2 - 1 + 2q_1^2 & 2q_1 q_2 + 2q_0 q_3 & 2q_1 q_3 - 2q_0 q_2 \\
2q_1 q_2 - 2q_0 q_3 & 2q_0^2 - 1 + 2q_2^2 & 2q_2 q_3 + 2q_0 q_1 \\
2q_1 q_3 + 2q_0 q_2 & 2q_2 q_3 - 2q_0 q_1 & 2q_0^2 - 1 + 2q_3^2
\end{bmatrix}_{k-1} \tag{13}
$$

$$\omega_{Inertial} = \omega \tag{14}$$

$$dd_{Optical} = T v + \frac{1}{2} T^2 a \tag{15}$$

$$
H_k =
\begin{bmatrix} H_{z1} \\ H_{z2} \\ H_{z3} \end{bmatrix}_k =
\begin{bmatrix}
0_{3\times3} & 0_{3\times3} & I_{3\times3} & I_{3\times3} & 0_{3\times4} & 0_{3\times3} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times4} & I_{3\times3} \\
0_{3\times3} & T I_{3\times3} & \frac{T^2}{2} I_{3\times3} & 0_{3\times3} & 0_{3\times4} & 0_{3\times3}
\end{bmatrix}_k \tag{16}
$$

$$
R_k = \mathrm{diag}(\tilde r_k) =
\begin{bmatrix}
r_1 & 0 & \cdots & 0 \\
0 & r_2 & & \vdots \\
\vdots & & \ddots & 0 \\
0 & \cdots & 0 & r_n
\end{bmatrix}_k \tag{17}
$$

where $\tilde r_k = [\tilde r_{z1}^T \; \tilde r_{z2}^T \; \tilde r_{z3}^T]^T_k = [r_1 \; r_2 \; \cdots \; r_n]^T_k$.

$a_{Inertial}$ is obtained by subtracting the gravity in body axes from $a_{Inertial,raw}$, the raw outputs of the accelerometers. $g^n$ is the gravity in reference axes, and $q^*_{k-1}$ is the complex conjugate of the orientation quaternion at the previous step. $[q^*_{k-1}]$ is the rotation matrix that transforms from reference to body axes. $H_k$ is the measurement model, which represents the relationship between the three measurement vectors and the six state vectors. $R_k$ is the measurement noise covariance matrix.

$$
[q] =
\begin{bmatrix}
2q_0^2 - 1 + 2q_1^2 & 2q_1 q_2 - 2q_0 q_3 & 2q_1 q_3 + 2q_0 q_2 \\
2q_1 q_2 + 2q_0 q_3 & 2q_0^2 - 1 + 2q_2^2 & 2q_2 q_3 - 2q_0 q_1 \\
2q_1 q_3 - 2q_0 q_2 & 2q_2 q_3 + 2q_0 q_1 & 2q_0^2 - 1 + 2q_3^2
\end{bmatrix} \tag{20}
$$

$$p^n_{k+1} = p^n_k + [q_{k+1}]\, dp^b_{k+1} \tag{21}$$

Eqs. (18)–(21) give the position in reference axes, $p^n_{k+1}$. The differential displacement in reference axes, $dp^n$, is obtained by transforming the differential displacement in body axes, $dp^b$, to reference axes with the rotation matrix, $[q_{k+1}]$, because $p_k$ in $x_k$ is the position in body axes.

The estimation of states with the acceleration bias included is very useful for the dead-reckoning sensor system, since sufficiently exact estimation of the acceleration bias provides a proper estimate of velocity during the intermittent outages of the optical navigation sensor. As shown in Fig. 3, if the outage duration of the optical navigation sensor is less than 5 s and a sufficiently exact acceleration bias is given, then the MEMS accelerometer can provide proper velocity estimation without any assistance. Fig. 5 shows that the velocity is estimated successfully, based on proper accelerometer bias estimation, in spite of the outages of the optical navigation sensor.

3.2. Geometry compensation algorithm

The geometry compensation algorithm proposed in this paper is a method to reduce error in the position estimation by combining the two curves obtained from the forward and backward estimations with a geometric approach. For 3-D pipeline mapping, for which the start and end points are given, the maximum position error of the method using both forward and backward estimations can be less than half of that of the method using only forward or backward estimation, because the estimation error accumulates with the distance from the known end point.
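The text states the goal of the geometry compensation but not its exact combination rule. As an illustrative stand-in (not the paper's algorithm), the sketch below blends the forward and backward curves with normalized arc-length weights, so each curve dominates near the end point at which it is anchored:

```python
import numpy as np

def blend_forward_backward(p_fwd, p_bwd):
    """Illustrative geometry-compensation stand-in: linearly blend the
    forward estimate (exact at the start) with the backward estimate
    (exact at the end) by normalized sample index."""
    n = len(p_fwd)
    w = np.linspace(0.0, 1.0, n)[:, None]   # 0 at start -> 1 at end
    return (1.0 - w) * p_fwd + w * p_bwd

# Toy 1-D path with a linearly growing forward drift and the mirror-image
# backward drift; blending removes the error at both anchored ends.
truth = np.linspace(0.0, 10.0, 11)[:, None]
drift = np.linspace(0.0, 1.0, 11)[:, None]
p_fwd = truth + drift            # error grows away from the start
p_bwd = truth - drift[::-1]      # error grows away from the end
p = blend_forward_backward(p_fwd, p_bwd)
```

In this linear-drift toy the blend cancels the error entirely; with real, irregular drift it cannot do that, but it keeps the worst-case error below the maximum of either single-direction estimate, consistent with the halving argument in the text.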
Fig. 17. Asphalt pavement measurements from (a) isometric and (b) side views.
Fig. 19. Results of the geometry compensation.
4. Experimental results
Fig. 22. Estimated velocity and optical navigation sensor at (a) 34–38 s and (b) 294–289 s.
Table 2
Odometer and position error.

| Experiment | Total length | Estimated length | Odometer error | Position error, forward / geometry comp. |
| --- | --- | --- | --- | --- |
| 3-D pipeline | 24.46 m | 24.51 m | 0.05 m (0.2%) | 0.72 m (2.9%) / 0.22 m (1.0%) |
| Asphalt pavement | 855.4 m | 850.7 m | 4.7 m (0.6%) | 22.4 m (2.6%) / 7.1 m (0.8%) |
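The percentage columns of Table 2 follow from the listed lengths. A quick check, with the values copied from the table (by this formula the pipeline geometry-compensation entry works out to 0.9%, printed as 1.0% in the table):

```python
# Derive the error percentages of Table 2 from its length columns.
cases = {
    "3-D pipeline":     dict(total=24.46, estimated=24.51, pos_fwd=0.72, pos_geo=0.22),
    "Asphalt pavement": dict(total=855.4, estimated=850.7, pos_fwd=22.4, pos_geo=7.1),
}
for name, c in cases.items():
    odo_err = abs(c["estimated"] - c["total"])
    print(f'{name}: odometer {odo_err:.2f} m ({100 * odo_err / c["total"]:.1f}%), '
          f'position fwd {100 * c["pos_fwd"] / c["total"]:.1f}%, '
          f'geometry comp. {100 * c["pos_geo"] / c["total"]:.1f}%')
```

The check confirms that both errors are normalized by the total path length, and that geometry compensation cuts the position error roughly threefold in both experiments.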
Fig. 23. Estimated states: (a) velocity, (b–d) acceleration and bias, (e) quaternion and (f) angular velocity.
4.1. Experiment and estimation results in the 3-D pipeline

… observation. From an overhead viewpoint, the shape of the pipeline is '@', and its length, width, and height were 7.4 m, 9 m, and 2.4 m.

The RC car experienced many disturbances during the run. The inner surface of the PVC pipe was slippery, and the tires of the RC car continuously experienced slip while driving up the inclined pipes. It was difficult to control the RC car through the opaque PVC pipes, resulting in sudden accelerations, braking, and rapid steering of the RC car at the corners of the pipeline. Additionally, the intermittent acryl pipes disturbed the optical navigation sensor. As a result, the dead-reckoning sensor system experienced a difficult environment.

Figs. 10a and b represent the position estimation results without sensor fusion. Specifically, in Fig. 10b the optical navigation sensor loses considerable distance due to the transparent acryl pipes and irregular driving, such that the length of the estimated path is much shorter than the actual length of the 3-D pipeline.

Figs. 11 and 12 show the estimation results of the sensor fusion and the geometry compensation, respectively. The sensor fusion algorithm effectively combined the complementary characteristics of the optical navigation sensor and the inertial sensors, and the geometry compensation algorithm reduced the position error of the estimated 3-D pipeline. The position error in Fig. 13 is the minimum distance between the reference points of the pipeline and the estimated path.

Figs. 14 and 15 show the velocity of the estimated states and the optical navigation sensor measurements. Failures of the optical navigation sensor measurements occurred 79 times while processing 13,909 measurements. However, the velocity was estimated smoothly during failures of the optical navigation sensor. The tracking algorithm has reconstructed the measurements lost by the optical navigation sensor nearly exactly.

Fig. 16 shows the states estimated by the multi-rate EKF algorithm.

4.2. Experiment and estimation results on asphalt pavement

Long-range performance of the proposed system is demonstrated on the asphalt pavement of a known path. The RC car, equipped with the dead-reckoning sensor system, was driven on the asphalt pavement, of which 89 points have been measured with D-GPS; the total distance is 855.4 m. Fig. 17 shows the path of the measured asphalt pavement.

Figs. 18 and 19 show the estimation results of the sensor fusion and the geometry compensation. The estimation results of the test on the asphalt pavement are similar to those of the test in the 3-D pipeline. The position error shown in Fig. 20 is also similar to that of the test in the 3-D pipeline.

Figs. 21 and 22 show the velocity of the estimated states and the optical navigation sensor measurements. Failures of the optical navigation sensor measurements occurred 4092 times while processing 33,702 measurements.

The proposed dead-reckoning sensor system and tracking algorithm output consistent performance in both the 3-D pipeline and asphalt pavement experiments. Table 2 shows the odometer and position errors. The odometer error refers to the difference between the total length and the estimated length, and the position error refers to the maximum value of the position error in Figs. 12 and 18.

Fig. 23 shows the states estimated by the multi-rate EKF algorithm. Eqs. (24) and (25) show the values of the Q and R matrices selected based on the experimental results:

$$Q = \mathrm{diag}\begin{bmatrix} I_{1\times3} \cdot 1.5\times10^{-2} & I_{1\times3} \cdot 0.38 & I_{1\times3} \cdot 0.5\times10^{-6} \end{bmatrix} \tag{24}$$

$$R = \mathrm{diag}\begin{bmatrix} I_{1\times3} \cdot 0.1 & I_{1\times3} \cdot 1\times10^{-5} & I_{1\times3} \cdot 1\times10^{-8} \end{bmatrix} \tag{25}$$

5. Conclusions and future work

The experiments have demonstrated that the dead-reckoning sensor system, including the optical navigation sensor developed for this study, functions successfully under difficult conditions, and that the tracking algorithm has successfully estimated the path of the RC car. In particular, the tracking algorithm has combined the redundant and complementary measurements of the optical navigation sensor and the low-cost MEMS IMU, and has successfully restored the displacement measurements of the optical navigation sensor.

Future research will aim to reduce the failure rate of the optical navigation sensor measurements and to make the tracking algorithm run in real time. Reduction of the optical navigation sensor failures can be achieved by improving the optical system. Real-time processing can be achieved by preventing the orientation errors that increase continuously over time. An additional algorithm referring to the direction of the gravitational vector and to odometers on the right- and left-hand sides of the mobile robot can restrict the orientation error to within an appropriate range.

References

[1] Lee S, Song JB. Robust mobile robot localization using optical flow sensors and encoders. In: International conference on robotics and automation; April 2004. p. 1039–44.
[2] Cooney JA, Xu WL, Bright G. Visual dead-reckoning for motion control of a Mecanum-wheeled mobile robot. Mechatronics 2004;14:623–37.
[3] Bradshaw J, Lollini C, Bishop BE. On the development of an enhanced optical mouse sensor for odometry and mobile robotics education. In: 39th southeastern symposium on system theory; March 2007. p. 6–10.
[4] Bevly DM, Parkinson B. Cascaded Kalman filters for accurate estimation of multiple biases, dead-reckoning navigation, and full state feedback control of ground vehicles. IEEE Trans Control Syst Technol 2007:199–208.
[5] Johnson EA, Morris Bamberg SJ, Minor MA. A state estimator for rejecting noise and tracking bias in inertial sensors. In: International conference on robotics and automation; 2008. p. 3256–63.
[6] Lamon P, Siegwart R. Inertial and 3D-odometry fusion in rough terrain – towards real 3D navigation. In: Proceedings of the 2004 IEEE/RSJ international conference on intelligent robots and systems; September 28–October 2, 2004; Sendai, Japan.
[7] López-Orozco JA, de la Cruz JM, Besada E, Ruipérez P. An asynchronous, robust, and distributed multisensor fusion system for mobile robots. Int J Rob Res 2000;19(10):914–32.
[8] Ojeda L, Borenstein J. Personal dead-reckoning system for GPS-denied environments. In: International workshop on safety, security and rescue robotics; September 2007.
[9] Sukkarieh S, Nebot EM, Durrant-Whyte HF. A high integrity IMU/GPS navigation loop for autonomous land vehicle applications. IEEE Trans Rob Autom 1999;15(3):572–8.
[10] Yu J, Lee JG, Park CG, Han HS. An off-line navigation of a geometry PIG using a modified nonlinear fixed-interval smoothing filter. Control Eng Pract 2005;13:1403–11.
[11] Nassar S. Improving the inertial navigation system (INS) error model for INS and INS/DGPS applications. PhD thesis, Department of Geomatics Engineering, University of Calgary, Calgary, Alberta, Canada; UCGE Report No. 20183; 2003.
[12] Gelb A. Applied optimal estimation. Cambridge (MA): The MIT Press; 1973.
[13] Wang HG, Williams TC. Strategic inertial navigation systems: high-accuracy inertially stabilized platforms for hostile environments. IEEE Control Syst Mag 2008:65–85.
[14] Armesto L, Tornero J, Vincze M. Fast ego-motion estimation with multi-rate fusion of inertial and vision. Int J Rob Res 2007:577–89.