I. INTRODUCTION
Mobile robot localization is the problem of determining
the pose of a robot relative to a given map of the environment
[28]. It is a fundamental requirement for vehicle autonomy. This problem has been well solved for indoor robots on
a planar surface. However, achieving an accurate, robust, yet low-cost approach to vehicle
localization in an outdoor 3D urban environment still poses many challenges. Together
with localization comes its concomitant problem of mapping.
A map is an abstract representation of the environment,
which usually serves as a prior for robot localization. How
to map the 3D environment and what representation to use
are additional questions that must be addressed. This paper
introduces the novel idea of a synthetic LIDAR, which constructs a synthetic 2D scan from 3D features and solves the localization and mapping problem in a 2D manner. The algorithm is
developed under the idea of minimal sensing, using only one
tilted-down single-layer LIDAR and odometry information.
The fusion of Global Positioning System (GPS) and Inertial Navigation System (INS) to estimate vehicle position
has been the most popular localization solution in recent
years [18], [21], [4]. This solution works well in open areas;
however, it is not suitable for dense urban environments,
where GPS signals suffer severely from satellite blockage
and multipath propagation caused by tall buildings [14].
Road-matching algorithms have been proposed to alleviate this
problem, where a prior road map is used as either an additional
motion constraint or an observation to update the localization
estimate [6], [7]. While this solution achieves good global
localization, it is not designed for precise estimation relative
to the local environment, an ability that is highly desirable
in many cases.
1 Z. J. Chong, B. Qin, M. H. Ang Jr. are with the National University
of Singapore, Kent Ridge, Singapore {chongzj, baoxing.qin,
mpeangh} at nus.edu.sg
2 T. Bandyopadhyay is with the Singapore-MIT Alliance for Research and
Technology, Singapore tirtha at smart.mit.edu
3 E. Frazzoli and D. Rus are with the Massachusetts Institute of Technology, Cambridge, MA., USA frazzoli at mit.edu, rus at
csail.mit.edu
978-1-4673-5643-5/13/$31.00 2013 IEEE
\begin{pmatrix} x_n \\ y_n \\ z_n \end{pmatrix} =
\begin{pmatrix} \cos\alpha_n \cos\beta_n \\ \cos\alpha_n \sin\beta_n \\ \sin\alpha_n \end{pmatrix} r_n \qquad (1)
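A minimal sketch of this range-to-Cartesian conversion; the angle names (tilt alpha, bearing beta, in radians) are assumptions for illustration:

```python
import numpy as np

def polar_to_cartesian(r, alpha, beta):
    """Convert a range reading r at tilt angle alpha and bearing beta
    (both in radians) into a 3D Cartesian point, following Eq. (1)."""
    x = np.cos(alpha) * np.cos(beta) * r
    y = np.cos(alpha) * np.sin(beta) * r
    z = np.sin(alpha) * r
    return np.array([x, y, z])
```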
The 3D perception assumes that the odometry system is accurate
over a short time period, and accumulates the laser
scans into 3D range data. A classification procedure is then
applied to extract interest points from the accumulated data.
The extracted laser points are then projected onto a virtual
horizontal plane (by ignoring their z values), and a synthetic
2D LIDAR is constructed. The 2D localization fuses odometry information and measurements from the
synthetic 2D LIDAR in a Monte Carlo Localization scheme.
With a prior map of vertical features generated beforehand,
localization on the 2D horizontal plane is achieved.
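The projection step described above can be sketched as follows; the beam count and maximum range are illustrative parameters, not values from the paper:

```python
import numpy as np

def synthesize_2d_scan(points, n_beams=360, max_range=30.0):
    """Project 3D interest points onto the horizontal plane (drop z)
    and keep the nearest return per angular bin, mimicking a planar
    2D LIDAR scan."""
    ranges = np.full(n_beams, max_range)
    for x, y, _z in points:
        r = np.hypot(x, y)          # planar range, z ignored
        if r >= max_range:
            continue
        # map the bearing angle [0, 2*pi) onto a beam index
        beam = int((np.arctan2(y, x) % (2 * np.pi)) / (2 * np.pi) * n_beams)
        ranges[beam] = min(ranges[beam], r)
    return ranges
```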
III. 3D PERCEPTION
One of the requirements of the synthetic LIDAR is excellent adaptability to different types of environments.
[System overview figure: a tilted-down LIDAR feeds scan accumulation into a rolling window with updating (the window of fixed size rolls ahead as obsolete points are dropped); points classification produces a 2D synthetic LIDAR; the projected 2D scan, odometry from IMU + encoder, and a prior map are fused by Monte Carlo Localization to estimate (x, y, yaw).]
d_i = (p_i - \bar{x}) \cdot \vec{n} \qquad (3)
By taking \bar{x} = \bar{p} = \frac{1}{k}\sum_{i=1}^{k} p_i as the centroid of P^k,
the values of \vec{n} can be computed in a least-squares sense
such that d_i = 0. The solution for \vec{n} is given by computing
the eigenvalues and eigenvectors of the following covariance
matrix C \in \mathbb{R}^{3 \times 3} of P^k [24]:

C = \frac{1}{k}\sum_{i=1}^{k} (p_i - \bar{p})(p_i - \bar{p})^T, \quad C\vec{v}_j = \lambda_j \vec{v}_j, \quad j \in \{0, 1, 2\} \qquad (4)
where k is the number of points in the local neighborhood,
\bar{p} is the centroid of the neighbors, and \lambda_j is the j-th eigenvalue
with \vec{v}_j as the j-th eigenvector. The principal components of
P^k correspond to the eigenvectors \vec{v}_j. Hence, the approximation of \vec{n} can be found from the eigenvector of the smallest eigenvalue \lambda_0.
Once the normal vector \vec{n} is found, the vertical points can
then be obtained by simply thresholding the z component of \vec{n},
e.g. at 0.5. This threshold can vary depending on how noisy
the sensor data is.
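The normal estimation and vertical-point test above can be sketched as follows, assuming the threshold is applied to the magnitude of the normal's z component (a point on a vertical surface has a near-horizontal normal):

```python
import numpy as np

def is_vertical(neighborhood, nz_thresh=0.5):
    """Estimate the surface normal of a local neighborhood (k x 3 array)
    via the covariance eigendecomposition of Eq. (4), then classify the
    point as vertical when the normal's z component is small."""
    centroid = neighborhood.mean(axis=0)
    centered = neighborhood - centroid
    C = centered.T @ centered / len(neighborhood)  # 3x3 covariance, Eq. (4)
    eigvals, eigvecs = np.linalg.eigh(C)           # eigenvalues in ascending order
    normal = eigvecs[:, 0]                         # eigenvector of smallest eigenvalue
    return abs(normal[2]) < nz_thresh              # near-horizontal normal => vertical surface
```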
To find the local neighborhood points efficiently, a KD-tree [19] is built from all the points in the
rolling window, and a fixed-radius search is performed at each
point. Although the surface normals can be calculated over the window as a
whole, performing the normal calculation at every point in the
rolling window can be very expensive. To further reduce the
computational complexity, two successive rolling windows are
maintained, where
P_{n+1} = P_n \cup (P_{n+1} \setminus P_n) \qquad (5)
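A minimal sketch of this incremental update, which queries neighborhoods only at the newly added points; a brute-force radius search stands in for the KD-tree [19], and the radius value is illustrative:

```python
import numpy as np

def process_new_points(P_prev, new_points, radius=0.5):
    """Per Eq. (5), the next rolling window is the previous window plus the
    newly accumulated points, so neighborhood queries (and hence normal
    estimates) are needed only for the new points."""
    P_next = np.vstack([P_prev, new_points])  # P_{n+1} = P_n U (P_{n+1} \ P_n)
    # fixed-radius neighborhoods, queried only at the new points
    d = np.linalg.norm(new_points[:, None, :] - P_next[None, :, :], axis=2)
    neighborhoods = [P_next[row <= radius] for row in d]
    return P_next, neighborhoods
```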
Fig. 2. Point classification: extracted interest points.
B. Experimental Results
The synthetic LIDAR is able to output at a rate of 50 Hz
on a laptop with a Core i7 processor, showing that
it can be used to perform real-time
localization. The localization results are shown in Fig. 6.
Judging from the prior map, the localization result from
our algorithm consistently aligns with our driving path, clearly
tracing a line parallel to the road boundary along the
long stretch of road. Since our algorithm does not rely on
GPS, our estimate still performs well near areas crowded
with tall buildings.
Note that in the experiment, a rough
initial position is given, and hence localization is mostly
concerned with pose tracking. However, the system is able
to cope with small kidnapping problems, e.g. brief data errors
from the LIDAR, since the odometry system is still able to
provide information. Should a large kidnapping occur, e.g.
the vehicle being moved between places without the localization module turned on, a rough initial position may be
provided to speed up the convergence rate.
Fig. 4 shows localization variance vs driving distance.
Fig. 3. Vehicle testbed.
Fig. 4. Localization variance vs. driving distance; (b) position variance.
Fig. 5.
ACKNOWLEDGMENT
This research was supported by the National Research
Foundation (NRF) Singapore through the Singapore-MIT
Alliance for Research and Technology's (FM IRG) research
programme, in addition to the partnership with the Defence
Science Organisation (DSO). We are grateful for their support.
REFERENCES