I. INTRODUCTION
Research in autonomous micro helicopters is advancing rapidly and has led to great progress over the past few years. However, current solutions lack the robustness and flexibility to general conditions that is required in order to leave controlled laboratory environments. As a result, we have yet to see truly autonomous flights in general environments without reliance on unrealistic assumptions such as an uninterrupted GPS signal, a perfect communication link to a ground station for off-board processing/control, or (almost perfect) pose measurements from motion capture systems (e.g. Vicon). Only after these issues are solved can higher-level tasks such as autonomous exploration, swarm operation and large trajectory planning be tackled.
Autonomous flights in unknown environments exclude the use of motion capture systems. GPS is not always reliable due to effects like shadowing or multipath in city-like environments. Therefore, commonly used sensors for pose estimation are stereo and monocular cameras as well as laser scanners. Due to the power, weight and computation
The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7) under grant agreements n. 231855 (sFly) and n. 266470 (myCopter). Stephan Weiss and Markus Achtelik are currently PhD students and Margarita Chli is a senior researcher at ETH Zurich (email: {stephan.weiss, markus.achtelik, margarita.chli}@mavt.ethz.ch). Roland Siegwart is a full professor at ETH Zurich and head of the Autonomous Systems Lab (email: r.siegwart@ieee.org).
Fig. 1: The hexacopter used for the experimental analysis, shown during an autonomous flight with the proposed framework.
$$a = a_m - b_a - n_a \quad (1)$$
$$\dot{b}_a = n_{b_a} \quad (2)$$
B. State Representation
The state of the filter is composed of the position $p_w^i$ of the IMU in the inertial world frame, its velocity $v_w^i$ and its attitude quaternion $q_w^i$ describing a rotation from the inertial to the IMU frame. We also add the gyro and acceleration biases $b_\omega$ and $b_a$ as well as a possible measurement scale factor $\lambda$. The calibration states are the rotation from the IMU frame to the measurement-sensor frame $q_i^s$ and the distance between these two sensors $p_i^s$. Note that the calibration states can be omitted and set to a calibrated constant, making the filter more robust [7]. For completeness we keep them in the filter state. This yields a 24-element state vector $X$:
$$X = \{ p_w^{i\,T} \; v_w^{i\,T} \; q_w^{i\,T} \; b_\omega^T \; b_a^T \; \lambda \; p_i^{s\,T} \; q_i^{s\,T} \}^T \quad (3)$$

The following differential equations govern the state:

$$\dot{p}_w^i = v_w^i \quad (4)$$
$$\dot{v}_w^i = C_{(q_w^i)}^T \, (a_m - b_a - n_a) - g \quad (5)$$
$$\dot{q}_w^i = \tfrac{1}{2}\, \Omega(\omega_m - b_\omega - n_\omega)\, q_w^i \quad (6)$$
$$\dot{b}_\omega = n_{b_\omega} \quad (7)$$
$$\dot{b}_a = n_{b_a}, \qquad \dot{\lambda} = 0, \qquad \dot{p}_i^s = 0, \qquad \dot{q}_i^s = 0 \quad (8)$$

where $C_{(q_w^i)}$ is the rotation matrix corresponding to $q_w^i$, $\Omega(\omega)$ is the quaternion-multiplication matrix of $\omega$ [12], $a_m$ and $\omega_m$ are the measured acceleration and angular velocity, $g$ is the gravity vector in the world frame, and the $n_*$ terms denote zero-mean white Gaussian noise.
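To make the propagation step concrete, the following is a minimal numerical sketch of Eqs. (4)-(8) using simple Euler integration. It is an illustration under stated assumptions, not the authors' implementation: it uses Hamilton-convention quaternions $q = [w, x, y, z]$ (the paper follows the conventions of [12]), a flat dict as the state container, and it omits covariance propagation and the noise terms, which are zero in expectation.

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix corresponding to a unit quaternion q = [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def omega(w):
    """Quaternion-multiplication matrix Omega(w) with q_dot = 0.5 * Omega(w) @ q."""
    wx, wy, wz = w
    return np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0]])

def predict(x, a_m, w_m, dt, g=np.array([0.0, 0.0, 9.81])):
    """One Euler step of Eqs. (4)-(8).
    x: dict with keys p, v, q, b_w, b_a, lam, p_is, q_is (24 states)."""
    C = quat_to_rot(x['q'])                                   # C(q_w^i)
    x['p'] = x['p'] + x['v'] * dt                             # Eq. (4)
    x['v'] = x['v'] + (C.T @ (a_m - x['b_a']) - g) * dt       # Eq. (5), cf. Eq. (1)
    q = x['q'] + 0.5 * omega(w_m - x['b_w']) @ x['q'] * dt    # Eq. (6)
    x['q'] = q / np.linalg.norm(q)                            # renormalize quaternion
    return x  # b_w, b_a, lam, p_is, q_is constant in expectation, Eqs. (7)-(8)
```

In the real system this prediction runs at 1 kHz on the IMU microcontroller (cf. Fig. 3), so efficiency matters more than the simple integration scheme shown here.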
C. Measurement Models
The propagation model discussed above is the core of our EKF framework, making the IMU an indispensable sensor of the system. However, we do not see this as a restriction, since almost all airborne navigation systems can carry an IMU. MEMS IMUs are cheap, small and lightweight, making them applicable to almost any robotic platform. Around this core sensor, we can add sensing modalities to enable robust MAV navigation. In particular, here we analyze a 6DoF pose sensor (e.g. a camera or a Vicon system) and a 3DoF position sensor (e.g. GPS or a laser tracker). Other sensors can also be used, provided that the observability analysis confirms the observability of the used states. Fig. 2 depicts the setup with the coordinate frames, sensors and state variables discussed below.
Fig. 2: Setup depicting the robot body with its sensors w.r.t. a world reference frame. The system's state as described in Section IV-B is $X = \{ p_w^i \; v_w^i \; q_w^i \; b_\omega \; b_a \; \lambda \; p_i^s \; q_i^s \}$, whereas $p_w^s$ and $q_w^s$ denote the robot's sensor measurements of (a possibly scaled) position and attitude, respectively, in a world frame. Depending on the type of sensor, only one of the measurements is available (i.e. GPS or a laser tracker only measures the 3D position $p_w^s$, whereas a camera or Vicon system measures both).
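To illustrate how the two sensor types enter the filter, here is a hedged sketch of plausible measurement functions $h(X)$, consistent with the frame setup of Fig. 2: the sensor position in the world frame, scaled by $\lambda$, and the attitude composed from $q_i^s$ and $q_w^i$. The exact measurement equations are those of Sections IV-C.1 and IV-C.2; `quat_to_rot` is the helper from the propagation sketch above.

```python
import numpy as np

def quat_mult(a, b):
    """Hamilton product a (x) b for quaternions [w, x, y, z]."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw])

def h_pose(x):
    """Expected 6DoF pose measurement (camera, Vicon): scaled position and attitude."""
    C = quat_to_rot(x['q'])
    z_p = (x['p'] + C.T @ x['p_is']) * x['lam']   # (possibly scaled) position p_w^s
    z_q = quat_mult(x['q_is'], x['q'])            # attitude q_w^s
    return z_p, z_q

def h_position(x):
    """Expected 3DoF position measurement (GPS, laser tracker): position only."""
    return (x['p'] + quat_to_rot(x['q']).T @ x['p_is']) * x['lam']
```

The EKF update then linearizes $h$ around the current state to obtain the measurement Jacobian, exactly as in a standard indirect Kalman filter formulation [12].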
[Diagram: distribution of the EKF tasks. State prediction runs at 1 kHz on the FCU's high-level (HL) controller; covariance propagation and covariance update run at 100 Hz on the onboard computer; measurement updates arrive at 1-20 Hz, with time synchronization and state correction between the two processors. Panels a)-c) illustrate the EKF ring buffer: a delayed measurement (e.g. stamped t = 2 or t = 5) updates the corresponding past state, and the predictions from there to the current state are recomputed at once.]

Fig. 3: Distribution of the processing tasks with their execution rates. State prediction, as the most time-critical part for controlling the helicopter, is executed on the IMU microcontroller (HLP) and is guaranteed to run in real time at 1 kHz. This leaves enough time for the more complex parts to be computed on the Atom computer with a non-real-time operating system. Accurate time synchronization is also very important for the whole system to work.
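The ring-buffer mechanism of Fig. 3 can be summarized in a short sketch. This is a hypothetical illustration of the idea, not the authors' code: buffered states carry timestamps, a delayed measurement updates the buffered state closest to its stamp, and the predictions from there to the current time are recomputed at once from logged IMU readings (`imu_log` and its `between()` accessor are assumed helpers).

```python
from collections import deque

class EkfRingBuffer:
    """Sketch of delayed-measurement handling (cf. Fig. 3, panels a-c)."""

    def __init__(self, maxlen=1000):
        self.buf = deque(maxlen=maxlen)   # (timestamp, state) pairs, oldest first

    def push(self, t, state):
        self.buf.append((t, state))

    def apply_delayed(self, t_meas, z, update_fn, predict_fn, imu_log):
        # 1) find the buffered state closest to the measurement timestamp
        i = min(range(len(self.buf)), key=lambda k: abs(self.buf[k][0] - t_meas))
        t, state = self.buf[i]
        # 2) EKF update at the measurement time
        state = update_fn(state, z)
        # 3) re-predict from the update to "now" with the stored IMU samples,
        #    overwriting the stale buffered predictions
        for j, (t_imu, dt, a_m, w_m) in enumerate(imu_log.between(t, self.buf[-1][0])):
            state = predict_fn(state, a_m, w_m, dt)
            if i + 1 + j < len(self.buf):
                self.buf[i + 1 + j] = (t_imu + dt, state)
        return state
```

Because only the state, and not the covariance, is needed for control, the covariance can be propagated and updated at a lower rate on the onboard computer, as Fig. 3 indicates.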
VI. RESULTS

A. Stationary Hovering
First, we show the performance of the framework under hovering conditions. This is important to evaluate since the filter needs motion in order for all the states to become observable (Section IV); observability ensures that these states do not drift. For this experiment, we let the vehicle hover at a height of 1 m for 30 s. We then compute the RMS error between the filter estimate and ground truth, and between ground truth and the desired setpoint. The latter is done in order to show
6 The tools we used can be found at http://www.ros.org/wiki/vicon_bridge
7 A high-resolution version is available at http://www.youtube.com/watch?v=12x53pCkFoI
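For reference, the RMS figures reported here and in the Appendix can be reproduced with a computation along these lines (a sketch assuming equally sampled, time-aligned $N \times 3$ trajectories; per-axis values are obtained by dropping the sum over axes):

```python
import numpy as np

def rms_error(traj_a, traj_b):
    """RMS of the Euclidean distance between two time-aligned N x 3 trajectories,
    e.g. filter estimate vs. ground truth, or setpoint vs. ground truth."""
    e = np.asarray(traj_a) - np.asarray(traj_b)
    return np.sqrt(np.mean(np.sum(e**2, axis=1)))
```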
[Plots: position x [m] (ground truth vs. filter estimate) and error in x [m] (raw measurement vs. filter estimate) over t = 10-30 s during hovering.]
B. Dynamic Flight
[Plots: desired position x, y, z [m], velocities [m/s] and accelerations [m/s²] over t [s], and the resulting 3D trajectory in x, y, z.]

Fig. 8: The trajectory used for the dynamic flight experiments. The top and right plots show the desired position. The middle and bottom plots show the velocities and the accelerations necessary to keep an absolute track speed of 1 m/s.
[Plots: position x [m] (ground truth, filter estimate, setpoint) and error in x [m] over t = 10-30 s during dynamic flight.]
VII. CONCLUSIONS
Driven by the need for truly autonomous and robust navigation of MAVs, this paper presents a system for online inter-sensor calibration and 6DoF pose estimation in an implementation that runs entirely onboard a MAV (with an Atom 1.6 GHz processor). Our results demonstrate the power and modularity of our framework using different sensors in addition to an IMU, while an observability analysis reveals that yaw estimation is possible from 3DoF position measurements only. The distributed EKF architecture we propose allows MAV state prediction at 1 kHz for accurate control, while time delays are compensated such that exact filter updates are ensured and the computational load is optimally distributed across updates. Our thorough quantitative evaluation shows that our approach is highly robust against noise, delay and slow measurement readings.
VIII. ACKNOWLEDGMENTS
The authors would like to thank Simon Lynen for his
help with producing the video and Prof. Daniela Rus and
Prof. Nicholas Roy for hosting Stephan Weiss and Markus
Achtelik in summer 2011 at CSAIL, MIT.
APPENDIX
In the following, detailed results from the experiments are shown. The values denote the RMS error between the filter estimate and ground truth, except for the second vector in the columns for $p_w^i$ and $q_w^i$, which denotes the RMS error between setpoint and ground truth. (w/) and (w/o) denote experiments with and without attitude measurements, respectively (see Section IV-C.1, Section IV-C.2).
A. Hovering
[Tables: RMS errors of $p_w^i$ [m], $q_w^i$ [rad (rpy)], $\lambda$ [%], $p_i^s$ [m] and $q_i^s$ [rad] for the hovering experiments, for measurement-noise levels from p = 0.001 m / q = 0.001 rad (w/ att.) to p = 0.200 m / q = 0.036 rad (w/o att.), and for measurement rates of 20, 10, 5, 2 and 1 Hz.]
B. Dynamic Flight
[Tables: RMS errors of $p_w^i$ [m], $q_w^i$ [rad (rpy)], $\lambda$ [%], $p_i^s$ [m] and $q_i^s$ [rad] for the dynamic flight experiments, for measurement-noise levels from p = 0.001 m / q = 0.001 rad to p = 0.050 m / q = 0.036 rad (each w/ and w/o att.), and for measurement delays of 50, 100, 200 and 500 ms.]
REFERENCES
[1] S. Weiss and R. Siegwart, "Real-time metric state estimation for modular vision-inertial systems," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2011.
[2] J. Kelly and G. S. Sukhatme, "Visual-inertial sensor fusion: Localization, mapping and sensor-to-sensor self-calibration," International Journal of Robotics Research (IJRR), vol. 30, no. 1, pp. 56-79, 2011.
[3] F. Mirzaei and S. Roumeliotis, "A Kalman filter-based algorithm for IMU-camera calibration: Observability analysis and performance evaluation," IEEE Transactions on Robotics and Automation, vol. 24, no. 5, pp. 1143-1156, 2008.
[4] D. Strelow and S. Singh, "Motion estimation from image and inertial measurements," International Journal of Robotics Research (IJRR), vol. 23, no. 12, p. 1157, 2004.
[5] S. Roumeliotis, A. Johnson, and J. Montgomery, "Augmenting inertial navigation with image-based motion estimation," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2002.
[6] A. Mourikis, N. Trawny, S. Roumeliotis, A. Johnson, A. Ansar, and L. Matthies, "Vision-aided inertial navigation for spacecraft entry, descent, and landing," IEEE Transactions on Robotics (T-RO), vol. 25, no. 2, pp. 264-280, 2009.
[7] E. Jones, "Large scale visual navigation and community map building," Ph.D. dissertation, University of California at Los Angeles, 2009.
[8] P. Pinies, T. Lupton, S. Sukkarieh, and J. D. Tardos, "Inertial aiding of inverse depth SLAM using a monocular camera," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2007.
[9] M. Bryson, M. Johnson-Roberson, and S. Sukkarieh, "Airborne smoothing and mapping using vision and inertial sensors," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2009.
[10] T. Ford, J. Neumann, P. Fenton, M. Bobye, and J. Hamilton, "OEM4 inertial: A tightly integrated decentralised inertial/GPS navigation system," NovAtel Inc., Tech. Rep., 2001.
[11] M. W. Achtelik, M. C. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2011.
[12] N. Trawny and S. I. Roumeliotis, "Indirect Kalman filter for 3D attitude estimation," University of Minnesota, Dept. of Computer Science and Engineering, Tech. Rep. 2005-002, 2005.
[13] R. Hermann and A. Krener, "Nonlinear controllability and observability," IEEE Transactions on Automatic Control, vol. 22, no. 5, pp. 728-740, 1977.