
The 5th International Conference on Electrical Engineering and Informatics 2015

August 10-11, 2015, Bali, Indonesia

Implementation of Image-Based Autopilot Controller using Command Filtered Backstepping for Fixed Wing Unmanned Aerial Vehicle
Aditya Wildan Farras1, Bambang Riyanto Trilaksono2, and Fadjar Rahino Putra3
Electrical Engineering Department
School of Electrical Engineering and Informatics - Institut Teknologi Bandung
Ganeca Street 10, Bandung, West Java, Indonesia
adityawildanfarras@gmail.com1, briyanto@lskk.ee.itb.ac.id2, frtriputra@gmail.com3

Abstract— A visual based autopilot for a fixed-wing unmanned aerial vehicle (UAV) is one of the most powerful methods to track a target object on sea or ground level using image based visual servoing control. A feature of the target object must always appear on the image plane of the camera, which is achieved by controlling the pan-tilt gimbal movement. The aircraft attitude and the pan-tilt gimbal angles can then be used as a reference to drive the aircraft to track the target autonomously. In this paper, the aircraft attitude and the pan-tilt gimbal movement are controlled using the command filtered backstepping method, which is designed to adapt to the nonlinear aircraft dynamics and the camera gimbal image based visual servoing. Implementation of such a visual based autopilot is successfully conducted to govern the pan-tilt gimbal for pointing the camera direction and to drive the aircraft motion to track the target object.

Keywords—Unmanned Aerial Vehicle (UAV), Control System, Visual Servoing.

I. INTRODUCTION

A surveillance system is needed to protect a wide area from various suspicious objects on sea or ground level. A suspicious object must be identified as a target object of surveillance, whether moving or stationary. A fixed-wing unmanned aerial vehicle (UAV) can perform this task effectively, pursuing a suspicious object on sea or ground level for identification. The implementation of a visual based autopilot controller on the UAV is important to fulfill this task by following the suspicious object continuously.

Because of the motion restrictions of a fixed-wing UAV, a camera gimbal is also needed to keep the camera pointing at the target object so that features of the target object remain available on the image plane of the camera sensor. The following scenario is performed to make the UAV pursue the tracked object. A ground operator selects a suspicious object from a captured camera image on a PC screen and sends the segmented image coordinates to the on-board image processing hardware. This segmented image is then used as an image reference by the on-board image processing hardware to produce target object features from subsequent captured images of the camera sensor. The features may leave the image plane if the camera direction does not point at the target object continuously. Therefore, image based visual servoing control is needed to drive the camera pan-tilt gimbal so that the camera always points at the target object. The attitudes of the UAV and the pan-tilt gimbal are then combined to produce the UAV heading command.

Furthermore, the distance of the target object from the UAV can be calculated using the altitude above ground level. If the distance is still far, the UAV heads to the target object directly. Otherwise, the UAV revolves around the target object on a circle whose radius is calculated from the speed of the UAV itself.

This paper presents an implementation of the visual based autopilot controller using command filtered backstepping for a fixed-wing UAV. Command filtered backstepping is currently prominent for adapting to nonlinear UAV dynamics and the image based visual servoing system in the development of a nonlinear adaptive controller.

The outline of this paper is as follows. Section II presents the hardware configuration of the visual based autopilot, including the data communication employed. Section III describes the software system of the visual based autopilot control system and the embedded image processing system. Section IV covers the integration of image based visual servoing using the camera gimbal and the autopilot steering control. Section V describes the test results and their discussion. The conclusion of this work is drawn in Section VI.

II. VISUAL BASED AUTOPILOT CONTROLLER

A. Hardware Configurations

The visual based autopilot controller is implemented on a Pixhawk PX4 microcontroller board. This microcontroller has fairly high processing speed for performing complex computation and already integrates sensor modules such as a gyro, an accelerometer, GPS, a telemetry modem, etc.

The visual based autopilot controller requires integration with other systems that support the availability of the visual image feature and the testing of the controller's stability performance. The integrated hardware systems include the image processing system, the Hardware In-the-Loop Simulation (HILS) system, and the protocol data communication system.
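The heading-command construction described in the introduction, combining the UAV attitude with the gimbal pan-tilt attitude, can be sketched in a simplified planar form. The paper computes this from the complete UAV and gimbal attitudes; the yaw-plus-pan relation below is an illustrative assumption, not the paper's exact formula.

```python
import math

def heading_command(yaw, gimbal_pan):
    """Planar sketch: point the aircraft heading along the camera's
    line of sight by adding the gimbal pan angle to the aircraft yaw,
    then wrapping the result to (-pi, pi]."""
    chi = yaw + gimbal_pan
    return math.atan2(math.sin(chi), math.cos(chi))
```

For example, an aircraft yawed 170 degrees with the camera panned a further 30 degrees yields a wrapped heading command of -160 degrees.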
978-1-4673-7319-7/15/$31.00 ©2015 IEEE

Figure 1 Pixhawk PX4 microcontroller board

The image processing system converts a selected image target object on the image plane of the camera sensor into a feature point that is then transmitted to the visual based autopilot controller. Afterwards, this feature point is used as a reference to determine the heading of the aircraft and the direction of the camera. The image processing system uses the Tracking, Learning and Detection (TLD) method to extract the target feature on the image plane of the camera sensor. This image processing system is implemented on a Cubieboard2 single board computer.

The HILS system is used to simulate the responses of the aircraft and the camera gimbal while the visual based autopilot controller output is applied to them. The HILS system integrates the visual based autopilot controller system and the image processing system and then visualizes the movement of the aircraft in real time along with the live view of the camera vision.

A protocol data communication system is also required to communicate data between the on-board hardware in the aircraft and the on-ground hardware systems in the base station. Figure 2 shows the diagram of the hardware configuration and its data flow.

Figure 2 Diagram of hardware configuration and its data flow

B. Data Communication System

The data communication system is developed to integrate the visual based autopilot controller with the other hardware systems. The required communication configurations are described as follows:

1) Image processing system communication

The image processing system on the Cubieboard2 single board computer runs the TLD algorithm and produces an extracted feature point. The visual based autopilot controller on the Pixhawk PX4 microcontroller board obtains this extracted feature point over a UART communication interface as shown in Figure 3.

Figure 3 Data communication between Pixhawk PX4 and Cubieboard2

2) HILS system communication

The HILS system on a personal computer calculates the UAV dynamics and the camera gimbal attitude to visualize the responses of the UAV attitude and the live-view camera vision. The visual based autopilot controller on the Pixhawk PX4 microcontroller board produces servo motor PWM signals to drive the UAV aileron, elevator, rudder, and throttle and to govern the pan-tilt mechanism of the camera gimbal. This board also receives the sensor signals of UAV attitude, velocity, position, etc. The servo motor signals and the sensor signals are exchanged with the HILS system over a UART communication interface.

3) Internal communication of the microcontroller

The Pixhawk PX4 microcontroller board runs the real-time NuttX operating system (OS). Several application modules in NuttX OS communicate with each other using the micro Object Request Broker (uORB) data structure. The uORB data structure shares the required data between threads and applications over a bus system. An application can publish data to other applications or request the data. The use of uORB is effective in avoiding data sharing problems.

III. VISUAL BASED AUTOPILOT SOFTWARE SYSTEMS

A. Visual Based Autopilot Main Program

The visual based autopilot main program is implemented on the Pixhawk PX4 microcontroller board as a module of NuttX OS that is synchronized with the image processing system and the HILS system through the protocol data communication system. Task scheduling is performed to generate real-time operation between the systems, with multi-tasking ensuring that each process on the systems is performed at the right time.

Figure 4 shows the block diagram of the visual based autopilot software systems. The main program receives the sensor signals of the aircraft linear/angular velocity, attitude, and altitude from the HILS system, and also the extracted feature
point from the image processing system. This program then calculates an estimated control output magnitude for the commanded aircraft surface control/thrust and the commanded gimbal pan-tilt angles, as described in Table 1.

Table 1 Module of visual based autopilot main program

Module: Visual based autopilot
Input: aircraft linear velocity V; aircraft angular velocity Ω; aircraft attitude (roll, pitch, and yaw); altitude h; feature point p
Output: command value of aircraft surface control u_s; command value of aircraft thrust u_p; command value of camera gimbal pan-tilt u_θ
Function: calculating the command control output that is given to the servo motors of the camera gimbal pan-tilt and the aileron, to stabilize the camera direction and the UAV heading.

Figure 4 Block diagram of visual based autopilot systems

B. Backstepping Control of Camera Gimbal Visual Servoing

The command filtered backstepping method is used to generate control commands for the camera gimbal pan-tilt angles based on the residuals between the current feature position and the expected command position at the center of the image plane. If the current feature position is exactly at the center of the image plane, the camera is definitely pointing at the target object. However, the camera gimbal and aircraft servo motors have limited angular position and rate. The command filtered backstepping uses these limit values to filter the command pan-tilt angle and thereby converge the current feature point to the center position of the image plane. Figure 5 shows the block diagram of command filtered backstepping for camera gimbal visual servoing, where p is the current feature point position, p_c is the position of the target feature point, u_θ is the control output that is provided to the camera gimbal servos, and Γ is the backstepping control output. The nonlinear model that is commonly used in this backstepping design is defined in the form of a strict feedback system as follows:

ẋ = f + g·u    (1)

where x is the plant state vector, u is the plant control input, and f and g are model functions.

Figure 5 Block diagram of command filtered backstepping for camera gimbal visual servoing

The camera gimbal image based visual servoing model is then defined as follows:

ṗ = L_o·u_θ    (2)

where ṗ is the feature position rate, u_θ is the control output of the controller, and L_o is the Jacobian matrix of image based visual servoing.

The design of command filtered backstepping for camera gimbal image based visual servoing has two stages to generate the command backstepping control output for the pan-tilt. The first stage calculates the desired command control output based on the current feature position. The second stage then generates the command filtered backstepping control output Γ based on the residuals between the current control output and the command control output calculated in the first stage. Because the backstepping control output is the derivative of the pan-tilt rate, an integration is needed to obtain the command pan-tilt angles.

C. Backstepping Control of Aircraft Steering

The command filtered backstepping method is also used to generate control commands for the aircraft aileron control surface based on the residuals between the current heading and the expected command heading, which is provided by the calculation from the aircraft and camera attitudes.

Figure 6 shows the block diagram of command filtered backstepping for aircraft steering, where χ is the aircraft heading and δ_a is the aircraft aileron control surface deflection.

The design of command filtered backstepping for aircraft steering has four stages to generate the command backstepping control output for the aileron. The first stage calculates the desired command roll angle of the aircraft based on the current heading. The second stage then generates the command roll rate from the residuals between the current roll angle and the command roll angle calculated in the first stage. The third stage then generates the desired command control output of the aileron based on the residuals between the current roll rate and the command roll rate. The last stage produces the command backstepping control output.

Figure 6 Block diagram of command filtered backstepping for aircraft steering

IV. HARDWARE INTEGRATION

A. Visual Based Autopilot Controller Assembly

Figure 7 Pixhawk PX4 microcontroller installed on Aeromodel Cessna

The Pixhawk PX4 microcontroller board and the Cubieboard2 single board computer can be installed on a lightweight aircraft such as an aeromodel aircraft, as shown in Figure 7. The whole system's weight, including the additional battery, camera, and gimbal, is less than 500 grams. Therefore, the aircraft must be able to handle a minimum payload weight of 500 grams. The camera is attached to the gimbal, and the gimbal is then mounted on the aircraft. The aircraft dynamics for command filtered backstepping are calculated using the parameters of the aircraft geometry such as wingspan, wing chord, etc.

B. HILS Integration

Before the real flight test, the functionality of the visual based autopilot controller must be tested in the HILS system to examine the aircraft flight stabilization. The responses of the aircraft dynamics can be simulated as a replacement of the real aircraft, and therefore the controller gains can be set to appropriate values. Figure 8 shows the HILS integration for testing the Pixhawk PX4 microcontroller and the Cubieboard2 single board computer. The command control output is sent to the PC that calculates the aircraft dynamics and the camera gimbal visual servoing model in order to visualize the aircraft attitude and the live-view camera vision on a projector screen. Sensory data provided by the HILS system are then received by the microcontroller as a replacement for the real on-board sensors.

Figure 8 HILS Integration

C. Aircraft Integration for Real Flight

After the visual based autopilot controller and the image processing system are tested and work properly on HILS, the actual flight test is then conducted in the field.

Figure 9 Aircraft with the implemented visual servoing control system flying in the air

The sensor data, i.e., translational velocity, angular velocity, aircraft attitude, aircraft altitude, feature position in the image plane, and the camera gimbal pan-tilt angles, are recorded by a data logger. The selected image coordinates, in the form of a bounding box, are exchanged between the ground station and the aircraft by data modem along with the aircraft telemetry data. The images captured by the on-board camera are streamed to the on-ground system by a video transmitter.
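The two control loops of Sections III-B and III-C can be sketched together, since both share the same command filter (magnitude and rate saturation of the servo command). The gains, limits, and time step below are illustrative assumptions, not the paper's tuned values; the gimbal stage inverts the image Jacobian L_o of Eq. (2) with a pseudo-inverse, and the steering cascade follows the four stages described above.

```python
import numpy as np

def command_filter(cmd, prev, dt, mag_limit, rate_limit):
    """Apply the servo magnitude and rate limits that give the
    command filter its name (limit values are illustrative)."""
    cmd = np.clip(cmd, -mag_limit, mag_limit)                       # magnitude limit
    step = np.clip(cmd - prev, -rate_limit * dt, rate_limit * dt)   # rate limit
    return prev + step

def gimbal_stage1(p, p_c, L_o, gain=1.0):
    """Stage 1 of the gimbal loop: desired pan-tilt command from the
    feature-point residual, using the pseudo-inverse of L_o."""
    return -gain * np.linalg.pinv(L_o) @ (np.asarray(p) - np.asarray(p_c))

def steering_cascade(chi, chi_cmd, phi, p_rate,
                     k_chi=0.8, k_phi=2.0, k_p=4.0,
                     phi_max=np.radians(30), da_max=np.radians(20)):
    """Four-stage aileron cascade of Section III-C (all gains assumed)."""
    phi_cmd = np.clip(k_chi * (chi_cmd - chi), -phi_max, phi_max)   # stage 1: heading -> roll angle
    p_cmd = k_phi * (phi_cmd - phi)                                 # stage 2: roll angle -> roll rate
    delta_a = k_p * (p_cmd - p_rate)                                # stage 3: roll rate -> aileron
    return float(np.clip(delta_a, -da_max, da_max))                 # stage 4: saturated output
```

In closed loop, the stage-1 gimbal command would be passed through `command_filter` each control step and integrated to obtain the command pan-tilt angles, mirroring the two-stage structure described above.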
V. EXPERIMENT SETUP, RESULTS, AND DISCUSSION

A. Testing Procedure

• Integrate the visual based autopilot controller with the image processing system and the protocol data communication system.
• Run the PX4 Eclipse IDE to compile the program code and upload the program to the Pixhawk PX4 microcontroller.
• Run all systems and observe the implementation results on the Teraterm screen or in the data log.
• For HILS, run the HILS calculation model to animate the aircraft attitude and the live-view camera vision.

B. Results and Discussion

1) Quantitative testing of the visual servoing main computation for camera gimbal and aircraft

The responses of the feature point position and the aircraft position are shown in Figure 10 and Figure 11.

Figure 10 Implementation result of feature point transient response

Figure 11 Implementation result of aircraft motion (aircraft position) transient response

The initial condition of the input image feature point location is set to p_start = [1.350, 1.135] mm. During the transient response, there is a little oscillation in the feature point position. The transient response of the feature point is underdamped: the feature point moves from its initial position toward the destination, overshoots to a negative value, and then returns to the destination. Both simulation and implementation resulted in the same transient response of the image feature point motion. At steady state, the feature point position succeeds in reaching the target feature point position p_end = [0, 0] mm.

For the aircraft position response, the initial position of the aircraft is posUAV_start = [−122.3935, 37.646] deg. In the beginning, the aircraft moves to follow the target. After the distance between the aircraft and the target reaches the loiter radius R_loiter, the aircraft moves around the target.

2) Qualitative testing of the visual servoing main computation for camera gimbal and aircraft on HILS with input from the image processing module

The visual servoing control system succeeds in controlling the camera gimbal so that it always points at the target object, and the aircraft moves to follow and circle the target object. At steady state, the feature point is located at the centroid, i.e., coordinates (0, 0), corresponding to the pixel value (160, 120) at a resolution of (320, 240). In other words, at steady state the feature point is located exactly in the middle of the image plane.
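The centroid correspondence and the settling behavior reported in these tests can be checked with two small helpers. The 2%-of-initial-error band is an assumed interpretation of the settling criterion, since the paper does not state which reference the 2% is measured against.

```python
def pixel_to_centered(u, v, width=320, height=240):
    """Map a pixel coordinate to image-plane coordinates with the origin
    at the frame centroid, matching (160, 120) <-> (0, 0) at 320x240."""
    return (u - width / 2, v - height / 2)

def settling_index(samples, target, tol=0.02):
    """First index after which every sample stays within tol * |initial
    error| of the target (a simple 2%-settling criterion); returns None
    if the response never settles."""
    band = tol * abs(samples[0] - target)
    for i in range(len(samples)):
        if all(abs(s - target) <= band for s in samples[i:]):
            return i
    return None
```

Multiplying the settling index by the controller sample period gives the settling time; over repeated logged runs this yields the kind of 1.125 s to 2.25 s range reported in Section V-B-3.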
Figure 12 Sequence of aircraft and camera gimbal motion response

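The follow-then-loiter behavior observed above, where the aircraft heads straight at the target until the separation shrinks to the loiter radius R_loiter and then circles, can be sketched as below. The coordinated-turn relation R = V²/(g·tan φ_max) and the bank-angle limit are assumptions used for illustration; the paper only states that the radius depends on the UAV speed.

```python
import math

def loiter_radius(speed, bank_limit=math.radians(30), g=9.81):
    """Speed-dependent loiter radius from the coordinated-turn relation
    R = V^2 / (g * tan(phi_max)); the bank limit is an assumed parameter."""
    return speed ** 2 / (g * math.tan(bank_limit))

def guidance_mode(distance, speed):
    """Head directly at the target while far away; switch to loitering
    once the separation reaches the loiter radius."""
    return "direct" if distance > loiter_radius(speed) else "loiter"
```

At a typical small-UAV cruise speed of 15 m/s and a 30-degree bank limit, the sketch gives a loiter radius of roughly 40 m.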
3) Qualitative testing of the visual servoing main computation for the camera gimbal with real camera gimbal servos

Testing is done using both a static object and a dynamic object. The visual servoing succeeds in directing the motion of the camera gimbal so that the camera always points toward the target. At the beginning, a large feature point error is observed. Once the control system starts, this error becomes smaller and finally converges to zero. Even though there is overshoot, at steady state the feature point position succeeds in reaching the desired set point, which is the centroid (exactly in the center of the image). This condition is reached with a settling time of 1.125 s to 2.25 s, according to 20 test runs. The controller therefore succeeds in moving the feature point to the desired set point, and the minimum time needed to reach the set point with an error of less than 2% is 1.125 seconds.

Figure 13 Sequence of camera gimbal motion response (left: image captured by camera; right: motion of camera gimbal servos)

VI. CONCLUSIONS

Based on the results of the design, implementation, and testing, the following conclusions can be drawn. The hardware and software environment was developed and configured to enable effective controller implementation. The data communication system has been implemented on the Pixhawk PX4 control board. Data transmission between the microcontroller, the image processing system, and the HILS system runs well, and the received data matches the transmitted data. The visual based autopilot controller has been developed to control the camera gimbal so that the camera always points toward the target, and also to control the aircraft motion to follow and circle the object.

ACKNOWLEDGMENT

The authors would like to thank BPPT for their contribution in developing the fixed-wing UAV. This work is supported by the School of Electrical Engineering and Computer Science, ITB.
