Abstract— A visual based autopilot for a fixed-wing unmanned aerial vehicle (UAV) is one of the most powerful methods to track a target object at sea or ground level using image based visual servoing control. A feature of the target object must always appear on the image plane of the camera, which is achieved by controlling the pan-tilt gimbal movement. The aircraft attitude and the pan-tilt gimbal can be used as a reference to drive the aircraft to track the target autonomously. In this paper, the aircraft attitude and the pan-tilt gimbal movement are controlled using the command filtered backstepping method, which is designed to adapt the nonlinear aircraft dynamics and the camera gimbal image based visual servoing. Implementation of such a visual based autopilot is successfully conducted to govern the pan-tilt gimbal for pointing the camera direction and to drive the aircraft motion to track the target object.

Keywords—Unmanned Aerial Vehicle (UAV), Control System, and Visual Servoing.

I. INTRODUCTION
A surveillance system is needed to protect a wide area from various suspicious objects at sea or ground level. Such a suspicious object must be identified as a target object of surveillance, whether moving or stationary. A fixed-wing unmanned aerial vehicle (UAV) can perform this task effectively by pursuing a suspicious object at sea or ground level for identification. The implementation of a visual based autopilot controller on the UAV is important to fulfill this task in order to follow the suspicious object continuously.

Because of the movement restrictions of a fixed-wing UAV, a camera gimbal is also needed to keep the camera direction always pointing to the target object, so that features of the target object are always available on the image plane of the camera sensor. The following scenario is performed to make the UAV pursue the tracked object. A ground operator selects a suspicious object from a captured camera image on a PC screen and sends the segmented image coordinates to the on-board image processing hardware. This segmented image is then used as an image reference by the on-board image processing hardware to produce target object features from subsequent captured images of the camera sensor. The features may leave the image plane if the camera direction does not point to the target object continuously. Therefore, an image based visual servoing control is needed to drive a camera pan-tilt gimbal so that it always points the camera toward the target object. Hereafter, the attitudes of the UAV and the pan-tilt gimbal are calculated to produce the UAV heading command.

Furthermore, the distance of the target object from the UAV itself can be calculated using the ground level altitude. If the distance is still far, the UAV heads to the target object directly. Otherwise, the UAV revolves around the target object with a circle radius that can be calculated depending on the speed of the UAV itself.

This paper presents an implementation of a visual based autopilot controller using command filtered backstepping for a fixed-wing UAV. The command filtered backstepping method is currently prominent for adapting the nonlinear UAV dynamics and the image based visual servoing system in the development of a nonlinear adaptive controller.

The outline of this paper is as follows. Section II presents the hardware configuration of the visual based autopilot, including the data communication employed. Section III describes the software system of the visual based autopilot control system and the image processing embedded system. Section IV contains the integration of image based visual servoing using the camera gimbal and the autopilot driving control. Section V describes the test results and their discussion. The conclusion of this work is drawn in Section VI.

II. VISUAL BASED AUTOPILOT CONTROLLER

A. Hardware Configurations
The visual based autopilot controller is implemented on a Pixhawk PX4 microcontroller board. This microcontroller has a fairly high processing speed to perform complex computation and already has sensor modules such as a gyro, accelerometer, GPS, telemetry modem, etc.

The visual based autopilot controller requires integration with other systems that support the availability of the visual image feature and the testing of the stability performance of the controller. The integrated hardware systems include the image processing system, the Hardware In-the-Loop Simulation (HILS) system, and the protocol data communication system.
Figure 1 Pixhawk PX4 microcontroller board

The image processing system converts a selected image target object on the image plane of the camera sensor into a feature point that is then transmitted to the visual based autopilot controller. Afterwards, this feature point is used as a reference to determine the heading of the aircraft and the direction of the camera. The image processing system uses the Tracking, Learning and Detection (TLD) method to extract the target feature on the image plane of the camera sensor. This image processing system is implemented on a Cubieboard2 single board computer.

A HILS system is used to simulate the responses of the aircraft and the camera gimbal while the visual based autopilot controller output is applied to them. The HILS system integrates the visual based autopilot controller system and the image processing system and then visualizes the movement of the aircraft in real time along with the live view of the camera vision.

A protocol data communication system is also required to communicate data between the on-board hardware in the aircraft and the on-ground hardware systems in the base station. Figure 2 shows the diagram of the hardware configuration and its data flow.

The required communication configurations between these systems are described as follows:

1) Image processing system communication
The image processing system on the Cubieboard2 single board computer runs the TLD algorithm and produces an extracted feature point. The visual based autopilot controller on the Pixhawk PX4 microcontroller board obtains this extracted feature point through a UART communication interface, as shown in Figure 3.

Figure 3 Data Communication between Pixhawk PX4 and Cubieboard2

2) HILS system communication
The HILS system on a personal computer calculates the UAV dynamics and the camera gimbal attitude to visualize the responses of the UAV attitude and the live-view camera vision. The visual based autopilot controller on the Pixhawk PX4 microcontroller board produces PWM servo motor signals to drive the UAV aileron, elevator, rudder and throttle and to govern the pan-tilt mechanism of the camera gimbal. The board also receives the sensor signals of the UAV attitude, velocity, position, etc. The servo motor signals and the sensor signals are exchanged with the HILS system through a UART communication interface.
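The feature-point exchange over UART described in 1) can be sketched as follows. The frame layout here (a sync byte, two little-endian 16-bit pixel coordinates, and an 8-bit checksum) is a hypothetical example for illustration; the paper does not specify its actual packet format.

```python
# Hypothetical sketch of the Cubieboard2 -> Pixhawk feature-point link.
# Frame: 1 sync byte + two little-endian int16 pixel coordinates + checksum.
import struct

SYNC = 0xFE  # assumed frame start marker

def pack_feature_point(u_px: int, v_px: int) -> bytes:
    """Encode one extracted feature point for transmission over UART."""
    payload = struct.pack("<hh", u_px, v_px)
    checksum = (SYNC + sum(payload)) & 0xFF
    return bytes([SYNC]) + payload + bytes([checksum])

def unpack_feature_point(frame: bytes):
    """Decode a frame; return (u, v) or None if the frame is invalid."""
    if len(frame) != 6 or frame[0] != SYNC:
        return None
    if (frame[0] + sum(frame[1:5])) & 0xFF != frame[5]:
        return None
    return struct.unpack("<hh", frame[1:5])
```

A checksum of this kind lets the autopilot discard frames corrupted on the serial line instead of steering on bad coordinates.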
… point from the image processing system. This program then calculates an estimated control output magnitude of the commanded aircraft surface control/thrust, and the commanded gimbal pan-tilt angles as described in Table 1.

Table 1 Module of visual based autopilot main program

… with x the plant state variables, u the plant control inputs, and f and g the model functions.
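The sentence defining x, u, f and g above refers to the plant model used for the backstepping design; the equation itself was lost in extraction, but in command filtered backstepping designs it conventionally takes the control-affine form (a reconstruction, assumed rather than quoted from the paper):

```latex
\dot{x} = f(x) + g(x)\,u
```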
… the first stage. The third stage then generates the desired aileron command control output based on the residuals of the current roll rate and the command roll rate. The last stage produces the command backstepping control output.

… Figure 7. The whole system weight, including the additional battery, camera and gimbal, is less than 500 grams. Therefore, the aircraft must be able to handle a minimum payload weight of 500 grams. The camera is attached to the gimbal, and the gimbal is then mounted on the aircraft. The aircraft dynamics for command filtered backstepping are calculated using the parameters of the aircraft geometry such as wingspan, wing chord, etc.
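The command filtering at the heart of the staged design above can be illustrated with a second-order filter that smooths and rate-limits each stage's raw command while also supplying the command derivative the next stage needs. The natural frequency, damping and rate limit below are assumed values, not the paper's.

```python
# Illustrative second-order command filter as used in command filtered
# backstepping: it turns a raw command into a smooth, rate-limited command
# plus its derivative. Gains and limits are assumed, not from the paper.

def command_filter_step(xc, xc_dot, raw_cmd, dt, wn=10.0, zeta=0.9, rate_limit=2.0):
    """One Euler step; returns (filtered_cmd, filtered_cmd_rate)."""
    # Second-order dynamics: xc_ddot = wn^2 (raw - xc) - 2 zeta wn xc_dot
    xc_ddot = wn * wn * (raw_cmd - xc) - 2.0 * zeta * wn * xc_dot
    xc_dot = xc_dot + xc_ddot * dt
    # Rate limiting keeps the commanded derivative physically achievable.
    xc_dot = max(-rate_limit, min(rate_limit, xc_dot))
    xc = xc + xc_dot * dt
    return xc, xc_dot
```

Iterating this step with a constant raw command drives the filtered command to that value without exceeding the rate limit.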
B. HILS Integration
Before the real flight test, the functionality of the visual based autopilot controller must be tested in the HILS system to examine the aircraft flight stabilization. The responses of the aircraft dynamics can be simulated as a replacement for the real aircraft, and therefore the controller gains can be set to appropriate values. Figure 8 shows the HILS integration for testing the Pixhawk PX4 microcontroller and the Cubieboard2 single board computer. The command control output is sent to the PC that calculates the aircraft dynamics and the camera gimbal visual servoing model in order to visualize the aircraft attitude and the live-view camera vision on a projector screen. Sensory data provided by the HILS system are then received by the microcontroller as a replacement for the real on-board sensors.
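One iteration of the closed HILS loop just described can be sketched as follows; the class and method names are illustrative placeholders, not the paper's software interfaces.

```python
# Illustrative single iteration of the HILS loop: the simulation PC supplies
# sensor data, the controller computes actuator commands, and those commands
# drive the simulated aircraft and gimbal. All names are hypothetical.

def hils_step(controller, simulator, dt):
    """Run one exchange between the autopilot controller and the simulator."""
    sensors = simulator.read_sensors()          # attitude, velocity, position
    commands = controller.update(sensors, dt)   # aileron, elevator, rudder,
                                                # throttle, pan, tilt (PWM)
    simulator.apply_commands(commands, dt)      # advance the simulated dynamics
    return commands
```

Because the controller sees only sensor data and emits only actuator commands, the same code path runs unchanged against the real sensors and servos in flight.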
Figure 9 Aircraft with the implemented visual servoing control system flying in the air
V. EXPERIMENT SETUP, RESULTS, AND DISCUSSION

A. Testing Procedure
• Integrate the visual based autopilot controller with the image processing system and the protocol data communication system.
• Run the PX4 Eclipse IDE to compile the program code and upload the program to the Pixhawk PX4 microcontroller.
• Run all systems and observe the implementation results on the Teraterm screen or in the data log.
• For HILS, run the HILS calculation model to animate the aircraft attitude and the live-view camera vision.

B. Results and Discussion
1) Quantitative testing of visual servoing main computation for camera gimbal and aircraft
Responses of the feature point position and the aircraft position are shown in Figure 10 and Figure 11. The simulation and the implementation resulted in the same transient response of the image feature point position. At steady state, the feature point position succeeds in reaching the target feature point position of [0, 0] mm.
For the response of the aircraft position, the initial position of the aircraft is [−122.3935, …] deg. In the beginning, the aircraft moves to follow the target. After the distance between the aircraft and the target reaches the loiter radius, the aircraft moves around the target.
2) Qualitative testing of visual servoing main computation for camera gimbal and aircraft on HILS with input from image processing module
The visual servoing control system succeeds in controlling the camera gimbal to always point toward the target object, while the aircraft moves to follow and circle the target object. At steady state, the feature point is located at the centroid, i.e. coordinates (0, 0), which corresponds to the pixel value (160, 120) at a resolution of (320, 240). In other words, at steady state the feature point lies exactly in the middle of the image plane.
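The coordinate convention above, and the 2% settling-time criterion applied in the gimbal tests that follow, can be made concrete with two small helpers (a sketch; the function names are ours, not the paper's):

```python
# Pixel-to-centroid mapping described above: pixel (160, 120) in a
# 320 x 240 image corresponds to the image-plane centroid (0, 0).
def pixel_to_centered(u_px, v_px, width=320, height=240):
    """Convert pixel coordinates to image-plane coordinates centred at (0, 0)."""
    return u_px - width / 2.0, v_px - height / 2.0

# 2 % settling-time criterion: the first time after which the response
# stays within 2 % of its final value.
def settling_time(t, y, band=0.02):
    """Return the settling time of response y sampled at times t."""
    y_final = y[-1]
    tol = band * abs(y_final) if y_final != 0 else band
    for i in range(len(y)):
        # Settled once every later sample stays inside the tolerance band.
        if all(abs(v - y_final) <= tol for v in y[i:]):
            return t[i]
    return t[-1]
```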
3) Qualitative testing of visual servoing main computation for camera gimbal at real camera gimbal servos
Testing is done using both a static object and a dynamic object. The visual servoing succeeds in directing the motion of the camera gimbal to always point the camera toward the target. At the beginning, a large feature point error is observed. Once the control system starts, this error becomes smaller and finally converges to zero. Even though there is overshoot, at steady state the feature point position succeeds in reaching the desired set point, which is the centroid (exactly in the center of the image). This condition is reached with a settling time of 1.125 s to 2.25 s over 20 test runs. Therefore, the controller succeeds in moving the feature point to the desired set point, and the time needed to reach the set point with an error of less than 2% is 1.125 seconds.

ACKNOWLEDGMENT
The authors would like to thank BPPT for their contribution in developing the fixed-wing UAV. This work is supported by the School of Electrical Engineering and Computer Science, ITB.
Figure 13 Sequence of camera gimbal motion response (left: image captured by camera, right: motion of camera gimbal servos)

VI. CONCLUSIONS
Based on the results of the design, implementation, and testing that have been carried out, the following conclusions can be drawn. The hardware and software environment was developed and configured to enable effective controller implementation. The data communication system has been implemented on the Pixhawk PX4 control board. Data transmission between the microcontroller, the image processing system, and the HILS system runs well, and the received data match the transmitted data. The visual based autopilot controller has been developed to control the camera gimbal to always point the camera toward the target and also to control the aircraft motion to always follow and move around the object.