
Decision Process of Autonomous Drones for Environmental Monitoring
Ömür Yıldırım, Department of Electronics and Communication Engineering, Yildiz Technical University, Istanbul, Turkey. Email: omur.yildirim@tum.de
Klaus Diepold, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany. Email: kldi@tum.de
Revna Acar Vural, Department of Electronics and Communication Engineering, Yildiz Technical University, Istanbul, Turkey. Email: racar@yildiz.edu.tr

Abstract—Environmental monitoring has a key role in reducing the human effect on nature and wildlife. Due to intense tracking and observation tasks, environmental monitoring is an expensive solution. Autonomous observation is an open discussion for raising the efficiency of environmental monitoring and reducing the cost of operation. Unmanned aerial vehicles (UAVs) are possible candidates with their proven success in monitoring and tracking. Thus, we offer a decision process for the autonomy of drones in monitoring tasks. Our system is capable of flying autonomously, e.g., covering a given area, and of performing certain tasks, e.g., identifying bottles. Our simulation results show that autonomous drones can be used for a large variety of environmental monitoring tasks.

Index Terms—target detection, environmental monitoring, UAV, decision making, autonomous navigation.

I. INTRODUCTION

Environmental monitoring is described as persistent systematic studies in which the environmental conditions are unveiled with the purpose of appraising the progress made to accomplish given environmental targets, and also of facilitating the detection of new environmental circumstances. It is used to monitor and manage natural resources, track species at risk and save protected areas. Autonomy for environmental monitoring has been researched and developed intensively in recent years [1–3]. However, robust environmental monitoring, especially with an Unmanned Aerial Vehicle (UAV), remains a challenging task due to high-level observation, periodical or continuous tracking, tracking several data sources simultaneously, large field sizes and access to uncivilized areas. In this article, we present an autonomous system design to resolve these challenges. We structured the components of our system into layers to support modular functionality. We use a static path planning approach to determine the mission's road-map. To provide safe navigation through outdoor fields, we use both a laser sensor and the global positioning system (GPS). Navigation along the planned path is supported by a module that detects and clusters target object(s) in the environment. To support the safety of the UAV and the environment, and to ensure the continuity of the mission, we use a module that tracks several sensors and data sources to detect emergencies. The monitoring capability of the system is validated in simulation experiments using a rotary-wing aircraft.

Many autonomous monitoring approaches in the literature have achieved promising tracking and observation performance. Applications for autonomous environmental monitoring problems gather around two approaches [1]. The first approach is a network of sensors, where each sensor collects measurements and exchanges them with the others in order to track the environment and estimate the location of an event. This localization approach requires the use of reference points (known locations), and thus expensive and time-consuming deployment and standardization. Furthermore, this strategy limits the coverage area. The second approach is the use of vehicle(s) equipped with appropriate sensor(s) that move towards the defined area. The use of robotic systems [2, 4, 5] supports a more dynamic and flexible system which is suitable for localization. Yet, in wild or urban environments it is very challenging to perform consistent operations with robotic systems due to many crucial technical issues such as planning, navigation and endurance.

However, to date, the persistent performance of robotic systems operating in an urban environment is very challenging, even with a low degree of human intervention, since a stable broadband radio link cannot be guaranteed in such environments. The limited availability of computing resources and low-weight sensors operating in harsh environments for mobile systems poses a great challenge in achieving autonomy.

Our research goal is to develop a decision process for a UAV which is capable of accomplishing a variety of monitoring missions without human intervention. Autonomy for environmental monitoring missions places requirements on the operating system. Differences between missions demand that the system be adaptable in terms of processing data sources and planning ability. The UAV should be able to operate in uncontrolled outdoor environments, such as forests or urban areas. Therefore, the system has to ensure powerful control of the UAV, since safe conditions cannot be guaranteed. In respect to these demands, the solution for fully autonomous environmental monitoring is to plan and control the mission with on-board resources.

As a first step, we have developed a modular framework for autonomous UAVs in monitoring missions. The constructed framework supports the asynchronous development of modules. It enables static path planning, robust flight and navigation in urban and wild outdoor environments, and serves as a platform for high-level target monitoring, such as the perception of bottles, and a safe conduct of the mission. We implemented algorithms for the modules and tested our framework in a simulation. Our implementations for UAV control were developed for a quad-rotor, both because we have a quad-rotor platform for future real-world experiments and because it is relatively easier to control than a fixed-wing UAV.

978-1-7281-1862-8/19/$31.00 © 2019 IEEE
Recently, similar studies have been released that execute various monitoring missions. A high proportion of studies have concentrated on high-altitude imagery with an autonomous aerial vehicle to map environments, such as detecting black poplar response to drought with thermal imaging [6] or monitoring soil erosion [7]. Disaster monitoring and management has been another popular topic for autonomous UAV applications. These operations surveyed damaged areas; for example, after Hurricane Wilma in Florida, researchers used unmanned sea-surface and micro aerial vehicles to detect damages [8], and in [9] researchers achieved management and monitoring of infrastructure development after a disaster. Because these applications need to collect and process big-sized data, they need large computational power which is not available on-board. Researchers propose to send the collected data to a base station and process it after the monitoring is finished.

There are two main differences between our work and previous works. First, we investigate small individual targets rather than processing an aerial view of a group of targets. This requires low-altitude flights with target-oriented maneuvers and enables target-specific data gathering. Second, we process real-time data with on-board sensors and processors. This brings a reaction capability for complete autonomy.

II. ENVIRONMENTAL MONITORING FRAMEWORK

Maintaining an environmental monitoring process autonomously is an intense task which demands a powerful and robust decision-making architecture. Further, an autonomous vehicle must have metrics for awareness because it requires self-control according to the environment. In regard to awareness, an autonomous vehicle should be able to track various data, such as surroundings, position, targets, events in the environment and its own condition. Yet, a universal solution for environmental monitoring is not realistic, because the awareness metrics for a certain monitoring task may not be sufficient for another monitoring task, e.g., the difference between monitoring trees and monitoring deer tracks.

In this work, we tested our proposed autonomous monitoring system in a scenario where the task is identifying left-over beer bottles inside a university campus. To be able to perform a monitoring task and to design a system adaptable to different environmental monitoring tasks, we divided our system into modules, each of which is responsible for a certain part of the mission and runs a specific algorithm. In this work, we will focus on the decision-making modules of our system.

Fig. 1: Framework of the proposed system.

A. Autonomous System for Environmental Monitoring

An autonomous system for environmental monitoring is constructed upon two intercommunicating layers, cognition and vision. Each layer consists of a modular structure. The vision layer receives raw camera data as input and publishes the estimation of a possible target and identified target information as output. In our simulation, a probability-based estimator is used to imitate the vision layer. Outputs of the vision layer feed the cognition layer, which is responsible for decision-making and taking real-time actions. The cognition layer should control the UAV with a chain of actions depending on the output of the vision layer and should track events in the environment, e.g., collision.

To find an effective and efficient solution, monitoring tasks can be divided into levels. In the first level of monitoring, the observer should be certain not to overlook any part of the field. This level is often called path planning and path tracking. In the next level of monitoring, the definition of the environment and the recognition of targets should be done. Recognition is subjective in terms of inspection quality, e.g., image observation may be affected by the camera angle or daylight. In the last level of monitoring, the system should track its own condition and the environment to avoid potential harm and to ensure the continuity of the mission. In respect to these levels of monitoring, our system for autonomous monitoring is designed to act in three decision states: path tracking, target acquisition and emergency. To handle each decision state, the cognition layer is separated into three modules: a coverage module, an investigation module and a safety module. While each module has separate action and decision logic, each module affects the actions and decisions of the others, as in Figure 1.
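As an illustration, the three decision states and the priority among the three modules can be pictured as a simple dispatch loop. This is a minimal sketch, not the paper's implementation; all class and method names are ours.

```python
from enum import Enum, auto

class DecisionState(Enum):
    PATH_TRACKING = auto()
    TARGET_ACQUISITION = auto()
    EMERGENCY = auto()

class CognitionLayer:
    """Dispatches each control cycle to the coverage, investigation
    or safety module, depending on vision output and sensor events."""

    def __init__(self, coverage, investigation, safety):
        self.coverage = coverage            # follows the planned road-map
        self.investigation = investigation  # approaches and identifies targets
        self.safety = safety                # rule-based emergency tracking
        self.state = DecisionState.PATH_TRACKING

    def step(self, vision_output, sensor_data):
        # Safety has priority: any violated rule switches to EMERGENCY.
        if self.safety.violated_rule(sensor_data):
            self.state = DecisionState.EMERGENCY
            return self.safety.act(sensor_data)
        # A possible target reported by the vision layer triggers inspection.
        if vision_output.get("possible_target") is not None:
            self.state = DecisionState.TARGET_ACQUISITION
            return self.investigation.act(vision_output)
        # Otherwise keep sweeping the planned coverage path.
        self.state = DecisionState.PATH_TRACKING
        return self.coverage.act(sensor_data)

class Stub:
    """Placeholder module; a real module would command the UAV."""
    def __init__(self, name):
        self.name = name
    def act(self, _data):
        return self.name
    def violated_rule(self, sensor_data):
        return sensor_data.get("collision", False)
```

The key design point mirrored here is that the safety module is checked before the other two, so an emergency always preempts coverage and investigation.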
B. Complete Coverage Path Planning

Complete coverage path planning produces a road-map that ensures the sweeping of the free space of the defined area. Path planning approaches can be characterized by their requirements. As in [10], some algorithms require initial knowledge of the environment that describes the blocked and free space. As in [11], other algorithms try to resolve an unknown environment with a dynamic approach. For dynamic approaches, it is not possible to guarantee a criterion for optimal path(s). Still, during coverage of unknown environments, avoiding repeated coverage is important.

Another separation between path planning algorithms is the dimension of the paths. There are robotic applications, such as cleaning robots, autonomous underwater covering vehicles, painter robots, lawn mowers and agricultural crop harvesting equipment, which require a special type of path planning in a two-dimensional (2D) environment. Planning a three-dimensional (3D) path can be impractical for such applications, since their task is sweeping a surface; in 3D planning, the task is usually travelling. In this work, we implement a static 2D approach for path planning. As in [10], we use an algorithm based on vertical cell decomposition.

Path planning within 2D space can be achieved by the construction of a graph G with a set of vertices V and edges E. The graph ensures complete coverage of the field in consideration of the target detection specifications. In section III we explain the specifications of our implementation. Vertical cell decomposition algorithms construct a finite data structure that completely encodes a solution for a path within a given area with obstacles. The algorithms divide the free space C_free into a finite set of regions often called cells. Then the adjacency information of the cells is extracted to build the graph from start to end points. Finally, based on the plane-sweep (line-sweep) principle, each cell is decomposed into lines.

In this work, we extend the vertical decomposition algorithm to solve position issues where two points along C_obstacle lie on a vertical line that slices through C_free and one of the edges of C_obstacle is vertical. To achieve this, we use vertical lines as a collection of interest points rather than single vertex information.

To be able to track the total coverage as in Figure 2, the UAV first calculates the total target detection area A_detection. It is a complete circle as in equation (1):

  A_detection = π tan²(α) r²_detection ε_altitude   (1)

where r_detection is the target detection range, α is the angle of view and ε_altitude is the error rate of the altitude sensor. Afterwards, the UAV calculates the complete coverage C_coverage as in equation (2):

  C_coverage = π tan²(α) r²_detection ε_altitude + Δt V ε_acclrm   (2)

where Δt is the travel time, V is the velocity of the vehicle and ε_acclrm is the error rate of the accelerometer. Movement during the inspection is not included in the complete coverage calculation, because the UAV can return to a position that is already covered.

Fig. 2: Calculation of complete coverage in simulation; the area highlighted in green represents the covered area.

Algorithm 1 Extended vertical cellular decomposition
Input: vertices V
Output: cells C, graph G, decomposition of cells DC
  V_lines ← calculate_vertical_lines(V)
  L_adaptive ← []
  for line in V_lines do
    L_adaptive ← find_adaptive_lines(line, V)
  end for
  V_lines ← merge_lines(V_lines, L_adaptive)
  for line in V_lines do
    C ← find_cell(line, V)
  end for
  G ← find_graph(C)
  DC ← decompose(C)

Algorithm 1 describes the method: it searches for vertical lines, creates cells by separating the spaces between vertical lines, searches for a possible graph along the cells and plans a path according to the graph.

C. Autonomous Inspection

Inspection tasks demand high-level decision processes and intensive labor. Approaches can use machines, tools, human labor and/or other suitable means for inspecting objects. One of the issues is that inspection systems need to process the continuous or periodic data which is needed to detect and monitor the condition of objects. Furthermore, inspection tasks can be in fields that are difficult to reach. Autonomous approaches are advantageous compared to manual approaches in respect to the demands of inspection tasks. In [5], an autonomous drone was used for a search and rescue task. The authors used a stereo camera to localize the drone and to detect and inspect targets. The drone flew autonomously until it detected a target or gathered an emergency exception. In another study [4], an autonomous inspection system was proposed to detect emergencies. The authors created a multi-UAV application to provide real-time data from ongoing fires to officers.

The expected output of the autonomous inspection is reliable information about the detected target. Depending on the application, the conditions for better data quality differ; for example, for identifying bark beetle damage on trees [2], the vehicle required high-altitude flight, while for chemical source localization in a small controlled aquatic environment [12], the vehicle required a direct approach to the source. In our work, we need to identify beer bottles. We used a policy-based approach to get a better perspective on the detected bottle, as in Figure 3. The decision for a certain policy is taken in consideration of the target position and the visual identifier feedback. As explained in section III, the controller ensures the optimum distance between the detected bottle and the UAV. If identification of the bottle is not achieved from the optimum distance to the target, the UAV takes a circular route around the target with the optimum detection distance as diameter.

Fig. 3: UAV and targets in the simulation of the TUM map. (a) The target detection circle is blue during target search. (b) The target detection circle becomes green once a target is detected.

D. Emergencies and Safety

The goal of safety is avoiding or reducing possible harm to the UAV and the environment. There are many flight regulations and safety regulations for autonomous vehicles determined by governments. Further, to protect the equipment and maintain the mission, detection of emergencies is required. To identify emergencies we use a set of rules. Each rule depends on certain sensor measurements being sufficient with respect to a threshold. In general, the rules are produced from the collection of previous experiences, standards of autonomous flights and the health merits of the mechanical parts of the UAV.

We track and detect collision, the power needed to finish the task, damage to mechanics (e.g., a broken propeller), loss of localization, optimal daylight to process camera input and adverse weather conditions. To detect emergencies, we use several data sources: Laser Imaging, Detection And Ranging (LIDAR) for collision calculations, forecast tracking to detect potential instabilities for flight such as strong winds or rainstorms, and a photo-diode to detect the quality of daylight. To ensure safety, we apply two policies to resolve a safety violation. If an emergency is a collision, the UAV tries to avoid the collision and continue the task. In case of other emergencies, the UAV breaks off the monitoring task and tries to reach a safe position (e.g., return to the base station).

III. EXPERIMENTAL SETUP

A. Implementation

We implemented our simulation based on certain target features. Depending on these features, we determined certain parameters for target detection. These parameters are used by the algorithms for path planning and inspection. In this section, details of the target and the algorithms are provided.

1) Target details: In our mission, we have tried to collect information on beer bottles inside the Technische Universität München (TUM) campus. Beer bottles differ in size, but roughly they have 0.25 meter height and 0.07 meter diameter.

2) Target detection details: During complete coverage, the UAV has an aerial view of the field. To be able to detect a target, in respect to our target size, the optimum distance from the target has been set to 2.5 meter. To be able to identify the bottle type and brand, we have set 0.75 meter as the optimum identification range.

3) Path planning details: For our path planning algorithm we have used an algorithm based on the vertical cell decomposition principle. For decomposing the constructed cells C into road-maps DC, we have used back-and-forth motion lines. The distance between motion lines is set to the optimum target detection range, which is also the altitude of the UAV during complete coverage. To solve the position issues of obstacles, we have used the adaptive line concept. Adaptive lines are used in the calculation of cells to divide spaces accordingly.

4) Inspection details: After a target detection during complete coverage, the UAV tries to increase the accuracy of the detection. For successful identification, the UAV approaches the target to the optimum identification range. If the identification accuracy is under the threshold, the UAV performs a circular movement around the target with the optimum identification range as diameter.

We calculated the target detection precision P_target during the simulation with a Gaussian distribution as in equation (3):

  P(x) = (100 / (σ√(2π))) e^(−(x−µ)² / (2σ²))   (3)

where µ is 0.75 meter, x is the parametric distance between the UAV and the target, and σ is 1 meter. Further, we applied a random function using Python's random.uniform function, which returns a value between 0 and 100. This random number is compared with the complement of the target detection precision with respect to 100.

Algorithm 2 describes the method, which calculates whether a target is detected or not. The calculation of the detection precision is based on the Gaussian probability of the target distance. The complement of the detection precision is compared with a randomly generated number between 0 and 100. If the random number is bigger than the complement of the detection precision, the target is marked as detected.
Algorithm 2 Determination of target detection
Input: target precision P_target
Output: detection D
  possibility ← random.uniform(0, 100)
  compP ← 100 − P_target
  if possibility > compP then
    D ← True
  else
    D ← False
  end if
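The probabilistic detection test of equation (3) and Algorithm 2, together with the coverage bookkeeping of equations (1) and (2), can be sketched in Python as follows. This is a minimal sketch under the stated parameter values; function names are ours, not from the original implementation.

```python
import math
import random

def detection_precision(x, mu=0.75, sigma=1.0):
    """Equation (3): Gaussian detection precision (in percent) as a
    function of the UAV-target distance x, with mu = 0.75 m, sigma = 1 m."""
    return (100.0 / (sigma * math.sqrt(2.0 * math.pi))) * \
        math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def is_detected(p_target, rng=random):
    """Algorithm 2: draw uniform(0, 100) and compare it against the
    complement of the detection precision with respect to 100."""
    possibility = rng.uniform(0.0, 100.0)
    return possibility > (100.0 - p_target)

def detection_area(r_detection, alpha, eps_altitude):
    """Equation (1): total target detection area, a complete circle."""
    return math.pi * math.tan(alpha) ** 2 * r_detection ** 2 * eps_altitude

def complete_coverage(r_detection, alpha, eps_altitude, dt, v, eps_acclrm):
    """Equation (2): detection area plus the travel term dt * V * eps_acclrm."""
    return detection_area(r_detection, alpha, eps_altitude) + dt * v * eps_acclrm
```

Note that at the optimum identification range x = µ = 0.75 m, equation (3) reaches its maximum of 100/√(2π) ≈ 39.9, so each individual check at that range succeeds with roughly 40% probability; repeated checks during the circular movement raise the overall chance of detection.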

B. Simulation

Simulations on two maps were conducted to show and compare the effectiveness of the monitoring system in simple and complex areas. Table I shows the results for the small map and Table II shows the results for the big map (TUM campus). In both scenarios, we considered a random placement of 10 bottles (0.25 meter height, 0.07 meter diameter) which stand vertically. Further, we placed a quad-rotor rotary-wing UAV with a 4500 milliampere-hour battery and a LIDAR sensor with 10 meter sensitivity. We assumed the limit speed of the UAV as 10 meter/second and a constant acceleration of 5 meter/second².

For the small map, we placed a single obstacle at the center of the field, as in Figure 4. The big map had 4 obstacles which represent buildings of the TUM campus, as in Figures 2 and 5.

Fig. 5: TUM campus from aerial view.

TABLE I
Results of Simple Obstacle Map Simulation

                          1st    2nd
Detected target count       9     10
Labeled target count        9     10
Location precision        97%    91%
Travel distance (meter)  1849   1822
Mission time (minutes)   7.20   7.31
Battery                 79.9%  78.3%
Emergency interruption?    no     no
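As a check on the aggregate figures reported in section IV, the travel-distance-per-battery metric can be recomputed directly from the Table I values; this is our own arithmetic, not code from the implementation.

```python
# Table I values for the two small-map runs.
travel_m = [1849, 1822]       # travel distance per run, meters
battery_left = [79.9, 78.3]   # remaining battery per run, percent

# Battery consumed per run: 20.1% and 21.7%.
battery_used = [100.0 - b for b in battery_left]

# Average travel distance per 1% of battery usage,
# matching the 87.82 meter figure reported in the text.
avg_travel_per_percent = sum(travel_m) / sum(battery_used)
print(round(avg_travel_per_percent, 2))  # 87.82
```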

IV. RESULTS

The following data was collected during the simulations: detected bottle count, identified bottle labels, identified bottle locations, total travel distance, total mission time, total power consumption and total interruptions by emergencies.

For the small map, we ran the simulation two times to calculate the mean performance of the system. The UAV successfully completed the mission on both simulation runs without interruption. During the simulations of the first map, the system identified 19 of 20 placed bottles with 94% average position precision. The average travel distance per 1% battery usage was 87.82 meter. The results for the first map showed that our system is able to conduct autonomous monitoring in an environment with smooth weather conditions and without unknown obstacles.

For the first simulation run of the TUM campus, the UAV declared an emergency due to a critical battery level and returned to the defined base location. To finish the whole coverage, the UAV restarted the mission 7 times and finished the monitoring mission in 8 attempts. Because of the field size, certain parts of the campus do not contain bottles. During runs 1, 2 and 3, the system faced 9 of the 10 placed bottles. After these three runs, only 1 bottle was left in the field. In runs 4, 6, 7 and 8, the system did not detect any target. Because of the lack of targets, these runs do not contain any precision results in Table II. During the simulation of the TUM campus, the system detected 9 targets and identified 8 targets with 94.625% average position precision. The average travel distance per 1% battery usage was 180.5 meter.

The simulation results in Tables I and II show that the average target detection rate was 93.3%, and 94.7% of the detected targets were labeled successfully. The drone covered 326.08 square meter per 1% battery usage. The average flight time of the drone was 22 minutes 57 seconds with complete battery usage.

As explained in section III-B, the estimations for target detection and target position depend on a probabilistic function. Thus, the results for the estimations cannot demonstrate a ground truth for the consistency of the identification module. However, the 93.3% target detection and 94.7% labelling precision rates suggest that the system has the robustness to find and investigate targets, since our estimations depend on the view perspective of the target by the UAV, which poses a requirement for a realistic approach to the target.

Fig. 4: 3D rendered simple obstacle map.

The results for the two maps differ in the average coverage-to-energy-consumption rate and the total break count. During the TUM
TABLE II
Results of TUM Main Campus Simulation

                           1st    2nd    3rd    4th    5th    6th    7th    8th
Detected target count        5      2      1      0      1      0      0      0
Labeled target count         5      2      1      0      0      0      0      0
Location precision         96%    93%    91%      -     0%      -      -      -
Travel distance (meter)   12.1   12.2   12.2   12.3   12.3   12.2   12.3    2.5
Mission time (minutes)   22.95  22.95  22.95  22.95  22.95  22.95  22.95   4.78
Battery                    32%    32%    32%    32%    32%    32%    32%    79%
Emergency interruption?    Yes    Yes    Yes    Yes    Yes    Yes    Yes     No
campus simulation, the UAV travelled nearly twice as much as it travelled in the small map simulation. Considering that the placed bottle counts were the same on both maps, the results suggest that the system becomes inefficient with an increasing count of detected targets per square meter. This finding, while preliminary, suggests that the system can be improved by a better sense of the target, for example by supporting the image cluster with a LIDAR-based shape extractor. During the TUM campus simulation, the mission was interrupted 7 times due to the low battery of the UAV. In respect to the average recharge time of the battery (4 hours), the system faced 28 hours of breaks. This finding suggests that the system needs increased on-air capacity to reach real-time monitoring of big fields. Therefore, fixed-wing drones can be suggested to improve the on-air capacity of the system. The system provided a highly successful identification rate (90%), which suggests that the system is able to collect reliable information from targets while protecting itself from hazardous conditions.

V. CONCLUSION

In this paper, a modular software framework for the decision process of autonomous UAVs in environmental monitoring missions is presented. The implementation of the system is tested in different simulations. The presented test results show that the system provides seamless planning, control and navigation execution for a quad-rotor UAV. Furthermore, the structure of the system gives the opportunity for adjustments. The proposed investigation module enables high performance in collecting target information. The identification rate was 90% in the first run of the small map simulation, 100% in the second run of the small map simulation and 80% in the TUM campus simulation. The high precision of the investigation module is shown by an average 90% identification rate of the placed targets. In addition to the highly promising monitoring performance, the system executed the missions with an excellent safety level.

At present, the implementation of the suggested system is tested with simulations. To consolidate our findings, we will focus on the hardware integration of the current software implementation. Further, to resolve mission breaks on losing the GPS signal, an on-board odometry solution may be developed, such as laser or visual odometry.

ACKNOWLEDGMENTS

The authors would like to thank the ABBY-Net for providing the idea of the environmental monitoring project.

REFERENCES

[1] B. Bayat et al., "Environmental monitoring using autonomous vehicles: a survey of recent searching techniques," Curr. Opin. Biotechnol., vol. 45, 2017, pp. 76–84.
[2] R. Näsi et al., "Using UAV-based photogrammetry and hyperspectral imaging for mapping bark beetle damage at tree-level," Remote Sens., vol. 7, no. 11, 2015, pp. 15467–15493.
[3] S. Manfreda et al., "On the use of unmanned aerial systems for environmental monitoring," Remote Sens., vol. 10, no. 4, 2018, p. 641.
[4] M. Quaritsch et al., "Fast aerial image acquisition and mosaicking for emergency response operations by collaborative UAVs," in Proc. Int. ISCRAM Conf., 2011, pp. 1–5.
[5] T. Tomic et al., "Toward a fully autonomous UAV: Research platform for indoor and outdoor urban search and rescue," IEEE J. Robot. Autom., vol. 19, no. 3, 2012, pp. 46–56.
[6] R. Ludovisi et al., "UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought," Front. Plant Sci., vol. 8, 2017, p. 1681.
[7] S. d'Oleire-Oltmanns et al., "Unmanned aerial vehicle (UAV) for monitoring soil erosion in Morocco," Remote Sens., vol. 4, no. 11, 2012, pp. 3390–3416.
[8] R. R. Murphy et al., "Cooperative use of unmanned sea surface and micro aerial vehicles at Hurricane Wilma," J. Field Robot., vol. 25, no. 3, 2008, pp. 164–180.
[9] C. A. F. Ezequiel et al., "UAV aerial imaging applications for post-disaster assessment, environmental management and infrastructure development," in Proc. Int. Conf. Unmanned Aircraft Systems (ICUAS), 2014, pp. 274–283.
[10] E. U. Acar et al., "Morse decompositions for coverage tasks," Int. J. Robotics Res., vol. 21, no. 4, 2002, pp. 331–344.
[11] S. X. Yang and C. Luo, "A neural network approach to complete coverage path planning," IEEE Trans. Syst. Man Cybern. B Cybern., vol. 34, no. 1, 2004, pp. 718–724.
[12] M. Ohashi et al., "Crayfish robot that generates flow field to enhance chemical reception," J. Sens. Tech., vol. 2, no. 4, 2012, p. 185.
