
Software Engineering Department

Ort Braude College

Final Project in Software Engineering, Course 61401


Flocking algorithm for autonomous flying robots

Ori SAYDA 302842943, Gidi BUKS 304618879


Supervisor: Dr. Reuven COHEN

Karmiel – February 4th, 2018

Contents

1. INTRODUCTION.................................................................................................... 3
1.1 Organization of The Paper .......................................................................... 3
2. Background and related work ................................................................................. 4
2.1. Flocking Birds ............................................................................................ 4
2.2 PID Controller ............................................................................................. 4
2.3 Inner Noise Interference On GPS Device .................................................... 5
2.4. Autonomous Drone Industry ....................................................................... 5
2.5. Definition of the Problem ............................................................................ 5
2.6. Other Products ........................................................................................... 6
2.7. Proposed Solution ...................................................................................... 6
2.8. Advantages and Disadvantages of the Proposed Solution ......................... 6
3. Theory.................................................................................................................... 7
3.1. Autonomous Flight of The Individual In Flock ............................................. 7
3.2. Autonomous Flight of Several Drones as A Flock....................................... 9
3.3. Collective Target Tracking ........................................................................ 11
4. SOFTWARE ENGINEERING DOCUMENTATION ............................................... 13
4.1. Requirements (Use Cases) ...................................................................... 13
4.2. Design (GUI, UML Diagrams)................................................................... 15
4.2.1. User Initialization .......................................................................... 15
4.2.2 Technician Initialization ................................................................. 16
4.2.3 Area Initialization ...........................................................................17
4.2.4 Target Initialization ........................................................................18
4.3 Design (UML Diagrams) ............................................................................19
5. Testing ................................................................................................................. 23
6. RESULTS AND CONCLUSIONS ......................................................................... 25
6.1 Results ...................................................................................................... 25
6.2 Conclusions ..............................................................................................32
REFERENCES ........................................................................................................33

Abstract
In this era, when everyone looks for convenient, faster and more efficient ways to send things between remote places, this system offers an efficient and convenient solution.
This work was conducted after reviewing products of this type across the Internet; with the help of the article "Flocking algorithm for autonomous flying robots" [1], we investigated the problems that could occur in such a system.
A major problem that the system must consider is that, unlike drones that fly alone, in this system the flight is performed as a flock, which greatly affects the hovering and flight of each drone.
In this project we implement the formulas from the above article; these formulas were investigated and verified by the authors of the article.
KEYWORDS: Flock, Drone, Autonomous, Flying robots, Algorithm.

1. INTRODUCTION

At the beginning of the 21st century, progress in the electronics domain led to simple and inexpensive production of flight controllers, positioning devices and cameras.
Since then, drones have entered our lives and have developed rapidly, with constant improvement of their control, precision and handling.

This development enabled new purposes for drone usage, in order to lower costs, improve efficiency, provide an inexhaustible working force, and improve the quality of life.
The chase for a robotic entity that works autonomously on its own initiative is accelerating in different areas of our lives, from autonomous systems in cars up to autonomous planes and drones used in a variety of fields (military, agricultural, logistics, transportation, aviation etc.).

The assembly of a drone flock is complex, and many parameters need to be considered in real time in order to avoid obstacles and collisions, stay in the predetermined area, get to the desired destination, follow a defined target, and so on.

The purpose of this project is to develop an algorithm that gives the drone the ability to respond in real time to nearby drones. The algorithm allows the drones to get inputs from their surroundings while taking into consideration the existing conditions in the area, in order to fly as a flock toward a determined target.

In our algorithm, each one of the drones is completely autonomous and is capable of making decisions in response to the other drones in the flock and to the surrounding conditions. This is done while maintaining stability and avoiding obstacles detected by sensors located on top of the drone. The drones include GPS, sonar sensors, Bluetooth and Wi-Fi communication devices, and a preprocessing unit.
In order to fly as a flock, the drones are driven by three main principles: repulsion, alignment, and global positioning (GPS).

1.1 Organization Of The Paper

In Section 2, we define basic concepts in the field of the project and review related work.
Section 3 explains the three major parts of the solution: Section 3.1 explains the autonomous flight of the individual in the flock, Section 3.2 explains the autonomous flight of several drones as a flock, and Section 3.3 explains the collective target tracking. Section 4 contains the software engineering documentation: requirements (Section 4.1) and design, including the GUI and the class diagrams (Sections 4.2 and 4.3). Section 5 presents a short test plan, and Section 6 presents the results and conclusions.

2. Background and related work
2.1. Flocking Birds

Flocking behavior is a striking phenomenon exhibited when a group of birds moves together, for example while searching for food or migrating to new regions (see Figure 1).

Today there are systems that simulate the behavior of flocking animals in general and flocking birds in
particular.

Figure 1 - A flock of birds flying in the sky.

2.2 PID Controller

The PID controller, or by its full name the proportional-integral-derivative controller, is a control-loop feedback mechanism that is commonly used in industrial control systems [6].

This controller is used to act on the error between the measured velocity vector and the desired velocity vector.

The calculation of the controller output takes into account the proportional, integral and differential terms of the error.

To better understand the characteristics of this controller, measurements and adjustments were made on a specific autonomous flying robot.

The low-level algorithm on which the controller is based receives two inputs: (i) the desired velocity and (ii) the measured velocity.

The algorithm returns a single value, calculated by the following formula:


$$\varphi_{out}(t) = K_p\, e(t) + K_d\, \frac{d\,e(t)}{dt} + K_i \int_0^t e(t')\,dt' + \varphi_{bias}$$

where:

$K_p$ – proportional parameter

$K_i$ – integral parameter

$K_d$ – derivative parameter

$e(t)$ – the error between the measured velocity vector and the desired velocity vector

$t$ – current time

$t'$ – integration variable, running from time zero to the current time

$\varphi_{bias}$ – bias term accounting for the delay of calculation and data transfer

In the ideal case, the measured velocity converges exponentially to the desired velocity with a characteristic time $\tau_{CTRL}$.

In non-ideal cases the response is either slower, so the drone lags behind or stays in place, or it overshoots with excessive acceleration and mobility.
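To make the controller step concrete, the following is a minimal Java sketch of a velocity PID update. The class and field names are our own illustrative choices and do not come from the article; a real flight controller would run such a loop for each axis of the velocity vector.

```java
/** Minimal sketch of a velocity PID step (illustrative names, not from the article). */
public class VelocityPid {
    private final double kp, ki, kd;   // proportional, integral and derivative parameters
    private final double phiBias;      // constant bias term phi_bias
    private double integral = 0.0;     // accumulated integral of the error
    private double previousError = 0.0;

    public VelocityPid(double kp, double ki, double kd, double phiBias) {
        this.kp = kp; this.ki = ki; this.kd = kd; this.phiBias = phiBias;
    }

    /** One controller step: e(t) = desired - measured, dt = time since the last step [s]. */
    public double step(double desiredVelocity, double measuredVelocity, double dt) {
        double error = desiredVelocity - measuredVelocity;
        integral += error * dt;                            // integral of e(t') dt'
        double derivative = (error - previousError) / dt;  // de(t)/dt
        previousError = error;
        return kp * error + kd * derivative + ki * integral + phiBias;
    }
}
```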

2.3 Inner Noise Interference On GPS Device

To better understand the effect of inner noise on the system, tests were conducted on two main variables.

The tests were made on a specific autonomous flying robot; however, the conclusions can be extended to other drones.

The two main variables:

- The error between the measured velocity vector and the desired velocity vector, which is approximately Gaussian.

- The measured position, whose error is approximately 2.5 m.

The test was performed with a static GPS on the ground (measurements with a static GPS on the ground usually have higher amplitude and frequency than measurements taken on flying robots).

Therefore, these tests overestimate the actual inner noise disturbance on flying robots (drones).

According to the findings (assuming the GPS was installed optimally; the differences were tested using the Euler-Maruyama method), the modelled GPS position signal behaves like the measured one.

Therefore, the inner noise exists but can be handled, and it does not interfere with the solution of the overall problem.

2.4. Autonomous Drone Industry

In the last 10 years, some big companies, such as Google, Amazon and Facebook, have entered the field of flying robots (drones). These companies started the research and development of single autonomous drones, mainly because of the low cost of drone manufacturing in recent years. Experts predict that swarms of drones will soon be ready for civilian use in many different fields [3].

2.5. Definition of the Problem

Autonomous drones have been on the market for several years (e.g. DJI's well-known drones), yet the flight of drones as a flock is an issue that has only recently been getting a lot of attention.
Unlike an autonomous drone flying alone, flock flight involves a handful of additional parameters that have to be considered. These include the communication time, the inaccuracy of the other drones, staying in the defined zone, avoiding static and active obstacles (including the other drones composing the flock), adjusting the single drone's behavior to the others, arriving simultaneously at the target, jointly tracking a defined object, and shared decision making. All these parameters make flock flight a much more difficult task, and this is the goal of this project.

2.6. Other Products

Other products of autonomous drone flocks are NVIDIA UAVs & Autonomous Drones [2], the Hexo+ Self-Flying Camera [4] and the Intel drone flock [5]; in recent years, scientists in Budapest also created the first flock of autonomous flying robots [1].

The Hexo+ Self-Flying Camera was designed to capture video of extreme athletes, with a single drone that follows a moving target. This product includes motion-capture video, and therefore it handles collisions more slowly than a drone that is directed to a specific area; however, the Hexo+ Self-Flying Camera offers a target-following function that does not exist in NVIDIA UAVs & Autonomous Drones, for example.

Another comparable autonomous drone system is NVIDIA UAVs & Autonomous Drones. This system is faster, but like the Hexo+ it works with only one drone and not with a flock.

The third system is Intel Corporation's drone flock. This product was successfully demonstrated at the last Super Bowl, where the flock painted the Intel logo, the Pepsi logo and the USA flag in the sky.

2.7. Proposed Solution

In the attempt to solve the existing problem we are guided by three important principles that can resolve the problems arising during flock flight:
1. Short-range repulsion – Each drone has the ability to keep itself away from any obstacle it gets close to by setting a new velocity vector. This repulsion force is necessary in order to prevent collisions.
2. Alignment rule – Each drone has the ability to communicate with the other drones in the flock that are located within its communication range in order to update its velocity vector. This rule allows the flock to fly together while aligning velocities.

3. Preferred global position – All drones in the flock are required to maintain their predetermined position, so that every drone can calculate the area in which it is supposed to fly while moving with the flock. This principle is important in order to stay grouped in the determined flying zone.

2.8. Advantages and Disadvantages of the Proposed Solution

Advantages
1. Every drone flies individually and independently, which means that if one of the drones is disabled, it has no effect on the rest of the flock.
2. The system is fully autonomous; therefore, after the operator's initial definitions there is no need for further intervention, and the flock manages to reach its target.

Disadvantages
1. The communication relies on the sensors - in case the drones are beyond the communication range, they will not be able to transfer information to, or receive information from, the other drones in the flock.
2. The degree of inaccuracy (inner noise) in each one of the devices used in the system – the measured values are not the real values on the ground; this could lead to collisions and inaccuracy in the system.
3. Natural forces can affect the system and even destroy it - external forces such as strong winds or storms can destroy the system and take it out of use.

3. Theory
To solve the problem we have described, we are concentrating on three main topics:

- Autonomous flight of the individual in the flock.

- Autonomous flight of several drones as a flock.

- Collective target tracking.

3.1. Autonomous Flight of the Individual in Flock

To make the autonomous flight of each drone possible, we need to allow it to control its own velocity vector. In order to accomplish that, we use a low-level algorithm based on the velocity PID controller. The input of this algorithm is the desired velocity vector, and the output fixes the new velocity vector.
However, regarding flock flight, other parameters need to be considered. As a result, the desired velocity vector serving as the input to our algorithm is no longer constant; it varies at any given moment, depending on the other drones' velocities and positions.
The desired velocity vector of an individual drone can be calculated using the function $f_i$, which we shall elaborate on later.
The desired velocity vector of an individual drone in the flock is calculated using the following equation:

$$v_i^d(t) = f_i\left(\{x_j(t)\}_{j=1}^{N},\ \{v_j(t)\}_{j=1}^{N}\right)$$

where N is the number of drones composing the flock.


Yet, we need to consider several other parameters, such as:

Inner noise – The system is based on GPS sensors, which provide the position and the velocity vector of each drone in the flock. These sensors have a level of inaccuracy, which gives a small deviation in the position of each drone and in its velocity vector. The inaccuracy can be designated by the stochastic function $\eta_i^s(t)$; yet, for simplicity, we will refer to it as a constant and mark it as $\sigma_s$.

General noise – There are external forces that influence the system which should also be considered (e.g. wind, rain, storms etc.). In our system we assume an environment that is as close to optimal as possible. Yet, we must consider the winds, which have a direct impact on the acceleration. The effect of the wind is given by the stochastic function $\eta_i(t)$; for simplicity we will refer to it as a constant marked as $\sigma$.

Time delay – As previously discussed, the system is based on sensors such as sonar, a communication unit between the drones, and a processing unit for each individual.
From the moment a sensor needs to transmit or process incoming data until the moment the data actually arrives, there is a time interval, which can be critical in some cases and can lead to a situation where the transmitted or processed data is no longer relevant, and thus to potential collisions. The time delay is hard to measure and can have destructive effects on the system. This feature varies between different types of sensors. We will refer to this parameter as $t_{del}$.

Refresh rate of the sensors – Each sensor in the system has a refresh time, expressing the time interval between measurements. This parameter varies between different sensor types, and the corresponding frequency will be marked as $t_s^{-1}$.

Inertia – Ideally, orders would be executed immediately, at the very moment they are transmitted. In reality this behaviour is impossible, since the command needs to be processed and executed, and this action takes time. In the case of a velocity modification, for example, the change cannot occur instantly; instead, the velocity converges to the new value exponentially in time. We will refer to the characteristic time of this convergence as a constant and mark it as $\tau_{CTRL}$. We will also limit the magnitude of the acceleration and mark this limit as $a_{max}$.

Locality of the communication – Each sensor that allows data transmission between adjacent drones has a communication range $r_c$. The range $r_c$ varies between the different types of sensors found on the market.
In case the distance between two drones is bigger than $r_c$, the data will not go through. In other words, we do not have to consider drones that are outside the communication range. This variable as well is hard to measure and can have destructive effects on the system.

All the parameters listed above lead to new equations.


The acceleration is given by:

(1)

$$a_i(t) = \eta_i(t) + \frac{v_i^d(t) - v_i(t) - v_i^s(t)}{\left|v_i^d(t) - v_i(t) - v_i^s(t)\right|}\cdot \min\left\{\frac{\left|v_i^d(t) - v_i(t) - v_i^s(t)\right|}{\tau_{CTRL}},\ a_{max}\right\}$$

Where 𝜏𝐶𝑇𝑅𝐿 stands for the relaxation time of the low-level controller.
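As an illustration, a minimal Java sketch of this saturated acceleration update could look as follows. The noise term $\eta_i(t)$ is omitted, and the small Vec3 helper class (reused in the later sketches of this section) as well as all names are our own illustrative choices, not taken from the article.

```java
/** Small 3D vector helper used by the sketches in this section (illustrative, not from the article). */
final class Vec3 {
    final double x, y, z;
    Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    Vec3 add(Vec3 o)     { return new Vec3(x + o.x, y + o.y, z + o.z); }
    Vec3 sub(Vec3 o)     { return new Vec3(x - o.x, y - o.y, z - o.z); }
    Vec3 scale(double s) { return new Vec3(x * s, y * s, z * s); }
    double norm()        { return Math.sqrt(x * x + y * y + z * z); }
}

final class AccelerationModel {
    /** Acceleration toward the desired velocity, saturated at aMax (eq. (1) without the noise term). */
    static Vec3 acceleration(Vec3 desiredVelocity, Vec3 measuredVelocity, double tauCtrl, double aMax) {
        Vec3 error = desiredVelocity.sub(measuredVelocity);   // v_i^d - (v_i + v_i^s)
        double magnitude = error.norm();
        if (magnitude < 1e-9) return new Vec3(0, 0, 0);       // already at the desired velocity
        double accel = Math.min(magnitude / tauCtrl, aMax);   // saturate at the maximal acceleration
        return error.scale(accel / magnitude);                // directed along the velocity error
    }
}
```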

The velocity is now given by:

$$v_i^d(t) = f_i\left[\{x_j(t - t_{del}) + x_j^s(t - t_{del})\}_{j \neq i},\ x_i(t) + x_i^s(t),\ \{v_j(t - t_{del}) + v_j^s(t - t_{del})\}_{j \neq i},\ v_i(t) + v_i^s(t)\right]$$

where $x_i^s(t)$ and $v_i^s(t)$ are measures of the combined position and velocity noise, which result from solving the second-order stochastic differential equation $\ddot{x}_i^s(t) = \dot{v}_i^s(t) = \eta_i^s(t)$, and $\{\ldots\}_{j \neq i}$ denotes a set with iterator $j \neq i$. The function $f_i$ depends on the position and velocity of the $i$th drone and on the delayed positions and velocities of the other drones, and it is updated with frequency $t_s^{-1}$.
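The inner-noise terms can be generated numerically with the Euler-Maruyama method mentioned in Section 2.3. The following is a hedged Java sketch, assuming that $\sigma_s$ plays the role of the noise intensity of the velocity increments; this scaling, the class name InnerNoise and the reuse of the Vec3 helper are our own assumptions, not taken from the article.

```java
import java.util.Random;

/** Euler-Maruyama sketch of the noise SDE: dv^s = sqrt(sigma) dW, dx^s = v^s dt (assumed scaling). */
final class InnerNoise {
    private final Random random = new Random();
    private Vec3 vs = new Vec3(0, 0, 0);   // velocity noise v_i^s
    private Vec3 xs = new Vec3(0, 0, 0);   // position noise x_i^s
    private final double sigma;            // assumed intensity of the velocity noise

    InnerNoise(double sigma) { this.sigma = sigma; }

    /** Advance the noise process by one time step dt. */
    void step(double dt) {
        double amplitude = Math.sqrt(sigma * dt);   // standard deviation of the velocity increment
        vs = vs.add(new Vec3(random.nextGaussian() * amplitude,
                             random.nextGaussian() * amplitude,
                             random.nextGaussian() * amplitude));
        xs = xs.add(vs.scale(dt));                  // integrate the velocity noise into position noise
    }

    Vec3 positionNoise() { return xs; }
    Vec3 velocityNoise() { return vs; }
}
```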

The chosen function $f_i$ has two main properties.

First, $f_i$ depends on the relative coordinates of the interacting drones within the communication range:

$$f_i(t) = f_i\left(\{x_j(t - t_{del}) - x_i(t) + x_j^s(t - t_{del}) - x_i^s(t)\}_{j \neq i},\ \{v_j(t - t_{del}) + v_j^s(t - t_{del})\}_{j \neq i},\ v_i(t) + v_i^s(t)\right)$$

Second, the drones' communication range is expressed in the function $f_i$ as the sum of all pairwise interactions between drones in the flock ($f_{ij}$):

$$f_i = \sum_{j=1}^{N} f_{ij}\left(\tilde{x}_j - \tilde{x}_i,\ \tilde{v}_i,\ \tilde{v}_j\right)\,\theta\left(r_c - |\tilde{x}_i - \tilde{x}_j|\right)$$

where $\theta(x)$ defines the communication range explicitly: it equals 0 if $x < 0$ and 1 if $x \geq 0$. $\tilde{x}_i$ and $\tilde{v}_i$ are the measured position and velocity values including the modelled inner noise term ($\tilde{x}_i = x_i + x_i^s$ and $\tilde{v}_i = v_i + v_i^s$).
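A minimal Java sketch of this communication-range cutoff, reusing the Vec3 helper from the previous sketch, could look like this; the PairwiseTerm interface and all names are our own illustrative assumptions:

```java
/** Pairwise interaction term f_ij(x_j - x_i, v_i, v_j) (illustrative abstraction). */
interface PairwiseTerm {
    Vec3 apply(Vec3 relativePosition, Vec3 vi, Vec3 vj);
}

final class InteractionSum {
    /** Sum of pairwise terms over drones within the communication range r_c of drone i. */
    static Vec3 sum(int i, Vec3[] positions, Vec3[] velocities, double rc, PairwiseTerm fij) {
        Vec3 total = new Vec3(0, 0, 0);
        for (int j = 0; j < positions.length; j++) {
            if (j == i) continue;
            Vec3 relative = positions[j].sub(positions[i]);
            if (relative.norm() > rc) continue;            // theta(r_c - |x_i - x_j|) = 0
            total = total.add(fij.apply(relative, velocities[i], velocities[j]));
        }
        return total;
    }
}
```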

In Table 1 (taken from the paper), we summarize the parameters of the model defined by (1).

Parameter | Unit | Definition | Valid range / value
$\tau_{CTRL}$ | s | Relaxation time of the low-level controller (e.g. PID controller) | $\tau_{CTRL}$ = 1 s
$a_{max}$ | m/s² | Maximum magnitude of acceleration | $a_{max}$ = 6 m/s²
$\sigma_s$ | m²/s² | Measure of inner noise fluctuation | $\sigma_s$ = 0.005 m²/s²
$t_s^{-1}$ | 1/s | Frequency of receiving sensory data | $t_s^{-1}$ = 5 1/s
$r_c$ | m | Communication range | $r_c$ = 30–140 m
$t_{del}$ | s | Time delay of communication | $t_{del}$ = 0–2 s
$\sigma$ | m²/s³ | Measure of outer noise fluctuation | $\sigma$ = 0–0.2 m²/s³

Table 1 - Parameters of the flying robot model. The column „Valid range / value” shows values that are valid for our experimental setup with quadcopters.

3.2. Autonomous Flight Of Several Drones As A Flock

The flight of a number of autonomous drones as a flock is actually the autonomous flight of each drone in a predetermined area with fixed limits, which alters its position according to the drones' movement. During the flight in this specific area, additional parameters should be considered, e.g. the ability to self-organize within the determined area while avoiding collisions at the same time.
A minimal algorithm is used to achieve this behavior. This algorithm is based on models of animal swarms. The self-organization of the individual drone in the flock is achieved by its own decisions, made in order to find its place relative to the rest of the flock. The desired velocity vector of each drone is given as the sum of the interactions between the drones and some other terms that relate to self-propelling behavior and to staying within the bounded area.
Each term is further explained below.

We define each drone as a self-propelled particle (SPP) with a requested velocity magnitude $v_{flock}$:

(2)
$$v_i^{SPP} = v_{flock}\,\frac{v_i}{|v_i|}$$

Short-range repulsion

In order to avoid collisions between the drones we define a local linear repulsion:

(3)

$$v_{ij}^{rep} = D\,\frac{|d_{ij}| - r_0}{|d_{ij}|}\,d_{ij}\,\theta\left(r_0 - |d_{ij}|\right)$$

where 𝑑𝑖𝑗 = 𝑥𝑗 − 𝑥𝑖 , D is the repulsion force, and 𝑟0 is the interaction range.
We consider that the amplitude of fluctuations in the measured position caused by inner noise can be in
the same range as 𝑟0 .

Although there is a certain inaccuracy in the measured position, it does not lead to sudden and significant changes in the system. Therefore, the use of a simple linear repulsion is preferable over higher-order terms.
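A minimal Java sketch of this repulsion term, reusing the Vec3 helper from the earlier sketches (names are our own, not from the article), might look like this:

```java
final class RepulsionTerm {
    /** Short-range linear repulsion v_ij^rep of eq. (3); zero outside the interaction range r_0. */
    static Vec3 compute(Vec3 xi, Vec3 xj, double repulsionStrengthD, double r0) {
        Vec3 dij = xj.sub(xi);                    // d_ij = x_j - x_i
        double distance = dij.norm();
        if (distance >= r0 || distance < 1e-9) {
            return new Vec3(0, 0, 0);             // theta(r_0 - |d_ij|) = 0, or overlapping drones
        }
        // (|d_ij| - r_0) is negative inside the range, so the term points away from drone j
        return dij.scale(repulsionStrengthD * (distance - r0) / distance);
    }
}
```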

Velocity alignment of neighbors

Any velocity alignment term in a practical control algorithm is based on three guidelines:
1. The velocity of each individual should be as close as possible to those of the other members composing the flock.

2. The alignment rule affects only the units within the communication range.

3. The alignment term should have an upper threshold value, so that it stays bounded even when drones are close to collision. The purpose of this threshold is to maintain the stability of the system.

(4)

$$v_{ij}^{frict} = C_{frict}\,\frac{v_j - v_i}{\left(\max\{r_{min}, |d_{ij}|\}\right)^2}$$

where 𝐶𝑓𝑟𝑖𝑐𝑡 is the strength of the alignment, and 𝑟𝑚𝑖𝑛 defines the threshold to avoid division by close-
to-zero distances.

This term is a precise, practical choice for taking into consideration the tendency of the drones to align
their direction of motion.

The locality of the viscous friction term in practice is assured by the inverse-square decay of the term as a function of distance. However, the maximal velocity $v_{max}$ and the value of $C_{frict}$ also have to be limited. The interaction becomes local if the magnitude of $v_{ij}^{frict}$ becomes negligible compared to $|v_i^{SPP}|$ at large distances, i.e. when $C_{frict} \ll v_{flock}|d_{ij}|^2 / 2v_{max}$ for large values of $|d_{ij}|$. The optimal ratio of $v_{flock}$ and $C_{frict}$ is thus defined by the limit of the velocities and the desired interaction range.
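A corresponding Java sketch of the alignment (viscous friction) term, again reusing the Vec3 helper and using our own illustrative names:

```java
final class FrictionTerm {
    /** Velocity alignment ("viscous friction") term v_ij^frict of eq. (4). */
    static Vec3 compute(Vec3 xi, Vec3 xj, Vec3 vi, Vec3 vj, double cFrict, double rMin) {
        double distance = xj.sub(xi).norm();
        double effective = Math.max(rMin, distance);     // avoid division by near-zero distances
        Vec3 velocityDifference = vj.sub(vi);            // pull the velocity toward the neighbour's
        return velocityDifference.scale(cFrict / (effective * effective));
    }
}
```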

Boundaries and shill agents

An important principle for achieving flocking behavior is keeping all the flock members within a determined area while in constant motion. This area delimits the acceptable flying zone and, by doing so, creates a flock.

We will be using a square-shaped area bounded by walls with repulsion abilities, defined as virtual "shill" agents. In case one or more of the drones composing the flock crosses the delimited perimeter, the wall sends a signal with the wall coordinates to the drone, in order to align the drone's velocity towards the centre of the defined area.

(5)
$$v_i^{shill} = C_{shill}\cdot s\left(|x_a - x_i|,\ \tilde{R}(x_i, x_a, R),\ d\right)\left(v_{flock}\,\frac{x_a - x_i}{|x_a - x_i|} - v_i\right)$$

where $C_{shill}$ expresses the strength of the „shill-repulsion”, $x_a$ is the position of the centre of the area, and $s(x, R, d)$ is a sigmoid curve which smoothly reduces the strength of the repulsion inside the area:

(6)

$$s(x, R, d) = \begin{cases} 0 & \text{if } x \in [0, R] \\ \frac{1}{2}\left[\sin\left(\frac{\pi}{d}(x - R) - \frac{\pi}{2}\right) + 1\right] & \text{if } x \in (R, R + d] \\ 1 & \text{if } x > R + d \end{cases}$$

$\tilde{R}$ is a function that defines the shape of the area (in this case, a square with side length R). The combination of the three terms defined above forms the minimal prerequisites of flocking behavior, i.e. with these terms we can assure stable and collision-free collective motion. Therefore, the desired velocity vector of the individual is determined by the following equation:

(7)
$$v_i^d = v_i^{SPP} + v_i^{shill} + \sum_{j \neq i}\left(v_{ij}^{rep} + v_{ij}^{frict}\right)\theta\left(r_c - |d_{ij}|\right)$$
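The following Java sketch ties equations (2)-(7) together, reusing the Vec3 helper and the repulsion and friction sketches above. For simplicity we approximate the arena boundary with a circle of radius R around the area centre (close to the ring-shaped arena we eventually used in the simulations, see Section 6); all names and the exact structure are our own illustrative assumptions, not the article's implementation.

```java
final class FlockingModel {
    /** Sigmoid of eq. (6): 0 inside radius R, rising smoothly to 1 over a band of width d. */
    static double sigmoid(double x, double radius, double d) {
        if (x <= radius) return 0.0;
        if (x > radius + d) return 1.0;
        return 0.5 * (Math.sin(Math.PI / d * (x - radius) - Math.PI / 2) + 1.0);
    }

    /** Shill-agent term of eq. (5), with the arena approximated by a circle of radius R around xA. */
    static Vec3 shillTerm(Vec3 xi, Vec3 vi, Vec3 xA, double cShill,
                          double vFlock, double radius, double wallWidth) {
        Vec3 toCenter = xA.sub(xi);                                 // x_a - x_i
        double distance = toCenter.norm();
        if (distance < 1e-9) return new Vec3(0, 0, 0);
        Vec3 shillVelocity = toCenter.scale(vFlock / distance);     // shill agent moves toward the centre
        double strength = cShill * sigmoid(distance, radius, wallWidth);
        return shillVelocity.sub(vi).scale(strength);
    }

    /** Desired velocity of drone i, eq. (7): SPP term + shill term + pairwise terms within r_c. */
    static Vec3 desiredVelocity(int i, Vec3[] pos, Vec3[] vel, Vec3 areaCenter,
                                double vFlock, double cShill, double radius, double wallWidth,
                                double repulsionD, double r0, double cFrict, double rMin, double rc) {
        Vec3 vSpp = vel[i].scale(vFlock / Math.max(vel[i].norm(), 1e-9));           // eq. (2)
        Vec3 total = vSpp.add(shillTerm(pos[i], vel[i], areaCenter, cShill, vFlock, radius, wallWidth));
        for (int j = 0; j < pos.length; j++) {
            if (j == i || pos[j].sub(pos[i]).norm() > rc) continue;                 // communication range
            total = total.add(RepulsionTerm.compute(pos[i], pos[j], repulsionD, r0));
            total = total.add(FrictionTerm.compute(pos[i], pos[j], vel[i], vel[j], cFrict, rMin));
        }
        return total;
    }
}
```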

In Table 2 (taken from the paper), we summarize the parameters of the self-propelled flocking algorithm.

Parameter | Unit | Definition
$v_{flock}$ | m/s | Preferred „flocking” velocity
D | 1/s | Strength of repulsion
$r_0$ | m | Interaction range of repulsion
$C_{frict}$ | m² | Strength of viscous friction
$r_{min}$ | m | Threshold distance to avoid division by zero
R | m | Side length of the square-shaped arena
$C_{shill}$ | – | Maximum strength of shill-repulsion near walls
d | m | Characteristic „width” of the wall

Table 2 - Parameters of the self-propelled flocking algorithm.

3.3. Collective Target Tracking

In this category, we will discuss another feature, which is the ability to track a predetermined target.

We will allow this using an algorithm with a fixed target point defined in advance, together with the interaction terms $v_{ij}^{rep}$ and $v_{ij}^{frict}$.

We can refer to the flock as a „meta-agent” located at the centre of mass and moving towards the target position with desired velocity $v_0$.
Each individual unit must follow these rules, while avoiding collisions:
1) Approach this meta-agent closely enough to join the flock.
2) Move parallel with the meta-agent in order to reach the target.

The algorithm makes each drone transition smoothly between two stable states:
1) The flocking state, when the drones are far away from the target.

2) The collective hovering state, when approaching the target.


When the flock gets near the target, the drones transition from the flocking state to the hovering state. The magnitude of the velocity has to approach zero smoothly, and the drones should remain stable as a flock.

In our system, the communication between the drones is local, and therefore we cannot calculate the global centre of mass. Nevertheless, drones can calculate the local centre of mass ($CoM$), based on the information received from other drones within the communication range.
We define the velocity vector towards the target in a sphere-shaped area with radius $r_c$ using the equation:

(8)

$$v_i^{trg} = v_0\left[s\left(|x_i^{CoM} - x_i|,\ r_{CoM},\ d\right)\frac{x_i^{CoM} - x_i}{|x_i^{CoM} - x_i|} + s\left(|x^{trg} - x_i^{CoM}|,\ r_{trg},\ d\right)\frac{x^{trg} - x_i^{CoM}}{|x^{trg} - x_i^{CoM}|}\right]$$

where 𝑣0 refers to the magnitude of the preferred velocity, 𝑥 𝑡𝑟𝑔 is the position of the target, 𝑥𝑖𝐶𝑜𝑀
stands for the position of the local center of mass from the viewpoint of the 𝑖𝑡ℎ unit, 𝑟𝑡𝑟𝑔 is the radius
of the target area, 𝑟𝐶𝑜𝑀 is the radius of the sphere-shaped meta-agent. 𝑆(𝑥, 𝑅, 𝑑) is the sigmoid function
defined in the previous section (6). It is important to mention that the locality of the viscous friction
term defined earlier depends on the values of 𝑣0 and 𝐶𝑓𝑟𝑖𝑐𝑡 in this algorithm.

The magnitude of the target tracking term saturates at 𝑣0 :

(9)
$$\tilde{v}_i^{trg} = \frac{v_i^{trg}}{\left|v_i^{trg}\right|}\,\min\left\{v_0,\ \left|v_i^{trg}\right|\right\}$$

The final desired velocity calculated by the algorithm is:

(10)
$$v_i^d = \tilde{v}_i^{trg} + \sum_{j \neq i}\left(v_{ij}^{rep} + v_{ij}^{frict}\right)\theta\left(r_c - |d_{ij}|\right)$$
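A minimal Java sketch of the target-tracking term of equations (8)-(9), reusing the Vec3 helper and the sigmoid from the previous sketch; all names are our own illustrative assumptions. The final desired velocity of eq. (10) is then obtained by replacing the SPP and shill terms of eq. (7) with this term while keeping the same pairwise repulsion and friction sums within $r_c$.

```java
final class TargetTracking {
    /** Target-tracking term of eqs. (8)-(9): toward the local CoM, then toward the target, capped at v0. */
    static Vec3 targetTerm(Vec3 xi, Vec3 localCoM, Vec3 target,
                           double v0, double rCoM, double rTrg, double d) {
        Vec3 term = new Vec3(0, 0, 0);
        Vec3 toCoM = localCoM.sub(xi);                          // x_i^CoM - x_i
        if (toCoM.norm() > 1e-9) {
            term = term.add(toCoM.scale(FlockingModel.sigmoid(toCoM.norm(), rCoM, d) / toCoM.norm()));
        }
        Vec3 coMToTarget = target.sub(localCoM);                // x^trg - x_i^CoM
        if (coMToTarget.norm() > 1e-9) {
            term = term.add(coMToTarget.scale(
                    FlockingModel.sigmoid(coMToTarget.norm(), rTrg, d) / coMToTarget.norm()));
        }
        term = term.scale(v0);
        double magnitude = term.norm();
        if (magnitude > v0) term = term.scale(v0 / magnitude); // eq. (9): saturate at v_0
        return term;
    }
}
```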

In Table 3 (taken from the paper), we summarize the parameters of the target-tracking algorithm.

Parameter | Unit | Definition
$v_0$ | m/s | Preferred velocity far from the target position
$r_{CoM}$ | m | Radius of the expected flock size (characteristic size of the meta-agent)
$r_{trg}$ | m | Characteristic size of the target area
d | m | Characteristic size of the „transition” area – the velocity of the meta-agent approaches zero near the target point with this „relaxation length”

Table 3 - Parameters of the target-tracking algorithm.

4. SOFTWARE ENGINEERING DOCUMENTATION
4.1. Requirements (Use Cases)

Figure 2 - Use Case Diagram

Use case 1: Area initialization

Goal: The user initializes the parameters for the flight area.

Preconditions: Fill R-0, radius, relaxation length, C-Shill, C-Frict, wall width.

Possible user errors: not all fields are filled.

Actor | System
1. Fill all fields (R-0, radius, relaxation length, C-Shill, C-Frict, wall width). | 3. Display a message to the user if information needed for initialization is missing.
2. Click Initialize. | 4. Set the information in the drones' program.

Table 4 - Pseudo-code flow of the area initialization use case.

Use case 2: Set Target Area

Goal: Set the relevant properties for the target area initialization.

Preconditions: Fill all essential fields (rCom, xCom, yCom, zCom).

Possible user errors: not all fields are filled.

Actor | System
1. Fill all fields (rCom, xCom, yCom, zCom). | 3. Display a message to the user if information needed for initialization is missing.
2. Click Initialize. | 4. Set the information in the drones' program.

Table 5 - Pseudo-code flow of the set target area use case.

Use case 3: Technician Initialization

Goal: Set the relevant properties for the technician initialization.

Preconditions: Fill all essential fields (sensor frequency, time delay, T control, outer noise, inner noise, communication range, maximum acceleration, preferred velocity).

Possible user errors: the fields are not filled properly, or the values do not match the data.

Actor | System
1. Fill all essential fields (sensor frequency, time delay, T control, outer noise, inner noise, communication range, maximum acceleration). | 3. Display a message to the user if information needed for initialization is missing.
2. Click Initialize. | 4. Set the information in the drones' program.

Table 6 - Pseudo-code flow of the technician initialization use case.

4.2. Design (GUI, UML Diagrams)

In this section, we would like to introduce our GUI for the project.

4.2.1. User initialization

Figure 3 - User Initialization GUI

The purpose of this window is to allow the user to control the system.

This means that the user is able to enter the initialization data into the system – (i) the direction vector values for each drone in the system, (ii) the preferred speed.
Using the buttons in the window, the user can define and change characteristics of the system:
Set up flying area - initialize the data for the flying zone of the flock.
Set up target - initialize the data for a moving target, if the user chooses this option.
Back home - order the drones to make their way back home.
Set up technician - initialize the necessary data for each type of drone.

In addition, the user will be able to track the progress of the flock through the "log screen".

4.2.2 Technician Initialization

Figure 4 - Technician Initialization GUI

This window is designed for the drone technician to insert the necessary data for each drone.
The data for each type of drone is different because of differences in drone production and shape.
The technician must enter the following details in order to activate the system: sensor frequency, time delay, T control, communication range, inner noise, outer noise, maximum acceleration, and then press the Initialize button. If the technician wants to return to the User Initialization page, the technician needs to click the Cancel button.

4.2.3 Area Initialization

Figure 5 - Area Initialization GUI

This window will be used when the user selects the option in which the flock's purpose is to reach a specific area.

The user must enter the characteristics of the area that the flock needs to reach: X Com, R Com, relaxation length, C-shill, C-frict, repulsion strength, side length, wall width, and then press the Initialize button. If the user wants to return to the User Initialization page, the user needs to click the Cancel button.

4.2.4 Target Initialization

Figure 6 - Target Area Initialization GUI


This window will be use when the user selects the option that the flock's purpose is to follow a
moving target.

The user must enter the target characteristics: X Com, R Com and then press on the initialize button. If
the user wants to return to the user Initialization page the technician needs to click the cancel button.

4.3 Design (UML Diagrams)

This project is based on Java.

This selection is based on the following reasons:

- Java is an object-oriented language, which enables writing modular and reusable programs.

- Java is a multithreaded language, which is required for our system.

The system is designed according to the three-layer model, in which the application layer is the most significant, since the drones are autonomous and are not managed by a "super-computer" or a flock leader.

The combination of the three-layer model with Java is optimal for the system.
[Figure 7 shows the complete UML class diagram, drawn with Visual Paradigm (Ort Braude College). It contains the control classes Main, moveDrownThread, FlockCT, AreaCT and viewsController; the entity classes droneET, AreaET, FlockET, GpsET, TargetET and MathVector; and the boundary classes UserInitializationUI, AreaInitializationUI, technicianInitializationUI, TargetInitializationUI and HELPUI.]

Figure 7 - UML Diagram

Presentation Tier

In our system the top layer allows the user and technician to insert parameters easily.

[Figure 8 shows the presentation tier of the class diagram: the boundary classes UserInitializationUI, AreaInitializationUI, technicianInitializationUI, TargetInitializationUI and HELPUI, whose fields are the JTextfield, JLabel and JButton components of each window.]

Figure 8 - Presentation Tier - UML Diagram

Application Tier

The middle layer controls the application functionality by performing the detailed processing.

This layer is the most critical, as it controls the autonomous behavior of the drones.

[Figure 9 shows the application tier of the class diagram: the control classes Main (holding the area, target and flock entities), moveDrownThread (stopDrowns, ViTrg, vectorViTrg, moveAreaCenter, inRange), FlockCT (calcFlockArea), viewsController and AreaCT (calcViSpp, calcVijFrict, calcVijRep, checkLocation, sFunc, calcViShill, calcVi, calcVi2).]

Figure 9 - Application Tier - UML Diagram

Data Tier

This tier includes the management of the database servers. The data in this tier is independent of the application servers.

[Figure 10 shows the data tier of the class diagram: the entity classes droneET, AreaET, FlockET, GpsET, TargetET and MathVector, which hold the drones' state, the flight and target areas, the flock parameters and the vector arithmetic operations (add, sub, scale, normalize, etc.) used by the algorithm.]

Figure 10 - Data Tier - UML Diagram

5. Testing
Testing for the user initialization:
Test ID | Description | Expected results | Actual results | Comments
User Initialization 1 | The user enters the relevant properties and clicks the "Start" button. The technician and area/target properties were already entered. | The server gets the properties and sets them in the system. | pass | The system starts and the drone flock starts its way to the target.
User Initialization 2 | The user did not enter all the relevant properties and clicked the "Start" button. | The system pops a message – "Please fill all fields." and returns to the User Initialization window. | pass |

Table 7 - Testing for the user initialization GUI.

Testing for the technician initialization:


Test ID | Description | Expected results | Actual results | Comments
Technician Initialization 1 | The technician enters the relevant properties and clicks the "Initial" button. | The server gets the properties and sets them in the system. | pass | The system returns to the User Initialization window.
Technician Initialization 2 | The technician did not enter all the relevant properties and clicked the "Initial" button. | The system pops a message – "Please fill all fields." and returns to the Technician Initialization window. | pass | The system cannot start the drones because of missing properties.

Table 8 - Testing for the technician initialization GUI.

Testing for the area initialization:

Test ID | Description | Expected results | Actual results | Comments
Area Initialization 1 | The user enters the relevant properties of the target area and clicks the "Initial" button. | The server gets the properties and sets them in the system. | pass | The system returns to the User Initialization window.
Area Initialization 2 | The user did not enter all the relevant properties of the target area and clicked the "Initial" button. | The system pops a message – "Please fill all fields." and returns to the Area Initialization window. | pass | The system cannot start the drones because of missing properties (if this option is chosen).

Table 9 - Testing for the area initialization GUI.

Testing for the target initialization:
Test ID | Description | Expected results | Actual results | Comments
Target Initialization 1 | The user enters the relevant properties of the moving target and clicks the "Initial" button. | The server gets the properties and sets them in the system. | pass | The system returns to the User Initialization window.
Target Initialization 2 | The user did not enter all the relevant properties of the moving target and clicked the "Initial" button. | The system pops a message – "Please fill all fields." and returns to the Target Initialization window. | pass | The system cannot start the drones because of missing properties (if this option is chosen).

Table 10 - Testing for the target initialization GUI.

6. RESULTS AND CONCLUSIONS
6.1 Results

First, we would like to clarify that we had some difficulties performing the calculations for a square-shaped arena, so after consultation with our supervisor we decided to work with a ring-shaped arena.

This simplified the calculation of the location of each drone relative to the defined flight area.

Additionally, because of hardware limitations (we worked with home computers that are not built to run a simulator and the algorithm software together), the software and the calculations ran very slowly, and the simulator (V-REP) failed to work with more than three threads without collapsing. Consequently, in contrast with the article we are based on, the scale of our values is much smaller, reflecting the scale difference between the simulator and the real world.

In the sections below, we show the results of our experiments and compare them with the expected results.

Experiment 1

In this experiment, we test different values of the C-Frict parameter:

Test 1:

In this test C-Frict = 0.2:

Rest of the parameters:


C-Shill = 0.33
R-0 = 1
R target = 7
Communication range = 0.88
V-Flock = 0.33
R-CoM = 6

At the start of the simulation, we can see the flock members approaching each other too closely and making contact with each other.

In this capture, we can see that the drones are balanced and keep their distance, flying together to their target:

When the drones reach the target, we can see they take positions not too close to the rest of the flock members.

Time to target: 3:50

Collisions: none, but with contact.

Test 2:

In this test C-Frict = 0.33:

Rest of the parameters:


C-Shill = 0.33
R-0 = 1
R target = 7
Communication range = 0.88
V-Flock = 0.33
R-CoM = 6

At the start of the simulation, we can see the flock members approaching each other, but not close enough to cause collisions.

In this capture, we can see that the drones keep their distance, flying together in formation to their target:

When the drones reach the target, we can see every drone take its position in a perfect formation.

Time to target: 4:10

Collisions: none.

Experiment 2

In this experiment, we test different values of the C-Shill parameter:

Test 1:

In this test C-Shill = 0.15.

Rest of the parameters:

C-Frict = 0.33
R-0 = 1
R target = 7
Communication range = 0.88
V-Flock = 0.33
R-CoM = 6
At the start of the simulation, we can see that it takes time for the drones to take their places on the way to the target.

In this capture, we can see that the drones keep their distance and take their places in the formation.

When the drones reach the target, we can see every drone take its position in a slightly deformed formation.

Time to target: 3:20

Collisions: none.

Test 2:

In this test C-Shill = 0.33:

Rest of the parameters:


C-Frict = 0.33
R-0 = 1
R target = 7
Communication range = 0.88
V-Flock = 0.33
R-CoM = 6

At the start of the simulation, we can see the flock members approaching each other at a distance that avoids collisions.

In this capture, we can see that the drones keep their distance and take their places in a perfect formation.

When the drones reach the target, we can see every drone take its position in a perfect formation.

Time to target: 3:55

Collisions: none.

Experiment 3

In this experiment, we test different values of the R-0 parameter:

Test 1:

In this test R-0 = 1:

Rest of the parameters:


C-Frict = 0.33
C-Shill = 0.33
R target = 7
Communication range = 0.88
V-Flock = 0.33
R-CoM = 6

At the start of the simulation, we can see that the drones first create a formation before starting their way to the target.

In this capture, we can see that the drones keep their distance and take their places in a perfect formation.

When the drones reach the target, we can see every drone take its position in a straight formation.

Time to target: 3:40

Collisions: none.

Test 2:

In this test R-0 = 3.

Rest of the parameters:


C-Frict = 0.33
C-Shill = 0.33
R target = 7
Communication range = 0.88
V-Flock = 0.33
R-CoM = 6

At the start of the simulation, we can see the flock members approaching each other too closely and making contact with each other.

In this capture, we can see that the drones are not balanced and do not keep their distance; they are too close, with a lot of contact between them.

When the drones reach the target, we can see every drone take its position in a straight formation, at a proper distance.

Time to target: 5:05

Collisions: none, but with a lot of contact between the drones.

6.2 Conclusions

At the experiment stage, we ran many experiments with different algorithm parameters, so that we could better understand how the algorithm responds to various variable values.

According to the first experiment, increasing the C-Frict parameter (from 0.2 to 0.33) favorably affects the distance kept between the flock members. The improved distance keeping is clearly reflected in the results, at the cost of a slightly longer time to reach the destination (4:10 compared to 3:50).

From experiment 2, we can see that if we decrease the C-Shill parameter by more than half, the movements become sharper and more unpredictable. Because of these movements, there is a greater chance of contact or collisions between the flock members.

We learn from the last experiment that when the R-0 variable is larger, the stability of the drones in their flight area is much less predictable, yet the drones stay in their flight area even though they make contact with other drones in the flock.

In conclusion, we obtained the expected results from the different experiments with different values of the various variables. We can see that our project reached the results we were aiming for.

REFERENCES
[1] Csaba Virágh, Gábor Vásárhelyi, Norbert Tarcai, Tamás Szörényi, Gergő Somorjai, Tamás Nepusz, Tamás Vicsek - Flocking algorithm for autonomous flying robots - ELTE Department of Biological Physics, 1117 Budapest, Pázmány Péter Sétány.
[2] NVIDIA Corporation, UAVs & Autonomous Drones, © 2017 [online] - http://www.nvidia.com/object/uavs-drones-technology.html
[3] Dario Floreano & Robert J. Wood - Science, technology and the future of small autonomous drones - Nature, Vol. 521, 2015, Macmillan Publishers Limited.
[4] Hexo+, © 2017, by Squadrone System [online] - https://hexoplus.com/
[5] Intel Corporation, aerial technology [online] - https://www.intel.com/content/www/us/en/technology-innovation/aerial-technology-overview.html
[6] Wikipedia, PID controller [online] - https://en.wikipedia.org/wiki/PID_controller

