
EED 498 Major Project 2

Drone for painting of High Rise Buildings

Project report submitted in partial fulfilment of the requirement for the degree of
Bachelor of Technology

Submitted by
Mayank Gupta (1510110220)
Pranav Reddy M (1510110209)

Under Supervision of
Mr. Aakash Sinha
Department of Electrical Engineering

Department of Electrical Engineering


School of Engineering
Candidate Declaration

We hereby declare that the thesis entitled “Drone for painting of High Rise Buildings” is
submitted in partial fulfilment of the Bachelor of Technology degree program. This thesis has
been written in our own words, and we have adequately cited and referenced the original sources.

Mayank Gupta
(1510110220)
Date:

Pranav Reddy M
(1510110209)
Date:
CERTIFICATE

It is certified that the work contained in the project report titled “Drone for painting of high
rise buildings,” by “Mayank Gupta” and “Pranav Reddy” has been carried out under our
supervision and that this work has not been submitted elsewhere for a degree.

(Signature of Supervisor)
Mr. Aakash Sinha
Department of Electrical Engineering
School of Engineering
Shiv Nadar University
May 2019
Abstract

Painting is the process of applying paint to the surface of an object. It is a time-consuming and
labour-intensive process, and it is also hazardous and exhausting, which makes it an excellent
candidate for automation. There is a strong need for automation in painting, as it reduces human
effort, maintains consistency and helps in achieving a uniform finish.

In this project, the design of an automated painting system is described. The painting drone
targets users who prioritize safety as well as time. The design objectives are simplicity, easy
handling, low cost, reduced human effort and consistent painting. The system consists of a UAV
tethered to the ground by water pipes that carry water up to a height of 3 m and spray it at a
rate of roughly one litre every 3 minutes.
Contents

1. Introduction ................................................................................................................ 01
2. Literature Review ....................................................................................................... 03
2.1 Worker Bee ....................................................................................................................... 03
2.2 Pictobot ...................................................................................................................... 03
3. Mechanism .................................................................................................................. 05
3.1 Spraying Mechanism .................................................................................................... 05
3.1.1 Paint Spraying ..................................................................................................... 05
3.1.2 Submersible Pump .............................................................................................. 06
3.2 Drone Interfacing (Pipes) ................................................... 08
4. Drone Structure .......................................................................................................... 09
4.1 Components of the Drone ............................................................................................. 09
4.1.1 Frame .................................................................................................................. 09
4.1.2 Motors ................................................................................................................. 09
4.1.3 Electronic Speed Controller ................................................................................ 10
4.1.4 Communication ................................................................................................... 11
4.1.5 Flight Controller (Pixhawk) ................................................................. 11
4.1.6 Radio Communication ........................................................................................ 12
5. Implementation ........................................................................................................... 13
5.1 Methodology ................................................................................................. 13
5.2 Interfacing the Drone Software: Mission Planner ..................................................... 15

6. Circuit Diagram .......................................................................................................... 18


7. Wall Crack Detection ................................................................................................. 19
7.1 Sobel Algorithm ........................................................................................... 19
7.2 Canny Edge Detection .............................................................................................. 20
7.2.1 Noise Reduction .................................................................................................. 20
7.2.2 Finding Intensity Gradient of the Image ............................................................. 21
8. Conclusion ................................................................................................................... 24
8.1 Applications ................................................................................................. 24
8.2 Final Goals ................................................................................................................ 24
8.3 Results.......................................................................................................... 24
8.4 Challenges .................................................................................................... 24
8.5 Future Prospects ............................................................................................ 24
9. References.................................................................................................................... 25
10. Acknowledgment ........................................................................................................ 27
A. Appendix......................................................................................................... 28
F. Figures ...........................................................................................................
2.1 Pictobot ...................................................................................................... 04
3.1 Water Spraying Mechanism .......................................................................... 05
3.2 Submersible Water Pump 40W ...................................................................... 07
3.3 Water for Spray Mechanism .......................................................................... 07
3.4 Sprayers interfaced with the drone ................................................................. 08
4.1 Frame ......................................................................................................... 09
4.2 Brushless Motors ......................................................................................... 10
4.3 ESC............................................................................................................ 10
4.4 2.4 GHz 6 Channel Receiver ........................................................ 11
4.5 Layout diagram of Pixhawk .......................................................................... 12
4.6 Remote Controller ....................................................................................... 12
5.1 Submersible Water pumps immersed in water ................................................. 13
5.2 Arduino, Relay Bluetooth and ultrasonic Circuit ............................................. 14
5.3 Drone with sprayer and water pumps ............................................................. 14
5.4 Checking Levels of the drone ........................................................................ 15
5.5 Motor Testing .............................................................................................. 16
6.1 Arduino Bluetooth and relay circuit diagram ................................................... 17
6.2 Arduino and Ultrasonic sensors circuit diagram .............................................. 18
7.1 Sobel Gx Gy................................................................................................ 19
7.2 |G| matrix .................................................................................................... 20
7.3 Raspberry Pi with Pi camera ......................................................................... 21
7.4 Before and after photos for (a) Multi (b) thin (c) thick crack using Sobel. .......... 22
7.5 After applying Canny Edge detection on the image through Raspberry Pi Cam .. 23
Chapter 1

Introduction
Rapid globalization and interconnectivity are a major driving force in creating new
opportunities, and society has to adopt new innovations to keep pace. Compared with only a few
years ago, technology and interconnectivity have transformed how communities work. Saving labour
and time are only two of the main advantages of automation; beyond them, it offers the
opportunity to reduce or eliminate human exposure to difficult and hazardous environments, and
to improve the quality of the work, which addresses many of the safety problems that arise when
several activities take place at the same time. When wall painting workers and robots are
properly integrated in building tasks, the whole painting process can be better managed, and
savings in labour and time follow as a consequence. These factors motivate the development of
a robotic painting system.

Painting is also a time-consuming task. Automation in this field would at least substitute
humans in applying the paint, saving valuable labour hours. It is well established that long
exposure to paints and varnishes damages human health, so automation also helps to remove this
hazard. The automated painting robot was therefore designed with the vision of facilitating easy
wall painting.

Besides, in India many accidents take place during the exterior painting of tall buildings.
Painting at height is difficult, and casualties occur because of loosened ropes or loss of
balance. To avoid such casualties, we focus our research on replacing the human with a robot in
the painting system.

Full-scale experiments reported in the research papers we reviewed suggest that robots can be
more cost-effective than manual work when highly autonomous robots are adopted. The feasibility
of highly autonomous robots is supported by a number of papers that document the reduction in
auxiliary labour.

Reasons which made us choose this Project
➢ Safety
In 2014, falls accounted for roughly 40 percent of all the deaths in construction, the
most dangerous line of work in the U.S., according to the Department of Labor.
➢ Lawsuits
When paint is not applied properly, paint companies and contractors can face lawsuits,
which brings considerable trouble.
➢ Time
Painting is mostly done from scaffolding, which is itself a tedious process: the painters
must set it up and then move it after each section of the wall is painted.

The final arrangement of this prototype is expected to perform fast and efficient painting. In
this project, painting was executed using a spray system with on/off control. The automation is
done through Bluetooth as well as an ultrasonic sensor: the spray can be switched manually over
Bluetooth, or automatically as soon as the drone comes close to the wall (less than 20 cm).
Painting exterior walls at height is very risky, so for now a prototype model is used to
demonstrate the painting process.

Chapter 2
Literature Review

Existing Technologies:

2.1 Worker Bee


The Worker Bee is a quadcopter connected to a mobile base station via an umbilical cord. The
drone has a spray wand to paint walls, while the base station houses the paint reservoir, air
compressor and power source. It can reach up to three stories high.
The drone uses LiDAR for 3-D scanning, optical and ultrasonic sensors for sub-millimeter
depth perception to determine distances to walls, and temperature and humidity sensors that can
help refine its painting strategies. It also uses short-wave infrared (SWIR) sensors to look for
invisible details such as structural weaknesses, which could help paint manufacturers avoid
lawsuits.

2.2 Pictobot
Pictobot consists of a mobile base with a robot arm mounted on an elevatable platform. At the
end of the arm are a spray nozzle, an optical camera that scans the workspace to calculate the
trajectory of the paint, and laser and ultrasound sensors for navigation, range-finding and
obstacle avoidance. The robot and its 2,000-psi air compressor can work for four hours on one
charge of its rechargeable lead-acid battery, and its paint tanks can hold 120 litres. It can
navigate on the ground and paint autonomously without human intervention.

As Pictobot operates on the ground, its main advantage is that gravity is not a concern, and
unlike the Worker Bee it does not need to worry about power supply. Its limitation is that it
can only be used indoors, and only up to ceiling height. These constraints also make the machine
cumbersome and heavy.

Figure 2.1: Pictobot

Further, in one of the papers we read, H. Anderson et al. (2002) presented an approach for
automatically spray painting families of unknown parts. It uses a sensing cell where the part
geometry is acquired. From the part geometry, process-relevant features are extracted, and
corresponding paint routines are found and grouped to obtain optimal painting trajectories.
Finally, a collision-free robot path and an executable robot program are generated; all steps
are fully automatic and no operator intervention is needed. First implementations at industrial
users show that the approach is feasible: parts can be scanned and robot programs generated
automatically at a rate of one part per minute using conventional PC technology. It is planned
to improve the FlexPaint process by using robot-mounted sensors to scan sections of parts that
are not visible to fixed sensors. A next extension may be methods for teaching features other
than the current geometric ones.

Chapter 3
Mechanism
We have used a spraying mechanism for this drone, as it is the best suited to this
design.
3.1 Spraying Mechanism
In this section we will discuss the spraying mechanism in terms of construction, pipes
and pumps.
3.1.1 Paint spray mechanism
Spray painting is a painting technique where a device sprays a coating through the air onto a
surface. The most common types employ compressed gas usually airs to atomize and direct the
paint particles. Spray guns evolved from airbrushes, and the two are usually distinguished by
their size and the size of the spray pattern they produce. Airbrushes are hand-held and used
instead of a brush for detailed work such as photo retouching, painting nails or fine art.
So here for high pressure flow, we have made use of a brass gardening nozzle as this creates
enough pressure for the water to be actually sprayed and this also covers the cracks in the wall.
The speed of the water spray is such that it on an average it can fill a 1-liter container in just a
little less than 3 minutes.
The nozzles are further connected to pipes having external diameter ¼ inch which are generally
used in the RO water Purifier.
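
As a rough check of the figures above, the short sketch below converts the measured fill time into a flow rate and estimates how long a given wall area would take; the coverage value (litres per square metre) is an assumed, illustrative number rather than a measurement.

# Rough flow-rate estimate for the nozzle described above (coverage value is assumed).
litres = 1.0                      # container filled during the test
minutes = 3.0                     # "a little less than 3 minutes", rounded up
flow_lpm = litres / minutes       # about 0.33 L/min
flow_mls = flow_lpm * 1000 / 60   # about 5.6 mL/s

coverage_l_per_m2 = 0.15          # assumed litres of liquid per square metre of wall (illustrative only)
wall_area_m2 = 10.0               # example wall area
time_min = wall_area_m2 * coverage_l_per_m2 / flow_lpm

print(f"Flow rate: {flow_lpm:.2f} L/min ({flow_mls:.1f} mL/s)")
print(f"Approximate time for {wall_area_m2} m^2: {time_min:.0f} min")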

Figure 3.1: The water spraying mechanism

3.1.2 Submersible Pumps

A submersible pump, also called an electric submersible pump, is a pump that can be fully
submerged in water. The motor is hermetically sealed and close-coupled to the body of the pump.
A submersible pump pushes water to the surface by converting rotary energy into kinetic energy
and then into pressure energy: water is pulled in at the intake, the rotating impeller pushes it
through the diffuser, and from there it travels up to the surface.

The major advantage to a submersible pump is that it never has to be primed, because it is
already submerged in the fluid. Submersible pumps are also very efficient because they don’t
really have to spend a lot of energy moving water into the pump. Water pressure pushes the
water into a submersible pump, thus “saving” a lot of the pump’s energy.

Also, while the pumps themselves aren’t versatile, the selection certainly is. Some submersible
pumps can easily handle solids, while some are better for liquids only. Submersible pumps are
quiet, because they are under water, and cavitation is never an issue, because there is no “spike”
in pressure as the water flows through the pump.

There are a few disadvantages with submersible pumps, and two have to do with the seal. The
seals can become corroded with time. When that happens, water seeps into the motor, rendering
it useless until it is repaired. Also, that seal makes the submersible pump a bit difficult to get
into for repairs.

Caution must be taken with submersible pumps; they must be fully submerged. The water
around a submersible pump actually helps to cool the motor. If it is used out of water, it can
overheat.

Fig 3.2: Submersible Water pumps 40W

Fig 3.3: Water for spray mechanism

3.2 Drone Interfacing (Pipes)
After the pumping and spray mechanisms are built, both are interfaced with the drone so that
it can reach high-rise buildings and paint their walls.

Fig 3.4 Sprayers interfaced with drone (Side View)

Chapter 4
Drone Structure

4.1 Components of the Drone


In this section we discuss the basic components as well as structure of the drone.

4.1.1 Frame
The frame is what keeps all the parts together. It has to be sturdy, but on the other hand it
also has to be light, so that the motors and the batteries do not struggle to keep it in the
air. It consists of a centre holding plate, arms and motor brackets.

Fig 4.1: Frame

4.1.2 Motors
The thrust that allows the quadcopter to get airborne is provided by brushless DC motors, each
of which is separately controlled by an electronic speed controller.
The key parameters to check when selecting motors are the thrust-to-weight ratio, motor
efficiency, pole count and KV rating; a simple check is sketched below.
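
As a hedged illustration of the thrust-to-weight check mentioned above, the sketch below uses assumed figures for the per-motor thrust and the all-up weight; the real values depend on the motors, propellers, battery and the pump/pipe payload, and a ratio of roughly 2:1 is a common rule of thumb for a quadcopter that must hover with some margin.

# Thrust-to-weight sanity check for a quadcopter (all values are assumed examples).
num_motors = 4
thrust_per_motor_g = 900       # assumed maximum thrust of one motor with its propeller, in grams
all_up_weight_g = 1800         # assumed weight of frame, battery, electronics and sprayer payload, in grams

total_thrust_g = num_motors * thrust_per_motor_g
tw_ratio = total_thrust_g / all_up_weight_g

print(f"Total thrust: {total_thrust_g} g, thrust-to-weight ratio: {tw_ratio:.1f}")
if tw_ratio < 2.0:
    print("Warning: below the ~2:1 margin usually recommended for a stable hover.")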

Fig 4.2: Brushless Motors

4.1.3 Electronic Speed Controller


An electronic speed controller (ESC) delivers the commands from the flight controller to the
motors. It regulates how much power each motor gets, which determines the speed and direction
changes of the quad. Each ESC has a battery input and a three-phase motor output, so four of
them are needed, one for each motor. We make sure the ESC can handle the current that comes
from the source by choosing a controller rated 10 A or higher (a quick check is sketched below).
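
The current rating mentioned above can be checked with a small calculation; the motor current below is an assumed example, and the 25% headroom is a common rule of thumb rather than a fixed requirement.

# ESC current-rating check (assumed example values).
motor_max_current_a = 8.0      # assumed peak current drawn by one motor at full throttle
headroom = 1.25                # ~25% safety margin over the peak draw

required_rating_a = motor_max_current_a * headroom
print(f"Choose an ESC rated for at least {required_rating_a:.0f} A per motor")
# For an 8 A motor this points to a 10 A ESC, consistent with the choice above.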

Fig 4.3: ESC

4.1.4 Communication
The drone has a 433 MHz air module for radio telemetry that supports speeds of up to 250 kbps.
In addition, it has a 2.4 GHz 6-channel receiver, which allows it to communicate on different
frequencies.

Fig 4.4: 2.4 GHz 6 Channel Receiver

4.1.5 Flight Controller (Pixhawk)


In this setup we have used a Pixhawk to fly the drone. The purpose of the Pixhawk is to connect
the various sensors, motors, radio, etc., and make them work in synchronization; it is also the
board on which the sensors are calibrated (covered in the next chapter).
The main advantages of the Pixhawk (PX4) are:
• PX4 is a powerful open-source autopilot flight stack.
• A wide choice of hardware for vehicle controllers, sensors and other peripherals.
• Flexible and powerful flight modes and safety features.

Figure 4.5: Layout diagram of Pixhawk

4.1.6 Radio Communication


A Radio Control (RC) system is used to manually control the vehicle. It consists of a remote
control unit that uses a transmitter to communicate stick/control positions to a receiver
mounted on the vehicle. Some RC systems can additionally receive telemetry information back from
the autopilot.

Fig 4.6: Remote Controller

Chapter 5
Implementation
5.1 Methodology
The steps involved in painting the wall are:
• Pumping the water up to the height of the wall using submersible water pumps.
• Spraying the pumped water onto the wall.
• Interfacing the spray mechanism with the drone.
• Wall crack detection.

Water is pumped using two 40 W submersible water pumps: one is switched using an ultrasonic
sensor and the other is controlled over Bluetooth from a mobile phone.
Since the water pump outlet is sized for a thicker pipe, we used M-Seal to fit an RO pipe into
the same holder. This gave a better flow: the water was able to travel through the 3 metre long
pipe and still gain enough pressure to be sprayed on the wall. A rough estimate of the pressure
needed for this lift is sketched below.
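
As a quick estimate of what the pump must achieve, the sketch below computes the static pressure needed just to lift water through the 3 m pipe, using the standard hydrostatic relation p = rho * g * h; friction in the pipe and the pressure drop across the nozzle are ignored, so the real requirement is higher.

# Minimum static pressure needed to lift water to the top of the 3 m pipe.
rho = 1000.0   # density of water, kg/m^3
g = 9.81       # gravitational acceleration, m/s^2
h = 3.0        # lift height from the text, in metres

pressure_pa = rho * g * h
print(f"Static head pressure: {pressure_pa / 1000:.1f} kPa ({pressure_pa / 100000:.2f} bar)")
# Roughly 29 kPa (0.3 bar); the pump must supply more than this to also push water through the nozzle.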

Fig 5.1: Submersible water pumps immersed in water

Fig 5.2 Arduino, relay, Bluetooth and ultrasonic circuit

Fig 5.3: Drone with sprayer and water pumps

5.2 Interfacing the Drone Software: Mission Planner
Mission Planner is a ground control station for Plane, Copter and Rover. It is compatible with
Windows only. Mission Planner can be used as a configuration utility or as a dynamic control
supplement: it is used to configure the motors, accelerometers and GPS, and to calibrate roll,
pitch and yaw, so that the drone can carry out a stable take-off, flight and landing.

Fig 5.4: Checking the level of the drone

Fig 5.5: Radio Calibration for the Remote

Here are a few things you can do with Mission Planner:


• Load the firmware (the software) into the autopilot board (i.e. Pixhawk series) that
controls your vehicle.
• Setup, configure, and tune your vehicle for optimum performance.
• Plan, save and load autonomous missions into your autopilot with simple point-and-
click way-point entry on Google or other maps.
• Download and analyze mission logs created by your autopilot.
• Interface with a PC flight simulator to create a full hardware-in-the-loop UAV
simulator.
With appropriate telemetry hardware you can:
o Monitor your vehicle’s status while in operation.
o Record telemetry logs, which contain much more information than the on-board autopilot
logs.
o View and analyze the telemetry logs.
o Operate your vehicle in FPV (first person view).

Fig 5.6: Motor Testing

Chapter 6
Circuit Diagrams

Fig 6.1 Arduino Bluetooth and relay circuit diagram

Fig 6.2 Arduino and Ultrasonic sensors circuit diagram

Chapter 7

Wall Crack Detection

In many cases it has been observed that paint companies face lawsuits due to improper
application of paint. One such case is cracks: when paint is applied improperly over cracks, or
the cracks themselves are neglected, a lot of issues follow.
Until the mid-semester evaluation, since we were trying to keep the computational load low (the
project is based on a Raspberry Pi 3 B+), we used the Sobel algorithm.
The algorithm is described below.

7.1 Sobel Algorithm

In theory at least, the operator consists of a pair of 3×3 convolution kernels, as shown in
Figure 7.1. One kernel is simply the other rotated by 90°.

Fig 7.1: Sobel Gx Gy

These kernels are designed to respond maximally to edges running vertically and horizontally
relative to the pixel grid, one kernel for each of the two perpendicular orientations. The
kernels can be applied separately to the input image, to produce separate measurements of the
gradient component in each orientation (call these Gx and Gy). These can then be combined
together to find the absolute magnitude of the gradient at each point and the orientation of that
gradient. The gradient magnitude is given by:

|G| = √(Gx² + Gy²)

Typically, an approximate magnitude is computed using:

|G| = |Gx| + |Gy|

which is much faster to compute.

The angle of orientation of the edge (relative to the pixel grid) giving rise to the spatial
gradient is given by:

θ = arctan(Gy / Gx)

In this case, orientation 0 is taken to mean that the direction of maximum contrast from black
to white runs from left to right on the image, and other angles are measured anti-clockwise
from this.

Often, this absolute magnitude is the only output the user sees; the two components of the
gradient are conveniently computed and added in a single pass over the input image using the
pseudo-convolution operator shown in Figure 7.2.

Fig 7.2: |G| Matrix

Using this pseudo-convolution kernel, the approximate magnitude |G| = |Gx| + |Gy| is obtained
directly in a single pass over the image.
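
The calculation described in this section can be reproduced directly in Python with OpenCV: the sketch below applies the two 3×3 Sobel kernels with cv2.filter2D and combines the results into the approximate magnitude |Gx| + |Gy|. The image file name is a placeholder, not one used elsewhere in the project.

# Sketch: explicit Sobel kernels applied with cv2.filter2D, combined as |G| = |Gx| + |Gy|.
import cv2
import numpy as np

gray = cv2.imread('wall.jpg', cv2.IMREAD_GRAYSCALE).astype(np.float32)  # 'wall.jpg' is a placeholder

kx = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=np.float32)   # responds to vertical edges (Gx)
ky = kx.T                                       # transposed to respond to horizontal edges (Gy)

gx = cv2.filter2D(gray, cv2.CV_32F, kx)
gy = cv2.filter2D(gray, cv2.CV_32F, ky)

magnitude = np.abs(gx) + np.abs(gy)             # approximate |G|, faster than sqrt(gx**2 + gy**2)
magnitude = cv2.convertScaleAbs(magnitude)      # scale back to 8-bit for saving/display

cv2.imwrite('sobel_magnitude.jpg', magnitude)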

7.2 Canny Edge Detection


The Canny algorithm is not a single algorithm but a stack of several stages. Its gradient stage
may use any of several kernels (Prewitt, Sobel, etc.); for our purpose, we have used Sobel.
7.2.1 Noise Reduction
Since edge detection is susceptible to noise in the image, the first step is to smooth the image
with a 5×5 Gaussian filter.

7.2.2 Finding Intensity Gradient of the Image
These steps are similar to the ones we followed with the Sobel operator above. The smoothed
image is filtered with a Sobel kernel in both the horizontal and vertical directions to get the
first derivatives in the horizontal direction (Gx) and the vertical direction (Gy). From these
two images, we can find the edge gradient and direction for each pixel as follows:

G = √(Gx² + Gy²)
θ = arctan(Gy / Gx)

The gradient direction is always perpendicular to the edges. It is rounded to one of four angles
representing the vertical, horizontal and two diagonal directions.
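
The first two Canny stages described above, 5×5 Gaussian smoothing followed by the Sobel gradient magnitude and direction, can be written out explicitly as in the sketch below; in practice cv2.Canny performs all of the stages internally (as in the appendix code), so this is only meant to illustrate the intermediate quantities. The file name is a placeholder.

# Sketch of the first two Canny stages: noise reduction and the intensity gradient.
import cv2
import numpy as np

gray = cv2.imread('wall.jpg', cv2.IMREAD_GRAYSCALE)      # 'wall.jpg' is a placeholder

blurred = cv2.GaussianBlur(gray, (5, 5), 0)              # step 1: 5x5 Gaussian noise reduction

gx = cv2.Sobel(blurred, cv2.CV_32F, 1, 0, ksize=3)       # step 2: first derivative in the horizontal direction (Gx)
gy = cv2.Sobel(blurred, cv2.CV_32F, 0, 1, ksize=3)       # first derivative in the vertical direction (Gy)

magnitude = np.sqrt(gx ** 2 + gy ** 2)                   # edge gradient G
angle = np.degrees(np.arctan2(gy, gx))                   # gradient direction, later rounded to 0/45/90/135 degrees

# The remaining stages (non-maximum suppression and hysteresis thresholding) are what cv2.Canny adds:
edges = cv2.Canny(blurred, 100, 200)
cv2.imwrite('canny_edges.jpg', edges)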

Fig 7.3: Raspberry Pi with Pi camera

Figure 7.4: Before and after photos for (a) Multi (b) thin (c) thick crack using Sobel.

Figure 7.5: After applying Canny Edge detection on the image through Raspberry Pi Cam

Chapter 8
Conclusion
8.1 Applications
Home and Industry

8.2 Final Goals


1) Improving the design of the spray nozzle and testing it with the umbilical cord mechanism.
2) Using the output of Canny edge detection to classify whether a photo contains a crack or not;
a possible starting point is sketched below.
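
One possible starting point for goal 2, under the assumption that a cracked wall produces noticeably more Canny edge pixels than a plain painted one, is sketched below; the threshold is an assumed value that would have to be tuned on real photographs, and a learned classifier could replace this heuristic later.

# Minimal sketch: flag a photo as containing a crack from the density of Canny edge pixels.
import cv2
import numpy as np

def has_crack(image_path, edge_fraction_threshold=0.02):
    # The threshold is an assumed, tunable value, not a measured one.
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 100, 200)
    edge_fraction = np.count_nonzero(edges) / edges.size
    return edge_fraction > edge_fraction_threshold

print(has_crack('picture.jpg'))   # 'picture.jpg' as captured by the Pi camera code in the appendix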

8.3 Results
• Spray painting successfully implemented
• Pumping mechanism implemented
• Wall crack detection using the Canny algorithm

8.4 Challenges
➢ If a spray nozzle is too close to a surface, the paint bounces off, but if it is too far, the
paint atomizes and not enough of it reaches the surface.
➢ The drone has a flight time of only about 20 minutes on a full charge.

8.5 Future Prospects


• The design can also be used for a firefighting drone.
• If the water source were located at a height instead of on the ground, the flow of water
would be easier and, after the required changes, the drone would be easier to maneuver.

References
[1] “Set Up Of A Robotized System For Interior Wall Painting”, B. Naticchia, A. Giretti, A.
Carbonari, ISARC2006.
[2] “Experiencing Computer Integrated Construction”, J. Constr. Eng. Manage., 119(2), 307–
322.
[3] “Construction Supply Chain Management Handbook”, Yamazaki Y. and Maeda J., 1998.
[4] “Balancing Human-and-Robot Integration in Building Tasks”, Kahane, B. and Rosenfeld,
Y., 2004.
[5] “Robot for Interior Finishing Works in Building — Feasibility Analysis”, Warszawsky, A.
and Rosenfeld, Y. 1994.
[6] “A Construction Robot for Autonomous Plastering of Walls and Ceilings”, Proceedings of
the 14th ISARC, Forsberg, J., Aarenstrup, R., Wernersson, A., 1997.
[7] “High tractive power wall-climbing robot”, Automation in Construction (1995), Bach, Fr.
W., Rachkov, M., Seevers, J. and Hahn, M., 1995.
[8] “Behavior-based artificial intelligence in miniature mobile robot”, Volume 9, Number 2,
Yilma, B., Seif, M. A. 1999
[9] “Autonomous stair-climbing with miniature jumping robots”, Stoeter, S. A., 2005.
[10] Jayadev Misra, A walk over the shortest path: Dijkstra's algorithm viewed as fixed-point
computation, Information Processing Letters 77, 2001.
[11] R. C. Gonzales and R. E. Woods, Digital Image Processing, Prentice Hall Press, USA, 2002.
[12] S. M. S. Elattar, Automation and robotics in construction: opportunities and challenges,
Emirates Journal for Engineering Research, Vol. 13(2), pp. 21-26, 2008.
[13] B. Naticchia, A. Giretti, A. Carbonari, Set up of a robotized system for interior wall
painting, Proceedings of the 23rd ISARC, October 3-5, Tokyo, Japan, 2006.
[14] Johan Forsberg, Roger Aarenstrup, Åke Wernersson, A Construction Robot for Autonomous
Plastering of Walls and Ceilings, Vol. 6, 2000.
[15] Jayshree Sahu, S. K. Sahu, Jayendra Kumar, Microcontroller Based DC Motor Control,
International Journal of Engineering Research & Technology (IJERT), Vol. 1, Issue 3, May 2012.
[16] A. Warszawsky, Y. Rosenfeld: “Robot for interior finishing works in building: feasibility
analysis,” ASCE Journal of Construction Engineering and Management, vol. 120(1), pp. 132, 1994.
[17] B. Kahane, Y. Rosenfeld: “Balancing human-and-robot integration in building tasks,”
Computer-Aided Civil and Infrastructure Engineering, vol. 19, pp. 393-410, 2004.
[18] B. Naticchia, A. Giretti, A. Carbonari: “… color system for interior wall painting,”
Advanced Robotic Systems, vol. 4.
[19] M. De Grassi, B. Naticchia, A. Giretti et al.: “… an automatic four color spraying device
carried by a robot,” ISARC, December 17-19, Sendai, Japan, 2007.
[20] B. Naticchia, A. Giretti, A. Carbonari: “Set up of a robotized system for interior wall
painting,” Proceedings of the 23rd ISARC, Tokyo, Japan, 2006.
[21] I. Aris, A. K. Parvez Iqbal, A. R. Ramli, S. Shams…: “… development of a program…
buildings,” Jurnal Teknologi, Universiti Teknologi Malaysia, pp. 27-48, June 2005.
[22] M. W. Spong, S. Hutchinson, M. Vidyasagar: Robot Modeling and Control, John Wiley & Sons,
Inc., 2006, pp. 259-262.
[23] B. Siciliano, L. Sciavicco, L. Villani, G. Oriolo: Robotics: Modelling, Planning and
Control, Springer.
[24] J. Canny, “A Computational Approach to Edge Detection,” IEEE Transactions on Pattern
Analysis and Machine Intelligence, vol. PAMI-8, no. 6, pp. 679-698, Nov. 1986.
doi: 10.1109/TPAMI.1986.4767851

Acknowledgments

Our progress, in spite of the challenges, would not have been possible without the support,
guidance and invaluable feedback of our project mentor, Mr. Aakash Sinha.
In addition, we would also like to thank Wakeel Sir (Embedded Lab) and Ashwini Sir (Project
Lab) for helping us with components.

Appendix

Pi code for Wall crack detection


#!/usr/bin/env python
from picamera import PiCamera
from time import sleep
import cv2
import numpy as np

camera = PiCamera()
camera.start_preview()
sleep(1)
camera.capture('picture.jpg')
camera.stop_preview()

# Load the captured image, convert it to grayscale, and run Canny edge detection
img = cv2.imread('picture.jpg')
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
img_canny = cv2.Canny(gray, 100, 200)

cv2.imshow("Original Image", img)


cv2.imshow("Canny", img_canny)

cv2.waitKey(0)
cv2.destroyAllWindows()

Arduino Code for Bluetooth and Ultra-sonic


char junk;
String inputString="";
int trigger_pin = 2;

int echo_pin = 3;

int relay_pin = 12;

long time;      // echo pulse duration in microseconds

int distance;   // measured distance in cm

void setup()
{
Serial.begin(9600);
pinMode(13, OUTPUT);              // pin driven by the Bluetooth commands ('a'/'b')
pinMode(trigger_pin, OUTPUT);
pinMode(echo_pin, INPUT);
pinMode(relay_pin, OUTPUT);
}

void loop()
{
digitalWrite (trigger_pin, HIGH);
delayMicroseconds (10);
digitalWrite (trigger_pin, LOW);
time = pulseIn(echo_pin, HIGH);   // echo pulse width in microseconds
distance = (time * 0.034) / 2;    // speed of sound ~0.034 cm/us, halved for the round trip
if (distance >= 40)
{
Serial.println(distance);
digitalWrite(relay_pin, HIGH);    // drive the relay one way when the wall is far
delay(500);
}
else
{
Serial.println(distance);
digitalWrite(relay_pin, LOW);     // and the other way when the wall is within range
delay(500);
}

if(Serial.available()){
while(Serial.available())
{

char inChar = (char)Serial.read();


inputString += inChar;
}
Serial.println(inputString);
while (Serial.available() > 0)
{ junk = Serial.read() ;
} // clear the serial buffer
if (inputString == "a") {        // Relay On
digitalWrite(13, LOW);
}
else if (inputString == "b") {   // Relay Off
digitalWrite(13, HIGH);
}
inputString = "";
}
}

Mid-Semester Code:
MATLAB Code for Sobel.

Photo_Crack=imread('C:\Users\HP LAPTOP\Desktop\crack_photos\multi.jpg');
imshow(Photo_Crack);
gray = rgb2gray(Photo_Crack); % Converted Image to grayscale for a clearer view
BW = im2bw(gray,0.5);
imshow(BW),title('Original Image');

[~, threshold] = edge(BW, 'sobel');


fudgeFactor = 0.5;
Binary_Gradient = edge(BW,'sobel', threshold * fudgeFactor);
figure, imshow(Binary_Gradient), title('Sobel Edge detection'); % Here we obtain the binary gradient mask using the Sobel technique
% We get a rough outline

X_SE = strel('line', 1, 90); % Line structuring elements to dilate the mask, with length 1 and angles 90 and 0 for the y and x axes
Y_SE = strel('line', 1, 0);
Dilated_Gradient = imdilate(Binary_Gradient, [X_SE Y_SE]); % superimposing across the 2 axes
figure, imshow(Dilated_Gradient), title('The Dilated Gradient Mask');
% Here we can see the difference more clearly as the outline is broader but
% there are still gaps

BWdfill = imfill(Dilated_Gradient, 'holes');


figure, imshow(BWdfill);
title('Filled Image');
% Here we obtain a continuous line and the crack is separated
Binaryoutline = bwperim(BWdfill);
Segout = BW;
Segout(Binaryoutline) = 255;
figure, imshow(Segout), title('Crack Outlined');

figure;
imshowpair(Photo_Crack, Segout, 'montage');
title('Before and After');

