
GESTURE CONTROL OF INDUSTRIAL ROBOT

A PROJECT REPORT

Submitted By:

PIYUSHSINGH P KSHATRIYA (11EE043)

In partial fulfillment for the award of the degree
of
Bachelor of Technology
in
Electrical Engineering

CHANDUBHAI S PATEL INSTITUTE OF TECHNOLOGY,
CHARUSAT,
CHANGA, ANAND, GUJARAT, 388421

M & V PATEL DEPARTMENT OF ELECTRICAL ENGINEERING



Gesture Controlled Industrial Robot

M&V Patel Department of Electrical


Engineering
C.S. Patel Institute of Technology
CHARUSAT, Changa

DECLARATION

I hereby declare that the Project Report for the project entitled
“GESTURE CONTROL OF INDUSTRIAL ROBOT”
submitted in partial fulfillment for the degree of Bachelor of Technology
in Electrical Engineering to CHARUSAT UNIVERSITY,
Changa, is a bona fide record of the project work carried out at
Chandubhai S Patel Institute of Technology under the
supervision of Asst. Prof. Ankit Prajapati.

Name of the Student                              Sign of Student

Piyushsingh P Kshatriya

CERTIFICATE

This is to certify that the project entitled GESTURE CONTROL OF INDUSTRIAL
ROBOT has been carried out by PIYUSHSINGH P KSHATRIYA (11EE043) under
my guidance in partial fulfillment for the degree of Bachelor of Technology in
Electrical Engineering (8th Semester) of CHARUSAT UNIVERSITY, Changa,
during the academic year 2014-15.

Mr. Ankit Prajapati                              Dr. Pragnesh Bhatt
Assistant Professor                              Professor & Head
M & V Patel Department of                        M & V Patel Department of
Electrical Engineering                           Electrical Engineering
C S Patel Institute of Technology                C S Patel Institute of Technology
CHARUSAT, Changa                                 CHARUSAT, Changa

ACKNOWLEDGEMENTS
This project involved the collection and analysis of information
from a wide variety of sources and the efforts of many people
beyond me. Thus it would not have been possible to achieve the
results reported in this document without their help, support and
encouragement.

I would like to express my gratitude to the following person for his
help in the work leading to this report:

 Asst. Prof. Ankit Prajapati, Project Supervisor/Guide: for his
useful comments on the subject matter and for the knowledge
I gained by sharing ideas with him.

Piyushsingh P Kshatriya (11EE043)          Sign: …………………...



ABSTRACT

Service robots directly interact with people, so finding a
more natural and easy user interface is of fundamental
importance. While earlier work has focused primarily
on issues such as manipulation and navigation in the
environment, few robotic systems are equipped with user-friendly
interfaces that allow the robot to be controlled by
natural means. To provide a feasible solution
to this requirement, we have implemented a system
through which the user can give commands to a wireless
robot using gestures. With this method, the user can
control or navigate the robot using gestures of his/her
palm, thereby interacting with the robotic system. The
command signals are generated from these gestures
using image processing. These signals are then passed to
the robot to navigate it in the specified directions.

TABLE OF CONTENTS

CERTIFICATE........................................................................................3
ACKNOWLEDGEMENTS.....................................................................4
ABSTRACT..............................................................................................5
TABLE OF CONTENTS..........................................................................6

CHAPTER 1: GESTURE TECHNOLOGY

1.1 Gesture Technology
1.2 Hand Gesture System
1.3 Wireless Technology

CHAPTER 2: ARDUINO BOARD WITH ATMEGA 328
MICROCONTROLLER

2.1 Introduction of Arduino board
2.2 Characteristics of Arduino board
2.3 Board description

CHAPTER 3: ABOUT LEAP MOTION & PROGRAMMING WITH LEAP MOTION


3.1 Background
3.2 Task and Approach
3.3 Problems with the Code and how it was fixed
3.4 Results
3.5 Potential Improvements

CHAPTER 4: BLOCK DIAGRAM AND WORK FLOW DIAGRAM


OF THE PROJECT

CHAPTER 5: LEAP MOTION SENSOR PROGRAMME

CHAPTER 6: SERVO MOTOR SPECIFICATIONS AND


PROGRAMMING

CHAPTER 7: REFERENCES

CHAPTER 1
GESTURE TECHNOLOGY

1.1 Gesture Technology


Gesture recognition is a topic in computer science and language
technology with the goal of interpreting human gestures via
mathematical algorithms. Gestures can originate from any bodily motion
or state but commonly originate from the face or hand. Current focuses in
the field include emotion recognition from the face and hand gesture
recognition. Many approaches have been made using cameras
and computer vision algorithms to interpret sign language. However, the
identification and recognition of posture, gait, proxemics, and human
behaviors is also the subject of gesture recognition techniques. Gesture
recognition can be seen as a way for computers to begin to understand
human body language, thus building a richer bridge between machines
and humans than primitive text user interfaces or even GUIs (graphical
user interfaces), which still limit the majority of input to keyboard and
mouse. Gesture recognition enables humans to interface with the machine
(HMI) and interact naturally without any mechanical devices. Using the
concept of gesture recognition, it is possible to point a finger at
the computer screen so that the cursor will move accordingly. This could
potentially make conventional input devices such
as mice, keyboards and even touch-screens redundant. Gesture
recognition can be conducted with techniques from computer
vision and image processing. The literature includes ongoing work in the
computer vision field on capturing gestures, or more general human pose
and movements, by cameras connected to a computer. In computer
interfaces, two types of gestures are distinguished:

Offline gestures: gestures that are processed after the user's interaction
with the object is finished; for example, a circle is drawn to activate a
context menu.

Online gestures: direct-manipulation gestures, used for example to
scale or rotate a tangible object.
1.2 Hand Gesture System

Fig no: 1.1 Gesture movement diagram

1.2.1 Applications of Gesture technology

Gesture recognition is useful for processing information from humans
that is not conveyed through speech or typing. There are
various types of gestures which can be identified by computers.
 Sign language recognition. Just as speech recognition can
transcribe speech to text, certain types of gesture recognition
software can transcribe the symbols represented through sign
language into text.
 For socially assistive robotics. By using proper sensors
(accelerometers and gyros) worn on the body of a patient and by
reading the values from those sensors, robots can assist in patient
rehabilitation. A prime example is stroke rehabilitation.
 Directional indication through pointing. Pointing has a very
specific purpose in our society, to reference an object or location
based on its position relative to ourselves. The use of gesture
recognition to determine where a person is pointing is useful for
identifying the context of statements or instructions. This
application is of particular interest in the field of robotics.
 Control through facial gestures. Controlling a computer through
facial gestures is a useful application of gesture recognition for
users who may not physically be able to use a mouse or
keyboard. Eye tracking in particular may be of use for controlling
cursor motion or focusing on elements of a display.
 Alternative computer interfaces. Foregoing the traditional
keyboard and mouse setup to interact with a computer, strong
gesture recognition could allow users to accomplish frequent or
common tasks using hand or face gestures to a camera.

 Immersive game technology. Gestures can be used to control


interactions within video games to try and make the game player's
experience more interactive or immersive.
 Virtual controllers. For systems where the act of finding or
acquiring a physical controller could require too much time,
gestures can be used as an alternative control mechanism.
Controlling secondary devices in a car or controlling a television
set are examples of such usage.

 Affective computing. In affective computing, gesture recognition


is used in the process of identifying emotional expression through
computer systems.
 Remote control. Through the use of gesture recognition, "remote
control with the wave of a hand" of various devices is possible.
The signal must not only indicate the desired response, but also
which device is to be controlled.

1.3 Wireless Technology


Wireless telecommunication is the transfer of information between two
or more points that are not physically connected. Distances can be short,
such as a few meters for a television remote control, or as far as thousands
or even millions of kilometers for deep-space radio communications. It
encompasses various types of fixed, mobile, and portable two-way
radios, cellular telephones, personal digital assistants (PDAs),
and wireless networking. Other examples of wireless
technology include GPS units, garage door openers,
wireless computer mice, keyboards and headsets, headphones,
radio receivers, satellite television, broadcast television and
cordless telephones. Wireless operations permit services, such as long-range
communications, that are impossible or impractical to implement
with the use of wires. The term is commonly used in the
telecommunications industry to refer to telecommunications systems (e.g.
radio transmitters and receivers, remote controls, computer networks,
network terminals, etc.) which use some form of energy (e.g. radio
frequency (RF), acoustic energy, etc.) to transfer information without the
use of wires. Information is transferred in this manner over both short and
long distances. Wireless networking (i.e. the various types of unlicensed
2.4 GHz Wi-Fi devices) is used to meet many needs. Perhaps the most
common use is to connect laptop users who travel from location to
location. Another common use is for mobile networks that connect via
satellite. A wireless transmission method is a logical choice to network a
LAN segment that must frequently change locations.
The following situations justify the use of wireless technology:
 To span a distance beyond the capabilities of typical cabling;
 To provide a backup communications link in case of normal
network failure;
 To link portable or temporary workstations;
 To overcome situations where normal cabling is difficult or
financially impractical;
 To remotely connect mobile users or networks.

1.3.1 Wireless System

Fig no: 1.2 Network connections in wireless system

1.3.2 Applications of wireless technology


Businesses succeed today because they are fast, not vast. Instead of
holding large stockpiles of materials and finished-goods inventory to
meet customer commitments, companies rely on fast information
exchange to drive responsive enterprise and supply-chain systems that
adjust to dynamic production, distribution and service needs. If
information is old, it's wrong. And when information is wrong, systems
stop, shipments are delayed, and service and productivity suffer. Wireless
technology has become essential for getting accurate, real-time
information when and where it's needed. Now companies are finding new
ways to use wireless to create a competitive advantage. They're
leveraging legacy wireless LANs to provide automated asset tracking and
to connect their workforces with wireless voice-over-IP (VoIP). Real-time
responsiveness is being extended beyond the four walls with GPS and
wide-area voice and data networks for dynamic dispatch and remote access
to enterprise information. Before starting a wireless project, make sure
your solutions provider is grounded in all the aspects required to make a
system successful. Many providers can hang access points and install
radio cards, but can't make the connection between wireless technology
and business value.

Wi-Fi
Wi-Fi is a wireless local area network technology that enables portable
computing devices to connect easily to the Internet. Standardized as IEEE
802.11 a/b/g/n, Wi-Fi approaches the speeds of some types of wired Ethernet.
Wi-Fi has become the de facto standard for access in private homes,
within offices, and at public hotspots. Some businesses charge customers
a monthly fee for service, while others have begun offering it for free in
an effort to increase the sales of their goods.

CHAPTER 2
ARDUINO BOARD WITH ATMEGA 328
MICROCONTROLLER

2.1 Introduction of Arduino board


Arduino is an open-source electronics prototyping platform based on
flexible, easy-to-use hardware and software. It is intended for artists,
designers, hobbyists, and anyone interested in creating interactive objects
or environments. Arduino can sense the environment by receiving input
from a variety of sensors and can affect its surroundings by controlling
lights, motors, and other actuators. The microcontroller on the board is
programmed using the Arduino programming language (based
on Wiring) and the Arduino development environment (based
on Processing). Arduino projects can be stand-alone, or they can
communicate with software running on a computer (e.g. Flash,
Processing, and Max/MSP). Arduino is a tool for making computers that can
sense and control more of the physical world than your desktop
computer: an open-source physical computing platform based on a
simple microcontroller board, and a development environment for writing
software for the board. Arduino can be used to develop interactive
objects, taking inputs from a variety of switches or sensors, and
controlling a variety of lights, motors, and other physical outputs.
The boards can be assembled by hand or purchased preassembled; the
open-source IDE can be downloaded for free. The Arduino programming
language is an implementation of Wiring, a similar physical computing
platform, which is in turn based on the Processing multimedia programming
environment.

2.2 Characteristics of Arduino board

a) Inexpensive
Arduino boards are relatively inexpensive compared to other
microcontroller platforms. The least expensive version of the
Arduino module can be assembled by hand, and even the pre-assembled
Arduino modules cost less than $50.

b) Cross-platform
The Arduino software runs on Windows, Macintosh OS X, and Linux
operating systems. Most microcontroller systems are limited to Windows.

c) Simple, clear programming environment
The Arduino programming environment is easy to use for
beginners, yet flexible enough for advanced users to take
advantage of as well. For teachers, it is conveniently based on the
Processing programming environment, so students learning to
program in that environment will be familiar with the look and feel
of Arduino.

d) Open source and extensible software
The Arduino software is published as open-source tools,
available for extension by experienced programmers. The language
can be expanded through C++ libraries, and people wanting to
understand the technical details can make the leap from Arduino to
the AVR C programming language on which it is based. Similarly,
you can add AVR-C code directly into your Arduino programs if
you want to.

e) Open source and extensible hardware
The Arduino is based on Atmel's ATMEGA8 and ATMEGA168
microcontrollers. The plans for the modules are published under a
Creative Commons license, so experienced circuit designers can
make their own version of the module, extending it and
improving it.

Fig no: 2.1 Simple Arduino board

2.3 Board description

The Arduino Uno is a microcontroller board based on the ATmega328. It
has 14 digital input/output pins (of which 6 can be used as PWM
outputs), 6 analog inputs, a 16 MHz crystal oscillator, a USB connection,
a power jack, an ICSP header, and a reset button. It contains everything
needed to support the microcontroller; simply connect it to a computer
with a USB cable or power it with an AC-to-DC adapter or battery to get
started. "Uno" means one in Italian, and the board is named to mark the
upcoming release of Arduino 1.0.

2.4 Pin configuration of ATmega 328



Fig no: 2.2 Pin diagram of Atmega328


2.5 Pin Descriptions

VCC Digital supply voltage

GND Ground

Port B (PB7) Port B is an 8-bit bi-directional I/O port with internal pull-
up resistors (selected for each bit). The Port B output buffers have
symmetrical drive characteristics with both high sink and source

capability. As inputs, Port B pins that are externally pulled low will
source current if the pull-up resistors are activated. The Port B pins are
tri-stated when a reset condition becomes active, even if the clock is not
running. Depending on the clock selection fuse settings, PB7 can be
used as output from the inverting Oscillator amplifier.

PB6 Depending on the clock selection fuse settings, PB6 can be used as
input to the inverting Oscillator amplifier and input to the internal clock
operating circuit.

Port C (PC5) Port C is a 7-bit bi-directional I/O port with internal pull-
up resistors (selected for each bit). The output buffers have symmetrical
drive characteristics with both high sink and source capability. As
inputs, Port C pins that are externally pulled low will source current if
the pull-up resistors are activated. The Port C pins are tri-stated when a
reset condition becomes active, even if the clock is not running.

PC6 Depending on the RSTDISBL fuse setting, PC6 is used either as an
I/O pin or as a Reset input. Note that the electrical characteristics of
PC6 differ from those of the other pins of Port C. When used as a Reset
input, a low level on this pin for longer than the minimum pulse length
will generate a Reset, even if the clock is not running.

Port D (PD7) Port D is an 8-bit bi-directional I/O port with internal pull-
up resistors (selected for each bit). The Port D output buffers have
symmetrical drive characteristics with both high sink and source
capability. As inputs, Port D pins that are externally pulled low will
source current if the pull-up resistors are activated. The Port D pins are
tri-stated when a reset condition becomes active, even if the clock is not
running.

AVcc AVCC is the supply voltage pin for the A/D Converter. It should be
externally connected to VCC, even if the ADC is not used. If the ADC is
used, it should be connected to VCC through a low-pass filter. Note that
PC6...4 use digital supply voltage.

AREF AREF is the analog reference pin for the A/D Converter.

2.6 Block Diagram

Fig no: 2.3 Block diagram of Arduino

2.7 Programming Environment of ATmega 328


Arduino programs can be divided into three main
parts: structure, values (variables and constants), and functions.
2.7.1 Structure

Setup()
The setup() function is called when a sketch starts. Use it to initialize
variables, pin modes, start using libraries, etc. The setup function will
only run once, after each power-up or reset of the Arduino board.

Loop()
After creating a setup() function, which initializes and sets
the initial values, the loop() function does precisely what its name
suggests, and loops consecutively, allowing your program to change and
respond. Use it to actively control the Arduino board.
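The setup()/loop() calling pattern can be emulated in plain C++ to make the control flow concrete. Here runSketch() is a stand-in for the Arduino core, which in reality calls setup() once and then loop() forever; the bounded pass count exists only so the illustration terminates:

```cpp
int iterations = 0;  // state preserved across loop() calls, like a sketch's globals

// Runs once at power-up or reset: initialise state here.
void setup() {
    iterations = 0;
}

// Called repeatedly by the runtime after setup() returns.
void loop() {
    ++iterations;
}

// The Arduino runtime effectively does: setup(); while (true) loop();
// Here we bound the loop so the emulation terminates.
int runSketch(int passes) {
    setup();
    for (int i = 0; i < passes; ++i) loop();
    return iterations;
}
```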
2.7.2 Constants
Constants are predefined variables in the Arduino
language. They are used to make the programs easier to read. We classify
constants in groups.

HIGH
The meaning of HIGH is somewhat different depending on
whether a pin is set to an input or an output. When a pin is configured as an
input with pinMode() and read with digitalRead(), the microcontroller will
report HIGH if a voltage of 3 volts or more is present at the pin.

LOW
The meaning of LOW likewise differs depending on whether a
pin is set to input or output. When a pin is configured as an input with
pinMode() and read with digitalRead(), the microcontroller will report LOW
if a voltage of 2 volts or less is present at the pin.
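The input thresholds described above (HIGH at 3 volts or more, LOW at 2 volts or less) can be modelled with a small helper. readLevel() is an illustrative function, not part of the Arduino API; voltages between the two thresholds are treated as indeterminate:

```cpp
#include <string>

// Classify an input-pin voltage per the thresholds given in the text:
// >= 3 V reads HIGH, <= 2 V reads LOW; in between, the reading is undefined.
std::string readLevel(double volts) {
    if (volts >= 3.0) return "HIGH";
    if (volts <= 2.0) return "LOW";
    return "INDETERMINATE";
}
```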

2.7.3 Functions

Digital I/O Basically, three functions are used in digital I/O.

pinMode() Configures the specified pin to behave either as an input or
an output.

digitalWrite() Writes a HIGH or a LOW value to a digital pin. If the pin is
configured as an input, writing a HIGH value with digitalWrite() will
enable an internal 20K pull-up resistor. Writing LOW will disable the pull-up.
The pull-up resistor is enough to light an LED dimly, so if an LED appears
to work, but very dimly, this is the likely cause. The remedy is to set the pin
to an output with the pinMode() function.

digitalRead() Reads the value from a specified digital pin,
either HIGH or LOW.

Analog I/O In analog I/O there are also three functions, used to take input
from the accelerometer, which are:

analogReference() Configures the reference voltage used for analog
input (i.e. the value used as the top of the input range). The options are:

DEFAULT: the default analog reference of 5 volts (on 5V Arduino boards)
or 3.3 volts (on 3.3V Arduino boards).

INTERNAL: a built-in reference, equal to 1.1 volts on
the ATmega168 or ATmega328 and 2.56 volts on the ATmega8 (not
available on the Arduino Mega).

INTERNAL1V1: a built-in 1.1V reference (Arduino Mega only).

INTERNAL2V56: a built-in 2.56V reference (Arduino Mega only).

EXTERNAL: the voltage applied to the AREF pin (0 to 5V only) is used as
the reference.
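Because the chosen reference sets the top of the input range, a reading converts back to volts as reading × Vref / 1023. A minimal sketch; the 0..1023 range corresponds to the ATmega328's 10-bit ADC resolution:

```cpp
// Convert a 10-bit ADC reading (0..1023) to volts for a given analog
// reference voltage, e.g. 5.0 for DEFAULT on a 5V board, or 1.1 for
// INTERNAL on the ATmega328.
double adcToVolts(int reading, double vref) {
    return reading * vref / 1023.0;
}
```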

CHAPTER 3

About Leap Motion Sensor

 Within the tracking area of the Leap Motion sensor, the internal model
detects human arms, hands, fingers, and tools, recording and
reporting their position information, gesture information and
motion in frames.

 The LM is only able to send commands as numbers, while the
microcontroller can only receive commands as characters. Our
code converted numbers into characters before sending them to the
microcontroller.

 Another problem was figuring out how to send multi-commands to the
microcontroller when we needed to control several motors at a
time and combine several commands together to send to the
microcontroller.

PROGRAMMING WITH LEAP MOTION SENSOR

3.1 Background

The Leap Motion sensor contains a camera sensor that takes
stereographic pictures of the objects in front of it. Leap Motion's internal
software then scans the images to look for palms, fingers, and pointable
objects such as pencils. All of these are combined and recorded as one
frame at each position in space. The LM scans and sends the PC about 300
frames per second. At this speed, the LM device is able to see and use
information quickly and accurately, making for great response times.
Once all objects in a frame are identified, they are assigned a unique
internal ID that the software uses to keep track of the individual parts of
the hand. The Leap Motion controller preserves an internal model of the
human hand and compares it to the current image of the user's hand using
the previous ID. This is a method used by Leap Motion to correct any
invalid data read by the sensor due to a hand being partially out of range
or a finger hidden due to a bad angle.

The Controller class, the core interface of the LM controller, is very
powerful. We can access each frame of tracking data and configuration
information from the established Controller class. For example, in Java,
frame data can be accessed at any time through the Controller.frame()
function. The latest frame information can be polled by calling frame(0)
or frame(). The controller can store up to 60 frames in its history. A
previous frame can be retrieved by setting the history parameter to a
positive integer. An instance of a subclass of Leap Listener can be added
to the controller to handle events from the Leap Motion sensor. Each
subclass listener can respond when the controller is initialized, exits, or
changes connection status according to operating-system input focus.
When events occur, the controller object invokes the corresponding
callback function in each predefined listener object. In our project, we
did not add a listener, mainly based on the goal of our project. However,
if our robot arm needed to meet more demands, we could optimize the
algorithm using a Listener subclass.
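The frame-history polling described above, where frame(0) is the newest frame and up to 60 older frames are kept, can be mimicked with a small ring buffer. FrameHistory is a hypothetical stand-in written for illustration, not a class from the Leap SDK:

```cpp
#include <deque>
#include <stdexcept>

// Minimal stand-in for the controller's frame history: frame(0) is the
// most recent frame, frame(1) the one before it, up to 60 frames back.
class FrameHistory {
    std::deque<int> frames;              // frame IDs, newest at the front
    static const size_t kMaxHistory = 60;
public:
    void push(int frameId) {
        frames.push_front(frameId);
        if (frames.size() > kMaxHistory) frames.pop_back();  // drop the oldest
    }
    int frame(size_t history = 0) const {
        if (history >= frames.size()) throw std::out_of_range("no such frame");
        return frames[history];
    }
};
```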

The controller object is multi-threaded and calls Listener functions on its
own thread instead of an application thread. Within the tracking area of
the Leap Motion sensor, the internal model detects human arms, hands,
fingers, and tools, recording and reporting their position information,
gesture information and motion in frames. The default frame rate is 300
frames per second. The frame rate can be detected and modified by
software tracking and other functions. For example,
frame.currentFramesPerSecond() gets the instantaneous frame rate. The
Pointable class provides the physical characteristics of a detected finger
or tool. Pointable objects include fingers and tools. The
Pointable.isFinger() function determines whether a Pointable object
represents a finger. The Pointable.isTool() function determines whether a
Pointable object represents a tool. A typical tool is defined as a
finger-like object that is longer than a finger [13].

3.2 Task and Approach

The scope of the project consisted of using a programming language to
use the input read by the Leap Motion sensor to control a physical robot
arm. Our group used C++ on the Visual Studio platform to control the
6-degree-of-freedom robot arm. The program received input from the user
through the LM sensor and generated commands to the robot via the
microcontroller and Bluetooth. The X, Y, and Z coordinates of the tips of
the index finger and thumb were assigned unique variable names in order
for the program to find their location in space. The coordinates of
the human fingers were mapped to the tips of each of the two robot
prongs to mimic human movements. The distance formula was used to
calculate the distance between the points at the tips of the two fingers. This
distance tells the robot to open or close the gripper. To move the arm, we
used the coordinates of the center of the user's palm to drive each motor
according to each axis. When the palm travels in the X direction, motor 1
turns the entire robot. Palm movements in the Y and Z directions actuate
motors 2 and 3 to raise and lower the arm.
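The gripper logic above reduces to the 3-D distance formula between the thumb and index fingertips compared against a threshold. A minimal sketch; the 40 mm default threshold is an illustrative value, not the project's actual calibration:

```cpp
#include <cmath>

struct Point3 { double x, y, z; };  // a fingertip position in space

// Euclidean distance between two fingertip positions.
double distance(const Point3& a, const Point3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

// Close the gripper when thumb and index tips come within the threshold.
bool gripperShouldClose(const Point3& thumb, const Point3& index,
                        double thresholdMm = 40.0) {
    return distance(thumb, index) < thresholdMm;
}
```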

3.3 Problems with the Code and how it was fixed

A problem we had to overcome was the communication between the Leap
Motion and the microcontroller. The LM is only able to send commands
as numbers, while the microcontroller can only receive commands as
characters. Our code converted numbers into characters before sending
them to the microcontroller via Bluetooth.
Another problem was figuring out how to send multi-commands to the
microcontroller.
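The number-to-character conversion can be done by formatting each numeric command into its ASCII digits before it is written to the serial link. This snprintf-based helper is an illustrative sketch, not the project's exact code:

```cpp
#include <cstdio>
#include <string>

// Convert a numeric command (e.g. a servo position) into the character
// string actually transmitted over the serial/Bluetooth link.
std::string commandToChars(int value) {
    char buf[16];
    std::snprintf(buf, sizeof(buf), "%d", value);  // 1500 -> "1500"
    return std::string(buf);
}
```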

When we needed to control several motors at a time, we had to combine
several commands together to send to the microcontroller. However, each
command has its own format, so the length and format of the combined
command became a big problem. In the beginning, we used an ArrayList to
support dynamic arrays that can grow as needed to contain the multiple
commands. However, the microcontroller could not receive the combined
command even when we changed to other data structures. To fix the problem
we created a list of arrays with different fixed lengths, according to the
length of each single command, to satisfy the length of each combination.
Finally, we overcame this problem and found that this algorithm is better
than the algorithm using ArrayList. The main reason was that each
command includes many symbols and characters like “#” and “P”, and
there was no need to change these in each array (there is no need to
allocate memory each time as with ArrayList).
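A combined multi-motor command of the kind described, with "#" and "P" delimiters and a predictable length per command, might look like the following. The "#<channel>P<pulse>" format is an assumption based on the symbols mentioned in the text, in the style of common hobby servo-controller protocols:

```cpp
#include <cstdio>
#include <string>
#include <vector>

// One motor command: servo channel number and target pulse width.
struct ServoCmd { int channel; int pulseUs; };

// Concatenate several "#<channel>P<pulse>" fragments into one combined
// command string sent to the microcontroller in a single write.
std::string combineCommands(const std::vector<ServoCmd>& cmds) {
    std::string out;
    char buf[32];
    for (const ServoCmd& c : cmds) {
        std::snprintf(buf, sizeof(buf), "#%dP%d", c.channel, c.pulseUs);
        out += buf;  // fixed per-command format keeps the total length predictable
    }
    return out;
}
```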

3.4 Results

The results were satisfactory. The team was able to complete its task of
manipulating the robot arm using hand gestures over the LM sensor.

3.5 Potential Improvements


• We could have improved the movement of the robotic arm being
controlled by the LM by using PID control or introducing a subclass
Listener in the program.
• Leap Motion is releasing a new version of the sensor, which could have
improved our project due to its new features and software stability.
• WebSocket is a new medium of communication that could have been
used to control the arm through Internet connections instead of using a
microcontroller.

CHAPTER 4

BLOCK DIAGRAM OF THE PROJECT PROCEDURE



WORK FLOW DIAGRAM



CHAPTER 5

LEAP MOTION SENSOR PROGRAMME

import processing.serial.*;
Serial myPort;
import de.voidplus.leapmotion.*;
int o;

LeapMotion leap;

void setup()
{
size(800, 500);
String portName = Serial.list()[9]; // index of the Arduino's serial port; may differ per machine
myPort = new Serial(this, portName, 9600);
leap = new LeapMotion(this);
}

void draw()
{
background(255);
// ...
int fps = leap.getFrameRate();

// ========= HANDS =========

for(Hand hand : leap.getHands()){

// ----- BASICS -----

int hand_id = hand.getId();


PVector hand_position = hand.getPosition();

PVector hand_stabilized = hand.getStabilizedPosition();


PVector hand_direction = hand.getDirection();
PVector hand_dynamics = hand.getDynamics();
float hand_roll = hand.getRoll();
float hand_pitch = hand.getPitch();
float hand_yaw = hand.getYaw();
boolean hand_is_left = hand.isLeft();
boolean hand_is_right = hand.isRight();
float hand_grab = hand.getGrabStrength();
float hand_pinch = hand.getPinchStrength();
float hand_time = hand.getTimeVisible();
PVector sphere_position = hand.getSpherePosition();
float sphere_radius = hand.getSphereRadius();

// Map roll, position and grab strength into disjoint output ranges
// (0-60, 61-120, 121-180, 181-250) so each received byte value
// identifies which parameter it encodes.
int w = (int)map(hand.getRoll(), -70, 70, 0, 60);
int x = (int)map(hand.getPosition().x, -50, 900, 61, 120);
int y = (int)map(hand.getPosition().y, 0, 600, 121, 180);
int z = (int)map(hand.getGrabStrength(), 0, 1, 181, 250);

print(x);
print(" ");
print(y);
print(" ");
print(z);
print(" ");
println(w);
myPort.write(w);
myPort.write(x);
myPort.write(y);
myPort.write(z);

Finger finger_thumb = hand.getThumb();


// or hand.getFinger("thumb");
// or hand.getFinger(0);

Finger finger_index = hand.getIndexFinger();


// or hand.getFinger("index");
// or hand.getFinger(1);

Finger finger_middle = hand.getMiddleFinger();


// or hand.getFinger("middle");
// or hand.getFinger(2);

Finger finger_ring = hand.getRingFinger();


// or hand.getFinger("ring");
// or hand.getFinger(3);

Finger finger_pink = hand.getPinkyFinger();


// or hand.getFinger("pinky");
// or hand.getFinger(4);

// ----- DRAWING -----

hand.draw();
// hand.drawSphere();

// ========= ARM =========

if(hand.hasArm()){
Arm arm = hand.getArm();
float arm_width = arm.getWidth();
PVector arm_wrist_pos = arm.getWristPosition();
PVector arm_elbow_pos = arm.getElbowPosition();
}

  // ========= FINGERS =========

  for (Finger finger : hand.getFingers()) {

    // ----- BASICS -----

    int     finger_id         = finger.getId();
    PVector finger_position   = finger.getPosition();
    PVector finger_stabilized = finger.getStabilizedPosition();
    PVector finger_velocity   = finger.getVelocity();
    PVector finger_direction  = finger.getDirection();
    float   finger_time       = finger.getTimeVisible();

    // ----- SPECIFIC FINGER -----

    switch (finger.getType()) {
      case 0:
        // System.out.println("thumb");
        break;
      case 1:
        // System.out.println("index");
        break;
      case 2:
        // System.out.println("middle");
        break;
      case 3:
        // System.out.println("ring");
        break;
      case 4:
        // System.out.println("pinky");
        break;
    }

    // ----- SPECIFIC BONE -----

    Bone bone_distal       = finger.getDistalBone();       // or finger.get("distal");       or finger.getBone(0);
    Bone bone_intermediate = finger.getIntermediateBone(); // or finger.get("intermediate"); or finger.getBone(1);
    Bone bone_proximal     = finger.getProximalBone();     // or finger.get("proximal");     or finger.getBone(2);
    Bone bone_metacarpal   = finger.getMetacarpalBone();   // or finger.get("metacarpal");   or finger.getBone(3);

    // ----- DRAWING -----

    // finger.draw(); // = drawLines() + drawJoints()
    // finger.drawLines();
    // finger.drawJoints();

    // ----- TOUCH EMULATION -----

    int   touch_zone     = finger.getTouchZone();
    float touch_distance = finger.getTouchDistance();

    switch (touch_zone) {
      case -1: // None
        break;
      case 0:  // Hovering
        // println("Hovering (#" + finger_id + "): " + touch_distance);
        break;
      case 1:  // Touching
        // println("Touching (#" + finger_id + ")");
        break;
    }
  }
  // ========= TOOLS =========

  for (Tool tool : hand.getTools()) {

    // ----- BASICS -----

    int     tool_id         = tool.getId();
    PVector tool_position   = tool.getPosition();
    PVector tool_stabilized = tool.getStabilizedPosition();
    PVector tool_velocity   = tool.getVelocity();
    PVector tool_direction  = tool.getDirection();
    float   tool_time       = tool.getTimeVisible();

    // ----- DRAWING -----

    // tool.draw();

    // ----- TOUCH EMULATION -----

    int   touch_zone     = tool.getTouchZone();
    float touch_distance = tool.getTouchDistance();

    switch (touch_zone) {
      case -1: // None
        break;
      case 0:  // Hovering
        // println("Hovering (#" + tool_id + "): " + touch_distance);
        break;
      case 1:  // Touching
        // println("Touching (#" + tool_id + ")");
        break;
    }
  }
}

  // ========= DEVICES =========

  for (Device device : leap.getDevices()) {
    float device_horizontal_view_angle = device.getHorizontalViewAngle();
    float device_vertical_view_angle   = device.getVerticalViewAngle();
    float device_range                 = device.getRange();
  }
} // end of draw()

// ========= CALLBACKS =========

void leapOnInit() {
  // println("Leap Motion Init");
}
void leapOnConnect() {
  // println("Leap Motion Connect");
}
void leapOnFrame() {
  // println("Leap Motion Frame");
}
void leapOnDisconnect() {
  // println("Leap Motion Disconnect");
}
void leapOnExit() {
  // println("Leap Motion Exit");
}
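The sketch above compresses the four hand parameters into disjoint byte ranges (roll 0-60, x 61-120, y 121-180, grab 181-250) before writing them to the serial port; a single byte therefore identifies both the channel and the value, which is what lets the Arduino demultiplex one stream. A minimal, self-contained sketch of that encode/decode contract; `intMap()` follows the Arduino integer `map()` formula, and the class and method names are illustrative only:

```java
// Illustrative model of the serial protocol used above. Each channel is
// mapped into its own byte range, so the receiver can route a lone byte
// to the right servo. Names here (RangeDemux, intMap) are hypothetical.
public class RangeDemux {

    // Integer version of map(), following the Arduino formula.
    static long intMap(long x, long inMin, long inMax, long outMin, long outMax) {
        return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
    }

    // Mirror of the range checks in the Arduino loop(): which channel
    // does a received byte belong to?
    static String channel(long b) {
        if (b >= 0 && b <= 60)    return "roll";
        if (b >= 61 && b <= 120)  return "x";
        if (b >= 121 && b <= 180) return "y";
        if (b >= 181 && b <= 250) return "grab";
        return "unknown";
    }

    public static void main(String[] args) {
        long w = intMap(0, -70, 70, 0, 60);       // hand level: roll = 0 deg
        long x = intMap(425, -50, 900, 61, 120);  // hand near the middle in x
        long z = intMap(1, 0, 1, 181, 250);       // fist fully closed

        System.out.println(w + " -> " + channel(w)); // 30 -> roll
        System.out.println(x + " -> " + channel(x)); // 90 -> x
        System.out.println(z + " -> " + channel(z)); // 250 -> grab
    }
}
```

Because the output ranges never overlap, no extra framing bytes are needed, at the cost of limiting each channel to a resolution of 60-70 steps.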

CHAPTER 6
ABOUT SERVOMOTORS

 A servo motor is basically a DC motor (in some special cases an AC motor) together with a few special-purpose components that turn a plain motor into a servo.

 Inside a servo unit you will find a small DC motor, a potentiometer, a gear train and a control circuit.

 The control circuit, together with the potentiometer feedback, makes the servo rotate to the commanded angle and hold it there.
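Internally, the control circuit compares the potentiometer reading with the width of the incoming command pulse: a hobby servo expects a pulse roughly every 20 ms, and the pulse width encodes the target angle. A rough sketch of that angle-to-pulse relationship; the 544 µs and 2400 µs endpoints are the Arduino Servo library's defaults, assumed here, and the class and method names are illustrative:

```java
// Rough model of how a hobby-servo command maps an angle to a pulse width.
// The 544/2400 microsecond endpoints are the Arduino Servo library defaults
// (an assumption; real servos vary slightly).
public class ServoPulse {

    static int angleToMicroseconds(int angle) {
        // Clamp to the usable 0-180 degree range, as Servo.write() does.
        angle = Math.max(0, Math.min(180, angle));
        // Linear interpolation between the two pulse-width endpoints.
        return 544 + angle * (2400 - 544) / 180;
    }

    public static void main(String[] args) {
        System.out.println(angleToMicroseconds(0));   // 544
        System.out.println(angleToMicroseconds(90));  // 1472
        System.out.println(angleToMicroseconds(180)); // 2400
    }
}
```

This is why `Servo.write(90)` centers a standard servo: 90° corresponds to the midpoint pulse of about 1.5 ms.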

SERVO MOTOR SPECIFICATIONS

 TowerPro MG996R - Standard Servo

 Modulation: Analog

 Torque:
   4.8 V: 130.5 oz-in (9.40 kg-cm)
   6.0 V: 152.8 oz-in (11.00 kg-cm)

 Speed:
   4.8 V: 0.17 sec/60°
   6.0 V: 0.14 sec/60°

 Weight: 1.94 oz (55.0 g)

 Dimensions:
   Length: 1.60 in (40.6 mm)
   Width: 0.78 in (19.8 mm)
   Height: 1.69 in (42.9 mm)

 Gear Type: Metal
 Rotation/Support: Dual Bearings
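Two quick sanity checks on the figures above: the 6.0 V speed rating implies a full 0-180° sweep in about 0.42 s, and the metric and imperial torque values agree (1 kg-cm is about 13.887 oz-in). A small sketch of both calculations; the class name is illustrative:

```java
// Sanity checks on the MG996R figures quoted above.
public class Mg996rChecks {
    public static void main(String[] args) {
        // Speed: 0.14 s per 60 degrees at 6.0 V.
        double sweep180 = 0.14 * 180.0 / 60.0;
        System.out.printf("full sweep: %.2f s%n", sweep180); // 0.42 s

        // Torque: 11.00 kg-cm should match 152.8 oz-in.
        double ozin = 11.00 * 13.887; // 1 kg-cm is about 13.887 oz-in
        System.out.printf("torque: %.1f oz-in%n", ozin);     // 152.8 oz-in
    }
}
```

A sub-half-second full sweep is comfortably fast for tracking hand gestures frame to frame.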

SERVO MOTOR PROGRAM (ARDUINO)

#include <Servo.h>

// One Servo object per joint of the arm.
Servo my1;  // driven, with my2, by hand height (y)
Servo my2;
Servo my3;  // driven, with my4, by hand roll
Servo my4;
Servo my5;  // gripper, driven by grab strength
Servo my6;  // base, driven by hand x-position

int vale;
int vals;

void setup()
{
  my1.attach(2);
  my2.attach(3);
  my3.attach(4);
  my4.attach(5);
  my5.attach(6);
  my6.attach(8);

  // Move the arm to a known starting pose.
  my1.write(1);
  my2.write(179);
  my3.write(1);
  my4.write(179);

  pinMode(13, OUTPUT);
  Serial.begin(9600);  // must match the Processing sketch's baud rate
}

void loop()

{
  if (Serial.available() > 0)
  {
    int afj = Serial.read();  // one byte: its range identifies the channel

    if (afj >= 0 && afj <= 60)          // hand roll
    {
      vals = map(afj, 5, 60, 5, 178);
      int vals2 = map(afj, 60, 0, 1, 178);  // mirrored for the paired servo
      my3.write(vals);
      my4.write(vals2);
    }
    else if (afj >= 61 && afj <= 120)   // hand x-position -> base
    {
      int qse = map(afj, 120, 61, 10, 160);
      my6.write(qse);
    }
    else if (afj >= 121 && afj <= 180)  // hand height
    {
      vale = map(afj, 180, 121, 1, 178);
      int vale1 = map(afj, 121, 180, 27, 178);  // mirrored for the paired servo
      my1.write(vale);
      my2.write(vale1);
    }
    else if (afj >= 181 && afj <= 250)  // grab strength -> gripper
    {
      int val = map(afj, 181, 250, 20, 80);
      my5.write(val);
    }
  }
}

OVERALL PROJECT COST

1) LEAP MOTION SENSOR--- 6915 INR

2) SERVO MOTORS (6*900)---5400 INR

3) WOOD ---60 INR

TOTAL--- 12375 INR



CHAPTER 7
REFERENCES

1. arduino.cc
2. arduino.cn
3. arduino.googlecode.com
4. citeseerx.ist.psu.edu
5. code-reference.com
6. developer.leapmotion.com
7. en.wikipedia.org
8. energia.nu
9. forum.processing.org
10. github.com
11. simple.wikipedia.org
12. www.aiopass4sure.com
13. www.digikey.com
14. www.linkedin.com
15. www.openprocessing.org
16. www.slideshare.net
