
Republic of Yemen

Sana’a University
Faculty of Engineering
Electrical Engineering Department
Computer and Control Specialization

Design and Implementation of a Multi-Purpose Service Robot (MPSR)

Prepared by:
Hamdi M. Sahloul

Supervisor:
Prof. Dr. Abdul Raqib Abdo Asaad

A final year graduation project submitted to the Electrical Engineering Department as a final requirement for the degree of B.Eng. in Electrical Engineering (Computer and Control Specialization).

Sana’a, July 2010


LEGAL NOTE
The Roboty implementation and its associated documentation can be used, reproduced, and modified
freely, including in marketing communications.

When using Roboty or any modification of it, proper attribution is required under the terms of the
Creative Commons Attribution License. For more details on proper attribution, please refer to
Appendix I.

By copying, installing, or otherwise using these materials, you agree to be bound by the terms of the
Creative Commons Attribution License. If you do not agree you may not use or view this
documentation and its associated materials and must STOP NOW.

I aggressively enforce my intellectual property rights to the fullest extent of the law; be warned!
Foreword

In memory of the Islamic Golden Age[65], which was erased from world history in an attempt to hide the
truth of our scientists' books and knowledge, taken away from us after the fall of Al-Andalus[66] and then
attributed to others without any credit to us, I present this work to the Muslim and Arab peoples, who were
in reality the founders of this robotics revolution.

I decline to accept Joseph F. Engelberger (1925–present) as the “Father of Robotics”, as Abo Al-Eiz Al-Jazari
(1136–1206) is the real “Father of Robotics”.[refer to the introduction] Let us just brush off the dust, tell the
world who we are, and take back our rights!

Acknowledgments

I want to express my gratitude to my parents and siblings, who kept lighting their hearts for me in times
of darkness and despair. I am forever indebted to you, and I hope this work satisfies you all.

Also, I want to say a big thank-you to my supervisor, who trusted me and believed in this project.
Professor Asaad also supported me with many guidelines and research materials that came in handy, as well
as with structuring this documentation.

Not to forget my dear friends Khalid Dhafir, Ahmed Al-Balasi, and Saleh Al-Titi, who helped me in
reviewing and polishing this documentation; thank you very much.

Last but not least, I want to thank all the people who helped me in ways they probably did not even
know; they contributed, in one way or another, to making this a reality.

Abstract

This document describes the hardware and software architecture behind the ROBOTY robot. ROBOTY is
a differential-wheeled robot with self-balancing, motion, speech, and object-recognition capabilities.
ROBOTY is also the first autonomous robot built in Yemen, and it is primarily controlled by voice
commands. The final goal of this research project is to build a robot capable of playing chess.

Keywords: Autonomous, Robotics, Balancing, Object Recognition, Brain Building.

Preface

This project presents a simple, easy-to-implement robotic system, which I call Roboty. It is capable of
talking, hearing, moving a neck that holds a webcam for detecting and identifying human faces, and moving
in all four directions on two wheels while maintaining its balance.

This documentation is written as a textbook for engineering students, but it can be used by other
people who have basic knowledge of algebra, logic, and programming languages.

In this documentation, all materials are presented in such a way that the reader can follow the
discussions easily. Furthermore, all materials necessary for understanding the subject matter
(such as block diagrams, C code, and waveforms) are included to ease grasping the subject.

In addition to including many notes and details about the whole system and its components, along with
block diagrams to ease the understanding of the concepts behind this robot system, I implemented the
system so you can enjoy it and verify its theoretical part as well!

The theoretical background for designing the robot system is discussed in detail. After grasping the
theoretical aspects, the reader can use the MPLAB IDE (Integrated Development Environment) and the
Arduino IDE to program the corresponding chips, involving various types of standard logic and standard
protocols. It is assumed that the reader is familiar with C/C++ in order to use the IDEs mentioned above
or their equivalents.

It is also preferable that the reader knows the concepts of logic circuits and digital systems. Moreover,
background knowledge of digital control and image processing is needed by those who want to deeply
understand all aspects of the implemented robot.

For those interested in re-implementing the robot system: you will need a budget of about 1000 to
1800 USD (United States Dollars), basic knowledge of electronics fundamentals and of soldering electronic
chips, and deeper knowledge of the above-mentioned requirements.

At A Glance Table

CHAPTER 1. INTRODUCTION TO ROBOTICS
CHAPTER 2. HARDWARE AND IMPLEMENTATION
CHAPTER 3. SOFTWARE DESIGN AND INTERFACING
CHAPTER 4. OPERATION AND MAINTENANCE
CHAPTER 5. CONCLUSIONS AND RECOMMENDATIONS

Table of Contents

CHAPTER 1. INTRODUCTION TO ROBOTICS
1.1 DEFINITION
1.2 ROBOTICS HISTORY
1.3 HOW DO ROBOTS WORK?
1.4 USES OF ROBOTS
1.5 IMPORTANCE OF ROBOTS
1.6 TYPES OF ROBOTS
1.7 IMPACT OF ROBOTS
1.8 FUTURE OF TECHNOLOGIES
1.9 THIS PROJECT: MULTI-PURPOSE SERVICE ROBOTS
1.9.1 Importance of Multi-purpose Service Robots
1.9.2 Goal of the Project
1.9.3 Expected outcomes
1.9.4 Methodology
1.9.5 Block Diagram
1.10 PROJECT ORGANIZATION
CHAPTER 2. HARDWARE AND IMPLEMENTATION
2.1 MICROCHIP® PIC® MICROCONTROLLER
2.1.1 Olimex PIC-P28-USB
2.1.2 I²C EEPROM
2.1.3 Olimex PIC-MCP-USB
2.1.4 MPLAB Integrated Development Environment
2.1.5 CCS IDE Compilers
2.2 VRBOT MODULE
2.3 SPEAKJET, VOICEBOX SHIELD & TTS256
2.4 CHARACTER LCD
2.4.1 LCD Backpack
2.5 GPS MODULE
2.5.1 Logic Level Converter
2.6 TILT CONTROLLER
2.6.1 Pocket AVR Programmer
2.6.2 FTDI Basic Breakout - 3.3V
2.6.3 Arduino IDE
2.7 MOTORS UNIT
2.7.1 Wheels encoder
2.7.2 Serial controlled motor driver
2.8 INFRARED PROXIMITY SENSOR LONG RANGE
2.9 SERVOS
2.10 BEAGLEBOARD
2.10.1 BeagleBuddy Zippy
2.10.2 Linux UVC Webcam
2.10.3 Ångström Linux distribution
2.10.4 OpenCV
2.11 BATTERIES AND CHARGERS
2.12 MAIN BOARD MODIFICATIONS
2.13 USING THE TTS256 WITH THE VOICEBOX SHIELD
2.14 ATTACHING ZIPPY TO THE BEAGLEBOARD
2.15 INTERCONNECTION BETWEEN ALL MODULES
2.16 FINAL HARDWARE

CHAPTER 3. SOFTWARE DESIGN AND INTERFACING
3.1 9DOF SOFTWARE
3.2 MOTOR DRIVER SOFTWARE
3.3 BEAGLEBOARD SOFTWARE
3.3.1 Face detection
3.3.2 Face Recognition
3.3.3 I²C Communication with PIC®
3.3.4 Starting the program when Linux starts up
3.4 PIC® SOFTWARE
CHAPTER 4. OPERATION AND MAINTENANCE
4.1 VOICE COMMANDS
4.2 TALKING
4.3 FACE RECOGNITION
4.3.1 Camera Movements
4.4 BALANCING
4.5 WALKING
4.5.1 Avoiding obstacles
4.6 NAVIGATION
4.7 LCD DISPLAY
4.7.1 Status indication
4.7.2 Errors reporting
CHAPTER 5. CONCLUSIONS AND RECOMMENDATIONS
5.1 CHALLENGES
5.2 CONCLUSIONS
5.3 RECOMMENDATIONS
APPENDIX I. CREATIVE COMMONS ATTRIBUTION LICENSE
I.1 EXACT REPRODUCTIONS
I.2 MODIFIED VERSIONS
I.3 OTHER MEDIA
I.4 CONTACT
APPENDIX II. RESOURCES
APPENDIX III. BEAGLEBOARD AND ÅNGSTRÖM DISTRIBUTION
III.1 BUILDING YOUR DEVELOPMENT ENVIRONMENT
III.1.1 Connections
III.1.2 Setting up the console
III.1.3 Verifying setup
III.1.4 Setting up the operating system
III.1.5 The Ångström Linux distribution
III.1.6 Download the distribution
III.1.7 Partition the card
III.1.8 Copying the files into the disk
III.2 BOOTING LINUX
III.3 INSTALLING REQUIRED APPLICATIONS AND LIBRARIES
III.4 EXAMPLE “CAPTURE.C”
APPENDIX IV. FULL ROBOT SOFTWARE
IV.1 9DOF SOFTWARE
IV.2 MOTOR DRIVER SOFTWARE
IV.3 BEAGLEBOARD SOFTWARE
IV.4 PIC SOFTWARE
APPENDIX V. MICROCONTROLLER'S PROTOCOLS AND SIGNALS
V.1 PWM
V.2 UART
V.2.1 RS232
V.2.2 EIA232F
V.2.3 TTL and USB
V.2.4 Other Terminology
V.3 I²C
APPENDIX VI. KALMAN FILTER
VI.1 ACCELEROMETER TO ATTITUDE
VI.2 GYROSCOPE TO ROLL, PITCH AND YAW
VI.3 KALMAN FILTERING OF IMU DATA
VI.3.1 Introduction
VI.3.2 Basic operation
VI.3.3 Our simple model
VI.3.4 Wrapping it all up
APPENDIX VII. PID CONTROLLER
VII.1 DC MOTOR PID CONTROLLER
VII.1.1 PID Tuning
GLOSSARY
REFERENCES


Chapter 1. Introduction to robotics

A robot is a computer-controlled machine that is programmed to move,
manipulate objects, and accomplish tasks while interacting with its
environment. Robots are able to perform repetitive tasks more
quickly, cheaply, and accurately than humans. The term robot
originates from the Czech word robota, meaning “compulsory labor.”
It was first used in the 1921 play RUR (Rossum's Universal Robots) by
the Czech novelist and playwright Karel Capek. The word robot has
been used since then to refer to a machine that performs work to
assist people, or work that humans find difficult or undesirable.[1]

Robotics history and robot uses

Robot types and impacts

Future technology

Multi-purpose Service Robots – importance and goals


1.1 Definition

American Heritage Dictionary: robot n.

1. A mechanical device that sometimes resembles a human being and is capable of performing a
variety of often complex human tasks on command or by being programmed in advance.

2. A machine or device that operates automatically or by remote control.

3. A person who works mechanically without original thought, especially one who responds
automatically to the commands of others.

Webster:

1. A machine that looks like a human being (see Figure 1.1) and performs various complex acts (such
as walking or talking) of a human being; also, a similar but fictional machine whose lack of capacity
for human emotions is often emphasized.

2. An efficient, insensitive person who functions automatically.

3. A device that automatically performs complicated, often repetitive tasks.

4. A mechanism guided by automatic controls.

Figure 1.1: ASIMO, a humanoid robot.
Robots in the movies are portrayed as fantastic, intelligent, and sometimes dangerous artificial life
forms, but real robots work for people, performing tasks for them, including tasks that may be dangerous.
In the future, robots will show up in schools, homes, and even as parts of the body. As technology advances,
we are finding more and greater ways to use them.[1]

1.2 Robotics History

Robots began as entertainment for royalty. Inventors such as Al-Jazari (1136–1206) and Leonardo Da
Vinci (1452–1519) worked to build automatons for their benefactors. Al-Jazari built a floating band that
resembled humans and performed various songs and drum beats depending on the programming of a
series of pegs (see Figure 1.2). Da Vinci created an automaton based on a knight's armor; it could stand and
move its arms and neck, as well as open its mouth.[2]

Figure 1.2: Al-Jazari's musical robot band.


In the early 1800s, mechanical puppets were first built in Europe, just for entertainment value. These
were called robots, since their parts were driven by linkages and cams and controlled by rotating drum
selectors. In 1801, Joseph Marie Jacquard made the next great change and invented the automatic draw
loom. The draw loom used punched cards to control the lifting of thread in fabric factories; Jacquard was
thus the first person able to store a program and control a machine. After that there were many small
changes in robotics, and the field slowly moved forward.[1]

The first industrial robots were the “Unimates”, developed by George Devol and Joe Engelberger in the
late 1950s and early 1960s. The first patents were held by Devol, but Engelberger formed “Unimation”, the
first company to market robots. For a while the economic viability of these robots proved disastrous and
things slowed down for robotics, but the industry recovered, and by the mid-1980s robotics was back on
track.[9]

In 1954, George Devol Jr. developed the multi-jointed artificial arm that led to the modern robot. Later,
mechanical engineer Victor Scheinman developed a truly flexible arm known as the Programmable
Universal Manipulation Arm (PUMA).[9]

In 1950, Isaac Asimov came up with his laws for robots, which are:

 A robot may not injure a human being, or through inaction allow a human being to come to
harm.

 A robot must obey the orders given to it by human beings, except where such orders would
conflict with the first law.

 A robot must protect its own existence as long as such protection does not conflict with the
first or second law.[3]

Mobile robotics came into its own in 1983, when Odetics introduced a six-legged vehicle (shown in
Figure 1.3) that was capable of climbing over objects. This robot could lift over 5.6 times its own weight
parked, and 2.3 times its weight moving.[1]

Figure 1.3: First mobile robot

1.3 How Do Robots Work?

In order for a robot to imitate the actions of a human being, it has to be able to perform three fundamental
tasks. First, it must be conscious of the world around it, just as humans obtain information about the world
from their five senses. Second, the robot must somehow "know" what to do. One way for it to get that
knowledge is to have a human prepare a set of instructions that are then implanted into the robot's "brain."
Alternatively, it must be able to analyze and interpret data it has received from its senses and then decide,
based on that data, how it should react. Third, the robot must be able to act on the instructions or data it
has received.[4]
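
To make the three tasks concrete, here is a minimal C sketch of that sense-decide-act cycle. The sensor stub and the 50 cm threshold are hypothetical placeholders for illustration, not part of Roboty's actual software (which appears in Chapter 3 and Appendix IV).

    /* A minimal sketch of the sense-decide-act cycle; the sensor stub
       and the 50 cm threshold are hypothetical, for illustration only. */
    #include <stdio.h>

    static int read_distance_cm(void)   /* stub standing in for a real sensor */
    {
        return 42;
    }

    int main(void)
    {
        int cycle;
        for (cycle = 0; cycle < 3; cycle++) {
            int distance = read_distance_cm();   /* 1. sense  */
            int obstacle = (distance < 50);      /* 2. decide */
            if (obstacle)                        /* 3. act    */
                printf("Obstacle at %d cm: turning away\n", distance);
            else
                printf("Path clear: driving forward\n");
        }
        return 0;
    }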


1.4 Uses of Robots

The idea of a robot as a general-purpose human servant is very far from the robots in widespread use today.
The most common robots are industrial, factory machines: hydraulic, mechanical arms precisely controlled
by computers. You may have seen robots like this working in automobile factories, where they assemble,
weld, and spray-paint new cars. Clothes factories use similar robot arms, fitted with lasers, to cut fabric with
extraordinary speed and precision. Although most industrial robotic arms are general-purpose machines,
they are "trained" (essentially, programmed) to do one highly specific job and they never do anything else.
To use them for another purpose, you would have to completely reprogram them.[5]

Another important job robots do is to run errands that no human would want to do. The military has
long used remote-controlled robotic machines to defuse bombs. A typical bomb-defusal robot (see
Figure 1.4) has tracks to maneuver it around, a camera that lets the operator see what it is doing, and a
robot arm for manipulating whatever it finds. Some of these machines also have a remote-controlled rifle
attached so they can destroy suspect packages. While we tend to call them "robots," machines like this are
not really robotic: they are simply remote-controlled machines operated at a safe distance by a human
being; they do not have an onboard computer and they are not controlling their own movements. Space
explorer robots (like the ones that have landed on Mars) usually combine remote control (they can be
steered from mission control on Earth) and autonomous operation (they can navigate themselves, explore,
and send pictures or samples of what they find back to Earth).[5]

Figure 1.4: Bomb defusal robot

1.5 Importance of Robots

The importance of robotics has become more apparent as industrial robotics technology has grown and
developed over the 60 years since the first industrial robot, Unimate, was put into use in the 1950s. About
90% of the robots in use today are in the industrial robotics sector, in factories. As of the first half of 2010,
about 198,000 industrial robots were in use in the U.S., as reported by the Robotics Industry Association (RIA).
Robots are now also used in warehouses, laboratories, research and exploration sites, energy plants,
hospitals, and outer space.[6]

The importance of robotics can be seen in its advantages, which fall into four major categories:

 Quality, accuracy and precision.

 Efficiency, speed and production rate.

 Ability to work in environments that are unsafe or inhospitable for humans.

 Freedom from human limitations such as boredom and the need to eat and sleep.[6]

1.6 Types of Robots

There are numerous ways to describe robot types; in my experience, there are various classes of robots. The
main reason for these differences is that different scientists and engineers tend to have different views on
which issues should be considered under the term "robotics".[7]

Nowadays, robots perform many different tasks in many fields, and the number of jobs entrusted to
robots is growing steadily. That is why one of the best ways to classify robots is by their application. There
are: industrial robots, domestic or household robots, medical robots, service robots, military robots,
entertainment robots, and exploration robots.[7][11]

I also prefer to categorize robots based on their use: general-purpose autonomous robots, which can
perform a variety of functions independently, and dedicated robots, which perform pre-specified
functions, such as industrial or medical robots.

1.7 Impact of Robots

Robotics brings higher quality and lower cost to the manufacturing industry. But this could mean
replacing old labor with skilled people in software and sensor development; the machines will have to be
maintained and staff will have to be trained in their repair. So, some workers may be dismissed, but the
overall loss may not be that bad.[8]

Still, robotics on the production line currently takes away many jobs that were done by humans, and
this costs the companies less.[8]

1.8 Future of Technologies

Perhaps the most dramatic changes in futuristic robots will arise from their increasing ability to reason. The
field of artificial intelligence is moving rapidly from university laboratories to practical application in industry,
and machines are being developed so they can perform cognitive tasks, such as strategic planning and
learning from experience. Increasingly, diagnosis of failures in aircraft or satellites, the management of a
battlefield, or the control of a large factory will be performed by intelligent computers.[10]


1.9 This project: Multi-purpose Service Robots

Service robots have no strict, internationally accepted definition which, among other things, would delimit
them from other types of equipment, in particular the manipulating industrial robot. A preliminary
definition has, however, been adopted:

A service robot is a robot which operates semi- or fully autonomously to perform services useful to the
well-being of humans and equipment, excluding manufacturing operations.

With this definition, manipulating industrial robots could also be regarded as service robots, provided
that they are installed in non-manufacturing operations. Service robots may or may not be equipped with an
arm structure as is the industrial robot. Often, but not always, the service robots are mobile. In some cases,
service robots consist of a mobile platform on which one or several arms are attached and controlled in the
same mode as the arms of the industrial robot.

Because of their multitude of forms and structures as well as application areas, service robots are not
easy to define.

Since 2007, a working group of ISO has been revising ISO 8373, which will finally include an official
definition of service robots.[12]

Multi-purpose service robots can perform many services in comparison with dedicated ones, which
means a more complex structure and more complex programming; on the other hand, multi-purpose
service robots offer more functionality and more advanced interaction.

I have chosen to explore this challenging area because of the growing interest in robots and the vast
expectations of their impact on our lives, making us ready to face any possible difficulties along the way and
to attempt to find appropriate solutions for each problem encountered. The design of a robot is a very
challenging process, through which I hope to prove that it is doable within an individual's capabilities.

1.9.1 Importance of Multi-purpose Service Robots

Multi-purpose service robots have become a very interesting field of study nowadays due to the widespread
use of service robots in different practical applications. Today, service robots are widely used to perform
jobs more cheaply, or more accurately and reliably, than humans.

These revolutionary developments in robotics have made the process of robot design a very
interesting area to explore in scientific and academic research.

Its main importance, aside from its benefits in helping humans in daily life, is that it inspires the future
of robot development and ideas.


1.9.2 Goal of the Project

The goal of this project is to design a robot that should be capable of the following:

 Balance with only two parallel wheels, moving in all directions while doing so.

 Detect surrounding objects and avoid obstacles.

 Design a motor controller to be used for controlling speed and position.

 Use GPS to detect its geographical position, and be able to move to a specific position using a GPS map.

 Listen to the commands given to it and act according to those commands; responses include talking
and moving.

 Recognize the faces of people that the robot deals with.

Finally, if everything goes as planned, I would also like to:

 Add at least one arm that can move like a human arm, with gripper(s) on it, while its
effects on movement are compensated by the balancing unit so that the robot will not fall.

 Recognize chess pieces and play chess at a moderate level with humans using one of its arms.

The key to success in fulfilling these objectives is to believe that you are really capable of doing it, even
when you feel you do not have enough knowledge and power.

1.9.3 Expected outcomes

Working through this project will help you to practically apply most of what you have learned in your
academic study, as well as to discover some unknown areas that you are not familiar with. At the end of this
project, you should be able to:

 Design an electronic circuit for almost any practical problem.

 Design mechatronic robotic prototypes.

 Program high-performance ATMega® and PIC® (Programmable Interface Controller)
microcontrollers to perform a specific task.

 Connect different microcontrollers in complex control systems.

 Interface physical sensors to electronic components and adjust their signal levels and types.

 Design digital controllers for sensors and actuators.

 Document academic papers and projects in a sophisticated manner.


1.9.4 Methodology

Before we get to the structure and programming, I want to mention that, alongside the ATMega® and
PIC® microcontrollers, I will use the BeagleBoard as secondary processing hardware in this project,
especially for the complex image processing and face recognition. I also used the Custom Computer
Services (CCS) C Compiler, PCWHD series v4.093, as an add-on to MPLAB to compile the PIC® C code.

I will apply the bottom-up[64] design methodology as much as possible in order to satisfy the
underlying needs of this project in an acceptable manner, so that the reader is able to understand the
materials discussed here; I cannot just jump to the final step while the base has not yet been constructed.

I have tried to keep the C/C++ code as simple as possible, to avoid confusing the reader with the code.
In addition, I have included as many figures as possible to ease understanding of the outcomes of the
corresponding code.

1.9.5 Block Diagram

Figure 1.5 shows the block diagram of the system, mapping its main components with enough detail for the
current level of discussion; further details follow in the next chapters. Arrows indicate control and/or data flow.


[Figure 1.5 shows the main processor (the PIC® microcontroller) connected to: the LCD; the webcam
controller (BeagleBoard) with its webcam; the motors controller (ATMega µC based) driving the left and
right wheel motors, with position feedback from both wheels; the tilt controller (ATMega µC based) with
accelerometer and gyroscope sensors; the microphone controller (VRbot module) with its microphone; the
speaker controller (SpeakJet and TTS256) with its speaker; the infrared proximity sensor; the GPS module;
and the servos for the neck and the proximity sensor.]

Figure 1.5: Robot system block diagram


1.10 Project Organization

I had learned that the project team that does the work should be as small as possible.[13] Small is beautiful,
and effective. Experience taught me not to start by inviting everyone to the team: only people who add
value and will spend a significant amount of time on the project can be in the core team. I tried to avoid
going overboard on working groups, which can drown a project in communication overhead, and I decided
that if that much discussion was needed anyway, it would be better to postpone the project preparations
and first make up our minds. Unfortunately, we did not come to an agreement at the very beginning, and
since then I have decided to do this alone, even though it is really tough on me.

The project strategy that I followed for organizing this project was the well-known strategy of “Divide
and Conquer”.

This project was first divided into many preconstruction tasks that are too far from formal
documentation to be stated here in full; they involved:

1. Deciding on the requirements, feasibility study, and searching for possible components.

2. Estimating the costs, collecting funds and purchasing components.

3. Building prototypes and correcting the requirements; this kept looping through all the above steps
until the finally accepted prototype.

The project then moved into its main stage, which is also divided into the following main tasks, each
represented by a dedicated chapter in this documentation:

1. Hardware Design and Implementation.

2. Software Programming and Interfacing.

3. Operation and Maintenance.

I have already completed most of the stages above, and I am now writing the rest of the
documentation while fixing minor issues related to project finalization.

Figure 1.6 shows the activity network diagram that I prepared for designing, implementing, and
documenting this project. Of course, preparing it was a recursive process that changed the activity diagram
every time I faced new requirements or unexpected outcomes. I do not expect you to spend more than two
months re-implementing this robot system; but since I was doing many experiments and did not have clear
guidelines – unlike you – I experienced many delays due to resource shortages and programming issues, as
well as being alone in all these complicated tasks.


[Figure 1.6 traces the project timeline from the start (24/10/2009) through estimating requirements,
collecting team members, and the project proposal (7/11/2009); organizing the project, collecting money,
and searching for and purchasing components; building prototypes and correcting requirements; hardware
modification, final hardware, and Linux setup; programming the motor driver, VRbot, 9DOF, PIC, and
BeagleBoard modules; interfacing the voice shield, VRbot, GPS, proximity sensor, LCD, 9DOF, servos, Zippy,
I²C, and webcam; operating the robot system and handling faults and problems; and finalization, ending on
18/9/2010. Documentation of Chapters 1 through 5 ran in parallel with these tasks.]

Figure 1.6: Project activity network

Chapter 2. Hardware and Implementation

As you may have already noted from the block diagram of the system
in the previous chapter, the system is composed of several components
that first need to be discussed separately. We shall then take a look at
the interconnections between these modules, including their signaling
protocols and the power management strategy.

PIC and ATMega microcontrollers

Beagle and Zippy boards

Actuators and Sensors

Hardware changes and interconnection


2.1 Microchip® PIC® Microcontroller

PIC® is a family of Harvard architecture microcontrollers made by Microchip Technology, derived from the
PIC1640 originally developed by General Instrument's Microelectronics Division. The name PIC® initially
referred to "Programmable Interface Controller".[14]

PICs are popular with both industrial developers and hobbyists alike due to their low cost, wide
availability, large user base, extensive collection of application notes, availability of low-cost or free
development tools, and serial programming (and re-programming with flash memory) capability.[15] See
Figure 2.1.

Figure 2.1: 16-bit 28-pin PDIP PIC24 microcontroller next to a metric ruler

I am using the PIC18F2455 from the PIC® microcontroller series, a 28-pin, USB (Universal Serial Bus) 2.0
compliant processor with 24KB of program space and 24 I/O (input/output) lines, 10 of which can serve as
10-bit analog-to-digital converter inputs (see Figure 2.2). It can run at up to 48MHz with an external crystal,
and the package can be programmed in-circuit.

Figure 2.2: PIC18F2455 Pins Diagram

This chip is the main processor – even though it is not the most powerful processor inside this robot –
and it handles communication among the remaining modules. For the internal structure and pin usage,
please refer to the datasheet.

Do not panic at the sight of that 28-legged spider. The connections are quite simple: all you need to do
is connect the power pins (19-20), and the oscillator pins (9-10) to a 20MHz oscillator, and you are ready to
go. Most of the other pins are I/O pins; we will deal with them later. But if you want rapid development,
get a prototype board such as the Olimex PIC-P28 or the Olimex PIC-P28-USB (preferred).
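
As a first sanity check of that minimal wiring, the following CCS C sketch blinks an LED. It assumes the 20MHz crystal mentioned above and an LED on pin RB0; both are illustrative choices, not the robot's final configuration.

    #include <18F2455.h>
    #fuses HS, NOWDT, NOLVP, NOPROTECT   // HS fuse for the external 20MHz crystal
    #use delay(clock=20000000)

    void main(void)
    {
       while (TRUE) {
          output_toggle(PIN_B0);   // LED on RB0 (assumed wiring) toggles...
          delay_ms(500);           // ...every half second
       }
    }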


2.1.1 Olimex PIC-P28-USB

This is a 28-pin development board with a built-in USB connection. Power and serial communications are
provided by the FTDI (Future Technology Devices International) USB<->RS232 (Recommended Standard 232)
converter chip. You do not have to do anything: connect the USB cable from your computer to the board,
and a new serial COM port will be installed under Windows. It also includes a socket for an I²C (Inter-Integrated
Circuit) EEPROM (Electrically Erasable Programmable Read-Only Memory) – complete with pull-up resistors
and connections to the PIC® pins. It is perfect for basic debugging and serial data-logging applications.[16]

Figure 2.3: ICSP/ICD enabled 28 pin PIC® microcontroller prototype board with USB

Now we can easily place the PIC® µC (microcontroller) in this board and use the prototyping ports for
our experiments. But you may ask: what is that small 8-pin socket used for (see Figure 2.3)? Well, it is
reserved for a backbone storage chip called an I²C EEPROM. I will use the EEPROM for storing the strings
that get displayed on the LCD (Liquid Crystal Display) and/or pronounced by the TTS256, so I can save the
ROM space for the critical code only.

If you have some extra time, or are running out of money, I suggest that you build your own board; it is
fairly simple – see the schematic[16] shown in Figure 2.4.


Figure 2.4: 28 Pin PIC Development Board Schematic


2.1.2 I²C EEPROM

I am using 24LC256 (see Figure2.5), a 2 Wire Serial Communica on (I²C)


EEPROM - The back bone of any microprocessor project. These 8 pin
chips only need two wires to communicate and retain their data even
with power failure. Use the 256K EEPROM for some serious data
storage.[17]

For the internal structure and pin assignments, please refer to the datasheet.
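
As a taste of how the PIC® will talk to this chip, here is a hedged CCS C sketch of single-byte reads and writes over I²C. It follows the 24LC256 datasheet sequence (control byte, 16-bit address, then data); the SDA/SCL pins match the PIC18F2455 hardware I²C module, but the device address assumes A0–A2 are tied low.

    #include <18F2455.h>
    #fuses HS, NOWDT, NOLVP
    #use delay(clock=20000000)
    #use i2c(Master, sda=PIN_B0, scl=PIN_B1)   // PIC18F2455 MSSP pins

    #define EEPROM_CTRL 0xA0   // 24LC256 control byte, A0-A2 tied low (assumed)

    void ext_eeprom_write(long addr, int8 data)
    {
       i2c_start();
       i2c_write(EEPROM_CTRL);      // control byte, R/W = 0 (write)
       i2c_write(addr >> 8);        // address high byte
       i2c_write(addr & 0xFF);      // address low byte
       i2c_write(data);
       i2c_stop();
       delay_ms(5);                 // wait out the internal write cycle
    }

    int8 ext_eeprom_read(long addr)
    {
       int8 data;
       i2c_start();
       i2c_write(EEPROM_CTRL);      // first set the address pointer
       i2c_write(addr >> 8);
       i2c_write(addr & 0xFF);
       i2c_start();                 // repeated start
       i2c_write(EEPROM_CTRL | 1);  // control byte, R/W = 1 (read)
       data = i2c_read(0);          // read one byte, end with NACK
       i2c_stop();
       return data;
    }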

Now that we know all the components that shall be placed on the PIC® board, we need to know how to
start programming it. I will use an MPLAB-compatible, USB-powered ZIF programmer called the Olimex
PIC-MCP-USB.

2.1.3 Olimex PIC-MCP-USB

This is a low-cost alternative to the PicStart+ (the official Microchip programmer). The MCP-USB is
MPLAB-compatible and is fully recognized under MPLAB as a PicStart+ device. To operate, it needs only a
USB A-to-B cable. The MCP-USB includes a gold-plated 40-pin ZIF socket which accepts all types of DIP PIC®
microcontrollers. In addition, it has an ICSP connector and can be used to program all PIC®-PxxB prototype
boards[18]; our board is one of them, as it has the ICSP (In-Circuit Serial Programming) connector, which can
be connected through the cable shown (see Figure 2.6) to program the PIC® directly while it sits in the
board. Alternatively, if you want to, you can have the PIC® chip programmed in the ZIF socket (shown in
blue), with pin 1 of the PIC® near the lever that secures the ZIF socket.

Figure 2.6: Olimex PIC-MCP-USB with 40-pin ZIF socket and ICSP connector

Also, if you want to build your own simple ICSP programmer and save some money while spending some
time, you can follow the schematic[19] shown in Figure 2.7. Note that this schematic requires a serial RS232
connection and an external power source. If you do not have a serial RS232 port on your PC (Personal
Computer), use the DB9M-to-USB adaptor discussed in Appendix III.


Figure 2.7: Serial Port Programmer Schematic - ICSP Only

Great progress – but how do we operate one of these programmers and program the PIC®
microcontroller? This is done through the MPLAB IDE.


NOTE
You may think that these IDEs should be discussed in the next chapter, which is dedicated to
software; however, the next chapter discusses the robot's developed software, not the IDEs
and their associated add-ons, which are not part of my software but are requirements for
operating the corresponding hardware.

2.1.4 MPLAB Integrated Development Environment

MPLAB Integrated Development Environment (IDE) is a free, integrated toolset for the development of
embedded applications employing Microchip's PIC® and dsPIC® microcontrollers. MPLAB IDE runs as a 32-bit
application on MS Windows®, is easy to use and includes a host of free software components for fast
application development and super-charged debugging. MPLAB IDE also serves as a single, unified graphical
user interface for additional Microchip and third party software and hardware development tools. Moving
between tools is a snap because MPLAB IDE has the same user interface for all tools including simulators,
debuggers, and programmers.[20] See Figure2.8.

Figure 2.8: MPLAB IDE running a project with various debugging windows

Unfortunately, the compilers that come with the MPLAB IDE are not good enough for compiling
standard C and RTOS (Real-Time Operating System) derived code; they support only a very limited C dialect
that is difficult to learn and use. However, there are third-party products with very good compilers for PIC®
microcontrollers that can operate as MPLAB IDE add-ons. I am using the CCS C Compiler, PCWHD series,
which is very powerful and enabled me to write the kind of code I am familiar with on the PC. But if you
want my advice, take the PCWH series and save 100 USD, or program using their free compilers – though
porting the code provided in this document to the cheaper compilers is not guaranteed.

2.1.5 CCS IDE Compilers

Intelligent and highly optimized CCS C compilers contain Standard C operators and Built-in Function
libraries that are specific to PIC® registers, providing developers with a powerful tool for accessing device
hardware features from the C language level. Standard C preprocessors, operators and statements can be
combined with hardware specific directives and CCS provided built-in functions and example libraries to
quickly develop applications incorporating cutting edge technologies such as capacitive touch, wireless and
wired communication, motion and motor control and energy management.[21] See Figure2.9.

Figure 2.9: CCS Compiler - PCW Interface

Remember that PCWH and PCWHD are the only CCS series that support the PIC18 family of
microcontrollers, so be warned.


Now our preparations of the main board are complete, and we can start programming the PIC®
microcontroller using standard C. Let us learn about the other components and peripherals that will soon
be connected to the main board.

2.2 VRbot Module

VRbot (VR: Voice Recognition) Module is designed to easily add versatile voice command functionality to
robots, and in particular ROBONOVA-I and ROBOZAK.[22]

Although designed specifically with the ROBONOVA-I and ROBOZAK robots in mind, the VRbot module
can also be used to efficiently implement Voice Recognition capabilities on virtually any host platform. The
module is completely self-contained and interacts with the host through a simple, yet robust serial protocol,
enabling Voice Recognition on relatively low-power processors such as ATMEGA®, PIC® etc.[23] See
Figure2.10.

Great news: it can be connected directly to the PIC®, as easily as soldering 4 wires to 4 holes in the
prototyping board discussed earlier. With that, we have added a hearing capability to the robot;
programming comes later, in the next chapter. Now we need it to speak as well.

Figure 2.10: VRbot - Voice Recognition Module

Unfortunately, we need to purchase this item or an equivalent; it cannot be made at home. If you want
more details about this item, please refer to its datasheet.
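
To give a feel for that serial protocol, here is a hedged CCS C sketch of the wake-up handshake. The single-character request/answer bytes ('b' to wake the module, 'o' as the OK answer) are my reading of the VRbot protocol documentation and should be verified against the datasheet; the UART pins are those of the PIC18F2455 hardware UART.

    #include <18F2455.h>
    #fuses HS, NOWDT, NOLVP
    #use delay(clock=20000000)
    #use rs232(baud=9600, xmit=PIN_C6, rcv=PIN_C7)   // hardware UART to the VRbot

    // Returns TRUE once the module answers the wake-up byte.
    int1 vrbot_wakeup(void)
    {
       int8 tries;
       for (tries = 0; tries < 5; tries++) {
          putc('b');                      // wake-up request (per VRbot protocol)
          delay_ms(100);
          if (kbhit() && getc() == 'o')   // 'o' = module awake and ready
             return TRUE;
       }
       return FALSE;                      // module did not respond
    }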

2.3 SpeakJet, VoiceBox Shield & TTS256

The SpeakJet is a completely self-contained, single-chip voice and complex sound synthesizer. It uses a
mathematical sound algorithm to control an internal five-channel sound synthesizer to generate on-the-fly,
unlimited-vocabulary speech synthesis and complex sounds.[24] See Figure 2.11a.

Populated on the VoiceBox Shield (designed especially for the Arduino, but it operates with the PIC® as
well) are the 18-DIP SpeakJet IC (no need to purchase a separate SpeakJet) and a two-stage audio amplifier
with a potentiometer to set the gain. You can connect a speaker directly to the “SPK+/-” pins and get your
controller talking with very minimal work![25] See Figure 2.11b.

The TTS256 (text-to-speech chip for the SpeakJet) is an 8-bit microprocessor programmed with letter-to-
sound rules. This built-in algorithm allows the automatic, real-time translation of English ASCII characters
into allophone addresses compatible with the SpeakJet speech synthesizer IC. Combine it with the
SpeakJet to build a complete text-to-speech solution.[26] See Figure 2.11c.

Do not forget to purchase a speaker to output the voice; I am using a 0.5W 8Ω speaker connected
directly to the output of the VoiceBox Shield.

Figure 2.11a: SpeakJet 18-pin chip
Figure 2.11b: VoiceBox Shield with a 13x7 grid of holes for prototyping
Figure 2.11c: TTS256, a 28-pin, 8-bit microprocessor
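
Driving the TTS256 from the PIC® is then just a matter of streaming ASCII text over a TTL serial line. The sketch below is a hedged illustration: the 9600 baud rate and the carriage return as end-of-phrase marker are my assumptions from the TTS256 documentation, and the TX pin is an arbitrary spare pin driven as a software UART.

    #include <18F2455.h>
    #fuses HS, NOWDT, NOLVP
    #use delay(clock=20000000)
    // Software UART on a spare pin (assumed wiring), named stream "TTS"
    #use rs232(baud=9600, xmit=PIN_B2, stream=TTS)

    // Send one English phrase; the TTS256 converts it to SpeakJet allophones.
    void say(char *text)
    {
       fprintf(TTS, "%s\r", text);   // CR assumed to mark the end of a phrase
    }

    void main(void)
    {
       delay_ms(1000);               // let the TTS256/SpeakJet power up
       say("Hello, I am Roboty");
       while (TRUE);
    }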

Again, the VoiceBox Shield can be assembled at home. If this is your choice, you will need to
purchase the SpeakJet and a dual operational amplifier, along with some resistors and capacitors, as
indicated in the schematic[25] shown in Figure 2.12.

By the way, I have heard that there are many text-to-speech converters that come as a single UART
(Universal Asynchronous Receiver/Transmitter) capable chip, but I was not able to locate one. If you find
one cheaper, then by all means get it and use it with our system; no changes to the system architecture
would be needed, only perhaps some minor changes within the software module that drives the chip.


Figure 2.12: VoiceBox Shield Schematic


2.4 Character LCD

This is a basic 20 character by 4 line display. It utilizes the extremely common HD44780 parallel interface chipset. Interface code is freely available. You will need ~11 general I/O pins to interface to this LCD screen. It includes an LED (Light-Emitting Diode) backlight.[27] See Figure2.13.

Well, the I/O pins of our microcontroller are limited and we cannot just sacrifice them for the LCD, so I will use a serial enabled LCD backpack provided by the SparkFun company.

We will use this LCD to output status indications, such as the actions in progress in real time, as well as the boot diagnostics and some possible errors, so you can maintain the robot accordingly.

Figure 2.13: Basic 20x4 Character LCD

2.4.1 LCD Backpack

The serial enabled LCD backpack allows you to control a parallel based LCD over a single-wire serial interface. The SerLCD backpack takes care of all the HD44780 commands, allowing seamless integration with any micro that can communicate over a wide range of TTL serial baud rates. The SerLCD currently supports 16 and 20 character wide screens with 2 or 4 lines of display.[28] See Figure2.14.

Figure 2.14: Serial Enabled LCD Backpack

It is just what we need to integrate with the above LCD in order to save our I/O pins: just solder it to the back of the LCD, and we now need only one I/O pin to control the LCD instead of 11 pins.
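As an illustration of how little code the backpack needs, the sketch below clears the screen and prints a message. The SerLCD convention of prefixing raw HD44780 commands with 0xFE is documented by SparkFun; the pin and stream names here are my own placeholders.

#use rs232(baud=9600, xmit=PIN_B5, stream=LCD)   // single TX pin; pin choice is a placeholder

void lcd_clear(void)
{
   fputc(0xFE, LCD);   // 0xFE introduces a raw HD44780 command
   fputc(0x01, LCD);   // HD44780 "clear display" command
}

// Plain characters sent to the stream are displayed as-is:
// fprintf(LCD, "Roboty booting...");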

2.5 GPS Module

The D2523T is a compact GPS smart-antenna engine board, which comes equipped with a Sarantel GeoHelix high-gain active antenna and GPS receiver circuits. The module is based around the high performance 50-channel u-blox 5 platform.[29] See Figure2.15.

Figure 2.15: 50 Channel D2523T Helical GPS Receiver

To interface this GPS module you need to use the interface cable for the EM401 or EM406, and then connect the other side of this cable to a logic level converter, since this GPS module operates at 3.3V while our circuit operates at 5V.
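Once wired, the module streams standard NMEA 0183 sentences over its serial line. As a rough sketch of what the receiving software has to do, the helper below pulls the raw latitude and longitude fields out of a $GPGGA sentence; it assumes a complete comma-separated sentence with all fields present, and it skips checksum verification.

#include <string.h>

// lat receives "ddmm.mmmm" and lon receives "dddmm.mmmm" (raw NMEA fields).
void parse_gpgga(char *sentence, char *lat, char *lon)
{
   int index = 0;
   char *field = strtok(sentence, ",");
   while (field != NULL)
   {
      if (index == 2) strcpy(lat, field);   // latitude field of $GPGGA
      if (index == 4) strcpy(lon, field);   // longitude field of $GPGGA
      field = strtok(NULL, ",");
      index++;
   }
}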


2.5.1 Logic Level Converter

If you have ever tried to connect a 3.3V device to a 5V system, you know what a challenge it can be. The SparkFun logic level converter is a small device that safely steps down 5V signals to 3.3V and steps up 3.3V to 5V. This level converter also works with 2.8V and 1.8V devices. Each level converter has the capability of converting 4 pins on the high side to 4 pins on the low side. Two inputs and two outputs are provided for each side.[30] See Figure2.16.

Figure 2.16: Logic Level Converter

Breadboard friendly! It can be used with normal serial, I²C, SPI (Serial Peripheral Interface), and any other digital signal, but it does not work with analog signals.

This is just it; thanks to SparkFun again, their products are really handy. Now cut the other end of the GPS cable, solder its wires according to the indicated signals, and you are ready to interface the other side of the level converter directly to the PIC® main board.

This converter is quite cheap, so I do not suggest that you build your own; just purchase this one and save a lot of time.

By the way, this converter has two serial channels, and we only used one of them; the second channel will be used for the balancing sensors board (part of the tilt controller), since such boards usually run at 3.3V, and the one I got is indeed 3.3V.

2.6 Tilt controller

An accelerometer is a device that measures proper acceleration, the acceleration experienced relative to freefall. Single- and multi-axis models are available to detect the magnitude and direction of the acceleration as a vector quantity, and can be used to sense orientation, acceleration, vibration, shock, and falling. Micro-machined accelerometers are increasingly present in portable electronic devices and video game controllers, to detect the position of the device or provide for game input.[31]

A gyroscope is a device for measuring or maintaining orientation, based on the principles of conservation of angular momentum. A mechanical gyroscope is essentially a spinning wheel or disk whose axle is free to take any orientation. This orientation changes much less in response to a given external torque than it would without the large angular momentum associated with the gyroscope's high rate of spin. Since external torque is minimized by mounting the device in gimbals, its orientation remains nearly fixed, regardless of any motion of the platform on which it is mounted.[32]


The needed combination of these sensors is an X- and Z-axis accelerometer as well as an X-axis gyroscope with a dedicated processing unit (interfaced by UART or I²C) for fast handling of position and speed changes; we cannot just rely on the main processor to process the outputs of the sensors directly, as it already has enough burden. I searched for such a component but found either the three sensors without a processing unit, or more sensors (at additional cost) with a processing unit. I had no choice but to purchase the latter; it is called the “9DOF - Razor IMU - AHRS compatible” – DOF: Degrees Of Freedom, IMU: Inertial Measurement Unit – designed by the SparkFun company. If you find something like it with only the needed sensors, then use it instead of the one I got, as it should be cheaper.

The 9DOF Razor IMU incorporates four sensors - an LY530AL (single-axis gyro), LPR530AL (dual-axis gyro), ADXL345 (triple-axis accelerometer), and HMC5843 (triple-axis magnetometer) - to give you nine degrees of inertial measurement. The outputs of all sensors are processed by an on-board ATMega®328 (a microcontroller similar to the PIC® microcontroller) and output over a serial interface.[33] See Figure2.17.

Figure 2.17: 9DOF - Razor IMU - AHRS compatible

This board contains all the needed sensors (and more) and a controller, with just two I/O lines for communication with the PIC® main board.

But unfortunately, since it is ATMega® based (not PIC® based), we may need another bootloader programmer (optional – used in case of a corrupted bootloader, as in my case), a 3.3V FTDI basic breakout for firmware uploading (or a 5V one, if you want to use the logic level converter and hit the reset button manually while programming), as well as another programming IDE such as the Arduino IDE (which I am using) or AVRDude.

Since all the needed sensors and the microprocessor are surface-mounted on the PCB (Printed Circuit Board), we cannot really assemble our own, and the only choice we have is to purchase one of these breakout boards.

2.6.1 Pocket AVR Programmer

This is a simple to use USB AVR programmer. It is low cost, easy to use, and works great with AVRDude and the Arduino IDE. It is based on Dick Streefland's USBtiny and Limor Fried's USBtinyISP.[34] See Figure2.18.

Figure 2.18: Pocket AVR Programmer

With this programmer we can program the bootloader of the 9DOF by soldering 2x3-pin male headers into the 2x3-pin holes of the 9DOF, and then placing the programmer's 2x3-pin ISP connector onto that newly soldered header.


2.6.2 FTDI Basic Breakout - 3.3V

This is a basic breakout board for the FTDI FT232RL USB to serial IC. The pinout of this board matches the
FTDI cable to work with official Arduino and cloned 3.3V Arduino boards. It can also be used for general
serial applications. The major difference with this board is that it brings out the DTR pin as opposed to the
RTS pin of the FTDI cable. The DTR pin allows an Arduino target to auto-reset when a new Sketch is
downloaded. This is a really nice feature to have and allows a sketch to be downloaded without having to hit
the reset button. This board will auto reset any Arduino board that has the reset pin brought out to a 6-pin
connector.[35] See Figure2.19.

Figure 2.19: FTDI Basic Breakout

With this breakout we can easily program the C code into the 9DOF using the Arduino IDE: just select the COM port that this board was installed with from the Arduino IDE, and you are ready to go.

2.6.3 Arduino IDE

The open-source Arduino environment makes it easy to write code and upload it to the I/O board. It runs on
Windows, Mac OS X, and Linux. The environment is written in Java and based on Processing, avr-gcc, and
other open source software. See Figure2.20

The Arduino development environment contains a text editor for writing code, a message area, a text
console, a toolbar with buttons for common functions, and a series of menus. It connects to the Arduino
hardware to upload programs and communicate with them.

Software written using Arduino is called a sketch. These sketches are written in the text editor. It has
features for cutting/pasting and for searching/replacing text. The message area gives feedback while saving
and exporting and also displays errors. The console displays text output by the Arduino environment
including complete error messages and other information. The toolbar buttons allow you to verify and
upload programs, create, open, and save sketches, and open the serial monitor.[36]


Figure 2.20: Arduino IDE with example C code

2.7 Motors Unit

The motors unit is composed of the position/speed feedback coming from the wheel encoders and the serial controlled motor driver, which corrects the wheels' position and speed according to the speed requested by the PIC® microcontroller. This is done by comparing the wheel encoders' output at fixed intervals with the set-point, using a PID (Proportional–Integral–Derivative) controller.


2.7.1 Wheels encoder

I purchased a wheel encoder set from Pololu composed of a pair of 42x19mm wheels, a pair of extended
brackets, and two matching wheel encoders. See Figure2.21a.

This quadrature encoder board is designed to hold two infrared reflectance sensors inside the hub of the 42x19mm wheel and measure the movement of the twelve teeth along the wheel’s rim. The two sensors are spaced to provide waveforms approximately 90 degrees out of phase, allowing the direction of rotation to be determined and providing four counts per tooth for a resolution of 48 counts per wheel rotation. Each analog sensor signal is fed to a comparator with hysteresis to provide glitch-free digital outputs. The compact layout of the board fits all of the components within the envelope of the hub and tire, allowing the board to be mounted between the motor and a chassis. The encoder is calibrated for operation from 4.5V to 5.5V (just as desired for our robot), but it can be recalibrated for operation at 3.3V.[37]
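To make those numbers concrete, 48 counts per rotation of a 42mm wheel work out to roughly 2.7mm of travel per count. A small helper like the one below (my own illustration, not part of the encoder's code) does the conversion:

#define COUNTS_PER_REV 48.0   // 4 counts x 12 teeth, from the encoder specification
#define WHEEL_DIAM_CM  4.2    // 42x19mm Pololu wheel

// Convert an accumulated encoder count into travelled distance in cm.
float counts_to_cm(long counts)
{
   return (counts / COUNTS_PER_REV) * (3.14159 * WHEEL_DIAM_CM);
}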

All we need now is to purchase a pair of “250:1 Micro Metal Gearmotor HP” motors from Pololu (see Figure2.21b), which have a long (0.365" or 9.27 mm), D-shaped metal output shaft that matches the Pololu 42x19mm and 32x7mm wheels. Assemble them together with the wheel encoder set, and feed the outputs to our motor controller. See Figure2.21c.

If you want to build a bigger robot, get suitable wheels and a more powerful power supply; nothing would change in the system signals or anything else.

Figure 2.21a: Wheel Encoder Set
Figure 2.21b: Micro Metal Gearmotor
Figure 2.21c: Motor assembled with wheel encoder set


2.7.2 Serial controlled motor driver

The Serial Controlled Motor Driver allows you to control up to two DC motors using a serial command interface. The serial interface is easy to use and it lets the user select an individual motor, the direction, and the desired speed/position/tilt compensation value. The board is based on the L298 Dual Full-Bridge Motor Driver from ST Micro. The motor driver can provide up to 4 Amps of current to the motors (2 Amps per motor). See Figure2.22.

Power can be applied to either the two-pin JST (Japanese Solderless Terminal) connector or the GND and Vcc header pins. Supplied power should be DC and within 5-16V. Please note, the pin labeled “5V” should only be used as an output.[38]

Figure 2.22: Serial controlled motor driver

I had two problems with this controller: it does not support quadrature wheel encoders, and its baud rate is set to 115200bps, which is very high. But since its features state that "Unused ATMega®328 pins broken out for custom use", if you look closely at the board you will find 4 unused I/O pins, which are exactly the pins needed for the quadrature wheel encoders. Also, since it is based on the ATMega®, we can re-program it with the same tools used to re-program the 9DOF (except that we need a 5V instead of a 3.3V FTDI breakout here). We also only need to solder a 2x3 male connector into its 2x3 holes in case of bootloader corruption; otherwise just use the FTDI Basic Breakout, and be aware that the pins here are not arranged in the same way as on the 9DOF. Also, while programming, the reset is not done automatically (due to the incompatibility with the FTDI auto-reset) and you need to reset the board manually via its tiny reset button located near the ATMega® chip.
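For reference, the single-byte command format that the re-programmed driver firmware expects (bit 7 = motor select, bit 6 = direction, bits 5..0 = speed; see Listing3.8 in the next chapter) can be composed on the PIC® side with a few lines like these - a sketch that assumes the UART to the driver is the active stream:

void drive_motor(int8 motor, int8 dir, int8 speed)
{
   // motor: 0 or 1, dir: 0 = forward / 1 = reverse, speed: 0..63
   int8 cmd = ((motor & 1) << 7) | ((dir & 1) << 6) | (speed & 0x3F);
   putc(cmd);   // one byte selects motor, direction and speed at once
}

// Example: drive_motor(0, 0, 32); // motor 1, forward, about half speed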

2.8 Infrared Proximity Sensor Long Range

Infrared proximity sensor made by Sharp. Part # GP2Y0A02YK0F has an analog output that varies from 2.8V at 15cm to 0.4V at 150cm with a supply voltage between 4.5 and 5.5VDC (See Figure2.23). The sensor has a JST connector.[39] I recommend purchasing the Infrared Sensor Jumper Wire - 3-Pin JST, or soldering wires directly to the back of the module.

This sensor is great for sensing objects up to 150cm away! And it only requires one pin to get connected to the main board.

Figure 2.23: Infrared Proximity Sensor Long Range - Sharp GP2Y0A02YK0F
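The sensor's output is quite nonlinear, so the voltage has to be mapped to a distance in software. Fitting the two endpoints quoted above (2.8V at 15cm, 0.4V at 150cm) with an inverse-law curve gives the rough conversion below; this is only a sketch, and each individual sensor should be calibrated.

// d [cm] ~ 63.0/V - 7.5, an inverse-law fit through the two quoted points.
float ir_distance_cm(float volts)
{
   if (volts <= 0.4) return 150.0;   // beyond the rated range
   return 63.0 / volts - 7.5;
}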

WARNING
To avoid damaging the sensitive electronic devices and sensors, the power supply of the servos and DC motors must be separated from the electronic devices' power supply.


2.9 Servos

Here is a simple, low-cost, high quality servo for all your mechatronic needs: a large servo with a standard 3-pin power and control cable. It includes the hardware shown in Figure2.24a.

I am using two of the large type servos to control the movement of the neck of the robot (which holds the webcam); each one requires only one I/O pin of the PIC® microcontroller.

There also exists a small type servo with the same specifications as the large one above. It includes the hardware shown in Figure2.24b.

I am using one of these small servos to control the movements of the infrared proximity sensor; each servo of this type likewise requires only one I/O pin of the PIC® microcontroller.
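Driving such a servo from a single I/O pin boils down to repeating a 1-2ms high pulse every ~20ms. The fragment below is a minimal software sketch of that timing; the pin and the exact pulse bounds are assumptions of mine, and the real firmware is discussed in the next chapter.

// Call repeatedly with pulse_us between ~1000 (one end) and ~2000 (other end).
void servo_pulse(int16 pulse_us)
{
   output_high(PIN_B4);   // placeholder pin
   delay_us(pulse_us);    // high time sets the angle (~1500us = centre)
   output_low(PIN_B4);
   delay_ms(18);          // remainder of the ~20ms frame
}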

Figure 2.24a: Large Servo
Figure 2.24b: Small Servo

2.10 BeagleBoard

The BeagleBoard is a low-power, low-cost single-board computer produced by Texas Instruments in association with Digi-Key. The BeagleBoard was designed with open source development in mind, and as a way of demonstrating Texas Instruments' OMAP3530 system-on-a-chip. The board was developed by a small team of TI engineers. See Figure2.25.[40][41]

This board deserves more details, but as it is out of our main scope of study here, please refer to its manual and associated guides[42][43][44] in case of further interest.

Figure 2.25: Beagle Board (top view)


Note: this is the most powerful component in the whole system; I dedicated it to image processing, and I am running it in association with the BeagleBuddy Zippy, a Linux UVC (USB Video Class) webcam, and the Ångström Linux distribution.

2.10.1 BeagleBuddy Zippy

The BeagleBuddy Zippy Ethernet Combo Board is a low cost expansion board for the BeagleBoard (see Figure2.26) that provides the following peripherals:

 10BaseT Ethernet
 Second SD/MMC Interface
 Second RS232 Serial Interface
 Real-Time clock with Battery Back-up
 I²C Interface (+5V level)
 AT24C01 Serial EEPROM for Board Identification[45]

Figure 2.26: BeagleBuddy Zippy

Yes, you are right: I am using this board for both the Ethernet port and the I²C interface it provides, so I can connect the board to the internet to import some Linux libraries and tools that facilitate image processing, and also use the I²C interface to connect it directly to the PIC® microcontroller main board (since the I²C signals of the BeagleBoard without the Zippy are 3.3V).
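On the Linux side, talking to the PIC® over this I²C link needs nothing more than the kernel's i2c-dev interface. The sketch below sends one byte; the bus device node and the PIC's slave address (0x20 here) are placeholders for whatever the final firmware defines.

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
   int file = open("/dev/i2c-2", O_RDWR);   /* bus number is an assumption */
   if (file < 0) { perror("open"); return 1; }

   if (ioctl(file, I2C_SLAVE, 0x20) < 0)    /* PIC slave address: placeholder */
   { perror("ioctl"); return 1; }

   unsigned char cmd = 0x01;                /* example command byte */
   if (write(file, &cmd, 1) != 1) perror("write");

   close(file);
   return 0;
}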

2.10.2 Linux UVC Webcam

You will need to pick a webcam to be used with the BeagleBoard (BB). USB Video Class (UVC for short) is a new compatibility standard for USB 2.0 video-chat cameras. It is imperative that whatever you buy supports UVC, or you will not find a Linux driver for the camera.

The Logitech QuickCam Pro 9000[46] (see Figure2.27) – the one I use – captures high quality images and video. But its “Right Light 2” function may cause low frame-rates in low-lighting conditions, and the camera does native motion blur when there are low frame-rates. “Right Light 2” can be disabled, as we need to turn off motion blur.

Figure 2.27: Logitech QuickCam Pro 9000


I removed the legs of this webcam and attached it to two perpendicular servomotors, which enable it to move left, right, up and down.

This webcam can be connected directly to the host USB port of the BeagleBoard. We now need to install the Ångström Linux distribution onto its SD card (you need to purchase one no smaller than 4GB), and then you are ready to go.

2.10.3 Ångström Linux distribution

Ångström was started by a small group of people who worked on the OpenEmbedded, OpenZaurus and OpenSimpad projects to unify their effort to make a stable and user-friendly distribution for embedded devices like handhelds, set top boxes, network-attached storage devices and more.[47] See Figure2.28.

All you need to install Ångström onto your BeagleBoard is to follow the installation instructions within Appendix III of the project. Furthermore, a number of software packages are needed to process the images from the webcam, the main one being the OpenCV library.

Figure 2.28: Running Ångström within Beagle Board

2.10.4 OpenCV

OpenCV (Open Source Computer Vision) is a library of programming functions for real time computer vision. Officially launched in 1999, the OpenCV project was initially an Intel Research initiative to advance CPU-intensive applications, part of a series of projects including real-time ray tracing and 3D display walls. The main contributors to the project included Intel’s Performance Library Team, as well as a number of optimization experts in Intel Russia.

The library is mainly written in C, which makes it portable to some specific platforms such as digital
signal processors. Wrappers for languages such as C#, Python, Ruby and Java have been developed to
encourage adoption by a wider audience.

However, since version 2.0, OpenCV includes both its traditional C interface as well as a new C++ interface, which seeks to reduce common programming errors when using OpenCV in C. Much of the new development and algorithms in OpenCV are in the C++ interface. Unfortunately, it is much more difficult to provide wrappers in other languages to C++ code as opposed to C code; therefore the other language wrappers are generally lacking some of the newer OpenCV 2.0 features.

OpenCV is released under a BSD (Berkeley Software Distribution) license; it is free for both academic and commercial use. The library has more than 500 optimized algorithms (see Figure2.29). It is used around the world, has more than 2,000,000 downloads and over 40,000 people in the user group. Uses range from interactive art, to mine inspection, stitching maps on the web, on through advanced robotics.[48]

Figure 2.29: OpenCV algorithms overview

2.11 Batteries and Chargers

Since this robot is autonomous, I cannot just connect it to a wall adaptor. For this reason, I searched for proper portable batteries to keep the power on for hours, and the best choices I found were Polymer Lithium Ion batteries for the motors and the BeagleJuice for the circuit. I also needed the Accucel-6 charger for the LiPoly batteries.

The Polymer Lithium Ion Battery Pack - 2200mAh (Milliampere-hours) at 7.4V - is an excellent choice for anything that requires a small battery with a lot of punch. The voltage is low enough not to tax your regulating circuits, and the discharge rate is high enough to accommodate a lot of electronics and a few small motors. See Figure2.30a.

The Accucel-6 50W 5A Balancer/Charger will handle LiPo (Polymer Lithium Ion battery)/LiFe (Lithium iron phosphate battery) up to 6S and NiMH (Nickel-metal hydride battery)/NiCd (Nickel-cadmium battery) up to 15S, and it shows individual cell voltages during charge with real-time updates throughout the charge cycle. See Figure2.30b.

BeagleJuice makes the BeagleBoard portable. A battery module that provides a 5V power supply directly to the BeagleBoard, it can be mounted onto the back side using standoffs, which keeps the expansion ports easily accessible. The 4500mAh Li-ion battery is charged using specialized charging circuitry that boosts performance for longer-lasting applications. The module itself can supply power through a special, 2-pin barrel jack, or directly through wires on dual pin headers. It can be charged through a Type-B mini USB port, and a second USB port is also available for accelerated charging. See Figure2.30c.

Figure 2.30a: Polymer Lithium Ion Battery
Figure 2.30b: Accucel-6 50W Charger
Figure 2.30c: BeagleJuice Battery

2.12 Main board modifications

Now that we have discussed all the components of the system, we will start modifying and assembling them to build our beloved, long-awaited robot.

By referring to the data sheet of the PIC18F2455, you may notice that the Olimex PIC®-P28-USB board was not designed for the PIC18F2455 microcontroller, and thus the board's I²C bus is not connected correctly. See Figure2.31a.

Also, after correcting it, you may notice that the address of the EEPROM associated with it is 0x50, which conflicts with the address of the BeagleBuddy Zippy board's EEPROM. Do not tell me to change the address on the complex Zippy board, which is needed for the boot of Linux; this is not an option. We just need to:

1. Correct the I²C bus by disconnecting the wires from PIN 14 and PIN 15, then connecting PIN 5 of the EEPROM to PIN 21, and PIN 6 of the EEPROM to PIN 22.

2. Change the address of the EEPROM within the board to 0x51 by removing ALL the wires from PIN 1 of the EEPROM, then connecting PIN 1 of the EEPROM to PIN 8 of the EEPROM, and PIN 2 of the EEPROM to PIN 7 of the EEPROM.

Now our board is compatible with the BeagleBuddy Zippy board, and it can also communicate with its own EEPROM correctly (but at a different address). See Figure2.31b.
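To verify the modification, a byte can be read back from the relocated EEPROM with CCS's built-in I²C functions. The following is only a sketch: it assumes the 7-bit address 0x51 (control bytes 0xA2/0xA3 on the wire), two-byte word addressing, and a #use i2c() master setup elsewhere in the program.

int8 eeprom_read_byte(int16 addr)
{
   int8 data;
   i2c_start();
   i2c_write(0xA2);         // device select + write (7-bit address 0x51)
   i2c_write(addr >> 8);    // address high byte (two-byte addressing assumed)
   i2c_write(addr & 0xFF);  // address low byte
   i2c_start();             // repeated start
   i2c_write(0xA3);         // device select + read
   data = i2c_read(0);      // read one byte, NACK to finish
   i2c_stop();
   return data;
}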


Figure 2.31a: PIC®-P28-USB before modification
Figure 2.31b: PIC®-P28-USB after modification

2.13 Using the TTS256 with the VoiceBox Shield

The TTS256 Text to Speech IC is a fantastic little chip that interfaces to the SpeakJet IC and allows the user to send a text string to the device and have the text automatically converted to sounds the SpeakJet can interpret. In essence, you send English sentences to the TTS256 and the SpeakJet speaks them aloud. While a person could certainly do this themselves, the “code cost” is very high; once the necessary code to interpret English was written, you would not have much code space left for your own application. The TTS256 means you can add text to speech to your project while still having plenty of room for other things.

We have to put the TTS256 IC onto the shield. The shield has 13 columns of prototyping holes, but the TTS256 has 14 pins on each side. A dirty fix that must be made is to bend pins 1 and 28 so that the rest of the chip can be put into the holes, or, if you have experience, drill holes for pins 1 and 28 in the shield. Pin 1 is indicated on the IC with a little dot. Check it out in Figure2.32.

Figure 2.32: TTS256 with bent pins

Put the TTS256 with the bent pins into the holes on the VoiceBox Shield. Orient the chip so that the bent pins are facing the “Vin” pin on the shield. After that, we have to solder two wires to the bent pins, which will make the connections easier.


Now to the delicate part: there are 7 signals that have to be connected on the TTS256. Table2.1 describes the connections that need to be made. The 5V, GND, Rx, Tx and Reset pins of the TTS256 should be connected to shield header pins. The “Ready” and “Tx” pins of the SpeakJet will need to be wired to the TTS256 chip.[49]

Table 2.1: Interfacing connections

Signal Name    TTS256 Pin    VoiceBox Pin    SpeakJet Pin
Vdd            28            5V              -
GND            14            Gnd             -
Rx             18            Rx              -
Tx             5             Tx              -
Reset          1             Reset           -
SJ_Ready       20            -               15
SJ_Tx          24            -               10

Good job, we are done attaching the TTS256 to the VoiceBox Shield. See Figure2.33a and Figure2.33b.

Figure 2.33a: VoiceBox shield top view
Figure 2.33b: VoiceBox shield bottom view


2.14 Attaching Zippy to the BeagleBoard

First we need to solder a 2x14 header (which comes with the Zippy board) into the BeagleBoard’s expansion connector (J3). To do that, insert the 2x14 header’s SHORT PINS from the back side of the BeagleBoard into the BeagleBoard’s expansion connector (J3), positioning the 2x14 header so the LONG PINS are on the BACK SIDE of the BeagleBoard, and finally solder the SHORT PINS of the 2x14 header from the TOP SIDE of the BeagleBoard. See Figure2.34.

Figure 2.34: Soldering BeagleBoard's Expansion Header

Now attach the four board spacers with the screws provided, and connect the expansion board onto the BACK SIDE of the BeagleBoard by mating it with the 2x14 header you just soldered.

Make sure all of the pins align correctly, and continue pushing the two boards together until the connectors mate together.

Finally, attach the male standoffs as shown in Figure2.35.[45]

Figure 2.35: Attaching Zippy to the BeagleBoard

Do not forget that you need to attach a 4-wire female jumper cable to the 4-pin header expansion connector located on the Zippy board next to the 12mm coin battery - this battery is used for the real-time clock (see Figure2.36). This cable will enable the I²C communication between the BeagleBoard and the PIC® microcontroller; the pins, in order, are:

1. Vcc (5V Power)

2. SDA - I²C 5V signal

3. SCL - I²C 5V signal

4. GND

Figure 2.36: 4-pin header expansion connector for I²C


2.15 Interconnection between all modules

The interconnection I made is not obligatory for most control signals; however, changing it from this setup (see Figure2.37) would make the robot system malfunction if the corresponding changes to the software – which I will discuss in the next chapter – are not made as well. For this reason, if you are not sure what I am talking about, please just keep my interconnection as it is. Note that arrows indicate wires' ends, not signals' directions.

Figure 2.37: Interconnection of the robot system (the PIC18F2455 main board connects to the 9DOF, the GPS through the logic level converter, the VRbot and microphone, the LCD backpack, the voice shield and speaker, the proximity sensor, the servos, and the motor driver with its encoders and DC motors; the BeagleBoard with the Zippy expansion connects to the webcam, its two neck servos, and the PIC® I²C pins)


2.16 Final Hardware

Hurray! I was able to implement the robot almost as designed, though I was not able to install the arm(s), and I hope that someone will continue developing this project. Have a look at the final prototype of the robot (shown in Figure2.38) before its wiring finalization and release. This prototype has dimensions of 35cm between the wheels, 65cm from ground to top, and 12cm maximum thickness.

Figure 2.38: Last prototype of the robot system before finally assembled


Chapter 3. Software Design and Interfacing

Robot software is the coded commands that tell the mechanical device (known as a robot) what tasks to perform, and that control its actions. Robot software is used to perform and automate tasks. Programming robots is a non-trivial task, and many software systems and frameworks have been proposed to make it easier. The software modules of this robot are:

 9DOF software
 Motor driver software
 BeagleBoard software
 PIC software

Some robot software aims at developing intelligent mechanical devices. Though common in science fiction stories, such programs are yet to become commonplace in reality, and much development is still required in the field of artificial intelligence before they even begin to approach the science fiction possibilities. Pre-programmed hardware may include feedback loops making the robot able to interact with its environment, but that does not convey actual intelligence.[50]

The software of our robot system is quite complicated, since it is composed of sub-systems and modules that work interactively with each system they are connected to. Each sub-system has its own software, which we will discuss separately in order to make its functions easier to comprehend; then we will discuss its communication procedures and protocols. If you want to skip all this – in case you just want to re-implement the system – and get the final software of the whole robot, please refer to Appendix IV.

Before everything else, you may need to understand the protocols and signals used in this robot system to enable communication between its parts; in that case, please refer to Appendix V.

3.1 9DOF Software

You may need to understand the concept of the Kalman filter before you start reading this section. I have included a brief explanation of the Kalman filter in Appendix VI; please refer to it to get in sync with what I am talking about here.

You may wonder: since the accelerometer returns the tilt angle of the robot body with a simple tan⁻¹ calculation, why am I designing a complicated digital controller that only returns the tilt angle? Well, this is because the accelerometer is not stable while the robot is falling; in free fall the accelerometer would sense no gravity at all on both the X and Z axes. So, what I want to do is rely on the accelerometer readings while not falling to correct the integral of the gyroscope reading (which gives the angular position – in other words, the tilt angle), and rely entirely on the gyroscope reading while falling. This leads us to design a controller that also predicts the tilt angle based on its preprogrammed and previous values.

I am using the Arduino IDE to write a one-file C sketch that handles the communication between the sensors and the microcontroller via the I²C protocol. The microcontroller also iterates through the Kalman filter, outputting the tilt angle (alpha) and notifying the PIC® microcontroller once it changes, so that the PIC® can maintain balance via the motor driver (if falling forward, move the wheels forward with a speed proportional to the reported acceleration and tilt angle, and vice versa).

The code portion below (in Listing3.1) shows the accelerometer and gyroscope offsets and scale factors; the offsets are the zero-g and zero-rate means of the sensors respectively, and the scale factors are used so that the outputs of the sensors represent real-world tilt degrees.

//Sensors offsets and scale factors


#define ACCEL_X_OFFSET 14
#define ACCEL_X_1G 272

#define ACCEL_Z_OFFSET -25
#define ACCEL_Z_1G 251

//Initial values to stable the system faster


//Kalman filter would stable it even those values are too far
#define GYRO_X_OFFSET 378.62 //378.672 //378.842
// GYRO_X_SCALE is calculated so that alpha matches pitch
#define GYRO_X_SCALE -61.85 //-64.6
Listing 3.1: sensors offset and scale factors

After that, within the main() function, I declared the variables used inside the Kalman filter. I used a 50Hz sampling frequency, which means a sampling time of 20 milliseconds. See Listing3.2.

int raw_x, raw_z, raw_gyro;


double x, z, scalled_x, scalled_z, pitch, gyro, scalled_gyro, alpha = 0, bias = 0;

// We are calculating output at 50Hz


const double dt = 0.02; // 1/Frequency

Listing 3.2: some sensors and Kalman filter variables declaration

Now let us initialize the matrices of the Kalman filter (review Appendix VI for details about what is going on here) and a variable that keeps the latest alpha value; see Listing3.3.

//x = A · x + B · u
// A[rows][cols]
double A[2][2] =
{
{1, -1*dt},
{0, 1}
};
double B[2][1] =
{
{dt},
{0}
};
double *u[1][1] =
{
{&scalled_gyro}
};
double *X[2][1] =
{
{&alpha},
{&bias}
};

double *y[1][1] =
{
{&pitch}
};
double C[1][2] = {1, 0};
double Inn[1][1];
double P[2][2] =
{
{1, 0},
{0, 1}
};
double Sz[1][1] = {17.2}; //Degrees - not radian
double s[1][1];
double K[2][1];
double Sw[2][2] =
{
{0.057, 0}, //Degrees - not radian
{0, 0.172} //Degrees - not radian
};

signed int latest_alpha = 0;


Listing 3.3: Kalman matrices variables declaration

Inside our endless loop (which is required to keep the system up as long as the power is on), we simply apply the Kalman filter to our inputs from the X and Z accelerometers and the X gyroscope. As a result, the system returns a stable alpha value that is not affected by free fall. See Listing3.4.

while(1)
{
raw_x = x_accel();
raw_z = z_accel();

x = (double)raw_x - ACCEL_X_OFFSET;
z = (double)raw_z - ACCEL_Z_OFFSET;

scalled_x = x / ACCEL_X_1G;
scalled_z = z / ACCEL_Z_1G;

pitch = atan2(scalled_x, scalled_z) / PI * 180; //Degrees - not radian

raw_gyro = x_gyro();
gyro = (double)raw_gyro - GYRO_X_OFFSET;

scalled_gyro = gyro / GYRO_X_SCALE;

// Start implementing the digital controller using the Kalman filter; for details, refer to Appendix VI

// X = A * X + B * u
*X[0][0] = (A[0][0] * (*X[0][0]) + A[0][1] * (*X[1][0])) + (B[0][0] *
(*u[0][0]));
*X[1][0] = (A[1][0] * (*X[0][0]) + A[1][1] * (*X[1][0])) + (B[1][0] *
(*u[0][0]));

// Inn = y - C * X
Inn[0][0] = (*y[0][0]) - (C[0][0] * (*X[0][0]) + C[0][1] * (*X[1][0]));

// CP = C * P
double CP[1][2] =
{
{C[0][0] * P[0][0] + C[0][1] * P[1][0], C[0][0] * P[0][1] + C[0][1]
* P[1][1]}


};
// s = CP * C' + Sz
s[0][0] = (CP[0][0] * C[0][0] + CP[0][1] * C[0][1]) + Sz[0][0];

// AP = A * P
double AP[2][2] =
{
{A[0][0] * P[0][0] + A[0][1] * P[1][0], A[0][0] * P[0][1] + A[0][1]
* P[1][1]},
{A[1][0] * P[0][0] + A[1][1] * P[1][0], A[1][0] * P[0][1] + A[1][1]
* P[1][1]}
};
// K = AP * C' * inv(s)
K[0][0] = (AP[0][0] * C[0][0] + AP[0][1] * C[0][1]) / s[0][0];
K[1][0] = (AP[1][0] * C[0][0] + AP[1][1] * C[0][1]) / s[0][0];

// X = X + K * Inn
*X[0][0] += K[0][0] * Inn[0][0];
*X[1][0] += K[1][0] * Inn[0][0];

// A'
double A_[2][2] =
{
{A[0][0], A[1][0]},
{A[0][1], A[1][1]}
};
// KC = K * C
double KC[1][1] =
{
{K[0][0] * C[0][0] + K[1][0] * C[0][1]}
};
// P = AP * A' - KC * P * A' + Sw
P[0][0] = (AP[0][0] * A_[0][0] + AP[0][1] * A_[1][0]) - KC[0][0] *
(P[0][0] * A_[0][0] + P[0][1] * A_[1][0]) + Sw[0][0];
P[0][1] = (AP[0][0] * A_[0][1] + AP[0][1] * A_[1][1]) - KC[0][0] *
(P[0][0] * A_[0][1] + P[0][1] * A_[1][1]) + Sw[0][1];
P[1][0] = (AP[1][0] * A_[0][0] + AP[1][1] * A_[1][0]) - KC[0][0] *
(P[1][0] * A_[0][0] + P[1][1] * A_[1][0]) + Sw[1][0];
P[1][1] = (AP[1][0] * A_[0][1] + AP[1][1] * A_[1][1]) - KC[0][0] *
(P[1][0] * A_[0][1] + P[1][1] * A_[1][1]) + Sw[1][1];

/*
//Printing double is not possible via %f - Arduino is limited :(
//Here is my workaround with one decimal digit
if (alpha < 0) printf("-");
printf("%d.%d\r\n", abs((int)alpha), abs((int)(alpha * 10 - (int)alpha * 10)));
*/

// For easier communication with the PIC®, I only use the range of -128 to 127 degrees
// Enough resolution and effective communication
if (latest_alpha != (signed int)alpha)
{
latest_alpha = (signed int)alpha;
printf("%c", latest_alpha & 0xFF);
}

// This is what controls the frequency of our sampling
// Without delay, the frequency is about 85Hz
// But we do not need that speed, since it would just be a headache for the PIC® microcontroller
// BTW: delay_ms seems to run about 2.9 times faster than real time,
// but I do not mind since I adjusted it manually
delay_ms(83);
}
Listing 3.4: Kalman filter loop

Note that alpha is compared to the most recent alpha, so if there is no change, we avoid the overhead of communicating with the PIC® microcontroller. Also, I am sending only one character to represent the angle, which gives a range of -128 to 127 degrees; I think we do not need more than this. The delay function at the end keeps the sampling at the desired frequency.
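On the PIC® side, decoding this protocol is a one-liner: every received byte is the current tilt as a signed 8-bit value. A minimal sketch in CCS C, assuming the 9DOF sits on the interrupt-driven hardware UART:

signed int8 tilt;    // latest tilt in degrees, -128..127

#int_rda             // UART receive interrupt
void tilt_isr(void)
{
   tilt = getc();    // one byte per update; the sign carries over as-is
}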

3.2 Motor Driver Software

The 9DOF and the controller inside it are useless if they are not governing the actuator(s) that maintain the balance. This job is done by the motor driver.

Well, it is not as trivial as connecting the 9DOF directly to the motor driver, since the wheels must also move according to the will of the robot, not only the will of gravity and wind. For this reason, the 9DOF and the motor driver are both connected to the main PIC® board, which combines the requested speed of the robot with the acceleration that corresponds to the tilt reported by the 9DOF.

Before discussing the code, you need some knowledge of the digital PID controller; refer to Appendix VII to get some. By the way, code which is out of scope is left out; if you want the full version of this code, refer to Appendix IV.

Now let us dive deeper into the code. Following in Listing3.5 are the preprocessor definitions and function prototypes used in the code; know them to understand the rest of the code.

//Motors related constants


#define MOTOR_1 0
#define MOTOR_2 1
#define FORWARD 0
#define REVERSE 1
#define SPEED_FACTOR 7

// Calculated using stat A+BX for fast acceleration


// Obtained from the relation between needed speed and PWM
#define OCR1A_STAT_A -83.5
#define OCR1A_STAT_B 1.02
#define OCR1B_STAT_A -130.5
#define OCR1B_STAT_B 1.41

#define CURRENT_THRESHOLD 150 //Compared against the 10-bit ADC value; corresponds to roughly 1.5A on the Current Sense pin
#define SPEED_AVERAGE_POLYNOMIALS 4 //Speed average polynomials count

void motor1_PID();
void motor2_PID();
uint16_t read_adc(uint8_t adc_input);
void control_motors();
void set_direction(int motor, int direction);
void set_speed(int motor, int speed);
Listing 3.5: Preprocessor definitions and functions prototype

The concept of this controller is to calculate the PWM (Pulse-Width Modulation) needed to drive the motor at the desired speed; the speed is calculated as the first derivative of the position, which is obtained from the motor encoder. Following in Listing3.6 is the code used to calculate the position, direction, speed, and current overload of the first motor, which controls the left wheel. This code is part of the ISR (Interrupt Service Routine) that runs when the encoder inputs change.

//================================================================
//Interrupt Routines
//================================================================
/*
PORTC Pin change interrupt service routine. Decodes the encoder.
For algorithm, see Scott Edwards article from Nuts&Volts V1 Oct. 1995 nv8.pdf
(righthand bit of old A,B) xor (lefthand bit of new A,B) => dir.
Increment or decrement encoder position accordingly
*/
ISR (PCINT1_vect)
{
static boolean enc_last_A[2] = {0, 0}, enc_last_B[2] = {0, 0}, enc_now_A[2],
enc_now_B[2];
static uint16_t enc1_last_counter[SPEED_AVERAGE_POLYNOMIALS] = {0},
enc2_last_counter[SPEED_AVERAGE_POLYNOMIALS] = {0};

/**
* Steps used:
*
* 1. which encoder? 1 or 2 or both
* 2. determine direction of rotation
* 3. update encoder position
* 4. remember last state
*
* 5. calculate speed
* 6. check overload
*
* Future Development: Find a way to use ADC pins as digital - this would enhance the performance
*/

enc_now_A[0] = (PINC & 0x10) >> 4;


enc_now_A[1] = (PINC & 0x20) >> 5;

if (enc_last_A[0] != enc_now_A[0]) //Is it Encoder 1?


{
enc_now_B[0] = (read_adc(ENC1_B) >> 9);
enc_dir[0] = enc_now_A[0] ^ enc_now_B[0];
if(enc_dir[0] == 0) enc_pos[0]++; else enc_pos[0]--;
enc_last_A[0] = enc_now_A[0];
enc_last_B[0] = enc_now_B[0];


// Calculate motor speed


long double average_speed = 0;
for (uint8_t i = 0; i < SPEED_AVERAGE_POLYNOMIALS-1; i++)
{
average_speed += enc1_last_counter[i];
enc1_last_counter[i] = enc1_last_counter[i+1];
}
average_speed += enc1_last_counter[SPEED_AVERAGE_POLYNOMIALS-1];
average_speed += enc_counter[0];
average_speed = 2441.4 * (SPEED_AVERAGE_POLYNOMIALS+1) / average_speed;
enc1_last_counter[SPEED_AVERAGE_POLYNOMIALS-1] = enc_counter[0];
//motorRPM[0] = 0.1 * motorRPM[0] + 0.9 * average_speed;
motorRPM[0] = average_speed;
//motorRPM[0] = 2441.4 / enc_counter[0];
//if (enc_dir[0] == 1) motorRPM[0] *= -1;

//Check Overload sensor


motor_current[0] = read_adc(SENSE_1);
if(motor_current[0] > CURRENT_THRESHOLD)
{
//If an over current is detected, stop the motor
set_speed(MOTOR_1, 0);
PORTC |= (1<<2); //Overload LED on
}
enc_counter[0] = 0;
}

//Motor 2 code is the same with only changes on the index of matrices, no need to repeat!!
}

Listing 3.6: Encoder ISR and associated calculations

After this, all we need to do is pass the calculated speed to the PID, which adjusts the PWM accordingly using the Kp, Ki, and Kd variables, which may take some time to tune correctly. Note that I was sampling at 100ms, which is quite enough for controlling DC motors. See Listing3.7 for the definition of the PID variables as well as the timer ISR, which executes at almost 1000Hz.

/************** PID Variables ******************/


const long double dt = 0.10035;

// Refer to: http://www.ecircuitcenter.com/circuits/pid1/pid1.htm


// To know how to tune PID, or refer to the Appendix VII
long double Kp[2] = {0.05, 0.05};
long double Ki[2] = {0.001, 0.001};
long double Kd[2] = {30, 40};

//encoders/motors position and direction of rotation


long double ErrorIntegral[2] = {0, 0};
uint16_t motorRPMSetpoint[2] = {0, 0};
long double motorRPM[2] = {0, 0};
uint16_t enc_pos[2] = {0, 0};
boolean enc_dir[2];
uint16_t motor_current[2] = {0, 0};
uint16_t enc_counter[2] = {0, 0};


//Counter speed is 976.56Hz


ISR(TIMER0_OVF_vect)
{
//This is the interrupt service routine for TIMER0 OVERFLOW Interrupt.
//CPU automatically call this when TIMER0 overflows.

//Adjust the motor speed at ~10Hz => dt is 100ms
if (enc_counter[0] % 98 == 0) //every ~98 ticks of the ~976Hz timer
{
motor1_PID();
motor2_PID();
}

// If no activity within 200ms, wheels are not moving


// Also prevents the counter variable from overflowing
// Assuming minimum speed 12RPM
if (enc_counter[0] >= 195) motorRPM[0] = enc_counter[0] = 0;
if (enc_counter[1] >= 195) motorRPM[1] = enc_counter[1] = 0;

//Increment our variable


enc_counter[0]++;
enc_counter[1]++;
}
Listing 3.7: Timer ISR and associated calculations

You may have noticed that the timer ISR calls the left and right motor PID functions; these functions simply apply the PID mathematics and add the result to the PWM, represented by the OCR1A and OCR1B registers respectively. In the main() function (see Listing3.8), since all the other functionality is interrupt driven, the code just keeps listening on the serial port for any instruction to change the speed and/or direction of either motor.

/************** PID Algorithm ******************/

// Motor 1 - left motor


void motor1_PID()
{
static long double lastError[4] = {0, 0, 0, 0};
long double error = motorRPMSetpoint[0] - motorRPM[0];

long double PID = Kp[0] * error + //P


Ki[0] * ErrorIntegral[0] + //I
Kd[0] * (((error - lastError[0]) / 4) * dt); //D

ErrorIntegral[0] += ((lastError[0] + 2 * (lastError[1] + lastError[2]) + error) / 6) *


dt;
for (uint8_t i = 0; i < 3; i++) lastError[i] = lastError[i+1];
lastError[3] = error;

//limit OCR1A 0-1023


if (PID+OCR1A > 1023) OCR1A = 1023;
else if (PID+OCR1A < 0) OCR1A = 0;
else OCR1A += (int)PID;
}


// Motor 2- right motor


// Same as the motor 1 controller with only change in the matrices index, no need to repeat code

int main(void)
{
ioinit();

set_direction(MOTOR_1, FORWARD);
set_speed(MOTOR_1, 0);
set_direction(MOTOR_2, FORWARD);
set_speed(MOTOR_2, 0);

while(1)
{
control_motors();

}
return 0;
}

void control_motors()
{
// Simple, yet effective communication
// Using one character to send the motor selector, direction and speed
uint8_t cmd = uart_getchar();

//MSB for motor selection


boolean motor = (cmd & 0x80) >> 7;

//7th bit for motor direction


boolean direction = (cmd & 0x40) >> 6;

//First 6 bits for speed selection


// From 0 - stopped
// To 63 - full speed
uint8_t speed = cmd & 0x3F;

// Apply the settings


set_direction(motor, direction);
set_speed(motor, speed);
}

void set_direction(int motor, int direction)


{
if(motor == MOTOR_1)
{
if(direction == FORWARD) { M1_FORWARD(); } else { M1_REVERSE(); }
}
else
{
if(direction == FORWARD) { M2_FORWARD(); } else { M2_REVERSE(); }
}
}

void set_speed(int motor, int speed)


{
//Is it the same speed? (the setpoint is stored as speed * SPEED_FACTOR below)
if (motorRPMSetpoint[motor] == speed * SPEED_FACTOR) return;


//Since not the same speed, reset the accumulated error


ErrorIntegral[motor] = 0;
motorRPMSetpoint[motor] = speed * SPEED_FACTOR;
if(motor == MOTOR_1)
{
OCR1A = OCR1A_STAT_A + OCR1A_STAT_B * speed * SPEED_FACTOR;
if (speed != 0)
PORTC &= ~(1<<2); //Overload LED off
}
else
{
OCR1B = OCR1B_STAT_A + OCR1B_STAT_B * speed * SPEED_FACTOR;
if (speed != 0)
PORTC &= ~(1<<3); //Overload LED off
}
}

Listing 3.8: PID and main() functions

You may also have noticed that I used only one character to represent 64 different speeds and a forward/reverse direction for each motor; this kind of communication is not user friendly, but it is efficient to transmit and process. Moreover, I programmed A+BX estimating functions (a kind of numerical mathematics) for the PWM, for fast control system stabilization. I hope you got the idea of this controller and want to try programming your own DC motor PID.

3.3 BeagleBoard Software

Face Recognition is a very active area in the Computer Vision and Biometrics fields, as it has been studied vigorously for 25 years and is finally producing applications in security, robotics, human-computer interfaces, digital cameras, games and entertainment.[52] Face Recognition generally involves two stages:

1. Face Detection, where a photo is searched to find any face (shown in Figure3.1 as a red rectangle), then image processing cleans up the facial image for easier recognition.

2. Face Recognition, where that detected and processed face is compared to a database of known faces, to decide who that person is (shown in Figure3.1 as green text).

Figure 3.1: Hamdi's face detected


3.3.1 Face detection

Face Detection can be accomplished reliably using software such as OpenCV's Face Detector (available since 2002), working in roughly 90-95% of clear photos of a person looking forward at the camera. It is usually harder to detect a person's face when they are viewed from the side or at an angle, and sometimes this requires 3D Head Pose Estimation. It can also be very difficult to detect a person's face if the photo is not very bright, or if part of the face is brighter than another, has shadows, is blurry, or if the person is wearing glasses, etc.

The OpenCV library makes it fairly easy to detect a frontal face in an image using its Haar Cascade Face
Detector (also known as the Viola-Jones method).

The function "cvHaarDetectObjects" in OpenCV performs the actual face detection, but the function is a
bit tedious to use directly[51], so it is easier to use this wrapper function shown in Lis ng3.9.

// Perform face detection on the input image, using the given Haar Cascade.
// Returns a rectangle for the detected region in the given image.
CvRect detectFaceInImage(IplImage *inputImg, CvHaarClassifierCascade* cascade)
{
// Smallest face size.
CvSize minFeatureSize = cvSize(20, 20);
// Only search for 1 face.
int flags = CV_HAAR_FIND_BIGGEST_OBJECT | CV_HAAR_DO_ROUGH_SEARCH;
// How detailed should the search be.
float search_scale_factor = 1.1f;
IplImage *detectImg;
IplImage *greyImg = 0;
CvMemStorage* storage;
CvRect rc;
double t;
CvSeq* rects;
CvSize size;
int i, ms, nFaces;

storage = cvCreateMemStorage(0);
cvClearMemStorage(storage);

// If the image is color, use a greyscale copy of the image.


detectImg = (IplImage*)inputImg;
if (inputImg->nChannels > 1) {
size = cvSize(inputImg->width, inputImg->height);
greyImg = cvCreateImage(size, IPL_DEPTH_8U, 1);
cvCvtColor(inputImg, greyImg, CV_BGR2GRAY);
detectImg = greyImg; // Use the greyscale image.
}

// Detect all the faces in the greyscale image.


t = (double)cvGetTickCount();
rects = cvHaarDetectObjects(detectImg, cascade, storage,
search_scale_factor, 3, flags, minFeatureSize);
t = (double)cvGetTickCount() - t;
ms = cvRound(t / ((double)cvGetTickFrequency() * 1000.0));
nFaces = rects->total;
printf("Face Detection took %d ms and found %d objects\n", ms, nFaces);


// Get the first detected face (the biggest).


if (nFaces > 0)
rc = *(CvRect*)cvGetSeqElem(rects, 0);
else
rc = cvRect(-1,-1,-1,-1); // Could not find the face.

if (greyImg)
cvReleaseImage(&greyImg);
cvReleaseMemStorage(&storage);
//cvReleaseHaarClassifierCascade(&cascade);

return rc; // Return the biggest face found, or (-1,-1,-1,-1).


}

Listing 3.9: Face detection function

Now you can simply call "detectFaceInImage" whenever you want to find a face within an image. You
also need to specify the face classifier that OpenCV should use to detect the face. For example, OpenCV
comes with several different classifiers for frontal face detection, as well as some profile faces (side view),
eye detection, nose detection, mouth detection, whole body detection, etc. You can actually use this
function with any of these other detectors if you want, or even create your own custom detector such as for
car or person detection, but since frontal face detection is the only one that I want to develop, it is the only
one I will discuss.

For frontal face detection, you can choose one of these Haar Cascade Classifiers that come with OpenCV
(in the "data/haarcascades/" folder):

 haarcascade_frontalface_default.xml

 haarcascade_frontalface_alt.xml

 haarcascade_frontalface_alt2.xml

 haarcascade_frontalface_alt_tree.xml

Each one will give slightly different results depending on your environment, so you could even use all of
them and combine the results together (if you want the most detection). There are also some more eye,
head, mouth and nose detectors that come with OpenCV.[51]

So you can do what is shown in Listing3.10 in your program for face detection:

// Haar Cascade file, used for Face Detection.


char *faceCascadeFilename = "haarcascade_frontalface_alt.xml";
// Load the HaarCascade classifier for face detection.
CvHaarClassifierCascade* faceCascade;
faceCascade = (CvHaarClassifierCascade*)cvLoad(faceCascadeFilename, 0, 0, 0);
if(!faceCascade) {
printf("Couldnt load Face detector '%s'\n", faceCascadeFilename);
exit(1);
}

// Grab the next frame from the camera.


IplImage *inputImg = cvQueryFrame(camera);

// Perform face detection on the input image, using the given Haar classifier
CvRect faceRect = detectFaceInImage(inputImg, faceCascade);

// Make sure a valid face was detected.


if (faceRect.width > 0) {
printf("Detected a face at (%d,%d)!\n", faceRect.x, faceRect.y);
}

//.... Use 'faceRect' and 'inputImg' ....

// Free the Face Detector resources when the program is finished


cvReleaseHaarClassifierCascade(&faceCascade); // matches the 'faceCascade' variable declared above
Listing 3.10: Face detection program
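One detail Listing3.10 leaves implicit is where the 'camera' handle comes from. With OpenCV's C API the webcam is opened like this (device index 0 assumed):

CvCapture* camera = cvCaptureFromCAM(0);   // first video device, i.e. our UVC webcam
if (!camera) {
    printf("Couldnt open the webcam\n");
    exit(1);
}
// ... cvQueryFrame(camera) then returns successive frames ...
cvReleaseCapture(&camera);                 // free the device when finished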

3.3.2 Face Recognition

However, Face Recognition is much less reliable than Face Detection, generally 30-70% accurate. Face Recognition has been a strong field of research since the 1990s, but it is still far from reliable, and more techniques are being invented each year.[52]

I will show you how to use Eigenfaces (also called "Principal Component Analysis" or PCA[53]), a simple and popular method of 2D Face Recognition from a photo, as opposed to other common methods such as Neural Networks[53] or Fisherfaces.[53]

3.3.2.1 Preprocess facial images

Now that you have detected a face, you can use that face image for Face Recognition. However, if you try to simply perform face recognition directly on a normal photo image, you will probably get less than 10% accuracy!

It is extremely important to apply various image pre-processing techniques to standardize the images that you supply to a face recognition system. Most face recognition algorithms are extremely sensitive to lighting conditions: if the system was trained to recognize a person in a dark room, it probably will not recognize them in an illuminated room, etc. This problem is referred to as "illumination dependence". There are also many other issues; for example, the face should be in a very consistent position within the images (e.g. the eyes in the same pixel coordinates), with consistent size, rotation angle, hair and makeup, emotion (smiling, angry, etc), and position of lights (to the left or above, etc). This is why it is so important to use good image preprocessing filters before applying face recognition. You should also do things like removing the pixels around the face that are not used, such as using an elliptical mask to show only the inner face region, not the hair and image background, since they change more often than the face.

For simplicity, the face recognition system I will show you uses Eigenfaces on greyscale images. So I will show you how to easily convert color images to greyscale (also called 'grayscale'), and then easily apply Histogram Equalization as a very simple method of automatically standardizing the brightness and contrast of your facial images. For better results, you could use color face recognition (ideally with color histogram fitting in HSV - Hue, Saturation, and Value - or another color space instead of RGB - Red, Green, and Blue), or apply more processing stages such as edge enhancement, contour detection, motion detection, etc.[52] Also, the code resizes images to a standard size, but this might change the aspect ratio of the face. Here in Figure3.2 you can see an example of this preprocessing stage:

Figure 3.2: Hamdi's face preprocessing

Shown here in Listing3.11 is some basic code to convert from an RGB or greyscale input image to a greyscale image, resize it to a consistent dimension, and then apply Histogram Equalization for consistent brightness and contrast:

// Either convert the image to greyscale, or use the existing greyscale image.
IplImage *imageGrey;
if (imageSrc->nChannels == 3) {
imageGrey = cvCreateImage(cvGetSize(imageSrc), IPL_DEPTH_8U, 1);
// Convert from RGB (actually it is BGR) to Greyscale.
cvCvtColor(imageSrc, imageGrey, CV_BGR2GRAY);
}
else {
// Just use the input image, since it is already Greyscale.
imageGrey = imageSrc;
}

// Resize the image to be a consistent size, even if the aspect ratio changes.
IplImage *imageProcessed;
imageProcessed = cvCreateImage(cvSize(width, height), IPL_DEPTH_8U, 1);
// Make the image a fixed size.
// CV_INTER_CUBIC or CV_INTER_LINEAR is good for enlarging, and
// CV_INTER_AREA is good for shrinking / decimation, but bad at enlarging.
cvResize(imageGrey, imageProcessed, CV_INTER_LINEAR);

// Give the image a standard brightness and contrast.


cvEqualizeHist(imageProcessed, imageProcessed);

//..... Use 'imageProcessed' for Face Recognition ....

if (imageGrey)
cvReleaseImage(&imageGrey);
if (imageProcessed)
cvReleaseImage(&imageProcessed);
Listing 3.11: Histogram equalization program


3.3.2.2 Eigenfaces for Face Recognition

Now that you have a pre-processed facial image, you can perform Eigenfaces (PCA) for Face Recognition. OpenCV comes with the function "cvEigenDecomposite()", which performs the PCA operation; however, you need a database (training set) of images so it can know how to recognize each individual.

So you should collect a group of preprocessed facial images of each person you want to recognize. For
example, if you want to recognize someone from a class of 10 students, then you could store 20 photos of
each person, for a total of 200 preprocessed facial images of the same size (say 100x100 pixels).

Use "Principal Component Analysis" to convert all your 200 training images into a set of "Eigenfaces"
that represent the main differences between the training images. First it will find the "average face image"
of your images by getting the mean value of each pixel. Then the Eigenfaces are calculated in comparison to
this average face, where the first Eigenface is the most dominant face differences, and the second Eigenface
is the second most dominant face differences, and so on, un l you have about 50 Eigenfaces that represent
most of the differences in all the training set images.

Figure 3.3: Average Face and Eigenfaces

In the images above (Figure 3.3) you can see the average face and the first and last Eigenfaces that were
generated from a collection of 30 images each of 4 people.[52]

Notice that the average face shows the smooth face structure of a generic person, the first few
Eigenfaces show some dominant features of faces, and the last Eigenfaces (e.g. Eigenface 119) are mainly
image noise. You can see the first 32 Eigenfaces below in Figure 3.4.


Figure 3.4: The first 32 Eigenfaces of a person

3.3.2.3 Face Recognition using Principal Component Analysis

To explain Eigenfaces (Principal Component Analysis) in simple terms, Eigenfaces figures out the main
differences between all the training images, and then how to represent each training image using a
combination of those differences.[52]

So for example, one of the training images might be made up of:

(averageFace) + (13.5% of eigenface0) - (34.3% of eigenface1) + (4.7% of eigenface2) + ... + (0.0% of eigenface199)

Once it has figured this out, it can think of that training image as the 200 ratios:

{13.5, -34.3, 4.7, ..., 0.0}.

It is indeed possible to generate the training image back from the 200 ratios by multiplying the ratios
with the Eigenface images and adding the average face. But since many of the last Eigenfaces are image
noise or do not contribute much to the image, this list of ratios can be reduced to include only the most
dominant ones, such as the first 30 numbers, without affecting the image quality much. So now it is possible
to represent all 200 training images using just 30 Eigenface images, the average face image, and a list of 30
ratios for each of the 200 training images.[51]

Interestingly, this means that we have found a way to compress the 200 images into just 31 images plus
a bit of extra data, without losing much image quality. But this section is about face recognition, not image
compression, so we will not pursue that further.
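If you did want to rebuild an image from its ratios, OpenCV's legacy PCA API provides "cvEigenProjection()" for exactly that. A minimal sketch, reusing the variable names that appear in Listing 3.13 below (the loop index i is an assumption):

// Rebuild training image i from its stored ratios plus the average face.
IplImage *rebuilt = cvCreateImage(cvGetSize(pAvgTrainImg), IPL_DEPTH_8U, 1);
cvEigenProjection((void*)eigenVectArr, nEigens, CV_EIGOBJ_NO_CALLBACK, 0,
                  projectedTrainFaceMat->data.fl + i * nEigens,
                  pAvgTrainImg, rebuilt);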


To recognize a person in a new image, it can apply the same PCA calculations to find the 200 ratios that
represent the input image using the same 200 Eigenfaces. Once again, it can keep just the first 30 ratios and
ignore the rest as they are less important. It can then search through its list of ratios for each of the known
people in its database, to see whose training images are most similar to the input image.[52]

3.3.2.4 Implementing Offline Training

Basically, to create a faces database from training images, you create a text file that lists the image files and
which person each image file represents. For example, you could put the text shown in Listing 3.12 into a
text file called "4_images_of_2_people.txt":

1 Hamdi data/Hamdi/Hamdi1.jpg
1 Hamdi data/Hamdi/Hamdi2.jpg
1 Hamdi data/Hamdi/Hamdi3.jpg
1 Hamdi data/Hamdi/Hamdi4.jpg
2 Khalid data/Khalid/Khalid1.jpg
2 Khalid data/Khalid/Khalid2.jpg
2 Khalid data/Khalid/Khalid3.jpg
2 Khalid data/Khalid/Khalid4.jpg
Listing 3.12: Person and images path text file

This tells the program that person 1 is named "Hamdi" and that the 4 preprocessed facial photos of
Hamdi are in the "data/Hamdi" folder, and that person 2 is called "Khalid" with 4 images in the "data/Khalid"
folder. The program can then load them all into an array of images using the function "loadFaceImgArray()".
Note that for simplicity, it does not allow spaces or special characters in the person's name, so you might
want to enable this, or replace spaces in a person's name with underscores (such as Hamdi_Sahloul).
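A minimal sketch of how such a file can be parsed in C; the buffer sizes and the personNumTruthMat ground-truth matrix here are illustrative assumptions, not the project's exact loader:

// Read "personNumber personName imagePath" triplets, one per line.
FILE *listFile = fopen("4_images_of_2_people.txt", "r");
int personNum, nFaces = 0;
char name[64], path[256];
while (fscanf(listFile, "%d %63s %255s", &personNum, name, path) == 3) {
    faceImgArr[nFaces] = cvLoadImage(path, CV_LOAD_IMAGE_GRAYSCALE);
    personNumTruthMat->data.i[nFaces] = personNum; // ground truth per image
    nFaces++;
}
fclose(listFile);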

To create the database from these loaded images, you use OpenCV's "cvCalcEigenObjects()" and
"cvEigenDecomposite()" functions, see Listing 3.13.

// Tell PCA to quit when it has enough eigenfaces.
CvTermCriteria calcLimit = cvTermCriteria(CV_TERMCRIT_ITER, nEigens, 1);

// Compute average image, eigenvectors (eigenfaces) and eigenvalues (ratios).
cvCalcEigenObjects(nTrainFaces, (void*)faceImgArr, (void*)eigenVectArr,
                   CV_EIGOBJ_NO_CALLBACK, 0, 0, &calcLimit,
                   pAvgTrainImg, eigenValMat->data.fl);

// Normalize the matrix of eigenvalues.
cvNormalize(eigenValMat, eigenValMat, 1, 0, CV_L1, 0);

// Project each training image onto the PCA subspace.
// (Note: cvCreateMat() returns a pointer, so declare CvMat* here.)
CvMat *projectedTrainFaceMat = cvCreateMat(nTrainFaces, nEigens, CV_32FC1);
int offset = projectedTrainFaceMat->step / sizeof(float);
for (int i = 0; i < nTrainFaces; i++) {
    cvEigenDecomposite(faceImgArr[i], nEigens, eigenVectArr, 0, 0,
                       pAvgTrainImg, projectedTrainFaceMat->data.fl + i*offset);
}
Listing 3.13: faces database creation


You now have:

 the average image "pAvgTrainImg",

 the array of Eigenface images "eigenVectArr[]" (e.g. about 200 Eigenfaces if you trained on 200 images),

 the matrix of eigenvalues (Eigenface ratios) "projectedTrainFaceMat" of each training image.

These can now be stored into a file, which will be the face recognition database. The function
"storeTrainingData()" in the code will store this data into the file "facedata.xml", which can be reloaded
anytime to recognize people that it has been trained for. There is also a function "storeEigenfaceImages()" in
the code, to generate the images shown earlier, of the average face image to "out_averageImage.jpg" and
Eigenfaces to "out_eigenfaces.jpg".
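Internally, storeTrainingData() can be as simple as writing those objects out with OpenCV's XML persistence API. A minimal sketch, assuming a ground-truth matrix personNumTruthMat from the loader; the key names are illustrative:

// Persist the PCA results so recognition can reload them later.
CvFileStorage *fs = cvOpenFileStorage("facedata.xml", 0, CV_STORAGE_WRITE);
cvWriteInt(fs, "nEigens", nEigens);
cvWriteInt(fs, "nTrainFaces", nTrainFaces);
cvWrite(fs, "trainPersonNumMat", personNumTruthMat, cvAttrList(0, 0));
cvWrite(fs, "eigenValMat", eigenValMat, cvAttrList(0, 0));
cvWrite(fs, "projectedTrainFaceMat", projectedTrainFaceMat, cvAttrList(0, 0));
cvWrite(fs, "avgTrainImg", pAvgTrainImg, cvAttrList(0, 0));
for (int i = 0; i < nEigens; i++) {
    char varname[32];
    sprintf(varname, "eigenVect_%d", i);
    cvWrite(fs, varname, eigenVectArr[i], cvAttrList(0, 0));
}
cvReleaseFileStorage(&fs);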

3.3.2.5 Implementing Offline Recognition

We now turn to the implementation of the offline recognition stage, where the face recognition system
tries to recognize who appears in several photos listed in a text file. The same text file that is used for offline
training can also be used for offline recognition: it lists the images that should be tested, as well as the
correct person in each image. The program can then try to recognize who is in each photo, and check the
correct value in the input file to see whether it was right or not, generating statistics of its own accuracy.[51]

The implementation of the offline face recognition is almost the same as offline training:

1. The list of image files (preprocessed faces) and names are loaded into an array of images,
from the text file that is now used for recognition testing (instead of training). This is
performed in code by "loadFaceImgArray()".

2. The average face, Eigenfaces and Eigenvalues (ratios) are loaded from the face recognition
database file "facedata.xml", by the function "loadTrainingData()".

3. Each input image is projected onto the PCA subspace using the OpenCV function
"cvEigenDecomposite()", to see what ratio of Eigenfaces is best for representing this input
image.

4. But now that it has the Eigenvalues (ratios of Eigenface images) to represent the input image,
it looks for the original training image that had the most similar ratios. This is done
mathematically in the function "findNearestNeighbor()" using the "Euclidean Distance", but
basically it checks how similar the input image is to each training image, and finds the most
similar one: the one with the least distance in Euclidean Space. You might get better results if
you use the Mahalanobis space (define USE_MAHALANOBIS_DISTANCE in the code).

5. The distance between the input image and the most similar training image is used to
determine the "confidence" value, to be used as a guide of whether someone was actually
recognized or not. A confidence of 1.0 would mean a perfect match, and a confidence of
0.0 a terrible one.

Once it knows which training image is the most similar to the input image, and assuming the confidence
value is not too low (it should be at least 0.6 or higher), then it has figured out who that person is; in other
words, it has recognized that person!
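A minimal sketch of this nearest-neighbour search, following the Euclidean variant; the names mirror Listing 3.13, <float.h> and <math.h> are assumed, and the confidence mapping is one reasonable choice rather than a definitive formula:

// Find the training face whose PCA ratios are closest to the test face.
int findNearestNeighbor(float *projectedTestFace, float *pConfidence)
{
    double leastDistSq = DBL_MAX;
    int iNearest = 0;

    for (int iTrain = 0; iTrain < nTrainFaces; iTrain++) {
        double distSq = 0;
        for (int i = 0; i < nEigens; i++) {
            float d = projectedTestFace[i]
                    - projectedTrainFaceMat->data.fl[iTrain * nEigens + i];
            distSq += d * d; // Euclidean distance in the PCA subspace
        }
        if (distSq < leastDistSq) {
            leastDistSq = distSq;
            iNearest = iTrain;
        }
    }
    // Map the least distance to a rough 0..1 confidence (1.0 = identical).
    *pConfidence = 1.0f
        - (float)(sqrt(leastDistSq / (double)(nTrainFaces * nEigens)) / 255.0);
    return iNearest;
}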

3.3.2.6 Implementing Real-time Recognition from a Camera

It is very easy to use a webcam stream as an input to the face recognition system instead of a file list.
Basically you just need to grab frames from a camera instead of from a file, and you run forever until the
user wants to quit, instead of just running until the file list has run out. OpenCV provides the
'cvCreateCameraCapture()' function (also known as 'cvCaptureFromCAM()') for this.

Grabbing frames from a webcam can be implemented easily using the function shown in Listing 3.14:

// Grab the next camera frame. Waits until the next frame is ready, and
// provides direct access to it, so do NOT modify or free the returned image!
// Will automatically initialize the camera on the first frame.
IplImage* getCameraFrame(CvCapture* &camera)
{
    IplImage *frame;
    int w, h;

    // If the camera has not been initialized, then open it.
    if (!camera) {
        printf("Accessing the camera ...\n");
        camera = cvCreateCameraCapture(0);
        if (!camera) {
            printf("Could not access the camera.\n");
            exit(1);
        }
        // Try to set the camera resolution to 320 x 240.
        cvSetCaptureProperty(camera, CV_CAP_PROP_FRAME_WIDTH, 320);
        cvSetCaptureProperty(camera, CV_CAP_PROP_FRAME_HEIGHT, 240);
        // Get the first frame, to make sure the camera is initialized.
        frame = cvQueryFrame(camera);
        if (frame) {
            w = frame->width;
            h = frame->height;
            printf("Got the camera at %dx%d resolution.\n", w, h);
        }
        // Wait a little, so that the camera can auto-adjust its brightness.
        usleep(1000 * 1000); // 1 second (on Windows, use Sleep(1000) instead)
    }

    // Wait until the next camera frame is ready, then grab it.
    frame = cvQueryFrame(camera);
    if (!frame) {
        printf("Could not grab a camera frame.\n");
        exit(1);
    }
    return frame;
}
Listing 3.14: capturing frames from camera

The previous function can be used as shown in Listing 3.15:

CvCapture* camera = 0; // The camera device.

while (cvWaitKey(10) != 27) { // Quit on "Escape" key.
    IplImage *frame = getCameraFrame(camera);
    //...
}
// Free the camera.
cvReleaseCapture(&camera);
Listing 3.15: capture program

Putting together all the parts that I have explained so far, the face recognition system runs as follows:

1. Grab a frame from the camera.

2. Convert the color frame to greyscale.

3. Detect a face within the greyscale camera frame.

4. Crop the frame to just show the face region (using cvSetImageROI() and cvCopyImage(); see the sketch after this list).

5. Preprocess the face image.

6. Recognize the person in the image.
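As a quick illustration of the cropping in step 4, a minimal sketch assuming 'faceRect' came from the Haar cascade face detector:

// Crop the detected face out of the full greyscale frame.
IplImage *croppedFace = cvCreateImage(cvSize(faceRect.width, faceRect.height),
                                      IPL_DEPTH_8U, 1);
cvSetImageROI(greyFrame, faceRect); // restrict operations to the face region
cvCopy(greyFrame, croppedFace, 0);  // copies only the ROI
cvResetImageROI(greyFrame);         // so later code sees the full frame again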

3.3.2.7 Implementing Online Training from a Camera

Now you have a way to recognize people in real-time using a camera, but to learn new faces you would have
to shut down the program, save the camera images as image files, update the training images list, use the
offline training method from the command-line, and then run the program again in real-time camera mode.
As a matter of fact, this is exactly what you can do programmatically to perform online training from a
camera in real-time!

So here is the easiest way to add a new person to the face recognition database from the camera
stream without shutting down the program:

1. Collect a group of photos from the camera (preprocessed facial images), possibly while you
are performing face recognition also.

2. Save the collected face images as image files onto the hard-disk using cvSaveImage().

3. Add the filename of each face image onto the end of the training images list file (the text file
that is used for offline training), as sketched below.


4. Once you are ready to train on the new images (that is, once you have 20 faces, or when the
user says that they are ready), you "retrain" the database from all the image files. The text
file listing the training image files has the new images added to it, and the images are stored
as image files on the computer, so online training works just like offline training did.

5. But before retraining, it is important to free any resources that were being used and to re-
initialize the variables, so that it behaves as if you had shut down the program and restarted
it. For example, after the images are stored as files and added to the training list text file, you
should free the arrays of Eigenfaces before doing the equivalent of offline training (which
involves loading all the images from the training list file, then finding the Eigenfaces and
ratios of the new training set using PCA).
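For steps 2 and 3, a minimal sketch; the file name "train.txt" and the variables personNum, name and imageCount are illustrative assumptions:

// Save a newly collected face and append it to the training list file.
char path[256];
sprintf(path, "data/%s/%s%d.jpg", name, name, imageCount);
cvSaveImage(path, preprocessedFace, 0);

FILE *listFile = fopen("train.txt", "a"); // append, keeping existing entries
fprintf(listFile, "%d %s %s\n", personNum, name, path);
fclose(listFile);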

This method of online training is fairly inefficient, because if there were 50 people in the training set
and you add one more person, then it will retrain for all 51 people, and the training time keeps growing as
more users or training images are added. But if you are just dealing with a few hundred training images in
total, it should not take more than a few seconds.

3.3.3 I²C Communication with PIC®

Now that we are done with recognition, we need to tell the main node (the PIC® microcontroller) what is
going on, and to obtain commands from it as well. Unfortunately, the Linux community removed support for
slave I²C in kernel 2.6.31, and I am using kernel 2.6.32. For this reason I do the reverse: the Beagle acts as the
I²C master and the PIC® as the slave. This may consume more resources on the main node, but there was no
other option, since patching the Linux kernel was not practical because of its overhead and cross-compiling
problems.

I assigned the address 0x58 to the I²C slave on the PIC®, and used the functions whose prototypes are
shown in Listing 3.16 to enable communication between the two modules.

#define I2C_BUS 2
#define I2C_ADDRESS 0x58

void i2c_report(signed char code);
signed char i2c_wait_request(void);
Listing 3.16: I²C address preprocessor and prototypes

But before calling any of the above functions, we need to initialize the communication; this is done in
the main() function using the code shown in Listing 3.17.

//I2C Init block
{
    char i2c_bus[11];

    // Open the i2c bus device
    sprintf(i2c_bus, "/dev/i2c-%d", I2C_BUS);
    if ((i2c_fp = open(i2c_bus, O_RDWR)) < 0)
    {
        fprintf(stderr, "Error: Cannot Open I2C bus\n");
        return -1;
    }

    // Set the I2C slave address to talk to
    if (ioctl(i2c_fp, I2C_SLAVE_FORCE, I2C_ADDRESS) < 0)
    {
        fprintf(stderr, "Error: Cannot set I2C address\n");
        return -2;
    }
}
Listing 3.17: I²C initialization
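For completeness, this code assumes the usual Linux userspace I²C headers at the top of the file, along these lines:

#include <stdio.h>          // fprintf, sprintf
#include <fcntl.h>          // open, O_RDWR
#include <unistd.h>         // read, write, close
#include <sys/ioctl.h>      // ioctl
#include <linux/i2c-dev.h>  // I2C_SLAVE_FORCE

int i2c_fp; // file descriptor of the opened I2C bus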

I call the i2c_report() function to inform the PIC® microcontroller of the index of the detected face, as
well as of error codes. The possible errors, warnings and information codes, with their corresponding
meanings, are shown in Table 3.1.

Table 3.1: Codes and their meanings

Code# Message Type Reportable

-1 Cannot Open I²C bus Error No

-2 Cannot set I²C address Error No

-3 Could not load classifier cascade Error Yes

-4 Could not load training data Error Yes

-5 Failed to initialize video capture Error Yes

-6 Failed to initialize storage Error Yes

-7 Image query failed Error Yes

-8 No face detected Warning Yes

-9 Unknown face! Warning Yes

None I²C read failed. Warning No

None I²C write failed Warning No

>0 Face detected with index code# Info Yes


Note that the i2c_report() function is as simple as writing a byte to the PIC® via I²C, and the
i2c_wait_request() function is as simple as reading a byte from the PIC®. See Listing 3.18.

signed char i2c_wait_request()
{
    signed char code;

    // Read from I2C; this blocks the process until a byte is received
    if (read(i2c_fp, &code, 1) != 1) return -1;
    return code;
}

void i2c_report(signed char code)
{
    // Write one byte onto the I2C bus (note: write() takes the address of
    // the byte, not its value)
    if (write(i2c_fp, &code, 1) != 1)
    {
        fprintf(stderr, "Warning: I2C write failed.\n");
        return;
    }
}
Listing 3.18: I²C report and wait request functions

3.3.4 Starting the program when Linux starts up

On our microcontrollers, the entire program starts as soon as the chip powers up. This is not the case with
the Beagle Board: it runs Linux, and it is Linux that starts when the board is powered up, not your program.

I do not want to drag you into the details, since they are not meant to be discussed here, so I will just
explain how to do this. Assuming you compiled the face recognition code as facerecognize with the root
home as the working directory, you need to issue these commands, and you are ready to go:

mv facerecognize /etc/init.d/            # install the program as an init script
chmod +x /etc/init.d/facerecognize       # make it executable
ln -s /etc/init.d/facerecognize /etc/rc5.d/
update-rc.d facerecognize defaults       # create the runlevel start/stop links

Note that this program does not loop forever; it breaks its loop when a key is pressed, which helps in
debugging and avoids an infinite loop that might prevent you from accessing or rebooting your Linux. Now
reboot the Beagle and enjoy!


3.4 PIC® Software

Fortunately, you do not need to program the PIC® in machine language or assembly language, since we
have the CCS C Compiler to do this for us. Just program in the C language and the IDE will check the
syntax, optimize the code, generate the HEX file, and program the chip via the programmer board.

The CCS Real-Time Operating System (RTOS), first released in version 3.20 of PCW and PCWH, allows a
PIC® microcontroller to define and run regularly scheduled tasks. This is accomplished by a dispatcher built
at compile time through the preprocessor directives that define tasks, providing a cooperative multi-tasking
RTOS. Due to the practical limitations of the PIC®, however, it does not offer preemptive scheduling.

If you want to read the whole code used with the PIC, please refer to Appendix IV; here we discuss only
some important portions of the code, not the whole program.

Listing 3.19 shows the initialization of the PIC microcontroller: it tunes the ADC (Analog-to-Digital
Converter) to 10-bit mode, moves the strings from ROM to RAM so that I can address them using pointers,
sets the oscillator to high-speed mode, disables the watchdog timer, allows the hex code to be read, and
selects high-voltage programming.

#include <18F2455.h>
#include <stdarg.h>

#device adc=10
#device PASS_STRINGS=IN_RAM
#fuses HS, NOWDT, NOPROTECT, NOLVP

#use delay(clock=20000000)
#use rtos(timer=0, minor_cycle=20ms)
Listing 3.19: PIC initialization of ADC, strings mapping, fuses, delay, RTOS

I manipulate some global variables that need to be shared between modules; these variables are shown
in Listing 3.20.

signed int8 tilt = 0;

double distance = 600.0; //In centimeters
double battery = 100.0;  //Percentage
double latitude, longitude, altitude;
Listing 3.20: Global variables within the PIC

To keep the program flexible, I also made some definitions for the pins used with the PIC; these
definitions are shown in Listing 3.21.

#define SER_MOTOR_TX   PIN_C6
#define SF9DOF_RX      PIN_C7
#define BEAGLE_SDA     PIN_B0
#define BEAGLE_SCL     PIN_B1
#define BEAGLE_ADDRESS 0xA4
#define LCD_TX         PIN_B2
#define LCD_WIDTH      20
#define LCD_HEIGHT     4
#define TTS256_TX      PIN_B5
#define TTS256_RX      PIN_C0
#define TTS256_RESET   PIN_C1
#define VRBOT_USE_UART
#define VRBOT_TX       PIN_B3
#define VRBOT_RX       PIN_B4
#define GPS_RX         PIN_B6
Listing 3.21: Pins, dimensions and addresses definitions

You will also find some #use directives within the program; they define the stream that subsequent
functions use. These #use directives cover both I²C and UART communication, and are shown in
Listing 3.22.

#use rs232(baud=9600, parity=N, rcv=SF9DOF_RX, xmit=SER_MOTOR_TX, bits=8, ERRORS,
           STREAM=COM_BALANCE)
#use i2c(SLAVE, SDA=BEAGLE_SDA, SCL=BEAGLE_SCL, address=BEAGLE_ADDRESS, FORCE_HW)
#use rs232(baud=9600, parity=N, xmit=LCD_TX, bits=8, STREAM=COM_LCD)
#use rs232(baud=9600, parity=N, xmit=TTS256_TX, rcv=TTS256_RX, bits=8,
           STREAM=COM_TTS256)
#ifdef VRBOT_USE_UART
#use rs232(baud=9600, parity=N, xmit=VRBOT_TX, rcv=VRBOT_RX, bits=8,
           STREAM=COM_VRBOT)
#else
#use i2c(master, sda=VRBOT_SDA, scl=VRBOT_SCL, STREAM=COM_VRBOT)
#endif
#use rs232(baud=9600, parity=N, rcv=GPS_RX, bits=8, STREAM=COM_GPS)
Listing 3.22: #use directives used with our PIC

Since I am using a fairly advanced programming feature, the RTOS, I needed to define tasks that operate
at a fixed rate and must not exceed their maximum execution time; failing to tune the periods or the task
contents may result in a malfunctioning program. See Listing 3.23.

//Every 20ms - 9DOF services task
#task(rate=20ms, max=20ms, queue=2)
void service_9dof();
//Every 100ms - Proximity services task
#task(rate=100ms, max=20ms, queue=2)
void service_proximity();
//Every 100ms - VRbot services task
#task(rate=100ms, max=20ms)
void service_vrbot();
//Every 100ms - Beagle services task
#task(rate=100ms, max=20ms)
void service_beagle();
//Every 100ms - Battery services task
#task(rate=100ms, max=20ms, queue=2)
void service_battery();
//Every 20ms - GPS services task
#task(rate=20ms, max=20ms)
void service_gps();
Listing 3.23: RTOS tasks definitions and services prototypes

Now, before diving deeper into the code, we need to have a look at the main() function and what goes
on inside it. It starts by initializing the ADC for the battery and proximity sensors, then initializes the modules
connected to the PIC, and starts the LCD indications and the TTS256 text-to-speech converter. Note that I
aimed at having this converter talk in the Arabic language, so that it would be more acceptable in my
society. After that, I start the RTOS services; if one or more critical services fail to start, the LCD displays the
message "System terminated!" before going down. See Listing 3.24.

void main(void)
{
    setup_adc_ports(AN0_TO_AN2 | VSS_VDD);
    setup_adc(ADC_CLOCK_DIV_32);

    sf9dof_init();
    init_motors();

    tts256_init();
    lcd_init(false);
    delay_ms(1500); //LCD Splash

    fprintf(COM_LCD, "Starting up ...");
    fprintf(COM_TTS256, "Gaarri bedee altaashgheel.\r\n");

    lcd_clear();
    fprintf(COM_LCD, "Detecting VRbot module ...");
    // Detect VRbot module
    if (VRbot_Detect() == false)
    {
        fprintf(COM_LCD, "failed.");
        return;
    }
    fprintf(COM_LCD, "done.");

    lcd_clear();
    fprintf(COM_LCD, "Setting up VRbot module ...");
    VRbot_SetDelay(0);
    fprintf(COM_LCD, "done.");

    lcd_split_screen();
    lcd_update_msg("Listening ...");

    beagle_init();
    gps_init();
    rtos_run();

    lcd_split_screen();
    lcd_update_msg("System terminated!");
}

Listing 3.24: main() function and associated initialization

Now that you have had a complete overview of the PIC program, let us discuss the functionality of this
real-time operating system in detail.

Starting with the 9DOF, we keep checking whether the tilt degree has changed; if so, we display the
degree on the screen and inform the motor controller of the change, so that it can compensate by
accelerating at the proper ratio for the tilt degree. See Listing 3.25.

void service_9dof()
{
    if (sf9dof_kbhit())
    {
        tilt = sf9dof_getc();
        lcd_update_tilt(tilt);
        if (tilt != 0)
        {
            motor_tilt_control(tilt);
            if (abs(tilt) > 10)
            {
                lcd_update_msg("I am falling. X(");
            }
        }
    }
}

Listing 3.25: 9DOF RTOS Routine

The proximity sensor is an analog sensor on ADC channel 1; we read it and apply the proper conversion
to centimeters, enabling the robot to stop once an obstacle is detected within a range of 25 cm. See
Listing 3.26.

void service_proximity()
{
    set_adc_channel(1);
    delay_us(20);
    //Sensor: 3.845mV/cm; a 10-bit ADC over Vss-Vdd (5V) gives ~4.888mV/count,
    //i.e. ~0.787 counts/cm, so distance = ~1.27 * reading:
    //distance = (unsigned int16)(1.267 * (double)read_adc());
    //The 1.64 factor below appears to be an empirically tuned value.
    distance = (double)(1.64 * (double)read_adc());
    lcd_update_distance(distance);
    //Stop if an obstacle exists at 25cm or less
    if (distance <= 25)
    {
        lcd_update_msg("Obstacle detected!");
        motor_speed_control(0);
    }
}
Listing 3.26: Proximity RTOS Routine

The battery routine is almost the same as that of the proximity sensor; I placed two 4.7 kΩ resistors in
series across the battery terminals and used the middle node as an input to the PIC (the battery is 7.4 V).
See Listing 3.27.

boolean battery_warning = false;

void service_battery()
{
    set_adc_channel(2);
    delay_us(20);
    //3.88V at 100% to 1.75V at 0%
    //10-bit ADC, Vss-Vdd
    //ADC: 794 counts at 100% to 358 counts at 0%, so
    //percent = (reading - 358) / (794 - 358) * 100 = 0.229*reading - 82.11,
    //smoothed with a simple 70/30 low-pass filter:
    battery = 0.7 * battery + 0.3 * (-82.11 + 0.229 * (double)read_adc());
    if (battery > 100) battery = 100;
    if (battery < 0) battery = 0;
    lcd_update_battery(battery);
    if (battery > 40) battery_warning = false;
    if (battery < 30 && battery_warning == false)
    {
        fprintf(COM_TTS256, "Enteebaah, Albattarria zaft.\r\n");
        lcd_update_msg("Battery low!!");
        battery_warning = true;
    }
}
Listing 3.27: Battery RTOS Routine

The VRbot keeps waiting for the trigger phrase (the trigger phrase is "Roboty") before listening to voice
commands from the operator; I cannot allow it to act on every word, even words not directed at it, which is
why I use a trigger. After the trigger, it listens for a command and executes the associated operations; if no
command is recognized within three waiting periods, a timeout occurs. See Listing 3.28.

void service_vrbot()
{
    unsigned int8 i;
    signed int8 cmd;

    if (VRbot_GetGroupCount(GROUP_0) == 0) rtos_terminate();
    do
    {
        rtos_yield();
        VRbot_RecognizeSD(GROUP_0);
        while (!vrbot_kbhit()) rtos_yield();
        cmd = vrbot_getc();
        if (cmd != STS_SIMILAR && cmd != STS_RESULT)
            continue;
        vrbot_putc(ARG_ACK);
        while (!vrbot_kbhit()) rtos_yield();
        cmd = vrbot_getc() - ARG_ZERO;
    } while (cmd != G0_ROBOTY);

    fprintf(COM_TTS256, "Naam, Saidi.\r\n");
    lcd_update_msg("Yes, Sir");
    i = 0;
    while (true)
    {
        rtos_yield();
        if (VRbot_GetGroupCount(GROUP_1) == 0) rtos_terminate();
        VRbot_RecognizeSD(GROUP_1);
        while (!vrbot_kbhit()) rtos_yield();
        cmd = vrbot_getc();
        i++;
        if (cmd != STS_SIMILAR && cmd != STS_RESULT)
        {
            if (i >= 3)
            {
                lcd_update_msg("Timeout ...");
                return;
            }
            continue;
        }
        vrbot_putc(ARG_ACK);
        while (!vrbot_kbhit()) rtos_yield();
        cmd = vrbot_getc() - ARG_ZERO;
        break;
    }
    //handle the cmd; no need to list the actions here, refer to Appendix IV
}
Listing 3.28: VRbot RTOS Routine

69
PIC® Software

The BeagleBoard, on the other hand, operates as the I²C master of the bus, so I programmed the PIC as
a slave. The PIC keeps waiting for a status change from the Beagle that reports the recognition result, and
acts accordingly. See Listing 3.29.

const char *person_array[] = {"Hamdi", "Khalid", "Abdulahman", "Tareeq", "Moharam",
                              "AlBalasi", "Father"};
char recognize_msg[64];

void service_beagle()
{
    if (beagle_status() != 0)
    {
        if ((signed int8)beagle_status() > 0)
        {
            fprintf(COM_TTS256, "Taam eltaarf aaleek.\r\n");
            sprintf(recognize_msg, "%s recognized",
                    person_array[beagle_status() - 1]);
            lcd_update_msg(recognize_msg);
        }
        else
        {
            sprintf(recognize_msg, "Not recognized (#%d)", beagle_status());
            fprintf(COM_TTS256, "Fasheel eltaarf aaleek.\r\n");
            lcd_update_msg(recognize_msg);
        }
        beagle_status_reset();
    }
}
Listing 3.29: BeagleBoard RTOS Routine

Note that when the Beagle asks for input, the PIC tells it whether or not to capture a picture and
recognize the face, according to the command received from the VRbot. This code is driven by the interrupt
routine shown in Listing 3.30.

#INT_SSP
void ssp_interupt()
{
    capture_i2c_state = i2c_isr_state();

    if (capture_i2c_state < 0x80) //Beagle is sending status
    {
        capture_status = i2c_read();
    }
    if (capture_i2c_state == 0x80) //Beagle is requesting trigger
    {
        i2c_write(capture_trigger);
        capture_trigger = 0;
    }
}

Listing 3.30: I²C slave interrupt routine

From my point of view, there is no need to dive any deeper into the code unless you are interested in
the details. If so, move on to Appendix IV and read it yourself; I suspect you do not need my help there,
since you seem to be an advanced programmer.


Chapter 4. Operation and Maintenance

Note that I am writing this chapter while the robot is still under
testing and modification, so you might find that some contents here
differ from the final prototype. Also, some functions are still under
development and do not yet operate properly, so their descriptions
here are partly based on expectations.

Hearing and Talking

Face Recognition

Balancing and Moving

Errors and Maintenance


4.1 Voice Commands

VRbot offers a voice recognition service that is programmed via its associated interface called "VRbot GUI
1.1". Using this interface, I programmed the following commands shown in Table4.1.

Table 4.1: Voice commands, their functions, and their pronunciations.

Code Type Name Function Pronunciation (Arabic)

0 Trigger G0_ROBOTY Trigger the robot to start listening for a voice ‫روﺑوﺗﻲ‬
command

0 Command G1_FORWARD Moves the robot forward ‫إﻟﻰ اﻷﻣﺎم‬

1 Command G1_BACKWARD Moves the robot backward ‫إﻟﻰ اﻟﺧﻠف‬

2 Command G1_LEFT Turn left by 90 degrees ‫إﻟﻰ اﻟﯾﺳﺎر‬

3 Command G1_RIGHT Turn right by 90 degrees ‫إﻟﻰ اﻟﯾﻣﯾن‬

4 Command G1_STOP Stop moving ‫ﺗوﻗف‬

5 Command G1_INTRO The robot introduces itself ‫ﻋرف ﺑﻧﻔﺳك‬

6 Command G1_NAME The robot says its name ‫ﻣﺎ أﺳﻣك؟‬

7 Command G1_FROM The robot states its nationality ‫ﻣن أﯾن أﻧت؟‬

8 Command G1_HELLO The robot greets you ‫اﻟﺳﻼم ﻋﻠﯾﻛم‬

9 Command G1_WHOAMI The robot recognizes your face ‫ھل ﺗﻌرﻓﻧﻲ؟‬

10 Command G1_CAMUP The robot moves the webcam upward ‫اﻟﻛﺎﻣﯾرا ﻟﻸﻋﻠﻰ‬

11 Command G1_CAMDOWN The robot moves the webcam downward ‫اﻟﻛﺎﻣﯾر ﻟﻸﺳﻔل‬

12 Command G1_CAMLEFT The robot turns the webcam left ‫اﻟﻛﺎﻣﯾرا ﻟﻠﯾﺳﺎر‬

13 Command G1_CAMRIGHT The robot turns the webcam right ‫اﻟﻛﺎﻣﯾرا ﻟﻠﯾﻣﯾن‬

After training the VRbot on the desired commands and their pronunciations, it successfully responded
to the voice commands and acted accordingly. I connected this module to the PIC, along with the other
modules, and it was able to instruct them to act as commanded.


4.2 Talking

One of the modules that acts in conjunction with the VRbot and many other modules is the TTS256, which is
capable of synthesizing text into speech. Using these modules I was able to produce the spoken responses
listed below in Table 4.2.

Table 4.2: Speech responses, their meanings, and their triggers.

Phrase (Arabic) Meaning Trigger

‫ اﻟرﺟﺎء اﻹﻧﺗظﺎر‬Please wait. G1_WHOAMI

‫ أﻧﺎ ﻣن اﻟﯾﻣن‬I am from Yemen G1_FROM

‫اﻟﺑطﺎرﯾﺔ ﺿﻌﻔت‬, ‫ إﻧﺗﺑﺎه‬Alert! battery low! Battery voltage very low

‫ إﺳﻣﻲ روﺑوﺗﻲ‬. ‫ﺗم ﺗﺻﻣﯾﻣﻲ ﺑواﺳطﺔ ﺣﻣدي ﺳﺣﻠول‬. ‫ﺷﻛرا‬ً My name is Roboty, designed by Hamdi Sahloul. Thanks. G1_INTRO

‫ إﺳﻣﻲ روﺑوﺗﻲ‬My name is Roboty G1_NAME

‫ ﻓﺷل اﻟﺗﻌرف ﻋﻠﯾك‬Recognition failed BeagleBoard detection result

‫ ﺟﺎري ﺑدء اﻟﺗﺷﻐﯾل‬Starting up Power connected, or restart pressed

‫ﺳﯾدي‬, ‫ ﻧﻌم‬Yes, Sir G0_ROBOTY

‫ ﺗم اﻟﺗﻌرف ﻋﻠﯾك‬Recognition succeeded BeagleBoard detection result

‫ وﻋﻠﯾﻛم اﻟﺳﻼم‬Peace upon you too G1_HELLO


4.3 Face Recognition

Once the VRbot receives the voice command G1_WHOAMI, it informs the PIC, which communicates with the
BeagleBoard to capture an image via the QuickCam connected to it. This process involves heavy processing
on the BeagleBoard: after detecting the face, it normalizes the image and compares it to a prebuilt database
of faces. If matched, it returns the index of the person who was recognized; if not, an error code is returned.
For the error codes and their explanations, please refer to Table 3.1.

After that, the result is retrieved by the PIC, which in turn commands both the LCD and the TTS256 to
indicate the results; the LCD shows more detailed results, including the error code in case of failure or the
person's name in case of success.

Figure 4.1: QuickCam capturing an image

Finally, I had something solid that I could put in this documentation; note the orange LED around the
Logitech logo while the QuickCam captures an image after being asked to do so. See Figure 4.1.

Figure 4.2: LCD displays the recognition result

But since I asked it to detect the face in front of it while there was no person there, the TTS256 said that
face recognition failed, and the LCD indicated the error code as well. See Figure 4.2.

Have no worries though; this module is working like a charm by now. It is only that I had not yet built a
good database with which to show you some examples here.


4.3.1 Camera Movements

I am running out of time and this module is not implemented yet, but since I have experience with servo
motors (from when I was trying to use them instead of DC motors to drive the wheels), I expect it will be an
easy task to implement before the seminar date.

The camera is held by two perpendicular large servo motors that enable it to move in four directions,
just like a human neck. See Figure 4.3.

I still have enough pins on the PIC, and I expect the next two days will be dedicated to them, as well as
to the small servo motor holding the ultrasonic range finder (a replacement for the infrared proximity sensor
used earlier, which was generating noise on the power network and whose readings were not entirely
stable).

Figure 4.3: QuickCam on top of two large servos acting as neck

By the way, these servos can each turn through about 180 degrees, enabling the robot to look in almost
all the directions a normal human could. I am really excited about the applications of this, such as motion
detection; I leave that as a future recommendation.

4.4 Balancing

This is the most important feature I have been trying to implement. I went all the way through the troubles
of modern digital controllers and traditional PID controllers to get this done, but at this moment I am still
tuning the PID controller to respond to the tilt degree reported by the 9DOF, which is itself stable by now.
Right now I have a drunken robot that cannot stand up for even one second, but I will do my best to tune the
PID controller so that it responds as expected, maintaining the robot's balance for hours, even while moving
or being pushed with moderate force.

Some people say that a two-wheeled balancing robot is impossible, but I do not believe so; mine is just a
little longer and heavier than what its motors and wheels can handle, and maybe that is my biggest problem.
But since I am already out of time, I cannot start building a new prototype to experiment with the
differences.
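For reference, the control law being tuned is the standard discrete PID; a minimal sketch, where the gains and names are placeholders rather than the values I ended up with:

// One PID step, called at a fixed period (e.g. every 20 ms).
// 'tilt' is the filtered angle from the 9DOF; the setpoint is 0 (upright).
static double Kp = 1.0, Ki = 0.0, Kd = 0.0; // placeholder gains to be tuned
static double integral = 0, prevError = 0;

int pid_step(double tilt, double dt)
{
    double error = 0.0 - tilt;
    integral += error * dt;                       // accumulated error (I term)
    double derivative = (error - prevError) / dt; // rate of change (D term)
    prevError = error;
    return (int)(Kp * error + Ki * integral + Kd * derivative);
}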


4.5 Walking

If I add a third (non-driven) wheel or more and disable the functionality of the 9DOF, this robot moves
beautifully in all directions, just as instructed. This is a temporary solution until I solve the balance
problem.

This robot is a differential wheeled robot[63], whose movement is based on two separately driven
wheels placed on either side of the robot body. It can thus change its direction by varying the relative rate of
rotation of its wheels and hence does not require an additional steering motion.
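To illustrate the differential-drive idea, the two wheel speeds can be derived from a desired forward speed and turn rate. A minimal sketch; motor_speed_control_lr() is a hypothetical helper standing in for the serial motor controller interface:

// Differential steering: the difference between wheel speeds sets the turn.
void drive(int forward, int turn) // each roughly -100..100
{
    int left  = forward + turn;
    int right = forward - turn;
    motor_speed_control_lr(left, right);
}

// drive(50, 0) -> straight ahead at half speed
// drive(0, 40) -> spin in place (left wheel forward, right wheel backward)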

Currently I assume that turns are made only while the robot is stopped, but I expect that I (or maybe
you, the reader) will update this code to allow turning while moving in the future.

The only problem I am experiencing is that the motors are too loud while operating, especially at
maximum speed; their noise prevents the robot from hearing me until it stops by itself upon reaching an
obstacle.

4.5.1 Avoiding obstacles

I was using the infrared proximity sensor, but it was a big failure in both accuracy and power
consumption, so I switched to the ultrasonic range finder, which is doing great: once an obstacle is detected
within a range of 25 cm, it stops the robot from moving forward.

I have not yet implemented control of the small servo that holds the range finder, which would enable
the robot to turn left or right once an obstacle is detected, according to the readings from the two sides.

4.6 Navigation

There is a GPS chip and helical antenna next to the 9DOF, under the QuickCam, intended to be used for
navigation of the robot. To be honest, I think this unit is not suitable for this robot: the robot is an indoor
type and cannot navigate outside, especially on two wheels, which could cause it a lot of trouble on rough
surfaces and in strong winds. Moreover, I have already noticed that this GPS unit does not obtain its
geographical position unless it is in an open area, meaning it would probably not operate in an indoor
environment.

In addition, I have only interfaced this unit; I have not yet implemented its protocol in order to extract
its data. Even though the implementation is quite simple, my limited time cornered me into leaving it as a
future development, beyond the limits of this project presentation.
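For whoever picks this up: the unit speaks NMEA 0183, so position can be pulled from its $GPGGA sentences. A minimal sketch (<stdio.h> and <math.h> assumed; checksum validation omitted, and it assumes a full fix with all fields present):

// Parse "$GPGGA,hhmmss.ss,ddmm.mmmm,N,dddmm.mmmm,E,fix,sats,hdop,alt,M,...".
// Returns 1 on success, 0 on a malformed sentence or no fix.
int parse_gpgga(const char *line, double *lat, double *lon, double *alt)
{
    double rawLat, rawLon;
    char ns, ew;
    int fix;
    if (sscanf(line, "$GPGGA,%*f,%lf,%c,%lf,%c,%d,%*d,%*f,%lf",
               &rawLat, &ns, &rawLon, &ew, &fix, alt) != 6 || fix == 0)
        return 0;
    // NMEA packs degrees and minutes together as ddmm.mmmm.
    *lat = (int)(rawLat / 100) + fmod(rawLat, 100.0) / 60.0;
    *lon = (int)(rawLon / 100) + fmod(rawLon, 100.0) / 60.0;
    if (ns == 'S') *lat = -*lat;
    if (ew == 'W') *lon = -*lon;
    return 1;
}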


4.7 LCD display

I divided the view of the LCD into many regions listed below:

 The first line for the status indication of the current process.

 The second line is used to indicate the tilt degree of the robot, and the distance detected by the
range finder.

 The third line is used to indicate the battery power level and the altitude obtained from the GPS.

 The last line reports the latitude and longitude of the geographical position, also obtained from
the GPS unit. See Figure 4.4.

Figure 4.4: LCD regions and functionality.

4.7.1 Status indication

As you have already seen, this LCD reports various types of status messages as well as sensor readings.
You can use these indicators to diagnose the system and trace problems; a basic background in the normal
operation flow is a great help in detecting problems here.


If any glitch is detected within the system, the first procedures should involve checking the power
connection to the malfunctioning module, its data and control wires to the controlling module, and any
suspicious solder short circuits or wrong pin mappings.

Also, almost all the modules offer LED indications of power and signaling, which helps in case of LCD
display problems. A list of the messages and their meanings is shown in Table 4.3.

Table 4.3: LCD messages and their meanings.

Message Meaning

Starting up... Indicates that the PIC is starting properly

Detecting VRbot module... done. Indicates that the VRbot module is communicating
properly.

Setting up VRbot module... Modifying some setting within VRbot to make it faster

Listening... The boot-up completed and the robot is ready to receive the trigger.

Yes, Sir Indicates that the trigger is understood and the robot
is now waiting for the command

Detecting face... Indicates that the command of G1_WHOAMI received

%s recognized Prints out the name of the person whom his face
recognized

I am from Yemen Printed when G1_FROM command received

Introducing myself Printed when G1_INTRO command received

My name is Roboty Printed when G1_NAME command received

Peace upon you too Printed when G1_HELLO command received

Moving CAM upward... Printed when G1_CAMUP command received

Moving CAM downward... Printed when G1_CAMDOWN command received

Moving CAM left... Printed when G1_CAMLEFT command received

Moving CAM right... Printed when G1_CAMRIGHT command received

Moving forward... Printed when G1_FORWARD command received

Moving backward... Printed when G1_BACKWARD command received

Turning left... Printed when G1_LEFT command received


Turning right... Printed when G1_RIGHT command received

Stopping... Printed when G1_STOP command received

Battery low!! Printed when the battery level goes to 30% or under.

I am falling. X( Printed when the tilt degree goes to 10 degrees or greater.

Obstacle detected! Printed when distance sensed goes to 25 cm or less.

4.7.2 Errors reporting

Currently, there are a few errors and warnings that I encountered and implemented on the LCD to help
maintain the system; these errors/warnings and their reasons are shown in Table 4.4.

Table 4.4: LCD errors, warnings, and their reasons.

Message Type Reason

Detecting VRbot module... failed. Error Printed before system collapse in case of
VRbot not responding.

Timeout... Warning No known command received within a specific time after the trigger

Not recognized (#%d) Warning Failed to recognize the person with the
error code specified

System terminated! Error Printed before system collapse in case of a fatal error, such as missing
commands inside the VRbot.


Chapter 5. Conclusions and Recommendations

I have learned that success and problems are inseparable pairs;
once you believe you have succeeded without problems, you are
heading for a big failure. For this reason, I am writing this chapter to
discuss the problems, the conclusions from both the successes and
failures I went through, and finally some recommendations for those
who want to follow my lead.

Challenges

Conclusion

Recommendations


5.1 Challenges

First of all, coming up with an idea is half of the project. I was not very careful about this in the beginning,
and it caused me a lot of problems: I spent many days and nights trying to do the impossible with limited
hardware, and when I ran into hardware-related issues, I discovered that I had not thought the idea through,
nor read enough about its applicability on such hardware. Thanks to this, I lost a lot of valuable time.

Raising funds was my second problem, as I am not that rich; I am a person in the third world who does
not currently work and who, at best, has an annual income of less than 3500 USD. I could not pay my whole
income into this robot. So I asked many companies and wealthy, well-known people to support the project,
and this was a real pain; I barely collected half the expected amount and needed to find other ways. I started
importing some hardware and managed to build some prototypes, which I used to persuade more
supporters, using that money again to enhance the prototypes, going in cycles again and again. The rest of
the money was paid from the family pocket, which was really hard on me, especially knowing they really
need this money but could not bear to see me fail.

By the way, customs did not leave me alone either. If I purchased items totalling about 120 USD or
more, they would usually hold the package until the customs fees were paid; they even held some packages
worth about 70 USD on suspicion that the invoices were replacement invoices. On the other hand, if I
reduced the number of items in each shipment to lower its price, I ran into another problem: shipment fees,
which add up quickly. What made matters worse was that the needed items were located in many different
places, requiring separate shipments and separate shipment fees. Some stores from which I was planning to
purchase many items also went out of stock, driving me insane.

Building prototypes was not easy either; I spent a great deal of time and money developing them, and
you may not believe that the prototypes cost more than the final implemented system. If you want to build
a prototype, try to ensure that its components are reusable in your next prototype; this was not quite my
situation, but since this is my first robot, I guess that is natural. If only I had found documentation of a
similar project, just like this documentation, I think it would have saved a lot of time and money; that is
really why I am writing this documentation in so much detail in the first place.

From a technical point of view, the PIC® microcontroller provides only one hardware UART interface
while I needed about six serial interfaces. This led me to use software UARTs, which require fast and efficient
processing of the incoming data despite the limited resources of the PIC; a real challenge that forced me to
look into the RTOS, a feature supported by only a few compilers, one of them being the CCS PCWH compiler,
licensed at 500 USD. Even after that, RTOS programming was a real nightmare that even experts complain
about.

Stabilizing the 9DOF sensors using a Kalman filter was not an idea I arrived at early; I spent many weeks
learning the new ATMega IDE and its assembly-like language before finding out that the sensors are not
usable without a digital controller, which drove me into another deep despair, diving into other people's
code designed for aircraft and the like.

I started with full-rotation servo motors to serve as the wheel motors, thinking that I could control the
speed using only PWM; I ended up with a big failure while the deadline was approaching, leaving me with
only one way to go: DC motors, which proved a failure as well. I soon discovered geared DC motors, which
had good torque and speed, but I was not able to control the speed correctly until I came upon the idea of
using encoders. These stages of development forced me to change many hardware designs and spend more
and more money without arriving at a solid solution. Now I see that such a solution lies in the PID controller,
which I am still tuning at this moment without luck; maybe the DC motors are not good enough, or maybe I
have an algorithm problem, I do not know yet.

Fortunately, I was familiar with Linux, but not with embedded Linux on a half-hand-sized board; the
Linux distribution this time was also one I had merely heard the name of, and I barely managed to install
Linux on the BeagleBoard. After that I came to another problem: installing the UVC driver for the QuickCam,
which required a collection of dirty tricks and Linux packages that need an Internet connection to the board.
I also needed I²C communication, forcing me to purchase a BeagleBuddy Zippy board; I was about to go
crazy over these unbelievable costs.

The BeagleBoard did not leave me alone either: it had to act as the master on the I²C bus, as Linux
removed support for slave operation just one revision before the installed one. This was a real headache,
and I sacrificed some processing time on the PIC by programming it as a slave instead of a master, reversing
the roles of operation.

Bad luck has no limits: some of the expensive items got destroyed because of small mistakes while
soldering or powering up without proper regulators.

5.2 Conclusions

Well, I am very satisfied with my first robot and have learnt a lot from the process of building it: from the
early stages of reading the books, through designing the body, sourcing the components and parts, and
programming it, to the final circuits and the successful "maiden voyage" of the robot using three wheels. It
felt like my own baby when it started talking and listening; I really like it very much.

Like most people, there are things I would do differently if I were to build the same robot again, but
most of them are quite minor. The main changes I would make would be to reduce the robot's size and to
build the circuits on a PCB instead of using point-to-point wiring.

Programming the components was the most time consuming and fiddly part of building the robot,
partly due to the decision to build a two wheels balancing robot (but it was worth it).

As for my next projects, I would probably build a “Humanoid robot” with two legs in the near future. I
am also quite keen on building a MAV (Micro Air Vehicle).

If you are seeking more knowledge for your academic studies or considering becoming a robotics
hobbyist, I hope this project had been helpful even for the first step in your project. And for those who
already are involved deeper in this great field, I hope this was not a total waste of time.


5.3 Recommendations

Thinking about it now, dreaming of this robot playing chess was not really a crazy idea after all. I hope you
may carry on my wish: implement the arm that I purchased but had no time to interface, extend the face
recognition code into chess-piece recognition, adapt some ready-to-use chess algorithm that can easily be
found on the Internet, move the arm to grip the robot's pieces, and let us play chess!

Balancing is still under development at this moment; if there is one reason for me to fail at it, it would
be the size of the robot. Try to reduce its size, get better motors and wheels, and give it your best. And what
about a single PCB that holds all the robot's modules, reducing its size and enhancing its quality and
communication accuracy?

My knowledge of ATMega microcontrollers is still limited; I needed a way to use the ADC pins as digital
I/O, but failed to find one within my time frame. If you manage to find that way, please apply it to the pins
reading the encoder of the serial motor controller, as this would enhance performance significantly.

Currently the Beagle does only offline training; I think it would be better to implement online training
code. The same goes for the VRbot: I think the VRbot would be exquisite if it were programmable using
voice commands.

Also, try to use the QuickCam for motion detection, then move the robot's neck to follow the
movement; this practice will guide you to more sophisticated vision algorithms and enhance this robot's
capabilities.

The current implementation of the turning algorithm cannot turn while moving; try to modify it so that
it can, using the differential concept.

The GPS unit is, at the moment, just for show; I need someone to implement its routines so that it starts
reporting the correct geographical position information.

Do you have any further ideas? Try them out and share them with others!


Creative commons attribution license

Resources

Beagleboard and Ångström distribution

Full robot software

Microcontroller's protocols and signals

Kalman filter

PID controller


Appendix I. Creative Commons Attribution license

Proper attribution is required when you reuse or create modified
versions of hardware or content that appears in a work made
available under the terms of the Creative Commons Attribution
license. The complete requirements for attribution can be found in
section 4b of the Creative Commons legal code located at
http://creativecommons.org/licenses/by/3.0/legalcode.

Exact reproduction

Modified versions

Other media

Contact


In practice I ask that you provide attribution to Hamdi Sahloul to the best of the ability of the medium in
which you are producing the work.

There are several typical ways in which this might apply:

I.1 Exact Reproductions

If your work exactly reproduces hardware, text or images from this project, in whole or in part, please
include a paragraph at the bottom of your work that reads:

Portions of this work are reproduced from work created and shared by Hamdi M. Sahloul and used
according to terms described in the Creative Commons 3.0 Attribution License.

Also, please state the original source (and links if applicable) so that readers can refer there for more
information.

I.2 Modified Versions

If your work shows modified hardware, text or images based on the content from this project, please include
a paragraph at the bottom of your work that reads:

Portions of this work are modifications based on work created and shared by Hamdi M. Sahloul and
used according to terms described in the Creative Commons 3.0 Attribution License.

Again, please state the original source (and links if applicable) so that readers can refer there for more
information. This is even more important when the content has been modified.

I.3 Other Media

If you produce some works, such as books, audio, or video, I ask that you make your best effort to include a
spoken or written attribution in the spirit of the messages above.

I.4 Contact

If you have questions or suggestions regarding my hardware, documentation and code license policies,
please email me at hamdisahloul@hotmail.com.

Although I am unable to send personal responses to every email I receive, I do read all of the feedback
that is submitted and will use it to improve my hardware, documentation and code.

Appendix II. Resources

You may have already found that this project consists of many
hardware and software components, which may make it very
difficult to realize, so I think I have to (at least) help you locate
these resources.

Table of resources

Alternative solutions

Cost Estimation


You may just need to get some money and leave the rest to me. Have a look at Table II.1.

Table II.1: Resources used for the robot system

Resource Location Price (USD)

Beagle Board http://search.digikey.com/scripts/DkSearch/dksus.dll?lang=en&site=US&KeyWords=296-23428-ND 149.00

BeagleBuddy Zippy http://www.tincantools.com/product.php?productid=16147 79.00

5V Regulated Transformer http://search.digikey.com/scripts/DkSearch/dksus.dll?lang=en&site=US&KeyWords=T377-P5P-ND 18.30

DB9M-to-USB adaptor http://www.sparkfun.com/commerce/product_info.php?products_id=8580 12.95

IDC10-to-DB9M serial cable Local Store 5.00

DB9F-to-DB9F null modem cable http://search.digikey.com/scripts/DkSearch/dksus.dll?lang=en&site=US&KeyWords=ae9879-nd 6.57

Ethernet cable Local Store 5.00

SD Card – 4GB storage space Local Store 15.00

SD Card reader http://www.sparkfun.com/commerce/product_info.php?products_id=8698 9.95

PIC18F2455 Microcontroller http://www.sparkfun.com/commerce/product_info.php?products_id=9324 5.95

Olimex PIC-P28-USB http://www.sparkfun.com/commerce/product_info.php?products_id=19 30.95

24LC256 EEPROM http://www.sparkfun.com/commerce/product_info.php?products_id=525 1.95

Olimex PIC-MCP-USB http://www.sparkfun.com/commerce/product_info.php?products_id=4 91.95

MPLAB IDE http://ww1.microchip.com/downloads/en/DeviceDoc/MPLAB_IDE_v8_56.zip Free

CCS PCWH Compiler http://www.ccsinfo.com/product_info.php?products_id=PCWH_full 500.00

VRbot Module http://www.sparkfun.com/commerce/product_info.php?products_id=9753 57.95

VoiceBox Shield http://www.sparkfun.com/commerce/product_info.php?products_id=9799 39.95

TTS256 http://www.sparkfun.com/commerce/product_info.php?products_id=9811 21.95

Character LCD with Backpack http://www.sparkfun.com/commerce/product_info.php?products_id=9568 29.95

GPS Module http://www.sparkfun.com/commerce/product_info.php?products_id=9566 79.95

GPS Interface cable http://www.sparkfun.com/commerce/product_info.php?products_id=9123 2.95

Logic Level Converter http://www.sparkfun.com/commerce/product_info.php?products_id=8745 1.95

9DOF - Razor IMU - AHRS compatible http://www.sparkfun.com/commerce/product_info.php?products_id=9623 124.95

Pocket AVR Programmer http://www.sparkfun.com/commerce/product_info.php?products_id=9825 14.95

FTDI Basic Breakout - 3.3V http://www.sparkfun.com/commerce/product_info.php?products_id=10009 13.95

Arduino IDE http://arduino.googlecode.com/files/arduino-0019.zip Free

Wheel encoder set http://www.pololu.com/catalog/product/1218 39.95

250:1 Micro Metal Gearmotor HP http://www.pololu.com/catalog/product/995 15.95 x 2

Serial controlled motor driver http://www.sparkfun.com/commerce/product_info.php?products_id=9571 19.95

Infrared Proximity Sensor (bad quality; get the ultrasonic range finder if you can afford it) http://www.sparkfun.com/commerce/product_info.php?products_id=8958 14.95

Ultrasonic Range Finder (better replacement for the proximity sensor) http://www.sparkfun.com/commerce/product_info.php?products_id=9491 49.95*

Servo – Large http://www.sparkfun.com/commerce/product_info.php?products_id=9064 12.95 x 2

Servo – Small http://www.sparkfun.com/commerce/product_info.php?products_id=9065 8.95

QuickCam Pro 9000 http://www.bhphotovideo.com/c/product/505399-REG/ 63.49

Ångström Linux distribution http://www.angstrom-distribution.org/demo/beagleboard/ Free

OpenCV http://sourceforge.net/projects/opencvlibrary/files/opencv-unix/2.1/OpenCV-2.1.0.tar.bz2/download Free

LiPoly Battery - 2200mAh 7.4v http://www.sparkfun.com/commerce/product_info.php?products_id=9703 14.95

Accucel-6 Charger http://www.sparkfun.com/commerce/product_info.php?products_id=9705 42.95

BeagleJuice http://www.liquidware.com/shop/show/BB-BJC/BeagleJuice 88.73

Gross Total 1671.79 + 35**

*: Optional items, but preferred
**: The price difference in case of taking all the optional items

I hope that the price is acceptable and that you can manage to get this amount of money. In reality, I spent much more on my prototypes and experiments, as well as on the components I fused and destroyed along the way.

The stores above may not be the best priced, but I did another calculation: I have to pay shipping and handling for each package I import, and that would cost far more than ordering everything from about two or three stores, even if those stores do not offer the best price on every item.

By the way, I also purchased a soldering station, an oscilloscope, and many USB cables and jumper wires. You may or may not need to purchase these, depending on the tools you already own; if you do not, go to SparkFun or Digi-Key and search for some.

Appendix III. BeagleBoard and Ångström Distribution

Since we have connected the Beagle Board to the BeagleBuddy Zippy, we are ready to build our development environment.

Linux installation

Package manager

OpenCV


The Beagle Board is a pocket-sized reference board (see Figure 2.22) containing a Texas Instruments
OMAP3530 system-on-a-chip (SoC) processor (ARM Cortex-A8 core) running at up to 600MHz.[54] I picked
the Beagle Board because it is an inexpensive platform capable of running Linux.

III.1 Building your development environment

The following items are required to set up our Beagle Board with Ångström and associated software:

• A desktop or laptop computer to act as the host platform.

• A 5mm barrel 5V regulated transformer or USB mini-A-to-USB-A female On-The-Go (OTG) cable.

• DB9M-to-USB adaptor (optional if your host platform has an RS-232 port).

• IDC10-to-DB9M serial cable.

• DB9F-to-DB9F null modem cable.

• Ethernet cable connected to the internet.

• SD Card of at least 4GB storage space.

• SD Card reader.

III.1.1 Connections

The combination of the first three cables gives you a serial connection, which enables you to watch and
interact with the board's bootloader and operating system through a terminal emulation program on your
host platform.

III.1.2 Setting up the console

The only way to know what is happening on the board is to watch and interact with its serial output through
a console. You do this by setting up a terminal emulation program. When the serial port is configured on the
host platform, any activity on the board is displayed on the console, including prompts for bootloader
commands, operating system login prompts, and error messages.


III.1.2.1 Setting up the console on Linux

To set up a serial console under Linux, use a terminal emulation program such as “minicom”, as shown here:

1. Install “minicom” on your system.

2. Launch into setup as root by running the command “sudo minicom -s”.

3. Select Serial Port Setup, and then press Return.

4. Set the serial port by typing A to select Serial Device, set the device to /dev/ttyS0, and
then press Return.

5. Ensure that the communications settings (E) are 115200 8N1.

6. Turn Hardware Flow Control off by typing F.

7. Ensure that Software Flow Control is also off.

8. Press Return to go back to the main menu.

9. Choose Save Setup as “dfl” to use these settings by default.

10. Choose Exit to exit the setup section and launch “minicom” with these settings.
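
As a shortcut on later runs, most minicom builds also accept these settings directly on the command line; a minimal sketch, assuming the same serial device:

sudo minicom -b 115200 -D /dev/ttyS0

Here “-b” sets the baud rate and “-D” selects the serial device.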

III.1.2.2 Setting up the console on Windows

To set up a serial console under Windows XP or Vista operating system, I recommend downloading “PuTTY”,
a terminal emulation program for Windows available at no cost. Configure “PuTTY” for serial use by clicking
Serial in the Session window. Next, click Serial at the bottom of the Category pane. Set the speed to 115200,
the data bits to 8, the parity to “None”, the stop bits to 1, and flow control to “None”.

III.1.3 Verifying setup

To find out whether your console is set up properly, apply power to the board, either by plugging in the USB
standard-A-to-mini-A device cable directly into the Beagle Board or by simply plugging the transformer cable
into the power jack. If all is correct, the text in Listing III.1 appears on the console.


Texas Instruments X-Loader 1.4.4ss (Apr 13 2010 - 22:36:28)


Beagle Rev C4
Loading u-boot.bin from nand

U-Boot 2010.03 (Jun 06 2010 - 10:00:15)

OMAP3530-GP ES3.1, CPU-OPP2, L3-165MHz, Max clock-720Mhz


OMAP3 Beagle board + LPDDR/NAND
I2C: ready
DRAM: 256 MB
NAND: 256 MiB
In: serial
Out: serial
Err: serial

Probing for expansion boards, if none are connected you'll see a


harmless I2C error.

Recognized Tincantools Zippy expansion board (rev 1)


Beagle Rev C4
Die ID #6e0a000400000000040373051500a00a
Hit any key to stop autoboot: 0
No MMC card found
Booting from nand ...

NAND read: device 0 offset 0x280000, size 0x400000


4194304 bytes read: OK
Wrong Image Format for bootm command
ERROR: can not get kernel image!
OMAP3 beagleboard.org #

Listing III.1: Bootloader output

The prompt at the end of the output is from the second-stage bootloader, waiting for instructions on
how to load the operating system.
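
You can also explore this prompt before installing anything; for example, U-Boot's printenv command lists the current environment variables, and help lists every command the bootloader understands:

printenv
help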

The host system is ready and the Beagle Board is set up. All you need now is an operating system.

III.1.4 Setting up the operating system

Downloadable binaries exist for many Linux distributions that run on the Beagle Board, with Ångström,
Maemo, Ubuntu, and Android being the most popular. All are under active development, and all have been
demonstrated in public by professionals and hobbyists alike. I installed the Ångström distribution, which is
well tested and lean enough that it turns the Beagle Board into an effective Linux desktop machine and not-
so-thin client.


III.1.5 The Ångström Linux distribution

The Ångström distribution contains four major components. They are shown below in the order in which you
must copy them to the SD card, as the bootloaders must appear first on the card:

• First-stage bootloader (MLO)

• Second-stage bootloader (u-boot.bin)

• Linux boot image (uImage)

• Linux file system

The Beagle Board's firmware contains a first-stage bootloader called X-loader. X-loader can also be
loaded from a removable storage space (such as an SD card) in a signed file called MLO. X-loader bootstraps
the system only enough to load the second-stage bootloader, which otherwise would not fit into memory.

The second-stage bootloader provided in flash memory on the Beagle Board is U-boot, although most
distributions provide their own version of U-boot in a file called u-boot.bin. U-boot initializes the system,
then boots the Linux kernel. It can also be run from the console.

The Linux boot image, named uImage, finally boots the Linux kernel, which resides in the Linux file
system in the /boot directory.[54]

There are several ways to set up the file system; the method shown here requires a bit of work at the
beginning but is flexible.

III.1.6 Download the distribution

To assemble the Ångström distribution, you need the following files:

• Angstrom-Beagleboard-demo-image-glibc-ipk-2010.3-beagleboard.rootfs.tar.bz2

• MLO

• README.txt

• md5sums

• u-boot.bin

• uImage

Obtained from http://www.angstrom-distribution.org/demo/beagleboard/.

Double-check that each of the files has been downloaded properly. Navigate to the download directory
in a terminal window, type md5sum *, then compare the values for each file with the contents of the file
md5sums.
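
Alternatively, md5sum can perform the comparison itself; a minimal sketch, run from the download directory:

md5sum -c md5sums

Each file should be reported as “OK”.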


III.1.7 Partition the card

This method creates two partitions on the SD card. The first is a FAT32 partition that hosts the bootloaders
and the kernel image. The remaining space on the card is dedicated to a third extended file system (ext3)
partition.

The FAT32 partition contains the bootloaders and the raw Linux kernel image. FAT32 is used for the boot
partition because it is a very basic file system that is straightforward and understood by the Beagle Board by
default, requiring no intelligence from the bootloader or operating system.

The Linux root file system, however, can be in any file system format understood by the Linux kernel. I
am showing ext3, but the Journaling Flash File System version 2 (JFFS2) and SquashFS are also good choices,
particularly for flash-based storage systems.

Now, insert the SD card into your host platform (using the card reader) and create the two partitions on
the SD card using your favorite partitioning tool. Create a small, bootable FAT32 partition, followed by a
larger ext3 partition. Detailed instructions for this process using the fdisk utility are available on the Beagle
Board community page.
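
For orientation only, such a session looks roughly like the following sketch, assuming the card appears as /dev/sdX on the host (replace X after double-checking the device name, since partitioning the wrong disk destroys its data; the volume labels are illustrative and merely match the mount point names used in the unmount command later):

sudo fdisk /dev/sdX
(o: new partition table; n: primary partition 1 of about 64MB; t: set its type to c, “W95 FAT32 (LBA)”;
a: mark it bootable; n: primary partition 2 over the rest of the card; w: write and exit)
sudo mkfs.vfat -F 32 -n BeagleBoot /dev/sdX1
sudo mkfs.ext3 -L BeagleRootFS /dev/sdX2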

Remove and reinsert the card, and the two new partitions should mount on your host platform. (Under
Windows I was not able to mount ext3, so I switched to a bootable Linux CD to complete the setup; the
other steps can be done from Windows without problems.)

III.1.8 Copying the files into the disk

Now you are ready to copy the files into the disk. Make sure you copy them in this particular order:

1. Copy MLO onto the bootable FAT32 partition.

2. Copy u-boot.bin onto the bootable FAT32 partition.

3. Copy uImage onto the bootable FAT32 partition.

4. Extract the root file system into the ext3 partition. The easiest way to do this is from the
command line:

cd ext3FileSystem; tar xvjf downloadLocation/Angst*.tar.bz2

To unmount the partitions, run the following command:

cd ~; sync; sudo umount BeagleBoot; sudo umount BeagleRootFS

Finally, remove the SD card, then insert it into the Beagle Board.


III.2 Booting Linux

With the serial console visible on your host platform, plug the power cable into the USB hub. The text shown
in Listing III.1 should appear on the console.

There are two ways to instruct the Beagle Board to boot Linux from the card:

1. Remove power, then hold down the user button (closest to the outside of the board) while
reapplying power.

2. Type the following lines at the U-boot prompt to set the environment variables for booting
from the card:

setenv bootargs 'console=ttyS0,115200n8 root=/dev/mmcblk0p2 rw rootwait'
setenv bootcmd 'mmcinit; fatload mmc 0 80300000 uImage; bootm 80300000'
boot

Note that you can persist these environment variables to flash memory, so that the Beagle Board always
boots this way, by typing saveenv before booting with the boot command.

The first time Ångström boots, it inspects the buses on the system and initializes the Ethernet adapter,
and any other peripherals, then finds the correct drivers to use. Subsequent boots are much faster.

Figure III.1 shows the display after the boot process completes at the serial prompt.

Figure III.1: Ångström login display

Use the keyboard to provide a user name and password. You can use root as the user name with a blank
password.

You now have a fully functional Linux system running several applications as well as a package
manager (opkg). The best part is that this system is running on a tiny, inexpensive, fan-less computer that
consumes less than 4 Watts in total, probably about 1/25 of what the host system consumes. You have built
a thin client. It is time to start programming it. Processing images is not a dream anymore!


III.3 Installing required applications and libraries


Now plug your Ethernet cable into the BeagleBuddy Zippy board so that we can use the opkg package
manager to install our image processing applications and libraries, as well as the webcam driver and its
associated applications. To do that, run the following commands:

opkg update; opkg install --force-overwrite apm apmd cmake cpp curl dosfstools e2fsprogs e2fsprogs-e2fsck e2fsprogs-fsck ffmpeg-dev gcc libgio-2.0-0 libgobject-2.0-0 glib-1.2 glib-1.2-dbg glib-1.2-dev glibmm gst-opencv-dev gst-opencv-static gst-plugin-video4linux gst-plugin-video4linux-dev gstreamer gstreamer-ti i2c-tools kernel kernel-dev kernel-headers kernel-module-quickcam-messenger kernel-module-rtc-core kernel-module-rtc-ds1307 kernel-module-rtc-twl kernel-module-uvcvideo kernel-module-videobuf-dvb kernel-modules libcurl4 libglib-2.0-dev libjpeg-dev libv4l libv4l-dev lsof

opkg install --force-overwrite opencv opencv-apps opencv-dev opencv-doc opencv-samples openssl-dev python-compile python-compiler python-devel python-distutils python-opencv task-dvsdk-toolchain-target task-native-sdk ti-biosutils ti-cgt6x ti-cmem-module ti-codec-engine ti-codec-engine-apps ti-codec-engine-dev ti-codec-engine-examples ti-dmai ti-dspbios ti-dsplink ti-dsplink-module ti-framework-components ti-linuxutils ti-lpm-module ti-lpm-utils ti-sdma-module ti-xdais ti-xdctools tzdata-asia

mkdir /dev/misc; ln -s /dev/rtc /dev/misc/rtc; echo roboty > /etc/hostname; hostname roboty; export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH

The above commands will take a while to download and install the required packages and to set up some
environment requirements. Once execution completes, we can write our first test C program, which uses
the OpenCV library and the webcam to capture some pictures, then compile it and enjoy!

III.4 Example “capture.c”


Following (see Listing III.2) is our test program that uses OpenCV for capturing images from the webcam:

#include </usr/include/opencv/cv.h>
#include </usr/include/opencv/highgui.h>
#include </usr/include/opencv/cxcore.h>
#include "stdio.h"
#include "string.h"

int main(int argc, char **argv)


{
CvCapture *pCapture = 0;
IplImage *pVideoFrame = 0;
int i;
char filename[50];
// int ncams = cvcamGetCamerasCount(); //returns the number of available cameras in the system
// fprintf(stderr, "Number of cameras: %d\n", ncams);
// Initialize video capture
// pCapture = cvCaptureFromCAM(CV_CAP_ANY);

pCapture = cvCaptureFromCAM(-1);
if(!pCapture)
{
fprintf(stderr, "failed to initialize video capture\n");


return -1;
} // Capture three video frames and write them as files
for(i=0; i<3; i++)
{
pVideoFrame = cvQueryFrame(pCapture);
if(!pVideoFrame)
{
fprintf(stderr, "failed to get a video frame\n");
} // Write the captured video frame as an image file
sprintf(filename, "VideoFrame%d.jpg", i+1);
if(!cvSaveImage(filename, pVideoFrame, NULL))
{
fprintf(stderr, "failed to write image file %s\n", filename);
} // IMPORTANT: Do not release or modify the image returned
// from cvQueryFrame() !
} // Terminate video capture and free capture resources
cvReleaseCapture(&pCapture);
return 0;
}
Listing III.2: C code of “capture.c”

To edit a file on the Beagle Board, you need to use an editor. I am using the vim editor that comes with
the Ångström distribution, launched by issuing the command:

vim capture.c

Note that vim is a complicated editor for Linux beginners, though with time you may master it. All you
need to learn for now is how to paste my code into your console: press the “a” key on the keyboard to
start editing, then simply paste the code with a right mouse click in the PuTTY console window. After
pasting the code, press the “Esc” key, then type “:wq” and press Return/Enter. This saves the
file “capture.c” into the current working directory on the Beagle Board.

All we need now is to compile it using GCC, issue the following command:

gcc `pkg-config --cflags opencv` `pkg-config --libs opencv` -o capture capture.c

We now have an executable file called “capture” (named via the “-o” option) that can be
executed using the following command.

./capture

The webcam connected to the Beagle Board shall immediately capture 3 frames and save them into the
current working directory. We can check that by issuing the command:

ls -lia

You should be able to see a list of files including “VideoFrame1.jpg”, “VideoFrame2.jpg” and
“VideoFrame3.jpg”. This means that we succeeded in our mission: integrating OpenCV with our Ångström
Linux distribution, all within the Beagle Board.
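
If you would rather inspect the captured frames on the host platform, you can read the SD card with the card reader, or copy the files over the network; a sketch, assuming an SSH server (such as Dropbear) is running on the board, a root password has been set, and the board's IP address is known (the address below is only a placeholder):

scp root@192.168.1.20:VideoFrame*.jpg .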



Appendix IV. Full robot software

Fully functional software for all the programmable modules is included for reference and
reimplementation. Feel free to use it according to the license mentioned in Appendix I.

9DOF Software

Motor driver software

Beagleboard software

PIC software


IV.1 9DOF Software
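
The listing below runs on the 9DOF board's AVR microcontroller and fuses the accelerometer and gyro readings with a two-state Kalman filter whose state is the tilt angle alpha and the gyro bias. As a reading aid (not part of the original listing), each pass of the main loop evaluates, in the same matrix notation the code uses:

\[
\begin{aligned}
\hat{x} &= A\hat{x} + Bu, &
\mathrm{Inn} &= y - C\hat{x}, &
s &= CPC^{T} + S_z,\\
K &= APC^{T}/s, &
\hat{x} &\leftarrow \hat{x} + K\,\mathrm{Inn}, &
P &\leftarrow APA^{T} - KCPA^{T} + S_w,
\end{aligned}
\]

where u is the scaled gyro rate, y is the pitch derived from the accelerometer, and Sz and Sw are the measurement and process noise covariances hard-coded below. The full derivation is in Appendix VI; this summary only reflects what the statements below compute.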

#include <math.h>
#include <stdlib.h>
#include <stdio.h>
#include <avr/io.h>
#include <avr/pgmspace.h>

// missing headers workaround


#define TRUE 1
#define FALSE 0
#define outb(port, val) (port) = (val)
#define inb(port) (port)
#define TWCR_CMD_MASK 0x0F
#define BV(x) (1 << (x))

#define STATUS_LED 5 //stat LED is on PB5

#define sbi(var, mask) ((var) |= (uint8_t)(1 << mask))


#define cbi(var, mask) ((var) &= (uint8_t)~(1 << mask))

#define WRITE_sda() DDRC = DDRC | 0b00010000 //SDA must be output when writing
#define READ_sda() DDRC = DDRC & 0b11101111 //SDA must be input when reading - do not forget the resistor on SDA!!

///============Initialize Prototypes=====//////////////////
void init(void);
void UART_Init(unsigned int ubrr);
uint8_t uart_getchar(void);
static int uart_putchar(char c, FILE *stream);
void put_char(unsigned char byte);
static FILE mystdout;
void delay_ms(uint16_t x);
void i2cInit(void);
unsigned char i2cGetReceivedByte(void);

///============Function Prototypes=========/////////////////
uint16_t x_accel(void);
uint16_t y_accel(void);
uint16_t z_accel(void);
uint16_t x_gyro(void);
uint16_t y_gyro(void);
uint16_t z_gyro(void);

///============I2C Prototypes=============//////////////////
void i2cSendStart(void);
void i2cSendStop(void);
void i2cWaitForComplete(void);
void i2cSendByte(unsigned char data);
void i2cReceiveByte(unsigned char ackFlag);
void i2cSetBitrate(unsigned short bitrateKHz);
void i2cHz(long uP_F, long scl_F);

///============EEPROM Prototypes============//////////////////
void write_to_EEPROM(unsigned int Address, unsigned char Data);
unsigned char read_from_EEPROM(unsigned int Address);

104
Full robot software

///============Global Vars=========/////////////////
uint8_t status = 0;

///============Global Constants=========/////////////////
#define ACCEL_X_OFFSET 14
#define ACCEL_X_1G 272

#define ACCEL_Z_OFFSET -25


#define ACCEL_Z_1G 251

//Initial values to stabilize the system faster

//The Kalman filter would stabilize it even if these values were off
#define GYRO_X_OFFSET 378.62 //378.672 //378.842
// GYRO_X_SCALE is calculated so that alpha matches pitch
#define GYRO_X_SCALE -61.85 //-64.6

/////===========MAIN=====================/////////////////////
int main(void)
{
init();

sbi(PORTB, STATUS_LED);
delay_ms(1000);
cbi(PORTB, STATUS_LED);
delay_ms(1000);
sbi(PORTB, STATUS_LED);
delay_ms(1000);
cbi(PORTB, STATUS_LED);

int raw_x, raw_z, raw_gyro;


double x, z, scalled_x, scalled_z, pitch, gyro, scalled_gyro, alpha = 0, bias = 0;

// We are calculating output at 50Hz


const double dt = 0.02; // 1/Frequency
//x = A · x + B · u
// A[rows][cols]
double A[2][2] =
{
{1, -1*dt},
{0, 1}
};
double B[2][1] =
{
{dt},
{0}
};
double *u[1][1] =
{
{&scalled_gyro}
};
double *X[2][1] =
{
{&alpha},
{&bias}
};

105
Full robot software

double *y[1][1] =
{
{&pitch}
};
double C[1][2] = {1, 0};
double Inn[1][1];
double P[2][2] =
{
{1, 0},
{0, 1}
};
double Sz[1][1] = {17.2}; //Degrees - not radian
double s[1][1];
double K[2][1];
double Sw[2][2] =
{
{0.057, 0}, //Degrees - not radian
{0, 0.172} //Degrees - not radian
};

signed int latest_alpha = 0;


while(1)
{
raw_x = x_accel();
raw_z = z_accel();

x = (double)raw_x - ACCEL_X_OFFSET;
z = (double)raw_z - ACCEL_Z_OFFSET;

scalled_x = x / ACCEL_X_1G;
scalled_z = z / ACCEL_Z_1G;

pitch = atan2(scalled_x, scalled_z) / M_PI * 180; //Degrees - not radian

raw_gyro = x_gyro();
gyro = (double)raw_gyro - GYRO_X_OFFSET;

scalled_gyro = gyro / GYRO_X_SCALE;

// Start implementing the digital controller using the Kalman filter; for details, please refer to Appendix VI
*X[0][0] = (A[0][0] * (*X[0][0]) + A[0][1] * (*X[1][0])) + (B[0][0] *
(*u[0][0]));
*X[1][0] = (A[1][0] * (*X[0][0]) + A[1][1] * (*X[1][0])) + (B[1][0] *
(*u[0][0]));
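
// Innovation: measured pitch (from the accelerometer) minus the predicted angle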

Inn[0][0] = (*y[0][0]) - (C[0][0] * (*X[0][0]) + C[0][1] * (*X[1][0]));
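
// Innovation covariance: s = C*P*C' + Sz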

double CP[1][2] =
{
{C[0][0] * P[0][0] + C[0][1] * P[1][0], C[0][0] * P[0][1] + C[0][1]
* P[1][1]}
};
s[0][0] = (CP[0][0] * C[0][0] + CP[0][1] * C[0][1]) + Sz[0][0];

double AP[2][2] =


{
{A[0][0] * P[0][0] + A[0][1] * P[1][0], A[0][0] * P[0][1] + A[0][1]
* P[1][1]},
{A[1][0] * P[0][0] + A[1][1] * P[1][0], A[1][0] * P[0][1] + A[1][1]
* P[1][1]}
};
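// Kalman gain: K = A*P*C' / s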
K[0][0] = (AP[0][0] * C[0][0] + AP[0][1] * C[0][1]) / s[0][0];
K[1][0] = (AP[1][0] * C[0][0] + AP[1][1] * C[0][1]) / s[0][0];

*X[0][0] += K[0][0] * Inn[0][0];


*X[1][0] += K[1][0] * Inn[0][0];

double A_[2][2] =
{
{A[0][0], A[1][0]},
{A[0][1], A[1][1]}
};
double KC[1][1] =
{
{K[0][0] * C[0][0] + K[1][0] * C[0][1]}
};
P[0][0] = (AP[0][0] * A_[0][0] + AP[0][1] * A_[1][0]) - KC[0][0] *
(P[0][0] * A_[0][0] + P[0][1] * A_[1][0]) + Sw[0][0];
P[0][1] = (AP[0][0] * A_[0][1] + AP[0][1] * A_[1][1]) - KC[0][0] *
(P[0][0] * A_[0][1] + P[0][1] * A_[1][1]) + Sw[0][1];
P[1][0] = (AP[1][0] * A_[0][0] + AP[1][1] * A_[1][0]) - KC[0][0] *
(P[1][0] * A_[0][0] + P[1][1] * A_[1][0]) + Sw[1][0];
P[1][1] = (AP[1][0] * A_[0][1] + AP[1][1] * A_[1][1]) - KC[0][0] *
(P[1][0] * A_[0][1] + P[1][1] * A_[1][1]) + Sw[1][1];

/*
//Printing double is not possible via %f - Arduino is limited :(
//Here is my workaround with one decimal digit
if (alpha < 0) printf("-");
printf("%d.%d\r\n", abs((int)alpha), abs((int)(alpha * 10 - (int)alpha * 10)));
*/

// For easier communication with the PIC, I only use the range of -128 to 127 degrees
// Enough resolution and effective communication
if (latest_alpha != (signed int)alpha)
{
latest_alpha = (signed int)alpha;
printf("%c", latest_alpha & 0xFF);
}

// This delay is what controls the frequency of our sampling
// Without delay, the frequency is about 85Hz
// But we do not need that speed, since it would just be a headache for the PIC microcontroller
// BTW: delay_ms seems to run about 2.9x faster than real time,
// but I do not mind since I adjusted it manually
delay_ms(83);
}
}

uint16_t x_gyro(void)
{
uint16_t xl;// xlow register


uint16_t xh;// xhigh register

// x-axis
ADMUX = (1 << REFS0)|(1 << MUX0);//ADC1
ADCSRA = (1 << ADEN)|(1 << ADSC)|(1<<ADPS2)|(1<<ADPS1);

while(ADCSRA & (1 << ADSC));


xl = ADCL;
xh = ADCH & 0x03;
xh = xh << 8;
xh = xh + xl;

ADMUX &= 0xF0;

return xh;
}

uint16_t y_gyro(void)
{
uint16_t yl;// ylow register
uint16_t yh;// yhigh register

// y-axis
ADMUX = (1 << REFS0)|(1 << MUX1);//ADC2
ADCSRA = (1 << ADEN)|(1 << ADSC)|(1<<ADPS2)|(1<<ADPS1);

while(ADCSRA & (1 << ADSC));


yl = ADCL;
yh = ADCH & 0x03;
yh = yh << 8;
yh = yh + yl;

ADMUX &= 0xF0;

return yh;
}

uint16_t z_gyro(void)
{
uint16_t zl;// zlow register
uint16_t zh;// zhigh register

// z-axis
ADMUX = (1 << REFS0);//ADC0
ADCSRA = (1 << ADEN)|(1 << ADSC)|(1<<ADPS2)|(1<<ADPS1);

while(ADCSRA & (1 << ADSC));


zl = ADCL;
zh = ADCH & 0x03;
zh = zh << 8;
zh = zh + zl;

ADCSRA = 0;
ADMUX &= 0xF0;

return zh;
}


uint16_t x_accel(void)
{
//0xA6 for a write
//0xA7 for a read

uint8_t dummy, xh, xl;


uint16_t xo;

//0x32 data registers


i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x32); //X0 data register
i2cWaitForComplete();

i2cSendStop(); //repeat start


i2cSendStart();

i2cWaitForComplete();
i2cSendByte(0xA7); //read from ADXL
i2cWaitForComplete();
i2cReceiveByte(TRUE);
i2cWaitForComplete();
xl = i2cGetReceivedByte(); //x low byte
i2cWaitForComplete();
i2cReceiveByte(FALSE);
i2cWaitForComplete();
dummy = i2cGetReceivedByte(); //must do a multiple byte read?
i2cWaitForComplete();
i2cSendStop();

//0x33 data registers


i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x33); //X1 data register
i2cWaitForComplete();

i2cSendStop(); //repeat start


i2cSendStart();

i2cWaitForComplete();
i2cSendByte(0xA7); //read from ADXL
i2cWaitForComplete();
i2cReceiveByte(TRUE);
i2cWaitForComplete();
xh = i2cGetReceivedByte(); //x high byte
i2cWaitForComplete();
i2cReceiveByte(FALSE);
i2cWaitForComplete();
dummy = i2cGetReceivedByte(); //must do a multiple byte read?
i2cWaitForComplete();
i2cSendStop();


xo = xl|(xh << 8);


return xo;
}

uint16_t y_accel(void)
{
//0xA6 for a write
//0xA7 for a read

uint8_t dummy, yh, yl;


uint16_t yo;

//0x34 data registers


i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x34); //Y0 data register
i2cWaitForComplete();

i2cSendStop(); //repeat start


i2cSendStart();

i2cWaitForComplete();
i2cSendByte(0xA7); //read from ADXL
i2cWaitForComplete();
i2cReceiveByte(TRUE);
i2cWaitForComplete();
yl = i2cGetReceivedByte(); //x low byte
i2cWaitForComplete();
i2cReceiveByte(FALSE);
i2cWaitForComplete();
dummy = i2cGetReceivedByte(); //must do a multiple byte read?
i2cWaitForComplete();
i2cSendStop();

//0x35 data registers


i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x35); //Y1 data register
i2cWaitForComplete();

i2cSendStop(); //repeat start


i2cSendStart();

i2cWaitForComplete();
i2cSendByte(0xA7); //read from ADXL
i2cWaitForComplete();
i2cReceiveByte(TRUE);
i2cWaitForComplete();
yh = i2cGetReceivedByte(); //y high byte
i2cWaitForComplete();
i2cReceiveByte(FALSE);
i2cWaitForComplete();
dummy = i2cGetReceivedByte(); //must do a multiple byte read?


i2cWaitForComplete();
i2cSendStop();
yo = yl|(yh << 8);
return yo;
}

uint16_t z_accel(void)
{
//0xA6 for a write
//0xA7 for a read

uint8_t dummy, zh, zl;


uint16_t zo;

//0x36 data registers


i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x36); //Z0 data register
i2cWaitForComplete();

i2cSendStop(); //repeat start


i2cSendStart();

i2cWaitForComplete();
i2cSendByte(0xA7); //read from ADXL
i2cWaitForComplete();
i2cReceiveByte(TRUE);
i2cWaitForComplete();
zl = i2cGetReceivedByte(); //z low byte
i2cWaitForComplete();
i2cReceiveByte(FALSE);
i2cWaitForComplete();
dummy = i2cGetReceivedByte(); //must do a multiple byte read?
i2cWaitForComplete();
i2cSendStop();

//0x37 data registers


i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x37); //Z1 data register
i2cWaitForComplete();

i2cSendStop(); //repeat start


i2cSendStart();

i2cWaitForComplete();
i2cSendByte(0xA7); //read from ADXL
i2cWaitForComplete();
i2cReceiveByte(TRUE);
i2cWaitForComplete();
zh = i2cGetReceivedByte(); //z high byte
i2cWaitForComplete();
i2cReceiveByte(FALSE);


i2cWaitForComplete();
dummy = i2cGetReceivedByte(); //must do a multiple byte read?
i2cWaitForComplete();
i2cSendStop();
zo = zl|(zh << 8);
return zo;
}

/*********************
****Initialize****
*********************/

void init (void)


{
//1 = output, 0 = input
DDRB = 0b01100000; //PORTB4, B5 output for stat LED
DDRC = 0b00010000; //PORTC4 (SDA), PORTC5 (SCL), PORTC all others are inputs
DDRD = 0b00000010; //PORTD (TX output on PD1)
PORTC = 0b00110000; //pullups on the I2C bus

UART_Init(51); //9600bps, fill in UBRR to set baud http://www.wormfood.net/avrbaudcalc.php

i2cInit();
}

void UART_Init(unsigned int ubrr)


{
int ubrr_new;

// compatible stream forwarding


fdev_setup_stream(&mystdout, uart_putchar, NULL, _FDEV_SETUP_WRITE);
// set baud rate
ubrr_new = ubrr;
UBRR0H = ubrr_new>>8;
UBRR0L = ubrr_new;

// Enable receiver and transmitter


UCSR0A = (0<<U2X0);
UCSR0B = (1<<RXEN0)|(1<<TXEN0);

// Set frame format: 8 bit, no parity, 1 stop bit,


UCSR0C = (1<<UCSZ00)|(1<<UCSZ01);

stdout = &mystdout; //Required for printf init


}

uint8_t uart_getchar(void)
{
while(!(UCSR0A & (1<<RXC0)));
return(UDR0);
}

static int uart_putchar(char c, FILE *stream)


{
if (c == '\n') uart_putchar('\r', stream);

loop_until_bit_is_set(UCSR0A, UDRE0);


UDR0 = c;

return 0;
}

void put_char(unsigned char byte)


{
/* Wait for empty transmit buffer */
while (!(UCSR0A & (1<<UDRE0)));
/* Put data into buffer, sends the data */
UDR0 = byte;
}

void delay_ms(uint16_t x)
{
uint8_t y, z;
for (; x > 0 ; x--){
for (y = 0 ; y < 90 ; y++){
for (z = 0 ; z < 6 ; z++){
asm volatile ("nop");
}
}
}
}

/*********************
**EEPROM Functions***
*********************/

//Description: Writes an unsigned char(Data) to the EEPROM at the given Address


//Pre: Unsigned Int Address contains address to be written to
// Unsigned Char Data contains data to be written
//Post: EEPROM "Address" contains the "Data"
//Usage: write_to_EEPROM(0, 'A');
void write_to_EEPROM(unsigned int Address, unsigned char Data)
{
//Interrupts are globally disabled!

while(EECR & (1<<EEPE)); //Wait for last Write to complete


//May need to wait for Flash to complete also!
EEAR = Address; //Assign the Address Register with "Address"
EEDR=Data; //Put "Data" in the Data Register
EECR |= (1<<EEMPE); //Write to Master Write Enable
EECR |= (1<<EEPE); //Start Write by setting EE Write Enable
}

//Description: Reads the EEPROM data at "Address" and returns the character
//Pre: Unsigned Int Address is the address to be read
//Post: Character at "Address" is returned
//Usage: unsigned char Data;
// Data=read_from_EEPROM(0);
unsigned char read_from_EEPROM(unsigned int Address)
{
//Interrupts are globally disabled!

while(EECR & (1<<EEPE)); //Wait for last Write to complete


EEAR = Address; //Assign the Address Register with "Address"


EECR |= (1<<EERE); //Start Read by writing to EER


return EEDR; //EEPROM Data is returned
}

/*********************
****I2C Functions****
*********************/

void i2cInit(void)
{
// set i2c bit rate to 100KHz
i2cSetBitrate(100);
// enable TWI (two-wire interface)
sbi(TWCR, TWEN);

//initialize
i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x2D); //power register
i2cWaitForComplete();
i2cSendByte(0x08); //measurement mode
i2cWaitForComplete();
i2cSendStop();

i2cSendStart();
i2cWaitForComplete();
i2cSendByte(0xA6); //write to ADXL
i2cWaitForComplete();
i2cSendByte(0x31); //data format
i2cWaitForComplete();
i2cSendByte(0x08); //full resolution
i2cWaitForComplete();
i2cSendStop();
}

void i2cSetBitrate(unsigned short bitrateKHz)


{
unsigned char bitrate_div;
// set i2c bitrate
// SCL freq = F_CPU/(16+2*TWBR))
//#ifdef TWPS0
// for processors with additional bitrate division (mega128)
// SCL freq = F_CPU/(16+2*TWBR*4^TWPS)
// set TWPS to zero
cbi(TWSR, TWPS0);
cbi(TWSR, TWPS1);
//#endif
// calculate bitrate division
bitrate_div = ((F_CPU/4000l)/bitrateKHz);
if(bitrate_div >= 16)
bitrate_div = (bitrate_div-16)/2;
outb(TWBR, bitrate_div);
}

void i2cSendStart(void)

{
WRITE_sda();
// send start condition
TWCR = (1<<TWINT)|(1<<TWSTA)|(1<<TWEN);
}

void i2cSendStop(void)
{
// transmit stop condition
TWCR = (1<<TWINT)|(1<<TWEN)|(1<<TWSTO);
}

void i2cWaitForComplete(void)
{
int i = 0; //time out variable

// wait for i2c interface to complete operation


while ((!(TWCR & (1<<TWINT))) && (i < 90))
i++;
}

void i2cSendByte(unsigned char data)


{

WRITE_sda();
// save data to the TWDR
TWDR = data;
// begin send
TWCR = (1<<TWINT)|(1<<TWEN);
}

void i2cReceiveByte(unsigned char ackFlag)


{
// begin receive over i2c
if(ackFlag)
{
// ackFlag = TRUE: ACK the received data
outb(TWCR, (inb(TWCR)&TWCR_CMD_MASK)|BV(TWINT)|BV(TWEA));
}
else
{
// ackFlag = FALSE: NACK the received data
outb(TWCR, (inb(TWCR)&TWCR_CMD_MASK)|BV(TWINT));
}
}

unsigned char i2cGetReceivedByte(void)


{
// retrieve received data byte from i2c TWDR
return(inb(TWDR));
}

unsigned char i2cGetStatus(void)


{
// retrieve current i2c status from i2c TWSR
return(inb(TWSR));
}


IV.2 Motor Driver Software
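
The listing below regulates each wheel's RPM with a discrete PID controller running at roughly 10Hz. As a reading aid (not part of the original listing), the correction applied each period follows the textbook form

\[
u(t) = K_p e(t) + K_i \int_0^t e(\tau)\,d\tau + K_d \frac{de(t)}{dt},
\]

where e(t) is the RPM setpoint minus the measured RPM. In the code, the integral term is accumulated with a Simpson-style weighted sum over the last four error samples, the derivative term uses a finite difference across the same window, and the result is added incrementally to the PWM compare register (OCR1A or OCR1B) rather than written absolutely.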

#include <string.h>
#include <stdlib.h>
#include <stdio.h>
#include <ctype.h>
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/pgmspace.h>
//*******************************************************
// GPIO Definitions
//*******************************************************
#define ADC_VREF_TYPE 0x40
//Port C Pin Assignments
#define SENSE_1 0
#define SENSE_2 1
#define ENC1_B 6 | ADC_VREF_TYPE
#define ENC2_B 7 | ADC_VREF_TYPE

//Port D Pin Assignments


#define RX 0
#define TX 1
#define M1_P 2
#define M1_N 3
#define M2_P 4
#define M2_N 7

//Port B Pin Assignments


#define PWM1 1
#define PWM2 2
#define MOSI 3
#define MISO 4
#define SCK 5

//Motors related constants

//Enable MANUAL_INPUT to use the four-character command syntax:


// char 1 - Motor selection, 1 for motor1, 2 for motor 2
// char 2 - Motor direction, f for forward, r for reverse
// char 3&4 - Motor speed, 0 stopped, 63 full speed
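// Example (illustrative): "1f63" = motor 1, forward, speed 63 (full speed)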
#define MANUAL_INPUT

//Enable PID_TUNE to use the brute force tuning


// Review Appendix VII for format and usage
#define PID_TUNE
#define TUNE_MOTOR MOTOR_2
#define STEP_SPEED 60 //the 60% of full speed (RPM)
#define KP_START 0.02
#define KP_END 0.08
#define KP_STEP 0.01
#define KI_START 0.00000
#define KI_END 0.00010
#define KI_STEP 0.00002
#define KD_START 10
#define KD_END 60
#define KD_STEP 10

#define MOTOR_1 0
#define MOTOR_2 1
#define FORWARD 0
#define REVERSE 1
#define SPEED_FACTOR 7

// Calculated using stat A+BX for fast acceleration


// Obtained from the relation between needed speed and PWM
#define OCR1A_STAT_A -83.5
#define OCR1A_STAT_B 1.02
#define OCR1B_STAT_A -130.5
#define OCR1B_STAT_B 1.41

//*******************************************************
// Macros
//*******************************************************
#define sbi(var, mask) ((var) |= (uint8_t)(1 << mask))
#define cbi(var, mask) ((var) &= (uint8_t)~(1 << mask))

#define M1_FORWARD() sbi(PORTD, M1_N);cbi(PORTD, M1_P)


#define M1_REVERSE() sbi(PORTD, M1_P);cbi(PORTD, M1_N)

#define M2_FORWARD() sbi(PORTD, M2_N);cbi(PORTD, M2_P)


#define M2_REVERSE() sbi(PORTD, M2_P);cbi(PORTD, M2_N)

//*******************************************************
// General Definitions
//*******************************************************
#define UART_UBRR 203 //Used to set the AVR Baud Rate TO 9600 (External 16MHz Oscillator)
#define CURRENT_THRESHOLD 150 //This is compared against the 10 bit ADC value and corresponds to roughly 1.5A on the Current Sense pin
#define SPEED_AVERAGE_POLYNOMIALS 4 //Speed average polynomials count

typedef uint8_t boolean; //'boolean' is not a standard C type; defined here for avr-gcc

static FILE mystdout;


void UART_Init(uint16_t ubrr);
uint8_t uart_getchar(void);
static int uart_putchar(char c, FILE *stream);
void delay_ms(uint16_t x);
void delay_us(uint16_t x);

void motor1_PID();
void motor2_PID();
uint16_t read_adc(uint8_t adc_input);
void control_motors();
void ioinit(void);
void set_direction(int motor, int direction);
void set_speed(int motor, int speed);
void start_tuning(void);
//================================================================
//Define Global Variables
//================================================================

/************** PID Variables ******************/


const long double dt = 0.10035;

// Refer to: http://www.ecircuitcenter.com/circuits/pid1/pid1.htm


// To know how to tune PID, or refer to Appendix VII for better tuning way
long double Kp[2] = {0.05, 0.05};


long double Ki[2] = {0.0001, 0.001};


long double Kd[2] = {30, 40};

//encoders/motors position and direction of rotation


long double ErrorIntegral[2] = {0, 0};
uint16_t motorRPMSetpoint[2] = {0, 0};
long double motorRPM[2] = {0, 0};
uint16_t enc_pos[2] = {0, 0};
boolean enc_dir[2];
uint16_t motor_current[2] = {0, 0};
uint16_t enc_counter[2] = {0, 0};

//================================================================
//Interrupt Routines
//================================================================
/*
PORTC Pin change interrupt service routine. Decodes the encoder.
For algorithm, see Scott Edwards article from Nuts&Volts V1 Oct. 1995 nv8.pdf
(righthand bit of old A,B) xor (lefthand bit of new A,B) => dir.
Increment or decrement encoder position accordingly
*/
ISR (PCINT1_vect)
{
static boolean enc_last_A[2] = {0, 0}, enc_last_B[2] = {0, 0}, enc_now_A[2],
enc_now_B[2];
static uint16_t enc1_last_counter[SPEED_AVERAGE_POLYNOMIALS] = {0},
enc2_last_counter[SPEED_AVERAGE_POLYNOMIALS] = {0};

/**
* Steps used:
*
* 1. which encoder? 1 or 2 or both
* 2. determine direction of rotation
* 3. update encoder position
* 4. remember last state
*
* 5. calculate speed
* 6. check overload
*
* Future Development: Find a way to use ADC pins as digital - this would enhance the performance
*/

enc_now_A[MOTOR_1] = (PINC & 0x10) >> 4;


enc_now_A[MOTOR_2] = (PINC & 0x20) >> 5;

if (enc_last_A[MOTOR_1] != enc_now_A[MOTOR_1]) //Is it Encoder 1?


{
enc_now_B[MOTOR_1] = (read_adc(ENC1_B) >> 9);
enc_dir[MOTOR_1] = enc_now_A[MOTOR_1] ^ enc_now_B[MOTOR_1];
if(enc_dir[MOTOR_1] == 0) enc_pos[MOTOR_1]++; else enc_pos[MOTOR_1]--;
enc_last_A[MOTOR_1] = enc_now_A[MOTOR_1];
enc_last_B[MOTOR_1] = enc_now_B[MOTOR_1];

// Calculate motor speed


long double average_speed = 0;
for (uint8_t i = 0; i < SPEED_AVERAGE_POLYNOMIALS-1; i++)

{
average_speed += enc1_last_counter[i];
enc1_last_counter[i] = enc1_last_counter[i+1];
}
average_speed += enc1_last_counter[SPEED_AVERAGE_POLYNOMIALS-1];
average_speed += enc_counter[MOTOR_1];
average_speed = 2441.4 * (SPEED_AVERAGE_POLYNOMIALS+1) / average_speed;
enc1_last_counter[SPEED_AVERAGE_POLYNOMIALS-1] = enc_counter[MOTOR_1];
//motorRPM[MOTOR_1] = 0.1 * motorRPM[MOTOR_1] + 0.9 * average_speed;
motorRPM[MOTOR_1] = average_speed;
//motorRPM[MOTOR_1] = 2441.4 / enc_counter[MOTOR_1];
//if (enc_dir[MOTOR_1] == 1) motorRPM[MOTOR_1] *= -1;

//Check Overload sensor


motor_current[MOTOR_1] = read_adc(SENSE_1);
if(motor_current[MOTOR_1] > CURRENT_THRESHOLD)
{
//If an over current is detected, stop the motor
set_speed(MOTOR_1, 0);
PORTC |= (1<<2); //Overload LED on
}
enc_counter[MOTOR_1] = 0;
}

if (enc_last_A[MOTOR_2] != enc_now_A[MOTOR_2]) //Is it Encoder 2?


{
enc_now_B[MOTOR_2] = (read_adc(ENC2_B) >> 9);
enc_dir[MOTOR_2] = enc_now_A[MOTOR_2] ^ enc_now_B[MOTOR_2];
if(enc_dir[MOTOR_2] == 0) enc_pos[MOTOR_2]++; else enc_pos[MOTOR_2]--;
enc_last_A[MOTOR_2] = enc_now_A[MOTOR_2];
enc_last_B[MOTOR_2] = enc_now_B[MOTOR_2];

// Calculate motor speed


long double average_speed = 0;
for (uint8_t i = 0; i < SPEED_AVERAGE_POLYNOMIALS-1; i++)
{
average_speed += enc2_last_counter[i];
enc2_last_counter[i] = enc2_last_counter[i+1];
}
average_speed += enc2_last_counter[SPEED_AVERAGE_POLYNOMIALS-1];
average_speed += enc_counter[MOTOR_2];
average_speed = 2441.4 * (SPEED_AVERAGE_POLYNOMIALS+1) / average_speed;
enc2_last_counter[SPEED_AVERAGE_POLYNOMIALS-1] = enc_counter[MOTOR_2];
//motorRPM[MOTOR_2] = 0.1 * motorRPM[MOTOR_2] + 0.9 * average_speed;
motorRPM[MOTOR_2] = average_speed;
//motorRPM[MOTOR_2] = 2441.4 / enc_counter[MOTOR_2];
//if (enc_dir[MOTOR_2] == 1) motorRPM[MOTOR_2] *= -1;

//Check Overload sensor


motor_current[MOTOR_2] = read_adc(SENSE_2);
if(motor_current[MOTOR_2] > CURRENT_THRESHOLD)
{
//If an over current is detected, stop the motor
set_speed(1, 0);
PORTC |= (1<<3); //Overload LED on
}


enc_counter[MOTOR_2] = 0;
}
}

//Counter speed is 976.56Hz


ISR(TIMER0_OVF_vect)
{
//This is the interrupt service routine for TIMER0 OVERFLOW Interrupt.
//CPU automatically call this when TIMER0 overflows.

//Adjust the motors speed at ~10Hz => dt is 100ms


if (!(enc_counter[MOTOR_1] % 98)) motor1_PID();
if (!(enc_counter[MOTOR_2] % 98)) motor2_PID();

// If no activity within 200ms, wheels are not moving


// Also prevents the counter variable from overflowing
// Assuming minimum speed 12RPM
if (enc_counter[MOTOR_1] >= 195) motorRPM[MOTOR_1] = enc_counter[MOTOR_1]
= 0;
if (enc_counter[MOTOR_2] >= 195) motorRPM[MOTOR_2] = enc_counter[MOTOR_2]
= 0;

//Increment our variable


enc_counter[MOTOR_1]++;
enc_counter[MOTOR_2]++;
}

int main(void)
{
ioinit();

set_direction(MOTOR_1, FORWARD);
set_speed(MOTOR_1, 0);
set_direction(MOTOR_2, FORWARD);
set_speed(MOTOR_2, 0);

#ifdef PID_TUNE
start_tuning();
#else
while(1) control_motors();
#endif
return 0;
}

//==================================================
//Core functions
//==================================================
//Function: ioinit
//Purpose: Initialize AVR I/O, UART and Interrupts
//Inputs: None
//Outputs: None
void ioinit(void)
{
//1 = output, 0 = input
DDRB |= (1<<PWM1)|(1<<PWM2)|(1<<MOSI)|(1<<SCK); //Enable PWM and SPI pins as
outputs
PORTB |= (1<<MISO); //Enable pull-up on MISO pin


DDRC |= 0xFF; //Initialize Port C to all outputs


DDRC |= (1<<2)|(1<<3); //Port C pins 2 and 3 as output, Overload LEDS

DDRD |= 0xFF; //Set Port D to all outputs


DDRD &= ~(1<<RX); //Set bit 0(Rx) to be an input
PORTD |= (1<<RX); //Enable the pull-up on Rx

UART_Init(UART_UBRR);

//Refer to: http://www.petervis.co.cc/C/Light%20and%20Temp%20Sensors/Light%20and%20temperature%20sensors.html
// For ADC clock division
ADCSRA = (1<<ADEN)|(1<<ADPS2)|(1<<ADPS0); //Enable the ADC and set the ADC
prescaler to 32

// Prescaler = FCPU/64
//Refer to: http://extremeelectronics.co.in/avr-tutorials/avr-timers-an-introduction/
// for prescaler settings
TCCR0B |= (1<<CS01)|(1<<CS00);
//Enable Overflow Interrupt Enable
TIMSK0 |= (1<<TOIE0);
//Initialize Counter
TCNT0 = 0;

//Init Timer 1 for 10 BIT Fast Mode PWM with output toggle on OC1A and OC1B
TCCR1A = (1<<COM1A1) | (1<<COM1B1) | (1<<WGM10) | (1<<WGM11); //Set toggle mode for PWM pins and set 10bit fast PWM mode
TCCR1B = (1<<WGM12); //Complete the 10bit fast PWM mode setup
OCR1A = 0;
OCR1B = 0;
TCCR1B |= (1<<CS12);

// Init timer 2
//No prescaler (Timer 2 frequency is 16MHz)
//Used for delay routines
TCCR2B = (1<<CS20); //Divide clock by 1 for 16 Mhz Timer 2 Frequency

// Encoder ISR registration


DDRC &=~(3<<4); //Port C pins 4 and 5 as input
DDRC &=~(3<<6); //Port ADC pins 6 and 7 as input
PCMSK1 |= (3<<PCINT12); //enable interrupt on pin change, bits 4&5 PORTC
PCICR |= 1<<PCIE1; //enable interrupt on pin change, PORTC

sei(); //enable global interrupts
}

/************** PID Algorithm ******************/

// Motor 1 - left motor


void motor1_PID()
{
static long double lastError[4] = {0, 0, 0, 0};
long double error = motorRPMSetpoint[MOTOR_1] - motorRPM[MOTOR_1];

long double PID = Kp[MOTOR_1] * error + //P


Ki[MOTOR_1] * ErrorIntegral[MOTOR_1] + //I


Kd[MOTOR_1] * (((error - lastError[0]) / 4) * dt); //D

ErrorIntegral[MOTOR_1] += ((lastError[0] + 2 * (lastError[1] + lastError[2]) + error)


/ 6) * dt;
for (uint8_t i = 0; i < 3; i++) lastError[i] = lastError[i+1];
lastError[3] = error;

//limit OCR1A 0-1023


if (PID+OCR1A > 1023) OCR1A = 1023;
else if (PID+OCR1A < 0) OCR1A = 0;
else OCR1A += (int)PID;
}

// Motor 2- right motor


void motor2_PID()
{
static long double lastError[4] = {0, 0, 0, 0};
long double error = motorRPMSetpoint[MOTOR_2] - motorRPM[MOTOR_2];

long double PID = Kp[MOTOR_2] * error + //P


Ki[MOTOR_2] * ErrorIntegral[MOTOR_2] + //I
Kd[MOTOR_2] * (((error - lastError[0]) / 4) * dt); //D

ErrorIntegral[MOTOR_2] += ((lastError[0] + 2 * (lastError[1] + lastError[2]) + error)


/ 6) * dt;
for (uint8_t i = 0; i < 3; i++) lastError[i] = lastError[i+1];
lastError[3] = error;

//limit OCR1B 0-1023


if (PID+OCR1B > 1023) OCR1B = 1023;
else if (PID+OCR1B < 0) OCR1B = 0;
else OCR1B += (int)PID;
}

#ifdef MANUAL_INPUT
#define BUFFER_SIZE 5
void control_motors()
{
// Simple, yet effective communication
// Using a four-character command to send the motor selector, direction and speed
uint8_t chr, cmd[BUFFER_SIZE], i = 0;

while(1)
{
printf("%c", chr = uart_getchar());
if (i >= BUFFER_SIZE || chr == '\r' || chr == '\n') break;
cmd[i++] = chr;
}
if (i != 4)
{
printf("\r\nCommand error\r\n");
return;
}

//First byte for motor selection


boolean motor = (cmd[0] == '1') ? 0 : 1;

//Second byte for motor direction


boolean direction = (cmd[1] == 'f') ? 0 : 1;

//Last two bytes for speed selection


// From 0 - stopped
// To 63 - full speed
uint8_t speed = 10 * (cmd[2] - '0') + (cmd[3] - '0');

// Apply the settings


set_direction(motor, direction);
set_speed(motor, speed);
}
#else
void control_motors()
{
// Simple, yet effective communication
// Using one character to send the motor selector, direction and speed
uint8_t cmd = uart_getchar();

//MSB for motor selection


boolean motor = (cmd & 0x80) >> 7;

//7th bit for motor direction


boolean direction = (cmd & 0x40) >> 6;

//First 6 bits for speed selection


// From 0 - stopped
// To 63 - full speed
uint8_t speed = cmd & 0x3F;

// Apply the settings


set_direction(motor, direction);
set_speed(motor, speed);
}
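/* Example (illustrative): the command byte 0xC5 = 0b11000101 selects
   motor 2 (bit 7 = 1), reverse (bit 6 = 1) and speed 5 (low six bits). */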
#endif

void set_direction(int motor, int direction)


{
if(motor == MOTOR_1)
{
if(direction == FORWARD) { M1_FORWARD(); } else { M1_REVERSE(); }
}
else
{
if(direction == FORWARD) { M2_FORWARD(); } else { M2_REVERSE(); }
}
}

void set_speed(int motor, int speed)


{
//Is it the same speed?
if (motorRPMSetpoint[motor] == speed * SPEED_FACTOR) return;

//Since not the same speed, reset the accumulated error


ErrorIntegral[motor] = 0;
motorRPMSetpoint[motor] = speed * SPEED_FACTOR;
if(motor == MOTOR_1)
{
OCR1A = OCR1A_STAT_A + OCR1A_STAT_B * speed * SPEED_FACTOR;


if (speed != 0)
PORTC &= ~(1<<2); //Overload LED off
}
else
{
OCR1B = OCR1B_STAT_A + OCR1B_STAT_B * speed * SPEED_FACTOR;
if (speed != 0)
PORTC &= ~(1<<3); //Overload LED off
}
}

void start_tuning(void)
{
printf("NOTE: Due to Arduino limitation, PID variables are scaled by its corresponding
increment step\r\n\r\n");
printf("Press any key to start!\r\n");
printf("Kp, Ki, Kd, 20 speed samples (CSV Format)\r\n");
uart_getchar();
for (Kp[TUNE_MOTOR] = KP_START; Kp[TUNE_MOTOR] <= KP_END; Kp[TUNE_MOTOR] +=
KP_STEP)
{
for (Ki[TUNE_MOTOR] = KI_START; Ki[TUNE_MOTOR] <= KI_END; Ki[TUNE_MOTOR]
+= KI_STEP)
{
for (Kd[TUNE_MOTOR] = KD_START; Kd[TUNE_MOTOR] <= KD_END;
Kd[TUNE_MOTOR] += KD_STEP)
{
OCR1A = OCR1B = ErrorIntegral[TUNE_MOTOR] = motorRPMSetpoint[TUNE_MOTOR]
= 0;
delay_ms(1000);
printf("%d;%d;%d", (int)(Kp[TUNE_MOTOR] / KP_STEP), (int)(Ki[TUNE_MOTOR] /
KI_STEP), (int)(Kd[TUNE_MOTOR] / KD_STEP));
motorRPMSetpoint[TUNE_MOTOR] = STEP_SPEED;
for (uint8_t i = 0; i < 20; i++)
{
printf(";%d", (int)motorRPM[TUNE_MOTOR]);
//This delays about 100ms, even though it says only 50ms
delay_ms(50);
}
printf("\r\n");
}
}
}
OCR1A = OCR1B = ErrorIntegral[TUNE_MOTOR] = motorRPMSetpoint[TUNE_MOTOR] = 0;
}

uint16_t read_adc(uint8_t adc_input)


{
ADMUX = adc_input;

// Start the AD conversion


ADCSRA |= 0x40;

// Wait for the AD conversion to complete


while ((ADCSRA & 0x10) == 0);
ADCSRA |= 0x10;
return ADCW;
}


void UART_Init(uint16_t ubrr)


{
// compatible stream forwarding
fdev_setup_stream(&mystdout, uart_putchar, NULL, _FDEV_SETUP_WRITE);
// set baud rate
// USART Baud rate: 9600 (With 16 MHz Clock)
UBRR0H = (ubrr >> 8) & 0x7F; //Make sure highest bit(URSEL) is 0 indicating we are writing to UBRRH
UBRR0L = ubrr;
UCSR0A = (1<<U2X0); //double the UART Speed
UCSR0B = (1<<RXEN0)|(1<<TXEN0); //Enable Rx and Tx in UART
UCSR0C = (1<<UCSZ00)|(1<<UCSZ01); //8-Bit Characters
stdout = &mystdout; //Required for printf init
}

uint8_t uart_getchar(void)
{
while(!(UCSR0A & (1<<RXC0)));
return(UDR0);
}

static int uart_putchar(char c, FILE *stream)


{
if (c == '\n') uart_putchar('\r', stream);

loop_until_bit_is_set(UCSR0A, UDRE0);
UDR0 = c;

return 0;
}

void delay_ms(uint16_t x)
{
for (; x > 0 ; x--)
delay_us(1000);
}

void delay_us(uint16_t x)
{
TIFR2 = (1<<TOV2); //Clear any interrupt flags on Timer2
TCNT2= 240; //Setting counter to 240 will make it 16 ticks until TOV is set. 0.0625uS per tick means 1uS until TOV is set
while(x>0){
while((TIFR2 & (1<<TOV2)) == 0);
TIFR2 = (1<<TOV2); //Clear any interrupt flags on Timer2
TCNT2=240;
x--;
}
}


IV.3 BeagleBoard Software
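
The program below is compiled on the Beagle Board the same way as “capture.c” in Appendix III; a minimal sketch, assuming you saved the source under the hypothetical name “facerec.c”:

gcc `pkg-config --cflags opencv` `pkg-config --libs opencv` -o facerec facerec.c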

#include </usr/include/opencv/cv.h>
#include </usr/include/opencv/highgui.h>
#include </usr/include/opencv/cxcore.h>
#include </usr/include/opencv/cvaux.h>
#include <linux/i2c-dev.h>
#include <sys/ioctl.h>
#include <fcntl.h>
#include <unistd.h>
#include <termios.h>
#include <stdio.h>

//#define USE_MAHALANOBIS_DISTANCE // You might get better recognition accuracy if you enable this.
#define I2C_BUS 2
#define I2C_ADDRESS 0x58
#define STDIN_FILENO 0

int kbhit(void);
void i2c_report(signed char code);
signed char i2c_wait_request(void);
IplImage *queryImage(CvCapture* capture);
CvRect *detectBiggestFace(IplImage *image);
IplImage *extractFace(IplImage *image, CvRect* rect);
IplImage *toGrayscale(IplImage *image);
IplImage *imageResize(IplImage *image, CvSize size);
IplImage * extractDetectedEqualizedFace(IplImage *image);
void learn();
int faceRecognize(IplImage* img, float minimumConfidence);
void doPCA(void);
void storeTrainingData();
int loadTrainingData();
int findNearestNeighbor(float * projectedTestFace, float minimumConfidence);
int loadFaceImgArray();

int i2c_fp = 0; // I2C file pointer
const char *cascade_name = "/usr/share/opencv/haarcascades/haarcascade_frontalface_alt.xml"; // Path and name of the haar cascade
const float search_scale_factor = 1.1f; // How detailed should the search be.
const char *txtDataFile = "/home/root/dB/faces.txt"; // Training data txt file
const char *xmlDatabaseFile = "/home/root/dB/faces.xml"; // Recognition data xml file
const float minimumConfidence = 0.6; // 1.0 - definite recognition, 0.0 - a terrible recognition
CvSize minFeatureSize; // Smallest face size.
CvCapture* capture; // Capture the image from camera
CvHaarClassifierCascade* cascade; // Haar classifier
CvMemStorage* storage; // memory for calculations
IplImage ** faceImgArr = 0; // array of face images
CvMat * trainPersonNumMat = 0; // array of person numbers
//CvMat * trainPersonNameMat = 0; // array of person names
int nTrainFaces = 0; // the number of training images
int nEigens = 0; // the number of eigenvalues
IplImage * pAvgTrainImg = 0; // the average image
IplImage ** eigenVectArr = 0; // eigenvectors
CvMat * eigenValMat = 0; // eigenvalues

CvMat * projectedTrainFaceMat = 0; // projected training faces

int main(int argc, char** argv)


{
//I2C Init block
{
char i2c_bus[11];
// Open i2c bus
sprintf(i2c_bus, "/dev/i2c-%d", I2C_BUS);
if ((i2c_fp = open(i2c_bus, O_RDWR)) < 0)
{
fprintf(stderr, "Error: Cannot Open I2C bus\n");
return -1;
}

// Set i2c I2C address


if (ioctl(i2c_fp, I2C_SLAVE_FORCE, I2C_ADDRESS) < 0)
{
fprintf(stderr, "Error: Cannot set I2C address\n");
return -2;
}
}

// OpenCV Init block


{
minFeatureSize = cvSize(40, 40); // Smallest face size.

// Load the HaarClassifierCascade


cascade = (CvHaarClassifierCascade*)cvLoad(cascade_name, 0, 0, 0);
// Check whether the cascade has loaded successfully. Else report and error and quit
if(!cascade)
{
fprintf(stderr, "Error: Could not load classifier cascade\n");
i2c_report(-3);
return -3;
}

// load the saved training data


if(!loadTrainingData())
{
fprintf(stderr, "Error: Could not load training data\n");
i2c_report(-4);
return -4;
}

// Image to capture the image from camera


capture = cvCreateCameraCapture(0);
// If not loaded succesfully, then:
if(!capture)
{
fprintf(stderr, "Error: Failed to initialize video capture\n");
i2c_report(-5);
return -5;
}

// Init memory for calculations


storage = cvCreateMemStorage(0);
if (!storage)


{
fprintf(stderr, "Error: Failed to initialize storage\n");
i2c_report(-6);
return -6;
}
}

while(!kbhit())
{
signed char code = i2c_wait_request();
if (code == -1)
{
fprintf(stderr, "Warinig: I2C read failed.\n");
sleep(1);
continue;
}
if (code == 0)
{
// Received a do not capture signal, waiting!!
usleep(100000);
continue;
}

IplImage *image = queryImage(capture);


// If the image does not exist, quit
if(!image)
{
fprintf(stderr, "Error: Image query failed\n");
i2c_report(-7);
break;
}

IplImage *face = extractDetectedEqualizedFace(image);


if (!face)
{
fprintf(stderr, "Warning: No face detected\n");
i2c_report(-8);
continue;
}

int index = faceRecognize(face, minimumConfidence);


if (index == -1)
{
fprintf(stderr, "Warning: Unknown face!\n");
i2c_report(-9);
continue;
}
printf("Info: Recognized face of index: %d\n", *(trainPersonNumMat->data.i +
index));
//printf("Info: Recognized face of person: %c\n", *(trainPersonNameMat->data.i + index));
i2c_report(*(trainPersonNumMat->data.i + index));

// Release the face image


cvReleaseImage(&face);
}
// Release the i2c, storage, cascade & capture
close(i2c_fp);


cvReleaseMemStorage(&storage);
cvReleaseHaarClassifierCascade(&cascade);
cvReleaseCapture(&capture);

// return 0 to indicate successful execution of the program


return 0;
}

int kbhit(void)
{
struct termios oldt, newt;
int ch;
int oldf;

tcgetattr(STDIN_FILENO, &oldt);
newt = oldt;
newt.c_lflag &= ~(ICANON | ECHO);
tcsetattr(STDIN_FILENO, TCSANOW, &newt);
oldf = fcntl(STDIN_FILENO, F_GETFL, 0);
fcntl(STDIN_FILENO, F_SETFL, oldf | O_NONBLOCK);

ch = getchar();

tcsetattr(STDIN_FILENO, TCSANOW, &oldt);


fcntl(STDIN_FILENO, F_SETFL, oldf);

if(ch != EOF)
{
ungetc(ch, stdin);
return 1;
}

return 0;
}

signed char i2c_wait_request()


{
signed char code;

// Read from I2C, this would hang the process until a response is received
if(read(i2c_fp, &code, 1) != 1) return -1;
return code;
}

void i2c_report(signed char code)


{
// Write into I2C
if(write(i2c_fp, &code, 1) != 1)
{
fprintf(stderr, "Warning: I2C write failed.\n");
return;
}
}

IplImage *queryImage(CvCapture* capture)


{
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_WIDTH, 864);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_HEIGHT, 480);


// Image to capture the image from camera


IplImage *image = cvQueryFrame(capture);

return image;
}

CvRect *detectBiggestFace(IplImage *image)


{
// Detect all the faces in the greyscale image.
double t = (double)cvGetTickCount();
// Only search for 1 face.
CvSeq* rects = cvHaarDetectObjects(image, cascade, storage, search_scale_factor, 3,
CV_HAAR_FIND_BIGGEST_OBJECT | CV_HAAR_DO_ROUGH_SEARCH, minFeatureSize);
t = (double)cvGetTickCount() - t;
int ms = cvRound(t / ((double)cvGetTickFrequency() * 1000.0));
int nFaces = rects->total;
printf("Info: Face Detection took %d ms and found %d objects\n", ms, nFaces);

if (nFaces == 0) return 0;

// Get the first detected face (the biggest).


return (CvRect*)cvGetSeqElem(rects, 0);
}

IplImage *extractFace(IplImage *image, CvRect* rect)


{
// Allocate face copy as the same size of the rectangle
IplImage *face = cvCreateImage(cvSize(rect->width,rect->height), image->depth,
image->nChannels);

//Copy the face region


cvSetImageROI(image, *rect);
cvCopy(image, face, 0);
cvResetImageROI(image);

return face;
}

IplImage *toGrayscale(IplImage *image)


{
IplImage *image_gray;

// Allocate image copy as the same size of the image


if (image->nChannels > 1)
{
image_gray = cvCreateImage(cvSize(image->width,image->height),
IPL_DEPTH_8U, 1);
cvCvtColor(image, image_gray, CV_BGR2GRAY);
}
else
{
image_gray = cvCreateImage(cvSize(image->width,image->height),
IPL_DEPTH_8U, image->nChannels);
// Check the origin of the image. If top left, copy the image to image_gray.
if(image->origin == IPL_ORIGIN_TL) cvCopy(image, image_gray, 0);
// Else flip and copy the image
else cvFlip(image, image_gray, 0);
}

return image_gray;
}

IplImage *imageResize(IplImage *image, CvSize size)


{
IplImage *image_resized = cvCreateImage(size, image->depth, image->nChannels);
// Make the image a fixed size.
// CV_INTER_CUBIC or CV_INTER_LINEAR is good for enlarging, and
// CV_INTER_AREA is good for shrinking / decimation, but bad at enlarging.
cvResize(image, image_resized, CV_INTER_LINEAR);

return image_resized;
}

IplImage * extractDetectedEqualizedFace(IplImage *image)


{
image = toGrayscale(image);
CvRect *rect = detectBiggestFace(image);
// No face detected
if(!rect) return 0;
printf("Info: Detected a face at (%d,%d)!\n", rect->x, rect->y);

// Resize the image to be a consistent size, even if the aspect ratio changes.
IplImage *face = imageResize(extractFace(image, rect), cvSize(100, 100));
cvReleaseImage(&image);
cvEqualizeHist(face, face);

return face;
}

void learn()
{
int i, offset;

// load training data


nTrainFaces = loadFaceImgArray();
if(nTrainFaces < 2)
{

fprintf(stderr,
"Error: Need 2 or more training faces\n"
"Input file contains only %d\n", nTrainFaces);
return;
}
// do PCA on the training faces
doPCA();

	// project the training images onto the PCA subspace
	projectedTrainFaceMat = cvCreateMat(nTrainFaces, nEigens, CV_32FC1);
offset = projectedTrainFaceMat->step / sizeof(float);
for(i=0; i<nTrainFaces; i++)
{
		//int offset = i * nEigens;
		cvEigenDecomposite(
			faceImgArr[i],
			nEigens,
			eigenVectArr,
			0, 0,
			pAvgTrainImg,
			//projectedTrainFaceMat->data.fl + i*nEigens);
			projectedTrainFaceMat->data.fl + i*offset);
}

	// store the recognition data as an xml file
	storeTrainingData();
}

int faceRecognize(IplImage* img, float minimumConfidence)
{
	int iNearest;
	float * projectedTestFace = 0;

	// allocate space for the projection of the test image
	projectedTestFace = (float *)cvAlloc(nEigens*sizeof(float));

	// project the test image onto the PCA subspace
	cvEigenDecomposite(
		img,
		nEigens,
		eigenVectArr,
		0, 0,
		pAvgTrainImg,
		projectedTestFace);

	iNearest = findNearestNeighbor(projectedTestFace, minimumConfidence);
	cvFree(&projectedTestFace);

	return iNearest;
}

void doPCA()
{
int i;
CvTermCriteria calcLimit;
CvSize faceImgSize;

	// set the number of eigenvalues to use
	nEigens = nTrainFaces-1;

	// allocate the eigenvector images
	faceImgSize.width = faceImgArr[0]->width;
faceImgSize.height = faceImgArr[0]->height;
eigenVectArr = (IplImage**)cvAlloc(sizeof(IplImage*) * nEigens);
for(i=0; i<nEigens; i++)
eigenVectArr[i] = cvCreateImage(faceImgSize, IPL_DEPTH_32F, 1);

	// allocate the eigenvalue array
	eigenValMat = cvCreateMat(1, nEigens, CV_32FC1);

	// allocate the averaged image
	pAvgTrainImg = cvCreateImage(faceImgSize, IPL_DEPTH_32F, 1);

	// set the PCA termination criterion
	calcLimit = cvTermCriteria(CV_TERMCRIT_ITER, nEigens, 1);

	// compute average image, eigenvalues, and eigenvectors

cvCalcEigenObjects(
nTrainFaces,
(void*)faceImgArr,
(void*)eigenVectArr,
CV_EIGOBJ_NO_CALLBACK,
0,
0,
&calcLimit,
pAvgTrainImg,
eigenValMat->data.fl);

	cvNormalize(eigenValMat, eigenValMat, 1, 0, CV_L1, 0);
}

void storeTrainingData()
{
CvFileStorage * fileStorage;
int i;

	// create a file-storage interface
	fileStorage = cvOpenFileStorage(xmlDatabaseFile, 0, CV_STORAGE_WRITE);

	// store all the data
	cvWriteInt(fileStorage, "nEigens", nEigens);
cvWriteInt(fileStorage, "nTrainFaces", nTrainFaces);
cvWrite(fileStorage, "trainPersonNumMat", trainPersonNumMat, cvAttrList(0,0));
//cvWrite(fileStorage, "trainPersonNameMat", trainPersonNameMat, cvAttrList(0,0));
cvWrite(fileStorage, "eigenValMat", eigenValMat, cvAttrList(0,0));
cvWrite(fileStorage, "projectedTrainFaceMat", projectedTrainFaceMat, cvAttrList(0,0));
cvWrite(fileStorage, "avgTrainImg", pAvgTrainImg, cvAttrList(0,0));
for(i=0; i<nEigens; i++)
{
char varname[200];
sprintf(varname, "eigenVect_%d", i);
cvWrite(fileStorage, varname, eigenVectArr[i], cvAttrList(0,0));
}

	// release the file-storage interface
	cvReleaseFileStorage(&fileStorage);
}

int loadTrainingData()
{
int i;
// create a file-storage interface
CvFileStorage * fileStorage = cvOpenFileStorage(xmlDatabaseFile, 0,
CV_STORAGE_READ);
if(!fileStorage)
{
fprintf(stderr, "Error: Can not open %s\n", xmlDatabaseFile);
return 0;
}

	nEigens = cvReadIntByName(fileStorage, 0, "nEigens", 0);
	nTrainFaces = cvReadIntByName(fileStorage, 0, "nTrainFaces", 0);
	trainPersonNumMat = (CvMat *)cvReadByName(fileStorage, 0, "trainPersonNumMat", 0);
	//trainPersonNameMat = (CvMat *)cvReadByName(fileStorage, 0, "trainPersonNameMat", 0);
	eigenValMat = (CvMat *)cvReadByName(fileStorage, 0, "eigenValMat", 0);
projectedTrainFaceMat = (CvMat *)cvReadByName(fileStorage, 0,
"projectedTrainFaceMat", 0);
pAvgTrainImg = (IplImage *)cvReadByName(fileStorage, 0, "avgTrainImg", 0);
	eigenVectArr = (IplImage **)cvAlloc(nEigens*sizeof(IplImage *));
for(i = 0; i < nEigens; i++)
{
char varname[200];
sprintf(varname, "eigenVect_%d", i);
eigenVectArr[i] = (IplImage *)cvReadByName(fileStorage, 0, varname, 0);
}

	// release the file-storage interface
	cvReleaseFileStorage(&fileStorage);

return 1;
}

int findNearestNeighbor(float * projectedTestFace, float minimumConfidence)
{
//double leastDistSq = 1e12;
double leastDistSq = DBL_MAX;
int i, iTrain, iNearest = 0;

	for(iTrain=0; iTrain<nTrainFaces; iTrain++)
	{
double distSq=0;

		for(i=0; i<nEigens; i++)
		{
			float d_i = projectedTestFace[i] -
				projectedTrainFaceMat->data.fl[iTrain*nEigens + i];
#ifdef USE_MAHALANOBIS_DISTANCE
			// Mahalanobis distance (might give better results than Euclidean distance)
			distSq += d_i*d_i / eigenValMat->data.fl[i];
#else
			distSq += d_i*d_i; // Euclidean distance.
#endif
		}

		if(distSq < leastDistSq)
		{
leastDistSq = distSq;
iNearest = iTrain;
}
}

	// Return the confidence level based on the Euclidean distance,
	// so that similar images should give a confidence between 0.5 to 1.0,
	// and very different images should give a confidence between 0.0 to 0.5.
float pConfidence = 1.0f - sqrt(leastDistSq / (float)(nTrainFaces * nEigens)) / 255.0f;
printf("Info: Face recognized with confidence of %f\r\n", pConfidence);

	if (pConfidence < minimumConfidence) return -1;

	return iNearest;
}

int loadFaceImgArray()
{
FILE * imgListFile = 0;
char imgFilename[512], name[32];
int iFace, nFaces = 0;

	// open the input file
	if(!(imgListFile = fopen(txtDataFile, "r")))
{
fprintf(stderr, "Error, Can\'t open file %s\n", txtDataFile);
return 0;
}

	// count the number of faces
	while(fgets(imgFilename, 512, imgListFile)) ++nFaces;
rewind(imgListFile);

	// allocate the face-image array and person number matrix
	faceImgArr = (IplImage **)cvAlloc(nFaces*sizeof(IplImage *));
	trainPersonNumMat = cvCreateMat(1, nFaces, CV_32SC1);
	//trainPersonNameMat = cvCreateMat(1, nFaces, CV_STRING);

	// store the face images in an array
	for(iFace=0; iFace<nFaces; iFace++)
{
		// read the person number and name from the image list file
		fscanf(imgListFile, "%d %s %s",
			trainPersonNumMat->data.i + iFace,
			/*trainPersonNameMat->data.i + iFace*/ name, imgFilename);

		// load the face image
		faceImgArr[iFace] = cvLoadImage(imgFilename, CV_LOAD_IMAGE_GRAYSCALE);

if(!faceImgArr[iFace])
{
fprintf(stderr, "Error: Can\'t load image from %s\n", imgFilename);
return 0;
}
}

fclose(imgListFile);

return nFaces;
}


IV.4 PIC Software

#include <18F2455.h>
#include <stdarg.h>

#device adc=10
#device PASS_STRINGS=IN_RAM
#fuses HS, NOWDT, NOPROTECT, NOLVP

#use delay(clock=20000000)

#use rtos(timer=0, minor_cycle=20ms)

signed int8 tilt = 0;

double distance = 600.0; //In centimeters
double battery = 100.0; //Percentage
double latitude, longitude, altitude;

#define SER_MOTOR_TX PIN_C6
#define SF9DOF_RX PIN_C7
#use rs232(baud=9600, parity=N, rcv=SF9DOF_RX, xmit=SER_MOTOR_TX, bits=8, ERRORS, STREAM=COM_BALANCE)

//9DOF services
void sf9dof_init() { }

boolean sf9dof_kbhit()
{
return kbhit(COM_BALANCE);
}

char sf9dof_getc()
{
return fgetc(COM_BALANCE);
}

//Motor services
#define LEFT_MOTOR 0
#define RIGHT_MOTOR 1
#define FORWARD 0
#define BACKWARD 1

boolean ser_motor_kbhit()
{
return kbhit(COM_BALANCE);
}

void ser_motor_putc(char chr)
{
do
{
fputc(chr, COM_BALANCE);
}
while (rs232_errors & 0x20);
}


void motor_speed_control(signed int8 speed)
{
	unsigned int1 direction;
	unsigned int8 value;

	direction = (speed > 0) ? FORWARD : BACKWARD;
	value = abs(speed);

	ser_motor_putc((LEFT_MOTOR << 7) | (direction << 6) | (0 << 4) | (value & 0x0F));
	ser_motor_putc((RIGHT_MOTOR << 7) | (direction << 6) | (0 << 4) | (value & 0x0F));
}

void motor_tilt_control(signed int8 degree)
{
	unsigned int1 direction;
	unsigned int8 value;

	direction = (degree > 0) ? BACKWARD : FORWARD;
	value = abs(degree / 3);

	ser_motor_putc((LEFT_MOTOR << 7) | (direction << 6) | (1 << 4) | (value & 0x0F));
	ser_motor_putc((RIGHT_MOTOR << 7) | (direction << 6) | (1 << 4) | (value & 0x0F));
}

void motor_position_control(signed int8 fifteens_degrees)
{
	unsigned int1 left_direction, right_direction;
	unsigned int8 value;

	left_direction = (fifteens_degrees > 0) ? BACKWARD : FORWARD;
	right_direction = (fifteens_degrees > 0) ? FORWARD : BACKWARD;
	value = abs(fifteens_degrees);

	ser_motor_putc((LEFT_MOTOR << 7) | (left_direction << 6) | (2 << 4) | (value & 0x0F));
	ser_motor_putc((RIGHT_MOTOR << 7) | (right_direction << 6) | (2 << 4) | (value & 0x0F));
}

void init_motors()
{
motor_speed_control(0);
}

//Beagle services
#define BEAGLE_SDA PIN_B0
#define BEAGLE_SCL PIN_B1
#define BEAGLE_ADDRESS 0xA4

#use i2c(SLAVE, SDA=BEAGLE_SDA, SCL=BEAGLE_SCL, address=BEAGLE_ADDRESS, FORCE_HW)

unsigned int8 capture_i2c_state, capture_status = 0, capture_trigger = 0;

#INT_SSP

void ssp_interrupt()
{
	capture_i2c_state = i2c_isr_state();

	if(capture_i2c_state < 0x80) //Beagle is sending status
	{
		capture_status = i2c_read();
	}
if(capture_i2c_state == 0x80) //Beagle is requesting trigger
{
i2c_write(capture_trigger);
capture_trigger = 0;
}
}

void beagle_init()
{
enable_interrupts(GLOBAL);
enable_interrupts(INT_SSP);
}

void beagle_capture(unsigned int8 capture)
{
capture_trigger = capture;
}

unsigned int8 beagle_status(void)
{
return capture_status;
}

void beagle_status_reset(void)
{
capture_status = 0;
}

//LCD services
#define LCD_TX PIN_B2
#define LCD_WIDTH 20
#define LCD_HEIGHT 4

//Special Commands
#define SPECIAL_SETUP_COMMAND 0x7C
#define SPECIAL_EXTENDED_COMMAND 0xFE

//Setup commands
#define SAVE_SPLASH 0x0A //<control>j
#define TOGGLE_SPLASH 9
#define BAUD_RATE_2400 0x0B //<control>k
#define BAUD_RATE_4800 0x0C //<control>l
#define BAUD_RATE_9600 0x0D //<control>m
#define BAUD_RATE_14400 0x0E //<control>n
#define BAUD_RATE_19200 0x0F //<control>o
#define BAUD_RATE_38400 0x10 //<control>p
#define RESET_BAUD_RATE 0x12 //<control>r

//Backlight Brightness
#define BRIGHTNESS_OFF 128
#define BRIGHTNESS_40_ON 140
#define BRIGHTNESS_73_ON 150
#define BRIGHTNESS_FULLY_ON 157
#define BRIGHTNESS_NOT_VALID 158

//LCD Type
#define LCD_20_CHARACTER 3
#define LCD_16_CHARACTER 4
#define LCD_4_LINE 5
#define LCD_2_LINE 6

//Extended Commands
#define CLEAR_DISPLAY 0x01
#define CURSOR_HOME 0x02
#define CURSOR_RIGHT_ONE 0x14
#define CURSOR_LEFT_ONE 0x10
#define SCROLL_RIGHT 0x1C
#define SCROLL_LEFT 0x18
#define DISPLAY_ON 0x0C
#define DISPLAY_OFF 0x08
#define UNDERLINE_CURSOR_ON 0x0E
#define UNDERLINE_CURSOR_OFF 0x0C
#define BLINK_CURSOR_ON 0x0D
#define BLINK_CURSOR_OFF 0x0C
#define CURSOR_POSITION_OFFSET 0x80

//Character Display
unsigned int8 lcd_16_character_offset[] = {0, 64, 16, 80};
unsigned int8 lcd_20_character_offset[] = {0, 64, 20, 84};

#use rs232(baud=9600,parity=N,xmit=LCD_TX,bits=8, STREAM=COM_LCD)

void lcd_init(int1);
void lcd_clear(void);
void lcd_save_splash(void);
void lcd_toggle_splash(void);
void lcd_baud_rate_set(unsigned int16 rate);
void lcd_baud_rate_reset();
void backlight_set(unsigned int8 brightness);
void lcd_type_setup();
void lcd_cursor_move(unsigned int8 line, unsigned int8 position);
void lcd_putc(unsigned int8 chr);
void lcd_send_cmd(unsigned int8 cmd);
void lcd_setup_command(void);
void lcd_send_string(char * incoming_string);

void lcd_init(int1 firstRun)
{
lcd_send_cmd(DISPLAY_ON);
lcd_clear();
lcd_send_cmd(UNDERLINE_CURSOR_ON);
lcd_send_cmd(BLINK_CURSOR_OFF);
if (firstRun)
{
backlight_set(BRIGHTNESS_40_ON);
lcd_type_setup();
}
}


void lcd_clear(void)
{
lcd_send_cmd(CLEAR_DISPLAY);
}

void lcd_save_splash(void)
{
lcd_setup_command();
lcd_putc(SAVE_SPLASH);
delay_ms((LCD_WIDTH * 2) * 100);
}

void lcd_toggle_splash(void)
{
lcd_setup_command();
lcd_putc(TOGGLE_SPLASH);
delay_ms(100);
}

void lcd_baud_rate_set(unsigned int16 rate)
{
lcd_setup_command();
lcd_putc(rate);
delay_ms(100);
}

void lcd_baud_rate_reset()
{
lcd_setup_command();
lcd_putc(RESET_BAUD_RATE);
delay_ms(100);
}

void backlight_set(unsigned int8 brightness)
{
lcd_setup_command();
lcd_putc(brightness);
delay_ms(100);
}

void lcd_type_setup()
{
lcd_setup_command();
lcd_putc((LCD_WIDTH == 20) ? LCD_20_CHARACTER : LCD_16_CHARACTER);
lcd_setup_command();
lcd_putc((LCD_HEIGHT == 4) ? LCD_4_LINE : LCD_2_LINE);
}

void lcd_cursor_move(unsigned int8 line, unsigned int8 position)
{
	lcd_send_cmd(((LCD_WIDTH == 20) ? lcd_20_character_offset[line - 1] :
		lcd_16_character_offset[line - 1]) + (position - 1) + CURSOR_POSITION_OFFSET);
}

void lcd_putc(unsigned int8 chr)
{
	fputc(chr, COM_LCD);
}

void lcd_send_cmd(unsigned int8 cmd)
{
fprintf(COM_LCD, "%c%c", SPECIAL_EXTENDED_COMMAND, cmd);
}

void lcd_setup_command(void)
{
lcd_putc(SPECIAL_SETUP_COMMAND);
}

void lcd_split_screen()
{
lcd_clear();
lcd_cursor_move(2, 1);
fprintf(COM_LCD, "Tilt:%3d Dist:%4.0fcm", tilt, distance);
lcd_cursor_move(3, 1);
fprintf(COM_LCD, "Btry:%5.1f%% Alt:%4f", battery, altitude);
lcd_cursor_move(4, 1);
fprintf(COM_LCD, "Lt:%6.3f Lg:%6.3f", latitude, longitude);
}

void lcd_update_msg(char *msg)
{
unsigned int8 i = 0;

lcd_cursor_move(1, 1);
for (; i < 20; i++)
{
if (msg[i] == 0) break;
lcd_putc(msg[i]);
}
for (; i < 20; i++)
{
lcd_putc(' ');
}
}

void lcd_update_tilt(signed int8 tilt)
{
lcd_cursor_move(2, 6);
fprintf(COM_LCD, "%3d", tilt);
}

void lcd_update_distance(double distance)
{
lcd_cursor_move(2, 15);
fprintf(COM_LCD, "%4.0f", distance);
}

void lcd_update_battery(double battery)
{
lcd_cursor_move(3, 6);
fprintf(COM_LCD, "%5.1f%%", battery);
}

void lcd_update_gps(double latitude, double longitude, double altitude)
{
lcd_cursor_move(3, 17);
fprintf(COM_LCD, "%4.0f", altitude);
lcd_cursor_move(4, 4);
fprintf(COM_LCD, "%6.3f", latitude);
lcd_cursor_move(4, 14);
fprintf(COM_LCD, "%6.3f", longitude);
}

//TTS256 services
#define TTS256_TX PIN_B5
#define TTS256_RX PIN_C0
#define TTS256_RESET PIN_C1

#use rs232(baud=9600, parity=N, xmit=TTS256_TX, rcv=TTS256_RX, bits=8, STREAM=COM_TTS256)
void tts256_init()
{
//Reset the TTS256 module (active high)
output_high(TTS256_RESET);
delay_ms(100);
output_low(TTS256_RESET);
fprintf(COM_TTS256, "\r\n");
}

#define tts256_talk(a) printf(a)

//VRbot services
#define VRBOT_USE_UART
#define VRBOT_TX PIN_B3
#define VRBOT_RX PIN_B4
/*
#define VRBOT_SDA PIN_B0
#define VRBOT_SCL PIN_B1
#define VRBOT_ADDR 0x9A
*/

#define GROUP_0 0 //(Command count: 1)
#define G0_ROBOTY 0
#define GROUP_1 1 //(Command count: 10)
#define G1_FORWARD 0
#define G1_BACKWARD 1
#define G1_LEFT 2
#define G1_RIGHT 3
#define G1_STOP 4
#define G1_INTRO 5
#define G1_NAME 6
#define G1_FROM 7
#define G1_HELLO 8
#define G1_WHOAMI 9

#define CMD_BREAK 'b' // abort recog or ping
#define CMD_SLEEP 's' // go to power down
#define CMD_KNOB 'k' // set si knob <1>
#define CMD_LEVEL 'v' // set sd level <1>
#define CMD_LANGUAGE 'l' // set si language <1>
#define CMD_TIMEOUT 'o' // set timeout <1>
#define CMD_RECOG_SI 'i' // do si recog from ws <1>
#define CMD_TRAIN_SD 't' // train sd command at group <1> pos <2>
#define CMD_GROUP_SD 'g' // insert new command at group <1> pos <2>
#define CMD_UNGROUP_SD 'u' // remove command at group <1> pos <2>
#define CMD_RECOG_SD 'd' // do sd recog at group <1> (0 = trigger mixed si/sd)
#define CMD_ERASE_SD 'e' // reset command at group <1> pos <2>
#define CMD_NAME_SD 'n' // label command at group <1> pos <2> with length <3> name <4-n>
#define CMD_COUNT_SD 'c' // get command count for group <1>
#define CMD_DUMP_SD 'p' // read command data at group <1> pos <2>
#define CMD_MASK_SD 'm' // get active group mask
#define CMD_RESETALL 'r' // reset all commands and groups
#define CMD_ID 'x' // get version id
#define CMD_DELAY 'y' // set transmit delay <1> (log scale)
#define CMD_BAUDRATE 'a' // set baudrate <1> (bit time, 1=>115200)

#define STS_MASK 'k' // mask of active groups <1-8>
#define STS_COUNT 'c' // count of commands <1>
#define STS_AWAKEN 'w' // back from power down mode
#define STS_DATA 'd' // provide command length <1> label <2-11> training <12>
#define STS_ERROR 'e' // signal error code <1-2>
#define STS_INVALID 'v' // invalid command or argument
#define STS_TIMEOUT 't' // timeout expired
#define STS_INTERR 'i' // back from aborted recognition (see 'break')
#define STS_SUCCESS 'o' // no errors status
#define STS_RESULT 'r' // recognised sd command <1> - training similar to sd <1>
#define STS_SIMILAR 's' // recognised si <1> (in mixed si/sd) - training similar to si <1>
#define STS_OUT_OF_MEM 'm' // no more available commands (see 'group')
#define STS_ID 'x' // provide version id <1>

// protocol arguments are in the range 0x40 (-1) to 0x60 (+31) inclusive
#define ARG_MIN 0x40
#define ARG_MAX 0x60
#define ARG_ZERO 0x41

#define ARG_ACK 0x20 // to read more status arguments

//Future Development: Tune for I2C communication
#ifdef VRBOT_USE_UART
#use rs232(baud=9600, parity=N, xmit=VRBOT_TX, rcv=VRBOT_RX, bits=8, STREAM=COM_VRBOT)
#else
#use i2c(master, sda=VRBOT_SDA, scl=VRBOT_SCL, STREAM=COM_VRBOT)

// SC16IS752 Dual UART Register Defines
//
#define RHR 0x00 // Recv Holding Register is 0x00 in READ Mode
#define THR 0x00 // Xmit Holding Register is 0x00 in WRITE Mode
//
#define IER 0x01 // Interrupt Enable Register
//
#define IIR 0x02 // Interrupt Identification Register in READ Mode
#define FCR 0x02 // FIFO Control Register in WRITE Mode
//
#define LCR 0x03 // Line Control Register
#define MCR 0x04 // Modem Control Register
#define LSR 0x05 // Line status Register
#define MSR 0x06 // Modem Status Register
#define SPR 0x07 // ScratchPad Register
#define TCR 0x06 // Transmission Control Register
#define TLR 0x07 // Trigger Level Register
#define TXLVL 0x08 // Xmit FIFO Level Register
#define RXLVL 0x09 // Recv FIFO Level Register
#define IODir 0x0A // I/O Pins Direction Register
#define IOState 0x0B // I/O Pins State Register
#define IOIntEna 0x0C // I/O Interrupt Enable Register
#define IOControl 0x0E // I/O Pins Control Register
#define EFCR 0x0F // Extra Features Control Register
//
#define DLL 0x00 // Divisor Latch LSB 0x00
#define DLH 0x01 // Divisor Latch MSB 0x01
//
#define EFR 0x02 // Enhanced Function Register
//
#define I2CWRITE 0b0
#define I2CREAD 0b1

#define CHANA 0
#define CHANB 1

//
//***********************************************
byte ReadUART(int8 RegAddr, int1 CHAN) // Internal register address plus channel #(0 or 1)
{ // returns byte read from the UART register
byte data;
//
i2c_start(COM_VRBOT);
delay_us(15);
	while (i2c_write(COM_VRBOT, VRBOT_ADDR | I2CWRITE) != 0); // write cycle
	while (i2c_write(COM_VRBOT, (RegAddr << 3) | (CHAN << 1)) != 0); // write register address to the selected channel
i2c_start(COM_VRBOT); // restart
delay_us(15);
while (i2c_write(COM_VRBOT, VRBOT_ADDR | I2CREAD) != 0); // read cycle
data = i2c_read(COM_VRBOT, 0);
i2c_stop(COM_VRBOT);
return(data);
}
//
//*********************************************
void WriteUART(int8 RegAddr, int1 CHAN, byte Data) // Internal register address plus channel #(0 or 1)
{
// sends data byte to selected UART register
i2c_start(COM_VRBOT);
delay_us(15);
	while (i2c_write(COM_VRBOT, VRBOT_ADDR | I2CWRITE) != 0); // write cycle
	while (i2c_write(COM_VRBOT, (RegAddr << 3) | (CHAN << 1)) != 0); // write cycle
	while (i2c_write(COM_VRBOT, Data) != 0);
i2c_stop(COM_VRBOT);
}
//
//*********************************************
void UART_Send_Char(int1 CHAN, byte Data) //channel #(0 or 1) plus the data byte to be sent
{ // send byte to UART Xmit via the I2C bus
	WriteUART(THR, CHAN, Data); // send data to UART Transmit Holding Register
}
//
//*******************************************************
void Init_SC16IS752(void)
{
// This init routine initializes ChannelS A and B
//
// Channel A Setups
//Prescaler in MCR defaults on MCU reset to the value of 1
WriteUART(LCR, CHANA, 0x80); // 0x80 to program baud rate divisor
WriteUART(DLL, CHANA, 0x60); // 0x60=9600 with X1=14.7456MHz
WriteUART(DLH, CHANA, 0x00); //
//
WriteUART(LCR, CHANA, 0xBF); // access EFR register
WriteUART(EFR, CHANA, 0x00); // disable enhanced registers
//
WriteUART(LCR, CHANA, 0x03); // 8 data bits, 1 stop bit, no parity
WriteUART(FCR, CHANA, 0x07); // reset TXFIFO, reset RXFIFO, enable FIFO mode

// Channel B Setups
//Prescaler in MCR defaults on MCU reset to the value of 1
WriteUART(LCR, CHANB, 0x80); // 0x80 to program baud rate divisor
WriteUART(DLL, CHANB, 0x60); // 0x60=9600 with X1=14.7456MHz
WriteUART(DLH, CHANB, 0x00); //
//
WriteUART(LCR, CHANB, 0xBF); // access EFR register
WriteUART(EFR, CHANB, 0x00); // disable enhanced registers
//
WriteUART(LCR, CHANB, 0x03); // 8 data bits, 1 stop bit, no parity
WriteUART(FCR, CHANB, 0x07); // reset TXFIFO, reset RXFIFO, enable FIFO mode
}
//
//*********************************************
char Poll_UART_RHR(int1 CHAN)
{ // Poll UART to determine if data is waiting
char data = 0x00;
//
if (ReadUART(LSR, CHAN) & 0x01) // is data waiting??
{ // data present in receiver FIFO
data = ReadUART(RHR, CHAN);
}
// return received char or zero
return(data);
}
//
//*********************************************
void Set_GPIO_Dir(byte bits)
{ // Set Direction on UART GPIO Port pins GPIO0 to GPIO7
// 0=input 1=Output
WriteUART(IOControl, 0, 0x03); // Set the IOControl Register to GPIO Control
WriteUART(IODir,0, bits); // output the control bits to the IO Direction Register
}
//*********************************************
byte Read_GPIO()
{ // Read UART GPIO Port
char data = 0x00;


//
data=ReadUART(IOState,0); // get GPIO Bits state 0-7

	// return data bits state or zero
	return(data);
}
//
//*********************************************
void Write_GPIO(byte data)
{ // Load UART GPIO Port
WriteUART(IOState,0, data); // set GPIO Output pins state 0-7
}
//
//*********************************************
#endif

boolean VRbot_Detect(void);
boolean VRbot_SetLanguage(unsigned int8 lang);
unsigned int8 VRbot_GetGroupCount(unsigned int8 group);
void VRbot_RecognizeSI(unsigned int8 group);
void VRbot_RecognizeSD(unsigned int8 group);
signed int8 VRbot_CheckResult(void);
char vrbot_getc();
void vrbot_putc(char chr);
boolean vrbot_kbhit();
void vrbot_init();
byte ReadUART(int8 RegAddr, int1 CHAN);
void WriteUART(int8 RegAddr, int1 CHAN, byte Data);
void UART_Send_Char(int1 CHAN, byte Data);
void Init_SC16IS752(void);
char Poll_UART_RHR(int1 CHAN);
void Set_GPIO_Dir(byte bits);
byte Read_GPIO();
void Write_GPIO(byte data);

void vrbot_init()
{
#ifndef VRBOT_USE_UART
Init_SC16IS752();
#endif
}

boolean vrbot_kbhit()
{
#ifdef VRBOT_USE_UART
return kbhit(COM_VRBOT);
#else
return ReadUART(LSR, 0) & 0x01;
#endif
}

void vrbot_putc(char chr)
{
#ifdef VRBOT_USE_UART
	fputc(chr, COM_VRBOT);
#else
	UART_Send_Char(0, chr);
#endif
}

char vrbot_getc()
{
#ifdef VRBOT_USE_UART
return fgetc(COM_VRBOT);
#else
return ReadUART(RHR, 0);
#endif
}

boolean VRbot_Detect(void)
{
unsigned int8 i, j;

vrbot_init();
while (vrbot_kbhit()) vrbot_getc();
for (j = 0; j < 3; j++)
{
vrbot_putc(CMD_BREAK);
for (i = 0; i < 3; i++)
{
delay_ms(100);
if (vrbot_kbhit()) break;
}
while (vrbot_kbhit())
{
if (vrbot_getc() == STS_SUCCESS) return true;
}
}
return false;
}

boolean VRbot_SetDelay(unsigned int8 delay)
{
fprintf(COM_VRBOT, "%c%c", CMD_DELAY, ARG_ZERO + delay);
while (!vrbot_kbhit()) delay_ms(10);
return vrbot_getc() == STS_SUCCESS;
}

boolean VRbot_SetLanguage(unsigned int8 lang)
{
fprintf(COM_VRBOT, "%c%c", CMD_LANGUAGE, ARG_ZERO + lang);
while (!vrbot_kbhit()) delay_ms(10);
return vrbot_getc() == STS_SUCCESS;
}

unsigned int8 VRbot_GetGroupCount(unsigned int8 group)
{
fprintf(COM_VRBOT, "%c%c", CMD_COUNT_SD, ARG_ZERO + group);
while (!vrbot_kbhit()) delay_ms(10);
if (vrbot_getc() != STS_COUNT) return 0;
vrbot_putc(ARG_ACK);
while (!vrbot_kbhit()) delay_ms(10);
return vrbot_getc() - ARG_ZERO;
}

void VRbot_RecognizeSD(unsigned int8 group)
{
	fprintf(COM_VRBOT, "%c%c", CMD_RECOG_SD, ARG_ZERO + group);
}

void VRbot_RecognizeSI(unsigned int8 group)
{
fprintf(COM_VRBOT, "%c%c", CMD_RECOG_SI, ARG_ZERO + group);
}

signed int8 VRbot_CheckResult(void)
{
unsigned int8 rx;
while (!vrbot_kbhit()) delay_ms(10);
rx = vrbot_getc();
	if (rx != STS_SIMILAR && rx != STS_RESULT) return -1; // Mismatch or Timeout
vrbot_putc(ARG_ACK);
while (!vrbot_kbhit()) delay_ms(10);
return vrbot_getc() - ARG_ZERO;
}

//GPS services
#define GPS_RX PIN_B6

#use rs232(baud=9600,parity=N,rcv=GPS_RX,bits=8, STREAM=COM_GPS)

void gps_init() { }

boolean gps_kbhit()
{
return kbhit(COM_GPS);
}

void gps_putc(char chr)
{
fputc(chr, COM_GPS);
}

char gps_getc()
{
return fgetc(COM_GPS);
}

//Future Development: Implement the GPS routines
boolean gps_locked()
{
return false;
}

// Future Development: Implement the GPS routines
double gps_obtain_latitude()
{
return 15.370443;
}

// Future Development: Implement the GPS routines
double gps_obtain_longitude()
{
return 44.178019;
}

// Future Development: Implement the GPS routines
double gps_obtain_altitude()
{
return 2200.0;
}

//Every 20ms - 9DOF services task
#task(rate=20ms, max=20ms, queue=2)
void service_9dof();

//Every 100ms - Proximity services task
#task(rate=100ms, max=20ms, queue=2)
void service_proximity();

//Every 100ms - VRBot services task
#task(rate=100ms, max=20ms)
void service_vrbot();

//Every 100ms - Beagle services task
#task(rate=100ms, max=20ms)
void service_beagle();

//Every 100ms - Battery services task
#task(rate=100ms, max=20ms, queue=2)
void service_battery();

//Every 100ms - GPS services task
//#task(rate=20ms, max=20ms)
#task(rate=100ms, max=20ms)
void service_gps();

void service_9dof()
{
if (sf9dof_kbhit())
{
tilt = sf9dof_getc();
lcd_update_tilt(tilt);
if (tilt != 0)
{
motor_tilt_control(tilt);
if (abs(tilt) > 10)
{
lcd_update_msg("I am falling. X(");
}
}
}
}

void service_proximity()
{
set_adc_channel(1);
delay_us(20);
//3.845mV/cm
//10-bit ADC, Vss-Vdd
//ADC: 0.7867/cm
//distance = (unsigned int16)(1.267 * (double)read_adc());
	distance = (double)(1.64 * (double)read_adc());

lcd_update_distance(distance);
	//Stop if an obstacle exists at 25cm or less
if (distance <= 25)
{
lcd_update_msg("Obstacle detected!");
motor_speed_control(0);
}
}

boolean battery_warning = false;

void service_battery()
{
set_adc_channel(2);
delay_us(20);
//3.88V at 100% to 1.75V at 0%
//10-bit ADC, Vss-Vdd
//ADC: 794 at 100% to 358 at 0%
battery = 0.7 * battery + 0.3 * (-82.11 + 0.229 * (double)read_adc());
if (battery > 100) battery = 100;
if (battery < 0) battery = 0;
lcd_update_battery(battery);
if (battery > 40) battery_warning = false;
if (battery < 30 && battery_warning == false)
{
fprintf(COM_TTS256, "Enteebaah, Albattarria zaft.\r\n");
lcd_update_msg("Battery low!!");
battery_warning = true;
}
}

void service_vrbot()
{
unsigned int8 i;
signed int8 cmd;

if (VRbot_GetGroupCount(GROUP_0) == 0) rtos_terminate();
do
{
rtos_yield();
VRbot_RecognizeSD(GROUP_0);
while (!vrbot_kbhit()) rtos_yield();
cmd = vrbot_getc();
if (cmd != STS_SIMILAR && cmd != STS_RESULT)
continue;
vrbot_putc(ARG_ACK);
while (!vrbot_kbhit()) rtos_yield();
cmd = vrbot_getc() - ARG_ZERO;

	} while (cmd != G0_ROBOTY);

	fprintf(COM_TTS256, "Naam, Saidi.\r\n");
lcd_update_msg("Yes, Sir");
i = 0;
while(true)
{
rtos_yield();
if (VRbot_GetGroupCount(GROUP_1) == 0) rtos_terminate();
VRbot_RecognizeSD(GROUP_1);
while (!vrbot_kbhit()) rtos_yield();

cmd = vrbot_getc();
i++;
if (cmd != STS_SIMILAR && cmd != STS_RESULT)
{
if (i >= 3)
{
lcd_update_msg("Timeout ...");
return;
}
continue;
}
vrbot_putc(ARG_ACK);
while (!vrbot_kbhit()) rtos_yield();
cmd = vrbot_getc() - ARG_ZERO;
break;
}
switch(cmd)
{
case G1_FORWARD:
lcd_update_msg("Moving forward ...");
motor_speed_control(8);
break;
case G1_BACKWARD:
lcd_update_msg("Moving backward ...");
motor_speed_control(-8);
break;
case G1_LEFT:
lcd_update_msg("Turning left ...");
motor_position_control(6); //15*6 degrees
break;
		case G1_RIGHT:
			lcd_update_msg("Turning right ...");
			motor_position_control(-6); //15*-6 degrees
			break;
		case G1_STOP:
			lcd_update_msg("Stopping ...");
			motor_speed_control(0);
break;
case G1_INTRO:
lcd_update_msg("Introducing myself");
fprintf(COM_TTS256, "Essmi Roboty. Tama tasmiimii biwasitat Hamdi
Sahloul. Shokrran.\r\n");
break;
case G1_NAME:
lcd_update_msg("My name is Roboty");
fprintf(COM_TTS256, "Essmi Roboty.\r\n");
break;
case G1_FROM:
lcd_update_msg("I am from Yemen");
fprintf(COM_TTS256, "Ana men Al Yemen.\r\n");
break;
case G1_HELLO:
lcd_update_msg("Peace upon you too");
fprintf(COM_TTS256, "Wa alikum Alsalaam.\r\n");
break;
case G1_WHOAMI:
lcd_update_msg("Detecting face...");
fprintf(COM_TTS256, "Alraga Alentdhaar.\r\n");
beagle_capture(1);
break;
}
}

const char *person_array[] = {"Hamdi", "Khalid", "Abdulahman", "Tareeq", "Moharam",
	"AlBalasi", "Father"};
char recognize_msg[64];
void service_beagle()
{
if (beagle_status() != 0)
{
if ((signed int8)beagle_status() > 0)
{
fprintf(COM_TTS256, "Taam eltaarf aaleek.\r\n");
sprintf(recognize_msg, "%s recognized", person_array[beagle_status() -
1]);
lcd_update_msg(recognize_msg);
}
else
{
sprintf(recognize_msg, "Not recognized (#%d)", beagle_status());
fprintf(COM_TTS256, "Fasheel eltaarf aaleek.\r\n");
lcd_update_msg(recognize_msg);
}
beagle_status_reset();
}
}

void service_gps()
{
if (gps_locked())
{
latitude = gps_obtain_latitude();
longitude = gps_obtain_longitude();
altitude = gps_obtain_altitude();

		lcd_update_gps(latitude, longitude, altitude);
	}
}

void main(void)
{
setup_adc_ports(AN0_TO_AN2 | VSS_VDD);
setup_adc(ADC_CLOCK_DIV_32);

sf9dof_init();
init_motors();

tts256_init();
lcd_init(true);
delay_ms(1500); //LCD Splash

fprintf(COM_LCD, "Starting up ...");


fprintf(COM_TTS256, "Gaarri bedee altaashgheel.\r\n");

lcd_clear();
fprintf(COM_LCD, "Detecting VRbot module ...");
	// Detect VRbot module
if (VRbot_Detect() == false)
{
fprintf(COM_LCD, "failed.");
return;
}
fprintf(COM_LCD, "done.");

lcd_clear();
fprintf(COM_LCD, "Setting up VRbot module ...");
VRbot_SetDelay(0);
fprintf(COM_LCD, "done.");

lcd_split_screen();
lcd_update_msg("Listening ...");

beagle_init();
gps_init();
rtos_run();

lcd_split_screen();
lcd_update_msg("System terminated!");
}


Appendix V. Microcontroller's Protocols and Signals

There exist many communication protocols that microcontrollers use to carry serial bit streams and communicate with other microcontrollers and PCs; you may need to know about these protocols in order to comprehend the robot software.

PWM protocol

UART protocol

I²C protocol


V.1 PWM

PWM stands for Pulse Width Modulation. This means that we can generate a pulse whose width (i.e. duration) can be altered. Since microcontrollers live in a digital world, their output pins can be either low (0V) or high (5V). However, the rest of the world tends not to speak such an open-or-shut case, i.e. the rest of the world tends to be analogue. Rather than just being on or off: motors tend to need speed control, lighting may need to be dimmed, servos need to move to a particular position, buzzers need a sound frequency, etc.[55]

Well, most real-world devices have some kind of latency (i.e. they do not do what you ask immediately). This could be caused by a mixture of momentum, inductance, capacitance, and friction (amongst others).

By turning an output pin of a microcontroller repeatedly high and low very quickly, the result is an average of the amount of time the output is high. If it is always low the result is 0V; if always high, the result is 5V; and if half-and-half, the result is 2.5V.

Frequency: In this example (see Figure V.1) the waveform repeats every 3 units. Assuming that each unit is 1ms, our waveform repeats every 3ms. Given that Frequency = 1 / Time, the signal frequency is 1/0.003, or 333.33 Hz. Note that with PWM this frequency remains constant - we just use the comparator value to adjust the duty cycle.

Duty Cycle: The percentage of time that our output pin is high is called the duty cycle. In the example above it is high for 2/3 of the time, i.e. a 66.66% duty cycle.

Figure V.1: PWM waveform
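To make the example concrete, the following is a minimal sketch of how such a waveform could be produced with the CCS C hardware PWM used elsewhere in this document. The 4MHz clock, the /16 prescaler, and the CCP1 pin are assumptions chosen so the numbers match the 3ms/66.66% example; they are not Roboty's actual PWM settings.

#include <18F2455.h>
#fuses XT, NOWDT, NOPROTECT, NOLVP
#use delay(clock=4000000)

void main(void)
{
	// Timer 2 sets the PWM period: period = (PR2 + 1) * 4 * Tosc * prescaler.
	// With a 4MHz clock and a /16 prescaler, PR2 = 187 gives ~3ms (~333 Hz).
	setup_timer_2(T2_DIV_BY_16, 187, 1);
	setup_ccp1(CCP_PWM); // put the CCP1 module (pin C2) into PWM mode

	// The duty value counts out of 4 * (PR2 + 1) = 752 ticks; 2/3 of that is ~501.
	set_pwm1_duty(501L); // ~66.66% duty cycle, as in Figure V.1

	while(TRUE); // the waveform is generated by hardware; the CPU is free
}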


V.2 UART

The UART, or Universal Asynchronous Receiver/Transmitter, is a feature of your computer or microcontroller useful for communicating serial data (text, numbers, etc.) to other devices. The computer/µC changes incoming parallel information (within the microcontroller) to serial data which can be sent on a communication line.[56]

These are the different standards/protocols used for transmitting data. They are incompatible with each other, but if you understand what each is, then you can easily convert them to what you need for your robot.

V.2.1 RS232

RS232 is the old standard and is starting to become obsolete. Few if any laptops even have RS232 ports (serial ports) today, with USB becoming the new universal standard for attaching hardware. But since the world has not yet fully swapped over, you may encounter a need to understand this standard.

Back in the day, circuits were noisy, lacking filters and robust algorithms, etc. Wiring was also poor, meaning signals became weaker as wiring became longer (this relates to the resistance of the wire). So to compensate for the signal loss, they used very high voltages. Since a serial signal is basically a square wave, where the wavelengths relate to the bit data transmitted, RS232 was standardized as +/-12V. To get both +12V and -12V, the most common method is to use the MAX232 IC (see Figure V.2) (or ICL232 or ST232 - different ICs that all do the same thing), accompanied with a few capacitors and a DB9 connector.[56]

Figure V.2: RS232 using MAX232 IC

V.2.2 EIA232F
Today signal transmission systems are much more robust, meaning a +12V/-12V signal is unnecessary. The EIA232F standard (introduced in 1997) is basically the same as the RS232 standard, but now it can accept a much more reasonable 0V to 5V signal. Almost all current computers (after 2002) utilize a serial port based on this EIA-232 standard. This is great, because now you no longer need the annoying MAX232 circuit![56]

Instead what you can use is something called the RS232 shifter - a circuit that takes signals from the computer/microcontroller (TTL) and correctly inverts and amplifies the serial signals to the EIA232F standard.


V.2.3 TTL and USB

The UART takes bytes of data and transmits the individual bits in a sequential fashion. At the destination, a
second UART re-assembles the bits into complete bytes.[56]

You really do not need to understand what TTL is, other than that TTL is the signal transmitted and received by your microcontroller UART. This TTL signal is different from what your PC serial/USB port understands, so you would need to convert the signal.

You also do not really need to understand USB, other than that it is fast becoming the only method to communicate with your PC using external hardware. To use USB with your robot, you will need an adaptor that converts TTL to USB. You can easily find converters under $20, or you can make your own by using either the FT232RL or CP2102 ICs.

V.2.4 Other Terminology

V.2.4.1 TX and RX

Tx represents transmit and Rx represents receive. The transmit pin always transmits data, and the receive pin always receives it. Sounds easy, but it can be a bit confusing. For example, suppose you have a GPS device that transmits a TTL signal and you want to connect this GPS to your microcontroller UART. This is how you would do it - see Figure V.3.

Figure V.3: UART Tx and Rx connection with GPS Module

Notice how Tx is connected to Rx, and Rx is connected to Tx. If you are the type of person to accidentally plug in your wiring backwards, you may want to add a resistor of say ~2KΩ coming out of your UART to each pin. This way if you connect Tx to Tx accidentally, the resistor will absorb all the bad current that would otherwise destroy your UART. And remember to make your ground connection common!


V.2.4.2 Baud Rate

Baud is a measurement of transmission speed in asynchronous communication. The computer, any adaptors, and the UART must all agree on a single speed of information - bits per second.

For example, your robot would pass sensor data to your laptop at 38400 bits per second, and your laptop would listen for this stream of 1s and 0s expecting a new bit every 1/38400bps = 26µs (0.000026 seconds). As long as the robot outputs bits at the pre-determined speed, your laptop can understand it.

Remember to always configure all your devices to the same baud rate for communication to work!

V.2.4.3 Data bits, Parity, Stop Bits, Flow Control

These are basically variations of the signal, each with long explanations of why you would or would not use them. Stick with the defaults, and make sure you follow the suggested settings of your adaptor. Usually you will use 8 data bits, no parity, 1 stop bit, and no flow control - but not always. Note that if you are using a PIC microcontroller you would have to declare these settings in your code.

V.2.4.4 Asynchronous Serial Transmission

As you should already know, baud rate defines the bits sent per second. But baud only has meaning if the two communicating devices have a synchronized clock. For example, what if your microcontroller crystal had a slight deviation of 0.1 second, meaning it thinks 1 second is actually 1.1 seconds long? This could cause your baud rates to break!

One solution would be to have both devices share the same clock source, but that just adds extra wires.
All of this is handled automatically by the UART!

Asynchronous transmission allows data to be transmitted without the sender having to send a clock
signal to the receiver. Instead, the sender and receiver must agree on timing parameters in advance and
special bits are added to each word in which these bits are used to synchronize the sending and receiving
units.

When a word is given to the UART for asynchronous transmission, a bit called the "Start Bit" is added to the beginning of each word that is to be transmitted. The Start Bit is used to alert the receiver that a word of data is about to be sent, and to force the clock in the receiver into synchronization with the clock in the transmitter. These two clocks must be accurate enough to not have the frequency drift by more than 10% during the transmission of the remaining bits in the word.[56]


V.3 I²C

The I²C (Inter IC) bus was developed by Philips in the early 1980s, for use in communication of peripheral devices within a TV-set. In some implementations, I²C is called TWI (Two-Wire Interface) due to royalty issues. I²C is actually pronounced as "i squared c" (I²C), but most pronounce it as "i two c" (I2C).[57]

From its name we can easily deduce that it provides a communication link between ICs (integrated circuits). I²C is multi-master and can support a maximum of 112 devices on the bus. The specification declares that 128 devices can be connected to the I²C bus, but it also defines 16 reserved addresses.

Microcontrollers generally have internal software-based pull-up resistors on the SCL and SDA lines. If not, you shall set them up manually.

There are two kinds of device on the I²C bus (see Figure V.4):

 MASTER - node that always controls the clock line.

 SLAVE - node that is not in control of the clock line.

Figure V.4: The I²C bus, 1 MASTER, 3 SLAVEs, and the required pull-up resistors

The most common setup is 1 MASTER and multiple SLAVEs on the I²C bus. Both MASTER and SLAVE can transfer data over the I²C bus, but that transfer is always controlled by the MASTER.

An EEPROM memory will not have the same device address as a digital temperature IC; also, some slave devices provide a simple way of changing their device address, which allows up to 8 of the same devices to be connected to the I²C bus. Our 24LC256 EEPROM address can be changed by pulling up or down the pins A0, A1, A2.

The device address is only 7 bits, while data transmitted on the I²C bus is 8-bit. The first 7 bits define the device address. The 8th bit defines the R/W (read/write) mode: for write operations this bit is 0, and 1 for read operations. So the read/write bit just makes the device address an odd or even address. Data is transmitted on the I²C bus MSB first.[57]

If you are familiar with hexadecimal and binary numbers, then it would be easy to state that the address of the 24LC256 EEPROM is "1 0 1 0 A2 A1 A0 1" for reading, and "1 0 1 0 A2 A1 A0 0" for writing. Since we changed the 7-bit address of our EEPROM into 0x51, which is "1 0 1 0 0 0 1" in binary, it means that our A2 and A1 are grounded, while A0 is pulled up to Vcc. The address for writing would then be 0xA2, and 0xA3 for reading.


Appendix VI. Kalman Filter

This appendix will explain how Kalman filtering works. We will use a more practical approach to avoid the boring theory, which is hard to understand anyway. I will try to make it look more concrete instead of a puzzling generalized approach.

Accelerometer to attitude

Gyroscope to roll, pitch and yaw

Kalman filtering of IMU data


VI.1 Accelerometer to attitude

An accelerometer measures, as its name hints, acceleration along a predefined axis. As you probably remember from your physics class, the earth's gravity is also an acceleration (a falling stone keeps going faster and faster). So: with an accelerometer, we can measure the earth's gravity! Figure VI.1 shows how we do it.

The red arrow represents the earth's gravity. The blue arrow shows how the accelerometer senses gravity. Note that the axis of this accelerometer is perpendicular to the aircraft (we placed it like that in our robot!).[58]

Figure VI.1: Accelerometer axis direction

The angle theta between the actual gravity vector and the measured gravity is related to the pitch of the aircraft (pitch = theta + 90°). If we know theta, we know our pitch! Since we know the magnitude of the earth's gravity, simple calculus gives us our pitch angle:

accelerometer = cos(theta) * gravity
theta = acos(accelerometer / gravity)

And since pitch = theta + 90°:

pitch = asin(accelerometer / gravity)

Calculating the roll angle is pretty much the same. We only need an extra accelerometer with an axis perpendicular to the pitch-accelerometer.

Reality is a bit different from this simplified example. The sin⁻¹ cannot give you the full 360-degree ranging pitch angle. A plane heading for the sky and one heading for the ground would both result in a 0 (zero) measurement. We will need an extra accelerometer to distinguish these cases. The 2-argument inverse tangent makes sure the resulting angle is in the correct quadrant. Thus:

pitch = atan2(accelerometer / gravity, z / gravity)

Now you know most about using accelerometers to calculate pitch and roll, but do not start building your own autopilot system just yet! There are more forces working on a balancing robot than just gravity! We will need gyroscopes to correct this over a short period of time (also useful to eliminate the effect of vibrations on the accelerometer). Over a longer period of time, we will need some more advanced physics to estimate these other forces so we can compensate for them.
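As a minimal sketch of that 2-argument formula (the two accelerometer readings below are made-up values in units of gravity, not measurements from the robot):

#include <stdio.h>
#include <math.h>

int main(void)
{
	double ax = 0.50, az = 0.87; /* accelerometer axes, in units of g (made-up) */

	/* atan2 keeps the angle in the correct quadrant, unlike asin or acos alone */
	double pitch = atan2(ax, az) * 180.0 / M_PI;

	printf("pitch = %.1f degrees\n", pitch);
	return 0;
}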


VI.2 Gyroscope to roll, pitch and yaw

Now you know most practical things you need to know about accelerometers, we will continue with gyroscopes (other names: gyro, angular rate sensor).

A gyroscope is a very fancy name for a device that measures the angular rate (how many degrees the angle changes per second). The gyroscopes used in very critical applications (like a jumbo jet) are very advanced and complicated. Fortunately for us, there are some low-cost and small-sized alternatives which are good enough. They are fabricated using MEMS (Micro-Electro-Mechanical Systems) and iMEMS (Integrated MEMS) technology by big companies like Analog Devices.[59] Great! Let us start with some theory:

As you probably remember from physics, position, velocity and acceleration are related to each other: deriving the position gives us velocity:

dx/dt = vx

with x being the position on the x-axis and vx being the velocity along the x-axis. Maybe less obvious, the same holds for angles. While velocity is the speed at which the position is changing, angular rate is nothing more than the speed at which the angle is changing. That is right:

d(alpha)/dt = angular rate = gyroscope output

with alpha being the angle. It is starting to look pretty good! Knowing that the inverse of deriving (d) is integrating (∫), we change our formula into:

∫ angular rate = ∫ gyroscope output = alpha

Enough boring theory! Let us take a look at some figures. The following figures all represent the same motion: I took a gyroscope, turned it 90 degrees left and back, and then turned it 90 degrees right and back.

The raw data (used here - see Figure VI.2) is what we get when we feed the gyroscope's output (0-5 volt) into a 10-bit ADC (analog to digital converter). So the raw values are between 0 and 1023. (The red line is just a low-pass filtered version of the blue data.)


Figure VI.2: Raw gyroscope’s output

You can clearly see a positive angular rate followed by a negative one. But we will need to shift the figure down, to make sure negative values correspond to a negative angular rate. Otherwise the integration (which can be seen as the sum of our y-values) would keep adding up values and never subtracting any! We normalize it by subtracting about 490 from every value. This normalization gives us the following (see Figure VI.3):

Figure VI.3: Normalized gyroscope’s output

Now all we need to do, according to our formulas, is integrate it! Discrete integration is nothing more than summing up all the values. Basically, integrating from 0 to the i-th value:

integration(i) = integration(i-1) + val(i)

This is the simplest possible integrator. A more advanced one, which also flattens out possible jitter in the data, is the Runge-Kutta integrator:

integration(i) = integration(i-1) + 1/6 * (val(i-3) + 2*val(i-2) + 2*val(i-1) + val(i))
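In C, the normalize-then-integrate step is only a few lines. The sketch below uses a made-up sample buffer and the bias of 490 from the example; it is an illustration, not the robot's balancing code:

#include <stdio.h>

#define N 8

int main(void)
{
	double raw[N] = {490, 510, 530, 510, 490, 470, 450, 490}; /* made-up ADC samples */
	double val[N];
	double integration = 0.0;
	int i;

	for (i = 0; i < N; i++)
		val[i] = raw[i] - 490.0; /* normalize around the bias */

	for (i = 3; i < N; i++) /* the integrator needs the three previous samples */
	{
		integration += (val[i-3] + 2*val[i-2] + 2*val[i-1] + val[i]) / 6.0;
		printf("step %d: angle ~ %f\n", i, integration);
	}
	return 0;
}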

Using this Runge-Kutta integration, we get the following (Figure VI.4)

Figure VI.4: Integration of raw gyroscope’s output

This is pretty much the exact movement I made! Now we just need to add a scale factor to our data so our result is in degrees (see Figure VI.5).

Figure VI.5: Scaled integration of raw gyroscope's output - alpha degrees

This pretty much ends my story of the simplified gyroscope! In reality, gyroscopes suffer from an effect called drift. This means that over time, the value a gyroscope reads when in a steady position (called the bias) drifts away from its initial steady value (see Figure VI.6).


Figure VI.6: alpha degrees with bias effect

The blue line gives you an idea about the drift. During 4500 samples (12 seconds in my setup), the bias drifted about 30 degrees!

Remember that we need the bias (about 490 in our example) to normalize our data. How can we integrate when we have no idea about the correct bias? We will need to find a way to get it. A hint: our accelerometer is not affected by drift.

VI.3 Kalman filtering of IMU data

VI.3.1 Introduction

To many of us, Kalman filtering is something like the holy grail. Indeed, it miraculously solves some problems
which are otherwise hard to get a hold on. But beware, Kalman filtering is not a silver bullet and will not
solve all of your problems!

Make sure you know from the previous sections how the data from “accelerometers” and “gyroscopes”
are used. Some basic knowledge of algebra may also come in handy.

VI.3.2 Basic operation

Kalman filtering is an iterative filter that requires two things. First of all, you will need some kind of input
(from one or more sources) that you can turn into a prediction of the desired output using only linear
calculations. In other words, we will need a linear model of our problem.


Secondly, you will need another input. This can be the real world value of the predicted one, or
something that is a good approximation of it.

While iterating, the Kalman filter will change the variables a bit in our linear model, so the output of
our linear model will be closer to the second input.[60]

VI.3.3 Our simple model

Obviously, our two inputs will consist of the gyroscope and accelerometer data. The model using the gyroscope data looks like this:

x(k) = A * x(k-1) + B * u(k)

[alpha(k)]   [1  -dt] [alpha(k-1)]   [dt]
[bias(k) ] = [0    1] [bias(k-1) ] + [0 ] * u(k)

The first formula represents the general form of a linear model. We need to "fill in" the A and B matrices, and choose a state x. The variable u represents the input; in our case this will be the gyroscope's data. Remember how we integrate? We just add the NORMALIZED measurements up:

alpha(k) = alpha(k-1) + (u(k) - bias)

We need to include the time between two measurements (dt) because we are dealing with a rate (degrees/s):

alpha(k) = alpha(k-1) + (u(k) - bias) * dt

We rewrite it:

alpha(k) = alpha(k-1) - bias * dt + u(k) * dt

This is what we have in our matrix multiplication. Remark that our bias remains constant! In the section on gyroscopes, we saw that the bias drifts. Well, here comes the Kalman magic: the filter will adjust the bias while iterating by comparing the result with the accelerometer's output (our second input)! Great!

VI.3.4 Wrapping it all up

Now all we need are the bits and bolts that actually do the magic! These are some formulas using matrix algebra and statistics (see Table VI.1). There is no need right now to know the details of them, but if you insist on doing so, then you should refer to a Digital Control book.


Table VI.1: Kalman Digital Controller's formulas

u = measurement1                      Read the value of the last measurement
x = A * x + B * u                     Update the state x of our model
y = measurement2                      Read the value of the second measurement/real value; here this will be the angle calculated from our accelerometer
Inn = y - C * x                       Calculate the difference between the second value and the value predicted by our model; this is called the innovation
s = C * P * C' + Sz                   Calculate the covariance
K = A * P * C' * inv(s)               Calculate the Kalman gain
x = x + K * Inn                       Correct the prediction of the state
P = A * P * A' - K * C * P * A' + Sw  Calculate the covariance of the prediction error

The C matrix is the one that extracts the output from the state matrix. In our case, this is (1 0)':

alpha = C * x

Sz is the measurement process noise covariance: Sz = E(z(k) * z(k)')

In our example, this is how much jitter we expect on our accelerometer's data.

Sw is the process noise covariance matrix (a 2×2 matrix here): Sw = E(x * x')
Thus: Sw = E([alpha bias]' * [alpha bias])

Since only the diagonal elements of the Sw matrix are being used, we will only need to know E(alpha²) and E(bias²), which is the 2nd moment. To calculate those values, we will need to look at our model: the noise in alpha comes from the gyroscope and is multiplied by dt². Thus: E(alpha²) = E(u²) * dt².

These factors depend on the sensors you are using. You will need to figure them out by doing some experiments. In the source code of the autopilot/rotomotion Kalman filtering, they use the following constants:

E(alpha²) = 0.001
E(bias²) = 0.003
Sz = 0.3 (radians = 17.2 degrees)
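To tie the table and the constants together, here is a compact single-axis sketch with state x = (alpha, bias). It is written in the common predict/correct form rather than the a priori gain form shown in Table VI.1, and it follows the general structure of the autopilot/rotomotion filter; treat it as an illustration under those assumptions, not as Roboty's exact code:

static double alpha = 0, bias = 0;         /* state x = (alpha, bias) */
static double P[2][2] = {{1, 0}, {0, 1}};  /* covariance of the prediction error */
static const double Sw_alpha = 0.001, Sw_bias = 0.003, Sz = 0.3;

double kalman_update(double gyro, double accel_angle, double dt)
{
	/* x = A*x + B*u: integrate the normalized gyroscope rate */
	alpha += (gyro - bias) * dt;

	/* P = A*P*A' + Sw: propagate the prediction error covariance */
	P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0]) + Sw_alpha;
	P[0][1] -= dt * P[1][1];
	P[1][0] -= dt * P[1][1];
	P[1][1] += Sw_bias;

	/* Inn = y - C*x, s = C*P*C' + Sz, K = P*C'*inv(s), with C = (1 0) */
	double Inn = accel_angle - alpha;
	double s = P[0][0] + Sz;
	double K0 = P[0][0] / s, K1 = P[1][0] / s;

	/* x = x + K*Inn: correct the angle AND the drifting bias */
	alpha += K0 * Inn;
	bias += K1 * Inn;

	/* P = (I - K*C)*P: shrink the covariance after the correction */
	double P00 = P[0][0], P01 = P[0][1];
	P[0][0] -= K0 * P00;
	P[0][1] -= K0 * P01;
	P[1][0] -= K1 * P00;
	P[1][1] -= K1 * P01;

	return alpha;
}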

Appendix VII. PID Controller

A proportional-integral-derivative controller (PID controller) is a generic control loop feedback mechanism (controller) widely used in industrial control systems.

DC Motor PID

PID Tuning


A PID is the most commonly used feedback controller (see Figure VII.1). A PID controller calculates an "error" value as the difference between a measured process variable and a desired set-point. The controller attempts to minimize the error by adjusting the process control inputs. In the absence of knowledge of the underlying process, a PID controller is the best controller. However, for best performance, the PID parameters used in the calculation must be tuned according to the nature of the system - while the design is generic, the parameters depend on the specific system.

Figure VII.1: A block diagram of a PID controller

The PID controller calculation (algorithm) involves three separate parameters, and is accordingly sometimes called three-term control: the proportional, the integral and derivative values, denoted P, I, and D. The proportional value determines the reaction to the current error, the integral value determines the reaction based on the sum of recent errors, and the derivative value determines the reaction based on the rate at which the error has been changing. The weighted sum of these three actions is used to adjust the process via a control element such as the position of a control valve or the power supply of a heating element. Heuristically, these values can be interpreted in terms of time: P depends on the present error, I on the accumulation of past errors, and D is a prediction of future errors, based on the current rate of change.[61]

By tuning the three constants in the PID controller algorithm, the controller can provide control action
designed for specific process requirements. The response of the controller can be described in terms of the
responsiveness of the controller to an error, the degree to which the controller overshoots the set-point and
the degree of system oscillation. Note that the use of the PID algorithm for control does not guarantee
optimal control of the system or system stability.

VII.1 DC Motor PID Controller

So why do we need speed control? Actually there are two reasons for accurate speed control over our
motors which are:

 Motor speed should be independent of load.

 Differential drive platforms[63] need to synchronize wheel speed to go in a straight line.

Since we are controlling the speed of the motors using PWM via an H-Bridge embedded within the Motor Driver board (see Figure VII.2), by changing its duty cycle (refer to Appendix V), all we need is feedback that will be used to correct the error; this is done by the encoder.

Figure VII.2: DC Motor PID controller

Now all we need is to use the measurement of output to control the input (Feedback) using our
ATMega microcontroller.

Knowing that PID algorithms adjust the gain applied to the plant based on several characteristics of the feedback, not just its current value, we shall start designing our controller using the error term, which is derived by subtracting the feedback (motor speed) from the set point (set speed). This is the error in terms of a number of encoder counts per unit time - in other words, it is the first derivative of position: the speed.

So, what is the mathematics of PID?

The simple proportional coefficient Kp is multiplied by the error term. It provides a linear response to the error term.

The integral coefficient Ki is multiplied by the error term and added to the sum of all previous integral terms. It provides a response to the accumulated error.

The derivative coefficient Kd is multiplied by the difference between the previous error and the current error. It responds to the change in error from one PID cycle to the next.

Combining all the above gains results in the PID controller; see Figure VII.3.

Figure VII.3: PID implementation
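Put together, one PID cycle is only a few lines of C. A minimal sketch follows; the gains are placeholders to be replaced by the tuning procedure of the next section, not Roboty's tuned values:

#include <stdint.h>

int16_t pid_step(int16_t set_speed, int16_t measured_speed)
{
	static int16_t prev_error = 0;
	static int32_t integral = 0;
	const float Kp = 2.0f, Ki = 0.001f, Kd = 1.0f; /* placeholder gains */

	int16_t error = set_speed - measured_speed; /* P: the present error */
	integral += error;                          /* I: the accumulated error */
	int16_t derivative = error - prev_error;    /* D: the change since the last cycle */
	prev_error = error;

	return (int16_t)(Kp * error + Ki * integral + Kd * derivative);
}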

VII.1.1 PID Tuning

Knowing that the behavior of most systems is measured by the system's "step response" (see Figure VII.4), we need to measure a step response for our PID controller. To do this, you need to:

 Add code to monitor the output of the PID algorithm (i.e. encoder speed feedback, counts per PID cycle).

 Store the feedback speed value into an array element for the first 48 PID executions (2 seconds).

 Change the set speed from 0 to 60% (from 0 to 60RPM) of the motor's maximum speed. This is equivalent to a step function.

 After 2 seconds, stop the motor and print the array data to the serial port.

 This allows the response of the platform to be determined numerically.


Figure VII.4: Example response of a temperature PID controller


I used the Brute Force Approach[62] for tuning the PID (and it is still there, waiting for you to enable its preprocessor directive called PID_TUNE). The tuning algorithm loops through a range of values for each coefficient P, I, and D, as shown in Listing VII.1.

for (P = P_start; P < P_end; ++P)
{
	for (I = I_start; I < I_end; ++I)
	{
		for (D = D_start; D < D_end; ++D)
		{
			// Set motor speed to 60RPM
			// Wait for the motor to go for 2 seconds
			// Set motor speed to 0
			// Print the P, I, and D values and the 20 array elements of sensed speed
		}
	}
}
Listing VII.1: Example of the brute force tuning

A sample of the resulting PID tuning data is shown in Listing VII.2; each row lists the P, I, and D values followed by the 20 sensed-speed samples.

0;0.001;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0
0;0.001;1;8;14;11;7;4;4;1;1;0;0;0;0;0;0;0;0;0;0;0;0
0;0.001;2;6;15;12;10;9;8;6;7;6;7;6;8;7;7;6;7;7;8;7;8
0;0.001;3;5;12;12;12;11;11;11;10;10;11;10;11;11;12;11;12;12;11;12;12
0;0.001;4;6;15;15;14;15;13;13;13;13;14;14;14;14;15;15;14;14;14;14;15
0;0.001;5;8;16;17;17;16;14;14;14;16;16;16;16;16;18;17;16;15;17;16;15
1;0.001;0;8;15;12;11;11;14;16;19;27;28;31;32;32;33;33;15;33;33;33;33
1;0.001;1;5;12;10;11;14;17;21;24;25;27;28;31;31;32;32;32;32;33;32;32
1;0.001;2;6;13;13;15;15;18;23;24;25;26;28;29;30;31;30;30;31;31;31;31
1;0.001;3;7;14;16;17;19;20;23;23;25;25;28;29;29;29;30;30;29;31;30;31
1;0.001;4;6;16;19;18;20;21;23;24;25;26;27;27;28;28;28;29;29;30;29;30
1;0.001;5;6;18;22;21;22;22;23;23;25;26;27;27;28;28;28;28;29;29;29;30
2;0.001;0;6;12;12;16;21;27;30;32;34;35;36;35;35;34;32;32;31;30;30;28
2;0.001;1;6;13;15;19;23;26;29;31;32;34;33;34;0;32;32;32;31;31;30;30
2;0.001;2;6;14;18;21;24;26;27;30;31;32;32;32;33;32;32;31;31;31;30;29
2;0.001;3;5;18;21;23;26;27;28;30;29;29;29;28;28;30;29;30;31;31;31;32
2;0.001;4;6;18;24;26;25;25;26;27;30;30;30;30;31;31;31;30;30;31;30;30
2;0.001;5;6;19;27;26;26;26;26;28;29;30;30;30;30;29;31;30;30;30;30;31
Listing VII.2: Sample of PID tuning data

Now the results of all PID values within the test range are plotted with respect to time. The values that yield the best curve, with the preferred overshoot and settling time, will be used for the PID controller. You could use Microsoft® Excel to do this by importing your data as comma-separated values (CSV). Good luck!


Glossary

A
 ADC: Analog-to-Digital Converter
 Arduino: a single-board microcontroller and a software suite for programming it
 ATMega: a modified Harvard architecture 8-bit RISC single-chip AVR-based microcontroller developed by Atmel

B
 B.Eng: Bachelor of Engineering
 BB: BeagleBoard
 BSD: Berkeley Software Distribution

C
 C/C++: statically typed, free-form, multi-paradigm, compiled, general-purpose programming languages
 CCS: Custom Computer Services
 CSV: Comma-Separated Values

D
 DC: Direct Current
 Digital Controller: a branch of control theory that uses digital computers to act as system controllers
 DOF: Degrees Of Freedom

E
 EEPROM: Electrically Erasable Programmable Read-Only Memory
 Eigenfaces: a set of eigenvectors used in the computer vision problem of human face recognition

F
 FTDI: Future Technology Devices International
 FYP: Final Year Project

G
 Gimbal: a pivoted support that allows the rotation of an object about a single axis

H
 HSV: Hue, Saturation, and Value color system
 Humanoid Robot: a robot whose overall appearance is based on that of the human body, allowing interaction with made-for-human tools or environments

I
 I/O: Input/Output
 I²C or I2C: Inter-Integrated Circuit
 IDE: Integrated Development Environment
 iMEMS: Integrated MEMS
 IMU: Inertial Measurement Unit
 ISR: Interrupt Service Routine

J
 JFFS2: Journaling Flash File System version 2
 JST: Japan Solderless Terminal

L
 LCD: Liquid Crystal Display
 LED: Light-Emitting Diode
 LiFe: Lithium iron phosphate battery
 LiPo/LiPoly: Polymer Lithium-Ion battery
 Loom: a device used to weave cloth

M
 mAh: Milliampere-hour
 MAV: Micro Air Vehicle
 Mechatronics: the synergistic combination of Mechanical engineering, Electronic engineering, Computer engineering, Control engineering, and Systems Design
 MEMS: Micro-Electro-Mechanical Systems
 MPSR: Multi-Purpose Service Robot
 MSB: Most Significant Bit

N
 NiCd: Nickel-cadmium battery
 NiMH: Nickel-metal hydride battery

P
 PC: Personal Computer
 PCA: Principal Component Analysis
 PCB: Printed Circuit Board
 PDIP: Plastic Dual Inline Package
 PIC: Programmable Interface Controller
 PID: Proportional-Integral-Derivative controller
 PUMA: Programmable Universal Manipulation Arm
 PWM: Pulse-Width Modulation

R
 RGB: Red, Green, and Blue color system
 RIA: Robotics Industry Association
 Roboty: the name I gave to the robot I am designing
 RPM: Revolutions Per Minute
 RS232: Recommended Standard 232
 RTOS: Real-Time Operating System
 RUP: Rossum's Universal Robots

S
 SoC: System-on-a-Chip
 SPI: Serial Peripheral Interface

T
 TWI: Two-Wire Interface

U
 UART: Universal Asynchronous Receiver/Transmitter
 UAV: Unmanned Aerial Vehicle
 µC: Microcontroller
 USB: Universal Serial Bus
 USD: United States Dollar
 UVC: USB Video Class

V
 VR: Voice Recognition
 VRbot: Voice Recognition Module
References

[1] Williams, Mike. Aug 11, 2002. “History of Robotics.” Ball State University. <http://www.bsu.edu/web/mawilliams/history.html>.

[2] Chavis, Jason. Oct 16, 2008. “The Importance of Robots.” eHow Computer & Electronics. <http://www.ehow.com/about_4596141_importance-robots.html>.

[3] “Three Laws of Robotics.” Wikipedia, the free encyclopedia. 10 Oct 2010. <http://en.wikipedia.org/wiki/Three_Laws_of_Robotics>.

[4] “How Robots Work.” Net Industries. <http://science.jrank.org/pages/5900/Robotics-How-robots-work.html>.

[5] Woodford, Chris. Aug 20, 2009. “Uses of Robots.” Explain that stuff! <http://www.explainthatstuff.com/robots.html>.

[6] Bengtson, Harlan. Jul 9, 2010. “Importance of Robots.” Bright Hub. <http://www.brighthub.com/engineering/mechanical/articles/76606.aspx>.

[7] “Types of robots.” All On Robots. <http://www.allonrobots.com/types-of-robots.html>.

[8] “Impact Of Robots.” <http://users.cjb.net/mechatronics/robotics.pdf>.

[9] David, Nathan. Oct 23, 2010. “Design and Control of a Robotic Arm.” University of Nigeria Nsukka. <http://www.scribd.com/doc/27420223/Design-and-Control-of-a-Robotic-Arm>.

[10] “Artificial Satellite Research Papers.” <http://www.allfreeessays.com/topics/artificial-satellite/60>.

[11] “Achievements in robot technology.” GoeSTY's Junction Blog. Jul 4, 2010. <http://sutrawidanta.wordpress.com/2010/07/04/mr-robotorobots/>.

[12] “Service Robots, Introduction.” IFR International Federation of Robotics. <http://www.ifr.org/service-robots/>.

[13] “Project Organization Structure.” Project Shrink Publishing. Jul 22, 2007. <http://www.softwareprojects.org/project_intake_organization211.htm>.

[14] “PIC microcontroller.” Wikipedia, the free encyclopedia. 5 Oct 2010. <http://en.wikipedia.org/wiki/PIC_microcontroller>.

[15] “PIC18F2455 Data sheet.” Microchip Technology Inc. 26 Mar, 2009. <http://ww1.microchip.com/downloads/en/DeviceDoc/39632e.pdf>.

[16] “28-Pin PIC microcontroller prototype board.” OLIMEX Ltd. 2005. <http://www.olimex.com/dev/pic-p28-usb.html>.

[17] “24LC256 EEPROM.” Microchip Technology Inc. 13 May, 2010. <http://www.microchip.com/wwwproducts/devices.aspx?ddocname=en010823>.

[18] “MPLAB Compatible Devices Programmer.” OLIMEX Ltd. 2007. <http://www.olimex.com/dev/pic-mcp-usb.html>.

[19] “ICSP Guide.” Microchip Technology Inc. 25 Mar, 2003. <http://ww1.microchip.com/downloads/en/devicedoc/30277d.pdf>.

[20] “MPLAB Integrated Development Environment.” Microchip Technology Inc. 20 Jun, 2009. <http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=1406&dDocName=en019469&part=SW007002>.

[21] “CCS IDE Compilers.” CCS, Inc. 11 Aug, 2010. <http://www.ccsinfo.com/product_info.php?cPath=Store_Software&products_id=ide-compiler>.

[22] “Voice Recognition Module.” Galizia GmbH. <http://www.veear.eu/Products/VRbot.aspx>.

[23] “VRbot data sheet.” VeeaR Tigal Keg. 2009. <http://download.tigal.com/veear/VRbot-DSv1.0.pdf>.

[24] “SpeakJet Dictionary.” SparkFun Electronics. <http://www.sparkfun.com/datasheets/Components/General/SpeakJet-dictionary.zip>.

[25] “VoiceBox Shield Schematic.” SparkFun Electronics. <http://www.sparkfun.com/datasheets/DevTools/Arduino/Arduino%20Voice%20Shield-v13.pdf>.

[26] “TTS256 data sheet.” Speech Related IC's for the Robotic Hobbyist. May 4, 2007. <http://www.speechchips.com/downloads/TTS256_Datasheet_prelim.pdf>.

[27] “Basic 20x4 Character LCD data sheet.” Xiamen Ocular LCD Device Co. Ltd. Sep 12, 2003. <http://www.sparkfun.com/datasheets/LCD/GDM2004D.pdf>.

[28] “Serial Enabled LCD Backpack data sheet.” SparkFun Electronics. 2004. <http://www.sparkfun.com/datasheets/LCD/SerLCD_V2_5.PDF>.

[29] “D2523T Helical GPS Receiver data sheet.” ADH Technology Co. Ltd. <http://www.sparkfun.com/datasheets/GPS/Modules/D2523T%20V1.pdf>.

[30] “Logic Level Converter Schematic.” SparkFun Electronics. Dec 6, 2008. <http://www.sparkfun.com/datasheets/BreakoutBoards/Level-Converter-v10.pdf>.

[31] “Accelerometer.” Wikipedia, the free encyclopedia. 10 Jul 2010. <http://en.wikipedia.org/wiki/Accelerometer>.

[32] “Gyroscope.” Wikipedia, the free encyclopedia. 6 Oct 2010. <http://en.wikipedia.org/wiki/Gyroscope>.

[33] “9 Degrees of Freedom Schematic.” SparkFun Electronics. Jan 25, 2010. <http://www.sparkfun.com/datasheets/Sensors/IMU/9DOF-Razor-v14.pdf>.

[34] “Pocket AVR Programmer Schematic.” SparkFun Electronics. 4 Dec, 2009. <http://www.sparkfun.com/datasheets/Programmers/AVR-Pocket-Programmer-v15.pdf>.

[35] “FTDI Basic Breakout Schematic.” SparkFun Electronics. 23 Jul, 2008. <http://www.sparkfun.com/datasheets/DevTools/Arduino/FTDI%20Basic.pdf>.

[36] “Arduino – Environment.” Arduino™ website. 4 Oct 2010. <http://www.arduino.cc/en/Guide/Environment>.

[37] “Wheel Encoder Set Product Info.” Pololu Corporation. 18 Sep 2010. <http://www.pololu.com/catalog/product/1218>.

[38] “Serial Controlled Motor Driver Schematic.” SparkFun Electronics. 8 Dec, 2009. <http://www.sparkfun.com/datasheets/Robotics/ROB-09571-Serial%20Controlled%20Dual%20Motor%20Driver%20-%20v11.pdf>.

[39] “Infrared Proximity Sensor Long Range data sheet.” SHARP Corporation. 1 Dec, 2006. <http://sharp-world.com/products/device/lineup/data/pdf/datasheet/gp2y0a02yk_e.pdf>.

[40] “BeagleBoard overview.” Wikipedia, the free encyclopedia. 30 Sep 2010. <http://en.wikipedia.org/wiki/Beagle_Board>.

[41] Kridner, Jason. 26 Dec, 2008. “BeagleBoard hardware details.” the Beagle Board. <http://beagleboard.org/hardware>.

[42] “BeagleBoard technical details.” Embedded Linux Wiki. 22 Sep, 2010. <http://elinux.org/BeagleBoard>.

[43] “BeagleBoard Manual.” the Beagle Board. 1 Dec, 2009. <http://beagleboard.org/static/BBSRM_latest.pdf>.

[44] “Beagle Board Beginners Guide.” Embedded Linux Wiki. 24 Jul, 2010. <http://elinux.org/BeagleBoardBeginners>.

[45] “BeagleBoard Zippy.” Embedded Linux Wiki. 21 May, 2010. <http://elinux.org/BeagleBoard_Zippy>.

[46] “Logitech Webcam Pro 9000.” Logitech. <http://www.logitech.com/en-us/webcam-communications/webcams/devices/6333>.

[47] Koen. Mar 21, 2010. “Introduction: The Ångström Distribution.” The Ångström Distribution | Embedded power. <http://www.angstrom-distribution.org/>.

[48] Bradski, Gary. Jun 10, 2010. “OpenCVWiki: Welcome.” OpenCV Wiki. <http://opencv.willowgarage.com/>.

[49] Ryowens. May 4, 2010. “Using the TTS256 with the VoiceBox Shield.” SparkFun Electronics. <http://www.sparkfun.com/commerce/tutorial_info.php?tutorials_id=166>.

[50] “Robot Software.” Wikipedia, the free encyclopedia. 27 Sep 2010. <http://en.wikipedia.org/wiki/Robot_software>.

[51] Emami, Shervin. Sep 6, 2010. “Face Detection and Face Recognition.” Shervin Emami website. <http://www.shervinemami.co.cc/faceRecognition.html>.

[52] Hewitt, Robin. “Seeing With OpenCV.” Servo Magazine, April 2007: 36-39.

[53] Grgic, Mislav. 7 Nov, 2008. “Face Recognition Algorithms.” University of Zagreb. <http://www.face-rec.org/algorithms/>.

[54] Osier-Mixon, Jeffrey M. 18 Aug, 2009. “Boot Linux on the Beagle Board.” IBM® website. <http://www.ibm.com/developerworks/linux/library/l-beagle-board/index.html>.

[55] Webbot. 30 Nov, 2008. “PWM overview.” Society Of Robots. <http://www.societyofrobots.com/member_tutorials/node/229>.

[56] “UART Tutorial.” Society Of Robots. <http://www.societyofrobots.com/microcontroller_uart.shtml>.

[57] Rgcustodio. 9 Jun, 2007. “I²C Tutorial.” Society Of Robots. <http://www.societyofrobots.com/member_tutorials/node/35>.

[58] Pycke, Tom. 10 May, 2006. “Accelerometer to attitude.” MAV-blog. <http://tom.pycke.be/mav/69>.

[59] Pycke, Tom. 11 May, 2006. “Gyroscope to roll, pitch and yaw.” MAV-blog. <http://tom.pycke.be/mav/70>.

[60] Pycke, Tom. 22 May, 2006. “Kalman filtering of IMU data.” MAV-blog. <http://tom.pycke.be/mav/71>.

[61] “PID Controller.” Wikipedia, the free encyclopedia. 7 Oct 2010. <http://en.wikipedia.org/wiki/PID_controller>.

[62] Bickle, Rick. 7 Nov, 2003. “DC Motor Control Systems.” Dallas Personal Robotics Group. <http://www.dprg.org/tutorials/2003-10a/motorcontrol.pdf>.

[63] “Differential wheeled robot.” Wikipedia, the free encyclopedia. 14 Sep 2010. <http://en.wikipedia.org/wiki/Differential_wheeled_robot>.

[64] “Top-down and bottom-up design.” Wikipedia, the free encyclopedia. 6 Oct 2010. <http://en.wikipedia.org/wiki/Top-down_and_bottom-up_design>.

[65] “Islamic Golden Age.” Wikipedia, the free encyclopedia. 11 Oct 2010. <http://en.wikipedia.org/wiki/Islamic_Golden_Age>.

[66] “Al-Andalus.” Wikipedia, the free encyclopedia. 10 Oct 2010. <http://en.wikipedia.org/wiki/Al-Andalus>.
