
BIOLOGICALLY INSPIRED ROBOTICS

Charlie: the head robot


Sriram Kumar, Devangini Patel, Sara Adela Abad Guaman, Suraj Padhy, Stabak Nandi. MSc Artificial Intelligence, ECS, University of Southampton. skk1g12@soton.ac.uk, dp2c12@soton.ac.uk, saag1g12@soton.ac.uk, sp3g12@soton.ac.uk, sn6g12@soton.ac.uk

Abstract: We have many technological developments, such as computers, cell phones, Facebook, and more. As technology advances, it becomes difficult for the common man to learn how to access them. We therefore aim to create an I/O device that makes it easy for any human to access the internet and communicate with others. After going through many designs, we found that the human form is the one people are most comfortable communicating with: if you want people to communicate with machines, it has to happen naturally. So we decided to make a robot that is human-like and has all the features of a human face. These features include camera eyes, microphone ears, a speaker voice, touch-sensor skin, motor-powered eyes and jaw, facial muscles, and more. The robot is flexible, with around 17 degrees of freedom, and has all the peripherals to support a social experience. These human-like features bring trust, reliability and emotional intelligence to any piece of artificially intelligent software.

Index Terms: Head robot, expressions, sensors, actuators

I. PREVIOUS PROJECTS

We have looked at some of the head robots built previously, which are as follows:

Kismet: Kismet is an expressive robotic creature with perceptual and motor modalities tailored to natural human communication channels. To facilitate a natural infant-caretaker interaction, the robot is equipped with visual, auditory, and proprioceptive sensory inputs. The motor outputs include vocalizations, facial expressions, and motor capabilities to adjust the gaze direction of the eyes and the orientation of the head [1].

Professor Hiroshi Ishiguro's geminoids and telenoid robot: A geminoid is a robot that works as a duplicate of an existing person. It appears and behaves as that person and is connected to the person by a computer network [2]. All the geminoids are shown in Fig. 1.

Fig. 1. The geminoids made so far.

II. SPECIFICATIONS OF THE ROBOT

Dimensions (H x W x D): 21.5 cm x 23 cm x 15 cm
Degrees of freedom: eyes: 2 x 2 = 4; mouth: 1; eye LEDs: 2 x 2 = 4; linear actuators: 10 x 2 = 20
Sensors: piezo: 13; QTC: 7; capacitive: 5
Peripheral devices: camera, microphone and speakers
Processing: Arduino Mega 2560; Arduino Uno; microprocessor: computer; webserver
Power supply: servo motors: 5 x 5 V, 1 A; linear actuators: 10 x 4.4 V, 1 A; sensors: 4.4 V, 1 A; controllers and processors: 5 V, 1.5 A

III. ARCHITECTURE

The architecture of the head robot system is shown in Fig. 2. The head robot consists of the following components:

Processors: Arduino Mega 2560, Arduino Uno, and a laptop.
Actuators, as facial muscles: servo motors for controlling the eyes, eyelids and jaw; linear actuators for generating the facial expressions.
Sensors, as skin: capacitive sensors for detecting the proximity of humans, QTC pills for detecting hard pressure in small areas such as the nose and ears, and piezoelectric sensors for picking up sudden changes in pressure.
Cameras, as eyes: for stereo vision and for detecting people, so that the robot notices the humans around it and gives them attention; additional image processing is also possible.
Microphone, as ears: to listen to humans' words and their tone, and for speech processing.
Speaker, as mouth: so that the head robot can communicate verbally with humans.

Fig. 2. The architecture of the system.

As shown, various inputs from the environment (vision, touch, hearing) are processed, and output is produced as speech, facial expressions and movement of different parts of the face. The Arduino Uno is connected to the face robot's eyes, eyelids and jaw; it sends the angle values to the servo motors attached to these parts, and they move accordingly. The Arduino Mega is connected to the sensors, the linear actuators, the Arduino Uno and the laptop. It is the master processor, and the angle values for the Arduino Uno come from the Mega: the Mega reads the sensor values and concurrently reads actuator values from the laptop, forwarding those intended for the Arduino Uno and sending the rest to the linear actuators, as sketched below. The cameras, microphone and speaker are connected to the laptop. Image processing is performed on the camera input. At present the microphone input is simply captured and sent to the laptop, and we have written code so that the sound at the laptop's microphone is passed over to the robot's speakers. There is also text-to-speech code, so that words can be stored in a database and, depending on the mood of the robot (sensor values), it can say certain things.
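The forwarding step on the Mega can be pictured with a short sketch. This is a minimal illustration, not the project's exact code: the one-byte target/angle format, the actuator pin and the Uno's I2C address (2, as in Listing 1) are assumptions made for the example; the real packet format is described in Section IX.

// Illustrative routing loop for the Arduino Mega (master).
#include <Wire.h>
#include <Servo.h>

const int UNO_ADDRESS = 2;    // Uno's I2C slave address (as in Listing 1)
Servo linearActuator;         // one linear actuator stands in for all ten

void setup() {
  Wire.begin();               // join the I2C bus as master
  Serial.begin(9600);         // serial link to the laptop
  linearActuator.attach(23);  // left mouth actuator pin, per Table IV
}

void loop() {
  if (Serial.available() >= 2) {
    byte target = Serial.read();            // 1 = Uno's servos, else Mega's actuators
    byte angle  = Serial.read();            // angle value, 0-180
    if (target == 1) {
      Wire.beginTransmission(UNO_ADDRESS);  // forward the angle to the Uno
      Wire.write(angle);
      Wire.endTransmission();
    } else {
      linearActuator.write(angle);          // drive the linear actuator directly
    }
  }
}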

IV. MECHANICAL STRUCTURE

The basic idea for the mechanical structure was taken from the TJ Animatronic Puppet, the first open-source robotic expression platform [3]. The whole aluminium skull is fixed on a blue Perspex piece, shown in Fig. 3; the engineering drawings of these pieces are given in Fig. 4. We had these pieces milled at the mechanical labs.

Fig. 3. The skull base.

The eye box, which controls the pulling of the eyes and the movement of the eyelids, is shown in Fig. 6, and its engineering drawing in Fig. 5. The SolidWorks assemblies are given in Figs. 7 and 8, where two layered platforms are attached behind; these platforms hold the servo motors and guide the eyeball as it moves. More detailed views of the eye box structure are shown in Figs. 9 and 12. The eyeballs and the eyelids are designed so that they project out of this platform, with the centre of gravity balanced on both sides. This projection holds the eyeball, and the holes in the side of the holder guide the eyebrows, helping them open and close. The eyeball and the eyebrow are each powered by micro servo motors; detailed views are shown in Figs. 10 and 11. The eyelids are likewise driven by servo motors. In this way the eye box is built in parts and assembled, then placed inside the large hole of the skull base and clamped to its top, as shown in Fig. 3. The teeth were then fixed onto the jaw using polymorph, as shown in Fig. 13.




Fig. 6. The eye box for the eyes and eyelids mechanism.

Fig. 4. The skull base engineering drawing.

Fig. 7. The eye box SolidWorks assembly.

Fig. 5. The eye box engineering drawing.

This structure is quite weak, and a latex mask cannot be placed directly over it, so we decided to make a metal casing that is screwed to the skull base and provides a platform for the polymorph face and the latex mask. The metal casing consists of a head piece that covers the top, metal jaws (occupying the space the actual jaws take up) and cheeks that protect the area under the eyes and above the metal jaw, shown in Fig. 14. It also includes the eye protection cases shown in Fig. 15, which are put on at the end (after the eye box is placed inside). The main purpose of the metal casing is to protect the servo motors and mechanics inside the skull. There is no room inside the casing, so the processors are kept behind the head robot.

Over the metal casing a layer of polymorph is added, which resembles the human face more closely than the metal does. This layer is for fitting the sensors and actuators on, and it leaves little dimensional difference between the polymorph face and the latex mask, so a person touching the robot does not have to press hard or feel hollow space underneath. The polymorph face is shown in Fig. 16; it is placed over the metal face, and the latex mask is placed over the polymorph. The fitting of the latex mask over the polymorph face is discussed in the actuator section.

Fig. 8. The eye box SolidWorks assembly.


Fig. 11. The eye ball.

Fig. 9. The eye ball SolidWorks assembly.

Fig. 12. The eyelid mechanism.

Fig. 10. The eye ball.

V. SENSORS

Touch sensation is the amount of stress over the contact area between two surfaces [4]. Touch has two submodalities: the kinaesthetic submodality receives inputs from receptors within the tendons, muscles and joints, while the cutaneous submodality receives its inputs from the skin [5]. The second submodality is the one replicated in the present project, so pressure sensors located in the face are needed to obtain the input data. There are, however, several constraints on selecting these devices: cost, the physical dimensions of the face, repeatability, and accuracy. Based on these constraints, capacitive sensors were chosen to sense a caress, quantum tunnelling composite (QTC) pills are used in small areas such as the ears and nose, and piezoelectric sensors are employed to sense harder interactions over large areas such as the cheeks, chin, forehead, and eyebrows. The analysis made to select the sensors is described in the design subsection.



Fig. 13. The skull base with eyes and teeth.

Fig. 15. The full metal casing.

Fig. 14. Partial metal casing over the skull base.

A. Justification of the sensors

The sensors have to differentiate between a caress and a hard touch, and they have to cover most of the face while remaining small enough to localize the touch. The main input for the sensors is the variation of pressure when the face is touched.

Making capacitive sensors for the Arduino is simple: it requires only a resistor and an electrically conductive material. Such a sensor can detect the electrical capacitance of a human through more than a quarter of an inch of insulating material, and the sensing distance depends on the value of the resistor: the larger the resistance, the larger the range of the sensor. However, the time response is also proportional to the resistance, because the time constant is equal to RC.
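A minimal reading sketch, using the CapacitiveSensor library referenced in [6], might look as follows. The send pin (22) follows Table IV and the receive pin (26, eyebrows) is one example; the sample count and threshold are assumptions to be tuned on the real face.

// Reading one copper-tape electrode with the CapacitiveSensor library [6].
#include <CapacitiveSensor.h>

// resistor between send pin 22 and receive pin 26; copper tape on pin 26
CapacitiveSensor eyebrows = CapacitiveSensor(22, 26);

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = eyebrows.capacitiveSensor(30);  // 30 samples per reading
  if (reading > 200) {                           // hand-tuned threshold (assumed)
    Serial.println("caress on eyebrows");
  }
  delay(50);
}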

Fig. 16. The polymorph face.


Fig. 17. The capacitive sensor electric circuit.

Fig. 19. The QTC circuit diagram.

Fig. 18. The piezoelectric circuit diagram.

Other advantages are the cost, dimensions, and shapes [6].

The piezoelectric sensors, on the other hand, have a fixed shape. They are available in a limited variety of dimensions, but their repeatability is better than that of the capacitive sensors, so they are employed to measure the pressure applied to the skin. Their electrical connection requires just a resistor, because pressure applied to the sensor generates a voltage in the device. However, they are too big to be placed in areas such as the nose and ears. To sense pressure in these small areas of the face, quantum tunnelling pills, which measure 3 mm x 3 mm, are a good alternative. Their output signal is a change of resistance: the resistance is effectively infinite when no pressure is applied and approaches zero as pressure increases.

B. Circuits of the sensors

The capacitive sensors are made of copper tape, with size and shape defined by the available space on the surface. The aim of these sensors is to cover large areas in order to sense a caress; the design therefore contemplates 7 of them, covering the forehead, left cheek, right cheek, chin, eyebrows, nose, and lips. The conditioning circuit is illustrated in Fig. 17; the resistor R2 protects the pin of the controller.

The piezoelectric sensor circuit implemented is the one recommended by [7]. Based on the size of the sensor and of the face, 15 piezoelectric sensors are required.

Fig. 19 illustrates the circuit for the QTC sensors: a voltage divider. If the pressure is zero, the input at the pin is also zero; if pressure is applied, the resistance decreases and the input at the pin rises above zero.
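Both analogue channels can be read with a short sketch of this kind, following the circuits in Figs. 18 and 19. The pins match Table IV; the thresholds are assumptions for illustration.

// Reading one piezoelectric and one QTC channel on the Mega.
const int piezoPin = A0;         // PRS0 piezoelectric sensor, per Table IV
const int qtcPin = A13;          // left ear QTC sensor, per Table IV
const int knockThreshold = 100;  // sudden-pressure threshold (assumed)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int piezo = analogRead(piezoPin);  // voltage spike on a sudden press
  int qtc = analogRead(qtcPin);      // divider output rises under pressure
  if (piezo > knockThreshold) Serial.println("hard touch near the forehead");
  if (qtc > 0) Serial.println("pressure on the left ear");
  delay(20);
}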

Fig. 20. The prototype of the capacitive sensors.

Because of the limited area available on the ears and nose, these sensors are placed in those locations. The full circuit of all the sensors is given in Fig. 40 in the Appendix.

C. Location of the sensors

To test the performance of the sensors, a prototype phase was carried out. Lacking a face-like surface, the sensors were evaluated on the back of a chair, selected as a test surface for its simplicity, strength, size, shape, and accessibility. Seven capacitive sensors were tested at first, but because of the space available on the face they were reduced to five. Fig. 20 shows the prototype of the capacitive sensors on the chair, and Fig. 21 illustrates the implementation phase for the capacitive sensors. The piezoelectric sensors, on the other hand, are attached to the latex. Every QTC sensor is made of at least 2 QTC pills connected in parallel.


Fig. 21. The capacitive sensors on the polymorph face.

Fig. 23. The location of the capacitive sensors on the polymorph face.

Fig. 22. The piezoelectric sensors on the polymorph face.

Fig. 22 shows the distribution of these sensors. The location of the sensors on the polymorph face is given in Figs. 23 and 24.

Fig. 24. The location of the piezoelectric sensors on the polymorph face.

VI. ACTUATORS

A. Justification for the actuators

Actuators are needed to show the different emotions. Their size should be small because of the limited space available on the face, and there must be enough of them to pull the latex mask by 1-1.5 cm easily while producing a range of facial expressions smoothly. Additionally, another type of actuator is required for moving the jaw, eyes, and eyelids. These can be bigger than, or the same size as, the other actuators, because they can be placed inside the head, but their torque has to be enough to move these parts.

B. Different actuators considered for producing facial expressions

The following table lists the materials and actuators researched as potential facial-muscle actuators, together with their advantages and drawbacks.


Fig. 25. The NiTi experiment.

Fig. 26. The ultra micro linear actuators placed beside a finger and a penny.

Actuator | Advantages | Disadvantages
Pneumatic muscles | light weight; uses low power; high flow rates for rapid contract-relax cycling | need huge and bulky air compressors and air reservoirs [8]
Electroactive polymer | very flexible; very natural behaviour | expensive; not easily available
Shape memory alloy | light; compact; strong | long cooling time [9], [10]
Ultra micro linear actuators | tiny; cheap | -

Fig. 27. The ultra micro linear actuators placed inside a casing.

C. Experiments with shape memory alloy

We carefully studied the experiments performed by researchers on SMA, along with videos available online, and replicated a simple experiment, shown in Fig. 25, based on those given in [9], to check the effectiveness of NiTi wires. 5 V was supplied by an external power supply and around 2.14 A flowed through the circuit. It took approximately 1.5 minutes for the wire to contract by 1 cm and another 1.5 minutes to expand back to its original length. The whole setup is also quite large; we did consider putting these structures behind the head and having the SMA pull wires attached to the latex, but the cooling time was unacceptable, so we moved on to other actuators.

D. Experiments with ultra micro linear actuators

We then considered ultra micro linear actuators because of their advantages and suitability for this project. An ultra micro linear actuator looks like the one shown in Fig. 26. Such a fragile component cannot be placed unprotected under the polymorph face: a mechanical component could break it, flux around it could disturb the electrical circuit, and so on. It therefore has to be covered with a protective mechanical casing, as shown in Fig. 27. Pin connectors are placed through the legs of the actuator, and two PCB plates are placed above and below it. A solder wire is passed through the hole in the arm of the linear actuator. An aluminium piece with a hole drilled through it is placed on the PCB plate, and the solder wire is passed through this hole too. The two ends of the solder wire are placed on the aluminium piece and fixed there with velcro.

E. Actuator circuits

The circuits of the actuators are shown in Fig. 28. The difference between the two types of servo motors is the source voltage: the actuators for showing expressions require a power supply from +3.3 V to +4.8 V, while the actuators used for moving the jaw, eyes and eyelids need +5 V.
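As a quick check of the circuit in Fig. 28, a single actuator can be exercised with a minimal sweep sketch. This assumes, as the expression code in Listing 2 does, that the linear actuators accept standard servo PWM; pin 23 follows Table IV.

// Bench test: sweep one ultra micro linear actuator between its end stops.
#include <Servo.h>

Servo actuator;

void setup() {
  actuator.attach(23);  // left mouth actuator, per Table IV
}

void loop() {
  actuator.write(0);    // fully retracted
  delay(1000);
  actuator.write(180);  // fully extended
  delay(1000);
}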


Fig. 28. The actuator circuit.

Fig. 30. The location of the linear actuators below the polymorph face (top view).

Fig. 29. The location of the linear actuators below the polymorph face.

The full circuit design is shown in Fig. 41 in the Appendix.

F. Location of the actuators

We studied where the actuators need to be placed to produce the 6 basic expressions: happy, sad, anger, surprise, disgust, and neutral [11], [12]. Given the size of one linear actuator unit, shown in Fig. 27, only around 10 can be placed below the polymorph face. Considering the available space below the polymorph, as well as the space between the polymorph and the mechanical structure, we placed the linear actuators at the positions shown in Fig. 29. The actuators can be seen below the polymorph from the top view in Fig. 30 and from the bottom view in Fig. 31. To attach the linear actuators to the latex mask, as shown in Fig. 32, velcro was stuck under the latex mask and the two sides of the velcro were fastened together.

VII. PROCESSORS

A. Microcontroller selection

Several parameters have to be defined in order to choose the microcontroller; these requirements are summarized in Table I. Based on Table I, the Arduino Mega 2560 microcontroller board was selected. It is illustrated in Fig. 33 and its features are described in Table II.

Fig. 31. The location of the linear actuators below the polymorph face (bottom view).

Fig. 32. The latex mask.

TABLE I
MICROCONTROLLER REQUIREMENTS

Device | Qty | Analogue/Digital | Input/Output | # Inputs | # Outputs
Capacitive sensors | 7 | D | I | 7 | -
Piezoelectric sensors | 15 | A | I | 15 | -
QTC sensors | 3 | A | I | 3 | -
Actuators +4.4 V | 22 | D | O | - | 22
Actuators +5 V | 5 | D | O | - | 5
I2C comm. port | 1 | D | I/O | 1 | 1
Serial comm. port | 1 | D | I/O | 1 | 1
Total | | | | 27 | 29

TABLE II
ARDUINO MEGA 2560 MICROCONTROLLER BOARD MAIN FEATURES

Feature | Description
Microcontroller | ATmega2560
Operating voltage | 5 V
DC current per I/O pin | 40 mA
Digital I/O pins | 54
Analogue inputs | 16
Serial communication ports | 4
TWI communication ports | 1
Flash memory | 256 KB
SRAM | 8 KB
EEPROM | 4 KB
Clock speed | 16 MHz

Fig. 33. The Arduino Mega 2560 microcontroller.

Although the Arduino Mega board has enough digital ports to handle the 5 servomotors whose power supply is +5 V, the Induino board was chosen to manage them, because it provides parallel processing on separate boards rather than concurrent software processing on one. During the testing phase, a single board could not sense and control both types of actuators in a way that appears simultaneous to human perception. As a result, the Induino board was selected to control these 5 servomotors. Fig. 34 shows the Induino board, Fig. 35 illustrates the shield used to control the motors, and Table III summarizes its features [13], [14].

Fig. 34. The Induino (Arduino Uno) microcontroller.

Fig. 35. The Induino (Arduino Uno) microcontroller's shield.

B. Pin connections

The connections of the Arduino Mega 2560 to the different sensors and the linear actuators are given in Table IV, and those for the Induino in Table V.

VIII. POWER SUPPLY

We needed 5 V and 3 A to drive our circuits, split into two parts: 1 A for the Induino and 2 A for the Arduino Mega 2560. First we bought a 5 V, 10 A power supply unit, shown in Fig. 36, but it blew out some of the servo motors. We then switched to a 5 V USB power supply with two outputs of 2.1 A and 1 A.

Fig. 36. The old power supply unit.

TABLE III
INDUINO MICROCONTROLLER BOARD MAIN FEATURES

Feature | Description
Microcontroller | ATmega328/168/88
Operating voltage | 5 V
DC current per I/O pin | 40 mA
Digital I/O pins | 23
Analogue inputs | 8
Serial communication ports | 1
TWI communication ports | 1
Flash memory | 32 KB
SRAM | 2 KB
EEPROM | 1 KB
Clock speed | 16 MHz

TABLE IV
ARDUINO MEGA 2560 PIN CONNECTIONS

Device | Name | Pin
Capacitive sensors | Send | 22
Capacitive sensors | Eyebrows | 26
Capacitive sensors | Right cheek | 30
Capacitive sensors | Left cheek | 34
Capacitive sensors | Chin | 38
Capacitive sensors | Forehead | 42
Piezoelectric sensors | PRS0-PRS7 | A0-A7
Piezoelectric sensors | PRS8 | A12
Piezoelectric sensors | PRS9-PRS12 | A8-A11
QTC sensors | Left ear | A13
QTC sensors | Right ear | A14
QTC sensors | Nose | A15
Linear actuators | Left mouth | 23
Linear actuators | Left cheek | 25
Linear actuators | Left eye | 27
Linear actuators | Left eyebrow | 29
Linear actuators | Forehead | 31
Linear actuators | Right eyebrow | 33
Linear actuators | Right eye | 35
Linear actuators | Right cheek | 37
Linear actuators | Right mouth | 39
Linear actuators | Jaw | 41

TABLE V
INDUINO PIN CONNECTIONS

Device | Name | Pin
Servomotors | Eyes horizontal motion | 6
Servomotors | Eyes vertical motion | 9
Servomotors | Left eyelid | 11
Servomotors | Right eyelid | 12
Servomotors | Jaw | 13
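For completeness, the Uno side of this arrangement can be sketched in a few lines: the five servos of Table V are attached, and angles are accepted over I2C. This is an illustrative sketch assuming one angle byte per servo in Table V order, which is not necessarily the project's exact format.

// Minimal Uno-side servo driver for the pins in Table V.
#include <Wire.h>
#include <Servo.h>

const int servoPins[5] = {6, 9, 11, 12, 13};  // Table V order
Servo servos[5];

void receiveEvent(int howMany) {
  for (int i = 0; i < 5 && Wire.available(); i++) {
    servos[i].write(Wire.read());  // one angle byte per servo (assumed)
  }
}

void setup() {
  for (int i = 0; i < 5; i++) servos[i].attach(servoPins[i]);
  Wire.begin(2);                 // join the bus as slave address 2, as in Listing 1
  Wire.onReceive(receiveEvent);  // handle incoming angle bytes
}

void loop() {}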

A. Voltage regulator

The linear actuators work at voltage levels between 3.3 V and 4.8 V [15]. To step the voltage down from 5 V to about 4.4 V, a rectifier diode (1N5400) is used in series: its forward drop of roughly 0.6 V gives 5 V - 0.6 V = 4.4 V.

IX. SOFTWARE

A. Arduino code

1) Master-slave communication between the microcontrollers: I2C is a multi-master, single-ended serial computer bus invented by Philips [16]. For Charlie, we used I2C to communicate between the Arduino Mega and the Arduino Uno: the Mega acts as master and the Uno as slave. The code for the communication between master and slave is given in Listing 1 in the Appendix. On the Arduino Mega, pins 20 and 21 are used for I2C communication,

and on the Arduino Uno, analogue pins 4 and 5 are used: pin A4 (SDA) of the Uno connects to pin 20 of the Mega, and pin A5 (SCL) to pin 21, forming the I2C bus between the two controllers. The reason for choosing I2C communication is that we wanted Charlie to do multiprocessing. The difference between multithreading and multiprocessing is that in multithreading processor time is shared and processes are switched between states to appear simultaneous, whereas in multiprocessing multiple processors handle separate processes, giving roughly double the throughput. Microcontrollers are not particularly quick at switching between processes, so using multiple processors is the better option when building robots.

2) Expression code: So far, three basic expressions have been created; the code is given in Listing 2 in the Appendix. The values used for the linear actuators to produce anger, surprise and happiness are given in Table VI. For illustration, the happy expression is shown in Fig. 37.

B. Python code

1) Image processing: We used OpenCV for the image processing [17]. Stereo vision code has been implemented, face tracking is done using Haar cascades, and the iris is detected using the method described in [18].
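The project's vision code is written in Python, but the face-tracking loop is small enough to sketch here in OpenCV's C++ API, which mirrors the Python calls one-to-one. The cascade file is the stock one shipped with OpenCV; the camera index is an assumption.

// Haar-cascade face tracking, equivalent to the Python version.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
  cv::VideoCapture cap(0);  // default camera (assumed index)
  cv::CascadeClassifier faceCascade;
  faceCascade.load("haarcascade_frontalface_default.xml");  // stock cascade

  cv::Mat frame, gray;
  while (cap.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    std::vector<cv::Rect> faces;
    faceCascade.detectMultiScale(gray, faces);  // detect all faces in view
    for (const cv::Rect& r : faces) {
      cv::rectangle(frame, r, cv::Scalar(0, 255, 0), 2);  // mark each face
    }
    cv::imshow("Charlie's view", frame);
    if (cv::waitKey(30) == 27) break;  // Esc quits
  }
  return 0;
}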



TABLE VI
THE VALUES OF THE LINEAR ACTUATORS TO PRODUCE DIFFERENT EXPRESSIONS

Expression | Pin | Angle
Happy | 23 | 180
Happy | 25 | 180
Happy | 37 | 180
Happy | 39 | 180
Happy | 41 | 0
Anger | 27 | 0
Anger | 29 | 0
Anger | 31 | 0
Anger | 33 | 0
Anger | 35 | 180
Surprise | 27 | 180
Surprise | 29 | 180
Surprise | 31 | 180
Surprise | 33 | 180
Surprise | 35 | 0

Fig. 39. The communication unit for higher level communication.

X. POSSIBLE APPLICATIONS

Such a robot could have multiple uses:

Telenoid: the next step beyond Skype, where the caller's emotions are captured and transferred onto the robotic head, so loved ones feel closer to you.
A navigator with a human face that sits next to you in the car, communicates the route, and with which you can share your personal feelings, in place of a dull voice-only unit.
A robot that understands your feelings and emotions and helps you get out of stress.
Recording the incidents around a person and learning from them, to create a complete digital copy of that person which can live on and develop its own personality, thereby extending human life.
Taking care of the elderly, emotionally and physically.
Acting as a personal assistant or even a companion.

XI. CONCLUSION

Fig. 37. The linear actuator values for the happy expression, with the pin numbers of all the linear actuators.

2) Speech processing: The PyAudio library is used to transfer speech from the microphone to the speaker, and the pyttsx engine to produce speech from text.

C. Higher level communication between all the components

We devised a higher level protocol based on packets of the form shown in Fig. 38. The first character is a control number, followed by the data payload, followed by a packet terminator character. Control number one is reserved for deciding the mode: if mode = 0 the robot operates in autonomous mode, and if mode = 1 it operates in Skype mode. Control numbers 2 to 4 are used for sending the actuator values and reading the sensors, as illustrated in Fig. 39.
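On the Arduino side, such packets can be parsed with a loop of the following shape. This is a sketch, not the project's exact code: the newline terminator, buffer size and switch labels are assumptions built on the description above.

// Sketch of a packet parser for the higher level protocol.
const int BUF_LEN = 32;
char buf[BUF_LEN];
int len = 0;

void handlePacket(char control, const char* payload) {
  switch (control) {
    case '1': /* payload "0" = autonomous mode, "1" = Skype mode */ break;
    case '2': /* actuator values for the linear actuators */ break;
    case '3': /* actuator values to forward to the Uno */ break;
    case '4': /* sensor read request: reply with sensor values */ break;
  }
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  while (Serial.available()) {
    char c = Serial.read();
    if (c == '\n') {                               // packet terminator (assumed)
      buf[len] = '\0';
      if (len > 0) handlePacket(buf[0], buf + 1);  // control char, then payload
      len = 0;
    } else if (len < BUF_LEN - 1) {
      buf[len++] = c;                              // accumulate payload bytes
    }
  }
}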

We made a human head that can look, hear and talk. It senses people touching it and responds, it can move its eyes, eyelids and jaw, and it can make facial expressions such as anger, surprise and happiness. We thus built a basic human head with core functionality that can serve as a platform on which AI software can be built.

REFERENCES
[1] C. Breazeal, "Sociable Machines: Expressive Social Exchange Between Humans and Robots," PhD thesis, Department of Electrical Engineering and Computer Science, MIT, 2000.
[2] S. Nishio, H. Ishiguro, and N. Hagita, "Geminoid: Teleoperated Android of an Existing Person," Vienna, Austria: I-Tech Education and Publishing, 2007.
[3] TJ Animatronic Puppet, http://tjrobot.weebly.com/.
[4] G. Robles-De-La-Torre and V. Hayward, "Force can overcome object geometry in the perception of shape through active touch," Nature, vol. 412, pp. 445-448, July 2001.
[5] R. S. Dahiya, G. Metta, M. Valle, and G. Sandini, "Tactile sensing: from humans to humanoids," IEEE Transactions on Robotics, vol. 26, pp. 1-20, Feb. 2010.
[6] Arduino capacitive sensing, http://playground.arduino.cc//Main/CapacitiveSensor?from=Main.CapSense.
[7] Arduino piezo electric sensing, http://arduino.cc/en/Tutorial/Knock.

Fig. 38. The communication unit for higher level communication.



[8] D. Majoe et al., "Pneumatic air muscle and pneumatic sources for light weight autonomous robots," in 2011 IEEE International Conference on Robotics and Automation, pp. 3243-3250, May 2011.
[9] A. Nespoli, S. Besseghini, S. Pittaccio, E. Villa, and S. Viscuso, "The high potential of shape memory alloys in developing miniature mechanical devices: A review on shape memory alloy mini-actuators," vol. 158, pp. 149-160, May 2010.
[10] J. D. W. Madden et al., "Artificial muscle technology: Physical principles and naval prospects," vol. 29, pp. 706-728, July 2004.
[11] H. Kobayashi and F. Hara, "Study on face robot for active human interface: mechanisms of face robot and expression of 6 basic facial expressions," pp. 276-281, 1993.
[12] T. Wu, N. J. Butko, P. Ruvolo, M. S. Bartlett, and J. R. Movellan, "Learning to make facial expressions," in 2009 IEEE 8th International Conference on Development and Learning, pp. 1-6, 2009.
[13] Induino site, http://induino.blogspot.co.uk/.
[14] Induino specification site, http://www.atmel.com/devices/atmega328p.aspx?tab=parameters.
[15] Ultra micro servo 1.7 g for 3D flight, http://www.hobbyking.com/hobbyking/store/11737 HobbyKing Ultra Micro Servo 1 7g for 3D Flight Left .html.
[16] I2C Wikipedia page, http://en.wikipedia.org/wiki/I
[17] OpenCV official site, http://opencv.willowgarage.com/wiki/.
[18] J.-G. Wang, E. Sung, and R. Venkateswarlu, "Eye gaze estimation from a single image of one eye," in Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV 2003), 2003.



APPENDIX

A. I2C Code
Listing 1. I2C master-slave communication Arduino code

Master code:

// Wire Master Code
// by Sriram Kumar K
// Created 29 March 2013
// This example code is in the public domain.

#include <Wire.h>

byte x = 0;

void setup() {
  Wire.begin();         // join the I2C bus as master
  Serial.begin(9600);
}

void loop() {
  int slaveno = 2;
  int dataLength = 19;  // bytes expected back from a slave
  sendData(slaveno);
  slaveno = 3;
  sendData(slaveno);
}

void sendData(int slaveno) {
  Wire.beginTransmission(slaveno);  // transmit to the given slave
  Wire.write("x is ");              // sends five bytes
  Wire.write(x);                    // sends one byte
  Wire.endTransmission();           // stop transmitting
  x++;
  delay(500);
}

void receiveData(int slaveno, int amountofData) {
  Wire.requestFrom(slaveno, amountofData);  // slave may send less than requested
  while (Wire.available()) {
    char c = Wire.read();  // receive a byte as character
    Serial.print(c);       // print the character
  }
  Serial.println();
  delay(500);
}



Slave code:

// Wire Slave Code
// by Sriram Kumar K
// Created 29 March 2013

#include <Wire.h>

void setup() {
  Wire.begin(2);                 // join the I2C bus with address #2
  Wire.onRequest(sendEvent);     // register request handler
  Wire.onReceive(receiveEvent);  // register receive handler
  Serial.begin(9600);
}

void loop() {
  delay(100);
}

void sendEvent() {
  Wire.write("message from slave 2");  // respond with the message expected by the master
}

void receiveEvent(int howMany) {
  while (1 < Wire.available()) {  // loop through all but the last byte
    char c = Wire.read();         // receive byte as a character
    Serial.print(c);              // print the character
  }
  int x = Wire.read();            // receive the last byte as an integer
  Serial.println(x);              // print the integer
}

B. Expression Code
Listing 2. Facial Expressions Arduino Code

// Linear Actuator Code to produce facial expressions
// by Devangini Patel

#include <Servo.h>

const int servoCount = 10;
const int dontMove = 200;  // sentinel: leave this actuator where it is
Servo muscles[servoCount];
int pos = 0;               // variable to store the servo position
int startPinNumber = 23;
boolean consider[servoCount] = {true, true, true, true, true, true, true, true, true, true};
int pinValues[servoCount] = {23, 25, 27, 29, 31, 33, 35, 37, 39, 41};
int initialValues2[servoCount] = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0};
int initialValues[servoCount]  = {180, 0, 90, 90, 90, 90, 90, 180, 180, 90};

int smileLength = 90;
// The last two entries were cut off in the original listing; dontMove is assumed here.
int smileEndValues[servoCount] = {0, 180, dontMove, dontMove, dontMove, dontMove, dontMove, 0, dontMove, dontMove};

int surpriseLength = 90;
int surpriseEndValues[servoCount] = {dontMove, dontMove, 0, 0, 0, 0, 180, dontMove, dontMove, dontMove};

int angerLength = 90;
// The final entry was cut off in the original listing; dontMove is assumed here.
int angerEndValues[servoCount] = {dontMove, dontMove, 180, 180, 180, 180, 0, dontMove, dontMove, dontMove};

void initialSetup() {
  for (int i = 0; i < servoCount; i++) {
    muscles[i].write(initialValues[i]);
  }
}

void doAction(int startValues[], int endValues[], int actionLength) {
  for (int j = 0; j < actionLength; j++) {  // original looped to smileLength; actionLength is intended
    for (int i = 0; i < servoCount; i++) {
      if (endValues[i] != dontMove) {
        int delta = endValues[i] - startValues[i];
        muscles[i].write(startValues[i] + j * delta / actionLength);  // linear interpolation
      }
    }
    delay(10);
  }
  delay(1000);
}

void simpleRun() {
  for (pos = 0; pos < 180; pos += 1) {  // goes from 0 degrees to 180 degrees in steps of 1 degree
    for (int j = 0; j < servoCount; j++)
      if (consider[j])
        muscles[j].write(pos);  // tell the servo to go to the position in variable pos
    delay(10);                  // wait 10 ms for the servos to reach the position
  }
  for (pos = 180; pos >= 0; pos -= 1) {  // goes from 180 degrees back to 0 degrees
    for (int j = 0; j < servoCount; j++)
      if (consider[j])
        muscles[j].write(pos);
    delay(10);
  }
}

void setup() {
  for (int i = 0; i < servoCount; i++)
    muscles[i].attach(pinValues[i]);
  initialSetup();
}

void loop() {
  simpleRun();
  doAction(initialValues, smileEndValues, smileLength);
  doAction(initialValues, surpriseEndValues, surpriseLength);
  doAction(initialValues, angerEndValues, angerLength);
  initialSetup();
}

C. Circuit Designs



Fig. 40. The electric circuit design for the sensors.



Fig. 41. The electric circuit design for the actuators.
