RUI SILVA*
University of Minho, School of Engineering, Centre ALGORITMI
Supervisors: Estela Bicho, Pedro Branco
* rsilva@dei.uminho.pt
Motivation

● Today’s robots are still far from being able to interact with humans in a natural and human-like way (Breazeal, 2003; Fong et al., 2003);
● Interact actively with humans, rather than being used as mere tools;
● Focus on non-verbal communication as an essential component of everyday social interaction;
● The same action with a different facial expression may have a different interpretation;
● The same facial expression can have different interpretations depending on the context;
● Different facial expressions may have the same underlying reason (depending, for example, on the person with whom the robot interacts).
● We humans continuously monitor the actions and the facial expressions of our partners, and effortlessly interpret them in terms of:
  ● their intention;
  ● their emotional/mental state.

Objectives

● Integrate the ability to read the partner’s emotional states, from:
  ● facial expressions;
  ● movement kinematics.
● Create socially intelligent robots capable of interacting with humans in a human-like way, and thus able to play the role of:
  ● adaptive servants;
  ● personal assistants.

Two important (high-level) cognitive skills are required:
● understanding human actions;
● understanding/interpreting emotional states.

The existing cognitive control architecture allows the robot to:
● read motor intentions;
● implement a flexible mapping from action observation (arm-hand gestures of the human partner, contextual information and shared task knowledge) onto action execution (the robot’s overt behaviour).
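Very schematically, this observation-to-execution mapping can be sketched as a lookup from an observed gesture plus context onto a complementary behaviour. The gesture, context, and action names below are hypothetical placeholders, and the actual architecture realizes the mapping with coupled dynamic neural fields rather than a table — this is only an illustration of the flexibility being described:

```python
from typing import NamedTuple

class Observation(NamedTuple):
    gesture: str   # recognized arm-hand gesture of the human partner
    context: str   # contextual information (e.g. state of the shared task)

# Shared task knowledge: which complementary behaviour fits each situation.
# Note that the same gesture maps to different actions in different contexts.
TASK_KNOWLEDGE = {
    Observation("reach_for_wheel", "wheel_not_mounted"): "hand_over_wheel",
    Observation("reach_for_wheel", "wheel_mounted"): "hold_axle_steady",
    Observation("hold_out_hand", "part_available"): "place_part_in_hand",
}

def select_action(obs: Observation) -> str:
    """Map an action observation onto the robot's overt behaviour."""
    return TASK_KNOWLEDGE.get(obs, "observe_and_wait")

print(select_action(Observation("reach_for_wheel", "wheel_mounted")))
# → hold_axle_steady
```

The point of the table is that the complementary behaviour depends jointly on the observed gesture and the task context, not on the gesture alone.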
● Increase the acceptance of the robot by allowing a fluent adjustment of complementary behaviours in human-robot collaborative tasks, without the need for explicit communication.

The cognitive control architecture is multi-layered, with each of the layers formalized by a dynamic neural field (DNF).
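The behaviour of a single such layer can be illustrated with a minimal one-dimensional Amari-type dynamic neural field simulation. The parameters, kernel, and stimulus below are illustrative assumptions for the sketch, not values from the actual architecture:

```python
import numpy as np

def simulate_dnf(steps=300, dt=1.0, tau=10.0, h=-2.0):
    """Euler integration of a 1-D Amari dynamic neural field:
        tau * du/dt = -u + h + S(x) + (w * f(u))(x)
    Illustrative parameters only (not the poster's actual model)."""
    x = np.linspace(-40.0, 40.0, 161)       # field dimension (e.g. a motor parameter)
    dx = x[1] - x[0]
    # interaction kernel: local excitation with surround inhibition
    w = 4.0 * np.exp(-x**2 / (2 * 3.0**2)) - 1.5 * np.exp(-x**2 / (2 * 9.0**2))
    # localized input, e.g. evidence for one observed gesture, centred at x = 10
    S = 6.0 * np.exp(-(x - 10.0)**2 / (2 * 2.0**2))
    f = lambda u: 1.0 / (1.0 + np.exp(-5.0 * u))   # sigmoidal output nonlinearity
    u = np.full_like(x, h)                  # field starts at the resting level h
    for _ in range(steps):
        interaction = dx * np.convolve(f(u), w, mode="same")
        u += (dt / tau) * (-u + h + S + interaction)
    return x, u

x, u = simulate_dnf()
peak = x[np.argmax(u)]   # a self-stabilized activation peak forms near x = 10
```

A suprathreshold peak at the stimulated site encodes the layer's current "decision"; in a multi-layer DNF architecture, each layer is such a field, typically driven by the output f(u) of the layers it is coupled to.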
The overall aim is thus to advance towards more socially intelligent robots, which implies endowing the robot with additional cognitive skills: understanding actions and facial expressions, and behaving accordingly.

Robotic platform

ARoS (Anthropomorphic Robot System), an anthropomorphic robot equipped with:
● static torso;
● 7 DOF arm;
● 4 DOF dexterous hand;
● pan-tilt unit;
● stereo vision system.
(Silva et al., 2008)

References

● Breazeal, C. (2003). Toward sociable robots. Robotics and Autonomous Systems, 42(3-4), 167-175. Elsevier.
● Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems. Elsevier.
● Blair, R. J. (2003). Facial expressions, their communicatory functions and neuro-cognitive substrates. Philosophical Transactions of the Royal Society B: Biological Sciences, 358(1431), 561. The Royal Society.
● Silva, R., Bicho, E., & Erlhagen, W. (2008). ARoS: An Anthropomorphic Robot for Human-Robot Interaction and Coordination Studies. In Proceedings of the CONTROLO'2008 Conference - 8th Portuguese Conference on Automatic Control (pp. 819-826). Vila Real, Portugal: UTAD.
● Bicho, E., Louro, L., Hipólito, N., & Erlhagen, W. (2008). A dynamic neural field architecture for flexible and fluent human-robot interaction. In Proceedings of the 2008 International Conference on Cognitive Systems (pp. 179-185). Karlsruhe, Germany.
● Bicho, E., Louro, L., Hipólito, N., & Erlhagen, W. (2009). A dynamic field approach to goal inference and error monitoring for human-robot interaction. In E. Dautenhahn (Ed.), AISB Convention 2009 on Adaptive & Emergent Behaviour & Complex Systems: Proceedings of the International Symposium on New Frontiers in Human-Robot Interaction (pp. 31-37). Edinburgh, Scotland.
Engenharia para a Qualidade de Vida: SAÚDE, LAZER E AMBIENTE – Semana da Escola de Engenharia, 11-16 October 2010