
An Integrative Computational Model of Emotions

Luis-Felipe Rodríguez¹, Felix Ramos¹, and Gregorio García²

¹ Department of Computer Science, Cinvestav Guadalajara, Mexico
{lrodrigue,framos}@gdl.cinvestav.mx
² Department of Neuropsychology, Benemérita Universidad Autónoma de Puebla
{gregorio.garcia}@correo.buap.mx

Abstract. In this paper we propose a computational model of emotions designed to provide autonomous agents with mechanisms for affective processing. We present an integrative framework as the underlying architecture of this computational model, which enables the unification of theories explaining the different facets of the human emotion process and promotes the interaction between cognitive and affective functions. This proposal is inspired by recent advances in the study of human emotions in disciplines such as psychology and neuroscience.

Keywords: Computational Models of Emotions, Integrative Frameworks, Cognitive and Affective Functions, Autonomous Agents.

1 Introduction

Computational Models of Emotions (CMEs) are software systems designed to synthesize the operations and architectures of the components that constitute the process of human emotions. These models include mechanisms for the evaluation of stimuli, the elicitation of emotions, and the generation of emotional responses [13]. CMEs are useful in several areas. In disciplines such as psychology and neuroscience they serve as testbeds for the evaluation, completion, and improvement of theoretical models [1]. In the field of artificial intelligence, CMEs are usually incorporated into agent architectures, enabling autonomous agents (AAs) to recognize and simulate emotions and execute emotionally driven responses [6].
The development of CMEs has been primarily driven by advances in the study of human emotions. In particular, most CMEs are inspired by psychological theories [2, 7, 12–14, 28], which explain the process of emotions from a functional perspective, investigating the inputs, outputs, and behaviors of the components that constitute this process. Similarly, disciplines such as cognitive and affective neuroscience have recently reported significant progress in explaining the internal mechanisms underlying human emotions [4, 25]. The theories originated in these fields investigate emotions in terms of brain functions, brain structures, and neural pathways, providing a deeper understanding of the components involved in the processing of emotions. Unfortunately, the related literature reports few CMEs whose development is based on this type of evidence [2, 28].
S. D'Mello et al. (Eds.): ACII 2011, Part II, LNCS 6975, pp. 272–279, 2011.
© Springer-Verlag Berlin Heidelberg 2011



Regardless of the theoretical approach taken into account in the development of CMEs, most theories have constraints that cause variability in the internal design of these computational models. For example, due to the complexity of the study of human emotions, theoretical models focus on particular facets of the emotion process, which are explained at various levels of abstraction and from different perspectives. Furthermore, the number and type of the components that constitute the emotion process, and the definitions of the concepts used to describe affective processes, differ widely among theories. Nevertheless, theoretical models complement each other to allow the development of functional CMEs [24].
These and other issues discussed below have to be addressed in the development of CMEs. The architectures of such computational models should provide proper environments for the unification of heterogeneous theories that explain the different facets of human emotions. In this paper, we propose a CME that incorporates an integrative and scalable architecture which promotes the interaction between affective and cognitive functions. This proposal is mainly inspired by Newell's ideas on unified theories of cognition [17] as well as by theoretical advances in the field of neuroscience.

2 Related Work

In general, CMEs summarize the emotion process in three phases: stimuli evaluation, emotion elicitation, and emotional responses. In the first phase, CMEs evaluate the emotional significance of perceived stimuli using a series of criteria. For example, MAMID [8] performs an assessment of stimuli in terms of their valence using appraisal variables such as expectation and novelty. In the second phase, this type of information is used to elicit particular emotions and determine their associated intensity. For example, WASABI [2] uses a three-dimensional space to decide which emotions will be elicited. In the last phase, emotions influence processes such as decision making [8], conversational skills [7], and facial expressions [2]. However, despite this general and well-accepted abstract cycle, CMEs address each of these phases in very different ways.
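The three-phase cycle can be sketched as a minimal pipeline. This is our own illustration, not the API of any of the cited models; the function names and the toy appraisal variables (valence, novelty, expectation) are assumptions chosen for clarity.

```python
from dataclasses import dataclass

@dataclass
class Stimulus:
    valence: float      # -1 (negative) .. 1 (positive)
    novelty: float      # 0 .. 1
    expectation: float  # 0 .. 1, how strongly the event was expected

def appraise(s: Stimulus) -> dict:
    """Phase 1: evaluate the emotional significance of a stimulus."""
    return {"valence": s.valence, "surprise": s.novelty * (1.0 - s.expectation)}

def elicit(appraisal: dict) -> tuple[str, float]:
    """Phase 2: map appraisal values to an emotion and an intensity."""
    if appraisal["valence"] >= 0:
        name = "joy" if appraisal["surprise"] < 0.5 else "surprise"
    else:
        name = "distress" if appraisal["surprise"] < 0.5 else "fear"
    intensity = min(1.0, abs(appraisal["valence"]) + appraisal["surprise"])
    return name, intensity

def respond(name: str, intensity: float) -> str:
    """Phase 3: the elicited emotion modulates the agent's response."""
    return f"express {name} with intensity {intensity:.2f}"

# An unexpected, highly novel, negative event elicits fear.
print(respond(*elicit(appraise(Stimulus(valence=-0.8, novelty=0.9, expectation=0.1)))))
```

Each CME differs precisely in how it fills in these three functions: which appraisal variables exist, how they map to emotions, and which behaviors the emotion modulates.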
One aspect that causes a marked difference among CMEs is the number and type of affective functions they take into account for the processing of emotions. Marsella and Gratch [14] consider mood as the unique affective modulator for the elicitation of emotions. Marinier et al. [12] implement a mechanism that includes emotions, mood, and feeling to determine the affective experience of AAs. Gebhard [7] integrates emotions, mood, and personality to achieve affective processing in conversational AAs. However, the operational and architectural roles that such affective factors play in each model also differ.
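The modulatory role such factors play can be illustrated with a toy mood filter: a slowly varying mood biases the intensity of each new emotion, and each emotion in turn nudges the mood. The decay constant and the ±25% scaling rule are assumptions made for this sketch and are not taken from any of the models above.

```python
class MoodFilter:
    """Toy mood modulator: congruent mood amplifies an emotion's
    intensity, incongruent mood dampens it, and each emotion pulls
    the mood toward its own valence."""

    def __init__(self, decay: float = 0.9):
        self.mood = 0.0    # -1 (negative) .. 1 (positive)
        self.decay = decay

    def modulate(self, valence: float, intensity: float) -> float:
        sign = 1.0 if valence >= 0 else -1.0
        # Congruent mood amplifies, incongruent dampens (at most +/-25%).
        biased = intensity * (1.0 + 0.25 * self.mood * sign)
        # Mood slowly drifts toward the valence of the current emotion.
        self.mood = self.decay * self.mood + (1.0 - self.decay) * valence
        return max(0.0, min(1.0, biased))

m = MoodFilter()
m.mood = 0.8                                   # agent is in a good mood
print(m.modulate(valence=1.0, intensity=0.5))  # positive emotion is amplified
```

Whether mood sits before appraisal, after elicitation, or in both places is exactly one of the architectural choices on which the cited CMEs disagree.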
Regardless of how CMEs address the phases described above and the number and type of affective processes they involve, their implementation is primarily inspired by psychological theories. The most commonly implemented approach is appraisal theory [7, 12, 14], which explains the elicitation and differentiation of emotions based on the relationship between individuals and their environment [19, 21]. However, most psychological theories lack the detail needed to fully


meet the requirements of a computational model, forcing developers to include additional working assumptions to achieve functional systems.
Regarding the labeling of emotions, while some CMEs are not interested in providing specific models for this procedure [12, 14], others focus on the elicitation of categorical emotions [5, 7]. The CMEs in the first class argue that the labeling of emotions depends on diverse factors such as culture and personality, which may or may not be considered in CMEs. Nevertheless, nearly all CMEs use the emotional labels included in the groups of basic and non-basic emotions.
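One common way to obtain a categorical label from a continuous affective state is a nearest-anchor lookup in PAD (pleasure-arousal-dominance) space, roughly in the spirit of the dimensional models used by WASABI [2] and Alma [7]. The anchor coordinates below are illustrative assumptions, not values taken from those models.

```python
import math

# Hypothetical PAD anchors for a few basic emotion labels.
ANCHORS = {
    "joy":     ( 0.8,  0.5,  0.4),
    "anger":   (-0.5,  0.6,  0.3),
    "fear":    (-0.6,  0.6, -0.4),
    "sadness": (-0.6, -0.3, -0.3),
    "relaxed": ( 0.6, -0.4,  0.2),
}

def label(p: float, a: float, d: float) -> str:
    """Return the emotion label whose anchor is closest to the PAD point."""
    return min(ANCHORS, key=lambda e: math.dist((p, a, d), ANCHORS[e]))

print(label(0.7, 0.4, 0.5))    # a point near the "joy" anchor
print(label(-0.5, 0.5, -0.5))  # a point near the "fear" anchor
```

Factors such as culture and personality can then be modeled simply as shifts of the anchors or of the agent's current PAD point, which is why some CMEs deliberately leave labeling outside the core model.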
There are several case studies in which CMEs have proven useful. In each case emotions modulate the verbal and non-verbal behavior of AAs differently. Moreover, although it is recognized that such behavior is carried out by cognitive functions in humans, CMEs are not fully committed to building coherent systems that integrate both affective and cognitive processing.
In summary, the operational and architectural variability in CMEs may be explained as follows. First, there is no universal, well-accepted theory explaining human emotions; consequently, CMEs are implemented according to the theories on which they are based. Second, they are mainly inspired by psychological theories, which lack the detail needed to fully implement a computational system and force the inclusion of a variety of subjective assumptions. Finally, since each CME is designed for a specific purpose, they must meet different requirements which restrict the elements to be included in their design. Furthermore, the design of CMEs has been restricted by two conditions. First, as many of them have been integrated in cognitive frameworks [2, 12], they are required to meet specific constraints imposed by these models. Second, although other stand-alone CMEs have been proposed, they are developed to process affective information to modulate specific cognitive functions [5, 7], thus limiting the development of more comprehensive models of emotion. See Table 1 for a concise analysis of the issues discussed here.

3 Proposed Model

This section covers the two concepts that comprise our proposal, which aim to address the issues identified previously. We first introduce an integrative framework intended to unify affective and cognitive models, and then present the main characteristics and properties of the proposed CME.

3.1 Fundamentals for an Integrative Framework

Based on the previous review and advances in the study of human emotions, we identify two issues that must be considered in the development of CMEs:
1. Integrative and scalable architecture: CMEs must incorporate frameworks that consistently allow the unification of theories and models that explain the diverse components of human emotions. Although most CMEs can be seen as integrative frameworks, they usually include only those aspects that allow them to meet their design objectives. Moreover, they are not committed to the construction of suitable environments for the steady incorporation of new findings about this human function, which is an essential requirement now that the brain mechanisms underlying emotions are beginning to be revealed [3].
2. Consistent model for the interaction between affective and cognitive functions: although nearly all CMEs are aware of the antecedents and consequents of emotions (i.e., perception and motor action), in most cases they are not committed to establishing proper interfaces for the interaction between various affective and cognitive processes. In this sense, CMEs should be developed so that they properly handle both affective and cognitive data.

Table 1. Some major differences among CMEs

EMA [14]
Foundations: Appraisal theory by Smith and Lazarus [27]
Affective processes: Emotions and mood
Emotion labels: Surprise, Hope, Joy, Fear, Sadness, Anger, and Guilt
Effects of emotions: Agent's expressions, attentional processes, beliefs, desires, and intentions
Case studies: Decision-making in virtual humans developed for training purposes

Mamid [8]
Foundations: Diverse psychological [21] and neuropsychological [4] theories
Affective processes: Emotions and personality
Emotion labels: Anxiety/fear, Anger/aggression, Negative affect (sadness, distress), and Positive affect (joy, happiness)
Effects of emotions: Goal and action selection
Case studies: Virtual humans for training and psychotherapy environments [9]

PEACTIDM [12]
Foundations: Appraisal theory by Scherer [23] and physiological concepts of feelings by Damasio [4]
Affective processes: Emotions, mood, and feelings
Emotion labels: Implements the model by Scherer [23] for the mapping of appraisal dimension values to specific modal emotions
Effects of emotions: General cognitive behavior
Case studies: Goal-directed autonomous agents

WASABI [2]
Foundations: Appraisal theory by Scherer [23], PAD space by Mehrabian [16], and physiological concepts by Damasio [4]
Affective processes: Emotions and mood
Emotion labels: Primary emotions: Angry, Annoyed, Bored, Concentrated, Depressed, Fearful, Happy, Sad, Surprised. Secondary emotions: Hope, Fears-confirmed, Relief
Effects of emotions: Facial expressions, involuntary behaviors such as breathing, and voluntary behaviors such as verbal expressions
Case studies: Emotional expressions and responses in virtual players

Flame [5]
Foundations: Appraisal theory by Ortony et al. [19] and Roseman et al. [21]
Affective processes: Emotions, motivational states, and mood
Emotion labels: Joy, Sad, Disappointment, Relief, Hope, Fear, Pride, Shame, Reproach, and Admiration. Complex emotions: Anger (sad + reproach), Gratitude (joy + admiration), Gratification (joy + pride), and Remorse (sad + shame)
Effects of emotions: Action selection
Case studies: Decision-making in virtual pets showing believable behavior

Cathexis [28]
Foundations: Diverse appraisal theories [26, 27] and psychological personality models [15]
Affective processes: Emotions, drives, mood, and personality
Emotion labels: Primary emotions: Anger, Fear, Sadness/Distress, Enjoyment/Happiness, Disgust, and Surprise. Handles secondary emotions but does not provide an explicit model for labeling them
Effects of emotions: Agent's expressivity such as facial expressions and body postures; cognitive processes such as perception, memory, and action selection
Case studies: Decision-making in virtual and physical agents

Alma [7]
Foundations: Appraisal model by Ortony et al. [19], the Five Factor Model of personality [15], and the PAD Temperament space by Mehrabian [16]
Affective processes: Emotions, mood, and personality
Emotion labels: Admiration, Anger, Disliking, Disappointment, Distress, Fear, Fears Confirmed, Gloating, Gratification, Gratitude, Happy For, Hate, Hope, Joy, Liking, Love, Pity, Pride, Relief, Remorse, Reproach, Resentment, Satisfaction, Shame
Effects of emotions: Verbal and non-verbal expressions such as wording, length of phrases, and facial expressions; cognitive processes such as decision-making
Case studies: Embodied conversational agents
A coherent framework addressing these issues will undoubtedly change the way in which CMEs are developed, leading to significant advances in the field of affective computing. In addition, these two aspects encourage the creation of CMEs that not only present certain differences with respect to their operational and architectural assumptions, but that incorporate frameworks which allow the proper integration of the various components of the human emotion process.
We consider evidence from psychology and neuroscience to create an integrative framework that addresses these two issues; this framework represents the underlying architecture of the CME we propose in section 3.2. The integrative framework is designed on the basis of a multiprocess and multilevel perspective, as shown in figure 1 [18]. It considers several brain functions, areas, and nuclei involved in the processing of emotions [11, 20]. In figure 1, level two consists of a series of abstract models of cognitive and affective functions, such as mood and personality, which interact in order to achieve the dynamics of emotions [22]. These abstract models comprise various architectural components (at level three) whose collaborative work produces their corresponding behavior. While the modules in level two take the role of brain functions, the modules in level three simulate the operations and architectures of brain structures.

Fig. 1. Integrative framework for a CME
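A hypothetical skeleton of this two-level organization can clarify the idea: a level-two module exposes the behavior of a brain function, and delegates to level-three components that stand in for brain structures. All class names, the chosen structures, and the toy amplification rule are our own illustrative assumptions, not part of the proposed model's specification.

```python
class BrainStructure:
    """Level three: simulates the operation of a single brain structure."""
    def __init__(self, name: str):
        self.name = name

    def process(self, signal: float) -> float:
        return signal  # placeholder operation

class Amygdala(BrainStructure):
    def process(self, signal: float) -> float:
        # Toy rule: fast amplification of threat-related signals, capped at 1.
        return min(1.0, 2.0 * signal)

class BrainFunction:
    """Level two: an abstract cognitive/affective function realized by the
    collaborative work of its level-three components."""
    def __init__(self, components: list[BrainStructure]):
        self.components = components

    def run(self, signal: float) -> float:
        for c in self.components:
            signal = c.process(signal)
        return signal

# An "emotion elicitation" function built from hypothetical structures.
elicitation = BrainFunction([Amygdala("amygdala"), BrainStructure("hypothalamus")])
print(elicitation.run(0.3))
```

The point of the separation is scalability: a new theory about one structure replaces a single level-three component, while the level-two interface, and every cognitive model that interacts with it, stays unchanged.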


In this manner, we can establish the following hypotheses. The proposed CME will incorporate an integrative and scalable architecture, since traditional and modern theories explaining the diverse facets of human emotions can be implemented in this framework by using the structural and operational basis within it. Similarly, cognitive models explained in terms of these two levels can be implemented using the same structural and operational constraints used for implementing affective processes. This approach helps reduce and validate the working assumptions that are induced by psychological theories and included in CMEs to achieve functional systems. Finally, neuroscience is increasingly able to explain many processes that are common to all individuals [25], allowing us to address the emotion process by implementing the core mechanisms from which emotional behavior emerges [10].
3.2 Design of the CME

In this section we describe the characteristics and properties of the CME we are proposing, which is based on the integrative framework described above. The following list sets out the requirements for the construction of a functional model of emotions, which is composed of modules that simulate brain functions according to level two in figure 1. This functional model, in a top-down approach, will serve as the starting point for the design and implementation of the architectural components in level three.
Dynamics of Emotion: the emotion process follows a cycle in which every emotionally charged stimulus impacts the agent's emotional state.
Emotion Regulation: emotions are re-evaluated when the contextual conditions in which they were first generated have changed or are no longer appropriate for implementing the corresponding emotional responses.
Environment Adaptability: an element in the environment is appraised by the agent differently at different times.
Affective and Cognitive Processes: emotions involve a dynamic interplay between diverse cognitive and affective phenomena.
Internal and External Emotion Elicitors: the dynamics of emotions is driven by the assessment of both external and internal stimuli.
Labeling of Emotions: a correlation is needed between the emotional state of the agent and an emotional label.
Emotion Intensity: emotions are generated with an associated intensity, which is necessary to modulate the agent's emotion-induced responses.
Emotional Reactions: emotions always generate verbal and non-verbal behavior.
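Several of these requirements (intensity, labeling, regulation, and internal/external elicitors) can be captured in a minimal agent interface. This sketch is our own illustration of the requirements list, not the proposed CME itself; every name and numeric choice in it is an assumption.

```python
from dataclasses import dataclass, field

@dataclass
class Emotion:
    name: str           # Labeling of Emotions
    intensity: float    # Emotion Intensity, in [0, 1]

@dataclass
class EmotionalAgent:
    state: list = field(default_factory=list)

    def perceive(self, stimulus: float, internal: bool = False) -> Emotion:
        # Internal and External Emotion Elicitors: both enter the same cycle.
        name = "joy" if stimulus >= 0 else "distress"
        e = Emotion(name, min(1.0, abs(stimulus)))
        self.state.append(e)  # Dynamics of Emotion: every stimulus updates the state.
        return e

    def regulate(self, context_changed: bool) -> None:
        # Emotion Regulation: re-evaluate when the eliciting context no longer holds.
        if context_changed:
            for e in self.state:
                e.intensity *= 0.5  # toy down-regulation rule

agent = EmotionalAgent()
agent.perceive(-0.9)
agent.regulate(context_changed=True)
print(agent.state[0].intensity)
```

In the full model, each of these methods would be realized by the level-two modules of the integrative framework rather than by the inline rules used here.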

4 Conclusions and Future Work

Computational models of emotion are highly influenced by theoretical models that investigate the actual process of emotions. These theories address particular facets of this human function and explain them using different levels of abstraction and from different perspectives. Most CMEs are based on psychological approaches, which lack the detail needed to fully develop computational models. On this basis, we proposed an integrative framework for unifying theories explaining emotions in terms of brain functions, brain structures, and neural pathways, which provides a convenient environment for the steady inclusion of evidence about the functioning of human emotions and allows the proper interaction between cognitions and emotions. We also presented the main characteristics of a CME to be implemented in this integrative framework.
Future research focuses on the following tasks:
Identification of the cognitive-affective processes that meet the characteristics and properties of the proposed CME (functional model at level two).
Identification of the brain structures and interactions that underlie the processes in the functional model (architectural model at level three).
Formalization of the internal operations of the CME.
Analysis of computational techniques for implementing each process in the CME.
Implementation of the CME.
Development of case studies and evaluations.
Acknowledgments. The authors would like to acknowledge the PhD scholarship (CONACYT grant No. 229386) sponsored by the Mexican Government for its partial support of this work. We would like to thank the anonymous reviewers for their valuable comments and suggestions.

References
1. Armony, J.L., Servan-Schreiber, D., Cohen, J.D., LeDoux, J.E.: Computational modeling of emotion: explorations through the anatomy and physiology of fear conditioning. Trends in Cognitive Sciences 1(1), 28–34 (1997)
2. Becker-Asano, C., Wachsmuth, I.: Affective computing with primary and secondary emotions in a virtual human. Autonomous Agents and Multi-Agent Systems 20(1), 32–49 (2010)
3. Dalgleish, T., Dunn, B.D., Mobbs, D.: Affective neuroscience: Past, present, and future. Emotion Review 1(4), 355–368 (2009)
4. Damasio, A.R.: Descartes' Error: Emotion, Reason, and the Human Brain, 1st edn. Putnam Grosset Books, New York (1994)
5. El-Nasr, M.S., Yen, J., Ioerger, T.R.: Flame: fuzzy logic adaptive model of emotions. Autonomous Agents and Multi-Agent Systems 3(3), 219–257 (2000)
6. Fellous, J.-M., Arbib, M.A.: Who needs emotions?: the brain meets the robot. Oxford University Press, Oxford (2005)
7. Gebhard, P.: Alma: a layered model of affect. In: Proceedings of the International Conference on Autonomous Agents and Multiagent Systems, pp. 29–36 (2005)
8. Hudlicka, E.: This time with feeling: Integrated model of trait and state effects on cognition and behavior. Applied Artificial Intelligence: An International Journal 16(7-8), 611–641 (2002)
9. Hudlicka, E.: A computational model of emotion and personality: Applications to psychotherapy research and practice. In: Proceedings of the 10th Annual CyberTherapy Conference: A Decade of Virtual Reality (2005)
10. Lane, R.D., Nadel, L., Allen, J.J.B., Kaszniak, A.W.: The study of emotion from the perspective of cognitive neuroscience. In: Lane, R.D., Nadel, L. (eds.) Cognitive Neuroscience of Emotion. Oxford University Press, New York (2000)
11. LeDoux, J.E., Phelps, E.A.: Emotion networks in the brain. In: Lewis, M., Haviland-Jones, J.M. (eds.) Handbook of Emotions, pp. 159–179. Guilford Press, New York (2000)
12. Marinier, R.P., Laird, J.E., Lewis, R.L.: A computational unification of cognitive behavior and emotion. Cognitive Systems Research 10(1), 48–69 (2009)
13. Marsella, S., Gratch, J., Petta, P.: Computational models of emotion. In: Scherer, K.R., Bänziger, T., Roesch, E.B. (eds.) Blueprint for Affective Computing: A Source Book, 1st edn. Oxford University Press, Oxford (2010)
14. Marsella, S.C., Gratch, J.: EMA: A process model of appraisal dynamics. Cognitive Systems Research 10(1), 70–90 (2009)
15. McCrae, R.R., John, O.P.: An introduction to the five-factor model and its applications. Journal of Personality 60(2), 175–215 (1992)
16. Mehrabian, A.: Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Current Psychology 14(4), 261–292 (1996)
17. Newell, A.: Unified Theories of Cognition. Harvard University Press, Cambridge (1990)
18. Ochsner, K.N., Barrett, L.F.: A multiprocess perspective on the neuroscience of emotion. In: Mayne, T.J., Bonanno, G.A. (eds.) Emotions: Current Issues and Future Directions, pp. 38–81. Guilford Press, New York (2001)
19. Ortony, A., Clore, G.L., Collins, A.: The Cognitive Structure of Emotions. Cambridge University Press, Cambridge (1990)
20. Phan, K.L., Wager, T.D., Taylor, S.F., Liberzon, I.: Functional neuroimaging studies of human emotions. CNS Spectrums 9(4), 258–266 (2004)
21. Roseman, I.J., Spindel, M.S., Jose, P.E.: Appraisals of emotion-eliciting events: Testing a theory of discrete emotions. Journal of Personality and Social Psychology 59(5), 899–915 (1990)
22. Rusting, C.L.: Personality, mood, and cognitive processing of emotional information: Three conceptual frameworks. Psychological Bulletin 124(2), 165–196 (1998)
23. Scherer, K.R.: Appraisal considered as a process of multi-level sequential checking. In: Scherer, K.R., Schorr, A., Johnstone, T. (eds.) Appraisal Processes in Emotion: Theory, Methods, Research, pp. 92–120. Oxford University Press, New York (2001)
24. Scherer, K.R.: Emotion and emotional competence: conceptual and theoretical issues for modelling agents. In: Scherer, K.R., Bänziger, T., Roesch, E.B. (eds.) Blueprint for Affective Computing: A Source Book. Oxford University Press, Oxford (2010)
25. Shepherd, G.M.: Creating Modern Neuroscience: The Revolutionary 1950s. Oxford University Press, Oxford (2009)
26. Smith, C.A., Kirby, L.D.: Toward delivering on the promise of appraisal theory. In: Scherer, K.R., Schorr, A., Johnstone, T. (eds.) Appraisal Processes in Emotion. Oxford University Press, New York (2001)
27. Smith, C.A., Lazarus, R.S.: Emotion and adaptation. In: Pervin, L.A. (ed.) Handbook of Personality: Theory and Research, pp. 609–637. Guilford Press, New York (1990)
28. Velásquez, J.D.: Modeling emotions and other motivations in synthetic agents. In: Proceedings of the Fourteenth National Conference on Artificial Intelligence and Ninth Conference on Innovative Applications of Artificial Intelligence, pp. 10–15 (1997)
