Model of Emotions
Luis-Felipe Rodríguez, Félix Ramos, and Gregorio García
Introduction
Computational Models of Emotions (CMEs) are software systems designed to synthesize the operations and architectures of the components that constitute the process of human emotions. These models include mechanisms for the evaluation of stimuli, the elicitation of emotions, and the generation of emotional responses [13]. CMEs are useful in several areas. In disciplines such as psychology and neuroscience, they serve as testbeds for the evaluation, completion, and improvement of theoretical models [1]. In the field of artificial intelligence, CMEs are usually incorporated into agent architectures, enabling autonomous agents (AAs) to recognize and simulate emotions and to execute emotionally driven responses [6].
The development of CMEs has been primarily driven by advances in the study of human emotions. In particular, most CMEs are inspired by psychological theories [2, 7, 12–14, 28], which explain the process of emotions from a functional perspective, investigating the inputs, outputs, and behaviors of the components that constitute this process. Similarly, disciplines such as cognitive and affective neuroscience have recently reported significant progress in explaining the internal mechanisms underlying human emotions [4, 25]. The theories originated in these fields investigate emotions in terms of brain functions, brain structures, and neural pathways, providing a deeper understanding of the components involved in the processing of emotions. Unfortunately, the related literature reports few CMEs whose development is based on this type of evidence [2, 28].
S. D'Mello et al. (Eds.): ACII 2011, Part II, LNCS 6975, pp. 272–279, 2011.
© Springer-Verlag Berlin Heidelberg 2011
Related Work
In general, CMEs summarize the emotion process in three phases: stimuli evaluation, emotion elicitation, and emotional responses. In the first phase, CMEs evaluate the emotional significance of perceived stimuli using a series of criteria. For example, MAMID [8] performs an assessment of stimuli in terms of their valence using appraisal variables such as expectation and novelty. In the second phase, this type of information is used to elicit particular emotions and determine their associated intensity. For example, WASABI [2] uses a three-dimensional space to decide which emotions will be elicited. In the last phase, emotions influence processes such as decision making [8], conversational skills [7], and facial expressions [2]. However, despite this general and well-accepted abstract cycle, CMEs address each of these phases in very different ways.
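The three-phase cycle above can be illustrated with a minimal sketch. All names below (Stimulus, appraise, elicit, respond), the appraisal variables, and the thresholds are invented for illustration and are not taken from MAMID, WASABI, or any other cited model.

```python
# Hypothetical sketch of the three-phase CME cycle:
# stimuli evaluation -> emotion elicitation -> emotional response.
from dataclasses import dataclass

@dataclass
class Stimulus:
    valence: float      # pleasantness of the event, in [-1, 1]
    novelty: float      # how unexpected the event is, in [0, 1]

def appraise(s: Stimulus) -> dict:
    """Phase 1: evaluate the emotional significance of a perceived stimulus."""
    return {"valence": s.valence, "novelty": s.novelty}

def elicit(appraisal: dict) -> tuple[str, float]:
    """Phase 2: map appraisal values to an emotion label and an intensity."""
    if appraisal["valence"] >= 0:
        label = "joy" if appraisal["novelty"] < 0.5 else "surprise"
    else:
        label = "distress" if appraisal["novelty"] < 0.5 else "fear"
    intensity = abs(appraisal["valence"]) * (0.5 + appraisal["novelty"] / 2)
    return label, intensity

def respond(emotion: str, intensity: float) -> str:
    """Phase 3: select an emotionally driven response."""
    return f"express {emotion} with intensity {intensity:.2f}"

label, intensity = elicit(appraise(Stimulus(valence=-0.8, novelty=0.9)))
print(respond(label, intensity))  # -> express fear with intensity 0.76
```

Real CMEs differ precisely in how each of these three functions is realized, which is the point of the comparison that follows.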
One aspect that causes a marked difference among CMEs is the number and type of affective functions they take into account for the processing of emotions. Marsella and Gratch [14] consider mood as the only affective modulator for the elicitation of emotions. Marinier et al. [12] implement a mechanism that includes emotions, mood, and feelings to determine the affective experience of AAs. Gebhard [7] integrates emotions, mood, and personality to achieve affective processing in conversational AAs. Moreover, the operational and architectural roles that such affective factors play in each model also differ.
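The PAD (pleasure–arousal–dominance) space that models such as WASABI [2] and Alma [7] build on can be sketched as follows. The anchor coordinates and the nearest-neighbor labeling rule below are invented for this illustration and are not the parameters used by either model.

```python
import math

# Illustrative anchor points in PAD (pleasure, arousal, dominance) space;
# the coordinates are assumptions made for this sketch, not taken from [2] or [7].
ANCHORS = {
    "joy":     ( 0.8,  0.5,  0.4),
    "anger":   (-0.6,  0.6,  0.3),
    "fear":    (-0.6,  0.6, -0.4),
    "sadness": (-0.6, -0.4, -0.3),
}

def nearest_emotion(p: float, a: float, d: float) -> str:
    """Label a mood state with the nearest anchor emotion (Euclidean distance)."""
    return min(ANCHORS, key=lambda e: math.dist((p, a, d), ANCHORS[e]))

print(nearest_emotion(-0.5, 0.5, -0.5))  # -> fear
```

The design choice illustrated here is that emotion categories emerge from a continuous affective state rather than being triggered directly, which is what lets mood and personality modulate elicitation.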
Regardless of how CMEs address the phases described above and the number and type of affective processes they involve, their implementation is primarily inspired by psychological theories. The most implemented approach is appraisal theory [7, 12, 14], which explains the elicitation and differentiation of emotions based on the relationship between individuals and their environment [19, 21]. However, most psychological theories lack the detail needed to fully develop computational models.
Proposed Model
This section covers the two concepts that comprise our proposal, which aim to address the issues identified previously. We first introduce an integrative framework intended to unify affective and cognitive models, and then present the main characteristics and properties of the proposed CME.
3.1
Based on the previous review and advances in the study of human emotions, we identify two issues that must be considered in the development of CMEs:

1. Integrative and scalable architecture: CMEs must incorporate frameworks that consistently allow the unification of theories and models that explain the diverse components of human emotions. Although most CMEs can be
Table 1. Summary of the reviewed CMEs.

Flame [5]. Foundations: appraisal theory by Ortony et al. [19] and Roseman et al. [21]. Affective processes: emotions, motivational states, and mood. Emotion labels: Joy, Sad, Disappointment, Relief, Hope, Fear, Pride, Shame, Reproach, and Admiration; complex emotions: Anger (sad + reproach), Gratitude (joy + admiration), Gratification (joy + pride), and Remorse (sad + shame). Effects of emotions: action selection.

EMA [14]. Foundations: appraisal theory by Smith and Lazarus [27]. Affective processes: emotions and mood. Emotion labels: Surprise, Hope, Joy, Fear, Sadness, Anger, and Guilt. Effects of emotions: the agent's expressions, attentional processes, beliefs, desires, and intentions.

PEACTIDM [12]. Foundations: appraisal theory by Scherer [23] and physiological concepts of feelings by Damasio [4]. Affective processes: emotions, mood, and feelings. Emotion labels: implements the model by Scherer [23] for the mapping of appraisal dimension values to specific modal emotions. Effects of emotions: general cognitive behavior. Case studies: goal-directed autonomous agents.

WASABI [2]. Foundations: appraisal theory by Scherer [23], PAD space by Mehrabian [16], and physiological concepts by Damasio [4]. Affective processes: emotions and mood. Emotion labels: primary emotions: Angry, Annoyed, Bored, Concentrated, Depressed, Fearful, Happy, Sad, and Surprised; secondary emotions: Hope, Fears-confirmed, and Relief. Effects of emotions: facial expressions, involuntary behaviors such as breathing, and voluntary behaviors such as verbal expressions. Case studies: emotional expressions and responses in virtual players.

Cathexis [28]. Affective processes: emotions, drives, mood, and personality. Emotion labels: primary emotions: Anger, Fear, Sadness/Distress, Enjoyment/Happiness, Disgust, and Surprise; the model handles secondary emotions but does not provide an explicit model for labeling them. Effects of emotions: the agent's expressivity, such as facial expressions and body postures, and cognitive processes such as perception, memory, and action selection.

Alma [7]. Affective processes: emotions, mood, and personality. Emotion labels: Admiration, Anger, Disliking, Disappointment, Distress, Fear, Fears-confirmed, Gloating, Gratification, Gratitude, Happy-for, Hate, Hope, Joy, Liking, Love, Pity, Pride, Relief, Remorse, Reproach, Resentment, Satisfaction, and Shame. Effects of emotions: verbal and non-verbal expressions, such as wording, length of phrases, and facial expressions, and cognitive processes such as decision making. Case studies: embodied conversational agents.

MAMID [8]. Affective processes: emotions and personality. Emotion labels: Anxiety/fear, Anger/aggression, Negative affect (sadness, distress), and Positive affect (joy, happiness). Effects of emotions: goal and action selection. Case studies: virtual humans for training and psychotherapy environments [9].
seen as integrative frameworks, they usually include only those aspects that allow them to meet their design objectives. Moreover, they are not committed to the construction of suitable environments for the steady incorporation of new findings about this human function, which is an essential requirement now that the brain mechanisms underlying emotions are beginning to be revealed [3].
2. Consistent model for the interaction between affective and cognitive functions: although nearly all CMEs are aware of the antecedents and consequents of emotions (i.e., perception and motor action), in most cases they are not committed to establishing proper interfaces for the interaction between the various affective and cognitive processes. In this sense, CMEs should be developed so that they properly handle both affective and cognitive data.
A coherent framework addressing these issues will undoubtedly change the way in which CMEs are developed, leading to significant advances in the field of affective computing. In addition, these two aspects encourage the creation of CMEs that not only present certain differences with respect to their operational and architectural assumptions, but also incorporate frameworks that allow the proper integration of the various components of the human emotion process.
We consider evidence from psychology and neuroscience to create an integrative framework that addresses these two issues and represents the underlying architecture of the CME that we propose in section 3.2. This integrative framework is designed on the basis of a multiprocess and multilevel perspective, as shown in figure 1 [18]. It considers several brain functions, areas, and nuclei for the processing of emotions [11, 20]. In figure 1, level two consists of a series of abstract models of cognitive and affective functions, such as mood and personality, which interact in order to achieve the dynamics of emotions [22]. These abstract models comprise various architectural components (at level three) whose collaborative work produces their corresponding behavior. While the modules at level two take the role of brain functions, the modules at level three simulate the operations and architectures of brain structures.
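The two-level organization described above can be sketched in a few lines. The class names, the "elicitation" module, and the component names below are illustrative assumptions, not part of the proposed framework.

```python
# Minimal sketch of the multilevel perspective: level-two modules play the
# role of brain functions, each realized by level-three components that
# simulate the operations of brain structures.

class Component:
    """Level three: simulates the operation of a single brain structure."""
    def __init__(self, name: str):
        self.name = name

    def process(self, signal: str) -> str:
        # Wrap the signal to show this structure transformed it.
        return f"{self.name}({signal})"

class FunctionModule:
    """Level two: a brain function realized by collaborating components."""
    def __init__(self, name: str, components: list[Component]):
        self.name = name
        self.components = components

    def run(self, signal: str) -> str:
        # The module's behavior emerges from its components working together.
        for c in self.components:
            signal = c.process(signal)
        return signal

# A hypothetical "elicitation" function built from two structure components.
elicitation = FunctionModule("elicitation",
                             [Component("amygdala"), Component("ofc")])
print(elicitation.run("stimulus"))  # -> ofc(amygdala(stimulus))
```

The appeal of this decomposition is scalability: a new theory can be incorporated by adding or replacing level-three components without altering the level-two interfaces between functions.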
In this manner, we can establish the following hypotheses. The proposed CME will incorporate an integrative and scalable architecture, since traditional and modern theories explaining the diverse facets of human emotions can be implemented in this framework by using the structural and operational basis within it. Similarly, cognitive models explained in terms of these two levels can be implemented using the same structural and operational constraints used for implementing affective processes. This approach makes it possible to reduce and validate the working assumptions, induced by psychological theories, that CMEs introduce in order to achieve functional systems. Finally, neuroscience is increasingly able to explain many processes that are common to all individuals [25], allowing us to address the emotion process by implementing the core mechanisms from which emotional behavior emerges [10].
3.2
approaches, which lack the detail needed to fully develop computational models. On this basis, we proposed an integrative framework for unifying theories that explain emotions in terms of brain functions, brain structures, and neural pathways. This framework provides a convenient environment for the steady inclusion of evidence about the functioning of human emotions and allows the proper interaction between cognition and emotion. We also presented the main characteristics of a CME to be implemented in this integrative framework.
Future research focuses on the following tasks:

- Identification of the cognitive-affective processes that meet the characteristics and properties of the proposed CME (functional model at level two).
- Identification of the brain structures and interactions that underlie the processes in the functional model (architectural model at level three).
- Formalization of the internal operations of the CME.
- Analysis of computational techniques for implementing each process in the CME.
- Implementation of the CME.
- Development of case studies and evaluations.
Acknowledgments. The authors acknowledge the PhD scholarship (CONACYT grant No. 229386) sponsored by the Mexican Government for its partial support of this work. We would also like to thank the anonymous reviewers for their valuable comments and suggestions.
References
1. Armony, J.L., Servan-Schreiber, D., Cohen, J.D., LeDoux, J.E.: Computational modeling of emotion: explorations through the anatomy and physiology of fear conditioning. Trends in Cognitive Sciences 1(1), 28–34 (1997)
2. Becker-Asano, C., Wachsmuth, I.: Affective computing with primary and secondary emotions in a virtual human. Autonomous Agents and Multi-Agent Systems 20(1), 32–49 (2010)
3. Dalgleish, T., Dunn, B.D., Mobbs, D.: Affective neuroscience: Past, present, and future. Emotion Review 1(4), 355–368 (2009)
4. Damasio, A.R.: Descartes' Error: Emotion, Reason, and the Human Brain, 1st edn. Putnam Grosset Books, New York (1994)
5. El-Nasr, M.S., Yen, J., Ioerger, T.R.: Flame: fuzzy logic adaptive model of emotions. Autonomous Agents and Multi-Agent Systems 3(3), 219–257 (2000)
6. Fellous, J.-M., Arbib, M.A.: Who needs emotions?: the brain meets the robot. Oxford University Press, Oxford (2005)
7. Gebhard, P.: Alma: a layered model of affect. In: Proceedings of the International Conference on Autonomous Agents and Multiagent Systems, pp. 29–36 (2005)
8. Hudlicka, E.: This time with feeling: Integrated model of trait and state effects on cognition and behavior. Applied Artificial Intelligence: An International Journal 16(7-8), 611–641 (2002)
9. Hudlicka, E.: A computational model of emotion and personality: Applications to psychotherapy research and practice. In: Proceedings of the 10th Annual CyberTherapy Conference: A Decade of Virtual Reality (2005)
10. Lane, R.D., Nadel, L., Allen, J.J.B., Kaszniak, A.W.: The study of emotion from the perspective of cognitive neuroscience. In: Lane, R.D., Nadel, L. (eds.) Cognitive Neuroscience of Emotion. Oxford University Press, New York (2000)
11. LeDoux, J.E., Phelps, E.A.: Emotion networks in the brain. In: Lewis, M., Haviland-Jones, J.M. (eds.) Handbook of Emotions, pp. 159–179. Guilford Press, New York (2000)
12. Marinier, R.P., Laird, J.E., Lewis, R.L.: A computational unification of cognitive behavior and emotion. Cognitive Systems Research 10(1), 48–69 (2009)
13. Marsella, S., Gratch, J., Petta, P.: Computational models of emotion. In: Scherer, K.R., Bänziger, T., Roesch, E.B. (eds.) Blueprint for Affective Computing: A Source Book, 1st edn. Oxford University Press, Oxford (2010)
14. Marsella, S.C., Gratch, J.: EMA: A process model of appraisal dynamics. Cognitive Systems Research 10(1), 70–90 (2009)
15. McCrae, R.R., John, O.P.: An introduction to the five-factor model and its applications. Journal of Personality 60(2), 175–215 (1992)
16. Mehrabian, A.: Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Current Psychology 14(4), 261–292 (1996)
17. Newell, A.: Unified theories of cognition. Harvard University Press, Cambridge (1990)
18. Ochsner, K.N., Barrett, L.F.: A multiprocess perspective on the neuroscience of emotion. In: Mayne, T.J., Bonanno, G.A. (eds.) Emotions: Current Issues and Future Directions, pp. 38–81. Guilford Press, New York (2001)
19. Ortony, A., Clore, G.L., Collins, A.: The cognitive structure of emotions. Cambridge University Press, Cambridge (1990)
20. Phan, K.L., Wager, T.D., Taylor, S.F., Liberzon, I.: Functional neuroimaging studies of human emotions. CNS Spectrums 9(4), 258–266 (2004)
21. Roseman, I.J., Spindel, M.S., Jose, P.E.: Appraisals of emotion-eliciting events: Testing a theory of discrete emotions. Journal of Personality and Social Psychology 59(5), 899–915 (1990)
22. Rusting, C.L.: Personality, mood, and cognitive processing of emotional information: Three conceptual frameworks. Psychological Bulletin 124(2), 165–196 (1998)
23. Scherer, K.R.: Appraisal considered as a process of multi-level sequential checking. In: Scherer, K.R., Schorr, A., Johnstone, T. (eds.) Appraisal Processes in Emotion: Theory, Methods, Research, pp. 92–120. Oxford University Press, New York (2001)
24. Scherer, K.R.: Emotion and emotional competence: conceptual and theoretical issues for modelling agents. In: Scherer, K.R., Bänziger, T., Roesch, E.B. (eds.) Blueprint for Affective Computing: A Source Book. Oxford University Press, Oxford (2010)
25. Shepherd, G.M.: Creating modern neuroscience: the revolutionary 1950s. Oxford University Press, Oxford (2009)
26. Smith, C.A., Kirby, L.D.: Toward delivering on the promise of appraisal theory. In: Scherer, K.R., Schorr, A., Johnstone, T. (eds.) Appraisal Processes in Emotion. Oxford University Press, New York (2001)
27. Smith, C.A., Lazarus, R.S.: Emotion and adaptation. In: Pervin, L.A. (ed.) Handbook of Personality: Theory and Research, pp. 609–637. Guilford Press, New York (1990)
28. Velásquez, J.D.: Modeling emotions and other motivations in synthetic agents. In: Proceedings of the Fourteenth National Conference on Artificial Intelligence and Ninth Conference on Innovative Applications of Artificial Intelligence, pp. 10–15 (1997)