Introduction

Faces are central to social interaction: they carry much of the information we need about the people we interact with, such as their identity, mood, race, gender, and age. Because faces offer such a rich and abundant source of social stimuli, it is quite plausible that they are the main reason why perception plays a critically important role in the social interactions of a wide range of species. Face perception has also been one of the most distinct areas of cognitive science research for more than five decades, and a considerable number of studies have advanced our knowledge of the field.

It is known that many species recognize each other through olfaction or audition (Pascalis & Kelly, 2008), whereas primates do so mainly through vision and faces. Faces provide an observer with a great variety of information, which, once received, is classified into two major categories: face traits and face states. The traits category contains elements of the face that are largely permanent, such as gender, race, identity, or aesthetics. The face states category refers to elements that can change and are not permanent, such as emotional expressions or intentions (Lee, Eskritt, Symons, & Muir, 1998). It follows that viewing a face initiates two processes: categorization of the face as belonging (or not) to our species, and analysis of the face at the individual level. Over time, many studies have examined adults' ability to process faces, showing that their expertise enables them to distinguish between similar unfamiliar faces and to retain an intact memory for hundreds of faces over long periods (Bahrick, Bahrick, & Wittlinger, 1975). From these findings, two critical types of information for face processing emerge (Tanaka, in press). On the one hand, there is featural information, which concerns facial features that can be perceived in isolation; a distinction is made between internal features (the eyes, nose, and mouth) and external features (such as the hairstyle), with the internal features appearing to be more important (Tanaka & Farah, 1993). On the other hand, there is configural information, which refers to the spatial relationships between facial features (for example, the distances between the mouth, nose, and eyes).

Brain regions involved in face recognition.

In order to understand how face processing occurs, it was necessary to identify the brain regions involved and their functional roles. fMRI studies indicate that face perception depends on processes different from those underlying object perception (Duchaine & Yovel, 2008). Additionally, the existence of face-selective areas points to a specific neural locus for face processing.

Many reports indicate that faces are processed by different "systems" than objects, since brain damage can selectively impair face perception (Ellis & Florence, 1990; Hoff & Potzl, 1937). Further evidence comes from the observation that turning faces upside down reduces recognition performance far more than inverting any other class of object (Yin, 1969). Yin attributed this negative effect of inversion to the difficulty of extracting from inverted faces information such as the correct configuration of the face parts, an important component of recognition (holistic processing); in other words, holistic processing cannot operate properly on an inverted face. This is also supported by the experiment of Tanaka and Farah (1993), which provided further evidence for the difference between holistic face representation and object recognition (Young et al., 1987). Participants were asked to memorize a set of identities and then to recognize some of their facial features. Sometimes the feature to be recognized was presented in isolation, and sometimes within a face. The results showed better identification when the features were presented within the face.

A model of face processing was proposed in 2000, based on findings from human and monkey neuroimaging and neurophysiology; many of its parts build on the Bruce and Young face model (Haxby et al., 2000). According to this model, the core face processing system consists of the occipital face area (OFA), the fusiform face area (FFA), and the face-selective region of the posterior superior temporal sulcus (pSTS-FA), and these regions carry out the visual analysis of faces. The OFA is involved in the first stage and sends its output to the FFA, which handles stable characteristics of faces such as gender and identity. The OFA also provides input to the pSTS-FA, which represents changeable aspects of faces that are of great importance for social communication, such as expressions or movements. Reciprocal connections between these regions allow recurrent processing. The model further proposes that the information extracted from faces is handled by an extended face network. The areas of this network are connected to the core areas, and each of them processes a different type of information. The areas connected to the pSTS-FA include the intraparietal sulcus, the auditory cortex, and the amygdala: the intraparietal sulcus directs attention in accordance with where the gaze is directed, the auditory cortex contributes to speech perception, and the amygdala processes the emotional information derived from faces. Additionally, the FFA is linked with an area in the anterior temporal cortex that is involved in the processing of semantic information.
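As a purely illustrative sketch (not part of the Haxby et al. model itself), the core and extended connectivity described above can be summarized as a small directed graph; the region names follow the text, while the data structure and the example query are hypothetical.

```python
# Illustrative summary of the core/extended face network described above.
# The connectivity follows the text of this section; the dictionary itself
# is only a schematic aid, not an anatomical or computational model.
face_network = {
    "OFA": ["FFA", "pSTS-FA"],            # early visual analysis feeds both routes
    "FFA": ["anterior temporal cortex"],  # invariant aspects: identity, gender
    "pSTS-FA": ["intraparietal sulcus",   # changeable aspects: expression, movement
                "auditory cortex",
                "amygdala"],
}

extended_roles = {
    "intraparietal sulcus": "spatially directed attention (gaze following)",
    "auditory cortex": "speech perception",
    "amygdala": "emotional information derived from faces",
    "anterior temporal cortex": "semantic (person-related) information",
}

# Example query: which extended-system areas receive input from the pSTS-FA?
for region in face_network["pSTS-FA"]:
    print(region, "->", extended_roles[region])
```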

Basic emotions and amygdala functioning.

It is well known that faces are rich in expressions and can easily convey many emotions nonverbally. In contrast to other forms of nonverbal communication, emotional expressions have a distinct feature: they are largely universal and can be understood by people from most cultures around the world. We understand emotion as a complex construct comprising a set of processes, such as the evaluation of an external stimulus combined with neural responses, followed by a set of psychophysiological reactions and expressions (Scherer, 2009; Adolphs, 2017). From the claim that emotions have a biological basis, it can be inferred that there is an innate mechanism connecting stimuli with the production of a certain behavioral response; for instance, in the case of potential danger, the fight-or-flight response is triggered automatically. Ekman (1999) proposed six basic or primary emotions: fear, anger, joy, sadness, surprise, and disgust.

The amygdala is the most crucial structure involved in the processing of social information and emotions (Scherf et al., 2012), especially in fear conditioning, although a large body of evidence also documents a wide range of functions covering both negative and positive aspects (Davis and Whalen, 2001; Somerville et al., 2004; van den Bulk et al., 2013). Research has shown that the amygdala exhibits greater activation when fearful and disgusted faces are presented, and somewhat lower activation for happy and neutral ones (Costafreda et al., 2008; Fusar-Poli et al., 2009). Anatomically, the amygdala is a complex structure consisting of many interconnected nuclei, with at least two subdivisions: on the one hand the basolateral amygdala, which incorporates the lateral, basal, and accessory basal nuclei, and on the other hand the central amygdala, which involves the central nucleus (Davis and Whalen, 2001; Cardinal et al., 2002; Heimer et al., 2008). The basolateral amygdala appears to be responsible for (Pavlovian) learning and the representation of value. It receives sensory information from many structures, including the cortex, which in turn receives information from the parietal, cingulate, prefrontal, and insular cortices; the insula in particular appears to be disproportionately important for the recognition of disgust, more specifically for facial expressions associated with distaste as well as for moral disgust. The central nucleus participates in numerous attentional functions, and the central amygdala is also considered to control the brainstem (Cardinal et al., 2002) through its extensive connections with the hypothalamus and brainstem nuclei, including the midbrain reticular formation, in order to coordinate behavioral, autonomic, and neuroendocrine responses.

The role of the amygdala in facial emotion recognition.

Several studies have provided extremely useful data on the role of the amygdala in facial emotion recognition by investigating lesions in this area. One classic study examined emotion recognition after bilateral amygdala damage in a patient with complete destruction of this brain structure due to Urbach-Wiethe disease (Adolphs, Tranel, Damasio, & Damasio, 1994), a rare condition that progressively destroys the amygdala. The patient was tested on a task in which she had to match expressive faces with the correct emotion label and also to recognize similar emotions presented by different faces. The results showed an impairment in the recognition of fear and a somewhat milder deficit in the ability to recognize the intensity of other negative emotions. This confirms the existence of a specific neural system for the recognition of fear and danger-related stimuli; because the amygdala plays a crucial role in this system, a lesion to it can produce a deficit in fear recognition, including for facial expressions. These findings were cross-examined by other studies that tested patients with different aetiologies of amygdala damage (Young, Aggleton, Hellawell, et al., 1995; Calder, Young, Rowland, et al., 1996). One study examined emotion recognition in nine patients with bilateral amygdala damage, eight cases resulting from herpetic encephalitis and one from a congenital disease (Adolphs, Tranel, Hamann, et al., 1999). Pictures of faces expressing the six basic emotions, together with neutral faces, were presented, and the patients had to rate the intensity of each emotion on a scale from 0 to 5. The performance of the amygdala group was compared with that of patients whose lesions spared the amygdala and with a healthy control group. The healthy group and the other brain-damaged group gave comparable responses. In contrast, the patients with bilateral amygdala damage gave normal ratings for positively valenced faces, whereas their ratings for negative emotions, especially fear, were abnormally low. In general, substantial variability between patients was observed.

Emotion recognition deficits have also been detected in patients with unilateral amygdala lesions, which most commonly result from temporal lobe epilepsy. A recent study investigated the recognition of emotional faces in patients with extratemporal and temporal lobe epilepsy. Performance on an emotion-label matching test was compared across three groups with different lesion characteristics: patients with temporal lobe epilepsy and mesial temporal sclerosis, patients with the same type of epilepsy but a different lesion etiology, and patients with extratemporal lobe epilepsy. The results indicated that the patients with right mesial temporal sclerosis were impaired in recognizing all emotions compared with the other groups, and the analysis of each emotional category showed that fear recognition was the most impaired. These data confirm that right unilateral amygdala damage can cause a deficit in facial expression recognition (Meletti, Benuzzi, Rubboli, et al., 2003).

However, studies of emotion recognition after surgical lobectomy involving amygdala lesions have shown conflicting results. A number of unilateral temporal lobectomy patients have no difficulty recognizing emotional faces (Adolphs, Tranel, Damasio, & Damasio, 1995). In addition, another study examined emotion recognition in both right and left temporal lobectomy patients and found significant difficulty in only a minority of the participating patients (Fowler, Baker, Tipples, et al., 2006). A possible explanation for this variability, or even for the absence of an emotion recognition deficit, may lie in understudied factors such as the etiology of the epilepsy and clinical factors such as the age of seizure onset and the developmental history of the pathology.

A significant lack of information stems from the fact that very few studies have used fMRI in patients with temporal lobe epilepsy and amygdala lesions, even though such studies could provide direct information about amygdala dysfunction (Schacher, Haemmerle, Woermann, et al., 2006). One study used this method in a group of patients with temporal lobe epilepsy and left or right amygdala damage to investigate brain responses to pictures of fearful and neutral faces. The results were in line with pre-existing evidence that right amygdala-damaged patients with early-onset seizures show greater impairment in the recognition of fearful expressions (Benuzzi, Meletti, Zamboni, et al., 2004).

In another study, fearful faces activated the occipital and frontal cortices and the right amygdala in patients with left temporal lobe epilepsy and in a control group, but this activation did not appear in patients with right temporal lobe epilepsy, who failed to recognize fearful faces (Batut, Gounot, Namer, et al., 2006). This evidence supports the view that the right hemisphere and the right amygdala are dominant in the processing of faces and their expressions. In addition, a study of patients with mesial temporal sclerosis indicated that amygdala lesions alter the responses to fearful faces, specifically affecting ipsilateral visual cortical areas (Vuilleumier, Richardson, Armony, et al., 2004). Another study examined neural responses to face stimuli with different spatial frequency content. Amygdala responses were higher for fearful faces containing low spatial frequencies, whereas the fusiform cortex showed greater sensitivity to high spatial frequencies, independently of the emotion expressed. This evidence suggests that these pathways carry face information with different content and thereby handle different aspects of face recognition (Broks, Young, Maratos, et al., 1998).
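To make concrete what "low" and "high spatial frequency" face stimuli are, the sketch below shows one common, generic way such stimuli can be constructed from a face image using Gaussian low-pass filtering; the file name and the sigma cutoff are hypothetical placeholders, and this is not claimed to be the procedure of the cited study.

```python
# Generic sketch of how low- and high-spatial-frequency face stimuli are
# often built for experiments like the one described above. The image file
# name and the sigma cutoff are placeholders, not values from the cited study.
from imageio.v3 import imread
from scipy.ndimage import gaussian_filter

face = imread("face.png").astype(float)
if face.ndim == 3:                       # convert RGB(A) to grayscale if needed
    face = face[..., :3].mean(axis=-1)

low_sf = gaussian_filter(face, sigma=8)  # low-pass: keeps only the coarse configuration
high_sf = face - low_sf                  # high-pass: keeps edges and fine detail

# Low-SF versions preserve the coarse information linked to amygdala responses
# to fearful faces; high-SF versions preserve the fine detail to which the
# fusiform cortex appears more sensitive.
```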

Under normal conditions, these two pathways work together at the same time, but a lesion to one of them can result in different types of impairment, depending on the compensatory strategies and the onset of the lesion. In fact, this explains the differences in emotion recognition ability between patients with bilateral amygdala damage of congenital etiology and patients whose damage has a different etiology (Hamann, Stefanacci, Squire, et al., 1996). It has also been found that early-onset amygdala damage may disrupt regulatory influences on the visual cortex during its development, influences that could be important for tuning those visual areas to specific features related to emotional expressions, particularly when the lesions involve the right hemisphere, where face processing is dominant. This possible explanation agrees with results observed in patients with early-onset unilateral amygdala damage (Anderson & Phelps, 2000).

Conclusion

Concluding from all the above, it is clear that the amygdala has a major role in the recognition of emotional facial expressions, since damage to this structure can impair that function, although the degree of impairment varies from case to case. In simple terms, this means that the amygdala plays a crucial role in facial emotion recognition, but this ability does not depend on the amygdala alone: other factors must also be present for the ability to deteriorate.

References

1. Pascalis, O., & Kelly, D. (2008). Face processing. Encyclopedia of Infant and Early Childhood Development, 471-478. doi:10.1016/b978-012370877-9.00059-1

2. Lee, K., Eskritt, M., Symons, L. A., & Muir, D. (1998). Children's use of triadic eye gaze information for "mind reading." Developmental Psychology, 34(3), 525-539. doi:10.1037//0012-1649.34.3.525

3. Bahrick, H. P., Bahrick, P. O., & Wittlinger, R. P. (1975). Fifty years of memory for names and faces: A cross-sectional approach. Journal of Experimental Psychology: General, 104(1), 54-75. doi:10.1037//0096-3445.104.1.54

4. Tanaka, J. (in press). Featural, configural, and holistic encoding. In A. Calder & M. H. Johnson (Eds.), Handbook of Face Processing. Oxford, UK: Blackwell Publishing.

5. Tanaka, J. W., & Farah, M. J. (1993). Parts and wholes in face recognition. The Quarterly Journal of Experimental Psychology Section A, 46(2), 225-245. doi:10.1080/14640749308401045

6. Yin, R. K. (1969). Looking at upside-down faces. Journal of Experimental Psychology, 81(1), 141-145. doi:10.1037/h0027474

7. Young, A. W., Hellawell, D., & Hay, D. C. (1987). Configurational information in face perception. Perception, 16, 747-759.

8. Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2000). The distributed human neural system for face perception. Trends in Cognitive Sciences, 4(6), 223-233.

9. Scherer, K. R. (2009). Emotions are emergent processes: They require a dynamic computational architecture. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3459-3474. doi:10.1098/rstb.2009.0141

10. Adolphs, R. (2017). How should neuroscience study emotions? By distinguishing emotion states, concepts, and experiences. Social Cognitive and Affective Neuroscience, 12, 24-31. doi:10.1093/scan/nsw153

11. Heimer, L., Van Hoesen, G. W., Trimble, M., & Zahm, D. S. (2008). Anatomy of neuropsychiatry: The new anatomy of the basal forebrain and its implications for neuropsychiatric illness. Burlington, MA: Academic Press.

12. Scherf, K. S., Behrmann, M., & Dahl, R. E. (2012). Facing changes and changing faces in adolescence: A new model for investigating adolescent-specific interactions between pubertal, brain and behavioral development. Developmental Cognitive Neuroscience, 2, 199-219. doi:10.1016/j.dcn.2011.07.016

13. Davis, M., & Whalen, P. J. (2001). The amygdala: Vigilance and emotion. Molecular Psychiatry, 6, 13-34. doi:10.1038/sj.mp.4000812

14. Somerville, L. H., Kim, H., Johnstone, T., Alexander, A. L., & Whalen, P. J. (2004). Human amygdala responses during presentation of happy and neutral faces: Correlations with state anxiety. Biological Psychiatry, 55, 897-903. doi:10.1016/j.biopsych.2004.01.007

15. van den Bulk, B. G., Koolschijn, P. C. M. P., Meens, P. H. F., van Lang, N. D. J., van der Wee, N. J. A., Rombouts, S. A. R. B., et al. (2013). How stable is activation in the amygdala and prefrontal cortex? A study of emotional face processing across three measurements. Developmental Cognitive Neuroscience, 4, 65-76. doi:10.1016/j.dcn.2012.09.005

16. Costafreda, S. G., Brammer, M. J., David, A. S., & Fu, C. H. (2008). Predictors of amygdala activation during the processing of emotional stimuli: A meta-analysis of 385 PET and fMRI studies. Brain Research Reviews, 58, 57-70. doi:10.1016/j.brainresrev.2007.10.012

17. Fusar-Poli, P., Placentino, A., Carletti, F., Landi, P., Allen, P., Surguladze, S., et al. (2009). Functional atlas of emotional faces processing: A voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. Journal of Psychiatry & Neuroscience, 34, 418-432.


18. Cardinal, R. N., Parkinson, J. A., Hall, J., & Everitt, B. J. (2002). Emotion and motivation: The role of the amygdala, ventral striatum, and prefrontal cortex. Neuroscience & Biobehavioral Reviews, 26(3), 321-352. doi:10.1016/s0149-7634(02)00007-6

19. Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372(6507), 669-672. doi:10.1038/372669a0

20. Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1995). Fear and the human amygdala. The Journal of Neuroscience, 15(9), 5879-5891. doi:10.1523/jneurosci.15-09-05879.1995

21. Young, A. W., Aggleton, J. P., Hellawell, D. J., Johnson, M., Broks, P., & Hanley, J. R. (1995). Face processing impairments after amygdalotomy. Brain, 118(1), 15-24. doi:10.1093/brain/118.1.15

22. Calder, A. J. (1996). Facial emotion recognition after bilateral amygdala damage: Differentially severe impairment of fear. Cognitive Neuropsychology, 13(5), 699-745. doi:10.1080/026432996381890

23. Adolphs, R., Tranel, D., Hamann, S., Young, A., Calder, A., Phelps, E., . . . Damasio, A. (1999). Recognition of facial emotion in nine individuals with bilateral amygdala damage. Neuropsychologia, 37(10), 1111-1117. doi:10.1016/s0028-3932(99)00039-1

24. Meletti, S., Benuzzi, F., Rubboli, G., Cantalupo, G., Maserati, M. S., Nichelli, P., & Tassinari, C. (2003). Impaired facial emotion recognition in early-onset right mesial temporal lobe epilepsy. Neurology, 60(3), 426-431. doi:10.1212/wnl.60.3.426

25. Fowler, H. L., Baker, G. A., Tipples, J., Hare, D. J., Keller, S., Chadwick, D. W., & Young, A. W. (2006). Recognition of emotion with temporal lobe epilepsy and asymmetrical amygdala damage. Epilepsy & Behavior, 9(1), 164-172. doi:10.1016/j.yebeh.2006.04.013

26. Schacher, M., Haemmerle, B., Woermann, F. G., Okujava, M., Huber, D., Grunwald, T., . . . Jokeit, H. (2006). Amygdala fMRI lateralizes temporal lobe epilepsy. Neurology, 66(1), 81-87. doi:10.1212/01.wnl.0000191303.91188.00

27. Benuzzi, F., Meletti, S., Zamboni, G., Calandra-Buonaura, G., Serafini, M., Lui, F., . . . Nichelli, P. (2004). Impaired fear processing in right mesial temporal sclerosis: An fMRI study. Brain Research Bulletin, 63(4), 269-281. doi:10.1016/j.brainresbull.2004.03.005

28. Batut, A., Gounot, D., Namer, I. J., Hirsch, E., Kehrli, P., & Metz-Lutz, M. (2006). Neural responses associated with positive and negative emotion processing in patients with left versus right temporal lobe epilepsy. Epilepsy & Behavior, 9(3), 415-423. doi:10.1016/j.yebeh.2006.07.013

29. Vuilleumier, P., Richardson, M. P., Armony, J. L., Driver, J., & Dolan, R. J. (2004). Distant influences of amygdala lesion on visual cortical activation during emotional face processing. Nature Neuroscience, 7(11), 1271-1278. doi:10.1038/nn1341

30. Broks, P., Young, A. W., Maratos, E. J., Coffey, P. J., Calder, A. J., Isaac, C. L., . . . Hadley, D. (1998). Face processing impairments after encephalitis: Amygdala damage and recognition of fear. Neuropsychologia, 36(1), 59-70. doi:10.1016/s0028-3932(97)00105-x

31. Hamann, S. B., Stefanacci, L., Squire, L. R., Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1996). Recognizing facial emotion. Nature, 379(6565), 497. doi:10.1038/379497a0

32. Anderson, A. K., & Phelps, E. A. (2000). Expression without recognition: Contributions of the human amygdala to emotional communication. Psychological Science, 11(2), 106-111. doi:10.1111/1467-9280.00224
   
   
