
Laterality: Asymmetries of Body, Brain and Cognition

ISSN: 1357-650X (Print) 1464-0678 (Online) Journal homepage: http://www.tandfonline.com/loi/plat20

Processing emotion from the eyes: A divided visual field and ERP study

Alan A. Beaton, Nathalie C. Fouquet, Nicola C. Maycock, Eleanor Platt, Laura S. Payne & Abigail Derrett

To cite this article: Alan A. Beaton, Nathalie C. Fouquet, Nicola C. Maycock, Eleanor Platt, Laura S. Payne & Abigail Derrett (2012) Processing emotion from the eyes: A divided visual field and ERP study, Laterality: Asymmetries of Body, Brain and Cognition, 17:4, 486–514, DOI: 10.1080/1357650X.2010.517848

To link to this article: http://dx.doi.org/10.1080/1357650X.2010.517848

Published online: 17 Feb 2011.


Processing emotion from the eyes: A divided visual field and ERP study

Alan A. Beaton1,2, Nathalie C. Fouquet1,2, Nicola C. Maycock1, Eleanor Platt1, Laura S. Payne1, and Abigail Derrett1

1 Department of Psychology, University of Swansea, Wales, UK
2 Wales Institute of Cognitive Neuroscience, Wales, UK

The right cerebral hemisphere is preferentially involved in recognising at least some facial expressions of emotion. We investigated whether there is a laterality effect in
judging emotions from the eyes. In one task a pair of emotionally expressive eyes
presented in central vision had to be physically matched to a subsequently presented
set of eyes in one or other visual hemifield (eyes condition). In the second task a word
was presented centrally followed by a set of eyes to left or right hemifield and the
participant had to decide whether the word correctly described the emotion
portrayed by the laterally presented set of eyes (word condition). Participants were
a group of undergraduate students and a group of older volunteers (50+). There was
no visual hemifield difference in accuracy or raw response times in either task for
either group, but log-transformed times showed an overall left hemifield advantage.
Contrary to the right hemisphere ageing hypothesis, older participants showed no
evidence of a relative right hemisphere decline in performance on the tasks. In the
younger group mean peak amplitude of the N170 component of the EEG at lateral
posterior electrode sites was significantly greater over the right hemisphere (T6/PO2)
than the left (T5/PO1) in both the perceptual recognition task and the emotional
judgement task. It was significantly greater for the task of judging emotion than in
the eyes-matching task. In future it would be useful to combine electrophysiological
techniques with lateralised visual input in studying lateralisation of emotion with
older as well as younger participants.

Keywords: Emotion; Hemispheric asymmetry; Baron-Cohen's eyes test; N170; Right hemisphere ageing.

Address correspondence to: Dr Alan A. Beaton, Department of Psychology, University of Swansea, Singleton Park, Swansea, SA2 8PP, UK. E-mail: a.a.beaton@swansea.ac.uk
The work reported here was supported by funding from the Wales Institute of Cognitive
Neuroscience.

© 2012 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business
http://www.psypress.com/laterality http://dx.doi.org/10.1080/1357650X.2010.517848

The ability to effortlessly recognise someone by their face is one of the most
highly adaptive and complex functions that the visual system has to perform
and it is not yet entirely clear how this is achieved. At an early
neurophysiological stage of processing, cells have been found in temporal
cortex of non-human primates that respond selectively to faces (Bruce,
Desimone, & Gross, 1981; Perrett, Rolls, & Caan, 1982; Perrett et al., 1984)
and even to particular views of individual faces (Perrett et al., 1985). Some
cells fire only to faces in which the eyes are looking at the participant and
not if the eyes are looking away (see Perrett et al., 1985; for reviews see
Desimone, 1991; Rolls, 2007). Whether a particular animal is the object of
regard of another has, of course, great biological and social significance, as it
has for humans (Itier & Batty, 2009). This is borne out by the results of an
fMRI study which showed that the region of the ventral striatum became
active when (human) participants looked at a photograph of an attractive
female in which the eyes appeared to be looking directly at the participant.
When the eyes were looking away, this region was not active or much less so
(Kampe, Frith, Dolan, & Frith, 2001). Part of the superior temporal sulcus
is said to be an important cortical region for visual processing of the
movements and gaze of another person's eyes (Hasselmo, Rolls, & Baylis,
1989). This region is activated when an ambiguous stimulus is perceived as a
pair of eyes but not when perceived as something else (Kingstone, Tipper,
Ristic, & Ngan, 2004).
The eyes are more important than the mouth in an adults representation of
a face in memory (McKelvie, 1976) and information from the eyes is extracted
at a very early stage in processing faces (Vinette, Gosselin, & Schyns, 2004).
Infants learn to attend to the eyes before they attend to the mouth (Caron,
Caron, Caldwell, & Weiss, 1973) or the complete face (Taylor, Edmonds,
McCarthy, & Allison, 2001). These facts point to the salience of the eyes in
providing information that is useful in a wide variety of situations such as
interpreting facial expressions of emotion (for reviews see Itier & Batty, 2009;
Vuilleumier & Pourtois, 2007).
In his book The Expression of the Emotions in Man and Animals Darwin
(1872, pp. 170–171) wrote:

The movements of the expression in the face and body are of much
importance for our welfare. They serve as the first means of communica-
tion between the mother and her infant; she smiles approval, or frowns
disapproval, and thus encourages her child on the right path. We readily
perceive sympathy in others by their expression; our sufferings are thus
mitigated, our pleasures increased, and mutual good feelings strengthened.

Both humans and non-human primates (Emery, 2000) attend more to the
eyes than to other regions of the face (such as nose or mouth) when judging
emotion in others (Baron-Cohen, Wheelwright, & Jolliffe, 1997; Hernandez et al., 2009; Spezio, Adolphs, Hurley, & Piven, 2007). This ability is impaired
after bilateral damage to the amygdala (Adolphs, 2008; Calder et al., 1996),
and in people with autism, who tend to avert their gaze from another person's
eyes during social interactions (Baron-Cohen et al., 1997; Baron-Cohen,
Wheelwright, Hill, Raste, & Plumb, 2001) and spend less time than controls
in exploring the region of the eyes in photographs of faces (Hernandez et al.,
2009).
The importance of the eyes in reflecting emotional states raises the
question as to whether the well-established laterality effect in recognition of
facial emotional expression in humans (Bourne, 2005; Coolican, Eskes,
McMullen, & Lecky, 2008; Demaree, Everhart, Youngstrom, & Harrison,
2005; Ley & Bryden, 1979; Mandal et al., 1999; Mandal, Tandon, & Asthana,
1991; Schmitt, Hartje, & Willmes, 1997; Tamietto, Corazzini, de Gelder, &
Geminiani, 2006; Weddell, 1989; but see Fusar-Poli et al., 2009) is mediated
largely or even exclusively through a lateralised mechanism for discerning
emotion from the eyes. It is therefore of interest to investigate whether there is
any difference between left and right sides of the brain in the ability to
recognise emotion solely from the eyes. The explanation for any laterality
effect that occurs (in this as in any other study) has implications for our
understanding of how the stimuli in question are processed (Yovel, Levy,
Grabowecky, & Paller, 2003).
The relative contribution of left and right cerebral hemispheres to the
processing and expression of emotion is an issue that has been debated for over
70 years. Clinicians such as Goldstein (1939) noted that damage to the left side
of the brain was frequently followed by an extreme emotional reaction described by Goldstein as "catastrophic", whereas emotional indifference was
often seen after damage to the right hemisphere (e.g., Babinski, 1914;
Gainotti, 1969, 1972). Comparable findings were reported after injection of
sodium amylobarbitone into the left or right carotid artery (e.g., Terzian &
Cecotto, 1959; Perria, Rosadini, & Rossi, 1961). Work with neurologically
intact participants in the visual (e.g., Dimond, Farrington, & Johnson, 1976)
and auditory (e.g., Beaton, 1979) modalities contributed to the belief that the
two hemispheres played different roles in emotion, but raised as many
questions as answers. Gradually there arose the idea that the right hemisphere
is more important for all aspects of emotional processing, the so-called right
hemisphere hypothesis (for review of early work in this field see Beaton, 1985;
Tucker, 1981).
An alternative hypothesis, the valence hypothesis, holds that the left
hemisphere is dominant for processing positive emotions whereas the right
hemisphere is dominant for processing negative emotions (for reviews see
Borod, 1993; Borod, Hayward & Koff, 1997; Davidson, 1995, 2003). A further
hypothesis bearing considerable similarity to the valence hypothesis is that the
left hemisphere plays a major role in approach behaviour (happiness, surprise, anger) while the right hemisphere underpins withdrawal or avoidance behaviour (sadness, fear, disgust). This approach–withdrawal hypothesis and the
original valence hypothesis differ in how they classify the emotion of anger
(Alves, Fukusima, & Aznar-Casanova, 2008). According to the valence
hypothesis, anger is regarded as a negative emotion. A fourth approach
to the lateralisation of emotional behaviour links the behavioural inhibition-
behavioural activation systems model (Gray, 1982) to the right and left
hemispheres respectively (Demaree et al., 2005). Davidson (1995, 2003) and
Demaree et al. (2005) provide detailed reviews of these four main hypotheses
(and their subtle variations) on the relation between hemispheric lateralisa-
tion and emotion.
A commonly used method of investigating differential hemispheric
processing capacity in neurologically normal participants is the divided
visual field paradigm (for reviews see Beaton, 1985; Beaumont, 1982;
Christman, 1989; Dimond, 1972; for more recent consideration of metho-
dological factors see Boles, 1994; Bourne, 2006; Hunter & Brysbaert, 2008;
Jordan, Patching, & Milner, 1998). Since each half field of vision projects largely or exclusively to the contralateral cerebral hemisphere (for discussion of
this issue see Ellis & Brysbaert, 2010; Jordan & Paterson, 2009; Tootell,
Mendola, Hadjikhani, Liu, & Dale, 1998), information can be presented to
one or other hemisphere if the stimuli are presented outside a small foveal
region of putative naso-temporal overlap (see Lindell & Nicholls, 2003), or
even within this region (Ellis & Brysbaert, 2010; Lavidor & Ellis, 2003),
provided the participant does not move his or her eyes prior to or during
stimulus exposure. Given certain assumptions and adequate methodological
controls (Jordan et al., 1998; Jordan & Paterson, 2009), specialisation of one
or other cerebral hemisphere is inferred from a superiority of the opposite
visual hemifield in speed and/or accuracy for the task in question.
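As an illustration of this inference, a hemifield advantage can be reduced to a per-participant difference score. The sketch below (Python, with hypothetical column names and data) computes mean correct response times per hemifield and the right-minus-left difference, where a positive value indicates a left hemifield (right hemisphere) speed advantage:

```python
import pandas as pd

# Hypothetical correct-trial data: participant id, stimulated hemifield,
# and response time in milliseconds.
df = pd.DataFrame({
    "id":    [1, 1, 1, 1, 2, 2, 2, 2],
    "field": ["LVF", "RVF"] * 4,
    "rt_ms": [520, 555, 498, 540, 610, 600, 590, 625],
})

# Mean RT per participant and hemifield, then RVF minus LVF:
# positive values mean the left hemifield (right hemisphere) was faster.
means = df.groupby(["id", "field"])["rt_ms"].mean().unstack()
lvf_advantage = means["RVF"] - means["LVF"]
print(lvf_advantage)
```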
We chose to investigate the question of whether recognition of emotion
from the eyes is a lateralised function in neurologically healthy individuals
by employing the divided visual field technique and the eyes test developed
by Baron-Cohen (1995). We carried out a pilot study in which the task was
to match sets of laterally presented eyes with the appropriate verbal label
describing the emotion portrayed in the eyes. The participant made his or
her response by pointing to one of four words on a sheet of paper after each
trial. In this pilot study, carried out with 16 male and 16 female
undergraduate students (mean age 20.75 years), there was a trend towards
a left visual hemifield (right hemisphere) superiority.
We also presented the same stimuli to a different group of student
participants in a second pilot study (16 males, 16 females, mean age 20.2
years), but this time asked them to identify the eyes they had just seen rather
than to label the emotion portrayed. Again, the participant made his or her
response by pointing to one of four alternative pairs of eyes. The results
showed no significant effect of visual hemifield. Unlike our first pilot study,
then, which suggested a left visual field advantage in the recognition of
emotion portrayed by a pair of eyes, this second pilot study showed no
difference between the left and right visual hemifield in simple forced choice
recognition of the same stimuli. If confirmed, this would imply that any left
visual field/right hemisphere advantage in recognition of emotion from the
eyes is not reducible to a left hemifield advantage in recognition of
individual facial features per se.
We decided to repeat the pilot studies more formally but with a number of
changes. In particular we decided to measure response times as well as
accuracy and to investigate recognition of eyes alone and classification of
emotional expression as two conditions of a within-participants study rather
than as two separate experiments with different participants. Given the
results of our pilot studies, our predictions were that a left hemifield
advantage would be found for the task of judging emotion from eyes but not
in the straightforward recognition task.
Although the capacity to discriminate between basic facial expressions of
emotion may be hard-wired, being observable in infants at least as young as 7
months (Leppanen, Moulson, Vogel-Farley, & Nelson, 2007), the ability to
distinguish more subtle or complex expressions presumably develops (along
with lateralisation of the relevant mechanisms, see Workman, Chilvers,
Yeomans, & Taylor, 2006) through childhood and early adulthood (for review
see McLure, 2000) and possibly beyond. Older adults have had a lifetimes
experience in interacting with others. It might be the case therefore that they
are more skilled than younger adults in interpreting facial expressions,
especially from subtle cues such as those involving the eyes. However,
experimental evidence suggests that, if anything, older adults are slightly
less able than their younger counterparts to recognise facial expressions of
emotion (Calder et al., 2003; MacPherson, Phillips, & Della Sala, 2002, 2006;
McDowell, Harrison, & Demaree, 1994; but see Moreno, Borod, Welkowitz,
& Alpert, 1993). Most pertinent in regard to the present investigation, on
Baron-Cohen's eyes test older adults (aged from 60–80 years) have been
reported to do less well than younger adults (Phillips, MacLean, & Allen,
2002).
There is some evidence that, as adults become older, the right hemisphere
shows a greater age-related cognitive decline than the left hemisphere.
According to the right hemisphere ageing hypothesis (Goldstein & Shelly,
1981), all functions mediated predominantly or exclusively by the right
hemisphere undergo more pronounced decline with age than those subserved
by the left hemisphere (for reviews see Alves et al., 2008; Beaton, Hugdahl, &
Ray, 2000; Cabeza, 2002; Dolcos, Rice, & Cabeza, 2002; Nebes, 1990). In a
functional imaging study in which young and older adults were presented
with pictures of emotionally expressive faces, older participants showed
reduced neural activity in medial temporal lobe structures relative to their
younger counterparts, activity in the right hippocampus correlating nega-
tively with age (Iidaka et al., 2002).
Levine and Levy (1986) presented groups of participants of different ages
with the standard chimeric faces task, in which left and right sides of
different faces are combined to make a single whole face and presented at
short exposure durations. When asked to make a judgement of the emotion
expressed by the face, participants' responses are usually biased by the emotion portrayed by the side of the face to the viewer's left (e.g., Bourne,
2005; Burt & Perrett, 1997; Castro-Schilo & Kee, 2010; Christman &
Hackworth, 1993). This is often regarded as indicating right hemisphere
dominance for facial processing. Levine and Levy (1986) found that there
was no change in the magnitude of this left-side bias with increasing age in
groups of participants from 5 to 80 years of age. Similarly, Coolican et al.
(2008) found no difference between younger and older participants in the
direction or magnitude of left side advantage in judging the emotion
portrayed in upright chimeric faces. On the other hand, Failla, Sheppard,
and Bradshaw (2003) reported that while three groups of younger
participants (mean ages approximately 6, 11, and 22 years) all showed a
significant left-side bias, older participants aged 60–70 (mean age 66.07
years) showed no significant bias, thereby supporting the right hemisphere
ageing hypothesis. Further partial support comes from a study by Prodan,
Orbelo, and Ross (2007). These authors reported that elderly participants
(≥62 years) showed decreased ability to perceive emotion in the upper
region of faces, especially when stimuli were presented to the left visual
hemifield, compared with younger participants.
A second aim of our experiment, therefore, was to investigate whether
older participants show the same pattern of performance on the lateralised
eyes task as younger participants. On the basis of the right hemisphere
ageing hypothesis, it was predicted that older participants would do less well
with left hemifield presentation than with right hemifield presentation
whether the task was simply recognition of previously presented eyes or
required a judgement of the emotion expressed by the eyes. Since younger
participants were expected to show a right hemisphere (left visual hemifield)
superiority in judging emotion, it was also predicted that older participants
would show reduced asymmetry in judgement of emotional expression in
comparison with younger participants.
We decided to combine the divided visual half-field technique with
measurement of participants' brain activity as recorded from the scalp
(electroencephalography, EEG). Event-related potentials (ERPs) are
components of the averaged EEG waveform that can be related to the onset
of particular stimulus events. It is well established that one particular

component, the N170, occurs approximately 140 to 200 milliseconds after the
presentation of a face. It is most prominent over occipito-temporal sites and is
thought to be generated by neurons in face-selective areas of the fusiform
gyrus (see Kanwisher, McDermott, & Chun, 1997; Yovel, Tambini, &
Brandman, 2008) and superior temporal sulcus (Hasselmo et al., 1989; for
reviews see Dekowska, Kuniecki, & Jaskowski, 2008; Haxby, Hoffman, &
Gobbini, 2000). Its magnitude at lateral posterior electrode sites is reliably
greater to the appearance of a face in comparison to any other class of
stimulus. It is widely regarded as reflecting a stage of pre-categorical
processing during which the structural components of a face are configurally
encoded in a representation that may subsequently be used by recognition or
identification processes (Bentin & Deouell, 2000; Bentin, Allison, Puce, Perez,
& McCarthy, 1996; Eimer, 1998, 2000a; Jacques & Rossion, 2009; but see
Thierry, Martin, Downing, & Pegna, 2007; and counter-arguments and review
by Rossion & Jacques, 2008).
Bentin et al. (1996) proposed that the N170 reflects activity of an eye-
sensitive region of the cortex. Eimer (1998) tested this proposal by presenting
either complete faces or faces from which the eyes and eyebrows had been
removed. It was argued that if the N170 component is generated by processes
sensitive to the presence of eyes, then it should be absent or greatly attenuated
in the latter condition. On the contrary, the results of the study showed that
the difference waveform between houses, used as control stimuli, and faces
was more or less identical whether the eyes were included in a face or not.
Eimer (1998) concluded that "Rather than being linked to processes devoted to the detection of single features, the N170 is more likely to be caused by the structural encoding of different face components" (p. 2948).
There is some evidence that the N170 recorded for faces and for eyes
alone comes from different cortical generators (Shibata et al., 2002; Taylor
et al., 2001). Further, amplitude of the N170 component of the EEG has
been found to be greater in response to isolated eyes than to faces with the
eyes removed or to normal upright faces (Bentin et al., 1996; Itier, Alain,
Sedore, & McIntosh, 2007; Itier, Latinus, & Taylor, 2006; Jemel, George,
Chaby, Fiori, & Renault, 1999; Taylor et al., 2001). Itier et al. (2007)
suggested that this reflects the fact that normal faces are processed
configurally, and responded to by face-selective neurons, whereas isolated
eyes elicit activity in populations of both eye-selective neurons, that are not
active when faces are processed configurally, and face-selective neurons (Itier
et al., 2007).
The N170 component has been reported in some studies utilising normal
faces presented in central vision (e.g., Bentin et al., 1996; Itier & Taylor,
2004; Luo, Feng, He, Wang, & Luo, 2010; Rossion, Joyce, Cottrell, & Tarr,
2003; Rossion et al., 1999; Shibata et al., 2002; Streit, Wolwer, Brinkmeyer,
Ihl, & Gaebel, 2000) or centrally presented chimeric faces (Yovel et al., 2003)

to be of greater amplitude over the right than the left cerebral hemisphere.
With regard to isolated eyes presented centrally, Bentin et al. (1996) reported
that although in two experiments the amplitude of N170 was greater over the
right than the left hemisphere, the statistical significance of this finding was
inconsistent. Shibata et al. (2002) state that N170 amplitudes recorded at
occipito-temporal electrode sites were significantly greater over the right
than the left hemisphere for faces but do not provide data or results for their
eyes alone condition.
In ERP experiments to date involving presentation of isolated eyes the
task has been purely perceptual as, for example, in the target detection study
by Bentin et al. (1996) and the studies by Itier and colleagues (2006, 2007)
in which participants had to decide whether the eyes were presented in an
upright or inverted orientation. Furthermore, the relatively few studies on
the N170 involving isolated eyes as stimuli have presented them in central
vision. We therefore wished to determine whether there is any hemispheric
difference in amplitude of N170 when isolated eyes are presented unilaterally
to the left or right of fixation in a perceptual matching (recognition) task. We
also wanted to investigate whether the N170 component occurs when
participants make judgements of emotion to laterally presented eyes and, if
so, whether there is any greater laterality effect in amplitude of the N170
component in this task compared with the simple recognition task.
We did not expect to find a difference between left and right visual
hemifields in accuracy of recognition of isolated eyes and therefore did not
predict a hemispheric difference in amplitude of the N170 response, despite
the observation of Bentin et al. (1996), who used centrally presented eyes,
that the N170 response was (inconsistently) greater over the right than the
left hemisphere in a target detection task. However, given the salience of the
eyes to observers' judgements of emotion and the voluminous research
implicating the right hemisphere in the recognition of emotion from faces
(Bourne, 2005; Coolican et al., 2008; Demaree et al., 2005; Ley & Bryden,
1979; Mandal et al., 1991, 1999; Schmitt et al., 1997; Tamietto et al., 2006;
Weddell, 1989; for reviews see Itier & Batty, 2009; Vuilleumier & Pourtois,
2007), we predicted, for the task of judging emotion, that presentation of the
eyes to the left visual hemifield (right cerebral hemisphere) would lead to a
greater amplitude of N170 response than presentation to the right visual
hemifield (left cerebral hemisphere). We were able to test these predictions
with a group of undergraduate participants; unfortunately, we were unable
to carry out EEG recording with older participants.
To summarise, the aims of this study were to determine (i) whether there is
any visual hemifield asymmetry in recognition of unilaterally presented pairs
of eyes, (ii) whether there is any visual hemifield asymmetry in judging the
emotions expressed by those eyes, (iii) whether younger and older adults show
any difference in relative hemifield accuracy and/or speed of performance, and

(iv) whether there is any difference in amplitude of the N170 component of the
EEG at lateral posterior electrode sites in the emotional judgement task (or,
indeed, in the simple recognition task). Our predictions were that there would
be no hemifield performance difference on the recognition task, but that there
would be a left hemifield (right hemisphere) advantage in judging emotion,
that this would be attenuated (in line with the right hemisphere ageing
hypothesis) in older compared with younger participants, and that the amplitude
of the N170 component would be equivalent over the two hemispheres in the
recognition task but greater over the right than the left hemisphere in the task of
judging emotion from the eyes.

METHOD
Participants
There were two groups of participants. Younger participants were 16 male volunteers all of whom were aged 20–22 years. All were strongly right-handed as determined by Annett's (1970) handedness inventory except for
two participants who, it turned out after the experiment, showed left-handed
tendencies. However, their data were entirely in keeping with the other
participants and were retained for analysis. (Analyses excluding these two
participants showed the same effects as with them included.) Data from two
right-handed participants had to be dropped because of technical difficulties
with recording. This report is therefore based on 14 younger participants.
There were six male and nine female older right-handed participants. All
participants confirmed that they were over the age of 50 years but the
majority declined to give their exact age. The oldest was a female of 75 years
of age. No participant suffered from any known neurological or medical or
psychiatric condition likely to have affected their ability to perform the
experiment. Most of these older participants were unwilling to travel to the
laboratory to participate in EEG recording so this was omitted.

Stimuli and apparatus


Experimental stimuli were the 36 photographs from the Baron-Cohen eyes
test (Baron-Cohen et al., 2001) adjusted to be as equal in size as possible
without altering the horizontal-to-vertical ratio. Practice stimuli were taken
from Baron-Cohen et al. (1997).
Stimuli and apparatus were the same for both groups of participants
except that younger participants were tested using a desktop computer in a
university psychology laboratory, whereas for older participants the study
was run using a laptop computer in a quiet room in their own home.

EEG recording
For each of the younger participants, continuous EEG data were recorded
from 32 channels (Ag/AgCl sintered electrodes placed according to the international 10–20 system) using Scan 4.3 software on a SynAmps system.
The electrodes were embedded in a rubber cap that was stretched over the
scalp. Data were recorded using a left ear lobe reference at a 500-Hz sampling rate with a 0.05–100 Hz band-pass filter with the notch filter on
(50 Hz). Before analysis, recordings were re-referenced using the common
average method.
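For readers wishing to reproduce this pipeline with current open-source tools, a minimal sketch using MNE-Python follows; the filename is hypothetical, and the filter and reference settings simply mirror those just described (the authors used Scan 4.3, not this code):

```python
import mne

# Hypothetical Neuroscan continuous file from the SynAmps system.
raw = mne.io.read_raw_cnt("subject01.cnt", preload=True)

# Mirror the acquisition settings: 0.05-100 Hz band-pass, 50 Hz notch.
raw.filter(l_freq=0.05, h_freq=100.0)
raw.notch_filter(freqs=50.0)

# Recorded against the left ear lobe; re-reference to the common average
# before analysis, as described above.
raw.set_eeg_reference("average")
```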

Procedure
Stimuli were presented on a computer screen to the participant, who sat with
his/her eyes at a fixed distance of approximately 115 cm from the screen, head position being kept constant relative to fixed points on the laboratory walls. In one condition, the recognition condition,
the participant had to indicate by means of a button press whether a centrally
presented photograph of a pair of eyes matched a subsequent pair of eyes (Yes/
No) presented in one or other visual hemifield. This will be referred to as the
eyes condition. The participant first saw a central fixation point for 200 ms
which was replaced by a pair of eyes presented at fixation for 500 ms. This was
followed by re-presentation of the central fixation point for 300 ms before a
second pair of eyes was presented to the left or right of fixation for 200 ms.
There was a variable inter-trial interval of 600 to 1700 ms between the
participants response and the re-appearance of the fixation point signalling
the next trial. The dimensions of each photograph on the screen were
approximately 12 cm × 5 cm. The inside border of the photograph was 4 cm
left or right from the central fixation point. Assignment of the labels "Yes" (same) and "No" (different) to the response buttons was counterbalanced across participants. Participants used their right hand to respond.
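Under these viewing conditions the eccentricity of the stimuli can be worked out directly. The short sketch below (illustrative only) converts the stated centimetre offsets into degrees of visual angle, showing that the inner border sat roughly 2 degrees from fixation, beyond the small foveal region of putative naso-temporal overlap discussed in the introduction:

```python
import math

VIEWING_DISTANCE_CM = 115.0

def visual_angle_deg(offset_cm: float) -> float:
    """Angle subtended at the eye by a point offset_cm from fixation."""
    return math.degrees(math.atan(offset_cm / VIEWING_DISTANCE_CM))

inner = visual_angle_deg(4.0)         # inner photo border: ~2.0 degrees
outer = visual_angle_deg(4.0 + 12.0)  # outer border: ~7.9 degrees
print(f"stimulus spanned {inner:.1f} to {outer:.1f} degrees eccentricity")
```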
In a second condition, recognition of emotion, a word was presented at
fixation followed by a pair of eyes in one or other visual hemifield as before.
The task was to decide (Yes/No) whether the centrally presented word was a
correct (appropriate) description of the emotion expressed by the eyes. This
will be referred to as the word condition. The procedure was exactly as in the
first condition except that this time the centrally presented stimulus was a
word rather than a pair of eyes. In a preliminary check we examined the degree
to which people agreed with the labels assigned by Baron-Cohen et al. (2001)
to the sets of eyes. The stimulus items and response alternatives in our
experiment were selected in the light of this preliminary check. Thus we had

some evidence that for match trials the eyes were indeed paired with the
appropriate verbal labels.
Programs for presentation of the stimuli and recording of responses were
written in E-Prime (Psychology Software Tools, Inc.). In each condition there
were 72 stimulus presentations to each visual hemifield; 36 matched the
centrally presented pair of eyes or word, hereafter referred to as matching
trials, 36 were different to the centrally presented pair of eyes or word,
hereafter referred to as non-matching trials. There were therefore 144 trials
in each of the eyes (recognition) and word (emotion) conditions.
Half the number of participants in each group completed the eyes
condition first followed by the word condition and the other half completed
the tasks in the reverse order. Within each condition, match and non-match
trials appeared in random order and the stimulus items were presented to
participants in different random orders.
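A trial list with these properties is straightforward to construct; the following sketch (illustrative, not the authors' E-Prime code) builds the 144 trials of one condition and shuffles them into a fresh random order:

```python
import random

FIELDS = ("left", "right")
N_ITEMS = 36  # the 36 eyes-test photographs

# Each item appears in each hemifield once as a matching trial and once as
# a non-matching trial: 72 presentations per hemifield, 144 trials in all.
trials = [{"item": i, "field": f, "match": m}
          for i in range(N_ITEMS) for f in FIELDS for m in (True, False)]
random.shuffle(trials)
assert len(trials) == 144
```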

RESULTS
Preliminary analysis showed that the order in which the conditions were
administered had no significant effect on the results and the data for the two
orders were therefore combined for subsequent analysis. There was nothing
in the data to suggest a sex difference and results were therefore combined
across men and women. Only data for trials on which a participant's response was correct were used in the analyses.

Accuracy
The mean number (and standard deviations) of correct responses for
matching and non-matching correct trials for the eyes (recognition) and
word (emotion) conditions are shown for younger and older participants in
Table 1.
Because we were uninterested in the difference in overall accuracy levels
on the two tasks (eyes and word conditions), the data were analysed for the
two conditions separately rather than condition being entered as a factor in
an analysis of variance. This simplifies presentation of the data as it obviates
discussion of theoretically uninteresting interaction terms.
The data of Table 1 were submitted to three-way analysis of variance for a
mixed design with visual hemifield (left vs right) and trial type (matching vs
non-matching) as repeated measures factors. The between-participants factor
was group (younger versus older). For the eyes (recognition) condition, there
was no significant effect of visual hemifield, F(1, 27) < 1, p = .986, but trial type was significant, F(1, 27) = 4.43, p < .045, ηp² = .14, as was participant group, F(1, 27) = 10.03, p = .004, ηp² = .27. There were more correct responses
TABLE 1
Mean number (SD) of correct responses to presentation of matching and non-matching trials in each visual hemifield in the eyes (recognition) and word (emotion) conditions

                 Left visual hemifield     Right visual hemifield
Condition        Match       Non-match     Match       Non-match

Younger group (N = 14)
Eyes             33.93       34.64         33.43       34.43
(SD)              2.67        2.31          1.83        2.38
Range            29–36       27–36         29–36       27–36
Word             24.57       20.43         25.00       21.43
(SD)              5.52        5.75          4.23        5.10
Range            14–32       14–29         18–31       12–29

Older group (N = 15)
Eyes             31.27       32.87         30.80       33.20
(SD)              3.47        2.77          3.55        1.86
Range            24–36       25–36         23–36       29–36
Word             15.80       27.73         17.07       25.07
(SD)              4.06        3.11          5.37        5.23
Range             8–24       20–33         10–28       16–32

on non-match than match trials and younger participants were more accurate
than older participants. There was no other significant main effect and no
significant two- or three-way interaction.
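For readers who want to reproduce an analysis of this general shape, the sketch below uses the pingouin package. Because pingouin's mixed_anova accepts only a single within-participant factor, the example collapses over trial type; a full 2 (hemifield) × 2 (trial type) × 2 (group) decomposition would need software such as R's afex. The file and column names are hypothetical:

```python
import pandas as pd
import pingouin as pg

# Long-format accuracy data, one row per participant x hemifield after
# averaging over trial type. Columns: id, group, hemifield, n_correct.
acc = pd.read_csv("accuracy_long.csv")  # hypothetical file

# Mixed ANOVA: hemifield within participants, group between participants.
# effsize="np2" reports partial eta squared, as in the text.
aov = pg.mixed_anova(data=acc, dv="n_correct", within="hemifield",
                     subject="id", between="group", effsize="np2")
print(aov)
```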
For accuracy in the word (emotion) condition, there was also no significant effect of visual hemifield, F(1, 27) < 1, p = .986, but participant group was significant, F(1, 27) = 5.47, p = .027, ηp² = .17. The two-way interaction between trial type and group was significant, F(1, 27) = 19.47, p < .001, ηp² = .42, as was the three-way interaction between hemifield, trial type, and group, F(1, 27) = 4.57, p = .042, ηp² = .15. There was no other statistically significant effect. Decomposition of the three-way interaction showed that the hemifield-by-trial type interaction was significant only for the older group, F(1, 14) = 4.69, p = .048, ηp² = .25, and that the group-by-hemifield interaction was significant, F(1, 27) = 7.98, p = .009, ηp² = .23, only for non-match trials. The group-by-trial type interaction was significant for both the left hemifield, F(1, 27) = 26.15, p < .001, ηp² = .49, and the right hemifield, F(1, 27) = 11.19, p = .002, ηp² = .29. Further decomposition of the two-way interactions revealed that the significance of the three-way interaction (between hemifield, trial type, and group) is a reflection of the fact that on match trials there was no difference between hemifields for either younger or older participants, but that on non-match trials the difference between hemifields was significant for the older group (t = 2.65, df = 14, p = .019) but not for the younger group. In addition, for younger participants matches were significantly more accurate than non-matches in the right hemifield (t = 13.55, df = 13, p < .001) but not the left hemifield, whereas for older participants non-matches were more accurate than matches in both the left hemifield (t = 8.13, df = 14, p < .001) and the right (t = 3.17, df = 14, p = .007) hemifield.

Response times
Table 2 shows the mean response times (and standard deviation) in
milliseconds for the eyes and word conditions for each group of participants.
Response times of Table 2 were submitted to separate three-way analyses of variance for the two conditions. For the eyes (recognition) condition, the only significant main effect was group, F(1, 27) = 67.89, p < .001, ηp² = .72, younger participants being faster than older participants. The interaction between group and trial type was also significant, F(1, 27) = 15.38, p = .001, ηp² = .36. This was due to the fact that younger participants were faster to give correct responses to matches than non-matches (t = 2.38, df = 13, p = .033), whereas older participants showed the opposite effect (t = 3.33, df = 14, p = .005).

TABLE 2
Mean response times (SD) (ms) for correct trials for the eyes (recognition) and word (emotion) conditions

                 Left visual hemifield     Right visual hemifield
Condition        Match       Non-match     Match       Non-match

Younger group (N = 14)
Eyes             425.6       480.4         451.0       465.3
(SD)             135.3       128.2         129.0       110.9
Range            248–801     320–790       238–752     292–660
Word             832.0       878.7         816.8       943.1
(SD)             272.9       291.9         232.1       376.4
Range            479–1384    486–1477      519–817     534–1896

Older group (N = 15)
Eyes             875.4       780.8         913.5       804.4
(SD)             120.4       117.6         243.0        99.9
Range            683–1163    600–972       590–1566    661–991
Word             1522.9      1272.0        1516.1      1472.4
(SD)             413.0       417.4         534.7       579.8
Range            1061–2441   1015–2728     1065–3163   981–3291

For the word (emotion) condition, there was a significant main effect of group, F(1, 27) = 16.51, p < .001, ηp² = .38, a significant interaction between group and trial type, F(1, 27) = 15.52, p = .001, ηp² = .37, and a significant interaction between visual hemifield and trial type, F(1, 27) = 4.73, p = .039, ηp² = .15. Younger participants were faster overall than older participants and faster to respond correctly to matches than to non-matches (t = 2.40, df = 13, p = .033), whereas older participants showed the reverse effect (t = 3.20, df = 14, p = .006). Overall, correct responses to non-matches were faster for the left than the right hemifield (t = 3.51, df = 28, p = .002), but responses to matches were non-significantly faster for the right than the left hemifield (t < 1).
Essentially the same results were obtained when response times were log transformed in order to improve homogeneity of variance. For the eyes condition, only the group-by-trial type interaction was significant, F(1, 27) = 20.81, p < .001, ηp² = .44. For the word condition, visual hemifield, F(1, 27) = 4.35, p = .047, ηp² = .14, was the only significant main effect. This was qualified by a two-way interaction between hemifield and trial type, F(1, 27) = 4.95, p = .035, ηp² = .16. There was also a significant interaction between group and trial type, F(1, 27) = 16.52, p < .001, ηp² = .38. These interactions were not germane to the main questions being investigated and therefore were not decomposed further.
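The log transformation itself is a one-line operation; a minimal sketch, continuing with the hypothetical df from the earlier sketch:

```python
import numpy as np

# Compress the long right tail of the RT distribution to improve
# homogeneity of variance before the ANOVA.
df["log_rt"] = np.log(df["rt_ms"])
```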
To summarise the main results of the behavioural data, in both the eyes
and the word conditions younger participants were more accurate than older
participants but there was no significant main effect of visual hemifield nor
any significant interaction involving this factor other than the three-way
interaction between field, group, and trial type. This reflected a different
pattern of response to match and non-match trials by the two groups,
especially in the left hemifield.
Raw response times were faster for younger than older participants but
showed no hemifield difference. Transformed response times, however, were
faster for the left visual hemifield and interacted with trial type but not
participant group.

Analysis of EEG data

Amplitude. Only epochs of 1000 ms based on the onset of the critical stimuli (laterally presented eyes) with a 200-ms baseline correction and free of artefacts (±100 μV on any channel) were analysed. For reasons outlined in the introduction, we focused on the N170 component of the ERP identified within a time window of 160–200 ms post stimulus onset. Amplitude (in microvolts) and latency (in milliseconds) of the ERP were recorded in both conditions at left and right electrode leads. The evoked response at
corresponding locations of the left and right hemispheres constituted the measure analysed.
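Continuing the MNE-Python sketch from the Method section, the epoching and peak measurement just described might look as follows; the event labels are hypothetical, and the 1000-ms epoch is read here as 200 ms of baseline plus 800 ms post-stimulus:

```python
import mne

events, event_id = mne.events_from_annotations(raw)  # raw from earlier sketch

# 200-ms baseline plus 800 ms post-stimulus; reject any epoch in which a
# channel exceeds +/-100 microvolts.
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(-0.2, 0.0), reject=dict(eeg=100e-6),
                    preload=True)

# N170 peak at the right temporal site in the 160-200 ms window
# ("eyes/lvf/correct" is a hypothetical label for correct left-field
# trials in the recognition condition).
evoked = epochs["eyes/lvf/correct"].average()
ch, lat_s, amp_v = (evoked.copy().pick(["T6"])
                    .get_peak(tmin=0.16, tmax=0.20, mode="neg",
                              return_amplitude=True))
print(f"N170 at {ch}: {amp_v * 1e6:.2f} uV at {lat_s * 1e3:.0f} ms")
```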
ERPs recorded from the hemisphere contralateral to the visual hemifield in
which the lateralised visual stimulus was presented are referred to as being
produced by direct stimulation while ERPs recorded from the ipsilateral
hemisphere (which generally show a peak a few milliseconds later than the
contralateral hemisphere, presumably as a consequence of callosal transfer)
are referred to as being the result of indirect stimulation. In fact, as
anticipated from the EEG literature, in analysing data for indirect stimulation
we found no difference between hemispheres and no hemisphere by trial type
interaction in either the eyes (recognition) or the word (emotion) condition. In
this report therefore we present only data for direct stimulation and confine
our discussion to the effects of direct stimulation, as it is this that is most
relevant to the hypotheses under investigation. We further confine our
discussion to electrode sites representative of brain activity in the infero-
temporal and parieto-occipital structures involved in face processing, in
accordance with our predictions and a host of other studies of ERP responses
to face stimuli (see Bentin et al., 1996; Eimer, 2000a; for reviews see Haxby
et al., 2000; Itier & Batty, 2009).
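In practice, restricting the analysis to direct stimulation amounts to selecting, for each hemifield, the contralateral member of each electrode pair; a simple lookup expresses this (illustrative only):

```python
# Contralateral ("direct") site for each stimulated hemifield,
# for the two lateral posterior pairs analysed here.
DIRECT_SITE = {
    ("lvf", "temporal"):          "T6",   # right hemisphere
    ("rvf", "temporal"):          "T5",   # left hemisphere
    ("lvf", "parieto-occipital"): "PO2",
    ("rvf", "parieto-occipital"): "PO1",
}
```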
Average amplitudes were calculated for each condition and relevant
electrode site. Table 3 shows for correct responses in each of the two
conditions (with direct stimulation) the mean peak amplitude of ERP at
infero-temporal (T5/T6) and parieto-occipital (PO1/PO2) electrode sites as a
function of match type. Note that the amplitudes represent negative deflections; the minus sign is omitted for the sake of clarity of the table.
As an example of the ERP data, Figure 1 shows, for correct responses on
match trials in both the eyes (recognition) and word (emotion) conditions, the
amplitude (peaking at approximately 170 ms from critical stimulus onset) over left and right hemisphere temporal electrode sites (T5 = left hemisphere, T6 = right hemisphere). Data for the parieto-occipital sites are similar and therefore a figure is omitted in the interests of brevity.
Data from temporal and parieto-occipital leads were analysed separately.
Three-way analysis of variance of the mean peak amplitude of the N170
component of the waveform at the temporal (T5/T6) electrode sites with
hemisphere, task (eyes versus word condition) and trial type as repeated
measures factors revealed an overall main effect of hemisphere, F(1, 13) = 15.42, p = .002, ηp² = .54. Task was also significant as a main effect, F(1, 13) = 24.98, p < .001, ηp² = .66. There were no significant interactions.
For the parieto-occipital electrode sites (PO1/PO2) three-way analysis of variance showed similar effects; only the main effects of hemisphere, F(1, 13) = 7.07, p = .02, ηp² = .35, and task, F(1, 13) = 8.02, p = .014, ηp² = .38, were significant. There were no significant interactions.
TABLE 3
Mean peak amplitude (μV) of ERP at left and right hemisphere temporal (T5, T6) and parieto-occipital (PO1, PO2) electrode sites for matching (M) and non-matching (NM) trials in the eyes and word conditions (correct responses, direct stimulation)

                Left hemisphere      Right hemisphere     Left hemisphere      Right hemisphere
Electrode site  T5                   T6                   PO1                  PO2
Trial type      Match   Non-match    Match   Non-match    Match   Non-match    Match   Non-match

Condition
Eyes            4.16    4.24         7.07    6.72         3.16    3.28         5.21    4.90
(SD)            3.69    3.47         4.46    4.21         4.65    4.33         5.20    5.03
Word            7.74    6.84         10.31   10.66        5.65    4.79         6.89    6.81
(SD)            3.09    4.13         3.87    6.60         4.39    4.46         4.96    5.68

TABLE 4
Mean latency (ms) of ERP at left and right hemisphere temporal (T5, T6) and parieto-occipital (PO1, PO2) electrode sites for matching (M) and non-matching (NM) trials in the eyes and word conditions (correct responses, direct stimulation)

                Left hemisphere      Right hemisphere     Left hemisphere      Right hemisphere
Electrode site  T5                   T6                   PO1                  PO2
Trial type      Match   Non-match    Match   Non-match    Match   Non-match    Match   Non-match

Condition
Eyes            184.3   176.1        176.4   175.7        176.3   174.4        170.7   172.3
(SD)            17.85   12.68        12.08   9.17         17.02   12.87        9.69    11.97
Word            175.3   172.9        172.3   174.3        173.7   170.6        170.0   171.9
(SD)            12.02   10.22        12.12   11.15        11.58   12.06        11.87   10.88


[Figure 1: grand-average ERP waveforms, y-axis ±15.0 μV, x-axis −200.0 to 800.0 ms. Black solid line = word condition, T6 (right hemisphere); grey solid line = eyes condition, T6 (right hemisphere); black dotted line = word condition, T5 (left hemisphere); grey dotted line = eyes condition, T5 (left hemisphere).]
Figure 1. ERPs for direct stimulation and correct responses on match trials at left temporal (T5) and
right temporal (T6) electrode sites. As per convention, negative voltage is upwards.

At both temporal and parieto-occipital electrode sites, mean amplitude was higher for the right than the left hemisphere and higher in the word than in the eyes condition.
Inspection of Figure 1 suggests that in both the eyes and the word
conditions there was a small second negative peak at approximately 300 ms
post onset followed by a sustained amplitude difference between left and
right hemispheres for the next 200 milliseconds. We had no a priori
hypothesis other than in regard to the N170 component and therefore we
will not present any statistical analysis of these effects (but see discussion
section).

Response latency. Table 4 shows, for correct responses (direct stimulation), the mean latency of ERP at temporal and parieto-occipital electrode sites as a function of hemisphere and match type for the eyes (recognition) and word (emotion) conditions.
Three-way analyses of variance of the latency data (direct stimulation)
with condition, hemisphere, and trial type as repeated measures factors
showed no significant main effect or interaction for either temporal or
parieto-occipital electrode sites. Latency will therefore not be discussed
further in this report.

DISCUSSION
The aims of this experiment were to investigate possible visual hemifield asymmetries in response to lateralised presentation of emotionally expressive eyes. In the event, we found no overall visual hemifield difference for either younger or older participants in same–different recognition accuracy for

briefly presented eyes. Nor was there any hemifield effect in either group for
the task of verifying the appropriateness of a given label to describe the
emotion expressed by the eyes, contrary to results from our initial pilot study
which suggested that a right hemisphere (left hemifield) superiority might be
found.
In both the eyes (recognition) and the word (emotion) conditions, level of
accuracy was higher (and response times faster) for our younger than for our
older participants. However, on both tasks accuracy of performance by our
older participants was no worse for their right hemisphere (left visual
hemifield) than for their left hemisphere (right visual hemifield) and they
did not show a reduced difference between the hemifields in comparison with
younger participants. That is, there was no significant group by hemifield
interaction, although for non-match trials in the word condition
(judgement of emotion) the difference between hemifields, favouring the left
hemifield, was significant for older participants but not younger participants
(contributing to a significant three-way interaction). This difference favoured
the left hemifield and thus, if anything, contradicts the right hemisphere
ageing hypothesis.
An alternative hypothesis to the right hemisphere ageing hypothesis, the
hemispheric asymmetry reduction in older adults (HAROLD) model
(Cabeza, 2002; Dolcos et al., 2002), holds that hemispheric activity
(specifically of the pre-frontal cortex), tends to be less lateralised in older
than younger persons. This is said to give rise to reduced functional
asymmetry in older compared with younger people for tasks mediated by
pre-frontal cortex. In a recent fMRI study by Berlingeri et al. (2010) a reduced
pattern of functional lateralisation of temporal regions was found in older
(mean age 62.0 years) compared with younger (mean age 26.5 years)
participants. These authors suggested that the HAROLD model should be
extended to apply to brain regions other than the pre-frontal cortex. However,
there is no support from our own data for extending the HAROLD model to
the cerebral hemispheres as a whole.
There was no significant difference in raw response times between the two
hemifields for either the eyes or the word condition for either group of
participants. Our older participants were on average about twice as slow to
make their responses as their younger counterparts. An advantage for
younger participants over older ones in speed (and accuracy) of discrimina-
tion of facial emotion has been reported by others (e.g., Gunning-Dixon et al.,
2003).
When the raw response times were log transformed, overall response
times in the word (emotion) condition were found to be significantly shorter
in the left than the right visual hemifield (as was found for raw response
times for non-match trials). This is consistent with findings discussed in the
introduction that show an overall superiority of the right hemisphere in

emotional processing of faces and facial features and, as far as our older
participants are concerned, contradicts the right hemisphere ageing hypoth-
esis in as much as there was no group by hemifield interaction.
It is, of course, quite possible that one reason for our failure to find a
difference in lateralisation between older and younger participants is that, as a
group, the former were not sufficiently advanced in age for any hemispheric
asymmetry in cognitive decline to show up. Most studies examining
performance on cognitive tasks within the context of the right hemisphere
ageing hypothesis have been carried out with participants over the age of
60 years. Our own participants were certainly over 50 but were reluctant to
state their exact ages to a much younger experimenter (AD) despite being
invited to do so. However, they all confirmed that they were over 50 which,
although they may have been considerably older, was considered a tactful way
of enabling the experimenter to estimate at least a lower age limit. In any case,
in at least some experiments, changes in lateralised performance with age have
been observed in participants under 60 years of age (e.g., Beaton et al., 2000).
Moreover, we note that there was a statistically significant overall difference in
accuracy of performance and in response times between our two groups of
participants, albeit that the difference was not large.
Having said this, the extended response times of our older compared with
younger participants mean that we would wish to be extremely cautious in
interpreting lack of visual hemifield asymmetry in response times of our
older participants as indicating a definite lack of hemispheric difference in
speed of processing. Any potential effect might well have been obscured by a
lack of sensitivity in the measure.
In most experiments employing a match versus non-match paradigm
responses to matches are faster and more accurate than to non-matches (see,
e.g., Humphrey & Lupker, 1993). This was more or less (the data are
not entirely consistent) what we observed in our younger group. Our
older participants, however, were faster (in both conditions) to decide on a
non-match response and more accurate, especially in the word condition.
Barrett, Rugg, and Perrett (1988), in a face-matching experiment, also
observed a reversal of the usual pattern of performance. Conceivably our
older participants used different strategies or criteria to perform the matching
tasks to those used by younger participants. This may have interacted with (or
indeed been promoted by) the longer response times of the older group. For
example, older participants might have required much more (neural) evidence
for a match than a non-match decision, whereas younger participants
might have adopted a less-stringent approach. If so, any fading in initial
representation of a stimulus with an increase in response time for older
participants would make it increasingly less likely that they would decide in
favour of a match. This would lead to an increase in the proportion of correct
responses on non-match compared with match trials.
An alternative possibility in relation to the word condition is that older
participants were more discerning in regard to the appropriateness of a
particular verbal label to describe the emotion portrayed in the eyes and
therefore were reluctant to give match decisions. This implies that the eyes
test should be standardised for respondents from different age groups.
Turning now to the EEG data for our younger participants, the main
finding of our experiment is that the mean peak amplitude of the N170
component of the EEG in response to lateralised presentation of isolated
pairs of eyes was greater over the right than the left hemisphere, both for the
simple recognition task and for the task of classifying emotion from the eyes,
even though there was no corresponding behavioural asymmetry. We
interpret these findings as supporting two proposals. First, that processing
part of a face (the region of the eyes), not just the whole face, engages regions
of the right hemisphere to a greater extent than the left hemisphere, whatever
the nature of the experimental task. Second, that classifying emotion in eyes in
relation to a briefly presented prior verbal label engages primarily right
hemisphere processes despite presentation of this verbal stimulus in central
vision.
With regard to the first proposal, it seems likely that a greater amplitude
of N170 over the right than the left hemisphere will emerge more reliably
with lateralised than with central presentation. The current N170 literature
is very largely confined to free viewing of faces (or eyes) in central vision.
Eimer (2000b) presented faces to one or other side of fixation but did not test
for left–right differences (but see Honda, Watanabe, Nakamura, Miki, & Kakigi, 2007; Yovel et al., 2003, for greater amplitude of N170 over the right
hemisphere with lateralised presentation of faces). Unfortunately, the design
of our study does not allow us to distinguish whether the greater amplitude
over the right than the left hemisphere (in both our experimental conditions)
was due to the emotional expression portrayed in the eyes that we used as
stimuli or was due simply to the eyes being presented unilaterally to a single
hemisphere at a time. In future work we intend to compare Baron-Cohen's
eyes with a set of neutrally expressive eyes to investigate this issue.
We found that the amplitude of N170 was greater in the word condition
than in the eyes condition. An explanation of this finding is not immediately
obvious. If, as suggested by some authors (Eimer, 1998; Eimer & Holmes,
2007), the N170 reflects pre-categorical structural encoding of components
of a face, it is entirely to be expected that this component should be present
in both the eyes task (recognition) and in the task of classifying emotion
from eyes (the word task). However, it should not be any greater for the word
condition than for the eyes condition since the eliciting stimulus is the same
(laterally presented eyes) in both conditions.
There is considerable inconsistency in the ERP literature as to whether
the amplitude of the N170 component varies as a function of emotional

expression. Eimer, Holmes, and McGlone (2003) found that the amplitude
of the N170 component at lateral temporal electrodes (T5 and T6) was
similar for emotional and neutral faces with no difference in magnitude as a
function of different emotional expressions (see also, e.g., Hermann et al.,
2002). Eimer and Holmes (2007) suggested that the structural encoding of faces, as reflected by the N170, is "insensitive to information derived from emotional facial expression" (p. 21). However, some authors have obtained
contrary results (e.g., Batty & Taylor, 2003; Leppanen et al., 2007; Sato,
Kochiyama, Yoshikawa, & Matsumura, 2001). It is at least possible,
therefore, that N170 reflects more than structural encoding and is influenced
by emotional processing (as required in our word condition but not our eyes
condition).
An alternative explanation of why the magnitude of N170 was greater in
our word condition than in the eyes condition relates to the fact that, in the
latter condition, two sets of eyes were shown in close temporal succession. In a
number of studies in which one face was shown shortly after another, a
reduction in magnitude of the N170 response to the second presentation of
the same face has been reported, whether stimulus masking was employed
(e.g., Henson, Mouchlianitis, Matthews, & Kouider, 2008) or not (e.g.,
Caharel, d'Arripe, Ramon, Jacques, & Rossion, 2009; Jacques & Rossion,
2009; Eimer, Kiss, & Nicholas, 2010). Importantly, a reduction in amplitude
of response to presentation of a second (unmasked) face that was different to
the first face has been reported in a magnetoencephalography (MEG) study
(Harris & Nakayama, 2007); that is, the faces do not need to be identical for
an adaptation effect to occur. The reduction in amplitude of the M170
response, analogous to the N170 response obtained using EEG, was much
greater than that observed after presentation of two different stimuli from
another stimulus category (houses), thereby indicating that the amplitude
reduction, or repetition effect, was face-specific. However, Eimer et al. (2010)
have recently reported finding an adaptation effect with eyes only. These
authors argued that the N170 component "reflects the activation of neurons
that respond equally strongly to full faces and to face parts. The N170 is
reduced in amplitude whenever 2 stimuli that activate these neurons are
presented in rapid succession" (p. 9). We suggest that a similar repetition effect
for eyes can explain the fact that in our experiment the amplitude of the N170
component was reduced in the eyes condition relative to the word condition.
(In the word condition the eyes were presented only once per trial, following
presentation of a word).
With regard to the small second negative peak observable at about 300 ms
post-stimulus onset (N300) in our data (see Figure 1), we note that other
authors studying face processing have found a negative deflection at or close
to this latency (N300 component) at temporal electrode sites (e.g., Carretié &
Iglesias, 1995; Dennis & Chen, 2007; George, Evans, Fiori, Davidoff, &
Renault, 1996; Sato et al., 2001) albeit of somewhat larger amplitude than is
apparent in our data. Some researchers have noted negativity effects at 400
ms post-stimulus onset or even later (see Barrett et al., 1988; Vuilleumier &
Pourtois, 2007). Bentin and Deouell (2000) refer to a "face N400" as being
analogous to the N400 component reported with verbal stimuli and
generally considered to correspond to a stage of semantic classification
(for review see Lau, Phillips, & Poeppel, 2008).
As we had no a priori hypothesis other than in regard to the N170
component, we did not present a detailed analysis of the N300 component or
of the sustained magnitude difference between conditions and hemispheres. We
simply note in passing that George et al. (1996) found that the N300 in
response to faces was greater for scrambled than unscrambled faces and
greater over the right temporal than the left temporal area. Similar to the data
of our study, the temporal negativity extended for some two hundred
milliseconds beyond the peak. George et al. (1996) considered that their
results "provide evidence for the existence of a long duration negativity (from
150 to 350 ms) associated with the processing of scrambled faces" (p. 71).
Since a scrambled face, unlike a normal face, is considered to be processed
in a non-configural manner, the greater amplitude of N300 for scrambled
than unscrambled faces found by George et al. (1996) suggests that it is the
processing of individual facial features, such as eyes, that leads to an
enhanced N300 component. Luo et al. (2010) suggested that the amplitude of
the N300 (and of a P300 component) probably reflects "further evaluation of
information related to the affective valence of a face" (p. 1865). These
authors propose a model of facial expression processing in which there are
three stages. In the first stage faces are processed for the presence of a
negatively valenced emotion (such as threat). In the second stage, indexed by
the N170 response, emotionally charged faces are said to be distinguished
from neutral faces, while in the third stage, indexed by the N300 response,
different facial expressions are distinguished.
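Were one to quantify the sustained negativity that George et al. (1996) describe, a mean-amplitude measure over the full window would be the natural complement to the peak measure sketched earlier; for example (window limits again assumed purely for illustration):

    def mean_window_amplitude(erp, times, window=(0.150, 0.350)):
        """Mean amplitude across an extended post-stimulus window, a standard
        measure for long-duration components such as the N300."""
        mask = (times >= window[0]) & (times <= window[1])
        return erp[mask].mean()

Such a measure is less sensitive than a peak measure to latency jitter across trials, which is one reason it is often preferred for later, broader components.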
Before concluding our discussion, we should acknowledge a theoretical
limitation. We have not discussed the results of our experiment in relation to
the valence (or the approach–withdrawal) hypothesis since it was difficult, if
not impossible, to examine our results from this perspective (for recent
discussion of the valence hypothesis in relation to facial expression of emotion
see Alves, Aznar-Casanova, & Fukusima, 2009). This is because very few of
the labels assigned by Baron-Cohen to the eyes we used could be readily
classified as positive or negative. For example, what is the correct designation
for eyes said to show "concern" or "interest" or "flirtatiousness"?
When all is said and done, the term "emotion" refers to a wide range of
affective experience and behaviour. Different emotions (see e.g., Calder,
Lawrence, & Young, 2001; Murphy, Nimmo-Smith, & Lawrence, 2003) and
different aspects of emotion (perception, expression, feeling) undoubtedly
depend on different regions and sub-systems within the brain (Batty &
Taylor, 2003; Davidson, 2003; Murphy et al., 2003). The extent to which the
emotional processes underlying any particular experimental task are
lateralised is bound to vary. Hence it is probably inappropriate to think in
terms of an overall hemispheric asymmetry for emotion in general, or
indeed, for any emotion in particular (see Murphy et al., 2003). In a meta-
analytic review of findings from neuroimaging studies, Wager, Phan,
Liberzon, and Taylor (2003) found no support for the view that the right
hemisphere is dominant for emotion, and only very limited support for
the hypothesis that lateralisation of emotion is valence-specific (see also
Murphy et al., 2003). They argued that "the cerebral hemisphere is too
general a unit of analysis to describe data from neuroimaging" (p. 524).
To conclude, our experiment found no overall behavioural difference,
either in younger or older participants, between left and right hemifield
responses to lateralised presentation of emotionally expressive eyes. This was
the case whether the task required participants to match one pair of eyes to
another pair of eyes previously exposed in central vision or to compare the
laterally presented eyes with a word exposed in central vision. Nor is there
anything in the data we have reported to support the view that with increased
age there is a diminution in right hemisphere relative to left hemisphere ability
to match sets of eyes or to judge emotion solely from the region of the eyes.
However, despite the lack of a behavioural asymmetry between left and right
visual hemifields, in our group of young adults we found that the amplitude of
the N170 component of the ERP was greater over the right than the left
temporal and parieto-occipital regions of the scalp in both our experimental
tasks. The magnitude of this response was significantly greater when the eyes
had to be matched to a previously presented word than when they had to be
matched to a previously presented pair of eyes. These ERP findings may be
interpreted as reflecting a greater involvement of the right than the left
hemisphere in processing isolated eyes and suggest the utility of using
electrophysiological techniques in combination with lateralised visual input to
investigate emotional asymmetry in the human brain throughout the life
span.

REFERENCES
Adolphs, R. (2008). Fear, faces and the human amygdala. Current Opinion in Neurobiology, 18, 166–172.
Alves, N. T., Aznar-Casanova, J. A., & Fukusima, S. S. (2009). Patterns of brain asymmetry in the perception of positive and negative facial expressions. Laterality, 14, 256–272.
Alves, N. T., Fukusima, S. S., & Aznar-Casanova, J. A. (2008). Models of brain asymmetry in emotional processing. Psychology and Neuroscience, 1, 63–66.
Annett, M. (1970). A classification of hand preference by association analysis. British Journal of Psychology, 61, 303–321.
Babinski, J. (1914). Contributions of cerebral hemispheric organization in the study of mental troubles. Revue Neurologique, 27, 845–848.
Baron-Cohen, S. (1995). Mindblindness: An essay on autism and theory of mind. Cambridge, MA:
MIT Press.
Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., & Plumb, I. (2001). The "Reading the Mind in the Eyes" Test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. Journal of Child Psychology and Psychiatry, 42, 241–251.
Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a language of the eyes? Evidence from normal adults and adults with autism or Asperger syndrome. Visual Cognition, 4, 311–331.
Barrett, S. E., Rugg, M. D., & Perrett, D. I. (1988). Event-related potentials and the matching of familiar and unfamiliar faces. Neuropsychologia, 26, 105–117.
Batty, M., & Taylor, M. J. (2003). Early processing of the six basic facial emotional expressions. Cognitive Brain Research, 17, 613–620.
Beaton, A. A. (1979). Hemispheric emotional asymmetry in a dichotic listening task. Acta Psychologica, 43, 103–109.
Beaton, A. A. (1985). Left side/right side: A review of laterality research. London: Batsford; New Haven, CT: Yale University Press.
Beaton, A. A., Hugdahl, K., & Ray, P. (2000). Lateral asymmetries and interhemispheric transfer in aging. In M. K. Mandal, M. B. Bulman-Fleming, & G. Tiwari (Eds.), Side bias: A neuropsychological perspective (pp. 101–152). Dordrecht: Kluwer Academic Publishers.
Beaumont, J. G. (Ed.). (1982). Divided visual field studies of cerebral organisation. London: Academic Press.
Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551–565.
Bentin, S., & Deouell, L. Y. (2000). Structural encoding and identification in face processing: ERP evidence for separate mechanisms. Cognitive Neuropsychology, 17, 35–54.
Berlinger, M., Bottini, G., Danelli, L., Scialfa, G., Sberna, M., Colombo, N., et al. (2010, January). Re-assessing the HAROLD model. Poster presentation at the 28th European Workshop on Cognitive Neuropsychology, Bressanone, Italy.
Boles, D. B. (1994). An experimental comparison of stimulus type, display type and input variable contributions to visual field asymmetry. Brain and Cognition, 24, 184–197.
Borod, J. C. (1993). Cerebral mechanisms underlying facial, prosodic, and lexical emotional expression: A review of neuropsychological studies and methodological issues. Neuropsychology, 7, 445–463.
Borod, J. C., Haywood, C. S., & Koff, E. (1997). Neuropsychological aspects of facial asymmetry during emotional expression: A review. Neuropsychology Review, 7, 41–60.
Bourne, V. (2006). The divided visual field paradigm: Methodological considerations. Laterality, 11, 373–393.
Bourne, V. J. (2005). Lateralised processing of positive facial emotion: Sex differences in strength of hemispheric dominance. Neuropsychologia, 43, 953–956.
Bruce, C. J., Desimone, R., & Gross, C. G. (1981). Visual properties of neurons in a polysensory area in superior temporal sulcus of the macaque. Journal of Neurophysiology, 46, 369–384.
Burt, D. M., & Perrett, D. I. (1997). Perceptual asymmetries in judgements of facial attractiveness, age, gender, speech and expression. Neuropsychologia, 35, 685–693.
Cabeza, R. (2002). Hemispheric asymmetry reduction in older adults: The HAROLD model. Psychology and Aging, 17, 85–100.
Caharel, S., d'Arripe, O., Ramon, M., Jacques, C., & Rossion, B. (2009). Early adaptation to repeated unfamiliar faces across viewpoint changes in the right hemisphere: Evidence from the N170 ERP component. Neuropsychologia, 47, 639–643.
Calder, A. J., Keane, J., Manly, T., Sprengelmeyer, R., Scott, S., Nimmo-Smith, I., et al. (2003). Facial expression recognition across the adult life span. Neuropsychologia, 41, 195–202.
Calder, A., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature Reviews Neuroscience, 2, 352–363.
Calder, A., Young, A. W., Rowland, D., Perrett, D. I., Hodges, J. R., & Etcoff, N. L. (1996). Facial emotion recognition after bilateral amygdala damage: Differentially severe impairment of fear. Cognitive Neuropsychology, 13, 699–745.
Caron, A. J., Caron, R. F., Caldwell, R. C., & Weiss, S. J. (1973). Infant perception of the structural properties of the face. Developmental Psychology, 9, 385–399.
Carretié, L., & Iglesias, J. (1995). An ERP study on the specificity of facial expression processing. International Journal of Psychophysiology, 19, 183–192.
Castro-Schilo, L., & Kee, D. W. (2010). Gender differences in the relationship between emotional intelligence and right-hemisphere lateralization for facial processing. Brain and Cognition, 73, 62–67.
Christman, S. D. (1989). Perceptual characteristics in visual laterality research. Brain and Cognition, 11, 238–257.
Christman, S. D., & Hackworth, M. D. (1993). Equivalent perceptual asymmetries for free viewing of positive and negative emotional expressions in chimeric faces. Neuropsychologia, 31, 621–624.
Coolican, J., Eskes, G. A., McMullen, P. A., & Lecky, E. (2008). Perceptual biases in processing facial identity and emotion. Brain and Cognition, 66, 176–187.
Darwin, C. R. (1872). The expression of the emotions in man and animals. London: John Murray.
Davidson, R. J. (1995). Cerebral asymmetry, emotion, and affective style. In R. J. Davidson & K. Hugdahl (Eds.), Brain asymmetry (pp. 361–387). Cambridge, MA: MIT Press.
Davidson, R. J. (2003). Affective neuroscience and psychophysiology: Towards a synthesis. Psychophysiology, 40, 655–665.
Dekowska, M., Kuniecki, M., & Jaskowski, P. (2008). Facing facts: Neuronal mechanisms of face perception. Acta Neurobiologiae Experimentalis, 68, 229–252.
Demaree, H. A., Everhart, D. E., Youngstrom, E. A., & Harrison, D. W. (2005). Brain lateralization of emotional processing: Historical roots and a future incorporating "dominance". Behavioral and Cognitive Neuroscience Reviews, 4, 3–20.
Dennis, T. A., & Chen, C-C. (2007). Emotional face processing and attention performance in three domains: Neurophysiological mechanisms and moderating effects of trait anxiety. International Journal of Psychophysiology, 65, 10–19.
Desimone, R. (1991). Face-selective cells in the temporal cortex of monkeys. Journal of Cognitive Neuroscience, 3, 1–8.
Dimond, S. J. (1972). The double brain. Edinburgh, UK: Churchill Livingstone.
Dimond, S. J., Farrington, L., & Johnson, P. (1976). Differing emotional response from right and left hemispheres. Nature, 261, 690–692.
Dolcos, F., Rice, H. J., & Cabeza, R. (2002). Hemispheric asymmetry and aging: Right hemisphere decline or asymmetry reduction? Neuroscience and Biobehavioral Reviews, 26, 819–825.
Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye processor? Neuroreport, 9(13), 2945–2948.
Eimer, M. (2000a). The face-specific N170 component reflects late stages in the structural encoding of faces. Neuroreport, 11, 2319–2324.
Eimer, M. (2000b). Attentional modulations of event-related brain potentials sensitive to faces. Cognitive Neuropsychology, 17, 103–116.
Eimer, M., & Holmes, A. (2007). Event-related brain potential correlates of emotional face processing. Neuropsychologia, 45, 15–31.
Eimer, M., Holmes, A., & McGlone, F. P. (2003). The role of spatial attention in the processing of facial expression: An ERP study of rapid brain responses to six basic emotions. Cognitive, Affective and Behavioral Neuroscience, 3, 97–110.
Eimer, M., Kiss, M., & Nicholas, S. (2010). Response profile of the face-sensitive N170 component: A rapid adaptation study. Cerebral Cortex, 20, 2442–2452.
Ellis, A. W., & Brysbaert, M. (2010). Split fovea theory and the role of the two hemispheres in reading: A review of the evidence. Neuropsychologia, 48, 353–365.
Emery, N. J. (2000). The eyes have it: The neuroethology, function and evolution of social gaze. Neuroscience and Biobehavioral Reviews, 24, 581–604.
Failla, C. V., Sheppard, D. M., & Bradshaw, J. L. (2003). Age and responding-hand related changes in performance of neurologically normal subjects on the line bisection and chimeric-faces task. Brain and Cognition, 52, 353–363.
Fusar-Poli, P., Placentino, A., Carletti, E., Allen, P., Landi, P., Abbamonte, M., et al. (2009). Laterality effect on emotional face processing: ALE meta-analysis of evidence. Neuroscience Letters, 452, 262–267.
Gainotti, G. (1969). So-called catastrophic reactions and indifferent behaviour during cerebral trauma. Neuropsychologia, 16, 369–374.
Gainotti, G. (1972). Emotional behavior and hemispheric side of the lesion. Cortex, 8, 41–55.
George, N., Evans, J., Fiori, N., Davidoff, J., & Renault, B. (1996). Brain events related to normal and moderately scrambled faces. Cognitive Brain Research, 4, 65–76.
Goldstein, G., & Shelly, C. (1981). Does the right hemisphere age more rapidly than the left? Journal of Clinical Neuropsychology, 3, 65–78.
Goldstein, K. (1939). The organism. New York: American Book Publishers.
Gray, J. (1982). The neuropsychology of anxiety: An enquiry into the functions of the septo-hippocampal system. Oxford, UK: Oxford University Press.
Gunning-Dixon, F. M., Gur, R. C., Perkins, A. C., Schroeder, L., Turner, T., Turetsky, B. I., et al. (2003). Age-related differences in brain activation during emotional face processing. Neurobiology of Aging, 24, 285–295.
Harris, A., & Nakayama, K. (2007). Rapid face-selective adaptation of an early extrastriate component in MEG. Cerebral Cortex, 17, 63–70.
Hasselmo, M., Rolls, E. T., & Baylis, G. C. (1989). The role of expression and identity in the face-selective responses of neurons in the temporal visual cortex of the monkey. Behavioral Brain Research, 32, 329–342.
Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2000). The distributed human neural system for face perception. Trends in Cognitive Sciences, 4, 223–233.
Henson, R. N., Mouchlianitis, E., Matthews, W. J., & Kouider, S. (2008). Electrophysiological correlates of masked face priming. Neuroimage, 40, 884–895.
Hermann, M. J., Aranda, D., Ellgring, H., Mueller, T. J., Strik, W. K., Heidrich, A., et al. (2002). Face-specific event potential in humans is independent from facial expression. International Journal of Psychophysiology, 45, 241–244.
Hernandez, N., Metzger, A., Magne, R., Bonnet-Brilhaut, F., Roux, S., Barthelemy, C., et al. (2009). Exploration of core features of a human face by healthy and autistic adults analyzed by visual scanning. Neuropsychologia, 47, 1004–1012.
Honda, Y., Watanabe, S., Nakamura, M., Miki, K., & Kakigi, R. (2007). Interhemispheric difference for upright and inverted face perception in humans: An event-related potential study. Brain Topography, 20, 31–39.
Humphrey, G. K., & Lupker, S. (1993). Codes and operations in picture matching. Psychological Research, 55, 237–247.
Hunter, Z. R., & Brysbaert, M. (2008). Visual half-field experiments are a good measure of cerebral language dominance if used properly: Evidence from fMRI. Neuropsychologia, 46, 316–325.
Iidaka, T., Okada, T., Murata, T., Omori, M., Kosaka, H., Sadato, N., et al. (2002). Age-related differences in the medial temporal lobe responses to emotional faces as revealed by fMRI. Hippocampus, 12, 352–362.
Itier, R., & Batty, M. (2009). Neural bases of eye and gaze processing: The core of social cognition. Neuroscience and Biobehavioral Reviews, 33, 843–863.
Itier, R. J., Alain, C., Sedore, K., & McIntosh, A. R. (2007). Early face processing specificity: It's in the eyes! Journal of Cognitive Neuroscience, 19, 1815–1826.
Itier, R. J., Latinus, M., & Taylor, M. J. (2006). Face, eye and object early processing: What is the face specificity? Neuroimage, 29, 667–676.
Itier, R. J., & Taylor, M. J. (2004). Effects of repetition learning on upright, inverted and contrast-reversed face processing using ERPs. Neuroimage, 21, 1518–1532.
Jacques, C., & Rossion, B. (2009). The initial representation of individual faces in the right occipito-temporal cortex is holistic: Electrophysiological evidence from the composite face illusion. Journal of Vision, 9, 1–16.
Jemel, B., George, N., Chaby, L., Fiori, N., & Renault, B. (1999). Differential processing of part-to-whole and part-part face priming: An ERP study. Neuroreport, 10, 1069–1075.
Jordan, T. R., Patching, G. R., & Milner, A. D. (1998). Central fixations are inadequately controlled by instruction alone: Implications for studying cerebral asymmetry. Quarterly Journal of Experimental Psychology: Human Experimental Psychology, 51, 371–391.
Jordan, T. R., & Paterson, K. B. (2009). Re-evaluating split-fovea processing in word recognition: A critical assessment of recent research. Neuropsychologia, 47, 2341–2353.
Kampe, K. K. W., Frith, C. D., Dolan, R. J., & Frith, U. (2001). Reward value of attractiveness and gaze. Nature, 413, 589 (correction p. 602).
Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: A module in human extra-striate cortex specialized for face perception. Journal of Neuroscience, 17, 4302–4311.
Kingstone, A., Tipper, C., Ristic, J., & Ngan, E. (2004). The eyes have it! An fMRI investigation. Brain and Cognition, 55, 269–271.
Lau, E. F., Phillips, C., & Poeppel, D. (2008). A cortical network for semantics: (De)constructing the N400. Nature Reviews Neuroscience, 9, 920–933.
Lavidor, M., & Ellis, A. W. (2003). Interhemispheric integration of letter stimuli presented foveally or extra-foveally. Cortex, 39, 69–83.
Leppänen, J. M., Moulson, M. C., Vogel-Farley, V. K., & Nelson, C. A. (2007). An ERP study of emotional face processing in the adult and infant brain. Child Development, 78, 232–245.
Levine, S. C., & Levy, J. (1986). Perceptual asymmetry for chimeric faces across the life span. Brain and Cognition, 5, 291–306.
Ley, R. G., & Bryden, M. P. (1979). Hemispheric differences in processing emotions and faces. Brain and Language, 7, 127–138.
Lindell, A. K., & Nicholls, M. E. R. (2003). Cortical representation of the fovea: Implications for visual half-field research. Cortex, 39, 111–117.
Luo, W., Feng, W., He, W., Wang, N-Y., & Luo, Y-J. (2010). Three stages of facial expression processing: ERP study with rapid serial visual presentation. Neuroimage, 49, 1857–1867.
MacPherson, S., Phillips, L. H., & Della Sala, S. (2002). Age, executive function and social decision making: A dorso-lateral prefrontal theory of cognitive aging. Psychology and Aging, 17, 598–609.
MacPherson, S., Phillips, L. H., & Della Sala, S. (2006). Age-related differences in the ability to perceive sad facial expressions. Aging Clinical and Experimental Research, 18, 418–424.
Mandal, M. K., Borod, J. C., Asthana, H. S., Mohanty, A., Mohanty, S., & Koff, E. (1999). Effects of lesion variables and emotion type on the perception of facial emotion. Journal of Nervous and Mental Diseases, 187, 603–609.
Mandal, M. K., Tandon, S. C., & Asthana, H. S. (1991). Right brain damage impairs recognition of negative emotions. Cortex, 27, 247–253.
McClure, E. B. (2000). A meta-analytic review of sex differences in facial expression processing and their development in infants, children and adolescents. Psychological Bulletin, 126, 424–453.
McDowell, C. L., Harrison, D. W., & Demaree, H. A. (1994). Is right hemisphere decline in the perception of emotion a function of aging? International Journal of Neuroscience, 79, 1–11.
McKelvie, S. J. (1976). The role of eyes and mouth in the memory of a face. American Journal of Psychology, 89, 311–323.
Moreno, C. R., Borod, J. C., Welkowitz, J., & Alpert, M. (1990). Lateralization of the expression and perception of facial emotion as a function of age. Neuropsychologia, 28, 199–209.
Moreno, C. R., Borod, J. C., Welkowitz, J., & Alpert, M. (1993). The perception of facial emotion across the adult lifespan. Developmental Neuropsychology, 9, 305–314.
Murphy, F. C., Nimmo-Smith, I., & Lawrence, A. D. (2003). Functional neuroanatomy of emotions: A meta-analysis. Cognitive, Affective and Behavioral Neuroscience, 3, 207–233.
Nebes, R. D. (1990). Hemispheric specialization in the aged brain. In C. Trevarthen (Ed.), Brain circuits and the functions of the mind: Essays in Honor of Roger W. Sperry (pp. 364–370). Cambridge, UK: Cambridge University Press.
Perria, L., Rosadini, G., & Rossi, G. F. (1961). Determination of side of cerebral dominance with amobarbital. Archives of Neurology, 4, 173–189.
Perrett, D., Rolls, E. T., & Caan, W. (1982). Visual neurons responsive to faces in the monkey temporal cortex. Experimental Brain Research, 47, 329–342.
Perrett, D., Smith, P. A., Potter, D. D., Mistlin, A. J., Head, A. S., Milner, A. D., et al. (1984). Neurons responsive to faces in the temporal cortex: Studies of functional organization, sensitivity to identity and relation to perception. Human Neurobiology, 3, 197–208.
Perrett, D., Smith, P. A., Potter, D. D., Mistlin, A. J., Head, A. S., Milner, A. D., et al. (1985). Visual cells in the temporal cortex sensitive to face view and gaze direction. Proceedings of the Royal Society of London B, 223, 293–317.
Phillips, L. H., MacLean, R. D. J., & Allen, R. (2002). Age and understanding of emotions: Neuropsychological and sociocognitive perspectives. Journal of Gerontology, 57B, 526–530.
Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., et al. (1998). Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society of London. Series B, 265, 1809–1817.
Prodan, C. I., Orbelo, D. M., & Ross, E. D. (2007). Processing of facial blends of emotion: Support for right hemisphere cognitive aging. Cortex, 43, 196–206.
Rolls, E. T. (2007). The representation of information about faces in the temporal and frontal lobes. Neuropsychologia, 45, 124–143.
Rossion, B., Delvenne, J-F., Debatisse, D., Goffaux, V., Bruyer, R., Crommelinck, M., et al. (1999). Spatio-temporal localization of the face inversion effect: An event-related potentials study. Biological Psychology, 50, 173–189.
Rossion, B., & Jacques, C. (2008). Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170. Neuroimage, 39, 1959–1979.
Rossion, B., & Jacques, C. (2009). The initial representation of faces in the right occipito-temporal cortex is holistic: Electrophysiological evidence from the composite face illusion. Journal of Vision, 9, 1–16.
Rossion, B., Joyce, C. A., Cottrell, G. W., & Tarr, M. J. (2003). Early lateralization and orientation tuning for face, word and object processing in the visual cortex. Neuroimage, 20, 1609–1624.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. Neuroreport, 12, 709–714.
Schmitt, J. J., Hartje, W., & Willmes, K. (1997). Hemispheric asymmetry in the recognition of emotional attitude conveyed by facial expression, prosody and propositional speech. Cortex, 33, 65–81.
Shibata, T., Nishijo, H., Tamura, R., Miyamoto, K., Eifuku, S., Endo, S., et al. (2002). Generators of visual evoked potentials for faces and eyes in the human brain as determined by dipole localization. Brain Topography, 15, 51–63.
Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Abnormal use of facial information in high-functioning autism. Journal of Autism and Developmental Disorders, 37, 929–939.
Streit, M., Wolwer, W., Brinkmeyer, J., Ihl, R., & Gaebel, W. (2000). Electrophysiological correlates of emotional and structural face processing in humans. Neuroscience Letters, 278, 13–16.
Tamietto, M., Corazzini, L. L., de Gelder, B., & Geminiani, G. (2006). Functional asymmetry and interhemispheric cooperation in the perception of emotions from facial expressions. Experimental Brain Research, 171, 389–404.
Taylor, M. J., Edmonds, G. E., McCarthy, G., & Allison, T. (2001). Eyes first! Eye processing develops before face processing in children. Neuroreport, 12, 1671–1676.
Terzian, H., & Cecotto, C. (1959). Un nuovo metodo per la determinazione e lo studio della dominanza emisferica. Giornale di Psichiatria e di Neuropatologia, 87, 889–924.
Thierry, G., Martin, C. D., Downing, P., & Pegna, A. J. (2007). Controlling for inter-stimulus perceptual variance abolishes N170 face selectivity. Nature Neuroscience, 10, 505–511.
Tootell, R. B. H., Mendola, J. D., Hadjikhani, N. K., Liu, A. K., & Dale, A. M. (1998). The representation of the ipsilateral visual field in human cerebral cortex. Proceedings of the National Academy of Sciences of the USA, 95, 818–824.
Tucker, D. M. (1981). Lateral brain function, emotion and conceptualization. Psychological Bulletin, 89, 19–46.
Vinette, C., Gosselin, F., & Schyns, P. G. (2004). Spatio-temporal dynamics of face recognition in a flash: It's in the eyes. Cognitive Science, 28, 289–301.
Vuilleumier, P., & Pourtois, G. (2007). Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia, 45, 174–194.
Wager, T. D., Phan, K. L., Liberzon, I., & Taylor, S. F. (2003). Valence, gender, and lateralization of functional brain anatomy in emotion: A meta-analysis. Neuroimage, 19, 513–531.
Weddell, R. A. (1989). Recognition memory for emotional facial expressions in patients with focal cerebral lesions. Brain and Cognition, 11, 1–17.
Workman, L., Chilvers, L., Yeomans, H., & Taylor, S. (2006). Development of cerebral lateralisation for recognition of emotions in chimeric faces in children aged 5 to 11. Laterality, 11, 493–507.
Yovel, G., Levy, J., Grabowecky, M., & Paller, K. A. (2003). Neural correlates of the left-visual-field superiority in face perception appear at multiple stages of face processing. Journal of Cognitive Neuroscience, 15, 462–474.
Yovel, G., Tambini, A., & Brandman, T. (2008). The asymmetry of the fusiform face area is a stable individual characteristic that underlies the left-visual-field superiority for faces. Neuropsychologia, 46, 3061–3068.