To cite this article: Alan A. Beaton, Nathalie C. Fouquet, Nicola C. Maycock, Eleanor Platt, Laura S. Payne & Abigail Derrett (2012) Processing emotion from the eyes: A divided visual field and ERP study, Laterality: Asymmetries of Body, Brain and Cognition, 17:4, 486-514, DOI: 10.1080/1357650X.2010.517848
LATERALITY, 2012, 17 (4), 486–514
© 2012 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business
http://www.psypress.com/laterality http://dx.doi.org/10.1080/1357650X.2010.517848
PROCESSING EMOTION FROM THE EYES 487
The ability to effortlessly recognise someone by their face is one of the most
highly adaptive and complex functions that the visual system has to perform
and it is not yet entirely clear how this is achieved. At an early
neurophysiological stage of processing, cells have been found in temporal
cortex of non-human primates that respond selectively to faces (Bruce,
Desimone, & Gross, 1981; Perrett, Rolls, & Caan, 1982; Perrett et al., 1984)
and even to particular views of individual faces (Perrett et al., 1985). Some
cells fire only to faces in which the eyes are looking at the participant and
not if the eyes are looking away (see Perrett et al., 1985; for reviews see
Desimone, 1991; Rolls, 2007). Whether a particular animal is the object of
regard of another has, of course, great biological and social significance, as it
has for humans (Itier & Batty, 2009). This is borne out by the results of an
fMRI study which showed that the region of the ventral striatum became
active when (human) participants looked at a photograph of an attractive
female in which the eyes appeared to be looking directly at the participant.
When the eyes were looking away, this region was not active or much less so
(Kampe, Frith, Dolan, & Frith, 2001). Part of the superior temporal sulcus
is said to be an important cortical region for visual processing of the
movements and gaze of another person's eyes (Hasselmo, Rolls, & Baylis,
1989). This region is activated when an ambiguous stimulus is perceived as a
pair of eyes but not when perceived as something else (Kingstone, Tipper,
Ristic, & Ngan, 2004).
The eyes are more important than the mouth in an adult's representation of
a face in memory (McKelvie, 1976) and information from the eyes is extracted
at a very early stage in processing faces (Vinette, Gosselin, & Schyns, 2004).
Infants learn to attend to the eyes before they attend to the mouth (Caron,
Caron, Caldwell, & Weiss, 1973) or the complete face (Taylor, Edmonds,
McCarthy, & Allison, 2001). These facts point to the salience of the eyes in
providing information that is useful in a wide variety of situations such as
interpreting facial expressions of emotion (for reviews see Itier & Batty, 2009;
Vuilleumier & Pourtois, 2007).
In his book The Expression of the Emotions in Man and Animals Darwin
(1872, pp. 170–171) wrote:
The movements of the expression in the face and body are of much
importance for our welfare. They serve as the first means of communica-
tion between the mother and her infant; she smiles approval, or frowns
disapproval, and thus encourages her child on the right path. We readily
perceive sympathy in others by their expression; our sufferings are thus
mitigated, our pleasures increased, and mutual good feelings strengthened.
Both humans and non-human primates (Emery, 2000) attend more to the
eyes than to other regions of the face (such as nose or mouth) when judging
component, the N170, occurs approximately 140 to 200 milliseconds after the
presentation of a face. It is most prominent over occipito-temporal sites and is
thought to be generated by neurons in face-selective areas of the fusiform
gyrus (see Kanwisher, McDermott, & Chun, 1997; Yovel, Tambini, &
Brandman, 2008) and superior temporal sulcus (Hasselmo et al., 1989; for
reviews see Dekowska, Kuniecki, & Jaskowski, 2008; Haxby, Hoffman, &
Gobbini, 2000). Its magnitude at lateral posterior electrode sites is reliably
greater in response to the appearance of a face than to any other class of
stimulus. It is widely regarded as reflecting a stage of pre-categorical
processing during which the structural components of a face are configurally
encoded in a representation that may subsequently be used by recognition or
identification processes (Bentin & Deouell, 2000; Bentin, Allison, Puce, Perez,
& McCarthy, 1996; Eimer, 1998, 2000a; Jacques & Rossion, 2009; but see
Thierry, Martin, Downing, & Pegna, 2007; and counter-arguments and review
by Rossion & Jacques, 2008).
Bentin et al. (1996) proposed that the N170 reflects activity of an eye-
sensitive region of the cortex. Eimer (1998) tested this proposal by presenting
either complete faces or faces from which the eyes and eyebrows had been
removed. It was argued that if the N170 component is generated by processes
sensitive to the presence of eyes, then it should be absent or greatly attenuated
in the latter condition. On the contrary, the results of the study showed that
the difference waveform between houses, used as control stimuli, and faces
was more or less identical whether the eyes were included in a face or not.
Eimer (1998) concluded that "Rather than being linked to processes devoted to the detection of single features, the N170 is more likely to be caused by the structural encoding of different face components" (p. 2948).
There is some evidence that the N170 recorded for faces and for eyes
alone comes from different cortical generators (Shibata et al., 2002; Taylor
et al., 2001). Further, amplitude of the N170 component of the EEG has
been found to be greater in response to isolated eyes than to faces with the
eyes removed or to normal upright faces (Bentin et al., 1996; Itier, Alain,
Sedore, & McIntosh, 2007; Itier, Latinus, & Taylor, 2006; Jemel, George,
Chaby, Fiori, & Renault, 1999; Taylor et al., 2001). Itier et al. (2007)
suggested that this reflects the fact that normal faces are processed
configurally, and responded to by face-selective neurons, whereas isolated
eyes elicit activity in populations of both eye-selective neurons, that are not
active when faces are processed configurally, and face-selective neurons (Itier
et al., 2007).
The N170 component has been reported in some studies utilising normal
faces presented in central vision (e.g., Bentin et al., 1996; Itier & Taylor,
2004; Luo, Feng, He, Wang, & Luo, 2010; Rossion, Joyce, Cottrell, & Tarr,
2003; Rossion et al., 1999; Shibata et al., 2002; Streit, Wolwer, Brinkmeyer,
Ihl, & Gaebel, 2000) or centrally presented chimeric faces (Yovel et al., 2003)
to be of greater amplitude over the right than the left cerebral hemisphere.
With regard to isolated eyes presented centrally, Bentin et al. (1996) reported
that although in two experiments the amplitude of N170 was greater over the
right than the left hemisphere, the statistical significance of this finding was
inconsistent. Shibata et al. (2002) state that N170 amplitudes recorded at
occipito-temporal electrode sites were significantly greater over the right
than the left hemisphere for faces but do not provide data or results for their
eyes alone condition.
In ERP experiments to date involving presentation of isolated eyes the
task has been purely perceptual as, for example, in the target detection study
by Bentin et al. (1996) and the studies by Itier and colleagues (2006, 2007)
in which participants had to decide whether the eyes were presented in an
upright or inverted orientation. Furthermore, the relatively few studies on
the N170 involving isolated eyes as stimuli have presented them in central
vision. We therefore wished to determine whether there is any hemispheric
difference in amplitude of N170 when isolated eyes are presented unilaterally
to the left or right of fixation in a perceptual matching (recognition) task. We
also wanted to investigate whether the N170 component occurs when
participants make judgements of emotion to laterally presented eyes and, if
so, whether there is any greater laterality effect in amplitude of the N170
component in this task compared with the simple recognition task.
We did not expect to find a difference between left and right visual
hemifields in accuracy of recognition of isolated eyes and therefore did not
predict a hemispheric difference in amplitude of the N170 response, despite
the observation of Bentin et al. (1996), who used centrally presented eyes,
that the N170 response was (inconsistently) greater over the right than the
left hemisphere in a target detection task. However, given the salience of the
eyes to observers' judgements of emotion and the voluminous research
implicating the right hemisphere in the recognition of emotion from faces
(Bourne, 2005; Coolican et al., 2008; Demaree et al., 2005; Ley & Bryden,
1979; Mandal et al., 1991, 1999; Schmitt et al., 1997; Tamietto et al., 2006;
Weddell, 1989; for reviews see Itier & Batty, 2009; Vuilleumier & Pourtois,
2007), we predicted, for the task of judging emotion, that presentation of the
eyes to the left visual hemifield (right cerebral hemisphere) would lead to a
greater amplitude of N170 response than presentation to the right visual
hemifield (left cerebral hemisphere). We were able to test these predictions
with a group of undergraduate participants; unfortunately, we were unable
to carry out EEG recording with older participants.
To summarise, the aims of this study were to determine (i) whether there is
any visual hemifield asymmetry in recognition of unilaterally presented pairs
of eyes, (ii) whether there is any visual hemifield asymmetry in judging the
emotions expressed by those eyes, (iii) whether younger and older adults show
any difference in relative hemifield accuracy and/or speed of performance, and
(iv) whether there is any difference in amplitude of the N170 component of the
EEG at lateral posterior electrode sites in the emotional judgement task (or,
indeed, in the simple recognition task). Our predictions were that there would
be no hemifield performance difference on the recognition task, but that there
would be a left hemifield (right hemisphere) advantage in judging emotion,
that this would be attenuated (in line with the right hemisphere ageing
hypothesis) in older compared with younger participants, and that the amplitude
of the N170 component would be equivalent over the two hemispheres in the
recognition task but greater over the right than the left hemisphere in the task of
judging emotion from the eyes.
METHOD
Participants
There were two groups of participants. Younger participants were 16 male
volunteers all of whom were aged 20–22 years. All were strongly right-handed as determined by Annett's (1970) handedness inventory except for
two participants who, it turned out after the experiment, showed left-handed
tendencies. However, their data were entirely in keeping with the other
participants and were retained for analysis. (Analyses excluding these two
participants showed the same effects as with them included.) Data from two
right-handed participants had to be dropped because of technical difficulties
with recording. This report is therefore based on 14 younger participants.
There were six male and nine female older right-handed participants. All
participants confirmed that they were over the age of 50 years but the
majority declined to give their exact age. The oldest was a female of 75 years
of age. No participant suffered from any known neurological or medical or
psychiatric condition likely to have affected their ability to perform the
experiment. Most of these older participants were unwilling to travel to the
laboratory to participate in EEG recording so this was omitted.
EEG recording
For each of the younger participants, continuous EEG data were recorded
from 32 channels (Ag/AgCl sintered electrodes placed according to the international 10–20 system) using Scan 4.3 software on a SynAmps system. The electrodes were embedded in a rubber cap that was stretched over the scalp. Data were recorded using a left ear lobe reference at a 500-Hz sampling rate with a 0.05–100 Hz band-pass filter with the notch filter on (50 Hz). Before analysis, recordings were re-referenced using the common
average method.
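The common-average re-referencing step amounts to simple arithmetic, sketched here as a minimal numpy illustration (the study itself used the Scan software; the array shapes are assumed for the example):

```python
import numpy as np

def rereference_common_average(eeg):
    """Re-reference EEG recorded against a single site (here, the left
    ear lobe) to the common average: subtracting the mean across
    channels at every sample makes the channel average zero.

    eeg: array of shape (n_channels, n_samples).
    """
    return eeg - eeg.mean(axis=0, keepdims=True)

# Synthetic example: 32 channels, one second at the 500-Hz sampling rate.
rng = np.random.default_rng(0)
data = rng.standard_normal((32, 500))
rereferenced = rereference_common_average(data)
```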
Procedure
Stimuli were presented on a computer screen to the participant, who sat with
his/her eyes at a fixed distance of approximately 115 cm from the screen, head position being maintained constant relative to fixed points on the laboratory walls. In one condition, the recognition condition,
the participant had to indicate by means of a button press whether a centrally
presented photograph of a pair of eyes matched a subsequent pair of eyes (Yes/
No) presented in one or other visual hemifield. This will be referred to as the
eyes condition. The participant first saw a central fixation point for 200 ms
which was replaced by a pair of eyes presented at fixation for 500 ms. This was
followed by re-presentation of the central fixation point for 300 ms before a
second pair of eyes was presented to the left or right of fixation for 200 ms.
There was a variable inter-trial interval of 600 to 1700 ms between the participant's response and the re-appearance of the fixation point signalling the next trial. The dimensions of each photograph on the screen were approximately 12 cm × 5 cm. The inside border of the photograph was 4 cm to the left or right of the central fixation point. Assignment of the labels Yes (same) and No (different) to the response buttons was counterbalanced across participants. Participants used their right hand to respond.
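The trial structure just described can be summarised schematically as follows. This is a reconstruction for clarity, not the actual E-prime script, and the event names are our own labels:

```python
import random

# Event durations (ms) for one trial in the eyes (recognition)
# condition, as described in the Procedure.
FIXATION_1 = 200    # central fixation point
CENTRAL_EYES = 500  # first pair of eyes at fixation
FIXATION_2 = 300    # fixation point re-presented
LATERAL_EYES = 200  # second pair of eyes, left or right of fixation

def trial_events(hemifield):
    """Return the ordered (event, duration_ms) schedule for one trial."""
    return [
        ("fixation", FIXATION_1),
        ("central eyes", CENTRAL_EYES),
        ("fixation", FIXATION_2),
        ("eyes in " + hemifield + " hemifield", LATERAL_EYES),
    ]

def inter_trial_interval():
    """Variable ITI of 600-1700 ms between the response and the
    re-appearance of the fixation point."""
    return random.randint(600, 1700)
```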
In a second condition, recognition of emotion, a word was presented at
fixation followed by a pair of eyes in one or other visual hemifield as before.
The task was to decide (Yes/No) whether the centrally presented word was a
correct (appropriate) description of the emotion expressed by the eyes. This
will be referred to as the word condition. The procedure was exactly as in the
first condition except that this time the centrally presented stimulus was a
word rather than a pair of eyes. In a preliminary check we examined the degree
to which people agreed with the labels assigned by Baron-Cohen et al. (2001)
to the sets of eyes. The stimulus items and response alternatives in our
experiment were selected in the light of this preliminary check. Thus we had
some evidence that for match trials the eyes were indeed paired with the
appropriate verbal labels.
Programs for presentation of the stimuli and recording of responses were
written in E-Prime (Psychology Software Tools, Inc.). In each condition there
were 72 stimulus presentations to each visual hemifield; 36 matched the
centrally presented pair of eyes or word, hereafter referred to as matching
trials, 36 were different to the centrally presented pair of eyes or word,
hereafter referred to as non-matching trials. There were therefore 144 trials
in each of the eyes (recognition) and word (emotion) conditions.
Half the number of participants in each group completed the eyes
condition first followed by the word condition and the other half completed
the tasks in the reverse order. Within each condition, match and non-match
trials appeared in random order and the stimulus items were presented to
participants in different random orders.
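The design of each condition (72 presentations per hemifield, half matching, shuffled) can be sketched as below; the function and field names are illustrative and not taken from the study's E-prime scripts:

```python
import random

def build_trial_list(seed=None):
    """Build one condition's trial list: 72 presentations to each
    visual hemifield, half matching and half non-matching, giving
    144 trials in total, shuffled into random order.

    A schematic reconstruction of the design described in the text.
    """
    rng = random.Random(seed)
    trials = [
        {"hemifield": hf, "trial_type": tt}
        for hf in ("left", "right")
        for tt in ("match", "non-match")
        for _ in range(36)
    ]
    rng.shuffle(trials)  # match and non-match trials in random order
    return trials
```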
RESULTS
Preliminary analysis showed that the order in which the conditions were
administered had no significant effect on the results and the data for the two
orders were therefore combined for subsequent analysis. There was nothing
in the data to suggest a sex difference and results were therefore combined
across men and women. Only data for trials on which a participant's response was correct were used in the analyses.
Accuracy
The mean number (and standard deviations) of correct responses for
matching and non-matching correct trials for the eyes (recognition) and
word (emotion) conditions are shown for younger and older participants in
Table 1.
Because we were uninterested in the difference in overall accuracy levels
on the two tasks (eyes and word conditions), the data were analysed for the
two conditions separately rather than condition being entered as a factor in
an analysis of variance. This simplifies presentation of the data as it obviates
discussion of theoretically uninteresting interaction terms.
The data of Table 1 were submitted to three-way analysis of variance for a
mixed design with visual hemifield (left vs right) and trial type (matching vs
non-matching) as repeated measures factors. The between-participants factor
was group (younger versus older). For the eyes (recognition) condition, there
was no significant effect of visual hemifield, F(1, 27) < 1, p = .986, but trial type was significant, F(1, 27) = 4.43, p < .045, η²p = .14, as was participant group, F(1, 27) = 10.03, p = .004, η²p = .27. There were more correct responses
TABLE 1
Mean number (SD) of correct responses to presentation of matching and non-matching trials in each visual hemifield in the eyes (recognition) and word (emotion) conditions
on non-match than match trials and younger participants were more accurate
than older participants. There was no other significant main effect and no
significant two- or three-way interaction.
For accuracy in the word (emotion) condition, there was also no significant effect of visual hemifield, F(1, 27) < 1, p = .986, but participant group was significant, F(1, 27) = 5.47, p = .027, η²p = .17. The two-way interaction between trial type and group was significant, F(1, 27) = 19.47, p < .001, η²p = .42, as was the three-way interaction between hemifield, trial type, and group, F(1, 27) = 4.57, p = .042, η²p = .15. There was no other statistically significant effect. Decomposition of the three-way interaction showed that the hemifield-by-trial type interaction was significant only for the older group, F(1, 14) = 4.69, p = .048, η²p = .25, and that the group-by-hemifield interaction was significant, F(1, 27) = 7.98, p = .009, η²p = .23, only for non-match trials. The group-by-trial type interaction was significant for both the left hemifield, F(1, 27) = 26.15, p < .001, η²p = .49, and the right hemifield, F(1, 27) = 11.19, p = .002, η²p = .29. Further decomposition of the two-way
interactions revealed that the significance of the three-way interaction
(between hemifield, trial type, and group) is a reflection of the fact that on
match trials there was no difference between hemifields for either younger or
older participants, but that on non-match trials the difference between
hemifields was significant for the older group (t = 2.65, df = 14, p = .019) but not for the younger group. In addition, for younger participants matches were significantly more accurate than non-matches in the right hemifield (t = 13.55, df = 13, p < .001) but not the left hemifield, whereas for older participants non-matches were more accurate than matches in both the left hemifield (t = 8.13, df = 14, p < .001) and the right (t = 3.17, df = 14, p = .007) hemifield.
Response times
Table 2 shows the mean response times (and standard deviation) in
milliseconds for the eyes and word conditions for each group of participants.
Response times of Table 2 were submitted to separate three-way analyses of variance for the two conditions. For the eyes (recognition) condition, the only significant main effect was group, F(1, 27) = 67.89, p < .001, η²p = .72, younger participants being faster than older participants. The interaction between group and trial type was also significant, F(1, 27) = 15.38, p = .001, η²p = .36. This was due to the fact that younger participants were faster to give correct responses to matches than non-matches (t = 2.38, df = 13, p = .033), whereas older participants showed the opposite effect (t = 3.33, df = 14, p = .005).
TABLE 2
Mean response times (SD) (ms) for correct trials for the eyes (recognition) and word
(emotion) conditions
For the word (emotion) condition, there was a significant main effect of group, F(1, 27) = 16.51, p < .001, η²p = .38, a significant interaction between group and trial type, F(1, 27) = 15.52, p = .001, η²p = .37, and a significant interaction between visual hemifield and trial type, F(1, 27) = 4.73, p = .039, η²p = .15. Younger participants were faster overall than older participants and faster to respond correctly to matches than to non-matches (t = 2.40, df = 13, p = .033), whereas older participants showed the reverse effect (t = 3.20, df = 14, p = .006). Overall, correct responses to non-matches were faster for the left than the right hemifield (t = 3.51, df = 28, p = .002), but responses to matches were non-significantly faster for the right than the left hemifield (t < 1).
Essentially the same results were obtained when response times were log
transformed in order to improve homogeneity of variance. For the eyes
condition, only the group-by-trial type interaction was significant, F(1, 27) = 20.81, p < .001, η²p = .44. For the word condition, visual hemifield, F(1, 27) = 4.35, p = .047, η²p = .14, was the only significant main effect. This was qualified by a two-way interaction between hemifield and trial type, F(1, 27) = 4.95, p = .035, η²p = .16. There was also a significant interaction between group and trial type, F(1, 27) = 16.52, p < .001, η²p = .38. These interactions were not
germane to the main questions being investigated and therefore were not
decomposed further.
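The log transformation applied here is the standard one for response times; a small sketch with made-up, purely illustrative values:

```python
import math
import statistics

def log_transform(rts_ms):
    """Natural-log transform of response times (ms). The transform
    compresses the long right tail of RT distributions, so a slower
    group's larger raw spread shrinks towards the faster group's,
    improving homogeneity of variance."""
    return [math.log(rt) for rt in rts_ms]

# Made-up illustrative RTs: the older group here is exactly twice as
# slow, so its raw SD is twice as large, but after the log transform
# the two groups' spreads are identical.
younger = [450.0, 480.0, 510.0, 560.0]
older = [900.0, 960.0, 1020.0, 1120.0]
```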
To summarise the main results of the behavioural data, in both the eyes
and the word conditions younger participants were more accurate than older
participants but there was no significant main effect of visual hemifield nor
any significant interaction involving this factor other than the three-way
interaction between field, group, and trial type. This reflected a different
pattern of response to match and non-match trials by the two groups,
especially in the left hemifield.
Raw response times were faster for younger than older participants but
showed no hemifield difference. Transformed response times, however, were
faster for the left visual hemifield and interacted with trial type but not
participant group.
Trial type     M       NM      M       NM      M       NM      M       NM
Condition
Eyes           4.16    4.24    7.07    6.72    3.16    3.28    5.21    4.90
(SD)           3.69    3.47    4.46    4.21    4.65    4.33    5.20    5.03

TABLE 4
Mean latency of ERP at left and right hemisphere temporal (T5, T6) and parieto-occipital (PO1/PO2) electrode sites for matching (M) and non-matching (NM) trials in the eyes and word conditions (correct responses, direct stimulation)

               T5              T6              PO1             PO2
Trial type     M       NM      M       NM      M       NM      M       NM
Condition
Eyes           184.3   176.1   176.4   175.7   176.3   174.4   170.7   172.3
(SD)           17.85   12.68   12.08   9.17    17.02   12.87   9.69    11.97
Word           175.3   172.9   172.3   174.3   173.7   170.6   170.0   171.9
(SD)           12.02   10.22   12.12   11.15   11.58   12.06   11.87   10.88
Figure 1. ERPs for direct stimulation and correct responses on match trials at left temporal (T5) and
right temporal (T6) electrode sites. As per convention, negative voltage is upwards.
DISCUSSION
The aims of this experiment were to investigate possible visual hemispheric
asymmetries in response to lateralised presentation of emotionally expressive
eyes. In the event, we found no overall visual hemifield difference for either
younger or older participants in same–different recognition accuracy for
briefly presented eyes. Nor was there any hemifield effect in either group for
the task of verifying the appropriateness of a given label to describe the
emotion expressed by the eyes, contrary to results from our initial pilot study
which suggested that a right hemisphere (left hemifield) superiority might be
found.
In both the eyes (recognition) and the word (emotion) condition, level of
accuracy was higher (and response times faster) for our younger than for our
older participants. However, on both tasks accuracy of performance by our
older participants was no worse for their right hemisphere (left visual
hemifield) than for their left hemisphere (right visual hemifield) and they
did not show a reduced difference between the hemifields in comparison with
younger participants. That is, there was no significant group by hemifield
interaction, although for non-match trials in the word condition (judgement of emotion) the difference between hemifields was significant for older participants but not younger participants (contributing to a significant three-way interaction). This difference favoured the left hemifield and thus, if anything, contradicts the right hemisphere ageing hypothesis.
An alternative hypothesis to the right hemisphere ageing hypothesis, the
hemispheric asymmetry reduction in older adults (HAROLD) model
(Cabeza, 2002; Dolcos et al., 2002), holds that hemispheric activity
(specifically of the pre-frontal cortex), tends to be less lateralised in older
than younger persons. This is said to give rise to reduced functional
asymmetry in older compared with younger people for tasks mediated by
pre-frontal cortex. In a recent fMRI study by Berlinger et al. (2010) a reduced
pattern of functional lateralisation of temporal regions was found in older
(mean age 62.0 years) compared with younger (mean age 26.5 years)
participants. These authors suggested that the HAROLD model should be
extended to apply to brain regions other than the pre-frontal cortex. However,
there is no support from our own data for extending the HAROLD model to
the cerebral hemispheres as a whole.
There was no significant difference in raw response times between the two
hemifields for either the eyes or the word condition for either group of
participants. Our older participants were on average about twice as slow to
make their responses as their younger counterparts. An advantage for
younger participants over older ones in speed (and accuracy) of discrimina-
tion of facial emotion has been reported by others (e.g., Gunning-Dixon et al.,
2003).
When the raw response times were log transformed, overall response
times in the word (emotion) condition were found to be significantly shorter
in the left than the right visual hemifield (as was found for raw response
times for non-match trials). This is consistent with findings discussed in the
introduction that show an overall superiority of the right hemisphere in
emotional processing of faces and facial features and, as far as our older
participants are concerned, contradicts the right hemisphere ageing hypoth-
esis in as much as there was no group by hemifield interaction.
It is, of course, quite possible that one reason for our failure to find a
difference in lateralisation between older and younger participants is that, as a
group, the former were not sufficiently advanced in age for any hemispheric
asymmetry in cognitive decline to show up. Most studies examining
performance on cognitive tasks within the context of the right hemisphere
ageing hypothesis have been carried out with participants over the age of
60 years. Our own participants were certainly over 50 but were reluctant to
state their exact ages to a much younger experimenter (AD) despite being
invited to do so. However, they all confirmed that they were over 50 which,
although they may have been considerably older, was considered a tactful way
of enabling the experimenter to estimate at least a lower age limit. In any case,
in at least some experiments, changes in lateralised performance with age have
been observed in participants under 60 years of age (e.g., Beaton et al., 2000).
Moreover, we note that there was a statistically significant overall difference in
accuracy of performance and in response times between our two groups of
participants, albeit that the difference was not large.
Having said this, the extended response times of our older compared with
younger participants mean that we would wish to be extremely cautious in
interpreting lack of visual hemifield asymmetry in response times of our
older participants as indicating a definite lack of hemispheric difference in
speed of processing. Any potential effect might well have been obscured by a
lack of sensitivity in the measure.
In most experiments employing a match versus non-match paradigm
responses to matches are faster and more accurate than to non-matches (see,
e.g., Humphrey & Lupker, 1993). This was more or less (the data are
not entirely consistent) what we observed in our younger group. Our
older participants, however, were faster (in both conditions) to decide on a
non-match response and more accurate, especially in the word condition.
Barrett, Rugg, and Perrett (1988), in a face-matching experiment, also
observed a reversal of the usual pattern of performance. Conceivably our
older participants used different strategies or criteria to perform the matching
tasks to those used by younger participants. This may have interacted with (or
indeed been promoted by) the longer response times of the older group. For
example, older participants might have required much more (neural) evidence
for a match than a non-match decision, whereas younger participants
might have adopted a less-stringent approach. If so, any fading in initial
representation of a stimulus with an increase in response time for older
participants would make it increasingly less likely that they would decide in
favour of a match. This would lead to an increase in the proportion of correct
responses on non-match compared with match trials.
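This criterion account can be made concrete with a toy evidence-accumulation sketch. It is entirely illustrative: not a model fitted to our data, and every parameter value is arbitrary:

```python
import random

def decide(signal, criterion, decay, max_steps, rng):
    """Toy accumulator: evidence for a 'match' builds from a decaying
    stimulus representation plus Gaussian noise; if the criterion is
    never reached before time runs out, respond 'non-match'."""
    evidence, strength = 0.0, signal
    for _ in range(max_steps):
        evidence += strength + rng.gauss(0, 0.5)
        strength *= decay  # the initial representation fades over time
        if evidence >= criterion:
            return "match"
    return "non-match"

def match_rate(criterion, n=500, seed=0):
    """Proportion of 'match' decisions over n simulated trials."""
    rng = random.Random(seed)
    return sum(decide(0.2, criterion, 0.95, 50, rng) == "match"
               for _ in range(n)) / n

# A stricter criterion (more evidence demanded before accepting a
# match) pushes decisions towards "non-match", as suggested above
# for the older group.
lenient = match_rate(criterion=3.0)
strict = match_rate(criterion=15.0)
```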
expression. Eimer, Holmes, and McGlone (2003) found that the amplitude
of the N170 component at lateral temporal electrodes (T5 and T6) was
similar for emotional and neutral faces with no difference in magnitude as a
function of different emotional expressions (see also, e.g., Hermann et al.,
2002). Eimer and Holmes (2007) suggested that "the structural encoding of faces, as reflected by the N170, is insensitive to information derived from emotional facial expression" (p. 21). However, some authors have obtained
contrary results (e.g., Batty & Taylor, 2003; Leppanen et al., 2007; Sato,
Kochiyama, Yoshikawa, & Matsumura, 2001). It is at least possible,
therefore, that N170 reflects more than structural encoding and is influenced
by emotional processing (as required in our word condition but not our eyes
condition).
An alternative explanation of why the magnitude of N170 was greater in
our word condition than in the eyes condition relates to the fact that, in the
latter condition, two sets of eyes were shown in close temporal succession. In a
number of studies in which one face was shown shortly after another, a
reduction in magnitude of the N170 response to the second presentation of
the same face has been reported, whether stimulus masking was employed
(e.g., Henson, Mouchlianitis, Matthews, & Kouider, 2008) or not (e.g.,
Caharel, d'Arripe, Ramon, Jacques, & Rossion, 2009; Jacques & Rossion,
2009; Eimer, Kiss, & Nicholas, 2010). Importantly, a reduction in amplitude
of response to presentation of a second (unmasked) face that was different to
the first face has been reported in a magnetoencephalography (MEG) study
(Harris & Nakayama, 2007); that is, the faces do not need to be identical for
an adaptation effect to occur. The reduction in amplitude of the M170
response, analogous to the N170 response obtained using EEG, was much
greater than that observed after presentation of two different stimuli from
another stimulus category (houses) thereby indicating that the amplitude
reduction, or repetition effect, was face-specific. However, Eimer et al. (2010)
have recently reported finding an adaptation effect with eyes only. These
authors argued that the N170 component "reflects the activation of neurons that respond equally strongly to full faces and to face parts. The N170 is reduced in amplitude whenever 2 stimuli that activate these neurons are presented in rapid succession" (p. 9). We suggest that a similar repetition effect
for eyes can explain the fact that in our experiment the amplitude of the N170
component was reduced in the eyes condition relative to the word condition.
(In the word condition the eyes were presented only once per trial, following
presentation of a word).
With regard to the small second negative peak observable at about 300 ms
post-stimulus onset (N300) in our data (see Figure 1), we note that other
authors studying face processing have found a negative deflection at or close
to this latency (N300 component) at temporal electrode sites (e.g., Carretie &
Iglesias, 1995; Dennis & Chen, 2007; George, Evans, Fiori, Davidoff, &
Renault, 1996; Sato et al., 2001) albeit of somewhat larger amplitude than is
apparent in our data. Some researchers have noted negativity effects at 400
ms post stimulus onset or even later (see Barrett et al., 1988; Vuilleumier &
Pourtois, 2007). Bentin and Deouell (2000) refer to a face N400 as being
analogous to the N400 component reported with verbal stimuli and
generally considered to correspond to a stage of semantic classification
(for review see Lau, Phillips, & Poeppel, 2008).
As we had no a priori hypothesis other than in regard to the N170
component, we did not present a detailed analysis of the N300 component or
of the sustained magnitude difference between conditions and hemispheres. We
simply note in passing that George et al. (1996) found that the N300 in
response to faces was greater for scrambled than unscrambled faces and
greater over the right temporal than the left temporal area. Similar to the data
of our study, the temporal negativity extended for some two hundred
milliseconds beyond the peak. George et al. (1996) considered that their
results "provide evidence for the existence of a long duration negativity (from
150 to 350 ms) associated with the processing of scrambled faces" (p. 71).
Since a scrambled face, unlike a normal face, is considered to be processed
in a non-configural manner, the greater amplitude of N300 for scrambled
than unscrambled faces found by George et al. (1996) suggests that it is the
processing of individual facial features, such as eyes, that leads to an
enhanced N300 component. Luo et al. (2010) suggested that the amplitude of
the N300 (and of a P300 component) "probably reflects further evaluation of
information related to the affective valence of a face" (p. 1865). These
authors propose a three-stage model of facial expression processing. In the
first stage faces are processed for the presence of a negatively valenced
emotion (such as threat). In the second stage, indexed by the N170 response,
emotionally charged faces are said to be distinguished from neutral faces,
while in the third stage, indexed by the N300 response, different facial
expressions are distinguished from one another.
Before concluding our discussion, we should acknowledge a theoretical
limitation. We have not discussed the results of our experiment in relation to
the valence (or the approach–withdrawal) hypothesis since it was difficult, if
not impossible, to examine our results from this perspective (for recent
discussion of the valence hypothesis in relation to facial expression of emotion
see Alves, Aznar-Casanova, & Fukusima, 2009). This is because very few of
the labels assigned by Baron-Cohen to the eyes we used could be readily
classified as positive or negative. For example, what is the correct designation
for eyes said to show "concern" or "interest" or "flirtatiousness"?
When all is said and done, the term "emotion" refers to a wide range of
affective experience and behaviour. Different emotions (see e.g., Calder,
Lawrence, & Young, 2001; Murphy, Nimmo-Smith, & Lawrence, 2003) and
different aspects of emotion (perception, expression, feeling) undoubtedly
depend on different regions and sub-systems within the brain (Batty &
Taylor, 2003; Davidson, 2003; Murphy et al., 2003). The extent to which the
emotional processes underlying any particular experimental task are
lateralised is bound to vary. Hence it is probably inappropriate to think in
terms of an overall hemispheric asymmetry for emotion in general, or
indeed, for any emotion in particular (see Murphy et al., 2003). In a meta-
analytic review of findings from neuroimaging studies, Wager, Phan,
Liberzon, and Taylor (2003) found no support for the view that the right
hemisphere is dominant for emotion, and only very limited support for
the hypothesis that lateralisation of emotion is valence-specific (see also
Murphy et al., 2003). They argued that "the cerebral hemisphere is too
general a unit of analysis to describe data from neuroimaging" (p. 524).
To conclude, our experiment found no overall behavioural difference,
either in younger or older participants, between left and right hemifield
responses to lateralised presentation of emotionally expressive eyes. This was
the case whether the task required participants to match one pair of eyes to
another pair of eyes previously exposed in central vision or to compare the
laterally presented eyes with a word exposed in central vision. Nor is there
anything in the data we have reported to support the view that with increased
age there is a diminution in right hemisphere relative to left hemisphere ability
to match sets of eyes or to judge emotion solely from the region of the eyes.
However, despite the lack of a behavioural asymmetry between left and right
visual hemifields, in our group of young adults we found that the amplitude of
the N170 component of the ERP was greater over the right than the left
temporal and parieto-occipital regions of the scalp in both our experimental
tasks. The magnitude of this response was significantly greater when the eyes
had to be matched to a previously presented word than when they had to be
matched to a previously presented pair of eyes. These ERP findings may be
interpreted as reflecting a greater involvement of the right than the left
hemisphere in processing isolated eyes and suggest the utility of using
electrophysiological techniques in combination with lateralised visual input to
investigate emotional asymmetry in the human brain throughout the life
span.
REFERENCES
Adolphs, R. (2008). Fear, faces and the human amygdala. Current Opinion in Neurobiology, 18,
166–172.
Alves, N. T., Aznar-Casanova, J. A., & Fukusima, S. S. (2009). Patterns of brain asymmetry in the
perception of positive and negative facial expressions. Laterality, 14, 256–272.
Alves, N. T., Fukusima, S. S., & Aznar-Casanova, J. A. (2008). Models of brain asymmetry in
emotional processing. Psychology and Neuroscience, 1, 63–66.
Annett, M. (1970). A classification of hand preference by association analysis. British Journal of
Psychology, 61, 303–321.
Calder, A. J., Keane, J., Manly, T., Sprengelmeyer, R., Scott, S., Nimmo-Smith, I., et al. (2003).
Facial expression recognition across the adult life span. Neuropsychologia, 41, 195–202.
Calder, A., Lawrence, A. D., & Young, A. W. (2001). Neuropsychology of fear and loathing. Nature
Reviews Neuroscience, 2, 352–363.
Calder, A., Young, A. W., Rowland, D., Perrett, D. I., Hodges, J. R., & Etcoff, N. L. (1996). Facial
emotion recognition after bilateral amygdala damage: Differentially severe impairment of fear.
Cognitive Neuropsychology, 13, 699–745.
Caron, A. J., Caron, R. F., Caldwell, R. C., & Weiss, S. J. (1973). Infant perception of the structural
properties of the face. Developmental Psychology, 9, 385–399.
Carretie, L., & Iglesias, J. (1995). An ERP study on the specificity of facial expression processing.
International Journal of Psychophysiology, 19, 183–192.
Castro-Schilo, L., & Kee, D. W. (2010). Gender differences in the relationship between emotional
intelligence and right-hemisphere lateralization for facial processing. Brain and Cognition, 73,
62–67.
Christman, S. D. (1989). Perceptual characteristics in visual laterality research. Brain and
Cognition, 11, 238–257.
Christman, S. D., & Hackworth, M. D. (1993). Equivalent perceptual asymmetries for free viewing
of positive and negative emotional expressions in chimeric faces. Neuropsychologia, 31,
621–624.
Coolican, J., Eskes, G. A., McMullen, P. A., & Lecky, E. (2008). Perceptual biases in processing
facial identity and emotion. Brain and Cognition, 66, 176–187.
Darwin, C. R. (1872). The expression of the emotions in man and animals. London: John Murray.
Davidson, R. J. (1995). Cerebral asymmetry, emotion, and affective style. In R. J. Davidson &
K. Hugdahl (Eds.), Brain asymmetry (pp. 361–387). Cambridge, MA: MIT Press.
Davidson, R. J. (2003). Affective neuroscience and psychophysiology: Towards a synthesis.
Psychophysiology, 40, 655–665.
Dekowska, M., Kuniecki, M., & Jaskowski, P. (2008). Facing facts: Neuronal mechanisms of face
perception. Acta Neurobiologiae Experimentalis, 68, 229–252.
Demaree, H. A., Everhart, D. E., Youngstrom, E. A., & Harrison, D. W. (2005). Brain
lateralization of emotional processing: Historical roots and a future incorporating
"dominance". Behavioral and Cognitive Neuroscience Reviews, 4, 3–20.
Dennis, T. A., & Chen, C-C. (2007). Emotional face processing and attention performance in three
domains: Neurophysiological mechanisms and moderating effects of trait anxiety. International
Journal of Psychophysiology, 65, 10–19.
Desimone, R. (1991). Face-selective cells in the temporal cortex of monkeys. Journal of Cognitive
Neuroscience, 3, 1–8.
Dimond, S. J. (1972). The double brain. Edinburgh, UK: Churchill Livingstone.
Dimond, S. J., Farrington, L., & Johnson, P. (1976). Differing emotional response from right and
left hemispheres. Nature, 261, 690–692.
Dolcos, F., Rice, H. J., & Cabeza, R. (2002). Hemispheric asymmetry and aging: Right hemisphere
decline or asymmetry reduction? Neuroscience and Biobehavioral Reviews, 26, 819–825.
Eimer, M. (1998). Does the face-specific N170 component reflect the activity of a specialized eye
processor? Neuroreport, 9(13), 2945–2948.
Eimer, M. (2000a). The face-specific N170 component reflects late stages in the structural encoding
of faces. Neuroreport, 11, 2319–2324.
Eimer, M. (2000b). Attentional modulations of event-related brain potentials sensitive to faces.
Cognitive Neuropsychology, 17, 103–116.
Eimer, M., & Holmes, A. (2007). Event-related brain potential correlates of emotional face
processing. Neuropsychologia, 45, 15–31.
Eimer, M., Holmes, A., & McGlone, F. P. (2003). The role of spatial attention in the processing of
facial expression: An ERP study of rapid brain responses to six basic emotions. Cognitive,
Affective and Behavioral Neuroscience, 3, 97–110.
Eimer, M., Kiss, M., & Nicholas, S. (2010). Response profile of the face-sensitive N170 component:
A rapid adaptation study. Cerebral Cortex, 20, 2442–2452.
Ellis, A. W., & Brysbaert, M. (2010). Split fovea theory and the role of the two hemispheres in
reading: A review of the evidence. Neuropsychologia, 48, 353–365.
Emery, N. J. (2000). The eyes have it: The neuroethology, function and evolution of social gaze.
Neuroscience and Biobehavioral Reviews, 24, 581–604.
Failla, C. V., Sheppard, D. M., & Bradshaw, J. L. (2003). Age and responding-hand related changes
in performance of neurologically normal subjects on the line bisection and chimeric-faces task.
Brain and Cognition, 52, 353–363.
Fusar-Poli, P., Placentino, A., Carletti, E., Allen, P., Landi, P., Abbamonte, M., et al. (2009).
Laterality effect on emotional face processing: ALE meta-analysis of evidence. Neuroscience
Letters, 452, 262–267.
Gainotti, G. (1969). So-called catastrophic reactions and indifferent behaviour during cerebral
trauma. Neuropsychologia, 16, 369–374.
Gainotti, G. (1972). Emotional behavior and hemispheric side of the lesion. Cortex, 8, 41–55.
George, N., Evans, J., Fiori, N., Davidoff, J., & Renault, B. (1996). Brain events related to normal
and moderately scrambled faces. Cognitive Brain Research, 4, 65–76.
Goldstein, G., & Shelly, C. (1981). Does the right hemisphere age more rapidly than the left?
Journal of Clinical Neuropsychology, 3, 65–78.
Goldstein, K. (1939). The organism. New York: American Book Publishers.
Gray, J. (1982). The neuropsychology of anxiety: An enquiry into the functions of the septo-
hippocampal system. Oxford, UK: Oxford University Press.
Gunning-Dixon, F. M., Gur, R. C., Perkins, A. C., Schroeder, L., Turner, T., Turetsky, B. I., et al.
(2003). Age-related differences in brain activation during emotional face processing.
Neurobiology of Aging, 24, 285–295.
Harris, A., & Nakayama, K. (2007). Rapid face-selective adaptation of an early extrastriate
component in MEG. Cerebral Cortex, 17, 63–70.
Hasselmo, M., Rolls, E. T., & Baylis, G. C. (1989). The role of expression and identity in the face-
selective responses of neurons in the temporal visual cortex of the monkey. Behavioral Brain
Research, 32, 329–342.
Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2000). The distributed human neural system for
face perception. Trends in Cognitive Sciences, 4, 223–233.
Henson, R. N., Mouchlianitis, E., Matthews, W. J., & Kouider, S. (2008). Electrophysiological
correlates of masked face priming. Neuroimage, 40, 884–895.
Hermann, M. J., Aranda, D., Ellgring, H., Mueller, T. J., Strik, W. K., Heidrich, A., et al. (2002).
Face-specific event potential in humans is independent from facial expression. International
Journal of Psychophysiology, 45, 241–244.
Hernandez, N., Metzger, A., Magne, R., Bonnet-Brilhaut, F., Roux, S., Barthelemy, C., et al.
(2009). Exploration of core features of a human face by healthy and autistic adults analyzed by
visual scanning. Neuropsychologia, 47, 1004–1012.
Honda, Y., Watanabe, S., Nakamura, M., Miki, K., & Kakigi, R. (2007). Interhemispheric
difference for upright and inverted face perception in humans: An event-related potential study.
Brain Topography, 20, 31–39.
Humphrey, G. K., & Lupker, S. (1993). Codes and operations in picture matching. Psychological
Research, 55, 237–247.
Hunter, Z. R., & Brysbaert, M. (2008). Visual half-field experiments are a good measure of cerebral
language dominance if used properly: Evidence from fMRI. Neuropsychologia, 46, 316–325.
Iidaka, T., Okada, T., Murata, T., Omori, M., Kosaka, H., Sadato, N., et al. (2002). Age-related
differences in the medial temporal lobe responses to emotional faces as revealed by fMRI.
Hippocampus, 12, 352–362.
Itier, R., & Batty, M. (2009). Neural bases of eye and gaze processing: The core of social cognition.
Neuroscience and Biobehavioral Reviews, 33, 843–863.
Itier, R. J., Alain, C., Sedore, K., & McIntosh, A. R. (2007). Early face processing specificity: It's in
the eyes! Journal of Cognitive Neuroscience, 19, 1815–1826.
Itier, R. J., Latinus, M., & Taylor, M. J. (2006). Face, eye and object early processing: What is the
face specificity? Neuroimage, 29, 667–676.
Itier, R. J., & Taylor, M. J. (2004). Effects of repetition learning on upright, inverted and contrast-
reversed face processing using ERPs. Neuroimage, 21, 1518–1532.
Jacques, C., & Rossion, B. (2009). The initial representation of individual faces in the right
occipito-temporal cortex is holistic: Electrophysiological evidence from the composite face
illusion. Journal of Vision, 9, 1–16.
Jemel, B., George, N., Chaby, L., Fiori, N., & Renault, B. (1999). Differential processing of part-
to-whole and part-part face priming: An ERP study. Neuroreport, 10, 1069–1075.
Jordan, T. R., Patching, G. R., & Milner, A. D. (1998). Central fixations are inadequately
controlled by instruction alone: Implications for studying cerebral asymmetry. Quarterly
Journal of Experimental Psychology: Human Experimental Psychology, 51, 371–391.
Jordan, T. R., & Paterson, K. B. (2009). Re-evaluating split-fovea processing in word recognition:
A critical assessment of recent research. Neuropsychologia, 47, 2341–2353.
Kampe, K. K. W., Frith, C. D., Dolan, R. J., & Frith, U. (2001). Reward value of attractiveness and
gaze. Nature, 413, 589 (correction p. 602).
Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: A module in
human extra-striate cortex specialized for face perception. Journal of Neuroscience, 17, 4302–4311.
Kingstone, A., Tipper, C., Ristic, J., & Ngan, E. (2004). The eyes have it! An fMRI investigation.
Brain and Cognition, 55, 269–271.
Lau, E. F., Phillips, C., & Poeppel, D. (2008). A cortical network for semantics: (De)constructing
the N400. Nature Reviews Neuroscience, 9, 920–933.
Lavidor, M., & Ellis, A. W. (2003). Interhemispheric integration of letter stimuli presented foveally
or extra-foveally. Cortex, 39, 69–83.
Leppanen, J. M., Moulson, M. C., Vogel-Farley, V. K., & Nelson, C. A. (2007). An ERP study of
emotional face processing in the adult and infant brain. Child Development, 78, 232–245.
Levine, S. C., & Levy, J. (1986). Perceptual asymmetry for chimeric faces across the life span. Brain
and Cognition, 5, 291–306.
Ley, R. G., & Bryden, M. P. (1979). Hemispheric differences in processing emotions and faces.
Brain and Language, 7, 127–138.
Lindell, A. K., & Nicholls, M. E. R. (2003). Cortical representation of the fovea: Implications for
visual half-field research. Cortex, 39, 111–117.
Luo, W., Feng, W., He, W., Wang, N-Y., & Luo, Y-J. (2010). Three stages of facial expression
processing: ERP study with rapid serial visual presentation. Neuroimage, 49, 1857–1867.
MacPherson, S., Phillips, L. H., & Della Sala, S. (2002). Age, executive function and social decision
making: A dorso-lateral prefrontal theory of cognitive aging. Psychology and Aging, 17, 598–609.
MacPherson, S., Phillips, L. H., & Della Sala, S. (2006). Age-related differences in the ability to
perceive sad facial expressions. Aging Clinical and Experimental Research, 18, 418–424.
Mandal, M. K., Borod, J. C., Asthana, H. S., Mohanty, A., Mohanty, S., & Koff, E. (1999). Effects
of lesion variables and emotion type on the perception of facial emotion. Journal of Nervous and
Mental Diseases, 187, 603–609.
Mandal, M. K., Tandon, S. C., & Asthana, H. S. (1991). Right brain damage impairs recognition
of negative emotions. Cortex, 27, 247–253.
McClure, E. B. (2000). A meta-analytic review of sex differences in facial expression processing and
their development in infants, children and adolescents. Psychological Bulletin, 126, 424–453.
McDowell, C. L., Harrison, D. W., & Demaree, H. A. (1994). Is right hemisphere decline in the
perception of emotion a function of aging? International Journal of Neuroscience, 79, 1–11.
McKelvie, S. J. (1976). The role of eyes and mouth in the memory of a face. American Journal of
Psychology, 89, 311–323.
Moreno, C. R., Borod, J. C., Welkowitz, J., & Alpert, M. (1990). Lateralization of the expression
and perception of facial emotion as a function of age. Neuropsychologia, 28, 199–209.
Moreno, C. R., Borod, J. C., Welkowitz, J., & Alpert, M. (1993). The perception of facial emotion
across the adult lifespan. Developmental Neuropsychology, 9, 305–314.
Murphy, F. C., Nimmo-Smith, I., & Lawrence, A. D. (2003). Functional neuroanatomy of
emotions: A meta-analysis. Cognitive, Affective and Behavioral Neuroscience, 3, 207–233.
Nebes, R. D. (1990). Hemispheric specialization in the aged brain. In C. Trevarthen (Ed.), Brain
circuits and the functions of the mind: Essays in Honor of Roger W. Sperry (pp. 364–370).
Cambridge, UK: Cambridge University Press.
Perria, L., Rosadini, G., & Rossi, G. F. (1961). Determination of side of cerebral dominance with
amobarbital. Archives of Neurology, 4, 173–189.
Perrett, D., Rolls, E. T., & Caan, W. (1982). Visual neurons responsive to faces in the monkey
temporal cortex. Experimental Brain Research, 47, 329–342.
Perrett, D., Smith, P. A., Potter, D. D., Mistlin, A. J., Head, A. S., Milner, A. D., et al. (1984).
Neurons responsive to faces in the temporal cortex: Studies of functional organization,
sensitivity to identity and relation to perception. Human Neurobiology, 3, 197–208.
Perrett, D., Smith, P. A., Potter, D. D., Mistlin, A. J., Head, A. S., Milner, A. D., et al. (1985).
Visual cells in the temporal cortex sensitive to face view and gaze direction. Proceedings of the
Royal Society of London B, 223, 293–317.
Phillips, L. H., MacLean, R. D. J., & Allen, R. (2002). Age and understanding of emotions:
Neuropsychological and sociocognitive perspectives. Journal of Gerontology, 57B, 526–530.
Phillips, M. L., Young, A. W., Scott, S. K., Calder, A. J., Andrew, C., Giampietro, V., et al. (1998).
Proceedings of the Royal Society of London. Series B, 265, 1809–1817.
Prodan, C. I., Orbelo, D. M., & Ross, E. D. (2007). Processing of facial blends of emotion: Support
for right hemisphere cognitive aging. Cortex, 43, 196–206.
Rolls, E. T. (2007). The representation of information about faces in the temporal and frontal lobes.
Neuropsychologia, 45, 124–143.
Rossion, B., Delvenne, J-F., Debatisse, D., Goffaux, V., Bruyer, R., Crommelinck, M., et al. (1999).
Spatio-temporal localization of the face inversion effect: An event-related potentials study.
Biological Psychology, 50, 173–189.
Rossion, B., & Jacques, C. (2008). Does physical interstimulus variance account for early
electrophysiological face sensitive responses in the human brain? Ten lessons on the N170.
Neuroimage, 39, 1959–1979.
Rossion, B., & Jacques, C. (2009). The initial representation of faces in the right-occipito-temporal
cortex is holistic: Electrophysiological evidence from the composite face illusion. Journal of
Vision, 9, 1–16.
Rossion, B., Joyce, C. A., Cottrell, G. W., & Tarr, M. J. (2003). Early lateralization tuning for face,
word and object processing in the visual cortex. Neuroimage, 20, 1609–1624.
Sato, W., Kochiyama, T., Yoshikawa, S., & Matsumura, M. (2001). Emotional expression boosts
early visual processing of the face: ERP recording and its decomposition by independent
component analysis. Neuroreport, 12, 709–714.
Schmitt, J. J., Hartje, W., & Willmes, K. (1997). Hemispheric asymmetry in the recognition of
emotional attitude conveyed by facial expression, prosody and propositional speech. Cortex,
33, 65–81.
Shibata, T., Nishijo, H., Tamura, R., Miyamoto, K., Eifuku, S., Endo, S., et al. (2002). Generators
of visual evoked potentials for faces and eyes in the human brain as determined by dipole
localization. Brain Topography, 15, 51–63.
Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Abnormal use of facial
information in high-functioning autism. Journal of Autism and Developmental Disorders, 37,
929–939.
Streit, M., Wolwer, W., Brinkmeyer, J., Ihl, R., & Gaebel, W. (2000). Electrophysiological correlates
of emotional and structural face processing in humans. Neuroscience Letters, 278, 13–16.
Tamietto, M., Corazzini, L. L., de Gelder, B., & Geminiani, G. (2006). Functional asymmetry
and interhemispheric cooperation in the perception of emotions from facial expressions.
Experimental Brain Research, 171, 389–404.
Taylor, M. J., Edmonds, G. E., McCarthy, G., & Allison, T. (2001). Eyes first! Eye processing
develops before face processing in children. Neuroreport, 12, 1671–1676.
Terzian, H., & Cecotto, C. (1959). Un nuovo metodo per la determinazione e lo studio della
dominanza emisferica. Giornale di Psychiatria e di Neuropatologia, 87, 889–924.
Thierry, G., Martin, C. D., Downing, P., & Pegna, A. J. (2007). Controlling for inter-stimulus
perceptual variance abolishes N170 face selectivity. Nature Neuroscience, 10, 505–511.
Tootell, R. B. H., Mendola, J. D., Hadjikhani, N. K., Liu, A. K., & Dale, A. M. (1998). The
representation of the ipsilateral visual field in human cerebral cortex. Proceedings of the
National Academy of Sciences of the USA, 95, 818–824.
Tucker, D. M. (1981). Lateral brain function, emotion and conceptualization. Psychological
Bulletin, 89, 19–46.
Vinette, C., Gosselin, F., & Schyns, P. G. (2004). Spatio-temporal dynamics of face recognition in a
flash: It's in the eyes. Cognitive Science, 28, 289–301.
Vuilleumier, P., & Pourtois, G. (2007). Distributed and interactive brain mechanisms during
emotion face perception: Evidence from functional neuroimaging. Neuropsychologia, 45,
174–194.
Wager, T. D., Phan, K. L., Liberzon, I., & Taylor, S. F. (2003). Valence, gender, and lateralization of
functional brain anatomy in emotion: A meta-analysis. Neuroimage, 19, 513–531.
Weddell, R. A. (1989). Recognition memory for emotional facial expressions in patients with focal
cerebral lesions. Brain and Cognition, 11, 1–17.
Workman, L., Chilvers, L., Yeomans, H., & Taylor, S. (2006). Development of cerebral
lateralisation for recognition of emotions in chimeric faces in children aged 5 to 11. Laterality,
11, 493–507.
Yovel, G., Levy, J., Grabowecky, M., & Paller, K. A. (2003). Neural correlates of the left-visual-field
superiority in face perception appear at multiple stages of face processing. Journal of Cognitive
Neuroscience, 15, 462–474.
Yovel, G., Tambini, A., & Brandman, T. (2008). The asymmetry of the fusiform face area is a stable
individual characteristic that underlies the left-visual-field superiority for faces.
Neuropsychologia, 46, 3061–3068.