Human Neuroplasticity and Education

Pontifical Academy of Sciences, Scripta Varia 117, Vatican City 2011


www.pas.va/content/dam/accademia/pdf/sv117/sv117-kuhl.pdf

Brain Mechanisms Underlying the Critical Period for Language:
Linking Theory and Practice

Patricia K. Kuhl

Introduction
Half a century ago, humans’ capacity for speech and language provoked
classic debates on what it means to be human by strong proponents of na-
tivism (Chomsky, 1959) and learning (Skinner, 1957). The debate centered
on learning and development, and the remarkable transition that all children
make as they acquire a language. While we are now far beyond these debates
and informed by a great deal of data about infants, their innate predisposi-
tions and incredible abilities to learn once exposed to natural language
(Kuhl, 2009; Saffran, Werker, and Werner, 2006), we are still just breaking
ground with regard to the neural mechanisms that underlie language de-
velopment and its ‘critical period’ (see Friederici and Wartenburger, 2010;
Kuhl and Rivera-Gaxiola, 2008; Kuhl et al., 2008). Developmental neuro-
science is beginning to deepen our understanding of the nature of language
and its ‘window of opportunity’ for learning.
To explore the topic of the critical period for language, and its practical
implications, I will focus on the youngest learners – infants in the first year
of life – and compare them to adult learners. The linguistic data will focus
on the most elementary units of language – the consonants and vowels that
make up words. Infants’ responses to the basic building blocks of speech
provide an experimentally accessible window on the roles of nature and
nurture in language acquisition. Comparative studies at the phonetic level
have allowed us to examine humans’ unique language processing abilities
at birth and as they respond to language experience. We are beginning to
discover how exposure to two languages early in infancy produces a bilin-
gual brain, and bilingualism is allowing us to test theories of the critical pe-
riod. Neuroimaging of infants is advancing our understanding of the
uniquely human capacity for language.

Windows to the young brain


Rapid advances have been made in noninvasive techniques that examine
language processing in young children (Figure 1, see p. 239). They include
Electroencephalography (EEG)/Event-Related Potentials (ERPs), Magnetoencephalography (MEG), functional Magnetic Resonance Imaging
(fMRI), and Near-Infrared Spectroscopy (NIRS).
Event-Related Potentials (ERPs) have been widely used to study speech
and language processing in infants and young children (for reviews, see
Conboy, Rivera-Gaxiola, Silva-Pereyra, and Kuhl, 2008; Friederici, 2005;
Kuhl, 2004). ERPs, a part of the EEG, reflect electrical activity that is time-
locked to the presentation of a specific sensory stimulus (for example, syl-
lables or words) or a cognitive process (for example, recognition of a
semantic violation within a sentence or phrase). By placing sensors on a
child’s scalp, the activity of neural networks firing in a coordinated and syn-
chronous fashion in open field configurations can be measured, and voltage
changes occurring as a function of cortical neural activity can be detected.
ERPs provide precise time resolution (milliseconds), making them well
suited for studying the high-speed and temporally ordered structure of
human speech. ERP experiments can also be carried out in populations
who cannot provide overt responses because of age or cognitive impair-
ment. Spatial resolution of the source of brain activation is, however, limited.
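To make the time-locked averaging concrete, the short sketch below simulates the basic ERP computation. It is purely illustrative; the sampling rate, epoch window, amplitudes, and component latency are assumptions rather than parameters from any study cited here. Single-trial epochs aligned to stimulus onset are averaged so that activity unrelated to the stimulus tends to cancel, leaving the event-related response.

```python
import numpy as np

# Illustrative ERP averaging (assumed parameters, not any study's pipeline):
# single-trial EEG epochs are time-locked to stimulus onset and averaged, so
# activity that is not stimulus-locked tends toward zero in the average.
rng = np.random.default_rng(0)

fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.8, 1 / fs)           # epoch: -100 ms to +800 ms

n_trials = 200
component = 2e-6 * np.exp(-((t - 0.25) ** 2) / (2 * 0.05 ** 2))  # ~250 ms peak
epochs = component + 10e-6 * rng.standard_normal((n_trials, t.size))

erp = epochs.mean(axis=0)                  # the ERP: the time-locked average
peak_ms = t[np.argmax(erp)] * 1000
print(f"ERP peak latency: {peak_ms:.0f} ms")
```
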
Magnetoencephalography (MEG) is another brain imaging technique that
tracks activity in the brain with exquisite temporal resolution. The SQUID
(superconducting quantum interference device) sensors located within the
MEG helmet measure the minute magnetic fields associated with electrical
currents that are produced by the brain when it is performing sensory, motor,
or cognitive tasks. Going beyond EEG, MEG allows precise localization of the
neural currents responsible for the sources of the magnetic fields. Cheour et al.
(2004) and Imada et al. (2006) used new head-tracking methods and MEG to
show phonetic discrimination in newborns and in infants in the first year of
life. Sophisticated head-tracking software and hardware enables investigators
to correct for infants’ head movements, and allows the examination of multiple
brain areas as infants listen to speech (Imada et al., 2006). MEG (as well as
EEG) techniques are completely safe and noiseless.
Magnetic resonance imaging (MRI) can be combined with MEG
and/or EEG, providing static structural/anatomical pictures of the brain.
Structural MRIs show anatomical differences in brain regions across the
lifespan, and have recently been used to predict second-language phonetic
learning in adults (Golestani, Molko, Dehaene, LeBihan, and Pallier, 2007).
Structural MRI measures in young infants identify the size of various brain
structures and these measures correlate with later language abilities (Ortiz-
Mantilla, Choe, Flax, Grant, and Benasich, 2010). When structural MRI im-
ages are superimposed on the physiological activity detected by MEG or
EEG, the spatial localization of brain activities recorded by these methods
can be improved.
Functional magnetic resonance imaging (fMRI) is a popular method of
neuroimaging in adults because it provides high spatial-resolution maps of
neural activity across the entire brain (e.g., Gernsbacher and Kaschak, 2003).
Unlike EEG and MEG, fMRI does not directly detect neural activity, but
rather the changes in blood-oxygenation that occur in response to neural ac-
tivation. Neural events happen in milliseconds; however, the blood-oxygena-
tion changes that they induce are spread out over several seconds, thereby
severely limiting fMRI’s temporal resolution. Few studies have attempted
fMRI with infants because the technique requires infants to be perfectly still,
and because the MRI device produces loud sounds making it necessary to
shield infants’ ears. fMRI studies allow precise localization of brain activity
and a few pioneering studies show remarkable similarity in the structures re-
sponsive to language in infants and adults (Dehaene-Lambertz, Dehaene, and
Hertz-Pannier, 2002; Dehaene-Lambertz et al., 2006).
Near-Infrared Spectroscopy (NIRS) also measures cerebral hemody-
namic responses in relation to neural activity, but utilizes the absorption of
light, which is sensitive to the concentration of hemoglobin, to measure ac-
tivation (Aslin and Mehler, 2005). NIRS measures changes in blood oxy-
and deoxy-hemoglobin concentrations in the brain as well as total blood
volume changes in various regions of the cerebral cortex using near infrared
light. The NIRS system can determine the activity in specific regions of
the brain by continuously monitoring blood hemoglobin level. Reports
have begun to appear on infants in the first two years of life, testing infant
responses to phonemes as well as longer stretches of speech such as ‘moth-
erese’ and forward versus reversed sentences (Bortfeld, Wruck, and Boas,
2007; Homae, Watanabe, Nakano, Asakawa, and Taga, 2006; Peña, Bonatti,
Nespor, and Mehler, 2002; Taga and Asakawa, 2007). As with other hemo-
dynamic techniques such as fMRI, NIRS typically does not provide good
temporal resolution. However, event-related NIRS paradigms are being de-
veloped (Gratton and Fabiani, 2001a,b). One of the most important po-
tential uses of the NIRS technique is possible co-registration with other
testing techniques such as EEG and MEG.
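For reference, the quantity NIRS solves for is usually recovered through the modified Beer-Lambert law; the expression below is the conventional textbook form (an assumption about standard NIRS practice, not a detail stated in this chapter). Measuring the change in optical density at two or more wavelengths yields a small linear system that can be solved for the oxy- and deoxy-hemoglobin concentration changes.

```latex
% Modified Beer-Lambert law as conventionally used in NIRS (illustrative form):
%   \Delta OD_{\lambda}      change in optical density at wavelength \lambda
%   \varepsilon              extinction coefficients of HbO2 and HbR at \lambda
%   L                        source-detector separation
%   \mathrm{DPF}_{\lambda}   differential pathlength factor
\Delta \mathrm{OD}_{\lambda} =
  \left( \varepsilon_{\mathrm{HbO_2},\lambda}\,\Delta[\mathrm{HbO_2}]
       + \varepsilon_{\mathrm{HbR},\lambda}\,\Delta[\mathrm{HbR}] \right)
  L \cdot \mathrm{DPF}_{\lambda}
```
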

Phonetic learning
Perception of the phonetic units of speech – the vowels and consonants
that make up words – is one of the most widely studied linguistic skills in in-
fancy and adulthood. Phonetic perception and the role of experience in learn-
ing are studied in newborns, during development as infants are exposed to a
particular language, in adults from different cultures, in children with
developmental disabilities, and in nonhuman animals. Phonetic perception studies
provide critical tests of theories of language development and its evolution.
An extensive literature on developmental speech perception exists and brain
measures are adding substantially to our knowledge of phonetic development
and learning (see Kuhl, 2004; Kuhl et al., 2008; Werker and Curtin, 2005).
In the last decade, brain and behavioral studies indicate a very complex
set of interacting brain systems in the initial acquisition of language early in
infancy, many of which appear to reflect adult language processing (Dehaene-
Lambertz et al., 2006). In adulthood, language is highly modularized, which
accounts for the very specific patterns of language deficits and brain damage
in adult patients following stroke (Kuhl and Damasio, in press). Infants, how-
ever, must begin life with brain systems that allow them to acquire any and
all languages to which they are exposed, and allow acquisition of language as
either an auditory-vocal or a visual-manual code on roughly the same
timetable (Petitto and Marentette, 1991). We are in a nascent stage of under-
standing the brain mechanisms underlying infants’ early flexibility with regard
to the acquisition of language – their ability to acquire language by eye or by
ear, and their ability to acquire one or multiple languages – and also the re-
duction in this initial flexibility that occurs with age, dramatically decreasing
our capacity to acquire a new language as adults (Newport, 1990). The infant
brain is exquisitely poised to ‘crack the speech code’ in a way that the adult
brain cannot. Uncovering why this is so is a very interesting puzzle.
In this review I will also explore a current working hypothesis and its
implications for the critical period in language – that the critical period is
not driven solely by time (maturation), but by experience. In exploring the
critical period for phonetic learning we will examine the role of experience,
particularly in closing the optimal period for learning. I will also develop
the idea that systems-level top-down mechanisms, such as those controlling
social cognition, play an essential role in infants’ abilities to ‘crack the speech
code’. On this view, infants combine a powerful set of domain-general com-
putational skills with their equally extraordinary social skills to enable learn-
ing. Thus, the underlying brain systems for social cognition and language
processing mutually influence one another in controlling the opening and
closing of the critical period during development. Nature’s language ex-
periments – the case of simultaneous bilinguals who learn more than one
language – are revealing a great deal about how experience alters the brain,
and these data are affecting arguments about the critical period as well. The
data suggest revisions of theory. Of equal importance, the data suggest how
one might facilitate language learning and literacy in young children.

Regarding the social effects, I have suggested that the social brain – in
ways we have yet to understand – ‘gates’ the computational mechanisms un-
derlying learning in the domain of language (Kuhl, 2007; in press). The as-
sertion that social factors gate language learning may help explain not only
how typically developing children acquire language, but also why children
with autism exhibit twin deficits in social cognition and language, and why
nonhuman animals with impressive computational abilities do not acquire
language. Moreover, this gating hypothesis may explain why social factors
play a far more significant role than previously realized in human learning
across domains throughout our lifetimes (Meltzoff, Kuhl, Movellan, and Se-
jnowski, 2009). Theories of social learning have traditionally emphasized the
role of social factors in language acquisition (Bruner, 1983; Tomasello, 2003a,
b; Vygotsky, 1962). However, these models emphasized the development of
lexical understanding and the use of others’ communicative intentions to help
understand the mapping between words and objects. The new data indicate
that social interaction gates an even more basic aspect of language – learning
of the elementary phonetic units – suggesting a more fundamental connec-
tion between the brain mechanisms underlying human social understanding
and the origins of language than has previously been hypothesized.
Research on infants’ phonetic perception in the first year of life shows how
computational, cognitive, and social skills combine to form a very powerful
learning mechanism. Interestingly, this mechanism does not resemble Skinner’s
operant conditioning and reinforcement model of learning, nor Chomsky’s
detailed view of parameter setting. The processes that infants employ when
learning from exposure to language are complex and multi-modal, but also
child’s play in that they grow out of infants’ heightened attention to items and
events in the natural world: the faces, actions, and voices of other people.

Language exhibits a ‘critical period’ for learning


A stage-setting concept for human language learning is the graph shown
in Figure 2, redrawn from a study by Johnson and Newport (1989) on English
grammar in native speakers of Korean learning English as a second language.
The graph as rendered shows a simplified schematic of second lan-
guage competence as a function of the age of second language acquisition.
Figure 2 is surprising from the standpoint of more general human learn-
ing. In the domain of language, infants and young children are superior
learners when compared to adults, in spite of adults’ cognitive superiority.
Language is one of the classic examples of a ‘critical’ or ‘sensitive’ period in
neurobiology (Bruer, 2008; Johnson and Newport, 1989; Knudsen, 2004;
Kuhl, 2004; Newport, Bavelier, and Neville, 2001).

Figure 2. The relationship between age of acquisition of a second language and language skill (adapted
from Johnson and Newport, 1989).

Scientists are generally in agreement that this learning curve is representative
of data across a wide variety of second-language learning studies (Bia-
lystok and Hakuta, 1994; Birdsong and Molis, 2001; Flege, Yeni-Komshian,
and Liu, 1999; Johnson and Newport, 1989; Kuhl, Conboy, Padden, Nelson,
and Pruitt, 2005a; Kuhl et al., 2008; Mayberry and Lock, 2003; Neville et al.,
1997; Weber-Fox and Neville, 1999; Yeni-Komshian, Flege, and Liu, 2000;
though see Birdsong, 1992; White and Genesee, 1996). However, not all as-
pects of language exhibit the same temporally defined critical window. The
developmental timing of critical periods for learning phonetic, lexical, and
syntactic levels of language varies, though studies cannot yet document the pre-
cise timing at each individual level. Studies indicate, for example, that the crit-
ical period for phonetic learning occurs prior to the end of the first year,
whereas syntactic learning flourishes between 18 and 36 months of age. Vo-
cabulary development explodes at 18 months of age, but does not appear to
be as restricted by age as other aspects of language learning – one can learn
new vocabulary items at any age. One goal of future research will be to doc-
ument the ‘opening’ and ‘closing’ of critical periods for all levels of language
and understand how they overlap and why they differ.
Given widespread agreement on the fact that we do not learn equally
well over the lifespan, theory is currently focused on attempts to understand
how and why learning is restricted to certain periods. What accounts for
adults’ inability to learn a new language with the facility of an infant?

Recent work on critical periods in the visual domain – particularly in the
case of ocular dominance – is exploring, from a physiological perspective,
the pharmacological triggers at the cellular level that open the critical period
and those that close the period of optimum learning. For example, we have
known since the pioneering work of Hubel and Wiesel (Hubel and Wiesel,
1963; Wiesel and Hubel, 1963; Hensch, 2005) that ocular dominance in the
brain’s visual cortex is determined by experience at a particular point in de-
velopment – input from the two eyes determines the relative dominance of
one eye over another. Closing one eye during the critical period for binocular
fusion results in a permanent reduction in visual acuity. Recent research in-
dicates that in the case of binocular vision, the brain’s inhibitory circuits are
responsible for both the onset and offset of plasticity (Hensch and Stryker,
1996; Hensch, 2005). This finding represents an exciting new step in under-
standing the underlying mechanisms of the critical period for vision.
Work on the molecular components (inhibitory GABAergic systems,
etc.) that control the opening and narrowing of learning periods poses an
important question from a theoretical perspective: Something has to trigger
these inhibitory circuits – is it maturation that triggers the cellular mecha-
nisms causing them to initiate learning, and eventually to slow learning, or
does the trigger stem from the environment? Vision research has provided
a clue: Rearing animals completely in the dark (by eye-suturing for exam-
ple), and then opening the animal’s eye after the typical learning period is
over, extends the critical period (Cho and Bear, 2010). At least for binocular
vision, the critical period is not strictly maturational. Knowing whether this
is the case more generally – beyond vision – will advance theory.

Phonetic level contributions to ‘critical period’ theory


Work in my laboratory has focused on the idea that experience, not simply
time or maturation, opens and closes the critical period in the case of language
(Kuhl, 2000). Our published work focuses on closing mechanisms, ones that
may cause phonetic learning to decline with language experience. Work
on the opening of the critical period has recently begun.
Language acquisition is often cited as an example of a critical learning
period that is constrained by time, or factors such as hormones, that are
outside the learning process itself. The studies on speech (as well as those
on birds acquiring bird song, see Doupe and Kuhl, 1999) suggest an alter-
native (Kuhl, 2000). The work on speech suggests that early learning itself
may constrain later learning. In earlier writings, I advanced the concept of
neural commitment, the idea that neural circuitry and overall architecture
forms early in infancy to detect the phonetic and prosodic patterns of
speech (Kuhl, 2004; Zhang, Kuhl, Imada, Kotani, and Tohkura, 2005; Zhang
et al., 2009). The neural architecture formed with experience is designed
to maximize the efficiency of processing for the language(s) experienced
by the infant. Once fully established, the neural architecture arising from
exposure to French or Tagalog, for example, impedes learning of a new lan-
guage that does not conform.

Infant phonetic learning: computation ‘gated’ by the social brain


The world’s languages contain approximately 600 consonants and 200
vowels (Ladefoged, 2001). Each language uses a unique set of distinct ele-
ments, phonemes, which change the meaning of a word (e.g. from bat to pat
in English). But phonemes are actually groups of non-identical sounds,
phonetic units, which are functionally equivalent in the language. Japanese-
learning infants have to group the phonetic units r and l into a single
phonemic category (Japanese r), whereas English-learning infants must up-
hold the distinction to separate rake from lake. Spanish-learning infants
must distinguish phonetic units critical to Spanish words (baño and paño),
whereas English-learning infants must combine them into a single category
(English b).
If infants were exposed only to the subset of phonetic units that will
eventually be used phonemically to differentiate words in their language,
the problem would be trivial. But infants are exposed to many more pho-
netic variants than will be used phonemically. The baby’s task in the first
year of life, therefore, is to make some progress in figuring out the compo-
sition of the 40-odd phonemic categories in their language before trying
to acquire words that depend on these elementary units. An important dis-
covery in the 1970s was that infants initially hear all phonetic differences;
they have a universal phonetic capacity at birth (Eimas, 1975; Eimas, Sique-
land, Jusczyk, and Vigorito, 1971; Lasky, Syrdal-Lasky, and Klein, 1975;
Werker and Lalonde, 1988).
Between 6 and 12 months of age nonnative discrimination declines (Best
and McRoberts, 2003; Rivera-Gaxiola, Silva-Pereyra, and Kuhl, 2005a;
Tsao, Liu, and Kuhl, 2006; Werker and Tees, 1984), and native language
speech perception shows a significant increase (Kuhl et al., 2006; Tsao et al.,
2006) (see Figure 3, p. 240).
What happens during this 2-month window to prompt the transition?
Available data now allow us to create a model of the transition in phonetic
perception, and our current working model of the process (Kuhl et al., 2008)
shows that two factors are key to phonetic learning during the sensitive pe-
riod – computational learning and social cognition.

The computational component


An implicit form of learning, referred to as ‘statistical learning’ (Saffran,
Aslin, and Newport, 1996), plays an important role in infants’ phonetic
learning. Figure 4 (p. 240) provides a cartoon version of the process. Re-
search shows that adult speakers of English and Japanese produce the English
r, the English /l/, and the Japanese r sounds, so it is not the mere presence
of the sound in language spoken to infants that accounts for learning
(Werker et al., 2007). Instead, it is the patterns of distributional frequency
of the sounds across the two languages that provide the information that
English-learning and Japanese-learning infants use to learn phonetically.
When infants listen to English and Japanese, they attend to the distrib-
utional properties of the phonetic units contained in the two languages,
and the distributional data affect their perception (Kuhl, Williams, Lacerda,
Stevens, and Lindblom, 1992; Maye, Weiss, and Aslin, 2008; Maye, Werker,
and Gerken, 2002; Teinonen, Fellman, Naatanen, Alku, and Huotilainen,
2009). These distributional differences are exaggerated in ‘motherese’, the
prosodically and phonetically stretched utterances that are near universal in
languages spoken to children around the world (Kuhl et al., 1997; Vallabha,
McClelland, Pons, Werker, and Amano, 2007; Werker et al., 2007).
As illustrated in the idealized case (see Figure 4, p. 240), the distributions
of English and Japanese sounds differ: English motherese contains many
English r and l sounds and very few of the Japanese retroflex r sounds, while
the reverse is true for Japanese motherese. A variety of studies show that
infants pick up the distributional frequency patterns in ambient speech,
whether they experience them during short-term laboratory experiments
or over months in natural environments, and that this alters phonetic per-
ception (Maye et al., 2002; Maye et al., 2008). Statistical learning from the
distributional properties in speech thus supports infants’ transition in early
development from universal perception exhibited at birth to native-lan-
guage perception that is exhibited by the end of the first year of life.
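The logic of distributional learning can be sketched computationally. The toy example below uses hypothetical acoustic values and category means (not the stimuli of the studies cited above): tokens vary along a single schematic dimension, and a learner that tracks distributional frequency can ask whether one or two Gaussian categories better explain the input, here by comparing the Bayesian information criterion of the two fits.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative sketch of distributional learning (invented numbers): an
# "English-like" environment is bimodal along the dimension that separates
# /r/ from /l/, while a "Japanese-like" environment is unimodal.
rng = np.random.default_rng(1)

english_like = np.concatenate([rng.normal(1800, 120, 500),   # /r/-like tokens
                               rng.normal(2800, 120, 500)])  # /l/-like tokens
japanese_like = rng.normal(2300, 250, 1000)                  # single /r/ category

def preferred_categories(tokens):
    """Return the number of Gaussian categories (1 or 2) with the lower BIC."""
    X = tokens.reshape(-1, 1)
    bics = [GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
            for k in (1, 2)]
    return int(np.argmin(bics)) + 1

print("English-like input  ->", preferred_categories(english_like), "categories")
print("Japanese-like input ->", preferred_categories(japanese_like), "category")
```
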
The foregoing data and arguments led us to suggest that statistical learning
processes could govern brain plasticity (Kuhl, 2002; Kuhl et al., 2008). If infants
build up statistical distributions of the sounds contained in the language they
hear, at some point these distributions would become stable. At the point of
stability, additional language input would not cause the overall statistical dis-
tribution of sounds to change substantially, and, according to our model, this
stability would cause a decline in sensitivity to language input. In other words,
the decline in plasticity is hypothesized to be driven by a statistical process in
which stability reduces plasticity. Hypothetically, for instance, an infant’s rep-
resentation of the vowel /a/ might have stabilized by the time she hears
her one-millionth token of the vowel /a/, and this could instigate the begin-
ning of the closure of the critical period. On this account, plasticity is inde-
pendent of time, and instead dependent on the amount and the variability of
input provided by experience. This reasoning lends itself to testable hypothe-
ses. Studies of bilingual infants, reviewed later in this chapter, provide one ex-
ample of an empirical test of the model.
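A deliberately minimal simulation of this stability idea is sketched below. All numbers are invented and the change metric is merely one convenient choice, not the model of Kuhl et al. (2008): a learner accumulates a histogram of vowel tokens, and the amount by which each new batch of input shifts the normalized distribution shrinks as experience accumulates, which can be read as a proxy for declining sensitivity to further input.

```python
import numpy as np

# Toy illustration (invented numbers, not the authors' model): as tokens of a
# vowel accumulate, each new batch of input changes the learner's estimated
# distribution less and less.
rng = np.random.default_rng(1)
bins = np.linspace(500, 1100, 25)               # schematic F1 range for /a/

counts = np.histogram(rng.normal(800, 60, 1000), bins)[0].astype(float)
for batch in range(2, 9):
    new_tokens = rng.normal(800, 60, 1000)      # 1,000 more tokens of /a/
    new_counts = counts + np.histogram(new_tokens, bins)[0]

    old_p = counts / counts.sum()
    new_p = new_counts / new_counts.sum()
    change = 0.5 * np.abs(new_p - old_p).sum()  # total-variation distance
    print(f"after {batch * 1000:>5d} tokens, distribution shift = {change:.4f}")
    counts = new_counts
```
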

Brain rhythms index statistical learning for speech


Statistical learning is an implicit strategy that induces phonetic learning
in infancy but not in adulthood – spending months in a foreign country
does not change speech perception in spite of the new statistical distribu-
tions we experience. Work in my laboratory has recently shown that brain
oscillations (‘rhythms’), associated with higher cognitive functions such as
attention and cognitive effort, index the shift in statistical learning in speech
(Bosseler, Taulu, Imada, and Kuhl, 2011).
A brain rhythm, theta (~4-8Hz), has been shown in previous studies to
index attention and cognitive effort in adults (Jensen and Tesche, 2002) as
well as infants (Kahana, Sekuler, Caplan, Kirschen, and Madsen, 1999).
Using native and nonnative speech sounds, presented frequently or infre-
quently in the classic oddball paradigm, we tested three age groups: 6-8
month-old infants, 10-12 month-old infants, and adults. Data were collected
using magnetoencephalography (MEG), a whole-brain imaging technology
that is completely safe and noiseless.
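The analysis logic can be sketched as follows; the stimulus proportions, frequencies, and simulated effect below are invented for illustration and are not the Bosseler et al. pipeline. Epochs around frequent 'standard' and infrequent 'deviant' syllables are compared on mean power in the theta band (roughly 4-8 Hz).

```python
import numpy as np

# Schematic oddball/theta sketch (assumed proportions and simulated data).
rng = np.random.default_rng(3)
fs = 250
t = np.arange(0, 1.0, 1 / fs)                 # 1-s epoch per stimulus

def theta_power(epoch):
    """Mean power in the 4-8 Hz band of one epoch (via FFT)."""
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(epoch.size, 1 / fs)
    band = (freqs >= 4) & (freqs <= 8)
    return spectrum[band].mean()

# Oddball sequence: ~85% standards, ~15% deviants (hypothetical proportions).
labels = rng.choice(["standard", "deviant"], size=400, p=[0.85, 0.15])

# Simulated epochs: deviants get more 6-Hz (theta) activity, as predicted for
# stimuli demanding extra attention and cognitive effort.
powers = {"standard": [], "deviant": []}
for lab in labels:
    theta_amp = 1.0 if lab == "standard" else 1.5
    epoch = theta_amp * np.sin(2 * np.pi * 6 * t) + rng.standard_normal(t.size)
    powers[lab].append(theta_power(epoch))

for lab, values in powers.items():
    print(f"{lab:>8s}: mean theta-band power = {np.mean(values):.1f}")
```
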
Bosseler et al. (2011) predicted that early in development, when infants are
maximally sensitive to language experience, attention and cognitive effort are
driven by infants’ sensitivity to the distributional frequency of events, as por-
trayed in Figure 4. Once learning occurs (after neural commitment is com-
plete), attention and cognitive effort are dominated by learned categories;
stimuli that fit learned phonetic categories are processed easily, and increased
attention and mental effort are required for novel stimuli that do not fit learned
categories. We expected 6-8 month-old infants to show theta increases for any
frequent stimulus, regardless of language, and adults to show theta increases for
novel stimuli, regardless of frequency. 10-12 month-old infants were expected
to show an intermediate pattern resembling that of adults.
Our results confirmed these predictions (Bosseler et al., 2011). 6-8-month-
old infants demonstrated increased theta for frequently presented speech
sounds, regardless of whether they were native or nonnative. Adults showed
the opposite pattern of response, with increased theta shown to nonnative
sounds regardless of frequency. The 10-12-month-old infants showed an in-
termediate pattern of results, approximating the adult theta pattern.

These results show that theta indexes the well-established change in
speech perception that is brought about by exposure to a specific language –
as infants experience a particular language, the brain’s neural circuitry fo-
cuses on registering high frequency speech events that represent the pho-
netic categories used in the ambient language. This implicit strategy provides
infants with an ability to learn through experience with language to attend
to the sounds that are critical to the ambient language used in their cultural
community. Adults no longer implicitly absorb the statistical properties of
phonetic units in a new language. Attention and cognitive effort are driven
by learned category structure.
The question that remains is whether the implicit strategy indexed by
theta brain rhythms is unique to speech. A range of studies show that the
perceptual narrowing first observed for speech perception (Werker and Tees,
1984) occurs in other domains. Infants show perceptual narrowing between
8 and 12 months of age for visual discrimination of faces (Pascalis, de Haan,
and Nelson, 2002) or languages (Weikum et al., 2007), for recognition of
the conceptual distinctions that underpin word meanings (Hespos and
Spelke, 2004), and when detecting inter-sensory correspondences
(Lewkowicz and Ghazanfar, 2006). In all cases, young infants’ abilities are
initially better than those shown in adults and decline during the second
half of the first year. Infants begin life with the capacity to differentiate
many forms, and this initial capacity narrows as a function of experience.
The perceptual narrowing phenomenon may therefore reflect a domain-
general developmental shift in perceptual strategy brought about by the
brain’s response to experience rather than a specific critical learning window
for speech. Stimuli that reflect cultural categories that are learned socially
(speech, faces, conceptual categories, musical intervals) are candidate do-
mains for which this pattern might hold. Further empirical research will
be needed to test this broader hypothesis with a variety of stimuli.

The social component


Whether or not the perceptual narrowing indexed by theta and observed
in speech turns out to be a domain-general phenomenon, studies on pho-
netic learning have gone one step further than studies in other domains in
understanding the complex conditions that must be met in order for infants
to learn during the period from 6-12 months. In the domain of speech,
data now show that infants’ computational skills cannot solely account for
the transition in phonetic perception that occurs in the second half of the
first year of life. Our studies demonstrate that infant language learning in
complex natural environments requires something more than raw compu-
tation. Laboratory studies testing infant phonetic and word learning from
exposure to complex natural language demonstrated limits on statistical
learning, and provided new information suggesting that social brain systems
are integrally involved and, in fact, may be necessary to instigate natural
phonetic learning (Kuhl, Tsao, and Liu, 2003; Conboy and Kuhl, 2011).
The new experiments tested infants in the following way: At 9 months of
age, the age at which the initial universal pattern of infant perception has
changed to one that is more language-specific, infants were exposed to a foreign
language for the first time (Kuhl et al., 2003). Nine-month-old American infants
listened to 4 different native speakers of Mandarin during 12 sessions scheduled
over 4–5 weeks. The foreign language ‘tutors’ read books and played with toys
in sessions that were unscripted. A control group was also exposed for 12 ses-
sions but heard only English from native speakers. After infants in the experi-
mental Mandarin exposure group and the English control group completed
their sessions, all were tested with a Mandarin phonetic contrast that does not
occur in English. Both behavioral and ERP methods were used. The results in-
dicated that infants showed a remarkable ability to learn from the ‘live-person’
sessions – after exposure, they performed significantly better on the Mandarin
contrast when compared to the control group that heard only English. In fact,
they performed equivalently to infants of the same age tested in Taiwan who
had been listening to Mandarin for 10 months (Kuhl et al., 2003).
The study revealed that infants can learn from first-time natural exposure
to a foreign language at 9 months, and answered what was initially the ex-
perimental question: can infants learn the statistical structure of phonemes
in a new language given first-time exposure at 9 months of age? If infants
required a long-term history of listening to that language – as would be
the case if infants needed to build up statistical distributions over the initial
9 months of life – the answer to our question would have been no. How-
ever, the data clearly showed that infants are capable of learning at 9 months
when exposed to a new language. Moreover, learning was durable. Infants
returned to the laboratory for their behavioral discrimination tests between
2 and 12 days after the final language exposure session, and between 8 and
33 days for their ERP measurements. No ‘forgetting’ of the Mandarin con-
trast occurred during the 2 to 33 day delay.
Infants exposed to Mandarin were socially very engaged in the language
sessions. Would infants learn if they were exposed to the same information
in the absence of a human being, say, via television or an audiotape? If sta-
tistical learning is sufficient, the television and audio-only conditions should
produce learning. Infants who were exposed to the same foreign-language
material at the same time and at the same rate, but via standard television
or audiotape only, showed no learning – their performance equaled that of
infants in the control group who had not been exposed to Mandarin at all
(see Figure 5, p. 241).
Thus, the presence of a human being interacting with the infant during
language exposure, while not required for simpler statistical-learning tasks
(Maye et al., 2002; Saffran et al., 1996), is critical for learning in complex
natural language-learning situations (Kuhl et al., 2003). Using the same ex-
perimental design, this work has been extended to Spanish and advanced
beyond the Kuhl et al. (2003) study. Conboy showed that infants not only
learn Spanish phonemes (Conboy and Kuhl, 2011) but also the Spanish
words they were exposed to during the language-exposure sessions (Con-
boy and Kuhl, 2010). Moreover, Conboy and colleagues demonstrated that
individual differences in infants’ social behaviors during the Spanish expo-
sure sessions are significantly correlated with the degree to which infants learn
both phonemes and words, as indicated by the relationship between social
behaviors during the sessions and brain measures documenting learning
post-exposure (Conboy, Brooks, Meltzoff, and Kuhl, 2008).
These studies suggest that infants’ computational abilities are enabled by
social interaction, a situation mirrored in neurobiological studies on vocal
communication learning in other species, such as birds (Doupe and Kuhl,
2008). The notion that social interaction acts as a ‘gate’ for infants’ initial
language learning has important implications for children with autism that
we are beginning to explore (see Kuhl, Coffey-Corina, Padden, and Daw-
son, 2005b; Kuhl, 2010a). The broader role of socio-cultural context on lan-
guage learning is also illustrated in studies focusing on language and brain
with children from families with low socio-economic status. Our work in
this arena links the degree of left-hemisphere specialization for language
and literacy at the age of 5 years to the extent to which a child’s environ-
ment provides opportunities for learning (see Raizada, Richards, Meltzoff,
and Kuhl, 2008; Neville, this volume). The growing body of work suggests
that the early language environment of the child has a significant effect on
the trajectory of language learning.
The model we have developed indicates that the interaction between
computational skills and social cognition potentially opens the critical pe-
riod for phonetic learning. Infants have computational skills from birth
(Teinonen et al., 2009). The fact that the effects of linguistic experience on
phonetic perception are not observed until 8 months of age suggests that
computation itself is not the trigger for learning. As discussed, we initially
reasoned that infants might require 8 months of listening to build up reliable
statistical distributions of the sounds contained in the language they expe-
rienced, but our results verified that 9-month-old infants did not require 8
months of listening to learn from experiencing a new language – they
learned after less than 5 hours of exposure to a new language, as long as
exposure occurred in a social context.
These data raise the possibility that infants’ social skills – the ability to
track eye movements, achieve joint visual attention, and begin to understand
others’ communicative intentions, which develop at this time – serve as a trigger
to instigate plasticity. Social understanding might be the ‘gate’ that initiates
phonetic learning in human infants (Kuhl, in press). There is a neurobio-
logical precedent for social interaction acting as a trigger for learning in
songbirds. It is well established that a more natural social setting extends
the learning period and that manipulating other social features can either
shorten or extend the optimum period for learning (Knudsen, 2004; Woolley
and Doupe, 2008). The possibility of a social interaction plasticity trigger
in humans raises many new questions, and also has implications for devel-
opmental disabilities (see Kuhl, 2010a for discussion).

Bilingual language learning


In our model of early language development (Kuhl et al., 2008), bilingual
language learners are expected to follow the same principles as monolingual
learners – both computational and social factors influence the period of
plasticity. Nonetheless, we argue that this process might result in bilingual
infants reaching the developmental transition in perception at a later point
in time than infants learning either language monolingually. We have argued
that infants learning two first languages simultaneously would remain open
to experience for a longer period of time because they are mapping lan-
guage input in two forms, each with distinct statistical distributions. Social
input often links the statistical distribution for a particular language to in-
dividual social partners, and thus perhaps assists infants in separating the sta-
tistics of one language from another. If this reasoning is correct, it should
take a longer period of time to begin to close the critical period in bilin-
guals because it takes longer for sufficient data from both languages to reach
distributional stability – depending on factors such as the number of people
in the infant’s environment producing the two languages in speech directed
toward the child, and the amount of input each speaker in the infant’s en-
vironment provides. It would be highly adaptive for bilingual infants to re-
main perceptually open for a longer period of time.
Only a few studies have addressed the timing issue and results have been
mixed, perhaps due to differences in methodology, differences in the amount
of language exposure to the two languages in individual bilingual participants,
and the specific characteristics of the languages and speech contrasts studied.
Bosch and Sebastián-Gallés (2003a) compared 4-, 8- and 12-month-old in-
fants from Spanish monolingual households, Catalan monolingual households,
and Spanish-Catalan bilingual households on a vowel contrast that is phone-
mic in Catalan but not in Spanish. Their results showed that at 4 months in-
fants discriminated the vowel contrast but that at 8 months of age only infants
exposed to Catalan succeeded. Interestingly, the same group of bilinguals re-
gained their ability to discriminate the speech contrast at 12 months of age.
The authors reported the same developmental pattern in bilingual infants in
a study of consonants (Bosch and Sebastián-Gallés, 2003b) and in a later study
of vowels (Sebastián-Gallés and Bosch, 2009). The authors interpreted their
results as evidence that different processes may underlie bilingual and mono-
lingual phoneme category formation.
In contrast, other investigations have found that bilingual infants dis-
criminate phonetic contrasts in their native languages on the same timetable
as monolingual infants. For example, Burns, Yoshida, Hill, and Werker (2007)
tested consonant discrimination using English-relevant as well as French-
relevant values at 6-8, 10-12, and 14-20 months in English monolingual
and English-French bilingual infants. 6-8 month old English monolingual
infants discriminated both contrasts while 10-12 and 14-20 month old Eng-
lish monolingual infants discriminated only the English contrast. In bilingual
infants, all age groups were able to discriminate both contrasts. Similarly,
Sundara, Polka, and Molnar (2008) found that 10-12-month-old French-
English bilingual infants were able to discriminate a French /d/ from an
English /d/ while age-matched French monolingual infants were unable
to do so. These studies support the view that monolingual and bilingual in-
fants develop phonetic representations at the same pace (see also Sundara
and Scutellaro, in press).
We conducted a longitudinal study of English-Spanish bilingual infants
using a brain measure of discrimination for phonetic contrasts in both lan-
guages (Event Related Potentials, or ERPs) and assessments of language input
in the home at two early points in development, followed by examination of
word production in both languages months later (Garcia-Sierra et al., in press).
It is the first ERP study of speech perception in bilingual infants that com-
bined concurrent and longitudinal methods to assess early phonetic percep-
tion, early language exposure, and later word production. The study addressed
three questions: Do bilingual infants show the ERP components indicative
of neural discrimination for the phonetic units of both languages on the same
timetable as monolingual infants? Is there a relationship between brain meas-
ures of phonetic discrimination and the amount of exposure to the two lan-
guages? Is later word production in the infants’ two languages predicted by
early ERP responses to speech sounds in both languages and/or the amount
of early language exposure to each language?
As predicted by our model of speech perception development, bilingual
infants displayed a pattern of neural commitment that is different from that
of monolingual infants previously tested using the same stimuli and methods
(Rivera-Gaxiola et al., 2005a; Rivera-Gaxiola, Klarman, Garcia-Sierra, and
Kuhl, 2005b). Rivera-Gaxiola et al. (2005a, b) collected data from English
monolingual infants at 7 and 11 months of age. Monolingual infants at 7
months showed neural discrimination (in the form of a mismatch negativity,
or MMN) of both the native phonetic contrast (English) and the non-native
contrast (Spanish); at 11 months monolingual infants showed an MMN for
the native phonetic contrast (English) only, indicating that they had learned
native sounds and were no longer in the initial universal stage of perception.
Bilingual infants did not show an MMN for Spanish and English contrasts at
6-8 months, but showed a non-significant positivity, a more immature re-
sponse. By 10-12 months of age, the MMN was observed for both contrasts.
In studies of 11- and 14-month-old monolingual and bilingual infants, this
pattern is replicated; bilingual infants show the MMN to sounds from their
(two) native languages at a later point in time when compared to monolingual
infants. We believe this represents an extended period of openness to experi-
ence in bilingual children (see Figure 6, p. 242).
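As a rough sketch of how an MMN is quantified (the time window, amplitudes, and noise level below are assumptions, not the parameters of the studies described here): the deviant and standard ERPs are averaged separately, subtracted to form a difference wave, and the mean amplitude of that wave in a post-stimulus window is taken as the MMN.

```python
import numpy as np

# Minimal MMN sketch (assumed window and simulated data, not the study's
# parameters): average deviant and standard epochs, subtract, and measure the
# difference wave in a post-stimulus window.
rng = np.random.default_rng(4)
fs = 250
t = np.arange(-0.1, 0.6, 1 / fs)

def simulated_epochs(n, mmn_amp):
    """n single-trial epochs with a negativity of size mmn_amp near 250 ms."""
    component = -mmn_amp * np.exp(-((t - 0.25) ** 2) / (2 * 0.04 ** 2))
    return component + 8e-6 * rng.standard_normal((n, t.size))

standard = simulated_epochs(400, mmn_amp=1e-6).mean(axis=0)
deviant = simulated_epochs(80, mmn_amp=4e-6).mean(axis=0)   # deviants are rarer

difference = deviant - standard                              # MMN difference wave
window = (t >= 0.2) & (t <= 0.35)                            # assumed MMN window
print(f"MMN mean amplitude: {difference[window].mean() * 1e6:.2f} microvolts")
```
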
Thus, our brain measures of bilingual phonetic development provide some
support for the idea that bilingual infants instigate phonetic learning at the
same time as monolingual infants, but that they may remain open to experi-
ence for a longer period of time. This pattern represents bilingual infants’ highly
adaptive response to their more variable language environments.
We also hypothesized that bilingual infants’ discrimination of the sounds
of English and Spanish would be related to language exposure in the home,
and that the pattern of the relationship would be influenced by age. The
results showed an interesting relationship between the pattern of brain ac-
tivity and the amount of exposure (high vs. low) to each language. Specifically,
only infants who had high exposure to English (or Spanish), and conse-
quently lower exposure to the second language, showed an age effect in
their brain responses to speech. Work is underway to further investigate this
relationship, but the findings of Garcia-Sierra et al. (in press) suggest that
bilingual infants exposed to high levels of one language have neural re-
sponses to that language that resemble those of monolingual infants. Given
the variability in language experience in bilingual infants, more research is
required to determine how much language input is sufficient to close the
critical period, and whether there is greater plasticity for language through-
out life as a function of early experience with two languages.
Finally, we hypothesized a relationship between early language brain meas-
ures and later word production, as well as relationships between early language
exposure and later word production. Both hypotheses were confirmed. Chil-
dren who were English dominant in word production at 15 months showed
relatively better neural discrimination of the English contrast, as well as stronger
English exposure in the home. Similarly, children who were Spanish dominant
in word production at 15 months showed relatively better neural discrimina-
tion of the Spanish contrast and stronger Spanish exposure in the home.
Taken as a whole, the results suggest that bilingual infants tested with pho-
netic units from both of their native languages stay perceptually open longer
when compared to monolingual infants – indicating perceptual narrowing
at a later point in time. We reasoned that this is highly adaptive for bilingual
infants. We also show that individual differences in infants’ neural responses
to speech, as well as their later word production, are influenced by the amount
of exposure infants have to each of their native languages at home.

Is adult second language learning enhanced by social experience?


Can understanding the role of computational and social mechanisms
help design interventions that improve adults’ acquisition of a second lan-
guage? Our studies in Japan with native speakers of Japanese attempting to
learn English suggest that it may be possible. The difficulty of the /r-l/ dis-
tinction for native speakers of Japanese is well documented, even after ex-
tensive training (Flege, Takagi, and Mann, 1996; Goto, 1971; Miyawaki et
al., 1975; Yamada and Tohkura, 1992). We hypothesize that processing Eng-
lish requires the development of new distributional frequency maps unique
to English, because early exposure to Japanese caused neural commitment
to the distributional patterns of Japanese, which would subsume English
/r/ and /l/ into the Japanese /r/ category (see Figure 4, p. 240). Computational
neural modeling experiments have produced findings that are consistent
with this view (Vallabha and McClelland, 2007).
New training studies conducted by our laboratory group in collabora-
tion with the MEG researchers at Nippon Telephone and Telegraph in
Tokyo suggest that training which capitalizes on natural infant-learning
strategies may provide the impetus to build new perceptual maps during
second language learning. We examined perception of English /ra/ vs. /la/
in ten Japanese subjects and ten American subjects (Zhang et al., 2005). Be-
havioral measures included identification and discrimination of the speech
syllables. MEG recordings were made while subjects listened to the syllables.

Human Neuroplasticity and Education 49


PATRICIA K. KUHL

Listening to native language sounds resulted in brain activation that was


both significantly more focalized in the brain and occurred with shorter
durations – we interpreted these patterns as greater neural efficiency when
listening to native language speech. We reasoned that neural efficiency re-
flected the expertise resulting from early learning, and that neural efficiency
developed at the expense of neural plasticity.
We tested this notion in a follow-up training study, in which we used
highly social speech signals to train Japanese listeners to respond to the /r/
and /l/ stimuli (Zhang et al., 2009). Taking our cue from the ‘motherese’ stud-
ies, Japanese participants heard and viewed American speakers producing
acoustically modified /ra/ and /la/ syllables. Stimuli had greatly exaggerated
formant frequencies, reduced bandwidths, and extended durations, like those
produced by mothers when speaking to their infants. In the computer-train-
ing program, listeners were allowed to choose from many different talkers,
and the syllables presented varied greatly. The listeners presented the stimuli
to themselves via computer during 12 sessions, and no explicit feedback was
provided. The behavioral data revealed a significant improvement in identifi-
cation of these English (nonnative) speech stimuli, three times larger than that
reported in previous studies. Correspondingly, the MEG results showed
greater neural efficiency in the left hemisphere – more focal brain activation
over a shorter duration – when listening to the English syllables.
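A hypothetical numerical sketch of the exaggeration manipulation is given below; the formant values, boundary, gain, and stretch factors are invented, and the actual stimuli were resynthesized natural speech. The cue separating the two categories is pushed further from an assumed category boundary and the syllable is lengthened, in the spirit of motherese.

```python
# Hypothetical sketch of exaggerating the critical acoustic cue (all values are
# invented; the study used resynthesized natural /ra/-/la/ syllables): push the
# F3 onset away from an assumed /r/-/l/ boundary and stretch the duration.
def exaggerate(f3_hz, duration_ms, boundary_hz=2100, gain=1.5, stretch=1.3):
    """Return an exaggerated (F3 onset, duration) pair for one syllable token."""
    return boundary_hz + gain * (f3_hz - boundary_hz), duration_ms * stretch

for label, f3, dur in [("/ra/", 1600, 180), ("/la/", 2600, 180)]:
    new_f3, new_dur = exaggerate(f3, dur)
    print(f"{label}: F3 {f3} Hz -> {new_f3:.0f} Hz, duration {dur} ms -> {new_dur:.0f} ms")
```
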
These results suggest that the principles underlying motherese speech may
help elicit adult second-language learning. Three parameters are of greatest in-
terest: (a) exaggeration of the acoustic dimensions critical to the phonetic con-
trast, (b) an unsupervised ‘social’ learning situation, and (c) wide variability in
speech, mimicking natural learning. Our studies show that feedback and rein-
forcement are not necessary in this process; listeners simply need the right kind
of listening experience. Exaggerated acoustic cues, multiple instances by many
talkers who can be seen and heard and selected by the participants, and mass
listening experience without ‘testing’ – features that motherese provides in-
fants – may represent a natural way to learn a language. These features, especially
the more social nature of the experience – seeing talkers and choosing to listen
to them – may allow listeners to create new perceptual maps rather than simply
subsuming the English /r/ and /l/ stimuli into existing Japanese distributions,
which would obscure the English distinction (see McClelland, Thomas, Mc-
Candliss, and Fiez, 1999). We are exploring this further in current studies.

Phonetic learning predicts the rate of language growth


The foregoing review suggests that early phonetic learning is a complex
process: infants’ computational skills are involved, as well as social cognition,
in opening a window of increased plasticity at about 8 months of life. Be-
tween 8 and 10 months monolingual infants show an increase in native lan-
guage phonetic perception, a decrease in nonnative phonetic perception,
and phonetic learning from a new language can be induced by social ex-
perience with a speaker of a new language in the laboratory (though not
via a standard TV).
This early initial step in language learning is strongly correlated with
the growth of future language skills, and with later pre-literacy skills. In our
initial work demonstrating the link between early speech perception and
later language, we conducted a longitudinal study examining whether a test
of phonetic discrimination for vowels at 6 months of age predicted chil-
dren’s language skills measured up to 18 months later. The data demon-
strated that infants’ phonetic discrimination ability at 6 months of age was
significantly correlated with their language skills at 13, 16, and 24 months
of age (Tsao, Liu, and Kuhl, 2004). However, we recognized that in this ini-
tial study the association we observed might be due to infants’ cognitive
skills, such as the ability to perform in the behavioral task we used to assess
discrimination, or to sensory abilities that affected auditory resolution of
the differences in formant frequencies that underlie phonetic distinctions.
To address these issues, we assessed both native and nonnative phonetic
discrimination in 7.5-month-old infants, and used both a behavioral (Kuhl
et al., 2005a) and an event-related potential measure, the mismatch nega-
tivity (MMN), to assess infants’ performance (Kuhl et al., 2008). Using a
neural measure removed potential cognitive effects on performance; the use
of both native and nonnative contrasts addressed the sensory issue, since
better sensory abilities would be expected to improve both native and non-
native speech discrimination.
According to our developmental model, future language growth should
be associated with early performance on both native and nonnative contrasts,
but in opposite directions. We predicted that better native language perception
should result in significantly advanced language abilities at 14, 18, 24, and 30
months of age, whereas better nonnative phonetic perception at the same age
should predict poorer language abilities at the same four future points in time.
The results conformed to this prediction. When both native and nonnative
phonetic discrimination was measured in the same infants at 7.5 months of
age (Kuhl et al., 2005a; Kuhl et al., 2008), better native speech discrimination
was associated with better later language outcomes, whereas better nonnative
performance was associated with poorer performance. Hierarchical linear
growth modeling of vocabulary between 14 and 30 months for MMN values
(see Figure 7, p. 242) showed that native and nonnative discrimination pre-
dicted future language, but in opposite directions. Better native discrimination
predicted advanced future language development and better nonnative dis-
crimination predicted less advanced future language development.
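One conventional way to write such a two-level growth model is sketched below; this is a generic specification with assumed notation, not the exact equations fit by Kuhl et al. (2008). The opposite-direction finding corresponds to coefficients of opposite sign on the native and nonnative MMN predictors.

```latex
% Illustrative two-level growth model (generic form, assumed notation):
%   Vocab_{ti}: vocabulary of child i at age t (months); MMN_i: infant ERP measures
\begin{align*}
\text{Level 1:}\quad \mathrm{Vocab}_{ti} &= \pi_{0i} + \pi_{1i}\,(\mathrm{Age}_{ti} - 14) + e_{ti} \\
\text{Level 2:}\quad \pi_{0i} &= \beta_{00} + \beta_{01}\,\mathrm{MMN}^{\text{native}}_{i}
  + \beta_{02}\,\mathrm{MMN}^{\text{nonnative}}_{i} + r_{0i} \\
\pi_{1i} &= \beta_{10} + \beta_{11}\,\mathrm{MMN}^{\text{native}}_{i}
  + \beta_{12}\,\mathrm{MMN}^{\text{nonnative}}_{i} + r_{1i}
\end{align*}
```
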
These results are explained by our model: better native phonetic dis-
crimination enhances infants’ skills in detecting words and this vaults infants
towards language, whereas better nonnative abilities indicate that infants
remain longer at an earlier phase of development – still sensitive to all
phonetic differences. Infants’ ability to learn which phonetic units are rele-
vant in the language(s) in their environment, while decreasing or inhibiting
their attention to the phonetic units that do not distinguish words in their
language, is the necessary first step on the path toward language.
Importantly, recent data from our laboratory indicate long-term associ-
ations between early measures of infants’ phonetic perception and future
language and reading skills. Our studies show that the trajectory of change
in the discrimination of two simple vowels between 7 and 11 months of age
predicts those children’s language abilities and also their phonological aware-
ness skills, which are critical to reading, at the age of 5 years (Cardillo, 2010;
Cardillo Lebedeva and Kuhl, 2009).
Infants were tested at 7 and 11 months of age, and their patterns of
speech perception development were categorized in one of three ways: (1)
infants who show excellent native discrimination at 7 months and maintain
that ability at 11 months – the ‘high-high’ group, (2) infants who show poor
abilities at 7 months but increased performance at 11 months – the ‘low-
high’ group, and (3) infants who show poor abilities to discriminate at both
7 and 11 months of age – the ‘low-low’ group. We followed these children
until the age of 5: receptive and expressive language skills were assessed at
18 months, 24 months, and 5 years of age; phonological awareness skills,
the most accurate measure of eventual reading skills, were assessed at age 5.
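
The sketch below makes the three-way categorization explicit in code; the discrimination scores and the 0.5 threshold are hypothetical, chosen only to illustrate the grouping logic described above, not the criteria used in the study.

```python
from typing import Dict, Tuple

def classify_trajectory(score_7mo: float, score_11mo: float,
                        threshold: float = 0.5) -> str:
    """Label an infant's discrimination trajectory across the two test ages.

    The 0.5 threshold is purely illustrative; the study classified infants
    relative to performance criteria at each age and reported three patterns
    ('high-high', 'low-high', 'low-low').
    """
    early = "high" if score_7mo >= threshold else "low"
    late = "high" if score_11mo >= threshold else "low"
    return f"{early}-{late}"

# Example: hypothetical discrimination scores for three infants.
infants: Dict[str, Tuple[float, float]] = {
    "infant_A": (0.8, 0.9),   # strong at both ages -> 'high-high'
    "infant_B": (0.3, 0.7),   # improves by 11 months -> 'low-high'
    "infant_C": (0.2, 0.3),   # weak at both ages -> 'low-low'
}
for name, (s7, s11) in infants.items():
    print(name, classify_trajectory(s7, s11))
```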
Strong relationships were observed between infants’ early speech perception
performance and their later language skills at all ages, and between infants’
early speech perception performance and phonological awareness skills at
5 years of age. In all cases, the infants who showed excellent skill in detecting
phonetic differences in native language sounds by 11 months of age (the
‘high-high’ and the ‘low-high’ groups) had significantly higher expressive
and receptive language skills at 18 and 24 months, as well as at
the age of 5 years. Moreover, at the age of 5 years, these same two groups
scored significantly higher in pre-literacy skills involving phonological
awareness; importantly, these significance patterns held after measures of
socio-economic status (SES) were partialed out in the regression analysis
(Cardillo, 2010; Cardillo Lebedeva and Kuhl, 2009).
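
The following sketch illustrates, on simulated data with hypothetical variable names, the general logic of partialing out SES: the covariate is entered alongside the early-perception grouping in an ordinary least-squares regression, so that the group effect on age-5 phonological awareness is estimated over and above SES. It is not the actual regression model or dataset used in the studies cited above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated child-level data (hypothetical variable names): age-5 phonological
# awareness regressed on infant speech-perception group while controlling
# for socio-economic status.
rng = np.random.default_rng(2)
n = 90
ses = rng.normal(0.0, 1.0, n)                       # fake SES index
reached_high_by_11mo = rng.integers(0, 2, n)        # 1 = 'high-high' or 'low-high'
phon_awareness = (100 + 6.0 * reached_high_by_11mo + 3.0 * ses
                  + rng.normal(0.0, 8.0, n))
data = pd.DataFrame({"phon_awareness": phon_awareness,
                     "group_high_by_11mo": reached_high_by_11mo,
                     "ses": ses})

# OLS with SES entered as a covariate: the group coefficient estimates the
# association between early perception and pre-literacy skills over and
# above SES.
result = smf.ols("phon_awareness ~ group_high_by_11mo + ses", data).fit()
print(result.params)
print(result.pvalues)
```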


From theory to practice


While recommendations extending theory to practice should be made cau-
tiously, it is not difficult to extend the results of the early language and literacy
studies reviewed here to practice. First, the data show that the initial steps that
infants take toward language in the first year of life matter – they appear to be
a pathway to children’s development of later language and literacy. Infants’ early
language abilities, which can be tested with fairly simple measures in the first
year of life, predict language skills and literacy skills up to 4.5 years later. While
these data are correlational as opposed to causal in nature, they allow us to
begin to connect the dots and suggest that the richness of the early language
environment of the child creates the kind of neural architecture that is neces-
sary for robust language and literacy development. Our model and data suggest
as well that environmental influences affect these early steps. The trajectory of
phonetic learning between 6 and 12 months of life (which predicts children’s
future skills) is itself strongly correlated with the complexity and frequency
of the language young children experience at home (Raizada et al., 2008; in
preparation). Hearing the exaggerated speech known widely as ‘motherese’
early in development is strongly correlated with early speech discrimination
measured in the laboratory (Liu, Kuhl and Tsao, 2003). Motherese exaggerates
the critical acoustic cues in speech (Kuhl et al., 1997; Werker et al., 2007), and
infants’ social interest in speech is, we believe, important to the social learning
process. Thus, we assert that talking to children early in life, reading to children
early in life, and doing both of these things while interacting socially with
children around language and literacy activities, create the milieu in which
plasticity during the critical period can be maximized for all children.

Acknowledgements
The author and research reported here were supported by a grant from
the National Science Foundation’s Science of Learning Program to the Uni-
versity of Washington LIFE Center (SBE-0354453), and by grants from the
National Institutes of Health (HD37954, HD55782, HD02274, DC04661).
This chapter updates the information published in Kuhl (Neuron, 2010).

References
Aslin, R.N., & Mehler, J. (2005). Near-infrared spectroscopy for functional studies of brain activity in human infants: promise, prospects, and challenges. Journal of Biomedical Optics, 10, 11009.
Best, C.C., & McRoberts, G.W. (2003). Infant perception of nonnative consonant contrasts that adults assimilate in different ways. Language and Speech, 46, 183-216.
Bialystok, E., & Hakuta, K. (1994). In other words: the science and psychology of second-language acquisition. New York, NY: Basic Books.
Birdsong, D. (1992). Ultimate attainment in second language acquisition. Language, 68, 706-755.
Birdsong, D., & Molis, M. (2001). On the evidence for maturational constraints in second-language acquisition. Journal of Memory and Language, 44, 235-249.
Bortfeld, H., Wruck, E., & Boas, D.A. (2007). Assessing infants’ cortical response to speech using near-infrared spectroscopy. NeuroImage, 34, 407-415.
Bosch, L., & Sebastián-Gallés, N. (2003a). Simultaneous bilingualism and the perception of a language-specific vowel contrast in the first year of life. Language and Speech, 46, 217-243.
Bosch, L., & Sebastián-Gallés, N. (2003b). Language experience and the perception of a voicing contrast in fricatives: Infant and adult data. In D. Recasens, M.J. Solé & J. Romero (Eds.), Proceedings of the 15th international conference of phonetic sciences (pp. 1987-1990). Barcelona: UAB/Casual Productions.
Bosseler, A.N., Taulu, S., Imada, T., & Kuhl, P.K. (2011). Developmental changes in cortical rhythms to native and non-native phonetic contrasts. Symposium presentation at the 2011 Biennial Meeting of the Society for Research in Child Development, Montreal, Canada, March 31 – April 2, 2011.
Bruer, J.T. (2008). Critical periods in second language learning: distinguishing phenomena from explanation. In M. Mody & E. Silliman (Eds.), Brain, behavior and learning in language and reading disorders (pp. 72-96). New York, NY: The Guilford Press.
Bruner, J. (1983). Child’s talk: Learning to use language. New York: W.W. Norton.
Burns, T.C., Yoshida, K.A., Hill, K., & Werker, J.F. (2007). The development of phonetic representation in bilingual and monolingual infants. Applied Psycholinguistics, 28, 455-474.
Cardillo, G.C. (2010). Predicting the predictors: Individual differences in longitudinal relationships between infant phoneme perception, toddler vocabulary, and preschooler language and phonological awareness (Doctoral dissertation, University of Washington, 2010). Dissertation Abstracts International.
Cardillo Lebedeva, G.C., & Kuhl, P.K. (2009). Individual differences in infant speech perception predict language and pre-reading skills through age 5 years. Paper presented at the Annual Meeting of the Society for Developmental & Behavioral Pediatrics, Portland, OR.
Cheour, M., Imada, T., Taulu, S., Ahonen, A., Salonen, J., & Kuhl, P.K. (2004). Magnetoencephalography is feasible for infant assessment of auditory discrimination. Experimental Neurology, 190, S44-S51.
Cho, K.K.A., & Bear, M.F. (2010). Promoting neurologic recovery of function via metaplasticity. Future Neurology, 5, 21-26.
Chomsky, N. (1959). Review of Skinner’s Verbal Behavior. Language, 35, 26-58.
Conboy, B.T., Brooks, R., Meltzoff, A.N., & Kuhl, P.K. (submitted). Relating infants’ social behaviors to brain measures of second language phonetic learning.
Conboy, B.T., & Kuhl, P.K. (2010, November). Brain responses to words in 11-month-olds after second language exposure. Poster presented at the American Speech-Language-Hearing Association, November 18-20, Philadelphia, PA.
Conboy, B.T., & Kuhl, P.K. (2011). Impact of second-language experience in infancy: brain measures of first- and second-language speech perception. Developmental Science, 14, 242-248.
Conboy, B.T., Rivera-Gaxiola, M., Silva-Pereyra, J.F., & Kuhl, P.K. (2008). Event-related potential studies of early language processing at the phoneme, word, and sentence levels. In A.D. Friederici & G. Thierry (Eds.), Early language development: bridging brain and behaviour, Trends in language acquisition research series, Volume 5 (pp. 23-64). Amsterdam, The Netherlands: John Benjamins.
Dehaene-Lambertz, G., Dehaene, S., & Hertz-Pannier, L. (2002). Functional neuroimaging of speech perception in infants. Science, 298, 2013-2015.
Dehaene-Lambertz, G., Hertz-Pannier, L., Dubois, J., Meriaux, S., Roche, A., Sigman, M., & Dehaene, S. (2006). Functional organization of perisylvian activation during presentation of sentences in preverbal infants. Proceedings of the National Academy of Sciences of the United States of America, 103, 14240-14245.
Doupe, A., & Kuhl, P.K. (1999). Birdsong and speech: Common themes and mechanisms. Annual Review of Neuroscience, 22, 567-631.
Doupe, A.J., & Kuhl, P.K. (2008). Birdsong and human speech: common themes and mechanisms. In H.P. Zeigler & P. Marler (Eds.), Neuroscience of Birdsong (pp. 5-31). Cambridge, England: Cambridge University Press.
Eimas, P.D. (1975). Auditory and phonetic coding of the cues for speech: discrimination of the /r–l/ distinction by young infants. Perception and Psychophysics, 18, 341-347.
Eimas, P.D., Siqueland, E.R., Jusczyk, P., & Vigorito, J. (1971). Speech perception in infants. Science, 171, 303-306.
Flege, J., Takagi, N., & Mann, V. (1996). Lexical familiarity and English-language experience affect Japanese adults’ perception of /r/ and /l/. Journal of the Acoustical Society of America, 99, 1161-1173.
Flege, J.E., Yeni-Komshian, G.H., & Liu, S. (1999). Age constraints on second-language acquisition. Journal of Memory and Language, 41, 78-104.
Friederici, A.D. (2005). Neurophysiological markers of early language acquisition: from syllables to sentences. Trends in Cognitive Science, 9, 481-488.
Friederici, A.D., & Wartenburger, I. (2010). Language and brain. Wiley Interdisciplinary Reviews: Cognitive Science, 1, 150-159. DOI: 10.1002/WCS.9
Garcia-Sierra, A., Rivera-Gaxiola, M., Conboy, B.T., Romo, H., Percaccio, C.R., Klarman, L., Ortiz, S., & Kuhl, P.K. (in press). Bilingual language learning: An ERP study relating early brain responses to speech, language input, and later word production. Journal of Phonetics.
Gernsbacher, M.A., & Kaschak, M.P. (2003). Neuroimaging studies of language production and comprehension. Annual Review of Psychology, 54, 91-114.
Golestani, N., Molko, N., Dehaene, S., Lebihan, D., & Pallier, C. (2007). Brain structure predicts the learning of foreign speech sounds. Cerebral Cortex, 17, 575-582.
Goto, H. (1971). Auditory perception by normal Japanese adults of the sounds ‘l’ and ‘r’. Neuropsychologia, 9, 317-323.
Gratton, G., & Fabiani, M. (2001a). Shedding light on brain function: The event-related optical signal. Trends in Cognitive Science, 5, 357-363.
Gratton, G., & Fabiani, M. (2001b). The event-related optical signal: A new tool for studying brain function. International Journal of Psychophysiology, 42, 109-121.
Hensch, T.K. (2005). Critical period plasticity in local cortical circuits. Nature Reviews Neuroscience, 6, 877-888.
Hensch, T.K., & Stryker, M.P. (1996). Ocular dominance plasticity under metabotropic glutamate receptor blockade. Science, 272, 554-557.
Hespos, S.J., & Spelke, E.S. (2004). Conceptual precursors to language. Nature, 430, 453-456.
Homae, F., Watanabe, H., Nakano, T., Asakawa, K., & Taga, G. (2006). The right hemisphere of sleeping infant perceives sentential prosody. Neuroscience Research, 54, 276-280.
Hubel, D.H., & Wiesel, T.N. (1963). Receptive fields of cells in striate cortex of very young, visually inexperienced kittens. Journal of Neurophysiology, 26, 994-1002.
Imada, T., Zhang, Y., Cheour, M., Taulu, S., Ahonen, A., & Kuhl, P.K. (2006). Infant speech perception activates Broca’s area: a developmental magnetoencephalography study. Neuroreport, 17, 957-962.
Jensen, O., & Tesche, C.D. (2002). Frontal theta activity in humans increases with memory load in a working memory task. European Journal of Neuroscience, 15, 1395-1399.
Johnson, J., & Newport, E. (1989). Critical period effects in second language learning: the influence of maturational state on the acquisition of English as a second language. Cognitive Psychology, 21, 60-99.
Kahana, M.J., Sekuler, R., Caplan, J.B., Kirschen, M., & Madsen, J.R. (1999). Human theta oscillations exhibit task dependence during virtual maze navigation. Nature, 399, 781-784.
Knudsen, E.I. (2004). Sensitive periods in the development of the brain and behavior. Journal of Cognitive Neuroscience, 16, 1412-1425.
Kuhl, P.K. (2000). A new view of language acquisition. Proceedings of the National Academy of Sciences, 97, 11850-11857.
Kuhl, P.K. (2004). Early language acquisition: cracking the speech code. Nature Reviews Neuroscience, 5, 831-843.
Kuhl, P.K. (2007). Is speech learning ‘gated’ by the social brain? Developmental Science, 10, 110-120.
Kuhl, P.K. (2009). Early language acquisition: Neural substrates and theoretical models. In M.S. Gazzaniga (Ed.), The Cognitive Neurosciences, 4th Edition (pp. 837-854). Cambridge, MA: MIT Press.
Kuhl, P.K. (2010a). Brain mechanisms in early language acquisition. Neuron, 67, 713-727.
Kuhl, P.K. (2010b, October). The linguistic genius of babies. A TEDTalk retrieved from TED.com website: www.ted.com/talks/lang/eng/patricia_kuhl_the_linguistic_genius_of_babies.html
Kuhl, P.K. (in press). Social mechanisms in early language acquisition: Understanding integrated brain systems supporting language. In J. Decety & J. Cacioppo (Eds.), The handbook of social neuroscience. Oxford, UK: Oxford University Press.
Kuhl, P.K., Andruski, J.E., Chistovich, I.A., Chistovich, L.A., Kozhevnikova, E.V., Ryskina, V.L., Stolyarova, E.I., Sundberg, U., & Lacerda, F. (1997). Cross-language analysis of phonetic units in language addressed to infants. Science, 277, 684-686.
Kuhl, P.K., Coffey-Corina, S., Padden, D., & Dawson, G. (2005b). Links between social and linguistic processing of speech in preschool children with autism: behavioral and electrophysiological evidence. Developmental Science, 8, 1-12.
Kuhl, P.K., Conboy, B.T., Coffey-Corina, S., Padden, D., Rivera-Gaxiola, M., & Nelson, T. (2008). Phonetic learning as a pathway to language: New data and native language magnet theory expanded (NLM-e). Philosophical Transactions of the Royal Society B: Biological Sciences, 363, 979-1000.
Kuhl, P.K., Conboy, B.T., Padden, D., Nelson, T., & Pruitt, J. (2005a). Early speech perception and later language development: implications for the ‘critical period’. Language Learning and Development, 1, 237-264.
Kuhl, P.K., & Damasio, A. (in press). Language. In E.R. Kandel, J.H. Schwartz, T.M. Jessell, S. Siegelbaum, & J. Hudspeth (Eds.), Principles of neural science, 5th Edition. New York: McGraw Hill.
Kuhl, P.K., & Rivera-Gaxiola, M. (2008). Neural substrates of language acquisition. Annual Review of Neuroscience, 31, 511-534.
Kuhl, P.K., Stevens, E., Hayashi, A., Deguchi, T., Kiritani, S., & Iverson, P. (2006). Infants show facilitation for native language phonetic perception between 6 and 12 months. Developmental Science, 9, F13-F21.
Kuhl, P.K., Tsao, F.M., & Liu, H.M. (2003). Foreign-language experience in infancy: effects of short-term exposure and social interaction on phonetic learning. Proceedings of the National Academy of Sciences of the United States of America, 100, 9096-9101.
Kuhl, P.K., Williams, K.A., Lacerda, F., Stevens, K.N., & Lindblom, B. (1992). Linguistic experience alters phonetic perception in infants by 6 months of age. Science, 255, 606-608.
Ladefoged, P. (2001). Vowels and consonants: An introduction to the sounds of language. Oxford: Blackwell Publishers.
Lasky, R.E., Syrdal-Lasky, A., & Klein, R.E. (1975). VOT discrimination by four to six and a half month old infants from Spanish environments. Journal of Experimental Child Psychology, 20, 215-225.
Lewkowicz, D.J., & Ghazanfar, A.A. (2006). The decline of cross-species intersensory perception in human infants. Proceedings of the National Academy of Sciences of the United States of America, 103, 6771-6774.
Liu, H.M., Kuhl, P.K., & Tsao, F.M. (2003). An association between mothers’ speech clarity and infants’ speech discrimination skills. Developmental Science, 6, F1-F10.
Mayberry, R.I., & Lock, E. (2003). Age constraints on first versus second language acquisition: evidence for linguistic plasticity and epigenesis. Brain and Language, 87, 369-384.
Maye, J., Werker, J.F., & Gerken, L. (2002). Infant sensitivity to distributional information can affect phonetic discrimination. Cognition, 82, B101-B111.
Maye, J., Weiss, D., & Aslin, R. (2008). Statistical learning in infants: Facilitation and feature generalization. Developmental Science, 11, 122-134.
McClelland, J., Thomas, A., McCandliss, B., & Fiez, J. (1999). Understanding failures of learning: Hebbian learning, competition for representational space, and some preliminary experimental data. In J. Reggia, E. Ruppin & D. Glanzman (Eds.), Progress in Brain Research, Volume 121. Disorders of Brain, Behavior and Cognition: The Neurocomputational Perspective (pp. 75-80). Amsterdam: Elsevier.
Meltzoff, A.N., Kuhl, P.K., Movellan, J., & Sejnowski, T. (2009). Foundations for a new science of learning. Science, 325, 284-288.
Miyawaki, K., Strange, W., Verbrugge, R., Liberman, A.M., Jenkins, J.J., & Fujimura, O. (1975). An effect of linguistic experience: the discrimination of [r] and [l] by native speakers of Japanese and English. Perception and Psychophysics, 18, 331-340.
Neville, H.J., Coffey, S.A., Lawson, D.S., Fischer, A., Emmorey, K., & Bellugi, U. (1997). Neural systems mediating American Sign Language: effects of sensory experience and age of acquisition. Brain and Language, 57, 285-308.
Newport, E. (1990). Maturational constraints on language learning. Cognitive Science, 14, 11-28.
Newport, E.L., Bavelier, D., & Neville, H.J. (2001). Critical thinking about critical periods: Perspectives on a critical period for language acquisition. In E. Dupoux (Ed.), Language, Brain, and Cognitive Development: Essays in Honor of Jacques Mehler (pp. 481-502). Cambridge, MA: MIT Press.
Ortiz-Mantilla, S., Choe, M.S., Flax, J., Grant, P.E., & Benasich, A.A. (2010). Association between the size of the amygdala in infancy and language abilities during the preschool years in normally developing children. NeuroImage, 49, 2791-2799.
Pascalis, O., de Haan, M., & Nelson, C.A. (2002). Is face processing species-specific during the first year of life? Science, 296, 1321-1323.
Peña, M., Bonatti, L., Nespor, M., & Mehler, J. (2002). Signal-driven computations in speech processing. Science, 298, 604-607.
Petitto, L.A., & Marentette, P.F. (1991). Babbling in the manual mode: Evidence for the ontogeny of language. Science, 251, 1493-1496.
Raizada, R.D., Richards, T.L., Meltzoff, A.N., & Kuhl, P.K. (2008). Socioeconomic status predicts hemispheric specialization of the left inferior frontal gyrus in young children. NeuroImage, 40, 1392-1401.
Rivera-Gaxiola, M., Silva-Pereyra, J., & Kuhl, P.K. (2005a). Brain potentials to native and non-native speech contrasts in 7- and 11-month-old American infants. Developmental Science, 8, 162-172.
Rivera-Gaxiola, M., Klarman, L., Garcia-Sierra, A., & Kuhl, P.K. (2005b). Neural patterns to speech and vocabulary growth in American infants. NeuroReport, 16, 495-498.
Saffran, J., Aslin, R., & Newport, E. (1996). Statistical learning by 8-month-old infants. Science, 274, 1926-1928.
Saffran, J., Werker, J., & Werner, L.A. (2006). The infant’s auditory world: Hearing, speech and the beginnings of language. In D. Kuhn & R.S. Siegler (Eds.), Handbook of Child Psychology, Vol. 2, Cognition, Perception and Language (6th edition) (pp. 58-108). New York, NY: Wiley.
Sebastián-Gallés, N., & Bosch, L. (2009). Developmental shift in the discrimination of vowel contrasts in bilingual infants: Is the distributional account all there is to it? Developmental Science, 12, 874-887.
Skinner, B.F. (1957). Verbal Behavior. New York: Appleton-Century-Crofts.
Sundara, M., Polka, L., & Molnar, M. (2008). Development of coronal stop perception: Bilingual infants keep pace with their monolingual peers. Cognition, 108, 232-242.
Sundara, M., & Scutellaro, A. (in press). Rhythmic distance between languages affects the development of speech perception in bilingual infants. Journal of Phonetics.
Taga, G., & Asakawa, K. (2007). Selectivity and localization of cortical response to auditory and visual stimulation in awake infants aged 2 to 4 months. NeuroImage, 36, 1246-1252.
Teinonen, T., Fellman, V., Naatanen, R., Alku, P., & Huotilainen, M. (2009). Statistical language learning in neonates revealed by event-related brain potentials. BMC Neuroscience, 10, doi:10.1186/1471-2202-10-21.
Tomasello, M. (2003a). Constructing a Language: A Usage-Based Theory of Language Acquisition. Cambridge, MA: Harvard University Press.
Tomasello, M. (2003b). The key is social cognition. In D. Gentner & S. Kuczaj (Eds.), Language and Thought (pp. 47-51). Cambridge, MA: MIT Press.
Tsao, F.M., Liu, H.M., & Kuhl, P.K. (2004). Speech perception in infancy predicts language development in the second year of life: a longitudinal study. Child Development, 75, 1067-1084.
Tsao, F.M., Liu, H.M., & Kuhl, P.K. (2006). Perception of native and non-native affricate-fricative contrasts: cross-language tests on adults and infants. Journal of the Acoustical Society of America, 120, 2285-2294.
Vallabha, G.K., McClelland, J.L., Pons, F., Werker, J.F., & Amano, S. (2007). Unsupervised learning of vowel categories from infant-directed speech. Proceedings of the National Academy of Sciences (USA), 104, 13273-13278.
Vallabha, G.K., & McClelland, J.L. (2007). Success and failure of new speech category learning in adulthood: Consequences of learned Hebbian attractors in topographic maps. Cognitive, Affective and Behavioral Neuroscience, 7, 53-73.
Vygotsky, L.S. (1962). Thought and Language. Cambridge, MA: MIT Press.
Weber-Fox, C.M., & Neville, H.J. (1999). Functional neural subsystems are differentially affected by delays in second language immersion: ERP and behavioral evidence in bilinguals. In D. Birdsong (Ed.), Second Language Acquisition and the Critical Period Hypothesis (pp. 23-38). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Weikum, W.M., Vouloumanos, A., Navarra, J., Soto-Faraco, S., Sebastian-Galles, N., & Werker, J.F. (2007). Visual language discrimination in infancy. Science, 316, 1159.
Werker, J.F., & Curtin, S. (2005). PRIMIR: a developmental framework of infant speech processing. Language Learning and Development, 1, 197-234.
Werker, J.F., & Lalonde, C. (1988). Cross-language speech perception: initial capabilities and developmental change. Developmental Psychology, 24, 672-683.
Werker, J.F., Pons, F., Dietrich, C., Kajikawa, S., Fais, L., & Amano, S. (2007). Infant-directed speech supports phonetic category learning in English and Japanese. Cognition, 103, 147-162.
Werker, J.F., & Tees, R.C. (1984). Cross-language speech perception: evidence for perceptual reorganization during the first year of life. Infant Behavior and Development, 7, 49-63.
White, L., & Genesee, F. (1996). How native is near-native? The issue of ultimate attainment in adult second language acquisition. Second Language Research, 12, 233-265. DOI: 10.1177/026765839601200301.
Wiesel, T.N., & Hubel, D.H. (1963). Single cell responses in striate cortex of kittens deprived of vision in one eye. Journal of Neurophysiology, 26, 1003-1017.
Woolley, S.C., & Doupe, A.J. (2008). Social context-induced song variation affects female behavior and gene expression. Public Library of Science Biology, 6, e62.
Yamada, R., & Tohkura, Y. (1992). The effects of experimental variables on the perception of American English /r/ and /l/ by Japanese listeners. Perception and Psychophysics, 52, 376-392.
Yeni-Komshian, G.H., Flege, J.E., & Liu, S. (2000). Pronunciation proficiency in the first and second languages of Korean–English bilinguals. Bilingualism: Language and Cognition, 3, 131-149.
Zhang, Y., Kuhl, P.K., Imada, T., Kotani, M., & Tohkura, Y. (2005). Effects of language experience: neural commitment to language-specific auditory patterns. NeuroImage, 26, 703-720.
Zhang, Y., Kuhl, P.K., Imada, T., Iverson, P., Pruitt, J., Stevens, E.B., Kawakatsu, M., Tohkura, Y., & Nemoto, I. (2009). Neural signatures of phonetic learning in adulthood: a magnetoencephalography study. NeuroImage, 46, 226-240.



Figures

Figure 1. Four techniques now used extensively with infants and young children to examine their re-
sponses to linguistic signals (From Kuhl & Rivera-Gaxiola, 2008).


Figure 3. Effects of age on discrimination of the American English /ra-la/ phonetic contrast by American
and Japanese infants at 6–8 and 10–12 months of age. Mean percent correct scores are shown with
standard errors indicated (Kuhl et al., 2006).

Figure 4. Idealized case of distributional learning is shown. Two women speak ‘motherese’, one in Eng-
lish and the other in Japanese. Distributions of English /r/ and /l/, as well as Japanese /r/, are shown.
Infants’ sensitivity to these distributional cues has been shown with simple stimuli. (Adapted from
Kuhl, 2010b).


Figure 5. The need for social interaction in language acquisition is shown by foreign-language learning
experiments. Nine-month-old infants experienced 12 sessions of Mandarin Chinese through (A) natural
interaction with a Chinese speaker (left) or the identical linguistic information delivered via television
(right) or audiotape (not shown). (B) Natural interaction resulted in significant learning of Mandarin
phonemes when compared with a control group who participated in interaction using English (left). No
learning occurred from television or audiotaped presentations (middle). Data for age-matched Chinese
and American infants learning their native languages are shown for comparison (right) (adapted from
Kuhl et al., 2003).


Figure 6. On the NLM-e account, monolingual and bilingual children ‘open’ the critical period for pho-
netic learning at the same point in time. However, bilingual children remain ‘open’ to the effects of ex-
perience for a longer period of time, due to the higher variability in speech input.

Figure 7. (A) A 7.5-month-old infant wearing an ERP electrocap. (B) Infant ERP waveforms at one sensor
location (CZ) for one infant are shown in response to a native (English) and nonnative (Mandarin) pho-
netic contrast at 7.5 months. The mismatch negativity (MMN) is obtained by subtracting the standard
waveform (black) from the deviant waveform (English = red; Mandarin = blue). This infant’s response
suggests that native-language learning has begun because the MMN negativity in response to the native
English contrast is considerably stronger than that to the nonnative contrast. (C) Hierarchical linear
growth modeling of vocabulary growth between 14 and 30 months for MMN values of +1SD and −1SD
on the native contrast at 7.5 months (C, left) and vocabulary growth for MMN values of +1SD and −1SD
on the nonnative contrast at 7.5 months (C, right) (Kuhl, 2010a).
