
SCIENCE'S COMPASS

PERSPECTIVES: NEUROSCIENCE

Mental Models and Musical Minds

Robert J. Zatorre and Carol L. Krumhansl

R. J. Zatorre is at the Montreal Neurological Institute, McGill University, Montreal, Quebec H3A 2B4, Canada. E-mail: robert.zatorre@mcgill.ca. C. L. Krumhansl is in the Department of Psychology, Cornell University, Ithaca, NY 14853, USA.

Music is found in all cultures and has a remarkable diversity of forms. Cognitive scientists have discovered highly specific and detailed knowledge of musical structure even in individuals without extensive musical training. Brain imaging has proved valuable for investigating the neural basis of a variety of cognitive functions, including how the brain processes music. These developments converge on page 2167 of this issue with the research article by Janata et al. (1). They report that abstract patterns of Western tonal musical structure are mirrored in patterns of brain activity in human subjects. Using functional magnetic resonance imaging, Janata and colleagues analyzed the brain activity of eight musically experienced listeners as they performed two musical perception tasks requiring them to detect a deviance in timbre (a note played by a flute instead of a clarinet) and notes that violated the local tonality. The investigators found that the auditory cortex as well as a number of other brain areas were activated in their subjects as they undertook the musical perception tasks. The most consistent activation was along the superior temporal gyrus of both hemispheres. Additional regions that were activated included the temporal, parietal, frontal, and limbic lobes as well as the thalamus and cerebellum, indicating that the processing of music is extremely complicated.

Recent research in neuroanatomy, neurophysiology, and functional brain imaging has investigated the location and properties of auditory cortical fields (2). There is an emerging consensus that both the cytoarchitecture and neural connections within auditory cortical fields form the basis of a hierarchy of processing regions in the brain. These regions start in core areas of the primary auditory cortex and emanate in several processing streams that extend into various portions of the superior and middle temporal gyri (see the top figure). Music offers a way to explore the functional organization of these putative processing streams. Auditory cortical regions have extensive, spatially organized projections to and from the frontal and parietal lobes. These extended cortical networks are thought to underlie the complex neural events engaged during the processing of complex sounds. Even a seemingly simple musical task such as perceiving and recognizing a melody entails a variety of cognitive processes, including perceptual, attentional, mnemonic, and affective responses.

Music seems to depend on specific brain circuitry, because it can be dissociated from the processing of other classes of sounds, including speech, in individuals with certain diseases or brain lesions. This concept is supported by functional imaging studies, which reveal that there are specialized activity patterns for tonal processing, including those found in the temporal and frontal brain areas that are critical for tonal working memory. In addition, as Janata and colleagues confirm, there are hemispheric asymmetries; for example, the right side of the brain is preferentially activated during the processing of musical pitch (3). Initial studies on the affective response to music implicate a variety of neural structures in limbic and paralimbic brain areas, as well as in midbrain and basal forebrain regions linked to reward and motivation (4). One area consistently modulated by affective responses is the ventromedial frontal cortex (see the top figure), the principal region identified by Janata et al. as showing sensitivity to tonality.

A separate line of investigation originates with research into the psychological reality of theoretical descriptions of music. Are listeners and performers influenced by the tonal, harmonic, melodic, rhythmic, and metric patterns identified in music theory? If so, does this knowledge influence how a listener organizes and remembers music, and how a musician plans and executes a performance? Empirical studies have extensively documented the precise contents of this knowledge and show that it largely conforms to descriptions of music theory. Many aspects of music, such as whether notes are played in tune, appear to be acquired implicitly (that is, without explicit formal instruction), and the influence of this knowledge is manifest in a wide range of musical behaviors.

One of the most developed areas in the cognitive science of music is the description of pitch structures in Western tonal-harmonic music. This includes musical scales, harmonies, and relations between musical keys. Music in many styles is organized around one or more stable reference tones (the tonic, in Western tonal music). Other tones differ in their degree of perceived stability, giving rise to a tonal hierarchy. Tones high in the hierarchy are remembered accurately, are heard as giving a sense of finality or closure, and are more expected within the tonal context, an effect that Janata et al. exploit in their tonality perception task.

The sensation of music. (A) Auditory cortical areas in the superior temporal gyrus that respond to musical stimuli. Regions that are most strongly activated are shown in red. (B) Metabolic activity in the ventromedial region of the frontal lobe increases as a tonal stimulus becomes more consonant.

Mental key maps. (A) Unfolded version of the key map, with opposite edges to be considered matched. There is one circle of fifths for major keys (red) and one for minor keys (blue), each wrapping the torus three times. In this way, every major key is flanked by its relative minor on one side (for example, C major and a minor) and its parallel minor on the other (for example, C major and c minor). (B) Musical keys as points on the surface of a torus.

Keys that are considered closely related, in the sense that modulations (that is, shifts from one key to another) are relatively easy to effect, have similar tonal hierarchies and shared harmonies. Geometric models, or maps, of these key relations that conform to descriptions in music theory have been generated by computational methods. One such key map in the shape of a torus or ring (5), similar to that used by Janata et al. in their experiments, was obtained by training a self-organizing neural network model with experimentally quantified tonal hierarchies (see the bottom figure). This key map provides a visual display of an abstract mental model of key relationships. Results of experiments investigating how the sense of key develops and continuously changes can be projected onto the key map (6).

The strength of the Janata et al. work is that it brings together various techniques in an original combination. Their study probes the neural correlates of tonality perception in an effort to identify brain regions that respond to musically modulating sequences. In so doing, this study raises a variety of questions. Is the topography of the key map reflected directly in the cortical activation pattern? Alternatively, do the Janata et al. findings reflect more general processes of remembering and comparing tones? How are the activation patterns (in particular those of the ventromedial frontal area) observed by Janata and co-workers related to affective responses to music? One might expect sensory-related regions of the superior temporal cortex to be implicated in computing the relationships between tones that result in tonality mapping. However, it is still not known how the perceptual responses of the superior temporal cortex interact with the responses distributed across the activated brain regions reported by Janata and colleagues. Implicit learning of key musical structures may take place over a lifetime of listening to music, so it is possible that tonal maps become widely distributed over the brain. Regardless of the answers to these questions, cognitive neuroscience has benefited from the application of sophisticated cognitive models to explore the correlation between music processing and neuroanatomical regions of the brain.
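
To make the geometry of the key map concrete, here is a minimal Python sketch that places the 24 major and minor keys on a torus using two interlocking circles of fifths, each winding around the torus three times as described above. The angle conventions, the minor-key offset, and the distance measure are illustrative assumptions; they are not the coordinates produced by the trained self-organizing network of (5).

```python
# Illustrative sketch (not the trained model from reference 5): lay the 24 major
# and minor keys out on a torus using the circle of fifths, so that harmonic
# neighbors end up geometrically close. The angle conventions and the minor-key
# offset are assumptions chosen for clarity.
import numpy as np

FIFTHS_MAJOR = ["C", "G", "D", "A", "E", "B", "F#", "C#", "G#", "D#", "A#", "F"]
FIFTHS_MINOR = [k.lower() for k in ["A", "E", "B", "F#", "C#", "G#", "D#", "A#", "F", "C", "G", "D"]]

def torus_coordinates(step, offset=0.0):
    """Return (theta, phi) for the key at position `step` on a circle of fifths.

    theta goes once around the torus per 12 fifths; phi wraps around three times,
    matching the 'wrapping the torus three times' description of the key map.
    """
    theta = 2 * np.pi * step / 12
    phi = (2 * np.pi * 3 * step / 12 + offset) % (2 * np.pi)
    return theta, phi

# Place each relative minor near its major key by a small assumed phi offset.
key_map = {}
for i, (maj, rel_min) in enumerate(zip(FIFTHS_MAJOR, FIFTHS_MINOR)):
    key_map[maj] = torus_coordinates(i)
    key_map[rel_min] = torus_coordinates(i, offset=np.pi / 6)

def key_distance(a, b):
    """Distance on the torus: wrap-around differences in each angular coordinate."""
    d = np.abs(np.array(key_map[a]) - np.array(key_map[b]))
    d = np.minimum(d, 2 * np.pi - d)
    return float(np.hypot(*d))

for other in ["G", "a", "F#"]:
    print(f"C major to {other}: {key_distance('C', other):.2f}")
```

Running the sketch shows the intended ordering: harmonically close keys such as G major and a minor land nearer to C major than a distant key such as F-sharp major.
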
References and Notes
1. P. Janata et al., Science 298, 2167 (2002).
2. J. H. Kaas, T. A. Hackett, M. J. Tramo, Curr. Opin. Neurobiol. 9, 164 (1999).
3. R. J. Zatorre, I. Peretz, Ann. N.Y. Acad. Sci. 930, 193 (2001).
4. A. J. Blood, R. J. Zatorre, Proc. Natl. Acad. Sci. U.S.A. 98, 11818 (2001).
5. P. Toiviainen, C. L. Krumhansl, Perception, in press.
6. See www.cc.jyu.fi/~ptoiviai/bwv805/index.html for a movie that shows how key strengths change over time as a Bach organ duet is played. The perceptual judgments of listeners (top) are compared with a computer model of key-finding (bottom).


PERSPECTIVES: MATERIALS SCIENCE

Nanocubes and Nanoboxes

Catherine J. Murphy

The author is in the Department of Chemistry and Biochemistry, University of South Carolina, Columbia, SC 29208, USA. E-mail: murphy@mail.chem.sc.edu

The visions of nanotechnology (smaller, faster, cheaper, smarter information-storage devices, energy sources, and medical devices in which the size of individual device elements approaches that of individual molecules) crucially depend on the ability to make and manipulate objects on the 1- to 100-nm scale.

The engineer's "top down" approach to making nanometer-scale objects is to carve them out lithographically from a substrate; the chemist's "bottom up" approach is to assemble them from molecular-scale precursors. On page 2176 of this issue, Sun and Xia (1) use the latter approach to show that simple chemical reactions in solution can produce silver nanocubes of controllable size in high yield. A simple, quantitative oxidation-reduction reaction of the silver nanocubes with gold salts results in hollow gold nanoboxes. The cubic faces of these nanomaterials are crystallographically well defined (1), an important feature for connecting these nanometer-scale elements into future devices.

Many applications envisioned for nanotechnology require nanometer-scale elements that are conducting or semiconducting. Inorganic materials such as metals and semiconductors have fundamental length scales in the 1- to 100-nm range (2, 3). In metals, the mean free path of an electron at room temperature is ~10 to 100 nm (2). Hence, in a metallic particle with a diameter of ~100 nm or less, substantial deviations from bulk metallic properties are expected, and new size-dependent properties may emerge. For example, gold ceases to be a noble, unreactive metal: Gold nanoparticles 2 to 3 nm in diameter can catalyze chemical reactions (4). The melting temperature of gold decreases drastically with size for spheres smaller than 20 nm (5). At diameters from ~10 to 100 nm, the spheres appear red, not gold, when well dispersed, as in stained glass (see the first figure). Nonspherical gold and silver nanoparticles absorb and scatter light of different wavelengths, depending on nanoparticle size and shape (6, 7). Silver and gold nanoparticles have been used as sensors to detect analytes through surface-enhanced Raman scattering and other optical effects peculiar to the ~10- to 100-nm size range (8–10).

Synthetic chemical methods for making metallic nanoparticles of controlled size and shape are continually being improved. Metals such as silver, gold, cobalt, and platinum have been made into nanospheres, nanorods, nanowires, nanocubes, and nanoprisms through chemical reactions of precursors at room or slightly elevated temperatures (6, 7, 11–13), typically in the presence of a directing agent. Unfortunately, multiple shapes and sizes of nanoparticles are frequently produced in these reactions (see the second figure). Purification by centrifugation and size-selective precipitation (or tight control over reaction time) is then required to isolate pure products (6, 7, 11–13). Control of size and shape was originally attributed solely to the presence of the directing agent, which functions as a hard or soft template (such as porous alumina membranes or micelles). It is now widely believed that preferential absorption of molecules and ions in solution to different crystal faces directs the growth of nanoparticles into various shapes by controlling the growth rates along different crystal axes (11, 14, 15). This view is shared by Sun and Xia (1).

In their study, the reaction to make silver nanocubes from silver nitrate takes place at ~150°C in a high-boiling-point solvent (ethylene glycol), which also func-

Red gold. Stained-glass window in Milan Cathedral, Italy, made by Niccolo da Varallo between 1480 and 1486, showing the birth of St. Eligius, patron saint of goldsmiths. The red colors are due to colloidal gold.

CREDIT: FOTOTECA VEN. FABBRICA DEL DUOMO


RESEARCH ARTICLES

The Cortical Topography of Tonal Structures Underlying Western Music


Petr Janata,1,2* Jeffrey L. Birk,1 John D. Van Horn,2,3 Marc Leman,4 Barbara Tillmann,1,2 Jamshed J. Bharucha1,2
Western tonal music relies on a formal geometric structure that determines distance relationships within a harmonic or tonal space. In functional magnetic resonance imaging experiments, we identified an area in the rostromedial prefrontal cortex that tracks activation in tonal space. Different voxels in this area exhibited selectivity for different keys. Within the same set of consistently activated voxels, the topography of tonality selectivity rearranged itself across scanning sessions. The tonality structure was thus maintained as a dynamic topography in cortical areas known to be at a nexus of cognitive, affective, and mnemonic processing.

The use of tonal music as a stimulus for probing the cognitive machinery of the human brain has an allure that derives, in part, from the geometric properties of the theoretical and cognitive structures involved in specifying the distance relationships among individual pitches, pitch classes (chroma), pitch combinations (chords), and keys (1–3). These distance relationships shape our perceptions of music and allow us, for example, to notice when a pianist strikes a wrong note. One geometric property of Western tonal music is that the distances among major and minor keys can be represented as a tonality surface that projects onto the doughnut shape of a torus (1, 4). A piece of music elicits activity on the tonality surface, and harmonic motion can be conceptualized as displacements of the activation focus on the tonality surface (3). The distances on the surface also help govern expectations that actively arise while one listens to music.

1Department of Psychological and Brain Sciences, 2Center for Cognitive Neuroscience, 3Dartmouth Brain Imaging Center, Dartmouth College, Hanover, NH 03755, USA. 4Institute for Psychoacoustics and Electronic Music, Ghent University, Ghent, Belgium.

*To whom correspondence should be addressed. Email: petr.janata@dartmouth.edu

Patterns of expectation elicitation and fulfillment may underlie our affective responses to music (5). Two lines of evidence indicate that the tonality surface is represented in the human brain. First, when one subjectively rates how well each of 12 probe tones, drawn from the chromatic scale (6), fits into a preceding tonal context that is established by a single chord, chord progression, or melody, the rating depends on the relationship of each tone to the instantiated tonal context. Nondiatonic tones that do not occur in the key are rated as fitting poorly, whereas tones that form part of the tonic triad (the defining chord of the key) are judged as fitting best (2). Probe-tone profiles obtained in this manner for each key can then be correlated with the probe-tone profile of every other key to obtain a matrix of distances among the 24 major and minor keys. The distance relationships among the keys readily map onto the surface of the torus (4). Thus, there is a direct correspondence between music-theoretic and cognitive descriptions of the harmonic organization of tonal music (7).
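
As a rough illustration of the correlation step just described, the following sketch rotates a major and a minor probe-tone profile through all 12 transpositions and correlates every pair of the resulting 24 key profiles. The profile values are only approximations of the published probe-tone ratings (2) and are used here purely for illustration; the published experiments, not this sketch, define the actual distance matrix.

```python
# Minimal sketch of the distance-matrix construction described above: rotate a
# major and a minor probe-tone profile through all 12 transpositions and correlate
# every key's profile with every other key's. The profile values below only
# approximate the published probe-tone ratings; they are illustrative.
import numpy as np

PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_PROFILE = np.array([6.4, 2.2, 3.5, 2.3, 4.4, 4.1, 2.5, 5.2, 2.4, 3.7, 2.3, 2.9])
MINOR_PROFILE = np.array([6.3, 2.7, 3.5, 5.4, 2.6, 3.5, 2.5, 4.8, 4.0, 2.7, 3.3, 3.2])

def key_profiles():
    """Probe-tone profile for each of the 24 keys, obtained by rotating the
    C-major and c-minor templates to every possible tonic."""
    names, profiles = [], []
    for tonic in range(12):
        names.append(PITCH_CLASSES[tonic])             # major keys: uppercase
        profiles.append(np.roll(MAJOR_PROFILE, tonic))
        names.append(PITCH_CLASSES[tonic].lower())     # minor keys: lowercase
        profiles.append(np.roll(MINOR_PROFILE, tonic))
    return names, np.vstack(profiles)

names, profiles = key_profiles()
corr = np.corrcoef(profiles)        # 24 x 24 inter-key correlation matrix

c_major = names.index("C")
ranked = sorted(zip(names, corr[c_major]), key=lambda kv: kv[1], reverse=True)
# Closely related keys (a minor, G major, F major) rank near the top for C major,
# while distant keys such as F# major fall near the bottom, mirroring the torus.
print(ranked[:4], ranked[-1])
```

The high correlations between neighboring keys and the negative correlations between distant keys are exactly the structure that multidimensional scaling or a self-organizing map projects onto the torus.
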

Second, electroencephalographic studies of musical expectancy (8–11) have examined the effect of melodic and harmonic context violations on one or more components of event-related brain responses that index the presence and magnitude of context violations. Overall, the cognitive distance of the probe event from the established harmonic context correlates positively with the amplitudes of such components. These effects appear even in listeners without any musical training (9, 11). The perceptual and cognitive structures that facilitate listening to music may thus be learned implicitly (2, 12–15).

The prefrontal cortex has been implicated in the manipulation and evaluation of tonal information (10, 11, 16–18). However, the regions that track motion on the tonality surface have not been identified directly. When presented with a stimulus that systematically moves across the entire tonality surface, will some populations of neurons respond selectively to one region of the surface and other populations respond selectively to another region of the surface?

Identification of tonality-tracking brain areas. In order to identify cortical sites that were consistently sensitive to activation changes on the tonality surface, eight musically experienced listeners (see subjects in supporting online text) underwent three scanning sessions each, separated by 1 week on average, in which they performed two perceptual tasks during separate runs. During each run, they heard a melody that systematically modulated through all 12 major and 12 minor keys (see stimuli and tasks in supporting online text) (Fig. 1 and audio S1). A timbre deviance detection task required listeners to respond whenever they heard a note played by a flute instead of the standard clarinet timbre, whereas a tonality violation detection task required listeners to respond whenever they perceived notes that violated the local tonality (Fig. 1D). The use of two tasks that required attentive listening to the same melody but different perceptual analyses facilitated our primary goal of identifying cortical areas that exhibit tonality tracking that is largely independent of the specific task that is being performed (see scanning procedures in supporting online text).

Fig. 1. Properties of the tonality surface and behavioral response profiles. In the key names, capital letters indicate major keys and lowercase letters indicate minor keys. (A) Unfolded tori showing the average tonality surfaces for each of the 24 keys in the original melody. The top and bottom edges of each rectangle wrap around to each other, as do the left and right edges. The two angular coordinates refer to the position along each of the circles comprising the torus. The color scale is arbitrary, with red and blue indicating strongest and weakest activation, respectively. Starting with C major and shifting from left to right, the activation peak in each panel reflects the melody's progression through all of the keys. (B) The circle of fifths. Major keys are represented by the outside ring of letters. Neighboring keys have all but one of their notes in common. The inner ring depicts the (relative) minor keys that share the same key signature (number of sharps and flats) with the adjacent major key. The color code refers to the three groups of keys into which tonality-tracking voxels were categorized (Fig. 3). (C) Correlations among the average tonality surface topographies for each key. The topographies of keys that are closely related in a music-theoretic sense are also highly positively correlated, whereas those that are distantly related are negatively correlated. Three groups of related keys, indicated in (B), were identified by singular value decomposition of this correlation matrix. (D) Average response profiles (eight listeners, three sessions each) from the tonality deviance detection task illustrate the propensity of specific test tones to pop out and elicit a response in some keys but not in others over the course of the melody. Error bars reflect ±1 SEM.

Using a regression analysis with separate sets of regressors to distinguish task effects from tonality surface tracking, we identified task- and tonality-sensitive areas (see fMRI analysis procedures in supporting online text). Tonality regressors were constructed from the output of a neural network model of the moment-to-moment activation changes on the tonality surface (see tonality surface estimation in supporting online text). Our tasks consistently activated several regions in the temporal, parietal, frontal, and limbic lobes as well as the thalamus and cerebellum. The most extensive consistent activation was along the superior temporal gyrus (STG) of both hemispheres, though the extent was greater in the right hemisphere, stretching from the planum temporale to the rostral STG and middle temporal gyrus (Fig. 2A and Table 1). Both the task and the tonality regressors correlated significantly and consistently with activity in the rostromedial prefrontal cortex, primarily in the rostral and ventral reaches of the superior frontal gyrus (SFG) (Figs. 2 and 3). The consistent modulation of this area in all of our listeners led us to focus on this region as a possible site of a tonality map.
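
The regression logic just described can be sketched as follows. This is a simplified stand-in for the authors' analysis (their actual fMRI procedures are specified in the supporting online text): the data are simulated, the regressor names are invented, and hemodynamic convolution and confound modeling are omitted.

```python
# Schematic of a voxel-wise regression with separate task and tonality regressors,
# not the authors' actual pipeline. Shapes, regressor names, and the absence of
# hemodynamic convolution here are simplifying assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_scans, n_voxels, n_tonality_regressors = 240, 500, 3

# Task regressors: e.g., block indicators for the timbre task and the tonality task.
task_regressors = rng.integers(0, 2, size=(n_scans, 2)).astype(float)

# Tonality regressors: model-derived time courses of activation on the tonality
# surface (in the study these come from a neural network model of the melody).
tonality_regressors = rng.standard_normal((n_scans, n_tonality_regressors))

# Simulated voxel time series standing in for the measured BOLD signal.
bold = rng.standard_normal((n_scans, n_voxels))

# Design matrix: intercept + task regressors + tonality regressors.
X = np.column_stack([np.ones(n_scans), task_regressors, tonality_regressors])

# Ordinary least squares fit for every voxel at once.
betas, *_ = np.linalg.lstsq(X, bold, rcond=None)

# Columns 1-2 of `betas` estimate task effects; columns 3 onward estimate how
# strongly each voxel tracks the tonality-surface time courses.
task_betas = betas[1:3]
tonality_betas = betas[3:]
print(task_betas.shape, tonality_betas.shape)
```

Separating the two regressor sets in one design matrix is what lets task-related activation and tonality tracking be assessed independently for the same voxels.
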

Fig. 2. Group conjunction maps showing the consistency with which specific structures were activated across listeners. Conjunction maps of individual listeners, containing the voxels that were activated significantly (P < 0.001) in all scanning sessions for that listener, were normalized into a common space and summed together across listeners (see spatial normalization in supporting online text). Voxels that were consistently activated by at least four of the eight listeners are projected onto the group's mean normalized T1 image. (A) Areas sensitive to the two task regressors (Table 1). (B) The only areas whose activity patterns were significantly and consistently correlated with the tonality regressors both within and across listeners were the rostral portion of the ventromedial superior frontal gyrus and the right orbitofrontal gyrus.
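
In outline, the conjunction procedure in the caption amounts to intersecting per-session significance masks within each listener and then counting listeners per voxel. The sketch below uses placeholder random data and assumes spatial normalization has already been applied; it is not the authors' implementation.

```python
# Sketch of the conjunction logic described in the caption, with made-up data:
# a voxel enters a listener's conjunction map only if it passes threshold in all
# of that listener's sessions, and the group map counts how many listeners share
# each voxel (here thresholded at 4 of 8, as in the figure).
import numpy as np

rng = np.random.default_rng(1)
n_listeners, n_sessions, n_voxels = 8, 3, 10_000
alpha = 0.001

# Per-session p-values for every voxel and listener (placeholder random data).
p_values = rng.uniform(size=(n_listeners, n_sessions, n_voxels))

# Individual conjunction maps: significant (p < alpha) in every session.
individual_maps = (p_values < alpha).all(axis=1)       # (n_listeners, n_voxels)

# Group conjunction map: number of listeners whose individual map includes the voxel.
group_map = individual_maps.sum(axis=0)                # (n_voxels,)
consistent_voxels = group_map >= 4

print(f"{consistent_voxels.sum()} voxels consistent in at least 4 of {n_listeners} listeners")
```
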

Tonality-specific responses in the rostromedial prefrontal cortex. At the individual level, we reconstructed and categorized the tonality sensitivity surface (TSS) for each voxel that exhibited significant responses (P < 0.001) in every one of the three scanning sessions (see tonality surface estimation in supporting online text).

The reconstructed surfaces from each session indicated that the medial prefrontal cortex maintains a distributed topographic representation of the overall tonality surface (Fig. 3). Although some voxels exhibited similar TSSs from session to session, the global tonality topography varied across sessions in each of the listeners. The number of voxels falling into each of the tonality categories (Fig. 1B) was evenly distributed within each session (table S1), but the relative pattern of tonality sensitivity changed. For all listeners, we also found tonality-sensitive voxels outside of the medial prefrontal region (table S2). The precise constellations of sensitive areas differed across listeners. We found tonality-sensitive foci in the orbital and frontal gyri, primarily in the right hemisphere; the temporal pole; the anterior and posterior superior temporal sulci; the precuneus and superior parietal gyrus; the posterior lingual gyrus; and the cerebellum (19).
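
One way to picture the categorization of voxels into key groups (Figs. 1B and 3) is sketched below: key groups are derived from a singular value decomposition of the inter-key correlation matrix, a template surface is formed for each group, and each voxel's TSS is labeled by the template with which it correlates most strongly. The grouping rule, the templates, and all data here are illustrative assumptions rather than the authors' exact procedure.

```python
# Hedged sketch of the categorization described above. All arrays are random
# stand-ins for the average key surfaces (Fig. 1A) and the voxel TSSs (Fig. 3).
import numpy as np

rng = np.random.default_rng(2)
n_keys, n_surface_points, n_voxels, n_groups = 24, 16 * 16, 200, 3

key_surfaces = rng.standard_normal((n_keys, n_surface_points))
voxel_tss = rng.standard_normal((n_voxels, n_surface_points))

# Group the keys using the leading singular vectors of their correlation matrix:
# each key is assigned to the component on which it loads most strongly.
key_corr = np.corrcoef(key_surfaces)
_, _, vt = np.linalg.svd(key_corr)
key_group = np.argmax(np.abs(vt[:n_groups]), axis=0)    # one of 3 groups per key

# Template surface per group: the mean surface of the keys in that group.
templates = np.vstack([key_surfaces[key_group == g].mean(axis=0) for g in range(n_groups)])

# Label every voxel by the group template its TSS is maximally correlated with.
corr_voxel_group = np.corrcoef(voxel_tss, templates)[:n_voxels, n_voxels:]
voxel_label = corr_voxel_group.argmax(axis=1)

print(np.bincount(voxel_label, minlength=n_groups))     # voxels per key group
```
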


Fig. 3. Topography of tonality sensitivity of rostroventral prefrontal cortex in three listeners across three scanning sessions each. Each voxel's color represents the key group with which the voxel's TSS was maximally correlated (Fig. 1B). The minority of voxels that were maximally correlated with the average tonality surface are shown in white. A TSS represents how sensitive the voxel is to each point on the torus. The TSSs of selected voxels are displayed as unfolded tori. Figure 1A serves as a legend for assigning keys to the individual TSSs. The highlighted voxels were chosen to display both the consistency and heterogeneity of the tonality surfaces across sessions. For each listener, the activity of all voxels shown was significantly correlated with the tonality regressors in all sessions. Thus, what changed between sessions was not the tonality-tracking behavior of these brain areas but rather the region of tonal space (keys) to which they were sensitive. This type of relative representation provides a mechanism by which pieces of music can be transposed from key to key, yet maintain their internal pitch relationships and tonal coherence.

Discussion. Central to our ability to hear music coherently are cognitive structures that maintain perceptual distance relationships among individual pitches and groups of pitches. These structures shape expectations about pitches we will hear, given a preceding musical input. Given the diversity of the music we hear, the situations in which we hear it, and our affective and motoric responses to it, it is likely that tonal contexts are maintained in cortical regions predisposed to mediating interactions between sensory, cognitive, and affective information. The medial prefrontal cortex is a nexus for such functions (20, 21) and is therefore an ideal region for maintaining a tonality map. In the macaque, connections to the medial prefrontal cortex from unimodal sensory cortices are widespread for the auditory modality and sparse for the other sensory modalities (22). In our experiments, we observed significant task-related activity in auditory association areas and the anterior STG, primarily in the right hemisphere. Reciprocal projections between these areas and the ventral medial prefrontal cortex help explain how and why a tonality map might be maintained in the medial prefrontal cortex. This region has already been implicated in assessing the degree of musical consonance or dissonance caused by a harmonic accompaniment to a melody (23). Our results suggest that the rostromedial prefrontal cortex not only responds to the general degree of consonance but actively maintains a distributed topographic representation of the tonality surface. The perception of consonance and dissonance depends on intact auditory cortices (24, 25). However, even with bilateral auditory cortex ablations, the ability to generate expectancies based on tonal contexts remains, suggesting that the cognitive structures maintaining tonal knowledge largely reside outside of temporal lobe auditory structures (24).

Dynamic topographies. In contrast to distributed cortical representations of classes of complex visual objects that appear to be topographically invariant (26), we found that the mapping of specific keys to specific neural populations in the rostromedial prefrontal cortex is relative rather than absolute. Within a reliably recruited network, the populations of neurons that represent different regions of the tonality surface are dynamically allocated from one occasion to the next. This type of dynamic topography may be explained by the properties of tonality structures. In contrast to categories of common visual objects that differ in their spatial features, musical keys are abstract constructs that share core properties. The internal relationships among the pitches defining a key are the same in each key, thereby facilitating the transposition of musical themes from one key to another. However, the keys themselves are distributed on a torus at unique distances from one another. A dynamic topography may also arise from the interplay of short-term and long-term memory stores of tonal information and may serve a beneficial role in coupling the moment-to-moment perception of tonal space with cognitive, affective, and motoric associations, which themselves may impose constraints on the activity patterns within rostral prefrontal regions (21, 27–29).

Table 1. Loci consistently showing a main effect of task in a majority of listeners. MTG, middle temporal gyrus; IFG, inferior frontal gyrus; SPG, superior parietal gyrus. For each locus the table gives, separately for the left and right hemispheres, the peak location (x, y, z, in mm), the number of listeners showing the effect at the peak, and the cluster size in voxels (numeric entries omitted here). Loci, by lobe: Temporal: STG (22); STG/Heschl's gyrus (41/42); STG/planum temporale (22); rostromedial STG; rostroventral MTG (21); middle MTG/superior temporal sulcus (21); ventral MTG (21). Frontal: rostroventromedial SFG (10/14); superior frontal sulcus/frontopolar gyrus (10); lateral orbital gyrus (11); IFG, pars orbitalis (47); IFG, pars opercularis (44); precentral gyrus (6). Parietal: postcentral gyrus (1); supramarginal gyrus (40); precuneus (7); SPG (7); SPG/transverse parietal sulcus (7). Limbic: collateral sulcus; hippocampus/collateral sulcus. Other: cerebellum; mediodorsal thalamic nucleus.

References and Notes
1. R. N. Shepard, Psychol. Rev. 89, 305 (1982).
2. C. L. Krumhansl, Cognitive Foundations of Musical Pitch (Oxford Univ. Press, New York, 1990).
3. F. Lerdahl, Tonal Pitch Space (Oxford Univ. Press, New York, 2001).
4. C. L. Krumhansl, E. J. Kessler, Psychol. Rev. 89, 334 (1982).
5. L. B. Meyer, Emotion and Meaning in Music (Univ. of Chicago Press, Chicago, 1956).
6. The chromatic scale consists of 12 equally sized intervals into which an octave is divided. On a piano, a chromatic scale starting at middle C would be played by striking adjacent keys until the note C, either one octave above or below middle C, was reached.
7. The extent to which tonality representations are maintained in long-term or short-term memory stores, or a combination of the two, is a matter of debate. Self-organizing neural network models of implicit learning accurately mimic results from a wide array of experiments that assess tonal knowledge (15), and harmonic priming experiments directly highlight the influence of learned tonal structures (13, 30). However, models of short-term sensory memory account for significant proportions of the variance in probe-tone experiments (31, 32), and probe-tone ratings depend, partially, on the pitch distribution statistics of the contexts that precede probes (33).
8. P. Janata, J. Cognit. Neurosci. 7, 153 (1995).
9. M. Besson, F. Faïta, J. Exp. Psychol. Hum. Percept. Perf. 21, 1278 (1995).
10. A. D. Patel, E. Gibson, J. Ratner, M. Besson, P. J. Holcomb, J. Cognit. Neurosci. 10, 717 (1998).
11. S. Koelsch, T. Gunter, A. D. Friederici, E. Schröger, J. Cognit. Neurosci. 12, 520 (2000).
12. J. J. Bharucha, K. Stoeckig, J. Exp. Psychol. Hum. Percept. Perf. 12, 403 (1986).
13. H. G. Tekman, J. J. Bharucha, J. Exp. Psychol. Hum. Percept. Perf. 24, 252 (1998).
14. R. Francès, La Perception de la Musique (Vrin, Paris, 1958).
15. B. Tillmann, J. J. Bharucha, E. Bigand, Psychol. Rev. 107, 885 (2000).
16. R. J. Zatorre, A. C. Evans, E. Meyer, A. Gjedde, Science 256, 846 (1992).
17. R. J. Zatorre, A. C. Evans, E. Meyer, J. Neurosci. 14, 1908 (1994).
18. B. Maess, S. Koelsch, T. C. Gunter, A. D. Friederici, Nature Neurosci. 4, 540 (2001).
19. The existence of a tonal map that is distributed within and across cortical areas rather than focused within a small cortical area may seem paradoxical, yet this representational form is predicted by some models of functional brain organization (27).
20. H. Barbas, Brain Res. Bull. 52, 319 (2000).
21. D. Tranel, A. Bechara, A. R. Damasio, in The New Cognitive Neurosciences, M. S. Gazzaniga, Ed. (MIT Press, Cambridge, MA, 2000), pp. 1047–1061.
22. H. Barbas, H. Ghashghaei, S. M. Dombrowski, N. L. Rempel-Clower, J. Comp. Neurol. 410, 343 (1999).
23. A. J. Blood, R. J. Zatorre, P. Bermudez, A. C. Evans, Nature Neurosci. 2, 382 (1999).
24. M. J. Tramo, J. J. Bharucha, F. E. Musiek, J. Cognit. Neurosci. 2, 195 (1990).
25. I. Peretz, A. J. Blood, V. Penhune, R. Zatorre, Brain 124, 928 (2001).
26. J. V. Haxby et al., Science 293, 2425 (2001).
27. A. R. Damasio, Cognition 33, 25 (1989).
28. J. J. Eggermont, Neurosci. Biobehav. Rev. 22, 355 (1998).
29. S. Funahashi, Neurosci. Res. 39, 147 (2001).
30. E. Bigand, B. Poulain, B. Tillmann, D. D'Adamo, J. Exp. Psychol. Hum. Percept. Perf., in press.
31. D. Huron, R. Parncutt, Psychomusicology 12, 154 (1993).
32. M. Leman, Music Percept. 17, 481 (2000).
33. N. Oram, L. L. Cuddy, Psychol. Res. 57, 103 (1995).
34. We thank T. Laroche for assistance with data collection. Supported by NIH grant P50 NS17778-18. The data and stimuli from the experiment are available on request from the fMRI Data Center at Dartmouth College (www.fmridc.org) under accession number 2-2002-1139B.

Supporting Online Material
www.sciencemag.org/cgi/content/full/298/5601/2167/DC1
SOM Text
Figs. S1 to S3
Tables S1 and S2
References
Audio S1

17 July 2002; accepted 27 September 2002



