This article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal non-commercial research and education use, including for instruction at the author's institution and sharing with colleagues. Other uses, including reproduction and distribution, or selling or licensing copies, or posting to personal, institutional or third party websites are prohibited. In most cases authors are permitted to post their version of the article (e.g. in Word or TeX form) to their personal website or institutional repository. Authors requiring further information regarding Elsevier's archiving and manuscript policies are encouraged to visit: http://www.elsevier.com/copyright



Author's personal copy

Research Report

Visual–auditory spatial processing in auditory cortical neurons

Jennifer K. Bizley⁎, Andrew J. King
Department of Physiology, Anatomy and Genetics, University of Oxford, Parks Road, Oxford, OX1 3PT, UK

ARTICLE INFO

Article history: Accepted 21 February 2008; Available online 10 March 2008

ABSTRACT

Neurons responsive to visual stimulation have now been described in the auditory cortex of various species, but their functions are largely unknown. Here we investigate the auditory and visual spatial sensitivity of neurons recorded in 5 different primary and non-primary auditory cortical areas of the ferret. We quantified the spatial tuning of neurons by measuring the responses to stimuli presented across a range of azimuthal positions and calculating the mutual information (MI) between the neural responses and the location of the stimuli that elicited them. MI estimates of spatial tuning were calculated for unisensory visual, unisensory auditory and for spatially and temporally coincident auditory–visual stimulation. The majority of visually responsive units conveyed significant information about light-source location, whereas, over a corresponding region of space, acoustically responsive units generally transmitted less information about sound-source location. Spatial sensitivity for visual, auditory and bisensory stimulation was highest in the anterior dorsal field, the auditory area previously shown to be innervated by a region of extrastriate visual cortex thought to be concerned primarily with spatial processing, whereas the posterior pseudosylvian field and posterior suprasylvian field, whose principal visual input arises from cortical areas that appear to be part of the 'what' processing stream, conveyed less information about stimulus location. In some neurons, pairing visual and auditory stimuli led to an increase in the spatial information available relative to the most effective unisensory stimulus, whereas, in a smaller subpopulation, combined stimulation decreased the spatial MI. These data suggest that visual inputs to auditory cortex can enhance spatial processing in the presence of multisensory cues and could therefore potentially underlie visual influences on auditory localization.

© 2008 Elsevier B.V. All rights reserved.

Keywords: Multisensory; Ferret; Single-unit recording; Virtual acoustic space; Mutual information; Sound localization

1. Introduction

A number of studies have now described the presence of neurons in auditory cortex whose responses are modulated by a visual stimulus, but the functions of these inputs remain elusive. Sensitivity to visual stimulation has been demonstrated in the auditory cortex of human (Calvert et al., 1999; Foxe et al., 2000, 2002; Giard and Peronnet, 1999; Molholm et al., 2004, 2002; Murray et al., 2005) and non-human (Brosch et al., 2005; Ghazanfar et al., 2005; Kayser et al., 2007; Schroeder and Foxe, 2002; Schroeder et al., 2001) primates, ferrets (Bizley et al., 2007) and rats (Wallace et al., 2004). Visual activity is most commonly observed in non-primary auditory cortex, but neurons whose responses are modulated by light are also found in the primary fields (Bizley et al., 2007). The effects of visual stimulation can be either to facilitate or suppress activity to a concurrent auditory stimulus (Bizley et al., 2007; Ghazanfar et al., 2005).

Brain Research 1242 (2008) 24–36

⁎ Corresponding author. Department of Physiology, Anatomy and Genetics, Sherrington Building, University of Oxford, Parks Road, Oxford, OX1 3PT, UK. E-mail address: [email protected] (J.K. Bizley).

Abbreviations: A1, primary auditory cortex; AAF, anterior auditory field; ADF, anterior dorsal field; AVF, anterior ventral field; MI, mutual information; PPF, posterior pseudosylvian field; PSF, posterior suprasylvian field

0006-8993/$ – see front matter © 2008 Elsevier B.V. All rights reserved. doi:10.1016/j.brainres.2008.02.087

Available at www.sciencedirect.com
www.elsevier.com/locate/brainres

The origin and function of visual activity within auditory cortex are widely debated. It has been suggested that multisensory inputs might serve to prime auditory cortex, acting to enhance the response to a subsequent auditory stimulus (Schroeder and Foxe, 2005). Somatosensory inputs have been shown to reset the phase of ongoing neural oscillations in primary auditory cortex (A1) (Lakatos et al., 2007), supporting the idea that non-auditory inputs might function to reset ongoing auditory cortex activity (Schroeder and Foxe, 2005). A second hypothesis is that the superior spatial resolution of the somatosensory and visual systems might be used to increase spatial sensitivity in the auditory system, perhaps contributing to the enhanced localization abilities that are observed with multisensory versus unisensory stimuli (Welch and Warren, 1986; Stein et al., 1988).

Understanding the origin of multisensory inputs to auditory cortex should help to elucidate their function. The majority of auditory–visual interactions in monkey auditory cortex have been shown to be face specific, which was interpreted as evidence for integration being a result of feedback from the superior temporal sulcus (Ghazanfar et al., 2005). However, visual inputs to auditory cortex arise from a number of sources; in addition to inputs from multisensory association areas, there are direct connections from both primary and non-primary visual cortex, as well as from multisensory thalamic nuclei (Bizley et al., 2007; Budinger et al., 2006, 2007; Falchier et al., 2002; Hackett et al., 2007; Rockland and Ojima, 2003).

The pattern of anatomical connectivity recently identified from visual cortex to auditory cortex in the ferret raises intriguing predictions as to the functional properties of visual–auditory interaction in different fields of the auditory cortex in this species. Bizley et al. (2007) observed sparse direct connections from V1 to A1 and found that non-primary auditory cortical areas are more heavily innervated by extrastriate visual areas. Those on the anterior bank of the ectosylvian gyrus are innervated predominantly by area SSY, thought to be the ferret homologue of the motion processing area MT (Philipp et al., 2005), whereas those on the posterior bank of the gyrus receive larger inputs from area 20, thought to be part of the "what" processing stream (Manger et al., 2004). This suggests that the visual responses recorded in different auditory cortical fields should vary in the amount of spatial information they convey according to the source of these projections.

In order to examine further the role of visual activity in auditory cortex, we mapped spatial receptive fields in different auditory cortical fields of the ferret using acoustic, visual, or combined stimulation. We analyzed these data by estimating the mutual information (MI) between the stimuli and the responses, which enabled us to take into account information in spike timing in addition to overall spike count. Our results show that visual inputs to auditory cortex do indeed increase the information about stimulus location in the responses of the neurons and that the extent to which they do so varies in a predictable fashion according to the source of the inputs from extrastriate visual cortex.

2. Results

The extracellular responses of 305 single units were recorded from 5 auditory fields in 5 ferrets. Fig. 1 illustrates the location of all the auditory fields that have so far been identified in this species. Units were first classified on the basis of their responses to global light flashes presented from an LED positioned close to the contralateral eye and to broadband noise bursts presented to the contralateral ear. Thirty-six percent of the units (n=111/305) were classified as bisensory (defined as showing a significant alteration in spike firing to both auditory and visual stimuli presented in isolation), 24% (n=74/305) responded only to visual stimulation in isolation and the remainder were classified as unisensory auditory (39%, 120/305). Table 1 shows the distribution of each response type by cortical field. Pure tone stimuli were used for mapping frequency-response areas in order to help determine the cortical area in which the recordings were made, but all auditory spatial tuning estimates were carried out with broadband noise bursts. Fig. 1A illustrates the location of each of the auditory cortical fields in the ferret and Fig. 1B summarises the main potential cortical sources and types of visual information to each area.
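The classification scheme above rests on whether a unit shows a significant alteration in spike firing to each modality presented in isolation. The paper does not state the statistical test in this section, so the following sketch uses a Mann–Whitney U comparison of baseline versus stimulus-window spike counts as an illustrative stand-in; the function names and the choice of test are assumptions, not the authors' method.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def is_responsive(baseline_counts, driven_counts, alpha=0.05):
    """Illustrative criterion: a unit 'responds' to a modality if its
    spike counts in the stimulus window differ significantly from
    baseline (the paper's actual test is defined in its Experimental
    procedures, not reproduced here)."""
    p = mannwhitneyu(baseline_counts, driven_counts,
                     alternative="two-sided").pvalue
    return p < alpha

def classify_unit(baseline, aud_counts, vis_counts):
    """Map per-modality responsiveness onto the paper's categories:
    bisensory, unisensory visual, unisensory auditory, unresponsive."""
    aud = is_responsive(baseline, aud_counts)
    vis = is_responsive(baseline, vis_counts)
    if aud and vis:
        return "bisensory"
    if vis:
        return "unisensory visual"
    if aud:
        return "unisensory auditory"
    return "unresponsive"
```

With this criterion, a unit driven well above baseline by noise bursts but unaffected by light flashes would fall in the unisensory auditory group, matching the definitions used for Table 1.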

Fig. 1 – (A) Schematic showing the organization of auditory cortex in the ferret. The primary fields, A1 and AAF, are tonotopically organized (arrows indicate a high-to-low characteristic frequency gradient). Two posterior fields, PPF and PSF, are also tonotopically organized. A1, primary auditory cortex; AAF, anterior auditory field; PPF, posterior pseudosylvian field; PSF, posterior suprasylvian field; VP, ventral posterior; ADF, anterior dorsal field; AVF, anterior ventral field; fAES, anterior ectosylvian sulcal field; pss, pseudosylvian sulcus; sss, suprasylvian sulcus; c, caudal; v, ventral. (B) Summary of visual inputs from visual and multisensory areas into each of the 5 auditory fields recorded from in this study.


We have previously shown in a small subpopulation of neurons that at least some auditory cortical neurons exhibit spatially modulated visual sensitivity (Bizley et al., 2007). Here we systematically explored the spatial sensitivity of the visual responses in a much larger population of units while also examining their auditory spatial tuning and the effects of combined visual–auditory stimulation. This was achieved by presenting sounds in virtual acoustic space (VAS) and visual stimuli from an array of 8 LEDs located 100 cm from the animal at locations ranging from 5° (ipsilateral), in 15° intervals, to −100° (contralateral) to the cortex from which the recordings were made.

Spatial sensitivity of each unit to the different stimuli was determined by measuring the MI, quantified in bits, between the stimulus location and the neuronal response. At least 30 repetitions of each combination of spatial location and stimulus modality (auditory, visual or bisensory) were presented in a pseudorandomly interleaved order. MI estimates were then obtained for each stimulus modality separately, so that the number of bits represents a measure of the spatial modulation of the neuronal response for a given stimulus modality. This enabled the relative amounts of spatial information provided by each to be compared. The MI values obtained were assessed for significance by using a bootstrapping procedure, as described in the Experimental procedures.
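The MI calculation and its significance test can be illustrated with a minimal sketch. The paper's estimator also captured information in spike timing and applied bias corrections described in its Experimental procedures; the version below is a count-based plug-in estimator with a shuffle-based significance check, and the function names, response discretization and percentile threshold are assumptions for illustration only.

```python
import numpy as np

def plugin_mi(stim_locs, responses):
    """Plug-in estimate of MI (bits) between discrete stimulus locations
    and discretized responses (e.g. binned spike counts)."""
    mi, n = 0.0, len(stim_locs)
    for s in np.unique(stim_locs):
        p_s = np.count_nonzero(stim_locs == s) / n
        for r in np.unique(responses):
            p_r = np.count_nonzero(responses == r) / n
            p_sr = np.count_nonzero((stim_locs == s) & (responses == r)) / n
            if p_sr > 0:
                mi += p_sr * np.log2(p_sr / (p_s * p_r))
    return mi

def mi_significance(stim_locs, responses, n_boot=1000, pct=95, seed=0):
    """Shuffle-based significance check: compare the observed MI with the
    distribution obtained after permuting responses across trials, which
    destroys any stimulus-response association."""
    rng = np.random.default_rng(seed)
    observed = plugin_mi(stim_locs, responses)
    null = np.array([plugin_mi(stim_locs, rng.permutation(responses))
                     for _ in range(n_boot)])
    return observed, observed > np.percentile(null, pct)
```

For a unit whose response perfectly discriminates 4 equiprobable locations, this estimator returns 2 bits; for a response independent of location it returns approximately 0, and the shuffle test flags only the former as significant.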

2.1. Spatial sensitivity to visual stimuli

Seventy-one percent (132/185) of units whose responses were modulated by a unisensory visual stimulus contained significant spatial information. The latencies of these responses differed somewhat between different cortical fields and were consistent with those reported for a larger sample of units in Bizley et al. (2007). Example raster plots for 3 units whose responses were modulated by LED location are shown in Fig. 2. The proportion of units whose responses were significantly modulated by the unisensory visual stimulus and transmitted significant information about LED location varied between cortical fields; neurons in the anterior auditory field (AAF) and the anterior dorsal field (ADF) were most likely to be spatially sensitive, with 72% (n=32) and 75% (n=66) transmitting significant amounts of information about LED location, respectively. In A1, the posterior pseudosylvian field (PPF) and the posterior suprasylvian field (PSF), significant spatial sensitivity was observed in 63% (n=11), 64% (n=35) and 62% (n=41) of visually responsive units, respectively. Fig. 3A plots the distribution of MI values obtained for visually sensitive neurons in each cortical field. In addition to having the largest proportion of spatially sensitive visual units, ADF neurons transmitted significantly more information than those in the other cortical fields (Kruskal–Wallis test, p=0.0001).

Fig. 3D shows the distribution across auditory cortex of MI values between visual stimulus location and the responses of these units. Data from the 5 animals were collapsed onto a single auditory cortex, with the location of each unit represented by a single polygon on the cortical surface (see Fig. 1A for the typical location of each cortical field). Where multiple recordings were made at a single point (i.e. when units were recorded on multiple recording sites at different depths on the same electrode), recordings are arranged in a circle, with the deepest unit shown at the right hand side and progressively more superficial recordings plotted in a clockwise direction within the circle. The size of the polygon reflects the density of sampling in that area, while its color indicates the amount of spatial information conveyed in bits (with visually unresponsive units represented by open polygons). This figure clearly demonstrates that most information about the LED direction was conveyed by units located on the anterior bank of the ectosylvian gyrus, i.e. in ADF and AAF.

Fig. 2 – (A–C) Raster plots showing the responses of three units recorded in fields PPF, AAF and ADF, respectively, to light flashes from an LED (100 ms duration) presented at a range of azimuthal locations (at a fixed elevation of +5°). All three units transmitted significant information about LED location in their response (A, 1.54 bits; B, 2.07 bits; C, 1.59 bits).

Table 1 – Different response categories (unisensory visual, unisensory auditory, and bisensory, defined on the basis of a significant alteration in spike firing to visual and/or auditory stimuli) for the 305 units recorded in the 5 different auditory cortical fields

Cortical field   Auditory (%)   Visual (%)   Bisensory (%)   Number of units
A1               67.6           0.0          32.4            34
AAF              52.9           26.5         20.6            68
PPF              52.7           33.8         13.5            74
PSF              32.8           4.9          62.3            61
ADF              7.0            39.4         53.5            71

2.2. Spatial sensitivity to auditory stimuli

In order to compare the auditory and visual responses directly, sensitivity to VAS stimuli was tested over the same range of locations, i.e. from 5° to −100° azimuth, provided by the LED array. Over this relatively limited region of space, only 23% of the units transmitted significant information about sound-source location and the MI values were generally lower than those obtained with visual stimuli (compare Figs. 3A and B). The distribution across auditory cortex of MI values between auditory stimulus location and response is shown in Figs. 3B and E. As with the visual responses, significantly higher values were found for units recorded in ADF (Fig. 3B; yellow polygons in Fig. 3E), suggesting that spatial sensitivity is greater in this field than in other cortical areas.

Fig. 3 – (A–C) Boxplots displaying the amount of information (in bits) transmitted by units in each of the five cortical fields about LED location (A), sound-source location (B) or the location of a combined auditory–visual stimulus (C). Only units for which there was a significant unisensory visual or auditory response are plotted in A and B, respectively, whereas C shows the multisensory MI values for all units recorded, irrespective of their response to unisensory stimulation. The total number of units in each group is shown at the top of each plot. The boxplots show the median (horizontal bar), inter-quartile range (boxes), spread of data (tails) and outliers (cross symbols). The notch indicates the distribution of data about the median. There were significant differences in the distribution of MI values in different cortical fields (Kruskal–Wallis test; LED location, p=0.0001; auditory location, p=0.0035; bisensory stimulus location, p<0.0001). Significant post-hoc pair-wise differences (Tukey–Kramer test, p<0.05) between individual cortical fields are shown by the lines above each boxplot. (D–F) Voronoi tessellations plotting the spatial information, in bits, on the surface of auditory cortex for every unit recorded. Data from 5 animals have been collapsed onto a single auditory cortex. Each polygon represents a single unit. Where multiple recordings were made at different active sites on the same electrode, recordings are arranged in a circle about the point from which they were recorded. The deepest unit is shown on the right hand side and progressively more superficial units are arranged moving clockwise from this point. Units that conveyed very small amounts of spatial information are plotted in red and the most informative neurons are plotted in yellow.

Given the restricted range of azimuths tested relative to the size of the spatial receptive fields reported in previous studies of ferret auditory cortex (Mrsic-Flogel et al., 2005) and the fact that a relatively high sound level was used, this paucity of auditory spatial sensitivity is perhaps not surprising. A subset (n=93) of units was also tested with a more extensive range of virtual sound directions, from 95° to −100° azimuth. Whereas only 15% (n=14) of this subset were found to transmit a significant amount of spatial information when the estimation was based on the restricted range of locations (5° to −100° azimuth), this increased to 52% (n=48) over the larger range. These units included all those whose responses were spatially informative across the smaller range of sound directions.

Fig. 4 shows the responses of three units to noise bursts presented at different virtual sound directions. The units shown in Figs. 4A and C exhibited a contralateral preference, whereas the unit in Fig. 4B fired most when sounds were presented from in front of the animal. A comparison of the distribution across cortical areas of units showing significant auditory spatial tuning over the 95° to −100° range of azimuths revealed that roughly one half of the units recorded in A1 (10/20), AAF (9/20) and PPF (9/17) were spatially tuned, with slightly more in PSF (12/16) and fewer in ADF (7/20). The distribution of MI values obtained in each cortical field is shown in Fig. 4D for all units for which azimuth response profiles were measured over the larger range of stimulus locations and in Fig. 4E for only those units that exhibited significant spatial tuning. There were significant differences between the values obtained in each field (Kruskal–Wallis, p=0.010), with units in A1, PSF and ADF being more informative than those in AAF and PPF.

2.3. Combined auditory–visual stimulation can modulate the spatial information conveyed by auditory cortical neurons

To examine whether the presence of a visual stimulus can alter the spatial information conveyed by auditory cortical neurons, visual and auditory stimuli were presented simultaneously from corresponding spatial locations. This was done for all units irrespective of their responses to unisensory stimulation. In order to quantify the types of interaction observed, we asked whether the amount of spatial information conveyed by combined auditory–visual stimulation was significantly different from the unisensory condition that was the more spatially sensitive. This was achieved using a bootstrapping procedure that first randomized the responses between the auditory and bisensory stimuli and recalculated the MI 1000 times, and then repeated this by randomizing trials between the visual and bisensory stimuli. If the MI for the bisensory condition was greater than both the 90th percentile values estimated by bootstrapping, then there was said to be a significant positive interaction, i.e. combined visual–auditory stimulation increased the amount of spatial information in the response. However, if unisensory stimulation resulted in a value that was higher than the 90th percentile of the bootstrapped values, then combined stimulation was said to interact negatively, indicating that there was a significant decrease in the amount of spatial information available when bisensory stimulation was used. Forty-seven percent of units did not show a significant interaction, whereas combined stimulation increased the amount of information about stimulus location in 35% of units and decreased it in the remaining 18%.

Fig. 5 shows the responses of 3 example units. The unit in Fig. 5A showed no crossmodal interaction with combined stimulation; it was sensitive only to visual stimuli and showed robust spatial tuning that was uninfluenced by a simultaneously presented acoustic stimulus. This is indicated by the similarity between the raster plots and the azimuth-spike-rate functions for visual and bisensory stimulation. Fig. 5B shows a unit that was classified as showing a positive interaction in response to bisensory stimulation. The auditory response did not contain significant information about the direction of the sound source (note the similarity in the raster plots for each azimuth angle). There was some indication that visual stimuli suppressed the spontaneous firing of this unit, although not in a spatially dependent fashion. However, when combined with sound, this suppressive effect of the visual stimulus altered the shape of the auditory response profile, increasing the extent to which the spike rate was modulated by stimulus direction. As a result, this unit now transmitted significant spatial information. Finally, Fig. 5C shows an example of a unit that was classified as displaying a negative crossmodal interaction. This unit was responsive to both unisensory visual and auditory stimuli and, in both cases, there was a weak, but significant, modulation of the response with spatial location. However, when the stimuli were presented in combination, a much more robust spiking response was elicited and the unit was strongly driven at previously ineffective locations. As a result, the degree to which the unit's response was modulated by stimulus location was reduced to such an extent that it no longer contained significant spatial information.

2.4. Visual stimulation enhances auditory spatial tuning

Fig. 6A shows the proportion of units in each cortical field that exhibited either positive (black) or negative (gray) visual–auditory interactions. In nearly all areas, the majority of units showed positive interactions, indicating that combined visual–auditory stimulation increased the amount of spatial information in the response compared to that elicited by the most effective unisensory stimulus. In order to assess the overall effects of the visual stimuli on auditory spatial sensitivity, we also determined the proportion of units in each cortical area whose responses transmitted more information about stimulus location when spatially aligned visual and auditory stimuli were presented together compared to the responses to sound alone (Fig. 6B). Around 10–20% of auditory units in all cortical areas showed enhanced spatial tuning in the presence of bisensory stimuli, with the exception of PSF, where this figure rose to ~50%.


The amount of spatial information available in the unit responses recorded in each of the five cortical fields to combined auditory–visual stimulation is shown in Fig. 3C, while Fig. 3F plots these values for each unit recorded on the surface of the cortex. This illustrates that neurons in auditory cortex tend to transmit most information about stimulus location when bisensory stimuli are present, as indicated by the greater proportion of polygons that are yellow in color. This is evident when compared to unisensory visual stimulation (Fig. 3D), but is most marked when compared to unisensory auditory stimulation (Fig. 3E). This increase in MI was seen in all auditory cortical fields, but, as with both auditory and visual stimuli presented in isolation, the highest multisensory MI values were associated with units in ADF, suggesting that this is the region of ferret auditory cortex that is most concerned with spatial processing.

Fig. 4 – (A–C) Raster plots showing the responses of three units to broadband noise bursts (100 ms duration) presented in virtual acoustic space at a range of azimuthal locations (at a fixed elevation of +5°). All three units transmitted significant information about sound-source location in their responses (0.42, 0.20, and 0.45 bits, respectively). Units A and B were recorded in A1, and unit C in ADF. (D and E) Distribution of MI values obtained from all units tested with this wider range of azimuths (D, n=93) and from only those units whose responses were significantly spatially modulated (E, n=48).


The amount of spatial information available in the responses of units, combined across all 5 cortical fields, classified as unisensory auditory, unisensory visual and bisensory is compared for unisensory and bisensory stimulation in Fig. 7. In the case of the unisensory auditory (Fig. 7A) and unisensory visual (Fig. 7B) units, the addition of the second stimulus modality could result in either an increase or decrease in transmitted information. Fig. 7C plots the change in MI in the bisensory stimulus relative to either the unisensory auditory (blue crosses) or visual (red circles) stimulus. The largest increases in MI were observed when the visual stimulus was added to spatially congruent sounds in the bisensory condition, suggesting that they make the greatest contribution to the enhanced spatial processing observed in the auditory cortex in the presence of multisensory cues.

3. Discussion

We have demonstrated that neurons within auditory cortex frequently transmit significant amounts of information about the location of a visual stimulus. Moreover, for some neurons, spatially and temporally congruent visual stimuli increased the amount of spatial information available from the neural response when compared to acoustic stimulation alone. These data suggest that visual inputs into auditory cortex can enhance the spatial processing capacity of auditory cortical neurons.

As in our previous study (Bizley et al., 2007), visual stimuli were found to drive or modulate the activity of some units in all auditory cortical areas from which recordings were made. However, the incidence of these visual responses and their influence on the spatial sensitivity of the neurons varied across auditory cortex. Neurons in ADF, on the anterior bank of the ectosylvian gyrus, were most likely to convey significant information about light-source location, and tended to have significantly higher MI values than the responses of units recorded in other cortical areas. ADF neurons also generally transmitted more information about sound-source location, suggesting that this region of ferret auditory cortex might be particularly involved in processing spatial information. Combined auditory–visual stimulation revealed evidence for multisensory interactions in a large proportion of ADF units, about half of which transmitted significantly more spatial information relative to the most informative unisensory stimulus. These findings are illustrated in Figs. 3D–F, which show that spatial sensitivity for auditory, visual and bisensory stimulation is highest at the ventral extreme of ADF, close to the boundary with the anterior ventral field (AVF) and the anterior ectosylvian sulcal field (fAES), both of which are known to be multisensory areas (Bizley et al., 2007; Manger et al., 2005; Ramsay and Meredith, 2004). The fields on the anterior bank of the ectosylvian gyrus are heavily innervated by area SSY (Bizley et al., 2007), located in the caudal bank of the suprasylvian sulcus, and thought to be the ferret homologue of primate visual motion processing area MT (Philipp et al., 2005). Although multisensory thalamic nuclei are also a potential source of visual input to the auditory cortex (Budinger et al., 2007; Hackett et al., 2007), the concordance between the functional characteristics of visual responses in ADF and the visual cortical areas that innervate the anterior bank supports the idea that these responses are cortical in origin.

Fig. 6 – (A) Bar graph showing the proportion of units in each cortical field where a significant crossmodal interaction was observed. The proportion of units in which combined auditory–visual stimulation resulted in a significant increase or decrease in spatial information, relative to the most spatially informative unisensory response, is shown by the black and gray bars, respectively. (B) Bar graph plotting the proportion of units within each cortical field in which the addition of a spatially and temporally congruent visual stimulus led to an increase in spatial information compared to that estimated from the response to sound alone.

Fig. 5 – Examples of responses to auditory, visual and combined auditory–visual stimulation. For each unit, the response to the three stimulus types presented at different azimuths is shown by the raster plots and by the spike rate functions (plotted over a 300 ms window and fitted with a cubic polynomial). (A) Unisensory visual unit that was spatially tuned and unaffected by simultaneously presented sound. Spatial MI values obtained for this unit were 0.45, 1.9 and 2.01 bits for acoustic, visual and bisensory stimulation, respectively. (B) Acoustically responsive unit that responded in a consistent fashion across the range of azimuths tested. Unisensory visual stimulation suppressed the spontaneous activity of this unit and, when combined with acoustic stimulation, resulted in a spatially tuned response. Spatial MI values obtained for this unit were 0.31, 0.29 and 0.38 bits for acoustic, visual and bisensory stimulation, respectively. (C) Bisensory unit showing broad spatial tuning to both auditory and visual unisensory stimulation. Pairing auditory and visual stimulation produced a more robust spiking response in this unit, which was modulated much less by spatial location. Spatial MI values obtained for this unit were 0.49, 0.51 and 0.45 bits for acoustic, visual and bisensory stimulation, respectively.

It is striking that a far higher proportion of units recorded in PSF exhibited crossmodal spatial interactions and conveyed more information about stimulus location when visual and auditory stimuli were combined than those recorded in other cortical fields (Fig. 6). While revealing a marked influence of visual stimuli on the spatial sensitivity of neurons in this part of the cortex, the MI values obtained for all stimulus conditions, including bisensory stimulation, were far lower than those estimated for ADF neurons and comparable to those in the other fields. However, it is clear from Fig. 3F that some PSF units, particularly near the caudal edge of the gyrus, are relatively informative about stimulus location. PSF, like PPF, is innervated by area 20 (Bizley et al., 2007), which is thought to be part of the visual "what" processing stream (Manger et al., 2004). Although the significance of clusters of spatially informative neurons in this part of auditory cortex is not immediately clear, PSF is, unlike PPF, reciprocally connected with fAES (J. K. Bizley, F. R. Nodal, V. M. Bajo and A. J. King, unpublished observation), which may explain the relatively high incidence of multisensory interactions that are found there.

Thus, while the present results support the general distinction between cortical areas involved in visual localization and identification, it seems likely that this distinction is not complete and that there is an overlap in function between them. Indeed, it has been shown that neurons in macaque V4 are, under certain circumstances, directionally selective (Tolias et al., 2005). The influence of visual inputs on the spatial sensitivity of many neurons in PSF may therefore reflect the particularly broad auditory spatial tuning of these neurons or the presence of limited visual spatial sensitivity, arising either within the ferret's equivalent of the ventral visual processing stream or from areas, including fAES, that are more concerned with spatial processing.

In this study, visual and auditory stimuli were presented from a range of spatially matched locations. Studies of multisensory integration in the superior colliculus have shown that spatially congruent stimuli tend to result in response enhancement, whereas spatially discordant stimuli give rise to suppressive interactions (e.g. King and Palmer, 1985; Meredith and Stein, 1983, 1986; Wallace et al., 1996). It is generally assumed that these changes in firing rate contribute to the relative accuracy with which animals orient toward unisensory and bisensory stimuli (Stein et al., 1988), an assumption supported by the behavioral deficits observed following collicular lesions (Burnett et al., 2004). By estimating the MI between the stimuli and the spike trains, we were able to show at the level of individual neurons that combining visual and auditory stimuli from the same direction in space does indeed, in many cases, increase the information transmitted about stimulus location.

Other studies have shown that bisensory stimulation can suppress the responses of cortical neurons (e.g. Bizley et al., 2007; Dehner et al., 2004; Ghazanfar et al., 2005). The direction of the change in evoked firing rate does not, however, necessarily reveal the functional significance of the multisensory interaction. For example, we observed cases (e.g. Fig. 5C) in which spatially congruent visual and auditory stimuli resulted in stronger but less spatially sensitive responses compared to those evoked by unisensory stimulation. In this case, multisensory facilitation of the spike firing rate was observed across a broad range of azimuths, degrading the information about stimulus location in the response of the neuron. In other cases (Fig. 5B), suppressive interactions between the visual and auditory inputs sharpened the spatial receptive field, which conveyed significantly more spatial information than the responses to unisensory stimulation. These findings stress the importance of adopting objective measures, such as estimating the MI between stimulus and response, which takes into account both the number and the timing of the evoked spikes, when attempting to assess the neuronal consequences of multisensory convergence and integration.

Fig. 7 – Scatter plots showing the amount of spatial information available from unisensory stimulation (blue crosses, auditory; red circles, visual) against that obtained with bisensory stimulation for units classified as unisensory auditory (A), unisensory visual (B) or bisensory (C).

Brain Research 1242 (2008) 24–36

The range of azimuthal locations tested in this study was restricted to locations that fell just within the ferret's visual field, which extends from ~25° ipsilateral to ~130° contralateral. Auditory spatial receptive fields of cortical neurons are often larger than this and can respond to sounds presented across the full 360° of azimuth. Nevertheless, the responses of these neurons typically vary within this large receptive field (Benedek et al., 1996; Eordegh et al., 2005; Middlebrooks and Pettigrew, 1981; Mrsic-Flogel et al., 2005; Stecker et al., 2005; Xu et al., 1998). This study extends this concept by showing that congruent visual stimuli can also alter spatial tuning within the auditory receptive field. However, it is important to note that we have not explored the effects of combining visual stimuli with sounds presented over the full azimuthal extent of each auditory receptive field. Moreover, sensitivity to sound-source elevation is also commonly observed in auditory cortical neurons (Mrsic-Flogel et al., 2005; Xu et al., 1998) and was not examined here. Further studies exploring a greater range of stimulus locations are therefore required for a full understanding of the spatial integration of auditory–visual cues in auditory cortex.

The ferret is increasingly being used as a model species for auditory processing. An important question is to what degree findings in this species can be related to those in others. The cat has been extensively used for studying auditory cortex, and it is now possible to draw homologies between certain auditory fields in these two species. A1 is well conserved across mammalian species, and AAF appears to be equivalent in cat and ferret (Kowalski et al., 1995; Imaizumi et al., 2004; Bizley et al., 2005). The posterior fields, PPF and PSF, in the ferret share certain similarities with the posterior auditory field (PAF) in the cat, both being tonotopically organized and containing neurons with relatively long response latencies and rich discharge patterns (Phillips and Orman, 1984; Stecker et al., 2003; Bizley et al., 2005). It remains to be seen, however, whether either of these areas exhibits responses that are particularly suited to spatial analysis, as is the case in cat PAF (Stecker et al., 2003; Malhotra and Lomber, 2007). Like the cat's secondary auditory field, A2 (Schreiner and Cynader, 1984), ferret ADF lacks tonotopic organization and contains neurons with broad frequency-response areas (Bizley et al., 2005), while the high incidence of multisensory convergence in ferret AVF suggests that this area is likely to be homologous to cat fAES (Benedek et al., 1996; Manger et al., 2005; Ramsay and Meredith, 2004; Wallace et al., 1992).

Although multisensory convergence is widespread in the cortex of a range of species, there is little clear-cut evidence for the functional role played by the interactions observed in specific cortical areas. It has, however, recently been shown that somatosensory inputs can modulate the phase of oscillatory activity in monkey auditory cortex, potentially amplifying the response to auditory signals produced by the same source (Lakatos et al., 2007). Activity in auditory cortex is also modulated by mouth movements, but not by moving circles, in both monkeys (Ghazanfar et al., 2005) and humans (Pekkola et al., 2005), suggesting an involvement in audiovisual communication. Similarly, within the spatial domain, previous studies have proposed that auditory cortex might play a key role in the ventriloquism illusion (Recanzone, 1998; Woods and Recanzone, 2004). By showing that auditory spatial receptive fields can be shaped by a simultaneously presented visual stimulus, our data reveal a possible neural substrate for vision-induced improvements in sound localization. By the same token, because more information is typically conveyed by the visual responses, it might be expected that conditioning with spatially offset sound and light would reshape auditory spatial receptive fields in ways that could underlie the visual capture of sound-source location.

4. Experimental procedures

4.1. Animal preparation

All animal procedures were approved by the local ethical review committee and performed under licence from the UK Home Office in accordance with the Animals (Scientific Procedures) Act 1986. The electrophysiological results presented in this study were obtained from six adult pigmented ferrets (Mustela putorius), which also contributed data to other studies in our laboratory. All animals received regular otoscopic examinations prior to the experiment to ensure that both ears were clean and disease free.

Anesthesia was induced by a single dose of a mixture of medetomidine (Domitor; 0.022 mg/kg/h; Pfizer, Sandwich, UK) and ketamine (Ketaset; 5 mg/kg/h; Fort Dodge Animal Health, Southampton, UK). The left radial vein was cannulated and a continuous infusion (5 ml/h) of a mixture of medetomidine and ketamine in physiological saline containing 5% glucose was provided throughout the experiment. The ferret also received a single, subcutaneous dose of 0.06 mg/kg/h atropine sulphate (C-Vet Veterinary Products, Leyland, UK) and 12-hourly subcutaneous doses of 0.5 mg/kg dexamethasone (Dexadreson; Intervet UK Ltd., Milton Keynes, UK) in order to reduce the risk of bronchial secretions and cerebral oedema, respectively. The ferret was intubated and placed on a ventilator. Body temperature, end-tidal CO2 and the electrocardiogram (ECG) were monitored throughout the experiment.

The animal was placed in a stereotaxic frame and the temporal muscles on both sides were retracted to expose the dorsal and lateral parts of the skull. A metal bar was cemented and screwed onto the right side of the skull, holding the head without further need of the stereotaxic frame. The right pupil was protected with a zero-refractive-power contact lens. On the left side, the temporal muscle was largely removed to gain access to the auditory cortex, which is bounded by the suprasylvian sulcus (Fig. 1) (Kelly et al., 1986). The suprasylvian and pseudosylvian sulci were exposed by a craniotomy. The overlying dura was removed and the cortex covered with silicon oil. The animal was then transferred to an anechoic chamber (IAC Ltd, Winchester, UK).


Eye position measurements were taken from 3 animals (5 eyes) approximately 7 h after anesthesia was induced, by back-reflecting the location of the optic disc using a reversible ophthalmoscope fitted with a corner-cube prism. The mean ± SD azimuth was 33.4 ± 3.5°. In one animal (one eye), the values obtained from repeated measurements taken over a 48-hour period were 33.6 ± 2.0° azimuth, indicating that eye position remained constant.

4.2. Stimuli

Acoustic stimuli were generated using TDT System 3 hardware (Tucker-Davis Technologies, Alachua, FL). Panasonic headphone drivers (RPHV297, Panasonic, Bracknell, UK) were used. Closed-field calibrations were performed using a 1/8-inch condenser microphone (Brüel and Kjær, Naerum, Denmark), placed at the end of a model ferret ear canal, to create an inverse filter that ensured that the driver produced a flat (<±5 dB) output up to 25 kHz.
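The inverse-filter idea can be illustrated as follows. This is a minimal sketch, not the actual calibration procedure: it assumes magnitude-only compensation of the measured driver response, with a regularizing floor so that frequency bins where the driver output is very weak do not receive unbounded gain.

```python
# Sketch (assumption, not the authors' calibration code): compensate a measured
# driver magnitude response so the combined driver + filter output is flat.
import numpy as np

def inverse_filter(measured_mag, eps=1e-3):
    """measured_mag: driver magnitude response at each FFT bin.
    Returns per-bin gains; eps limits the maximum boost applied."""
    return 1.0 / np.maximum(measured_mag, eps)

# A bin where the driver is 6 dB down (x0.5) receives ~6 dB of gain (x2).
mag = np.array([1.0, 0.5, 1.0])
print(inverse_filter(mag))  # → [1. 2. 1.]
```

Multiplying the stimulus spectrum by these gains before playback flattens the output within the calibrated band.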

Pure tone stimuli were used to obtain frequency-response areas, both to characterize individual units and to determine tonotopic gradients in order to identify the cortical field in which any given recording was made. The tone frequencies ranged, in 1/3-octave steps, from 200 Hz to 24 kHz. Intensity levels were varied between 10 and 80 dB SPL in 10 dB increments. Each frequency–level combination was presented pseudorandomly ≥3 times at a rate of once per second. Broadband noise bursts (40 Hz–30 kHz bandwidth, cosine-ramped with a 10 ms rise/fall time), generated afresh on every trial, were used as a search stimulus in order to establish whether each unit was acoustically responsive.
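Under the stated parameters, the stimulus grid might be constructed as follows. This is a sketch only; the exact frequency anchoring (starting at 200 Hz and multiplying upward) is an assumption, and with that anchoring the 1/3-octave series tops out just above 20 kHz.

```python
# Hypothetical construction of the pure-tone grid described above:
# 1/3-octave frequency steps from 200 Hz, levels 10–80 dB SPL in 10 dB steps,
# each frequency-level combination repeated 3 times in pseudorandom order.
import random

freqs = []
f = 200.0
while f <= 24000.0:
    freqs.append(round(f, 1))
    f *= 2 ** (1 / 3)              # one-third octave per step

levels = list(range(10, 81, 10))   # 10, 20, ..., 80 dB SPL
trials = [(fr, lv) for fr in freqs for lv in levels] * 3
random.shuffle(trials)             # pseudorandom presentation order
```

Each entry of `trials` would then be presented at a rate of one per second.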

Virtual acoustic space (VAS) stimuli were used to vary the location of the closed-field noise bursts. A series of measurements, including head size, sex, body weight and pinna size, were taken from each ferret in order to select the best match from our extensive library of ferret head-related transfer functions (HRTFs). It has previously been shown that individual differences in the acoustical cues can be significantly reduced by frequency-scaling the HRTFs according to differences in body size (Xu and Middlebrooks, 2000; Schnupp et al., 2003). Basing the stimuli on measurements from the animals that most closely matched the ones used in the present study therefore improves the fidelity of the VAS and avoids the need to make individual acoustic measurements. The visual stimuli were presented from an array of 8 LEDs, positioned 1 m from the animal at 15° intervals, from 5° ipsilateral of the midline (denoted as 5° azimuth) to 100° contralateral to the midline (−100° azimuth). Broadband noise was presented in VAS from these same positions. Acoustic and visual stimuli were presented alone and in combination, pseudorandomly, at a rate of 1 presentation per second.
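The LED azimuths described above can be enumerated directly (sign convention as in the text: positive azimuths are ipsilateral, negative contralateral):

```python
# The 8 LED positions: 15° spacing from +5° (ipsilateral) to -100° (contralateral).
azimuths = [5 - 15 * i for i in range(8)]
print(azimuths)  # → [5, -10, -25, -40, -55, -70, -85, -100]
```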

4.3. Data acquisition

Recordings were made with silicon probe electrodes (NeuroNexus Technologies, Ann Arbor, MI). We used probes with either 16 or 32 active sites, in a 1×16, 2×16 or 4×8 configuration (number of shanks × number of positions on each shank). Multiple shanks were separated by 200 μm and active sites were separated vertically by 150 μm. The electrodes were positioned so that they entered the cortex orthogonal to the surface of the ectosylvian gyrus. Recordings were made in five of the seven identified acoustically responsive areas: the tonotopically organized primary fields A1 and AAF, located on the middle ectosylvian gyrus; PPF and PSF, which are tonotopically organized areas on the posterior bank; and ADF, a non-tonotopic area located on the anterior bank of the ectosylvian sulcus. In each animal, sufficient penetrations were made to visualize the tonotopic organization of auditory cortex. While there is a degree of inter-animal variability in the location of the different auditory fields within the ectosylvian gyrus, the auditory cortex in the ferret is clearly separated from surrounding visual and somatosensory areas by the suprasylvian sulcus. The five cortical areas are readily distinguished from each other on the basis of their tonotopic gradients and their position with respect to the suprasylvian sulcus. Thus, there is a clear low-frequency reversal at the ventral edge of the primary fields and a second separating the two posterior fields, which are also tonotopically ordered. ADF can be distinguished on the basis that it lies ventral to the low-frequency edge of AAF. Typically, ADF neurons have very broad frequency-response areas and higher thresholds for pure tones than neurons in the primary and posterior fields. Neurons in ADF can be distinguished from those in AVF, as the latter are not well driven by pure tones. Neurons in AAF and ADF have shorter response latencies than those in A1 and the posterior fields, respectively. Moreover, neurons recorded in PSF and PPF tend to exhibit more sustained firing.

The neuronal recordings were bandpass filtered (500 Hz–5 kHz), amplified (up to 20,000×) and digitized at 25 kHz. Data acquisition and stimulus generation were performed using BrainWare (Tucker-Davis Technologies).

4.4. Data analysis

Spike sorting was performed offline. Single units were isolated from the digitized signal by manually clustering data according to spike features such as amplitude, width and area. We also inspected autocorrelation histograms, and only cases in which the inter-spike-interval histograms revealed a clear refractory period were classed as single units.
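One simple way to operationalize the refractory-period criterion is sketched below. This is an assumption for illustration, not the authors' pipeline (which relied on manual clustering and visual inspection of the histograms): a cluster is flagged as a putative single unit if fewer than 1% of its inter-spike intervals fall below a 1 ms refractory period.

```python
# Hypothetical refractory-period check on sorted spike times (in seconds).
import numpy as np

def passes_refractory_check(spike_times_s, refractory_s=0.001, max_violation=0.01):
    """True if the fraction of inter-spike intervals shorter than the
    refractory period is below the allowed violation rate."""
    isis = np.diff(np.sort(spike_times_s))
    return bool(np.mean(isis < refractory_s) < max_violation)

spikes = np.arange(0, 1, 0.01)           # regular 10 ms intervals: clean unit
print(passes_refractory_check(spikes))   # → True
```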

We have previously shown that MI estimates provide a robust and sensitive method of quantifying multisensory interactions (Bizley et al., 2007). Briefly, the stimulus S and neural response R were treated as random variables and the MI between them, I(S;R), measured in bits, was calculated as a function of their joint probability p(s, r) and defined as:

I(S;R) = Σ_{s,r} p(s, r) log₂ [ p(s, r) / ( p(s) p(r) ) ]

where p(s) and p(r) are the marginal distributions (Cover and Thomas, 1991). The MI is zero if the two variables are independent (that is, p(s, r) = p(s)p(r) for every r and s) and is positive otherwise.
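The computation can be sketched in Python as a plug-in estimate from a table of joint counts. This is an illustration, not the authors' code, and the sampling-bias correction applied in the actual analysis is omitted here.

```python
# Illustrative plug-in estimate of I(S;R) in bits from joint counts:
# I(S;R) = sum over (s, r) of p(s,r) * log2[ p(s,r) / (p(s) p(r)) ].
import numpy as np

def mutual_information_bits(joint_counts):
    """joint_counts[i, j]: number of trials with stimulus i and response j."""
    p_sr = joint_counts / joint_counts.sum()      # joint distribution p(s, r)
    p_s = p_sr.sum(axis=1, keepdims=True)         # marginal p(s)
    p_r = p_sr.sum(axis=0, keepdims=True)         # marginal p(r)
    nz = p_sr > 0                                 # 0 * log 0 is taken as 0
    return float((p_sr[nz] * np.log2(p_sr[nz] / (p_s @ p_r)[nz])).sum())

# A response that perfectly discriminates two locations carries 1 bit.
counts = np.array([[10, 0],
                   [0, 10]])
print(mutual_information_bits(counts))  # → 1.0
```

An independent stimulus–response table (every cell equal) gives 0 bits, matching the definition above.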

The degree to which a neuron was spatially informative was estimated for the bisensory, unisensory visual and unisensory auditory responses separately using the following method. Neural responses were divided into n time bins of length t. For each time bin, the spike count was used in order to


calculate the MI between stimulus location and the spike count in that bin. This was repeated for all time bins over a response duration of 600 ms. The MI was then summed across all time bins to give the final value. This procedure was repeated for time bins of length 20, 35, 100, 300 and 600 ms. In order to estimate whether the amount of information calculated (in bits) was "significant", a bootstrapping procedure was performed whereby, within each time bin, responses were randomly assigned to spatial positions and the MI recalculated. This was repeated 1000 times, and if the actual value exceeded the 90th percentile of the bootstrapped values, the response was considered to be significantly spatially informative. The MI value used for subsequent analysis was the highest significant value obtained across these bin lengths. MI estimates can be subject to significant positive sampling bias (Nelken et al., 2005; Rolls and Treves, 1998). Standard bias correction procedures were used (Treves and Panzeri, 1995) and the MI values reported were all bias-corrected.
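The shuffling test can be sketched as follows. This is a minimal illustration under stated assumptions: `mi_bits` is a simple plug-in MI estimator defined here for the example, the toy data are invented, and no bias correction is applied. Spatial labels are permuted across trials, the MI is recomputed for each permutation, and the observed value is compared with the 90th percentile of the resulting null distribution.

```python
# Hedged sketch of the permutation ("bootstrapping") significance test
# described above; not the authors' implementation.
import numpy as np

def mi_bits(labels, counts):
    """Plug-in MI (bits) between integer stimulus labels and spike counts."""
    joint = np.zeros((labels.max() + 1, counts.max() + 1))
    for s, r in zip(labels, counts):
        joint[s, r] += 1
    p = joint / joint.sum()
    ps, pr = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

def shuffle_test(labels, counts, n_shuffles=1000, pctile=90, seed=0):
    rng = np.random.default_rng(seed)
    observed = mi_bits(labels, counts)
    # Null distribution: break the stimulus-response pairing by permutation.
    null = [mi_bits(rng.permutation(labels), counts) for _ in range(n_shuffles)]
    return observed, observed > np.percentile(null, pctile)

# Strongly tuned toy unit: location 0 -> 0 spikes, location 1 -> 5 spikes.
labels = np.array([0] * 20 + [1] * 20)
spike_counts = np.array([0] * 20 + [5] * 20)
obs, significant = shuffle_test(labels, spike_counts)
```

Note that the shuffled MI values are themselves positive on average, which is the positive sampling bias that the correction of Treves and Panzeri (1995) addresses.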

The advantages of calculating MI values are, firstly, that this yields a value, in bits, that is in itself readily interpretable: the number of stimuli (in this case spatial locations) that can be discriminated is 2^MI(bits). Secondly, the MI values obtained in the different stimulus conditions (i.e. unisensory visual, unisensory auditory and bisensory stimulation) can be compared directly to quantify differences in information transmission.

In order to compare the MI values in different auditory cortical fields, composite auditory maps were created by pooling data from the 5 animals. Each electrode penetration was first assigned to a cortical field on the basis of the frequency-response areas and tonotopic organization recorded in that animal, together with photographs documenting the location of each recording site with respect to sulcal landmarks on the cortical surface. These data were then mapped onto a single composite auditory cortex using the frequency organization revealed by the intrinsic optical imaging study of Nelken et al. (2004), with the individual frequency maps overlaid to create the composite views shown in Fig. 3.

Acknowledgments

We are grateful to Kerry Walker, Fernando Nodal and Jan Schnupp for their assistance with data collection, and to the Wellcome Trust (Principal Research Fellowship to A. J. King) and the BBSRC (grant BB/D009758/1 to J. Schnupp, A. J. King and J. K. Bizley) for financial support.

References

Benedek, G., Fischer-Szatmari, L., Kovacs, G., Perenyi, J., Katoh, Y.Y., 1996. Visual, somatosensory and auditory modality properties along the feline suprageniculate-anterior ectosylvian sulcus/insular pathway. Prog. Brain Res. 112, 325–334.

Bizley, J.K., Nodal, F.R., Nelken, I., King, A.J., 2005. Functional organization of ferret auditory cortex. Cereb. Cortex 15, 1637–1653.

Bizley, J.K., Nodal, F.R., Bajo, V.M., Nelken, I., King, A.J., 2007. Physiological and anatomical evidence for multisensory interactions in auditory cortex. Cereb. Cortex 17, 2172–2189.

Brosch, M., Selezneva, E., Scheich, H., 2005. Nonauditory events of a behavioral procedure activate auditory cortex of highly trained monkeys. J. Neurosci. 25, 6797–6806.

Budinger, E., Heil, P., Hess, A., Scheich, H., 2006. Multisensory processing via early cortical stages: connections of the primary auditory cortical field with other sensory systems. Neuroscience 143, 1065–1083.

Budinger, E., Laszcz, A., Lison, H., Scheich, H., Ohl, F.W., 2007. Non-sensory cortical and subcortical connections of the primary auditory cortex in Mongolian gerbils: bottom-up and top-down processing of neuronal information via field AI. Brain Res. [Epub ahead of print].

Burnett, L.R., Stein, B.E., Chaponis, D., Wallace, M.T., 2004. Superiorcolliculus lesions preferentially disrupt multisensoryorientation. Neuroscience 124, 535–547.

Calvert, G.A., Brammer, M.J., Bullmore, E.T., Campbell, R., Iversen, S.D., David, A.S., 1999. Response amplification in sensory-specific cortices during crossmodal binding. Neuroreport 10, 2619–2623.

Cover, T.M., Thomas, J.A., 1991. Elements of Information Theory. Wiley & Sons.

Dehner, L.R., Keniston, L.P., Clemo, H.R., Meredith, M.A., 2004. Cross-modal circuitry between auditory and somatosensory areas of the cat anterior ectosylvian sulcal cortex: a 'new' inhibitory form of multisensory convergence. Cereb. Cortex 14, 387–403.

Eordegh, G., Nagy, A., Berenyi, A., Benedek, G., 2005. Processing ofspatial visual information along the pathway between thesuprageniculate nucleus and the anterior ectosylvian cortex.Brain Res. Bull. 67, 281–289.

Falchier, A., Clavagnier, S., Barone, P., Kennedy, H., 2002. Anatomical evidence of multimodal integration in primate striate cortex. J. Neurosci. 22, 5749–5759.

Foxe, J.J., Morocz, I.A., Murray, M.M., Higgins, B.A., Javitt, D.C., Schroeder, C.E., 2000. Multisensory auditory–somatosensory interactions in early cortical processing revealed by high-density electrical mapping. Brain Res. Cogn. Brain Res. 10, 77–83.

Foxe, J.J., Wylie, G.R., Martinez, A., Schroeder, C.E., Javitt, D.C.,Guilfoyle, D., Ritter, W., Murray, M.M., 2002.Auditory–somatosensory multisensory processing in auditoryassociation cortex: an fMRI study. J. Neurophysiol. 88, 540–543.

Ghazanfar, A.A., Maier, J.X., Hoffman, K.L., Logothetis, N.K., 2005.Multisensory integration of dynamic faces and voices in rhesusmonkey auditory cortex. J. Neurosci. 25, 5004–5012.

Giard, M.H., Peronnet, F., 1999. Auditory–visual integration duringmultimodal object recognition in humans: a behavioral andelectrophysiological study. J. Cogn. Neurosci. 11, 473–490.

Hackett, T.A., De La Mothe, L.A., Ulbert, I., Karmos, G., Smiley, J.,Schroeder, C.E., 2007. Multisensory convergence in auditorycortex, II. Thalamocortical connections of the caudal superiortemporal plane. J. Comp. Neurol. 502, 924–952.

Imaizumi, K., Priebe, N.J., Crum, P.A., Bedenbaugh, P.H., Cheung, S.W., Schreiner, C.E., 2004. Modular functional organization of cat anterior auditory field. J. Neurophysiol. 92, 444–457.

Kayser, C., Petkov, C.I., Augath, M., Logothetis, N.K., 2007.Functional imaging reveals visual modulation of specific fieldsin auditory cortex. J. Neurosci. 27, 1824–1835.

Kelly, J.B., Judge, P.W., Phillips, D.P., 1986. Representation of the cochlea in primary auditory cortex of the ferret (Mustela putorius). Hearing Res. 24, 111–115.

King, A.J., Palmer, A.R., 1985. Integration of visual and auditoryinformation in bimodal neurones in the guinea-pig superiorcolliculus. Exp. Brain Res. 60, 492–500.

Kowalski, N., Versnel, H., Shamma, S.A., 1995. Comparison ofresponses in the anterior and primary auditory fields of theferret cortex. J. Neurophysiol. 73, 1513–1523.

Lakatos, P., Chen, C.M., O'Connell, M.N., Mills, A., Schroeder, C.E.,2007. Neuronal oscillations and multisensory interaction inprimary auditory cortex. Neuron 53, 279–292.


Malhotra, S., Lomber, S.G., 2007. Sound localization duringhomotopic and heterotopic bilateral cooling deactivation ofprimary and nonprimary auditory cortical areas in the cat.J. Neurophysiol. 97, 26–43.

Manger, P.R., Engler, G., Moll, C.K., Engel, A.K., 2005. The anterior ectosylvian visual area of the ferret: a homologue for an enigmatic visual cortical area of the cat? Eur. J. Neurosci. 22, 706–714.

Manger, P.R., Nakamura, H., Valentiniene, S., Innocenti, G.M., 2004. Visual areas in the lateral temporal cortex of the ferret (Mustela putorius). Cereb. Cortex 14, 676–689.

Meredith, M.A., Stein, B.E., 1983. Interactions among convergingsensory inputs in the superior colliculus. Science 221, 389–391.

Meredith, M.A., Stein, B.E., 1986. Spatial factors determine theactivity of multisensory neurons in cat superior colliculus.Brain Res. 365, 350–354.

Middlebrooks, J.C., Pettigrew, J.D., 1981. Functional classes ofneurons in primary auditory cortex of the cat distinguished bysensitivity to sound location. J. Neurosci. 1, 107–120.

Molholm, S., Ritter, W., Murray, M.M., Javitt, D.C., Schroeder, C.E., Foxe, J.J., 2002. Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Brain Res. Cogn. Brain Res. 14, 115–128.

Molholm, S., Ritter, W., Javitt, D.C., Foxe, J.J., 2004. Multisensoryvisual–auditory object recognition in humans: a high-densityelectrical mapping study. Cereb. Cortex 14, 452–465.

Mrsic-Flogel, T.D., King, A.J., Schnupp, J.W.H., 2005. Encoding ofvirtual acoustic space stimuli by neurons in ferret primaryauditory cortex. J. Neurophysiol. 93, 3489–3503.

Murray, M.M., Molholm, S., Michel, C.M., Heslenfeld, D.J., Ritter, W., Javitt, D.C., Schroeder, C.E., Foxe, J.J., 2005. Grabbing your ear: rapid auditory–somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cereb. Cortex 15, 963–974.

Nelken, I., Bizley, J.K., Nodal, F.R., Ahmed, B., Schnupp, J.W.H.,King, A.J., 2004. Large-scale organization of ferret auditorycortex revealed using continuous acquisition of intrinsicoptical signals. J. Neurophysiol. 92, 2574–2588.

Nelken, I., Chechik, G., Mrsic-Flogel, T.D., King, A.J., Schnupp, J.W.H., 2005. Encoding stimulus information by spike numbers and mean response time in primary auditory cortex. J. Comput. Neurosci. 19, 199–221.

Pekkola, J., Ojanen, V., Autti, T., Jääskeläinen, I.P., Möttönen, R.,Tarkiainen, A., Sams, M., 2005. Primary auditory cortexactivation by visual speech: an fMRI study at 3T. Neuroreport16, 125–128.

Philipp, R., Distler, C., Hoffmann, K.P., 2005. A motion-sensitive area in ferret extrastriate visual cortex: an analysis in pigmented and albino animals. Cereb. Cortex 16, 779–790.

Phillips, D.P., Orman, S.S., 1984. Responses of single neurons inposterior field of cat auditory cortex to tonal stimulation.J. Neurophysiol. 51, 147–163.

Ramsay, A.M., Meredith, M.A., 2004. Multiple sensory afferents toferret pseudosylvian sulcal cortex. Neuroreport 15, 461–465.

Recanzone, G.H., 1998. Rapidly induced auditory plasticity: theventriloquism aftereffect. Proc. Natl. Acad. Sci. U. S. A. 95, 869–875.

Rockland, K.S., Ojima, H., 2003. Multisensory convergence in calcarine visual areas in macaque monkey. Int. J. Psychophysiol. 50, 19–26.

Rolls, E.T., Treves, A., 1998. Neural Networks and Brain Function. Oxford University Press, Oxford.

Schnupp, J.W.H., Booth, J., King, A.J., 2003. Modeling individualdifferences in ferret external ear transfer functions. J. Acoust.Soc. Am. 113, 2021–2030.

Schreiner, C.E., Cynader, M.S., 1984. Basic functional organizationof second auditory cortical field (AII) of the cat. J. Neurophysiol.51, 1284–1305.

Schroeder, C.E., Foxe, J.J., 2002. The timing and laminar profile ofconverging inputs to multisensory areas of the macaqueneocortex. Brain Res. Cogn. Brain Res. 14, 187–198.

Schroeder, C.E., Foxe, J., 2005. Multisensory contributions to low-level, 'unisensory' processing. Curr. Opin. Neurobiol. 15, 454–458.

Schroeder, C.E., Lindsley, R.W., Specht, C., Marcovici, A., Smiley, J.F.,Javitt, D.C., 2001. Somatosensory input to auditory associationcortex in the macaque monkey. J. Neurophysiol. 85, 1322–1327.

Stecker, G.C., Mickey, B.J., Macpherson, E.A., Middlebrooks, J.C.,2003. Spatial sensitivity in field PAF of cat auditory cortex.J. Neurophysiol. 89, 2889–2903.

Stecker, G.C., Harrington, I.A., Macpherson, E.A., Middlebrooks, J.C., 2005. Spatial sensitivity in the dorsal zone (area DZ) of cat auditory cortex. J. Neurophysiol. 94, 1267–1280.

Stein, B.E., Huneycutt, W.S., Meredith, M.A., 1988. Neurons and behavior: the same rules of multisensory integration apply. Brain Res. 448, 355–358.

Tolias, A.S., Keliris, G.A., Smirnakis, S.M., Logothetis, N.K., 2005.Neurons in macaque area V4 acquire directional tuning afteradaptation to motion stimuli. Nat. Neurosci. 8, 591–593.

Treves, A., Panzeri, S., 1995. The upward bias in measures ofinformation derived from limited data samples. Neural Comput.7, 399–407.

Wallace, M.T., Meredith, M.A., Stein, B.E., 1992. Integration ofmultiple sensory modalities in cat cortex. Exp. Brain Res. 91,484–488.

Wallace, M.T., Wilkinson, L.K., Stein, B.E., 1996. Representationand integration of multiple sensory inputs in primate superiorcolliculus. J. Neurophysiol. 76, 1246–1266.

Wallace, M.T., Ramachandran, R., Stein, B.E., 2004. A revised viewof sensory cortical parcellation. Proc. Natl. Acad. Sci. U. S. A.101, 2167–2172.

Welch, R.B., Warren, D.H., 1986. Intersensory interactions. In: Boff,K.R., Kaufman, L., Thomas, J.P. (Eds.), Handbook of Perceptionand Human Performance. Vol. 1: Sensory Processes andPerception. Wiley, New York, pp. 1–36.

Woods, T.M., Recanzone, G.H., 2004. Visually induced plasticity ofauditory spatial perception in macaques. Curr. Biol. 14,1559–1564.

Xu, L., Middlebrooks, J.C., 2000. Individual differences inexternal-ear transfer functions of cats. J. Acoust. Soc. Am. 107,1451–1459.

Xu, L., Furukawa, S., Middlebrooks, J.C., 1998. Sensitivity tosound-source elevation in nontonotopic auditory cortex.J. Neurophysiol. 80, 882–894.
