Phase-locked responses to speech in human auditory cortex are enhanced during comprehension.

Peelle JE, Gross J, Davis MH - Cereb. Cortex (2012)

Bottom Line: This enhanced phase locking was left lateralized and localized to left temporal cortex. Together, our results demonstrate that entrainment to connected speech not only depends on acoustic characteristics, but is also affected by listeners' ability to extract linguistic information. This suggests a biological framework for speech comprehension in which acoustic and linguistic cues reciprocally aid in stimulus prediction.

Affiliation: MRC Cognition and Brain Sciences Unit, Cambridge CB2 7EF, UK.

ABSTRACT
A growing body of evidence shows that ongoing oscillations in auditory cortex modulate their phase to match the rhythm of temporally regular acoustic stimuli, increasing sensitivity to relevant environmental cues and improving detection accuracy. In the current study, we test the hypothesis that nonsensory information provided by linguistic content enhances phase-locked responses to intelligible speech in the human brain. Sixteen adults listened to meaningful sentences while we recorded neural activity using magnetoencephalography. Stimuli were processed using a noise-vocoding technique to vary intelligibility while keeping the temporal acoustic envelope consistent. We show that the acoustic envelopes of sentences contain most power between 4 and 7 Hz and that it is in this frequency band that phase locking between neural activity and envelopes is strongest. Bilateral oscillatory neural activity phase-locked to unintelligible speech, but this cerebro-acoustic phase locking was enhanced when speech was intelligible. This enhanced phase locking was left lateralized and localized to left temporal cortex. Together, our results demonstrate that entrainment to connected speech not only depends on acoustic characteristics, but is also affected by listeners' ability to extract linguistic information. This suggests a biological framework for speech comprehension in which acoustic and linguistic cues reciprocally aid in stimulus prediction.
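The noise-vocoding manipulation can be illustrated in a few lines of signal processing. Below is a minimal sketch (Python with NumPy/SciPy) of a generic Shannon-style noise vocoder, not the authors' exact pipeline: the band edges, filter orders, and envelope-smoothing cutoff are all assumptions made for illustration. The amplitude envelope of each analysis band is reapplied to band-limited noise, so spectral detail varies with the number of channels while the temporal envelope is preserved.

```python
# Minimal noise-vocoder sketch. Generic illustration only: band edges,
# filter orders, and the 30 Hz envelope cutoff are assumed, not taken
# from the paper.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(signal, fs, n_channels, f_lo=70.0, f_hi=5000.0, env_cutoff=30.0):
    """Replace spectral detail with band-limited noise while preserving
    each band's temporal amplitude envelope."""
    signal = np.asarray(signal, dtype=float)
    # Logarithmically spaced band edges (assumed spacing).
    edges = np.logspace(np.log10(f_lo), np.log10(f_hi), n_channels + 1)
    env_sos = butter(4, env_cutoff, btype="lowpass", fs=fs, output="sos")
    out = np.zeros_like(signal)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band_sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(band_sos, signal)
        # Smoothed amplitude envelope of this analysis band.
        env = np.clip(sosfiltfilt(env_sos, np.abs(hilbert(band))), 0.0, None)
        # Excite the same band with envelope-modulated noise.
        carrier = sosfiltfilt(band_sos, np.random.randn(len(signal)))
        out += env * carrier
    # Match the overall RMS level of the input.
    return out * np.sqrt(np.mean(signal**2) / np.mean(out**2))
```

With n_channels=1 the output is envelope-modulated broadband noise and is unintelligible; with 16 channels enough spectral detail survives that sentences are typically fully intelligible, corresponding to the conditions contrasted in this study.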

Figure 4 (BHS118F4). Linguistic influences on cerebro-acoustic coherence. (A) Group analysis showing neural sources in which intelligible 16-channel vocoded speech led to significantly greater coherence with the acoustic envelope than 1-channel vocoded speech. Effects shown are whole-brain corrected (P < 0.05); coronal slices are from an MNI standard brain at 8-mm intervals. (B) For a 5-mm-radius sphere around the middle temporal gyrus peak (−60, −16, −8), 4-channel vocoded speech also showed significantly greater coherence than 4-channel rotated vocoded speech, despite the two conditions being equated for spectral detail. (C) Analysis of the first and second halves of each sentence confirms that the results were not driven by sentence-onset effects: there was no main effect of sentence half and no interaction with condition.

Mentions: To assess effects of intelligibility on cerebro-acoustic coherence more broadly, we conducted a whole-brain search for regions in which coherence was higher for the intelligible 16-channel speech than for the unintelligible 1-channel speech, using a voxel-wise threshold of P < 0.001, corrected for multiple comparisons (P < 0.05) using cluster extent. As shown in Figure 4A, this analysis revealed a significant cluster of greater coherence centered on the left middle temporal gyrus [13 824 μL; peak at (−60, −16, −8), Z = 4.11], extending into both the inferior and superior temporal gyri. A second cluster extended from the medial to the lateral surface of left ventral inferior frontal cortex [17 920 μL; peak at (−8, 40, −20), Z = 3.56]. A third cluster was observed in the left inferior frontal gyrus [1344 μL; peak at (−60, 36, −16), Z = 3.28], although this was too small to pass whole-brain cluster-extent correction (and is thus not shown in Fig. 4). (We conducted an additional analysis in which the source reconstructions were calculated on a single frequency range of 4–7 Hz, as opposed to averaging separate source localizations, as described in Materials and Methods. This analysis resulted in the same 2 significant clusters of increased coherence in nearly identical locations.)
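For intuition about the dependent measure, here is a simplified sketch of cerebro-acoustic coherence: the magnitude-squared coherence between a source-space neural time series and the speech amplitude envelope, averaged over the 4–7 Hz band. This is a stand-in, not the authors' implementation (the paper derives coherence from cross-spectral densities on beamformed MEG sources); the Welch-based estimator, the envelope-smoothing cutoff, and the ~1 s analysis windows are all assumptions.

```python
# Simplified cerebro-acoustic coherence. Assumes a neural time series has
# already been reconstructed at a source location; estimator parameters
# are illustrative, not taken from the paper.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, coherence

def amplitude_envelope(audio, fs_audio, fs_neural, cutoff=30.0):
    """Broadband amplitude envelope, low-pass smoothed (cutoff assumed)
    and resampled to the neural sampling rate."""
    env = np.abs(hilbert(np.asarray(audio, dtype=float)))
    sos = butter(4, cutoff, btype="lowpass", fs=fs_audio, output="sos")
    env = sosfiltfilt(sos, env)
    t_old = np.arange(len(env)) / fs_audio
    t_new = np.arange(int(len(env) * fs_neural / fs_audio)) / fs_neural
    return np.interp(t_new, t_old, env)

def cerebro_acoustic_coherence(source_ts, envelope, fs, band=(4.0, 7.0)):
    """Magnitude-squared coherence between a source time series and the
    envelope (same length and sampling rate), averaged over the band."""
    f, cxy = coherence(source_ts, envelope, fs=fs, nperseg=int(fs))  # ~1 s segments
    mask = (f >= band[0]) & (f <= band[1])
    return cxy[mask].mean()
```

In practice a value like this would be computed at every source location and contrasted across conditions (e.g., 16-channel vs. 1-channel speech), with the whole-brain cluster-extent correction described above.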

