Phase-locked responses to speech in human auditory cortex are enhanced during comprehension.

Peelle JE, Gross J, Davis MH - Cereb. Cortex (2012)

Bottom Line: This enhanced phase locking was left lateralized and localized to left temporal cortex. Together, our results demonstrate that entrainment to connected speech depends not only on acoustic characteristics but is also affected by listeners' ability to extract linguistic information. This suggests a biological framework for speech comprehension in which acoustic and linguistic cues reciprocally aid in stimulus prediction.


Affiliation: MRC Cognition and Brain Sciences Unit, Cambridge CB2 7EF, UK.

ABSTRACT
A growing body of evidence shows that ongoing oscillations in auditory cortex modulate their phase to match the rhythm of temporally regular acoustic stimuli, increasing sensitivity to relevant environmental cues and improving detection accuracy. In the current study, we test the hypothesis that nonsensory information provided by linguistic content enhances phase-locked responses to intelligible speech in the human brain. Sixteen adults listened to meaningful sentences while we recorded neural activity using magnetoencephalography. Stimuli were processed using a noise-vocoding technique to vary intelligibility while keeping the temporal acoustic envelope consistent. We show that the acoustic envelopes of sentences contain most power between 4 and 7 Hz and that it is in this frequency band that phase locking between neural activity and envelopes is strongest. Bilateral oscillatory neural activity phase-locked to unintelligible speech, but this cerebro-acoustic phase locking was enhanced when speech was intelligible. This enhanced phase locking was left lateralized and localized to left temporal cortex. Together, our results demonstrate that entrainment to connected speech depends not only on acoustic characteristics but is also affected by listeners' ability to extract linguistic information. This suggests a biological framework for speech comprehension in which acoustic and linguistic cues reciprocally aid in stimulus prediction.
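The noise-vocoding technique mentioned in the abstract can be sketched as follows. This is not the authors' stimulus-preparation code; it is a minimal illustration of the general method (split speech into frequency bands, extract each band's amplitude envelope, and use the envelopes to modulate band-limited noise). The number of channels, band edges, and filter order here are illustrative assumptions, not the study's parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(signal, fs, n_channels, f_lo=50.0, f_hi=8000.0):
    """Minimal noise vocoder sketch.

    Splits `signal` into `n_channels` log-spaced bands, extracts each
    band's amplitude envelope (Hilbert transform), and uses it to
    modulate band-limited noise. Fewer channels yield less intelligible
    speech while the broadband temporal envelope is largely preserved.
    """
    edges = np.logspace(np.log10(f_lo), np.log10(f_hi), n_channels + 1)
    rng = np.random.default_rng(0)
    out = np.zeros_like(signal)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal)          # band-limited speech
        env = np.abs(hilbert(band))              # amplitude envelope
        noise = sosfiltfilt(sos, rng.standard_normal(len(signal)))
        out += env * noise                       # envelope-modulated noise
    return out
```

Because each channel's envelope is kept but its fine spectral detail is replaced by noise, intelligibility scales with the number of channels while the slow (4-7 Hz) envelope modulations stay comparable across conditions.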


Figure 3 (BHS118F3): Source-localized cerebro-acoustic coherence results. (A) Source localization showing significant cerebro-acoustic coherence in the unintelligible 1 channel condition compared with a permutation baseline derived from random pairings of acoustic envelopes to MEG data across all participants. Effects shown are whole-brain corrected (P < 0.05). (B) ROI analysis of coherence values extracted from probabilistically defined primary auditory cortex regions, relative to coherence for random pairings of acoustic and cerebral trials. Data showed a significant hemisphere × number of channels × normal/random interaction (P < 0.001).
License: Creative Commons

Mentions: We next conducted a whole-brain analysis on source-localized data to test whether the unintelligible 1 channel condition showed significantly greater coherence between the neural and acoustic data than that seen in random pairings of acoustic envelopes and neural data. These results are shown in Figure 3A, using a voxel-wise threshold of P < 0.001 and a P < 0.05 whole-brain cluster extent correction for multiple comparisons using random field theory (Worsley et al. 1992). This analysis revealed a number of regions showing significant phase locking to the acoustic envelope in the absence of linguistic information, including bilateral superior and middle temporal gyri, inferior frontal gyri, and motor cortex (Figure 3).
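The logic of the random-pairing baseline described above can be sketched in a few lines: compute coherence between each MEG trial and its matching acoustic envelope, then build a null distribution by shuffling which envelope is paired with which trial. This is a simplified illustration, not the authors' source-space pipeline; the function names, the per-trial Welch coherence estimator, and all parameters (`nperseg`, the 4-7 Hz band limits, the number of permutations) are assumptions for the sketch.

```python
import numpy as np
from scipy.signal import coherence

def cerebro_acoustic_coherence(meg_trials, env_trials, fs, band=(4.0, 7.0)):
    """Mean magnitude-squared coherence between MEG trials (rows) and
    their matching acoustic envelopes, averaged over the 4-7 Hz band."""
    f, cxy = coherence(np.asarray(meg_trials), np.asarray(env_trials),
                       fs=fs, nperseg=256, axis=-1)
    mask = (f >= band[0]) & (f <= band[1])
    return cxy[:, mask].mean()

def permutation_baseline(meg_trials, env_trials, fs, n_perm=200, seed=0):
    """Null distribution from random pairings: shuffle which envelope is
    paired with which MEG trial and recompute the coherence each time."""
    rng = np.random.default_rng(seed)
    env = np.asarray(env_trials)
    return np.array([
        cerebro_acoustic_coherence(meg_trials, env[rng.permutation(len(env))], fs)
        for _ in range(n_perm)
    ])
```

Observed coherence well above this null distribution indicates phase locking that is specific to the envelope each listener actually heard, rather than a generic response to speech-like rhythms.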

