Phase-locked responses to speech in human auditory cortex are enhanced during comprehension.

Peelle JE, Gross J, Davis MH - Cereb. Cortex (2012)

Bottom Line: This enhanced phase locking was left lateralized and localized to left temporal cortex. Together, our results demonstrate that entrainment to connected speech depends not only on acoustic characteristics, but is also affected by listeners' ability to extract linguistic information. This suggests a biological framework for speech comprehension in which acoustic and linguistic cues reciprocally aid in stimulus prediction.


Affiliation: MRC Cognition and Brain Sciences Unit, Cambridge CB2 7EF, UK.

ABSTRACT
A growing body of evidence shows that ongoing oscillations in auditory cortex modulate their phase to match the rhythm of temporally regular acoustic stimuli, increasing sensitivity to relevant environmental cues and improving detection accuracy. In the current study, we test the hypothesis that nonsensory information provided by linguistic content enhances phase-locked responses to intelligible speech in the human brain. Sixteen adults listened to meaningful sentences while we recorded neural activity using magnetoencephalography. Stimuli were processed using a noise-vocoding technique to vary intelligibility while keeping the temporal acoustic envelope consistent. We show that the acoustic envelopes of sentences contain most power between 4 and 7 Hz and that it is in this frequency band that phase locking between neural activity and envelopes is strongest. Bilateral oscillatory neural activity phase-locked to unintelligible speech, but this cerebro-acoustic phase locking was enhanced when speech was intelligible. This enhanced phase locking was left lateralized and localized to left temporal cortex. Together, our results demonstrate that entrainment to connected speech depends not only on acoustic characteristics, but is also affected by listeners' ability to extract linguistic information. This suggests a biological framework for speech comprehension in which acoustic and linguistic cues reciprocally aid in stimulus prediction.
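The noise-vocoding manipulation described in the abstract (vary intelligibility while preserving the temporal envelope) can be sketched in Python. This is a minimal illustration of the general technique, not the authors' processing chain: the band count, logarithmic band spacing, filter order, and frequency range below are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(signal, fs, n_bands=4, f_lo=100.0, f_hi=4000.0):
    """Noise-vocode `signal`: per-band amplitude envelopes modulate band-limited noise."""
    rng = np.random.default_rng(0)  # fixed seed for a reproducible noise carrier
    # Logarithmically spaced band edges (an assumption; the paper's filter
    # bank parameters are not reproduced here).
    edges = np.logspace(np.log10(f_lo), np.log10(f_hi), n_bands + 1)
    out = np.zeros(len(signal), dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal)
        env = np.abs(hilbert(band))        # amplitude envelope of this band
        carrier = sosfiltfilt(sos, rng.standard_normal(len(signal)))
        out += env * carrier               # envelope-modulated, band-limited noise
    return out
```

With fewer bands the output carries less spectral detail (lower intelligibility) while the slow amplitude envelope in each band, and hence the broadband temporal envelope, is retained.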

Figure 2 (BHS118F2): Sensor-level cerebro-acoustic coherence for magnetometer sensors. (A) For 2 example participants, the magnetometer with the maximum coherence values (across all frequencies) was selected. Coherence values were then plotted at this sensor as a function of frequency, along with significance levels based on permutation analyses (see text). Topographic plots of coherence values for all magnetometers, as well as a topographic plot showing significance values, are also displayed. (B) Coherence values as a function of frequency computed as above, but averaged for the maximum-coherence magnetometer in all 16 listeners. Minimum and maximum values across subjects are also shown in the shaded portion. Coherence values show a clear peak in the 4–7 Hz range.


Mentions: We first analyzed MEG data in sensor space to examine cerebro-acoustic coherence across a range of frequencies. For each participant, we selected the magnetometer with the highest summed coherence values between 0 and 20 Hz. For that sensor, we then plotted coherence as a function of frequency, as shown in Figure 2A for 2 example participants. For each participant, we also conducted a nonparametric permutation analysis in which we calculated coherence for 5000 random pairings of acoustic envelopes with neural data; based on the distribution of values obtained through these random pairings, we were able to determine the chance of obtaining coherence values for the true pairing. In both example participants, we see a coherence peak between 4 and 7 Hz that exceeds the P < 0.005 threshold based on this permutation analysis. For these 2 participants, greatest coherence in this frequency range is seen in bilateral frontocentral sensors (Fig. 2A). The maximum-magnetometer coherence plot averaged across all 16 participants, shown in Figure 2B, also shows a clear peak between 4 and 7 Hz. This is consistent with both the acoustic characteristics of the stimuli and the previous literature, and therefore supports our decision to focus on this frequency range for further analyses.
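The analysis just described (band-limited coherence between a sensor signal and an acoustic envelope, tested against a null distribution of random pairings) can be sketched in Python. This is a simplified stand-in, not the authors' pipeline: `scipy.signal.coherence` substitutes for whatever spectral estimator was actually used, the null is built by re-pairing envelopes across trials rather than the paper's 5000 random sentence pairings, and the segment length, permutation count, and function names are illustrative. Inputs are assumed to be arrays of shape (n_trials, n_samples).

```python
import numpy as np
from scipy.signal import coherence

def band_coherence(neural, envelope, fs, band=(4.0, 7.0), nperseg=256):
    """Mean magnitude-squared coherence between two signals within `band` (Hz)."""
    f, cxy = coherence(neural, envelope, fs=fs, nperseg=nperseg)
    mask = (f >= band[0]) & (f <= band[1])
    return float(cxy[mask].mean())

def permutation_p(neural_trials, env_trials, fs, n_perm=200, seed=0):
    """P-value for matched cerebro-acoustic coherence vs. randomly re-paired trials."""
    rng = np.random.default_rng(seed)
    def avg(pairing):
        return float(np.mean([band_coherence(n, e, fs)
                              for n, e in zip(neural_trials, pairing)]))
    observed = avg(env_trials)
    # Null distribution: envelopes randomly re-paired with neural trials,
    # analogous to the paper's random pairings (far fewer are used here).
    null = np.array([avg(env_trials[rng.permutation(len(env_trials))])
                     for _ in range(n_perm)])
    # Add-one correction so the reported p-value is never exactly zero.
    return (np.sum(null >= observed) + 1) / (n_perm + 1)
```

A sensor whose matched coherence exceeds nearly all re-paired values yields a small p-value, mirroring the logic of the P < 0.005 threshold in the text.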

