Multisensory and modality specific processing of visual speech in different regions of the premotor cortex.

Callan DE, Jones JA, Callan A - Front Psychol (2014)

Bottom Line: The left inferior parietal lobule and right cerebellum also showed these properties. The left ventral superior and dorsal premotor cortex (PMvs/PMd) did not show this multisensory enhancement effect, but there was greater activity for the visual only over audio-visual conditions in these areas. The results suggest that the inferior regions of the ventral premotor cortex are involved with integrating multisensory information, whereas more superior and dorsal regions of the PMC are involved with mapping unimodal (in this case visual) sensory features of the speech signal onto articulatory speech gestures.

View Article: PubMed Central - PubMed

Affiliation: Center for Information and Neural Networks, National Institute of Information and Communications Technology, Osaka University, Osaka, Japan; Multisensory Cognition and Computation Laboratory, Universal Communication Research Institute, National Institute of Information and Communications Technology, Kyoto, Japan.

ABSTRACT
Behavioral and neuroimaging studies have demonstrated that brain regions involved with speech production also support speech perception, especially under degraded conditions. The premotor cortex (PMC) has been shown to be active during both observation and execution of action ("Mirror System" properties), and may facilitate speech perception by mapping unimodal and multimodal sensory features onto articulatory speech gestures. For this functional magnetic resonance imaging (fMRI) study, participants identified vowels produced by a speaker in audio-visual (saw the speaker's articulating face and heard her voice), visual only (only saw the speaker's articulating face), and audio only (only heard the speaker's voice) conditions with varying audio signal-to-noise ratios, in order to determine which regions of the PMC are involved with multisensory and modality specific processing of visual speech gestures. The task was designed so that identification could be made with a high level of accuracy from visual only stimuli, to control for task difficulty and differences in intelligibility. The results of the fMRI analysis for the visual only and audio-visual conditions showed overlapping activity in the inferior frontal gyrus and PMC. The left ventral inferior premotor cortex (PMvi) showed properties of multimodal (audio-visual) enhancement with a degraded auditory signal. The left inferior parietal lobule and right cerebellum also showed these properties. The left ventral superior and dorsal premotor cortex (PMvs/PMd) did not show this multisensory enhancement effect, but there was greater activity for the visual only over audio-visual conditions in these areas. The results suggest that the inferior regions of the ventral premotor cortex are involved with integrating multisensory information, whereas more superior and dorsal regions of the PMC are involved with mapping unimodal (in this case visual) sensory features of the speech signal onto articulatory speech gestures.


Figure 8: Brain regions significantly active for the contrast of visual only (VO) relative to the combined audio-visual (AV) conditions, thresholded at p < 0.001 uncorrected. Activity was present in the left PMvs/PMd and the left MT/V5 visual motion processing area. (A) Activity rendered on the surface of the left, back, right, and top of the brain. (B) Section through the brain taken at MNI coordinate −36, 3, 54 shows activity in the PMvs/PMd region. L, left side of brain; R, right side of brain.

Mentions: The contrasts investigating differences between the combined AV conditions and the VO condition are given in Figures 7–8 and Tables 6–7. The contrast of AV vs. VO revealed significant activity (p(FDR) < 0.05, corrected across the entire volume, T = 3.48) only in the STG/S region, which also encompasses primary and secondary auditory cortex (see Figure 7 and Table 6). The ROI analysis did not show any significant activity in PMvi/Broca's, PMvs/PMd, or the cerebellum. The contrast of VO relative to the combined AV conditions did not show significant activity when using the FDR correction for multiple comparisons; therefore, the results are shown using a threshold of p < 0.001 uncorrected (T = 3.73; see Figure 8). Active brain regions include the left PMvs/PMd, the right MT/V5, and the right inferior occipital gyrus (see Figure 8 and Table 7). The ROI analysis (see Table 7) showed significant activity (p < 0.05 corrected) in the left PMvs/PMd (MNI coordinate: −39, 3, 54).
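
The two thresholds in this paragraph reflect two different ways of handling multiple comparisons: a false discovery rate (FDR) correction at p < 0.05 for the AV vs. VO contrast, and a fallback to p < 0.001 uncorrected for VO relative to AV when no voxel survived the FDR correction. The following is a minimal Python sketch of how a Benjamini-Hochberg FDR threshold is derived from voxelwise p-values; it is not the authors' SPM analysis pipeline, and the function name and the p_map array are illustrative assumptions.

import numpy as np

def bh_fdr_threshold(p_values, q=0.05):
    """Benjamini-Hochberg procedure: return the largest p-value at which the
    false discovery rate is controlled at level q, or None if nothing survives."""
    p = np.sort(np.asarray(p_values, dtype=float).ravel())
    n = p.size
    crit = np.arange(1, n + 1) / n * q   # BH critical values (k/n)*q for ranks 1..n
    passing = np.where(p <= crit)[0]
    if passing.size == 0:
        return None                      # no voxel survives FDR correction
    return p[passing.max()]

# Illustrative use: p_map would be a flat array of voxelwise p-values for a contrast.
# thr = bh_fdr_threshold(p_map, q=0.05)
# If thr is None (as happened for the VO relative to AV contrast above), results
# might instead be reported at an uncorrected threshold such as p < 0.001.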
