The relationship between level of autistic traits and local bias in the context of the McGurk effect.

Ujiie Y, Asai T, Wakabayashi A - Front Psychol (2015)

Bottom Line: The McGurk effect is a well-known illustration that demonstrates the influence of visual information on hearing in the context of speech perception. The results revealed that global facial information facilitates the influence of visual speech cues on McGurk stimuli. These results suggest that individual differences in the McGurk effect might be due to a weak ability to process global facial information in individuals with high levels of autistic traits.


Affiliation: Information Processing and Computer Sciences, Graduate School of Advanced Integration Science, Chiba University, Chiba, Japan; Japan Society for the Promotion of Science, Tokyo, Japan.

ABSTRACT
The McGurk effect is a well-known illustration that demonstrates the influence of visual information on hearing in the context of speech perception. Some studies have reported that individuals with autism spectrum disorder (ASD) display abnormal processing of audio-visual speech integration, while other studies showed contradictory results. Based on the dimensional model of ASD, we administered two analog studies to examine the link between level of autistic traits, as assessed by the Autism Spectrum Quotient (AQ), and the McGurk effect among a sample of university students. In the first experiment, we found that autistic traits correlated negatively with fused (McGurk) responses. Then, we manipulated presentation types of visual stimuli to examine whether the local bias toward visual speech cues modulated individual differences in the McGurk effect. The presentation included four types of visual images, comprising no image, mouth only, mouth and eyes, and full face. The results revealed that global facial information facilitates the influence of visual speech cues on McGurk stimuli. Moreover, individual differences between groups with low and high levels of autistic traits appeared when the full-face visual speech cue with an incongruent voice condition was presented. These results suggest that individual differences in the McGurk effect might be due to a weak ability to process global facial information in individuals with high levels of autistic traits.
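
As a rough illustration of the analysis pattern described in the abstract (not the authors' code), a Pearson correlation between AQ scores and the proportion of fused (McGurk) responses could be computed as in the following sketch; the data values are hypothetical placeholders, not the study's data.

# Hypothetical sketch: correlating Autism Spectrum Quotient (AQ) scores with the
# proportion of fused (McGurk) responses on incongruent trials. All values below
# are illustrative placeholders.
from scipy import stats

aq_scores = [12, 18, 25, 31, 22, 15, 28, 35, 20, 26]           # AQ per participant (hypothetical)
fused_rate = [0.62, 0.55, 0.40, 0.28, 0.47, 0.58, 0.33, 0.21,  # proportion of fused responses
              0.50, 0.37]                                       # per participant (hypothetical)

r, p = stats.pearsonr(aq_scores, fused_rate)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # a negative r would mirror the reported pattern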

No MeSH data available.



Figure 2: Examples of the four types of visual stimuli used in Experiment 2. (A) No image (audio only). (B) Mouth-only presentation. (C) Eyes and mouth presentation. (D) Full-face image presentation. All of these images were presented with a congruent or incongruent voice in the experiment.

Mentions: Four types of visual stimulus presentations were created for each condition: no image (audio only), mouth only, eyes and mouth, and full face (examples are shown in Figure 2). The stimuli were prepared with Adobe Premiere Pro CS6, which was used to crop the eye and mouth regions from the video images. The eye region extended from the inner corner to the outer corner of the eyes, and the mouth region covered the full range of motion of the upper and lower lips. The task consisted of 72 congruent and 24 incongruent stimuli per block.
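
A minimal sketch of how a trial list matching this block structure could be assembled is given below. The even split of the 72 congruent and 24 incongruent trials across the four visual presentation types is an assumption for illustration, and the function and condition names are hypothetical rather than taken from the study's materials.

# Minimal sketch of one block: four visual presentation types, each paired with
# congruent and incongruent voices, totalling 72 congruent and 24 incongruent
# trials. The even split across visual types is an assumption for illustration.
import random

VISUAL_TYPES = ["no_image", "mouth_only", "eyes_and_mouth", "full_face"]

def build_block(seed=None):
    rng = random.Random(seed)
    trials = []
    for visual in VISUAL_TYPES:
        trials += [{"visual": visual, "voice": "congruent"}] * 18    # 4 x 18 = 72 congruent
        trials += [{"visual": visual, "voice": "incongruent"}] * 6   # 4 x 6  = 24 incongruent
    rng.shuffle(trials)  # randomize presentation order within the block
    return trials

block = build_block(seed=1)
print(len(block), sum(t["voice"] == "incongruent" for t in block))   # 96 24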

