Comparing the processing of music and language meaning using EEG and fMRI provides evidence for similar and distinct neural representations.

Steinbeis N, Koelsch S - PLoS ONE (2008)

Bottom Line: This paper presents evidence showing that expressed affect is a primary pathway to music meaning and that meaning in music is represented in a very similar fashion to language meaning. Most importantly, however, when primed by affective words, single chords incongruous to the preceding affect also elicited an N400 and activated the right posterior STS, an area implicated in processing the meaning of a variety of signals (e.g. prosody, voices, motion). This provides an important piece of evidence in support of music meaning being represented in a very similar but also distinct fashion to language meaning: both elicit an N400, but activate different portions of the right temporal lobe.

View Article: PubMed Central - PubMed

Affiliation: Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany. steinb@cbs.mpg.de

ABSTRACT
Recent demonstrations that music is capable of conveying semantically meaningful information have raised several questions as to what the underlying mechanisms of establishing meaning in music are, and whether the meaning of music is represented in a comparable fashion to language meaning. This paper presents evidence showing that expressed affect is a primary pathway to music meaning and that meaning in music is represented in a very similar fashion to language meaning. In two experiments using EEG and fMRI, it was shown that single chords varying in harmonic roughness (consonance/dissonance), and thus in perceived affect, could prime the processing of subsequently presented affective target words, as indicated by an increased N400 and activation of the right middle temporal gyrus (MTG). Most importantly, however, when primed by affective words, single chords incongruous to the preceding affect also elicited an N400 and activated the right posterior STS, an area implicated in processing the meaning of a variety of signals (e.g. prosody, voices, motion). This provides an important piece of evidence in support of music meaning being represented in a very similar but also distinct fashion to language meaning: both elicit an N400, but activate different portions of the right temporal lobe.
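The cross-modal priming logic of the two experiments can be made concrete with a small sketch. The snippet below enumerates the 2x2 design implied by the abstract (prime affect crossed with target-word valence, with congruity defined by their match); the condition labels are illustrative assumptions, not the authors' stimulus lists.

```python
# Hypothetical illustration of the 2x2 cross-modal priming design described
# in the abstract: prime affect (consonant/pleasant vs. dissonant/unpleasant)
# crossed with target-word valence; congruity is derived from the match.
from itertools import product

PRIME_AFFECT = ("pleasant_chord", "unpleasant_chord")   # consonant vs. dissonant
TARGET_VALENCE = ("positive_word", "negative_word")

conditions = []
for prime, target in product(PRIME_AFFECT, TARGET_VALENCE):
    match = prime.startswith("pleasant") == target.startswith("positive")
    conditions.append({"prime": prime,
                       "target": target,
                       "congruity": "congruous" if match else "incongruous"})

for c in conditions:
    print(c)
```

The same cell structure applies in reverse for the chord-target experiment, with affective words as primes and single chords as targets.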


pone-0002226-g003: ERPs in response to word targets (A) and chord targets (B). Word targets incongruous with the expressed affect of the preceding chord elicited an increased N400 between 300–500 ms distributed broadly over the scalp (A) with a centro-parietal maximum. Chord targets incongruous with the expressed affect of the preceding word elicited an increased N400 between 200–400 ms distributed broadly over the scalp (B) with a fronto-central maximum.

Mentions: Analysis of the ERPs, time-locked to the correctly evaluated target words (which were either congruous or incongruous with the preceding chord prime), revealed an increased negativity between 300–500 ms distributed broadly over the scalp in response to incongruous targets (Figure 3A). This was indicated by a significant interaction between the factors Prime and Target (F(1,19) = 6.72; p<0.05). There were no main effects of either prime or target.
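To see how the reported statistic relates to the design, here is a minimal, hypothetical sketch of that analysis step: per-subject N400 mean amplitudes in the 300–500 ms window for congruous versus incongruous word targets, with the Prime x Target interaction tested as a paired contrast. All values are simulated placeholders, not data from the study, and the shortcut F = t^2 holds only because each factor has two levels.

```python
# Simulated sketch of the Prime x Target interaction test described above.
# Amplitudes are invented placeholders (microvolts), not data from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 20  # matches the degrees of freedom of the reported F(1,19)

# Per-subject mean amplitudes in the 300-500 ms window, averaged over
# centro-parietal channels, for congruous and incongruous word targets.
congruous = rng.normal(loc=-1.0, scale=1.5, size=n_subjects)
incongruous = congruous + rng.normal(loc=-0.8, scale=1.0, size=n_subjects)

# With two levels per factor, the interaction of a 2x2 repeated-measures
# ANOVA is equivalent to a paired t-test on the congruity difference
# scores, so F = t^2 with 1 and n-1 degrees of freedom.
t, p = stats.ttest_rel(incongruous, congruous)
print(f"F(1,{n_subjects - 1}) = {t**2:.2f}, p = {p:.4f}")
```

A full analysis would of course extract these mean amplitudes from baseline-corrected, artifact-rejected epochs rather than simulate them; the snippet only illustrates the final statistical contrast.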

