Electrophysiological evidences demonstrating differences in brain functions between nonmusicians and musicians.

Zhang L, Peng W, Chen J, Hu L - Sci Rep (2015)

Bottom Line: However, evidence demonstrating that long-term music training modulates higher brain functions is surprisingly rare. We observed that compared to nonmusicians, musicians have (1) larger high-frequency steady-state responses, which reflect the auditory information processing within the sensory system, and (2) smaller low-frequency vertex potentials, which reflect higher cognitive information processing within the novelty/saliency detection system. Therefore, we speculate that long-term music training facilitates "bottom-up" auditory information processing in the sensory system and enhances "top-down" cognitive inhibition of the novelty/saliency detection system.

View Article: PubMed Central - PubMed

Affiliation: Key Laboratory of Cognition and Personality (Ministry of Education) and School of Psychology, Southwest University, Chongqing, China.

ABSTRACT
Long-term music training can improve sensorimotor skills, as playing a musical instrument requires the functional integration of information related to multimodal sensory perception and motor execution. This functional integration often leads to functional reorganization of cerebral cortices, including auditory, visual, and motor areas. Moreover, music appreciation can modulate emotions (e.g., stress relief), and long-term music training can enhance a musician's self-control and self-evaluation ability. Therefore, the neural processing of music can also be related to certain higher brain cognitive functions. However, evidence demonstrating that long-term music training modulates higher brain functions is surprisingly rare. Here, we aimed to comprehensively explore the neural changes induced by long-term music training by assessing the differences of transient and quasi-steady-state auditory-evoked potentials between nonmusicians and musicians. We observed that compared to nonmusicians, musicians have (1) larger high-frequency steady-state responses, which reflect the auditory information processing within the sensory system, and (2) smaller low-frequency vertex potentials, which reflect higher cognitive information processing within the novelty/saliency detection system. Therefore, we speculate that long-term music training facilitates "bottom-up" auditory information processing in the sensory system and enhances "top-down" cognitive inhibition of the novelty/saliency detection system.

No MeSH data available.


Related in: MedlinePlus

The comparison of event-related potentials (ERPs) evoked by transient auditory stimuli between nonmusicians and musicians. ERPs evoked by transient auditory stimuli (group-level average; FCz-A1A2) from nonmusicians and musicians are marked in red and blue, respectively. x axis, latency (ms); y axis, amplitude (μV). The scalp topographies of N1 and P2 in the auditory ERPs, from both nonmusicians and musicians, are displayed in the upper and lower parts, respectively. Gray scale represents the P values obtained at each time point using an independent-sample t-test assessing the difference in auditory ERPs between nonmusicians and musicians. Whereas N1 latencies and amplitudes did not differ significantly between nonmusicians and musicians, P2 latencies were significantly shorter and P2 amplitudes significantly larger for nonmusicians than for musicians.
© Copyright Policy - open-access

Mentions: Figure 2 shows the group-level average transient AEP waveforms (FCz-A1A2) and the scalp topographies at the peak latencies of N1 and P2 for both nonmusicians and musicians (n = 14 for each group). Scalp topographies of both N1 and P2 were remarkably similar between nonmusicians and musicians. The N1 was maximal at the fronto-central region and extended bilaterally towards fronto-temporal regions, and the P2 was more centrally distributed at the fronto-central region [22]. Whereas N1 latencies and amplitudes were not significantly different between nonmusicians and musicians (N1 latency: 105 ± 13 ms vs. 115 ± 19 ms, P = 0.13; N1 amplitude: −6.65 ± 2.43 μV vs. −5.37 ± 2.02 μV, P = 0.14), both P2 latencies and amplitudes were significantly different between the two groups (P2 latency: 174 ± 16 ms vs. 200 ± 29 ms, P = 0.008; P2 amplitude: 5.91 ± 3.47 μV vs. 3.29 ± 1.63 μV, P = 0.01). Similar results were obtained when N1 and P2 amplitudes were instead measured as mean amplitudes within their respective peak intervals (N1: 80–120 ms; P2: 155–180 ms). Whereas mean peak N1 amplitudes were not significantly different between nonmusicians and musicians (−4.23 ± 2.00 μV vs. −3.04 ± 2.33 μV; P = 0.16), mean peak P2 amplitudes were significantly different between the two groups (4.65 ± 2.89 μV vs. 1.46 ± 2.57 μV; P = 0.005).
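The group comparisons above rest on independent-sample t-tests over peak measures from two groups of 14 participants. A minimal sketch of such a comparison, using hypothetical amplitude values drawn to loosely match the reported P2 means and SDs (not the study's actual data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical P2 peak amplitudes (μV), n = 14 per group, simulated
# around the reported summary statistics (5.91 ± 3.47 vs. 3.29 ± 1.63).
nonmusicians = rng.normal(loc=5.91, scale=3.47, size=14)
musicians = rng.normal(loc=3.29, scale=1.63, size=14)

# Independent-sample t-test comparing the two groups.
t_stat, p_value = stats.ttest_ind(nonmusicians, musicians)
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")
```

The same test applied at every time point of the ERP waveform yields the point-by-point P values rendered as the gray scale in Figure 2 (typically with some correction for multiple comparisons across time points).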

