The Temporal Dynamics of Scene Processing: A Multifaceted EEG Investigation


ABSTRACT

Our remarkable ability to process complex visual scenes is supported by a network of scene-selective cortical regions. Despite growing knowledge about scene representations in these regions, much less is known about the temporal dynamics with which these representations emerge. We conducted two experiments aimed at identifying and characterizing the earliest markers of scene-specific processing. In the first experiment, human participants viewed images of scenes, faces, and everyday objects while event-related potentials (ERPs) were recorded. We found that the first ERP component to evince a significantly stronger response to scenes than to the other categories was the P2, peaking ∼220 ms after stimulus onset. To establish that the P2 component reflects scene-specific processing, in the second experiment we recorded ERPs while participants viewed diverse real-world scenes spanning three global scene properties: spatial expanse (open/closed), relative distance (near/far), and naturalness (man-made/natural). We found that P2 amplitude was sensitive to these scene properties both at the categorical level, distinguishing between open and closed natural scenes, and at the single-image level, reflecting both computationally derived scene statistics and behavioral ratings of naturalness and spatial expanse. Together, these results establish the P2 as an ERP marker of scene processing and demonstrate that scene-specific global information is available in the neural response as early as 220 ms.




Figure 3: Experiment 1 results. a, Mean N1/170 and P2 peak amplitudes (left and right columns, respectively) in response to scenes (red), faces (blue), and objects (green); peak amplitudes are plotted separately for each hemisphere at the posterior lateral electrode sites. Error bars indicate the SEM. Significant differences (p < 0.05) between pairs of categories are denoted by asterisks. b, Group-averaged ERPs (n = 12) for the three categories (scenes in red, faces in blue, objects in green) for the left and right hemispheres (data are plotted for the posterior lateral sites). c, ERP difference waveforms depicting face sensitivity (blue, faces − objects) and scene sensitivity (red, scenes − objects) over time for the left and right hemispheres (data are plotted for the posterior lateral sites). The waveforms (solid lines) are shown with across-subjects 95% confidence intervals around them (light blue and light red for face and scene sensitivity, respectively).

Mentions: As expected, the N1 component showed the well-known N170 face effect (Bentin et al., 1996), with its strongest amplitude evoked by images of faces relative to images of objects and scenes (Table 2, N1/170 peak amplitudes ANOVA). This effect was most pronounced at posterior lateral electrodes (Fig. 3a, left), as revealed by a significant category × site × mediality interaction (F(4,44) = 7.71, MSE = 9.107, p = 0.002). A follow-up category × site ANOVA for the lateral sites (posterior, central, and anterior) revealed an N170 effect that was restricted to the posterior lateral sites (F(2,22) = 7.01, MSE = 4.65, p = 0.007), with a stronger amplitude to faces (mean = −4.83 μV, SEM = 0.98) than to objects (mean = −2.39 μV, SEM = 0.82; t(11) = 2.53, p = 0.01) or scenes (mean = −2.03 μV, SEM = 0.73; t(11) = 3.70, p = 0.002), which did not differ from each other (t(11) = 0.52, p = 0.30; Fig. 3a, left). No significant main effects or interactions were found for the medial sites (Table 2, N1/170 peak amplitudes ANOVA).
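The contrasts above are standard repeated-measures comparisons over per-subject peak amplitudes. The following is a minimal sketch of how such an analysis might be run, not the authors' analysis code: it uses simulated amplitude values and the scipy/statsmodels packages, so all numbers, variable names, and package choices here are illustrative assumptions.

```python
# Hedged sketch (not the authors' pipeline): repeated-measures ANOVA and
# follow-up paired t-tests on hypothetical per-subject N1/170 peak amplitudes.
import numpy as np
import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n_subjects = 12  # matches the n = 12 reported for Experiment 1

# Simulated peak amplitudes (µV) at posterior lateral sites; in the real
# analysis these would come from peak-picking each subject's ERP.
faces = rng.normal(-4.8, 1.0, n_subjects)
objects = rng.normal(-2.4, 1.0, n_subjects)
scenes = rng.normal(-2.0, 1.0, n_subjects)

# One-way repeated-measures ANOVA over category (long-format table).
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), 3),
    "category": ["faces", "objects", "scenes"] * n_subjects,
    "amplitude": np.column_stack([faces, objects, scenes]).ravel(),
})
res = AnovaRM(df, depvar="amplitude", subject="subject",
              within=["category"]).fit()
print(res)

# Paired t-tests between category pairs, analogous to the follow-up
# comparisons reported above (df = n_subjects - 1 = 11).
for a, b, label in [(faces, objects, "faces vs. objects"),
                    (faces, scenes, "faces vs. scenes"),
                    (objects, scenes, "objects vs. scenes")]:
    t, p = ttest_rel(a, b)
    print(f"{label}: t(11) = {t:.2f}, p = {p:.3f}")
```

A fuller reproduction would also include the site and mediality factors in the ANOVA and an appropriate correction for multiple comparisons; the sketch only illustrates the category contrast at the posterior lateral sites.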

