Differences in Early Stages of Tactile ERP Temporal Sequence (P100) in Cortical Organization during Passive Tactile Stimulation in Children with Blindness and Controls.

Ortiz Alonso T, Santos JM, Ortiz Terán L, Borrego Hernández M, Poch Broto J, de Erausquin GA - PLoS ONE (2015)

Bottom Line: On the other hand, they are equally proficient in recognizing stimuli with semantic content (letters). The latter observation is consistent with a role of the P100 in somatosensory-based recognition of complex forms. The cortical differences between the seeing control and blind groups during spatial tactile discrimination are associated with activation in visual pathway (occipital) and task-related association (temporal and frontal) areas.


Affiliation: Department of Psychiatry, Facultad de Medicina, Universidad Complutense, Madrid, Spain.

ABSTRACT
Compared to their seeing counterparts, people with blindness have a greater tactile capacity. Differences in the physiology of object recognition between people with blindness and seeing people have been well documented, but not when tactile stimuli require semantic processing. We used a passive vibrotactile device to examine differences in spatial brain processing, evaluated with event-related potentials (ERPs), between children with blindness (n = 12) and normally seeing children (n = 12) while they learned a simple spatial task (lines with different orientations) or a task involving recognition of letters, in order to describe the early stages of the temporal sequence of processing (from 80 to 220 msec) and to search for evidence of multi-modal cortical organization. We analysed the P100 component of the ERP. Children with blindness showed earlier latencies for cognitive (perceptual) event-related potentials, shorter reaction times, and (paradoxically) a worse ability to identify the spatial direction of the stimulus. On the other hand, they were equally proficient in recognizing stimuli with semantic content (letters). The latter observation is consistent with a role of the P100 in somatosensory-based recognition of complex forms. The cortical differences between the seeing control and blind groups during spatial tactile discrimination are associated with activation in visual pathway (occipital) and task-related association (temporal and frontal) areas. The present results show that early processing of tactile stimulation conveying cross-modal information differs between children with blindness and children with normal vision.
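For readers unfamiliar with ERP component analysis, the sketch below illustrates how a P100 peak latency could be read off an averaged waveform within the 80-220 msec window mentioned above. The sampling rate, epoch timing, window bounds, and synthetic data are assumptions chosen for illustration only; this is not the authors' actual analysis pipeline.

import numpy as np

def p100_peak(erp, sfreq=500.0, tmin=-0.1, window=(0.080, 0.220)):
    """Return (latency_s, amplitude) of the largest positive deflection of a
    1-D averaged ERP within `window` (seconds relative to stimulus onset).
    `sfreq` is the sampling rate in Hz; `tmin` is the epoch start time."""
    times = tmin + np.arange(erp.size) / sfreq          # time axis in seconds
    mask = (times >= window[0]) & (times <= window[1])  # restrict to the P100 window
    idx = np.argmax(erp[mask])                          # positive peak within the window
    return times[mask][idx], erp[mask][idx]

# Example with synthetic data: a positive deflection centred near 110 ms
rng = np.random.default_rng(0)
sfreq, tmin = 500.0, -0.1
t = tmin + np.arange(int(0.5 * sfreq)) / sfreq
erp = 2.0 * np.exp(-((t - 0.110) ** 2) / (2 * 0.015 ** 2)) + 0.1 * rng.standard_normal(t.size)
latency, amplitude = p100_peak(erp, sfreq=sfreq, tmin=tmin)
print(f"P100 latency ~ {latency * 1000:.0f} ms, amplitude ~ {amplitude:.2f} (a.u.)")

In practice the peak would be measured per subject and per condition (lines vs. letters) at the electrodes of interest, and the resulting latencies compared between the blind and seeing groups.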



pone.0124527.g001: Schematic representation of the stimulus presentation setup. Stimuli are flashed on the LCD screen, read out by a camera mounted on the dark glasses, transformed into digital input, and fed as tactile stimulation to the subject's hand.

Mentions: The device consists of a micro-camera (visual receptor) and a tactile stimulator (stimulation matrix). The former is mounted on an eyeglass frame and the latter is passively touched by the volunteer. Images of the surrounding environment are captured by the micro-camera and transferred to the tactile stimulator either wirelessly or through a cable. The stimulator contains a microprocessor equipped with ad hoc algorithms that transform the images captured by the micro-camera into vibrotactile impulses. The stimulation matrix has 28 × 28 stimulation points, corresponding to binned pixels of the image captured by the micro-camera. The stimulation matrix is passively touched by the child with blindness with his/her left hand. Images were projected on a flat screen, captured by the camera so that they occupied the whole field of view, and presented at a rate of one per second. Half of the stimuli were lines and half were letters. The experiment was carried out in a very dimly lit room isolated from external noise. Subjects sat in an armchair 75 cm in front of a 19” LCD screen (refresh rate 100 Hz) that displayed the stimuli (see the schematic representation in Fig 1) and were provided with a keyboard to enter responses to recognized shapes. They were asked to remain as relaxed as possible. The stimuli were delivered to the seeing children exactly as they were presented to the non-seeing children, using the same setup with dark glasses, so that they did not see the screen but received the same tactile stimuli. Two tasks were performed.
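As a rough illustration of the image-to-tactile mapping just described, the sketch below bins a grayscale camera frame into a 28 × 28 matrix of vibration intensities, one value per stimulation point. The cropping strategy and the normalization to an 8-bit drive level are assumptions made for the example; the device's actual algorithms are not specified in the text.

import numpy as np

GRID = 28  # stimulation points per side of the tactile matrix

def frame_to_tactile(frame: np.ndarray, grid: int = GRID) -> np.ndarray:
    """Bin a 2-D grayscale frame into a grid x grid matrix of vibration
    intensities in [0, 255]. The frame is cropped so it bins evenly."""
    h, w = frame.shape
    h, w = h - h % grid, w - w % grid  # crop to a multiple of the grid size
    binned = frame[:h, :w].reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    # Normalize mean brightness per bin to an 8-bit drive level for the actuators.
    span = max(binned.max() - binned.min(), 1e-9)
    return (255 * (binned - binned.min()) / span).astype(np.uint8)

# Example: a synthetic 280 x 280 frame containing a bright diagonal line
frame = np.zeros((280, 280), dtype=float)
np.fill_diagonal(frame, 255.0)
tactile = frame_to_tactile(frame)
print(tactile.shape)           # (28, 28)
print(tactile.diagonal()[:5])  # strongest vibration along the line's path

Binning each 10 × 10 block of pixels to a single stimulation point is what makes a line or letter occupying the whole field of view representable on the fingertip-sized matrix.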

