Spatially valid proprioceptive cues improve the detection of a visual stimulus.

Jackson CP, Miall RC, Balslev D - Exp Brain Res (2010)

Bottom Line: Proprioceptive cues were given by applying a brief lateral force to the participant's arm, either in the same direction (validly cued) or in the opposite direction (invalidly cued) to the on-screen location of the mask. The d' detection rate of the target increased when the direction of the proprioceptive stimulus was compatible with the location of the visual target compared to when it was incompatible. These results suggest that proprioception influences the allocation of attention in visual space.


Affiliation: Centre for Neuroscience Studies, Botterell Hall, Queen's University, Kingston, ON, K7L 3N6, Canada. carl@biomed.queensu.ca

ABSTRACT
Vision and proprioception are the main sensory modalities that convey hand location and direction of movement. Fusion of these sensory signals into a single robust percept is now well documented. However, it is not known whether these modalities also interact in the spatial allocation of attention, which has been demonstrated for other modality pairings. The aim of this study was to test whether proprioceptive signals can spatially cue a visual target to improve its detection. Participants were instructed to use a planar manipulandum in a forward reaching action and determine during this movement whether a near-threshold visual target appeared at either of two lateral positions. The target presentation was followed by a masking stimulus, which made its possible location unambiguous, but not its presence. Proprioceptive cues were given by applying a brief lateral force to the participant's arm, either in the same direction (validly cued) or in the opposite direction (invalidly cued) to the on-screen location of the mask. The d' detection rate of the target increased when the direction of the proprioceptive stimulus was compatible with the location of the visual target compared to when it was incompatible. These results suggest that proprioception influences the allocation of attention in visual space.
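The abstract reports detection sensitivity as d'. Its computation is not spelled out in this excerpt; the following is a minimal sketch of the standard signal-detection formula, d' = z(hit rate) - z(false-alarm rate), using purely illustrative trial counts rather than data from the study.

    # Minimal sketch of the standard signal-detection sensitivity index d'.
    # The counts passed in below are illustrative placeholders, not study data.
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """d' = z(hit rate) - z(false-alarm rate)."""
        # Log-linear correction keeps z finite when a rate would be 0 or 1.
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Hypothetical counts for one cueing condition:
    print(d_prime(hits=70, misses=30, false_alarms=20, correct_rejections=80))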


Fig. 1: Force profile, experimental set-up and procedure. a Set-up for Experiments 1 and 2 with monitor placed frontoparallel. b Set-up for Experiments 3 and 4 with projection-mirror system. c Participants reached forward, experienced a lateral perturbation and had to detect a visual target that could appear in the direction of the perturbation (validly cued condition, shown) or opposite to it (invalidly cued condition). Left column: screen display; right column: participant's hand position in the vBOT workspace. Empty arrow: direction of perturbation; dashed arrow: direction of hand/cursor motion.

Mentions: Participants used a vBOT robotic manipulandum (Howard et al. 2009) with their right hand. The manipulandum was held in a power grasp with the thumb uppermost and the palm of the hand to the right of the vertical handle. A computer sampled the position and velocity of the manipulandum and updated the forces imposed on the handle with millisecond resolution. In Experiments 1 and 2, participants controlled the movement of a cursor (black circle) subtending 0.57° visual angle on a white computer screen (40 × 30 cm), positioned frontoparallel 1 m in front of them (Fig. 1a). The movement of the manipulandum mapped 1:1 onto the movement of the cursor on the screen, so if the participants moved 10 cm using the robot, the cursor moved 10 cm (5.7° visual angle) on the screen. In Experiments 3 and 4, we used a projection-mirror system (Fig. 1b) to provide a more spatially consistent visuomotor environment, in which a forward movement of the hand translated into an identical, co-localized cursor movement on the horizontal screen, rather than an upward movement as in the set-up of Experiments 1 and 2. In this set-up, participants gazed down into a mirror that reflected a horizontal screen positioned such that they saw stimuli as virtual images in the same plane as their hand on the robot underneath, and the visual cursor was spatially aligned with the position of the vBOT handle. The viewing distance to the most distant virtual image was 66 cm. Assuming an arm length of around 70 cm, all visual stimuli were thus within peripersonal space. We kept the on-screen size of stimuli the same across experiments, so the cursor size at the eye increased to 0.87° in Experiment 3. A 10 cm hand movement caused the cursor to move 10 cm (8.7° visual angle). In Experiments 1 and 2, horizontal blinkers attached to a pair of safety goggles blocked the direct view of the hand; in Experiments 3 and 4, the projection-mirror apparatus blocked the view of the hand. Participants responded with their right foot on a pair of foot switches. They were instructed to lift their toe if the target was present and lift their heel if it was absent.
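The visual angles quoted above (0.57° and 5.7° at the 1 m monitor; 0.87° and 8.7° at the 66 cm virtual-image distance) follow from the standard relation θ = 2·arctan(s / 2d). A short sketch reproducing this arithmetic is given below; the ~1 cm on-screen cursor diameter is inferred from the quoted angles and is not stated explicitly in the text.

    # Sketch of the visual-angle arithmetic behind the figures quoted above:
    # theta = 2 * arctan(size / (2 * viewing_distance)).
    import math

    def visual_angle_deg(size_cm, distance_cm):
        return math.degrees(2.0 * math.atan(size_cm / (2.0 * distance_cm)))

    # Experiments 1 and 2: frontoparallel monitor 1 m away.
    print(visual_angle_deg(1, 100))   # ~0.57 deg (cursor, assuming ~1 cm diameter)
    print(visual_angle_deg(10, 100))  # ~5.7 deg for a 10 cm cursor movement

    # Experiments 3 and 4: virtual image at 66 cm in the mirror set-up.
    print(visual_angle_deg(1, 66))    # ~0.87 deg cursor
    print(visual_angle_deg(10, 66))   # ~8.7 deg for a 10 cm cursor movement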
