Effect of visual distraction and auditory feedback on patient effort during robot-assisted movement training after stroke.

Secoli R, Milot MH, Rosati G, Reinkensmeyer DJ - J Neuroeng Rehabil (2011)

Bottom Line: With sound feedback, however, these participants increased their effort and decreased their tracking error close to their baseline levels, while also performing the distracter task successfully. These effects were significantly smaller for the participants who used their non-paretic arm and for the participants without stroke. This effect was greater for the hemiparetic arm, suggesting that the increased demands associated with controlling an affected arm make the motor system more prone to slack when distracted.


Affiliation: Biomechatronic Lab, Department of Mechanical Engineering, University of California, 4200 Engineering Gateway, Irvine, CA 92697-3875, USA. rsecoli@uci.edu

ABSTRACT

Background: Practicing arm and gait movements with robotic assistance after neurologic injury can help patients improve their movement ability, but patients sometimes reduce their effort during training in response to the assistance. Reduced effort has been hypothesized to diminish clinical outcomes of robotic training. To better understand patient slacking, we studied the role of visual distraction and auditory feedback in modulating patient effort during a common robot-assisted tracking task.

Methods: Fourteen participants with chronic left hemiparesis from stroke, five control participants with chronic right hemiparesis and fourteen non-impaired healthy control participants, tracked a visual target with their arms while receiving adaptive assistance from a robotic arm exoskeleton. We compared four practice conditions: the baseline tracking task alone; tracking while also performing a visual distracter task; tracking with the visual distracter and sound feedback; and tracking with sound feedback. For the distracter task, symbols were randomly displayed in the corners of the computer screen, and the participants were instructed to click a mouse button when a target symbol appeared. The sound feedback consisted of a repeating beep, with the frequency of repetition made to increase with increasing tracking error.
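The error-to-beep mapping can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the paper states only that the beep repetition rate increased with tracking error, so the gain, baseline interval, and floor below are assumed values.

```python
def beep_interval(error, base_interval=1.0, gain=0.2, min_interval=0.1):
    """Seconds between beeps: larger tracking error -> faster beeping.

    `base_interval` is the beep period at zero error; `gain` scales how
    quickly the period shrinks with error; `min_interval` caps the rate.
    All three parameters are illustrative assumptions.
    """
    interval = base_interval / (1.0 + gain * abs(error))
    return max(interval, min_interval)
```

A sonification loop would then sleep for `beep_interval(current_error)` between beeps, so the pulse train audibly speeds up as the hand drifts from the target.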

Results: Participants with stroke halved their effort and doubled their tracking error when performing the visual distracter task with their left hemiparetic arm. With sound feedback, however, these participants increased their effort and decreased their tracking error close to their baseline levels, while also performing the distracter task successfully. These effects were significantly smaller for the participants who used their non-paretic arm and for the participants without stroke.

Conclusions: Visual distraction decreased participants' effort during a standard robot-assisted movement training task. This effect was greater for the hemiparetic arm, suggesting that the increased demands associated with controlling an affected arm make the motor system more prone to slack when distracted. Providing an alternate sensory channel for feedback, i.e., auditory feedback of tracking error, enabled the participants to simultaneously perform the tracking task and distracter task effectively. Thus, incorporating real-time auditory feedback of performance errors might improve clinical outcomes of robotic therapy systems.




Figure 1: Human Machine Interface. Visual and audio interface used for the tracking task: the target position is represented by a red filled dot (black dot in the figure) and the hand position by a green filled dot (light gray dot in the figure) on a black screen (white in the figure). A visual distracter is also shown in the bottom right corner.

Mentions: We designed a tracking task, similar to commonly used robotic therapy tracking tasks, in which subjects had to follow a target on a computer screen as accurately as possible in a cyclic left-to-right movement using their affected upper extremity. The movement trajectory was entirely horizontal (along the X axis) and required a left-to-right motion about 18 inches long, with a "minimum jerk" velocity profile for the target [27]. The subject's hand position (the midpoint of the robot handle grasped by the subject) was represented by a green dot, and the target position by a red dot. The user interface was implemented using Microsoft Visual Basic .NET and OpenGL (see Figure 1). While tracking the target, subjects were asked to click a mouse with the hand not attached to the robot whenever a goal visual distracter appeared on the screen. The visual distracters varied randomly according to the combination of three parameters: color (red or green), position of the distracter (bottom left or bottom right of the screen), and position of a yellow horizontal line (above or below the distracter); varying these features yielded eight possible distracters. The two goal distracters, for which participants were instructed to click the mouse button, were chosen from among the eight combinations: a green dot with a yellow line above it appearing at the bottom left of the screen, or a red dot with a yellow line below it appearing at the bottom right. Each distracter was shown for 2 sec, with a random gap of between 1 and 5 sec separating successive distracters.
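The cyclic target motion described above can be sketched with the standard minimum-jerk position polynomial. The 18-inch amplitude comes from the text; the per-segment duration T is an assumption for illustration, since the cycle period is not given here.

```python
def minimum_jerk(t, T, x0, xf):
    """Minimum-jerk position at time t (0 <= t <= T) for a movement
    of duration T from x0 to xf, using the classic fifth-order
    polynomial 10*tau^3 - 15*tau^4 + 6*tau^5."""
    tau = t / T
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

def target_position(t, T=2.0, amplitude=18.0):
    """Horizontal target position (inches) for the cyclic track:
    left-to-right on even segments, right-to-left on odd segments.
    T (segment duration) is an assumed value."""
    segment, phase = divmod(t, T)
    if int(segment) % 2 == 0:
        return minimum_jerk(phase, T, 0.0, amplitude)
    return minimum_jerk(phase, T, amplitude, 0.0)
```

Because the polynomial has zero first and second derivatives at both endpoints, stitching alternating left-to-right and right-to-left segments yields a smooth back-and-forth target with bell-shaped speed profiles, as the task requires.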

