Auto detection and segmentation of physical activities during a Timed-Up-and-Go (TUG) task in healthy older adults using multiple inertial sensors.

Nguyen HP, Ayachi F, Lavigne-Pelletier C, Blamoutier M, Rahimi F, Boissy P, Jog M, Duval C - J Neuroeng Rehabil (2015)

Bottom Line: Segmentation time stamps were compared to results from two examiners visually segmenting the activities of the TUG. We were able to detect these activities with 100% sensitivity and specificity (n = 192) during the 10 meter TUG. When applying the segmentation algorithms to the 10 meter TUG, we were able to parse 100% of the transition points (n = 224) between segments, with results as reliable as, and less variable than, visual segmentation performed by two independent examiners.

View Article: PubMed Central - PubMed

Affiliation: Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8, Québec, Canada. hpnguyen@utexas.edu.

ABSTRACT

Background: Recently, much attention has been given to the use of inertial sensors for remote monitoring of individuals with limited mobility. However, the focus has been mostly on the detection of symptoms rather than specific activities. The objective of the present study was to develop an automated recognition and segmentation algorithm, based on inertial sensor data, to identify common gross motor patterns during activities of daily living.

Method: A modified Timed-Up-And-Go (TUG) task was used since it comprises four common daily living activities: Standing, Walking, Turning, and Sitting, all performed in a continuous fashion, resulting in six different segments during the task. Sixteen healthy older adults performed two trials each of a 5 and a 10 meter TUG task. They were outfitted with 17 inertial motion sensors covering each body segment. Data from the 10 meter TUG were used to identify pertinent sensors on the trunk, head, hip, knee, and thigh that provided suitable data for detecting and segmenting activities associated with the TUG. Raw data from sensors were detrended to remove sensor drift, normalized, and band-pass filtered with optimal frequencies to reveal kinematic peaks that corresponded to different activities. Segmentation was accomplished by identifying the time stamps of the first minimum or maximum to the right and the left of these peaks. Segmentation time stamps were compared to results from two examiners visually segmenting the activities of the TUG.
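The detrend → normalize → band-pass → peak-flanking-extrema pipeline described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the filter order, cut-off band, and peak-height threshold here are placeholder values, and segment boundaries are taken from the flanking minima only.

```python
import numpy as np
from scipy.signal import butter, detrend, filtfilt, find_peaks

def segment_signal(raw, fs, band=(0.1, 1.0), peak_height=0.3):
    """Return (start, end) time stamps in seconds for each detected activity peak.

    Placeholder parameters: `band` (Hz) and `peak_height` are illustrative,
    not the tuned values from the study.
    """
    # Detrend to remove slow sensor drift
    x = detrend(raw)
    # Normalize to unit peak amplitude
    x = x / np.max(np.abs(x))
    # Zero-phase 2nd-order Butterworth band-pass filter
    b, a = butter(2, band, btype="band", fs=fs)
    x = filtfilt(b, a, x)
    # Kinematic peaks corresponding to candidate activities
    peaks, _ = find_peaks(x, height=peak_height)
    # Local minima flanking each peak mark the segment boundaries
    minima, _ = find_peaks(-x)
    segments = []
    for p in peaks:
        left = minima[minima < p]
        right = minima[minima > p]
        start = left[-1] if left.size else 0
        end = right[0] if right.size else len(x) - 1
        segments.append((start / fs, end / fs))
    return segments
```

Applied to a drifting oscillatory signal (e.g. trunk angular velocity during repeated movements), each returned pair brackets one kinematic peak between its nearest neighboring minima.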

Results: We were able to detect these activities with 100% sensitivity and specificity (n = 192) during the 10 meter TUG. This rate of success was subsequently confirmed in the 5 meter TUG (n = 192) without altering the parameters of the algorithm. When applying the segmentation algorithms to the 10 meter TUG, we were able to parse 100% of the transition points (n = 224) between segments, with results as reliable as, and less variable than, visual segmentation performed by two independent examiners.
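The sensitivity and specificity figures above follow the standard definitions. A minimal sketch, using hypothetical counts for illustration rather than the study's raw detection tallies:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical illustration: every activity instance detected, no false alarms
sens, spec = sensitivity_specificity(tp=192, fn=0, tn=192, fp=0)
# sens == 1.0 and spec == 1.0, i.e. 100% sensitivity and specificity
```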

Conclusions: The present study lays the foundation for the development of a comprehensive algorithm to detect and segment naturalistic activities using inertial sensors, with the aim of automatically evaluating motor performance within the detected tasks.

No MeSH data available.


Related in: MedlinePlus

Schematic of the TUG task and the inertial sensor motion capture system. A) Spatial schematic of a TUG path and different transition points. Seven transitions were identified among the activities performed during a TUG. These transitions are: 1) sit-to-stand 2) stand-to-walk-out 3) walk-out-to-turn 4) turn-to-walk-in 5) walk-in-to-turn 6) turn-to-sit 7) stand-to-sit. B) Diagram of the 17 sensors and their location on the Animazoo suit. C) A close-up view of the sensors on the shoulders, trunk and hip. D) The orientation of the axes on the sensor. Using the right-hand Cartesian coordinate system, the y-axis line is aligned along the length of the inertial sensor while the x-axis is aligned along the width of the sensor. E) Global work flow of the algorithm to detect the activities and transition between activities using an inertial sensor motion capture system.
© Copyright Policy - open-access


Mentions: In this study, participants performed two TUG tasks presented in random order, one 10 meters in length and the other 5 meters. Participants performed two trials of each TUG task. The algorithm was based on the 10 meter TUG because it provided more walking strides as well as a more gradual transition between Walking and Turning. The 5 meter TUG was used to evaluate the extensibility of the algorithm to a shorter-distance TUG task. The TUG was used simply because it contains key activities (Standing, Walking, Turning and Sitting) performed in a continuous fashion. Data recording started with participants in a standing position to align the sensors with the motion capture system; participants then sat down in a plastic armchair to begin the TUG task. From the sitting position, they stood up with their arms on the chair, walked to a distance marker on the floor, turned around, walked back to the chair, turned around again, and finally sat down (Figure 1A). Participants were asked to perform these tasks at their own pace, and no instructions were given on how to stand, sit, walk, or turn.

