Auto detection and segmentation of physical activities during a Timed-Up-and-Go (TUG) task in healthy older adults using multiple inertial sensors.

Nguyen HP, Ayachi F, Lavigne-Pelletier C, Blamoutier M, Rahimi F, Boissy P, Jog M, Duval C - J Neuroeng Rehabil (2015)

Bottom Line: Segmentation time stamps were compared to results from two examiners visually segmenting the activities of the TUG. We were able to detect these activities with 100% sensitivity and specificity (n = 192) during the 10 meter TUG. When applying the segmentation algorithms to the 10 meter TUG, we were able to parse 100% of the transition points (n = 224) between segments, and these were as reliable as, and less variable than, visual segmentation performed by two independent examiners.


Affiliation: Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8, Québec, Canada. hpnguyen@utexas.edu.

ABSTRACT

Background: Recently, much attention has been given to the use of inertial sensors for remote monitoring of individuals with limited mobility. However, the focus has been mostly on the detection of symptoms, not specific activities. The objective of the present study was to develop an automated recognition and segmentation algorithm based on inertial sensor data to identify common gross motor patterns during activities of daily living.

Method: A modified Timed-Up-and-Go (TUG) task was used, since it comprises four common daily-living activities: Standing, Walking, Turning, and Sitting, all performed in a continuous fashion, resulting in six different segments during the task. Sixteen healthy older adults performed two trials each of a 5 and 10 meter TUG task. They were outfitted with 17 inertial motion sensors covering each body segment. Data from the 10 meter TUG were used to identify pertinent sensors on the trunk, head, hip, knee, and thigh that provided suitable data for detecting and segmenting activities associated with the TUG. Raw sensor data were detrended to remove sensor drift, normalized, and band-pass filtered with optimal frequencies to reveal kinematic peaks that corresponded to different activities. Segmentation was accomplished by identifying the time stamps of the first minimum or maximum to the right and left of these peaks. Segmentation time stamps were compared to results from two examiners visually segmenting the activities of the TUG.
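The abstract does not give the filter parameters or peak-detection thresholds, so the following Python sketch only illustrates the general pipeline described above: detrend, normalize, band-pass filter, then take the first local minima to the left and right of a detected peak as segment boundaries. The sampling rate and cutoff frequencies are assumed values, not those of the study.

```python
import numpy as np
from scipy.signal import detrend, butter, filtfilt

def preprocess(signal, fs, low_hz=0.1, high_hz=2.0):
    """Detrend, normalize, and band-pass filter a raw gyroscope trace.

    Cutoff frequencies here are illustrative assumptions, not the
    study's optimal frequencies.
    """
    x = detrend(signal)                     # remove slow sensor drift
    x = x / np.max(np.abs(x))               # normalize to [-1, 1]
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=fs)
    return filtfilt(b, a, x)                # zero-phase band-pass filter

def segment_around_peak(x, peak_idx):
    """Indices of the first local minimum left and right of a peak,
    approximating the transition points before and after an activity."""
    left = peak_idx
    while left > 0 and x[left - 1] < x[left]:
        left -= 1                           # descend until the signal rises again
    right = peak_idx
    while right < len(x) - 1 and x[right + 1] < x[right]:
        right += 1
    return left, right
```

On a trace containing a Turning peak, `segment_around_peak` would return the two time indices bounding that segment; dividing by the sampling rate converts them to the time stamps compared against visual segmentation.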

Results: We were able to detect these activities with 100% sensitivity and specificity (n = 192) during the 10 meter TUG. This success rate was subsequently confirmed in the 5 meter TUG (n = 192) without altering the parameters of the algorithm. When applying the segmentation algorithms to the 10 meter TUG, we were able to parse 100% of the transition points (n = 224) between segments, and these were as reliable as, and less variable than, visual segmentation performed by two independent examiners.

Conclusions: The present study lays the foundation for the development of a comprehensive algorithm to detect and segment naturalistic activities using inertial sensors, with the aim of automatically evaluating motor performance within the detected tasks.


Fig. 4: Temporal schematic of segment transitions during a TUG and kinematic patterns during turning transitions. A) Selected inertial sensors are identified on the Y-axis (trunk ωy) of the graphs, along with the kinematic pattern during a walk-in-to-turn transition for all participants (n = 16). These patterns showed consistent kinematic behavior of this sensor during Turning; it was therefore used to identify Turning as well as the transitions to the activities before and after Turning. B) The raw and filtered signals of the trunk ωy, with two distinct maximum peaks indicating the two turns during the TUG task. The time stamps of the first minimum peaks to the left and right of these peaks were used to approximate the transition points before and after Turning.

Mentions: The kinematic patterns of the joints and limbs during the performance of these activities were used to identify a set of sensors that marked the transition point for each segment. For example, the patterns of trunk angular velocity for all participants during the walk-in-to-turn transition are shown in Figure 4A. While there was variability between participants in the duration and amplitude of these signals, a similar pattern indicated the beginning and end of Turning. While the maximal peak in trunk angular velocity (ωy, Trunk) was used to detect Turning, the time stamps of the first minimum to the left and right (tmin) of these peaks were used to approximate the transitions between Walking and Turning (Figure 4B). Similar patterns were also exhibited by the hip and head sensors. However, these sensors were not always in phase with each other; some might have lagged while others led. Therefore, an average of the sensor information from the head, trunk, and hip was used as a surrogate approximation of the walk-to-turn and turn-to-walk transitions. The transition times for a few selected sensors were individually and collectively (using the mean) compared with the visual segmentation times, and the sensor combinations that yielded the smallest differences across all participants were used to estimate the transitions between these activities. The selected sensors and the algorithm to detect these transitions are presented in Figure 5.
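The paper's exact selection procedure is not reproduced here; the following Python sketch is a hypothetical illustration of the idea described above, exhaustively averaging transition-time estimates over subsets of sensors and keeping the combination whose mean estimate lies closest to the visual reference.

```python
import numpy as np
from itertools import combinations

def best_sensor_combination(transition_times, visual_times):
    """Pick the sensor subset whose averaged transition estimates best
    match the visual segmentation reference.

    transition_times: dict mapping sensor name -> array of per-trial
        transition times in seconds (illustrative structure, assumed).
    visual_times: array of reference times from visual segmentation.
    Returns (best subset as a tuple of names, mean absolute error in s).
    """
    best, best_err = None, np.inf
    sensors = list(transition_times)
    for r in range(1, len(sensors) + 1):
        for combo in combinations(sensors, r):
            # surrogate estimate: mean across the chosen sensors
            est = np.mean([transition_times[s] for s in combo], axis=0)
            err = float(np.mean(np.abs(est - visual_times)))
            if err < best_err:
                best, best_err = combo, err
    return best, best_err
```

With head, trunk, and hip estimates as inputs, this reproduces the averaging strategy used as a surrogate for the walk-to-turn and turn-to-walk transitions, selecting whichever combination minimizes the discrepancy with the examiners' time stamps.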
