Computer-assisted 3D kinematic analysis of all leg joints in walking insects.

Bender JA, Simpson EM, Ritzmann RE - PLoS ONE (2010)

Bottom Line: We improved the legs' visibility by painting white dots on the joints, similar to techniques used for digitizing human motion. Our experimental design reduced the complexity of the tracking problem by tethering the insect and allowing it to walk in place on a lightly oiled glass surface, but in principle, the algorithms implemented are extensible to free walking. We encourage collaborative enhancements to make this tool both better and widely utilized.

Affiliation: Department of Biology, Case Western Reserve University, Cleveland, Ohio, United States of America. jbender@case.edu

ABSTRACT
High-speed video can provide fine-scaled analysis of animal behavior. However, extracting behavioral data from video sequences is a time-consuming, tedious, subjective task. These issues are exacerbated where accurate behavioral descriptions require analysis of multiple points in three dimensions. We describe a new computer program written to assist a user in simultaneously extracting three-dimensional kinematics of multiple points on each of an insect's six legs. Digital video of a walking cockroach was collected in grayscale at 500 fps from two synchronized, calibrated cameras. We improved the legs' visibility by painting white dots on the joints, similar to techniques used for digitizing human motion. Compared to manual digitization of 26 points on the legs over a single, 8-second bout of walking (or 106,496 individual 3D points), our software achieved approximately 90% of the accuracy with 10% of the labor. Our experimental design reduced the complexity of the tracking problem by tethering the insect and allowing it to walk in place on a lightly oiled glass surface, but in principle, the algorithms implemented are extensible to free walking. Our software is free and open-source, written in the free language Python and including a graphical user interface for configuration and control. We encourage collaborative enhancements to make this tool both better and widely utilized.
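
The pipeline summarized above depends on reconstructing each painted joint marker in three dimensions from the two synchronized, calibrated camera views. The excerpt does not reproduce the authors' reconstruction routine, so the following Python fragment is only a minimal sketch of one standard approach (linear, DLT-style triangulation); the function name, the projection matrices P1 and P2, and the NumPy-based implementation are assumptions for illustration, not the authors' code.

# Hedged illustration: linear (DLT-style) triangulation of one marker from two
# calibrated views. P1 and P2 are assumed 3x4 camera projection matrices from a
# separate calibration step; uv1 and uv2 are the marker's pixel coordinates in
# the two synchronized frames.
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Least-squares 3D position of a marker seen by two calibrated cameras."""
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear equations in the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution of A @ X = 0 is the right singular vector of A associated
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize to (x, y, z)

Applied to every marker in every frame, a routine of this kind yields the 106,496 individual 3D points mentioned in the abstract, which is why reducing the manual labor around the tracking step matters.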

pone-0013617-g009: Calculation of stride timing. (A) Enlarged view of a portion of the walking bout shown in Fig. 5. Although it is not obvious, the x-, y-, and z-positions of the TiTa points (shown in black, red, and blue, respectively) are displayed here as points rather than lines to demonstrate the discretization due to the digital video capture. Discontinuities typically occurred when the forward and backward tracking intersected (see Methods). The gray boxes denote the stance phases of each leg, as defined by the leg's z-coordinate. The dashed, green boxes show how the stance phase calculation changes when the foot's anterior and posterior extreme positions (AEP and PEP, respectively) are used to determine stride timing.

Mentions: We first calculated stride transitions using the one-dimensional AEP and PEP within a short time window (width 1/(4f) s, generally 25–125 ms) around the estimates computed from the x-coordinate's derivative. We then made a second, complementary measurement of stride transitions including the z-coordinates of the TiTa points (i.e., their height above the substrate). For this, we roughly clustered the z-coordinate values into “swing” and “stance” bins by fitting a Gaussian mixture model to the distribution of z-values using the k-means algorithm (Fig. 8B). The mean (μ) and standard deviation (σ) of the cluster with the smaller z-value were used to represent values which occurred primarily during stance, when the foot was touching the glass plate. These values are not the same for each leg because of inconsistencies in the placement of the painted dots and small uncertainties in the coordinate transformation. Additionally, the histograms do not appear to be fully described by two Gaussians, but these fits were typically sufficient to distinguish ground contact. In the same time window (1/(4f) s) around the x-derivative estimates, we redefined the beginning of swing as the earliest time at which the z-value went above μ+σ and the beginning of stance as the latest time when the z-value went below μ+2σ. The AEP/PEP and z-value metrics yield similar but slightly different times for each stride (Fig. 9), which is not due to imprecision but because the legs do, in fact, begin moving backward before they touch the ground at the beginning of stance and continue moving backward even after they leave the ground at the beginning of swing [2].
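
As a rough sketch of the z-coordinate procedure just described, the Python fragment below clusters a leg's z-values into low ("stance") and high ("swing") groups with k-means (a simplification of the Gaussian-mixture fit mentioned in the text), takes μ and σ from the low cluster, and then refines a single transition time inside the window around the x-derivative estimate. It is an illustration only, not the released software: the SciPy k-means call, the function and variable names, and the fallback when no sample crosses the threshold are all assumptions.

# Hedged illustration of the swing/stance refinement described above.
import numpy as np
from scipy.cluster.vq import kmeans2

def stance_stats(z):
    """Cluster z-values into two groups and return the mean and s.d. of the
    lower (ground-contact) cluster, standing in for mu and sigma above."""
    z = np.asarray(z, dtype=float)
    centroids, labels = kmeans2(z, 2, minit='++')
    low = np.argmin(centroids)
    return z[labels == low].mean(), z[labels == low].std()

def refine_transition(t, z, t_est, half_width, mu, sigma, kind):
    """Refine one transition near the x-derivative estimate t_est.

    kind='swing'  -> earliest time in the window with z > mu + sigma
    kind='stance' -> latest  time in the window with z < mu + 2*sigma
    Falls back to t_est if nothing in the window crosses the threshold.
    """
    t = np.asarray(t, dtype=float)
    z = np.asarray(z, dtype=float)
    in_window = np.abs(t - t_est) <= half_width
    if kind == 'swing':
        hits = np.flatnonzero(in_window & (z > mu + sigma))
        return t[hits[0]] if hits.size else t_est
    hits = np.flatnonzero(in_window & (z < mu + 2 * sigma))
    return t[hits[-1]] if hits.size else t_est

Here half_width would be half of the 1/(4f) window width quoted above, assuming the x-derivative estimate sits at the center of that window.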

