A multi-animal tracker for studying complex behaviors


ABSTRACT

Background: Animals exhibit astonishingly complex behaviors. Studying the subtle features of these behaviors requires quantitative, high-throughput, and accurate systems that can cope with the often rich and perplexing data.

Results: Here, we present a Multi-Animal Tracker (MAT) that provides a user-friendly, end-to-end solution for imaging, tracking, and analyzing complex behaviors of multiple animals simultaneously. At the core of the tracker is a machine learning algorithm that provides immense flexibility to image various animals (e.g., worms, flies, zebrafish, etc.) under different experimental setups and conditions. Focusing on C. elegans worms, we demonstrate the vast advantages of using this MAT in studying complex behaviors. Beginning with chemotaxis, we show that approximately 100 animals can be tracked simultaneously, providing rich behavioral data. Interestingly, we reveal that worms’ directional changes are biased, rather than random – a strategy that significantly enhances chemotaxis performance. Next, we show that worms can integrate environmental information and that directional changes mediate the enhanced chemotaxis towards richer environments. Finally, offering high-throughput and accurate tracking, we show that the system is highly suitable for longitudinal studies of aging- and proteotoxicity-associated locomotion deficits, enabling large-scale drug and genetic screens.

Conclusions: Together, our tracker provides a powerful and simple system to study complex behaviors in a quantitative, high-throughput, and accurate manner.

Electronic supplementary material: The online version of this article (doi:10.1186/s12915-017-0363-9) contains supplementary material, which is available to authorized users.



Fig. 2: The tracking software successfully extracts trajectories of multiple animals at a time despite background noise. a Principal component analysis over all detected animal instances in a single frame shows that worm entities are faithfully segregated from background entities. For example, during long temporal recordings, drop condensation on the lid and trail marks contribute to the identification of approximately 2000 erroneous entities per frame. The number of genuine animal entities, however, remains constant (around 100) throughout the movie (see also Additional file 6: Figure S1). Thus, the software successfully learns the crucial features needed to detect animal entities and discards background noise. b Accuracy of worm detection as a function of training-set size. Precision (the fraction of actual worms out of all entities classified as worms) reaches nearly 90%, and recall (the fraction of all worms genuinely present on the plate that are indeed classified as worms) reaches approximately 85%. The F-score is the harmonic mean of the two parameters (precision and recall); as few as 50 worms are sufficient to reach these accuracies (see Methods for details on how these parameters were obtained). c Implementing a variant of a Kalman predictor significantly enhances track length. Shown are the lengths (in seconds) of all tracks without the predictor (left) and with the predictor (right). The y-axis and x-axis denote the start and end times, respectively, of the period over which track lengths were averaged. All tracks identified throughout the movie are shown. Significantly longer tracks are extracted when the predictor is implemented (80 ± 0.54 s compared to 58 ± 0.3 s without the predictor, in the period of the experiment that yielded the longest tracks, P < 0.001). d There is a trade-off between the number of worms loaded on the assay plate and the average length of detected tracks: the more worms loaded, the shorter the continuously detected tracks become. Up to 200 worms can be assayed simultaneously in a 4 × 4 cm field of view if extraction of long continuous tracks is not required.
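Panel c refers to "a variant of a Kalman predictor" used to keep tracks alive across missed detections. The article does not provide the predictor's equations or code, so the following is only a minimal sketch of the general idea, assuming a constant-velocity state model; the state layout, noise magnitudes, and suggested gating step are illustrative assumptions, not the MAT implementation.

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal constant-velocity Kalman predictor for a 2-D track (sketch only)."""

    def __init__(self, x, y, dt=1.0, process_noise=1e-2, meas_noise=1.0):
        # State vector: [x, y, vx, vy]; noise values are illustrative assumptions.
        self.state = np.array([x, y, 0.0, 0.0], dtype=float)
        self.P = np.eye(4) * 10.0                       # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)  # constant-velocity transition
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)   # only position is observed
        self.Q = np.eye(4) * process_noise
        self.R = np.eye(2) * meas_noise

    def predict(self):
        """Propagate the state one frame ahead; returns the predicted (x, y)."""
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.state[:2]

    def update(self, z):
        """Correct the prediction with a matched detection z = (x, y)."""
        z = np.asarray(z, dtype=float)
        innovation = z - self.H @ self.state
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.state = self.state + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

In such a scheme, when an animal is not detected in a frame the track is kept alive with `predict()`; when a detection reappears within some gating radius of the predicted position, `update()` is called and the track continues unbroken, which is consistent with the longer average track lengths reported in panel c.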

Mentions: At the core of the tracker is a machine learning algorithm which provides immense flexibility to image and track virtually any moving animal, including worms, flies, zebrafish, and mice (Fig. 1c–f, and Additional file 2: Movie S1, Additional file 3: Movie S2, Additional file 4: Movie S3, Additional file 5: Movie S4). The user is not required to explicitly provide animal features. Rather, the user ‘trains’ the tracker by clicking on a small number of animal entities in the assay images using a user-friendly GUI (Fig. 1b). The tracker learns the animals’ features from the user’s picks, and uses this information to build a discriminative model separating animal entities from background ‘noise’ (Fig. 2a). Moreover, the machine learning algorithm is insensitive to different acquisition parameters such as resolution, contrast, frame rate, and color depth. The tracker successfully distinguishes (with a configurable parameter for an approximate false negative rate, β, set to 0.01 by default; see Methods) between animals and background ‘noise’ that typically accumulates during long experiments (Additional file 6: Figure S1a). As few as a dozen clicks on animal entities from the first, middle, and last frames of the movie are sufficient for accurate extraction of animal trajectories throughout thousands of frames. In fact, accuracy analyses show that the tracker achieves a precision of approximately 0.9 and a recall of over 0.85 (Fig. 2b; see Methods for details).
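The text describes training a discriminative model from a handful of user-clicked examples and evaluating it with precision, recall, and F-score. The article does not name the classifier, the blob features, or how the β parameter is applied, so the sketch below only illustrates the general workflow under those assumptions: synthetic shape/intensity features (area, eccentricity, mean intensity), a scikit-learn logistic regression, and one plausible reading of β as a tolerated false-negative fraction on the labeled animals.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score, f1_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-blob features (area, eccentricity, mean intensity):
# user-clicked animal entities (positives) vs. background entities (negatives).
# These distributions are illustrative, not measured values from the paper.
worm_clicks = rng.normal(loc=[120.0, 0.9, 60.0], scale=[15, 0.05, 8], size=(50, 3))
background = rng.normal(loc=[30.0, 0.4, 20.0], scale=[20, 0.2, 10], size=(2000, 3))

X = np.vstack([worm_clicks, background])
y = np.concatenate([np.ones(len(worm_clicks), dtype=int),
                    np.zeros(len(background), dtype=int)])

# Discriminative model between animal entities and background 'noise'.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# One plausible use of beta: pick the probability threshold so that roughly a
# fraction beta of the labeled animals would fall below it (assumption only).
beta = 0.01
threshold = np.quantile(clf.predict_proba(worm_clicks)[:, 1], beta)

pred = (clf.predict_proba(X)[:, 1] >= threshold).astype(int)
print("precision:", precision_score(y, pred))
print("recall:   ", recall_score(y, pred))
print("F-score:  ", f1_score(y, pred))
```

Evaluating on the training data, as done here for brevity, is only meant to show how the precision, recall, and F-score figures quoted above are computed; the Methods section of the paper describes how the actual accuracy estimates were obtained.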

