Measuring torsional eye movements by tracking stable iris features.

Ong JK, Haslwanter T - J. Neurosci. Methods (2010)

Bottom Line: We propose a new method to measure torsional eye movements from videos taken of the eye. In this method, we track iris features that have been identified as Maximally Stable Volumes. These features, which are stable over time, are dark regions with bright borders that are steep in intensity.
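As a rough illustration of the kind of feature this describes, the sketch below detects dark, stable regions in a single greyscale eye frame using OpenCV's 2-D MSER detector. This is a stand-in, not the authors' implementation: the paper's Maximally Stable Volumes apply the same stability criterion to a stack of consecutive video frames rather than to one frame. The function name, the masking step, and the darkness filter are illustrative assumptions.

```python
# Illustrative sketch only: per-frame detection of dark iris regions with
# OpenCV's 2-D MSER, used as a rough stand-in for the paper's 3-D
# Maximally Stable Volumes (which extend the stability criterion over time).
import cv2
import numpy as np

def detect_dark_iris_regions(gray_frame, iris_mask=None):
    """Return candidate feature regions in a greyscale eye image.

    gray_frame : uint8 greyscale frame (assumed input format).
    iris_mask  : optional boolean mask selecting the iris pixels.
    """
    img = gray_frame.copy()
    if iris_mask is not None:
        img[~iris_mask] = 255          # ignore everything outside the iris
    mser = cv2.MSER_create()           # default stability parameters
    regions, _bboxes = mser.detectRegions(img)
    # Keep regions darker than their surroundings, matching the
    # "dark regions with bright borders" described above.
    dark = [r for r in regions
            if img[r[:, 1], r[:, 0]].mean() < img.mean()]
    return dark
```

Tracking such regions through time, rather than re-detecting them independently in every frame, is what the volume-based formulation adds.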


Affiliation: Institute of Medical Device Engineering, FH OO Forschungs & Entwicklungs GmbH, Upper Austria University of Applied Sciences, Garnisonstr 21, 4020 Linz, Austria. james.ong@fh-linz.at


fig0030: Plots of torsional position and speed over time for an annulus mounted on a stepper motor. (a) and (c) correspond to the video where the camera is directly in front of the annulus, while (b) and (d) correspond to the video where the camera has been placed such that it is imaging the annulus from 43° off-centre. In all plots, the points are the estimates from our method, and the circles are the estimates obtained by rotating and visually matching the images. For the position plots, the lines represent the estimates from cross-correlation, and in the angular speed plots, the lines represent the gyroscope data. The damped oscillations of the stepper motor just after each movement are also present in the torsion plots. The inherent granularity of the cross-correlation estimates resulting from the angular resolution of 3 pixels/degree is visible as an oscillation artefact with an amplitude of a third of a degree.
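For context, the cross-correlation baseline shown in the position plots can be sketched as follows, assuming the iris has already been unwrapped into one-dimensional angular intensity profiles sampled at the 3 pixels/degree quoted in the caption; the names and details are illustrative, not taken from the paper.

```python
# Illustrative sketch of a cross-correlation torsion estimate, assuming
# 1-D angular intensity profiles of the iris sampled at 3 pixels/degree.
import numpy as np

PIXELS_PER_DEGREE = 3.0  # angular resolution quoted in the figure caption

def torsion_by_xcorr(reference_profile, target_profile):
    """Torsion (degrees) as the circular shift best aligning the profiles."""
    ref = np.asarray(reference_profile, dtype=float)
    tgt = np.asarray(target_profile, dtype=float)
    ref -= ref.mean()
    tgt -= tgt.mean()
    # Circular cross-correlation via the FFT.
    xcorr = np.fft.irfft(np.fft.rfft(tgt) * np.conj(np.fft.rfft(ref)),
                         n=ref.size)
    shift = int(np.argmax(xcorr))
    if shift > ref.size // 2:          # wrap shifts into (-N/2, N/2]
        shift -= ref.size
    return shift / PIXELS_PER_DEGREE
```

Because the correlation peak is found at integer pixel shifts, an estimate of this kind is quantised to a third of a degree at 3 pixels/degree, which is the granularity artefact visible in the position plots.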

Mentions: Fig. 6a shows the position estimates from our torsional tracking method. Since it was not possible to directly obtain position estimates from the stepper motor, we also generated human estimates of torsional position by superimposing a target frame and reference frame, and then rotating the target frame until the features aligned. These human estimates, as well as the torsional estimates from cross-correlation, are also shown in Fig. 6a. Our results from feature tracking clearly show the oscillations of the stepper motor at the completion of each movement. The method is accurate at estimating the magnitude of each jump, even though each full movement occurred within the span of only five frames, meaning that we only obtained valid torsion estimates from a few large features. Note that the unequal step sizes produced by the stepper motor are caused by the torque induced by the cable attached to the gyroscopes. The difference between the torsional position estimates from our torsional tracking method and the human estimates had a mean of 0.02° and a standard deviation of 0.04°; this is comparable in size to the uncertainty of our human estimates (±0.1°).
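The agreement figures at the end of the paragraph (mean 0.02°, standard deviation 0.04°) correspond to a simple paired comparison of the two sets of estimates, sketched below with illustrative variable names.

```python
# Illustrative check: statistics of the difference between the method's
# torsion estimates and the human estimates for the same frames.
import numpy as np

def agreement_stats(method_deg, human_deg):
    """Return (mean, standard deviation) of method minus human, in degrees."""
    diff = np.asarray(method_deg, dtype=float) - np.asarray(human_deg, dtype=float)
    return diff.mean(), diff.std(ddof=1)
```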

