An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation.

He C, Kazanzides P, Sen HT, Kim S, Liu Y - Sensors (Basel) (2015)

Bottom Line: In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU).

View Article: PubMed Central - PubMed

Affiliation: Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optoelectronics, Beijing Institute of Technology, Beijing 100081, China. wosipo007@163.com.

ABSTRACT
Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions.
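
As a concrete illustration of the partial-occlusion case described above, the sketch below shows how the world-frame position of a single tracked marker, combined with the orientation estimated from the IMU and the marker's known offset in the body frame, recovers the body's 3-DOF position and hence the full 6-DOF pose. This is a minimal sketch in Python; all names and numeric values are illustrative assumptions, not the authors' implementation.

```python
# Partial-occlusion case: one marker position (drift-free, from the OTS)
# plus the IMU orientation recovers the full 6-DOF pose.
# All names and values below are illustrative assumptions.
import numpy as np

def recover_body_position(p_marker_world, p_marker_body, R_imu):
    # The OTS measures: p_marker_world = R_imu @ p_marker_body + p_body,
    # so the body position follows directly once R_imu is known.
    return p_marker_world - R_imu @ p_marker_body

# Toy ground truth: body rotated 30 deg about Z, positioned at (100, 50, 20) mm.
theta = np.radians(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
p_true = np.array([100.0, 50.0, 20.0])

p_marker_body = np.array([10.0, 0.0, 5.0])         # marker offset in body frame
p_marker_world = R_true @ p_marker_body + p_true   # what the OTS would measure

# With the (bias-corrected) IMU supplying the orientation, the 3-DOF
# position is recovered exactly; a single point alone could not give it,
# since the marker is displaced from the body origin by a rotated offset.
print(recover_body_position(p_marker_world, p_marker_body, R_true))
# -> [100.  50.  20.]
```

A single marker fixes no orientation by itself, which is why the IMU must supply the rotation; conversely, the marker anchors the position estimate that the IMU alone would let drift.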


Figure 14 (sensors-15-16448-f014). Position error (mm) on three axes versus time; error increases during full OTS occlusion (times corresponding to the dashed red lines in Figure 13). (a) Position error on the X-axis; (b) Position error on the Y-axis; (c) Position error on the Z-axis.

Mentions: If none of the markers are visible, position tracking is based only on the inertial sensor data. Without the drift-free marker position information and real-time calibration from the OTS, the inertial sensors' noise is double-integrated, which causes the estimated position to drift from the ground truth at an increasing rate, as shown in Figure 13 and Figure 14. When all markers are occluded (during 5.5–13.2 s, 31.07–43.4 s and 53.95–62.88 s), the OTS (red) cannot track the position, while the HTS (blue) tracks the position correctly only at the beginning of the occlusion and then quickly drifts away from the ground truth.
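
To make the drift mechanism concrete, the short Python sketch below double-integrates a simulated accelerometer error signal; the bias and noise magnitudes are assumed for illustration only and are not the paper's sensor parameters. A constant residual bias b alone yields a position error of 0.5·b·t², so the error grows quadratically once the OTS correction is unavailable.

```python
# Illustration of dead-reckoning drift: accelerometer error, integrated
# twice, becomes a position error that grows at an increasing rate.
# Bias/noise values are assumptions for illustration, not the paper's.
import numpy as np

dt = 0.01                  # 100 Hz IMU sample period (s), assumed
t_total = 1.0              # length of a total-occlusion interval (s)
n = int(t_total / dt)

rng = np.random.default_rng(0)
bias = 0.02                                  # residual accel bias (m/s^2)
noise = rng.normal(0.0, 0.05, size=n)        # white accel noise (m/s^2)
accel_error = bias + noise

# Double integration: acceleration error -> velocity error -> position error.
vel_error = np.cumsum(accel_error) * dt
pos_error = np.cumsum(vel_error) * dt

print(f"position error after {t_total:.1f} s: {1000 * pos_error[-1]:.2f} mm")
# With bias alone, |error| = 0.5 * bias * t**2 = 10 mm after 1 s, which is
# why the HTS estimate in Figure 14 departs from ground truth at an
# increasing rate during each full-occlusion interval.
```

This quadratic growth is also why the bias estimation performed while the OTS has line-of-sight matters: it shrinks the constant error term that dominates the drift.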

