An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation.

He C, Kazanzides P, Sen HT, Kim S, Liu Y - Sensors (Basel) (2015)

Bottom Line: In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU).

View Article: PubMed Central - PubMed

Affiliation: Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optoelectronics, Beijing Institute of Technology, Beijing 100081, China. wosipo007@163.com.

ABSTRACT
Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions.
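The occlusion handling described in the abstract amounts to a three-way mode switch driven by how many markers the OTS can see. A minimal Python sketch of that decision logic (illustrative only; the function and mode names are assumptions, not the authors' implementation):

```python
from enum import Enum, auto

class TrackingMode(Enum):
    FULL_POSE = auto()  # all markers visible: full 6-DOF pose from the OTS
    PARTIAL = auto()    # >= 1 marker visible: OTS 3-DOF position + IMU orientation
    OCCLUDED = auto()   # no markers visible: bias-corrected inertial dead reckoning

def select_mode(n_visible_markers: int, n_total_markers: int) -> TrackingMode:
    """Pick the fusion mode from marker visibility (hypothetical helper)."""
    if n_visible_markers == n_total_markers:
        return TrackingMode.FULL_POSE   # also the moment to estimate IMU bias
    if n_visible_markers >= 1:
        return TrackingMode.PARTIAL
    return TrackingMode.OCCLUDED
```

In `FULL_POSE` mode the optical pose is also used to estimate the inertial sensor bias, which is what keeps the `OCCLUDED` dead-reckoning mode usable for short intervals.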



sensors-15-16448-f002: (a) Orientation tracking results of the hybrid tracking system (HTS), which has a higher update rate than the optical tracking system (OTS); (b) Orientation tracking results in the period 2.7 s–3.1 s. Red: the OTS result at an update rate of 20 Hz. Blue: the synchronized tracking result based on the OTS result and updated with inertial measurement unit (IMU) measurements between two OTS samples.

Mentions: The hybrid tracking system (HTS) consists of a stereo camera (Micron Tracker Hx40) serving as the OTS and an IMU rigidly attached to the camera. The OTS tracks special marker patterns at approximately 20 fps with a latency of 60 ms, and the captured images are transferred to the host computer via a FireWire port. The IMU contains a 3-axis gyroscope (a two-axis IDG300 and a single-axis ISZ300 from InvenSense), a 3-axis accelerometer (LIS331DLH from STMicroelectronics), and a 3-axis magnetometer (HMC1043 from Honeywell). It provides the nine data values of the tri-axial accelerometer, gyroscope, and magnetometer to the host computer via a USB port at a rate of 100 Hz. The software that captures all sensor data and displays surgical augmented reality (AR) images on the HMD is developed in C++ and runs on the host PC (a MacBook Air). It synchronizes the data from the two tracking units: between two consecutive OTS sampling points, the HTS updates the orientation with the IMU measurements, which supply tracking data at the higher rate of 100 Hz. As the OTS is assumed to be accurate when all the markers are visible, the synchronized HTS tracking result is based on the slower OTS and updated with the orientation estimated from the IMU measurements, as shown in Figure 2.

