Tightly-coupled stereo visual-inertial navigation using point and line features.

Kong X, Wu W, Zhang L, Wang Y - Sensors (Basel) (2015)

Bottom Line: This paper presents a novel approach for estimating the ego-motion of a vehicle in dynamic and unknown environments using tightly-coupled inertial and visual sensors. To improve the accuracy and robustness, we exploit the combination of point and line features to aid navigation. The mathematical framework is based on trifocal geometry among image triplets, which is simple and unified for point and line features.

View Article: PubMed Central - PubMed

Affiliation: College of Mechatronics and Automation, National University of Defense Technology, Changsha 410073, China. kongxianglong51@gmail.com.

ABSTRACT
This paper presents a novel approach for estimating the ego-motion of a vehicle in dynamic and unknown environments using tightly-coupled inertial and visual sensors. To improve the accuracy and robustness, we exploit the combination of point and line features to aid navigation. The mathematical framework is based on trifocal geometry among image triplets, which is simple and unified for point and line features. For the fusion algorithm design, we employ the Extended Kalman Filter (EKF) for error state prediction and covariance propagation, and the Sigma Point Kalman Filter (SPKF) for robust measurement updating in the presence of high nonlinearities. The outdoor and indoor experiments show that the combination of point and line features improves the estimation accuracy and robustness compared to the algorithm using point features alone.
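The abstract describes a hybrid fusion design: an EKF-style prediction and covariance propagation, followed by a sigma-point (unscented) measurement update to handle the nonlinear visual measurements. A minimal sketch of such a hybrid scheme on a generic state is shown below; this is not the paper's error-state parameterization, and the state dimensions, measurement function `h`, and scaling parameter `kappa` are illustrative assumptions:

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """EKF-style propagation: linearized transition F, process noise Q."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def sigma_points(x, P, kappa):
    """Generate the 2n+1 sigma points of the unscented transform."""
    n = x.size
    S = np.linalg.cholesky((n + kappa) * P)  # matrix square root
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def spkf_update(x, P, z, h, R, kappa=1.0):
    """Sigma-point measurement update: push points through nonlinear h."""
    pts, w = sigma_points(x, P, kappa)
    Z = np.array([h(p) for p in pts])     # transformed sigma points
    z_hat = w @ Z                         # predicted measurement
    dZ, dX = Z - z_hat, pts - x
    Pzz = (dZ.T * w) @ dZ + R             # innovation covariance
    Pxz = (dX.T * w) @ dZ                 # state/measurement cross-covariance
    K = Pxz @ np.linalg.inv(Pzz)          # Kalman gain
    return x + K @ (z - z_hat), P - K @ Pzz @ K.T

# Illustrative use on a toy 2-D state with a nonlinear range measurement:
x, P = np.array([1.0, 0.5]), 0.1 * np.eye(2)
x, P = ekf_predict(x, P, np.eye(2), 0.01 * np.eye(2))
x, P = spkf_update(x, P, np.array([1.2]),
                   lambda p: np.array([np.linalg.norm(p)]),
                   np.array([[0.01]]))
```

The appeal of the sigma-point update over an EKF update is that it avoids linearizing `h`: the points are propagated through the measurement model exactly, which is more robust when the measurement nonlinearity is strong.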

No MeSH data available.


sensors-15-12816-f006: The attitude estimation errors and 3σ bounds.

Mentions: We demonstrate the velocity and attitude errors of the proposed method with the corresponding 3σ bounds in Figure 5 and Figure 6, which verify that the velocity and attitude estimates are consistent. Note that the standard deviations of the roll and pitch angle errors remain bounded, while the standard deviation of the yaw angle error grows over time. This is consistent with the observability properties of the VINS system, in which the yaw angle is unobservable [8]. The yaw angle error remains below 5° owing to the accuracy of the gyroscopes used in the experiment.
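A filter is called consistent when its reported covariance matches the actual error statistics, i.e., nearly all error samples fall inside the ±3σ envelope. The check described above can be sketched as follows; the error series and variance values here are synthetic placeholders, not the paper's data:

```python
import numpy as np

def within_3sigma(errors, variances):
    """Fraction of error samples inside the ±3σ envelope derived from the
    filter's reported variances (the diagonal of the state covariance)."""
    errors = np.asarray(errors, dtype=float)
    bounds = 3.0 * np.sqrt(np.asarray(variances, dtype=float))
    return np.mean(np.abs(errors) <= bounds)

# Synthetic example: zero-mean errors whose spread matches the reported variance.
rng = np.random.default_rng(0)
var = np.full(1000, 0.04)            # reported variance per time step (e.g., deg^2)
err = rng.normal(0.0, np.sqrt(var))  # simulated estimation errors
frac = within_3sigma(err, var)       # close to 0.997 for a consistent filter
```

For an unobservable direction such as yaw, the reported variance itself grows over time, so the 3σ envelope widens rather than settling to a bound.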

