Tightly-coupled stereo visual-inertial navigation using point and line features.

Kong X, Wu W, Zhang L, Wang Y - Sensors (Basel) (2015)

Bottom Line: This paper presents a novel approach for estimating the ego-motion of a vehicle in dynamic and unknown environments using tightly-coupled inertial and visual sensors. To improve the accuracy and robustness, we exploit the combination of point and line features to aid navigation. The mathematical framework is based on trifocal geometry among image triplets, which is simple and unified for point and line features.


Affiliation: College of Mechatronics and Automation, National University of Defense Technology, Changsha 410073, China. kongxianglong51@gmail.com.

ABSTRACT
This paper presents a novel approach for estimating the ego-motion of a vehicle in dynamic and unknown environments using tightly-coupled inertial and visual sensors. To improve the accuracy and robustness, we exploit the combination of point and line features to aid navigation. The mathematical framework is based on trifocal geometry among image triplets, which is simple and unified for point and line features. For the fusion algorithm design, we employ the Extended Kalman Filter (EKF) for error state prediction and covariance propagation, and the Sigma Point Kalman Filter (SPKF) for robust measurement updating in the presence of high nonlinearities. The outdoor and indoor experiments show that the combination of point and line features improves the estimation accuracy and robustness compared to the algorithm using point features alone.
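
The fusion design described above can be pictured as a linearized (EKF) prediction of the state and covariance followed by a sigma-point measurement update. The short Python sketch below illustrates only this generic hybrid structure; the state dimension, process model, measurement function, and noise values are placeholders invented for illustration and are not taken from the paper.

import numpy as np

def ekf_predict(x, P, F, Q):
    """Linearized (EKF-style) propagation of the state estimate and covariance."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def sigma_points(x, P, kappa):
    """Generate 2n+1 sigma points and weights (basic unscented transform)."""
    n = x.size
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def spkf_update(x_pred, P_pred, z, h, R, kappa=1e-3):
    """Sigma-point measurement update for a nonlinear measurement function h."""
    pts, w = sigma_points(x_pred, P_pred, kappa)
    Z = np.array([h(p) for p in pts])          # propagate sigma points through h
    z_mean = w @ Z
    Pzz = sum(wi * np.outer(zi - z_mean, zi - z_mean) for wi, zi in zip(w, Z)) + R
    Pxz = sum(wi * np.outer(pi - x_pred, zi - z_mean) for wi, pi, zi in zip(w, pts, Z))
    K = Pxz @ np.linalg.inv(Pzz)               # Kalman gain
    x_new = x_pred + K @ (z - z_mean)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new

# Toy usage: a 2-state system with a nonlinear (squared-range) measurement.
x = np.array([1.0, 0.5]); P = np.eye(2) * 0.1
F = np.array([[1.0, 0.1], [0.0, 1.0]]); Q = np.eye(2) * 1e-3
h = lambda s: np.array([s[0] ** 2 + s[1] ** 2]); R = np.eye(1) * 1e-2
x_pred, P_pred = ekf_predict(x, P, F, Q)
x_upd, P_upd = spkf_update(x_pred, P_pred, np.array([1.3]), h, R)

The appeal of this split, as the abstract notes, is that the sigma-point machinery is confined to the measurement update, where the nonlinearities are strongest, while propagation remains as cheap as a standard EKF prediction.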



Figure 1 (sensors-15-12816-f001): (a) The point-line-point correspondence among three views; (b) Stereo geometry for two views and line-line-line configuration.

Mentions: Once the trifocal tensor is computed, we can use it to map a pair of matched points in the first and second views into the third view, using the homography between the first and third views induced by a line in the second image [18]. As shown in Figure 1a, a line in the second view defines a plane in space, and this plane induces a homography between the first and third views. As recommended by Hartley [18], the line is chosen to be perpendicular to the epipolar line. The transfer procedure is summarized as follows [18]: (1)
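
Equation (1) itself is not reproduced in this excerpt. The relation the preceding sentences describe is the standard point-transfer formula via the trifocal tensor from Hartley and Zisserman; the tensor notation below is assumed for illustration and may differ from the paper's own Equation (1):

\[ x''^{\,k} \;=\; x^{i} \, l'_{j} \, T_i^{\,jk} \]

where x is the matched point in the first view, l' is the line through the matched point in the second view (chosen perpendicular to the epipolar line), T_i^{jk} is the trifocal tensor, and x'' is the transferred point in the third view.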

