Monocular camera/IMU/GNSS integration for ground vehicle navigation in challenging GNSS environments.

Chu T, Guo N, Backén S, Akos D - Sensors (Basel) (2012)

Bottom Line: As opposed to GNSS, a generic IMU, which is independent of electromagnetic wave reception, can calculate a high-bandwidth navigation solution; however, the output from a self-contained IMU accumulates errors over time. Our proposed integration architecture is examined using a live dataset collected in an operational traffic environment. The experimental results demonstrate that the proposed integrated system provides accurate estimations and potentially outperforms the tightly coupled GNSS/IMU integration in challenging environments with sparse GNSS observations.


Affiliation: School of Earth and Space Sciences, Peking University, Haidian District, Beijing, China. tianxing.chu@colorado.edu

ABSTRACT
Low-cost MEMS-based IMUs, video cameras and portable GNSS devices are commercially available for automotive applications, and some manufacturers have already integrated such facilities into their vehicle systems. GNSS provides positioning, navigation and timing solutions to users worldwide. However, signal attenuation, reflections or blockages may give rise to positioning difficulties. As opposed to GNSS, a generic IMU, which is independent of electromagnetic wave reception, can calculate a high-bandwidth navigation solution; however, the output from a self-contained IMU accumulates errors over time. In addition, video cameras also possess great potential as alternate sensors in the navigation community, particularly in challenging GNSS environments, and are becoming more common as options in vehicles. Aiming to take advantage of these existing onboard technologies for ground vehicle navigation in challenging environments, this paper develops an integrated camera/IMU/GNSS system based on the extended Kalman filter (EKF). Our proposed integration architecture is examined using a live dataset collected in an operational traffic environment. The experimental results demonstrate that the proposed integrated system provides accurate estimations and potentially outperforms the tightly coupled GNSS/IMU integration in challenging environments with sparse GNSS observations.
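
As a rough illustration of the EKF-based fusion described in the abstract, the sketch below shows a single predict/update cycle in which IMU-derived accelerations propagate a simple planar state and a camera- or GNSS-derived position fix corrects it. The state layout, noise values and function names are assumptions made here for illustration; they are not the authors' implementation, which also estimates attitude and sensor biases.

    # Minimal planar EKF sketch (assumed state layout and noise values).
    import numpy as np

    dt = 0.01                                   # IMU sample interval, 100 Hz (assumed)
    x = np.zeros(4)                             # state: [px, py, vx, vy] in a local level frame
    P = np.eye(4)                               # state covariance
    Q = np.diag([0.01, 0.01, 0.1, 0.1]) * dt    # process noise (assumed)
    R = np.eye(2) * 2.0**2                      # position fix noise, ~2 m sigma (assumed)

    F = np.array([[1.0, 0.0, dt,  0.0],
                  [0.0, 1.0, 0.0, dt ],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])        # constant-velocity transition
    H = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])        # camera/GNSS fix observes position only

    def predict(x, P, accel_xy):
        # Propagate with IMU accelerations; attitude and bias handling omitted.
        x = F @ x
        x[2:] = x[2:] + accel_xy * dt
        P = F @ P @ F.T + Q
        return x, P

    def update(x, P, z_xy):
        # Correct the state with a camera- or GNSS-derived position fix.
        y = z_xy - H @ x                        # innovation
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        return x, P

    # Example: propagate at the IMU rate, correct whenever a position fix arrives.
    x, P = predict(x, P, np.array([0.2, 0.0]))
    x, P = update(x, P, np.array([0.5, -0.3]))

In the actual system the update step would be driven by visual and GNSS measurements at their own (lower) rates, while the prediction runs at the IMU rate.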


f10-sensors-12-03162: Yaw angle error statistics of the camera/IMU/GNSS integration.

Mentions: Researchers sometimes use quaternions, rather than Euler angles, for orientation representation, particularly in high-dynamic applications [14,15]. The quaternion expression avoids the gimbal lock phenomenon when the pitch angle approaches ±90 deg, and operates more efficiently than multiplications of direction cosine matrices. However, since a ground vehicle cannot operate orthogonal to the ground plane, Euler angles do not suffer from the singularity problem in this application and provide an intuitive way for the user to perceive the vehicle's direction. We therefore choose roll, pitch and yaw angles, based on the Tait-Bryan convention, to represent the vehicle's orientation. Figure 10 shows the yaw angle error statistics of the camera/IMU/GNSS integration. Yaw jitters occur as a result of the abrupt change in centrifugal force when the vehicle underwent the U-turn maneuver. Although the angle residual is maintained within 1 deg for the majority of the test segment, a relatively low level of systematic error can still be observed in the zoomed-in subfigure, owing to the imperfection of the computer vision module. This is consistent with the reversal of the positioning error during the vehicle's cornering shown in Figure 8. It can be inferred that the accumulated position and orientation errors would gradually erode the navigation solution if the dataset were long enough and no other sources of corrections were available.
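
To make the gimbal-lock remark concrete, the following sketch applies the standard textbook conversion from a unit quaternion (w, x, y, z) to Tait-Bryan roll/pitch/yaw angles (z-y-x sequence). It is a generic formula, not code from the paper; the clamping of the pitch argument is exactly where the ±90 deg singularity that the authors avoid by construction would appear.

    # Standard quaternion (w, x, y, z) -> Tait-Bryan roll/pitch/yaw (z-y-x sequence).
    import math

    def quat_to_tait_bryan(w, x, y, z):
        roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
        sinp = 2.0 * (w * y - z * x)
        # |sinp| -> 1 corresponds to pitch -> +/-90 deg: the gimbal-lock case
        # where roll and yaw become indistinguishable for Euler angles.
        sinp = max(-1.0, min(1.0, sinp))
        pitch = math.asin(sinp)
        yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
        return roll, pitch, yaw

    # Example: a pure 30 deg yaw rotation about the vertical axis.
    half = math.radians(30.0) / 2.0
    print(quat_to_tait_bryan(math.cos(half), 0.0, 0.0, math.sin(half)))

Because a ground vehicle never pitches near ±90 deg, the clamp is never active in this application, which is why the roll/pitch/yaw parameterization is safe here.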

