Graph Structure-Based Simultaneous Localization and Mapping Using a Hybrid Method of 2D Laser Scan and Monocular Camera Image in Environments with Laser Scan Ambiguity.

Oh T, Lee D, Kim H, Myung H - Sensors (Basel) (2015)

Bottom Line: To verify the effectiveness of the proposed method, real experiments were conducted in an indoor environment with a long corridor. The experimental results were compared with those of the conventional GMapping approach. The results demonstrate that it is possible to localize the robot in environments with laser scan ambiguity in real time, and that the performance of the proposed method is superior to that of the conventional approach.

View Article: PubMed Central - PubMed

Affiliation: Urban Robotics Laboratory (URL), Korea Advanced Institute of Science and Technology (KAIST), 291 Daehak-ro (373-1 Guseong-dong), Yuseong-gu, Daejeon 305-701, Korea. buljaga@kaist.ac.kr.

ABSTRACT
Localization is an essential issue for robot navigation, allowing the robot to perform tasks autonomously. However, in environments with laser scan ambiguity, such as long corridors, conventional SLAM (simultaneous localization and mapping) algorithms that rely on a laser scanner may not estimate the robot pose robustly. To resolve this problem, we propose a novel localization approach based on a hybrid method incorporating a 2D laser scanner and a monocular camera within the framework of graph structure-based SLAM. The 3D coordinates of image feature points are acquired through the hybrid method under the assumption that the wall is normal to the ground and vertically flat. This assumption can be relaxed, however, because the subsequent feature matching process rejects outliers on an inclined or non-flat wall. Through graph optimization with constraints generated by the hybrid method, the final robot pose is estimated. To verify the effectiveness of the proposed method, real experiments were conducted in an indoor environment with a long corridor. The experimental results were compared with those of the conventional GMapping approach. The results demonstrate that it is possible to localize the robot in environments with laser scan ambiguity in real time, and that the performance of the proposed method is superior to that of the conventional approach.
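For illustration only, and not as the authors' implementation, the following minimal Python sketch shows the kind of back-projection the hybrid method describes: under the wall assumption, the laser range measured at the bearing of an image feature fixes the scale of the camera ray through that pixel, yielding the feature's 3D coordinates. The intrinsics (fx, fy, cx, cy) and the pixel/range values in the example are hypothetical.

import numpy as np

def feature_point_3d(u, v, laser_range, fx, fy, cx, cy):
    # Back-project pixel (u, v) onto a vertical wall whose horizontal
    # distance is given by the 2D laser scan at the feature's bearing.
    # Camera frame convention: z forward, x right, y down (pinhole model).
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

    # Horizontal length of the ray (x-z plane); the vertical component y
    # does not change the laser-measured distance to a vertical wall.
    horiz = np.hypot(ray[0], ray[2])

    # Scale the ray so that its horizontal reach matches the laser range.
    return ray * (laser_range / horiz)   # 3D point [x, y, z] in the camera frame

if __name__ == "__main__":
    # Hypothetical intrinsics and a 3 m laser return near the image centre.
    p = feature_point_3d(u=700, v=300, laser_range=3.0,
                         fx=800.0, fy=800.0, cx=640.0, cy=360.0)
    print(p)

If the wall is in fact inclined or non-flat, points computed this way are wrong; as the abstract notes, such points are expected to be rejected later as outliers during feature matching.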

No MeSH data available.


Figure 7 (f7-sensors-15-15830). Experimental system. The system consists of a Pioneer P3-DX robot platform, a Hokuyo URG-04LX laser scanner, a Point Grey Flea3 FL3-U3-13E4C-C monocular camera and an MSI GE60-2OE laptop.

Mentions: The overall system for the experiment was configured as follows. A Pioneer P3-DX [30] was used as the robot platform, on which a Hokuyo URG-04LX laser scanner [31], a Point Grey Flea3 FL3-U3-13E4C-C monocular camera [32] and an MSI GE60-2OE laptop PC [33] were installed. The whole system is illustrated in Figure 7, and the specifications of each device are listed in Table 1. The Pioneer P3-DX provides wheel odometry through 100-tick encoders. The Hokuyo URG-04LX has a maximum range of about 4 m. Although the Flea3 camera updates at 60 Hz, the image processing procedure is carried out only when a node of the graph structure is added, in order to keep the computation real-time. The MSI GE60-2OE PC is capable of GPU (graphics processing unit) computing for fast image processing. The overall framework was implemented on a Linux platform, Ubuntu 12.04, using open-source libraries including OpenCV (Open Source Computer Vision) 2.4 [34], ROS (Robot Operating System) Hydro [35] and iSAM (incremental Smoothing and Mapping) [2].
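As a hedged illustration of the node-gated processing described above (not the authors' code), the sketch below adds a graph node, and therefore grabs and processes a camera frame, only when the wheel odometry has changed sufficiently. The translation and rotation thresholds are hypothetical; the excerpt states only that image processing runs when a node is added.

import numpy as np

# Hypothetical thresholds for creating a new graph node; the actual
# node-creation criteria are not given in this excerpt.
TRANS_THRESH = 0.3   # metres of translation between nodes
ROT_THRESH = 0.2     # radians of rotation between nodes

def should_add_node(last_pose, pose):
    # Decide whether odometry (x, y, theta) has changed enough to add a
    # node, and hence to run the expensive image-processing step.
    dx, dy = pose[0] - last_pose[0], pose[1] - last_pose[1]
    dtheta = np.arctan2(np.sin(pose[2] - last_pose[2]),
                        np.cos(pose[2] - last_pose[2]))
    return np.hypot(dx, dy) > TRANS_THRESH or abs(dtheta) > ROT_THRESH

if __name__ == "__main__":
    last = (0.0, 0.0, 0.0)
    for pose in [(0.1, 0.0, 0.0), (0.25, 0.0, 0.05), (0.45, 0.0, 0.10)]:
        if should_add_node(last, pose):
            # The Flea3 delivers frames at 60 Hz, but a frame is grabbed and
            # processed only here, at node creation, to keep the loop real-time.
            print("add node and process image at pose", pose)
            last = pose

With the values above, only the last pose triggers a node, mimicking how most of the 60 Hz frames are skipped while the robot moves slowly.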

