Radar Sensing for Intelligent Vehicles in Urban Environments.

Reina G, Johnson D, Underwood J - Sensors (Basel) (2015)

Bottom Line: Radar overcomes the shortcomings of laser, stereovision, and sonar because it can operate successfully in dusty, foggy, blizzard-blinding, and poorly lit scenarios. This paper presents a novel method for ground and obstacle segmentation based on radar sensing. The algorithm operates directly in the sensor frame, without the need for a separate synchronised navigation source, calibration parameters describing the location of the radar in the vehicle frame, or the geometric restrictions made in the previous main method in the field.


Affiliation: Department of Engineering for Innovation, University of Salento, via Arnesano, 73100 Lecce, Italy. giulio.reina@unisalento.it.

ABSTRACT
Radar overcomes the shortcomings of laser, stereovision, and sonar because it can operate successfully in dusty, foggy, blizzard-blinding, and poorly lit scenarios. This paper presents a novel method for ground and obstacle segmentation based on radar sensing. The algorithm operates directly in the sensor frame, without the need for a separate synchronised navigation source, calibration parameters describing the location of the radar in the vehicle frame, or the geometric restrictions made in the previous main method in the field. Experimental results are presented in various urban scenarios to validate this approach, showing its potential applicability for advanced driving assistance systems and autonomous vehicle operations.
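The abstract emphasises that classification is performed directly in the sensor frame, i.e., on the radar's own (azimuth, range) observations rather than on data first projected into a vehicle or world frame. The paper's figure page does not spell out the RCGD classifier itself, so the snippet below is only a minimal illustrative sketch, assuming a polar grid of echo power in dB and two hypothetical thresholds (ground_max_db, obstacle_min_db) that are invented for illustration and are not the RCGD parameters.

import numpy as np

def segment_polar_radar(power_db, ground_max_db=-10.0, obstacle_min_db=5.0):
    """Toy ground/obstacle labelling on a polar radar power grid.

    power_db : 2-D array (azimuth bins x range bins) of echo power in dB.
    Returns an integer label grid: 0 = unknown, 1 = ground, 2 = obstacle.
    Thresholds are hypothetical, not the RCGD parameters from the paper.
    """
    labels = np.zeros(power_db.shape, dtype=np.uint8)
    # Weak returns (at or below ground_max_db) are labelled as ground.
    labels[power_db <= ground_max_db] = 1
    # Strong returns (at or above obstacle_min_db) are labelled as obstacles.
    labels[power_db >= obstacle_min_db] = 2
    return labels

# Example with random data standing in for one radar sweep (360 azimuth x 200 range bins).
rng = np.random.default_rng(0)
fake_sweep = rng.normal(loc=-15.0, scale=8.0, size=(360, 200))  # dB
print(np.bincount(segment_polar_radar(fake_sweep).ravel(), minlength=3))

Because the labelling is done per radar cell, no navigation source or radar-to-vehicle calibration enters the computation, which is the property the abstract highlights.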



f10-sensors-15-14661: Classification results obtained from the RCGD method overlaid on the radar image (Left) in a typical urban environment (Right). Blue dots: radar-labelled ground. Red dots: radar-labelled obstacles.

Mentions: A typical result obtained with the RCGD approach is shown in Figure 10, referring to time t = 22.31 min, when the vehicle was driving on a double-lane road in regular traffic. Blue dots denote observations labelled as ground, whereas red dots denote radar-labelled obstacles. The road in front of and behind the vehicle, the car on the right, and the car and the wall on the left are all properly detected.
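Figure 10 overlays the labelled detections on the radar image, blue for ground and red for obstacles. As a rough illustration of that kind of overlay (not the authors' plotting code), the sketch below assumes a 2-D radar intensity image plus per-detection pixel coordinates and labels; the function name, arguments, and label encoding are hypothetical.

import numpy as np
import matplotlib.pyplot as plt

def plot_rcgd_overlay(radar_image, det_u, det_v, det_labels):
    """Overlay labelled radar detections on a radar intensity image.

    radar_image : 2-D array shown as a grey-scale background.
    det_u, det_v : pixel coordinates of the radar detections.
    det_labels   : 1 = ground (blue), 2 = obstacle (red); hypothetical encoding.
    """
    plt.imshow(radar_image, cmap="gray")
    ground = det_labels == 1
    obstacle = det_labels == 2
    plt.scatter(det_u[ground], det_v[ground], s=4, c="blue", label="ground")
    plt.scatter(det_u[obstacle], det_v[obstacle], s=4, c="red", label="obstacle")
    plt.legend(loc="upper right")
    plt.title("RCGD labels overlaid on radar image")
    plt.show()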
