Radar Sensing for Intelligent Vehicles in Urban Environments.

Reina G, Johnson D, Underwood J - Sensors (Basel) (2015)

Bottom Line: Radar overcomes the shortcomings of laser, stereovision, and sonar because it can operate successfully in dusty, foggy, blizzard-blinding, and poorly lit scenarios. The algorithm operates directly in the sensor frame, without the need for a separate synchronised navigation source, calibration parameters describing the location of the radar in the vehicle frame, or the geometric restrictions made in the previous main method in the field. Experimental results are presented in various urban scenarios to validate this approach, showing its potential applicability for advanced driving assistance systems and autonomous vehicle operations.

View Article: PubMed Central - PubMed

Affiliation: Department of Engineering for Innovation, University of Salento, via Arnesano, 73100 Lecce, Italy. giulio.reina@unisalento.it.

ABSTRACT
Radar overcomes the shortcomings of laser, stereovision, and sonar because it can operate successfully in dusty, foggy, blizzard-blinding, and poorly lit scenarios. This paper presents a novel method for ground and obstacle segmentation based on radar sensing. The algorithm operates directly in the sensor frame, without the need for a separate synchronised navigation source, calibration parameters describing the location of the radar in the vehicle frame, or the geometric restrictions made in the previous main method in the field. Experimental results are presented in various urban scenarios to validate this approach, showing its potential applicability for advanced driving assistance systems and autonomous vehicle operations.

No MeSH data available.


f4-sensors-15-14661: Radar beam model for flat, horizontal ground, up view (a); Uphill ground in view of the radar (b).

Mentions: The previous RGS method in this field showed that, knowing the pose of the vehicle and the characteristics of the radar beam, it is theoretically possible to locate the ground footprint. For example, for the radar configuration used in this work and under the assumption of quasi-horizontal ground, the area illuminated by the radar on the terrain is shown in Figure 4a. Clearly, the effect of roll and pitch is to deform the three annuli, stretching them along the roll and pitch axes and resulting in three ellipses. The main assumptions underlying the previously proposed RGS method can be summarized as follows.
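To make the flat-ground beam model above concrete, the following is a minimal sketch (not taken from the paper; the mounting height and beam depression angles are assumed purely for illustration) that computes the near and far bounds of the annulus illuminated on flat, horizontal ground by a downward-tilted beam, and approximates how small roll and pitch angles stretch a footprint circle into an ellipse.

```python
import math


def flat_ground_annulus(height_m, depr_near_deg, depr_far_deg):
    """Near and far range of the ring illuminated on flat, horizontal ground.

    height_m      : radar mounting height above the ground plane [m]
    depr_near_deg : steeper beam edge, degrees below the horizon (near bound)
    depr_far_deg  : shallower beam edge, degrees below the horizon (far bound)
    """
    # A ray leaving the radar at depression angle a reaches the ground at
    # horizontal range height / tan(a); the steeper edge gives the near bound.
    r_near = height_m / math.tan(math.radians(depr_near_deg))
    r_far = height_m / math.tan(math.radians(depr_far_deg))
    return r_near, r_far


def tilted_footprint_axes(radius_m, roll_deg, pitch_deg):
    """First-order stretch of a circular footprint into an ellipse when the
    vehicle rolls and pitches; valid only for small tilt angles."""
    semi_axis_pitch = radius_m / math.cos(math.radians(pitch_deg))
    semi_axis_roll = radius_m / math.cos(math.radians(roll_deg))
    return semi_axis_pitch, semi_axis_roll


if __name__ == "__main__":
    # Illustrative numbers only: 1.5 m mounting height, beam spanning
    # 3 to 10 degrees below the horizon (not taken from the paper).
    r_near, r_far = flat_ground_annulus(1.5, depr_near_deg=10.0, depr_far_deg=3.0)
    print(f"flat-ground annulus: {r_near:.1f} m to {r_far:.1f} m")

    a, b = tilted_footprint_axes(r_far, roll_deg=2.0, pitch_deg=5.0)
    print(f"outer ring stretched to semi-axes {a:.1f} m x {b:.1f} m")
```

With these assumed values the annulus spans roughly 8.5 m to 28.6 m of ground range; the 1/cos stretching is only a small-angle approximation of the effect of vehicle tilt described above.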

