Abstract
Stereo cameras are crucial sensors for self-driving vehicles because they are low-cost and can be used to estimate depth. They serve multiple purposes, such as object detection, depth estimation, and semantic segmentation. In this paper, we propose a stereo vision-based perception framework for autonomous vehicles. It runs three deep neural networks simultaneously to perform free-space detection, lane boundary detection, and object detection on image frames captured by the stereo camera. The distance of the detected objects from the vehicle is estimated from the disparity image computed from the two stereo image frames. The proposed stereo perception framework runs at 7.4 Hz on the Nvidia Drive PX 2 hardware platform, which makes it suitable for multi-sensor fusion for localization, mapping, and path planning in autonomous vehicle applications.
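The abstract describes estimating object depth from a disparity image computed over the stereo pair. Below is a minimal sketch of that disparity-to-depth step using OpenCV's semi-global block matching; the focal length, baseline, and matcher parameters are illustrative assumptions, not values taken from the paper.

```python
import cv2
import numpy as np

FOCAL_LENGTH_PX = 700.0   # assumed focal length in pixels (not from the paper)
BASELINE_M = 0.12         # assumed stereo baseline in metres (not from the paper)

def depth_from_stereo(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Compute a per-pixel depth map (metres) from a rectified grayscale stereo pair."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,   # search range; must be divisible by 16
        blockSize=5,
    )
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    # Standard pinhole stereo relation: depth = focal_length * baseline / disparity.
    depth[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity[valid]
    return depth
```

Given the depth map, the distance to a detected object can then be read off at (or averaged over) the pixels inside its bounding box.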
| Original language | English |
| --- | --- |
| Title | 2020 IEEE 91st Vehicular Technology Conference, VTC Spring 2020 - Proceedings |
| Number of pages | 6 |
| Electronic ISBN | 978-1-7281-5207-3 |
| DOIs | |
| Status | Published - 30 Jun 2020 |
| Event | 91st IEEE Vehicular Technology Conference (VTC2020-Spring), Antwerp, Belgium. Duration: 25 May 2020 → 28 May 2020 |
Conference

| Conference | 91st IEEE Vehicular Technology Conference (VTC2020-Spring) |
| --- | --- |
| Abbreviated title | VTC2020-Spring |
| Country/Region | Belgium |
| City | Antwerp |
| Period | 25/05/20 → 28/05/20 |