Towards Safe Autonomous Driving: Capture Uncertainty in the Deep Neural Network For Lidar 3D Vehicle Detection

Bibliographic Details
Published in: Proceedings (IEEE Conference on Intelligent Transportation Systems), pp. 3266-3273
Main Authors: Feng, Di, Rosenbaum, Lars, Dietmayer, Klaus
Format: Conference Proceeding
Language: English
Published: IEEE 01.11.2018
ISBN: 9781728103211, 1728103215
ISSN: 2153-0017
Description
Summary: To assure that an autonomous car drives safely on public roads, its object detection module should not only work correctly but also report its prediction confidence. Previous deep-learning object detectors do not explicitly model uncertainties in the neural network. We tackle this problem by presenting practical methods to capture uncertainties in a 3D vehicle detector for Lidar point clouds. The proposed probabilistic detector represents reliable epistemic and aleatoric uncertainty in the classification and localization tasks. Experimental results show that the epistemic uncertainty is related to detection accuracy, whereas the aleatoric uncertainty is influenced by vehicle distance and occlusion. The results also show that modeling the aleatoric uncertainty improves detection performance by 1%-5%.
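The abstract does not spell out the uncertainty-modeling machinery, but the two kinds of uncertainty it names are commonly captured with standard techniques: aleatoric uncertainty via a heteroscedastic (attenuated) regression loss in which the network predicts a log variance alongside each localization output, and epistemic uncertainty via Monte Carlo sampling of a stochastic forward pass (e.g. MC dropout). The sketch below illustrates those generic formulations only; the function names are hypothetical and this is not the authors' implementation.

```python
import math

def aleatoric_regression_loss(pred, log_var, target):
    """Heteroscedastic regression loss: the network predicts a log variance
    (log_var) for each regression output, so targets with high predicted
    noise (e.g. distant or occluded vehicles) contribute a down-weighted
    squared error, while the 0.5 * log_var term penalizes claiming high
    variance everywhere."""
    return 0.5 * math.exp(-log_var) * (pred - target) ** 2 + 0.5 * log_var

def mc_prediction_stats(stochastic_forward, n_samples=50):
    """Epistemic uncertainty via Monte Carlo sampling: call a stochastic
    forward pass (e.g. a network with dropout left active at test time)
    several times and return the mean and variance of the scalar outputs.
    High variance across samples indicates high model (epistemic)
    uncertainty."""
    samples = [stochastic_forward() for _ in range(n_samples)]
    mean = sum(samples) / n_samples
    var = sum((s - mean) ** 2 for s in samples) / n_samples
    return mean, var
```

As a sanity check on the loss: a perfect prediction with unit variance (log_var = 0) costs nothing, and for a fixed error the loss is reduced when the network honestly predicts a larger variance, which is exactly the attenuation effect that lets noisy samples hurt training less.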
DOI: 10.1109/ITSC.2018.8569814