LiDAR target fusion and algorithm detection based on improved YOLO

Bibliographic Details
Published in: Journal of Physics: Conference Series, Vol. 1682, No. 1, pp. 12010–12015
Main Authors: Yu, Zhuoquan, Liu, Jingyao, Xu, Wenyang, Liu, Yiding, Lu, Chenji
Format: Journal Article
Language: English
Published: IOP Publishing 01.11.2020
ISSN: 1742-6588 (print), 1742-6596 (online)
Description
Summary: Target detection is essential to safe autonomous driving. At present, obstacle detection typically relies on a single sensor, which struggles in complex road environments and is prone to missed detections. This paper therefore proposes a detection system that combines a color camera with LiDAR. Building on the YOLO algorithm, the method improves the detection of small targets such as non-motorized vehicles and pedestrians: image samples are used to train the detection model, and the color image is then fused with the LiDAR depth image to raise detection accuracy. The decision-level fusion is verified on test samples. The results show that the improved YOLO algorithm with decision-level fusion achieves higher target detection accuracy, meets real-time requirements, and reduces the missed-detection rate for small, weak targets such as non-motorized vehicles and pedestrians. The proposed method thus offers good performance and broad application prospects while balancing accuracy and real-time operation.
DOI: 10.1088/1742-6596/1682/1/012010
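
The summary describes decision-level fusion: the camera's color image and the LiDAR's depth image are each processed by a detector, and the per-sensor results are merged so that small targets seen by only one modality are not lost. The sketch below is a minimal, hypothetical Python illustration of that idea; the `Detection` type, the `fuse_decisions` function, the IoU threshold, and the noisy-OR score combination are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch of decision-level camera/LiDAR fusion.
# All names and thresholds are illustrative, not from the paper.
from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

@dataclass
class Detection:
    box: Box
    score: float   # detector confidence in [0, 1]
    label: str     # e.g. "pedestrian", "cyclist"

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def fuse_decisions(cam: List[Detection], lidar: List[Detection],
                   iou_thr: float = 0.5) -> List[Detection]:
    """Decision-level fusion: match per-sensor boxes by overlap and
    label; agreement boosts confidence, and unmatched boxes are kept
    so a target seen by only one sensor is not discarded."""
    fused: List[Detection] = []
    used = set()  # indices of LiDAR detections already matched
    for c in cam:
        best_j, best_iou = -1, iou_thr
        for j, l in enumerate(lidar):
            if j in used or l.label != c.label:
                continue
            ov = iou(c.box, l.box)
            if ov > best_iou:
                best_j, best_iou = j, ov
        if best_j >= 0:
            used.add(best_j)
            # Noisy-OR combination: both sensors agree, so the fused
            # confidence exceeds either single-sensor score.
            s = 1.0 - (1.0 - c.score) * (1.0 - lidar[best_j].score)
            fused.append(Detection(c.box, s, c.label))
        else:
            fused.append(c)  # camera-only detection
    # LiDAR-only detections (e.g. targets the camera misses in low light)
    fused.extend(l for j, l in enumerate(lidar) if j not in used)
    return fused

if __name__ == "__main__":
    cam = [Detection((100, 80, 140, 180), 0.62, "pedestrian")]
    lidar = [Detection((102, 82, 141, 178), 0.55, "pedestrian"),
             Detection((300, 90, 360, 170), 0.70, "cyclist")]
    for d in fuse_decisions(cam, lidar):
        print(d)
```

Keeping unmatched single-sensor boxes, rather than requiring both sensors to agree, is one design choice consistent with the reduced missed-detection rate the summary reports for pedestrians and non-motorized vehicles.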