LiDAR target fusion and algorithm detection based on improved YOLO

Published in: Journal of Physics: Conference Series, Vol. 1682, No. 1, pp. 12010 - 12015
Main authors: Yu, Zhuoquan; Liu, Jingyao; Xu, Wenyang; Liu, Yiding; Lu, Chenji
Medium: Journal Article
Language: English
Published: IOP Publishing, 1 November 2020
ISSN: 1742-6588, 1742-6596
Description
Summary: To achieve safe driving behavior, the most important task in autonomous driving is target detection. At present, obstacle judgment is based on a single sensor, which makes complex road environments difficult to handle and leads to missed detections. This paper therefore proposes a system that combines color camera technology with LiDAR. On the basis of YOLO, an improved detection method is presented that strengthens the detection of small targets such as non-motorized vehicles and pedestrians. Building on the YOLO algorithm, images and other samples are used to obtain the relevant data, from which the detection system model is built. The sensors are then fused, combining the color image with the depth image to improve detection accuracy, and the decision-level fusion is verified on test samples. The results show that the improved YOLO algorithm and the decision-level fusion algorithm achieve higher target detection accuracy, meet real-time requirements, and reduce the miss rate for small, weak targets such as non-motorized vehicles and pedestrians. The proposed method therefore offers good performance and broad application prospects while accounting for both accuracy and real-time operation.
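The abstract describes decision-level fusion of camera and LiDAR detections, but the record includes no code. As a rough illustration only, the following Python sketch shows one common way such fusion can be done: detections from the two sensors (here assumed to be boxes already projected into a shared image frame) are matched by IoU, a matched pair keeps the higher-confidence box, and unmatched boxes from either sensor are retained, which is how fusion can recover small targets that a single sensor misses. The function names, the detection tuple layout, and the 0.5 threshold are assumptions for illustration, not the authors' implementation.

    # Hypothetical sketch of decision-level fusion (not the paper's code).
    # Each detection: (x1, y1, x2, y2, confidence, class_id), with boxes
    # from both sensors already projected into the same image frame.

    def iou(a, b):
        """Intersection-over-union of two axis-aligned boxes."""
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter + 1e-9)

    def fuse_detections(camera_dets, lidar_dets, iou_thresh=0.5):
        """Match detections across sensors by IoU and same class.

        Matched pairs keep the higher-confidence box; unmatched boxes
        from either sensor are kept as-is, so a target seen by only
        one sensor is not lost.
        """
        fused, used = [], set()
        for cam in camera_dets:
            best_j, best_iou = -1, iou_thresh
            for j, lid in enumerate(lidar_dets):
                if j in used or lid[5] != cam[5]:
                    continue  # skip matched boxes and class mismatches
                overlap = iou(cam, lid)
                if overlap > best_iou:
                    best_j, best_iou = j, overlap
            if best_j >= 0:
                used.add(best_j)
                fused.append(max(cam, lidar_dets[best_j], key=lambda d: d[4]))
            else:
                fused.append(cam)  # camera-only detection
        fused.extend(l for j, l in enumerate(lidar_dets) if j not in used)
        return fused

    # Example: the camera and LiDAR agree on a pedestrian (class 0);
    # the LiDAR alone sees a cyclist (class 1) that is also kept.
    cam = [(100, 120, 180, 260, 0.78, 0)]
    lid = [(104, 118, 182, 255, 0.64, 0),
           (400, 300, 460, 360, 0.55, 1)]
    print(fuse_detections(cam, lid))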
DOI: 10.1088/1742-6596/1682/1/012010