Design of 2D LiDAR and camera fusion system improved by differential evolutionary PID with nonlinear tracking compensator

Detailed bibliography
Published in: Infrared Physics & Technology, Volume 116, p. 103776
Main authors: Xu, Xiaobin; Zhao, Minghui; Lu, Yonghua; Ran, Yingying; Tan, Zhiying; Luo, Minzhou
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.08.2021
ISSN: 1350-4495, 1879-0275
Description
Summary:

Highlights:
• A differential evolutionary PID with a nonlinear tracking compensator is proposed to accurately control the LiDAR pitching motion.
• A quadratic polynomial transition function is used to optimize the pitching trajectory.
• The fused system can obtain a homogeneous, dense, colored 3D point cloud.

An improved 2D LiDAR and camera fusion system is proposed for the 3D reconstruction of unknown environments. It combines the advantages of the dense 2D point cloud and the rich color image, adopting a differential evolutionary nonlinear tracking PID to accurately control the pitching motion of the LiDAR and camera. A quadratic polynomial transition function is used to optimize the pitching trajectory. The environment is scanned by the system and converted into a colored 3D point cloud by the data fusion algorithm. The experimental results show that the proposed PID control algorithm controls the pitching motion accurately, with a small average error (0.0267°), and significantly reduces point cloud inhomogeneity (0.00698); the processing time for converting each 2D point cloud into the 3D point cloud is about 0.6 ms; combined with the data fusion algorithm, the system obtains a dense colored 3D point cloud; and compared with a binocular camera, a depth camera, and a 3D LiDAR under strong light interference, the fusion system performs best, with reconstruction errors in object distance, length, and width of 0.23%, 0.17%, and 0.46%, respectively. In conclusion, the system can obtain a homogeneous, dense, colored 3D point cloud in real time while maintaining a stable refresh frame rate.
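The abstract summarizes the pipeline (pitch-controlled 2D scanning, 2D-to-3D conversion, color fusion) without reproducing the equations. Purely as an illustration of the usual geometry behind such a rig, the Python sketch below converts a single 2D scan point (range, bearing) into 3D using the measured pitch angle, then projects it through assumed camera calibration parameters (`K`, `R_cl`, `t_cl`) to pick up a color. The function names, the choice of pitch axis, and the zero lever-arm offset are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def scan_point_to_3d(r, theta, phi):
    """Convert one 2D LiDAR measurement into a 3D point.

    r     -- measured range (m)
    theta -- bearing angle of the point within the scan plane (rad)
    phi   -- current pitch angle of the scanning platform (rad)

    Assumes the pitch axis passes through the LiDAR optical centre;
    a real rig would also apply a fixed lever-arm offset.
    """
    # Point in the (un-pitched) 2D scan plane of the LiDAR
    p_scan = np.array([r * np.cos(theta), r * np.sin(theta), 0.0])

    # Rotate the scan plane about the pitch axis (taken here as the y-axis)
    R_pitch = np.array([
        [ np.cos(phi), 0.0, np.sin(phi)],
        [ 0.0,         1.0, 0.0        ],
        [-np.sin(phi), 0.0, np.cos(phi)],
    ])
    return R_pitch @ p_scan


def colorize(p_lidar, image, K, R_cl, t_cl):
    """Look up the image color for a 3D LiDAR point.

    K          -- 3x3 camera intrinsic matrix
    R_cl, t_cl -- extrinsics mapping LiDAR coordinates into the camera frame
    Returns the pixel value, or None if the point projects outside the image.
    """
    p_cam = R_cl @ p_lidar + t_cl
    if p_cam[2] <= 0:                      # point is behind the camera
        return None
    u, v, w = K @ p_cam                    # pinhole projection
    col, row = int(round(u / w)), int(round(v / w))
    h, width = image.shape[:2]
    if 0 <= col < width and 0 <= row < h:
        return image[row, col]
    return None
```

Applied per point over a full scan, this kind of rotation-plus-projection step is consistent with the per-scan conversion time on the order of a millisecond reported in the abstract, though the paper's own fusion equations should be consulted for the exact formulation.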
DOI: 10.1016/j.infrared.2021.103776