Analysis of the Influence of Foggy Weather Environment on the Detection Effect of Machine Vision Obstacles

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 20, no. 2, p. 349
Main Authors: Liu, Zhaohui; He, Yongjiang; Wang, Chao; Song, Runze
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 08.01.2020
ISSN: 1424-8220
Description
Summary: This study analyzes the influence of visibility in a foggy weather environment on the accuracy of machine vision obstacle detection in assisted driving. We present a foggy-day imaging model and analyze its image characteristics, then set up the faster region-based convolutional neural network (Faster R-CNN) as the basic network for target detection in the simulation experiment, using Karlsruhe Institute of Technology and Toyota Technological Institute (KITTI) data to train the network for detection and classification. PreScan software is used to build weather and traffic scenes based on the foggy imaging model, and we study machine vision object detection by simulation experiment under four weather conditions: clear (no fog), light fog, medium fog, and heavy fog. The experimental results show detection recalls of 91.55%, 85.21%, 72.54%~64.79%, and 57.75% in the no-fog, light-fog, medium-fog, and heavy-fog environments, respectively. We then used real scenes in medium-fog and heavy-fog environments to verify the simulation experiment. Through this study, we can determine the influence of bad weather on the detection results of machine vision, and hence improve the safety of assisted driving through further research.
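The abstract does not spell out the foggy-day imaging model, but fog simulation of this kind is commonly based on the atmospheric scattering (Koschmieder) model, I(x) = J(x)·t(x) + A·(1 − t(x)), with transmission t(x) = exp(−β·d(x)) and extinction coefficient β tied to meteorological visibility. A minimal sketch under that assumption (the function names, the airlight value, and the 3.912/V visibility relation at the 5% contrast threshold are illustrative, not taken from the paper):

```python
import numpy as np

def fog_transmission(depth_m, visibility_m):
    """Transmission t = exp(-beta * d), with beta ~= 3.912 / visibility.

    The 3.912 factor comes from defining visibility at the 5% contrast
    threshold: exp(-beta * V) = 0.05  =>  beta = -ln(0.05) / V.
    """
    beta = 3.912 / visibility_m
    return np.exp(-beta * np.asarray(depth_m, dtype=float))

def apply_fog(scene_radiance, depth_m, visibility_m, airlight=0.9):
    """Koschmieder model: I = J * t + A * (1 - t).

    scene_radiance: clear-scene intensities in [0, 1]
    airlight: assumed global atmospheric light (hypothetical value)
    """
    t = fog_transmission(depth_m, visibility_m)
    return scene_radiance * t + airlight * (1.0 - t)

# A mid-grey obstacle 50 m away under roughly "medium fog" (~200 m visibility)
# drifts toward the airlight value as transmission drops.
foggy = apply_fog(np.array([0.5]), depth_m=50.0, visibility_m=200.0)
```

As visibility shrinks, t falls and every pixel is pulled toward the uniform airlight A, which erodes the contrast that a detector such as Faster R-CNN relies on and is consistent with the recall drop the study reports from clear weather to heavy fog.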
DOI: 10.3390/s20020349