Dynamically Activated De-Glaring and Detail-Recovery for Low-Light Image Enhancement Directly on Smart Cameras

Published in: IEEE Transactions on Emerging Topics in Computing, Volume 13, Issue 1, pp. 222-233
Main Authors: Dong, Shao-Wei; Lu, Ching-Hu
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2025
ISSN: 2168-6750
DOI: 10.1109/TETC.2024.3403935
Summary: Low-light conditions often significantly degrade the stability of a computer-vision system. Existing studies of unpaired-learning-based low-light image enhancement do not consider the glare that occurs at night, which can substantially degrade image quality. To address this, our study proposes an additional enhancement module that can be attached to existing methods: a "lightweight low-light image de-glaring network" that removes glare from low-light images. We also propose a "low-light image-detail-recovery network" that restores the boundary details of low-light images after glare removal, further improving image quality. Experimental results show that the proposed approaches effectively improve low-light image quality. In addition, we propose "dynamically activated de-glaring," which first assesses the quality of each input image to decide whether de-glaring should be undertaken, so that the computational resources of a smart camera are used efficiently and unnecessary image enhancement is avoided. Experimental results show that running time and frames per second improve greatly when the approach is applied to real-world scenarios.
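The abstract does not give implementation details, but the gating idea can be sketched. Below is a minimal, hypothetical Python sketch of "dynamically activated de-glaring": a cheap glare score gates whether the de-glaring and detail-recovery networks run before the base enhancer. The names (glare_score, GLARE_THRESHOLD, deglare_net, recovery_net, base_enhancer) and the brightness-based quality metric are illustrative assumptions, not the paper's actual components.

```python
import numpy as np

# Assumed tuning knob: above this score we treat the frame as glare-affected.
# Assumes pixel intensities normalized to [0, 1].
GLARE_THRESHOLD = 0.85

def glare_score(image: np.ndarray) -> float:
    """Stand-in quality metric: mean of the brightest 1% of pixels.

    The paper's actual image-quality assessment is not described in the
    abstract; any cheap glare indicator could fill this role.
    """
    flat = image.reshape(-1)
    k = max(1, flat.size // 100)           # top 1% of pixel values
    brightest = np.partition(flat, -k)[-k:]
    return float(brightest.mean())

def enhance_frame(image, deglare_net, recovery_net, base_enhancer):
    """Dynamically activated de-glaring: run the extra networks only when
    the gate fires, saving smart-camera compute on glare-free frames."""
    if glare_score(image) > GLARE_THRESHOLD:
        image = deglare_net(image)         # remove night-time glare
        image = recovery_net(image)        # restore boundary detail lost above
    return base_enhancer(image)            # existing low-light enhancement
```

A real deployment would swap the heuristic for the paper's learned quality assessment; the gate structure, and the compute savings it buys on glare-free frames, stay the same.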