Pixel-wise Wasserstein Autoencoder for Highly Generative Dehazing

Detailed Bibliography
Published in: IEEE Transactions on Image Processing, Volume 30, p. 1
Main Authors: Kim, Guisik; Park, Sung Woo; Kwon, Junseok
Format: Journal Article
Language: English
Publication Details: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2021
ISSN: 1057-7149, 1941-0042
Online Access: Get full text
Description
Summary: We propose a highly generative dehazing method based on pixel-wise Wasserstein autoencoders. In contrast to existing dehazing methods based on generative adversarial networks, our method can produce a variety of dehazed images with different styles. It significantly improves the dehazing accuracy via pixel-wise matching from hazy to dehazed images through 2-dimensional latent tensors of the Wasserstein autoencoder. In addition, we present an advanced feature fusion technique to deliver rich information to the latent space. For style transfer, we introduce a mapping function that transforms existing latent spaces to new ones. Thus, our method can produce highly generative haze-free images with various tones, illuminations, and moods, which induces several interesting applications, including low-light enhancement, daytime dehazing, nighttime dehazing, and underwater image enhancement. Experimental results demonstrate that our method quantitatively outperforms existing state-of-the-art methods for synthetic and real-world datasets, and simultaneously generates highly generative haze-free images, which are qualitatively diverse.
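To make the "2-dimensional latent tensor" idea concrete, the following is a minimal, hypothetical PyTorch sketch and not the authors' architecture: a fully convolutional Wasserstein autoencoder whose latent code keeps a spatial layout, trained with a reconstruction loss plus a Gaussian-kernel MMD penalty toward a standard normal prior. The names SpatialWAE and mmd_penalty, all layer sizes, the kernel bandwidth sigma, and the penalty weight are illustrative assumptions; the paper's feature fusion and latent-space mapping function for style transfer are omitted.

# Hypothetical WAE-MMD sketch with a spatial ("pixel-wise") latent map.
# Not the published method: architecture and hyperparameters are assumed.
import torch
import torch.nn as nn

class SpatialWAE(nn.Module):
    """Fully convolutional autoencoder whose latent code is a 2-D tensor,
    so each latent location corresponds to a patch of the input image."""
    def __init__(self, latent_ch: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, latent_ch, 3, padding=1),   # spatial latent map
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent_ch, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)          # (B, latent_ch, H/4, W/4)
        return self.decoder(z), z

def mmd_penalty(z: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Biased Gaussian-kernel MMD between the aggregate posterior and a
    N(0, I) prior, computed over flattened latent vectors (one per image)."""
    z = z.flatten(1)
    prior = torch.randn_like(z)
    def k(a, b):
        return torch.exp(-torch.cdist(a, b).pow(2) / (2 * sigma ** 2))
    return k(z, z).mean() + k(prior, prior).mean() - 2 * k(z, prior).mean()

# One training step: reconstruct the haze-free target from the hazy input
# and regularize the latent distribution (random tensors stand in for data).
hazy, clear = torch.rand(4, 3, 64, 64), torch.rand(4, 3, 64, 64)
model = SpatialWAE()
recon, z = model(hazy)
loss = nn.functional.mse_loss(recon, clear) + 10.0 * mmd_penalty(z)
loss.backward()

Because the latent tensor is spatial rather than a single flat vector, each latent location is matched to a local image region, which is one plausible reading of the pixel-wise matching the abstract describes.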
DOI: 10.1109/TIP.2021.3084743