Searching Towards Class-Aware Generators for Conditional Generative Adversarial Networks

Detailed Bibliography
Published in: IEEE Signal Processing Letters, Volume 29, pp. 1669-1673
Main Authors: Zhou, Peng; Xie, Lingxi; Ni, Bingbing; Tian, Qi
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
ISSN:1070-9908, 1558-2361
Description
Summary: Conditional generative adversarial networks (cGANs) are designed to generate images based on provided conditions, e.g., class-level distributions, semantic label maps, etc. Existing methods use the same generator architecture for all classes. This paper presents an idea that adopts neural architecture search (NAS) to find a class-aware architecture for each class. The search space contains regular and class-modulated convolutions, where the latter is designed to introduce class-specific information while avoiding the reduction of training data for each class generator. The search algorithm follows a weight-sharing pipeline with mixed-architecture optimization so that the search cost does not grow with the number of classes. To learn the sampling policy, a Markov decision process is embedded into the search algorithm, and a moving average is applied for better stability. Class-aware generators show advantages over class-agnostic architectures experimentally. Moreover, we discover two intriguing phenomena that are inspirational to crafting cGANs by hand.
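To make the "class-modulated convolution" notion from the abstract concrete, the following is a minimal illustrative sketch (not the authors' implementation): a single convolution kernel is shared across all classes, and a per-class embedding rescales the input channels, so class-specific behavior is injected without training a separate generator per class. All names and the choice of a 1x1 convolution are assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

num_classes, in_ch, out_ch = 10, 8, 16
# Shared 1x1 convolution kernel (out_ch, in_ch) -- illustrative choice.
W = rng.standard_normal((out_ch, in_ch))
# One learnable modulation vector per class, initialized to ones
# (identity scaling); in training these would be updated by gradients.
class_scale = np.ones((num_classes, in_ch))

def class_modulated_conv(x, class_ids):
    """x: (B, in_ch, H, W) feature maps; class_ids: (B,) integer labels.

    Scales each sample's input channels by its class embedding, then
    applies the shared 1x1 convolution as an einsum over channels.
    """
    s = class_scale[class_ids][:, :, None, None]   # (B, in_ch, 1, 1)
    return np.einsum('oc,bchw->bohw', W, x * s)

x = rng.standard_normal((4, in_ch, 5, 5))
y = class_modulated_conv(x, np.array([0, 1, 2, 3]))
print(y.shape)  # (4, 16, 5, 5)
```

Because the kernel `W` is shared, every class's data contributes to training it, which matches the abstract's point about avoiding a reduction of training data per class generator; only the small per-class scale vectors are class-specific.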
DOI:10.1109/LSP.2022.3193589