Searching Towards Class-Aware Generators for Conditional Generative Adversarial Networks

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 29, pp. 1669-1673
Main Authors: Zhou, Peng; Xie, Lingxi; Ni, Bingbing; Tian, Qi
Format: Journal Article
Language: English
Published: New York: IEEE, 2022
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 1070-9908, 1558-2361
Online Access: Full text
Description
Abstract: Conditional generative adversarial networks (cGANs) are designed to generate images based on provided conditions, e.g., class-level distributions, semantic label maps, etc. Existing methods use the same generator architecture for all classes. This paper presents an approach that adopts neural architecture search (NAS) to find a class-aware architecture for each class. The search space contains regular and class-modulated convolutions, where the latter are designed to introduce class-specific information while avoiding the reduction of training data available to each class generator. The search algorithm follows a weight-sharing pipeline with mixed-architecture optimization, so the search cost does not grow with the number of classes. To learn the sampling policy, a Markov decision process is embedded into the search algorithm, and a moving average is applied for better stability. Experiments show that class-aware generators outperform class-agnostic architectures. Moreover, we discover two intriguing phenomena that offer inspiration for crafting cGANs by hand.
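
The class-modulated convolution mentioned in the abstract can be pictured with a short sketch. The following PyTorch code is a minimal illustration, assuming a conditional-normalization-style design in which all classes share the convolution weights and only per-class scale (gamma) and shift (beta) parameters differ; the name ClassModulatedConv2d and this exact parameterization are illustrative assumptions, not necessarily the paper's formulation.

import torch
import torch.nn as nn

class ClassModulatedConv2d(nn.Module):
    # Hypothetical sketch: a shared convolution whose output is scaled
    # and shifted by class-specific parameters. Sharing the kernel
    # across classes avoids splitting the training data per class,
    # while the per-class gamma/beta still inject class identity.
    def __init__(self, in_ch, out_ch, num_classes, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              padding=kernel_size // 2)
        self.gamma = nn.Embedding(num_classes, out_ch)  # per-class scale
        self.beta = nn.Embedding(num_classes, out_ch)   # per-class shift
        nn.init.ones_(self.gamma.weight)   # start as identity modulation
        nn.init.zeros_(self.beta.weight)

    def forward(self, x, y):
        # x: (N, in_ch, H, W) feature maps; y: (N,) integer class labels
        h = self.conv(x)
        g = self.gamma(y).unsqueeze(-1).unsqueeze(-1)  # (N, out_ch, 1, 1)
        b = self.beta(y).unsqueeze(-1).unsqueeze(-1)
        return g * h + b

# Example usage (shapes only):
layer = ClassModulatedConv2d(in_ch=64, out_ch=128, num_classes=10)
out = layer(torch.randn(4, 64, 16, 16), torch.randint(0, 10, (4,)))

Under this reading, a NAS search space could offer both a regular nn.Conv2d (class-agnostic) and a ClassModulatedConv2d (class-aware) as candidate operations at each layer, matching the two operation types the abstract describes.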
DOI: 10.1109/LSP.2022.3193589