
Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks

by Massimiliano Lupo Pasini, Junqi Yin
Publication Type
Conference Paper
Book Title
2021 International Conference on Computational Science and Computational Intelligence (CSCI)
Page Numbers
1 to 7
Publisher Location
New Jersey, United States of America
Conference Name
The 2021 International Conference on Computational Science and Computational Intelligence (CSCI'21)
Conference Location
Las Vegas, Nevada, United States of America
Conference Sponsor
IEEE

We use a stable parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs). The parallel training reduces the risk of mode collapse and enhances scalability by concurrently training multiple generators, each focusing on a single data label. The use of the Wasserstein metric reduces the risk of cycling by stabilizing the training of each generator. We apply the approach to the CIFAR10 and CIFAR100 datasets, two standard benchmark datasets with images of the same resolution but a different number of classes. Performance is assessed using the inception score, the Fréchet inception distance, and image quality. Both the inception score and the Fréchet inception distance improve over previous results obtained by applying the same parallel approach to deep convolutional conditional generative adversarial neural networks (DC-CGANs). Weak scaling is attained on both datasets using up to 100 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
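
To give a concrete picture of the scheme described in the abstract, the following is a minimal illustrative PyTorch sketch, not the authors' implementation: one independent Wasserstein generator/critic pair is trained per class label, which is what makes the label-parallel training embarrassingly parallel. The network sizes, hyperparameters, data loader, and weight-clipping choice are assumptions made purely for illustration; in the paper's setting each pair would run concurrently on its own GPU or MPI rank rather than in a serial loop, and no explicit label conditioning is needed inside a generator because each generator is dedicated to a single label.

# Illustrative sketch only (assumed hyperparameters and architecture, not the paper's code):
# one independent Wasserstein generator/critic pair per class label.
import torch
import torch.nn as nn

LATENT_DIM, IMG_DIM, N_CRITIC, CLIP = 100, 3 * 32 * 32, 5, 0.01  # assumed values

def make_generator():
    # Tanh output assumes real images are normalized to [-1, 1].
    return nn.Sequential(nn.Linear(LATENT_DIM, 512), nn.ReLU(),
                         nn.Linear(512, IMG_DIM), nn.Tanh())

def make_critic():
    # No sigmoid: a Wasserstein critic outputs an unbounded score.
    return nn.Sequential(nn.Linear(IMG_DIM, 512), nn.LeakyReLU(0.2),
                         nn.Linear(512, 1))

def train_one_label(real_loader, steps=1000, device="cpu"):
    """Train a single (generator, critic) pair on images of one class label.
    `real_loader` is assumed to yield batches of images from that label only."""
    G, D = make_generator().to(device), make_critic().to(device)
    opt_g = torch.optim.RMSprop(G.parameters(), lr=5e-5)
    opt_d = torch.optim.RMSprop(D.parameters(), lr=5e-5)
    data = iter(real_loader)
    for _ in range(steps):
        # Critic updates: maximize E[D(x)] - E[D(G(z))] (minimize its negative).
        for _ in range(N_CRITIC):
            try:
                real = next(data)
            except StopIteration:
                data = iter(real_loader)
                real = next(data)
            real = real.view(real.size(0), -1).to(device)
            z = torch.randn(real.size(0), LATENT_DIM, device=device)
            loss_d = -(D(real).mean() - D(G(z).detach()).mean())
            opt_d.zero_grad(); loss_d.backward(); opt_d.step()
            for p in D.parameters():           # weight clipping enforces the Lipschitz
                p.data.clamp_(-CLIP, CLIP)     # constraint, as in the original WGAN
        # Generator update: minimize -E[D(G(z))].
        z = torch.randn(real.size(0), LATENT_DIM, device=device)
        loss_g = -D(G(z)).mean()
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return G

# The per-label trainings are independent, so they can be launched concurrently
# (e.g. one GPU or MPI rank per label). A serial stand-in, with a hypothetical
# `loader_for(label)` helper, would look like:
# generators = {label: train_one_label(loader_for(label)) for label in range(num_classes)}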