Evolutionary Architecture Search for Generative Adversarial Networks Based on Weight Sharing

Publication Type: Journal
Journal Name: IEEE Transactions on Evolutionary Computation
Publication Date:
Pages: 653–667
Volume: 28
Issue: 3

Generative adversarial networks (GANs) are a powerful generative technique but frequently face challenges with training stability. Network architecture plays a significant role in determining the final output of GANs, but designing a well-performing architecture demands extensive domain expertise. This article addresses this issue by searching for high-performance generator architectures through neural architecture search (NAS). The proposed approach, called evolutionary weight-sharing GANs (EWSGAN), is based on weight sharing and comprises two steps. First, a supernet of the generator is trained using weight sharing. Second, a multiobjective evolutionary algorithm (MOEA) is employed to identify optimal subnets from the supernet. These subnets inherit weights directly from the supernet for fitness assessment. Two strategies are used to stabilize the training of the generator supernet: 1) a fair single-path sampling strategy and 2) a discarding strategy. Experimental results indicate that the architecture found by our method achieves a new state of the art among NAS–GAN methods, with a Fréchet inception distance (FID) of 9.09 and an inception score (IS) of 8.99 on the CIFAR-10 dataset. It also demonstrates competitive performance on the STL-10 dataset, achieving an FID of 21.89 and an IS of 10.51.
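The fair single-path sampling strategy mentioned above can be illustrated with a minimal sketch. The abstract does not give implementation details, so the function name and structure below are assumptions: the idea, common in weight-sharing NAS, is that each searchable layer holds several candidate operations, and within one "round" every candidate of every layer is activated exactly once, in a random order, so all shared weights receive equal training signal.

```python
import random

def sample_fair_paths(num_layers, num_candidates, seed=None):
    """Hypothetical sketch of fair single-path sampling: return
    num_candidates single paths that, together, cover every candidate
    operation of every layer exactly once (one 'fair' round)."""
    rng = random.Random(seed)
    # One independent random permutation of candidate indices per layer.
    perms = [rng.sample(range(num_candidates), num_candidates)
             for _ in range(num_layers)]
    # Path k selects, for each layer, the k-th entry of that layer's
    # permutation; a training step would activate only this path's ops.
    return [[perms[layer][k] for layer in range(num_layers)]
            for k in range(num_candidates)]

# Example round for a 4-layer supernet with 3 candidate ops per layer.
paths = sample_fair_paths(num_layers=4, num_candidates=3, seed=0)
# Fairness property: across the round, each layer uses each candidate once.
for layer in range(4):
    assert sorted(path[layer] for path in paths) == [0, 1, 2]
```

During the evolutionary search stage, a candidate subnet would simply be one such path whose operations copy their weights from the trained supernet before FID/IS-based fitness evaluation.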