Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks: Full/Regular Research Paper submission for the symposium CSCI-ISAI: Artificial Intelligence

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We use a stable parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs). The parallel training reduces the risk of mode collapse and enhances scalability by concurrently training multiple generators, each focusing on a single data label. The use of the Wasserstein metric reduces the risk of cycling by stabilizing the training of each generator. We apply the approach to the CIFAR10 and CIFAR100 datasets, two standard benchmark datasets with images of the same resolution but different numbers of classes. Performance is assessed using the inception score, the Fréchet inception distance, and image quality. Improvements in inception score and Fréchet inception distance are shown in comparison to previous results obtained by applying the parallel approach to deep convolutional conditional generative adversarial neural networks (DC-CGANs). Weak scaling is attained on both datasets using up to 100 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
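The two ingredients named in the abstract — one generator per class label, and the Wasserstein objective for each generator–critic pair — can be illustrated with a minimal sketch. This is not the paper's actual code; the function names and the scalar "critic scores" below are hypothetical stand-ins.

```python
from collections import defaultdict

# Hypothetical sketch of the per-label parallel scheme: the labeled
# dataset is partitioned by class, and each class is handed to its own
# generator, so the generators can be trained concurrently (e.g., one
# per GPU) without interfering with one another.
def partition_by_label(samples, labels):
    """Group samples so each generator sees only a single class."""
    shards = defaultdict(list)
    for x, y in zip(samples, labels):
        shards[y].append(x)
    return dict(shards)

def mean(xs):
    return sum(xs) / len(xs)

# Standard WGAN losses: the critic maximizes
# E[f(x_real)] - E[f(x_fake)], so its loss is the negation;
# the generator minimizes -E[f(x_fake)].
def critic_loss(scores_real, scores_fake):
    return mean(scores_fake) - mean(scores_real)

def generator_loss(scores_fake):
    return -mean(scores_fake)

# Toy usage with scalar scores standing in for critic outputs.
shards = partition_by_label(["a", "b", "c", "d"], [0, 1, 0, 1])
print(shards)                                # {0: ['a', 'c'], 1: ['b', 'd']}
print(critic_loss([1.0, 1.0], [0.0, 0.0]))  # -1.0
print(generator_loss([2.0, 4.0]))            # -3.0
```

Because each class's generator–critic pair touches only its own shard, the pairs share no parameters and need no gradient synchronization, which is what makes the weak scaling reported in the abstract possible.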

Original language: English
Title of host publication: Proceedings - 2021 International Conference on Computational Science and Computational Intelligence, CSCI 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-7
Number of pages: 7
ISBN (Electronic): 9781665458412
DOIs
State: Published - 2021
Event: 2021 International Conference on Computational Science and Computational Intelligence, CSCI 2021 - Las Vegas, United States
Duration: Dec 15 2021 - Dec 17 2021

Publication series

Name: Proceedings - 2021 International Conference on Computational Science and Computational Intelligence, CSCI 2021

Conference

Conference: 2021 International Conference on Computational Science and Computational Intelligence, CSCI 2021
Country/Territory: United States
City: Las Vegas
Period: 12/15/21 - 12/17/21

Funding

Massimiliano Lupo Pasini thanks Dr. Vladimir Protopopescu for his valuable feedback in the preparation of this manuscript. This work used resources of the Oak Ridge Leadership Computing Facility, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. This research is sponsored by the Artificial Intelligence Initiative as part of the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory, managed by UT-Battelle, LLC, for the US Department of Energy under contract DE-AC05-00OR22725. This manuscript has been authored in part by UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the US Department of Energy.

Keywords

  • Artificial Intelligence
  • High Performance Computing
  • Multicore processing
