
Research on Image Generation Methods Based on Adversarial Neural Networks

Renjie Ding * 1
1 Shanghai Maritime University, 1550 Harbour Avenue, Shanghai, 200135, China

* Author to whom correspondence should be addressed.

Applied and Computational Engineering, Vol. 2, 73-83
Published 22 March 2023. © 2023 The Author(s). Published by EWA Publishing
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Citation Renjie Ding. Research on Image Generation Methods Based on Adversarial Neural Networks. ACE (2023) Vol. 2: 73-83. DOI: 10.54254/2755-2721/2/20220590.

Abstract

Image generation has become a popular research topic in recent years owing to its wide range of application scenarios and its great potential across many industries. In particular, since the emergence of generative adversarial networks, both the training process and the quality of the generated results have improved considerably compared with earlier generative models. This paper focuses on the advantages of directly using Generative Adversarial Nets (GAN) to generate images, as well as on its main problems: training instability, mode collapse, and weak modeling of global dependencies. It then introduces the strategies and techniques that subsequent improved GANs adopt to address these problems. Through experiments, we compare the improved networks with the original GAN and attempt to combine the core strategies of these networks; in our experiments, the combined network generates images of higher quality.
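To make the strategies mentioned above more concrete, the following minimal sketch (written in PyTorch, which is an assumption; the framework used in the paper is not stated here) illustrates the two core ideas that the combined network draws on: a SAGAN-style self-attention block for modeling global dependencies and a WGAN-style critic loss with gradient penalty for stabilizing training. All class names, layer sizes, and hyperparameters below are illustrative choices, not the paper's implementation.

import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """Simplified SAGAN-style self-attention over convolutional feature maps."""

    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned weight of the attention branch

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, h*w, c//8)
        k = self.key(x).flatten(2)                      # (b, c//8, h*w)
        v = self.value(x).flatten(2)                    # (b, c, h*w)
        attn = torch.softmax(torch.bmm(q, k), dim=-1)   # (b, h*w, h*w) attention map
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                     # residual connection


def gradient_penalty(critic, real, fake):
    """WGAN-GP term: push the critic's gradient norm toward 1 on interpolated samples."""
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    mixed = (eps * real + (1 - eps) * fake).detach().requires_grad_(True)
    scores = critic(mixed)
    grads = torch.autograd.grad(scores.sum(), mixed, create_graph=True)[0]
    return ((grads.flatten(1).norm(2, dim=1) - 1) ** 2).mean()


def critic_loss(critic, real, fake, lambda_gp=10.0):
    """Wasserstein critic loss with gradient penalty (fake should be detached)."""
    return critic(fake).mean() - critic(real).mean() + lambda_gp * gradient_penalty(critic, real, fake)


def generator_loss(critic, fake):
    """Wasserstein generator loss: maximize the critic's score on generated images."""
    return -critic(fake).mean()

In a typical training step, the critic would be updated with critic_loss(critic, real_images, generator(z).detach()) for several iterations before one generator update with generator_loss(critic, generator(z)); the exact schedule and the penalty weight (lambda_gp = 10 here) follow common WGAN-GP practice rather than details given in this abstract.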

Keywords

WGAN, adversarial neural networks, image generation, SAGAN


Data Availability

The datasets used and/or analyzed during the current study are available from the authors upon reasonable request.

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. Authors who publish in this series agree to the following terms:

1. Authors retain copyright and grant the series the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.

2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.

3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open Access Instruction).

Volume Title: Proceedings of the 4th International Conference on Computing and Data Science (CONF-CDS 2022)
ISBN (Print): 978-1-915371-19-5
ISBN (Online): 978-1-915371-20-1
Published Date: 22 March 2023
Series: Applied and Computational Engineering
ISSN (Print): 2755-2721
ISSN (Online): 2755-273X
DOI: 10.54254/2755-2721/2/20220590
Copyright: 22 March 2023
Open Access: This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
