Data generation using generative adversarial networks to increase data volume

Ulzada Aitimova, Murat Aitimov, Bigul Mukhametzhanova, Zhanat Issakulova, Akmaral Kassymova, Aisulu Ismailova, Kuanysh Kadirkulov, Assel Zhumabayeva


The article presents an in-depth analysis of two leading approaches to generative modeling: generative adversarial networks (GANs) and the pixel-to-pixel (Pix2Pix) image-to-image translation model. Given the growing interest in automation and improved image processing, the authors focus on the key operating principles of each model, analyzing their distinctive characteristics. The article also examines the applications of these approaches in detail, highlighting their impact on current research in computer vision and artificial intelligence. The purpose of the study is to give readers a scientific understanding of the effectiveness and potential of each model and to highlight the opportunities and limitations of its application. The authors cover not only the technical aspects of the models but also survey their impact across industries, including medicine, the arts, and practical image-processing problems. In addition, the authors identify prospects for these technologies in fields such as medicine, design, art, entertainment, and unmanned aerial vehicle systems. The ability of GANs and Pix2Pix to adapt to a variety of tasks and to produce high-quality results opens up broad prospects for industry and research.
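The operating principles mentioned above can be made concrete with a small numerical sketch. This is an illustrative example and not code from the article: it uses a toy affine generator and logistic discriminator on 1-D data (all parameter values here are hand-picked assumptions) to evaluate the standard GAN minimax objective, and then a Pix2Pix-style generator loss that adds a lambda-weighted L1 term pulling outputs toward paired targets.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def discriminator(x, w=1.0, b=-2.0):
    """Toy logistic discriminator: estimated probability that x is real.
    The weights w, b are hand-picked for illustration, not trained."""
    return sigmoid(w * x + b)

def generator(z, a=1.0, c=0.0):
    """Toy affine generator mapping noise z to the sample space."""
    return a * z + c

# Assumed toy data: real samples ~ N(4, 1); generator noise ~ N(0, 1).
real = rng.normal(4.0, 1.0, size=2000)
fake = generator(rng.normal(0.0, 1.0, size=2000))

# GAN value function: V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))].
# The discriminator maximizes V; the generator minimizes it.
v = np.log(discriminator(real)).mean() + np.log(1.0 - discriminator(fake)).mean()

# Pix2Pix-style generator loss: conditional adversarial term plus a
# lambda-weighted L1 term toward the paired ground truth (lambda = 100
# is the weight used in the original Pix2Pix paper).
target = real            # stand-in for the paired ground-truth output
lam = 100.0
g_loss = np.log(1.0 - discriminator(fake)).mean() + lam * np.abs(fake - target).mean()
print(v, g_loss)
```

Because the untrained generator's samples sit far from the real distribution, the hand-picked discriminator separates them easily, and the L1 term dominates the Pix2Pix-style loss; training would push both terms down by pulling generated samples toward the paired targets.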


Deep learning; Discriminator; Generative adversarial networks; Image processing; Pixel-to-pixel


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).