Generative adversarial deep learning in images using Nash equilibrium game theory

Syeda Imrana Fatima, Yugandhar Garapati

Abstract


A generative adversarial learning (GAL) algorithm is presented to overcome manipulations of adversarial data and to produce a secure convolutional neural network (CNN). The main objective of the generative algorithm is to make small changes to the initial data with positive and negative class labels at test time, so that the CNN misclassifies it. An adversarial algorithm is used to manipulate the input data that defines the boundaries of the learner's decision-making process. The algorithm generates adversarial modifications to the test dataset using a multiplayer stochastic game approach, without learning how to manipulate the data during training. The manipulated data is then passed through a CNN for evaluation. The multiplayer game consists of an interaction in which the adversaries generate manipulations and the learner retrains the model. Nash equilibrium game theory (NEGT) is applied to the Canadian Institute For Advanced Research (CIFAR) dataset to produce a secure CNN output that is more robust to adversarial data manipulations. The experimental results show that the proposed NEGT-GAL achieves a greater mean value of 7.92 and takes less wall-clock time, 25,243 s. Therefore, the proposed NEGT-GAL outperforms the compared existing methods and achieves greater performance.
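The adversary-versus-learner interaction described above can be illustrated with a minimal sketch. This is not the authors' NEGT-GAL implementation: it substitutes a toy linear classifier for the CNN, synthetic two-dimensional data for CIFAR, and a gradient-sign (FGSM-style) perturbation for the stochastic-game adversary; the alternating best responses of the two players approximate an equilibrium of the game.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (a stand-in for CIFAR, for illustration only).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)  # linear learner weights (a stand-in for the CNN)

def predict(w, X):
    """Logistic prediction P(y=1 | x)."""
    return 1.0 / (1.0 + np.exp(-X @ w))

def grad_w(w, X, y):
    """Gradient of the logistic loss with respect to the weights."""
    return X.T @ (predict(w, X) - y) / len(y)

eps, lr = 0.3, 0.5  # adversary's perturbation budget, learner's step size
for _ in range(100):
    # Adversary's move: perturb inputs along the sign of the loss gradient
    # with respect to the inputs (FGSM-style manipulation).
    grad_x = (predict(w, X) - y)[:, None] * w[None, :]
    X_adv = X + eps * np.sign(grad_x)
    # Learner's move: retrain the model on the manipulated data.
    w -= lr * grad_w(w, X_adv, y)

# Accuracy of the retrained learner on the clean data.
acc = ((predict(w, X) > 0.5) == y).mean()
```

Alternating the two moves until neither player can improve unilaterally is what the Nash-equilibrium formulation captures; here the learner ends up robust to perturbations within the budget `eps`.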

Keywords


Canadian Institute For Advanced Research; convolutional neural network; deep learning; generative adversarial learning algorithm; Nash equilibrium game theory


DOI: http://doi.org/10.11591/ijece.v13i6.pp6351-6360

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).