A systematic review on sequence-to-sequence learning with neural network and its models

Hana Yousuf, Michael Lahzi, Said A. Salloum, Khaled Shaalan

Abstract


We present a systematic literature survey on sequence-to-sequence learning with neural networks and its models. The primary aim of this review is to deepen understanding of Sequence-to-Sequence Neural Networks and to identify the best approaches to implementing them. Three models are most commonly used in Sequence-to-Sequence Neural Network applications, namely: Recurrent Neural Networks (RNN), Connectionist Temporal Classification (CTC), and the Attention model. Our survey method was to derive keywords from the research questions and use them to search for peer-reviewed papers, articles, and books in academic databases. The initial searches returned 790 papers and scholarly works; after applying the selection criteria and the PRISMA methodology, the number of papers reviewed was reduced to 16. Each of the 16 articles was categorized by its contribution to each research question and then analyzed. Finally, the papers underwent a quality assessment, with scores ranging from 83.3% to 100%. The proposed systematic review enabled us to collect, evaluate, analyze, and explore different approaches to implementing Sequence-to-Sequence Neural Network models, and it identified their most common uses in machine learning. The methodology we followed shows the potential of applying these models to real-world applications.

Keywords


attention models; connectionist temporal classifications; recurrent neural networks; sequence-to-sequence models; systematic review



DOI: http://doi.org/10.11591/ijece.v11i3.pp%25p


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

ISSN 2088-8708, e-ISSN 2722-2578