Application for Iraqi sign language translation on Android system

Received Feb 9, 2020; Revised Apr 13, 2020; Accepted Apr 25, 2020

Deaf people suffer from difficulty in social communication, especially those who were denied the blessing of hearing before acquiring spoken language and before learning to read and write. To employ mobile devices for the benefit of these people, their teachers and everyone in contact with them, this research aims to design an application for social communication and learning by translating Iraqi sign language into Arabic text and vice versa. Iraqi sign language was chosen because of the lack of applications in this field; the current research, to the best of our knowledge, is the first of its kind in Iraq. The application is open source, and words not found in the application database are handled by translating them letter by letter. The importance of the application lies in the fact that it is a means of communication and e-learning through Iraqi sign language and of reading and writing in Arabic. It is likewise a means of social communication between deaf people and those with normal hearing. The application was developed in Java and tested with several deaf students at Al-Amal Institute for Special Needs Care in Mosul, Iraq, where it was well comprehended and accepted.


INTRODUCTION
It is in human nature to seek to help others in whatever ways are possible according to one's abilities, expertise and knowledge. Just as medical experts have an important role in treating patients, and medical engineers in manufacturing tools and instruments to help the sick and injured, such as prostheses for the handicapped and hearing aids for the hearing impaired, the programmer also has a major role in creating and developing applications and technologies that help people with special needs and make their lives easier. In particular, the programmer can create and develop applications and technologies that help deaf people communicate with each other and with people with normal hearing, as well as facilitate their education. Undoubtedly, the deaf constitute a large and influential segment of society, but they suffer from difficulty in communication. Sign language, as a means of communication, has been developed over thousands of years. It is defined as movements using one or both hands and sometimes facial expressions [1]. With the invention of the Internet and the development of technology, the deaf have had a share in benefiting from this technology through the means it provides for education and communication. This has become an important area that has attracted the attention of many scientists and researchers. Sign language recognition systems (SLRS) are one of the areas of human-computer interaction (HCI) that have had a significant impact on the deaf community [2]. Sign language differs from one country to another [3], because it is often the deaf themselves who develop the sign language by identifying signs that have a special meaning for them.
This is affected by the culture and environment in which they live. For example, a deaf person in most Gulf countries points to his chest to indicate the color white, a sign that relates to the white Gulf attire they are used to wearing [4]; people of other countries and cultures do not wear this traditional attire, so the sign carries no special meaning for them.
The aim of this paper is to create a smart phone application that translates Iraqi sign language into its equivalent in classical Arabic and vice versa, as well as sending SMS messages. This application helps the deaf to communicate with each other as well as with people with normal hearing, and it is of great significance in helping the deaf learn to read and write Arabic. Through observation of many deaf people, it has been noticed that their ability to read and write Arabic gradually weakens after graduating from deaf teaching institutes, because they rarely use these skills once they return to relying on sign language. This constitutes a problem. The application is designed to address it by providing the deaf person with a feasible way to learn and use the Arabic language continuously. It is thus not only a means of communication, but also a way to continue learning and using Arabic, aiming to raise their awareness and mastery of this language to a level no less than their mastery of sign language.
The use of this application falls within the areas of e-learning and distance learning, as it facilitates the education of deaf people who cannot join deaf institutes. It does not require the physical presence of a person; it suffices for the deaf person to have the free application on a smart phone and learn while at home. The application stores images of the signs of all words, letters and numbers in Iraqi sign language and links these images to the words that correspond to them in Arabic.

RELATED WORKS
The development of sign language translation applications is a research-worthy topic that has been given much attention by scholars. As the focus of many researchers, this field has produced fruitful research papers. The following is a review of some of this research.
Ludeña et al. introduced a system for translating Spanish sign language. The system translates spoken Spanish sentences into their equivalent in Spanish sign language: a speech recognition system converts spoken sentences into written ones, which are then translated into Spanish sign language and rendered by a virtual human (avatar). The system has been applied in two domains, bus transport information and hotel reception, and it covers most of the vocabulary needed by staff and deaf users in these two domains [5].
In 2012, a chat system was designed by Hanadi et al. for deaf people, the first of its kind in the Arab world. Through this system, deaf people can communicate with each other, share pictures and multimedia, and learn about each other's experiences. The system consists of two applications, a sign language translation tool and a chat system, both of which use the same database, which includes the unified Arabic sign language dictionary for translating Arabic sign language [6]. Harsh et al. used a Kinect device, which contains a set of sensors, to photograph the user and capture many snapshots of the user's movements when the hands are above the trunk area. These snapshots are input to systems that use special algorithms to recognize the images and compare them with Indian sign language images stored in a dedicated database; they are then translated into the corresponding Indian language [7].
Mohammed et al. developed a data acquisition and control (DAC) system that translates sign language into text that can be read by anyone. The system, called the gesture recognition and sign language translator, utilizes a smart glove that captures hand gestures and interprets them into readable text. This text can be sent wirelessly to a smart phone or shown on an embedded LCD display [8].
In 2016, the EuroQol EQ-5D-5L health questionnaire, used to assess an individual's general health, was of interest to Rogers et al., who translated it from English into British sign language (BSL). The descriptive system of this tool consists of five dimensions: mobility, self-care, usual activities, pain/discomfort, and anxiety/depression. Each dimension has five levels: no problems, slight problems, moderate problems, severe problems, and extreme problems. All these levels are written in English, and when any level is clicked, the video clip representing it in BSL is displayed, so that the deaf person can choose the level that represents his health status [9].
Another system, produced by Basma et al., uses the leap motion controller (LMC). This device works at a rate of 200 frames per second; the frames are used to determine the number, position and rotation of hands and fingers. The workflow consists of several steps: pre-processing, tracking, feature extraction, and sign classification. The system can recognize static signs such as letters and numbers as well as dynamic signs that include motion, and it proposes a method for segmenting a series of continuous signs based on palm-speed tracking. Thus, the system not only translates already segmented signs but is also able to translate continuous sentences [10]. Sujay et al. proposed a system to translate American sign language, consisting of two stages. Stage 1, an isolated gloss recognition system, takes video clips as input and recognizes American Sign Language glosses from them using video pre-processing and a time series neural network module. Stage 2, a gloss-to-speech neural translator, works on the outputs of the previous stage, translating the recognized American Sign Language glosses into the corresponding English words [11]. Ebling et al. introduced a system for translating German sign language. A Microsoft Kinect v2 depth sensor and two GoPro Hero 4 Black video cameras were placed in front of the signer, in addition to three cameras installed above, to the right and to the left of the signer, so that signs are captured from different angles. The BosphorusSign Recording Software was developed to synchronize the capture from the various devices [12].
A mobile application was designed to translate Pakistani sign language by Parvez et al. Samples were selected from 192 deaf participants aged 5-10 years from two special institutes for children. The aim of this study is to determine the effectiveness of the mobile interface through an advanced mobile application to learn basic mathematical concepts using Pakistan sign language (PSL). This study bridges the gap between technology-based and traditional teaching methods, which are used to teach mathematical concepts using PSL [13].

BACKGROUND
Sign language translation using a computer or mobile phone is a form of machine translation in which words are translated from a source language into a target language by a sign language translation system. Many scholars and researchers have dealt with translating sign language in different ways, each method having its own scope of application and use, and they have achieved different success rates. Some studies have utilized cameras [14][15][16] or gloves with sensors [17][18][19], along with many algorithms and artificial intelligence techniques, to recognize sign language. No single optimal method can be identified, because what suits one situation may not be appropriate for another. For example, if a deaf person gives a lecture to a group of students, it is preferable to use camera-based methods or electronic gloves equipped with sensors to translate the sign language. However, in the daily life of a deaf person who wants to communicate with family, friends, or anyone encountered on the road or at work, it is better to use a sign language translation application on a mobile device, where data is entered through a keyboard designed for this purpose; such an application is easy to use and cost efficient. It is worth noting that most sign language translation systems are one-way only: some translate sign language into spoken language [20], while others translate spoken language into sign language [21]. This does not allow the full integration of deaf people into society. Many of these systems also use very large video clips or images to represent the sign language [22,23], which requires a dedicated server for storage and therefore an Internet connection; this restricts the use of these systems and reduces their flexibility. It was also noticed, through the study of Arabic and sign language, that a root word, its derivatives and its conjugations all have the same translation in sign language [24,25]. Hence, this research aims at providing a flexible and free application.

METHODOLOGY
To explain how the Iraqi sign language translation system works, we offer the following.

General overview
An overview of the steps involved in implementing the proposed system is as follows.
a. In this research, a free application for smart phones is designed to translate Iraqi sign language into its equivalent in spoken Arabic. The application is built with "Android Studio" as the development environment, producing applications that run on devices managed by the Android operating system. Java was used as the programming language because of the ease with which it allows the application to be transferred across different devices. Because the application requires a very large number of images representing words, letters and numbers in sign language, and thus a very large storage space, an efficient way to store the images in the smallest possible size was needed; the Blender program was used for this purpose [26,27]. This program provides the ability to use a virtual human (avatar) to visualize sign language movements; the images of the avatar are then stored in the application database for use in the translation process.
b. Sign language grammar has been studied so that the sign language images can be represented properly, as sign language has its own way of representing derivations, conjugation of verbs, the singular, dual, and plural, as well as feminine and masculine aspects of grammar, among others [1].

User interface
The user interface is designed to be as easy to use as possible, as shown in Figure 1. It contains the following components:
- A bar to display the Arabic text.
- A keyboard in which each button carries an image and the symbol of the letter or number it represents, so that it can be used easily by hearing and deaf users alike. A special button was added to put a space between words, as well as a button for erasing.
- The button "لغة الاشارة" (sign language), which opens the window for translating Iraqi sign language into standard Arabic.

Database
A database is designed with several fields. The first field, "word", stores words, letters, and numbers. The second field, "img_name", contains the names of the images that correspond to the words; note that some words need more than one image to be translated. The third field, "Categories", contains the names of the categories, and the fourth field, "Additional-img", contains the names of additional images that express the direction of movement. When the user presses the "ترجم / translate" button, the application searches the database for the names of the images corresponding to each word separately, then a function for displaying images is called using the "img_name" field. This process of searching for image names and calling the image display function continues for each word until the end of the sentence is reached.
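The word-to-image lookup described above can be sketched in plain Java. This is a minimal illustration only: the class and method names (SignDictionary, lookupImages) are hypothetical, and a Map stands in for the application's actual SQLite table with its "word" and "img_name" fields.

```java
import java.util.*;

// Minimal sketch of the word-to-image lookup described above.
// A Map stands in for the SQLite "word" -> "img_name" fields;
// all names and file names are illustrative.
public class SignDictionary {
    private final Map<String, List<String>> wordToImages = new HashMap<>();

    public void addWord(String word, List<String> imageNames) {
        wordToImages.put(word, imageNames);
    }

    // Returns the image names for one word, or null if the word is unknown
    // (the caller then falls back to letter-by-letter translation).
    public List<String> lookupImages(String word) {
        return wordToImages.get(word);
    }

    // Translates a whole sentence: for each word, collect its sign images
    // in order, as the display function is called word by word.
    public List<String> translateSentence(String sentence) {
        List<String> frames = new ArrayList<>();
        for (String word : sentence.trim().split("\\s+")) {
            List<String> imgs = lookupImages(word);
            if (imgs != null) frames.addAll(imgs);
        }
        return frames;
    }
}
```

Note that a single word may contribute several images, matching the remark above that some words need more than one image.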

Images display
Images are displayed from the database through a programmed function that takes two parameters: the first is a string holding the name of the image, and the second is a numerical value for the display speed, set to 500 ms. Prior experience has shown this to be an ideal display speed on mobile devices running Android. Through this function, sign language images are displayed as an animation on the application screen, showing the movements performed by the virtual human (avatar).
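The fixed-interval display described above can be sketched as follows. On Android the drawing itself would be driven by the platform's delayed-execution facilities; this plain-Java sketch only computes the playback schedule. All names (FramePlayer, Frame, schedule) are hypothetical; only the 500 ms interval is taken from the text.

```java
import java.util.*;

// Sketch of the frame timing described above: each sign image is shown
// for a fixed interval (500 ms per the text). Names are illustrative.
public class FramePlayer {
    public static final int FRAME_DELAY_MS = 500; // display speed from the text

    // One scheduled frame: which image, and when it appears (ms after start).
    public static class Frame {
        public final String image;
        public final int startMs;
        Frame(String image, int startMs) { this.image = image; this.startMs = startMs; }
    }

    // Assigns each image a start time at a fixed 500 ms spacing,
    // producing the animation effect described in the text.
    public static List<Frame> schedule(List<String> imageNames) {
        List<Frame> plan = new ArrayList<>();
        for (int i = 0; i < imageNames.size(); i++) {
            plan.add(new Frame(imageNames.get(i), i * FRAME_DELAY_MS));
        }
        return plan;
    }
}
```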

HOW THE PROPOSED SYSTEM WORKS
The application works in two modes:
a. First mode: from a hearing person to a deaf person, see Figure 2(a). Text written in classical Arabic is translated into its equivalent in Iraqi sign language. The user writes the text with the keyboard, clicking each button for the letter or number it represents, and the text appears in the dedicated display bar. This is implemented through a special method programmed in Java, which relies on the index of each keyboard button, where each index is linked to the letter or number that the button represents. After finishing the text, the user presses the "ترجم / translate" button. The program searches the database for the names of the images corresponding to each word separately, and the image display method is then called. If the user enters words that are not in the database, such as people's names or words containing spelling errors, the application displays the images of the letters of the text alphabetically.
b. Second mode: from the deaf person to the hearing person, see Figure 2(b). Iraqi sign language is translated into classical Arabic. Pressing the button "لغة الاشارة" (sign language) in the user interface switches to the second mode. To make it easier to find the required images, they are organized into categories (alphabet, language, verbs, adjectives, history, geography, etc.). Once the button is clicked, a list of these categories is displayed; clicking any category displays the images representing the words in that category, and clicking any image writes the word it represents in the Arabic translation bar. The user can also send the result as an SMS message.
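The first mode's word lookup, with its letter-by-letter fallback for unknown words (names, misspellings), can be sketched as follows. The class, maps, and file names are illustrative assumptions, not the application's actual code or database contents.

```java
import java.util.*;

// Sketch of the first mode: each word is looked up in the dictionary;
// unknown words fall back to letter-by-letter (alphabetic) translation.
public class FirstModeTranslator {
    private final Map<String, List<String>> wordImages;  // word -> sign images
    private final Map<Character, String> letterImages;   // letter -> sign image

    public FirstModeTranslator(Map<String, List<String>> wordImages,
                               Map<Character, String> letterImages) {
        this.wordImages = wordImages;
        this.letterImages = letterImages;
    }

    public List<String> translate(String sentence) {
        List<String> frames = new ArrayList<>();
        for (String word : sentence.trim().split("\\s+")) {
            List<String> imgs = wordImages.get(word);
            if (imgs != null) {
                frames.addAll(imgs);                 // known word: use its images
            } else {
                for (char c : word.toCharArray()) {  // fallback: spell by letter
                    String img = letterImages.get(c);
                    if (img != null) frames.add(img);
                }
            }
        }
        return frames;
    }
}
```

The resulting list of image names would then be handed to the display function at the fixed interval described earlier.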
As the application deals with a complete language, it is difficult to cover all of the language's vocabulary in a short period. The application is therefore designed to be open source, so that more words can be added whenever needed and as many words as possible can be stored over time.

Example 1
If the user uses the first mode and writes a sentence, for example "أذهب الى المدرسة لأتعلم" (I go to school to learn), then clicking the "ترجم" button translates the sentence into Iraqi sign language, as shown in Figure 3. Note how the word "أذهب" is translated in Figure 3.

Example 2
If the user uses the first mode and writes a person's name, for example "احمد حسين" (Ahmed Hussein), then, as sign language does not contain images for names, the name is translated alphabetically, as shown in Figure 4.
- In Figure 4(a), the image represents the letter "الف".
- In Figure 4(b), the image represents the letter "حاء".
- In Figure 4(c), the image represents the letter "ميم".
- In Figure 4(d), the image represents the letter "دال".
- In Figure 4(e), the image represents the letter "حاء".
- In Figure 4(f), the image represents the letter "سين".
- In Figure 4(g), the image represents the letter "ياء".
- In Figure 4(h), the image represents the letter "نون".
The speed at which these images are displayed is determined programmatically so that they are clear and comfortable for the eye.

Example 3
If the user uses the second mode by clicking the button "لغة الاشارة" (sign language), a list of category names appears, as shown in Figure 5. Selecting and clicking a category displays the sign language images for that category. For example, if the user wants to compose the phrase "أقرأ كتاب" (I read a book), he clicks the category "الأفعال" (verbs) and selects the image that represents the word "أقرأ" as shown in Figure 6(a). He then moves to the category "المدرسة ومستلزماتها" (school and its supplies) and selects the image for the word "كتاب" as in Figure 6(b); thus the sentence is translated from sign language into standard Arabic.

CONCLUSION
The use of this application facilitates the integration of deaf people into society through one of the easiest and most widely used means in the world: mobile phone applications. This paper puts forward a system for translating Iraqi sign language, given the lack of applications for this dialect. The application is an educational system that can be used to teach deaf people Iraqi sign language as well as how to read and write standard Arabic. It also teaches hearing people Iraqi sign language easily, removing barriers between deaf and hearing people and facilitating communication between them. Since the application works without the Internet, it is easy to use at any time and in any place. The use of the Blender program makes it possible to store the huge number of images that the application needs, by dividing the images and reducing the display resolution, thus reducing the size of the images.
There are two proposals for future work: 1) expand the system to include sound for input and output by hearing people, facilitating further communication; and 2) use a root-word extraction algorithm for Arabic, so that the program works on the outputs of this algorithm. It may then be possible to reduce the size of the database and the number of images needed for translation, since the root word can be stored in the database on behalf of its derivatives and conjugations, which all have the same translation in sign language. This would increase the efficiency of the translation process.
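The root-word idea in the second proposal can be illustrated with a hypothetical sketch: every derivative maps to one stored root, and only the root's sign images are kept. The root table below is purely illustrative; real Arabic root extraction (e.g. a light stemmer) is a substantially harder problem and is not implemented here.

```java
import java.util.*;

// Illustrative sketch of the future-work idea: derivatives and
// conjugations share one stored root, so only the root's sign images
// are kept, shrinking the database. All entries are made up.
public class RootLookup {
    private final Map<String, String> wordToRoot = new HashMap<>();      // derivative -> root
    private final Map<String, List<String>> rootImages = new HashMap<>(); // root -> images

    public void addRoot(String root, List<String> images, String... derivatives) {
        rootImages.put(root, images);
        wordToRoot.put(root, root);
        for (String d : derivatives) wordToRoot.put(d, root);
    }

    // All derivatives resolve to the root's sign translation,
    // or null when the word has no known root.
    public List<String> translate(String word) {
        String root = wordToRoot.get(word);
        return root == null ? null : rootImages.get(root);
    }
}
```

One set of images per root, instead of one per derivative, is exactly the storage saving the proposal describes.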