Malaysian sign language mobile learning application: a recommendation app to communicate with hearing-impaired communities

ABSTRACT


INTRODUCTION
Sign language is a language that uses hand gestures and body movements instead of speech to communicate [1]. It is most often used to communicate with people who are deaf or hearing-impaired, where deafness is the loss of the ability to hear sounds. Sign language is also defined as a language that uses hand shapes and movements to form signs that convey meaning in a communication process, primarily to deaf or hearing-impaired people [2,3].
The World Federation of the Deaf was established to safeguard the welfare and uphold the rights of deaf communities. The organization has member associations in many countries, including Malaysia. When the Malaysian Federation of the Deaf was established in 1998, it set Malaysian Sign Language (MSL) as the standard communication sign language for Malaysia [4]. This is because sign languages, like spoken languages, vary from country to country according to local styles and dialects. Hence, in Malaysia, MSL is used [5].
There are several ways to learn MSL. The most common and direct way is to attend classes at a learning center. MSL can also be learned from related websites on the Internet or from books. In recent years, mobile learning applications for sign language have become a popular alternative to these methods. A mobile learning application is software that delivers learning content to people through their mobile phones. Its advantages are high mobility and flexibility of access, which allow learners to reach study materials and resources anytime and anywhere [6]. Mobile learning also shows a higher success rate than conventional learning, as users can choose when they want to learn rather than being forced to learn when they have no desire to [7]. In a mobile learning application, the learning materials can combine elements such as videos, images, and sounds rather than plain text, making the content delivered to learners more interactive and interesting [8].
A few mobile learning applications for MSL already exist [9][10][11], yet they are still too few and too incomplete for effective sign language learning [12].
Hence, to solve these problems, there is a need to build a new, improved MSL mobile learning app. The app contains a sign detector that captures an image of a sign and interprets it for the user. In addition, the app contains useful sign phrases categorized by situation to simplify navigation. Furthermore, a quiz module lets users test their knowledge of the signs they have learned while using the app.

RESEARCH METHOD
The Malaysian Sign Language Mobile Learning Application is an app prototype that allows everyone to learn MSL. Four features are integrated into the app: sign detection using the phone camera, learning by category, quizzes, and feedback. The evolutionary prototyping methodology was chosen as the development model for the project. Evolutionary prototyping is a system development model that lets users or clients test the system prototype in multiple stages, so that the system requirements can be determined more clearly and the developer can iteratively improve toward the system the clients want. Using this methodology, the developer first implements minimal functionality to test users' opinions and acceptance and to gather new requirements from them. Further improvements and functionality are added in subsequent prototypes until the final prototype is ready to be implemented as the real system [13]. HTML, TypeScript, and the Ionic Framework were used to develop the app, while Firebase serves as its database.

Analysis and design
The data used for the app development was collected from a few sources. First, data collection was carried out by interviewing a deaf person. Because the interviewee could not speak, the interview was conducted through hand gestures and written notes. The results of the interview are the functional requirements and user requirements for the proposed app. Second, the Internet was used as a medium to collect data related to existing MSL resources. However, the resources related to MSL are scarce and limited.
The analysis phase involves requirement analysis to determine the goals and functionality of the proposed app. The requirements gathered from the clients through interviews and questionnaires were analyzed, and the result is four modules implemented in the app: a sign detection module, a learning by category module, a quiz module, and a feedback module, as shown in Figure 1.
The design phase involves designing the wireframe and interface of the proposed application. The wireframe defines the skeletal structure of the application: what should be included, how the elements are arranged, and how the app functions. Designing a wireframe can shorten development time and produce a consistent layout for the application [14]. The wireframe of the proposed app is shown in Figure 2. The structure of the proposed app is designed to be clean and simple to enhance user-friendliness and usability for all kinds of users. The header contains a back button that returns the user to the previous tab, and the tab title is shown in the middle of the header to indicate the user's location in the app. The contents of the tab are displayed in the middle of the screen, while the navigation bar is located at the bottom. This wireframe layout is applied to most pages in the app to give users a consistent layout and enhance usability.

Implementation
In this phase, the modules discussed in the previous section were implemented and made functional according to the system's functional requirements and user requirements. The developed app contains the four features discussed earlier: sign detection, learn by category, quiz, and feedback. The app is implemented using the Ionic Framework and Firebase. Firebase, the database used in the project, is a cloud-hosted NoSQL database, so no SQL commands are needed to query the data. Several APIs were also used to develop the app. The Camera API lets the user's phone capture images in the sign detection module. The Cloud Vision API is a Google Cloud service that detects the contents of an image and assigns labels to it; in this project, it is used in the sign detection module to detect the sign language labels contained in the captured image. The following sections describe the implementation of the app interfaces.
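To illustrate how a label-detection response can be consumed, the following is a minimal sketch in TypeScript. The `description`/`score` fields follow the shape of Cloud Vision's `labelAnnotations`; the function name and the confidence threshold are illustrative assumptions, not the app's actual code.

```typescript
// Shape of one label in a Cloud Vision label-detection response
// (field names follow the public API; values are illustrative).
interface LabelAnnotation {
  description: string; // human-readable label, e.g. "hand"
  score: number;       // confidence in [0, 1]
}

// Pick the most confident label above a minimum confidence threshold,
// or return null when no label is confident enough (assumed threshold).
function pickTopLabel(labels: LabelAnnotation[], minScore = 0.5): string | null {
  const best = labels
    .filter((l) => l.score >= minScore)
    .sort((a, b) => b.score - a.score)[0];
  return best ? best.description : null;
}
```

For example, given labels `{hand: 0.72}` and `{sign language: 0.91}`, the sketch would return "sign language" as the detection result.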

Sign detection
The sign detection module allows the user to capture an image and detect the meaning of the sign it shows. In this module, a camera plugin enables the user's phone to open the camera and take a picture, and the Cloud Vision API compares the captured image against datasets on the Internet to determine and display the most suitable label. Figure 3 shows the module's interface after a sign detection result has been displayed for an image captured by the user. First, the user taps the "take photo" button to take the photo. The photo is then automatically uploaded to Firebase Storage, where a cloud function invokes the Cloud Vision API's label detection to identify the labels the image might contain. The detected labels are analyzed against existing datasets on the Internet to obtain the most accurate result. However, because datasets of Malaysian Sign Language content are scarce on the Internet, the sign detection module can currently detect only a few signs, for example the alphabet letters A, B, and E.
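The final step of this pipeline, mapping detected labels to the small set of supported signs, can be sketched as follows. The label strings in the lookup table are assumptions for illustration; only the supported sign set (A, B, E) comes from the paper.

```typescript
// The few signs the detector currently supports (per the text: A, B, E).
// Keys are lower-cased labels the vision service might return; the exact
// label strings here are illustrative assumptions.
const KNOWN_SIGNS: Record<string, string> = {
  "letter a": "A",
  "letter b": "B",
  "letter e": "E",
};

// Translate a list of detected labels into a sign meaning, or return
// null when none of the labels matches a supported sign yet.
function interpretSign(detectedLabels: string[]): string | null {
  for (const label of detectedLabels) {
    const meaning = KNOWN_SIGNS[label.toLowerCase()];
    if (meaning) return meaning;
  }
  return null;
}
```

Returning `null` for unsupported signs lets the interface report "sign not recognized" honestly instead of guessing, which matches the module's current limited coverage.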

Learning by category
The learning by category module allows users to learn Malaysian Sign Language organized by categories that are useful in daily life. The app currently includes sixteen categories, with more than two hundred signs in total. Figure 4 shows the module's initial interface. When users open the app, the category tab is displayed first. Users can scroll to find their desired category and tap on it to view that category's sign list. Figure 5 shows the sign list interface after the user has chosen a category; in this case, the user chose the "Feelings" category. The sign list page displays all sign names related to the category as a list from which the user selects a sign to learn. Figure 6 shows the sign page interface after the user has selected a sign. The image is read from the Firebase database and displayed to the user. Depending on the sign chosen, the image is displayed in one of two forms: a dynamic image in GIF format or a static image in JPEG format.
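A minimal sketch of the data shapes this module implies is given below. The field names are assumptions rather than the app's actual Firebase schema; the grouping and the GIF-versus-JPEG distinction follow the description above.

```typescript
// Illustrative record shape for one sign stored in the database;
// field names are assumptions, not the app's actual schema.
interface SignEntry {
  name: string;     // e.g. "Happy"
  category: string; // e.g. "Feelings"
  imageUrl: string; // storage URL of the sign image
}

// Group a flat list of signs by category for the category tab.
function groupByCategory(signs: SignEntry[]): Map<string, SignEntry[]> {
  const groups = new Map<string, SignEntry[]>();
  for (const sign of signs) {
    const list = groups.get(sign.category) ?? [];
    list.push(sign);
    groups.set(sign.category, list);
  }
  return groups;
}

// Decide whether the stored image is animated (GIF) or static (JPEG)
// from the file extension, ignoring any URL query string.
function isAnimated(imageUrl: string): boolean {
  return imageUrl.toLowerCase().split("?")[0].endsWith(".gif");
}
```

With such a structure, the category tab renders the map's keys, the sign list renders one group, and `isAnimated` selects how the sign page presents the image.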

Quiz
The quiz module allows users to test their knowledge of the sign language they have learned within the app. When the user selects the quiz tab, a quiz image is generated randomly from the list of signs in the database and displayed to the user. Figure 7 shows the quiz page interface after the user has selected the quiz tab. A random image is chosen from the sign list in the database and displayed for the user to answer. Users can either enter their answer and click the "check" button to validate it, or skip to the next question if they do not know the answer. For each option, the app prompts a dialog box to carry out the next step, such as asking whether to proceed to the next question or revealing the answer. A validation mechanism is also implemented in the answer input area to ensure the user answers in the right format.
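The quiz logic described above can be sketched as three small functions: random selection, format validation, and answer checking. The "letters and spaces only" format rule is an assumption; the paper only states that a validation mechanism exists.

```typescript
// Pick a random sign name from the database list to use as the question.
function pickQuizSign(signNames: string[]): string {
  return signNames[Math.floor(Math.random() * signNames.length)];
}

// Validate the answer format before checking; we assume sign names
// contain only letters and spaces (an illustrative rule).
function isValidAnswerFormat(answer: string): boolean {
  return /^[A-Za-z ]+$/.test(answer.trim());
}

// Case-insensitive comparison of the user's answer with the sign name,
// so "happy" is accepted for the sign "Happy".
function checkAnswer(answer: string, correct: string): boolean {
  return answer.trim().toLowerCase() === correct.trim().toLowerCase();
}
```

Validating the format before checking lets the app distinguish "wrong answer" from "malformed input" and show the appropriate dialog for each case.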

Feedback
The feedback module is an important module that enables communication between the developer and the users. It serves as a bridge for collecting users' opinions, suggestions, and complaints, which can be used to improve the system in the future. Figure 8 shows the feedback page interface when the user selects the feedback tab in the app. Users only need to enter their e-mail address and their opinions about the app and click the "send" button to forward their opinions directly to the developer. A validation process ensures that the e-mail address entered is valid so that the developer can contact the user by e-mail if necessary. The submitted feedback is stored in the Firebase database, where the developer can view incoming feedback and collect essential suggestions from distinct perspectives to build a better, improved version of the app in the future.
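The validation step described here can be sketched as below. The record shape and function names are illustrative assumptions, and the e-mail pattern is a simple sanity check, not a full RFC 5322 validator.

```typescript
// Illustrative feedback record; field names are assumptions.
interface Feedback {
  email: string;
  message: string;
}

// Basic e-mail format check: something@something.tld with no spaces.
// A deliberately simple sketch, not a complete e-mail validator.
function isValidEmail(email: string): boolean {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email.trim());
}

// Build the record to store only when both fields are well-formed;
// return null so the UI can show a validation error otherwise.
function prepareFeedback(email: string, message: string): Feedback | null {
  if (!isValidEmail(email) || message.trim().length === 0) return null;
  return { email: email.trim(), message: message.trim() };
}
```

Rejecting malformed input before it reaches the database keeps every stored record replyable, which is the stated purpose of the validation.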

RESULTS AND DISCUSSION
The app was successfully built by the developer. User testing was carried out to evaluate the app's design and functions. The testing involved 30 users acting as members of the public and was conducted at Universiti Tun Hussein Onn Malaysia in Parit Raja, Johor, Malaysia. Table 1 shows the results of the user acceptance test on the app's design. The navigation feature scored the highest, 99% of the user rating, which means the app is easy for the public to understand and use. It is followed by the content layout and, thirdly, the app interfaces. However, the text style achieved the lowest score, only 69% of the user rating; the reason was probably the inappropriate text size noted by most of the users. Table 2 shows the results of user acceptance of the system functions. Among the four functions, the sign detection module achieved the highest score, 87% of the user rating. However, because datasets of MSL content are scarce on the Internet, the sign detection module can currently detect only a few signs, for example the alphabet letters A, B, and E. It is followed by the quiz module, which achieved a 77% score, and the learn by category module at 69%. Some users suggested that the learn by category module should include sentences instead of only single phrases to increase the effectiveness of sign language learning. The feedback module achieved the lowest score, only 65%, as it is used merely for sending a comment without any other features. These data were used to plan the app's future work.

CONCLUSION
In educating deaf and mute people, there are three major approaches that can be used successfully: the bilingual-bicultural (BiBi) approach, which makes use of ASL; the auditory-verbal approach, which teaches the English language through residual hearing and speech instead of sign language; and total communication, which combines auditory and visual communication for instruction [15]. In this paper, we presented a mobile app prototype for learning MSL. The app includes a function that lets the MSL learner detect a sign's meaning using the phone camera. Besides, to achieve the objective of improving communication between hearing people and the deaf community, the app gathers a large amount of MSL content into the learning categories. With a total of 16 categories and over 200 signs included, it is a considerable help in lowering the communication barrier between deaf communities and hearing people. The data collected from the user acceptance test also helped decide the project's future work and direction, and we believe this app introduces a new function compared to current applications.