Deep learning approaches for recognizing facial emotions on autistic patients

Fatima Ezzahrae El Rhatassi, Btihal El Ghali, Najima Daoudi

Abstract


Autistic people need continuous assistance to improve their quality of life, and chatbots are one technology that can provide this today. The chatbot we plan to develop gives autistic people immediate, personalized recommendations by determining the autistic person's state, intervening with them, and building a profile of the individual that will help medical professionals know their patients better and provide individualized care. To gain an understanding of emotions, we attempted to identify the emotion from an image of a person's face. We compared deep learning methods, namely convolutional neural networks (CNNs) and vision transformers (ViTs), on the FER2013 dataset. After optimization, the CNN achieved 74% accuracy, whereas the ViT achieved 69%. Given that no massive dataset of autistic individuals is available, we combined photos of autistic people from two distinct sources and applied the CNN model to identify the relevant emotion, reaching 65% accuracy in recognizing facial emotions. The model still has some identification limitations, such as misinterpreting certain emotions, particularly "neutral," "surprised," and "angry," because these emotions and facial traits are poorly expressed by autistic people and because the model is trained on imbalanced emotion categories.
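The CNN setup the abstract describes can be illustrated with a minimal sketch. FER2013 images are 48×48 grayscale with seven emotion classes (angry, disgust, fear, happy, sad, surprise, neutral); the architecture below is an assumption for illustration, not the authors' exact model.

```python
# Minimal sketch of a CNN classifier for FER2013-style input.
# Assumptions: 48x48 grayscale images, 7 emotion classes; the layer sizes
# are illustrative, not the architecture reported in the paper.
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),  # one logit per emotion class
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = EmotionCNN()
batch = torch.randn(4, 1, 48, 48)  # dummy batch of 4 grayscale faces
logits = model(batch)
print(tuple(logits.shape))
```

In practice such a model would be trained with cross-entropy loss on FER2013 and then evaluated or fine-tuned on the combined autism dataset, as the abstract describes.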

Keywords


Autism; Convolutional neural network; Facial emotion recognition; Chatbot; Vision transformers



DOI: http://doi.org/10.11591/ijece.v14i4.pp4034-4045

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).