Feature extraction comparison for facial expression recognition using adaptive extreme learning machine
Abstract
Facial expression recognition is an important part of the field of affective computing. Automatic analysis of human facial expressions is a challenging problem with many applications. Most existing automated systems for facial expression analysis attempt to recognize a few prototypic emotional expressions such as anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. This paper compares feature extraction methods used to detect human facial expressions. The study compares the gray level co-occurrence matrix (GLCM), local binary pattern (LBP), and facial landmark (FL) methods on two facial expression datasets, namely the Japanese female facial expression (JFFE) dataset and the extended Cohn-Kanade (CK+) dataset. In addition, we propose an enhancement of the extreme learning machine (ELM) method, adaptive ELM (aELM), which adaptively selects the best number of hidden neurons to reach its maximum performance. The results show that our proposed method slightly improves the performance of the basic ELM method with the feature extraction methods mentioned above. Our proposed method obtains a maximum mean accuracy of 88.07% on the CK+ dataset and 83.12% on the JFFE dataset with FL feature extraction.
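The abstract describes aELM only at a high level, so the following is a minimal sketch of how adaptive hidden-neuron selection could work, assuming it amounts to training a standard ELM over a set of candidate hidden-layer sizes and keeping the one with the best validation accuracy. The function names (train_elm, adaptive_elm), the tanh activation, and the candidate range are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_elm(X, y_onehot, n_hidden, rng):
    """Basic ELM: random hidden-layer weights, output weights by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input-to-hidden weights
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ y_onehot          # output weights via pseudo-inverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Return the index of the highest-scoring class for each sample."""
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

def adaptive_elm(X_train, y_train, X_val, y_val,
                 candidates=range(10, 201, 10), seed=0):
    """Sweep candidate hidden-layer sizes; keep the model with the best validation accuracy.
    This selection strategy is an assumption about how aELM chooses its hidden-neuron count."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y_train)
    y_onehot = (y_train[:, None] == classes).astype(float)
    best_acc, best_model = -1.0, None
    for n_hidden in candidates:
        W, b, beta = train_elm(X_train, y_onehot, n_hidden, rng)
        acc = np.mean(classes[predict_elm(X_val, W, b, beta)] == y_val)
        if acc > best_acc:
            best_acc, best_model = acc, (n_hidden, W, b, beta)
    return best_acc, best_model
```

In this sketch the feature vectors passed as X would come from one of the compared extractors (GLCM, LBP, or FL); only the hidden-layer size is adapted, while the random input weights and biases stay fixed as in a standard ELM.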
Keywords
adaptive extreme learning machine; affective computing; extreme learning machine; facial expression recognition; feature extraction
Full Text: PDF
DOI: http://doi.org/10.11591/ijece.v13i1.pp1113-1122
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).