Information Retrieval from Emotions and Eye Blinks with help of Sensor Nodes

Puneet Singh Lamba, Deepali Virmani

Abstract


In everyday life, there are situations where the only way to communicate is through emotions; EMOTICONS are the epitome of this. The same aspect of communication can be exploited in emergency situations (terrorist attacks, hijacks), where the only way to communicate is by performing some extraordinary action or expressing some emotion. Incorporating technology into such circumstances, this paper proposes a novel framework for detecting an emergency situation by retrieving information from emotions and eye blinks using sensor nodes. The proposed framework can be deployed in places that are more susceptible to attacks (hotels, banks, airports, etc.). The framework takes real-time parameters as input: eye blinks, emotions, and heart rate. Based on behavioral, biological, and physical changes, the proposed framework extracts detailed information. The framework is further validated through the implementation of a facial emotion recognition system that successfully recognizes various human emotions. This facial emotion recognition system is compared with an existing SVM technique in terms of accuracy and training and testing error. Accuracy with the proposed system increases to 78.40%, compared with 75.37% for the existing SVM, and the training error decreases to 0.004103, versus 0.008935 for the existing SVM method.
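As a point of reference for the SVM baseline the abstract compares against, the sketch below trains a scikit-learn SVM classifier on synthetic facial-feature vectors and reports accuracy and training error. This is an illustrative assumption, not the authors' implementation: the feature dimensions, class count, and data are placeholders standing in for real facial-landmark features and emotion labels.

```python
# Hedged sketch of an SVM emotion-classification baseline (NOT the paper's
# code): synthetic feature vectors stand in for facial-landmark features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: 10 features per face (e.g. eyebrow/eye/mouth landmark
# distances) and 3 emotion classes, 50 samples each.
n_per_class, n_features, n_classes = 50, 10, 3
X = np.vstack([rng.normal(loc=c, scale=0.8, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# RBF-kernel SVM, the kind of baseline the abstract refers to generically.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)           # fraction correct on held-out set
training_error = 1.0 - clf.score(X_train, y_train)
```

Reporting test accuracy alongside training error, as done here, mirrors the comparison metrics (accuracy, training error, testing error) used in the abstract.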

Keywords


emoticons; emotions; eye blink; information retrieval; sensor networks



DOI: http://doi.org/10.11591/ijece.v8i4.pp2433-2441

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).