Unobtrusive hand gesture recognition using ultra-wide band radar and deep learning

Djazila Souhila Korti, Zohra Slimane

Abstract


Hand function after a stroke is not regained quickly and requires physical rehabilitation for at least six months. Because of the heavy burden on the healthcare system, supervised rehabilitation is prescribed only for a limited time, after which home-based rehabilitation is offered. It is therefore essential to develop robust solutions that facilitate monitoring while preserving patients' privacy in a home-based setting. To meet these expectations, an unobtrusive solution based on radar sensing and deep learning is proposed. The multi-input multi-output convolutional eXtra trees (MIMO-CxT) is a new deep hybrid model for hand gesture recognition (HGR) with impulse-radio ultra-wide band (IR-UWB) radars. It consists of a lightweight architecture built on a multi-input convolutional neural network (CNN) used in a hybrid configuration with extremely randomized trees (ETs). The model takes data from multiple sensors as input and processes each stream separately. The outputs of the CNN branches are concatenated before the final prediction is made by the ETs. Moreover, the model uses depthwise separable convolution layers, which reduce computational cost and training time while maintaining high performance. Evaluated on a publicly available dataset of gestures collected by three IR-UWB radars, the model achieved an average accuracy of 98.86%.
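To make the described architecture concrete, below is a minimal sketch of a MIMO-CxT-style pipeline, assuming TensorFlow/Keras and scikit-learn. The branch depths, filter counts, input shape, number of gesture classes, and tree count are illustrative placeholders, not the configuration reported in the paper, and the dummy data stands in for the actual radar frames.

```python
import numpy as np
from tensorflow.keras import layers, Model
from sklearn.ensemble import ExtraTreesClassifier

def make_branch(input_shape, name):
    """One CNN branch per radar. Depthwise separable convolutions
    keep the parameter count and compute cost low."""
    inp = layers.Input(shape=input_shape, name=name)
    x = layers.SeparableConv2D(16, 3, padding="same", activation="relu")(inp)
    x = layers.MaxPooling2D(2)(x)
    x = layers.SeparableConv2D(32, 3, padding="same", activation="relu")(x)
    x = layers.GlobalAveragePooling2D()(x)
    return inp, x

# Three branches, one per IR-UWB radar; the input shape is a placeholder.
shape = (64, 64, 1)
inputs, feats = zip(*[make_branch(shape, f"radar_{i}") for i in range(3)])
merged = layers.Concatenate()(list(feats))
extractor = Model(inputs=list(inputs), outputs=merged)

# Dummy data standing in for radar frames; 4 hypothetical gesture classes.
X = [np.random.rand(8, *shape).astype("float32") for _ in range(3)]
y = np.random.randint(0, 4, size=8)

# In the actual model the CNN branches would be trained first (e.g., with a
# temporary softmax head) before serving as a feature extractor; that step
# is skipped here for brevity. Extremely randomized trees then make the
# final prediction on the concatenated branch features.
features = extractor.predict(X, verbose=0)
ets = ExtraTreesClassifier(n_estimators=200, random_state=0)
ets.fit(features, y)
print(ets.predict(features))
```

The design choice sketched here mirrors the abstract: per-sensor CNN branches learn sensor-specific representations, concatenation fuses them, and the tree ensemble replaces a dense classification head.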

Keywords


Extra trees; hand gesture recognition; impulse-radio ultra-wide band; lightweight architecture; multi-input convolutional neural network; Optuna



DOI: http://doi.org/10.11591/ijece.v13i6.pp6872-6881

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).