Real-Time Hand Gesture Recognition Based on the Depth Map for Human Robot Interaction

Minoo Hamissi, Karim Faez


In this paper, we propose and implement a novel real-time method for recognizing hand gestures using depth maps. A depth map encodes the distance of objects from a viewpoint. Microsoft's Kinect sensor is used as the input device to capture both the color image and its corresponding depth map. We first detect the bare hand against a cluttered background by exploiting the distinct gray level of the hand, which lies closest to the sensor. Then, the scale-invariant feature transform (SIFT) algorithm is used to extract feature vectors. Lastly, a vocabulary tree built with K-means clustering partitions the hand postures into ten simple classes, "one" through "ten", according to the number of extended fingers. The vocabulary tree allows a larger and more discriminative vocabulary to be used efficiently, which improves clustering accuracy. Experimental results show the superiority of the proposed method over other available approaches: it recognizes the 'numbers' gestures with over 90% accuracy.
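The pipeline above can be sketched in simplified form. The snippet below is an illustrative toy, not the authors' implementation: it segments the nearest region of a synthetic depth map (a stand-in for Kinect depth-based hand detection) and runs a plain K-means pass (a stand-in for building one level of the vocabulary tree over SIFT descriptors). All function names, the depth margin, and the synthetic data are assumptions for illustration.

```python
import numpy as np

def segment_nearest(depth, margin=100):
    """Keep pixels within `margin` depth units of the closest valid pixel.
    Assumes smaller depth values mean closer to the sensor; 0 = no reading."""
    valid = depth > 0
    nearest = depth[valid].min()
    return valid & (depth <= nearest + margin)

def kmeans(descriptors, k, iters=20):
    """Plain k-means with a simple deterministic spread initialization;
    in the paper's pipeline this step would cluster SIFT descriptors
    at each level of the vocabulary tree."""
    init = np.linspace(0, len(descriptors) - 1, k).astype(int)
    centers = descriptors[init].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = descriptors[labels == j].mean(axis=0)
    return labels, centers

# Toy depth map: "hand" region at depth ~800, background at ~2000.
depth = np.full((120, 160), 2000, dtype=np.uint16)
depth[40:80, 60:100] = 800
mask = segment_nearest(depth)   # True only over the near (hand) region
```

In the real system, the segmented hand region would feed SIFT descriptor extraction, and the vocabulary tree would quantize those descriptors hierarchically before classification.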



Hand gesture recognition; Depth map; @home robot; Vocabulary tree



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).