A novel sketch-based face recognition in unconstrained video for criminal investigation

Napa Lakshmi, Megha P. Arakeri

Abstract


Face recognition in video surveillance helps identify an individual by comparing the facial features of a given photograph or sketch with the faces appearing in a video, which supports criminal investigations. A face sketch is generally used by the police when a photograph of the suspect is not available. Manually matching a facial sketch against the suspect's face in a long video is a tedious and time-consuming task. To overcome these drawbacks, this paper proposes an accurate face recognition technique that recognizes a person from his sketch in unconstrained surveillance video. The proposed method takes the surveillance video and the suspect's sketch as input. First, the input video is converted into frames and summarized using the proposed quality-indexed three-step cross search algorithm. Next, faces are detected with the proposed modified Viola-Jones algorithm. Then, the necessary features are selected using the proposed salp-cat optimization algorithm. Finally, these features are fused with scale-invariant feature transform (SIFT) features, and the Euclidean distance is computed between the feature vectors of the sketch and of each face in the video; the face with the lowest Euclidean distance to the query sketch is taken as the suspect's face. The method's performance was analyzed on the ChokePoint dataset, where the system achieved 89.02% precision, 91.25% recall, and 90.13% F-measure.
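To make the final matching step concrete, the short Python sketch below illustrates only the SIFT-and-Euclidean-distance comparison described above, not the full pipeline (the paper's fusion with salp-cat-optimized features is omitted). It assumes OpenCV; the mean-pooling of SIFT descriptors into a fixed-length vector, the helper names sift_vector and match_sketch, and the image paths are illustrative assumptions, not the authors' implementation.

import cv2
import numpy as np

def sift_vector(gray_img):
    # Assumed aggregation: mean-pool all SIFT descriptors into one 128-D vector.
    sift = cv2.SIFT_create()
    _, desc = sift.detectAndCompute(gray_img, None)
    if desc is None:  # no keypoints detected in the image
        return np.zeros(128, dtype=np.float32)
    return desc.mean(axis=0)

def match_sketch(sketch_path, face_paths):
    # Return the detected face whose feature vector has the lowest
    # Euclidean distance to the query sketch's feature vector.
    sketch = cv2.imread(sketch_path, cv2.IMREAD_GRAYSCALE)
    query = sift_vector(sketch)
    best_path, best_dist = None, float("inf")
    for path in face_paths:
        face = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        dist = np.linalg.norm(query - sift_vector(face))  # Euclidean distance
        if dist < best_dist:
            best_path, best_dist = path, dist
    return best_path, best_dist

# Hypothetical usage: faces cropped from summarized video frames.
# suspect, d = match_sketch("sketch.png", ["face_001.png", "face_002.png"])

Under this reading, the face returned by match_sketch corresponds to the frame region the paper would flag as the suspect's face.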

Keywords


ChokePoint dataset; face recognition; feature optimization; video summarization; video surveillance

DOI: http://doi.org/10.11591/ijece.v13i2.pp1499-1509

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).