Alert NET: Deep convolutional-recurrent neural network model for driving alertness detection

P. C. Nissimagoudar, A. V. Nandi, Aakanksha Patil, Gireesha H. M.

Abstract


Drowsy driving is a major cause of road accidents. Electroencephalography (EEG) is one of the most reliable sources for detecting sleep onset while driving, as it directly measures biological signals. The present work detects driver alertness using a deep neural network architecture built from ResNets and an encoder-decoder sequence-to-sequence model with an attention decoder. The skip connections in the ResNets allow the network to be trained deeper with a reduced loss function and training error. The model is designed to reduce the complex computations required for feature extraction: the ResNets retain features from previous layers and do not require separate filters for frequency- and time-invariant features. The ResNet output features are fed to an encoder-decoder sequence-to-sequence model built with bidirectional long short-term memory (Bi-LSTM) networks. The sequence-to-sequence model learns the complex features of the signal and analyzes past and future states simultaneously to classify drowsy/sleep stage-1 versus alert stages. Also, to overcome the unequal class distribution (class imbalance) present in the datasets, the proposed loss functions achieve comparable error for both majority and minority classes during training of the network for each sleep stage. The model provides an overall accuracy of 87.92% and 87.05%, a macro-F1 score of 78.06% and 79.66%, and a Cohen's kappa score of 0.78 and 0.79 for the Sleep-EDF 2013 and 2018 datasets, respectively.
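The class-imbalance handling described above can be illustrated with a simple sketch. The paper's exact loss functions are not given here; the snippet below uses inverse-frequency class weighting of the cross-entropy, one common way to make majority and minority classes contribute comparable error during training. All names and the toy data are illustrative, not the authors' implementation.

```python
import numpy as np

def class_weights(labels, num_classes):
    """Inverse-frequency weights so minority classes contribute
    to the loss comparably to majority classes (one common scheme)."""
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts.sum() / (num_classes * np.maximum(counts, 1.0))

def weighted_cross_entropy(probs, labels, weights):
    """Class-weighted cross-entropy over softmax outputs."""
    eps = 1e-12
    picked = probs[np.arange(len(labels)), labels]  # prob of true class
    return float(np.mean(weights[labels] * -np.log(picked + eps)))

# Toy example: heavy imbalance (9 "alert" epochs vs. 1 "drowsy/stage-1")
labels = np.array([0] * 9 + [1])
probs = np.full((10, 2), 0.5)   # uniform predictions for both classes
w = class_weights(labels, 2)    # minority class gets a larger weight
loss = weighted_cross_entropy(probs, labels, w)
```

With inverse-frequency weights, the total weighted contribution of each class is equal (here, 9 x w[0] == 1 x w[1]), which mirrors the goal stated in the abstract of comparable error for majority and minority classes.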

Keywords


attention network; bidirectional LSTM; class imbalance; electroencephalogram (EEG); encoder-decoder model; loss functions; ResNets; sequence models



DOI: http://doi.org/10.11591/ijece.v11i4.pp%25p


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

ISSN 2088-8708, e-ISSN 2722-2578