Deep convolutional autoencoder for radar-based classification of similar aided and unaided human activities

Bibliographic Details
Published in: IEEE Transactions on Aerospace and Electronic Systems, Vol. 54, No. 4, pp. 1709-1723
Main Authors: Seyfioglu, Mehmet Saygin, Ozbayoglu, Ahmet Murat, Gurbuz, Sevgi Zubeyde
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2018
ISSN: 0018-9251, 1557-9603
Description
Summary: Radar-based activity recognition has been of great interest due to applications such as border control and security, pedestrian identification for automotive safety, and remote health monitoring. This paper demonstrates the efficacy of micro-Doppler analysis in distinguishing even those gaits whose micro-Doppler signatures are not visually distinguishable. Moreover, a three-layer deep convolutional autoencoder (CAE) is proposed that uses unsupervised pretraining to initialize the weights of the subsequent convolutional layers. This architecture is shown to be more effective than other deep learning architectures, such as convolutional neural networks and autoencoders, as well as conventional classifiers employing predefined features, such as support vector machines (SVM), random forest, and extreme gradient boosting. Results show that the proposed deep CAE achieves a correct classification rate of 94.2% on micro-Doppler signatures of 12 different human activities measured indoors with a 4 GHz continuous-wave radar, a 17.3% improvement over SVM.
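The core idea described in the abstract, pretraining autoencoder weights without labels and then transferring them to initialize a supervised network, can be sketched in miniature. The following is an illustrative dense, tied-weight autoencoder in NumPy, not the authors' three-layer convolutional architecture; the data, dimensions, and hyperparameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 200 samples of 16 features. (A real micro-Doppler
# signature is a 2-D spectrogram; these dimensions are purely illustrative.)
X = rng.normal(size=(200, 16))

# --- Stage 1: unsupervised pretraining of one autoencoder layer ---
# Tied-weight autoencoder: encode H = tanh(X W), decode X_hat = H W^T.
W = rng.normal(scale=0.1, size=(16, 8))   # 16 inputs -> 8-unit bottleneck
lr = 0.01

def recon_loss(W):
    """Mean squared reconstruction error of the autoencoder."""
    H = np.tanh(X @ W)
    return 0.5 * np.mean((H @ W.T - X) ** 2)

loss_before = recon_loss(W)
for _ in range(300):
    H = np.tanh(X @ W)                  # encode
    err = (H @ W.T) - X                 # reconstruction error
    dH = (err @ W) * (1.0 - H ** 2)     # backprop through decoder and tanh
    grad = X.T @ dH + err.T @ H         # encoder path + decoder path (tied W)
    W -= lr * grad / len(X)             # gradient descent step
loss_after = recon_loss(W)

# --- Stage 2: transfer ---
# The pretrained encoder weights would initialize the first layer of the
# supervised classifier, which is then fine-tuned on labeled activities.
W_init = W.copy()
```

The point of the two-stage scheme is that Stage 1 needs no activity labels: the network first learns a compressed representation of the signatures, and only the final fine-tuning consumes labeled data.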
DOI:10.1109/TAES.2018.2799758