Ambient intelligence-based multimodal human action recognition for autonomous systems

Bibliographic details
Published in: ISA Transactions, Vol. 132, pp. 94-108
Main authors: Jain, Vidhi; Gupta, Gaurang; Gupta, Megha; Sharma, Deepak Kumar; Ghosh, Uttam
Format: Journal Article
Language: English
Published: United States: Elsevier Ltd, 01.01.2023
ISSN: 0019-0578, 1879-2022
Online access: Full text
Description
Abstract: Human activity recognition infers the behaviour of one or more people from a set of sensor measurements. Despite its widespread applications in activity monitoring, robotics, and visual surveillance, accurate and efficient human action recognition remains a challenging research area. As human beings move towards the establishment of a smarter planet, human action recognition using ambient intelligence has become an area of great potential. This work presents a method that combines Bi-Convolutional Recurrent Neural Network (Bi-CRNN) feature extraction with Random Forest classification to achieve state-of-the-art human action recognition for autonomous robots using ambient intelligence. The auto-fusion technique employed improves the fusion and processing of data from multiple sensors. The paper compares the approach with existing Human Action Recognition (HAR) algorithms and proposes a heuristic, constructive hybrid deep-learning-based algorithm with an accuracy of 94.7%.
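The two-stage structure described in the abstract (a learned feature extractor feeding a Random Forest classifier) can be sketched as follows. This is a minimal illustration, not the authors' code: the paper's Bi-CRNN is replaced here by a simple per-channel statistical extractor so the example stays self-contained, and the synthetic sensor data, shapes, and function names are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(windows):
    """Summarise each sensor window of shape (n_windows, n_steps,
    n_channels) into per-channel mean/std/min/max features — a simple
    stand-in for the learned Bi-CRNN representation used in the paper."""
    feats = [windows.mean(axis=1), windows.std(axis=1),
             windows.min(axis=1), windows.max(axis=1)]
    return np.concatenate(feats, axis=1)

rng = np.random.default_rng(0)
# Synthetic accelerometer-like data: 200 windows, 50 timesteps, 3 axes;
# the two hypothetical activity classes differ in signal amplitude.
y = rng.integers(0, 2, size=200)
X_raw = rng.normal(0.0, 1.0 + y[:, None, None], size=(200, 50, 3))

# Stage 1: feature extraction; Stage 2: Random Forest classification.
X = extract_features(X_raw)          # (200, 12) feature matrix
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```

The design point is the decoupling: any extractor producing a fixed-length vector per window (statistical summaries here, a Bi-CRNN in the paper) can feed the same downstream classifier.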
DOI:10.1016/j.isatra.2022.10.034