Adaptive Human Action Recognition With an Evolving Bag of Key Poses

Bibliographic Details
Published in: IEEE Transactions on Autonomous Mental Development, Vol. 6, No. 2, pp. 139-152
Main Authors: Chaaraoui, Alexandros Andre; Florez-Revuelta, Francisco
Format: Journal Article
Language: English
Published: IEEE, 01.06.2014
ISSN: 1943-0604, 1943-0612
Description
Summary: Vision-based human action recognition makes it possible to detect and understand meaningful human motion, enabling advanced human-computer interaction, among other applications. In dynamic environments, adaptive methods are required to cope with changing scenario characteristics. Specifically, in human-robot interaction, smooth interaction between humans and robots is only possible if the robots are able to evolve and adapt to the changing nature of the scenarios. In this paper, an adaptive vision-based human action recognition method is proposed. By means of an evolutionary optimization method, adaptive and incremental learning of human actions is supported. Through an evolving bag of key poses, which models the learned actions over time, the current learning memory is developed to recognize increasingly more actions or actors. The evolutionary method selects the optimal subset of training instances, features, and parameter values for each learning phase, and handles the evolution of the model. The experiments show that our proposal successfully adapts to new actions or actors by rearranging the learned model. Stable and accurate results have been obtained on four publicly available RGB and RGB-D datasets, demonstrating the method's robustness and applicability.
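The bag-of-key-poses idea summarized above pairs per-action "key poses" (representative pose descriptors obtained by clustering) with a matching-based classifier. The following is a minimal sketch of that core idea only, assuming plain k-means over generic pose feature vectors and nearest-key-pose voting; the function names and the toy two-dimensional descriptors are illustrative, not taken from the paper:

```python
import numpy as np

def learn_key_poses(sequences_by_action, k, seed=0):
    """Cluster each action's pose descriptors into k 'key poses' (plain k-means)."""
    rng = np.random.default_rng(seed)
    bag = {}
    for action, frames in sequences_by_action.items():
        X = np.vstack(frames).astype(float)
        # initialise centroids from k distinct random frames
        centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
        for _ in range(20):                       # fixed-iteration Lloyd's algorithm
            dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            for j in range(k):
                members = X[labels == j]
                if len(members):                  # leave empty clusters unchanged
                    centroids[j] = members.mean(axis=0)
        bag[action] = centroids
    return bag

def classify(sequence, bag):
    """Each frame votes for the action owning its nearest key pose; majority wins."""
    votes = {action: 0 for action in bag}
    for frame in sequence:
        nearest = min(bag, key=lambda a: np.linalg.norm(bag[a] - frame, axis=1).min())
        votes[nearest] += 1
    return max(votes, key=votes.get)
```

In the paper, an evolutionary layer wraps this learning step, selecting the training instances, features, and parameter values (such as the number of key poses) for each incremental learning phase, so the bag can be rearranged as new actions or actors arrive.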
DOI:10.1109/TAMD.2014.2315676