Human Action Recognition Based on Inertial Sensors and Complexity Classification


Lijue Liu (School of Information Science and Engineering, Central South University, Changsha, China), Xiaoliang Lei (School of Information Science and Engineering, Central South University, Changsha, China), Baifan Chen (School of Information Science and Engineering, Central South University, Changsha, China) and Lei Shu (School of Information Science and Engineering, Central South University, Changsha, China)
Copyright: © 2019 |Pages: 18
DOI: 10.4018/JITR.2019010102

Abstract

In this article, a human action recognition technique based on complexity classification is proposed. Considering features of human actions such as continuity, individuality, variety, and randomness, the recognition demands differ across action types; the problem of action recognition is therefore divided into simple action recognition and complex action recognition, and classification criteria are given for each. Meanwhile, the hardware design of the data acquisition device is introduced, and angle variation is chosen to represent changes in the user's body state. For simple actions, a real-time recognition algorithm based on template matching performs well while keeping computational cost low; for complex actions, a method based on BLSTM-RNN is used to improve recognition accuracy.
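The abstract's template-matching approach for simple actions can be illustrated generically: a live angle sequence from the inertial sensor is compared against stored angle templates and assigned the label of the closest one. The following is a minimal sketch using dynamic time warping as the similarity measure; the function names (`dtw_distance`, `classify`) and the choice of DTW are assumptions for illustration, not the authors' published algorithm.

```python
import numpy as np

def dtw_distance(seq, template):
    """Dynamic time warping distance between two 1-D angle sequences.

    DTW tolerates the speed variation across repetitions of the same
    action that a rigid point-by-point comparison would penalize.
    """
    n, m = len(seq), len(template)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq[i - 1] - template[j - 1])  # local angle difference
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(seq, templates):
    """Return the label of the stored angle template closest to seq."""
    return min(templates, key=lambda label: dtw_distance(seq, templates[label]))

# Hypothetical angle templates (degrees) for two simple arm actions.
templates = {
    "raise_arm": [0.0, 30.0, 60.0, 90.0],
    "lower_arm": [90.0, 60.0, 30.0, 0.0],
}
print(classify([5.0, 28.0, 65.0, 88.0], templates))  # prints "raise_arm"
```

Because the cost matrix is filled in a single pass, the per-template cost is O(nm), which is consistent with the abstract's emphasis on keeping the simple-action path cheap enough for real-time use.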
Article Preview

Z. Zhang et al. (Zhang et al., 2009; Zhang et al., 2011) proposed a hierarchical information fusion algorithm that estimates human gestures. The algorithm imposes geometric constraints on the human skeleton and introduces a particle filter into the information fusion process. Although this method achieves a high recognition rate, its computation is relatively complex, making it difficult to implement on embedded systems with limited resources and low computing power. Furthermore, Hao Yang (Yang et al., 2011) and Luinge (Luinge et al., 2007) obtained the orientation relationship between the sensor coordinate frame and the body coordinate frame from predefined acceleration and angular velocity measurements taken at different placements on the human body. In addition, Huiyu Zhou (Zhou et al., 2008) adopted a skeleton model linked by human joints and obtained the spatial positions of the elbow and wrist joints through coordinate-system transformations. Although this method can locate joint positions, it cannot recover the corresponding flip angle when the arm rotates.
