Abstract
Behavior monitoring often performs poorly when tested in real-world scenes. To address this, a new method for extracting human motion features is proposed that considers not only skeleton keypoint information but also the environmental attribute information carried by the image. Most existing work fuses skeleton feature extraction with a variety of complex classification algorithms, overlooking that evaluating an algorithm on skeleton features alone is unreasonable. Therefore, an image information reconstruction method based on skeleton features is proposed, which combines a graph convolutional network over skeleton features, an attention mechanism, and image recognition to achieve human action recognition. First, skeleton keypoints are extracted with OpenPose. Then, graph convolution and attention perform a first-stage classification; on top of this result, a keypoint expansion coefficient is introduced to segment the image so that the segmented region can be classified again for a more precise second-stage result. Finally, evaluation on the HMDB51 dataset shows that the accuracy of the proposed method is on average 5.6% higher than that of the compared methods, and it shows a clear advantage in real-world tests. This indicates that the proposed method is not only more accurate but also of greater practical value.
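The abstract gives no implementation details for the second stage, so the following is only a minimal sketch of how a skeleton-guided crop with an expansion coefficient could work, assuming OpenPose keypoints as (x, y, confidence) triples; the function name skeleton_crop and the parameter expansion are hypothetical and not taken from the paper.

```python
import numpy as np

def skeleton_crop(image, keypoints, expansion=1.2):
    """Crop the image region around a detected skeleton.

    image      : H x W x C array (e.g. a video frame).
    keypoints  : N x 3 array of (x, y, confidence) joints, e.g. from OpenPose.
    expansion  : hypothetical "expansion coefficient" that enlarges the
                 skeleton bounding box so surrounding context is kept.
    """
    # Keep only joints that were actually detected (confidence > 0).
    valid = keypoints[keypoints[:, 2] > 0]
    if valid.size == 0:
        return image  # no skeleton found; fall back to the full frame

    x_min, y_min = valid[:, 0].min(), valid[:, 1].min()
    x_max, y_max = valid[:, 0].max(), valid[:, 1].max()

    # Enlarge the box around its center by the expansion coefficient.
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
    half_w = (x_max - x_min) / 2 * expansion
    half_h = (y_max - y_min) / 2 * expansion

    h, w = image.shape[:2]
    x0, x1 = int(max(0, cx - half_w)), int(min(w, cx + half_w))
    y0, y1 = int(max(0, cy - half_h)), int(min(h, cy + half_h))
    return image[y0:y1, x0:x1]
```

Under this reading, the cropped region (skeleton plus surrounding environmental context) would then be passed to an ordinary image classifier to refine the first-stage prediction produced by the graph convolution and attention branch.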
Authors
ZHAO Xiao-hu (赵小虎)
YE Sheng (叶圣)
LI Xiao (李晓)
National and Local Joint Engineering Laboratory of Internet Application Technology on Mine (China University of Mining and Technology), Xuzhou, Jiangsu 221008, China; School of Information and Control Engineering, China University of Mining and Technology, Xuzhou, Jiangsu 221008, China
Source
《计算机科学》(Computer Science)
CSCD
Peking University Core Journal (北大核心)
2022, No. 6, pp. 269-275 (7 pages)
Funding
National Key R&D Program of China (2017YFC0804400).