Abstract
In this study, we explore a human activity recognition (HAR) system that uses computer vision for assisted living systems (ALS). Most existing HAR systems are implemented with wired or wireless sensor networks, which have limitations such as cost, power consumption, weight, and the inability of elderly users to wear and carry them comfortably. A computer-vision-based HAR system can overcome these issues, but such systems typically require a memory-intensive image dataset that takes a long time to train. The proposed computer-vision-based system addresses these shortcomings. The authors use key-joint angles, distances between key joints, and slopes between key joints to create a numerical dataset instead of an image dataset. All of these parameters are recorded through real-time event simulation. The dataset contains 780,000 computed feature values derived from 20,000 images and is used to train models that detect five human postures: sitting, standing, walking, lying, and falling. The implementation encompasses four algorithms: a decision tree (DT), a random forest (RF), a support vector machine (SVM), and an ensemble approach. Remarkably, the ensemble technique exhibited exceptional performance, with 99% accuracy, 98% precision, 97% recall, and an F1 score of 99%.
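As a rough illustration of the feature types named above (joint angles, inter-joint distances, and inter-joint slopes), the following minimal Python sketch shows how one row of such a numerical dataset might be computed from 2D key-joint coordinates. This is not the authors' code; the joint names, coordinates, and helper functions are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the authors' implementation): deriving angle,
# distance, and slope features from 2D key-joint coordinates.
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c, each an (x, y) tuple."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def joint_distance(a, b):
    """Euclidean distance between two key joints."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def joint_slope(a, b):
    """Slope of the line segment joining two key joints."""
    dx = b[0] - a[0]
    return (b[1] - a[1]) / dx if dx != 0 else float("inf")

# Hypothetical hip, knee, and ankle coordinates from a single frame.
hip, knee, ankle = (0.52, 0.48), (0.55, 0.70), (0.54, 0.92)
feature_row = [
    joint_angle(hip, knee, ankle),   # knee angle
    joint_distance(hip, ankle),      # hip-ankle distance
    joint_slope(hip, knee),          # hip-knee slope
]
print(feature_row)  # one row of numerical features instead of raw pixels
```

Rows of features like these, rather than raw images, would then feed the DT, RF, SVM, and ensemble classifiers described in the study.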