Abstract
To meet the stringent object-tracking requirements of a robot's learning-from-demonstration framework, a new tracking method is proposed that is robust to fast motion, occlusion, and drift. First, the median flow is computed to predict the position shift of the target, and a Gaussian weight is derived from this prediction for each candidate patch. The search region is then adjusted accordingly, and an online multi-instance learning classifier searches for the target within it, yielding an appearance likelihood for each patch. Finally, the motion prior and the appearance likelihood are fused in a Bayesian framework, an exhaustive search over the candidates gives the optimal predicted position, and the online classifier is updated. Experiments on several commonly used test videos show that the proposed method outperforms other state-of-the-art trackers, especially under fast motion and drift, and runs in real time.
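The fusion step described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: `gaussian_weight` stands in for the motion prior derived from the median-flow prediction, `toy_score` stands in for the online multi-instance learning classifier's appearance score, and the fusion is a naive product of prior and likelihood maximized by exhaustive search over candidate positions. All names and parameters here are hypothetical.

```python
import math

def gaussian_weight(pos, predicted, sigma=10.0):
    """Motion prior: a Gaussian centred on the median-flow prediction."""
    d2 = (pos[0] - predicted[0]) ** 2 + (pos[1] - predicted[1]) ** 2
    return math.exp(-d2 / (2.0 * sigma ** 2))

def track_step(candidates, predicted_pos, appearance_score):
    """Fuse the motion prior with the appearance likelihood (Bayes rule
    up to a normalising constant) and pick the best candidate by
    exhaustive search, as in the final step of the pipeline."""
    best, best_post = None, -1.0
    for pos in candidates:
        prior = gaussian_weight(pos, predicted_pos)
        likelihood = appearance_score(pos)  # stand-in for the MIL classifier
        posterior = prior * likelihood
        if posterior > best_post:
            best, best_post = pos, posterior
    return best

# Toy usage: the appearance model peaks at (52, 49), while median flow
# predicts the target at (50, 50); the fused estimate lands in between.
target = (52, 49)

def toy_score(pos):
    d2 = (pos[0] - target[0]) ** 2 + (pos[1] - target[1]) ** 2
    return math.exp(-d2 / 50.0)

candidates = [(x, y) for x in range(40, 61) for y in range(40, 61)]
print(track_step(candidates, (50, 50), toy_score))  # -> (52, 49)
```

In a real tracker the candidate set would be image patches inside the corrected search region and the likelihood would come from the online classifier's response; the exhaustive argmax over the posterior is what the abstract calls the optimal predicted position.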
Source
Journal of Image and Graphics (《中国图象图形学报》)
Indexed in CSCD and the Peking University Core Journals list
2013, No. 1, pp. 93-100 (8 pages)
Funding
National Basic Research Program of China (973 Program), Grant No. 2010CB327900
Keywords
service robots
learning from demonstration
object tracking
online multi-instance learning
median flow