
Fast and robust camera tracking method (快速鲁棒的摄像机运动跟踪方法)

Cited by: 1
Abstract: To address the common problems of low efficiency and easy loss of the target in real-time camera motion tracking, a fast and robust tracking method is proposed, with two contributions. First, FAST features combined with pyramid optical flow tracking are adopted to optimize the extraction and matching of target feature points, reducing the computational cost. Second, a mechanism for detecting and updating the camera tracking state is established: the number of effective tracking points is monitored in real time, and the point set is refreshed as soon as it drops too low, maintaining the stability of camera motion tracking. Experimental results show that the method reduces the time spent on feature-point extraction and matching in marker-less augmented reality and improves the efficiency of camera motion tracking, achieving good real-time performance; it also remains robust when the number of feature points falls due to occlusion or environmental changes.
Source: Computer Engineering and Design (《计算机工程与设计》, Peking University core journal), 2017, No. 10, pp. 2735-2739 (5 pages).
Funding: Zhejiang Provincial Natural Science Foundation (LQ13F020016).
Keywords: natural feature; marker-less augmented reality; FAST (features from accelerated segment test); pyramid optical flow; motion tracking
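The pipeline the abstract describes — detect corner features, track them frame-to-frame with optical flow, and re-detect once too few valid tracks survive — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: a single-level Lucas-Kanade step in NumPy stands in for the full FAST-plus-pyramid tracker, and the names `lk_track_point`, `track_with_update`, and the `redetect` callback are hypothetical.

```python
import numpy as np

def lk_track_point(prev, curr, pt, win=7):
    """Estimate the displacement of one point between two grayscale
    frames with a single Lucas-Kanade step (one pyramid level)."""
    r = win // 2
    x, y = int(round(pt[0])), int(round(pt[1]))
    h, w = prev.shape
    # Points too close to the border cannot fill the window: drop them.
    if x - r < 1 or y - r < 1 or x + r > w - 2 or y + r > h - 2:
        return None
    patch_prev = prev[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    patch_curr = curr[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    # Spatial gradients of the previous frame (central differences)
    # and the temporal difference between the two patches.
    Ix = (prev[y - r:y + r + 1, x - r + 1:x + r + 2].astype(float)
          - prev[y - r:y + r + 1, x - r - 1:x + r].astype(float)) / 2.0
    Iy = (prev[y - r + 1:y + r + 2, x - r:x + r + 1].astype(float)
          - prev[y - r - 1:y + r, x - r:x + r + 1].astype(float)) / 2.0
    It = patch_curr - patch_prev
    # Solve the 2x2 normal equations  G d = b  for the displacement d.
    G = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    if abs(np.linalg.det(G)) < 1e-6:  # textureless patch: untrackable
        return None
    d = np.linalg.solve(G, b)
    return (pt[0] + d[0], pt[1] + d[1])

def track_with_update(prev, curr, points, min_points, redetect):
    """Track points frame-to-frame; when too few valid tracks survive,
    refresh the set via the caller-supplied redetect(frame) callback —
    the detection-and-update mechanism described in the abstract."""
    tracked = [q for q in (lk_track_point(prev, curr, p) for p in points)
               if q is not None]
    if len(tracked) < min_points:  # tracking-state check
        tracked = redetect(curr)   # update step: re-extract features
    return tracked
```

In the paper's setting, `redetect` would run FAST corner extraction, and the tracking step would iterate over pyramid levels so that large inter-frame motions remain trackable.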
  • Related literature

References (1)

Secondary references (9)

  • 1 ROYER E, LHUILLIER M, DHOME M, et al. Monocular vision for mobile robot localization and autonomous navigation [J]. International Journal of Computer Vision, 2007, 74(3): 237-260.
  • 2 SONG Xiaojing, SONG Zibin, SENEVIRATNE L D, et al. Optical flow-based slip and velocity estimation technique for unmanned skid-steered vehicles [C]// Intelligent Robots and Systems (IROS), 2008 IEEE/RSJ International Conference on. Nice: IEEE, 2008: 101-106.
  • 3 NISTER D, NARODITSKY O, BERGEN J. Visual odometry [C]// Computer Vision and Pattern Recognition (CVPR), 2004 IEEE Computer Society Conference on. Washington, DC: IEEE, 2004, 1: 652-659.
  • 4 HOWARD A. Real-time stereo visual odometry for autonomous ground vehicles [C]// Intelligent Robots and Systems (IROS), 2008 IEEE/RSJ International Conference on. Nice: IEEE, 2008: 3946-3952.
  • 5 KITT B, REHDER J, CHAMBERS A, et al. Monocular visual odometry using a planar road model to solve scale ambiguity [C]// Proceedings of the European Conference on Mobile Robots. Orebro: [s.n.], 2011: 43-48.
  • 6 SUN D, ROTH S, BLACK M J. Secrets of optical flow estimation and their principles [C]// Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on. San Francisco: IEEE, 2010: 2432-2439.
  • 7 LUCAS B D, KANADE T. An iterative image registration technique with an application to stereo vision [C]// International Joint Conference on Artificial Intelligence (IJCAI). Vancouver: AAAI, 1981: 674-679.
  • 8 BOUGUET J. Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm [J]. Intel Corporation, 2001, 5.
  • 9 BAY H, ESS A, TUYTELAARS T, et al. SURF: speeded up robust features [J]. Computer Vision and Image Understanding (CVIU), 2008, 110(3): 346-359.

Co-cited references: 15

Co-citing references: 3

Citing articles: 1

Secondary citing articles: 10
