
Real-time Visual Odometry Estimation Based on Principal Direction Detection on Ceiling Vision (Cited by: 2)

Abstract: In this paper, we present a novel algorithm for odometry estimation based on ceiling vision. The main contribution of this algorithm is the introduction of principal direction detection, which can greatly reduce the error accumulation problem in most visual odometry estimation approaches. The principal direction is defined based on the fact that the ceiling is filled with artificial vertical and horizontal lines, which can be used as a reference for the robot's current heading direction. The proposed approach operates in real time and performs well even under camera disturbance. A moving low-cost RGB-D camera (Kinect), mounted on a robot, is used to continuously acquire point clouds. Iterative closest point (ICP) is the common way to estimate the current camera position by registering the currently captured point cloud to the previous one. However, its performance suffers from the data association problem, or it requires pre-alignment information. The performance of the proposed principal direction detection approach does not rely on data association knowledge. Using this method, two point clouds are properly pre-aligned, so ICP can be used to fine-tune the transformation parameters and minimize the registration error. Experimental results demonstrate the performance and stability of the proposed system under disturbance in real time. Several indoor tests show that the proposed visual odometry estimation method can significantly improve the accuracy of simultaneous localization and mapping (SLAM).
Source: International Journal of Automation and Computing (EI, CSCD), 2013, Issue 5, pp. 397-404 (8 pages)
Keywords: visual odometry, ego-motion, principal direction, ceiling vision, simultaneous localization and mapping (SLAM)
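The abstract describes the pipeline only at a high level, so the following is a minimal sketch of the idea, not the authors' implementation. It assumes a grayscale ceiling image whose dominant edges come from a roughly orthogonal grid of ceiling lines, estimates the principal direction from a magnitude-weighted gradient-orientation histogram folded modulo 90°, converts the frame-to-frame change into a pre-rotation of the current point cloud about the optical axis, and leaves the fine registration to a standard ICP routine (not shown). All function names and parameters (principal_direction, heading_change, prealign, num_bins) are illustrative.

```python
# Sketch of principal-direction pre-alignment for ceiling vision (illustrative only).
import numpy as np


def principal_direction(gray, num_bins=90):
    """Estimate the dominant ceiling-line orientation, in degrees within [0, 90)."""
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)
    # Gradient direction is perpendicular to the edge, but the two coincide modulo 90 deg,
    # so folding into [0, 90) merges the vertical and horizontal line families.
    angle = np.degrees(np.arctan2(gy, gx)) % 90.0
    # Magnitude-weighted orientation histogram; its peak is the principal direction.
    hist, edges = np.histogram(angle, bins=num_bins, range=(0.0, 90.0), weights=magnitude)
    peak = int(np.argmax(hist))
    return 0.5 * (edges[peak] + edges[peak + 1])


def heading_change(theta_prev, theta_curr):
    """Smallest rotation (degrees) mapping the previous principal direction to the
    current one, resolved within the +/-45 deg ambiguity of a 90-deg-periodic grid."""
    delta = (theta_curr - theta_prev) % 90.0
    if delta > 45.0:
        delta -= 90.0
    return delta


def prealign(points, delta_deg):
    """Rotate the current point cloud (N x 3, camera looking at the ceiling) about the
    optical axis by the estimated heading change, so ICP only fine-tunes the result."""
    t = np.radians(delta_deg)
    c, s = np.cos(t), np.sin(t)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T
```

In this sketch the 90° ambiguity of the grid is resolved simply by taking the smallest rotation between consecutive frames, which assumes the heading changes by less than 45° between frames; the translation component of the motion is left entirely to the subsequent ICP refinement.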
