Journal article

A Semantic Information Aided Visual/Inertial Fusion Localization Method of UAV (cited by: 1)
Abstract  Vision sensors play an important role in the indoor positioning of unmanned aerial vehicles (UAVs). The traditional feature-point-based visual odometry algorithm describes and matches features through low-level brightness relationships; its anti-interference ability is insufficient, and matching errors or even failures may occur, so the accuracy and robustness of the navigation system need to be improved. Since indoor environments contain rich semantic information, a semantic information aided visual/inertial fusion localization method for UAVs is proposed. First, the indoor semantic information is modeled as factors and fused with the traditional visual odometry method. Then, based on the inertial pre-integration method, inertial constraints are added to the factor graph optimization to further improve the positioning accuracy and robustness of the UAV in complex dynamic environments. Finally, the positioning accuracy of the algorithm is analyzed through UAV indoor flight experiments. The experimental results show that the proposed method has higher accuracy and robustness than the traditional visual odometry algorithm.
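As a rough illustration of the fusion idea sketched in the abstract (not the paper's actual implementation), the toy 1D example below solves a linear factor graph by least squares. The prior, odometry, and semantic landmark factors, their measurement values, and the "door frame" landmark are all hypothetical; in the paper the states are full UAV poses and the graph additionally carries IMU pre-integration factors.

```python
import numpy as np

def solve_factor_graph(factors):
    """Stack linear factor residuals r = A x - b and solve min ||A x - b||^2.

    Each factor is (jacobian_row, measurement). For these linear factors a
    single least-squares solve is equivalent to one Gauss-Newton step.
    """
    A = np.array([f[0] for f in factors], dtype=float)
    b = np.array([f[1] for f in factors], dtype=float)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Unknowns: three 1D UAV positions x0, x1, x2 along a corridor (hypothetical).
factors = [
    ([1, 0, 0], 0.0),   # prior factor: x0 anchored at the origin
    ([-1, 1, 0], 0.9),  # visual odometry: x1 - x0 ~ 0.9 m (drifts short)
    ([0, -1, 1], 0.9),  # visual odometry: x2 - x1 ~ 0.9 m
    ([0, 0, 1], 2.0),   # semantic factor: known door frame at 2.0 m seen from x2
]
x = solve_factor_graph(factors)
# Odometry alone would give [0.0, 0.9, 1.8]; the semantic factor pulls the
# estimate toward the landmark, yielding approximately [0.05, 1.0, 1.95].
```

The semantic observation acts exactly like any other factor: it adds one residual row, and the optimizer redistributes the drift across the whole trajectory rather than correcting only the last pose.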
Authors  LYU Pin (吕品), HE Rong (何容), LAI Ji-zhou (赖际舟), YANG Zi-han (杨子寒), YUAN Cheng (袁诚) (College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China)
Source  Navigation Positioning and Timing (《导航定位与授时》), CSCD, 2022, No. 4, pp. 60-69
Funding  National Natural Science Foundation of China (61973160); Major Fundamental Research Project of Equipment (51405-02B02)
Keywords  visual/inertial localization; semantic information; factor graph optimization; UAV
