
Multi-sensor Data Fusion for Optical Tracking of Head Pose (光学头部姿态跟踪的多传感器数据融合研究)

Cited by: 8
Abstract: Accurate head pose tracking is a key issue in achieving precise registration in indoor augmented reality systems. This paper proposes a novel approach based on multi-sensor data fusion to achieve high-accuracy optical tracking of head pose. The approach employs two extended Kalman filters and one fusion filter to fuse the pose data from two complementary optical trackers: an inside-out tracker (IOT) with a single camera and an outside-in tracker (OIT) with two cameras. The aim is to reduce the pose errors of the optical tracking sensors. A representative experimental setup was designed to verify the approach. The experimental results show that, in the static state, the pose errors of the IOT and OIT are consistent with the theoretical results obtained by propagating the error covariance matrices from the respective image noises to the final pose errors, and that, in the dynamic state, the proposed multi-sensor data fusion approach with the combined optical tracker achieves more accurate and more stable position and orientation outputs than a single IOT or OIT used alone.
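The fusion step described in the abstract can be illustrated with a minimal sketch. The snippet below shows only the covariance-weighted (information-form) fusion of two independent pose estimates, which is the core idea behind combining the IOT and OIT outputs; the paper's full pipeline additionally runs an extended Kalman filter per tracker, which is omitted here. All variable names and the example numbers are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fuse_poses(x_iot, P_iot, x_oit, P_oit):
    """Covariance-weighted fusion of two independent pose estimates.

    x_*: pose state vectors (here 3-D position), P_*: their error
    covariance matrices. This is the information-filter form of the
    fusion step: each estimate is weighted by its inverse covariance.
    """
    info_iot = np.linalg.inv(P_iot)
    info_oit = np.linalg.inv(P_oit)
    P_fused = np.linalg.inv(info_iot + info_oit)
    x_fused = P_fused @ (info_iot @ x_iot + info_oit @ x_oit)
    return x_fused, P_fused

# Hypothetical example: the IOT is noisier along z, the OIT along x;
# the fused estimate trusts each tracker where its covariance is small.
x_iot = np.array([0.10, 0.00, 0.05])
P_iot = np.diag([0.01, 0.01, 0.09])
x_oit = np.array([0.12, 0.01, 0.02])
P_oit = np.diag([0.09, 0.01, 0.01])
x_f, P_f = fuse_poses(x_iot, P_iot, x_oit, P_oit)
```

Note that the fused covariance is never larger than either input covariance, which matches the paper's observation that the combined tracker is more accurate and more stable than either tracker alone.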
Source: Acta Automatica Sinica (《自动化学报》, EI, CSCD, Peking University Core), 2010, No. 9, pp. 1239-1249 (11 pages).
Funding: Supported by the National High Technology Research and Development Program of China (863 Program) (2006AA02Z4E5, 2008AA01Z303, 2009AA012106), the National Natural Science Foundation of China (60827003), and the Program for Changjiang Scholars and Innovative Research Teams in Universities (IRT0606).
Keywords: multi-sensor data fusion, error covariance matrix, head pose tracking, augmented reality

Co-citing literature: 3

Co-cited literature: 100

Citing articles: 8

Secondary citing articles: 175
