Tele-presence System using Homography-based Camera Tracking Method

  • Kim, Tae-Hyub (Dept. of Imaging Science and Arts, GSAIM, Chung-Ang University) ;
  • Choi, Yoon-Seok (Dept. of Imaging Science and Arts, GSAIM, Chung-Ang University) ;
  • Nam, Bo-Dam (Dept. of Imaging Science and Arts, GSAIM, Chung-Ang University) ;
  • Hong, Hyun-Ki (Dept. of Imaging Science and Arts, GSAIM, Chung-Ang University)
  • Received : 2011.11.03
  • Accepted : 2012.04.30
  • Published : 2012.05.25

Abstract

Tele-presence and tele-operation techniques provide a distant user with an immersive scene and a control environment for devices such as mobile robots. This paper presents a novel tele-presence system that uses camera tracking based on planar homography. First, the head motion of a user wearing an HMD (head mounted display) with a mounted camera is estimated by the camera tracking method. Then, from the panoramic image captured by the omni-directional camera on the mobile robot, the scene corresponding to the user's field of view is generated and displayed through the HMD. The homography of a 3D plane with markers is used to estimate the user's head motion. For performance evaluation, the camera tracking results of the marker-based ARToolkit method and the homography-based method are compared with actually measured 3D camera positions.
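The abstract above describes estimating the user's head motion from the homography induced by a 3D plane. As a minimal sketch of that idea (not the authors' implementation), the Python example below estimates a plane-induced homography from matched image points with cv2.findHomography and decomposes it into candidate rotations and translations with cv2.decomposeHomographyMat; the function name estimate_head_motion and the assumption that the intrinsic matrix K is available from a prior calibration of the HMD-mounted camera are illustrative choices, not details taken from the paper.

    # Illustrative sketch only: recovering relative camera motion from the
    # homography induced by a 3D plane, using OpenCV.
    import numpy as np
    import cv2

    def estimate_head_motion(prev_pts, curr_pts, K):
        # prev_pts, curr_pts: (N, 2) arrays of matched image points that lie
        # on a single planar surface; K: (3, 3) camera intrinsic matrix.
        prev_pts = np.asarray(prev_pts, dtype=np.float32)
        curr_pts = np.asarray(curr_pts, dtype=np.float32)

        # Plane-induced homography between the two views (RANSAC rejects outliers).
        H, inlier_mask = cv2.findHomography(prev_pts, curr_pts, cv2.RANSAC, 3.0)

        # Decompose H = K (R + t n^T / d) K^{-1} into up to four (R, t, n)
        # candidates; the physically valid one is normally selected with a
        # positive-depth (cheirality) check on the inlier correspondences.
        num_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
        return rotations, translations, normals, inlier_mask

In a full pipeline along the lines described above, the selected rotation would then determine which portion of the robot's omni-directional panorama is rendered as the user's field of view on the HMD.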

Keywords

References

  1. P. Milgram, H. Takemura, A. Utsumi, and F. Kishino, "Augmented reality: a class of displays on the reality-virtuality continuum," in Proc. of Telemanipulator and Telepresence Technologies, Vol. 2351, pp. 282-292, 1994.
  2. G. Mantovani and G. Riva, "Real presence: how different ontologies generate different criteria for presence, telepresence, and virtual presence," MIT Press, 1999.
  3. S. Dasgupta and A. Banerjee, "An augmented-reality-based real-time panoramic vision system for autonomous navigation," IEEE Trans. on Systems, Man, and Cybernetics, Vol. 36, No. 1, pp. 154-161, 2006. https://doi.org/10.1109/TSMCA.2005.859177
  4. M. Fiala, "Pano-presence for teleoperation," in Proc. of Intelligent Robots and Systems, pp. 3798-3802, 2005.
  5. H. Kato and M. Billinghurst, "Marker tracking and HMD calibration for a video-based augmented reality conferencing system," in Proc. of IWAR, pp. 85-94, 1999.
  6. D. Abawi and J. Bienwald, "Accuracy in optical tracking with fiducial markers: an accuracy function for ARToolKit," in Proc. of ISMAR, pp. 260-261, 2004.
  7. A. Takagi, S. Yamazaki, Y. Saito, and N. Taniguchi, "Development of a stereo video see-through HMD for AR systems," in Proc. of ISAR, pp. 68-77, 2000.
  8. H. Nagahara, Y. Yagi, and M. Yachida, "Wide field of view head mounted display for tele-presence with an omnidirectional image sensor," in Proc. of Omnidirectional Vision and Camera Networks, pp. 16, 2003.
  9. K. Yamazawa, T. Ishikawa, T. Sato, S. Ikeda, Y. Nakamura, K. Fujikawa, H. Sunahara, and N. Yokoya, "Web-based telepresence system using omni-directional video streams," Lecture Notes in Computer Science, Vol. 3333, pp. 45-52, 2004.
  10. H. Lee, D. Kim, M. Park, and G. Park, "Augmented reality based vision system for network based mobile robot," Lecture Notes in Computer Science, Vol. 5068, pp. 123-130, 2008.
  11. http://www.hitl.washington.edu/artoolkit/
  12. F. Dornaika and C. Garcia, "Robust camera calibration using 2D to 3D feature correspondences," Optical Science Engineering and Instrumentation, Videometrics V, Vol. 3174, 1999.
  13. R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, 2004.