Integrated Navigation Algorithm using Velocity Incremental Vector Approach with ORB-SLAM and Inertial Measurement


  • Kim, Yeonjo (Dept. of Aerospace Information Engineering, Konkuk University) ;
  • Son, Hyunjin (Dept. of Aerospace Information Engineering, Konkuk University) ;
  • Lee, Young Jae (Dept. of Aerospace Information Engineering, Konkuk University) ;
  • Sung, Sangkyung (Dept. of Aerospace Information Engineering, Konkuk University)
  • Received : 2018.12.04
  • Accepted : 2018.12.17
  • Published : 2019.01.01

Abstract

In recent years, visual-inertial odometry (VIO) algorithms have been studied extensively for indoor and urban environments because they are more robust to dynamic scenes and environmental changes. In this paper, we propose a loosely coupled (LC) VIO algorithm that uses the velocity vectors obtained from both visual odometry (VO) and an inertial measurement unit (IMU) as the measurement of an extended Kalman filter. Our approach improves the estimation performance of the filter without adding extra sensors, while maintaining a simple integration framework that treats the VO as a black box. For the VO algorithm, we employ the fundamental part of ORB-SLAM, which uses ORB features. We performed an outdoor experiment with an RGB-D camera to evaluate the accuracy of the presented algorithm, and we also evaluated it on a public dataset for comparison with other visual navigation systems.


Fig. 1 The relation between $\Delta\vec{V}^{n}_{Ref,k}$ and $\Delta\vec{V}^{n}_{INS,k}$ [11]

Fig. 2 Ground truth and estimated Euler angles for V103

Fig. 3 Attitude error of the proposed and LC algorithms on V103

Fig. 4 Position error of the proposed and LC algorithms on V103

Fig. 5 Navigation module and experimental setup

Fig. 6 Outdoor experimental environment

Fig. 7 Estimation results of horizontal and vertical position

Fig. 8 Position estimation error

Fig. 9 Attitude estimation error

Table 1 Position and attitude error on the EuRoC dataset

Table 2 Absolute translational RMSE (median) by algorithm

Table 3 Specifications of the ADIS 16448 IMU

References

  1. S. Shen, N. Michael, and V. Kumar, "Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs", Proc. - IEEE Int. Conf. Robot. Autom., pp. 5303-5310, 2015.
  2. S. Weiss and R. Siegwart, "Real-time metric state estimation for modular vision-inertial systems", Proc. - IEEE Int. Conf. Robot. Autom., pp. 4531-4537, 2011.
  3. G. Huang, M. Kaess, and J. J. Leonard, "Towards consistent visual-inertial navigation", in Proceedings - IEEE International Conference on Robotics and Automation, 2014, pp. 4926-4933.
  4. S. Leutenegger, S. Lynen, M. Bosse, R. Siegwart, and P. Furgale, "Keyframe-based visual-inertial odometry using nonlinear optimization", Int. J. Rob. Res., Vol. 34, No. 3, pp. 314-334, Mar. 2015. https://doi.org/10.1177/0278364914554813
  5. R. Mur-Artal and J. D. Tardos, "ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras", IEEE Trans. Robot., Vol. 33, No. 5, pp. 1255-1262, Oct. 2017. https://doi.org/10.1109/TRO.2017.2705103
  6. T. Qin, P. Li, and S. Shen, "VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator", IEEE Trans. Robot., Vol. 34, No. 4, pp. 1004-1020, 2018. https://doi.org/10.1109/tro.2018.2853729
  7. T. Schneider et al., "maplab: An Open Framework for Research in Visual-inertial Mapping and Localization", IEEE Robot. Autom. Lett., Vol. 3, No. 3, pp. 1418-1425, 2018. https://doi.org/10.1109/lra.2018.2800113
  8. C. Mei, G. Sibley, and P. Newman, "Closing loops without places", in IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings, 2010, pp. 3738-3744.
  9. R. Kummerle, G. Grisetti, H. Strasdat, K. Konolige, and W. Burgard, "G2o: A general framework for graph optimization", Proc. - IEEE Int. Conf. Robot. Autom., pp. 3607-3613, 2011.
  10. K. Schmid and H. Hirschmuller, "Stereo vision and IMU based real-time ego-motion and depth image computation on a handheld device", in Proceedings - IEEE International Conference on Robotics and Automation, 2013, pp. 4671-4678.
  11. B. Lee, S. Yun, H. K. Lee, Y. J. Lee, and S. Sung, "An efficient attitude reference system design using velocity differential vectors under weak acceleration dynamics", Int. J. Aeronaut. Sp. Sci., Vol. 17, No. 2, pp. 222-231, 2016. https://doi.org/10.5139/IJASS.2016.17.2.222
  12. B. Lee, Y. J. Lee, and S. Sung, "Attitude Determination Algorithm based on Relative Quaternion Geometry of Velocity Incremental Vectors for Cost Efficient AHRS Design", Int. J. Aeronaut. Sp. Sci., Vol. 19, No. 2, pp. 459-469, Jun. 2018. https://doi.org/10.1007/s42405-018-0030-6
  13. M. Burri et al., "The EuRoC micro aerial vehicle datasets", Int. J. Rob. Res., Vol. 35, No. 10, pp. 1157-1163, 2016. https://doi.org/10.1177/0278364915620033
  14. T. Bailey and H. Durrant-Whyte, "Simultaneous localization and mapping (SLAM): Part I", IEEE Robot. Autom. Mag., 2006.
  15. A. I. Mourikis and S. I. Roumeliotis, "A multi-state constraint Kalman filter for vision-aided inertial navigation", in Proceedings - IEEE International Conference on Robotics and Automation, 2007, pp. 3565-3572.
  16. J. Kelly and G. S. Sukhatme, "Visual-inertial sensor fusion: Localization, mapping and sensor-to-sensor Self-calibration", Int. J. Rob. Res., Vol. 30, No. 1, pp. 56-79, 2011. https://doi.org/10.1177/0278364910382802
  17. M. Li and A. I. Mourikis, "Improving the accuracy of EKF-based visual-inertial odometry", in Proceedings - IEEE International Conference on Robotics and Automation, 2012, pp. 828-835.
  18. M. Li and A. I. Mourikis, "High-precision, consistent EKF-based visual-inertial odometry", Int. J. Rob. Res., Vol. 32, No. 6, pp. 690-711, May 2013. https://doi.org/10.1177/0278364913481251
  19. P. Tanskanen, T. Naegeli, M. Pollefeys, and O. Hilliges, "Semi-direct EKF-based monocular visual-inertial odometry", in IEEE International Conference on Intelligent Robots and Systems, 2015, pp. 6073-6078.
  20. M. Bloesch, S. Omari, M. Hutter, and R. Siegwart, "Robust visual inertial odometry using a direct EKF-based approach", in IEEE International Conference on Intelligent Robots and Systems, 2015, pp. 298-304.
  21. M. Bloesch, M. Burri, S. Omari, M. Hutter, and R. Siegwart, "Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback", Int. J. Rob. Res., Vol. 36, No. 10, pp. 1053-1072, 2017. https://doi.org/10.1177/0278364917728574
  22. K. Sun et al., "Robust Stereo Visual Inertial Odometry for Fast Autonomous Flight", IEEE Robot. Autom. Lett., Vol. 3, No. 2, pp. 965-972, 2018. https://doi.org/10.1109/lra.2018.2793349
  23. S. Lynen, M. W. Achtelik, S. Weiss, M. Chli, and R. Siegwart, "A robust and modular multi-sensor fusion approach applied to MAV navigation", in IEEE International Conference on Intelligent Robots and Systems, 2013, pp. 3923-3929.
  24. S. Shen, Y. Mulgaonkar, N. Michael, and V. Kumar, "Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft MAV", in Proceedings - IEEE International Conference on Robotics and Automation, 2014, pp. 4974-4981.
  25. R. Mur-Artal, J. M. M. Montiel, and J. D. Tardos, "ORB-SLAM: A Versatile and Accurate Monocular SLAM System", IEEE Trans. Robot., Vol. 31, No. 5, pp. 1147-1163, 2015. https://doi.org/10.1109/TRO.2015.2463671
  26. E. Rublee, V. Rabaud, K. Konolige, and G. Bradski, "ORB: An efficient alternative to SIFT or SURF", in Proceedings of the IEEE International Conference on Computer Vision, 2011, pp. 2564-2571.
  27. D. G. Lowe, "Distinctive image features from scale-invariant keypoints", Int. J. Comput. Vis., Vol. 60, No. 2, pp. 91-110, Nov. 2004. https://doi.org/10.1023/B:VISI.0000029664.99615.94
  28. J. Sturm, N. Engelhard, F. Endres, W. Burgard, and D. Cremers, "A benchmark for the evaluation of RGB-D SLAM systems", IEEE Int. Conf. Intell. Robot. Syst., pp. 573-580, 2012.
  29. A. Geiger, J. Ziegler, and C. Stiller, "StereoScan: Dense 3D reconstruction in real-time", in IEEE Intelligent Vehicles Symposium (IV), 2011, pp. 963-968.
  30. A. Solin, S. Cortes, E. Rahtu, and J. Kannala, "PIVO: Probabilistic inertial-visual odometry for occlusion-robust navigation", in Proceedings - 2018 IEEE Winter Conference on Applications of Computer Vision (WACV), 2018, pp. 616-625.
  31. R. Mur-Artal and J. D. Tardos, "Visual-Inertial Monocular SLAM with Map Reuse", IEEE Robot. Autom. Lett., Vol. 2, No. 2, pp. 796-803, Apr. 2017. https://doi.org/10.1109/LRA.2017.2653359
  32. T. Pire, T. Fischer, J. Civera, P. De Cristoforis, and J. J. Berlles, "Stereo parallel tracking and mapping for robot localization", in IEEE International Conference on Intelligent Robots and Systems, 2015, pp. 1373-1378.
  33. N. Krombach, D. Droeschel, and S. Behnke, "Combining feature-based and direct methods for semi-dense real-time stereo visual odometry", Adv. Intell. Syst. Comput., Vol. 531, pp. 855-868, 2017.