Development of a SLAM System for Small UAVs in Indoor Environments using Gaussian Processes

가우시안 프로세스를 이용한 실내 환경에서 소형무인기에 적합한 SLAM 시스템 개발

  • Young-San Jeon (Department of Aerospace Information Engineering, Konkuk University)
  • Jong-Eun Choi
  • Jeong-Uk Lee (Department of Aerospace Information Engineering, Konkuk University)
  • Received : 2014.08.30
  • Accepted : 2014.09.29
  • Published : 2014.11.01


Localization of the vehicle and mapping of the flight environment are key technologies for the autonomous flight of small UAVs. In outdoor environments, an unmanned aircraft can rely on GPS (Global Positioning System) for localization with acceptable accuracy. Since GPS is unavailable indoors, however, a SLAM (Simultaneous Localization and Mapping) system suitable for small UAVs is needed. In this paper, we propose a vision-based SLAM system that uses vision sensors together with an AHRS (Attitude and Heading Reference System) sensor. Feature points in the images captured by the vision sensor are extracted with a GPU (Graphics Processing Unit)-based SIFT (Scale-Invariant Feature Transform) algorithm. These feature points are then combined with the attitude information from the AHRS to estimate the position of the small UAV. From the estimated positions and the color distribution of the images, a Gaussian process model is built, which serves as the map of the environment. Experimental results show that the proposed method properly estimates the position of a small unmanned aircraft and constructs a map of the environment. Finally, the reliability of the proposed method is verified by comparing the estimated values against the actual values.
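The map-building step summarized above can be sketched as standard Gaussian-process regression over estimated UAV positions, following the textbook formulation in Rasmussen and Williams. This is a minimal illustration, not the authors' implementation: the squared-exponential kernel, the function names, and the toy position/color data below are all assumptions.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, sigma_f=1.0):
    # Squared-exponential (RBF) kernel between two sets of 2-D positions.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return sigma_f**2 * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_query, noise=1e-2):
    # GP regression posterior mean and variance at the query positions.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_query, X_train)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    var = (rbf_kernel(X_query, X_query).diagonal()
           - np.sum(K_s * np.linalg.solve(K, K_s.T).T, axis=1))
    return mean, var

# Hypothetical training data: estimated UAV positions (m) and a
# normalized color intensity observed at each position.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([0.2, 0.8, 0.4, 0.9])

# Predict the color value (and its uncertainty) at an unvisited position;
# evaluating this over a grid would produce the map.
mean, var = gp_predict(X, y, np.array([[0.5, 0.5]]))
```

The predictive variance is what makes a Gaussian-process map attractive here: regions far from the flight path report high uncertainty rather than an arbitrary fill value.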


Keywords

SIFT; small unmanned aircraft; SLAM; vision sensor; Gaussian process


References

  1. T. T. Nwe, T. Htike, K. M. Mon, Z. M. Naing, and Y. M. Myint, "Application of an inertial navigation system to the quad-rotor UAV using MEMS sensors," Proc. of World Academy of Science, Engineering and Technology, Aug. 2008.
  2. E. Altug, J. P. Ostrowski, and R. Mahony, "Control of a quadrotor helicopter using visual feedback," Proc. of the 2002 IEEE International Conference on Robotics and Automation, 2002.
  3. J. Huang, D. Millman, M. Quigley, and D. Stavens, "Efficient, generalized indoor WiFi GraphSLAM," Proc. of 2011 IEEE International Conference on Robotics and Automation, pp. 1038-1043, May 2011.
  4. A. Brooks, A. Makarenko, and B. Upcroft, "Gaussian process models for indoor and outdoor sensor-centric robot localization," IEEE Trans. on Robotics, vol. 24, no. 6, pp. 1341-1351, Dec. 2008.
  5. Y. Xu and J. Choi, "Adaptive sampling for learning Gaussian processes using mobile sensor networks," Sensors, vol. 11, no. 3, pp. 3051-3066, 2011.
  6. D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004.
  7. J. O. Lee, T. S. Kang, K. H. Lee, S. G. Im, and J. K. Park, "Vision-based indoor localization for unmanned aerial vehicles," Journal of Aerospace Engineering, vol. 24, no. 3, pp. 373-377, 2011.
  8. C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning, the MIT Press, 2006.
  9. I. W. Selesnick, R. G. Baraniuk, and N. C. Kingsbury, "The dual-tree complex wavelet transform," IEEE Signal Processing Magazine, vol. 22, no. 6, pp. 123-151, 2005.

Cited by

  1. An Object Recognition Method Based on Depth Information for an Indoor Mobile Robot, vol. 21, pp. 10, 2015
  2. Localization of a Monocular Camera using a Feature-based Probabilistic Map, vol. 21, pp. 4, 2015
  3. Vision-based Reduction of Gyro Drift for Intelligent Vehicles, vol. 21, pp. 7, 2015
  4. Electromagnetic Strip Stabilization Control in a Continuous Galvanizing Line using Mixture of Gaussian Model Tuned Fractional PID Controller, vol. 21, pp. 8, 2015