Kalman Filter-based Sensor Fusion for Posture Stabilization of a Mobile Robot


Jang, Taeho;Kim, Youngshik;Kyoung, Minyoung;Yi, Hyunbean;Hwan, Yoondong

  • Received : 2015.12.08
  • Accepted : 2016.06.29
  • Published : 2016.08.01


In robotics research, accurate estimation of the current robot position is important for achieving motion control of a robot. In this research, we focus on a sensor fusion method that provides improved position estimation for a wheeled mobile robot by considering two different sensor measurements. Specifically, we fuse camera-based vision data and encoder-based odometry data using Kalman filter techniques to improve the position estimate of the robot. An external camera-based vision system provides the global position coordinates (x, y) of the mobile robot in an indoor environment, while internal encoder-based odometry provides the linear and angular velocities of the robot. We then use the position estimated by the Kalman filter as the input to the motion controller, which significantly improves the controller's performance. Finally, we experimentally verify the performance of the proposed sensor-fused position estimation and motion controller using an actual mobile robot system. In these experiments, we also compare the Kalman filter-based sensor-fused estimate with two single-sensor estimates (vision-based and odometry-based).
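The fusion scheme described above can be sketched as an extended Kalman filter over the robot pose [x, y, θ]: odometry velocities (v, ω) drive the prediction step through a unicycle motion model, and the camera's global (x, y) measurement drives the correction step. This is a minimal illustrative sketch only, not the paper's implementation; the unicycle model, the state layout, and all noise covariances (`Q`, `R`) are assumptions chosen for clarity.

```python
import numpy as np

class FusionEKF:
    """Minimal EKF fusing encoder odometry (v, w) with camera position (x, y).

    State vector: [x, y, theta]. Noise covariances are hypothetical values
    for illustration, not tuned parameters from the paper.
    """
    def __init__(self, x0, P0, Q, R):
        self.x = np.asarray(x0, dtype=float)  # pose estimate
        self.P = np.asarray(P0, dtype=float)  # estimate covariance
        self.Q = Q  # process noise (odometry/model uncertainty)
        self.R = R  # measurement noise (vision uncertainty)

    def predict(self, v, w, dt):
        # Propagate the pose with odometry velocities (unicycle kinematics).
        th = self.x[2]
        self.x = self.x + dt * np.array([v * np.cos(th), v * np.sin(th), w])
        # Jacobian of the motion model with respect to the state.
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z_xy):
        # The vision system measures global position only, so H selects (x, y).
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        y = np.asarray(z_xy, dtype=float) - H @ self.x   # innovation
        S = H @ self.P @ H.T + self.R                    # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ H) @ self.P
```

In a loop, `predict` would run at the encoder rate and `update` whenever a camera frame arrives; the fused `x` is what would feed the motion controller. The corrected position ends up between the odometry-only prediction and the raw vision measurement, weighted by the relative uncertainties in `P` and `R`.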


Keywords: Kalman Filter; Mobile Robot; Vision System; Posture Stabilization



Cited by

  1. Error Correction of Real-time Situation Recognition using Smart Device vol.19, pp.9, 2018,