
Robot Navigation Control using Laserscanner to Restrict Human Movement

  • TaeSeok Jin (Department of Mechatronics Engineering, Dongseo University)
  • Received : 2013.01.23
  • Accepted : 2013.02.22
  • Published : 2013.05.31

Abstract

This paper presents a security robot system, and ongoing research results, for keeping people out of a restricted indoor security zone. The proposed robot continuously surveils the security area with an on-board laser scanner. When it detects a person walking toward the restricted area, the robot computes the person's velocity vector, plans a path to forestall him, and moves along the estimated trajectory to block his direction of travel. The walking human is modeled as a point-object and projected onto the scanning plane, yielding a geometric constraint equation that provides the human's position based on the kinematics of the mobile robot. While moving, the robot repeats these steps to adapt to the continuously changing indoor environment. Once an approach to the restricted zone is recognized, the robot drives to a position opposite the person's walking direction and advises him to go no further and change course. Experimental results on estimating and tracking a person heading in the wrong direction with the mobile robot are presented.
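The abstract's core loop (estimate the walker's velocity vector from successive laser-scanner positions, then plan a point on his predicted path that the robot can reach first) can be sketched as below. This is a minimal illustration, not the paper's implementation: the function names, the straight-line motion assumption, and the fixed-speed robot model are all assumptions introduced here.

```python
import math


def estimate_velocity(p0, p1, dt):
    """Finite-difference velocity (vx, vy) of the tracked human,
    treated as a point-object, from two scan positions dt seconds apart."""
    return ((p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt)


def intercept_point(human_pos, human_vel, robot_pos, robot_speed,
                    horizon=5.0, step=0.05):
    """Earliest point on the human's predicted straight-line path that
    the robot (moving at constant robot_speed) can reach no later than
    the human does. Returns None if no intercept exists in the horizon."""
    t = step
    while t <= horizon:
        # Predicted human position at time t (constant-velocity model).
        hx = human_pos[0] + human_vel[0] * t
        hy = human_pos[1] + human_vel[1] * t
        # Can the robot cover the distance to that point within t seconds?
        if math.hypot(hx - robot_pos[0], hy - robot_pos[1]) <= robot_speed * t:
            return (hx, hy)
        t += step
    return None


# Example: a walker at the origin heading +x at 1 m/s; the robot sits
# 2 m ahead on his path and moves at 2 m/s, so it can block him early.
vel = estimate_velocity((0.0, 0.0), (0.05, 0.0), 0.05)
block = intercept_point((0.0, 0.0), vel, (2.0, 0.0), 2.0)
```

In a real system the constant-velocity prediction would be replaced by a tracking filter over the scanner measurements, and the intercept target fed to the robot's kinematic path planner.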

