
Real-Time Motion Generation Method of Humanoid Robots based on RGB-D Camera for Interactive Performance and Exhibition

  • Seo, Bohyeong (Applied Robot R&D Department, Korea Institute of Industrial Technology) ;
  • Lee, Duk-Yeon (Applied Robot R&D Department, Korea Institute of Industrial Technology) ;
  • Choi, Dongwoon (Applied Robot R&D Department, Korea Institute of Industrial Technology) ;
  • Lee, Dong-Wook (Applied Robot R&D Department, Korea Institute of Industrial Technology)
  • Received : 2020.06.02
  • Accepted : 2020.07.21
  • Published : 2020.07.30

Abstract


As humanoid robot technology advances, robots are increasingly used in performances. Accordingly, studies are underway to broaden the range of such uses by making robot motion as natural as human motion. Motion capture is the most widely used approach, but it typically requires IMU sensors or markers attached to each part of the body together with precise, high-performance cameras, which makes the setup environmentally inconvenient. In addition, a robot used in a performance must respond in real time to unexpected situations or to the audience's reactions. To address these problems, this paper proposes a real-time motion capture system built from multiple RGB-D cameras, together with a method that uses the captured data to generate natural robot motions similar to human motion.
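The pipeline the abstract outlines (RGB-D skeleton tracking → joint angles → real-time smoothing) can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the function names and noise parameters are assumptions, and the per-angle scalar Kalman filter is one plausible smoothing choice (the reference list cites Welch and Bishop's Kalman filter tutorial).

```python
import math

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by segments b->a and b->c,
    given three 3-D joint positions from an RGB-D skeleton tracker."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

class Kalman1D:
    """Scalar Kalman filter smoothing one noisy joint-angle stream.

    q and r (process/measurement noise) are assumed tuning values."""
    def __init__(self, q=1e-4, r=1e-2):
        self.q, self.r = q, r
        self.x, self.p = None, 1.0  # state estimate and its covariance
    def update(self, z):
        if self.x is None:          # initialize from the first measurement
            self.x = z
            return self.x
        self.p += self.q                   # predict: covariance grows
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct toward measurement z
        self.p *= (1.0 - k)
        return self.x

# Example: elbow flexion from shoulder/elbow/wrist positions per frame,
# smoothed before being sent to the robot's joint controller.
shoulder, elbow, wrist = (0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.3, 0.3, 0.0)
raw = joint_angle(shoulder, elbow, wrist)   # 90 degrees for this pose
smooth = Kalman1D().update(raw)
```

In a multi-camera setup, each camera's skeleton estimate would first be fused (e.g. by averaging joint positions in a common frame) before the angle computation; the filter then runs once per robot joint per frame.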

Keywords

References

  1. Amazing Shaman Animatronic in Na'vi River Journey, Pandora - The World of Avatar, Walt Disney World, https://www.youtube.com/watch?v=_p4mn5BstQo
  2. Shuichi Nishio, Hiroshi Ishiguro, and Norihiro Hagita, "Geminoid: Teleoperated Android of an Existent Person," Humanoid Robots: New Developments, pp. 343-352, June 2007. doi:10.5772/4876
  3. "Robots vs humans on the opera stage 'My score is-'," News A, Channel A, Mar. 1, 2018. http://www.ichannela.com/news/main/news_detailPage.do?publishId=000000081857
  4. Turtle Talk, Tokyo Disney Resort, https://www.tokyo-disneyresort.jp/kr/tds/attraction/detail/246/
  5. Oscar Efrain Ramos Ponce, "Generation of the Whole-Body Motion for Humanoid Robots with the Complete Dynamics," Ph.D. thesis, Robotics [cs.RO], Université Toulouse III Paul Sabatier, pp. 52-59, 2014.
  6. Yiming Yang, Wolfgang Merkt, Henrique Ferrolho, Vladimir Ivan, and Sethu Vijayakumar, "Efficient Humanoid Motion Planning on Uneven Terrain Using Paired Forward-Inverse Dynamic Reachability Maps," IEEE Robotics and Automation Letters, vol. 2, no. 4, 2017.
  7. Guoyu Zuo, Yongkang Qiu, and Tingting Pan, "Attitude Algorithm of Human Motion Capture System for Teleoperation of Humanoid Robots," 2018 Chinese Automation Congress (CAC), pp. 3890-3895, Nov. 2018.
  8. Rongkai Liu, Liang Peng, Lina Tong, Kaizhi Yang, and Bingyang Liu, "The Design of Wearable Wireless Inertial Measurement Unit for Body Motion Capture System," 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), pp. 557-562, Aug. 2018.
  9. Kun Qian, Jie Niu, and Hong Yang, "Developing a Gesture Based Remote Human-Robot Interaction System Using Kinect," International Journal of Smart Home, vol. 7, no. 4, pp. 203-208, Jul. 2013.
  10. Van Vuong Nguyen and Joo-Ho Lee, "Full-Body Imitation of Human Motions with Kinect and Heterogeneous Kinematic Structure of Humanoid Robot," 2012 IEEE/SICE International Symposium on System Integration (SII), pp. 93-98, Dec. 2012.
  11. Lin Yang, Longyu Zhang, Haiwei Dong, Abdulhammed Alelaiwi, and Abdulmotaleb El Saddik, "Evaluating and Improving the Depth Accuracy of Kinect for Windows v2," IEEE Sensors Journal, vol. 15, no. 8, pp. 4275-4285, Aug. 2015. https://doi.org/10.1109/JSEN.2015.2416651
  12. G. Welch and G. Bishop, "An Introduction to the Kalman Filter," University of North Carolina at Chapel Hill, TR 95-041, 1995.
  13. Jamie Shotton, Andrew Fitzgibbon, Mat Cook, Toby Sharp, Mark Finocchio, Richard Moore, Alex Kipman, and Andrew Blake, "Real-Time Human Pose Recognition in Parts from Single Depth Images," IEEE CVPR 2011, pp. 1297-1304, 2011.
  14. Mark W. Spong, Seth Hutchinson, and M. Vidyasagar, Robot Modeling and Control, First Edition, pp. 85-98, 2005.