Hand Language Translation Using Kinect

  • Pyo, Junghwan (Dept. of Electronics and Communications Engineering, Kwangwoon University) ;
  • Kang, Namhyuk (Dept. of Electronics and Communications Engineering, Kwangwoon University) ;
  • Bang, Jiwon (Dept. of Electronics and Communications Engineering, Kwangwoon University) ;
  • Jeong, Yongjin (Dept. of Electronics and Communications Engineering, Kwangwoon University)
  • Received : 2014.04.30
  • Accepted : 2014.06.18
  • Published : 2014.06.30

Abstract

Now that improved image processing algorithms have made hand gesture recognition feasible, sign language translation has become an important application for the hearing-impaired. In this paper, we extract the human hand from a real-time image stream and recognize its gesture to determine which sign language letter it represents. We used depth-color calibrated images from the Kinect to extract the hands and built a decision tree to recognize the hand gesture. The decision tree uses features such as the number of fingers, the contour, and the hand's position inside a uniform-sized image. We recognized 'Hangul', the Korean alphabet, with a recognition rate of 98.16%. The system's average execution time per letter was about 76.5 ms, a reasonable speed given that sign language translation works on nearly still images. We expect this research to help communication between the hearing-impaired and people who do not know sign language.
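To make the pipeline concrete, the sketch below (Python with OpenCV; not the authors' code) shows one plausible way to segment a hand from a Kinect depth frame and derive the features the abstract mentions: finger count, contour, and the hand's position inside a uniform-sized image, feeding a toy decision rule. The depth band, defect threshold, helper names, and the placeholder Hangul labels are illustrative assumptions, not the published implementation.

```python
import cv2          # OpenCV >= 4 assumed (findContours returns two values)
import numpy as np

def extract_hand_mask(depth_mm, band_mm=150):
    """Keep pixels within band_mm of the nearest measured depth (assumed to be the hand)."""
    valid = depth_mm > 0                        # Kinect reports 0 where depth is unknown
    if not np.any(valid):
        return np.zeros(depth_mm.shape, np.uint8)
    near = depth_mm[valid].min()
    mask = ((depth_mm > 0) & (depth_mm <= near + band_mm)).astype(np.uint8) * 255
    # Remove small speckles so the largest contour really is the hand
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

def hand_features(mask, out_size=128):
    """Return (finger_count, normalized centroid, uniform-sized crop) of the largest blob."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)

    # Crop the hand and rescale to a uniform-sized image, as the abstract describes
    x, y, w, h = cv2.boundingRect(hand)
    crop = cv2.resize(mask[y:y + h, x:x + w], (out_size, out_size))

    # Approximate the finger count from deep convexity defects between fingers
    fingers = 0
    hull = cv2.convexHull(hand, returnPoints=False)
    if hull is not None and len(hull) > 3:
        defects = cv2.convexityDefects(hand, hull)
        if defects is not None:
            depths_px = defects[:, 0, 3] / 256.0    # defect depth is fixed-point (x256)
            fingers = int(np.count_nonzero(depths_px > 20))

    m = cv2.moments(hand)
    cx = m["m10"] / m["m00"] / mask.shape[1]        # centroid, normalized to [0, 1]
    cy = m["m01"] / m["m00"] / mask.shape[0]
    return fingers, (cx, cy), crop

def classify(fingers, centroid):
    """Toy stand-in for the paper's decision tree; the real nodes and letters differ."""
    cx, _ = centroid
    if fingers == 0:
        return "ㅁ"                                  # placeholder Hangul labels
    if fingers == 1:
        return "ㄱ" if cx < 0.5 else "ㄴ"
    return "ㅂ"
```

Per frame, such a system would call extract_hand_mask on the depth image, pass the mask to hand_features, and feed the resulting finger count and position into the decision tree; because the input letters are nearly still poses, a shallow tree over a handful of shape features can stay well within the reported per-letter latency.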

References

  1. http://www.microsoft.com/en-us/kinectforwindows/
  2. Douglas Chai, "Face Segmentation Using Skin-Color Map in Videophone Applications", IEEE Transactions on Circuits and Systems for Video Technology, Vol. 9, No. 4, pp. 551-564, June 1999. https://doi.org/10.1109/76.767122
  3. Pedro Trindade, "Hand gesture recognition using color and depth images enhanced with hand angular pose data", presented at the IEEE Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 71-76, September 2012.
  4. Yi Li, "Multi-scenario Gesture Recognition using Kinect", presented at the 17th International Conference on Computer Games, pp. 126-130, August 2012.
  5. Sanghyeok Oh, "Interactive Learning of Korean Finger Spelling using Data Glove", in Proc. Spring Ann. Conference of the Korea Multimedia Society, Korea, pp. 733-736, May 2007.
  6. Seungki Min, "Optimize Data Glove-based System for Korean Finger Spelling Recognition", Korea Information Science Society, pp. 237-241, June 2007.
  7. Min-Ji Kang, "The Study on Dynamic Images Processing for Finger Languages", KIISE Journal of Korea Computer Congress, Vol. 34, No. 1, pp. 184-189, April 2004.
  8. Hee-Deok Yang, "Automatic Spotting of Sign and Fingerspelling for Continuous Sign Language Recognition", KIISE Journal of Software and Application, Vol. 8, No. 2, pp. 102-107, February 2012.
  9. Hanhoon Park, "A Study on Hand Region Detection for Kinect-Based Hand Shape Recognition", Journal of The Korean Society of Broadcast Engineers, Vol. 18, No. 3, pp. 393-400, May 2013. https://doi.org/10.5909/JBE.2013.18.3.393
  10. http://www.jameco.com/Jameco/workshop/howitworks/xboxkinect.html
  11. http://en.wikipedia.org/wiki/Connected-component_labeling
  12. Guochan Chang, "A Decision Tree based Real-time Hand Gesture Recognition Method using KINECT", Journal of The Korea Multimedia Society, Vol. 16, No. 12, pp. 1393-1402, December 2013. https://doi.org/10.9717/kmms.2013.16.12.1393
  13. http://www.korean.go.kr (National Institute of the Korean Language)

Cited by

  1. A Study On Positioning Of Mouse Cursor Using Kinect Depth Camera, Vol. 18, No. 4, 2014. https://doi.org/10.7471/ikeee.2014.18.4.478