A Notation Method for Three Dimensional Hand Gesture

  • Choi, Eun-Jung (Department of Industrial and Management Engineering, POSTECH) ;
  • Kim, Hee-Jin (Department of Industrial and Management Engineering, POSTECH) ;
  • Chung, Min-K. (Department of Industrial and Management Engineering, POSTECH)
  • Received : 2012.07.15
  • Accepted : 2012.07.30
  • Published : 2012.08.31

Abstract

Objective: The aim of this study is to suggest a notation method for three-dimensional hand gestures. Background: To match intuitive gestures with product commands, various studies have tried to elicit gestures from users. Because users' experiences differ, such elicitation yields many different gestures for a single command. Thus, organizing the gestures systematically and identifying similar patterns among them have become important issues. Method: Related studies on gesture taxonomy and sign language notation were investigated. Results: Through the literature review, a total of five elements of static gestures were selected, and a total of three forms of dynamic gestures were identified. Temporal variability(repetition) was additionally selected. Conclusion: A notation method that follows a combination sequence of these gesture elements was suggested. Application: The notation method for three-dimensional hand gestures can be used to describe and organize user-defined gestures systematically.
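The abstract does not enumerate the five static-gesture elements, the three dynamic-gesture forms, or the exact combination sequence of the notation, so the following is only a minimal sketch of how such a notation could be encoded in Python. The element names (hand_shape, orientation, location, fingers, contact), the dynamic form and path labels, and the separator are hypothetical placeholders rather than the elements defined in the paper.

from dataclasses import dataclass
from typing import Optional


# NOTE: the study identifies five static-gesture elements, three dynamic-gesture
# forms, and repetition, but does not list them in the abstract; every field name
# and label below is a hypothetical placeholder used only to illustrate the idea
# of a fixed combination sequence.
@dataclass
class StaticGesture:
    hand_shape: str    # e.g. "fist", "open-palm"
    orientation: str   # e.g. "palm-up", "palm-left"
    location: str      # e.g. "chest", "head-level"
    fingers: str       # e.g. "index-extended", "all-extended"
    contact: str       # e.g. "none", "touching-surface"


@dataclass
class DynamicGesture:
    form: str                   # one of the (placeholder) dynamic forms
    path: Optional[str] = None  # optional movement path, e.g. "left-to-right"


@dataclass
class GestureNotation:
    static: StaticGesture
    dynamic: Optional[DynamicGesture] = None
    repetitions: int = 1  # temporal variability (repetition)

    def to_string(self) -> str:
        """Write the gesture elements in one fixed combination sequence."""
        parts = [
            self.static.hand_shape,
            self.static.orientation,
            self.static.location,
            self.static.fingers,
            self.static.contact,
        ]
        if self.dynamic is not None:
            parts.append(self.dynamic.form)
            if self.dynamic.path is not None:
                parts.append(self.dynamic.path)
        parts.append(f"x{self.repetitions}")
        return " | ".join(parts)


# Example: one user's "next track" gesture, a swipe performed twice.
gesture = GestureNotation(
    static=StaticGesture("open-palm", "palm-left", "chest", "all-extended", "none"),
    dynamic=DynamicGesture(form="path-movement", path="left-to-right"),
    repetitions=2,
)
print(gesture.to_string())
# open-palm | palm-left | chest | all-extended | none | path-movement | left-to-right | x2

Because every user-defined gesture is serialized through the same fixed sequence, the resulting notations can be compared as strings and gestures with identical or similar patterns can be grouped, which is the kind of systematic organization the abstract describes.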

Keywords

References

  1. Bhuiyan, M. and Picking, R., A gesture controlled user interface for inclusive design and evaluative study of its usability, Journal of Software Engineering and Applications, 4, 513-521, 2011. https://doi.org/10.4236/jsea.2011.49059
  2. David, V. K. and Rajasekaran, S., Retracted chapter: gesture and signature recognition using MicroARTMAP, Studies in Computational Intelligence, 160, 93-113, 2009. https://doi.org/10.1007/978-3-540-85130-1_7
  3. Henze, N., Locken, A., Boll, S., Hesselmann, T. and Pielot, M., "Free-hand gestures for music playback: deriving gestures with a user-centered process", Proceedings of the 9th International Conference on Mobile and Ubiquitous Multimedia, 16, 2010.
  4. Karam, M. and Schraefel, M. C., "A taxonomy of gesture in human computer interactions", Technical Report ECSTRIAM05-009, Electronics and Computer Science, University of Southampton, 2005.
  5. Kuhnel, C., Westermann, T., Hemmert, F., Kratz, S., Muller, A. and Moller, S., I'm home: Defining and evaluating a gesture set for smart-home control, International Journal of Human-Computer Studies, 693-704, 2011.
  6. Lao, S., Heng, X., Zhang, G., Ling, Y. and Wang, P. A., "Gestural interaction design model for multi-touch displays", Proceedings of the 2009 British Computer Society Conference on Human-Computer Interaction, Swinton, UK, (pp. 440-446), 2009.
  7. Lee, B. and Song, P., Advanced representation method of hand motion by cheremes analysis in KSL, Journal of Korea Multimedia Society, 9(8), 1067-1075, 2006.
  8. Lin, J. Y., Wu, Y. and Huang, T. S., "3D model-based hand tracking using stochastic direct search method", Sixth IEEE International Conference on Automatic Face and Gesture Recognition, Seoul, Korea, (pp. 693-698), 2004.
  9. Mauney, D., Howarth, J., Wirtanen, A. and Capra, M., "Cultural similarities and differences in user-defined gestures for touchscreen user interfaces", Proceedings of CHI '10, Atlanta, Georgia, USA, 2010.
  10. McNeill, D., Hand and mind: What gestures reveal about thought, University of Chicago Press, 1992.
  11. Mo, Z., Gesture interface engine, Ph.D. thesis; University of Southern California, 2007.
  12. Nesselrath, R., Lu, C., Schulz, C. H., Frey, J. and Alexandersson, J., "A gesture based system for context-sensitive interaction with smart homes", Deutscher AAL-Kongress, 2011.
  13. Nielsen, M., Moeslund, T., Storring, T. and Granum, E., "A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction", Proceedings of the 5th International Gesture Workshop, Genova, Italy, 2003.
  14. Nielsen, M., Storring, T., Moeslund, T. and Granum, E., A procedure for developing intuitive and ergonomic gesture interfaces for HCI, Gesture-Based Communication in Human-Computer Interaction, 105-106, 2004.
  15. Nielsen, M., Moeslund, T. B. and Granum, E., Gesture interfaces, In P. Kortum (Ed.): HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other Nontraditional Interfaces, Elsevier, 2008.
  16. Park, W., A multi-touch gesture vocabulary design methodology for mobile devices, Ph.D. thesis; Division of Mechanical and Industrial Engineering, POSTECH, Pohang, Korea, 2012.
  17. Quek, F., McNeill, D., Bryll, R., Duncan, S., Ma, X. F. and Kirbas, C., Multimodal human discourse: Gesture and speech, ACM Transactions on Computer-Human Interaction, 9, 171-193, 2002. https://doi.org/10.1145/568513.568514
  18. Saffer, D., Designing Gestural Interfaces: Touchscreens and Interactive Devices, 1st ed., O'Reilly Media, 2008.
  19. Stokoe, W. C., Sign language structure: An outline of the visual communication system of the American deaf, Studies in Linguistics: Occasional Papers 8, 1960.
  20. Sutton, V., Signwriting for everyday use, Newport Beach: Sutton Movement Writing Press, 1981.
  21. Witkin, A. and Kass, M., "Spacetime constraints", Proceedings of SIGGRAPH, New York, USA, (pp. 159-168), 1988.
  22. Wu, Y., Lin, J. Y. and Huang, T. S., "Capturing natural hand articulation", Proceedings of the 8th International Conference on Computer Vision, 2, (pp. 426-432), 2001.
  23. Wobbrock, J. O., Morris, M. R. and Wilson, A. D., "User-defined gestures for surface computing", Proceedings of the 27th International Conference on Human Factors in Computing Systems, (pp. 1083-1092), 2009.