A Notation Method for Three Dimensional Hand Gesture
Choi, Eun-Jung; Kim, Hee-Jin; Chung, Min-K.
Objective: The aim of this study is to suggest a notation method for three-dimensional hand gestures. Background: To match intuitive gestures with product commands, various studies have tried to derive gestures from users. In such cases, many different gestures are derived for a single command because of users' varied experience. Thus, organizing the gestures systematically and identifying similar patterns among them has become an important issue. Method: Related studies on gesture taxonomy and sign-language notation were investigated. Results: Through the literature review, a total of five elements of static gesture were selected, and a total of three forms of dynamic gesture were identified. Temporal variability (repetition) was additionally selected. Conclusion: A notation method that follows a combination sequence of the gesture elements was suggested. Application: The notation method for three-dimensional hand gestures might be used to describe and organize user-defined gestures systematically.
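The abstract describes notating a gesture as an ordered combination of static elements, dynamic forms, and a repetition count. As a rough illustration of how such a notation could be represented in software, the sketch below uses hypothetical element names (hand shape, palm orientation, location, movement form, direction); the paper's actual five static elements and three dynamic forms are defined in its Results section and may differ.

```python
from dataclasses import dataclass, field
from typing import List

# NOTE: all element names below are illustrative assumptions,
# not the paper's actual gesture vocabulary.

@dataclass
class StaticPosture:
    """A static gesture element: the hand's configuration at one moment."""
    hand_shape: str        # e.g. "open-palm", "fist"
    palm_orientation: str  # e.g. "forward", "down"
    location: str          # e.g. "head-level", "chest-level"

@dataclass
class DynamicSegment:
    """One dynamic gesture element: a movement form plus its direction."""
    form: str       # e.g. "path" (one of several possible movement forms)
    direction: str  # e.g. "left-right"

@dataclass
class GestureNotation:
    """A gesture notated as a combination sequence of its elements."""
    start: StaticPosture
    movements: List[DynamicSegment] = field(default_factory=list)
    repetitions: int = 1  # temporal variability (repetition)

    def encode(self) -> str:
        """Serialize the gesture as a single notation string."""
        parts = [f"{self.start.hand_shape}/{self.start.palm_orientation}"
                 f"/{self.start.location}"]
        parts += [f"{m.form}:{m.direction}" for m in self.movements]
        code = " > ".join(parts)
        return code if self.repetitions == 1 else f"({code}) x{self.repetitions}"
```

For example, a twice-repeated waving gesture could be encoded as `GestureNotation(StaticPosture("open-palm", "forward", "head-level"), [DynamicSegment("path", "left-right")], repetitions=2).encode()`, yielding `(open-palm/forward/head-level > path:left-right) x2`. A string encoding like this makes user-defined gestures easy to group and compare for similar patterns.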
Keywords: Three-dimensional hand gesture; Hand gesture notation method; Gesture elements; Gesture features
 Cited by
A taxonomy and notation method for three-dimensional hand gestures, International Journal of Industrial Ergonomics, 44(1), 171, 2014.
Bhuiyan, M. and Picking, R., A gesture controlled user interface for inclusive design and evaluative study of its usability, Journal of Software Engineering and Applications, 4, 513-521, 2011.

David, V. K. and Rajasekaran, S., Retracted chapter: gesture and signature recognition using MicroARTMAP, Studies in Computational Intelligence, 160, 93-113, 2009.

Henze, N., Locken, A., Boll, S., Hesselmann, T. and Pielot, M., "Free-hand gestures for music playback: deriving gestures with a user-centered process", Proceedings of the 9th international conference on Mobile and Ubiquitous Multimedia, 16, 2010.

Karam, M. and Schraefel, M. C., "A taxonomy of gesture in human computer interactions", Technical Report ECSTRIAM05-009, Electronics and Computer Science, University of Southampton, 2005.

Kuhnel, C., Westermann, T., Hemmert, F., Kratz, S., Muller, A. and Moller, S., I'm home: Defining and evaluating a gesture set for smart-home control, International Journal of Human-Computer Studies, 693-704, 2011.

Lao, S., Heng, X., Zhang, G., Ling, Y. and Wang, P. A., "Gestural interaction design model for multi-touch displays". Proceedings of the 2009 British Computer Society Conference on Human-Computer interaction, Swinton, UK, (pp. 440-446), 2009.

Lee, B. and Song, P., Advanced representation method of hand motion by cheremes analysis in KSL, Journal of Korea Multimedia Society, 9(8), 1067-1075, 2006.

Lin, J. Y., Wu, Y. and Huang, T. S., "3D model-based hand tracking using stochastic direct search method", Sixth IEEE International Conference on Automatic Face and Gesture Recognition, Seoul, Korea, (pp. 693-698), 2004.

Mauney, D., Howarth, J., Wirtanen, A. and Capra, M., "Cultural similarities and differences in user-defined gestures for touchscreen user interfaces", Proceedings of CHI '10, Atlanta, Georgia, USA, 2010.

McNeill, D., Hand and mind: What gestures reveal about thought, University of Chicago Press, 1992.

Mo, Z., Gesture interface engine, Ph.D. thesis, University of Southern California, 2007.

Nesselrath, R., Lu, C., Schulz, C. H., Frey, J. and Alexandersson, J., "A gesture based system for context-sensitive interaction with smart homes", Deutscher AAL-Kongress, 2011.

Nielsen, M., Moeslund, T., Storring, T. and Granum, E., "A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction", Proceedings of the 5th International Gesture Workshop, Genova, Italy, 2003.

Nielsen, M., Storring, T., Moeslund, T. and Granum, E., A procedure for developing intuitive and ergonomic gesture interfaces for HCI, Gesture-Based Communication in Human-Computer Interaction, 105-106, 2004.

Nielsen, M., Moeslund, T. B. and Granum, E., Gesture interfaces, In P. Kortum (Ed.): HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other Nontraditional Interfaces, Elsevier, 2008.

Park, W., A multi-touch gesture vocabulary design methodology for mobile devices, Ph.D. thesis, Division of Mechanical and Industrial Engineering, POSTECH, Pohang, Korea, 2012.

Quek, F., McNeill, D., Bryll, R., Duncan, S., Ma, X. F. and Kirbas, C., Multimodal human discourse: Gesture and speech, ACM Transactions on Computer-Human Interaction, 9, 171-193, 2002.

Saffer, D., Designing Gestural Interfaces: touchscreens and interactive devices, 1st ed., O'Reilly Media, 2008.

Stokoe, W. C., Sign language structure: An outline of the visual communication system of the American deaf, Studies in Linguistics: Occasional Papers 8, 1960.

Sutton, V., Signwriting for everyday use, Newport Beach: Sutton Movement Writing Press, 1981.

Witkin, A. and Kass, M., "Spacetime constraints", Proceedings of SIGGRAPH, New York, USA, (pp. 159-168), 1988.

Wu, Y., Lin, J. Y., and Huang, T. S., "Capturing natural hand articulation", Proceedings of 8th International Conference on Computer Vision, 2, (pp. 426-432), 2001.

Wobbrock, J. O., Morris, M. R. and Wilson, A. D., "User-defined gestures for surface computing", Proceedings of the 27th International Conference on Human Factors in Computing Systems, (pp. 1083-1092), 2009.