Interacting with Touchless Gestures: Taxonomy and Requirements
 Title & Authors
Interacting with Touchless Gestures: Taxonomy and Requirements
Kim, Huhn
 Abstract
Objective: The aim of this study is to develop a taxonomy for classifying diverse touchless gestures and to establish the design requirements that should be considered when selecting suitable gestures in gesture-based interaction design. Background: The applicability of touchless gestures is increasing as the relevant technologies advance. However, before touchless gestures can be widely applied to various devices and systems, an understanding of the nature of human gestures and their standardization are prerequisites. Method: In this study, diverse gesture types reported in the literature were collected and, based on these, a new taxonomy for classifying touchless gestures was proposed. In addition, many gesture-based interaction design cases and studies were analyzed. Results: The proposed taxonomy consists of two dimensions: shape (deictic, manipulative, semantic, or descriptive) and motion (static or dynamic). The case analysis based on the taxonomy showed that manipulative and dynamic gestures are widely applied. Conclusion: The four core requirements for valuable touchless gestures are intuitiveness, learnability, convenience, and discriminability. Application: The gesture taxonomy can be applied to generate alternatives for applicable touchless gestures, and the four design requirements can be used as criteria for evaluating those alternatives.
 Keywords
Touchless or non-touch gestures; Gesture taxonomy; Gesture requirements; Gesture-based interaction
 Language
Korean
 Cited by
1.
Towards Establishing a Touchless Gesture Dictionary based on User Participatory Design, Journal of the Ergonomics Society of Korea, Vol. 31, No. 4, pp. 515-523, 2012.
 References
1.
Alpern, M. and Minardo, K., Developing a car gesture interface for use as a secondary task, CHI 2003, pp. 932-933, 2003.

2.
Bach, K. M., Jæger, M. G., Skov, M. B. and Thomassen, N. G., You Can Touch, but You Can't Look: Interacting with In-Vehicle Systems, CHI 2008, pp. 1139-1148, April 5-10, 2008.

3.
eyeSight, http://www.youtube.com/watch?v=FRJp7b-EFbI, 2010.

4.
Henze, N., Locken, A., Boll, S., Hesselmann, T. and Pielot, M., Free-Hand Gestures for Music Playback: Deriving Gestures with a User-Centred Process, MUM'10, December 1-3, 2010.

5.
Hitachi, http://www.youtube.com/watch?v=O21SYHDEPOs&feature=related, 2008.

6.
Karam, M. and Schraefel, M. C., A Taxonomy of Gestures in Human Computer Interaction, Technical Report ECSTR-IAM05-009, Electronics and Computer Science, University of Southampton, 2005.

7.
Kim, H. S., Hwang, S. W. and Moon, H. J., A Study on Vision Based Gesture Recognition Interface Design for Digital TV, Journal of Korean Society of Design Science, Vol. 20, No. 3, pp. 257-268, 2007.

8.
Kim, H. and Park, M. K., A Study on the Constitutional Elements of GUI that Induces Gestures in Touch Screen, Journal of the Korean Society of Design Culture, Vol. 15, No. 2, pp. 146-157, 2009.

9.
Kim, H. and Song, H. W., Towards Designing More Intuitive Touchless Operations based on Hand Gestures, Journal of Korean Society of Design Science, Vol. 25, No. 1, pp. 269-277, 2012.

10.
Mantyjarvi, J., Kela, J., Korpipaa, P. and Kallio, S., Enabling Fast and Effortless Customisation in Accelerometer based Gesture Interaction, MUM '04: Proceedings of the 3rd International Conference on Mobile and Ubiquitous Multimedia, pp. 25-31, 2004.

11.
Nam, J. Y., Choe, J. and Jung, E. S., Development of Finger Gestures for Touchscreen-based Web Browser Operation, Journal of the Ergonomics Society of Korea, Vol. 27, No. 4, pp. 109-117, 2008.

12.
Nielsen, M., Störring, M., Moeslund, T. B. and Granum, E., A Procedure for Developing Intuitive and Ergonomic Gesture Interfaces for HCI, In: Gesture-Based Communication in Human-Computer Interaction, pp. 105-106, 2004.

13.
Pavlovic, V. I., Sharma, R. and Huang, T. S., Visual Interpretation of Hand Gestures for Human-Computer Interaction, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, 1997.

14.
Pickering, C. A., Burnham, K. J. and Richardson, M. J., A Research Study of Hand Gesture Recognition Technologies and Applications for Human Vehicle Interaction, Automotive Electronics, 3rd Institution of Engineering and Technology Conference, pp. 1-15, June 28-29, 2007.

15.
Rahman, A. M., Hossain, M. A. and Parra, J., Motion-Path based Gesture Interaction with Smart Home Services, MM '09: Proceedings of the 17th ACM International Conference on Multimedia, pp. 761-764, 2009.

16.
Saffer, D., Designing Gestural Interfaces, O'Reilly Media, Inc., 2009.

17.
Vatavu, R. D. and Pentiuc, S. G., Multi-level Representation of Gesture as Command for Human Computer Interaction, Computing and Informatics, Vol. 27, pp. 1001-1015, 2008.

18.
Visteon, http://www.youtube.com/watch?v=dd8i1fD_ia8, 2010.

19.
Fourney, A., Terry, M. and Mann, R., http://www.youtube.com/watch?v=sjoRzqudssk, University of Waterloo, 2009.

20.
Zerlina, F. V., Huang, H., Ku, D. H., Lee, B. L. and Lin, Y., Gesture-based Interface Cognitive Work Evaluation in a Driving Context, IE486 Final Project, School of Industrial Engineering, Purdue University.