Conditions of Applications, Situations and Functions Applicable to Gesture Interface
 Title & Authors
Ryu, Tae-Beum; Lee, Jae-Hong; Song, Joo-Bong; Yun, Myung-Hwan
Objective: This study developed a hierarchy of the conditions of applications (devices), situations, and functions that are applicable to gesture interfaces.

Background: Gesture interface is one of the promising interfaces for natural and intuitive interaction with intelligent machines and environments. Although many studies have developed new gesture-based devices and gesture interfaces, little is known about which applications, situations, and functions are suitable for gesture interfaces.

Method: This study surveyed about 120 papers on the design and application of gesture interfaces and gesture vocabularies to identify the conditions under which applications, situations, and functions are applicable to gesture interfaces. The conditions extracted from 16 closely related papers were rearranged, and a hierarchy of them was developed to evaluate the applicability of applications, situations, and functions to gesture interfaces.

Results: This study summarized 10, 10, and 6 conditions for applications, situations, and functions, respectively. In addition, gesture-applicability condition hierarchies for applications, situations, and functions were developed based on the semantic similarity, ordering, and serial or parallel relationships among the conditions.

Conclusion: This study collected the gesture-applicability conditions of applications, situations, and functions, and developed a hierarchy of them to evaluate the applicability of gesture interfaces.

Application: The gesture-applicability conditions and hierarchy can be used to develop a framework and detailed criteria for evaluating the applicability of applications, situations, and functions. Moreover, they can help designers of gesture interfaces and vocabularies determine which applications, situations, and functions are suitable for gesture interfaces.
Keywords: Gesture interface; Applicability; Gesture application; Situation; Functions
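The hierarchy the abstract describes (conditions grouped by semantic similarity and evaluated per application, situation, or function) can be illustrated with a minimal sketch. The condition names, class, and scoring rule below are hypothetical illustrations, not the paper's actual items or evaluation method; it simply shows one way such a hierarchy could be represented and used to score a candidate's applicability.

```python
# Hypothetical sketch of a gesture-applicability condition hierarchy.
# The condition names and the leaf-fraction scoring rule are illustrative
# assumptions, not the conditions or criteria defined in the paper.
from dataclasses import dataclass, field


@dataclass
class Condition:
    """A node in the condition hierarchy; leaves are concrete conditions."""
    name: str
    children: list["Condition"] = field(default_factory=list)

    def leaves(self) -> list[str]:
        """Names of all leaf conditions under this node."""
        if not self.children:
            return [self.name]
        return [leaf for child in self.children for leaf in child.leaves()]

    def score(self, satisfied: set[str]) -> float:
        """Fraction of leaf conditions that a candidate satisfies."""
        leaves = self.leaves()
        return sum(1 for name in leaves if name in satisfied) / len(leaves)


# Illustrative fragment of an "application (device)" branch.
application = Condition("application", [
    Condition("hands are otherwise occupied"),
    Condition("display", [
        Condition("large display"),
        Condition("distant display"),
    ]),
])

# Score a hypothetical candidate device: 2 of 3 leaf conditions satisfied.
print(application.score({"large display", "hands are otherwise occupied"}))
```

Grouping related conditions under intermediate nodes (here, "display") mirrors the semantic-similarity grouping the abstract mentions; a real evaluation framework would also need the ordering and serial/parallel relationships the authors describe.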
References
Bhuiyan, M. and Picking, R., "Gesture-controlled user interfaces, what have we done and what's next?", 5th Collaborative Research Symposium on Security, E-Learning, Internet and Networking (pp. 59-60), Darmstadt, Germany, 2009.

Bhuiyan, M. and Picking, R., A gesture controlled user interface for inclusive design and evaluative study of its usability, Journal of Software Engineering and Applications, 4, 513-521, 2011.

Blatt, L. and Schell, A., "Gesture Set Economics for Text and Spreadsheet Editors", Proceedings of the Human Factors and Ergonomics Society 34th Annual Meeting (pp. 410-414), Orlando, FL, 1990.

Guo, C. and Sharlin, E., "Exploring the use of tangible user interfaces for human-robot interaction: a comparative study", CHI '08: Proceedings of the Twenty-sixth Annual SIGCHI Conference on Human Factors in Computing Systems (pp. 121-130), Florence, Italy, 2008.

Hummels, C. and Stappers, P. J., "Meaningful gestures for human computer interaction: beyond hand postures", Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition (pp. 591-596), Nara, Japan, 1998.

Hurtienne, J., Stößel, C. and Sturm, C., Physical gestures for abstract concepts: Inclusive design with primary metaphors, Interacting with Computers, 22(6), 475-484, 2010.

Jia, P., Hu, H., Lu, T. and Yuan, K., Head gesture recognition for hands-free control of an intelligent wheelchair, Industrial Robot: An International Journal, 34(1), 60-68, 2007.

Kela, J., Korpipaa, P., Mäntyjarvi, J., Kallio, S., Savino, G., Jozzo, L. and Marca, D., Accelerometer-based gesture control for a design environment, Personal and Ubiquitous Computing, 10(5), 285-299, 2006.

Kuhnel, C., Westermann, T. and Hemmert, F., I'm home: Defining and evaluating a gesture set for smart-home control, International Journal of Human-Computer Studies, 69(11), 693-704, 2011.

Li, J., Communication of Emotion in Social Robots through Simple Head and Arm Movements, International Journal of Social Robotics, 3, 125-142, 2010.

Mitra, S. and Acharya, T., Gesture recognition: A survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 37(3), 311-324, 2007.

Nielsen, M., Storring, M., Moeslund, T. B. and Granum, E., "A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction", Proceedings of the 5th International Gesture Workshop (pp. 1-12), Aalborg, Denmark, 2003.

Nishikawa, A., Hosoi, T., Koara, K., Negoro, D., Hikita, A., Asano, S., Kakutani, H., Miyazaki, F., Sekimoto, M., Yasui, M., Miyake, Y., Takiguchi, S. and Monden, M., FAce MOUSe: A novel human-machine interface for controlling the position of a laparoscope, IEEE Transactions on Robotics and Automation, 19(5), 825-841, 2003.

Nesselrath, R., Lu, C., Schulz, C. H., Frey, J. and Alexandersson, J., A gesture based system for context-sensitive interaction with smart homes, In R. Wichert and B. Eberhardt (Eds.), Advanced Technologies and Societal Change, Springer, Berlin, 209-219, 2011.

Oviatt, S., DeAngeli, A. and Kuhn, K., Integration and synchronization of input modes during multimodal human-computer interaction. Referring Phenomena in a Multimedia Context and their Computational Treatment, 1-13, 1997.

Oviatt, S., Ten Myths of Multimodal Interaction. Communications of the ACM, 42(11), 74-81, 1999.

Rauschert, I., Agrawal, P., Sharma, R., Fuhrmann, S., Brewer, I. and MacEachren, A. M., "Designing a human-centered, multimodal GIS interface to support emergency management", Proceedings of the 10th ACM International Symposium on Advances in Geographic Information Systems (pp. 119-124), McLean, VA, 2002.

Rico, J., "Usable gestures for mobile interfaces: evaluating social acceptability", Proceedings of the 28th International Conference on Human Factors in Computing Systems (pp. 887-896), Atlanta, GA, 2010.

Ronkainen, S., Koskinen, E., Liu, Y. and Korhonen, P., Environment Analysis as a Basis for Designing Multimodal and Multidevice User Interfaces, Human-Computer Interaction, 25(2), 148-193, 2010.

Rhyne, J., Dialogue Management for Gestural Interfaces, Computer Graphics, 21(2), 137-142, 1987.

Shan, C., Gesture Control for Consumer Electronics, Multimedia Interaction and Intelligent User Interfaces, 107-128, 2010.

Wachs, J., Kolsch, M. and Stern, H., Vision-based hand-gesture applications. Communications of the ACM, 54(2), 60-71, 2011.

Wickens, C. D. and Hollands, J. G., Engineering psychology and human performance, 3rd ed., Prentice Hall, 1999.

Wilson, A. and Oliver, N., "GWindows: Towards Robust Perception-Based UI", First IEEE Workshop on Computer Vision and Pattern Recognition for Human Computer Interaction, 2003.

Young, J., Sung, J., Voida, A. and Sharlin, E., Evaluating human-robot interaction, International Journal of Social Robotics, 3, 53-67, 2011.