 Title & Authors
User Needs of Three Dimensional Hand Gesture Interfaces in Residential Environment Based on Diary Method
Jeong, Dong Yeong; Kim, Heejin; Han, Sung H.; Lee, Donghun;
 Abstract
The aim of this study is to identify users' needs for a 3D hand gesture interface in the smart home environment: which objects users want to control with such an interface, and why they want to use it. 3D hand gesture interfaces have been studied for application to various devices in smart environments because they enable users to control the environment with natural and intuitive hand gestures. Given these advantages, identifying users' needs for a 3D hand gesture interface can improve the user experience of a product. The study was conducted with 20 participants using a diary method. For one week, participants recorded their needs for a 3D hand gesture interface in diary forms covering who, when, where, what, and how they would use the interface, with each entry accompanied by a usefulness score. A total of 322 entries (209 normal entries and 113 error entries) were collected. Several target objects and reasons were common across users: participants most often wanted to use a 3D hand gesture interface to control the lights, and the most common reason was to overcome hand restrictions. These results offer valuable insights for researchers and designers, supporting effective and efficient studies of 3D hand gesture interfaces, and could also inform guidelines for their design.
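The diary form described above (who, when, where, what, and how, plus a usefulness score per entry) lends itself to a simple tally. The sketch below uses hypothetical entries and field names (not the study's actual data) to show how the most frequently requested object and its average usefulness score could be computed:

```python
from collections import Counter
from statistics import mean

# Hypothetical diary entries following the study's form:
# who / when / where / what (target object) / how (reason) / usefulness score.
entries = [
    {"who": "P01", "when": "evening", "where": "bedroom",
     "what": "light", "how": "hands restricted", "usefulness": 6},
    {"who": "P02", "when": "morning", "where": "living room",
     "what": "TV", "how": "out of reach", "usefulness": 5},
    {"who": "P03", "when": "night", "where": "bedroom",
     "what": "light", "how": "hands restricted", "usefulness": 7},
]

# Tally which objects users most want to control with gestures.
object_counts = Counter(e["what"] for e in entries)

# Average usefulness score per target object.
avg_usefulness = {
    obj: mean(e["usefulness"] for e in entries if e["what"] == obj)
    for obj in object_counts
}

print(object_counts.most_common(1))  # most frequently named object and its count
print(avg_usefulness)
```

With real diary data, the same counting over the "how" field would surface the dominant reasons (e.g., hand restrictions) alongside the dominant objects.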
 Keywords
Gesture Interface; User Needs; Diary Method
 Language
Korean
 References
1.
Beigl, M. (1999), Point and click-interaction in smart environments, In Handheld and Ubiquitous Computing, 311-313.

2.
Bolger, N., Davis, A., and Rafaeli, E. (2003), Diary methods: Capturing life as it is lived, Annual Review of Psychology, 54(1), 579-616.

3.
Boyce, C. and Neale, P. (2006), Conducting in-depth interviews: A guide for designing and conducting in-depth interviews for evaluation input, 3-7, Pathfinder International, Watertown, MA, USA.

4.
Chantasuban, S. and Thiemjarus, S. (2009), UbiBand: A framework for music composition with BSNs, In Wearable and Implantable Body Sensor Networks, BSN 2009, Sixth International Workshop on, 267-272.

5.
Chua, C. S., Guan, H., and Ho, Y. K. (2002), Model-based 3D hand posture estimation from a single 2D image, Image and Vision Computing, 20(3), 191-202.

6.
Choi, E., Kwon, S., Lee, D., Lee, H., and Chung, M. K. (2012), Design of Hand Gestures for Smart Home Appliances based on a User Centered Approach, Journal of the Korean Institute of Industrial Engineers, 38(3), 182-190.

7.
Cook, D. J. and Das, S. K. (2007), How smart are our environments? An updated look at the state of the art, Pervasive and Mobile Computing, 3(2), 53-73.

8.
Cook, D. J., Youngblood, M., and Das, S. K. (2006), A multi-agent approach to controlling a smart environment, In Designing smart homes, 165-182.

9.
Dey, A. K., Abowd, G. D., and Salber, D. (2000), A context-based infrastructure for smart environments, In Managing Interactions in Smart Environments, 114-128.

10.
Essa, I. A. (2000), Ubiquitous sensing for smart and aware environments, Personal Communications, IEEE, 7(5), 47-49.

11.
Fang, Y., Wang, K., Cheng, J., and Lu, H. (2007), A real-time hand gesture recognition method, In Multimedia and Expo, 2007 IEEE International Conference on, 995-998.

12.
Guan, H., Feris, R. S., and Turk, M. (2006), The isometric self-organizing map for 3D hand pose estimation, In Automatic Face and Gesture Recognition, FGR 2006, 7th International Conference on, 263-268.

13.
Guion, L. A., Diehl, D. C., and McDonald, D. (2011), Conducting an in-depth interview.

14.
Henze, N., Löcken, A., Boll, S., Hesselmann, T., and Pielot, M. (2010), Free-hand gestures for music playback: deriving gestures with a user-centred process, In Proceedings of the 9th International Conference on Mobile and Ubiquitous Multimedia, 16.

15.
Hyun, H. and Han, T. (2011), User Experience-Centered Product Design Concept by User Activities, Journal of Korea Entertainment Industry Association, 2, 121-128.

16.
Ionescu, D., Ionescu, B., Gadea, C., and Islam, S. (2011), An intelligent gesture interface for controlling TV sets and set-top boxes, In Applied Computational Intelligence and Informatics (SACI), 2011 6th IEEE International Symposium on, 159-164.

17.
Jaijongrak, V. R., Chantasuban, S., and Thiemjarus, S. (2009), Towards a BSN-based gesture interface for intelligent home applications, In ICCAS-SICE, 5613-5617.

18.
Krumm, J., Shafer, S., and Wilson, A. (2001), How a smart environment can use perception, In Workshop on Sensing and Perception for Ubiquitous Computing (part of UbiComp 2001).

19.
Kühnel, C., Westermann, T., Hemmert, F., Kratz, S., Müller, A., and Möller, S. (2011), I'm home: Defining and evaluating a gesture set for smart-home control, International Journal of Human-Computer Studies, 69(11), 693-704.

20.
Lee, D., Lee, H., and Chung, M. K. (2011), An Analysis of Time Use on Activities of Daily Living: Considering Korean Adults in Seoul, Journal of the Korean Institute of Industrial Engineers, 37(2), 105-117.

21.
Legard, R., Keegan, J., and Ward, K. (2003), In-depth interviews, Qualitative Research Practice: A Guide for Social Science Students and Researchers, 138-169.

22.
Litosseliti, L. (2003), Using focus groups in research, Bloomsbury Publishing.

23.
Manresa, C., Varona, J., Mas, R., and Perales, F. (2005), Hand tracking and gesture recognition for human-computer interaction, Electronic letters on computer vision and image analysis, 5(3), 96-104.

24.
Nielsen, M., Störring, M., Moeslund, T. B., and Granum, E. (2004), A procedure for developing intuitive and ergonomic gesture interfaces for HCI, In Gesture-Based Communication in Human-Computer Interaction, 409-420.

25.
Powell, R. A. and Single, H. M. (1996), Focus groups, International Journal for Quality in Health Care, 8(5), 499-504.

26.
Rahman, A. S. M., Hossain, M. A., Parra, J., and El Saddik, A. (2009), Motion-path based gesture interaction with smart home services, In Proceedings of the 17th ACM international conference on Multimedia, 761-764.

27.
Reis, H. T. (1994), Domains of experience: Investigating relationship processes from three perspectives, Theoretical Frameworks for Personal Relationships, 87-110.

28.
Reis, H. T. and Gable, S. L. (2000), Event-sampling and other methods for studying everyday experience, Handbook of research methods in social and personality psychology, 190-222.

29.
Singla, G., Cook, D. J., and Schmitter-Edgecombe, M. (2010), Recognizing independent and joint activities among multiple residents in smart environments, Journal of Ambient Intelligence and Humanized Computing, 1(1), 57-63.

30.
Sriboonruang, Y., Kumhom, P., and Chamnongthai, K. (2006), Visual hand gesture interface for computer board game control, In Consumer Electronics, ISCE'06, 2006 IEEE Tenth International Symposium on, 1-5.

31.
Stern, H. I., Wachs, J. P., and Edan, Y. (2008), Optimal consensus intuitive hand gesture vocabulary design, In Semantic Computing, 2008 IEEE International Conference on, 96-103.

32.
Trochim, W. and Donnelly, J. P. (2007), The Research Methods Knowledge Base, Third ed., Cengage Learning, Mason, OH, USA.

33.
Várkonyi-Kóczy, A. R. and Tusor, B. (2011), Human computer interaction for smart environment applications using fuzzy hand posture and gesture models, Instrumentation and Measurement, IEEE Transactions on, 60(5), 1505-1514.

34.
Vatavu, R. D. (2012), User-defined gestures for free-hand TV control, In Proceedings of the 10th European conference on Interactive tv and video, 45-48.

35.
Wachs, J., Stern, H., Edan, Y., Gillam, M., Feied, C., Smith, M., and Handler, J. (2006), A real-time hand gesture interface for medical visualization applications, In Applications of Soft Computing, 153-162.

36.
Wachs, J. P., Stern, H. I., Edan, Y., Gillam, M., Handler, J., Feied, C., and Smith, M. (2008), A gesture-based tool for sterile browsing of radiology images, Journal of the American Medical Informatics Association, 15(3), 321-323.

37.
Yamamoto, Y., Yoda, I., and Sakaue, K. (2004), Arm-pointing gesture interface using surrounded stereo cameras system, In Pattern Recognition, ICPR 2004, Proceedings of the 17th International Conference on, 4, 965-970.

38.
Yoo, H. and Pan, Y. (2013), Video observation method of multiple viewpoints for service design, Journal of Korean Society Design Science, 26(1), 193-210.

39.
Zhu, C. and Sheng, W. (2011), Wearable sensor-based hand gesture and daily activity recognition for robot-assisted living, Systems, Man and Cybernetics, Part A: Systems and Humans, IEEE Transactions on, 41(3), 569-573.