User-Defined Hand Gestures for Small Cylindrical Displays


  • Received : 2017.01.04
  • Accepted : 2017.02.23
  • Published : 2017.03.28


This paper elicits user-defined hand gestures for small cylindrical displays built from flexible displays, a form factor that has not yet appeared as a product. We first defined the size and functions of a small cylindrical display and derived the tasks needed to operate those functions. We then built an experimental environment that approximates real cylindrical-display use by developing both a virtual cylindrical display interface and a physical object for manipulating it. For each task, we showed participants the result of the task on the virtual cylindrical display and asked them to define the hand gesture they considered most suitable for it. We selected a representative gesture for each task by choosing the gesture proposed by the largest group of participants, and we also computed an agreement score for each task. Finally, by analyzing the proposed gestures and the participant interviews, we characterized the mental models participants applied when defining the gestures.
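The per-task agreement score mentioned above is, in the gesture-elicitation literature, typically computed with Wobbrock et al.'s guessability formula: for a task t, A_t is the sum over each group of identical proposals P_i of (|P_i| / |P_t|)^2, where P_t is the full set of proposals for that task. A minimal sketch, assuming this standard formula (the labels and counts below are illustrative, not data from the study):

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one task (Wobbrock et al. style):
    sum over proposal groups of (group size / total proposals)^2.
    Identical proposals are grouped together; a score of 1.0 means
    every participant proposed the same gesture."""
    n = len(proposals)
    groups = Counter(proposals)  # group identical proposals
    return sum((size / n) ** 2 for size in groups.values())

# Hypothetical task: 10 participants propose gestures for "rotate view"
gestures = ["twist"] * 6 + ["swipe"] * 3 + ["shake"]
print(round(agreement_score(gestures), 2))  # (0.6^2 + 0.3^2 + 0.1^2) = 0.46
```

A higher score indicates stronger consensus, which is why the gesture of the largest group is a natural choice for the representative gesture of each task.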


Cylindrical Display; Flexible Display; Hand Gesture; User-Defined Gesture; User Interface


Grant : Development of UI/UX and Content Authoring Tools for Small Flexible Displays

Supported by : Korea Creative Content Agency (KOCCA)

