 Title & Authors
Gesture based Natural User Interface for e-Training
Lim, C.J.; Lee, Nam-Hee; Jeong, Yun-Guen; Heo, Seung-Il
 Abstract
Objective: This paper describes the process and results of developing a gesture recognition-based natural user interface (NUI) for a vehicle maintenance e-Training system. Background: E-Training refers to education and training that builds and improves the capabilities necessary to perform tasks by using information and communication technologies (simulation, 3D virtual reality, and augmented reality), devices (PC, tablet, smartphone, and HMD), and environments (wired/wireless internet and cloud computing). Method: Palm movement captured by a depth camera is used as a pointing device, and finger movement, extracted with the OpenCV library, serves as the selection protocol. Results: The proposed NUI allows trainees to control objects, such as cars and engines, on a large screen through gesture recognition. In addition, it includes a learning environment for understanding the procedure for assembling or disassembling certain parts. Conclusion: Future work concerns extending the gesture recognition technology to multiple simultaneous trainees. Application: The results of this interface can be applied not only to e-Training systems but also to other systems, such as digital signage, tangible games, and 3D content control.
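The pointing/selection protocol in the Method section can be sketched as follows. This is a minimal illustration, not the authors' implementation: all names (`palm_to_screen`, `SelectionProtocol`) are hypothetical, the frame and screen resolutions are assumed values, and the depth-camera hand segmentation and OpenCV finger extraction are replaced by an already-computed finger count fed in per frame.

```python
def palm_to_screen(palm_xy, frame_size=(640, 480), screen_size=(1920, 1080)):
    """Map a palm position in camera-frame coordinates to large-screen
    cursor coordinates (assumed resolutions; clamped to the screen)."""
    fx, fy = frame_size
    sw, sh = screen_size
    x = int(palm_xy[0] * sw / fx)
    y = int(palm_xy[1] * sh / fy)
    return (max(0, min(x, sw - 1)), max(0, min(y, sh - 1)))


class SelectionProtocol:
    """Fire one 'select' event when the tracked hand stays closed
    (fist, i.e. <= 1 extended finger) for `hold` consecutive frames;
    the hand must reopen before another selection can fire."""

    def __init__(self, hold=3):
        self.hold = hold
        self.closed_frames = 0
        self.armed = True  # re-armed only after the hand reopens

    def update(self, finger_count):
        if finger_count <= 1:              # fist: candidate selection gesture
            self.closed_frames += 1
            if self.armed and self.closed_frames >= self.hold:
                self.armed = False
                return True                # selection event
        else:                              # open hand: pointing mode
            self.closed_frames = 0
            self.armed = True
        return False
```

The frame-count debounce guards against spurious selections from a single misclassified frame, which matters when finger extraction from depth data is noisy.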
 Keywords
Gesture recognition; NUI (Natural User Interface); e-Training; Gesture protocol; Tracking system
 Language
Korean