Designing a Framework of Multimodal Contents Creation and Playback System for Immersive Textbook

실감형 교과서를 위한 멀티모달 콘텐츠 저작 및 재생 프레임워크 설계

  • 김석열 (Department of Computer Science, KAIST) ;
  • 박진아 (Department of Computer Science, KAIST)
  • Received : 2010.05.18
  • Accepted : 2010.08.11
  • Published : 2010.08.28


For virtual education, a multimodal learning environment with haptic feedback, termed an 'immersive textbook', is needed to enhance learning effectiveness. However, learning contents for the immersive textbook are not widely available due to constraints in the creation and playback environments. To address this problem, we propose a framework for producing and displaying multimodal contents for the immersive textbook. Our framework provides an XML-based meta-language for producing multimodal learning contents in the form of an intuitive script, so that users without prior knowledge of multimodal interaction can produce their own learning contents. The contents are then interpreted by a script engine and delivered to the user through visual and haptic rendering loops. We also implemented a prototype based on these proposals and performed a user evaluation to verify the validity of our framework.
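To make the pipeline in the abstract concrete, the following is a minimal sketch of how an XML-based scene script might be parsed and split into data for the visual and haptic rendering loops. All element and attribute names here (`scene`, `object`, `haptic`, `stiffness`, etc.) are illustrative assumptions; the paper's actual meta-language schema may differ.

```python
# Hypothetical example of an immersive-textbook scene script.
# The schema below is an assumption for illustration only.
import xml.etree.ElementTree as ET

SCRIPT = """
<scene name="heart-anatomy">
  <object id="heart" mesh="heart.obj">
    <visual texture="heart_diffuse.png"/>
    <haptic stiffness="0.8" friction="0.3"/>
  </object>
  <narration audio="heart_intro.wav" text="This is the human heart."/>
</scene>
"""

def parse_scene(xml_text):
    """Parse a scene script into plain dicts that rendering loops could consume."""
    root = ET.fromstring(xml_text)
    scene = {"name": root.get("name"), "objects": [], "narrations": []}
    for obj in root.findall("object"):
        entry = {"id": obj.get("id"), "mesh": obj.get("mesh")}
        visual = obj.find("visual")
        if visual is not None:
            # Visual parameters would feed the graphics rendering loop.
            entry["visual"] = dict(visual.attrib)
        haptic = obj.find("haptic")
        if haptic is not None:
            # Haptic parameters would feed the high-rate haptic rendering loop.
            entry["haptic"] = {k: float(v) for k, v in haptic.attrib.items()}
        scene["objects"].append(entry)
    for nar in root.findall("narration"):
        scene["narrations"].append(dict(nar.attrib))
    return scene

scene = parse_scene(SCRIPT)
print(scene["objects"][0]["haptic"]["stiffness"])  # 0.8
```

Separating the parsed script into per-modality parameter sets mirrors the framework's design, in which one authored document drives both rendering loops despite their very different update rates.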


Virtual Education;Multimodal Contents;Haptic Feedback;Immersive Textbook


Supported by: National IT Industry Promotion Agency (NIPA)


  1. R. Moreno and R. Mayer, “Interactive Multimodal Learning Environments,” Educational Psychology Review, Vol.19, No.3, pp.309-326, 2007.
  2. C. Teo, E. Burdet, and H. Lim, “A Robotic Teacher of Chinese Handwriting,” Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp.335-341, 2002.
  3. P. Persson, M. Cooper, L. Tibell, S. Ainsworth, A. Ynnerman, and B. Jonsson, “Designing and Evaluating a Haptic System for Biomolecular Education,” Proceedings of IEEE Virtual Reality Conference, pp.171-178, 2007.
  4. 박선영, 이준훈, 김현곤, 김영미, 최권영, 류제하, “A Basic Framework of Systems and Authoring Tools for Immersive Book,” Proceedings of HCI Korea 2009, pp.99-104, 2009.
  7. M. O'Malley and S. Hughes, “Simplified Authoring of 3D Haptic Content for the World Wide Web,” Proceedings of the 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp.428-429, 2003.
  8. L. Wei, A. Sourin, and O. Sourina, “Function-Based Haptic Interaction in Cyberworlds,” Proceedings of the 2007 International Conference on Cyberworlds, pp.225-232, 2007.
  9. M. Carrozzino, F. Tecchia, S. Bacinelli, C. Cappelletti, and M. Bergamasco, “Lowering the Development Time of Multimodal Interactive Application: The Real-life Experience of the XVR Project,” Proceedings of the 2005 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, pp.270-273, 2005.
  11. F. El-Far, M. Eid, M. Orozco, and A. El Saddik, “Haptic Applications Meta-Language,” Proceedings of the 10th IEEE International Symposium on Distributed Simulation and Real-Time Applications, pp.261-264, 2006.
  12. M. Eid, S. Andrews, A. Alamri, and A. El Saddik, “HAMLAT: A HAML-Based Authoring Tool for Haptic Application Development,” Lecture Notes in Computer Science, No.5024, pp.857-866, 2008.
  14. C. Ho, C. Basdogan, and M. Srinivasan, “Haptic Rendering: Point- and Ray-Based Interactions,” Proceedings of the Second PHANToM Users Group Workshop, 1997.
  15. 김석열, 박진아, “HaptiBody Navigator: An XML-based Multimodal Contents Playback System for Human Anatomy Learning,” Proceedings of HCI Korea 2010, pp.209-211, 2010.
  16. F. Conti, F. Barbagli, D. Morris, and C. Sewell, “CHAI: An Open-Source Library for the Rapid Development of Haptic Scenes,” World Haptics Conference, 2005.
  17. J. Park, M. Chung, S. Hwang, Y. Lee, D. Har, and H. Park, “Visible Korean Human: Improved Serially Sectioned Images of the Entire Body,” IEEE Transactions on Medical Imaging, Vol.24, No.3, pp.352-360, 2005.