
Controlling Position of Virtual Reality Contents with Mouth-Wind and Acceleration Sensor

  • Received : 2019.02.14
  • Accepted : 2019.04.01
  • Published : 2019.04.30

Abstract

In this paper, we propose a new framework to control VR (virtual reality) contents in real time using the user's mouth-wind and the acceleration sensor of a mobile device. User interaction technology is important in VR, but the variety of user interface methods is still lacking: most existing interaction techniques rely on hand touch, screen touch, or motion recognition. We propose a new interface technology that interacts with VR contents in real time by combining the user's mouth-wind with the acceleration sensor. The direction of the mouth-wind is determined from the angle and position between the user and the mobile device, and the control position is adjusted using the device's acceleration sensor. Noise in the magnitude of the mouth-wind is refined with a simple average filter. To demonstrate the effectiveness of the proposed technique, we show results of interacting with game and simulation contents in real time by applying the control position and the mouth-wind external force to them.
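The abstract mentions refining the noisy mouth-wind magnitude with a simple average filter. A minimal sketch of such a moving-average filter, assuming the microphone delivers a stream of raw magnitude samples (the function name `smooth` and the sample values are illustrative, not from the paper):

```python
from collections import deque

def smooth(samples, window=5):
    """Simple moving-average filter: each output value is the mean
    of up to the last `window` raw magnitude samples."""
    buf = deque(maxlen=window)
    out = []
    for s in samples:
        buf.append(s)               # oldest sample drops out automatically
        out.append(sum(buf) / len(buf))
    return out

# Hypothetical noisy mouth-wind magnitude readings
raw = [0.0, 0.9, 0.1, 0.8, 0.2, 0.85, 0.15]
print(smooth(raw, window=3))
```

A larger `window` suppresses more noise but delays the response to a sudden blow, so in an interactive setting the window would be kept small.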

Keywords

Fig. 1. Military training and medical system with VR.

Fig. 2. Various touch patterns.

Fig. 3. Leap Motion [1] and Myo gesture control armband [14].

Fig. 4. Various angle types, based on the angle between the mobile device and the view vector.

Fig. 5. Wind direction classified according to three angle types (red arrow: wind direction).

Fig. 6. Influence of gravity depending on the mobile device's condition.

Fig. 7. Rotation angle of the mobile device.

Fig. 8. Angle calculation from the axes of the acceleration sensor.

Fig. 9. Control position changes according to orientation.

Fig. 10. Magnitude of the refined mouth-wind (X-axis: time, Y-axis: magnitude of sound).

Fig. 11. Vector field diffused by mouth-wind.

Fig. 12. Smoke simulation diffused by mouth-wind.

Fig. 13. Smoke simulation diffused by multi-focus control position and mouth-wind.

Fig. 14. Movement of VR content using the proposed method (red cylinder: VR content).
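Figs. 7–9 indicate that the control position is derived from the device's rotation angle, computed from the acceleration sensor's axes. A minimal sketch of one common way to do this, assuming the device is roughly static so that gravity dominates the raw accelerometer reading (the standard atan2-based tilt estimate; the paper's exact formulation may differ):

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (in degrees) from raw accelerometer
    axis values, assuming gravity is the dominant measured force."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat, screen up: gravity acts on the z-axis only,
# so pitch and roll are both approximately zero.
pitch, roll = tilt_angles(0.0, 0.0, 9.81)
```

The resulting angles could then be mapped to an on-screen control position, e.g. by scaling pitch and roll to screen coordinates.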

References

  1. M. Buckwald. (2014) Leap motion. https://developer.leapmotion.com/documentation/.
  2. P. Trotta. (2018) Captoglove. https://www.captoglove.com/downloads/.
  3. V. Omni. (2018) Cyberith virtualizer. https://www.cyberith.com/research-development/.
  4. Facebook. (2018) Oculus go. https://developer.oculus.com/documentation/.
  5. W. Zhao, J. Chai, and Y.-Q. Xu, "Combining marker-based mocap and RGB-D camera for acquiring high-fidelity hand motion data," in Proceedings of the 11th ACM SIGGRAPH/Eurographics Conference on Computer Animation, 2012, pp. 33-42.
  6. T. Cakmak and H. Hager, "Cyberith virtualizer: a locomotion device for virtual reality," in ACM SIGGRAPH 2014 Emerging Technologies, 2014, p. 6.
  7. K. Hinckley, J. Pierce, M. Sinclair, and E. Horvitz, "Sensing techniques for mobile interaction," in Proceedings of the 13th annual ACM symposium on User interface software and technology, 2000, pp. 91-100.
  8. Sangtae Kim, Soobin Lee, and Sungkwan Jung, "Method for recognizing individual users using mobile device and 3d depth camera," Proceedings of Symposium of the Korean Institute of communications and Information Sciences, pp. 489-490, 2015.
  9. Jong-Hyun Kim, "Interaction technique in simulations using mouth-wind on mobile devices," Journal of the Korea Computer Graphics Society, vol. 24, no. 4, pp. 21-27, 2018. https://doi.org/10.15701/kcgs.2018.24.4.21
  10. I. E. Sutherland, "A head-mounted three dimensional display," in Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part I, 1968, pp. 757-764.
  11. C. Schissler, A. Nicholls, and R. Mehra, "Efficient HRTF-based spatial audio for area and volumetric sources," IEEE Transactions on Visualization and Computer Graphics, vol. 22, no. 4, pp. 1356-1366, 2016. https://doi.org/10.1109/TVCG.2016.2518134
  12. H. G. Hoffman, "Physically touching virtual objects using tactile augmentation enhances the realism of virtual environments," in Proceedings of the IEEE Virtual Reality Annual International Symposium, 1998, pp. 59-63.
  13. C. Carvalheiro, R. Nobrega, H. da Silva, and R. Rodrigues, "User redirection and direct haptics in virtual environments," in Proceedings of the 2016 ACM on Multimedia Conference, 2016, pp. 1146-1155.
  14. T. Labs. (2014) Myo gesture control armband. https://support.getmyo.com/hc/enus/categories/200376235-Developing-With-Myo.
  15. S. Rawat, S. Vats, and P. Kumar, "Evaluating and exploring the Myo armband," in System Modeling & Advancement in Research Trends (SMART), International Conference, 2016, pp. 115-120.
  16. Chung-Jae Lee, Jong-Hyun Kim, Jung Lee, and Sun-Jeong Kim, "Verification of the usefulness of smartphone for wrist swing motion in VR environments," Journal of Korea Game Society, vol. 17, no. 3, pp. 53-62, 2017. https://doi.org/10.7583/JKGS.2017.17.3.53
  17. J. Stam, "Stable fluids," in Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, ser. ACM SIGGRAPH, 1999, pp. 121-128.