

How to Reflect User's Intention to Improve Virtual Object Selection Task in VR

  • Kim, Chanhee (Department of Computer Science, Hanyang University) ;
  • Nam, Hyeongil (Department of Computer Science, Hanyang University) ;
  • Park, Jong-Il (Department of Computer Science, Hanyang University)
  • Received : 2021.09.06
  • Reviewed : 2021.11.10
  • Published : 2021.11.30

Abstract


This paper proposes a method that prioritizes the virtual objects to be selected by considering both the geometric relationship between the user's hand and the virtual objects and the user's intention, which is inferred in advance from the context of the VR (virtual reality) content. Picking up a virtual object is an essential and the most frequently used interaction in VR content. When virtual objects are located close to each other in a VR environment, a virtual object other than the one the user intended may be grabbed. To address this issue, the proposed method assigns separate weights to the user's intention and to the distance between the user's hand and each virtual object, and derives a selection priority so that the interaction occurs appropriately for the situation. Experiments were conducted under conditions in which the number of virtual objects and the distances between them were varied. The results show that when the virtual objects are densely placed and close to one another, increasing the weight of context awareness raises user satisfaction by 20.34%, demonstrating the effectiveness of the proposed method. We expect the proposed method to contribute to improving interaction techniques that can reflect the user's intention.
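The abstract describes a weighted prioritization scheme: each candidate object receives a score that combines a context-derived intention estimate with the proximity of the user's hand, and the object with the highest score is selected. The sketch below illustrates this idea only; it is not the authors' implementation, and the weight values, the intention scores, and all names (VirtualObject, select_object, and so on) are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    position: tuple          # (x, y, z) in world coordinates
    intention_score: float   # 0..1, assumed to come from prior context awareness
                             # of the VR content (hypothetical input)

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def select_object(hand_pos, objects, w_intention=0.7, w_distance=0.3, max_range=1.0):
    """Return the object with the highest weighted priority.

    Distance is converted to a 0..1 proximity score so it can be combined
    with the intention score; the weights control how strongly context
    awareness dominates over plain hand-object proximity.
    """
    best, best_score = None, -1.0
    for obj in objects:
        d = distance(hand_pos, obj.position)
        proximity = max(0.0, 1.0 - d / max_range)   # 1 at the hand, 0 beyond max_range
        score = w_intention * obj.intention_score + w_distance * proximity
        if score > best_score:
            best, best_score = obj, score
    return best

# Usage example: two objects close together; context suggests the cup is the target,
# so it wins even though the bottle is slightly nearer to the hand.
objects = [
    VirtualObject("cup",    (0.10, 0.00, 0.30), intention_score=0.9),
    VirtualObject("bottle", (0.08, 0.00, 0.28), intention_score=0.2),
]
print(select_object(hand_pos=(0.0, 0.0, 0.25), objects=objects).name)  # -> "cup"
```

Raising w_intention relative to w_distance mirrors the experimental finding above that a larger context-awareness weight helps when objects are densely packed and close together.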

Keywords

Acknowledgments

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (NRF-2019R1A4A1029800).
