• Title/Summary/Keyword: remote rendering


On Design of A Remote Partitioned Rendering System Using A Game Engine (게임엔진 기반 원격 분할 렌더링 시스템의 설계)

  • Lim, Choong-Gyoo
    • Journal of Korea Game Society
    • /
    • v.19 no.5
    • /
    • pp.5-14
    • /
    • 2019
  • This paper proposes a remote partitioned rendering method for displaying 3D applications, developed with the Unity game engine, on large tiled displays. With the proposed method, such applications can be shown on a tiled display without additional development or any modification to their source code. The paper verifies the method's feasibility by applying it to a sample Unity-based application.
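The tile-by-tile partitioning described in the abstract above can be illustrated with a small sketch (this is not the paper's actual Unity implementation; the function name, tile indexing, and frustum bounds are hypothetical): each tile of the display wall renders the sub-frustum that covers its portion of the full view.

```python
# Illustrative sketch of partitioned rendering: split one camera frustum's
# near-plane rectangle into per-tile sub-frusta for a cols x rows wall.

def tile_frustum(left, right, bottom, top, cols, rows, col, row):
    """Return (left, right, bottom, top) near-plane bounds of the
    sub-frustum that tile (col, row) should render."""
    tile_w = (right - left) / cols
    tile_h = (top - bottom) / rows
    t_left = left + col * tile_w
    t_bottom = bottom + row * tile_h
    return (t_left, t_left + tile_w, t_bottom, t_bottom + tile_h)

# Example: the bottom-left tile of a 2x2 wall sharing one symmetric frustum.
print(tile_frustum(-1.0, 1.0, -1.0, 1.0, 2, 2, 0, 0))  # (-1.0, 0.0, -1.0, 0.0)
```

Each tile would pass its asymmetric bounds to its own projection matrix, so the wall as a whole reproduces the single original view.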

A Remote Partitioned Rendering System Using Direct3D (Direct3D 기반 원격 분할 렌더링 시스템)

  • Lim, Choong-Gyoo
    • Journal of Korea Game Society
    • /
    • v.18 no.1
    • /
    • pp.115-124
    • /
    • 2018
  • Various kinds of tile-based ultra-high-resolution display devices have been developed, for example by constructing display walls from many commodity LCD panels. To show 3D applications such as computer games on these devices, one normally has to develop the applications specifically for them or rely on special-purpose APIs. If instead a distributed rendering system is built on legacy 3D APIs such as OpenGL and Direct3D by extending a remote rendering system, commercial computer games can be shown on such display devices without modifying their source code. The purpose of this paper is to propose a new Direct3D-based distributed rendering system that extends a Direct3D-based remote rendering system, and to show its technical feasibility by applying it to a sample Direct3D application and performing a few experiments.

The Development of Device and the Algorithm for the Haptic Rendering (가상현실 역감구현을 위한 알고리즘과 장치개발)

  • 김영호;이경백;김영배
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2000.11a
    • /
    • pp.106-109
    • /
    • 2000
  • This virtual-reality haptic device is developed for work that humans cannot approach directly and that requires delicate manipulation. To render haptics, the total system consists of a master (the haptic device) and a slave (the remote manipulator). The human operates the remote manipulator relying on the haptic device and stereo graphics, and the forces and scene at the remote manipulator are fed back through the haptic and visual displays. The feedback information is used to obtain an accurate system gain, which delivers the most accurate haptics and scene to the human through real-time location tracking, graphic rendering, and the haptic rendering algorithm. In this research, a general-purpose 3D haptic device is developed that lets the human feel forces when contacting a virtual object rendered by computer graphics. The device is well suited to position tracking and to manufacturing owing to its structure. OpenGL and Visual Basic are used to implement the haptic rendering algorithms. The developed haptic device can interface not only with virtual reality but also with a real remote manipulator.


Implementation of AR Remote Rendering Techniques for Real-time Volumetric 3D Video

  • Lee, Daehyeon;Lee, Munyong;Lee, Sang-ha;Lee, Jaehyun;Kwon, Soonchul
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.12 no.2
    • /
    • pp.90-97
    • /
    • 2020
  • Recently, with the growth of the mixed reality industrial infrastructure, related convergence research has been proposed. Real-time mixed reality services such as remote video conferencing require research on real-time acquisition-processing-transfer methods. This paper aims to implement an AR remote rendering method for volumetric 3D video data. We propose and implement two modules: a parsing module that loads the volumetric 3D video into a game engine, and a server rendering module. The experiment showed that volumetric 3D video sequence data of about 15 MB was compressed by 6-7%. The remote module streamed at 27 fps at a 1200 by 1200 resolution. The results of this paper are expected to be applied to AR cloud services.

Development of a Remote Rendering System using Direct3D API (Direct3D API의 원격 실시간 실행 시스템 개발)

  • Lim, Choong-Gyoo
    • Journal of Korea Game Society
    • /
    • v.14 no.5
    • /
    • pp.117-126
    • /
    • 2014
  • A remote execution system for legacy 3D APIs has various applications. It can be used to implement a cloud gaming service based on real-time video streaming, or to implement GPU virtualization for simultaneously rendering many different 3D applications. The OpenGL API consists of independent global functions, while the Direct3D API consists of Microsoft COM-based interfaces and their member functions, which makes implementing a remote rendering system more difficult. The purpose of this paper is to show that the technology is applicable to any legacy 3D API by successfully designing and implementing a remote rendering system using the Direct3D API. The implementation is applied to a sample Direct3D application, and a few experiments are performed to show its technical feasibility.
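The general idea behind remote rendering of a 3D API, as surveyed in the abstracts above, is to intercept the application's API calls, serialize them, and replay them on the display side. Real systems hook Direct3D COM interfaces in C++; the sketch below is a hypothetical stand-in using plain Python proxies (all class and method names are made up for illustration).

```python
# Record-and-replay sketch of remote rendering: the client-side proxy
# captures every "API call" as a (name, args) command; the serialized
# command stream is replayed against a renderer on the display side.
import json

class RecordingProxy:
    """Client side: records every method call instead of executing it."""
    def __init__(self):
        self.commands = []
    def __getattr__(self, name):
        def call(*args):
            self.commands.append((name, list(args)))
        return call
    def serialize(self):
        return json.dumps(self.commands)

class LoggingRenderer:
    """Display side: a stub renderer that logs what it is asked to draw."""
    def __init__(self):
        self.log = []
    def __getattr__(self, name):
        def call(*args):
            self.log.append((name, list(args)))
        return call

def replay(payload, target):
    """Replay a serialized command stream on the display-side renderer."""
    for name, args in json.loads(payload):
        getattr(target, name)(*args)

client = RecordingProxy()
client.Clear(0, 0, 0)
client.DrawPrimitive("TRIANGLELIST", 0, 1)

server = LoggingRenderer()
replay(client.serialize(), server)
print(server.log)  # [('Clear', [0, 0, 0]), ('DrawPrimitive', ['TRIANGLELIST', 0, 1])]
```

For COM-based APIs like Direct3D, the interception point is the interface's method table rather than free functions, which is the extra difficulty the paper addresses.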

Development of Mobile 3D Urban Landscape Authoring and Rendering System

  • Lee Ki-Won;Kim Seung-Yub
    • Korean Journal of Remote Sensing
    • /
    • v.22 no.3
    • /
    • pp.221-228
    • /
    • 2006
  • In this study, an integrated 3D modeling and rendering system dealing with 3D urban landscape features such as terrain, buildings, roads, and user-defined geometric features was designed and implemented using the OpenGL ES (Embedded System) API for PDA-class mobile devices. The authoring functions cover several aspects of urban landscape features: vertex-based geometry modeling; editing and manipulating 3D landscape objects; generating geometrically complex feature types with attributes for 3D objects; and texture mapping of complex types using an image library. It is a feature-based system linked with 3D geo-based spatial feature attributes. For the rendering process, functions are provided for optimizing integrated multiple 3D landscape objects and for rendering texture-mapped 3D landscape objects. Through an active synchronization process among the desktop system, the OpenGL-based 3D visualization system, and the mobile system, 3D feature models can be transferred and disseminated across both systems. The main graphical user interface and core components of this mobile 3D urban processing system are implemented under EVC 4.0 MFC and tested on PDAs running Windows Mobile and Pocket PC. It is expected that mobile 3D geo-spatial information systems supporting registration, modeling, and rendering functions can be effectively utilized for real-time 3D urban planning and 3D mobile mapping in the field.

Remote Control of a Mobile Robot using Haptic Device (촉각 정보를 이용한 이동로봇의 원격제어)

  • 권용태;강희준;노영식
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2004.10a
    • /
    • pp.737-741
    • /
    • 2004
  • A mobile robot system is developed that is remotely controlled by a haptic master called 'PHANTOM'. The mobile robot has four ultrasonic sensors and a single CCD camera, which detect the distance from the robot to obstacles in the environment and send this information to the haptic master. For more convenient remote control, the haptic rendering process generates forces such as viscosity forces and obstacle-avoidance forces. To show the effectiveness of the developed system, the mobile robot is driven through a maze and the time to complete the path is measured with and without the haptic information. Through these repeated experiments, the haptic information proves to be useful for remote control of a mobile robot.

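The viscosity and obstacle-avoidance forces mentioned in the abstract above can be sketched with a simple potential-field-style rule (an illustrative formula only, not the authors' actual control law; the gains `b`, `k` and the safe distance `d_safe` are made-up parameters): a damping force opposes the commanded velocity, and a repulsive force grows as the measured obstacle distance shrinks below a threshold.

```python
def haptic_feedback(velocity, obstacle_dist, b=0.5, k=2.0, d_safe=1.0):
    """Force felt at the haptic master (1-D illustration).

    velocity      : commanded robot velocity from the operator
    obstacle_dist : distance to the nearest obstacle (e.g. ultrasonic reading)
    """
    # Viscous force: always opposes the operator's motion.
    f_viscous = -b * velocity
    # Repulsive force: active only inside the safe distance, grows as 1/d.
    if obstacle_dist < d_safe:
        f_obstacle = k * (1.0 / obstacle_dist - 1.0 / d_safe)
    else:
        f_obstacle = 0.0
    return f_viscous + f_obstacle

print(haptic_feedback(1.0, 2.0))  # far from obstacles: pure viscosity, -0.5
print(haptic_feedback(0.0, 0.5))  # close to an obstacle: repulsion, 2.0
```

The `1/d` repulsion term follows the classic artificial-potential-field form; the paper itself only names the two force types, so the exact shaping here is an assumption.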

Real-Time Haptic Rendering for Tele-operation with Varying Communication Time Delay (가변적인 통신지연시간을 갖는 원격 작업 환경을 위한 실시간 햅틱 렌더링)

  • Lee, K.;Chung, S.Y.
    • Journal of Power System Engineering
    • /
    • v.13 no.2
    • /
    • pp.71-82
    • /
    • 2009
  • This paper presents a real-time haptic rendering method for realistic force feedback in a remote environment with a varying communication time delay. The remote environment is assumed to be a virtual environment based on computer graphics, for example an on-line shopping mall, an internet game, or cyber-education. The properties of a virtual object, such as stiffness and viscosity, are assumed to be unknown because they change with the contact position and/or the penetration depth into the object. A DARMAX-model-based output estimator is proposed to trace the correct impedance of the virtual object in real time. The estimator is built on the input-output relationship and can trace the varying impedance in real time by virtue of a P-matrix resetting algorithm. It traces the correct impedance by using white noise that prevents biased input-output information. Realistic output forces are generated in real time from the inputs and the estimated impedance, even though the communication time delay and the impedance of the virtual object are unknown and changing. The generated forces trace the analytical forces computed from the virtual model of the remote environment. Performance is demonstrated by experiments with a 1-DOF haptic device and a spring-damper-based virtual model.

  • PDF
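The impedance-tracing idea in the abstract above can be illustrated with plain recursive least squares (RLS). This is a generic sketch, not the paper's DARMAX estimator or its P-matrix resetting scheme: it recovers an unknown stiffness k and damping b from measured penetration x, velocity v, and force f = k*x + b*v, updating the 2x2 covariance matrix P at every step.

```python
import random

def rls_step(theta, P, phi, y, lam=0.98):
    """One recursive-least-squares update with forgetting factor lam.
    theta = [k, b] estimate, P = 2x2 covariance, phi = [x, v], y = force."""
    # P * phi
    Pphi = [P[0][0]*phi[0] + P[0][1]*phi[1],
            P[1][0]*phi[0] + P[1][1]*phi[1]]
    denom = lam + phi[0]*Pphi[0] + phi[1]*Pphi[1]
    K = [Pphi[0]/denom, Pphi[1]/denom]          # gain vector
    err = y - (theta[0]*phi[0] + theta[1]*phi[1])  # prediction error
    theta = [theta[0] + K[0]*err, theta[1] + K[1]*err]
    # P <- (P - K * phi^T P) / lam  (P stays symmetric)
    P = [[(P[0][0] - K[0]*Pphi[0])/lam, (P[0][1] - K[0]*Pphi[1])/lam],
         [(P[1][0] - K[1]*Pphi[0])/lam, (P[1][1] - K[1]*Pphi[1])/lam]]
    return theta, P

# Simulate contact with true stiffness 100 and damping 5, then recover them.
random.seed(0)
theta, P = [0.0, 0.0], [[1000.0, 0.0], [0.0, 1000.0]]
for _ in range(200):
    x, v = random.random(), random.random() - 0.5
    f = 100.0*x + 5.0*v
    theta, P = rls_step(theta, P, [x, v], f)
print(theta)  # converges near [100.0, 5.0]
```

The paper's contribution sits on top of such an estimator: resetting P when the object's impedance jumps lets the estimate re-converge quickly instead of being dragged by stale data.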

Bi-layers Red-emitting Sr2Si5N8:Eu2+ Phosphor and Yellow-emitting YAG:Ce Phosphor: A New Approach for Improving the Color Rendering Index of the Remote Phosphor Packaging WLEDs

  • Nhan, Nguyen Huu Khanh;Minh, Tran Hoang Quang;Nguyen, Tan N.;Voznak, Miroslav
    • Current Optics and Photonics
    • /
    • v.1 no.6
    • /
    • pp.613-617
    • /
    • 2017
  • Owing to advantages such as chromatic performance, durability, low power consumption, high efficiency, long lifetime, and excellent environmental friendliness, white LEDs (WLEDs) are widely used in vehicle front lighting, backlighting, decorative lighting, street lighting, and even general lighting. In this paper, remote-phosphor packaging WLEDs (RP-WLEDs) with a bi-layer of red-emitting $Sr_2Si_5N_8:Eu^{2+}$ and yellow-emitting YAG:Ce phosphor are proposed and investigated. Simulation results based on MATLAB and the commercial software Light Tools indicate that the color rendering index (CRI) of bi-layer-phosphor RP-WLEDs increases significantly, from 72 to 94. In conclusion, the results show that a bi-layer of red-emitting $Sr_2Si_5N_8:Eu^{2+}$ and yellow-emitting YAG:Ce phosphor could be a promising approach for manufacturing RP-WLEDs with enhanced optical properties.

Volume Haptic Rendering Algorithm for Realistic Modeling (실감형 모델링을 위한 볼륨 햅틱 렌더링 알고리즘)

  • Jung, Ji-Chan;Park, Joon-Young
    • Korean Journal of Computational Design and Engineering
    • /
    • v.15 no.2
    • /
    • pp.136-143
    • /
    • 2010
  • Realistic modeling aims to maximize the reality of an environment perceived through a virtual environment or remote control using two or more human senses. In particular, haptic rendering, which provides reality through the interaction of visual and tactile senses in a realistic model, has drawn attention. Haptic rendering calculates the force caused by model deformation during interaction with a virtual model and returns it to the user. A deformable model is more complex to handle than a rigid body because the deformation must be calculated inside as well as outside the model. For such models, Gibson suggested the 3D ChainMail algorithm using volumetric data. However, for deformable models with non-homogeneous materials, there were discrepancies between visual and tactile information when calculating the force feedback in real time. Therefore, we propose an algorithm for volume haptic rendering of non-homogeneous deformable objects that reflects the force feedback consistently in real time, depending on the visual information (the amount of deformation), without any post-processing.
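The 3D ChainMail idea referenced in the abstract above can be illustrated in one dimension (a simplified sketch, not the authors' volume haptic algorithm; the gap limits are hypothetical): when one element is moved, neighbors are dragged only as far as needed to keep each inter-element distance within allowed bounds, so deformation propagates outward like links of chainmail.

```python
def chainmail_1d(positions, moved_idx, new_pos, min_gap=0.5, max_gap=1.5):
    """Propagate a displacement through a 1-D chain of elements, keeping
    every neighbor distance within [min_gap, max_gap]."""
    p = list(positions)
    p[moved_idx] = new_pos
    # Propagate to the right: each link follows its left neighbor if needed.
    for i in range(moved_idx + 1, len(p)):
        gap = p[i] - p[i - 1]
        if gap > max_gap:
            p[i] = p[i - 1] + max_gap
        elif gap < min_gap:
            p[i] = p[i - 1] + min_gap
    # Propagate to the left symmetrically.
    for i in range(moved_idx - 1, -1, -1):
        gap = p[i + 1] - p[i]
        if gap > max_gap:
            p[i] = p[i + 1] - max_gap
        elif gap < min_gap:
            p[i] = p[i + 1] - min_gap
    return p

# Pushing the first element rightward compresses the chain until each
# gap reaches min_gap, then the disturbance stops propagating.
print(chainmail_1d([0.0, 1.0, 2.0, 3.0], 0, 1.5))  # [1.5, 2.0, 2.5, 3.0]
```

Non-homogeneous material, as in the paper, would correspond to per-element gap limits rather than the single global pair used here.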