• Title/Summary/Keyword: Smart User Interface


Arduino-based Tangible User Interfaces Smart Puck Systems (아두이노 기반의 텐저블 유저 인터페이스 스마트퍽 시스템)

  • Bak, Seon Hui;Kim, Eung Soo;Lee, Jeong Bae;Lee, Heeman
    • Journal of Korea Multimedia Society / v.19 no.2 / pp.334-343 / 2016
  • In this paper, we developed a low-cost smart puck system that supports intuitive interaction through natural finger touches. The tangible smart puck, designed for a capacitive tabletop display, has an embedded Arduino processor that communicates only with the MT server. The MT server communicates with both the smart puck and the display server, while the display server presents information relevant to the locations of the smart pucks on the tabletop display and handles interactions with users. The experimental results show that the accuracy of identifying smart puck IDs is reliable enough for practical use, and that the information presentation processing time is excellent compared with traditional, expensive commercial products.
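The relaying architecture the abstract describes (puck talks only to the MT server, which forwards events to the display server) can be sketched minimally as follows; all class names and message fields here are illustrative, not the authors' actual protocol.

```python
# Minimal sketch of the described architecture: the smart puck communicates
# only with the MT server, which relays puck IDs and positions to the
# display server for presentation.

class DisplayServer:
    """Keeps the information shown at each puck's location on the tabletop."""
    def __init__(self):
        self.shown = {}

    def show(self, puck_id, position):
        # Display relevant information at the puck's location.
        self.shown[puck_id] = position


class MTServer:
    """Sole communication partner of the pucks; forwards to the display server."""
    def __init__(self, display):
        self.display = display

    def on_puck_message(self, puck_id, x, y):
        # A puck reported its identity and position; relay it onward.
        self.display.show(puck_id, (x, y))


display = DisplayServer()
mt = MTServer(display)
mt.on_puck_message(puck_id=3, x=120, y=80)
print(display.shown)  # prints {3: (120, 80)}
```

The point of the indirection is that the puck firmware stays simple: it never needs to know how, or where, its position is rendered.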

Case study of VR experience studying for smart education support (스마트 교육 지원을 위한 VR 체험학습 사례 연구)

  • Kim, Moon Seok
    • Journal of Korea Society of Digital Industry and Information Management / v.10 no.2 / pp.131-141 / 2014
  • This study examines UX design for e-learning systems in order to establish reasonably structured modules based on interdisciplinary research. E-learning has been studied mainly in technology-related fields such as information technology, psychology, and business administration. However, UX design research on self-directed learning by students, an important part of smart education, is lacking. UX design, centered on the user interface and visual identity, is an important research area for the success of content in the modern smart society. This case study considers the characteristics of digital textbooks for smart education. Based on the digital textbook standard presented by the Ministry of Education, a structured model is proposed through a multidisciplinary analysis of the UX design requirements of e-learning content features. The results are intended to serve as a basis for future UX design studies on smart education and virtual education.

Adaptive User Interface Techniques for Efficient Mobile Service (효율적인 모바일 서비스를 위한 적응적 사용자 인터페이스 기술)

  • Kang, Young-Min
    • Proceedings of the Korea Contents Association Conference / 2006.11a / pp.21-25 / 2006
  • As mobile computing technology rapidly develops, mobile phones are evolving into smart phones with higher performance. However, mobile phones inevitably have limited input/output capabilities in order to maintain portability. Because of this limitation, current mobile user interfaces adopt inefficient structures based on sequential search and selection. In this paper, intelligent and adaptive user interface techniques are proposed to overcome the inefficiency of static interfaces and to maximize user convenience. Based on a user's previous tasks, the proposed techniques automatically generate an intelligent and adaptive user interface that lets users utilize mobile services efficiently.
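One simple way to realize the idea of generating an interface from a user's previous tasks is to reorder menu items by usage frequency, so that frequent services need fewer sequential selections. This is a generic illustration of the concept, not the paper's algorithm:

```python
# Sketch: build an adaptive menu from a usage log by putting the most
# frequently used items first; ties keep the original menu order.
from collections import Counter

def adaptive_menu(items, usage_log):
    counts = Counter(usage_log)
    # Sort by descending usage count; fall back to the original position.
    return sorted(items, key=lambda it: (-counts[it], items.index(it)))

menu = ["Messages", "Camera", "Settings", "Browser"]
log = ["Browser", "Messages", "Browser", "Browser", "Camera"]
print(adaptive_menu(menu, log))
# prints ['Browser', 'Messages', 'Camera', 'Settings']
```

Even this naive frequency ordering shortens the sequential search path the abstract criticizes, since the expected number of key presses to reach a service drops as its rank rises.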


Mobile Terminal-Based User Interface for Intelligent Robots (휴대용 단말기 기반의 재능 로봇 사용자 인터페이스)

  • Kim Gi-Oh;Xuan Pham Dai;Park Ji-Hwan;Hong Soon-Hyuk;Jeon Jae-Wook
    • The KIPS Transactions: Part B / v.13B no.2 s.105 / pp.179-186 / 2006
  • A user interface that connects a user to intelligent robots needs to be designed so that the robots can be operated efficiently. In this paper, we analyze how to organize a mobile terminal-based user interface according to the functions and level of autonomy of intelligent robots, and we develop PDA (Personal Digital Assistant) and smart phone user interfaces for controlling intelligent robots remotely. In the image-based user interface, a user can directly see the motion of a robot and control it. In the map-based interface, the amount of transmitted information is reduced, so a user can control the robot with only a small transmission delay.
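A rough back-of-the-envelope comparison shows why the map-based interface reduces transmitted data: a pose update for a map view is a few dozen bytes, while even a small uncompressed camera frame for an image view is hundreds of kilobytes. The frame size and pose fields below are illustrative assumptions, not figures from the paper:

```python
# Compare the payload of one camera frame (image-based view) with a JSON
# pose update (map-based view). Sizes are illustrative.
import json

def image_payload_bytes(width=320, height=240, bytes_per_pixel=2):
    """Rough size of one uncompressed camera frame."""
    return width * height * bytes_per_pixel

def map_payload_bytes(x=1.5, y=2.0, heading_deg=90):
    """Size of a JSON robot-pose update for a map-based view."""
    return len(json.dumps({"x": x, "y": y, "heading": heading_deg}).encode())

ratio = image_payload_bytes() // map_payload_bytes()
print(ratio)  # a single frame costs thousands of pose updates
```

Over a slow mobile link, that difference translates directly into the smaller transmission delay the abstract reports for the map-based interface.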

User Needs of Three Dimensional Hand Gesture Interfaces in Residential Environment Based on Diary Method (주거 공간에서의 3차원 핸드 제스처 인터페이스에 대한 사용자 요구사항)

  • Jeong, Dong Yeong;Kim, Heejin;Han, Sung H.;Lee, Donghun
    • Journal of Korean Institute of Industrial Engineers / v.41 no.5 / pp.461-469 / 2015
  • The aim of this study is to identify users' needs for a 3D hand gesture interface in the smart home environment. To do so, we investigated which objects users want to control with a 3D hand gesture interface and why they want to use one. 3D hand gesture interfaces have been studied for application to various devices in the smart environment, since they enable users to control that environment with natural and intuitive hand gestures. Given these advantages, identifying users' needs for a 3D hand gesture interface can improve the user experience of a product. This study used a diary method with 20 participants, who recorded their needs for a 3D hand gesture interface over one week using diary forms. Each form captured who, when, where, what, and how a 3D hand gesture interface would be used, together with a usefulness score. A total of 322 entries (209 normal and 113 erroneous) were collected. There were common objects that users wanted to control with a 3D hand gesture interface and common reasons for wanting one: users most often wanted to control the lights, and they wanted a gesture interface mainly to overcome situations in which their hands were restricted. The results of this study should support effective and efficient research on 3D hand gesture interfaces, offering valuable insights for researchers and designers, and could also be used to create guidelines for such interfaces.

Implementation of a User Interface Developing Tool for 3D Simulator (3차원 시뮬레이터의 사용자 인터페이스 개발 도구 구현)

  • Yoon, Ga-Rim;Jeon, Jun-Young;Kim, Young-Bong
    • Journal of Korea Multimedia Society / v.19 no.2 / pp.504-511 / 2016
  • 3D simulation programs and games on smart phones and personal computers often employ 3D graphics processing techniques and 3D graphical views. However, the user interfaces in those 3D programs have typically stuck to a 2D-style user interface, so the combination of a 2D user interface view and a 3D simulation view gives a mismatched impression. Since a 2D user interface is based on Windows controls, it sometimes causes device context (DC) conflicts between the simulation view and the interface view. Therefore, we implement a UI developing tool that can be inserted into the pipeline structure for the development of 3D simulation software and that follows the view-handler design pattern of the Microsoft Windows system. It provides various graphical effects, such as deformation of the UI depending on the view direction of the simulation view and the sitting pose of the user. This developing tool yields a natural user interface that heightens the sense of unity with a given 3D simulation view.

Study on User Interface for a Capacitive-Sensor Based Smart Device

  • Jung, Sun-IL;Kim, Young-Chul
    • Smart Media Journal / v.8 no.3 / pp.47-52 / 2019
  • In this paper, we designed HW/SW interfaces for processing the signals of capacitive sensors, such as the Electric Potential Sensor (EPS), which detect disturbances of the surrounding electric field as feature signals in motion recognition systems. Using these interfaces, we implemented a smart light control system in which the on/off switch and brightness adjustment are controlled by hand gestures through the designed and fabricated interface circuits. PWM (Pulse Width Modulation) signals from the controller, together with a driver IC, drive the LED and control its brightness and on/off operation. Using the hand-gesture signals obtained through the EPS sensors and the interface HW/SW, we can not only construct a gesture-command system but also achieve faster recognition by developing dedicated interface hardware, including control circuitry. Finally, using the proposed hand-gesture recognition and signal processing methods, the light control module was designed and implemented. The experimental results show that the smart light control system controls the LED module properly through accurate motion detection and gesture classification.
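The PWM brightness control the abstract relies on can be sketched in a few lines: the LED's perceived brightness is set by the duty cycle of a fixed-period pulse train. The mapping from a gesture-selected level to a duty cycle, and the 1 ms period, are illustrative assumptions, not the paper's parameters:

```python
# Sketch of PWM dimming: a discrete brightness level chosen by gesture is
# mapped to a duty cycle, which determines how long the LED drive signal
# stays high within each PWM period.

def duty_cycle(brightness_level, max_level=10):
    """Map a gesture-selected brightness level (0..max_level) to a duty cycle (0..1)."""
    level = max(0, min(brightness_level, max_level))  # clamp out-of-range levels
    return level / max_level

def high_time_us(duty, period_us=1000):
    """Time the output stays high within one PWM period, in microseconds."""
    return duty * period_us

d = duty_cycle(7)
print(d, high_time_us(d))  # prints 0.7 700.0
```

On a real controller the same arithmetic would feed a hardware timer's compare register; the driver IC then sources the actual LED current, as in the system described above.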

User interface for remote control robot

  • Kim, Gi-Oh;Jeon, Jae-Wook
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2005.06a / pp.52-56 / 2005
  • The recent growth of robot technology has made robots popular and gives people many opportunities to apply various robots. However, most robots are controlled by their own unique programs, so users find robots difficult and unfamiliar. Therefore, we need ways to make users feel comfortable and familiar with robot usage. First, we analyze how the user interacts with the robot. Next, based on that analysis, we discuss how a standard human-robot interface can provide better usability. In this paper, 10 degrees of the Level of Autonomy (LOA) are proposed, and we evaluate which interface components and designs are appropriate for each LOA. Finally, we suggest a way to design a standard human-robot interface for remotely controlled robots through handheld devices such as the Personal Digital Assistant (PDA) and smart phone.


Motion-based Controlling 4D Special Effect Devices to Activate Immersive Contents (실감형 콘텐츠 작동을 위한 모션 기반 4D 특수효과 장치 제어)

  • Kim, Kwang Jin;Lee, Chil Woo
    • Smart Media Journal / v.8 no.1 / pp.51-58 / 2019
  • This paper describes a gesture application for controlling the special-effect physical devices of 4D contents using the PWM (Pulse Width Modulation) method. User operations recognized by an infrared sensor are interpreted as commands for 3D content control, several of which manipulate devices that generate special effects to deliver physical stimuli to the user. With content controlled through NUI (Natural User Interface) techniques, the user can be placed directly into an immersive experience, which provides a higher degree of interest and attention. To measure the efficiency of the proposed method, we implemented a PWM-based real-time linear control system that manages the parameters of the motion recognition and animation controller using the infrared sensor and transmits the events.