• Title/Summary/Keyword: human and computer interface


Implementation of Human and Computer Interface for Detecting Human Emotion Using Neural Network (인간의 감정 인식을 위한 신경회로망 기반의 휴먼과 컴퓨터 인터페이스 구현)

  • Cho, Ki-Ho;Choi, Ho-Jin;Jung, Seul
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.9
    • /
    • pp.825-831
    • /
    • 2007
  • In this paper, an interface between a human and a computer is presented. The human and computer interface (HCI) serves as another area of human and machine interfaces. The methods used for the HCI are voice recognition and image recognition for detecting the human's emotional state. The idea is that the computer recognizes the present emotional state of the human operator and amuses him/her in various ways, such as playing music, searching the web, and talking. For the image recognition process, the human face is captured, and the eyes and mouth are selected from the facial image for recognition. To train images of the mouth, a Hopfield net is used. The results show 88%-92% recognition of the emotion. For vocal recognition, the neural network shows 80%-98% recognition of voice.
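The mouth-image training step above relies on a Hopfield net's ability to restore a noisy input to the nearest stored pattern. A minimal pure-Python sketch of that mechanism (pattern size, values, and function names are illustrative, not the authors'):

```python
# Hopfield-net sketch: bipolar (+1/-1) patterns, Hebbian weights,
# synchronous threshold updates until the state settles.

def train(patterns):
    """Hebbian weight matrix W[i][j] = sum_p p[i]*p[j], zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    """Repeatedly threshold the weighted sums to converge on a stored pattern."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

stored = [1, -1, 1, -1, 1, -1]      # a trained "mouth" pattern (illustrative)
noisy  = [1, -1, 1, -1, -1, -1]     # same pattern with one bit flipped
w = train([stored])
print(recall(w, noisy) == stored)   # the net restores the stored pattern
```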

Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.145-150
    • /
    • 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory systems for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To realize the sense of touch, an electrotactile display produces controlled, localized touch sensations on the skin by passing small electric currents. In electrotactile stimulation, the mechanoreceptors in the skin may be stimulated individually in order to display sensations of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm, or any suitable location of the body by using appropriate electrodes and waveforms. We developed an ETCS and investigated the effectiveness of the proposed system in terms of the perception of surface roughness, by stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the forearm or finger of the user with an electrode-embedded armband, in order to investigate how subjects recognize displayed patterns and directions of stimulation.
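The positive and negative pulse trains mentioned above can be sketched as sampled waveforms. All parameter values below are illustrative assumptions, not the stimulation settings used in the paper:

```python
# Biphasic pulse-train generator sketch: +A during the positive phase,
# -A during the optional negative phase, 0 for the rest of each period.

def pulse_train(amplitude_ma, pulse_width, period, n_periods, biphasic=True):
    """Return one amplitude sample (in mA) per discrete time step."""
    samples = []
    for _ in range(n_periods):
        samples += [amplitude_ma] * pulse_width
        if biphasic:
            samples += [-amplitude_ma] * pulse_width
        samples += [0] * (period - pulse_width * (2 if biphasic else 1))
    return samples

wave = pulse_train(amplitude_ma=2.0, pulse_width=2, period=10, n_periods=3)
print(len(wave))   # 30 samples: 3 periods of 10 steps each
```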


Tangible Space Initiative

  • Ahn, Chong-Keun;Kim, Lae-Hyun;Ha, Sung-Do
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2004.10a
    • /
    • pp.1053-1056
    • /
    • 2004
  • Research in Human Computer Interface (HCI) is directed towards the development of an application environment able to handle interactions between humans and computers that are more intuitive and efficient. This can be achieved by bridging the gap between the synthetic virtual environment and the natural physical environment. To this end, a project called the Tangible Space Initiative (TSI) has been launched by KIST. TSI is subdivided into the Tangible Interface (TI), which controls 3D cyberspace from the user's perspective; the Responsive Cyber Space (RCS), which creates and controls the virtual environment; and the Tangible Agent (TA), which senses and acts upon the physical interface environment on behalf of any component of TSI or the user. This paper is a brief introduction to a new generation of Human Computer Interfaces that will bring users into a new era of interaction with computers.


Trends on Human/Robot Interface Research (휴먼/로봇 인터페이스 연구동향 분석)

  • Im, Chang-Ju;Im, Chi-Hwan
    • Journal of the Ergonomics Society of Korea
    • /
    • v.21 no.2
    • /
    • pp.101-111
    • /
    • 2002
  • The intelligent robots developed recently are no longer the conventional robots widely known as industrial robots. An intelligent robot is a computer system embedded in a machine that utilizes the machine as a medium not only for communication between the human and the computer but also for physical interaction among the human, the computer, and their environment. Recent advances in computer technology have made it possible to create several new types of human-computer interaction realized by utilizing intelligent machines. There is a continuing need for a better understanding of how to design human/robot interfaces (HRI) that provide a more natural and efficient flow of information and feedback between robot systems and their users in both directions. In this paper, we explain the concept and scope of HRI and review current domestic and foreign HRI research trends. Recommended research directions for the near future are also discussed, based upon a comparative study of domestic and foreign HRI technology.

Development of Wearable Vibrotactile Display Device (착용 가능한 진동촉감 제시 장치 개발)

  • Seo, Chang-Hoon;Kim, Hyun-Ho;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2006.02a
    • /
    • pp.1-6
    • /
    • 2006
  • Tactile display has the advantage of delivering information discreetly without disturbing others, and it is an essential means of conveying information to the visually or hearing impaired. Information delivery through the sense of touch can also complement, or sometimes replace, audiovisual information delivery. This paper proposes a wearable vibrotactile display device that can be used in wearable, mobile, or ubiquitous computing environments. The device arranges 25 vibration motors in a 5x5 array and can display not only letters and numbers but also various complex patterns. Each coin-type vibration motor is wrapped in sponge and mounted upright on a pad of soft material to minimize the spread of vibration, and a new tracing mode that drives the vibration motors sequentially in handwriting stroke order greatly improves the user's recognition rate for letters and numbers. In a user performance evaluation, displaying English alphabet letters on the top of the user's foot yielded a recognition rate of 86.7%. Useful applications of the vibrotactile display device are also presented, such as displaying caller information on a mobile phone or applying it to a navigation system.
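The tracing mode described above drives one motor at a time along the handwriting stroke path rather than activating the whole glyph at once. A minimal sketch, assuming (row, col) motor indices on the 5x5 grid and an illustrative stroke path for the digit 7:

```python
# Tracing-mode sketch for a 5x5 vibrotactile array: motors fire one by
# one in stroke order, which is what improves glyph recognition.

GRID = 5

def trace(stroke):
    """Return the motor activation sequence for one glyph, dot by dot."""
    fired = []
    for row, col in stroke:
        assert 0 <= row < GRID and 0 <= col < GRID   # must be a valid motor
        fired.append((row, col))   # here: pulse motor (row, col), then dwell
    return fired

# Illustrative stroke path for the digit 7, in handwriting order
seven = [(0, 0), (0, 2), (0, 4), (1, 3), (2, 2), (3, 1), (4, 0)]
print(trace(seven))
```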


Immersive Live Sports Experience with Vibrotactile Sensation (스포츠 방송에서의 몰입감 증대를 위한 진동촉감 제시 시스템)

  • Lee, Beom-Chan;Lee, Jun-Hun;Seo, Chang-Hoon;Cha, Jong-En;Ryu, Je-Ha
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2006.02a
    • /
    • pp.230-237
    • /
    • 2006
  • This paper proposes a vibrotactile display system and device for increasing the sense of immersion in sports broadcasting, together with a tactile rendering methodology and control algorithm for delivering tactile information effectively. As interest in conveying digital content through all five senses has grown, the role and importance of tactile display, alongside sight and hearing, has increased in digital media that deliver diverse information to the public. This paper therefore defines haptic effects, their roles, and possible scenarios for sports broadcasting, which conveys a dynamic live scene in real time, and designs a vibrotactile display device as a basic study on representing tactile information. A user tactile perception experiment was conducted based on a soccer broadcast, one of the proposed tactile sports broadcasting scenarios; based on the results, a soccer broadcasting system was built, and the vibrotactile system and tactile rendering methodology were verified through a demonstration at an immersive broadcasting exhibition. Since touch is the most important sensory system for perceiving information after sight and hearing, it is expected to contribute as an effective information delivery channel, alongside sight and hearing, in broadcasting systems that deliver large amounts of information to the public.


An Efficient Camera Calibration Method for Head Pose Tracking (머리의 자세를 추적하기 위한 효율적인 카메라 보정 방법에 관한 연구)

  • Park, Gyeong-Su;Im, Chang-Ju;Lee, Gyeong-Tae
    • Journal of the Ergonomics Society of Korea
    • /
    • v.19 no.1
    • /
    • pp.77-90
    • /
    • 2000
  • The aim of this study is to develop and evaluate an efficient camera calibration method for vision-based head tracking. Tracking head movements is important in the design of an eye-controlled human/computer interface, and a vision-based head tracking system was proposed to allow the user's head movements in such an interface. We propose an efficient camera calibration method to track the 3D position and orientation of the user's head accurately, and we evaluate the performance of the proposed method. The experimental error analysis showed that the proposed method provides a more accurate and stable pose (i.e. position and orientation) of the camera than the direct linear transformation method conventionally used in camera calibration. The results of this study can be applied to tracking head movements for eye-controlled human/computer interfaces and virtual reality technology.
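Calibration methods such as direct linear transformation estimate the 3x4 projection matrix that underlies the pose computation above. A minimal sketch of that projection model with an illustrative matrix (the paper's calibration procedure itself is not reproduced here):

```python
# Pinhole projection sketch: a 3x4 camera matrix P maps a homogeneous
# 3-D point to image pixels; calibration estimates P from known
# 3-D/2-D point correspondences.

def project(P, X):
    """Apply x ~ P X and dehomogenize to pixel coordinates (u, v)."""
    Xh = list(X) + [1.0]                      # homogeneous 3-D point
    x = [sum(P[r][c] * Xh[c] for c in range(4)) for r in range(3)]
    return (x[0] / x[2], x[1] / x[2])

# Illustrative camera: focal length 500 px, principal point (320, 240)
P = [[500, 0, 320, 0],
     [0, 500, 240, 0],
     [0,   0,   1, 0]]
print(project(P, (0.0, 0.0, 2.0)))   # point on the optical axis -> (320.0, 240.0)
```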


Program Development of Emotional Human and Computer Interface

  • Jung, Seul;Cho, Kiho
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2002.10a
    • /
    • pp.102.3-102
    • /
    • 2002
  • Human and computer interface (HCI) • Voice recognition • Image recognition • Neural network • Hopfield net


A Novel EMG-based Human-Computer Interface for Electric-Powered Wheelchair Users with Motor Disabilities (거동장애를 가진 전동휠체어 사용자를 위한 근전도 기반의 휴먼-컴퓨터 인터페이스)

  • Lee Myung-Joon;Chu Jun-Uk;Ryu Je-Cheong;Mun Mu-Seong;Moon Inhyuk
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.11 no.1
    • /
    • pp.41-49
    • /
    • 2005
  • The electromyogram (EMG) signal generated by voluntary contraction of muscles is often used in rehabilitation devices because of its distinct output characteristics compared to other bio-signals. This paper proposes a novel EMG-based human-computer interface for electric-powered wheelchair users with motor disabilities due to C4 or C5 spinal cord injury. The user's commands to control the electric-powered wheelchair are represented by shoulder elevation motions, which are recognized by comparing EMG signals acquired from the levator scapulae muscles with a preset double threshold value. The interface commands for controlling the electric-powered wheelchair consist of combinations of left-, right-, and both-shoulder elevation motions. To achieve a real-time interface, we implemented EMG processing hardware composed of analog amplifiers, filters, a mean-absolute-value circuit, and a high-speed microprocessor. Experimental results using the implemented real-time hardware and an electric-powered wheelchair showed that the EMG-based human-computer interface is feasible for users with severe motor disabilities.
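The double-threshold recognition scheme above can be sketched as a hysteresis comparator on each shoulder's mean-absolute-value (MAV) signal. Threshold values and command names below are assumptions for illustration, not the paper's settings:

```python
# Double-threshold (hysteresis) EMG command sketch: a channel turns "on"
# above ON_THRESHOLD and only releases below OFF_THRESHOLD, and the
# left/right/both activation pattern maps to a wheelchair command.

ON_THRESHOLD = 0.6    # MAV level that counts as a shoulder elevation (assumed)
OFF_THRESHOLD = 0.3   # must drop below this to release the command (assumed)

def mav(window):
    """Mean absolute value of one EMG sample window."""
    return sum(abs(s) for s in window) / len(window)

def update(active, window, on=ON_THRESHOLD, off=OFF_THRESHOLD):
    """Hysteresis state update for one channel."""
    m = mav(window)
    return m >= on if not active else m > off

def classify(left_active, right_active):
    """Map the two-channel activation pattern to a command (names assumed)."""
    return {(True, True): "forward",
            (True, False): "turn_left",
            (False, True): "turn_right",
            (False, False): "stop"}[(left_active, right_active)]

left = update(False, [0.7, 0.8, 0.9])    # strong left-shoulder burst
right = update(False, [0.1, 0.0, 0.2])   # right shoulder relaxed
print(classify(left, right))             # -> turn_left
```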

A Novel Computer Human Interface to Remotely Pick up Moving Human's Voice Clearly by Integrating Real-time Face Tracking and Microphones Array

  • Hiroshi Mizoguchi;Takaomi Shigehara;Yoshiyasu Goto;Hidai, Ken-ichi;Taketoshi Mishima
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1998.10a
    • /
    • pp.75-80
    • /
    • 1998
  • This paper proposes a novel computer-human interface, named Virtual Wireless Microphone (VWM), which utilizes computer vision and signal processing. It integrates real-time face tracking and sound signal processing. VWM is intended to be used as a speech signal input method for human-computer interaction, especially for autonomous intelligent agents that interact with humans, such as a digital secretary. Utilizing VWM, the agent can clearly hear its human master's voice remotely, as if a wireless microphone were placed just in front of the master.
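One common way to realize such a steerable microphone array is delay-and-sum beamforming: the face tracker supplies the speaker's position, from which per-channel steering delays are derived so the speech adds coherently across microphones. This is a sketch of the general technique, not necessarily the authors' exact signal-processing pipeline:

```python
# Delay-and-sum beamforming sketch: shift each microphone channel by its
# steering delay (in samples) and average, so sound from the tracked
# direction reinforces while off-axis sound averages out.

def delay_and_sum(channels, delays):
    """Align each channel by its delay and return the averaged output."""
    n = min(len(ch) - d for ch, d in zip(channels, delays))
    return [sum(ch[d + i] for ch, d in zip(channels, delays)) / len(channels)
            for i in range(n)]

# Two mics hear the same waveform with a 2-sample arrival offset
signal = [0.0, 1.0, 0.0, -1.0, 0.0]
mic1 = signal + [0.0, 0.0]
mic2 = [0.0, 0.0] + signal
out = delay_and_sum([mic1, mic2], delays=[0, 2])
print(out[:5])   # -> [0.0, 1.0, 0.0, -1.0, 0.0]: the aligned channels reinforce
```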
