• Title/Summary/Keyword: Robust Eye Tracking

Robust Eye Region Discrimination and Eye Tracking to the Environmental Changes (환경변화에 강인한 눈 영역 분리 및 안구 추적에 관한 연구)

  • Kim, Byoung-Kyun;Lee, Wang-Heon
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.5 / pp.1171-1176 / 2014
  • Eye tracking (ET) is used in human-computer interaction (HCI) to analyze the movement of the eyes and to find the gaze direction by tracking the pupils on a human face. Nowadays, ET is widely used not only in market analysis, which takes advantage of pupil tracking, but also in grasping user intention, and there has been a great deal of research on it. Although vision-based ET is convenient from an application point of view, it is not robust to environmental changes such as illumination, geometrical rotation, occlusion, and scale changes. This paper proposes a two-step ET method: first, the face and eye regions are discriminated by a Haar classifier, and then the pupils in the discriminated eye regions are tracked by CAMShift as well as template matching. We demonstrate the usefulness of the proposed algorithm through extensive real experiments under changing conditions such as illumination, rotation, and scale.
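
The second-stage tracker above combines CAMShift with template matching. A minimal pure-Python sketch of the template-matching half, using normalized cross-correlation (NCC) over a grayscale patch; the `ncc_match` name and the toy 2D-list image format are illustrative, not the paper's implementation:

```python
import math

def ncc_match(image, template):
    """Slide the template over the image (2D lists of intensities) and
    return the top-left (row, col) of the best-scoring window under
    normalized cross-correlation, plus the score itself."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    t_vals = [v for row in template for v in row]
    t_mean = sum(t_vals) / len(t_vals)
    t_dev = [v - t_mean for v in t_vals]
    t_norm = math.sqrt(sum(d * d for d in t_dev))
    best_score, best_pos = -2.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w_vals = [image[r + i][c + j] for i in range(th) for j in range(tw)]
            w_mean = sum(w_vals) / len(w_vals)
            w_dev = [v - w_mean for v in w_vals]
            w_norm = math.sqrt(sum(d * d for d in w_dev))
            if t_norm == 0 or w_norm == 0:
                continue  # flat window: correlation undefined
            score = sum(a * b for a, b in zip(t_dev, w_dev)) / (t_norm * w_norm)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```

With an exact match present the score peaks at 1.0 at the pupil position; CAMShift would then track that window from frame to frame.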

Real-Time Eye Tracking Using IR Stereo Camera for Indoor and Outdoor Environments

  • Lim, Sungsoo;Lee, Daeho
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.8 / pp.3965-3983 / 2017
  • We propose a novel eye tracking method that can estimate 3D world coordinates using an infrared (IR) stereo camera for indoor and outdoor environments. This method first detects dark evidences such as eyes, eyebrows and mouths by fast multi-level thresholding. Among these evidences, eye pair evidences are detected by evidential reasoning and geometrical rules. For robust accuracy, two classifiers based on a multilayer perceptron (MLP) using gradient local binary patterns (GLBPs) verify whether the detected evidences are real eye pairs or not. Finally, the 3D world coordinates of detected eyes are calculated by region-based stereo matching. Compared with other eye detection methods, the proposed method can detect the eyes of people wearing sunglasses due to the use of the IR spectrum. In particular, when people are in dark environments such as driving at nighttime, driving in an indoor carpark, or passing through a tunnel, human eyes can be robustly detected because we use active IR illuminators. The experimental results show that the proposed method can detect eye pairs with high performance in real time under variable illumination conditions. Therefore, the proposed method can contribute to human-computer interaction (HCI) and intelligent transportation system (ITS) applications such as gaze tracking, windshield head-up displays and drowsiness detection.
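
The verification stage above runs MLP classifiers over gradient local binary patterns (GLBPs). As a hedged sketch, the plain LBP operator that GLBP builds on can be written in a few lines of pure Python; the gradient variant applies the same 8-bit coding to gradient images rather than raw intensities, a step omitted here:

```python
def lbp_code(patch):
    """8-bit local binary pattern of the center pixel of a 3x3 patch:
    each neighbor contributes a set bit when it is >= the center value."""
    center = patch[1][1]
    # neighbors in clockwise order starting from the top-left corner
    offsets = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, c) in enumerate(offsets):
        if patch[r][c] >= center:
            code |= 1 << bit
    return code
```

A uniform patch yields code 255 (every neighbor ties the center), while an isolated bright center yields 0; histograms of such codes over an eye candidate form the classifier's feature vector.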

Real Time Eye and Gaze Tracking (트래킹 Gaze와 실시간 Eye)

  • Min Jin-Kyoung;Cho Hyeon-Seob
    • Proceedings of the KAIS Fall Conference / 2004.11a / pp.234-239 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.
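
The GRNN mapping described above is a kernel-weighted average, which makes it easy to sketch. The following pure-Python `grnn_predict` is an illustrative stand-in, not the authors' trained network; here the inputs would be pupil parameter vectors and the outputs screen coordinates:

```python
import math

def grnn_predict(x, train_x, train_y, sigma=1.0):
    """GRNN prediction: a Gaussian-kernel weighted average of the training
    targets, so the learned mapping needs no analytical form."""
    weights = [math.exp(-sum((a - b) ** 2 for a, b in zip(x, xi))
                        / (2.0 * sigma * sigma))
               for xi in train_x]
    total = sum(weights)
    return [sum(w * yi[k] for w, yi in zip(weights, train_y)) / total
            for k in range(len(train_y[0]))]
```

Because the prediction is a normalized weighted sum of stored training targets, the pupil-to-screen mapping never has to be an analytical function, which is exactly the property the abstract highlights.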

Development of Eye-Tracking System Using Dual Machine Learning Structure (이중 기계학습 구조를 이용한 안구이동추적 기술개발)

  • Gang, Gyeong Woo;Min, Chul Hong;Kim, Tae Seon
    • The Transactions of The Korean Institute of Electrical Engineers / v.66 no.7 / pp.1111-1116 / 2017
  • In this paper, we developed a bio-signal based eye tracking system using electrooculogram (EOG) and electromyogram (EMG) signals measured simultaneously from the same electrodes. In this system, the eye gaze position can be estimated from the EOG signal, while the EMG signal can be used at the same time as an additional command-control interface. For EOG signal processing, PLA algorithms are applied to reduce processing complexity while still guaranteeing a reaction delay of less than 0.2 seconds. We also developed a dual machine learning structure that shows robust and enhanced tracking performance. Compared to conventional EOG-based eye tracking systems, the developed system requires a relatively light hardware specification, with only two skin-contact electrodes on the temples, which is an advantage for application to mobile equipment or wearable devices. The developed system can provide a different UX for consumers and would be especially helpful to disabled persons, with applications to orthotics for quadriplegia or communication tools for intellectual disabilities.
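
The abstract applies PLA to the EOG signal to cut processing cost. Assuming PLA here denotes piecewise linear approximation (a common EOG preprocessing step; the paper's exact variant and error bound are not given), a top-down split sketch in pure Python:

```python
def pla(signal, max_err):
    """Top-down piecewise linear approximation: recursively split the
    signal at the sample farthest from the chord until every segment's
    maximum deviation is below max_err; returns the kept sample indices."""
    def split(lo, hi):
        x0, y0, x1, y1 = lo, signal[lo], hi, signal[hi]
        worst, worst_i = 0.0, None
        for i in range(lo + 1, hi):
            # vertical distance from the chord through (lo, hi)
            y_line = y0 + (y1 - y0) * (i - x0) / (x1 - x0)
            d = abs(signal[i] - y_line)
            if d > worst:
                worst, worst_i = d, i
        if worst_i is not None and worst > max_err:
            return split(lo, worst_i)[:-1] + split(worst_i, hi)
        return [lo, hi]
    return split(0, len(signal) - 1)
```

On a saccade-like trace the kept indices cluster around the rapid transition, so later stages only process a handful of breakpoints instead of every sample.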

Eye detection on Rotated face using Principal Component Analysis (주성분 분석을 이용한 기울어진 얼굴에서의 눈동자 검출)

  • Choi, Yeon-Seok;Mun, Won-Ho;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2011.05a / pp.61-64 / 2011
  • There are many applications, such as human-computer interfaces (HCI), that require robust and accurate eye tracking. In this paper, we propose a novel approach for eye detection on rotated faces using principal component analysis. Intensity information is used in the process of iris detection. First, the eye region is selected using principal component analysis; then the eyes are detected using the intensity of the eye region. The experimental results show good performance in detecting eyes from FERET images that include rotated faces.
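
PCA supplies the dominant orientation of a point set, which is what makes eye detection on rotated faces workable. A dependency-free sketch of extracting the first principal component of 2-D pixel coordinates by power iteration (illustrative; the paper's exact feature vectors are not specified in the abstract):

```python
def principal_component(points):
    """First principal component of 2-D points via power iteration on
    the 2x2 covariance matrix (pure Python, no dependencies).
    Assumes the points are not all identical along the start direction."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    cxx = sum((p[0] - mx) ** 2 for p in points) / n
    cyy = sum((p[1] - my) ** 2 for p in points) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    v = (1.0, 0.0)
    for _ in range(50):
        w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)
    return v
```

The angle of the returned vector gives the face's tilt, so the eye region can be rotated upright before the intensity-based detection step.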

Robust Face and Facial Feature Tracking in Image Sequences (연속 영상에서 강인한 얼굴 및 얼굴 특징 추적)

  • Jang, Kyung-Shik;Lee, Chan-Hee
    • Journal of the Korea Institute of Information and Communication Engineering / v.14 no.9 / pp.1972-1978 / 2010
  • AAM (Active Appearance Model) is one of the most effective ways to detect deformable 2D objects and is a kind of mathematical optimization method. The cost function is convex because it is a least-squares function, but the search space is not convex, so a local minimum is not guaranteed to be the optimal solution. That is, unless the initial value starts near the global minimum, the search converges to a local minimum, making it difficult to detect the face contour correctly. In this study, an AAM-based face tracking algorithm is proposed that is robust to various lighting conditions and backgrounds. Eye detection is performed using SIFT and a genetic algorithm, and the eye information is used as the AAM's initial matching information. Experiments verify that the proposed AAM-based face tracking method is more robust with respect to face pose and background than the conventional basic AAM-based face tracking method.
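
The SIFT-plus-genetic-algorithm eye detector above relies on a standard GA search loop. A toy real-valued GA illustrating only the tournament-selection, blend-crossover and mutation steps; the fitness function over SIFT matches is not reproduced, and every name and parameter here is illustrative:

```python
import random

def ga_minimize(fitness, bounds, pop_size=30, gens=60, seed=0):
    """Toy real-valued genetic algorithm: two-way tournament selection,
    blend crossover and Gaussian mutation, minimizing `fitness`."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)                  # tournament 1
            p1 = a if fitness(a) < fitness(b) else b
            c, d = rng.sample(pop, 2)                  # tournament 2
            p2 = c if fitness(c) < fitness(d) else d
            child = 0.5 * (p1 + p2)                    # blend crossover
            child += rng.gauss(0.0, 0.02 * (hi - lo))  # mutation
            nxt.append(min(hi, max(lo, child)))
        pop = nxt
    return min(pop, key=fitness)
```

In the paper's setting the search space would be candidate eye positions and the fitness a score over SIFT feature matches rather than this one-dimensional toy.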

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju;Kim, Jin-Suh;Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • Large displays come in various forms to which previous gaze tracking methods do not apply; placing the gaze-tracking camera above the display can solve the problem of display size or height. However, this setup cannot use the corneal-reflection information from infrared illumination that previous methods rely on. In this paper, we propose a pupil detection method that is robust to eye occlusion, along with a method for simply calculating the gaze position from the inner eye corner, the pupil center, and the face pose information. In the proposed method, the camera switches between wide- and narrow-angle modes according to the person's position: if a face is detected in the field of view (FOV) in wide mode, the camera switches to narrow mode after calculating the face position. The frame captured in narrow mode contains the gaze direction information of a person at a long distance. The gaze calculation consists of a face pose estimation step and a gaze direction calculation step. The face pose is estimated by mapping the feature points of the detected face to a 3D model. To calculate the gaze direction, an ellipse is first detected from the iris edge information of the pupil; if the pupil is occluded, its position is estimated with a deformable template. Then the gaze position on the display is calculated using the pupil center, the inner eye corner, and the face pose information. In the experiments, the proposed gaze tracking algorithm overcomes the constraints imposed by the form of the display and effectively calculates the gaze direction of a person at a long distance using a single camera, as demonstrated at various distances.
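
The gaze step above begins by fitting an ellipse to iris edge points. As a simplified, hedged stand-in, an algebraic (Kasa) circle fit recovers a center and radius from edge samples by linear least squares; a full conic/ellipse fit follows the same pattern with more parameters:

```python
import math

def fit_circle(points):
    """Algebraic (Kasa) circle fit: solve the linear least-squares system
    for x^2 + y^2 + a*x + b*y + c = 0, then recover center and radius.
    A simplified stand-in for an ellipse fit on iris edge points."""
    # build normal equations S u = rhs with rows [x, y, 1], z = -(x^2+y^2)
    S = [[0.0] * 4 for _ in range(3)]
    for x, y in points:
        row = [x, y, 1.0]
        z = -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                S[i][j] += row[i] * row[j]
            S[i][3] += row[i] * z
    # Gaussian elimination with partial pivoting
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(S[r][i]))
        S[i], S[p] = S[p], S[i]
        for r in range(i + 1, 3):
            f = S[r][i] / S[i][i]
            for j in range(i, 4):
                S[r][j] -= f * S[i][j]
    u = [0.0] * 3
    for i in (2, 1, 0):
        u[i] = (S[i][3] - sum(S[i][j] * u[j] for j in range(i + 1, 3))) / S[i][i]
    a, b, c = u
    cx, cy = -a / 2.0, -b / 2.0
    return (cx, cy), math.sqrt(cx * cx + cy * cy - c)
```

The fitted center plays the role of the pupil center in the gaze calculation; when the pupil is partly occluded, the paper falls back to a deformable template instead.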

Active eye system for tracking a moving object (이동물체 추적을 위한 능동시각 시스템 구축)

  • 백문홍
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 1996.10b / pp.257-259 / 1996
  • This paper presents an active eye system for tracking a moving object in 3D space. A prototype system able to track a moving object is designed and implemented. The mechanical system enables control of a platform that carries a binocular camera, as well as control of the vergence angle of each camera by step motors. Each camera has two degrees of freedom. The image features of the object are extracted from a complicated environment using zero disparity filtering (ZDF). From the centroid of the image features, the gaze point on the object is calculated, and the vergence angle of each camera is controlled by the step motors. The proposed method is implemented on the prototype with robust and fast calculation time.
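
Zero disparity filtering keeps only image content that both verged cameras see at (near-)zero disparity, and the centroid of what survives gives the gaze point. A toy sketch under the assumption of already-rectified, vergence-aligned intensity arrays; the `v > 0` check stands in for the feature-extraction step that discards empty background:

```python
def zdf_centroid(left, right, thresh):
    """Zero disparity filtering on vergence-aligned images: keep pixels
    where left and right intensities agree within `thresh` (and are
    non-background), then return the (row, col) centroid of what remains."""
    pts = [(r, c)
           for r, row in enumerate(left)
           for c, v in enumerate(row)
           if abs(v - right[r][c]) <= thresh and v > 0]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
```

In the prototype, the offset of this centroid from the image center would drive the step motors that adjust each camera's vergence angle.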

Dynamic Visual Servoing of Robot Manipulators (로봇 메니퓰레이터의 동력학 시각서보)

  • Baek, Seung-Min;Im, Gyeong-Su;Han, Ung-Gi;Guk, Tae-Yong
    • The Transactions of the Korean Institute of Electrical Engineers D / v.49 no.1 / pp.41-47 / 2000
  • A better tracking performance can be achieved if visual sensors such as CCD cameras are used in controlling a robot manipulator than when only relative sensors such as encoders are used. However, for precise visual servoing of a robot manipulator, an expensive vision system with a fast sampling rate must be used. Moreover, even if a fast vision system is implemented for visual servoing, one cannot get reliable performance without a robust and stable inner joint servo loop. In this paper, we propose a dynamic control scheme for robot manipulators with an eye-in-hand camera configuration, where a dynamic learning controller is designed to improve the tracking performance of the robotic system. The proposed control scheme is implemented for tasks of tracking moving objects and shown to be robust to parameter uncertainty, disturbances, low sampling rate, etc.
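
The dynamic learning controller itself cannot be reconstructed from the abstract, but the image-based visual servoing step that such schemes build on is a proportional law on the feature error through the (pseudo-)inverse image Jacobian. A hedged pure-Python sketch, with all names illustrative:

```python
def ibvs_step(feature, target, jacobian_inv, gain=0.5):
    """One proportional image-based visual servoing step:
    velocity = -gain * J_inv * (feature - target), where jacobian_inv
    is the (pseudo-)inverse image Jacobian as nested lists."""
    err = [f - t for f, t in zip(feature, target)]
    return [-gain * sum(row[k] * err[k] for k in range(len(err)))
            for row in jacobian_inv]
```

Iterating this step drives the image-feature error to zero when the Jacobian estimate is accurate; the paper's learning controller additionally compensates for manipulator dynamics and the camera's low sampling rate.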

Real Time Eye and Gaze Tracking

  • Park Ho Sik;Nam Kee Hwan;Cho Hyeon Seob;Ra Sang Dong;Bae Cheol Soo
    • Proceedings of the IEEK Conference / 2004.08c / pp.857-861 / 2004
  • This paper describes preliminary results we have obtained in developing a computer vision system based on active IR illumination for real time gaze tracking for interactive graphic display. Unlike most of the existing gaze tracking techniques, which often require assuming a static head to work well and require a cumbersome calibration process for each person, our gaze tracker can perform robust and accurate gaze estimation without calibration and under rather significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using the Generalized Regression Neural Networks (GRNN). With GRNN, the mapping does not have to be an analytical function and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function can generalize to other individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments that involve gaze-contingent interactive graphic display.
