Real Time Eye and Gaze Tracking

Real-Time Eye and Gaze Position Tracking

  • Published : 2004.04.01

Abstract

This paper describes preliminary results we have obtained in developing a computer vision system, based on active IR illumination, for real-time gaze tracking for interactive graphic display. Unlike most existing gaze tracking techniques, which often assume a static head and require a cumbersome calibration process for each person, our gaze tracker performs robust and accurate gaze estimation without calibration and under fairly significant head movement. This is made possible by a new gaze calibration procedure that identifies the mapping from pupil parameters to screen coordinates using a Generalized Regression Neural Network (GRNN). With the GRNN, the mapping does not have to be an analytical function, and head movement is explicitly accounted for by the gaze mapping function. Furthermore, the mapping function generalizes to individuals not used in the training. The effectiveness of our gaze tracker is demonstrated by preliminary experiments involving gaze-contingent interactive graphic display.
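For readers unfamiliar with GRNNs, the sketch below illustrates in Python how such a network maps a pupil-feature vector to screen coordinates as a Gaussian-kernel weighted average of stored calibration samples (Specht's formulation). The feature layout, kernel width, and data are placeholders for illustration only, not the authors' actual design.

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma=0.3):
    """GRNN prediction: a Gaussian-kernel weighted average of stored targets."""
    d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distance to each stored pattern
    w = np.exp(-d2 / (2.0 * sigma ** 2))           # pattern-layer activations
    total = w.sum()
    if total < 1e-12:                              # query far from every training sample
        return Y_train.mean(axis=0)
    return (w[:, None] * Y_train).sum(axis=0) / total   # summation / output layers

# Hypothetical feature layout (an assumption, not the paper's exact pupil parameters):
# pupil-glint displacement (dx, dy), pupil ellipse ratio and orientation, glint (gx, gy).
rng = np.random.default_rng(0)
X_train = rng.random((200, 6))      # placeholder calibration features
Y_train = rng.random((200, 2))      # placeholder screen (x, y) targets
print(grnn_predict(X_train, Y_train, rng.random(6)))
```

Because the prediction is a weighted average over all stored samples, no analytical form of the pupil-to-screen mapping is required, which is the property the abstract emphasizes.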

This paper proposes a new real-time gaze tracking method. Existing gaze tracking methods could produce incorrect results when the user moved the head even slightly, and required a calibration procedure for each user. The proposed method uses infrared illumination and a Generalized Regression Neural Network (GRNN) to achieve robust and accurate gaze tracking without calibration, even under large head movement. With the GRNN, the mapping can be carried out smoothly, head movement is properly reflected in the gaze mapping function so that tracking remains possible while the face moves, and because the mapping function generalizes, the per-user calibration step can be omitted and gaze tracking works even for users who did not take part in the training. Experimental results show an average gaze tracking accuracy of 90% under head movement and 85% for users not included in the training.
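The active IR illumination mentioned in both abstracts is commonly arranged as alternating on-axis and off-axis light sources, producing bright-pupil and dark-pupil frames whose difference isolates the pupils before the pupil parameters are extracted. The OpenCV sketch below is an illustrative reconstruction under that assumption, not the paper's implementation.

```python
import cv2

def detect_pupils(bright, dark, min_area=20, max_area=400):
    """Locate pupils from a bright-pupil / dark-pupil IR frame pair
    (on-axis vs. off-axis illumination) via image differencing.
    Expects 8-bit grayscale frames; OpenCV 4.x API assumed."""
    diff = cv2.subtract(bright, dark)                       # pupils remain as bright blobs
    diff = cv2.GaussianBlur(diff, (5, 5), 0)                # suppress sensor noise
    _, mask = cv2.threshold(diff, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    pupils = []
    for c in contours:
        if len(c) < 5:                                      # fitEllipse needs at least 5 points
            continue
        area = cv2.contourArea(c)
        if min_area <= area <= max_area:                    # reject noise and large IR reflections
            (cx, cy), (w, h), angle = cv2.fitEllipse(c)     # pupil parameters: centre, axes, angle
            pupils.append({"center": (cx, cy), "axes": (w, h), "angle": angle})
    return pupils
```

The fitted ellipse parameters (centre, axes, orientation) are the kind of pupil measurements that could feed a mapping network such as the GRNN sketched above.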

