Detection of Face Direction by Using Inter-Frame Difference



Jang, Bongseog; Bae, Sang-Hyun

  • Received : 2016.04.15
  • Reviewed : 2016.06.25
  • Published : 2016.06.30


Applying image processing techniques to education, systems have been developed that photograph a learner's face, detect facial expression and movement from the video, and estimate the learner's degree of concentration. For a single learner, such a measuring system estimates the degree of concentration from the direction of the learner's gaze and the state of the eyes. With multiple learners, the concentration level of every learner in the classroom must be measured, but assigning one camera to each learner is inefficient. In this paper, the position of the face region is estimated from video of learners in class by using the inter-frame difference together with the direction of motion, and a system is proposed that detects the face direction through face-part detection by template matching. First, frontal face detection by the Viola-Jones method is performed on the region obtained from the inter-frame difference in the first image of the video. The direction of the motion arising in the face region is then estimated from the migration length, and the face region is tracked. The face parts are detected during tracking. Finally, the face direction is estimated from the results of face tracking and face-part detection.
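The first two steps of the pipeline above (an inter-frame difference producing a motion mask, and a migration vector estimated from the shift of that mask) can be sketched in plain NumPy. This is only an illustrative sketch, not the authors' implementation; the function names and the threshold value are assumptions, and a real system would use a library such as OpenCV for the Viola-Jones detector and template matching.

```python
import numpy as np

def frame_difference(prev, curr, thresh=25):
    # Binary motion mask: pixels whose intensity changed by more than `thresh`.
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

def motion_centroid(mask):
    # Centroid (x, y) of the motion pixels, or None if nothing moved.
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return np.array([xs.mean(), ys.mean()])

def motion_direction(prev_mask, curr_mask):
    # Migration vector (dx, dy) between successive motion centroids.
    c0, c1 = motion_centroid(prev_mask), motion_centroid(curr_mask)
    if c0 is None or c1 is None:
        return np.zeros(2)
    return c1 - c0

# Synthetic demo: a bright 10x10 block moving rightward across three frames.
frames = []
for x0 in (10, 20, 30):
    f = np.zeros((100, 100), dtype=np.uint8)
    f[40:50, x0:x0 + 10] = 255
    frames.append(f)

m01 = frame_difference(frames[0], frames[1])
m12 = frame_difference(frames[1], frames[2])
dx, dy = motion_direction(m01, m12)
print(dx, dy)  # positive dx: the region migrated to the right
```

In the full method, the migration vector would steer a tracking window over the face region found by the Viola-Jones detector, and template matching of the face parts within that window would then give the face direction.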


Learner Concentration; Face Detection; Motion Direction; Inter-frame Difference


  1. P. Viola and M. Jones, "Rapid object detection using a boosted cascade of simple features", Computer Vision and Pattern Recognition, Vol. 1, pp. I511-I518, 2001.
  2. D. Comaniciu and P. Meer, "Mean shift: A robust approach toward feature space analysis", IEEE T. Pattern Anal., Vol. 24, pp. 603-619, 2002.
  3. D. Comaniciu, V. Ramesh, and P. Meer, "Kernel-based object tracking", IEEE T. Pattern Anal., Vol. 25, pp. 564-577, 2003.
  4. Y. Tsuduki, H. Fujiyoshi, and T. Kanade, "Mean shift-based point feature tracking using SIFT", J. Information Processing Society, Vol. 49, pp. 35-45, 2008.
  5. N. Oshima, T. Saitoh, and R. Konishi, "Real time mean shift tracking using optical flow distribution", SICE-ICASE, pp. 4316-4320, 2006.
  6. E. Watanabe, T. Ozeki, and T. Kohama, "Analysis of behaviors by lecturer and students in lecture based on piecewise auto-regressive modeling", Intelligent Computer Communication and Processing, pp. 385-390, 2011.
  7. K. Kunihira, K. Matsuura, and Y. Yano, "Cultivating the performance of presentation through monitoring presenter's action", Proceedings of the 18th International Conference on Computers in Education ICCE 2010, Putrajaya, Malaysia, November 29-December 03, 2010.
  8. Y. Araki, N. Shimada, and Y. Shirai, "Detection of faces of various directions and estimation of face direction in complex backgrounds", PRMU, Vol. 217, pp. 87-94, 2001.
  9. B. K. P. Horn and B. G. Schunck, "Determining optical flow", Artif. Intell., Vol. 17, pp. 185-203, 1981.


Supported by : Chosun University