Infrared Sensitive Camera Based Finger-Friendly Interactive Display System
Ghimire, Deepak; Kim, Joon-Cheol; Lee, Kwang-Jae; Lee, Joon-Whoan
In this paper we present a system that enables a user to interact with a large display without touching the screen. Two infrared-sensitive cameras are mounted at the bottom-left and bottom-right of the display, pointing upward. The fingertip position within a selected region of interest of each camera view is found from the vertical intensity profile of the background-subtracted image. The finger positions in the left and right camera images are mapped to display-screen coordinates using pre-determined matrices, which are computed by interpolating samples of the user's finger position in images captured while pointing at known coordinate positions on the display. The screen is then manipulated according to the calculated position and depth of the fingertip with respect to the display. Experimental results demonstrate efficient, robust, and stable human-computer interaction.
Keywords: Interactive System; IR Sensitive Camera; Background Subtraction; Bilinear Interpolation; Moving Average
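The two processing steps the abstract describes — locating a fingertip from the vertical intensity profile of a background-subtracted camera image, and mapping an image coordinate to screen coordinates by interpolating between calibration samples — can be sketched as follows. This is a minimal illustration in Python/NumPy, assuming grayscale frames and a single four-corner calibration; the function names and the `threshold` parameter are illustrative and not taken from the paper, which uses pre-determined interpolation matrices built from many calibration samples.

```python
import numpy as np

def fingertip_column(frame, background, threshold=30):
    """Locate a fingertip in one camera's region of interest.

    Background subtraction yields a foreground mask; summing the mask
    down each column gives a vertical intensity profile, and the column
    with the strongest response is taken as the finger. The topmost
    foreground row in that column approximates the fingertip position.
    """
    diff = np.abs(frame.astype(int) - background.astype(int))
    mask = diff > threshold
    profile = mask.sum(axis=0)          # per-column foreground count
    col = int(np.argmax(profile))
    rows = np.nonzero(mask[:, col])[0]
    row = int(rows.min()) if rows.size else None
    return col, row

def bilinear_map(u, v, img_tl, img_br, screen_corners):
    """Map an image coordinate (u, v) to screen coordinates by bilinear
    interpolation between four calibration samples: the image-space
    bounding corners and the four known screen-corner positions
    (top-left, top-right, bottom-left, bottom-right).
    """
    u0, v0 = img_tl
    u1, v1 = img_br
    s = (u - u0) / (u1 - u0)            # normalized horizontal position
    t = (v - v0) / (v1 - v0)            # normalized vertical position
    (tlx, tly), (trx, try_), (blx, bly), (brx, bry) = screen_corners
    x = (1 - s) * (1 - t) * tlx + s * (1 - t) * trx \
        + (1 - s) * t * blx + s * t * brx
    y = (1 - s) * (1 - t) * tly + s * (1 - t) * try_ \
        + (1 - s) * t * bly + s * t * bry
    return x, y
```

In the full system this runs per camera; the two per-camera estimates are combined to recover depth, and the paper's moving-average smoothing would then be applied to the resulting screen coordinates to stabilize the cursor.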
 References
Mitsubishi DiamondTouch, last visited July 20, 2010.

G. D. Morrison, “A CMOS Camera-Based Man-Machine Input Device for Large-Format Interactive Displays”, ACM SIGGRAPH 2007 Courses (SIGGRAPH ’07), ACM, New York, NY, pp. 65-74.

V. I. Pavlovic, R. Sharma, T. S. Huang, “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, July 1997, pp. 677-695.

T.-S. Cheung, “Stereo Computer Vision with Multiple Cameras”, Hong Kong University of Science and Technology.

T. Fukasawa, K. Fukushi, H. Koike, “A Vision-Based Noncontact Interactive Advertisement with a Display Wall”, ICEC 2006, LNCS 4161, pp. 394-397.

I. Katz, K. Gabayan, H. Aghajan, “A Multi-touch Surface Using Multiple Cameras”, ACIVS 2007, LNCS 4678, pp. 97-108.

Y. Zhou, G. Morrison, “A Real-Time Algorithm for Finger Detection in a Camera Based Finger-Friendly Interactive Board System”, Proc. ICVS, 2007.

A. Sanghi, H. Arora, K. Gupta, V. B. Vats, “A Fingertip Detection and Tracking System as a Virtual Mouse, a Signature Input Device and an Application Selector”, Proc. of the 7th International Caribbean Conference on Devices, Circuits, and Systems, April 2008.

D.-D. Yang, L.-W. Jin, J.-X. Yin, L.-X. Zhen, J.-C. Huang, “An Effective Robust Fingertip Detection Method for Finger Writing Character Recognition System”, Proc. of the 4th International Conference on Machine Learning and Cybernetics, August 2005, pp. 4991-4996.

J. Ravikiran, K. Mahesh, S. Mahishi, R. Dheeraj, S. Sudheender, N. V. Pujari, “Finger Detection for Sign Language Recognition”, Proc. IMECS 2009, March 2009.

Academics, Perspective Transform Estimation, last visited November 11, 2010.

R. Jain, R. Kasturi, B. G. Schunck, “Machine Vision”, McGraw-Hill International Editions, pp. 382-383, 1995.

I. MacKenzie and C. Ware, “Lag as a Determinant of Human Performance in Interactive Systems”, Conference on Computer-Human Interaction, vol. 1, no. 4, pp. 356, 1994.