 Title & Authors
Motion Field Estimation Using U-disparity Map and Forward-Backward Error Removal in Vehicle Environment
Seo, Seungwoo; Lee, Gyucheol; Lee, Sangyong; Yoo, Jisang;
 Abstract
In this paper, we propose a novel motion field estimation method that uses a U-disparity map and forward-backward error removal in a vehicle environment. In general, the motion vectors in an image captured by a camera mounted on a vehicle arise from the vehicle's own movement, but they become less accurate due to the influence of the surrounding environment. In particular, it is difficult to extract accurate motion vectors on the road surface because adjacent pixels there are similar to one another. The proposed method therefore removes the road surface using a U-disparity map and computes optical flow only on the remaining region. A forward-backward error removal step is then applied to improve the accuracy of the motion vectors. Finally, we estimate the vehicle's motion by applying RANSAC (RANdom SAmple Consensus) to the acquired motion vectors and generate the motion field. Experimental results show that the proposed algorithm performs better than conventional schemes.
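
The pipeline described above can be summarized in a short sketch. The following is a minimal illustration using OpenCV and NumPy, assuming a precomputed disparity map; the helper name road_mask_from_u_disparity and the thresholds MIN_COLUMN_VOTES and FB_ERR_THRESH are hypothetical choices for illustration, not values from the paper.

import cv2
import numpy as np

FB_ERR_THRESH = 1.0       # forward-backward error tolerance in pixels (assumed value)
MIN_COLUMN_VOTES = 10     # U-disparity votes needed to treat a pixel as an obstacle (assumed value)

def road_mask_from_u_disparity(disparity, max_disp=64):
    # Build the U-disparity map: for each image column, a histogram of disparities.
    # Obstacles concentrate many votes in a single (column, disparity) bin, while
    # road-surface disparities spread across bins, so thresholding the vote count
    # removes the road and keeps the remaining portion of the scene.
    d = np.clip(disparity.astype(np.int32), 0, max_disp - 1)
    h, w = d.shape
    u_disp = np.zeros((max_disp, w), np.int32)
    for col in range(w):
        u_disp[:, col] = np.bincount(d[:, col], minlength=max_disp)
    cols = np.tile(np.arange(w), (h, 1))
    return (u_disp[d, cols] >= MIN_COLUMN_VOTES).astype(np.uint8) * 255

def estimate_motion_field(prev_gray, curr_gray, disparity):
    mask = road_mask_from_u_disparity(disparity)
    # Track corners only in the non-road region.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500, qualityLevel=0.01,
                                  minDistance=7, mask=mask)
    if pts is None:
        return None
    # Forward and backward Lucas-Kanade optical flow.
    nxt, st_f, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    back, st_b, _ = cv2.calcOpticalFlowPyrLK(curr_gray, prev_gray, nxt, None)
    # Forward-backward error: keep a point only if tracking it forward and then
    # backward returns close to its original location.
    fb_err = np.linalg.norm(pts - back, axis=2).ravel()
    ok = (st_f.ravel() == 1) & (st_b.ravel() == 1) & (fb_err < FB_ERR_THRESH)
    src, dst = pts[ok], nxt[ok]
    # Fit a global motion model with RANSAC to predict the vehicle's motion;
    # the surviving (src -> dst) vectors form the motion field.
    model, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return src, dst, model

The forward-backward check follows the idea of reference 13, and the RANSAC fit corresponds to the ego-motion prediction step described in the abstract.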
 Keywords
Motion Field Estimation; Optical Flow; Forward-Backward Error; U-disparity
 Language
Korean
 References
1.
Y. Zhang, M. Xie, and D. Tang, "A central sub-image based global motion estimation method for in-car video stabilization," in Proc. ACM SIGKDD KDD, Phuket, Thailand, pp. 204-207, Jan. 2010.

2.
K. Yamaguchi, T. Kato, and Y. Ninomiya, "Vehicle ego-motion estimation and moving object detection using a monocular camera," in Proc. Int. Conf. Pattern Recognition, Hong Kong, Hong Kong, pp. 610-613, Aug. 2006.

3.
O. Pink, F. Moosmann, and A. Bachmann, "Visual features for vehicle localization and ego-motion estimation," in Proc. IEEE Intell. Veh. Symp., Xi'an, China, pp. 254-260, Jun. 2009.

4.
G. Ligorio and A. M. Sabatini, "Extended Kalman filter-based methods for pose estimation using visual, inertial and magnetic sensors: comparative analysis and performance evaluation," Sensors, vol. 13, no. 2, pp. 1919-1941, Feb. 2013.

5.
F. J. Sharifi and M. Marey, "A Kalman-filter-based method for pose estimation in visual servoing," IEEE Trans. Robotics, vol. 26, no. 5, pp. 939-947, Oct. 2010.

6.
M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography," Commun. ACM, vol. 24, no. 6, pp. 381-395, Jun. 1981.

7.
V. Lippiello, B. Siciliano, and L. Villani, "Adaptive extended Kalman filtering for visual motion estimation of 3D objects," Control Eng. Practice, vol. 15, no. 1, pp. 123-134, Jan. 2007.

8.
W. Jang, C. Lee, and Y. Ho, "Efficient depth map generation for various stereo camera arrangements," J. KICS, vol. 37, no. 6, pp. 458-463, Jun. 2012.

9.
E. Baek and Y. Ho, "Stereo image composition using Poisson object editing," J. KICS, vol. 39, no. 8, pp. 453-458, Aug. 2014.

10.
Z. Hu and K. Uchimura, "U-V-disparity: an efficient algorithm for stereo vision based scene analysis," in Proc. IEEE Intell. Veh. Symp., Las Vegas, USA, pp. 48-54, Jun. 2005.

11.
B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," in Proc. Int. Joint Conf. Artificial Intell., Vancouver, Canada, pp. 674-679, Aug. 1981.

12.
C. Song and J. Lee, "Detection of illegal u-turn vehicles by optical flow analysis," J. KICS, vol. 39, no. 10, pp. 948-956, Oct. 2014.

13.
Z. Kalal, K. Mikolajczyk, and J. Matas, "Forward-backward error: automatic detection of tracking failures," in Proc. Int. Conf. Pattern Recognition, Istanbul, Turkey, pp. 23-26, Aug. 2010.

14.
C. Harris and M. Stephens, "A combined corner and edge detector," in Proc. Alvey Vision Conf., Manchester, UK, pp. 147-151, Aug. 1988.

15.
B. K. P. Horn and B. Schunck, "Determining optical flow," Artificial Intell., vol. 17, no. 1-3, pp. 185-203, Aug. 1981.

16.
H. C. Longuet-Higgins and K. Prazdny, "The interpretation of a moving retinal image," Proc. Royal Soc. London B, vol. 208, no. 1173, pp. 385-397, Jul. 1980.

17.
C. Keller, M. Enzweiler, and D. M. Gavrila, "A new benchmark for stereo-based pedestrian detection," in Proc. IEEE Intell. Veh. Symp., Baden-Baden, Germany, Jun. 2011.