 Title & Authors
Vision-Based Obstacle Collision Risk Estimation of an Unmanned Surface Vehicle
Woo, Joohyun; Kim, Nakwan
 
 Abstract
This paper proposes a vision-based collision risk estimation method for an unmanned surface vehicle. A robust image-processing algorithm is suggested to detect target obstacles in the vision sensor's images. Vision-based target motion analysis (TMA), which adopts a camera model and optical flow, is performed to transform visual information into target motion information. Collision risk is then calculated by a fuzzy estimator that takes target motion information and vision information as input variables. To validate the suggested collision risk estimation method, an experiment with an unmanned surface vehicle was performed.
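For illustration only (this is not the authors' implementation): a minimal sketch of the kind of fuzzy collision-risk estimator the abstract describes, which maps target-motion cues obtained from vision-based TMA to a scalar risk in [0, 1]. The input cues, membership-function shapes, thresholds, and the single rule below are all hypothetical.

```python
def ramp_up(x, a, b):
    """Saturating linear membership function: 0 below a, 1 above b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def collision_risk(closing_speed, abs_bearing_rate):
    """Toy fuzzy rule: a target that closes quickly while its bearing barely
    changes (the classic constant-bearing, decreasing-range situation) is
    high risk. Scales (2.0 m/s, 0.1 rad/s) are hypothetical."""
    closing = ramp_up(closing_speed, 0.0, 2.0)
    steady_bearing = 1.0 - ramp_up(abs_bearing_rate, 0.0, 0.1)
    # Product t-norm: both conditions must hold for the risk to be high.
    return closing * steady_bearing

# A fast-closing, constant-bearing target scores maximal risk.
print(collision_risk(2.0, 0.0))   # 1.0
# A non-closing target scores zero regardless of bearing rate.
print(collision_risk(0.0, 0.05))  # 0.0
```

A full estimator of the kind cited in references 22 and 23 would use several overlapping membership functions per input and a rule base with defuzzification; the sketch keeps only the core idea of grading and combining motion cues.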
 Keywords
computer vision; mono camera; target motion analysis; collision risk; unmanned surface vehicle
 Language
Korean
 References
1.
W. Naeem and G. W. Irwin, "An automatic collision avoidance strategy for unmanned surface vehicles," In: Li K, Li X, Ma S and Irwin GW (eds) Life system modeling and intelligent computing. Berlin: Springer Berlin Heidelberg, pp. 184-191, 2010.

2.
J. Zhao, W. G. Price, and P. A. Wilson, "Automatic collision avoidance systems: Towards 21st century," In: Annual China airport summit (ACAS), Shanghai, China, pp. 1-10, Sep. 1994.

3.
T. Statheros, G. Howells, and K. M. Maier, "Autonomous ship collision avoidance navigation concepts, technologies and techniques," Journal of Navigation, vol. 61, pp. 129-142, 2008.

4.
S. Campbell, W. Naeem, and G. W. Irwin, "A review on improving the autonomy of unmanned surface vehicles through intelligent collision avoidance manoeuvres," Annual Reviews in Control, vol. 36, no. 2, pp. 267-283, 2012.

5.
Y. Watanabe, A. J. Calise, and E. N. Johnson, "Vision-based obstacle avoidance for UAVs," In: AIAA Guidance, Navigation and Control Conference and Exhibit, Hilton Head, USA, pp. 1-11. AIAA., Aug. 2007.

6.
S. Cho, S. Huh, D. H. Shim et al., "Vision-based detection and tracking of airborne obstacles in a cluttered environment," Journal of Intelligent & Robotic Systems, vol. 69, pp. 475-488, 2013.

7.
S. G. Yun, S. E. Kang, and S. H. Ko, "Moving target indication using an image sensor for small UAVs," Journal of Institute of Control, Robotics and Systems (in Korean), vol. 20, no. 12, pp. 1189-1195, Dec. 2014.

8.
D. H. Kim, D. H. Lee, H. Myung, and H. T. Choi, "Vision-based localization for AUVs using weighted template matching in a structured environment," Journal of Institute of Control, Robotics and Systems (in Korean), vol. 19, no. 8, pp. 667-675, Aug. 2013.

9.
H. Wang, Z. Wei, S. Wang et al., "A vision-based obstacle detection system for unmanned surface vehicle," In: 2011 IEEE Conference on Robotics, Automation and Mechatronics (RAM), Qingdao, China, pp. 364-369, Sep. 2011.

10.
J. H. Park, J. W. Kim, and N. S. Son, "Passive target tracking of marine traffic ships using an onboard monocular camera for an unmanned surface vessel," Electronics Letters, vol. 51, no. 13, pp. 987-989, Jun. 2015.

11.
J. H. Woo and N. W. Kim, "Vision-based target motion analysis and collision avoidance of unmanned surface vehicles," Proc. of the Institution of Mechanical Engineers, Part M: Journal of Engineering for the Maritime Environment, DOI: 10.1177/1475090215605136, Sep. 2015.

12.
S. Campbell, M. Abu-Tair, and W. Naeem, "An automatic COLREGs-compliant obstacle avoidance system for an unmanned surface vehicle," Proc. of the Institution of Mechanical Engineers, Part M: Journal of Engineering for the Maritime Environment, vol. 228, no. 2, pp. 108-121, Nov. 2013.

13.
D. G. Lowe, "Object recognition from local scale-invariant features," In: Proc. of the Seventh IEEE International Conference on Computer Vision (ICCV), Kerkyra, Greece, vol. 2, pp. 1150-1157, Sep. 1999.

14.
D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, Nov. 2004.

15.
E. Rosten, R. Porter, and T. Drummond, "Faster and better: A machine learning approach to corner detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 1, pp. 105-119, Nov. 2010.

16.
E. Mair, G. D. Hager, D. Burschka, M. Suppa, and G. Hirzinger, "Adaptive and generic corner detection based on the accelerated segment test," In: Computer Vision - ECCV 2010, Springer Berlin Heidelberg, pp. 183-196, 2010.

17.
S. Leutenegger, M. Chli, and R. Y. Siegwart, "BRISK: Binary robust invariant scalable keypoints," In: 2011 IEEE International Conference on Computer Vision (ICCV), Barcelona, Spain, pp. 2548-2555, Nov. 2011.

18.
J. Heinly, E. Dunn, and J.-M. Frahm, "Comparative evaluation of binary features," In: Computer Vision - ECCV 2012, Springer Berlin Heidelberg, pp. 759-773, 2012.

19.
Z. Zhang, "Flexible camera calibration by viewing a plane from unknown orientations," In: Proc. of the Seventh IEEE International Conference on Computer Vision (ICCV), Kerkyra, Greece, vol. 1, pp. 666-673, Sep. 1999.

20.
J. Y. Bouguet, "Camera calibration toolbox for MATLAB," http://www.vision.caltech.edu/bouguetj/calibdoc/, 2004.

21.
D. Dusha, W. Boles, and R. Walker, "Attitude estimation for a fixed-wing aircraft using horizon detection and optical flow," 9th Biennial Conference of the Australian Pattern Recognition Society on Digital Image Computing Techniques and Applications, Glenelg, Australia, pp. 485-492, Dec. 2007.

22.
H. Iwasaki and K. Hara, "A fuzzy reasoning model to decide the collision avoidance action," Journal of Japan Institute of Navigation, vol. 75, pp. 69-77, 1986.

23.
K. Hasegawa and A. Kouzuki, "Automatic collision avoidance system for ships using fuzzy control," Journal of the Kansai Society of Naval Architects, pp. 205-215, Jun. 1987.