A Novel Vehicle Counting Method using Accumulated Movement Analysis

  • Lim, Seokjae (Department of Electrical and Electronics Engineering, Konkuk University)
  • Jung, Hyeonseok (Department of Electrical and Electronics Engineering, Konkuk University)
  • Kim, Wonjun (Department of Electrical and Electronics Engineering, Konkuk University)
  • Lee, Ryong (Research Data Sharing Center, Korea Institute of Science and Technology Information)
  • Park, Minwoo (Research Data Sharing Center, Korea Institute of Science and Technology Information)
  • Lee, Sang-Hwan (Research Data Sharing Center, Korea Institute of Science and Technology Information)
  • Received : 2019.11.12
  • Accepted : 2020.01.21
  • Published : 2020.01.30

Abstract

With the rapid increase in the number of vehicles, various traffic problems, e.g., car crashes and traffic congestion, frequently occur in urban road environments. To overcome such problems, intelligent transportation systems have been developed together with traffic flow analysis. The traffic flow, which can be estimated by vehicle counting, plays an important role in managing and controlling urban traffic. In this paper, we propose a novel vehicle counting method based on the predicted center of each lane. Specifically, the center of each lane is detected by using the accumulated movement of vehicles and its filtered response. The number of vehicles passing through the extracted centers is counted by checking the closest trajectories of the corresponding vehicles. Various experimental results on road CCTV videos demonstrate that the proposed method is effective for vehicle counting.
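
As a rough illustration of the accumulated-movement step described above, the following Python sketch builds a per-pixel motion map from dense optical flow, smooths it, and reads candidate lane-center columns off a horizontal profile. This is a minimal sketch under stated assumptions, not the authors' implementation: the Farneback flow call, the Gaussian smoothing, and the `row_ratio`, `smooth_sigma`, and `min_lane_gap` parameters are illustrative choices made here.

```python
# Minimal sketch (not the authors' code): estimate lane-center columns from the
# accumulated movement of vehicles in a road CCTV video.
import cv2
import numpy as np
from scipy.signal import find_peaks

def estimate_lane_centers(video_path, row_ratio=0.6, smooth_sigma=15, min_lane_gap=80):
    """Return (counting_row, lane_center_columns); all parameters are assumed values."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("cannot read video: " + video_path)
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    acc = np.zeros(prev_gray.shape, dtype=np.float64)    # accumulated movement map

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense optical flow as one possible motion cue between consecutive frames.
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        acc += np.linalg.norm(flow, axis=2)               # add per-pixel motion magnitude
        prev_gray = gray
    cap.release()

    # "Filtered response": smooth the accumulated map before reading a profile.
    acc = cv2.GaussianBlur(acc, (0, 0), smooth_sigma)

    # Horizontal profile at a fixed row; well-separated peaks are taken as rough
    # lane-center x-coordinates.
    counting_row = int(acc.shape[0] * row_ratio)
    profile = acc[counting_row]
    peaks, _ = find_peaks(profile, distance=min_lane_gap, height=0.3 * profile.max())
    return counting_row, [int(p) for p in peaks]
```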

Recently, rapid urbanization and population concentration have caused various traffic problems to occur frequently. Accordingly, studies on traffic flow analysis have been actively conducted to resolve them effectively. Traffic flow information, a key element of urban traffic management, is collected through vehicle counting. In this paper, we propose a highly accurate vehicle counting method based on lane-center points to precisely estimate traffic flow information from road CCTV videos. The proposed method first detects vehicles with a deep neural network based object detector and then locates the center point of each lane from the accumulated movement of the detected vehicles. In addition, the trajectory of each vehicle is estimated by object tracking, and traffic volume is measured precisely by comparing the distance between the estimated trajectory and each detected lane-center point. Various experimental results show that the proposed method is effective for vehicle counting.
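
To make the counting step concrete, the following sketch assigns a tracked vehicle to the lane whose estimated center column lies closest to its trajectory at the moment the trajectory crosses a counting row. It assumes per-frame `(track_id, cx, cy)` box centers from some detector–tracker pair; `update_counts`, `count_row`, and the `tol` gate are hypothetical names and values introduced here for illustration, not the authors' code.

```python
# Minimal sketch (not the authors' code): count a tracked vehicle on the lane whose
# center column is closest to its trajectory when it crosses the counting row.
from collections import defaultdict

def update_counts(frame_tracks, lane_centers_x, count_row, last_cy, counts, tol=60):
    """frame_tracks: iterable of (track_id, cx, cy) box centers for the current frame.
    last_cy: dict track_id -> previous cy.  counts: dict lane_index -> count.
    count_row and tol are assumed values chosen for illustration."""
    for tid, cx, cy in frame_tracks:
        prev_cy = last_cy.get(tid)
        last_cy[tid] = cy
        if prev_cy is None:
            continue                                    # need two points of a trajectory
        # Count once, when the trajectory crosses the counting row (top-to-bottom motion).
        if prev_cy < count_row <= cy:
            # Assign to the lane whose center column is closest to the trajectory.
            lane = min(range(len(lane_centers_x)),
                       key=lambda i: abs(lane_centers_x[i] - cx))
            if abs(lane_centers_x[lane] - cx) <= tol:   # loose gate against far-off tracks
                counts[lane] += 1

# Hypothetical usage with two tracked vehicles in the current frame:
lane_xs = [120, 260, 400]                   # lane-center columns, e.g. from the sketch above
counts, last_cy = defaultdict(int), {7: 340, 9: 360}
update_counts([(7, 118, 355), (9, 265, 348)], lane_xs, count_row=350,
              last_cy=last_cy, counts=counts)
print(dict(counts))                         # {0: 1}: track 7 crossed row 350 in lane 0
```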
