
Separation of Touching Pigs using YOLO-based Bounding Box


  • Seo, J. (Dept. of Computer Convergence Software, Korea University) ;
  • Ju, M. (Dept. of Computer Convergence Software, Korea University) ;
  • Choi, Y. (Dept. of Computer Convergence Software, Korea University) ;
  • Lee, J. (Dept. of Computer Convergence Software, Korea University) ;
  • Chung, Y. (Dept. of Computer Convergence Software, Korea University) ;
  • Park, D. (Dept. of Computer Convergence Software, Korea University)
  • Received : 2018.01.12
  • Accepted : 2018.01.22
  • Published : 2018.02.28

Abstract

Although the separation of touching pigs in real time is an important issue for a 24-h pig monitoring system, it is challenging to accurately separate touching pigs in a crowded pig room. In this study, we propose a separation method for touching pigs using information generated by a Convolutional Neural Network (CNN). In particular, we apply one of the CNN-based object detection methods (i.e., You Only Look Once, YOLO) to solve the touching-object separation problem in an active manner. First, we evaluate and select the bounding boxes generated by YOLO, and then separate the touching pigs by analyzing the relations between the selected bounding boxes. Our experimental results show that the proposed method is more effective than widely used methods for separating touching pigs, in terms of both accuracy and execution time.
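The two-step procedure outlined in the abstract, selecting bounding boxes from the detector output and then analyzing the relations between the selected boxes, can be sketched as below. This is a minimal illustration only: the box format, the confidence and overlap thresholds, and the function names are assumptions for the sketch, not the authors' exact algorithm or parameter values.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter) if inter else 0.0

def touching_pairs(detections, conf_thresh=0.5, iou_thresh=0.05):
    """Step 1: keep boxes whose confidence passes the threshold.
    Step 2: return index pairs whose boxes overlap, i.e. candidate
    touching pigs whose shared region would then be separated."""
    boxes = [(i, box) for i, (box, conf) in enumerate(detections)
             if conf >= conf_thresh]
    pairs = []
    for k, (i, bi) in enumerate(boxes):
        for j, bj in boxes[k + 1:]:
            if iou(bi, bj) > iou_thresh:
                pairs.append((i, j))
    return pairs

# Detections as ((x1, y1, x2, y2), confidence) tuples (assumed format).
dets = [((0, 0, 10, 10), 0.9),   # pig A
        ((8, 0, 18, 10), 0.8),   # pig B, overlapping A -> "touching"
        ((30, 30, 40, 40), 0.9), # isolated pig
        ((0, 0, 5, 5), 0.3)]     # low-confidence box, discarded
print(touching_pairs(dets))      # -> [(0, 1)]
```

In the paper itself, the relation analysis goes beyond pairwise overlap (the selected boxes are used to draw a separation boundary between the touching animals); the sketch only shows where that analysis would plug in.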

References

  1. Ministry of Agriculture, Food and Rural Affairs. http://www.mifaff.go.kr (accessed Oct. 24, 2017).
  2. B. Kim, Y. Lee, Y. Kim, T. Kim, J. Park, S. Lee, et al., Top 10 Agriculture Issues in 2017, Korea Rural Economic Institute, Focus on Agricultural Affairs, Vol. 142, pp. 1-27, 2017.
  3. A. Wongsriworaphon, B. Arnonkijpanich, and S. Pathumnakul, "An Approach Based on Digital Image Analysis to Estimate the Live Weights of Pigs in Farm Environments," Computers and Electronics in Agriculture, Vol. 115, pp. 26-33, 2015. https://doi.org/10.1016/j.compag.2015.05.004
  4. M. Oczak, K. Maschat, D. Berckmans, E. Vranken, and J. Baumgartner, "Automatic Estimation of Number of Piglets in a Pen during Farrowing, using Image Analysis," Biosystems Engineering, Vol. 151, pp. 81-89, 2016. https://doi.org/10.1016/j.biosystemseng.2016.08.018
  5. H. Baek, Y. Chung, M. Ju, Y. Chung, D. Park, and H. Kim, "Pig Segmentation using Concave Points and Edge Information," Journal of Korea Multimedia Society, Vol. 19, No. 8, pp. 1361-1370, 2016. https://doi.org/10.9717/kmms.2016.19.8.1361
  6. M. Ju, H. Baek, J. Sa, H. Kim, Y. Chung, and D. Park, "Real-Time Pig Segmentation for Individual Pig Monitoring in a Weaning Pig Room," Journal of Korea Multimedia Society, Vol. 19, No. 2, pp. 215-223, 2016. https://doi.org/10.9717/kmms.2016.19.2.215
  7. J. Choi, L. Lee, Y. Chung, and D. Park, "Individual Pig Detection Using Kinect Depth Information," KIPS Transactions on Computer and Communication Systems, Vol. 5, No. 10, pp. 319-326, 2016. https://doi.org/10.3745/KTCCS.2016.5.10.319
  8. L. Lee, L. Jin, D. Park, and Y. Chung, "Automatic Recognition of Aggressive Behavior in Pigs using a Kinect Depth Sensor," Sensors, Vol. 16, No. 5, p. 631, 2016. https://doi.org/10.3390/s16050631
  9. J. Choi, L. Lee, Y. Chung, and D. Park, "Individual Pig Detection using Fast Region-based Convolutional Neural Network," Journal of Korea Multimedia Society, Vol. 20, No. 2, pp. 216-224, 2017. https://doi.org/10.9717/kmms.2017.20.2.216
  10. J. Kim, Y. Chung, Y. Choi, J. Sa, H. Kim, Y. Chung, D. Park, and H. Kim, "Depth-based Detection of Standing-Pigs in Moving Noise Environments," Sensors, Vol. 17, No. 12, p. 2757, 2017. https://doi.org/10.3390/s17122757
  11. A. Krizhevsky, I. Sutskever, and G. Hinton, "ImageNet Classification with Deep Convolutional Neural Networks," Advances in Neural Information Processing Systems, pp. 1097-1105, 2012.
  12. J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You Only Look Once: Unified, Real-Time Object Detection," Proceeding of Computer Vision and Pattern Recognition, pp. 779-788, 2016.
  13. C. Koyuncu, S. Arslan, I. Durmaz, R. Cetin-Atalay, and C. Gunduz-Demir, "Smart Markers for Watershed-Based Cell Segmentation," PLoS ONE, Vol. 7, No. 11, p. e48664, 2012. https://doi.org/10.1371/journal.pone.0048664
  14. J. Peng, Y. Chen, M. Green, S. Sabatinos, S. Forsburg, and C. Hsu, "PombeX: Robust Cell Segmentation for Fission Yeast Transillumination Images," PLoS ONE, Vol. 8, No. 12, p. e81434, 2013. https://doi.org/10.1371/journal.pone.0081434
  15. B. Singh and S. Patel, "Efficient Medical Image Enhancement using CLAHE and Wavelet Fusion," International Journal of Computer Applications, Vol. 167, No. 5, pp. 1-5, 2017.
  16. J. Canny, "A Computational Approach to Edge Detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 8, No. 6, pp. 679-698, 1986.
  17. R. Girshick, J. Donahue, T. Darrell, and J. Malik, "Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation," Proceeding of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 580-587, 2014.
  18. R. Girshick, "Fast R-CNN," Proceeding of the IEEE International Conference on Computer Vision, pp. 1440-1448, 2015.
  19. J. Kim, Y. Choi, J. Kim, Y. Chung, Y. Chung, and D. Park, "Efficient Task Distribution for Pig Monitoring Applications using OpenCL," KIPS Transactions on Computer and Communication Systems, Vol. 6, No. 10, pp. 407-414, 2017. https://doi.org/10.3745/KTCCS.2017.6.10.407

Cited by

  1. Accurate Pig Detection in Video Monitoring Environments, vol.24, pp.7, 2018, https://doi.org/10.9717/kmms.2021.24.7.890