Analysis of Flow Information-Based Knowledge Distillation Techniques in Deep Learning Models

  • Published : 2020.12.31

References

  1. A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," in 26th Annual Conf. Neural Information Process. Sys. (NIPS) 2012, Stateline, Nevada, USA, Dec. 3-8, 2012, pp. 1106-1114.
  2. C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, "Going deeper with convolutions," in Proc. 2015 IEEE Conf. Comput. Vision Pattern Recogn. (CVPR), Boston, USA, June 7-12, 2015, pp. 1-9.
  3. K. Simonyan and A. Zisserman, "Very deep convolutional networks for large-scale image recognition," in Proc. 5th Int. Conf. Learning Represent. (ICLR), San Diego, USA, May 7-9, 2015, pp. 1-14.
  4. K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," in Proc. 2016 IEEE Conf. Comput. Vision Pattern Recogn. (CVPR), Las Vegas, USA, Jun. 26-Jul. 1, 2016, pp. 1-12.
  5. G. Huang, Z. Liu, L. V. D. Maaten, and K. Weinberger, "Densely connected convolutional networks," in Proc. 2017 IEEE Conf. Comput. Vision Pattern Recogn. (CVPR), Honolulu, USA, Jul. 21-26, 2017, pp. 2261-2269.
  6. ImageNet, Large Scale Visual Recognition Challenge (ILSVRC), http://www.image-net.org/challenges/LSVRC/
  7. O. Russakovsky, J. Deng, H. Su, J. Krause, S. Satheesh, S. Ma, Z. Huang, A. Karpathy, A. Khosla, M. Bernstein, A. C. Berg, and L. Fei-Fei, "ImageNet large scale visual recognition challenge," Int. J. Comput. Vis. (IJCV), vol. 115, no. 3, pp. 211-252, 2015. https://doi.org/10.1007/s11263-015-0816-y
  8. G. Hinton, O. Vinyals, and J. Dean, "Distilling the knowledge in a neural network," arXiv preprint arXiv:1503.02531, pp. 1-19, 2015.
  9. J. Yim, D. Joo, J.-H. Bae, and J. Kim, "A gift from knowledge distillation: Fast optimization, network minimization, and transfer learning," in Proc. 2017 IEEE Conf. Comput. Vision Pattern Recogn. (CVPR), Honolulu, USA, Jul. 21-26, 2017, pp. 7130-7138.
  10. A. Romero, N. Ballas, S. E. Kahou, A. Chassang, C. Gatta, and Y. Bengio, "Fitnets: Hints for thin deep nets," in Proc. 5th Int. Conf. Learning Represent. (ICLR), San Diego, USA, May 7-9, 2015, pp. 1-13.
  11. J.-H. Bae, D. Yeo, J. Yim, N.-S. Kim, C.-S. Pyo, and J. Kim, "Densely distilled flow-based knowledge transfer in teacher-student framework for image classification," IEEE Transactions on Image Processing, vol. 29, pp. 5698-5710, 2020. https://doi.org/10.1109/tip.2020.2984362
  12. K. Kim and J.-H. Bae, "Important parameter optimized flow-based transfer learning technique supporting heterogeneous teacher network based on deep learning," Journal of KIIT, vol. 18, no. 3, pp. 21-29, 2020.
  13. I. Goodfellow, J. Pouget-Abadie, M. Mirza, et al., "Generative adversarial nets," in Advances in Neural Information Process. Sys. (NIPS), Canada, Dec. 2014, pp. 2672-2680.
  14. D. Yeo and J.-H. Bae, "Multiple flow-based knowledge transfer via adversarial networks," Electronics Letters, vol. 55, no. 18, pp. 989-992, Sept. 2019.
  15. S. Lee and B.-C. Song, "Knowledge transfer via decomposing essential information in convolutional neural networks," IEEE Transactions on Neural Networks and Learning Systems, pp. 1-12, 2020.