Korean Dependency Parsing using Pointer Networks

포인터 네트워크를 이용한 한국어 의존 구문 분석

  • Cheoneum Park (Computer Science, Kangwon National University);
  • Changki Lee (Computer Science, Kangwon National University)
  • Received : 2017.02.20
  • Accepted : 2017.05.23
  • Published : 2017.08.15


In this paper, we propose a Korean dependency parsing model based on pointer networks with multi-task learning. Multi-task learning improves performance by training two or more related problems simultaneously. Applying this method, our pointer network performs dependency parsing while jointly predicting each word's head (dependency relation) and its dependency label. We define five input criteria for morpheme-level multi-task learning of the pointer network in word-level dependency parsing, and we apply fine-tuning to further improve performance. Experimental results show that the proposed model achieves a UAS of 91.79% and an LAS of 89.48%, outperforming conventional Korean dependency parsers.
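The core idea above can be sketched as a single decoding step: an attention distribution over the encoder states "points" to the head word, while a second output head predicts the dependency label from the same decoder state, giving the multi-task setup. This is a minimal illustrative sketch, not the authors' implementation; all function names, weight shapes, and the additive attention form are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def pointer_step(dec_state, enc_states, W_enc, W_dec, v, W_label):
    """One hypothetical multi-task pointer-network step.

    dec_state:  (d,)   decoder hidden state for the current word
    enc_states: (n, d) encoder hidden states, one per word in the sentence
    Returns the pointed-to head index and a predicted label index.
    """
    # Additive (Bahdanau-style) attention scores over encoder positions;
    # the attention distribution itself is the head-selection output.
    scores = np.tanh(enc_states @ W_enc + dec_state @ W_dec) @ v
    attn = softmax(scores)                       # distribution over head candidates
    head = int(np.argmax(attn))                  # index the pointer selects

    # Second task: dependency-label prediction from decoder state + context.
    context = attn @ enc_states                  # attention-weighted context vector
    label_logits = np.concatenate([dec_state, context]) @ W_label
    label = int(np.argmax(label_logits))
    return head, label

# Toy usage with random weights (sentence of 5 words, 4 label types).
rng = np.random.default_rng(0)
n, d, h, num_labels = 5, 8, 6, 4
enc = rng.standard_normal((n, d))
dec = rng.standard_normal(d)
head, label = pointer_step(
    dec, enc,
    rng.standard_normal((d, h)), rng.standard_normal((d, h)),
    rng.standard_normal(h), rng.standard_normal((2 * d, num_labels)),
)
```

In training, both the head (attention) distribution and the label logits would receive their own cross-entropy loss, and the two losses are summed so the shared encoder/decoder learns from both tasks at once.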


Grant: (Exobrain Project, Task 1) Development of intelligence-evolving WiseQA platform technology for human knowledge augmentation services

Supported by: Institute for Information & Communications Technology Promotion (IITP)
