
A Study on Automatic Comment Generation Using Deep Learning

  • Choi, Jae-yong (Dept. of Game & Multimedia Engineering, Korea Polytechnic University) ;
  • Sung, So-yun (Dept. of Game & Multimedia Engineering, Korea Polytechnic University) ;
  • Kim, Kyoung-chul (Dept. of Game & Multimedia Engineering, Korea Polytechnic University)
  • Received : 2018.09.10
  • Accepted : 2018.10.08
  • Published : 2018.10.20

Abstract

Many recent deep learning studies have shown results in various fields that approach human-level judgment. In the game industry, active online communities and social media (SNS) have become important enough to determine whether a game succeeds. The purpose of this study is to build a system that uses deep learning to read texts that people write in online communities and on SNS, generate responses to them, and post those responses to Twitter according to a schedule. Using recurrent neural networks, we constructed models that generate comments and a comment-writing schedule, and implemented a program that feeds a selected news title to the models at the scheduled time and automatically uploads the resulting comment to Twitter. The results of this study are expected to be applicable to activating online game communities, Q&A services, and similar applications.

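To make the described pipeline concrete, the following is a minimal sketch (not the authors' code): a character-level LSTM language model, seeded with a news title, samples a comment, and the comment is posted at a scheduled time. The placeholder vocabulary, model sizes, sampling loop, and the commented-out tweepy call are all illustrative assumptions; in the paper, the comment-generation and schedule-generation models are trained separately on collected data.

```python
# Minimal sketch of the pipeline in the abstract (illustrative assumptions, not the authors' code):
# a character-level RNN generates a comment from a news title, which is posted at a scheduled time.
import time
import numpy as np
import tensorflow as tf

VOCAB = sorted(set("가나다라 abcdefghijklmnopqrstuvwxyz.!? "))  # placeholder character set
CHAR2ID = {c: i for i, c in enumerate(VOCAB)}
ID2CHAR = {i: c for c, i in CHAR2ID.items()}

def build_generator(vocab_size: int, embed_dim: int = 64, units: int = 256) -> tf.keras.Model:
    """Character-level LSTM language model: predicts the next character."""
    return tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, embed_dim),
        tf.keras.layers.LSTM(units, return_sequences=True),
        tf.keras.layers.Dense(vocab_size),
    ])

def generate_comment(model: tf.keras.Model, title: str, max_len: int = 80) -> str:
    """Use the news title as the seed sequence and sample characters one at a time."""
    ids = [CHAR2ID.get(c, 0) for c in title]
    out = []
    for _ in range(max_len):
        logits = model(np.array([ids]))[0, -1]              # logits for the next character
        next_id = int(tf.random.categorical(logits[None, :], 1)[0, 0])
        out.append(ID2CHAR[next_id])
        ids.append(next_id)
    return "".join(out)

def post_at(scheduled_epoch: float, text: str) -> None:
    """Wait until the time produced by the schedule model, then upload the comment."""
    time.sleep(max(0.0, scheduled_epoch - time.time()))
    # Posting via tweepy is an assumption; the paper only states that comments
    # are uploaded to Twitter automatically at the calculated time.
    # import tweepy
    # api = tweepy.API(tweepy.OAuth1UserHandler(KEY, SECRET, TOKEN, TOKEN_SECRET))
    # api.update_status(text)
    print("posted:", text)

if __name__ == "__main__":
    model = build_generator(len(VOCAB))   # untrained here; trained on comment data in the paper
    comment = generate_comment(model, "새 게임 업데이트 공개")
    post_at(time.time() + 5, comment)
```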

Keywords

References

  1. Geoffrey E. Hinton, Simon Osindero and Yee-Whye Teh, "A Fast Learning Algorithm for Deep Belief Nets", Neural Computation Volume 18 Issue 7, pp.1527-1554, 2006. https://doi.org/10.1162/neco.2006.18.7.1527
  2. Google, "Google Duplex: An AI System for Accomplishing Real-World Tasks Over the Phone", https://ai.googleblog.com/2018/05/duplex-ai-system-for-natural-conversation.html, 2018.
  3. OpenAI, "Dota 2", https://blog.openai.com/dota-2/, 2017.
  4. Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton, "ImageNet Classification with Deep Convolutional Neural Networks", http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks, 2012.
  5. Haoxiang Li, Zhe Lin, Xiaohui Shen, Jonathan Brandt, Gang Hua, "A Convolutional Neural Network Cascade for Face Detection", The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp.5325-5334, 2015.
  6. Junhua Mao, Wei Xu, Yi Yang, Jiang Wang, Zhiheng Huang, Alan Yuille, "Deep Captioning With Multimodal Recurrent Neural Networks (m-RNN)", https://arxiv.org/abs/1412.6632, 2015.
  7. Ian Goodfellow, Yoshua Bengio and Aaron Courville, "Deep Learning", The MIT Press, pp.363-382, 2016.
  8. Alex Graves, Abdel-rahman Mohamed, Geoffrey Hinton, "Speech Recognition with Deep Recurrent Neural Networks", https://arxiv.org/abs/1303.5778, 2013.
  9. Yong Du, Wei Wang, Liang Wang, "Hierarchical Recurrent Neural Network for Skeleton Based Action Recognition", The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp.1110-1118, 2015.
  10. "Long Short-Term Memory", Neural Computation Volume 9 Issue 8, pp.1735-1780, 1997. https://doi.org/10.1162/neco.1997.9.8.1735
  11. Oren Melamud, Omer Levy, Ido Dagan, "A Simple Word Embedding Model for Lexical Substitution", Proceedings of NAACL-HLT 2015, pp.1-7, 2015.
  12. Ian Goodfellow, Yoshua Bengio and Aaron Courville, "Deep Learning", The MIT Press, pp.385-400, 2016.
  13. Kyunghyun Cho, Bart van Merrienboer, Dzmitry Bahdanau, Yoshua Bengio, "On the Properties of Neural Machine Translation: Encoder-Decoder Approaches", https://arxiv.org/abs/1409.1259, 2014.
  14. Subhashini Venugopalan, Marcus Rohrbach, Jeffrey Donahue, Raymond Mooney, Trevor Darrell, Kate Saenko, "Sequence to Sequence - Video to Text", The IEEE International Conference on Computer Vision (ICCV), pp.4534-4542, 2015.
  15. Andrej Karpathy, "The Unreasonable Effectiveness of Recurrent Neural Networks", http://karpathy.github.io/2015/05/21/rnn-effectiveness, 2015.
  16. Andrej Karpathy, "Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch", https://github.com/karpathy/char-rnn, 2015.
  17. insikk, "Korean language requires a little different treatment when we run character level RNN", https://github.com/insikk/kor-char-rnn-tensorflow, 2017.
  18. Google, "TensorFlow Neural Machine Translation Tutorial", https://github.com/tensorflow/nmt, 2017.
  19. Rico Sennrich, Barry Haddow, Alexandra Birch, "Neural Machine Translation of Rare Words with Subword Units", https://arxiv.org/abs/1508.07909, 2016.
  20. Minh-Thang Luong, Hieu Pham, Christopher D. Manning, "Effective Approaches to Attention-based Neural Machine Translation", https://arxiv.org/abs/1508.04025, 2015.