Korean Sentiment Analysis Using Neural Networks: Based on IKEA Review Data

  • Sim, YuJeong (Graduate School of Smart Convergence, Kwangwoon University) ;
  • Yun, Dai Yeol (Department of Information and Communication Engineering, Institute of Information Technology, Kwangwoon University) ;
  • Hwang, Chi-gon (Department of Computer Engineering, Institute of Information Technology, Kwangwoon University) ;
  • Moon, Seok-Jae (Department of Artificial Intelligence, Institute of Information Technology, Kwangwoon University)
  • Received : 2021.03.21
  • Accepted : 2021.03.28
  • Published : 2021.05.31

Abstract

In this paper, we identify a suitable methodology for Korean sentiment analysis through a comparative experiment that determines which combination of embedding method and neural network model learns with the highest accuracy and the fastest speed. The embedding methods compared are a basic word embedding and Word2Vec. The models compared on IKEA review data are the representative neural network architectures CNN, RNN, LSTM, GRU, Bi-LSTM, and Bi-GRU. The experiments show that Word2Vec with Bi-GRU achieved the highest accuracy and the second fastest speed, at 94.23% accuracy in 42.30 seconds, while Word2Vec with GRU achieved the third highest accuracy and the fastest speed, at 92.53% accuracy in 26.75 seconds.
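The sketch below illustrates the best-performing combination from the abstract, a Word2Vec embedding feeding a Bi-GRU sentiment classifier, assuming gensim and Keras. The toy reviews, vector size, and hyperparameters are illustrative assumptions, not the paper's actual preprocessing or settings.

```python
# Minimal sketch: Word2Vec embeddings + Bi-GRU sentiment classifier.
# Toy data and hyperparameters are assumptions for illustration only.
import numpy as np
from gensim.models import Word2Vec
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, GRU, Dense
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Toy tokenized reviews with binary sentiment labels (1 = positive).
tokenized_reviews = [["배송", "빠르고", "좋아요"], ["품질", "별로", "실망"]]
labels = np.array([1, 0])

# 1) Train Word2Vec on the tokenized corpus (assumed 100-dim vectors).
w2v = Word2Vec(sentences=tokenized_reviews, vector_size=100, window=5,
               min_count=1, workers=1, seed=42)

# 2) Build a vocabulary index and an embedding matrix from Word2Vec.
vocab = {word: i + 1 for i, word in enumerate(w2v.wv.index_to_key)}  # 0 = padding
embedding_matrix = np.zeros((len(vocab) + 1, 100))
for word, idx in vocab.items():
    embedding_matrix[idx] = w2v.wv[word]

# 3) Convert reviews to padded index sequences.
sequences = [[vocab[w] for w in review] for review in tokenized_reviews]
x = pad_sequences(sequences, maxlen=20, padding="post")

# 4) Bi-GRU classifier on top of the (frozen) Word2Vec embeddings.
model = Sequential([
    Embedding(input_dim=len(vocab) + 1, output_dim=100,
              weights=[embedding_matrix], trainable=False),
    Bidirectional(GRU(64)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, labels, epochs=2, batch_size=2, verbose=0)
```

Swapping `Bidirectional(GRU(64))` for `GRU(64)` gives the plain GRU variant the abstract reports as fastest, at a small cost in accuracy.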

Keywords
