 Title & Authors
Neural Predictive Coding for Text Compression Using GPGPU
Kim, Jaeju; Han, Hwansoo
 
 Abstract
Several methods have previously been proposed to apply artificial neural networks to text compression. However, both the networks and the target texts were limited to small sizes by the hardware capabilities of the time. Although CPUs have continued to become faster, modern GPUs now exceed them in computational throughput by an order of magnitude, making it possible to train larger and more complex neural networks in a shorter time. This paper proposes a method that transforms the distribution of the original data with a probabilistic neural predictor, so that the predicted symbol probabilities can drive an entropy coder. Experiments were performed on a feedforward neural network and on a recurrent neural network with gated recurrent units. The recurrent model outperformed the feedforward network in both compression rate and prediction accuracy.
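
To make the approach concrete, here is a minimal sketch (not the authors' implementation) of neural predictive coding in Python with Keras, the framework cited in references 17 and 18: a network with gated recurrent units estimates the probability of the next byte given its context, and a Huffman code built from that predicted distribution assigns short codewords to the symbols the predictor rates likely. The vocabulary size, context length, layer sizes, and helper names are all illustrative assumptions.

    import heapq
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Embedding, GRU, Dense

    VOCAB = 256    # byte-level alphabet (assumption)
    CONTEXT = 64   # preceding symbols fed to the predictor (assumption)

    # Probabilistic neural predictor: P(next byte | previous CONTEXT bytes).
    model = Sequential([
        Embedding(VOCAB, 32, input_length=CONTEXT),  # bytes -> dense vectors
        GRU(256),                                    # gated recurrent units
        Dense(VOCAB, activation='softmax'),          # distribution over next byte
    ])
    model.compile(loss='sparse_categorical_crossentropy', optimizer='adam')

    def windows(data, ctx=CONTEXT):
        # Slice a byte sequence into (context, next-symbol) training pairs.
        X = np.array([data[i:i + ctx] for i in range(len(data) - ctx)])
        y = np.array([data[i + ctx] for i in range(len(data) - ctx)])
        return X, y

    def huffman_lengths(probs):
        # Code lengths of a Huffman code built from one predicted distribution;
        # an arithmetic coder could use the probabilities directly instead.
        heap = [(p, i, (i,)) for i, p in enumerate(probs) if p > 0]
        heapq.heapify(heap)
        lengths = {sym: 0 for _, sym, _ in heap}
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for sym in s1 + s2:
                lengths[sym] += 1      # merged symbols sink one level deeper
            heapq.heappush(heap, (p1 + p2, min(s1 + s2), s1 + s2))
        return lengths

    # Usage sketch: train on a byte corpus (e.g., a Project Gutenberg text),
    # then code each position with the lengths implied by the model's softmax
    # output at that position.
    # data = np.frombuffer(open('book.txt', 'rb').read(), dtype=np.uint8)
    # X, y = windows(data); model.fit(X, y, batch_size=128, epochs=10)
    # probs = model.predict(X[:1])[0]; bits = huffman_lengths(probs)[y[0]]
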
 Keywords
artificial neural network; feedforward neural network; recurrent neural network; text compression; natural language compression; Huffman coding; entropy coding; batch normalization
 Language
Korean
 References
1.
NVIDIA's Next Generation CUDA Compute Architecture: Fermi, v1.1. [Online]. Available: http://www.nvidia.com (downloaded Aug. 21, 2015)

2.
Patel, R., Zhang, Y., Mak, J., Davidson, A., & Owens, J. D., "Parallel lossless data compression on the GPU," Proc. of Innovative Parallel Computing, 2012.

3.
Huffman, David A., "A method for the construction of minimum-redundancy codes," Proc. of the IRE, Vol. 40, No. 9, pp. 1098-1101, 1952.

4.
Schmidhuber, Jürgen, et al., "Predictive Coding with Neural Nets: Application to Text Compression," Advances in Neural Information Processing Systems, pp. 1047-1054, 1995.

5.
Kim, Jaeju, and Hwansoo Han, "GPGPU-Accelerated Neural Predictive Coding for Text Compression," Proc. of the KIISE Computer Congress, 2015. (in Korean)

6.
Srivastava, Nitish, et al., "Dropout: A simple way to prevent neural networks from overfitting," The Journal of Machine Learning Research, Vol. 15, No. 1, pp. 1929-1958, 2014.

7.
Ioffe, Sergey, and Christian Szegedy, "Batch normalization: Accelerating deep network training by reducing internal covariate shift," arXiv preprint arXiv:1502.03167, 2015.

8.
He, Kaiming, et al., "Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification," Proc. of the IEEE Int'l Conf. on Computer Vision, 2015.

9.
Nair, Vinod, and Geoffrey E. Hinton, "Rectified linear units improve restricted Boltzmann machines," Proc. of the Int'l Conf. on Machine Learning, 2010.

10.
Bridle, John S., "Probabilistic interpretation of feedforward classification network outputs, with relationships to statistical pattern recognition," Neurocomputing: Algorithms, Architectures and Applications, Springer, pp. 227-236, 1990.

11.
Hinton, Geoffrey, Simon Osindero, and Yee-Whye Teh, "A fast learning algorithm for deep belief nets," Neural Computation, Vol. 18, No. 7, pp. 1527-1554, 2006.

12.
Hochreiter, Sepp, and Jürgen Schmidhuber, "Long Short-Term Memory," Neural Computation, Vol. 9, No. 8, pp. 1735-1780, 1997.

13.
Sundermeyer, Martin, Ralf Schlüter, and Hermann Ney, "LSTM Neural Networks for Language Modeling," Proc. of the INTERSPEECH, pp. 194-197, 2012.

14.
Chung, Junyoung, et al., "Empirical evaluation of gated recurrent neural networks on sequence modeling," poster presented at the NIPS Deep Learning and Representation Learning Workshop, 2014.

15.
Project Gutenberg. [Online]. Available: https://www.gutenberg.org/

16.
Deorowicz, S., "Silesia corpus," Silesian University of Technology, Poland, 2003. [Online]. Available: http://www.data-compression.info/Corpora/SilesiaCorpus/

17.
Bergstra, James, et al., "Theano: a CPU and GPU math expression compiler," Proc. of the Python for Scientific Computing Conference (SciPy), 2010.

18.
Chollet, François, "Keras," GitHub repository, 2015. [Online]. Available: https://github.com/fchollet/keras