# An Implementation of a Simple Convolutional Decoder Using Temporal Neural Networks

• Chung, Hee-Tae (Division of Digital Information Engineering, Pusan University of Foreign Studies) ;
• Kim, Kyung-Hun (Graduate School of Electronic ＆ Computer Engineering, Pusan University of Foreign Studies)
• Published : 2003.12.01

#### Abstract

Conventional multilayer feedforward artificial neural networks are very effective at spatial problems. To handle problems with time dependency, some form of memory must be built into the processing algorithm. In this paper we show how the newly proposed Serial Input Neuron (SIN) convolutional decoders can be derived, presenting a SIN approach to decoding convolutional codes that requires no supervision. As an example, we derive the SIN decoder for a convolutional code with constraint length 3. The SIN decoder is tested over a Gaussian channel, and the results are compared with those of the optimal Viterbi decoder. The decoder lends itself to efficient hardware implementation and can process codes at high speed. However, the speed of current circuits may limit the codes that can be used; with increasing circuit speeds in the future, the proposed technique may become a tempting choice for decoding convolutional codes with long constraint lengths.
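The abstract does not specify the code's generator polynomials or rate. As an illustration of the kind of code the decoder targets (constraint length 3) and of the Viterbi benchmark it is compared against, here is a minimal Python sketch of a rate-1/2 convolutional encoder using the common (7, 5) octal generators, together with a hard-decision Viterbi decoder over the 4-state trellis. The generator choice and all function names are assumptions for illustration, not the paper's implementation.

```python
# Sketch of a K=3, rate-1/2 convolutional encoder and hard-decision
# Viterbi decoder. Generators g1 = 111, g2 = 101 (octal 7, 5) are an
# assumed, commonly used choice; the paper does not state its generators.

def conv_encode(bits, g1=0b111, g2=0b101):
    """Encode a list of bits with a constraint-length-3, rate-1/2 code."""
    state = 0                # two memory bits (previous inputs)
    out = []
    for b in bits:
        reg = (b << 2) | state                     # [input, s1, s0]
        out.append(bin(reg & g1).count("1") % 2)   # parity w.r.t. g1
        out.append(bin(reg & g2).count("1") % 2)   # parity w.r.t. g2
        state = reg >> 1                           # shift input into memory
    return out

def viterbi_decode(received, g1=0b111, g2=0b101):
    """Hard-decision Viterbi decoding over the 4-state trellis."""
    n_states = 4
    INF = float("inf")
    metrics = [0] + [INF] * (n_states - 1)   # encoder starts in state 0
    paths = [[] for _ in range(n_states)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metrics = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):
            if metrics[s] == INF:
                continue
            for b in (0, 1):                 # try both input bits
                reg = (b << 2) | s
                o0 = bin(reg & g1).count("1") % 2
                o1 = bin(reg & g2).count("1") % 2
                ns = reg >> 1                # next state
                m = metrics[s] + (o0 != r[0]) + (o1 != r[1])  # Hamming metric
                if m < new_metrics[ns]:      # keep the survivor path
                    new_metrics[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metrics, paths = new_metrics, new_paths
    return paths[metrics.index(min(metrics))]
```

With free distance 5, this code lets the Viterbi decoder correct isolated channel errors, which is the behavior the SIN decoder is benchmarked against over the Gaussian channel.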

#### Keywords

Communication; Information

#### References

1. A. C. Huang and Y. F. Chiang, 'Function approximation using serial input neuron', Neurocomputing, vol. 47, pp. 85-101, 2002 https://doi.org/10.1016/S0925-2312(01)00581-1
2. M.T. Hagan, H.B. Demuth and M. Beale, Neural Network Design(PWS publishing, Boston, MA, 1996)
3. D.W. Patterson, Artificial Neural Networks: Theory and Applications(Prentice Hall, Englewood Cliffs, NJ, 1996)
4. A. C. Huang, 'SIN-based model reference adaptive control of a class of nonlinear systems', Proc. 6th Int. Conf. on Automation Technology, vol. 2, pp. 657-662, May 2000
5. G. Marcone and E. Zincolini, 'An efficient neural decoder for convolutional codes', European Trans. Telecommunications, vol. 6, no. 4, pp. 439-445, July-Aug. 1995
6. W. Rudin, Principles of Mathematical Analysis(3rd Edition, McGraw-Hill Book, New York, 1976)
7. A. Hämäläinen and J. Henriksson, 'Convolutional decoding using recurrent neural network', Proc. Int. Joint Conf. on Neural Networks, July 1999
8. J. L. Elman, 'Finding structure in time', Cognitive Science, vol. 14, pp. 179-211, 1990 https://doi.org/10.1016/0364-0213(90)90002-E
9. J.G. Proakis, Digital Communications(McGraw-Hill Book, 2nd Edition, 1989)
10. G. D. Forney, Jr., 'The Viterbi algorithm', Proc. IEEE, vol. 61(3), pp. 268-278, Mar. 1973 https://doi.org/10.1109/PROC.1973.9030
11. A. Hämäläinen and J. Henriksson, 'A recurrent neural decoder for convolutional codes', Proceedings of 1999 IEEE Int. Conf. on Communications, pp. 1305-1309, June 1999
12. A. Hämäläinen and J. Henriksson, 'Novel use of channel information in a neural convolutional decoder', Proc. Int. Joint Conf. on Neural Networks, vol. 5, pp. 337-342, 2000
13. X. Wang and B. Wicker, 'An artificial neural net Viterbi decoder', IEEE Trans. Communications, vol. 44, no. 2, 1996