• Received : 2015.09.18
  • Accepted : 2016.01.24
  • Published : 2016.05.30


Different divergence measures are suitable for different problems, so it is always desirable to develop new ones. In the present work, a new information divergence measure, exponential in nature, is introduced and characterized. Bounds of this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions: the binomial and the Poisson. A fuzzy information measure and a useful information measure corresponding to the new exponential divergence measure are also introduced.
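To illustrate the kind of numerical verification described above, the sketch below evaluates a divergence measure on a binomial distribution and a matched-mean Poisson distribution restricted to the same finite support. Since the paper's new exponential measure is not defined in this excerpt, the classical Kullback-Leibler divergence stands in for it; the parameter choices (n = 10, p = 0.3) are illustrative assumptions, and the new measure would simply replace `kl_divergence` in this setup.

```python
import math

def binomial_pmf(k, n, p):
    # P(X = k) for X ~ Binomial(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # P(Y = k) for Y ~ Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

def kl_divergence(ps, qs):
    # Kullback-Leibler divergence D(P || Q) = sum_i p_i * ln(p_i / q_i);
    # the paper's exponential measure would be substituted here.
    return sum(p * math.log(p / q) for p, q in zip(ps, qs) if p > 0)

n, p = 10, 0.3          # illustrative binomial parameters
lam = n * p             # matched-mean Poisson: lambda = np
support = range(n + 1)

P = [binomial_pmf(k, n, p) for k in support]
Q = [poisson_pmf(k, lam) for k in support]
# Renormalize the truncated Poisson so both are proper distributions on 0..n
total = sum(Q)
Q = [q / total for q in Q]

print(kl_divergence(P, Q))   # a small positive number, since P != Q
```

Any divergence measure defined on a pair of finite discrete distributions can be verified numerically in the same way, by tabulating both probability mass functions over a common support and evaluating the measure term by term.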


New exponential divergence measure; Bounds; Numerical verification; Comparison of divergence measures

