 Title & Authors
A NEW EXPONENTIAL DIRECTED DIVERGENCE INFORMATION MEASURE
JAIN, K.C.; CHHABRA, PRAPHULL
 
 Abstract
Different divergence measures suit different problems, so it is always desirable to develop new divergence measures. In the present work, a new information divergence measure, exponential in nature, is introduced and characterized. Bounds on this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions, the binomial and the Poisson. The fuzzy information measure and the useful information measure corresponding to the new exponential divergence measure are also introduced.
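
The formula for the new exponential divergence is given only in the full text and is not reproduced on this page. As a minimal sketch of the kind of numerical verification the abstract describes, the Python snippet below compares a binomial and a (truncated) Poisson distribution using the standard Kullback-Leibler divergence as a stand-in directed divergence; the paper's exponential measure would replace kl_divergence here.

    from math import comb, exp, factorial, log

    def binomial_pmf(n, p):
        # P(X = k) for k = 0..n under Binomial(n, p)
        return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

    def poisson_pmf(lam, support):
        # P(X = k) for k = 0..support under Poisson(lam),
        # renormalized on the truncated support
        raw = [exp(-lam) * lam**k / factorial(k) for k in range(support + 1)]
        total = sum(raw)
        return [x / total for x in raw]

    def kl_divergence(P, Q):
        # Directed (non-symmetric) divergence D(P||Q);
        # assumes q_i > 0 wherever p_i > 0, as holds here
        return sum(p * log(p / q) for p, q in zip(P, Q) if p > 0)

    n, p = 10, 0.3
    P = binomial_pmf(n, p)        # Binomial(10, 0.3) on {0, ..., 10}
    Q = poisson_pmf(n * p, n)     # Poisson(3), truncated to the same support
    print("D(P||Q) =", round(kl_divergence(P, Q), 6))
    print("D(Q||P) =", round(kl_divergence(Q, P), 6))  # differs: measure is directed

The two printed values differ because a directed divergence is not symmetric in its arguments, which is why the abstract distinguishes bounds in terms of symmetric and non-symmetric measures.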
 Keywords
New exponential divergence measure; Bounds; Numerical verification; Comparison of divergence measures
 Language
English
 References
1.
R.K. Bajaj and D.S. Hooda, Generalized measures of fuzzy directed divergence, total ambiguity and information improvement, Journal of Applied Mathematics, Statistics and Informatics, 6 (2010), 31-44.

2.
M.B. Bassat, f-Entropies, probability of error and feature selection, Inform. Control, 39 (1978), 227-242.

3.
M. Basseville, Distance measures for signal processing and pattern recognition, Signal Processing, 18 (1989), 349-369.

4.
A. Benveniste, M. Basseville and G. Moustakides, The asymptotic local approach to change detection and model validation, IEEE Trans. Automatic Control, AC-32 (1987), 583-592.

5.
A. Bhattacharyya, On a measure of divergence between two multinomial populations, Sankhya: The Indian Journal of Statistics (1933-1960), 7 (1946), 401-406.

6.
J.S. Bhullar, O.P. Vinocha and M. Gupta, Generalized measure for two utility distributions, Proceedings of the World Congress on Engineering, 3 (2010).

7.
D.E. Boekee and J.C.A. Van Der Lubbe, Some aspects of error bounds in feature selection, Pattern Recognition, 11 (1979), 353-360.

8.
L.M. Bregman, The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming, USSR Comput. Math. Phys., 7 (1967), 200-217.

9.
J. Burbea and C.R. Rao, On the convexity of some divergence measures based on entropy functions, IEEE Trans. on Inform. Theory, IT-28 (1982), 489-495.

10.
H.C. Chen, Statistical pattern recognition, Hayden Book Co., Rochelle Park, New Jersey, (1973).

11.
C.K. Chow and C.N. Liu, Approximating discrete probability distributions with dependence trees, IEEE Trans. Inform. Theory, 14 (1968), 462-467.

12.
I. Csiszar, Information measures: a critical survey, in: Trans. Seventh Prague Conf. on Information Theory, Academia, Prague, (1974), 73-86.

13.
I. Csiszar, Information type measures of difference of probability distributions and indirect observations, Studia Math. Hungarica, 2 (1967), 299-318.

14.
D. Dacunha-Castelle, Ecole d'Ete de Probabilites de Saint-Flour VII-1977, Springer, Berlin, Heidelberg, New York, (1978).

15.
S.S. Dragomir, A generalized f-divergence for probability vectors and applications, RGMIA Research Report Collection, 5 (2000).

16.
S.S. Dragomir, V. Gluscevic and C.E.M. Pearce, Approximation for Csiszar's f-divergence via midpoint inequalities, in: Inequality Theory and Applications, Y.J. Cho, J.K. Kim and S.S. Dragomir (Eds.), Nova Science Publishers, Inc., Huntington, New York, 1 (2001), 139-154.

17.
S.S. Dragomir, J. Sunde and C. Buse, New inequalities for Jeffreys divergence measure, Tamsui Oxford Journal of Mathematical Sciences, 16 (2000), 295-309.

18.
D.V. Gokhale and S. Kullback, Information in Contingency Tables, Marcel Dekker, New York, (1978).

19.
E. Hellinger, Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen, J. Reine Angew. Math., 136 (1909), 210-271.

20.
D.S. Hooda, On generalized measures of fuzzy entropy, Mathematica Slovaca, 54 (2004), 315-325.

21.
K.C. Jain and P. Chhabra, New series of information divergence measures and their properties, accepted for publication in Applied Mathematics and Information Sciences.

22.
K.C. Jain and R.N. Saraswat, Some new information inequalities and their applications in information theory, International Journal of Mathematics Research, 4 (2012), 295-307.

23.
K.C. Jain and A. Srivastava, On symmetric information divergence measures of Csiszar's f-divergence class, Journal of Applied Mathematics, Statistics and Informatics, 3 (2007), 85-102.

24.
H. Jeffreys, An invariant form for the prior probability in estimation problems, Proc. Roy. Soc. Lon. Ser. A, 186 (1946), 453-461.

25.
P. Jha and V.K. Mishra, Some new trigonometric, hyperbolic and exponential measures of fuzzy entropy and fuzzy directed divergence, International Journal of Scientific and Engineering Research, 3 (2012), 1-5.

26.
L. Jones and C. Byrne, General entropy criteria for inverse problems with applications to data compression, pattern classification and cluster analysis, IEEE Trans. Inform. Theory, 36 (1990), 23-30.

27.
T.T. Kadota and L.A. Shepp, On the best finite set of linear observables for discriminating two Gaussian signals, IEEE Trans. Inform. Theory, 13 (1967), 288-294.

28.
T. Kailath, The divergence and Bhattacharyya distance measures in signal selection, IEEE Trans. Comm. Technology, COM-15 (1967), 52-60.

29.
D. Kazakos and T. Cotsidas, A decision theory approach to the approximation of discrete probability densities, IEEE Trans. Pattern Anal. Machine Intell., 1 (1980), 61-67.

30.
A.N. Kolmogorov, On the approximation of distributions of sums of independent summands by infinitely divisible distributions, Sankhya, 25 (1963), 159-174.

31.
S. Kullback and R.A. Leibler, On information and sufficiency, Ann. Math. Statist., 22 (1951), 79-86.

32.
P. Kumar and A. Johnson, On a symmetric divergence measure and information inequalities, Journal of Inequalities in Pure and Applied Mathematics, 6 (2005), 1-13.

33.
P.W. Lamberti, A.P. Majtey, A. Borras, M. Casas and A. Plastino, Metric character of the quantum Jensen-Shannon divergence, Physical Review A, 77 (2008), 052311.

34.
F. Nielsen and S. Boltz, The Burbea-Rao and Bhattacharyya centroids, arXiv preprint, April (2010).

35.
M.A. Nielsen and I.L. Chuang, Quantum Computation and Quantum Information, Cambridge University Press, Cambridge, UK, (2000).

36.
F. Osterreicher, Csiszar's f-divergence: basic properties, Homepage: http://www.sbg.ac.at/mat/home.html, November 22, (2002).

37.
K. Pearson, On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling, Phil. Mag., 50 (1900), 157-172.

38.
E.C. Pielou, Ecological diversity, New York, Wiley, (1975).

39.
H.V. Poor, Robust decision design using a distance criterion, IEEE Trans. Inf. Th., IT-26 (1980), 575-587.

40.
A. Renyi, On measures of entropy and information, Proc. 4th Berkeley Symposium on Math. Statist. and Prob., 1 (1961), 547-561.

41.
M. Salicru, Measures of information associated with Csiszar's divergences, Kybernetika, 30 (1994), 563-573.

42.
R. Santos-Rodriguez, D. Garcia-Garcia and J. Cid-Sueiro, Cost-sensitive classification based on Bregman divergences for medical diagnosis, in: M.A. Wani (Ed.), Proceedings of the 8th International Conference on Machine Learning and Applications (ICMLA'09), Miami Beach, FL, USA, December 13-15, (2009), 551-556.

43.
R. Sibson, Information radius, Z. Wahrsch. Verw. Gebiete, 14 (1969), 149-160.

44.
H.C. Taneja and R.K. Tuteja, Characterization of a quantitative-qualitative measure of inaccuracy, Kybernetika, 22 (1986), 393-402.

45.
I.J. Taneja, New developments in generalized information measures, Chapter in: Advances in Imaging and Electron Physics, Ed. P.W. Hawkes, 91 (1995), 37-135.

46.
I.J. Taneja, Generalized symmetric divergence measures and inequalities, RGMIA Research Report Collection, http://rgmia.vu.edu.au, 7 (2004), Art. 9. Available online at: arXiv:math.ST/0501301 v1, 19 Jan 2005.

47.
B. Taskar, S. Lacoste-Julien and M.I. Jordan, Structured prediction, dual extragradient and Bregman projections, Journal of Machine Learning Research, 7 (2006), 1627-1653.

48.
H. Theil, Statistical decomposition analysis, Amsterdam, North-Holland, 1972.

49.
H. Theil, Economics and information theory, Amsterdam, North-Holland, 1967.

50.
K. Tumer and J. Ghosh, Estimating the Bayes error rate through classifier combining, Proceedings of 13th International Conference on Pattern Recognition, (1996), 695-699.

51.
B. Vemuri, M. Liu, S. Amari and F. Nielsen, Total Bregman divergence and its applications to DTI analysis, IEEE Transactions on Medical Imaging, (2010).