A View on Extension of Utility-Based on Links with Information Measures

  • Hoseinzadeh, A.R. (Science and Research Branch, Islamic Azad University) ;
  • Mohtashami Borzadaran, G. R. (Department of Statistics, Ferdowsi University of Mashhad) ;
  • Yari, G.H. (Iran University of Science and Technology)
  • Published : 2009.09.30

Abstract

In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). We then derive several relations between the U-relative entropy and other information measures based on a parametric family of utility functions.
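The classical measures that the U-entropy and U-relative entropy generalize can be sketched numerically. The following minimal Python illustration (not from the paper) computes the Shannon entropy and the Kullback-Leibler divergence for discrete distributions; in Friedman et al. (2007), choosing a logarithmic utility recovers these classical measures as a special case.

```python
import math

def shannon_entropy(p):
    """Shannon (1948) entropy H(p) = -sum_i p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler (1951) relative entropy D(p || q), in nats.

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(shannon_entropy(p))   # log 2, the maximum for two outcomes
print(kl_divergence(p, p))  # 0.0: the divergence of a distribution from itself
print(kl_divergence(p, q))  # strictly positive whenever p != q
```

The key properties visible here carry over to the utility-based versions: the relative entropy vanishes if and only if the two distributions coincide, and is otherwise positive.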

References

  1. Ali, S. M. and Silvey, S. D. (1966). A general class of coefficients of divergence of one distribution from another, Journal of the Royal Statistical Society, Series B, 28, 131-142
  2. Bell, D. E. (1988). One switch utility functions and a measure of risk, Management Science, 34, 1416-1424 https://doi.org/10.1287/mnsc.34.12.1416
  3. Bell, D. E. (1995). A contextual uncertainty condition for behavior under risk, Management Science, 41, 1145-1150 https://doi.org/10.1287/mnsc.41.7.1145
  4. Burbea, J. and Rao, C. R. (1982). On the convexity of some divergence measures based on entropy functions, IEEE Transactions on Information Theory, 28, 489-495 https://doi.org/10.1109/TIT.1982.1056497
  5. Conniffe, D. (2007). Generalized means of simple utility functions with risk aversion, Paper Presented at Irish Economic Association Conference
  6. Csiszár, I. (1967). Information type measures of differences of probability distributions and indirect observations, Studia Scientiarum Mathematicarum Hungarica, 2, 299-318
  7. Dragomir, S. S. (2003). On the p-Logarithmic and $\alpha$-Power divergence measures in information theory, PanAmerican Mathematical Journal, 13, 1-10
  8. Friedman, C., Huang, J. and Sandow, S. (2007). A utility-based approach to some information measures, Entropy, 9, 1-26 https://doi.org/10.3390/e9010001
  9. Friedman, C. and Sandow, S. (2003). Model performance measures for expected utility maximizing investors, International Journal of Theoretical and Applied Finance, 6, 355-401 https://doi.org/10.1142/S0219024903001918
  10. Gerber, H. U. and Pafumi, G. (1999). Utility functions: From risk theory to finance, North American Actuarial Journal, 2, 74-100
  11. Havrda, J. and Charvát, F. (1967). Quantification method of classification processes: Concept of structural $\alpha$-entropy, Kybernetika, 3, 30-35
  12. Ingersoll, J. (1987). Theory of Financial Decision Making, Rowman & Littlefield, New York
  13. Johnson, T. C. (2007). Utility functions, C2922 E
  14. Kapur, J. N. (1984). A comparative assessment of various measures of directed divergence, Advances in Management Studies, 3, 1-16 https://doi.org/10.2307/3504743
  15. Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, The Annals of Mathematical Statistics, 22, 79-86 https://doi.org/10.1214/aoms/1177729694
  16. Lin, J. (1991). Divergence measures based on the Shannon entropy, IEEE Transactions on Information Theory, 37, 145-151 https://doi.org/10.1109/18.61115
  17. Mahalanobis, P. C. (1936). On the generalized distance in statistics, In Proceedings of the National Institute of Sciences of India, 2, 49-55
  18. Pardo, L. (1993). $R_{h}^{\varphi}$ - Divergence statistics in applied categorical data analysis with stratified sampling, Utilitas Mathematica, 44, 145-164
  19. Rao, C. R. (1982). Diversity and dissimilarity coefficients: A unified approach, Theoretic Population Biology, 21, 24-43 https://doi.org/10.1016/0040-5809(82)90004-1
  20. Rényi, A. (1961). On measures of entropy and information, In Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, 1, 547-561
  21. Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379-423 https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  22. Sharma, B. D. and Mittal, D. P. (1977). New non-additive measures of relative information, Journal of Combinatorics, Information & System Sciences, 2, 122-132
  23. Shioya, H. and Da-te, T. (1995). A generalization of Lin divergence and the derivation of a new information divergence, Electronics and Communications in Japan, 78, 37-40
  24. Soofi, E. S., Ebrahimi, N. and Habibullah, M. (1995). Information distinguishability with application to analysis of failure data, Journal of the American Statistical Association, 90, 657-668 https://doi.org/10.2307/2291079
  25. Vajda, I. (1989). Theory of Statistical Inference and Information, Kluwer Academic Publishers, Dordrecht-Boston
  26. Wiener, N. (1949). Cybernetics, John Wiley & Sons, New York