A View on Extension of Utility-Based on Links with Information Measures

- Journal title : Communications for Statistical Applications and Methods
- Volume 16, Issue 5, 2009, pp.813-820
- Publisher : The Korean Statistical Society
- DOI : 10.5351/CKSS.2009.16.5.813

Title & Authors

A View on Extension of Utility-Based on Links with Information Measures

Hoseinzadeh, A. R.; Mohtashami Borzadaran, G. R.; Yari, G. H.

Abstract

In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). We then derive some relations between the U-relative entropy and other information measures based on a parametric family of utility functions.
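For orientation, the two classical quantities being generalized can be sketched numerically. The function names below are illustrative, not from the paper; the U-entropy and U-relative entropy of Friedman et al. (2007) reduce to these classical measures in the special case of a logarithmic utility function.

```python
import math

def shannon_entropy(p):
    """Shannon (1948) entropy H(p) = -sum_i p_i log p_i, in nats.

    Terms with p_i = 0 contribute nothing (0 log 0 := 0).
    """
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler (1951) information D(p || q) = sum_i p_i log(p_i / q_i).

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The uniform distribution over n outcomes maximizes entropy: H = log n.
p_uniform = [0.25] * 4
print(shannon_entropy(p_uniform))   # log 4 ≈ 1.3863

# D(p || q) vanishes if and only if p = q, and is nonnegative otherwise.
print(kl_divergence(p_uniform, p_uniform))
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

The nonnegativity of the divergence, visible in the last line, is the property that the U-relative entropy preserves for a general concave utility.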

Keywords

Shannon entropy; Kullback-Leibler information measure; utility function; expected utility maximization; U-entropy; U-relative entropy

Language

English

References

1.

Ali, S. M. and Silvey, S. D. (1966). A general class of coefficients of divergence of one distribution from another, Journal of the Royal Statistical Society, Series B, 28, 131-142

2.

Bell, D. E. (1988). One-switch utility functions and a measure of risk, Management Science, 34, 1416-1424

3.

Bell, D. E. (1995). A contextual uncertainty condition for behavior under risk, Management Science, 41, 1145-1150

4.

Burbea, J. and Rao, C. R. (1982). On the convexity of some divergence measures based on entropy functions, IEEE Transactions on Information Theory, 28, 489-495

5.

Conniffe, D. (2007). Generalized means of simple utility functions with risk aversion, Paper presented at the Irish Economics Association Conference

6.

Csiszár, I. (1967). Information-type measures of difference of probability distributions and indirect observations, Studia Scientiarum Mathematicarum Hungarica, 2, 299-318

7.

Dragomir, S. S. (2003). On the p-logarithmic and $\alpha$-power divergence measures in information theory, PanAmerican Mathematical Journal, 13, 1-10

8.

Friedman, C., Huang, J. and Sandow, S. (2007). A utility-based approach to some information measures, Entropy, 9, 1-26

9.

Friedman, C. and Sandow, S. (2003). Model performance measures for expected utility maximizing investors, International Journal of Theoretical and Applied Finance, 6, 355-401

10.

Gerber, H. U. and Pafumi, G. (1999). Utility functions: From risk theory to finance, North American Actuarial Journal, 2, 74-100

11.

Havrda, J. and Charvát, F. (1967). Quantification method of classification processes: Concept of structural $\alpha$-entropy, Kybernetika, 3, 30-35

12.

Ingersoll, J. (1987). Theory of Financial Decision Making, Rowman & Littlefield, New York

13.

Johnson, T. C. (2007). Utility functions, C2922 E

14.

Kapur, J. N. (1984). A comparative assessment of various measures of directed divergence, Advances in Management Studies, 3, 1-16

15.

Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, The Annals of Mathematical Statistics, 22, 79-86

16.

Lin, J. (1991). Divergence measures based on the Shannon entropy, IEEE Transactions on Information Theory, 37, 145-151

17.

Mahalanobis, P. C. (1936). On the generalized distance in statistics, In Proceedings of the National Institute of Sciences of India, 2, 49-55

18.

Pardo, L. (1993). $R_{h}^{\varphi}$-divergence statistics in applied categorical data analysis with stratified sampling, Utilitas Mathematica, 44, 145-164

19.

Rao, C. R. (1982). Diversity and dissimilarity coefficients: A unified approach, Theoretical Population Biology, 21, 24-43

20.

Rényi, A. (1961). On measures of entropy and information, In Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, 1, 547-561

21.

Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379-423

22.

Sharma, B. D. and Mittal, D. P. (1977). New non-additive measures of relative information, Journal of Combinatorics, Information & System Sciences, 2, 122-132

23.

Shioya, H. and Da-te, T. (1995). A generalization of Lin divergence and the derivation of a new information divergence, Electronics and Communications in Japan, 78, 37-40

24.

Soofi, E. S., Ebrahimi, N. and Habibullah, M. (1995). Information distinguishability with application to analysis of failure data, Journal of the American Statistical Association, 90, 657-668

25.

Vajda, I. (1989). Theory of Statistical Inference and Information, Kluwer Academic Publishers, Dordrecht-Boston

26.

Wiener, N. (1949). Cybernetics, John Wiley & Sons, New York