A View on Extension of Utility-Based on Links with Information Measures
Hoseinzadeh, A.R.; Mohtashami Borzadaran, G.R.; Yari, G.H.
In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). We then derive several relations between the U-relative entropy and other information measures based on a parametric family of utility functions.
Keywords: Shannon entropy; Kullback-Leibler information measure; utility function; expected utility maximization; U-entropy; U-relative entropy
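As a point of reference for the generalizations discussed above, the two classical measures can be sketched numerically. The snippet below is a minimal illustration (not from the paper) of the Shannon entropy and the Kullback-Leibler divergence for discrete distributions; the U-entropy and U-relative entropy of Friedman et al. (2007) reduce to these in the special case of logarithmic utility.

```python
# Minimal sketch of the classical measures the paper generalizes:
# Shannon entropy H(p) and Kullback-Leibler divergence D(p || q).
# Illustrative only; natural logarithms, with the convention 0 log 0 = 0.
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i log(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(shannon_entropy(p))   # log 2, the maximum for a two-point distribution
print(kl_divergence(p, p))  # 0: the divergence of a distribution from itself
print(kl_divergence(p, q))  # strictly positive when p differs from q
```

The nonnegativity of `kl_divergence`, and its vanishing exactly when the two distributions coincide, is the property that the U-relative entropy preserves for a general utility function.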
References
Ali, S. M. and Silvey, S. D. (1966). A general class of coefficients of divergence of one distribution from another, Journal of the Royal Statistical Society, Series B, 28, 131-142

Bell, D. E. (1988). One-switch utility functions and a measure of risk, Management Science, 34, 1416-1424

Bell, D. E. (1995). A contextual uncertainty condition for behavior under risk, Management Science, 41, 1145-1150

Burbea, J. and Rao, C. R. (1982). On the convexity of some divergence measures based on entropy functions, IEEE Transactions on Information Theory, 28, 489-495

Conniffe, D. (2007). Generalized means of simple utility functions with risk aversion, Paper presented at the Irish Economics Association Conference

Csiszar, I. (1967). Information-type measures of difference of probability distributions and indirect observations, Studia Scientiarum Mathematicarum Hungarica, 2, 299-318

Dragomir, S. S. (2003). On the p-logarithmic and $\alpha$-power divergence measures in information theory, PanAmerican Mathematical Journal, 13, 1-10

Friedman, C., Huang, J. and Sandow, S. (2007). A utility-based approach to some information measures, Entropy, 9, 1-26

Friedman, C. and Sandow, S. (2003). Model performance measures for expected utility maximizing investors, International Journal of Theoretical and Applied Finance, 6, 355-401

Gerber, H. U. and Pafumi, G. (1999). Utility functions: From risk theory to finance, North American Actuarial Journal, 2, 74-100

Havrda, J. and Charvat, F. (1967). Quantification method of classification processes: Concept of structural $\alpha$-entropy, Kybernetika, 3, 30-35

Ingersoll, J. (1987). Theory of Financial Decision Making, Rowman & Littlefield, New York

Johnson, T. C. (2007). Utility functions, C2922 E

Kapur, J. N. (1984). A comparative assessment of various measures of directed divergence, Advances in Management Studies, 3, 1-16

Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, The Annals of Mathematical Statistics, 22, 79-86

Lin, J. (1991). Divergence measures based on the Shannon entropy, IEEE Transactions on Information Theory, 37, 145-151

Mahalanobis, P. C. (1936). On the generalized distance in statistics, In Proceedings of the National Institute of Sciences of India, 2, 49-55

Pardo, L. (1993). $R_{h}^{\varphi}$-divergence statistics in applied categorical data analysis with stratified sampling, Utilitas Mathematica, 44, 145-164

Rao, C. R. (1982). Diversity and dissimilarity coefficients: A unified approach, Theoretical Population Biology, 21, 24-43

Renyi, A. (1961). On measures of entropy and information, In Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, 1, 547-561

Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379-423

Sharma, B. D. and Mittal, D. P. (1977). New non-additive measures of relative information, Journal of Combinatorics, Information & System Sciences, 2, 122-132

Shioya, H. and Da-te, T. (1995). A generalization of Lin divergence and the derivative of a new information divergence, Electronics and Communications in Japan, 78, 37-40

Soofi, E. S., Ebrahimi, N. and Habibullah, M. (1995). Information distinguishability with application to analysis of failure data, Journal of the American Statistical Association, 90, 657-668

Vajda, I. (1989). Theory of Statistical Inference and Information, Kluwer Academic Publishers, Dordrecht-Boston

Wiener, N. (1949). Cybernetics, John Wiley & Sons, New York