Title, Summary, Keyword: divergence (1,044 results)

Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning

  • Sugiyama, Masashi;Liu, Song;du Plessis, Marthinus Christoffel;Yamanaka, Masao;Yamada, Makoto;Suzuki, Taiji;Kanamori, Takafumi
    • Journal of Computing Science and Engineering / v.7 no.2 / pp.99-111 / 2013
  • Approximating a divergence between two probability distributions from their samples is a fundamental challenge in statistics, information theory, and machine learning. A divergence approximator can be used for various purposes, such as two-sample homogeneity testing, change-point detection, and class-balance estimation. Furthermore, an approximator of a divergence between the joint distribution and the product of marginals can be used for independence testing, which has a wide range of applications, including feature selection and extraction, clustering, object matching, independent component analysis, and causal direction estimation. In this paper, we review recent advances in divergence approximation. Our emphasis is that directly approximating the divergence without estimating probability distributions is more sensible than a naive two-step approach of first estimating probability distributions and then approximating the divergence. Furthermore, despite the overwhelming popularity of the Kullback-Leibler divergence as a divergence measure, we argue that alternatives such as the Pearson divergence, the relative Pearson divergence, and the $L^2$-distance are more useful in practice because of their computationally efficient approximability, high numerical stability, and superior robustness against outliers.
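
As context for the direct approach advocated above, the following is a minimal sketch, not the authors' implementation, of Pearson divergence approximation by least-squares density-ratio fitting in the spirit of uLSIF; the Gaussian kernel model, kernel width, regularization strength, and choice of centers are all illustrative assumptions.

```python
# Hedged sketch of direct Pearson (PE) divergence approximation via
# least-squares density-ratio estimation (uLSIF-style); the kernel model
# and hyperparameters are illustrative assumptions, not the paper's.
import numpy as np

def pearson_divergence(xp, xq, sigma=1.0, lam=0.1):
    """Estimate PE(p||q) = 0.5 * E_q[(p/q - 1)^2] from samples xp ~ p, xq ~ q."""
    centers = xp[:min(100, len(xp))]             # kernel centers from p-samples

    def phi(x):                                  # Gaussian kernel design matrix
        d2 = (x[:, None] - centers[None, :]) ** 2
        return np.exp(-d2 / (2.0 * sigma ** 2))

    Phi_p, Phi_q = phi(xp), phi(xq)
    H = Phi_q.T @ Phi_q / len(xq)                # approximates E_q[phi phi^T]
    h = Phi_p.mean(axis=0)                       # approximates E_p[phi]
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)

    r_p = Phi_p @ theta                          # fitted ratio p/q at p-samples
    r_q = Phi_q @ theta                          # fitted ratio p/q at q-samples
    # PE = 0.5 * E_p[r] - E_q[r] + 0.5, estimated by sample means
    return 0.5 * r_p.mean() - r_q.mean() + 0.5

# Example: PE between N(0,1) and N(0.5,1) should come out clearly positive.
rng = np.random.default_rng(0)
print(pearson_divergence(rng.normal(0.0, 1.0, 500), rng.normal(0.5, 1.0, 500)))
```

Note that the ratio p/q is fitted in one step, without ever estimating p or q themselves, which is precisely the "direct" strategy the survey argues for.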

A New Exponential Directed Divergence Information Measure

  • Jain, K.C.;Chhabra, Praphull
    • Journal of Applied Mathematics & Informatics / v.34 no.3_4 / pp.295-308 / 2016
  • Depending upon the nature of the problem, different divergence measures are suitable, so it is always desirable to develop new divergence measures. In the present work, a new information divergence measure, which is exponential in nature, is introduced and characterized. Bounds of this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions: binomial and Poisson. A fuzzy information measure and a useful information measure corresponding to the new exponential divergence measure are also introduced.
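
The abstract does not reproduce the paper's exponential measure, so only the classical directed divergence that such measures generalize is recalled here for orientation:

```latex
% Classical (Kullback-Leibler) directed divergence between discrete
% distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); the paper's
% new measure is an exponential analogue whose exact form is not given
% in the abstract above.
D(P \| Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i} \ge 0,
\qquad \text{with equality iff } P = Q.
```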

A Study on the Formation of Interior Space in Louis I. Kahn's Architecture (루이스 칸 건축의 내부공간 형성에 관한 연구)

  • Yoon, Dong-Sik
    • Korean Institute of Interior Design Journal / v.17 no.5 / pp.23-30 / 2008
  • This thesis aims to analyze the visual perceptual effects produced by 'axial composition and divergence' and to interpret Kahn's architecture in those terms. Axial composition of the form, the location of the entrance, and the divergence of internal movement were examined in 53 works by extracting the parti, the basic element of spatial composition. A 3D modeling simulation was performed for 10 selected works in order to analyze the visual perceptual effect of the divergence of internal movement. The observer's actions and visual perception respond to 'axial composition and divergence' in the following steps: 1. divergence at the entrance/a panorama of expanding planes; 2. divergence of internal movement/the perception of visual rotation and the central spatial form. The 'perceptive form' created by 'divergence' is the result of a diverse and flexible series of processes that must be experienced in person in order to reach the space as a room with a definite domain and center.

A Kullback-Leibler Divergence-based Spectrum Sensing Scheme for Cognitive Radio Systems (무선인지시스템을 위한 Kullback-Leibler Divergence 기반의 스펙트럼 센싱 기법)

  • Thuc, Kieu-Xuan;Koo, In-Soo
    • Journal of Internet Computing and Services / v.13 no.1 / pp.1-6 / 2012
  • In this paper, an information divergence called the Kullback-Leibler divergence, which measures the average logarithmic difference between two probability density functions, is utilized to derive a novel method for spectrum sensing in cognitive radio systems. In the proposed sensing method, we test whether the observed samples are drawn from the noise distribution by using the Kullback-Leibler divergence. Numerical results show that, under the same conditions, the proposed Kullback-Leibler divergence-based spectrum sensing always significantly outperforms energy detection-based spectrum sensing, especially in the low-SNR regime and under fading.
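
A hedged sketch of the general idea, not the paper's exact statistic or threshold rule: bin the observed samples, form an empirical distribution, and declare a signal present when its Kullback-Leibler divergence from the known noise distribution exceeds a threshold. The bin count, threshold, noise model, and test signal below are illustrative assumptions.

```python
# KL-divergence-based sensing sketch: compare the empirical distribution of
# the observed samples against the assumed noise distribution.  Bin count,
# threshold, and the Gaussian noise model are illustrative assumptions.
import numpy as np
from scipy import stats

def kl_sensing(samples, noise_std=1.0, bins=32, threshold=0.05):
    edges = np.linspace(-5 * noise_std, 5 * noise_std, bins + 1)
    counts, _ = np.histogram(samples, bins=edges)
    p_hat = counts.astype(float) + 1e-9                  # smoothed empirical pmf
    p_hat /= p_hat.sum()
    q = np.diff(stats.norm.cdf(edges, scale=noise_std))  # noise pmf, same bins
    q = q / q.sum()
    kl = float(np.sum(p_hat * np.log(p_hat / q)))        # D(p_hat || q)
    return kl, kl > threshold                            # True => signal present

rng = np.random.default_rng(1)
noise_only = rng.normal(0.0, 1.0, 2000)
signal = 2.0 * np.sin(np.linspace(0.0, 60.0, 2000)) + rng.normal(0.0, 1.0, 2000)
print(kl_sensing(noise_only))   # small divergence: decide noise only (H0)
print(kl_sensing(signal))       # large divergence: decide signal present (H1)
```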

Void Formation Induced by the Divergence of the Diffusive Ionic Fluxes in Metal Oxides Under Chemical Potential Gradients

  • Maruyama, Toshio;Ueda, Mitsutoshi
    • Journal of the Korean Ceramic Society / v.47 no.1 / pp.8-18 / 2010
  • When metal oxides are exposed to chemical potential gradients, ions undergo diffusive mass transport. During this transport process, the divergence of the ionic fluxes governs the formation/annihilation of oxide, so the divergence of the ionic flux may play an important role in void formation in oxides. Kinetic equations were derived that describe the chemical potential distribution, the ionic fluxes, and their divergence in oxides. The divergence was found to be a measure of void formation. The defect chemistry of the scale is directly related to the sign of the divergence and gives an indication of the void formation behavior. The quantitative estimation of void formation was successfully applied to a growing magnetite scale in the high-temperature oxidation of iron at 823 K.
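
The role of the flux divergence can be summarized by the continuity equation; the sign convention below is the standard one and is an editorial addition, not quoted from the paper.

```latex
% Continuity equation for an ionic species with concentration c and flux J:
\frac{\partial c}{\partial t} = -\,\nabla \cdot \mathbf{J}.
% Where \nabla \cdot \mathbf{J} > 0, more material leaves a volume element
% than enters, so oxide is locally annihilated and voids may nucleate;
% \nabla \cdot \mathbf{J} < 0 corresponds to local oxide formation.
```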

Some New Measures of Fuzzy Directed Divergence and Their Generalization

  • Parkash, Om;Sharma, P.K.
    • The Pure and Applied Mathematics / v.12 no.4 / pp.307-315 / 2005
  • There exist many measures of fuzzy directed divergence corresponding to existing probabilistic measures. Some new measures of fuzzy divergence are proposed here which correspond to well-known probabilistic measures. The essential properties of the proposed measures are developed, and their generalization contains many existing measures of fuzzy directed divergence as special cases.
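
The abstract does not reproduce the new measures; for orientation, one well-known fuzzy directed divergence of the kind being generalized is the Bhandari-Pal measure, the fuzzy analogue of Kullback-Leibler divergence, for fuzzy sets A and B with membership functions $\mu_A$ and $\mu_B$:

```latex
% Bhandari-Pal fuzzy directed divergence between fuzzy sets A and B on
% the universe {x_1, ..., x_n} (an example of the existing measures the
% paper generalizes; its new measures are not given in the abstract).
D(A, B) = \sum_{i=1}^{n} \left[
  \mu_A(x_i) \log \frac{\mu_A(x_i)}{\mu_B(x_i)}
  + \bigl(1 - \mu_A(x_i)\bigr) \log \frac{1 - \mu_A(x_i)}{1 - \mu_B(x_i)} \right].
```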

Shadowing, Expansiveness and Stability of Divergence-Free Vector Fields

  • Ferreira, Celia
    • Bulletin of the Korean Mathematical Society / v.51 no.1 / pp.67-76 / 2014
  • Let X be a divergence-free vector field defined on a closed, connected Riemannian manifold. In this paper, we show the equivalence between the following conditions:
    • X is a divergence-free vector field satisfying the shadowing property.
    • X is a divergence-free vector field satisfying the Lipschitz shadowing property.
    • X is an expansive divergence-free vector field.
    • X has no singularities and is Anosov.
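
For readers outside dynamics, the defining property is recalled below; this is a standard fact, added editorially, not part of the paper's statement.

```latex
% A vector field X on a Riemannian manifold with volume form \mu is
% divergence-free when \operatorname{div} X = 0; equivalently, its flow
% X^t preserves volume (Liouville's theorem):
\operatorname{div} X = 0 \iff (X^t)^{*}\mu = \mu \quad \text{for all } t.
```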

Automatic Selection of the Tuning Parameter in the Minimum Density Power Divergence Estimation

  • Hong, Changkon;Kim, Youngseok
    • Journal of the Korean Statistical Society / v.30 no.3 / pp.453-465 / 2001
  • It is often the case that one wants to estimate the parameters of a distribution that follows a certain parametric model while the data are contaminated. It is well known that maximum likelihood estimators are not robust to contamination. Basu et al. (1998) proposed a robust method called minimum density power divergence estimation. In this paper, we investigate data-driven selection of the tuning parameter $\alpha$ in minimum density power divergence estimation. A criterion is proposed and its performance is studied through simulation covering three estimation problems.
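
A minimal sketch, not the paper's procedure, of the minimum density power divergence estimator of Basu et al. (1998) for a normal model with a fixed $\alpha$; the paper's data-driven criterion for choosing $\alpha$ is not reproduced here.

```python
# MDPDE sketch for N(mu, sigma^2) with fixed tuning parameter alpha.
# Minimizes H_n(theta) = int f_theta^(1+a) dx - (1 + 1/a) * mean(f_theta(x_i)^a),
# dropping an additive constant that does not depend on theta.
import numpy as np
from scipy import stats, optimize

def mdpde_normal(x, alpha=0.5):
    def objective(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)                 # keeps sigma > 0
        # Closed form for the normal model: int N(.; mu, sigma^2)^(1+a) dx
        integral = (2.0 * np.pi * sigma ** 2) ** (-alpha / 2) / np.sqrt(1 + alpha)
        dens = stats.norm.pdf(x, loc=mu, scale=sigma)
        return integral - (1.0 + 1.0 / alpha) * np.mean(dens ** alpha)
    res = optimize.minimize(objective, x0=[np.median(x), np.log(x.std())])
    return res.x[0], np.exp(res.x[1])

# With 10% gross outliers, the MDPDE stays near the bulk parameters (0, 1),
# whereas the MLE (the alpha -> 0 limit) is pulled toward the contamination.
rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(0.0, 1.0, 180), rng.normal(10.0, 1.0, 20)])
print(mdpde_normal(data, alpha=0.5))
```

Larger $\alpha$ buys more robustness at some cost in efficiency, which is why the choice of $\alpha$ is worth automating.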

Inequalities for Quantum f-Divergence of Convex Functions and Matrices

  • Dragomir, Silvestru Sever
    • Korean Journal of Mathematics / v.26 no.3 / pp.349-371 / 2018
  • Some inequalities for the quantum f-divergence of matrices are obtained. It is shown that the quantum f-divergence is nonnegative for normalised convex functions. Some upper bounds for the quantum f-divergence in terms of the variational and $\chi^2$-distances are provided. Applications to some classes of divergence measures, such as the Umegaki and Tsallis relative entropies, are also given.
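
For orientation, the classical scalar case that the quantum (matrix) f-divergence generalizes is recalled below, together with the one-line convexity argument behind the nonnegativity statement; the matrix definition itself is not reproduced here.

```latex
% Classical f-divergence between discrete distributions P and Q:
D_f(P \| Q) = \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right).
% For convex f normalised so that f(1) = 0, Jensen's inequality gives
D_f(P \| Q) \ge f\!\left(\sum_i q_i \cdot \frac{p_i}{q_i}\right) = f(1) = 0.
```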

Minimum Density Power Divergence Estimation for Normal-Exponential Distribution (정규-지수분포에 대한 최소밀도함수승간격 추정법)

  • Pak, Ro Jin
    • The Korean Journal of Applied Statistics / v.27 no.3 / pp.397-406 / 2014
  • Minimum density power divergence estimation has been a popular topic in the field of robust estimation since Basu et al. (1998). The minimum density power divergence estimator has strong robustness properties with little loss of asymptotic efficiency relative to the maximum likelihood estimator under model conditions. However, a limitation in applying this estimation method is the algebraic difficulty of an integral involved in the estimating function. This paper considers a minimum density power divergence estimation method based on an approximated divergence that avoids this difficulty. As an example, we consider the normal-exponential convolution model introduced by Bolstad (2004). The estimated divergence in this case is too complicated, so a Laplace approximation is employed to obtain a manageable form. Simulations and an empirical study show that minimum density power divergence estimators based on an approximated divergence perform adequately for the normal-exponential model in terms of bias and efficiency.
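
For reference, the divergence being minimized is the density power divergence of Basu et al. (1998); the $\int f^{1+\alpha}$ term is the integral that lacks a closed form for the normal-exponential convolution model, which is what the Laplace approximation targets.

```latex
% Density power divergence between the data density g and model density f,
% with tuning parameter alpha > 0; it tends to the Kullback-Leibler
% divergence KL(g || f) as alpha -> 0.
d_\alpha(g, f) = \int \Big\{ f^{1+\alpha}(x)
  - \Big(1 + \tfrac{1}{\alpha}\Big) g(x)\, f^{\alpha}(x)
  + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \Big\}\, dx.
```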