TY - BOOK
ID - 134271387
TI - Divergence Measures : Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
PY - 2022
PB - MDPI - Multidisciplinary Digital Publishing Institute
CY - Basel
DB - UniCat
KW - Research & information: general
KW - Mathematics & science
KW - Bregman divergence
KW - f-divergence
KW - Jensen–Bregman divergence
KW - Jensen diversity
KW - Jensen–Shannon divergence
KW - capacitory discrimination
KW - Jensen–Shannon centroid
KW - mixture family
KW - information geometry
KW - difference of convex (DC) programming
KW - conditional Rényi divergence
KW - horse betting
KW - Kelly gambling
KW - Rényi divergence
KW - Rényi mutual information
KW - relative entropy
KW - chi-squared divergence
KW - f-divergences
KW - method of types
KW - large deviations
KW - strong data-processing inequalities
KW - information contraction
KW - maximal correlation
KW - Markov chains
KW - information inequalities
KW - mutual information
KW - Rényi entropy
KW - Carlson–Levin inequality
KW - information measures
KW - hypothesis testing
KW - total variation
KW - skew-divergence
KW - convexity
KW - Pinsker’s inequality
KW - Bayes risk
KW - statistical divergences
KW - minimum divergence estimator
KW - maximum likelihood
KW - bootstrap
KW - conditional limit theorem
KW - Bahadur efficiency
KW - α-mutual information
KW - Augustin–Csiszár mutual information
KW - data transmission
KW - error exponents
KW - dimensionality reduction
KW - discriminant analysis
KW - statistical inference
UR - https://www.unicat.be/uniCat?func=search&query=sysid:134271387
AB - Data science, information theory, probability theory, statistical learning, and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
ER -