
Listing 1 - 3 of 3

Book
Fluctuation Relations and Nonequilibrium Thermodynamics in Classical and Quantum Systems
Author:
Year: 2020. Publisher: Basel, Switzerland: MDPI - Multidisciplinary Digital Publishing Institute.


Abstract

This Special Issue contains novel results in the area of out-of-equilibrium classical and quantum thermodynamics. Contributions are from different areas of physics, including statistical mechanics, quantum information and many-body systems.
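As background for the title (a standard textbook example, not a result quoted from this volume): a prototypical fluctuation relation is the Jarzynski equality, which links the work W performed in repeated nonequilibrium realizations of a driving protocol to the equilibrium free-energy difference ΔF,

\[ \langle e^{-\beta W} \rangle = e^{-\beta \Delta F}, \]

where β = 1/(k_B T) and the average runs over realizations of the protocol.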


Book
Robust Procedures for Estimating and Testing in the Framework of Divergence Measures
Authors: ---
Year: 2021. Publisher: Basel, Switzerland: MDPI - Multidisciplinary Digital Publishing Institute.


Abstract

The contributions to this book present new and original research based on the MPHIE, MHD, and MDPDE estimators, as well as test statistics built on these estimators, treated from both theoretical and applied points of view across different statistical problems, with special emphasis on robustness. Manuscripts offering solutions to different statistical problems, such as model selection criteria based on divergence measures or statistics for high-dimensional data with divergence measures as the loss function, are included. Reviews emphasizing the most recent state of the art in solving statistical problems by means of divergence measures are also presented.
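For orientation (a standard definition assumed here, not text from the book): the MDPDE is conventionally defined as the minimizer, over the model parameter θ, of the density power divergence between the data-generating density g and the model density f_θ,

\[ d_\alpha(g, f_\theta) = \int \Big\{ f_\theta^{1+\alpha}(x) - \big(1 + \tfrac{1}{\alpha}\big)\, g(x)\, f_\theta^{\alpha}(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \Big\}\, dx, \qquad \alpha > 0, \]

with g replaced in practice by the empirical distribution (the θ-independent term drops out); letting α → 0 recovers the maximum likelihood (Kullback-Leibler) case, while larger α trades efficiency for robustness.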


Book
Divergence Measures : Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
Author:
Year: 2022. Publisher: Basel: MDPI - Multidisciplinary Digital Publishing Institute.


Abstract

Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
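For reference, the standard definitions of the quantities named above, stated here for discrete distributions P and Q (textbook forms, not text from the Special Issue):

\[ D(P\|Q) = \sum_x P(x)\, \log \frac{P(x)}{Q(x)}, \qquad D_\alpha(P\|Q) = \frac{1}{\alpha - 1}\, \log \sum_x P(x)^{\alpha}\, Q(x)^{1-\alpha}, \]

\[ D_f(P\|Q) = \sum_x Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right), \quad f \text{ convex}, \; f(1) = 0. \]

The Rényi divergence D_α reduces to the relative entropy D as α → 1, and the choice f(t) = t log t recovers D as an f-divergence.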

Keywords

Bregman divergence --- f-divergence --- Jensen–Bregman divergence --- Jensen diversity --- Jensen–Shannon divergence --- capacitory discrimination --- Jensen–Shannon centroid --- mixture family --- information geometry --- difference of convex (DC) programming --- conditional Rényi divergence --- horse betting --- Kelly gambling --- Rényi divergence --- Rényi mutual information --- relative entropy --- chi-squared divergence --- f-divergences --- method of types --- large deviations --- strong data-processing inequalities --- information contraction --- maximal correlation --- Markov chains --- information inequalities --- mutual information --- Rényi entropy --- Carlson–Levin inequality --- information measures --- hypothesis testing --- total variation --- skew-divergence --- convexity --- Pinsker's inequality --- Bayes risk --- statistical divergences --- minimum divergence estimator --- maximum likelihood --- bootstrap --- conditional limit theorem --- Bahadur efficiency --- α-mutual information --- Augustin–Csiszár mutual information --- data transmission --- error exponents --- dimensionality reduction --- discriminant analysis --- statistical inference
