
Book
Differential Geometrical Theory of Statistics
Authors: ---
ISBN: 3038424242 3038424250 Year: 2017 Publisher: MDPI - Multidisciplinary Digital Publishing Institute


Abstract

This Special Issue, "Differential Geometrical Theory of Statistics", collates selected invited and contributed talks presented at the GSI'15 conference on "Geometric Science of Information", held at the Ecole Polytechnique, Paris-Saclay Campus, France, in October 2015 (conference website: http://www.see.asso.fr/gsi2015).


Book
Information Geometry
Author:
Year: 2019 Publisher: MDPI - Multidisciplinary Digital Publishing Institute

Loading...
Export citation

Choose an application

Bookmark

Abstract

This Special Issue of the journal Entropy, titled “Information Geometry I”, contains a collection of 17 papers concerning the foundations and applications of information geometry. Based on a geometrical interpretation of probability, information geometry has become a rich mathematical field employing the methods of differential geometry. It has numerous applications to data science, physics, and neuroscience. Presenting original research, yet written in an accessible, tutorial style, this collection of papers will be useful for scientists who are new to the field, while providing an excellent reference for the more experienced researcher. Several papers are written by authorities in the field, and topics cover the foundations of information geometry, as well as applications to statistics, Bayesian inference, machine learning, complex systems, physics, and neuroscience.
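
For orientation, a standard definition from the field rather than a quotation from the volume: the geometrical interpretation referred to above usually rests on the Fisher information metric, which turns a parametric family of probability distributions p(x; θ) into a Riemannian manifold:

g_{ij}(\theta) = \mathbb{E}_{x \sim p(x;\theta)}\!\left[ \frac{\partial \log p(x;\theta)}{\partial \theta^i} \, \frac{\partial \log p(x;\theta)}{\partial \theta^j} \right].

Distances in this metric quantify how statistically distinguishable nearby members of the family are.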


Book
Divergence Measures : Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
Author:
Year: 2022 Publisher: Basel MDPI - Multidisciplinary Digital Publishing Institute


Abstract

Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and focuses on the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that readers will find this Special Issue of interest and that it will stimulate further research into the mathematical foundations and applications of divergence measures.
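
As background, the quantities named above admit standard definitions (stated here for orientation, not quoted from the volume). For probability measures P and Q with densities p and q,

D(P \| Q) = \int p \log \frac{p}{q}, \qquad D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \int p^{\alpha} q^{1-\alpha}, \qquad D_f(P \| Q) = \int q \, f\!\left(\frac{p}{q}\right),

where f is convex with f(1) = 0. The relative entropy D is recovered from the Rényi divergence D_\alpha in the limit \alpha \to 1, and as the f-divergence with f(t) = t \log t.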

Keywords

Research & information: general --- Mathematics & science --- Bregman divergence --- f-divergence --- Jensen-Bregman divergence --- Jensen diversity --- Jensen-Shannon divergence --- capacitory discrimination --- Jensen-Shannon centroid --- mixture family --- information geometry --- difference of convex (DC) programming --- conditional Rényi divergence --- horse betting --- Kelly gambling --- Rényi divergence --- Rényi mutual information --- relative entropy --- chi-squared divergence --- f-divergences --- method of types --- large deviations --- strong data-processing inequalities --- information contraction --- maximal correlation --- Markov chains --- information inequalities --- mutual information --- Rényi entropy --- Carlson-Levin inequality --- information measures --- hypothesis testing --- total variation --- skew-divergence --- convexity --- Pinsker's inequality --- Bayes risk --- statistical divergences --- minimum divergence estimator --- maximum likelihood --- bootstrap --- conditional limit theorem --- Bahadur efficiency --- α-mutual information --- Augustin-Csiszár mutual information --- data transmission --- error exponents --- dimensionality reduction --- discriminant analysis --- statistical inference


Book
Information Theory and Machine Learning
Authors: ---
ISBN: 3036553088 303655307X Year: 2022 Publisher: MDPI - Multidisciplinary Digital Publishing Institute


Abstract

The recent successes of machine learning, especially regarding systems based on deep neural networks, have encouraged further research activities and raised a new set of challenges in understanding and designing complex machine learning algorithms. New applications require learning algorithms to be distributed, have transferable learning results, use computation resources efficiently, converge quickly in online settings, have performance guarantees, satisfy fairness or privacy constraints, incorporate domain knowledge on model structures, etc. A new wave of developments in statistical learning theory and information theory has set out to address these challenges. This Special Issue, "Machine Learning and Information Theory", aims to collect recent results in this direction, reflecting a diverse spectrum of visions and efforts to extend conventional theories and develop analysis tools for these complex machine learning systems.
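
As a small, concrete illustration of the kind of quantity these analyses build on (a generic sketch, not code from the volume), the relative entropy between two discrete distributions can be computed in a few lines of Python:

import numpy as np

def relative_entropy(p, q):
    """D(P||Q) in nats for discrete distributions given as arrays of weights."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()   # normalize to probability vectors
    mask = p > 0                      # terms with p = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a fair coin versus a heavily biased one
print(relative_entropy([0.5, 0.5], [0.9, 0.1]))   # ~0.511 nats

Note that D(P||Q) is infinite whenever Q assigns zero mass to an outcome P can produce; a production implementation would guard against that case.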


Book
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Author:
ISBN: 3038979376 3038979368 Year: 2019 Publisher: MDPI - Multidisciplinary Digital Publishing Institute


Abstract

This book presents new and original research in statistical information theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
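
Schematically, and as a hedged sketch of the general recipe rather than the book's exact constructions: a minimum divergence estimator replaces the likelihood with a divergence d between the empirical distribution and the model,

\hat{\theta}_n = \arg\min_{\theta} \, d(\hat{p}_n, p_\theta),

and a Wald-type statistic for testing H_0 : \theta = \theta_0 takes the familiar quadratic form

W_n = n \, (\hat{\theta}_n - \theta_0)^{\top} \, \Sigma(\hat{\theta}_n)^{-1} \, (\hat{\theta}_n - \theta_0),

which is asymptotically chi-squared under H_0. Choosing d as, say, a density power divergence trades a small amount of efficiency at the model for bounded influence of outlying observations.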

Keywords

mixture index of fit --- Kullback-Leibler distance --- relative error estimation --- minimum divergence inference --- Neyman-Pearson test --- influence function --- consistency --- thematic quality assessment --- asymptotic normality --- Hellinger distance --- nonparametric test --- Bernstein-von Mises theorem --- maximum composite likelihood estimator --- 2-alternating capacities --- efficiency --- corrupted data --- statistical distance --- robustness --- log-linear models --- representation formula --- goodness-of-fit --- general linear model --- Wald-type test statistics --- Hölder divergence --- divergence --- logarithmic super divergence --- information geometry --- sparse --- robust estimation --- relative entropy --- minimum disparity methods --- MM algorithm --- local-polynomial regression --- association models --- total variation --- Bayesian nonparametric --- ordinal classification variables --- Wald test statistic --- Wald-type test --- composite hypotheses --- compressed data --- hypothesis testing --- Bayesian semi-parametric --- single index model --- indoor localization --- composite minimum density power divergence estimator --- quasi-likelihood --- Chernoff-Stein lemma --- composite likelihood --- asymptotic property --- Bregman divergence --- robust testing --- misspecified hypothesis and alternative --- least-favorable hypotheses --- location-scale family --- correlation models --- minimum penalized φ-divergence estimator --- non-quadratic distance --- robust --- semiparametric model --- divergence based testing --- measurement errors --- bootstrap distribution estimator --- generalized Rényi entropy --- minimum divergence methods --- generalized linear model --- φ-divergence --- Bregman information --- iterated limits --- centroid --- model assessment --- divergence measure --- model check --- two-sample test --- Wald statistic


Book
Finite-Time Thermodynamics
Authors: --- ---
ISBN: 3036549501 3036549498 Year: 2022 Publisher: MDPI - Multidisciplinary Digital Publishing Institute


Abstract

The theory around the concept of finite time describes how processes of any nature can be optimized in situations where their rate must be non-negligible, i.e., they must come to completion in a finite time. What the theory makes explicit is "the cost of haste". Intuitively, it is quite obvious that you drive your car differently if you want to reach your destination as quickly as possible than if you are running out of gas. Finite-time thermodynamics quantifies such opposing requirements and may provide the optimal control to achieve the best compromise. The theory was initially developed for heat engines (steam, Otto, Stirling, among others) and for refrigerators, but it has by now spread into essentially all areas of dynamical systems, from the most abstract to the most practical. The present collection shows some fascinating current examples.
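
The canonical example of such a compromise, stated here as standard background rather than a result from this collection: an endoreversible heat engine running between reservoirs at temperatures T_h > T_c reaches maximum power output not at the Carnot efficiency but at the Curzon-Ahlborn efficiency,

\eta_{CA} = 1 - \sqrt{T_c / T_h} \; < \; \eta_C = 1 - \frac{T_c}{T_h},

so delivering power at a finite rate necessarily sacrifices some efficiency.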

Keywords

Economics, finance, business & management --- macroentropy --- microentropy --- endoreversible engine --- reversible computing --- Landauer's principle --- piston motion optimization --- endoreversible thermodynamics --- Stirling engine --- irreversibility --- power --- efficiency --- optimization --- generalized radiative heat transfer law --- optimal motion path --- maximum work output --- elimination method --- finite-time thermodynamics --- thermodynamics --- economics --- optimal processes --- averaged --- heat transfer --- cyclic mode --- simulation --- modeling --- reconstruction --- nonequilibrium thermodynamics --- entropy production --- contact temperature --- quantum thermodynamics --- maximum power --- shortcut to adiabaticity --- quantum friction --- Otto cycle --- quantum engine --- quantum refrigerator --- sulfuric acid decomposition --- tubular plug-flow reactor --- entropy generation rate --- SO2 yield --- multi-objective optimization --- optimal control --- thermodynamic cycles --- thermodynamic length --- hydrogen atom --- nano-size engines --- a-thermal cycle --- heat engines --- cooling --- very long timescales --- slow time --- ideal gas law --- new and modified variables --- Silicon-Germanium alloys --- minimum of thermal conductivity --- efficiency of thermoelectric systems --- minimal energy dissipation --- radiative energy transfer --- radiative entropy transfer --- two-stream grey atmosphere --- energy flux density --- entropy flux density --- generalized winds --- conservatively perturbed equilibrium --- extreme value --- momentary equilibrium --- information geometry of thermodynamics --- thermodynamic curvature --- critical phenomena --- binary fluids --- van der Waals equation --- quantum heat engine --- Carnot cycle --- Pareto front --- stability --- maximum power regime --- entropy behavior --- biophysics --- biochemistry --- dynamical systems --- diversity --- complexity --- path information --- calorimetry --- entropy flow --- biological communities --- reacting systems
