Listing 1 - 10 of 98

Book
Common Law and Civil Law Today - Convergence and Divergence
ISBN: 1622738071 9781622738076 9781622735075 1622735072 Year: 2019 Publisher: Wilmington, DE: Vernon Press


Book
Generalized quadrupeds, committed bipeds, and the shift to open habitats : an evolutionary model of hominid divergence
Year: 1998 Publisher: American Museum of Natural History


Book
Understanding the tripartite approach to Bayesian divergence time estimation
ISBN: 1108957765 1108957560 1108954367 Year: 2020 Publisher: Cambridge: Cambridge University Press

Abstract

Placing evolutionary events in the context of geological time is a fundamental goal in paleobiology and macroevolution. In this Element we describe the tripartite model used for Bayesian estimation of time-calibrated phylogenetic trees. The model can be readily separated into its component models: the substitution model, the clock model, and the tree model. We provide an overview of the most widely used models for each component and highlight the advantages of implementing the tripartite model within a Bayesian framework.
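
As a sketch of how the three components combine (the factorization below is the standard formulation of Bayesian time-tree inference, stated here for orientation rather than quoted from the Element), the joint posterior over the time-calibrated tree, branch rates, and model parameters factorizes as

P(\mathcal{T}, r, \theta \mid D) \propto P(D \mid \mathcal{T}, r, \theta_{\mathrm{subst}}) \cdot P(r \mid \theta_{\mathrm{clock}}) \cdot P(\mathcal{T} \mid \theta_{\mathrm{tree}})

where the likelihood term is governed by the substitution model, the distribution of the branch rates r by the clock model, and the prior on the tree \mathcal{T} (and hence on node ages) by the tree model.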


Book
Divergence Measures : Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
Year: 2022 Publisher: Basel: MDPI - Multidisciplinary Digital Publishing Institute

Abstract

Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. We hope that readers will find this Special Issue of interest and that it will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
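
For orientation (these are the standard definitions, not text drawn from the Special Issue itself), the two generalizations named above are, for probability measures P and Q with densities p and q:

D_{\alpha}(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha} q(x)^{1-\alpha} \, dx, \qquad \alpha \in (0,1) \cup (1,\infty) \quad \text{(Rényi divergence)}

D_{f}(P \,\|\, Q) = \int q(x) \, f\!\left(\frac{p(x)}{q(x)}\right) dx, \qquad f \text{ convex, } f(1) = 0 \quad \text{(f-divergence)}

Both generalize the relative entropy D(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx: the Rényi divergence recovers it in the limit \alpha \to 1, and the f-divergence recovers it with f(t) = t \log t.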

Keywords

Research & information: general --- Mathematics & science --- Bregman divergence --- f-divergence --- Jensen–Bregman divergence --- Jensen diversity --- Jensen–Shannon divergence --- capacitory discrimination --- Jensen–Shannon centroid --- mixture family --- information geometry --- difference of convex (DC) programming --- conditional Rényi divergence --- horse betting --- Kelly gambling --- Rényi divergence --- Rényi mutual information --- relative entropy --- chi-squared divergence --- f-divergences --- method of types --- large deviations --- strong data-processing inequalities --- information contraction --- maximal correlation --- Markov chains --- information inequalities --- mutual information --- Rényi entropy --- Carlson–Levin inequality --- information measures --- hypothesis testing --- total variation --- skew-divergence --- convexity --- Pinsker's inequality --- Bayes risk --- statistical divergences --- minimum divergence estimator --- maximum likelihood --- bootstrap --- conditional limit theorem --- Bahadur efficiency --- α-mutual information --- Augustin–Csiszár mutual information --- data transmission --- error exponents --- dimensionality reduction --- discriminant analysis --- statistical inference


Article
Derivation of the Flora and Fauna of Juan Fernandez and Easter Island


Book
Diversity of Insect Faunas
ISBN: 0632003529 Year: 1978 Publisher: Oxford: Blackwell


Book
Regional Innovation: Government policies and the role of Higher Education Institutions
ISBN: 192038281X 1920382801 Year: 2016 Publisher: Bloemfontein: UJ Press

Abstract

“This book provides an excellent analysis of regional innovation policy issues and developments with a wealth of examples, notably from OECD countries. Key policy areas, such as clusters, support services, and higher education institutions, are well documented. The research methodology is founded on the experience accumulated by the authors over several decades in many different countries in the context of a world class international organisation. This allows a good selection of policy relevant examples and an experienced presentation of them.” – Jean-Eric Aubert, Former programme manager, World Bank and OECD


Book
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
ISBN: 3038979376 3038979368 Year: 2019 Publisher: MDPI - Multidisciplinary Digital Publishing Institute

Abstract

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald statistics, likelihood ratio statistics and Rao's score statistics, share several optimum asymptotic properties but are highly non-robust in cases of model misspecification under the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
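
As a sketch of the general construction (the standard form of such tests in the minimum divergence literature, not a formula reproduced from the book), a Wald-type test of H_0: \theta = \theta_0 built on a minimum density power divergence estimator \hat{\theta}_{\beta} takes the form

W_n = n \, (\hat{\theta}_{\beta} - \theta_0)^{\top} \, \Sigma_{\beta}(\theta_0)^{-1} \, (\hat{\theta}_{\beta} - \theta_0)

where \Sigma_{\beta} is the asymptotic covariance matrix of \hat{\theta}_{\beta}. Under H_0 the statistic is asymptotically chi-squared with \dim(\theta) degrees of freedom, and setting \beta = 0 recovers the maximum likelihood estimator and hence the classical (non-robust) Wald test.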

Keywords

mixture index of fit --- Kullback-Leibler distance --- relative error estimation --- minimum divergence inference --- Neyman–Pearson test --- influence function --- consistency --- thematic quality assessment --- asymptotic normality --- Hellinger distance --- nonparametric test --- Bernstein–von Mises theorem --- maximum composite likelihood estimator --- 2-alternating capacities --- efficiency --- corrupted data --- statistical distance --- robustness --- log-linear models --- representation formula --- goodness-of-fit --- general linear model --- Wald-type test statistics --- Hölder divergence --- divergence --- logarithmic super divergence --- information geometry --- sparse --- robust estimation --- relative entropy --- minimum disparity methods --- MM algorithm --- local-polynomial regression --- association models --- total variation --- Bayesian nonparametric --- ordinal classification variables --- Wald test statistic --- Wald-type test --- composite hypotheses --- compressed data --- hypothesis testing --- Bayesian semi-parametric --- single index model --- indoor localization --- composite minimum density power divergence estimator --- quasi-likelihood --- Chernoff–Stein lemma --- composite likelihood --- asymptotic property --- Bregman divergence --- robust testing --- misspecified hypothesis and alternative --- least-favorable hypotheses --- location-scale family --- correlation models --- minimum penalized φ-divergence estimator --- non-quadratic distance --- robust --- semiparametric model --- divergence based testing --- measurement errors --- bootstrap distribution estimator --- generalized Rényi entropy --- minimum divergence methods --- generalized linear model --- φ-divergence --- Bregman information --- iterated limits --- centroid --- model assessment --- divergence measure --- model check --- two-sample test --- Wald statistic
