Civil law. --- Common law. --- Convergence. --- Divergence (Biology)
Data science, information theory, probability theory, statistical learning, and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and focuses on the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. We hope that readers will find this Special Issue of interest and that it will stimulate further research into the mathematical foundations and applications of divergence measures.
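For orientation, the standard definitions of these quantities, for probability measures P and Q with densities p and q relative to a common dominating measure μ, are

D(P \| Q) = \int p \log\frac{p}{q} \, d\mu, \qquad
D_{\alpha}(P \| Q) = \frac{1}{\alpha - 1} \log \int p^{\alpha} q^{1-\alpha} \, d\mu \quad (\alpha \in (0,1) \cup (1,\infty)), \qquad
D_f(P \| Q) = \int q \, f\!\left(\frac{p}{q}\right) d\mu \quad (f \text{ convex}, \ f(1)=0).

Relative entropy is recovered from the Rényi divergence in the limit α → 1, and from the f-divergence by taking f(t) = t log t.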
Bregman divergence --- f-divergence --- Jensen–Bregman divergence --- Jensen diversity --- Jensen–Shannon divergence --- capacitory discrimination --- Jensen–Shannon centroid --- mixture family --- information geometry --- difference of convex (DC) programming --- conditional Rényi divergence --- horse betting --- Kelly gambling --- Rényi divergence --- Rényi mutual information --- relative entropy --- chi-squared divergence --- f-divergences --- method of types --- large deviations --- strong data-processing inequalities --- information contraction --- maximal correlation --- Markov chains --- information inequalities --- mutual information --- Rényi entropy --- Carlson–Levin inequality --- information measures --- hypothesis testing --- total variation --- skew-divergence --- convexity --- Pinsker's inequality --- Bayes risk --- statistical divergences --- minimum divergence estimator --- maximum likelihood --- bootstrap --- conditional limit theorem --- Bahadur efficiency --- α-mutual information --- Augustin–Csiszár mutual information --- data transmission --- error exponents --- dimensionality reduction --- discriminant analysis --- statistical inference
“This book provides an excellent analysis of regional innovation policy issues and developments with a wealth of examples, notably from OECD countries. Key policy areas, such as clusters, support services, and higher education institutions, are well documented. The research methodology is founded on the experience accumulated by the authors over several decades in many different countries in the context of a world-class international organisation. This allows a good selection of policy-relevant examples and an experienced presentation of them.” – Jean-Eric Aubert, Former programme manager, World Bank and OECD
Higher & further education, tertiary education --- regional innovation --- innovation --- societies --- growth --- divergence --- soft parameters --- policy instruments --- Higher Education institutions --- innovation policy
This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics, and Rao's score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
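To make the minimum divergence estimation idea concrete, here is a minimal sketch, not taken from the book, of the minimum density power divergence estimator (MDPDE) of Basu, Harris, Hjort and Jones (1998) for a normal model; the tuning constant alpha = 0.5 and the Nelder–Mead optimizer are illustrative choices:

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    # Empirical density power divergence objective (constants in theta dropped):
    # integral of f_theta^(1+alpha) minus (1+alpha)/alpha times the sample
    # mean of f_theta(X_i)^alpha.
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # reparametrize to keep sigma > 0
    # Closed form of the integral term for the N(mu, sigma^2) density.
    integral = (2.0 * np.pi * sigma**2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    mean_term = np.mean(norm.pdf(x, mu, sigma) ** alpha)
    return integral - (1.0 + alpha) / alpha * mean_term

def mdpde(x, alpha=0.5):
    # Start from the (non-robust) maximum likelihood estimates.
    start = np.array([np.mean(x), np.log(np.std(x))])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    mu, log_sigma = res.x
    return mu, np.exp(log_sigma)

# A contaminated N(0, 1) sample: the MDPDE is far less affected by the
# five outliers at 10 than the sample mean is.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])
print("sample mean:", x.mean())
print("MDPDE (alpha = 0.5):", mdpde(x))

Larger values of alpha buy more robustness at some cost in efficiency; alpha → 0 recovers maximum likelihood.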
mixture index of fit --- Kullback–Leibler distance --- relative error estimation --- minimum divergence inference --- Neyman–Pearson test --- influence function --- consistency --- thematic quality assessment --- asymptotic normality --- Hellinger distance --- nonparametric test --- Bernstein–von Mises theorem --- maximum composite likelihood estimator --- 2-alternating capacities --- efficiency --- corrupted data --- statistical distance --- robustness --- log-linear models --- representation formula --- goodness-of-fit --- general linear model --- Wald-type test statistics --- Hölder divergence --- divergence --- logarithmic super divergence --- information geometry --- sparse --- robust estimation --- relative entropy --- minimum disparity methods --- MM algorithm --- local-polynomial regression --- association models --- total variation --- Bayesian nonparametric --- ordinal classification variables --- Wald test statistic --- Wald-type test --- composite hypotheses --- compressed data --- hypothesis testing --- Bayesian semi-parametric --- single index model --- indoor localization --- composite minimum density power divergence estimator --- quasi-likelihood --- Chernoff–Stein lemma --- composite likelihood --- asymptotic property --- Bregman divergence --- robust testing --- misspecified hypothesis and alternative --- least-favorable hypotheses --- location-scale family --- correlation models --- minimum penalized φ-divergence estimator --- non-quadratic distance --- robust --- semiparametric model --- divergence-based testing --- measurement errors --- bootstrap distribution estimator --- generalized Rényi entropy --- minimum divergence methods --- generalized linear model --- φ-divergence --- Bregman information --- iterated limits --- centroid --- model assessment --- divergence measure --- model check --- two-sample test --- Wald statistic
The contributions to this book present new and original research based on MPHIE, MHD, and MDPDE, as well as on test statistics built from these estimators, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on robustness. Manuscripts offering solutions to statistical problems, such as model selection criteria based on divergence measures or statistics for high-dimensional data with divergence measures as loss functions, are considered. Reviews emphasizing the most recent state of the art in the solution of statistical problems based on divergence measures are also presented.
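Since the minimum Hellinger distance (MHD) approach recurs throughout, a minimal point of reference: for probability vectors p and q on a common finite support, the Hellinger distance is H(P, Q) = (1/√2) ||√p − √q||₂, sketched below (the 1/2 normalization is one common convention):

import numpy as np

def hellinger_distance(p, q):
    # H(P, Q) = sqrt(0.5 * sum((sqrt(p_i) - sqrt(q_i))^2)), with H in [0, 1].
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

print(hellinger_distance([0.5, 0.5], [0.9, 0.1]))  # ~0.32
print(hellinger_distance([0.5, 0.5], [0.5, 0.5]))  # 0.0

MHD estimation chooses the model parameter minimizing this distance between the fitted density and a (typically kernel-smoothed) estimate obtained from the data.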
classification --- Bayes error rate --- Henze–Penrose divergence --- Friedman–Rafsky test statistic --- convergence rates --- bias and variance trade-off --- concentration bounds --- minimal spanning trees --- composite likelihood --- composite minimum density power divergence estimators --- model selection --- minimum pseudodistance estimation --- robustness --- estimation of α --- monitoring --- numerical minimization --- S-estimation --- Tukey's biweight --- integer-valued time series --- one-parameter exponential family --- minimum density power divergence estimator --- density power divergence --- robust change point test --- Galton–Watson branching processes with immigration --- Hellinger integrals --- power divergences --- Kullback–Leibler information distance/divergence --- relative entropy --- Rényi divergences --- epidemiology --- COVID-19 pandemic --- Bayesian decision making --- INARCH(1) model --- GLM model --- Bhattacharyya coefficient/distance --- time series of counts --- INGARCH model --- SPC --- CUSUM monitoring --- MDPDE --- contingency tables --- disparity --- mixed-scale data --- Pearson residuals --- residual adjustment function --- statistical distances --- Hellinger distance --- large deviations --- divergence measures --- rare event probabilities
This Special Issue "Differential Geometrical Theory of Statistics" collates selected invited and contributed talks presented during the conference GSI'15 on "Geometric Science of Information" which was held at the Ecole Polytechnique, Paris-Saclay Campus, France, in October 2015 (Conference web site: http://www.see.asso.fr/gsi2015).
Hessian Geometry --- Shape Space --- Computational Information Geometry --- Statistical physics --- Entropy --- Cohomology --- Information geometry --- Thermodynamics --- Coding Theory --- Information topology --- Maximum entropy --- Divergence Geometry
This monograph presents a comprehensive, self-contained, and novel approach to the Divergence Theorem through five progressive volumes. Its ultimate aim is to develop tools in Real and Harmonic Analysis, of geometric measure theoretic flavor, capable of treating a broad spectrum of boundary value problems formulated in rather general geometric and analytic settings. The text is intended for researchers, graduate students, and industry professionals interested in applications of harmonic analysis and geometric measure theory to complex analysis, scattering, and partial differential equations. Volume I establishes a sharp version of the Divergence Theorem (aka Fundamental Theorem of Calculus) which allows for an inclusive class of vector fields whose boundary trace is only assumed to exist in a nontangential pointwise sense.
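The classical statement being sharpened reads as follows: for a bounded domain Ω ⊂ R^n with sufficiently regular boundary, outward unit normal ν, and a vector field F smooth up to the boundary,

\int_{\Omega} \operatorname{div} \vec{F} \, dx = \int_{\partial \Omega} \vec{F} \cdot \nu \, d\sigma.

Volume I weakens the regularity demanded of both Ω and F, with the boundary trace of F assumed to exist only in a nontangential pointwise sense.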
Functional analysis --- Harmonic analysis. Fourier analysis --- Mathematical analysis --- analysis (mathematics) --- Fourier series --- functions (mathematics) --- mathematical models --- mathematics --- Divergence theorem.
This Special Issue of the journal Entropy, titled “Information Geometry I”, contains a collection of 17 papers concerning the foundations and applications of information geometry. Based on a geometrical interpretation of probability, information geometry has become a rich mathematical field employing the methods of differential geometry. It has numerous applications to data science, physics, and neuroscience. Presenting original research, yet written in an accessible, tutorial style, this collection of papers will be useful for scientists who are new to the field, while providing an excellent reference for the more experienced researcher. Several papers are written by authorities in the field, and topics cover the foundations of information geometry, as well as applications to statistics, Bayesian inference, machine learning, complex systems, physics, and neuroscience.
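For readers new to the area, the basic object behind this geometrical interpretation of probability is the Fisher information metric on a parametric family p(x; θ),

g_{ij}(\theta) = \mathbb{E}_{\theta}\!\left[ \frac{\partial \log p(X;\theta)}{\partial \theta_i} \, \frac{\partial \log p(X;\theta)}{\partial \theta_j} \right],

which turns the parameter space into a Riemannian manifold; a dually flat structure, one of the keywords of this collection, additionally equips it with a pair of mutually dual affine connections.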
decomposable divergence --- tensor Sylvester matrix --- maximum pseudo-likelihood estimation --- matrix resultant --- Markov random fields --- Fisher information --- Fisher information matrix --- Stein equation --- entropy --- Sylvester matrix --- information geometry --- stationary process --- dually flat structure --- information theory --- Bezout matrix --- Vandermonde matrix
This book utilizes three theoretical models to analyze the defense conditions and preparedness of all the states of Eastern Europe. Their transition from Cold War communism to post-Cold War democracies, the stability of the East-Central European States, the precarious defense positions of the Baltic states, and the uneven defense preparedness of the Balkan states are at the heart of the analysis.
National security --- Balkan states' security. --- Baltic states' security. --- East Central European states' security. --- European Union (EU). --- North Atlantic Treaty Organization (NATO). --- alliance politics theory. --- anti-Communist revolutions. --- convergence/divergence theories. --- late Cold War political history. --- policy-formation making theories.
Evolutionary developmental biology, or 'evo-devo', is the study of the relationship between evolution and development. Dealing specifically with the generative mechanisms of organismal form, evo-devo goes straight to the core of the developmental origin of variation, the raw material on which natural selection (and random drift) can work. Evolving Pathways brings together contributions that represent a diversity of approaches. Topics range from developmental genetics to comparative morphology of animals and plants alike, and also include botany and palaeontology, two disciplines for which the potential to be examined from an evo-devo perspective has largely been ignored until now. Researchers and graduate students will find this book a valuable overview of current research as we begin to fill a major gap in our perception of evolutionary change.
Developmental biology. --- Evolution. --- Philosophy --- Creation --- Emergence (Philosophy) --- Teleology --- Development (Biology) --- Biology --- Growth --- Ontogeny --- Evolutionary developmental biology. --- Developmental evolution (Biology) --- Evo-devo (Evolutionary developmental biology) --- Evolution of development (Biology) --- Evolutionary biology of development --- Developmental biology --- Evolution (Biology) --- 57.017.6 --- 575.832 --- 575.854 --- 57.017.6 Growth. Development. Ageing. Senescence. Death --- Growth. Development. Ageing. Senescence. Death --- Animal evolution --- Animals --- Biological evolution --- Darwinism --- Evolutionary biology --- Evolutionary science --- Origin of species --- Evolution --- Biological fitness --- Homoplasy --- Natural selection --- Phylogeny --- 575.854 Tissue, organs and function --- Tissue, organs and function --- 575.832 Divergence --- Divergence