Listing 1 - 8 of 8
This book presents new and original research in statistical information theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimal asymptotic properties, but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test, based on minimum divergence estimators, for testing simple and composite null hypotheses in general parametric models.
mixture index of fit --- Kullback-Leibler distance --- relative error estimation --- minimum divergence inference --- Neyman-Pearson test --- influence function --- consistency --- thematic quality assessment --- asymptotic normality --- Hellinger distance --- nonparametric test --- Bernstein-von Mises theorem --- maximum composite likelihood estimator --- 2-alternating capacities --- efficiency --- corrupted data --- statistical distance --- robustness --- log-linear models --- representation formula --- goodness-of-fit --- general linear model --- Wald-type test statistics --- Hölder divergence --- divergence --- logarithmic super divergence --- information geometry --- sparse --- robust estimation --- relative entropy --- minimum disparity methods --- MM algorithm --- local-polynomial regression --- association models --- total variation --- Bayesian nonparametric --- ordinal classification variables --- Wald test statistic --- Wald-type test --- composite hypotheses --- compressed data --- hypothesis testing --- Bayesian semi-parametric --- single index model --- indoor localization --- composite minimum density power divergence estimator --- quasi-likelihood --- Chernoff-Stein lemma --- composite likelihood --- asymptotic property --- Bregman divergence --- robust testing --- misspecified hypothesis and alternative --- least-favorable hypotheses --- location-scale family --- correlation models --- minimum penalized φ-divergence estimator --- non-quadratic distance --- robust --- semiparametric model --- divergence based testing --- measurement errors --- bootstrap distribution estimator --- generalized Rényi entropy --- minimum divergence methods --- generalized linear model --- φ-divergence --- Bregman information --- iterated limits --- centroid --- model assessment --- divergence measure --- model check --- two-sample test --- Wald statistic
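As a concrete illustration of the kind of estimator this book builds on, the following Python sketch fits a normal model by minimizing the density power divergence (the objective behind the MDPDE) and compares the result with the maximum likelihood fit on contaminated data. The normal model, the contamination scheme and the tuning constant α = 0.5 are illustrative choices for this sketch, not examples taken from the book.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def dpd_objective(params, x, alpha=0.5):
    """Density power divergence objective for a normal model N(mu, sigma^2);
    minimizing it over (mu, sigma) gives the minimum density power
    divergence estimator (MDPDE)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)          # reparametrize to keep sigma > 0
    f = norm.pdf(x, mu, sigma)
    # closed form of the integral term  ∫ f^(1+alpha) dx  for the normal density
    integral = (2 * np.pi) ** (-alpha / 2) * sigma ** (-alpha) / np.sqrt(1 + alpha)
    return integral - (1 + alpha) / alpha * np.mean(f ** alpha)

rng = np.random.default_rng(0)
# 95 observations from the true model N(0, 1), plus 5% gross outliers near 8
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 0.5, 5)])

mle_mu = x.mean()                      # classical MLE: pulled toward the outliers
fit = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x,))
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(mle_mu, mu_hat)                  # the MDPDE mean stays near 0
```

Larger α buys more robustness at some cost in efficiency; α → 0 recovers the (non-robust) maximum likelihood estimator.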
Asymptotic expansions --- Divergent series --- Entropy (Information theory) --- Multivariate analysis --- Statistical hypothesis testing
The scope of the contributions to this book is to present new and original research papers based on MPHIE, MHD, and MDPDE, as well as test statistics based on these estimators, from a theoretical and applied point of view, for different statistical problems with special emphasis on robustness. Manuscripts giving solutions to different statistical problems, such as model selection criteria based on divergence measures or statistics for high-dimensional data with divergence measures as loss functions, are considered. Reviews emphasizing the most recent state of the art on the solution of statistical problems based on divergence measures are also presented.
Research & information: general --- classification --- Bayes error rate --- Henze–Penrose divergence --- Friedman–Rafsky test statistic --- convergence rates --- bias and variance trade-off --- concentration bounds --- minimal spanning trees --- composite likelihood --- composite minimum density power divergence estimators --- model selection --- minimum pseudodistance estimation --- robustness --- estimation of α --- monitoring --- numerical minimization --- S-estimation --- Tukey's biweight --- integer-valued time series --- one-parameter exponential family --- minimum density power divergence estimator --- density power divergence --- robust change point test --- Galton-Watson branching processes with immigration --- Hellinger integrals --- power divergences --- Kullback-Leibler information distance/divergence --- relative entropy --- Rényi divergences --- epidemiology --- COVID-19 pandemic --- Bayesian decision making --- INARCH(1) model --- GLM model --- Bhattacharyya coefficient/distance --- time series of counts --- INGARCH model --- SPC --- CUSUM monitoring --- MDPDE --- contingency tables --- disparity --- mixed-scale data --- Pearson residuals --- residual adjustment function --- statistical distances --- Hellinger distance --- large deviations --- divergence measures --- rare event probabilities
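Two of the statistical distances recurring in these keywords, the Bhattacharyya coefficient and the Hellinger distance, are simple to compute for discrete distributions. The sketch below uses the convention H(p, q) = sqrt(1 − BC(p, q)); the uniform model and the contaminated cell frequencies are made-up illustrative data.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """BC(p, q) = sum_i sqrt(p_i * q_i): the affinity between two discrete
    distributions; equals 1 if and only if p == q."""
    return float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))

def hellinger(p, q):
    """Hellinger distance H(p, q) = sqrt(1 - BC(p, q)), bounded in [0, 1]."""
    return float(np.sqrt(max(0.0, 1.0 - bhattacharyya_coefficient(p, q))))

model     = [0.25, 0.25, 0.25, 0.25]   # hypothesized uniform model for 4 cells
observed  = [0.24, 0.26, 0.25, 0.25]   # cell frequencies close to the model
corrupted = [0.10, 0.10, 0.10, 0.70]   # one heavily contaminated cell

print(hellinger(model, observed))      # small distance: data fit the model
print(hellinger(model, corrupted))     # large distance: contamination flagged
```

Because the square roots damp large cell-wise discrepancies, disparities of this type underlie minimum Hellinger distance estimation and other robust inference methods treated in this book.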
Engineering sciences. Technology --- Computer science --- systems theory --- systems administration
Real-life problems are often quite complicated and, for centuries, mathematics has provided the necessary tools and ideas to formulate these problems mathematically and then help solve them, either exactly or approximately. This book gathers a collection of papers dealing with several different problems arising from many disciplines, along with some modern mathematical approaches to handling them. In this respect, the book offers a wide overview of many current trends in mathematics as valuable formal techniques for capturing and exploiting the complexity involved in real-world situations. Several researchers, colleagues, friends and students of Professor María Luisa Menéndez have contributed to this volume to pay tribute to her and to recognize the diverse contributions she made to the fields of mathematics and statistics and to the profession in general. She had a sweet and strong personality, and instilled great values and work ethics in her students through her dedication to teaching and research. Even though the academic community lost her prematurely, she will continue to inspire many students and researchers worldwide through her published work.
Engineering sciences. Technology --- Computer science --- systems theory --- systems administration