Narrow your search

Library

FARO (3)

KU Leuven (3)

LUCA School of Arts (3)

Odisee (3)

Thomas More Kempen (3)

Thomas More Mechelen (3)

UCLL (3)

ULiège (3)

VIVES (3)

Vlaams Parlement (3)


Resource type

book (6)


Language

English (6)


Year

2022 (6)

Listing 1 - 6 of 6

Book
Approximate Bayesian Inference
Author:
Year: 2022 Publisher: Basel MDPI - Multidisciplinary Digital Publishing Institute


Abstract

Extremely popular for statistical inference, Bayesian methods are also becoming popular in machine learning and artificial intelligence. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis–Hastings algorithm or the Gibbs sampler, which target the exact posterior distribution. However, many modern statistical models are simply too complex for such methodologies, and in machine learning the volume of data used in practice makes Monte Carlo methods too slow to be useful. On the other hand, these applications often do not require exact knowledge of the posterior. This has motivated a new generation of algorithms that are fast enough to handle huge datasets but typically target only an approximation of the posterior. This book gathers 18 research papers written by Approximate Bayesian Inference specialists and provides an overview of recent advances in these algorithms, including optimization-based methods (such as variational approximations) and simulation-based methods (such as ABC or Monte Carlo algorithms). Theoretical aspects of Approximate Bayesian Inference are covered, specifically PAC–Bayes bounds and regret analysis. Applications to challenging computational problems in astrophysics, finance, medical data analysis, and computer vision are also presented.
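The abstract contrasts exact Monte Carlo sampling with faster approximate methods. As a rough illustration of the exact approach it describes, here is a minimal random-walk Metropolis–Hastings sampler; this is a generic sketch on a toy Gaussian target, not code from the book, and all names are hypothetical:

```python
import math
import random

def metropolis_hastings(log_post, x0, n_steps=20000, step=1.0):
    """Random-walk Metropolis-Hastings: draws samples whose stationary
    distribution has (unnormalized) log-density log_post."""
    x = x0
    lp = log_post(x)
    samples = []
    for _ in range(n_steps):
        proposal = x + random.gauss(0.0, step)      # symmetric proposal
        lp_prop = log_post(proposal)
        # Accept with probability min(1, pi(proposal) / pi(x))
        if math.log(random.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop
        samples.append(x)
    return samples

# Toy target: standard normal posterior, log-density up to a constant
random.seed(0)
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(draws) / len(draws)                        # close to 0
var = sum(d * d for d in draws) / len(draws) - mean ** 2  # close to 1
```

Every iteration costs one evaluation of the full posterior, which is exactly the scaling problem with large datasets that motivates the approximate methods (variational inference, ABC) the book surveys.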

Keywords

Research & information: general --- Mathematics & science --- bifurcation --- dynamical systems --- Edward–Sokal coupling --- mean-field --- Kullback–Leibler divergence --- variational inference --- Bayesian statistics --- machine learning --- variational approximations --- PAC-Bayes --- expectation-propagation --- Markov chain Monte Carlo --- Langevin Monte Carlo --- sequential Monte Carlo --- Laplace approximations --- approximate Bayesian computation --- Gibbs posterior --- MCMC --- stochastic gradients --- neural networks --- Approximate Bayesian Computation --- differential evolution --- Markov kernels --- discrete state space --- ergodicity --- Markov chain --- probably approximately correct --- variational Bayes --- Bayesian inference --- Markov Chain Monte Carlo --- Sequential Monte Carlo --- Riemann Manifold Hamiltonian Monte Carlo --- integrated nested laplace approximation --- fixed-form variational Bayes --- stochastic volatility --- network modeling --- network variability --- Stiefel manifold --- MCMC-SAEM --- data imputation --- Bethe free energy --- factor graphs --- message passing --- variational free energy --- variational message passing --- approximate Bayesian computation (ABC) --- differential privacy (DP) --- sparse vector technique (SVT) --- Gaussian --- particle flow --- variable flow --- Langevin dynamics --- Hamilton Monte Carlo --- non-reversible dynamics --- control variates --- thinning --- meta-learning --- hyperparameters --- priors --- online learning --- online optimization --- gradient descent --- statistical learning theory --- PAC–Bayes theory --- deep learning --- generalisation bounds --- Bayesian sampling --- Monte Carlo integration --- PAC-Bayes theory --- no free lunch theorems --- sequential learning --- principal curves --- data streams --- regret bounds --- greedy algorithm --- sleeping experts --- entropy --- robustness --- statistical mechanics --- complex systems




Book
The Statistical Foundations of Entropy
Authors: ---
Year: 2022 Publisher: Basel MDPI - Multidisciplinary Digital Publishing Institute


Abstract

In the last two decades, the understanding of complex dynamical systems underwent important conceptual shifts. The catalyst was the infusion of new ideas from the theory of critical phenomena (scaling laws, the renormalization group, etc.), (multi)fractals and trees, random matrix theory, network theory, and non-Shannonian information theory. The usual Boltzmann–Gibbs statistics proved grossly inadequate in this context: while successful in describing stationary systems characterized by ergodicity or metric transitivity, they fail to reproduce the complex statistical behavior of many real-world systems in biology, astrophysics, geology, and the economic and social sciences. The aim of this Special Issue was to extend the state of the art with original contributions to the ongoing discussion on the statistical foundations of entropy, with particular emphasis on non-conventional entropies that go significantly beyond the Boltzmann, Gibbs, and Shannon paradigms. The accepted contributions address information-theoretic, thermodynamic, and quantum aspects of complex systems and present several important applications of generalized entropies.
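Several of the generalized entropies listed in this volume's keywords reduce to the Shannon entropy in a limit. A small sketch (generic textbook formulas, not code from the book) of the Tsallis entropy S_q = (1 − Σ p_i^q)/(q − 1) and its q → 1 Shannon limit:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats: -sum p_i * ln p_i over nonzero probabilities."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).
    Recovers the Shannon entropy in the limit q -> 1."""
    if q == 1.0:
        return shannon_entropy(p)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
# For q close to 1, the Tsallis entropy is numerically close to Shannon's
gap = abs(tsallis_entropy(p, 1.0001) - shannon_entropy(p))
```

For q ≠ 1 the Tsallis entropy is non-additive over independent subsystems, which is precisely the departure from the Boltzmann–Gibbs–Shannon framework the Special Issue explores.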

Keywords

Research & information: general --- Mathematics & science --- ecological inference --- generalized cross entropy --- distributional weighted regression --- matrix adjustment --- entropy --- critical phenomena --- renormalization --- multiscale thermodynamics --- GENERIC --- non-Newtonian calculus --- non-Diophantine arithmetic --- Kolmogorov-Nagumo averages --- escort probabilities --- generalized entropies --- maximum entropy principle --- MaxEnt distribution --- calibration invariance --- Lagrange multipliers --- generalized Bilal distribution --- adaptive Type-II progressive hybrid censoring scheme --- maximum likelihood estimation --- Bayesian estimation --- Lindley's approximation --- confidence interval --- Markov chain Monte Carlo method --- Rényi entropy --- Tsallis entropy --- entropic uncertainty relations --- quantum metrology --- non-equilibrium thermodynamics --- variational entropy --- rényi entropy --- tsallis entropy --- landsberg-vedral entropy --- gaussian entropy --- sharma-mittal entropy --- α-mutual information --- α-channel capacity --- maximum entropy --- Bayesian inference --- updating probabilities




Book
Novel Approaches in Landslide Monitoring and Data Analysis
Authors: --- ---
ISBN: 3036537880 3036537872 Year: 2022 Publisher: MDPI - Multidisciplinary Digital Publishing Institute


Abstract

Significant progress in the last few years has expanded our knowledge of landslide processes. It is therefore necessary to summarize, share, and disseminate the latest knowledge and expertise. This Special Issue brings together novel research focused on landslide monitoring, modelling, and data analysis.

Keywords

Research & information: general --- Earth sciences, geography, environment, planning --- landslide susceptibility assessment --- geographically weighted regression --- spatial non-stationary --- spatial proximity --- slope unit --- multitemporal DEM --- control factors --- susceptibility assessment --- LRM --- historical landslide events --- unsaturated soil --- capillary barrier --- multi-layer slope --- slope failure --- coastal landslides --- mass rock creep --- coastal cliffs --- land surface analysis --- data analysis --- Conero promontory --- slow-moving landslide --- landslide monitoring --- time-series analysis --- San Andrés Landslide --- El Hierro --- Canary Islands --- large-scale landslide model experiment --- soil–water characteristic curve --- Bayesian updating --- Markov chain Monte Carlo --- artificial neural networks --- object-based image analysis --- Sentinel-1 --- Sentinel-2 --- digital elevation model --- InSAR --- landslide --- landslide-dammed lake --- river --- Iceland --- rockfill --- ground-based interferometric synthetic aperture radar --- construction --- cross-sectional area of equal displacement body --- landslide warning method --- data mining --- landslide detection --- landslide inventory --- Typhoon Morakot --- Global Positioning System (GPS) --- disaster prevention monitoring --- disaster mitigation --- monitoring --- landslides --- photogrammetry --- global positioning system --- in-hole wire extensometer --- DInSAR --- GBSAR --- landslide susceptibility --- ranking --- cross validation --- prediction model --- prediction pattern --- target pattern --- uncertainty pattern --- airborne electromagnetics --- physical-based modeling --- tropical volcanic environment --- La Martinique --- modelling --- susceptibility
