
Digital
Remarks on a sermon, &c entitled "The use of the church service on a late occasion defended, and socialist crimes exposed, by the Rev John Craig, MA, Vicar of Leamington-Priors"
Authors: ---
Year: 1840 Publisher: Coventry : printed and published by J. Rushton



Digital
Laws, rules, and regulations of the Priors' Lodge Benevolent Society : held at the Boat Inn, Worksop, Nottinghamshire.
Authors: --- ---
Year: 1849 Publisher: [Worksop? : s.n.]



Book
Information Bottleneck : Theory and Applications in Deep Learning
Authors: ---
Year: 2021 Publisher: Basel, Switzerland MDPI - Multidisciplinary Digital Publishing Institute

Abstract

The celebrated information bottleneck (IB) principle of Tishby et al. has recently enjoyed renewed attention due to its application in the area of deep learning. This collection investigates the IB principle in this new context. The individual chapters in this collection:
• provide novel insights into the functional properties of the IB;
• discuss the IB principle (and its derivatives) as an objective for training multi-layer machine learning structures such as neural networks and decision trees; and
• offer a new perspective on neural network learning through the lens of the IB framework.
The collection thus contributes to a better understanding of the IB principle specifically for deep learning and, more generally, of information-theoretic cost functions in machine learning. This paves the way toward explainable artificial intelligence.
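
For orientation, the IB principle referenced above seeks an encoder p(t|x) that trades compression of X against prediction of Y by minimising the Lagrangian I(X;T) - beta * I(T;Y). The minimal numpy sketch below evaluates that objective for discrete distributions; it is an illustrative reconstruction of the standard formulation, not code from the book, and the function names and toy numbers are hypothetical.

import numpy as np

def mutual_information(p_joint):
    # I(A;B) in nats for a discrete joint distribution stored as a 2-D array that sums to 1.
    p_a = p_joint.sum(axis=1, keepdims=True)
    p_b = p_joint.sum(axis=0, keepdims=True)
    nz = p_joint > 0
    return float((p_joint[nz] * np.log(p_joint[nz] / (p_a @ p_b)[nz])).sum())

def ib_lagrangian(p_xy, p_t_given_x, beta):
    # IB objective L = I(X;T) - beta * I(T;Y), minimised over encoders p(t|x).
    p_x = p_xy.sum(axis=1)                 # marginal p(x)
    p_xt = p_t_given_x * p_x[:, None]      # joint p(x, t)
    p_ty = p_t_given_x.T @ p_xy            # joint p(t, y), using the Markov chain T - X - Y
    return mutual_information(p_xt) - beta * mutual_information(p_ty)

# Toy example: a 2x2 joint p(x, y) and a slightly noisy two-state encoder.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_t_given_x = np.array([[0.9, 0.1],
                        [0.1, 0.9]])
print(ib_lagrangian(p_xy, p_t_given_x, beta=5.0))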


Book
Bayesian Design in Clinical Trials
Authors: ---
Year: 2022 Publisher: Basel MDPI - Multidisciplinary Digital Publishing Institute

Abstract

In the last decade, the number of clinical trials using Bayesian methods has grown dramatically. Nowadays, regulatory authorities appear to be more receptive to Bayesian methods than ever. The Bayesian methodology is well suited to address the issues arising in the planning, analysis, and conduct of clinical trials. Due to their flexibility, Bayesian design methods based on the accrued data of ongoing trials have been recommended by both the US Food and Drug Administration and the European Medicines Agency for dose-response trials in early clinical development. A distinctive feature of the Bayesian approach is its ability to incorporate external information, such as historical data, findings from previous studies, and expert opinion, through prior elicitation. It thus provides a framework for embedding auxiliary information, and handling its variability, within the planning and analysis of a study. A growing body of literature examines the use of historical data to augment newly collected data, especially in clinical trials where patients are difficult to recruit, as is the case for rare diseases. Many works explore how this can be done properly, since using historical data is regarded as less controversial than eliciting prior information from expert opinion. In this book, applications of Bayesian design in the planning and analysis of clinical trials are introduced, along with methodological contributions to specific topics in Bayesian statistics. Finally, two reviews of the state of the art of the Bayesian approach in the clinical trials field are presented.
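
As a concrete illustration of borrowing historical information through the prior, the sketch below uses a standard power-prior construction in a conjugate beta-binomial model. This is a generic textbook device, not necessarily the approach taken in any chapter of this volume, and every number in it is hypothetical.

from scipy.stats import beta

# Hypothetical trial data, purely for illustration.
a0, b0 = 1.0, 1.0               # vague Beta(1, 1) prior on the response rate
hist_resp, hist_n = 30, 100     # historical trial: 30 responders out of 100
new_resp, new_n = 12, 40        # current trial: 12 responders out of 40
delta = 0.5                     # power-prior weight in [0, 1]; 0 ignores history, 1 pools fully

# Power prior: raise the historical likelihood to the power delta, which in the
# conjugate case simply down-weights the historical counts.
a_post = a0 + delta * hist_resp + new_resp
b_post = b0 + delta * (hist_n - hist_resp) + (new_n - new_resp)

# Posterior probability that the response rate exceeds 25%.
print(1 - beta.cdf(0.25, a_post, b_post))

In practice one would also report sensitivity to the discount weight delta, since it controls how strongly the historical trial influences the conclusion.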


Book
Approximate Bayesian Inference
Author:
Year: 2022 Publisher: Basel MDPI - Multidisciplinary Digital Publishing Institute

Abstract

Extremely popular for statistical inference, Bayesian methods are also becoming popular in machine learning and artificial intelligence problems. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis–Hastings algorithm or the Gibbs sampler. These algorithms target the exact posterior distribution. However, many modern statistical models are simply too complex for such methodologies. In machine learning, the volume of data used in practice makes Monte Carlo methods too slow to be useful. On the other hand, these applications often do not require exact knowledge of the posterior. This has motivated the development of a new generation of algorithms that are fast enough to handle huge datasets but that often target an approximation of the posterior. This book gathers 18 research papers written by specialists in approximate Bayesian inference and provides an overview of recent advances in these algorithms. This includes optimization-based methods (such as variational approximations) and simulation-based methods (such as ABC or Monte Carlo algorithms). The theoretical aspects of approximate Bayesian inference are also covered, specifically PAC–Bayes bounds and regret analysis. Applications to challenging computational problems in astrophysics, finance, medical data analysis, and computer vision are also presented.
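
To make the simulation-based methods concrete, the sketch below implements a plain ABC rejection sampler for the mean of a Gaussian. The data, prior, summary statistic, and tolerance are hypothetical choices made for illustration and are not drawn from any of the collected papers.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed data: 50 draws from a Gaussian with unknown mean.
observed = rng.normal(loc=1.3, scale=1.0, size=50)

def abc_rejection(observed, n_draws=100_000, tol=0.05):
    # Draw candidate means from the prior, simulate data sets, and keep the candidates
    # whose simulated summary statistic (the sample mean) lands within tol of the observed one.
    theta = rng.normal(loc=0.0, scale=5.0, size=n_draws)     # prior draws
    sims = rng.normal(loc=theta[:, None], scale=1.0,
                      size=(n_draws, observed.size))          # one simulated data set per draw
    distance = np.abs(sims.mean(axis=1) - observed.mean())
    return theta[distance < tol]                              # approximate posterior sample

posterior = abc_rejection(observed)
print(posterior.mean(), posterior.size)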

Keywords

Research & information: general --- Mathematics & science --- bifurcation --- dynamical systems --- Edward–Sokal coupling --- mean-field --- Kullback–Leibler divergence --- variational inference --- Bayesian statistics --- machine learning --- variational approximations --- PAC-Bayes --- expectation-propagation --- Markov chain Monte Carlo --- Langevin Monte Carlo --- sequential Monte Carlo --- Laplace approximations --- approximate Bayesian computation --- Gibbs posterior --- MCMC --- stochastic gradients --- neural networks --- Approximate Bayesian Computation --- differential evolution --- Markov kernels --- discrete state space --- ergodicity --- Markov chain --- probably approximately correct --- variational Bayes --- Bayesian inference --- Markov Chain Monte Carlo --- Sequential Monte Carlo --- Riemann Manifold Hamiltonian Monte Carlo --- integrated nested laplace approximation --- fixed-form variational Bayes --- stochastic volatility --- network modeling --- network variability --- Stiefel manifold --- MCMC-SAEM --- data imputation --- Bethe free energy --- factor graphs --- message passing --- variational free energy --- variational message passing --- approximate Bayesian computation (ABC) --- differential privacy (DP) --- sparse vector technique (SVT) --- Gaussian --- particle flow --- variable flow --- Langevin dynamics --- Hamilton Monte Carlo --- non-reversible dynamics --- control variates --- thinning --- meta-learning --- hyperparameters --- priors --- online learning --- online optimization --- gradient descent --- statistical learning theory --- PAC–Bayes theory --- deep learning --- generalisation bounds --- Bayesian sampling --- Monte Carlo integration --- PAC-Bayes theory --- no free lunch theorems --- sequential learning --- principal curves --- data streams --- regret bounds --- greedy algorithm --- sleeping experts --- entropy --- robustness --- statistical mechanics --- complex systems

