Performing inference on a model whose implicit likelihood is not known in closed form is called likelihood-free inference. This situation arises frequently in engineering and science, where a simulator serves as a generative model of the data, but the likelihood of the generated data is unknown and intractable. Given observed data, we combine the ideas of hierarchical Bayesian modeling, empirical Bayes, and neural density estimation with normalizing flows to first learn a surrogate approximation of the model likelihood and then learn a prior distribution over the model parameters. The learned prior and the surrogate likelihood in turn allow us to learn a posterior distribution for each observation. This is a general approach to likelihood-free inference, and it is especially useful in settings where the simulator is too costly to run at inference time. We demonstrate the applicability of our method on a real physical problem from high-energy physics (HEP).
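The surrogate-likelihood step described above can be illustrated with a minimal sketch. Everything here is a hypothetical toy, not the paper's setup: the "simulator" simply draws x ~ N(theta, 0.5) for a fixed parameter theta, and we pretend its likelihood is unavailable. A single-layer affine normalizing flow x = a + b·z with z ~ N(0, 1) is then fit by gradient descent on the negative log-likelihood of simulated data, using numerical gradients to stay dependency-free; a real application would use a deep conditional flow and automatic differentiation.

```python
import math
import random

random.seed(0)

def simulator(theta, n):
    # Hypothetical black-box simulator: Gaussian noise around theta.
    return [random.gauss(theta, 0.5) for _ in range(n)]

def neg_log_lik(a, log_b, xs):
    # Affine flow x = a + b*z, so z = (x - a)/b and, by change of
    # variables, log p(x) = log N(z; 0, 1) - log b.
    b = math.exp(log_b)
    total = 0.0
    for x in xs:
        z = (x - a) / b
        total += 0.5 * z * z + 0.5 * math.log(2 * math.pi) + log_b
    return total / len(xs)

def fit_flow(xs, steps=2000, lr=0.05, eps=1e-4):
    # Gradient descent with central-difference gradients; this is the
    # maximum-likelihood training of the surrogate density estimator.
    a, log_b = 0.0, 0.0
    for _ in range(steps):
        ga = (neg_log_lik(a + eps, log_b, xs)
              - neg_log_lik(a - eps, log_b, xs)) / (2 * eps)
        gb = (neg_log_lik(a, log_b + eps, xs)
              - neg_log_lik(a, log_b - eps, xs)) / (2 * eps)
        a -= lr * ga
        log_b -= lr * gb
    return a, math.exp(log_b)

xs = simulator(theta=1.5, n=500)
a, b = fit_flow(xs)
# The fitted shift approaches theta and the scale approaches the
# simulator's noise level, so neg_log_lik now acts as a cheap surrogate
# likelihood that can be evaluated without re-running the simulator.
print(a, b)
```

Once such a surrogate is trained (conditionally on the parameters, in the general case), the expensive simulator is no longer needed at inference time, which is the setting the abstract highlights.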