Kavli Affiliate: George Efstathiou
| First 5 Authors: Harry T. J. Bevins, William J. Handley, Pablo Lemos, Peter H. Sims, Eloy de Lera Acedo
| Summary:
Bayesian workflows often require the introduction of nuisance parameters, yet
for core science modelling one needs access to a marginal posterior density. In
this work we use masked autoregressive flows and kernel density estimators to
encapsulate the marginal posterior, allowing us to compute marginal
Kullback-Leibler divergences and marginal Bayesian model dimensionalities in
addition to generating samples and computing marginal log probabilities. We
demonstrate this by application to topical cosmological examples from the Dark
Energy Survey and from global 21cm signal experiments. In addition to the
computation of marginal Bayesian statistics, this work is important for further
applications in Bayesian experimental design, complex prior modelling and
likelihood emulation. This technique is made publicly available in the
pip-installable code margarine.
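
The marginal statistics named above follow directly from any density estimate of the marginal posterior. The sketch below is a minimal Python illustration, not the margarine interface: a scipy Gaussian KDE stands in for the package's masked autoregressive flows and kernel density estimators, the toy samples and uniform prior are assumptions, and all names (posterior_kde, marginal_kl, marginal_bmd) are hypothetical.

# Conceptual sketch: marginal KL divergence and marginal Bayesian model
# dimensionality from posterior samples. A scipy Gaussian KDE is used here
# purely for illustration; it is not the margarine API.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Toy posterior samples over (science, nuisance) parameters; in practice
# these would come from a nested sampling or MCMC run.
science = rng.normal(loc=0.3, scale=0.1, size=5000)
nuisance = rng.normal(loc=0.0, scale=1.0, size=5000)
samples = np.column_stack([science, nuisance])

# Marginalising over the nuisance parameter amounts to dropping its column
# before density estimation.
marginal_samples = samples[:, :1]

# Density estimate of the marginal posterior P(theta_science | data).
posterior_kde = gaussian_kde(marginal_samples.T)

# Assumed uniform prior on the science parameter over [-1, 1] (illustrative).
prior_log_pdf = -np.log(2.0) * np.ones(len(marginal_samples))

# log P - log pi evaluated at the posterior samples.
log_ratio = posterior_kde.logpdf(marginal_samples.T) - prior_log_pdf

# Marginal KL divergence: posterior expectation of log(P/pi).
marginal_kl = np.mean(log_ratio)

# Marginal Bayesian model dimensionality: d = 2 Var[log(P/pi)] over the posterior.
marginal_bmd = 2.0 * np.var(log_ratio)

# The same density estimate also yields new samples and marginal log probabilities.
new_samples = posterior_kde.resample(1000)
log_probs = posterior_kde.logpdf(new_samples)

print(f"marginal KL divergence ~ {marginal_kl:.3f} nats")
print(f"marginal model dimensionality ~ {marginal_bmd:.3f}")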
| Search Query: ArXiv Query: search_query=au:"Anastasia Fialkov"&id_list=&start=0&max_results=10