Informed sub-sampling MCMC: approximate Bayesian inference for large datasets
Author(s)
Date Issued
2018-06-09
Date Available
2019-05-13T09:14:31Z
Abstract
This paper introduces a framework for speeding up Bayesian inference conducted in the presence of large datasets. We design a Markov chain whose transition kernel uses a fixed-size fraction of the available data that is randomly refreshed throughout the algorithm. Inspired by the Approximate Bayesian Computation literature, the subsampling process is guided by fidelity to the observed data, as measured by summary statistics. The resulting algorithm, Informed Sub-Sampling MCMC, is a generic and flexible approach which, contrary to existing scalable methodologies, preserves the simplicity of the Metropolis–Hastings algorithm. Although exactness is lost, i.e. the chain distribution only approximates the posterior, we study and quantify this bias theoretically and show on a diverse set of examples that the algorithm yields excellent performance when the computational budget is limited. We also show that, when the maximum likelihood estimator is available and cheap to compute, using it as the summary statistic is supported by theoretical arguments.
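The abstract's idea, an MCMC chain that runs Metropolis–Hastings on a fixed-size subsample whose refreshes are biased toward subsamples matching the full data's summary statistics, can be illustrated with a schematic sketch. This is not the authors' exact algorithm: the Gaussian toy model, the summary-statistic distance, the tolerance `eps`, and all tuning constants here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: x_i ~ N(mu, 1) with a flat prior on mu (illustrative assumption).
N, n = 10_000, 100            # full data size, fixed subsample size
mu_true = 1.0
x = rng.normal(mu_true, 1.0, N)
S_full = x.mean()             # summary statistic of the full data

def loglik(mu, idx):
    # Subsample log-likelihood, rescaled by N/n to stand in for the full data.
    return -(N / n) * 0.5 * np.sum((x[idx] - mu) ** 2)

def summary_dist(idx):
    # Fidelity of the subsample to the observed data via the summary statistic.
    return abs(x[idx].mean() - S_full)

mu = 0.0
idx = rng.choice(N, n, replace=False)
eps = 0.01                    # tolerance scale for the informed refresh
samples = []
for t in range(5000):
    # 1) Informed subsample refresh: swap a few indices, favouring subsamples
    #    whose summary statistic stays close to that of the full data.
    prop = idx.copy()
    swap = rng.integers(n, size=5)
    prop[swap] = rng.choice(N, 5, replace=False)
    if np.log(rng.random()) < (summary_dist(idx) - summary_dist(prop)) / eps:
        idx = prop
    # 2) Plain Metropolis-Hastings step on mu using the current subsample only.
    mu_prop = mu + 0.05 * rng.normal()
    if np.log(rng.random()) < loglik(mu_prop, idx) - loglik(mu, idx):
        mu = mu_prop
    samples.append(mu)

post_mean = np.mean(samples[2000:])  # posterior mean estimate after burn-in
```

Each iteration touches only n of the N observations, which is the source of the speed-up; the refresh step is what keeps the subsample "informed" rather than a plain random batch.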
Sponsorship
Science Foundation Ireland
Other Sponsorship
Insight Centre for Data Analytics
Labex ECODEC
Fondation du Risque
Type of Material
Journal Article
Publisher
Springer
Journal
Statistics and Computing
Volume
29
Issue
3
Start Page
449
End Page
482
Copyright (Published Version)
2018 Springer
Language
English
Status of Item
Peer reviewed
This item is made available under a Creative Commons License
File(s)
Owning collection
Scopus© citations
5
Acquisition Date
Mar 18, 2024
Views
611
Acquisition Date
Mar 18, 2024
Downloads
275
Last Week
2
Last Month
7
Acquisition Date
Mar 18, 2024