  • Publication
    Bayesian Inference, Model Selection and Likelihood Estimation using Fast Rejection Sampling: The Conway-Maxwell-Poisson Distribution
    (International Society for Bayesian Analysis, 2021-09)
    Bayesian inference for models with intractable likelihood functions represents a challenging suite of problems in modern statistics. In this work we analyse the Conway-Maxwell-Poisson (COM-Poisson) distribution, a two-parameter generalisation of the Poisson distribution. COM-Poisson regression modelling allows the flexibility to model dispersed count data as part of a generalised linear model (GLM) with a COM-Poisson response, where exogenous covariates control the mean and dispersion level of the response. The major difficulty with COM-Poisson regression is that the likelihood function contains multiple intractable normalising constants and is not amenable to standard inference and Markov Chain Monte Carlo (MCMC) techniques. Recent work by Chanialidis et al. (2018) developed a sampler to draw random variates from the COM-Poisson distribution using a rejection sampling algorithm. We provide a new rejection sampler for the COM-Poisson distribution which significantly reduces the central processing unit (CPU) time required to perform inference for COM-Poisson regression models. An extension of this work shows that, for any intractable likelihood function with an associated rejection sampler, it is possible to construct unbiased estimators of the intractable likelihood, which proves useful for model selection or for use within pseudo-marginal MCMC algorithms (Andrieu and Roberts, 2009). We demonstrate all of these methods on a real-world dataset of takeover bids.
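The rejection-sampling idea behind this abstract can be illustrated with a minimal sampler. The COM-Poisson mass is proportional to λ^x / (x!)^ν, and for ν ≥ 1 a Poisson envelope with rate μ = λ^(1/ν) dominates it, with acceptance ratio (μ^x / x!)^(ν−1) maximised near x = ⌊μ⌋. The sketch below is illustrative only: the function name `rcompois` and the simple inversion-based Poisson proposal are assumptions, and this is not the paper's faster sampler.

```python
import math
import random

def rcompois(lam, nu, rng=random):
    """One draw from COM-Poisson(lam, nu) with nu >= 1, by rejection
    sampling under a Poisson(mu) envelope with mu = lam**(1/nu)."""
    mu = lam ** (1.0 / nu)
    m = math.floor(mu)
    # Log of the envelope constant: the acceptance ratio
    # (mu**x / x!)**(nu - 1) peaks near x = floor(mu).
    log_bound = (nu - 1.0) * (m * math.log(mu) - math.lgamma(m + 1))
    while True:
        # Poisson(mu) proposal by inversion (adequate for moderate mu)
        x, p = 0, math.exp(-mu)
        u, cum = rng.random(), p
        while u > cum:
            x += 1
            p *= mu / x
            cum += p
        # Accept with probability exp(log_alpha) <= 1
        log_alpha = (nu - 1.0) * (x * math.log(mu) - math.lgamma(x + 1)) - log_bound
        if -rng.expovariate(1.0) < log_alpha:  # log(Uniform) < log_alpha
            return x
```

For ν = 1 the acceptance probability is identically one and the sampler reduces to a plain Poisson draw; acceptance counts from such a sampler are also the raw material for the unbiased likelihood estimators the abstract mentions.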
  • Publication
    Noisy Hamiltonian Monte Carlo for Doubly Intractable Distributions
    (Taylor & Francis, 2018-10-29)
    Hamiltonian Monte Carlo (HMC) has been progressively incorporated within the statistician's toolbox as an alternative sampling method in settings where standard Metropolis-Hastings is inefficient. HMC generates a Markov chain on an augmented state space with transitions based on a deterministic differential flow derived from Hamiltonian mechanics. In practice, the evolution of Hamiltonian systems cannot be solved analytically, requiring numerical integration schemes. Under numerical integration, the resulting approximate solution no longer preserves the measure of the target distribution; therefore an accept-reject step is used to correct the bias. For doubly-intractable distributions, such as posterior distributions based on Gibbs random fields, HMC suffers from some computational difficulties: both the computation of gradients in the differential flow and the computation of the accept-reject step pose difficulty. In this paper, we study the behaviour of HMC when these quantities are replaced by Monte Carlo estimates.
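The structure of one noisy HMC transition can be sketched as follows: a standard leapfrog integrator where the exact gradient of the log posterior is replaced by a stochastic estimate, followed by the usual accept-reject correction. All names here (`noisy_hmc_step`, `grad_est`) are illustrative assumptions; in the doubly-intractable setting of the paper the accept-reject quantity would itself be an estimate, which this one-dimensional sketch does not attempt to reproduce.

```python
import math
import random

def noisy_hmc_step(q, log_post, grad_est, eps=0.1, L=20, rng=random):
    """One HMC transition for a scalar state q, using a (possibly noisy)
    gradient estimate grad_est(q) inside the leapfrog integrator."""
    p = rng.gauss(0.0, 1.0)                 # resample momentum
    q_new, p_new = q, p
    p_new += 0.5 * eps * grad_est(q_new)    # initial half step for momentum
    for _ in range(L - 1):
        q_new += eps * p_new                # full step for position
        p_new += eps * grad_est(q_new)      # full step for momentum
    q_new += eps * p_new
    p_new += 0.5 * eps * grad_est(q_new)    # final half step for momentum
    # Accept-reject on the change in the Hamiltonian -log_post(q) + p^2/2
    log_accept = (log_post(q_new) - 0.5 * p_new ** 2) \
               - (log_post(q) - 0.5 * p ** 2)
    if -rng.expovariate(1.0) < log_accept:  # log(Uniform) < log_accept
        return q_new
    return q
```

With an exact gradient this is plain HMC; the interest of the paper is precisely how much gradient noise the chain tolerates before the stationary distribution degrades.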
  • Publication
    Adaptive MCMC for multiple changepoint analysis with applications to large datasets
    (Institute of Mathematical Statistics, 2018)
    We consider the problem of Bayesian inference for changepoints where the number and position of the changepoints are both unknown. In particular, we consider product partition models where it is possible to integrate out the model parameters for the regime between each changepoint, leaving a posterior distribution over a latent vector indicating the presence or absence of a changepoint at each observation. The same problem setting was considered by Fearnhead (2006), where filtering recursions can be used to make exact inference. However, the complexity of this filtering-recursions algorithm is quadratic in the number of observations. Our approach relies on an adaptive Markov Chain Monte Carlo (MCMC) method for finite discrete state spaces. We develop an adaptive algorithm which can learn from the past states of the Markov chain in order to build proposal distributions that quickly discover where changepoints are likely to be located. We prove that our algorithm is ergodic with respect to the posterior distribution. Crucially, we demonstrate that our adaptive MCMC algorithm is viable for large datasets for which the filtering recursions approach is not. Moreover, we show that inference is possible in a reasonable time, making Bayesian changepoint detection computationally efficient.
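The flavour of such an adaptive sampler over the latent binary vector can be sketched as a single-flip Metropolis scheme whose proposal weights are learned from the chain's history. Everything here is an illustrative assumption, not the paper's exact scheme: in the product partition setting `log_post` would be the marginal posterior after integrating out the regime parameters, the adaptation rule would differ, and a mixing weight `eps` keeps every position proposable so the adapted proposal never starves a state.

```python
import random

def adaptive_changepoint_mcmc(log_post, n, iters=5000, eps=0.05, rng=random):
    """Adaptive single-flip sampler over a binary vector z of length n,
    where z[i] = 1 indicates a changepoint at observation i. Positions
    are proposed from weights learned from past states of the chain."""
    z = [0] * n
    counts = [1.0] * n          # pseudo-counts of past changepoint activity
    samples = []
    for _ in range(iters):
        total = sum(counts)
        # Mixture of adapted weights and uniform, so every flip stays possible
        w = [(1 - eps) * c / total + eps / n for c in counts]
        # Sample a position i ~ w and propose flipping z[i]
        u, acc, i = rng.random(), 0.0, 0
        for i, wi in enumerate(w):
            acc += wi
            if u < acc:
                break
        z_prop = list(z)
        z_prop[i] = 1 - z_prop[i]
        # The flip is deterministic given i, so the proposal ratio cancels
        if -rng.expovariate(1.0) < log_post(z_prop) - log_post(z):
            z = z_prop
        counts[i] += z[i]       # adapt toward positions that hold a changepoint
        samples.append(list(z))
    return samples
```

Each sweep costs O(n) regardless of the data size hidden inside `log_post`, which is the contrast the abstract draws with the quadratic filtering recursions.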