Bayesian variational inference for exponential random graph models
Author(s)
Date Issued
2020-04-15
Date Available
2021-03-04T16:51:03Z
Abstract
Deriving Bayesian inference for exponential random graph models (ERGMs) is a challenging “doubly intractable” problem as the normalizing constants of the likelihood and posterior density are both intractable. Markov chain Monte Carlo (MCMC) methods which yield Bayesian inference for ERGMs, such as the exchange algorithm, are asymptotically exact but computationally intensive, as a network has to be drawn from the likelihood at every step using, for instance, a “tie no tie” sampler. In this article, we develop a variety of variational methods for Gaussian approximation of the posterior density and model selection. These include nonconjugate variational message passing based on an adjusted pseudolikelihood and stochastic variational inference. To overcome the computational hurdle of drawing a network from the likelihood at each iteration, we propose stochastic gradient ascent with biased but consistent gradient estimates computed using adaptive self-normalized importance sampling. These methods provide attractive fast alternatives to MCMC for posterior approximation. We illustrate the variational methods using real networks and compare their accuracy with results obtained via MCMC and Laplace approximation.
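The abstract mentions gradient estimates computed via self-normalized importance sampling, which avoids the intractable normalizing constants because they cancel in the weight normalization. The following is a minimal generic sketch of that estimator (not the paper's implementation, and on a toy Gaussian target rather than an ERGM likelihood): it estimates an expectation under an unnormalized target density using samples from a proposal.

```python
import numpy as np

def snis_estimate(x, log_p_unnorm, log_q, f):
    """Self-normalized importance sampling estimate of E_p[f(X)],
    where p is known only up to a normalizing constant and the
    samples x are drawn from the proposal q (vectorized callables)."""
    log_w = log_p_unnorm(x) - log_q(x)
    w = np.exp(log_w - log_w.max())   # subtract max for numerical stability
    w /= w.sum()                      # normalizing constants of p and q cancel here
    return np.sum(w * f(x))

# Toy check: target p = N(2, 1) (unnormalized), proposal q = N(0, 2).
rng = np.random.default_rng(0)
xs = rng.normal(0.0, 2.0, size=200_000)
log_p = lambda x: -0.5 * (x - 2.0) ** 2        # unnormalized log density of p
log_q = lambda x: -0.5 * (x / 2.0) ** 2        # unnormalized is fine: constants cancel
est = snis_estimate(xs, log_p, log_q, lambda x: x)   # should be close to 2.0
```

Because the weights are normalized to sum to one, the estimator is biased for a finite sample but consistent as the sample size grows, which matches the abstract's description of the gradient estimates used in the stochastic gradient ascent scheme.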
Sponsorship
Science Foundation Ireland
Other Sponsorship
Insight Research Centre
Type of Material
Journal Article
Publisher
Taylor & Francis
Journal
Journal of Computational and Graphical Statistics
Volume
29
Issue
4
Start Page
910
End Page
928
Language
English
Status of Item
Peer reviewed
This item is made available under a Creative Commons License
File(s)
Name
insight_publication.pdf
Size
1.9 MB
Format
Adobe PDF
Checksum (MD5)
1b82e02bb1c10a7cbebb9fd1482cb6c8