Title: Bayesian variational inference for exponential random graph models
Authors: Tan, Linda S. L.; Friel, Nial
Journal: Journal of Computational and Graphical Statistics, 29 (4), pp. 910-928
DOI: 10.1080/10618600.2020.1740714
Handle: http://hdl.handle.net/10197/12008
Type: Journal Article
Language: en
Date issued: 2020-04-15
Published online: 2020-05-20
Date deposited: 2021-03-04
Keywords: Adjusted pseudolikelihood; Adaptive self-normalized importance sampling; Exponential random graph model; Nonconjugate variational message passing; Stochastic variational inference
Funder grant: SFI/12/RC/2289
Other identifier: 155-000-190-133
License: https://creativecommons.org/licenses/by-nc-nd/3.0/ie/

Abstract: Deriving Bayesian inference for exponential random graph models (ERGMs) is a challenging "doubly intractable" problem, as the normalizing constants of both the likelihood and the posterior density are intractable. Markov chain Monte Carlo (MCMC) methods that yield Bayesian inference for ERGMs, such as the exchange algorithm, are asymptotically exact but computationally intensive, since a network has to be drawn from the likelihood at every step using, for instance, a "tie no tie" sampler. In this article, we develop a variety of variational methods for Gaussian approximation of the posterior density and for model selection. These include nonconjugate variational message passing based on an adjusted pseudolikelihood, and stochastic variational inference. To overcome the computational hurdle of drawing a network from the likelihood at each iteration, we propose stochastic gradient ascent with biased but consistent gradient estimates computed using adaptive self-normalized importance sampling. These methods provide attractive fast alternatives to MCMC for posterior approximation. We illustrate the variational methods on real networks and compare their accuracy with results obtained via MCMC and Laplace approximation.
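The gradient estimator described in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal, hedged example of self-normalized importance sampling for E_theta[s(Y)] under an unnormalized exponential-family model p_theta(y) proportional to exp(theta' s(y)); all function and variable names are assumptions for illustration, not the authors' implementation. Plugging the estimate into the score s(y_obs) - E_theta[s(Y)] gives a noisy gradient for a stochastic gradient ascent step.

```python
import numpy as np

def snis_mean_suff_stats(theta, suff_stats, log_q):
    """Self-normalized importance sampling estimate of E_theta[s(Y)].

    Illustrative assumptions (not the paper's code):
      suff_stats : (n, d) array of sufficient statistics s(y_i) for n networks
                   drawn from a proposal distribution q
      log_q      : (n,) array of log proposal densities of the draws (up to a constant)
    """
    log_w = suff_stats @ theta - log_q   # log unnormalized importance weights
    log_w -= log_w.max()                 # stabilize before exponentiating
    w = np.exp(log_w)
    w_bar = w / w.sum()                  # self-normalized weights sum to one
    return w_bar @ suff_stats            # weighted average of sufficient statistics

# Hypothetical usage: a noisy gradient of the ERGM log-likelihood at theta,
# grad_hat = s_obs - snis_mean_suff_stats(theta, suff_stats, log_q)
```

Because the weights are normalized by their own sum, the unknown ERGM normalizing constant cancels; the resulting gradient estimate is biased for a finite number of draws but consistent as the number of draws grows, matching the property stated in the abstract.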