Optimal parallelization of a sequential approximate Bayesian computation algorithm


Bibliographic Details
Published in: Proceedings of the 2012 Winter Simulation Conference (WSC), pp. 1-7
Main Authors: Marin, Jean-Michel; Pudlo, Pierre; Sedki, Mohammed
Format: Conference Proceeding
Language: English
Published: IEEE, 01.12.2012
Subjects:
ISBN: 1467347795, 9781467347792
ISSN: 0891-7736
Description
Summary: Approximate Bayesian Computation (ABC) methods have been very successful at performing Bayesian inference on the parameters of models whose likelihood is intractable. These algorithms consist in comparing the observed dataset to many simulated datasets, which can be generated in different ways. Typically, the rejection ABC scheme first simulates parameters with independent calls to the prior distribution and then, given these values, generates the datasets with independent calls to the model. For such a method, the computation time needed to obtain a suitable approximation of the posterior distribution can be very long. There also exist sequential Monte Carlo methods that replace simulations from the prior with successive approximations to the posterior distribution. Here, we recall a sequential simulation algorithm and compare different parallelization strategies. We notably show that parallelizing the sequential ABC sampler is useless beyond four threads per instance of the program, and that the standard rejection ABC sampler should be preferred when a large number of cores is available. Indeed, in such a case, the cost of parallelizing the sequential ABC sampler prohibits its use.
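The rejection ABC scheme described in the summary (draw parameters from the prior, simulate data from the model, and keep only the parameters whose simulated data fall close to the observation) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the toy Gaussian model, the uniform prior, and the mean-based distance are assumptions chosen for clarity.

```python
import numpy as np

def rejection_abc(observed, prior_sample, simulate, distance, n_draws, epsilon):
    """Basic rejection ABC sampler.

    Repeatedly draws a parameter from the prior, simulates a dataset from
    the model, and accepts the parameter when the simulated dataset lies
    within epsilon of the observed one (as measured by `distance`).
    """
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()   # independent call to the prior
        data = simulate(theta)   # independent call to the model
        if distance(data, observed) <= epsilon:
            accepted.append(theta)
    return np.array(accepted)

# Toy example (assumed for illustration): infer the mean of a Gaussian
# with known unit variance, using the sample mean as summary statistic.
rng = np.random.default_rng(0)
observed = rng.normal(2.0, 1.0, size=100)

posterior = rejection_abc(
    observed,
    prior_sample=lambda: rng.uniform(-5.0, 5.0),
    simulate=lambda mu: rng.normal(mu, 1.0, size=100),
    distance=lambda a, b: abs(a.mean() - b.mean()),
    n_draws=20_000,
    epsilon=0.1,
)
```

Because each (draw, simulate, compare) trial is independent of all the others, this rejection scheme parallelizes trivially across cores, which is precisely why the paper finds it preferable to the sequential sampler on machines with many cores.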
DOI:10.1109/WSC.2012.6465244