12–17 Jul 2026
University of Graz
Europe/Vienna timezone

Simulation-based inference of epidemic and phylodynamic models via neural posterior estimation

17 Jul 2026, 09:30
20m
10.11 - HS (University of Graz)

Contributed Talk | Mathematical Epidemiology Contributed Talks

Speaker

Francesco Pinotti (INRAE)

Description

Robust parameter inference is central to infectious disease modelling. Traditional Bayesian approaches require an analytical likelihood, which is rarely available for complex models. Classical likelihood-free methods circumvent this limitation through simulation but are inefficient and rely on hand-crafted summary statistics. Neural Posterior Estimation (NPE) is a scalable and flexible alternative that uses deep learning to learn the full posterior distribution \cite{papamakarios_fast_2018}. NPE has shown promise in physics and neuroscience, but is underutilized in infectious disease epidemiology. Here, we evaluate its practical utility for real-world outbreak analysis.
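
To make the core idea concrete, here is a minimal, self-contained sketch of simulation-based posterior learning. It is not the authors' implementation: real NPE replaces the conditional Gaussian below with a neural density estimator (e.g. a normalizing flow, as in the `sbi` package), and the simulator here is a deliberately trivial stand-in. All names and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Draw parameters from the prior.
n_sims = 5000
theta = rng.uniform(0.0, 1.0, n_sims)

# 2. Run the simulator: each parameter draw yields a noisy observable.
#    (In NPE this would be an epidemic or phylodynamic model.)
x = theta + rng.normal(0.0, 0.1, n_sims)

# 3. Posterior estimation reduced to its simplest form: fit a conditional
#    Gaussian q(theta | x) by least-squares regression of theta on x.
#    NPE uses a neural network here, which is what makes the approach
#    scale to high-dimensional data without summary statistics.
A = np.column_stack([np.ones(n_sims), x])
coef, *_ = np.linalg.lstsq(A, theta, rcond=None)
resid_std = np.std(theta - A @ coef)

def posterior(x_obs):
    """Amortized posterior: mean and std of q(theta | x_obs)."""
    return coef[0] + coef[1] * x_obs, resid_std

mean, std = posterior(0.5)
```

Once trained, the estimator is amortized: evaluating the posterior for a new observation requires no further simulation, in contrast to ABC, which must re-sample for every dataset.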

We applied NPE to estimate key epidemiological parameters from two distinct data sources from the 2014 Ebola outbreak in Sierra Leone: (i) time series of reported cases and deaths, and (ii) a phylogeny reconstructed from early viral genome sequences. In both settings, NPE recovered accurate posterior distributions without the need to define any summary statistics.
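
The kind of simulator NPE conditions on in the first setting can be sketched as a stochastic compartmental model producing a case time series. This is a generic toy SIR model, not the study's model; parameter names and values are illustrative.

```python
import numpy as np

def sir_simulate(beta, gamma, n=200_000, i0=10, days=120, seed=0):
    """Toy discrete-time stochastic SIR simulator. NPE would be trained
    on (parameter, output) pairs from a model like this, conditioning
    directly on the simulated time series."""
    rng = np.random.default_rng(seed)
    s, i = n - i0, i0
    daily_cases = []
    for _ in range(days):
        # Binomial draws for new infections and recoveries each day.
        new_inf = rng.binomial(s, 1.0 - np.exp(-beta * i / n))
        new_rec = rng.binomial(i, 1.0 - np.exp(-gamma))
        s -= new_inf
        i += new_inf - new_rec
        daily_cases.append(new_inf)
    return np.array(daily_cases)

ts = sir_simulate(beta=0.3, gamma=0.1)
```

In the phylodynamic setting the simulator would instead generate genealogies under a transmission model, with the network conditioning on an encoding of the tree rather than on a case count vector.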

Our findings demonstrate that NPE is a powerful, general-purpose inference framework for infectious disease modelling. Its flexibility, scalability, and integration within the Bayesian paradigm make it a compelling alternative to traditional inference tools. We advocate for a broader adoption of NPE across epidemiological and phylodynamic applications.

Bibliography

@misc{papamakarios_fast_2018,
  title = {Fast $\epsilon$-free Inference of Simulation Models with Bayesian Conditional Density Estimation},
  author = {Papamakarios, George and Murray, Iain},
  publisher = {arXiv},
  month = apr,
  year = {2018},
  url = {http://arxiv.org/abs/1605.06376},
  doi = {10.48550/arXiv.1605.06376},
  note = {arXiv:1605.06376},
  urldate = {2026-03-09},
  keywords = {Statistics - Machine Learning, Computer Science - Machine Learning, Statistics - Computation},
}

Author

Francesco Pinotti (INRAE)

Co-authors

Julien Theze (INRAE), Xavier Bailly (INRAE), Guillaume Fournié (INRAE)
