Speaker
Description
Discovering differential equations governing dynamical systems is a fundamental challenge in mathematical biology, where mechanistic models are used to study complex processes such as gene regulation, cell fate decisions, and tumor dynamics. This becomes particularly difficult when experimental data are sparse. Generative Flow Networks (GFlowNets) are a probabilistic framework for sampling diverse candidate solutions in proportion to a reward function \cite{Bengio2021}, and recent work has shown that GFlowNets can be applied to symbolic regression for mechanistic model discovery \cite{Li2023}. However, biological data are often sparse and noisy, making the derivative estimates required by this approach unreliable.
We present a method that combines GFlowNets with physics-informed neural networks (PINNs) to discover differential equation models from sparse time-series data. The GFlowNet sequentially generates candidate symbolic expressions for a differential equation; each expression is then fit with a PINN, which learns trajectories consistent with both the sparse, noisy observations and the proposed equation. The resulting PINN loss defines the reward used to train the GFlowNet. We demonstrate the approach on synthetic biological data from gene regulatory networks and evaluate whether it can recover the governing equations under sparse-data conditions.
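To make the reward computation concrete, the sketch below fits a small network to sparse observations of a one-dimensional system under a candidate right-hand side and turns the combined data-plus-residual PINN loss into a reward. This is a minimal illustration under assumed choices: the names (TrajectoryNet, pinn_reward), the physics weight lam, and the exp(-loss) reward transform are illustrative, not details taken from the abstract.

import torch
import torch.nn as nn

class TrajectoryNet(nn.Module):
    """Small MLP approximating the trajectory x(t) (illustrative architecture)."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, t):
        return self.net(t)

def pinn_reward(rhs, t_obs, x_obs, t_colloc, lam=1.0, steps=2000, lr=1e-3):
    """Fit a PINN to sparse observations under a candidate ODE dx/dt = rhs(x)
    and return a reward derived from the final combined loss (assumed form)."""
    model = TrajectoryNet()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Data term: match the sparse, noisy observations.
        data_loss = torch.mean((model(t_obs) - x_obs) ** 2)
        # Physics term: residual of the candidate ODE at collocation points.
        t_c = t_colloc.clone().requires_grad_(True)
        x_c = model(t_c)
        dx_dt = torch.autograd.grad(x_c.sum(), t_c, create_graph=True)[0]
        phys_loss = torch.mean((dx_dt - rhs(x_c)) ** 2)
        loss = data_loss + lam * phys_loss
        loss.backward()
        opt.step()
    # Lower combined loss -> higher reward for the GFlowNet (one possible shaping).
    return torch.exp(-loss.detach())

if __name__ == "__main__":
    # Synthetic sparse, noisy data from x(t) = exp(-0.5 t); the GFlowNet is
    # assumed to have proposed the candidate expression dx/dt = -0.5 * x.
    t_obs = torch.tensor([[0.0], [1.0], [2.5], [4.0]])
    x_obs = torch.exp(-0.5 * t_obs) + 0.05 * torch.randn_like(t_obs)
    t_colloc = torch.linspace(0, 4, 50).reshape(-1, 1)
    reward = pinn_reward(lambda x: -0.5 * x, t_obs, x_obs, t_colloc)
    print(f"reward = {reward.item():.4f}")

The exponential transform keeps the reward positive and bounded; in practice the weighting between the data and physics terms and the reward shaping would need to be tuned for the gene regulatory network benchmarks described above.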
Bibliography
@misc{Li2023,
author = {Li, Sida and Marinescu, Ioana and Musslick, Sebastian},
title = {GFN-SR: Symbolic Regression with Generative Flow Networks},
year = {2023},
eprint = {2312.00396},
archivePrefix = {arXiv},
url = {https://arxiv.org/abs/2312.00396}
}
@misc{Bengio2021,
author = {Bengio, Emmanuel and Jain, Moksh and Korablyov, Maksym and Precup, Doina and Bengio, Yoshua},
title = {Flow Network Based Generative Models for Non-Iterative Diverse Candidate Generation},
year = {2021},
eprint = {2106.04399},
archivePrefix = {arXiv},
url = {https://arxiv.org/abs/2106.04399}
}