12–17 Jul 2026
University of Graz
Europe/Vienna timezone

From CRN to CNN: Deep Learning with Stochastic Chemical Reaction Networks

MS95-01
14 Jul 2026, 10:40
20m
11.02 - HS (University of Graz)

Minisymposium Talk · Systems Biology and Biochemical Networks · Molecular Computing: Theory and Implementations

Speaker

Bernie Daigle (The University of Memphis)

Description

Advances in deep neural networks (DNNs) have revolutionized our ability to model complex data, owing to their capacity to approximate continuous functions to arbitrary accuracy. DNN models have achieved or surpassed human-level performance in tasks such as image classification, generative modeling, and scientific discovery. Recent studies of chemical reaction networks (CRNs) suggest similar potential; for example, it has been shown that the stationary distributions of particular stochastic CRNs can approximate any distribution on the nonnegative integer lattice arbitrarily well \cite{cappelletti_stochastic_2020}.
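As a toy illustration of the idea (a minimal sketch, not the construction of \cite{cappelletti_stochastic_2020}): the birth-death network ∅ → X at rate k and X → ∅ at rate γ per molecule is the simplest stochastic CRN with a known stationary law, namely Poisson(k/γ). A short Gillespie simulation can check this; the function name and parameter values below are illustrative choices, not taken from the talk.

```python
import random

def gillespie_birth_death(k=10.0, gamma=1.0, t_end=500.0, burn_in=50.0, seed=0):
    """Gillespie SSA for the birth-death CRN  0 -> X (rate k),  X -> 0 (rate gamma*x).

    The stationary distribution of this CRN is exactly Poisson(k/gamma),
    so the long-run time-averaged copy number should approach k/gamma.
    """
    rng = random.Random(seed)
    t, x = 0.0, 0
    area, t_obs = 0.0, 0.0          # time-weighted occupancy after burn-in
    while t < t_end:
        a_birth, a_death = k, gamma * x
        dt = rng.expovariate(a_birth + a_death)
        # accumulate the time spent in the current state x (post burn-in)
        lo, hi = max(t, burn_in), min(t + dt, t_end)
        if hi > lo:
            area += x * (hi - lo)
            t_obs += hi - lo
        t += dt
        if rng.random() < a_birth / (a_birth + a_death):
            x += 1                  # birth event
        else:
            x -= 1                  # death event
    return area / t_obs

mean_x = gillespie_birth_death()
print(mean_x)  # close to k/gamma = 10.0
```

The time-weighted average (rather than a per-event average) is used because the stationary distribution of a continuous-time Markov chain is defined by occupancy times, not jump counts.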

To explore this potential, we develop well-mixed stochastic CRN models with mass-action kinetics whose equilibrium behavior enables accurate image classification and realistic image generation. We construct our models as multilayer regulatory cascades, with network motifs implementing standard neural network operations such as convolution, pooling, and nonlinear activation. To calibrate and evaluate these models efficiently, we apply the linear noise approximation (LNA), obtained by truncating the system-size expansion of the chemical master equation (CME) at second order. We benchmark our models' ability to classify and generate realistic handwritten-digit images from the classic MNIST dataset, showing performance competitive with that of DNNs. Our results underscore the potential of CRNs as universal approximators and suggest new methodological directions for modeling complex data.
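For intuition on what the LNA delivers (a sketch on the birth-death motif above, not the talk's models): the macroscopic mean obeys dφ/dt = k − γφ, and the LNA covariance obeys the Lyapunov equation dΣ/dt = 2JΣ + D with Jacobian J = −γ and diffusion D = k + γφ (the sum of propensities). Because the propensities here are linear, the LNA steady state reproduces the exact Poisson result (mean = variance = k/γ), which is why such approximations can be used to calibrate CRN models cheaply.

```python
def lna_steady_state(k=10.0, gamma=1.0):
    """Steady-state mean and variance of the birth-death CRN under the LNA.

    Mean ODE:      dphi/dt   = k - gamma*phi          -> phi* = k/gamma
    Lyapunov eqn:  0 = 2*(-gamma)*sigma2 + (k + gamma*phi*)
    """
    phi = k / gamma                      # fixed point of the mean ODE
    diffusion = k + gamma * phi          # sum of propensities at the fixed point
    sigma2 = diffusion / (2.0 * gamma)   # scalar Lyapunov equation solution
    return phi, sigma2

phi, sigma2 = lna_steady_state()
print(phi, sigma2)  # 10.0 10.0 — mean equals variance, matching Poisson(k/gamma)
```

For nonlinear motifs (e.g. activation functions) the LNA is only approximate, but it replaces stochastic simulation with a small ODE system, which is what makes large-scale calibration tractable.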

Bibliography

@article{cappelletti_stochastic_2020,
title = {Stochastic chemical reaction networks for robustly approximating arbitrary probability distributions},
volume = {801},
issn = {0304-3975},
url = {https://www.sciencedirect.com/science/article/pii/S030439751930502X},
doi = {10.1016/j.tcs.2019.08.013},
abstract = {We show that discrete distributions on the d-dimensional non-negative integer lattice can be approximated arbitrarily well via the marginals of stationary distributions for various classes of stochastic chemical reaction networks. We begin by providing a class of detailed balanced networks and prove that they can approximate any discrete distribution to any desired accuracy. However, these detailed balanced constructions rely on the ability to initialize a system precisely, and are therefore susceptible to perturbations in the initial conditions. We therefore provide another construction based on the ability to approximate point mass distributions and prove that this construction is capable of approximating arbitrary discrete distributions for any choice of initial condition. In particular, the developed models are ergodic, so their limit distributions are robust to a finite number of perturbations over time in the counts of molecules.},
urldate = {2026-03-21},
journal = {Theoretical Computer Science},
author = {Cappelletti, Daniele and Ortiz-Muñoz, Andrés and Anderson, David F. and Winfree, Erik},
month = jan,
year = {2020},
keywords = {Approximation, Arbitrary distributions, Detailed balance, Molecular computing, Robustness, Stochastic chemical reaction networks},
pages = {64--95},
}

