Speaker
Description
We propose an entropy-based framework for uncertainty quantification in settings involving multiple, heterogeneous data sources. The central idea is to represent each empirical layer through an entropy-induced probability measure, allowing information to be shared and propagated across layers in a principled and consistent manner. This approach provides a natural mechanism for reconciling uncertainty arising from observational, experimental, and model-based components, while enabling interpretable variance–covariance decompositions analogous to ANOVA.
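As a hedged illustration of the kind of construction described above (the symbols here are our own and not notation from the talk), an entropy-induced layer measure and an ANOVA-style uncertainty decomposition might take the form:

```latex
% Illustrative sketch only; H_k, \theta_k, and Y are assumed notation.
% Each layer k carries an entropy/data-fit functional H_k, inducing a
% Gibbs-type probability measure on the shared parameter \theta:
p_k(\theta) \;\propto\; \exp\{-H_k(\theta)\}, \qquad k = 1,\dots,K.
% Uncertainty in an output Y then decomposes across layers via the
% law of total variance (the ANOVA-like split referenced in the abstract):
\operatorname{Var}(Y)
  \;=\; \underbrace{\operatorname{Var}\!\bigl(\mathbb{E}[Y \mid \theta_k]\bigr)}_{\text{between-layer}}
  \;+\; \underbrace{\mathbb{E}\!\bigl[\operatorname{Var}(Y \mid \theta_k)\bigr]}_{\text{within-layer}}.
```

The first display is one standard way an entropy functional induces a probability measure; the second is the generic between/within split that any layerwise variance–covariance decomposition refines.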
As an illustrative example, we discuss recent work on random-measure-based sensitivity analysis in randomized controlled trials (Bastian, Rabitz, and Rempala, 2025), where entropy-consistent measures are used to quantify and decompose uncertainty across treatment and outcome spaces. Although it arises in a clinical context, this example illustrates the broader applicability of entropy-driven information sharing for uncertainty quantification in complex, multi-layer systems.