Mutual information (MI) is essential for quantifying dependencies in complex systems, yet accurately estimating it in high-dimensional settings remains challenging. In this talk, we introduce novel methods for MI estimation using generative diffusion processes.
First, we discuss how score-based diffusion models can be leveraged to estimate the Kullback–Leibler (KL) divergence between the distributions of continuous random variables; since MI is the KL divergence between a joint distribution and the product of its marginals, a KL estimator directly yields an MI estimator. We then address the counterpart for discrete random variables. Next, we show how these estimators extend to higher-order interaction measures, revealing the synergy–redundancy balance in multivariate systems.
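To make the continuous case concrete, here is a minimal sketch of a score-based MI estimator in the spirit of diffusion-based KL estimation (e.g., a MINDE-style construction), assuming a variance-preserving forward process with beta(t) = 1: a single network trained by denoising score matching with conditioning dropout provides both the conditional score of X given Y and the marginal score of X, and MI is recovered as a time-integrated expected squared difference of the two scores. Every name here (ScoreNet, vp_perturb, train, estimate_mi, the noise schedule, and the Gaussian toy check) is an illustrative assumption, not the talk's actual implementation.

```python
import torch
import torch.nn as nn


class ScoreNet(nn.Module):
    """Conditional score network s(x_t, t, y); passing y=None yields the
    unconditional (marginal) score via a zeroed conditioning slot."""

    def __init__(self, dim_x, dim_y, hidden=128):
        super().__init__()
        self.dim_y = dim_y
        self.net = nn.Sequential(
            nn.Linear(dim_x + dim_y + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim_x),
        )

    def forward(self, x_t, t, y=None):
        if y is None:  # marginal branch: zero out the conditioning input
            y = torch.zeros(x_t.shape[0], self.dim_y, device=x_t.device)
        return self.net(torch.cat([x_t, y, t], dim=1))


def vp_perturb(x0, t):
    """VP forward marginal with beta(t)=1: x_t = a_t x_0 + s_t eps,
    where a_t = exp(-t/2) and s_t = sqrt(1 - a_t^2)."""
    alpha = torch.exp(-0.5 * t)
    sigma = torch.sqrt(1.0 - alpha ** 2)
    eps = torch.randn_like(x0)
    return alpha * x0 + sigma * eps, eps, sigma


def train(model, x, y, steps=3000, p_drop=0.5, lr=1e-3, t_min=0.1, t_max=5.0):
    """Denoising score matching; conditioning on y is dropped at random so
    one network learns both the conditional and the marginal score."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        t = torch.rand(x.shape[0], 1) * (t_max - t_min) + t_min
        x_t, eps, sigma = vp_perturb(x, t)
        y_in = None if torch.rand(()).item() < p_drop else y
        target = -eps / sigma  # score of the Gaussian perturbation kernel
        loss = ((model(x_t, t, y_in) - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()


@torch.no_grad()
def estimate_mi(model, x, y, n_t=64, t_min=0.1, t_max=5.0):
    """MI ~ 0.5 * int g(t)^2 E||s(x_t,t|y) - s(x_t,t)||^2 dt, with g(t)^2 = 1
    for this schedule (a Girsanov-type identity)."""
    ts = torch.linspace(t_min, t_max, n_t)
    total = 0.0
    for t_val in ts:
        t = torch.full((x.shape[0], 1), float(t_val))
        x_t, _, _ = vp_perturb(x, t)
        diff = model(x_t, t, y) - model(x_t, t, None)
        total = total + 0.5 * (diff ** 2).sum(dim=1).mean()
    return float(total) * (t_max - t_min) / n_t  # Riemann sum over t


if __name__ == "__main__":
    # Toy check: correlated 1-d Gaussians, true MI = -0.5 * log(1 - rho^2).
    torch.manual_seed(0)
    rho = 0.8
    z = torch.randn(4096, 2)
    x, y = z[:, :1], rho * z[:, :1] + (1 - rho ** 2) ** 0.5 * z[:, 1:]
    model = ScoreNet(dim_x=1, dim_y=1)
    train(model, x, y)
    print(estimate_mi(model, x, y))  # true value is about 0.51 nats
```

Note that truncating the time integral near t = 0 and at the terminal time introduces a small bias; in practice one would likely use an importance-weighted time sampling scheme rather than the uniform Riemann sum shown here.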
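As for the higher-order interaction measures, one widely used quantity whose sign summarizes the synergy–redundancy balance is the O-information of Rosas et al. (2019); we state it here for orientation, without claiming it is the specific measure developed in the talk:

```latex
% O-information of n variables: Omega > 0 indicates redundancy-dominated
% interactions, Omega < 0 synergy-dominated ones.
\Omega(X_1,\dots,X_n) = (n-2)\,H(X_1,\dots,X_n)
  + \sum_{i=1}^{n} \big[ H(X_i) - H(X_1,\dots,X_{i-1},X_{i+1},\dots,X_n) \big]
```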
Finally, we present extensions for computing transfer entropy and other dynamical quantities.
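For reference, transfer entropy (Schreiber, 2000) is a conditional mutual information, hence a conditional KL divergence of exactly the kind the estimators above can target; how the talk's extension proceeds from this starting point is beyond the scope of this sketch:

```latex
% Transfer entropy from Y to X: the information Y's past provides about
% X's next state beyond what X's own past already provides.
T_{Y \to X} = I\big(X_{t+1};\, Y_{1:t} \,\big|\, X_{1:t}\big)
  = \mathbb{E}\left[ \log \frac{p(x_{t+1} \mid x_{1:t},\, y_{1:t})}{p(x_{t+1} \mid x_{1:t})} \right]
```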