Transfer entropy measures directed information flow between time series and has become a fundamental quantity in applications spanning neuroscience, finance, and complex-systems analysis. However, existing estimation methods suffer from the curse of dimensionality, require restrictive distributional assumptions, or need exponentially large datasets for reliable convergence. We address these limitations by proposing TENDE (Transfer Entropy Neural Diffusion Estimation), a novel approach that leverages score-based diffusion models to estimate transfer entropy through conditional mutual information. By learning the score functions of the relevant conditional distributions, TENDE provides flexible, scalable estimation while making minimal assumptions about the underlying data-generating process. We demonstrate superior accuracy and robustness compared to existing neural estimators and other state-of-the-art approaches across synthetic benchmarks and real data.
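The quantity being estimated is standard: transfer entropy from a source X to a target Y is the conditional mutual information between the target's next value and the source's past, given the target's own past. A minimal statement, with generic history lengths k and l used for illustration rather than taken from the paper:

\[
T_{X \to Y} \;=\; I\bigl(Y_t ;\, X_{t-k:t-1} \mid Y_{t-l:t-1}\bigr)
\;=\; H\bigl(Y_t \mid Y_{t-l:t-1}\bigr) \;-\; H\bigl(Y_t \mid X_{t-k:t-1},\, Y_{t-l:t-1}\bigr)
\]

In this form the estimation reduces to conditional (differential) entropies of the involved conditional distributions, which is where the learned score functions mentioned in the abstract presumably enter.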
TENDE: Transfer Entropy Neural Diffusion Estimation
Submitted to arXiv, 15 October 2025
Type:
Report
Date:
2025-10-15
Department:
Data Science
Eurecom Ref:
8434
Copyright:
© EURECOM. Personal use of this material is permitted. The definitive version of this paper was submitted to arXiv on 15 October 2025 and is available at:
See also:
PERMALINK: https://www.eurecom.fr/publication/8434