Core Concepts
O-INFORMATION provides insight into the synergy-redundancy balance of multivariate systems, with SΩI offering a scalable and flexible way to estimate it.
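O-information (Rosas et al., 2019) is defined as Ω(Xⁿ) = (n−2)·H(Xⁿ) + Σⱼ [H(Xⱼ) − H(Xⁿ₋ⱼ)], with Ω > 0 indicating a redundancy-dominated system and Ω < 0 a synergy-dominated one. As an illustration only (not the SΩI estimator from the paper), the sketch below computes Ω in closed form for Gaussian systems, where entropies follow from covariance determinants:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a Gaussian with covariance `cov`."""
    cov = np.atleast_2d(cov)
    n = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(cov))

def o_information(cov):
    """O-information of a Gaussian system:
    Omega = (n - 2) * H(X) + sum_j [H(X_j) - H(X_{-j})]."""
    n = cov.shape[0]
    omega = (n - 2) * gaussian_entropy(cov)
    for j in range(n):
        rest = [i for i in range(n) if i != j]
        omega += gaussian_entropy(cov[j, j]) - gaussian_entropy(cov[np.ix_(rest, rest)])
    return omega

# Independent variables carry no high-order structure: Omega ~ 0.
print(o_information(np.eye(3)))
```

For strongly (pairwise) correlated Gaussians, Ω comes out positive, matching the intuition that shared correlation is redundant information.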
Abstract
The paper presents SΩI, a scalable method for computing O-INFORMATION in multivariate systems that overcomes limitations of existing approaches. It reviews the concepts of information synergy and redundancy and highlights practical applications. The paper covers high-dimensional interaction measures, score-based divergence estimation, and experimental validation in synthetic and real-world scenarios.
Introduction to O-INFORMATION and its significance.
Limitations of existing methods like PID.
Development of SΩI for scalable O-INFORMATION estimation.
Experimental validation in synthetic and real systems.
Application to neuroscience data analysis.
Stats
Mutual Information (MI) is a fundamental measure of (possibly non-linear) statistical dependence between random variables (Shannon, 1948; MacKay, 2003).
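For jointly Gaussian variables MI has a well-known closed form, I(X;Y) = −½ ln(1 − ρ²), which is often used as a sanity check for MI estimators. A minimal sketch:

```python
import numpy as np

def gaussian_mi(rho):
    """Mutual information (nats) between two jointly Gaussian
    variables with correlation coefficient rho."""
    return -0.5 * np.log(1.0 - rho ** 2)

print(gaussian_mi(0.0))  # independent variables: 0 nats
print(gaussian_mi(0.9))  # strong dependence: ~0.83 nats
```

MI vanishes exactly when the variables are independent and diverges as |ρ| approaches 1.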
Computational complexity grows super-exponentially: the number of PID atoms follows the Dedekind number of the number of variables (more than 10^41 for 9 variables).
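The first ten Dedekind numbers M(0)..M(9) are known exactly (M(9) was only computed in 2023), and they bound the size of the PID redundancy lattice. A quick sketch of how fast this growth rules out exhaustive PID computation:

```python
# Known Dedekind numbers M(0)..M(9); M(n) governs the number of
# PID atoms over n source variables.
DEDEKIND = [
    2, 3, 6, 20, 168, 7581, 7828354,
    2414682040998,
    56130437228687557907788,
    286386577668298411128469151667598498812366,
]

for n, d in enumerate(DEDEKIND):
    # Print the order of magnitude (number of decimal digits minus 1).
    print(f"n = {n}: M(n) ~ 10^{len(str(d)) - 1}")
```

Already at n = 9 the lattice has on the order of 10^41 atoms, which is why PID-based approaches do not scale and a direct O-information estimator such as SΩI is attractive.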
Quotes
"The main limitations of PID persist in all variants."
"Recent work focuses on studying individual influence of variables to high-order interactions."