Core Concepts
The paper investigates the relationship between two practical metrics of high-order interdependencies: the redundancy-synergy index (RSI), which captures directed interdependencies, and the O-information, which captures undirected interdependencies. The results reveal tight links between these two quantities and provide interpretations in terms of likelihood ratios and information geometry.
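For reference, the quantities involved can be written in standard information-theoretic notation (the summary does not spell these out, and the sign convention for the RSI varies across the literature; the form below is one common choice):

```latex
% Total correlation and dual total correlation of X = (X_1, ..., X_n):
\mathrm{TC}(\mathbf{X}) = \sum_{i=1}^{n} H(X_i) - H(\mathbf{X}),
\qquad
\mathrm{DTC}(\mathbf{X}) = H(\mathbf{X}) - \sum_{i=1}^{n} H(X_i \mid \mathbf{X}_{-i})

% O-information (undirected) and one common form of the RSI (directed):
\Omega(\mathbf{X}) = \mathrm{TC}(\mathbf{X}) - \mathrm{DTC}(\mathbf{X}),
\qquad
\mathrm{RSI}(\mathbf{X};Y) = \sum_{i=1}^{n} I(X_i;Y) - I(\mathbf{X};Y)
```

Here $\mathbf{X}_{-i}$ denotes all variables except $X_i$; positive $\Omega$ indicates predominantly redundant interdependencies and negative $\Omega$ predominantly synergistic ones.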
Summary
The paper focuses on characterizing and relating two metrics of high-order statistical interdependencies: the redundancy-synergy index (RSI) and the O-information.
Key highlights:
- The RSI captures directed high-order effects, measuring the extent to which correlations within a set of variables X are created or destroyed by conditioning on a target variable Y.
- The O-information captures undirected high-order effects, measuring the difference between the total correlation and the dual total correlation of X.
- The paper presents analytical expressions relating the RSI and O-information, showing that the RSI can be expressed as the difference between the O-information of the joint system (X,Y) and the conditional O-information of X given Y.
- It is shown that the RSI and O-information can be interpreted in terms of log-likelihood ratios between models with predominantly redundant or synergistic high-order interdependencies.
- The O-information is further characterized as the sum of RSI-like quantities, each considering a different partition of the system.
- These results provide insights into the similarities and differences between directed and undirected approaches to quantifying high-order statistical relationships.
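As a concrete illustration of the relation stated above between the RSI and the O-information, the sketch below computes both quantities for small discrete systems given as joint probability arrays. All function names are illustrative, not from the paper, and the RSI sign convention used (redundancy-positive, i.e. sum of marginal mutual informations minus the joint mutual information) is an assumption chosen so that the identity holds in the form quoted in the summary.

```python
# Minimal numerical sketch (assumed conventions, not the paper's code):
# joint distributions are numpy arrays with one axis per variable,
# and Y is always the LAST axis.
import numpy as np

def H(p):
    """Shannon entropy in bits of a joint probability array."""
    q = p.ravel()
    q = q[q > 0]
    return float(-np.sum(q * np.log2(q)))

def marginal(p, keep):
    """Marginalize p down to the axes listed in `keep`."""
    drop = tuple(i for i in range(p.ndim) if i not in keep)
    return p.sum(axis=drop) if drop else p

def tc(p):
    """Total correlation: sum_i H(X_i) - H(X)."""
    return sum(H(marginal(p, (i,))) for i in range(p.ndim)) - H(p)

def dtc(p):
    """Dual total correlation: H(X) - sum_i H(X_i | X_{-i})."""
    n = p.ndim
    rest = lambda i: tuple(j for j in range(n) if j != i)
    return sum(H(marginal(p, rest(i))) for i in range(n)) - (n - 1) * H(p)

def o_info(p):
    """O-information: TC minus DTC (positive = redundancy-dominated)."""
    return tc(p) - dtc(p)

def cond_o_info(p):
    """Conditional O-information of X given Y (last axis): E_y[O(X | Y=y)]."""
    py = marginal(p, (p.ndim - 1,))
    return sum(py[y] * o_info(p[..., y] / py[y])
               for y in range(p.shape[-1]) if py[y] > 0)

def rsi(p):
    """RSI(X;Y) = sum_i I(X_i;Y) - I(X;Y), with Y the last axis (assumed sign)."""
    n = p.ndim - 1
    i_joint = H(marginal(p, tuple(range(n)))) + H(marginal(p, (n,))) - H(p)
    i_parts = sum(H(marginal(p, (i,))) + H(marginal(p, (n,)))
                  - H(marginal(p, (i, n))) for i in range(n))
    return i_parts - i_joint

# Synergistic example: Y = X1 XOR X2, with X1, X2 uniform i.i.d. bits.
xor = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        xor[x1, x2, x1 ^ x2] = 0.25

# Redundant example: X1 = X2 = Y (one copied bit).
copy = np.zeros((2, 2, 2))
copy[0, 0, 0] = copy[1, 1, 1] = 0.5

# Check RSI(X;Y) = O(X,Y) - O(X|Y) on both systems.
for p in (xor, copy):
    assert abs(rsi(p) - (o_info(p) - cond_o_info(p))) < 1e-9
```

Under these conventions the XOR system gives RSI = -1 bit (synergy: conditioning on Y creates correlation between X1 and X2), while the copy system gives RSI = +1 bit (redundancy), and in both cases the difference of O-informations matches.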