@B-R-P
Created November 2, 2024 11:13
Synergistic-Unique-Redundant Decomposition (SURD)
Algorithm SURD
Input: Probability distribution array p (target + agent variables)
Output: Dictionaries I_R, I_S, MI, and info_leak
1. Ensure no zero values in p to prevent NaN errors during calculations
2. Normalize p so that it represents a probability distribution (p should sum to 1)
3. Define Ntot as the total number of variables (target + agents)
4. Define Nvars as the number of agent variables
5. Define Nt as the number of states for the target variable
6. Initialize indices list (inds) to represent agent variable indices
7. Calculate Information Leak:
a. Compute entropy H of the target variable
b. Compute conditional entropy Hc of the target variable given the agent variables
c. Set info_leak = Hc / H
8. Calculate Marginal Distribution of Target Variable:
a. Sum over agent variables to get the marginal distribution of the target
9. Initialize empty dictionaries for MI, I_R, I_S, and temporary storage of specific mutual information (Is)
10. For each possible combination of agent variables:
a. Compute joint distribution of target and current combination of agents
b. Compute marginal distribution of agents in the combination
c. Compute conditional distributions:
- p_a_s = conditional probability of agent combination given the target
- p_s_a = conditional probability of target given the agent combination
d. Calculate specific mutual information for the current combination and store in Is
11. Calculate Mutual Information (MI) for Each Combination:
- For each entry in Is, average it over the target states, weighted by the target's marginal probability, and store the result in MI
12. Initialize Redundancy (I_R) and Synergy (I_S) for each combination:
- For each possible combination, set initial values in I_R and I_S dictionaries to 0
13. Process Each State of the Target Variable:
- For each target state (t):
a. Extract specific mutual information values for t
b. Sort values and keep track of agent variable combinations
c. Discard higher-order combinations whose values do not exceed the maximum value among lower-order combinations, retaining only the maxima
d. Compute differences between sorted values
e. Distribute information to redundancy (I_R) and synergy (I_S):
- If the combination contains a single agent, add the difference to I_R
- If the combination contains multiple agents, add the difference to I_S
14. Return I_R, I_S, MI, and info_leak
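
For concreteness, here is one way steps 1 through 11 might look in NumPy. This is a rough sketch, not the authors' reference implementation (see the source linked below): the function name `surd_information`, the convention that axis 0 of `p` is the target variable, and the small constant added to avoid zeros are all choices made for illustration.

```python
import itertools
import numpy as np

def surd_information(p):
    """Steps 1-11: information leak, specific MI, and MI per agent combination."""
    # Steps 1-2: avoid exact zeros (they give NaNs in the logs) and normalize.
    p = np.asarray(p, dtype=float) + 1e-14
    p /= p.sum()

    # Steps 3-6: axis 0 is assumed to be the target; the remaining axes are agents.
    Ntot = p.ndim                    # target + agent variables
    Nt = p.shape[0]                  # number of target states
    inds = tuple(range(1, Ntot))     # axes of the agent variables

    # Steps 7-8: target marginal and information leak H(target | agents) / H(target).
    p_target = p.sum(axis=inds)
    p_agents = p.sum(axis=0)
    H = -np.sum(p_target * np.log(p_target))
    Hc = -np.sum(p * np.log(p)) + np.sum(p_agents * np.log(p_agents))
    info_leak = Hc / H

    # Steps 9-11: specific mutual information (Is) and mutual information (MI)
    # for every non-empty combination of agent variables.
    Is, MI = {}, {}
    for r in range(1, Ntot):
        for combo in itertools.combinations(inds, r):
            drop = tuple(ax for ax in inds if ax not in combo)
            p_ta = p.sum(axis=drop) if drop else p     # joint of target and combo
            p_a = p_ta.sum(axis=0, keepdims=True)      # marginal of the combo
            p_t = p_target.reshape((Nt,) + (1,) * (p_ta.ndim - 1))
            p_a_given_t = p_ta / p_t                   # p(agents | target)
            p_t_given_a = p_ta / p_a                   # p(target | agents)

            # Specific MI per target state t:
            #   i(t; combo) = sum_a p(a | t) * log[ p(t | a) / p(t) ]
            agent_axes = tuple(range(1, p_ta.ndim))
            Is[combo] = np.sum(p_a_given_t * np.log(p_t_given_a / p_t), axis=agent_axes)

            # Step 11: MI is the specific MI averaged over the target marginal.
            MI[combo] = float(np.sum(p_target * Is[combo]))

    return Is, MI, p_target, info_leak
```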
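
Steps 12 and 13 could then be sketched as follows, reusing the outputs of the function above. The filtering rule in step 13c and the bookkeeping of which agents an increment is counted as redundant over are interpretations of the pseudocode, so treat this as illustrative rather than a verified copy of the reference code.

```python
import numpy as np

def surd_distribution(Is, p_target):
    """Steps 12-13: split specific-MI increments into redundancy/unique and synergy."""
    combos = list(Is.keys())
    agents = sorted({i for combo in combos for i in combo})
    Nt = p_target.shape[0]

    # Step 12: initialize the output dictionaries.
    I_R = {c: 0.0 for c in combos}
    I_S = {c: 0.0 for c in combos if len(c) > 1}

    # Step 13: process each state of the target variable.
    for t in range(Nt):
        # a-b. Specific MI values for this state, sorted in increasing order.
        values = np.array([Is[c][t] for c in combos])
        order = np.argsort(values)
        lab = [combos[i] for i in order]
        vals = values[order]
        lens = np.array([len(c) for c in lab])

        # c. Retain only the maxima: drop larger combinations whose value does
        #    not exceed the maximum reached by smaller combinations.
        for l in range(1, len(agents)):
            if not np.any(lens == l):
                continue
            max_l = vals[lens == l].max()
            keep = ~((lens > l) & (vals <= max_l))
            vals, lens = vals[keep], lens[keep]
            lab = [c for c, k in zip(lab, keep) if k]

        # d. Increments between consecutive sorted values.
        diffs = np.diff(vals, prepend=0.0)

        # e. Distribute the target-weighted increments.
        red_vars = list(agents)   # agents still sharing the redundant information
        for combo, d in zip(lab, diffs):
            info = d * p_target[t]
            if len(combo) == 1:
                # Redundant over the remaining agents (unique once one agent is left).
                I_R[tuple(red_vars)] += info
                red_vars.remove(combo[0])
            else:
                I_S[combo] += info

    return I_R, I_S
```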
B-R-P commented Nov 2, 2024

Original Source: GitHub
Martínez-Sánchez, Álvaro, Arranz, Gonzalo, & Lozano-Durán, Adrián. (2024). Decomposing causality into its synergistic, unique, and redundant components. Nature Communications, 15(1), 9296. https://doi.org/10.1038/s41467-024-53373-4

B-R-P commented Nov 2, 2024

Purpose

Decompose mutual information between a future event (target variable) and past observations (agent variables) into redundant, unique, and synergistic contributions.

Inputs and Outputs

  • Input: Probability distribution p of the target and agent variables (histogram or joint distribution array).
  • Output:
    • I_R: Dictionary of redundancy and unique information values for each combination of agent variables.
    • I_S: Dictionary of synergy values for each combination of agent variables.
    • MI: Dictionary of mutual information for each variable combination.
    • info_leak: Estimate of the information leak, i.e., the fraction of the target's entropy not explained by the agent variables (H(target | agents) / H(target)).
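
As a usage illustration, a joint histogram can be built from lagged time series and fed to the decomposition. The snippet below relies on the `surd_information` and `surd_distribution` sketches from the earlier comment (they are not an official API) and uses a lagged XOR system, for which the recovered information should be almost entirely synergistic.

```python
import numpy as np

# Toy system: the target is the XOR of two lagged binary agents,
# so the causal information should be almost purely synergistic.
rng = np.random.default_rng(0)
n = 200_000
a1 = rng.integers(0, 2, size=n)
a2 = rng.integers(0, 2, size=n)
target = np.empty(n, dtype=int)
target[0] = 0
target[1:] = a1[:-1] ^ a2[:-1]          # future target from past agents

# Joint histogram p(target(t+1), a1(t), a2(t)); axis 0 is the target.
sample = np.stack([target[1:], a1[:-1], a2[:-1]], axis=1)
p, _ = np.histogramdd(sample, bins=(2, 2, 2))

Is, MI, p_target, info_leak = surd_information(p)   # sketches from the earlier comment
I_R, I_S = surd_distribution(Is, p_target)

print("info leak:", info_leak)   # ~0: the agents fully determine the target
print("synergy  :", I_S)         # expect the (1, 2) term to dominate
print("redundant:", I_R)         # expect near-zero values
```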

B-R-P commented Nov 2, 2024

Disclaimer: This is an unofficial reconstruction of the algorithm.
