@LucaHermes
Last active February 3, 2023 15:11
Code to compute label informativeness given an edge index and node labels, as proposed in Platonov et al. (2022), "Characterizing Graph Datasets for Node Classification: Beyond Homophily-Heterophily Dichotomy".
from scipy.stats import entropy
import numpy as np


def label_informativeness(edge_index, labels):
    '''
    Computes the label informativeness metric proposed in
    Platonov et al. (2022),
    "Characterizing Graph Datasets for Node Classification:
    Beyond Homophily-Heterophily Dichotomy",
    https://arxiv.org/abs/2209.06177

    Parameters
    ----------
    edge_index : array, int
        A numpy array of shape [n_edges, 2]
    labels : array, int
        A numpy array of shape [n_nodes, 1]
    '''
    n_classes = len(np.unique(labels))
    n_edges = len(edge_index)
    pairwise_label_probs = np.zeros([n_classes] * 2)
    label_stats = np.zeros([n_classes])
    # labels of the two endpoints of every edge, shape [n_edges, 2]
    y_edges = labels[edge_index][..., 0]
    # compute the joint edge-label distribution p(c1, c2) and the
    # degree-weighted class distribution p-bar(c) from the paper
    for c1 in range(n_classes):
        label_stats[c1] = (y_edges == c1).sum() / (2 * n_edges)
        for c2 in range(n_classes):
            y_edge_freq = np.logical_and(
                y_edges[:, 0] == c1,
                y_edges[:, 1] == c2
            ).sum()
            pairwise_label_probs[c1, c2] = y_edge_freq / (2 * n_edges)
    # LI = 2 - H(joint) / H(marginal); scipy's entropy normalizes its
    # input to sum to 1, and the logarithm base cancels in the ratio
    return 2 - (
        entropy(pairwise_label_probs.reshape(-1)) / entropy(label_stats)
    )
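
Example usage, as a minimal sketch: the 4-node toy graph and its labels below are made up purely for illustration, and edge_index is assumed to list each undirected edge in both directions (the common PyTorch-Geometric-style convention).

if __name__ == '__main__':
    # cycle graph 0-1-2-3-0 with alternating classes, stored bidirectionally
    edge_index = np.array([
        [0, 1], [1, 0],
        [1, 2], [2, 1],
        [2, 3], [3, 2],
        [3, 0], [0, 3],
    ])
    labels = np.array([[0], [1], [0], [1]])
    # every neighbor of a class-0 node has class 1 and vice versa, so
    # knowing one endpoint's label determines the other's: LI is about 1.0
    print(label_informativeness(edge_index, labels))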