@davidlenz
Last active December 5, 2020 07:05
Implementation of the Jensen-Shannon divergence, based on https://github.com/scipy/scipy/issues/8244
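For reference, the Jensen-Shannon divergence is the average KL divergence of each distribution against their mixture:

    JSD(P, Q) = (1/2) * KL(P || M) + (1/2) * KL(Q || M),  with  M = (P + Q) / 2

which is exactly what the function below computes, using scipy.stats.entropy(p, m) for KL(p || m).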
import numpy as np
from scipy.stats import entropy

def js(p, q):
    """Jensen-Shannon divergence of two (possibly unnormalized) distributions."""
    # cast to float so the in-place normalization below also works on integer input
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    # normalize to probability distributions
    p /= p.sum()
    q /= q.sum()
    # mixture distribution
    m = (p + q) / 2
    # average KL divergence of each distribution against the mixture
    return (entropy(p, m) + entropy(q, m)) / 2
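A quick usage sketch (the input values are made up for illustration). Note that scipy's entropy uses the natural logarithm by default, so the result is in nats and bounded above by ln 2 ≈ 0.693:

p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]
print(js(p, q))  # symmetric: js(p, q) == js(q, p)
print(js(p, p))  # identical distributions give 0.0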
@davidlenz (Author)

Thank you!
