@necronet
Created December 5, 2022 01:23
Understanding KL divergence as explained in Statistical Rethinking by Prof. McElreath
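For reference, the quantity the code below computes is the standard Kullback-Leibler divergence from a candidate distribution q to the target p:

```latex
D_{KL}(p \,\|\, q) = \sum_i p_i \log\frac{p_i}{q_i}
```

It is non-negative and equals zero exactly when q = p, which is what the plot illustrates as q1 sweeps across [0, 1].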
# Target distribution p and N candidate distributions q = (q1, 1 - q1)
p <- c(0.3, 0.7)
N <- 1000
# Keep q away from 0 and 1, where log(p/q) diverges
qs <- cbind(seq(0.001, 0.999, length.out = N),
            seq(0.999, 0.001, length.out = N))
# D_KL(p || q) = sum_i p_i * log(p_i / q_i)
kldivergence <- function(q1, q2) sum(p * log(p / c(q1, q2)))
kldivergence_results <- mapply(kldivergence, qs[, 1], qs[, 2])
plot(qs[, 1], kldivergence_results, pch = 20, col = "blue",
     xlab = "q1", ylab = "KL divergence")
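As a quick sanity check (a sketch, using the same p as above), the divergence should be exactly zero when q matches p, and strictly positive otherwise, so the plotted curve bottoms out at q1 = 0.3:

```r
# KL divergence of a candidate q from the fixed target p
p <- c(0.3, 0.7)
kld <- function(q) sum(p * log(p / q))

kld(c(0.3, 0.7))  # zero: q equals p
kld(c(0.5, 0.5))  # positive: q differs from p
```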