@jtilly · Created November 28, 2016 21:31
Use a log/exp transformation to make the log-likelihood computation of a simple mixture model numerically robust.
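In symbols, the rearrangement the script exploits is the standard log-sum-exp identity (the names $s_p$, $s_q$, and $c$ below are mine, matching psum, qsum, and const in the code):

$$\log\left(\alpha \prod_{i} p_i + (1-\alpha)\prod_{i} q_i\right) = \log\left(e^{s_p} + e^{s_q}\right) = \log\left(e^{s_p + c} + e^{s_q + c}\right) - c,$$

where $s_p = \log\alpha + \sum_i \log p_i$ and $s_q = \log(1-\alpha) + \sum_i \log q_i$. The identity holds for any constant $c$; choosing $c = 1 - \max(s_p, s_q)$ caps the larger exponent at 1, so the dominant exponential can neither overflow nor vanish to zero (a far smaller second term may still underflow, but then its contribution is negligible anyway).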
# making mixture models numerically robust via the log-sum-exp trick
rm(list = ls())
set.seed(4)

# two-component mixture with weight alpha; p and q hold the per-observation
# likelihood contributions of each component
nobs = 1000
alpha = 0.1
p = runif(nobs, min = 0.0, max = 1.0)
q = runif(nobs, min = 0.0, max = 1.0)
# naive computation of the log-likelihood contribution: with nobs = 1000,
# prod(p) and prod(q) underflow to 0 in double precision, so this yields -Inf
log(alpha * prod(p) + (1 - alpha) * prod(q))
# instead, work on the log scale: sum the logs of each component's factors
psum = sum(log(c(alpha, p)))
qsum = sum(log(c(1 - alpha, q)))
# exp/log transformation: shift both log terms by a constant chosen so the
# larger exponent equals 1, exponentiate safely, then undo the shift
const = 1 - max(c(psum, qsum))
# put it back together: equals log(exp(psum) + exp(qsum)), but without underflow
log(exp(psum + const) + exp(qsum + const)) - const
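
The same trick extends beyond two components. A minimal sketch in base R (the helper name logsumexp is my own, not part of the gist): given a vector of log-scale terms, shift by the maximum, exponentiate, sum, and shift back.

# hypothetical helper: log(sum(exp(x))) computed stably for any number of terms
logsumexp = function(x) {
  const = max(x)
  const + log(sum(exp(x - const)))
}

# reproduces the two-component result above (the +1 offset in const cancels out)
logsumexp(c(psum, qsum))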