@jacobcvt12
Created April 4, 2017 21:14
Automatic differentiation to calculate the score function with Julia
# Automatic Differentiation example with forward accumulation
# Here we calculate the gradient of the log-likelihood of a normal
# distribution with respect to the mean and variance (i.e. the score function)
# load packages for AD (ForwardDiff) and probability distributions
using ForwardDiff, Distributions
# parameters
μ = 10.0; # true mean
σ² = 1; # true variance
n = 100; # number of observations to be sampled
# put parameters in an array
x = [μ, σ²];
# sample observations
y = rand(Normal(μ, √σ²), n);
# create function to take the gradient of
h(x) = loglikelihood(Normal(x[1], √x[2]), y)
# calculate gradient and evaluate at the true parameter values
ForwardDiff.gradient(h, x)
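# ForwardDiff computes this via forward accumulation on dual numbers:
# each Dual carries a value together with a derivative "seed" that is
# propagated through every arithmetic operation. A minimal sketch
# (Dual, value, and partials are from ForwardDiff; f is a toy function
# introduced here for illustration):
d = ForwardDiff.Dual(3.0, 1.0)  # value 3.0, derivative seed 1.0
f(t) = t^2 + 2t
ForwardDiff.value(f(d))         # f(3) = 15.0
ForwardDiff.partials(f(d))[1]   # f'(3) = 2*3 + 2 = 8.0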
# analytically derived score functions for comparison, simplified by
# plugging in the true variance σ² = 1; in general
# ∂ℓ/∂μ = Σ(yᵢ-μ)/σ² and ∂ℓ/∂σ² = -n/(2σ²) + Σ(yᵢ-μ)²/(2σ⁴)
# partial log L / partial μ
sum(y .- μ)
# partial log L / partial σ²
-n / 2 + 0.5 * sum((y .- μ).^2)
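# sanity check (a sketch): the AD gradient should agree with the analytic
# score at the true parameters up to floating-point rounding
g = ForwardDiff.gradient(h, [μ, σ²])
@assert isapprox(g[1], sum(y .- μ); atol=1e-8)
@assert isapprox(g[2], -n / 2 + 0.5 * sum((y .- μ).^2); atol=1e-8)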