Automatic differentiation to calculate the score function with Julia
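For reference, the score function is the gradient of the log-likelihood with respect to the parameters. For a sample y_1, ..., y_n from Normal(μ, σ²), the standard derivation is:

\log L(\mu, \sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i - \mu)^2

\frac{\partial \log L}{\partial \mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(y_i - \mu),
\qquad
\frac{\partial \log L}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(y_i - \mu)^2

With σ² = 1 these reduce to the analytic expressions at the end of the script, which the ForwardDiff gradient should match.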
# Automatic differentiation example with forward accumulation.
# Here we calculate the gradient of the log-likelihood of a normal
# distribution with respect to the mean and variance (i.e. the score function).

# load packages for AD and distributions
using ForwardDiff, Distributions

# parameters
μ = 10.0; # true mean
σ² = 1.0; # true variance
n = 100;  # number of observations to be sampled

# put parameters in an array
x = [μ, σ²];

# sample observations
y = rand(Normal(μ, √σ²), n);

# create the function to take the gradient of
h(x) = loglikelihood(Normal(x[1], √x[2]), y)

# calculate the gradient and evaluate it at the truth
ForwardDiff.gradient(h, x)

# analytically derived score functions for comparison
# (note that σ² is assumed to be 1)

# ∂ log L / ∂μ
sum(y .- μ)

# ∂ log L / ∂σ²
-n / 2 + 0.5 * sum((y .- μ).^2)
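As a quick numeric sanity check, the two analytic expressions can be packed into a vector and compared against the ForwardDiff result. This is a minimal sketch assuming the script above has already run; `analytic_score` is a helper name introduced here for illustration, written for a general σ² rather than σ² = 1.

# sanity check: AD gradient vs. hypothetical analytic_score helper
analytic_score(μ, σ², y) = [sum(y .- μ) / σ²,
                            -length(y) / (2σ²) + sum((y .- μ).^2) / (2σ²^2)]

# the two should agree up to floating-point error
ForwardDiff.gradient(h, x) ≈ analytic_score(μ, σ², y) # true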