
@fredcallaway
Created April 15, 2021 17:41
using Distributions
using Plots
gr(label="", dpi=200, size=(400, 300), lw=2)  # GR backend with shared plot defaults
# Beta prior on the heads probability θ
α, β = 1, 5
prior = Beta(α, β)
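# Optional extra (not in the original gist): visualize the Beta(1, 5) prior density.
# The grid θ_grid = 0:0.01:1 is an arbitrary choice for plotting.
θ_grid = 0:0.01:1
plot(θ_grid, pdf.(prior, θ_grid), xlab="θ", ylab="Prior density")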
# Posterior after each of n observations, assuming θ = 1 so every flip comes up heads
max_n = 25
n = 0:max_n
posterior = Beta.(α .+ n, β)
# Posterior mean: approaches θ = 1 as the heads accumulate
plot(n, mean.(posterior), xaxis=("# Samples", 0:5:max_n), ylab="Posterior Mean")
hline!([1], c=:red, ls=:dash, lw=1)

# Posterior entropy shrinks with each additional observation
plot(n, entropy.(posterior), xaxis=("# Samples", 0:5:max_n), ylab="Posterior Entropy")
# Information gain from sample k = drop in posterior entropy from k-1 to k
plot(n[2:end], -diff(entropy.(posterior)), xaxis=("# Samples", 0:5:max_n), ylab="Information Gain")
hline!([0], c=:black, lw=1)
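# Optional sanity check (not in the original gist): the per-sample gains telescope
# to the total entropy reduction, since posterior[1] is just the prior.
@assert sum(-diff(entropy.(posterior))) ≈ entropy(prior) - entropy(posterior[end])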
# Now, sample θ from the prior and average the posterior entropy over many simulated datasets
n_sim = 100_000
avg_entropy = n_sim \ mapreduce(+, 1:n_sim) do i  # sum the entropy curves, then divide by n_sim
    θ = rand(prior)
    data = rand(Bernoulli(θ), max_n)              # simulated coin flips (Bool vector)
    posterior = Beta.(α .+ cumsum(data), β .+ cumsum(.!data))  # posterior after each flip
    entropy.(posterior)
end
avg_entropy = [entropy(prior); avg_entropy]       # prepend the entropy at n = 0
# Averaging over θ and data gives the expected posterior entropy and expected information gain
plot(n, avg_entropy, xaxis=("# Samples", 0:5:max_n), ylab="Posterior Entropy")
plot(n[2:end], -diff(avg_entropy), xaxis=("# Samples", 0:5:max_n), ylab="Expected Information Gain")
hline!([0], c=:black, lw=1)
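# Optional (not in the original gist): when running as a script rather than in a
# notebook/REPL, the current figure can be written to disk; the filename is arbitrary.
savefig("expected_information_gain.png")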