@derekpowell
Last active March 7, 2022 11:56
pymc3 horseshoe prior implementation
import numpy as np
import pymc3 as pm


def horseshoe_prior(name, X, y, m, v, s):
    '''
    Regularizing horseshoe prior as introduced by Piironen & Vehtari:
    https://arxiv.org/pdf/1707.01694.pdf

    name: variable name
    X: design matrix (2-d array)
    y: outcome vector (used to set the pseudo-variance)
    m: expected number of relevant features (must be < total number of predictors M)
    v: regularizing student-t df
    s: regularizing student-t sd
    '''
    half_v = v / 2
    n = X.shape[0]
    M = X.shape[1]
    u = np.mean(y)

    # Estimate binomial pseudo-variance via the Gaussian approximation,
    # sigma^2 ~= 1 / (u * (1 - u)); see https://arxiv.org/pdf/1707.01694.pdf p.15
    sigma = np.sqrt(1 / (u * (1 - u)))

    # Global shrinkage scale tau
    tau0 = (m / (M - m)) * (sigma / np.sqrt(n))
    tau_t = pm.HalfCauchy(f"tauT_{name}", beta=1)
    tau = tau0 * tau_t

    # Slab scale: c^2 = s^2 * c2_t with c2_t ~ Inv-Gamma(v/2, v/2)
    c2_t = pm.InverseGamma(f"c2_{name}", half_v, half_v)
    c2 = np.power(s, 2) * c2_t

    # Local shrinkage scales, regularized by the slab
    lambda_m = pm.HalfCauchy(f"lambdaM_{name}", beta=1, shape=M)
    lambda_t = (pm.math.sqrt(c2) * lambda_m) / pm.math.sqrt(
        c2 + pm.math.sqr(tau) * pm.math.sqr(lambda_m)
    )

    # Non-centered parameterization of the coefficients
    beta_t = pm.Normal(f"betaT_{name}", mu=0, sd=1, shape=M)
    beta = tau * lambda_t * beta_t
    return beta
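For context, here is a minimal usage sketch (not part of the original gist) showing how the returned coefficient vector might be wired into a PyMC3 logistic-regression model. The data arrays X and y and the hyperparameter values m, v, and s below are hypothetical placeholders:

import numpy as np
import pymc3 as pm

# Hypothetical data: 200 observations, 20 predictors, binary outcome
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = rng.binomial(1, 0.5, size=200)

with pm.Model() as model:
    # Assume roughly 3 of the 20 predictors are relevant; v and s set the slab
    beta = horseshoe_prior("coefs", X, y, m=3, v=4, s=2)
    intercept = pm.Normal("intercept", mu=0, sd=5)
    p = pm.math.sigmoid(intercept + pm.math.dot(X, beta))
    pm.Bernoulli("y_obs", p=p, observed=y)
    trace = pm.sample(1000, tune=1000, target_accept=0.95)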
@Nathan-Furnal

Hi! I was reading the paper about the regularized horseshoe prior and came upon this gist. I was wondering whether you had any guidance on how the v and s parameters of the priors should be set? The paper does not seem to suggest specific values, if I read it correctly.

Thanks!
