@ahwillia
Last active September 26, 2018 22:35
Poisson Regression via scipy.optimize
"""
A simple implementation of Poisson regression.
"""
import numpy as np
from scipy.optimize import minimize
n = 1000 # number of datapoints
p = 5 # number of features
# create data
X = .3*np.random.randn(n, p)
true_b = np.random.randn(p)
y = np.random.poisson(np.exp(np.dot(X, true_b)))
# loss function and gradient
def f(b):
Xb = np.dot(X, b)
exp_Xb = np.exp(Xb)
loss = exp_Xb.sum() - np.dot(y, Xb)
grad = np.dot(X.T, exp_Xb - y)
return loss, grad
# hessian
def hess(b):
return np.dot(X.T, np.exp(np.dot(X, b))[:, None]*X)
# optimize
result = minimize(f, np.zeros(p), jac=True, hess=hess, method='newton-cg')
print('True regression coeffs: {}'.format(true_b))
print('Estimated regression coeffs: {}'.format(result.x))
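
The objective minimized above is the Poisson negative log-likelihood under a log link, dropping the constant sum of log(y_i!) terms; its gradient and Hessian are exactly what f and hess compute:

\begin{aligned}
\mathcal{L}(b) &= \sum_{i=1}^{n} \left[ e^{x_i^\top b} - y_i\, x_i^\top b \right] + \text{const}, \\
\nabla \mathcal{L}(b) &= X^\top \left( e^{Xb} - y \right), \\
\nabla^2 \mathcal{L}(b) &= X^\top \operatorname{diag}\!\left( e^{Xb} \right) X .
\end{aligned}

Because the Hessian is positive semidefinite, the problem is convex and Newton-CG converges reliably from the zero initialization.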
@ahwillia (Author)

Output:

True regression coeffs: [-1.11349207  0.43088945  1.2669777   0.53207522 -0.79719178]
Estimated regression coeffs: [-1.14274498  0.32025399  1.12860779  0.52368911 -0.93427154]
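
As a sanity check, the same coefficients can be recovered with an off-the-shelf GLM fit. A minimal sketch, assuming statsmodels is installed (the gist itself does not use it) and reusing X, y, and result from the script above; no intercept is added, matching the model:

# Cross-check against statsmodels' Poisson GLM (log link is the default).
import statsmodels.api as sm

glm_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print('statsmodels coeffs:', glm_fit.params)  # should closely match result.x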
