Regression with asymmetric loss function
import numpy as np
import pylab as pl
from lmfit import minimize, Parameters
# Creating some noisy data: a shifted parabola plus uniform noise in [0, 5)
x = np.linspace(0, 10, 50)
y = (x-3)**2 + np.random.rand( 50 )*5
# Defining residuals using an asymmetric loss function
def residual(params, x, data, eps_data):
    alpha = params['alpha']
    beta = params['beta']
    delta = params['delta']

    model = alpha*x + beta*x**2 + delta
    dy = model - data

    return dy**2*(np.sign(dy) + 0.8)**2/eps_data
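# Note on the weighting: dy = model - data, so overestimates (dy > 0) are
# scaled by (1 + 0.8)**2 = 3.24 while underestimates (dy < 0) get
# (-1 + 0.8)**2 = 0.04, i.e. overshooting the data is penalized roughly 80x
# more and the fitted curve tends to sit below the noisy points. Also, since
# lmfit's default leastsq method squares the returned array again, this
# version effectively minimizes a quartic loss in dy; if a plain asymmetric
# squared loss is wanted, one possible variant (not the original code) is
# to return dy*(np.sign(dy) + 0.8)/np.sqrt(eps_data) instead.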
# Setting up the optimization environment
params = Parameters()
params.add('alpha', value=0.1)
params.add('beta', value=0.05)
params.add('delta', value=1000.)
eps_data = 1.0
# Running the optimization and grabbing the resulting
# parameters back
out = minimize(residual, params, args=(x, y, eps_data))
resultparams = out.params.valuesdict()
alpha = resultparams['alpha']
beta = resultparams['beta']
delta = resultparams['delta']
# Plotting the results
pl.figure()
pl.plot( x, y, 'ko' )
pl.plot( x, alpha*x + beta*x**2 + delta, 'r-' )
pl.show()
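# For comparison (a minimal sketch, not part of the original gist): fit the
# same data with an ordinary symmetric least-squares quadratic via
# numpy.polyfit and overlay both curves. The asymmetric fit should track the
# lower edge of the noisy points, while the symmetric fit runs through their
# middle.
coeffs = np.polyfit(x, y, 2)          # symmetric least-squares quadratic
y_symmetric = np.polyval(coeffs, x)

pl.figure()
pl.plot(x, y, 'ko', label='data')
pl.plot(x, alpha*x + beta*x**2 + delta, 'r-', label='asymmetric loss fit')
pl.plot(x, y_symmetric, 'b--', label='symmetric least squares')
pl.legend()
pl.show()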