Classification week 5 boosting
Combine multiple simple classifiers -> an ensemble classifier.
y_hat = sign(f(x)), where f(x) is a weighted combination of the simple classifiers' predictions.
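As a tiny illustration (with made-up coefficients and votes, not values from the course), the ensemble score is the weighted vote of the simple classifiers and the prediction is its sign:

```python
import numpy as np

# Hypothetical example: three simple classifiers voting on one point x.
w = np.array([0.8, 0.4, 0.3])    # coefficients w_t (made-up values)
votes = np.array([+1, -1, +1])   # each f_t(x) in {+1, -1}
f_x = np.dot(w, votes)           # f(x) = sum_t w_t * f_t(x) = 0.7
y_hat = np.sign(f_x)             # y_hat = sign(f(x)) = +1
print(y_hat)
```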
AdaBoost (see the code sketch after these steps):
Start with the same weight for all points: alpha_i = 1/N.
For t = 1..T:
  learn f_t(x) with data weights alpha
  compute coefficient w_t = 1/2 * ln((1 - weighted_error) / weighted_error)
  recompute weights alpha_i:
    alpha_i <- alpha_i * exp(-w_t) if f_t(x_i) = y_i (correct prediction)
    alpha_i <- alpha_i * exp(w_t) if f_t(x_i) != y_i (incorrect prediction)
The final model predicts by
y_hat = sign(sum over t of w_t * f_t(x))
Normalize the weights (each iteration, after recomputing them):
alpha_i <- alpha_i / (sum over j of alpha_j)
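A minimal from-scratch sketch of the loop above, assuming decision stumps as the simple classifiers; the helper names (train_stump, stump_predict, adaboost_train, adaboost_predict) are hypothetical, not from the course material.

```python
import numpy as np

def train_stump(X, y, alpha):
    """Find the single-feature threshold split minimizing weighted error."""
    best = None
    n, d = X.shape
    for j in range(d):
        for thresh in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thresh, sign, -sign)
                err = np.sum(alpha * (pred != y)) / np.sum(alpha)
                if best is None or err < best[0]:
                    best = (err, j, thresh, sign)
    return best  # (weighted_error, feature, threshold, sign)

def stump_predict(stump, X):
    _, j, thresh, sign = stump
    return np.where(X[:, j] <= thresh, sign, -sign)

def adaboost_train(X, y, T):
    """AdaBoost training loop; y must be labeled +1 / -1."""
    n = X.shape[0]
    alpha = np.full(n, 1.0 / n)           # start: same weight for all points
    stumps, coeffs = [], []
    for t in range(T):
        stump = train_stump(X, y, alpha)  # learn f_t with data weights alpha
        weighted_error = max(stump[0], 1e-10)
        w_t = 0.5 * np.log((1 - weighted_error) / weighted_error)
        pred = stump_predict(stump, X)
        # recompute weights: shrink if correct (y*pred = +1), grow if incorrect
        alpha = alpha * np.exp(-w_t * y * pred)
        alpha = alpha / alpha.sum()       # normalize each iteration
        stumps.append(stump)
        coeffs.append(w_t)
    return stumps, coeffs

def adaboost_predict(stumps, coeffs, X):
    # y_hat = sign( sum_t w_t * f_t(x) )
    scores = sum(w * stump_predict(s, X) for s, w in zip(stumps, coeffs))
    return np.sign(scores)
```

With labels encoded as +1/-1, adaboost_train(X, y, T=30) returns the stumps and their coefficients, and adaboost_predict reproduces y_hat = sign(sum over t of w_t * f_t(x)).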
Apply AdaBoost (example below)
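One common way to apply AdaBoost in practice (a sketch on synthetic data, not the course's dataset or toolkit) is scikit-learn's AdaBoostClassifier, which boosts decision stumps by default:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = AdaBoostClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```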
Exploring ensemble methods