@aswalin
import xgboost as xgb
from sklearn import metrics
from sklearn.model_selection import GridSearchCV

# AUC on the train and test sets (y_train and y_test are assumed to be
# defined in the surrounding scope, alongside the train/test feature matrices)
def auc(m, train, test):
    return (metrics.roc_auc_score(y_train, m.predict_proba(train)[:, 1]),
            metrics.roc_auc_score(y_test, m.predict_proba(test)[:, 1]))

# Parameter tuning
model = xgb.XGBClassifier()
param_dist = {"max_depth": [10, 30, 50],
              "min_child_weight": [1, 3, 6],
              "n_estimators": [200],
              "learning_rate": [0.05, 0.1, 0.16]}
grid_search = GridSearchCV(model, param_grid=param_dist, cv=3,
                           verbose=10, n_jobs=-1)
grid_search.fit(train, y_train)
grid_search.best_estimator_

# Re-train with the best parameters found by the grid search
model = xgb.XGBClassifier(max_depth=50, min_child_weight=1, n_estimators=200,
                          n_jobs=-1, verbosity=1, learning_rate=0.16)
model.fit(train, y_train)
auc(model, train, test)
@Jason2Brownlee

Great example of a grid search with a custom metric.

Generally, you can use the grid_search.best_estimator_ property to access the fitted model directly. There is no need to re-train a model.
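
For instance, a minimal sketch of that approach (assuming the same train, test, y_train, and y_test variables as in the gist; best_model is just an illustrative name):

# Reuse the estimator refit by GridSearchCV (refit=True by default)
# instead of hand-copying the winning parameters into a new model.
best_model = grid_search.best_estimator_      # already fitted on the full training data
print(grid_search.best_params_)               # the winning parameter combination
print(auc(best_model, train, test))           # same custom AUC helper as above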
