@neelriyer
Created August 10, 2020 08:23
Quickly compare a neural network against xgb, lgb and random forest
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
import xgboost as xgb
import lightgbm as lgb

def rmspe_calc(y_true, y_pred):
    """Compute the Root Mean Square Percentage Error between two arrays."""
    return np.sqrt(np.mean(np.square((y_true - y_pred) / y_true), axis=0))
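# Quick sanity check (illustrative, not part of the original gist):
# rmspe_calc(np.array([100.0, 200.0]), np.array([110.0, 180.0]))
# -> each prediction is off by 10%, so the result is
#    sqrt(mean([0.1**2, 0.1**2])) = 0.1, i.e. a 10% error.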
models = [
    xgb.XGBRegressor(),
    lgb.LGBMRegressor(),
    RandomForestRegressor(),
]

results = pd.DataFrame(columns=["Regressor", "RMSPE"])

# Fit each model and record its validation RMSPE as a percentage.
# Assumes X_train, y_train, X_valid, y_valid are already defined.
for model in models:
    name = model.__class__.__name__
    model.fit(X_train, y_train)
    rmspe = rmspe_calc(y_valid, model.predict(X_valid))
    df2 = pd.DataFrame(
        {"Regressor": name, "RMSPE": rmspe * 100}, index=[0]
    )
    # DataFrame.append was removed in pandas 2.0; use pd.concat instead.
    results = pd.concat([results, df2], ignore_index=True)
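
The title promises a comparison against a neural network, but the snippet stops at the tree ensembles. A minimal sketch of that last step, assuming nn_preds (a hypothetical name, not in the original gist) already holds the network's predictions on X_valid:

# Hypothetical: nn_preds = the neural network's predictions for X_valid.
nn_rmspe = rmspe_calc(y_valid, nn_preds)
nn_row = pd.DataFrame(
    {"Regressor": "NeuralNetwork", "RMSPE": nn_rmspe * 100}, index=[0]
)
results = pd.concat([results, nn_row], ignore_index=True)
print(results.sort_values("RMSPE"))  # lowest error first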