Toshihiko Yanase (toshihikoyanase)

toshihikoyanase / lightgbm_tuner_cv_extracted.py
Last active June 1, 2020 05:43
A code snippet for LightGBMTunerCV.
import json

import mlflow
import numpy as np
import sklearn.datasets
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

import optuna
import optuna.integration.lightgbm as lgb

# Fixed base parameters; LightGBMTunerCV searches the remaining key
# hyperparameters stepwise on top of these.
params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "verbosity": -1,
    "boosting_type": "gbdt",
}
tuner = lgb.LightGBMTunerCV(
    params, dtrain, verbose_eval=100, early_stopping_rounds=100
)
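The preview stops at the tuner's construction. A minimal sketch of the usual next steps, assuming dtrain is a lightgbm.Dataset built elsewhere in the gist:

# Run the stepwise search and read off the result.
tuner.run()
print("Best score:", tuner.best_score)
print("Best params:", tuner.best_params)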
toshihikoyanase / lightgbm_tuner_dump_result.py
Created April 8, 2020 08:31
An example of parallel execution of LightGBMTuner; the snippet shown exports the shared study's trials to CSV.
import optuna

# Attach to the study shared by the parallel tuner runs and export
# all of its trials to CSV.
study = optuna.create_study(
    storage="sqlite:///lgbtuner.db", study_name="parallel", load_if_exists=True
)
study.trials_dataframe().to_csv("parallel-result.csv")
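The tuner side of the parallel run is not shown in the preview. A hypothetical worker sketch, assuming each process passes the same SQLite-backed study to the tuner (params, dtrain, and dval stand for objects defined elsewhere):

import optuna
import optuna.integration.lightgbm as lgb

# Every worker attaches to the same storage and study name, so trials
# are shared across processes.
study = optuna.create_study(
    storage="sqlite:///lgbtuner.db", study_name="parallel", load_if_exists=True
)
tuner = lgb.LightGBMTuner(params, dtrain, valid_sets=[dval], study=study)
tuner.run()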
toshihikoyanase / lightgbm_tuner_parallel.py
Created March 18, 2020 10:12
Examples for Optuna #1039: support pruning/resume/parallelization for LightGBMTuner.
"""
Optuna example that optimizes a classifier configuration for the cancer dataset using LightGBM tuner.
In this example, we optimize the validation log loss of cancer detection.
You can execute this code directly.
$ python lightgbm_tuner_parallel.py [-p]
"""
import argparse
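The preview ends at the argparse import. One guess at how the [-p] flag could be parsed; its actual meaning is not visible here, so the help text below is purely illustrative:

parser = argparse.ArgumentParser()
parser.add_argument(
    "-p",
    action="store_true",
    help="Hypothetical switch for the parallel/pruning behavior the title mentions.",
)
args = parser.parse_args()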
toshihikoyanase / quadratic_joblib_simple.py
Created August 28, 2019 06:55
Optuna example that optimizes a quadratic function in parallel using joblib.
"""
Optuna example that optimizes a simple quadratic function in parallel using joblib.
In this example, we optimize a simple quadratic function. We also demonstrate how to continue an
optimization and to use timeouts.
This example can be executed as follows:
$ python quadratic_joblib_simple.py
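A minimal sketch of the pattern the docstring describes, assuming the threading backend so that both workers share one in-memory study:

import joblib
import optuna

def objective(trial):
    # A simple quadratic with its minimum at x = 2.
    x = trial.suggest_uniform('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
# Both workers append trials to the same study concurrently.
joblib.Parallel(n_jobs=2, backend="threading")(
    joblib.delayed(study.optimize)(objective, n_trials=50) for _ in range(2)
)
print(study.best_value)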
toshihikoyanase / nan_value_pruner_sample.py
Created August 7, 2019 05:34
An example that prunes trials when they report NaN as intermediate values.
import math
import sklearn.datasets
import sklearn.linear_model
import sklearn.model_selection
import optuna
class NaNValuePruner(optuna.pruners.BasePruner):
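The preview ends at the class declaration. A minimal sketch of how the prune method might look, written against Optuna's current BasePruner interface (prune(self, study, trial) returning a bool); the 2019 gist may use an older signature:

    def prune(self, study, trial):
        # Prune as soon as the most recently reported value is NaN.
        step = trial.last_step
        if step is None:
            return False
        return math.isnan(trial.intermediate_values[step])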
toshihikoyanase / README.md
Last active August 20, 2021 11:22
Tokyo Tech, 1st Deep Learning Distributed Training Hackathon: a collection of links to Optuna materials
toshihikoyanase / pruning.py
Created July 31, 2019 10:49
Pseudo-code of Pruning
import ...

def objective(trial):
    ...
    alpha = trial.suggest_loguniform('alpha', 1e-5, 1e-1)
    clf = sklearn.linear_model.SGDClassifier(alpha=alpha)
    for step in range(100):
        clf.partial_fit(train_x, train_y, classes=classes)
        # Report an intermediate value; stop early if it looks unpromising.
        trial.report(clf.score(valid_x, valid_y), step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return clf.score(valid_x, valid_y)
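Pruning only takes effect when the study is created with a pruner; a minimal usage sketch (MedianPruner is Optuna's default):

study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=100)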
toshihikoyanase / mlp-optuna.py
Last active November 15, 2023 16:53
Tuning an MLP using Optuna.
import optuna
import sklearn
import sklearn.datasets
import sklearn.neural_network

def objective(trial):
    # Decide the network architecture.
    n_layers = trial.suggest_int('n_layers', 1, 4)
    layers = []
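The preview cuts off right after the layer list is initialized. A minimal sketch of how such an objective is typically completed; the parameter names, dataset, and train/validation split below are illustrative assumptions, not necessarily what the gist does:

    for i in range(n_layers):
        layers.append(trial.suggest_int('n_units_l{}'.format(i), 4, 128))

    x, y = sklearn.datasets.load_digits(return_X_y=True)
    clf = sklearn.neural_network.MLPClassifier(hidden_layer_sizes=tuple(layers))
    clf.fit(x[:1000], y[:1000])
    # Optuna minimizes by default, so return the held-out error rate.
    return 1.0 - clf.score(x[1000:], y[1000:])

study = optuna.create_study()
study.optimize(objective, n_trials=100)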

Settings: (sigopt/evalset/auc-test-suites)

  • Problems: 38
  • Metrics: best
Solver                 Borda  Firsts
(a) optuna#tpe-faster      0      37
(b) optuna#tpe-latest      1      38

[Figure: plot of best value]