@zaleslaw
Created October 19, 2020 13:43
Manual hyperparameter tuning
// Imports needed by this snippet (it is a fragment of a larger tutorial class).
import org.apache.ignite.ml.math.primitives.vector.Vector;
import org.apache.ignite.ml.preprocessing.Preprocessor;
import org.apache.ignite.ml.preprocessing.normalization.NormalizationTrainer;
import org.apache.ignite.ml.tree.DecisionTreeClassificationTrainer;

// Tune hyper-parameters with K-fold Cross-Validation on the split training set.
// Candidate values for the normalization power p and the maximum tree depth.
int[] pSet = new int[] {1, 2};
int[] maxDeepSet = new int[] {1, 2, 3, 4, 5, 10, 20};
int bestP = 1;
int bestMaxDeep = 1;
double avg = Double.MIN_VALUE;

for (int p : pSet) {
    for (int maxDeep : maxDeepSet) {
        // Fit a normalization preprocessor with the current p on top of the MinMax-scaled
        // features (ignite, dataCache and minMaxScalerPreprocessor come from earlier steps).
        Preprocessor<Integer, Vector> normalizationPreprocessor = new NormalizationTrainer<Integer, Vector>()
            .withP(p)
            .fit(
                ignite,
                dataCache,
                minMaxScalerPreprocessor
            );

        // Decision tree trainer with the current maximum depth and zero minimum impurity decrease.
        DecisionTreeClassificationTrainer trainer
            = new DecisionTreeClassificationTrainer(maxDeep, 0);
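        // --- The gist is truncated here; what follows is a hedged sketch of the rest of the loop. ---
        // Assumption: each (p, maxDeep) combination is scored with K-fold cross-validation on the
        // training split and the best-scoring pair is remembered. The helper scoreWithKFoldCV(...)
        // is hypothetical; it stands in for whichever cross-validation scoring call the surrounding
        // tutorial uses and is assumed to return the mean accuracy over the folds.
        double currAvg = scoreWithKFoldCV(trainer, normalizationPreprocessor);

        if (currAvg > avg) {
            avg = currAvg;
            bestP = p;
            bestMaxDeep = maxDeep;
        }

        System.out.println("Avg score: " + currAvg + " with p: " + p + " and maxDeep: " + maxDeep);
    }
}

System.out.println("Best avg score: " + avg + " with p: " + bestP + " and maxDeep: " + bestMaxDeep);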