@obeshor
Created February 19, 2021 22:53
# Get the optimal hyperparameters from the completed search
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]

print(f"""
The hyperparameter search is complete.
The optimal number of filters in the second convolutional layer is {best_hps.get('num_filters')}.
The optimal number of units in the first densely connected layer is {best_hps.get('units')}.
The optimal dropout rate is {best_hps.get('dropout_1')}.
The optimal learning rate for the optimizer is {best_hps.get('learning_rate')}.
""")
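For readers without a tuner session at hand, here is a self-contained sketch of what the retrieved object looks like from the caller's side. The `BestHPs` class below is a hypothetical dict-backed stand-in for the `HyperParameters` object that `tuner.get_best_hyperparameters()` returns; the hyperparameter names (`num_filters`, `units`, `dropout_1`, `learning_rate`) come from the snippet above, and the values are invented for illustration.

```python
# Hypothetical stand-in for the HyperParameters object returned by
# tuner.get_best_hyperparameters(num_trials=1)[0]. It only mimics the
# .get(name) lookup used in the snippet; it is NOT the real Keras Tuner class.
class BestHPs:
    def __init__(self, values):
        self._values = values

    def get(self, name):
        # Look up a tuned hyperparameter by the name it was registered under.
        return self._values[name]

# Invented example values for illustration only.
best_hps = BestHPs({
    "num_filters": 64,      # filters in the second convolutional layer
    "units": 128,           # units in the first densely connected layer
    "dropout_1": 0.2,       # dropout rate
    "learning_rate": 1e-3,  # optimizer learning rate
})

summary = (
    f"Optimal filters (2nd conv layer): {best_hps.get('num_filters')}\n"
    f"Optimal units (1st dense layer): {best_hps.get('units')}\n"
    f"Optimal dropout rate: {best_hps.get('dropout_1')}\n"
    f"Optimal learning rate: {best_hps.get('learning_rate')}"
)
print(summary)
```

With the real Keras Tuner object, the usual next step is to rebuild and train the final model from these hyperparameters, e.g. `model = tuner.hypermodel.build(best_hps)` followed by `model.fit(...)`.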