
@wookim3
Last active September 2, 2020 10:01
from clearml.automation import (
    HyperParameterOptimizer,
    UniformIntegerParameterRange,
    UniformParameterRange,
)
from clearml.automation.optuna import OptimizerOptuna

optimizer = HyperParameterOptimizer(
    # the template experiment to clone and mutate
    base_task_id=TEMPLATE_TASK_ID,
    # the hyperparameters to optimize
    hyper_parameters=[
        UniformIntegerParameterRange('number_of_epochs', min_value=2, max_value=12, step_size=2),
        UniformIntegerParameterRange('batch_size', min_value=2, max_value=16, step_size=2),
        UniformParameterRange('dropout', min_value=0, max_value=0.5, step_size=0.05),
        UniformParameterRange('base_lr', min_value=0.00025, max_value=0.01, step_size=0.00025),
    ],
    # the objective metric to maximize/minimize
    objective_metric_title='accuracy',
    objective_metric_series='total',
    objective_metric_sign='max',
    # the search strategy
    optimizer_class=OptimizerOptuna,
    # execution and budget settings
    execution_queue='dan_queue',
    max_number_of_concurrent_tasks=2,
    optimization_time_limit=60.,  # minutes for the whole optimization
    compute_time_limit=120,       # total compute minutes across jobs
    total_max_jobs=20,
    min_iteration_per_job=15000,
    max_iteration_per_job=150000,
)
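As a rough illustration of what the discrete ranges above describe (plain Python, not ClearML's internal sampling logic), each `step_size` range enumerates evenly spaced candidate values from `min_value` to `max_value` inclusive:

```python
def step_range(min_value, max_value, step_size):
    """Enumerate min_value, min_value + step, ... up to max_value inclusive.

    Illustrative helper only; ClearML's ParameterRange classes do the
    real sampling inside the optimizer.
    """
    values = []
    v = min_value
    while v <= max_value + 1e-9:  # small tolerance for float accumulation
        values.append(round(v, 10))
        v += step_size
    return values

print(step_range(2, 12, 2))      # candidates for 'number_of_epochs'
print(step_range(2, 16, 2))      # candidates for 'batch_size'
print(len(step_range(0, 0.5, 0.05)))  # number of 'dropout' candidates
```

So the epochs range yields 6 candidates, batch size 8, and dropout 11; Optuna searches over these grids (and the `base_lr` grid) within the job and time budgets configured above.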