@kk17
Created September 7, 2022 02:19
train_on_ray2 Tuner Error
@xwjiang2010

Can you try removing datasets from param_space?

param_space = {
    "train_loop_config" : {
        "lr": tune.grid_search([0.001, 0.01]),
        "batch_size": batch_size,
        "image_shape": image_shape,
        "num_epochs": 10,
    },
    "datasets": {"train": train_ds, "val": val_ds}
}

Basically, the above block becomes:

param_space = {
    "train_loop_config" : {
        "lr": tune.grid_search([0.001, 0.01]),
        "batch_size": batch_size,
        "image_shape": image_shape,
        "num_epochs": 10,
    },
}

@xwjiang2010

The idea is that if you don't need to tune the dataset, you only need to supply it through the Trainer() args, as you already did; there is no need to pass it through param_space again. The same actually applies to all non-tuned parameters.
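To make the before/after concrete, here is a minimal sketch of the fix. It uses plain dicts so it runs without Ray installed: GRID is a stand-in for ray.tune.grid_search (which in Ray 2.x returns a {"grid_search": [...]} dict), and train_ds/val_ds are placeholders for the gist's actual Ray Datasets. The values for batch_size and image_shape are illustrative only.

```python
def GRID(values):
    # Stand-in for ray.tune.grid_search, which wraps the
    # candidate values in a {"grid_search": [...]} marker dict.
    return {"grid_search": values}

batch_size, image_shape = 32, (256, 256)  # illustrative values only
train_ds, val_ds = object(), object()     # stand-ins for Ray Datasets

# Before: the datasets are duplicated inside param_space,
# which is what triggers the Tuner error.
param_space_before = {
    "train_loop_config": {
        "lr": GRID([0.001, 0.01]),
        "batch_size": batch_size,
        "image_shape": image_shape,
        "num_epochs": 10,
    },
    "datasets": {"train": train_ds, "val": val_ds},
}

# After: only the hyperparameters being searched stay in param_space.
# The datasets are supplied once, via the Trainer constructor, e.g.
# TorchTrainer(..., datasets={"train": train_ds, "val": val_ds}).
param_space_after = {
    k: v for k, v in param_space_before.items() if k != "datasets"
}

assert "datasets" not in param_space_after
```

The same pruning applies to any other constant (non-tuned) argument: keep it in the Trainer, and reserve param_space for values the Tuner should actually vary.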
