@nilbot
Created May 31, 2018 15:11
some pytorch notes

On the 0.4 migration: the merge of `Variable` and `Tensor` made `requires_grad` an attribute of tensors themselves, which affects how parameters are passed to all optimizers.
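A minimal sketch of what the merge means in practice: post-0.4, `requires_grad` is set directly on a `Tensor`, with no `Variable` wrapper needed for autograd.

```python
import torch

# Post-0.4 style: requires_grad lives on the Tensor itself.
x = torch.ones(2, 2, requires_grad=True)

# Any computation on x is tracked; backward() populates x.grad.
y = (x * 3).sum()
y.backward()

print(x.grad)  # every element is 3.0
```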

So I don't really think it makes sense to allow such parameters. If you don't want to optimize some tensors, they're not parameters, they're fixed. You probably don't want to count them in. And if you really need to, then

`torch.optim.SGD(filter(lambda p: p.requires_grad, model.parameters()), lr=1e-3)` should do the trick.
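A runnable sketch of that trick, using a hypothetical two-layer model: freeze one layer by clearing `requires_grad` on its tensors, then hand only the still-trainable parameters to the optimizer.

```python
import torch.nn as nn
import torch.optim as optim

# Hypothetical model for illustration.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Freeze the first linear layer: its tensors are fixed, not things to optimize.
for p in model[0].parameters():
    p.requires_grad = False

# Pass only the trainable parameters to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
opt = optim.SGD(trainable, lr=1e-3)

# Only the second Linear's weight and bias remain trainable.
print(len(trainable))  # 2
```

Note that freezing must happen before the optimizer is constructed; the optimizer only sees the parameter list it was given.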
