@abhishekkrthakur · Created June 11, 2019 20:30
import torch
import torch.optim as optim
from torch.optim import lr_scheduler

# Discriminative learning rates: a small LR for the pretrained layer4 block
# and a larger LR for the freshly initialized classification head.
plist = [
    {'params': model_ft.layer4.parameters(), 'lr': 1e-5},
    {'params': model_ft.last_linear.parameters(), 'lr': 5e-3},
]
optimizer_ft = optim.Adam(plist, lr=0.001)

# Multiply every parameter group's learning rate by 0.1 every 10 epochs.
lr_sch = lr_scheduler.StepLR(optimizer_ft, step_size=10, gamma=0.1)

model_ft = train_model(model_ft,
                       train_dataset_loader,
                       len(train_dataset),
                       optimizer_ft,
                       lr_sch,
                       num_epochs=20)

# Persist the fine-tuned weights.
torch.save(model_ft.state_dict(), "model.bin")
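
For context, the gist assumes that model_ft, train_model, train_dataset, and train_dataset_loader are defined elsewhere. Below is a minimal sketch of what those definitions might look like; the se_resnext50_32x4d backbone from the pretrainedmodels package (chosen because it exposes a last_linear head, matching the parameter group above), the num_classes value, and the training-loop structure are assumptions for illustration, not part of the original gist.

# --- assumed setup, not part of the original gist ---
import pretrainedmodels
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
num_classes = 10  # hypothetical number of target classes

# Backbone with a `last_linear` classification head, replaced for the new task.
model_ft = pretrainedmodels.se_resnext50_32x4d(pretrained='imagenet')
model_ft.last_linear = nn.Linear(model_ft.last_linear.in_features, num_classes)
model_ft = model_ft.to(device)

def train_model(model, loader, dataset_size, optimizer, scheduler, num_epochs=20):
    criterion = nn.CrossEntropyLoss()
    for epoch in range(num_epochs):
        model.train()
        running_loss = 0.0
        for inputs, labels in loader:
            inputs, labels = inputs.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(inputs), labels)
            loss.backward()
            optimizer.step()
            running_loss += loss.item() * inputs.size(0)
        scheduler.step()  # step once per epoch so StepLR decays every 10 epochs
        print(f"epoch {epoch}: loss {running_loss / dataset_size:.4f}")
    return model

With these stand-ins in place, the snippet above runs end to end: Adam updates the two parameter groups at their own learning rates, StepLR decays both by 10x every 10 epochs, and the final state_dict is written to model.bin.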