@shayanalibhatti · Created September 22, 2020 15:49
YOLOv5 changes: freezing all layers except the Detect head (model.24)
```python
if any(freeze):
    for k, v in model.named_parameters():
        if "model.24" not in k:  # model.24 is the Detect head module
            print('freezing %s' % k)
            v.requires_grad = False
        else:
            print('Not freezing %s' % k)
            v.requires_grad = True

pg0, pg1, pg2 = [], [], []  # optimizer parameter groups
for k, v in model.named_parameters():
    if "model.24" in k:
        v.requires_grad = True
        if '.bias' in k:
            pg2.append(v)  # biases
        elif '.weight' in k and '.bn' not in k:
            pg1.append(v)  # apply weight decay
        else:
            pg0.append(v)  # all else (BatchNorm weights)
```
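As a quick sanity check (my own addition, not part of the original change), one can print the group sizes after the loop. Since only model.24 parameters are grouped and the Detect head is plain 1x1 convolutions, pg0 stays empty:

```python
# Illustrative check: with only model.24 parameters grouped, pg0 would hold
# BatchNorm weights, but Detect has no BatchNorm layers, so it stays empty.
print('pg0 (BN weights): %d, pg1 (conv weights): %d, pg2 (biases): %d'
      % (len(pg0), len(pg1), len(pg2)))
# For a standard 3-scale YOLOv5 model, this should print 0, 3, 3.
```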
This, however, breaks the optimizer setup further down in train.py:
```python
if opt.adam:
    optimizer = optim.Adam(pg0, lr=hyp['lr0'], betas=(hyp['momentum'], 0.999))  # adjust beta1 to momentum
```
The call fails because pg0 is empty: pg0 only collects BatchNorm weights (e.g. model.23.conv.bn.weight), and the Detect head (model.24) contains no BatchNorm layers. PyTorch refuses to construct an SGD/Adam optimizer from an empty parameter list.
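For reference, the failure can be reproduced with plain PyTorch, independent of YOLOv5; the snippet below is a minimal sketch and not part of the original gist:

```python
import torch.optim as optim

# Constructing an optimizer from an empty parameter list is rejected
# at construction time with "ValueError: optimizer got an empty parameter list".
try:
    optim.SGD([], lr=0.01, momentum=0.937, nesterov=True)
except ValueError as e:
    print(e)
```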
To prevent that, I replaced the following
```python
if opt.adam:
    optimizer = optim.Adam(pg0, lr=hyp['lr0'], betas=(hyp['momentum'], 0.999))
else:
    optimizer = optim.SGD(pg0, lr=hyp['lr0'], momentum=hyp['momentum'], nesterov=True)

optimizer.add_param_group({'params': pg1, 'weight_decay': hyp['weight_decay']})  # add pg1 with weight_decay
optimizer.add_param_group({'params': pg2})  # add pg2 (biases)
```
with:
```python
if opt.adam:
    optimizer = optim.Adam(model.parameters(), lr=hyp['lr0'], betas=(hyp['momentum'], 0.999))  # adjust beta1 to momentum
else:
    optimizer = optim.SGD(model.parameters(), lr=hyp['lr0'], momentum=hyp['momentum'], nesterov=True)
```
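Passing model.parameters() works because frozen parameters (requires_grad=False) never receive gradients and are simply skipped, though it drops the per-group weight decay from the original setup. An alternative sketch (my own, not from the gist) that hands the optimizer only the trainable parameters:

```python
# Alternative sketch: restrict the optimizer to the unfrozen parameters
# instead of passing the whole model.
trainable = [p for p in model.parameters() if p.requires_grad]
if opt.adam:
    optimizer = optim.Adam(trainable, lr=hyp['lr0'], betas=(hyp['momentum'], 0.999))
else:
    optimizer = optim.SGD(trainable, lr=hyp['lr0'], momentum=hyp['momentum'], nesterov=True)
```

If the per-group weight decay matters, one could instead keep the three-group setup and guard the constructor and each add_param_group call against empty lists.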