%%time
# training loop: assumes modelpy, criterion, optim, and train_loader are already defined
for e in range(10):
    # accumulate the loss over the epoch
    total_loss = 0.0
    number_of_batches = 0
    # loop over every training batch (one epoch)
    for images, labels in train_loader:
        # forward pass through the network
        out = modelpy(images)
        # compute the loss
        loss = criterion(out, labels)
        # in PyTorch the gradients must be zeroed before each backward pass
        optim.zero_grad()
        # backpropagate the loss
        loss.backward()
        # update the weights
        optim.step()
        # accumulate the batch loss
        total_loss += loss.item()
        number_of_batches += 1
    print("epoch {}: loss: {}".format(e, total_loss / number_of_batches))
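For reference, here is a minimal self-contained sketch of the same loop. The model, dataset, and optimizer below are placeholders (a toy linear regression), not the gist's originals, which are defined elsewhere:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# toy dataset: y = 2x + 1 with a little noise (placeholder for the real data)
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 1 + 0.01 * torch.randn_like(x)
train_loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

model = nn.Linear(1, 1)  # stands in for modelpy
criterion = nn.MSELoss()
optim = torch.optim.SGD(model.parameters(), lr=0.1)

epoch_losses = []
for e in range(10):
    total_loss = 0.0
    number_of_batches = 0
    for inputs, targets in train_loader:
        out = model(inputs)              # forward pass
        loss = criterion(out, targets)   # compute the loss
        optim.zero_grad()                # clear old gradients
        loss.backward()                  # backpropagate
        optim.step()                     # update the weights
        total_loss += loss.item()
        number_of_batches += 1
    epoch_losses.append(total_loss / number_of_batches)
    print("epoch {}: loss: {:.4f}".format(e, epoch_losses[-1]))
```

With the fixed seed the average epoch loss drops steadily, which is a quick sanity check that the zero-grad / backward / step ordering is correct.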