learning_rate = 0.01
max_epoch = 1000000

for epoch in range(1, max_epoch+1):
    y_hat_train = forward_prop(X_train)            # update y_hat
    backward_prop(y_train, y_hat_train)            # update (dW, db)
    for layer_index in range(1, len(layers_dim)):  # update (W, b)
        neural_net[layer_index].W = neural_net[layer_index].W - learning_rate * neural_net[layer_index].dW
        neural_net[layer_index].b = neural_net[layer_index].b - learning_rate * neural_net[layer_index].db
    if epoch % 100000 == 0:
        print(f'{get_loss(y_train, y_hat_train):.4f}')
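The loop above relies on names defined in other snippets of the series: forward_prop, backward_prop, get_loss, layers_dim, X_train, y_train, and a neural_net list of layer objects carrying W, b, dW, db. Below is a minimal, self-contained sketch of the same plain gradient-descent loop with hypothetical stand-ins for those pieces, assuming a small all-sigmoid network, a mean-squared-error loss, and XOR toy data; it is not the gist author's implementation, only an illustration of how the loop fits together.

import numpy as np

class Layer:
    """Hypothetical layer holding weights, biases, and their gradients."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.standard_normal((n_out, n_in)) * 0.1
        self.b = np.zeros((n_out, 1))
        self.dW = np.zeros_like(self.W)
        self.db = np.zeros_like(self.b)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
layers_dim = [2, 4, 1]                      # input, hidden, output widths (assumed)
neural_net = [None] + [Layer(layers_dim[i-1], layers_dim[i], rng)
                       for i in range(1, len(layers_dim))]

# Toy data shaped (features, samples): the XOR truth table
X_train = np.array([[0., 0., 1., 1.],
                    [0., 1., 0., 1.]])
y_train = np.array([[0., 1., 1., 0.]])

def forward_prop(X):
    activations = [X]
    A = X
    for i in range(1, len(layers_dim)):
        A = sigmoid(neural_net[i].W @ A + neural_net[i].b)
        activations.append(A)
    forward_prop.cache = activations        # stash activations for backprop (sketch-only shortcut)
    return A

def backward_prop(y, y_hat):
    A = forward_prop.cache
    m = y.shape[1]
    dA = 2 * (y_hat - y) / m                # gradient of the MSE loss w.r.t. the output
    for i in range(len(layers_dim) - 1, 0, -1):
        dZ = dA * A[i] * (1 - A[i])         # chain rule through the sigmoid
        neural_net[i].dW = dZ @ A[i-1].T
        neural_net[i].db = dZ.sum(axis=1, keepdims=True)
        dA = neural_net[i].W.T @ dZ         # propagate the gradient to the previous layer

def get_loss(y, y_hat):
    return float(np.mean((y_hat - y) ** 2))

learning_rate = 0.5
for epoch in range(1, 10001):
    y_hat_train = forward_prop(X_train)
    backward_prop(y_train, y_hat_train)
    for i in range(1, len(layers_dim)):     # same vanilla gradient-descent update as above
        neural_net[i].W -= learning_rate * neural_net[i].dW
        neural_net[i].b -= learning_rate * neural_net[i].db
    if epoch % 2000 == 0:
        print(f'epoch {epoch}: loss {get_loss(y_train, y_hat_train):.4f}')

The parameter update is identical in spirit to the gist: every epoch runs a forward pass, a backward pass that fills each layer's dW and db, and then subtracts learning_rate times those gradients from W and b; only the network, loss, and data around it are assumed for the sake of a runnable example.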