@bpesquet
Created October 28, 2020 19:17
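A minimal training loop for a small MLP built with micrograd, trained with plain stochastic gradient descent: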
from micrograd.nn import Neuron, Layer, MLP

model = MLP(2, [16, 16, 1])  # 2 inputs, two hidden layers of 16 neurons, 1 output

for k in range(100):
    # forward pass: compute the total loss and accuracy over the dataset
    total_loss, acc = loss()
    # backward pass: reset the gradients, then backpropagate from the loss
    model.zero_grad()
    total_loss.backward()
    # update (SGD) with a learning rate that decays linearly from 1.0 toward 0.1
    learning_rate = 1.0 - 0.9 * k / 100
    for p in model.parameters():
        p.data -= learning_rate * p.grad
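
The loop assumes a loss() helper defined elsewhere in the notebook this snippet comes from. Below is a minimal sketch of one, following the SVM "max-margin" loss used in the micrograd demo; the names X (a list of 2-D input points), y (labels in {-1, +1}), and alpha are illustrative assumptions, not part of the original snippet:

from micrograd.engine import Value

def loss():
    # forward the model on every input to get one score per sample
    # (X and y are assumed to be defined globally, as in the demo notebook)
    scores = [model(list(map(Value, xrow))) for xrow in X]
    # max-margin loss: penalize scores on the wrong side of the margin
    losses = [(1 + -yi * si).relu() for yi, si in zip(y, scores)]
    data_loss = sum(losses) * (1.0 / len(losses))
    # L2 regularization keeps the weights small (alpha is an assumed value)
    alpha = 1e-4
    reg_loss = alpha * sum(p * p for p in model.parameters())
    total_loss = data_loss + reg_loss
    # accuracy: fraction of samples whose score sign matches the label
    accuracy = [(yi > 0) == (si.data > 0) for yi, si in zip(y, scores)]
    return total_loss, sum(accuracy) / len(accuracy)

Because the final layer has a single neuron, model(x) returns a single Value, so its sign can be compared directly against the {-1, +1} label.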