@madagra
Created December 22, 2021 18:37
import torch

# PINN and loss_fn are assumed to be defined elsewhere (e.g. earlier in the accompanying code)

# create a 5-layer PINN with 5 neurons per layer
nn_approximator = PINN(5, 5)

# hyperparameters and optimizer
max_epochs = 10_000
learning_rate = 0.01
optimizer = torch.optim.Adam(nn_approximator.parameters(), lr=learning_rate)

# optimization loop
for epoch in range(max_epochs):

    # loss_fn returns a single loss tensor combining the PDE residual,
    # boundary and initial condition losses, and potentially regularization terms
    loss: torch.Tensor = loss_fn(nn_approximator)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    if epoch % 1000 == 0:
        print(f"Epoch: {epoch} - Loss: {float(loss):>7f}")