Created: August 1, 2020 11:24
Epoch 0 loss: 0.334
Epoch 10 loss: 0.305
Epoch 20 loss: 0.118
Epoch 30 loss: 0.072
Epoch 40 loss: 0.049
Epoch 50 loss: 0.047
Epoch 60 loss: 0.046
Epoch 70 loss: 0.037
Epoch 80 loss: 0.024
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-28-5c2aae2b3c78> in <module>
    128 with torch.autograd.set_detect_anomaly(True):
    129     network = OurNeuralNetwork()
--> 130     network.train(data, all_y_trues)

<ipython-input-28-5c2aae2b3c78> in train(self, data, all_y_trues)
     77         y_pred = o1
     78         mseloss = mse_loss(y_true, y_pred)
---> 79         mseloss.backward(retain_graph=True)
     80
     81         # --- Update weights and biases

~/.conda/envs/pytorch/lib/python3.8/site-packages/torch/tensor.py in backward(self, gradient, retain_graph, create_graph)
    196                 products. Defaults to ``False``.
    197         """
--> 198         torch.autograd.backward(self, gradient, retain_graph, create_graph)
    199
    200     def register_hook(self, hook):

~/.conda/envs/pytorch/lib/python3.8/site-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
     96         retain_graph = create_graph
     97
---> 98     Variable._execution_engine.run_backward(
     99         tensors, grad_tensors, retain_graph, create_graph,
    100         allow_unreachable=True)  # allow_unreachable flag

RuntimeError: Function 'ExpBackward' returned nan values in its 0th output.
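An 'ExpBackward returned nan' error under anomaly detection usually means `torch.exp` overflowed to `inf` somewhere in the forward pass, so the chain rule multiplies `0 * inf = nan` on the way back. A minimal sketch that reproduces the failure, assuming (the gist's own code is not shown here) that the network's sigmoid was hand-rolled with `torch.exp` rather than using `torch.sigmoid`; `naive_sigmoid` is a hypothetical stand-in, not a name from the gist:

```python
import torch

def naive_sigmoid(x):
    # Hand-rolled sigmoid: exp(-x) overflows float32 to inf for large
    # negative x, and the backward pass then computes 0 * inf = nan.
    return 1.0 / (1.0 + torch.exp(-x))

x = torch.tensor([-100.0], requires_grad=True)
naive_sigmoid(x).backward()
print(x.grad)  # tensor([nan]) -- the ExpBackward nan from the traceback

# torch.sigmoid is numerically stable and gives a finite gradient.
x2 = torch.tensor([-100.0], requires_grad=True)
torch.sigmoid(x2).backward()
print(x2.grad)  # finite (essentially zero)
```

With `torch.autograd.set_detect_anomaly(True)` active, as on line 128 of the traceback, the first formulation raises exactly this `RuntimeError` and points at the `ExpBackward` node. Replacing the hand-written sigmoid with `torch.sigmoid` (or clamping the inputs to `torch.exp`) is the usual fix.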