Differentiation of ReLU
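This gist compares a hand-written derivative of ReLU with the gradient that PyTorch's autograd computes. Since ReLU(z) = max(0, z), its derivative is 1 for z > 0 and 0 for z < 0; at z = 0 the implementation below returns 0 by convention.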
import torch
import torch.nn.functional as F

#----------- Implementing the math -----------#
def relu_prime(z):
    # Derivative of ReLU: 1 where z > 0, 0 elsewhere (0 at z = 0 by convention)
    return torch.where(z > 0, torch.tensor(1.), torch.tensor(0.))

z = torch.tensor([[-0.2], [0.6]], requires_grad=True)
relu_p = relu_prime(z)

#----------- Using PyTorch autograd -----------#
torch_relu = F.relu(z)
# Seed the backward pass with ones so z.grad holds dReLU/dz
torch_relu.backward(torch.tensor([[1.], [1.]]))

#----------- Comparing outputs -----------#
print(f"Pytorch ReLU': \n{z.grad} \nOur ReLU': \n{relu_p}")
'''
Out:
Pytorch ReLU':
tensor([[0.],[1.]])
Our ReLU':
tensor([[0.],[1.]])
'''
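#----------- Optional sanity check (not in the original gist) -----------#
# A minimal sketch, assuming the imports and relu_prime defined above:
# repeat the comparison on a random tensor (shape chosen for illustration)
# and let torch.allclose confirm that the manual derivative matches autograd.
z2 = torch.randn(5, 3, requires_grad=True)
F.relu(z2).backward(torch.ones_like(z2))       # seed the backward pass with ones
print(torch.allclose(z2.grad, relu_prime(z2)))  # expected: True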