@Mehdi-Amine
Last active June 2, 2020 23:08
relu with clamp and with torch.nn.functional
import torch
import torch.nn.functional as F
#----------- Implementing the math -----------#
def relu(z):
    return torch.clamp(z, min=0)  # clamp below at 0; no upper bound is needed

z = torch.tensor([[-0.2], [0.], [0.6]])  # three neurons with different values
our_relu = relu(z)  # avoid reassigning the name `relu`, which would shadow the function
#----------- Using PyTorch -----------#
torch_relu = F.relu(z)
#----------- Comparing outputs -----------#
print(f"PyTorch ReLU: \n{torch_relu} \nOur ReLU: \n{our_relu}")
'''
Out:
PyTorch ReLU:
tensor([[0.0000],
        [0.0000],
        [0.6000]])
Our ReLU:
tensor([[0.0000],
        [0.0000],
        [0.6000]])
'''
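For reference, `torch.clamp(z, min=0)` is just the element-wise rule max(0, z). A minimal pure-Python sketch of the same math, with no PyTorch dependency (the helper name `relu_scalar` is ours, for illustration):

```python
def relu_scalar(z):
    # ReLU is the piecewise function max(0, z):
    # positive inputs pass through unchanged, negatives become 0.
    return z if z > 0 else 0.0

# Same three neuron values as the tensor above:
print([relu_scalar(z) for z in [-0.2, 0.0, 0.6]])  # -> [0.0, 0.0, 0.6]
```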