Class for Sigmoid Activation Layer
import numpy as np  # import numpy library


class SigmoidLayer:
    """
    This class implements a sigmoid activation layer
    in line with a computational graph model.

    Args:
        shape: shape of the input to the layer

    Methods:
        forward(Z)
        backward(upstream_grad)
    """

    def __init__(self, shape):
        """
        The constructor of the sigmoid/logistic activation layer takes in the following arguments

        Args:
            shape: shape of the input to the layer
        """
        self.A = np.zeros(shape)  # create space for the resultant activations

    def forward(self, Z):
        """
        This function performs the forward propagation step through the activation function

        Args:
            Z: input from the previous (linear) layer
        """
        self.A = 1 / (1 + np.exp(-Z))  # compute activations

    def backward(self, upstream_grad):
        """
        This function performs the backpropagation step through the activation function.
        Local gradient => derivative of sigmoid => A*(1-A)

        Args:
            upstream_grad: gradient coming into this layer from the layer above
        """
        # couple the upstream gradient with the local gradient; the result is sent back to the Linear layer
        self.dZ = upstream_grad * self.A * (1 - self.A)
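
As a quick sanity check, here is a minimal usage sketch (not part of the original gist): it runs a forward pass on a small batch and then a backward pass with a dummy upstream gradient. The input values, shape, and variable names are illustrative assumptions.

import numpy as np

# Illustrative usage of SigmoidLayer (assumed example, not from the gist).
Z = np.array([[0.5, -1.0, 2.0]])     # pretend output of a previous linear layer, shape (1, 3)
sigmoid = SigmoidLayer(Z.shape)      # allocate space for activations of the same shape

sigmoid.forward(Z)                   # A = 1 / (1 + exp(-Z))
print(sigmoid.A)                     # activations, each in (0, 1)

upstream_grad = np.ones_like(Z)      # dummy gradient flowing in from the layer above
sigmoid.backward(upstream_grad)      # dZ = upstream_grad * A * (1 - A)
print(sigmoid.dZ)                    # gradient to pass back to the linear layer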