Sample activation functions for use in a neural network (NN)
import numpy as np

# ReLU activation function, which computes the ReLU of a scalar
# (or, via NumPy broadcasting, of an array).
def rectified_linear_unit(x):
    """ Returns the ReLU of x, or the maximum between 0 and x. """
    return np.maximum(0, x)
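
A quick usage sketch (illustrative; the example inputs are not part of the original gist). Because np.maximum broadcasts, rectified_linear_unit works on scalars and NumPy arrays alike:

# Example usage (assumed inputs for illustration):
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(rectified_linear_unit(x))    # [0.  0.  0.  1.5]
print(rectified_linear_unit(-3))   # 0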
# Returns the derivative of ReLU. Used in backpropagation.
def rectified_linear_unit_derivative(x):
    """ Returns the derivative of ReLU at a scalar x (conventionally 0 at x = 0). """
    return 1 if x > 0 else 0
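
A minimal backpropagation-style sketch (illustrative; the variable names and the np.vectorize wrapper are assumptions, not part of the original gist). Since rectified_linear_unit_derivative handles one scalar at a time, np.vectorize applies it elementwise; the chain rule dL/dz = dL/da * ReLU'(z) routes the upstream gradient through the unit:

# Apply the scalar derivative elementwise (illustrative helper).
relu_derivative = np.vectorize(rectified_linear_unit_derivative)

z = np.array([-1.0, 0.5, 2.0])               # pre-activations
upstream_grad = np.array([0.3, -0.2, 0.1])   # dL/da from the next layer
grad_z = upstream_grad * relu_derivative(z)  # -> [ 0.  -0.2  0.1]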