@mizushou
Last active July 30, 2019 08:08
Sample activation functions used in neural networks
import numpy as np

# ReLU activation function, which computes the ReLU of a scalar.
def rectified_linear_unit(x):
    """ Returns the ReLU of x, i.e. the maximum of 0 and x. """
    return np.maximum(0, x)

# Returns the derivative of ReLU; used during backpropagation.
def rectified_linear_unit_derivative(x):
    """ Returns the derivative of ReLU at a scalar x. """
    return 1 if x > 0 else 0
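
A minimal usage sketch follows; the sample input values are illustrative and not part of the original gist. It simply evaluates both functions at a few scalars to show the expected outputs (ReLU clips negatives to 0, and the derivative is 1 for positive inputs, 0 otherwise).

# Usage sketch: evaluate ReLU and its derivative at a few sample scalars.
if __name__ == "__main__":
    for x in [-2.0, 0.0, 3.5]:
        print(x, rectified_linear_unit(x), rectified_linear_unit_derivative(x))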