
@Priyansh-Kedia
Created May 19, 2021 19:16
Code for a ReLU activation plot using matplotlib
from matplotlib import pyplot

# ReLU: pass positive inputs through unchanged, clamp negatives to 0.
def rectified(x):
    return max(0.0, x)

# Evaluate ReLU over the integers -100..100 and plot the result.
series_in = [x for x in range(-100, 101)]
series_out = [rectified(x) for x in series_in]
pyplot.plot(series_in, series_out)
pyplot.show()
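As a sketch of an alternative (assuming NumPy is installed; the function name `rectified_np` is my own), the same element-wise `max(0, x)` can be vectorized with `np.maximum`, which avoids the Python-level list comprehension:

```python
import numpy as np

# Vectorized ReLU: np.maximum broadcasts 0.0 against every element of x.
def rectified_np(x):
    return np.maximum(0.0, x)

xs = np.arange(-100, 101)
ys = rectified_np(xs)

print(ys.min())  # negative inputs are clamped to 0
print(ys.max())  # positive inputs pass through unchanged
```

The resulting `xs` and `ys` arrays can be passed to `pyplot.plot` exactly as the lists above.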