The advantages of using Rectified Linear Units (ReLUs) in neural networks are:

  • Because the rectifier is a hard max, max(0, x), it outputs exactly zero for negative inputs and therefore induces sparsity in the hidden units.
  • ReLU does not suffer from the vanishing gradient problem the way the sigmoid and tanh functions do (see the sketch after this list). It has also been shown that deep networks can be trained efficiently with ReLU even without pre-training.
  • ReLUs can be used in Restricted Boltzmann Machines to model real- or integer-valued inputs.
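
To make the first two points concrete, here is a minimal NumPy sketch (not part of the original note; the function names and the toy pre-activation data are illustrative assumptions). It shows that ReLU produces exactly-zero activations (sparsity) and that its gradient stays at 1 for active units, whereas the sigmoid gradient shrinks toward zero for large |x|.

```python
import numpy as np

def relu(x):
    # Rectifier / "hard max": element-wise max(0, x)
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 for positive inputs and 0 otherwise; it does not
    # shrink toward zero for active units, unlike sigmoid/tanh.
    return (x > 0).astype(x.dtype)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # at most 0.25, and tiny for large |x|

if __name__ == "__main__":
    # Hypothetical pre-activations of a hidden layer
    x = np.random.randn(10000) * 3.0

    # Sparsity: roughly half of the hidden units are exactly zero.
    h = relu(x)
    print("fraction of exactly-zero activations:", np.mean(h == 0.0))

    # Gradients: ReLU passes a gradient of 1 for every active unit,
    # while the sigmoid gradient decays quickly as |x| grows.
    print("mean ReLU gradient   :", relu_grad(x).mean())
    print("mean sigmoid gradient:", sigmoid_grad(x).mean())
```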