@shilpavijay
Last active February 13, 2018 09:30
Deep learning
Sigmoid function:
sigmoid(x) = 1/(1+e^-x)
A sigmoid function is a mathematical function having a characteristic "S"-shaped curve or sigmoid curve.
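The formula above can be sketched in NumPy (a minimal illustration, not part of the original gist):

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^-x); squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))   # 0.5, the midpoint of the S-curve
print(sigmoid(6.0))   # close to 1 for large positive inputs
```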
Softmax:
The softmax function is often used in the final layer of a neural network-based classifier. Such networks are commonly trained
under a log loss (or cross-entropy) regime, giving a non-linear variant of multinomial logistic regression.
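A minimal NumPy sketch of softmax as used in a final classifier layer (the max-subtraction trick for numerical stability is a standard addition, not from the gist):

```python
import numpy as np

def softmax(z):
    # subtract the max before exponentiating for numerical stability;
    # the shift cancels out in the normalisation
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])   # hypothetical class scores (logits)
probs = softmax(scores)
print(probs.sum())                   # probabilities sum to 1
```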
W -> weight vector (the weights on the connections between the input (x) and the first activation layer (z))
b -> bias (each activation layer (z) has its own bias term)
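Putting W and b together, the pre-activation of the first layer is z = Wx + b. A small sketch with hypothetical sizes (3 inputs, 4 units):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))     # weights between input x and layer z
b = np.zeros(4)                 # one bias term per unit in the layer
x = np.array([1.0, 0.5, -1.0])  # example input vector

z = W @ x + b                        # pre-activation of the layer
a = 1.0 / (1.0 + np.exp(-z))         # sigmoid activation applied to z
print(z.shape, a.min(), a.max())     # 4 activations, each in (0, 1)
```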
TBD:
-------
- Activation function (to be chosen)