Deep learning
Sigmoid function:
sigmoid(x) = 1 / (1 + e^-x)
A sigmoid function is a mathematical function with a characteristic "S"-shaped (sigmoid) curve. It squashes any real input into the range (0, 1).
Softmax:
The softmax function is often used in the final layer of a neural-network-based classifier. Such networks are commonly trained under a log-loss (cross-entropy) regime, giving a non-linear variant of multinomial logistic regression.
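A short sketch of softmax, which turns a vector of raw scores into a probability distribution. Subtracting the maximum before exponentiating is a standard numerical-stability trick and does not change the result:

```python
import math

def softmax(z):
    # softmax(z)_i = e^{z_i} / sum_j e^{z_j}
    m = max(z)                            # shift by max for stability
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]
```

The outputs are positive and sum to 1, and larger scores map to larger probabilities, which is why it pairs naturally with cross-entropy loss in a classifier's final layer.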
W -> weight vector (the weights on the connections between the input (x) and the first activation layer (z))
b -> bias (each activation layer (z) has a bias term)
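Putting W and b together, the pre-activation of a layer is z = Wx + b. A hypothetical helper (the name dense_forward is mine, not from the notes) sketching that computation:

```python
def dense_forward(W, x, b):
    # z = W x + b: one row of W (and one bias term) per output unit.
    return [
        sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
        for row, b_i in zip(W, b)
    ]
```

An activation function such as sigmoid would then be applied elementwise to z to produce the layer's output.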
TBD:
-------
- Activation function (to be chosen)