[deep_learning_study] table of contents for Neural Networks and Deep Learning by Michael Nielsen
Table of Contents
Chapter 1
Sigmoid neurons
The architecture of neural networks
A simple network to classify handwritten digits
Learning with gradient descent
Implementing our network to classify digits
Toward deep learning
Chapter 2
Warm up: a fast matrix-based approach to computing the output from a neural network
The two assumptions we need about the cost function
The Hadamard product
The four fundamental equations behind backpropagation
Proof of the four fundamental equations (optional)
The backpropagation algorithm
The code for backpropagation
In what sense is backpropagation a fast algorithm?
Backpropagation: the big picture
Chapter 3
The cross-entropy cost function
Introducing the cross-entropy cost function
Using the cross-entropy to classify MNIST digits
What does the cross-entropy mean? Where does it come from?
Overfitting and regularization
Why does regularization help reduce overfitting?
Other techniques for regularization
Weight initialization
Handwriting recognition revisited: the code
How to choose a neural network's hyper-parameters?
Other techniques
Variations on stochastic gradient descent
Other models of artificial neuron
On stories in neural networks
Chapter 4
Two caveats
Universality with one input and one output
Many input variables
Extension beyond sigmoid neurons
Fixing up the step functions
Chapter 5
The vanishing gradient problem
What's causing the vanishing gradient problem? Unstable gradients in deep neural nets
Unstable gradients in more complex networks
Other obstacles to deep learning
Chapter 6
Introducing convolutional networks
Convolutional neural networks in practice
The code for our convolutional networks
Recent progress in image recognition
Other approaches to deep neural nets
On the future of neural networks
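As a quick companion to the sigmoid-neuron and Hadamard-product entries above (Chapters 1 and 2), here is a minimal numpy sketch; the function and variable names are my own, not from the book's code:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# A single sigmoid neuron: weighted input plus bias, then the squash.
w = np.array([0.5, -0.3])   # example weights
x = np.array([1.0, 2.0])    # example inputs
b = 0.1                     # example bias
print(sigmoid(w @ x + b))   # w.x + b = 0.5 - 0.6 + 0.1 = 0, so this prints 0.5

# The Hadamard product used in the backpropagation equations is plain
# elementwise multiplication, which numpy's * operator already performs.
print(np.array([1, 2]) * np.array([3, 4]))  # [3 8], not a dot product
```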
Find the code
git clone https://github.com/mnielsen/neural-networks-and-deep-learning.git
Code by Chapter
Chapter 1 Using neural nets to recognize handwritten digits
Chapter 2 How the backpropagation algorithm works
Chapter 3 Improving the way neural networks learn
Chapter 4 A visual proof that neural nets can compute any function
More math
Chapter 5 Why are deep neural networks hard to train?
Chapter 6 Deep learning
Supplemental Video Learning