@jcausey-astate
Created October 20, 2018 18:20
An example one- and two-layer neural network from scratch in Python, with helpful references and example links.
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Simple Neural Network in Python"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This notebook is based on the following tutorials and examples, and it borrows heavily from their figures, code, and discussion.\n",
"* [Neural Network Implementation](https://peterroelants.github.io/posts/neural-network-implementation-part01/)\n",
"* [Build a Neural Network](https://enlight.nyc/projects/neural-network/)\n",
"* [A Neural Network in 11 lines of Python (Part 1)](https://iamtrask.github.io/2015/07/12/basic-python-network/)\n",
"* [How to build your own Neural Network from scratch in Python](https://towardsdatascience.com/how-to-build-your-own-neural-network-from-scratch-in-python-68998a08e4f6)\n",
"* [Artificial neural networks with Math.](https://medium.com/deep-math-machine-learning-ai/chapter-7-artificial-neural-networks-with-math-bb711169481b)"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np, matplotlib.pyplot as plt\n",
"%matplotlib inline"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## What is a \"Neuron\" in computing terms anyway?\n",
"\n",
"![Neuron](images/artificial_neuron.jpeg)\n",
"\n",
"(Source: [Artificial neural networks with Math](https://medium.com/deep-math-machine-learning-ai/chapter-7-artificial-neural-networks-with-math-bb711169481b))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each neuron is the sum of the incoming inputs multiplied by the weights for each, then with (maybe) a bit of added bias. In math speak:\n",
"\n",
"$$\n",
"Y = \\sum_{i=0}^{N}{(x_i \\cdot w_i)} + b\n",
"$$\n",
"\n",
"Where $N$ is the number of inputs, each $x_i$ is one input value, and each $w_i$ is one weight; $Y$ is the output value for the neuron. \n",
"\n",
"The value of $Y$ could be anything in the range $(-\\infty,\\infty)$.\n",
"\n",
"But, that range is too wide. We generally want to use an *activation function* to \"compress\" the output range.\n",
"\n",
"Common activation functions include:\n",
"\n",
"| name | formula |\n",
"|----------|--------------------------|\n",
"| Sigmoid | $y = \\frac{1}{1+e^{-x}}$ |\n",
"| Tanh | $y = \\textrm{tanh}(x)$ |\n",
"| Relu | $y = \\textrm{max}(0, x)$ |\n"
]
},
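{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick illustrative sketch (the values below are made up and not used later in the notebook), the next cell computes one neuron's raw output $Y$ and then passes it through each of the activation functions in the table above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch: one neuron's raw output, then each common activation.\n",
"x_demo = np.array([0.5, -1.0, 2.0]) # example inputs\n",
"w_demo = np.array([0.1, 0.4, -0.2]) # example weights\n",
"b_demo = 0.1 # example bias\n",
"\n",
"Y = np.dot(x_demo, w_demo) + b_demo # raw output; could be anywhere in (-inf, inf)\n",
"\n",
"print(\"raw: %.4f\" % Y)\n",
"print(\"sigmoid: %.4f\" % (1/(1+np.exp(-Y)))) # compressed into (0,1)\n",
"print(\"tanh: %.4f\" % np.tanh(Y)) # compressed into (-1,1)\n",
"print(\"relu: %.4f\" % max(0, Y)) # negatives clipped to 0"
]
},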
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## The Example Problem\n",
"Let's try to train a network to predict the outcome of:"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"| x0 | x1 | x2 | **Output** |\n",
"|--------|---|---|------------|\n",
"| 0 | 0 | 1 | **0** |\n",
"| 1 | 1 | 1 | **1** |\n",
"| 1 | 0 | 1 | **1** |\n",
"| 0 | 1 | 1 | **0** |\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## The Sigmoid activation function\n",
"\n",
"![Sigmoid](images/Sigmoid-function.png)\n",
"\n",
"The sigmoid function scales its input into the range $(0,1)$.\n",
"\n",
"The formula is:\n",
"\n",
"$$\n",
"y = \\frac{1}{1+e^{-x}}\n",
"$$\n",
"\n",
"And in Python:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"def sigmoid(x):\n",
" return 1/(1+np.exp(-x))"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"lines_to_next_cell": 2
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"x: -10 y: 0.0000\n",
"x: -8 y: 0.0003\n",
"x: -6 y: 0.0025\n",
"x: -4 y: 0.0180\n",
"x: -2 y: 0.1192\n",
"x: 0 y: 0.5000\n",
"x: 2 y: 0.8808\n",
"x: 4 y: 0.9820\n",
"x: 6 y: 0.9975\n",
"x: 8 y: 0.9997\n"
]
}
],
"source": [
"# Let's try it...\n",
"for i in range(-10, 10, 2):\n",
" print(\"x: %3i y: %.04f\" % (i, sigmoid(i)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we set up our input matrix ($X$) and output matrix ($y$).\n",
"\n",
"$X$ represents the random variables, $y$ represents the dependent variable(s)."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# input matrix (ordered so that the output \"ones\" are at the bottom)\n",
"X = np.array([ [0,0,1],\n",
" [0,1,1],\n",
" [1,0,1],\n",
" [1,1,1] ])\n",
" \n",
"# output matrix (order must match X):\n",
"y = np.array([ [0],\n",
" [0],\n",
" [1],\n",
" [1] ])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## A Simple One-Layer Artificial Neural Network\n",
"\n",
"For this example, we will just use a single \"hidden layer\" that is also the *output layer*.\n",
"\n",
"That means we just have one set of weights $W_0$ that will be multiplied by the inputs and summed. Then, we run the result through the Sigmoid function to produce the output.\n",
"\n",
"The network diagram looks like this:\n",
"\n",
"![One layer network](images/one_layer_nn.png)\n",
"\n",
"This network only has one \"layer\", and that is the layer producing the output. So, we directly observe the results coming out of this layer -- the output layer. Thus, we would not say that this network has any *hidden layers* at all. But, it still behaves like a neural network, and it is simple, so it's a good place to start."
]
},
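{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before building it up step by step, here is the whole one-layer forward pass collapsed into a single expression. This is illustrative only: `W_demo` is a throwaway random weight matrix, since the real $W_0$ is created in the next section."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative: the entire one-layer forward pass is one dot product followed\n",
"# by the activation. W_demo is a throwaway matrix; the real W0 comes next.\n",
"W_demo = 2*np.random.random((3,1)) - 1\n",
"print(sigmoid(np.dot(X, W_demo)))"
]
},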
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initial Weights are random!\n",
"We start by creating the weights matrix $W_0$ with random values that fall in the range $(-1,1)$ with mean $\\mu = 0$."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"# initialize weights randomly with mean 0 (range (-1,1))\n",
"W0 = 2*np.random.random((3,1)) - 1 # The (3,1) is the shape: 3 rows, 1 column as shown in the diagram above."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[-0.07718115]\n",
" [ 0.71523045]\n",
" [ 0.6842432 ]]\n"
]
}
],
"source": [
"# Let's see it:\n",
"print(W0)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Forward propagation\n",
"Let `l` mean \"layer\" in our short variable names here.\n",
"\n",
"We start by loading the inputs into layer 0."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"l0 = X"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, we calculate the values at the \"hidden layer\" (layer 1, before activation) by dot-product of the input layer (`l0`) with the weights for the first hidden layer (`W0`)."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[0.6842432 ]\n",
" [1.39947366]\n",
" [0.60706206]\n",
" [1.32229251]]\n"
]
}
],
"source": [
"l1 = np.dot(l0, W0) # Note, we're not using any bias here. Bias = 0\n",
"print(l1)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And then we apply the activation layer (nonlinearity):"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"output = sigmoid(l1)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[0.66468508]\n",
" [0.80210035]\n",
" [0.64727033]\n",
" [0.78956287]]\n"
]
}
],
"source": [
"# Let's see it:\n",
"print(output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"That's pretty bad. But the second value isn't *too* bad. The last one is *really really bad*."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[-0.66468508]\n",
" [-0.80210035]\n",
" [ 0.35272967]\n",
" [ 0.21043713]]\n"
]
}
],
"source": [
"# Now calculate the error:\n",
"output_error = y - output\n",
"print(output_error)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The error reflects that observation, some are worse than others, and one is not too bad."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Backpropagation\n",
"To train the network, we need to *backpropogate* the error through the hidden layers, updating the weights as we go.\n",
"\n",
"But, to figure out how much each weight contributed, since it has passed through a nonlinearity (the activation function), we need to \"undo\" that transformation by applying the derivative of the activation function.\n",
"\n",
"The derivative of sigmoid turns out to be:\n",
"\n",
"$$\n",
"\\textrm{sigmoid}^\\prime(x) = \\textrm{sigmoid}(x) * (1-\\textrm{sigmoid}(x))\n",
"$$\n",
"\n",
"To see why, [read this article](https://beckernick.github.io/sigmoid-derivative-neural-network/) and [this one](https://www.analyticsindiamag.com/beginners-guide-neural-network-math-python/)."
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [],
"source": [
"def sigmoid_deriv(x):\n",
" # According to the equation above, this function would look like:\n",
" # return sigmoid(x) * (1 - sigmoid(x))\n",
" # BUT: Remember, we passed the result of l1 = l0 * W0 through the sigmoid\n",
" # function already to produce our `output`. So the value that \n",
" # will be passed to this function (x, which is `output`): \n",
" # `output` == sigmoid(l1); we don't need to re-do the sigmoid!\n",
" return x * (1-x)"
]
},
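{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (illustrative, not part of the referenced articles): since `sigmoid_deriv` expects an already-activated value, `sigmoid_deriv(sigmoid(x))` should match a finite-difference estimate of the true derivative of `sigmoid` at `x`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sanity check (illustrative): compare the analytic derivative against a\n",
"# central finite-difference estimate at an arbitrary point.\n",
"x_chk = 0.7\n",
"h = 1e-5\n",
"analytic = sigmoid_deriv(sigmoid(x_chk))\n",
"numeric = (sigmoid(x_chk + h) - sigmoid(x_chk - h)) / (2*h)\n",
"print(\"analytic: %.8f numeric: %.8f\" % (analytic, numeric))"
]
},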
{
"cell_type": "markdown",
"metadata": {},
"source": [
"So to figure out how much error was due to each weight, we can do:"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[-0.14814423]\n",
" [-0.1273217 ]\n",
" [ 0.08053222]\n",
" [ 0.03496483]]\n"
]
}
],
"source": [
"# Put the output error in terms of the hidden layer output (i.e. undo the sigmoid function):\n",
"output_delta = output_error * sigmoid_deriv(output)\n",
"print(output_delta)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now figure out how much of that error was due to each of the weights in $W_0$:\n",
"\n",
"This works by dotting the inputs (transposed) by the weights, to give an amount of error associated with each weight."
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[ 0.11549706]\n",
" [-0.09235687]\n",
" [-0.15996887]]\n"
]
}
],
"source": [
"err_per_weight = np.dot(l0.T, output_delta)\n",
"print(err_per_weight)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we simply adjust the weights by this amount:"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Weights now: \n",
" [[0.03831591]\n",
" [0.62287358]\n",
" [0.52427433]]\n"
]
}
],
"source": [
"W0 += err_per_weight\n",
"print(\"Weights now: \\n\", W0)"
]
},
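{
"cell_type": "markdown",
"metadata": {},
"source": [
"Aside: the update above takes a \"full-size\" step. As the note below explains, a *learning rate* would scale the step down. Here is an illustrative sketch of what that would look like, done on a copy so the real `W0` is untouched:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch only: scaling the weight update with a learning rate.\n",
"# We work on a copy so the real W0 (already updated above) is untouched.\n",
"learning_rate = 0.1\n",
"W0_demo = W0.copy()\n",
"W0_demo += learning_rate * err_per_weight # a smaller step along the gradient\n",
"print(W0_demo)"
]
},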
{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Note*: Normally, you would want to move more slowly, so you would include a *learning rate* in the calculation of the error. The learning rate is a number in the range $(0,1)$ that scales down the amount of error that gets backpropogated on each update. Larger learning rates (nearer to 1) converge faster by taking larger \"steps\", smaller learning rates converge more slowly. There are plenty of resources online to help you get a sense for the tradeoffs involved. Here, there is no learning rate, so it is essentially clamped to 1.\n",
"\n",
"Now, we do the whole process over and over again to move down the error gradient closer to the correct answer. \n",
"\n",
"This is the concept of *gradient descent*.\n",
"\n",
"### Let's bundle up the forward propagation step:"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [],
"source": [
"def forward(X, W0):\n",
" l0 = X\n",
" l1 = np.dot(l0, W0) # Note, we're not using any bias here. Bias = 0\n",
" output = sigmoid(l1)\n",
" return output"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[0.6281467 ]\n",
" [0.75898959]\n",
" [0.63705166]\n",
" [0.76592879]]\n"
]
}
],
"source": [
"output = forward(X, W0)\n",
"print(output)"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"array([[-0.6281467 ],\n",
" [-0.75898959],\n",
" [ 0.36294834],\n",
" [ 0.23407121]])"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y - output"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Well, that is much better!\n",
"\n",
"Let's bundle up the backpropagation step as well. This function will update the weight matrix in-place."
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [],
"source": [
"def backprop(l0, W0, output, y):\n",
" output_error = y - output\n",
" output_delta = output_error * sigmoid_deriv(output)\n",
" err_per_weight = np.dot(l0.T, output_delta)\n",
" W0 += err_per_weight"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Define a function to compute the Mean Square Error (MSE), so we can track progress."
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {},
"outputs": [],
"source": [
"def MSE(truth, estimate):\n",
" return np.mean(np.square(truth-estimate))"
]
},
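{
"cell_type": "markdown",
"metadata": {},
"source": [
"A tiny worked example (illustrative): for truth $[1, 0]$ and estimate $[0.9, 0.1]$, the squared errors are $0.01$ and $0.01$, so the MSE is $0.01$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Tiny illustrative check of the MSE function; we expect 0.01.\n",
"print(MSE(np.array([1.0, 0.0]), np.array([0.9, 0.1])))"
]
},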
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"MSE after second training step: 0.289\n"
]
}
],
"source": [
"print(\"MSE after second training step: %.3f\" % MSE(y, forward(X,W0)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now create a \"Train\" function that will do a forward and backward pass."
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [],
"source": [
"def train(X, y, W0):\n",
" output = forward(X, W0)\n",
" backprop(X, W0, output, y)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's iterate 100 more times to watch it train:\n",
"\n",
"To be fair to our later comparision, we will re-initialize the weights so that we are starting over from scratch..."
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"i: 0 MSE before: 0.22029 after: 0.19996\n",
"i: 10 MSE before: 0.09415 after: 0.08781\n",
"i: 20 MSE before: 0.05207 after: 0.04961\n",
"i: 30 MSE before: 0.03418 after: 0.03298\n",
"i: 40 MSE before: 0.02486 after: 0.02417\n",
"i: 50 MSE before: 0.01930 after: 0.01887\n",
"i: 60 MSE before: 0.01566 after: 0.01537\n",
"i: 70 MSE before: 0.01312 after: 0.01291\n",
"i: 80 MSE before: 0.01125 after: 0.01109\n",
"i: 90 MSE before: 0.00983 after: 0.00970\n",
"i: 99 MSE before: 0.00881 after: 0.00871\n"
]
}
],
"source": [
"W0 = 2*np.random.random((3,1)) - 1\n",
"\n",
"for i in range(100):\n",
" mse_before = MSE(y, forward(X,W0))\n",
" train(X, y, W0)\n",
" mse_after = MSE(y, forward(X,W0))\n",
" if i % 10 == 0 or i == 99:\n",
" print(\"i: %3i MSE before: %0.5f after: %0.5f\" % (i, mse_before, mse_after))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Two-Layer Network\n",
"\n",
"Now, let's design a 2-layer neural network and compare its results on the same problem.\n",
"\n",
"Here is the diagram. Note that the matrix dimensions required for the weights is shown below the weight boxes. Also, there is an activation function (Sigmoid) at each neuron, although it is not shown in the diagram for simplicity.\n",
"\n",
"![Two layer ANN](images/two_layer_nn.png)\n",
"\n",
"This network *does* have a single **_hidden layer_**. The outputs from the hidden layer become inputs to the output layer. This is the simplest neural network architecture that you will find in practice."
]
},
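{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make the matrix dimensions in the diagram concrete, here is an illustrative shape check using throwaway zero matrices (the real weights are created just below): a $(4,3)$ input times a $(3,2)$ weight matrix gives a $(4,2)$ hidden layer, which times a $(2,1)$ weight matrix gives the $(4,1)$ output."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative shape check for the diagram above (throwaway matrices; the\n",
"# real W0 and W1 are created in the next cell).\n",
"W0_demo = np.zeros((3,2))\n",
"W1_demo = np.zeros((2,1))\n",
"hidden_demo = np.dot(X, W0_demo) # (4,3) . (3,2) -> (4,2)\n",
"output_demo = np.dot(hidden_demo, W1_demo) # (4,2) . (2,1) -> (4,1)\n",
"print(X.shape, \"->\", hidden_demo.shape, \"->\", output_demo.shape)"
]
},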
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here are the initial weights for this network. Again, we randomly generate them just as before. Note the sizes of the matrices."
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [],
"source": [
"W0 = 2 * np.random.random((3,2)) - 1\n",
"W1 = 2 * np.random.random((2,1)) - 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here they are:"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[-0.36481251 -0.37203586]\n",
" [-0.81881763 0.65717183]\n",
" [-0.15073624 0.62975796]]\n",
"\n",
"[[-0.81306733]\n",
" [ 0.94892801]]\n"
]
}
],
"source": [
"print(W0)\n",
"print()\n",
"print(W1)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can use a list `[W0, W1]` to represent the weights for the whole network in function calls for simplicity:"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [],
"source": [
"NN2 = [W0, W1]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Define the forward and backward propogration functions:\n",
"\n",
"In this version, we will put the \"network\" first in the arg list."
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [],
"source": [
"def _forward(NN, X):\n",
" '''\n",
" This version returns the intermediate values\n",
" as well as the outputs. We need it for backprop\n",
" later.\n",
" '''\n",
" l0 = X\n",
" l1 = sigmoid( np.dot(l0, NN[0]) )\n",
" l2 = sigmoid( np.dot(l1, NN[1]) )\n",
" output = l2\n",
" return (output, l1, l0)\n",
"\n",
"def forward(NN, X):\n",
" '''\n",
" The forward function makes one forward pass \n",
" and returns only the outputs.\n",
" '''\n",
" return _forward(NN, X)[0]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This is simple enough. It starts just like before --- note that we are chaining the dot product and the sigmoid for brevity, but the steps are the same.\n",
"\n",
"The backpropagation function is a bit more complex, but it follows from the process above: Undo the sigmoid, determine the error due to each layer's weights, update the weights."
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [],
"source": [
"def backprop(NN, X, y):\n",
" # forward pass and get intermediate values\n",
" output, l1, l0 = _forward(NN, X) \n",
" # the backward pass looks like this:\n",
" # output_err --(sig')--> W1 --(sig')--> W0\n",
" # (err........>delta) (err..>delta)\n",
" \n",
" # Calculate the errors and deltas\n",
" output_err = y - output\n",
" w1_delta = output_err * sigmoid_deriv(output)\n",
" \n",
" w1_err = np.dot(w1_delta, NN[1].T)\n",
" w0_delta = w1_err * sigmoid_deriv(l1)\n",
" \n",
" # Update the weights\n",
" NN[0] += np.dot(X.T, w0_delta)\n",
" NN[1] += np.dot(l1.T, w1_delta)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now create a train function that will do a forward and a backward pass:\n",
"\n",
"*Note*: Since our new backprop function actually computes the forward pass first, there is no need to call `forward()` separately here -- this version of `backprop` is essentially `train`!"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [],
"source": [
"def train(NN, X, y):\n",
" backprop(NN, X, y)"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"i: 0 MSE before: 0.26109 after: 0.25533\n",
"i: 10 MSE before: 0.22680 after: 0.22415\n",
"i: 20 MSE before: 0.19386 after: 0.18958\n",
"i: 30 MSE before: 0.14395 after: 0.13847\n",
"i: 40 MSE before: 0.09312 after: 0.08885\n",
"i: 50 MSE before: 0.05867 after: 0.05614\n",
"i: 60 MSE before: 0.03882 after: 0.03738\n",
"i: 70 MSE before: 0.02735 after: 0.02650\n",
"i: 80 MSE before: 0.02036 after: 0.01982\n",
"i: 90 MSE before: 0.01583 after: 0.01547\n",
"i: 99 MSE before: 0.01301 after: 0.01275\n"
]
}
],
"source": [
"for i in range(100):\n",
" mse_before = MSE(y, forward(NN2,X))\n",
" train(NN2, X, y)\n",
" mse_after = MSE(y, forward(NN2,X))\n",
" if i % 10 == 0 or i == 99:\n",
" print(\"i: %3i MSE before: %0.5f after: %0.5f\" % (i, mse_before, mse_after))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Wait a minute -- shouldn't the more complex network be more accurate? Well, each weight is a *parameter* in the model. With more parameters to train, more training is required.\n",
"\n",
"Let's do 900 more epochs to make it an even 1000:"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"i: 100 MSE before: 0.01275 after: 0.01249\n",
"i: 200 MSE before: 0.00365 after: 0.00362\n",
"i: 300 MSE before: 0.00196 after: 0.00195\n",
"i: 400 MSE before: 0.00130 after: 0.00130\n",
"i: 500 MSE before: 0.00096 after: 0.00096\n",
"i: 600 MSE before: 0.00076 after: 0.00076\n",
"i: 700 MSE before: 0.00062 after: 0.00062\n",
"i: 800 MSE before: 0.00053 after: 0.00053\n",
"i: 900 MSE before: 0.00045 after: 0.00045\n",
"i: 999 MSE before: 0.00040 after: 0.00040\n"
]
}
],
"source": [
"for i in range(100,1000):\n",
" mse_before = MSE(y, forward(NN2,X))\n",
" train(NN2, X, y)\n",
" mse_after = MSE(y, forward(NN2,X))\n",
" if i % 100 == 0 or i == 999:\n",
" print(\"i: %3i MSE before: %0.5f after: %0.5f\" % (i, mse_before, mse_after))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Deeper models (more parameters) also means the model has potential to learn a more complex function, so let's make one up. \n",
"\n",
"## (Slightly) more complex function\n",
"\n",
"For this problem, we will simulate a dataset that represents values produced according to the \n",
"linear function $y = 2x_0 + 1x_1 + 0x_3 + d$. In fact, we will let $d$ be zero, since our network\n",
"doesn't have a bias parameter, so we need the y-intercept to be at 0. \n",
"This gives us two variables that \"matter\", and one that does not.\n",
"\n",
"We will generate samples by first generating values for the $x$ variables, then we will compute the \n",
"corresponding $y$ outputs, and add a bit of random noise so that the values look more like a natural\n",
"process."
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {},
"outputs": [],
"source": [
"# Define the vector of input samples as x0, x1, x2, \n",
"# with 100 samples each, sampled from a uniform distribution \n",
"# between 0 and 1.\n",
"X2 = np.random.uniform(low=0, high=1, size=(100,3))\n",
"\n",
"# Define a function f that represents the line that generates the output\n",
"# (response) variable \"perfectly\", without noise. We will add noise after \n",
"# calculating the output.\n",
"def f(x): \n",
" result = np.array(x).astype(float)\n",
" result[:,0] *= 2.0 # 2x_0\n",
" result[:,1] *= 1.0 # 1x_1\n",
" result[:,2] *= 0.0 # 0x_2 (this variable doesn't matter at all)\n",
" return np.sum(x, axis=1) # sum to create the y values from 2x_0 + 1x_1 + 0x_2 + 0\n",
"\n",
"# Now add some Gaussian noise to the \"perfect\" response values:\n",
"noise_stdev = 0.2 # sigma of the noise; we'll keep it small\n",
"# Gaussian noise error for each sample in x\n",
"noise = np.random.randn(X2.shape[0]) * noise_stdev\n",
"# Create the response variable `y`:\n",
"y2 = f(X2) + noise\n",
"# Make the y response a column vector:\n",
"y2 = np.reshape(y2,(100,1))"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {
"lines_to_next_cell": 2
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[0.89004455 0.69871283 0.82819791]\n",
" [0.05595717 0.01774589 0.63341429]\n",
" [0.26457044 0.6574532 0.90494819]\n",
" [0.20133986 0.8807249 0.16668785]\n",
" [0.42231825 0.97681259 0.54426161]\n",
" [0.5442228 0.53389683 0.01207421]\n",
" [0.18495185 0.80140111 0.94399568]\n",
" [0.73185868 0.30016923 0.35056698]\n",
" [0.4664781 0.4363943 0.37252552]]\n",
"\n",
"[[2.50751857]\n",
" [0.36895071]\n",
" [1.8154392 ]\n",
" [1.31503355]\n",
" [1.73178501]\n",
" [0.93544818]\n",
" [1.83700519]\n",
" [1.40086866]\n",
" [1.52521015]]\n"
]
}
],
"source": [
"print(X2[1:10,:])\n",
"print()\n",
"print(y2[1:10,:])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Encapsulate the Networks"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To compare the two networks, it will be better to have them encapsulated a bit more (we over-wrote the weights and training functions when we created the second one earlier).\n",
"\n",
"To do this, we will create Python classes for each."
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {},
"outputs": [],
"source": [
"class SingleLayerNN:\n",
" \"\"\"\n",
" Single layer neural network\n",
" \"\"\"\n",
" def __init__(self, X, y, activations=(\"sigmoid\",), learning_rate=1.):\n",
" if not isinstance(activations, tuple):\n",
" activations = (activations,) # Make sure that we have a tuple, even if a string is passed\n",
" self.learning_rate = learning_rate\n",
" self.X = X\n",
" self.y = y\n",
" self.W0 = 2 * np.random.random((3,1)) - 1\n",
" self.l0 = X\n",
" self.output = None\n",
" available_activations = {\n",
" \"sigmoid\": self.sigmoid,\n",
" \"linear\": self.linear,\n",
" \"tanh\": self.tanh\n",
" }\n",
" print(\"Using {} activation.\".format(activations[0]))\n",
" self.activation0 = available_activations[activations[0]]\n",
"\n",
" def sigmoid(self, x, deriv=False):\n",
" if deriv:\n",
" return x * (1-x)\n",
" return 1/(1+np.exp(-x))\n",
"\n",
" def linear(self, x, deriv=False):\n",
" if deriv:\n",
" return 1\n",
" return x\n",
"\n",
" def tanh(self, x, deriv=False):\n",
" if deriv:\n",
" return (1 - np.square(x))\n",
" else:\n",
" return np.tanh(x) \n",
" \n",
" def forward(self, X=None):\n",
" if X is None:\n",
" X = self.X\n",
" self.l0 = X\n",
" self.output = self.activation0(np.dot(self.l0, self.W0))\n",
" return self.output\n",
"\n",
" def backprop(self):\n",
" o_err = self.learning_rate * (self.y - self.output)\n",
" w0_delta = o_err * self.activation0(self.output, True)\n",
" self.W0 += np.dot(self.l0.T, w0_delta)\n",
" \n",
" def train(self, epochs = 1):\n",
" for i in range(epochs):\n",
" self.forward()\n",
" self.backprop()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And now the 2-layer network:"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [],
"source": [
"class TwoLayerNN:\n",
" \"\"\"\n",
" Two layer neural network\n",
" \"\"\"\n",
" def __init__(self, X, y, activations=(\"sigmoid\", \"sigmoid\"), learning_rate=1.):\n",
" self.learning_rate = learning_rate\n",
" self.X = X\n",
" self.y = y\n",
" self.W0 = 2 * np.random.random((3,2)) - 1\n",
" self.W1 = 2 * np.random.random((2,1)) - 1\n",
" self.l0 = X\n",
" self.l1 = None\n",
" self.output = None\n",
" available_activations = {\n",
" \"sigmoid\": self.sigmoid,\n",
" \"linear\": self.linear,\n",
" \"tanh\": self.tanh\n",
" }\n",
" print(\"Using activations: {}.\".format(\", \".join(activations)))\n",
" self.activation0 = available_activations[activations[0]]\n",
" self.activation1 = available_activations[activations[1]]\n",
"\n",
" def sigmoid(self, x, deriv=False):\n",
" if deriv:\n",
" return x * (1-x)\n",
" return 1/(1+np.exp(-x))\n",
"\n",
" def linear(self, x, deriv=False):\n",
" if deriv:\n",
" return np.ones_like(x)\n",
" return x\n",
" \n",
" def tanh(self, x, deriv=False):\n",
" if deriv:\n",
" return (1 - np.square(x))\n",
" else:\n",
" return np.tanh(x)\n",
"\n",
" def forward(self, X=None):\n",
" if X is None:\n",
" X = self.X\n",
" self.l0 = X\n",
" self.l1 = self.activation0(np.dot(self.l0, self.W0))\n",
" self.output = self.activation1(np.dot(self.l1, self.W1))\n",
" return self.output\n",
"\n",
" def backprop(self):\n",
" # Calculate error and deltas\n",
" o_err = self.learning_rate * (self.y - self.output)\n",
" w1_delta = o_err * self.activation1(self.output, True)\n",
"\n",
" w1_err = np.dot(w1_delta, self.W1.T)\n",
" w0_delta = w1_err * self.activation0(self.l1, True)\n",
" # Apply updates to weights\n",
" self.W0 += np.dot(self.X.T, w0_delta)\n",
" self.W1 += np.dot(self.l1.T, w1_delta)\n",
"\n",
" def train(self, epochs = 1):\n",
" for i in range(epochs):\n",
" self.forward()\n",
" self.backprop()"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using sigmoid activation.\n",
"Using activations: sigmoid, sigmoid.\n",
"0.008829302126346193\n",
"0.15354573348953232\n"
]
}
],
"source": [
"# Just check to see that the classes work on the \"easy\" training set:\n",
"onelayer = SingleLayerNN(X,y)\n",
"twolayer = TwoLayerNN(X,y)\n",
"\n",
"onelayer.train(100)\n",
"twolayer.train(100)\n",
"\n",
"print(MSE(y, onelayer.forward(X)))\n",
"print(MSE(y, twolayer.forward(X)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, create networks for the `X2`->`y2` (linear polynomial) dataset. One single-layer and one double layer.\n",
"\n",
"Then, train them both for 1000 epochs and print the MSE at the end."
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {
"lines_to_next_cell": 0
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using sigmoid activation.\n",
"Using activations: sigmoid, sigmoid.\n",
"0.627656628448089\n",
"0.6277107258122931\n"
]
}
],
"source": [
"poly_onelayer = SingleLayerNN(X2, y2)\n",
"poly_twolayer = TwoLayerNN(X2, y2)\n",
"\n",
"poly_onelayer.train(100)\n",
"poly_twolayer.train(100)\n",
"\n",
"print(MSE(y2, poly_onelayer.forward(X2)))\n",
"print(MSE(y2, poly_twolayer.forward(X2)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"What do the predictions look like, versus the truth?"
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"y, predicted, error\n",
"1.542 1.000 0.5424\n",
"2.508 1.000 1.5075\n",
"0.369 1.000 -0.6309\n",
"1.815 1.000 0.8154\n",
"1.315 1.000 0.3150\n",
"1.732 1.000 0.7318\n",
"0.935 1.000 -0.0646\n",
"1.837 1.000 0.8370\n",
"1.401 1.000 0.4009\n",
"1.525 1.000 0.5252\n",
"1.604 1.000 0.6035\n",
"1.683 1.000 0.6833\n",
"1.481 1.000 0.4813\n",
"1.982 1.000 0.9823\n",
"1.630 1.000 0.6296\n",
"1.527 1.000 0.5273\n",
"1.090 1.000 0.0899\n",
"1.687 1.000 0.6866\n",
"2.555 1.000 1.5550\n",
"1.779 1.000 0.7795\n"
]
}
],
"source": [
"preds = poly_onelayer.forward(X2)\n",
"print(\"y, predicted, error\")\n",
"for i in range(20):\n",
" print(\"%0.3f %0.3f %0.4f\" % (y2[i,0], preds[i,0], y2[i,0]-preds[i,0]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice something interesting? All the predictions are in the range $[0,1]$. Well, of course they are! We pass the output layer through a Sigmoid activation, so we squash them to that range on purpose. But the real $y$ values are not in that range. What we need here is a linear regression, not a value in $[0,1]$.\n",
"\n",
"To accomplish this, you need a *linear* activation layer on the output layer. The linear layer is:\n",
"$$\n",
"y = x\n",
"$$\n",
"(pass the input directly through). It's derivative is 1. (Or, in matrix notation, it is a matrix of 1's the same shape as the input.)\n",
"\n",
"It is implemented in the classes above, with the ability to choose it by passing the string `\"linear\"` to the `activations` parameter of the constructor at whichever layer we want to use the linear activation. For our one-layer network, we will try the linear activation at the only place we can --- after layer0. On the two-layer network, we will only use the linear activation after the second (output) layer.\n",
"\n",
"So let's try again:"
]
},
{
"cell_type": "code",
"execution_count": 39,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: nan\n",
"Two layer: nan\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"/usr/local/lib/python3.7/site-packages/ipykernel_launcher.py:48: RuntimeWarning: invalid value encountered in add\n",
"/usr/local/lib/python3.7/site-packages/ipykernel_launcher.py:53: RuntimeWarning: invalid value encountered in multiply\n"
]
}
],
"source": [
"poly_onelayer = SingleLayerNN(X2, y2, \"linear\")\n",
"poly_twolayer = TwoLayerNN(X2, y2, (\"tanh\", \"linear\"))\n",
"\n",
"poly_onelayer.train(1000)\n",
"poly_twolayer.train(1000)\n",
"\n",
"print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
"print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"OK, what's going on with the single layer network? The `nan` is happening because the weights are \"exploding\". Look at the weights after training:"
]
},
{
"cell_type": "code",
"execution_count": 40,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[nan]\n",
" [nan]\n",
" [nan]]\n"
]
}
],
"source": [
"print(poly_onelayer.W0)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The issue is that for the linear activation layer, we need to \"take our time\" when we adjust the weights. Let's introduce a *learning rate* that will serve to control the amount of error that is actually backpropagated on each epoch. The classes above already have this capability, as a parameter on their constructor, so let's set the learning rate to 0.01 and try again."
]
},
{
"cell_type": "code",
"execution_count": 41,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: sigmoid, linear.\n",
"One layer: 0.029093942503251993\n",
"Two layer: 0.06462544272282988\n"
]
}
],
"source": [
"poly_onelayer = SingleLayerNN(X2, y2, activations=(\"linear\"), learning_rate=0.01)\n",
"poly_twolayer = TwoLayerNN(X2, y2, activations=(\"sigmoid\", \"linear\"), learning_rate=0.01)\n",
"\n",
"poly_onelayer.train(100)\n",
"poly_twolayer.train(100)\n",
"\n",
"print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
"print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))"
]
},
{
"cell_type": "code",
"execution_count": 42,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"y, predicted(1L), error(1L), predicted(2L), error(2L)\n",
"1.542 1.401 0.1419 1.510 0.0325\n",
"2.508 2.430 0.0775 2.052 0.4555\n",
"0.369 0.665 -0.2961 0.801 -0.4317\n",
"1.815 1.800 0.0152 1.755 0.0604\n",
"1.315 1.280 0.0348 1.572 -0.2566\n",
"1.732 1.963 -0.2311 1.925 -0.1936\n",
"0.935 1.138 -0.2023 1.395 -0.4598\n",
"1.837 1.902 -0.0650 1.831 0.0064\n",
"1.401 1.407 -0.0060 1.491 -0.0906\n",
"1.525 1.289 0.2359 1.449 0.0765\n",
"1.604 1.325 0.2781 1.426 0.1778\n",
"1.683 1.601 0.0822 1.508 0.1751\n",
"1.481 1.290 0.1917 1.449 0.0327\n",
"1.982 2.184 -0.2013 2.007 -0.0251\n",
"1.630 1.960 -0.3303 1.839 -0.2093\n",
"1.527 1.589 -0.0614 1.700 -0.1731\n",
"1.090 1.168 -0.0781 1.374 -0.2838\n",
"1.687 1.376 0.3103 1.540 0.1462\n",
"2.555 2.495 0.0600 2.095 0.4598\n",
"1.779 1.645 0.1340 1.788 -0.0085\n"
]
}
],
"source": [
"preds1 = poly_onelayer.forward(X2)\n",
"preds2 = poly_twolayer.forward(X2)\n",
"print(\"y, predicted(1L), error(1L), predicted(2L), error(2L)\")\n",
"for i in range(20):\n",
" print(\"%0.3f %0.3f %0.4f %0.3f %0.4f\" % (y2[i,0], preds1[i,0], y2[i,0]-preds1[i,0], preds2[i,0], y2[i,0]-preds2[i,0]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"That's much better. \n",
"\n",
"Let's adjust one more thing: What if we allow the hidden layer in the 2-layer network to potentially produce negative outputs. We can change the activation on the hidden layer from Sigmoid to Tanh which gives outputs in the range $(-1,1)$."
]
},
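{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick illustrative comparison (an aside, using the global `sigmoid` defined earlier): the same inputs through sigmoid land in $(0,1)$, while tanh lands in $(-1,1)$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative: the same inputs through sigmoid vs. tanh.\n",
"for v in (-3, -1, 0, 1, 3):\n",
" print(\"x: %2i sigmoid: %.4f tanh: %7.4f\" % (v, sigmoid(v), np.tanh(v)))"
]
},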
{
"cell_type": "code",
"execution_count": 43,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909394017088954\n",
"Two layer: 0.0358061125295244\n"
]
}
],
"source": [
"poly_onelayer = SingleLayerNN(X2, y2, activations=(\"linear\"), learning_rate=0.01)\n",
"poly_twolayer = TwoLayerNN(X2, y2, activations=(\"tanh\", \"linear\"), learning_rate=0.01)\n",
"\n",
"poly_onelayer.train(100)\n",
"poly_twolayer.train(100)\n",
"\n",
"print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
"print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Interesting! The two layer network seems to be converging faster now; it perhaps comparible to the 1 layer network. \n",
"\n",
"Let's repeat the trial and see if that is the case:"
]
},
{
"cell_type": "code",
"execution_count": 44,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.029093936579457257\n",
"Two layer: 0.03405943270583079\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.029093942015501598\n",
"Two layer: 0.08096332049593732\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.029093939299101458\n",
"Two layer: 0.03277308905519685\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393763373747\n",
"Two layer: 0.039892649300335814\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.029093936992770134\n",
"Two layer: 0.037743835989540536\n",
"\n"
]
}
],
"source": [
"for i in range(5):\n",
" poly_onelayer = SingleLayerNN(X2, y2, activations=(\"linear\"), learning_rate=0.01)\n",
" poly_twolayer = TwoLayerNN(X2, y2, activations=(\"tanh\", \"linear\"), learning_rate=0.01)\n",
"\n",
" poly_onelayer.train(100)\n",
" poly_twolayer.train(100)\n",
"\n",
" print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
" print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))\n",
" print()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Looks pretty consistent. Now, lets move the number of epochs up to 1000:"
]
},
{
"cell_type": "code",
"execution_count": 45,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.06052073592087871\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.04601340269436041\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.0650053194663348\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.06470985216466407\n",
"\n",
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.05775805105618975\n",
"\n"
]
}
],
"source": [
"for i in range(5):\n",
" poly_onelayer = SingleLayerNN(X2, y2, activations=(\"linear\"), learning_rate=0.01)\n",
" poly_twolayer = TwoLayerNN(X2, y2, activations=(\"tanh\", \"linear\"), learning_rate=0.01)\n",
"\n",
" poly_onelayer.train(1000)\n",
" poly_twolayer.train(1000)\n",
"\n",
" print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
" print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))\n",
" print()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The 2-layer network didn't perform so well now. \n",
"\n",
"Let's keep track of the training progress as we go. \n",
"This time, we will one epoch at a time and record the progress after each time, for a total of 3000 epochs.\n",
"Then, we will plot the MSE over time."
]
},
{
"cell_type": "code",
"execution_count": 46,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.060749503736278854\n",
"\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAVAAAADjCAYAAADTwUy2AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3Xl8VNX5+PHPQwyEfUvcCAhq0CJqgGBFxQ0RtCp1RRQrbnyLla9WbcVCLWK1qP1a9adtv/4UrWhBpVWpxUJdKrhiWFQWgbBJFCUGEAGRBJ7vH+dOchMmyWQyc2d73q/XvOYu59555jI8Oeeee88VVcUYY0zjNUt0AMYYk6osgRpjTJQsgRpjTJQsgRpjTJQsgRpjTJQsgRpjTJQsgRpjTJQsgZqUJSLrRGS3iOTWWr5IRFREuotIvoj8TUS+FpFvRGSJiIzyynX3ym2v9RqekC9kUs5+iQ7AmCZaC4wA/h+AiBwNtPKtnwp8BBwCfA8cDRxYax8dVLUy/qGadGM1UJPqpgI/8c1fCTztm+8PPKWqO1S1UlUXqeqrgUZo0pYlUJPq3gfaicgPRCQLuBR4ptb6R0XkUhHplpAITdqyBGrSQagWOhhYDnzuW3cxMA/4NbBWRBaLSP9a238tIlt9rx8EErVJeXYO1KSDqcBcoAc1m++o6hZgHDDO62z6PfCSiOT7iuXaOVATDauBmpSnqutxnUlnA3+vp9zXuAR6MNApmOhMOrMEatLFNcDpqrrDv1BE7hWR3iKyn4i0BcYAJapanpAoTVqxBGrSgqquVtXiMKtaAS8CW4E1uMuZzqtVZmut60BvjnO4Jk2IDahsjDHRsRqoMcZEyRKoMcZEyRKoMcZEyRKoMcZEyRKoMcZEKZA7kURkCnAOsElVe4dZfzlwGyDAt8AYVf2oof3m5uZq9+7dYxytMSbTLViw4GtVzWuoXFC3cj4FPEKt2+x81gKnqOoWETkLeAz4YUM77d69O8XF4S79M8aY6InI+kjKBZJAVXWuiHSvZ/27vtn3gfy6yhpjTLJIxnOg1wA2XqMxJukl1WhMInIaLoGeVE+Z0cBogG7dbHhHY0ziJE0CFZFjgMeBs+ob6EFVH8OdI6WoqMjuQzUZp6KigtLSUnbt2pXoUFJeTk4O+fn5ZGdnR7V9UiRQb6TwvwNXqOrKRMcTD3v3wuTJMGYMdOyY6GhMKistLaVt27Z0794dEUl0OClLVSkvL6e0tJQePXpEtY+gLmOaBpwK5IpIKfAbIBtAVf8M3AF0Bv7o/SAqVbUoiNiCMmcOjB8PS5fCs88mOhqTynbt2mXJMwZEhM6dO1NWVhb1PoLqhR/RwPprgWuDiCVRvv/evW/fntg4THqw5BkbTT2OydgLb4xJYuXl5RQWFlJYWMiBBx5Ily5dquZ3797d5P2fdNJJLF68OAaRxl9SnAM1xqSOzp07VyW4iRMn0qZNG2699dYERxW5yspK9tsvNqnPaqDGmJj43e9+xx//+EcAxo4dy5lnngnAnDlzuPLKKwF45plnOProo+nduze/+tWvGtzn6NGjKSoq4qijjmLSpElV+7vooouqyrz66qtcfPHFVdMDBgygb9++DB8+nB073BNe8vPzGTduHH369OHFF1+M2Xe2GqgxKeymmyDWrd3CQnjwwcZvN3DgQB599FGuv/56Fi5cSEVFBXv27GHevHmcfPLJlJaWMmHCBIqLi2nfvj1nnHEGr7zyCuecc06d+5w8eTKdOnWisrKS0047jYsuuogzzjiDG264gfLycjp37syTTz7J1VdfzaZNm5g8eTKvv/46rVq14u677+ahhx6qStT7778/ixYtivawhGU10AisWgUisHx50/dlT1Ax6ap///58+OGHbN26lTZt2tC/f38WLlzIvHnzGDhwIB988AGnn346ubm5ZGdnc9lllzF37tx69zlt2jT69u1L3759Wb58OcuWLaNZs2Zcfvnl/PWvf2Xz5s0sWLCAM888k3fffZdly5ZxwgknUFhYyLPPPsu6deuq9jV8+PCYf2ergUbg+efd+zPPwN13JzYWY/yiqSnGS4sWLejSpQtPP/00J554Ij179uT1119n/fr19OzZk08++aRR+1u1ahUPPfQQ8+fPp0OHDowcObLq5oGrr76aCy+8EHCJMSsrC1Vl6NChTJ06Nez+Wrdu3bQvGIbVQCMQy1qjXX1i0tnAgQP5/e9/z8knn1zVpC8qcpd0//CHP+TNN9+kvLycyspKpk+fzimnnFLnvrZt20bbtm1p164dGzduZPbs2VXrunbtSm5uLpMnT2bUqFEAnHDCCbz11lusWbMGgB07drBq1ar4fVmsBmqMiaGBAwdy//33c/zxx5OTk0N2djYDBw4EXEfOXXfdxamnnoqqcu655/KjH/2ozn317duXXr16ceSRR3LIIYdw4okn1lh/2WWXsW3bNnr27AnAAQccwBNPPMHw4cOrLqe65557KCgoiNO3tQRqjGmCiRMn1pgfMmQI34fuGoGq2mDIyJEjGTlyZL37fPvtt6um62qOh8pdd911NZYNHjyYwYMH71O2tLS03s+MliXQRrDmtzHJobCwkI4dO/Lwww8nNA5LoMaYlJMsdypZJ1LA7DImY9KHJdAIWNIzxoRjCTRgdh7VmPRhCTRgVps1Jn1YAm0Eqz0a45SWljJs2DAKCgo47LDDuPHGG2MylB3AqFGjmDFjRkz2FW+WQANmSdikOlXlggsu4Mc//jGrVq1i5cqVbN++nfHjxyc6tIhUVlbGbF+BJFARmSIim0RkSR3rRUQeFpESEflYRPoGEZcxpvHeeOMNcnJyuOqqqwDIysriD3/4A1OmTGHnzp089dRTXHDBBQwdOpSCggJ++ctfVm07Z86cquHmLr74YrY38IiGSZMm0b9/f3r37s3o0aNRVVavXk3fvtUpYtWqVVXzCxYs4JRTTqFfv34MGTKEjRs3AnDqqady0003UVRUxEMPPRSzYxHUdaBPAY8AT9ex/iygwHv9EPiT954UYnne0s6BmphKwHh2S5cupV+/fjWWtWvXjm7dulFSUgK46zQXLVpEixYtOOKIIxg7diwtW7bkt7/9La+99hqtW7fm3nvv5YEHHuCOO+6o87NuuOGGqvVXXHEFr7zyCueeey7t27dn8eLFFBYW8uSTT3LVVVdRUVHB2LFjefnll8nLy+O5555j/PjxTJkyBYDdu3dTXFzc1KNTQ1DPRJorIt3rKTIMeFpVFXhfRDqIyEGqujGI+ABefRWyssAbA9YY0wSDBg2iffv2APTq1Yv169ezdetWli1bVnVP++7duxkwYEC9+3nzzTe577772LlzJ5s3b+aoo47i3HPP5dprr+XJJ5/kgQce4LnnnmP+/PmsWLGCJUuWVN3KuWfPHg466KCqfaXzcHZdgA2++VJv2T4JVERGA6MBunXrFrMAzj7bvce7hmjnQE1MJWA8u169eu3TybNt2zY+++wzDj/8cBYuXEiLFi2q1mVlZVFZWYmqM
njwYKZNmxbR5+zatYvrr7+e4uJiunbtysSJE6uGs7vwwgu58847Of300+nXrx+dO3fmiy++4KijjuK9994Luz8bzg5Q1cdUtUhVi/Ly8gL97FgkP2vCm1Q3aNAgdu7cydNPuzNye/bs4ZZbbmHUqFG0atWqzu2OP/543nnnnapm/o4dO1i5cmWd5UPJMjc3l+3bt9dI2jk5OQwZMoQxY8ZUnYs94ogjKCsrq0qgFRUVLF26tGlftgHJkkA/B7r65vO9ZcaYJCMivPjii7zwwgsUFBTQs2dPcnJyuOeee+rdLi8vj6eeeooRI0ZwzDHHMGDAAD799NM6y3fo0IHrrruO3r17M2TIEPr3719j/eWXX06zZs2qnr3UvHlzZsyYwW233caxxx5LYWEh7777btO/cH1UNZAX0B1YUse6HwGvAgIcD8yPZJ/9+vXTWHF1w/Dr7rzTrfv1r6Pf/0svuX2cd170+zBGVXXZsmWJDiEp3H///TphwoQm7yfc8QSKNYIcFMg5UBGZBpwK5IpIKfAbINtL4H8GZgFnAyXATuCqIOKKlPXCG5Nczj//fFavXs0bb7yR0DiC6oUf0cB6BX4WRCyJYp1HxsROLB9N3BTJcg407VnN05j0Ywm0EawWaZKF2l/kmGjqcbQEGhBLviZWcnJyKC8vtyTaRKpKeXk5OTk5Ue8jWS6kT3v2Wzexkp+fT2lpKWVlZYkOJeXl5OSQn58f9faWQCNgz4U3ySQ7O5sePXokOgyDNeEDZzVRY9KHJdCAWM3TmPRjCTQgVvM0Jv1YAjXGmChZAm2EpjTDrQlvTPqxBBoQa8Ibk34sgRpjTJQsgUYgFrVHa8Ibk34sgRpjTJQsgQbEzoEak34sgTaCNcONMX6WQGNo4UJYuzb8Oku+xqSfwBKoiAwVkRUiUiIi48Ks7yYib4rIIhH5WETODiq2WOnXDw49NPw6a8Ibk34CSaAikgU8CpwF9AJGiEivWsUmAM+rah/gUuCPQcQWiXgnv5074Ysv4vsZxpjYC6oGehxQoqprVHU3MB0YVquMAu286fZAWqWU+prwgwZBly7BxWKMiY2gEmgXYINvvtRb5jcRGOk9tXMWMDbcjkRktIgUi0hxugwo+/77iY7AGBONZOpEGgE8par5uEccTxWRfeJT1cdUtUhVi/Ly8gIN0DqCjDF+QSXQz4Guvvl8b5nfNcDzAKr6HpAD5AYSnTHGRCGoBPohUCAiPUSkOa6TaGatMp8BgwBE5Ae4BJoebXRjTFoKJIGqaiVwAzAbWI7rbV8qIpNE5Dyv2C3AdSLyETANGKVJ8tjBWESRHN/EGBNLgT1UTlVn4TqH/Mvu8E0vA04MKh5jjGmqZOpESmvWAWVM+rEE2giWBI0xfpZAjTEmSpZAjTEmSpZAI2A96MaYcCyBGmNMlCyBGmNMlCyBNoL1whtj/CyBGmNMlCyBGmNMlCyBRsB64Y0x4TSYQEXk4lrzR9SavynWQRljTCqIpAb6RK3592rNT4pRLMYYk1IiSaC1+54bmjfGmIwQSQKtfQawoXljjMkIEY0HKiKCq2lKuPlMEYvrQOvrkFK1a02NSSWRJNA2QKVvXnzzgtVAjTEZKpIE2iMWHyQiQ4GHgCzgcVWdHKbMJbjHGyvwkapeFovPTib11TCtBmpMamkwgarq+nDLRaSjqm6J5ENEJAt4FBiMeyb8hyIy03uMR6hMAXA7cKKqbhGR/SPZdxBieR1oQ014Y0zqiOQ60J+IyBDffJGIbAC+FpEVta8LrcNxQImqrlHV3cB0YFitMtcBj4aSsqpuivhbGGNMAkTSC38r8KVv/jHgNeAY7/3+CPbRBdjgmy/1lvn1BHqKyDsi8r7X5N+HiIwWkWIRKS4rS72nHjfUhDfGpI5IzoF2BT4BEJGuwNHAGaq6WUTGASUxjKUAOBXIB+aKyNGqutVfSFUfwyVxioqKAk058T4/aQnUmNQSSQ20EmjuTZ8AfKqqm735nUDLCPbxOS4Rh+R7y/xKgZmqWqGqa4GVuISaVuwcqDHpI5IE+hZwt4gcA4wF/uFbdyQ1m/d1+RAoEJEeItIcuBSYWavMS7jaJyKSi2vSr4lg3ynFetmNSR+RJNAbgT7AO7ga572+dVcA/2poB6paCdwAzAaWA8+r6lIRmSQi53nFZgPlIrIMeBP4haqWR/xN4igWNcNI9mE1UGNSSyTnQLOAUVRfNN9eRNp76/4Y6Qep6ixgVq1ld/imFbjZe6Ut60QyJn1EkkDXUfNuo9opQHFJ1hhjMkokTfiPgFXABKA7kF3r1bzOLdNMU85fWhPemPTTYAJV1T7ARUAn3HnQWbhOoOaqukdV98Q3xPRiTXhj0kdEj/RQ1SWq+gtcDfQB4Bxgo4j0jWNsxhiT1Br7TKQC4BRgALAIiOhe+FRnvfDGmHAa7EQSkU7ACOBKoC0wFThZVT+Lc2xpJZQcrQlvTPqIpBf+C2AtLnG+7y07XEQODxVQ1TfiEJsxxiS1SBLol0AObrSk68KsV+DQWAaVrOxeeGOMXyTjgXYPII60Z014Y9JPYzuRMpIlNmNMOJZAG8Ga8MYYP0ugAbEmvDHpxxKoMcZEyRJoBIKqGVoN1JjUYgk0INaENyb9WAINmI1Ib0z6sATaCNYLb4zxCyyBishQ7znyJd7TPOsqd6GIqIgUBRVbEGwwEWPSTyAJVESygEeBs4BewAgR6RWmXFvcM5g+CCKuRLAmvDHpI6ga6HFAiaquUdXdwHRgWJhyd+EeWrcroLgikohe+A0bYMmSYD7XGBOdoBJoF2CDb77UW1bFG5y5q6r+s74dichoESkWkeKysrLYRxonjW3Cd+sGRx8dv3iMMU2XFJ1IItIMN9L9LQ2VVdXHVLVIVYvy8vLiH1yMWRPemPQRVAL9HOjqm8/3loW0BXoD/xGRdcDxwMx060hqiHUiGZNagkqgHwIFItJDRJrjHko3M7RSVb9R1VxV7e4Nn/c+cJ6qFgcUX9xZL7wx6SeQBKqqlcANwGxgOfC8qi4VkUkicl4QMcRCLJrfdieSMekjkhHpY0JVZ+EeiexfdkcdZU+NRwzTp8OBB8Kpcdl7/Sw5GpN+kqITKSi33w5TpiQ2BquBGpM+MiqBNmsWXZKy0ZiMMeFkVAIVgb17E/PZlhyNST8Zl0ATncisCW9M+sioBBptEz4kkaMxff89LF0a3883xjRORiXQVG7C/+xn0Ls3fPllbOIxxjRdxiXQRDeTo23Cz5vn3r/5JrbxGGOiZwk0AtYLb4wJJ6MSaCTnQOOVxKLdb2g7S67GJJ+MSqCRnAONdwJtbBN+z574xGOMabqMS6Cp1gsfSqA2DJ4xySejEmg8m/DxqrmGEqg14Y1JPhmVQNOhCW81UWOSR8Yl0GhqoEGN5RlJArWaqDHJI6MSaFPvREoES6DGJK+MSqDxbMJHWrNtbBO+dryJupPKGLOvjEugTelESmQvfIglUGOSR2AJVESGisgKESkRkXFh1t8sIstE5GMReV1EDol1DKncC197/ttv3QAjxpjECSSBikgW8ChwFtAL
GCEivWoVWwQUqeoxwAzgvtjHEb8aXLya8HXVQNu1g+OOizw+Y0zsBVUDPQ4oUdU1qrobmA4M8xdQ1TdVdac3+z7u0ccxFc9e+FhobBP+44/jG48xpn5BJdAuwAbffKm3rC7XAK+GWyEio0WkWESKy8rKGhVEU8+BxmO7htg5UGOSV9J1IonISKAIuD/celV9TFWLVLUoLy+vUftu1iz1euFr34lkCdSY5BHUY40/B7r65vO9ZTWIyBnAeOAUVY15F0k874WPdL9NvYzJBhcxJnkEVQP9ECgQkR4i0hy4FJjpLyAifYD/Bc5T1U3xCCKVm/ChxGs1UGOSRyAJVFUrgRuA2cBy4HlVXSoik0TkPK/Y/UAb4AURWSwiM+vYXdQSeRlTtLeDRtKE37gRlixpeP/GmNgKqgmPqs4CZtVadodv+ox4xxDtnUiRJL9Ia4axuozJ79prYdYsqKyErKzI4jDGNF3SdSLFUzyfiRTvGmhd8+CSJ8Du3Q1/hjEmdjIqgSZ7Ez6ccOdA66rt7t4N69bBe+9F91nGmMYJrAmfDJo6mEgieuHDnQOtqye+ogJ69GhcPMaY6GVUDTTaXvjWrd17fdftx+L60khHY6orgfqb8Ha5kzHxl3E10GhqZtnZ7n3FirrLxLsJ75+vrAxf1p9At2yBGTNcrXTs2Og+2xhTv4xKoNHeiRRKYp9+2rjtwolFL3wkCfS772DMGDdtCdSY+LAmfC3hmr6hpLtyZd3JK6he+MYkUL+TT4Zp0xqOwRgTuYxKoBM+uYThq++ut8z69fsuCyXQigpYuzb8dpHWbBs7KHNjOpHqSqDbtsG8eXDZZW773/wGdu1qXBzGmH1lVAI98rvF5G9dwsaNdZcJ10z3J8fly8NvF++HyoXeKytr1kD929SVQD/3jTpwyy0waRJcdRW8/TZcd13T4zYmU2VUAm3esRWt2Mm77+67rl079x4uQUYyBqc/kYWrjTa1CR+quX7/fc0E6v8sfwLdtq16evXq6umOHd37/vvDwIHw+OPuVlARKCpyMfzhD/ueAjDG7CujEmjLzq1om7WT2bP3Xbef15320Uf7rtu7F9q0gd69Xa0tHH/yC3dfeijpNbamGkqQzbx/qe++q/koD38y9TfL/bXsVauqp7durd5PyFtvufcFC+C+++Dmm2HoUDjhBJdYd+yAww6Dn//cJeneveHLL+Grr+DZZ6v3++230X1HY1JVRiXQZm1a06XDTl56ad+OmNB/+rlz9123d69LYAMHwjvvhL9l0l8TnDt33/UVFe7dXxusLdx+a9dAv/vOJbRw22zeXD3t/xx/Qv/ss5rvAH/7W/V06C6mb7+tnh43DtasgQcfhBEjYOlSl0QPPBBGjnTnVDt2dLX46693x6pHDxeziDv3GpoeNKh62l72Cuq1ZUvd/++aIqMSKJ060aX5JsrK4O9/r7lKFfLzXeKYM6fmulACPfdc2L4dXn55312HEiTACy/Uvf6tt9w+wvn6632XhRJoqAa6a1fN7f01zQ2+Mf/9SXPevOrpN95w7/5a+IwZ1dOh77ZoUfWyRx6png4dt/Ly6mWTJlVP/+lP7n3duupl/t7/0OcbE6T58+Oz34y6DpQjjqD1jBkcc8T3/OIXLTjxROjiPVikogLOP98liAkT4LTToGVLt27vXvdX7Mwz4fDD4fbb4fTToXPn6l2HaoKFha4G+vLLMMz31Cd/E/7ee+Guu6rX5eS4xDh7NpxzTs2QQ83ikLKymjVQ/ykHf9ILNcuhZhN+yxbo1Km6ttq3LyxcCMcfD6Wl7vXrX8P990NuLtx6K9x0E4wf72J85hmXcIcPdw+1u+02V6t8+GG37RNPwGuvwRVXuGM1cSL07OnWt2zpOq1WrIALLnC14PXrXTw33uj2ecIJ7sf+1lswdao7lmPGQEGBG3Xqpz91ManCX/4Cd94J//oXfPONqx0/9RQceywccgjMnAk/+pE71VBSAmecAcXF0KIFHHkkfPCBizE72x2j3r3d8d2+3Z2yWLvW3YXWoYOL9cAD3W9h82Y3/c03br5TJ/fHr2VLaN7cLW/Xzv3x27UL2raFnTvdb6hlS7f/Fi3cyFnffeeWVVa68i1auN+SiDuttHu3exdxZbKzXTlVt7yy0q3Lyqo5GtfevW56715XNivLbReqkYUqBao110NkZf2nlvbsqf4DH+5zw5UVcdNZWdX7DV2nXftzQ/9vRKiTSN2njrp3h0MPrXvbJlHVlH3169dPG+X551VBlz3+jrZurdqxo+qtt6rOmuX+CSdMUH3pJTfdq5fqI4+oLlqkesklqgcf7Hbx1luqLVqo5uer/va3qnPnqn75perHH7vtpk5V7dNHdb/9VK+/XvVf/1ItLVUdP15VRHXkSFfukktUX35Z9bPPVPPyQj8h1eHDVWfMqJ7v2FH1rruq5zt0UB04sHq+W7fqaX+Z0PtJJ7npq69WPfZY1ebNVV97TfW441SvuUa1pER1xAjVDz9U/eQTd4j27FFdu1Z1wwb3nbdsadxhNibVAcUaQQ4STeEz/kVFRVpcXBz5Btu2wcEHQ8+efHX2Vbzwz1Ys+Gg/KtX9mRt5OQwZAosXw/Tp8PkX1Zvmd4Hf/c5Nr17tmqUrVu77Ebf9Enr0cOvfeQcqfddstmvramIvveRqTbt8nUHDvGGlZ8+uXj78Ene+cclSN3/uOa72tnIVdDnYXRw/c6ar7VxxBbwwAw460NWkX3kF+vRxNamPP4b+/d1f+127qq84MCZjDBrk/u9HSEQWqGpRg+UyKoGC6zH5r/+qeRLPGJPe5syBwYMjLh5pAg3sHKiIDAUeArKAx1V1cq31LYCngX5AOTBcVdfFPJALL3Qn4DZtcieYKirsuhtj0t1BB8Vlt4EkUBHJAh4FBuOeCf+hiMxU1WW+YtcAW1T1cBG5FLgXGB6ngOCAA+Kya2NM5gjqMqbjgBJVXaOqu4HpwLBaZYYBf/GmZwCDROrrdzPGmMQKKoF2AXxXKVLqLQtbRt1TPL8BOtcqg4iMFpFiESkuq2+EY2OMibOUu5BeVR9T1SJVLcrLy0t0OMaYDBZUAv0c6Oqbz/eWhS0jIvsB7XGdScYYk5SCSqAfAgUi0kNEmgOXAjNrlZkJXOlNXwS8oal8jZUxJu0Fdh2oiJwNPIi7jGmKqt4tIpNwV/zPFJEcYCrQB9gMXKqqaxrYZxmwvpGh5AJh7jpPOhZn7KVKrBZn7DU21kNUtcFzhCl9IX00RKQ4kgtkE83ijL1UidXijL14xZpynUjGGJMsLIEaY0yUMjGBPpboACJkccZeqsRqccZeXGLNuHOgxhgTK5lYAzXGmJjImAQqIkNFZIWIlIjIuETHAyAi60TkExFZLCLF3rJOIvJvEVnlvXf0louIPOzF/7GI9I1jXFNEZJOILPEta3RcInKlV36ViFwZ7rPiEOdEEfncO6aLvcvnQutu9+JcISJDfMvj+tsQka4i8qaILBORpSJyo7c8qY5pPXEm4zHNEZH5IvKRF+ud3vIeIvKB97nPededIyItvPkSb333hr5
DRCIZdTnVX7hrT1cDhwLNgY+AXkkQ1zogt9ay+4Bx3vQ44F5v+mzgVUCA44EP4hjXyUBfYEm0cQGdgDXee0dvumMAcU4Ebg1Ttpf3794C6OH9HrKC+G0ABwF9vem2wEovnqQ6pvXEmYzHVIA23nQ28IF3rJ7HXUMO8GdgjDd9PfBnb/pS4Ln6vkOkcWRKDTSS0aCShX9Uqr8AP/Ytf1qd94EOIhKXQQ5VdS7uZoamxDUE+LeqblbVLcC/gaEBxFmXYcB0Vf1eVdcCJbjfRdx/G6q6UVUXetPfAstxg+ck1TGtJ866JPKYqqqGHq+Y7b0UOB03mhvse0zDjfZW13eISKYk0EhGg0oEBeaIyAIRGe0tO0BVQ8/a/BIIDVya6O/Q2LgSGe8NXtN3SqhZXE88gcbpNR374GpMSXtMa8UJSXhMRSRLRBYDm3B/TFYDW9WN5lb7c+sa7a1JsWZKAk2yQHN0AAAFPklEQVRWJ6lqX+As4GcicrJ/pbo2RtJdJpGscXn+BBwGFAIbgf9JbDjVRKQN8DfgJlXd5l+XTMc0TJxJeUxVdY+qFuIGJzoOODLoGDIlgUYyGlTgVPVz730T8CLuR/BVqGnuvW/yiif6OzQ2roTEq6pfef+x9gL/n+rmWELjFJFsXFJ6VlX/7i1OumMaLs5kPaYhqroVeBMYgDvdEXrShv9z6xrtrUmxZkoCjWQ0qECJSGsRaRuaBs4EllBzVKorgZe96ZnAT7we2uOBb3zNvyA0Nq7ZwJki0tFr8p3pLYurWueFz8cd01Ccl3q9sT2AAmA+Afw2vHNtTwDLVfUB36qkOqZ1xZmkxzRPRDp40y1xjwtajkukF3nFah/TcKO91fUdIhPLnrFkfuF6NlfizpOMT4J4DsX1/n0ELA3FhDsv8zqwCngN6KTVvY6PevF/AhTFMbZpuKZaBe6c0DXRxAVcjTspXwJcFVCcU704Pvb+cxzkKz/ei3MFcFZQvw3gJFzz/GNgsfc6O9mOaT1xJuMxPQZY5MW0BLjD9/9qvnd8XgBaeMtzvPkSb/2hDX2HSF52J5IxxkQpU5rwxhgTc5ZAjTEmSpZAjTEmSpZAjTEmSpZAjTEmSpZATVLwRvx5Jg77VRE5PMb77CYi20UkK5b7NanHEqgJlIhcJiLFXgLaKCKvishJiY6rMVT1M1Vto6p7GiorIt29JL5fQ2VN6rF/VBMYEbkZN2zbT3F30OzGjSY0DNiRwNCMiYrVQE0gRKQ9MAn4mar+XVV3qGqFqv5DVX/hFWsuIk+LyLfeILlFvu0PFpG/iUiZiKwVkf/2rcsSkV+JyGpv2wUi0rVWCIjISSKyQURO9eZVRP5bRNaIyNcicr+INPPWNRORCSKyXtygzU9732GfWqWI/EdE7hKRd7zPnyMiud7HzvXet3q17gGxPbImkSyBmqAMwN1O92I9Zc7DjR3ZAXfL4CPgkhnwD9xtr12AQcBNUj16+M3ACNztg+1wtzvu9O9YRIbibv28UFX/41t1PlCEG5h5mLctwCjvdRru9sA2oXjqcBlwFbA/bhDhW73loRG2OnjN/vfq2YdJMZZATVA6A19r9ViN4bytqrO8c4tTgWO95f2BPFWdpKq7VXUNblSgS7311wITVHWFOh+parlvvxcD/4u7z7n2QBH3qhug+DPgQVwiBrgceEDdoMDbgdtxg07UddrrSVVdqarf4UZFL2zgeJg0YOdATVDKgVwR2a+eJPqlb3onkOMlrEOAg0Vkq299FjDPm+6KGwyiLjfhRnhfEmadfzDd9cDB3vTB3rx/3X5UD3rcUOxt6onHpAmrgZqgvAd8T/UjFhpjA7BWVTv4Xm1V9Wzf+sPq2f5i4MfiPSStFv+50m7AF970F7jE7V9XCXzVyNhttJ40ZgnUBEJVvwHuAB4VkR+LSCsRyRaRs0TkvgY2nw98KyK3iUhLr9Oot4j099Y/DtwlIgXeGJrHiEhn3/Zf4M6b3igiY2rt+xfe+JpdgRuB57zl04Cfe2NatgHuwT2IrL5TEOGUAXtx51FNmrEEagKjqv+D6/CZgEssG4AbgJca2G4PcA7uvOJa4Gtc0mzvFXkAd95xDrANNyhwy1r7+AyXRMeJyLW+VS8DC3BjX/7T2xZgCu487FzvM3cBYxv5lVHVncDdwDsistUbINmkCRsP1GQsEVGgQFVLEh2LSU1WAzXGmChZAjXGmChZE94YY6JkNVBjjImSJVBjjImSJVBjjImSJVBjjImSJVBjjImSJVBjjInS/wGdGIlyQJqKwQAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 360x216 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"poly_onelayer = SingleLayerNN(X2, y2, activations=(\"linear\"), learning_rate=0.01)\n",
"poly_twolayer = TwoLayerNN(X2, y2, activations=(\"tanh\", \"linear\"), learning_rate=0.01)\n",
"\n",
"training_stats = []\n",
"\n",
"for checkpoint in range(3000):\n",
" # Train one epoch at a time and record progress after each, for 3000 epochs total\n",
" poly_onelayer.train(1)\n",
" poly_twolayer.train(1)\n",
" training_stats.append([MSE(y2, poly_onelayer.forward(X2)), MSE(y2, poly_twolayer.forward(X2))])\n",
"\n",
"print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
"print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))\n",
"print()\n",
"\n",
"# Plot the progress\n",
"training_stats = np.array(training_stats)\n",
"plt.figure(figsize=(5, 3))\n",
"plt.plot(training_stats[:,1], 'b', label='Two layer')\n",
"plt.plot(training_stats[:,0], 'r', label='One layer')\n",
"plt.xlabel('Checkpoint', fontsize=12)\n",
"plt.ylabel('MSE', fontsize=12)\n",
"plt.title('MSE')\n",
"plt.legend()\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Two observations:\n",
"1. Both networks initially converged **fast**. Well, this was a simple function, after all.\n",
"2. The 2-layer network is behaving strangely. There is a periodic jump in MSE that decreases over time. Let's plot the same training if we use Sigmoid activation instead of tanh:"
]
},
{
"cell_type": "code",
"execution_count": 47,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: sigmoid, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.16086398349472605\n",
"\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAVcAAADjCAYAAAAxHVfPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAH9NJREFUeJzt3X2cV2Wd//HXW0RGRAFhMuVGSEEF1BEHExU1lZvaDEt9gFiBN/FIw3Rbd6X0Z4arq+ZaumtbbqJhpW5uJvXTVVNLf3kHKKaCCOLdoK0IIgEZDH1+f5zDcBhnhu/A+d7M1/fz8fg+vudc17n5zHH4eM11zrkuRQRmZpavHcodgJlZNXJyNTMrAidXM7MicHI1MysCJ1czsyJwcjUzKwInVzOzInBytaok6TVJ6yX1blb+rKSQNEBSX0n/LeldSe9LekHSlHS7Ael2a5p9JpTlB7IOZ8dyB2BWRK8CpwH/BiDpQKBrpv424Dlgb+CvwIHAx5sdo0dENBY/VKs2brlaNbsN+HJmfTIwK7M+Arg1ItZGRGNEPBsR95U0QqtaTq5WzZ4EdpN0gKROwETgp83qb5Q0UVL/skRoVcvJ1ardptbraGAhsCxTdyrwGPB/gFclzZc0otn+70palfkcUJKorcNzn6tVu9uAR4GBbNklQES8B0wHpqc3vq4FfiWpb2az3u5ztW3hlqtVtYh4neTG1meAX7ax3bskyXUvYPfSRGfVzMnVPgrOAo6LiLXZQklXSxomaUdJuwLnAEsiYkVZorSq4uRqVS8iXomIuS1UdQXuBlYBS0keyfpcs21WNXvO9RtFDteqhDxYtplZ/txyNTMrAidXM7MicHI1MysCJ1czsyJwcjUzK4KqfUOrd+/eMWDAgHKHYWZVZt68ee9GRO3Wtqva5DpgwADmzm3p0UYzs20n6fVCtnO3gJlZETi5mpkVgZOrmVkRVG2fq9lHzYYNG2hoaOCDDz4odyhVoaamhr59+9K5c+dt2t/JFbjnnuRz880glTsas23T0NDArrvuyoABA5B/kbdLRLBixQoaGhoYOHDgNh2jJN0CkmZKekfSC63UH5vOvjk//VyaqRsnaZGkJZKmFyO++fPhlluKcWSz0vnggw/o1auXE2sOJNGrV6/t+iugVH2utwLjtrLNYxFRl35mAKTzHt0IfBoYApwmaUixgvQAYdbRObHmZ3uvZUmSa0Q8Cqzchl0PIxm8eGlErAfuAMbnGhzuCjDLw4oVK6irq6Ouro6Pf/zj9OnTp2l9/fr12338o446ivnz5+cQaWlUUp/rSEnPAW8BF0bEi0Af4M3MNg3AJ1s7gKSpwFSA/v09madZKfXq1asp+V122WV069aNCy+8sMxRFa6xsZEdd8wvJVbKo1jPAHtHxMHAvwG/2paDRMRNEVEfEfW1tVt9O62F/bflrGbWln/5l3/hBz/4AQDnnXceY8aMAeCBBx5g8uTJAPz0pz/lwAMPZNiwYXzrW9/a6jGnTp1KfX09Q4cOZcaMGU3HO+WUU5q2ue+++zj11FOblkeOHMnw4cOZMGECa9cmM/707duX6dOnc8ghh3D33Xfn90NTIS3XiFidWb5X0g/S2TiXAf0ym/Zly6mRc+FuAas2F1yQ3KjNU10dfP/77d9v1KhR3HjjjZx77rk888wzbNiwgY0bN/LYY49x9NFH09DQwCWXXMLcuXPp3r07J5xwAr/5zW/47Gc/2+oxr7rqKnbffXcaGxv51Kc+xSmnnMIJJ5zAtGnTWLFiBb169eKWW27hzDPP5J133uGqq67ioYceomvXrlxxxRVcf/31TUn8Yx/7GM8+++y2XpZWVUTLVdLHlfYeSzqMJK4VwBxgkKSBknYCJgKzyxepmbXXiBEjmDNnDqtWraJbt26MGDGCZ555hscee4xRo0bx1FNPcdxxx9G7d286d+7MpEmTePTRR9s85u23387w4cMZPnw4CxcuZMGCBeywww6cfvrp/PznP2flypXMmzePMWPG8Pjjj7NgwQKOOOII6urq+NnPfsZrr73WdKwJEyYU5ecuSctV0u3AsUBvSQ3At4HOABHxQ+AU4BxJjcBfgImRTO7VKGkacD/QCZiZ9sUWhbsFrFpsSwuzWLp06UKfPn2YNWsWRx55JIMHD+ahhx7i9ddfZ/DgwTz//PPtOt7ixYu5/vrrefrpp+nRowdf/OIXmx6ZOvPMMzn55JOBJGl26tSJiGDcuHHcdtttLR5vl1122b4fsBWlelrgtIjYMyI6R0TfiLg5In6YJlYi4t8jYmhEHBwRh0fE45l9742IwRGxT0RcUYz43C1gVlyjRo3i2muv5eijj27qJqivrwfgk5/8JI888ggrVqygsbGRO+64g2OOOabVY61evZpdd92V3Xbbjbfffpv777+/qa5fv3707t2bq666iilTpgBwxBFH8Pvf/56lS5cCsHbtWhYvXly8HzZVEX2ulcItV7PiGDVqFN/97nc5/PDDqampoXPnzowaNQpIbipdfvnlHHvssUQEJ554In/3d3/X6rGGDx/OkCFD2H///dl777058sgjt6ifNGkSq1evZvDgwQDsscce3HzzzUyYMKHpkbArr7ySQYMGFemnTVTt1Nr19fVR6HiuV1wBl1wC69fDNr5GbFZ2Cxcu5IADDih3GGX31a9+lZEjRzY9ibA9WrqmkuZFRP3W9nXL1cyqRl1dHT179uSGG24odyhOrllV2og3+8iopDe4KuJRrHLzDS0zy5uTq5lZETi5ZrhbwMzy4uSKuwXMLH9OrhluuZptn4aGBsaPH8+gQYPYZ599OP/883MZbhBgypQp3HXXXbkcqxScXHHL1SwPEcEXvvAFTjrpJBYvXszLL7/MmjVruPjii8sdWkEaGxtzPZ6Tq5nl4uGHH6ampoYzzjgDgE6dOvG9732PmTNnsm7dOm699Va+8IUvMG7cOAYNGsQ//dM/Ne37wAMPNA0JeOqpp7JmzZo2zzVjxgxGjBjBsGHDmDp1KhHBK6+8wvDhw5u2Wbx4cdP6vHnzOOaYYzj00EMZO3Ysb7/9NgDHHnssF1xwAfX19Vx//fW5Xg8/55rhbgGrGmUYc/DFF1/k0EMP3aJst912o3///ixZsgRInkN99tln6dKlC/vttx/nnXceO++8M//8z//Mb3/7W3bZZReuvvpqrrvuOi699NKWTgPAtGnTmuq/9KUv8Zvf/IYTTzyR7t27M3/+fOrq6rjllls444wz2LBhA+eddx733HMPtbW13HnnnVx88cXMnDkTgPXr11Po25zt4eSKuwXMSuX444+ne/fuAAwZMoTXX3+dVatWsWDBgqYxAtavX8/IkSPbPM4jjzzCNddcw7p161i5ciVDhw7lxBNP5Oyzz+aWW27huuuu48477+Tpp59m0aJFvPDCC4wePRqAjRs3sueeezYdq0MPOdhRuOVqVaMMYw4OGTLkQzecVq9ezRtvvMG+++7LM888Q5cuXZrqOnXqRGNjIxHB6NGjuf322ws6z
wcffMC5557L3Llz6devH5dddlnTkIMnn3wy3/nOdzjuuOM49NBD6dWrF2+99RZDhw7liSeeaPF4HXrIwUrnlqvZ9jv++ONZt24ds2bNApIW4j/8wz8wZcoUunbt2up+hx9+OH/4wx+aug7Wrl3Lyy+/3Or2mxJp7969WbNmzRYJvaamhrFjx3LOOec09f3ut99+LF++vCm5btiwgRdfLNqw0E1KklwlzZT0jqQXWqk/XdIfJT0v6XFJB2fqXkvL50vKv2PEzHIhibvvvptf/OIXDBo0iMGDB1NTU8OVV17Z5n61tbXceuutnHbaaRx00EGMHDmSl156qdXte/TowVe+8hWGDRvG2LFjGTFixBb1p59+OjvssEPTXF077bQTd911FxdddBEHH3wwdXV1PP744y0dOlclGXJQ0tHAGmBWRAxrof4IYGFEvCfp08BlEfHJtO41oD4i3m3POdsz5OA118BFF8GaNVCkvxDMis5DDiauvfZa3n//fS6//PLtPlbFDzkYEY9KGtBGffZ/I0+STERYMu4WMKsOn//853nllVd4+OGHyx1KRd7QOgu4L7MewAOSAvhRRNxUnrDMrNLlPT329qio5CrpUyTJ9ahM8VERsUzSx4AHJb0UES1ODSlpKjAVoH///u0+v58WMLO8VMzTApIOAn4MjI+IFZvKI2JZ+v0OcDdwWGvHiIibIqI+Iupra2vbce5tDtusolTrtE3lsL3XsiKSq6T+wC+BL0XEy5nyXSTtumkZGAO0+MRBHvx7aR1ZTU0NK1ascILNQUSwYsUKampqtvkYJekWkHQ7cCzQW1ID8G2gM0A6vfalQC/gB0qakY3p3bg9gLvTsh2Bn0fE/+QfX95HNCu9vn370tDQwPLly8sdSlWoqamhb99tv7deqqcFTttK/dnA2S2ULwUO/vAeZtZc586dGThwYLnDsFRFdAtUCv81ZWZ5cXLF3QJmlj8n1wy3XM0sL06uuOVqZvlzcjUzKwIn1wx3C5hZXpxccbeAmeXPydXMrAicXDPcLWBmeXFyxd0CZpY/J9cMt1zNLC9Orrjlamb5c3I1MysCJ9cMdwuYWV6cXHG3gJnlz8nVzKwISpZcJc2U9I6kFqdpUeIGSUsk/VHS8EzdZEmL08/kYsXobgEzy0spW663AuPaqP80MCj9TAX+A0DS7iTTwnySZHLCb0vqmWdg7hYws7yVLLmm02GvbGOT8cCsSDwJ9JC0JzAWeDAiVkbEe8CDtJ2ktyPGYhzVzD6KKqnPtQ/wZma9IS1rrTw3brmaWd4qKbluN0lTJc2VNNczYJpZOVVScl0G9Mus903LWiv/kIi4KSLqI6K+tra23QG4W8DM8lJJyXU28OX0qYHDgfcj4m3gfmCMpJ7pjawxaVlu3C1gZnnbsVQnknQ7cCzQW1IDyRMAnQEi4ofAvcBngCXAOuCMtG6lpMuBOemhZkREWzfGtplbrmaWl5Il14g4bSv1AXytlbqZwMxixAVuuZpZ/iqpW8DMrGo4uWa4W8DM8uLkirsFzCx/Tq5mZkXg5JrhbgEzy4uTK+4WMLP8OblmuOVqZnnZanKVdGqz9f2arV+Qd1Cl5parmeWtkJbrzc3Wn2i2PiOnWMzMqkYhybV5u25r6x2WuwXMLC+FJNfmKWdr6x2OuwXMLG8FjS0gSSQtVLW0Xi3ccjWzvBSSXLsBjZl1ZdaFW65mZh9SSHIdWPQozMyqzFaTa0S83lK5pJ7phIFVw90CZpaXQp5z/bKksZn1eklvAu9KWtT8udeOyN0CZpa3Qp4WuBD4U2b9JuC3wEHp93cLOZGkcWkyXiJpegv135M0P/28LGlVpm5jpm52IeczMyunQvpc+wHPA0jqBxwInJBOvzKdZFqWNknqBNwIjCaZGnuOpNkRsWDTNhHx95ntzwMOyRziLxFRV0Cs28XdAmaWl0Jaro3ATunyEcBLmTms1gE7F3CMw4AlEbE0ItYDdwDj29j+NOD2Ao6bC3cLmFneCkmuvweukHQQcB7w60zd/mzZZdCaPsCbmfWGtOxDJO1N8oTCw5niGklzJT0p6aTWTiJparrd3OXLlxcQ1pbccjWzvBSSXM8n+RP9DyQt1aszdV8C/ifnmCYCd0XExkzZ3hFRD0wCvi9pn5Z2jIibIqI+Iupra2sLPqFbrmaWt0L6XDsBU9j8wkB3Sd3Tuh8UeJ5lJH23m/RNy1oykWazwEbEsvR7qaTfkST7Vwo8t5lZyRWSXF9jy7ewmrfzgiQBt2UOMEjSQJKkOpGkFboFSfsDPcmMvCWpJ7AuIv4qqTdwJHBNAXG3m7sFzCwvhSTX50huWv0E+CnwVntPEhGNkqYB95Mk4pkR8aKkGcDciNj0eNVE4I6ILdLcAcCPJP2NpBvjquxTBnlwt4CZ5a2QN7QOkTQMmEzS77oQmAX8MiL+UuiJIuJe4N5mZZc2W7+shf0eJ3n8q+jccjWzvBQ0zUtEvBAR/wgMAK4DPgu8LWl4EWMzM+uw2juH1iDgGGAk8CxQFWMLuFvAzPK21W4BSbuTPNQ/GdgVuA04OiLeKHJsJeduATPLSyE3tN4CXiVJqk+mZftK2nfTBhHxcEs7dhRuuZpZ3gpJrn8CaoCvpJ/mAvhEnkGZmXV0hTwtMKAEcVQEdwuYWV7ae0OrKrlbwMzy5uSa4ZarmeXFyRW3XM0sf06uZmZF4OSa4W4BM8uLkyvuFjCz/Dm5mpkVgZNrhrsFzCwvTq64W8DM8ley5CppnKRFkpakU3I3r58iabmk+enn7EzdZEmL08/kYsXolquZ5aWQsQW2m6ROwI3AaJKZX+dImt3CjAJ3RsS0ZvvuDnwbqCcZx2Beum9uwx265WpmeStVy/UwYElELI2I9cAdwPgC9x0LPBgRK9OE+iAwrkhxmpnlolTJtQ/wZma9IS1r7mRJf5R0l6RNs8UWuu92c7eAmeWlkm5o/RoYEBEHkbROf9LeA0iaKmmupLnLly9vx37tPZOZWdtKlVyXAf0y633TsiYRsSIi/pqu/hg4tNB9M8e4KSLqI6K+tra23UG65WpmeSlVcp0DDJI0UNJOJFNoz85uIGnPzOrnSGaZhWQ67jGSekrqCYxJy3LjlquZ5a0kTwtERKOkaSRJsRMwMyJelDQDmBsRs4GvS/oc0AisBKak+66UdDlJggaYERErSxG3mdm2KklyBYiIe4F7m5Vdmln+JvDNVvadCcwsaoC4W8DM8lNJN7TKxt0CZpY3J1czsyJwcs1wt4CZ5cXJFXcLmFn+nFwz3HI1s7w4ueKWq5nlz8nVzKwInFzZ3HL929/KG4eZVQ8nV2DH9FWKxsbyxmFm1cPJFejcOfl2cjWzvDi5srnlumFDeeMws+rh5Iq7Bcwsf06uuFvAzPLn5Iq7Bcwsf06uuOVqZvlzcsUtVzPLX8mSq6RxkhZJWiJpegv135C0IJ399SFJe2fqNkqan35mN993e/mGlpnlrSQzEUjqBNwIjCaZGnuOpNkRsSCz2bNAfUSsk3QOcA0wIa37S0TUFSu+bt2S79Wri3UGM/uoKVXL
9TBgSUQsjYj1wB3A+OwGEfFIRKxLV58kmeW1JPbaK/le1uKcsmZm7Veq5NoHeDOz3pCWteYs4L7Meo2kuZKelHRS3sHttBPssQc0NOR9ZDP7qCrZBIWFkvRFoB44JlO8d0Qsk/QJ4GFJz0fEKy3sOxWYCtC/f/92nbdvX3jzza1vZ2ZWiFK1XJcB/TLrfdOyLUg6AbgY+FxE/HVTeUQsS7+XAr8DDmnpJBFxU0TUR0R9bW1tuwLcZx9YtKhdu5iZtapUyXUOMEjSQEk7AROBLe76SzoE+BFJYn0nU95TUpd0uTdwJJC9EZaLujp49VVYtSrvI5vZR1FJkmtENALTgPuBhcB/RcSLkmZI+ly62XeBbsAvmj1ydQAwV9JzwCPAVc2eMsjFIWlbeM6cvI9sZh9FJetzjYh7gXublV2aWT6hlf0eBw4sbnRw1FGw887wq1/B6NHFPpuZVTu/oZXq1g1OOgl+8hN4661yR2NmHZ2Ta8aMGclUL+PGwZNPejZYM9t2FfcoVjntuy/Mng2nnQYjRybPvg4dmnzvvnvSuu3cefNnxx2T7x3S/0VJmz/Z9daWt7ZdR+O4S6cjxgyVF/fAgXDkkcU5tpNrMyecAEuWwF13wWOPwUsvwdNPw3vvwZ//7MFdzKrJpElOriXVvTucdVbyacnGjUmSbWxMviO2/MDWl7e2XUfjuEunI8YMlRn3pnFFisHJdRt06pR8zMxa4xtaZmZF4ORqZlYETq5mZkXg5GpmVgROrmZmReDkCvDQQ3DRReWOwsyqiJMrwBNPwDXX+A0BM8uNkytA167J99q15Y3DzKqGkyvALrsk3+vWtb2dmVmB/IYWbE6uN9wA+++fjMRSaSNMmFn+Bg5MBnMugpIlV0njgOuBTsCPI+KqZvVdgFnAocAKYEJEvJbWfZNkRtiNwNcj4v5cgxs8OBne6uqrcz2smVW4SZM6dnKV1Am4ERhNMq32HEmzm03XchbwXkTsK2kicDUwQdIQkjm3hgJ7Ab+VNDgiNuYW4GGHJZNnvftuMqDrxvwObWYVrIgjt5Sq5XoYsCSdvRVJdwDj2XKiwfHAZenyXcC/S1Jafkc6G+yrkpakx3si1wi7doV2TsdtZtaaUt3Q6gO8mVlvSMta3Cad0PB9oFeB+wIgaaqkuZLmLl++PKfQzczar6qeFoiImyKiPiLqa2tryx2OmX2ElSq5LgP6Zdb7pmUtbiNpR6A7yY2tQvY1M6sopUquc4BBkgZK2onkBtXsZtvMBiany6cAD0dEpOUTJXWRNBAYBDxdorjNzLZJSW5oRUSjpGnA/SSPYs2MiBclzQDmRsRs4GbgtvSG1UqSBEy63X+R3PxqBL6W65MCZmZFoKjEiW1yIGk58Ho7dukNvFukcPLWUWJ1nPnqKHFCx4l1W+LcOyK2elOnapNre0maGxH15Y6jEB0lVseZr44SJ3ScWIsZZ1U9LWBmVimcXM3MisDJdbObyh1AO3SUWB1nvjpKnNBxYi1anO5zNTMrArdczcyKwMmVZDhESYskLZE0vQLieU3S85LmS5qblu0u6UFJi9Pvnmm5JN2Qxv5HScOLGNdMSe9IeiFT1u64JE1Ot18saXJL5ypSrJdJWpZe1/mSPpOp+2Ya6yJJYzPlRf3dkNRP0iOSFkh6UdL5aXlFXdc24qyoayqpRtLTkp5L4/xOWj5Q0lPpOe9MX2YifTnpzrT8KUkDthZ/wSLiI/0heanhFeATwE7Ac8CQMsf0GtC7Wdk1wPR0eTpwdbr8GeA+QMDhwFNFjOtoYDjwwrbGBewOLE2/e6bLPUsU62XAhS1sOyT9794FGJj+PnQqxe8GsCcwPF3eFXg5jaeirmsbcVbUNU2vS7d0uTPwVHqd/guYmJb/EDgnXT4X+GG6PBG4s6342xOLW66Z4RAjYj2waTjESjMe+Em6/BPgpEz5rEg8CfSQtGcxAoiIR0nentueuMYCD0bEyoh4D3gQGFeiWFvTNKxlRLwKbBrWsui/GxHxdkQ8ky7/GVhIMupbRV3XNuJsTVmuaXpd1qSrndNPAMeRDGUKH76em67zXcDx0pZDnTaLv2BOru0Y0rCEAnhA0jxJU9OyPSLi7XT5T8Ae6XK5429vXOWOd1r65/TMTX9qtxFTSWNN/yQ9hKS1VbHXtVmcUGHXVFInSfOBd0j+J/MKsCqSoUybn3O7hzptjZNrZToqIoYDnwa+JunobGUkf7dU3GMelRpXxn8A+wB1wNvAv5Y3nM0kdQP+G7ggIlZn6yrpurYQZ8Vd04jYGBF1JCPoHQbsX444nFwrcEjDiFiWfr8D3E3yC/K/m/7cT7/fSTcvd/ztjats8UbE/6b/8P4G/Ceb/8wra6ySOpMkrJ9FxC/T4oq7ri3FWanXNI1tFfAIMJKk+2TTQFXZcxZtqFMn18KGQywZSbtI2nXTMjAGeIEth2ScDNyTLs8GvpzeRT4ceD/z52QptDeu+4Exknqmf0KOScuKrllf9OdJruumWFsa1rLovxtp/97NwMKIuC5TVVHXtbU4K+2aSqqV1CNd3plk3r6FJEn2lHSz5tezOEOd5nWXriN/SO7AvkzSN3NxmWP5BMldyueAFzfFQ9IP9BCwGPgtsHtsvjt6Yxr780B9EWO7neRPvw0kfVBnbUtcwJkkNwiWAGeUMNbb0lj+mP7j2TOz/cVprIuAT5fqdwM4iuRP/j8C89PPZyrturYRZ0VdU+Ag4Nk0nheASzP/rp5Or80vgC5peU26viSt/8TW4i/04ze0zMyKwN0CZmZF4ORqZlYETq5mZkXg5GpmVgROrmZmReDkahUtHXXpp0U4bkjaN+dj9pe0RlKnPI9rHZOTq1UESZMkzU2T09uS7pN0VLnjao+IeCMiukUBU79LGpAm+JJMb2+l5/+wVnaSvkEyrN5XSd4qWk8yotN4YG0ZQzPbZm65WllJ6g7MAL4WEb+MiLURsSEifh0R/5hutpOkWZL+nA6AXJ/Zfy9J/y1puaRXJX09U9dJ0rckvZLuO09Sv2YhIOkoSW9KOjZdD0lfl7RU0ruSvitph7RuB0mXSHpdyWDcs9Kf4UOtUUm/k3S5pD+k539AUu/0tI+m36vS1vrIfK+slZuTq5XbSJJXEO9uY5vPkYz72YPkFct/hyTRAb8meVW4D3A8cIE2jxr/DeA0ktctdyN5PXRd9sCSxpG8KntyRPwuU/V5oJ5kwO3x6b4AU9LPp0heqey2KZ5WTALOAD5GMjj0hWn5ppHOeqRdCU+0cQzrgJxcrdx6Ae/G5rE2W/L/IuLetC/zNuDgtHwEUBsRMyJifUQsJRmZaWJafzZwSUQsisRzEbEic9xTgR+RvDfefFCOqyMZePoN4PskSRrgdOC6SAZ7XgN8k2SAj9a62G6JiJcj4i8ko+HXbeV6WJVwn6uV2wqgt6Qd20iwf8osrwNq0mS2N7CXpFWZ+k7AY+lyP5KBN1pzAcmo/i+0UJcdKPl1YK90ea90PVu3I5sHs95a7N3aiMeqiFuuVm5PAH9l87Qb7fEm8GpE9Mh8do2Iz2Tq92lj/1O
Bk5ROttdMtm+2P/BWuvwWSVLP1jUC/9vO2D1iUpVzcrWyioj3gUuBGyWdJKmrpM6SPi3pmq3s/jTwZ0kXSdo5vYE1TNKItP7HwOWSBqXjnx4kqVdm/7dI+mnPl3ROs2P/Yzo2aj/gfODOtPx24O/T8Ui7AVeSTGrXVrdGS5YDfyPpt7Uq5ORqZRcR/0py8+kSkqTzJjAN+NVW9tsIfJakH/NV4F2ShNo93eQ6kn7OB4DVJIM979zsGG+QJNjpks7OVN0DzCMZt/T/pvsCzCTp9300PecHwHnt/JGJiHXAFcAfJK1KB762KuLxXM2akRTAoIhYUu5YrONyy9XMrAicXM3MisDdAmZmReCWq5lZETi5mpkVgZOrmVkROLmamRWBk6uZWRE4uZqZFcH/B63txYuickyyAAAAAElFTkSuQmCC\n",
"text/plain": [
"<Figure size 360x216 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"poly_onelayer = SingleLayerNN(X2, y2, activations=(\"linear\"), learning_rate=0.01)\n",
"poly_twolayer = TwoLayerNN(X2, y2, activations=(\"sigmoid\", \"linear\"), learning_rate=0.01)\n",
"\n",
"training_stats = []\n",
"\n",
"for checkpoint in range(3000):\n",
" # Train 20 epochs at a time, then record progress; total of 50 * 20 = 1000 epochs\n",
" poly_onelayer.train(1)\n",
" poly_twolayer.train(1)\n",
" training_stats.append([MSE(y2, poly_onelayer.forward(X2)), MSE(y2, poly_twolayer.forward(X2))])\n",
"\n",
"print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
"print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))\n",
"print()\n",
"\n",
"# Plot the progress\n",
"training_stats = np.array(training_stats)\n",
"plt.figure(figsize=(5, 3))\n",
"plt.plot(training_stats[:,1], 'b', label='Two layer')\n",
"plt.plot(training_stats[:,0], 'r', label='One layer')\n",
"plt.xlabel('Checkpoint', fontsize=12)\n",
"plt.ylabel('MSE', fontsize=12)\n",
"plt.title('MSE')\n",
"plt.legend()\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This is much smoother. What you see is most likely an artifact of the hidden layer's outputs \"alternating\" across the negative / positive boundary, since that is a thing tanh can do. Sigmoid is always positive, so there is no single point where a sign can flip suddenly.\n",
"\n",
"Another possible way to combat the observed behavior of the 2-layer network is to lower the learning rate even more. Let's try that (setting the hidden layer activation back to \"tanh\"):"
]
},
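{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sanity check (uses only NumPy, imported above): a tiny nudge across zero\n",
"# flips the sign of tanh's output, while the sigmoid output stays positive.\n",
"z = np.array([-0.01, 0.01])\n",
"print(\"tanh:   \", np.tanh(z))            # one negative, one positive\n",
"print(\"sigmoid:\", 1 / (1 + np.exp(-z)))  # both remain in (0, 1)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Another possible way to combat the observed behavior of the 2-layer network is to lower the learning rate even more. Let's try that (setting the hidden layer activation back to \"tanh\"):"
]
},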
{
"cell_type": "code",
"execution_count": 48,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using linear activation.\n",
"Using activations: tanh, linear.\n",
"One layer: 0.02909393655222622\n",
"Two layer: 0.028925180466813725\n",
"\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAVAAAADjCAYAAADTwUy2AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAHU1JREFUeJzt3Xl0VfW99/H3NzEQmZWkDgxCNaiANoagoqI4IGhVnFigVsWJVS2ot1pLq4/XYrVOl6rV6vWpYFGrPmKpw9UrTrd6nTAMKoMKDmiU1hgEBMoQ/D5/7J14iBlOTvYZkv15rbXX2fP5nm34uPdvn/075u6IiEjL5WW7ABGRtkoBKiKSIgWoiEiKFKAiIilSgIqIpEgBKiKSIgWoiEiKFKDSZpnZJ2a22cyK6s1fYGZuZv3MrLeZPWZmX5nZGjNbZGYTwvX6heutqzeMy8oHkjZnu2wXINJKHwOnAX8AMLN9gE4Jy+8H3gZ2AzYB+wA719tHD3evSX+p0t7oDFTauvuBsxKmzwZmJkwPBe5z9/XuXuPuC9z9mYxWKO2WAlTaujeAbma2t5nlA+OBB+otv9PMxptZ36xUKO2WAlTag9qz0JHAUuDzhGVjgVeA/wN8bGYLzWxove2/MrPVCcPeGala2jy1gUp7cD/wMtCfbS/fcfevgSnAlPBm0y3A38ysd8JqRWoDlVToDFTaPHdfQXAz6Vjgr02s9xVBgO4K7JiZ6qQ9U4BKe3EecIS7r0+caWY3mtlgM9vOzLoCFwLL3b06K1VKu6IAlXbB3T9094oGFnUCZgOrgY8Ivs50Qr11Vtf7HujP01yutBOmDpVFRFKjM1ARkRQpQEVEUqQAFRFJUUYC1Mz6mNlLZrbEzBab2SUNrGNmdruZLTezd8ysLBO1iYikKlNfpK8BLnP3+eFXSeaZ2XPuviRhnWOAknA4ALgrfBURyUkZCVB3XwmsDMe/MbOlQC8gMUDHADM9+FrAG2bWw8x2CbdtUFFRkffr1y+NlYtIHM2bN+8rdy9ubr2MP8ppZv2A/YA36y3qBXyWMF0Zzms0QPv160dFRUNf/RMRSZ2ZrUhmvYzeRDKzLsBjwKXuvjbFfUw0swozq6iqqoq2QBGRFshYgJpZAUF4PujuDT2v/DnQJ2G6N9v2qgOAu9/j7uXuXl5c3OwZtohI2mTqLrwB9wJL3X1aI6s9AZwV3o0/EFjTVPuniEi2ZaoN9GDgTOBdM1sYzvs10BfA3e8GniboTWc5sAE4J0O1ibQpW7ZsobKyko0bN2a7lDavsLCQ3r17U1BQkNL2mboL/7+ANbOOAz9LZx1XXAEDBsD556fzXUTSq7Kykq5du9KvXz+CiztJhbtTXV1NZWUl/fv3T2kfsXoS6dFH4eWXs12FSOts3LiRnj17Kjxbyczo2bNnq87kYxWgeXmgzqekPVB4RqO1xzF2Afrtt9muQqRtq66uprS0lNLSUnbeeWd69epVN7158+ZW7/+QQw5h4cKFza+YA2L1m0gKUJHW69mzZ13AXXPNNXTp0oXLL788y1Ulr6amhu22iyb6YnUGaqYAFUmX3/3ud/zxj38EYPLkyRx99NEAzJkzh7PPPhuABx54gH322YfBgwfz61//utl9Tpw4kfLycgYNGsTUqVPr9nfqqafWrfPMM88wduzYuvFhw4ZRVlbGuHHjWL8++IWX3r17M2XKFPbbbz9mz54d2WeO3Rmo2kClPbn0Uoj6are0FG69teXbDR8+nDvvvJOLLrqI+fPns2XLFrZu3corr7zCoYceSmVlJVdddRUVFRV0796do446iqeeeorjjjuu0X3ecMMN7LjjjtTU1HD44Ydz6qmnctRRRzFp0iSqq6vp2bMnM2bM4Nxzz+XLL7/khhtu4IUXXqBTp05cd9113HbbbXVB/YMf/IAFCxakelgaFKszUF3Ci6TP0KFDeeutt1i9ejVdunRh6NChzJ8/n1deeYXhw4fz5ptvcsQRR1BUVERBQQGnn346LzfztZiHHnqIsrIyysrKWLp0KUuWLCEvL48zzjiDv/zlL6xatYp58+Zx9NFH89prr7FkyRIOOuggSktLefDBB/nkk0/q9jVu3LjIP3PszkAVoNKepHKmmC4dO3akV69ezJw5k4MPPpgBAwbwwgsvsGLFCgYMGMC7777bov0tW7aM2267jblz59KjRw9+8pOf1H3l6Nxzz+WUU04BgmDMz8/H3Rk9ejT3339/g/vr3Llz6z5gA3QGKiKRGT58OLfccguHHnpo3SV9eXk5AAcccAAvvfQS1dXV1NTU8PDDD3PYYYc1uq+1a9fStWtXunXrxsqVK3n22WfrlvXp04eioiJuuOEGJkyYAMBBBx3E3//+dz766CMA1q9fz7Jly9L3YYnZGahuIomk1/Dhw7n55ps58MADKSwspKCggOHDhwPBjZxrr72WESNG4O4cf/zx/PjHP250X2VlZQwcOJC99tqL3XbbjYMPPnib5aeffjpr165lwIABAOy0007ce++9jBs3ru7rVNdffz0lJSVp+rRt/GeNy8vLvSX9gQ4ZArvuCk8+mcaiRNJs6dKl7L333tkuI+t++tOfMmzYsLo7/Klq6Hia2Tx3L29u21idgeoSXqR9KC0tZYcdduD222/Pah0KUBFpc3LlSaVY3URSG6iIRClWAaov0otIlGIXoDoDFZGoKEBFRFKkABWRFqusrGTMmDGUlJSw++67c8kll0TSlR3AhAkTmDVrViT7SrdYBahuIom0nrtz8sknc+KJJ7Js2TI++OAD1q1bx5VXXpnt0pJSU1MT2b5iFaC6iSTSei+++CKFhYWcc07wu4/5+fn8/ve/Z/r06WzYsIH77ruPk08+mdGjR1NSUsIVV1xRt+2cOXPqupsbO3Ys69ata/K9pk6dytChQxk8eDATJ07E3fnwww8pKyurW2fZsmV10/PmzeOwww5jyJAhjBo1ipUrgx/2HTFiBJdeeinl5eXcdtttkR0LfQ9UpC3LQn92ixcvZsiQIdvM69atG3379mX58uVA8D3NBQsW0LFjR/bcc08mT57M9ttvz29/+1uef/55OnfuzI033si0adO4+uqrG32vSZMm1S0/88wzeeqppzj++OPp3r07CxcupLS0lBkzZnDOOeewZcsWJk+ezOOPP05xcTGPPPIIV155JdOnTwdg8+bNtOTJxWQoQEUkckceeSTdu3cHYODAgaxYsYLVq1ezZMmSumfaN2/ezLBhw5rcz0svvcRNN93Ehg0bWLVqFYMGDeL444/n/PPPZ8aMGUybNo1HHnmEuXPn8v7777No0SJGjhwJwNatW9lll13q9qXu7FpJbaDS7mShP7uBAwd+7ybP2rVr+fTTT9ljjz2YP38+HTt2rFuWn59PTU0N7s7IkSN56KGHknqfjRs3ctFFF1FRUUGfPn245ppr6rqzO+WUU/jNb37DEUccwZAhQ+jZsydffPEFgwYN4vXXX29wf+rOrpXUBirSekceeSQbNmxg5syZQHCmd
9lllzFhwgQ6derU6HYHHnggr776at1l/vr16/nggw8aXb82LIuKili3bt02oV1YWMioUaO48MIL69pi99xzT6qqquoCdMuWLSxevLh1H7YZsQtQnYGKtI6ZMXv2bB599FFKSkoYMGAAhYWFXH/99U1uV1xczH333cdpp53Gvvvuy7Bhw3jvvfcaXb9Hjx5ccMEFDB48mFGjRjF06NBtlp9xxhnk5eXV/fZShw4dmDVrFr/85S/50Y9+RGlpKa+99lrrP3ATYtWd3QknQGUlzJ+fxqJE0kzd2QVuueUW1qxZw7XXXtuq/ag7uySpDVSkfTjppJP48MMPefHFF7NaR6wCVG2gIu1DlD9N3BpqAxURSZECVKQNasv3LnJJa49jRgLUzKab2ZdmtqiR5SPMbI2ZLQyHxh9NaAUFqLQHhYWFVFdXK0Rbyd2prq6msLAw5X1kqg30PuAOYGYT67zi7selswjdRJL2oHfv3lRWVlJVVZXtUtq8wsJCevfunfL2GQlQd3/ZzPpl4r2aoptI0h4UFBTQv3//bJch5FYb6DAze9vMnjGzQel4A13Ci0iUcuVrTPOB3dx9nZkdC/wNKGloRTObCEwE6Nu3b4veRAEqIlHKiTNQd1/r7uvC8aeBAjMramTde9y93N3Li4uLW/Q+agMVkSjlRICa2c5mZuH4/gR1VUf9PmoDFZEoZeQS3sweAkYARWZWCfw7UADg7ncDpwIXmlkN8C9gvKfhOxq6hBeRKGXqLvxpzSy/g+BrTmmlABWRKOXEJXymKEBFJEqxClDdRBKRKMUqQHUTSUSiFLsA1RmoiERFASoikqJYBajaQEUkSrEKULWBikiUYhegOgMVkagoQEVEUhSrAFUbqIhEKVYBqjNQEYlS7AJUN5FEJCqxC1CdgYpIVBSgIiIpilWAmukSXkSiE6sAzQs/rUJURKIQywDVZbyIREEBKiKSolgFaPCzdQpQEYlGrAJUbaAiEqVYBqjOQEUkCgpQEZEUKUBFRFIUqwDVTSQRiVKsAlQ3kUQkSrEMUJ2BikgUmg1QMxtbb3rPetOXRl1UuihARSRKyZyB3ltv+vV601MjqiXt1AYqIlFKJkCthdM5S22gIhKlZAK0ftw0N52zdAkvIlFK6iaSBfLMLL+h6SS2n25mX5rZoib2f7uZLTezd8ysLPmPkDwFqIhEKZkA7QLUAFuAzUCPhOktQOck9nEfMLqJ5ccAJeEwEbgriX22mNpARSRK2yWxTv/Wvom7v2xm/ZpYZQww090deMPMepjZLu6+srXvnUhnoCISpWYD1N1XNDTfzHZw968jqqMX8FnCdGU4Ly0BqptIIhKFZL4HepaZjUqYLjezz4CvzOz9+t8LTTczm2hmFWZWUVVV1aJtdQYqIlFKpg30cuAfCdP3AM8D+4avN0dQx+dAn4Tp3uG873H3e9y93N3Li4uLW/QmClARiVIyAdoHeBfAzPoA+wCXuftiYApwQAR1PAGcFd6NPxBYE3X7J+gmkohEK5mbSDVAB2AjcBDwnruvCpdtALZvbgdm9hAwAigys0rg34ECAHe/G3gaOBZYHu7znBZ9iiSpDVREopRMgP4duM7M/gxMBp5MWLYX217eN8jdT2tmuQM/S6KWVtElvIhEKZlL+EuA/YBXCc4Ob0xYdibw32moKy0UoCISpWTOQPOBCQTPvDvQ3cy6h8v+mKa60kJtoCISpWQC9BO2fd69fuchThCyOU9toCISpWQu4d8GlgFXAf0Ibv4kDh3SVVzUdAkvIlFqNkDdfT/gVGBHgnbQp4HxQAd33+ruW9NbYnQUoCISpaR6Y3L3Re7+C4Iz0GnAccDKdPWalC4KUBGJUkt/E6kEOAwYBiwAonoWPiN0E0lEotTsTSQz2xE4DTgb6ArcDxzq7p+mubbI6SaSiEQpmbvwXwAfEwTnG+G8Pcxsj9oV3P3FNNQWOV3Ci0iUkgnQfwCFwAXhUJ8DP4yyqHRRgIpIlJLpD7RfBurICLWBikiUWnoTqU1TG6iIRCmWAaozUBGJggJURCRFClARkRTFKkB1E0lEohSrANVNJBGJUiwDVGegIhIFBaiISIpiFaBqAxWRKMUqQNUGKiJRimWA6gxURKKgABURSVGsAlRtoCISpVgFqNpARSRKsQxQnYGKSBQUoCIiKVKAioikKFYBqptIIhKlWAWobiKJSJQyFqBmNtrM3jez5WY2pYHlE8ysyswWhsP5UdegS3gRiVIyv8rZamaWD9wJjAQqgbfM7Al3X1Jv1UfcfVK66lCAikiUMnUGuj+w3N0/cvfNwMPAmAy9d53aAN26NdPvLCLtUaYCtBfwWcJ0ZTivvlPM7B0zm2VmfaIuYrvwfFsBKiJRyKWbSE8C/dx9X+A54M8NrWRmE82swswqqqqqWvQGtQFaU9O6QkVEIHMB+jmQeEbZO5xXx92r3X1TOPknYEhDO3L3e9y93N3Li4uLW1REbYBu2dKizUREGpSpAH0LKDGz/mbWARgPPJG4gpntkjB5ArA06iIKCoJXnYGKSBQychfe3WvMbBLwLJAPTHf3xWY2Fahw9yeAi83sBKAGWAVMiLoOXcKLSJQyEqAA7v408HS9eVcnjP8K+FU6a9AlvIhEKZduIqVdXl4w6AxURKIQqwCF4CxUASoiUYhlgOoSXkSiELsALSjQGaiIRCN2AapLeBGJSiwDVJfwIhKF2AWoLuFFJCqxC1BdwotIVGIZoLqEF5EoxDJAdQYqIlGIXYCqDVREohK7ANUlvIhEJZYBqjNQEYlC7AK0QwfYvDnbVYhIexC7AO3UCTZsyHYVItIexC5AO3eG9euzXYWItAcKUBGRFMUuQHUJLyJRiV2A6gxURKIS2wB1z3YlItLWxTJAv/0WNm1qfl0RkabELkC7dQte16zJbh0i0vbFLkB79QpeKyuzW4eItH2xC9C+fYPXTz/Nbh0i0vbFK0BvvZU9F/8VM3jnnWwXIyJtXbwC9I476PzfjzF8ONx5J7z4ou7Gi0jq4hWg4XeY7ror+EL9kUdCeTn84Q/wxRfZLk5E2ppYBujAgbBkCdx9d9A36MUXQ+/esP/+cMUV8NRT8NVX2S5WRHKdeRu+hi0vL/eKiorkNxg5Etatg9df32b20qUwaxbMmQNz537X3d1OO8E++8DAgbDbbsHQty/06QM9ewa924tI+2Nm89y9vNn1YhWg550Hs2dDVRXk5ze4yr/+BW+8AQsWwLvvBsN77zX8+Ge3bkGQ1g49egQnuZ07Q5cu3x/v1Cnoj7RDB+jYsfnxDh2CMs1SPEAikpJkA3S7TBQDYGajgduAfOBP7n5DveUdgZnAEKAaGOfun0RaxHHHwfTpcMEFMHRokFB527ZibA8cDhy+A3BoMLgHAbpqFVRXw9dfwzfrYP06WLc+fP0g6KRk06Zg+HITbP32+yU4LU/DPAvKTHbIzwfL+267/PzgtXYeBKFcf4DvDoflgTWwHhbNPmrnJzLb9rX+/NpDZ/XnJ7tevflR7KO1tSa7fX0t+Z9qQ+u2dvtgQVKzInm/1m6/+4VHs3PZ
rsnvJEkZCVAzywfuBEYClcBbZvaEuy9JWO084Gt338PMxgM3AuMiLWTMmCA8Z8wIhiQZ0CUc+kZaUJIc2BoOItJi83af03YDFNgfWO7uHwGY2cPAGCAxQMcA14Tjs4A7zMw8yjaGvDy45x64/fbgWc5NmzL7PaY23FzSntX+Z2npa6a2b6ze5uY1NT8d+83levcetFPyO26BTAVoL+CzhOlK4IDG1nH3GjNbA/QEor8fXlgYDCIkXGZntQppi9rc15jMbKKZVZhZRVVVVbbLEZEYy1SAfg70SZjuHc5rcB0z2w7oTnAzaRvufo+7l7t7eXFxcZrKFRFpXqYC9C2gxMz6m1kHYDzwRL11ngDODsdPBV6MtP1TRCRiGWkDDds0JwHPEnyNabq7LzazqUCFuz8B3Avcb2bLgVUEISsikrMy9j1Qd38aeLrevKsTxjcCYzNVj4hIa7XpJ5HMrApY0cLNikjHnf3oqc7otZVaVWf0Wlrrbu7e7E2WNh2gqTCzimQe0co21Rm9tlKr6oxeumptc19jEhHJFQpQEZEUxTFA78l2AUlSndFrK7WqzuilpdbYtYGKiEQljmegIiKRiE2AmtloM3vfzJab2ZRs1wNgZp+Y2btmttDMKsJ5O5rZc2a2LHzdIZxvZnZ7WP87ZlaWxrqmm9mXZrYoYV6L6zKzs8P1l5nZ2Q29VxrqvMbMPg+P6UIzOzZh2a/COt83s1EJ89P6t2FmfczsJTNbYmaLzeyScH5OHdMm6szFY1poZnPN7O2w1t+E8/ub2Zvh+z4SPvmImXUMp5eHy/s19xmS4u7tfiB4+ulD4IdAB+BtYGAO1PUJUFRv3k3AlHB8CnBjOH4s8AxBp0EHAm+msa5DgTJgUap1ATsCH4WvO4TjO2SgzmuAyxtYd2D4370j0D/8e8jPxN8GsAtQFo53BT4I68mpY9pEnbl4TA3oEo4XAG+Gx+r/AePD+XcDF4bjFwF3h+PjgUea+gzJ1hGXM9C6/kjdfTNQ2x9pLhoD/Dkc/zNwYsL8mR54A+hhZrukowB3f5ngcdrW1DUKeM7dV7n718BzwOgM1NmYMcDD7r7J3T8GlhP8XaT9b8PdV7r7/HD8G2ApQfeNOXVMm6izMdk8pu7u68LJgnBw4AiC/oTh+8e09ljPAo40M2viMyQlLgHaUH+kTf1hZIoDc8xsnplNDOft5O4rw/F/ALU9wWb7M7S0rmzWOym89J1ee1ncRD0ZrTO8dNyP4IwpZ49pvTohB4+pmeWb2ULgS4L/mXwIrHb3mgbed5v+hoHa/oZbVWtcAjRXHeLuZcAxwM/M7NDEhR5cY+Tc1yRyta7QXcDuQCmwEviP7JbzHTPrAjwGXOruaxOX5dIxbaDOnDym7r7V3UsJusfcH9gr0zXEJUCT6Y8049z98/D1S2A2wR/BP2svzcPXL8PVs/0ZWlpXVup193+G/7C+Bf4v312OZbVOMysgCKUH3f2v4eycO6YN1Zmrx7SWu68GXgKGETR31HaSlPi+jfU33Kpa4xKgyfRHmlFm1tnMutaOA0cDi9i2X9SzgcfD8SeAs8I7tAcCaxIu/zKhpXU9CxxtZjuEl3xHh/PSql678EkEx7S2zvHh3dj+QAkwlwz8bYRtbfcCS919WsKinDqmjdWZo8e02Mx6hOPbE/xg5VKCID01XK3+MW2ov+HGPkNyorwzlssDwZ3NDwjaSa7MgXp+SHD3721gcW1NBO0yLwDLgOeBHf27u453hvW/C5SnsbaHCC7VthC0CZ2XSl3AuQSN8suBczJU5/1hHe+E/zh2SVj/yrDO94FjMvW3ARxCcHn+DrAwHI7NtWPaRJ25eEz3BRaENS0Crk74dzU3PD6PAh3D+YXh9PJw+Q+b+wzJDHoSSUQkRXG5hBcRiZwCVEQkRQpQEZEUKUBFRFKkABURSZECVHJC2OPPA2nYr5vZHhHvs6+ZrTOz/Cj3K22PAlQyysxON7OKMIBWmtkzZnZItutqCXf/1N27uPvW5tY1s35hiGfsJ8Qlc/QfVTLGzH5O0G3bTwmeoNlM0JvQGGB9FksTSYnOQCUjzKw7MBX4mbv/1d3Xu/sWd3/S3X8RrtbBzGaa2TdhJ7nlCdvvamaPmVmVmX1sZhcnLMs3s1+b2YfhtvPMrE+9EjCzQ8zsMzMbEU67mV1sZh+Z2VdmdrOZ5YXL8szsKjNbYUGnzTPDz/C9s0oz+x8zu9bMXg3ff46ZFYVv+3L4ujo86x4W7ZGVbFKASqYMI3icbnYT65xA0HdkD4JHBu+AIMyAJwkee+0FHAlcat/1Hv5z4DSCxwe7ETzuuCFxx2Y2muDRz1Pc/X8SFp0ElBN0zDwm3BZgQjgcTvB4YJfaehpxOnAO8AOCToQvD+fX9rDVI7zsf72JfUgbowCVTOkJfOXf9dXYkP9196fDtsX7gR+F84cCxe4+1d03u/tHBL0CjQ+Xnw9c5e7ve+Btd69O2O9Y4D8JnnOu31HEjR50UPwpcCtBEAOcAUzzoFPgdcCvCDqdaKzZa4a7f+Du/yLoFb20meMh7YDaQCVTqoEiM9uuiRD9R8L4BqAwDKzdgF3NbHXC8nzglXC8D0FnEI25lKCH90UNLEvsTHcFsGs4vms4nbhsO77r9Li52rs0UY+0EzoDlUx5HdjEdz+x0BKfAR+7e4+Eoau7H5uwfPcmth8LnGjhj6TVk9hW2hf4Ihz/giC4E5fVAP9sYe3qracdU4BKRrj7GuBq4E4zO9HMOplZgZkdY2Y3NbP5XOAbM/ulmW0f3jQabGZDw+V/Aq41s5KwD819zaxnwvZfELSbXmJmF9bb9y/C/jX7AJcAj4TzHwL+LezTsgtwPcEPkTXVBNGQKuBbgnZUaWcUoJIx7v4fBDd8riIIls+AScDfmtluK3AcQbvix8BXBKHZPVxlGkG74xxgLUGnwNvX28enBCE6xczOT1j0ODCPoO/L/wq3BZhO0A77cvieG4HJLfzIuPsG4DrgVTNbHXaQLO2E+gOV2DIzB0rcfXm2a5G2SWegIiIpUoCKiKRIl/AiIinSGaiISIoUoCIiKVKAioikSAEqIpIiBaiISIoUoCIiKfr/Xu3fHZVX7MsAAAAASUVORK5CYII=\n",
"text/plain": [
"<Figure size 360x216 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"poly_onelayer = SingleLayerNN(X2, y2, activations=(\"linear\"), learning_rate=0.01)\n",
"poly_twolayer = TwoLayerNN(X2, y2, activations=(\"tanh\", \"linear\"), learning_rate=0.001)\n",
"\n",
"training_stats = []\n",
"\n",
"for checkpoint in range(3000):\n",
" # Train 20 epochs at a time, then record progress; total of 50 * 20 = 1000 epochs\n",
" poly_onelayer.train(1)\n",
" poly_twolayer.train(1)\n",
" training_stats.append([MSE(y2, poly_onelayer.forward(X2)), MSE(y2, poly_twolayer.forward(X2))])\n",
"\n",
"print(\"One layer: \", MSE(y2, poly_onelayer.forward(X2)))\n",
"print(\"Two layer: \", MSE(y2, poly_twolayer.forward(X2)))\n",
"print()\n",
"\n",
"# Plot the progress\n",
"training_stats = np.array(training_stats)\n",
"plt.figure(figsize=(5, 3))\n",
"plt.plot(training_stats[:,1], 'b', label='Two layer')\n",
"plt.plot(training_stats[:,0], 'r', label='One layer')\n",
"plt.xlabel('Checkpoint', fontsize=12)\n",
"plt.ylabel('MSE', fontsize=12)\n",
"plt.title('MSE')\n",
"plt.legend()\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This also cleans up the training error curve by taking more gradual steps. \n",
"Lets zoom in on the first 100 epochs:"
]
},
{
"cell_type": "code",
"execution_count": 49,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAVYAAADjCAYAAADe3zzxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzt3Xl4VdW9//H3lxCIDIIkUZGgYA0oogYIyCBOqKhX6sijaAe0LY9VqN7qba16vRZbq9Vrr73V3p/XCoVaamtrHa7WudWqVRMmGUQQB6JYYyggIE0C398fawcOMXPOOTvn5PN6nv2cPa7z3dnwzcpae69t7o6IiCRPl7gDEBHJNkqsIiJJpsQqIpJkSqwiIkmmxCoikmRKrCIiSabEKiKSZEqskrHM7F0zqzazgnrrF5mZm9kgMysys9+b2SdmtsnMlpnZ9Gi/QdF+W+pN58dyQpI1usYdgEg7vQNMA/4bwMyOAHokbJ8PLAEOAv4JHAHsX6+Mvu5em/pQpbNQjVUy3XzgKwnLXwXmJSyPBua6+1Z3r3X3Re7+RFojlE5HiVUy3d+Avc3sMDPLAS4AflVv+11mdoGZHRhLhNLpKLFKNqirtZ4MrAQ+SNg2FXgR+HfgHTNbbGaj6x3/iZltTJgOS0vUkrXUxirZYD7wAjCYPZsBcPd/ANcA10SdXLcDfzSzooTdCtTGKsmkGqtkPHd/j9CJdTrwhyb2+4SQWA8A+qUnOumMlFglW3wNONHdtyauNLNbzWy4mXU1s97AN4E17l4VS5TSKSixSlZw97fdvayBTT2Ah4CNwFrCbVdfrLfPxnr3sX47xeFKljMNdC0iklyqsYqIJJkSq4hIkimxiogkWVoSq5kNNLPnzWyFmS03sysa2MfM7KdmtsbMlprZyHTEJiKSbOl6QKAWuMrdF0a3vJSb2dPuviJhn9OA4mg6Gvh59CkiklHSkljdfT2wPpr/1MxWAgOAxMR6JjDPw20KfzOzvmbWPzq2QQUFBT5o0KAURi4inVF5efkn7l7Y1uPT/kirmQ0CRgCv1ts0AFiXsFwRrWs0sQ4aNIiysoZuXRQRaTsze689x6e188rMegG/B650981tLGOGmZWZWVllZWVyAxQRSYK0JVYzyyUk1fvdvaHnuT8ABiYsF7HnKEUAuPs97l7q7qWFhW2uqYuIpEy67gow4BfASne/o5HdHgG+Et0dMBbY1FT7qohIR5WuNtYJwJeBN8xscbTuWuBAAHf/H+BxwuhEa4BtwMVpik0ko9TU1FBRUcH27dvjDiXj5eXlUVRURG5ublLLTdddAX8FrJl9HLi8NeW+/357ohLJTBUVFfTu3ZtBgwYR/hiUtnB3qqqqqKioYPDgwUktO6OfvKqshC1b4o5CJL22b99Ofn6+kmo7mRn5+fkpqflndGIFWLIk7ghE0k9JNTlS9XPM+MRaXh53BCKdS1VVFSUlJZSUlLD//vszYMCAXcvV1dXtLv+YY45h8eLFze/YgWX0O6+6doWFC+OOQqRzyc/P35X4brzxRnr16sXVV18dc1QtV1tbS9euqU19GV1j7dFDNVaRjuJHP/oRd999NwCzZs3ilFNOAeCpp57iq1/9KgC/+tWvOOKIIxg+fDjXXntts2XOmDGD0tJSDj/8cGbPnr2rvPPOO2/XPk888QRTp07dNT9u3DhGjhzJ+eefz9at4U09RUVFXHPNNYwYMYKHHnooeSfdiIyusfbsCStWwLZtIcmKdDZXXgnJ/qu5pAT+679af9zEiRO56667uOyyy1i4cCE1NTXs2LGDF198kWOPPZaKigquv/56ysrK6NOnDyeddBKPPfYYZ5xxRqNl3nLLLfTr14/a2lpOOOEEzjvvPE466SRmzpxJVVUV+fn5zJkzh0suuYSPP/6YW265hWeffZYePXrwwx/+kDvvvHNXAt93331ZtGhRW38srZLxNdadO2Hp0rgjEZHRo0fz+uuvs3HjRnr16sXo0aNZuHAhL774IhMnTuTVV1/lxBNPpKCggNzcXC688EJeeOGFJstcsGABI0eOZOTIkaxcuZIVK1bQpUsXLrroIn7961+zYcMGysvLOeWUU3j55ZdZsWIF48ePp6SkhPvvv5933313V1nnn39+in8Cu2V0jbWullpeDmPHxhuLSBzaUrNMle7duzNgwADmzZvHhAkTGDJkCM8++yzvvfceQ4YM4Y033mhVeatXr+bOO+/ktddeo2/fvnzpS1/adWvUJZdcwrnnnguEhJmTk4O7c+qppzJ//vwGy+vZs2f7TrAVMrrG2q0bFBSoA0uko5g4cSK33347xx577K6mgdLSUgCOPvponn/+eaqqqqitreU3v/kNxx13XKNlbd68md69e7P33nuzfv16nnzyyV3bBg4cSEFBAbfccgvTp08HYPz48fzlL39h7dq1AGzdupXVq1en7mSbkNE1VoBRo9SBJdJRTJw4kdtuu42xY8eSl5dHbm4uEydOBEIH0k033cTxxx+PuzNlyhT+5V/+pdGyRo4cybBhwzj00EM56KCDmDBhwh7bL7zwQjZv3syQIUMA2G+//fjFL37B+eefv+u2r5tvvpni4uIUnW3jMvr116WlpX7KKWXcdht8+ink5cUdkUjqrVy5ksMOOyzuMGJ36aWXMm7cuF13HLRVQz9PMyt399K2lpnRTQEQaqy1tdDK5hsRyWAlJSWsWrWKadOmxR1KgzK+KWBk9MrBhQth9Oh4YxGR9OjoT2ZlfI110CDYZx+1s4pIx5HxidUs1Fp1Z4CIdBQZn1gBRowIbaw1NXFHIiKSRYm1uhpWrow7EhGRLEqsAGl6DFik06uoqODMM8+kuLiYL3zhC1xxxRVJGTIQYPr06Tz44INJKSsuWZFYhwwJj7eqnVUk9dydc845h7POOovVq1fz1ltvsWXLFq677rq4Q2uR2tralH9HViTWnBw46ijVWEXS4bnnniMvL4+LLw7v+8zJyeEnP/kJ9913H9u2bWPu3Lmcc845nHrqqRQXF/Od73xn17FPPfXUrmH9pk6dypZm3q00e/ZsRo8ezfDhw5kxYwbuzttvv83IuvssCWMK1C2Xl5dz3HHHMWrUKCZPnsz69eFFz8cffzxXXnklpaWl3Hnnncn+kXxOxt/HWmfECJg/P4x21SUrfl2ItEAM4wYuX76cUaNG7bFu77335sADD2TNmjVAuM900aJFdO/enaFDhzJr1iz22msvfvCDH/DMM8/Qs2dPbr31Vu644w5uuOGGRr9r5syZu7Z/+ctf5rHHHmPKlCn06dOHxYsXU1JSwpw5c7j44oupqalh1qxZPPzwwxQWFvLAAw9w3XXXcd999wFQXV1NWVlZe386LZJVifXuu2HtWjjkkLijEencJk2aRJ8+fQAYNmwY7733Hhs3bmTFihW7nvmvrq5m3LhxTZbz/PPP8+Mf/5ht27axYcMGDj/8cKZMmcLXv/515syZwx133MEDDzzAa6+9xqpVq1i2bBknn3wyADt27KB///67y
tKwgW1Q95fBokVKrNKJxDBu4LBhwz7XubR582bef/99DjnkEBYuXEj37t13bcvJyaG2thZ35+STT2bBggUt+p7t27dz2WWXUVZWxsCBA7nxxht3DRt47rnn8v3vf58TTzyRUaNGkZ+fz4cffsjhhx/OK6+80mB5GjawDQ4/XO/AEkmHSZMmsW3bNubNmweEmuFVV13F9OnT6dHEqzzGjh3LSy+9tKu5YOvWrbz11luN7l+XRAsKCtiyZcseyTwvL4/JkyfzzW9+c1db79ChQ6msrNyVWGtqali+fHn7TraNsiaxdu8ekqs6sERSy8x46KGH+N3vfkdxcTFDhgwhLy+Pm2++ucnjCgsLmTt3LtOmTePII49k3LhxvPnmm43u37dvX77xjW8wfPhwJk+ezOh6g4FcdNFFdOnSZde7tbp168aDDz7Id7/7XY466ihKSkp4+eWX23/CbZDxwwYmNkZffDE8/jh89FF41FUkG2nYwOD2229n06ZN3HTTTe0qJxXDBmZNGyuEdta5c2H9ejjggLijEZFUOfvss3n77bd57rnn4g6lQVmVWOuewFq4UIlVJJul4xXW7ZE1bawQHhIwUzuriMQrqxJr797hVislVsl2mdw30pGk6ueYlsRqZveZ2cdmtqyR7ceb2SYzWxxNjT+K0YyRI5VYJbvl5eVRVVWl5NpO7k5VVRV5KXhZXrraWOcCPwPmNbHPi+5+Rnu/aMQIeOAB2LAB+vVrb2kiHU9RUREVFRVUVlbGHUrGy8vLo6ioKOnlpiWxuvsLZjYoHd9V14G1eDGceGI6vlEkvXJzcxk8eHDcYUgTOlIb6zgzW2JmT5jZ4W0tRGOzikjcOsrtVguBg9x9i5mdDvwRKG5oRzObAcwAOPDAAz+3vbAQior0aKuIxKdD1FjdfbO7b4nmHwdyzaygkX3vcfdSdy8tLCxssLwRI1RjFZH4dIjEamb7m4WHUM1sDCGuqraWN3IkrFoFW7cmK0IRkZZLS1OAmS0AjgcKzKwC+A8gF8Dd/wc4D/immdUCnwEXeDvuJRkxIgx4vXQpNDPco4hI0qXrroBpzWz/GeF2rKRI7MBSYhWRdOsQTQHJNnAg5OernVVE4pGVidUs1Fp1Z4CIxCErEyuExLpsGdTUxB2JiHQ2WZ1Yq6thxYq4IxGRziZrE2vdywXVHCAi6Za1ibW4OAwjWF4edyQi0tlkbWLt0iU0Byixiki6ZW1iBRg1CpYsgdrauCMRkc4k6xPrZ5/BypVxRyIinUnWJ1ZQc4CIpFdWJ9YhQ6BXLyVWEUmvrE6s6sASkThkdWKFcD/r4sXqwBKR9Mn6xFrXgfXmm3FHIiKdRadIrKDmABFJn6xPrEOHQs+eSqwikj5Zn1hzcqCkRIlVRNIn6xMrhOaAxYthx464IxGRzqBTJNbSUti2TU9giUh6dIrEOmZM+Hz99XjjEJHOoVMk1uJi6NMHXn017khEpDPoFIm1SxcYPRpeey3uSESkM+gUiRVCc8DSpeFhARGRVOo0ifXoo8NdAXoltoikWqdJrKNHh081B4hIqnWaxNq/PwwcqA4sEUm9ZhOrmU2ttzy03vKVyQ4qVcaMUY1VRFKvJTXWX9RbfqXe8uwkxZJyY8bA2rXwySdxRyIi2awlidVaudxh6UEBEUmHliRWb+Vyh1VaGu5pVXOAiKRSizqvLOhiZjkNLbfg+PvM7GMzW9ZE+T81szVmttTMRrb8FFquVy8YNkwdWCKSWi1JrL2AWqAGqAb6JizXAD1bUMZc4NQmtp8GFEfTDODnLSizTcaMCYnVM6aeLSKZpiWJdTBwcMI0uIH5Jrn7C8CGJnY5E5jnwd+AvmbWvwWxtdr48bBhA6xalYrSRUSga3M7uPt7Da03s33c/R9JimMAsC5huSJatz5J5e8yYUL4fOklOPTQZJcuItKy+1i/YmaTE5ZLzWwd8ImZrap/X2uqmdkMMyszs7LKyspWHz90KOTnw1//moLgRERoWVPA1cBHCcv3AM8AR0aftyUhjg+AgQnLRdG6z3H3e9y91N1LCwsLW/1FZqHW+tJLbQtURKQ5LUmsA4E3AMxsIHAEcJW7LweuAY5OQhyPAF+J7g4YC2xy96Q3A9SZMAFWr4aPP07VN4hIZ9aSxFoLdIvmxwNvuntdR9Q2YK/mCjCzBYQntoaaWYWZfc3MLjWzS6NdHgfWAmuA/wUua8U5tFpdO+vLL6fyW0Sks2q28wr4C/BDM/slMAt4NGHboezZTNAgd5/WzHYHLm9BLElRWgrdu4d21rPOSte3ikhn0ZIa6xXACOAlQg311oRtXwb+lIK4Uqp795Bc1c4qIqnQkhprDjCdMCaAA33MrE+07e4UxZVyEybAT34S3iiwV7ONGSIiLdeSGuu7wDuENtB3ouXE6Z1UBJZqEyZATQ2UlcUdiYhkm5Yk1iXAauB6YBCQW2/q1uiRHdj48eFT97OKSLI1m1jdfQRwHtCP0M76OHAB0M3dd7j7jtSGmBoFBeHJqxdfjDsSEck2LRrdyt2Xufu/EWqsdwBnAOtTNQpVupxwQkisNTVxRyIi2aS177wqBo4DxgGLgGSNFRCLSZNgyxYNfC0iydWSsQL6mdnlZvYa8EdgC3Csu5/g7hnZcVXn+OPDI67PPht3JCKSTVpyu9WHhJ7/+cDfonWHmNkhdTu4+3MpiC3l8vOhpCQk1n//97ijEZFs0ZLE+hGQB3wjmupzWjAma0c1aRL89KewbRv06BF3NCKSDVpyV8Agdx/cxJSxSRXgxBOhulpPYYlI8rS28yrrTJwIXbuqnVVEkqfTJ9ZevWDsWHguI1uJRaQj6vSJFUJzQHk5bNwYdyQikg2UWAkdWDt3wp//HHckIpINlFgJTQE9e8KTT8YdiYhkAyVWoFs3OOUUeOwxcI87GhHJdEqskSlToKICliyJOxIRyXRKrJHTTw+Ptz76aPP7iog0RYk1st9+MGaMEquItJ8Sa4IpU8JIVx81+3pEEZHGKbEmmDIlfP7f/8Ubh4hkNiXWBEccAQMHqjlARNpHiTWBWai1Pv00bN8edzQikqmUWOuZMiUMIfj003FHIiKZSom1nkmTwgDYCxbEHYmIZCol1npyc2HqVHj44fA+LBGR1lJibcCFF4bmgEceiTsSEclESqwNmDAh3B3w61/HHYmIZCIl1gZ06QLTpoXRrj75JO5oRCTTpC2xmtmpZrbKzNaY2TUNbJ9uZpVmtjiavp6u2Bpy4YVQWwsPPhhnFCKSidKSWM0sB7gLOA0YBkwzs2EN7PqAu5dE073piK0xRx4Jw4bB/ffHGYWIZKJ01VjHAGvcfa27VwO/Ac5M03e3iRlcdBH89a+wenXc0YhIJklXYh0ArEtYrojW1XeumS01swfNbGB6QmvcxReHN7jefXfckYhIJulInVePAoPc/UjgaeCXDe1kZjPMrMzMyiorK1MaUP/+4Z7WOXN0T6uItFy6EusHQGINtChat4u7V7n7P6PFe4FRDRXk7ve4e6m7lxYWFqYk2EQzZ8KmTWprFZGWS1difR0oNrPBZtYN
uADY4/Z7M+ufsPhFYGWaYmvSuHEwYgT893/rfVgi0jJpSazuXgvMBJ4kJMzfuvtyM5ttZl+MdvuWmS03syXAt4Dp6YitOWYwaxYsXw5/+Uvc0YhIJjDP4GpYaWmpl5WVpfx7PvsMiorguOPgD39I+deJSMzMrNzdS9t6fEfqvOqw9toLLr0U/vhHeOONuKMRkY5OibWFrroKeveG//iPuCMRkY5OibWF+vWDb38bHnoIysvjjkZEOjIl1la48sqQYG+4Ie5IRKQjU2JthT594Dvfgccfh5dfjjsaEemolFhbaeZM2G+/0CywY0fc0YhIR6TE2ko9e8Ltt8Orr8LPfx53NCLSESmxtsFFF8HkyfC978G6dc3vLyKdixJrG5iF2urOnXD55XrUVUT2pMTaRoMHw+zZ8OijejeWiOxJibUdrrgCjjkGZsyApUvjjkZEOgol1nbo2hV+9zvo2xfOPhs2bIg7IhHpCJRY22n//cMLB9etCy8g1C1YIqLEmgTjxsHPfhZelz19upKrSGfXNe4AssWMGVBZCddfH5bnzoWcnFhDEpGYKLEm0XXXhVuxrrsu1FrnzIHu3eOOSkTSTYk1ya69NtRUr7kG1qyB3/8eBsb+vlkRSSe1sabAd78b3jTw5pswciQ89VTcEYlIOimxpsjZZ8Prr8O++4bHX7/0Jfj73+OOSkTSQYk1hYYODcn1uuvgt78Ny7fdBp9+GndkIpJKSqwp1qMH/OAH4V1ZY8eG8VwPOgi+/31Yvz7u6EQkFZRY02ToUPjTn+Bvf4Njj4Ubbwxvfj3tNFiwADZujDtCEUkW3RWQZkcfHd72+tZb8Mtfwvz54YmtnBwYPz60x44fD6NHQ69ecUcrIm1hnsFj3pWWlnpZWVncYbTLzp3hNS9PPBGmRYvC+i5d4LDD4IgjYPjwUOM9+OAw9e0bb8wi2c7Myt29tM3HZ3RiHTDAyz74IO4wkmrDhvB2gldegcWLQ9vsu+/uuU+vXjBgABxwQLjrYN99oaAA9tknvOywb1/Ye+/wuu7evcNbD3r2DO29ehpMpHmdO7Gaedm998LXvhZ3KCn16aewdi28/Xb4/OADqKiADz8Mj9F+/DFs2tSysnJzYa+9wpSXF6bu3XdP3brt/uzWLexf91k3de36+fnEz/rzXbuGhJ643NT6unU5OXvO19+/bntOTnjiTSRZOndi7dPHyz77DJ55JvQIdWI1NaED7B//CEl28+YwbdkCW7eGz88+g23bwrR9++7pn//cPVVXh8+amjBfXb17vqYmTLW1u+c7ii5d9ky09RNvQ+sa2qehY+qX3Zqp/rFtXW5ufSo/66bE5Wz/RdbexJrZnVcHHxyyxTnnwEsvhYbITio3FwoLw5Qu7mFMhNra3cl2x47dSbduW+J8be3ufeqvS9y3bjnxs6l1icv1921qXUP71MW3fXvYvnNnw/u0ZEo8NoPrMJ9j9vnE21RSbm5dQ9saWm/WdHnNTfV/OTS2rb0yO7Hm5IR3oxx9NBx6aGh4POqo0OtzyCFhOuig8LB+Xl7c0WYds91/lkvz3PdMtPUTdkMJfOfOpvdL3N7c+rYeV/cLtKFfFPX3aWx7Yrn11ze3vW6+7pdcY/smxlB/fUPnVFdO4vnVrW+vzP8vUVwcensefhiWLAnTc8+F6kaiwkLo3z9M++0XensKCiA/P/T27LNP6PGp6/Xp1Sv0+OTmxnNeknXqannqQOz42tvUkbbEamanAncCOcC97n5Lve3dgXnAKKAKON/d321R4cXFcPXVu5d37gw9O2vWwPvvh2nduvCo00cfwYoVUFUVGhubk9jbU9fjU7+3p6HenYZ6WBpqIKubT/y7xGz3cuJn/Sn84D4/n/ivoqH9EtfveRGa3t6S41oinQ102d4YKB1SWhKrmeUAdwEnAxXA62b2iLuvSNjta8A/3P0QM7sAuBU4v01f2KVLeKypqKjp/bZtC709db0+mzeHLvjNm3f3+GzdurvXp66np+6zujps27x5d8NiYu9OQ4149f9WyrbGNxFJW411DLDG3dcCmNlvgDOBxMR6JnBjNP8g8DMzM0/lbQs9eoRpwICUfUWLuO+eEht66tYlztdNicfVLyOx3Pr7Ja6vH0NT21tyXEuk85eIfmFJWw0e3K7D05VYBwDrEpYrgKMb28fda81sE5APfJKWCOOU+Ge6GuBEMl7GDcJiZjPMrMzMyiorK+MOR0Tkc9KVWD8AEl9QUhSta3AfM+sK9CF0Yu3B3e9x91J3Ly1M502bIiItlK7E+jpQbGaDzawbcAHwSL19HgG+Gs2fBzyX0vZVEZEUSUsba9RmOhN4knC71X3uvtzMZgNl7v4I8AtgvpmtATYQkq+ISMZJ232s7v448Hi9dTckzG8HpqYrHhGRVMnoQVjM7FNgVdxxpFAB2X1XhM4vc2XzuQEMdffebT040x9pXdWeEWg6OjMr0/llrmw+v2w+Nwjn157jM+52KxGRjk6JVUQkyTI9sd4TdwAppvPLbNl8ftl8btDO88vozisRkY4o02usIiIdTsYmVjM71cxWmdkaM7sm7njaw8wGmtnzZrbCzJab2RXR+n5m9rSZrY4+94k71vYwsxwzW2Rmj0XLg83s1egaPhA9lZeRzKyvmT1oZm+a2UozG5dN18/M/jX6t7nMzBaYWV4mXz8zu8/MPjazZQnrGrxeFvw0Os+lZjayufIzMrEmjO96GjAMmGZmw+KNql1qgavcfRgwFrg8Op9rgGfdvRh4NlrOZFcAKxOWbwV+4u6HAP8gjMmbqe4E/uTuhwJHEc4zK66fmQ0AvgWUuvtwwtOTdWMmZ+r1mwucWm9dY9frNKA4mmYAP2+2dHfPuAkYBzyZsPw94Htxx5XE83uYMCj4KqB/tK4/4b7d2ONr4zkVRf9YTwQeA4xwg3nXhq5pJk2EAYPeIeqzSFifFdeP3UN69iPc+/4YMDnTrx8wCFjW3PUC/h8wraH9GpsyssZKw+O7xjxadXKY2SBgBPAqsJ+7r482fQTsF1NYyfBfwHeAule15QMb3b02Ws7kazgYqATmRE0d95pZT7Lk+rn7B8DtwPvAemATUE72XL86jV2vVuebTE2sWcnMegG/B650982J2zz8qszIWzjM7AzgY3cvjzuWFOkKjAR+7u4jgK3U+7M/w6/fPoQ3fAwGDgB68vk/o7NKe69XpibWlozvmlHMLJeQVO939z9Eq/9uZv2j7f2Bj+OKr50mAF80s3eB3xCaA+4E+kZj70JmX8MKoMLdX42WHyQk2my5ficB77h7pbvXAH8gXNNsuX51Grterc43mZpYWzK+a8YwMyMMm7jS3e9I2JQ4Ru1XCW2vGcfdv+fuRe4+iHCtnnP3i4DnCWPvQmaf30fAOjMbGq2aRHifW1ZcP0ITwFgz6xH9W607v6y4fgkau16PAF+J7g4YC2xKaDJoWNwNyO1oeD4deAt4G7gu7njaeS7HEP7sWAosjqbTCe2QzwKrgWeAfnHHmoRzPR54LJo/GHgNWAP8Duged3ztOK8SoCy6hn8E9smm6wd
8H3gTWAbMB7pn8vUDFhDai2sIf3F8rbHrRehovSvKNW8Q7o5osnw9eSUikmSZ2hQgItJhKbGKiCSZEquISJIpsYqIJJkSq4hIkimxSodgZjea2a9SUK6b2SFJLvNAM9sSDQYk8jlKrJJWZnahmZVFiWm9mT1hZsfEHVdruPv77t7L3Xc0t6+ZDYqSe6a/uFNaQRdb0sbMvk14hv5S4EmgmvDM+ZmE5+tFsoJqrJIWZtYHmA1c7u5/cPet7l7j7o+6+79Fu3Uzs3lm9mk0qHJpwvEHmNnvzazSzN4xs28lbMsxs2vN7O3o2HIzG1gvBMzsGDNbZ2bHR8tuZt8ys7Vm9omZ3WZmXaJtXczsejN7LxoQeV50Dp+rhZrZn83sJjN7Kfr+p8ysIPraF6LPjVEtfVxyf7LSESmxSrqMA/KAh5rY54uEQVr6Ep7P/hmEJAc8CiwhDNc2CbjSzCZHx30IsZpxAAACU0lEQVQbmEZ4DHhv4BJgW2LBZnYq4THGc939zwmbzgZKCYOmnBkdCzA9mk4gPLrZqy6eRlwIXAzsC3QDro7WHxt99o2aD15pogzJEkqski75wCe+e/zOhvzV3R+P2i7nE0biBxgNFLr7bHevdve1wP8SBnQB+Dpwvbuv8mCJu1cllDuVMFjxae7+Wr3vvNXdN7j7+4QxY6dF6y8C7nD3te6+hTCY+gVNtJXOcfe33P0z4LeEsQOkk1Ibq6RLFVBgZl2bSK4fJcxvA/KiRHYQcICZbUzYngO8GM0PJAyQ0ZgrgXnuvqyBbYkDGL9HGG+U6PO9etu60vhg1fVj79VEPJLlVGOVdHkF+CdwVhuOXUcYD7RvwtTb3U9P2P6FJo6fCpxl0Usa60lsiz0Q+DCa/5CQ0BO31QJ/b2XsGuWoE1JilbRw903ADcBdZnZWNLZnrpmdZmY/bubw14BPzey7ZrZX1Fk13MxGR9vvBW4ys+JozMwjzSw/4fgPCe2yV5jZN+uV/W9mtk/U2XUF8EC0fgHwr9GYv72Am4EHmmnKaEgl4XU0B7fyOMlgSqySNu7+n4SOpusJCWcdMJMwfmlTx+0AziC0W75DeIndvYSX+AHcQWjXfArYTBg0fK96ZbxPSK7XmNnXEzY9THh/02Lg/6JjAe4jtPO+EH3ndmBWK08Zd98G/BB4ycw2RgMlS5bTeKzSaZmZA8XuvibuWCS7qMYqIpJkSqwiIkmmpgARkSRTjVVEJMmUWEVEkkyJVUQkyZRYRUSSTIlVRCTJlFhFRJLs/wNAJdspo+fA2wAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 360x216 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"plt.figure(figsize=(5, 3))\n",
"plt.plot(training_stats[:,1], 'b', label='Two layer')\n",
"plt.plot(training_stats[:,0], 'r', label='One layer')\n",
"plt.xlabel('Checkpoint', fontsize=12)\n",
"plt.ylabel('MSE', fontsize=12)\n",
"plt.title('MSE')\n",
"plt.xlim(0, 100)\n",
"plt.legend()\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Huge Caveat!\n",
"\n",
"Remember that what we've been examining here is the **training error** -- that is, we're measuring the loss against the training set itself. \n",
"We have no idea whether or not we may be *overfitting* the training data. To do that, you need to use training and validation sets (and cross-validation) like in our previous discussions!"
]
},
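{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a rough illustration (not one of the experiments above), the sketch below holds out part of the data for validation. It assumes `X2` and `y2` are NumPy arrays and reuses the `TwoLayerNN` and `MSE` definitions from earlier in this notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal hold-out sketch: shuffle the rows, keep ~80% for training and\n",
"# ~20% for validation (assumes X2, y2, TwoLayerNN, and MSE from above).\n",
"idx = np.random.permutation(len(X2))\n",
"split = int(0.8 * len(X2))\n",
"X_train, y_train = X2[idx[:split]], y2[idx[:split]]\n",
"X_val, y_val = X2[idx[split:]], y2[idx[split:]]\n",
"\n",
"model = TwoLayerNN(X_train, y_train, activations=(\"tanh\", \"linear\"), learning_rate=0.001)\n",
"for epoch in range(3000):\n",
"    model.train(1)\n",
"\n",
"# A validation MSE much worse than the training MSE suggests overfitting.\n",
"print(\"Train MSE:     \", MSE(y_train, model.forward(X_train)))\n",
"print(\"Validation MSE:\", MSE(y_val, model.forward(X_val)))"
]
},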
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"lines_to_next_cell": 2
},
"outputs": [],
"source": []
}
],
"metadata": {
"jupytext": {
"text_representation": {
"extension": ".py",
"format_name": "light",
"format_version": "1.3",
"jupytext_version": "0.8.1"
}
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.0"
}
},
"nbformat": 4,
"nbformat_minor": 2
}