{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Your very own neural network\n",
"\n",
"In this notebook we're going to build a neural network using naught but pure numpy and steel nerves. It's going to be fun, I promise!\n",
"\n",
"<img src=\"frankenstein.png\" style=\"width:20%\">"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import sys\n",
"sys.path.append(\"..\")\n",
"import tqdm_utils\n",
"import download_utils"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# use the preloaded keras datasets and models\n",
"download_utils.link_all_keras_resources()"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"from __future__ import print_function\n",
"import numpy as np\n",
"np.random.seed(42)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here goes our main class: a layer that can do .forward() and .backward() passes."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"class Layer:\n",
" \"\"\"\n",
" A building block. Each layer is capable of performing two things:\n",
" \n",
" - Process input to get output: output = layer.forward(input)\n",
" \n",
" - Propagate gradients through itself: grad_input = layer.backward(input, grad_output)\n",
" \n",
" Some layers also have learnable parameters which they update during layer.backward.\n",
" \"\"\"\n",
" def __init__(self):\n",
" \"\"\"Here you can initialize layer parameters (if any) and auxiliary stuff.\"\"\"\n",
" # A dummy layer does nothing\n",
" pass\n",
" \n",
" def forward(self, input):\n",
" \"\"\"\n",
" Takes input data of shape [batch, input_units], returns output data [batch, output_units]\n",
" \"\"\"\n",
" # A dummy layer just returns whatever it gets as input.\n",
" return input\n",
"\n",
" def backward(self, input, grad_output):\n",
" \"\"\"\n",
" Performs a backpropagation step through the layer, with respect to the given input.\n",
" \n",
" To compute loss gradients w.r.t input, you need to apply chain rule (backprop):\n",
" \n",
" d loss / d x = (d loss / d layer) * (d layer / d x)\n",
" \n",
" Luckily, you already receive d loss / d layer as input, so you only need to multiply it by d layer / d x.\n",
" \n",
" If your layer has parameters (e.g. dense layer), you also need to update them here using d loss / d layer\n",
" \"\"\"\n",
" # The gradient of a dummy layer is precisely grad_output, but we'll write it more explicitly\n",
" num_units = input.shape[1]\n",
" \n",
" d_layer_d_input = np.eye(num_units)\n",
" \n",
" return np.dot(grad_output, d_layer_d_input) # chain rule"
]
},
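{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (a sketch, not part of the assignment): for the dummy layer, d layer / d x is the identity matrix, so backward must return grad_output unchanged.\n",
"\n",
"```\n",
"l = Layer()\n",
"x = np.ones([2, 3])\n",
"assert np.allclose(l.backward(x, x * 5), x * 5)\n",
"```"
]
},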
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### The road ahead\n",
"\n",
"We're going to build a neural network that classifies MNIST digits. To do so, we'll need a few building blocks:\n",
"- Dense layer - a fully-connected layer, $f(X)=W \\cdot X + \\vec{b}$\n",
"- ReLU layer (or any other nonlinearity you want)\n",
"- Loss function - crossentropy\n",
"- Backprop algorithm - a stochastic gradient descent with backpropageted gradients\n",
"\n",
"Let's approach them one at a time.\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Nonlinearity layer\n",
"\n",
"This is the simplest layer you can get: it simply applies a nonlinearity to each element of your network."
]
},
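{
"cell_type": "markdown",
"metadata": {},
"source": [
"For example (a sketch of the elementwise behaviour): `np.maximum(np.array([-2., 0., 3.]), 0)` gives `[0., 0., 3.]`, and the gradient mask `x > 0` gives `[False, False, True]` - by convention, the derivative at exactly 0 is taken to be 0."
]
},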
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"class ReLU(Layer):\n",
" def __init__(self):\n",
" \"\"\"ReLU layer simply applies elementwise rectified linear unit to all inputs\"\"\"\n",
" pass\n",
" \n",
" def forward(self, input):\n",
" \"\"\"Apply elementwise ReLU to [batch, input_units] matrix\"\"\"\n",
" return np.maximum(input, 0) # maximum for element-wise maxima\n",
" \n",
" def backward(self, input, grad_output):\n",
" \"\"\"Compute gradient of loss w.r.t. ReLU input\"\"\"\n",
" relu_grad = input > 0\n",
" return grad_output * relu_grad "
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# some tests\n",
"from util import eval_numerical_gradient\n",
"x = np.linspace(-1,1,10*32).reshape([10,32])\n",
"l = ReLU()\n",
"grads = l.backward(x,np.ones([10,32])/(32*10))\n",
"numeric_grads = eval_numerical_gradient(lambda x: l.forward(x).mean(), x=x)\n",
"assert np.allclose(grads, numeric_grads, rtol=1e-3, atol=0),\\\n",
" \"gradient returned by your layer does not match the numerically computed gradient\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Instant primer: lambda functions\n",
"\n",
"In python, you can define functions in one line using the `lambda` syntax: `lambda param1, param2: expression`\n",
"\n",
"For example: `f = lambda x, y: x+y` is equivalent to a normal function:\n",
"\n",
"```\n",
"def f(x,y):\n",
" return x+y\n",
"```\n",
"For more information, click [here](http://www.secnetix.de/olli/Python/lambda_functions.hawk). "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Dense layer\n",
"\n",
"Now let's build something more complicated. Unlike nonlinearity, a dense layer actually has something to learn.\n",
"\n",
"A dense layer applies affine transformation. In a vectorized form, it can be described as:\n",
"$$f(X)= W \\cdot X + \\vec b $$\n",
"\n",
"Where \n",
"* X is an object-feature matrix of shape [batch_size, num_features],\n",
"* W is a weight matrix [num_features, num_outputs] \n",
"* and b is a vector of num_outputs biases.\n",
"\n",
"Both W and b are initialized during layer creation and updated each time backward is called."
]
},
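{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick shape check (a sketch with toy sizes, not part of the assignment):\n",
"\n",
"```\n",
"X = np.ones([2, 3])            # batch_size=2, num_features=3\n",
"W = np.zeros([3, 4])           # num_features=3, num_outputs=4\n",
"b = np.zeros(4)                # broadcasts over the batch\n",
"print((X.dot(W) + b).shape)    # (2, 4) = [batch_size, num_outputs]\n",
"```"
]
},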
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"class Dense(Layer):\n",
" def __init__(self, input_units, output_units, learning_rate=0.1):\n",
" \"\"\"\n",
" A dense layer is a layer which performs a learned affine transformation:\n",
" f(x) = <W*x> + b\n",
" \"\"\"\n",
" self.learning_rate = learning_rate\n",
" self.output_units = output_units\n",
" self.input_units = input_units\n",
"\n",
" \n",
" # initialize weights with small random numbers. We use normal initialization, \n",
" # but surely there is something better. Try this once you got it working: http://bit.ly/2vTlmaJ\n",
" self.weights = np.random.randn(input_units, output_units) * 0.01\n",
" self.biases = np.zeros(output_units)\n",
" \n",
" def forward(self,input):\n",
" \"\"\"\n",
" Perform an affine transformation:\n",
" f(x) = <W*x> + b\n",
" \n",
" input shape: [batch, input_units]\n",
" output shape: [batch, output units]\n",
" \"\"\"\n",
" assert input.shape[1] == self.weights.shape[0], 'Input feats are %s and Layer feats are %s' % (input.shape[1], self.weights.shape[0] )\n",
" #dot_product = np.dot(input, self.weights) + self.biases # why dot and not matmul?\n",
" return np.dot(input, self.weights) + self.biases\n",
" \n",
" def backward(self,input,grad_output):\n",
" \"\"\"\n",
" input: X values, or result from a previous layer\n",
" grad_output: incoming gradient, used for chain rule \n",
" \"\"\"\n",
" \n",
" # compute d f / d x = d f / d dense * d dense / d x\n",
" # where d dense/ d x = weights transposed\n",
" axis = 0\n",
" grad_input = np.dot(grad_output, self.weights.T) # dZ.dot(weights transpose)\n",
" \n",
" # compute gradient w.r.t. weights and biases\n",
" grad_weights = np.dot(input.T, grad_output)\n",
" \n",
" #grad_biases = np.ones(input.shape).T.dot(grad_output)\n",
" #grad_biases = np.dot(grad_output, np.ones(input.shape).T)\n",
" grad_biases = grad_output.sum(axis=0)\n",
" #grad_biases = grad_output.mean(axis=0)*input.shape[0]\n",
" #grad_biases = np.sum(grad_output, axis=0)\n",
" # IDEA: WHAT IF WE DO NOT USE BIASES FOR SOME NODES? OR ARE THEY OPTIMISED?\n",
" \n",
" if not grad_weights.shape == self.weights.shape and grad_biases.shape == self.biases.shape:\n",
" print('%s & %s' % (grad_biases.shape, self.biases.shape))\n",
" # raise AssertionError\n",
" # Here we perform a stochastic gradient descent step. \n",
" # Later on, you can try replacing that with something better.\n",
" # print('biases %s' % self.biases)\n",
" self.weights = np.subtract(self.weights, self.learning_rate * grad_weights)\n",
" self.biases = np.subtract(self.biases, self.learning_rate * grad_biases)\n",
" #print('new biases %s' % self.biases)\n",
" return grad_input\n",
" \n",
" def get_weights(self): # add a method for implementing regularisation\n",
" return self.weights\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Testing the dense layer\n",
"\n",
"Here we have a few tests to make sure your dense layer works properly. You can just run them, get 3 \"well done\"s and forget they ever existed.\n",
"\n",
"... or not get 3 \"well done\"s and go fix stuff. If that is the case, here are some tips for you:\n",
"* Make sure you compute gradients for W and b as __sum of gradients over batch__, not mean over gradients. Grad_output is already divided by batch size.\n",
"* If you're debugging, try saving gradients in class fields, like \"self.grad_w = grad_w\" or print first 3-5 weights. This helps debugging.\n",
"* If nothing else helps, try ignoring tests and proceed to network training. If it trains alright, you may be off by something that does not affect network training."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Well done!\n"
]
}
],
"source": [
"l = Dense(128, 150)\n",
"\n",
"assert -0.05 < l.weights.mean() < 0.05 and 1e-3 < l.weights.std() < 1e-1,\\\n",
" \"The initial weights must have zero mean and small variance. \"\\\n",
" \"If you know what you're doing, remove this assertion.\"\n",
"assert -0.05 < l.biases.mean() < 0.05, \"Biases must be zero mean. Ignore if you have a reason to do otherwise.\"\n",
"\n",
"# To test the outputs, we explicitly set weights with fixed values. DO NOT DO THAT IN ACTUAL NETWORK!\n",
"l = Dense(3,4)\n",
"\n",
"x = np.linspace(-1,1,2*3).reshape([2,3])\n",
"l.weights = np.linspace(-1,1,3*4).reshape([3,4])\n",
"l.biases = np.linspace(-1,1,4)\n",
"\n",
"assert np.allclose(l.forward(x),np.array([[ 0.07272727, 0.41212121, 0.75151515, 1.09090909],\n",
" [-0.90909091, 0.08484848, 1.07878788, 2.07272727]]))\n",
"print(\"Well done!\")"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Well done!\n"
]
}
],
"source": [
"# To test the grads, we use gradients obtained via finite differences\n",
"\n",
"from util import eval_numerical_gradient\n",
"\n",
"x = np.linspace(-1,1,10*32).reshape([10,32])\n",
"l = Dense(32,64,learning_rate=0.1)\n",
"\n",
"numeric_grads = eval_numerical_gradient(lambda x: l.forward(x).sum(),x)\n",
"grads = l.backward(x,np.ones([10,64]))\n",
"\n",
"assert np.allclose(grads,numeric_grads,rtol=1e-3,atol=0), \"input gradient does not match numeric grad\"\n",
"print(\"Well done!\")"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Well done!\n"
]
}
],
"source": [
"#test gradients w.r.t. params\n",
"def compute_out_given_wb(w,b):\n",
" l = Dense(32,64,learning_rate=1)\n",
" l.weights = np.array(w)\n",
" l.biases = np.array(b)\n",
" x = np.linspace(-1,1,10*32).reshape([10,32])\n",
" return l.forward(x)\n",
" \n",
"def compute_grad_by_params(w,b):\n",
" l = Dense(32,64,learning_rate=1)\n",
" l.weights = np.array(w)\n",
" l.biases = np.array(b)\n",
" x = np.linspace(-1,1,10*32).reshape([10,32])\n",
" l.backward(x,np.ones([10,64]) / 10.)\n",
" return w - l.weights, b - l.biases\n",
" \n",
"w,b = np.random.randn(32,64), np.linspace(-1,1,64)\n",
"\n",
"numeric_dw = eval_numerical_gradient(lambda w: compute_out_given_wb(w,b).mean(0).sum(),w )\n",
"numeric_db = eval_numerical_gradient(lambda b: compute_out_given_wb(w,b).mean(0).sum(),b )\n",
"grad_w,grad_b = compute_grad_by_params(w,b)\n",
"\n",
"assert np.allclose(numeric_dw,grad_w,rtol=1e-3,atol=0), \"weight gradient does not match numeric weight gradient\"\n",
"assert np.allclose(numeric_db,grad_b,rtol=1e-3,atol=0), \"bias gradient does not match numeric weight gradient\"\n",
"print(\"Well done!\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### The loss function\n",
"\n",
"Since we want to predict probabilities, it would be logical for us to define softmax nonlinearity on top of our network and compute loss given predicted probabilities. However, there is a better way to do so.\n",
"\n",
"If you write down the expression for crossentropy as a function of softmax logits (a), you'll see:\n",
"\n",
"$$ loss = - log \\space {e^{a_{correct}} \\over {\\underset i \\sum e^{a_i} } } $$\n",
"\n",
"If you take a closer look, ya'll see that it can be rewritten as:\n",
"\n",
"$$ loss = - a_{correct} + log {\\underset i \\sum e^{a_i} } $$\n",
"\n",
"It's called Log-softmax and it's better than naive log(softmax(a)) in all aspects:\n",
"* Better numerical stability\n",
"* Easier to get derivative right\n",
"* Marginally faster to compute\n",
"\n",
"So why not just use log-softmax throughout our computation and never actually bother to estimate probabilities.\n",
"\n",
"Here you are! We've defined the both loss functions for you so that you could focus on neural network part."
]
},
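{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick sketch of the stability claim (illustrative only): with large logits, the naive form overflows, while log-softmax survives once you subtract the max logit (it is shift-invariant).\n",
"\n",
"```\n",
"a = np.array([1000., 1001., 1002.])\n",
"# naive: np.log(np.exp(a) / np.exp(a).sum()) -> exp overflows, result is nan\n",
"stable = (a - a.max()) - np.log(np.exp(a - a.max()).sum())\n",
"print(stable)  # finite: approx [-2.408, -1.408, -0.408]\n",
"```"
]
},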
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def softmax_crossentropy_with_logits(logits,reference_answers, regularise=False, alpha=0.01, weights=None):\n",
" \"\"\"Compute crossentropy from logits[batch,n_classes] and ids of correct answers\"\"\"\n",
" \n",
" logits_for_answers = logits[np.arange(len(logits)),reference_answers]\n",
" \n",
" xentropy = - logits_for_answers + np.log(np.sum(np.exp(logits),axis=1))\n",
" \n",
" if regularise: # add l2 regularisation. Note adding this to gradient w.r.t last activattion propagates it across all net\n",
" assert weights is not None, 'Missing weights'\n",
" xentropy = xentropy + alpha * np.sum(weights**2, axis=None) #first across all columns, then all rows\n",
" \n",
" return xentropy\n",
"\n",
"def grad_softmax_crossentropy_with_logits(logits,reference_answers,regularised=False, alpha=0.01, weights=None):\n",
" \"\"\"Compute crossentropy gradient from logits[batch,n_classes] and ids of correct answers\"\"\"\n",
" \n",
" ones_for_answers = np.zeros_like(logits)\n",
" ones_for_answers[np.arange(len(logits)),reference_answers] = 1\n",
" \n",
" softmax = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)\n",
" grad = (- ones_for_answers + softmax ) / logits.shape[0]\n",
" \n",
" if regularised: \n",
" assert weights is not None, 'Missing weights'\n",
" grad += alpha * 2 * np.sum(weights, axis=None) / logits.shape[0]\n",
" \n",
" \n",
" return grad"
]
},
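{
"cell_type": "markdown",
"metadata": {},
"source": [
"For reference, differentiating the log-softmax loss w.r.t. each logit $a_i$ gives exactly what the gradient code above implements (before averaging over the batch):\n",
"\n",
"$$ {\\partial \\space loss \\over \\partial a_i} = {e^{a_i} \\over {\\underset j \\sum e^{a_j}}} - [i = correct] = softmax(a)_i - [i = correct] $$"
]
},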
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"logits = np.linspace(-1,1,500).reshape([50,10])\n",
"answers = np.arange(50)%10\n",
"\n",
"softmax_crossentropy_with_logits(logits,answers)\n",
"grads = grad_softmax_crossentropy_with_logits(logits,answers)\n",
"numeric_grads = eval_numerical_gradient(lambda l: softmax_crossentropy_with_logits(l,answers).mean(),logits)\n",
"\n",
"assert np.allclose(numeric_grads,grads,rtol=1e-3,atol=0), \"The reference implementation has just failed. Someone has just changed the rules of math.\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Full network\n",
"\n",
"Now let's combine what we've just built into a working neural network. As we announced, we're gonna use this monster to classify handwritten digits, so let's get them loaded."
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Using TensorFlow backend.\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXAAAAF1CAYAAADx1LGMAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xu0VXW5//HPA0Le8gIWEoiYA2mQQzGRyEgpsIx0iJkU\nQwWHHnEMpaMN86f5w9RKD+WlvCdHkYsetQ4RZJp6EDWHxhENFUHU/AlBCN4QUMuA5/fHmoy2+/vd\n7LXXmmuu9V37/Rpjjb3Ws+blmfDwMPe8fKe5uwAA6elS7wQAAJWhgQNAomjgAJAoGjgAJIoGDgCJ\nooEDQKJo4AUzs0fM7N+KnheoNWq7eDTwCpnZa2Y2qt55tMXMTjWzLWa2qcVrRL3zQuNr9NqWJDP7\nnpm9bmYbzGyamX2s3jnVAw28uT3p7ru2eD1S74SAapnZ1yRdKGmkpH0lfVrSZXVNqk5o4Dkzsz3N\n7F4ze8PM3sne92012f5m9r/Z3sNcM+vRYv5hZvaEma03s2fZa0ajaKDaniDpNnd/wd3fkfRjSadW\nuKyk0cDz10XS7SrtGfST9IGkG1pNM17SaZJ6S9os6TpJMrM+kn4v6SeSekj6vqTZZvaJ1isxs37Z\nP4R+28nlEDN708xeMrOLzWyH6jYNnVyj1PZnJT3b4vOzknqZWc8KtytZNPCcuftb7j7b3d93942S\nLpd0ZKvJZrn7End/T9LFksaaWVdJJ0u6z93vc/et7v6QpEWSRkfWs9Ld93D3lW2k8pikAyV9UtIJ\nksZJOj+XjUSn1EC1vaukd1t83vb+41VsXpJo4Dkzs53N7BYzW2FmG1RqpHtkRbzNX1u8XyGpm6S9\nVNqzOTHb+1hvZuslDVdpb6ZD3P1Vd/9/2T+W5yX9SNK3Kt0uoFFqW9ImSbu1+Lzt/cYKlpU0Gnj+\nzpM0UNLn3X03SUdkcWsxzT4t3veT9E9Jb6pU/LOyvY9tr13cfUoOeXmrHICOapTafkHSwS0+Hyxp\nrbu/VcGykkYDr043M9uxxWsHlX6N+0DS+uwEziWR+U42s0FmtrNKe8b/7e5bJN0h6Vgz+5qZdc2W\nOSJyoqhdZvZ1M+uVvf+MSr/Ozq1wO9H5NGxtS5op6fRsPXtImixpeiUbmToaeHXuU6mgt70ulfQL\nSTuptNfxJ0l/iMw3S6WCe13SjpL+XZLc/a+SjpN0kaQ3VNprOV+Rv6fsRM+m7ZzoGSnpOTN7L8vz\nN5KuqGAb0Tk1bG27+x8k/UzSAkkrVTpUE/vPpOkZD3QAgDSxBw4AiaKBA0CiaOAAkCgaOAAkqqoG\nbmZHm9lyM3vFzC7MKymg3qhtpKDiq1Cyu69eknSUpFWSnpI0zt2XbmceLnlBrtw995uTqG00gnJq\nu5o98KGSXslu2f5Q0t0qXecJpI7aRhKqaeB99NFxD1ZlsY8ws4lmtsjMFlWxLqBI1DaSUPPhRd19\nqqSpEr9morlQ26i3avbAV+ujA9f0zWJA6qhtJKGaBv6UpAFmtp+ZdZf0HUnz8kkLqCtqG0mo+BCK\nu282s0mSHpDUVdI0d38ht8yAOqG2kYpCB7PiOCHyVovLCCtBbSNvtb6MEABQRzRwAEgUDRwAEkUD\nB4BE0cABIFE0cABIFA0cABJFAweARNHAASBRNHAASBQNHAASRQMHgETV/IEOANCeQw89NIhNmjQp\niI0fPz46/8yZM4PY9ddfH8SeeeaZCrJrXOyBA0CiaOAAkCgaOAAkigYOAImq6iSmmb0maaOkLZI2\nu/uQPJIC6o3aRgqqeqRaVuRD3P3NMqfv1I+d6tq1axDbfffdq1pm7Ez9zjvvHJ124MCBQezss88O\nYldddVV0/nHjxgWxv//970FsypQp0fkvu+yyaLwatXqkGrVdG4MHD47GH3744SC22267VbWud999\nN4j17NmzqmUWiUeqAUATq7aBu6QHzexpM5uYR0JAg6C20fCqvZFnuLuvNrNPSnrIzF5098daTpAV\nP/8AkBpqGw2vqj1wd1+d/VwnaY6koZFpprr7EE4CISXUNlJQ8R64me0iqYu7b8zef1XSj3LLrM76\n9esXxLp37x7EDj/88Oj8w4cPD2J77LFHEDvhhBMqyK4yq1atCmLXXXddEDv++OOj82/cuDGIPfvs\ns0Hs0UcfrSC7xtHstV2UoUOD//M0e/bs6LSxk/mxCyxiNShJH374YRCLnbAcNmxYdP7YLfaxZTaa\nag6h9JI0x8y2Lee/3P0PuWQF1Be1jSRU3MDd/VVJB+eYC9AQqG2kgssIASBRNHAASFRVd2J2eGUN\neLdaR+4Mq/auyaJs3bo1Gj/ttNOC2KZNm8pe7po1a4LYO++8E8SWL19e9jKrVas7MTuqEWu7VmJ3\n+n7uc58LYnfccUcQ69u3b3SZ2fmGj4j1prbG8/7Zz34WxO6+++6y1iNJkydPDmL/8R//EZ22KNyJ\nCQBNjAYOAImigQNAomjgAJAoGjgAJKrTP5V+5cqV0fhbb70VxIq6CmXhwoXR+Pr164PYl7/85SDW\n1i3As2bNqi4xQNItt9wSxGJjxddC7GoXSdp1112DWGxIhxEjRkTnP+igg6rKq17YAweARNHAASBR\nNHAASBQNHAAS1elPYr799tvR+Pnnnx/EjjnmmCD25z//OTp/bJztmMWLFwexo446Kjrte++9F8Q+\n+9nPBrFzzjmnrHUD23PooYdG49/4xjeCWFu3qLfW1ljxv/vd74JY7OHaf/vb36Lzx/4dxoZ5+MpX\nvhKdv9z8Gw174ACQKBo4ACSKBg4AiaKBA0Ci2h0P3MymSTpG0jp3PzCL9ZB0j6T+kl6TNNbdwzMG\n4bKSHjN5t912C2JtPWQ1drfa6aefHsROPvnkIHbXXXdVkF3nVM144NT2v8TGxY+NiS/F/x3E3H//\n/UGsrTs2jzzyyCAWuzvy1ltvjc7/xhtvlJXTli1bovH333+/rJzaGo+8FvIaD3y6pKNbxS6UNN/d\nB0ian30GUjNd1DYS1m4Dd/fHJLW+1u44STOy9zMkjck5L6DmqG2krtLrwHu5+7bna70uqVdbE5rZ\nREkTK1wPUDRqG8mo+kYed/ftHf9z96mSpkrpHydE50Jto9FVehXKWjPrLUnZz3X5pQTUFbWNZFS6\nBz5P0gRJU7Kfc3PLqIFt2LCh7GnffffdsqY744wzgtg999wTnbatp80jV01f2wcccEAQiw0d0db4\n92+++WYQW7NmTRCbMWNGENu0aVN0mb///e/LitXKTjvtFMTOO++8IHbSSScVkU7Z2t0DN7O7JD0p\naaCZrTKz01Uq7qPM7GVJo7LPQFKobaSu3T1wd2/rURsjc84FKBS1jdRxJyYAJIoGDgCJ6vTjgdfK\npZdeGsRi4yvHbtcdNWpUdJkPPvhg1Xmh8/jYxz4WjcfG2R49enQQa2uYiPHjxwexRYsWBbHYicGU\n9OvXr94ptIs9cABIFA0cABJFAweARNH
AASBR7Y4HnuvKOvl4Efvvv38Qi40vvH79+uj8CxYsCGKx\nk0c33nhjdP4i/66LUs144HlqxNoeNmxYNP7444+XNf/IkfHL4dt6MHEK2hoPPPZv48knnwxiX/rS\nl3LPqS15jQcOAGhANHAASBQNHAASRQMHgERxJ2aB/vKXvwSxU089NYjdfvvt0flPOeWUsmK77LJL\ndP6ZM2cGsdgwoGgO11xzTTRuFp4bi52YTPlkZVu6dInvs6Y6VDN74ACQKBo4ACSKBg4AiaKBA0Ci\nynmk2jQzW2dmS1rELjWz1Wa2OHuFY1ECDY7aRurKuQpluqQbJLW+hOHn7h4OLIwOmTNnThB7+eWX\no9PGriqI3e58xRVXROffd999g9jll18exFavXh2dvwlNV5PU9jHHHBPEBg8eHJ02dtv4vHnzcs+p\nEbV1tUnsz2Tx4sW1Tqdq7e6Bu/tjkt4uIBegUNQ2UlfNMfBJZvZc9mvonrllBNQftY0kVNrAb5a0\nv6TBktZIurqtCc1sopktMrNw2Dyg8VDbSEZFDdzd17r7FnffKuk/JQ3dzrRT3X2Iuw+pNEmgKNQ2\nUlLRrfRm1tvdt92DfbykJdubHh2zZEn8j3Ps2LFB7Nhjjw1ibd2Kf+aZZwaxAQMGBLGjjjqqvRSb\nVqq1HXuAcPfu3aPTrlu3Lojdc889uedUpNgDnGMPFm/Lww8/HMR+8IMfVJNSIdpt4GZ2l6QRkvYy\ns1WSLpE0wswGS3JJr0kKOwPQ4KhtpK7dBu7u4yLh22qQC1Aoahup405MAEgUDRwAEsV44AmJPex4\n1qxZQezWW2+Nzr/DDuFf9xFHHBHERowYEZ3/kUce2X6CSMI//vGPIJbKuPCxk5WSNHny5CB2/vnn\nB7FVq1ZF57/66vBq0U2bNnUwu+KxBw4AiaKBA0CiaOAAkCgaOAAkigYOAIniKpQGdNBBB0Xj3/rW\nt4LYYYcdFsRiV5u0ZenSpUHsscceK3t+pCeVsb9j45nHriyRpG9/+9tBbO7cuUHshBNOqD6xBsIe\nOAAkigYOAImigQNAomjgAJAoTmIWaODAgUFs0qRJQeyb3/xmdP699967qvVv2bIliMVuoW7rwa9o\nXGZWVkySxowZE8TOOeec3HPqiO9973tB7OKLLw5iu+++e3T+O++8M4iNHz+++sQaHHvgAJAoGjgA\nJIoGDgCJooEDQKLKeSbmPpJmSuql0nMCp7r7tWbWQ9I9kvqr9OzAse7+Tu1SbUxtnVgcNy58Wlfs\nhGX//v3zTkmLFi2Kxi+//PIglspdebXQTLXt7mXFpHjNXnfddUFs2rRp0fnfeuutIDZs2LAgdsop\npwSxgw8+OLrMvn37BrGVK1cGsQceeCA6/0033RSNN7ty9sA3SzrP3QdJGibpbDMbJOlCSfPdfYCk\n+dlnICXUNpLWbgN39zXu/kz2fqOkZZL6SDpO0oxsshmSwmuTgAZGbSN1HboO3Mz6SzpE0kJJvdx9\n20XEr6v0a2hsnomSJlaeIlB71DZSVPZJTDPbVdJsSee6+4aW33npYFv0gJu7T3X3Ie4+pKpMgRqh\ntpGqshq4mXVTqcDvdPffZOG1ZtY7+763pHW1SRGoHWobKSvnKhSTdJukZe5+TYuv5kmaIGlK9jMc\nfDdhvXqFvzUPGjQoiN1www3R+T/zmc/kntPChQuD2JVXXhnEYuMgS9wi31pnre2uXbsGsbPOOiuI\ntTV29oYNG4LYgAEDqsrpiSeeCGILFiwIYj/84Q+rWk+zKecY+BclnSLpeTNbnMUuUqm4f2Vmp0ta\nIWlsbVIEaobaRtLabeDu/rik+Kg40sh80wGKQ20jddyJCQCJooEDQKKsrdtta7Iys+JWFtGjR48g\ndsstt0SnjT1Q9dOf/nTuOcVO3lx99dXRaWO3EX/wwQe555QSd2/rEEih6l3bsVvRf/3rX0enjT0I\nO6at8cTL7RmxW+7vvvvu6LT1Ho+8EZVT2+yBA0CiaOAAkCgaOAAkigYOAIlK/iTm5z//+Wj8/PPP\nD2JDhw4NYn369Mk7JUnS+++/H8RiYy5fccUVQey9996rSU7NiJOYbevdu3c0fuaZZwaxyZMnB7GO\nnMS89tprg9jNN98cxF555ZXoMhHiJCYANDEaOAAkigYOAImigQNAomjgAJCo5K9CmTJlSjQeuwql\nI5YuXRrE7r333iC2efPm6Pyx2+HXr19fVU4IcRUKmhVXoQBAE6OBA0CiaOAAkKh2G7iZ7WNmC8xs\nqZm9YGbnZPFLzWy1mS3OXqNrny6QH2obqWv3JGb2VO7e7v6MmX1c0tOSxqj0nMBN7n5V2SvjRA9y\nVs1JTGobjayc2i7nmZhrJK3J3m80s2WSajOACFAgahup69AxcDPrL+kQSQuz0CQze87MppnZnjnn\nBhSG2kaKym7gZrarpNmSznX3DZJulrS/pMEq7cVEnwNmZhPNbJGZLcohXyB31DZSVdaNPGbWTdK9\nkh5w92si3/eXdK+7H9jOcjhOiFxVeyMPtY1GlcuNPFYaFPg2SctaFnh2Amib4yUtqSRJoF6obaSu\nnKtQhkv6o6TnJW3NwhdJGqfSr5gu6TVJZ2Ynhba3LPZSkKsqr0KhttGwyqnt5MdCQefGWChoVoyF\nAgBNjAYOAImigQNAomjgAJAoGjgAJIoGDgCJooEDQKJo4ACQqHaHk83Zm5JWZO/3yj43k2bbpkbf\nnn3rnUAL22q70f/MKsE2Fa+s2i70TsyPrNhskbsPqcvKa6TZtqnZtqcIzfhnxjY1Lg6hAECiaOAA\nkKh6NvCpdVx3rTTbNjXb9hShGf/M2KYGVbdj4ACA6nAIBQASVXgDN7OjzWy5mb1iZhcWvf48ZA+6\nXWdmS1rEepjZQ2b2cvYzqQfhmtk+ZrbAzJaa2Qtmdk4WT3q7ikRtN6Zmru1CG7iZdZV0o6SvSxok\naZyZDSoyh5xMl3R0q9iFkua7+wBJ87PPKdks6Tx3HyRpmKSzs7+b1LerENR2Q2va2i56D3yopFfc\n/VV3/1DS3ZKOKziHqrn7Y5LebhU+TtKM7P0MSWMKTapK7r7G3Z/J3m+UtExSHyW+XQWithtUM9d2\n0Q28j6S/tvi8Kos1g14tnpv4uqRe9UymGtmT2A+RtFBNtF01Rm0noNlqm5OYNeClS3uSvLzHzHaV\nNFvSue6+oeV3KW8X8pFyDTRjbRfdwFdL2qfF575ZrBmsNbPekpT9XFfnfDrMzLqpVOB3uvtvsnDy\n21UQaruBNWttF93An5I0wMz2M7Pukr4jaV7BOdTKPEkTsvcTJM2tYy4dZmYm6TZJy9z9mhZfJb1d\nBaK2G1RT17a7F/qSNFrSS5L+Iun/Fr3+nLbhLklrJP1TpWOdp0vqqdKZ7Jcl/Y+kHm3M+4ikf6tw\nvRXPW8ayh6v0K+RzkhZnr9HlbhcvapvaLv5V9HCycvf7JN1X9Hrz5O7jzOw1SV939/9p8dXIOqW0
\nXWY2X9JXJHVz982xadz9cUnWxiIacrsaDbVdDDM7UNLVkg6V1NPd26pbSc1d25zEbHJmdpKkbvXO\nA8jRPyX9SqXfDjo1GnjOzGxPM7vXzN4ws3ey931bTba/mf2vmW0ws7lm1qPF/MPM7AkzW29mz5rZ\niCpy2V3SJZL+T6XLALZplNp29+XufpukF6rYnKZAA89fF0m3q/REjX6SPpB0Q6tpxks6TVJvle4S\nu06SzKyPpN9L+omkHpK+L2m2mX2i9UrMrF/2D6HfdnK5QtLNKl3jClSrkWobooHnzt3fcvfZ7v6+\nl+76ulzSka0mm+XuS9z9PUkXSxqb3Yp9sqT73P0+d9/q7g9JWqTSCZfW61np7nu4+8pYHmY2RNIX\nJV2f4+ahE2uU2sa/FH4Ss9mZ2c6Sfq7SeBLbBsf5uJl1dfct2eeWd+ytUOkY9V4q7dmcaGbHtvi+\nm6QFHcyhi6SbJJ3j7ptLV1EB1WmE2sZH0cDzd56kgZI+7+6vm9lgSX/WR8+Ct7zho59KJ2XeVKn4\nZ7n7GVXmsJukIZLuyZp31yy+ysxOdPc/Vrl8dE6NUNtogUMo1elmZju2eO0g6eMqHRtcn53AuSQy\n38lmNijbo/mRpP/O9mDukHSsmX3NzLpmyxwROVHUnnclfUrS4Oy17dfUQ1UaAwJoT6PWtqxkR0nd\ns887mtnHKt3QlNHAq3OfSgW97XWppF9I2kmlvY4/SfpDZL5ZKg3b+bqkHSX9uyS5+19VGiHtIklv\nqLTXcr4if0/ZiZ5NsRM9XvL6tle2LEla66WR8oD2NGRtZ/bNctp2FcoHkpZ3cPuaAo9UA4BEsQcO\nAImigQNAomjgAJAoGjgAJKqqBm5N8BRuIIbaRgoqvgoluz32JUlHqTRu8FOSxrn70u3MwyUvyFV7\nQ4lWgtpGIyintqvZA2+Kp3ADEdQ2klBNAy/rKdxmNtHMFpnZoirWBRSJ2kYSaj4WirtPlTRV4tdM\nNBdqG/VWzR54Mz+FG50btY0kVNPAm/kp3OjcqG0koeJDKNk405MkPaDScKXT3L3TP+II6aO2kYpC\nB7PiOCHyVovLCCtBbSNvtb6MEABQRzRwAEgUDRwAEkUDB4BE0cABIFE0cABIFA0cABJFAweARNHA\nASBRNHAASBQNHAASRQMHgETRwAEgUTRwAEgUDRwAEkUDB4BE0cABIFFVPZXezF6TtFHSFkmb3X1I\nHkkB9UZtIwVVNfDMl939zRyWgwYxcuTIaPzOO+8MYkceeWQQW758ee451Qm1nYjJkycHscsuuyyI\ndekSP+gwYsSIIPboo49WnVetcQgFABJVbQN3SQ+a2dNmNjGPhIAGQW2j4VV7CGW4u682s09KesjM\nXnT3x1pOkBU//wCQGmobDa+qPXB3X539XCdpjqShkWmmuvsQTgIhJdQ2UlDxHriZ7SKpi7tvzN5/\nVdKPcsusTEcccUQ03rNnzyA2Z86cWqfTFA477LBo/Kmnnio4k/polNpG6NRTT43GL7jggiC2devW\nspfr7pWmVFfVHELpJWmOmW1bzn+5+x9yyQqoL2obSai4gbv7q5IOzjEXoCFQ20gFlxECQKJo4ACQ\nqDzuxKyr2B1UkjRgwIAgxknMUOzOtP322y867b777hvEsuPEQCFiNShJO+64Y8GZNAb2wAEgUTRw\nAEgUDRwAEkUDB4BE0cABIFHJX4Uyfvz4aPzJJ58sOJM09e7dO4idccYZ0WnvuOOOIPbiiy/mnhMg\nSaNGjQpi3/3ud8ueP1abxxxzTHTatWvXlp9YA2EPHAASRQMHgETRwAEgUTRwAEhU8icx23pIKcpz\n6623lj3tyy+/XMNM0JkNHz48iN1+++1BbPfddy97mVdeeWUQW7FiRccSa3B0PwBIFA0cABJFAweA\nRNHAASBR7Z7ENLNpko6RtM7dD8xiPSTdI6m/pNckjXX3d2qXZslBBx0UxHr16lXr1Ta1jpwUeuih\nh2qYSfEaqbY7uwkTJgSxT33qU2XP/8gjjwSxmTNnVpNSEsrZA58u6ehWsQslzXf3AZLmZ5+B1EwX\ntY2EtdvA3f0xSW+3Ch8naUb2foakMTnnBdQctY3UVXodeC93X5O9f11Sm8cxzGyipIkVrgcoGrWN\nZFR9I4+7u5n5dr6fKmmqJG1vOqDRUNtodJVehbLWzHpLUvZzXX4pAXVFbSMZle6Bz5M0QdKU7Ofc\n3DLajtGjRwexnXbaqYhVN4XYFTttPYE+ZvXq1Xmm06jqUtudxV577RWNn3baaUFs69atQWz9+vXR\n+X/yk59Ul1ii2t0DN7O7JD0paaCZrTKz01Uq7qPM7GVJo7LPQFKobaSu3T1wdx/Xxlcjc84FKBS1\njdRxJyYAJIoGDgCJSmo88IEDB5Y97QsvvFDDTNJ01VVXBbHYic2XXnopOv/GjRtzzwnNq3///kFs\n9uzZVS3z+uuvj8YXLFhQ1XJTxR44ACSKBg4AiaKBA0CiaOAAkKikTmJ2xFNPPVXvFHK32267BbGj\nj249Gqp08sknR+f/6le/WtZ6fvzjH0fjbd0FB8TEajM2pn9b5s+fH8SuvfbaqnJqNuyBA0CiaOAA\nkCgaOAAkigYOAIlq2pOYPXr0yH2ZBx98cBAzs+i0o0aNCmJ9+/YNYt27dw9iJ510UnSZXbqE/99+\n8MEHQWzhwoXR+f/xj38EsR12CEvg6aefjs4PtGXMmPDJc1OmlD+Q4+OPPx7EYg86fvfddzuWWJNj\nDxwAEkUDB4BE0cABIFE0cABIVDmPVJtmZuvMbEmL2KVmttrMFmev8GGVQIOjtpG6cq5CmS7pBkkz\nW8V/7u7hANM1FLviwt2j0/7yl78MYhdddFFV64/dBtzWVSibN28OYu+//34QW7p0aRCbNm1adJmL\nFi0KYo8++mgQW7t2bXT+VatWBbHYQ6FffPHF6PxNaLoapLZTUotxvl999dUg1lYd41/a3QN398ck\nvV1ALkChqG2krppj4JPM7Lns19A9c8sIqD9qG0motIHfLGl/SYMlrZF0dVsTmtlEM1tkZuHv/0Dj\nobaRjIoauLuvdfct7r5V0n9KGrqdaae6+xB3H1JpkkBRqG2kpKJb6c2st7uvyT4eL2nJ9qbPy1ln\nnRXEVqxYEZ328MMPz339K1euDGK//e1vo9MuW7YsiP3pT3/KPaeYiRMnRuOf+MQngljs5FFnVq/a\nTskFF1wQxLZu3VrVMjty2z3+pd0GbmZ3SRohaS8zWyXpEkkjzGywJJf0mqQza5gjUBPUNlLXbgN3\n93GR8G01yAUoFLWN1HEnJgAkigYOAIlKfjzwn/70p/VOoeGMHDmy7GmrvYMOzWvw4MHReLkPx46Z\nO3duNL58+fKKl9mZsQcOAImigQNAomjgAJAoGjgAJIoGDgCJSv4qFFRnzpw59U4BDerBBx+Mxvfc\ns7wBGmNDR5x66qnVpIRW2AMHgETRwAEgUTRwAEgUDRw
AEsVJTABRPXv2jMbLHfv7pptuCmKbNm2q\nKid8FHvgAJAoGjgAJIoGDgCJooEDQKLKeSbmPpJmSuql0nMCp7r7tWbWQ9I9kvqr9OzAse7+Tu1S\nRbXMLIgdcMABQayohy/XG7X9L7fffnsQ69Kluv27J554oqr50b5y/oY2SzrP3QdJGibpbDMbJOlC\nSfPdfYCk+dlnICXUNpLWbgN39zXu/kz2fqOkZZL6SDpO0oxsshmSxtQqSaAWqG2krkPXgZtZf0mH\nSFooqZe7r8m+el2lX0Nj80yUNLHyFIHao7aRorIPcpnZrpJmSzrX3Te0/M7dXaVjiAF3n+ruQ9x9\nSFWZAjVCbSNVZTVwM+umUoHf6e6/ycJrzax39n1vSetqkyJQO9Q2UlbOVSgm6TZJy9z9mhZfzZM0\nQdKU7Gf8cdNoGKWdyY+q9kqDlHXW2o49bX7UqFFBrK1b5j/88MMgduONNwaxtWvXVpAdOqKcY+Bf\nlHSKpOfNbHEWu0il4v6VmZ0uaYWksbVJEagZahtJa7eBu/vjksILiEtG5psOUBxqG6nrvL8/A0Di\naOAAkCjGA+/kvvCFLwSx6dOnF58ICrPHHnsEsb333rvs+VevXh3Evv/971eVEyrDHjgAJIoGDgCJ\nooEDQKKR1BfTAAAEFUlEQVRo4ACQKE5idiKx8cABpIs9cABIFA0cABJFAweARNHAASBRNHAASBRX\noTSh+++/Pxo/8cQTC84EjejFF18MYrEnyA8fPryIdFAF9sABIFE0cABIFA0cABLVbgM3s33MbIGZ\nLTWzF8zsnCx+qZmtNrPF2Wt07dMF8kNtI3UWe9DtRyYoPZW7t7s/Y2Yfl/S0pDEqPSdwk7tfVfbK\nzLa/MqCD3L3i8QGobTSycmq7nGdirpG0Jnu/0cyWSepTfXpAfVHbSF2HjoGbWX9Jh0hamIUmmdlz\nZjbNzPbMOTegMNQ2UlR2AzezXSXNlnSuu2+QdLOk/SUNVmkv5uo25ptoZovMbFEO+QK5o7aRqnaP\ngUuSmXWTdK+kB9z9msj3/SXd6+4HtrMcjhMiV9UcA5eobTSucmq7nKtQTNJtkpa1LPDsBNA2x0ta\nUkmSQL1Q20hdOVehDJf0R0nPS9qahS+SNE6lXzFd0muSzsxOCm1vWeylIFdVXoVCbaNhlVPbZR1C\nyQtFjrxVewglL9Q28pbLIRQAQGOigQNAomjgAJAoGjgAJIoGDgCJooEDQKJo4ACQKBo4ACSq6Ica\nvylpRfZ+r+xzM2m2bWr07dm33gm0sK22G/3PrBJsU/HKqu1C78T8yIrNFrn7kLqsvEaabZuabXuK\n0Ix/ZmxT4+IQCgAkigYOAImqZwOfWsd110qzbVOzbU8RmvHPjG1qUHU7Bg4AqA6HUAAgUYU3cDM7\n2syWm9krZnZh0evPQ/ag23VmtqRFrIeZPWRmL2c/k3oQrpntY2YLzGypmb1gZudk8aS3q0jUdmNq\n5toutIGbWVdJN0r6uqRBksaZ2aAic8jJdElHt4pdKGm+uw+QND/7nJLNks5z90GShkk6O/u7SX27\nCkFtN7Smre2i98CHSnrF3V919w8l3S3puIJzqJq7Pybp7Vbh4yTNyN7PkDSm0KSq5O5r3P2Z7P1G\nScsk9VHi21UgartBNXNtF93A+0j6a4vPq7JYM+jV4rmJr0vqVc9kqpE9if0QSQvVRNtVY9R2Apqt\ntjmJWQNeurQnyct7zGxXSbMlnevuG1p+l/J2IR8p10Az1nbRDXy1pH1afO6bxZrBWjPrLUnZz3V1\nzqfDzKybSgV+p7v/Jgsnv10FobYbWLPWdtEN/ClJA8xsPzPrLuk7kuYVnEOtzJM0IXs/QdLcOubS\nYWZmkm6TtMzdr2nxVdLbVSBqu0E1c20XfiOPmY2W9AtJXSVNc/fLC00gB2Z2l6QRKo1otlbSJZJ+\nK+lXkvqpNCrdWHdvfTKoYZnZcEl/lPS8pK1Z+CKVjhUmu11ForYbUzPXNndiAkCiOIkJAImigQNA\nomjgAJAoGjgAJIoGDgCJooEDQKJo4ACQKBo4ACTq/wMOa0tS7dporAAAAABJRU5ErkJggg==\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x7faedcf0f6a0>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"%matplotlib inline\n",
"\n",
"from preprocessed_mnist import load_dataset\n",
"X_train, y_train, X_val, y_val, X_test, y_test = load_dataset(flatten=True)\n",
"\n",
"plt.figure(figsize=[6,6])\n",
"for i in range(4):\n",
" plt.subplot(2,2,i+1)\n",
" plt.title(\"Label: %i\"%y_train[i])\n",
" plt.imshow(X_train[i].reshape([28,28]),cmap='gray');"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We'll define network as a list of layers, each applied on top of previous one. In this setting, computing predictions and training becomes trivial."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"784"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"network = []\n",
"network.append(Dense(X_train.shape[1],100))\n",
"network.append(ReLU())\n",
"network.append(Dense(100,200))\n",
"network.append(ReLU())\n",
"network.append(Dense(200,10))\n",
"X_train.shape[1]"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def forward(network, X):\n",
" \"\"\"\n",
" Compute activations of all network layers by applying them sequentially.\n",
" Return a list of activations for each layer. \n",
" Make sure last activation corresponds to network logits.\n",
" \"\"\"\n",
" activations = [X]\n",
" for n in range(len(network)):\n",
" layer = network[n]\n",
" activations.append(layer.forward(input=activations[-1]))\n",
" assert len(activations[1:]) == len(network), 'Lens are %s and %s' % (len(activations), len(network) )\n",
" \n",
" return activations[1:]\n",
"\n",
"def predict(network,X):\n",
" \"\"\"\n",
" Compute network predictions.\n",
" \"\"\"\n",
" logits = forward(network,X)[-1]\n",
" return logits.argmax(axis=-1)\n",
"\n",
"def train(network,X,y, regularise=False):\n",
" \"\"\"\n",
" Train your network on a given batch of X and y.\n",
" You first need to run forward to get all layer activations.\n",
" Then you can run layer.backward going from last to first layer.\n",
" \n",
" After you called backward for all layers, all Dense layers have already made one gradient step.\n",
" \"\"\"\n",
" \n",
" # Get the layer activations\n",
" layer_activations = forward(network,X)\n",
" layer_inputs = [X]+layer_activations #layer_input[i] is an input for network[i]\n",
" logits = layer_activations[-1]\n",
" \n",
" # Compute the loss and the initial gradient\n",
" loss = softmax_crossentropy_with_logits(logits,y, regularise=regularise, weights=network[-1].get_weights())\n",
" loss_grad = grad_softmax_crossentropy_with_logits(logits,y, regularised=regularise, weights=network[-1].get_weights())\n",
" # propagate gradients through the network\n",
" \n",
"# for n in range(len(network), 0): # range(max, 0) is wrong!!! \n",
" backward_grads = [loss_grad]\n",
" for n in reversed(range(len(network))):\n",
" # print('network %s' % n)\n",
" layer = network[n]\n",
" # print(layer)\n",
" # print(layer_inputs[n].shape)\n",
" grads_input = layer.backward(input=layer_inputs[n], grad_output=backward_grads[-1])\n",
" backward_grads.append(grads_input) \n",
"\n",
" return np.mean(loss)\n",
"\n"
]
},
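{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick usage sketch (illustrative; assumes the data and network from the cells above): one call to `train` runs a forward pass, backpropagates the loss gradient, and lets every Dense layer take one SGD step.\n",
"\n",
"```\n",
"loss = train(network, X_train[:32], y_train[:32])  # one step on one minibatch\n",
"preds = predict(network, X_val[:5])                # class ids, shape (5,)\n",
"```"
]
},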
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Instead of tests, we provide you with a training loop that prints training and validation accuracies on every epoch.\n",
"\n",
"If your implementation of forward and backward are correct, your accuracy should grow from 90~93% to >97% with the default network."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Training loop\n",
"\n",
"As usual, we split data into minibatches, feed each such minibatch into the network and update weights."
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def iterate_minibatches(inputs, targets, batchsize, shuffle=False):\n",
" assert len(inputs) == len(targets)\n",
" if shuffle:\n",
" indices = np.random.permutation(len(inputs))\n",
" for start_idx in tqdm_utils.tqdm_notebook_failsafe(range(0, len(inputs) - batchsize + 1, batchsize)):\n",
" if shuffle:\n",
" excerpt = indices[start_idx:start_idx + batchsize]\n",
" else:\n",
" excerpt = slice(start_idx, start_idx + batchsize)\n",
" yield inputs[excerpt], targets[excerpt]"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"from IPython.display import clear_output\n",
"train_log = []\n",
"val_log = []"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 24\n",
"Train accuracy: 1.0\n",
"Val accuracy: 0.9801\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAX4AAAD8CAYAAABw1c+bAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xl8VOX1+PHPyZ6QhIQEwpIAkaXsW8ImVoMURauiKEXE\nBSxSK1b7s9aibdUvarVWarVoFS1VWhUp1rVYFiFAKwgJ+xYImySQEAiQBLLOPL8/7hCGSJJJMskk\nM+f9es1r7txtzsnAmTvPfe5zxRiDUkop3+Hn6QCUUko1LS38SinlY7TwK6WUj9HCr5RSPkYLv1JK\n+Rgt/Eop5WO08CullI/Rwq+UUj5GC79SSvmYAE8HUFVsbKzp2rVrvbc/e/YsrVq1cl9ALYjm7pu5\ng2/n78u5w4X809PTTxhj2rqyTbMr/F27diUtLa3e26emppKSkuK+gFoQzT3F02F4jC/n78u5w4X8\nReSwq9toU49SSvkYLfxKKeVjtPArpZSP0cKvlFI+Rgu/Ukr5mFoLv4jMF5HjIrKjmuUiIq+KSKaI\nbBORIU7L7hGRfY7HPe4MXCmlVP24csT/DjCuhuXXAT0cjxnAXwBEpA3wFDAcGAY8JSLRDQlWKaVU\nw9Xaj98Ys0ZEutawynhggbHu4bheRKJEpAOQAiw3xuQDiMhyrC+QDxoatFKq+bLbDSUVNorLbJTb\nDHZjsNkNxoDNWK+NMdjsYHe8tjumbcZQYTOU2+yU2eyUV9ipsDteV9gpdyy7sNyw/2AZm8oymiY5\nEQTwE0EExJqFVL6+eL4xYMCRs7UL45i2GzCYynUwhvatQ7ljeOdGT8MdF3B1Ao44vc5yzKtu/neI\nyAysXwvExcWRmppa72CKiooatH1LprmnejoMj2nM/E8W29lxwsbhQjulFVBqM5TZLn4utUGZzVBq\nhzJbo4RRI9mf2ejv0dh3JxegW5QfHYsP1Gm7+nz2zeLKXWPMPGAeQHJysmnIVXi+fBWf5p7i6TA8\nxp35F5fZWH/wJGv25rFmbx7784oBiAgJIDIkkNAgf0KD/Wkd5E9YkD+hgf6EXjQdUDkdFOCHn+OI\n2F8EPz/raPn8w9/PWuY8HeTvR4CfEBjgR5C/H4H+fgT6C4H+fgQFXPw60N+PtWtWN+lnf+GI3WA4\nf1TvOHKvchTv/OsAqPxVcP5v4vyLob7q89m7o/BnAwlOr+Md87Kxmnuc56e64f2UUm5kjGFPTqFV\n6PflsfHgKcpsdoID/Bh+WQyTh3Xmyp5t6dEuvEEFylucb9bxo+X+LdxR+D8DHhSRhVgncs8YY46J\nyFLgd04ndK8BHnfD+ynl806fK2Nr1hm2HjnNtqzTHD1ezAdH0ogICSQ8OICIkADCgwMIdzxbry8s\n8/cTNh7KZ/XePNbuO0FeYSkAPePCuXtkF67s2ZZhiW0ICfT3cKaqMdRa+EXkA6wj91gRycLqqRMI\nYIx5A1gCXA9kAueAaY5l+SLyDLDRsavZ50/0KqVcV1JuY/exArYcOc3WI6fZmnWGgyfOAlYzQbe2\n4YgNDp04R1FpBYUl5RSVVmB3oVE6KiyQK7rHcmXPtlzZoy3tW4c0cjaqOXClV8/kWpYbYGY1y+YD\n8+sXmlK+xxjD/ryzjgJ/mi1HTrP7WAHlNquKx0UGMyghionJ8QyKj6JffGsiQwId7bxXXrSfc2U2\nxxdBBUWlFRSVWF8KhaUVlJbb6B8fRf9OrfH3a7lNFqp+msXJXaUUnCur4OGFW1i+KxeAVkH+DIiP\n4sdXXMaghCgGJUS5fEQuIrQKDqBVcABxkY0ZtWqJtPAr1QzkFZYy/d2NbMs+wyNjezKuX3u6tQ3X\no3HVKLTwK+Vh+/OKmPq3DeQVlvLmnUlc07e9p0NSXk4Lv1IustsNfm4+Ak87lM/0BWn4i/DBfSMY\n3FlHNVGNT0fnVMoF3548R8pLqdz82v9IO+Sezmn/3naMO97+huiwIP71wOVa9FWT0cKvVC0OnjjL\npHnrKCgpJ+dMCbe9sY6f/iOdwyfP1mt/xhjeWnOAme9von+n1nz008vpEuO7NwtXTU+bepSqwf68\nIibPW0+F3fD+9BEkxrbirbUHeGP1flbszuWuEV15aEx3osKCXNqfzW545otdvPP1Ia7r156XJw3S\ni6RUk9PCr1Q19uUWMvmtbwDDB/eN4HvtIwB4aEwPbh+awMsr9vLO1wf5aFMWP7u6O3eP7EpQQPU/\noovLbDy8cDPLduUy/YpEnri+t9vPGSjlCm3qUeoSMnIKuX3eekRg4YwLRf+8dpEhPD9hAEse/j4D\nE6J49t+7Gfvyar7cfgxjvnvJ7ImiUia/tZ7lu3N56sY+/OaGPlr0lcdo4Veqil1HC7h93joC/IWF\nM0bQvV1Etev2ah/JgnuH8e69wwgJ8Oen721i4hvr2Pztqcp1DuQVMeH1r9l9rIC/TEli2qjEpkhD\nqWppU49qkSpsdg6dPMuenEIycgrZklFKQKcTjOoe06ARJHdkn+HOv35DaKA/H9w3gq6xrp10vapn\nW67oHss/044wZ/lebnn9a24c2JHr+7XniY+3IyJ8MGMEQ7TnjmoGtPCrZs0YQ05BCRmOAp+RU8ie\nnEIy84ooq7AD4CcQ7A9r//oNAxOimJnSjR/0jqtzU8rWI6e566/fEBESyAf3jaBzTFidtvf3E24f\n1pkbB3bkzdX7mbf2AJ9vPUrXmDDemTbM5S8RpRqbFn7V7CzbmcPafSesQp9byJni8spl7SND6Nk+\ngit6xPK9uAi+1z6C7u3CWbt2DcdbXcYbq/cz4+/pfC8uggdGd+OH/TsQ4F97i+amb09xz183ENXK\nKvrx0XUr+s5aBQfwyDXfY/Lwzny+9Si3DoknJjy43vtTyt208KtmZcG6Qzz56U4iggPo2T6CHw7o\nQK/2EZVFvrpuk4F+wpThXZiUnMDn247y2qr9PLxwC39cvpefXtWNCUPiq+1xk3Yon6l/20hMeBAf\n3DeCjlGhbsmlQ+tQZlzZzS37UsqdtPCrZmNxehZPfrqTsX3ieH3KEAJdOFKvKsDfj1sGxzN+YCeW\n7cph7qpMZv1rO698tY/7vn8Zk4d1JjToQr/59QdOcu87G2kfGcL7943Q8eiVT9DCr5qFL7cf47HF\nW7mieyx/njy4XkXfmZ+fMK5fB67t2541+07w2spMZn+xi9dWZXLvFYncNbILO7LOcO+7G4mPDuP9\n6cNpF6lFX/kGLfzK41ZlHOehhZsZ3DmaeXcnufVKVhHhqp5tuapnWzYczOe1VZn8YWkGb6zeT1mF\nnS4xYbw3fQRtI7QNXvkOLfzKo9YfOMn9f0+nZ1wE86cOJSyo8f5JDktsw7DEYWzPOsPrqZnkny3j\n9SlD9MSr8jla+JXHbD1ymunvppHQJowF9w6jdWhgk7xv//jW/OXOpCZ5L6WaI71yV3nEnpwC7p6/\ngehWgfzjx8P1qFupJuRS4ReRcSKSISK
ZIjLrEsu7iMhXIrJNRFJFJN5p2e9FZIfjMcmdwauW6eCJ\ns9z59gZCAv14f7r2pFGqqdXa1CMi/sBrwFggC9goIp8ZY3Y5rfYSsMAY866IXA08D9wlIj8EhgCD\ngGAgVUS+NMYUuDsR1bj25BQwd2Umq/Yc58qebbl9WGeu6B5b53vCZp8uZspb67Ebw8LpI0hoU/8L\npZRS9eNKG/8wINMYcwBARBYC4wHnwt8HeMQxvQr4xGn+GmNMBVAhItuAccAiN8SumsCO7DP8eeU+\nlu7MpVWQP1f3juO/+/L4ckcOnaJCmZgcz4+SE1y66Ol4YQlT3lpPYWkFH9xX8+BnSqnGI5caQvai\nFURuA8YZY6Y7Xt8FDDfGPOi0zvvAN8aYV0RkAvAREAskAU9h/VoIAzYArxlj5lR5jxnADIC4uLik\nhQsX1juhoqIiwsPD6719S+bO3PeftvHZ/nK25tkIDYBrugQytksg4UFCud2wOdfG6qxydp60I0D/\nWH+ujA9gUDt/Ai7xK6CozPDChmLyig2PJofQI9q9Nx/x5c8dfDt/X84dLuQ/evTodGNMsivbuKtX\nz6PAXBGZCqwBsgGbMWaZiAwFvgbygHWArerGxph5wDyA5ORkk5KSUu9AUlNTacj2LZk7ct9wMJ8/\nr9zH2n0niAoL5Bdju3HPqK5Ehlzc42Ys8BhwJP8ci9KOsCjtCHO3lBIbHsStSfFMSk7gsrbWf8ai\n0gqmvLWe48Ul/G3aMEZ1j21QjJfiy587+Hb+vpw71C9/Vwp/NpDg9DreMa+SMeYoMAFARMKBW40x\npx3LngOecyx7H9hbpwhVozPGsG7/SV5duY/1B/KJDQ9i1nW9uHNEF8KDa/4nktAmjF9c8z0eHtOD\n1XvzWLjxCG+vPcibqw8wPLENk4Ym8OHGI+w4WsAbdyY1StFXStWNK4V/I9BDRBKxCv7twB3OK4hI\nLJBvjLEDjwPzHfP9gShjzEkRGQAMAJa5MX7VAMYYVu/N488rM0k/fIp2EcH89oY+3FFlPBtXBPj7\nMaZ3HGN6x3G8oITFm7L4cOMRHlm0FRH406RBjO0T10iZKKXqotbCb4ypEJEHgaWAPzDfGLNTRGYD\nacaYz4AU4HkRMVhNPTMdmwcCax03xigA7nSc6FUelnYon2f/vZstR07TsXUIz4zvy8TkBLcMl9Au\nMoQHUrpz/5XdWH/wJMagR/pKNSMutfEbY5YAS6rMe9JpejGw+BLblWD17FHNRNapc7zw5R6+2HaM\n9pEhvDChf41DFjeEn59weTct+Eo1Nzpkg484W1rBG6v3M2/NAUTg4TE9+MlVlzXq2DhKqeZJ/9d7\nObvd8K/N2bz4nz0cLyxl/KCO/GpcL7fdbEQp1fJo4fdiaYfymf3FLrZlnWFgQhR/uTOJpC56s2+l\nfJ0Wfi+Udeocv/9PBp9vPUr7yBBenjSQ8QM71fnm40op76SF34uUVBjmLMtg3poDADw0pgf3azu+\nUqoKrQheYtnOHGatLeZ0aSbjB3XksXG96KTt+EqpS9DC7wXe+d9B/u+LXXSO8OOv947UdnylVI20\n8Ldgxhj+sDSD11P3c02fOG7rVKhFXylVK70DVwtVbrPz6D+38Xrqfu4Y3pm/3JlEkL+evFVK1U6P\n+Fugc2UVPPDeJlIz8vh/P+jJQ2O64xgWQymlaqWFv4U5WVTKve9sZHv2GZ6f0J/Jwzp7OiSlVAuj\nhb8FOZJ/jrvnb+Do6WLeuDOJa/q293RISqkWSAt/C7Ej+wzT3tlIWYWd96YPJ7lrG0+HpJRqobTw\ntwD/yzzBT/6eTmRIAO/fP5IecXqvWqVU/Wnhb+Y+23qUXyzawmWx4bxz71A6tNaLspRSDaOFvxn7\n638P8swXuxjWtQ1v3Z1M67DA2jdSSqlaaOFvhux2w++X7uHN1Qe4tm8cr9w+2C13xlJKKdDC3yzN\nWZ7Bm6sPcOeIzvzfTf3w11E1lVJupIW/mfl861FeW7WfSckJPDO+n16YpZRyOx2yoRnZkX2GXy7e\nSnKXaJ65WYu+UqpxuFT4RWSciGSISKaIzLrE8i4i8pWIbBORVBGJd1r2oojsFJHdIvKqaDW7pBNF\npcxYkEZ0WJA17k4j3PxcKaXAhcIvIv7Aa8B1QB9gsoj0qbLaS8ACY8wAYDbwvGPby4FRwACgHzAU\nuMpt0XuJsgo7P/1HOifPljHvrmTaRgR7OiSllBdzpY1/GJBpjDkAICILgfHALqd1+gCPOKZXAZ84\npg0QAgQBAgQCuQ0P23sYY3jqs51sPHSKV24fRP/41p4OSSn3yNkB5eegw0AIaGEHMyVnIP8g5B+A\nUwfh9BEIawPRidDmMmiTCOHtwc8Nv8xLC+FMNpzJsvbX7eqG77MWrhT+TsARp9dZwPAq62wFJgCv\nALcAESISY4xZJyKrgGNYhX+uMWZ3w8P2Hv/45ls+2PAtP03pxvhBnTwdjlINd/hrWPMH2L/Seu0f\nDB0HQ+fhkDACEoZDqxjPxmgMnM2zCnv+Qau4n5/OPwDF+RevH9rG+jIwtgvzAkIcXwSJF57bOL4Y\nWieAfyBUlEHh0QuFvSDLenZ+XXLmwj47DGqSwi/GmJpXELkNGGeMme54fRcw3BjzoNM6HYG5QCKw\nBrgVq2knFuvLYJJj1eXAY8aYtVXeYwYwAyAuLi5p4cKF9U6oqKiI8PDwem/flPbk2/jDxhL6xfrz\n8JBg/Bp4+qMl5e5uvpw7NIP8jSH61Fa6HF5E1JmdlAW25kjCzRSHdqD1mT1EFuwmonA/fqYCgHOh\nnTjTurfj0Yvi0E5Qz3//ZwvPEBUMARVFBJYXEVBR6DRdRGB5YZVlZwkuPUGAreRC+PhREhJLSUh7\nikM7UBx6/rkDJSFx2AJCEXsFwaV5hBbnOB7HKp9DSnLwt5ddtL/ywAgCywsQLq6x5QERlITEUhoc\nS2lwW8d0W0qDYykJaUdpSGyd8j//2Y8ePTrdGJPsyjauFP6RwNPGmGsdrx8HMMY8X8364cAeY0y8\niPwSCDHGPONY9iRQYox5sbr3S05ONmlpaa7EfkmpqamkpKTUe/umciT/HONf+x/RYYF8PHMUkSEN\nvyq3peTeGHw5d/Bg/sZAxpew9iXIToeIjjDqYRhyNwSFXbxueTEc3Qzfrocj31iP4lPWsrAY65dA\nfDL4BULZWSgrspqKys5eeF12FsrOOU2fhYrimmMMbg2hrSE02nqEREFEe+vI/HzTTVRnCAhq2N+h\nMOdC01D+QesXRUQHaN0JWsdDZLw1HdSq/u9zCec/exFxufC70tSzEeghIolANnA7cIfzCiISC+Qb\nY+zA48B8x6JvgftE5Hmspp6rgD+5lI0XO1dWwX0L0ii32Xnr7mS3FH2lmpTdBrs/gzUvQe4OiOoC\nN/wJBt1RfXt+YCh0udx6ANjtcHLfhS+Cb9dDxhKn9cOsIhnUCgIdz0HhEB5nPQe1gqAwDh49QWLv
\nIRAadaG4ny/wIa3BvwkuVxKByA7Wo+uoxn+/Bqr1L2KMqRCRB4GlgD8w3xizU0RmA2nGmM+AFOB5\nETFYTT0zHZsvBq4GtmOd6P2PMeZz96fRchhjePSfW9mbW8jfpg3jsra+2zyhWiBbBexYDGvnwIm9\nENMDbnkT+t1W9wLr5wdtv2c9ku6x5pUUWEU0MAz8XBum5HBqKonDU+r23j7OpU/KGLMEWFJl3pNO\n04uxinzV7WzATxoYo1f588pMlmzP4dfX9+aqnm09HY5Srik+DTs/hv++DKcPQ1w/mPgO9L7J5QLt\nkpBI9+1LVUuHbGhCS3fm8Mfle5kwuBPTv5/o6XCUurTyYji2DY5uguxN1vPJTGtZpyS47vfQc1y9\nT8Yqz9PC30Qycgp55MMtDIxvze8m9NfhGNR32e2w6V04uNrq1td5hNUNsjH7wNvK4fhuR5FPh+zN\ncHzXhW6LER2h0xAYONlqm+88Ugu+F9DC3wROnS3jvgVptAoO4M27knWI5ZamtAh2/gu2L4bYnjD6\nCetiHnc6dRg+exAOroFWba1mFQD/IKv4Jwy3vggSRtSvD3xFGZz+9kKPk5OZVg+bnG1Q4ejWGBJl\nFfme/8967jjEOlmpvI4W/kZWUm5j5vubyDlTwsKfjKB96xBPh6RcYYzVzLHpXdjxkdV9MDoRDv3X\nev2Dp2Dw3Q2/ctMYSJsPy58EBG58BYbcA2dPOHq6rLOe1/8Fvn7V2iamx4WLoTqPgJju1vyys9+9\nGOn89JksMPYL7xsUDu37w9Dp1hdLpyFWfno07xO08DeigpJypr+bxsZD+cyZOJAhnaM9HVLLUJhj\nHem6+6jaFefyYfs/YdMCq5tiYBj0nWD1S08YZjWD/PtR+Pxha53rX7KKZn2c/hY+fdBq2rksBW6a\nC1EJ1rLwttD7BusB3+0Dv+ffsPkf1rKwGEZW2CH11MX7D21j9VFPGA4Dbr8w1ECby6xfFVrkfZYW\n/kZyvLCEe+ZvJPN4Ia/cPpibBnb0dEjNn60C/vtHWP17q594x8HQfQx0G2Nd2OPfSNc7GGMdyW9a\nALs+BVup9d43vAz9brX6gp8X1xemLbG+HJb9Bt66GpKmwpgnXf+iMgbS/wbLfmu9vuFP1j5qKsQ1\n9YHP2kD+sRw69Bl5obhHJ1r92pW6BC38jeDbk+e4a/43HC8o5a/3DOXKltBt01budIXkWSg/e/Hr\n8w/xgwETrQtk3OlEJnz8E8hOs4ptTA9rrJe1c6xxX4IjIfFKaxyT7mMgumvD37MwF7a+bxX8/APW\nFZ5D7rYeHQZUv50IDPiR1bMl9QX45g3Y9Qn84Onam39OH4HPfgYHVkHiVTB+rnXVaF1V6QOfkZpK\nhytT6r4f5ZO08LvZ7mMF3D1/A+U2O+/fN5zBzbF5xxjY/blVVE8ftgq6raz27c5b9Rxc9Surfbgh\nl7mfj2Xj21Ybt38Q3DbfKvwAox+3Luk/sBr2fwWZK2HPF9ayNt0u/BroesXF+7TbrDbys8ehKBeK\n8qzns47n8/NO7gN7BXQZZeXTZ7x1ZO2qkEgY9zsYfCcscTT/pL8LP3zJ6vZYNc9N78LS3wAGfvhH\nSL5Xm1uUR2jhd6MNB/P58bsbCQ8O4P3pI+kRF+HpkL7rwGr46v+srnuxPa0rLs9fFh/kdFl85eXy\n4RcvO3MElj8FSx+HDW9aR7l9bq5fASs4Bp/OtIp6tzHW0W9klSax0Gjoe7P1MMa6WnT/Ssj8Cjb9\nHTbMA79AhrRKhN1BVlE/d/LiE5nnBYZZl/uHt4PY7tDrequbYmyP+vwlL4jrA1P/bfX6WfZreGuM\ndSXqmKes5p8zWfDZQ1aeiVdabfnRXRr2nko1gBZ+N1mxK5eZ72+iU3Qof//xcDpF1eHIsSlkb7IK\n/oFUa7Co8a9ZJ/zqepl9WBu4+xPIXGG1Uf9zKsQPhWues3qauGrHR/DFI1BRap0gHTq99i8PkQvN\nGyN+CuUlVq+X/Sux7/zKajLplHShuIe3uzDdqh0EN+LwGCJWE1jPa61zFOv/Yp0vGDTFakqy2+CH\ncyDpXveM4a5UA2jhd4PF6Vn86qNt9O0Yyd+mDiUmvBnddCJvL6x8xhpQKywGrn3eamIIbGC30u4/\ngMtGw5b3YeWzMP8a6/L9HzwNMd2q3674lNUrZsdiq0jfMs86+q6PwBDoNhq6jWZL0JjmMTpnSCRc\n+5xV8Jf8EtbNha7ft37NuOO8hFJuoIW/gd5ac4DnluxmVPcY3rwrmfBgN/xJs9Kso8aoztZFNJ2G\nWM0ydRkT5UyWdeJxy3tWE8dVs2DkTPeOheLnD0Pugn4T4Ou58L9XrNEVh06HKx/77oVG+1fCJzOt\ntvfRv4YrHmmakRM9Ia4PTP3Cuiq2bS89ylfNipf+r2t8xhh+/58M3li9n+v7t+flSYMIDnDDFbmn\nDsP7k6w26sPrrBOfYLW1dxh44WKbjkOsI8iqzSNnT1pdIje8BRgYfj98/xfQqm43d6iToFaQ8iur\nS2Lq76x29y0fwPcfsd7f2GHFU9b82J5w+3v17/vekohYXwBKNTNa+OuhwmbniY+3sygtiynDOzN7\nfD/8/dzQO6O0ED64HezlMP0rq+fKyUzrROz5AbM2vGX1MwfrAh2nL4Iuhz6Hr++0umIOnAwps+rX\nVbC+IuKsK0+H32+dAF7xlPXFFRBs5TH8fqspqC49Z5RSbqeFv45Kym089MFmlu3K5aGru/P/xvZ0\nz4Brdht8NB3yMuDOjy70NGnb03oMmmy9riizrh6tHDlxs9Ut09hJBOh1A1z9W2jXq+Ex1Ve73jBl\nkdWDaPlvrV8hd31itccrpTxOC38d/eaTHSzblcvTN/Zh6ig3Dq284mnY+x+rh0tNBTIgCDoOsh7J\n91rzys5BzjY2btvD0Bumui+mhrrsKpix2uqGqW3cSjUbWvjroKTcxr+3HWPysM7uLfqb37MG4Bp6\nHwy7r+7bB4VB5xGcPVBS+7pNTUQvUlKqmdHDsDpYu+8ExeU2ru/f3n07PbzOuuLzshQY94L79quU\nUtXQwl8Hy3bmEBESwPDEeoyHfimnDsGHU6yrOCe+471dG5VSzYoWfhdV2Oys2J3L1b3aERTghj9b\nSQF8MNkaK2byh+4f9EwpparhUgUTkXEikiEimSIy6xLLu4jIVyKyTURSRSTeMX+0iGxxepSIyM3u\nTqIppB8+xalz5Vzb1w3NPM49eH60oP5XriqlVD3UWvhFxB94DbgO6ANMFpGqV6W8BCwwxgwAZgPP\nAxhjVhljBhljBgFXA+eAZW6Mv8ks3ZlLUICfe4ZYXvEU7FsK1//BattXSqkm5MoR/zAg0xhzwBhT\nBiwExldZpw+w0jG96hLLAW4DvjTGnKtvsJ5ijGHZrhyu6B7b8CEZNv8Dvv4zDJsBQ3/sngCVUqoO\nXCn8nYAjTq+zHPOcbQUmOKZvASJEpOoZ0NuBD+oTpKf
tPlZI1qlirukT17AdHf4aPv+5dTORa593\nT3BKKVVHYoypeQWR24Bxxpjpjtd3AcONMQ86rdMRmAskAmuAW4F+xpjTjuUdgG1AR2NM+SXeYwYw\nAyAuLi5p4cKF9U6oqKiI8HD3Dr/7SWYZn2aW88roMCKD69cnPaQ4lyGbHqUiIIJNQ16kItD9QwQ3\nRu4thS/nDr6dvy/nDhfyHz16dLoxJtmljYwxNT6AkcBSp9ePA4/XsH44kFVl3sPAvNreyxhDUlKS\naYhVq1Y1aPtLGfenNea2v/yv/jsoPmPM3OHGPN/ZmBOZ7gusisbIvaXw5dyN8e38fTl3Yy7kD6QZ\nF2qsMcalK3c3Aj1EJBHIxmqyucN5BRGJBfKNMXbHF8P8KvuY7Jjf4hzJP8fuYwX8+vre1a9UXgIl\np62x5ovPP5+6MO/gGus2f3d9XPNY9Uop1QRqLfzGmAoReRBYCvgD840xO0VkNtY3zGdACvC8iBis\npp6Z57cXka5AArDa7dE3gWW7cgG4pq+jfX/7Ykibf3GBryiufgfiZ42ieeOr1m33lFLKw1zqomKM\nWQIsqTLvSafpxcDiarY9xHdPBrcYy3bm0Kt9BF1iWkHJGevkbKsYiOsHIYMhNMq6+Co0+uLpEMd0\ncKQOUKYqFHSiAAAay0lEQVSUalZ0jIAanCwqZeOhfB4c7bjAatPfoawQpn5ujYOvlFItkB6K1uCr\nPcexG7imb3uwVcA3b0CXK7ToK6VaNC38NVi2M5eOrUPo2zESdn8KZ47A5Q/WvqFSSjVjWvirca6s\ngrX78rimb3sErJuJx3SHHtd6OjSllGoQLfzVWLM3j9IKu9Wb59v11q0ORzygJ2qVUi2eVrFqLNuZ\nS+vQQIZ1bQPr5lpdMgdO9nRYSinVYFr4L6HcZuerPccZ07sdAacPwp5/WwOqBYV5OjSllGowLfyX\nsPFgPmeKy7mmT3tY/xfwD7Tuh6uUUl5AC/8lLN2ZQ0igH1clBMCW96D/jyCigSNzKqVUM6GFvwpj\nDMt25fL9Hm0J3fYulJ+DkQ94OiyllHIbLfxV7Mgu4NiZEq7t1Qa+mWeNnR/X19NhKaWU22jhr2LZ\nrhz8BMaZ/0FRDoycWftGSinVgmjhr2LpzhyGdY0mfNMb0LY3dBvj6ZCUUsqttPA7OXjiLHtzi7in\nw7eQu8M62pf63XFLKaWaKy38TpbvygEg5eSH0KodDPiRhyNSSin308LvZNnOXK5td5rQwyth2H0Q\nEOzpkJRSyu10PH6HvMJS0r89xeedl8G5EEj+sadDUkqpRqFH/A4rdufSxpyhz4kvrTF5WsV4OiSl\nlGoUWvgdlu3MYWZ4Kn62Uu3CqZTyalr4gaLSCtIyjzGJZdBzHMT28HRISinVaLTwA6kZx/kha2hV\ncQpG6h22lFLezaXCLyLjRCRDRDJFZNYllncRka9EZJuIpIpIvNOyziKyTER2i8guEenqvvDdY/mO\nY8wI/A+m/QDoeoWnw1FKqUZVa+EXEX/gNeA6oA8wWUT6VFntJWCBMWYAMBt43mnZAuAPxpjewDDg\nuDsCd5eyCjvlGcu4jCxk5IN6wZZSyuu5csQ/DMg0xhwwxpQBC4HxVdbpA6x0TK86v9zxBRFgjFkO\nYIwpMsacc0vkbrL+wEnusH9OSWgc9L3F0+EopVSjc6UffyfgiNPrLGB4lXW2AhOAV4BbgAgRiQF6\nAqdF5F9AIrACmGWMsTlvLCIzgBkAcXFxpKam1j0Th6Kiojptv3JrBrP9d7I37m6O/vfrer9vc1DX\n3L2JL+cOvp2/L+cO9cvfXRdwPQrMFZGpwBogG7A59v99YDDwLfAhMBX4q/PGxph5wDyA5ORkk5KS\nUu9AUlNTcXV7u91wdvXLlEgIPSfNpmdodL3ftzmoS+7expdzB9/O35dzh/rl70pTTzaQ4PQ63jGv\nkjHmqDFmgjFmMPBrx7zTWL8OtjiaiSqAT4AhdYqwEe3cm8FY+1qyu94KLbzoK6WUq1wp/BuBHiKS\nKCJBwO3AZ84riEisiJzf1+PAfKdto0SkreP11cCuhoftHoVpiwgSGzFjHvJ0KEop1WRqLfyOI/UH\ngaXAbmCRMWaniMwWkZscq6UAGSKyF4gDnnNsa8NqBvpKRLYDArzl9izqKfT4Zo4RS1R8L0+HopRS\nTcalNn5jzBJgSZV5TzpNLwYWV7PtcmBAA2JsNB2KdnE4tDcdPB2IUko1IZ+9ctcU5dHensPp6Gb5\nnaSUUo3GZwt//r511kR8kmcDUUqpJuazhb9o/zfYjBDdbZinQ1FKqSbls4Xf/9gm9poEunVq5+lQ\nlFKqSflm4TeGNqe3s8uvBzHhentFpZRv8c3Cn3+AMFsheZH9PB2JUko1OZ8s/CYrDYCy9oM9HIlS\nSjU9n7zZevGhDRgTTGTn/p4ORSmlmpxPFn7bkTR2mUS6x0V5OhSllGpyvtfUU1FK2MmdbLZ3p3u7\ncE9Ho5RSTc73Cn/uDvxNOXv9exIXqT16lFK+x/cKf/YmAApiByJ6m0WllA/yvcKflcYJomjTvqun\nI1FKKY/wucJvy0pjs60b3eMiPB2KUkp5hG8V/uJT+OdnstnejR7ttPArpXyTbxX+o5sB2Gq6aY8e\npZTP8q3Cn5UOwF7/nnSKCvVwMEop5Rm+dQFXdjrZAQm0j2mHn5/26FFK+SbfOeI3BrLT2KIXbiml\nfJzvHPGfOQJn81hX3lULv1LKp7l0xC8i40QkQ0QyRWTWJZZ3EZGvRGSbiKSKSLzTMpuIbHE8PnNn\n8HXiGJFzi70bPbTwK6V8WK1H/CLiD7wGjAWygI0i8pkxZpfTai8BC4wx74rI1cDzwF2OZcXGmEFu\njrvustOx+QWRYTrTQ/vwK6V8mCtH/MOATGPMAWNMGbAQGF9lnT7ASsf0qkss97zsTeSE9kT8g0iI\n1h49Sinf5UobfyfgiNPrLGB4lXW2AhOAV4BbgAgRiTHGnARCRCQNqABeMMZ8UvUNRGQGMAMgLi6O\n1NTUuuZRqaio6Dvbi93GFVnppPtfTbtQw3/Xrqn3/puzS+XuK3w5d/Dt/H05d6hf/u46ufsoMFdE\npgJrgGzA5ljWxRiTLSKXAStFZLsxZr/zxsaYecA8gOTkZJOSklLvQFJTU/nO9jnbYU0pmwN6MSix\nPSkpQ+q9/+bskrn7CF/OHXw7f1/OHeqXvytNPdlAgtPreMe8SsaYo8aYCcaYwcCvHfNOO56zHc8H\ngFSg6e936Dixu7Kwsw7VoJTyea4U/o1ADxFJFJEg4Hbgot45IhIrIuf39Tgw3zE/WkSCz68DjAKc\nTwo3jex0KoKjOWza0SNOe/QopXxbrYXfGFMBPAgsBXYDi4wxO0Vktojc5FgtBcgQkb1AHPCcY35v\nIE1EtmKd9H2hSm+gppGdzonW/QDRPvxKKZ/nUhu/MWYJsKTKvCedphcDiy+x3deAZ+9oXloIx3ez\nP/7H+PsJXW
NaeTQcpZTyNO8fsuHoFsCwqeIyusaEERTg/SkrpVRNvL8KZlsjcn5VmKAndpVSCh8p\n/CaqK9tPBWj7vlJK4SOFvzB2IDa70R49SimFtxf+gmNQkM2RsN4AesSvlFJ4e+F3tO/voAci0K2t\nFn6llPLu8fiz08EvgHXnOtG5TQkhgf6ejkgppTzOy4/40yCuH7vzyumuR/tKKQV4c+G32yF7M/aO\nQzhwoojuemJXKaUAby78J/ZCWSEnWvej3Ga0D79SSjl4b+F3nNjdG9gLQG+3qJRSDl5c+NMgOJKt\nxW0B6KaFXymlAK8u/OnQcTD7jp+lY+sQwoO9uwOTUkq5yjsLf3kx5O6ETknsO15Ed725ulJKVfLO\nwn9sG9grsHdKYn9ekbbvK6WUE+8s/I4Tu8fC+1JSbtfCr5RSTry08KdBZCcyzoYBOkaPUko589LC\nn2617+cWAVr4lVLKmdd1dQksOwOnDkHSNDKPFdE2IpiosCBPh6VUs1deXk5WVhYlJSWeDqVOWrdu\nze7duz0dRpMJCQkhPj6ewMDAeu/D6wp/ROE+ayI+mX1b9cSuUq7KysoiIiKCrl27IiKeDsdlhYWF\nRET4Rs89YwwnT54kKyuLxMTEeu/HpaYeERknIhkikikisy6xvIuIfCUi20QkVUTiqyyPFJEsEZlb\n70hdFFmwF8QP02Egmce18CvlqpKSEmJiYlpU0fc1IkJMTEyDf5XVWvhFxB94DbgO6ANMFpE+VVZ7\nCVhgjBkAzAaer7L8GWBNgyJ1UUThPmjbm5ySAIpKK7R9X6k60KLf/LnjM3LliH8YkGmMOWCMKQMW\nAuOrrNMHWOmYXuW8XESSgDhgWYOjrY0xRBbsg05DnE7s+sZPQKVautOnT/P666/Xa9vrr7+e06dP\nuzki7+VK4e8EHHF6neWY52wrMMExfQsQISIxIuIHzAEebWigLsk/QGBFIcQnk3ncKvx6n12lWoaa\nCn9FRUWN2y5ZsoSoqKjGCKtBjDHY7XZPh/Ed7jq5+ygwV0SmYjXpZAM24AFgiTEmq6afJyIyA5gB\nEBcXR2pqar2CaJe7mj7AxmOGNYf2Eh4I2zd+7TM/X4uKiur9t2vpfDl3cE/+rVu3prCw0D0B1cMv\nfvEL9u/fz4ABAxg9ejTXXnstzz77LFFRUezdu5fNmzczefJksrOzKSkp4ac//SnTpk3DZrPRpUsX\nVq9eTVFREbfeeisjR47km2++oUOHDixcuJDQ0NCL3uvLL7/kxRdfpLy8nDZt2vD222/Trl07ioqK\n+OUvf8nmzZsREWbNmsX48eNZvnw5s2fPxmazERMTw+eff87vfvc7wsPDeeihhwAYPnw4ixYtAuCW\nW24hOTmZLVu2sHjxYl5++WU2bdpEcXEx48eP59e//jUA6enp/OpXv+LcuXMEBQXx+eefM3HiRF58\n8UUGDBgAwDXXXMOcOXPo379/ZfwlJSWVn3d9PntXCn82kOD0Ot4xr5Ix5iiOI34RCQduNcacFpGR\nwPdF5AEgHAgSkSJjzKwq288D5gEkJyeblJSUOiVR6csvsfkFM/S6uzj71gZ6d4LRoy+v375aoNTU\nVOr9t2vhfDl3cE/+u3fvruwd83+f72TX0QI3RHZBn46RPHVj32qXz5kzh4yMDLZt2wZYOW3dupUd\nO3ZU9mBZsGABbdq0obi4mKFDhzJlyhSCgoIQEcLDrV/3+/fv58MPP2TQoEH86Ec/YtmyZdx5550X\nvdfYsWOZOHEiIsLbb7/N66+/zpw5c3j22WeJjY1l586dAJw6dYqSkhIefvhh1qxZQ2JiIvn5+URE\nRBAcHExwcHDl38zPz++iGP7+978zYsQIAF588UXatGmDzWZjzJgxHDx4kF69enHvvffy4YcfMnTo\nUAoKCggLC2PGjBn885//ZNSoUezdu5fy8nIuv/ziOhYSEsLgwYMr/051/exdaerZCPQQkUQRCQJu\nBz5zXkFEYh3NOgCPA/MBjDFTjDGdjTFdsX4VLKha9N0qO53CiG4YP39rcDZt31eqRRs2bNhF3RZf\nffVVBg4cyIgRIzhy5Aj79u37zjaJiYkMGjQIgKSkJA4dOvSddbKysrj22mvp378/f/jDHyoL/YoV\nK5g5c2bletHR0axfv54rr7yyMo42bdrUGneXLl0qiz7AokWLGDJkCIMHD2bnzp3s2rWLjIwMOnTo\nwNChQwGIjIwkICCAiRMn8sUXX1BeXs78+fOZOnVq7X+oOqr1iN8YUyEiDwJLAX9gvjFmp4jMBtKM\nMZ8BKcDzImKwmnpmVrvDxlJRBse2UdjhOirOlnH6XLl25VSqnmo6Mm9KrVq1qpxOTU1lxYoVrFu3\njrCwMFJSUi7ZrTE4OLhy2t/fn+Li4u+s87Of/YxHHnmEm266idTUVJ5++uk6xxYQEHBR+71zLM5x\nHzx4kJdeeomNGzcSHR3N1KlTa+yOGRYWxtixY/n0009ZtGgR6enpdY6tNi714zfGLDHG9DTGdDPG\nPOeY96Sj6GOMWWyM6eFYZ7oxpvQS+3jHGPOge8N3cu4kxCdzOqpPZY8ePbGrVMsRERFR4zmGM2fO\nEB0dTVhYGHv27GH9+vX1fq8zZ87QqZPVR+Xdd9+tnD927Fhee+21ytenTp1ixIgRrFmzhoMHDwKQ\nn58PQNeuXdm0aRMAmzZtqlxeVUFBAa1ataJ169bk5uby5ZdfAvC9732PY8eOsXHjRsC6EO38Sezp\n06fz0EMPMXToUKKjo+udZ3W8Z6yeyA4wbQknY4eTedz6x6P32VWq5YiJiWHUqFH069ePX/7yl99Z\nPm7cOCoqKujduzezZs26qCmlrp5++mkmTpxIUlISsbGxlfN/85vfcOrUKfr168fAgQNZtWoVbdu2\nZd68eUyYMIGBAwcyadIkAG699Vby8/Pp27cvc+fOpWfPnpd8r4EDBzJ48GB69erFHXfcwahRowAI\nCgriww8/5Gc/+xkDBw5k7Nixlb8EkpKSiIyMZNq0afXOsUbGmGb1SEpKMg2xatUq89tPtpu+T/7H\n2O32Bu2rpVm1apWnQ/AYX87dGPfkv2vXroYH4gEFBQWeDsHtsrOzTY8ePYzNZrvkcufP6vxnj9X0\n7lKd9Z4jfieZx4vo3i7cZ7pxKqW8x4IFCxg+fDjPPfccfn6NU6K9bpA2gH3Hi0jp2dbTYSilVJ3d\nfffd3H333Y36Hl53xF9UZsgrLNUTu0opVQ2vK/zHzlrdq3RwNqWUujSvK/zZRVbh1x49Sil1aV5X\n+I8V2QkJ9KNTVGjtKyullA/yusKffdbQvV04fn7ao0cpb3d+bBxVN15X+I8W2bWZRynVJGobLrq5\n8qrCX1RaQX6J0RO7SrVAs2bNumi4hKeffpqXXnqJoqIixowZw5AhQ+jfvz+ffvpprfu6+eabSUpK\nom/fvsybN69y/n/+8x+GDBnCwIEDGTNmDGANazxt2jT69+/PgAED+Oi
jj4CLf00sXry4crC0qVOn\ncv/99zN8+HAee+wxNmzYwMiRIxk8eDCXX345GRkZANhsNh599FH69evHgAED+POf/8zKlSu5+eab\nK/e7fPlybrnllvr/0erJq/rx7z9+/q5bWviVapAvZ0HOdvfus31/uO6FahdPmjSJn//855WjYy5a\ntIilS5cSEhLCxx9/TGRkJCdOnGDEiBHcdNNNNV6gOX/+/IuGb7711lux2+3cd999Fw2vDPDMM8/Q\nunVrtm+38j116lStqWRlZfH111/j7+9PQUEBa9euJSAggBUrVvDEE0/w0UcfMW/ePA4dOsSWLVsI\nCAggPz+f6OhoHnjgAfLy8mjbti1/+9vfuPfee+vyV3QLryr8+87fdUsLv1ItzuDBgzl+/DhHjx4l\nLy+P6OhoEhISKC8v54knnmDNmjX4+fmRnZ1Nbm4u7du3r3Zfr776Kh9//DFA5fDNeXl5lxxeecWK\nFSxcuLByW1cGRZs4cSL+/v6ANeDbPffcw759+xARysvLK/d7//33ExAQcNH73XXXXfzjH/9g2rRp\nrFu3jgULFtT1T9VgXlb4CwkQ6NwmzNOhKNWy1XBk3pgmTpzI4sWLycnJqRwM7b333iMvL4/09HQC\nAwPp2rVrjcMauzp8c22cf1FU3d552OXf/va3jB49mo8//phDhw7VelOUadOmceONNxISEsLEiRMr\nvxiakle18WfmFtG+lRDg71VpKeUzJk2axMKFC1m8eDETJ04ErCPqdu3aERgYyKpVqzh8+HCN+6hu\n+Obqhle+1FDMYN0Gdvfu3djt9spfD9W93/khnt95553K+WPHjuXNN9+sPAF8/v06duxIx44defbZ\nZxtv9M1aeFWF3He8iI7hXpWSUj6lb9++FBYW0qlTJzp06ADAlClTSEtLo3///ixYsIBevXrVuI/q\nhm+ubnjlSw3FDPDCCy9www03cPnll1fGcimPPfYYjz/+OIMHD76ol8/06dPp3LkzAwYMYODAgbz/\n/vuVy6ZMmUJCQgK9e/eu3x+qoVwdxrOpHvUdlrm4rMJ0nfWFefitpfXa3hv48tDEvpy7MTosc0sz\nc+ZM8/bbb9d7+4YOy+w1bfxFpRXcOKAj3QPzPR2KUkpVKykpiVatWjFnzhyPxeA1hT82PJhXJw8m\nNTXV06EopVS1GuMeunWlDeJKKeVjXCr8IjJORDJEJFNEZl1ieRcR+UpEtolIqojEO83fJCJbRGSn\niNzv7gSUUu5jNRWr5swdn1GthV9E/IHXgOuAPsBkEelTZbWXgAXGmAHAbOB5x/xjwEhjzCBgODBL\nRDo2OGqllNuFhIRw8uRJLf7NmDGGkydPEhIS0qD9uNLGPwzINMYcABCRhcB4YJfTOn2ARxzTq4BP\nHEGWOa0TjDYtKdVsxcfHk5WVRV5enqdDqZOSkpIGF8KWJCQkhPj4+Abtw5XC3wk44vQ6C+vo3dlW\nYALwCnALECEiMcaYkyKSAPwb6A780hhztEERK6UaRWBgYOVwBi1JamoqgwcP9nQYLYrU9rNORG4D\nxhljpjte3wUMN8Y86LROR2AukAisAW4F+hljTldZ5xPgRmNMbpX3mAHMAIiLi0tyHjejroqKinx2\njG7N3TdzB9/O35dzhwv5jx49Ot0Yk+zSRrV19AdGAkudXj8OPF7D+uFAVjXL5gO31fR+9b2Aq+rF\nDL5Ic/ddvpy/L+duTP0u4HKlzX0j0ENEEkUkCLgd+Mx5BRGJFZHz+3rcUeARkXgRCXVMRwNXABku\nfSMppZRqFLW28RtjKkTkQWAp4A/MN8bsFJHZWN8wnwEpwPMiYrCaemY6Nu8NzHHMF+AlY0yNg3yn\np6efEJGaR2GqWSxwogHbt2Sau+/y5fx9OXe4kH8XVzeotY2/pRGRNONqO5eX0dx9M3fw7fx9OXeo\nX/7avVIppXyMFn6llPIx3lj459W+itfS3H2XL+fvy7lDPfL3ujZ+pZRSNfPGI36llFI18JrCX9sI\not5ORA6JyHbHSKhpno6nMYnIfBE5LiI7nOa1EZHlIrLP8RztyRgbUzX5Py0i2Y7Pf4uIXO/JGBuL\niCSIyCoR2eUY8fdhx3yv//xryL3On71XNPU4RhDdC4zFGktoIzDZGLOrxg29iIgcApKNMV7fn1lE\nrgSKsEaE7eeY9yKQb4x5wfHFH22M+ZUn42ws1eT/NFBkjHnJk7E1NhHpAHQwxmwSkQggHbgZmIqX\nf/415P4j6vjZe8sRf+UIosYaEfT8CKLKCxlj1gBV77E5HnjXMf0u1n8Ir1RN/j7BGHPMGLPJMV0I\n7MYaSNLrP/8acq8zbyn8lxpBtF5/kBbMAMtEJN0x6J2viTPGHHNM5wBxngzGQx503Axpvjc2dVQl\nIl2BwcA3+NjnXyV3qONn7y2FX8EVxpghWDfMmeloDvBJjgGrWn4bZt38BegGDMK6AZLn7uTdBEQk\nHPgI+LkxpsB5mbd//pfIvc6fvbcU/mwgwel1vGOezzDGZDuejwMfYzV/+ZJcRxvo+bbQ4x6Op0kZ\nY3KNMTZjjB14Cy/+/EUkEKvwvWeM+Zdjtk98/pfKvT6fvbcU/lpHEPVmItLKcbIHEWkFXAPsqHkr\nr/MZcI9j+h7gUw/G0uTOFz2HW/DSz19EBPgrsNsY80enRV7/+VeXe30+e6/o1QPg6ML0Jy6MIPqc\nh0NqMiJyGdZRPlgjrr7vzfmLyAdYI8LGArnAU1g3+VkEdAYOAz8yxnjlCdBq8k/B+qlvgEPAT5za\nvL2GiFwBrAW2A3bH7Cew2rq9+vOvIffJ1PGz95rCr5RSyjXe0tSjlFLKRVr4lVLKx2jhV0opH6OF\nXymlfIwWfqWU8jFa+JVSysdo4VdKKR+jhV8ppXzM/wcKztNMbKTPsAAAAABJRU5ErkJggg==\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x7fae88eff400>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"for epoch in range(25):\n",
"\n",
" for x_batch,y_batch in iterate_minibatches(X_train,y_train,batchsize=32,shuffle=True):\n",
" train(network,x_batch,y_batch, regularise=True)\n",
" \n",
" train_log.append(np.mean(predict(network,X_train)==y_train))\n",
" val_log.append(np.mean(predict(network,X_val)==y_val))\n",
" \n",
" clear_output()\n",
" print(\"Epoch\",epoch)\n",
" print(\"Train accuracy:\",train_log[-1])\n",
" print(\"Val accuracy:\",val_log[-1])\n",
" plt.plot(train_log,label='train accuracy')\n",
" plt.plot(val_log,label='val accuracy')\n",
" plt.legend(loc='best')\n",
" plt.grid()\n",
" plt.show()\n",
" "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Peer-reviewed assignment\n",
"\n",
"Congradulations, you managed to get this far! There is just one quest left undone, and this time you'll get to choose what to do.\n",
"\n",
"\n",
"#### Option I: initialization\n",
"* Implement Dense layer with Xavier initialization as explained [here](http://bit.ly/2vTlmaJ)\n",
"\n",
"To pass this assignment, you must conduct an experiment showing how xavier initialization compares to default initialization on deep networks (5+ layers).\n",
"\n",
"\n",
"#### Option II: regularization\n",
"* Implement a version of Dense layer with L2 regularization penalty: when updating Dense Layer weights, adjust gradients to minimize\n",
"\n",
"$$ Loss = Crossentropy + \\alpha \\cdot \\underset i \\sum {w_i}^2 $$\n",
"t an experiment showing if regularization mitigates overfitting in case of abundantly large number of neurons. Consider tuning $\\alpha$ for better results.\n",
"\n",
"#### Option III: optimization\n",
"* Implement a version of Dense la\n",
"To pass this assignment, you must conducyer that uses momentum/rmsprop or whatever method worked best for you last time.\n",
"\n",
"Most of those methods require persistent parameters like momentum direction or moving average grad norm, but you can easily store those params inside your layers.\n",
"\n",
"To pass this assignment, you must conduct an experiment showing how your chosen method performs compared to vanilla SGD.\n",
"\n",
"### General remarks\n",
"_Please read the peer-review guidelines before starting this part of the assignment._\n",
"\n",
"In short, a good solution is one that:\n",
"* is based on this notebook\n",
"* runs in the default course environment with Run All\n",
"* its code doesn't cause spontaneous eye bleeding\n",
"* its report is easy to read.\n",
"\n",
"_Formally we can't ban you from writing boring reports, but if you bored your reviewer to death, there's noone left alive to give you the grade you want._\n",
"\n",
"\n",
"### Bonus assignments\n",
"\n",
"As a bonus assignment (no points, just swag), consider implementing Batch Normalization ([guide](https://gab41.lab41.org/batch-normalization-what-the-hey-d480039a9e3b)) or Dropout ([guide](https://medium.com/@amarbudhiraja/https-medium-com-amarbudhiraja-learning-less-to-learn-better-dropout-in-deep-machine-learning-74334da4bfc5)). Note, however, that those \"layers\" behave differently when training and when predicting on test set.\n",
"\n",
"* Dropout:\n",
" * During training: drop units randomly with probability __p__ and multiply everything by __1/(1-p)__\n",
" * During final predicton: do nothing; pretend there's no dropout\n",
" \n",
"* Batch normalization\n",
" * During training, it substracts mean-over-batch and divides by std-over-batch and updates mean and variance.\n",
" * During final prediction, it uses accumulated mean and variance.\n"
]
},
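{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you pick Option I, here is a minimal sketch of Xavier/Glorot initialization (our reading of the linked article, not the graded solution): scale the weights so that activation variance stays roughly constant from layer to layer, which matters more as networks get deeper.\n",
"\n",
"```\n",
"# inside Dense.__init__, instead of randn(...) * 0.01:\n",
"scale = np.sqrt(2.0 / (input_units + output_units))\n",
"self.weights = np.random.randn(input_units, output_units) * scale\n",
"```"
]
}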
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.2"
},
"toc": {
"colors": {
"hover_highlight": "#DAA520",
"navigate_num": "#000000",
"navigate_text": "#333333",
"running_highlight": "#FF0000",
"selected_highlight": "#FFD700",
"sidebar_border": "#EEEEEE",
"wrapper_background": "#FFFFFF"
},
"moveMenuLeft": true,
"nav_menu": {
"height": "264px",
"width": "252px"
},
"navigate_menu": true,
"number_sections": true,
"sideBar": true,
"threshold": 4,
"toc_cell": false,
"toc_section_display": "block",
"toc_window_display": false,
"widenNotebook": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}