Skip to content

Instantly share code, notes, and snippets.

@alessiot
Last active September 30, 2019 17:30
Show Gist options
  • Save alessiot/9f09625d3832c2e22213b0e8ac8f7fc7 to your computer and use it in GitHub Desktop.
Save alessiot/9f09625d3832c2e22213b0e8ac8f7fc7 to your computer and use it in GitHub Desktop.
Programming Neural Networks from Scratch
Display the source blob
Display the rendered blob
Raw
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Programming Neural Networks from Scratch\n",
"\n",
"In order to understand how to make changes to the optimization process in a neural network model, we will start by illustrating how to program neural networks in basic Python, that is by using numpy. I will be reusing the [code](https://github.com/SkalskiP/ILearnDeepLearning.py) written by Piotr Skalski and reviewed in his [article on Medium](https://towardsdatascience.com/lets-code-a-neural-network-in-plain-numpy-ae7e74410795)"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"import VisualizeNN as VisNN\n",
"import numpy as np"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"data = {'F1':[0,0,1,1], 'F2':[0,1,0,1], \"Target\":[0,1,1,0]} #XOR\n",
"df = pd.DataFrame(data) \n",
"\n",
"X_xor = df[[\"F1\",\"F2\"]]\n",
"y_xor = df[\"Target\"].values.tolist()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Neural Network Architecture and Initialization"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"NN_ARCHITECTURE = [\n",
" {\"input_dim\": X_xor.shape[1], \"output_dim\": 2, \"activation\": \"sigmoid\"},\n",
" {\"input_dim\": 2, \"output_dim\": 1, \"activation\": \"sigmoid\"}\n",
"]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before training a neural network, one needs to initialize its weights. This is typically done by randomly generating small values to increase the algorithm efficiency to find optimal weight values during the first iterations. "
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"def init_layers(nn_architecture, seed = 99):\n",
" # random seed initiation\n",
" np.random.seed(seed)\n",
" # number of layers in our neural network\n",
" number_of_layers = len(nn_architecture)\n",
" # parameters storage initiation\n",
" params_values = {}\n",
" \n",
" # iteration over network layers\n",
" for idx, layer in enumerate(nn_architecture):\n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" \n",
" # extracting the number of units in layers\n",
" layer_input_size = layer[\"input_dim\"]\n",
" layer_output_size = layer[\"output_dim\"]\n",
" \n",
" # initiating the values of the W matrix\n",
" # and vector b for subsequent layers\n",
" params_values['W' + str(layer_idx)] = np.random.randn(\n",
" layer_output_size, layer_input_size) * 0.1\n",
" params_values['b' + str(layer_idx)] = np.random.randn(\n",
" layer_output_size, 1) * 0.1\n",
" \n",
" return params_values"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Activation Functions\n",
"\n",
"Activation functions are non-linear squashing functions that limit their output and act as perceptrons. For a review, one can refer to the Medium article [here](https://medium.com/binaryandmore/beginners-guide-to-deriving-and-implementing-backpropagation-e3c1a5a1e536)."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"def sigmoid(Z):\n",
" return 1/(1+np.exp(-Z))\n",
"\n",
"def relu(Z):\n",
" return np.maximum(0,Z)\n",
"\n",
"def tanh(Z):\n",
" return 1.0/(1.0 + np.exp(-2*Z)) - 1.0\n",
"\n",
"def softmax(Z):\n",
" #return np.exp(Z) / np.sum(np.exp(Z), axis=0)\n",
" # stable version, no overflowing or NaN\n",
" return np.exp(Z-np.max(Z)) / np.sum(np.exp(Z), axis=0)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Forward Propagation\n",
"\n",
"At any given layer $l$, input values $A^{l-1}$ fanned out from the previous layer $l-1$ are linearly transformed using the weights connecting layer $l-1$ to $l$, $W^{l}$, and the bias of layer $l$, $b^{l}$, as follows \n",
"\n",
"\\begin{equation}\n",
"Z^l = W^{l}\\cdot A^{l-1} + b^l.\n",
"\\end{equation}\n",
"\n",
"$Z^l$ is the vector of values held at the nodes of layer $l$.\n",
"A non-linear transformation is applied to $Z^{l}$ using activation functions as follows\n",
"\n",
"\\begin{equation}\n",
"A^l = g^{l}\\left(Z^l\\right).\n",
"\\end{equation}"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"def single_layer_forward_propagation(A_prev, W_curr, b_curr, activation=\"relu\"):\n",
" # calculation of the input value for the activation function\n",
" Z_curr = np.dot(W_curr, A_prev) + b_curr\n",
" \n",
" # selection of activation function\n",
" if activation is \"relu\":\n",
" activation_func = relu\n",
" elif activation is \"sigmoid\":\n",
" activation_func = sigmoid\n",
" elif activation is \"tanh\":\n",
" activation_func = tanh\n",
" elif activation is \"softmax\":\n",
" activation_func = softmax\n",
" else:\n",
" raise Exception('Non-supported activation function')\n",
" \n",
" # return of calculated activation A and the intermediate Z matrix\n",
" return activation_func(Z_curr), Z_curr\n",
"\n",
"def full_forward_propagation(X, params_values, nn_architecture):\n",
" \n",
" # creating a temporary memory to store the information needed for a backward step\n",
" memory = {}\n",
" \n",
" # X vector is the activation for layer 0 \n",
" A_curr = X\n",
" \n",
" # iteration over network layers\n",
" for idx, layer in enumerate(nn_architecture):\n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" # transfer the activation from the previous iteration\n",
" A_prev = A_curr\n",
" \n",
" # extraction of the activation function for the current layer\n",
" activ_function_curr = layer[\"activation\"]\n",
" # extraction of W for the current layer\n",
" W_curr = params_values[\"W\" + str(layer_idx)]\n",
" # extraction of b for the current layer\n",
" b_curr = params_values[\"b\" + str(layer_idx)]\n",
" # calculation of activation for the current layer\n",
" A_curr, Z_curr = single_layer_forward_propagation(A_prev, W_curr, b_curr, activ_function_curr)\n",
" \n",
" # saving calculated values in the memory\n",
" memory[\"A\" + str(idx)] = A_prev # what's coming into this layer from previous layer\n",
" memory[\"Z\" + str(layer_idx)] = Z_curr\n",
" memory[\"W\" + str(layer_idx)] = W_curr\n",
" memory[\"b\" + str(layer_idx)] = b_curr\n",
" \n",
" # return of prediction vector and a dictionary containing intermediate values\n",
" return A_curr, memory"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Error Backpropagation and Gradient Descend\n",
"\n",
"During backpropagation the gradients of the cost $C$ with respect to the network weights are calculated for each layer $l$. For layer $l$,\n",
"\n",
"\\begin{equation}\n",
"\\frac{dC}{dW^l} = \\frac{dC}{dZ^l}\\cdot (A^{l-1})^T,\n",
"\\end{equation}\n",
"\n",
"where \n",
"\n",
"\\begin{equation}\n",
"\\frac{dC}{dZ^l} = (W^{l+1})^T \\cdot \\frac{dC}{dZ^{l+1}} \\cdot g^{l '} \\left(Z^l\\right) = dA^l \\cdot g^{l '} \\left(Z^l\\right).\n",
"\\end{equation}"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"def sigmoid_backward(dA, Z):\n",
" sig = sigmoid(Z)\n",
" return dA * sig * (1 - sig)\n",
"\n",
"def relu_backward(dA, Z):\n",
" dZ = np.array(dA, copy = True)\n",
" dZ[Z <= 0] = 0;\n",
" return dZ;\n",
"\n",
"def tanh_backward(dA, Z):\n",
" ta = tanh(Z)\n",
" return dA * (1.0 - ta) * (1.0 + ta)\n",
"\n",
"def softmax_grad(x):\n",
" # x here is the result of softmax applied to an input vector\n",
" # Reshape the 1-d softmax to 2-d so that np.dot will do the matrix multiplication\n",
" s = x.reshape(-1,1)\n",
" return np.diagflat(s) - np.dot(s, s.T)\n",
"\n",
"def softmax_backward(dA, Z):\n",
" so = softmax_grad(Z)\n",
" return dA * so"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"def single_layer_backward_propagation(dA_curr, W_curr, b_curr, Z_curr, A_prev, activation=\"relu\"):\n",
" # number of examples\n",
" m = A_prev.shape[1]\n",
" \n",
" # selection of activation function\n",
" if activation is \"relu\":\n",
" backward_activation_func = relu_backward\n",
" elif activation is \"sigmoid\":\n",
" backward_activation_func = sigmoid_backward\n",
" elif activation is \"tanh\":\n",
" backward_activation_func = tanh_backward\n",
" elif activation is \"softmax\":\n",
" backward_activation_func = softmax_backward\n",
" else:\n",
" raise Exception('Non-supported activation function')\n",
" \n",
" # calculation of the activation function derivative\n",
" dZ_curr = backward_activation_func(dA_curr, Z_curr)\n",
" \n",
" # derivative of the matrix W\n",
" dW_curr = np.dot(dZ_curr, A_prev.T) / m\n",
" # derivative of the vector b\n",
" db_curr = np.sum(dZ_curr, axis=1, keepdims=True) / m\n",
" # derivative of the matrix A_prev\n",
" dA_prev = np.dot(W_curr.T, dZ_curr)\n",
"\n",
" return dA_prev, dW_curr, db_curr\n",
"\n",
"def full_backward_propagation(Y_hat, Y, memory, params_values, nn_architecture):\n",
" grads_values = {}\n",
" \n",
" # number of examples\n",
" m = Y.shape[1]\n",
" # a hack ensuring the same shape of the prediction vector and labels vector\n",
" Y = Y.reshape(Y_hat.shape)\n",
" \n",
" # initiation of gradient descent algorithm - dError to minimize\n",
" dA_prev = - (np.divide(Y, Y_hat) - np.divide(1 - Y, 1 - Y_hat)); #cross entropy\n",
" \n",
" for layer_idx_prev, layer in reversed(list(enumerate(nn_architecture))):\n",
" # we number network layers from 1\n",
" layer_idx_curr = layer_idx_prev + 1\n",
" # extraction of the activation function for the current layer\n",
" activ_function_curr = layer[\"activation\"]\n",
" \n",
" dA_curr = dA_prev\n",
" \n",
" A_prev = memory[\"A\" + str(layer_idx_prev)]\n",
" Z_curr = memory[\"Z\" + str(layer_idx_curr)]\n",
" \n",
" W_curr = params_values[\"W\" + str(layer_idx_curr)]\n",
" b_curr = params_values[\"b\" + str(layer_idx_curr)]\n",
" \n",
" dA_prev, dW_curr, db_curr = single_layer_backward_propagation(\n",
" dA_curr, W_curr, b_curr, Z_curr, A_prev, activ_function_curr)\n",
" \n",
" grads_values[\"dW\" + str(layer_idx_curr)] = dW_curr\n",
" grads_values[\"db\" + str(layer_idx_curr)] = db_curr\n",
" \n",
" return grads_values"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Once backpropagation is complete, the objective is to minimize the cost function by updating the weights during several iterations (epochs)."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"def update(params_values, grads_values, nn_architecture, learning_rate):\n",
"\n",
" # iteration over network layers\n",
" for layer_idx, layer in enumerate(nn_architecture, 1):\n",
" params_values[\"W\" + str(layer_idx)] -= learning_rate * grads_values[\"dW\" + str(layer_idx)] \n",
" params_values[\"b\" + str(layer_idx)] -= learning_rate * grads_values[\"db\" + str(layer_idx)]\n",
"\n",
" return params_values"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Monitoring the Learning Process\n",
"\n",
"The loss function is designed to show how far we are from the target solution. "
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"# Cross-entropy loss function\n",
"def get_cost_value(Y_hat, Y):\n",
" # number of examples\n",
" m = Y_hat.shape[1]\n",
" # calculation of the cost according to the formula\n",
" cost = -1 / m * (np.dot(Y, np.log(Y_hat).T) + np.dot(1 - Y, np.log(1 - Y_hat).T))\n",
" return np.squeeze(cost)\n",
"\n",
"def convert_prob_into_class(probs):\n",
" probs_ = np.copy(probs)\n",
" probs_[probs_ > 0.5] = 1\n",
" probs_[probs_ <= 0.5] = 0\n",
" return probs_\n",
"\n",
"def get_accuracy_value(Y_hat, Y):\n",
" Y_hat_ = convert_prob_into_class(Y_hat)\n",
" return (Y_hat_ == Y).all(axis=0).mean()\n",
"\n",
"def train(X, Y, nn_architecture, epochs, learning_rate, verbose=False, callback=None):\n",
" # initiation of neural net parameters\n",
" params_values = init_layers(nn_architecture, 2)\n",
" # initiation of lists storing the history \n",
" # of metrics calculated during the learning process \n",
" cost_history = []\n",
" accuracy_history = []\n",
" \n",
" # Store weights over time\n",
" weights_over_time = {}\n",
" for layer_idx, layer in enumerate(nn_architecture, 1):\n",
" if layer_idx == 1:\n",
" weights_over_time[-1] = params_values[\"W\" + str(layer_idx)].flatten()\n",
" else:\n",
" weights_over_time[-1] = np.append(weights_over_time[-1], params_values[\"W\" + str(layer_idx)].flatten())\n",
" \n",
" # performing calculations for subsequent iterations\n",
" for i in range(epochs):\n",
" # step forward\n",
" Y_hat, cashe = full_forward_propagation(X, params_values, nn_architecture)\n",
" \n",
" # calculating metrics and saving them in history\n",
" cost = get_cost_value(Y_hat, Y)\n",
" cost_history.append(cost)\n",
" accuracy = get_accuracy_value(Y_hat, Y)\n",
" accuracy_history.append(accuracy)\n",
" \n",
" # step backward - calculating gradient\n",
" grads_values = full_backward_propagation(Y_hat, Y, cashe, params_values, nn_architecture)\n",
" # updating model state\n",
" params_values = update(params_values, grads_values, nn_architecture, learning_rate)\n",
"\n",
" for layer_idx, layer in enumerate(nn_architecture, 1):\n",
" if layer_idx == 1:\n",
" weights_over_time[i] = params_values[\"W\" + str(layer_idx)].flatten()\n",
" else:\n",
" weights_over_time[i] = np.append(weights_over_time[i], params_values[\"W\" + str(layer_idx)].flatten())\n",
" \n",
" if(i % 50 == 0):\n",
" if(verbose):\n",
" print(\"Iteration: {:05} - cost: {:.5f} - accuracy: {:.5f}\".format(i, cost, accuracy))\n",
" if(callback is not None):\n",
" callback(i, params_values)\n",
" \n",
" return params_values, cost_history, accuracy_history, weights_over_time"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## XOR Again"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"array([[0, 0],\n",
" [0, 1],\n",
" [1, 0],\n",
" [1, 1]])"
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"X_xor = X_xor.to_numpy()\n",
"y_xor = np.array(y_xor)\n",
"\n",
"X_xor"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CPU times: user 13.5 s, sys: 68.2 ms, total: 13.6 s\n",
"Wall time: 13.6 s\n"
]
}
],
"source": [
"%time params_values, cost_history_bp, accuracy_history_bp, weights_over_time = train(np.transpose(X_xor), np.transpose(y_xor.reshape((y_xor.shape[0], 1))), NN_ARCHITECTURE, 100000, 0.1, verbose=False)"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"import matplotlib.axes as axes\n",
"%matplotlib inline"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAD8CAYAAAB0IB+mAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xd8HPWd//HXd5t6b1a1XOQm4yo3bAwGG9MCgR+EcBykkJDL5S4Jl/wukHC/3P0u97skd7lLIQWSkHaEACFUm5oAxjbYlm3ZlruKJav3Lu1qd7+/P3YkrWTJtmytZqX9PB+P9ex85zszn9HI7x3Nzs4qrTVCCCGmP4vZBQghhJgcEvhCCBEiJPCFECJESOALIUSIkMAXQogQIYEvhBAhQgJfCCFChAS+EEKECAl8IYQIETazC/CXnJysc3NzzS5DCCGmlP379zdprVMu1C+oAj83N5fCwkKzyxBCiClFKVVxMf3klI4QQoQICXwhhAgREvhCCBEiJPCFECJESOALIUSIkMAXQogQIYEvhBAhIqiuwxdCiFDicfdTV1pC9YmjpM2ey8wrlgV0fRL4QggxSdwuF7WnT1B59DBVx4upO30Kd78LgNW33SmBL4QQU5XX46G+rITK4kNUFh+i5uRx3P0ulMVCau4cll5/I5kL8smcv4jIuPiA1yOBL4QQE6ijsYGyA/soP7SfqmPFuHp7AEjJyWXJlhvJWbyUrIWLCYuMnPTaJPCFEOIyeNxuak4d94X8wUKaqyoBiEubwYL1G8lZvJTs/CVExsaZXKkEvhBCjFu/s4/yov2c+nAXZ4r24+zpxmK1kbUwn8WbtjB7xSoS0jNRSpld6jAS+EIIcRFcfb2UHyzk1Ie7KDu4D7fTSURMLHlrrmT28lXkXLHMlNM04yGBL4QQY/C43VQcPsjR9/5M2f69uPtdRMbFk7/xOuat3UDWwnwsVqvZZV40CXwhhBih4UwZx3b8meM736OnvY3wmFgWX7uF+WuvImPBQiyWqRPy/iTwhRAC33n54zvf49Cb22k4U4rFamP2ilXkX30ds5avxGqzm13iZZPAF0KEtNbaag69tZ3id9/G2d1NcvZMrv3U51iw/moiYmLNLm9CSeALIUJS9cnj7H3pOcr278VitZK3+kqWbb2ZzAX5QXd1zUSRwBdChAytNWeK9rP3pT9SdbyY8JhY1t15D0s230h0QqLZ5QWcBL4QIiRUHCli59O/oa70NNFJyVxz/2dZct1W7OHhZpc2aSTwhRDTWl3pad5/+jdUHikiJjmF6z/3RRZt3DQt3oQdLwl8IcS01N3Wyo7/eZJj779DREws19z/WZZuuRGbw2F2aaaZkMBXSj0J3AI0aK0XG22JwDNALnAG+JjWunUi1ieEEGPxejwUvfEqu559Ck+/izW3f4xVt94Z9J+CnQwTdYT/a+Ax4Ld+bQ8Df9Zaf1sp9bAx/rUJWp8QQpyjqfIMr/3kv2koLyV36Qqu/dTnSEjPNLusoDEhga+13qGUyh3RfBtwjfH8N8C7SOALIQLA6/VQ+MoL7H72fwiLiuYjDz1M3pr10/byyksVyHP4aVrrWgCtda1SKjWA6xJChKiu1ha2/eC7VB0vJm/1lWz+7BeC4lbEwcj0N22VUg8CDwLk5OSYXI0QYiqpOlbMK9//Nq6+Xm7424dYtPFaOao/j0AGfr1SKt04uk8HGkbrpLV+AngCoKCgQAewHiHENHLgtVd497c/Jz4tnbse/RbJOblmlxT0Ahn4LwOfAL5tDF8K4LqEECFCe72899Sv2P/qC8wpWMONX/iKXIFzkSbqssyn8b1Bm6yUqgK+iS/on1VKPQBUAndNxLqEEKHL43bz2o//i5O7d7Bs681s+uSDU/ZWxWaYqKt07hlj0nUTsXwhhPC43Wz/4X9was8uNtzzCVbfdqecrx8n09+0FUKIC/F6PLz22Pc4tWcX19z/GVbe/FGzS5qSLGYXIIQQ56O15i+/epyTH7zPxr/+tIT9ZZDAF0IEtQPbX+LQW9tZdev/YtVH7jC7nClNAl8IEbTKDu7j3d/9krw1V3LVPZ8wu5wpTwJfCBGUOpubeO3H/01KTi43fuEfUBaJq8slP0EhRNDxejxs++F38bhc3PLlh7GHhc6XlASSBL4QIujs3/Yi1SeOsfmzXyAxQ+52OVEk8IUQQaWtrpbdzz7FnIK1LNxwjdnlTCsS+EKIoKG15u1f/gSLzcp1D/yNfLBqgkngCyGCRvnBQioOH2T93fcRk5hsdjnTjgS+ECIoeL0e3v/9r4mfkc7SLTeaXc60JIEvhAgKx99/l6azFWz4+Cew2uxmlzMtSeALIUynvV72vvgcKTNnMW/terPLmbYk8IUQpis9sI+WmipWyR0wA0oCXwhhusJXnic2JZX5azeYXcq0JoEvhDBV09kKqk8cY/nWW7BY5ctMAkkCXwhhquJ33sJitbLoavm+pECTwBdCmMbj7ufY++8wZ+UaImPjzC5n2pPAF0KY5syhA/R2tJN/zWazSzGV9njQLlfA1yNfcSiEMM3pPR8QFhlF7tLlZpdyWXR/P572djwdnXg7O4YPuzrxdHTi6ezA6z/s6MDb0Y6nqwvd5yTp/o+R+vV/CWidEvhCCFN4PR5KD+xl9opVQfdBK6/LhaexEXdjI+6WVjytrXhaW3C3tuJpbcPT0oKntdUYb8Xb2Xn+BVoU1nArFgdY7R6stn5stn4ssV6sSRqL3UtkXH3At0sCXwhhiqrjR+nr7GDu6nWTul5PVzf91dX0V1fRX1eHu6EBd0OjMWzA3diIp7V11HmV3Y41NgprtB1bhJWIBC/WFAtWqw2r7sBqdWJ1eLE4vFjtenCoohNQMakQlQLRxjAqGSKTICLRN0ycFfBtl8AXQpiivKgQq83GrKUrJ3zZnvZ2nKVlOEtL6K+owFVVTX9VFf1VVXja2oZ3tlqxJSdjS0nGnpZAxJxUbBFe7I5ebJZ2rJ4GrK4abJZulE0z+LkwZYWYdIjNgLgFEJMBMTOMUE+BqFRfuEcmgzU4ojY4qhBChJza0ydInT0Xe/ilf5uV9npxlZfTe+QIfcVHcZaU4CwtwdPYNNhH2e3YMzKwZ2URnp+PPT0NR6wFe3gvdhqw9p1BNZ2A1kOgvUMLt0X4jroT5kLCFojPhthM45HhC3PL1PrcgAS+EGLSedz91JeWsPT68d0V0+ty0VtURM+HH9JTuJ++o0fxdncDoCIjCZs7l+gNVxE2dw6OOXMIy83Bbm1B1RVB9UGoeRPqT0Cdx7dAZYWkOZC2CBbfAYmzIWGWL+ij02Ca3eZBAl8IMekaK87g7neRnrfwgn3dra10vvUWnW+9Tc++fei+PrBYCF+0iLjbbiV88RVEXLEYx+zZKI8TqvZC+ftQ+iy8fwA8xuWOkUmQsRzm3wipC32PpLlgCwvw1gYPCXwhxKSrPX0CgPS8+aNO1x4PXe/toPXpp+nevRs8HuzZ2cTfdRdR69YSWVCANTbW17m1Ak5uh93bofJDX8Arqy/c1/wNZK70PY/PmXZH7OMlgS+EmHS1p08SnZBITNLwb7XSbjftL75I009/Rn91NbbUVJIeeIDYG28gbMGCoTtpdtbBzifhyHNQX+xrS1kAaz4HuRshZy2Ex07yVgU/CXwhxKSrOX2C9LwFw26F3HPgILX/9E+4SksJX7KE1H/8R2Ku3YSyG9foaw2l78CHP4WSt3xvsGatguu/BfNv8p2LF+clgS+EmFQ97W2019exdMtNgO9Km6bHHqPppz/Dnp5O5o9+SMzmzUMvBlrD8Vdgx39A3WHf5Y7rvwTL7oXkPBO3ZOqRwBdCTKrakpOA7/y9drup+drDdGzbRtztt5P2jW9gjY4a6lxzEF7/OlTuhqQ8uPVHsOTukHqjdSJJ4AshJlXt6ZNYrFZSc2dT+8//TMe2baQ89BBJD3526Kje0w/vfQfe/57v6ppbvg8r7p9y170Hm4AHvlLqDNAJeAC31rog0OsUQgSvmlMnSJk5i+6XX6H9j8+T9DefI/lzDw516GqAP/wVVO3znba54d8hXG6dPBEm6wh/k9a66cLdhBDTmdfroa7kFAtXX0n9t79D5Nq1pHzxi0Mdmkvhdx+Frka469eQf7tptU5HckpHCDFpms9W0u/sI+p0GfT3k/6tb6EsxtdytFXCb24Fdy98ajtkrjC32GloMr4ARQNvKqX2K6UevGBvIcS0VXva94atY8dO4u++G0dWpm+CqxueugucnXDfixL2ATIZR/jrtdY1SqlU4C2l1Amt9Y6BicaLwIMAOTk5k1COEMIsNadOEGazE9nnIvGTnxyasO0r0HgS7nsB0peYVt90F/AjfK11jTFsAF4AVo+Y/oTWukBrXZCSkhLocoQQJqo9fYL47l6ir7xy6Oj+9Ntw6GnY+L9hziZzC5zmAhr4SqkopVTMwHPgeqA4kOsUQgSnvq4uWmqqiGtpJ+62W32Nbhe89o++m5ht/Kq5BYaAQJ/SSQNeMK6ttQG/11q/HuB1CiGCUJ3xgav4HidRV13lazzyHLSUwj3PyIepJkFAA19rXQYsDeQ6hBBTQ41xh8y0OXOxJSSA1wu7fgBpV8C8rSZXFxom4yodIYSg5lgxMb1OEjZe7Wuo3A1NJ+HKvwv52xZPFgl8IUTAaa+X2pKTxPf0Eb1xo6/x8DPgiIaFt5pbXAiRwBdCBFxLbTUul4tEZSM8P993r5yjL8GCW8ARaXZ5IUMCXwgRcDUnjgGQsXiJ75O1VYXgbIcFN5tcWWiRwBdCBFzV3g+xuT1kXLvF11D2DigLzNpobmEhRgJfCBFwtadOEN/rJHrDel9D6TuQsQIi4s0tLMRI4AshAsrV20NrTycpMXG+yzHdLqgtgpnrzC4t5EjgCyECquaY78P16fMW+Roaj4PHBRnLTawqNEngCyEC6uyu9wHI3mB8urbmoG+YvsykikKXBL4QIqBqThwjqs9Fwpq1RkMRhMVB4mxzCwtBEvhCiIDRWtPY0kiSzYE13niDtvEEpOXLp2tNIIEvhAiYttpqnNpLWubMocbmUkiSo3szSOALIQKmcqfvu46yVqz0NTg7obsBEueYWFXoksAXQgRM9YFCrB4vGddu9jW0lPmGcv7eFBL4QoiAqa+qJMHtJSw729fQXOobJskRvhkk8IUQAdHvctLq6iUlMQU18AatHOGbSgJfCBEQNfv2oJUiY2H+UGNLGUTPAEeUeYWFMAl8IURAVA584Gqj3xeTN5fK6RwTSeALIQKi9vRJIl1uEleuHGpsKZPTOSaSwBdCTDitNY3tLSRFRKFsxldn93UYl2RK4JtFAl8IMeHaKs7QpyB95qyhxtZy31BO6ZhGAl8IMeHK33wNgJwrrxpqHLgkUz50ZRoJfCHEhDt7+CA2j5fMa68bamwZCPxZo88kAk4CXwgx4eoa6ki2OrBFRQ81NpdBTLpckmkiCXwhxIRqOnWCLqXJzB3x5mxLmZzOMZkEvhBiQh3/4zMALPjIR4dPaCmV0zkmk8AXQkwYrTWnjhwkrt9L2nq/N2y7m6G7EZLnmVeckMAXQkycij+/SZvXzcL8pUP3zwGoO+Qbpi8xpzABSOALISaI1+Ph3Scfx+H2sPxvvzh8Yu1h33CGBL6ZJPCFEBPinW9+nWaPi1XL1hCZnj58YtU+iJ8JkYnmFCcACXwhxGVydXTw2pc+T9Hpo8wMi2L11/9peAePG8p3wOxrzChP+LGZXYAQYurRWtN++DCH//A/HD11lB6bhblxSdz8g59hsVqHdy5/F5wdMHezKbWKIQEPfKXUDcAPACvwC631twO9TiHE5dMuF+62NtwtLXSdPUvzqRM0lZfS3FBHQ2cbHQ5ffCQ6HGz66N0suPue0Re05wmISIR5WyexejGagAa+UsoK/BjYAlQB+5RSL2utj03kejydnbjr68FiBTV6n2FXDJw7cfzTLnV5YxV4Dj1idJzjo7Tpc+YZ5zovYr0XXsdFrGe86xhtPef0MeHndzHruZSfn/ai3R60ux88HrTbAx432ng+0O7td6Pd/bhdLtz9/bhdTjzG0N3Ti7Ovh/6+PpxOJy5XH/39/bjcbvo9bpxuF32A026lz2ZDW4Z+b60akmJiyctbwLwbbiZj1ZpRijQUPQ2n34Drvgm2sLH7iUkR6CP81UCJ1roMQCn1B+A2YEIDv/P9XXzw/dfpC08AQA3+J9EjhhjTxwoDPWL+0Zx34rjnufR1jb+Oc7f7YlxaDSO3Sw/rq89pGzZdjTbt3PmH9/IOr0bpYWGq/f9VI+fXvu5qZO8R6xyoS/vN7TfPyLX5t2k1sBUalB7ew1iu9qtvcIuVHlbNQO16xDK10mjl+ylopfEOzHexxxcGiw1sNhv2CAfhNivJDjsR4XaiIh0kJEWSmhFNYpwVh82D1eKExudh+x/9ftYDP3cNLeVQ9g7kXgXrvjC+QkRABDrwM4GzfuNVwHkOBy5NcVUfpbnXYvG2oHAx7JfO91/AN649fuExFAgar7Ek7+B/rqH5hz/0GO3+8+gx2octZ7D6gT4Dz/3ajOd6ZJv/f65R+vvPNbTu8/X3CzY9ssax1zG8jgvXJPwpfNdMWEBZ/MYVKCtqYBoW4y9Gvz7K4jddGfPbjPmsoGxYjKEy2hkc9w3BhlJ2UGGgwlDKAcqB749y8ADdxgMP0Gk8zgxtgQU3dksfduUk0tpBtK2FaFsLqWFnyIo4SVSsFa75Oqz/khzdB4lAB/5oxxfDD8SUehB4ECAnJ+eSVtIZ04Cz/eVLmvdSDIbpwNaNNcTvqHG0Nr/2kUdjvnE1+jx+44M/TDX8uW+aGmVeNWx8aB7lN66GL2/YPMrvqHVgZcq3rlHnUcPbBubTw6drpUY8Z0T7QJvRB8vgz0gPbKtSoAfafRegeY1Q1P51YjGWM9BmMfpZhvXxGuvx9bfg9Zt3qM1YljLatTGPYnCaVxl9lW8Z3oGQ1wzWMfBzH/s5fjX7tw3tcO3Xz7/Nf360MloUFhQW448UiwIrbqy4GXo58Q2tyveTsaKwa3Bo39CGwq4jcOhIorwJRLlyie4Fe4fCi6Y21sLcBdncb3HI1SFBItD7oQrI9hvPAmr8O2itnwCeACgoKLikQ8GeGXYa45wcm9WBy+bFq0BbtG+oRgwtA38aDzz8x31/Fg/+uawGQs0372CgjfPPZDH9DcWoQinLwDOUMqJVWbAqKxasWIyhUlYsyoLVr82iLCisWJQNi3Gk75s20G41otr33KrsWHH4hsqBzRi3KDtWFYYFGzblwIJvulU5sBOJ3RKBjQjjhcf4P6CNv0+NPzi1Bj04bUS70Q+G+jg19Gkv1g4PEXV9zDjTS8PzFTxU3sr3HyjAapH/OGYLdODvA/KUUrOAauDjwF9N9EpWrthET5qNfMCiLL4HFpRSg+OKoef49VNKYWGM5yOWM/AfGAb+g/uMbButj3/7sL5jzaNG73cxy/dvHzwxM8r5+5HThp8tZujMzQX6jVz2mP1HjA/2v8Byx6zHr9/Ac6/24tXewTav9uLFGNd68LlXe43QGr3PyGX4zzu43JHL0N6hGvyW59EevNqLR3twe914tAeP14Nbu/F4PcPGx5ru3z4w7va6cXlduDwunB7nOfv3QhSKKHsU0Y5oou3RxDhiiA+LJzUylZSIFN8wMoWs6CyyYrKwWcYXF42VHTz7HweILWrnqQ/PcP+VcuM0swU08LXWbqXU3wFv4Lss80mt9dGJXk9uXC65cbkTvVghpgytNS6vL/gHXgCcbqdvaLT1efroc/fR1d9Fp6uTrv4uuly+5wPjZzvPcqDhAO3O9mHLt1vs5MblMi9hHgVpBayZsYbs2OwxqvFJyYll81/P5+1fHeeVV0u5e3UOYTbreecRgRXwU2ta6+3A9kCvR4hQppQizBpGmHVi3hx1epw09TbR0NNAZUclpe2llLaVsqd2D9vKtgGwIHEBt8+9nTvy7iDcFj7qcuatmsGOF0qZ297L9iO13L48a0LqE5dGjXpNs0kKCgp0YWGh2WUIIcagtaa8o5wPaj7g5dKXOdZ8jLTINB5d+yjXZF8z6jz7tpWz95VyjqyM4mefnfCL9ASglNqvtS64UD+5l44Q4qIppZgdN5t7F97LM7c8w5NbnyQ+LJ6//8vf84sjvxh1ntwrkgGoPdlGt9M9meWKESTwhRCXbNWMVTx181PcNOsmfnDgBzx/6vlz+iRnR2MNt5LmUhRWtJpQpRgggS+EuCxh1jD+bcO/sS59Hd/Z9x3OdpwdNl0pRVpuLOkeC/vPtJhUpQAJfCHEBLBZbPzr+n8F4EdFPzpnevqsWJI9FvaXSeCbSQJfCDEh0qLSuGfBPbxe/jqVHZXDpiVnxWABaqs6R7/5nZgUEvhCiAlz78J7UUrxYsmLw9rj0yIBsPd4aewc/4fExMSQwBdCTJjUyFTWpa9jW9m2YUfycakRACR6Fafqu8wqL+RJ4AshJtSWmVuo6a6hpK1ksM3usBIZH0aiR3GqvtPE6kKbBL4QYkKtz1wPwK7qXcPaE9MiScbK6QY5wjeLBL4QYkLNiJrB3Pi57KoZHvgxyeHEaUWpBL5pJPCFEBNuZdpKDjcexuP1DLbFJIYT5oZyCXzTSOALISbcstRl9Lh7hp3Hj07w3WCtv7Of1m6XWaWFNAl8IcSEW5qyFIBDjYcG22ISfXfyjPEqShrlKN8MEvhCiAmXFZ1FUngSRQ1Fg23Rib4j/FitKJHTOqaQwBdCTDilFEtTlg47wo9O8B3hJ2CRwDeJBL4QIiCWpi6lsrOS1j7fHTJtdisRsQ4yHXZK5ZSOKSTwhRABsSR5CQBHmo4MtsUkhJGkrHKEbxIJfCFEQOQn52NV1uHn8RPCifRAdVsvvS7PeeYWgSCBL4QIiAhbBPMS5nG46fBgW3RCGJZeL1ojp3VMIIEvhAiYJSlLKG4qHvwAVnRiOLrfi0Mjp3VMIIEvhAiYpSlL6e7vprS9FBi6UifVYuNgpXzd4WSTwBdCBMzAB7AON/pO68QY1+IvTYpm7xkJ/MkmgS+ECJjsmGwSwhIGA3/gCH9+bCQn6jpo7+03s7yQI4EvhAgYpRRLUpYMfgArMi4MZVFkOuxoDR+UNplcYWiRwBdCBNSSlCWUtZfR7mzHYlFExTuI8SriI+28cbTe7PJCigS+ECKg1qSvAeCD2g8AiEkIp6fNyZaFabx9vB6X22tmeSFFAl8IEVCLkxYT64hlZ9VOwHcev7Olj1uWZtDZ5+aNo3UmVxg6JPCFEAFltVi5MuNKdtXsQmtNdGI4XW1ONsxOYmZSJL/94IzZJYYMCXwhRMCtz1xPU28TJ1tPEp0Qjtetcfa4uW/tTPadaWVveYvZJYYECXwhRMBtyNyARVl4u+LtwUszu1r7uHfNTNJiw/h/24+jtTa5yulPAl8IEXDJEcmsSlvF62deHwz8zpY+IhxWvrJlPkVn2/j93kqTq5z+Ahb4Sql/VkpVK6WKjMdNgVqXECL43TDrBio6KqiznAWgo7EPgDtXZnFVXjLfevW43FAtwAJ9hP/fWutlxmN7gNclhAhim3M2Y1M23qjbTkSMnbb6bgAsFsV371xChMPKp3+9j+Yup8mVTl9ySkcIMSniw+O5NudaXih5gdjUcFrrewanpcdF8PP7C6hr7+O+X+6lSUI/IAId+H+nlDqslHpSKZUQ4HUJIYLcvQvvpdPVSXtEI611PcOmrZyZwOP3raSsqYuP/ewDOb0TAJcV+Eqpt5VSxaM8bgN+CswBlgG1wPfGWMaDSqlCpVRhY2Pj5ZQjhAhyy1OXk5+UT6FzN31d/fR1Db952jXzU/ndA2to7XFx64928uLBarl6ZwJdVuBrrTdrrReP8nhJa12vtfZorb3Az4HVYyzjCa11gda6ICUl5XLKEUIEOaUUX175ZSqtpwGGndYZsCo3kW1fvIoF6bF8+ZkiPvXrfVQ2n9tPjF8gr9JJ9xu9HSgO1LqEEFPH2vS1zJ+TC8CJ02Wj9smIj+CZB9fyf25ZROGZVq77r3d59MUj1Lb3TmKl008gz+F/Vyl1RCl1GNgEPBTAdQkhppBHrvsqTlsvbxa+T7uzfdQ+NquFT2+Yxdv/cDUfK8jmmX1nufq77/LQM0Xsr2iVUz2XQAXTD62goEAXFhaaXYYQYhL87rvvUlZfScl1b/GTzT8hxhFz3v5VrT38fEcZzx+opsvpZsGMGG5blsnNV6STkxQ5SVUHJ6XUfq11wQX7SeALIczwwQulHHjrDE+u/ho5Cdk8dt1jZEZnXnC+bqebF4uqebawikNn2wBYmhXHtQvSuGpeMkuz4rFaVKDLDyoS+EKIoFZ6oIHXnyhmwWfCePTUV9Fovrbqa3x07kdR6uIC+2xLD9uP1LK9uI7DVW1oDXERdjbMTWZVbgIrZyayID0Gu3V6f+RIAl8IEdQ6W/r47dd3s+GuPJJWKx7d9Sj76/ezJGUJX1n5FVakrRjX8lq6XewqaWLHqUZ2ljRR2+67dUOE3cqSrDiW5ySwKCOWRekxzEqOnlZ/BUjgCyGC3u8e3U1SZjQ3fX4JXu3lpZKXeOzgYzT0NrA8dTn3LbqPTdmbsFls4152TVsv+yta2V/RyoHKVo7VdOD2+vIuzGZh/owYFqXHkpcWw+zkKGYlR5GVEIFtCv41IIEvhAh6f/ndcUoPNPLA967CYhxx9/T38Pzp53nq+FNUd1WTGJ7I1tyt3DTrJpakLMGiLi2QnW4PJQ1dHK/t5Hhtx+CjtWfow192qyInMZJZydHMSYkiKzGSrPgIMhMiyIyPICps/C88k0ECXwgR9E7uqePtXx3jrkcKSJ0ZO2yax+thR9UOXi17lfeq3sPpcZIYnsi6jHWsz1jPuox1JEckX9b6tda09vRT1thFWVM3ZY3dlDd1Ud7UzZmmHlye4d+3mxBpHwz/rIRI0uPCSYkJIzUmnNTYMNJiw4k24UXhYgM/OF+uhBAhIWuB7xZblcdazgl8q8XKppxNbMrZRJeri/eq3mNn9U521+xmW9k2ADKjM1masnTwMTdhLmHWsItev1KKxCgHiVGJFOQmDpvm9WoaOp1Ut/VQ1dpLdVtsm9gEAAANzElEQVSvb9jaS2ljNztONdHb7zlnmZEOK6nGi0BKbBhpMeEkxzhIinKQGBVGYpTxPNpBTJjtot+gnghyhC+EMNVz/74PgLseWXVR/b3ay/Hm4xTWF3Ko8RBFDUU09vruw2VRFmbGzmRewjzmJcwjLz6PmXEzyY7Oxm61T2jdWms6et00dPbR0OmkobOP+g4nDR3OobYO37DHde4LA4DDaiEhyk5iVBj3rM7m/nW5l1SLHOELIaaE2ctT+PDFMrpa+4hOCL9gf4uykJ+cT35yPuAL3truWo40HeF062lOtZ6iuKmYN868MWye9Kh0cmJyyInNITsmm6zoLGZEzSAtKo2k8KRxH2krpYiLtBMXaScv7fwfGut1eWjudtLS7aK520VLl2voudEebreOa/2XQgJfCGGq2ct8gV9W1MiSTdnjnl8pRUZ0BhnRGWzN3TrY3t3fTUlbCZUdlVR2VvqGHZVsL99Op6tz2DLsFjtpkWmkR6czI3IGM6JmkBKZQlJ4EonhiSRFJJEUkUSMPeaSTsFEOKxkOSLJSjD3E8ES+EIIUyXMiCIpM4qTH9ZdUuCPJcoeNXhuf6S2vjaqu6up666jrruO+u563/OeOgrrC2noacCjzz0N47A4SIxIJCnc9wKQFJ5EQngCcWFxxDnifMOwOGIdscSFxREfFk+47cJ/tUwWCXwhhOkWXpnBzudO01zdRVJmdMDXFx8eT3x4PPlJ+aNO93g9tDpbae5tprmvmebeZlr6WoaNN/Q0cLz5OK3OVtxe95jrCrOGEeeIIzYsdvDFIMYRM/iItkcT64hlQeICFiYtDNQmAxL4QoggMG9NGrv/VMLxXbVs+Fie2eVgtVhJjki+qMs+tdb0unvpcHXQ5myj3dnue7h8ww6nX7urnbOdZ+nq76LL1UVX/9C3en3mis9I4Ashpr+IaAezl6Vw4sNaVt86C0f41IkmpRSR9kgi7ZHMiJoxrnk9Xg/d7m46XZ2EWwN/6mfqfYZYCDEtLducg7PHzbGdNWaXMmmsFiuxjlgyozNJikgK+Pok8IUQQSFtViyZ8+IpevssHrf3wjOIcZPAF0IEjRVbZ9Ld5gypo/zJJIEvhAga2YsSyZwfz95Xy3H2jn3li7g0EvhCiKChlOLKO+bS19XPgdcrzC5n2pHAF0IEldSZscxfO4Oitytpru668AziokngCyGCzvo75+KIsPGX3x7H65E3cCeKBL4QIuhERDvYePc8Gio6OfBGpdnlTBsS+EKIoDS3IJW8glT2vlJG9alWs8uZFiTwhRBBSSnFNX+9gLjUSN78xVG6251mlzTlSeALIYKWI9zGDQ8uxtXnZtuPD+Pqk0s1L4cEvhAiqCVlRrP1s4tpOtvJm784Km/iXgYJfCFE0Mu9IpmN98ynoriZt391TEL/Ek2dW9IJIULa4o2ZuHrdfPBCKVrD5k8vwmqVY9bxkMAXQkwZK7bORFkUu58vwe3ycP1nFmMPC/x3wU4X8vIohJhSlm/J4ep75lFR3MwL3zsgV++MgwS+EGLKWXx1Fjd9fgmtdd388duF1Jd3mF3SlCCBL4SYknKXJHPHV1eCgj/9534Ov3MWrbXZZQW1ywp8pdRdSqmjSimvUqpgxLRHlFIlSqmTSqmtl1emEEKcKyUnhru/sZqcRYm8/8xpXn+8mJ4Ol9llBa3LPcIvBu4Advg3KqUWAR8H8oEbgJ8opeSdFSHEhAuPsnPT55ew7o45nDnSxNP/dw8l+xvMLisoXVbga62Pa61PjjLpNuAPWmun1rocKAFWX866hBBiLMqiWHH9TD729VXEJIbzxs+Lee3xI3S29JldWlAJ1Dn8TOCs33iV0SaEEAGTlBnNnV9bydqPzqaiuJnff/NDCreX4+73mF1aULjgdfhKqbeBGaNM+obW+qWxZhulbdR3U5RSDwIPAuTk5FyoHCGEOC+L1cLKG3LJW5XG7udL2PNyOcd317Lm1tnkFaShLKPFU2i4YOBrrTdfwnKrgGy/8Sxg1G8l1lo/ATwBUFBQIG+xCyEmRGxSBDc8eAVVJ1vZ+exp3nryGPtfr2DNR2Yza1kySoVe8AfqlM7LwMeVUmFKqVlAHrA3QOsSQogxZc1P4O5vrOL6z+Tj9Whee/wIz/17IaUHGvB6Q+sY87JuraCUuh34EZACbFNKFWmtt2qtjyqlngWOAW7gC1prOYkmhDCFsijyCtKYszyFk3vqKNx+htefKCYuJYJlW3JYsHYGNsf0v5BQBdMHFQoKCnRhYaHZZQghpjmvV1N2sJGDb1bQUNFJRIydReszWLQhg9jkCLPLGzel1H6tdcEF+0ngCyFCldaa2pI2Dr51loojTWhgZn4S+VdlMHNxEpYpcjfOiw18uVumECJkKaXIyEsgIy+BzpY+ju2q4djOGrb/9AhR8WHMX5PGvNUzSMqMNrvUCSFH+EII4cfj8VJxuJlju2qoPNaC9mqSs6OZt3oG81alERUfZnaJ55BTOkIIcZl6OlyU7K/n5J56Gs50gIKMufHMXpbC7OUpxCSGm10iIIEvhBATqq2+h1N76yg92EhLTTfgu3nb7OUpzFmeQsKMKNNqk8AXQogAaavvoayokbKixsF78cemRDBzUSI5i5PInJcwqd/EJYEvhBCToKu1j/JDTVQcbab6ZCtulxerzUJGXhw5+UnkLEoiIT0yoJ/slcAXQohJ5u73UHu6nYpjzVQWN9Na1wNARIydzHkJZM5PIHNePPFpE/sCIJdlCiHEJLPZrWQvSiR7USLcmUdHcy9VJ1qpPul7DNynPzLWQea8eDLnJ5CRN/EvAGPWF/A1CCFEiIpNimDR+ggWrc9Aa017Y68v/E+1UX2qldOFvheA8Cg7K26YyfItgb1jsAS+EEJMAqUU8amRxKdGkn9Vpu8FoKGXmpI2akvbiZ6E6/sl8IUQwgRKKeLTIolPi2TR+oxJWefUuFGEEEKIyyaBL4QQIUICXwghQoQEvhBChAgJfCGECBES+EIIESIk8IUQIkRI4AshRIgIqpunKaUagYpLnD0ZaJrAcqYC2ebQINscGi5nm2dqrVMu1CmoAv9yKKUKL+ZucdOJbHNokG0ODZOxzXJKRwghQoQEvhBChIjpFPhPmF2ACWSbQ4Nsc2gI+DZPm3P4Qgghzm86HeELIYQ4j2kR+EqpG5RSJ5VSJUqph82uZzyUUtlKqXeUUseVUkeVUl8y2hOVUm8ppU4bwwSjXSmlfmhs62Gl1Aq/ZX3C6H9aKfUJv/aVSqkjxjw/VJPxXWoXQSllVUodVEq9aozPUkrtMep/RinlMNrDjPESY3qu3zIeMdpPKqW2+rUH3e+EUipeKfVHpdQJY3+vm+77WSn1kPF7XayUelopFT7d9rNS6kmlVINSqtivLeD7dax1nJfWeko/ACtQCswGHMAhYJHZdY2j/nRghfE8BjgFLAK+CzxstD8MfMd4fhPwGqCAtcAeoz0RKDOGCcbzBGPaXmCdMc9rwI1mb7dR1z8AvwdeNcafBT5uPP8Z8Hnj+d8CPzOefxx4xni+yNjfYcAs4/fAGqy/E8BvgM8Yzx1A/HTez0AmUA5E+O3fT063/QxsBFYAxX5tAd+vY63jvLWa/Z9gAn7Y64A3/MYfAR4xu67L2J6XgC3ASSDdaEsHThrPHwfu8et/0ph+D/C4X/vjRls6cMKvfVg/E7czC/gzcC3wqvHL3ATYRu5X4A1gnfHcZvRTI/f1QL9g/J0AYo3wUyPap+1+xhf4Z40Qsxn7eet03M9ALsMDP+D7dax1nO8xHU7pDPxSDagy2qYc40/Y5cAeIE1rXQtgDFONbmNt7/naq0ZpN9v3gX8EvMZ4EtCmtXYb4/51Dm6bMb3d6D/en4WZZgONwK+M01i/UEpFMY33s9a6GvhPoBKoxbff9jO99/OAydivY61jTNMh8Ec7TznlLj1SSkUDzwNf1lp3nK/rKG36EtpNo5S6BWjQWu/3bx6lq77AtCmzzfiOWFcAP9VaLwe68f0ZPpYpv83GOeXb8J2GyQCigBtH6Tqd9vOFmLqN0yHwq4Bsv/EsoMakWi6JUsqOL+yf0lr/yWiuV0qlG9PTgQajfaztPV971ijtZloP3KqUOgP8Ad9pne8D8Uopm9HHv87BbTOmxwEtjP9nYaYqoEprvccY/yO+F4DpvJ83A+Va60atdT/wJ+BKpvd+HjAZ+3WsdYxpOgT+PiDPeOffge/NnpdNrumiGe+4/xI4rrX+L79JLwMD79R/At+5/YH2+413+9cC7cafc28A1yulEowjq+vxnd+sBTqVUmuNdd3vtyxTaK0f0Vpnaa1z8e2vv2it7wXeAe40uo3c5oGfxZ1Gf220f9y4umMWkIfvDa6g+53QWtcBZ5VS842m64BjTOP9jO9UzlqlVKRR08A2T9v97Gcy9utY6xibmW/qTOAbJjfhu7qlFPiG2fWMs/YN+P5EOwwUGY+b8J27/DNw2hgmGv0V8GNjW48ABX7L+jRQYjw+5ddeABQb8zzGiDcOTd7+axi6Smc2vv/IJcBzQJjRHm6MlxjTZ/vN/w1ju07id1VKMP5OAMuAQmNfv4jvaoxpvZ+BfwFOGHX9Dt+VNtNqPwNP43uPoh/fEfkDk7Ffx1rH+R7ySVshhAgR0+GUjhBCiIsggS+EECFCAl8IIUKEBL4QQoQICXwhhAgREvhCCBEiJPCFECJESOALIUSI+P8R2i5oVPGrIQAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"weights_over_time_lst = sorted(weights_over_time.items()) # sorted by key, return a list of tuples\n",
"\n",
"e, w = zip(*weights_over_time_lst) # unpack a list of pairs into two tuples\n",
"\n",
"plt.plot(e, w)\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[<matplotlib.lines.Line2D at 0x11f032c50>]"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAD8CAYAAACMwORRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAEs9JREFUeJzt3X+MXWl93/H3J3YNDaFlyQ7V1j+wCU4qq4lYmC5LU6U0gcTLVjZSSLGTKtCSWEnqJmSrtraotu1WkTa0CjSq1axDqNIo4N1sI3B2B1kE6B+NYOPZsgHsxWEwm3hYmh1+hEStyGLy7R/3GO4OdzxnZu74zjzzfkmjOc9znjnne3yuP3Pm3HPuSVUhSWrLt0y6AEnS+BnuktQgw12SGmS4S1KDDHdJapDhLkkNMtwlqUGGuyQ1yHCXpAZtn9SKb7755tq7d++kVi9Jm9Kjjz76+aqaWm7cxMJ97969zM7OTmr1krQpJfmjPuM8LSNJDTLcJalBhrskNahXuCc5mORSkrkkJ0bMf1uSx7qvP0zyp+MvVZLU17JvqCbZBpwCXg3MA+eTnK2qi9fGVNXPD43/58Ct61CrJKmnPlfL3AbMVdVlgCRngMPAxSXGHwX+7XjKe6a/9ZYZvvK1bzxc5Nnbwid/4TXrsSpJ2tT6hPtO4MpQex54+aiBSV4I7AM+uPbSnmlxsAN85WvF3hMPj3tVknp64t47J12CltDnnHtG9C31bL4jwINV9bWRC0qOJZlNMruwsNC3RoBvCnZJk7f3xMMeYG1QfcJ9Htg91N4FPLnE2CPAu5daUFWdrqrpqpqemlr2BitJm4QBv/H0CffzwP4k+5LsYBDgZxcPSvJdwE3Ah8dboqTNwIDfWJYN96q6ChwHzgGPAw9U1YUk9yQ5NDT0KHCmqjx/Im1RBvzG0euzZapqBphZ1Hf3ova/G19ZkqS12DR3qPquvCT1N7FPhVwNA16aHE+5bC6b5shd0mQ9ce+dvQ6w/CWwMRjuklbEv6A3B8NdkhpkuEtSgwx3SWqQ4S5pxTzvvvEZ7pLUIMNd0tjt83LIiTPcJY2dHzA1eYa7JDXIcJekBhnuktQgw13Sqng55MZmuEtSgwx3SWqQ4S5JDTLcJalBhrukdfHik96lOkm9wj3JwSSXkswlObHEmH+U5GKSC0neNd4yJW02V71NdaKWfYZqkm3AKeDVwDxwPsnZqro4NGY/cBL43qr6UpIXrFfBkqTl9Tlyvw2Yq6rLVfU0cAY4vGjMTwKnqupLAFX11HjLlCStRJ9w3wlcGWrPd33DvhP4ziS/l+QjSQ6Oq0BJ0sr1CfeM6Ft8Nm07sB94JXAUeEeS533TgpJjSWaTzC4sLKy0VkkbjHepblx9wn0e2D3U3gU8OWLMe6vqq1X1GeASg7B/hqo6XVXTVTU9NTW12polScvoE+7ngf1J9iXZARwBzi4a8x7gHwAkuZnBaZrL4yxUktTfsuFeVVeB48A54HHggaq6kOSeJIe6YeeALyS5CHwI+JdV9YX1KlqSdH3LXgoJUFUzwMyivruHpgu4q/uSJE2Yd6hKWjc/9qsfnnQJW5bhLmnd/N6nvzjpErYsw12SGmS4S1KDDHdJapDhLmlNvEt1YzLcJalBhrskNchwl6QGGe6S1CDDXdK6evkvvH/SJWxJhrukdfUnf/70pEvYkgx3SWqQ4S5JDTLcJalBhrukNfMu1Y3HcJekBhnuktQgw12SGmS4S1KDeoV7koNJLiWZS3JixPw3JllI8lj39RPjL1WS1Nf25QYk2QacAl4NzAPnk5ytqouLht5fVcfXoUZJ0gr1OXK/DZirqstV9TRwBji8vmVJktaiT7jvBK4Mtee7vsV+OMnHkjyYZPeoBSU5lmQ2yezCwsIqypUk9dEn3DOirxa1fwfYW1XfA/wu8OujFlRVp6tquqqmp6amVlapJKm3PuE+Dwwfie8CnhweUFVfqKq/6Jq/CrxsPOVJasHeEw9PuoQtp0+4nwf2J9mXZAdwBDg7PCDJLUPNQ8Dj4ytRkrRSy14tU1VXkxwHzgHbgHdW1YUk9wCzVXUW+Nkkh4CrwBeBN65jzZKkZSwb7gBVNQPMLOq7e2j6JHByvKVJklbLO1QljcXbX/+SSZegIYa7pLF47a2jrpDWpBjuktQgw12SGmS4S1KDDHdJapDhLkkNMtwlqUGGuyQ1yHCXpAYZ7pLUIMNdkhpkuEtSgwx3SWqQ4S5JDTLcJalBhrskNchwl6QGGe6S1KBe4Z7kYJJLSeaSnLjOuNclqSTT4ytRkrRSy4Z7km3AKeAO4ABwNMmBEeOeC/ws8Mi4i5QkrUyfI/fbgLmqulxVTwNngMMjxv0H4K3AV8ZYnyRpFfqE+07gylB7vuv7uiS3Arur6qEx1iZJWqU+4Z4RffX1mcm3AG8D/sWyC0qOJZlNMruwsNC/Skmb3otPPjzpEraUPuE+D+weau8CnhxqPxf428D/TPIEcDtwdtSbqlV1uqqmq2p6ampq9VVL2nSu1vJjND59wv08sD/JviQ7gCPA2Wszq+rLVXVzVe2tqr3AR4BDVTW7LhVLkpa1bLhX1VXgOHAOeBx4oKouJLknyaH1LlCStHLb+wyqqhlgZlHf3UuMfeXay5K0Gb399S/hzfc/NukyhHeoShqj1966c/lBuiEMd0lqkOEuSQ0y3CWpQYa7JDXIcJekBhnuktQgw12SGmS4S1KDDHdJapDhLkkNMtwlqUGGuyQ1yHCXpAYZ7pLUIMNdkhpkuEtSgwx3SWqQ4S5JDTLcJalBvcI9ycEkl5LMJTkxYv5PJfl4kseS/K8kB8ZfqiSpr2XDPck24BRwB3AAODoivN9VVd9dVS8B3gr80tgrlST11ufI/TZgrqouV9XTwBng8PCAqvqzoeZzgBpfiZKkldreY8xO4MpQex54+eJBSf4ZcBewA/j+UQtKcgw4BrBnz56V1ipJ6qnPkXtG9H3TkXlVnaqq7wD+NfBvRi2oqk5X1XRVTU9NTa2sUklSb33CfR7YPdTeBTx5nfFngNeupShJ0tr0CffzwP4k+5LsAI4AZ4cHJNk/1LwT+NT4SpQkrdSy59yr6mqS48A5YBvwzqq6kOQeYLaqzgLHk7wK+CrwJeAN61m0JOn6+ryhSlXNADOL+u4emv65MdclSVoD71CVpAYZ7pJumPd89LOTLmHLMNwl3TBvvv+xSZewZRjuktQgw12SGmS4SxqrUbe068Yz3CWN1WfuvXPSJQjDXZKaZLhLUoMMd0lqkOEuSQ0y3CWpQYa7JDXIcJekBhnuktQgw12SGmS4S1KDDHdJapDhLkkN6hXuSQ4muZRkLsmJEfPvSnIxyceSfCDJC8dfqiSpr2XDPck24BRwB3AAOJrkwKJhHwWmq+p7gAeBt467UElSf32O3G8D5qrqclU9DZwBDg8PqKoPVdX/65ofAXaNt0xJ0kr0CfedwJWh9nzXt5Q3Ae9bS1GSpLXZ3mPMqAer1MiByT8GpoG/v8T8Y8AxgD179vQsUZK0Un2O3OeB3UPtXcCTiwcleRXwFuBQVf3FqAVV1emqmq6q6ampqdXUK0nqoU+4nwf2J9mXZAdwBDg7PCDJrcB9DIL9qfGXKUlaiWXDvaquAseBc8DjwANVdSHJPUkOdcP+I/BtwG8leSzJ2SUWJ0m6Afqcc6eqZoCZRX13D02/asx1SZLWwDtUJd1Qe088POkStgTDXZIaZLhLUoMMd0lqkOEuaez2v+A5ky5hyzPcJY3d++965aRL2PIMd0lqkOEuSQ0y3CWpQYa7JDXIcJekBhnuktQgw12SGmS4S1KDDHdJapDhLumG82N/15/hLkkNMtwlqUGGuyQ1yHCXpAb1CvckB5NcSjKX5MSI+d+X5H8nuZrkdeMvU9Jm88S9d066hC1t2XBPsg04BdwBHACOJjmwaNgfA28E3jXuAiVJK7e9x5jbgLmqugyQ5AxwGLh4bUBVPdHN+8t1qFGStEJ9TsvsBK4Mtee7PknSBtUn3DOir1azsiTHkswmmV1YWFjNIiQ1whuZ1lefcJ8Hdg+1dwFPrmZlVXW6qqaranpqamo1i5Ak9dAn3M8D+5PsS7IDOAKcXd+yJElrsWy4V9VV4DhwDngceKCqLiS5J8khgCR/J8k88CPAfUkurGfRkqTr63O1DFU1A8ws6rt7aPo8g9M1kqQNwDtUJa2b5W5k8k3V9WO4S1KDDHdJapDhLkkNMtwlTZTn3deH4S5pXfnpkJNhuEtSgwx3SRPnqZnxM9wlqUGGu6R11+e8u0fv42W4S9owDPjxMdwlbSgG/HgY7pJuiJVcEmnAr53hLmlD2nviYUN+DQx3STfMam5oMuRXJ1Wrehzqmk1PT9fs7OxE1i1pstYa1lv5rtckj1bV9LLjDHdJkzDuo/GtEviGu6QN70afbmnhF4DhLmlT2Kzn0yf1i2Ks4Z7kIPCfgW3AO6rq3kXznwX8d+BlwBeA11fVE9dbpuEu6ZrNGvDjsNJfEn3DfdmrZZJsA04BdwAHgKNJDiwa9ibgS1X1YuBtwC+uqFpJW9oT997ZxCmT1VivX2x9LoW8DZirqstV9TRwBji8aMxh4Ne76QeBH0iS8ZUpaSvYyiE/btt7jNkJXBlqzwMvX2pMVV1N8mXg24HPj6NISVvLcMBv5VM2a9En3EcdgS8+Ud9nDEmOAccA9uzZ02PVkra6UUfyBv7y+oT7PLB7qL0LeHKJMfNJtgN/Hfji4gVV1WngNAzeUF1NwZLkRwgvr0+4nwf2J9kHfBY4AvzoojFngTcAHwZeB3ywJnWNpSQx3ksV1/MXxXq9x7BsuHfn0I8D5xhcCvnOqrqQ5B5gtqrOAr8G/EaSOQZH7EfWpVpJmoDN+CZvnyN3qmoGmFnUd/fQ9FeAHxlvaZKk1fJTISWpQYa7JDXIcJekBhnuktSgiX0qZJIF4I9W+eM3s/XufnWbtwa3eWtYyza/sKqmlhs0sXBfiySzfT4VrSVu89bgNm8NN2KbPS0jSQ0y3CWpQZs13E9PuoAJcJu3Brd5a1j3bd6U59wlSde3WY/cJUnXsenCPcnBJJeSzCU5Mel6ViLJ7iQfSvJ4kgtJfq7rf36S9yf5VPf9pq4/SX6529aPJXnp0LLe0I3/VJI3DPW/LMnHu5/55Y3yRKwk25J8NMlDXXtfkke6+u9PsqPrf1bXnuvm7x1axsmu/1KSHxrq33CviSTPS/Jgkk92+/sVre/nJD/fva4/keTdSZ7d2n5O8s4kTyX5xFDfuu/XpdZxXVW1ab4YfCrlp4EXATuAPwAOTLquFdR/C/DSbvq5wB8yeC7tW4ETXf8J4Be76dcA72PwMJTbgUe6/ucDl7vvN3XTN3Xzfh94Rfcz7wPumPR2d3XdBbwLeKhrPwAc6aZ/BfjpbvpngF/ppo8A93fTB7r9/SxgX/c62LZRXxMMHjv5E930DuB5Le9nBk9j+wzwV4f27xtb28/A9wEvBT4x1Lfu+3WpdVy31kn/J1jhP+wrgHND7ZPAyUnXtYbteS/wauAScEvXdwtwqZu+Dzg6NP5SN/8ocN9Q/31d3y3AJ4f6nzFugtu5C/gA8P3AQ90L9/PA9sX7lcFHS7+im97ejcvifX1t3EZ8TQB/rQu6LOpvdj/zjUdtPr/bbw8BP9Tifgb28sxwX/f9utQ6rve12U7LjHqe684J1bIm3Z+htwKPAH+jqj4H0H1/QTdsqe29Xv/8iP5Jezvwr4C/7NrfDvxpVV3t2sN1PuN5vMC15/Gu9N9ikl4ELAD/rTsV9Y4kz6Hh/VxVnwX+E/DHwOcY7LdHaXs/X3Mj9utS61jSZgv3Xs9q3eiSfBvwP4A3V9WfXW/oiL5aRf/EJPmHwFNV9ehw94ihtcy8TbPNDI5EXwr816q6Ffi/DP6UXsqm3+buHPBhBqdS/ibwHOCOEUNb2s/Lmeg2brZw7/M81w0tyV9hEOy/WVW/3XX/SZJbuvm3AE91/Utt7/X6d43on6TvBQ4leQI4w+DUzNuB52XwvF14Zp1f37Y883m8K/23mKR5YL6qHunaDzII+5b386uAz1TVQlV9Ffht4O/S9n6+5kbs16XWsaTNFu5ff55r9677EQbPb90Uune+fw14vKp+aWjWtWfQ0n1/71D/j3fvut8OfLn7k+wc8INJbuqOmH6QwfnIzwF/nuT2bl0/PrSsiaiqk1W1q6r2MthfH6yqHwM+xOB5u/DN23zt32L4ebxngSPdVRb7gP0M3nzacK+Jqvo/wJUk39V1/QBwkYb3M4PTMbcn+daupmvb3Ox+HnIj9utS61jaJN+EWeWbGa9hcJXJp4G3TLqeFdb+9xj8mfUx4LHu6zUMzjV+APhU9/353fgAp7pt/TgwPbSsfwrMdV//ZKh/GvhE9zP/hUVv6k14+1/JN66WeRGD/7RzwG8Bz+r6n92157r5Lxr6+bd023WJoatDNuJrAngJMNvt6/cwuCqi6f0M/Hvgk11dv8Hgipem9jPwbgbvKXyVwZH2m27Efl1qHdf78g5VSWrQZjstI0nqwXCXpAYZ7pLUIMNdkhpkuEtSgwx3SWqQ4S5JDTLcJalB/x+XetQZACV5OgAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"plt.plot(range(len(cost_history_bp)), cost_history_bp, 'o-.')"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [],
"source": [
"# Prediction\n",
"Y_xor_hat, _ = full_forward_propagation(np.transpose(X_xor), params_values, NN_ARCHITECTURE)"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Test set accuracy: 1.00\n"
]
}
],
"source": [
"# Accuracy achieved on the test set\n",
"acc_xor = get_accuracy_value(Y_xor_hat, np.transpose(y_xor.reshape((y_xor.shape[0], 1))))\n",
"print(\"Test set accuracy: {:.2f}\".format(acc_xor))"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.metrics import classification_report, confusion_matrix"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" precision recall f1-score support\n",
"\n",
" 0.0 1.00 1.00 1.00 2\n",
" 1.0 1.00 1.00 1.00 2\n",
"\n",
" micro avg 1.00 1.00 1.00 4\n",
" macro avg 1.00 1.00 1.00 4\n",
"weighted avg 1.00 1.00 1.00 4\n",
"\n",
"[[2 0]\n",
" [0 2]]\n"
]
}
],
"source": [
"print(classification_report(np.transpose(convert_prob_into_class(Y_xor_hat).reshape(y_xor.shape)), y_xor))\n",
"print(confusion_matrix(np.transpose(convert_prob_into_class(Y_xor_hat).reshape(y_xor.shape)), y_xor))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
},
"toc": {
"base_numbering": 1,
"nav_menu": {},
"number_sections": true,
"sideBar": true,
"skip_h1_title": false,
"title_cell": "Table of Contents",
"title_sidebar": "Contents",
"toc_cell": false,
"toc_position": {},
"toc_section_display": true,
"toc_window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment