@alessiot
Last active October 14, 2019 17:17
Solving Back-propagation using Stiff and Non-Stiff Differential Equation Solvers
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Solving Backpropagation using Stiff and Non-Stiff Differential Equation Solvers\n",
"\n",
"Ordinary differential equations (ODEs) can be solved using numerical methods. In fact, the backpropagation equation\n",
"\n",
"\\begin{equation}\n",
"\\frac{dC}{dW} = f(W)\n",
"\\end{equation}\n",
"\n",
"is an ODE that can be solved numerically. The idea was first introduced by [Aaron Owens et al.](https://www.semanticscholar.org/paper/Efficient-training-of-the-backpropagation-network-a-Owens-Filkin/3ed4de93b828a2350489aaa40de382e3fec45e68?citingPapersSort=is-influential#citing-papers) Using gradient descent to reach an optimal solution requires a grid search over hyperparameters such as the learning rate and momentum. In some cases, reaching the optimal solution takes many passes through the backpropagation process because the cost function has multiple minima. When the backpropagation ODE is solved numerically, such solution instabilities place it in a class of ODEs known as stiff ODEs; solving a stiff ODE requires the step size of the integrator to be extremely small. In Python, ODE solvers are implemented in the SciPy library. "
]
},
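Before applying this idea to a network, the SciPy `ode` interface used throughout this notebook can be illustrated on a toy problem (a sketch added for illustration, not part of the original gist): gradient flow on the one-dimensional quadratic cost $C(w) = w^2/2$, where $dw/dt = -dC/dw = -w$ decays toward the minimum at $w = 0$.

```python
import numpy as np
from scipy.integrate import ode

# Gradient flow for C(w) = w^2/2: dC/dw = w, so dw/dt = -dC/dw = -w.
def grad_flow(t, w):
    return -w

solver = ode(grad_flow)
solver.set_integrator("vode", method="bdf", atol=1e-8, rtol=1e-6)
solver.set_initial_value([1.0], 0.0)  # start at w(0) = 1
solver.integrate(5.0)                 # integrate up to t = 5

# The analytic solution is w(t) = exp(-t), so w(5) is about 0.0067.
print(float(solver.y[0]))
```

The same pattern, `ode(f)` plus `set_integrator`, `set_initial_value`, and `integrate`, is what the training loop below uses, with the flattened weight vector in place of `w`.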
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"import VisualizeNN as VisNN\n",
"import numpy as np\n",
"from scipy.integrate import ode\n",
"import matplotlib.pyplot as plt\n",
"import matplotlib.axes as axes\n",
"%matplotlib inline"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"array([[0, 0],\n",
" [0, 1],\n",
" [1, 0],\n",
" [1, 1]])"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"data = {'F1':[0,0,1,1], 'F2':[0,1,0,1], \"Target\":[0,1,1,0]} #XOR\n",
"df = pd.DataFrame(data) \n",
"\n",
"X_xor = df[[\"F1\",\"F2\"]]\n",
"y_xor = df[\"Target\"].values.tolist()\n",
"\n",
"X_xor = X_xor.to_numpy()\n",
"y_xor = np.array(y_xor)\n",
"\n",
"X_xor"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To find the optimal weights, the backpropagation equation is solved by introducing a time variable $t$; advancing $t$ corresponds to changing the step size of the integrator until a stable numerical solution is reached. Therefore, we will solve the following equation\n",
"\n",
"\\begin{equation}\n",
"\\frac{dW(t)}{dt} = f(W,t)\n",
"\\end{equation}\n",
"\n",
"Since we are trying to minimize the cost rather than taking gradient descent optimization steps, we modify the previously defined backpropagation function to flip the sign of the gradients:\n",
"\n",
"\\begin{equation}\n",
"\\frac{dC}{dW} \\rightarrow -\\frac{dC}{dW}\n",
"\\end{equation}"
]
},
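Why a stiff method can matter is easiest to see on a standard stiff test equation (an illustrative sketch with made-up constants, not taken from the original notebook): $\dot{y} = -1000\,(y - \cos t)$. Its fast transient forces a non-stiff (Adams) method to take very small internal steps, while BDF remains efficient; both should agree on the answer.

```python
import numpy as np
from scipy.integrate import ode

# Stiff test problem: dy/dt = -1000*(y - cos(t)).
# The solution collapses onto cos(t) almost instantly; resolving that
# fast transient is what makes the problem stiff.
def f(t, y):
    return -1000.0 * (y - np.cos(t))

def solve(method):
    s = ode(f)
    # nsteps is raised so the non-stiff Adams method can finish at all
    s.set_integrator("vode", method=method, atol=1e-8, rtol=1e-6, nsteps=100000)
    s.set_initial_value([0.0], 0.0)
    s.integrate(1.0)
    return float(s.y[0])

y_bdf = solve("bdf")
y_adams = solve("adams")
# Both land near cos(1); the cost difference is in the internal step count.
```

The `train_stiff` function below exposes exactly this choice through its `stiff` flag, switching the vode integrator between the BDF and Adams methods.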
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# Solves dW/dt = f(t, W), returning the updated weights W\n",
"# Reduce/Increase tf if solution becomes unstable\n",
"def deltaW(t, u, f_args):\n",
" \n",
" Y_hat = f_args[0]\n",
" cashe = f_args[1]\n",
" Y = f_args[2]\n",
" params_values = f_args[3]\n",
" nn_architecture = f_args[4]\n",
" \n",
" # Format weights as required by backprop function\n",
" u_curr = u\n",
" for idx, layer in enumerate(nn_architecture):\n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" nrows = nn_architecture[idx]['input_dim']\n",
" ncols = nn_architecture[idx]['output_dim']\n",
" #print('deltaW - W',layer_idx, nrows, ncols, u_curr[:(nrows*ncols)].reshape(ncols,nrows), \n",
" # u_curr[:(nrows*ncols)].reshape(ncols,nrows).shape, u_curr.shape)\n",
" params_values[\"W\" + str(layer_idx)] = u_curr[:(nrows*ncols)].reshape(ncols,nrows)\n",
" u_curr = np.delete(u_curr, range(nrows*ncols), axis=None)\n",
" #print('deltaW - b',layer_idx, nrows, ncols, u_curr[:(1*ncols)].reshape(ncols,1), \n",
" # u_curr[:(1*ncols)].reshape(ncols,1).shape, u_curr.shape)\n",
" params_values[\"b\" + str(layer_idx)] = u_curr[:(1*ncols)].reshape(ncols,1)\n",
" u_curr = np.delete(u_curr, range(1*ncols), axis=None)\n",
"\n",
" #print('Back prop now...') \n",
" grads_values = full_backward_propagation_stiff(Y_hat, Y, cashe, params_values, nn_architecture)\n",
" #print('Done')\n",
"\n",
" # Here loop is forward; we grab from first layer\n",
" grads_values_all = []\n",
" for idx, layer in list(enumerate(nn_architecture)):\n",
" \n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" \n",
" grads_values_all.extend(grads_values[\"dW\" + str(layer_idx)].flatten())\n",
" grads_values_all.extend(grads_values[\"db\" + str(layer_idx)].flatten())\n",
" \n",
" grads_values_all = np.array(grads_values_all)\n",
" \n",
" #print(grads_values_all)\n",
" \n",
" return grads_values_all\n",
"\n",
"\n",
"def full_backward_propagation_stiff(Y_hat, Y, memory, params_values, nn_architecture):\n",
" grads_values = {}\n",
" \n",
" # number of examples\n",
" m = Y.shape[1]\n",
" # a hack ensuring the same shape of the prediction vector and labels vector\n",
" Y = Y.reshape(Y_hat.shape)\n",
" \n",
" # initiation of gradient descent algorithm\n",
" dA_prev = - (np.divide(Y, Y_hat) - np.divide(1 - Y, 1 - Y_hat))\n",
" \n",
" for layer_idx_prev, layer in reversed(list(enumerate(nn_architecture))):\n",
" # we number network layers from 1\n",
" layer_idx_curr = layer_idx_prev + 1\n",
" # extraction of the activation function for the current layer\n",
" activ_function_curr = layer[\"activation\"]\n",
" \n",
" dA_curr = dA_prev\n",
" \n",
" A_prev = memory[\"A\" + str(layer_idx_prev)]\n",
" Z_curr = memory[\"Z\" + str(layer_idx_curr)]\n",
" \n",
" W_curr = params_values[\"W\" + str(layer_idx_curr)]\n",
" b_curr = params_values[\"b\" + str(layer_idx_curr)]\n",
" \n",
" dA_prev, dW_curr, db_curr = single_layer_backward_propagation(\n",
" dA_curr, W_curr, b_curr, Z_curr, A_prev, activ_function_curr)\n",
" \n",
" grads_values[\"dW\" + str(layer_idx_curr)] = -dW_curr\n",
" grads_values[\"db\" + str(layer_idx_curr)] = -db_curr\n",
" \n",
" return grads_values\n",
"\n",
"\n",
"def train_stiff(X, Y, nn_architecture, epochs, stiff = True, verbose=False, callback=None):\n",
" \n",
" # initiation of neural net parameters (W, b)\n",
" params_values = init_layers(nn_architecture, 2)\n",
" \n",
" # initiation of lists storing the history \n",
" # of metrics calculated during the learning process \n",
" cost_history = []\n",
" accuracy_history = []\n",
" \n",
" # Store weights over time\n",
" weights_over_time = {}\n",
" for layer_idx, layer in enumerate(nn_architecture, 1):\n",
" if layer_idx == 1:\n",
" weights_over_time[-1] = params_values[\"W\" + str(layer_idx)].flatten()\n",
" else:\n",
" weights_over_time[-1] = np.append(weights_over_time[-1], params_values[\"W\" + str(layer_idx)].flatten())\n",
" \n",
" # performing calculations for subsequent iterations\n",
" for i in range(epochs): #epochs\n",
" #print('epoch', i)\n",
" #print('Initial params_values', params_values)\n",
" \n",
" # step forward - return predictions for Y (Y_hat)\n",
" # and caches A, Z, W, b for each layer (A being from the previous layer)\n",
" # Z: input to activation, A: output from activation\n",
" Y_hat, cashe = full_forward_propagation(X, params_values, nn_architecture)\n",
"\n",
" #print(Y, Y_hat)\n",
" #print(cashe)\n",
"\n",
" # calculating metrics and saving them in history\n",
" cost = get_cost_value(Y_hat, Y)\n",
" cost_history.append(cost)\n",
" accuracy = get_accuracy_value(Y_hat, Y)\n",
" accuracy_history.append(accuracy)\n",
"\n",
" # step backward - calculating gradient\n",
" ####### Stiff solver for dW/dt (backpropagation) #############\n",
" u0 = []\n",
" for idx, layer in enumerate(nn_architecture): \n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" \n",
" #print('W', layer_idx, cashe[\"W\" + str(layer_idx)].flatten())\n",
" #print('b', layer_idx, cashe[\"b\" + str(layer_idx)].flatten())\n",
" \n",
" u0.extend(params_values[\"W\" + str(layer_idx)].flatten())\n",
" u0.extend(params_values[\"b\" + str(layer_idx)].flatten())\n",
" \n",
" u0 = np.array(u0)\n",
" \n",
" t0 = 0.0\n",
" tf = 0.125 #0.125\n",
" \n",
" #print('Initial values', u0)\n",
" solver = ode(deltaW)\n",
" if stiff:\n",
" # vode's BDF method supports a maximum order of 5 (Adams allows up to 12)\n",
" solver.set_integrator(\"vode\", atol=1e-8, rtol=1e-6, method=\"bdf\", order=5) \n",
" else:\n",
" solver.set_integrator(\"vode\", atol=1e-8, rtol=1e-6, method=\"adams\") \n",
" solver.set_f_params((Y_hat, cashe, Y, params_values, nn_architecture))\n",
" solver.set_initial_value(u0, t0)\n",
" #print('done')\n",
" \n",
" j = 0\n",
" while solver.successful() and solver.t < tf:\n",
" solver.integrate(tf, step=False)\n",
" #print(\"t\", solver.t)\n",
" #print(\"y\", solver.y)\n",
" j += 1 \n",
" #####\n",
"\n",
" params_values_curr = solver.y\n",
" \n",
" #print(\"Solution:\")\n",
" #print(params_values_curr)\n",
" \n",
" for idx, layer in list(enumerate(nn_architecture)): \n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" nrows = nn_architecture[idx]['input_dim']\n",
" ncols = nn_architecture[idx]['output_dim']\n",
" params_values[\"W\" + str(layer_idx)] = params_values_curr[:(nrows*ncols)].reshape(ncols,nrows)\n",
" params_values_curr = np.delete(params_values_curr, range(nrows*ncols), axis=None)\n",
" params_values[\"b\" + str(layer_idx)] = params_values_curr[:(1*ncols)].reshape(ncols,1)\n",
" params_values_curr = np.delete(params_values_curr, range(1*ncols), axis=None)\n",
"\n",
" if layer_idx == 1:\n",
" weights_over_time[i] = params_values[\"W\" + str(layer_idx)].flatten()\n",
" else:\n",
" weights_over_time[i] = np.append(weights_over_time[i], params_values[\"W\" + str(layer_idx)].flatten())\n",
"\n",
" \n",
" #print(\"Iteration: {:05} - cost: {:.5f} - accuracy: {:.5f}\".format(i, cost, accuracy))\n",
" \n",
" if(i % 50 == 0):\n",
" if(verbose):\n",
" print(\"Iteration: {:05} - cost: {:.5f} - accuracy: {:.5f}\".format(i, cost, accuracy))\n",
" if(callback is not None):\n",
" callback(i, params_values)\n",
" \n",
" #if accuracy==1:\n",
" # break\n",
" \n",
" #print('params_values - final',params_values)\n",
" \n",
" return params_values, cost_history, accuracy_history, weights_over_time"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Neural Network Model"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"NN_ARCHITECTURE = [\n",
" {\"input_dim\": X_xor.shape[1], \"output_dim\": 2, \"activation\": \"sigmoid\"},\n",
" {\"input_dim\": 2, \"output_dim\": 1, \"activation\": \"sigmoid\"}\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"def init_layers(nn_architecture, seed = 99):\n",
" # random seed initiation\n",
" np.random.seed(seed)\n",
" # number of layers in our neural network\n",
" number_of_layers = len(nn_architecture)\n",
" # parameters storage initiation\n",
" params_values = {}\n",
" \n",
" # iteration over network layers\n",
" for idx, layer in enumerate(nn_architecture):\n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" \n",
" # extracting the number of units in layers\n",
" layer_input_size = layer[\"input_dim\"]\n",
" layer_output_size = layer[\"output_dim\"]\n",
" \n",
" # initiating the values of the W matrix\n",
" # and vector b for subsequent layers\n",
" params_values['W' + str(layer_idx)] = np.random.randn(\n",
" layer_output_size, layer_input_size) * 0.1\n",
" params_values['b' + str(layer_idx)] = np.random.randn(\n",
" layer_output_size, 1) * 0.1\n",
" \n",
" return params_values\n",
"\n",
"def sigmoid(Z):\n",
" return 1/(1+np.exp(-Z))\n",
"\n",
"def relu(Z):\n",
" return np.maximum(0,Z)\n",
"\n",
"def tanh(Z):\n",
" return 2.0/(1.0 + np.exp(-2*Z)) - 1.0\n",
"\n",
"def softmax(Z):\n",
" #return np.exp(Z) / np.sum(np.exp(Z), axis=0)\n",
" # stable version: shift numerator and denominator by max(Z), no overflow or NaN\n",
" e = np.exp(Z - np.max(Z, axis=0, keepdims=True))\n",
" return e / np.sum(e, axis=0)\n",
"\n",
"def single_layer_forward_propagation(A_prev, W_curr, b_curr, activation=\"relu\"):\n",
" # calculation of the input value for the activation function\n",
" Z_curr = np.dot(W_curr, A_prev) + b_curr\n",
" \n",
" # selection of activation function\n",
" if activation == \"relu\":\n",
" activation_func = relu\n",
" elif activation == \"sigmoid\":\n",
" activation_func = sigmoid\n",
" elif activation == \"tanh\":\n",
" activation_func = tanh\n",
" elif activation == \"softmax\":\n",
" activation_func = softmax\n",
" else:\n",
" raise Exception('Non-supported activation function')\n",
" \n",
" # return of calculated activation A and the intermediate Z matrix\n",
" return activation_func(Z_curr), Z_curr\n",
"\n",
"def full_forward_propagation(X, params_values, nn_architecture):\n",
" \n",
" # creating a temporary memory to store the information needed for a backward step\n",
" memory = {}\n",
" \n",
" # X vector is the activation for layer 0 \n",
" A_curr = X\n",
" \n",
" # iteration over network layers\n",
" for idx, layer in enumerate(nn_architecture):\n",
" # we number network layers from 1\n",
" layer_idx = idx + 1\n",
" # transfer the activation from the previous iteration\n",
" A_prev = A_curr\n",
" \n",
" # extraction of the activation function for the current layer\n",
" activ_function_curr = layer[\"activation\"]\n",
" # extraction of W for the current layer\n",
" W_curr = params_values[\"W\" + str(layer_idx)]\n",
" # extraction of b for the current layer\n",
" b_curr = params_values[\"b\" + str(layer_idx)]\n",
" # calculation of activation for the current layer\n",
" A_curr, Z_curr = single_layer_forward_propagation(A_prev, W_curr, b_curr, activ_function_curr)\n",
" \n",
" # saving calculated values in the memory\n",
" memory[\"A\" + str(idx)] = A_prev # what's coming into this layer from previous layer\n",
" memory[\"Z\" + str(layer_idx)] = Z_curr\n",
" memory[\"W\" + str(layer_idx)] = W_curr\n",
" memory[\"b\" + str(layer_idx)] = b_curr\n",
" \n",
" # return of prediction vector and a dictionary containing intermediate values\n",
" return A_curr, memory\n",
"\n",
"def sigmoid_backward(dA, Z):\n",
" sig = sigmoid(Z)\n",
" return dA * sig * (1 - sig)\n",
"\n",
"def relu_backward(dA, Z):\n",
" dZ = np.array(dA, copy = True)\n",
" dZ[Z <= 0] = 0\n",
" return dZ\n",
"\n",
"def tanh_backward(dA, Z):\n",
" ta = tanh(Z)\n",
" return dA * (1.0 - ta) * (1.0 + ta)\n",
"\n",
"def softmax_grad(x):\n",
" # x here is the result of softmax applied to an input vector\n",
" # Reshape the 1-d softmax to 2-d so that np.dot will do the matrix multiplication\n",
" s = x.reshape(-1,1)\n",
" return np.diagflat(s) - np.dot(s, s.T)\n",
"\n",
"def softmax_backward(dA, Z):\n",
" # softmax_grad expects the softmax output (single example), and the\n",
" # chain rule here is a matrix product, not an elementwise one\n",
" s = softmax(Z)\n",
" return np.dot(softmax_grad(s), dA)\n",
"\n",
"def single_layer_backward_propagation(dA_curr, W_curr, b_curr, Z_curr, A_prev, activation=\"relu\"):\n",
" # number of examples\n",
" m = A_prev.shape[1]\n",
" \n",
" # selection of activation function\n",
" if activation == \"relu\":\n",
" backward_activation_func = relu_backward\n",
" elif activation == \"sigmoid\":\n",
" backward_activation_func = sigmoid_backward\n",
" elif activation == \"tanh\":\n",
" backward_activation_func = tanh_backward\n",
" elif activation == \"softmax\":\n",
" backward_activation_func = softmax_backward\n",
" else:\n",
" raise Exception('Non-supported activation function')\n",
" \n",
" # calculation of the activation function derivative\n",
" dZ_curr = backward_activation_func(dA_curr, Z_curr)\n",
" \n",
" # derivative of the matrix W\n",
" dW_curr = np.dot(dZ_curr, A_prev.T) / m\n",
" # derivative of the vector b\n",
" db_curr = np.sum(dZ_curr, axis=1, keepdims=True) / m\n",
" # derivative of the matrix A_prev\n",
" dA_prev = np.dot(W_curr.T, dZ_curr)\n",
"\n",
" return dA_prev, dW_curr, db_curr\n",
"\n",
"def full_backward_propagation(Y_hat, Y, memory, params_values, nn_architecture):\n",
" grads_values = {}\n",
" \n",
" # number of examples\n",
" m = Y.shape[1]\n",
" # a hack ensuring the same shape of the prediction vector and labels vector\n",
" Y = Y.reshape(Y_hat.shape)\n",
" \n",
" # initiation of gradient descent algorithm - dError to minimize\n",
" dA_prev = - (np.divide(Y, Y_hat) - np.divide(1 - Y, 1 - Y_hat)) # cross entropy\n",
" \n",
" for layer_idx_prev, layer in reversed(list(enumerate(nn_architecture))):\n",
" # we number network layers from 1\n",
" layer_idx_curr = layer_idx_prev + 1\n",
" # extraction of the activation function for the current layer\n",
" activ_function_curr = layer[\"activation\"]\n",
" \n",
" dA_curr = dA_prev\n",
" \n",
" A_prev = memory[\"A\" + str(layer_idx_prev)]\n",
" Z_curr = memory[\"Z\" + str(layer_idx_curr)]\n",
" \n",
" W_curr = params_values[\"W\" + str(layer_idx_curr)]\n",
" b_curr = params_values[\"b\" + str(layer_idx_curr)]\n",
" \n",
" dA_prev, dW_curr, db_curr = single_layer_backward_propagation(\n",
" dA_curr, W_curr, b_curr, Z_curr, A_prev, activ_function_curr)\n",
" \n",
" grads_values[\"dW\" + str(layer_idx_curr)] = dW_curr\n",
" grads_values[\"db\" + str(layer_idx_curr)] = db_curr\n",
" \n",
" return grads_values\n",
"\n",
"def update(params_values, grads_values, nn_architecture, learning_rate):\n",
"\n",
" # iteration over network layers\n",
" for layer_idx, layer in enumerate(nn_architecture, 1):\n",
" params_values[\"W\" + str(layer_idx)] -= learning_rate * grads_values[\"dW\" + str(layer_idx)] \n",
" params_values[\"b\" + str(layer_idx)] -= learning_rate * grads_values[\"db\" + str(layer_idx)]\n",
"\n",
" return params_values\n",
"\n",
"# Cross-entropy loss function\n",
"def get_cost_value(Y_hat, Y):\n",
" # number of examples\n",
" m = Y_hat.shape[1]\n",
" # calculation of the cost according to the formula\n",
" cost = -1 / m * (np.dot(Y, np.log(Y_hat).T) + np.dot(1 - Y, np.log(1 - Y_hat).T))\n",
" return np.squeeze(cost)\n",
"\n",
"def convert_prob_into_class(probs):\n",
" probs_ = np.copy(probs)\n",
" probs_[probs_ > 0.5] = 1\n",
" probs_[probs_ <= 0.5] = 0\n",
" return probs_\n",
"\n",
"def get_accuracy_value(Y_hat, Y):\n",
" Y_hat_ = convert_prob_into_class(Y_hat)\n",
" return (Y_hat_ == Y).all(axis=0).mean()\n",
"\n",
"def train(X, Y, nn_architecture, epochs, learning_rate, verbose=False, callback=None):\n",
" # initiation of neural net parameters\n",
" params_values = init_layers(nn_architecture, 2)\n",
" # initiation of lists storing the history \n",
" # of metrics calculated during the learning process \n",
" cost_history = []\n",
" accuracy_history = []\n",
" \n",
" # Store weights over time\n",
" weights_over_time = {}\n",
" for layer_idx, layer in enumerate(nn_architecture, 1):\n",
" if layer_idx == 1:\n",
" weights_over_time[-1] = params_values[\"W\" + str(layer_idx)].flatten()\n",
" else:\n",
" weights_over_time[-1] = np.append(weights_over_time[-1], params_values[\"W\" + str(layer_idx)].flatten())\n",
" \n",
" # performing calculations for subsequent iterations\n",
" for i in range(epochs):\n",
" # step forward\n",
" Y_hat, cashe = full_forward_propagation(X, params_values, nn_architecture)\n",
" \n",
" # calculating metrics and saving them in history\n",
" cost = get_cost_value(Y_hat, Y)\n",
" cost_history.append(cost)\n",
" accuracy = get_accuracy_value(Y_hat, Y)\n",
" accuracy_history.append(accuracy)\n",
" \n",
" # step backward - calculating gradient\n",
" grads_values = full_backward_propagation(Y_hat, Y, cashe, params_values, nn_architecture)\n",
" # updating model state\n",
" params_values = update(params_values, grads_values, nn_architecture, learning_rate)\n",
"\n",
" for layer_idx, layer in enumerate(nn_architecture, 1):\n",
" if layer_idx == 1:\n",
" weights_over_time[i] = params_values[\"W\" + str(layer_idx)].flatten()\n",
" else:\n",
" weights_over_time[i] = np.append(weights_over_time[i], params_values[\"W\" + str(layer_idx)].flatten())\n",
" \n",
" if(i % 50 == 0):\n",
" if(verbose):\n",
" print(\"Iteration: {:05} - cost: {:.5f} - accuracy: {:.5f}\".format(i, cost, accuracy))\n",
" if(callback is not None):\n",
" callback(i, params_values)\n",
" \n",
" return params_values, cost_history, accuracy_history, weights_over_time"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can now train this architecture on the XOR problem, first with plain gradient descent and then with the non-stiff (Adams) solver, and compare their run times."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CPU times: user 13.2 s, sys: 51.4 ms, total: 13.3 s\n",
"Wall time: 13.3 s\n"
]
}
],
"source": [
"%time params_values, cost_history_bp, accuracy_history_bp, weights_over_time = train(np.transpose(X_xor), np.transpose(y_xor.reshape((y_xor.shape[0], 1))), NN_ARCHITECTURE, 100000, 0.1, verbose=False)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CPU times: user 1min 26s, sys: 167 ms, total: 1min 26s\n",
"Wall time: 1min 26s\n"
]
}
],
"source": [
"%time params_values, cost_history_nonstiff, accuracy_history_nonstiff, weights_over_time = train_stiff(np.transpose(X_xor), np.transpose(y_xor.reshape((y_xor.shape[0], 1))), NN_ARCHITECTURE, 70000, stiff = False, verbose=False)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAD8CAYAAAB0IB+mAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xl8HHd9//HXd3d135dtWZYt33d8CcexYztxyEVI0iQUQjhSEn4p0AKlBJpAWwr04CoFyhHSlCOUQC5CQiAJicnhhPiOHZ+yfEuWZMmWZFmWpdXufn9/7EjalWRbsrTa1e77+XjsY2a+3+/M97Or0Uej78zOGGstIiIS/1zRDkBEREaGEr6ISIJQwhcRSRBK+CIiCUIJX0QkQSjhi4gkCCV8EZEEoYQvIpIglPBFRBKEJ9oBhCosLLRlZWXRDkNEZFTZsmXLCWtt0YXaxVTCLysrY/PmzdEOQ0RkVDHGHBlIOw3piIgkCCV8EZEEoYQvIpIglPBFRBKEEr6ISIJQwhcRSRBK+CIiCUIJX0Qkijq9HWx8+gmOVeyJeF8x9cUrEZFEEfD72fXaWv782C9pbTzJ0pvfQ8nM2RHtUwlfRGQEWWs5uHUT6x75GSerj1I8bSbv+uS9lM6ZH/G+lfBFREZI8/E6Xv7Zjzm4dRN5xSXc9PdfYNrSyzDGjEj/w5LwjTE/Ad4N1Ftr5zll+cCjQBlwGHivtbZpOPoTERlNfF4vG59+go1PP47L7WH1h+5m0XU34vaM7DH3cPX2M+D7wMMhZfcBa621XzPG3Ocs/8Mw9SciMirU7d/Hcz/8LxqPVTFz+Squ+NDdZOYXRCWWYUn41trXjDFlvYpvBq5w5n8OvIISvogkCL+vk/VP/poNv32cjNw8brv/y5QtXBLVmCL5/8RYa20tgLW21hgzJoJ9iYjEjFP1dfzuv77G8YP7mbNqDVf+1T2kZmRGO6zon7Q1xtwD3AMwceLEKEcjIjI0B7Zs4LkffBss3PTZLzB96fJoh9Qtkgn/uDGm2Dm6Lwbq+2tkrX0QeBCgvLzcRjAeEZGIsYEAbzz2SzY89ShjJk/lxs/cT+7YcdEOK0wkE/4zwJ3A15zp0xHsS0QkanxeL8/98L/Y9+Y65l15DVfd9TE8ycnRDquP4bos81cET9AWGmOqgS8RTPSPGWPuBo4CfzkcfYmIxJK2llM8/c1/pWbfHlZ94COU33jriF1XP1jDdZXO+89RddVwbF9EJBa1nWrm8a9+kaa6Gt79d/cx87LLox3SeUX9pK2IyGjU2tTI41/9Ii0n6rn1vn9h4rwF0Q7pgpTwRUQGqa3lFI9/5QucPnmC2+77MhPmzIt2SAOi2yOLiAxCZ3s7v/36V2hpqOfW+/9l1CR7UMIXERmwgN/P777zNeoOVHLDpz/PhNmjJ9mDEr6IyICt+9XPOfTWZq66+2NMe8eyaIczaEr4IiIDsG/962z+3W9YcM0NLLj6XdEO56Io4YuIXEBjzTGe/9F3KZ4+kyvv/Gi0w7loSvgiIucR8Pt5/gffxu12c+Nn7sftSYp2SBdNCV9E5Dw2PfMktfsruOruj5NVUBjtcIZECV9E5BxOVh/lz48/wozLVjJrxepohzNkSvgiIv2w1vKnnz5AUmoKV931sWiHMyyU8EVE+lG58c8c3fk2K973IdKzc6IdzrBQwhcR6cXX2cmrv/hfiiaWseCd10c7nGGjhC8i0svOl1+kpaGeVR+8C5fbHe1who0SvohICJ/Xy4anHmX8zDlMumRRtMMZVkr4IiIhdrz8R1obT7L8L++I2QeZXCwlfBERhw0E2PqHpymePnNU3N9+sJTwRUQch7ZvobmulsXX3xR3R/eghC8i0u2t535HZl4+0y9dEe1QIkIJX0QEOFVfx+HtW7nk6utxe+LzYYBK+CIiwO51LwMwd/VVUY4kcpTwRSThWWvZs+5lSufMJ7twTL
TDiRglfBFJeHX799FUW8PsVVdGO5SIUsIXkYRXsf513B4PM+L0ZG0XJXwRSXgHt2ykdO4lpKRnRDuUiFLCF5GE1lhTTVPtMaYuuTTaoURcfF57JCIyQAc2bwBgypJ3jFifvqYmOo8cwVtVhbeqis6jVWSsvJycG26IaL9K+CKS0A5t20LRxLKIXJ3ja2igvWIf3oMH6DhwEO+BA3QcPIi/sTGsnWfsWFKmTx/2/ntTwheRhBXw+6nbv495V1495G35m5tp27aN9l27aN+1m/adO/HV13fXu3JySJkyhcw1V5IyZSrJk8tInjiRpHFjcJ3aD6mRf8iKEr6IJKwTVUfo7GineNqMQa/rb22lbf16zmzcSNvGTXRUVIC1YAzJU6aQvuxS0ubOJWXWbFKmTsFdUIABOHkAqjfCsd/Dn7ZC3Q7we2H5J+Gafx329xhKCV9EElZtZQUAxdNnDai978QJTr/0EqdfWsuZDRugsxOTmkr64kVkf+qTpC1ZQuqcubgznat9rIWT++HwM/DK63D4DWitC9YlZ0LxQrj0r2H8YiiN/EljJXwRSVi1+ytIy8omZ+y4c7axfj+tr71G85NP0vrKq+DzkTRpIvkf/CBZa64kbcECTHJyzwo+Lxz4E1Q8D/ueh+YjwfLMcVB2OZStgImXQeEMcI3s07SU8EUkYdVWVlA8fWa/t0K2Ph+nnn2Wkz9+EO+hQ7gLCsj/8IfJ+YubSZk+PXydQAAOr4O3H4M9z0BHC3hSYcoVsOLTwWn+FIjyLZeV8EUkIbWfaaXxWBWzV6zuU9e2aRN1X/kqHZWVpMycyfj//BbZ11yDSUoKb9haD5v+F976BbQcg+QsmHMTzL4RJq+G5PQRejcDo4QvIgmp7kAlED5+b71ejn/zWzT94hckjR9Pyfe+S9bVV/f9D+BEJaz7Nux8InjCddo74ZqvwozrYy7Jh1LCF5GEVFu5F4xh3LTg9e++piaqP/Zxzm7fTt6HPsSYv/8MrrS08JVaauCV/4C3fgmeFFh8J1z6MSicFoV3MHhK+CKSkGorKygoKSUlPQNfUxNHP3wn3iNHKPnOd8i+7trwxgE/bHoI1n4leES/9P/Bys9C5ui6lbISvogkHGsttfv3Ma38UqzXy7FPfgrvkSOUPvAjMpYvD2986hg8cRdUrYepV8G7vw15ZVGJe6ginvCNMYeB04Af8FlryyPdp4jI+TQfr6X9dAvF02fS8N/fp23zZsZ/8xt9k/2hdfD4X4GvHW75MVzyvqhfaTMUI3WEf6W19sQI9SUicl5dX7jKdyVx8ic/Iee2W8m58cbwRnt/H0z2eZPhff8HRYP/Nm6s0ZCOiCSc2soKklJS8f/iEdzZ2Yz9/OfDG+z9Azz6IRi/CD7wOKTnRyfQYTYS98O3wB+NMVuMMfeMQH8iIudVW1nBmHHjafvznyn46N24c0JuXHZsS3DMvngBfPi3cZPsYWSO8FdYa2uMMWOAF40xe621r3VVOn8E7gGYOHHiCIQjIoms09tBw5GDzEzPxZWdTd7tt/dUnm2Gx+6EzCK441FIyYpeoBEQ8SN8a22NM60HngKW9qp/0Fpbbq0tLyoqinQ4IpLg6g8dJOD3k76ngpx334ArI+Sxhs99Pnit/Xt+NuouuRyIiCZ8Y0yGMSarax64BtgZyT5FRM6ntnIvALmnWsm55ZaeikPr4O1HYdW9MGFJlKKLrEgP6YwFnnK+luwBHrHWPh/hPkVEzqm2soJ0l5vM4vGkzpsXLAwE4IUvQE4pXP6Z6AYYQRFN+Nbag8CCSPYhIjIYtfv2ktPcSubKK3rukbPvOah7G255EJLSzr+BUWwkrtIREYkJrY0nOd14gtzTZ8hYtbKn4s0fQM5EmHdb9IIbAUr4IpIwavcHv3CV5/WRsdS5fqRuJxx5Ay69B9zx/dUkJXwRSRi1lRUYC+PmXoIr3bmN8Y7HwbhhwfujG9wIUMIXkYRRs2sH2W3tZK9aFSywFnb+Bq
ZeCRmF0Q1uBCjhi0hCCPj9HD98kNy2djK7xu9rt8OpozD3lvOvHCeU8EUkIZyoOoLP76MgJY3kKVOChQdfCU6nvTNqcY0kJXwRSQi1e3cBMGFhec/lmAdfgaLZkDUueoGNICV8EUkIVRvXk+zzM3bNmmCBzwtH18OUvg8xj1dK+CKSEGoPVJJ7toOMy5yHnNTvAt9ZmLgsuoGNICV8EYl77WdaaWlvoygrF3emc7O0mreC0/GLohfYCFPCF5G4V7snOH4/bvqsnsKabZCaC7mTohTVyFPCF5G4V/Xm62AtpctX9BTWvAXjF47qZ9QOlhK+iMS92r27yWzvJOdSZ7ze74OGvTDukugGNsKU8EUkrllrqT/ZQL7bg6egIFjYfAT8Xigc/Q8mHwwlfBGJa021NXhtgLElIY9QPbk/OC2cHp2gokQJX0TiWvWbrwMwYVHIU6y6En6BEr6ISNyo3rIJtz9A8eorewpPVEJaHmQURC+wKFDCF5G4drz6CLmdflKnTu0pPLk/4Y7uQQlfROJYp7eDpvY2ivIKeu6fA8Ej/IJp0QssSpTwRSRu1W7bijWG8TNm9xR2tEJrHRRMPfeKcUoJX0TiVtXrrwFQevmqnsLmI8Fp/uQoRBRdSvgiErdqKvaQ5vWRv+yynsLGQ8FpnhK+iEjcaGg+SUFKGq7k5J7CpsPBaV5ZNEKKKiV8EYlLLTXHOItl3ISJ4RVNhyA1B9LzoxNYFCnhi0hcOvLyWgBKlrwjvKLpcEIe3YMSvojEqaqtm3AFApS+89rwCiV8EZH4UldTRb51kVJU1FMY8EPTkYQ8YQtK+CIShzpaWmjydzJuXEl4RUsNBDp1hC8iEi8OPP97MIaJi8vDK5q6LsksG/GYYoESvojEncrXXsbjDzD15lvDK07sC04T7LbIXZTwRSSuBAJ+qo8fY2xyKskFve6G2VAByVmQXdL/ynFOCV9E4sq+p5+i3WWYsXhp38qGvVA0M6GeYxtKCV9E4spbzzxJss/PvLv/um9lQwUUzRr5oGKEEr6IxI0jL6+lpu00s8aXkZyXF17ZUgOtx2HcvOgEFwOU8EUkLnhbW3nhge+S4vOz4v5/6tugamNwOqGfoZ4EoYQvIqOet/U0T3ziLk4TYM31f0H6+PF9Gx1dD55UGDd/5AOMEZ5oByAiMhSHnvktL/3iIVpccNmcRcz5aD9j99bCvueg7HLwJPetTxBK+CIyqthAgNN7drP3N09QsWMr9SZAig1w3fW3Mfev7u5/pZq3gvfQWfF3IxprrFHCF5GYYq0l0NqK/1QLnfV1NFVW0njoAE21NTTW13HibCstyR4whnRgyaxLWPbpe0nNP8/tjt/8fvD6+3m3nrtNAoh4wjfGXAd8F3ADD1lrvzbcfdjOTqzXC273ua+vPUf5Oa/GHeR2hq28N2vPvzzEsn5aDXsf/fbbf8cXt70B99lvpxfX51D7Hc73P8CyQX3mgQDW78f6fMF5nx/8Pqw/4Ez9wbJAcGr9PvAHnKm/u13A68Xfdhbf2TY62trobGujo72NzvZ2vN52Oju8dHZ20NHRgbejnXZfJ+02QIfbjdfjpiPJjQ35XfEAhfkFTJ00hSlrrmbSipUY1wVORe5+GnY+CSvvDd4HP4FFNOEbY9zAD4CrgWpgkzHmGWvt7uHs5/Tatez48gOcTQ1+q85Ar707fKfuk2r7/CacPzGY3vV9mp97exfu+wJ9XXD9C7zX0PoLxD34vge47nlivth4RyLW/tqF72XBJWt6KrrqjbHd8xbb/UZ7ygiWWdszH7Zt07Vj913HmbdYrHGidPqzxqkzJmS+17RXu551jbNNurdrDQQMBIxxpsH5sHJXeP2ADm4MuFLdpLiSSEvykJ2SRFpaKhmZ6eSOzaOgdCyF4wvIyEjr2ZxtgF2/cX5Gtu804Iejb8L2X0FJOaz63IXjiHORPsJfCuy31h4EMMb8GrgZGNaEf+xMEtsWfGrI27FdOwoBun8VbM
i8U277KQvuZF3lhNTTXW97l4W+bD9lvbdje+psn2309NP/+v1tB2dbA93GueOhO6L+tnWOWPvdTnjbnvfZf3/9v/9zvO+u92r7Wa/fz+h8cfdN/vHDELyAzwXGFbJsCB7DecC4ATcYN6Z72eMs95S7Q+tNMoZkMMGX6TUNrh+gnQBe46el049pCuBqDuCq6CTJdOAxjSSZDpJMOx7jxWPaSXG1kuFqIsPdSIbrJIVJh0lznQ6+leQseMdHYc0/QVJqVD7NWBLphF8CVIUsVwOXhjYwxtwD3AMwcWKvR5ENUH3HEdLa1+IJ1BEAAtYQsAZL1zT4+xtcDk6DZcFpV5vzDPAkDNNrJvQT6X2gZno1Nv1soO/BnaH7YLVr2dV/vQ3b7jk7DwnWnHu59xvD1Ttgp73pLrOAMV3zJrxN97pOfVhfPeXdyyH13e/NmD59drczxkm2vbdpwmIwYcsuZ94FrpBk7XKHxOFyhkC62rt6tXcF33Ov99fTR+hnHLKTWFd4064faui8ddaxPcuGrn8tOrHWF/x8nN9Rv/UABmsNBOCs34A/OG/9BvyA32C9gC98/zDpkFRimLK6jFWLJpHk1hXoEPmE318GDTs0stY+CDwIUF5eflGHTSddAZrOnqK9aAz+FDfWZXr+DXUR/i/t+V7d/75a519U210ecPbVgAnOW2PBGKdN8Bekexs4+za2+9/i7v3a+VQCXWXG0jVYENwmWGtDYnI+JycWumK0tjvJBLrWc7YZCG6y5/1Ad/84n0PYTycsQfU2kPKw9N0ds8F0Hyh3tbB0pbye+a7jeBMyaGIGG481fZpY27UdJ7GYrnamu7znj1XPct+6kMQb1obueds1tb3ahR5M2J4/Hj3rdyU/sLh6flhdn4U1Tv9OkrZuJ2F2JU5XzzIubL91Lqfv8Hrr1FlrMKHLAQ/gxlrnD4XzT5F1hpuCB0+2+2drA12fd9/6rmXClkPadQ1hnaOP4HqhbZxSN1iXxSaBTYEkCxkBQ3bAMNbvYpLXxaRKF9v2H+SnxUf40l8vYfrYrP73nQQS6YRfDZSGLE8Aaoa7k/RLJ/CH2jrq848AwV9Blwkeqbhw9cwbFy5C5o0rvO156kO3Z0zoL2zPUWBXmcE4ZSHtnInLKXP3s16/2wpZ7r2tvv1xzjbQM0DiLPRbHnqCMax8IOsOZDvnGEsfegy233WtkzAAAjYQVmatJUCgT33AyWCh9dZ2r9ld3729kPqusrDtEQgmKxNeH6b3H+EY4XF5SHYlk+ROCk5dSSS7k/G4PN3zvafpnnTSk9JJ86R1z6d7nGVnPiMpg9zUXPJS8kjzpHXvv0MR/Dn0/OFoqGnl2f/ZyaU1Z/nsf6/noX9YyZisxB7WMQO+cuFiNm6MB9gHXAUcAzYBd1hrd/XXvry83G7evHnQ/Vjb80vYlZhFYln3HxAb8gch5A9GV73f+gkEAsGpDU5Dy7qXu+oCPfMBG8AX8J27zjp1IeWdgU58AR9ev5fOQGefaWegk05/J95Ar3J/sK7d385Z31naOtto87XhC/gu+Fkku5K7k39uai5j08dSklnC+MzxlGSWUJpVytj0sRf1e+1t9/HIv2+ivqGN6sty+d6Hl1zMjyvmGWO2WGvLL9Quokf41lqfMeZvgRcIXpb5k3Ml+6EwxuA27uHerEjEdP+XaMBN/O67nf5O2nxt3X8EzvrO0uZr47T3NKc6TtHU0URzezPNHc00dTTR1N7EhtoN1LfVh/0nlJuSy6z8WcwtmMtl4y9j8ZjFJLmTLth/cqqH6z4yhye/voX6rSeovPZ0Qg/tRPQIf7Au9ghfROJLp7+TujN1HDtzjMOnDrO3cS+7T+6msrkSX8BHuieda8uu5X2z3sfcgrkX3N4T/7mFfQea8F9fzD/deOH2o01MHOGLiFyMJHcSpdmllGaXsqx4WXf5mc4zbKjdwMtVL/P84ed5av9TXD3paj5X/jmKM4vPub35K8ZzvPIUz26sxf
eu2XgS9KqdxHzXIjIqZSRlsGbiGr664qu89Jcv8YmFn2Bd9Tre++x72Vi78ZzrTZpfCEDuaT/bq0+NVLgxRwlfREal7ORsPr7g4zxx0xMUpBbwibWfYHNd/0PCqRlJ5E3IYJLPxbrKhhGONHYo4YvIqDYpexI/ve6njM8cz2df/Swnz57st93EmfmM97t5Y58SvojIqJWXmse3V3+bVm8rX9/09X7bFE3Mwm2h+uhpOv2BEY4wNijhi0hcmJY3jTvn3slzh56jorGiT33hhEwAcr1Qebx1pMOLCUr4IhI37px7JxlJGfx818/71OWNS8flMYzxG3YeS8wTt0r4IhI3clJyuGHyDbx45EVOe0+H1bncLnLHpFNk3eyubYlShNGlhC8iceWW6bfQ7m/npSMv9anLHZNOkXFx8MSZKEQWfUr4IhJX5hbMZVzGOF6peqVPXU5RGuleOFSvMXwRkVHPGMPqCat5s/ZNOvwdYXU5Y9JwWWhpbKe90x+lCKNHCV9E4s7KkpWc9Z1le/32sPKcMekA5PoNhxJwWEcJX0TizqKxizAYttRvCSvPLgjeDz87YDjYoIQvIjLqZSdnMyNvBluPbw0rz8hLAQNZAcPBhsQbx1fCF5G4tHjsYrY3bA97CIvb7SI9O5lxSUkcUMIXEYkPi8cu5qzvLHsb94aVZ+alUuBKzEszlfBFJC4tLFoIwPaG8BO3WXkpZPoNB+pbz/mM5XilhC8icWlcxjjGpo/tc6VOZl4qno4AZzr8HG/pOMfa8UkJX0Ti1sIxC9nWsC2sLDM/BXyWFEvCjeMr4YtI3FpQtIDaM7UcP3O8uywzz7k00ybelTpK+CISt/obx8/MSwGgyOXmQIJdi6+ELyJxa1b+LFLcKb0SfvAIvywjVUM6IiLxIsmdxNyCuWHj+Ok5yRiXoSQ5mT21LQl1pY4SvojEtQVjFrD75O7uG6m5XIaMnGSK3G5OtHo51nw2yhGOHCV8EYlrC4oW4Av42HNyT3dZZl4q6c4XcLdXJc7Tr5TwRSSuLShaAMC2+p5hncz8FGybn2S3i7erm6MV2ohTwheRuFaYVkhpVmmfE7dnmjuYOz6LjYcboxjdyFLCF5G4t6BoAdsatnWfoM3MS8HfGeDyiflsr2rmVFtnlCMcGUr4IhL3FhYt5MTZE9ScqQEgy7k0c0lRNgELbxw4Ec3wRowSvojEvQVjwsfxM/ODX76akJxMVqqHP+2tj1psI0kJX0Ti3rTcaaR70nsSvnOE33bKy7Vzx/HCzrqEeMatEr6IxD2Py8P8wvndX8BKy0zC5TG0NrVzy6ISTnf4WLsn/o/ylfBFJCFcWnwpexv30tDWgHEZMnNTaG3qYNmUAopzUvnF+sPRDjHilPBFJCFcUXoFAK9WvwoEh3Vam9pxuwwfWVHG+oONbK+K72vylfBFJCFMy51GSWYJr1S9AgQvzWxtDN5u4f1LJ5KV6uF7ayujGGHkKeGLSEIwxnBF6RWsr11PW2db95evAgFLVmoSf3PlNNburefVfQ3RDjVilPBFJGFcWXolHf4OXqt+jZyiNAIBS2tjOwAfWVFGWUE6//z0Tlo7fFGONDKU8EUkYZSPLWdM+hh+d/B35I5NA6C5vg2AFI+bb7xnAVWNbXzp6V3RDDNiIpbwjTH/Yow5ZozZ5rzeFam+REQGwu1y8+4p7+aNY2/gzwoe2Tcf77k98tLJ+XxyzXSe3FrNQ+sORivMiIn0Ef5/WWsXOq8/RLgvEZELunHKjfitn5caXiApxd19hN/l01dN54b5xfzr7/fwxJbqKEUZGRrSEZGEMi1vGovHLOaRil+SMyaNU8fDE77LZfjP9y7g8mmF3Pv4dv739UNRinT4RTrh/60x5m1jzE+MMXn9NTDG3GOM2WyM2dzQEL9nx0Ukdtw17y7qztTRltHc5wgfIDXJzUN3lnP9vHF89dndfP6J7Zz1jv5bLwwp4RtjXjLG7OzndTPwI2AqsBCoBf6zv2
1Yax+01pZba8uLioqGEo6IyICsnLCSabnT2ObdSMvJdnz9JPPUJDffv2Mxf3vlNB7fUs3NP3idt442RSHa4TOkhG+tfae1dl4/r6ettcettX5rbQD4H2Dp8IQsIjI0LuPic+Wf45BnL1horD3Tbzu3y3DvtTP52UeW0nLWx60/+jP/+NsdnGztGOGIh0ckr9IpDlm8BdgZqb5ERAZreclyZk6fCMCOPef/hu3qGUW8+Per+KvlZTyy4SirvvEy33qhYtQ9OCWSY/jfMMbsMMa8DVwJfCaCfYmIDNr9V91Lp7uD57e8QlP7+YdrslKT+NKNc/njZ1ZxxawxfP/l/Sz7j7V88akd7Dt+eoQiHhrT9civWFBeXm43b94c7TBEJIE8/O+vsb+5kn2rXuShax4iMzlzQOvtrWvhJ68f4rfbavD6ApRPyuPmRSXcML+Y/IzkCEcdzhizxVpbfsF2Svgikshee3Qfu96o5sEl9zI5r4wfXvVDxmWMG/D6jWe8PLqpit9sraayvhWPy7ByeiFXzR7LmlljGJ+bFsHog5TwRUQGYO+btaz9+R5mfSyZf9z9OZLdyXx5+Ze7b6c8UNZadte28PS2Gv6wo5bqpuA3eGeNy+KKmWNYNiWfJZPyyEpNGvb3oIQvIjIApxra+L9/Ws/qO2aSdkk7n3/t8+xr2seNU27k04s/zdiMsYPeprWW/fWt/GlvPX/aW8+WI034AhaXgXklOSwtCyb/S0pzGZ+TijFmSO9BCV9EZACstfzsH95gwqw8rr5rLl6/lwe2P8DPdv0Mj8vDB2d/kDtm30FhWuFF93Gmw8dbR5vZeOgkGw418lZVM15fAIDCzGQumZDLbYsncMMlxRfYUv8GmvA9F7V1EZE4YYyheFoOtftPAZDsTuZTiz/FLdNv4btbv8tDOx7i57t+zo1Tb+S26bcxr3DeoI/IM1I8XD69kMunB/9odPj87K5pYcexU2yvOsXb1c1UN/X9xu9w0xG+iCS87WureP3xSj7878vJyk8Nqzt86jAP736YZw48Q4e/g8k5k7lp6k2sKV3D5JzJQx6O6WKtvehtaUhHRGSATta08uuvbOSKD8xk7sqSftuc9p7mj4f/yDMHnmFr/VYASrMFSyZhAAALSUlEQVRKWT1hNcvHL2fRmEUDvqRzuCnhi4gMkLWWh7/4Z4pKs3jXxy+5YPu6M3W8WvUqr1S/wsbajXgDXlzGxaz8WZSPLWfhmIXMKZjD+Izxw/YfwPko4YuIDMIrj1Swb0Mdd39rJe6kgd+E4KzvLNsbtrO5bjNbjm/h7Ya38Qa8AOSk5DA7fzZzCuYwK38WU3KmMCl7Eqme1AtsdXB00lZEZBAmzStg12vHqKlspnRO/oDXS/Oksax4GcuKlwHQ4e9gX+M+9jTuYffJ3ew+uZuHdz+MLxB8Tq7BUJJZwpTcKUzJmcLknMlMzJrIlNwp5KcOvN+LoYQvIgJMmJWHJ8XN/q31g0r4vaW4U5hfNJ/5RfO7y7x+L4dOHeLQqUMcPHWw+/VmzZt0BoI3YLtzzp3c+457h/w+zkcJX0QESEp2M2VhIQe21rPqfTMGNaxzIcnuZGbmz2Rm/sywcn/AT01rDVWnqxiTPmbY+jsXPeJQRMQx4x3j6GjzcXT3yRHpz+1yU5pdyvKS5UzLmxbx/pTwRUQcE2bnkZaVxN71ddEOJSKU8EVEHG63i1nLijm0/QStTe3RDmfYKeGLiISYt7oEay271tVEO5Rhp4QvIhIiuzCNsvmF7Fp3DF9n34ebj2ZK+CIivSxYM4GzpzvZ80ZttEMZVkr4IiK9lMzMo3hqDltfOIK/MxDtcIaNEr6ISC/GGN5xw2RamzrY/Ub8jOUr4YuI9GPC7DyKp+Ww6feH6GjrjHY4w0IJX0SkH8YYVr53BmdbO9n0+8PRDmdYKOGLiJxD0cQs5iwvZsfL1TTWnIl2OEOmhC8ich6X3jyV5DQPax/eQ8A/uk/gKuGLiJxHen
Yyq94/g/rDLbz14tFohzMkSvgiIhcwvXwsUxePYePvDlF/pCXa4Vw0JXwRkQFYfccM0nOSef7HO2lvHZ1X7Sjhi4gMQFpmMtfdM58zLR28+JNdBAKx83jYgVLCFxEZoLFl2ax63wyO7m7k9Uf3EUvPBB8IPfFKRGQQ5q4sofl4G9teqiIzP5XF106KdkgDpoQvIjJIy2+dxplTXt586gCpGUnMuXx8tEMaECV8EZFBMi7DVR+eTUdbJy//314CAcu8VSXRDuuCNIYvInIR3Ekurv/YfCbNL+DVRyrY/qeqaId0QUr4IiIXyZPk5vq/ns/kBYW8/lglbzy5HxvDV+8o4YuIDIHb4+K6e+Yxb3UJ2148ygv/sxOfNzaflKWELyIyRC63i1W3z2DFe6ZxYFsDT35zC6ca2qIdVh9K+CIiw8AYw8J3TuSGj1/C6ZPtPPbvmzm4rSHaYYVRwhcRGUZllxTy3i+8g5yiNJ57YAevPbqPzhgZ4hlSwjfG/KUxZpcxJmCMKe9Vd78xZr8xpsIYc+3QwhQRGT2yC9O47XNLuGTNBHa8XM2jX91I7YFT0Q5ryEf4O4FbgddCC40xc4DbgbnAdcAPjTHuIfYlIjJquJNcrHzvDG7+zCICfstvvrWF1x+vxNvui1pMQ0r41to91tqKfqpuBn5tre2w1h4C9gNLh9KXiMhoNGFmHrf/81LmXj6e7WureORL66ncdDwq9+GJ1Bh+CRD6LYRqp6wPY8w9xpjNxpjNDQ2xdYJDRGQ4JKd6uOIDs7jtH5aQnpPCH/93F09/ZxsnqltHNI4LJnxjzEvGmJ39vG4+32r9lPX758xa+6C1ttxaW15UVDTQuEVERp1xk3N4z33lrL5jJieqTvPov23kpZ/upuXk2RHp/4L30rHWvvMitlsNlIYsTwBqLmI7IiJxxeUyzFtVwrQlY9j6whHe/lM1lVuOc9lfTGXhOydGtu8IbfcZ4HZjTIoxZjIwHdgYob5EREad1Iwklt86jQ98ZRkzlo4juzAt4n0O6W6ZxphbgP8GioDfG2O2WWuvtdbuMsY8BuwGfMDfWGtj40JUEZEYkpWfylUfnj0ifQ0p4VtrnwKeOkfdvwH/NpTti4jI8NE3bUVEEoQSvohIglDCFxFJEEr4IiIJQglfRCRBKOGLiCQIJXwRkQRhonHHtnMxxjQARy5y9ULgxDCGE2mjKd7RFCuMrnhHU6wwuuIdTbHC0OKdZK294M3IYirhD4UxZrO1tvzCLWPDaIp3NMUKoyve0RQrjK54R1OsMDLxakhHRCRBKOGLiCSIeEr4D0Y7gEEaTfGOplhhdMU7mmKF0RXvaIoVRiDeuBnDFxGR84unI3wRETmPuEj4xpjrjDEVxpj9xpj7RrDfnxhj6o0xO0PK8o0xLxpjKp1pnlNujDHfc2J82xizOGSdO532lcaYO0PKlxhjdjjrfM8Y09+jIwcaa6kx5mVjzB5jzC5jzKdjPN5UY8xGY8x2J94vO+WTjTEbnL4fNcYkO+UpzvJ+p74sZFv3O+UVxphrQ8qHdb8xxriNMW8ZY54dBbEedn5W24wxm52yWN0Xco0xTxhj9jr772UxHOtM5zPterUYY/4uZuK11o7qF+AGDgBTgGRgOzBnhPpeBSwGdoaUfQO4z5m/D/i6M/8u4DmCz/tdBmxwyvOBg840z5nPc+o2Apc56zwHXD+EWIuBxc58FrAPmBPD8Rog05lPAjY4cTwG3O6UPwB83Jn/BPCAM3878KgzP8fZJ1KAyc6+4o7EfgP8PfAI8KyzHMuxHgYKe5XF6r7wc+CjznwykBursfaK2w3UAZNiJd6IJ8VIv5w3/kLI8v3A/SPYfxnhCb8CKHbmi4EKZ/7HwPt7twPeD/w4pPzHTlkxsDekPKzdMMT9NHD1aIgXSAe2ApcS/GKKp/fPHngBuMyZ9zjtTO/9oavdcO83BJ/bvBZYAzzr9B2TsTrbOEzfhB9z+wKQDRzCOd8Yy7H2E/
s1wBuxFG88DOmUAFUhy9VOWbSMtdbWAjjTMU75ueI8X3l1P+VD5gwhLCJ41Byz8TpDJNuAeuBFgke5zdZaXz99dMfl1J8CCi7ifVys7wCfBwLOckEMxwpggT8aY7YYY+5xymJxX5gCNAA/dYbLHjLGZMRorL3dDvzKmY+JeOMh4fc3fhWLlx6dK87Blg8tCGMygSeBv7PWtpyv6SDjGvZ4rbV+a+1CgkfPS4H+HvzZ1UfU4jXGvBuot9ZuCS0+z/aj/tkCK6y1i4Hrgb8xxqw6T9toxushOGz6I2vtIuAMwSGRc4mFzxbnfM1NwOMXajrIuIYUbzwk/GqgNGR5AlATpVgAjhtjigGcab1Tfq44z1c+oZ/yi2aMSSKY7H9prf1NrMfbxVrbDLxCcIwz1xjT9Szm0D6643Lqc4DGi3gfF2MFcJMx5jDwa4LDOt+J0VgBsNbWONN6gs+lXkps7gvVQLW1doOz/ATBPwCxGGuo64Gt1trjznJsxDscY1XRfBE8AjhI8CRX1wmtuSPYfxnhY/jfJPzkzDec+RsIPzmz0SnPJzhGmee8DgH5Tt0mp23XyZl3DSFOAzwMfKdXeazGWwTkOvNpwDrg3QSPmEJPhH7Cmf8bwk+EPubMzyX8ROhBgifTIrLfAFfQc9I2JmMFMoCskPk/A9fF8L6wDpjpzP+LE2dMxhoS86+Bj8Ta79mIJMVIvwie6d5HcIz3iyPY76+AWqCT4F/euwmOxa4FKp1p1w/JAD9wYtwBlIds5y5gv/MK3UnKgZ3OOt+n14mrQcZ6OcF//d4Gtjmvd8VwvJcAbznx7gT+2SmfQvAqhf0EE2qKU57qLO936qeEbOuLTkwVhFzREIn9hvCEH5OxOnFtd167urYXw/vCQmCzsy/8lmACjMlYne2lAyeBnJCymIhX37QVEUkQ8TCGLyIiA6CELyKSIJTwRUQShBK+iEiCUMIXEUkQSvgiIglCCV9EJEEo4YuIJIj/DxN0Zm0BxAfNAAAAAElFTkSuQmCC\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"weights_over_time_lst = sorted(weights_over_time.items())  # sort entries by key; returns a list of tuples\n",
"\n",
"e, w = zip(*weights_over_time_lst)  # unpack the list of pairs into two tuples\n",
"\n",
"plt.plot(e, w)\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEKCAYAAAD9xUlFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xt0VfWd9/H3l0tErQpI2oXEkOigbRC5BcWqiIIIdQyo2IJhjdrOk7aKUm2fBxznQaSLdlpt9WGVqWVadTqmcmtBdGFxEFRsvRCUgoApqFAilJtVYSwa4Pv8sXe2J4eTCwmbc3LO57XWWZy99+/s/d3Z4Xyyb79t7o6IiAhAu3QXICIimUOhICIiEYWCiIhEFAoiIhJRKIiISEShICIiEYWCiIhEFAoiIhJRKIiISKRDugs4Wt26dfOioqJ0lyEi0qasXr16j7vnN9WuzYVCUVERVVVV6S5DRKRNMbOtzWmnw0ciIhJRKIiISEShICIikVhDwcxGmlm1mW02sykppj9oZmvC15/N7IM46xERkcbFdqLZzNoDs4ArgRpglZktdvcNdW3c/c6E9rcD/eOqR0REmhbn1UcXAJvd/R0AM5sDjAY2NNB+PHBvHIXYPQYdE0bUgs/Qw4VERJLFGQo9gG0JwzXAhakamllPoBhYfqyLiALBEkZ2BJtmDX1EpO1pY7/Ofq/+KMtUcYZCql/Thn4TxgEL3P1QyhmZVQAVAIWFhUdXRXIgNFSZiBw3dl/wn1DhkHniPNFcA5yZMFwAbG+g7TjgiYZm5O6z3b3U3Uvz85u8IU9E2oi6cJDMEWcorAJ6mVmxmeURfPEvTm5kZucCXYCXY6xFRDKUgiGzxBYK7n4QmAgsBTYC89x9vZlNN7OyhKbjgTnurv1IkRylYMgcsfZ95O5LgCVJ46YmDU+Ls4boLIZ+50REmtTmOsQ7Wn6fY/cqESTH6FdeWijrQwGCYBDJFmZNf+Nn0tFYHRpqW9T3kUgbk0lf+M3h93qzLj1VeGQGhYKIHBe6J6FtUCiIiEhEoSDSBrW1Q0jSdigUREQkolAQyUJ5eXnpLiElnVfIfAoFkSxUW1ub7hKkjVIoiEjGaH9f+3SXkPMUCiKSMQ5zON0l5DyFgoiIRBQKIm2ULkuVOCgUREQkolAQkeNKl6VmNoWCiIhEFAoiIhJRKIiISEShICIiEYWCSJaqrKxMdwktkjc9M/ttyhWxhoKZjTSzajPbbGZTGmjzVTPbYGbrzew3cdYjkksmTJiQ7hJapNbVb1M6xfaMZjNrD8wCrgRqgFVmttjdNyS06QXcDVzs7n8zs8/HVY+IiDQtzj2FC4DN7v6Ou38KzAFGJ7X5X8Asd/8bgLvvirEeERFpQpyh0APYljBcE45LdA5wjpn9wcxeMbORMdYjIiJNiDMULMW45FsZOwC9gKHAeOCXZtb5iBmZVZhZlZlV7d69+5gXKtJWtdX+j3RXc+aKMxRqgDMThguA7SnaPOnute7+LlBNEBL1uPtsdy9199L8/PzYChYRyXVxhsIqoJeZFZtZHjAOWJzUZhFwOYCZdSM4nPROjDWJiEgjYgsFdz8ITASWAhuBee6+3symm1lZ2GwpsNfMNgArgP/t7nvjqklERBoX2yWpAO6+BFiSNG5qwnsH7gpfIiKSZrqjWUQyzvBfD093CTlLoSAiGee5d59Ldwk5S6EgIiIRhYJIFjNLdbuQSMMUCiIiElEoiEha6K7mzKRQEBGRiEJBREQiCgWRNq6tdoonmUmhICIiEYWCiGSkHj9JfvyKHA8KBRHJSNv3J/e0L8eDQkFERCIKBRERiSgUREQkolAQkbTRXc2ZR6EgIiIRhYJIllNPqXI0FAoiIhJRKIiISCTWUDCzkWZWbWabzWxKiuk3m9luM1sTvv45znpERKRxsYWCmbUHZgGjgBJgvJmVpGg61937ha9fxlWPSDYbNmxYuk
uQLBHnnsIFwGZ3f8fdPwXmAKNjXJ5Izlq2bFm6S5AsEWco9AC2JQzXhOOSXW9ma81sgZmdmWpGZlZhZlVmVrV79+44ahUREeINhVTXwSXfqfIUUOTu5wPLgP9MNSN3n+3upe5emp+ff4zLFBGROnGGQg2Q+Jd/AVCv20N33+vun4SD/wEMjLEeEWlj7D7dY3G8xRkKq4BeZlZsZnnAOGBxYgMz654wWAZsjLEeERFpQoe4ZuzuB81sIrAUaA884u7rzWw6UOXui4E7zKwMOAi8D9wcVz0iItK02EIBwN2XAEuSxk1NeH83cHecNYiISPPpjmYRSavHr3s83SVIAoWCiKRVeZ/ydJcgCRQKIiISUSiIiEhEoSAiIhGFgoiIRBQKIiISUSiIiEhEoSAiIhGFgoiIRBQKIiISUSiIiEhEoSAiIhGFgoiIRBQKIiISUSiI5IDhw4enuwRpIxQKIjngueeeS3cJ0kYoFEREJKJQEBGRSKyhYGYjzazazDab2ZRG2o01Mzez0jjrEclmZ5xxRrpLkCwQWyiYWXtgFjAKKAHGm1lJinanAHcAr8ZVi0gueO+999JdgmSBOPcULgA2u/s77v4pMAcYnaLd94EfAwdirEVERJohzlDoAWxLGK4Jx0XMrD9wprs/HWMdIiLSTHGGgqUY59FEs3bAg8B3m5yRWYWZVZlZ1e7du49hiSKS6fKm56W7hJwSZyjUAGcmDBcA2xOGTwHOA543sy3AYGBxqpPN7j7b3UvdvTQ/Pz/GkkUk09R6bbpLyClxhsIqoJeZFZtZHjAOWFw30d0/dPdu7l7k7kXAK0CZu1fFWJOIiDQitlBw94PARGApsBGY5+7rzWy6mZXFtVwREWm5DnHO3N2XAEuSxk1toO3QOGsRkcz1+HWPM+F3E9JdhtDMPQUzO9vMTgjfDzWzO8ysc7yliUiuKO9Tnu4SJNTcw0e/BQ6Z2T8AvwKKgd/EVpWIiKRFc0PhcHiO4FrgIXe/E+geX1kiIpIOzQ2FWjMbD9wE1N1o1jGekkREJF2aGwq3ABcBM9z9XTMrBh6PrywREUmHZl195O4bCDqtw8y6AKe4+7/FWZiIiBx/zb366HkzO9XMugJ/Ah41s5/GW5qIiBxvzT18dJq7fwRcBzzq7gMBPfRVRCTLNDcUOphZd+CrfHaiWUREskxzQ2E6QXcVb7v7KjM7C9gUX1kiIpIOzT3RPB+YnzD8DnB9XEWJiEh6NPdEc4GZLTSzXWa208x+a2YFcRcnIiLHV3MPHz1K0O31GQRPT3sqHCciIlmkuaGQ7+6PuvvB8PUYoKfdiIhkmeaGwh4zm2Bm7cPXBGBvnIWJiMjx19xQ+DrB5ah/BXYAYwm6vhARkSzSrFBw97+4e5m757v75919DMGNbCIikkVa8zjOu45ZFSIikhFaEwp2zKoQEZGM0JpQ8GNWhYiIZIRGQ8HM9pnZRyle+wjuWWiUmY00s2oz22xmU1JM/5aZrTOzNWb2kpmVtGJdRESklRrt5sLdT2npjM2sPTALuBKoAVaZ2eLw2Qx1fuPuD4fty4CfAiNbukwREWmd1hw+asoFwGZ3f8fdPwXmAKMTG4Tdcdc5GR2SEhFJq2Z1iNdCPYBtCcM1wIXJjczsNoIrmfKAK1LNyMwqgAqAwsLCY16oiIgE4txTSHV10hF7Au4+y93PBiYD/5pqRu4+291L3b00P1+9a4iIxCXOUKgBzkwYLgC2N9J+DjAmxnpERKQJcYbCKqCXmRWbWR4wjqCn1YiZ9UoYvBo9uEdEJK1iO6fg7gfNbCLBE9vaA4+4+3ozmw5UuftiYKKZDQdqgb8BN8VVj4iINC3OE824+xJgSdK4qQnvJ8W5fBEROTpxHj4SEZE2RqEgIhmvcl1lukvIGQoFkRzRu3fvdJfQYhN+NyHdJeQMhYJIjtiwYUPTjSTnKRRERCSiUBCRjNBOX0cZQVtBJIuUlLTd3ucP3Xso3SUICgWRrLJ+/fp0ly
BtnEJBREQiCgUREYkoFEREJKJQEBGRiEJBREQiCgUREYkoFEREJKJQEBGRiEJBREQiCgUREYkoFEREJBJrKJjZSDOrNrPNZjYlxfS7zGyDma01s+fMrGec9YiISONiCwUzaw/MAkYBJcB4M0vuwvENoNTdzwcWAD+Oqx4REWlanHsKFwCb3f0dd/8UmAOMTmzg7ivc/eNw8BWgIMZ6RESkCXGGQg9gW8JwTTiuId8AnomxHhERaUKHGOdtKcZ5yoZmE4BS4LIGplcAFQCFhYXHqj4REUkS555CDXBmwnABsD25kZkNB+4Bytz9k1QzcvfZ7l7q7qX5+fmxFCsiIvGGwiqgl5kVm1keMA5YnNjAzPoDvyAIhF0x1iIiIs0QWyi4+0FgIrAU2AjMc/f1ZjbdzMrCZvcDnwPmm9kaM1vcwOxEROQ4iPOcAu6+BFiSNG5qwvvhcS5fRESOju5oFpE2we5Lde2KHGsKBRERiSgUREQkolAQEZGIQkFEMkZJt+Tu0eR4UyiI5BCzzD5Zu/629ekuIecpFEREJKJQEBGRiEJBREQiCgWRLDNs2LB0lyBtmEJBJMssW7Ys3SVIG6ZQEBGRiEJBREQiCgUREYkoFEREJKJQEJE2Q91nx0+hICIiEYWCiIhEFAoiIhJRKIiISCTWUDCzkWZWbWabzWxKiulDzOx1MztoZmPjrEVEApnefbbf6+kuIafFFgpm1h6YBYwCSoDxZpb8BI2/ADcDv4mrDhERab4OMc77AmCzu78DYGZzgNHAhroG7r4lnHY4xjpERKSZ4gyFHsC2hOEa4MI4FlRbW0tNTQ0HDhyIY/aSgTp16kRBQQEdO3ZMdykiWSXOUEh14LJFBwvNrAKoACgsLDxiek1NDaeccgpFRUUZf7xUWs/d2bt3LzU1NRQXF6e7HDnO7D7TeYcYxXmiuQY4M2G4ANjekhm5+2x3L3X30vz8/COmHzhwgNNPP12BkCPMjNNPP117ho1w15emtEycobAK6GVmxWaWB4wDFse1MAVCbtH2FolHbKHg7geBicBSYCMwz93Xm9l0MysDMLNBZlYD3AD8wszWx1VP3Gpqahg9ejS9evXi7LPPZtKkSXz66acAPP/885x22mn079+fc889lyFDhvD0009Hn502bRo9evSgX79+0euDDz6oN//Dhw9zxx13cN5559GnTx8GDRrEu+++22hNRUVF7Nmz59ivrIhkrVjvU3D3Je5+jruf7e4zwnFT3X1x+H6Vuxe4+8nufrq7946znjqVlZUUFRXRrl07ioqKqKysbNX83J3rrruOMWPGsGnTJv785z+zf/9+7rnnnqjNpZdeyhtvvEF1dTUzZ85k4sSJPPfcc9H0O++8kzVr1kSvzp0711vG3Llz2b59O2vXrmXdunUsXLjwiDZxcHcOH9bFYdlGe1rSkJy7o7myspKKigq2bt2Ku7N161YqKipaFQzLly+nU6dO3HLLLQC0b9+eBx98kEceeYSPP/74iPb9+vVj6tSp/OxnP2v2Mnbs2EH37t1p1y7YZAUFBXTp0gWAJ554gj59+nDeeecxefLkIz47efJk/v3f/z0anjZtGj/5yU8AuP/++xk0aBDnn38+9957LwBbtmzhS1/6ErfeeisDBgxg27ZtR8xTJE5NnUhWb6nxifPqo7QZOnRog9NeeeUVPvnkk3rjPv74YyZNmkR5eTl79uxh7Nj6N1c///zzjS5v/fr1DBw4sN64U089lcLCQjZv3pzyMwMGDOD++++Phh988EEef/xxALp06cKKFSvqtf/qV7/KJZdcwsqVKxk2bBgTJkygf//+bN++ncmTJ7N69Wq6dOnCiBEjWLRoEWPGjIk+O27cOL7zne9w6623AjBv3jx+//vf8+yzz7Jp0yZee+013J2ysjJefPFFCgsLqa6u5tFHH60XJiKS/bIyFBqTHAh19u7d2+J5unvK3fGGxtdNS3TnnXfyve99r8FlFBQUUF1dzfLly1
m+fDnDhg1j/vz57N+/n6FDh1J3VVZ5eTkvvvhivVDo378/u3btYvv27ezevZsuXbpQWFjIzJkzefbZZ+nfvz8A+/fvZ9OmTRQWFtKzZ08GDx581D8LEWnbsjIUGvvLvqioiK1btx4xvmfPngB069atyT2DZL179+a3v/1tvXEfffQR27Zt4+yzz04ZOG+88QZf+tKXjmo5J5xwAqNGjWLUqFF84QtfYNGiRQwbNqxZnx07diwLFizgr3/9K+PGjQOCYLr77rv55je/Wa/tli1bOPnkk4+qNhHJDjl3TmHGjBmcdNJJ9caddNJJzJgxo8XzHDZsGB9//DG//vWvATh06BDf/e53ufnmm49YFsDatWv5/ve/z2233dbsZbz++uts3x7c5nH48GHWrl1Lz549ufDCC3nhhRfYs2cPhw4d4oknnuCyyy474vPjxo1jzpw5LFiwIDo8dtVVV/HII4+wf/9+AN577z127dp11OsvmampexXa+slmnVeIR86FQnl5ObNnz6Znz56YGT179mT27NmUl5e3eJ5mxsKFC5k/fz69evXinHPOoVOnTvzgBz+I2qxcuTK6JPW2225j5syZ9f7Kf/DBB+tdkrply5Z6y9i1axfXXHMN5513Hueffz4dOnRg4sSJdO/enR/+8Idcfvnl9O3blwEDBjB69Ogjauzduzf79u2jR48edO/eHYARI0Zw4403ctFFF9GnTx/Gjh3Lvn37WvxzEDmWdNdyelhbu/OxtLTUq6qq6o3buHHjUR+KkbZP271pTe0NZPr//6b2BhQczWdmq929tKl2ObenICKf0SEkSaZQEBGRiEJBJIs15/BQXl7ecaikZZpzeEh7C8eWQkEkx9XW1qa7hFZTMBw7CgURafPnFkDBcKwoFESyXHOvMMrUYDiaK4wUDK2nUDhGdu7cyY033shZZ53FwIEDueiii1i4cGGr5jlt2jQeeOABAKZOncqyZctaNJ81a9awZMmSlNOa6tY7E9x8880sWLAg3WXkhEwNhqNh95nCoRVyMhQq11VS9FAR7e5rR9FDRVSua33X2WPGjGHIkCG88847rF69mjlz5lBTU3NE24MHD7ZoGdOnT2f48OEt+mxjoQBNd+stbd/R3I9gZhkXDi25H0Hh0DI5FwqV6yqpeKqCrR9uxXG2friViqcqWhUMy5cvJy8vj29961vRuJ49e3L77bcD8Nhjj3HDDTdwzTXXMGLECPbv38+wYcMYMGAAffr04cknn4w+N2PGDM4991yGDx9OdXV1ND7xr+XVq1dz2WWXMXDgQK666ip27NgBBL3DTp48mQsuuIBzzjmHlStX8umnnzJ16lTmzp1Lv379mDt3bqPrktyt9+7du7n++usZNGgQgwYN4g9/+AMAL7zwQnT3df/+/aM7oX/84x/Tp08f+vbty5QpUwB4++23GTlyJAMHDuTSSy/lrbfeitbpjjvu4Mtf/jJnnXVWtH7uzsSJEykpKeHqq69W1xtpUhcOmRIQLb1RrS4cFBDNk5Ud4g19bGiD016peYVPDiV1nV37MZOemUR5n3L2fLyHsfOSus6++flGl7d+/XoGDBjQaJuXX36ZtWvX0rVrVw4ePMjChQs59dRT2bNnD4MHD6asrIzXX3+dOXPm8MYbb3Dw4EEGDBhwRJfctbW13H777Tz55JPk5+czd+5c7rnnHh555BEg2BN57bXXWLJkCffddx/Lli1j+vTpVFVVNfv5DYndek+aNIk777yTSy65hL/85S9cddVVbNy4kQceeIBZs2Zx8cUXs3//fjp16sQzzzzDokWLePXVVznppJN4//33AaioqODhhx+mV69evPrqq9x6660sX74cCJ4T8dJLL/HWW29RVlbG2LFjWbhwIdXV1axbt46dO3dSUlLC17/+9WbVLg1rrNfepiR/Ll13Qvu93qov91Sf1V3R9WVlKDQmORDq7P17y7vOTnbbbbfx0ksvkZeXx6pVqwC48sor6dq1Kx
D8h/qXf/kXXnzxRdq1a8d7773Hzp07WblyJddee23UiV5ZWdkR866urubNN9/kyiuvBILO9+r6MgK47rrrABg4cOAR/Sc1V+J/+GXLlrFhw4Zo+KOPPmLfvn1cfPHF3HXXXZSXl3PddddRUFDAsmXLuOWWW6L6u3btyv79+/njH//IDTfcEM0jsfvyMWPG0K5dO0pKSti5cycAL774IuPHj6d9+/acccYZXHHFFS1aDzlSa4IhUUNdxR8PrQ2GZM2ZVy4FR1aGQmN/2Rc9VMTWD1N0nX1a2HX2Sd2a3DNIltx19qxZs9izZw+lpZ91M5LYFXVlZSW7d+9m9erVdOzYkaKiIg4cOAA0r6+a3r178/LLL6ecfsIJJwDB099aev4isVvvw4cP8/LLL3PiiSfWazNlyhSuvvpqlixZwuDBg1m2bFnKL5zDhw/TuXNn1qxZ02i9detWJ1MOWWSjYxUMycyszQZDU47lsjI9YGI9p2BmI82s2sw2m9mUFNNPMLO54fRXzawoznoAZgybwUkdk7rO7ngSM4a1vOvsK664ggMHDvDzn/88GpfqMZx1PvzwQz7/+c/TsWNHVqxYET3fYciQISxcuJC///3v7Nu3j6eeeuqIz5577rns3r07CoXa2lrWr1/faH2nnHJKs3s/Te7We8SIEfUOO9V9ub/99tv06dOHyZMnU1payltvvcWIESPqPYL0/fff59RTT6W4uJj58+cDwRfSn/70p0ZrGDJkCHPmzOHQoUPs2LHjiKfQSetlekd4zZHpX64NSTzH0ZpXXGILBTNrD8wCRgElwHgzK0lq9g3gb+7+D8CDwI/iqqdOeZ9yZl8zm56n9cQwep7Wk9nXzKa8T+u6zl60aBEvvPACxcXFXHDBBdx000386EepV6e8vJyqqipKS0uprKzki1/8IhAcy//a175Gv379uP7667n00kuP+GxeXh4LFixg8uTJ9O3bl379+vHHP/6x0fouv/xyNmzY0OCJ5sa69Z45cyZVVVWcf/75lJSU8PDDDwPw0EMPcd5559G3b19OPPFERo0axciRIykrK6O0tJR+/fpFl9NWVlbyq1/9ir59+9K7d+96J9ZTufbaa+nVqxd9+vTh29/+dsrnQ0jrZUswtNVwaK24giG2rrPN7CJgmrtfFQ7fDeDuP0xoszRs87KZdQD+CuR7I0Wp62ypo+1+7Byrw0npDppcu8LoqG7sa2bX2XGeU+gBbEsYrgEubKiNux80sw+B04E9MdYlIkmy5XxO4pdkrgXEsRJnKKTaIsmx1pw2mFkFUAFQWFjY+spEpEHJf+03NyTSvZeQLNVf0QqKpsUZCjXAmQnDBcD2BtrUhIePTgPeT56Ru88GZkNw+CiWakUkpUz7sm8NdcXdtDhDYRXQy8yKgfeAccCNSW0WAzcBLwNjgeWNnU9oTFyX2UlmyqYvKsksx/LEdZwBE9cJ9thCITxHMBFYCrQHHnH39WY2Hahy98XAr4D/MrPNBHsI41qyrE6dOrF3715OP/10BUMOcHf27t1Lp06d0l2KSKPa4pVRsV19FJdUVx/V1tZSU1MT3QAm2a9Tp04UFBTQsWPHdJci0iZkwtVHx03Hjh0pLi5OdxkiIm1ezvWSKiIiDVMoiIhIRKEgIiKRNnei2cx2A0d2c9o83ci9u6W1zrlB65wbWrPOPd09v6lGbS4UWsPMqppz9j2baJ1zg9Y5NxyPddbhIxERiSgUREQkkmuhMDvdBaSB1jk3aJ1zQ+zrnFPnFEREpHG5tqcgIiKNyJlQaOp50ZnMzM40sxVmttHM1pvZpHB8VzP7bzPbFP7bJRxvZjYzXNe1ZjYgYV43he03mdlNCeMHmtm68DMzLUN6FjSz9mb2hpk9HQ4Xh8/z3hQ+3zsvHN/g877N7O5wfLWZXZUwPuN+J8yss5ktMLO3wu19UbZvZzO7M/y9ftPMnjCzTtm2nc3sETPbZWZvJoyLfbs2tIxGuXvWvwh6aX0bOA
vIA/4ElKS7rqOovzswIHx/CvBngude/xiYEo6fAvwofP8V4BmChxgNBl4Nx3cF3gn/7RK+7xJOew24KPzMM8CodK93WNddwG+Ap8PhecC48P3DwLfD97cCD4fvxwFzw/cl4fY+ASgOfw/aZ+rvBPCfwD+H7/OAztm8nQmevvgucGLC9r0527YzMAQYALyZMC727drQMhqtNd3/CY7TBrkIWJowfDdwd7rrasX6PAlcCVQD3cNx3YHq8P0vgPEJ7avD6eOBXySM/0U4rjvwVsL4eu3SuJ4FwHPAFcDT4S/8HqBD8nYl6KL9ovB9h7CdJW/runaZ+DsBnBp+QVrS+Kzdznz2SN6u4XZ7GrgqG7czUET9UIh9uza0jMZeuXL4KNXzonukqZZWCXeX+wOvAl9w9x0A4b+fD5s1tL6Nja9JMT7dHgL+D3A4HD4d+MDdD4bDiXXWe943UPe876P9WaTTWcBu4NHwkNkvzexksng7u/t7wAPAX4AdBNttNdm9nescj+3a0DIalCuh0KxnQWc6M/sc8FvgO+7+UWNNU4zzFoxPGzP7R2CXu69OHJ2iqTcxrc2sM8FfvgOAn7t7f+B/CHb5G9Lm1zk8xj2a4JDPGcDJwKgUTbNpOzclreuYK6HQnOdFZzQz60gQCJXu/rtw9E4z6x5O7w7sCsc3tL6NjS9IMT6dLgbKzGwLMIfgENJDQGcLnucN9euM1s3qP+/7aH8W6VQD1Lj7q+HwAoKQyObtPBx41913u3st8Dvgy2T3dq5zPLZrQ8toUK6EQvS86PAqhnEEz4duE8IrCX4FbHT3nyZMqnvGNeG/TyaM/6fwKobBwIfhruNSYISZdQn/QhtBcLx1B7DPzAaHy/qnhHmlhbvf7e4F7l5EsL2Wu3s5sILged5w5DrX/SwSn/e9GBgXXrVSDPQiOCmXcb8T7v5XYJuZnRuOGgZsIIu3M8Fho8FmdlJYU906Z+12TnA8tmtDy2hYOk8yHeeTPF8huGrnbeCedNdzlLVfQrA7uBZYE76+QnAs9TlgU/hv17C9AbPCdV0HlCbM6+vA5vB1S8L4UuDN8DM/I+lkZ5rXfyifXX10FsF/9s3AfOCEcHyncHhzOP2shM/fE65XNQlX22TlItK0AAAEiUlEQVTi7wTQD6gKt/UigqtMsno7A/cBb4V1/RfBFURZtZ2BJwjOmdQS/GX/jeOxXRtaRmMv3dEsIiKRXDl8JCIizaBQEBGRiEJBREQiCgUREYkoFEREJKJQkLQyMzeznyQMf8/Mph2jeT9mZmObbtnq5dxgQY+mK5LGF9X1imlm/czsK8ehliVm1jnu5Uj2UihIun0CXGdm3dJdSCIza38Uzb8B3OrulzfSph/B9fJHU0OHplvV5+5fcfcPjvZzInUUCpJuBwkeMXhn8oTkv/TNbH/471Aze8HM5pnZn83s38ys3MxeC/uUPzthNsPNbGXY7h/Dz7c3s/vNbFXYX/03E+a7wsx+Q3DTUHI948P5v2lmPwrHTSW4ufBhM7s/1QqGd9JOB75mZmvM7GtmdrIFfeyvCju/Gx22vdnM5pvZU8CzZvY5M3vOzF4Pl13X7lvhvNaY2bt1eylmtqUuYM3srrDWN83sO+G4onCv5j8seIbBs2Z2YjjtDjPbEP5M5jR7C0p2SffdjHrl9gvYT9Bl9BaCfmy+B0wLpz0GjE1sG/47FPiAoCvgE4D3gPvCaZOAhxI+/3uCP356EdxJ2gmoAP41bHMCwR3ExeF8/wcoTlHnGQRdMuQTdFy3HBgTTnuehLtOEz5TRNhVMsEzAn6WMO0HwITwfWeCO25PDtvV8NndrR2AU8P33QjuZLWE+XQEVgLXhMNbwnYDCYLtZOBzwHqC3nWLCIK4X9h+XkId2/nszuHO6f7d0Cs9L+0pSNp50OPrr4E7juJjq9x9h7t/QnBr/7Ph+HUEX3x15rn7YXffRPBQki8S9BnzT2a2hqAL8tMJQgPgNXd/N8XyBgHPe9
Bx20GgkuDBKS01ApgS1vA8QVgVhtP+293fD98b8AMzWwssI+gS+QsJ8/l/BP3/PJU0/0uAhe7+P+6+n6CjuUvDae+6+5rw/Wo++3mtBSrNbAJBcEgOOupjliIxeQh4HXg0YdxBwkOcYUdfeQnTPkl4fzhh+DD1f6+T+3Gp62r4dndfmjjBzIYS7CmkcqwfW2nA9e5enVTDhUk1lBPsnQx091oLeo3tFLa9GegJTDzKehN/doeAE8P3VxMEXRnwf82st3/2TAPJEdpTkIwQ/mU8j+CkbZ0tBIdBIOhzv2MLZn2DmbULzzOcRdBZ2lLg2xZ0R46ZnWPBw2wa8ypwmZl1C09CjwdeOIo69hE8SrXOUuD2MOwws/4NfO40gudK1JrZ5QQhgJkNJDjUNsHdD6f43IvAmLD30ZOBawkOM6VkZu2AM919BcGDjToTHHaSHKNQkEzyE4Lj4XX+g+CL+DUg+S/o5qom+PJ+BviWux8AfknQPfPr4SWjv6CJvWYPuie+m6BL5z8Br7v70XQ7vQIoqTvRDHyfIOTWhjV8v4HPVQKlZlZFsNfwVjh+IsEjLFeE8/xlUr2vE5xTeY0g0H7p7m80Ul974HEzWwe8ATzouoopJ6mXVBERiWhPQUREIgoFERGJKBRERCSiUBARkYhCQUREIgoFERGJKBRERCSiUBARkcj/B6ExQlyYReKKAAAAAElFTkSuQmCC\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"plt.plot(range(len(cost_history_nonstiff)), cost_history_nonstiff, 'o-.', color='k', label='ODE Solver')\n",
"plt.plot(range(len(cost_history_bp)), cost_history_bp, 'o-.', color='g', label='Gradient Descent')\n",
"plt.xlabel('Number of Iterations')\n",
"plt.ylabel('Loss')\n",
"plt.legend()\n",
"plt.show()"
]
},
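{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a minimal sketch (added for illustration; the toy problem and step size are assumptions, not part of the original analysis), the non-stiff (`adams`) and stiff (`bdf`) methods of `scipy.integrate.ode` can be compared on a small stiff test ODE. Both should arrive at essentially the same solution, but the stiff (BDF) method is designed to remain stable without shrinking its internal step size the way a non-stiff method must:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative stiff test problem (not the network ODE): dy/dt = -50*(y - cos(t)), y(0) = 0\n",
"def f_toy(t, y):\n",
"    return -50.0*(y - np.cos(t))\n",
"\n",
"for method in ['adams', 'bdf']:  # non-stiff vs stiff integrator\n",
"    solver = ode(f_toy).set_integrator('vode', method=method)\n",
"    solver.set_initial_value(0.0, 0.0)  # y0 = 0 at t0 = 0\n",
"    while solver.successful() and solver.t < 2.0:\n",
"        solver.integrate(solver.t + 0.1)  # advance in steps of 0.1\n",
"    print(method, solver.t, solver.y[0])"
]
},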
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
},
"toc": {
"base_numbering": 1,
"nav_menu": {},
"number_sections": true,
"sideBar": true,
"skip_h1_title": false,
"title_cell": "Table of Contents",
"title_sidebar": "Contents",
"toc_cell": false,
"toc_position": {},
"toc_section_display": true,
"toc_window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}