Stanford Online/ DeepLearning.AI. Supervised Machine Learning: Regression and Classification, Multiple Variable Linear Regression.
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Optional Lab: Multiple Variable Linear Regression\n",
"\n",
"In this lab, you will extend the data structures and previously developed routines to support multiple features. Several routines are updated, which makes the lab appear lengthy, but each requires only minor adjustments to the previous routines, so it is quick to review.\n",
"\n",
"# Outline\n",
"- [  1.1 Goals](#toc_15456_1.1)\n",
"- [  1.2 Tools](#toc_15456_1.2)\n",
"- [  1.3 Notation](#toc_15456_1.3)\n",
"- [2 Problem Statement](#toc_15456_2)\n",
"- [  2.1 Matrix X containing our examples](#toc_15456_2.1)\n",
"- [  2.2 Parameter vector w, b](#toc_15456_2.2)\n",
"- [3 Model Prediction With Multiple Variables](#toc_15456_3)\n",
"- [  3.1 Single Prediction element by element](#toc_15456_3.1)\n",
"- [  3.2 Single Prediction, vector](#toc_15456_3.2)\n",
"- [4 Compute Cost With Multiple Variables](#toc_15456_4)\n",
"- [5 Gradient Descent With Multiple Variables](#toc_15456_5)\n",
"- [  5.1 Compute Gradient with Multiple Variables](#toc_15456_5.1)\n",
"- [  5.2 Gradient Descent With Multiple Variables](#toc_15456_5.2)\n",
"- [6 Congratulations](#toc_15456_6)\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_1.1\"></a>\n",
"## 1.1 Goals\n",
"- Extend our regression model routines to support multiple features\n",
" - Extend data structures to support multiple features\n",
" - Rewrite prediction, cost and gradient routines to support multiple features\n",
" - Utilize NumPy `np.dot` to vectorize their implementations for speed and simplicity"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_1.2\"></a>\n",
"## 1.2 Tools\n",
"In this lab, we will make use of: \n",
"- NumPy, a popular library for scientific computing\n",
"- Matplotlib, a popular library for plotting data"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import copy, math # Python 'math' module provides mathematical functions on numbers.\n",
" # The 'copy' module creates real copies (clones) of objects.\n",
" \n",
"import numpy as np # Import the NumPy library under the alias 'np'\n",
"import matplotlib.pyplot as plt # Import matplotlib's pyplot module under the alias 'plt'\n",
"\n",
"plt.style.use('./deeplearning.mplstyle') # Use or import 'available style sheets', using its 'name',\n",
" # on a common set of example plots: scatter plot, image, \n",
" # bar graph, patches, line plot and histogram.\n",
" \n",
"np.set_printoptions(precision=2) # reduced display precision on numpy arrays\n",
" # These options determine the way floating point numbers, arrays \n",
" # and other NumPy objects are displayed. (Precision = integer or None \n",
" # defining the number of digits of precision for floating point output. \n",
" # default 8)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_1.3\"></a>\n",
"## 1.3 Notation\n",
"Here is a summary of some of the notation you will encounter, updated for multiple features. \n",
"\n",
"|General <img width=70/> <br /> Notation <img width=70/> | Description<img width=350/>| Python (if applicable) |\n",
"|:------------|:------------------------------------------------------------|:------------|\n",
"| $a$ | scalar, non bold ||\n",
"| $\\mathbf{a}$ | vector, bold ||\n",
"| $\\mathbf{A}$ | matrix, bold capital ||\n",
"| **Regression** | | |\n",
"| $\\mathbf{X}$ | training example matrix | `X_train` |\n",
"| $\\mathbf{y}$ | training example targets | `y_train` |\n",
"| $\\mathbf{x}^{(i)}$, $y^{(i)}$ | $i^{th}$ training example | `X[i]`, `y[i]`|\n",
"| m | number of training examples | `m`|\n",
"| n | number of features in each example | `n`|\n",
"| $\\mathbf{w}$ | parameter: weight | `w` |\n",
"| $b$ | parameter: bias | `b` |\n",
"| $f_{\\mathbf{w},b}(\\mathbf{x}^{(i)})$ | The result of the model evaluation at $\\mathbf{x^{(i)}}$ parameterized by $\\mathbf{w},b$: $f_{\\mathbf{w},b}(\\mathbf{x}^{(i)}) = \\mathbf{w} \\cdot \\mathbf{x}^{(i)}+b$ | `f_wb` |\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_2\"></a>\n",
"# 2 Problem Statement\n",
"\n",
"You will use the motivating example of housing price prediction. The training dataset contains three examples with four features (size, bedrooms, floors, and age) shown in the table below. Note that, unlike the earlier labs, size is in sqft rather than 1000 sqft. This causes an issue, which you will solve in the next lab!\n",
"\n",
"| Size (sqft) | Number of Bedrooms | Number of floors | Age of Home | Price (1000s dollars) | \n",
"| ----------------| ------------------- |----------------- |--------------|-------------- | \n",
"| 2104 | 5 | 1 | 45 | 460 | \n",
"| 1416 | 3 | 2 | 40 | 232 | \n",
"| 852 | 2 | 1 | 35 | 178 | \n",
"\n",
"You will build a linear regression model using these values so you can then predict the price of other houses - for example, a house with 1200 sqft, 3 bedrooms, 1 floor, that is 40 years old. \n",
"\n",
"Please run the following code cell to create your `X_train` and `y_train` variables."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"X_train =\n",
" Xj=0 Xj=1 Xj=2 Xj=3\n",
" [[2104 5 1 45]\n",
" [1416 3 2 40]\n",
" [ 852 2 1 35]]\n",
"\n",
"y_train =\n",
"yi=0 yi=1 yi=2\n",
" [460 232 178]\n"
]
}
],
"source": [
"# X_train = [X0=size (sqft), X1=Number of bedrooms, X2=Number of floors, X3=Age of Home]\n",
"# col0 col1 col2 col3\n",
"# X0 X1 X2 X3\n",
"X_train = np.array([[2104, 5, 1, 45], # row i = 0\n",
" [1416, 3, 2, 40], # row i = 1\n",
" [852, 2, 1, 35]]) # row i = 2\n",
"\n",
"print(f\"X_train =\\n\"+\" Xj=0 Xj=1 Xj=2 Xj=3\\n\",X_train) # Print X_train = [X0, X1, X2, X3] numpy 2D array\n",
"\n",
"# y_train = [y=Price (1000s dollars)]\n",
"# col0\n",
"# y\n",
"y_train = np.array([460, # row i = 0 \n",
" 232, # row i = 1\n",
" 178]) # row i = 2\n",
"\n",
"print(f\"\\ny_train =\\n\"+\"yi=0 yi=1 yi=2\\n\",y_train) # Print y_train = [y] numpy 1D array"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_2.1\"></a>\n",
"## 2.1 Matrix X containing our training set examples\n",
"Similar to the table above, examples are stored in a NumPy matrix `X_train`. Each row of the matrix represents one example. When you have $m$ training examples ( $m$ is three in our example), and there are $n$ features (four in our example), $\\mathbf{X}$ is a matrix with dimensions ($m$, $n$) (m rows, n columns).\n",
"\n",
"\n",
"$$\\mathbf{X} = \n",
"\\begin{pmatrix}\n",
" x^{(0)}_0 & x^{(0)}_1 & \\cdots & x^{(0)}_{n-1} \\\\ \n",
" x^{(1)}_0 & x^{(1)}_1 & \\cdots & x^{(1)}_{n-1} \\\\\n",
" \\cdots \\\\\n",
" x^{(m-1)}_0 & x^{(m-1)}_1 & \\cdots & x^{(m-1)}_{n-1} \n",
"\\end{pmatrix}\n",
"$$\n",
"notation:\n",
"- $\\mathbf{x}^{(i)}$ is a vector containing example i. $\\mathbf{x}^{(i)}$ $ = (x^{(i)}_0, x^{(i)}_1, \\cdots,x^{(i)}_{n-1})$\n",
"- $x^{(i)}_j$ is element j in example i. The superscript in parenthesis indicates the example number while the subscript represents an element. \n",
"\n",
"Display the input data."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"X Shape: (3, 4), X Type:<class 'numpy.ndarray'>\n",
"[[2104 5 1 45]\n",
" [1416 3 2 40]\n",
" [ 852 2 1 35]]\n",
"\n",
"y Shape: (3,), y Type:<class 'numpy.ndarray'>\n",
"[460 232 178]\n"
]
}
],
"source": [
"# X_train data was stored as 2D numpy array/matrix of size \n",
"# (m = 3 examples/rows, n = 4 features/cols) = [m = 3 x n = 4] \n",
"print(f\"X Shape: {X_train.shape}, X Type:{type(X_train)}\")\n",
"print(X_train) # Print 'X_train' 2D array/matrix \n",
"\n",
"# y_train data was stored as 1D numpy array/vector of size [1 row x 3 examples/elements]\n",
"print(f\"\\ny Shape: {y_train.shape}, y Type:{type(y_train)}\")\n",
"print(y_train) # Print 'y_train' 1D array/vector"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_2.2\"></a>\n",
"## 2.2 Parameter vector w, b\n",
"\n",
"* $\\mathbf{w}$ is a vector with $n$ elements.\n",
" - Each element contains the parameter associated with one feature.\n",
" - in our dataset, n is 4.\n",
" - notionally, we draw this as a column vector\n",
"\n",
"$$\\mathbf{w} = \\begin{pmatrix}\n",
"w_0 \\\\ \n",
"w_1 \\\\\n",
"\\cdots\\\\\n",
"w_{n-1}\n",
"\\end{pmatrix}\n",
"$$\n",
"* $b$ is a scalar parameter. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For demonstration, $\\mathbf{w}$ and $b$ will be loaded with some initial selected values that are near the optimal. $\\mathbf{w}$ is a 1-D NumPy vector."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"w_init shape: (4,), b_init type: <class 'float'>\n"
]
}
],
"source": [
"# Initialize b parameter as a scalar near the minimum\n",
"b_init = 785.1811367994083\n",
"\n",
"# Initialize w vector parameter with n=4 values, (each related with every input feature) \n",
"# near the minimum. - > w = [w0, w1, w2, w3]\n",
"w_init = np.array([ 0.39133535, 18.75376741, -53.36032453, -26.42131618])\n",
"\n",
"print(f\"w_init shape: {w_init.shape}, b_init type: {type(b_init)}\") # Print w size and b data type"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_3\"></a>\n",
"# 3 Model Prediction With Multiple Variables\n",
"The model's prediction with multiple variables is given by the linear model:\n",
"\n",
"$$ f_{\\mathbf{w},b}(\\mathbf{x}) = w_0x_0 + w_1x_1 +... + w_{n-1}x_{n-1} + b \\tag{1}$$\n",
"or in vector notation:\n",
"$$ f_{\\mathbf{w},b}(\\mathbf{x}) = \\mathbf{w} \\cdot \\mathbf{x} + b \\tag{2} $$ \n",
"where $\\cdot$ is a vector `dot product`\n",
"\n",
"To demonstrate the dot product, we will implement prediction using (1) and (2)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_3.1\"></a>\n",
"## 3.1 Single Prediction element by element\n",
"Our previous prediction multiplied one feature value by one parameter and added a bias parameter. A direct extension of that implementation to multiple features is to implement (1) above using a loop over the elements, multiplying each feature by its parameter and adding the bias parameter at the end.\n"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"def predict_single_loop(x, w, b): # Input selected 1D vector Xj^(î) = [x0, x1, x2, x3] from X_train matrix/2D array\n",
" # Input w = [w0, w1, w2, w3] 1D array of parameters (one per input feature xj)\n",
" # Input fixed b scalar parameter\n",
" \"\"\"\n",
" single predict using linear regression\n",
" \n",
" Input arguments:\n",
" x 1D (ndarray): Shape (n,) example with multiple features\n",
" w 1D (ndarray): Shape (n,) model parameters \n",
" b num (scalar): model parameter \n",
" \n",
" Output returned:\n",
" p num (scalar): y^ = fw_b(x) prediction\n",
" \"\"\"\n",
" n = x.shape[0] # Number of input features n in the example vector x \n",
" # x.shape = (n,) -> x.shape[0] = n \n",
" p = 0 # Init prediction counter as '0'\n",
" \n",
" for i in range(n): # n = number of features -> range(n) = i = 0, 1,..., n-1\n",
" p_i = x[i] * w[i] # updated_pred = wj * xj\n",
" p = p + p_i # new_pred = previous_pred + updated_pred\n",
" # (init as 0)\n",
" p = p + b # Overwrite new_pred = new_pred + b, out of for loop \n",
" return p # return p = w[0]*x[0] + ... + w[n-1]*x[n-1] + b"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"x_vec shape (4,), x_vec value: [2104 5 1 45]\n",
"f_wb shape (), prediction: 459.9999976194083\n"
]
}
],
"source": [
"# Select the first row/vector, from our training data -> X_train[row = 0, All cols/elements = :]\n",
"# x_vec = [2104, 5, 1, 45]\n",
"x_vec = X_train[0,:]\n",
"\n",
"# Print selected vector size/shape, and the vector/1D array elements\n",
"print(f\"x_vec shape {x_vec.shape}, x_vec value: {x_vec}\") \n",
"\n",
"# Call the previous function 'predict_single_loop()' and make a prediction y^=f_wb\n",
"f_wb = predict_single_loop(x_vec, w_init, b_init)\n",
"\n",
"# Print scalar prediction y^=f_wb size/shape, and its value\n",
"print(f\"f_wb shape {f_wb.shape}, prediction: {f_wb}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note the shape of `x_vec`. It is a 1-D NumPy vector with 4 elements, (4,). The result, `f_wb` is a scalar."
]
},
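Before moving on, it is worth a quick aside on why the vectorized version in the next section is preferred. The sketch below compares the element-by-element loop against NumPy's `np.dot` on a large random vector; the vector size, the seed, and the timing printout are illustrative choices, not part of the lab.

```python
import time

import numpy as np

rng = np.random.default_rng(1)   # seeded generator, illustrative choice
a = rng.random(1_000_000)
b = rng.random(1_000_000)

# Loop version: accumulate each per-element product, as in predict_single_loop
tic = time.time()
total = 0.0
for i in range(a.shape[0]):
    total += a[i] * b[i]
loop_seconds = time.time() - tic

# Vectorized version: a single call to np.dot
tic = time.time()
dotted = np.dot(a, b)
vec_seconds = time.time() - tic

print(f"loop: {loop_seconds:.4f} s, np.dot: {vec_seconds:.6f} s")
print(f"results agree: {np.isclose(total, dotted)}")
```

`np.dot` delegates to optimized compiled routines, which is why it is typically far faster than an explicit Python loop of the same arithmetic.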
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_3.2\"></a>\n",
"## 3.2 Single Prediction, vector\n",
"\n",
"Note that equation (1) above can be implemented using the dot product as in (2). We can make use of vector operations to speed up predictions.\n",
"\n",
"Recall from the Python/Numpy lab that NumPy `np.dot()`[[link](https://numpy.org/doc/stable/reference/generated/numpy.dot.html)] can be used to perform a vector dot product. "
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"def predict(x, w, b): # Input selected 1D vector Xj^(î) = [x0, x1, x2, x3] from X_train matrix/2D array\n",
" # Input w = [w0, w1, w2, w3] 1D array of parameters (one per input feature xj)\n",
" # Input fixed b scalar parameter\n",
" \"\"\"\n",
" single predict using linear regression\n",
" \n",
" Input arguments:\n",
" x 1D (ndarray): Shape (n,) example with multiple features\n",
" w 1D (ndarray): Shape (n,) model parameters \n",
" b num (scalar): model parameter \n",
" \n",
" Output returned:\n",
" p num (scalar): y^ = fw_b(x) prediction\n",
" \"\"\"\n",
" p = np.dot(x,w) + b # y^ = x.w + b = x[j].w[j] + b \n",
" return p # return prediction y^ "
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"x_vec shape (4,), x_vec value: [2104 5 1 45]\n",
"f_wb shape (), prediction: 459.99999761940825\n"
]
}
],
"source": [
"# Select the first row/vector, from our training data -> X_train[row = 0, All cols/elements = :]\n",
"# x_vec = [2104, 5, 1, 45]\n",
"x_vec = X_train[0,:]\n",
"\n",
"# Print selected vector size/shape, and the vector/1D array elements\n",
"print(f\"x_vec shape {x_vec.shape}, x_vec value: {x_vec}\") \n",
"\n",
"# Call the 'predict()' function defined above and make a prediction y^=f_wb\n",
"f_wb = predict(x_vec, w_init, b_init)\n",
"\n",
"# Print scalar prediction y^=f_wb size/shape, and its value\n",
"print(f\"f_wb shape {f_wb.shape}, prediction: {f_wb}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The results and shapes are the same as the previous version which used looping. Going forward, `np.dot` will be used for these operations. The prediction is now a single statement. Most routines will implement it directly rather than calling a separate predict routine."
]
},
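The example house from the problem statement (1200 sqft, 3 bedrooms, 1 floor, 40 years old) can be priced with the same dot-product prediction. A minimal sketch, assuming the near-optimal `w_init` and `b_init` from earlier (restated here so the snippet runs on its own); the predicted price is only a rough illustration given the tiny three-example dataset.

```python
import numpy as np

# Near-optimal parameters from earlier in the lab
b_init = 785.1811367994083
w_init = np.array([0.39133535, 18.75376741, -53.36032453, -26.42131618])

# Example house from the problem statement: size, bedrooms, floors, age
x_house = np.array([1200, 3, 1, 40])

# y^ = w . x + b, equation (2)
price = np.dot(x_house, w_init) + b_init
print(f"predicted price: {price:0.2f} thousand dollars")
```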
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_4\"></a>\n",
"# 4 Compute Cost With Multiple Variables\n",
"The equation for the cost function with multiple variables $J(\\mathbf{w},b)$ is:\n",
"$$J(\\mathbf{w},b) = \\frac{1}{2m} \\sum\\limits_{i = 0}^{m-1} (f_{\\mathbf{w},b}(\\mathbf{x}^{(i)}) - y^{(i)})^2 \\tag{3}$$ \n",
"where:\n",
"$$ f_{\\mathbf{w},b}(\\mathbf{x}^{(i)}) = \\mathbf{w} \\cdot \\mathbf{x}^{(i)} + b \\tag{4} $$ \n",
"\n",
"\n",
"In contrast to previous labs, $\\mathbf{w}$ and $\\mathbf{x}^{(i)}$ are vectors rather than scalars, supporting multiple features."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Below is an implementation of equations (3) and (4). Note that this uses a *standard pattern for this course* where a for loop over all `m` examples is used."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"def compute_cost(X, y, w, b): # Input X 2D array/Matrix\n",
" # Input y 1D array/vector\n",
" # Input w parameter 1D array/vector\n",
" # Input b parameter fixed scalar\n",
" \"\"\"\n",
" compute cost\n",
" Args:\n",
" X (ndarray (m,n)): Data, m examples with n features\n",
" y (ndarray (m,)) : target values\n",
" w (ndarray (n,)) : model parameters \n",
" b (scalar) : model parameter\n",
" \n",
" Returns:\n",
" cost (scalar): cost\n",
" \"\"\"\n",
" m = X.shape[0] # Number of samples/rows m in the 'X_train' \n",
" # 2D numpy array/matrix, X.shape = (m, n) -> X.shape[0] = m\n",
" cost = 0.0 # Init the value of counter cost = J = 0.0\n",
" \n",
" for i in range(m): # Iterate through each ith row/training sample/vector X[i] = [X0, X1, X2, X3] with \n",
" # range(m) = i = [0, 1, 2, ..., m-1]\n",
" \n",
" f_wb_i = np.dot(X[i], w) + b # X[i] = [X0, X1, X2, X3] each 1D vector is selected, with n = 4 features \n",
" # w = [w0, w1, w2, w3], 1D parameters vector, with n = 4 related features \n",
" # y^[i] = f_wb(x) = X[i] * w + b -> (n=4,)(n=4,) + scalar = scalar + scalar\n",
" # y^[i] = f_wb(x) = X[i]0*w0 + X[i]1*w1 + X[i]2*w2 + X[i]3*w3 + b\n",
" \n",
" cost = cost + (f_wb_i - y[i]) ** 2 # cost_new = cost_prev + (y^[i] - y[i])^2 -> scalar + (scalar - scalar)\n",
" # (init as 0)\n",
" cost = cost / (2 * m) # cost_new = cost_new / (2*m) -> scalar / (scalar) \n",
" return cost # return cost = J(w,b), a scalar: the cost evaluated at the given parameters w, b"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Cost at optimal w : 1.5578904880036537e-12\n"
]
}
],
"source": [
"# Compute and display cost = J(w,b) using our pre-chosen optimal parameters \n",
"# w = w_init = [0.39133535, 18.75376741, -53.36032453, -26.42131618].\n",
"# b = b_init = 785.1811367994083\n",
"# X_train = [2104 5 1 45\n",
"# 1416 3 2 40\n",
"# 852 2 1 35] is a 2D array/matrix. \n",
"# y_train = [460 232 178] is a 1D array/vector\n",
"cost = compute_cost(X_train, y_train, w_init, b_init)\n",
"\n",
"# Print J(w,b) = computed cost value at w = w_init, before gradient descent\n",
"print(f'Cost at optimal w : {cost}') "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Expected Result**: Cost at optimal w : 1.5578904045996674e-12"
]
},
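For comparison, the loop over examples in `compute_cost` can itself be vectorized with a single matrix-vector product. A sketch, assuming the `X_train`, `y_train`, `w_init`, and `b_init` values from earlier cells (restated so it runs standalone); it is an alternative to, not a replacement for, the course's standard looped pattern.

```python
import numpy as np

X_train = np.array([[2104, 5, 1, 45],
                    [1416, 3, 2, 40],
                    [852,  2, 1, 35]])
y_train = np.array([460, 232, 178])
b_init = 785.1811367994083
w_init = np.array([0.39133535, 18.75376741, -53.36032453, -26.42131618])

def compute_cost_vectorized(X, y, w, b):
    # (m,n) @ (n,) -> (m,): predictions for all m examples at once
    err = X @ w + b - y                        # per-example error f_wb(x^(i)) - y^(i)
    return np.dot(err, err) / (2 * X.shape[0]) # (1/2m) * sum of squared errors, equation (3)

cost = compute_cost_vectorized(X_train, y_train, w_init, b_init)
print(f"Cost at optimal w : {cost}")
```

The result matches the looped implementation up to floating-point rounding.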
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_5\"></a>\n",
"# 5 Gradient Descent With Multiple Variables\n",
"Gradient descent for multiple variables:\n",
"\n",
"$$\\begin{align*} \\text{repeat}&\\text{ until convergence:} \\; \\lbrace \\newline\\;\n",
"& w_j = w_j - \\alpha \\frac{\\partial J(\\mathbf{w},b)}{\\partial w_j} \\tag{5} \\; & \\text{for j = 0..n-1}\\newline\n",
"&b\\ \\ = b - \\alpha \\frac{\\partial J(\\mathbf{w},b)}{\\partial b} \\newline \\rbrace\n",
"\\end{align*}$$\n",
"\n",
"where, n is the number of features, parameters $w_j$, $b$, are updated simultaneously and where \n",
"\n",
"$$\n",
"\\begin{align}\n",
"\\frac{\\partial J(\\mathbf{w},b)}{\\partial w_j} &= \\frac{1}{m} \\sum\\limits_{i = 0}^{m-1} (f_{\\mathbf{w},b}(\\mathbf{x}^{(i)}) - y^{(i)})x_{j}^{(i)} \\tag{6} \\\\\n",
"\\frac{\\partial J(\\mathbf{w},b)}{\\partial b} &= \\frac{1}{m} \\sum\\limits_{i = 0}^{m-1} (f_{\\mathbf{w},b}(\\mathbf{x}^{(i)}) - y^{(i)}) \\tag{7}\n",
"\\end{align}\n",
"$$\n",
"* m is the number of training examples in the data set\n",
"\n",
" \n",
"* $f_{\\mathbf{w},b}(\\mathbf{x}^{(i)})$ is the model's prediction, while $y^{(i)}$ is the target value\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_5.1\"></a>\n",
"## 5.1 Compute Gradient with Multiple Variables\n",
"An implementation for calculating equations (6) and (7) is below. There are many ways to implement this. In this version, there is an\n",
"- outer loop over all m examples. \n",
" - $\\frac{\\partial J(\\mathbf{w},b)}{\\partial b}$ for the example can be computed directly and accumulated\n",
" - in a second loop over all n features:\n",
" - $\\frac{\\partial J(\\mathbf{w},b)}{\\partial w_j}$ is computed for each $w_j$.\n",
" "
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
"def compute_gradient(X, y, w, b): # Inputs X_train 2D array/matrix\n",
" # Input y 1D array/vector\n",
" # Input w = w_init 1D array/vector\n",
" # Input b = b_init fixed scalar\n",
" \"\"\"\n",
" Computes the gradient for linear regression \n",
" Args:\n",
" X (ndarray (m,n)): Data, m examples with n features\n",
" y (ndarray (m,)) : target values\n",
" w (ndarray (n,)) : model parameters \n",
" b (scalar) : model parameter\n",
" \n",
" Returns:\n",
" dj_dw (ndarray (n,)): The gradient of the cost w.r.t. the parameters w. \n",
" dj_db (scalar): The gradient of the cost w.r.t. the parameter b. \n",
" \"\"\"\n",
" m,n = X.shape # X.shape = (m,n) -> number of examples m, number of features n\n",
" dj_dw = np.zeros((n,)) # init djdw = [0, 0, 0, 0] 1D array/vector\n",
" dj_db = 0. # init djdb = 0 scalar\n",
"\n",
" for i in range(m): # Iterate rows i = 0 ... m-1 \n",
" err = (np.dot(X[i], w) + b) - y[i] # (X[i].w + b) - y[i] = y^[i] - y[i]\n",
" for j in range(n): # Each i-th row, iterate through cols j = 0 ... n-1 \n",
" # X[i, j] = X[i][j] ----> X_train = i [2104 5 1 45\n",
" # 1416 3 2 40\n",
" # 852 2 1 35] \n",
" dj_dw[j] = dj_dw[j] + err * X[i, j] # new_djdw = prev_djdw + (y^[i] - y[i])*X[i][j] \n",
" # (init as 0) \n",
" dj_db = dj_db + err # new_djdb = prev_djdb + (y^[i] - y[i])\n",
" # (init as 0)\n",
" dj_dw = dj_dw / m # new_djdw = new_djdw / m \n",
" dj_db = dj_db / m # new_djdb = new_djdb / m \n",
" \n",
" return dj_db, dj_dw # return dj_db = new_djdb, dj_dw = new_djdw "
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"dj_db at initial w,b: -1.673925169143331e-06\n",
"dj_dw at initial w,b: \n",
" [-2.73e-03 -6.27e-06 -2.22e-06 -6.92e-05]\n"
]
}
],
"source": [
"# Call 'compute_gradient()' defined above. \n",
"# Input X_train 2D array/Matrix \n",
"# Input y_train 1D array/vector \n",
"# Input w = w_init 1D array/vector \n",
"# Input b = b_init scalar/number \n",
"tmp_dj_db, tmp_dj_dw = compute_gradient(X_train, y_train, w_init, b_init)\n",
"\n",
"# Then display returned gradient as a number/scalar dj_db\n",
"print(f'dj_db at initial w,b: {tmp_dj_db}')\n",
"\n",
"# and display returned gradient as 1D array/vector dj_dw\n",
"print(f'dj_dw at initial w,b: \\n {tmp_dj_dw}')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Expected Result**: \n",
"dj_db at initial w,b: -1.6739251122999121e-06 \n",
"dj_dw at initial w,b: \n",
" [-2.73e-03 -6.27e-06 -2.22e-06 -6.92e-05] "
]
},
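The double loop in `compute_gradient` can likewise be collapsed into matrix operations: the error vector is computed for all examples at once, and `X.T @ err` accumulates equation (6) for every $w_j$ simultaneously. A sketch under the same assumptions as earlier cells (inputs restated so it runs standalone); this is an optional alternative to the course's looped version.

```python
import numpy as np

X_train = np.array([[2104, 5, 1, 45],
                    [1416, 3, 2, 40],
                    [852,  2, 1, 35]])
y_train = np.array([460, 232, 178])
b_init = 785.1811367994083
w_init = np.array([0.39133535, 18.75376741, -53.36032453, -26.42131618])

def compute_gradient_vectorized(X, y, w, b):
    m = X.shape[0]
    err = X @ w + b - y    # (m,) per-example error f_wb(x^(i)) - y^(i)
    dj_dw = X.T @ err / m  # (n,m) @ (m,) -> (n,), equation (6) for all w_j at once
    dj_db = err.mean()     # equation (7)
    return dj_db, dj_dw

dj_db, dj_dw = compute_gradient_vectorized(X_train, y_train, w_init, b_init)
print(f"dj_db at initial w,b: {dj_db}")
print(f"dj_dw at initial w,b: \n {dj_dw}")
```

The outputs agree with the expected results shown above for the looped implementation.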
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a name=\"toc_15456_5.2\"></a>\n",
"## 5.2 Gradient Descent With Multiple Variables\n",
"The routine below implements equation (5) above."
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
"def gradient_descent(X, y, w_in, b_in, cost_function, gradient_function, alpha, num_iters):\n",
" \n",
" # Input X_train 2D array/matrix\n",
" # Input y_train 1D array/vector\n",
" # Input w = w_init parameter 1D array/vector\n",
" # Input b = b_init fixed parameter scalar\n",
" # Input 'cost_function' name = 'compute_cost' function name\n",
" # Input 'gradient_function' name = 'compute_gradient' function name\n",
" # Input Learning Rate = alpha constant\n",
" # Input num_iters = number of times GD loop is executed\n",
" \n",
" \"\"\"\n",
" Performs batch (all samples) gradient descent to learn w and b. Updates w and b by taking \n",
" num_iters gradient steps with learning rate alpha\n",
" \n",
" Input Args:\n",
" X (ndarray (m,n)) : Data, m examples with n features\n",
" y (ndarray (m,)) : target values\n",
" w_in (ndarray (n,)) : initial model parameters \n",
" b_in (scalar) : initial model parameter\n",
" cost_function : function to compute cost\n",
" gradient_function : function to compute the gradient\n",
" alpha (float) : Learning rate\n",
" num_iters (int) : number of iterations to run gradient descent\n",
" \n",
" Outputs Returned:\n",
" w (ndarray (n,)) : Final Updated 1D array/vector with values or parameters \n",
" b (scalar) : Final Updated scalar value or parameter \n",
" \"\"\"\n",
" \n",
" # An array to store cost J(wj,b) and wj's, at each iteration primarily, for graphing later\n",
" J_history = [] # Empty array to be filled with J(wj,b)^(i) COST values, each iter\n",
" w = copy.deepcopy(w_in) # Avoid modifying global w within function. A copy of w_in is copied into w.\n",
" # It means that any change made to w (a copy of w_in) does not affect the original w_in vector.\n",
" b = b_in # First b parameter value = initial parameter b_init\n",
" \n",
" for i in range(num_iters): # GD loop is executed the total number of iters i = 0, 1, 2, ..., N-1\n",
"\n",
" # Calculate gradients/slopes/derivative terms and update the parameters w and b, \n",
" # using JUST the name of the function 'gradient_function(X,y,w,b)' <- 'compute_gradient(X,y,w,b)'\n",
" dj_db,dj_dw = gradient_function(X, y, w, b) \n",
"\n",
" # Update Parameters wj and b simultaneously (one after another), \n",
" # using previous GRADIENT DESCENT equations at (5) above\n",
" w = w - alpha * dj_dw # updated_w = actual_w - learning_rate * dJ(wj,b) / dwj\n",
" b = b - alpha * dj_db # updated_b = actual_b - learning_rate * dJ(wj,b) / db\n",
" \n",
" # Save cost J(wj,b)^(i) and [wj],b parameters, at each i-th GD iteration\n",
" if i<100000: # Append cost and parameter lists while total iters < 100,000 (prevent resource exhaustion)\n",
" \n",
" J_history.append(cost_function(X, y, w, b)) # Calculate 'cost_function(X,y,w,b)' <- input 'compute_cost(X,y,w,b)'\n",
" # every GD loop iteration, then append each cost result per iter \n",
" # into 'J_history []' list\n",
"\n",
" # Python 'math.ceil()' method, round a number upward to its nearest integer\n",
" # Print 10 cost values -> step = round (total_iters/10). It begins from 0 until (total iters - step)\n",
" if i% math.ceil(num_iters / 10) == 0: # With i in [0, 999] and step = 100, iteration 1000 is never printed\n",
" \n",
" print(f\"Iteration {i:4d}: Cost {J_history[-1]:8.2f} \") # Print as f\"\" full string\n",
" # Iteration number is right-aligned to 4 digits using {i:4d}\n",
" # Select each whole Cost value with [-1] at 'J_history [] list',\n",
" # and display 8 characters in total, but 2 of them are decimals \n",
" # 'xxxxx.xx', using {J_history[-1]:8.2f}\n",
" \n",
" return w, b, J_history # return [wj_final vector], b_final scalar, [Cost J appended list] for graphing"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In the next cell you will test the implementation. "
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Iteration 0: Cost 2529.46 \n",
"Iteration 100: Cost 695.99 \n",
"Iteration 200: Cost 694.92 \n",
"Iteration 300: Cost 693.86 \n",
"Iteration 400: Cost 692.81 \n",
"Iteration 500: Cost 691.77 \n",
"Iteration 600: Cost 690.73 \n",
"Iteration 700: Cost 689.71 \n",
"Iteration 800: Cost 688.70 \n",
"Iteration 900: Cost 687.69 \n",
"b,w found by gradient descent: -0.00,[ 0.2 0. -0.01 -0.07] \n",
"prediction: 426.19, target value: 460\n",
"prediction: 286.17, target value: 232\n",
"prediction: 171.47, target value: 178\n"
]
}
],
"source": [
"# Initialize parameters\n",
"initial_w = np.zeros_like(w_init) # Create a numpy array with the same size of its input array w_init, but ALL zeros.\n",
" # w_init = [w0, w1, w2, w3] is a 1D array related to (4) Xj input features, \n",
" # so it has n = 4 elements in total (j = 0, 1, 2, 3).\n",
" # It means initial_w = [0, 0, 0, 0]\n",
"initial_b = 0. # initial_b = 0.\n",
"\n",
"# Some gradient descent settings\n",
"iterations = 1000 # Set total GD iterations = 1000\n",
"alpha = 5.0e-7 # learning_rate alpha = 0.0000005\n",
"\n",
"# Run gradient descent executing 'gradient_descent()' function, which receive inputs\n",
"# X_train, y_train, initial_w, initial_b, compute_cost (function name), \n",
"# compute_gradient (function name), learning_Rate, total_GD_iters\n",
"# outputs final updated 1D array w, final updated scalar b, [list of Cost values per GD iteration]\n",
"# print 10 GD iterations with the Cost value, each 100 steps\n",
"w_final, b_final, J_hist = gradient_descent(X_train, y_train, initial_w, initial_b,\n",
" compute_cost, compute_gradient, \n",
" alpha, iterations)\n",
"\n",
"# Print out b final scalar with 2 decimals, doing {b_final:0.2f} -> x.xx\n",
"# Print out w final vector with 4 elements in total. Doesn't specify decimals used {w_final}\n",
"print(f\"b,w found by gradient descent: {b_final:0.2f},{w_final} \")\n",
"\n",
"# X_train = [m rows/samples x n cols/features] = (m = 3 rows, n = 4 cols) shape\n",
"# Select the total number of rows/samples/vectors\n",
"m,_ = X_train.shape # m = 3 rows/samples/vectors\n",
"\n",
"for i in range(m): # Iterate through each row/vector/training sample with i = 0, 1, 2\n",
" # for accessing to each X_train[i] vector, and obtain prediction \n",
" # y^ = X[i].w_final = X0w0 + X1w1 + X2w2 + X3w3 \n",
" # Also access to each y_train[i] target element\n",
" # Then print each of 3 predictions y^, one per each training sample/row (m=3)\n",
" print(f\"prediction: {np.dot(X_train[i], w_final) + b_final:0.2f}, target value: {y_train[i]}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Expected Result**: \n",
"b,w found by gradient descent: -0.00,[ 0.2 0. -0.01 -0.07] \n",
"prediction: 426.19, target value: 460 \n",
"prediction: 286.17, target value: 232 \n",
"prediction: 171.47, target value: 178 "
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAA2gAAAEoCAYAAAAt0dJ4AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOzde3iMd/rH8fckkUgyERFnUUQQIohDixZVdEvZlghC67ilXVW7bLFUJU7bbX+12tKW2mK3jo2qRvVAD6Lt0jokWSl1DFWnIERIyOH5/ZE1NSZqJJPJ6fO6rlyXPPfMM/ejvebrfr7f5/6aDMMwEBERERERkWLnUtwJiIiIiIiISB4VaCIiIiIiIiWECjQREREREZESQgWaiIiIiIhICaECTUREREREpIRQgSYiIiIiIlJCqEATKaO+/vprTCYTX3/9dXGncteioqIwmUzFnYaIiPwGjTOFN2PGDBo3bkxOTo5Dz5vf9T344IM8+OCDlt/T09Px9/dnyZIlDv1sKTwVaFJqXbx4kejoaMLCwqhUqRIVK1YkKCiIp556ij179hTZ57777rvMnz+/yM5flL788kuioqK4ePFicafCkSNHiIqKIj4+vrhTERHJl8aZu6dxxn5nz55l3rx5TJs2DVdXV8vxefPmsWzZsiL/fLPZzHPPPUdUVBQZGRlF/nliP5M2qpbSKCkpiZ49e3Lq1CkiIiLo2LEjFStW5ODBg7z//vskJydz/PhxAgICHP7ZDzzwACdOnCA5Odnh53ak3Nxcrl+/jru7Oy4uefdiXnjhBebMmcPRo0epX79+sea3ZcsWevTowdKlSxk+fLhVLDs7m+zsbCpWrFg8yYlIuadx5s40zhTO9OnTWbhwIadPn8bd3d1yPCAggKCgoELNTOZ3fTdmz24+75kzZ6hTpw4LFy5kzJgxBf48cSy34k5A5G6lp6fz+9//nitXrrBjxw5at25tFZ8zZw6vvPIK5f3eg4uLi9MGHsMwuHbtmsM+z83NDTc3fT2JSPHQOGMfjTMFl5OTw9KlS+nfv79VceYo9l5fjRo16Nq1K0uWLFGBVpIYIqXMq6++agDGu+++a/d7Tpw4YQwbNsyoXr264e7ubjRt2tSYN2+ekZuba/W6PXv2GL179zaqV69ueHh4GHXr1jXCw8ONY8eOGYZhGPXq1TMAm5/bmT9/vgEYP/zwg01s69atBmAsWrTIMAzDyM7ONubOnWs0adLE8PT0NPz8/IzWrVsbCxYssPs6b/bVV18ZgPHVV18ZhmEYw4YNyzf3G3HDMIz4+Hjj8ccfN6pUqWJ4eHgYoaGhNn/PR48eNQBj2rRpxtKlS41mzZoZFSpUMJYuXWoYhmEsW7bMePjhh41atWoZFSpUMOrWrWuMGzfOSEtLs5xj6dKl+eYyY8YMwzAMY8aMGfn+vW7atMno2LGj4eXlZfj4+BgPP/ywsX379tvmFxMTYzRv3tzw8PAwGjVqZKxZs6ZAf5ciUr5onLGPxpmCjzPffPONARjvv/++1fH8cq5Xr55hGIZx7do1Y8aMGca9995r+ftr1qyZ8dprr9n8f5bf9XXp0sXo0qWLTS6vvPKKAVj+H5Tip1vUUuqsX78eDw8PBg8ebNfrz58/T8eOHTl9+jRjx44lMDCQjRs3MmHCBA4fPsyCBQsASElJoXv37lSuXJkJEyZQtWpVTp48yeeff86JEye45557mD9/PpMnT+bChQv84x//uONnDxo0iIkTJ7JixQratm1rFVu5ciXu7u70798fgJkzZzJz5kyGDx/OhAkTyMzM5Mcff2Tbtm2MHTv2Lv+WbI0ZM4aLFy+yYcMG/vGPf1C1alUAmjZtCsB3331Hjx49aNiwIZMmTcLHx4fY2FhGjhzJuXPneP75563Ot2HDBs6fP88zzzxDtWrVCA4OBuD1118nKCiI8ePH4+fnx+7du1m0aBH//e9/+eqrrw
Do3LkzU6ZM4aWXXmL06NF06tQJgBYtWtw2/7Vr1zJo0CCaNGlCVFQU169f5+2336ZLly588cUX3H///Vav//TTT1m6dClPP/00lStXZvHixURGRtKqVSsaN25c6L9PESm7NM4UjMYZ+8eZbdu2AdCuXTur4//+97957rnnqFGjBtOmTQPynhUDSEtLY8GCBURERDBkyBBMJhOff/4548ePJzU1lRkzZvzmZ97OfffdZ8lpyJAhBTqHOFhxV4gid6tKlSpGixYt7H79888/bwBGTEyM5Vhubq7Rt29fAzASExMNwzCMDz/80ACM77///jfPd//991vuZtnjxl2+nJwcy7Hr168b/v7+xmOPPWY51qpVK6NXr152n/dObr2zaRiGMW3aNAMwjh49avXa3Nxco1mzZkb79u2NrKwsq1jfvn0NLy8v4+LFi4Zh/Hrn0MPDw0hOTrb53PT0dJtjy5YtMwDjP//5j+XY5s2bDcByR/Rmt975y8rKMmrVqmUEBAQYqampluM///yzYTabjTZt2liO3cjP29vb+Pnnny3HT548abi7uxvPP/+8zeeJiNxM44x9NM4UfJwZOnSo4erqajPzZRiGUadOnXxnurKzs43MzEyb48OHDzd8fHyMa9eu3fb6DOP2M2gnTpwwAGPq1Kl3zFucQ10cpdRJS0ujUqVKdr/+o48+IigoiPDwcMsxk8lkuVMXGxsLgK+vr+X369evOyzfIUOGcOrUKctdPYDPPvuM8+fPW92p8vX1JSkpiX379jnss+313//+lx9//JHBgwdz8eJFzp07Z/np1asXV69eZfv27Vbv6dWrF/Xq1bM5l7e3N5D38PilS5c4d+4cnTt3BuCHH34oUH47d+7k1KlTjBkzhsqVK1uOBwQEMHjwYHbt2sXJkyet3vP73//e6uH9WrVqERwczOHDhwuUg4iUHxpnHE/jjLWUlBQqV658V63+XV1d8fDwAPKagKSmpnLu3Dm6devG5cuX+emnn+w+1838/f0BOHfuXIHeL46nAk1KnUqVKpGWlmb365OTky3LIm7WrFkzAI4ePQpAly5dGDBgALNmzcLf359evXrx+uuvF/oLq2/fvnh6erJy5UrLsZUrV+Lj40Pv3r0tx2bNmsXly5dp1qwZzZo149lnn7UabIvS/v37AXjuueeoVq2a1c9TTz0F5LUDvlmDBg3yPdf3339Pjx498Pb2pnLlylSrVo3AwEAAUlNTC5TfjU5mN5bJ3OzW/4435Deo+/n5ceHChQLlICLlh8YZx9M44xjvvfceYWFhVKxYkSpVqlCtWjWefPJJoODXnpub68gUxQFUoEmp06xZM3766ScyMzPtfs9v3aG6ETOZTKxZs4adO3cyadIkrly5woQJEwgODi7UHio+Pj706dOHDz74gGvXrnHlyhU++ugj+vXrh6enp+V1nTp14siRI6xYsYL27duzfv16HnroIUaNGlXgz7bXjS/nqKgoNm/enO9P9+7drd5zc+43JCcn07VrV06ePMnLL7/MRx99xObNm/n000+tPqeg8vvvaPyvi9qtsZv3lMnv9SIit6NxxvE0zlirVq0aFy9evKsx6f333+fJJ5+kVq1aLF68mI8//pjNmzfz97//HSj4td8o7G48MyjFT01CpNR5/PHH+eabb1i5ciUjR4684+vr16+f73KOG8du3aelTZs2tGnThunTp5OYmEibNm14+eWXLXcm72Y5wg1Dhgxh7dq1bNq0iYyMDK5cuZLvg7i+vr4MHjyYwYMHk52dzbBhw3j33XeZPHmyQxpb3C73oKAgIG8wvHWAvBsbNmzg6tWrfPzxx1Z/r/ktu7ibv8cb5/rxxx/p16+fVezGXdni3m9HRMoOjTMFp3HGPk2bNiUnJ4djx47ZnPd2ea9cuZIGDRqwceNGy75zQKGX7t94/42ZQil+mkGTUmf06NHUr1+f559/nj179tjEs7Oz+fvf/86JEycA6NOnD4cOHWL9+v
WW1xiGwf/93/9Z4pB3B+nWO1lNmzbF09PTarmC2Wzm4sWLd5Vzz549qVKlCitXrmTlypXUrFmThx56yOo158+ft/rdzc2N0NBQAKvP379/P8ePH7+rz785d7BdBtG6dWuaNGnCa6+9lu9Sm5SUFLvOf+Nu4q138V566SW7c8lP27ZtLXcMb152dPLkSVasWEHr1q2pXbu2XTmKiNyJxhmNMzcU1Thzo6tkfs/Mmc3mfHO+ce03/z+UkZHB66+/XqhcduzYAeRtkC4lg2bQpNTx8fHho48+omfPntx3331ERETQsWNHKlasyKFDh4iJieHIkSM88cQTAEyZMoW1a9cSGRlpaX/88ccf88knnzB27FjL4LR8+XLeeOMN+vbtS1BQENnZ2axevZrLly9b3YVs164dn376KePHj+e+++7DxcWFQYMG/WbOFSpUICIiguXLl5OTk8PYsWNtlkY0bdqUBx54gHbt2lGjRg1++uknFixYQJMmTWjTpo3V67p06cLXX3991393N9r5Tp06lcjISNzd3XnooYeoXr06S5cu5eGHH6ZZs2aMGjWKwMBAzp07x549e9iwYQPXrl274/kfeeQRKlasSK9evRgzZgyurq7Exsba/KMAICQkBC8vL9566y3MZjM+Pj40b96c5s2b27zWzc2N+fPnM2jQINq3b8/IkSMt7Y+zsrJ47bXX7vrvQkTkdjTOaJwp6nGmffv21K5dm88//5yIiAirWLt27XjvvfeIjo6mcePGmM1m+vTpQ9++fVm3bh29evWiX79+XLhwgWXLllkK0YL69NNPadOmTb7P1EkxKa72kSKFdeHCBePFF180WrZsaXh7exvu7u5Gw4YNjdGjRxsJCQlWrz1x4oQxdOhQo2rVqoa7u7sRHBxsvPrqq1btbXfv3m0MGTLEqF+/vlGxYkWjSpUqxgMPPGB88MEHVue6dOmSERkZafj5+Rkmk+k3NxC9WVxcnGXTyfxaLM+dO9fo0KGD4e/vb3h4eBiBgYHGuHHjjNOnT1u9Dsi3Te6t8mt/bBh5rXfr1KljuLi42MT37dtnDBkyxKhZs6ZRoUIFo3bt2kb37t2NhQsXWl5z8wad+dmyZYtx7733Gl5eXkbVqlWNYcOGGWfPnrXaIPSGmJgYIyQkxKhQoYJdG4h+/PHHRseOHQ1PT0/DbDYbPXr0sGqpfKf8btdiWEQkPxpnfpvGmcKNM9OmTTP8/Pys2uMbRt7/S7169TJ8fHysNqo2DMN4/fXXjUaNGhkeHh5G/fr1jejoaMt2Ajf/PdvbZv/06dOGq6ur8dZbb9mVsziHyTD0xLyIiIiIiDOdPn2aoKAgFi5cyLBhw4olh+joaBYvXszBgwfx8vIqlhzElp5BExERERFxspo1azJx4kTmzp1LTk6O0z8/PT2d119/naioKBVnJYxm0EREREREREoIzaCJiIiIiIiUECrQRERERERESohy02b/0qVLxZ2CiIgUEV9f3+JOoUhpDBMRKZvyG780gyYiIiIiIlJCqEATEREREREpIcrNEseblfWlMCIi5UF5XfanMUxEpHS70/ilGTQREREREZESQgWaiIiIiIhICaECTUREREREpIRQgSYiIuIAq1evpmnTpnh7e9OwYUO2bdsGwJIlSwgKCsJsNvPII49w8uRJq/ft3r2bzp07YzabqVGjBq+99lpxpC8iIiWECjQREZFC2rx5M5MnT2bp0qVcvnyZuLg4AgMD2bp1K1OnTmXDhg1cuHCBBg0aEBkZaXnfuXPneOSRRxgzZgznz5/n0KFDPPzww8V4JSIiUtxMhmEYxZ2EM9zcLUUdsERESr+S9L3esWNHRo0axahRo6yO/+UvfyEjI4OFCxcCcPLkSerUqcOhQ4do2LAhU6dO5eeff+bf//73b57fUde64wzU8IT6lQp8ChERKaQ7fadrBk1ERKQQcnJy2LlzJykpKQQFBREQEMCzzz5LRkYGhmFw833QG3/eu3cvAN
u3b6dKlSp07NiR6tWr06dPH44fP14keV7JgoGbIWQNvBoP2blF8jEiIlJITivQrl27xqhRo6hXrx4+Pj6EhYXxySefAJCcnIzJZMJsNlt+Zs2aZXmvYRhMnjwZf39//P39mTRpktWAl5ycTNeuXfHy8iI4OJgtW7Y4PP9HP4agFdDwfz8/pTr8I0REpBQ6c+YMWVlZxMTEsG3bNuLj49mzZw+zZ8+mV69erF27lsTERDIyMpg5cyYmk4mrV68CcOLECZYvX85rr73G8ePHbZZAOtKMH+DYZbiaDX/5D9y7DnalFMlHiYhIITitQMvOzqZu3bps3bqVS5cuMWvWLAYMGEBycrLlNRcvXiQ9PZ309HSmT59uOb548WI+/PBDEhISSExMZOPGjSxatMgSj4yMJCwsjPPnzzNnzhz69+9PSopjR53j6XA4DY787+e67jyKiAjg6ekJwLhx46hVqxZVq1ZlwoQJbNq0iW7duhEdHU14eDj16tWjfv36+Pj4EBAQYHlv3759adeuHRUrVmTGjBl89913Dt+Ee18qzE+0PrbnXF6R9udvIT3LoR8nIiKF4LQCzdvbm6ioKOrXr4+Liwu9e/emQYMG7Nq1647vXb58ORMnTiQgIIA6deowceJEli1bBsCBAwfYvXs30dHReHp6Eh4eTmhoKOvWrSvS6ykXD+6JiMgd+fn5ERAQgMlkyjc+duxYDh48yNmzZwkPDyc7O5vmzZsD0KJFC6v33fizox8Pb1IZFnWByu7Wx3ONvMItZDVsTHboR4qISAEV2zNoZ86c4cCBA4SEhFiO1atXj4CAAEaMGMG5c+csx5OSkmjZsqXl95YtW5KUlGSJBQYG4uPjk2/cUfIfdkVERGDEiBG88cYbnD17ltTUVObPn0/v3r3JzMxk7969GIbB8ePHGT16NOPHj8fPz8/yvvXr1xMfH09WVhazZs3igQceoHLlyg7Nz8UEo5rC/kiIDLKNH0+HPp9AxGdw6opDP1pERO5SsRRoWVlZDBkyhGHDhhEcHEzVqlX54YcfOHbsGLt27eLy5csMGTLE8vr09HSrDie+vr6kp6djGIZN7Eb88uXLRXoN5aP3pYiI2GP69Om0a9eOxo0b07RpU8LCwpg2bRqZmZkMHjwYs9nMvffeS4cOHayesX7ooYeYO3cujz76KNWrV+fQoUOsXLmyyPKs4QUre8Anj0J9H9t4zBEIXg1vJ+XNromIiPO5OfsDc3NzefLJJ3F3d2fBggUAmM1m2rZtC0CNGjVYsGABtWrVIi0tjUqVKmE2m0lLS7OcIy0tDbPZbGkscnPsRvzmGTVHuM3KFRERESpUqMCbb77Jm2++aXW8YsWKJCYm3uZdeZ555hmeeeaZokzPxiP3wN6BEPUD/CMRcm4qxtKuwzNx8O8DsLgLhFRxamoiIuWeU2fQDMNg1KhRnDlzhnXr1lGhQoV8X3frGvyQkBASEhIs8YSEBMvSyJCQEI4cOWI1Y3ZzvKjoxqKIiJRm3hXglY6wsz+0rWYb/+40hL0PL+yAzGzn5yciUl45tUB75pln2LdvH7GxsZauVwA7duzgp59+Ijc3l/Pnz/Pcc8/x4IMPWpYuDh06lHnz5vHLL79w8uRJXn31VYYPHw5A48aNadWqFdHR0WRmZrJ+/XoSExMJDw93aO6aQBMRkbKoVVXY3g/m3w/et6yrycqFObshdC18eaJ48hMRKW+ctsTx2LFjLFq0CA8PD2rWrGk5vmjRIlxcXJg6dSpnz56lUqVK9OjRg1WrVlleM2bMGI4cOUJoaCgAf/jDHxgzZowlvnr1aoYPH46fnx/33HMPMTExVKuWz+1AERERseHqAuNbQN8G8Ow2iD1mHT90CbrFwrAm8H8doKpn/ucREZHCMxmO7uVbQt28p8ytTUXs0WotJJz/9ffd/SFMNaCISLEp7Pd6aeLMazUM+OAIjPsGTl21jVetCPM6whON9Xy2iEhB3Ok7vdja7Jc2GoRERK
Q8MJkgvCHsGwR/DLFd4n8uE4Z+CQ9vhMOO3U9bRERQgVZg5WLaUUREyi1fD1jYGb7tC83z6eS45QQ0XwN/2w1ZOc7PT0SkrFKBZidNoImISHnUoSbs6g9z7wMPV+tYZg5M3QFtYmD76eLJT0SkrFGBVkCaQRMRkfLC3RX+2jpv77RudWzj/70AHdfD2Di4dM35+YmIlCUq0OykGTQRESnvgnxhcx9Y/hD4V7SOGcCbSdBsTV6TkfLRgkxExPFUoImIiIjdTCYY2gT2D4KhjW3jJ69A+Gfw+Kfwc7rz8xMRKe1UoBWQ7gyKiEh5VtUTlneDLX3yZtZu9VEyNFsNrydCTq7T0xMRKbVUoNlJbfZFRERsdQuAxAEwtTW43fKvivQsGP8tdFgPCeeKJz8RkdJGBVoBaQJNREQkj6cbzLkP9vSHDjVs4z+czev0OOk/cDXL+fmJiJQmKtDspAk0ERGR39bcH77pC291hkru1rEcA16Jz9s77bPjxZOfiEhpoAKtgPQMmoiIiC0XEzwdAvsGQf9A2/jRy/DIxzBkC5y96vz8RERKOhVodtIzaCIiIvar7Q3v/w4+6gl1zbbxlQcheDX8c59ueoqI3EwFWgFpLBEREbmzPvUhaSCMD82bXbtZ6jX4w9fQ9SP4KbU4shMRKXlUoNlJE2giIiIF4+MO8x+AHf2gVVXb+NaT0GItzNwJ13Kcn5+ISEmiAk1EREScom11+CEcXukAXm7Wseu5MOMHaLUWtp0snvxEREoCFWgFpPXyIiIid8/NBf7SCvYOhEfq2sb3X4TOG+Cpr+FCptPTExEpdirQ7KQljiIiIo7ToBJsehRWdYfqnrbxJfsgeBWsOKCboiJSvqhAKyCNFSIiIoVjMsGgRnkt+f/Q1DaekglPfAEPb4RDl5yfn4hIcXBagXbt2jVGjRpFvXr18PHxISwsjE8++QSA7du306NHD6pUqUK1atWIiIjg1KlTlvdGRUVRoUIFzGaz5efIkSOWeHJyMl27dsXLy4vg4GC2bNni8PzVZl9ERKRoVKkI7zwIWx+D4Mq28S0n8ja4nr1LTUREpOxzWoGWnZ1N3bp12bp1K5cuXWLWrFkMGDCA5ORkUlNTGT16NMnJyRw7dgwfHx9GjBhh9f6BAweSnp5u+QkM/HX3y8jISMLCwjh//jxz5syhf//+pKSkFOn1aAZNRETEsTrXhvgBMOte8HC1jl3LgenfQ9j7EKcmIiJShjmtQPP29iYqKor69evj4uJC7969adCgAbt27aJnz55ERERQqVIlvLy8ePbZZ/n222/tOu+BAwfYvXs30dHReHp6Eh4eTmhoKOvWrXNo/ppAExERKXoervBCG/jvAOhWxza+LxW6bIBRX8F5NRERkTKo2J5BO3PmDAcOHCAkJMQmFhcXZ3M8NjaWKlWqEBISwltvvWU5npSURGBgID4+PpZjLVu2JCkpqeiSRw8si4iIFKVGlWFzH3ivG1SraBt/d39eE5F//6QxWUTKlmIp0LKyshgyZAjDhg0jODjYKpaYmMjMmTN55ZVXLMcGDBjAvn37SElJ4Z133mHmzJmsWrUKgPT0dHx9fa3O4evry+XLl4v+QkRERKTImEwwpDHsj8y/ici5TBj6JXSPhQMXnZ+fiEhRcHqBlpuby5NPPom7uzsLFiywih06dIiePXvy2muv0alTJ8vxZs2aUbt2bVxdXenYsSPjx48nJiYGALPZTFpamtV50tLSrGbUHEFNQkRERIrHjSYicY9BUz/b+Je/QIu1MGunmoiISOnn1ALNMAxGjRrFmTNnWLduHRUqVLDEjh07Rvfu3Zk+fTpPPvnkb57HZDJh/G89Q0hICEeOHLGaMUtISMh36aQjaTWFiIiIc3WqDfERMPs2TURe/AFaroWtaiIiIqWYUwu0Z555hn379hEbG4un56+7Uv7yyy889NBDjB07lqefftrmfRs2bCA1NRXDMPj+++95/fXXeeyxxwBo3LgxrV
q1Ijo6mszMTNavX09iYiLh4eEOzV0TaCIiIsXP3RWmtYG9A6FHgG38p4vw4AYY+RWcy3B+fiIihWUyDOc8Wnvs2DHq16+Ph4cHbm5uluOLFi3i0KFDREVF4e3tbfWe9PR0IK+N/ueff861a9cICAjgj3/8I88995zldcnJyQwfPpwdO3Zwzz33sHDhQrp37251rkuXft3h8tZn1uzxwHr49vSvv8c9lncnT0REikdhv9dLk/J0rXfDMGDVQfjzd3A2n2LMvyK82gGGNtGjCiJSctzpO91pBVpxK+zg1mk9fKMCTUSkxChPRUt5utaCuJAJU7bDO/vyjz9YG97uDE3yeX5NRMTZ7vSdXmxt9ku7clHVioiIlAJVKsLiB+GbxyEknyLs65N5TUSif1ATEREp+VSg2UlLI0REREq2+2vB7giYex9UvKWJyPVciNoJLdbA178UT34iIvZQgSYiIiJlhrsr/LV1XhORh+vaxg9cgq4fwfAv1UREREomFWgFpCWOIiJys9WrV9O0aVO8vb1p2LAh27ZtA2DJkiUEBQVhNpt55JFHOHnStgf89evXCQ4OJiAgn7aEUiANfeHTR2FVd6jhaRtf/hMEr4Zl+/OajYiIlBQq0OykFY4iInI7mzdvZvLkySxdupTLly8TFxdHYGAgW7duZerUqWzYsIELFy7QoEEDIiMjbd7/yiuvUL169WLIvGwzmWBQI9gXCWOa2cbPZ8KIr/Jm1PanOj8/EZH8qEArIN1tExGRG2bMmMGLL75I+/btcXFxoU6dOtSpU4fY2FgiIiIICQnB3d2d6dOnExcXx+HDhy3vPXr0KO+99x5//etfi/EKyjY/D3i7C3zbF5pXsY1v/V8TkRnfQ2a28/MTEbmZCjQ7qUmIiIjkJycnh507d5KSkkJQUBABAQE8++yzZGRkYBgGN+9mc+PPe/futRwbN24cc+fOxdMzn3V44lAda8Lu/vBSe/B0s45l5cLMXXmF2pcniic/ERFQgVZgmkATERGAM2fOkJWVRUxMDNu2bSM+Pp49e/Ywe/ZsevXqxdq1a0lMTCQjI4OZM2diMpm4evUqAOvXryc7O5u+ffsW81WUHxVcYXJYXhOR3+XTROTgJegWC0O/gOIEsBAAACAASURBVBQ1ERGRYqACzU6aQBMRkfzcmPkaN24ctWrVomrVqkyYMIFNmzbRrVs3oqOjCQ8Pp169etSvXx8fHx8CAgK4cuUKkyZN4o033ijmKyifAivBJ4/C6h75NxH59wEIXgXv7tNjDSLiXCrQCkhf1iIiAuDn50dAQACm26yFHzt2LAcPHuTs2bOEh4eTnZ1N8+bNOXjwIMnJyXTq1ImaNWvSr18/Tp06Rc2aNUlOTnbuRZRTJhMMDIL9kfB0M9ubsReuwaiv4cENsE9NRETESVSg2UkzaCIicjsjRozgjTfe4OzZs6SmpjJ//nx69+5NZmYme/fuxTAMjh8/zujRoxk/fjx+fn40b96cn3/+mfj4eOLj41myZAk1atQgPj6eunXzWXsnRaayB7z1vyYiofk0EYk7BS3XwotqIiIiTqACTUREpJCmT59Ou3btaNy4MU2bNiUsLIxp06aRmZnJ4MGDMZvN3HvvvXTo0IFZs2YB4ObmRs2aNS0/VapUwcXFhZo1a+Lq6lrMV1Q+dagJu/rD32/TRGTWLghdC1vUREREipDJMMrHYr1Lly5Z/uzr63vX739oA3x1096iW/pAN+0nKiJSbAr7vV6alKdrLSmOpsHYbfDJ8fzjTzSGVztAdS/n5iUipd+dvtM1g2YntdkXEREpPxpUgo97wdqHoWY+Rdh7ByB4NbzzI+SWi1vdIuIsKtAKSN/FIiIiZZvJBBENYf8g+GOI7fPoqddg9Fbo9CH893yxpCgiZZAKNDtpAk1ERKR88vWAhZ3hP/2ghb9t/LvT0DoGJv8HrmQ5Pz8RKVtUoBVQ+XhyT0RERG64rwbsDIdXOoDXLU1EsnPh5Xhoth
pik4slPREpI1Sg2UnPoImIiEgFV/hLK/hxEPSpZxs/ng6//wT6fQo/pzs/PxEp/ZxWoF27do1Ro0ZRr149fHx8CAsL45NPPrHEv/jiC4KDg/Hy8qJr164cO3bMEjMMg8mTJ+Pv74+/vz+TJk3i5uaTycnJdO3aFS8vL4KDg9myZYuzLktERETKoXo+sKEnrH8EArxt4+uPQtNVMC8hb3ZNRMReTivQsrOzqVu3Llu3buXSpUvMmjWLAQMGkJyczLlz5+jXrx+zZs3iwoULtG3bloEDB1reu3jxYj788EMSEhJITExk48aNLFq0yBKPjIwkLCyM8+fPM2fOHPr3709KSkqRXo9WOIqIiJRvJhM83gD2RcLEluB6y2qbK9kw8TtoGwPbTxdPjiJS+hTrPmgtWrRgxowZnD9/nmXLlvHdd98BcOXKFapWrcqePXsIDg6mY8eODB8+nNGjRwPwz3/+k3feeYft27dz4MABQkNDOXfuHD4+PgB06tSJIUOG8PTTT1s+q7B7yDwcC5tv2pjys97wcN2CXLWIiDhCedobrDxda2mWcA6ejoPtZ2xjJmB0M/hbe/DzcHpqIlKClNh90M6cOcOBAwcICQkhKSmJli1bWmLe3t40bNiQpKQkAJt4y5YtrWKBgYGW4uzWeFFRkxARERG5Wcuq8G1feLszVHa3jhnAoh8heBWsOKB/R4jI7RVLgZaVlcWQIUMYNmwYwcHBpKen21SPvr6+XL58GcAm7uvrS3p6OoZh3PG9jqImISIiInInLiYYEwL7I2FII9v42Qx44gvoHgs/pTo/PxEp+ZxeoOXm5vLkk0/i7u7OggULADCbzaSlpVm9Li0tzTIrdms8LS0Ns9mMyWS643uLim58iYiIyO3U8IL3usOWPtA4n1WpX/4CLdbCjO8hM9v5+YlIyeXUAs0wDEaNGsWZM2dYt24dFSpUACAkJISEhATL665cucLhw4cJCQnJN56QkGAVO3LkiNWM2c1xR9EEmoiIiNytbgGQOBCi24GHq3Xsei7M3AWha2Hzz8WTn4iUPE4t0J555hn27dtHbGwsnp6eluN9+/Zl7969rFu3jszMTGbOnEmLFi0IDg4GYOjQocybN49ffvmFkydP8uqrrzJ8+HAAGjduTKtWrYiOjiYzM5P169eTmJhIeHh4kV6L1o6LiIiIPTxc4cW28N8B0D3ANn7oEjy8EQZvhtNXnZ+fiJQsTivQjh07xqJFi4iPj6dmzZqYzWbMZjMrVqygWrVqrFu3jmnTpuHn58eOHTtYvXq15b1jxoyhT58+hIaG0rx5cx599FHGjBljia9evZqdO3fi5+fHlClTiImJoVq1ag7NXzNoIiIiUhiNKsPnvWFld6jhaRtfdSivicibeyFHe6eJlFvF2mbfmQrborjnRvj0puUHm3pBz3qOyExERAqiPLWeL0/XWl5cvAbTdsBbSfk/135v9bxukGGOvd8sIiVAiW2zX9qVi6pWREREikRlD1jYGbb3g1ZVbePfn4W26+DP38Ll687PT0SKjwo0O6nNvoiIiDjavTXgh3D4R0cwV7CO5RowPxGaroZ1h/X8u0h5oQKtgPQdKSIiIo7g5gJ/agn7BkF4oG38lyvQ/3PovQmOptnGRaRsUYFmJ02giYiISFEKMEPM72BjL6ifz3aum45DyBp4aTdcz3F+fiLiHCrQCkjLDERERKQoPFoPkgbCX8PyZtdulpENf90Brd+HbSeLJz8RKVoq0OykZ9BERETEWbwqwNz2EB8BD9S0jSelQucNMOorOJfh/PxEpOioQCsgTaCJiIhIUQupAlsfh3e7gn9F2/i7+yF4NSzdr9U9ImWFCjQ7aQJNREREioOLCUYEw/5BMDLYNn4+E0Z+BV02QNIF5+cnIo6lAk1ERESkFKjqCf/sClsfg2Z+tvFtp6DV+zB1O1zNcn5+IuIYKtAKSMsIREREpDh0rg17IuBv94Gnm3UsOxf+tiev2+OmY8WTn4
gUjgo0O2mJo4iIiJQU7q4wpXVet8de99jGky/Do5ug/2dwIt35+YlIwalAKyBNoImIiEhxa1Apb9+0mIehtrdtfN0RaLoaXkvMm10TkZJPBZqd1GZfRERESiKTCcIb5jUR+VOLvKYiN0vPgj99C+3WwfbTxZOjiNhPBVoBaQZNREREShIfd/jH/bAzHNpVt43Hn4OO62HMVriQ6fz8RMQ+KtDspAk0ERERKQ3CqsF/+sLCTlDJ3TpmAIt/hOBVsFx7p4mUSCrQRERERMoYVxf4Y3P4KRIGN7KNp2TCcO2dJlIiqUArIN1xEhERkZKuphes6A5b+kBjX9v4jb3TJv8HrmjvNJESQQWandQkREREREqrbgGQOBBm3QsVXa1j2bnwcnxet8cPj+omtEhxc2qBtmDBAtq2bYuHhwfDhw+3HF+xYgVms9ny4+XlhclkYteuXQBERUVRoUIFq9ccOXLE8v7k5GS6du2Kl5cXwcHBbNmypcivRd9dIiIiUpp4uMILbSBpUP57p/2cDn0/hT6fwNE05+cnInmcWqDVrl2bF154gZEjR1odHzJkCOnp6ZafN998k8DAQFq3bm15zcCBA61eExgYaIlFRkYSFhbG+fPnmTNnDv379yclJcWhuWsCTUREfsvq1atp2rQp3t7eNGzYkG3btgGwZMkSgoKCMJvNPPLII5w8edLynldeeYXmzZvj4+NDgwYNeOWVV4orfSlHAv+3d9oHv4OAfPZO+/gYhKyBubvgeo7z8xMp75xaoPXr14/HH38cf3//33zd8uXLGTp0KCY71hUeOHCA3bt3Ex0djaenJ+Hh4YSGhrJu3TpHpZ0vTf+LiMgNmzdvZvLkySxdupTLly8TFxdHYGAgW7duZerUqWzYsIELFy7QoEEDIiMjLe8zDIN//etfpKam8umnn7JgwQJWr15djFci5YXJBH0DYV8k/KUluN7yT66MbJj2PbRcC1+eKJ4cRcqrEvcM2rFjx4iLi2Po0KFWx2NjY6lSpQohISG89dZbluNJSUkEBgbi4+NjOdayZUuSkpIcmpdm0ERE5HZmzJjBiy++SPv27XFxcaFOnTrUqVOH2NhYIiIiCAkJwd3dnenTpxMXF8fhw4cBmDRpEq1bt8bNzY0mTZrw2GOP8e233xbz1Uh5Yq4Ar3SEPRFwf03b+P6L0C0WntgCp686Pz+R8qjEFWj/+te/6NSpEw0aNLAcGzBgAPv27SMlJYV33nmHmTNnsmrVKgDS09Px9bVuS+Tr68vly5eLNE9NoImICEBOTg47d+4kJSWFoKAgAgICePbZZ8nIyMAwDIybllzc+PPevXttzmMYBtu2bSMkJMRpuYvcEOoPcY/Du13Bv6JtfMXBvL3TFu6FnFzn5ydSnpTIAm3YsGFWx5o1a0bt2rVxdXWlY8eOjB8/npiYGADMZjNpadZPsqalpVnNqDmCujiKiEh+zpw5Q1ZWFjExMWzbto34+Hj27NnD7Nmz6dWrF2vXriUxMZGMjAxmzpyJyWTi6lXbqYioqChyc3MZMWJEMVyFCLiYYERw3t5pTzW1jV+6Ds9ug/YfwM6zzs9PpLywu0AbOXJkvrNSV65csWn6UVDffvstJ0+epH///r/5OpPJZLkLGRISwpEjR6xyS0hI0B1IERGxS2HHN09PTwDGjRtHrVq1qFq1KhMmTGDTpk1069aN6OhowsPDqVevHvXr18fHx4eAgACrcyxYsIB//etffPzxx3h4eDjmwkQKyL8iLH4QvusLLfNpG7AzBe5dB2Pj4OI1p6cnUubZXaAtX76cjIwMm+MZGRn861//susc2dnZZGZmkpOTQ05ODpmZmWRnZ1t9Rnh4uM3s14YNG0hNTcUwDL7//ntef/11HnvsMQAaN25Mq1atiI6OJjMzk/Xr15OYmEh4eLi9l1YgWuIoIlI2FHZ88/PzIyAg4LaNrcaOHcvBgwc5e/Ys4eHhZGdn07x5c0v83Xff5aWXXuKLL76wKd
xEilOHmrCzP8zrmPes2s0M4M0kaLIK3jug5mkijmR3gWYYhs3gYxgG33zzDdWqVbPrHLNnz8bT05OXXnqJ9957D09PT2bPng1AZmYma9eutVneCHmti4OCgvDx8WHo0KFMnjzZ6nWrV69m586d+Pn5MWXKFGJiYuzOyV5a4SgiUjY5YnwbMWIEb7zxBmfPniU1NZX58+fTu3dvMjMz2bt3L4ZhcPz4cUaPHs348ePx8/MD8vYBnTp1Kps3b7baPkakpHBzgT+3hP2DIKKhbfxsBjz5BXT7CPanOj8/kbLIZBi/fc/DxcXlju3ux48fz7x58xyamKNdunTJ8udbm4rYI+IziPl1b2zW9IABQY7ITERECqKw3+uOHN+ysrIYP348K1eupGLFigwYMICXX36ZzMxMOnfuzOHDh/Hx8WHEiBHMnj0bV1dXABo0aMCJEyesljU+8cQTvP3221bnL+y1ijjKZ8dh7DY4nM9G1hVc4PlWMK01eFWwjYtInjt9p9+xQFuxYgWGYTB06FAWLFhgdRJ3d3caNGhA27ZtHZhy0Sjs4Dbgc3j/8K+/q0ATESlehf1eL03jmwo0KUkys+GlPfC33XA9n46O9X3gjQegd32npyZSKtzpO93tTicYMmQIAHXr1uX+++/Hze2ObykXtNRaRKR00/gmUjAV3SCqHQxplDebtvmWjayTL0OfT+DxBvDa/XCPYxtri5R5dj+D1qhRI86e/bWn6p49e3j++edZtmxZUeRV4ugZNBGRsqm8j28iBdWoMnzWO29VUS0v2/iHR6Hpanh5D2TlOD8/kdLKNSoqKsqeF/bp0wdvb2/CwsI4f/48bdq04dSpU6xZswY3Nzfuv//+Ik61cK5d+7UPbMWK+ezAeAcxhyHppodfwwOheT6tZ0VExDkK+71+Q2kY3xx1rSKOZjJBSBV4qhlk5sD3Z61XGWXlwpYTsO4IhFaBeppNE7njd7rdM2h79+7lvvvuA+CDDz4gMDCQH3/8keXLl7NkyRIHpFqyaaNqEZGyqbyPbyKOUMkd/nE/7OoP7WvYxn9MhS4bYPiXkGK7q4WI3MTuAu3KlStUqlQJgC+//JI+ffoA0LZtW37++eeiyU5ERKSIaXwTcZxWVeHbvrC4C/jls+f68p/y9k5blAS5eqBfJF92F2gNGjQgLi6O9PR0Nm/eTPfu3QFISUmx2Vi6PNB3iohI2aDxTcSxXEx5Sx5/ioThTWzjqdfg6Tjo8AHsSXF+fiIlnd0F2oQJExg+fDh169blnnvusazJj4uLo3nz5kWWYEmhFY4iImVTeR/fRIpKNU9Y+hDEPQYhfrbx789C23Uw/htIu+78/ERKqjvug3az3bt3c/z4cXr06IG3tzcAH330EX5+fnTq1KnIknSEwu4hE7kZVh/69fcV3WBwY0dkJiIiBeHIvcFK+vimfdCktMvKgfmJELUTrmbbxmt55T3DNqChnvuXsq/QG1WXFYUd3AZvhlUq0ERESozyVLSUp2uVsu34ZfjTt7D+aP7x7gGwsBM0ruzcvESc6U7f6XYvcQT46quv6N69O7Vq1aJ27dr06NGDr7/+utBJlkbloqoVESknNL6JOMc9PvDBI7CxF9TP5xHPLScgdA1M/x6uZjk/P5GSwO4CbdWqVXTv3p1KlSoxZcoUJk2ahNlspnv37qxZs6YocywRNN0uIlI2lffxTaQ4PFoPkgbCtNZQ4ZZ/jV7Phdm7IGQNxCYXS3oixcruJY4hISE88cQT/PWvf7U6PnfuXFauXMnevXuLJEFHKezykCFbYOXBX39/rxsM0RJHEZFi46hlf6VhfNMSRynL9qfCH+Pgq5P5x/vUg9cegAaVnJuXSFFx2BLHQ4cOERERYXN8wIABHDp0KJ93lG1a4igiUjZofBMpXsF+8MXv857vr+llG489Bs1Ww5xdcC3H+fmJOJvdBVq1atVITEy0OR4fH0+1atUcmlRJpBWOIiJlU3kf30RKApMpr/na/k
HwXGjeXmo3y8yBF76HFmtgs/aPlzLOzd4XPvHEE4wZM4aUlBQ6deqEyWRi69atTJ8+naeeeqoocyyRykfvSxGRsk/jm0jJ4euRt5xxRHDessf/nLGOH7gED2+EiIYwryMEmIsnT5GiZHeBNnv2bHJychg/fjxZWVkYhoGHhwfPPfccM2fOLMocSwQ1CRERKZvK+/gmUhK1qgrf9IVl+2HydjiXaR1//zBsOgZR7WB8KFRwLZ48RYrCHZuE5ObmsnfvXho1aoSnpycZGRmWNfkNGzbk0KFDNG/eHBeXu+rY73SFfcD6yS/gvQO//r78IRjaxBGZiYhIQRT2e700jW9qEiLl2YVMmLoDFv+Yfw+AED94szN0ru301EQKpNBNQlasWMHQoUNxd3cHwNPTk9DQUEJDQ3F3d2fo0KF2tyFesGABbdu2xcPDg+HDh1uOJycnYzKZMJvNlp9Zs2ZZ4oZhMHnyZPz9/fH392fSpEncXFcmJyfTtWtXvLy8CA4OZsuWLXblczc0gSYiUrY4cnwTkaJTpSK83QW294M2+TwWmpQKXTbk3Uw/fdX5+Yk42h0LtH/+859MnDgRV1fbuWM3Nzf+8pe/sHjxYrs+rHbt2rzwwguMHDky3/jFixdJT08nPT2d6dOnW44vXryYDz/8kISEBBITE9m4cSOLFi2yxCMjIwkLC+P8+fPMmTOH/v37k5KSYldOBaVn0ERESjdHjm8iUvTurQE7+sGbnaCyu238vQPQZBUs+C/k5Do/PxFHuWOBtn//fjp27HjbeIcOHdi3b59dH9avXz8ef/xx/P397c8QWL58ORMnTiQgIIA6deowceJEli1bBsCBAwfYvXs30dHReHp6Eh4eTmhoKOvWrburz7gTzaCJiJQtjhzfRMQ5XF3gmebwUyQMy+dRk7TrMO4baLcOtp92fn4ijnDHAu3SpUtkZWXdNn79+nXS0tIckky9evUICAhgxIgRnDt3znI8KSmJli1bWn5v2bIlSUlJllhgYCA+Pj75xkVERPLjzPFNRByruhcsewi2PQ6hVWzje85Bh/Xw1NdwPtM2LlKS3bFAq1evHvHx8beNx8fHc8899xQqiapVq/LDDz9w7Ngxdu3axeXLlxkyZIglnp6ebvUAna+vL+np6RiGYRO7Eb98+XKhcroTrXAUESndnDG+iUjReqAW7Oqf13LfXME2vmQfNF4J7/wIufrHm5QSdyzQfv/73zN9+nTS09NtYmlpacyYMYM+ffoUKgmz2Uzbtm1xc3OjRo0aLFiwgM8//9xy59JsNlvdxUxLS8NsNlsai9x6hzMtLc1qRs0R1GZfRKRsccb4JiJFr4Ir/Lll3rLHQUG28QvXYPRW6PgB7C7aFgUiDnHHAm3KlClcv36dxo0b87e//Y0PP/yQDRs2MHfuXIKDg7l27RpTpkxxaFKm/1VDNzo1hoSEkJCQYIknJCQQEhJiiR05csRqxuzmeFHRTRgRkdKtOMY3ESk6tb1hVQ/Y0geaVLaN7zib92zauG1w8Zrz8xOx1x03qq5SpQrfffcdTz/9NNOnTyc3N68tjouLCz179uTNN9+0u+lHdnY22dnZ5OTkkJOTQ2ZmJm5ubuzatYvKlSvTqFEjUlNTee6553jwwQctSxeHDh3KvHnz6NWrFyaTiVdffZVx48YB0LhxY1q1akV0dDSzZ8/mk08+ITExUU1CRETkNzlyfBORkqNbACQOgHkJMGsXXM3+NZZrwIK9sPYw/F8HeKKxVklJyXPHAg2gTp06xMbGkpqayqFDhzAMg0aNGuHn53dXHzZ79myio6Mtv7/33nvMmDGDJk2aMHXqVM6ePUulSpXo0aMHq1atsrxuzJgxHDlyhNDQUAD+8Ic/MGbMGEt89erVDB8+HD8/P+655x5iYmKoVi2fjTIcSG32RURKP0eNbyJSsri7wpTWENkI/vQtfHjUOn42A4Z+mfeM2sJO0Fz3YqQEMRlG+Sg17rRj952M/AqW7v
/1938+CCObOiAxEREpkMJ+r5cm5elaRYrCpmN57feP5NOY1dUEf2oBM9qCTz77q4k42p2+0+/4DJrkr1xUtSIiIiJlQK96sHdgXhHmccve9DkGvJoATVfD2kNaJSXFTwWaiIiIiJR5nm4Q1S6vUOuZzw4av1yBgZvhdxvhwEXn5ydygwo0O+n5UREREZHSL8gXPu4FH/wO6ppt45tPQOgaeGEHXL39XvYiRUYFWgFp+ltERESkdDKZoG8g7BsEk8PA7ZZ/EV/PhTm7odka+Oho/ucQKSoq0OykGTQRERGRssW7ArzUPq8tf9fatvFjl+GxT6HPJjiaT4MRkaKgAq2ANIEmIiIiUjY09YMvfg8ru0MtL9v4xmPQbDXM2gnXcpyfn5QvKtDspE0MRURERMoukylv37T9kXlt911v+bdfZg68+AM0XwOfHS+eHKV8UIFWQJpBExERESl7KrnDP+6HXf3h/pq28UOX4JGPIeIzOJHu/Pyk7FOBZidNoImIyG9ZvXo1TZs2xdvbm4YNG7Jt2zYAlixZQlBQEGazmUceeYSTJ09a3mMYBpMnT8bf3x9/f38mTZqEoS5UIiVCy6oQ9zgs7QpVK9rGY45A8Cp4eQ9c17JHcSAVaCIiIoW0efNmJk+ezNKlS7l8+TJxcXEEBgaydetWpk6dyoYNG7hw4QINGjQgMjLS8r7Fixfz4YcfkpCQQGJiIhs3bmTRokXFeCUicjMXEwwPhp8i4elmtjfsr2TD5O3Qci1sOVEsKUoZpAKtgHSDU0REbpgxYwYvvvgi7du3x8XFhTp16lCnTh1iY2OJiIggJCQEd3d3pk+fTlxcHIcPHwZg+fLlTJw4kYCAAOrUqcPEiRNZtmxZ8V6MiNioUhHe6gI7wqFtNdv4/ovQIxYGfq5lj1J4KtDspCYhIiKSn5ycHHbu3ElKSgpBQUEEBATw7LPPkpGRgWEYVksWb/x57969ACQlJdGyZUtLvGXLliQlJTn3AkTEbu2qw/Z+8FZn8POwja89rGWPUngq0ApIE2giIgJw5swZsrKyiImJYdu2bcTHx7Nnzx5mz55Nr169WLt2LYmJiWRkZDBz5kxMJhNXr14FID09HV9fX8u5fH19SU9P13NoIiWYqws8HQIHIuEPTW3jWvYohaUCzU6aQBMRkfx4enoCMG7cOGrVqkXVqlWZMGECmzZtolu3bkRHRxMeHk69evWoX78+Pj4+BAQEAGA2m0lL+3X327S0NMxmMyYt2xAp8ap6wjsP5s2otfmNZY8DtOxR7pIKtALSzU0REQHw8/MjICDgtkXV2LFjOXjwIGfPniU8PJzs7GyaN28OQEhICAkJCZbXJiQkEBIS4pS8RcQx7qsBO/rB27dZ9vj+/5Y9/l3LHsVOKtDspHuZIiJyOyNGjOCNN97g7NmzpKamMn/+fHr37k1mZiZ79+7FMAyOHz/O6NGjGT9+PH5+fgAMHTqUefPm8csvv3Dy5EleffVVhg8fXrwXIyJ3zdUFxtxh2eOU7dBCyx7FDirQCkgTaCIicsP06dNp164djRs3pmnTpoSFhTFt2jQyMzMZPHgwZrOZe++9lw4dOjBr1izL+8aMGUOfPn0IDQ2lefPmPProo4wZM6YYr0RECuNOyx5/0rJHsYPJKCdPIl+6dMny55sfyLbXH+PgrZsaay3sBH9s7ojMRESkIAr7vV6alKdrFSkrcnJhyT746w5IvWYb93aD6W3hzy3A3dX5+UnxudN3ulNn0BYsWEDbtm3x8PCwWsKxfft2evToQZUqVahWrRoRERGcOnXKEo+KiqJChQqYzWbLz5EjRyzx5ORkunbtipeXF8HBwWzZssWZlyUiIiIiYuXmZY9PNc1/k2ste5T8OLVAq127Ni+88AIjR460Op6amsro0aNJTk7m2LFj+Pj4MGLECKvXDBw4kPT0dMtPYGCgJRYZGUlYWBjnz59nzpw59O/fn5SUlCK9lnIx7SgiIiIihVLVExY/mL
fsMb9NrrXsUW7l1AKtX79+PP744/j7+1sd79mzJxEREVSqVAkvLy+effZZvv32W7vOeeDA1CyugAAAIABJREFUAXbv3k10dDSenp6Eh4cTGhrKunXrHJq7moSIiIiISEHdWyOvSFO3R7mTEtkkJC4uzqbNcGxsLFWqVPn/9u49LOpq7Rv4d0TOhwFRQcBMVARRwTamtik8oan5pqDlKdEtO+1N9ElM3CqSSlf2mG7d2i7LXtNoy6Pg4dJtT4nbjLbnUnziMU0BC3KDB2IYBDnd7x9sRocBGWCYGZjv57q84rfumfndvzU6q5u1fmsQGBiIDz74QNOemZkJX19fODs7a9qCgoKQmZmJ1mQZd+4RERERkaFw2SPpw+wKtMuXL2Pt2rXYsGGDpu2ll17ClStXcPv2bXz88cdYu3Yt9uzZAwBQq9U6N9cplUoUFxcbNC9+ZygRERERGUJTlj3+wmWPFsesCrTr169j3Lhx2LJlC5599llNe79+/eDl5QUrKys888wzWLx4MVJSUgAATk5OUKlUWq+jUqm0ZtRaAyfQiIiIiKglHl322InLHunfzKZAu3nzJkaPHo34+Hi88sorj32sQqFA7bcDBAYGIisrS2vGLCMjQ2eJZEtxAo2IiIiIDK122ePVBpY93ueyR4tj1AKtsrISZWVlqKqqQlVVFcrKylBZWYm8vDyMHDkSr7/+OhYsWKDzvEOHDqGwsBAignPnzuEvf/kLXnzxRQCAn58fgoODsWbNGpSVleHAgQO4fPkyIiMjW/VaeA8aERERERkKlz1SLaMWaImJibC3t8f69euRlJQEe3t7JCYmYseOHcjKysKaNWu0vuusVnJyMnr37g1nZ2fMnj0bcXFxiIqK0opfuHABbm5uWL58OVJSUtClSz1/s4mIiIiIzBiXPZJCxDLmghr7xu7GLPoW2Po/D4+3/B5YNNAQmRERUXO09HO9LbGkayWih+6UAivOAjuu1L//QV9XYGsoEN7d6KlRCzT2mW4296C1NRZR1RIRERGRyeiz7HHMEWDql1z22J6wQNMTNwkhIiIiIlOoXfa4Paz+ZY8pWVz22J6wQGsmy1gYSkRERETmwKoD8Go/4NqMmv8+brfHY7+YJEUyEBZoeuIMGhERERGZmrtdzUwalz22XyzQmokTaERERERkKvoue1z/PZc9tjUs0PSk4BQaEREREZkRfZY9/ukslz22NSzQiIiIiIjasNplj2cjH7/sccqXwM1i4+dHTcMCrZm4xJGIiIiIzMngro9f9piaBQQkA4nfAWWVxs+P9MMCTU9c4UhERERE5q6xZY+llUD8OSDwv4AjOabIkBrDAq2ZuM0+EREREZmrR5c9Du6qG89SARO/AF44ClwvMn5+1DAWaHriJiFERERE1NbULnvcMRzobKcb//tNIDAZWHUWKKkwenpUDxZozcQJNCIiIiJqCzoogHkBNcseF/avOX5UeTXw9vc196el3OBKMVNjgaYnTqARERERUVvmZgtsfRb4fgoQ6qkb/0UNTP0KCD8MXCk0fn5UgwVaM/E3C0RERETUFgV1Br6ZBCSNAjwddOPH82q+O23pKUBVbvz8LB0LND1xBo2IiIiI2guFApjpB1ydDiwNAjrWqQoqq4GNGYD/HuDza5ycMCYWaEREREREFsrFBtjwDHD5JWCUt2781n1g1nHguYNAxh3j52eJWKA1E3+JQERERETtRYAbcGwikDIG6O6kG//2X8BTKUBMOlD4wPj5WRIWaHriNvtERERE1J4pFEBkL+DKNGDV7wCbOpVCtQDbfgD8/gZ8cqXmmAzPqAXatm3bEBISAltbW8yZM0crdvz4cfj7+8PBwQEjRozAzZs3NTERQVxcHNzd3eHu7o5ly5ZBHlkIm5OTgxEjRsDBwQH+/v5IS0tr9Wvh30ciIiIiao8crYF1TwOZ04AJPXTjd8qA6K+BofuBc/lGT6/dM2qB5uXlhVWrVuEPf/iDVvudO3cQER
GBdevW4d69ewgJCcHLL7+siX/00Uc4ePAgMjIycPnyZRw5cgTbt2/XxKdPn45Bgwbh7t27ePvttzFlyhTcvn3boLlzAo2IiIiILElvJXBkPHB4HODrohs/XwAM2Q9EnwBulxo/v/bKqAVaREQEJk2aBHd3d632/fv3IzAwEFOnToWdnR3eeustZGRk4McffwQA7Nq1C7GxsfDx8YG3tzdiY2Px6aefAgCuXbuG77//HmvWrIG9vT0iIyMxYMAApKamtuq1cCcbIiIiIrIELzwJZL5cM6tm31E3/smPNcse3/+hZvdHahmzuActMzMTQUFBmmNHR0f06tULmZmZ9caDgoK0Yr6+vnB2dq43bii8B42IiIiILJVdx5r70q5MAyJ9deO/lQML04GQFODbW8bPrz0xiwJNrVZDqVRqtSmVShQXF9cbVyqVUKvVEJFGn0tERERERIbRwxlIGQt89QLg76obz7gLPHsQeOU4cKvE+Pm1B2ZRoDk5OUGlUmm1qVQqzaxY3bhKpYKTkxMUCkWjz20tXOFIRERERJYqvDuQ8RKwYRjgZK0bT7oG+O0BNl4CKqqMn19bZhYFWmBgIDIyMjTHJSUluHHjBgIDA+uNZ2RkaMWysrK0ZswejRsKVzgSEdHjJCcnIyAgQLNMPz09HQCwd+9eBAQEwNnZGf369cPBgwc1z3nw4AEWLFgADw8PdOrUCRMnTkReXp6pLoGIqElsrIClwcDV6cDMPrpxdQWw9DQQtA9IyzV+fm2VUQu0yspKlJWVoaqqClVVVSgrK0NlZSUmT56MH374AampqSgrK8PatWsxcOBA+Pv7AwBmz56NTZs2IS8vD7/++is2btyo2abfz88PwcHBWLNmDcrKynDgwAFcvnwZkZGRrXot3CSEiIhqHTt2DHFxcdi5cyeKi4vxzTffwNfXF3l5eZg1axY2bdoElUqFDRs2YMaMGSgoKAAAbNmyBadPn8bly5fx66+/wtXVFTExMSa+GiKipvFyBJJGAydfBAa668avFALhh4GpXwI/8y6kRhm1QEtMTIS9vT3Wr1+PpKQk2NvbIzExEV26dEFqaipWrlwJNzc3nD17FsnJyZrnzZ8/HxMnTsSAAQPQv39/TJgwAfPnz9fEk5OTceHCBbi5uWH58uVISUlBly5dDJo7Z9CIiKghCQkJWL16NYYOHYoOHTrA29sb3t7eyM3NhaurK8aNGweFQoEJEybA0dERN27cAABkZ2dj7Nix8PDwgJ2dHaZNm2bwTa6IiIzlOS/guynA1lBAaaMbT8kC/JOBt78DyiqNn19boRCxjLmgoqIizc91NxXRx4ozwDsXHx4nPg2s/J0hMiMiouZo6ee6oVRVVcHe3h5r167Fjh07UFZWhkmTJmHDhg2wsbHByJEjERsbiwkTJuDw4cNYuHAhrl69CkdHR1y4cAGLFy/Gvn374OrqiujoaHTt2hWbN2/WOoe5XCsRkb4K7gMrztZswV+fXi7AltD6vwi7vWvsM90s7kFrC7jNPhER1Sc/Px8VFRVISUlBeno6Ll26hIsXLyIxMRFWVlaYPXs2ZsyYAVtbW8yYMQPbt2+Ho6MjgJpl+k888QS8vb3h4uKCK1euYPXq1Sa+IiKiluvqAOwYAZyJAELqWdh2QwW8cBSYeBS4UaQbt2Qs0JrJIqYdiYioUfb29gCAmJgYdOvWDZ07d8aSJUtw9OhRpKWlYdmyZfj6669RXl6OkydPIjo6GpcuXQIAvPbaaygrK8Pdu3dRUlKCiIgIjBs3zpSXQ0RkUEM8gLORwEdhgLudbvzITSDwv4D4c8D9CuPnZ45YoOmJE2hERFQfNzc3+Pj4QFHPUotLly7hueeeQ0hICDp06IDBgwdjyJAhSEtLA1Cz6/CcOXPQqVMn2NraIiYmBufOncOdO3eMfRlERK2mgwL4Yz/g2nTg/wbWHD/qQRWQ+B0QkAzsz+JmfCzQiIiIWmju3LnYunUrCgoKUFhYiM2bN+
OFF17A4MGDNcseAeDixYtIT0/HwIEDAQCDBw/G7t27UVRUhIqKCvz1r3+Fl5cXOnfubMrLISJqFZ3sgPefAy5EAs946sZ/VgORXwJjjtTs/GipWKA1k6VX9kRE9FB8fDwGDx4MPz8/BAQEYNCgQVi5ciXCwsLw1ltvYcqUKXB2dkZkZCRWrFiBMWPGAADee+892NnZoU+fPujSpQuOHj2KAwcOmPhqiIha16AuwLeTgN0jAU8H3XhaLjBwLxB7ClCVGz8/U+MujnqKP1cz9Vpr7WAgPsQQmRERUXNY0s6GlnStRGRZVOXA2gvAlv8BKqt14x72wPqhwOy+uksj2yru4thKLKKqJSIiIiJqRS42wHvPABlTgZHeuvH8UmDuCeCZ/cD5AuPnZwos0PTUTgp2IiIiIiKz068TkDYR2DsG6O6kGz9bAAxJBaJP1HzHWnvGAq2ZLGNhKBERERGRcSgUwNRewI/TgNW/A2yttOOCmi++9tsDbLkMVFSZJM1WxwJNT5xBIyIiIiJqfQ7WwJqngSvTgEk9deNF5cB//BMYtA/4R67x82ttLNCaiRNoREREREStp6cLcOB54MsXAH9X3XhmITDqMDD1S+BmsfHzay0s0IiIiIiIyGyN6Q5kvAS8NwxwttaNp2TVfMn12gtAaaXx8zM0Fmh6UnCNIxERERGRSdhYAbHBwLUZwJy+uvHSSiDhPNAvGTiQ1bb3i2CB1kxt+U0nIiIiImqLPB2AnSOB05OBkC668ZxiIOJLYOwR4Eqh8fMzBBZoeuIEGhERERGReRjqCZyNBHYMB7rY6caP5QID9wKxp4CiB0ZPr0VYoDUTJ9CIiIiIiEyngwKYF1Cz7HHRAMCqzoxKZTWwKQPouwf49Eeguo38DzwLND3xHjQiIiIiIvPjagtsCQUuTQVGeOnG80uBuSeAZ/YD5wuMn19TsUBrpjZSgBMRERERWYT+7sDx/wPsGwM84aQbP1sAPJ0KzDsBFNw3fn76MpsCzcnJSeuPlZUVYmJiAAA5OTlQKBRa8XXr1mmeKyKIi4uDu7s73N3dsWzZMoiBd/HgBBoRERERkXlTKIApvWq+5Hr17wBbK93H/L8fAb89wJbLQEWV8XNsTEdTJ1BLrVZrfi4pKYGHhwemTp2q9ZjffvsNHTvqpvzRRx/h4MGDyMjIgEKhQHh4OHx9fbFgwYJWz5uIiIiIiMyLgzWw5mlgjj+w5BRwMFs7XlQO/Mc/gY//F/hLKDDSxzR51sdsZtAelZKSgq5du+LZZ5/V6/G7du1CbGwsfHx84O3tjdjYWHz66acGzcm6Tk89MMNqm4iIiIiIHurpAhx4HvjyBcDfVTeeWQiMOgxM+RK4WWz8/OpjlgXarl27MHv2bCjq7MzRo0cP+Pj4YO7cubhz546mPTMzE0FBQZrjoKAgZGZmGjQnpzrfWl5SYdCXJyIiIiKiVjKmO3D5JWDjM4CztW48NQvw3wOsOV/zpdemZHYF2s8//4yTJ08iKipK09a5c2ecP38eN2/exHfffYfi4mLMnDlTE1er1VAqlZpjpVIJtVpt0PvQHOu8kWoWaEREREREbYa1FbAkqGZb/jl9deNlVcBbF4B+ycCBLMDAW1rozewKtN27dyM0NBQ9e/bUtDk5OSEkJAQdO3aEh4cHtm3bhq+++goqlUoTr/0ZAFQqFZycnHRm4FpCZwbNxJU1ERERERE1nacDsHMkcHoyENJFN55TDER8CYw5AlwpNH5+ZlmgPTp7Vp/awqt2hiwwMBAZGRmaeEZGBgIDAw2al1OdvUk4g0ZERERE1HYN9QTORgI7hgNd7HTjabnAwL3Akn8CRQ+Ml5fZ7OIIAKdOnUJeXp7O7o1nz56Fq6sr+vTpg8LCQixatAjDhw/XLGucPXs2Nm3ahPHjx0OhUGDjxo2aLfoNpe4M2ok84LvbNRVu7USd4t8/K+r+/Gi8Trte8aaeozk5/Pu6GsqNiIiIiKi96aAA5gUAkb7AW+
eBbT8AVY8sbaysBv58Gfj8J2D9UCCqb81zWpNZFWi7du1CREQEnJ2dtdqzsrKwYsUKFBQUwMXFBeHh4dizZ48mPn/+fGRlZWHAgAEAgOjoaMyfP9+gudW9B628GghJMegpzFqDBRwMUAQ+8nOT4vU8Fo85t7HO0eR4U87R3Bya8j4Z6RyPjevxPrXaLyXq9kEL/q40+BrNzaGh96mxeCtfZ4PneEy8owKwM6sRiIiILJmrLbA5FIgOABZ9C5z4VTteUAr84QSwPRPY+iwwuGvr5aIQQ3+js5kqKirS/PzohiL6ylIBvT43ZEZERJZrbHfgv19o2Wu09HO9LbGkayUiMjWRml0dY08BP6t14w4dgdzZgJtt816/sc90s7sHzVz1dAaC3E2dBRFR+6Bo/CFEREQmoVAAU3oBV6YBq38H2Fppx+MGNb840wcXmOhJoaj5be+mDODkrzVrU0UerlEV1Bxr/RcPt+esr10n3pTXMPA5UE87EVFr4f2tRERk7hysgTVPA3P8a2bTDmQDTzoDbwa37nlZoDWBpwPwn8NMnYVx1VvkGbAIbDQO7YKxoXO3KIeGztGMHHT6qBWus9k56Pk+tso5mphDY++jqX5pYdBzNDeHJrxPxjiHXvF62u3q/DaSiIjIXPV0AfY/Dxz7pebYvpUrKBZo9Fiam/z5224iIiIismDh3Y1zHt6DRkREREREZCZYoBEREREREZkJFmhERERERERmggUaERERERGRmWCBRkREREREZCZYoBEREREREZkJi9xmv6ioyNQpEBERNQvHMCKi9o0zaERERERERGaCBRoREREREZGZUIiImDoJIiIiIiIi4gwaERERERGR2WCBRkREREREZCZYoOnp3r17mDx5MhwdHdGjRw/87W9/M3VKrerBgweYN28eevToAWdnZwwaNAhffPGFJn78+HH4+/vDwcEBI0aMwM2bNzUxEUFcXBzc3d3h7u6OZcuWob2spP3pp59gZ2eHWbNmadostS8AIDk5GQEBAXB0dESvXr2Qnp4OwDL7JCcnB+PHj4ebmxs8PT2xcOFCVFZWAmj//bFt2zaEhITA1tYWc+bM0Yq15NpzcnIwYsQIODg4wN/fH2lpaca6JDIAjiPNw3FGfxyD9GfJY1RjzHIME9LLtGnT5KWXXpLi4mJJT08XFxcX+eGHH0ydVqtRq9WSkJAg2dnZUlVVJYcPHxYnJyfJzs6W27dvi4uLi+zdu1dKS0tl6dKlMmTIEM1zP/zwQ/Hz85NffvlFcnNzJSAgQD744AMTXo3hhIeHS2hoqMycOVNExKL74quvvpInnnhCTp8+LVVVVZKbmyu5ubkW2yfjxo2TqKgoKS0tlVu3bkn//v1ly5YtFtEfqampcuDAAVmwYIFERUVp2lt67UOHDpU33nhD7t+/LykpKaJUKqWgoMCYl0YtwHGkeTjO6IdjUNNY8hjVGHMcw1ig6UGtVou1tbVcvXpV0zZr1iyJi4szYVbGN2DAAElJSZHt27fLsGHDNO1qtVrs7OzkypUrIiIybNgw2b59uya+Y8cOrb/QbdWePXtk6tSpkpCQoBk4LbUvRGqubceOHTrtlton/v7+8ve//11zvHTpUnn11Vctqj9WrlypNbi15NqvXr0qNjY2olKpNPHQ0NB29z8GlsbSx5HGcJzRH8egpuEY1ThzGsO4xFEP165dg5WVFfz8/DRtQUFByMzMNGFWxpWfn49r164hMDAQmZmZCAoK0sRqlxbU9kfdeHvoK5VKhdWrV2Pjxo1a7ZbYFwBQVVWFCxcu4Pbt2+jduzd8fHywcOFClJaWWmyfLF68GMnJybh//z7y8vLwxRdf4Pnnn7fY/gBa9u8jMzMTvr6+cHZ2rjdObY+ljyON4TijP45BTccxqulMOYaxQNODWq2GUqnUalMqlSguLjZRRsZVUVGBmTNnIioqCv7+/o32R924UqmEWq1u02uW4+PjMW/ePHTv3l2r3RL7Aqj5H6
2KigqkpKQgPT0dly5dwsWLF5GYmGixfRIWFobMzEy4uLjAx8cHISEhmDRpksX2B9Cyfx+W/rnb3nAcaRzHGf1xDGo6jlFNZ8oxjAWaHpycnKBSqbTaVCqVVlXcXlVXV+OVV16BjY0Ntm3bBqDx/qgbV6lUcHJygkKhMF7iBnTp0iWkpaXhjTfe0IlZWl/Usre3BwDExMSgW7du6Ny5M5YsWYKjR49aZJ9UV1dj7NixiIiIQElJCe7cuYPCwkLExcVZZH/Uasm1W/LnbnvDcaRxHGeahmNQ03CMah5TjmEs0PTg5+eHyspK/PTTT5q2jIwMBAYGmjCr1icimDdvHvLz85Gamgpra2sAQGBgIDIyMjSPKykpwY0bNzT9UTfe1vvq66+/Rk5ODp544gl4enrivffeQ2pqKp566imL64tabm5u8PHxqfcD2hL75N69e/jll1+wcOFC2Nrawt3dHXPnzsXRo0ctsj9qteTaAwMDkZWVpfXbxvbUN5aC44h+OM40DcegpuEY1TwmHcOaegOdpXr55Zdl2rRpolar5dtvv233uziKiMyfP1+GDBkixcXFWu0FBQXi4uIiKSkpUlpaKsuWLdO6YfSDDz4Qf39/yc3Nlby8POnXr1+bvrG/pKREbt26pfkTGxsrkZGRUlBQYHF98aj4+HgJCQmR/Px8uXfvnoSGhsqqVasstk969uwp77zzjlRUVEhhYaFMmjRJZsyYYRH9UVFRIaWlpbJ8+XKZNWuWlJaWSkVFRYuvfciQIRIbGyulpaWyf/9+7uLYBnEc0Q/HmabjGNQ0ljxGNcYcxzAWaHq6e/euvPjii+Lg4CDdu3eXzz//3NQptaqcnBwBILa2tuLo6Kj5k5SUJCIix44dk759+4qdnZ2EhYVJdna25rnV1dXy5ptvipubm7i5ucmbb74p1dXVJroSw3t0dy0Ry+2L8vJyee2110SpVIqHh4fExMRIaWmpiFhmn1y8eFHCwsLE1dVV3N3dZcqUKZKfny8i7b8/EhISBIDWn4SEBBFp2bVnZ2dLWFiY2NnZiZ+fnxw7dszIV0YtwXGk+TjONI5jUNNY8hjVGHMcwxQiFnCXHxERERERURvAe9CIiIiIiIjMBAs0IiIiIiIiM8ECjYiIiIiIyEywQCMiIiIiIjITLNCIiIiIiIjMBAs0IiIiIiIiM8ECjagBc+bMwejRo02dRr2GDx+O6OhoU6dBRERmiOMXUdvG70EjakBRURGqq6vh5uYGAIiOjsb169fx9ddfGy2HxMRE7NixAzk5OVrt9+7dQ8eOHeHi4mK0XOpjij4hIqLH4/jVOI5fZM46mjoBInOlVCpb7bXLy8thY2PT7Od36tTJgNkQEVF7wvGLqI0TIqpXVFSUjBo1SkREEhISBIDWn507d4qISHFxsSxatEi8vLzE3t5egoODJTU1VfM62dnZAkCSkpJk3Lhx4uDgILGxsVJdXS3R0dHi6+srdnZ20rNnT/nTn/4kZWVlIiKyc+dOnXMmJCSIiEhYWJjMmzdPc47y8nKJi4sTLy8vsba2loCAAPn888+1rgeAvP/++zJr1ixxcnISHx8feffddx/bB+Xl5fLGG2+It7e32NjYiKenp7z88ssG65Pdu3fLyJEjxc7OTp588klJSkpq+htFRERaOH5x/KK2jQUaUQMeHeCKi4tlxowZMmzYMLl165bcunVL7t+/L9XV1TJ8+HAJCwuT9PR0uXHjhmzfvl2sra0lLS1NRB5+mHt7e8tnn30mN27ckKysLKmqqpKVK1fKmTNnJDs7Ww4dOiSenp6yevVqERG5f/++xMXFiY+Pj+acxcXFIqI7wC1dulQ6deoke/fulatXr8rbb78tCoVCk4NIzQDXtWtX+eijj+T69euyZcsWASD/+Mc/GuyDjRs3ire3t5w4cUJu3rwp586dkz//+c8G65Nu3bpJUlKS/Pjjj7Jy5UpRKBRy/vx5A76LRESWh+MXxy9q21igETXg0Q
FORGTevHkSFham9ZgTJ06Ira2t/Pbbb1rtc+fOlRdffFFEHn6Yr127ttFzbtq0SXr37q05XrdunfTo0UPncY8OcCUlJWJjYyPvv/++1mMmTZokI0aM0BwDkJiYGK3H9O3bV5YvX95gPosWLZIRI0ZIdXV1vfGW9smqVau0HjNs2DCZOXNmg/kQEVHjOH5x/KK2jfegEbXA+fPnUV5eDm9vb6328vJy9OnTR6vt6aef1nn+xx9/rLmJuqSkBJWVlaiurm5SDtevX0d5eTmee+45rfawsDC88847Wm3BwcFax97e3sjPz2/wtefOnYvw8HD07t0b4eHhCA8Px8SJEx97/0FT+mTYsGFax7///e9x/PjxBl+biIgMg+OXLo5fZC5YoBG1QHV1NZRKJc6fP68TqzsIODo6ah3v27cPr7/+OtavX4+wsDC4uLhg3759WLlyZbNyUSgUWsciotNWNyeFQvHYATU4OBjZ2dk4duwYTpw4gcWLFyM+Ph5nzpxpcAeupvRJXcJNZYmIjILjly6OX2QuWKAR6cnGxgZVVVVabSEhIfjtt99QVlaG/v37N+n1vvnmGwwaNAhLlizRtNXdjri+c9bVu3dv2Nra4uTJkwgMDNR6/UePm8vJyQmTJ0/G5MmTsWLFCnTr1g0nT57U/CayJX1y5swZjB8/XnN8+vRpBAQEtDhnIiJ6iOMXxy9qW1igEempZ8+e2LdvHzIzM+Hh4QFnZ2eMHDkSo0ePRkREBN59910EBQWhsLAQp06dgp2dHf74xz82+Hp9+/bFJ598gkOHDqF///44cuQI9u/fr3POf/3rXzh9+jT69OkDBwcHODg4aD3GwcEBixYtQnx8PLp06YLg4GDs27cPhw4dwrFjx1p0zRs2bICXlxeCg4Ph4OCAPXv2wMrKCn5+fgbpk08++QT+/v4ICQlBUlISTp8+jc2bN7coZyIi0sbxi+MXtTGmvQWOyHzVvcn67t27Mm7cOHFxcdHakrd2t6onn3y1JRTgAAABCklEQVRSrK2txcPDQ8aOHSvHjx8XkYc3FKenp2u9fnl5ubz66qvi5uYmzs7OMn36dNm6das8+s+yvLxcpk+fLm5ubgbZpvizzz7Tahs1apRERUU12AcffvihPPXUU+Ls7CyOjo4SEhIiBw8eNFif7N69W8LCwsTW1lZ69Oghu3fvfsw7QkRE+uD4xfGL2jaFCBfNEpFx5eTkoGfPnkhPT0doaKip0yEiItILxy8yhg6mToCIiIiIiIhqsEAjIiIiIiIyE1ziSEREREREZCY4g0ZERERERGQmWKARERERERGZCRZoREREREREZoIFGhERERERkZlggUZERERERGQmWKARERERERGZif8P5D0X/FKMxAcAAAAASUVORK5CYII=\n",
"text/plain": [
"<Figure size 864x288 with 2 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
    "# Plot cost J vs. iteration number.\n",
    "# Create one figure of size (12, 4) inches containing two side-by-side subplots,\n",
    "# assigned to the axes objects 'ax1' and 'ax2'\n",
"fig, (ax1, ax2) = plt.subplots(1, 2, constrained_layout=True, figsize=(12, 4))\n",
"\n",
    "ax1.plot(J_hist) # plot the full cost history on the first subplot 'ax1'\n",
"\n",
"#print(len(J_hist))\n",
"#print(J_hist[100:])\n",
    "x2 = 100 + np.arange(len(J_hist[100:])) # x2 = iteration numbers for the tail: [100, 101, ..., len(J_hist)-1]\n",
    "y2 = J_hist[100:] # y2 = cost values after the first 100 iterations\n",
"\n",
    "ax2.plot(x2, y2) # plot the tail of the cost history on the second subplot 'ax2'\n",
"\n",
    "ax1.set_title(\"Cost vs. iteration\"); ax2.set_title(\"Cost vs. iteration (tail)\") # subplot titles\n",
    "ax1.set_ylabel('Cost') ; ax2.set_ylabel('Cost') # y-axis labels\n",
    "ax1.set_xlabel('iteration step') ; ax2.set_xlabel('iteration step') # x-axis labels\n",
"plt.show() # Display 'fig' object, with ax1 and ax2 subplots"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
    "*These results are not inspiring!* The cost is still declining, and our predictions are not yet very accurate. The next lab will explore how to improve on this."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"<a name=\"toc_15456_6\"></a>\n",
"# 6 Congratulations!\n",
"In this lab you:\n",
"- Redeveloped the routines for linear regression, now with multiple variables.\n",
    "- Utilized NumPy `np.dot` to vectorize the implementations."
]
}
],
"metadata": {
"dl_toc_settings": {
"rndtag": "15456"
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.6"
},
"toc-autonumbering": false
},
"nbformat": 4,
"nbformat_minor": 5
}