@israeldi
Created December 19, 2019 07:27
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Stats507 Homework 10, May 2, 2019\n",
"### Israel Diego [(Go to Home Page)](https://israeldi.github.io/coursework/)\n",
"#### Discussed with: Hunter Zhang, Xiaochuan Guo\n",
"\n",
"#### israeldi@umich.edu\n",
"\n",
"This notebook shows solutions to homework 10 for Stats507\n",
"\n",
"## Table of Contents\n",
"\n",
"1. [Problem 1: Constructing a 3-tensor](#Problem-1:-Constructing-a-3-tensor)\n",
"2. [Problem 2: Building and training simple models](#Problem-2:-Building-and-training-simple-models)\n",
"3. [Problem 3: Building a Complicated Model](#Problem-3:-Building-a-Complicated-Model)\n",
"4. [Problem 4: Running Models on Google Cloud Platform](#Problem-4:-Running-Models-on-Google-Cloud-Platform)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Problem 1: Constructing a 3-tensor\n",
"#### Time Spent: 30 minutes\n",
"#### ([Back to Top](#Table-of-Contents))\n",
"You may have noticed that the TensorFlow logo, seen in Figure 1 below, is a $2$-dimensional depiction of a $3$-dimensional orange structure, which casts shadows shaped like a \"T\" and an \"F\", depending on the direction of the light. The structure is five \"cells\" tall, four wide and three deep.\n",
"\n",
"<img src=\"https://raw.githubusercontent.com/israeldi/israeldi.github.io/master/Stats507/Homeworks/israeldi_hw10/tensor_flow.png\" alt=\"Drawing\" style=\"width: 300px;\"/>\n",
"\n",
"Create a TensorFlow constant tensor `tflogo` with shape $5$-by-$4$-by-$3$. This tensor will represent the $5$-by-$4$-by-$3$ volume that contains the orange structure depicted in the logo (said another way, the orange structure is inscribed in this $5$-by-$4$-by-$3$ volume). Each cell of your tensor should correspond to one cell in this volume. Each entry of your tensor should be $1$ if and only if the corresponding cell is part of the orange structure, and should be $0$ otherwise. Looking at the logo, we see that the orange structure can be broken into 11 cubic cells, so your tensor tflogo should have precisely $11$ non-zero entries. For the sake of consistency, the $(0, 3, 2)$-entry of your tensor (using $0$-indexing) should correspond to the top rear corner of the structure where the cross of the \"T\" meets the top of the \"F\". **Note:** if you look carefully, the shadows in the logo do not correctly reflect the orange structure--the shadow of the \"T\" is incorrectly drawn. Do not let this fool you!\n",
"\n",
"**Hint:** you may find it easier to create a Numpy array representing the structure first, then turn that Numpy array into a TensorFlow constant. **Second hint:** as a sanity check, try printing your tensor. You should see a series of $4$-by-$3$ matrices, as though you were looking at one horizontal slice of the tensor at a time, working your way from top to bottom."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[[0 0 1]\n",
" [0 0 1]\n",
" [0 0 1]\n",
" [1 1 1]]\n",
"\n",
" [[0 0 0]\n",
" [0 0 0]\n",
" [0 0 1]\n",
" [0 0 0]]\n",
"\n",
" [[0 0 0]\n",
" [0 0 0]\n",
" [0 1 1]\n",
" [0 0 0]]\n",
"\n",
" [[0 0 0]\n",
" [0 0 0]\n",
" [0 0 1]\n",
" [0 0 0]]\n",
"\n",
" [[0 0 0]\n",
" [0 0 0]\n",
" [0 0 1]\n",
" [0 0 0]]]\n"
]
}
],
"source": [
"import tensorflow as tf\n",
"import numpy as np\n",
"import functools\n",
"\n",
"sess = tf.Session()\n",
"\n",
"dim = [5, 4, 3] # dimension of tflogo\n",
"num_elems = functools.reduce(lambda x,y: x * y, dim) # Total number of elements in tflogo\n",
"one_ind = [2, 5, 8, 9, 10, 11, 20, 31, 32, 44, 56] # Indices where 1's should go\n",
"\n",
"# Initialize tflogo array\n",
"tflogo_elems = np.zeros(num_elems)\n",
"tflogo_elems[one_ind] = 1\n",
"\n",
"# Create tflogo.constant and reshape\n",
"tflogo = tf.constant(tflogo_elems, dtype = 'int8')\n",
"tflogo = tf.reshape(tflogo, dim)\n",
"\n",
"print(tflogo.eval(session = sess))"
]
},
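{
"cell_type": "markdown",
"metadata": {},
"source": [
"The same construction can be sanity-checked without starting a TensorFlow session. This is a NumPy-only sketch using the indices above (`np.prod` stands in for the `functools.reduce` product):\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"dim = [5, 4, 3]\n",
"one_ind = [2, 5, 8, 9, 10, 11, 20, 31, 32, 44, 56]\n",
"\n",
"logo = np.zeros(np.prod(dim), dtype=np.int8)\n",
"logo[one_ind] = 1\n",
"logo = logo.reshape(dim)\n",
"\n",
"# Checks from the problem statement:\n",
"assert logo.sum() == 11       # exactly 11 cells in the orange structure\n",
"assert logo[0, 3, 2] == 1     # top rear corner where the \"T\" meets the \"F\"\n",
"```"
]
},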
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Problem 2: Building and training simple models\n",
"#### Time Spent: 4 hours\n",
"#### ([Back to Top](#Table-of-Contents))\n",
"In this problem, you'll use TensorFlow to build the loss functions for a pair of commonly used statistical models. In all cases, your answer should include placeholder variables `x` and `ytrue`, which will serve as the predictor (independent variable) and response (dependent variable), respectively. Please use `W` to denote a parameter that multiplies the predictor, and `b` to denote a bias parameter (i.e., a parameter that is added). \n",
"1. **Logistic regression with a negative log-likelihood loss.** In this model, which we discussed briefly in class, the binary variable $Y$ is distributed as a Bernoulli random variable with success parameter $\\sigma(W^{T}X+b)$, where $\\sigma(z)=(1+\\textrm{exp}(-z))^{-1}$ is the logistic function, and $X\\in\\mathbb{R}^{6}$ is the predictor random variable, and $W\\in\\mathbb{R}^{6}$, $b\\in\\mathbb{R}$ are the model parameters. Derive the log-likelihood of $Y$, and write the TensorFlow code that represents the negative log-likelihood loss function. **Hint:** the loss should be a sum over all observations of a negative log-likelihood term.\n",
"\n",
"#### Solution\n",
"First, writing $p_{i}=\\sigma(W^{T}x_{i}+b)$ for the success parameter of the $i$-th observation, the likelihood function for a sample $Y_{1},\\ldots,Y_{n}$ with $Y_{i}\\sim Bernoulli(p_{i})$ is\n",
"\\begin{align*}\n",
"L(W,b) & =\\prod_{i=1}^{n}p_{i}^{y_{i}}(1-p_{i})^{1-y_{i}}\n",
"\\end{align*}\n",
"Taking the log and negating, the negative log-likelihood is\n",
"\n",
"\\begin{align*}\n",
"-\\ell(W,b) & =-\\sum_{i=1}^{n}\\left[y_{i}\\log(p_{i})+(1-y_{i})\\log(1-p_{i})\\right]\n",
"\\end{align*}"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"%matplotlib inline\n",
"\n",
"x = tf.placeholder(tf.float32, shape = [None, 6]) # predictor (independent variable)\n",
"ytrue = tf.placeholder(tf.float32, shape = [None, 1]) # response (dependent variable)\n",
"\n",
"# Model Parameters\n",
"W = tf.Variable(tf.zeros([6, 1]))\n",
"b = tf.Variable(tf.zeros([1]))\n",
"\n",
"# logistic function\n",
"Y = tf.sigmoid(tf.matmul(x, W) + b)\n",
"\n",
"# negative log-likelihood loss function\n",
"loss = tf.reduce_sum(-(ytrue * tf.log(Y)) - ((1 - ytrue) * tf.log(1 - Y)))"
]
},
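{
"cell_type": "markdown",
"metadata": {},
"source": [
"The gradient-descent minimization of this loss can be sketched in plain NumPy on synthetic data standing in for the homework files (the `W_true` and `b_true` values below are made up for illustration, not the homework's parameters):\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"rng = np.random.default_rng(0)\n",
"\n",
"# Synthetic stand-in for the homework data (shapes match the problem: 2000-by-6).\n",
"W_true = np.array([1.0, 1.5, 1.5, 3.0, 4.0, 8.0])   # hypothetical parameters\n",
"b_true = -1.5\n",
"xtrain = rng.normal(size=(2000, 6))\n",
"ytrain = rng.binomial(1, 1.0 / (1.0 + np.exp(-(xtrain @ W_true + b_true)))).astype(float)\n",
"\n",
"def nll(W, b):\n",
"    z = np.clip(xtrain @ W + b, -30.0, 30.0)        # clip logits for numerical safety\n",
"    p = 1.0 / (1.0 + np.exp(-z))\n",
"    return -np.sum(ytrain * np.log(p) + (1 - ytrain) * np.log(1 - p))\n",
"\n",
"W, b, lr = np.zeros(6), 0.0, 1e-3\n",
"loss_before = nll(W, b)\n",
"for _ in range(2000):\n",
"    p = 1.0 / (1.0 + np.exp(-np.clip(xtrain @ W + b, -30.0, 30.0)))\n",
"    grad = p - ytrain                                # d(NLL)/d(logit), per observation\n",
"    W -= lr * (xtrain.T @ grad)\n",
"    b -= lr * grad.sum()\n",
"loss_after = nll(W, b)\n",
"```\n",
"\n",
"Printing `loss_before` and `loss_after` mirrors the before/after optimization check suggested in the problem."
]
},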
{
"cell_type": "markdown",
"metadata": {},
"source": [
"2. **Estimating parameters in logistic regression.** The zip file at http://www-personal.umich.edu/~klevin/teaching/Winter2019/STATS507/HW10_logistic.zip contains four Numpy `.npy` files that contain train and test data generated from a logistic model:\n",
" - `logistic_xtest.npy` : contains a $500$-by-$6$ matrix whose rows are the independent variables (predictors) from the test set.\n",
" - `logistic_xtrain.npy` : contains a $2000$-by-$6$ matrix whose rows are the independent variables (predictors) from the train set.\n",
" - `logistic_ytest.npy` : contains a binary $500$-dimensional vector of dependent variables (responses) from the test set.\n",
" - `logistic_ytrain.npy` : contains a binary $2000$-dimensional vector of dependent variables (responses) from the train set.\n",
" \n",
"The $i$-th row of the matrix in `logistic_xtrain.npy` is the predictor for the response in the $i$-th entry of the vector in `logistic_ytrain.npy`, and analogously for the two test set files. Please include these files in your submission so that we can run your code without downloading them again. **Note:** we didn't discuss reading numpy data from files. To load the files, you can simply call `xtrain = np.load('xtrain.npy')` to read the data into the variable `xtrain`. `xtrain` will be a Numpy array.\n",
"\n",
"Load the training data and use it to obtain estimates of $W$ and $b$ by minimizing the negative log-likelihood via gradient descent. **Another note:** you'll have to play around with the learning rate and the number of steps. Two good ways to check if optimization is finding a good minimizer:\n",
" * Try printing the training data loss before and after optimization.\n",
" * Use the test data to validate your estimated parameters."
]
},
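{
"cell_type": "markdown",
"metadata": {},
"source": [
"Once estimates are in hand, validating on the test set reduces to thresholding the fitted probabilities at $0.5$. A small sketch (the `np.load` calls follow the file names in the problem statement and are left commented out so the snippet stands alone):\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"# xtest = np.load('logistic_xtest.npy')   # shape (500, 6)\n",
"# ytest = np.load('logistic_ytest.npy')   # shape (500,)\n",
"\n",
"def test_accuracy(xtest, ytest, W_hat, b_hat):\n",
"    \"\"\"Fraction of test responses matched by thresholding sigma(xW + b) at 0.5.\"\"\"\n",
"    p = 1.0 / (1.0 + np.exp(-(xtest @ W_hat + b_hat)))\n",
"    return np.mean((p >= 0.5).astype(ytest.dtype) == ytest)\n",
"```"
]
},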
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Training data loss before optimization: 1386.2922\n",
"Training data loss after optimization: 716.14246\n"
]
},
{
"data": {
"text/latex": [
"$\\hat{W}_{train}=$[1.3297654, 1.4136479, 1.5654564, 3.2922843, 4.26599, 8.042851]"
],
"text/plain": [
"<IPython.core.display.Latex object>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/latex": [
"$\\hat{b}_{train}=$[-1.5289111]"
],
"text/plain": [
"<IPython.core.display.Latex object>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Accuracy of our parameters (based on test data): 0.852\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAZIAAAEbCAYAAADwPQLqAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAIABJREFUeJzsnXl8XGW9/9/fTDKTZm2WNume7gttoTSUAorsi4KIgoI/paBXrogKigu4gFevigsiqOBFqYgXWUQQLrJVVoG2NC3daZu0Tdt0SdJszT7JzPP745yTTpKZyUySmWSS7/v1mldmnvPMOc85OTOf+S7P9xFjDIqiKIrSX5KGegCKoihKYqNCoiiKogwIFRJFURRlQKiQKIqiKANChURRFEUZECokiqIoyoBQIRkiRKRIRIyIFA/R8c8Qkc0i4hWR14diDCMFETnL/l/m26+vFZGmAe7zByKyNczrh0TkuYEcYwBje05EHgqzvdv1GArs61Vpj+PaoRpHwFi29t0zcVEhiRH2B93Yjw4R2SMivxSR9AHs83UR+e0gDfEeYBMwE/h4iOM55/C9Hu1D9kUxVF+g9vleEWLzO8AEoCaGQ/gl8KEY7n/EICILgTuAL2L9Xx6P03FD/Tgc8f87FZLY8i+sG3kG8D3gS1g31XBgFvCqMeaAMaY2TL824FsiMi5O40o4jDFeY8wRE8PZvcaYJmNMLIVqJDHL/vsP+//SOpSDGQ3/OxWS2NJu38gHjDF/BR4BPhaqs4icKSJrRaTNNsvvFhG3ve0hrF81NwZYOkUh9uMRkV/b+2gTkTUi8gF7W5GIGCAbWBmB6f8aUA58P9yJisgCEfmniDSKSJWIPCoihQHbk+3zqbMfd4vI/YPtVhORqSLytD2ORhF5SkQm9+hzm31tmkTkYRG5Q0TKB3DMsBaaiOSIyNsi8pJjkfZ1vYLsI6h7RERuEpGD9jX9k4ikBWwLeR8E9Al5z9nb02wrsMne/p1+XqOPi8gWEWkXkQMi8l0RkR7bN4tIq4jUisgbIlJgb5siIs/Y7S0iskNErgp1nYCn7Zd++14Pasn2vKZOnz6uqYjILSJSap9LhYj81N681/67zr4fXg9xnCQR+b59Hdrt63JZwHbHsvmEiKyyz3m7iJwf7XWPFyok8aUVSAm2QUQmAS8A7wFLgM8DVwPOTXoTsBr4E5aVMwE4EOI4Pwc+BXzO3tcW4EURcd4zAWgBbqZv098P3Ap8UURmhhj7BOBNYCuwDDgPyACeFRHnHvsGcC3wH8ByrHvv02GOGzX2F9M/gALgHOBsYCLwD+dLy/4CugP4LnAy8D7w9cEcR48xOdfmIHCpMaY5wusVCR8EFtrv/xRwOdZ94hDuPojkngPLgj4f+ARwrt3vzCjGiIgsBf4GPAUswrqfbgO+bG8vBB4D/gzMt/f/l4Bd3AekYf0/T8C6b+tDHO6XwBfs587nJBr6uqY/wfpR9VN7LFdy/HO4zP57kX3coC5je3/fBL6NdT2eBp4SkZN69PsxcC9wIrAOeExEMqI8n/hgjNFHDB7AQ8BzAa+XAUeBx+3XRYABiu3XPwbKgKSA91wLtANp9uvXgd/2cdx0wAtcE9DmAnYD/x3Q1gRcG+k5YFkmj9nPz7LHnm+//iHwSo/35th9ltmvDwO3BmwXYAfw+kCua49t5wM+oCigbQaWGJ5nv14N/L7H+14Gyvs4rgGuCLGt5/W41r6+s4A9wP09/q+RXK8fAFsDtvd8/RDWF1hyQNsfgH9Feh/0dc9hiVs78P8CtmdgfYk/FOZa9bwej2C5UelxPhX285Pt/tNC7G8zcEcU98gVgOnrvunHNc3AcvV+McRxiwj4TIc5zkHg9h59Xgf+t8d+/jNg+yS77QPRfF7i9VCLJLZcZLsE2rC+wN4EvhKi73xgtTHGH9D2FuDmuM83EmZiWT1vOw3GGJ99/AVR7Kcn3wKulO
BZZkuBM+1zbRIrY8n5lTZTRLKBQuDdgDEZrF9Zg8l84JAxpjzgOHuAQxw/93mB47BZO8jjAOv/9hbwgjHmhh7/17DXK4pjbDfGdAa8PgSMD9hPX/dBX/fcTPv56oB9NGFZNtEwP3AcAceZJCJZWEkf/wK2isjfReQG6R6Tuwf4noisFpH/ti2cWBHumi4APMAr/d25fb4TCX49en4+N/cYBwFjGVaokMSWN4GTgLlAqjHm48aYqhB9BesXRzCiCeI6fudg7+l3MNgYsw74O/CzIJuTgH9inWvgYzYQ6JeOdanpSK9hPEped2BZOh8WkWk9tkV6vSI5RiCG45/pSO6Dvq6XhNgWLWGPYwvcBfZjM5aLrVRETrQ7PAhMx3LrzgHesWMh0eCn9/kEczNHck0Hg0g+n11jsX94wTD9zh6WgxpBtBhjyowx+4wxPW/QnmwHTuvhI/8Alntit/3ai+WeCEeZ3a8rqCoiLuA0+xgD4TtYPuSLerRvwPIX77PPN/DRaIxpAI5w3IfsxDNOGeB4erId61duUcBxZmD9AnTOfUfgOGx6vh4MDJab6C3gNRGZGrAt7PUapONHch/0dc+VYX2ZLQ/YRzpWDCEatgeOI+A4Fc75GovVxpj/wrovDmHFKLC3VxhjHjDGfBK4Hbg+yjFU0zte0jMm0RfbsVx954bY7rX/hvyMGmOOYZ1bsOsx0M/nkKFCMny4D+sL7z4RmS8iHwHuxIqJtNh9yoFldlZHfrDArDGmGcsnf6eIfFhE5tuvC+xj9BtjTBnwAN2DjwC/w8oCe1xEThWRGSJynog8ICKZdp97sNKILxeRucBdWB/srl9hIvJlEdkRwVCyROSkHo8iLPfIJuAREVlqu+EewfrifjVgHNeKyOdEZLaIfAs4lcislKIgx80K1dl2Ga3AmmfyeoCYRHK9BkSE90HYe852Yz0I/ExEzheRE4CV9P1jpid3AR+ys5fmiMj/A27BSgZARJaLyPdE5BT7Gn0UmIL9xSoi94jIRfZ1Ognrh0y0X7qvAkvs//ss+/9+RjQ7sEXvHuCnInKdiMwUkWUicoPdpQoroeZCESmwXbrB+AXwDRG52r4eP8T6gXZXlOc0fBjqIM1IfRAmKGxvL6JHYA4rW2Ut1q+eSuBuwBOwfQ6Wv7rFfm9RiH17gF/b+2gH1tAjSEeUwfaAtvFAIwHBVLt9NvAkUIf1YdoJ/AZw29uT7THV231+Ze//hYB9/IAeQdIQYzJBHk/a26diZW412o+ngck99vEdrA99E/Aw1pfn+30cN9gxDXAJIYLtAe91Af+L9St/aoTX6wf0HWzvK3gcyX3Q1z2Xbl+jJvuafR/L/fZQmGvV7XrYbR/Hiq14seJB3wXE3jYfK3vMGWcZ8K2A9/4GKMUKdFdjZXhNCnP8XsH2gOtzGGjAEtGf9OOaJmFlne0JOJcfB2z/D2A/VtLH62H28X37vV77unws3HdDwD0YNOFjqB/OP1JR4o6IbADeNsaESkCI1ziexsrUuXQox6EoiUryUA9AGR3YAecLgTew7rvrsfLjo/V1D3QcacANwItAJ9b8iMvsv4qi9AMVEiVe+IFrsPzDSVg+7ouNMSVxHocBLsZyb43Bcpl81hjzdNh3KYoSEnVtKYqiKANCs7YURVGUATEqXFv5+fmmqKhoqIehKIqSUKxfv/6oMabPyt9xExIRmYKVRliI5S9/wBhzj4j8AriU45OgrjPG1NvvuQ1rlqsP+Kox5iW7/SKsfG4X8EdjzJ3hjl1UVERJSbxd8YqiKImNiOyLpF88XVudwC3GmPlYM2VvFJEFwCpgoTFmMbALqyoo9rarsGYAX4Q1acplz879HVbAdAFwtd1XURRFGQLiJiTGmMPGmA3280as8t2TjDEvm+NF0tYAztoRl2FVm203xuzFmqS0zH6UGWP2GGO8WJOTLkNRFEUZEoYk2G6Xs1hC76qrn8Oa4QpW2eTA9TYq7LZQ7T2Pcb2IlIhISX
V19eAMXFEURelF3IXEXpjl78DNxipg5rR/F8v99YjTFOTtoSqS9sphNlaBt2JjTPG4cbpKrKIoSqyIa9aWiKRgicgjxpinAtpXYNUsOtccn9hSgVW4zWEyx2vyh2pXFEVR4kzcLBK7bPiDWMXxfhXQfhHWkpMfNcer3AI8C1wl1rrT07GK3L2LtRjSbBGZLtba0lfZfRVFUZQhIJ4WyRnAZ4EtIrLRbvsO1prEHmCVpTWsMcZ80RizTUSewCql0QncaKwFcBCRLwMvYaX/rjTGbIvjeSiKoigBjIoSKcXFxSbW80je21+HK0lYPHlsTI+jKIoSL0RkvTEm2PLa3dASKYPEbU9t4c4XIlmTSVEUZWQxKkqkxJpOn5891c0kuwZzSWdFUZTEQC2SQWB/bQten5+mts6+OyuKoowwVEgGgV2VTQA0tauQKIoy+lAhGQTKqhoBaFSLRFGUUYgKySBQWmVZJO2dfjp8/iEejaIoSnxRIRkEHNcWQLO6txRFGWWokAwQn9+wu7qJnLQUQN1biqKMPlRIBsiB2ha8nX6WTM0BNOCuKMroQ4VkgOyqtALtS6ZYM9r7IyR+v+Gd3UcHdVyKoijxQoVkgDiB9pOm9l9I3tldw6f/sJb3Dx/ru7OiKMowQ4VkgJRVNTExO5XCrFSAfk1KrG3xAnCstWNQx6YoihIPVEgGyK7KRmYVZJKRalWb6Y9F0mK/p71TU4cVRUk8VEgGgM9vKKtqYvb4DDI8tpD0wyJp9voA8KqQKIqSgKiQDICDda20d/qZU5BBulstEkVRRicqJAPAydiaNT6TpCQh3e3ql5B0WSQ+36COT1EUJR6okAwAJ2Nr1vgMADJSk/vl2mrx2hZJh1okiqIkHiokA6C0qpHCrFSyx1iz2jM8yf2zSNodi0SFRFGUxCNuQiIiU0TkNRF5X0S2ichNdnuuiKwSkVL7b47dLiJyr4iUichmETk5YF8r7P6lIrIiXufQk9LKJmYXZHS9zkhNobEfQtLaoRaJoiiJSzwtkk7gFmPMfGA5cKOILABuBV4xxswGXrFfA1wMzLYf1wP3gyU8wB3AqcAy4A5HfOKJ387YctxaAJme5H4VbXQskvZOjZEoipJ4xE1IjDGHjTEb7OeNwPvAJOAy4M92tz8DH7OfXwY8bCzWAGNFZAJwIbDKGFNrjKkDVgEXxes8HA7Wt9La4WNOQWZXW7rHNaAYiab/KoqSiAxJjEREioAlwFqgwBhzGCyxAcbb3SYBBwLeVmG3hWrveYzrRaREREqqq6sH+xQotRezmh1gkWR4UgYUI9H0X0VREpG4C4mIZAB/B242xoQrLiVB2kyY9u4NxjxgjCk2xhSPGzeuf4MNQ2ll94wtgMzUZBrboi9z0pW1pUKiKEoCElchEZEULBF5xBjzlN1cabussP9W2e0VwJSAt08GDoVpjyulVU2My/QwNs3d1ZbhSabZ68OYXroWFmceiQqJoiiJSDyztgR4EHjfGPOrgE3PAk7m1QrgmYD2a+zsreVAg+36egm4QERy7CD7BXZbXCmtbGROQMYWQLonGZ/f0BZl9pUzs11jJIqiJCLJcTzWGcBngS0istFu+w5wJ/CEiHwe2A9caW97HvgwUAa0ANcBGGNqReRHwDq73w+NMbXxOQULYwylVU18snhKt3ancGNjewdj3K6I99XSoVlbiqIkLnETEmPMWwSPbwCcG6S/AW4Msa+VwMrBG110HGpoo8Xr6xYfASv9F6zCjeMzg72zN20dfhxPmFokiqIkIjqzvR84NbYCU3+BrgrAThZWJDR7j2d5aYxEUZRERIWkH5TZGVuzx/eOkYDl2oqUlgDRUYtEUZRERIWkH5RWNZKf4SYn3d2tPTM1+jVJulskGiNRFCXxUCHpB7sqm5gdJAjStbhVFJMSnTkkKS7Roo2KoiQkKiT9oLymmRnj0nu192e5XSeeMjbNrUUbFUVJSFRIosTnN9S3dJCX4em1bSAWSW6aWy
0SRVESEhWSKDnWagXSx9prkATiSU4iOUmiipG0eB2LJEUtEkVREhIVkiipt4UkJ723kIiItUpiNK4tW0hy09UiURQlMVEhiZK6Fi8AY8e4g27P8ES33K5THsWKkWjWlqIoiYcKSZQ0tFgWSXZab4sEol9u17FIssekqEWiKEpCokISJfWtjkUyOELS0t5JmttFakoSHT6D3x9d5WBFUZShRoUkSuqa7RhJWgjXVj9iJGnuZDzJVpFHtUoURUk0VEiixAm2Z4WzSKLK2uok3ePCnWz9KzRzS1GUREOFJEoaWrxkpSbjSgpeyDgzWouk3bFIbCHxacBdUZTEQoUkSupbO3rV2Aok3R2dkLR2dJLuVotEUZTERYUkSupaOkIG2sGKkbR4ffgiDJo3t/tI8xy3SDRGoihKoqFCEiUNLV6yQwTaIfoyKS1eyyLxqEWiKEqCokISJfWtHeSEmEMCAaXkIxSS4zESzdpSFCUxiZuQiMhKEakSka0BbSeJyBoR2SgiJSKyzG4XEblXRMpEZLOInBzwnhUiUmo/VsRr/A51zd6wrq30rlUSI7dI0rrFSDTYrihKYhFPi+Qh4KIebT8H/ssYcxJwu/0a4GJgtv24HrgfQERygTuAU4FlwB0ikhPzkdv4/IZjbZ0RubYaI0wBbvb6SPMEuLZ0lURFURKMuAmJMeZNoLZnM5BlP88GDtnPLwMeNhZrgLEiMgG4EFhljKk1xtQBq+gtTjEjXOVfh2hcWx0+P95OP+nu5C6LRJfbVRQl0Uge4uPfDLwkIr/EErXT7fZJwIGAfhV2W6j2XojI9VjWDFOnTh2UwYar/OuQ4bG2RTIp0Skhn+Z2dcVI1CJRFCXRGOpg+w3A14wxU4CvAQ/a7cFm+5kw7b0bjXnAGFNsjCkeN27coAy2r8q/AOkeSxAiiZG02kKS7gmwSHRCoqIoCcZQC8kK4Cn7+d+w4h5gWRpTAvpNxnJ7hWqPC31V/gXItC2SxgiEpNleHTFN038VRUlghlpIDgEfsp+fA5Taz58FrrGzt5YDDcaYw8BLwAUikmMH2S+w2+KCU/k3VMFGOG6RROTastdr7xYj0fRfRVESjLjFSETkUeAsIF9EKrCyr74A3CMiyUAbdkwDeB74MFAGtADXARhjakXkR8A6u98PjTE9A/gxw6n8Gy7YnuxKYkyKi6b2jj7312WReNQiURQlcYmbkBhjrg6xaWmQvga4McR+VgIrB3FoEVPf2oFI6Mq/DlYp+b5jHS1dri2dkKgoSuIy1K6thMKq/JsSsvKvQ6SLWzV3ubZcpLisfeqEREVREg0Vkiiob+1gbJhAu4O1Jknfrq0ui8STjIjgSU6iXS0SRVESDBWSKOir8q9DfywSAHdyksZIFEVJOFRIoqCvyr8OGanJEZVIae1wJiRaoSpPsktjJIqiJBwqJFHQV+VfhwxPcldGVjia2ztJcUlX6q9HLRJFURIQFZIo6Kvyr0Ok67a3eH1d1ghYQqIWiaIoiYYKSYREUvnXIcNet93KYg5Nc3tnV3wEnBiJZm0pipJYqJBEiFP5N1LXVofP9FmAscVrLbProBaJoiiJiApJhDiVfyNN/4W+Czc224taOWjWlqIoiYgKSYREUvnXIdJ121vafd2ERLO2FEVJRFRIIiSSyr8OGamRrZLY0tFJekCw3Z2cRHunxkgURUksVEgiJJLKvw6Z0VgkPWMkurCVoigJhgpJhERS+dfBsUgiiZH0ytpSIVEUJcFQIYmQSCv/grXiIUQaI1GLRFGUxEaFJEIirfwLx11b4WIkxhjLIvGoRaIoSmKjQhIhkVb+heOurXAWSXunH7+hh0XiUotEUZSEQ4UkQiKt/AswJsVFkoRfbteJn/SaR6JZW4qiJBgqJBHS0OJlbAQZWwAiQnofpeRbvE7l38B5JEl0+Ax+f/jSKoqiKMOJuAmJiKwUkSoR2dqj/SsislNEtonIzwPabxORMnvbhQHtF9ltZSJya7zGH4
1rC6w4SSRCku7pPo8EdLldRVESi7it2Q48BPwWeNhpEJGzgcuAxcaYdhEZb7cvAK4CTgAmAv8SkTn2234HnA9UAOtE5FljzPZYDz7Syr8OGanhKwA3e3u7tpx129s7/KSmuIK+T1EUZbgRN4vEGPMmUNuj+QbgTmNMu92nym6/DHjMGNNujNkLlAHL7EeZMWaPMcYLPGb3jSnRVP516GuVxJb20BZJu0/jJIqiJA5DHSOZA3xQRNaKyBsicordPgk4ENCvwm4L1d4LEbleREpEpKS6unpAg4ym8q9DXzGS4BaJLSRauFFRlARiqIUkGcgBlgPfBJ4QEQGCTdYwYdp7NxrzgDGm2BhTPG7cuAENMprKvw6ZqX3FSKxt6T0mJILGSBRFSSziGSMJRgXwlLFWgHpXRPxAvt0+JaDfZOCQ/TxUe8yIpvKvQ1+rJDbbrq00j1okiqIkNkNtkfwDOAfADqa7gaPAs8BVIuIRkenAbOBdYB0wW0Smi4gbKyD/bKwH6VT+jcYiyfCkRG2RaNaWoiiJSNwsEhF5FDgLyBeRCuAOYCWw0k4J9gIrbOtkm4g8AWwHOoEbjTE+ez9fBl4CXMBKY8y2WI/dqfwb6TwSgAyPi6b2Tvx+Q1KQsiqORTImJVjWlgbbFUVJHOImJMaYq0Ns+kyI/j8Gfhyk/Xng+UEcWp9EU/nXwSmT0tLh61roKpDWDp81Az5AZNQiURQlERlq11ZCEE3lX4cMj9U3VJykub17wUbQGImiKImJCkkERFP51+F44caOoNtbvN1LyINaJIqiJCYqJBEQbXkU6LuUfHN7Z7c5JBAQI9HCjYqiJBAqJBEQTeVfB2fGuhNU70mL19dtVjsEzCPRUvKKoiQQKiQREE3lX4cMT3jXVrO3t0XSVSJFhURRlARChSQC+uXaSg3v2mpp93WbQwJqkSiKkpgMSEhEZIyInCci0wZrQMORaCv/QqBFEiJGohaJoigjhKiEREQeEpEv2c/dWLPNXwZ2isjFMRjfkONU/o3WtXU8RhJcSFq9vm7lUQDcLhUSRVESj2gtkguBNfbzjwKZQCHwA/sx4jjWj4KNYFkX7uQkGsNYJD1dWyKiy+0qipJwRCskOYCzZshFwN/tNUQeAxYM5sCGC/2p/OuQGaJwo89vaOvw95pHAlacRGMkiqIkEtEKyRFgoYi4sKyTf9ntGUDw9KQEpz+Vfx0yQpSS7yrY6Om9CqInOUldW4qiJBTR1tpaCTyOVbrdB7xit58K7BjEcQ0b+lP51yHdHdwicdZrD26RuNQiURQloYhKSIwxPxSRbcBU4G/2crdgVej92WAPbjjQn8q/DqEsEicAH8wicatFoihKghF19V9jzN+DtP15cIYz/OhP5V+HTE8yR4619WoPb5Ek4dVgu6IoCUS06b+fFJELAl7fLiIVIvKSiEwY/OENPf2p/OsQOkZiCUW6Wy0SRVESn2iD7T9wnojIycB3gHuBFOCuwRvW8KE/lX8dxmV4ONLQRmePar7NdrB9TBAh0awtRVESjWiFZBqw035+OfAPY8zPga8D5w7mwIYL/SmP4rBocjbtnX5Kq5q6tbfYhRx7Fm0EtUgURUk8ohWSNqxJiGAJh5P+2xDQPqKoa+noV6AdYNGkbAC2VDR0a3cskp4lUkCzthRFSTyiFZJ/A3eJyPeBYo4veTsHOBDujSKyUkSq7PXZe277hogYEcm3X4uI3CsiZSKy2XajOX1XiEip/VgR5fijpqEl+jpbDkV56WR6ktl8sL5be4uTtRUk2O526cx2RVESi2iF5MuAF7gC+KIx5pDdfjHwUh/vfQhrNnw3RGQKcD6wP6D5YmC2/bgeuN/umwvcgTVvZRlwh4jkRHkOUTEQ11ZSkrBwUnYQi8TO2go2ITFFXVuKoiQWUQmJMabCGHOpMeZEY8zKgPabjTFf7eO9bwK1QTbdDXwLMAFtlwEPG4s1wFg7K+xCYJUxptYYUwesIog4DSb9qfwbyO
LJ2bx/uLGbu6rF20lyknQVaQzE7dJgu6IoiUXU80gAROQcrNpaBthujHmtn/v5KHDQGLNJpFtW1CS6u8oq7LZQ7cH2fT2WNcPUqVP7M7x+V/4NZNHkbLw+P7sqG1lox0ya232kuV30OGdALRJFURKPqIRERCYBTwNLscqkAEwUkRLg8gBXVyT7SgO+C1wQbHOQNhOmvXejMQ8ADwAUFxcH7dMX/a38G8jiSWMB2FzR0CUkrUGW2XVwuzTYrihKYhFtjORerBpbs4wxU4wxU7DiGD57WzTMBKYDm0SkHJgMbBCRQixLY0pA38lYwhWqPSZ4UpL42ScWccas/H7vY0ruGLLHpLAlIOAebFGrwGNqsF1RlEQiWiE5H7jRGLPXaTDG7AG+am+LGGPMFmPMeGNMkTGmCEskTjbGHAGeBa6xs7eWAw3GmMNYAf0LRCTHDrJfQN9B/n6T5k7mU6dMZU5B/zObRYTFk7PZHBBwb/H6gpZHAStG0uEz+P39MqIURVHizmCt2d6nL0ZEHgVWA3PtsiqfD9P9eWAPUAb8AfgSgDGmFvgRsM5+/NBuG9YsmpTNziONtHVYlkZze3iLBMDrU/eWoiiJQbTB9leAe0XkamPMAQARmQrcA7wa7o3GmKv72F4U8NwAN4botxKrnH3CsHhyNp1+w/uHj7Fkag4tXh/jMj1B+wYut5uaElxsFEVRhhPRWiRfBdKAPSKyz45t7AbGAF8Z5LGNGBZNtgLuWw5a7q3wMRKrXeMkiqIkCtGuR3IAOFlEzgfmYWVRbcdyQf0K+OSgj3AEMDE7lbx0d1ecpKXdF3RWO4DHtkg0c0tRlEShX/NIjDGrsCYDAiAiJwKfGKxBjTREhEWTj89wb/Z2Bp3VDsdjJDqXRFGURGGwgu1KHyyelE1pVSMt3k5rHkmYrC1Qi0RRlMRBhSROLJo8Fr+Bjfvr6fQbtUgURRkxqJDEicWTrVnta/bUAJAWIiPLk2y1q0WiKEqiEFGMRESe7aNL1iCMZURTkJXK+EwPa/ZY017SQpVISXYsEs3aUhQlMYg02F4Twfa9ffQZ9SyenM2bu44CwdciAWupXVCLRFGUxCEiITHGXBfrgYwGFk0ay7/erwKCr0UCgRaJColU2nKoAAAgAElEQVSiKImBxkjiiBMngXAWicZIFEVJLFRI4ohTRh6Cr9cOGiNRFCXxUCGJI+MyPUzMTgUIuR6JxkgURUk0VEjizCLbvZXep0WiQqIoSmKgQhJniqfl4k5OIjM1+KqLHhUSRVESjH7V2lL6z4rTizh73njGhLJIXCokiqIkFmqRxBl3chKzxmeE3C4iuJOTNEaiKErCoEIyDPG4Br5u+8H6Vt4/fGyQRqQoihIaFZJhiCdl4BbJnS/s4Mt/3TBII1IURQlN3IRERFaKSJWIbA1o+4WI7BCRzSLytIiMDdh2m4iUichOEbkwoP0iu61MRG6N1/jjiduVNOAYSWVDG1XH2gdpRIqiKKGJp0XyEHBRj7ZVwEJjzGJgF3AbgIgsAK4CTrDfc5+IuETEBfwOuBhYAFxt9x1ReFJcAxaSo83tNLZ36sRGRVFiTtyExBjzJlDbo+1lY0yn/XINMNl+fhnwmDGm3RizF2sp32X2o8wYs8cY4wUes/uOKNyuJLwDFIDaZi8Adc0dgzEkRVGUkAynGMnngBfs55OAAwHbKuy2UO29EJHrRaREREqqq6tjMNzY4UkZmGurw+envsUSEEdQFEVRYsWwEBIR+S7QCTziNAXpZsK092405gFjTLExpnjcuHGDM9A4YVkk/ReSupbj4qFCoihKrBnyCYkisgK4BDjXGOOIQgUwJaDbZOCQ/TxU+4jBk5JEW0f/haSm6bh41DRrwF1RlNgypBaJiFwEfBv4qDGmJWDTs8BVIuIRkenAbOBdYB0wW0Smi4gbKyDf1+qNCcdALZJAK6ROLRJFUWJM3CwSEXkUOAvIF5EK4A6sLC0PsEpEANYYY75ojNkmIk
8A27FcXjcaY3z2fr4MvAS4gJXGmG3xOod44Ul2DSjb6mjTcStEXVuKosSauAmJMebqIM0Phun/Y+DHQdqfB54fxKENOwZaIsURj+QkoUaFRFGUGDMsgu1KdzzJA8vaqmnykiQwNTetW+BdURQlFqiQDEMGapHUNHvJTXeTn+HpFnhXFEWJBSokwxArRjIQi6SdvHQPuelujZEoihJzVEiGIYMRI8lNd5Ob4VbXlqIoMUeFZBjiSU7C6/Pj9weda9knNc1e8jLc5Ka5qWvp6Pd+FEVRIkGFZBjirNvu9fXPKrFcW25y0934/IaGVq23pShK7FAhGYYMZN12b6efY22d5GV4yMtwA1Cr7i1FUWKICskwxJNirefenziJExPJTXeTk2YLiQbcFUWJISokwxCPy7FIop/d7sxqz8+wXFvAqE8Brm/xsqe6aaiHoSgjFhWSYYgnxY6R9MMicayP3PQA19Yot0h+8vz7XP2HNUM9DEUZsaiQDEPcrv7HSBzrIy/juGtrtKcAr95TQ+Wxdto6dLVIRYkFKiTDkIFYJE5trbx0N6kpLtLdrlHt2jrS0MaB2lYAKo+1DfFoFGVkokIyDHG7rGB7/yySdpKThKzUFAByM9zUjuI1SdaVH1/d+UiDComixAIVkmHIQGMkOelukpKsxSRz09zUtozeeSTdhEQtEkWJCSokwxD3gLK2vOTZ2VqAXW9r9Fok7+6t5cQpYwG1SBQlVqiQDEMGZpG0d2VrgZW9VTtKYyQNrR3srGzknLnjSXe71CJRlBihQjIMGVDWVrOXvHRP1+vc9JRRO7N9w746jIFTpudQkJ2qwXZFiRFxExIRWSkiVSKyNaAtV0RWiUip/TfHbhcRuVdEykRks4icHPCeFXb/UhFZEa/xxxNnZnt/XFu1Td6uiYhgWSRtHX5avJ2DNr5EYV15LclJwpIpORRmpaprS1FiRDwtkoeAi3q03Qq8YoyZDbxivwa4GJhtP64H7gdLeLDWej8VWAbc4YjPSMKxSKJ1bbV3+mhs7yQ/wLWVN4pnt68rr2XhpGzGuF0UZquQKEqsiJuQGGPeBGp7NF8G/Nl+/mfgYwHtDxuLNcBYEZkAXAisMsbUGmPqgFX0FqeEx4mRROvacmaw52Ucd23lpI/OSYltHT42HWjglCLrd0ZhVipVje1aUl9RYsBQx0gKjDGHAey/4+32ScCBgH4Vdluo9hFFf2MkjtWR2yNrC45PVBwtbDnYgNfn55SiXAAKs1Pp9BuOjuIMNkWJFUMtJKGQIG0mTHvvHYhcLyIlIlJSXV09qIOLNf0tI++IRTDX1mjL3HLmjxTbQlKQlQpAZYMKiaIMNkMtJJW2ywr7b5XdXgFMCeg3GTgUpr0XxpgHjDHFxpjicePGDfrAY4mI4HZFv9xujV35NzddXVvr9tYya3xGl0U2IdsSksMNrUM5LEUZkQy1kDwLOJlXK4BnAtqvsbO3lgMNtuvrJeACEcmxg+wX2G0jDk9yUtRZW4EFGx2yUpNJccmocm35/YaSfXVd8RGwYiSg9bYUJRYkx+tAIvIocBaQLyIVWNlXdwJPiMjngf3AlXb354EPA2VAC3AdgDGmVkR+BKyz+/3QGNMzgD8icCf3wyJp9pLiEjI9x/+tIkJOmntUubZ2VjbS2NbZFR8BKwHBlSQ6KVFRYkDchMQYc3WITecG6WuAG0PsZyWwchCHNiyxLJLoXVt56R5EuoeSctPdo8oiKbHjI4FC4koSCjI9HNYUYEUZdIbataWEoD8WSW1z98mIDrnp7lEVI3m3vI7CrFQm54zp1q6z2xUlNqiQDFM8ya6oYyRHm73d4iMOVuHG0SEkxhjW7a3llOm5vSwznd2uKLFBhWSY0j+LpL1b5V+H3HR3V0bXSKeirpUjx9q6BdodCrJSqTw2Oq6DosQTFZJhSv9iJN5us9odctPdHGvrpMMXfRHIRGNdkPiIw4TsVJraO2lsG73rsyix5bUdVby2s6rvji
MMFZJhSrQWSavXR4vXFzRGkjeK5pKsK68jMzWZOQWZvbYVZmsKcLw4WN/KlooGrLyZ0cOtT23muj+t4/ZnttLWEZ1rur7FywNv7qYzAX/wqZAMU6K1SGrs0h/5QWIkzqTE0RAn2XSgniVTc3Al9S6C4MxuP6Kz22POTY++x6W/fYtz7nqDe18p5UBty1APKeYca+ug8lg7s8dn8PDqfXzi/ncoP9oc8fuf3XSInzy/g3+XHo3hKGODCskwJVqLxBGJwFntDrmjREg6fX7KqpuYX9jbGoHjs9t1Lklsaevwsaming/Myqcgy8OvVu3igz9/jSt//w6v7qgc6uHFjN1VTQB888K5/PGaYirqWrnkN2/x3OagxTd6UVppvf/5LYdjNsZYoUIyTIk2ayvYrHYHZ6GrkS4k5TUteDv9Qd1aEGiRaJmUWLLpQD0dPsO1pxfx2PWn8fat5/DNC+dyuKGNLz2yISFdN5FQagvJ7IJMzltQwPM3fZDZBRl8+a/vcfeqXX2+v8x+/8vbKxMunqlCMkyJ1iLpKtg4ii2SnUcaAZgbwiJJTXExNi1FLZIYU7KvDoCl06zMuUljx3Dj2bP4+vlzaOvwszcKd08isbuqCbcriSn2/KVJY8fwxH+extlzx/Hw6vI+40WlVU0UZHloaO1g9e6aOIx48FAhGaZEHSNxCjYGsUjGpqUAo0BIKhtJEpg1PiNkH2suicZIYsn6fXXMGp/RFZtzWDAxC4Dth48NxbBiTllVE9Pz00l2Hf9aTXElcdbc8dS1dFDVGPq+q2/xcrSpnc+cOo10t4sXtiaWe0uFZJjiSXZFHSPxJCeR7nb12pbiSiJ7TMrIF5IjxyjKTyc1pfc1cCjMTuXIMXVtxQq/31BSXkvxtN7zeGaOy8DtSmL7oREqJNVNQX/EzLMt5B22xRz0vbZb64RJWZw7v4CXtlUmlAtQhWSY4o7SIjna5CUv3d1rNrdD3iiot7Wrsom5IeIjDmqRxJay6iaOtXV2ubUCSXElMbsgY0RaJG0dPg7UtjAzqJBYltiOMOfdFV8Zn8nFCwupbfbybnni1KNVIRmmeJKT8Pr8Eefh1za3B52M6JCT7qZuBAtJq9dHeU1zyPiIQ0FWKjXN7cMymFkdxvWRKJSUW/GR4iATQgEWTMhi+6FjI25+yZ7qZvwGZgcRkuy0FCZkp/ZpkaSmJDFp7BjOmjueMSkuXthyJJZDHlRUSIYp7ihXSawJUbDRYaTX2yqrasIY+rRIJmSnYgxh/dVDwas7Kjn1J/+itDL0l00iULKvlrx0N0V5aUG3L5iYRU2zd0SIZiBl1ZZFESo+N68wM6yQlFY1MXNcBklJwhi3i7PnjePFbUfw+RNDcFVIhinOcrveCH85W+VRQgvJSHdt7ThiuQ36tEiyh2cK8Ks7qvAbeGNXbJeFfrvsKN//x9aYWQQl5XUUF+WEdLEumGC5ebaNMPdWWVUTSQLT89ODbp9bmEVZVWNIS7issrGbNXPxwglUN7az3s6AG+6okAxTutZt74jUIglesNHBcW2NNJeCw67KRjzJSUzLC/5BdigcprPb37HTPdfsiW3a58Ory/nLmn3sj8FM86rGNvbXtlA8LbhbC2CeLSTvjzAh2V3VxJTctJCJHvMnZNLhM+yp7p363NTeyaGGtm7WzNnzxuNOTkqYyYkqJMMUT7J1Q0ZikbR4O2nr8IeNkeSlu+n0G461dQ7aGIcTO440MrsgI2hplECG4+z2ymNt7KluJjUlibV7a2PmzjDG8O5eK4D7dtngC9Z6Oz6yNEjlZYfsMSlMzhkz4jK3yqqamDUudNp5V8D9SO/zdmbEzxp/3JrO8CTzoTnjeHHrEfwJ4N5SIRmmdMVIIij85sxq7ytGAiN3LsmuysaQM9oDyR6Tgic5aVgVbnSskM8un0ZjWyfbDjXE5DhlVU3UtViVj9/ePfj1nNaV1+FJTmLhxOyw/RZMyBpRmVudPj97jgZP/XWYMS6dFJ
cEjZOUdc2I7/7+Dy8q5MixNjZW1A/ugGPAsBASEfmaiGwTka0i8qiIpIrIdBFZKyKlIvK4iLjtvh77dZm9vWhoRx8bPFEE27tmtYeJkRwv3Di8XDqDQX2Ll8pj7V35+uEQEQqzU4fVkrurd9eQlZrM5z8wo+t1LFhjWyNLp+WwZnfNoP/SXb+vlhOnjO36ERSKBROz2Hu0mRbvyLCO99e20OEzYYUkxZXEzHEZQVOAS6uaSHEJ03K7JyicO7+AFJfwQgK4t4ZcSERkEvBVoNgYsxBwAVcBPwPuNsbMBuqAz9tv+TxQZ4yZBdxt9xtxOB/GSCYlds1qD1IexSGvS0hG3locTmmUSCwSsBe4Gk5CsqeGZdPzKMxOZea4dFbHKE7y7t5aCrI8XHXKFGqavezsR4ZYQ2tH0Dhbq9fHtkPHgk5E7Mn8CVkYE36CXjw4WN/K4+v2DzhuWFYVPmPLYf6ErBAWSWOvGfEAWakpfGBWPs9vOTLsY5tDLiQ2ycAYEUkG0oDDwDnAk/b2PwMfs59fZr/G3n6uhEoRSWCcGEk0Fkm4YHvuCLZInC9Exw/dFxOyU4dNjORQfSv7alo4bWYeAKfNzGPd3tpBn+dixUdqOHV6HmfMygesDK5oqG32ctpPX+GnL+zotW3jgXo6/YbiMPERhwXDJOD+vae38O2/b2HLwYG5Ep3U32CTEQOZV5jJ4YY2Glq6/5grqwrtFrt40QQO1reyuSI27s7BYsiFxBhzEPglsB9LQBqA9UC9McaxfSuASfbzScAB+72ddv+8eI45HkRnkYSu/OvgCMlITAHeeaSRrNRkCrJCW2SBFGZZQjIcfuU5bqzTZthCMiOfZq9vwF9uPdlX00LlsXaWTc9l4tgxTM9P78oUi5QXtx6hxevjgTf39FoFsMSehX3y1L6FZHLOGDJTk4c04F5SXstrO61U60ff3T+gfZXZxRazUlPC9nMy1gID7m0dPvbXtnQLtAdy4YJC3MlJPLm+YkBjjDVDLiQikoNlZUwHJgLpwMVBujqf+mDWR69vBBG5XkRKRKSkujq2ufmx4HiMxAq2+/yG13dWccsTm3jgzd20eo8H4Wub2xmT4iLNnRxyf2nuZFJTkkbk7PadRxqZV5gVcu5CTwqyUvF2+rsCz0PJ6j015KSldMV3ls+wUmcHO07iZGudOt3a/+kz81i7pyYqy+efWw5RlJfGvMJMvvHEJqoaj1t1JfvqmD0+g7FpoX/MOIjIkAbcjTH84qWd5Gd4uGTxBJ7ZeIim9v7Ha3aHsSgCCVZza+/R0DPiwZoVf/HCQv6x8WDUKy7GkyEXEuA8YK8xptoY0wE8BZwOjLVdXQCTAWd1mApgCoC9PRvoVZTGGPOAMabYGFM8bty4WJ/DoONYJAdqW/jtq6Wc+fPXuPZP63hp2xF+8vwOPvSL13h4dTneTj81TeFntTvkpXt6WSSPrN3HhXe/yb6axCztbYxhZ2Ujcwr7/iA7dKUAD3GcxBjD6t01LJ+RR5KdtpyX4WFuQeagzydZu7eW3HR31xfeGbMsyydSl0l1Yzurd9dw6YkT+c3VS2j2dnLLE5vw+w1+v2HD/rqQZVGCsWBiFjsONw7JzO23yo6ydm8tXz57Jp/7wHRavD6e3RjZ4lM9Mcb0mfrrMD7TQ05aSjeLpDSC+MqnTplCY1vnsK4IPByEZD+wXETS7FjHucB24DXgCrvPCuAZ+/mz9mvs7a+a4eCjGGQci+QH/7edX768i+n56fzu0yez4fvn88R/nkZRXjq3P7ONc+56nZJ9dWEzthxy0o9XAPb5DT96bjvffXorOysb+V4MZzvHksMNbTS2dTI3wvgIHJ/dPtQpwAdqWzlY39oVH3E4bWYeJeV1UVV/7ot3y2tYVpTbZbU5rrR3IoyTvLjtCH4DlyyeyOyCTO649AT+XXqUP/x7D7uqGmls64wo0O4wf0IWrR2+uP+AMcbwy5d2MmnsGK4+dSpLpoxlXmFmWPdWp8
/PMxsPBrVaDje00ez1MSuCRA8RYV5hFu8fPm6RlNlLH4SaEQ+wfHoeU3PTeHzdgT6PMVQMuZAYY9ZiBc03AFuwxvQA8G3g6yJShhUDedB+y4NAnt3+deDWuA86DkwcO4bz5o/nix+ayevfOIv//Y9T+cjiCbiTk1g2PZfH/3M5f/7cMnLS3OyvbWFcZmqf+8xN91DX7KW5vZP//Mt6HnxrL9eeXsT3PjKff5ce5Z8JkGbYEyfQ3leNrUCc2e1DnQK8eo/1Je58qTssn5FHq71cbTBavb6oRP9QfSsHaltZNv24xZCT7mbBhKyI55M8t+kQs8ZnMMee63DVKVP48KJCfvHSTla+tRcgokC7gxNwj7d7a9X2SjZVNHDTubPxJLsQEa5eNpUtBxvYEsI6W/n2Xm56bCP3/Kv3KoddGVsRWCRglfDZVdnYlXpdVt3E1DAz4gGSkoRPnTKFNXtqo1oDPp4MuZAAGGPuMMbMM8YsNMZ81hjTbozZY4xZZoyZZYy50hjTbvdts1/PsrfvGerxx4LUFBd/XHEKt148j6Igv1ZEhA/NGcezXz6DP39uGbd9eF6f+8xLd3OwvpVP/s9qXt1RyX999AR+8NETuO6M6SyalM0P/287x9qiixu0dfiGdOZt16qIUQjJuEwPIkM/u3317hryMzy93BrLZ+QiEjxOsqe6iWU/+Rd3/6s04uN0xUdmdHc9nTErjw376rvF24JRdayNd8truWTxhC6LRkT46eWLKchK5YmSCvIzPEzNDV6oMRizCzJITpK4Btx9fsNdL+9iRn46Hz95Ulf7x5ZMwpOcxKPrelsle482c9fLu0hxCX9du5/6lu6u4UhTfx3mT8ikxevjQJ1Voqa0silkoD2QK5ZOJkngiZLhaZUMCyFR+o8jKDMj+EWUk+bmaJOX8qPNPLjiFFacXgSAK0n48eULqW5q51cv9722tENds5fzfvUG1z60btB83aWVjVz9wBo27I+sWN2uI40UZqWSnRY+YyaQFFcS4zI8Ec0lqW/xcsP/rmfV9sqI9x8JxhhW76mxRaN7ksDYNDfzC7N6CYm3089Nj22ksa2TB97c3S3YHY61e2vJTE3ulR59+qx8vD4/JfvCr3vxwtYjGAMfWTShW3t2Wgr3XHUSSQKnhCnUGAxPsotZ4+O7Nslzmw+xs7KRm8+f023ORvaYFC5ZPJFn3jtIc4D7yu83fPvvm3EnJ/HgilNo9vr4y+p93fZZVt1E9piUiFzLcDxF/f3DVgHH8prmXjPag1GQlcrZc8fz5PqKYbnglQrJKGLptBzmFWby5A2nc/a88d22LZ48lmuWT+Ph1eVsjqAkg99vuOVvmzhU38qbu6qDmv3R0uLt5EuPbGD1nho+/9A6dtv5+eHYcaSxz4q/wSiMYC6JMYbvPr2VF7Ye4fq/lPD7N3YPWhxp79FmKo+1c/rM/KDbT5uZx/r9dd0ydX79r11sOdjArRfPo8NnuO+13REda+3eGk4pyu1Vh2xZUS7JSdJnGvBzmw8xtyCT2UGsvuKiXP73P07ltovnRzSWQBZMzIp4LklDSwdff3xjvwW9w+fn7lW7mFeYySU9BBHg06dOodnr4/82HQ+6P/Luft7dW8v3P7KAM+eM45x54/nTO+XdLLiySitjK1IRnVOQiYiVAryvxp4RH6Fb7FOnTKGqsZ3Xd/bOQvX7Dfe9XsbrPdKy44UKySjiI4sn8OLNZzJ/QvDA9C0XziUvw8N3n97ap4Xx4Ft7eXVHFXdcegJXLp3Mva8O/Ca+45ltlFU38dOPL8KVJFzz4LtUhfmy7/T5Katu6peQFGSl9pm19Y+NB/nnlsN89dzZfHjRBO58YQffenLzoATBndnrPQPtDqfNyMPb6ee9/Zaor91Tw/1v7OZTxVP44odmcuXSyfx17X4O1Ycvh1/d2M6e6uautN9A0j3JLJk6NmzA/UhDG+vK67hkce8vX4fTZ+YzNcT6I+FYMCGLymPtHG0KP0
m2vdPH9X8p4an3DvKFh0v4wbPbutLiI+XxdQcor2nhGxfM7cqQC+TkqTnMKcjoCrofrG/lzuff54Oz87myeDIAN5w1k9pmbzf3Ull1ZBlbDmPcLory0tl5pJGyKsstG4lFAlZF4PwMD4/1CLr7/YbbntrCz1/cyY2PbBiSDEwVEqWLrNQUvn/JArYcbOB/1+wL2W/D/jp+9uIOLjqhkGtOm8YPL1vIvMJMvvb4Rg728cUWir+vr+Bv6yv4ytmzuHrZVP507TLqW7ys+NO6kHGb8poWvJ3+qOIjDn3Nbj9Y38rt/9hG8bQcbjp3Nr+5aglfPXc2f1tfwWceXDvg4perd9dQmJUacgGoZTNySRJLcBpaO/ja4xuZlpvG7ZcuAOAr584G4DevloU9zjp7ouCyIEICcNrMfLYcbKChNfg1dhIwPhJGSPpLJDPcjTF8+8nNrN1byy+uWMznzpjOQ++U84n732FvhIHn9ftq+eFz21k+I5dz548P2scJum+qaGDboQa+89QWDPCTyxd1WRunFOVSPC2HB97cQ4fPT22zl9pmb8RC4OAscuXEVyJxS4Plkr1i6WRe21nV9QPLcb89XnKAFadNIylJ+NrjG+Pu/lIhUbpx6eIJfHB2Pr94aWfXl1AgDS0dfOWv71GYncrPrliMiLWi233/72Q6fIYbH9kQ9S/2sior/Xj5jFxuOm8OAIsmZ3P/Z5ZSWtnIF/+yPugv0F1OxlY/XVsNrR28GWQhKb/fcMsTG/Ebw68+eRKuJCEpSfj6+XO456qT2Hignsvve5t399b2y9VljGHNnhpOm5kX0iWSlZrCwknZrNldw3ef3kJVYzv3XLWEdI81tWrS2DFcvWwKfys5EPYX6No9NaS5XSycFLwi7xkz8/Cb0Oug/HPzIRZMyGJGFL+6I8WxjMMF3O9etYt/bDzENy+cy5XFU7j90gX88ZpiKupaueTef/PMxoNhj1F+tJkvPLyeidmp3Pf/loZ1QV1uB92/8uh7vLGrmm9eOJcpPRIIbjhrJgfrW3lu86HjQhBhoN1hXmEW5TXNbK5oYNLYMV3/00j4ZPFkfH7Dkxsq8PkN3/r7Zv62voKbzp3Nf122kP/+2EI27K/nvtcjc3sOFiokSjdEhB9dthBPchJX/n41n7j/HV7eZq2JYIzhm09aM5p/++mTyR5zPMA9Y1wGP79iMRsP1POT59/vtd8Wb2fQWfWtXh9femQDaW4X91y1pJsf/8w54/jFlYt5Z3cN3/jb5l7ZYTuOWDn4kWbMBPKxkyYxa3wG16x8l+//Y2u3SrQPvrWXNXtquePSE3q5bC47aRKPXb+c5nYfn/yf1Vz063/z8OryqLLdSquaONrk7ZX225PTZuTxbnktz20+zM3nzebEKWO7bb/x7Fm4koR7XgmdwbV2by1Lp+WQ4gr+UV8yNYcxKa6g7q2D9a1s2F8fE2sErBTkidmpIS2SJ0oOcO+rZXyqeApfOmtmV/t5Cwp4/qsfZMHELG56bCM3PfZeUPdYXbOX6x5ahzGGP123rM9Ju2PT3Hxk0QT2VDezdFoOK04r6tXn7LnjmVuQyf2v7+76IRONawusHz7GXg0z2nt3xrgMlk3P5Yl1B/jmk5t4cn0FN583m6+db/0Au+ykSVx20kTueaWUjQfiV34+cilURg1F+en8+9tn88S6A/zxrb1c/5f1zBiXzslTc3h5eyXf+8h8TurxpQbw4UUTuO6MIv70djkilkjsPdpMeY0VWAaYlpfG0qk5LJmWw9KpOTz0zl5Kq5r483XLKMjqPRfm8iWTqTzWzp0v7ODtsqMUT8th2fRcTp2ex/uHj1GUlx42Bz8UE8eO4bmvfMCaB/H2Xv5dWs1dnzyRdE8yv3hpJ+cvKOjyjffk5Kk5vPmts/i/TYd4ZO1+bn9mGz99fgeXnTSRjyyewOLJY7uJrEN1Yzurtlfy5HrLxx0qPuKwfGYe//PmHk4pyuGGs2b12j4+K5UVpxfxx3
/v4UtnzeyVRlrfYlX47ZltFYg7OYlTpucGDbg/v9lya4WLjwyUBRODl0p5q/Qo33lqCx+YlbMd7gwAAAsHSURBVM9/X76wlyUxcewYHv3Ccn77Whm/e62MN3ZV890Pz+eKpZMREdo6fHzh4RIO1rfy1/84NeyEv0A+94HpbDt0jJ99YnHQWEpSknDDWTO5+fGNPPROOWNSXEwaOyaqc54/wfo/tXf6+/Uj6FPFU7jlb5sor2nh6+fP4au2m9Phh5ctZN3eWm5+7D3++dUPRmXx9BdJxNnM0VJcXGxKSkqGehgJSafPz/Nbj/DAm7vZevAY580fzx+uKQ7pIvB2+vnsg2tZu7eWvHQ3RfnpTLcfyUnCe/vrKdlX1+0X5FfOmcUtF8wNOQZjDM9vOcJrO6t4d29tt2ViL15YyP2fWTqgc1yzp4Zv2BloeRkejDG8ePOZ5IdZcTKQzRX1/HXtfp7ZeIhWO8tqRn46J04Zy4mTs+n0G17adoSSfXUYA1Nz0/jM8qlcf+bMsPv1dvr5zaulfPrUqUzIDv5lVdPUzpk/f42z5o3nd58+udu2Vdsr+cLDJTx+/XJODWP9/M8bu/npCzs4ccpYTp46liVTczh56lhufGQDfgP/95UPRHQd+sNdL+/kt6+VdcW5kkQQsbLapuSk8bcbTuuzGGJpZSPfeXoL68rrOG1GHj++fCG/WrWL5zYf5refXsIliycO6pg7fX7O+uXrVNS1csLELP751Q9G9X6/37DwBy/R4vVx58cXcdWyqVG9v9Xr49N/XMOFJxTyxQ8Fv4fW7Knh6j+s4apTpvLTjy+Kav+BiMh6Y0xxn/1USJRIMMaw9eAxZhdk9GkBGGNoau8kM8QXgDGGirpW1tuCcu3pRb3WYgjHkQZrgtyGfVY2UTQ1nkLR1N7Jfz+3nSfXV/A/n13KufML+rWPjfvr2VRRz8YD9Ww6UE9VoyWY8wozuWhhIReeUMi8wsyo5lz0xV0v7+Q3r5bx+88sJTUliZomLzXN7bzyfhXvHahn8x0XhP2f1bd4eeDNPZTsq2NzRT1tHcdjXLdePC/kl9VgsLu6iV+t2kVHpx+DdW8YY2WUffvieRH/2vf7DY+tO8BPX3ifpvZOjInt2B9eXc7tz2zjYydN5NdXLYn6/Zff9zbv7a/nyS+eNij3bzDufGEHv39jN3+4ppjzF0R/P4MKSTdUSJRIaevw9ctVForDDa34/IbJOdGnx0ZKQ2sHH/zZqxxr614Lyp2cxMdOmsjPrzgx4n11+PzsONzIhv117D3azE3nzu5aXTMRqDrWxs9e3ElhtodvXDB3UAU7kFavj8vve5svfHAGn1ga3AUajtue2syj7x5g4+3nR1QxuT94O/187Hdv0+Hz89LNZwZ11fWFCkkAKiTKSOf9w8c4UNtCXoaH/Aw3eRke0t2umH2RKgNj26EG3tx1lBvOip21B1bWWprHxfgIavEFI1Ih0WC7oowA5k/ICjnRVBl+nDAxmxMmBk/JHkyC1emLBZr+qyiKogwIFRJFURRlQKiQKIqiKANChURRFEUZECokiqIoyoBQIVEURVEGhAqJoiiKMiBUSBRFUZQBMSpmtotINRB6paa+yQdCLyOXWIykc4GRdT4j6VxAz2c4E+m5TDPGjOur06gQkoEiIiWRlAlIBEbSucDIOp+RdC6g5zOcGexzUdeWoiiKMiBUSBRFUZQBoUISGQ8M9QAGkZF0LjCyzmcknQvo+QxnBvVcNEaiKIqiDAi1SBRFUZQBoUKiKIqiDAgVkjCIyEUislNEykTk1qEeT7SIyEoRqRKRrQFtuSKySkRK7b85QznGSBGRKSLymoi8LyLbROQmuz1RzydVRN4VkU32+fyX3T5dRNba5/O4iCTMOrci4hKR90TkOft1Ip9LuYhsEZGNIlJityXkvQYgImNF5EkR2WF/hk4bzPNRIQmBiLiA3wEXAwuAq0VkwdCOKmoeAi7q0XYr8IoxZjbwiv06Ee
gEbjHGzAeWAzfa/49EPZ924BxjzInAScBFIrIc+Blwt30+dcDnh3CM0XIT8H7A60Q+F4CzjTEnBcy3SNR7DeAe4EVjzDzgRKz/0+CdjzFGH0EewGnASwGvbwNuG+px9eM8ioCtAa93AhPs5xOAnUM9xn6e1zPA+SPhfIA0YANwKtZs42S7vds9OJwfwGT7y+gc4DlAEvVc7PGWA/k92hLyXgOygL3YyVWxOB+1SEIzCTgQ8LrCbkt0CowxhwHsv+OHeDxRIyJFwBJgLQl8PrYraCNQBawCdgP1xphOu0si3XO/Br4F+O3XeSTuuQAY4GURWS8i19ttiXqvzQCqgT/Zrsc/ikg6g3g+KiShkSBtmis9xIhIBvB34GZjzLGhHs9AMMb4jDEnYf2aXwbMD9YtvqOKHhG5BKgyxqwPbA7SddifSwBnGGNOxnJt3ygiZ/7/9u4vRMoqjOP492ebGVFIFpFuEWXsRWRr0T8rEsouwrwohcAlCaG6COrCiCQqFiUCKS+iklahQAvLjL0JKs2IcCVIKLEuTJbalnYvqosNsqCni3MmJ9ldV9+j777y+8Aw78x5z8x54H3nmXPemXPqblAFHcD1wOsRsRD4g8LDck4kExsCLmt73AkM19SWkkYkXQqQ70drbs+USTqblES2RsQH+enGxtMSEb8De0jXfmZL6shFTTnmbgOWSRoE3iUNb22kmbEAEBHD+X4U2ElK9E091oaAoYjYlx+/T0osxeJxIpnYV8DV+ZcnM4EHgf6a21RCP7Aqb68iXWuY9iQJ2Ax8FxEvtxU1NZ6LJc3O2+cCd5MugH4GLM+7NSKeiHgmIjoj4grSebI7IlbSwFgAJJ0n6fzWNnAPcICGHmsR8Qvwk6Su/NRdwEEKxuN/tk9C0r2kb1ZnAVsiYn3NTTohkt4BFpOmjB4Bngc+BLYDlwM/Aisi4te62jhVkm4HvgC+5eg4/FrSdZImxrMAeIt0bM0AtkdEr6QrSd/qLwT2Az0RcaS+lp4YSYuBNRGxtKmx5HbvzA87gG0RsV7SHBp4rAFI6gb6gJnAYeBh8nFHgXicSMzMrBIPbZmZWSVOJGZmVokTiZmZVeJEYmZmlTiRmJlZJU4kZmZWiROJWSH5T4av5SnIj0gakbRL0pJcPihpTd3tNCut4/i7mNkU7SDN5LsaOESaBO9O0gSGZmcs/yHRrIA83clvwJKI+HSc8j2kpPKfiFAuWwS8CNyYX6MfeLo1KWWu+z1pDZOHcvW+vM8/mNXMQ1tmZYzl2zJJs8Ypv580eV4vae2H1mR51wIfk5LHdXm/bmDLMfVXks7XW4FHgUeAJ4tHYXYS3CMxK0TSA8CbpOGt/cCXwHutWVfz7LivRsSGtjpvA39HxOq257pz/UsiYjT3SOYCXZFPWEnPAo9FROfpiM1sMu6RmBUSETtIH/j3AR8Bi4ABSWsnqXYD0CNprHUjJSCAq9r2G4j/f+vbC8yTdEG5CMxOji+2mxUUEX+SVjv8BOiV1Ae8IGnDBFVmkK53vDJO2c+nppVmZTmRmJ1aB0nn2SzgL9K08e2+Bq6JiEPHeZ2bJamtV3ILMNz0VSLtzOChLbMCJM2RtFtSj6QFeUG0FaR1zHflD/xB4A5J8yRdlKu+BNwk6Q1JCyXNl7RU0qZj3mIusFFSl6TlwFOM34sxO+3cIzErYwwYAJ4A5gPnkIamtgHr8j7PAZuAH3K5IuKbvB74OuBzUo/lMEcXVmrZmsv2kdY+34wTiU0T/tWW2TSXf7V1ICIer7stZuPx0JaZmVXiRGJmZpV4aMvMzCpxj8TMzCpxIjEzs0qcSMzMrBInEjMzq8SJxMzMKvkXejrPTld/49QAAAAASUVORK5CYII=\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"from IPython.display import Latex\n",
"\n",
"# Load train data\n",
"xtrain = np.load('./logistic_data/logistic_xtrain.npy')\n",
"ytrain = np.load('./logistic_data/logistic_ytrain.npy')\n",
"\n",
"# Load test data\n",
"xtest = np.load('./logistic_data/logistic_xtest.npy')\n",
"ytest = np.load('./logistic_data/logistic_ytest.npy')\n",
"\n",
"# Specify parameters\n",
"learnRate = 0.4; n_steps = 60; batchSize = len(xtrain) // n_steps\n",
"\n",
"# Initialize all global variables\n",
"init = tf.global_variables_initializer()\n",
"sess.run(init)\n",
"\n",
"# Define our Gradient Descent optimizer and loss function from part 1.)\n",
"optimizer = tf.train.GradientDescentOptimizer(learnRate)\n",
"train = optimizer.minimize(loss)\n",
"\n",
"# Printing loss before optimization\n",
"lossBefore = sess.run(loss, feed_dict={x: xtrain, ytrue: ytrain})\n",
"print(\"Training data loss before optimization: \" + str(lossBefore))\n",
"\n",
"# Conduct Optimization with training data using batched approach\n",
"losses = np.zeros(n_steps)\n",
"for i in range(n_steps):\n",
" losses[i] = sess.run(loss, feed_dict = {x: xtrain, ytrue: ytrain})\n",
" batch_x = xtrain[batchSize * i: batchSize * (i + 1)]\n",
" batch_ytrue = ytrain[batchSize * i: batchSize * (i + 1)]\n",
" sess.run(train, feed_dict = {x: batch_x, ytrue: batch_ytrue})\n",
" \n",
"# Printing loss after optimization \n",
"lossAfter = sess.run(loss, feed_dict={x: xtrain, ytrue: ytrain})\n",
"print(\"Training data loss after optimization: \" + str(lossAfter))\n",
"\n",
"# Estimates of W and b\n",
"display(Latex(r'$\\hat{W}_{train}=$' + str([w[0] for w in sess.run(W)])))\n",
"display(Latex(r'$\\hat{b}_{train}=$' + str(sess.run(b))))\n",
"\n",
"# Use test data to validate parameters\n",
"correct_prediction = tf.equal(tf.round(Y), ytrue)\n",
"accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))\n",
"print(\"Accuracy of our parameters (based on test data): \" + str(sess.run(accuracy, feed_dict={x: xtest, ytrue: ytest})))\n",
"\n",
"# Plot our Loss function\n",
"plt.title(\"Plot of Neg. Log Likelihood loss function\", fontsize = 14)\n",
"plt.xlabel(\"Step\", fontsize = 14)\n",
"plt.ylabel(\"Loss\", fontsize = 14)\n",
"_ = plt.plot(losses)"
]
},
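{
"cell_type": "markdown",
"metadata": {},
"source": [
"For reference (a sketch, assuming the sigmoid model $Y=\\sigma(x^{T}W+b)$ and the loss defined in part 1), the quantity minimized above is the negative log-likelihood\n",
"\n",
"\\begin{align*}\n",
"\\ell(W,b)=-\\sum_{i}\\left[y_{i}\\log\\sigma(x_{i}^{T}W+b)+(1-y_{i})\\log\\left(1-\\sigma(x_{i}^{T}W+b)\\right)\\right],\n",
"\\end{align*}\n",
"\n",
"and each call to `sess.run(train, ...)` takes one gradient step $\\theta\\leftarrow\\theta-\\eta\\nabla_{\\theta}\\ell$ on the current batch, with learning rate $\\eta=0.4$."
]
},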
{
"cell_type": "markdown",
"metadata": {},
"source": [
"3. **Evaluating logistic regression on test data.** Load the test data. What is the negative log-likelihood of your model on this test data? That is, what is the negative log-likelihood when you use your estimated parameters with the previously unseen test data?"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The negative log-likelihood on test data is: 170.90541\n"
]
}
],
"source": [
"neg_log_likelihood = sess.run(loss, feed_dict = {x: xtest, ytrue: ytest})\n",
"print(\"The negative log-likelihood on test data is: \" + str(neg_log_likelihood))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"4. **Evaluating the estimated logistic parameters.** The data was, in reality, generated with\n",
"\n",
"\\begin{align*}\n",
"W=(1,1,2,3,5,8), & \\ b=-1\n",
"\\end{align*}\n",
"\n",
"Write TensorFlow expressions to compute the squared error between your estimated parameters and their true values. Evaluate the error in recovering $W$ and $b$ separately. What are the squared errors? **Note:** you need only evaluate the error of your final estimates, not at every step."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The Squared Error for W is: 1.0947154\n",
"The Squared Error for b is: 0.27974698\n"
]
}
],
"source": [
"# Specify the real values\n",
"W_Real = tf.constant(np.reshape([1, 1, 2, 3, 5, 8], [6, 1]), dtype = tf.float32)\n",
"b_Real = tf.constant([-1], dtype = tf.float32)\n",
"\n",
"# Compute the squared errors between our estimates and the real values\n",
"sq_err_W = tf.reduce_sum(tf.square(W - W_Real))\n",
"sq_err_b = tf.square(b - b_Real)\n",
"\n",
"# Print our squared errors\n",
"print(\"The Squared Error for W is: \" + str(sess.run(sq_err_W)))\n",
"print(\"The Squared Error for b is: \"+ str(sess.run(sq_err_b)[0]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"5. For ease of grading, please make the variables from the above problems available in a dictionary called `results_logistic`. The dictionary should have keys `'W'`, `'Wsqerr'`, `'b'`, `'bsqerr'`, `'log_lik_test'`, with respective values `sess.run(x)` where `x` ranges over the corresponding quantities. For example, if my squared error for $W$ is stored in a $TF$ variable called `W_squared_error`, then the key `'Wsqerr'` should have value `sess.run(W_squared_error)`."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"results_logistic = {'W': sess.run(W), 'Wsqerr': sess.run(sq_err_W), \n",
" 'b': sess.run(b), 'bsqerr': sess.run(sq_err_b), \n",
" 'log_lik_test': neg_log_likelihood}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"6. **Classification of normally distributed data.** The .zip file at http://www-personal.umich.edu/~klevin/teaching/Winter2019/STATS507/HW10_normal.zip contains four NumPy `.npy` files that contain train and test data generated from $K = 3$ different classes. Each class $k \\in \\{1, 2, 3\\}$ has an associated mean $\\mu_k \\in \\mathbb{R}$ and variance $\\sigma_{k}^{2}\\in\\mathbb{R}$, and all observations from a given class are i.i.d. $\\mathcal{N}(\\mu_{k},\\sigma_{k}^{2})$. The four files are:\n",
" * `normal_xtest.npy` : contains a $500$-vector whose entries are the independent variables (predictors) from the test set.\n",
" * `normal_xtrain.npy` : contains a $2000$-vector whose entries are the independent variables (predictors) from the train set.\n",
" * `normal_ytest.npy` : contains a $500$-by-$3$ dimensional matrix whose rows are one-hot encodings of the class labels for the test set.\n",
" * `normal_ytrain.npy` : contains a $2000$-by-$3$ dimensional matrix whose rows are one-hot encodings of the class labels for the train set.\n",
"\n",
"The $i$-th entry of the vector in `normal_xtrain.npy` is the observed random variable from class with label given by the $i$-th row of the matrix in `normal_ytrain.npy`, and analogously for the two test set files. Please include these files in your submission so that we can run your code without downloading them again. \n",
"\n",
"Load the training data and use it to obtain estimates of the vector of class means $\\mu=(\\mu_{0},\\mu_{1},\\mu_{2})$ and variances $\\sigma^{2}=(\\sigma_{0}^{2},\\sigma_{1}^{2},\\sigma_{2}^{2})$ by minimizing the cross-entropy between the estimated normals and the one-hot encodings of the class labels (as we did in our softmax regression example in class). Please name the corresponding variables `mu` and `sigma2`. This time, instead of using gradient descent, use Adagrad, supplied by TensorFlow as the function `tf.train.AdagradOptimizer`. Adagrad is a *stochastic gradient descent algorithm*, popular in machine learning. You can call this just like the gradient descent optimizer we used in class: just supply a learning rate. Documentation for the $TF$ implementation of Adagrad can be found here: https://www.tensorflow.org/api_docs/python/tf/train/AdagradOptimizer. See https://en.wikipedia.org/wiki/Stochastic_gradient_descent for more information about stochastic gradient descent and the Adagrad algorithm. \n",
"\n",
"**Note:** you'll no longer be able to use the built-in logit cross-entropy that we used for training our models in lecture. Your cross-entropy for one observation should now look something like $-\\sum_{k}y'_{k}\\log p_{k}$, where $y'$ is the one-hot encoded vector and $p$ is the vector whose $k$-th entry is the (estimated) probability of the $k$-th observation given its class. **Another note:** do not include any estimation of the mixing coefficients (i.e., the class priors) in your model. You only need to estimate three means and three variances, because we are building a *discriminative* model in this problem."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Training data loss before optimization: 5689.1797\n",
"Training data loss after optimization: 2783.872\n"
]
},
{
"data": {
"text/latex": [
"$\\hat{\\mu}_{train}=$[-1.0076367, 0.004491298, 2.9920805]"
],
"text/plain": [
"<IPython.core.display.Latex object>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"text/latex": [
"$\\hat{\\sigma}_{train}^{2}=$[0.7303288, 1.0043671, 1.2380918]"
],
"text/plain": [
"<IPython.core.display.Latex object>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAZIAAAEbCAYAAADwPQLqAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAIABJREFUeJzt3XmcXFWd///Xu6vTnXSHbGQhe1iC7AQIIQgCgrKoCCgoM6CIzKAMbqMzozi/78ig/EbnKwPiKA4Cog4KCqIMykDYBnEIEARC2EMIpEnIQkggW3e6+/P9455K3zTdner0Up3u9/ORelTdc8+995zblfrUWepeRQRmZmbbq6LcBTAzsx2bA4mZmXWJA4mZmXWJA4mZmXWJA4mZmXWJA4mZmXWJA0k/IWmapJA0s0zHP0LSfEkNku4vRxls+5T7vZPKcL6kVyU1S7q4XOVIZfmUpHXlLMOOxoFkByHp+vSfPSRtlrRI0ncl1XZhn/dL+vduKuL3gCeB3YGPdHDMcZK+J+klSfWSXpN0h6QPdFM5+pT0d7u93OXoyySNBH4A/F9gIvDdXjx2SDq9VfJNwG69VYb+oLLcBbBOuRv4BDAIeA9wDVALXFDOQiV7AD+IiCXtZZA0DfgT8DZwEVngqQCOA34ETGlnu6qIaOjm8vYpkgZFxOZyl6NMppJ9Ft0eEcvKXZiI2AhsLHc5digR4ccO8ACuJ/uPlk/7MbAsvZ4GBDAzt/4o4GFgE7AcuByoyu0vWj2mtXPsauCKtI9NwFzgyFbHzT8+1c5+/gAsBYa2sW5k7nUAFwK/AdYD391WfXLr5wLrgLUp735p3XDg58CKtP0i4EvbOOcTgRuBN9Pj98D03PqLgQXAmcBLZAHyt8Do3PrW5+aY3Dn7C+Besg+tz6VtPgI8BdQDS4B/BJQ75uK03/9M9Xwd+Lvc+uvaeJ9UAK8CX26nnp1673TnuQY+1cY5mlY8t23kXVfq+c/lOyd3TpcD1+fOZf64i9s6Tkr7DLAQaEjPf91qfQDnA78me88uAs4u9+dGbz3KXgA/SvxDtR1IrgRWpddbfRiQfQiuJ/umvzfwofShc1laPxz43/TBs0t6FNo59veAZcAH075+nD5AxgOFtO164Ivp9ZA29jEKaAa+XkJdI30I/RVZF8OuJdSnkuzD/rtk3Wt7AX8J7J3Wfx94ApiVztUxwBkdlKEGeCGd9wPS/q4BXgFqUp6L03m4NeU5PK3/j7R+KFk3yZzcOa7K/a0WA6en+k0CDgGagH8G9gTOSvv/fK5ci4G3yALMnmQfcA3AR9L6w4FGYHxumxNSnjHt1LWz751uO9fAkFS+AA5N56hA6YGk3fOf8nyGLJh9GXhXOsd/n9aNScf9q3TcMe0c5zRgM/C5dM4/n5ZPbvWerQPOJmud/0s651PL/dnRK59P5S6AHyX+oVoFkvSfdBVwU1pu/WFwKdk3p4rcNp8i+1ZW/CC8H/j3bRy3Nv2H+GQurUD2DfBbubR1tNMSyZU3gNNKqGsA32+V1mF9yAJVAEe3s8/bgJ904nx/GniRrVsDBeAN4GNp+eL0ITU8l+cfgYXt/d1a/a2+0ir9BuDeVmkXA3W55cXAnFZ5rgEezC0vAL6WW74JuLmDunbqvdMD53omrVrElB5ItnX+64Bvb+O9dvo2jvMn4LpWea5vdc4D+JfcciWwgQHSKvFg+47lREnrJG0CHgIeIPt21Ja9gYciojmX9iDZN+I9OnHM3cnGZP5UTIiIpnT8fTqxH3UiL8C8Vssd1iciVpP9575T0u8lfVnS5Fzeq4CPSXoyTVI4ekvBpB+l87ouN1vnELKWwtu59LXASLJzUvRKRKzNLS8Fxnahjn9qlfYgMFHSsFzaQ63ytP5b/Bg4F0DSKOAU4NoSy1QsR4+c6x7Q7vmXNJasdXVPF4/R3t+l9ft/fvFFRDQCKyn9vbBDcyDZsTwAzC
Brog+OiI9ExIp28orsW1Jb2ktvbz/tbdOZ/byY8u9dYv71bZSjw/pExLnAYWTn6cPAC5JOSOvuIBvU/S4wGvi9pJ+k7f+J7LwWH5D933iiVfoMsq6N/8gdu/UAeVD6/6tO17FEPwemSjqSrHtsFXBXJ7bvyXNdqmbe+eVjUBv5Ojr/nf3y0pFS3v9deS/s0AZEJfuRDRGxMCJeiW3P8HkGOFxS/m98JFk31UtpuYGsu6YjxQHGI4sJkgpk/dHPlFrw9C32TuBzkoa2Xi9pxDZ2UUp9iIgnI+I7EXEMWdfdObl1qyLi5xHxKeA84BxJ1RGxIp3XhRGxMGX/M1nLbVV+XXqsLrXelHaO83U8slXakWRdW2/n0ma3yjMbeLa4kMr3G7LuuU+TDS43daLMPXauO1GGlcA4SflgMKO9zG2JiOXAa2SzAtuzmW3/fZ6l7b9Lye///s6BpP/6ITAB+KGkvSV9EPg22ZjIhpRnMTAr/SBtdKsPDgAiYj1ZV8W3JX1A0t5peVw6Rmf8Ddm3xHmSzpD0Lkl7SbqAXLfA9tRH0q6Svi3p3ZKmSnov2QDsMwCSLpF0qqTpqQ4fARZFRH07x7uBbIbP7yQdnfZ/lKTLJE3vRJ0XA/uluo6W1Na36qLLgKMlXSxpT0lnAV8B/rVVvtmSLkp1+Wvgk2SzqvJ+TNYaORDobGugt891W+4nG4v5uqTdJZ1HNjGhsy4FviTpb9M5nSHpK7n1i4HjJO2Sfs/Slv8LfELShalOnyc7t63/LgNXuQdp/CjtQRuDtq3WT6P9KZzFaY+XA9W59XuS9a9voPTpv/Xkpv/m8nQ42J7LN55sVs+itK+lwB3ASbk87xgA3VZ9yALbb8i+gdaTTXf9V2BQWv+PwNOprqvJpiLvvY2yjiP7EF6R9vky2Sy3/PTebQ0IjyHrVnqbd07/ndnGMYvTfxvoePrvL9M5Xw58tY39iKz1cG9Hddye9053n2vaGGxP6Z8hm4W1nmwa9hdpY/pvR+c/pZ1HFuQayGafXZdbdzJZt+tmOp7++1my1vlm2p/+23rQfjG5qdn9+aFUYTPbAUhaTNYy6PDX35KGkH3Qfz4ibuiNstnA5V+2m/UjqXtyHPC3ZD90/HV5S2QDgQOJWf8yhawLrg44N/r5pWWsb3DXlpmZdYlnbZmZWZcMiK6t0aNHx7Rp08pdDDOzHcpjjz22KiLGbCvfgAgk06ZNY9681lejMDOzjkh6pZR87toyM7MucSAxM7MucSAxM7MucSAxM7MucSAxM7MucSAxM7MucSAxM7MucSDpwNubNvNvc17giSVryl0UM7M+y4GkA41NwZX3vMjjr75Z7qKYmfVZDiQdqK3Ofvi/oaEzdyk1MxtYHEg6UFVZwaCCWFffWO6imJn1WQ4k21BbXckGBxIzs3Y5kGxDbVUl6+rdtWVm1h4Hkm2orS6wocEtEjOz9jiQbENNVaXHSMzMOuBAsg1Dqys9a8vMrAMOJNtQU1VgvVskZmbtciDZhqHVlaz3GImZWbt6NZBIWizpKUlPSJqX0i6W9FpKe0LSB3L5L5K0UNLzkk7IpZ+Y0hZK+lpPlrmmusB6z9oyM2tXOe7Z/t6IWNUq7fKI+G4+QdI+wJnAvsAE4G5Je6bVPwDeD9QBj0q6LSKe6YnC1lZXumvLzKwD5QgkpToFuDEi6oGXJS0EZqV1CyNiEYCkG1PengkkVZXUNzbT2NRMZcE9gWZmrfX2J2MAd0l6TNL5ufTPSZov6TpJI1PaRGBJLk9dSmsvfSuSzpc0T9K8lStXbneBi9fbWu+ZW2ZmbertQHJERBwMnARcKOko4Cpgd2AGsAy4LOVVG9tHB+lbJ0RcHREzI2LmmDFjtrvAtVUFAHdvmZm1o1cDSUQsTc8rgFuBWRGxPCKaIqIZ+DEt3Vd1wOTc5pOApR2k94gtLRIHEjOzNvVaIJFUK2mn4mvgeGCBpPG5bKcBC9Lr24AzJVVL2hWYDjwCPApMl7
SrpCqyAfnbeqrctdWpReKuLTOzNvXmYPs44FZJxeP+IiL+W9LPJc0g655aDHwGICKelvQrskH0RuDCiGgCkPQ54E6gAFwXEU/3VKFrq9wiMTPrSK8FkjTL6sA20j/RwTaXApe2kf4H4A/dWsB2uGvLzKxjns+6DS2zthxIzMza4kCyDVvGSPzrdjOzNjmQbIPHSMzMOuZAsg1DBhWQPGvLzKw9DiTbUFEhagb5UvJmZu1xIClBbXWlb7drZtYOB5IS1FZXss6D7WZmbXIgKUFtdYEN7toyM2uTA0kJaqoqWedAYmbWJgeSEgytrmSDZ22ZmbXJgaQENVWetWVm1h4HkhIMra70JVLMzNrhQFKCmqpKXyLFzKwdDiQlGFpdYH1DIxHvuBGjmdmA50BSgprqSiJg42a3SszMWnMgKUHLPUkcSMzMWnMgKUFtVfFS8h5wNzNrzYGkBL65lZlZ+xxIStByTxJ3bZmZteZAUoItd0l0i8TM7B0cSErQMtjuQGJm1poDSQmKgWSDu7bMzN7BgaQExVlbvgKwmdk7OZCUoCYNtvsuiWZm7+RAUoKqygqqChW+S6KZWRt6NZBIWizpKUlPSJqX0kZJmiPpxfQ8MqVL0pWSFkqaL+ng3H7OSflflHROb5S9trrgFomZWRvK0SJ5b0TMiIiZaflrwD0RMR24Jy0DnARMT4/zgasgCzzAN4DDgFnAN4rBpyf5LolmZm3rC11bpwA/Ta9/CpyaS/9ZZOYCIySNB04A5kTE6oh4E5gDnNjThRxaXelZW2ZmbejtQBLAXZIek3R+ShsXEcsA0vPYlD4RWJLbti6ltZe+FUnnS5onad7KlSu7XPCadCl5MzPbWmUvH++IiFgqaSwwR9JzHeRVG2nRQfrWCRFXA1cDzJw5s8s3EhlaXekfJJqZtaFXWyQRsTQ9rwBuJRvjWJ66rEjPK1L2OmBybvNJwNIO0ntUdt92d22ZmbXWa4FEUq2knYqvgeOBBcBtQHHm1TnA79Lr24BPptlbs4G1qevrTuB4SSPTIPvxKa1H1fq+7WZmberNrq1xwK2Sisf9RUT8t6RHgV9JOg94FTgj5f8D8AFgIbABOBcgIlZL+ibwaMp3SUSs7unC11a5a8vMrC29FkgiYhFwYBvpbwDHtZEewIXt7Os64LruLmNHaqsr3bVlZtaGvjD9d4dQW1WgoamZhsbmchfFzKxPcSAp0ZYrAHucxMxsKw4kJWq5uZW7t8zM8hxISuSbW5mZtc2BpEQt9213IDEzy3MgKVFLi8RdW2ZmeQ4kJaqpKo6RuEViZpbnQFKioR4jMTNrkwNJiWo8a8vMrE0OJCVyi8TMrG0OJCUaMqiABBscSMzMtuJAUiJJ1FZVss6ztszMtuJA0gk1VQVfIsXMrBUHkk4YWl3JOndtmZltxYGkE2qqC2zwrC0zs604kHRCNkbiFomZWZ4DSSfUVld6jMTMrBUHkk7wXRLNzN7JgaQTaqsK/kGimVkrDiSdkLVIHEjMzPIcSDqhtqrAhs1NNDdHuYtiZtZnOJB0Qm11JRGwcbPHSczMihxIOqGmeOFGz9wyM9vCgaQThhYvJe+ZW2ZmWziQdEKN79tuZvYOvR5IJBUkPS7p9rR8vaSXJT2RHjNSuiRdKWmhpPmSDs7t4xxJL6bHOb1Vdt+TxMzsnSrLcMwvAs8Cw3Jpfx8RN7fKdxIwPT0OA64CDpM0CvgGMBMI4DFJt0XEmz1d8OJ92329LTOzFr3aIpE0CfggcE0J2U8BfhaZucAISeOBE4A5EbE6BY85wIk9VuicYovE19syM2vR211bVwD/ADS3Sr80dV9dLqk6pU0EluTy1KW09tK3Iul8SfMkzVu5cmW3FL44a8vX2zIza9FrgUTSh4AVEfFYq1UXAXsBhwKjgK8WN2ljN9FB+tYJEVdHxMyImDlmzJjtL3jO0Kpii8RdW2ZmRb3ZIjkC+LCkxcCNwLGS/jMilqXuq3rgJ8CslL8OmJ
zbfhKwtIP0HleTpv/6vu1mZi16LZBExEURMSkipgFnAvdGxNlp3ANJAk4FFqRNbgM+mWZvzQbWRsQy4E7geEkjJY0Ejk9pPW5QoYKqygrWuWvLzGyLcszaau0GSWPIuqyeAD6b0v8AfABYCGwAzgWIiNWSvgk8mvJdEhGre6uwtVUFNrhry8xsi7IEkoi4H7g/vT62nTwBXNjOuuuA63qoeB3yFYDNzLbWpa4tSUMkvU/S1O4qUF83tNq32zUzy+tUIEm/Qv+b9LoKeAS4C3he0kk9UL4+p6aq4B8kmpnldLZFcgIwN73+MLATsAtwcXr0e7VukZiZbaWzgWQksCK9PhG4JSJWkE3n3ac7C9ZX1VZV+geJZmY5nQ0krwP7SSqQtU7uTulDgc3dWbC+Khtsd9eWmVlRZ2dtXQfcRPYDwCbgnpR+GPBcN5arz6qtLvjGVmZmOZ0KJBFxiaSngSnAryOiIa1qBL7T3YXrizz918xsa53+HUlE3NJG2k+7pzh9X21Vgc1NQUNjM1WVvi+YmVlnp/9+TNLxueV/klQn6c7ipU76u1rf3MrMbCud/Up9cfFFumPh14ErgUHAZd1XrL6rtni7XY+TmJkBne/amgo8n16fBvw2Iv5V0l300oUTy62lReKZW2Zm0PkWySayHyECHEfL9N+1ufR+rXgpebdIzMwynW2R/BG4TNKDZPdMPz2l78nWdy3st4Z6jMTMbCudbZF8DmggCyCfjYjiDaVOYoB0bdVUpRaJu7bMzIDO/46kDji5jfQvdVuJ+ji3SMzMtrZd9yORdCzZtbUCeCYi7uvWUvVhNWnWlq+3ZWaW6VQgkTQRuBU4hJb7pE+QNA84LdfV1W8VWyTr3LVlZgZ0fozkSrJrbO0REZMjYjIwPaVd2d2F64sGD6qgQm6RmJkVdbZr6/3AMRHxcjEhIhZJ+gItF3Ds1yRRW+V7kpiZFXXXxaKau2k/O4Sa6gIb3LVlZgZ0PpDcA1wpaXIxQdIU4HvAvd1ZsL6strqSde7aMjMDOh9IvgDUAIskvSJpMfASMAT4fDeXrc+qrapkg7u2zMyAzv+OZAlwsKT3A3sBAp4BFgL/Bnys20vYB9VWF/yDRDOzZLt+RxIRc4A5xWVJBwIf7a5C9XW1VZW8/tamchfDzKxP8J2ZtoPvkmhm1qLXA4mkgqTHJd2elneV9LCkFyXdJKkqpVen5YVp/bTcPi5K6c9LOqG365Ddt91dW2ZmUJ4WyReBZ3PL3wEuj4jpwJvAeSn9PODNiNgDuDzlQ9I+wJnAvsCJwA8lFXqp7EDWteUWiZlZpqQxEkm3bSPLsBL3Mwn4IHAp8GVJAo4F/jJl+SnZXRivAk6h5Y6MNwP/nvKfAtwYEfXAy5IWArOAh0opQ3eoqa5kQ0MTzc1BRYV667BmZn1SqYPtb5Sw/uVt5AG4AvgHWm6CtTOwJiKKX+/rgInp9UTSPU4iolHS2pR/IjA3t8/8NltIOh84H2DKlCklFK10Q9PNrTZsbtpy7S0zs4GqpE/BiDi3qweS9CFgRUQ8JumYYnJbh9vGuo62aUmIuBq4GmDmzJnvWN8VW64AXN/oQGJmA15vfgoeAXxY0geAwWTdYVcAIyRVplbJJFquKlwHTAbqJFUCw4HVufSi/Da9ouUKwI2M7c0Dm5n1Qb022B4RF0XEpIiYRjZYfm9EnAXcR8ste88Bfpde35aWSevvjYhI6WemWV27kl19+JFeqgbQcpfEDZ65ZWbWqy2S9nwVuFHSt4DHgWtT+rXAz9Ng+mqy4ENEPC3pV2S/qG8ELoyIXv1Ez7dIzMwGurIEkoi4H7g/vV5ENuuqdZ5NwBntbH8p2cyvsqip9l0SzcyK/Mv27VCcteW7JJqZOZBsl/ysLTOzgc6BZDvUeozEzGwLB5LtUJtmbflS8mZmDiTbpbJQQXVlhQfbzcxwINlutdWV7toyM8OBZLuN3amaBa+tJfuNpJnZwOVAsp3Onj2VJ+vWMn
fR6nIXxcysrBxIttPph0xi9NBqrvqfl8pdFDOzsnIg2U6DBxX49JHTeOCFlSx4bW25i2NmVjYOJF1w1mFTGVpdyY/cKjGzAcyBpAuGDxnEWbOn8IenlrF41fpyF8fMrCwcSLrovCN2pbKigqv/uKjcRTEzKwsHki4aO2wwHz1kEjfPq2PFW5vKXRwzs17nQNINPnPUbjQ2N3PdnxaXuyhmZr3OgaQbTBtdy0n7j+eGua/w1qbN5S6OmVmvciDpJhccvTtv1zfyn3NfKXdRzMx6lQNJN9lv4nDeM3001z24mE2bfVVgMxs4HEi60QXH7M6qdfXc/FhduYtiZtZrHEi60eG77cyBk0fwg/sWeqzEzAYMB5JuJImLT96H5W9t4pv/9Uy5i2Nm1iscSLrZQVNGcsExu/Prx+q4+5nl5S6OmVmPcyDpAV84bjp77bITX/vNU6xe31Du4piZ9SgHkh5QXVng8o/PYO3GBv7Pbxf45ldm1q85kPSQvccP40vv25PfP7WM/5q/rNzFMTPrMb0WSCQNlvSIpCclPS3pn1P69ZJelvREesxI6ZJ0paSFkuZLOji3r3MkvZge5/RWHTrrM0ftxkFTRvB/fruA5b4Ol5n1U73ZIqkHjo2IA4EZwImSZqd1fx8RM9LjiZR2EjA9Pc4HrgKQNAr4BnAYMAv4hqSRvViPklUWKrjsjAOpb2ziq7fMdxeXmfVLvRZIIrMuLQ5Kj44+WU8Bfpa2mwuMkDQeOAGYExGrI+JNYA5wYk+WvSt2GzOUr524F/c/v5KbHl1S7uKYmXW7Xh0jkVSQ9ASwgiwYPJxWXZq6ry6XVJ3SJgL5T966lNZeeutjnS9pnqR5K1eu7Pa6dMYnD5/Gu3ffmUtuf4an6nxbXjPrX3o1kEREU0TMACYBsyTtB1wE7AUcCowCvpqyq61ddJDe+lhXR8TMiJg5ZsyYbin/9qqoEJd/fAYja6o45yeP8NLKddveyMxsB1GWWVsRsQa4HzgxIpal7qt64Cdk4x6QtTQm5zabBCztIL1PGzdsMP/5V4dRIfjENQ+zdM3GchfJzKxb9OasrTGSRqTXQ4D3Ac+lcQ8kCTgVWJA2uQ34ZJq9NRtYGxHLgDuB4yWNTIPsx6e0Pm/X0bVcf+4s3t7UyCeufdg/VjSzfqE3WyTjgfskzQceJRsjuR24QdJTwFPAaOBbKf8fgEXAQuDHwN8ARMRq4JtpH48Cl6S0HcJ+E4dz7acOpe7NjXzqJ4+wrr6x3EUyM+sSDYQpqTNnzox58+aVuxhbuefZ5Zz/88eYNW0UPzn3UAYPKpS7SGZmW5H0WETM3FY+/7K9TI7bexyXnXEgDy16gy/88nE2NzWXu0hmZtvFgaSMTj1oIhefvA93PbOcT177iMdMzGyH5EBSZp86YlcuO+NAHnv1TU7+/oM8vdS/MzGzHYsDSR/w0UMmcfNnD6c5go9e9b/815N9fjazmdkWDiR9xAGTRnDb545kvwnD+fwvH+fbdzxHU3P/nwhhZjs+B5I+ZMxO1fzir2fzl4dN4Uf/8xKfvv5R1mzwuImZ9W0OJH1MVWUF//9p+3Ppafvxvy+t4rjL/odbH6/zlYPNrM9yIOmjzjpsKr+78Egmj6rhb296krOvfZhFvkaXmfVBDiR92D4ThnHLBe/mm6fux/y6tZx4xR+54u4X2LS5qdxFMzPbwoGkjytUiE/Mnso9XzmaE/fbhSvufpEPfO+P3Pf8Cnd3mVmf4ECygxi702Cu/IuD+NmnZ9EUwbk/eZTTf/QQf1q4ygHFzMrKgWQHc9SeY5jzt0fzrVP3Y+majZx1zcN8/Oq5zF30RrmLZmYDlC/auAPbtLmJmx5dwg/uW8iKt+s5fLed+dL7pjNr11FkV+U3M9t+pV600YGkH9i0uYlfPPwqP7z/JVatq+fAScM57z27cdJ+uzCo4EanmW0fB5Kc/h5IijY2NHHLn+u47sGXWbRqPROGD+acd0/jzFlTGD5kUL
mLZ2Y7GAeSnIESSIqam4P7nl/BNX98mYcWvUFNVYGPzZzM2bOnssfYoeUunpntIBxIcgZaIMlb8NparnvwZf5r/lI2NwWzdxvF2bOncvw+u1BV6W4vM2ufA0nOQA4kRSvfrufXjy3hFw+/St2bGxk9tJqPHzqJMw+dwuRRNeUunpn1QQ4kOQ4kLZqagwdeXMkNc1/l3ueWE8CRe4zm44dO5v37jKO60rf8NbOMA0mOA0nbXluzkZseXcLN85awdO0mRtQM4tQZEzlj5iT2nTC83MUzszJzIMlxIOlYU3Pwvy+t4lfz6rjz6ddpaGxm3wnDOP2QSZx84ARGD60udxHNrAwcSHIcSEq3ZkMDtz25lF/NW8KC196iskIcvecYPnLwJI7beyyDB7nry2ygcCDJcSDZPi8sf5vf/Pk1fvv4a7z+1iZ2GlzJB/cfz2kHTeTQaaOoqPCv5836MweSHAeSrmlqDh566Q1+83gd/73gdTY0NDFh+GBOnjGBUw6cyN7jd/IlWcz6IQeSHAeS7rO+vpG7n13O755YygMvrKSxOZg+diinzJjAhw+cyJSdPZXYrL/oc4FE0mDgAaAaqARujohvSNoVuBEYBfwZ+ERENEiqBn4GHAK8AXw8IhanfV0EnAc0AV+IiDs7OrYDSc9Yvb6BPzy1jNueWMoji1cDcMCk4XzogPF88IAJTBwxpMwlNLOu6IuBREBtRKyTNAh4EPgi8GXgNxFxo6QfAU9GxFWS/gY4ICI+K+lM4LSI+LikfYBfArOACcDdwJ4R0e5tAx1Iet5razZy+5NLuX3+Mp56bS0AB00ZwYcOmMAH9x/PLsMHl7mEZtZZfS6QbHVQqYYskFwA/B7YJSIaJR0OXBwRJ0i6M71+SFIl8DowBvgaQET8S9rXlnztHc+BpHe98sZ6bp+/jNvnL+PZZW8BcMjUkZy03y6ctP94t1TMdhClBpLK3ihMkaQC8BiwB/AD4CVgTUQ0pix1wMT0eiKwBCAFmbXAzil9bm63+W3yxzofOB9gypQp3V4Xa9/UnWu58L17cOF79+Cllev4/fxl3LHgdb71+2f51u+f5cBJwzlp//GctN8uTN25ttzFNbMu6tVAkrqfZkgaAdwK7N1WtvQFidiUAAALMklEQVTc1jSg6CC99bGuBq6GrEWyXQW2Ltt9zFC+cNx0vnDcdBavWs8dC17njgXL+PYdz/HtO55jr1124vh9d+GEfcexz/hhnv1ltgPq1UBSFBFrJN0PzAZGSKpMrZJJwNKUrQ6YDNSlrq3hwOpcelF+G+vDpo2u5YJjdueCY3ZnyeoN3Pn069z19HK+f++LXHnPi0wcMYTj9x3HCfvuwsypI6n0TbnMdgi9Odg+BticgsgQ4C7gO8A5wC25wfb5EfFDSRcC++cG2z8SER+TtC/wC1oG2+8Bpnuwfce1al099z67gjuffp0/LlxFQ2Mzw4cM4ph3jeHYvcZyzJ5jGV7jG3OZ9bY+N9gu6QDgp0ABqAB+FRGXSNqNlum/jwNnR0R9mi78c+AgspbImRGxKO3rH4FPA43AlyLijo6O7UCy41hf38gDL6zknudWcN9zK3hjfQOFCnHI1JEct9dY3rvXWKaPHeouMLNe0OcCSTk5kOyYmpqDJ+vWcO+zK7j3uRU8k2aAjR8+mPdMH81Re47hiN1HM7K2qswlNeufHEhyHEj6h6VrNvLACyt54MWVPPjiKt7a1IgEB0wawVHTR3P4bjtz8NSRvrCkWTdxIMlxIOl/iq2VB15YyR9fXMUTS9bQ1BxUFSqYMXkEs3ffmdm7jeLgKQ4sZtvLgSTHgaT/e3vTZuYtfpO5i95g7qI3eOq1tTQHVBUq2GfCMA6eMpKDpozgoCkjmDhiiMdYzErgQJLjQDLwvLVpM/MWr+bhRat5/NU1zH9tDZs2NwMwZqdqDpo8gv0nDmfv8cPYZ8Iwxg8f7OBi1kqf/GW7WW8ZNngQx+41jmP3GgfA5qZmnlv2No
8veZPHX13D46++yV3PLN+Sf/iQQew9fif2GT+c6eOGsuvoWnYdXcvYnaodYMy2wS0SG7DW1Tfy3LK3eHbZWzyz7G2eWfYWz7/+1paWC0BNVYFpO9ey65hapoyqYcLwwewyfAjjhw9m/PDBjKqtcqCxfsstErNtGFpdycxpo5g5bdSWtKbmYOmajSx+Yz0vr2p5PP3aWu5c8DqNzVt/8aqqrGDcsGpG1VYzqmZQ9lw7iJG1VYyqqWKnwYPYaXAlQwdXslN19jy0upKaqkoKvsOk9RMOJGY5hQoxeVQNk0fV8J7pY7Za19wcrFpXz7K1m1i2diPL1m7i9bWbeP2tTaxe38DKdfW8sHwdb6yv36pV056qygqGDCpkj6rsuXpQBVWFCqoHFdJzBdWFCqoqKxhUSI9KMaii5XVlhaisqGBQQRQqKqgsqOV1hShUiIJEoZDlLSilVYiK3HLFlnSoUMtyRYWoUJYm5dZJqCK7+F0xTa3yCZBwq62fcyAxK1FFhRg7bDBjhw3mwMkjOsy7saGJ1RsaWLepkbc3bebt+kbWbWpkXX22vKGhiY2bm9iUnjdubmZjQyP1jc3UNzazduNmGhqbqW9soqGxmYbGZhqbg82NzTQ0ZY8drVe6IgWUYuAh+5cFGpSeW/KQX1bL1VqVC1Ck7VL2Lftiy/qtt2ttS57cKuWuC9te/Msn5/fbbrjsII52NsR2NijvPX4Y3/+Lgzp5lM5xIDHrAUOqCkys6tn7rjQ1B5ubsgDTuOW5Ja0pPRqbm9NzS1pzcTmy103NQXMETc3QFEFELm8EEdAc0BzZcnNz0BQQaV0QLeubi2ls2TYiWx9svY70umU/bNlfMVBGxJbLe0erfbQE0+JxtixtyZ8ts9VyPg9bpeVetxOpt87TdvpW+TuI+J3+LrAdXx4mj+z5+/84kJjtoLJuKP/Y0srP1+k2M7MucSAxM7MucSAxM7MucSAxM7MucSAxM7MucSAxM7MucSAxM7MucSAxM7MuGRBX/5W0EnilC7sYDazqpuLsSFzvgcX1HlhKqffUiBizjTwDI5B0laR5pVxKub9xvQcW13tg6c56u2vLzMy6xIHEzMy6xIGkNFeXuwBl4noPLK73wNJt9fYYiZmZdYlbJGZm1iUOJGZm1iUOJB2QdKKk5yUtlPS1cpenJ0m6TtIKSQtyaaMkzZH0YnoeWc4ydjdJkyXdJ+lZSU9L+mJK7+/1HizpEUlPpnr/c0rfVdLDqd43Saoqd1l7gqSCpMcl3Z6WB0q9F0t6StITkualtG55rzuQtENSAfgBcBKwD/AXkvYpb6l61PXAia3SvgbcExHTgXvScn/SCHwlIvYGZgMXpr9xf693PXBsRBwIzABOlDQb+A5wear3m8B5ZSxjT/oi8GxueaDUG+C9ETEj9/uRbnmvO5C0bxawMCIWRUQDcCNwSpnL1GMi4gFgdavkU4Cfptc/BU7t1UL1sIhYFhF/Tq/fJvtwmUj/r3dExLq0OCg9AjgWuDml97t6A0iaBHwQuCYtiwFQ7w50y3vdgaR9E4ElueW6lDaQjIuIZZB96AJjy1yeHiNpGnAQ8DADoN6pe+cJYAUwB3gJWBMRjSlLf32/XwH8A9CclndmYNQbsi8Ld0l6TNL5Ka1b3uuV3VTA/khtpHmudD8kaShwC/CliHgr+5Lav0VEEzBD0gjgVmDvtrL1bql6lqQPASsi4jFJxxST28jar+qdc0RELJU0Fpgj6bnu2rFbJO2rAybnlicBS8tUlnJZLmk8QHpeUebydDtJg8iCyA0R8ZuU3O/rXRQRa4D7ycaIRkgqfrnsj+/3I4APS1pM1lV9LFkLpb/XG4CIWJqeV5B9eZhFN73XHUja9ygwPc3oqALOBG4rc5l6223AOen1OcDvyliWbpf6x68Fno2If8ut6u/1HpNaIkgaAryPbHzoPuD0lK3f1TsiLoqISRExjez/870RcRb9vN4Akmol7VR8DRwPLKCb3uv+ZX
sHJH2A7BtLAbguIi4tc5F6jKRfAseQXVp6OfAN4LfAr4ApwKvAGRHRekB+hyXpSOCPwFO09Jl/nWycpD/X+wCygdUC2ZfJX0XEJZJ2I/umPgp4HDg7IurLV9Kek7q2/i4iPjQQ6p3qeGtarAR+ERGXStqZbnivO5CYmVmXuGvLzMy6xIHEzMy6xIHEzMy6xIHEzMy6xIHEzMy6xIHEzMy6xIHErJukH/r9MF2uu17Sckn3SHp/Wr9Y0t+Vu5xm3c3X2jLrPrcANWSXIV9IdgG8o8kuDGjWb/kHiWbdIF1y5E3g/RFxdxvr7ycLKltEhNK6dwP/Ahya9nEb8NWIeCu37XNk9xH5ZNr8mpSnGbMyc9eWWfdYlx4fljS4jfUfIbsQ6CXA+PRA0v7AXWTB48CUbwZwXavtzyL7/3o48BngfOBL3V4Ls+3gFolZN5H0UeDHZN1bjwN/An4dEQ+n9YuBf4+I7+a2+RmwOSLOy6XNSNuPi4gVqUUyAXhXpP+wkv4/4LMRMak36mbWEbdIzLpJRNxC9oF/MnAH8G5grqSvd7DZIcDZktYVH2QBCGD3XL65sfW3voeAiZKGdV8NzLaPB9vNulFEbCK74+Ac4BJJ1wAXS/puO5tUkI13XN7Gutd6ppRm3cuBxKxnPUP2/2ww0EB26fa8PwP7RsTCbeznMEnKtUpmA0uLA/Jm5eSuLbNuIGlnSfdKOlvSAemGaGeQ3R/8nvSBvxh4j6SJkkanTb8DzJL0I0kHSdpD0ock/UerQ0wArpD0LkmnA39P260Ys17nFolZ91gHzAW+COwBVJN1Tf0C+FbK80/AfwAvpfWKiPmSjkp5/oesxbKIlpsQFd2Q1j1Mdk/xa3EgsT7Cs7bM+rg0a2tBRHyu3GUxa4u7tszMrEscSMzMrEvctWVmZl3iFomZmXWJA4mZmXWJA4mZmXWJA4mZmXWJA4mZmXXJ/wNbONQw8zgc4QAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# Load train data\n",
"xtrain = np.load('./normal_data/normal_xtrain.npy')\n",
"ytrain = np.load('./normal_data/normal_ytrain.npy')\n",
"\n",
"# Specify parameters\n",
"learnRate = 0.45; n_steps = 50\n",
"\n",
"# Initialize parameters\n",
"mu = tf.Variable(tf.ones([1, 3]), dtype = tf.float32)\n",
"sigma2 = tf.Variable(tf.ones([1, 3]), dtype = tf.float32)\n",
"\n",
"x = tf.placeholder(tf.float32, shape = [None, 1]) # predictor (independent variable)\n",
"ytrue = tf.placeholder(tf.float32, [None, 3]) # one-hot encoded vector\n",
"\n",
"# Probability vector -------------------------------------------------------------------------------------------\n",
"p = tf.distributions.Normal(loc = mu, scale = tf.sqrt(sigma2)).prob(x) # scale expects the standard deviation, so take the square root of the variance\n",
"\n",
"# Cross-entropy Loss Function to minimize\n",
"cross_entropy = -tf.reduce_sum(ytrue * tf.log(p)) # summed over observations and classes; already a scalar\n",
"\n",
"# Optimize via Adagrad\n",
"train_step = tf.train.AdagradOptimizer(learnRate).minimize(cross_entropy)\n",
"\n",
"# Initialize all global variables\n",
"init = tf.global_variables_initializer()\n",
"sess.run(init)\n",
"\n",
"# Printing loss before optimization\n",
"cross_entropy_Before = sess.run(cross_entropy, feed_dict={x: xtrain, ytrue: ytrain})\n",
"print(\"Training data loss before optimization: \" + str(cross_entropy_Before))\n",
"\n",
"# Conduct Optimization with training data ----------------------------------------------------------------------\n",
"losses = np.zeros(n_steps)\n",
"for i in range(n_steps):\n",
" losses[i] = sess.run(cross_entropy, feed_dict = {x: xtrain, ytrue: ytrain})\n",
" sess.run(train_step, feed_dict = {x: xtrain, ytrue: ytrain})\n",
"\n",
"# Printing loss after optimization \n",
"cross_entropy_After = sess.run(cross_entropy, feed_dict = {x: xtrain, ytrue: ytrain})\n",
"print(\"Training data loss after optimization: \" + str(cross_entropy_After))\n",
"\n",
"# Estimates of W and b\n",
"display(Latex(r'$\\hat{\\mu}_{train}=$' + str(list(sess.run(mu)[0]))))\n",
"display(Latex(r'$\\hat{\\sigma}_{train}^{2}=$' + str(list(sess.run(sigma2)[0]))))\n",
"\n",
"# Plot our Loss function\n",
"plt.title(\"Plot of Cross-entropy loss function\", fontsize = 14)\n",
"plt.xlabel(\"Step\", fontsize = 14)\n",
"plt.ylabel(\"Loss\", fontsize = 14)\n",
"_ = plt.plot(losses)"
]
},
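{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sketch of the objective above: writing $p_{k}(x)=\\frac{1}{\\sqrt{2\\pi\\sigma_{k}^{2}}}\\exp\\left(-\\frac{(x-\\mu_{k})^{2}}{2\\sigma_{k}^{2}}\\right)$ for the estimated class-$k$ density, the cross-entropy summed over the training set is\n",
"\n",
"\\begin{align*}\n",
"-\\sum_{i}\\sum_{k}y'_{ik}\\log p_{k}(x_{i}),\n",
"\\end{align*}\n",
"\n",
"where $y'_{ik}$ is the one-hot indicator that observation $i$ belongs to class $k$. Adagrad then updates $\\mu$ and $\\sigma^{2}$ with per-coordinate step sizes that shrink as squared gradients accumulate."
]
},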
{
"cell_type": "markdown",
"metadata": {},
"source": [
"7. **Evaluating loss on test data.** Load the test data. What is the cross-entropy of your model on this test data? That is, what is the cross-entropy when you use your estimated parameters with the previously unseen test data?"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The Cross-entropy on test data is: 686.2981\n"
]
}
],
"source": [
"# Load test data\n",
"xtest = np.load('./normal_data/normal_xtest.npy')\n",
"ytest = np.load('./normal_data/normal_ytest.npy')\n",
"\n",
"crossent_test = sess.run(cross_entropy, {x: xtest, ytrue: ytest})\n",
"print(\"The Cross-entropy on test data is: \" + str(crossent_test))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"8. **Evaluating parameter estimation on test data.** The true parameter values for the three classes were\n",
"\n",
"\\begin{align*}\n",
"\\mu_{0}=-1, & \\sigma_{0}^{2}=0.5\\\\\n",
"\\mu_{1}=0, & \\sigma_{1}^{2}=1\\\\\n",
"\\mu_{2}=3, & \\sigma_{2}^{2}=1.5\n",
"\\end{align*}\n",
"\n",
"Write a TensorFlow expression to compute the total squared error (i.e., summed over the six parameters) between your estimates and their true values. What is the squared error? **Note:** you need only evaluate the error of your final estimates, not at every step."
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"total squared error: 0.12180754\n"
]
}
],
"source": [
"mu_Real = tf.constant([[-1, 0, 3]], dtype = tf.float32)\n",
"sigma2_Real = tf.constant([[0.5, 1, 1.5]], dtype = tf.float32)\n",
"\n",
"sq_err = sess.run(tf.reduce_sum((mu - mu_Real)**2) + tf.reduce_sum((sigma2 - sigma2_Real)**2))\n",
"print(\"total squared error: \" + str(sq_err))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"9. **Evaluating classification error on test data.** Write and evaluate a TensorFlow expression that computes the classification error of your estimated model averaged over the test data."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The Classification Error is: 0.268\n"
]
}
],
"source": [
"correct_prediction = tf.equal(tf.argmax(p, 1), tf.argmax(ytrue, 1))\n",
"error = 1 - tf.reduce_mean(tf.cast(correct_prediction, tf.float32))\n",
"class_error = sess.run(error, {x: xtest, ytrue: ytest})\n",
"print(\"The Classification Error is: \" + str(class_error))"
]
},
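{
"cell_type": "markdown",
"metadata": {},
"source": [
"The expression above implements the plug-in rule $\\hat{k}(x)=\\arg\\max_{k}\\,p_{k}(x)$ (no class priors enter, since none were estimated), so the reported number is the average $0$-$1$ loss\n",
"\n",
"\\begin{align*}\n",
"\\frac{1}{n}\\sum_{i=1}^{n}\\mathbf{1}\\left[\\hat{k}(x_{i})\\neq k_{i}\\right]\n",
"\\end{align*}\n",
"\n",
"over the $n=500$ test observations, where $k_{i}$ is the true class of observation $i$."
]
},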
{
"cell_type": "markdown",
"metadata": {},
"source": [
"10. Again, for ease of grading, define a dictionary called `results_class`, with keys `'mu'`, `'sigma2'`, `'crossent_test'`, and `'class_error'`, whose values correspond to the evaluation (again using `sess.run`) of your estimate of $\\mu$, $\\sigma^2$, the cross-entropy on the test set, and the classification error from the previous problem."
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [],
"source": [
"results_class = {'mu': sess.run(mu), 'sigma2': sess.run(sigma2), \n",
" 'crossent_test': crossent_test, 'class_error': class_error}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Problem 3: Building a Complicated Model\n",
"#### Time Spent: 2 hours\n",
"#### ([Back to Top](#Table-of-Contents))\n",
"\n",
"The TensorFlow documentation includes tutorials on building a number of more complicated neural models in TensorFlow: https://www.tensorflow.org/tutorials/. In the left side panel, choose any one tutorial from under one of the headings \"ML at production scale\", \"Generative models\", \"Images\" or \"Sequences\" and follow it. Some of the tutorials include instructions along the lines of \"We didn't discuss this trick, try adding it!\". You do not need to do any of these additional steps (though you will certainly learn something if you do!). **Warning:** some of the tutorials require large amounts of training data. If this is the case, please do not include the training data in your submission! Instead, include a line of code to download the data from wherever it is stored. Also, some of the tutorials require especially long training time, (e.g., the neural models) so budget your time accordingly!\n",
"\n",
"Your submission for this problem should be a *separate* jupyter notebook called `tutorial.ipynb` (no need to include your uniqname), which includes code to load the training and test data, build and train a model, and evaluate that model on test data. That is, the code in `tutorial.ipynb` should perform all the training and testing steps performed in the tutorial, but without having to be run from the command line. Depending on which model you choose, training may take a long time if you use the preset number of training steps, so be sure to include a variable called `nsteps` that controls the number of training steps, and set it to be something moderately small for your submission.\n",
"\n",
"**Note:** it will not be enough to simply copy the tutorial's python code into your\n",
"jupyter notebook, since the demo code supplied in the tutorials is meant to be run from\n",
"the command line.\n",
"\n",
"**Another note:** If it was not clear, you are, for this problem and this problem only,\n",
"permitted to copy-paste code from the TensorFlow tutorials as much as you like without\n",
"penalty.\n",
"\n",
"**One more note:** Please make sure that in both `tutorial.ipynb` and your main submission notebook `uniqname.hw10.ipynb` you do not set any training times to be excessively long. You are free to set the number of training steps as you like for running on your own machine, but please set these parameters to something more reasonable in your submission so that we do not need to wait too long when running your notebook. Aim to set the number of training steps so that we can run each of your submitted notebooks less than a minute. "
]
},
{
"cell_type": "raw",
"metadata": {
"scrolled": true
},
"source": [
"See: tutorial.ipynb"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Problem 4: Running Models on Google Cloud Platform\n",
"#### Time Spent: 5 hours\n",
"#### ([Back to Top](#Table-of-Contents))\n",
"In this problem, you'll get a bit of experience running TensorFlow jobs on Google Cloud Platform (GCP), Google's cloud computing service. Google has provided us with a grant, which will provide each of you with free compute time on GCP.\n",
"\n",
"**Important:** this problem is **very hard**. It involves a sequence of fairly complicated operations in GCP. As such, I do not expect every student to complete it. Don't worry\n",
"about that. Unless you've done a lot of programming in the past, this problem is likely\n",
"your first foray into learning a new tool largely from scratch instead of having my lectures\n",
"to guide you. The ability to do this is a crucial one for any data scientist, so consider\n",
"this a learning opportunity (and a sort of miniature final exam). Start early, read the\n",
"documentation carefully, and come to office hours if you're having trouble.\n",
"\n",
"Good luck, and have fun!\n",
"\n",
"The first thing you should do is claim your share of the grant money by visiting this link: https://google.secure.force.com/GCPEDU?cid=VYZbhLIwytS0UVxuWxYyRYgNVxPMOf37oBx0hRmx71pfjlHwvVnxlUdVjBD6l5XA You will need to supply your name and your UMich email. Please use the email address associated to your unique name (i.e., uniqname@umich.edu), so that we can easily determine which account belongs to which student. Once you have submitted this form, you will receive a confirmation email through which you can claim your compute credits. These credits are valid on GCP until they expire in January 2020. Any credits left over after completing this homework are yours to use as you wish. Make sure that you claim your credits while signed in under your University of Michigan email, rather than a personal gmail account so that your project is correctly associated with your UMich email. If you accidentally claim the credits under a different address, add your unique name email as an owner.\n",
"\n",
"Once you have claimed your credits, you should create a project, which will serve as a repository for your work on this problem. You should name your project `uniqname-stats507w19`, where `uniqname` is your unique name in all lower-case letters. Your project's billing should be automatically linked to your credits, but you can verify this fact in the billing section dashboard in the GCP browser console. Please add both me (UMID `klevin`) and your GSI Roger Fan (UMID `rogerfan`) as owners. You can do this in the IAM tab of the IAM & admin dashboard by clicking \"Add\" near the top of the page, and listing our UMich emails and specifying our Roles as Project ! Owner.\n",
"\n",
"**Note:** this problem is comparatively complicated, and involves a lot of moving parts. At the end of this problem (several pages below), I have included a list of all the files that should be included in your submission for this problem, as well as a list of what should be on your GCP project upon submission.\n",
"\n",
"**Important:** after the deadline (May 2nd at 10:00am) you **should not** edit your GCP project in any way until you receive a grade for the assignment in canvas. If your project indicates that any files or running processes have been altered after the deadline by a user other than `klevin` or `rogerfan`, we will assume this to be an instance of editing your assignment after the deadline, and you will receive a penalty.\n",
"1. Follow the tutorial at https://cloud.google.com/ml-engine/docs/distributed-tensorflow-mnist-cloud-datalab, which will walk you through the process of training a CNN similar to the one we saw in class, but this time using resources on GCP instead of your own machine. This tutorial will also have you set up a DataLab notebook, which is Google's version of a Jupyter notebook, in which you can interactively draw your own digits and pass them to your neural net for classification. **Important:** the tutorial will tell you to tear your nodes and storage down at the end. Do not do that. Leave everything running so that we can verify that you set things up correctly. It should only cost a few dollars to leave the datalab server and storage buckets running, but if you wish to conserve your credits, you can tear everything down and go through the tutorial again on the evening of May 1st or the (early!) morning of May 2nd."
]
},
{
"cell_type": "raw",
"metadata": {},
"source": [
"See: gcloud"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"2. Let us return to the classifier that you trained above on the normally-distributed data. In this and the next several subproblems, we will take an adaptation of that model and upload it to GCP where it will serve as a prediction node similar to the one you built in the tutorial above. Train the same classifier on the same training data, but this time, save the resulting trained model in a directory called `normal_trained`. You'll want to use the `tf.saved_model.simple_save` function. Refer to the GCP documentation at https://cloud.google.com/ml-engine/docs/deploying-models, and the documentation on the `tf.saved_model.simple_save` function, here: https://www.tensorflow.org/programmers_guide/saved_model#save_and_restore_models Please include a copy of this model directory in your submission. **Hint:** a stumbling block in this problem is figuring out what to supply as the inputs and outputs arguments to the `simple_save` function. Your arguments should look something like `inputs = {'x':x}`, `outputs = {'prediction':prediction}`."
]
},
{
"cell_type": "raw",
"metadata": {},
"source": [
"# Load train data\n",
"xtrain = np.load('./normal_data/normal_xtrain.npy')\n",
"ytrain = np.load('./normal_data/normal_ytrain.npy')\n",
"\n",
"# Specify parameters\n",
"learnRate = 0.45; n_steps = 100\n",
"\n",
"# Initialize parameters\n",
"mu = tf.Variable(tf.ones([1, 3]), dtype = tf.float32)\n",
"sigma2 = tf.Variable(tf.ones([1, 3]), dtype = tf.float32)\n",
"\n",
"x = tf.placeholder(tf.float32, shape = [None, 1]) # predictor (independent variable)\n",
"ytrue = tf.placeholder(tf.float32, [None, 3]) # one-hot encoded vector\n",
"\n",
"# Probability vector -------------------------------------------------------------------------------------------\n",
"p = tf.distributions.Normal(loc = mu, scale = sigma2).prob(x)\n",
"\n",
"# Cross-entropy Loss Function to minimize\n",
"cross_entropy = tf.reduce_mean(-(tf.reduce_sum(ytrue * tf.log(p))))\n",
"\n",
"# Optimize via Adagrad\n",
"train_step = tf.train.AdagradOptimizer(learnRate).minimize(cross_entropy)\n",
"\n",
"# Initialize all global variables\n",
"init = tf.global_variables_initializer()\n",
"sess.run(init)\n",
"\n",
"# Conduct Optimization with training data ----------------------------------------------------------------------\n",
"losses = np.zeros(n_steps)\n",
"for i in range(n_steps):\n",
" losses[i] = sess.run(cross_entropy, feed_dict = {x: xtrain, ytrue: ytrain})\n",
" sess.run(train_step, feed_dict = {x: xtrain, ytrue: ytrain})\n",
"\n",
"# Compute prediction our normal model\n",
"prediction = tf.argmax(p, axis = 1)\n",
"\n",
"# Save training model\n",
"dir_name = \"normal_trained\"\n",
"tf.saved_model.simple_save(sess, dir_name, inputs = {\"x\": x}, outputs = {'prediction': prediction})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"3. Let's upload that model to GCP. First, we need somewhere to put your model. You already set up a bucket in the tutorial, but let's build a separate one. Create a new bucket called `uniqname-stats507w19-hw10-normal`, where `uniqname` is your uniqname. You should be able to do this by making minor changes to the commands you ran in the tutorial, or by following the instructions at https://cloud.google.com/solutions/running-distributed-tensorflow-on-compute-engine#creating_a_cloud_storage_bucket. Now, we need to upload your saved model to this bucket. There are several ways to do this, but the easiest is to follow the instructions at https://cloud.google.com/storage/docs/uploading-objects and upload your model through the GUI. **Optional challenge (worth no extra points, just bragging rights):** Instead of using the GUI, download and install the Google Cloud SDK, available at https://cloud.google.com/sdk/ and use the `gsutil` command line tool to upload your model to a storage bucket."
]
},
{
"cell_type": "raw",
"metadata": {},
"source": [
"# Cloud Shell Commands:\n",
"\n",
"BUCKET=\"israeldi-stats507w19-hw10-normal\"\n",
"gsutil mb -c regional -l us-east1 gs://${BUCKET}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"4. Now we need to create a *version* of your model. Versions are how the GCP machine learning tools organize different instances of the same model (e.g., the same model trained on two different data sets). To do this, follow the instructions located at https://cloud.google.com/ml-engine/docs/deploying-models#creating_a_model_version, which will ask you to\n",
" * Upload a SavedModel directory (which you just did)\n",
" * Create a Cloud ML Engine model resource\n",
" * Create a Cloud ML Engine version resource (this specifies where your model is stored, among other information)\n",
" * Enable the appropriate permissions on your account.\n",
"\n",
"Please name your model `stats507w19_hw10_normal` (note the underscores here as opposed to the hyphens in the bucket name; see the documentation for the `gcloud ml-engine versions` command for how to delete versions, if need be). **Important:** there are a number of pitfalls that you may encounter here, which I want to warn you about: A good way to check that your model resource and version are set up correctly is to run the command `gcloud ml-engine versions describe \"your_version_name\" --model \"your_model_name\"`. The resulting output should include a line reading `state: READY`. You may notice that the Python version for the model appears as, say, `pythonVersion: '2.7'`, even though you used, say, Python 3.6. This should not be a problem, but you should make sure that the `runtimeVersion` is set correctly. If the line `runtimeVersion: '1.0'` is appearing when you describe your version, you are likely headed for a bug. You can prevent this bug by adding the flag `--runtime-version 1.6` to your `gcloud ml-engine versions create` command, and making sure that you are running TensorFlow version 1.6 on your local machine (i.e., the machine where you're running Jupyter). Running version 1.7 locally while running 1.6 on GCP also seems to work fine."
]
},
{
"cell_type": "raw",
"metadata": {},
"source": [
"# There is an error that occurs in problem 4.6 if runtimeVersions don't match between our own \n",
"# computer and gcloud. Someone on this forum page: https://github.com/tensorflow/tensorflow/issues/22994\n",
"# mentions that using runtimeVersion 1.11 and above fixes this issue. Therefore, I am running \n",
"# tensorflow version 1.12\n",
"\n",
"# Cloud Shell Commands: \n",
"\n",
"# Step 1:\n",
"MODEL_DIR=\"gs://israeldi-stats507w19-hw10-normal/\"\n",
"VERSION_NAME=\"v1_hw10\"\n",
"MODEL_NAME=\"stats507w19_hw10_normal\"\n",
"FRAMEWORK=\"TENSORFLOW\"\n",
"\n",
"# Step 2:\n",
"gcloud ml-engine versions create $VERSION_NAME \\\n",
" --model $MODEL_NAME \\\n",
" --origin $MODEL_DIR \\\n",
" --runtime-version=1.12 \\\n",
" --framework $FRAMEWORK \\\n",
" --python-version=3.5"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"5. Create a .json file corresponding to a single prediction instance on the input observation $x = 4$. Name this .json file `instance.hw10.json`, and please include a copy of it in your submission. **Hint:** you will likely find it easiest to use nano/vim/emacs to edit edit the .json file from the tutorial (GCP Cloud Shell has versions of all three of these editors). Doing this will allow you to edit a copy of the .json file directly in the GCP shell instead of going through the trouble of repeatedly downloading and uploading files. Being proficient with a shell-based text editor is also, generally speaking, a good skill for a data scientist to have."
]
},
{
"cell_type": "raw",
"metadata": {},
"source": [
"vi instance.hw10.json"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"6. Okay, it's time to make a prediction. Follow the instructions at https://cloud.google.com/ml-engine/docs/online-predict#requesting_predictions to submit the observation in your .json file to your running model. Your model will make a prediction, and print the output of the model to the screen. Please include a copy-paste of the command you ran to request this prediction as well as the resulting output. Which cluster does your model think $x = 4$ came from? **Hint:** if you are getting errors about dimensions being wrong, make sure that your instance has the correct dimension expected by your model. Second hint: if you are encountering an error along the lines of `Error during model execution: AbortionError(code=StatusCode.INVALID_ARGUMENT, details=\\\"NodeDef mentions attr 'output_type'`, this is an indication that there is a mismatch between the version of TensorFlow that you used to create your model and the one that you are running on GCP. See the discussion of `gcloud ml-engine versions create` above.\n",
"\n",
"That's all of it! Great work! Here is a list of all files that should be included for this problem in your submission, as well as a list of what processes or resources should be left running in your GCP project:\n",
" * You should leave the datalab notebook and its supporting resources (i.e., the prediction node and storage bucket) from the GCP ML tutorial running in your GCP project.\n",
" * Include in your submission a copy of the saved model directory constructed from your classifier. You should also have a copy of this directory in a storage bucket on GCP.\n",
" * Leave a storage bucket running on GCP containing your uploaded model directory. This storage bucket should contain a model with a single version.\n",
" * Include in your submission a `.json` file representing a single observation. You need not include a copy of this file in a storage bucket on GCP; it will be stored by default in your GCP home directory if you created it in a text editor in the GCP shell.\n",
" * Include in your jupyter notebook a copy-paste of the command you ran to request your model's prediction on the .json file, and please include the output that was printed to the screen in response to that prediction request. **Note:** Please make sure that the cell(s) that you copy-paste into is/are set to be Raw NBconvert cell(s), so that your commands display as code but are not run as code by Jupyter."
]
},
{
"cell_type": "raw",
"metadata": {},
"source": [
"# GCP Shell command: \n",
"\n",
"MODEL_NAME=\"stats507w19_hw10_normal\" \n",
"INPUT_DATA_FILE=\"instance.hw10.json\" \n",
"VERSION_NAME=\"v1_hw10\" \n",
"gcloud ml-engine predict --model $MODEL_NAME \\\n",
" --version $VERSION_NAME \\\n",
" --json-instances $INPUT_DATA_FILE"
]
},
{
"cell_type": "raw",
"metadata": {},
"source": [
"# Output:\n",
"\n",
"PREDICTION\n",
"2"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
}
},
"nbformat": 4,
"nbformat_minor": 2
}