{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "edfbxDDh2AEs"
},
"source": [
"## Generate Shakespearian Scripts using PyTorch\n",
"\n",
"This tutorial is adapted from this github repository: [albertlai431](https://github.com/albertlai431/Machine-Learning/tree/master/Text%20Generation)"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"%matplotlib inline\n",
"%load_ext autoreload\n",
"%autoreload 2\n",
"\n",
"from contextlib import contextmanager\n",
"import copy\n",
"import itertools\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"import sys\n",
"\n",
"import torch\n",
"from torch import nn\n",
"import torch.nn.functional as F\n",
"import torch.quantization as tq\n",
"\n",
"device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "KRQ6Fjra3Ruq"
},
"source": [
"### Download data\n",
"\n",
"Download *The Complete Works of William Shakespeare* as a single text file from [Project Gutenberg](https://www.gutenberg.org/)."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 208
},
"colab_type": "code",
"id": "j8sIXh1DEDDd",
"outputId": "79f9cbb8-98ae-4c5e-c8d4-2ed31eae4d93"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"--2021-03-18 02:50:58-- http://www.gutenberg.org/files/100/100-0.txt\n",
"Resolving www.gutenberg.org (www.gutenberg.org)... 2610:28:3090:3000:0:bad:cafe:47, 152.19.134.47\n",
"Connecting to www.gutenberg.org (www.gutenberg.org)|2610:28:3090:3000:0:bad:cafe:47|:80... connected.\n",
"HTTP request sent, awaiting response... 416 Requested Range Not Satisfiable\n",
"\n",
" The file is already fully retrieved; nothing to do.\n",
"\n"
]
}
],
"source": [
"!wget --show-progress --continue -O ./shakespeare.txt http://www.gutenberg.org/files/100/100-0.txt"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Load the data"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The Project Gutenberg eBook of The Complete Works of William Shakespeare, by William Shakespeare\n",
"\n",
"This eBook is for the use of anyone anywhere in the United States and\n",
"most other parts of the world at no cost and with almost no restrictions\n",
"whatsoev\n"
]
}
],
"source": [
"with open('shakespeare.txt', 'r') as f:\n",
" text = f.read()\n",
"\n",
"# Showing the first 250 characters\n",
"print(text[:250])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Tokenize the data"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[ 70 41 91 61 28 26 19 94 27 61 59 45 28 3 71 45 61 95\n",
" 32 61 19 100 28 61 89 94 94 20 28 94 72 28 41 91 61 28\n",
" 64 94 24 14 7 61 45 61 28 106 94 19 20 105 28 94 72 28\n",
" 106 77 7 7 77 52 24 28 97 91 52 20 61 105 14 61 52 19\n",
" 61 0 28 32 21 28 106 77 7 7 77 52 24 28 97 91 52 20\n",
" 61 105 14 61 52 19 61 50 50 41 91 77 105 28 61 89 94 94\n",
" 20 28 77 105 28 72 94 19 28 45 91 61 28 71 105 61 28 94\n",
" 72 28 52 95 21 94 95 61 28 52 95 21 22 91 61 19 61 28\n",
" 77 95 28 45 91 61 28 104 95 77 45 61 18 28 97 45 52 45\n",
" 61 105 28 52 95 18 50 24 94 105 45 28 94 45 91 61 19 28\n",
" 14 52 19 45 105 28 94 72 28 45 91 61 28 22 94 19 7 18\n",
" 28 52 45 28 95 94 28 59 94 105 45 28 52 95 18 28 22 77\n",
" 45 91 28 52 7 24 94 105 45 28 95 94 28 19 61 105 45 19\n",
" 77 59 45 77 94 95 105 50 22 91 52 45 105 94 61 98]\n"
]
}
],
"source": [
"# encoding the text and map each character to an integer and vice versa\n",
"\n",
"# We create two dictionaries:\n",
"# 1. int2char, which maps integers to characters\n",
"# 2. char2int, which maps characters to integers\n",
"chars = tuple(set(text))\n",
"int2char = dict(enumerate(chars))\n",
"char2int = {ch: ii for ii, ch in int2char.items()}\n",
"\n",
"# Encode the text\n",
"encoded = np.array([char2int[ch] for ch in text])\n",
"\n",
"# Showing the first 100 encoded characters\n",
"print(encoded[:250])"
]
},
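{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (illustrative, not part of the original gist), the mapping can be round-tripped: decoding the first few integers should reproduce the start of the raw text."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Decode the first 100 integers back to characters;\n",
"# this should match the beginning of the raw text exactly.\n",
"decoded = ''.join(int2char[i] for i in encoded[:100])\n",
"assert decoded == text[:100]\n",
"print(decoded)"
]
},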
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Utilities"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"# Defining method to encode one hot labels\n",
"def one_hot_encode(arr, n_labels):\n",
" \n",
" # Initialize the the encoded array\n",
" one_hot = np.zeros((np.multiply(*arr.shape), n_labels), dtype=np.float32)\n",
" \n",
" # Fill the appropriate elements with ones\n",
" one_hot[np.arange(one_hot.shape[0]), arr.flatten()] = 1.\n",
" \n",
" # Finally reshape it to get back to the original array\n",
" one_hot = one_hot.reshape((*arr.shape, n_labels))\n",
" \n",
" return one_hot\n",
" \n",
"# Defining method to make mini-batches for training\n",
"def get_batches(arr, batch_size, seq_length):\n",
" '''Create a generator that returns batches of size\n",
" batch_size x seq_length from arr.\n",
" \n",
" Arguments\n",
" ---------\n",
" arr: Array you want to make batches from\n",
" batch_size: Batch size, the number of sequences per batch\n",
" seq_length: Number of encoded chars in a sequence\n",
" '''\n",
" \n",
" batch_size_total = batch_size * seq_length\n",
" # total number of batches we can make\n",
" n_batches = len(arr)//batch_size_total\n",
" \n",
" # Keep only enough characters to make full batches\n",
" arr = arr[:n_batches * batch_size_total]\n",
" # Reshape into batch_size rows\n",
" arr = arr.reshape((batch_size, -1))\n",
" \n",
" # iterate through the array, one sequence at a time\n",
" for n in range(0, arr.shape[1], seq_length):\n",
" # The features\n",
" x = arr[:, n:n+seq_length]\n",
" # The targets, shifted by one\n",
" y = np.zeros_like(x)\n",
" try:\n",
" y[:, :-1], y[:, -1] = x[:, 1:], arr[:, n+seq_length]\n",
" except IndexError:\n",
" y[:, :-1], y[:, -1] = x[:, 1:], arr[:, 0]\n",
" yield x, y"
]
},
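{
"cell_type": "markdown",
"metadata": {},
"source": [
"A small illustrative check of the two helpers on a toy array: `get_batches` should yield `(batch_size, seq_length)` inputs with targets shifted by one position, and `one_hot_encode` should append a one-hot dimension."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Toy example: 40 fake \"characters\" drawn from 4 distinct labels\n",
"toy = np.arange(40) % 4\n",
"bx, by = next(get_batches(toy, batch_size=2, seq_length=5))\n",
"print(bx.shape, by.shape)               # (2, 5) (2, 5)\n",
"print((by[:, :-1] == bx[:, 1:]).all())  # targets are inputs shifted by one\n",
"print(one_hot_encode(bx, 4).shape)      # (2, 5, 4)"
]
},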
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Define the Model"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"# Declaring the model\n",
"class CharRNN(nn.Module):\n",
" \n",
" def __init__(self, tokens, n_hidden=256, n_layers=2, drop_prob=0.5):\n",
" super().__init__()\n",
" self.drop_prob = drop_prob\n",
" self.n_layers = n_layers\n",
" self.n_hidden = n_hidden\n",
" \n",
" # creating character dictionaries\n",
" self.chars = tokens\n",
" self.int2char = dict(enumerate(self.chars))\n",
" self.char2int = {ch: ii for ii, ch in self.int2char.items()}\n",
" \n",
" # Define quantizers\n",
" self.quant_x = tq.QuantStub()\n",
" self.quant_h = nn.ModuleList([tq.QuantStub(), tq.QuantStub()])\n",
" \n",
" # Define the LSTM\n",
" self.lstm = nn.LSTM(len(self.chars), n_hidden, n_layers, \n",
" dropout=drop_prob, batch_first=True)\n",
"\n",
" # Define a dropout layer\n",
" self.dropout = nn.Dropout(drop_prob)\n",
"\n",
" # Define the final, fully-connected output layer\n",
" self.fc = nn.Linear(n_hidden, len(self.chars))\n",
" \n",
" # Define dequantizers\n",
" self.dequant = tq.DeQuantStub()\n",
" \n",
" def forward(self, x, h):\n",
" ''' Forward pass through the network. \n",
" These inputs are x, and the hidden/cell state `hidden`. '''\n",
" # Quantize the inputs\n",
" qx = self.quant_x(x)\n",
" qh = [self.quant_h[0](h[0]), self.quant_h[1](h[1])]\n",
" \n",
" #get the outputs and the new hidden state from the lstm\n",
" r_output, h = self.lstm(qx, qh)\n",
" \n",
" #pass through a dropout layer\n",
" out = self.dropout(r_output)\n",
" \n",
" # Stack up LSTM outputs using view\n",
" out = out.contiguous().view(-1, self.n_hidden)\n",
" \n",
" #put x through the fully-connected layer\n",
" out = self.fc(out)\n",
" \n",
" out = self.dequant(out)\n",
" h = self.dequant(h[0]), self.dequant(h[1])\n",
" # return the final output and the hidden state\n",
" return out, h\n",
" \n",
"\n",
"def init_hidden(model, batch_size, device='cpu'):\n",
" ''' Initializes hidden state '''\n",
" # Create two new tensors with sizes n_layers x batch_size x n_hidden,\n",
" # initialized to zero, for hidden state and cell state of LSTM\n",
" hidden = (\n",
" torch.zeros(model.n_layers, batch_size, model.n_hidden, device=device),\n",
" torch.zeros(model.n_layers, batch_size, model.n_hidden, device=device),\n",
" )\n",
" return hidden"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CharRNN(\n",
" (quant_x): QuantStub()\n",
" (quant_h): ModuleList(\n",
" (0): QuantStub()\n",
" (1): QuantStub()\n",
" )\n",
" (lstm): LSTM(108, 512, num_layers=2, batch_first=True, dropout=0.5)\n",
" (dropout): Dropout(p=0.5, inplace=False)\n",
" (fc): Linear(in_features=512, out_features=108, bias=True)\n",
" (dequant): DeQuantStub()\n",
")\n"
]
}
],
"source": [
"# Define and print the net\n",
"n_hidden=512\n",
"n_layers=2\n",
"\n",
"net = CharRNN(chars, n_hidden, n_layers)\n",
"print(net)"
]
},
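{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before training, the forward pass can be verified on dummy data (an illustrative check, not part of the original workflow): a one-hot batch plus a fresh hidden state should produce logits of shape `(batch_size * seq_length, len(chars))`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Dummy forward pass: 4 sequences of 10 steps each, on CPU\n",
"dummy = one_hot_encode(np.random.randint(0, len(chars), (4, 10)), len(chars))\n",
"h0 = init_hidden(net, batch_size=4)\n",
"out, h1 = net(torch.from_numpy(dummy), h0)\n",
"print(out.shape)    # expected: torch.Size([40, 108]) == (4 * 10, len(chars))\n",
"print(h1[0].shape)  # (n_layers, batch, n_hidden) = (2, 4, 512)"
]
},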
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"# Declaring the train method\n",
"def train(net, data, epochs=10, batch_size=10, seq_length=50,\n",
" lr=0.001, clip=5, val_frac=0.1, print_every=10, device='cpu'):\n",
" ''' Training a network \n",
" \n",
" Arguments\n",
" ---------\n",
" \n",
" net: CharRNN network\n",
" data: text data to train the network\n",
" epochs: Number of epochs to train\n",
" batch_size: Number of mini-sequences per mini-batch, aka batch size\n",
" seq_length: Number of character steps per mini-batch\n",
" lr: learning rate\n",
" clip: gradient clipping\n",
" val_frac: Fraction of data to hold out for validation\n",
" print_every: Number of steps for printing training and validation loss\n",
" device: Device to work on\n",
" \n",
" '''\n",
" spinner = itertools.cycle(['|', '/', '-', '\\\\']) # A \"busy\" spinner\n",
" \n",
" old_training = net.training\n",
" net.train()\n",
" \n",
" opt = torch.optim.Adam(net.parameters(), lr=lr)\n",
" criterion = nn.CrossEntropyLoss()\n",
" \n",
" # create training and validation data\n",
" val_idx = int(len(data)*(1-val_frac))\n",
" data, val_data = data[:val_idx], data[val_idx:]\n",
" \n",
" net.to(device)\n",
" \n",
" n_chars = len(net.chars)\n",
" n_batches = len(data) // (batch_size * seq_length)\n",
" dot_every = n_batches // 100 + 1\n",
" \n",
" best_model = {\n",
" 'epoch': -1,\n",
" 'train_loss': float('inf'),\n",
" 'val_loss': float('inf'),\n",
" 'model': None\n",
" }\n",
" \n",
" history = {\n",
" 'epoch': [],\n",
" 'train_loss': [],\n",
" 'val_loss': [],\n",
" }\n",
"\n",
" for e in range(epochs):\n",
" print(f'{e+1:>03}/{n_epochs:>03}', end='', flush=True)\n",
" history['epoch'].append(e)\n",
" # initialize hidden state\n",
" h = init_hidden(net, batch_size, device)\n",
" \n",
" # Train Phase\n",
" counter = 0\n",
" running_loss = 0.0\n",
" net.train()\n",
" for x, y in get_batches(data, batch_size, seq_length):\n",
" counter += 1\n",
" \n",
" # One-hot encode our data and make them Torch tensors\n",
" x = one_hot_encode(x, n_chars)\n",
" inputs, targets = torch.from_numpy(x), torch.from_numpy(y)\n",
" inputs, targets = inputs.to(device), targets.to(device)\n",
"\n",
" # Creating new variables for the hidden state, otherwise\n",
" # we'd backprop through the entire training history\n",
" h = tuple([each.data for each in h])\n",
"\n",
" # zero accumulated gradients\n",
" net.zero_grad()\n",
" \n",
" # get the output from the model\n",
" output, h = net(inputs, h)\n",
" \n",
" # calculate the loss and perform backprop\n",
" loss = criterion(output, targets.view(batch_size*seq_length).long())\n",
" loss.backward()\n",
" # `clip_grad_norm` helps prevent the exploding gradient problem in RNNs / LSTMs.\n",
" nn.utils.clip_grad_norm_(net.parameters(), clip)\n",
" opt.step()\n",
" \n",
" running_loss += loss.item()\n",
" \n",
" if counter % (10 * dot_every) == 0:\n",
" print(':', end='', flush=True)\n",
" elif counter % dot_every == 0:\n",
" print('.', end='', flush=True)\n",
" loss = running_loss / counter\n",
" history['train_loss'].append(loss)\n",
" \n",
" print() \n",
" # Eval Phase\n",
" val_h = init_hidden(net, batch_size, device)\n",
" val_losses = []\n",
" running_loss = 0.0\n",
" counter = 0\n",
" net.eval()\n",
" for x, y in get_batches(val_data, batch_size, seq_length):\n",
" sys.stdout.write('\\r\\033[K')\n",
" print(next(spinner) + ' Validating', end='', flush=True)\n",
" # One-hot encode our data and make them Torch tensors\n",
" x = one_hot_encode(x, n_chars)\n",
" x, y = torch.from_numpy(x), torch.from_numpy(y)\n",
"\n",
" # Creating new variables for the hidden state, otherwise\n",
" # we'd backprop through the entire training history\n",
" val_h = tuple([each.data for each in val_h])\n",
"\n",
" inputs, targets = x.to(device), y.to(device)\n",
"\n",
" output, val_h = net(inputs, val_h)\n",
" val_loss = criterion(output, targets.view(batch_size*seq_length).long())\n",
"\n",
" val_losses.append(val_loss.item())\n",
" val_loss = np.mean(val_losses)\n",
" history['val_loss'].append(val_loss)\n",
" \n",
" if val_loss <= best_model['val_loss']:\n",
" best_model.update({\n",
" 'epoch': e,\n",
" 'train_loss': loss,\n",
" 'val_loss': val_loss,\n",
" 'model': copy.deepcopy(net)\n",
" })\n",
"\n",
" sys.stdout.write('\\r\\033[K')\n",
" print(f\"\\r\\033[K\\tTrain Loss: {loss:.4f}; Val Loss: {val_loss:.4f}\")\n",
" \n",
" net.train(old_training)\n",
" return best_model, history"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"001/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 3.0595; Val Loss: 2.3688\n",
"002/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 2.1082; Val Loss: 1.9720\n",
"003/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.8323; Val Loss: 1.8137\n",
"004/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.6878; Val Loss: 1.7265\n",
"005/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.5914; Val Loss: 1.6699\n",
"006/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.5226; Val Loss: 1.6317\n",
"007/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.4717; Val Loss: 1.6036\n",
"008/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.4326; Val Loss: 1.5848\n",
"009/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.4015; Val Loss: 1.5675\n",
"010/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.3763; Val Loss: 1.5584\n",
"011/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.3558; Val Loss: 1.5483\n",
"012/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.3378; Val Loss: 1.5427\n",
"013/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.3220; Val Loss: 1.5299\n",
"014/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.3089; Val Loss: 1.5227\n",
"015/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2966; Val Loss: 1.5197\n",
"016/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2860; Val Loss: 1.5135\n",
"017/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2760; Val Loss: 1.5106\n",
"018/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2673; Val Loss: 1.5089\n",
"019/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2591; Val Loss: 1.5074\n",
"020/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2517; Val Loss: 1.5037\n",
"021/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2446; Val Loss: 1.5053\n",
"022/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2381; Val Loss: 1.5049\n",
"023/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2324; Val Loss: 1.4982\n",
"024/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2269; Val Loss: 1.5017\n",
"025/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2220; Val Loss: 1.5013\n",
"026/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2168; Val Loss: 1.4991\n",
"027/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2120; Val Loss: 1.4999\n",
"028/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2074; Val Loss: 1.4988\n",
"029/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.2030; Val Loss: 1.4979\n",
"030/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1995; Val Loss: 1.5026\n",
"031/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1955; Val Loss: 1.5009\n",
"032/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1922; Val Loss: 1.5033\n",
"033/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1889; Val Loss: 1.5028\n",
"034/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1853; Val Loss: 1.5030\n",
"035/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1820; Val Loss: 1.5055\n",
"036/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1791; Val Loss: 1.5001\n",
"037/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1764; Val Loss: 1.5048\n",
"038/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1733; Val Loss: 1.5088\n",
"039/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1704; Val Loss: 1.5067\n",
"040/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1678; Val Loss: 1.5100\n",
"041/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1653; Val Loss: 1.5169\n",
"042/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1628; Val Loss: 1.5155\n",
"043/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1605; Val Loss: 1.5194\n",
"044/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1581; Val Loss: 1.5144\n",
"045/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1555; Val Loss: 1.5217\n",
"046/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1535; Val Loss: 1.5199\n",
"047/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1514; Val Loss: 1.5230\n",
"048/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1490; Val Loss: 1.5241\n",
"049/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1469; Val Loss: 1.5237\n",
"050/050.........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"\u001b[K\tTrain Loss: 1.1451; Val Loss: 1.5242\n"
]
}
],
"source": [
"# Declaring the hyperparameters\n",
"batch_size = 128\n",
"seq_length = 100\n",
"n_epochs = 50 # start smaller if you are just testing initial behavior\n",
"\n",
"# train the model\n",
"best_model, history = train(net, encoded, epochs=n_epochs, batch_size=batch_size, seq_length=seq_length,\n",
" lr=0.001, print_every=50, device=device)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAisAAAGkCAYAAADwoGW4AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOzde3xb5X0/8M+RLFt2fJGcQC7EIZG5NdyK7AClwGgj0QFtocVKWlooZY20rl1baLGb/rYV9tuWnzy6rV27VkpXoLSAbY0mMCjDgoVbS4ktCJBwtXKD3LCt47tsWTq/P4517KObJVs3O5/366WXfI5uj48T6aPn+T7PESRJkkBERERUpDSFbgARERFRKgwrREREVNQYVoiIiKioMawQERFRUWNYISIioqLGsEJERERFjWGFiIiIihrDChERERW1kkI3YL4ikQiOHDmCqqoqCIJQ6OYQERFRGiRJwtDQEFatWgWNJnXfyYIPK0eOHEFdXV2hm0FERERzcPjwYaxevTrlfRZ8WKmqqgIg/7LV1dVZfe5QKISnnnoKV199NXQ6XVafm+LxeOcXj3d+8XjnF493fs3leA8ODqKurk75HE9lwYeV6NBPdXV1TsJKRUUFqqur+Y89D3i884vHO794vPOLxzu/5nO80ynhYIEtERERFTWGFSIiIipqDCtERERU1BhWiIiIqKgxrBAREVFRY1ghIiKiosawQkREREWNYYWIiIiK2oJfFI6IqBhJkoRQKIRIJAJAXjSrpKQEwWAQ4XC4wK1b/Hi880Oj0eRl0T2GFSKiLAqHw+jt7cXQ0BBCoZCyX5IkrFixAocPH+ZJV/OAxzt/dDodlixZktPjzLBCRJQl4XAYhw8fxvj4OGpqalBZWQmtVgtBEBCJRDA8PIzKyspZzzBL88fjnXuSJCEcDmN4eBiiKMJoNCo9idnGsJJA/8gE9hwWIY4E8dJxAcNd7+NLH1tX6GYRUZHr7e3F+Pg41qxZg/LyctVtkUgEExMT0Ov1/PDMAx7v/KmsrERVVRXGx8fR39+PlStXZv01GFYS2HtkAF+9b/fUlhbPnHiPYYWIUpIkCUNDQ6ipqYkLKkSLXXl5OaqrqzE0NIQVK1ZkfUiIcTOByjJ1hhsenyxQS4hooQiFQgiFQqisrCx0U4gKQq/XY3JyUlWrlS0MKwlU6dVhZSwUwWQ4N+NwRLQ4RMfqtVptgVtCVBjRf/u5qFvJeBhIFEW43W709fUp2/39/di6dSvMZnNGz+X1etHZ2Yn6+nqIoggAaG5uzrRJWVdZFj8Na3h8EoaK0gK0hogWEs48Icq+jMKKKIrYtm0btm7dCoPBoOz3eDxoaGhAZ2cnLBZLWs/l8XjQ1taGjo4OZZ/X64XVakVnZ2cmzcq6Sn38YRkKMqwQEREVQkbDQO3t7XC73ejv71ftb2pqgsFgQEtLS1rPI4oitmzZgu3bt6v2WywW9Pf3w+12Z9KsrKvQaRH75Yh1K0RERIWRUVgxmUwAoAzZzFV7eztMJpOqdyZq8+bNcLlc83r++dJoBBbZEhERFYmMhoEsFgsCgUDC20RRRGNjY1rP09HRgdra2oS3mUwm+Hw+iKKYMMzkS1VZCYaC0wFlOMiwQkREVAhZmQ3U2toKAGkPA3V1dSm9NLGi+/1+fzaaNmexdStD7FkhIiIqiHmHFVEU4XK50NHRkTSAJHrMbL0mBQ8rscNA7FkhIiIqiDmtYBudvtzT04P+/n50dnamHVRmEw0xsUW8+VapV09fHh7P/iI3RERENLs5hRWDwaCsh+L1euFwOOBwONDU1JTVxiUyPj6O8fFxZXtwcBDA9OqR2bJEp+50GhidyMmqfDQtenx5nPODxzu7QqEQJElCJBJJuCiWJEnKda5O9rbQ+Xw+bNiwIa37GgwGZb2vRPJxvN1uN7Zu3YrOzs6M1xlbbGYe71AolNbiiJm898z73EAWiwWNjY0wGo1wuVyw2+3zer7oTKNkBbjbtm3D3XffHbf/qaeeQkVFxbxee6bAhxrMHCV7/a338MT4O1l7fkqu0OvsnGx4vLOjpKQEK1aswPDwMCYmJpLeb2hoKI+tWlj27t0LAPjKV76Cdevk87G98sor2Llzp2rfrl27sGvXLrz22mtYu3ZtyufM5fF+6aWXIIoi9u7dizPOOCNnr7OQBINBPPfcc5icnL10YnR0NO3nzcqJDA0GA5qamuBwOLBp06aczuLZunUr7rjjDmV7cHAQdXV1uPrqq1FdXZ2113n192/jpRMHle1lK1fj2mvPy9rzU7xQKITOzk5YrVbodPGrCFN28XhnVzAYxOHDh1FZWQm9Xh93e/REh1VVVVzlNonoCSBvuukmZYFRt9uNnTt34pvf/KbSe6HX67Fr1y5UVlYmfd/Px/H+1a9+hb/5m7/JWhnEQiZJEvr6+qDX63HllVcm/D8QKzoyko6snXV5w4YN8Hg88Hq9sw4HmUympAW00VqVZH/8srIylJWVxe3X6XRZfcOtLlevVjs6EeEbep5k+29JqfF4Z0c4HIYgCNBoNNBo4ucuRIcioveheI2NjbBYLLj44ouVYzTzOvqz1WqF1+tN2ZuRr+PNHhXZzOOd7ntKJu87GYUVo9EIi8WiWiI/Ktqbks4sHrPZnHRhuejjCz3+F3syQy4KR0TzEYlI6B8NIaQZX3RhxVhRCo1m/r0XJpMprWFJs9mc0fBlodftovlLO6yIophy5dqenh4AyXtEZrJarXA6nUmfJ93zC+VSbFjhOitENB+B0Ql88icvF7oZOdH9NxYsrYzv8c4Xo9Go+nzq7u7G2rVrcdttt2Hnzp3KfqfTqTpZrs/nw5YtW+Dz+QDIn19msxlOpzPpZ5nD4Yg7JUxHR0fCEQWPx6OsP2Y2m7F9+3a0tLSgvb0doijCYrHA5XJxGCkNacd7g8EAu92esFcFkGcFRWtXokRRhNfrjbvvpk2b0N/fn7AXxuPxwOFwpNusnIk98/JwkDMmiIiK0fbt2+F0OpUJHi6XC/X19di1axfuvPNO5XOrra1N9TibzQafz4fm5mY4nU40NTXB6/Wivr4eHo8n4Ws5HA44nU5V8Nm9e3fC+5pMJuXLt8fjwbp169De3g673Y7m5mZ0dXWhoaEhK8dgsctoGKilpUX5Q83sUnO73fD5fOju7lbd32azwev1xs0SMhgMSsKcGX48Hg9MJlNepkDPJnYFWw4DEREVp+hnhs/ng9vthtvthtlsxiOPPIK6ujpoNBr09PTEzTJtaWlBT0+Pqqff6XSioaEBW7ZsSfhZZDablTIFURSVFdwTMZvNcLlcSm9MbW0turu7lc/P+vp6OBwOeL3eohhRKGYZhRWTyQSXy6V0a4miiP7+ftTW1iIQCMSNCVqtVnR1dSU8Z9DMMzXX19crXXjFMo2SK9gSES1cbW1tqKmpUbYTDbUkW2ojunaYz+fLav2ky+VSfU5GPxsLvWL7QjCn2UDJ6k1iNTc3q8YHY1kslqJNk7E1KyMTYYQjErRZKCIjo
pOPsaIUz3zrYlRVVi7KAtti0tTUBJPJlPbUWK/Xi87OTvh8PlWJQrZXUk/3ZL8UL2tTlxeb2J4VQB4KqinnFE8iypxGI6C2QofqyrJFF1aKTbqr4EZXYPf7/Upxrdlsht/vT1qzMh+ckTR3DCtJxNasAAwrREQLQTqhwOfzwWq1wmw2o7u7WzXc4/V6cxJWaO4YVpJYUpogrLBuhYio6CU7XctM0ZlBHR0dcfUsqZbpoMJgX2QSWo2AJWXqEzHxzMtERIvD0qVLASQubi2WiR40jT0rKVSWlWBkPKxsD7FnhYgoL0RRRHt7uxIc2tra0NXVlXAGj8fjQX9/v7J8RltbG3p7exEMBrF+/XpcffXVcY+x2+3Ytm0brFYr7HY76uvr0dfXB4/HowSYlpYWNDY2wmazKZNBvF6vcnu0ByY6ZRpQr63i9/vh9XrR1dUFQF7mI3q72+1WFlON/o7zPRHwoiYtcAMDAxIAaWBgIOvP/cl7/lc6veW/lctjez7I+mvQtImJCWnHjh3SxMREoZtyUuDxzq6xsTFp37590tjYWMLbw+GwFAgEpHA4nOeWLUwul0sCEHfp7u6Ou6/BYEh4XwCS2WxO+hqBQECy2+3K400mk9TU1CR1d3dLHR0dktlslkwmk+R0OpXHmM3mpK8V+3pOpzPudpPJJAUCgbR/t4UiHA5Lx48fl/bu3Zv0/0CsTD6/2bOSAtdaISIqDLvdnnZPQyAQiNsXiUQwODiY9KzMgFyI63K54HK54m4zm80JF4WLXfw0lVTLd0iSlPbzEGtWUooLK1zFloiIKO8YVlKojCmwZc0KERFR/jGspBC71grDChERUf4xrKQQPwzEqctERET5xrCSAmtWiIiICo9hJYXYkxlyGIiIiCj/GFZSYM8KERFR4TGspMB1VoiIiAqPYSWF2KnL7FkhIiLKP4aVFNizQkREVHgMKynEhZWJSUQiXCKZiIgonxhWUohdFE6SgJEJ9q4QERHlE8NKCrE9KwDrVoiIiPKNYSWFJaXauH2sWyEiIsovhpUUSrQalGrUNSpD7FkhIiLKK4aVWehjOlfYs0JERJRfDCuziAsr7FkhIiLKK4aVWbBnhYiIqLAYVmahL2HNChERUSExrMyCPStERESFxbAyi9iwMhQMFaYhRESLnMfjgSAIysXhcKhub2hoUN3u9/vn/Fper1f1XIIgwGg0QhTF+f4alAMMK7NggS0RUX5YLBY0NzcrP8eGFafTCZPJBABwuVzKz3PR2NgIl8sFp9MJp9OJpqYmiKI4rwBEuRO/RCupxPWsMKwQEeWEwWCA0+lEa2srAMBsNqtut1gs6O/vh8Vigd1un/drzXwOt9sNj8czr+ek3GHPyiz0WnWBLWtWiIhyy2KxoKurK26/3++HKIqwWq0FaBUVEntWZhFzLkMOAxHR3EgRCKN9gHYC0Cyy74nltVn9naxWK7xeL/x+v2qox+v1AojvcaHFj2FlFpwNRERZMdqPGvci/ZC9swdYsixrT2exWADI4WTmUE1nZ6fq9pl8Ph+2bNkCn88HADCZTDj//PNxzz334Iwzzsha26gwFlm8zz4W2BIR5Ve05yQaTqK8Xm/SXhWbzQafz4fm5mY4nU7ceOON2LVrF84880zWoiwC7FmZRWzNCqcuExHlntlsVoZ9gOl6lUS9KgDQ0tKCnp4eOJ1OAEAkEsEPfvADbNy4EVu2bEFTU1Ne2k25wZ6VWSTqWZEkKfGdiYgoKywWi2oqcXR4J1lxrd1uV4LKTFu2bIEoisrjaWFiz8osYsNKRALGQmFUlPLQEVEGKmoxYPehqqoKmsVYYJtlVqsVra2tSt1KqnqVmbxeLzo7O9Hd3Y3e3l4cPHgQANDf35/1NlL+8BN3FrFhBQCGgpMMK0SUGUEDqWIpsKR68c0GyoFoKOns7ITdbk9ZrwLIIcXhcCgziC666CKce+65+OCDD/Bf//Vf+Wo25Qg/cWdRliSsLK/Of1uIiE4mZrMZPp9PGQ6Krm4by+fzwWq1wmw2o7u7G2azGZFIBIODg3j55ZcZVhYBxvtZlGiAshL1YeKMICKi3LNYLPD7/UqhbbJ6lba2NgBAR0dHXO8Lz/WzODCspKEqZmU4rrVCRJR70XCybds2APL5fBJZunQpACQ8r8/MGUW0cHEYKA2VZSXoHZ5QtofHOX2ZiCjXonUrPp8PZrMZBoMh4f3sdju2bdsGq9UKu92O+vp69Pb2oqOjAwcOHAAgT21ubGyEzWZTntftdivPES3gbWtrUy31P99zEFF2MKykobJMfZiG2LNCRJQXTU1N8Hg82Lx5c9L7GAwG7N+/Hy0tLWhvb4coijCZTLjwwguVwLJt2zZ4vV7U19fDYrEoBbmxoidRjNq0aVPSkET5w7CShsqYKlvWrBAR5UdHR0da9zMYDHC5XHC5XACgFNhWV1ejsbExblE4i8XCNbMWENaspCG2Z4U1K0RERPnDsJKGuLDCnhUiIqK8YVhJQ2XMbKBB9qwQERHlDcNKGtizQkREVDgMK2mIr1nh1GUiIqJ8YVhJQ+wwEHtWiIiI8odhJQ1cZ4WIiKhwGFbSwHVWiIiICodhJQ0ssCUiIiochpU0JFoUjisfEhER5QfDShpiC2wnIxLGJyMFag0REdHJhWElDbE9KwAwyOnLREREecGwkoZEYYXnByIiIsoPhpU0lJVoUKpVHyoW2RIREeUHw0qaqmIXhmPPChERUV4wrKQptsh2iD0rREREecGwkqZE05eJiIgo9xhW0sSF4YiIiAqDYSVNcTUrDCtERER5wbCSJp7MkIiIqDAYVtIUV2DLReGIiLLK7XZDEAQIgoCGhoa420VRRH19vXIfn8+X9HmWLl2KPXv25LrJKRmNRqWt0Yvb7S5omxYqhpU0VZbpVNscBiIiyq5NmzbBbrcDAHw+X1wYcbvd8Pv9AACn0wmz2Zzwebq7uyGKIg4cOJDT9s5m+/btyu9jt9vhdDqxadOmgrZpoWJYSRPXWSEiyi2DwQCbzQYAMJlMaGtrU93e1tYGk8kEAEoISMTlcuHdd9/F9ddfn7vGpqGpqUn5fWw2G5qbm2EwGArapoWKYSVNsWGF66wQUbEJTgbxwJ4HcGP7jbjqvqtwY/uNeGDPAwhOBgvdtIw1NTXB4/Eo26IowufzoampKa3HR0MNLQ4MK2niOitEVMwefftRrPrRKtyy4xbseGsHnj34LHa8tQO37LgFq360Co+9/Vihm5gRq9UKv9+vDPu0t7cDADZs2FDIZlGBMKykieusEFGxevTtR3HDwzdADIoAgIgUUV2LQRHXP3w9Hn370YK1MVONjY0wGAxK70pnZyeampqSDqM4HA5VIatWq8XOnTsT3tfj8aC+vh719fWw2WwQRREOh0MpiI0GpUJraWlBQ0OD0qbZinPdbrdyf0EQYDQa4XA4UF9fD6/Xm7XHFELGYcXtdqOlpQVWqxUNDQ1oaWnJ+EVtNhscDodSPCWKIrxeL2w2W9Lq7kKLnQ3EsEJExSA4GcStO24FAEiQEt4nuv/WHbcuqCGhTZs2
KXUrHo8HmzdvTnpfh8MBp9MJp9OJ5uZmAEj6eWIymWCxWJTnXbduHdrb22G329Hc3Iyurq6Es5HyRRRFNDQ0oLW1FSaTSfl9HA6HUgMTK/q5Wltbi+bmZjidTlgsFqUoOdGxmMtjCqVk9rtMa2lpgcPhUAqbRFGEzWaD0WjE/v370y4cEkURHo9HlRINBgM6OjqSVncXWlXsbCAOAxFREejY24FAMDDr/SRICAQD8Ozz4MsXfDkPLZs/m80Gt9ut9K5YLBZ0dXUlvK/ZbFY+P0RRRGtra9LnNZvNcLlccDgccLvdqK2tRXd3t/IZVl9fD4fDAa/Xq4SafNqyZQt8Ph+6u7tVn4lut1tp88wCY7/fD4/HA5fLFVd47PF4EgacuTymkNLuWYmm2plFS9GAEQ0t6TKbzejs7FRScEdHBwKBQEH+UaQrtmdlIhzB+GS4QK0hIpLteHsHNEJ6b+UaQYPfvfW7HLcoe6KfCdu2bYPZbM7ZTBqXy6V67sbGRgAoyFBQNEQ0NzfHfXm32+2wWCxxIxq1tbUA5CnbsZqammCxWOI+X+fymEJKu2dl9+7dCauwDQYD7HY73G43RFFM+x9TsR2I2cTWrADyKrZlldoCtIaISNY32qfUpswmIkXQP9qf4xZlV3RWkNPpzNlrRMNJMYjWiSQb8rLZbPB6vfD7/UrngcFggNPpREtLC9rb29HY2Aiz2YwNGzbAYrGgs7Mz7nnm8phCSrtnxe12w2q1JrwtOraXrHtuMYidugxwKIiICm9pxdKMelZqK2pz3KLscjqdSi1JrhTT2ic9PT0Akk+9jvaIxPb6NDc3o6enRxnScbvdSplGspGPuTymUNIOK6mSpyjKFejRg7gYlZVoUKIRVPtYZEtEhXbD2Tdk1LPyuXM+l+MWZZfJZILL5Sp0M3Iq+hkKyPUyQPIhqOj+2DATrb1xOp3o7OxEIBBAT08PXC4XvF5vwoLhuTymUNIeBkrVJRRNgpkUx/r9ftW0qJ6eHmzdurWoEu5MgiCgSl+CwOj0OYF4MkMiKjTbuTZ8+8lvQwyKSWcDAYAAAQa9AU3r01tUjfIjOgs2+jkaLY9wuVwJQ1p0dtTMsOLz+eBwONDT06MaLjOZTEqvicPhUJVqzOUxhZSVdVZiK5NnE50SZbfblcvmzZvR0NCgSpjFhtOXiajY6Ev0uP+G+wHIgSSR6P77b7gf+hJ93tpGsxNFEf3903VEJpMJTU1NcLvdceuctLa2wufzJa3faW1tTdgjEw1CiULHXB5TCBlNXU6kpaUFJpMpo+Knjo6OuH3RaWdbtmxJeHvU+Pg4xsfHle3BwUEAQCgUQiiU3TMhR58ver2kVH24BkaCWX/Nk1ns8abc4vHOrlAoBEmSEIlEEInED8tIkqRcJ7p9Pq478zo8sukR3PbobQgEA9AIGkSkiHJt0Btw7/X34rozr8v6a2eTKIrKSrUPP/xw3Jfgmbe7XC7ceeedym3RolMAGBgYAAC89tpryhIZM9dWifbsR+ssf/GLXyi3zzxZ4lNPPYVIJJLRl/GZPB6PEjja29vx3nvvxd0n+loz/y4ulws+nw9WqxU33ngjTCYTfD4fnn76adx444343ve+p7r/zJ/r6+uxZcsW1NfXo6+vD08//TR8Ph/uvPPOeT8mlZn/vkOhELTa2SefZPLeI0jRV5gDn8+HjRs3oru7OyvnYYjOIU/VpLvuugt333133P4HH3wQFRUV825DKj95Q4ueoelvLk3rwrhixZwPHxEtIiUlJVixYgXq6upQWlpakDYEJ4PY+d5OPP7e4wgEAzDqjbjujOtw/RnXL4gelfvuuw+33367sv3KK69g7dq1yvbOnTtx6623Ktu7du3ChRdeCAC46qqrsGfPnqTPfeGFF2LXrl0AgB//+Me46667VLevXbsWu3btUr1eotfJxNq1a5XglMpVV12F3/0ufkr5D3/4Qzz77LPYs2cPLrzwQtx6662q3z/qwIEDuOqqq3Dffffhvvvuw65duzAwMICamhqsXbsWt99+e9xJHefymNlMTEzg8OHDOHbsGCYnZx95GB0dxU033YSBgQFUV1envO+8wkp9fX1WF3KLLkQTuxDOTIl6Vurq6tDb2zvrL5upUCiEzs5OWK1W6HQ6bHnAh13v9Cq3f896JhxXrsvqa57MYo835RaPd3YFg0EcPnwYa9euhV4fHwwkScLQ0BCqqqogCImHayh7eLzzS5Ik9PX14cMPP8SaNWsS/h+INTg4iGXLlqUVVuY8DGS1WuFyuTIOKg6HQ5nfHSs6NjZz/C5WWVkZysrK4vbrdLqcveFGn7u6XP1taTQU4Zt8DuTyb0nxeLyzIxwOQxAEaDQaaDTx5YDR7vTofSi3eLzza+bxTvc9JZP3nTn9BR0OB1paWua0qFt7e/usU7KKaYGemWILbDkbiIiIKPcyDiutra2w2WxxQSV2KnIydrs9aQFtZ2cnTCZT0VQfx6rimZeJiIjyLqOw4vF4YDabE/ao+Hw+VZFt9EzKsTZs2JDwTI7Rkxvmcknl+Ypdcp89K0RERLmXds2Kz+eDy+VSzoIZFV0Xpa2tTXVCpOj5C2LP6NjU1ASHwwGHw6Gqd9m4cSPsdnvC8w8Vi9gl94fHOeWTiIgo19IOKxs3bkzaWwLEL/1rtVrR1dWVsP7E5XKhtbUVbW1tyoI4W7duLeqgAgCVenUxEIeBiIiIci/tsBIIBDJ64ubm5pQnnsrlSalyJXYYiCcyJCIiyj3O58pA/DAQwwoREVGuMaxkgAW2RERE+cewkoHYdVbGJyOYmCze82wQUf7NY1FwIkqCYSUDseusAMAIh4KICFBWSQ2HwwVuCVFhRP/t52LFYIaVDMT2rAAcCiIiWXSJ8eHh4UI3hagggsEgSkpKcnL6DoaVDJTrtNBq1CfEGuJaK0QE+ZwoVVVVGBgYwNjYWKGbQ5RXY2NjGBwczNmJI+d8IsOTkSAIqCwrwcDYdEDh9GUiilq2bBnGxsZw6NAhVFdXo6qqClqtFoIgIBKJYGJiAsFgkCfWywMe79yTJAnhcBhDQ0MYGBjA4OAgzjnnnJy8FsNKhuLCCmtWiGiKVqtFXV0dent7MTQ0pKzwDchv7GNjYygvL8/JN09S4/HOH51Oh5qaGrzzzjs5C4YMKxniWitElIpWq8Xy5ctx6qmnIhQKIRKRZwyGQiE899xzuPLKK3Mypk9qPN75odFooNPpMDk5mdOZcAwrGeJaK0SUDkEQUFpaqmxrtVpMTk5Cr9fzwzMPeLwXFw7kZSh2RhB7VoiIiHKLYSWZSAQYPIKlw29BOPiCspvnByIiIsovDgMl8loHsPMb0IXHcTmAyNCFwBmfAMCaFSIionxjz0oiFbVAeFzZFMSDys+xPSuDQa6zQkRElEsMK4kY16o2haAIjMlTECvL1IVaHAYiIiLKLYaVRGrqACHm0Ez1rrDAloiIKL8YVhIpKQWqT1PvCxwAwJoVIiKifGNYSSZmKEgJK5wNRERElFcMK8kYT1dvT4WV2GGgIfasEBER5RTDSjJJela4zgo
REVF+MawkY1yn3g7IBbaxNStjoTAmw5F8tYqIiOikw7CSTGzPingIiITjpi4DwMh4OD9tIiIiOgkxrCRjiKlZiYSAwSNxNSsAF4YjIiLKJYaVZJYsg6Rbot4XOIAKnRaCoN7N6ctERES5w7CSjCAknBGk0QioLOVaK0RERPnCsJKCFDsUlGxhOM4IIiIiyhmGlRTiwkqSJfe51goREVHuMKykkqRnhWutEBER5Q/DSgrJhoEq9TFnXh7nbCAiIqJcYVhJQYpda2XkQ2B8mOcHIiIiyiOGlVRq6uL3iQfjhoFYs0JERJQ7DCup6MoxpjOq9wUOxBfYsmeFiIgoZxhWZjFaeop6RyC+Z4XDQERERLnDsDKL+LByIH6dFQ4DERER5QzDyixGymYPK6xZISIiyh2GlVmMlp6q3hE4EHfm5WGeyJCIiChnGFZmEdezIh5EZZn6sHEYiIiIKHcYVio56icAACAASURBVGYxWhbTszIZhDESUO1igS0REVHuMKzMIlhSA6lEr9pnGD+i2h6ZCCMckfLZLCIiopMGw8psBE3c4nBVY+/H3W1kgr0rREREucCwkobYcwSVjxyOuw8XhiMiIsoNhpU0xJ4jqHTwUNx9WLdCRESUGwwr6YjpWdGIB7GkVKvaxzMvExER5QbDShpih4HkheHUa61wGIiIiCg3GFbSEBdWho7CWBZR7eJaK0RERLnBsJIOw5q4XaaSPtU2a1aIiIhyg2ElHWVVQMUy1a512g9V2+xZISIiyg2GlXTFzAhajeOqbdasEBER5QbDSrpiwsoqSR1W2LNCRESUGwwr6YoJK6eEj6m2h3jmZSIiopxgWEmXUT0jaNmE+vxA7FkhIiLKDYaVdMX0rNSMHwEwffJC1qwQERHlBsNKumKX3A+PohZDyjZ7VoiIiHKDYSVd1acBmhLVrjXCCeVnrrNCRESUGwwr6dJo4xaHU4UV9qwQERHlBMNKJmLXWmHPChERUc4xrGQi5hxBqp6ViUlEIlLsI4iIiGieGFYyEdOzMjOsSBIwGgrnuUFERESLH8NKJmLDiuaEaptDQURERNnHsJKJmLCyEn3QYTqg9I2M57lBREREix/DSiZiwopWkLBK6FW23z42BCIiIsouhpVMlBsAfY1qV53wofLz3iOD+W4RERHRosewkqkURbb7GFaIiIiyjmElUynCyt4jA5AkTl8mIiLKJoaVTMWElTrhuPLzYHAS7wfG8twgIiKixY1hJVMxYWWd9kPV9r6jHAoiIiLKJoaVTMWEldMFdVhhkS0REVF2MaxkKmbJ/SXSMKoxrGzvOzKQ7xYREREtagwrmaqpAwT1YZs5fZkzgoiIiLKrJNMHuN1u9PT0wOfzob+/HxaLBU6nM+MX9nq96OzsRH19PURRBAA0Nzdn/Dx5V1IKVK8GBg4pu9YIJ7BXWgcAODIQRGBkAsYlpYVqIRER0aKSUVhpaWmBw+GA3W4HAIiiCJvNBqPRiP3798NgMKT1PB6PB21tbejo6FD2eb1eWK1WdHZ2ZtKkwjCergorJu2HQGT65r1HBnH5mcsK0DAiIqLFJ+1hII/Hg82bN8NkMin7DAYDOjo6lNCSDlEUsWXLFmzfvl2132KxoL+/H263O90mFU5Mke35SwKq7X1HWbdCRESULWmHld27d8NsNsftNxgMsNvt8Hq9ynBOKu3t7TCZTAl7YTZv3gyXy5VukwrHqC6yrS/pVW1zRhAREVH2pB1W3G43rFZrwtsaGhoAAF1dXbM+T0dHB2praxPeZjKZ4PP50go9BWVcp9pcHjmu2mZYISIiyp60w0pjY2PS26LhIlkImamrq0s1lDRTdL/f70+3WYURMwxUNXYEmhlFK/4PhzE2Ec5zo4iIiBantMNKZ2dn0uLXnp4eAEg4TBRLFMVZC3EXWlgRpEmsEvqU7YgEvHWMvStERETZkJV1VtxutzJDaD6iIaa/v3/ez5VTFUuB0krVrkuMQ6ptDgURERFlx7zDSktLC0wm05zWWlmwBCGud6WhSj0DiOcIIiIiyo6MF4Wbyefzwe12o7u7O+01VlJJp/ZlfHwc4+PjyvbgoBwKQqEQQqHQvNswU/T5Ej2vtqYOmuNvKNtnl6lnBL3xgZj19ix2qY43ZR+Pd37xeOcXj3d+zeV4Z3LfeYUVm82Gp59+OmnBbC5s27YNd999d9z+p556ChUVFTl5zUS1OucGIjhjxnZ531sApmdLvfnBAB57/AlohZw0aVFbEAsDLiI83vnF451fPN75lcnxHh0dTfu+cw4rVqsVLpcrraLamUwmU9IC2mitSqrws3XrVtxxxx3K9uDgIOrq6nD11Vejuro6o7bMJhQKobOzE1arFTqdTnWbZvcR4Kn/UbbPqBoH+mY8VhJwzoYrceap6toWSi7V8abs4/HOLx7v/OLxzq+5HO/oyEg65hRWHA4HWlpaYLFYMn6s2WxOuo5KNMSkCkBlZWUoKyuL26/T6XL2DzLhcy+rV22WDh7Cyho9jg4ElX3vnBjF+tOMOWnTYpbLvyXF4/HOLx7v/OLxzq9Mjncmf5eMC2xbW1ths9nigorf74fX65318VarNWnPSk9Pz5wCUEHEFNhitBfmFerst/cIl90nIiKar4zCisfjgdlsThgofD6favhGFMWE4WXTpk3o7+9PGFg8Hg8cDkcmTSocwxoA6oKUq5YcUm1zRhAREdH8pT0M5PP54HK5YLPZVCcbjA7ptLW1obu7W9lvs9ng9XrhcrlUa7AYDAZs374dLS0tqrMuezwemEwmNDU1zesXyhudHlh5AXB0j7LrkuCLAD6jbO89MghJkiAIrLIlIiKaq7TDysaNG5P2lgDxRbFWqxVdXV0Jl+lvamqCwWBAS0sL6uvrlcCz4Kq211+vCiunHfNCg+sQmeqwEkdDODIQxGmG8kK1kIiIaMFLO6wEAoGMnri5uRnNzc1Jb7dYLAunPiWZ9TcAT/+9sqkd/RBX6d/FM8GzlX37jgwyrBAREc1DVpbbP2ktrQdWnK/atbmiW7XNIlsiIqL5YViZr/U3qDYvm/ij6gzMPEcQERHR/DCszFdMWKma7EOj8LayvY9hhYiIaF4YVuZr2RnA8vNUu67V/kn5+QNxDOLoRL5bRUREtGgwrGRDTO/KNdrdEGYMBbF3hYiIaO4YVrLhXHVYWS4E0CC8o2xzcTgiIqK5Y1jJhmVnAqeuV+26bsZQEItsiYiI5o5hJVvihoJeVoaCOH2ZiIho7hhWsiVmKGiFEIBZeBcA0PPhCIKhcCFaRUREtOAxrGTLKWcDp5yj2nWt9mUAQDgi4e1jQ4VoFRER0YLHsJJNcUNBf5oxFMS6FSIiorlgWMmmmKGgVUI/LhLeAwDsO8q6FSIiorlgWMmmUz8CLDtbtSu6QBx7VoiIiOaGYSXb1l+v2rxG+zIACW8dHUI4IhWmTURERAsYw0q2xQwFnSb04aNCD8ZCYezvHSlQo4iIiBYuhpVsO3U9sPRM1a7poSDWrRAREWWKYSXbBCGud0UOKxLPEURERDQHDCu5EFO3slroxQWCn+
cIIiIimgOGlVxYfh5QW6/ada32T9h7ZBCSxCJbIiKiTDCs5EKCoaDrNH9C/8g4jg0GC9QoIiKihYlhJVdiVrOt03yI84X92HOYRbZERESZYFjJlRXnA8Z1ql3Xav+Ex18/WqAGERERLUwMK7mSaFaQ5k94au9RDAVDBWoUERHRwsOwkksxQ0Gna07gvPBbePKNYwVqEBER0cLDsJJLKy8EDKerdv2t7gHs8B0uUIOIiIgWHoaVXBIE4OItql0f1fix5uB/4ejAWIEaRUREtLAwrOTaxQ5EYpbfby55GL9/eV+BGkRERLSwMKzkWkkpNNfdo9plFIaxfLeTC8QRERGlgWElH0xXoe/061S7rhl/Cv5XnytMe4iIiBYQhpU8MX6uFaPQK9saQUJ5ZwsQCRewVURERMWPYSVPNIbVeHnN11T7Vo2+iXDXfYVpEBER0QLBsJJHq6+5A+9GTlPti3jvBkb6CtQiIiKi4sewkkdnrFyK+wzfUO3TTQwAT99VmAYREREtAAwreVZ/8bV4NPwx9U7fr4H3uwrTICIioiLHsJJnn/3oKvy/8JcxLOnVNzz+XRbbEhERJcCwkmfLKstw9pln4ceTn1ffcPRVoPu+grSJiIiomDGsFMDnzKtxb/jP8U5MsS2e/ntgpLcwjSIiIipSDCsFcPX65dCX6fHDyVvVNwRFwHtXIZpERERUtBhWCkCv0+Ka81bgj5FzsTN8mfrGVx4A3nu6MA0jIiIqQgwrBfK5i+QhoH8MfSm+2LbtZs4OIiIimsKwUiCXmpZiZY0eJ2DEv042qW8MjQC/bQJOvFmYxhERERURhpUC0WgEXP9RuXflV+E/x2PhS9V3GAsAD3wOCBwsQOuIiIiKB8NKAX3eLIcVCRrcEfor7ApfqL7D0FHggRuA4RMFaB0REVFxYFgpoLOWV+HcVdUAgBBK8PXQt7G//Dz1nfr9wAOfB8bEArSQiIio8BhWCixaaAsAY9DjiyO3I3LKevWdjr8OPPQFYGI0z60jIiIqPIaVAvvsR1dBI0xvH5soxxMf/Q/AuFZ9x0N/BDq+AoRDeW0fERFRoTGsFNipVXpcceYpqn3//AcREzf9Dqhcob7zu08BO74ORCJ5bCEREVFhMawUgZsuWaPaPtg3iofe1QA3PwLoa9R3fr0D+H0zIEl5bCEREVHhMKwUgavXL8dFawyqfT95+l0M1ZwFfMkD6CrUD9i9HfDcBgQH8thKIiKiwmBYKQKCIOAH135Eta9vZALu5/xA3cXA5gcAjU79oL2PAL+4givdEhHRosewUiQ2rK2F5SPLVft++fx+nBgMAmdYgM+7AQjqB4kHgV99Cnj+X1jHQkREixbDShH5/jVnq2YGjYXC+Ffvu/LGeZ8HNv8mvoYlMgk8fTfwm88BQ8fy11giIqI8YVgpImecWoXNG+pU+9q7DuO9E0Pyxkc+Dfzli0DdpfEP9u8Cfv5x4N3O3DeUiIgojxhWisx3LGdBr5v+s4QjEpxPvj19B0MdcOvjwJ+1AELMn2+0Vz4B4pM/ACbH89RiIiKi3GJYKTLLq/X42uUm1b7OfcfRdaB/eoe2BPjED4CvPAZUrYp/kpd+BvynFTjyao5bS0RElHsMK0XI8Wcm1C4pVe37pyfehBS7tsray4GvvwicfV38kxzdA7j/DOj4KtDXk8PWEhER5RbDShGq0uvwrU+eodrnOyTif/YmKKCtqAW+8Fvg2nsAbVn87XsfAX52MfD4d4Gh4zlqMRERUe4wrBSpmy45HacvVS8G1/rk2wiFE0xRFgTg4i3AlmeAZWfH3x6ZBHb/EvjJRcAz/wgEB3PUaiIiouxjWClSpSUafO9qdfDw946gbffh5A9acR7geBaw3AWU1cTfHhoBnmsFfvJR4KWfswiXiIgWBIaVInbd+Stx4Wp16Pg377sYGZ9M/iBdOXD57cC3XwUu+1bioaHRPuDJ7wP/3gi88G/AwPtZbjkREVH2MKwUMY1GwPevUS/D3zs8ju3P+2d/cEUtcPX/Bb7lAy76cvw0ZwAYOAR4fwj863nAfZ8GfL8GxsQstZ6IiCg7GFaK3Mfql+ITZ5+i2ud+zo8Ph9IcwqlZDVz/M+Drf0w8awgAIAEHngce/WvgnrOAtpuBNx/jMBFRCl6vFz6fr9DNoAzwb7ZwMawsAC3XnANhxjL8oxNh/MPj++KnMqdy6jnAFx8EbnsKWHNZ8vuFx4E3HwXavgzccybw6LcA/7NAOMXQE9FJxmazwWq1oquLJxJdSDo6OtDQ0ICWlpZCN4UyxLCyAJyzohpN5tWqfTtfPYKHXk5RbJvMmkuArz4hr4J70c2JC3GjggOA737g158FfnQ28Nh3gJ7/ZXChk5rNZoPH44HL5YLdbk94H5/PB6vVCqPRCKPRiIaGBtxzzz0pn9fv98Nms6G+vh6CIKChoQEOhwOiWDxDsw0NDXC73Vl/XofDAUEQkl5aW1uTPtbtdquOmdfrTXrf6N+stbU15XNS8SkpdAMoPXdcfRaefOMYhmYU19712F5csLoG552WInAkIgjygnJrL5fXZ3n3f4DX2oF3/geIhBI/ZrQX6L5XvlQsBc75NHDuDcDaKwCtbh6/GdHC4XA4Zg0qra2taGlpgclkUu7j9Xrxgx/8AMuXL8frr7+OU06JGdp1u+FwOGAwGLBp0ybU19dj9+7dcLvdcLvd6O7uhtlszvnvl0prayt8Ph96erK/yGRXVxcMBgO2bt2a8PampqaE+x0OB9xuN8xmM5qbm+HxeGC1WtHR0YHrr78+4WNcLhcMBoPyN0r23FRcGFYWiJU15fhn2wX4y99Mj7dOTEbwV7/14bG/vhw15XMMDDo9sP56+TLaD+zbCbzeARx8MfljRvvkHhff/UB5LXDOdcC6K4FT1wPLzgRKEsxAIlrgfD4f3G437HZ7yh6VlpYWNDU1oaOjQ3Xbk08+iWuuuQaf+tSnVHUTfr8fDocDFosFnZ3qE5GKooh169Zh48aNCAQC2f+l0uR2u3M+dGIymdDc3Jz2/b1eL9xuN5qbm+F0OgEATqcTVqsVNpsNJ06cSPpYp9MJr9eLLVu2wGKxwGAwzLv9lFscBlpA/vy8lfiLy9ep9h3qH8WdHXsyq19JpqIWaPyqPEz0ndeBjT8EVlyQ+jFj/cArDwCPbAF+8XHgn1YBP7sE6LgVePafgTf/G+j3A5EEi9kRLSAtLS0wGAzKB2MibW1tAJDwPhs3bsQtt9yCV155RRVWPB4PAPkbf6zo64mimHJ4I1d8Ph8EQVDCVDGJ9ozEHutoSEz1dwKA7du3QxRFbNu2LWdtpOxhz8oC8/1rzsErhwLwHZoex35q33H88vn92HKlKcUjM2RYA1xxh3zp65GLbvfuAI7OcnLEyCTw4VvyZe/vpvfrKuSelxXnAysvAFZcCCxfL68LQ1Tk/H4/vF4vmpubU34L9/vlZQWS1ZlceOGFAORhj0IP66TDZDLB5XKhsbERJpMJRqOx0E0CIIcon8+XNOA1NTXhRz/6Ea644oqkz2E2m2E2m+F2u2cNNlR47FlZY
HRaDX56kxnGCvWwz/978i31mZmzaWm9vNCc41ngW68C1r8HVmX4RhsaBT7okmte/vt24JefBP7pNOBnlwKP2IE//BTCgedRGhoEJPbCUHGJ9mpYrdaU93M6nXC5XBkFkeiQksPhiLtNFEU4nU6YTKaC9GwYDAbY7XaYzea8DZVEh3dS9SRFZ2Ft2rQp4e2bN28GgFnrazZv3gxRFDmdeQFgz8oCtMpQjn/7wkW49d6XER39CUckfPPBV/D4ty7H0soc1ozUrgM+/m35Ejgo97jsfw448SYwkOHsJCkMfPimfHmtDSUArgEg7bsdqFoJVK8CqlcCVdHrqX3GtfI1UZ5Ea0lmCwwzi2oTee+99wAAjY2Nyj6DwYDu7m7YbDYYjUalwLanpwdutxsWiyWu/mUxig45xUpUzNzd3Q0ASQOUyST3Mh8/nvrkrdG/50Lp6TqZZRxWRFFMa+w2GZvNhtraWjgcDpjNZoiiiK6uLrhcLmzdupX/YNL0Z2edgr/+xBn4yTPvKfuODQbxnbZXcd9XL4ZWE/+fPuuMpwOX/bV8AeSpzifeAk7sm7q8CRzfK9e1ZECIhOTVdQcOJb9TTR1w+seB0y+Tr5fWAwne6IiyQRTFrPQs/PrXv4bJZIp7nzObzejo6IDNZlNNDTabzcrslZOBxWJRepL6+/vhcDiUHqeZgcXv96c8JtGwcuxYgjPVzxB9jlzMcKLsSjustLS0wO/3Y8OGDfB6vXPukhRFER6PR/Uf0mAwoKOjg0ElQ9+2nIXuQwG8+F6fsu/5d3vx78+8i+9Yzsp/g/Q18jouay6Z3idJwPAJ4PgbwLHXgWOvAUdfA/reAzCPouCBw8BrD8sXAKhcPh1cTv84cMo5gIajnJQd/f39qK2tnddzXHPNNRgZGcHPfvazuNtaWlrQ2tqK5uZmOBwOmEwm+Hw+bNu2DfX19ejo6FjUU2wTBRKDwYDOzk5lEbdUPVax+vvT+4I0378p5U/aYWVmL0q04n0uzGYzWlpalDFCznOfO61GwI+/cBGu+8nzOD44vTT+j59+F+Y1Rlx51ikpHp0nggBULZcvZ2yc3j8+LPe6HHsNOLoHOPYapBNvQghPzO11ho/LBb3Rot5yI7DsLKDWBBjXycNXxnXydkUte2EoIyaTSZm1Mxc2mw1PP/00brnlFmzcuFF1m9vtRmtra1wgifa2OBwO2Gw29PT0KD0Gi02qILJ161bYbDbVl2STyZSypiVa6LxixYqUrxsNNUuXLs20yZRnBalZsVgsRTcNbqFaVlmGn95kxhfcLyEckXsqJAn4TturePxbl2NlTZHOtimrjOuFmRwfwzOPPoyNGz6CktETwNBRYPDI9HX059Do7M8/FgAO/0m+xL12tVz3UrtOXuCurEreV1YN6Keuy6qmfq6Se21Kl2Tvd6cFJxoS5jIcZLVa4fV68d3vfjfh7BSXy5XyS5vT6VRmrCSa/bIYpDqu0R53n8+nfG40NDQo+xL1yEfDyvLly1O+7swvzUlFwkBoTH4PmMuXnFAQEA8C/fvlaykir0VVUi6vc1UyddGVT++PTMrD6uODQHBw6ucB+To4KO+fGAHCIfkSibmO/hwJyyexFYSpaw2AGT9HL1JkxiU8/XMkZn/NasC+K/NjkAUssF0ENqytRfOnzsa237+l7OsfmcBX792N33ztEizLZcFtNmlKECythXRaA6BLsshdJAL0vi0vWnfgRfl6OHURXZzxQblH59hr6T+mvFb+j1pTJ18b6tTbS04BNNrM2kFFLTgZRMfeDux4ewfei8i1YXf86A78xw//A/oS/ayPF0URGzduVKbYfvWrX8UTTzwRdz+/35/ywzL6IZ7u0MZCE12FNhAIJAws0WngM49RtEC5ra1tOqyMD8lF/4ED6PjlvwAAPq/bBc2T70+FglJAWyavuF1SBmhL0fngQwAAi34f8ORWeWHMsX71dXAAgCQ/tnKql7hy6lK1Aqg8FahcIffmDh2RQ0m/HwgckH8e/ADzGvIuJgVcaoJhZZGwX2nC7gMBeN+c/uB+69gQvuh+Cb/92iU4tXr2N9cFQaMBTv2IfNnwNbkbqd8/I7z8IXVh7lyNTb15pQo4pZXxvTIze21Kl6i/SZXop75J6dX7taVT+8vkN8ipN1b5Nv6XzYdH334Ut+64FYFgABpBg0hlBNAD9z54L3ZU7sD9N9yPz5z9maSPF0URDQ0N8Pv96OzshMViQSiU+FQWFosFHo8nae9C9Nv/hg0bsvPL5ZskyR/4o31yb0C0F6F0iXw9NaWxv78//vefnIDrpz8GAJhX6YCeZ4DgAMyRAMxnroTn/p/DecZuORiMyrV7/kAE3t3DaL6sFGv7ngX6nk3atPbfD8Ji0sLQ9aPZf4/w+OyF/4tdARf3LMg7X3SBpaienh5s3br1pKl4zwVBEPAj24X4zE9fwKH+6WGSd08MY7P7JTy45ZLiHRKaD0GQZwItrQfMt8j7xEPAsTeAwH75m01g6puOeEjuXs2ViWH5MpS7l4Cgkd/s9TXyNznlYpjxc628DQCT49OX8DgwGVS2NRNjOP/wfmieemEqBAnq7uLoRVcuLxJoOF2eAVa5fFHX/Dz69qO44eEblO1IdN2fBgAvAoG9AVwfvB47vrADnz37s/KJPUf75GNSUgb/oaNouORjAJDwnD733HMPNm/eLPcUhEPYevs34PF4YLv+OnQ+9FNgckI+7hoNxMER2D7/JRhqqmC/4Qp5hp2glV9LkgBIgCTB89iTcP26HY6bbWj6tHVqf0QevpgYlkOCchmWa8YmRoDQCFBaNb00QNVKubegelXyb9GSBIz0yoXzIyeA4Q/l3s3RXrknYrRP6ZnwvHQQrj/0wWkpg3ll4p5HqxiCG4Dzix+F6wunyz0gEyNAcACePQNwd4zBbtbB9NQtqsc5L52E9YFROH7xAlyfmW6rrUN+/9t6Reoe5ZbOIMQg0PLxBdLzXAwKuAZW3sOK3++Hz+dTFVT5fD40NDSgu7ubgWUeaip0+O3XLsEXt7+E9wNjyv79vSPY5PojHvzapairrShgC/PEsEa+xApPAoPvy8Glf788oyg6/js+NPXzkDw2HN2WwvlvfypSRK7ZCY3K9TvzoAVgAoDeDB9YopeHv4ynTweYJafKH4JjIhAU5euxwPTPQVE+nsCMMfQk4+glZVPfuiuA0gpAt2TqumJ6v0YrB8/IpDwuH/05HJrehiR3+Wt0cs+UVhezXSJfa0rk59PoEIKEx5/8DjZJJQhBQjTaLoWA5RYBP+6eQNV/C3joW+VY9fDNkPS1EGZMzfcdDaPBPQIAsDeWo+sfrOiaeg1Jo8MJcRh/9+SH0D7XiubLSoDwBMwAXJ/Ww/Hff4Cx3oxN63Wor9Vg95EwPPsmYdADHbYKGB7886R/ki3OQYhBwPvcHxFoqYJBn4Uwqa8BqlbB3S0X74tT5yby/uZf4H7z3wEAFlMJTMbks+62eOR2tXgldN6cuO6rab0OdvMk3C8PwfvuXljWlcCgF+DdPwnf0QjMKzWqMBJlMZXAbtbB7QvB
L0ZgNZWgbW8IvqMRdN5ckfIY+AMRtP5hAhaTFhZTnj4Gl5wiF/nr9HIdy+TURfXz2NTJZIXpGjp9zdTPNert0iXy/xWNTv63PPPftSb6b71EDpeqmpQIlDAbrUsRBPn/gPIlZcbPmhlfXNIY/swVQZrDSWUaGhrQ2NiY1WIvm80GALMufjQ+Po7x8emZL4ODg6irq0Nvby+qq6uz1h4ACIVC6OzshNVqhS5ZDUUROjoQxM2/6sLBfnUh6qoaPX59WyNOL9LAUnTHW5KmekqOQRh8Hxh4H8Lg+xAG3gcG34cw8AEw+IG8LgydFKJhxLxSg6dvWRL3gdjSGUTrH2af0dZhK0fTevW/cX8gghZvEL6jYfgDEswrNWhcqYXTqp81fMx83c6bK7L2ASwGJRidybsK7WZdwiAR1friOLa9MI7tn4n/fWN5/ZNwvjiOriPyF4TGVVrY1utgbyhN+Th39wScL44rx8y9aTUuOv8cRKrrcPDEANactgLayCQQmQAmJ+A/2ocNf+NF7ZIS7P4/G2BYUgroDUBFLSSlZ9IIqWKpfF1eC+jKIUz1JgnDx4GR4/L18Alg+DiEkRNyb9KSUyAZ1wLGdZCM6yAZ10IyrpML+suqUv4eikh4OsgvIHN5/x4cHMSyZcswMDAw6+d30YSV6CnSZ2vOXXfdhbvvvjtu/4MPPoiKiuL8EC6EgQngZ/u0OD6mfpOr0Un4xrlhLF+EI0IFIUVQNjmIz/IyNwAAHMBJREFU0slhlITHoIuMydfhUZSEgyiJRH8eQ0lkHJpICFopBG1kAhopBG0kBE1kAlopNHXbBDSRSWjAUw4UK69fHn4wGQV02yuz04uRBZ59Idg6xrLXs5IDEgQIcyw2DQslCGmXIKStwKS2HEGdESOlp2C07BSMlJ6K0bJTMFq6DBFN8nDT09ODv/u7vwMgz8KqrKycU1soO0ZHR3HTTTctrLDi8Xhgs9kSjvHOxJ6V9PUOj+Mr93bjnRPDqv3LKktx/60NOGt5mkk/Txb68c6qyORUnclEzPU4hNDY1BTGAIQZwy3CWGD656AofzPTlkFSFehOF+9GhBIc/uAoVq8+DVoBibuLpTAQHIAgHgIGDs99HZxFxnc0DFvHKJo+ooPTmr2ucUlbKv+NIMVMG5UvyT7o/YEIGtzDMBk16LJXTdUUCTOG0eSLVFqp/AzdEki6CvnfyvAxCENHgaGjECZG0m9vea3cm7DkFGDJMkjlS+V1jMprIVXUAuVL5d6Kiqn9ugp5qG4yOD2cGRqT/01HtyfH5fbpayCVzRj6yHAIItH7yTXXXAMAeOihh1hykGW57lnJa82Kw+FIukx/utPzysrKUFYWXxCl0+ly9gGXy+fOpZVGHR52fAw3/+efsPfIoLK/d3gCN9/bjd/8xSVYvyq7AS8bFurxzi4dUJad7q9k37HDoRBee+IJrL72WmjSOd6RCDB8TC5UDhyU14wQD8o/j/ZPfagYpgt+9Ybpwl+9Qb5d0CYeN595CU19kE2MTF2PyoWgM6+lyFStScmMmpOS6XH66DTy8KQc9JT1JyZi1qaYUNW8vNf7NoaDAZQAUxcBWgD9kHACEo4jguOQcHylhMu+rUNF3ceA634m1yMIWmByTP6wDU1dT44pRc2TwWG88uoeXHTJ5Sgpr5GDRGnldC1O6RII2ln+DjMD5VRBtNu9HY67vy6vBP5CN4QU06DT6m8ZHwIG5eCiXAC5LqnyVPl3nbqOtjejfhydDkAFgPysHjvz/STVQnKUHZm8f2fyPp/XsNLe3p50MbjoIj4zT/BF81e7pBQPfu1S3HLvy9hzePq09f0jE/ji9pdw/20X46N1/IZBadBopk4uuQpYc2mhW5MTf9zzAG7Zccvsd5zyQOPXgOXnpnVfKRTCkYN6fPQMa/J1hGYjCHIowvTMGsvVVyc82d+clVUBp1QBpxTglB1ESeS1gsdutyctoO3s7ITJZGLXXA7UVOjwm7+4GA2nG1X7B8ZC2OT6Ix546eCstUJEJwPbuTYY9UYIs/QVCBBg1BvRtL7wpwqZ7UzPRItBTsKKKIoJu9s2bNigLHAUe3+PxzOnszhTeqr0Ovz6totxqUnd9ToxGcHf7ngD33zoFQwGOauFTm76Ej3uv+F+AEgaWKL777/h/rRWsiWi+ZtTWBFFMWVtic1mg9VqVZ1ZGQCamprgcrniAsvGjRtht9t5QsMcW1JWgntvvRhXnLks7rbHXzuKz/z7C3j9/YECtIyoeHzm7M9gxxd2wKCXe3k1U1NIo9cGvQE7v7Az5Qq2RJRdadestLa2Yvfu3fD7/crFarXCYDBg8+bNqqBhtVrR1dWVsP7E5XKhtbUVbW1tSujZunUrg0qelJdq8Z9f2QDnk2/hP1/Yr7rtYN8obvz5H/CDa8/BVy5bC2ERr1JKlMpnz/4sjnz3CDz7PPjdW79D/2g/aitq8blzPoem9U3sUSHKs7TDSnNzc9pP2tzcnPL+mTwXZV9piQZ/++n1uNS0FN/r2IOBsenhn4lwBHc9tg8v+fvhbLoANeUn+6wcOlnpS/T48gVfxpcv+HKhm0J00ltYS+RRVlnXL8fj37ocF62JL2p+cu8xXPeT5/HqjBlEREREhcCwcpJbbaxAu+NjcFwZvzbD+4Ex2H7xB/x8Vw9CYa6oSkREhcGwQtBpNdh67Ufwq1sbYaxQD/uEwhKcT76Fa378PF58L9Mz3hEREc0fwwopPnnOcjzx7SuwYa0x7rb3TgzjS7/8E/7qt934QBxL8GgiIqLcYFghlZU15Xhoy6X4xifqkWgy0BOvH8PGH+3CT595F8FQOP8NJCKikw7DCsUp0Wpw56fOwX99/TKcf1pN3O3BUAT3PPUOPvVvz+GZt44XoIVERHQyYVihpMxrjNjxjY/jnz53PgwV8VOYD/aN4rb7unDbfbtxoDf9M7USERFlgmGFUtJqBNx0yRr873evwpcvXQNNgqGhZ946gY3/8izuaH8V750Yzn8jiYhoUWNYobQYl5TiH244H49+8/K4EyICQDgi4RHfB7D+67P4+m+68cYHXLafiIiyg2GFMnLeaTXw/OXH8C+bLsSyyrK42yUJ+P0bx/Dpf38Bt977MroOJD+HFBERUToYVihjgiDg8+bV+N/v/RkcV5pQrtMmvN+utz9E0y/+iM2uP+L5dz+EJEl5bikRES0GDCs0Z1V6HbZe+xG8+P1P4pufOANV+sSnmvrT/n7c/J8v4zM/fQEPvXwIw+OTeW4pEREtZAwrNG+1S0rxvU+djRe//0nc+amzUbukNOH93vhgEFsfeR0X/6MXzZ496D4YYG8LERHNKu2zLhPNplqvwzc+cQZu+/g6PPTyIbif8+PYYDDufqMTYbR3vY/2rvdx5qmV2LyhDp83r0ZVaYKpRkREdNJjWKGsKy/V4rbL1+FLl67BI74P8PNdPTjUP5rwvu+eGMY/PP4mnE++BetHTsXpYQFXhyPQxS/rQkREJymGFcqZshItvnjxGtgaVuPZdz7Ew7sP45m3TiAciR/6CYUlPPHGcQBatP/zc7j
ugpX/v717iW0jv+8A/h2+3xxSlmTJku2lvPZ6He829DppAjQ1GipoN8khqAwDe+lppaKXFiggw5dibwvpWORCBQiK3Gw6aIs8gIRaBMnCzS4scxMn9ma9Jv2WLUt8iC/xPT3MDEVZpF58iJK+H4CY4cyQIn6WR1/+//+ZP77/9iDOHXVBU+/mLkREdGAwrFDb6bQafPt0P759uh8vkzlcDz3F1ZtP8Chav7UlmingJ79/hJ/8/hEGnCZ8TwkuZ484IdSbsIiIiPY1hhXqqD6HCf9y4QT++Vsj+PRBDFdvPsYv//wChVKl7vHPl3P40ccP8KOPH+BYjwXff2sQ33t7AKf67QwuREQHBMMK7QqNRsA3RnrwjZEefJAt4H//MI9rc49xZz7V8DWPoln88Df38cPf3MdRtwW+0/3wne7D+dfc0Gt5YRsR0X7FsEK7TrQY8E/fPI73zh/Bf/30l8j0vIFf/OkFvtxgnqHHsSx+fOMBfnzjAewmHS6c6oPvdB8unOyDs86ki0REtHcxrFBX6TMD717w4N9GT+GLFyn87I/z+Nnt+YbjWwAglSvJx/1xHlqNgPPHXfCd7sffvN6L1/tsHKBLRLTHMaxQ1zp12I5Th0/h379zEn9+lsTPbs/j53+cx/zy+nu3qMoVCZ9EYvgkEgPwOVwWPb72mht/7enB11/rwRuH7QwvRER7DMMKdT1BEHB2yImzQ05c+Yc3cPd5ErN3X+Kjvyzg9tONZ3eOZ4v41Z0F/OrOAgBAtOhx/rgaXtw4PeCAluGFiKirMazQniIIAs4MOnFm0Il/9b2OhWQOH33+ErOfL+DG/SXkG1xVpEpkiwjeXUDwrhxe7EYdvMdc+NprbrxzzIW3h0WYGkzMSEREu4Nhhfa0focJ7339KN77+lFkCyXcuB/F7N0F3Agv4Wl8ZdPXp/Il/PbeIn57bxEAoNcKOHvEifPH3XjnuBxgXA3mOiIios5gWKF9w2LQYfTNfoy+2Q8AeBrP4tNIDJ8+iOKTSKzhLf9rFcsSQo8TCD1OwP+7CADA02vFW0ecODsk4q0hJ84MOmAx8L8OEVGn8IxL+9aQy4Khcxb847khAMB8YkUOLmE5wDzc4AqjWpHFDCKLGfzPH+YBABoBONFnw9kjIt4eduLsESdODzjYfURE1CYMK3RgDIpm/OCrQ/jBV+XwspDMYe5hHDcfxnDzYQyfP0+izrRF61Qk4N5CGvcW0vhp6CkAQKsRcKLXhjcHHTgz6JCXA07e84WIqAUYVujA6neY8N23BvDdtwYAAKlcEZ89TmDuYQw3H8bx2ZM4csWNB+yqyhUJXyyk8MVCCv/92bPq9iOiGWcGHTgz6MSbgw6c7Ldh2GXh5dNERNvAsEKksJv0+NbJXnzrZC8AoFiu4O58ErefLeNPTxO4/XQZX75M1501upFniRU8S6zg18rVRwBg0msw0mvDyX47Xu+34fU+O0722zDksvAyaiKiOhhWiBrQazV4e1jE28MigGMAgJVCGXefJ+Xw8mwZt58uI7yYhrT1/IJcsYI780ncmU+u2W7UaXCiz4aRXuXRZ4XnkA2eXivHwxDRgcawQrQNZoMW5465cO6Yq7otky/h8+dJ3H2exJ1nSdx5vox7L9IolLfWhaTKl+qHGEGQu5NGeuXgMtJrg+eQFccOWTHgMLFLiYj2PYYVoiZZjTr5nizH3dVtxXIF91+mlfCxjDvzSdxbSCGRLW77/SUJeBpfwdP4SvV+MCqDToOjbguO91hwrMeK44esON5jwfEeKwacJug4GzUR7QMMK0RtoNdqcHrAgdMDDowpl05LkoSldAFfLqRwbyGFL1+m8eVCGvde7izEAEChJIei+3VmqNZpBAy5zBh2W3Csx4KjbguOuq3ysscCI3MMEe0RDCtEHSIIAnrtRvTajfjmiUPV7bUh5v5iGpHFDMKLaYRfpjectHEzpYqEh9EsHkaz+PjL9fvdVj3sgha/Tt3GcI8VQy6z8rBgyGXmOBki6hoMK0S7rFGIAYBsoVQNL7XLR9EMMoVyUz83likiBgGP/vyi7v5euxFDLjOOiGYMOE047FSXJgw4Tei1GdnNREQdwbBC1MUsBh2+csSJrxxxrtmutsY8imbwMJpds3ywlEEqV2r6Zy+m8lhM5fHZ40Td/RoB6LPL4eWwQ172O0w47DTKS4f83GrkaYaImsOzCNEeVNsaUzuwF5CDTDxbxOOYHF4eR7PyeiyLJ7EsnjfRtVSrIgEvkjm8SG78fnajDv1KoOlTPnP1YVtdd5r1EARe2URE6zGsEO0zgiDAbTXAbTXgr4bFdftzxTKexrMIv0wheGMOzkEP5pfzyhVHWcR3ONi3kVS+hFSDQcC19FqhJryY0Ocwos9uRJ9dDjnycxMO2QzsfiI6YBhWiA4Yk16LE312HHOZkAtLePfvT0GvX53DKJ0v4ZkSXNSWmBfJnLxUHtu9h8xWFMsS5pdzyqDi5YbHCQLQYzXgkG21deZQdWlAr82EXrsRh2wGuCwG3oeGaB9gWCGiNWxGHU4dtuPUYXvd/ZIkIZYprIaX5OpyoWa9FeNm6v98YCldwFK6gL+8SG14rEYA3FY5uByyycsemxGHbEb02AzotRnhthrQYzOgx2qE2cAroIi6EcMKEW2LIAjosRnRYzOuG/hbK1sorQkxC8k8llJ5LKbz1cG7i+n8ju8xsxUVCVhK57GUzgPYONgAgMWgRY/NIAccpSvNbTNANBvgNOvrPuwmHVtviNqMYYWI2sJi0MHTa4On17bhcflSGdF0oRpgXqbyeJnKyctkHospJeik8yhtYxLJncgWysjGVvAktrLl1wiCPIjYZTVAtBjgtujhsijrVj1Ei9wd5TBq8CwjT27ZY5dbsDigmGhrGFaIaFcZdVoMimYMiuYNj6tUJMSyBbxUgovaMqO21qjbltIFxDKFDn16uVsqmSshmSvhUTS7ydE6TN/+GIDcReUw6+Ew6eEw6+BU1p1mvbJdt2a/vNRXjzPpNQw7dGAwrBDRnqDRCMq4E+OmxxbLFcQycmtNNFPAUiqPaEYOMkupPJZqtsUyBRTL7W2xqaciAYlsccfdYAatRgkvctARLatdVQ6zHqK5NtzoVtfNelgNWgYd2lMYVoho39FrNehXbkq3GUmSkMyVEE3LwUVtmYmm5aATzRSwvFLE8koRSWW5vFJEuc1dUpsplCs143G2R6sRqgHGbtLBZtTBZqxZV5ar++SHtWaf1aiDRa/leB3qCIYVIjrQBEGotkh4erf2GkmSkC2Uq8ElkS1ieaWAeLaIWKaARFZej2cKiNesL68UIGH3/7iXK/KNA5u9p44gAFaDDlajVgk5SiuOSQ4+8kP/ylIHu1G/JhAZdezSoo0xrBARbZMgCLAqrQubjbVRFYtF/PwXv8Tffvs7yJYkJFdKcmtNbrXFJpkrIVndVqruSyr70vn2XA6+U5Ik35cnnS9hAdtv4VHpNEI1vKgPi1EHm1ELi0FtyVldtxi01dYdq1ErLw1Ka49BC6OONw3cbxhWiIg6RCMAdpMObr0ecG
3/9aVyBel8qdqis7ZlRw416roahNTtqXwJ0u72XDVUqkhNjd95lU4jwGLQQlPR4j/v31ACzmrgkcONFhZ1adDCbNApSy2sNesWg/wai0ELPe+cvGsYVoiI9gidVgNRuSx6uyoVCal8qaYVRwkxSotNWlmmatbTObl1J1OQ1zP5clvuXtxqpYo8DgkQkFjMtOx9DVoNLMbVMFMbdmpDkLXaOqS0ABlWW4EsBi1MevlhVpZajvvZFMMKEdEBoNGsjs0ZbuJ9CqUKMkrXjxpi1K6gVK6EVK6oLOWgk1K6tlK5ElL5IjL5MlK54q5cgdWsQrmCQrbS8hsZ6rXCmgBj1mthMmhh0cutO2aDvM2iLF9t8altNbKoLUNqMNLtj0HQDCtERLRlBp0GBp0BLuv2W3dq5Uvl1dacV1p30vkSMvkSMoWyvFS2ZQvl6j51Pasct5cVyxKK5VLbpqgw6TVK2NHJ64bVVh11adJrYNLL433U4GTUaWDUa2FStlmNWvzdG/1t+YybYVghIqKOM+q0MNq06NnCfXM2U6lIWCmW1wScRCaH3/3fpzjz1leRLwOZghxw1KCTUVqGMvkyVgplZIvy9hVl30qxvCdbf+rJFSvIFStNX/3lsujx2X98p0WfansYVoiIaE/TaFavzlIVixZEP5fw7tnDa2YV345iuSJPwaAEnWy+rIQeOeTULtM1z1eDkPK8Zn0vjPlpxKTfvYk+GVaIiIjq0Gs1cJo1cJp3FnbqKVck5IplrBTLyFUflerzlYK8T11mC/L2bO32ghyA5GVtmGpvl9huXhLOsEJERNQh2jqtQK1UqUjIlcrVlqCV4mrIWRN6imXkCvLzfEkOTLlSGfnqsox8qVINU/lSGUe2eE+hdmBYISIi2ic0GkG5KkgHbDzh+Z7CO9wQERFRV2NYISIioq7GsEJERERdjWGFiIiIuhrDChEREXU1hhUiIiLqagwrRERE1NUYVoiIiKirMawQERFRV2NYISIioq7GsEJERERdjWGFiIiIuhrDChEREXW1PT/rsiRJAIBkMtny9y4Wi8hms0gmk9Dr9S1/f1qL9e4s1ruzWO/OYr07ayf1Vv9uq3/HN7Lnw0oqlQIADA8P7/InISIiou1KpVJwOp0bHiNIW4k0XaxSqWB+fh52ux2CILT0vZPJJIaHh/HkyRM4HI6Wvjetx3p3FuvdWax3Z7HenbWTekuShFQqhcHBQWg0G49K2fMtKxqNBkNDQ239GQ6Hg7/sHcR6dxbr3Vmsd2ex3p213Xpv1qKi4gBbIiIi6moMK0RERNTVtB988MEHu/0huplWq8WFCxeg0+35HrM9gfXuLNa7s1jvzmK9O6ud9d7zA2yJiIhof2M3EBEREXU1to3VMTs7i2AwiJGRESQSCQDA5OTkLn+qvS2RSODy5csQRRFTU1MNj2PtW2dmZgbhcBihUAixWAw+n69h7Vn35iUSCczMzCAajVafx2IxXLlyBV6vd93xrHnrTU9Pw+v1wufzrdvHeu/cxYsX4Xa7MTExAa/Xi0Qigbm5Ofj9/rq/322ptURrBAIBaWxsbM22YDAo+Xy+XfpEe9vk5KQ0NjYmTU1NSR6PRxofH294LGvfOpOTk1I4HK4+j8fjks/nk0RRlOLx+JpjWffmxeNxaXJysm5tAUjBYHDddta8teLxuARACgQC6/ax3s3x+XwSgDUPURTX/V5LUvtqzbBSIx6P1z2ZS5Ikeb1eye/378Kn2j+8Xm/DsMLat04gEJBu3bq1brt6Mq89abDureH3+yVRFNcERJUoipLX660+Z83bY2pqqm5YYb2bNzk5KQWDQWlqakqampqqGwglqb215piVGteuXYPH44Eoiuv2Xbp0CX6/fxc+1cHA2rfOzZs363Y7iKKI8fFxzM7OVptmWffW8Hg8AFCt60ZY89abnZ2t2/UDsN6t4vP5MDk5icnJSYyNjdU9pp21ZlipEQgE4Ha76+7zeDwIhUJbOhnR9rH2rTMzM4PR0dG6+86dOwcAmJubA8C6t4rP50M8Hq8bEhOJBN55553qc9a89UKhUN3aA6x3J7Wz1gwrNebm5qrfkF6lbo9EIp38SAcGa986tX8YX6WeKNQTCuveXtPT0wCAy5cvV7ex5q01MzOz4eBN1rtz2llrhpUaiUSibvNVLf5Stwdr3zrBYBDBYLDuvnA4DADVb6Gse/skEgn4/X4EAoE1J3DWvHUikUjDb/Iq1rs1IpEIZmZmqo/Lly+vayVpZ6156fIWqf8AsVhslz/JwcPat87MzAzGx8e3dCzrvn3q5cvhcBixWAzBYLDhN816WPPtuX79elOXxLLeWxOJRBAKhdacO0KhEM6dO4dbt25tGlCA5mvNsEJ0QFy+fBkej2fD+9xQc0RRrP7xnJ2dxcTEBCYmJhoOSKSdu379OuvaIYFAYN02r9cLr9eL999/v+7+VmM30Ba92tdPncPaNy8UCmFmZgbBYHBL34IA1r1ZPp8PgUAAFy9exMzMzJZew5pvjXrDve20WjV6H4D13qnR0VFcv359S8c2W2uGFaID4OLFi/joo4+aPrnT9oiiiLGxMUxMTPCKkxbaTncmtY8aPEKhUNt/FsNKDY/H03Dwj9rPxpN9e7D27TM6Ogq/31/30k7Wvf3Onz8PQO4WAljzZm10mXI9rHdzJiYm1lzNVuvVcSjtrDXHrNRQ5zyoR/0H2M5/Eto61r491BNNoxtmse6t4XK5qt0+r1JP6LX1ZM13LhKJ4OrVq+tuMKbW9MMPP8TVq1fhdrurIZ313rlr1641PH+o9VNvl9DOWrNlpcbo6GjDVBgOhxv+g1HzWPvWm56exsWLF9fVLhKJVL/ls+7NSyQSG3bxqJeLq98oWfPmjI2NIRAI1H0AwJUrVxAIBKphhvVuzvj4eMMBtOrVbmogb2utd3yj/n1Indeg3vweHo+n4XwItDVbmRuItW+NQCBQd5IxdZ9aZ9a9NTaaoNPr9UqiKFafs+bt0WgiQ9a7OZvNNVZbv3bWmmHlFfVmjAwEApydswU8Hs+62tZi7Vvj1q1bks/nk/x+/5qHOglZ7aR6ksS6t0I4HJbGx8fXTeDm9/slAOtO9qx56926dUsCUHeyPNa7OePj4+t+hxt9+WxXrQVJkqSdt8vsT7OzswgGgxgZGak27zZz46GDbHp6Gjdv3qzeVAiQL+kURRGXLl1ad58E1r55Lpdrw24Jj8dT7ZpQse6toQ5EVC+tdbvdmJqaqnu5OGveGqFQCB9++GH1HCOKInw+H0ZHR9dcMcR6N2d6ehrRaLT6u13v/K1qR60ZVoiIiKircYAtERERdTWGFSIiIupqDCtERETU1RhWiIiIqKsxrBAREVFXY1ghIiKirsawQkRERF2NYYWIiIi6GsMKERERdTWGFSIiIupqDCtERETU1RhWiIiIqKsxrBAREVFXY1ghIiKirvb/fo//LRAMrQ0AAAAASUVORK5CYII=\n",
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"plt.plot(history['epoch'], history['train_loss'], label='Train')\n",
"plt.plot(history['epoch'], history['val_loss'], label='Val')\n",
"\n",
"min_loss_epoch = np.argmin(history['val_loss'])\n",
"min_loss = history['val_loss'][min_loss_epoch]\n",
"plt.scatter([min_loss_epoch], [min_loss], c='g', label='Min Loss')\n",
"plt.annotate(f'({min_loss_epoch}, {min_loss:.2f})', (min_loss_epoch, min_loss))\n",
"\n",
"plt.legend()\n",
"plt.grid()"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
"# Saving the model\n",
"model_name = r'rnn_best_model.pt'\n",
"if isinstance(best_model['model'], nn.Module):\n",
" net = best_model['model']\n",
" best_model['model'] = best_model['model'].state_dict()\n",
"with open(model_name, 'wb') as f:\n",
" torch.save(best_model, f)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [],
"source": [
"# Defining a method to generate the next character\n",
"def predict(net, char, h=None, top_k=None, device='cpu'):\n",
" ''' Given a character, predict the next character.\n",
" Returns the predicted character and the hidden state.\n",
" '''\n",
" # tensor inputs\n",
" x = np.array([[net.char2int[char]]])\n",
" x = one_hot_encode(x, len(net.chars))\n",
" inputs = torch.from_numpy(x)\n",
" \n",
" inputs = inputs.to(device)\n",
" \n",
" # detach hidden state from history\n",
" h = tuple([each.data for each in h])\n",
" # get the output of the model\n",
" out, h = net(inputs, h)\n",
"\n",
" # get the character probabilities\n",
" p = F.softmax(out, dim=1).data\n",
" p = p.cpu() # move to cpu\n",
" \n",
" # get top characters\n",
" if top_k is None:\n",
" top_ch = np.arange(len(net.chars))\n",
" else:\n",
" p, top_ch = p.topk(top_k)\n",
" top_ch = top_ch.numpy().squeeze()\n",
" \n",
" # select the likely next character with some element of randomness\n",
" p = p.numpy().squeeze()\n",
" char = np.random.choice(top_ch, p=p/p.sum())\n",
" \n",
" # return the encoded value of the predicted char and the hidden state\n",
" return net.int2char[char], h\n",
" \n",
"# Declaring a method to generate new text\n",
"def sample(net, size, prime='The', top_k=None, device='cpu'):\n",
" net = net.to(device)\n",
" \n",
" net.eval() # eval mode\n",
" \n",
" # First off, run through the prime characters\n",
" chars = [ch for ch in prime]\n",
" h = init_hidden(net, 1, device)\n",
" for ch in prime:\n",
" char, h = predict(net, ch, h, top_k=top_k, device=device)\n",
"\n",
" chars.append(char)\n",
" \n",
" # Now pass in the previous character and get a new one\n",
" for ii in range(size):\n",
" char, h = predict(net, chars[-1], h, top_k=top_k, device=device)\n",
" chars.append(char)\n",
"\n",
" return ''.join(chars)"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Testary\n",
"\n",
"A TOMON.\n",
"Then stays not where to make his honour beat\n",
"A trooping thoughts of season to the world,\n",
"And hath been he is brings against the womb\n",
"As thou wert letter; and thou wert to saw\n",
"To the sea such a second conscience,\n",
"To hear thee said; and thou stand strives on me.\n",
"\n",
" Enter a Servant.\n",
"\n",
"SIR TOBY.\n",
"He shall have the worse the world.\n",
"\n",
"HAMLET.\n",
"And will the streets be taken, she is not so? Who came to the\n",
"minds of a sun, who will share make a good best and time? Alas, this\n",
"whole head and all as we will be a commonwealth of the minds and all\n",
"hers. This stale is a star there a song where you speak truly; and we are\n",
"not he hath shall be his. I’ll see you here welcome. His more\n",
"welp in them, I shall have some state as the children of a cold.\n",
"\n",
"SIR TOBY.\n",
"Why, they shall have my blessing will not then, but I’ll shall set it to\n",
"him. This will send them to the moon and the morning. This shall have\n",
"a couparine of the world.\n",
"\n",
"HAMLET.\n",
"I have not dead, and see thou art a chain that holding thy day\n"
]
}
],
"source": [
"# Generating new text\n",
"print(sample(net, 1000, prime='Test', top_k=5, device=device))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Quantizing the model"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [],
"source": [
"import torch\n",
"from torch import nn\n",
"import torch.quantization as tq"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CharRNN(\n",
" (quant_x): QuantStub()\n",
" (quant_h): ModuleList(\n",
" (0): QuantStub()\n",
" (1): QuantStub()\n",
" )\n",
" (lstm): LSTM(108, 512, num_layers=2, batch_first=True, dropout=0.5)\n",
" (dropout): Dropout(p=0.5, inplace=False)\n",
" (fc): Linear(in_features=512, out_features=108, bias=True)\n",
" (dequant): DeQuantStub()\n",
")\n"
]
}
],
"source": [
"model_name = r'rnn_best_model.pt'\n",
"best_model = torch.load(model_name)\n",
"n_layers = 0\n",
"n_hidden = 0\n",
"for key, tensor in best_model['model'].items():\n",
" key = key.split('_')\n",
" layer_type, key_type = key[0].split('.')\n",
" if layer_type == 'fc':\n",
" continue\n",
" \n",
" layer_idx = int(key[2][1])\n",
" n_layers = max(n_layers, layer_idx)\n",
" \n",
" if key_type == 'weight' and key[1] == 'hh':\n",
" n_hidden = tensor.shape[1]\n",
"n_layers += 1\n",
"\n",
"net = CharRNN(chars, n_hidden, n_layers).cpu()\n",
"net.load_state_dict(best_model['model'])\n",
"print(net)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Because the LSTM is setup as a custom module, you need to setup a configuration."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [],
"source": [
"custom_module_config = {\n",
" 'float_to_observed_custom_module_class': {\n",
" # What to do when converting a floating point layer\n",
" torch.nn.LSTM: torch.nn.quantizable.LSTM,\n",
" },\n",
" 'observed_to_quantized_custom_module_class': {\n",
" # What to do after the observation is complete\n",
" torch.nn.quantizable.LSTM: torch.nn.quantizable.LSTM\n",
" }\n",
"}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The utility below also helps setting up the quantization engine."
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [],
"source": [
"qengine = 'fbgemm'\n",
"\n",
"@contextmanager\n",
"def override_quantized_engine(qengine):\n",
" previous = torch.backends.quantized.engine\n",
" torch.backends.quantized.engine = qengine\n",
" try:\n",
" if qengine == 'qnnpack':\n",
" torch._C._set_default_mobile_cpu_allocator()\n",
" yield\n",
" finally:\n",
" if qengine == 'qnnpack':\n",
" torch._C._unset_default_mobile_cpu_allocator()\n",
" torch.backends.quantized.engine = previous"
]
},
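{
"cell_type": "markdown",
"metadata": {},
"source": [
"For example (a usage sketch, not part of the original flow; the rest of this notebook simply relies on `'fbgemm'` being the default engine on x86), the quantization steps could be wrapped as follows:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Usage sketch for the context manager above. 'fbgemm' targets x86 servers,\n",
"# 'qnnpack' targets ARM/mobile. The previous engine is restored on exit.\n",
"with override_quantized_engine(qengine):\n",
"    print(torch.backends.quantized.engine)  # 'fbgemm' while inside the block"
]
},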
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prepare"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CharRNN(\n",
" (quant_x): QuantStub(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (quant_h): ModuleList(\n",
" (0): QuantStub(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (1): QuantStub(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" )\n",
" (lstm): QuantizableLSTM(\n",
" (layers): ModuleList(\n",
" (0): _LSTMLayer(\n",
" (layer_fw): _LSTMSingleLayer(\n",
" (cell): QuantizableLSTMCell(\n",
" (igates): Linear(\n",
" in_features=108, out_features=2048, bias=True\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (hgates): Linear(\n",
" in_features=512, out_features=2048, bias=True\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (gates): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (fgate_cx): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (igate_cgate): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (fgate_cx_igate_cgate): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (ogate_cy): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" )\n",
" )\n",
" )\n",
" (1): _LSTMLayer(\n",
" (layer_fw): _LSTMSingleLayer(\n",
" (cell): QuantizableLSTMCell(\n",
" (igates): Linear(\n",
" in_features=512, out_features=2048, bias=True\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (hgates): Linear(\n",
" in_features=512, out_features=2048, bias=True\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (gates): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (fgate_cx): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (igate_cgate): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (fgate_cx_igate_cgate): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (ogate_cy): FloatFunctional(\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" )\n",
" )\n",
" )\n",
" )\n",
" )\n",
" (dropout): Dropout(p=0.5, inplace=False)\n",
" (fc): Linear(\n",
" in_features=512, out_features=108, bias=True\n",
" (activation_post_process): HistogramObserver()\n",
" )\n",
" (dequant): DeQuantStub()\n",
")\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"/home/zafar/Git/pytorch-dev/pytorch/torch/quantization/observer.py:123: UserWarning: Please use quant_min and quant_max to specify the range for observers. reduce_range will be deprecated in a future release of PyTorch.\n",
" reduce_range will be deprecated in a future release of PyTorch.\"\n",
"/home/zafar/Git/pytorch-dev/pytorch/torch/nn/quantizable/modules/rnn.py:320: UserWarning: dropout option for quantizable LSTM is ignored. If you are training, please, use nn.LSTM version followed by `prepare` step.\n",
" warnings.warn(\"dropout option for quantizable LSTM is ignored. \"\n"
]
}
],
"source": [
"net.qconfig = tq.get_default_qconfig(qengine)\n",
"net_prepared = tq.prepare(net, prepare_custom_config_dict=custom_module_config, inplace=False)\n",
"print(net_prepared)"
]
},
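{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (a sketch, not part of the original flow), we can count the observers that `prepare` attached:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Count the observers inserted by tq.prepare; they will collect activation\n",
"# statistics during calibration.\n",
"n_observers = sum(isinstance(m, tq.HistogramObserver) for m in net_prepared.modules())\n",
"print(f'{n_observers} HistogramObservers attached')"
]
},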
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Calibrate"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Calibrating .........:.........:.........:.........:.........:.........:.........:.........:.........:.......\n",
"Calibration Loss: 1.09\n"
]
}
],
"source": [
"device = torch.device('cuda')\n",
"net_calibrated = copy.deepcopy(net_prepared).eval().to(device)\n",
"criterion = nn.CrossEntropyLoss()\n",
"\n",
"# Hyperparameters\n",
"batch_size = 128\n",
"seq_length = 100\n",
"n_chars = len(net_calibrated.chars)\n",
"h = init_hidden(net, batch_size, device)\n",
"\n",
"# Split the data, and calibrate only on the training\n",
"val_frac = 0.1\n",
"val_idx = int(len(encoded)*(1-val_frac))\n",
"train_encoded, val_encoded = encoded[:val_idx], encoded[val_idx:]\n",
"\n",
"print('Calibrating', end=' ', flush=True)\n",
"dot_every = 4 # This is for logging and printing\n",
"counter = 0\n",
"running_loss = 0.0\n",
"for x, y in get_batches(train_encoded, batch_size, seq_length):\n",
" counter += 1\n",
"\n",
" x = one_hot_encode(x, n_chars)\n",
" inputs, targets = torch.from_numpy(x), torch.from_numpy(y)\n",
" inputs, targets = inputs.to(device), targets.to(device)\n",
"\n",
" h = tuple([each.data.to(device) for each in h])\n",
" output, h = net_calibrated(inputs, h)\n",
"\n",
" loss = criterion(output, targets.view(batch_size*seq_length).long())\n",
" running_loss += loss.item()\n",
"\n",
" if counter % (10 * dot_every) == 0:\n",
" print(':', end='', flush=True)\n",
" elif counter % dot_every == 0:\n",
" print('.', end='', flush=True)\n",
"print()\n",
"print(f'Calibration Loss: {running_loss / counter:.2f}')"
]
},
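{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before converting, we can peek at what an observer learned from the calibration data; a minimal sketch using the eager-mode observer API (`activation_post_process`, `calculate_qparams`):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Inspect the quantization parameters one observer derived during\n",
"# calibration; convert() will bake these into the quantized modules.\n",
"obs = net_calibrated.quant_x.activation_post_process\n",
"scale, zero_point = obs.calculate_qparams()\n",
"print('input scale:', scale.item(), 'zero point:', zero_point.item())"
]
},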
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Convert"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"CharRNN(\n",
" (quant_x): Quantize(scale=tensor([0.0079]), zero_point=tensor([0]), dtype=torch.quint8)\n",
" (quant_h): ModuleList(\n",
" (0): Quantize(scale=tensor([0.0159]), zero_point=tensor([63]), dtype=torch.quint8)\n",
" (1): Quantize(scale=tensor([2.1360]), zero_point=tensor([85]), dtype=torch.quint8)\n",
" )\n",
" (lstm): QuantizedLSTM(\n",
" (layers): ModuleList(\n",
" (0): _LSTMLayer(\n",
" (layer_fw): _LSTMSingleLayer(\n",
" (cell): QuantizedLSTMCell(\n",
" (igates): QuantizedLinear(in_features=108, out_features=2048, scale=0.04310886934399605, zero_point=64, qscheme=torch.per_channel_affine)\n",
" (hgates): QuantizedLinear(in_features=512, out_features=2048, scale=0.10811269283294678, zero_point=64, qscheme=torch.per_channel_affine)\n",
" (gates): QFunctional(\n",
" scale=0.11146293580532074, zero_point=63\n",
" (activation_post_process): Identity()\n",
" )\n",
" (fgate_cx): QFunctional(\n",
" scale=0.03302617743611336, zero_point=60\n",
" (activation_post_process): Identity()\n",
" )\n",
" (igate_cgate): QFunctional(\n",
" scale=0.015441657043993473, zero_point=64\n",
" (activation_post_process): Identity()\n",
" )\n",
" (fgate_cx_igate_cgate): QFunctional(\n",
" scale=0.04679515212774277, zero_point=59\n",
" (activation_post_process): Identity()\n",
" )\n",
" (ogate_cy): QFunctional(\n",
" scale=0.014196028932929039, zero_point=70\n",
" (activation_post_process): Identity()\n",
" )\n",
" )\n",
" )\n",
" )\n",
" (1): _LSTMLayer(\n",
" (layer_fw): _LSTMSingleLayer(\n",
" (cell): QuantizedLSTMCell(\n",
" (igates): QuantizedLinear(in_features=512, out_features=2048, scale=0.20001915097236633, zero_point=63, qscheme=torch.per_channel_affine)\n",
" (hgates): QuantizedLinear(in_features=512, out_features=2048, scale=0.23398037254810333, zero_point=69, qscheme=torch.per_channel_affine)\n",
" (gates): QFunctional(\n",
" scale=0.2735488712787628, zero_point=71\n",
" (activation_post_process): Identity()\n",
" )\n",
" (fgate_cx): QFunctional(\n",
" scale=2.2412078380584717, zero_point=84\n",
" (activation_post_process): Identity()\n",
" )\n",
" (igate_cgate): QFunctional(\n",
" scale=0.016878196969628334, zero_point=59\n",
" (activation_post_process): Identity()\n",
" )\n",
" (fgate_cx_igate_cgate): QFunctional(\n",
" scale=2.240091562271118, zero_point=85\n",
" (activation_post_process): Identity()\n",
" )\n",
" (ogate_cy): QFunctional(\n",
" scale=0.016556110233068466, zero_point=60\n",
" (activation_post_process): Identity()\n",
" )\n",
" )\n",
" )\n",
" )\n",
" )\n",
" )\n",
" (dropout): Dropout(p=0.5, inplace=False)\n",
" (fc): QuantizedLinear(in_features=512, out_features=108, scale=0.4080132246017456, zero_point=74, qscheme=torch.per_channel_affine)\n",
" (dequant): DeQuantize()\n",
")\n"
]
}
],
"source": [
"net_calibrated = net_calibrated.cpu()\n",
"net_quantized = tq.convert(net_calibrated, convert_custom_config_dict=custom_module_config, inplace=False)\n",
"print(net_quantized)"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Validating .........:\n",
"Validation Loss: 4.30\n"
]
}
],
"source": [
"print('Validating', end=' ', flush=True)\n",
"dot_every = 4 # This is for logging and printing\n",
"counter = 0\n",
"running_loss = 0.0\n",
"h = init_hidden(net, batch_size, 'cpu')\n",
"for x, y in get_batches(val_encoded, batch_size, seq_length):\n",
" counter += 1\n",
"\n",
" x = one_hot_encode(x, n_chars)\n",
" inputs, targets = torch.from_numpy(x), torch.from_numpy(y)\n",
"\n",
" h = tuple([each.data for each in h])\n",
" output, h = net_quantized(inputs, h)\n",
"\n",
" loss = criterion(output, targets.view(batch_size*seq_length).long())\n",
" running_loss += loss.item()\n",
"\n",
" if counter % (10 * dot_every) == 0:\n",
" print(':', end='', flush=True)\n",
" elif counter % dot_every == 0:\n",
" print('.', end='', flush=True)\n",
"print()\n",
"print(f'Validation Loss: {running_loss / counter:.2f}')"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"TestTtTt tttn tnTnTn tTaTnt nanaTTa att tt a Tat T tn n tnnant nTn t TtTntTt aaTa n aanTntaTtttT aT ta tttTnataaT t Tn nnt a nat a an tttat an nTTaTaaTtn T T tnn tna aa n TT nn atnn nnT n aa nn na tnt T Ttat nTnana tan nn ttttTTa TTantatn aTa tT nnTT TnTn aTat atnT tTa n taaa att aTTn n t T TTa nataanntTt a attt tnnn t Tat ntn a T ttnat aattnTt Tn nT Tat anTttan naa tatnTnttTnTt n TnnaTnaata nnn n tnatnna ttt Tn TaTTTann naaant taanatTTaaT Tt ttt taa TnTa TntaanTTnaaa naann atn nntTan Ttn nTnttanTn a a Tn t taaattaTTT na tnTTnna taa ta ntnaaTnn nttT nn Tn TanT tn ttn nnn naatnn a nantt ta ta taaT aTTa TTnT TanTanttT ntatT n TTa Ta nn aTnT t n aaaT t Tt aaT TTnaTttt tn nan tna a na nT nt nnat Tt att na T tTtntaTaat T nant anTaTatana ntaa tT aaT ata tTnnTnn TTat naaatnat TtatnTTanTntt t nT aT Ttnnannt nat n Ta T ttn anTntaaTna nn nnttt t na n a a aaanTt n a tTT t t tnnaTTTaa aa t TtaattTn Taan aTnn TTtTTn a \n"
]
}
],
"source": [
"# Generating new text\n",
"print(sample(net_quantized, 1000, prime='Test', top_k=5, device='cpu'))"
]
}
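,
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The statically quantized model is noticeably worse here: the loss roughly quadruples relative to the calibration run and the generated text degenerates. Recurrent models are sensitive to activation quantization because errors in the hidden and cell states compound across time steps (note the large scales in the second LSTM layer, e.g. `fgate_cx` with scale ≈ 2.24). A common alternative worth trying is dynamic quantization, which quantizes only the weights ahead of time and picks activation scales at runtime; a minimal sketch, assuming the float `net` loaded above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# A minimal dynamic-quantization sketch (an alternative, not part of the\n",
"# static workflow above): int8 weights, activation scales chosen on the fly.\n",
"net_dynamic = tq.quantize_dynamic(net, {nn.LSTM, nn.Linear}, dtype=torch.qint8)\n",
"print(sample(net_dynamic, 100, prime='Test', top_k=5, device='cpu'))"
]
}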
],
"metadata": {
"accelerator": "TPU",
"colab": {
"collapsed_sections": [
"N6ZDpd9XzFeN"
],
"name": "Predict Shakespeare with Cloud TPUs and Keras",
"provenance": [],
"version": "0.3.2"
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.10"
}
},
"nbformat": 4,
"nbformat_minor": 1
}