@jaimeide
Last active April 29, 2018 02:04
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Case Study: Fraud Detection (XGBoost)\n",
"\n",
"Ref: https://www.kaggle.com/mlg-ulb/creditcardfraud"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Simple container class for a Matlab-like struct (attributes are added dynamically)\n",
"class matlab_like:\n",
" pass"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Context\n",
"\n",
"It is important that credit card companies are able to recognize fraudulent credit card transactions so that customers are not charged for items that they did not purchase.\n",
"\n",
"## Content\n",
"\n",
"The dataset contains transactions made by credit cards in September 2013 by European cardholders. It covers transactions that occurred over two days, with 492 frauds out of 284,807 transactions. The dataset is highly unbalanced: the positive class (frauds) accounts for 0.172% of all transactions.\n",
"\n",
"It contains only numerical input variables, which are the result of a PCA transformation. Unfortunately, due to confidentiality issues, we cannot provide the original features or more background information about the data. Features V1, V2, ..., V28 are the principal components obtained with PCA; the only features that have not been transformed with PCA are 'Time' and 'Amount'. The feature 'Time' contains the seconds elapsed between each transaction and the first transaction in the dataset. The feature 'Amount' is the transaction amount, which can be used for example-dependent cost-sensitive learning. The feature 'Class' is the response variable: it takes value 1 in case of fraud and 0 otherwise.\n",
"\n",
"## Inspiration\n",
"\n",
"Identify fraudulent credit card transactions.\n",
"\n",
"Given the class imbalance ratio, we recommend measuring performance with the Area Under the Precision-Recall Curve (AUPRC); plain confusion-matrix accuracy is not meaningful for unbalanced classification.\n",
"\n",
"## Acknowledgements\n",
"\n",
"The dataset has been collected and analysed during a research collaboration of Worldline and the Machine Learning Group (http://mlg.ulb.ac.be) of ULB (Université Libre de Bruxelles) on big data mining and fraud detection. More details on current and past projects on related topics are available on http://mlg.ulb.ac.be/BruFence and http://mlg.ulb.ac.be/ARTML\n",
"\n",
"Please cite: Andrea Dal Pozzolo, Olivier Caelen, Reid A. Johnson and Gianluca Bontempi. Calibrating Probability with Undersampling for Unbalanced Classification. In Symposium on Computational Intelligence and Data Mining (CIDM), IEEE, 2015"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Library"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import time\n",
"import scipy.stats as stats\n",
"import numpy as np\n",
"import pandas as pd\n",
"import tensorflow as tf\n",
"import seaborn as sns\n",
"import matplotlib.gridspec as gridspec\n",
"\n",
"from sklearn.preprocessing import StandardScaler # to normalize feats\n",
"from sklearn.decomposition import PCA\n",
"from sklearn.metrics import classification_report\n",
"from imblearn.over_sampling import SMOTE\n",
"from imblearn.over_sampling import ADASYN"
]
},
{
"cell_type": "code",
"execution_count": 124,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Define my metrics\n",
"def ji_get_metrics(C):\n",
" \"\"\"\n",
" C: ndarray, shape (2,2) as given by scikit-learn confusion_matrix function\n",
" \n",
" Good reading: http://stats.stackexchange.com/questions/49579/balanced-accuracy-vs-f-1-score\n",
" \n",
" \"\"\"\n",
" #import matplotlib.pyplot as plt\n",
" import numpy as np\n",
" \n",
" allres = matlab_like()\n",
" \n",
" assert C.shape == (2,2), \"Confusion matrix should be from binary classification only.\"\n",
" \n",
" # true negative, false positive, etc...\n",
" tn = C[0,0]; fp = C[0,1]; fn = C[1,0]; tp = C[1,1]\n",
" NP = fn+tp # Num positive examples\n",
" NN = tn+fp # Num negative examples\n",
" N = NP+NN\n",
" TPR = tp / (NP+0.) # Sensitivity or Recall_P\n",
" TNR = tn / (NN+0.) # Specificity or Recall_N\n",
" \n",
" print(' TN=%d, FP=%d'%(tn,fp))\n",
" print(' FN=%d, TP=%d'%(fn,tp))\n",
" \n",
" \n",
" precision = (tp/(tp+fp+0.)) # Precision_P\n",
" recall = TPR\n",
" \n",
" allres.C0 = TNR\n",
" allres.C1 = TPR\n",
" allres.Precision = precision\n",
" allres.Recall = recall\n",
" allres.BAcc = (TPR+TNR)/2\n",
" # Guard against division by zero when both precision and recall are 0\n",
" allres.F1 = 2*(precision*recall)/(precision+recall) if (precision+recall) > 0 else 0.0\n",
" \n",
" return allres"
]
},
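{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick sanity check of the metric formulas above on a hypothetical confusion matrix (a standalone sketch; the counts are made up):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"# Toy confusion matrix: rows = true class, cols = predicted class\n",
"C_toy = np.array([[90, 10], [5, 45]])\n",
"tn, fp, fn, tp = C_toy[0,0], C_toy[0,1], C_toy[1,0], C_toy[1,1]\n",
"TPR = tp / float(tp + fn) # recall of the positive class\n",
"TNR = tn / float(tn + fp) # recall of the negative class\n",
"precision = tp / float(tp + fp)\n",
"BAcc = (TPR + TNR) / 2\n",
"F1 = 2 * precision * TPR / (precision + TPR)\n",
"print('BAcc=%.4f, F1=%.4f' % (BAcc, F1)) # BAcc=0.9000, F1=0.8571"
]
},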
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Function to plot Confusion Matrix\n",
"import itertools\n",
"import matplotlib.pyplot as plt\n",
"\n",
"def plot_confusion_matrix(cm, classes,\n",
" normalize=False,\n",
" title='Confusion matrix',\n",
" cmap=plt.cm.Blues):\n",
" \"\"\"\n",
" This function prints and plots the confusion matrix.\n",
" Normalization can be applied by setting `normalize=True`.\n",
" \"\"\"\n",
" plt.imshow(cm, interpolation='nearest', cmap=cmap)\n",
" plt.title(title)\n",
" plt.colorbar()\n",
" tick_marks = np.arange(len(classes))\n",
" plt.xticks(tick_marks, classes, rotation=0)\n",
" plt.yticks(tick_marks, classes)\n",
"\n",
" if normalize:\n",
" cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]\n",
" #print(\"Normalized confusion matrix\")\n",
" else:\n",
" pass #print('Confusion matrix, without normalization')\n",
"\n",
" #print(cm)\n",
"\n",
" thresh = cm.max() / 2.\n",
" for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):\n",
" plt.text(j, i, cm[i, j],\n",
" horizontalalignment=\"center\",\n",
" color=\"white\" if cm[i, j] > thresh else \"black\")\n",
"\n",
" plt.tight_layout()\n",
" plt.ylabel('True label')\n",
" plt.xlabel('Predicted label')"
]
},
{
"cell_type": "code",
"execution_count": 157,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Standardize or normalize\n",
"def ji_normalize(df,typeN,selectedCol):\n",
" # Return a normalized DF\n",
" # Use: dd = ji_normalize(df,'minmax',['colName'])\n",
"\n",
" dfx = df.copy()\n",
" if selectedCol:\n",
" if typeN == 'minmax':\n",
" dfx[selectedCol] = dfx[selectedCol].apply(lambda x: (x - x.min()) / (x.max() - x.min()))\n",
" else:\n",
" dfx[selectedCol] = StandardScaler().fit_transform(dfx[selectedCol].values) # selectedCol must be a list of column names\n",
" else:\n",
" allcols = dfx.columns\n",
" if typeN == 'minmax':\n",
" dfx[allcols] = dfx[allcols].apply(lambda x: (x - x.min()) / (x.max() - x.min()))\n",
" else:\n",
" dfx[allcols] = StandardScaler().fit_transform(dfx[allcols].values)\n",
" \n",
" return dfx\n",
" "
]
},
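{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a concrete example of the 'minmax' branch above, min-max scaling maps each column to the [0, 1] range (toy data, for illustration only):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"\n",
"df_toy = pd.DataFrame({'Amount': [0.0, 50.0, 200.0], 'Time': [0.0, 10.0, 20.0]})\n",
"dd = df_toy.apply(lambda x: (x - x.min()) / (x.max() - x.min()))\n",
"print(dd['Amount'].tolist()) # [0.0, 0.25, 1.0]\n",
"print(dd['Time'].tolist()) # [0.0, 0.5, 1.0]"
]
},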
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Load the data set"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>Time</th>\n",
" <th>V1</th>\n",
" <th>V2</th>\n",
" <th>V3</th>\n",
" <th>V4</th>\n",
" <th>V5</th>\n",
" <th>V6</th>\n",
" <th>V7</th>\n",
" <th>V8</th>\n",
" <th>V9</th>\n",
" <th>...</th>\n",
" <th>V21</th>\n",
" <th>V22</th>\n",
" <th>V23</th>\n",
" <th>V24</th>\n",
" <th>V25</th>\n",
" <th>V26</th>\n",
" <th>V27</th>\n",
" <th>V28</th>\n",
" <th>Amount</th>\n",
" <th>Class</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>0.0</td>\n",
" <td>-1.359807</td>\n",
" <td>-0.072781</td>\n",
" <td>2.536347</td>\n",
" <td>1.378155</td>\n",
" <td>-0.338321</td>\n",
" <td>0.462388</td>\n",
" <td>0.239599</td>\n",
" <td>0.098698</td>\n",
" <td>0.363787</td>\n",
" <td>...</td>\n",
" <td>-0.018307</td>\n",
" <td>0.277838</td>\n",
" <td>-0.110474</td>\n",
" <td>0.066928</td>\n",
" <td>0.128539</td>\n",
" <td>-0.189115</td>\n",
" <td>0.133558</td>\n",
" <td>-0.021053</td>\n",
" <td>149.62</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>0.0</td>\n",
" <td>1.191857</td>\n",
" <td>0.266151</td>\n",
" <td>0.166480</td>\n",
" <td>0.448154</td>\n",
" <td>0.060018</td>\n",
" <td>-0.082361</td>\n",
" <td>-0.078803</td>\n",
" <td>0.085102</td>\n",
" <td>-0.255425</td>\n",
" <td>...</td>\n",
" <td>-0.225775</td>\n",
" <td>-0.638672</td>\n",
" <td>0.101288</td>\n",
" <td>-0.339846</td>\n",
" <td>0.167170</td>\n",
" <td>0.125895</td>\n",
" <td>-0.008983</td>\n",
" <td>0.014724</td>\n",
" <td>2.69</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>1.0</td>\n",
" <td>-1.358354</td>\n",
" <td>-1.340163</td>\n",
" <td>1.773209</td>\n",
" <td>0.379780</td>\n",
" <td>-0.503198</td>\n",
" <td>1.800499</td>\n",
" <td>0.791461</td>\n",
" <td>0.247676</td>\n",
" <td>-1.514654</td>\n",
" <td>...</td>\n",
" <td>0.247998</td>\n",
" <td>0.771679</td>\n",
" <td>0.909412</td>\n",
" <td>-0.689281</td>\n",
" <td>-0.327642</td>\n",
" <td>-0.139097</td>\n",
" <td>-0.055353</td>\n",
" <td>-0.059752</td>\n",
" <td>378.66</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>1.0</td>\n",
" <td>-0.966272</td>\n",
" <td>-0.185226</td>\n",
" <td>1.792993</td>\n",
" <td>-0.863291</td>\n",
" <td>-0.010309</td>\n",
" <td>1.247203</td>\n",
" <td>0.237609</td>\n",
" <td>0.377436</td>\n",
" <td>-1.387024</td>\n",
" <td>...</td>\n",
" <td>-0.108300</td>\n",
" <td>0.005274</td>\n",
" <td>-0.190321</td>\n",
" <td>-1.175575</td>\n",
" <td>0.647376</td>\n",
" <td>-0.221929</td>\n",
" <td>0.062723</td>\n",
" <td>0.061458</td>\n",
" <td>123.50</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>2.0</td>\n",
" <td>-1.158233</td>\n",
" <td>0.877737</td>\n",
" <td>1.548718</td>\n",
" <td>0.403034</td>\n",
" <td>-0.407193</td>\n",
" <td>0.095921</td>\n",
" <td>0.592941</td>\n",
" <td>-0.270533</td>\n",
" <td>0.817739</td>\n",
" <td>...</td>\n",
" <td>-0.009431</td>\n",
" <td>0.798278</td>\n",
" <td>-0.137458</td>\n",
" <td>0.141267</td>\n",
" <td>-0.206010</td>\n",
" <td>0.502292</td>\n",
" <td>0.219422</td>\n",
" <td>0.215153</td>\n",
" <td>69.99</td>\n",
" <td>0</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>5 rows × 31 columns</p>\n",
"</div>"
],
"text/plain": [
" Time V1 V2 V3 V4 V5 V6 V7 \\\n",
"0 0.0 -1.359807 -0.072781 2.536347 1.378155 -0.338321 0.462388 0.239599 \n",
"1 0.0 1.191857 0.266151 0.166480 0.448154 0.060018 -0.082361 -0.078803 \n",
"2 1.0 -1.358354 -1.340163 1.773209 0.379780 -0.503198 1.800499 0.791461 \n",
"3 1.0 -0.966272 -0.185226 1.792993 -0.863291 -0.010309 1.247203 0.237609 \n",
"4 2.0 -1.158233 0.877737 1.548718 0.403034 -0.407193 0.095921 0.592941 \n",
"\n",
" V8 V9 ... V21 V22 V23 V24 \\\n",
"0 0.098698 0.363787 ... -0.018307 0.277838 -0.110474 0.066928 \n",
"1 0.085102 -0.255425 ... -0.225775 -0.638672 0.101288 -0.339846 \n",
"2 0.247676 -1.514654 ... 0.247998 0.771679 0.909412 -0.689281 \n",
"3 0.377436 -1.387024 ... -0.108300 0.005274 -0.190321 -1.175575 \n",
"4 -0.270533 0.817739 ... -0.009431 0.798278 -0.137458 0.141267 \n",
"\n",
" V25 V26 V27 V28 Amount Class \n",
"0 0.128539 -0.189115 0.133558 -0.021053 149.62 0 \n",
"1 0.167170 0.125895 -0.008983 0.014724 2.69 0 \n",
"2 -0.327642 -0.139097 -0.055353 -0.059752 378.66 0 \n",
"3 0.647376 -0.221929 0.062723 0.061458 123.50 0 \n",
"4 -0.206010 0.502292 0.219422 0.215153 69.99 0 \n",
"\n",
"[5 rows x 31 columns]"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# 1) Import data\n",
"df = pd.read_csv(\"creditcard.csv\")\n",
"df.head() # notice that the features are anonymized"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# If you want to alter/categorize or fix a particular categorical variable (not the case here...)\n",
"def val_update(val):\n",
" if val<0:\n",
" return 0\n",
" else:\n",
" return 1\n",
"#aux = df['V1'].apply(val_update)\n",
"#aux.head()"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Check if there are missing values \n",
"# --> NO!\n",
"#df.isnull().sum()"
]
},
{
"cell_type": "code",
"execution_count": 50,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>Time</th>\n",
" <th>V1</th>\n",
" <th>V2</th>\n",
" <th>V3</th>\n",
" <th>V4</th>\n",
" <th>V5</th>\n",
" <th>V6</th>\n",
" <th>V7</th>\n",
" <th>V8</th>\n",
" <th>V9</th>\n",
" <th>...</th>\n",
" <th>V21</th>\n",
" <th>V22</th>\n",
" <th>V23</th>\n",
" <th>V24</th>\n",
" <th>V25</th>\n",
" <th>V26</th>\n",
" <th>V27</th>\n",
" <th>V28</th>\n",
" <th>Amount</th>\n",
" <th>Class</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>count</th>\n",
" <td>284807.000000</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>...</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>2.848070e+05</td>\n",
" <td>284807.000000</td>\n",
" <td>284807.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>mean</th>\n",
" <td>94813.859575</td>\n",
" <td>1.165980e-15</td>\n",
" <td>3.416908e-16</td>\n",
" <td>-1.373150e-15</td>\n",
" <td>2.086869e-15</td>\n",
" <td>9.604066e-16</td>\n",
" <td>1.490107e-15</td>\n",
" <td>-5.556467e-16</td>\n",
" <td>1.177556e-16</td>\n",
" <td>-2.406455e-15</td>\n",
" <td>...</td>\n",
" <td>1.656562e-16</td>\n",
" <td>-3.444850e-16</td>\n",
" <td>2.578648e-16</td>\n",
" <td>4.471968e-15</td>\n",
" <td>5.340915e-16</td>\n",
" <td>1.687098e-15</td>\n",
" <td>-3.666453e-16</td>\n",
" <td>-1.220404e-16</td>\n",
" <td>88.349619</td>\n",
" <td>0.001727</td>\n",
" </tr>\n",
" <tr>\n",
" <th>std</th>\n",
" <td>47488.145955</td>\n",
" <td>1.958696e+00</td>\n",
" <td>1.651309e+00</td>\n",
" <td>1.516255e+00</td>\n",
" <td>1.415869e+00</td>\n",
" <td>1.380247e+00</td>\n",
" <td>1.332271e+00</td>\n",
" <td>1.237094e+00</td>\n",
" <td>1.194353e+00</td>\n",
" <td>1.098632e+00</td>\n",
" <td>...</td>\n",
" <td>7.345240e-01</td>\n",
" <td>7.257016e-01</td>\n",
" <td>6.244603e-01</td>\n",
" <td>6.056471e-01</td>\n",
" <td>5.212781e-01</td>\n",
" <td>4.822270e-01</td>\n",
" <td>4.036325e-01</td>\n",
" <td>3.300833e-01</td>\n",
" <td>250.120109</td>\n",
" <td>0.041527</td>\n",
" </tr>\n",
" <tr>\n",
" <th>min</th>\n",
" <td>0.000000</td>\n",
" <td>-5.640751e+01</td>\n",
" <td>-7.271573e+01</td>\n",
" <td>-4.832559e+01</td>\n",
" <td>-5.683171e+00</td>\n",
" <td>-1.137433e+02</td>\n",
" <td>-2.616051e+01</td>\n",
" <td>-4.355724e+01</td>\n",
" <td>-7.321672e+01</td>\n",
" <td>-1.343407e+01</td>\n",
" <td>...</td>\n",
" <td>-3.483038e+01</td>\n",
" <td>-1.093314e+01</td>\n",
" <td>-4.480774e+01</td>\n",
" <td>-2.836627e+00</td>\n",
" <td>-1.029540e+01</td>\n",
" <td>-2.604551e+00</td>\n",
" <td>-2.256568e+01</td>\n",
" <td>-1.543008e+01</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>25%</th>\n",
" <td>54201.500000</td>\n",
" <td>-9.203734e-01</td>\n",
" <td>-5.985499e-01</td>\n",
" <td>-8.903648e-01</td>\n",
" <td>-8.486401e-01</td>\n",
" <td>-6.915971e-01</td>\n",
" <td>-7.682956e-01</td>\n",
" <td>-5.540759e-01</td>\n",
" <td>-2.086297e-01</td>\n",
" <td>-6.430976e-01</td>\n",
" <td>...</td>\n",
" <td>-2.283949e-01</td>\n",
" <td>-5.423504e-01</td>\n",
" <td>-1.618463e-01</td>\n",
" <td>-3.545861e-01</td>\n",
" <td>-3.171451e-01</td>\n",
" <td>-3.269839e-01</td>\n",
" <td>-7.083953e-02</td>\n",
" <td>-5.295979e-02</td>\n",
" <td>5.600000</td>\n",
" <td>0.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>50%</th>\n",
" <td>84692.000000</td>\n",
" <td>1.810880e-02</td>\n",
" <td>6.548556e-02</td>\n",
" <td>1.798463e-01</td>\n",
" <td>-1.984653e-02</td>\n",
" <td>-5.433583e-02</td>\n",
" <td>-2.741871e-01</td>\n",
" <td>4.010308e-02</td>\n",
" <td>2.235804e-02</td>\n",
" <td>-5.142873e-02</td>\n",
" <td>...</td>\n",
" <td>-2.945017e-02</td>\n",
" <td>6.781943e-03</td>\n",
" <td>-1.119293e-02</td>\n",
" <td>4.097606e-02</td>\n",
" <td>1.659350e-02</td>\n",
" <td>-5.213911e-02</td>\n",
" <td>1.342146e-03</td>\n",
" <td>1.124383e-02</td>\n",
" <td>22.000000</td>\n",
" <td>0.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>75%</th>\n",
" <td>139320.500000</td>\n",
" <td>1.315642e+00</td>\n",
" <td>8.037239e-01</td>\n",
" <td>1.027196e+00</td>\n",
" <td>7.433413e-01</td>\n",
" <td>6.119264e-01</td>\n",
" <td>3.985649e-01</td>\n",
" <td>5.704361e-01</td>\n",
" <td>3.273459e-01</td>\n",
" <td>5.971390e-01</td>\n",
" <td>...</td>\n",
" <td>1.863772e-01</td>\n",
" <td>5.285536e-01</td>\n",
" <td>1.476421e-01</td>\n",
" <td>4.395266e-01</td>\n",
" <td>3.507156e-01</td>\n",
" <td>2.409522e-01</td>\n",
" <td>9.104512e-02</td>\n",
" <td>7.827995e-02</td>\n",
" <td>77.165000</td>\n",
" <td>0.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>max</th>\n",
" <td>172792.000000</td>\n",
" <td>2.454930e+00</td>\n",
" <td>2.205773e+01</td>\n",
" <td>9.382558e+00</td>\n",
" <td>1.687534e+01</td>\n",
" <td>3.480167e+01</td>\n",
" <td>7.330163e+01</td>\n",
" <td>1.205895e+02</td>\n",
" <td>2.000721e+01</td>\n",
" <td>1.559499e+01</td>\n",
" <td>...</td>\n",
" <td>2.720284e+01</td>\n",
" <td>1.050309e+01</td>\n",
" <td>2.252841e+01</td>\n",
" <td>4.584549e+00</td>\n",
" <td>7.519589e+00</td>\n",
" <td>3.517346e+00</td>\n",
" <td>3.161220e+01</td>\n",
" <td>3.384781e+01</td>\n",
" <td>25691.160000</td>\n",
" <td>1.000000</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>8 rows × 31 columns</p>\n",
"</div>"
],
"text/plain": [
" Time V1 V2 V3 V4 \\\n",
"count 284807.000000 2.848070e+05 2.848070e+05 2.848070e+05 2.848070e+05 \n",
"mean 94813.859575 1.165980e-15 3.416908e-16 -1.373150e-15 2.086869e-15 \n",
"std 47488.145955 1.958696e+00 1.651309e+00 1.516255e+00 1.415869e+00 \n",
"min 0.000000 -5.640751e+01 -7.271573e+01 -4.832559e+01 -5.683171e+00 \n",
"25% 54201.500000 -9.203734e-01 -5.985499e-01 -8.903648e-01 -8.486401e-01 \n",
"50% 84692.000000 1.810880e-02 6.548556e-02 1.798463e-01 -1.984653e-02 \n",
"75% 139320.500000 1.315642e+00 8.037239e-01 1.027196e+00 7.433413e-01 \n",
"max 172792.000000 2.454930e+00 2.205773e+01 9.382558e+00 1.687534e+01 \n",
"\n",
" V5 V6 V7 V8 V9 \\\n",
"count 2.848070e+05 2.848070e+05 2.848070e+05 2.848070e+05 2.848070e+05 \n",
"mean 9.604066e-16 1.490107e-15 -5.556467e-16 1.177556e-16 -2.406455e-15 \n",
"std 1.380247e+00 1.332271e+00 1.237094e+00 1.194353e+00 1.098632e+00 \n",
"min -1.137433e+02 -2.616051e+01 -4.355724e+01 -7.321672e+01 -1.343407e+01 \n",
"25% -6.915971e-01 -7.682956e-01 -5.540759e-01 -2.086297e-01 -6.430976e-01 \n",
"50% -5.433583e-02 -2.741871e-01 4.010308e-02 2.235804e-02 -5.142873e-02 \n",
"75% 6.119264e-01 3.985649e-01 5.704361e-01 3.273459e-01 5.971390e-01 \n",
"max 3.480167e+01 7.330163e+01 1.205895e+02 2.000721e+01 1.559499e+01 \n",
"\n",
" ... V21 V22 V23 V24 \\\n",
"count ... 2.848070e+05 2.848070e+05 2.848070e+05 2.848070e+05 \n",
"mean ... 1.656562e-16 -3.444850e-16 2.578648e-16 4.471968e-15 \n",
"std ... 7.345240e-01 7.257016e-01 6.244603e-01 6.056471e-01 \n",
"min ... -3.483038e+01 -1.093314e+01 -4.480774e+01 -2.836627e+00 \n",
"25% ... -2.283949e-01 -5.423504e-01 -1.618463e-01 -3.545861e-01 \n",
"50% ... -2.945017e-02 6.781943e-03 -1.119293e-02 4.097606e-02 \n",
"75% ... 1.863772e-01 5.285536e-01 1.476421e-01 4.395266e-01 \n",
"max ... 2.720284e+01 1.050309e+01 2.252841e+01 4.584549e+00 \n",
"\n",
" V25 V26 V27 V28 Amount \\\n",
"count 2.848070e+05 2.848070e+05 2.848070e+05 2.848070e+05 284807.000000 \n",
"mean 5.340915e-16 1.687098e-15 -3.666453e-16 -1.220404e-16 88.349619 \n",
"std 5.212781e-01 4.822270e-01 4.036325e-01 3.300833e-01 250.120109 \n",
"min -1.029540e+01 -2.604551e+00 -2.256568e+01 -1.543008e+01 0.000000 \n",
"25% -3.171451e-01 -3.269839e-01 -7.083953e-02 -5.295979e-02 5.600000 \n",
"50% 1.659350e-02 -5.213911e-02 1.342146e-03 1.124383e-02 22.000000 \n",
"75% 3.507156e-01 2.409522e-01 9.104512e-02 7.827995e-02 77.165000 \n",
"max 7.519589e+00 3.517346e+00 3.161220e+01 3.384781e+01 25691.160000 \n",
"\n",
" Class \n",
"count 284807.000000 \n",
"mean 0.001727 \n",
"std 0.041527 \n",
"min 0.000000 \n",
"25% 0.000000 \n",
"50% 0.000000 \n",
"75% 0.000000 \n",
"max 1.000000 \n",
"\n",
"[8 rows x 31 columns]"
]
},
"execution_count": 50,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Original data to keep:\n",
"df.describe()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prepare training & validation & testing"
]
},
{
"cell_type": "code",
"execution_count": 170,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Train+Validation:(199364, 30), Test:(85443, 30)\n",
"Index,Train: [ 39873 39874 39875 ..., 199361 199362 199363]|Valid: [ 0 1 2 ..., 39870 39871 39872]\n",
"Index,Train: [ 0 1 2 ..., 199361 199362 199363]|Valid: [39873 39874 39875 ..., 79743 79744 79745]\n",
"Index,Train: [ 0 1 2 ..., 199361 199362 199363]|Valid: [ 79746 79747 79748 ..., 119616 119617 119618]\n",
"Index,Train: [ 0 1 2 ..., 199361 199362 199363]|Valid: [119619 119620 119621 ..., 159489 159490 159491]\n",
"Index,Train: [ 0 1 2 ..., 159489 159490 159491]|Valid: [159492 159493 159494 ..., 199361 199362 199363]\n",
"\n",
"** Train+Validation Data **\n",
"Percentage of normal transactions: 0.9983 (n=199016)\n",
"Percentage of fraud transactions: 0.0017 (n=348)\n",
"Total number of transactions in resampled data: 199364\n",
"\n",
"** Test Data **\n",
"Percentage of normal transactions: 0.9983 (n=85299)\n",
"Percentage of fraud transactions: 0.0017 (n=144)\n",
"Total number of transactions in resampled data: 85443\n"
]
}
],
"source": [
"# ORIGINAL DATA\n",
"# - df\n",
"# - x_data\n",
"# - y_labels\n",
"\n",
"# 2) Prepare the learning\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.model_selection import KFold\n",
"from sklearn.model_selection import StratifiedKFold\n",
"\n",
"x_data = df.drop('Class',axis=1)\n",
"y_labels = df['Class']\n",
"y_names = ['Normal','Fraud']\n",
"\n",
"X_train, X_test, y_train, y_test = train_test_split(x_data,y_labels,test_size=0.3,random_state=101)\n",
"\n",
"print('Train+Validation:%s, Test:%s'%(X_train.shape,X_test.shape))\n",
"\n",
"# 2.a) Random Cross Validation (3/5/10 folds)\n",
"# Note: random_state only has an effect with shuffle=True; with the default\n",
"# shuffle=False (used here), the folds are contiguous blocks\n",
"k_3folds = KFold(n_splits=3)\n",
"k_5folds = KFold(n_splits=5)\n",
"k_10folds = KFold(n_splits=10)\n",
"for train_indices, validation_indices in k_5folds.split(X_train):\n",
" print('Index,Train: %s|Valid: %s' % (train_indices, validation_indices))\n",
"\n",
"# 2.b) Stratified Cross Validation (3/5/10 folds) - folds preserve the label distribution\n",
"sk_3folds = StratifiedKFold(n_splits=3)\n",
"sk_5folds = StratifiedKFold(n_splits=5)\n",
"sk_10folds = StratifiedKFold(n_splits=10)\n",
"\n",
"# Showing ratio\n",
"dfaux = y_train\n",
"print(\"\")\n",
"print(\"** Train+Validation Data **\")\n",
"print(\"Percentage of normal transactions: %1.4f (n=%d)\"%(len(dfaux[dfaux == 0])/float(len(dfaux)),len(dfaux[dfaux == 0])))\n",
"print(\"Percentage of fraud transactions: %1.4f (n=%d)\"%(len(dfaux[dfaux == 1])/float(len(dfaux)),len(dfaux[dfaux == 1])))\n",
"print(\"Total number of transactions in resampled data: %d\"%(len(dfaux)))\n",
"# Showing ratio\n",
"dfaux = y_test\n",
"print(\"\")\n",
"print(\"** Test Data **\")\n",
"print(\"Percentage of normal transactions: %1.4f (n=%d)\"%(len(dfaux[dfaux == 0])/float(len(dfaux)),len(dfaux[dfaux == 0])))\n",
"print(\"Percentage of fraud transactions: %1.4f (n=%d)\"%(len(dfaux[dfaux == 1])/float(len(dfaux)),len(dfaux[dfaux == 1])))\n",
"print(\"Total number of transactions in resampled data: %d\"%(len(dfaux)))"
]
},
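{
"cell_type": "markdown",
"metadata": {},
"source": [
"Why stratification matters with rare positives: with plain KFold, an unlucky (or sorted) ordering can put all frauds in a single fold, while StratifiedKFold keeps the class ratio in every fold. A toy illustration with synthetic labels:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from sklearn.model_selection import KFold, StratifiedKFold\n",
"\n",
"y_toy = np.array([1]*5 + [0]*95) # 5% positives, all at the start\n",
"X_toy = np.zeros((100, 1))\n",
"kf_pos = [int(y_toy[va].sum()) for _, va in KFold(n_splits=5).split(X_toy)]\n",
"skf_pos = [int(y_toy[va].sum()) for _, va in StratifiedKFold(n_splits=5).split(X_toy, y_toy)]\n",
"print(kf_pos)  # [5, 0, 0, 0, 0] -> all positives land in one fold\n",
"print(skf_pos) # [1, 1, 1, 1, 1] -> positives spread evenly"
]
},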
{
"cell_type": "markdown",
"metadata": {
"collapsed": true
},
"source": [
"## Dealing with Imbalanced Data: Sampling Approach\n",
"\n",
"**OBS**: Be careful not to use the test set when resampling; otherwise class information from the test data leaks into the model. The correct approach is to resample the training data only, leaving the test data completely apart."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 1) Undersampling: create dataset with proportional train/test"
]
},
{
"cell_type": "code",
"execution_count": 188,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Percentage of normal transactions: 0.50 (n=348)\n",
"Percentage of fraud transactions: 0.50 (n=348)\n",
"Total number of transactions in resampled data: 696\n"
]
}
],
"source": [
"# Set data to undersample. Undersampling the whole data set (including test) would risk leakage, so we undersample the training data only.\n",
"#datax = pd.concat([x_data,y_labels],axis=1) # This would use the whole data including Test... (maybe it is OK)\n",
"datax0 = pd.concat([x_data,y_labels],axis=1) # ORIGINAL DATA\n",
"datax = pd.concat([X_train,y_train],axis=1) #\n",
"\n",
"# Number of data points in the minority class\n",
"number_records_fraud = len(datax[datax.Class == 1])\n",
"fraud_indices = np.array(datax[datax.Class == 1].index)\n",
"\n",
"# Picking the indices of the normal classes\n",
"normal_indices = datax[datax.Class == 0].index\n",
"\n",
"# Out of the indices we picked, randomly select \"x\" number (number_records_fraud)\n",
"random_normal_indices = np.random.choice(normal_indices, number_records_fraud, replace = False)\n",
"random_normal_indices = np.array(random_normal_indices)\n",
"\n",
"# Appending the 2 indices\n",
"under_sample_indices = np.concatenate([fraud_indices,random_normal_indices])\n",
"\n",
"# Under sample dataset\n",
"under_sampled_data = datax0.iloc[under_sample_indices,:]\n",
"under_sampled_data_shuffled = under_sampled_data.sample(frac=1) # shuffle data to use for train/test\n",
"\n",
"X_undersampled = under_sampled_data_shuffled.loc[:, under_sampled_data_shuffled.columns != 'Class'] # Do not use direct to train/test...\n",
"y_undersampled = under_sampled_data_shuffled.loc[:, under_sampled_data_shuffled.columns == 'Class']\n",
"# Convert to Series (JI)\n",
"y_undersampled = y_undersampled.iloc[:,0]\n",
"\n",
"# Showing ratio\n",
"aux0 = len(under_sampled_data[under_sampled_data.Class == 0])\n",
"aux1 = len(under_sampled_data[under_sampled_data.Class == 1])\n",
"print(\"Percentage of normal transactions: %1.2f (n=%d)\"%(aux0/float(len(under_sampled_data)),aux0))\n",
"print(\"Percentage of fraud transactions: %1.2f (n=%d)\"%(aux1/float(len(under_sampled_data)),aux1))\n",
"print(\"Total number of transactions in resampled data: %d\"%(len(under_sampled_data)))\n",
"\n",
"# # Split into Train and Validation data sets\n",
"# #from sklearn.cross_validation import train_test_split\n",
"# from sklearn.model_selection import train_test_split\n",
"\n",
"# # Undersampled dataset\n",
"# X_train_undersampled, X_test_undersampled, y_train_undersampled, y_test_undersampled = \\\n",
"# train_test_split(X_undersampled,y_undersampled,test_size = 0.3,random_state = 0)\n",
"# print(\"\")\n",
"# print(\"Number transactions train dataset: \", len(X_train_undersampled))\n",
"# print(\"Number transactions validation dataset: \", len(X_test_undersampled))\n",
"# print(\"Total number of transactions: \", len(X_train_undersampled)+len(X_test_undersampled))"
]
},
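{
"cell_type": "markdown",
"metadata": {},
"source": [
"The undersampling recipe above, reduced to its core on toy data (synthetic labels, only to show the mechanics):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import pandas as pd\n",
"\n",
"rng = np.random.RandomState(101)\n",
"toy = pd.DataFrame({'x': rng.randn(100), 'Class': [1]*5 + [0]*95})\n",
"n_minority = int((toy.Class == 1).sum())\n",
"minority_idx = np.array(toy.index[toy.Class == 1])\n",
"majority_idx = rng.choice(toy.index[toy.Class == 0], n_minority, replace=False)\n",
"balanced = toy.loc[np.concatenate([minority_idx, majority_idx])].sample(frac=1, random_state=101)\n",
"print(len(balanced), balanced.Class.mean()) # 10 rows, 50/50 class split"
]
},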
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2) SMOTE: Synthetic Minority Oversampling"
]
},
{
"cell_type": "code",
"execution_count": 128,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Applied SMOTE: y_train old_size=199364 (#fraud=348), new_size=398032 (#fraud=199016)\n",
"Applied SMOTE (borderline): y_train old_size=199364 (#fraud=348), new_size=398032 (#fraud=199016)\n"
]
}
],
"source": [
"# SMOTE (Ide, ICPR 2016)\n",
"# Check: - ji_Example_Nice_ML_Framework.ipynb\n",
"# - ji_example_SMOTE.ipynb\n",
"\n",
"from imblearn.over_sampling import SMOTE\n",
"from imblearn.over_sampling import ADASYN\n",
"# NB: in newer imblearn releases, the 'kind' argument and fit_sample() were replaced\n",
"# by the BorderlineSMOTE class and fit_resample(), respectively\n",
"\n",
"# SMOTE and others..\n",
"\n",
"# a) Standard SMOTE\n",
"sm = SMOTE(kind='regular',random_state=101)\n",
"X_train_SMOTE, y_train_SMOTE = sm.fit_sample(X_train, y_train)\n",
"print('Applied SMOTE: y_train old_size=%d (#fraud=%d), new_size=%d (#fraud=%d)'%(len(y_train),len(y_train[y_train==1]),len(y_train_SMOTE),len(y_train_SMOTE[y_train_SMOTE==1])))\n",
"X_train_SMOTE = pd.DataFrame(X_train_SMOTE,columns=X_train.columns)\n",
"y_train_SMOTE = pd.Series(y_train_SMOTE)\n",
"# Shuffle data\n",
"iprm = np.random.permutation(len(y_train_SMOTE))\n",
"X_train_SMOTE = X_train_SMOTE.iloc[iprm]\n",
"y_train_SMOTE = y_train_SMOTE.iloc[iprm]\n",
"\n",
"# b) Borderline SMOTE\n",
"sm = SMOTE(kind='borderline2',random_state=101) # \n",
"X_train_SMOTE2, y_train_SMOTE2 = sm.fit_sample(X_train, y_train)\n",
"print('Applied SMOTE (borderline): y_train old_size=%d (#fraud=%d), new_size=%d (#fraud=%d)'%(len(y_train),len(y_train[y_train==1]),len(y_train_SMOTE2),len(y_train_SMOTE2[y_train_SMOTE2==1])))\n",
"X_train_SMOTE2 = pd.DataFrame(X_train_SMOTE2,columns=X_train.columns)\n",
"y_train_SMOTE2 = pd.Series(y_train_SMOTE2)\n",
"# Shuffle data\n",
"iprm = np.random.permutation(len(y_train_SMOTE2))\n",
"X_train_SMOTE2 = X_train_SMOTE2.iloc[iprm]\n",
"y_train_SMOTE2 = y_train_SMOTE2.iloc[iprm]\n",
"\n",
"#sm = SMOTE(kind='svm')\n",
"# c) ADASYN\n",
"#sm = ADASYN()\n"
]
},
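{
"cell_type": "markdown",
"metadata": {},
"source": [
"The core idea behind SMOTE, sketched in plain NumPy: each synthetic sample is interpolated between a minority point and one of its neighbours. (A simplification for illustration; imblearn's SMOTE samples from the k nearest neighbours.)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"rng = np.random.RandomState(101)\n",
"minority = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])\n",
"synthetic = []\n",
"for _ in range(4):\n",
"    i, j = rng.choice(len(minority), 2, replace=False) # a point and a 'neighbour'\n",
"    gap = rng.rand() # random position along the segment\n",
"    synthetic.append(minority[i] + gap * (minority[j] - minority[i]))\n",
"synthetic = np.array(synthetic)\n",
"# every synthetic point lies on a segment between two real minority points"
]
},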
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Feature Selection\n",
"\n",
"Ref: "
]
},
{
"cell_type": "code",
"execution_count": 189,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# 1) Check correlation among feats \n",
"# --> consider removing highly correlated features, since they create collinearity in regression-based approaches\n",
"\n",
"# Cross-correlation map of the features\n",
"f,ax = plt.subplots(figsize=(20, 20))\n",
"#sns.heatmap(X_train.corr(), annot=True, linewidths=.5, fmt= '.1f',ax=ax)"
]
},
{
"cell_type": "code",
"execution_count": 190,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Index(['V1', 'V2', 'V3', 'V4', 'V5', 'V7', 'V9', 'V10', 'V11', 'V12', 'V14',\n",
" 'V16', 'V17', 'V18', 'V21'],\n",
" dtype='object')\n",
"Index(['Time', 'V1', 'V2', 'V3', 'V4', 'V5', 'V7', 'V8', 'V10', 'V11', 'V12',\n",
" 'V14', 'V16', 'V17', 'Amount'],\n",
" dtype='object')\n"
]
}
],
"source": [
"# 2) Univariate feature selection\n",
"# - In univariate feature selection, we use SelectKBest, which removes all but the k highest-scoring features.\n",
"# - http://scikit-learn.org/stable/modules/generated/sklearn.feature_selection.SelectKBest.html#sklearn.feature_selection.SelectKBest\n",
"\n",
"from sklearn.feature_selection import SelectKBest,chi2,f_classif,mutual_info_classif\n",
"\n",
"# Some options: f_classif (ANOVA), chi2, mutual_info_classif\n",
"# keep the k highest-scoring features (k=15 here)\n",
"select_feature1 = SelectKBest(f_classif, k=15).fit(X_train, y_train)\n",
"print(X_train.columns[select_feature1.get_support()])\n",
"select_feature2 = SelectKBest(chi2, k=15).fit(X_train.abs(), y_train)\n",
"print(X_train.columns[select_feature2.get_support()])\n",
"#select_feature3 = SelectKBest(mutual_info_classif, k=10).fit(X_train, y_train)\n",
"#print(X_train.columns[select_feature3.get_support()])\n",
"\n",
"# Transform data\n",
"X_train_UniF = select_feature1.transform(X_train)\n",
"X_test_UniF = select_feature1.transform(X_test)\n",
"X_train_Chi2 = select_feature2.transform(X_train)\n",
"X_test_Chi2 = select_feature2.transform(X_test)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# 3) Recursive feature elimination (RFE) using Random Forest¶ (could be any other...)\n",
"# - http://scikit-learn.org/stable/modules/generated/sklearn.feature_selection.RFE.html \n",
"# - Basically, it uses one of the classification methods (random forest in our example), \n",
"# assign weights to each of features. Whose absolute weights are the smallest are pruned \n",
"# from the current set features. That procedure is recursively repeated on the pruned set until \n",
"# the desired number of features\n",
"tic = time.time()\n",
"\n",
"from sklearn.feature_selection import RFE\n",
"from sklearn.ensemble import RandomForestClassifier # Random forest classifier\n",
"\n",
"# Here, we will apply RFE on the undersampled data so that it will not be affected by the data imbalance\n",
"\n",
"# Create the RFE object and rank each pixel\n",
"clf_rf_3 = RandomForestClassifier(n_estimators=10) \n",
"rfe = RFE(estimator=clf_rf_3, n_features_to_select=15, step=1)\n",
"# Normalize\n",
"XX_train = X_undersampled.copy()\n",
"yy_train = y_undersampled.copy()\n",
"cols_to_norm = ['Time','Amount']\n",
"#XX_train[cols_to_norm] = XX_train[cols_to_norm].apply(lambda x: (x - x.mean()) / x.std())\n",
"XX_train[cols_to_norm] = XX_train[cols_to_norm].apply(lambda x: (x - x.min()) / (x.max() - x.min()))\n",
"\n",
"rfe = rfe.fit(XX_train, yy_train)\n",
"print('Chosen best 15 features by Recursive Feature Elimination:',XX_train.columns[rfe.support_])\n",
"\n",
"X_train_RFE = rfe.transform(X_train)\n",
"X_test_RFE = rfe.transform(X_test)\n",
"\n",
"toc = time.time()\n",
"print(\"Computation time: %d seconds\"%(toc-tic))"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(696, 30)"
]
},
"execution_count": 28,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# 4) t-SNE (good for visualization)\n",
"# - t-Distributed Stochastic Neighbor Embedding (t-SNE) is a (prize-winning) technique for dimensionality reduction\n",
"# that is particularly well suited for the visualization of high-dimensional datasets. The technique can be \n",
"# implemented via Barnes-Hut approximations, allowing it to be applied on large real-world datasets. \n",
"# - t-distributed Stochastic Neighbor Embedding. t-SNE [1] is a tool to visualize high-dimensional data. \n",
"# It converts similarities between data points to joint probabilities and tries to minimize the Kullback-Leibler \n",
"# divergence between the joint probabilities of the low-dimensional embedding and the high-dimensional data.\n",
"# t-SNE has a cost function that is not convex, i.e. with different initializations we can get different results.\n",
"# - It is highly recommended to use another dimensionality reduction method (e.g. PCA for dense data or \n",
"# TruncatedSVD for sparse data) to reduce the number of dimensions to a reasonable amount (e.g. 50) \n",
"# if the number of features is very high. This will suppress some noise and speed up the computation of \n",
"# pairwise distances between samples. For more tips see Laurens van der Maaten’s FAQ [2].\n",
"# - Ref: - https://lvdmaaten.github.io/tsne/\n",
"# - http://scikit-learn.org/stable/modules/generated/sklearn.manifold.TSNE.html\n",
"\n",
"from sklearn.preprocessing import StandardScaler\n",
"from sklearn.manifold import TSNE\n",
"import time\n",
"import pandas as pd\n",
"import numpy as np\n",
"\n",
"# Select the dataset to transform\n",
"XX_train = X_undersampled.copy()\n",
"yy_train = y_undersampled.copy()\n",
"\n",
"#Scale features to improve the training ability of TSNE.\n",
"standard_scaler = StandardScaler()\n",
"XX_train_std = standard_scaler.fit_transform(XX_train)\n",
"XX_train_std.shape"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Computation time: 2 seconds\n"
]
}
],
"source": [
"tic = time.time()\n",
"tsne = TSNE(n_components=2, random_state=101)\n",
"XX_train_2d = tsne.fit_transform(XX_train_std)\n",
"toc = time.time()\n",
"print(\"Computation time: %d seconds\"%(toc-tic))"
]
},
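   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "The comments above recommend reducing dimensionality with PCA before running t-SNE when there are many features. A minimal sketch of that pipeline on synthetic data (the random data and `n_pca_components = 10` are illustrative assumptions, not values from this notebook):"
    ]
   },

```python
# Hypothetical sketch of the "PCA first, then t-SNE" advice:
# compress to a moderate number of PCA components, then embed in 2D.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.RandomState(0)
X = rng.randn(200, 30)            # stand-in for the scaled training data

n_pca_components = 10             # van der Maaten's FAQ suggests ~50 for very wide data
X_pca = PCA(n_components=n_pca_components).fit_transform(X)
X_2d = TSNE(n_components=2, random_state=101).fit_transform(X_pca)
print(X_2d.shape)                 # (200, 2)
```

On the real undersampled data this would replace the direct `tsne.fit_transform(XX_train_std)` call, trading a small amount of variance for less noise and a faster pairwise-distance computation.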
{
"cell_type": "code",
"execution_count": 85,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAfIAAAFnCAYAAABdOssgAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsnXl4FFXW/79V3dm6OyEmhOwERtmDr6OiAiKCQBIQUZwZ\nIuqoyMCLOqMoECCMRExQwiLMiAtEmFFkwN1BE3bHX1gEBJ2XEBUUCJ3VLCQhXdm6q35/VLrTS1V1\n9d6d3M/z+Ei6qqtu3aquc++553wPxXEcBwKBQCAQCAEJ7esGEAgEAoFAcB5iyAkEAoFACGCIIScQ\nCAQCIYAhhpxAIBAIhACGGHICgUAgEAIYYsgJBAKBQAhgiCEnEAgEAiGAIYacEJB88MEHotsaGhqw\naNEipKWlIS0tDdOmTbPYf+LEiXj88cctvlNeXo6JEyea/j1kyBCkp6fb/FdTUyO7jenp6airq3Ps\nwmQwZMgQVFdX48CBA1i2bJnTxykuLkZlZSUAYP369fjXv/7lriaKotVqMXnyZMyYMcPic5Zl8dBD\nD2H79u0Wnz/33HN45ZVX7G63h9TzIoWr9/Dbb781PVdS/Pe//8WPP/7o9HkIvRyOQAgw9Ho9d8st\nt4huf+6557j8/HzOYDBwHMdxly5d4m677TbuzJkzHMdx3IQJE7gJEyZwBw4cMH1Hq9VyEyZMMP17\n2LBhHrwC1xg8eDBXVVXl8nHmzJnDnTp1yg0tks+nn37KPfTQQ4LbjPeprKyM4ziO+89//sNNnjyZ\nYxhG1nYx7D0vnuTUqVOm50qKv/71r9xnn33mhRYReiJkRk4IOJ544glcu3YN6enp0Gq1NtvPnz+P\nG2+8ETTNP94DBgzAnj17cOONN5r2WbRoEdauXYuOjg6X2rJmzRrk5uaa/r569SpuuukmXLt2zTRz\n1ul0ePrpp5GRkYF77rkHK1asQGdnJz755BMLz4D533V1dXjyySeRnp6OiRMn2sxEzfc3GAwWXoMx\nY8Zg5syZksfZuHEjvvnmGyxevBiFhYVYunQp3njjDQDAjz/+iMzMTKSnp2PGjBkoLi4GAJw4cQKz\nZs3C+vXrkZGRgYkTJ+LkyZOC/VJUVIR7770X6enp+OMf/4grV67gu+++w7p161BaWor77rvP5jsD\nBgzAggULkJ2dDZ1Oh5deegl5eXkICwuTtV0M6+fl0UcfxWuvvYaMjAycOXNGsq+N99CRa3/jjTcw\nfvx4PPDAAzh27Jjp89bWVjz33HNIS0vDxIkTsWbNGgDAv/71L3z++edYu3Yttm/fDpZl8dJLL5n2\nW7x4MTo7OyWvkdDL8fVIgkBwFHsz5ldffZW74447uLfffpsrLS01zcyNTJgwgdNqtdyiRYu4goIC\n0zGdmZF///33FjOujz76iJs/fz7Hcd0z5x07dnBLly7lOI7jOjs7uRdffJErLS3lPv74Y+6xxx4z\nfdf871WrVnEvvvgix3Ecd+XKFW7EiBFcZWWlxXGtv89xHNfW1sZNnz6d27t3r93jTJgwwTQjz8rK\n4jZv3swZDAYuIyOD27NnD8dxHPd///d/3KhRo7hr165x33zzDZeammryZGzdupV7/PHHbfqkoqKC\nu+WWW7jLly9zHMdx77zzjqmdQm02x2AwcA899BA3ffp0btWqVQ5vF8L6fj7yyCPcnDlzTM+FnL6W\ne+0XLlzgRo0axdXW1nJ6vZ576qmnTM/HO++8w82dO5djWZZrbGzkbrvtNlP/P/LII6YZ+d69e7l7\n772X6+jo4Nra2riMjAwyWydIQmbkhB7H4sWLsXDhQhQXF+P3v/897rzzTmzevBksy1rst2jRImzf\nvh319fU2x7Ce5aanp2PhwoU2+/3P//wPOI4zrW8eOHAAGRkZFvtERUXhu+++w5EjR0yzrWHDhkle\nw4oVK/DXv/4VAJCcnIyYmBiUl5fbvfZXX30V
N910E9LS0pw6Tnl5Oerq6jBt2jQAwMiRI5GQkICz\nZ88CANRqNSZNmgQAGDFihGmN3ZyjR4/i9ttvR0pKCgDg97//PU6cOCFrVknTNKZPn46ffvoJkydP\ndni7XMaPH2/y2MjtIznXfurUKYwaNQp9+/aFQqGw8DzMmTMHb7zxBiiKQp8+fTBo0CDB86SlpeHj\njz9GUFAQQkJCMHLkSEHPE4FgROnrBhAIrlBTU4PHHnsMAHDjjTciPz8fNE3jD3/4A/7whz+AYRj8\n5z//wcsvv4zo6GhkZmaavhsbG4vMzExs3LgR8+fPtziuQqHA3r17ZbVh8uTJOHToEPr3748zZ85g\n3bp1FtszMjLQ1NSETZs24eLFi7jvvvvsBqmdPXsW69evR1VVFWiaRm1trc1AxJpDhw7h1KlT+Oij\nj5w+TkNDA8LDw0FRlOmziIgINDQ0oG/fvggPDzd9TtO04LGuXr2KiIgI09/h4eHgOA6NjY2S7QeA\n+vp6vPXWW3jhhReQm5uLTz75BMHBwbK3y6VPnz6mf8vtIznX3tTUZLGfeT9cvnwZr776Ki5evAia\nplFdXW1aAjGnoaEBL7/8MkpLS0FRFOrq6kzPOIEgBJmREwKa2NhY7N27F3v37kV+fj50Oh2++uor\n03aVSoWpU6dixowZuHDhgs33n3zySRw9etSliOG0tDQcPnwYR44cwahRo6DRaGz2yczMxIcffojC\nwkKcO3cOn332GWiahsFgMO3T1NRk+vfixYuRlpaGffv2Ye/evbjuuusk21BTU4OXXnoJ69evR2ho\nqNPHiY6ORlNTEzizooiNjY2Ijo622w/mxzA32k1NTaBp2u65ASAnJweZmZmYN28e+vfvj7ffftuh\n7c7gaB9JERERgWvXrpn+vnr1qunfq1atwqBBg1BUVIS9e/di6NChgsd47bXXoFQqsWfPHuzduxfj\nx493uj2E3gEx5ISAIygoCCzLoqWlxWYbRVFYtmwZPvnkE9NndXV1OHr0KG699Vab/cPCwvDcc89h\n7dq1Trfn5ptvRn19PT755BMbtzoAbN682TRLjo2NRVJSEiiKQr9+/XD58mW0t7ejtbUV+/btM32n\nvr4eI0aMAEVR+PTTT9Ha2gqdTid4fpZlsWjRIsyfPx9Dhgyx2CZ1HKVSaWF0ACApKQlxcXEoLCwE\nAFMwmHmgoD3Gjh2Lb7/91uQO3rVrF8aOHQulUtoBuGfPHpSVlWHu3LkAgBdffBE7duzA+fPnZW0X\nQ+p5ARzra3vcfPPNOH36NBoaGmAwGPDvf//b4jzDhg2DQqHA0aNHUVZWJngv6uvrMWjQIAQHB+PH\nH3/Ed99953R7CL0DYsgJAUdMTAxuueUWTJgwAWfOnLHYplKp8I9//ANFRUWYMmUKpkyZgsceewyZ\nmZmCRhYApk+fbuFqBYTXyNPT03HgwAGb71MUhUmTJuH48eOYMGGCzfYZM2bg888/R1paGtLT0xEU\nFIQZM2bg9ttvx4033oi0tDT86U9/Mq2/AsCzzz6L+fPnY/r06WAYBrNmzcKyZctw5coVm+OfOXMG\nJ0+exHvvvWfR1o6ODsnjpKWlYeHChRZR2hRFYcOGDdixYwcyMjKQm5uLTZs2QaVSSd8UM+Li4vDy\nyy/jqaeeQkZGBk6dOoVVq1ZJfqe2tharV69Gbm4ugoKCTMd55plnkJ2djZqaGsnt5p4Na6SeF8Cx\nvrbH0KFDkZmZiQceeAAzZ87EzTffbNq2YMECrF69GlOnTsXJkyfxzDPPYOPGjTh9+jQmTZqEdevW\n4ZVXXsGcOXOwa9cuTJkyBe+//z6ysrKwe/duFBUVOdweQu+A4sx9aAQCgUAgEAIKMiMnEAgEAiGA\nIVHrBAIh4Hn66afxyy+/CG7bvHkzrr/+ei+3iEDwHsS1TiAQCARCAENc6wQCgUAgBDAB6Vqvrb1m\nfycHue46
Fa5eZdx+3N4A6TvnIP3mPKTvnIf0nXP4Q7/FxIQLfk5m5F0olQpfNyFgIX3nHKTfnIf0\nnfOQvnMOf+43YsgJBAKBQAhgiCEnEAgEAiGAIYacQCAQCIQAhhhyAoFAIBACGGLICQQCgUAIYIgh\nJxAIBAIhgCGGnEAgEAiEAIYYcjdSVVWJu+66DT//fMH0WWHhHhQW7vHYOfPycnD0aLHHjk8gEAgE\n/6ZXG3KGAS5dosC4UaxnwICBeOutv7vvgAQCgUAgSBCQEq2uotcDOTnBKCpSoqKCRmIii5kzgaws\nQOlijwwZMgxtbW04ffoUbrlllOnzDz74Fw4d2g8AGDduPB555HHk5eVAqQxCc3Mjxo69C99/fwaN\njY24dOki5s1bgIMH9+Hy5Ut48cVcjBiRir//fQNKS8+ho6MD99//IKZPv9+1xhIIBAIh4OmVM/Kc\nnGBs2RICrVYBlqWg1SqwaRP/uTuYP/9pbNnyBoyF5TiOQ1HRHmzevBWbN2/F4cMHUFFRDgCIiIhA\nXt5aAIBWewVr1mzAo48+jh07/oHVq9fh0Ucfx8GD+9De3o64uAS8+eY7eOONrSgoeMstbSUQCD0A\nhgF96SLc6l4kBAy9bkbOMEBRkfBlFxUpsXx5B1Qq186RlJSMwYOHmmbg1641Y8SIkVB2TfeHD0/F\nzz+f7/r3CNP3hg4dDoqiEB3dF9dfPwgKhQLXXRcNne6/CAkJQXNzE/73f+dAqVSisfGqa40kEAiB\nj14PdU42Qoq+BF1RDjYxCe0Z06DLyXPdvUgIGHrdjLymhkJFhfBlV1bSqKmh3HKeJ56Yix07/gm9\nXg+KomBe9p1lWVAU3walMsj0uUKhEPw3x3H47rvTOHPmW7z++ha8/voWBAeHuKWdBAIhcFHnZEO1\n5U0otFdAsSwU2itQbXkT6pxsXzeN4EV6nSGPjeWQmMgKbktIYBEbywluc5SoqGiMGzcen3/+CcLD\nI1BSchZ6vR56vR6lpecwePAQh47X1NSIfv1ioVQqceTI1zAYDOjs7HRLWwkEQgDCMAgp+lJwU0hR\nIXGz9yJ6nSFXqYCMDL3gtowMvctudXMeeuhR/PprDQDgvvsewJ//PA9PP/0nTJ8+A3Fx8Q4d69Zb\nb0d5+RU888w8VFSUY8yYO7Fu3SvuayyBQAgo6Jpq0F2xNjbbKstB11R7uUUEX0Fx5j7fAKG29ppL\n3zePWq+spJGQwGLmTAWysq6RZSUniIkJd/me9EZ6Vb8xDOiaarCxcXDHaLlX9Z0YDIOocbdBob1i\ns8mQnIKG4hOCfU36zjn8od9iYsIFP+91M3KAjwHJze1AcTGDY8d0KC5msHEjiQ0hENyOXg/1iixE\njbsNUaNvRtS426BekcWPpgmuoVKhPS1DcBOrUQPB7snCIfg/vdKQG1GpgIEDObe60wkEQjckGMvD\nsMLxPkE/lCJyyngyYOol9GpDTiAQPIgjwVgkD9pxGAahu3eKbg4qOQt1dpYXG0TwFcSQEwgEjyAr\nGIu43p2GLrsMSqeT3Cdk75dkcNQLIIacQCB4BDY2DmxikvC2hCSwsXHE9e4S9uOU6V9rbKPXifej\nx0EMOYFA8AwqFdozpgluas+YCgCirvfQnTuA5maPNa0nwKYMBKfRSO/TNWACYPJ+YMQI4v3oYShy\ncnJyfN0IR2GYDrcfU60Ocfm4VVWVmDlzGk6ePG4qX3rhwnnccccYl9u3YsUSREZeh/j4BJeP5W7c\n0Xe9kYDuN4YBXa4FFxICdHZ2/zsoyGK3zrsmgLrWDPrXWlC6FrBJ/dGWORu6nDzQlRVQvbYWlEAG\nLNXRAfrXanRk3Ct4ejVlQNv5XwTP6TEunEfo7vehj7wOiI72zjmlCAoCXVONoDOnRXdpy5yNzjR+\n0KR+cRlUW94EGhtBcRzo5iYEnf4W1LVmdE6c7K1WByz+8HtVq4UVPXt3wp
V5biuE8/McpX//FLz+\n+ha3HItA8DustL25rpQPSqcDm5Rsq/OtVEKXuwa65Stt8sjZ2DiwCYlQlGsFTxVcXMy7f83TSrrO\nj/1FiCorAxsXx58zN99z+aONjYgeORhUexsAQJOzAlxIKOrPngciIz1zTpnoVr0C0DRCvtgDurIc\noGmAZcEm9+++F4DdwEPd8pVuye8n+IbeacgFCg1g5gNA1kq3vwzOnPkWu3btAMMweOaZhfjuu9P4\nz38OgWVZjB49FnPmzMM777yNyMhIPPjgLFy8+DM2bMjH669vwfvv/xOHDu1HcnJ/NBM3I8EPMK5p\nG6FaWkz/Nq5vA4Aud43lF1UqsAN/Y/NZx9hxCBOJvKarK3njb/Y98/NTABRVVVBtK0DQyRNo3P+1\nR4x59MjBoLuMuBGqvQ3RIwejXvur28/nEEoldMtXovXhxwBwYPvFgm5uthHekRN4aHN/CAFDr1wj\nFwqwwaZNHguw+eWXn7Fhw+sYOnQYAOCNNwqwZcs/UFT0BXS6FsHvXLt2DZ9++hHefHMbsrJW4OLF\nXzzSNgJBNhKzOnNk63zr9eBCw0RDtizWd+2c32OpVhfOm2bi1lDtbcCF8+4/p1zMI/4njEHkI7Og\nfm0t2OT+NrNrOYGHhMCl983IPexiunKlDM88M8/096hRt+OGGwYhuEtlKTQ0FM88Mw8KhQKNjY2i\nM+2KCi0GDvwNQkJCAIRgyJBhTreJQHAHUrM6i/1kzvDUOdlQ/fMd0e3tGVNtZ5UibniAT7XSrXzZ\nrS7i0AN77W5vGzTYbeczwTCgyy4D4MCmDBS8JmvviD2PSHvGNIv9jVj3MyHw6HWG3NMuJus18jNn\nvsWFCz8BAKqrq7B79/vYtu19qFQqPProHwAAFNVdOlXfFUHKcZyp1Cn/t7CCE4HgLYyzOiFtb4v9\n5MzwJAbUnEKB1see6F7fNT9/XBwUVVWC3zOmWrnTRdw2OR2anBWS292KXg/1i8sQuut907IFp9Gg\nLfNhfj3cuHQgNSEp/AKtD//RZgBg7E/V/iJwWi3YhCS0Z0y16WdC4NHrXOu+dDE1Njbiuuuug0ql\nwk8//Yjq6mp0dnZCrVajrq4OAPB///c9ACAxMQllZZeg1+uh07Xgp59+8Fi7CARZSKSTmSNnhmdv\ndt86/xnb9W475/fI73fQYHAhoYKbuJBQwM2zcXVONlQFb4NuaQEFPg6AbmmBquBti6U/yQlJuRZR\nd4+xTS/rCjzEuXNoOHYaDcUn+L9JkYmAp9cZcru5rR50MQ0aNBhhYSosWDAHhw7tx4wZM7F+/RqM\nHz8RR458jeeeewotXaPwiIg+yMi4F/PmPY5XXnkZQ4eO8Fi7CAS56HLywMxbAENyCjiFAqwmHKwm\nHBxNw5CcAmbeAlkzPGcH1LrcfHSmjhTc5qnfb/3Z82BDQsEBpv9YY9S6O2EYhBR+Ibo55MsvTLEH\nUv1HAaA4rltcJzvLUgDGGHhI3Ok9hl5ZxrQ7ar0QdGU52IQkKGbej1oPRK33BvyhvF8gEtD9ZpG6\nCcdLlOr1iJwyHkElZ20PPW+B7Rqv1XdjclfA8OlnoH+tsXQRe/L3e+E8vyY+Od3tM3EAoC9dRNTo\nm0GJFELhaBoNx8+Ylg7UK7IE17xtv6cAONaUHqja/DfUXm11a9t7A/7wexUrY9o7DbkRs5dRTEqs\nz29SoOIPD3gg0pv7TcwIdaaOlE4j6/rNRqcOQu2VGii/OwM2OhrskGGBP8NkGETdOUo0r96QmIyG\no6e6r9N8QlJRDrAGUILftOLZZ1Gb/bLbmt1b8IffK6lHLgRxMREI3kciUItuagY6BNSzrIqrIC4O\n0SMHI3L27xCVNgHRqTdAvXxxYMuNqlRonyqsZAcA7dPutXxXda15NxSfQMOBr8Gp1fLO8/nnRGe9\nh9G7DTmBQPA6sqqiWWGt/YBr10Dr9Z
IBYYGILicPzNz5fNwButbjNeFg5s4Xjz1QqRC2+33Qdiqh\nmbhyRbCPCYELMeQEAsGrOBzoJlOIBrAMCAtIlEroVq9FfckFNHz9DRq+Po76kgvQrV4rudwgt38A\nAGo1EYDpYfgksis/Px+nT5+GXq/H/PnzMXLkSCxZsgQGgwExMTFYu3atSUCFQCD0MBwUJ5ErRAMA\ndFVFz5AbVanADhsua1dH+gcAQMlaSScEEF435N988w0uXLiA3bt34+rVq3jggQcwevRozJ49GxkZ\nGcjPz8dHH32E2bNne7tpBALBSxjdxOaZI2LiJHKFaACAjU90brZpDHyNiBDUKvdnHOkfAIBO1zMG\nOwQTXnetjxo1Cps2bQIA9OnTB62trThx4gTuueceAMA999yD48ePe7tZBALBm5gHatkTJ5EpRAMI\nBITZQ6+HetFfEDVqJKJuvwl9R9yAqNtvQtSdowKnVrcD/QMASE4mrvUehtdn5AqFAqquH9qHH36I\nu+66C0eOHDG50mNiYlBbWyt5jOuuU0GpVLi9bWKh/QT7kL5zDtJv4UBKrP3dNv8NCAvmI661Wt5Y\nt7d3R7iHhwOPPw7Vhg1Qyc0lb2sDkn8DdKkqAgC6crgV5VqotrwJVVgwsHGjg9fkAzb/DeD0wNat\n9vedMQMxcvqcYIO//l59pn5y8OBBfPTRR9i2bRvS0tJMn8tJa7961f3BLP6QIxiokL5zDtJvDpL9\nMrBwWXceee0128IiDgidRN49BkHmRlwAwyefoWHhsoBwsyvT70Pk1q2iueQcgLbM2Qhbt448d07g\nD79Xv8ojLy4uxltvvYWtW7ciPDwcYWFhaGvjSwXW1NSgX79+vmgWgUDwd8y1H7oCwthhIxw3tPV1\nUP5Yanc3ukIbMKla+uEjAFrcU8nGJ6Dl1Q1EvVIOF84j9I2/+bZMrQN43ZBfu3YN+fn5ePvttxEZ\nGQkAGDNmDPbt2wcA2L9/P8aNG+ftZhEIvROGsdTh7g0wDIIPHzS50aVgY+MCZz05ui/0w8Uj3dun\nzwgIz4JPaWxEdHI/9B17KzQ5K9B37K2ITu4HNDb6umWSeH1oVlhYiKtXr+K5554zffbqq69ixYoV\n2L17NxISEnD//fd7u1kEQu+BYUBXlCPsrdcRcnA/H8GcmIT2jGme1yv3JSZJ0y9By4zwDrRa3Y2F\nhxCZMRHK0nNA1zIlpwxC2x8f9+9ypeba/T7s7+iRg0G3t1l8RrW3IXrkYKDNf/Xpe7fWuhn+sP4R\nqJC+cw6395u9FCqjISv8AnS5VnAt1W7BEj/Bmb6TW2TEiF3dd2/hjJGrr4Pyu9Ngo2PADhlq8T2/\n+r2aD64qyn07oLxwHn3H3ir4u+AAUD/+iNqoBO+2yQq/WiMnEAhuxKhDfucoPoUqdRCfQjXOMoXK\nJHMqYsQBPq+7R7rZJdTPOKv/DHHxYJ74k++NuJW+vE19cSmi+0I/KQ3sb2/2a4+CtfSuqfSqD6R2\nQw/sld7hC/ESs76GGHICIcCxNtCUga+CpdBqu1+Kzc0I3fme3WOJaZ27BYYB/UMp6B/OeX2wYE/9\njI2KQuvsR/ic9m++g27NeseMuAdiDfzJyHkEicGVLwaUbZPTpXfo0jrxR4ghJxACGRk62yFFhdBk\nPQ+6pcXu4dh+se4P7tLroV6+GNGpNyBq/B2IGj/a69XKpPTdKQCKhgaodu5A2D8KHBeUcXbWLIWf\nGTlP4EzxHI8yaDC4kFDx7Q884LciQcSQEwgBjBydbbqyHMFHi2Udrz19mttdseqcbKgK3gbd0uK7\namUy1c8cNZKemjX7nZHzAA4Xz/EC9WfPgw0JNS2zGKEA4PJlv/WIEENOIAQwUi9D0z79YmW9+DtT\nR0KX5+ZAN4ZBSKH42qJNtTIPpsPpcvLAzFsAQ3wCxCJ8HTKSHpw1+6ORczsSgyufZQtERqJe+yvq\nDh
0B2ydScBd/9IgQQ04gBDIyZprt6dPAJiULbuMAGGLjwMyZ65HgLrqmGnRlhfj2rmplgi7qpc+D\n/vmC+16aRn33w0fAxgtHHztiJD06a/ZHI+cBTIOr5BRwCgUMySl85oSPU+VojQb0tWbhbX7oESGG\nnEAIcEwvw6T+vEtQoQAHCobk/vxLMW+NqFFomzUbDSe+h06O4pcTs2U2Ng5sQqL49q5qZYIu6m0F\niBpzi/C6sysz9+i+vDiKAI4YSVmzZrkBfgLX469GThBn74cjxXO8SKB5RHqo8gOB0Ivoehnqlq8U\nzSOXLBtq76XpSq6vSoX2qfeK5m+3T7u3q13CLmo++v6K6fu6nDx+jXJ/EaKuXHE679iRMqqS1yZW\nVz0tHerclQjd9T6oriBDTqNBW+bD0K16pbutdvrW4r76S2lV87z24GD35IEbpXeFzuGLa5a6t37o\nESGCMF34lUhCgEH6zjl80m+OviAZBpqlzyNs107bTVLiMdYv+xeXIXTXTlAt/PVymnC0Zc6GbtUr\noLVXEDX6ZlB2JFMNySlonzQFqu22Fb6cFrJx1WCYDLHlgAAsC1XB28KnNGurmEiNJ4V5nH7umpuh\nWbEEwcX/D3RVJdjEJLB9+iCo5KzNrswTf+JT+ISQ6nN/EoixurdUcjKYKRk+VT8UE4QhhrwLYoyc\nh/Sdc/h1v8lQgTMkp6Ch+ISwepzQi7ijw7ZaGQAwDKLG3QaFHdlUjqbBxsZBUVUpry3exNw4AYi6\ncxQU5VrBXQ2JyWg4eorfT+S6PXk9Dj93Xfc0dOd7slIYAX55p/WxJ6DLzZftfQB8M7CxS9e9jU4d\nhFqdwTdt6IIouxEIBNnIUYETCvqRTMcSq1YmMzWMjY0TDTLyeQCSWVU2uQF+gZJiZrynco04wIsS\nqbYVWKRq2U3V89fcefOKe34KMeQEAsESGSIzgEDQjwsvYovALpF92jOmuicAycMV3+QG+LEREaJt\n9puAKpnPghim+y7j2QiUgY0/Qgw5gUCwQI7IDGAb9OPSi9g8evnot2Ce+JNttHZuvmspWZ5SYbOm\nK8BPjPaMqVCvfglRU+4GLbBMYNzHH2aAcp8F0e933Xc5zwYfKS48AGITEv1jYOOnkKh1AoFggTH1\nRmjtlgPAJiajfdq9NhHeUt+TPcNUqcAOGgzdmvXQCQRFGc+p2l8ETqt1KNrc6No1YhEN7+b1V11O\nHsCyggF+AATXgTlQYJP7Ox4970Gk7qmRzuEjoPzpR1AG2/VjNj4BaGvlpX/tPRsqFdg+kVBobWML\n2D59/GI6T5rYAAAgAElEQVRg46+QGTmBQLBEYs1aP3gwGr4+bpnra3RVA6LfY/tEAMHBDrfDZm2y\na+aOc+eE847F3ObeXn9VKqFbvRb1JRfQ8PU3aPj6OOpLLkC34iWE7CsS/AobH4+G/V/5RR61CYln\ngdWEg5m3AI0Hi9H62BOC+1CNVxE1YSyiptzNG2MBTN4HhgHd2Ci4D93Y5Hdqav4EMeQEAsEGXU4e\nOlNH2nwedP481Plds0UBVzVYFp0jUm2/V3LWvRrV1kbejtvcZ+uvVgF+ku34tQZ0s7CamC+xEaZJ\nSkbrrNmo//6Hbk8GrQCr0Zg0ylllEP9xS4spsC2o5Cw6U0eKCtxIBQmaFAAJgvjJsI9AIPgVHR2g\nm5oEN4UUFUK3fCXUq1+ydVUXvA1Wo5H8ntz8dUfyu+25zd3i9ncD/tIOh7AjTGMsimMOpe8UPBTd\n1IyG/V/ZCBYBAdo3fgKZkRMIBBvszmDLLoursYmkKcma+ToTkCbHbe4v2uX+0g65mC9VCC11OBjV\nTleW80ZcKJ0r0PrGjyAzcgKhp8EwoMsuAaDApgxw/AXIMEBbG9j4BCgEjDmbkASAcziaWc6sypmA\nNFkR0QN/A92SbFDNTQguLgZdXemcLKsbcIs8rKeRqbDmaFS7vWcg
IPrGDyGGnEDoKej1XVKodvS9\npb5v9vLmRAYA7ZOmgE0ZKB7Zrgk3RWpbfM/erEpqZv3lF6JuebsuWZUamr/8b7e0aEIi2n43Cy15\n+UBEhHh7PIVSCd3ylWh9+DHYqNz5CXIHVHKi2s2x+wz4q768n0Nc6wRCD8G4Vkm3tIACX3CEbmmB\nquBtWYFm1spbRiUvPoiJ6qqqBoQc2Av16pfQnpYheJy2zNnyqnZZRZjTZZdAi8ia0hVaaLKeF3ax\nS0VW94lA9OjfImzXTigqyvnAq3Itwnbv7A7a8ybmSwcTxiDykVlQr37J/bnsruBIhL9E30sFttkl\nANTU/Amitd6FX+te+zmk75zDrf3GMIgae6ugKxwADEnJaDhySvzFKKF3zmo0gvKczNz5AE0j5Msv\nQFdVgI1P7M4vVyrFA9Zs3LaJYPtEgr56FXRFuagkLNCtuW3Td6aCHt1uc7ZPhGBBD1Of+ECf3R+0\nxO09d/Sli6JFbDiFAg3HTltWKhMpHGPS1+8hM2t/eM8RrXUCoQdjV9+7slIy0ExqrVMseC1kbxHQ\nqYfJ8lpbYJFZla3mthZBJWf5GbNoC7vOaT0jNM5wJ4xB6Ae7AApo+90sNBQeEI26N+J12U9/1RK3\nwuFa3FI1xcnM2isQQ04g9ADs6nsnJEgGGUm9vMWgy69AtX0rX1ily2VtUQRDCBe1u62Nr82goMtt\nrsnNsRuE5baUJpna7XRlBWiRtWTRQYWHdeEFcTZ6nBhtn0EMOYHQE1Cp0D5tuujm9qn3Sr9gJV7e\nnEheuBhSs0tXtbstjK/EoCD4aDEvDyqBnOA7SSPqYKpcWMGboh4H6+uif74A9dLn3aML78RgwEYE\nxtE1bg+1iyAMiVonEHoIUvrecl7AYqk/YFkbwQ/A1pNuxDzlyxpHo5ytMTe+kmlnVZVo+/0shO3a\nadsGTTjaZj8i3ifmtdgrK8AmJKJ96r02qVcOpcrV1/EDHLHrmjQFCA6GekUWHzugvWLRv07pwouk\nkGHz3+x/Vyh6HACtveL6erfM1DaCfEiwWxf+EMgQqJC+cw6P9Zsb8sgtApT0eqizsxD27jbBwhjW\n2AsiEwv4soZVKkF1zUBNA5KuNLqYmHDUltWIBugZklPQ8NVRqPPzugcm8QnoGDvObtqZevliwYEL\nM3c+dKvXdv0hHhxocf1Go7XnM9BVVYKDHw5Aw9FvEfbPd+z2iyMBeqL9/OyzqM1+2e73TbjZ8PpD\nwJ8z+MN7jgS7EQi9BZUK7LARYIcNd27mZL3WqVSi9X+fBgSimIVoT0uXPK+t27Y/n6qU1B+cQgFW\nw7+saL3eLI3uGkDTlobD3lpuRIRlENaRU2j5+1uWRtzavcswCN31vuAxQ3ft7E6Vk6ndblrDFzHi\nAMAmp4CNipIVOyA7QE8qFuHzzy2u15572zY48YpwLIQcV3mABPw5RH0dlMVfA/V1PmsCMeQEAsEu\nbGwc2KRk9xzMJsr5JBoPH0XDkZNoOHwEXGSk4NeEXvSy1nKFgrBE1rfpX34WjdKnWq6BLrsMQGZk\nt8zAvvaMqaCbm2XFDsgN0JOMRdBqQVdWyFvfl2N4m5uh+fP/IurOUXbX8yUD/iq0rmUReHvNva0N\nkRPHom/qIEQ+OJ3//8SxQFubd85vBjHkBALB/ktQYvZrTci+vfJeptYGVqUCQsPEK2AJzUalUp8k\nEJtlqt60t37MmdpqL7JbyphyAAzxCWh6YgFKHlsNXYS8rAG5muOSWQjJyQgreFN8lm32LEh6HrpE\neqJvGoqw3Tu7sxfEZuyQDvjjVCrnsgic0ed3A5FT70FQyVlQBgPvOTIYEFRyFpFT7/HoeYUghpxA\n6M048BK0mP3SNMSCa1yZWTmcw2zEkdQnqWj3b46DU6sFt3GacF5OtQt73gCpazHEJ2DR5JMYefDv\nGD2uD+6c0hdf9blP+LyAU8po
ogOvqVMRcmC/4KbQnTsQNeYWRN3xW0SNvRVhb20WTWvkVGqE7d4p\nKBYECHhQGIYf5IlAOTmTle36dyf1dVD+UCq4SflDqdfd7MSQEwi9GIdeguaz36+OiqalcaFhYJ3V\nMHc2h9kBt6q9aHexNL62zNmW57fnDZC4lv9Ez8TGdxOh1SrAshS0WgXSS17DwdRnLAcGT/wJDUe/\nle1pMEdsoIG//EX8+luuQVFZAYrjoKgoh2r7VrB9+oicQTpO2tqDYk+0CHq9aelCNj5ac1eWngPE\nAj8NBn67FyGGnEDorTj7ElSpLGam1lCMDlGTxom7N+0YXV1OHpi588FqwsGBNxesRsMH2zU3W37X\n3KNwx28RNfq3UC8V0WTvwt6sv2X1Ot4AJiaDo2kYEpP52fCqV8T7Q8QbIGRMm55YgD81rrPZ1wAl\n5jRtRMV+s4HBmvVgBw12LmhRbKCRnOyQ+I/i8iUwc+ZaXENr5mxQdoyktQeFjY0DGxtr52yOJVHJ\nDTp0N/rhIwCFQnR7yGefeFU/nyTtEQi9FLnlP21gGChPnxJ9kVMAFBUVfIoRy3anbNlLYzJPe6Np\nPlLdeMyu4i+hu97nz9u/P9RTMmxy3BVVVVBtK0DQyRNo3P+18Ay2a6YslAJlHu3ulgpcAvnYl2rU\nuPLPYMHdKytpVDerMVCo353FONAw+1vs+oWgdDq0PvYkdC/mWuSUBx89IqkHYONBUanQnjEdqu1b\nBfe3XrqQg93Kd+5Q7hMiui/0w4YLavlTAFTvbQfCQr2WTkdm5ARCL8Xh9Wiz2W/k7+4DKHvK6JYp\nW6Ju/BeXWa7T3zkKoTvfEzwe3dLCF/O4fBmqLW+KpooFlZyFOjtLtF0ORbsDrkdDm83aY2M5JCYK\np/IlJLCIjfWAtIeVF8Tm+vvFSc+F29otPQ9SFec04aLr+bq8NegckSp8CuulCznIWYrxUDR7Y+Eh\ndA4fIdpv3kynI4acQOitOLgebWGIOU6WOIwpZUvCjR+6a6eNXrpYAJXt8cX3C9n7pfiLVE60u4ei\noVUqICND+BgZGXr3SpULXAOeew4ALK6/8b1/SR8nNNTmI5vBQGIyWjNno/74GbQ+OR/o6LA9jlKJ\nxgP/j3fVx8WBoyj7Sxd2EB2UrXjJs9HsoaFo3v6+6IDWm0V5iLJbF/6g2hOokL5zDp/1m7kLOzhY\nvASluVGTUDLjKArgOHHVsq+PA6Fh4qUxIS73ag+p7wqW3HQATyqQ6fVATk4wioqUqKykkZDAIiND\nj5ycDreqlMq+BoZBdOoNggMoVqNBfcnP0oGGNdVgo/t2KenJVIATK3PrLFbHc/f9E/y9ylX4cxNE\n2Y1A6O0IzTBzsqHLybObhy0pMEJR4ARmbED3uqcz1dXkwKnFC7qwcdIV3yTxcDS0Ugnk5naguJjB\nsWM6FBczyM11rxF36BpUKrRlPiy4b1vmw3YL7rADfwN1fp5jaWDurpZmfjxvRbM7m2XhZoghJxB6\nCZKpZnZeqlKGmFOp0faHTMFtpnVPBwRlBM8h8nnbQw+jM3Wk4Daq6SrUq19yypXqrWholQoYOJDz\nyPve0WvQrXqFd1EnJYOjFTAkOeDy9jPpVW9Gs3ulUpwdiCEnEHoDrr5oJQwx3XINCA6xm7Kly8kT\nNLqy3OoUhdZZs00vSwwYYDp+4/6vwTzxJ7AqSyEXuqXFFEznKE4L0/gRDl+DMW7gyCk0HOe16eXm\nrvsqDUwMr94/J9UF3Qkx5ARCL8AdL1rdkmw+n1uAkH17oVu+Eg1HT6Hh+Bk0HBUwAh0doJuaJM8h\nNvNmk/qjZc0G08sS5851H1+phG7lywAtPCQwj5yXjZ+4TF3C2WuQ8s4IFJmhfygFqqvA9usneDif\nDHx8cf/cvVTgACSPnEDoBbgj35aurxPNHTfPOxcLLpNcZ7eD+cvX9LLUdQce0WWX7RY7YYcNd+
ic\nYvXZvekydRWha1DMvB+6rJWOHchaA6BfLNjrroPiShkonU7yq74a+PSE+ycXYsgJhN6APREUBwpx\nODUY0OsR9vbrdnPP2eT+aJ88BSEHDjj48rWXfONEco6AmEtAzMTNEbiGmJRYwBh9LTNy3BhfYURR\nXQVFdZXo/hwo/l760nD2hPsnE2LICQQ3wzBATQ2F2FjPBDE5i8szFBcGA+qcbKi2Fdg9RceYO6F7\nMddSRcx4XHOjA8s0HDZlIDiNRnBW7oximAXWymiBiPU12FPZM0dmOVZz2LhYNOz/Coju64bGu0hP\nuH92IIacQHATzc3AihXBKC5WoqqKRmKicG6w0dCLFNnyHG6YoTg1GJAwBMZ5srEAS+gH/0LwsSOW\nRkXA6GDmA0DWSosCJW2ZD1vItRqxqxjm7nzmAMBmht2VwUA1NaFlzQaLfnBmSYT+9VfQzc1g/cGQ\n9wJ8Ighz/vx5PPXUU3j88cfxyCOPoKqqCkuWLIHBYEBMTAzWrl2L4GBhLWKACML4G72h76xn2QwD\nlJXxbuLERA75+cHYuTMILS228aPz5rUjN7fDQgSkooJG//4Upkxpd7sIiNuQMnAOGD/60kVxMRiK\nQsf4CQj5z2Hb03cJd0gKe5gPSoziNl9+AbqqAmx8Itqn3SsuSCI1K+3osLy+QDD2MtsYo1bAMHSY\nsLgPwPfDtOkWGvhRo2+GoqpSdlMMicloOHrKf/vKCfzhPScmCON1Q84wDObPn48BAwZgyJAheOSR\nR7Bs2TLcddddyMjIQH5+PpKSkjB79mzRYxBD7l94qu/8wUVtbXwTElj06cOhrIxGSwtvyJVKDnq9\neAJIcrIBxcUMVq8OxpYtITbbjYbeb3DE7SoHKVU4hQIwGART0AzJKWjY/xWiptwtvC6vCQcXGQm6\nskLaCIsgNkDoTB0JuqmJv/aERLCRkaCbGkFXVLjeF57AwfsV0/wruEGDQNl59ZsU0PR6RE66C0Gl\nJbKb5A71O3/DH2yE3yi7BQcHY+vWrehnlqpw4sQJ3HPPPQCAe+65B8ePH/d2swh+hF7Pu6jHjVNh\n9Gg1xo1TYcWKYG9WBTSRk8MbX2Pd6PJyBc6dU3bNvCkAlKQRB/iKVmVlFIqKhF/8RUVKb+tlSOJQ\njXJ7dM0S2ydPEdxMiRhxgI+EV5aek66dXa51WNzG2C4xd39QyVkL3Xf+b6vzZGd5pBCHWFulzuXQ\n/dLrgQ0bANr+q9+oL6DOyZY04pzZf6xaA2bufOcC3DxU3KQ34HVDrlQqEWol59ja2mpypcfExKC2\nttbbzSL4EdbGU6tVYMuWEOTkiC+3eAKGgajxdYSEBN6lXFEh/HOrrKRRU+Os2ribcYdCF8OAvnAe\n6qwXTHKwIQf2oTN1JAxJ/cEpFPxM3A5sQhL0w0c4JO0q2EYBA+FKKhwAhL27ja9/7u5CHObIKdri\n4P1S52QDb7whq+ANXVkOuuyS/UA3mkbj9vfR8PVx1J/7mS9ba89bYbwn9XXdz8rYW/k+HXur5/q0\np8L5iL/97W/ce++9x3Ecx40ePdr0+eXLl7lZs2ZJfrezU+/RthF8h07HcSkpHAfY/jdgAL/dW/z8\nM8fRtHBbHPnv2Wf5dg8YIH5dtbX8+bx5fQ5ftELBbxejs5O/2P79xTvjqac47tAhjqMoeR3Hcfz/\n5XY2TXe30dieAQP4zwcM4P/u7JS+Ic7eZHcjdt3m53Lkfkn9uMQezLNn7f8IHPlhWj8jCoX4cf/8\nZ7d2Z0/GLxZ5wsLC0NbWhtDQUNTU1Fi43YW4etX9rhd/WP8IVNzZd5cuUdBq1RAS7tRqOZSU6DBw\noHfCOpRKIDFRBa3W/uxRCI2GxezZncjK6oBOB0yZIrxGrlLpceONFGpqxCPdvYZSgyiRXHFDQhIa\nOmnQJ/8ruP6sXr5YMGrc4hh7vkTDnxchKilZItgqmQ9Sy1
rJ5ztnrYS6taM7Uj4uAYrmRuCa7TPH\nURRa89ZAl7fGJjIbly8DmzaBae3gg+imZAiukTuD4ZPP0LBwmfuCuxgGUZ98CqEnz+Jc9u6XUmPK\nGacvXUSUViu70hwzJQO68BjR41vspzNYCPSIYfOMSHgG2O3/QP0L2X4TMOcPNsJv1siFGDNmDPbt\n2wcA2L9/P8aNG+fjFhF8RWwsh8RE2+hmgHdRx8Z6LzZTqm60EBoNC4WCQ1KSAbNmdeD773UWFa1y\ncjowb147kpMNUCg4pKQAUVEGlJYqUVXl22UEExLSlmyfCERNuVvYzcswCN31vt3D05XloJubRc/R\nNms2L++6fCVo7RXeNWytZX30FDBnjuD3KYMBqu1boV6xxK7LWajYhVgBFlnX5UY9cdmSug5Ikcqt\nQMcpFGDmzOXXuaWeB024Y8VBZD4jRnhFvkuy9+/NeD1qvaSkBGvWrEFFRQWUSiViY2Oxbt06LF26\nFO3t7UhISMArr7yCoKAg0WOQqHUJfJAm4+6+W7HCf6K7retGx8fbRq1rNBwyMzuxdGkH6uvtR9kb\no/ELCjTYulV4H2Oku08mI6Yo6O5ccbZPBIJKztrsaoxOpn8oRdT4O+zO9kw1msXqoK94CerclXYj\nsGOuCwMzfwHC/rldcL3XEJ8AuqZaOOXNuk65dX32FUtEj8tBuMiL22tPO1LnWuB+CdaUh3ikvjk2\n/WN9/LgEdIwbh5bcfCAiQvYlyX1GTO0A0PbgLLT8/U2/yBDwBxvhN+ln7oAYcgHcnTLkAO7uO2vj\nmZDgY3czpPPIU1IcT49jGGDMmHBUiqTmKhQcjh3z3jKCIEYDFxEhmgJmNCp02SVEjR9t9yVtk5Zk\nNfCUzBk3+15MTDjqT/5XPD+dpsHGxgnmPssxuuqsF6DabjvK6hw+AkGl5+xflxuQ2xfdG2QM4vV6\nxKx5CYaPPwFdLuxmF+0fFycJ9A/nZD0j1vhLKps/2Ai/dq0TXMetKUM+RqkEcnM7UFzM4NgxHYqL\nGQsXtS+wrhutUgHDhnEYNsy5HPeaGgpV4lLV6NfPu8sIgnSlcdHNzXbdvEaJVCE4oLu2tbUb1jxV\nzMEIbMlSlYnJaE93vvqVLm+NYI3pxr1fea32tMN1ruWk3SmVwMaNaDhyile8E0C0f1ys7iX1jEjh\ni3rmgYbv/RUE17HzAtQtX+k3ASOOYDSePZHYWH6N/PJl4e3p6Xq/uWWyiqVISaT+IRMt+RvtK8DJ\nWBe20My2o/2uy8kDgpTOactLyNl6rRCHJ4t+qFRo2fA6uIg+3qsOJvGMsAolKINecLYueO8JFhBD\n3gNw+AVIcAl3KM6pVMCMGcCmTbbbUlP1yMvzI6U3mcVSdKteAWhavkSqFc5UV5PUfneHIRQruGH9\neX0dlKXnoB8+wv2FQtxd9KMrh5uNjfN6dTCbZyQuHh3jxqNlRQ6i0idCIfAe80k98wCDrJF34Q/r\nH07jSGCMBwjovnMAa7lWV1PFrrsuHE8/3W46XmwsfzxfLyMI4kBAlStrqY6skVs8c77SQm9rQ+TU\ne6D8oZRPpVIooB82HI2FhwAr4Suf03UPVfuLwF254lu5WYH75XBMgJfxh/ccCXazgz/cJFfw5Y8g\n0PtOLu6Opjf2mz9oysvG0wZT5oDBX565yIljBaP5O1NHovHwUR+0SBz10ucFS8n6i6F0aLDoA/zh\nmSOG3A7+cJNcwoc/goDvOxkwDDBunLA4jClVDI4Zud7QbyYcHQDY2d8v+q6+Dn1TBwmnqSkUqCu5\n4L163FL9pddDnZ2FsHe3CafqecFr5xB+WmnOH545ErXe07EWzSg+wY+y/WAkyzC8YpunAk89fXyA\nnzGLaaXXVLAIzloqrYltDcMAv/zSc6JxxQpeyNELF8LFCGlZbXMRZek5cWUyg4Hf7mlk9K86Jxuq\n7VtF9dXdLWbjMu6897
0EYsh7Gn70I/B0FTOp47vbuEspzm1WLUL87jdkV58yvngxeLBni254AzuG\nxKdpkc4OIuQefvgIQKz4i0LBb/cwdvtXIqPFCAkmC3yIISd4DLEqZgsXhrjFwIodf8qUMLcPHsTk\nWsPA4H58LvgdsepTxhcvfFUW041IGhJ3VFLzVNvcQXRf6IcNF9ykHzbc8251Gf0rp8qbnLx6gn9D\nDDnBI0iVAN29Owhjx7pmYKWOX1Ki9EgJVGut9ORkAxZmXkY0oxXcn64sR+Xpmm57JfHi9UpZTHdj\nx5DQZZfl6YV7rG1fiLTtS7cNIhoLD6EzdSRfmhX82nhn6kg+at3DyEo7lRDNsdBUJwQ0xJATPILU\nmjJAoaLCNQMrfXxbioqUdt/dou74rjVWZQdjozj33KvRoi9KLZIx9nfXm7wCXKX4i5cyGEBxXEAp\n8tkzJAAnrrzmYXcuXVEOWisywNJq3TeICA1F4+GjqCu5gMaP9/D/P3zUK6lnksp2ZkI9YkVPWv84\nB7pXN3gnjoZhQP9QCvqHcwHndQoEiCEneASpNWVz5BhYV45vpLKSRk2NsMqz+Vr7HXeoMXq0CkuX\nBkPfJrzGqgrWd8u1SrwoPzbMgI5Tm7wCqwr6y6o+BQSGLKVdQ5IyUHZlLlnU10FZ/DVQX2d317C3\nXhfX9FbQYB0o9iGIdQBddF/ox433XpQ6IF35bPJkfrAiUOUNAwbwKWd5Xkg50+uhXr4Y0ak3IGr8\nHYgaPxrRQwdCvfDPgeF1ChCIISd4BLklQKUMrDuOb0SqBKr5WjvHUaiqUmDbthAcvGmlrDVW6xfl\nFcUAvIZnsRjrLPb794EItEwSfvFa43eRxELIKKHpsF64EG1tiJw4Fn1TByHywenomzoI+O1vgbY2\n4f0ZBiEH94sfz2AA3dws//zmeDiAzlFs+jepPzpTRyLkwL7u9uVkQ5eTZ8powblzXstoUedkQ1Xw\nNuiWFlDgK8fRba1Qvf9PRKUOEr+HBIcgeeRd+EOOYKAi1ndGJbTCQiXKy2kIFYB0pVynUJW0Pn04\nlJTYvqDERFvE8sPDwOAchmMgymy+I1UdqvJ0Dcb+7nroOLXN9xQKDsf+XxNS/7kcIUWFUFSW82Ux\nAyG3Vwy5+gUu5AY7KrpCX7ooWhUNAAxx8Wj45jun+tZv1ce6+jfsrc2CVdvM2+e1dx3DIOrOUVCU\nCy9xAEDnsOFo/Pobz7fFDfiDjSB55ASvY6xiduQIg8xMYeWzjAzni4MIVUnbv7/VJiBt3rx25OR0\nn998LbymhuoaZFgSjyokQzyITXC2rFIh8paBiEoSXh9NSGARm6gw5fvjp5/Q+tgTgvsGTCSxXP0C\nZ9Mi6+t4+VOhU/9QKuhml3L5A0D71GnO9a2Po/AlUanAxsYh5OA+wc2+aB9dUw26skJyH7F7SHAM\nYsgJHkelAjZssI34tjawRhzNATcvMSpVAlUo7/ytt4IQG2s7c6tCPLToL3g+qUAtKZe/xaBFpYI+\n5Xq8QG9EgeYvuIgB6IQCtZoUtMxdgNoleR4XuXErHtIvcEp0RcLl35k6ErrcfKfaIidK3Jf4W/vY\n2DiwCYl291N+d9oLrenZ+F72i9BjsdYQz83twPLlHaK64u4sSiJUAtW4Fm5Eq1Vg+3YFhg/Xo9rq\nHdcKFT7DDCyEbXkye7Nl4+DE3OVvvA5zFi0C3ipQA9iEMLyCeFShqiUe138TjKYJlHgf+KmEpScw\nia4IGXMJ0RWbqmj9YtGePo0P8HJybdiZ6mzexO/ap1Khfeq9gksR5rDeDBDsoRBDTnA7UgZZqsa4\nkKHdsoVfu3amKIk5Unnnzc0UoqIMaGiwXCdfjHUYmWrAhKY9DtVrNnoFpAYtDAN89ln3361Q4SKu\nBwCUlHR/btEHOUzXevSXoCvKfVu9ylt0ia4IrZFLiq54opa3zHKuPsMP26fLyQM62hH2
j23CWQRB\nQWCHDPN2s3ocPfTXT/AlzhhkKUNbVKTE8uUdLr2HpPLOxQLxhqUCI/avRkPHCqeMgdSgpaaGgkia\nsyBFRUrkd2ZDtb37JW2MogfgH9WrPERj4SGbUqHUyJFo/LdEZLoRN9fylqx/7gf4XfuUSujyNwIs\nB9W72202tz76hO8HQD0AErXehT9EJAYq5n0nq0qYwO/20iUKo0erwbK2BlWh4HDsmE7UKMpBql0K\nBQeDwb0R9XLac/fd4bh8Wd7+GlqH2tjhCK3yTc15v6C+DsrSc9APH4GYoQN9+3v19+UNifb55F1n\nzG74cg/oqkqw8QlonzY9oLxJ/mAjSNQ6wStIzXylcsajozmoVMKGWioHXC5SQWhisVTO5rjLbc+M\nGfL3vym2AiE1/hPI5BN8Iboihh8VJxLE39pnzG44+i0ajp9Bw9Fv/aY6Y0+AGHKCDc5UDmMY4Icf\nKDQ28oZXCCmDnJ8fjJYW4cfRlRQ1c4S00ufMaUdysuPtdQfr1sGmPampwoON/8mI8ZncKYHgNvxt\ngGncYUkAACAASURBVNFDIMMhgglnosabm/no648/VkOn42evSqWw8RMzyFLr4xoNiyVLXAt0M2Ie\nhFZb1op4VCE4JQ5KZaRpDV9Oe92FUFBccLCtyE1Ghh7ZOUq0K4UDmVomORbIZJ1NQCAQAhtiyL1E\nILw87QWpmV+D0eDs3BmElhbA3Lmj1/MGXaNh0dpKiaZfGZFyx7e2UqivpxAR4aaZsV6PmNXZSDKL\n/F6fNg303LX4Ym+oyXhOnWrbXk/dQ+ugOLGId2PAUnBRIShtOSoUyfjUcB82HsjHlCCIDriM7Y6O\n5pCf7570PgKB4D+Qn6+HcWdutCeRmhUXFirR2QkcPNh9DWJSqOaEh3P48ksGKSnShs9YAEUoEM3d\n7m1jjWojCu0VaArexIOpCuzhNoLjAOvwT1fvoTMDAMGI9651xiX6PBRta0CVIR6tUAHlwJYt/C7m\nWQHW7VapOIvlC3em9xEIBN8hukb+73//2+LvajPFjNdee81zLephmBfkcHd9bHdiLz1r+3bLa7Bn\nxPlj0ggNte/1lQpEu29yMzQ1F90jLykhsfmbki/QUNEOjususfr888FgGF4Nzpl7KKQk50oN9q5L\nwL8PROAirueNuBnWleSsnz2xGAShCnTOxEkQPIB1lTUCQQBRQ/7RRx9Z/L1kyRLTv7/77jvPtagH\nYS832p9+m1JlQRW2E2VZxMfLn01bB6INSGrHgdQ/Y+OBm9xWZUpKwjIZWsSjyuKzXbuCMWKEGtu3\nCxtse/fQE4O4mhoKDeVt+A1+QRgsT24eZS/17Flj/j1PDD4ITqDXQ730eUSN/i2i7vitz6usEfwb\nUUNunV4egOnmPsfZVCxf4Ex6lj2mTZMfLGatkf795IWYVPI6lHZKiDqCVDENLZJRhXirTynodDQ4\nTvg+Sd3D+npgzx43D+L0egx5awlK6VT8hME4hxHYgOegAH/fzJchpJ49a8y/FygeJK/hixmxXo/I\nKeOh2lYARVUVKI5zy/NP6LmI/tIpyn+MTKDCr4kKD4DCwjiPpjY5g6PpWcJw0GhYzJ0rXBDFHioV\nMDBWB81BD1SZkiim8Tlm2Liq7REWxiE62vIeNjcDf/5zCCZMUKGqyr2DOHVONvpsfxP9DZehBIuB\nuIyF2IS1WATAMspeysNijfF79uIk3LS6ERguex/WHVdnZwlK0gJ+UGWN4JfIDrcihr3nI6YRrlRC\nMD1r0CA9GIZCdbUC8fEG3HGHAQsWdOD6612L6uYqqwGtuPhJR1k1KkKvtwgekxtQJiRh+VWf6Vhc\nss7hdra00MjPD8by5R2orKRQUBCEDz4IEl2LNuJUAJ/E+v6Dis/xy2MrkZ3T/XM2eliE7ptYNoG9\nOImlS4OxYYNzQZqBEvRpRCgo0ityuAyDkL3C9xkA
6Aotr9jmRtlZQuAj+hMqLS3Fww8/bPr7p59+\nwsMPPwyO43DhwgWvNC6Q0euBpUuD0dIiPADS6SiUlVEYNsy/ZuWAbcS0eTUvY/QzAPzyiwIJCSwe\nfRRYsYJBRITlcZxN11pV0B8voD8G4rLNtrqwJEyY/Rv8UqVGYiKLtDR+hrRvn0wDIVBMY0SwCk/m\nGPDll0ZDJn/QunNnEAoLlaJ67UI4k59ub33/pflXwCotX+5iVdiWLOlAfb3tfZHKHgAo7NoVgogI\n5yLcPVkQx+3YqTuuW77SY4ImdE21pEofGxtHxH8INohqrZ88eVLyi7fddptHGiSHQNBaX748GAUF\nIRJ7cEhK6s5X9sdZiTUMA2RlhWD3btv10nnz2k0vZFdmX0ZN9IXaFwRLiL6GZ/E8Ntptq3l75CJ1\nfa7DIT6exfTp3f3g0DPHMIgad5tgiUp7WuuODKiMEfpiOKM/76z+vhSe1L2mL11E1OibQbG2SxOc\nQoGGY6c9NyOWuM8AwMyZC92rG1w6hT9ohgci/tBvYlrroq9VXxrqQIdhgF27guzsRaG83M2zEi8U\ncjh2TDiE3bxCmSuzL6N7dzF4V/cMfI5kaKFFMj7HfabP7eFMxTSVCnjttXb06cPZeB94z4rzy0vx\n8SwOH2YQHc0/H1otBbXagQO4UKJSqgqbNTk5HWhqorB7dxCErte4vu9IARs5QZ+uFMRxNz6t6y1x\nnztTR0KXm++5cxMCFlFDPnHiRIt1cY7jQFEUOjo6UF9fj9LSUq80MBApK6NEXepCuFym01hZyMN1\nquW8kGNjObvlSI3HEpohdrt3lXgeG5GN1YhHFaoQj1aEQa4xddZACMUJAPwyya5dUh4WaaZP16NP\nH37Gaxwk9O8PTJkSLNsjo8vJQ2cnELK3ECG/dpeorF2Sh5pL7lGcUyqBNWvaceSIAhUV7hHoiYjg\nEBvLoqrK84I/bsEbdb3NKrlZF4GxiePoF4v29GnQ5ZEiIwRhRJ+Kw4cP23x28OBBrF+/Hg8++KBH\nG9XbcHVW4q3AHDkKbFLGvqKCRlZWCI4dU4i63K2DtFqhwkVcD4AP0pI7QBKKKHcE61nshg0d0Gh4\nT4uxDRQF0dQ0IxoNi9mzO5GT02Hjqbh8Gaa/7Xkq+OUKFYoO/h0N1Wvx27gK3HhPDDpYFfZNkF7C\ncMS1Xl8PlJbSmDhRj/fec01/3nyJRSyC39N69s7isbrebW02tdX1w4ajsfAQEBrK7yMQx+GXnUTw\nG2TVI798+TJyc3MRFBSE5cuXIzk52RttE8Xf18gZBkhNVduNXjbiUt1rF9ZOnUFsDdW4Ji21Hsob\nYts+sV7PNjcA5kFaLAs7cQfSx3UVe+vH3fADnnHj9MjN7UBEhOvrxPLP3X3djsQqtLUBU6eG4Ycf\nFEb7gj59WKhUQFUVbRHhLndSKN5mDsnJrkWte2290s3LVZETxwqmlnWmjkTj4aMuH18O/rDWG4j4\nQ785vEYOAAzDYPPmzfj666+xePFijB8/3iON62moVEBmZqdso+PKrISuqQZdrhXe5oFUFaFI6Jkz\nFcjK4j+XSnsSw3ppQSwNTq8HaNry3Pfco8dHHwmnfJkf19WCJ1I51goFPxZOSGAxaZIec+d2IjHR\n8jyurBM7otIGdF/36tXyYxWmTg2zkN01GICGBgXi4vT4+GOdw/0m1ea4OBYFBa0YMoRDRwcfL+C3\nxYSMZTelkGvs6+v4mbgAyh9Kgfo6/6i1Tgg4RN8OX3zxBV5//XXMnDkTn332GZRkbcYhVq3qAE0D\nX37JuxXj41mTctq+fZYzTaNxdMbYsLFx4NRqUHwJMgs4ldrtgTlCRjYlJRy1td37CBn7sWP1+OAD\n4WhwMUNm7d4WOndNDYV33xU/bmUlhX/8I8jl/GUpQ8xxwIcfMrjlFtb2vnW95OMi4pCYGOZUYRhH\nVNoA/rrLyii7
sQrGtpaXA6WlwgOvn35SICLCcSMr1ebqahoZGWpTIKFORyEpyb/zygVxMDZFWXpO\nXCbRYODXzMeRyRLBcURd60OHDsWAAQMQExMjGPT27rvveq2R1vi7a90cIeNs/ZlLYhkMg+jUG0AL\nGHJWE47yUxdQ3az26IxHrO/MrxNwfwqS8RxSx508WY9t28SXAtx1Hpv2C7zkv+pzH9JLXoPBavxs\nviwhNJCTOrcQyckG7NjRigkT1GBZ2zV8hYLDsWM6JCdzyMkJxqefBqG2Viwqn8PHHzMYN84RdT/H\n22xE7n3xBzenekWWYEAcM2+BcGxKfR36pg4CJWDMOYUCdSUXvDIj94e+C0T8od8cdq0fOnTIY43p\nTQil/tgKrjifrkXXVIMSkWzkWhg8PLEJx2r6+URJy/o6xVzuriwtSLnyJ03S48ABebNSV84j1H6h\nAMRJ2tdRNAKYWbbJlM6m0bDQ63ndATFRG0eXKzIy9EhJsR+YaP3cCaFQAMOH2zfi1oMQZ5ZYADdk\ncHgLZ0RjovtCP2y44Bq5fthw4lYnOI2ovy4xMdH0/8TERNA0jQMHDuD8+fOmbQTXcbVCmlQhkDIk\n43RVkt8UvxDScp83zzlNdjnHnTu30+WiNeba4LLbL/GSv6lsDwwtbTDOfltaaGzbFoKCAulCJULn\nnju3HXPnCrdHqgiO8XM56+7DhhkQHS2+XapamnmbKYoDYD+LwN+KCYkhpbRHV5aLqrM1Fh5CZ+pI\ncAoFOPAz8c7UkXzUOoHgJKKu9R07duDzzz/Hhx9+CIZhkJaWhttvvx3V1dWYMGECnnzySW+31UQg\nudbtcekShdGjpV2g9tLSxFx8QipoJjcw3BeN62jfuRp4Jve4rkSKNzfzBqq4mI9xMJ8ld3RIt19K\nGawTCgzFT6aUOimE2ihnqcaIWPR/Tk4HLlygcPfdapH0OQ4UBYwYYUBhYaspK0oIe1kMxvYtWhSC\njz6yP4iUu8ziczenq9kiEnnknsbnfReg+EO/ibnWRWfkn3zyCbZv3w4A2LdvHwYNGoR169Zh+/bt\nOHDggGda2QuRqlIlVyxDtyQbrZmzYUhMBqdQoC0+BZvwF0EVtJoKFsFZSx2r6uTmUo5Gl7s7jLj5\njNn6uMHBvBiJEGlpwu584wzzppvU2LUrBBUVtrNke+13vFyqMMbZqdA1AuLXbcS6NGxxMWPKZ3/4\n4TBIJZ7GxbEYM8YguQzjiDfpxAl5LvbJk/Wma/ZrJCrpyRKNie7LB7YRdzrBDYgacrVaDY1GAwA4\nfvw47r77bgBAUFAQQqWG6ASHsOcClXwfGEstThiD0A92ARTQ9rtZ+HX/UaxP3mATVAUAm1WLEL/7\nDSjk1Pn2YSlHe0i5dI3k5ATj3DnHAgKM68ZiGgBffimjnKfES/6w5j7Z5VITEli8/XaQxTUuXx6M\n5culr1ugOSZDb7y+8nIFxFXyKFRV2V+KkZNSZ28/3t3OISHBgNRUPqZB7nX5Gl1OHph5C2BITgGn\nUMCQnMIHurkqGkMgOIioIe/s7AQAGAwGHDt2DKNHjzZta21t9XzLehHOrh2rs3mXuskol2sRtnsn\n+v49T3BwEAYG9+NzwWMJ1Tk2BmzJMvpexmiQxNaVGYY3umLs3WtrkOXkaxvV6fR66draYi/505mr\nZV9jnz4ctm2zvMaCAvvr6WJIX5/w9FwqTkOuN0lqP4WCV8hrbqZQUqJ06rp8RpcCW0PxCTQcO42G\n4hN8tHrA5M8RegqihvzWW2/FM888g3nz5mHgwIEYNGgQDAYDNm/ejP79+3uzjT0eIRdobq5EdLle\nD/XS5xH27jbBzSFFhchZ0mgzOFiYeRnRjIh4jHWAjp2oXF/6PuW4dGtqKFRWiudeCwVVycvXprB7\ndzCmTAmTnhWLvORfXMWa3ReIBq098UQ7Ghsd0+u3d0sczUcHpIPPVCreFS6EuT
dJyutkMFDgOErU\nAyLnunyOUTTG70PtCT0V0aHjCy+8gC+++AJNTU24//77AfA55BcvXsTKlSvd3pDVq1fjv//9LyiK\nwvLly3HjjTe6/Rz+jtwqVeqcbKi2FYhupyvLEVxfjdxclaU6GqLBHpVX1UlOVK7HSjnaQW7xloQE\nVrDwB2A5YzQafpWKg0rFydJzN1dBk0wXtFIG6+gAnnyyEwsXdiAoKBxKZXdg14oVlkI3//yn/Nmo\nUQQmNBSiQXhSWvkKhbBWiVCcBsMAFRUUCgqCcPCgEgBn+r55aV5zrEWCAN6Iy7kuf6uORiD4G6LD\nc4qiMH36dDzyyCOmtfKamhqsX78eERERbm3EyZMnUVZWht27dyM3Nxcvv/yyW4/fo5CYKRsxN8oW\ngVAOBOhIBWx5vJSjHeS4dFUqYNo08QXWqVP1CA62XGcfPVq+Pr4QUrNH6zX9KVNU+Pvf+YA8I+b3\nSuoahQgL4/Dww2GS68tSM+Nhw4QVx8xn1ubXMHasGtu3h3QNCqguo0xh8mS9oDfJ3Ov0wQcMBAL6\nBXG0Olp9PVBcTKO+XvZXCISAx6G31rJlyzzSiOPHj2PSpEkAgBtuuAHNzc1oEVAqCyjcHOltRGqm\nbEQqalZOgA7DAJdq1GiZ5EJUrgeRGyCYk9OBuXPbodGwMAZVaTQs5s5tt6hGZlyXdcWIA9JuaKE1\n/U2bILoGLHWNQrS00Cgvt7++LBaPUVjYajdOw/waxALlDh6UdoWrVMAtt7BISpJnyeWKBbW1ARMn\nhiE1VYMHH1QhNVWDiRPD0NYm6zQEQkDjUFSGjEJpTlFXV4cRI0aY/o6OjkZtba3JExBQWElz6hOS\nUDf2XnTk5UIV4XoQjHGmLOQe5xQKtP5xDnRLskFfuiicIy5RItFaKrZ/wmvYmqrA3U17oHBnKUcX\n0esBlrUsa6rRcMjM5MuFmudVr17dgRUrOlBWxu+XktKdY+5IIRI5iM0e6+uBPXscV5gT0qxPS7PU\n64+LY9HUJDwIETq2WEEaQPxzQH5/yXGFS6m+aTQsWlspmzoE9hAq+lJSosTUqWE4fJgE5xJ6OJwD\nfPjhh47sLpvs7GzuwIEDpr8zMzO5S5cuie7f2an3SDvcwrPPchxfR8Piv4LwZ7lnn/3/7J17eBTl\n2f+/M7M57OwSYgKEnAhY2wrEI62CkCpyEIKKrSK5sBZBSvV9a1FfBQ35QURCgVIsaq1SBLHKG0Rb\nsRIUNNYioFbf2hoOVgXC5kDEQAi7k2SzO/P7YzKbPTxz2uxudpPnc11cmj3MPvPs7NzPfT/3/b0l\nqbMzep8h3XOP/NywYZLEMPJ/TXyo2mEf+i+XJH31lSS5XBEYfM9RG+d998nPDR8uSSwr/1ft9L/6\nSn4N6TjB/9LS5KnkOPmYl19Oft2iRYGf0dkpP5abq35sjpPHooWLMP3KY59/rn4eHCdJX38eme/O\n6Hzl5UnS6dP656HMzfDh3fO6aJEknTtnfrinT8vHUJsDtfFQKH0F3X7kjzzyCFavXh3w2N13343n\nn38+YouJp556CoMHD0ZJSQkAYNKkSdi5c6eqRx63ym4aak/HMByFOIQ7F3I975Ht8/qrwPp5yvB4\nwG/+Y+iwFvwCrlW/0Ru6pgranj0CWlvJamY9nTszSm/h9DufN68D99zTGaKGZrSpx8KFHQGeanJy\nYOQiK0v2HoP3ho30EO9RL3qN8+DgwbP2/8Hc9NdhadDvzBXu54QS2mtcqymQnkqeGv7X3L59LG69\nlUckm770ZeJBoSwRiYd5M63s9sYbb2DOnDl49913cccdd/j+/eQnP8GJEyciOrjx48fj7bffBgAc\nPnwYQ4YMSciwutb+dT4cyEZjZMppSKVNpcuR+sr/El+eWrlNd69eKxPc4WBx/fXGBUiMYkTUxcw4\n1bLNX3wxGWPHysdfsiQZX34pv05tD9puF4
na5UoimsUih70nT/YgK0tEUxOL3bstWLq0e+xGQ9Fm\nGsaQ6tbV9tJ/g4ewwPkkkurIGgBaNfAktPfs/X2B0D16rZr/SKj8jRolglNZXxht+kKhJDKaHnlT\nUxMeeugh3Hfffb7HWJbFRRddhPT09IgOZN26dfjkk0/AMAyWL1+Oiy++WPW1ieyRuzmrIf10s7BH\nDiHj2nEqPglw5v2DEEeOJjwrY7btpL+WdrhzZ0Sn29w4JairlQW+Lj8/dL9Z2ZddvNiN5mZtL1Ft\n7IWFHuzZ0waHQ11DH5CQnS3i9ts5LFlyXtdB1mtzG6ypfuFQJw6cuwSDnLWhx8ovwP1T/ok39qaZ\nbpnr8QDLliWjsjLJt2hSeooLQujiSonkTJ0a+fa1wdfc9dcH7pErFBZ66B55EPHgWSYi8TBvptuY\nAkBWVhb+9Kc/RWVAwTz00EMx+Zyo0lXeRWpgshMz0QYe+TleU+U0xtE2YA31LNIK1JPNzbad7Gm7\nST1RF7VjaydKGasBV7zGTZs4LFzYgX37BDQ1MUhLk9DaysBi6a7nV2tSojb2mhoLysqSsWyZW7Vm\nOztbRHW1gIsvHoDTp/VHq9fmNjiBLbf9GDInkoV/GEcddm8+AwcuIB5LC4sFYFkEbF8Igvp8NzSw\nOHyYRV0dCysEZKMRjcj2SdTW10euRryqqg3FxVYcOcLB65U98ZEj5aYvlARAiFwTp/5Iz+ptKCEo\n5V2n7QXoBIdjGI4nsMjXwKQnvbc1P3fIcDgZ8mqtFWkYO2e0buhaKU3Ky/MCkMCy6q0ne9pu0qhO\nt9Y4g0ulSko6TY9j924LPB7g+eeTMHWqMU1zPYU0xcirhaJvusmj2RrUHzONSZQwdXLBUHhyyBoA\n9Ry5aYvalo9/CN5spv/QoSJGfc+Np7j7cQij8QW+h0MYjfW4Hxw84HkpYova1FSguroNNTVOvPaa\ngJoaJ6qrtTu3UfyIUrmsLko/h/E/QMbYK5Ax/gdx088hkaCiwJGma//as3g5ysuasWNfPo6dsneF\nbXvee1uNU602vC7Nxa/wdMhzL2AuBNggOKDpfSmeXWcnsGVLiqZoh1mhjmC0VMb0jq1WQuXxyB6j\nGfWwhgYWS5emYPv27pprxVv3x99zLS11IytLRGMjOXrR1CQvRPzLx87Wt+PyrHpcNn0wlpYb/9kZ\nWfD4e7QeD1C+Kh0/bLkFC/BkyHv+4iU3bXE4WHz9NQO7HcSEvtxcuRtaXZ3xtf+5cwxOzl6G//Z0\nX5MjcAIPYAMAYBnWGz6WUTIzQRPbzKAkzla9CbahHmJOLjqKbww7KdIstmWPgt/0nO9vrr5OjmiK\nom6CLqUb3ax1ADh//jxaWloCHsvPz4/aoPSI2z1yAtHqvU36nOsmJONXdUtwC15HHupQhzy8jlvw\nMNYFdELT2ps0ulfeW3vkRlDm/LnnkrB5s3bWeG6uFwyDrm5g+ihz99hjydiyhXzsgPn1eJC8dClS\n3tqFlKbA7PHB2RfozpvZnurKnHLw4Dd4CDOxE/lwoMWeh+RZxbh87xM4UUcet90uQhAY5OaKGDhQ\nIu45q1UGyKI7gWF3KwQcwiiMQOhe/TEMx2VsDd49KIUVWo+H/cpExX/ubKUPBxhSBSOVLj1GEJA5\n6kKwhCiAyNvQfPjruAqzx8M1F9YeOQCsXLkSr732GjIyMnyCMAzD4N13343sCPsoRvXTI/E5U4uB\nBzf+DkuxKmQ/0h8t0Q69lpPZ2SJuusm4UIcWJMETMyIgaihzruwf795tgcPBgpRHUFTkwSuvmNM0\nb2hgwHGAxSLC4wmdK//tE1v5UvBbunMmlOxxAMBzz2h+lrIgmTLFg82bQw158DaNf+jbCwseRPe1\nwA0cgr3LgesZYDO5147PCDscHBzkLXZVhg1ThGm6H8tGI/JBPlA+HLg8qx5ZWcZ6s1OigCAgtfJl\n4lOpld
vgKnssqoaUrT0ORiWUzwgusLXHNRN0Kd3oGvKPPvoIH374IVJStD0bSg+IUKJHt2FMQW1D\nV6MOg40wFLRC3kqSltH9XT20VMYiffyGBrnJx969oRnq+/dbDGfr5+SI2LgxCS+8EPp7sNlE3HFH\nZ/dCJMwOcqQs9cJCD1paGDQ2qi94SIuwNvA4hu+AOyWhqcmFBQs6sXlzMoxl94fidDJIT/eipSVw\nvg4flpun+NOIbDgwDCNwIuQ4DuTjsumD48nh6newtSfAqEhhM87zYL84AvGKMdEbQMu5nj1P8aG7\n4TV8+HBqxKNFV6JH+oSrkDHuSqRPuKpHiR7B7VDnziV7tloJdzwPTJ4cmSQto3XKkagl1jv+RRdJ\nWL06tFVsWpo5TfPJkz149dUk1edLS7vLuPQ6yKGxEUDoXJHqrmtqLJgyxaPZ5lar0crQoWJXRzi5\n9C58mBAjrhBcy90GHq9jJvG1xwpvxNKVNEWnd9GOFKbfWRLVxDPWpd1PQ+95Sje6v6SsrCzccccd\nGDNmDDi/X+qiRYuiOrBEItx9cOuypeA3dYddk+pOImnjHyCKQNuqNWGPhxRaNhK6VjzBvXsDW1P6\nK3XpoVfzHAl6knegzI1iPLOyJEOa5srczZ7diS1byKF4l4tBbS2DkSPlG6SWLr6YkwdpcDbKHuYC\n5mryZE/X/IfyzjsWLF+uXvKnVZp37hyDVauSUV7uNlVmaAZSG9SHsQ6Xju7ExNo/gXHK+4ui3Y5r\nxnbCBQ9ovm3vIRaMgGS3E71yBgD3TZO8DeTxwrV6XcQ/33PFlT16ntKN7q8oPT0d48aNi8VYEg7F\naFVVdd/slV7MukZLENBeWQWSfl1bZRVQtrzH+1NmQ9fB9crKjXnKFA9WlraAdeiH//VqnntCJBYJ\nWsdQ5kmpJ8/KkgJ6hPM8cOSIiZC0hq7AN9fMwMpHeWzc2P2Yw8FhyxYOeiV/WjkXyqJk27akgMQz\np5P1fS/Ka6qqLF1Z6ORzuugiL776Sv35YPLyREyZ4sE77/gvfry4WpTAHupOEuKcTjnBimXhWhn+\nghVA4LYUQN6iojXKZHge7SV3EJPd/LH+aQtcy1ZEfu6sPCSLBQzB45csFsBKvyujqGatS5IEhmEg\nqtQgsWzvlaDHS9Z6aWkyNm0K3XZYsKADq1ZpGy33kWPIuvZKWBA6v53g8M37nyJ55IWmxtMTeqLZ\nrcyd2Qxrs6hluc+e7caaNR2Gjq2VKa+0NtVaKAgCUFhI7ltut4uoqXEF9O9esYzFmMpSXO98A/lw\noCkpD2+lzMS9znWQuCSiF8txErFszugcCgIwYQJPzMT3P8aRIwyuvdYGNY3yd9914ac/taqW2QWj\nVBsEREygrnbozS/AmX0fAVAxwBoMvsAK4b9/5esyKHW9j3G5IObly9do2WOwrVzue42u3nw/MfgB\n9zql/Owvr4E9/Y26MuTb70V8v5w9fgwZV19O/kyGwZkP/wlxROzugXokZNb63Llz8eKLL2LUqFFg\nmO6pVgz8kSNHIj/KBEIQgMpK8l5pZWUSysq0Vc8akQ23RiLQGWSjIEJjNYJatrqi2Y2u6Jt/1nWw\nN2W25tkMWmIk27cn4YMPOMyYoe2d64mrKPXzCko0weMBVq/uNlC33dZJTHYrKekM+M7Ly5OxcVMK\ngA2w4tdyJUFnNto6u15EMOIAOUQN6IsJKeNrb4evjj4Y/++hoEBSVcOz2SS89FISzp8ne+NyAtxg\nWQAAIABJREFUxn5oC1kgsFKDPa6RJ1DvgH3Jg0g+8IExQ+vPQw8FRDr8w8PKNZr0t3eR9J//hDwO\nBF27Qa2He9pgJqHo0r3oGDce6fN+qvoytvlbgsvRM8SsoRDzh5G3nvKG+aIsFH1Ur9IXX3wRAHD0\n6NGYDSaRqK1lVOVAnc7AvVISgwus2Gu/WTaSQVTbb8aUAmvExmoEUra6
FQJuwevE16fsroKrNDD8\n3xORFz20y+IY1Nfrh/C1jlFfz+Ktt8g/h61bk/HhhxxaWxlfyFjJIg/eUlEIXjQo2eNGyM+XQ9TB\nGfZ6uQ1KJCE7WwTPkw200e+howPEOnml5M6/7M7pZMCyZJunlScg8TZYt2/z/a21SAxAEIDXyddl\nwFj9jLg/qdtegmvxUiAtDUBXieBGcolgj0P/CYJn7DXaz18xRo5Y1B4HwEAsGN7zqIXG1lPH9OI+\nHRWJNFSiNUYEZybzPPBpySo8gUU4huEBcq6flqyK+TVM6m6lVQfMNtSBbTqlewyFnkrTamVk+6PV\nXU7rGEoXMxJeL4PDhy2oq5OzyOvq5CzyqVM9OHjQhQ8+kLPIz52TW2o2N+vLuGoxfbqHmGGv5hwG\nZ7nX13PE0L9ybOV7aGpiVLXSFW87GEkiPx4y74rkJ4CO6TPIA1fJBdAqzQPkMLyRQne1nX3WeR72\nssW+cYZTItjnyBwEz+hC4lOeUaNh++0aZBZehIxrxyHj2rHILLwIttKHtTPaDci+KpLW3vwCSBwH\nb34BhIX3ytEQimGoIQ8TJSxJwm6Xw5aAdqvOZStEHF24BtNy/43R7FFMy/03ji5cg2UrekdiMljD\nnMsdghY7WcFPzMkjhr7UdNAjIfJipExMS6ddb6FhZKHgz969FmRlSWBZuftWYaEdt97Ko7DQjnnz\nUpGTY/R4ku+f3S5CFOXrxkhZntZ2gd0uIi9P/XswujjyRy3s75t3RTu7SC6pzCi6ChBFCAt+EXCz\nbiuZoyoGQlok+iNmDQWGDTM17mCS9+3z7YmzdcYXq32Zlt3V6Cy8BBLLylcjy6Kz8BJ0jh0PftNz\nYJ1OMJAXSGxXwqLSFjcAwjWgWsZGasm8ck3f39KIMHS2woTn5T1RUrKb/16pVhZ3aakbd9/diQce\nAFpbh3ZlRkdHi90IoVnuAL+qGDAR+oqmyIuRbGu90LGWmpzFAlNlWYrxuvvu1AA5U69XFkjJyFCx\neiF0n4fTyWDTphSwrLEsfy3Pv62Nwa5dAlJTQfwewukkp5QkBqPMOzFMvek5CAvvxZl9HwVkmCfv\n/0C1NE9zf5TngZkzgQ0b1F+jA3uqAWx9Haxd2fOkk/KNo58kwSE1FS3V+4Hmb2E5fAieUaMBK4+M\n8T9QfUtK1ZshW2xhbVXwfFwltiUaqh753LlzUVsbqpFM6WbFCtn7zM31gmUl5ObKXs+KFfINWMtb\n2rYtCRMmyF761Kk8nn8+CcnGlUIDMCO+YuQ4QLcnGG7oi+eBEVku2Jsi11FJWSR88IGAkhLzYjf+\nxyCFrRcvdsNmM+6h5uSISEqScORIYF7BhfgaVgg4d47FnXd2Ryfy8rwYNcoDjtPfo9baIvBHy6vO\nyRFRUCBpevVmO8mNHElenEyf7gEPnTA1IN+sed63P0rC0P7ounUB16VoHwDRbteROOlGzMmDddNz\n4Lf8EYxKmKHjhmmwrXrMmGfZl8gcBE/RtUDmIDli0VCv+lK2oSEwakG3KnoFVY/8xz/+MebNm4db\nb70VCxcuRFKSuppVf0XP+9TylpxO1qdLHW6tdXCSU1aWiGnTPKioMCe+ol2fLYe+XKXLjXslUc4C\n5iFgw6+OIZMfhjf2poWl007SwG9uZtDWZrxOfPp0D44fZ+X+111NSm7BTuTjJBwYhte9N2P4TSvw\n+OOs7/poamIwdqxN99hGe3VredVG8hKMdpJT5reszI2VK5OJEQ3Woa1kxzadCvC6lMVgyu4qsA11\nEHPy0DG92Nj+KOm6BGB/5EFYK7fpvBnomDIFKXvfJj4ncRza5s4DgH6fBCdmDYWYkwtO5XsVc3IC\noid6aobB10BcokRg0tLAtrYmRCRGs/vZ+fPnsWHDBnz44Yd45JFHUFDQXRBFu5/pY7STmILZWmu1\nmujCQg/27GkLsZlqimg97UIWPHe2
siXETFRh4b09uwESFgjOyTPwxYJVyMrlevxb0/q+LBYRQ4ZI\naGoKNF7nzgGFhXb8xvuArz2nP8133gvxt93nbPSa4DgJP/uZ29CizH8hRtouCP7ezSjjqb2W+Lhg\noGZcpeWe6Tpytd+rxwPb0iWwbt0MRiR72qLdjrM730LGlB+BIehkSByHM9X7kf7T282fSwJg9l6n\n9nsGCL/pcK+BeMB3f3kTrMPh20cS8/LRMfUG8IsfwmkpuVeNu1oduWay24ABA/Doo4/isssuw6JF\ni3DXXXdh7ty5uOuuu6Ixxl6HFKI2E7YmZaab0fHWStQifZZa2L6mxoKysu44vVbCnV5tdfB5686H\nICBZJbSW3MPQmrL3xjlOghFFcI6TGLjlDyjcWqraklVtrKTntL6v+fM7ceBAaDg+MxO44vvnVcv0\n0v8WeM5Grwmvl8GWLSkoL9ffb7FY5HyLl15qQ3V19/iA0O/9+uutvi0d5TpobVWfJ7WEO+Lj4YbL\nlf3RSNwYLRa41vwW7T+5VfUlTFsbYOEg5uYRnxdz8gBIup5lf8FVXgFhwS8g2gf40jJF+wC51Wlw\n9KSnWya9SPf9xSEn9Xm9slRtnQP85k3AxRdjUOF3kXH15ciY8MO42mbR9Mg/+eQTrFixApdeeike\neughpKenx3JsqkTac/Z4gDVrBuDPf/b6Qsv+Wtt6cqBaoWlAfi5YMpOEGY/8+HEG48bZIIpkw5+d\n7cXBg/KxtDzuu+/uVD0Ox0k4cMCFESMkzXPMzu5e4bPHjyH96ivBEeQjvODQ8tGn4YXWTKz0jXwf\nauptet4tic6jxzDkR2SVPonjcOZA4Dm3tsrfyYEDKairk2C1ShAEhljapXdNkGrIi4o8WLnSjbVr\nyd97MP59yNXO1bAX7/NqCOHyCGYia3qVggD2669wwc03gHW5Qp5WrhfbqsfUI0elyxPXs9TBdPRR\nSX4bcSHY863QrSOP0TUQUTTuL5pvm78Abb/4Zcw8dDWPXNWQP/zwwzh69CjKy8sxZkwUW9mFQaQN\nuZqhI0EKN+uFprUkM/WOrYYgAOPG8arymSwr4eBBF7KyJE3Z1D17BEydqi+rqnWOzz2X4vtO2poF\noHAshnlPhLz2JDccqPkQ1kzzFzx7/Bgyxl2pHgr1M5ZaYwVgaBvBVGMWg4uMYKObl8dg3Dg3Fi50\nY/JkG9GQ+y+mSKidq90uz5Pe4pGE/1yY1bdX5m1omgu21h5keuuE24nGKGjrReJ5sISGIL5wsI7B\nidoWUS9j2JC3tyO9eBIsRw7LWf0cB8/IUWipehcQRf3tkATK9te6v2ghcRwgSRBzctFRfGPUFyum\nQ+vf/e538Ze//CXujHik0QotkwgONxsJTTc1MaqSmYCEoUO9mD3bjcWLjSe68TwwbZp6WCc3V/Ql\nV2nJpra2MroiLmbC76dabfiL92bia//ivRmnWvUTvUgoCmHE5/zKlbTGWlVlwa5dxs7DVGtVg+HE\nYOGWkyeB7duTcd99qVBrXaBVTqd1rnIyZXgyEf5zQWqpunFjaMg/ePtmwtRBKH3+YniSTd7AzdQg\nBxG89aIYcdE+gFxxoVPD3K/ESvzFW7r+P33aRCTVfO4LMTNeL5JqPkfG5SPVvx8DIjDxiNb9RQvG\n65W3+eoc4Df+AbZlj0ZhdPqoWrCFCxfGchy9hlkFrmDNcCP64lrSpTabBIYBduxIwoEDnKluXhUV\nbvzjH1xADbOCYoS1PnvoUBHt7fAtINTaneqdY2OjrHYpCEB7O/BE7lqI9QxmYify4YAD+diJmXgy\nbw1uyQqzTt6gnKPeWNU2knqqBe8qr0BnJ5DyVhVSvgnNwNYyuocPq3/ZWpnnPVGP08L/utVawJWW\ndvcTiFTXu7DlUjXKnqT0dJzdtQdiwQiyZ6hWwxxOxUai4R/FqHNAsskLbUYQABXvlDvTDJxplv9f\n
+X5EEWBZ7SY28Rxa17i/mCH1hc1wPfL/fPK/saLfK7uZVbcK9pD06niV0Kya1+tysWhs1PZ21LBY\ngD172jB/fgeys71gGAnZ2V7Mn9+t4KX12efOMZg40YaJE+Uf3XvvkSVB9c5x8OBub2ziRBuazyXh\nQfwOhTiEi/EFCnEID+J3mNrDXBcjHpLeWNXU1nqiBe/xAGXlPC555ykMOnUIPxpyGIum/BPnyru9\nO7NGl+OkgO+RRDjKbEZQ5sLIIhXQ10tobTX4wT2oQdYse2psAFKt4V98kUzGizMCohiSBNbplBXc\nRNFg81qZ1MptIdEQ1ukEI0k+Y28rWxzX3rqrvALC/AVyuDxMWE8n7L9cGPNz7PeG3GxmebCHZFRf\nvHxxC0pLDuO7ued94iDKPmYwRsVAgO464GnTPBg6VNYL37vXgvLyZF+0K1j0w3//1H8BsXZtcmA4\nWRDgPnIMp2vbMHmy+jkuW4aA8KsS0uXsqajlLsSg/JSIyLQakXPU+j6Kiz2YMcOYFryZagX/8LNL\nsuGDxu/h95svCFiQZWVJyM42Z3R/8YtOTQfGyLVrt4vIze0Weyks9PhkW9Wuv+BoDgn/hY+eXoJ/\nBYXWvBqpQVbD6NYLxQ+NhZNZGKf+nrt165b4FtaxWOBavR5tP5uv+1KtJX/qW1XIGHdlTM8xTuMc\nsaW83A2rNQV//rPXF1r2z1pXHps82YO5czshCIGLcy3ZT//Q1cr6OpTn5OHb227ElwsrMHHKQOJ4\nzIZ5y8uTie03Afg8a0X0o7aWwR13WEHIAeoOlyZ7YF22FO2VVchyOuDGMFxjvxn/N3oNzrQmBZzj\n4sVuTJpEThQcOFDCrl0CCgoiJ9MKQFfOUfP78DvXhgY5y3v8eK9veyGc5C4j4WeeB4qKPKisNLba\nNxohUM5JrSpizpzOELEXJW8jM1PC2rXkDH3AuNhMVpaEnBxRNZlz3z4LWlvdvs9Sm1etTmlGZFtp\nJy1zaC2cooGioBfvwjquijVgBBdSt28zFZVQYABwjQ2+LQfXqt9Eeoihn6lVfhavREsQprb2fEim\nsiAADQ0MNm1KCjDqpN7XpExntczXc/PuxSXvPEXcu87N9WLbtjZDBlBLYIRUuqRVtsZxEqqrXfju\nM0uQvf2ZkOefwCLUzF+DX/yi03eO8vHsxO00vYzraKOVea6UgO3bZ0FjY7dhEUUQ9fPnz+8IOG8F\nvflUzl8QgK+/ZnDzzTxcLv1AmJkKBuV8li5Nwf79HBobjZXNKWjNk9FyvPvuS8H27eQtIY6TcNtt\nncTng8/TaKa4dtZ6ApU99QK+uTNYcuX79XIcxIHp8h55EKLdTqwQ0MKbm4+WzS8Cqank/IXezHoX\nBGRcfRm4pibi0xLUu+v5I9oHoLnmy4iN33T5WTwTa2W30tJk4s19wYIOrFoVerP1L8HJnapelrRo\nyj/x+80XhDxnpK5Xwagh8R+bmuG320VkDXBhb+MlGIFQnf1jGI5puf/G3v0IWOhcd90AnDgROjaz\nSnWxRKtsi+TZcpwESULId6K3kHrvPSHAE5X7hIcen+dFdHQwxgywxg3OVNmcCfSO29oKXH65jXhu\nubleMAyIHnvINWLQGOvWkffV5LQI4D93WqptCsKdd6HjllvlJioD04nfD0QR/KbnTI3D3/BIdjva\nS+6Aa8Wv5XFFUeLZKPZ7fw7ra9t7dAwJwJn3D0IcOToiY6KGXAe1G4MgAIWF5BuU3S6ipsblu1cE\nh2XHDfkSfzs1UlUo5PS+T/H/XrjY5+1YreSbvJZ3ZtYjB7Tr5i/E1/gC3yOOuRMcRrNH8dLBoQGL\ng4qKAcRGVGbr4qNhgNQ+S10m1dha2//cwqlbt9tFtLWxsFqVfAUG2dmyEVftPR5lDfueojYPJSUd\neOWVZMOLTQDh1ZFTDBEwd/4Lp3oHJL4ra71N0I5oBH8/QTXn4Y
SkATnyAiA+6vdbWzHo4hFgPOQG\nQkaQDfmHEEeOisiQwpJopQC1tQyxnSMg33xra7ufC665/b9TeXCA3DdZzMkDkzPU14mrutqF9HTy\nmopUu64kDBlNtvN/T3Dym3/iXSOyVcfsQD7ErCy0twcmZa5bh7B7kGvJx0aLSJRtBdZakzuILV7s\nVt0/T0+XUFIiJ4N5ne24EMfQ0tiBzZvVqxZIErX8xj+Qe0L3AmrzsHKl21DSXAB9OFM8rvBPID34\nf2iu+RLNNV+Sk0n9a8SDvh/byuUBNefhkvLXN5BS9Sb5uVh3T0tLQ9td+olvWkj2AbIKXpTp/WV8\nH4GU9NQGHq9jJrGZhnNydwIOzwOpqVAVjVGS3/LzJWIiVlmZGx6P/PlNTYGJRFrJW0oSVHs7MHGi\nTXfMOzETTedtmDgxNOwfbg/ypUu1E/Wi4alr1dar9eEOxj8hUe38jx9XXzA0NrL4e7UH64M7pmEm\nnqxag9LSIBumU5oV3BO6N9C6DnrSoY0SA4ISSAOSSfUiQQaz3yUAYBhAklSNPXuqQfX9vdE9zbXi\n13J9fNWb4BrqIZksy2svmROT3yU15DoUFEiqN3e7XUJBgXYJzsNYBwAB4ih/xc0Yt+AxjPB7nZZx\nUbwWNdGNAwc4nDvHoKlJbmU6eXK3gQ0OdwYbSiURy/+zSWPebbkZD3vWwdsV+leO09oKPP+8fGxS\na1A1FE/8xRfJ3mdVlQWdncA77xjLHjeDVjZ2cXEneB549dUkv+889LsneZLB56/1nQ4ZIuLBxodw\nv9+CaQRO4AFsAFsnoampIuBYidQeknQdGKkkoMQneiI9RrPfxbx8tDz/ItLn/xRcPbnHuWTlwXS0\ny5Kwwe/vjTJCP1GgwR4nPDfPRNLhQ7pvkzgObT+b79vzjzY0tK4DzwMlJeQ9kpKSzoASHFL40AtL\niDjKb/PXIys38OauFyIHtLudKeH8xkbO1zXLqLRq8GcHjvkoClGDX3p+By9h3VdZmYyLL5azlg0L\nf0COEmzenAKvl7y+ratjsWWLvjSogpm6b/nzu8PALCvXVNvtIl59NbnLiLOQDTh5fEY8Sa3v9ObJ\nrbiVI3dM+zH3BoamBTb7SPQ6acVb37ePLDpEiVMMiPQYlTd1TyiC+P2R6JhBlnAGAFZw+crUgumY\nPFXWEugNQRmeB77zHbS89R46Cy+BxHG+TnAk2ubOg2vNb2OWu0INuQFWrJBv+nl58k0/L8+LefM6\nMG9ep+GWpW3gcQzfQRt4n7hKsOFR22MsL3eb3tfdvduC2lpjylzBn61cnvKYL0IbbFBPAGPgcMia\n4ZdfbjO0v21E315NXCk4XyDcPXZ/wzJrVqdPm9xf0CYU+bs3I26j9p2uuPckcrwO4ntyvQ7YWoPE\nTxK4PaQ/pjTs/TC7UKNEBkMiPRrXptLyVLTbkfpKJTLG/wBMy1kIP5sX2BbVZodot6sew3tBBlL2\n7O59QZnUVLRU78e3NV+i5bW/4tvP/0NWm1y5NqbDolnrXRjJgvWvKd+7V71F5htvWHDqlOLRBSPh\n9ts7cfAgpxoyJu0La2dah6LUhP/0p1bdjHb/zztzBpg61YZvvw1/jaeXra7XglWGnD0enOWs13nO\nn57OK8tKeO89F0aONP+TCflsQcCga8eCqT0R8lpPfgHOktpk9pE6aUN5D11Z0e7MoShfmx6S4/H7\n36fg7FmatR4OpjL+jbYPJl2bk6eCcZ2H9ZXKkPeKNhs6pt0IYf7PAbvsKGRMvMZU97FYZ7HHQ8kj\nzVo3CckD4HnghReSsHmzdsiX0bBPdruEV15JDnn/0qXJAZnowV4Lz8OnNmeEnBwRBQWSZrg+OTnQ\nmy0stGHCBBu+/bYneaf6ErNa0p8cJ+HOOzuQn6+f5Wx060DNa29tBT79lDUc6cjNFX05EWYJ+U55\nHswtM4mvdat52AYkauMZ5X
uYMh64Y+wpTBmP0OhJUPezpMuvxsUbl6DBIQX8Xh56qNdOo39hNBJE\nujaXP47kgweI72VdLlhf244LZt8C68svQszNM919LOZZ7Fr0cpUFNeRBaIVq9QyH4h3KPcLNGcMX\nX0zG2LGRK79S9nC1wvXB5XJOJwtBUIskyDCM1s6QTHDYPhitbYi5c9347W/dhkrqjDb1UGvFefnl\nNtx2G6+58FL77Iiwbl14bTKjcNOIReh6xTIWF29cgrfrL8UR6ft4u/5SXLxxCVYs6/4Og0vsBjlr\n8QA24DcItNw7d8bPPbyvY6qdq9+1aSQJjnU65RLK1Y+jY/INpsalp7/fn6Ch9S6UsIlWqPbuuzs1\nVdSGDBG7jHgwEvLyREyY4FEVxiB9nn9oWCsEbLOJGDBAwjffqGd3B4czzYbqFebP74DLxajKcQKh\nQjSkUKqe9KcRaVAjYjhAeOcJKKItBtXWwsBfKrOnYblwy/TMasuHiyAAewvLsMD5ZMhzm+y/wtR/\nlML2zXGk33E7uLrQ3IFjGI5CHEIb5JPjOODAAWevyf8mMmGL6Zi9Tg1KwALyPvrZnbuRMWmCYRco\nILQfA+JBhEgttJ4YMbkYoedxP/CAW7OcSN4XD4VlgZdflrXT9++3GDIqwf2etbxPl4tBWxsTUnrm\nT3BJkPHkOfk9druEkpJOrFghLy4GDpRUG3UonquekdCqPTdSm26kqYdWLXcwHCefq39DmObmGKjN\n6TSB0cKIIW5rFnD2cBMuGJUFa2bgiUSqj7gep2vbMMm5k/jcLOcLSJn4Z6Q01UOtYXw+HMhGI47h\nO/Lf+Qi77SwlTMxepzyPjilTwW/epPtSxnke8HRCzB9myPADQMcN0xImyTPa0NC6H3qh2tZWRjXk\nO22aB3l55H1dZW/VTMvU4PC0du9pJqT0TA+jvayzskS8/bYLNTUurFolGwfFyH72mQslJR0YNgwh\nYXtBAB54IIUY0vYfn14Ws97zWlsHZs4TkEtX//d/BV9pVFpaeBnWsUJvjj3tHvzr+lKgcCxG33oF\nUDgW/7q+FJ52j+/9RnIMIkE2GpEPcpb+QLQi9VQdGA2hEAfy0Yhs398zZ9J7eCLQtuBenY04P1JT\nVffjKdpQQ+6Hkf7LJMMxf34Hfv7zTkyZYqAvedD7FS9Q7fMUzCwCjNyEjR5v5kwPrriCbMzS0oAn\nn3TjyBH4aoOVvffx43ls355EPOauXZEzEnr1yWbmDWDw1luWuDcQSh6H1hzv3m3B59OWYXLN0xjm\nPQELRAzznsDkmqdxqHgZAP2F66efsmhr9pPl7AHJBUPRYs8P+/3V9pvh5qy+hdq6deqvpaVqMUZQ\nv0bEnFyI+WTJZ38kzgJ4vHAtXirvx2fn6C4AUt5+iyZKdEENuR9GdMv9Dcff/+7C5Mke7N1rwYQJ\nNuzda0FhoQd5eeqa48GGZ+5ccviSlFhVXu7GggUdXbro6kln9fXayWYKixe7MXu22zdeRRSFZc3p\npft7zUqotr5ePeGvvp7FlClWtLfrHtowWp57ebl8nnpJegCwd29kPdFoYGSOz9S14+KjfyU+970j\nb6KtWdCuHpA8OHmr7M2nX30lLuhp7S7PI7Wk2PDLJQAS251YNeWzcl0hmd7Q7e/XBFUY+Oq7W1sD\nNNmNeNms14OMSROQMfEaAMCZPX+DmJ2j/R6a7OaD7pEHYVRKkueBrVuTQnTCHQ5g3rwO3HNPaO/q\n4PePGCH5bkpGpCstFnm/XV2wRIZhgGefTUJFBfmG59+7ur6exZAhIn78YzdWr5ZfH662uRGhl64R\n4ssvLSgutqK6us3ch4SBxQKsWdOBffs4NDRo5yf4a6gDse3KZgSjc3zJoHrknCaHsrO9Dhw+3ISc\nohGqOQarxYdxP54EFJEtx0lY/GQ5w6FtRQVYFkiqqgLXUAdvdg7Yc2fBEfpYi7n5aNm2Q244
wfPg\nAYxI016IxWq/nyKjJt2auu1PYAShW5P9wSVgzp1D8v59YBsbAFEEQ8iFYBAo/9px00zdFqvW556W\nxVcSpAQzWtCs9S6CMxK1buCCIHdFu+MOq7EeywYwYjDMZpoHZ74riVFqSWqFhR7s2dNm+jehzJ0x\noZduOE5CTY0TmZnmPi9ctNq3KijfXXJy5LO5g7/jcLJgjc7xPT87i0dfvhLDvCdCnjvJDQdqPoQ1\nkw+pDgCAZG8bDmEUsSd9RDKFBQFSwyms2DQMl79STsxk1xP7IP1ezbbz7a9EJPvaVEa6XTbs2Tno\nvPwKpFS9STTk/njzC3Dmvf2wra1A6raXwDrVxxsrYZh4zlqnoXUVSKFa/9DdddfZUFdnTP6UhP8+\nnlGvz6xMa1WVBUeOdO8VKh6LmkdfU2PB0qX6iXJqmEksA+TkssOHY3cJhm5NhNJdf0+uPTeSSBhM\nJEO+2nPcLSG7bLUF/xl5I/FV/xl5oy973X+r55VXBIiidmJaRMKZPI//98LF+P3mC3CP87d4Aotw\nDMPRCQ6n7QZr6YMwqilAiQxGG6UAcq04I4rg6uuQuuuvkGw2/fc01IFt/haulWvQ/NkRtN06GxJL\n/n7jShiml6CG3AT+N3dJUm+oodpjGYE39bFjZTW1wkKboRt8ZqYEnjceQKmrY3HddfJxlyxJRlWV\nviv51lvh7xGbSyyTa4FHjTJu+HuKxQKsWuVGTY0L777rwm23dSA3V84P+G7ueZSWHEb54paIZ3Or\nLQrCUSfTmuOSEjc++KB7D3l01QrsHf1LnGBkI3kMw/F7y6+w46rVIdcYzwNjxojIyxM1e9JHokGL\n//wGNxUaP/BznC41r1ZnJFGVEjmMNkoho7+oErNzuq+ztDQIix9VLU2ke+W9YMg//vhjjBs3Du+9\n957vsaNHj6KkpAQlJSVYvnx5rIdkCOP7v9oKYMGLAf9mHXpe39q1ybr744EwkCT5uFvrFjONAAAg\nAElEQVS2pKhGEPz55huy9xKcCayWGUzK6s/IIHczGjnSG7Owuj88D1xyiYRnnnFj//utcNz2Sxxi\nCrHylUswZOJVSF7yCE7VkY2CWe9O67oJV51MreRu/Xp3wHVnSbXg1fHrMErq7rz3S88GbNxsI15j\nyiJB6UlPIhINWkjes9JU6Ngpe1jes5FEVUoEMZjERoIRXGibcbNm6ql7fFHAdSZmDYWYR656SITu\nf9Empob85MmT2LJlC8aMGRPweEVFBUpLS1FZWYmWlha8//77sRyWIbTD2pKhTG+jiwGS12dmIaGG\nWkcxf4K9F1JY+PrrrZgwofvv++/vTmYmlYN99pmAwkJPV6mdXHJXWOhBVVX4iW6RKjEavHYpsrc/\ng6Q6WRKUc5xE9vZn8IyN7C6b9e60rhuHAz6jZeZ8jLYEVa4Z/857CmqRBWWR8GTeGmzAr3CSGw4v\nOHiMyscaICtLwneynbgQX8OKwEH0xHvW0xSgRJZg6VbRTt6/DUbMzoVz3e/U2/LaB8BZEdQ9rI90\n/4sWMTXkgwcPxtNPPw27X7s6t9uN+vp6XHrppQCASZMm4eDBg7EcliH8Q3dWCAE3odxcEe+9p99j\nuamJMeQVk7w+c0ps5BuhSpvfAIK9F1JYuKbGgrq67r83bECIh+efY5CaClRXt6Gmxolt2wRUVbnw\n5pttSE01cDpBRLTESKPX8kzsDDEygHnvTivkm58vb5eEez56Yjnh7Bsri4S/feDGuI8qgJoP0fLR\npzgbqQYtHg8Gr1qCA+cuwRf4Hg5hNNbjfnCQT7gn3jPteR5jghqlNH92xFANuLuoCMgchI4ZNxGf\nb5/zU1mkIghTmu/9jJgacqvVCi7ILTx79izS/L60wYMH4/Tp07EcliF4HphxQzvW434cwuiAm9BN\n09sxcqR+eVJWlgSbzZi38dxzSQE386wsCVlZxpTYcnLU
DIeI+fNlj4VlJdhsImw2UdV7MRMF0Ns7\n9niAJ55IxpIlqZg+PXwDHMkkNK2EnUFtdXig5ESPvTutkO/MmfJ2idnzMeq992TfWFkkWDMj26BF\nKVka5KyFBSJG4AQewAY8a/+fiHnP4fY8p4SJIt2aliYb9uoPVGvARfsAOLt6dYcY5rx8tM2eA9fi\npeTPSfDuf1FFihKvvPKKNGvWrIB/f//73yVJkqQlS5ZI1dXVkiRJ0qlTp6SZM2f63rd//37pwQcf\n1Dx2Z6cnWsPWxHPfIkmSUy4C/nl+NleSXC7d97tckjRgAPEQxH+LFgW+/7/+y9h7FpGH6TueyyVJ\nX30l/9f//4P56itJYlljY2VZ+fVq6I3JCC6XJBUUkI8zfLihryD0gMOHax5Qa36M0tkpSffdJ0lp\nad2HHzBAku65R5KGDTN+Pp2d8nwNHy7P9/Dh8t+dnd2nEzzWSMx7xND4Ar0F4XyBlLjFzIV37pwk\nzZ0r/xhIFzZFl6gtZWbNmoVZs2bpvi4jIwMtLS2+v5uamjBkyBDN95w9G/lSA90aQUFAxut/IT7F\nvrgV3upqdBTfKId5VFaIx48zcLlsMNri9M9/9uKBB7rrX8vKgL//3YqamtDj2+0i5szpxJIlskfT\n1hbaOWzJEjeUYEdaGuByBf6/8nfX6aK+nkFODrlWPhieF2GxuEAKpggC8Oc/8wBCjxN8jlocP87A\n4SDPn8MhoabGZboblm3qdKLohDB1OlwuL4DzxPkxS0dHMlpbu2vYz58Hnn0WkLdBjJ1PcB38iRPA\nhg2Ay9UBlgWx5n3JktBrYfJkD26/vRO1tbH1Wtnjx5DhcBCvfqbOgeaaLw035YiHmt5EJSZzt2Q5\nbG1upOyuAttQB3FIFjqmzYBryXIg6LNtZY+A37q1+4GuC1toc8ekPtwo8XDNxW0deVJSEi688EJ8\n8sknAIA9e/agqKiol0cVilYYlgHA1TnkvrrlKmEhAGlpxsLjCsH7mBYLsGdPG+bP70B2thwez831\noqSkA5995vLtB/Zkr9B/D3riRBtaWnpefxupGt9olBjFYt9Na4tCLQEx+Hy0jlFZmaQanve/Fvbt\nc2HKFA/eeceCoiLt7Y1o6JVrlSzRzOM+hsUCV3kFOqZMgThkCNhTjUh55235/uh/wWnkqdD6cOPE\n1JD/7W9/w5133ol9+/Zh/fr1mD9/PgCgtLQU69evR0lJCYYNG4ZrrrkmlsMyhNG6SdLFpxjHqVN5\nNDYan3KScbJYgNWr3Th4UMDBgy7s3y/gySfdpNyQsPYKg/eglXI3RYNdLZGurY1RNcha9e9mDHBU\nSoxU9t0EtyVihqy+noHDQf7e1RIQg89HazHkdJLn3T9vgeeBF15IwubNoQa/rCzZd65R1Sunmcf9\nClv5UvCbN4FrbAQjST75VX9nR8tBovXhxolplsB1112H6667LuTxiy66CNu2bYvlUMzTdRPS0/5V\nLj7/EGGwBnQ3sgGzWCR4POp9vVWGExB2jYQmuJbXN3CghNdeE3D33eRQu5ZB1qp/nzLFnAE2qoVv\nmq6EHY8HKC+LrDTrH/+YBLXtlPx8EVOmyI13tM5HiUYYlecFAnXjtb7brVuT8cILycjNFTFwoBSw\ndRNpvXIl0uELuebkoWN6Mc087mtoedq73oSrdLn8m+tykEhSrzRKYxya7mcC301o15tg68l7ff4X\nn6LJvmuX2jTLR/B45P/a7SLa2hhTxslfKztcw6MsAtrboer1nTrFIj0dKC4mN9lQW3RoZ75LePtt\nCywWGB6vEiouLXVHpZlJpBtvCALw6qvkVqOAvJBZvdqNZcu0z0eJRpDm3mIhNyXzX1xpefRer3z9\nKU1/SOzebUFpqbvnc90VAXGVLpcXvFlDqSfeB9H0tOsdsC95EM4nntZ0kGiUxjjUkJvB7yZkX/Ig\nrNtDowgd04vhSeZR
XiZLotbXs2rKgiGkp0vYtUtAQYFx49QTwxO8CMjOFsHzEjFU69+PHQj0iH/y\nE86XZOePIACffspq1M4zqK8Pz1AGRyQigZ40q2LIzEQ/amsZ1dA3AJSUdAIwdj6kuQ/2oP254Ybu\nxVU4Hr0/wV3heoxSskTpk2h52gwA6/ZtkGx2uFavo1GaCNDryW4JCc/D+cTTqklSy5bJxrWuTluT\nPZjGRhapqcYXoT3VBA/eD6+v51RD4IrH7XYDd9/diT175ES6PXsE3Hef/LiC/z7rrFk8VHodmB5v\ntNFLymtoYAL2j8eNkzXse7J/vHGjurceTHAS4549As6dU7+2/MdlVgc/GKpXTjGFAQlX65+2yDcx\nWh/eY+hM+SMIxsN9KiFCQZCziMNBuVlGohuangeltQiw20Wkp0tobOzesy0rc6PMb+84J0d+zblz\nDOrrgdxc3hfSV88JUCfiHl8YaHmtOTkiNm2Sk8UUGhs5bNnC4R//4FTbvxYUSLDbyVEOADh40AJB\nMBeyVrz348e11f7eftuC8vLuY/t79PX1LBimO6yuB9Urp5jFVV4B5ptvkPr6a2RXprMT7BdHIF7R\nJdlNozRhQz1yQHZd7r8fGUVXIWPclcgougq2siXkjcdg+EDlK71QqhY33ODBqlXGM4Z7Uo6ltQho\na2Pw8sttAaVrK1cGeu91dbJUq/w3fBnQS5cma+6JqxEPHp+W1zp5spyQFowVAoSaWqx4hPw+PU/4\n1KnwW2zqqf01NYWWLyoe/cGDLvzsZ+StjMJCT/iKdoIA9vgxWjZEASwWtN9eovkStvnbGA2mb0M9\ncshlEtj4B59ciVImASAGggSy8bLbJRw8yOHQIeMZw1oJUHoelJ736b9Pb0aq9a23LDh1yvz60Gz2\nOgA0N8v9zEeNEiPWRU0tK37u3E5s3dotm8rBg9/gIdyCncjHSTS8NAzJ3HS4K0IFgX796w7s3m0h\nblv0ZAHD88C0aR5s2ULe987NJR9b8egrKtxISiJXALjdJqsgPB7YypciZfcusPV16MjKQ8e0GcT5\noPRx/CKbnivGaL5U73mKMRhJMpqKFT9EVF1HEJBRdBUxKcObX4D6PR/hVKvN8A1NEIDCQpvJdqPa\n5Od7sW8fWf3MP2Et+Gasd/8MVgpTWLiwI2DhcPw4g3HjbBBFfc+R4yQMGSKisZFkXMgqZoCEAwdc\nuOgiY5dieztQXGzFkSMcvF5ZVGXkSC+qqsJrxEIieHtDEIBx43jfea3H/XgAG0Lft/Be4uLP6Fyb\nxeMBpk4lq/0ZPXYkShdtZUuImcfvFP4So/esirgtjweVrUQlanMXtJgTc3IhpqfDcuQwGIJgQufo\nQrS8dyDy44gS8XDNxa2yW2+jVSYBRx3uuP6cKWEMnu/ORA5m9OjukKUsrmIMLfUzLRU3PXUuo20f\ns7Ik1UYswSgLCRJ2O/mc8/NF5OQYn4/iYtlwyfu7DLxeBjU1FhQXWw0fQ49gMR3F+wXkcPoteJ34\nPjU1KtJcL1oE4yFrFUhqf2bD4T1uMqJRM3xhzZuoKIuEmgwl3lEa4nCOrpbAdQ4k1XweYsQlAJ2F\nl6Bld3XvDLQPQj1yDY/8GIajEIcC+jgb8XIUL3nXLgsaG1kMHSqiqMiLxx7rwNq1yfjrXy04fZqF\n0Wz2vDwvXn65zRfu1vOgzNaWGwlRX3892esLRjEgpCiBKAKbNvXMK21uBgoL7cQkLY6TUFPjjFiY\nPRjF+xVqavEFvgcLQhc3EsfhzIFPVZN2/L+7goLIrvAj4VmHA3v8GDLGXQlGDJ2PTnC4PvswXjqY\nHdExxYN3lKhEZe407qPBeHPzcGb/JwlXIx4P15yaR043rzQECXZiZoARB4wJYyhe8uLFcqb3vn0W\n7NiRhF27yPukerS0MJg40eZT3mppYdDQoG6gjdaWGzX4bc0C0psbYUVeyHxwnARJYp
Cb6w14L0m0\nxeOBr7lHuKpshw+zqrKmXq/8fFGRcT17Myje74pHMtDw0jAME0+EvEZPjSoa9e+xOLYWYtZQdGTl\nIbUx9CbuQD4+a8pFU5PUqxUJlOiiGdkMfu2pRnkPPS0NlsOH4Bk1GsgcFOUR9m2oIYdcJsFbk+H9\n8+tgG+rQMSQPzzXOxMNYF/Jah4NFfT2D735X/6a0dm0yKiu7DaqRbPbCQg/OnZMNtdUqwelkfcY/\nWHmLZKCNipoABgx+157XgL/uwnuNdXBgGF6HPC9ev0tn717gO98J3cMPNiyRUGUbNUoEx5E1yjlO\nfj6aWCzAinUWJHPTgS1kNSoBPJqOx94z7jV4Hh3TZiCVMB87MRMX5KYiK4tmsfdlxLQ0iEOywJ1q\n1H9tdg7S5t0ByxdHoSS5eEaOQkvVu4hYkks/o9/vkQOQ786/+51PkOCb6o/w2/z1AcaqGwabNunX\niZvJ9AYkZGfL+5p79rRh3z4B1dUuDBxozIPxF1PRqy3/9FMWgmBMTEbZ80ptPAkLRIzACTyADfgN\nHvK9NidHxNix5qJkPdmTzcyUE9tIjBzpjVpYPRh3RWjXNOeCe/E/4m+i03AkznFXVOCdwl/iGIaj\nExyOYTiewCI8jHW0Br0v4/HA9siDyLh+PFgDRhwAIAhIOnwIjNcLBgDj9SKp5nOkF0+K6lD7MtSQ\n+9NVE27N5DFlivrdd8eOJLS2ah9Ky6AGk50torq6O0mN5+WFqdFOafX1LGprZW9fq7YcAG67jUdR\nEY9HHknWNPina9tUE5hmYieskFcORm/SkWyLWVXVhsJCDzhO7sbGcRIKCz2oqmrr+cGNQlCj+h/2\nCTy7yUZsJ9rnsVgwes8qrJ//f7g++zAuY2vwRP5vcfdCb88b2lDiE48H6VOv7e5wpvIyieO61S/v\nnAf2XAvxdZYjhwFaVx4W1JCrsGBBJ9QETJxOFkuXaiuX6RlUf266yRPiSZp5vygCd9xh9Xl/48cH\nLkKsEHAhvkaytw2SJBuYysoUzdai2WhU3fPKhwNjsusMZUZHoy1maipQXd2GmhonXntNQE2NE9XV\nkSs9M0XX4k8A3yO53L6AxQIsX23BSwez8e5BKaCCgtL3sC1dgqSaz/VfKEloeeV1nNn3ETpu+Yl6\n716vF5bDhyI7yH4CNeQq5ORoG9L9+znNm7OWopfdLhLLvfy9VnPa2LLS2saNKbj8chteeSUZdruI\ngTY3fodFOITR+ALfwyGMxnrcDw7ax50+3YPkAvX+697sPLxcPdDQTTpYzz2SXmpmJlBUFJ4YTHMz\nsG8fi+bmHg8DgDG53P5Cj8vZKPGPICDlLXLELhgxNx+eMT8EeF5ObONUGvdwnPw8xTR0rawCzwNF\nRR5UVpIvusZGfW1wNZWwxYvdaG4OzOYuI/TALisL1MbOyhJxwQXdWeuk8jUlMc7pZLAei7EIT/qe\nU/a4AeBB/A6CwGD2bDcOHOBCs8gt6tn84k3FsGbKd2lBAL7+ulu8K1hExWjiXayIlpiMnlJeb8vP\nUiiRhG06BbbplKHXBrQjzRwEz8hRRE/eM3IUzV4PE1pH3gWpRrC1Fbj8crJKm5baWjB69b1aql/l\n5W4sXZqMt96yoKlJNvJXX+3Fa68ldXVWI2NHK+qQh4EInSulPn5Qfgr27ZPDCsTx+ZSaQtsLemDx\nK13jwPNy9MLlYpCXJy8I7rqrExMmkBXhOE5Wc/NfCMWiDlqtHr6w0IPq6p7tsZtVb4t1XWpv1ZlH\ng3io6U1UIjJ3OnXjEgAxv6C7Hal/6K69HenFk+Q98QTKWo+Ha47WkYdBWhowZ04n8eZsJhNXq75X\nz2v1eIAtWwJLxBwODna7qFnOtgG/IhpxQN7jzkYjJk/P850DcXwqHd4AoDzIaPkvdpQyNo8HhrxU\nswI24dLcDBw5Qo6wHDnCobkZPcp6V4vA9HayV6
zml9KP0NDf6BxViNY/viBvzZFukqmpaKneDzR/\nS+vIIwT9GesQ7Zuz3t6q8RK2bqwQcD3U5Q9PsXmYflcGlho9h6D2gkZL6/butWDyZHJTD/+FkFEB\nm54SbTGZSNTJR4NYzS+lf+EqrwCA7ojdkCx0TJsBV4XBXuKZg+ApujbKo+wf0NB6F3phk2iFJQUB\nKCriiV5rdrYXp06xxBA6x0mYNcuN/fstAeIxAHAhvlaVEAWA87fNQfszz4Y9ZqNNVDhOwt//7sLW\nrUmqTV20zt/M9oURelPelUQsQnWxnN9YEg9hzkQl4nPn1+0sIS8mg8TDNUebpvSQaGXiamWnT5vm\nQV6eer/x1au7m6V89pnL15TjG3YoGrhhxPeJ9gFoX722R2M2WhqXkyMiN1dSbeoCxDbbO17EZGIJ\nzaanRB0lYteHjXi8Qw15HKDWhayiwq1q5JXQtLLASEvr7oL27kEJA382nfi+9jk/lTf/e4DR0ri0\ntMDuYaSFkNaiIBrZ3nEhJhNDYj2/FAol9tA98jhAbW9VEIC5czvR2Qm8846xPXrFYLorKiAkgZhx\nHgmUz6+qsqCujpxAduIE66uJV0NZFCh7tv5EQ9pTEZMx0vGtLxDr+aVQKLGH7pF3EQ/7HwqkLOMp\nUzxYsKATOTkmw/tR3r/65z8Z3HCDXeVZCe+/78LIkdqXmP/5kvbR+yqxuub64vzG0+810aBzFx7x\nMG9qe+TUkHcRD1+Sgtl65N7kyBEG117bM0Ou0JfqnI1A68jDJ55+r4kGnbvwiId5o8luCYKRrmQ9\nOXZw4xLlsbq68CRLCwokDCBfW7DbJRQUGF8nUmnP6ELnl0LpmyRoYK3vYiTLWEsWlgQpVH/DDXKy\nmvKYglnJUp4H7roLeOqp0OdKSjqp0aBQKJQoQw15nBENzW6SIMimTeQENa8XqKmxoLjYaliydP16\noKOjA7t2WdDYyCI7W8SMGb2vaEahUCj9AWrI44xIZxkbVWELxoxkabwqmlEoFEp/gBryOCSSsrBa\noXotwpEs1dKUp1AoFEp0oIY8Domkh6sVqteC44DRI5xgj/d96UUKhUJJZGjWehwTiSxjoyps/nDw\nYMvARbho5lXIGHclMoqugq1siZw1R6FQKJS4gnrk/QBSqF4ra33LwAdx55mngDNdjzlO+toVulau\nie3gu+hLNdAUCoUSSaggTBfxUOwfbUjGUHksKUnC8eMsRo9w4qKZV4FznAx5vze/AGf2fRQSZo/m\n3CVKL+1wJF/7wzUXLejchQ+du/CIh3lTE4SJo1shJdqQktH8H8vLE8EePwW2vo74frahTpZ79etN\nHm3ivZd2eztQXGzFkSMcvF7zdfgUCoXSU+geOSUAMWsoxNw88nM5eXLiW4yIpspdpCgutqKmxtLV\n45yB18v46vApFAolFlBDTgmE59ExfQbxqY7pxTHNXjfUS1sQwB4/ht6w6s3Ncr09CaUOn0KhUKIN\nNeSUEFzlFRAW3gtvfgEkjoM3vwDCwnsj1gLVKFq9tPOz3fj+s4uRUdR7mfWHD7PwesnPKXX4FAqF\nEm3oHjklFIsFrpVr4CpdHtUWqHpoqdz9Mf0hDNzyB9/fZjPrBQE4XduGbDQiuSC88xs1SgTHgWjM\nOU5+nkKhUKINdRko6vC8nNjWi/Ve5eVuLFzYgfx8LzhOQn6+F/89/yyuO/cG8fUpu6tUw+yCAHz5\nJYNHH2axt7AMmddejaxrr4Q46mokLTbvzWdmyoltJEaO9BrOXqdQKJSeQD1ySlxDUrmzN50E94Lx\nzHr/EjaHg8V6PIAFeNL3fJZQC7zwB7zzCYfRe1aZKmurqmpTzVqnUCiUWEA98r5OLyaDRRJ/lTuz\nmfVKCZvDwcGKNtyC14nvvbDmTVSUmfPKU1OB6uo21NQ48dprAmpqnKiupqVnFAoldlBD3lfxeGAr\nW9KryWBRw0
RmfXAJWzYakQ8H8b35cOBfu0+HtebJzASKioyLwVAoFEqkoKH1PoqtfKkv+QuID5nV\nSKJk0KfsrgLbUAcxJw8d04tDMuubmhicqWvHhTiFRmSjEdlwYBhG4ETIMR3Ix2dNuWhqkmgXNwqF\nkjBQj7wvIghI2b2L+JRWMlhC0ZVZf2bfRzhz4FOc2feRvEDx3+D2ePD9ZxfjCDMKX+C7OIyRqEAp\n3sBNxEPuxExckJuKrCxqxCkUSuJADXkfhG3Sl1ntM2hk1tuWPYqBW/6AfPEkLJAwHCfxADYAEPEE\nFuEYhqMTHI5hOJ7AIjyMdZg+3UObslAolIQipqF1j8eDpUuXwuFwwOPxYPHixfjBD36Ao0ePory8\nHADw/e9/H4899lgsh9XnUJLBSI1PYi2z2msIAlIrXyY+NQ8v4uGf1uL/hOU4tv80/n06FxfkpuLu\nrmYsFAqFkkjE1JDv3LkTVqsV27Ztw5dffolHH30Ur776KioqKlBaWopLL70UixYtwvvvv49rr702\nlkPrW3Qlg/nvkSvEWma1t2Brj4NxOonPDcB5rPr5FxBHjoYgZKOpSUJWltAfpoVCofRBYmrIb775\nZtx4440AgIyMDLS0tMDtdqO+vh6XXnopAGDSpEk4ePAgNeQ9xGgyWN+FMfQ8qSMchUKhJBIxNeRJ\nSUm+/9+6dStuvPFGnD17Fmlpab7HBw8ejNOnT8dyWH2TOJFZ7S3EguGQ7HaiVy7ZB0AsGB77QalA\n6hNPoVAoRomaId+xYwd27NgR8Nh9992HoqIivPzyyzh06BCeffZZnDlzJuA1kqTvHV1wAQ+Lhdx1\nqieoNW1PbAYABVlR/5T4m7sBwLx5wFNPhTzDzrsLg2MwJ3p4PEBFxQDs3AmcPAkMGwbMnAmsWwdT\n6nL9lfi75hIHOnfhEa/zFrXbxaxZszBr1qyQx3fs2IHq6mo888wzSEpK8oXYFZqamjBkyBDNY589\nG/nyqcGDB+D06fMRP25/IG7n7tHHYOvwIKXqTbANDRBzctBRfCNcjz4GxMF4KyoGYMOG7r9PnAA2\nbADa2jqwciVNutMibq+5BIDOXXjEw7ypLSRiWn7mcDhQWVmJp59+GikpKQDkcPuFF16ITz75BACw\nZ88eFBUVxXJYiUkfkV6NKkqt+Qf/wJmDn+LMB/8IrTXvJQQBeJ2sFIvduy299rUKAnD8OEMvKwol\ngYjpHW3Hjh1oaWnBwoULfY89//zzKC0txbJlyyCKIi677DJcc801sRxWYuHxwFa+FCm7d4Gtr4OY\nm4eO6TPkJLY4MFBxiVJrHkc0NTFwkJViUV/P4tNPWYwZI8Zsz9y/sUx9PYvcXBHTu8rx6GVFocQ3\njGRkUzrOiEZ4Ix7CJkawlS0hlpUJC+/tNenVRJm7eEIQgOuuG4ATJ0Kf4zgJkoSYGtOyMrmxTDAL\nF8ZnmJ9ec+FD5y484mHe4iK0Tukh/UF6tZ/A83JiGwmvl4EoMnA4OGzcmILy8uSojiW4sYw/vRnm\np1AoxqCGPIHoV9Kr/YB162SPNz/fC5aVwHHk4Fi0jWlTE4P6evKtoKGBRVOTXk0+hULpTaghTyDM\n9uGmxDcWC7BypRv79gnYsUOA2iZXtI1pVpaE3FyR+FxOjkibyFAocQ415ImEiT7clMSB54ExY8Re\nM6Y8D0yfTu5TP3myB01NNIudQolnaD5qgkGlV/smPA8MHCgRM9kHDpSN+PHj0VN/U5rF7N5tQUMD\ni5wcEQMHSti714KtW5NpFjuFEsfQrPUu4iEj0RSCEDfSqwk3d3GC/7wJAjB+PI/6+lDFQptNRHq6\nhMbG6JeFKXKxzz2XhM2bQ7PYS0o6sHq1u7cvOXrN9QA6d+ERD/NGs9b7Ghp9uCmJhccDPPJIsmrC\nmcvFor6eC8lkj5Z4S3s7sGcPeZVQWZmMCRN4lJUlw0OOxlMolBhDg2QUSi9T
Xp6MyspQ71eLbduS\nsGuXJWJeur8gTF0dq5p4BzCoq+OwcaMcOSgtdRtq+EIbw1Ao0YMacgqlF9Gq4dbC6WShNHaTvXTZ\nsOqJt6gZ1PJysiCMFkYWE1QxjkKJPjS0TqH0Ilo13IAEnidnspPQqjf3eGT1tvHjeYwda8P48d3h\ncUEAqqrCW0yQQv7+KAsEh0P7dRQKJXyoIadQehGtGu7cXBGzZ3caPpZWvfmyZSkf6G0AAAsBSURB\nVLJBra/nIEkM6utlg7psWTKamhjU1akvJuR/xvBfTFDFOAolNlBDTqH0Ilo13DNmeFBR4fapv3Gc\nhLw8L+x2suG3WiVkZoYaXUEAKiuTiO+prExCUpIELjRZHgDAssAtt4S3mKCKcRRKbKCGnELpZcrL\nA411fr4XCxd2+PaRFfW3Awdc+OADAXPmkA2r08li7drQkHVtLQOnk2w0nU4GR4+y8HrJYxNF4OGH\njS8m/MVrqGIchRIbqCGnUHqZYGO9b5+AlSsDk8F4HhgxQk5QW7zYrWpIwwlZZ2ZKyM8nHy8/X0Ru\nrmR4MTF9useXRKcVbfB/HYVC6Rk0b5RCiRMUY61HczMDQSB72ErI2v84BQUS7HaJ6JXb7RK+/30J\n06d7fJnv/gQbZuW4JCU4JRvdH6Ovo1Ao4UMNeX8kjlThKOZRQtYOR6jhJYWseR4oKenEpk0kpbZO\n8Lx5g6tEEfTqyI2+jkKhhA815P0Jjwe28qVI2b0LbH0dxNw8dEyfIeu006LehEEJWet50P6sWOEG\ny8plZoqhLi7uNtThGlyjUQSjr6NQKOahd+9+hK18KfiNf/D9zTlO+v52rVzTW8OihEG0PGhqcCmU\nxIMa8v6CICBl9y7iUym7q+AqXU7D7AlEtD1oCoWSONCs9X4C23QKbH0d+bmGOrBNp2I8Ikok8M9m\np1Ao/RNqyPsJYtZQiLl55Ody8uTENwqFQqEkHNSQ9xd4Hh3TZxCf6pheTMPqFAqFkqDQPfJ+hKu8\nAoC8J8421EHMyUPH9GLf4xQKhUJJPKgh709YLHCtXANX6XJaR06hUCh9BGrI+yM8D3HEhb09CgqF\nQqFEALpHTqFQKBRKAkMNOYVCoVAoCQw15BQKhUKhJDDUkFMoFAqFksBQQ06hUCgUSgJDDTmFQqFQ\nKAkMNeQUCoVCoSQw1JBTKBQKhZLAUENOoVAoFEoCQw05hUKhUCgJDCNJktTbg6BQKBQKhRIe1COn\nUCgUCiWBoYacQqFQKJQEhhpyCoVCoVASGGrIKRQKhUJJYKghp1AoFAolgaGGnEKhUCiUBKZfG/Lm\n5mYsWLAAd955J0pKSvCvf/0LAHD06FGUlJSgpKQEy5cv7+VRxicejwdLlizBnDlzcPvtt+OTTz4B\nQOfOCB9//DHGjRuH9957z/cYnTfjrFq1CrNnz0ZJSQn+/e9/9/Zw4p7//Oc/mDx5Ml566SUAQGNj\nI+68807MmTMHixYtgtvt7uURxidr167F7Nmzceutt2LPnj1xPW/92pC/8cYbmDlzJv70pz/hwQcf\nxIYNGwAAFRUVKC0tRWVlJVpaWvD+++/38kjjj507d8JqtWLbtm2oqKjA6tWrAdC50+PkyZPYsmUL\nxowZE/A4nTdjfPzxx6itrcX27duxcuVKPP744709pLhGEAQ8/vjjGDdunO+xJ598EnPmzMG2bduQ\nm5uLV199tRdHGJ98+OGH+PLLL7F9+3Zs2rQJq1atiut569eGfN68ebjpppsAyKvUrKwsuN1u1NfX\n49JLLwUATJo0CQcPHuzNYcYlN998Mx599FEAQEZGBlpaWujcGWDw4MF4+umnYbfbfY/ReTPOwYMH\nMXnyZADARRddhNbWVjidzl4eVfySnJyMP/7xjxgyZIjvsY8++giTJk0CQK81NX74wx/6HLuBAwei\nra0trufN0tsD6G1Onz6Ne+65By6XC1u3
bsXZs2eRlpbme37w4ME4ffp0L44wPklKSvL9/9atW3Hj\njTfSuTOA1WoNeYzOm3G+/fZbjB492vd3ZmYmTp8+HbAwonRjsVhgsQTe5tva2pCcnAyAXmtqcBwH\nnucBADt27MCPfvQjfPDBB3E7b/3GkO/YsQM7duwIeOy+++5DUVERXnvtNbz//vt49NFH8etf/zrg\nNVTBVnvuXn75ZRw6dAjPPvsszpw5E/Ca/j53WvOmRX+fNy2C50aSJDAM00ujSUz854tea9q88847\nePXVV7F582bccMMNvsfjbd76jSGfNWsWZs2aFfDYxx9/jHPnzmHgwIG49tprsXjxYl+YWKGpqSkg\nLNUfIc0dIBuq6upqPPPMM0hKSqJzF4TavAVD5804WVlZ+Pbbb31/f/PNNxg0aFAvjijxsFqtaG//\n/+3dT0hUaxzG8a/jWIOgyIQOI2J/NrqwMZAWJeEg0SpwTIIShqBFGGji0MKaoaLC8YRpYFSCzkKH\nMCHLyIVCEARtMoOKyoigHBcRDEgEOWXdxaVzmztdM++F65l5PuDC85739fhy4OGc9/D+PuFwOHSv\nLeHevXtcuXKF/v5+8vLyVvW8ZfQa+eTkJDdu3ABgZmYGt9tNTk4OmzZtMr/Cnpyc/OUTVCaanZ1l\neHiYixcvsnbtWgDN3Qpp3pavurqaiYkJAJ49e0ZRUZFeq/+m7du3m3Ooe+3nPnz4wLlz5+jr66Og\noABY3fOW0dXP4vE47e3tfPz4kUQiQTAYZMuWLbx69YoTJ07w9etXKisrzY+65C/d3d2Mj49TXFxs\nHhsYGODt27eauyXcvXuXgYEBXr9+jdPppLCwkEgkonvuN3R1dTE1NUVWVhYnT56kvLz8/76kVevp\n06cYhsHc3Bx2ux2Xy0VXVxft7e0sLCxQXFxMOBxO+uZF4Nq1a/T29rJx40bzWGdnJ6FQaFXOW0YH\nuYiIiNVl9Kt1ERERq1OQi4iIWJiCXERExMIU5CIiIhamIBcREbEwBblImnjy5Ak7d+5M2nv89OnT\nGIaRcm5bWxvv3r1b9thjY2P/2BaJRNi3bx9+vx+fz8eFCxfMna/Kysq4fPly0vl+v59YLEYsFqOi\nogK/35/009/fv+zrEpEM2tlNJN1t3rwZn89HZ2cnZ8+eZWpqigcPHvy0SlNPT8+yx11cXOTSpUvU\n1dWltD18+JDbt28zMjKC3W4nkUjQ1NTE9PQ0VVVVrFu3jps3b+Lz+XC73Sn9nU4nQ0NDv/ePikgS\nPZGLpJGmpiZmZma4c+cOp06dIhwOmzvv/ai2tpY3b94wOjrK0aNHCQQC1NfX09zcnLKP9PHjx5mb\nm+PgwYMp48zPz/P582ezNvOaNWuIRCJmmVaHw0FLS4tZ5lZE/nsKcpE0YrfbMQyDQCBAbW0tFRUV\nv+zz6NEjOjo6GB0d5cWLFzx//jypvaWlBafTSSQSSem7Y8cONmzYQE1NDUeOHOHq1aspxXN2795N\nPB5fVWUfRdKJglwkzbx8+ZKSkhKmp6eXVaXJ4/HgcDjIysrC7XYzPz+/7L+Vk5NDb28v169fZ9u2\nbdy/f59du3bx+PHjpPOCwSDhcJgvX74kHY/H4ylr5H/vKyJL0xq5SBp5//493d3dDA0NYRgGg4OD\nHDhwYMk+2dnZSb8vFf7f94oHCAQCeDweFhcXKS0tpbS0lP3799PT08OtW7fweDxmv/LycrZu3Uo0\nGk0aT2vkIv+eglwkjQSDQQ4fPozL5SIUCtHQ0IDX62X9+vUrHtNms7GwsACA1+vF6/WabefPnyce\nj3PmzBlsNhvfvn0jFotRWVmZMk5raysNDQ2rptCESLpQkIukieHhYQDq6+uBP59229raOHbsGNFo\nFJttZStpRUVFuFwu9uzZQzQaJTc312xrbm7GMAz27t1Lbm4uiUSCqqoqGhsbU8bJz8/n0KFDhEIh\n89j3
V+s/KikpIRwOr+haRTKRqp+JiIhYmD52ExERsTAFuYiIiIUpyEVERCxMQS4iImJhCnIREREL\nU5CLiIhYmIJcRETEwhTkIiIiFvYHLXRVZPBZdpcAAAAASUVORK5CYII=\n",
"text/plain": [
"<matplotlib.figure.Figure at 0x7f245ae95860>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"#Build the scatter plot with the two types of transactions.\n",
"color_map = {0:'red', 1:'blue'}\n",
"plt.figure()\n",
"yy = yy_train.Class.values\n",
"allX = XX_train_2d[:,0]\n",
"allY = XX_train_2d[:,1]\n",
"plt.scatter(x = allX[yy==0], y = allY[yy==0], c = 'blue', label = 'Normal')\n",
"plt.scatter(x = allX[yy==1], y = allY[yy==1], c = 'red', label = 'Fraud')\n",
"plt.xlabel('X in t-SNE')\n",
"plt.ylabel('Y in t-SNE')\n",
"plt.legend(loc='upper left')\n",
"plt.title('t-SNE visualization of XX_train data')\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3) Stratification approach: GNG + COM\n",
"\n",
"Probably not relevant for this data which is not that small..."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Boosting methods\n",
"- XGBoost [x]\n",
"- LightGBM\n",
"- CatBoost"
]
},
{
"cell_type": "code",
"execution_count": 121,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"from sklearn.metrics import confusion_matrix\n",
"from sklearn.model_selection import GridSearchCV\n",
"\n",
"# Useful functions\n",
"def get_auc(m, Xtrain,ytrain, Xtest,ytest): \n",
" return (metrics.roc_auc_score(ytrain,m.predict_proba(Xtrain)[:,1]),\n",
" metrics.roc_auc_score(ytest,m.predict_proba(Xtest)[:,1]))\n",
"\n",
"def get_metrics(model,XX,yy): \n",
" cnf_matrix = confusion_matrix(yy,model.predict(XX))\n",
" # Get scores\n",
" res = ji_get_metrics(cnf_matrix)\n",
" print(\" Precision: %1.2f\"%(res.Precision))\n",
" print(\" Recall(TPR): %1.2f\"%(res.Recall))\n",
" print(\" BAcc=(TPR+TNR)/2: %1.2f\"%(res.BAcc))\n",
" \n",
" return res"
]
},
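{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Reference sketch: ji_get_metrics() is defined earlier in the notebook.\n",
"# A minimal equivalent (our own version, assuming sklearn's confusion-matrix\n",
"# layout [[TN, FP], [FN, TP]] and the attribute names used by get_metrics):\n",
"import numpy as np\n",
"\n",
"class matlab_like():  # same empty container as defined at the top\n",
"    pass\n",
"\n",
"def cm_scores(cnf_matrix):\n",
"    tn, fp, fn, tp = np.array(cnf_matrix).ravel()\n",
"    res = matlab_like()\n",
"    res.Precision = tp / (tp + fp)\n",
"    res.Recall = tp / (tp + fn)  # TPR\n",
"    res.BAcc = (res.Recall + tn / (tn + fp)) / 2  # (TPR+TNR)/2\n",
"    return res\n",
"\n",
"# e.g. the Exp1 test confusion matrix (TN=82434, FP=2865, FN=15, TP=129)\n",
"# gives Precision ~0.04, Recall ~0.90, BAcc ~0.93."
]
},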
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# XGBoost"
]
},
{
"cell_type": "code",
"execution_count": 218,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# XGBoost\n",
"# Install with conda:\n",
"# - Check file to instal: anaconda search -t conda xgboost\n",
"# - conda install -c mndrake xgboost (IF mndrake is listed...)\n",
"\n",
"import xgboost as xgb\n",
"from sklearn import metrics\n",
"\n",
"# Wraper function: Tune + Train + Test\n",
"def run_XGBoost(XX_train,yy_train,XX_test,yy_test,explabel,flag):\n",
" tic = time.time()\n",
"\n",
" # 1) Parameter Tuning \n",
" model = xgb.XGBClassifier()\n",
" if flag.set2use==0:\n",
" param_dist = {\"max_depth\": [10,30,50], # 27 combinations\n",
" \"min_child_weight\" : [1,3,6],\n",
" \"n_estimators\": [200],\n",
" \"learning_rate\": [0.01,0.05, 0.1]}\n",
" else: # 125 combinations\n",
" param_dist = {\"max_depth\": [10,30,50,70,90],\n",
" \"min_child_weight\" : [1,3,6,9,12],\n",
" \"n_estimators\": [200],\n",
" \"learning_rate\": [0.01,0.05, 0.1,0.15,0.2],}\n",
"\n",
" if flag.pars2use:\n",
" best = flag.pars2use\n",
" print('Using parameters:', best)\n",
" else:\n",
" # Grid search\n",
" grid_search = GridSearchCV(model, param_grid=param_dist, cv = 3,verbose=1, n_jobs=-1)\n",
" grid_search.fit(XX_train, yy_train)\n",
" toc = time.time()\n",
" print(\"Computation time (parameter tunning): %d seconds...\"%(toc-tic)) \n",
" best = grid_search.best_params_\n",
" print('Best parameters:', best)\n",
" \n",
" # 3) Run with best parameters\n",
" tic = time.time()\n",
" model = xgb.XGBClassifier(max_depth=best['max_depth'], min_child_weight=best['min_child_weight'], \\\n",
" n_estimators=best['n_estimators'],n_jobs=-1 , verbose=1,learning_rate=best['learning_rate'])\n",
" model.fit(XX_train,yy_train)\n",
"\n",
" # Evaluate\n",
" print('********** SUMMARY: XGBoost (%s) **************'%(explabel))\n",
" print('- Parameters used: ',best)\n",
" print('- AUC(Train|Test): ',get_auc(model,XX_train,yy_train,XX_test,yy_test))\n",
" print('- Metrics: Train data (n=%d) **'%len(yy_train))\n",
" res_train = get_metrics(model,XX_train,yy_train)\n",
" print('- Metrics: Test data (n=%d) **'%len(yy_test))\n",
" res_test = get_metrics(model,XX_test,yy_test)\n",
"\n",
" toc = time.time()\n",
" print(\"Computation time (Total): %d seconds\"%(toc-tic))\n",
" \n",
" return best"
]
},
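{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Aside (sketch only, not used in the experiments below): instead of\n",
"# resampling, XGBoost can reweight the positive class via its\n",
"# scale_pos_weight parameter. A common heuristic is n_negative/n_positive:\n",
"import numpy as np\n",
"\n",
"def pos_weight(yy):\n",
"    yy = np.asarray(yy)\n",
"    return (yy == 0).sum() / max((yy == 1).sum(), 1)\n",
"\n",
"# On the full data (284,315 normal vs 492 fraud) this is ~578; it would be\n",
"# passed as xgb.XGBClassifier(scale_pos_weight=pos_weight(yy_train), ...)"
]
},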
{
"cell_type": "code",
"execution_count": 219,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Fitting 3 folds for each of 125 candidates, totalling 375 fits\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"[Parallel(n_jobs=-1)]: Done 18 tasks | elapsed: 0.8s\n",
"[Parallel(n_jobs=-1)]: Done 168 tasks | elapsed: 3.2s\n",
"[Parallel(n_jobs=-1)]: Done 375 out of 375 | elapsed: 5.1s finished\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Computation time (parameter tunning): 5 seconds...\n",
"Best parameters: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp1-Undersampled) **************\n",
"- Parameters used: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99980182322631783, 0.98394699169327249)\n",
"- Metrics: Train data (n=696) **\n",
" TN=348, FP=0\n",
" FN=6, TP=342\n",
" Precision: 1.00\n",
" Recall(TPR): 0.98\n",
" BAcc=(TPR+TNR)/2: 0.99\n",
"- Metrics: Test data (n=85443) **\n",
" TN=82434, FP=2865\n",
" FN=15, TP=129\n",
" Precision: 0.04\n",
" Recall(TPR): 0.90\n",
" BAcc=(TPR+TNR)/2: 0.93\n",
"Computation time (Total): 0 seconds\n"
]
}
],
"source": [
"## EXPERIMENT 1 - Train data: undersampled, Test data: all\n",
"\n",
"# 1) Choose dataset\n",
"mylabel = 'Exp1-Undersampled'\n",
"XX_train = X_undersampled.copy()\n",
"yy_train = y_undersampled.copy()\n",
"\n",
"XX_test = X_test.copy()\n",
"yy_test = y_test.copy()\n",
"# 2) Run wrapper\n",
"flag = matlab_like()\n",
"flag.set2use = 1\n",
"flag.pars2use = {}\n",
"best1 = run_XGBoost(XX_train,yy_train,XX_test,yy_test,mylabel,flag)"
]
},
{
"cell_type": "code",
"execution_count": 220,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Fitting 3 folds for each of 125 candidates, totalling 375 fits\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"[Parallel(n_jobs=-1)]: Done 18 tasks | elapsed: 0.8s\n",
"[Parallel(n_jobs=-1)]: Done 168 tasks | elapsed: 3.2s\n",
"[Parallel(n_jobs=-1)]: Done 375 out of 375 | elapsed: 5.1s finished\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Computation time (parameter tunning): 5 seconds...\n",
"Best parameters: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp2-Undersampled (normalized)) **************\n",
"- Parameters used: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99980182322631783, 0.94721838767160227)\n",
"- Metrics: Train data (n=696) **\n",
" TN=348, FP=0\n",
" FN=6, TP=342\n",
" Precision: 1.00\n",
" Recall(TPR): 0.98\n",
" BAcc=(TPR+TNR)/2: 0.99\n",
"- Metrics: Test data (n=85443) **\n",
" TN=9371, FP=75928\n",
" FN=1, TP=143\n",
" Precision: 0.00\n",
" Recall(TPR): 0.99\n",
" BAcc=(TPR+TNR)/2: 0.55\n",
"Computation time (Total): 0 seconds\n"
]
}
],
"source": [
"## EXPERIMENT 2 - Train data: undersampled, Test data: all (normalized data)\n",
"\n",
"# 1) Choose dataset\n",
"mylabel = 'Exp2-Undersampled (normalized)'\n",
"XX_train = ji_normalize(X_undersampled,'standardize',[])\n",
"yy_train = y_undersampled.copy()\n",
"\n",
"XX_test = ji_normalize(X_test,'standardize',[])\n",
"yy_test = y_test.copy()\n",
"# 2) Run wrapper\n",
"flag = matlab_like()\n",
"flag.set2use = 1\n",
"flag.pars2use = {}\n",
"best2 = run_XGBoost(XX_train,yy_train,XX_test,yy_test,mylabel,flag)"
]
},
{
"cell_type": "code",
"execution_count": 224,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using parameters: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp3-SMOTE (best parameters:Exp1)) **************\n",
"- Parameters used: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99999814343887916, 0.98250410972643953)\n",
"- Metrics: Train data (n=398032) **\n",
" TN=198986, FP=30\n",
" FN=0, TP=199016\n",
" Precision: 1.00\n",
" Recall(TPR): 1.00\n",
" BAcc=(TPR+TNR)/2: 1.00\n",
"- Metrics: Test data (n=85443) **\n",
" TN=85266, FP=33\n",
" FN=20, TP=124\n",
" Precision: 0.79\n",
" Recall(TPR): 0.86\n",
" BAcc=(TPR+TNR)/2: 0.93\n",
"Computation time (Total): 87 seconds\n"
]
},
{
"data": {
"text/plain": [
"{'learning_rate': 0.05,\n",
" 'max_depth': 10,\n",
" 'min_child_weight': 6,\n",
" 'n_estimators': 200}"
]
},
"execution_count": 224,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"## EXPERIMENT 3 - Train data: SMOTE, Test data: all (best parameters from Exp1)\n",
"\n",
"# 1) Choose dataset\n",
"mylabel = 'Exp3-SMOTE (best parameters:Exp1)'\n",
"XX_train = X_train_SMOTE.copy()\n",
"yy_train = y_train_SMOTE.copy()\n",
"XX_test = X_test.copy()\n",
"yy_test = y_test.copy()\n",
"# 2) Run wrapper\n",
"flag = matlab_like()\n",
"flag.set2use = 0 # smaller set\n",
"flag.pars2use = best1 # will not run GridSearch if selected\n",
"run_XGBoost(XX_train,yy_train,XX_test,yy_test,mylabel,flag)"
]
},
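{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Sketch of the idea behind X_train_SMOTE (built earlier, e.g. with\n",
"# imblearn.over_sampling.SMOTE): each synthetic fraud sample interpolates\n",
"# between a minority point and one of its k nearest minority neighbours.\n",
"# Minimal numpy illustration (our own, not the code that built the data):\n",
"import numpy as np\n",
"\n",
"def smote_like(X_min, n_new, k=5, seed=0):\n",
"    rng = np.random.RandomState(seed)\n",
"    X_min = np.asarray(X_min, dtype=float)\n",
"    k = min(k, len(X_min) - 1)\n",
"    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)\n",
"    np.fill_diagonal(d, np.inf)              # exclude self\n",
"    nn = np.argsort(d, axis=1)[:, :k]        # k nearest neighbours\n",
"    i = rng.randint(len(X_min), size=n_new)  # base minority points\n",
"    j = nn[i, rng.randint(k, size=n_new)]    # one random neighbour each\n",
"    lam = rng.rand(n_new, 1)                 # interpolation factor in [0,1)\n",
"    return X_min[i] + lam * (X_min[j] - X_min[i])"
]
},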
{
"cell_type": "code",
"execution_count": 226,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using parameters: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp4-SMOTE2 (best parameters:Exp1)) **************\n",
"- Parameters used: {'learning_rate': 0.05, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99999833710233976, 0.98295973738131626)\n",
"- Metrics: Train data (n=398032) **\n",
" TN=198996, FP=20\n",
" FN=35, TP=198981\n",
" Precision: 1.00\n",
" Recall(TPR): 1.00\n",
" BAcc=(TPR+TNR)/2: 1.00\n",
"- Metrics: Test data (n=85443) **\n",
" TN=85275, FP=24\n",
" FN=23, TP=121\n",
" Precision: 0.83\n",
" Recall(TPR): 0.84\n",
" BAcc=(TPR+TNR)/2: 0.92\n",
"Computation time (Total): 76 seconds\n"
]
}
],
"source": [
"## EXPERIMENT 4 - Train data: SMOTE2, Test data: all (best parameters from Exp1)\n",
"\n",
"# 1) Choose dataset\n",
"mylabel = 'Exp4-SMOTE2 (best parameters:Exp1)'\n",
"XX_train = X_train_SMOTE2.copy()\n",
"yy_train = y_train_SMOTE2.copy()\n",
"XX_test = X_test.copy()\n",
"yy_test = y_test.copy()\n",
"# 2) Run wrapper\n",
"flag = matlab_like()\n",
"flag.set2use = 0 # smaller set\n",
"flag.pars2use = best1 # will not run GridSearch if selected\n",
"tmp = run_XGBoost(XX_train,yy_train,XX_test,yy_test,mylabel,flag)"
]
},
{
"cell_type": "code",
"execution_count": 235,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Fitting 3 folds for each of 125 candidates, totalling 375 fits\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"[Parallel(n_jobs=-1)]: Done 18 tasks | elapsed: 0.5s\n",
"[Parallel(n_jobs=-1)]: Done 304 tasks | elapsed: 2.8s\n",
"[Parallel(n_jobs=-1)]: Done 344 out of 375 | elapsed: 3.0s remaining: 0.3s\n",
"[Parallel(n_jobs=-1)]: Done 375 out of 375 | elapsed: 3.2s finished\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Computation time (parameter tunning): 3 seconds...\n",
"Best parameters: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp5-Undersampled) **************\n",
"- Parameters used: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99272526093275193, 0.9776254378389222)\n",
"- Metrics: Train data (n=696) **\n",
" TN=339, FP=9\n",
" FN=26, TP=322\n",
" Precision: 0.97\n",
" Recall(TPR): 0.93\n",
" BAcc=(TPR+TNR)/2: 0.95\n",
"- Metrics: Test data (n=85443) **\n",
" TN=82646, FP=2653\n",
" FN=17, TP=127\n",
" Precision: 0.05\n",
" Recall(TPR): 0.88\n",
" BAcc=(TPR+TNR)/2: 0.93\n",
"Computation time (Total): 0 seconds\n",
"Using parameters: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp5-SMOTE (best parameters from Undersampled)) **************\n",
"- Parameters used: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99986035084522029, 0.97706055398591363)\n",
"- Metrics: Train data (n=398032) **\n",
" TN=198679, FP=337\n",
" FN=687, TP=198329\n",
" Precision: 1.00\n",
" Recall(TPR): 1.00\n",
" BAcc=(TPR+TNR)/2: 1.00\n",
"- Metrics: Test data (n=85443) **\n",
" TN=85094, FP=205\n",
" FN=21, TP=123\n",
" Precision: 0.38\n",
" Recall(TPR): 0.85\n",
" BAcc=(TPR+TNR)/2: 0.93\n",
"Computation time (Total): 65 seconds\n"
]
}
],
"source": [
"## EXPERIMENT 5 - Train data: SMOTE + Selected Feats, Test data: all (best parameters from Undersampled data)\n",
"\n",
"# 1) Choose dataset\n",
"mylabel = 'Exp5-SMOTE (best parameters from Undersampled)'\n",
"iselected = X_train.columns[select_feature1.get_support()] # Univariate ANOVA\n",
"XX_train = X_train_SMOTE[iselected].copy()\n",
"yy_train = y_train_SMOTE.copy()\n",
"XX_test = X_test[iselected].copy()\n",
"yy_test = y_test.copy()\n",
"# 2) Get best parameters from Undersampled data\n",
"flag = matlab_like()\n",
"flag.set2use = 1 # smaller set\n",
"flag.pars2use = {} # will not run GridSearch if selected\n",
"best5 = run_XGBoost(X_undersampled[iselected],y_undersampled,XX_test,yy_test,'Exp5-Undersampled',flag)\n",
"# 3) Run on whole data with the best parameters from Undersampled data\n",
"flag = matlab_like()\n",
"flag.set2use = 0 # smaller set\n",
"flag.pars2use = best5 # will not run GridSearch if selected\n",
"best5 = run_XGBoost(XX_train,yy_train,XX_test,yy_test,mylabel,flag)\n"
]
},
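{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Sketch: select_feature1 was fitted earlier in the notebook; it is assumed\n",
"# to be a univariate ANOVA selector along these lines (k and the toy data\n",
"# here are ours, for illustration only):\n",
"import numpy as np\n",
"from sklearn.feature_selection import SelectKBest, f_classif\n",
"\n",
"Xt = np.array([[1.0, 5.0], [2.0, 6.0], [1.5, 0.0], [2.5, 1.0]])\n",
"yt = np.array([0, 0, 1, 1])\n",
"sel = SelectKBest(f_classif, k=1).fit(Xt, yt)\n",
"sel.get_support()  # boolean column mask, used above to index X_train.columns"
]
},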
{
"cell_type": "code",
"execution_count": 236,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Fitting 3 folds for each of 125 candidates, totalling 375 fits\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"[Parallel(n_jobs=-1)]: Done 18 tasks | elapsed: 0.5s\n",
"[Parallel(n_jobs=-1)]: Done 168 tasks | elapsed: 2.0s\n",
"[Parallel(n_jobs=-1)]: Done 375 out of 375 | elapsed: 3.2s finished\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Computation time (parameter tunning): 3 seconds...\n",
"Best parameters: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp6-Undersampled) **************\n",
"- Parameters used: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99272526093275193, 0.9776254378389222)\n",
"- Metrics: Train data (n=696) **\n",
" TN=339, FP=9\n",
" FN=26, TP=322\n",
" Precision: 0.97\n",
" Recall(TPR): 0.93\n",
" BAcc=(TPR+TNR)/2: 0.95\n",
"- Metrics: Test data (n=85443) **\n",
" TN=82646, FP=2653\n",
" FN=17, TP=127\n",
" Precision: 0.05\n",
" Recall(TPR): 0.88\n",
" BAcc=(TPR+TNR)/2: 0.93\n",
"Computation time (Total): 0 seconds\n",
"Using parameters: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp6-SMOTE2 (best parameters from Undersampled)) **************\n",
"- Parameters used: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 6, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99986703870436522, 0.95360087912975411)\n",
"- Metrics: Train data (n=398032) **\n",
" TN=198901, FP=115\n",
" FN=92, TP=198924\n",
" Precision: 1.00\n",
" Recall(TPR): 1.00\n",
" BAcc=(TPR+TNR)/2: 1.00\n",
"- Metrics: Test data (n=85443) **\n",
" TN=85223, FP=76\n",
" FN=25, TP=119\n",
" Precision: 0.61\n",
" Recall(TPR): 0.83\n",
" BAcc=(TPR+TNR)/2: 0.91\n",
"Computation time (Total): 60 seconds\n"
]
}
],
"source": [
"## EXPERIMENT 6 - Train data: SMOTE2 + Selected Feats, Test data: all (best parameters from Undersampled data)\n",
"\n",
"# 1) Choose dataset\n",
"mylabel = 'Exp6-SMOTE2 (best parameters from Undersampled)'\n",
"iselected = X_train.columns[select_feature1.get_support()] # Univariate ANOVA\n",
"XX_train = X_train_SMOTE2[iselected].copy()\n",
"yy_train = y_train_SMOTE2.copy()\n",
"XX_test = X_test[iselected].copy()\n",
"yy_test = y_test.copy()\n",
"# 2) Get best parameters from Undersampled data\n",
"flag = matlab_like()\n",
"flag.set2use = 1 # smaller set\n",
"flag.pars2use = {} # will not run GridSearch if selected\n",
"best6 = run_XGBoost(X_undersampled[iselected],y_undersampled,XX_test,yy_test,'Exp6-Undersampled',flag)\n",
"# 3) Run on whole data with the best parameters from Undersampled data\n",
"flag = matlab_like()\n",
"flag.set2use = 0 # smaller set\n",
"flag.pars2use = best6 # will not run GridSearch if selected\n",
"tmp = run_XGBoost(XX_train,yy_train,XX_test,yy_test,mylabel,flag)"
]
},
{
"cell_type": "code",
"execution_count": 238,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Selected feats: Index(['Time', 'V1', 'V2', 'V3', 'V4', 'V5', 'V7', 'V8', 'V10', 'V11', 'V12',\n",
" 'V14', 'V16', 'V17', 'Amount'],\n",
" dtype='object')\n",
"Fitting 3 folds for each of 125 candidates, totalling 375 fits\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"[Parallel(n_jobs=-1)]: Done 18 tasks | elapsed: 0.5s\n",
"[Parallel(n_jobs=-1)]: Done 304 tasks | elapsed: 2.8s\n",
"[Parallel(n_jobs=-1)]: Done 344 out of 375 | elapsed: 3.0s remaining: 0.3s\n",
"[Parallel(n_jobs=-1)]: Done 375 out of 375 | elapsed: 3.1s finished\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Computation time (parameter tunning): 4 seconds...\n",
"Best parameters: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 1, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp7-Undersampled) **************\n",
"- Parameters used: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 1, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99997522790328974, 0.97433619125403315)\n",
"- Metrics: Train data (n=696) **\n",
" TN=348, FP=0\n",
" FN=4, TP=344\n",
" Precision: 1.00\n",
" Recall(TPR): 0.99\n",
" BAcc=(TPR+TNR)/2: 0.99\n",
"- Metrics: Test data (n=85443) **\n",
" TN=80522, FP=4777\n",
" FN=13, TP=131\n",
" Precision: 0.03\n",
" Recall(TPR): 0.91\n",
" BAcc=(TPR+TNR)/2: 0.93\n",
"Computation time (Total): 0 seconds\n",
"Using parameters: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 1, 'n_estimators': 200}\n",
"********** SUMMARY: XGBoost (Exp7-SMOTE2 (best parameters from Undersampled)) **************\n",
"- Parameters used: {'learning_rate': 0.01, 'max_depth': 10, 'min_child_weight': 1, 'n_estimators': 200}\n",
"- AUC(Train|Test): (0.99986347039103807, 0.96300350661919953)\n",
"- Metrics: Train data (n=398032) **\n",
" TN=198934, FP=82\n",
" FN=47, TP=198969\n",
" Precision: 1.00\n",
" Recall(TPR): 1.00\n",
" BAcc=(TPR+TNR)/2: 1.00\n",
"- Metrics: Test data (n=85443) **\n",
" TN=85238, FP=61\n",
" FN=26, TP=118\n",
" Precision: 0.66\n",
" Recall(TPR): 0.82\n",
" BAcc=(TPR+TNR)/2: 0.91\n",
"Computation time (Total): 57 seconds\n"
]
}
],
"source": [
"## EXPERIMENT 7 - Train data: SMOTE2 + Selected Feats (Chi2), Test data: all (best parameters from Undersampled data)\n",
"\n",
"# 1) Choose dataset\n",
"mylabel = 'Exp7-SMOTE2 (best parameters from Undersampled)'\n",
"iselected = X_train.columns[select_feature2.get_support()] # Chi2\n",
"print('Selected feats: ',iselected)\n",
"XX_train = X_train_SMOTE2[iselected].copy()\n",
"yy_train = y_train_SMOTE2.copy()\n",
"XX_test = X_test[iselected].copy()\n",
"yy_test = y_test.copy()\n",
"# 2) Get best parameters from Undersampled data\n",
"flag = matlab_like()\n",
"flag.set2use = 1 # smaller set\n",
"flag.pars2use = {} # will not run GridSearch if selected\n",
"tmp = run_XGBoost(X_undersampled[iselected],y_undersampled,XX_test,yy_test,'Exp7-Undersampled',flag)\n",
"# 3) Run on whole data with the best parameters from Undersampled data\n",
"flag = matlab_like()\n",
"flag.set2use = 0 # smaller set\n",
"flag.pars2use = tmp # will not run GridSearch if selected\n",
"best7 = run_XGBoost(XX_train,yy_train,XX_test,yy_test,mylabel,flag)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.1"
}
},
"nbformat": 4,
"nbformat_minor": 2
}