{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.1"
},
"colab": {
"name": "Xiu_monthly (P CRSP CV RF 0314).ipynb",
"provenance": [],
"collapsed_sections": [],
"toc_visible": true,
"machine_shape": "hm",
"include_colab_link": true
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/ginward/a7559784e96a1aecad813b97b9ba9089/xiu_monthly-p-crsp-cv-rf-0314.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YX8cgPsuPUd6",
"colab_type": "text"
},
"source": [
"# 1. Introduction"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "QCPSfZqhPUd9",
"colab_type": "text"
},
"source": [
"This notebook is to replicate the main result of the paper: \n",
"Gu, Shihao, Bryan T. Kelly, and Dacheng Xiu. \"Autoencoder asset pricing models.\" Available at SSRN (2019).\n",
"\n",
"Please refer to the requirement.txt for environment configuration."
]
},
{
"cell_type": "code",
"metadata": {
"id": "2ia_JKmlPUd-",
"colab_type": "code",
"colab": {}
},
"source": [
"import numpy as np\n",
"import pandas as pd\n",
"import h5py\n",
"from matplotlib import pyplot as plt\n",
"from scipy import stats\n",
"import torch as t\n",
"from torch import nn\n",
"import os\n",
"from torch import optim\n",
"import warnings\n",
"warnings.filterwarnings(\"ignore\")\n",
"t.manual_seed(1)\n",
"t.cuda.manual_seed(1)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "7xd1MpSTPUeB",
"colab_type": "code",
"outputId": "134b62de-1eb9-407e-ff10-acdd5287ae51",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"print(t.__version__)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"1.4.0\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "6urp-yfyPUeF",
"colab_type": "code",
"outputId": "b5d1f09e-dd7b-4904-8303-380b208c737d",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"print(t.version.cuda)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"10.1\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "psJfSgLcPUeI",
"colab_type": "text"
},
"source": [
"If you should run this code locally, you should make sure `t.cuda.is_available() == True`, otherwise you should install the right pytorch matches your cuda version.\n",
"If you want to run the code only on CPU, then please remove every `.cuda()` in the code."
]
},
{
"cell_type": "code",
"metadata": {
"id": "RZbBFDRxPUeI",
"colab_type": "code",
"outputId": "1effe334-6279-4a18-920c-b1fbf5885137",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"t.cuda.is_available()"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"True"
]
},
"metadata": {
"tags": []
},
"execution_count": 4
}
]
},
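{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a minimal device-agnostic sketch (an addition, not part of the original pipeline): rather than deleting every `.cuda()` call by hand for a CPU run, one can select the device once and move tensors and modules with `.to(device)`. The name `device` below is illustrative and is not used elsewhere in this notebook."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Pick the GPU when available, otherwise fall back to the CPU\n",
"device = t.device('cuda' if t.cuda.is_available() else 'cpu')\n",
"\n",
"# Example: move a tensor and a module to the chosen device\n",
"x = t.randn(4, 3).to(device)\n",
"layer = nn.Linear(3, 2).to(device)\n",
"print(layer(x).device)"
],
"execution_count": 0,
"outputs": []
},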
{
"cell_type": "markdown",
"metadata": {
"id": "xma_fGFIPUeL",
"colab_type": "text"
},
"source": [
"# 2. Data "
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "RZ6JEeHlPUeM",
"colab_type": "text"
},
"source": [
"## 2.1 Overview"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "MrCl5DjBPUeN",
"colab_type": "text"
},
"source": [
"We used the `datashare.csv` updated by the author of this paper"
]
},
{
"cell_type": "code",
"metadata": {
"id": "mPkbG5XaUf-h",
"colab_type": "code",
"outputId": "fbe463ef-fb11-4ef1-d181-0ded61fb4ba3",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"# Connect with Google Drive\n",
"from google.colab import drive\n",
"drive.mount('/content/drive', force_remount = True)\n",
"cd drive/My Drive/Colab Notebooks\n",
"data = pd.read_csv('./xiu_month_rf.csv')\n",
"#we don't want return to contain nan\n",
"data=data[data['return'].isnull()==False]\n",
"#RF is in percentages\n",
"data['RF'] = data['RF']/100\n",
"data['return'] = data['return'] - data['RF']\n",
"#drop the unnecessary columns\n",
"data = data.drop(columns = [\"MONTH\", \"RF\"])\n",
"#shift the return to future return\n",
"data['return']=data['return'].shift(periods=1)\n",
"#we don't want return to contain nan\n",
"data=data[data['return'].isnull()==False]"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"Mounted at /content/drive\n"
],
"name": "stdout"
}
]
},
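{
"cell_type": "markdown",
"metadata": {},
"source": [
"To illustrate what the `shift` above does (a toy sketch on made-up numbers, independent of the CRSP data): `shift(periods=1)` moves each value down one row, so row *t* receives the value from row *t-1* and the first row becomes NaN."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Toy illustration of pandas shift semantics\n",
"toy = pd.Series([0.01, 0.02, 0.03])\n",
"print(toy.shift(periods=1))  # NaN, 0.01, 0.02"
],
"execution_count": 0,
"outputs": []
},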
{
"cell_type": "code",
"metadata": {
"id": "XdzhwhSRQ9Z5",
"colab_type": "code",
"outputId": "105e5586-bf5c-4c8b-afc7-c01ccd0357ee",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 289
}
},
"source": [
"data.columns"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"Index(['permno', 'DATE', 'mvel1', 'beta', 'betasq', 'chmom', 'dolvol',\n",
" 'idiovol', 'indmom', 'mom1m', 'mom6m', 'mom12m', 'mom36m', 'pricedelay',\n",
" 'turn', 'absacc', 'acc', 'age', 'agr', 'bm', 'bm_ia', 'cashdebt',\n",
" 'cashpr', 'cfp', 'cfp_ia', 'chatoia', 'chcsho', 'chempia', 'chinv',\n",
" 'chpmia', 'convind', 'currat', 'depr', 'divi', 'divo', 'dy', 'egr',\n",
" 'ep', 'gma', 'grcapx', 'grltnoa', 'herf', 'hire', 'invest', 'lev',\n",
" 'lgr', 'mve_ia', 'operprof', 'orgcap', 'pchcapx_ia', 'pchcurrat',\n",
" 'pchdepr', 'pchgm_pchsale', 'pchquick', 'pchsale_pchinvt',\n",
" 'pchsale_pchrect', 'pchsale_pchxsga', 'pchsaleinv', 'pctacc', 'ps',\n",
" 'quick', 'rd', 'rd_mve', 'rd_sale', 'realestate', 'roic', 'salecash',\n",
" 'saleinv', 'salerec', 'secured', 'securedind', 'sgr', 'sin', 'sp',\n",
" 'tang', 'tb', 'aeavol', 'cash', 'chtx', 'cinvest', 'ear', 'nincr',\n",
" 'roaq', 'roavol', 'roeq', 'rsup', 'stdacc', 'stdcf', 'ms', 'baspread',\n",
" 'ill', 'maxret', 'retvol', 'std_dolvol', 'std_turn', 'zerotrade',\n",
" 'sic2', 'PERMNO', 'date', 'return'],\n",
" dtype='object')"
]
},
"metadata": {
"tags": []
},
"execution_count": 14
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "ZFfYUw7jPUeT",
"colab_type": "code",
"outputId": "bd676618-c678-4382-e123-863b1f864967",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"data.shape"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"(3839444, 100)"
]
},
"metadata": {
"tags": []
},
"execution_count": 15
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "5VLqDjQqPUeZ",
"colab_type": "code",
"colab": {}
},
"source": [
"summary = data.describe()"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "w1hc8hFaPUeb",
"colab_type": "code",
"outputId": "33cc1369-6e70-4b42-87a7-a24766ac1050",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 380
}
},
"source": [
"summary"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>permno</th>\n",
" <th>DATE</th>\n",
" <th>mvel1</th>\n",
" <th>beta</th>\n",
" <th>betasq</th>\n",
" <th>chmom</th>\n",
" <th>dolvol</th>\n",
" <th>idiovol</th>\n",
" <th>indmom</th>\n",
" <th>mom1m</th>\n",
" <th>mom6m</th>\n",
" <th>mom12m</th>\n",
" <th>mom36m</th>\n",
" <th>pricedelay</th>\n",
" <th>turn</th>\n",
" <th>absacc</th>\n",
" <th>acc</th>\n",
" <th>age</th>\n",
" <th>agr</th>\n",
" <th>bm</th>\n",
" <th>bm_ia</th>\n",
" <th>cashdebt</th>\n",
" <th>cashpr</th>\n",
" <th>cfp</th>\n",
" <th>cfp_ia</th>\n",
" <th>chatoia</th>\n",
" <th>chcsho</th>\n",
" <th>chempia</th>\n",
" <th>chinv</th>\n",
" <th>chpmia</th>\n",
" <th>convind</th>\n",
" <th>currat</th>\n",
" <th>depr</th>\n",
" <th>divi</th>\n",
" <th>divo</th>\n",
" <th>dy</th>\n",
" <th>egr</th>\n",
" <th>ep</th>\n",
" <th>gma</th>\n",
" <th>grcapx</th>\n",
" <th>...</th>\n",
" <th>ps</th>\n",
" <th>quick</th>\n",
" <th>rd</th>\n",
" <th>rd_mve</th>\n",
" <th>rd_sale</th>\n",
" <th>realestate</th>\n",
" <th>roic</th>\n",
" <th>salecash</th>\n",
" <th>saleinv</th>\n",
" <th>salerec</th>\n",
" <th>secured</th>\n",
" <th>securedind</th>\n",
" <th>sgr</th>\n",
" <th>sin</th>\n",
" <th>sp</th>\n",
" <th>tang</th>\n",
" <th>tb</th>\n",
" <th>aeavol</th>\n",
" <th>cash</th>\n",
" <th>chtx</th>\n",
" <th>cinvest</th>\n",
" <th>ear</th>\n",
" <th>nincr</th>\n",
" <th>roaq</th>\n",
" <th>roavol</th>\n",
" <th>roeq</th>\n",
" <th>rsup</th>\n",
" <th>stdacc</th>\n",
" <th>stdcf</th>\n",
" <th>ms</th>\n",
" <th>baspread</th>\n",
" <th>ill</th>\n",
" <th>maxret</th>\n",
" <th>retvol</th>\n",
" <th>std_dolvol</th>\n",
" <th>std_turn</th>\n",
" <th>zerotrade</th>\n",
" <th>sic2</th>\n",
" <th>PERMNO</th>\n",
" <th>return</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>count</th>\n",
" <td>3.839444e+06</td>\n",
" <td>3.839444e+06</td>\n",
" <td>3.755557e+06</td>\n",
" <td>3.389347e+06</td>\n",
" <td>3.389347e+06</td>\n",
" <td>3.446847e+06</td>\n",
" <td>3.411209e+06</td>\n",
" <td>3.389347e+06</td>\n",
" <td>3.758478e+06</td>\n",
" <td>3.729726e+06</td>\n",
" <td>3.615102e+06</td>\n",
" <td>3.446847e+06</td>\n",
" <td>2.850971e+06</td>\n",
" <td>3.389275e+06</td>\n",
" <td>3.413138e+06</td>\n",
" <td>2.326587e+06</td>\n",
" <td>2.326587e+06</td>\n",
" <td>2.832029e+06</td>\n",
" <td>2.596519e+06</td>\n",
" <td>2.782970e+06</td>\n",
" <td>2.782970e+06</td>\n",
" <td>2.687636e+06</td>\n",
" <td>2.786218e+06</td>\n",
" <td>2.510444e+06</td>\n",
" <td>2.510444e+06</td>\n",
" <td>2.355057e+06</td>\n",
" <td>2.592894e+06</td>\n",
" <td>2.562909e+06</td>\n",
" <td>2.512856e+06</td>\n",
" <td>2.555792e+06</td>\n",
" <td>2.832029e+06</td>\n",
" <td>2.710289e+06</td>\n",
" <td>2.636130e+06</td>\n",
" <td>2.596562e+06</td>\n",
" <td>2.596562e+06</td>\n",
" <td>2.818892e+06</td>\n",
" <td>2.569202e+06</td>\n",
" <td>2.828959e+06</td>\n",
" <td>2.589928e+06</td>\n",
" <td>2.233329e+06</td>\n",
" <td>...</td>\n",
" <td>2.596562e+06</td>\n",
" <td>2.683265e+06</td>\n",
" <td>2.596562e+06</td>\n",
" <td>1.328890e+06</td>\n",
" <td>1.311279e+06</td>\n",
" <td>950982.000000</td>\n",
" <td>2.687410e+06</td>\n",
" <td>2.794359e+06</td>\n",
" <td>2.194654e+06</td>\n",
" <td>2.706516e+06</td>\n",
" <td>1.436380e+06</td>\n",
" <td>2.832029e+06</td>\n",
" <td>2.560932e+06</td>\n",
" <td>2.832029e+06</td>\n",
" <td>2.821471e+06</td>\n",
" <td>2.649309e+06</td>\n",
" <td>2.497247e+06</td>\n",
" <td>2.225224e+06</td>\n",
" <td>2.149641e+06</td>\n",
" <td>2.061693e+06</td>\n",
" <td>2.039079e+06</td>\n",
" <td>2.252931e+06</td>\n",
" <td>2.255226e+06</td>\n",
" <td>2.171228e+06</td>\n",
" <td>1.851443e+06</td>\n",
" <td>2.232022e+06</td>\n",
" <td>2.192266e+06</td>\n",
" <td>1.435203e+06</td>\n",
" <td>1.435203e+06</td>\n",
" <td>2.190799e+06</td>\n",
" <td>3.757942e+06</td>\n",
" <td>3.450330e+06</td>\n",
" <td>3.758039e+06</td>\n",
" <td>3.755459e+06</td>\n",
" <td>3.442552e+06</td>\n",
" <td>3.452972e+06</td>\n",
" <td>3.448858e+06</td>\n",
" <td>3.499786e+06</td>\n",
" <td>3.839444e+06</td>\n",
" <td>3.839444e+06</td>\n",
" </tr>\n",
" <tr>\n",
" <th>mean</th>\n",
" <td>5.641562e+04</td>\n",
" <td>1.992466e+07</td>\n",
" <td>1.076403e+06</td>\n",
" <td>1.010055e+00</td>\n",
" <td>1.445190e+00</td>\n",
" <td>1.860014e-03</td>\n",
" <td>1.078846e+01</td>\n",
" <td>6.057841e-02</td>\n",
" <td>1.244846e-01</td>\n",
" <td>9.094527e-03</td>\n",
" <td>4.889251e-02</td>\n",
" <td>1.174339e-01</td>\n",
" <td>3.129910e-01</td>\n",
" <td>1.571773e-01</td>\n",
" <td>1.007332e+00</td>\n",
" <td>9.042886e-02</td>\n",
" <td>-2.030275e-02</td>\n",
" <td>1.125793e+01</td>\n",
" <td>-1.568375e-01</td>\n",
" <td>2.492937e+00</td>\n",
" <td>-5.002076e-01</td>\n",
" <td>2.712186e-02</td>\n",
" <td>-5.329307e-01</td>\n",
" <td>7.683458e-02</td>\n",
" <td>-1.520426e-01</td>\n",
" <td>-4.775059e-04</td>\n",
" <td>1.136873e-01</td>\n",
" <td>-9.235074e-02</td>\n",
" <td>1.270485e-02</td>\n",
" <td>1.737038e-01</td>\n",
" <td>1.403806e-01</td>\n",
" <td>3.737107e+00</td>\n",
" <td>2.551475e-01</td>\n",
" <td>3.115427e-02</td>\n",
" <td>3.039750e-02</td>\n",
" <td>2.093203e-02</td>\n",
" <td>1.402474e-01</td>\n",
" <td>-1.813826e-02</td>\n",
" <td>3.571889e-01</td>\n",
" <td>9.057914e-01</td>\n",
" <td>...</td>\n",
" <td>4.165959e+00</td>\n",
" <td>3.046227e+00</td>\n",
" <td>1.269113e-01</td>\n",
" <td>5.908676e-02</td>\n",
" <td>4.936388e-01</td>\n",
" <td>0.261202</td>\n",
" <td>-6.514091e-02</td>\n",
" <td>5.000005e+01</td>\n",
" <td>2.697968e+01</td>\n",
" <td>1.158660e+01</td>\n",
" <td>5.626192e-01</td>\n",
" <td>4.035647e-01</td>\n",
" <td>1.914847e-01</td>\n",
" <td>8.873497e-03</td>\n",
" <td>2.217544e+00</td>\n",
" <td>5.379291e-01</td>\n",
" <td>-1.637022e-01</td>\n",
" <td>8.098581e-01</td>\n",
" <td>1.569229e-01</td>\n",
" <td>1.005715e-03</td>\n",
" <td>2.015679e-01</td>\n",
" <td>3.186940e-03</td>\n",
" <td>1.014422e+00</td>\n",
" <td>2.020900e-04</td>\n",
" <td>2.637633e-02</td>\n",
" <td>7.131660e-03</td>\n",
" <td>1.897089e-02</td>\n",
" <td>6.270030e+00</td>\n",
" <td>1.280745e+01</td>\n",
" <td>3.623760e+00</td>\n",
" <td>5.419741e-02</td>\n",
" <td>5.160812e-06</td>\n",
" <td>7.005861e-02</td>\n",
" <td>3.070398e-02</td>\n",
" <td>8.683051e-01</td>\n",
" <td>4.089129e+00</td>\n",
" <td>1.515324e+00</td>\n",
" <td>4.742094e+01</td>\n",
" <td>5.641562e+04</td>\n",
" <td>-1.119134e-01</td>\n",
" </tr>\n",
" <tr>\n",
" <th>std</th>\n",
" <td>2.728803e+04</td>\n",
" <td>1.418895e+05</td>\n",
" <td>4.816796e+06</td>\n",
" <td>6.487547e-01</td>\n",
" <td>1.726207e+00</td>\n",
" <td>5.394289e-01</td>\n",
" <td>2.925163e+00</td>\n",
" <td>3.673976e-02</td>\n",
" <td>2.861871e-01</td>\n",
" <td>1.477721e-01</td>\n",
" <td>3.511340e-01</td>\n",
" <td>5.610802e-01</td>\n",
" <td>9.157298e-01</td>\n",
" <td>1.353391e+00</td>\n",
" <td>1.046255e+01</td>\n",
" <td>9.871476e-02</td>\n",
" <td>1.274767e-01</td>\n",
" <td>1.009376e+01</td>\n",
" <td>4.207610e-01</td>\n",
" <td>2.676159e+01</td>\n",
" <td>2.445407e+01</td>\n",
" <td>1.539870e+01</td>\n",
" <td>8.345835e+01</td>\n",
" <td>1.331546e+00</td>\n",
" <td>6.279741e+00</td>\n",
" <td>2.193894e-01</td>\n",
" <td>3.242902e-01</td>\n",
" <td>9.347798e-01</td>\n",
" <td>5.622807e-02</td>\n",
" <td>5.786258e+00</td>\n",
" <td>3.473816e-01</td>\n",
" <td>1.229410e+01</td>\n",
" <td>3.944317e-01</td>\n",
" <td>1.737346e-01</td>\n",
" <td>1.716785e-01</td>\n",
" <td>3.721700e-02</td>\n",
" <td>6.520390e-01</td>\n",
" <td>3.597276e-01</td>\n",
" <td>3.356763e-01</td>\n",
" <td>4.525626e+00</td>\n",
" <td>...</td>\n",
" <td>1.715931e+00</td>\n",
" <td>1.229985e+01</td>\n",
" <td>3.328736e-01</td>\n",
" <td>1.044977e-01</td>\n",
" <td>3.790412e+00</td>\n",
" <td>0.281068</td>\n",
" <td>8.391528e-01</td>\n",
" <td>1.560374e+02</td>\n",
" <td>7.889359e+01</td>\n",
" <td>5.231783e+01</td>\n",
" <td>5.227797e-01</td>\n",
" <td>4.906122e-01</td>\n",
" <td>5.810332e-01</td>\n",
" <td>9.378039e-02</td>\n",
" <td>3.616400e+00</td>\n",
" <td>1.534417e-01</td>\n",
" <td>1.809995e+00</td>\n",
" <td>2.067140e+00</td>\n",
" <td>2.028815e-01</td>\n",
" <td>1.151145e-02</td>\n",
" <td>2.490160e+01</td>\n",
" <td>7.913224e-02</td>\n",
" <td>1.369501e+00</td>\n",
" <td>5.196299e-02</td>\n",
" <td>4.766587e-02</td>\n",
" <td>2.097163e-01</td>\n",
" <td>2.147489e-01</td>\n",
" <td>7.501391e+01</td>\n",
" <td>1.355946e+02</td>\n",
" <td>1.674192e+00</td>\n",
" <td>7.299035e-02</td>\n",
" <td>2.717579e-05</td>\n",
" <td>7.102210e-02</td>\n",
" <td>2.551464e-02</td>\n",
" <td>4.011955e-01</td>\n",
" <td>9.200734e+00</td>\n",
" <td>3.543730e+00</td>\n",
" <td>1.970926e+01</td>\n",
" <td>2.728803e+04</td>\n",
" <td>2.036537e+00</td>\n",
" </tr>\n",
" <tr>\n",
" <th>min</th>\n",
" <td>1.000000e+04</td>\n",
" <td>1.957033e+07</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.933279e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-9.062534e+00</td>\n",
" <td>-3.060271e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-7.794856e-01</td>\n",
" <td>-6.981132e-01</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-8.229368e+02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.257607e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>-6.033304e+00</td>\n",
" <td>-2.439120e+01</td>\n",
" <td>-7.759533e+02</td>\n",
" <td>-1.024800e+04</td>\n",
" <td>-8.364906e+03</td>\n",
" <td>-2.057420e+02</td>\n",
" <td>-1.912332e+02</td>\n",
" <td>-1.314998e+00</td>\n",
" <td>-8.998305e-01</td>\n",
" <td>-5.622415e+01</td>\n",
" <td>-3.003145e-01</td>\n",
" <td>-1.574528e+02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.418063e-03</td>\n",
" <td>-9.838288e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-3.313476e+00</td>\n",
" <td>-6.199143e+00</td>\n",
" <td>-1.821396e+01</td>\n",
" <td>-9.219229e-01</td>\n",
" <td>-4.047032e+02</td>\n",
" <td>...</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.418063e-03</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-3.411239e-02</td>\n",
" <td>-9.038462e+01</td>\n",
" <td>0.000000</td>\n",
" <td>-1.717197e+01</td>\n",
" <td>-1.591636e+03</td>\n",
" <td>-1.066224e+02</td>\n",
" <td>-2.179600e+04</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-3.594196e+01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-4.219112e+01</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-1.431838e-01</td>\n",
" <td>-1.607357e-01</td>\n",
" <td>-2.833333e+03</td>\n",
" <td>-4.754902e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-5.334442e-01</td>\n",
" <td>4.202608e-06</td>\n",
" <td>-1.720769e+02</td>\n",
" <td>-3.343933e+01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>5.354650e-06</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.979275e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-7.335907e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.954854e-12</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.000000e+04</td>\n",
" <td>-3.250224e+02</td>\n",
" </tr>\n",
" <tr>\n",
" <th>25%</th>\n",
" <td>3.066400e+04</td>\n",
" <td>1.982083e+07</td>\n",
" <td>2.020406e+04</td>\n",
" <td>5.399023e-01</td>\n",
" <td>2.973754e-01</td>\n",
" <td>-2.380952e-01</td>\n",
" <td>8.755725e+00</td>\n",
" <td>3.433070e-02</td>\n",
" <td>-4.968952e-02</td>\n",
" <td>-6.150061e-02</td>\n",
" <td>-1.359311e-01</td>\n",
" <td>-1.927081e-01</td>\n",
" <td>-2.142376e-01</td>\n",
" <td>-5.276549e-02</td>\n",
" <td>1.902249e-01</td>\n",
" <td>2.971359e-02</td>\n",
" <td>-7.369294e-02</td>\n",
" <td>4.000000e+00</td>\n",
" <td>-2.029546e-01</td>\n",
" <td>3.374387e-01</td>\n",
" <td>-4.111622e-01</td>\n",
" <td>1.393792e-02</td>\n",
" <td>-8.487275e+00</td>\n",
" <td>-4.047826e-02</td>\n",
" <td>-1.445665e-01</td>\n",
" <td>-7.102396e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.809318e-01</td>\n",
" <td>-1.544846e-03</td>\n",
" <td>-1.310713e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.227175e+00</td>\n",
" <td>9.551871e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-2.783572e-02</td>\n",
" <td>-3.110659e-03</td>\n",
" <td>1.141927e-01</td>\n",
" <td>-3.729642e-01</td>\n",
" <td>...</td>\n",
" <td>3.000000e+00</td>\n",
" <td>8.788656e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>3.133918e-03</td>\n",
" <td>2.423726e-03</td>\n",
" <td>0.107551</td>\n",
" <td>6.210056e-03</td>\n",
" <td>2.632329e+00</td>\n",
" <td>4.606195e+00</td>\n",
" <td>3.733727e+00</td>\n",
" <td>1.508237e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.089539e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>4.255818e-01</td>\n",
" <td>4.696129e-01</td>\n",
" <td>-7.161761e-01</td>\n",
" <td>-2.652705e-01</td>\n",
" <td>2.255238e-02</td>\n",
" <td>-1.279941e-03</td>\n",
" <td>-2.841891e-02</td>\n",
" <td>-3.226043e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-5.471000e-05</td>\n",
" <td>5.151292e-03</td>\n",
" <td>7.037298e-04</td>\n",
" <td>-4.466318e-03</td>\n",
" <td>7.948247e-02</td>\n",
" <td>8.468278e-02</td>\n",
" <td>2.000000e+00</td>\n",
" <td>1.929526e-02</td>\n",
" <td>1.071943e-08</td>\n",
" <td>2.777778e-02</td>\n",
" <td>1.443068e-02</td>\n",
" <td>5.548521e-01</td>\n",
" <td>7.459210e-01</td>\n",
" <td>2.006704e-08</td>\n",
" <td>3.400000e+01</td>\n",
" <td>3.066400e+04</td>\n",
" <td>-8.733333e-02</td>\n",
" </tr>\n",
" <tr>\n",
" <th>50%</th>\n",
" <td>6.175100e+04</td>\n",
" <td>1.994063e+07</td>\n",
" <td>8.537100e+04</td>\n",
" <td>9.385057e-01</td>\n",
" <td>8.844134e-01</td>\n",
" <td>-4.508997e-03</td>\n",
" <td>1.065801e+01</td>\n",
" <td>5.129154e-02</td>\n",
" <td>1.041029e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>2.222195e-02</td>\n",
" <td>5.333331e-02</td>\n",
" <td>1.514337e-01</td>\n",
" <td>6.893335e-02</td>\n",
" <td>4.594873e-01</td>\n",
" <td>6.236354e-02</td>\n",
" <td>-1.928674e-02</td>\n",
" <td>8.000000e+00</td>\n",
" <td>-7.557447e-02</td>\n",
" <td>6.493884e-01</td>\n",
" <td>-1.081960e-01</td>\n",
" <td>1.283873e-01</td>\n",
" <td>-6.678712e-01</td>\n",
" <td>4.772470e-02</td>\n",
" <td>-6.425090e-03</td>\n",
" <td>1.591766e-03</td>\n",
" <td>6.850432e-03</td>\n",
" <td>-6.001767e-02</td>\n",
" <td>2.775582e-04</td>\n",
" <td>-3.205090e-03</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.978028e+00</td>\n",
" <td>1.491666e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>4.105658e-03</td>\n",
" <td>7.785754e-02</td>\n",
" <td>5.049353e-02</td>\n",
" <td>3.004639e-01</td>\n",
" <td>1.333333e-01</td>\n",
" <td>...</td>\n",
" <td>4.000000e+00</td>\n",
" <td>1.295756e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>2.499797e-02</td>\n",
" <td>2.428685e-02</td>\n",
" <td>0.227044</td>\n",
" <td>6.723756e-02</td>\n",
" <td>1.004393e+01</td>\n",
" <td>7.611887e+00</td>\n",
" <td>5.894215e+00</td>\n",
" <td>5.391061e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>9.550161e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.033367e+00</td>\n",
" <td>5.476999e-01</td>\n",
" <td>-1.280279e-01</td>\n",
" <td>2.546798e-01</td>\n",
" <td>6.937072e-02</td>\n",
" <td>6.113440e-05</td>\n",
" <td>-1.190524e-03</td>\n",
" <td>5.285405e-04</td>\n",
" <td>1.000000e+00</td>\n",
" <td>7.788162e-03</td>\n",
" <td>1.137807e-02</td>\n",
" <td>2.394428e-02</td>\n",
" <td>1.444441e-02</td>\n",
" <td>1.319148e-01</td>\n",
" <td>1.438175e-01</td>\n",
" <td>4.000000e+00</td>\n",
" <td>3.276902e-02</td>\n",
" <td>1.296628e-07</td>\n",
" <td>4.878049e-02</td>\n",
" <td>2.349818e-02</td>\n",
" <td>8.040504e-01</td>\n",
" <td>1.699621e+00</td>\n",
" <td>5.636354e-08</td>\n",
" <td>4.800000e+01</td>\n",
" <td>6.175100e+04</td>\n",
" <td>-7.000000e-03</td>\n",
" </tr>\n",
" <tr>\n",
" <th>75%</th>\n",
" <td>8.065500e+04</td>\n",
" <td>2.004013e+07</td>\n",
" <td>4.103775e+05</td>\n",
" <td>1.388668e+00</td>\n",
" <td>1.932141e+00</td>\n",
" <td>2.343294e-01</td>\n",
" <td>1.278463e+01</td>\n",
" <td>7.703916e-02</td>\n",
" <td>2.611639e-01</td>\n",
" <td>6.666667e-02</td>\n",
" <td>1.818182e-01</td>\n",
" <td>3.076923e-01</td>\n",
" <td>5.791666e-01</td>\n",
" <td>3.235635e-01</td>\n",
" <td>1.090249e+00</td>\n",
" <td>1.142533e-01</td>\n",
" <td>4.686827e-02</td>\n",
" <td>1.600000e+01</td>\n",
" <td>1.620244e-02</td>\n",
" <td>1.124938e+00</td>\n",
" <td>2.510274e-01</td>\n",
" <td>2.817467e-01</td>\n",
" <td>4.925265e+00</td>\n",
" <td>1.235705e-01</td>\n",
" <td>8.754287e-02</td>\n",
" <td>7.820747e-02</td>\n",
" <td>6.498258e-02</td>\n",
" <td>2.967025e-02</td>\n",
" <td>2.247246e-02</td>\n",
" <td>5.678204e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>3.207765e+00</td>\n",
" <td>2.635405e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>3.048487e-02</td>\n",
" <td>1.952768e-01</td>\n",
" <td>8.936414e-02</td>\n",
" <td>5.259542e-01</td>\n",
" <td>9.169413e-01</td>\n",
" <td>...</td>\n",
" <td>5.000000e+00</td>\n",
" <td>2.282146e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>6.965324e-02</td>\n",
" <td>9.866627e-02</td>\n",
" <td>0.375437</td>\n",
" <td>1.336030e-01</td>\n",
" <td>3.564672e+01</td>\n",
" <td>1.718927e+01</td>\n",
" <td>8.917445e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>2.371986e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>2.437527e+00</td>\n",
" <td>6.116876e-01</td>\n",
" <td>3.847830e-01</td>\n",
" <td>1.106273e+00</td>\n",
" <td>2.069448e-01</td>\n",
" <td>3.170890e-03</td>\n",
" <td>2.111668e-02</td>\n",
" <td>3.674658e-02</td>\n",
" <td>2.000000e+00</td>\n",
" <td>1.975541e-02</td>\n",
" <td>2.720177e-02</td>\n",
" <td>4.373651e-02</td>\n",
" <td>5.446088e-02</td>\n",
" <td>2.523467e-01</td>\n",
" <td>2.950189e-01</td>\n",
" <td>5.000000e+00</td>\n",
" <td>5.908240e-02</td>\n",
" <td>1.272854e-06</td>\n",
" <td>8.602151e-02</td>\n",
" <td>3.837227e-02</td>\n",
" <td>1.119647e+00</td>\n",
" <td>3.939640e+00</td>\n",
" <td>9.545456e-01</td>\n",
" <td>6.300000e+01</td>\n",
" <td>8.065500e+04</td>\n",
" <td>5.285921e-02</td>\n",
" </tr>\n",
" <tr>\n",
" <th>max</th>\n",
" <td>9.343600e+04</td>\n",
" <td>2.016123e+07</td>\n",
" <td>1.221146e+08</td>\n",
" <td>3.987207e+00</td>\n",
" <td>1.589782e+01</td>\n",
" <td>8.083012e+00</td>\n",
" <td>1.900568e+01</td>\n",
" <td>2.720346e-01</td>\n",
" <td>7.682736e+00</td>\n",
" <td>2.172414e+00</td>\n",
" <td>7.844445e+00</td>\n",
" <td>1.168096e+01</td>\n",
" <td>1.685246e+01</td>\n",
" <td>9.918375e+02</td>\n",
" <td>1.734793e+04</td>\n",
" <td>1.257607e+00</td>\n",
" <td>1.155264e+00</td>\n",
" <td>5.400000e+01</td>\n",
" <td>8.265002e-01</td>\n",
" <td>2.460433e+03</td>\n",
" <td>1.564199e+03</td>\n",
" <td>4.256410e+01</td>\n",
" <td>1.139067e+04</td>\n",
" <td>1.477503e+02</td>\n",
" <td>3.657082e+02</td>\n",
" <td>2.663142e+00</td>\n",
" <td>7.208766e+00</td>\n",
" <td>6.629468e+01</td>\n",
" <td>3.991813e-01</td>\n",
" <td>1.530145e+02</td>\n",
" <td>1.000000e+00</td>\n",
" <td>8.330556e+02</td>\n",
" <td>1.509737e+01</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.159430e+00</td>\n",
" <td>1.385515e+01</td>\n",
" <td>8.404797e-01</td>\n",
" <td>6.138484e+00</td>\n",
" <td>4.866364e+02</td>\n",
" <td>...</td>\n",
" <td>9.000000e+00</td>\n",
" <td>8.330556e+02</td>\n",
" <td>1.000000e+00</td>\n",
" <td>2.370752e+00</td>\n",
" <td>1.584300e+02</td>\n",
" <td>57.709343</td>\n",
" <td>1.047405e+01</td>\n",
" <td>5.906600e+03</td>\n",
" <td>3.024533e+03</td>\n",
" <td>6.870769e+02</td>\n",
" <td>5.000000e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.791538e+01</td>\n",
" <td>1.000000e+00</td>\n",
" <td>5.493151e+01</td>\n",
" <td>1.000000e+00</td>\n",
" <td>2.924935e+01</td>\n",
" <td>5.121570e+02</td>\n",
" <td>9.786834e-01</td>\n",
" <td>1.765701e-01</td>\n",
" <td>6.463467e+03</td>\n",
" <td>5.578913e-01</td>\n",
" <td>8.000000e+00</td>\n",
" <td>1.460327e+00</td>\n",
" <td>8.201621e-01</td>\n",
" <td>7.039832e+00</td>\n",
" <td>5.670456e+00</td>\n",
" <td>7.496989e+03</td>\n",
" <td>8.751301e+03</td>\n",
" <td>8.000000e+00</td>\n",
" <td>1.078788e+00</td>\n",
" <td>2.479339e-03</td>\n",
" <td>1.701520e+00</td>\n",
" <td>4.379252e-01</td>\n",
" <td>2.860123e+00</td>\n",
" <td>6.517474e+02</td>\n",
" <td>2.008697e+01</td>\n",
" <td>9.900000e+01</td>\n",
" <td>9.343600e+04</td>\n",
" <td>3.511000e+03</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>8 rows × 99 columns</p>\n",
"</div>"
],
"text/plain": [
" permno DATE ... PERMNO return\n",
"count 3.839444e+06 3.839444e+06 ... 3.839444e+06 3.839444e+06\n",
"mean 5.641562e+04 1.992466e+07 ... 5.641562e+04 -1.119134e-01\n",
"std 2.728803e+04 1.418895e+05 ... 2.728803e+04 2.036537e+00\n",
"min 1.000000e+04 1.957033e+07 ... 1.000000e+04 -3.250224e+02\n",
"25% 3.066400e+04 1.982083e+07 ... 3.066400e+04 -8.733333e-02\n",
"50% 6.175100e+04 1.994063e+07 ... 6.175100e+04 -7.000000e-03\n",
"75% 8.065500e+04 2.004013e+07 ... 8.065500e+04 5.285921e-02\n",
"max 9.343600e+04 2.016123e+07 ... 9.343600e+04 3.511000e+03\n",
"\n",
"[8 rows x 99 columns]"
]
},
"metadata": {
"tags": []
},
"execution_count": 17
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "mPHNtLRMPUed",
"colab_type": "text"
},
"source": [
"Different stocks have different record #"
]
},
{
"cell_type": "code",
"metadata": {
"id": "EJTPsQh3PUee",
"colab_type": "code",
"outputId": "77c1548b-f5a9-4edd-d3b3-d4e9b12db2d5",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 470
}
},
"source": [
"data.groupby('permno').count()"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>DATE</th>\n",
" <th>mvel1</th>\n",
" <th>beta</th>\n",
" <th>betasq</th>\n",
" <th>chmom</th>\n",
" <th>dolvol</th>\n",
" <th>idiovol</th>\n",
" <th>indmom</th>\n",
" <th>mom1m</th>\n",
" <th>mom6m</th>\n",
" <th>mom12m</th>\n",
" <th>mom36m</th>\n",
" <th>pricedelay</th>\n",
" <th>turn</th>\n",
" <th>absacc</th>\n",
" <th>acc</th>\n",
" <th>age</th>\n",
" <th>agr</th>\n",
" <th>bm</th>\n",
" <th>bm_ia</th>\n",
" <th>cashdebt</th>\n",
" <th>cashpr</th>\n",
" <th>cfp</th>\n",
" <th>cfp_ia</th>\n",
" <th>chatoia</th>\n",
" <th>chcsho</th>\n",
" <th>chempia</th>\n",
" <th>chinv</th>\n",
" <th>chpmia</th>\n",
" <th>convind</th>\n",
" <th>currat</th>\n",
" <th>depr</th>\n",
" <th>divi</th>\n",
" <th>divo</th>\n",
" <th>dy</th>\n",
" <th>egr</th>\n",
" <th>ep</th>\n",
" <th>gma</th>\n",
" <th>grcapx</th>\n",
" <th>grltnoa</th>\n",
" <th>...</th>\n",
" <th>quick</th>\n",
" <th>rd</th>\n",
" <th>rd_mve</th>\n",
" <th>rd_sale</th>\n",
" <th>realestate</th>\n",
" <th>roic</th>\n",
" <th>salecash</th>\n",
" <th>saleinv</th>\n",
" <th>salerec</th>\n",
" <th>secured</th>\n",
" <th>securedind</th>\n",
" <th>sgr</th>\n",
" <th>sin</th>\n",
" <th>sp</th>\n",
" <th>tang</th>\n",
" <th>tb</th>\n",
" <th>aeavol</th>\n",
" <th>cash</th>\n",
" <th>chtx</th>\n",
" <th>cinvest</th>\n",
" <th>ear</th>\n",
" <th>nincr</th>\n",
" <th>roaq</th>\n",
" <th>roavol</th>\n",
" <th>roeq</th>\n",
" <th>rsup</th>\n",
" <th>stdacc</th>\n",
" <th>stdcf</th>\n",
" <th>ms</th>\n",
" <th>baspread</th>\n",
" <th>ill</th>\n",
" <th>maxret</th>\n",
" <th>retvol</th>\n",
" <th>std_dolvol</th>\n",
" <th>std_turn</th>\n",
" <th>zerotrade</th>\n",
" <th>sic2</th>\n",
" <th>PERMNO</th>\n",
" <th>date</th>\n",
" <th>return</th>\n",
" </tr>\n",
" <tr>\n",
" <th>permno</th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>10000</th>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>4</td>\n",
" <td>14</td>\n",
" <td>3</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>10</td>\n",
" <td>4</td>\n",
" <td>0</td>\n",
" <td>3</td>\n",
" <td>14</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" <td>15</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10001</th>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>358</td>\n",
" <td>358</td>\n",
" <td>360</td>\n",
" <td>370</td>\n",
" <td>358</td>\n",
" <td>371</td>\n",
" <td>370</td>\n",
" <td>366</td>\n",
" <td>360</td>\n",
" <td>336</td>\n",
" <td>358</td>\n",
" <td>369</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>331</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>331</td>\n",
" <td>343</td>\n",
" <td>...</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>0</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>313</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>269</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>269</td>\n",
" <td>269</td>\n",
" <td>293</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10002</th>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>312</td>\n",
" <td>312</td>\n",
" <td>314</td>\n",
" <td>324</td>\n",
" <td>312</td>\n",
" <td>325</td>\n",
" <td>324</td>\n",
" <td>320</td>\n",
" <td>314</td>\n",
" <td>290</td>\n",
" <td>312</td>\n",
" <td>323</td>\n",
" <td>92</td>\n",
" <td>92</td>\n",
" <td>224</td>\n",
" <td>212</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>92</td>\n",
" <td>92</td>\n",
" <td>200</td>\n",
" <td>212</td>\n",
" <td>212</td>\n",
" <td>212</td>\n",
" <td>212</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>212</td>\n",
" <td>212</td>\n",
" <td>224</td>\n",
" <td>212</td>\n",
" <td>224</td>\n",
" <td>212</td>\n",
" <td>188</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>224</td>\n",
" <td>212</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>212</td>\n",
" <td>224</td>\n",
" <td>0</td>\n",
" <td>224</td>\n",
" <td>212</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>224</td>\n",
" <td>20</td>\n",
" <td>218</td>\n",
" <td>218</td>\n",
" <td>211</td>\n",
" <td>211</td>\n",
" <td>218</td>\n",
" <td>218</td>\n",
" <td>218</td>\n",
" <td>176</td>\n",
" <td>218</td>\n",
" <td>218</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>218</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>324</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" <td>325</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10003</th>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>106</td>\n",
" <td>106</td>\n",
" <td>108</td>\n",
" <td>117</td>\n",
" <td>106</td>\n",
" <td>119</td>\n",
" <td>118</td>\n",
" <td>114</td>\n",
" <td>108</td>\n",
" <td>84</td>\n",
" <td>106</td>\n",
" <td>117</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" <td>119</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10005</th>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>53</td>\n",
" <td>53</td>\n",
" <td>55</td>\n",
" <td>57</td>\n",
" <td>53</td>\n",
" <td>66</td>\n",
" <td>65</td>\n",
" <td>61</td>\n",
" <td>55</td>\n",
" <td>31</td>\n",
" <td>53</td>\n",
" <td>64</td>\n",
" <td>37</td>\n",
" <td>37</td>\n",
" <td>49</td>\n",
" <td>37</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>25</td>\n",
" <td>37</td>\n",
" <td>37</td>\n",
" <td>37</td>\n",
" <td>37</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>37</td>\n",
" <td>37</td>\n",
" <td>49</td>\n",
" <td>37</td>\n",
" <td>49</td>\n",
" <td>37</td>\n",
" <td>25</td>\n",
" <td>37</td>\n",
" <td>...</td>\n",
" <td>49</td>\n",
" <td>37</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>0</td>\n",
" <td>49</td>\n",
" <td>48</td>\n",
" <td>49</td>\n",
" <td>37</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>49</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>0</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>3</td>\n",
" <td>66</td>\n",
" <td>59</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>47</td>\n",
" <td>66</td>\n",
" <td>59</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" </tr>\n",
" <tr>\n",
" <th>...</th>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93432</th>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>9</td>\n",
" <td>0</td>\n",
" <td>11</td>\n",
" <td>10</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>9</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>0</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93433</th>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>67</td>\n",
" <td>77</td>\n",
" <td>65</td>\n",
" <td>78</td>\n",
" <td>77</td>\n",
" <td>73</td>\n",
" <td>67</td>\n",
" <td>43</td>\n",
" <td>65</td>\n",
" <td>76</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>42</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>42</td>\n",
" <td>54</td>\n",
" <td>...</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>0</td>\n",
" <td>66</td>\n",
" <td>18</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>52</td>\n",
" <td>52</td>\n",
" <td>46</td>\n",
" <td>49</td>\n",
" <td>52</td>\n",
" <td>52</td>\n",
" <td>52</td>\n",
" <td>24</td>\n",
" <td>52</td>\n",
" <td>52</td>\n",
" <td>24</td>\n",
" <td>24</td>\n",
" <td>47</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93434</th>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>67</td>\n",
" <td>76</td>\n",
" <td>65</td>\n",
" <td>78</td>\n",
" <td>77</td>\n",
" <td>73</td>\n",
" <td>67</td>\n",
" <td>43</td>\n",
" <td>65</td>\n",
" <td>76</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>48</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>48</td>\n",
" <td>60</td>\n",
" <td>...</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>36</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>62</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>38</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>38</td>\n",
" <td>38</td>\n",
" <td>65</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93435</th>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>12</td>\n",
" <td>22</td>\n",
" <td>10</td>\n",
" <td>23</td>\n",
" <td>22</td>\n",
" <td>18</td>\n",
" <td>12</td>\n",
" <td>0</td>\n",
" <td>10</td>\n",
" <td>21</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>0</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>0</td>\n",
" <td>11</td>\n",
" <td>...</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>14</td>\n",
" <td>9</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93436</th>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>67</td>\n",
" <td>76</td>\n",
" <td>65</td>\n",
" <td>78</td>\n",
" <td>77</td>\n",
" <td>73</td>\n",
" <td>67</td>\n",
" <td>43</td>\n",
" <td>65</td>\n",
" <td>76</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>42</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>42</td>\n",
" <td>36</td>\n",
" <td>...</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>65</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>41</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>41</td>\n",
" <td>41</td>\n",
" <td>66</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>77</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>29883 rows × 99 columns</p>\n",
"</div>"
],
"text/plain": [
" DATE mvel1 beta betasq chmom ... zerotrade sic2 PERMNO date return\n",
"permno ... \n",
"10000 15 15 3 3 4 ... 15 15 15 15 15\n",
"10001 371 371 358 358 360 ... 371 371 371 371 371\n",
"10002 325 325 312 312 314 ... 325 325 325 325 325\n",
"10003 119 119 106 106 108 ... 119 119 119 119 119\n",
"10005 66 66 53 53 55 ... 59 66 66 66 66\n",
"... ... ... ... ... ... ... ... ... ... ... ...\n",
"93432 11 11 0 0 0 ... 11 11 11 11 11\n",
"93433 78 78 65 65 67 ... 78 78 78 78 78\n",
"93434 78 78 65 65 67 ... 78 78 78 78 78\n",
"93435 23 23 10 10 12 ... 23 23 23 23 23\n",
"93436 78 78 65 65 67 ... 78 78 78 78 78\n",
"\n",
"[29883 rows x 99 columns]"
]
},
"metadata": {
"tags": []
},
"execution_count": 18
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "IEtoLXgBPUeh",
"colab_type": "text"
},
"source": [
"## 2.2 Missing Values"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "F17S9eJPPUei",
"colab_type": "text"
},
"source": [
"mean Missing % is around 30%"
]
},
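{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick check of that figure (a minimal sketch using the `summary` and `data` objects defined above; note that the plot cell below uses a hard-coded row count rather than `len(data)`):"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Fraction of missing values per column, then the mean across columns\n",
"missing_frac = 1 - summary.loc['count'] / len(data)\n",
"print(missing_frac.mean())"
],
"execution_count": 0,
"outputs": []
},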
{
"cell_type": "code",
"metadata": {
"id": "d8CR4o-VPUej",
"colab_type": "code",
"outputId": "2219f926-d579-486a-a35d-7763c52bf6f1",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 374
}
},
"source": [
"(1-summary.loc['count'].sort_values()/3760208).plot.bar()\n",
"plt.title('Missing % for each column')"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"Text(0.5, 1.0, 'Missing % for each column')"
]
},
"metadata": {
"tags": []
},
"execution_count": 19
},
{
"output_type": "display_data",
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAFUCAYAAADf+HxmAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAgAElEQVR4nOydedhd0/XHPyuJxDxEYg6JsdKiCKVU\nDDW1REspOtBSndCWVqMDig5qqipt/ZCYx9KmhFCihgqZE0mEyCAJkogpiERi/f74rpN73usdbuKN\nyLU+z3Oee885++yzx7XXXns45u4kSZIk9UWbZR2AJEmSpPVJ4Z4kSVKHpHBPkiSpQ1K4J0mS1CEp\n3JMkSeqQFO5JkiR1SAr3jzlm9jcz+/UHeP4XZnZla4apNTCzdc3sYTObY2YXLuvwNIWZuZltvpTf\n8ZCZHb8035F89Gi3rAOQLB3MbDKwAbCBu79cuj4c+DTQzd0nu/v3Psh73P13HyigTWBm7YDrgQOA\nQcAR7v5G3PsF8I67X9SMFycALwOrey7mSD6GpOZe30wCjipOzGwbYOVlF5zF4lDAgU7A60hYY2bd\ngF7An1t4fhNg7JII9mhYkmS5JoV7fXMd8M3S+THAtWUHZtbXzM6N/53M7C4ze83MXjGzR8ysTdz7\nuZlNDzPHeDPbJ66fZWbXx/+uYWY4xsyeN7OXzeyXpXetZGbXmNmrZjbOzE4zs2lNhL0b8JC7LwAG\nApvG9T8Dp8b1RjGzvhHX08zsTTP7vJl1MLM/mdkLcfzJzDqE+z3NbFrE8SWgTxP+fjvC/aqZDTCz\nTUr3LjGzqWb2hpkNNbPPle61DfPVc5F+Q82sS8nrz5vZs5Hul5mZNfH+Jv0xs8+a2WAzez1+P9uE\nH4vyK86LPGsX5w+Z2blm9r9Iu3+b2dpmdkPEbbCZdS0972b2vVrCn3y4pHCvbwYBq5vZ1mbWFjgS\nmTqa4lRgGtAZWBf4BeBmthVwIrCTu68G7A9Mbsaf3YGtgH2AM8xs67h+JtAVCep9ga8348dTwN4h\ngPcCxpjZl4GX3f2xZp7D3Y8FbgD+6O6ruvt/gF8CuyCT1HbAzsCvSo+tB3REGv8J1X6a2SEoPQ5F\n6fMIcFPJyeDwuyNwI3Cbma0Y905BPagvAKsD3wbeLj17ELATsC1wBErfxmjUHzPrCNyNGr61gYuA\nu81s7Sb8aYkjgW8AGwKbAY+jBq8jMA7lY5law598iKRwr38K7X1fVDGnN+P2XWB9YBN3f9fdHwmz\nxkKgA9DdzFYIW/1zzfjzG3ef6+4jgZFImIIq/u/c/VV3n0bzppX+yKw0GJllbkZC5TQz+20Mll5u\nZu1biH/B14Cz3X2mu88CfoMEWMF7wJnuPs/d5zby/PeA37v7uOg1/A74dKG9u/v17j7b3Re4+4Uo\nvbaKZ48HfuXu412MdPfZJb//4O6vufvzqJfy6Sbi0JQ/XwSedffr4v03AU8DB9eYNtX0cffn3P11\n4B7gOXf/T8T7NmD7Kve1hj/5EEnhXv9cBxwNHEuVSaYRzgcmAPeZ2UQz6w3g7hOAHwNnATPN7GYz\n26AZf14q/X8bWDX+bwBMLd0r/29ACK/e7r6tu58A9Ab+hjTEHkBPoD3SXmthA2BK6XxKXCuY5e7v\nNPP8JsAlYXp4DXgFMKTdYmY/DZPN63F/DTReANAFaK4xbCq9qmnKn+q4EecbNvPO5phR+j+3kfPq\n8NUa/uRDJIV7nePuU5AG/AXgjhbcznH3U919UzRoeUphW3f3G919dyTkHDhvCYLzIrBR6bxLUw7L\nxEDwZ4ErgG2AodGjGIxMAbXwAgp7wcZxraClgdepwHfdfc3SsZK7/y/s66ehnsla7r4m6m1Y6dnN\nagxnS2FozJ/quIHi11gv7S0aDqqv1wrhSj6CpHD/eHAcsLe7v9WcIzM7yMw2jwGx15E55j0z28rM\nCvv3O0h7e28JwnErcLqZrWVmGyI7frNEWP4CnOzu76GGavcwx/QEJtb47puAX5lZZzPrBJxB8+MP\n1fwtwv7JCNcaZnZ43FsNWADMAtqZ2RnIJl5wJXCOmW1hYtsltIc35U9/YEszO9rM2pnZV4HuwF2N\n+DEC2MPMNjazNYDTlyAcyXJACvePAWE/HVKD0y2A/wBvokG0y919ILIf/wHNG38JWIclEwpnowHb\nSfGe24F5LTzzLeApdx8a53cgTXUWGjy8osZ3nwsMAUYBo4Fhca0m3P1O1Fu52czeQAO+B8btAcC9\nwDPIHPIODU1OF6GG7T7gDeAqYKVa392SP2F3PwgNiM9GvYiDyusbSvG4H7gFpcNQGm8AkjrAcn1H\nsqwws+8DR7p7z2UdliSpN1JzTz40zGx9M9vNzNrE9MpTgTuXdbiSpB7JlXjJh0l74O9ogdJraHrj\n5cs0RElSp6RZJkmSpA5Js0ySJEkdksI9SZKkDllmNvdOnTp5165dl9XrkyRJlkuGDh36srt3bsnd\nMhPuXbt2ZciQWqZeJ0mSJAVmVr3VRKOkWSZJkqQOSeGeJElSh6RwT5IkqUNSuCdJktQhKdyTJEnq\nkBTuSZIkdUgK9yRJkjokhXuSJEkdskyFe9fed9O1993LMghJkiR1SWruSZIkdUgK9yRJkjokhXuS\nJEkdksI9SZKkDknhniRJUoekcE+SJKlDUrgnSZLUITUJdzM7wMzGm9kEM+vdyP2LzWxEHM+Y2Wut\nH9QkSZKkVlr8EpOZtQUuA/YFpgGDzayfu48t3Lj7T0ruTwK2XwphTZIkSWqkFs19Z2CCu0909/nA\nzcAhzbg/CripNQKXJEmSLBm1CPcNgaml82lx7X2Y2SZAN+DBDx60JEmSZElp7QHVI4Hb3X1hYzfN\n7AQzG2JmQ2bNmtXKr06SJEkKahHu04EupfON4lpjHEkzJhl3v8Lde7h7j86dO9ceyiRJkmSxqEW4\nDwa2MLNuZtYeCfB+1Y7M7BPAWsDjrRvEJEmSZHFpUbi7+wLgRGAAMA641d3HmNnZZtar5PRI4GZ3\n96UT1CRJkqRWWpwKCeDu/YH+VdfOqDo/q/WClSRJknwQcoVqkiRJHZLCPUmSpA5J4Z4kSVKHpHBP\nkiSpQ1K4J0mS1CEp3JMkSeqQFO5JkiR1SAr3JEmSOuQjI9y79r6brr3vXtbBSJIkqQs+MsK9mhT0\nSZIkS85HVrgnSZIkS04K9yRJkjokhXuSJEkdksI9SZKkDknhniRJUoekcE+SJKlDlgvhXp4DXz0f\nPufHJ0mSvJ+ahLuZHWBm481sgpn1bsLNEWY21szGmNmNrRvMJEmSZHFo8TN7ZtYWuAzYF5gGDDaz\nfu4+tuRmC+B0YDd3f9XM1llaAU6SJElaphbNfWdggrtPdPf5wM3AIVVuvgNc5u6vArj7zNYNZpIk\nSbI41CLcNwSmls6nxbUyWwJbmtljZjb
(base64-encoded PNG figure data truncated)\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"tags": []
}
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "KHXlMBwI4RFO",
"colab_type": "code",
"colab": {}
},
"source": [
"#we do not want to rank normalise the return or PERMNO\n",
"tmp_df=data[['return', \"PERMNO\"]]\n",
"#redundant columns\n",
"train_df = data.drop(columns=['permno','date','return', 'PERMNO'])"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "GjeqD6JQHYHW",
"colab_type": "code",
"outputId": "69cb4d97-4356-4c60-b682-ed3f35c48760",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 338
}
},
"source": [
"train_df.head()"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>DATE</th>\n",
" <th>mvel1</th>\n",
" <th>beta</th>\n",
" <th>betasq</th>\n",
" <th>chmom</th>\n",
" <th>dolvol</th>\n",
" <th>idiovol</th>\n",
" <th>indmom</th>\n",
" <th>mom1m</th>\n",
" <th>mom6m</th>\n",
" <th>mom12m</th>\n",
" <th>mom36m</th>\n",
" <th>pricedelay</th>\n",
" <th>turn</th>\n",
" <th>absacc</th>\n",
" <th>acc</th>\n",
" <th>age</th>\n",
" <th>agr</th>\n",
" <th>bm</th>\n",
" <th>bm_ia</th>\n",
" <th>cashdebt</th>\n",
" <th>cashpr</th>\n",
" <th>cfp</th>\n",
" <th>cfp_ia</th>\n",
" <th>chatoia</th>\n",
" <th>chcsho</th>\n",
" <th>chempia</th>\n",
" <th>chinv</th>\n",
" <th>chpmia</th>\n",
" <th>convind</th>\n",
" <th>currat</th>\n",
" <th>depr</th>\n",
" <th>divi</th>\n",
" <th>divo</th>\n",
" <th>dy</th>\n",
" <th>egr</th>\n",
" <th>ep</th>\n",
" <th>gma</th>\n",
" <th>grcapx</th>\n",
" <th>grltnoa</th>\n",
" <th>...</th>\n",
" <th>pchsaleinv</th>\n",
" <th>pctacc</th>\n",
" <th>ps</th>\n",
" <th>quick</th>\n",
" <th>rd</th>\n",
" <th>rd_mve</th>\n",
" <th>rd_sale</th>\n",
" <th>realestate</th>\n",
" <th>roic</th>\n",
" <th>salecash</th>\n",
" <th>saleinv</th>\n",
" <th>salerec</th>\n",
" <th>secured</th>\n",
" <th>securedind</th>\n",
" <th>sgr</th>\n",
" <th>sin</th>\n",
" <th>sp</th>\n",
" <th>tang</th>\n",
" <th>tb</th>\n",
" <th>aeavol</th>\n",
" <th>cash</th>\n",
" <th>chtx</th>\n",
" <th>cinvest</th>\n",
" <th>ear</th>\n",
" <th>nincr</th>\n",
" <th>roaq</th>\n",
" <th>roavol</th>\n",
" <th>roeq</th>\n",
" <th>rsup</th>\n",
" <th>stdacc</th>\n",
" <th>stdcf</th>\n",
" <th>ms</th>\n",
" <th>baspread</th>\n",
" <th>ill</th>\n",
" <th>maxret</th>\n",
" <th>retvol</th>\n",
" <th>std_dolvol</th>\n",
" <th>std_turn</th>\n",
" <th>zerotrade</th>\n",
" <th>sic2</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>19860331</td>\n",
" <td>11960.000000</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.255908</td>\n",
" <td>-0.257143</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>...</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.055511</td>\n",
" <td>1.891760e-06</td>\n",
" <td>0.044776</td>\n",
" <td>0.031004</td>\n",
" <td>1.021089</td>\n",
" <td>1.079774</td>\n",
" <td>1.023392e-07</td>\n",
" <td>39.0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>19860430</td>\n",
" <td>16330.000000</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>7.897668</td>\n",
" <td>NaN</td>\n",
" <td>0.368892</td>\n",
" <td>0.365385</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.251252</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>...</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.037231</td>\n",
" <td>7.315091e-07</td>\n",
" <td>0.145161</td>\n",
" <td>0.044548</td>\n",
" <td>1.033817</td>\n",
" <td>1.745333</td>\n",
" <td>7.467463e-08</td>\n",
" <td>39.0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>19860530</td>\n",
" <td>15172.000000</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>8.472954</td>\n",
" <td>NaN</td>\n",
" <td>0.388370</td>\n",
" <td>-0.098592</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.251604</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>...</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.048336</td>\n",
" <td>1.215981e-06</td>\n",
" <td>0.022727</td>\n",
" <td>0.011246</td>\n",
" <td>1.184555</td>\n",
" <td>1.502285</td>\n",
" <td>7.649551e-08</td>\n",
" <td>39.0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>19860630</td>\n",
" <td>11793.859375</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>8.250098</td>\n",
" <td>NaN</td>\n",
" <td>0.400748</td>\n",
" <td>-0.222656</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.273223</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>...</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.062245</td>\n",
" <td>2.744328e-06</td>\n",
" <td>0.115702</td>\n",
" <td>0.038863</td>\n",
" <td>0.959128</td>\n",
" <td>1.756198</td>\n",
" <td>7.360224e-08</td>\n",
" <td>39.0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>5</th>\n",
" <td>19860731</td>\n",
" <td>11734.593750</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>8.113567</td>\n",
" <td>NaN</td>\n",
" <td>0.476698</td>\n",
" <td>-0.005025</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.272432</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>...</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>NaN</td>\n",
" <td>0.049174</td>\n",
" <td>1.270483e-06</td>\n",
" <td>0.042553</td>\n",
" <td>0.020357</td>\n",
" <td>1.044263</td>\n",
" <td>1.239524</td>\n",
" <td>2.000000e+00</td>\n",
" <td>39.0</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>5 rows × 96 columns</p>\n",
"</div>"
],
"text/plain": [
" DATE mvel1 beta ... std_turn zerotrade sic2\n",
"1 19860331 11960.000000 NaN ... 1.079774 1.023392e-07 39.0\n",
"2 19860430 16330.000000 NaN ... 1.745333 7.467463e-08 39.0\n",
"3 19860530 15172.000000 NaN ... 1.502285 7.649551e-08 39.0\n",
"4 19860630 11793.859375 NaN ... 1.756198 7.360224e-08 39.0\n",
"5 19860731 11734.593750 NaN ... 1.239524 2.000000e+00 39.0\n",
"\n",
"[5 rows x 96 columns]"
]
},
"metadata": {
"tags": []
},
"execution_count": 21
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "UYZyjDCi-Cdm",
"colab_type": "code",
"colab": {}
},
"source": [
"for col in train_df.columns:\n",
" train_df[col]=train_df.groupby('DATE')[col].apply(lambda x:x.fillna(x.median()))"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "dWFbH0G1Npmh",
"colab_type": "text"
},
"source": [
"Some columns still have missing values. It is because those metrics were not recorded in early years.We filled thos missing value with the total mean of the column"
]
},
{
"cell_type": "code",
"metadata": {
"id": "PlqZ1xulFRtI",
"colab_type": "code",
"colab": {}
},
"source": [
"#train_df.fillna(value=train_df.mean(), inplace=True)\n",
"train_df.fillna(value=0, inplace=True)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "YASkyyi5PUe1",
"colab_type": "code",
"colab": {}
},
"source": [
"ranked_train_df = train_df.groupby('DATE').rank(method='average', ascending=True, pct=True)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "evs3I9eiPUe3",
"colab_type": "code",
"colab": {}
},
"source": [
"ranked_train_df = pd.concat([ranked_train_df, train_df['DATE']], axis=1)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "I67q7qHctWho",
"colab_type": "code",
"colab": {}
},
"source": [
"ranked_train_df = pd.concat([ranked_train_df, tmp_df], axis=1)\n",
"train_df = pd.concat([train_df, tmp_df], axis=1)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "ObmLNipKPUe5",
"colab_type": "text"
},
"source": [
"Finally, in this paper, the author rank-normalize the each cross-sectional characteristics. So we did the same. We make sure the sum of the second axis==1"
]
},
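{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (an added sketch, not in the original notebook): with `pct=True`, the within-month percentile ranks should lie in $(0, 1]$. `mvel1` is just one characteristic picked for illustration."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"#sanity check (added sketch): percentile ranks should lie in (0, 1] within every month\n",
"grouped = ranked_train_df.groupby('DATE')['mvel1']\n",
"print(grouped.min().min(), grouped.max().max())  #expect both values to lie in (0, 1]"
],
"execution_count": 0,
"outputs": []
},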
{
"cell_type": "markdown",
"metadata": {
"id": "_OMn7ZxhPUe7",
"colab_type": "text"
},
"source": [
"# 3. Model"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "eguSKm-MPUe8",
"colab_type": "text"
},
"source": [
"We build the model based on the PyTorch v1.4. The implementation is simple, and contains two classes.\n",
"* `Perceptron`: a sequence of Neural Network\n",
"* `Antoencoder`: The main model.\n",
"\n",
"There are several modifications:\n",
"* I am a bit confused about eq(16). Because Z is rank-normalied, so I think x = r*Z is enough.\n",
"* We can flexibly change the structure of beta nueral network by changing the parameters.\n",
"* The paper did not mention the initializer, so I just used the default setting\n",
"\n",
"Both are subclasses of torch.nn.Module. Please refer to my comments for details."
]
},
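{
"cell_type": "markdown",
"metadata": {},
"source": [
"For reference (an added note): the factor-network input computed in `forward()` below is the vector of characteristic-managed portfolio returns\n",
"$$x_t = \\left(Z_{t-1}^\\top Z_{t-1}\\right)^{+} Z_{t-1}^\\top r_t,$$\n",
"where $Z_{t-1}$ is the $N \\times P$ matrix of rank-normalized characteristics, $r_t$ is the $N \\times 1$ vector of returns, and $(\\cdot)^{+}$ is the pseudo-inverse (`torch.pinverse`). The simplification suggested above would replace this with $x_t \\propto Z_{t-1}^\\top r_t$."
]
},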
{
"cell_type": "code",
"metadata": {
"id": "oYbx_zecPUe9",
"colab_type": "code",
"colab": {}
},
"source": [
"class Perceptron(nn.Module):\n",
"\n",
" def __init__(self, in_, out_, hidden_):\n",
" \"\"\"\n",
" This is to build a multi-layer network\n",
" :param in_:int, input dimensions\n",
" :param out_:int, output dimensions\n",
" :param hidden_: list or tuple, # neurons for each hidden layer(exclude the input and output)\n",
" \"\"\"\n",
" super(Perceptron, self).__init__()\n",
" self.sequential = nn.Sequential()\n",
" for i in range(len(hidden_)+1): # output layer is not included in hidden_layer, so #layers = #hidden+1\n",
" # define the dimension of each layer\n",
" if i == 0:\n",
" input_ = in_\n",
" output_ = hidden_[i]\n",
" elif i == len(hidden_):\n",
" input_ = hidden_[i-1]\n",
" output_ = out_\n",
" else:\n",
" input_ = hidden_[i-1]\n",
" output_ = hidden_[i]\n",
"\n",
" # no batchnorm or the activation for the last layer\n",
" if i == len(hidden_):\n",
" self.sequential.add_module('linear'+str(i), nn.Linear(input_, output_))\n",
" else:\n",
" #dropout layer\n",
" self.sequential.add_module('dropout'+str(i), nn.Dropout(p=0.3))\n",
" # add the linear layer\n",
" self.sequential.add_module('linear'+str(i), nn.Linear(input_, output_))\n",
" # add the batch norm layer\n",
" self.sequential.add_module('batchnorm'+str(i), nn.BatchNorm1d(output_))\n",
" #dropout layer\n",
" self.sequential.add_module('dropout'+str(i), nn.Dropout(p=0.1))\n",
" # add the activation layer\n",
" self.sequential.add_module('relu'+str(i), nn.ReLU())\n",
"\n",
" def forward(self, x):\n",
" return self.sequential(x)\n",
"\n",
"class Linear(nn.Module):\n",
" def __init__(self, in_, out_):\n",
" \"\"\"\n",
" This is to build a linear network\n",
" :param in_:int, input dimensions\n",
" :param out_:int, output dimensions\n",
" \"\"\"\n",
" super(Linear, self).__init__()\n",
" self.linear = nn.Linear(in_, out_)\n",
" self.dropout = nn.Dropout(p=0.3)\n",
" def forward(self, x):\n",
" x = self.linear(x)\n",
" x = self.dropout(x)\n",
" return x\n",
"\n",
"\n",
"class Autoencoder(nn.Module):\n",
"\n",
" def __init__(self, P, K, hidden_):\n",
" \"\"\"\n",
" This is to build the autoencoder neural netowrk with a multi-layer beta network and a single layer factor network\n",
" :param P:int, # characteristics\n",
" :param K:int, # output factors\n",
" :param hidden_: list or tuple, # neurons for each hidden layer for the beta network(exclude the input and output)\n",
" :param dropout_p: the dropout rate\n",
" \"\"\"\n",
" nn.Module.__init__(self)\n",
" self.beta_net = Perceptron(P, K, hidden_) # for beta nn, input Z: N*P, output BETA: N*K\n",
" self.factor_net = Linear(P, K) # for factor nn, only one linear layer, input r: N*1, output f: K*1\n",
"\n",
" def forward(self, z, r):\n",
" \"\"\"\n",
" :param z: N*P tensor\n",
" :param r: 1*N tensor\n",
" :return: P*1 tensor\n",
" \"\"\"\n",
" beta = self.beta_net(z)\n",
"\n",
" zz = t.mm(t.t(z), z) #P*P Matrix\n",
" zz_ = t.pinverse(zz)\n",
" zr = t.transpose(t.mm(r,z),0,1) # P*1\n",
" x = t.transpose(t.mm(zz_, zr),0,1) # 1*p\n",
" #x = t.mm(r, z)\n",
" factor = self.factor_net(x)\n",
"\n",
" return t.mm(factor, beta.t())"
],
"execution_count": 0,
"outputs": []
},
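{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick smoke test (an added sketch, not part of the original run), we can push random tensors through the model and confirm that the fitted returns have shape $1 \\times N$. The sizes below (N = 100 stocks, P = 95 characteristics, K = 6 factors) are chosen purely for illustration."
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"#smoke test (added sketch): random inputs, check the output shape is 1*N\n",
"N, P, K = 100, 95, 6\n",
"demo_model = Autoencoder(P, K, [32]).cuda()\n",
"z_demo = t.rand(N, P).cuda()   #rank-normalized characteristics lie in (0, 1]\n",
"r_demo = t.randn(1, N).cuda()  #one cross-section of returns\n",
"print(demo_model(z_demo, r_demo).shape)  #expect torch.Size([1, 100])"
],
"execution_count": 0,
"outputs": []
},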
{
"cell_type": "markdown",
"metadata": {
"id": "cIgj6JycPUe_",
"colab_type": "text"
},
"source": [
"# 4. Experiment"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YKM3Aig0PUfA",
"colab_type": "text"
},
"source": [
"## 4.1 Training"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "15BNNnXBPUfA",
"colab_type": "text"
},
"source": [
"We trained the CA0 - CA3 mentioned in the paper. Below are functions for the training and evaluation process.\n",
"\n",
"I choose the 0-636th month as the training set, and the 636-677th as validation set. "
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "EdfdmNV5XnOW",
"colab_type": "text"
},
"source": [
"**Define** the evaluation function. "
]
},
{
"cell_type": "code",
"metadata": {
"id": "kwA7wes7Xgkp",
"colab_type": "code",
"colab": {}
},
"source": [
"def R_square(pred, target):\n",
"\n",
" return 1 - np.sum(np.square(target-pred)) / np.sum(np.square(target))"
],
"execution_count": 0,
"outputs": []
},
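{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that the denominator is the raw (not demeaned) sum of squared returns, which matches the predictive $R^2$ convention of Gu, Kelly, and Xiu. A tiny worked example (added for illustration):"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"#worked example (added sketch) for the R^2 without demeaning\n",
"target = np.array([0.02, -0.01, 0.03])\n",
"pred = np.array([0.01, -0.02, 0.02])\n",
"print(R_square(pred, target))  #1 - 0.0003/0.0014 = 0.7857..."
],
"execution_count": 0,
"outputs": []
},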
{
"cell_type": "code",
"metadata": {
"id": "BJmfNhOePUfB",
"colab_type": "code",
"colab": {}
},
"source": [
"import torch.utils.data as Data\n",
"\n",
"'''\n",
"Increase the training sample by one year for each refit\n",
"'''\n",
"train_test_list = []\n",
"train_begin_yr = 1957\n",
"train_end_yr = 1975\n",
"#the test of the test period \n",
"test_end_yr = 1987\n",
"#Trainning dataset\n",
"for year_end in range(train_begin_yr+1, train_end_yr+1):\n",
" begin_date = int(str(train_begin_yr)+\"0000\")\n",
" end_date = int(str(year_end)+\"0000\")\n",
" \n",
" X = ranked_train_df[(ranked_train_df['DATE'] <= end_date) & (ranked_train_df['DATE'] >= begin_date)].drop(['PERMNO', 'DATE', 'return'], axis=1).values\n",
" y = train_df[(train_df['DATE']<=end_date) & (train_df['DATE']>=begin_date)]['return'].values\n",
" X = t.from_numpy(X).float().cuda()\n",
" y = t.from_numpy(y).float().cuda()\n",
" torch_dataset = Data.TensorDataset(X, y)\n",
"\n",
" test_end_date = int(str(test_end_yr)+\"0000\")\n",
" #Testing Dataset\n",
" X_test = ranked_train_df[(ranked_train_df['DATE']>end_date) & (ranked_train_df['DATE']<=test_end_date) ].drop(['PERMNO', 'DATE', 'return'], axis=1).values\n",
" y_test = train_df[(train_df['DATE']>end_date) & (train_df['DATE']<=test_end_date) ]['return'].values\n",
" X_test = t.from_numpy(X_test).float().cuda()\n",
" y_test = t.from_numpy(y_test).float().cuda()\n",
" test_dataset = Data.TensorDataset(X_test, y_test)\n",
"\n",
" #the first one is the training dataset\n",
" #the second one is the test dataset\n",
" to_append=(torch_dataset, test_dataset)\n",
"\n",
" train_test_list.append(to_append)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "SbuWTizkPUfO",
"colab_type": "code",
"colab": {}
},
"source": [
"loader_list = []\n",
"for ls in train_test_list:\n",
" loader = Data.DataLoader(\n",
" dataset=ls[0],\n",
" batch_size=len(ls[0]),\n",
" shuffle=True,\n",
" num_workers=0)\n",
"\n",
" test_loader = Data.DataLoader(\n",
" dataset=ls[1],\n",
" batch_size=len(ls[1]),\n",
" shuffle=True,\n",
" num_workers=0)\n",
" \n",
" #the first one is the data loader for the training set\n",
" #the second one is the data loader for the test set\n",
" loader_list.append((loader, test_loader))\n",
"\n",
"ca1 = Autoencoder(95, 6, [32]).cuda()\n",
"\n",
"loss_fn = nn.MSELoss()\n",
"\n",
"decay = 1e-5 # weight decay for L1 LASSO.\n",
"\n",
"optimizer = optim.Adam(ca1.parameters(), lr=1e-3, betas=(0.9,0.999), eps=1e-8)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "10MpeO_pHqlG",
"colab_type": "code",
"colab": {}
},
"source": [
"def train(ca1, loader, optimizer, epoch, r_square_list):\n",
" '''\n",
" Function for training\n",
" '''\n",
" ca1.train()\n",
" for step, (batch_x, batch_y) in enumerate(loader):\n",
" optimizer.zero_grad()\n",
"\n",
" # prepare the data\n",
" z = batch_x\n",
" r = batch_y[np.newaxis, ...]\n",
" # forward\n",
" r_pred = ca1(z, r)\n",
" loss = loss_fn(r_pred, r)\n",
" for param in ca1.parameters():\n",
" loss += decay * t.sum(t.abs(param.float())) # torch has no integrated L1 regulizations, so I manually wrote this part\n",
"\n",
" loss.backward()\n",
" optimizer.step()\n",
" if step % 250 == 0:\n",
" print('Train Epoch: {} [{}/{} ({:.0f}%)]\\tLoss: {:.6f}'.format(\n",
" epoch, step * len(batch_x), len(loader.dataset),\n",
" 100. * step / len(loader), loss.item()))\n",
" r_square_list.append(R_square(r_pred.detach().cpu().numpy(), r.detach().cpu().numpy()))\n",
" print('Insample R^2 for iter: %d is %f' % (epoch+1, np.mean(r_square_list)))\n",
" print(\"\\n\")"
],
"execution_count": 0,
"outputs": []
},
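{
"cell_type": "markdown",
"metadata": {},
"source": [
"For clarity (an added note): the objective minimized in `train()` is the mean squared error plus the manual LASSO penalty over all model parameters,\n",
"$$\\mathcal{L}(\\theta) = \\frac{1}{|B|} \\sum_{i \\in B} \\left(r_i - \\hat{r}_i\\right)^2 + \\lambda \\sum_j \\lvert \\theta_j \\rvert,$$\n",
"where $B$ is the (full-sample) batch and $\\lambda$ is the `decay` constant ($10^{-5}$) defined above."
]
},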
{
"cell_type": "code",
"metadata": {
"id": "p5h5yMN7I8Ie",
"colab_type": "code",
"colab": {}
},
"source": [
"def test(ca1, test_loader, epoch, r_square_list):\n",
" '''\n",
" Function for Cross-validation\n",
" '''\n",
" with t.no_grad():\n",
" ca1.eval()\n",
" test_loss=0\n",
" for step, (batch_x, batch_y) in enumerate(test_loader):\n",
" z = batch_x\n",
" r = batch_y[np.newaxis, ...]\n",
" r_pred = ca1(z, r)\n",
" r_square_list.append(R_square(r_pred.detach().cpu().numpy(), r.detach().cpu().numpy()))\n",
" test_loss += loss_fn(r_pred, r).item()\n",
" if step % 250 == 0:\n",
" print('Test set: Average loss: {:f}'.format(test_loss))\n",
" print('Test-sample R^2 for iter: %d is %f' % (epoch+1, np.mean(r_square_list)))\n",
" print(\"\\n\")"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"scrolled": true,
"id": "aO9LkKtkPUfj",
"colab_type": "code",
"outputId": "62e075ae-4009-4f1d-a2d5-c21c2d1fee6f",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 1000
}
},
"source": [
"for epoch in range(3000):\n",
" running_loss = 0\n",
" r_square_list = []\n",
" test_rsq_list=[]\n",
" for loader in loader_list:\n",
" train(ca1, loader[0], optimizer, epoch, r_square_list)\n",
" test(ca1, loader[1] , epoch, test_rsq_list)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"Train Epoch: 0 [0/10692 (0%)]\tLoss: 0.495187\n",
"Insample R^2 for iter: 1 is 0.019467\n",
"\n",
"\n",
"Test set: Average loss: 0.302839\n",
"Test-sample R^2 for iter: 1 is 0.001469\n",
"\n",
"\n",
"Train Epoch: 0 [0/23538 (0%)]\tLoss: 0.418418\n",
"Insample R^2 for iter: 1 is 0.031241\n",
"\n",
"\n",
"Test set: Average loss: 0.302056\n",
"Test-sample R^2 for iter: 1 is 0.001507\n",
"\n",
"\n",
"Train Epoch: 0 [0/36468 (0%)]\tLoss: 0.368416\n",
"Insample R^2 for iter: 1 is 0.038133\n",
"\n",
"\n",
"Test set: Average loss: 0.302124\n",
"Test-sample R^2 for iter: 1 is 0.001502\n",
"\n",
"\n",
"Train Epoch: 0 [0/49745 (0%)]\tLoss: 0.572386\n",
"Insample R^2 for iter: 1 is -0.107901\n",
"\n",
"\n",
"Test set: Average loss: 0.301685\n",
"Test-sample R^2 for iter: 1 is 0.001700\n",
"\n",
"\n",
"Train Epoch: 0 [0/63294 (0%)]\tLoss: 0.331155\n",
"Insample R^2 for iter: 1 is -0.081846\n",
"\n",
"\n",
"Test set: Average loss: 0.302420\n",
"Test-sample R^2 for iter: 1 is 0.001951\n",
"\n",
"\n",
"Train Epoch: 0 [0/81599 (0%)]\tLoss: 0.355404\n",
"Insample R^2 for iter: 1 is -0.068114\n",
"\n",
"\n",
"Test set: Average loss: 0.300619\n",
"Test-sample R^2 for iter: 1 is 0.002209\n",
"\n",
"\n",
"Train Epoch: 0 [0/106354 (0%)]\tLoss: 0.373207\n",
"Insample R^2 for iter: 1 is -0.048556\n",
"\n",
"\n",
"Test set: Average loss: 0.295418\n",
"Test-sample R^2 for iter: 1 is 0.002439\n",
"\n",
"\n",
"Train Epoch: 0 [0/131647 (0%)]\tLoss: 0.395089\n",
"Insample R^2 for iter: 1 is -0.032488\n",
"\n",
"\n",
"Test set: Average loss: 0.290015\n",
"Test-sample R^2 for iter: 1 is 0.002587\n",
"\n",
"\n",
"Train Epoch: 0 [0/157513 (0%)]\tLoss: 0.408513\n",
"Insample R^2 for iter: 1 is -0.023576\n",
"\n",
"\n",
"Test set: Average loss: 0.286939\n",
"Test-sample R^2 for iter: 1 is 0.002716\n",
"\n",
"\n",
"Train Epoch: 0 [0/183724 (0%)]\tLoss: 0.388504\n",
"Insample R^2 for iter: 1 is -0.013414\n",
"\n",
"\n",
"Test set: Average loss: 0.284812\n",
"Test-sample R^2 for iter: 1 is 0.002864\n",
"\n",
"\n",
"Train Epoch: 0 [0/210124 (0%)]\tLoss: 0.364979\n",
"Insample R^2 for iter: 1 is -0.004385\n",
"\n",
"\n",
"Test set: Average loss: 0.285644\n",
"Test-sample R^2 for iter: 1 is 0.003055\n",
"\n",
"\n",
"Train Epoch: 0 [0/236621 (0%)]\tLoss: 0.352490\n",
"Insample R^2 for iter: 1 is 0.000121\n",
"\n",
"\n",
"Test set: Average loss: 0.288923\n",
"Test-sample R^2 for iter: 1 is 0.003300\n",
"\n",
"\n",
"Train Epoch: 0 [0/263967 (0%)]\tLoss: 0.330616\n",
"Insample R^2 for iter: 1 is 0.004434\n",
"\n",
"\n",
"Test set: Average loss: 0.291930\n",
"Test-sample R^2 for iter: 1 is 0.003562\n",
"\n",
"\n",
"Train Epoch: 0 [0/292711 (0%)]\tLoss: 0.325886\n",
"Insample R^2 for iter: 1 is 0.008092\n",
"\n",
"\n",
"Test set: Average loss: 0.291782\n",
"Test-sample R^2 for iter: 1 is 0.003818\n",
"\n",
"\n",
"Train Epoch: 0 [0/322680 (0%)]\tLoss: 0.303313\n",
"Insample R^2 for iter: 1 is 0.014066\n",
"\n",
"\n",
"Test set: Average loss: 0.293069\n",
"Test-sample R^2 for iter: 1 is 0.004087\n",
"\n",
"\n",
"Train Epoch: 0 [0/354563 (0%)]\tLoss: 0.308887\n",
"Insample R^2 for iter: 1 is 0.016937\n",
"\n",
"\n",
"Test set: Average loss: 0.294332\n",
"Test-sample R^2 for iter: 1 is 0.004374\n",
"\n",
"\n",
"Train Epoch: 0 [0/422402 (0%)]\tLoss: 0.284073\n",
"Insample R^2 for iter: 1 is 0.021660\n",
"\n",
"\n",
"Test set: Average loss: 0.298617\n",
"Test-sample R^2 for iter: 1 is 0.004673\n",
"\n",
"\n",
"Train Epoch: 0 [0/487054 (0%)]\tLoss: 0.292785\n",
"Insample R^2 for iter: 1 is 0.024163\n",
"\n",
"\n",
"Test set: Average loss: 0.298187\n",
"Test-sample R^2 for iter: 1 is 0.004959\n",
"\n",
"\n",
"Train Epoch: 1 [0/10692 (0%)]\tLoss: 0.554172\n",
"Insample R^2 for iter: 2 is -0.097911\n",
"\n",
"\n",
"Test set: Average loss: 0.296807\n",
"Test-sample R^2 for iter: 2 is 0.021359\n",
"\n",
"\n",
"Train Epoch: 1 [0/23538 (0%)]\tLoss: 0.392026\n",
"Insample R^2 for iter: 2 is 0.002896\n",
"\n",
"\n",
"Test set: Average loss: 0.295686\n",
"Test-sample R^2 for iter: 2 is 0.021981\n",
"\n",
"\n",
"Train Epoch: 1 [0/36468 (0%)]\tLoss: 0.335287\n",
"Insample R^2 for iter: 2 is 0.047834\n",
"\n",
"\n",
"Test set: Average loss: 0.295318\n",
"Test-sample R^2 for iter: 2 is 0.022649\n",
"\n",
"\n",
"Train Epoch: 1 [0/49745 (0%)]\tLoss: 0.311944\n",
"Insample R^2 for iter: 2 is 0.075961\n",
"\n",
"\n",
"Test set: Average loss: 0.294758\n",
"Test-sample R^2 for iter: 2 is 0.023288\n",
"\n",
"\n",
"Train Epoch: 1 [0/63294 (0%)]\tLoss: 0.401495\n",
"Insample R^2 for iter: 2 is 0.023410\n",
"\n",
"\n",
"Test set: Average loss: 0.295105\n",
"Test-sample R^2 for iter: 2 is 0.024045\n",
"\n",
"\n",
"Train Epoch: 1 [0/81599 (0%)]\tLoss: 0.310699\n",
"Insample R^2 for iter: 2 is 0.040689\n",
"\n",
"\n",
"Test set: Average loss: 0.293029\n",
"Test-sample R^2 for iter: 2 is 0.024814\n",
"\n",
"\n",
"Train Epoch: 1 [0/106354 (0%)]\tLoss: 0.337577\n",
"Insample R^2 for iter: 2 is 0.057481\n",
"\n",
"\n",
"Test set: Average loss: 0.287737\n",
"Test-sample R^2 for iter: 2 is 0.025514\n",
"\n",
"\n",
"Train Epoch: 1 [0/131647 (0%)]\tLoss: 0.379517\n",
"Insample R^2 for iter: 2 is 0.064852\n",
"\n",
"\n",
"Test set: Average loss: 0.282287\n",
"Test-sample R^2 for iter: 2 is 0.026097\n",
"\n",
"\n",
"Train Epoch: 1 [0/157513 (0%)]\tLoss: 0.356359\n",
"Insample R^2 for iter: 2 is 0.076533\n",
"\n",
"\n",
"Test set: Average loss: 0.279163\n",
"Test-sample R^2 for iter: 2 is 0.026613\n",
"\n",
"\n",
"Train Epoch: 1 [0/183724 (0%)]\tLoss: 0.352454\n",
"Insample R^2 for iter: 2 is 0.085288\n",
"\n",
"\n",
"Test set: Average loss: 0.277009\n",
"Test-sample R^2 for iter: 2 is 0.027100\n",
"\n",
"\n",
"Train Epoch: 1 [0/210124 (0%)]\tLoss: 0.334919\n",
"Insample R^2 for iter: 2 is 0.092230\n",
"\n",
"\n",
"Test set: Average loss: 0.277672\n",
"Test-sample R^2 for iter: 2 is 0.027612\n",
"\n",
"\n",
"Train Epoch: 1 [0/236621 (0%)]\tLoss: 0.312879\n",
"Insample R^2 for iter: 2 is 0.097642\n",
"\n",
"\n",
"Test set: Average loss: 0.280670\n",
"Test-sample R^2 for iter: 2 is 0.028177\n",
"\n",
"\n",
"Train Epoch: 1 [0/263967 (0%)]\tLoss: 0.310350\n",
"Insample R^2 for iter: 2 is 0.098933\n",
"\n",
"\n",
"Test set: Average loss: 0.283533\n",
"Test-sample R^2 for iter: 2 is 0.028723\n",
"\n",
"\n",
"Train Epoch: 1 [0/292711 (0%)]\tLoss: 0.309966\n",
"Insample R^2 for iter: 2 is 0.099156\n",
"\n",
"\n",
"Test set: Average loss: 0.283514\n",
"Test-sample R^2 for iter: 2 is 0.029191\n",
"\n",
"\n",
"Train Epoch: 1 [0/322680 (0%)]\tLoss: 0.297666\n",
"Insample R^2 for iter: 2 is 0.100185\n",
"\n",
"\n",
"Test set: Average loss: 0.284751\n",
"Test-sample R^2 for iter: 2 is 0.029646\n",
"\n",
"\n",
"Train Epoch: 1 [0/354563 (0%)]\tLoss: 0.319963\n",
"Insample R^2 for iter: 2 is 0.095548\n",
"\n",
"\n",
"Test set: Average loss: 0.286064\n",
"Test-sample R^2 for iter: 2 is 0.030076\n",
"\n",
"\n",
"Train Epoch: 1 [0/422402 (0%)]\tLoss: 0.285089\n",
"Insample R^2 for iter: 2 is 0.095452\n",
"\n",
"\n",
"Test set: Average loss: 0.290525\n",
"Test-sample R^2 for iter: 2 is 0.030442\n",
"\n",
"\n",
"Train Epoch: 1 [0/487054 (0%)]\tLoss: 0.276912\n",
"Insample R^2 for iter: 2 is 0.096686\n",
"\n",
"\n",
"Test set: Average loss: 0.290521\n",
"Test-sample R^2 for iter: 2 is 0.030711\n",
"\n",
"\n",
"Train Epoch: 2 [0/10692 (0%)]\tLoss: 0.605436\n",
"Insample R^2 for iter: 3 is -0.199947\n",
"\n",
"\n",
"Test set: Average loss: 0.284488\n",
"Test-sample R^2 for iter: 3 is 0.061976\n",
"\n",
"\n",
"Train Epoch: 2 [0/23538 (0%)]\tLoss: 0.391115\n",
"Insample R^2 for iter: 3 is -0.047097\n",
"\n",
"\n",
"Test set: Average loss: 0.283268\n",
"Test-sample R^2 for iter: 3 is 0.062812\n",
"\n",
"\n",
"Train Epoch: 2 [0/36468 (0%)]\tLoss: 0.317622\n",
"Insample R^2 for iter: 3 is 0.029738\n",
"\n",
"\n",
"Test set: Average loss: 0.282772\n",
"Test-sample R^2 for iter: 3 is 0.063692\n",
"\n",
"\n",
"Train Epoch: 2 [0/49745 (0%)]\tLoss: 0.306538\n",
"Insample R^2 for iter: 3 is 0.066040\n",
"\n",
"\n",
"Test set: Average loss: 0.282206\n",
"Test-sample R^2 for iter: 3 is 0.064447\n",
"\n",
"\n",
"Train Epoch: 2 [0/63294 (0%)]\tLoss: 0.326280\n",
"Insample R^2 for iter: 3 is 0.060188\n",
"\n",
"\n",
"Test set: Average loss: 0.282832\n",
"Test-sample R^2 for iter: 3 is 0.065065\n",
"\n",
"\n",
"Train Epoch: 2 [0/81599 (0%)]\tLoss: 0.332191\n",
"Insample R^2 for iter: 3 is 0.061186\n",
"\n",
"\n",
"Test set: Average loss: 0.281192\n",
"Test-sample R^2 for iter: 3 is 0.065537\n",
"\n",
"\n",
"Train Epoch: 2 [0/106354 (0%)]\tLoss: 0.332277\n",
"Insample R^2 for iter: 3 is 0.076943\n",
"\n",
"\n",
"Test set: Average loss: 0.276639\n",
"Test-sample R^2 for iter: 3 is 0.065766\n",
"\n",
"\n",
"Train Epoch: 2 [0/131647 (0%)]\tLoss: 0.350230\n",
"Insample R^2 for iter: 3 is 0.090450\n",
"\n",
"\n",
"Test set: Average loss: 0.272059\n",
"Test-sample R^2 for iter: 3 is 0.065710\n",
"\n",
"\n",
"Train Epoch: 2 [0/157513 (0%)]\tLoss: 0.414290\n",
"Insample R^2 for iter: 3 is 0.084189\n",
"\n",
"\n",
"Test set: Average loss: 0.269716\n",
"Test-sample R^2 for iter: 3 is 0.065469\n",
"\n",
"\n",
"Train Epoch: 2 [0/183724 (0%)]\tLoss: 0.356708\n",
"Insample R^2 for iter: 3 is 0.091157\n",
"\n",
"\n",
"Test set: Average loss: 0.268236\n",
"Test-sample R^2 for iter: 3 is 0.065137\n",
"\n",
"\n",
"Train Epoch: 2 [0/210124 (0%)]\tLoss: 0.323572\n",
"Insample R^2 for iter: 3 is 0.100161\n",
"\n",
"\n",
"Test set: Average loss: 0.269362\n",
"Test-sample R^2 for iter: 3 is 0.064823\n",
"\n",
"\n",
"Train Epoch: 2 [0/236621 (0%)]\tLoss: 0.307023\n",
"Insample R^2 for iter: 3 is 0.106230\n",
"\n",
"\n",
"Test set: Average loss: 0.272710\n",
"Test-sample R^2 for iter: 3 is 0.064569\n",
"\n",
"\n",
"Train Epoch: 2 [0/263967 (0%)]\tLoss: 0.292372\n",
"Insample R^2 for iter: 3 is 0.110831\n",
"\n",
"\n",
"Test set: Average loss: 0.276069\n",
"Test-sample R^2 for iter: 3 is 0.064269\n",
"\n",
"\n",
"Train Epoch: 2 [0/292711 (0%)]\tLoss: 0.287079\n",
"Insample R^2 for iter: 3 is 0.114971\n",
"\n",
"\n",
"Test set: Average loss: 0.276761\n",
"Test-sample R^2 for iter: 3 is 0.063839\n",
"\n",
"\n",
"Train Epoch: 2 [0/322680 (0%)]\tLoss: 0.321741\n",
"Insample R^2 for iter: 3 is 0.110129\n",
"\n",
"\n",
"Test set: Average loss: 0.278535\n",
"Test-sample R^2 for iter: 3 is 0.063387\n",
"\n",
"\n",
"Train Epoch: 2 [0/354563 (0%)]\tLoss: 0.313221\n",
"Insample R^2 for iter: 3 is 0.106158\n",
"\n",
"\n",
"Test set: Average loss: 0.280368\n",
"Test-sample R^2 for iter: 3 is 0.062908\n",
"\n",
"\n",
"Train Epoch: 2 [0/422402 (0%)]\tLoss: 0.261243\n",
"Insample R^2 for iter: 3 is 0.109927\n",
"\n",
"\n",
"Test set: Average loss: 0.285401\n",
"Test-sample R^2 for iter: 3 is 0.062343\n",
"\n",
"\n",
"Train Epoch: 2 [0/487054 (0%)]\tLoss: 0.272840\n",
"Insample R^2 for iter: 3 is 0.111080\n",
"\n",
"\n",
"Test set: Average loss: 0.286091\n",
"Test-sample R^2 for iter: 3 is 0.061656\n",
"\n",
"\n",
"Train Epoch: 3 [0/10692 (0%)]\tLoss: 0.396611\n",
"Insample R^2 for iter: 4 is 0.215488\n",
"\n",
"\n",
"Test set: Average loss: 0.277781\n",
"Test-sample R^2 for iter: 4 is 0.084092\n",
"\n",
"\n",
"Train Epoch: 3 [0/23538 (0%)]\tLoss: 0.358273\n",
"Insample R^2 for iter: 4 is 0.198361\n",
"\n",
"\n",
"Test set: Average loss: 0.277034\n",
"Test-sample R^2 for iter: 4 is 0.084174\n",
"\n",
"\n",
"Train Epoch: 3 [0/36468 (0%)]\tLoss: 0.314593\n",
"Insample R^2 for iter: 4 is 0.195969\n",
"\n",
"\n",
"Test set: Average loss: 0.276991\n",
"Test-sample R^2 for iter: 4 is 0.084301\n",
"\n",
"\n",
"Train Epoch: 3 [0/49745 (0%)]\tLoss: 0.341482\n",
"Insample R^2 for iter: 4 is 0.167002\n",
"\n",
"\n",
"Test set: Average loss: 0.276832\n",
"Test-sample R^2 for iter: 4 is 0.084347\n",
"\n",
"\n",
"Train Epoch: 3 [0/63294 (0%)]\tLoss: 0.313662\n",
"Insample R^2 for iter: 4 is 0.148445\n",
"\n",
"\n",
"Test set: Average loss: 0.277604\n",
"Test-sample R^2 for iter: 4 is 0.084432\n",
"\n",
"\n",
"Train Epoch: 3 [0/81599 (0%)]\tLoss: 0.354155\n",
"Insample R^2 for iter: 4 is 0.124357\n",
"\n",
"\n",
"Test set: Average loss: 0.276130\n",
"Test-sample R^2 for iter: 4 is 0.084473\n",
"\n",
"\n",
"Train Epoch: 3 [0/106354 (0%)]\tLoss: 0.323693\n",
"Insample R^2 for iter: 4 is 0.134159\n",
"\n",
"\n",
"Test set: Average loss: 0.271899\n",
"Test-sample R^2 for iter: 4 is 0.084280\n",
"\n",
"\n",
"Train Epoch: 3 [0/131647 (0%)]\tLoss: 0.344687\n",
"Insample R^2 for iter: 4 is 0.142129\n",
"\n",
"\n",
"Test set: Average loss: 0.267679\n",
"Test-sample R^2 for iter: 4 is 0.083791\n",
"\n",
"\n",
"Train Epoch: 3 [0/157513 (0%)]\tLoss: 0.338273\n",
"Insample R^2 for iter: 4 is 0.149922\n",
"\n",
"\n",
"Test set: Average loss: 0.265698\n",
"Test-sample R^2 for iter: 4 is 0.083091\n",
"\n",
"\n",
"Train Epoch: 3 [0/183724 (0%)]\tLoss: 0.342576\n",
"Insample R^2 for iter: 4 is 0.153684\n",
"\n",
"\n",
"Test set: Average loss: 0.264482\n",
"Test-sample R^2 for iter: 4 is 0.082310\n",
"\n",
"\n",
"Train Epoch: 3 [0/210124 (0%)]\tLoss: 0.318486\n",
"Insample R^2 for iter: 4 is 0.158162\n",
"\n",
"\n",
"Test set: Average loss: 0.265794\n",
"Test-sample R^2 for iter: 4 is 0.081565\n",
"\n",
"\n",
"Train Epoch: 3 [0/236621 (0%)]\tLoss: 0.301718\n",
"Insample R^2 for iter: 4 is 0.160591\n",
"\n",
"\n",
"Test set: Average loss: 0.269218\n",
"Test-sample R^2 for iter: 4 is 0.080916\n",
"\n",
"\n",
"Train Epoch: 3 [0/263967 (0%)]\tLoss: 0.286297\n",
"Insample R^2 for iter: 4 is 0.162347\n",
"\n",
"\n",
"Test set: Average loss: 0.272685\n",
"Test-sample R^2 for iter: 4 is 0.080245\n",
"\n",
"\n",
"Train Epoch: 3 [0/292711 (0%)]\tLoss: 0.290689\n",
"Insample R^2 for iter: 4 is 0.162049\n",
"\n",
"\n",
"Test set: Average loss: 0.273497\n",
"Test-sample R^2 for iter: 4 is 0.079467\n",
"\n",
"\n",
"Train Epoch: 3 [0/322680 (0%)]\tLoss: 0.273804\n",
"Insample R^2 for iter: 4 is 0.163644\n",
"\n",
"\n",
"Test set: Average loss: 0.275403\n",
"Test-sample R^2 for iter: 4 is 0.078680\n",
"\n",
"\n",
"Train Epoch: 3 [0/354563 (0%)]\tLoss: 0.322876\n",
"Insample R^2 for iter: 4 is 0.154473\n",
"\n",
"\n",
"Test set: Average loss: 0.277369\n",
"Test-sample R^2 for iter: 4 is 0.077876\n",
"\n",
"\n",
"Train Epoch: 3 [0/422402 (0%)]\tLoss: 0.254855\n",
"Insample R^2 for iter: 4 is 0.156599\n",
"\n",
"\n",
"Test set: Average loss: 0.282447\n",
"Test-sample R^2 for iter: 4 is 0.077007\n",
"\n",
"\n",
"Train Epoch: 3 [0/487054 (0%)]\tLoss: 0.267826\n",
"Insample R^2 for iter: 4 is 0.156048\n",
"\n",
"\n",
"Test set: Average loss: 0.283289\n",
"Test-sample R^2 for iter: 4 is 0.076022\n",
"\n",
"\n",
"Train Epoch: 4 [0/10692 (0%)]\tLoss: 0.402154\n",
"Insample R^2 for iter: 5 is 0.204404\n",
"\n",
"\n",
"Test set: Average loss: 0.274061\n",
"Test-sample R^2 for iter: 5 is 0.096358\n",
"\n",
"\n",
"Train Epoch: 4 [0/23538 (0%)]\tLoss: 0.360822\n",
"Insample R^2 for iter: 5 is 0.189856\n",
"\n",
"\n",
"Test set: Average loss: 0.273331\n",
"Test-sample R^2 for iter: 5 is 0.096427\n",
"\n",
"\n",
"Train Epoch: 4 [0/36468 (0%)]\tLoss: 0.304980\n",
"Insample R^2 for iter: 5 is 0.198575\n",
"\n",
"\n",
"Test set: Average loss: 0.273322\n",
"Test-sample R^2 for iter: 5 is 0.096512\n",
"\n",
"\n",
"Train Epoch: 4 [0/49745 (0%)]\tLoss: 0.303010\n",
"Insample R^2 for iter: 5 is 0.195023\n",
"\n",
"\n",
"Test set: Average loss: 0.273323\n",
"Test-sample R^2 for iter: 5 is 0.096406\n",
"\n",
"\n",
"Train Epoch: 4 [0/63294 (0%)]\tLoss: 0.311626\n",
"Insample R^2 for iter: 5 is 0.172057\n",
"\n",
"\n",
"Test set: Average loss: 0.274523\n",
"Test-sample R^2 for iter: 5 is 0.096111\n",
"\n",
"\n",
"Train Epoch: 4 [0/81599 (0%)]\tLoss: 0.285767\n",
"Insample R^2 for iter: 5 is 0.176289\n",
"\n",
"\n",
"Test set: Average loss: 0.273627\n",
"Test-sample R^2 for iter: 5 is 0.095588\n",
"\n",
"\n",
"Train Epoch: 4 [0/106354 (0%)]\tLoss: 0.315873\n",
"Insample R^2 for iter: 5 is 0.181468\n",
"\n",
"\n",
"Test set: Average loss: 0.269987\n",
"Test-sample R^2 for iter: 5 is 0.094728\n",
"\n",
"\n",
"Train Epoch: 4 [0/131647 (0%)]\tLoss: 0.338109\n",
"Insample R^2 for iter: 5 is 0.185444\n",
"\n",
"\n",
"Test set: Average loss: 0.266332\n",
"Test-sample R^2 for iter: 5 is 0.093512\n",
"\n",
"\n",
"Train Epoch: 4 [0/157513 (0%)]\tLoss: 0.332724\n",
"Insample R^2 for iter: 5 is 0.189863\n",
"\n",
"\n",
"Test set: Average loss: 0.264830\n",
"Test-sample R^2 for iter: 5 is 0.092067\n",
"\n",
"\n",
"Train Epoch: 4 [0/183724 (0%)]\tLoss: 0.327291\n",
"Insample R^2 for iter: 5 is 0.193275\n",
"\n",
"\n",
"Test set: Average loss: 0.264096\n",
"Test-sample R^2 for iter: 5 is 0.090523\n",
"\n",
"\n",
"Train Epoch: 4 [0/210124 (0%)]\tLoss: 0.345611\n",
"Insample R^2 for iter: 5 is 0.187933\n",
"\n",
"\n",
"Test set: Average loss: 0.265717\n",
"Test-sample R^2 for iter: 5 is 0.089056\n",
"\n",
"\n",
"Train Epoch: 4 [0/236621 (0%)]\tLoss: 0.323559\n",
"Insample R^2 for iter: 5 is 0.182936\n",
"\n",
"\n",
"Test set: Average loss: 0.269278\n",
"Test-sample R^2 for iter: 5 is 0.087766\n",
"\n",
"\n",
"Train Epoch: 4 [0/263967 (0%)]\tLoss: 0.279741\n",
"Insample R^2 for iter: 5 is 0.184419\n",
"\n",
"\n",
"Test set: Average loss: 0.272842\n",
"Test-sample R^2 for iter: 5 is 0.086527\n",
"\n",
"\n",
"Train Epoch: 4 [0/292711 (0%)]\tLoss: 0.277294\n",
"Insample R^2 for iter: 5 is 0.185333\n",
"\n",
"\n",
"Test set: Average loss: 0.273759\n",
"Test-sample R^2 for iter: 5 is 0.085237\n",
"\n",
"\n",
"Train Epoch: 4 [0/322680 (0%)]\tLoss: 0.277399\n",
"Insample R^2 for iter: 5 is 0.184652\n",
"\n",
"\n",
"Test set: Average loss: 0.275667\n",
"Test-sample R^2 for iter: 5 is 0.084005\n",
"\n",
"\n",
"Train Epoch: 4 [0/354563 (0%)]\tLoss: 0.264085\n",
"Insample R^2 for iter: 5 is 0.185431\n",
"\n",
"\n",
"Test set: Average loss: 0.277679\n",
"Test-sample R^2 for iter: 5 is 0.082803\n",
"\n",
"\n",
"Train Epoch: 4 [0/422402 (0%)]\tLoss: 0.262176\n",
"Insample R^2 for iter: 5 is 0.184352\n",
"\n",
"\n",
"Test set: Average loss: 0.282584\n",
"Test-sample R^2 for iter: 5 is 0.081618\n",
"\n",
"\n",
"Train Epoch: 4 [0/487054 (0%)]\tLoss: 0.250681\n",
"Insample R^2 for iter: 5 is 0.185317\n",
"\n",
"\n",
"Test set: Average loss: 0.283225\n",
"Test-sample R^2 for iter: 5 is 0.080389\n",
"\n",
"\n",
"Train Epoch: 5 [0/10692 (0%)]\tLoss: 0.389111\n",
"Insample R^2 for iter: 6 is 0.230309\n",
"\n",
"\n",
"Test set: Average loss: 0.274186\n",
"Test-sample R^2 for iter: 6 is 0.095945\n",
"\n",
"\n",
"Train Epoch: 5 [0/23538 (0%)]\tLoss: 0.342111\n",
"Insample R^2 for iter: 6 is 0.224300\n",
"\n",
"\n",
"Test set: Average loss: 0.273511\n",
"Test-sample R^2 for iter: 6 is 0.095923\n",
"\n",
"\n",
"Train Epoch: 5 [0/36468 (0%)]\tLoss: 0.351853\n",
"Insample R^2 for iter: 6 is 0.181049\n",
"\n",
"\n",
"Test set: Average loss: 0.273413\n",
"Test-sample R^2 for iter: 6 is 0.096075\n",
"\n",
"\n",
"Train Epoch: 5 [0/49745 (0%)]\tLoss: 0.302913\n",
"Insample R^2 for iter: 6 is 0.181929\n",
"\n",
"\n",
"Test set: Average loss: 0.273317\n",
"Test-sample R^2 for iter: 6 is 0.096084\n",
"\n",
"\n",
"Train Epoch: 5 [0/63294 (0%)]\tLoss: 0.356402\n",
"Insample R^2 for iter: 6 is 0.134940\n",
"\n",
"\n",
"Test set: Average loss: 0.274275\n",
"Test-sample R^2 for iter: 6 is 0.096017\n",
"\n",
"\n",
"Train Epoch: 5 [0/81599 (0%)]\tLoss: 0.282151\n",
"Insample R^2 for iter: 6 is 0.147053\n",
"\n",
"\n",
"Test set: Average loss: 0.273099\n",
"Test-sample R^2 for iter: 6 is 0.095801\n",
"\n",
"\n",
"Train Epoch: 5 [0/106354 (0%)]\tLoss: 0.311926\n",
"Insample R^2 for iter: 6 is 0.157816\n",
"\n",
"\n",
"Test set: Average loss: 0.269274\n",
"Test-sample R^2 for iter: 6 is 0.095255\n",
"\n",
"\n",
"Train Epoch: 5 [0/131647 (0%)]\tLoss: 0.332882\n",
"Insample R^2 for iter: 6 is 0.166272\n",
"\n",
"\n",
"Test set: Average loss: 0.265355\n",
"Test-sample R^2 for iter: 6 is 0.094391\n",
"\n",
"\n",
"Train Epoch: 5 [0/157513 (0%)]\tLoss: 0.332728\n",
"Insample R^2 for iter: 6 is 0.172814\n",
"\n",
"\n",
"Test set: Average loss: 0.263554\n",
"Test-sample R^2 for iter: 6 is 0.093341\n",
"\n",
"\n",
"Train Epoch: 5 [0/183724 (0%)]\tLoss: 0.325782\n",
"Insample R^2 for iter: 6 is 0.178286\n",
"\n",
"\n",
"Test set: Average loss: 0.262521\n",
"Test-sample R^2 for iter: 6 is 0.092220\n",
"\n",
"\n",
"Train Epoch: 5 [0/210124 (0%)]\tLoss: 0.309998\n",
"Insample R^2 for iter: 6 is 0.182461\n",
"\n",
"\n",
"Test set: Average loss: 0.264003\n",
"Test-sample R^2 for iter: 6 is 0.091142\n",
"\n",
"\n",
"Train Epoch: 5 [0/236621 (0%)]\tLoss: 0.341840\n",
"Insample R^2 for iter: 6 is 0.173780\n",
"\n",
"\n",
"Test set: Average loss: 0.267380\n",
"Test-sample R^2 for iter: 6 is 0.090222\n",
"\n",
"\n",
"Train Epoch: 5 [0/263967 (0%)]\tLoss: 0.277805\n",
"Insample R^2 for iter: 6 is 0.176390\n",
"\n",
"\n",
"Test set: Average loss: 0.270762\n",
"Test-sample R^2 for iter: 6 is 0.089339\n",
"\n",
"\n",
"Train Epoch: 5 [0/292711 (0%)]\tLoss: 0.274553\n",
"Insample R^2 for iter: 6 is 0.178444\n",
"\n",
"\n",
"Test set: Average loss: 0.271549\n",
"Test-sample R^2 for iter: 6 is 0.088385\n",
"\n",
"\n",
"Train Epoch: 5 [0/322680 (0%)]\tLoss: 0.267713\n",
"Insample R^2 for iter: 6 is 0.180154\n",
"\n",
"\n",
"Test set: Average loss: 0.273420\n",
"Test-sample R^2 for iter: 6 is 0.087451\n",
"\n",
"\n",
"Train Epoch: 5 [0/354563 (0%)]\tLoss: 0.262149\n",
"Insample R^2 for iter: 6 is 0.181580\n",
"\n",
"\n",
"Test set: Average loss: 0.275434\n",
"Test-sample R^2 for iter: 6 is 0.086506\n",
"\n",
"\n",
"Train Epoch: 5 [0/422402 (0%)]\tLoss: 0.262250\n",
"Insample R^2 for iter: 6 is 0.180710\n",
"\n",
"\n",
"Test set: Average loss: 0.280472\n",
"Test-sample R^2 for iter: 6 is 0.085515\n",
"\n",
"\n",
"Train Epoch: 5 [0/487054 (0%)]\tLoss: 0.249144\n",
"Insample R^2 for iter: 6 is 0.182146\n",
"\n",
"\n",
"Test set: Average loss: 0.281306\n",
"Test-sample R^2 for iter: 6 is 0.084423\n",
"\n",
"\n",
"Train Epoch: 6 [0/10692 (0%)]\tLoss: 0.441980\n",
"Insample R^2 for iter: 7 is 0.125074\n",
"\n",
"\n",
"Test set: Average loss: 0.271842\n",
"Test-sample R^2 for iter: 7 is 0.103672\n",
"\n",
"\n",
"Train Epoch: 6 [0/23538 (0%)]\tLoss: 0.356457\n",
"Insample R^2 for iter: 7 is 0.155157\n",
"\n",
"\n",
"Test set: Average loss: 0.271080\n",
"Test-sample R^2 for iter: 7 is 0.103804\n",
"\n",
"\n",
"Train Epoch: 6 [0/36468 (0%)]\tLoss: 0.317977\n",
"Insample R^2 for iter: 7 is 0.164181\n",
"\n",
"\n",
"Test set: Average loss: 0.270936\n",
"Test-sample R^2 for iter: 7 is 0.104058\n",
"\n",
"\n",
"Train Epoch: 6 [0/49745 (0%)]\tLoss: 0.294536\n",
"Insample R^2 for iter: 7 is 0.174942\n",
"\n",
"\n",
"Test set: Average loss: 0.270688\n",
"Test-sample R^2 for iter: 7 is 0.104245\n",
"\n",
"\n",
"Train Epoch: 6 [0/63294 (0%)]\tLoss: 0.299591\n",
"Insample R^2 for iter: 7 is 0.163121\n",
"\n",
"\n",
"Test set: Average loss: 0.271177\n",
"Test-sample R^2 for iter: 7 is 0.104589\n",
"\n",
"\n",
"Train Epoch: 6 [0/81599 (0%)]\tLoss: 0.294846\n",
"Insample R^2 for iter: 7 is 0.164536\n",
"\n",
"\n",
"Test set: Average loss: 0.269413\n",
"Test-sample R^2 for iter: 7 is 0.104981\n",
"\n",
"\n",
"Train Epoch: 6 [0/106354 (0%)]\tLoss: 0.311085\n",
"Insample R^2 for iter: 7 is 0.173094\n",
"\n",
"\n",
"Test set: Average loss: 0.265109\n",
"Test-sample R^2 for iter: 7 is 0.105129\n",
"\n",
"\n",
"Train Epoch: 6 [0/131647 (0%)]\tLoss: 0.336373\n",
"Insample R^2 for iter: 7 is 0.178610\n",
"\n",
"\n",
"Test set: Average loss: 0.260730\n",
"Test-sample R^2 for iter: 7 is 0.105018\n",
"\n",
"\n",
"Train Epoch: 6 [0/157513 (0%)]\tLoss: 0.336143\n",
"Insample R^2 for iter: 7 is 0.182885\n",
"\n",
"\n",
"Test set: Average loss: 0.258543\n",
"Test-sample R^2 for iter: 7 is 0.104720\n",
"\n",
"\n",
"Train Epoch: 6 [0/183724 (0%)]\tLoss: 0.328243\n",
"Insample R^2 for iter: 7 is 0.186755\n",
"\n",
"\n",
"Test set: Average loss: 0.257200\n",
"Test-sample R^2 for iter: 7 is 0.104322\n",
"\n",
"\n",
"Train Epoch: 6 [0/210124 (0%)]\tLoss: 0.312993\n",
"Insample R^2 for iter: 7 is 0.189468\n",
"\n",
"\n",
"Test set: Average loss: 0.258155\n",
"Test-sample R^2 for iter: 7 is 0.103995\n",
"\n",
"\n",
"Train Epoch: 6 [0/236621 (0%)]\tLoss: 0.292341\n",
"Insample R^2 for iter: 7 is 0.191392\n",
"\n",
"\n",
"Test set: Average loss: 0.261077\n",
"Test-sample R^2 for iter: 7 is 0.103811\n",
"\n",
"\n",
"Train Epoch: 6 [0/263967 (0%)]\tLoss: 0.283004\n",
"Insample R^2 for iter: 7 is 0.191490\n",
"\n",
"\n",
"Test set: Average loss: 0.264016\n",
"Test-sample R^2 for iter: 7 is 0.103648\n",
"\n",
"\n",
"Train Epoch: 6 [0/292711 (0%)]\tLoss: 0.278743\n",
"Insample R^2 for iter: 7 is 0.191585\n",
"\n",
"\n",
"Test set: Average loss: 0.264630\n",
"Test-sample R^2 for iter: 7 is 0.103354\n",
"\n",
"\n",
"Train Epoch: 6 [0/322680 (0%)]\tLoss: 0.278203\n",
"Insample R^2 for iter: 7 is 0.190315\n",
"\n",
"\n",
"Test set: Average loss: 0.266414\n",
"Test-sample R^2 for iter: 7 is 0.103003\n",
"\n",
"\n",
"Train Epoch: 6 [0/354563 (0%)]\tLoss: 0.266655\n",
"Insample R^2 for iter: 7 is 0.190237\n",
"\n",
"\n",
"Test set: Average loss: 0.268383\n",
"Test-sample R^2 for iter: 7 is 0.102570\n",
"\n",
"\n",
"Train Epoch: 6 [0/422402 (0%)]\tLoss: 0.250937\n",
"Insample R^2 for iter: 7 is 0.190984\n",
"\n",
"\n",
"Test set: Average loss: 0.273894\n",
"Test-sample R^2 for iter: 7 is 0.101917\n",
"\n",
"\n",
"Train Epoch: 6 [0/487054 (0%)]\tLoss: 0.260581\n",
"Insample R^2 for iter: 7 is 0.189802\n",
"\n",
"\n",
"Test set: Average loss: 0.275646\n",
"Test-sample R^2 for iter: 7 is 0.100959\n",
"\n",
"\n",
"Train Epoch: 7 [0/10692 (0%)]\tLoss: 0.392688\n",
"Insample R^2 for iter: 8 is 0.223082\n",
"\n",
"\n",
"Test set: Average loss: 0.264003\n",
"Test-sample R^2 for iter: 8 is 0.129522\n",
"\n",
"\n",
"Train Epoch: 7 [0/23538 (0%)]\tLoss: 0.352908\n",
"Insample R^2 for iter: 8 is 0.208207\n",
"\n",
"\n",
"Test set: Average loss: 0.263583\n",
"Test-sample R^2 for iter: 8 is 0.129120\n",
"\n",
"\n",
"Train Epoch: 7 [0/36468 (0%)]\tLoss: 0.325527\n",
"Insample R^2 for iter: 8 is 0.193002\n",
"\n",
"\n",
"Test set: Average loss: 0.263795\n",
"Test-sample R^2 for iter: 8 is 0.128802\n",
"\n",
"\n",
"Train Epoch: 7 [0/49745 (0%)]\tLoss: 0.299277\n",
"Insample R^2 for iter: 8 is 0.193322\n",
"\n",
"\n",
"Test set: Average loss: 0.263857\n",
"Test-sample R^2 for iter: 8 is 0.128450\n",
"\n",
"\n",
"Train Epoch: 7 [0/63294 (0%)]\tLoss: 0.288400\n",
"Insample R^2 for iter: 8 is 0.184462\n",
"\n",
"\n",
"Test set: Average loss: 0.264737\n",
"Test-sample R^2 for iter: 8 is 0.128199\n",
"\n",
"\n",
"Train Epoch: 7 [0/81599 (0%)]\tLoss: 0.293456\n",
"Insample R^2 for iter: 8 is 0.182962\n",
"\n",
"\n",
"Test set: Average loss: 0.263682\n",
"Test-sample R^2 for iter: 8 is 0.127822\n",
"\n",
"\n",
"Train Epoch: 7 [0/106354 (0%)]\tLoss: 0.319978\n",
"Insample R^2 for iter: 8 is 0.185688\n",
"\n",
"\n",
"Test set: Average loss: 0.260318\n",
"Test-sample R^2 for iter: 8 is 0.127016\n",
"\n",
"\n",
"Train Epoch: 7 [0/131647 (0%)]\tLoss: 0.333916\n",
"Insample R^2 for iter: 8 is 0.190341\n",
"\n",
"\n",
"Test set: Average loss: 0.256769\n",
"Test-sample R^2 for iter: 8 is 0.125870\n",
"\n",
"\n",
"Train Epoch: 7 [0/157513 (0%)]\tLoss: 0.355553\n",
"Insample R^2 for iter: 8 is 0.188249\n",
"\n",
"\n",
"Test set: Average loss: 0.255262\n",
"Test-sample R^2 for iter: 8 is 0.124521\n",
"\n",
"\n",
"Train Epoch: 7 [0/183724 (0%)]\tLoss: 0.321723\n",
"Insample R^2 for iter: 8 is 0.193134\n",
"\n",
"\n",
"Test set: Average loss: 0.254608\n",
"Test-sample R^2 for iter: 8 is 0.123048\n",
"\n",
"\n",
"Train Epoch: 7 [0/210124 (0%)]\tLoss: 0.307680\n",
"Insample R^2 for iter: 8 is 0.196479\n",
"\n",
"\n",
"Test set: Average loss: 0.256143\n",
"Test-sample R^2 for iter: 8 is 0.121656\n",
"\n",
"\n",
"Train Epoch: 7 [0/236621 (0%)]\tLoss: 0.292560\n",
"Insample R^2 for iter: 8 is 0.197764\n",
"\n",
"\n",
"Test set: Average loss: 0.259650\n",
"Test-sample R^2 for iter: 8 is 0.120410\n",
"\n",
"\n",
"Train Epoch: 7 [0/263967 (0%)]\tLoss: 0.281499\n",
"Insample R^2 for iter: 8 is 0.197701\n",
"\n",
"\n",
"Test set: Average loss: 0.263168\n",
"Test-sample R^2 for iter: 8 is 0.119192\n",
"\n",
"\n",
"Train Epoch: 7 [0/292711 (0%)]\tLoss: 0.273079\n",
"Insample R^2 for iter: 8 is 0.198529\n",
"\n",
"\n",
"Test set: Average loss: 0.264303\n",
"Test-sample R^2 for iter: 8 is 0.117867\n",
"\n",
"\n",
"Train Epoch: 7 [0/322680 (0%)]\tLoss: 0.265118\n",
"Insample R^2 for iter: 8 is 0.199408\n",
"\n",
"\n",
"Test set: Average loss: 0.266487\n",
"Test-sample R^2 for iter: 8 is 0.116532\n",
"\n",
"\n",
"Train Epoch: 7 [0/354563 (0%)]\tLoss: 0.265819\n",
"Insample R^2 for iter: 8 is 0.198919\n",
"\n",
"\n",
"Test set: Average loss: 0.268745\n",
"Test-sample R^2 for iter: 8 is 0.115177\n",
"\n",
"\n",
"Train Epoch: 7 [0/422402 (0%)]\tLoss: 0.252153\n",
"Insample R^2 for iter: 8 is 0.198922\n",
"\n",
"\n",
"Test set: Average loss: 0.274337\n",
"Test-sample R^2 for iter: 8 is 0.113697\n",
"\n",
"\n",
"Train Epoch: 7 [0/487054 (0%)]\tLoss: 0.245845\n",
"Insample R^2 for iter: 8 is 0.199927\n",
"\n",
"\n",
"Test set: Average loss: 0.275986\n",
"Test-sample R^2 for iter: 8 is 0.112021\n",
"\n",
"\n",
"Train Epoch: 8 [0/10692 (0%)]\tLoss: 0.388015\n",
"Insample R^2 for iter: 9 is 0.232346\n",
"\n",
"\n",
"Test set: Average loss: 0.265597\n",
"Test-sample R^2 for iter: 9 is 0.124265\n",
"\n",
"\n",
"Train Epoch: 8 [0/23538 (0%)]\tLoss: 0.378824\n",
"Insample R^2 for iter: 9 is 0.183014\n",
"\n",
"\n",
"Test set: Average loss: 0.265195\n",
"Test-sample R^2 for iter: 9 is 0.123828\n",
"\n",
"\n",
"Train Epoch: 8 [0/36468 (0%)]\tLoss: 0.309164\n",
"Insample R^2 for iter: 9 is 0.190319\n",
"\n",
"\n",
"Test set: Average loss: 0.265425\n",
"Test-sample R^2 for iter: 9 is 0.123479\n",
"\n",
"\n",
"Train Epoch: 8 [0/49745 (0%)]\tLoss: 0.305749\n",
"Insample R^2 for iter: 9 is 0.186910\n",
"\n",
"\n",
"Test set: Average loss: 0.265446\n",
"Test-sample R^2 for iter: 9 is 0.123144\n",
"\n",
"\n",
"Train Epoch: 8 [0/63294 (0%)]\tLoss: 0.286298\n",
"Insample R^2 for iter: 9 is 0.180572\n",
"\n",
"\n",
"Test set: Average loss: 0.266207\n",
"Test-sample R^2 for iter: 9 is 0.122985\n",
"\n",
"\n",
"Train Epoch: 8 [0/81599 (0%)]\tLoss: 0.301452\n",
"Insample R^2 for iter: 9 is 0.175940\n",
"\n",
"\n",
"Test set: Average loss: 0.264743\n",
"Test-sample R^2 for iter: 9 is 0.122891\n",
"\n",
"\n",
"Train Epoch: 8 [0/106354 (0%)]\tLoss: 0.310997\n",
"Insample R^2 for iter: 9 is 0.182884\n",
"\n",
"\n",
"Test set: Average loss: 0.260859\n",
"Test-sample R^2 for iter: 9 is 0.122528\n",
"\n",
"\n",
"Train Epoch: 8 [0/131647 (0%)]\tLoss: 0.338989\n",
"Insample R^2 for iter: 9 is 0.186396\n",
"\n",
"\n",
"Test set: Average loss: 0.256795\n",
"Test-sample R^2 for iter: 9 is 0.121932\n",
"\n",
"\n",
"Train Epoch: 8 [0/157513 (0%)]\tLoss: 0.327706\n",
"Insample R^2 for iter: 9 is 0.191992\n",
"\n",
"\n",
"Test set: Average loss: 0.254969\n",
"Test-sample R^2 for iter: 9 is 0.121133\n",
"\n",
"\n",
"Train Epoch: 8 [0/183724 (0%)]\tLoss: 0.320673\n",
"Insample R^2 for iter: 9 is 0.196748\n",
"\n",
"\n",
"Test set: Average loss: 0.253992\n",
"Test-sample R^2 for iter: 9 is 0.120215\n",
"\n",
"\n",
"Train Epoch: 8 [0/210124 (0%)]\tLoss: 0.325932\n",
"Insample R^2 for iter: 9 is 0.195577\n",
"\n",
"\n",
"Test set: Average loss: 0.255218\n",
"Test-sample R^2 for iter: 9 is 0.119374\n",
"\n",
"\n",
"Train Epoch: 8 [0/236621 (0%)]\tLoss: 0.294314\n",
"Insample R^2 for iter: 9 is 0.196536\n",
"\n",
"\n",
"Test set: Average loss: 0.258309\n",
"Test-sample R^2 for iter: 9 is 0.118702\n",
"\n",
"\n",
"Train Epoch: 8 [0/263967 (0%)]\tLoss: 0.275007\n",
"Insample R^2 for iter: 9 is 0.197998\n",
"\n",
"\n",
"Test set: Average loss: 0.261407\n",
"Test-sample R^2 for iter: 9 is 0.118076\n",
"\n",
"\n",
"Train Epoch: 8 [0/292711 (0%)]\tLoss: 0.270340\n",
"Insample R^2 for iter: 9 is 0.199371\n",
"\n",
"\n",
"Test set: Average loss: 0.262099\n",
"Test-sample R^2 for iter: 9 is 0.117367\n",
"\n",
"\n",
"Train Epoch: 8 [0/322680 (0%)]\tLoss: 0.263859\n",
"Insample R^2 for iter: 9 is 0.200440\n",
"\n",
"\n",
"Test set: Average loss: 0.263879\n",
"Test-sample R^2 for iter: 9 is 0.116654\n",
"\n",
"\n",
"Train Epoch: 8 [0/354563 (0%)]\tLoss: 0.260487\n",
"Insample R^2 for iter: 9 is 0.200903\n",
"\n",
"\n",
"Test set: Average loss: 0.265871\n",
"Test-sample R^2 for iter: 9 is 0.115897\n",
"\n",
"\n",
"Train Epoch: 8 [0/422402 (0%)]\tLoss: 0.264519\n",
"Insample R^2 for iter: 9 is 0.198454\n",
"\n",
"\n",
"Test set: Average loss: 0.271350\n",
"Test-sample R^2 for iter: 9 is 0.114956\n",
"\n",
"\n",
"Train Epoch: 8 [0/487054 (0%)]\tLoss: 0.254784\n",
"Insample R^2 for iter: 9 is 0.197884\n",
"\n",
"\n",
"Test set: Average loss: 0.273231\n",
"Test-sample R^2 for iter: 9 is 0.113719\n",
"\n",
"\n",
"Train Epoch: 9 [0/10692 (0%)]\tLoss: 0.381564\n",
"Insample R^2 for iter: 10 is 0.245126\n",
"\n",
"\n",
"Test set: Average loss: 0.261079\n",
"Test-sample R^2 for iter: 10 is 0.139161\n",
"\n",
"\n",
"Train Epoch: 9 [0/23538 (0%)]\tLoss: 0.337485\n",
"Insample R^2 for iter: 10 is 0.236915\n",
"\n",
"\n",
"Test set: Average loss: 0.260446\n",
"Test-sample R^2 for iter: 10 is 0.139125\n",
"\n",
"\n",
"Train Epoch: 9 [0/36468 (0%)]\tLoss: 0.311623\n",
"Insample R^2 for iter: 10 is 0.224105\n",
"\n",
"\n",
"Test set: Average loss: 0.260420\n",
"Test-sample R^2 for iter: 10 is 0.139191\n",
"\n",
"\n",
"Train Epoch: 9 [0/49745 (0%)]\tLoss: 0.298059\n",
"Insample R^2 for iter: 10 is 0.217444\n",
"\n",
"\n",
"Test set: Average loss: 0.260486\n",
"Test-sample R^2 for iter: 10 is 0.139028\n",
"\n",
"\n",
"Train Epoch: 9 [0/63294 (0%)]\tLoss: 0.351110\n",
"Insample R^2 for iter: 10 is 0.166440\n",
"\n",
"\n",
"Test set: Average loss: 0.261617\n",
"Test-sample R^2 for iter: 10 is 0.138719\n",
"\n",
"\n",
"Train Epoch: 9 [0/81599 (0%)]\tLoss: 0.310211\n",
"Insample R^2 for iter: 10 is 0.160017\n",
"\n",
"\n",
"Test set: Average loss: 0.260657\n",
"Test-sample R^2 for iter: 10 is 0.138261\n",
"\n",
"\n",
"Train Epoch: 9 [0/106354 (0%)]\tLoss: 0.311865\n",
"Insample R^2 for iter: 10 is 0.168914\n",
"\n",
"\n",
"Test set: Average loss: 0.257437\n",
"Test-sample R^2 for iter: 10 is 0.137351\n",
"\n",
"\n",
"Train Epoch: 9 [0/131647 (0%)]\tLoss: 0.340159\n",
"Insample R^2 for iter: 10 is 0.173823\n",
"\n",
"\n",
"Test set: Average loss: 0.253877\n",
"Test-sample R^2 for iter: 10 is 0.136155\n",
"\n",
"\n",
"Train Epoch: 9 [0/157513 (0%)]\tLoss: 0.326254\n",
"Insample R^2 for iter: 10 is 0.181187\n",
"\n",
"\n",
"Test set: Average loss: 0.252390\n",
"Test-sample R^2 for iter: 10 is 0.134771\n",
"\n",
"\n",
"Train Epoch: 9 [0/183724 (0%)]\tLoss: 0.325299\n",
"Insample R^2 for iter: 10 is 0.185914\n",
"\n",
"\n",
"Test set: Average loss: 0.251778\n",
"Test-sample R^2 for iter: 10 is 0.133263\n",
"\n",
"\n",
"Train Epoch: 9 [0/210124 (0%)]\tLoss: 0.315305\n",
"Insample R^2 for iter: 10 is 0.188157\n",
"\n",
"\n",
"Test set: Average loss: 0.253298\n",
"Test-sample R^2 for iter: 10 is 0.131843\n",
"\n",
"\n",
"Train Epoch: 9 [0/236621 (0%)]\tLoss: 0.305532\n",
"Insample R^2 for iter: 10 is 0.187192\n",
"\n",
"\n",
"Test set: Average loss: 0.256785\n",
"Test-sample R^2 for iter: 10 is 0.130569\n",
"\n",
"\n",
"Train Epoch: 9 [0/263967 (0%)]\tLoss: 0.292717\n",
"Insample R^2 for iter: 10 is 0.185451\n",
"\n",
"\n",
"Test set: Average loss: 0.260287\n",
"Test-sample R^2 for iter: 10 is 0.129324\n",
"\n",
"\n",
"Train Epoch: 9 [0/292711 (0%)]\tLoss: 0.281294\n",
"Insample R^2 for iter: 10 is 0.185432\n",
"\n",
"\n",
"Test set: Average loss: 0.261542\n",
"Test-sample R^2 for iter: 10 is 0.127946\n",
"\n",
"\n",
"Train Epoch: 9 [0/322680 (0%)]\tLoss: 0.323801\n",
"Insample R^2 for iter: 10 is 0.175447\n",
"\n",
"\n",
"Test set: Average loss: 0.263798\n",
"Test-sample R^2 for iter: 10 is 0.126546\n",
"\n",
"\n",
"Train Epoch: 9 [0/354563 (0%)]\tLoss: 0.265491\n",
"Insample R^2 for iter: 10 is 0.176509\n",
"\n",
"\n",
"Test set: Average loss: 0.266263\n",
"Test-sample R^2 for iter: 10 is 0.125088\n",
"\n",
"\n",
"Train Epoch: 9 [0/422402 (0%)]\tLoss: 0.252900\n",
"Insample R^2 for iter: 10 is 0.177682\n",
"\n",
"\n",
"Test set: Average loss: 0.272116\n",
"Test-sample R^2 for iter: 10 is 0.123458\n",
"\n",
"\n",
"Train Epoch: 9 [0/487054 (0%)]\tLoss: 0.270467\n",
"Insample R^2 for iter: 10 is 0.175465\n",
"\n",
"\n",
"Test set: Average loss: 0.274090\n",
"Test-sample R^2 for iter: 10 is 0.121590\n",
"\n",
"\n",
"Train Epoch: 10 [0/10692 (0%)]\tLoss: 0.392986\n",
"Insample R^2 for iter: 11 is 0.222377\n",
"\n",
"\n",
"Test set: Average loss: 0.263177\n",
"Test-sample R^2 for iter: 11 is 0.132244\n",
"\n",
"\n",
"Train Epoch: 10 [0/23538 (0%)]\tLoss: 0.359931\n",
"Insample R^2 for iter: 11 is 0.199713\n",
"\n",
"\n",
"Test set: Average loss: 0.262789\n",
"Test-sample R^2 for iter: 11 is 0.131795\n",
"\n",
"\n",
"Train Epoch: 10 [0/36468 (0%)]\tLoss: 0.298200\n",
"Insample R^2 for iter: 11 is 0.210884\n",
"\n",
"\n",
"Test set: Average loss: 0.263159\n",
"Test-sample R^2 for iter: 11 is 0.131285\n",
"\n",
"\n",
"Train Epoch: 10 [0/49745 (0%)]\tLoss: 0.293190\n",
"Insample R^2 for iter: 11 is 0.210824\n",
"\n",
"\n",
"Test set: Average loss: 0.263544\n",
"Test-sample R^2 for iter: 11 is 0.130572\n",
"\n",
"\n",
"Train Epoch: 10 [0/63294 (0%)]\tLoss: 0.318758\n",
"Insample R^2 for iter: 11 is 0.180378\n",
"\n",
"\n",
"Test set: Average loss: 0.264831\n",
"Test-sample R^2 for iter: 11 is 0.129834\n",
"\n",
"\n",
"Train Epoch: 10 [0/81599 (0%)]\tLoss: 0.283067\n",
"Insample R^2 for iter: 11 is 0.184435\n",
"\n",
"\n",
"Test set: Average loss: 0.263963\n",
"Test-sample R^2 for iter: 11 is 0.129030\n",
"\n",
"\n",
"Train Epoch: 10 [0/106354 (0%)]\tLoss: 0.305279\n",
"Insample R^2 for iter: 11 is 0.192204\n",
"\n",
"\n",
"Test set: Average loss: 0.260546\n",
"Test-sample R^2 for iter: 11 is 0.127941\n",
"\n",
"\n",
"Train Epoch: 10 [0/131647 (0%)]\tLoss: 0.357072\n",
"Insample R^2 for iter: 11 is 0.189246\n",
"\n",
"\n",
"Test set: Average loss: 0.256786\n",
"Test-sample R^2 for iter: 11 is 0.126672\n",
"\n",
"\n",
"Train Epoch: 10 [0/157513 (0%)]\tLoss: 0.376195\n",
"Insample R^2 for iter: 11 is 0.181884\n",
"\n",
"\n",
"Test set: Average loss: 0.255029\n",
"Test-sample R^2 for iter: 11 is 0.125324\n",
"\n",
"\n",
"Train Epoch: 10 [0/183724 (0%)]\tLoss: 0.317337\n",
"Insample R^2 for iter: 11 is 0.188439\n",
"\n",
"\n",
"Test set: Average loss: 0.254109\n",
"Test-sample R^2 for iter: 11 is 0.123946\n",
"\n",
"\n",
"Train Epoch: 10 [0/210124 (0%)]\tLoss: 0.319778\n",
"Insample R^2 for iter: 11 is 0.189427\n",
"\n",
"\n",
"Test set: Average loss: 0.255429\n",
"Test-sample R^2 for iter: 11 is 0.122698\n",
"\n",
"\n",
"Train Epoch: 10 [0/236621 (0%)]\tLoss: 0.291410\n",
"Insample R^2 for iter: 11 is 0.191547\n",
"\n",
"\n",
"Test set: Average loss: 0.258703\n",
"Test-sample R^2 for iter: 11 is 0.121637\n",
"\n",
"\n",
"Train Epoch: 10 [0/263967 (0%)]\tLoss: 0.277218\n",
"Insample R^2 for iter: 11 is 0.192897\n",
"\n",
"\n",
"Test set: Average loss: 0.261888\n",
"Test-sample R^2 for iter: 11 is 0.120659\n",
"\n",
"\n",
"Train Epoch: 10 [0/292711 (0%)]\tLoss: 0.271893\n",
"Insample R^2 for iter: 11 is 0.194304\n",
"\n",
"\n",
"Test set: Average loss: 0.262641\n",
"Test-sample R^2 for iter: 11 is 0.119633\n",
"\n",
"\n",
"Train Epoch: 10 [0/322680 (0%)]\tLoss: 0.267085\n",
"Insample R^2 for iter: 11 is 0.195061\n",
"\n",
"\n",
"Test set: Average loss: 0.264492\n",
"Test-sample R^2 for iter: 11 is 0.118631\n",
"\n",
"\n",
"Train Epoch: 10 [0/354563 (0%)]\tLoss: 0.260302\n",
"Insample R^2 for iter: 11 is 0.195890\n",
"\n",
"\n",
"Test set: Average loss: 0.266459\n",
"Test-sample R^2 for iter: 11 is 0.117626\n",
"\n",
"\n",
"Train Epoch: 10 [0/422402 (0%)]\tLoss: 0.285042\n",
"Insample R^2 for iter: 11 is 0.189861\n",
"\n",
"\n",
"Test set: Average loss: 0.271935\n",
"Test-sample R^2 for iter: 11 is 0.116470\n",
"\n",
"\n",
"Train Epoch: 10 [0/487054 (0%)]\tLoss: 0.259387\n",
"Insample R^2 for iter: 11 is 0.188942\n",
"\n",
"\n",
"Test set: Average loss: 0.273791\n",
"Test-sample R^2 for iter: 11 is 0.115046\n",
"\n",
"\n",
"Train Epoch: 11 [0/10692 (0%)]\tLoss: 0.384373\n",
"Insample R^2 for iter: 12 is 0.239482\n",
"\n",
"\n",
"Test set: Average loss: 0.261870\n",
"Test-sample R^2 for iter: 12 is 0.136555\n",
"\n",
"\n",
"Train Epoch: 11 [0/23538 (0%)]\tLoss: 0.334807\n",
"Insample R^2 for iter: 12 is 0.237139\n",
"\n",
"\n",
"Test set: Average loss: 0.261191\n",
"Test-sample R^2 for iter: 12 is 0.136590\n",
"\n",
"\n",
"Train Epoch: 11 [0/36468 (0%)]\tLoss: 0.302839\n",
"Insample R^2 for iter: 12 is 0.231814\n",
"\n",
"\n",
"Test set: Average loss: 0.261189\n",
"Test-sample R^2 for iter: 12 is 0.136653\n",
"\n",
"\n",
"Train Epoch: 11 [0/49745 (0%)]\tLoss: 0.291799\n",
"Insample R^2 for iter: 12 is 0.227451\n",
"\n",
"\n",
"Test set: Average loss: 0.261094\n",
"Test-sample R^2 for iter: 12 is 0.136623\n",
"\n",
"\n",
"Train Epoch: 11 [0/63294 (0%)]\tLoss: 0.348139\n",
"Insample R^2 for iter: 12 is 0.176196\n",
"\n",
"\n",
"Test set: Average loss: 0.262135\n",
"Test-sample R^2 for iter: 12 is 0.136453\n",
"\n",
"\n",
"Train Epoch: 11 [0/81599 (0%)]\tLoss: 0.278902\n",
"Insample R^2 for iter: 12 is 0.182906\n",
"\n",
"\n",
"Test set: Average loss: 0.261196\n",
"Test-sample R^2 for iter: 12 is 0.136074\n",
"\n",
"\n",
"Train Epoch: 11 [0/106354 (0%)]\tLoss: 0.305026\n",
"Insample R^2 for iter: 12 is 0.190977\n",
"\n",
"\n",
"Test set: Average loss: 0.257958\n",
"Test-sample R^2 for iter: 12 is 0.135225\n",
"\n",
"\n",
"Train Epoch: 11 [0/131647 (0%)]\tLoss: 0.327237\n",
"Insample R^2 for iter: 12 is 0.196904\n",
"\n",
"\n",
"Test set: Average loss: 0.254394\n",
"Test-sample R^2 for iter: 12 is 0.134073\n",
"\n",
"\n",
"Train Epoch: 11 [0/157513 (0%)]\tLoss: 0.345558\n",
"Insample R^2 for iter: 12 is 0.196667\n",
"\n",
"\n",
"Test set: Average loss: 0.252842\n",
"Test-sample R^2 for iter: 12 is 0.132746\n",
"\n",
"\n",
"Train Epoch: 11 [0/183724 (0%)]\tLoss: 0.354630\n",
"Insample R^2 for iter: 12 is 0.192836\n",
"\n",
"\n",
"Test set: Average loss: 0.252113\n",
"Test-sample R^2 for iter: 12 is 0.131324\n",
"\n",
"\n",
"Train Epoch: 11 [0/210124 (0%)]\tLoss: 0.326167\n",
"Insample R^2 for iter: 12 is 0.191955\n",
"\n",
"\n",
"Test set: Average loss: 0.253452\n",
"Test-sample R^2 for iter: 12 is 0.130032\n",
"\n",
"\n",
"Train Epoch: 11 [0/236621 (0%)]\tLoss: 0.291910\n",
"Insample R^2 for iter: 12 is 0.193748\n",
"\n",
"\n",
"Test set: Average loss: 0.256609\n",
"Test-sample R^2 for iter: 12 is 0.128959\n",
"\n",
"\n",
"Train Epoch: 11 [0/263967 (0%)]\tLoss: 0.278414\n",
"Insample R^2 for iter: 12 is 0.194659\n",
"\n",
"\n",
"Test set: Average loss: 0.259798\n",
"Test-sample R^2 for iter: 12 is 0.127965\n",
"\n",
"\n",
"Train Epoch: 11 [0/292711 (0%)]\tLoss: 0.278953\n",
"Insample R^2 for iter: 12 is 0.194465\n",
"\n",
"\n",
"Test set: Average loss: 0.260647\n",
"Test-sample R^2 for iter: 12 is 0.126902\n",
"\n",
"\n",
"Train Epoch: 11 [0/322680 (0%)]\tLoss: 0.263282\n",
"Insample R^2 for iter: 12 is 0.195967\n",
"\n",
"\n",
"Test set: Average loss: 0.262544\n",
"Test-sample R^2 for iter: 12 is 0.125855\n",
"\n",
"\n",
"Train Epoch: 11 [0/354563 (0%)]\tLoss: 0.258757\n",
"Insample R^2 for iter: 12 is 0.197031\n",
"\n",
"\n",
"Test set: Average loss: 0.264569\n",
"Test-sample R^2 for iter: 12 is 0.124797\n",
"\n",
"\n",
"Train Epoch: 11 [0/422402 (0%)]\tLoss: 0.250805\n",
"Insample R^2 for iter: 12 is 0.197385\n",
"\n",
"\n",
"Test set: Average loss: 0.270291\n",
"Test-sample R^2 for iter: 12 is 0.123540\n",
"\n",
"\n",
"Train Epoch: 11 [0/487054 (0%)]\tLoss: 0.245537\n",
"Insample R^2 for iter: 12 is 0.198517\n",
"\n",
"\n",
"Test set: Average loss: 0.272626\n",
"Test-sample R^2 for iter: 12 is 0.121937\n",
"\n",
"\n",
"Train Epoch: 12 [0/10692 (0%)]\tLoss: 0.398921\n",
"Insample R^2 for iter: 13 is 0.210494\n",
"\n",
"\n",
"Test set: Average loss: 0.259929\n",
"Test-sample R^2 for iter: 13 is 0.142954\n",
"\n",
"\n",
"Train Epoch: 12 [0/23538 (0%)]\tLoss: 0.367897\n",
"Insample R^2 for iter: 13 is 0.184566\n",
"\n",
"\n",
"Test set: Average loss: 0.259362\n",
"Test-sample R^2 for iter: 13 is 0.142812\n",
"\n",
"\n",
"Train Epoch: 12 [0/36468 (0%)]\tLoss: 0.304473\n",
"Insample R^2 for iter: 13 is 0.195337\n",
"\n",
"\n",
"Test set: Average loss: 0.259610\n",
"Test-sample R^2 for iter: 13 is 0.142541\n",
"\n",
"\n",
"Train Epoch: 12 [0/49745 (0%)]\tLoss: 0.303711\n",
"Insample R^2 for iter: 13 is 0.192002\n",
"\n",
"\n",
"Test set: Average loss: 0.259821\n",
"Test-sample R^2 for iter: 13 is 0.142091\n",
"\n",
"\n",
"Train Epoch: 12 [0/63294 (0%)]\tLoss: 0.270272\n",
"Insample R^2 for iter: 13 is 0.194129\n",
"\n",
"\n",
"Test set: Average loss: 0.260722\n",
"Test-sample R^2 for iter: 13 is 0.141759\n",
"\n",
"\n",
"Train Epoch: 12 [0/81599 (0%)]\tLoss: 0.298426\n",
"Insample R^2 for iter: 13 is 0.188628\n",
"\n",
"\n",
"Test set: Average loss: 0.259756\n",
"Test-sample R^2 for iter: 13 is 0.141291\n",
"\n",
"\n",
"Train Epoch: 12 [0/106354 (0%)]\tLoss: 0.304527\n",
"Insample R^2 for iter: 13 is 0.196052\n",
"\n",
"\n",
"Test set: Average loss: 0.256422\n",
"Test-sample R^2 for iter: 13 is 0.140437\n",
"\n",
"\n",
"Train Epoch: 12 [0/131647 (0%)]\tLoss: 0.325874\n",
"Insample R^2 for iter: 13 is 0.201737\n",
"\n",
"\n",
"Test set: Average loss: 0.252801\n",
"Test-sample R^2 for iter: 13 is 0.139317\n",
"\n",
"\n",
"Train Epoch: 12 [0/157513 (0%)]\tLoss: 0.325779\n",
"Insample R^2 for iter: 13 is 0.206110\n",
"\n",
"\n",
"Test set: Average loss: 0.251240\n",
"Test-sample R^2 for iter: 13 is 0.138026\n",
"\n",
"\n",
"Train Epoch: 12 [0/183724 (0%)]\tLoss: 0.317103\n",
"Insample R^2 for iter: 13 is 0.210288\n",
"\n",
"\n",
"Test set: Average loss: 0.250542\n",
"Test-sample R^2 for iter: 13 is 0.136624\n",
"\n",
"\n",
"Train Epoch: 12 [0/210124 (0%)]\tLoss: 0.317468\n",
"Insample R^2 for iter: 13 is 0.209809\n",
"\n",
"\n",
"Test set: Average loss: 0.251968\n",
"Test-sample R^2 for iter: 13 is 0.135320\n",
"\n",
"\n",
"Train Epoch: 12 [0/236621 (0%)]\tLoss: 0.286806\n",
"Insample R^2 for iter: 13 is 0.211262\n",
"\n",
"\n",
"Test set: Average loss: 0.255246\n",
"Test-sample R^2 for iter: 13 is 0.134198\n",
"\n",
"\n",
"Train Epoch: 12 [0/263967 (0%)]\tLoss: 0.279169\n",
"Insample R^2 for iter: 13 is 0.210654\n",
"\n",
"\n",
"Test set: Average loss: 0.258494\n",
"Test-sample R^2 for iter: 13 is 0.133143\n",
"\n",
"\n",
"Train Epoch: 12 [0/292711 (0%)]\tLoss: 0.268421\n",
"Insample R^2 for iter: 13 is 0.211508\n",
"\n",
"\n",
"Test set: Average loss: 0.259451\n",
"Test-sample R^2 for iter: 13 is 0.132000\n",
"\n",
"\n",
"Train Epoch: 12 [0/322680 (0%)]\tLoss: 0.263669\n",
"Insample R^2 for iter: 13 is 0.211792\n",
"\n",
"\n",
"Test set: Average loss: 0.261494\n",
"Test-sample R^2 for iter: 13 is 0.130850\n",
"\n",
"\n",
"Train Epoch: 12 [0/354563 (0%)]\tLoss: 0.256704\n",
"Insample R^2 for iter: 13 is 0.212256\n",
"\n",
"\n",
"Test set: Average loss: 0.263663\n",
"Test-sample R^2 for iter: 13 is 0.129670\n",
"\n",
"\n",
"Train Epoch: 12 [0/422402 (0%)]\tLoss: 0.246703\n",
"Insample R^2 for iter: 13 is 0.212483\n",
"\n",
"\n",
"Test set: Average loss: 0.269485\n",
"Test-sample R^2 for iter: 13 is 0.128284\n",
"\n",
"\n",
"Train Epoch: 12 [0/487054 (0%)]\tLoss: 0.249467\n",
"Insample R^2 for iter: 13 is 0.212071\n",
"\n",
"\n",
"Test set: Average loss: 0.271944\n",
"Test-sample R^2 for iter: 13 is 0.126544\n",
"\n",
"\n",
"Train Epoch: 13 [0/10692 (0%)]\tLoss: 0.447487\n",
"Insample R^2 for iter: 14 is 0.113821\n",
"\n",
"\n",
"Test set: Average loss: 0.259643\n",
"Test-sample R^2 for iter: 14 is 0.143897\n",
"\n",
"\n",
"Train Epoch: 13 [0/23538 (0%)]\tLoss: 0.338364\n",
"Insample R^2 for iter: 14 is 0.170170\n",
"\n",
"\n",
"Test set: Average loss: 0.259673\n",
"Test-sample R^2 for iter: 14 is 0.142770\n",
"\n",
"\n",
"Train Epoch: 13 [0/36468 (0%)]\tLoss: 0.314538\n",
"Insample R^2 for iter: 14 is 0.177033\n",
"\n",
"\n",
"Test set: Average loss: 0.260291\n",
"Test-sample R^2 for iter: 14 is 0.141763\n",
"\n",
"\n",
"Train Epoch: 13 [0/49745 (0%)]\tLoss: 0.348013\n",
"Insample R^2 for iter: 14 is 0.148223\n",
"\n",
"\n",
"Test set: Average loss: 0.260691\n",
"Test-sample R^2 for iter: 14 is 0.140788\n",
"\n",
"\n",
"Train Epoch: 13 [0/63294 (0%)]\tLoss: 0.268582\n",
"Insample R^2 for iter: 14 is 0.160102\n",
"\n",
"\n",
"Test set: Average loss: 0.261743\n",
"Test-sample R^2 for iter: 14 is 0.140044\n",
"\n",
"\n",
"Train Epoch: 13 [0/81599 (0%)]\tLoss: 0.276869\n",
"Insample R^2 for iter: 14 is 0.170436\n",
"\n",
"\n",
"Test set: Average loss: 0.260776\n",
"Test-sample R^2 for iter: 14 is 0.139299\n",
"\n",
"\n",
"Train Epoch: 13 [0/106354 (0%)]\tLoss: 0.314097\n",
"Insample R^2 for iter: 14 is 0.177021\n",
"\n",
"\n",
"Test set: Average loss: 0.257412\n",
"Test-sample R^2 for iter: 14 is 0.138252\n",
"\n",
"\n",
"Train Epoch: 13 [0/131647 (0%)]\tLoss: 0.327447\n",
"Insample R^2 for iter: 14 is 0.184619\n",
"\n",
"\n",
"Test set: Average loss: 0.253780\n",
"Test-sample R^2 for iter: 14 is 0.136985\n",
"\n",
"\n",
"Train Epoch: 13 [0/157513 (0%)]\tLoss: 0.322834\n",
"Insample R^2 for iter: 14 is 0.191658\n",
"\n",
"\n",
"Test set: Average loss: 0.252207\n",
"Test-sample R^2 for iter: 14 is 0.135580\n",
"\n",
"\n",
"Train Epoch: 13 [0/183724 (0%)]\tLoss: 0.316154\n",
"Insample R^2 for iter: 14 is 0.197505\n",
"\n",
"\n",
"Test set: Average loss: 0.251471\n",
"Test-sample R^2 for iter: 14 is 0.134098\n",
"\n",
"\n",
"Train Epoch: 13 [0/210124 (0%)]\tLoss: 0.302789\n",
"Insample R^2 for iter: 14 is 0.201548\n",
"\n",
"\n",
"Test set: Average loss: 0.252899\n",
"Test-sample R^2 for iter: 14 is 0.132729\n",
"\n",
"\n",
"Train Epoch: 13 [0/236621 (0%)]\tLoss: 0.291117\n",
"Insample R^2 for iter: 14 is 0.202713\n",
"\n",
"\n",
"Test set: Average loss: 0.256294\n",
"Test-sample R^2 for iter: 14 is 0.131522\n",
"\n",
"\n",
"Train Epoch: 13 [0/263967 (0%)]\tLoss: 0.272827\n",
"Insample R^2 for iter: 14 is 0.204162\n",
"\n",
"\n",
"Test set: Average loss: 0.259692\n",
"Test-sample R^2 for iter: 14 is 0.130359\n",
"\n",
"\n",
"Train Epoch: 13 [0/292711 (0%)]\tLoss: 0.266763\n",
"Insample R^2 for iter: 14 is 0.205823\n",
"\n",
"\n",
"Test set: Average loss: 0.260732\n",
"Test-sample R^2 for iter: 14 is 0.129104\n",
"\n",
"\n",
"Train Epoch: 13 [0/322680 (0%)]\tLoss: 0.262029\n",
"Insample R^2 for iter: 14 is 0.206812\n",
"\n",
"\n",
"Test set: Average loss: 0.262861\n",
"Test-sample R^2 for iter: 14 is 0.127838\n",
"\n",
"\n",
"Train Epoch: 13 [0/354563 (0%)]\tLoss: 0.255278\n",
"Insample R^2 for iter: 14 is 0.207859\n",
"\n",
"\n",
"Test set: Average loss: 0.265159\n",
"Test-sample R^2 for iter: 14 is 0.126532\n",
"\n",
"\n",
"Train Epoch: 13 [0/422402 (0%)]\tLoss: 0.249947\n",
"Insample R^2 for iter: 14 is 0.207732\n",
"\n",
"\n",
"Test set: Average loss: 0.270832\n",
"Test-sample R^2 for iter: 14 is 0.125067\n",
"\n",
"\n",
"Train Epoch: 13 [0/487054 (0%)]\tLoss: 0.247094\n",
"Insample R^2 for iter: 14 is 0.208005\n",
"\n",
"\n",
"Test set: Average loss: 0.272607\n",
"Test-sample R^2 for iter: 14 is 0.123383\n",
"\n",
"\n",
"Train Epoch: 14 [0/10692 (0%)]\tLoss: 0.399106\n",
"Insample R^2 for iter: 15 is 0.210064\n",
"\n",
"\n",
"Test set: Average loss: 0.261873\n",
"Test-sample R^2 for iter: 15 is 0.136543\n",
"\n",
"\n",
"Train Epoch: 14 [0/23538 (0%)]\tLoss: 0.334782\n",
"Insample R^2 for iter: 15 is 0.222400\n",
"\n",
"\n",
"Test set: Average loss: 0.261440\n",
"Test-sample R^2 for iter: 15 is 0.136172\n",
"\n",
"\n",
"Train Epoch: 14 [0/36468 (0%)]\tLoss: 0.293894\n",
"Insample R^2 for iter: 15 is 0.229667\n",
"\n",
"\n",
"Test set: Average loss: 0.261557\n",
"Test-sample R^2 for iter: 15 is 0.135969\n",
"\n",
"\n",
"Train Epoch: 14 [0/49745 (0%)]\tLoss: 0.366774\n",
"Insample R^2 for iter: 15 is 0.174971\n",
"\n",
"\n",
"Test set: Average loss: 0.261475\n",
"Test-sample R^2 for iter: 15 is 0.135795\n",
"\n",
"\n",
"Train Epoch: 14 [0/63294 (0%)]\tLoss: 0.270221\n",
"Insample R^2 for iter: 15 is 0.180518\n",
"\n",
"\n",
"Test set: Average loss: 0.262240\n",
"Test-sample R^2 for iter: 15 is 0.135721\n",
"\n",
"\n",
"Train Epoch: 14 [0/81599 (0%)]\tLoss: 0.279217\n",
"Insample R^2 for iter: 15 is 0.186336\n",
"\n",
"\n",
"Test set: Average loss: 0.260921\n",
"Test-sample R^2 for iter: 15 is 0.135616\n",
"\n",
"\n",
"Train Epoch: 14 [0/106354 (0%)]\tLoss: 0.304013\n",
"Insample R^2 for iter: 15 is 0.194262\n",
"\n",
"\n",
"Test set: Average loss: 0.257315\n",
"Test-sample R^2 for iter: 15 is 0.135143\n",
"\n",
"\n",
"Train Epoch: 14 [0/131647 (0%)]\tLoss: 0.324250\n",
"Insample R^2 for iter: 15 is 0.200638\n",
"\n",
"\n",
"Test set: Average loss: 0.253453\n",
"Test-sample R^2 for iter: 15 is 0.134405\n",
"\n",
"\n",
"Train Epoch: 14 [0/157513 (0%)]\tLoss: 0.322854\n",
"Insample R^2 for iter: 15 is 0.205888\n",
"\n",
"\n",
"Test set: Average loss: 0.251721\n",
"Test-sample R^2 for iter: 15 is 0.133473\n",
"\n",
"\n",
"Train Epoch: 14 [0/183724 (0%)]\tLoss: 0.317303\n",
"Insample R^2 for iter: 15 is 0.210034\n",
"\n",
"\n",
"Test set: Average loss: 0.250918\n",
"Test-sample R^2 for iter: 15 is 0.132396\n",
"\n",
"\n",
"Train Epoch: 14 [0/210124 (0%)]\tLoss: 0.305048\n",
"Insample R^2 for iter: 15 is 0.212418\n",
"\n",
"\n",
"Test set: Average loss: 0.252317\n",
"Test-sample R^2 for iter: 15 is 0.131366\n",
"\n",
"\n",
"Train Epoch: 14 [0/236621 (0%)]\tLoss: 0.299604\n",
"Insample R^2 for iter: 15 is 0.210753\n",
"\n",
"\n",
"Test set: Average loss: 0.255546\n",
"Test-sample R^2 for iter: 15 is 0.130487\n",
"\n",
"\n",
"Train Epoch: 14 [0/263967 (0%)]\tLoss: 0.273372\n",
"Insample R^2 for iter: 15 is 0.211460\n",
"\n",
"\n",
"Test set: Average loss: 0.258767\n",
"Test-sample R^2 for iter: 15 is 0.129645\n",
"\n",
"\n",
"Train Epoch: 14 [0/292711 (0%)]\tLoss: 0.267273\n",
"Insample R^2 for iter: 15 is 0.212490\n",
"\n",
"\n",
"Test set: Average loss: 0.259683\n",
"Test-sample R^2 for iter: 15 is 0.128697\n",
"\n",
"\n",
"Train Epoch: 14 [0/322680 (0%)]\tLoss: 0.282964\n",
"Insample R^2 for iter: 15 is 0.208847\n",
"\n",
"\n",
"Test set: Average loss: 0.261660\n",
"Test-sample R^2 for iter: 15 is 0.127729\n",
"\n",
"\n",
"Train Epoch: 14 [0/354563 (0%)]\tLoss: 0.254386\n",
"Insample R^2 for iter: 15 is 0.209934\n",
"\n",
"\n",
"Test set: Average loss: 0.263800\n",
"Test-sample R^2 for iter: 15 is 0.126716\n",
"\n",
"\n",
"Train Epoch: 14 [0/422402 (0%)]\tLoss: 0.244465\n",
"Insample R^2 for iter: 15 is 0.210715\n",
"\n",
"\n",
"Test set: Average loss: 0.269597\n",
"Test-sample R^2 for iter: 15 is 0.125481\n",
"\n",
"\n",
"Train Epoch: 14 [0/487054 (0%)]\tLoss: 0.247800\n",
"Insample R^2 for iter: 15 is 0.210694\n",
"\n",
"\n",
"Test set: Average loss: 0.271864\n",
"Test-sample R^2 for iter: 15 is 0.123911\n",
"\n",
"\n",
"Train Epoch: 15 [0/10692 (0%)]\tLoss: 0.395532\n",
"Insample R^2 for iter: 16 is 0.217138\n",
"\n",
"\n",
"Test set: Average loss: 0.259598\n",
"Test-sample R^2 for iter: 16 is 0.144045\n",
"\n",
"\n",
"Train Epoch: 15 [0/23538 (0%)]\tLoss: 0.375988\n",
"Insample R^2 for iter: 16 is 0.178526\n",
"\n",
"\n",
"Test set: Average loss: 0.259079\n",
"Test-sample R^2 for iter: 16 is 0.143827\n",
"\n",
"\n",
"Train Epoch: 15 [0/36468 (0%)]\tLoss: 0.294468\n",
"Insample R^2 for iter: 16 is 0.199906\n",
"\n",
"\n",
"Test set: Average loss: 0.259286\n",
"Test-sample R^2 for iter: 16 is 0.143574\n",
"\n",
"\n",
"Train Epoch: 15 [0/49745 (0%)]\tLoss: 0.380179\n",
"Insample R^2 for iter: 16 is 0.143549\n",
"\n",
"\n",
"Test set: Average loss: 0.259373\n",
"Test-sample R^2 for iter: 16 is 0.143237\n",
"\n",
"\n",
"Train Epoch: 15 [0/63294 (0%)]\tLoss: 0.287749\n",
"Insample R^2 for iter: 16 is 0.144946\n",
"\n",
"\n",
"Test set: Average loss: 0.260148\n",
"Test-sample R^2 for iter: 16 is 0.143055\n",
"\n",
"\n",
"Train Epoch: 15 [0/81599 (0%)]\tLoss: 0.302111\n",
"Insample R^2 for iter: 16 is 0.145882\n",
"\n",
"\n",
"Test set: Average loss: 0.258948\n",
"Test-sample R^2 for iter: 16 is 0.142817\n",
"\n",
"\n",
"Train Epoch: 15 [0/106354 (0%)]\tLoss: 0.315852\n",
"Insample R^2 for iter: 16 is 0.155334\n",
"\n",
"\n",
"Test set: Average loss: 0.255511\n",
"Test-sample R^2 for iter: 16 is 0.142184\n",
"\n",
"\n",
"Train Epoch: 15 [0/131647 (0%)]\tLoss: 0.341151\n",
"Insample R^2 for iter: 16 is 0.161621\n",
"\n",
"\n",
"Test set: Average loss: 0.251822\n",
"Test-sample R^2 for iter: 16 is 0.141267\n",
"\n",
"\n",
"Train Epoch: 15 [0/157513 (0%)]\tLoss: 0.321545\n",
"Insample R^2 for iter: 16 is 0.171543\n",
"\n",
"\n",
"Test set: Average loss: 0.250206\n",
"Test-sample R^2 for iter: 16 is 0.140158\n",
"\n",
"\n",
"Train Epoch: 15 [0/183724 (0%)]\tLoss: 0.326781\n",
"Insample R^2 for iter: 16 is 0.176857\n",
"\n",
"\n",
"Test set: Average loss: 0.249498\n",
"Test-sample R^2 for iter: 16 is 0.138908\n",
"\n",
"\n",
"Train Epoch: 15 [0/210124 (0%)]\tLoss: 0.305170\n",
"Insample R^2 for iter: 16 is 0.182224\n",
"\n",
"\n",
"Test set: Average loss: 0.250851\n",
"Test-sample R^2 for iter: 16 is 0.137750\n",
"\n",
"\n",
"Train Epoch: 15 [0/236621 (0%)]\tLoss: 0.294618\n",
"Insample R^2 for iter: 16 is 0.184200\n",
"\n",
"\n",
"Test set: Average loss: 0.254029\n",
"Test-sample R^2 for iter: 16 is 0.136774\n",
"\n",
"\n",
"Train Epoch: 15 [0/263967 (0%)]\tLoss: 0.274566\n",
"Insample R^2 for iter: 16 is 0.186682\n",
"\n",
"\n",
"Test set: Average loss: 0.257242\n",
"Test-sample R^2 for iter: 16 is 0.135848\n",
"\n",
"\n",
"Train Epoch: 15 [0/292711 (0%)]\tLoss: 0.267588\n",
"Insample R^2 for iter: 16 is 0.189412\n",
"\n",
"\n",
"Test set: Average loss: 0.258157\n",
"Test-sample R^2 for iter: 16 is 0.134827\n",
"\n",
"\n",
"Train Epoch: 15 [0/322680 (0%)]\tLoss: 0.297389\n",
"Insample R^2 for iter: 16 is 0.184420\n",
"\n",
"\n",
"Test set: Average loss: 0.260086\n",
"Test-sample R^2 for iter: 16 is 0.133806\n",
"\n",
"\n",
"Train Epoch: 15 [0/354563 (0%)]\tLoss: 0.256146\n",
"Insample R^2 for iter: 16 is 0.186694\n",
"\n",
"\n",
"Test set: Average loss: 0.262143\n",
"Test-sample R^2 for iter: 16 is 0.132762\n",
"\n",
"\n",
"Train Epoch: 15 [0/422402 (0%)]\tLoss: 0.245296\n",
"Insample R^2 for iter: 16 is 0.188682\n",
"\n",
"\n",
"Test set: Average loss: 0.267979\n",
"Test-sample R^2 for iter: 16 is 0.131487\n",
"\n",
"\n",
"Train Epoch: 15 [0/487054 (0%)]\tLoss: 0.242128\n",
"Insample R^2 for iter: 16 is 0.190895\n",
"\n",
"\n",
"Test set: Average loss: 0.270472\n",
"Test-sample R^2 for iter: 16 is 0.129841\n",
"\n",
"\n",
"Train Epoch: 16 [0/10692 (0%)]\tLoss: 0.382497\n",
"Insample R^2 for iter: 17 is 0.243043\n",
"\n",
"\n",
"Test set: Average loss: 0.257723\n",
"Test-sample R^2 for iter: 17 is 0.150228\n",
"\n",
"\n",
"Train Epoch: 16 [0/23538 (0%)]\tLoss: 0.331550\n",
"Insample R^2 for iter: 17 is 0.242567\n",
"\n",
"\n",
"Test set: Average loss: 0.257416\n",
"Test-sample R^2 for iter: 17 is 0.149666\n",
"\n",
"\n",
"Train Epoch: 16 [0/36468 (0%)]\tLoss: 0.296010\n",
"Insample R^2 for iter: 17 is 0.241254\n",
"\n",
"\n",
"Test set: Average loss: 0.257809\n",
"Test-sample R^2 for iter: 17 is 0.149094\n",
"\n",
"\n",
"Train Epoch: 16 [0/49745 (0%)]\tLoss: 0.280603\n",
"Insample R^2 for iter: 17 is 0.242066\n",
"\n",
"\n",
"Test set: Average loss: 0.258153\n",
"Test-sample R^2 for iter: 17 is 0.148385\n",
"\n",
"\n",
"Train Epoch: 16 [0/63294 (0%)]\tLoss: 0.260615\n",
"Insample R^2 for iter: 17 is 0.239886\n",
"\n",
"\n",
"Test set: Average loss: 0.259223\n",
"Test-sample R^2 for iter: 17 is 0.147783\n",
"\n",
"\n",
"Train Epoch: 16 [0/81599 (0%)]\tLoss: 0.308109\n",
"Insample R^2 for iter: 17 is 0.222160\n",
"\n",
"\n",
"Test set: Average loss: 0.258395\n",
"Test-sample R^2 for iter: 17 is 0.147063\n",
"\n",
"\n",
"Train Epoch: 16 [0/106354 (0%)]\tLoss: 0.303146\n",
"Insample R^2 for iter: 17 is 0.225268\n",
"\n",
"\n",
"Test set: Average loss: 0.255155\n",
"Test-sample R^2 for iter: 17 is 0.145995\n",
"\n",
"\n",
"Train Epoch: 16 [0/131647 (0%)]\tLoss: 0.323688\n",
"Insample R^2 for iter: 17 is 0.227922\n",
"\n",
"\n",
"Test set: Average loss: 0.251614\n",
"Test-sample R^2 for iter: 17 is 0.144691\n",
"\n",
"\n",
"Train Epoch: 16 [0/157513 (0%)]\tLoss: 0.322880\n",
"Insample R^2 for iter: 17 is 0.230125\n",
"\n",
"\n",
"Test set: Average loss: 0.250195\n",
"Test-sample R^2 for iter: 17 is 0.143205\n",
"\n",
"\n",
"Train Epoch: 16 [0/183724 (0%)]\tLoss: 0.319712\n",
"Insample R^2 for iter: 17 is 0.231265\n",
"\n",
"\n",
"Test set: Average loss: 0.249555\n",
"Test-sample R^2 for iter: 17 is 0.141631\n",
"\n",
"\n",
"Train Epoch: 16 [0/210124 (0%)]\tLoss: 0.303484\n",
"Insample R^2 for iter: 17 is 0.232069\n",
"\n",
"\n",
"Test set: Average loss: 0.251141\n",
"Test-sample R^2 for iter: 17 is 0.140134\n",
"\n",
"\n",
"Train Epoch: 16 [0/236621 (0%)]\tLoss: 0.304355\n",
"Insample R^2 for iter: 17 is 0.227685\n",
"\n",
"\n",
"Test set: Average loss: 0.254538\n",
"Test-sample R^2 for iter: 17 is 0.138813\n",
"\n",
"\n",
"Train Epoch: 16 [0/263967 (0%)]\tLoss: 0.274831\n",
"Insample R^2 for iter: 17 is 0.226760\n",
"\n",
"\n",
"Test set: Average loss: 0.257845\n",
"Test-sample R^2 for iter: 17 is 0.137573\n",
"\n",
"\n",
"Train Epoch: 16 [0/292711 (0%)]\tLoss: 0.266866\n",
"Insample R^2 for iter: 17 is 0.226775\n",
"\n",
"\n",
"Test set: Average loss: 0.258804\n",
"Test-sample R^2 for iter: 17 is 0.136271\n",
"\n",
"\n",
"Train Epoch: 16 [0/322680 (0%)]\tLoss: 0.269325\n",
"Insample R^2 for iter: 17 is 0.224898\n",
"\n",
"\n",
"Test set: Average loss: 0.260828\n",
"Test-sample R^2 for iter: 17 is 0.134987\n",
"\n",
"\n",
"Train Epoch: 16 [0/354563 (0%)]\tLoss: 0.254582\n",
"Insample R^2 for iter: 17 is 0.224939\n",
"\n",
"\n",
"Test set: Average loss: 0.262990\n",
"Test-sample R^2 for iter: 17 is 0.133690\n",
"\n",
"\n",
"Train Epoch: 16 [0/422402 (0%)]\tLoss: 0.242454\n",
"Insample R^2 for iter: 17 is 0.225210\n",
"\n",
"\n",
"Test set: Average loss: 0.268611\n",
"Test-sample R^2 for iter: 17 is 0.132238\n",
"\n",
"\n",
"Train Epoch: 16 [0/487054 (0%)]\tLoss: 0.243124\n",
"Insample R^2 for iter: 17 is 0.225213\n",
"\n",
"\n",
"Test set: Average loss: 0.270592\n",
"Test-sample R^2 for iter: 17 is 0.130527\n",
"\n",
"\n",
"Train Epoch: 17 [0/10692 (0%)]\tLoss: 0.384768\n",
"Insample R^2 for iter: 18 is 0.238492\n",
"\n",
"\n",
"Test set: Average loss: 0.258873\n",
"Test-sample R^2 for iter: 18 is 0.146435\n",
"\n",
"\n",
"Train Epoch: 17 [0/23538 (0%)]\tLoss: 0.340765\n",
"Insample R^2 for iter: 18 is 0.229675\n",
"\n",
"\n",
"Test set: Average loss: 0.258441\n",
"Test-sample R^2 for iter: 18 is 0.146076\n",
"\n",
"\n",
"Train Epoch: 17 [0/36468 (0%)]\tLoss: 0.297461\n",
"Insample R^2 for iter: 18 is 0.231394\n",
"\n",
"\n",
"Test set: Average loss: 0.258726\n",
"Test-sample R^2 for iter: 18 is 0.145691\n",
"\n",
"\n",
"Train Epoch: 17 [0/49745 (0%)]\tLoss: 0.309943\n",
"Insample R^2 for iter: 18 is 0.214766\n",
"\n",
"\n",
"Test set: Average loss: 0.258898\n",
"Test-sample R^2 for iter: 18 is 0.145217\n",
"\n",
"\n",
"Train Epoch: 17 [0/63294 (0%)]\tLoss: 0.285165\n",
"Insample R^2 for iter: 18 is 0.203437\n",
"\n",
"\n",
"Test set: Average loss: 0.259987\n",
"Test-sample R^2 for iter: 18 is 0.144744\n",
"\n",
"\n",
"Train Epoch: 17 [0/81599 (0%)]\tLoss: 0.274900\n",
"Insample R^2 for iter: 18 is 0.207448\n",
"\n",
"\n",
"Test set: Average loss: 0.259146\n",
"Test-sample R^2 for iter: 18 is 0.144116\n",
"\n",
"\n",
"Train Epoch: 17 [0/106354 (0%)]\tLoss: 0.304840\n",
"Insample R^2 for iter: 18 is 0.212043\n",
"\n",
"\n",
"Test set: Average loss: 0.255944\n",
"Test-sample R^2 for iter: 18 is 0.143089\n",
"\n",
"\n",
"Train Epoch: 17 [0/131647 (0%)]\tLoss: 0.326643\n",
"Insample R^2 for iter: 18 is 0.215481\n",
"\n",
"\n",
"Test set: Average loss: 0.252430\n",
"Test-sample R^2 for iter: 18 is 0.141797\n",
"\n",
"\n",
"Train Epoch: 17 [0/157513 (0%)]\tLoss: 0.354568\n",
"Insample R^2 for iter: 18 is 0.210807\n",
"\n",
"\n",
"Test set: Average loss: 0.250952\n",
"Test-sample R^2 for iter: 18 is 0.140341\n",
"\n",
"\n",
"Train Epoch: 17 [0/183724 (0%)]\tLoss: 0.328737\n",
"Insample R^2 for iter: 18 is 0.211719\n",
"\n",
"\n",
"Test set: Average loss: 0.250285\n",
"Test-sample R^2 for iter: 18 is 0.138798\n",
"\n",
"\n",
"Train Epoch: 17 [0/210124 (0%)]\tLoss: 0.398458\n",
"Insample R^2 for iter: 18 is 0.192531\n",
"\n",
"\n",
"Test set: Average loss: 0.251715\n",
"Test-sample R^2 for iter: 18 is 0.137377\n",
"\n",
"\n",
"Train Epoch: 17 [0/236621 (0%)]\tLoss: 0.310128\n",
"Insample R^2 for iter: 18 is 0.190132\n",
"\n",
"\n",
"Test set: Average loss: 0.254970\n",
"Test-sample R^2 for iter: 18 is 0.136162\n",
"\n",
"\n",
"Train Epoch: 17 [0/263967 (0%)]\tLoss: 0.279468\n",
"Insample R^2 for iter: 18 is 0.191066\n",
"\n",
"\n",
"Test set: Average loss: 0.258331\n",
"Test-sample R^2 for iter: 18 is 0.134998\n",
"\n",
"\n",
"Train Epoch: 17 [0/292711 (0%)]\tLoss: 0.294849\n",
"Insample R^2 for iter: 18 is 0.187792\n",
"\n",
"\n",
"Test set: Average loss: 0.259327\n",
"Test-sample R^2 for iter: 18 is 0.133754\n",
"\n",
"\n",
"Train Epoch: 17 [0/322680 (0%)]\tLoss: 0.270010\n",
"Insample R^2 for iter: 18 is 0.188374\n",
"\n",
"\n",
"Test set: Average loss: 0.261445\n",
"Test-sample R^2 for iter: 18 is 0.132497\n",
"\n",
"\n",
"Train Epoch: 17 [0/354563 (0%)]\tLoss: 0.254056\n",
"Insample R^2 for iter: 18 is 0.190795\n",
"\n",
"\n",
"Test set: Average loss: 0.263677\n",
"Test-sample R^2 for iter: 18 is 0.131212\n",
"\n",
"\n",
"Train Epoch: 17 [0/422402 (0%)]\tLoss: 0.243254\n",
"Insample R^2 for iter: 18 is 0.192920\n",
"\n",
"\n",
"Test set: Average loss: 0.269540\n",
"Test-sample R^2 for iter: 18 is 0.129724\n",
"\n",
"\n",
"Train Epoch: 17 [0/487054 (0%)]\tLoss: 0.241055\n",
"Insample R^2 for iter: 18 is 0.195082\n",
"\n",
"\n",
"Test set: Average loss: 0.271883\n",
"Test-sample R^2 for iter: 18 is 0.127915\n",
"\n",
"\n",
"Train Epoch: 18 [0/10692 (0%)]\tLoss: 0.378919\n",
"Insample R^2 for iter: 19 is 0.250092\n",
"\n",
"\n",
"Test set: Average loss: 0.259379\n",
"Test-sample R^2 for iter: 19 is 0.144766\n",
"\n",
"\n",
"Train Epoch: 18 [0/23538 (0%)]\tLoss: 0.343636\n",
"Insample R^2 for iter: 19 is 0.232152\n",
"\n",
"\n",
"Test set: Average loss: 0.258978\n",
"Test-sample R^2 for iter: 19 is 0.144354\n",
"\n",
"\n",
"Train Epoch: 18 [0/36468 (0%)]\tLoss: 0.306214\n",
"Insample R^2 for iter: 19 is 0.225472\n",
"\n",
"\n",
"Test set: Average loss: 0.259215\n",
"Test-sample R^2 for iter: 19 is 0.144004\n",
"\n",
"\n",
"Train Epoch: 18 [0/49745 (0%)]\tLoss: 0.302557\n",
"Insample R^2 for iter: 19 is 0.215319\n",
"\n",
"\n",
"Test set: Average loss: 0.259251\n",
"Test-sample R^2 for iter: 19 is 0.143660\n",
"\n",
"\n",
"Train Epoch: 18 [0/63294 (0%)]\tLoss: 0.262475\n",
"Insample R^2 for iter: 19 is 0.217361\n",
"\n",
"\n",
"Test set: Average loss: 0.259996\n",
"Test-sample R^2 for iter: 19 is 0.143493\n",
"\n",
"\n",
"Train Epoch: 18 [0/81599 (0%)]\tLoss: 0.279541\n",
"Insample R^2 for iter: 19 is 0.216852\n",
"\n",
"\n",
"Test set: Average loss: 0.258698\n",
"Test-sample R^2 for iter: 19 is 0.143321\n",
"\n",
"\n",
"Train Epoch: 18 [0/106354 (0%)]\tLoss: 0.302026\n",
"Insample R^2 for iter: 19 is 0.221106\n",
"\n",
"\n",
"Test set: Average loss: 0.255146\n",
"Test-sample R^2 for iter: 19 is 0.142792\n",
"\n",
"\n",
"Train Epoch: 18 [0/131647 (0%)]\tLoss: 0.325773\n",
"Insample R^2 for iter: 19 is 0.223659\n",
"\n",
"\n",
"Test set: Average loss: 0.251293\n",
"Test-sample R^2 for iter: 19 is 0.142025\n",
"\n",
"\n",
"Train Epoch: 18 [0/157513 (0%)]\tLoss: 0.320569\n",
"Insample R^2 for iter: 19 is 0.226928\n",
"\n",
"\n",
"Test set: Average loss: 0.249580\n",
"Test-sample R^2 for iter: 19 is 0.141073\n",
"\n",
"\n",
"Train Epoch: 18 [0/183724 (0%)]\tLoss: 0.313009\n",
"Insample R^2 for iter: 19 is 0.229978\n",
"\n",
"\n",
"Test set: Average loss: 0.248735\n",
"Test-sample R^2 for iter: 19 is 0.139999\n",
"\n",
"\n",
"Train Epoch: 18 [0/210124 (0%)]\tLoss: 0.307223\n",
"Insample R^2 for iter: 19 is 0.230034\n",
"\n",
"\n",
"Test set: Average loss: 0.250031\n",
"Test-sample R^2 for iter: 19 is 0.139002\n",
"\n",
"\n",
"Train Epoch: 18 [0/236621 (0%)]\tLoss: 0.284084\n",
"Insample R^2 for iter: 19 is 0.230395\n",
"\n",
"\n",
"Test set: Average loss: 0.253162\n",
"Test-sample R^2 for iter: 19 is 0.138170\n",
"\n",
"\n",
"Train Epoch: 18 [0/263967 (0%)]\tLoss: 0.269978\n",
"Insample R^2 for iter: 19 is 0.230326\n",
"\n",
"\n",
"Test set: Average loss: 0.256355\n",
"Test-sample R^2 for iter: 19 is 0.137369\n",
"\n",
"\n",
"Train Epoch: 18 [0/292711 (0%)]\tLoss: 0.265741\n",
"Insample R^2 for iter: 19 is 0.230313\n",
"\n",
"\n",
"Test set: Average loss: 0.257275\n",
"Test-sample R^2 for iter: 19 is 0.136454\n",
"\n",
"\n",
"Train Epoch: 18 [0/322680 (0%)]\tLoss: 0.261835\n",
"Insample R^2 for iter: 19 is 0.229690\n",
"\n",
"\n",
"Test set: Average loss: 0.259221\n",
"Test-sample R^2 for iter: 19 is 0.135520\n",
"\n",
"\n",
"Train Epoch: 18 [0/354563 (0%)]\tLoss: 0.254867\n",
"Insample R^2 for iter: 19 is 0.229369\n",
"\n",
"\n",
"Test set: Average loss: 0.261311\n",
"Test-sample R^2 for iter: 19 is 0.134543\n",
"\n",
"\n",
"Train Epoch: 18 [0/422402 (0%)]\tLoss: 0.248449\n",
"Insample R^2 for iter: 19 is 0.228242\n",
"\n",
"\n",
"Test set: Average loss: 0.267222\n",
"Test-sample R^2 for iter: 19 is 0.133312\n",
"\n",
"\n",
"Train Epoch: 18 [0/487054 (0%)]\tLoss: 0.242749\n",
"Insample R^2 for iter: 19 is 0.228135\n",
"\n",
"\n",
"Test set: Average loss: 0.269930\n",
"Test-sample R^2 for iter: 19 is 0.131664\n",
"\n",
"\n",
"Train Epoch: 19 [0/10692 (0%)]\tLoss: 0.387239\n",
"Insample R^2 for iter: 20 is 0.233497\n",
"\n",
"\n",
"Test set: Average loss: 0.256449\n",
"Test-sample R^2 for iter: 20 is 0.154428\n",
"\n",
"\n",
"Train Epoch: 19 [0/23538 (0%)]\tLoss: 0.329708\n",
"Insample R^2 for iter: 20 is 0.239848\n",
"\n",
"\n",
"Test set: Average loss: 0.256121\n",
"Test-sample R^2 for iter: 20 is 0.153906\n",
"\n",
"\n",
"Train Epoch: 19 [0/36468 (0%)]\tLoss: 0.305313\n",
"Insample R^2 for iter: 20 is 0.231362\n",
"\n",
"\n",
"Test set: Average loss: 0.256582\n",
"Test-sample R^2 for iter: 20 is 0.153272\n",
"\n",
"\n",
"Train Epoch: 19 [0/49745 (0%)]\tLoss: 0.281937\n",
"Insample R^2 for iter: 20 is 0.233704\n",
"\n",
"\n",
"Test set: Average loss: 0.256934\n",
"Test-sample R^2 for iter: 20 is 0.152527\n",
"\n",
"\n",
"Train Epoch: 19 [0/63294 (0%)]\tLoss: 0.266892\n",
"Insample R^2 for iter: 20 is 0.229431\n",
"\n",
"\n",
"Test set: Average loss: 0.257942\n",
"Test-sample R^2 for iter: 20 is 0.151941\n",
"\n",
"\n",
"Train Epoch: 19 [0/81599 (0%)]\tLoss: 0.273350\n",
"Insample R^2 for iter: 20 is 0.229822\n",
"\n",
"\n",
"Test set: Average loss: 0.257062\n",
"Test-sample R^2 for iter: 20 is 0.151264\n",
"\n",
"\n",
"Train Epoch: 19 [0/106354 (0%)]\tLoss: 0.322222\n",
"Insample R^2 for iter: 20 is 0.224971\n",
"\n",
"\n",
"Test set: Average loss: 0.254087\n",
"Test-sample R^2 for iter: 20 is 0.150110\n",
"\n",
"\n",
"Train Epoch: 19 [0/131647 (0%)]\tLoss: 0.322930\n",
"Insample R^2 for iter: 20 is 0.227868\n",
"\n",
"\n",
"Test set: Average loss: 0.250702\n",
"Test-sample R^2 for iter: 20 is 0.148683\n",
"\n",
"\n",
"Train Epoch: 19 [0/157513 (0%)]\tLoss: 0.333603\n",
"Insample R^2 for iter: 20 is 0.227269\n",
"\n",
"\n",
"Test set: Average loss: 0.249307\n",
"Test-sample R^2 for iter: 20 is 0.147097\n",
"\n",
"\n",
"Train Epoch: 19 [0/183724 (0%)]\tLoss: 0.328691\n",
"Insample R^2 for iter: 20 is 0.226537\n",
"\n",
"\n",
"Test set: Average loss: 0.248687\n",
"Test-sample R^2 for iter: 20 is 0.145437\n",
"\n",
"\n",
"Train Epoch: 19 [0/210124 (0%)]\tLoss: 0.314772\n",
"Insample R^2 for iter: 20 is 0.225171\n",
"\n",
"\n",
"Test set: Average loss: 0.249998\n",
"Test-sample R^2 for iter: 20 is 0.143956\n",
"\n",
"\n",
"Train Epoch: 19 [0/236621 (0%)]\tLoss: 0.286335\n",
"Insample R^2 for iter: 20 is 0.225425\n",
"\n",
"\n",
"Test set: Average loss: 0.253080\n",
"Test-sample R^2 for iter: 20 is 0.142735\n",
"\n",
"\n",
"Train Epoch: 19 [0/263967 (0%)]\tLoss: 0.274107\n",
"Insample R^2 for iter: 20 is 0.224821\n",
"\n",
"\n",
"Test set: Average loss: 0.256200\n",
"Test-sample R^2 for iter: 20 is 0.141623\n",
"\n",
"\n",
"Train Epoch: 19 [0/292711 (0%)]\tLoss: 0.288199\n",
"Insample R^2 for iter: 20 is 0.220514\n",
"\n",
"\n",
"Test set: Average loss: 0.257192\n",
"Test-sample R^2 for iter: 20 is 0.140424\n",
"\n",
"\n",
"Train Epoch: 19 [0/322680 (0%)]\tLoss: 0.264685\n",
"Insample R^2 for iter: 20 is 0.219971\n",
"\n",
"\n",
"Test set: Average loss: 0.259225\n",
"Test-sample R^2 for iter: 20 is 0.139224\n",
"\n",
"\n",
"Train Epoch: 19 [0/354563 (0%)]\tLoss: 0.256277\n",
"Insample R^2 for iter: 20 is 0.219984\n",
"\n",
"\n",
"Test set: Average loss: 0.261461\n",
"Test-sample R^2 for iter: 20 is 0.137985\n",
"\n",
"\n",
"Train Epoch: 19 [0/422402 (0%)]\tLoss: 0.250732\n",
"Insample R^2 for iter: 20 is 0.218975\n",
"\n",
"\n",
"Test set: Average loss: 0.267326\n",
"Test-sample R^2 for iter: 20 is 0.136531\n",
"\n",
"\n",
"Train Epoch: 19 [0/487054 (0%)]\tLoss: 0.248823\n",
"Insample R^2 for iter: 20 is 0.218296\n",
"\n",
"\n",
"Test set: Average loss: 0.269800\n",
"Test-sample R^2 for iter: 20 is 0.134728\n",
"\n",
"\n",
"Train Epoch: 20 [0/10692 (0%)]\tLoss: 0.396002\n",
"Insample R^2 for iter: 21 is 0.216027\n",
"\n",
"\n",
"Test set: Average loss: 0.256788\n",
"Test-sample R^2 for iter: 21 is 0.153309\n",
"\n",
"\n",
"Train Epoch: 20 [0/23538 (0%)]\tLoss: 0.333781\n",
"Insample R^2 for iter: 21 is 0.226409\n",
"\n",
"\n",
"Test set: Average loss: 0.256399\n",
"Test-sample R^2 for iter: 21 is 0.152887\n",
"\n",
"\n",
"Train Epoch: 20 [0/36468 (0%)]\tLoss: 0.292573\n",
"Insample R^2 for iter: 21 is 0.233386\n",
"\n",
"\n",
"Test set: Average loss: 0.256747\n",
"Test-sample R^2 for iter: 21 is 0.152411\n",
"\n",
"\n",
"Train Epoch: 20 [0/49745 (0%)]\tLoss: 0.280983\n",
"Insample R^2 for iter: 21 is 0.235857\n",
"\n",
"\n",
"Test set: Average loss: 0.257058\n",
"Test-sample R^2 for iter: 21 is 0.151779\n",
"\n",
"\n",
"Train Epoch: 20 [0/63294 (0%)]\tLoss: 0.305972\n",
"Insample R^2 for iter: 21 is 0.207902\n",
"\n",
"\n",
"Test set: Average loss: 0.258447\n",
"Test-sample R^2 for iter: 21 is 0.151009\n",
"\n",
"\n",
"Train Epoch: 20 [0/81599 (0%)]\tLoss: 0.280870\n",
"Insample R^2 for iter: 21 is 0.208325\n",
"\n",
"\n",
"Test set: Average loss: 0.257876\n",
"Test-sample R^2 for iter: 21 is 0.150039\n",
"\n",
"\n",
"Train Epoch: 20 [0/106354 (0%)]\tLoss: 0.313032\n",
"Insample R^2 for iter: 21 is 0.209836\n",
"\n",
"\n",
"Test set: Average loss: 0.255078\n",
"Test-sample R^2 for iter: 21 is 0.148582\n",
"\n",
"\n",
"Train Epoch: 20 [0/131647 (0%)]\tLoss: 0.326672\n",
"Insample R^2 for iter: 21 is 0.213524\n",
"\n",
"\n",
"Test set: Average loss: 0.251957\n",
"Test-sample R^2 for iter: 21 is 0.146807\n",
"\n",
"\n",
"Train Epoch: 20 [0/157513 (0%)]\tLoss: 0.333777\n",
"Insample R^2 for iter: 21 is 0.214470\n",
"\n",
"\n",
"Test set: Average loss: 0.250801\n",
"Test-sample R^2 for iter: 21 is 0.144853\n",
"\n",
"\n",
"Train Epoch: 20 [0/183724 (0%)]\tLoss: 0.329963\n",
"Insample R^2 for iter: 21 is 0.214710\n",
"\n",
"\n",
"Test set: Average loss: 0.250339\n",
"Test-sample R^2 for iter: 21 is 0.142840\n",
"\n",
"\n",
"Train Epoch: 20 [0/210124 (0%)]\tLoss: 0.338910\n",
"Insample R^2 for iter: 21 is 0.208884\n",
"\n",
"\n",
"Test set: Average loss: 0.251925\n",
"Test-sample R^2 for iter: 21 is 0.140985\n",
"\n",
"\n",
"Train Epoch: 20 [0/236621 (0%)]\tLoss: 0.286736\n",
"Insample R^2 for iter: 21 is 0.210401\n",
"\n",
"\n",
"Test set: Average loss: 0.255370\n",
"Test-sample R^2 for iter: 21 is 0.139355\n",
"\n",
"\n",
"Train Epoch: 20 [0/263967 (0%)]\tLoss: 0.275527\n",
"Insample R^2 for iter: 21 is 0.210636\n",
"\n",
"\n",
"Test set: Average loss: 0.258735\n",
"Test-sample R^2 for iter: 21 is 0.137840\n",
"\n",
"\n",
"Train Epoch: 20 [0/292711 (0%)]\tLoss: 0.267374\n",
"Insample R^2 for iter: 21 is 0.211682\n",
"\n",
"\n",
"Test set: Average loss: 0.259785\n",
"Test-sample R^2 for iter: 21 is 0.136280\n",
"\n",
"\n",
"Train Epoch: 20 [0/322680 (0%)]\tLoss: 0.258820\n",
"Insample R^2 for iter: 21 is 0.212897\n",
"\n",
"\n",
"Test set: Average loss: 0.261911\n",
"Test-sample R^2 for iter: 21 is 0.134751\n",
"\n",
"\n",
"Train Epoch: 20 [0/354563 (0%)]\tLoss: 0.251671\n",
"Insample R^2 for iter: 21 is 0.214232\n",
"\n",
"\n",
"Test set: Average loss: 0.264147\n",
"Test-sample R^2 for iter: 21 is 0.133226\n",
"\n",
"\n",
"Train Epoch: 20 [0/422402 (0%)]\tLoss: 0.241401\n",
"Insample R^2 for iter: 21 is 0.215318\n",
"\n",
"\n",
"Test set: Average loss: 0.269624\n",
"Test-sample R^2 for iter: 21 is 0.131603\n",
"\n",
"\n",
"Train Epoch: 20 [0/487054 (0%)]\tLoss: 0.256676\n",
"Insample R^2 for iter: 21 is 0.213438\n",
"\n",
"\n",
"Test set: Average loss: 0.271334\n",
"Test-sample R^2 for iter: 21 is 0.129791\n",
"\n",
"\n",
"Train Epoch: 21 [0/10692 (0%)]\tLoss: 0.375804\n",
"Insample R^2 for iter: 22 is 0.256185\n",
"\n",
"\n",
"Test set: Average loss: 0.260580\n",
"Test-sample R^2 for iter: 22 is 0.140808\n",
"\n",
"\n",
"Train Epoch: 21 [0/23538 (0%)]\tLoss: 0.330937\n",
"Insample R^2 for iter: 22 is 0.249744\n",
"\n",
"\n",
"Test set: Average loss: 0.260630\n",
"Test-sample R^2 for iter: 22 is 0.139645\n",
"\n",
"\n",
"Train Epoch: 21 [0/36468 (0%)]\tLoss: 0.294959\n",
"Insample R^2 for iter: 22 is 0.246872\n",
"\n",
"\n",
"Test set: Average loss: 0.261274\n",
"Test-sample R^2 for iter: 22 is 0.138596\n",
"\n",
"\n",
"Train Epoch: 21 [0/49745 (0%)]\tLoss: 0.334264\n",
"Insample R^2 for iter: 22 is 0.209835\n",
"\n",
"\n",
"Test set: Average loss: 0.261887\n",
"Test-sample R^2 for iter: 22 is 0.137425\n",
"\n",
"\n",
"Train Epoch: 21 [0/63294 (0%)]\tLoss: 0.260184\n",
"Insample R^2 for iter: 22 is 0.214307\n",
"\n",
"\n",
"Test set: Average loss: 0.263080\n",
"Test-sample R^2 for iter: 22 is 0.136472\n",
"\n",
"\n",
"Train Epoch: 21 [0/81599 (0%)]\tLoss: 0.272405\n",
"Insample R^2 for iter: 22 is 0.217650\n",
"\n",
"\n",
"Test set: Average loss: 0.262086\n",
"Test-sample R^2 for iter: 22 is 0.135598\n",
"\n",
"\n",
"Train Epoch: 21 [0/106354 (0%)]\tLoss: 0.305646\n",
"Insample R^2 for iter: 22 is 0.220474\n",
"\n",
"\n",
"Test set: Average loss: 0.258465\n",
"Test-sample R^2 for iter: 22 is 0.134573\n",
"\n",
"\n",
"Train Epoch: 21 [0/131647 (0%)]\tLoss: 0.347141\n",
"Insample R^2 for iter: 22 is 0.216836\n",
"\n",
"\n",
"Test set: Average loss: 0.254403\n",
"Test-sample R^2 for iter: 22 is 0.133498\n",
"\n",
"\n",
"Train Epoch: 21 [0/157513 (0%)]\tLoss: 0.336714\n",
"Insample R^2 for iter: 22 is 0.216644\n",
"\n",
"\n",
"Test set: Average loss: 0.252316\n",
"Test-sample R^2 for iter: 22 is 0.132438\n",
"\n",
"\n",
"Train Epoch: 21 [0/183724 (0%)]\tLoss: 0.333572\n",
"Insample R^2 for iter: 22 is 0.215802\n",
"\n",
"\n",
"Test set: Average loss: 0.250953\n",
"Test-sample R^2 for iter: 22 is 0.131452\n",
"\n",
"\n",
"Train Epoch: 21 [0/210124 (0%)]\tLoss: 0.308015\n",
"Insample R^2 for iter: 22 is 0.216955\n",
"\n",
"\n",
"Test set: Average loss: 0.252083\n",
"Test-sample R^2 for iter: 22 is 0.130582\n",
"\n",
"\n",
"Train Epoch: 21 [0/236621 (0%)]\tLoss: 0.286154\n",
"Insample R^2 for iter: 22 is 0.217927\n",
"\n",
"\n",
"Test set: Average loss: 0.255002\n",
"Test-sample R^2 for iter: 22 is 0.129924\n",
"\n",
"\n",
"Train Epoch: 21 [0/263967 (0%)]\tLoss: 0.268191\n",
"Insample R^2 for iter: 22 is 0.219203\n",
"\n",
"\n",
"Test set: Average loss: 0.257777\n",
"Test-sample R^2 for iter: 22 is 0.129385\n",
"\n",
"\n",
"Train Epoch: 21 [0/292711 (0%)]\tLoss: 0.264820\n",
"Insample R^2 for iter: 22 is 0.220166\n",
"\n",
"\n",
"Test set: Average loss: 0.258204\n",
"Test-sample R^2 for iter: 22 is 0.128814\n",
"\n",
"\n",
"Train Epoch: 21 [0/322680 (0%)]\tLoss: 0.258769\n",
"Insample R^2 for iter: 22 is 0.220824\n",
"\n",
"\n",
"Test set: Average loss: 0.259842\n",
"Test-sample R^2 for iter: 22 is 0.128249\n",
"\n",
"\n",
"Train Epoch: 21 [0/354563 (0%)]\tLoss: 0.254623\n",
"Insample R^2 for iter: 22 is 0.221095\n",
"\n",
"\n",
"Test set: Average loss: 0.261672\n",
"Test-sample R^2 for iter: 22 is 0.127651\n",
"\n",
"\n",
"Train Epoch: 21 [0/422402 (0%)]\tLoss: 0.246953\n",
"Insample R^2 for iter: 22 is 0.220728\n",
"\n",
"\n",
"Test set: Average loss: 0.267135\n",
"Test-sample R^2 for iter: 22 is 0.126842\n",
"\n",
"\n",
"Train Epoch: 21 [0/487054 (0%)]\tLoss: 0.245302\n",
"Insample R^2 for iter: 22 is 0.220575\n",
"\n",
"\n",
"Test set: Average loss: 0.269444\n",
"Test-sample R^2 for iter: 22 is 0.125643\n",
"\n",
"\n",
"Train Epoch: 22 [0/10692 (0%)]\tLoss: 0.406848\n",
"Insample R^2 for iter: 23 is 0.194394\n",
"\n",
"\n",
"Test set: Average loss: 0.256794\n",
"Test-sample R^2 for iter: 23 is 0.153291\n",
"\n",
"\n",
"Train Epoch: 22 [0/23538 (0%)]\tLoss: 0.334252\n",
"Insample R^2 for iter: 23 is 0.215020\n",
"\n",
"\n",
"Test set: Average loss: 0.256418\n",
"Test-sample R^2 for iter: 23 is 0.152847\n",
"\n",
"\n",
"Train Epoch: 22 [0/36468 (0%)]\tLoss: 0.314093\n",
"Insample R^2 for iter: 23 is 0.207192\n",
"\n",
"\n",
"Test set: Average loss: 0.256869\n",
"Test-sample R^2 for iter: 23 is 0.152251\n",
"\n",
"\n",
"Train Epoch: 22 [0/49745 (0%)]\tLoss: 0.287399\n",
"Insample R^2 for iter: 23 is 0.211843\n",
"\n",
"\n",
"Test set: Average loss: 0.257250\n",
"Test-sample R^2 for iter: 23 is 0.151499\n",
"\n",
"\n",
"Train Epoch: 22 [0/63294 (0%)]\tLoss: 0.263911\n",
"Insample R^2 for iter: 23 is 0.213688\n",
"\n",
"\n",
"Test set: Average loss: 0.258441\n",
"Test-sample R^2 for iter: 23 is 0.150789\n",
"\n",
"\n",
"Train Epoch: 22 [0/81599 (0%)]\tLoss: 0.284960\n",
"Insample R^2 for iter: 23 is 0.211204\n",
"\n",
"\n",
"Test set: Average loss: 0.257621\n",
"Test-sample R^2 for iter: 23 is 0.149996\n",
"\n",
"\n",
"Train Epoch: 22 [0/106354 (0%)]\tLoss: 0.302229\n",
"Insample R^2 for iter: 23 is 0.216170\n",
"\n",
"\n",
"Test set: Average loss: 0.254541\n",
"Test-sample R^2 for iter: 23 is 0.148804\n",
"\n",
"\n",
"Train Epoch: 22 [0/131647 (0%)]\tLoss: 0.328058\n",
"Insample R^2 for iter: 23 is 0.218653\n",
"\n",
"\n",
"Test set: Average loss: 0.250975\n",
"Test-sample R^2 for iter: 23 is 0.147423\n",
"\n",
"\n",
"Train Epoch: 22 [0/157513 (0%)]\tLoss: 0.318829\n",
"Insample R^2 for iter: 23 is 0.222915\n",
"\n",
"\n",
"Test set: Average loss: 0.249405\n",
"Test-sample R^2 for iter: 23 is 0.145939\n",
"\n",
"\n",
"Train Epoch: 22 [0/183724 (0%)]\tLoss: 0.319280\n",
"Insample R^2 for iter: 23 is 0.224855\n",
"\n",
"\n",
"Test set: Average loss: 0.248641\n",
"Test-sample R^2 for iter: 23 is 0.144411\n",
"\n",
"\n",
"Train Epoch: 22 [0/210124 (0%)]\tLoss: 0.344523\n",
"Insample R^2 for iter: 23 is 0.216815\n",
"\n",
"\n",
"Test set: Average loss: 0.249872\n",
"Test-sample R^2 for iter: 23 is 0.143063\n",
"\n",
"\n",
"Train Epoch: 22 [0/236621 (0%)]\tLoss: 0.293819\n",
"Insample R^2 for iter: 23 is 0.216062\n",
"\n",
"\n",
"Test set: Average loss: 0.253018\n",
"Test-sample R^2 for iter: 23 is 0.141934\n",
"\n",
"\n",
"Train Epoch: 22 [0/263967 (0%)]\tLoss: 0.315969\n",
"Insample R^2 for iter: 23 is 0.206910\n",
"\n",
"\n",
"Test set: Average loss: 0.256197\n",
"Test-sample R^2 for iter: 23 is 0.140885\n",
"\n",
"\n",
"Train Epoch: 22 [0/292711 (0%)]\tLoss: 0.275000\n",
"Insample R^2 for iter: 23 is 0.206627\n",
"\n",
"\n",
"Test set: Average loss: 0.257186\n",
"Test-sample R^2 for iter: 23 is 0.139740\n",
"\n",
"\n",
"Train Epoch: 22 [0/322680 (0%)]\tLoss: 0.264855\n",
"Insample R^2 for iter: 23 is 0.206967\n",
"\n",
"\n",
"Test set: Average loss: 0.259211\n",
"Test-sample R^2 for iter: 23 is 0.138589\n",
"\n",
"\n",
"Train Epoch: 22 [0/354563 (0%)]\tLoss: 0.253016\n",
"Insample R^2 for iter: 23 is 0.208409\n",
"\n",
"\n",
"Test set: Average loss: 0.261374\n",
"Test-sample R^2 for iter: 23 is 0.137408\n",
"\n",
"\n",
"Train Epoch: 22 [0/422402 (0%)]\tLoss: 0.241531\n",
"Insample R^2 for iter: 23 is 0.209808\n",
"\n",
"\n",
"Test set: Average loss: 0.267145\n",
"Test-sample R^2 for iter: 23 is 0.136023\n",
"\n",
"\n",
"Train Epoch: 22 [0/487054 (0%)]\tLoss: 0.258076\n",
"Insample R^2 for iter: 23 is 0.207979\n",
"\n",
"\n",
"Test set: Average loss: 0.269460\n",
"Test-sample R^2 for iter: 23 is 0.134311\n",
"\n",
"\n",
"Train Epoch: 23 [0/10692 (0%)]\tLoss: 0.387487\n",
"Insample R^2 for iter: 24 is 0.232885\n",
"\n",
"\n",
"Test set: Average loss: 0.257317\n",
"Test-sample R^2 for iter: 24 is 0.151567\n",
"\n",
"\n",
"Train Epoch: 23 [0/23538 (0%)]\tLoss: 0.334365\n",
"Insample R^2 for iter: 24 is 0.234118\n",
"\n",
"\n",
"Test set: Average loss: 0.257082\n",
"Test-sample R^2 for iter: 24 is 0.150887\n",
"\n",
"\n",
"Train Epoch: 23 [0/36468 (0%)]\tLoss: 0.317877\n",
"Insample R^2 for iter: 24 is 0.216645\n",
"\n",
"\n",
"Test set: Average loss: 0.257746\n",
"Test-sample R^2 for iter: 24 is 0.149977\n",
"\n",
"\n",
"Train Epoch: 23 [0/49745 (0%)]\tLoss: 0.278569\n",
"Insample R^2 for iter: 24 is 0.224910\n",
"\n",
"\n",
"Test set: Average loss: 0.258476\n",
"Test-sample R^2 for iter: 24 is 0.148781\n",
"\n",
"\n",
"Train Epoch: 23 [0/63294 (0%)]\tLoss: 0.257675\n",
"Insample R^2 for iter: 24 is 0.227842\n",
"\n",
"\n",
"Test set: Average loss: 0.259886\n",
"Test-sample R^2 for iter: 24 is 0.147662\n",
"\n",
"\n",
"Train Epoch: 23 [0/81599 (0%)]\tLoss: 0.282735\n",
"Insample R^2 for iter: 24 is 0.224042\n",
"\n",
"\n",
"Test set: Average loss: 0.259131\n",
"Test-sample R^2 for iter: 24 is 0.146556\n",
"\n",
"\n",
"Train Epoch: 23 [0/106354 (0%)]\tLoss: 0.300872\n",
"Insample R^2 for iter: 24 is 0.227655\n",
"\n",
"\n",
"Test set: Average loss: 0.255923\n",
"Test-sample R^2 for iter: 24 is 0.145190\n",
"\n",
"\n",
"Train Epoch: 23 [0/131647 (0%)]\tLoss: 0.321698\n",
"Insample R^2 for iter: 24 is 0.230561\n",
"\n",
"\n",
"Test set: Average loss: 0.252459\n",
"Test-sample R^2 for iter: 24 is 0.143623\n",
"\n",
"\n",
"Train Epoch: 23 [0/157513 (0%)]\tLoss: 0.320929\n",
"Insample R^2 for iter: 24 is 0.232950\n",
"\n",
"\n",
"Test set: Average loss: 0.251073\n",
"Test-sample R^2 for iter: 24 is 0.141918\n",
"\n",
"\n",
"Train Epoch: 23 [0/183724 (0%)]\tLoss: 0.311735\n",
"Insample R^2 for iter: 24 is 0.235685\n",
"\n",
"\n",
"Test set: Average loss: 0.250466\n",
"Test-sample R^2 for iter: 24 is 0.140154\n",
"\n",
"\n",
"Train Epoch: 23 [0/210124 (0%)]\tLoss: 0.301577\n",
"Insample R^2 for iter: 24 is 0.236498\n",
"\n",
"\n",
"Test set: Average loss: 0.252074\n",
"Test-sample R^2 for iter: 24 is 0.138495\n",
"\n",
"\n",
"Train Epoch: 23 [0/236621 (0%)]\tLoss: 0.283978\n",
"Insample R^2 for iter: 24 is 0.236328\n",
"\n",
"\n",
"Test set: Average loss: 0.255518\n",
"Test-sample R^2 for iter: 24 is 0.137030\n",
"\n",
"\n",
"Train Epoch: 23 [0/263967 (0%)]\tLoss: 0.266597\n",
"Insample R^2 for iter: 24 is 0.236534\n",
"\n",
"\n",
"Test set: Average loss: 0.258858\n",
"Test-sample R^2 for iter: 24 is 0.135662\n",
"\n",
"\n",
"Train Epoch: 23 [0/292711 (0%)]\tLoss: 0.261922\n",
"Insample R^2 for iter: 24 is 0.236858\n",
"\n",
"\n",
"Test set: Average loss: 0.259773\n",
"Test-sample R^2 for iter: 24 is 0.134261\n",
"\n",
"\n",
"Train Epoch: 23 [0/322680 (0%)]\tLoss: 0.255862\n",
"Insample R^2 for iter: 24 is 0.236978\n",
"\n",
"\n",
"Test set: Average loss: 0.261785\n",
"Test-sample R^2 for iter: 24 is 0.132894\n",
"\n",
"\n",
"Train Epoch: 23 [0/354563 (0%)]\tLoss: 0.258829\n",
"Insample R^2 for iter: 24 is 0.235428\n",
"\n",
"\n",
"Test set: Average loss: 0.263874\n",
"Test-sample R^2 for iter: 24 is 0.131543\n",
"\n",
"\n",
"Train Epoch: 23 [0/422402 (0%)]\tLoss: 0.245727\n",
"Insample R^2 for iter: 24 is 0.234444\n",
"\n",
"\n",
"Test set: Average loss: 0.269314\n",
"Test-sample R^2 for iter: 24 is 0.130079\n",
"\n",
"\n",
"Train Epoch: 23 [0/487054 (0%)]\tLoss: 0.257944\n",
"Insample R^2 for iter: 24 is 0.231267\n",
"\n",
"\n",
"Test set: Average loss: 0.271062\n",
"Test-sample R^2 for iter: 24 is 0.128402\n",
"\n",
"\n",
"Train Epoch: 24 [0/10692 (0%)]\tLoss: 0.379920\n",
"Insample R^2 for iter: 25 is 0.247913\n",
"\n",
"\n",
"Test set: Average loss: 0.260345\n",
"Test-sample R^2 for iter: 25 is 0.141582\n",
"\n",
"\n",
"Train Epoch: 24 [0/23538 (0%)]\tLoss: 0.335657\n",
"Insample R^2 for iter: 25 is 0.240131\n",
"\n",
"\n",
"Test set: Average loss: 0.259698\n",
"Test-sample R^2 for iter: 25 is 0.141571\n",
"\n",
"\n",
"Train Epoch: 24 [0/36468 (0%)]\tLoss: 0.375502\n",
"Insample R^2 for iter: 25 is 0.170891\n",
"\n",
"\n",
"Test set: Average loss: 0.259644\n",
"Test-sample R^2 for iter: 25 is 0.141676\n",
"\n",
"\n",
"Train Epoch: 24 [0/49745 (0%)]\tLoss: 0.280764\n",
"Insample R^2 for iter: 25 is 0.189096\n",
"\n",
"\n",
"Test set: Average loss: 0.259398\n",
"Test-sample R^2 for iter: 25 is 0.141792\n",
"\n",
"\n",
"Train Epoch: 24 [0/63294 (0%)]\tLoss: 0.272241\n",
"Insample R^2 for iter: 25 is 0.190519\n",
"\n",
"\n",
"Test set: Average loss: 0.259812\n",
"Test-sample R^2 for iter: 25 is 0.142120\n",
"\n",
"\n",
"Train Epoch: 24 [0/81599 (0%)]\tLoss: 0.276405\n",
"Insample R^2 for iter: 25 is 0.195919\n",
"\n",
"\n",
"Test set: Average loss: 0.258193\n",
"Test-sample R^2 for iter: 25 is 0.142456\n",
"\n",
"\n",
"Train Epoch: 24 [0/106354 (0%)]\tLoss: 0.299384\n",
"Insample R^2 for iter: 25 is 0.204077\n",
"\n",
"\n",
"Test set: Average loss: 0.254247\n",
"Test-sample R^2 for iter: 25 is 0.142483\n",
"\n",
"\n",
"Train Epoch: 24 [0/131647 (0%)]\tLoss: 0.387780\n",
"Insample R^2 for iter: 25 is 0.190574\n",
"\n",
"\n",
"Test set: Average loss: 0.250087\n",
"Test-sample R^2 for iter: 25 is 0.142273\n",
"\n",
"\n",
"Train Epoch: 24 [0/157513 (0%)]\tLoss: 0.327615\n",
"Insample R^2 for iter: 25 is 0.195659\n",
"\n",
"\n",
"Test set: Average loss: 0.248239\n",
"Test-sample R^2 for iter: 25 is 0.141811\n",
"\n",
"\n",
"Train Epoch: 24 [0/183724 (0%)]\tLoss: 0.311992\n",
"Insample R^2 for iter: 25 is 0.202057\n",
"\n",
"\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "4SZC-wFqqOk9",
"colab_type": "text"
},
"source": [
"Calculate the out-of-sample Total R_square"
]
},
{
"cell_type": "code",
"metadata": {
"id": "n3ire4cIR77W",
"colab_type": "code",
"colab": {}
},
"source": [
"#Out-of-sample Estimation Dataset\n",
"X_out = ranked_train_df[ranked_train_df['DATE']>19870000].drop(['PERMNO', 'DATE', 'return'], axis=1).values\n",
"y_out = train_df[train_df['DATE']>19870000]['return'].values\n",
"X_out = t.from_numpy(X_out).float().cuda()\n",
"y_out = t.from_numpy(y_out).float().cuda()\n",
"out_dataset = Data.TensorDataset(X_out, y_out)\n",
"\n",
"out_loader = Data.DataLoader(\n",
" dataset=out_dataset,\n",
" batch_size=len(out_dataset),\n",
" shuffle=True,\n",
" num_workers=0)"
],
"execution_count": 0,
"outputs": []
},
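{
"cell_type": "markdown",
"metadata": {
"id": "totalR2note",
"colab_type": "text"
},
"source": [
"For reference, a minimal stand-alone sketch of the total R^2 of Gu, Kelly, and Xiu (2019): one minus the sum of squared pricing errors over the sum of squared raw returns, with no demeaning in the denominator. This assumes the `R_square` helper defined earlier in this notebook implements the same formula; the cell below is a hypothetical restatement for readability and is not called by the evaluation code that follows."
]
},
{
"cell_type": "code",
"metadata": {
"id": "totalR2sketch",
"colab_type": "code",
"colab": {}
},
"source": [
"# Hypothetical stand-alone restatement of the total R^2; the notebook's own\n",
"# R_square helper (defined earlier) is what the evaluation cell actually uses.\n",
"def total_r_square_sketch(r_pred, r_true):\n",
"    r_pred = np.asarray(r_pred).ravel()\n",
"    r_true = np.asarray(r_true).ravel()\n",
"    # 1 - SSE / sum of squared returns (denominator is not demeaned)\n",
"    return 1.0 - np.sum((r_true - r_pred) ** 2) / np.sum(r_true ** 2)\n",
"\n",
"# Tiny synthetic check on made-up monthly returns\n",
"print(total_r_square_sketch([0.010, -0.020], [0.012, -0.018]))"
],
"execution_count": 0,
"outputs": []
},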
{
"cell_type": "code",
"metadata": {
"id": "FlVYProuSJJz",
"colab_type": "code",
"colab": {}
},
"source": [
"out_square_list = []\n",
"for batch_x, batch_y in out_loader:\n",
" z = batch_x\n",
" r = batch_y[np.newaxis, ...]\n",
" r_pred = ca1(z, r)\n",
" out_square_list.append(R_square(r_pred.detach().cpu().numpy(), r.detach().cpu().numpy()))\n",
"print('Out-of-sample R^2 for iter: %d is %f' % (epoch+1, np.mean(out_square_list)))"
],
"execution_count": 0,
"outputs": []
}
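,
{
"cell_type": "markdown",
"metadata": {
"id": "oosR2followup",
"colab_type": "text"
},
"source": [
"The printed figure is the total out-of-sample R^2 over all post-1987 observations; `epoch+1` in the print statement is simply the last training iteration left over from the loop above, matching the `iter:` labels in the training log."
]
}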
]
}