{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.1"
},
"colab": {
"name": "ckpt_xiu_monthly v4.1.ipynb",
"provenance": [],
"collapsed_sections": [],
"toc_visible": true,
"machine_shape": "hm",
"include_colab_link": true
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/ginward/e917f2869b706c6c32cf99fed013e492/ckpt_xiu_monthly-v4-1.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YX8cgPsuPUd6",
"colab_type": "text"
},
"source": [
"# 1. Introduction"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "QCPSfZqhPUd9",
"colab_type": "text"
},
"source": [
"This notebook is to replicate the main result of the paper: \n",
"Gu, Shihao, Bryan T. Kelly, and Dacheng Xiu. \"Autoencoder asset pricing models.\" Available at SSRN (2019).\n",
"\n",
"Please refer to the requirement.txt for environment configuration.\n",
"\n",
"This version also implements checkpointing."
]
},
{
"cell_type": "code",
"metadata": {
"id": "2ia_JKmlPUd-",
"colab_type": "code",
"colab": {}
},
"source": [
"import numpy as np\n",
"import pandas as pd\n",
"import h5py\n",
"from matplotlib import pyplot as plt\n",
"from scipy import stats\n",
"import torch as t\n",
"from torch import nn\n",
"import os\n",
"from torch import optim\n",
"import torch.utils.data as Data\n",
"import warnings\n",
"import random\n",
"warnings.filterwarnings(\"ignore\")\n",
"t.manual_seed(1)\n",
"t.cuda.manual_seed(1)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "7xd1MpSTPUeB",
"colab_type": "code",
"outputId": "b7643f42-42b5-42ef-c616-112cec34e02d",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"print(t.__version__)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"1.4.0\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "6urp-yfyPUeF",
"colab_type": "code",
"outputId": "ac298085-c0e7-4db7-c561-410469292309",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"print(t.version.cuda)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"10.1\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "6z1BCbEcMG1w",
"colab_type": "code",
"outputId": "df2fc3ec-ced0-4f00-a6c9-90a57773c50c",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 306
}
},
"source": [
"gpu_info = !nvidia-smi\n",
"gpu_info = '\\n'.join(gpu_info)\n",
"if gpu_info.find('failed') >= 0:\n",
" print('Select the Runtime → \"Change runtime type\" menu to enable a GPU accelerator, ')\n",
" print('and then re-execute this cell.')\n",
"else:\n",
" print(gpu_info)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"Sat Mar 28 20:54:05 2020 \n",
"+-----------------------------------------------------------------------------+\n",
"| NVIDIA-SMI 440.64.00 Driver Version: 418.67 CUDA Version: 10.1 |\n",
"|-------------------------------+----------------------+----------------------+\n",
"| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |\n",
"| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |\n",
"|===============================+======================+======================|\n",
"| 0 Tesla P100-PCIE... Off | 00000000:00:04.0 Off | 0 |\n",
"| N/A 39C P0 26W / 250W | 0MiB / 16280MiB | 0% Default |\n",
"+-------------------------------+----------------------+----------------------+\n",
" \n",
"+-----------------------------------------------------------------------------+\n",
"| Processes: GPU Memory |\n",
"| GPU PID Type Process name Usage |\n",
"|=============================================================================|\n",
"| No running processes found |\n",
"+-----------------------------------------------------------------------------+\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "psJfSgLcPUeI",
"colab_type": "text"
},
"source": [
"If you should run this code locally, you should make sure `t.cuda.is_available() == True`, otherwise you should install the right pytorch matches your cuda version.\n",
"If you want to run the code only on CPU, then please remove every `.cuda()` in the code."
]
},
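{
"cell_type": "markdown",
"metadata": {
"id": "devAgnosticMd",
"colab_type": "text"
},
"source": [
"As a minimal sketch of a device-agnostic alternative (not this notebook's own approach, which calls `.cuda()` directly), you can select the device once and create tensors on it explicitly; the name `device` below is illustrative and not used elsewhere in this notebook."
]
},
{
"cell_type": "code",
"metadata": {
"id": "devAgnosticCode",
"colab_type": "code",
"colab": {}
},
"source": [
"# Illustrative sketch: pick the GPU if available, otherwise fall back to CPU.\n",
"device = t.device('cuda' if t.cuda.is_available() else 'cpu')\n",
"x = t.zeros(3, device=device)  # tensor created directly on the chosen device\n",
"print(device, x.device)"
],
"execution_count": 0,
"outputs": []
},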
{
"cell_type": "code",
"metadata": {
"id": "RZbBFDRxPUeI",
"colab_type": "code",
"outputId": "971c1a9b-1efc-4679-86ca-1205bba0f793",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"t.cuda.is_available()"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"True"
]
},
"metadata": {
"tags": []
},
"execution_count": 5
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "xma_fGFIPUeL",
"colab_type": "text"
},
"source": [
"# 2. Data "
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "RZ6JEeHlPUeM",
"colab_type": "text"
},
"source": [
"## 2.1 Overview"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "MrCl5DjBPUeN",
"colab_type": "text"
},
"source": [
"We used the `datashare.csv` updated by the author of this paper"
]
},
{
"cell_type": "code",
"metadata": {
"id": "mPkbG5XaUf-h",
"colab_type": "code",
"outputId": "c867b07a-cf78-45dc-a174-a71db66cb966",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"# Connect with Google Drive\n",
"from google.colab import drive\n",
"drive.mount('/content/drive', force_remount = True)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"Mounted at /content/drive\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "yOBynFNbWKb9",
"colab_type": "code",
"outputId": "9abf07e0-9baa-4d25-9748-b763d7b8921b",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"cd drive/My Drive/Colab Notebooks"
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"/content/drive/My Drive/Colab Notebooks\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "Eqh7X11nKpyj",
"colab_type": "code",
"colab": {}
},
"source": [
"CKP_PATH = \"./ck_r2_char.pt\""
],
"execution_count": 0,
"outputs": []
},
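{
"cell_type": "markdown",
"metadata": {
"id": "ckptSketchMd",
"colab_type": "text"
},
"source": [
"A minimal sketch of the save/restore pattern that checkpointing in this notebook relies on, demonstrated on a dummy model and a temporary path so the real checkpoint at `CKP_PATH` is left untouched; the names `_m`, `_opt`, and `_demo_path` are illustrative only."
]
},
{
"cell_type": "code",
"metadata": {
"id": "ckptSketchCode",
"colab_type": "code",
"colab": {}
},
"source": [
"# Illustrative checkpoint round-trip; deliberately does NOT write to CKP_PATH.\n",
"_demo_path = '/tmp/ck_demo.pt'\n",
"_m = nn.Linear(4, 1)\n",
"_opt = optim.Adam(_m.parameters())\n",
"# save model and optimizer state together with the epoch counter\n",
"t.save({'epoch': 0, 'model_state': _m.state_dict(), 'optim_state': _opt.state_dict()}, _demo_path)\n",
"# on restart, restore everything from the checkpoint file\n",
"ck = t.load(_demo_path)\n",
"_m.load_state_dict(ck['model_state'])\n",
"_opt.load_state_dict(ck['optim_state'])\n",
"print('resumed from epoch', ck['epoch'])"
],
"execution_count": 0,
"outputs": []
},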
{
"cell_type": "code",
"metadata": {
"id": "tzpgOXZuPUeO",
"colab_type": "code",
"colab": {}
},
"source": [
"data = pd.read_csv('./xiu_month_rf_hpr2.csv')"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "PBP-EEu29fiW",
"colab_type": "code",
"colab": {}
},
"source": [
"data = data[data['DATE'].isnull()==False]"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "uqwtTnfZNwrp",
"colab_type": "code",
"colab": {}
},
"source": [
"#data = pd.get_dummies(data, prefix=['sic2_'], columns=['sic2'])\n",
"data = data.drop(columns=['sic2'])"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "gMtnE8Vd9gCE",
"colab_type": "code",
"colab": {}
},
"source": [
"data = data.sort_values(by=[\"PERMNO\",\"MONTH\"])"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "INABBQPdUAiH",
"colab_type": "code",
"colab": {}
},
"source": [
"data['RF'] = pd.to_numeric(data['RF'])"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "cT-3qO_MJtB0",
"colab_type": "code",
"colab": {}
},
"source": [
"#RF is in percentages\n",
"data['RF'] = data['RF']/100"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "vnrHUF45JSiG",
"colab_type": "code",
"colab": {}
},
"source": [
"data['return'] = data['return'] - data['RF']"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "7EmW1OEvaWLd",
"colab_type": "code",
"colab": {}
},
"source": [
"#drop the unnecessary columns\n",
"data = data.drop(columns = [\"MONTH\", \"RF\"])"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "V5C8-W_SMzN5",
"colab_type": "code",
"colab": {}
},
"source": [
"#we don't want return to contain nan\n",
"data=data[data['return'].isnull()==False]"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "XdzhwhSRQ9Z5",
"colab_type": "code",
"outputId": "2c08e0ce-49ac-45fc-a3a4-c787d3b17cae",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 289
}
},
"source": [
"data.columns"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"Index(['permno', 'DATE', 'mvel1', 'beta', 'betasq', 'chmom', 'dolvol',\n",
" 'idiovol', 'indmom', 'mom1m', 'mom6m', 'mom12m', 'mom36m', 'pricedelay',\n",
" 'turn', 'absacc', 'acc', 'age', 'agr', 'bm', 'bm_ia', 'cashdebt',\n",
" 'cashpr', 'cfp', 'cfp_ia', 'chatoia', 'chcsho', 'chempia', 'chinv',\n",
" 'chpmia', 'convind', 'currat', 'depr', 'divi', 'divo', 'dy', 'egr',\n",
" 'ep', 'gma', 'grcapx', 'grltnoa', 'herf', 'hire', 'invest', 'lev',\n",
" 'lgr', 'mve_ia', 'operprof', 'orgcap', 'pchcapx_ia', 'pchcurrat',\n",
" 'pchdepr', 'pchgm_pchsale', 'pchquick', 'pchsale_pchinvt',\n",
" 'pchsale_pchrect', 'pchsale_pchxsga', 'pchsaleinv', 'pctacc', 'ps',\n",
" 'quick', 'rd', 'rd_mve', 'rd_sale', 'realestate', 'roic', 'salecash',\n",
" 'saleinv', 'salerec', 'secured', 'securedind', 'sgr', 'sin', 'sp',\n",
" 'tang', 'tb', 'aeavol', 'cash', 'chtx', 'cinvest', 'ear', 'nincr',\n",
" 'roaq', 'roavol', 'roeq', 'rsup', 'stdacc', 'stdcf', 'ms', 'baspread',\n",
" 'ill', 'maxret', 'retvol', 'std_dolvol', 'std_turn', 'zerotrade',\n",
" 'PERMNO', 'date', 'return'],\n",
" dtype='object')"
]
},
"metadata": {
"tags": []
},
"execution_count": 18
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "5VLqDjQqPUeZ",
"colab_type": "code",
"colab": {}
},
"source": [
"summary = data.describe()"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "w1hc8hFaPUeb",
"colab_type": "code",
"outputId": "63fde605-f05d-4447-dab3-d77a7f1fe194",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 397
}
},
"source": [
"summary"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>permno</th>\n",
" <th>DATE</th>\n",
" <th>mvel1</th>\n",
" <th>beta</th>\n",
" <th>betasq</th>\n",
" <th>chmom</th>\n",
" <th>dolvol</th>\n",
" <th>idiovol</th>\n",
" <th>indmom</th>\n",
" <th>mom1m</th>\n",
" <th>mom6m</th>\n",
" <th>mom12m</th>\n",
" <th>mom36m</th>\n",
" <th>pricedelay</th>\n",
" <th>turn</th>\n",
" <th>absacc</th>\n",
" <th>acc</th>\n",
" <th>age</th>\n",
" <th>agr</th>\n",
" <th>bm</th>\n",
" <th>bm_ia</th>\n",
" <th>cashdebt</th>\n",
" <th>cashpr</th>\n",
" <th>cfp</th>\n",
" <th>cfp_ia</th>\n",
" <th>chatoia</th>\n",
" <th>chcsho</th>\n",
" <th>chempia</th>\n",
" <th>chinv</th>\n",
" <th>chpmia</th>\n",
" <th>convind</th>\n",
" <th>currat</th>\n",
" <th>depr</th>\n",
" <th>divi</th>\n",
" <th>divo</th>\n",
" <th>dy</th>\n",
" <th>egr</th>\n",
" <th>ep</th>\n",
" <th>gma</th>\n",
" <th>grcapx</th>\n",
" <th>...</th>\n",
" <th>pctacc</th>\n",
" <th>ps</th>\n",
" <th>quick</th>\n",
" <th>rd</th>\n",
" <th>rd_mve</th>\n",
" <th>rd_sale</th>\n",
" <th>realestate</th>\n",
" <th>roic</th>\n",
" <th>salecash</th>\n",
" <th>saleinv</th>\n",
" <th>salerec</th>\n",
" <th>secured</th>\n",
" <th>securedind</th>\n",
" <th>sgr</th>\n",
" <th>sin</th>\n",
" <th>sp</th>\n",
" <th>tang</th>\n",
" <th>tb</th>\n",
" <th>aeavol</th>\n",
" <th>cash</th>\n",
" <th>chtx</th>\n",
" <th>cinvest</th>\n",
" <th>ear</th>\n",
" <th>nincr</th>\n",
" <th>roaq</th>\n",
" <th>roavol</th>\n",
" <th>roeq</th>\n",
" <th>rsup</th>\n",
" <th>stdacc</th>\n",
" <th>stdcf</th>\n",
" <th>ms</th>\n",
" <th>baspread</th>\n",
" <th>ill</th>\n",
" <th>maxret</th>\n",
" <th>retvol</th>\n",
" <th>std_dolvol</th>\n",
" <th>std_turn</th>\n",
" <th>zerotrade</th>\n",
" <th>PERMNO</th>\n",
" <th>return</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>count</th>\n",
" <td>3.739449e+06</td>\n",
" <td>3.739449e+06</td>\n",
" <td>3.736496e+06</td>\n",
" <td>3.371274e+06</td>\n",
" <td>3.371274e+06</td>\n",
" <td>3.428516e+06</td>\n",
" <td>3.393674e+06</td>\n",
" <td>3.371274e+06</td>\n",
" <td>3.739336e+06</td>\n",
" <td>3.710613e+06</td>\n",
" <td>3.596218e+06</td>\n",
" <td>3.428516e+06</td>\n",
" <td>2.836227e+06</td>\n",
" <td>3.371202e+06</td>\n",
" <td>3.395522e+06</td>\n",
" <td>2.315500e+06</td>\n",
" <td>2.315500e+06</td>\n",
" <td>2.819051e+06</td>\n",
" <td>2.584590e+06</td>\n",
" <td>2.770186e+06</td>\n",
" <td>2.770186e+06</td>\n",
" <td>2.675175e+06</td>\n",
" <td>2.773472e+06</td>\n",
" <td>2.498440e+06</td>\n",
" <td>2.498440e+06</td>\n",
" <td>2.344492e+06</td>\n",
" <td>2.580992e+06</td>\n",
" <td>2.551149e+06</td>\n",
" <td>2.501191e+06</td>\n",
" <td>2.544197e+06</td>\n",
" <td>2.819051e+06</td>\n",
" <td>2.697751e+06</td>\n",
" <td>2.623932e+06</td>\n",
" <td>2.584633e+06</td>\n",
" <td>2.584633e+06</td>\n",
" <td>2.805990e+06</td>\n",
" <td>2.557353e+06</td>\n",
" <td>2.816012e+06</td>\n",
" <td>2.578034e+06</td>\n",
" <td>2.223305e+06</td>\n",
" <td>...</td>\n",
" <td>2.315454e+06</td>\n",
" <td>2.584633e+06</td>\n",
" <td>2.670802e+06</td>\n",
" <td>2.584633e+06</td>\n",
" <td>1.322812e+06</td>\n",
" <td>1.305316e+06</td>\n",
" <td>945405.000000</td>\n",
" <td>2.675117e+06</td>\n",
" <td>2.781557e+06</td>\n",
" <td>2.185371e+06</td>\n",
" <td>2.694111e+06</td>\n",
" <td>1.428519e+06</td>\n",
" <td>2.819051e+06</td>\n",
" <td>2.549270e+06</td>\n",
" <td>2.819051e+06</td>\n",
" <td>2.808559e+06</td>\n",
" <td>2.636985e+06</td>\n",
" <td>2.485428e+06</td>\n",
" <td>2.213614e+06</td>\n",
" <td>2.138232e+06</td>\n",
" <td>2.050588e+06</td>\n",
" <td>2.028068e+06</td>\n",
" <td>2.241263e+06</td>\n",
" <td>2.243545e+06</td>\n",
" <td>2.159743e+06</td>\n",
" <td>1.842151e+06</td>\n",
" <td>2.220430e+06</td>\n",
" <td>2.180953e+06</td>\n",
" <td>1.427430e+06</td>\n",
" <td>1.427430e+06</td>\n",
" <td>2.180630e+06</td>\n",
" <td>3.738813e+06</td>\n",
" <td>3.432782e+06</td>\n",
" <td>3.738909e+06</td>\n",
" <td>3.736342e+06</td>\n",
" <td>3.425131e+06</td>\n",
" <td>3.435373e+06</td>\n",
" <td>3.431337e+06</td>\n",
" <td>3.739449e+06</td>\n",
" <td>3.739449e+06</td>\n",
" </tr>\n",
" <tr>\n",
" <th>mean</th>\n",
" <td>5.640638e+04</td>\n",
" <td>1.992516e+07</td>\n",
" <td>1.079295e+06</td>\n",
" <td>1.009825e+00</td>\n",
" <td>1.443811e+00</td>\n",
" <td>1.390837e-03</td>\n",
" <td>1.079274e+01</td>\n",
" <td>6.043689e-02</td>\n",
" <td>1.245874e-01</td>\n",
" <td>9.164804e-03</td>\n",
" <td>4.885550e-02</td>\n",
" <td>1.176537e-01</td>\n",
" <td>3.139641e-01</td>\n",
" <td>1.566428e-01</td>\n",
" <td>1.004903e+00</td>\n",
" <td>9.028293e-02</td>\n",
" <td>-2.009567e-02</td>\n",
" <td>1.126533e+01</td>\n",
" <td>-1.570820e-01</td>\n",
" <td>2.492919e+00</td>\n",
" <td>-5.026388e-01</td>\n",
" <td>2.821344e-02</td>\n",
" <td>-5.203027e-01</td>\n",
" <td>7.714399e-02</td>\n",
" <td>-1.520239e-01</td>\n",
" <td>-4.727783e-04</td>\n",
" <td>1.137399e-01</td>\n",
" <td>-9.211914e-02</td>\n",
" <td>1.274625e-02</td>\n",
" <td>1.734495e-01</td>\n",
" <td>1.403582e-01</td>\n",
" <td>3.738560e+00</td>\n",
" <td>2.546599e-01</td>\n",
" <td>3.110654e-02</td>\n",
" <td>3.034048e-02</td>\n",
" <td>2.094585e-02</td>\n",
" <td>1.408368e-01</td>\n",
" <td>-1.709670e-02</td>\n",
" <td>3.574233e-01</td>\n",
" <td>9.058863e-01</td>\n",
" <td>...</td>\n",
" <td>-7.123829e-01</td>\n",
" <td>4.167191e+00</td>\n",
" <td>3.047020e+00</td>\n",
" <td>1.267689e-01</td>\n",
" <td>5.891470e-02</td>\n",
" <td>4.931038e-01</td>\n",
" <td>0.261367</td>\n",
" <td>-6.433974e-02</td>\n",
" <td>4.996550e+01</td>\n",
" <td>2.695078e+01</td>\n",
" <td>1.158898e+01</td>\n",
" <td>5.621686e-01</td>\n",
" <td>4.031154e-01</td>\n",
" <td>1.914831e-01</td>\n",
" <td>8.879584e-03</td>\n",
" <td>2.215269e+00</td>\n",
" <td>5.380046e-01</td>\n",
" <td>-1.630156e-01</td>\n",
" <td>8.094319e-01</td>\n",
" <td>1.569021e-01</td>\n",
" <td>1.008276e-03</td>\n",
" <td>2.018008e-01</td>\n",
" <td>3.227083e-03</td>\n",
" <td>1.015102e+00</td>\n",
" <td>3.200990e-04</td>\n",
" <td>2.630615e-02</td>\n",
" <td>7.392625e-03</td>\n",
" <td>1.921570e-02</td>\n",
" <td>6.254564e+00</td>\n",
" <td>1.277835e+01</td>\n",
" <td>3.625508e+00</td>\n",
" <td>5.400235e-02</td>\n",
" <td>5.082128e-06</td>\n",
" <td>6.982927e-02</td>\n",
" <td>3.062197e-02</td>\n",
" <td>8.671847e-01</td>\n",
" <td>4.074174e+00</td>\n",
" <td>1.509970e+00</td>\n",
" <td>5.640638e+04</td>\n",
" <td>7.267667e-03</td>\n",
" </tr>\n",
" <tr>\n",
" <th>std</th>\n",
" <td>2.732098e+04</td>\n",
" <td>1.420988e+05</td>\n",
" <td>4.825887e+06</td>\n",
" <td>6.480901e-01</td>\n",
" <td>1.723233e+00</td>\n",
" <td>5.388213e-01</td>\n",
" <td>2.922874e+00</td>\n",
" <td>3.660089e-02</td>\n",
" <td>2.861385e-01</td>\n",
" <td>1.473740e-01</td>\n",
" <td>3.501918e-01</td>\n",
" <td>5.601429e-01</td>\n",
" <td>9.156545e-01</td>\n",
" <td>1.353978e+00</td>\n",
" <td>1.047787e+01</td>\n",
" <td>9.846507e-02</td>\n",
" <td>1.272371e-01</td>\n",
" <td>1.009857e+01</td>\n",
" <td>4.204604e-01</td>\n",
" <td>2.676182e+01</td>\n",
" <td>2.446237e+01</td>\n",
" <td>1.543410e+01</td>\n",
" <td>8.348633e+01</td>\n",
" <td>1.332607e+00</td>\n",
" <td>6.283583e+00</td>\n",
" <td>2.192646e-01</td>\n",
" <td>3.242773e-01</td>\n",
" <td>9.348005e-01</td>\n",
" <td>5.621459e-02</td>\n",
" <td>5.779889e+00</td>\n",
" <td>3.473584e-01</td>\n",
" <td>1.230473e+01</td>\n",
" <td>3.934959e-01</td>\n",
" <td>1.736057e-01</td>\n",
" <td>1.715224e-01</td>\n",
" <td>3.718645e-02</td>\n",
" <td>6.511498e-01</td>\n",
" <td>3.566570e-01</td>\n",
" <td>3.356517e-01</td>\n",
" <td>4.528311e+00</td>\n",
" <td>...</td>\n",
" <td>6.229557e+00</td>\n",
" <td>1.715702e+00</td>\n",
" <td>1.230901e+01</td>\n",
" <td>3.327139e-01</td>\n",
" <td>1.040411e-01</td>\n",
" <td>3.791595e+00</td>\n",
" <td>0.281477</td>\n",
" <td>8.379559e-01</td>\n",
" <td>1.559034e+02</td>\n",
" <td>7.878391e+01</td>\n",
" <td>5.241357e+01</td>\n",
" <td>5.224417e-01</td>\n",
" <td>4.905237e-01</td>\n",
" <td>5.803090e-01</td>\n",
" <td>9.381226e-02</td>\n",
" <td>3.611889e+00</td>\n",
" <td>1.533588e-01</td>\n",
" <td>1.809665e+00</td>\n",
" <td>2.065480e+00</td>\n",
" <td>2.028530e-01</td>\n",
" <td>1.150926e-02</td>\n",
" <td>2.492995e+01</td>\n",
" <td>7.901209e-02</td>\n",
" <td>1.370279e+00</td>\n",
" <td>5.178176e-02</td>\n",
" <td>4.758519e-02</td>\n",
" <td>2.096277e-01</td>\n",
" <td>2.142203e-01</td>\n",
" <td>7.498682e+01</td>\n",
" <td>1.355331e+02</td>\n",
" <td>1.674303e+00</td>\n",
" <td>7.253442e-02</td>\n",
" <td>2.663447e-05</td>\n",
" <td>7.038531e-02</td>\n",
" <td>2.528540e-02</td>\n",
" <td>4.007274e-01</td>\n",
" <td>9.178548e+00</td>\n",
" <td>3.536544e+00</td>\n",
" <td>2.732098e+04</td>\n",
" <td>1.728344e-01</td>\n",
" </tr>\n",
" <tr>\n",
" <th>min</th>\n",
" <td>1.000000e+04</td>\n",
" <td>1.957033e+07</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.933279e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-9.062534e+00</td>\n",
" <td>-3.060271e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-7.794856e-01</td>\n",
" <td>-6.981132e-01</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-8.229368e+02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.257607e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>-6.033304e+00</td>\n",
" <td>-2.439120e+01</td>\n",
" <td>-7.759533e+02</td>\n",
" <td>-1.024800e+04</td>\n",
" <td>-8.364906e+03</td>\n",
" <td>-2.057420e+02</td>\n",
" <td>-1.912332e+02</td>\n",
" <td>-1.314998e+00</td>\n",
" <td>-8.998305e-01</td>\n",
" <td>-5.622415e+01</td>\n",
" <td>-3.003145e-01</td>\n",
" <td>-1.574528e+02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.418063e-03</td>\n",
" <td>-9.838288e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-3.313476e+00</td>\n",
" <td>-6.199143e+00</td>\n",
" <td>-1.821396e+01</td>\n",
" <td>-9.219229e-01</td>\n",
" <td>-4.047032e+02</td>\n",
" <td>...</td>\n",
" <td>-2.057000e+02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.418063e-03</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-3.411239e-02</td>\n",
" <td>-9.038462e+01</td>\n",
" <td>0.000000</td>\n",
" <td>-1.717197e+01</td>\n",
" <td>-1.591636e+03</td>\n",
" <td>-1.066224e+02</td>\n",
" <td>-2.179600e+04</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-3.594196e+01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-4.219112e+01</td>\n",
" <td>-1.000000e+00</td>\n",
" <td>-1.431838e-01</td>\n",
" <td>-1.607357e-01</td>\n",
" <td>-2.833333e+03</td>\n",
" <td>-4.754902e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-5.334442e-01</td>\n",
" <td>4.202608e-06</td>\n",
" <td>-1.720769e+02</td>\n",
" <td>-3.343933e+01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>5.354650e-06</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.979275e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-7.335907e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.954854e-12</td>\n",
" <td>1.000000e+04</td>\n",
" <td>-9.912950e-01</td>\n",
" </tr>\n",
" <tr>\n",
" <th>25%</th>\n",
" <td>3.052500e+04</td>\n",
" <td>1.982093e+07</td>\n",
" <td>2.032800e+04</td>\n",
" <td>5.400761e-01</td>\n",
" <td>2.975018e-01</td>\n",
" <td>-2.380378e-01</td>\n",
" <td>8.760727e+00</td>\n",
" <td>3.428336e-02</td>\n",
" <td>-4.939807e-02</td>\n",
" <td>-6.144062e-02</td>\n",
" <td>-1.355932e-01</td>\n",
" <td>-1.919505e-01</td>\n",
" <td>-2.126540e-01</td>\n",
" <td>-5.269740e-02</td>\n",
" <td>1.900418e-01</td>\n",
" <td>2.969798e-02</td>\n",
" <td>-7.354222e-02</td>\n",
" <td>4.000000e+00</td>\n",
" <td>-2.030585e-01</td>\n",
" <td>3.376279e-01</td>\n",
" <td>-4.110030e-01</td>\n",
" <td>1.410281e-02</td>\n",
" <td>-8.480907e+00</td>\n",
" <td>-4.018772e-02</td>\n",
" <td>-1.441469e-01</td>\n",
" <td>-7.099654e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.804556e-01</td>\n",
" <td>-1.534082e-03</td>\n",
" <td>-1.304695e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.228087e+00</td>\n",
" <td>9.547563e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-2.729350e-02</td>\n",
" <td>-2.553201e-03</td>\n",
" <td>1.143995e-01</td>\n",
" <td>-3.721840e-01</td>\n",
" <td>...</td>\n",
" <td>-1.247647e+00</td>\n",
" <td>3.000000e+00</td>\n",
" <td>8.793050e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>3.133341e-03</td>\n",
" <td>2.423276e-03</td>\n",
" <td>0.107750</td>\n",
" <td>6.581906e-03</td>\n",
" <td>2.635117e+00</td>\n",
" <td>4.606254e+00</td>\n",
" <td>3.735176e+00</td>\n",
" <td>1.497460e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>-1.065269e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>4.256670e-01</td>\n",
" <td>4.697502e-01</td>\n",
" <td>-7.153101e-01</td>\n",
" <td>-2.646674e-01</td>\n",
" <td>2.256222e-02</td>\n",
" <td>-1.279341e-03</td>\n",
" <td>-2.838249e-02</td>\n",
" <td>-3.225806e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.322650e-05</td>\n",
" <td>5.141335e-03</td>\n",
" <td>8.120633e-04</td>\n",
" <td>-4.400480e-03</td>\n",
" <td>7.941321e-02</td>\n",
" <td>8.458948e-02</td>\n",
" <td>2.000000e+00</td>\n",
" <td>1.931990e-02</td>\n",
" <td>1.070679e-08</td>\n",
" <td>2.777778e-02</td>\n",
" <td>1.444932e-02</td>\n",
" <td>5.540967e-01</td>\n",
" <td>7.449103e-01</td>\n",
" <td>2.009552e-08</td>\n",
" <td>3.052500e+04</td>\n",
" <td>-6.532400e-02</td>\n",
" </tr>\n",
" <tr>\n",
" <th>50%</th>\n",
" <td>6.177800e+04</td>\n",
" <td>1.994063e+07</td>\n",
" <td>8.572500e+04</td>\n",
" <td>9.384522e-01</td>\n",
" <td>8.842564e-01</td>\n",
" <td>-4.718135e-03</td>\n",
" <td>1.066154e+01</td>\n",
" <td>5.120070e-02</td>\n",
" <td>1.042282e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>2.208386e-02</td>\n",
" <td>5.331414e-02</td>\n",
" <td>1.520912e-01</td>\n",
" <td>6.873652e-02</td>\n",
" <td>4.587454e-01</td>\n",
" <td>6.230405e-02</td>\n",
" <td>-1.916127e-02</td>\n",
" <td>8.000000e+00</td>\n",
" <td>-7.571500e-02</td>\n",
" <td>6.494110e-01</td>\n",
" <td>-1.083135e-01</td>\n",
" <td>1.287089e-01</td>\n",
" <td>-6.661708e-01</td>\n",
" <td>4.781421e-02</td>\n",
" <td>-6.330727e-03</td>\n",
" <td>1.589537e-03</td>\n",
" <td>6.847698e-03</td>\n",
" <td>-5.982460e-02</td>\n",
" <td>2.894090e-04</td>\n",
" <td>-3.168992e-03</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.978973e+00</td>\n",
" <td>1.490439e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>4.161961e-03</td>\n",
" <td>7.802550e-02</td>\n",
" <td>5.058908e-02</td>\n",
" <td>3.006515e-01</td>\n",
" <td>1.338296e-01</td>\n",
" <td>...</td>\n",
" <td>-2.861115e-01</td>\n",
" <td>4.000000e+00</td>\n",
" <td>1.296124e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>2.496483e-02</td>\n",
" <td>2.425175e-02</td>\n",
" <td>0.227216</td>\n",
" <td>6.736905e-02</td>\n",
" <td>1.005086e+01</td>\n",
" <td>7.609959e+00</td>\n",
" <td>5.895310e+00</td>\n",
" <td>5.376000e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>9.561810e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>1.033256e+00</td>\n",
" <td>5.477468e-01</td>\n",
" <td>-1.265953e-01</td>\n",
" <td>2.550494e-01</td>\n",
" <td>6.937483e-02</td>\n",
" <td>6.297230e-05</td>\n",
" <td>-1.184269e-03</td>\n",
" <td>5.297698e-04</td>\n",
" <td>1.000000e+00</td>\n",
" <td>7.819408e-03</td>\n",
" <td>1.134994e-02</td>\n",
" <td>2.398843e-02</td>\n",
" <td>1.446924e-02</td>\n",
" <td>1.317511e-01</td>\n",
" <td>1.436099e-01</td>\n",
" <td>4.000000e+00</td>\n",
" <td>3.276042e-02</td>\n",
" <td>1.291729e-07</td>\n",
" <td>4.878049e-02</td>\n",
" <td>2.349544e-02</td>\n",
" <td>8.029089e-01</td>\n",
" <td>1.695989e+00</td>\n",
" <td>5.636663e-08</td>\n",
" <td>6.177800e+04</td>\n",
" <td>-3.337000e-03</td>\n",
" </tr>\n",
" <tr>\n",
" <th>75%</th>\n",
" <td>8.068000e+04</td>\n",
" <td>2.004023e+07</td>\n",
" <td>4.118183e+05</td>\n",
" <td>1.388240e+00</td>\n",
" <td>1.930812e+00</td>\n",
" <td>2.335132e-01</td>\n",
" <td>1.278733e+01</td>\n",
" <td>7.686123e-02</td>\n",
" <td>2.611684e-01</td>\n",
" <td>6.666667e-02</td>\n",
" <td>1.813771e-01</td>\n",
" <td>3.071035e-01</td>\n",
" <td>5.797258e-01</td>\n",
" <td>3.222820e-01</td>\n",
" <td>1.087716e+00</td>\n",
" <td>1.141448e-01</td>\n",
" <td>4.696611e-02</td>\n",
" <td>1.600000e+01</td>\n",
" <td>1.586623e-02</td>\n",
" <td>1.124785e+00</td>\n",
" <td>2.505711e-01</td>\n",
" <td>2.820772e-01</td>\n",
" <td>4.930041e+00</td>\n",
" <td>1.235594e-01</td>\n",
" <td>8.755702e-02</td>\n",
" <td>7.816332e-02</td>\n",
" <td>6.497040e-02</td>\n",
" <td>2.976349e-02</td>\n",
" <td>2.251888e-02</td>\n",
" <td>5.664668e-02</td>\n",
" <td>0.000000e+00</td>\n",
" <td>3.209117e+00</td>\n",
" <td>2.630835e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>3.052534e-02</td>\n",
" <td>1.954101e-01</td>\n",
" <td>8.943522e-02</td>\n",
" <td>5.261533e-01</td>\n",
" <td>9.173273e-01</td>\n",
" <td>...</td>\n",
" <td>7.000719e-01</td>\n",
" <td>5.000000e+00</td>\n",
" <td>2.282214e+00</td>\n",
" <td>0.000000e+00</td>\n",
" <td>6.950250e-02</td>\n",
" <td>9.843776e-02</td>\n",
" <td>0.375574</td>\n",
" <td>1.337349e-01</td>\n",
" <td>3.565057e+01</td>\n",
" <td>1.717171e+01</td>\n",
" <td>8.917961e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>2.371986e-01</td>\n",
" <td>0.000000e+00</td>\n",
" <td>2.435432e+00</td>\n",
" <td>6.116974e-01</td>\n",
" <td>3.853083e-01</td>\n",
" <td>1.105991e+00</td>\n",
" <td>2.069173e-01</td>\n",
" <td>3.174626e-03</td>\n",
" <td>2.111255e-02</td>\n",
" <td>3.675214e-02</td>\n",
" <td>2.000000e+00</td>\n",
" <td>1.978208e-02</td>\n",
" <td>2.711450e-02</td>\n",
" <td>4.376447e-02</td>\n",
" <td>5.448232e-02</td>\n",
" <td>2.518106e-01</td>\n",
" <td>2.943518e-01</td>\n",
" <td>5.000000e+00</td>\n",
" <td>5.894172e-02</td>\n",
" <td>1.264673e-06</td>\n",
" <td>8.571429e-02</td>\n",
" <td>3.830655e-02</td>\n",
" <td>1.118146e+00</td>\n",
" <td>3.927884e+00</td>\n",
" <td>9.545456e-01</td>\n",
" <td>8.068000e+04</td>\n",
" <td>6.276400e-02</td>\n",
" </tr>\n",
" <tr>\n",
" <th>max</th>\n",
" <td>9.343600e+04</td>\n",
" <td>2.016123e+07</td>\n",
" <td>1.221146e+08</td>\n",
" <td>3.987207e+00</td>\n",
" <td>1.589782e+01</td>\n",
" <td>8.083012e+00</td>\n",
" <td>1.900568e+01</td>\n",
" <td>2.720346e-01</td>\n",
" <td>7.682736e+00</td>\n",
" <td>2.172414e+00</td>\n",
" <td>7.844445e+00</td>\n",
" <td>1.168096e+01</td>\n",
" <td>1.685246e+01</td>\n",
" <td>9.918375e+02</td>\n",
" <td>1.734793e+04</td>\n",
" <td>1.257607e+00</td>\n",
" <td>1.155264e+00</td>\n",
" <td>5.400000e+01</td>\n",
" <td>8.265002e-01</td>\n",
" <td>2.460433e+03</td>\n",
" <td>1.564199e+03</td>\n",
" <td>4.256410e+01</td>\n",
" <td>1.139067e+04</td>\n",
" <td>1.477503e+02</td>\n",
" <td>3.657082e+02</td>\n",
" <td>2.663142e+00</td>\n",
" <td>7.208766e+00</td>\n",
" <td>6.629468e+01</td>\n",
" <td>3.991813e-01</td>\n",
" <td>1.530145e+02</td>\n",
" <td>1.000000e+00</td>\n",
" <td>8.330556e+02</td>\n",
" <td>1.509737e+01</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.159430e+00</td>\n",
" <td>1.385515e+01</td>\n",
" <td>8.404797e-01</td>\n",
" <td>6.138484e+00</td>\n",
" <td>4.866364e+02</td>\n",
" <td>...</td>\n",
" <td>6.155897e+02</td>\n",
" <td>9.000000e+00</td>\n",
" <td>8.330556e+02</td>\n",
" <td>1.000000e+00</td>\n",
" <td>2.370752e+00</td>\n",
" <td>1.584300e+02</td>\n",
" <td>57.709343</td>\n",
" <td>1.047405e+01</td>\n",
" <td>5.906600e+03</td>\n",
" <td>3.024533e+03</td>\n",
" <td>6.870769e+02</td>\n",
" <td>5.000000e+00</td>\n",
" <td>1.000000e+00</td>\n",
" <td>1.791538e+01</td>\n",
" <td>1.000000e+00</td>\n",
" <td>5.493151e+01</td>\n",
" <td>1.000000e+00</td>\n",
" <td>2.924935e+01</td>\n",
" <td>5.121570e+02</td>\n",
" <td>9.786834e-01</td>\n",
" <td>1.765701e-01</td>\n",
" <td>6.463467e+03</td>\n",
" <td>5.578913e-01</td>\n",
" <td>8.000000e+00</td>\n",
" <td>1.460327e+00</td>\n",
" <td>8.201621e-01</td>\n",
" <td>7.039832e+00</td>\n",
" <td>5.670456e+00</td>\n",
" <td>7.496989e+03</td>\n",
" <td>8.751301e+03</td>\n",
" <td>8.000000e+00</td>\n",
" <td>1.078788e+00</td>\n",
" <td>2.479339e-03</td>\n",
" <td>1.701520e+00</td>\n",
" <td>4.379252e-01</td>\n",
" <td>2.860123e+00</td>\n",
" <td>6.517474e+02</td>\n",
" <td>2.008697e+01</td>\n",
" <td>9.343600e+04</td>\n",
" <td>2.399660e+01</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>8 rows × 98 columns</p>\n",
"</div>"
],
"text/plain": [
" permno DATE ... PERMNO return\n",
"count 3.739449e+06 3.739449e+06 ... 3.739449e+06 3.739449e+06\n",
"mean 5.640638e+04 1.992516e+07 ... 5.640638e+04 7.267667e-03\n",
"std 2.732098e+04 1.420988e+05 ... 2.732098e+04 1.728344e-01\n",
"min 1.000000e+04 1.957033e+07 ... 1.000000e+04 -9.912950e-01\n",
"25% 3.052500e+04 1.982093e+07 ... 3.052500e+04 -6.532400e-02\n",
"50% 6.177800e+04 1.994063e+07 ... 6.177800e+04 -3.337000e-03\n",
"75% 8.068000e+04 2.004023e+07 ... 8.068000e+04 6.276400e-02\n",
"max 9.343600e+04 2.016123e+07 ... 9.343600e+04 2.399660e+01\n",
"\n",
"[8 rows x 98 columns]"
]
},
"metadata": {
"tags": []
},
"execution_count": 20
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "mPHNtLRMPUed",
"colab_type": "text"
},
"source": [
"Different stocks have different record #"
]
},
{
"cell_type": "code",
"metadata": {
"id": "EJTPsQh3PUee",
"colab_type": "code",
"outputId": "3178d408-2aa2-49e0-eb94-6ce9f5548437",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 470
}
},
"source": [
"data.groupby('permno').count()"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>DATE</th>\n",
" <th>mvel1</th>\n",
" <th>beta</th>\n",
" <th>betasq</th>\n",
" <th>chmom</th>\n",
" <th>dolvol</th>\n",
" <th>idiovol</th>\n",
" <th>indmom</th>\n",
" <th>mom1m</th>\n",
" <th>mom6m</th>\n",
" <th>mom12m</th>\n",
" <th>mom36m</th>\n",
" <th>pricedelay</th>\n",
" <th>turn</th>\n",
" <th>absacc</th>\n",
" <th>acc</th>\n",
" <th>age</th>\n",
" <th>agr</th>\n",
" <th>bm</th>\n",
" <th>bm_ia</th>\n",
" <th>cashdebt</th>\n",
" <th>cashpr</th>\n",
" <th>cfp</th>\n",
" <th>cfp_ia</th>\n",
" <th>chatoia</th>\n",
" <th>chcsho</th>\n",
" <th>chempia</th>\n",
" <th>chinv</th>\n",
" <th>chpmia</th>\n",
" <th>convind</th>\n",
" <th>currat</th>\n",
" <th>depr</th>\n",
" <th>divi</th>\n",
" <th>divo</th>\n",
" <th>dy</th>\n",
" <th>egr</th>\n",
" <th>ep</th>\n",
" <th>gma</th>\n",
" <th>grcapx</th>\n",
" <th>grltnoa</th>\n",
" <th>...</th>\n",
" <th>ps</th>\n",
" <th>quick</th>\n",
" <th>rd</th>\n",
" <th>rd_mve</th>\n",
" <th>rd_sale</th>\n",
" <th>realestate</th>\n",
" <th>roic</th>\n",
" <th>salecash</th>\n",
" <th>saleinv</th>\n",
" <th>salerec</th>\n",
" <th>secured</th>\n",
" <th>securedind</th>\n",
" <th>sgr</th>\n",
" <th>sin</th>\n",
" <th>sp</th>\n",
" <th>tang</th>\n",
" <th>tb</th>\n",
" <th>aeavol</th>\n",
" <th>cash</th>\n",
" <th>chtx</th>\n",
" <th>cinvest</th>\n",
" <th>ear</th>\n",
" <th>nincr</th>\n",
" <th>roaq</th>\n",
" <th>roavol</th>\n",
" <th>roeq</th>\n",
" <th>rsup</th>\n",
" <th>stdacc</th>\n",
" <th>stdcf</th>\n",
" <th>ms</th>\n",
" <th>baspread</th>\n",
" <th>ill</th>\n",
" <th>maxret</th>\n",
" <th>retvol</th>\n",
" <th>std_dolvol</th>\n",
" <th>std_turn</th>\n",
" <th>zerotrade</th>\n",
" <th>PERMNO</th>\n",
" <th>date</th>\n",
" <th>return</th>\n",
" </tr>\n",
" <tr>\n",
" <th>permno</th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>10000.0</th>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>4</td>\n",
" <td>14</td>\n",
" <td>3</td>\n",
" <td>16</td>\n",
" <td>15</td>\n",
" <td>10</td>\n",
" <td>4</td>\n",
" <td>0</td>\n",
" <td>3</td>\n",
" <td>14</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>1</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" <td>16</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10001.0</th>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>358</td>\n",
" <td>358</td>\n",
" <td>360</td>\n",
" <td>370</td>\n",
" <td>358</td>\n",
" <td>371</td>\n",
" <td>370</td>\n",
" <td>366</td>\n",
" <td>360</td>\n",
" <td>336</td>\n",
" <td>358</td>\n",
" <td>369</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>331</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>331</td>\n",
" <td>343</td>\n",
" <td>...</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>0</td>\n",
" <td>355</td>\n",
" <td>343</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>355</td>\n",
" <td>313</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>269</td>\n",
" <td>296</td>\n",
" <td>296</td>\n",
" <td>269</td>\n",
" <td>269</td>\n",
" <td>293</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" <td>371</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10002.0</th>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" <td>311</td>\n",
" <td>311</td>\n",
" <td>313</td>\n",
" <td>323</td>\n",
" <td>311</td>\n",
" <td>324</td>\n",
" <td>323</td>\n",
" <td>319</td>\n",
" <td>313</td>\n",
" <td>289</td>\n",
" <td>311</td>\n",
" <td>322</td>\n",
" <td>91</td>\n",
" <td>91</td>\n",
" <td>223</td>\n",
" <td>211</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>91</td>\n",
" <td>91</td>\n",
" <td>199</td>\n",
" <td>211</td>\n",
" <td>211</td>\n",
" <td>211</td>\n",
" <td>211</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>211</td>\n",
" <td>211</td>\n",
" <td>223</td>\n",
" <td>211</td>\n",
" <td>223</td>\n",
" <td>211</td>\n",
" <td>187</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>211</td>\n",
" <td>223</td>\n",
" <td>211</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>211</td>\n",
" <td>223</td>\n",
" <td>0</td>\n",
" <td>223</td>\n",
" <td>211</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>223</td>\n",
" <td>19</td>\n",
" <td>217</td>\n",
" <td>217</td>\n",
" <td>210</td>\n",
" <td>210</td>\n",
" <td>217</td>\n",
" <td>217</td>\n",
" <td>217</td>\n",
" <td>175</td>\n",
" <td>217</td>\n",
" <td>217</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>217</td>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" <td>323</td>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" <td>324</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10003.0</th>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>105</td>\n",
" <td>105</td>\n",
" <td>107</td>\n",
" <td>116</td>\n",
" <td>105</td>\n",
" <td>118</td>\n",
" <td>117</td>\n",
" <td>113</td>\n",
" <td>107</td>\n",
" <td>83</td>\n",
" <td>105</td>\n",
" <td>116</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>5</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" <td>118</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10005.0</th>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>52</td>\n",
" <td>52</td>\n",
" <td>54</td>\n",
" <td>56</td>\n",
" <td>52</td>\n",
" <td>65</td>\n",
" <td>64</td>\n",
" <td>60</td>\n",
" <td>54</td>\n",
" <td>30</td>\n",
" <td>52</td>\n",
" <td>63</td>\n",
" <td>36</td>\n",
" <td>36</td>\n",
" <td>48</td>\n",
" <td>36</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>24</td>\n",
" <td>36</td>\n",
" <td>36</td>\n",
" <td>36</td>\n",
" <td>36</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>36</td>\n",
" <td>36</td>\n",
" <td>48</td>\n",
" <td>36</td>\n",
" <td>48</td>\n",
" <td>36</td>\n",
" <td>24</td>\n",
" <td>36</td>\n",
" <td>...</td>\n",
" <td>36</td>\n",
" <td>48</td>\n",
" <td>36</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>0</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>36</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>48</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>0</td>\n",
" <td>3</td>\n",
" <td>3</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>3</td>\n",
" <td>65</td>\n",
" <td>58</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>46</td>\n",
" <td>65</td>\n",
" <td>58</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" </tr>\n",
" <tr>\n",
" <th>...</th>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93432.0</th>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>9</td>\n",
" <td>0</td>\n",
" <td>11</td>\n",
" <td>10</td>\n",
" <td>6</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>9</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>0</td>\n",
" <td>2</td>\n",
" <td>2</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" <td>11</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93433.0</th>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>64</td>\n",
" <td>64</td>\n",
" <td>66</td>\n",
" <td>76</td>\n",
" <td>64</td>\n",
" <td>77</td>\n",
" <td>76</td>\n",
" <td>72</td>\n",
" <td>66</td>\n",
" <td>42</td>\n",
" <td>64</td>\n",
" <td>75</td>\n",
" <td>53</td>\n",
" <td>53</td>\n",
" <td>65</td>\n",
" <td>53</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>41</td>\n",
" <td>53</td>\n",
" <td>53</td>\n",
" <td>53</td>\n",
" <td>53</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>53</td>\n",
" <td>53</td>\n",
" <td>65</td>\n",
" <td>53</td>\n",
" <td>65</td>\n",
" <td>53</td>\n",
" <td>41</td>\n",
" <td>53</td>\n",
" <td>...</td>\n",
" <td>53</td>\n",
" <td>65</td>\n",
" <td>53</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>0</td>\n",
" <td>65</td>\n",
" <td>17</td>\n",
" <td>65</td>\n",
" <td>53</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>51</td>\n",
" <td>51</td>\n",
" <td>45</td>\n",
" <td>48</td>\n",
" <td>51</td>\n",
" <td>51</td>\n",
" <td>51</td>\n",
" <td>23</td>\n",
" <td>51</td>\n",
" <td>51</td>\n",
" <td>23</td>\n",
" <td>23</td>\n",
" <td>46</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" <td>77</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93434.0</th>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>67</td>\n",
" <td>76</td>\n",
" <td>65</td>\n",
" <td>78</td>\n",
" <td>77</td>\n",
" <td>73</td>\n",
" <td>67</td>\n",
" <td>43</td>\n",
" <td>65</td>\n",
" <td>76</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>48</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>48</td>\n",
" <td>60</td>\n",
" <td>...</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>36</td>\n",
" <td>72</td>\n",
" <td>60</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>72</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>62</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>38</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>38</td>\n",
" <td>38</td>\n",
" <td>65</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93435.0</th>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>9</td>\n",
" <td>9</td>\n",
" <td>11</td>\n",
" <td>21</td>\n",
" <td>9</td>\n",
" <td>22</td>\n",
" <td>21</td>\n",
" <td>17</td>\n",
" <td>11</td>\n",
" <td>0</td>\n",
" <td>9</td>\n",
" <td>20</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>0</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>0</td>\n",
" <td>10</td>\n",
" <td>...</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>10</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>13</td>\n",
" <td>8</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" <td>22</td>\n",
" </tr>\n",
" <tr>\n",
" <th>93436.0</th>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>65</td>\n",
" <td>65</td>\n",
" <td>67</td>\n",
" <td>76</td>\n",
" <td>65</td>\n",
" <td>78</td>\n",
" <td>77</td>\n",
" <td>73</td>\n",
" <td>67</td>\n",
" <td>43</td>\n",
" <td>65</td>\n",
" <td>76</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>42</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>42</td>\n",
" <td>36</td>\n",
" <td>...</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>54</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>66</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>65</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>41</td>\n",
" <td>71</td>\n",
" <td>71</td>\n",
" <td>41</td>\n",
" <td>41</td>\n",
" <td>66</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>77</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" <td>78</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>29853 rows × 98 columns</p>\n",
"</div>"
],
"text/plain": [
" DATE mvel1 beta betasq ... zerotrade PERMNO date return\n",
"permno ... \n",
"10000.0 16 16 3 3 ... 16 16 16 16\n",
"10001.0 371 371 358 358 ... 371 371 371 371\n",
"10002.0 324 324 311 311 ... 324 324 324 324\n",
"10003.0 118 118 105 105 ... 118 118 118 118\n",
"10005.0 65 65 52 52 ... 58 65 65 65\n",
"... ... ... ... ... ... ... ... ... ...\n",
"93432.0 11 11 0 0 ... 11 11 11 11\n",
"93433.0 77 77 64 64 ... 77 77 77 77\n",
"93434.0 78 78 65 65 ... 78 78 78 78\n",
"93435.0 22 22 9 9 ... 22 22 22 22\n",
"93436.0 78 78 65 65 ... 78 78 78 78\n",
"\n",
"[29853 rows x 98 columns]"
]
},
"metadata": {
"tags": []
},
"execution_count": 21
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "IEtoLXgBPUeh",
"colab_type": "text"
},
"source": [
"## 2.2 Missing Values"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "F17S9eJPPUei",
"colab_type": "text"
},
"source": [
"mean Missing % is around 30%"
]
},
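{
"cell_type": "markdown",
"metadata": {
"id": "missSketchMd",
"colab_type": "text"
},
"source": [
"As a minimal cross-check (a sketch, not part of the original analysis), the per-column missing fraction can also be computed directly from the filtered DataFrame with `isnull().mean()`, avoiding the hardcoded row total used in the plot below."
]
},
{
"cell_type": "code",
"metadata": {
"id": "missSketchCode",
"colab_type": "code",
"colab": {}
},
"source": [
"# Missing fraction per column, computed from the current DataFrame.\n",
"miss = data.isnull().mean().sort_values()\n",
"print(miss.mean())  # overall mean missing fraction (~0.3 per the text above)"
],
"execution_count": 0,
"outputs": []
},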
{
"cell_type": "code",
"metadata": {
"id": "d8CR4o-VPUej",
"colab_type": "code",
"outputId": "62c222fe-e945-42c7-dcef-c04a1eea6aa4",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 374
}
},
"source": [
"(1-summary.loc['count'].sort_values()/3760208).plot.bar()\n",
"plt.title('Missing % for each column')"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"Text(0.5, 1.0, 'Missing % for each column')"
]
},
"metadata": {
"tags": []
},
"execution_count": 22
},
{
"output_type": "display_data",
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAFUCAYAAADf+HxmAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOydebxWVfX/3wsQ51mcFdDQRNNSHHJI\nbfg5a2maNqlp5rcoc6iwDKdyqLTU1BwKp5zTwkRxxClRQAQURBGUwQlHFJHJ9fvjsw7n3Md7uQ94\nGXxc79frvJ7nnLPPPntce+21h2PuTpIkSdJYtFvUAUiSJEnanhTuSZIkDUgK9yRJkgYkhXuSJEkD\nksI9SZKkAUnhniRJ0oCkcP+UY2Z/M7Pffoznf21ml7dlmNoCM1vDzB40s3fN7JxFHZ6WMDM3s88s\n4HcMMLMjF+Q7ksWPDos6AMmCwcxeANYG1nb31yvXhwKfB7q6+wvufvTHeY+7n/GxAtoCZtYBuAbY\nHRgIHOTuU+Ler4EP3P3cuXhxFPA6sILnYo7kU0hq7o3NOOCQ4sTMPgcss+iCM0/sDziwGvAOEtaY\nWVdgX+D8Vp7vDIycH8EeDUuSfKJJ4d7YXA18v3J+KHBV1YGZXWFmv4v/q5nZf83sbTN708weMrN2\nce9XZjYpzByjzewrcf0UM7sm/ncJM8OhZjbezF43s99U3rW0mV1pZm+Z2Sgz+6WZTWwh7F2BAe4+\nC7gf2CCunw8cH9ebxcyuiLj+0szeM7OvmtmSZvYXM3spjr+Y2ZLhfhczmxhxfAXo04K/P4hwv2Vm\n/c2sc+XeeWY2wcymmNkQM9upcq99mK+ej/QbYmbrVbz+qpk9F+l+oZlZC+9v0R8z297MBpnZO/G7\nfQt+zMmvOC/yrEOcDzCz35nZ/yLtbjOzVc3snxG3QWbWpfK8m9nR9YQ/WbikcG9sBgIrmNkmZtYe\nOBiZOlrieGAi0AlYA/g14Ga2MdAT2Nrdlwd2A16Yiz87AhsDXwF6m9kmcf1koAsS1F8DvjsXP54C\nvhwCeFfgaTP7BvC6uz8yl+dw98OAfwJ/cPfl3P0e4DfAdsgktQWwDXBS5bE1gVWQxn9UrZ9mth9K\nj/1R+jwEXFdxMij8XgW4FrjJzJaKe8ehHtSewArAD4D3K8/uDWwNbA4chNK3OZr1x8xWAW5HDd+q\nwLnA7Wa2agv+tMbBwPeAdYANgUdRg7cKMArlY5V6w58sRFK4Nz6F9v41VDEnzcXtTGAtoLO7z3T3\nh8KsMRtYEuhuZkuErf75ufhzqrtPc/dhwDAkTEEV/wx3f8vdJzJ300o/ZFYahMwy1yOh8ksz+30M\nll5kZh1biX/Bd4DT3P01d58MnIoEWMGHwMnuPt3dpzXz/NHAme4+KnoNZwCfL7R3d7/G3d9w91nu\nfg5Kr43j2SOBk9x9tIth7v5Gxe+z3P1tdx+PeimfbyEOLfmzF/Ccu18d778OeAbYp860qaWPuz/v\n7u8AdwDPu/s9Ee+bgC/UuK83/MlCJIV743M18G3gMGpMMs3wR2AMcJeZjTWzXgDuPgb4OXAK8JqZ\nXW9ma8/Fn1cq/98Hlov/awMTKveq/5sQwquXu2/u7kcBvYC/IQ2xB7Az0BFpr/WwNvBi5fzFuFYw\n2d0/mMvznYHzwvTwNvAmYEi7xcxOCJPNO3F/RTReALAeMLfGsKX0qqUlf2rjRpyvM5d3zo1XK/+n\nNXNeG756w58sRFK4Nzju/iLSgPcEbmnF7bvufry7b4AGLY8rbOvufq2774iEnANnz0dwXgbWrZyv\n15LDKjEQvD1wKfA5YEj0KAYhU0A9vITCXrB+XCtobeB1AvAjd1+pcizt7v8L+/ovUc9kZXdfCfU2\nrPLshnWGs7UwNOdPbdxA8WuulzaVpoPqa7ZBuJLFkBTunw6OAL7s7lPn5sjM9jazz8SA2DvIHPOh\nmW1sZoX9+wOkvX04H+G4ETjRzFY2s3WQHX+uRFj+CvzM3T9EDdWOYY7ZGRhb57uvA04ys05mthrQ\nm7mPP9Tytwj7phGuFc3swLi3PDALmAx0MLPeyCZecDlwupl1M7H5fNrDW/KnH7CRmX3bzDqY2beA\n7sB/m/HjSeBLZra+ma0InDgf4Ug+AaRw/xQQ9tPBdTjtBtwDvIcG0S5y9/uR/fgsNG/8FWB15k8o\nnIYGbMfFe24GprfyzOHAU+4+JM5vQZrqZDR4eGmd7/4dMBgYDowAnohrdeHut6LeyvVmNgUN+O4R\nt/sDdwLPInPIBzQ1OZ2LGra7gCnA34Gl6313a/6E3X1vNCD+BupF7F1d31CJx93ADSgdhtB8A5A0\nAJbrO5JFhZn9H3Cwu++8qMOSJI1Gau7JQsPM1jKzHcysXUyvPB64dVGHK0kakVyJlyxMOgKXoAVK\nb6PpjRct0hAlSYOSZpkkSZIGJM0ySZIkDUgK9yRJkgZkkdncV1ttNe/Spcuien2SJMknkiFDhrzu\n7p1ac7fIhHuXLl0YPLieqddJkiRJgZnVbjXRLGmWSZIkaUBSuCdJkjQgKdyTJEkakBTuSZIkDUgK\n9yRJkgYkhXuSJEkDksI9SZKkAUnhniRJ0oAsUuHepdftdOl1+6IMQpIkSUOSmnuSJEkDksI9SZKk\nAUnhniRJ0oCkcE+SJGlAUrgnSZI0ICnckyRJGpAU7kmSJA1ICvckSZIGJIV7kiRJA1KXcDez3c1s\ntJmNMbNezdz/s5k9GcezZvZ22wc1SZIkqZdWv6FqZu2BC4GvAROBQWbW191HFm7c/diK+58CX1gA\nYU2SJEnqpB7NfRtgjLuPdfcZwPXAfnNxfwhwXVsELkmSJJk/6hHu6wATKucT49pHMLPOQFfgvo8f\ntCRJkmR+aesB1YOBm919dnM3zewoMxtsZoMnT57cxq9OkiRJCuoR7pOA9Srn68a15jiYuZhk3P1S\nd+/h7j06depUfyiTJEmSeaIe4T4I6GZmXc2sIxLgfWsdmdlngZWBR9s2iEmSJMm80qpwd/dZQE+g\nPzAKuNHdnzaz08xs34rTg4Hr3d0XTFCTJEmSeml1KiSAu/cD+tVc611zfkrbBStJkiT5OOQK1SRJ\nkgYkhXuSJEkDksI9SZKkAUnhniRJ0oCkcE+SJGlAUrgnSZI0ICnckyRJGpAU7kmSJA3IYiPcu/S6\nnS69bl/UwUiSJGkIFhvhXksK+iRJkvlnsRXuSZIkyfyTwj1JkqQBSeGeJEnSgKRwT5IkaUBSuCdJ\nkjQgKdyTJEkakBTuSZIkDcgnQrjXLnCq/Z9z4pMkSZryiRDuSZIkybxRl3A3s93NbLSZjTGzXi24\nOcjMRprZ02Z2bdsGM0mSJJkXWv1Atpm1By4EvgZMBAaZWV93H1lx0w04EdjB3d8ys9UXVICTJEmS\n1qlHc98GGOPuY919BnA9sF+Nmx8CF7r
7WwDu/lrbBjNJkiSZF+oR7usAEyrnE+NalY2AjczsETMb\naGa7t1UAkyRJknmnVbPMPPjTDdgFWBd40Mw+5+5vVx2Z2VHAUQDrr78+1kYvT5IkSZpSj+Y+CViv\ncr5uXKsyEejr7jPdfRzwLBL2TXD3S929h7v36NSp0/yGOUmSJGmFeoT7IKCbmXU1s47AwUDfGjf/\nRlo7ZrYaMtOMbcNwJkmSJPNAq8Ld3WcBPYH+wCjgRnd/2sxOM7N9w1l/4A0zGwncD/zC3d9YUIFO\nkiRJ5k5dNnd37wf0q7nWu/LfgePiSJIkSRYxuUI1SZKkAUnhniRJ0oCkcE+SJGlAUrgnSZI0ICnc\nkyRJGpAU7kmSJA1ICvckSZIGJIV7kiRJA5LCPUmSpAFJ4Z4kSdKApHBPkiRpQFK4J0mSNCAp3JMk\nSRqQFO5JkiQNSAr3JEmSBiSFe5IkSQOSwj1JkqQBSeGeJEnSgNQl3M1sdzMbbWZjzKxXM/cPM7PJ\nZvZkHEe2fVCTJEmSemn1G6pm1h64EPgaMBEYZGZ93X1kjdMb3L3nAghj3XTpdTsAL5y116IMRpIk\nySKnHs19G2CMu4919xnA9cB+CzZYSZIkycehHuG+DjChcj4xrtVygJkNN7ObzWy9NgldkiRJMl+0\n1YDqbUAXd98cuBu4sjlHZnaUmQ02s8GTJ09uo1cnSZIktdQj3CcBVU183bg2B3d/w92nx+nlwFbN\neeTul7p7D3fv0alTp/kJb5IkSVIH9Qj3QUA3M+tqZh2Bg4G+VQdmtlbldF9gVNsFcf7p0uv2OYOs\nSZIknyZanS3j7rPMrCfQH2gP/MPdnzaz04DB7t4X+JmZ7QvMAt4EDluAYU6SJElaoVXhDuDu/YB+\nNdd6V/6fCJzYtkFLkiRJ5pdcoZokSdKApHBPkiRpQD41wj0HV5Mk+TTxqRHutaSgT5KkkfnUCvck\nSZJGJoU7abJJkqTxSOHeDCnokyT5pJPCPUmSpAFJ4d4KabJJkuSTSAr3JEmSBiSFe5IkSQOSwn0e\nSBNNkiSfFFK4J0mSNCAp3JMkSRqQFO4fgzTRJEmyuJLCPUmSpAFJ4d5G5GBrkiSLEyncFxBVQZ+C\nP0mShU0K94VMCvokSRYGdQl3M9vdzEab2Rgz6zUXdweYmZtZj7YLYpIkSTKvtCrczaw9cCGwB9Ad\nOMTMujfjbnngGOCxtg5kI1PV5Gu1+tTwkySZX+rR3LcBxrj7WHefAVwP7NeMu9OBs4EP2jB8SZIk\nyXxQj3BfB5hQOZ8Y1+ZgZlsC67l7qpoLiNTqkySZFz72gKqZtQPOBY6vw+1RZjbYzAZPnjz54746\nSZIkaYF6hPskYL3K+bpxrWB5YDNggJm9AGwH9G1uUNXdL3X3Hu7eo1OnTvMf6iRJkmSu1CPcBwHd\nzKyrmXUEDgb6Fjfd/R13X83du7h7F2AgsK+7D14gIU6SJElapVXh7u6zgJ5Af2AUcKO7P21mp5nZ\nvgs6gEmSJMm806EeR+7eD+hXc613C253+fjBSuaFYnD1hbP2avK/uFf8T5Lk00OuUE2SJGlAUrh/\nisitD5Lk00MK9yRJkgYkhXuSJEkDksI9SZKkAUnh/ikm7e9J0rikcE+SJGlAUrgnSZI0ICncEyCn\nSSZJo5HCPfkIzW0v3NIHRZIkWTxJ4Z58LFLQJ8niSQr3pM1IrT5JFh9SuCcLjBT0SbLoSOGeLBTy\nQ+BJsnBJ4Z4kSdKApHBPkiRpQFK4J4sVczPZ5IBtktRPCvfkE0nOxU+SuZPCPWl4UtAnn0bqEu5m\ntruZjTazMWbWq5n7R5vZCDN70sweNrPubR/UJPn4tDZrp54ZPTnbJ/kk0KpwN7P2wIXAHkB34JBm\nhPe17v45d/888Afg3DYPaZIkSVI39Wju2wBj3H2su88Argf2qzpw9ymV02UBb7sgJkmSJPNKhzrc\nrANMqJxPBLatdWRmPwGOAzoCX26T0CXJJ4zCRPPCWXst4pAkn3babEDV3S909w2BXwEnNefGzI4y\ns8FmNnjy5Mlt9eokWSzJWTvJoqQe4T4JWK9yvm5ca4nrga83d8PdL3X3Hu7eo1OnTvWHMkmSJJkn\n6hHug4BuZtbVzDoCBwN9qw7MrFvldC/gubYLYpI0BqnJJwuTVoW7u88CegL9gVHAje7+tJmdZmb7\nhrOeZva0mT2J7O6HLrAQJ0kDkNMpkwVNPQOquHs/oF/Ntd6V/8e0cbiS5FNLdVA2B2iT+SVXqCZJ\nkjQgKdyTJEkakBTuSZIkDUgK9yRJkgYkhXuSJEkDksI9ST5B5Ldok3pJ4Z4kDUgumEpSuCfJp4D8\nXOGnjxTuSfIppi0+WJKfPFw8SeGeJEnSgKRwT5JkoZFa/MIjhXuSJEkDksI9SZKkAUnhniRJ0oCk\ncE+SJGlAUrgnSZI0ICnckyRJGpAU7kmSJA1IXcLdzHY3s9FmNsbMejVz/zgzG2lmw83sXjPr3PZB\nTZIkSeqlVeFuZu2BC4E9gO7AIWbWvcbZUKCHu28O3Az8oa0DmiRJktRPPZr7NsAYdx/r7jOA64H9\nqg7c/X53fz9OBwLrtm0wkyRJknmhHuG+DjChcj4xrrXEEcAdHydQSZIkycejQ1t6ZmbfBXoAO7dw\n/yjgKID1118fa8uXJ0mSJHOoR3OfBKxXOV83rjXBzL4K/AbY192nN+eRu1/q7j3cvUenTp3mJ7xJ\nkiRJHdQj3AcB3cysq5l1BA4G+lYdmNkXgEuQYH+t7YOZJEmSzAutCnd3nwX0BPoDo4Ab3f1pMzvN\nzPYNZ38ElgNuMrMnzaxvC94lSZIkC4G6bO7u3g/oV3Otd+X/V9s4XEmSJMnHIFeoJkmSNCAp3JMk\nSRqQFO5JkiQNSAr3JEmSBiSFe5IkSQOSwj1JkqQBSeGeJEnSgKRwT5IkaUBSuCdJkjQgKdyTJEka\nkBTuSZIkDUgK9yRJkgYkhXuSJEkDksI9SZKkAUnhniRJ0oCkcE+SJGlAUrgnSZI0ICnckyRJGpAU\n7kmSJA1IXcLdzHY3s9FmNsbMejVz/0tm9oSZzTKzb7Z9MJMkSZJ5oVXhbmbtgQuBPYDuwCFm1r3G\n2XjgMODatg5gkiRJMu90qMPNNsAYdx8LYGbXA/sBIwsH7v5C3PtwAYQxSZIkmUfqMcusA0yonE+M\na0mSJPNNl16306XX7U3Om7tX6y6pj4U6oGpmR5nZYDMbPHny5IX56iRJkk8V9Qj3ScB6lfN149o8\n4+6XunsPd+/RqVOn+fEiSZJPOanV10c9wn0Q0M3MuppZR+BgoO+CDVaSJEnycWhVuLv7LKAn0B8Y\nBdzo7k+b2Wlmti+AmW1tZhOBA4FLzOzpBRnoJEmS5qjXbv9p0P7rmS2Du/cD+tVc6135PwiZa5Ik\nST7RdOl1Oy+ctdeiDsbHJleoJkmSNCAp3JMkSRqQFO5JkiQNSAr3JEmSBiSFe5IkSQOSwj1JkqQB
\nSeGeJEnSgKRwT5IkaUBSuCdJkjQgKdyTJEkakBTuSZIkLfBJ3ncmhXuSJEkDksI9SZKkAUnhniRJ\n0oCkcE+SJGlAUrgnSZI0ICnckyRJGpAU7kmSJA1ICvckSZIGpC7hbma7m9loMxtjZr2aub+kmd0Q\n9x8zsy5tHdAkSZJFzSdpQVOrwt3M2gMXAnsA3YFDzKx7jbMjgLfc/TPAn4Gz2zqgSZIkSf3Uo7lv\nA4xx97HuPgO4Htivxs1+wJXx/2bgK2ZmbRfMJEmSxYvq1gS12xS0du/j+lEP9Qj3dYAJlfOJca1Z\nN+4+C3gHWHWeQpIkSZK0Gebuc3dg9k1gd3c/Ms6/B2zr7j0rbp4KNxPj/Plw83qNX0cBR8XpxsBo\nYDWgcFf9X3s+P+7m997i4keGMcO4OPmRYVw8wtjZ3TvRGu4+1wP4ItC/cn4icGKNm/7AF+N/hwiA\nteZ3uB/c3P+53avX3aL0P8O4+PiRYcwwLk5+tJX/rR31mGUGAd3MrKuZdQQOBvrWuOkLHBr/vwnc\n5xGaJEmSZOHToTUH7j7LzHoi7bw98A93f9rMTkMtSV/g78DVZjYGeBM1AEmSJMkiolXhDuDu/YB+\nNdd6V/5/ABw4n2G4tIX/c7tXr7tF6X+GcfHxI8OYYVyc/Ggr/+dKqwOqSZIkySeP3H4gSZKkAUnh\nniRJ0oAsNsLdzJY2s43j/9Xxe0wbv2PLmvOPjBOY2YFmtlQz11f7mO9ufV5qU/ftzOygmvPta9ws\na2btatws09L1+Qhz15auFWnXnJu2wszam9n18+D2T5XzJulXzzOfNBZk+Nsqf+en3C1IqnJmIbxr\nh3quLbD3Lwqbu5mtAZwBrO3ue5jZj4HewDR372pmk9Csmw+BA4Dl0J41s4GZcSwBfA2YWvH6RaAj\nsBawJPASYMAKwAPAjoCj2T3XAle5+5YRprPd/Vdm9kT4/UPgG3HtAOBMYH+0v85maEFBe2D5ePfg\nePfMeEcnYHXgEmA74CTgFeBW4D7g3Uq4vxO/k4BuwHNxfmzE+w13v9LMhrr7FyrpOBD4qru/Z2bf\nAB5D2z+0i3Tb2t3/bWbLAXcBOwFPu/tnK34c4e5/r8mfs9y9l5k9UUmfq939e2Y2BHgbWBl4GtjU\n3bc0s3vd/Svhdv9I6yWBLpEfDjwMPAqsH2k5Apjg7qeZWbdI4+7AUvFsJ1QGVgA2AX4EnAAcD6zv\n7j+M5zZ29/+a2UB3364Sj8Hu3iP+rxxpWzTcGwLLhl9fd/dh4e4PwFuRj0cBawO3R9ouHc9eHL+7\nAzsA+wA/cffbzGxExLVgebSqezDwOzSTrKO7/9nMzgbOAg5x94tCkP7E3U8IBecoYA13f6GaB5X4\nrQDcC+wG4O5vmtlGwC+AzkBX4FVgBrAi8A/gWnd/y8w2AM5D61gclcf3genAEODrEead0HToTsDJ\nNGUFoGfE8e3K9RtQvh8U794ZOBWViaeALwHjgZ0jLHsBm0beFGn8x/hdqeadb6O8+ymwLsrTglvj\nd9UIWztgAKoLHVGZXRvV7/+Gm88Cj1f8WAo4Jdx/EVgzwtod2BN4o+J2bOX/Bih9qxwHnAvg7ucC\nhHzZJcJyPqoLs4GhwAnu/o6Z7Qp8GxiGVvt/CeXnnAkw7v5lWmNeJsW31QHcgTJ+WJwPQYJiKCpA\nT6FKPR0J7A/i/8tIYJ+HCtCLwA+AfwOXRwIMAg4BRsV54e+ucVwDjAl/ZiJhcz7wGnAFyujPhT+v\nAf8E7gTOAe4HpsX198Of0ajAnYMq0nPAv5CAeC3unRrH+5FZ09BCr5Hh56tx7xwkCN4GrgMGhh+3\nA6sAf0XrCYpG+clKmj5Z/Fb+D43f7cL/9yJdZwNT4l4/4DsVf25GDes0VOlnx7NjUYP3DCp0A1DD\n+jZwNxIOfYFxEebDI+2fA65CgubxyMMJwC3AlDj+FOH+CjA88m0i8JcIyyDgt/HsUOA2VEa2Q2Vn\nNhIiH0aefg81HrdEnv4i0vpt4KFw+x5wGhJgUyKM+0f4r4/0fz6O6ZE3MyOe48KvacDkSj6eAfwB\nNVKfi2NipOmvItwvAXdEWj9Rk0+DgRHFPSRgBsX59vG+94BZkTcfRtinRprvD7yAFIptgL9F+Ici\ngfUXVGavR/Xje0hg3BNhfDLy6bl4z3sR3/Go3I6nbCz+hQTdzIjTQ/HbN9JqWqT/UFRefhbptjMS\nXFdHWMbE+yeguj+jErcP43/1KOTC60gZOrlyHAo8GO95OP6/g8pnV6RMbIbK8s5xVP9fi8pYv4jn\ny6h+Pxt5PBQJ8OLYuXL8I8JwbcRhQOTNa5Gvx6H8nxl5NDXycTxqGF6K9z6GFNFXUcPwXrjfC9iq\nOOqSs4tIuBcFtijUA6NgDY1fi8Jxcdwfjirx+pVK0JmoHBWhtncUltUpK8lIYEjN+58HfhMJNzsK\n1vRIxKfjfa9SFtzPRMFohxoaA9aIQvMgsHylQi4fYfhlFIxhlfcuTbnlwlXA7Lhe9cMibNPi/a+h\nwjqOsrDPiIIzG3ivkkZbocbqEWDLmjQYGseD4fcLqCL+N95xCNr87VWkIb8Rx7h4z4dRGIswzEYV\nf2bE9y+okL+INDKQAG4HjKqEcQQwPM6Xi/D+MOLzXCXMj0Wev0xZeV+hrNBDoxx8JuLTHgmY4UCf\nON6NY0bEdzqqzKOB/8R7+iBB82b8fwtV1L9HWJenVEKWBx6sxKVd/P8PqqRvRlqeD5wf91ZDFXmn\nuPcO0nZHxPXhqEwNj/ANjTDPquRxcf63uP8c0lzPrMS1OF5Ha1Gq5X1z4PeoYb4H2DfybTxSOor8\nqSpbT0W+Phf5+kz8DkI9WlCdWwutgZlThos6HteGAo9V/a+4aRf5Miny6FQkHJ+quNkDuKTmuSb1\nuebeKGC5yvnDlf8Dq3KnyMfK/9HAks1cH97Mew4CjonjQNSw7k+pRJyMyu7vI44nI0Vgm0raVcMx\nNJ47rEY2DgG+T5TXeZKzi0i4D0Bdp0Jz6RuZMhy18hdEQmyHhOhw1AWegFq3d6JA7BbP/yQK65WU\nwnAgqkQPAC+Gu02Ay1AFHhD/N0FdyOtQg9EZaTYPo9Z+tyjYL4Qfb6GuoMX1aoEYirpQ01A3835k\nBgF1Af8PCcxnUSUrNKZXgWXD3SpxbSqqYL1oYSsHYGvUUD0UfryFduj8ehSUVyMeHyDBP5xS03ge\nNYZ7o4I6FPUMhkYYhlNuKTEUOLPy3p9W/neqCOrlUGPROa6dG3lwG6rIzyEtfSDqHi8ZeVpoerch\nba8n8D/UUBRmshMiX/6HGsknkHDfEHi/WkmaSadCmXgy3jkCGFm5vxRlo3IWKotDI2/XoRRO/0WN\n/TKRPqvE9UOBH0dcxiMt+dBKHhUKwxvI3HAr0l7vCz8vReXufmDfeO7MyMt743xwpVG5M8JQFQ6r\nxHFKhGWtyrXPosZgKKpXoyPtLkSmotdRD/VMVC+fj3z
qH8cXgOcL4Vl55++QRjmKSj2Ie7dU8u97\n4c/dSOnYEvgWMjlOQ43hU0i4T0O70Bb+bIHqes+Iw5+iDFxUE8ftI37vRR4NQXXwK6hXfwiqb39G\nDckzEe83Ik2Ho8b0qXjv2cD/q/0f57sgBeeVON5FcqkPkk3vhrvOqLw9G+d3AsvE/0eB0TUNy7TK\neSEbi/wcU4nrKvXI2UVlc98SCfDNUKaujoTt55GNazKySbVHGXutu18Qg5KPIq33KBRhQ7ax77r7\njWa2dTxfdHEKm/348LsjKrwfoIr7LtIwHoj3vG9mf0UC+Cp3f9vMVkTd64eQMPkMpUliKhJUf0aZ\nuT/wP9dYwl2o0N0c4QAVuk5I2P0krv0RNTAXoAI8FWlpf0YN0NZIA9gKCaVLIh1XQoV3NBJ4BwJf\nRgXvXtR1noZsxEUhfwVVgN+igj0DaYXFFs1rocrxMNrc7VZUMSzi+XN3v8bMvhhxfRNp+mshLXM2\n6uo/hmys7Sk1/fYRtw5IoM5GWk2h3QxCNtbTI406IFMaqNF+Ku5vgOzl78cxGikF26HG+rp4ZsO4\nv3n4uU/cXwLl/blIM9wACagnI4wbo8Z4WVSZ/hlp9p1Is7UiHTdAFfZD1KhfhAT8SchGS8R9GqW9\ndGb8L/ZvfQaV1w+RMPx1hM2R5nc2EkIXoDGYw8Lt1yINb4t3d4lnPog8mYV6GstGnF4P/89HgvcZ\nSrrwUd5DZXQZJKBPdfe+UV9kJ4wAACAASURBVDe6RRpfTJmPsyJdZ6FG4oC4vhaqf0T6zgB6RDr0\nBDZCvb6voMamE6qPh8dxdITlZSQf/gP8v/BzOlJoiPfsEWnTl3Jn2hlIwN6EyvAWyFY/CdX5C8If\nUJ3bBDVE6yDFzsKPYizg3UhTgC3c/akY57jO3bcys98gLf2MeM93gQ/cfSUz+1bE8V8obw6mHEfs\nGO+5J965o7uvbGbj4l3rIhkE4O6+QW2G1bKohPuSKFIbo4iMRl3c6XH/aygDD0X24LtjwPA+4CF3\n39zMVkfaz/3AP919s4r/7ZB2ti2tbz38EioUV6CMewy1zFeiwcI9w8/zUaaAGollKAdUNorz9ZBW\nUjRMK8f9k2reeTrwW3cv9sDHzJ5FGb8S0Nfdh5rZyRGWLVCh/i7qnn0znvkJ8CN33zzOV6YUxP90\n97fj+ueQ0C9MELsiLeo9pMWf5u7/CLedkSBdAuXLdUjj/h0SZg8i4bALEt5vIWE1HGkZF6EKfzjl\nAPPZ7r5tNQEi7P3cfRxzwczOQxX+uojXCRGeNSJ9v4U0rmORQPkg4kiEZzkkkKciQT0WVdxLIuzf\nivBeEc/8FjXak5HycS/qwV2BTDJD4wM2X0YN8VdR43xkhMvC/bHAPe7+zlzi9lvUa7olLn0dCaG/\noLJ/NKrUTyIFZiAyqYxEvbIbkUnr3rh3NRIYP3X3bc3sTJSPL6E8mlV5/RbAM+5+ppn1Dn8Hufv/\nmdn6wFru/piZXRzpdRPl5IVtUd4SaXJrxGcrVCaK60Pj+higu+t7EJjZBu4+Nv4vWan3S6K69Qs0\nUP1ZZB7rjZS/7ZHW/nukMA2tlP1h7r5FvOs4VP9vR4L9y+7+Ykv5UMmPQyunf4p8mIjK0X6od+dm\nNhwpZi+6+/h4dnglLN9FcudYlIdnuvsXzOxxVId+gRqhjSMd70VKyE9QI/JXVI4fQnV+R9RA/Ky1\nODRhXk0qbXFQdjluQ61sX1Sh+iLt7K74fwsaiFkCFfBj0ODpMkhgvh7+XIa0i8Pi+hTUcr+INOXN\nUO9g/0iwsXGMi98lkDZ/CyrAE2q7+EjT2RIVsK3i/1dQhVwi/H8eCbUdKE08neP5jVD3+y7UiDyK\nNlgjMnw0cHicH4a07MkR5sIWOhK4uxKmJynNCXejhqEYt1iZprt5Fja80cCqleu/BJ6rnK8M/Dj+\nFyaly4Gx8X8Y5fjDtDhfowhX5MP7NflcdOn3rzmORD2EL6FxgocreVMcrxI27DgGU5pZnq4pVwOB\n9pXzu8J9+8i/XZBw2AX1+taO8HahHM8ZGXnTHpXJrWlqwlkaCeSXUSN4cfj9mbi/JPDzyLveqFH6\nPXBjEWbUSyls7dUxiM7xzjvi+ueB8XHvs8AtzZTL4TVpcGClTJwU77oP2bLHorpyDtJWr0GCYxIy\nD82M5y6tpPvIytEnjn+Fuy1rjh5IKVufctDxfFQvLkNKzX7x7F6o7L0U6dS7GfkwgtLsNxxp5yMi\nT1ehqV38VtQwF/l5UlzrgxTAvpXjjfidhZSPYmB/FuVEgwcpx1Tm/I/zf6A6sUscl/HRcY7VUV1c\nn9LUM5SyLi+Jeg6/LuKPGpE/oV7WkIjLzLjWcV7lbF17y7QVZrYm0gKWNrMvIG0U1M3ZAhW6Q5Fm\n+S4qkLuijFoZaQxHoQwbQtndOwZVms2R5vJHdz893vlr1N3/EAnhDsjksgMSzt9AGteycW82sIKZ\n9QU2jN/lUcEaiBJ7NLLnQdMR/fYo059x902jS+Wmj1KtgwrQDNRynwxMMrMHItyjgBPDrLQpEjzv\nRnzXimlvA4Gvm9mG8e51KLv3q4X7jhGmt9GMCWJ6XrfQONYFBpiZh6bxHWCymd3o7gehgvwZMzsa\n6GRmH0Tcrg8tzsP/P6GGcHyE8fNm9gxqqNqZ2RvAimb2DjAzej5fjbA+hhrIruHHL+K6Ibt6lcNQ\nI/xmnHcAfmBm6wEbmNnTcX02akCHocYcVLEMVZoNkE13CyS8NkAN4oaRhrMjH0YA27n7bNN6h0eB\naZF26yPF4h/IXLIRElLLR/wnIuXjHSS4p6KGazAqo6DpdLcjE9F1yKQ1Je5djjTFteP5p4C1Q6O9\nqZKHa8a7pgB3m9mpSDF5GwnTK8xsD6TVv4K0/kHIzHCFu+9gZl+N+OwFzHD3vc2sME88gsa4hsQB\ngGs67v5An8jXwjwBaji6oPI9FpWzjmiL8A8jH18DfmhmV6FeY/s4vgc8HebaPwDLmdmyqMc2ycwG\noDr7IlJclkZ19rdmtkq8/zikKa+HhOLASN8zUW+gGCvpgHpsl0UYXnX3YwHM7F6kdBTxGWBmd6Ce\n3XNRvu9FdfX/kGAn/B4RZbwrGjxfIu6NA2aY2RKoDK9gZmuhhnUL1PC+V0njU6JH95m4dCjKwzXM\nbGrFXVEfWmShmmWi23MYauEHV269iwrdLRW3m6Mu8wGowI9HhfdCVIHuRoNWq5rZccARIVCHufsW\nFX+GoQI0ExW8Xsg+f4SZXYemVN6BKvkvkM30a6hLtisqIO8ireAk4GrXvO7fowq6BCrUzyCTwXGo\nEF1A2fjcjrSHf6DBkN5RYYvFFFcjLe0JVHA3QULqYTRzZhTlPOTCBgjSNCYhU8jFqAKMRpVpE2T3\n/SEq9C8hDfLceO/dkZ69Ig3GxTtuRTbHr8U7Vozrr6Gpg19BwudqVNmOQcKtQ4SlI9LItwi/z6Dp\nWgR396vMbD
SwuUeXPPLqUK+YquLaHsiE8nxc2glVkvcrYSuufwvZce+NdPoG0lzXQFrTTFSWdgNe\ncvedzOwIlK8D4plvR1wmIIViuXj3bJSvO3vFlBQmmn+jMr0S0gqfROX1HuBcd+9SE6c3Ud52QmWn\nP2o0v4M06K4R7sNRnj6DhH2HuPZQxbu1Kv+LhuGlCPuyKD+mRnw2R72ezc2sH+qFvhnHTmg9xTIx\ntnWXy5SwESpba7j7ZmY2HnjT3T9vWiR2p7tPicZ8IDI3PmFag7FDNJI7ozp4QeRPf6Qc/CziPTTC\nNhT1AA5H9WBLVDZvRY3pqLh+Gep5zK7EfQl3X9fM+tCUZeMdv0SWgn0APMw01nQtxH/Q4PHdlMob\nqKwVPIDq1lKU8+qrnI4G5X+I6tD1SJF8F5WfN1D9LL5m9xoyiRHvrvIcTZWdoq6412FzX1RmmQNq\nzrshG9pI1GK+g7SSAaigvo4E2GDUXXwP2TmL+dx/QRnwEBKKZ6FK0wUJnFsr1/8YiTanO1nT5d64\nhTBXu1aF/fO5CFNhnrgmzotpdX2I7hrlqPcw1Ms4rXg38HjRHUWV+7PEVDskPJ+JuHyJmEoV7tsh\nDeLmiPubEYZrUCF6KO4VA557o8HBP1BOL3wEaYhXxHEjcE74X5i/Lq3k0zPAenHeHmm2m7eQZn2B\nlSrnK1fS4w6aTlnbO/Lwzcj791BhfgJVqMdR2bizJs/WRJVjn/j/vXhmX6SZ9op8+hv61oChhrZD\nxY+10Tz0fZHAOZCKWQ2NY3SOdKyalQ5GQuoFVHlHosb135X8fxY1qBvG8V8kqB+OYzISeoci4fVj\nSrPEdhH3nSNsHSvl9HhUrm9BjXd1Rs8lkVZDUPf/faQkjED15xTK8toNNSr9Iyy/jzgcGP49gHqA\nhVnvEWJWB6U5qRh72acSjtHAipU0XjHyYStKs91ApAgsScyQQXVjCdRT+RdqgIYhwXwIMausmbJ2\nf6Tf6cBmLbgZBWxQOe9K09k/hzZ3NONPH6TYXI3KbQfKWTDFrKZhaNwLaqaAxrVLUaN7X4T9fiTn\nqkdhtn6BmDY9T3J2UQj3iFxhc+sdCXUFEmgnRSGbQtkt/GkcD6JK+rUodJORsHoBdZE2R6adQkA8\ngTSodVEFehVpMO9EYo6gXFj0PuXCif0jTIU9rpjPfXGE4xEk1G5ENtzX0CDIFLTA6n8Rx6o9clJk\n2HTKOdHTUXf5tohHMQd7NqoIk+K9UyOTH4h33RWF4j7Cbh/vWz3Sby8kyL4U1w+K+FyJZhqNA77d\nTANRNATt494NkUdPoUo2KtJnBNLML0Ja8oot5HFz0xILIfEvZPe+BJkS3kYNUtGbvAs4AgnCnSO9\nX0QDu89GHl0YZeC+iNuUCE8xb3vlSN8ekZbPRFxejHedGHF8OfL/ftRQ31cT5lMqlfof8Vv0dJ5B\nPbzCPjsS9axGo/L8FOVClkI56VLxez1kxiPKySOU5fFZmmk4Ubm7nHJh3n9QY/h9JCzPQ0L7BCSA\nXkaa5PDI42OAL9T4+Vk0oNcT2KRyvXZNynmox3gIKq/7R17ejxqrl1Dv9XpUvq9FdfvlSPP7Ik8P\nRb3yOQsCw//fo4aiX7i5L/LkeiTwX0QN/QVRPr5fOdaknAU2i3IB3j+jLOyO6t4UVJdeIKZTV+Lb\nE1gt/m+IZM5byJS4WcXdZajB7R/5Xaw3GYgE918pG+5CHiyBlIebIwxOWU7mjL1U3rEDaiTGIAVh\nTlwXW+GOBPRVlKvSplEuGjkTmSiGREJ1iGc6IcH9FqowD0bE965kxpqUwrfQKpZFAqxDFJafEQOK\nkWibxP8hSLsYWlyn6WKKqrZ0K6o4y6Bu+wrh/4NROAv/768cd0eB2LgmLbpFWsxGvYrr4z07o8q9\nM+VCkjGo670N0oC+jrr+I6MgzQh/BsbvVFQBp1PO0/9iFKgZcb4FcFEL+VRoIUNR4V4PCfcr0UDj\nU5QLeP5OZeAznhsGrFzxbxXKwb5aDWlUEcYiP+K30BIfQuaRPqjy/gMJwWspewPDKBvHYpCsmJ75\nQaTPDMrBvpORsD0deLki5G6pSYfjkcCq/l5aOd8INSpPIQ1/t8jLOdp/xa8Rlf/7FHmBekF/RA39\npkiobdBCvlQHeK9GCsYsJEguiDxYHfVcvo3q25+Ar81HXS1MlkVv4r4Ibx9Uf5+JdD4T9UZfoewV\nTqbsET0LHBR+rEY5sHhn5H9x3qGaRpW6X8wKeg71jJ6LcFyGyvjN4fZupOC8hhSjH8e7n454LInK\n2hZU5uXHs+MoxwzGoobhVVRu90F1YBU0E2Yi5YyyO4BZFXnzVMTjYdQYnRHlZCBSNr6MGse3UH1v\nUk6Q6XMIkhv/ruRrk7i2dizUAdUK27vsfsPd/dSwq25Eqa1uRjmFcUczm4JMKrdRDoqMASa79hT5\ncdj/OqHCNwlYyczuQ7axld19EzNbNZ79uZm9hSrUHiizZ7r2dQANsowysw+LALv7NOCcGAzqjaZG\n7Rm3l0A2wa6oK/3NGJwy1F3bMAaiADY1s00jrDugijwbVYxi8Opxd38twvgwWhTxQAyovOvuj8Oc\naYInE/vPIC34SFTRj0Ba1q5I07kq/P4L6vkU0zivQQN1xRS2Ir6bo4GgpSn3SukY/7dFXfnZqEEE\nzZKonfJ5DvComd0U5weixg//qG19JHCvmZ2IGqNiLOXdGNTqiirrMahyPI+E+xjKKX7TUYM21TUu\n0gn1AGa5+9Y0g5ntjYTqPjEt7xn76MZSxdjJxsg0dj2y4++NyuwRqME72mXLfTHGcy5AjYib2bTw\nY0Uzew31iE6J51fy0ja9jOtLZx2AYTF9rjqQti/whJlt5+4DUa/kMDSt9Kdmtm+k+xFIwK1PDPA3\nF/86+AlqyD5r2vNpHFqUc7hpU7DdkTB+LgYKH3b3u8zsSDQW0wuNQWyIBkCLvFo56sSGqAe0fMRv\nlmnju19S7gO0DuqZdEECsVhLcUnEeSXgNjM7BZmIlkBCep2oR0ch09WdqL4siYT7FmaGuxd1owcS\nwNvFuydS7rVThGMwkkGdKCdiHIsUS9x9qmkq9i6ooVkCyYHlI/wXu/t9ACEjioVjAAeY2fdRAzLB\n3beqzYyIa12b6S0qzb2wyxUrFbdHlbWY0/s+MqmMQhpX0aUbh7SFs5DmPCwS4s+UJohhyMxQaLcj\nidWVKGNPo1zCPhC1ssXqtdtQhbiSchVpYV99Cmno9yGNZBrqHt+HhMyVqHs6Hs1t7Uw579vCvzco\nbZtvoIKyAU3NJv1QgZpG0z1Efo/MOpdRrsx7Mn5HUHaf36fUsj6I3z9GGA+LdL4DNYxEOEfS1MZc\naBD/j9L8NT7y4QPUYPwOmYo6o5H93dBeKh1r8ro76ur2jP/FlMARlCsDh6MKPh3NaDkZdaXPQBVu\nRKT/vkibGYCE4+DIy/7xzDjKgfe7Cdsx6tKeiXotW0b8d0IVc0r4cRf
qef0HCcrmyu2DwBnxvw9q\nMF+J9Cl6FPuiSj+bcsuIcTTdy+SkCNPLaNC3MHn8OfJ4J9RIF0f12eGU5rEXkOD/kLKHMyzSsui9\nTKXcruJdYqpfnfV0GSQIv0O5v9JjqEEtemlD47c6rbkvpVn1fbT/02URlgcjfZ6KNHsZ9b6Kac23\nRJpOQIrNdCRkH4+8/RsVM1g8swSqLz+nnBs+DJlivku5yvciSvNrn3jPzTVx/j0yI20QefESKuOH\nA/8NN9fFvSWRiWV7ypXUN0U5eB41EN0q738C6BH/Tw+/3wu371Eu9runJi37orUvRVxH15V/i0i4\n/xbNLDggEvhl4PS4VzsPe02kJY+O//0p9xiZSbnvyljUFS2mrBXzlp9s5v1FF7BP5bgK2cE/QJV1\nWFwr7r8Rz7yABFAhlIbRdN5xp4r/xe9uyJSzaWRwrWlmGLB6/B+MGqZiv5QTI849USEfVzk+iHjf\nEn58JwrdO0hIvRLPfSPS61zUyPSiZll/VRjXhG3VcP9tVJBfR43ONXFvT1QRB6CGYDywRwv5vioy\n3zyBBPY/kAbVOfK3cx1lp2oqO7mZ4wI0qHg7pcnt/jiKcYp34/c7SGAtQbk1w5yBy2bePRo1KoVp\n5hdIGRiFhMLEyIfDKOdL7wr8veLHlWi3TiItvh3v7hbPT4qwTi3CXHm2c5ShzSO9ByCTzDtRbvpS\nrhcZRjkO8JEBvVbSeHvU4I9HQv3OiOPxSLBdR2lOOzl+d645ir1oRiKT0K2R7hdTsXNTjjO8Hb/T\nI37D0fhPZySwd6McC6oKvf+iOnBWJY2KxuVNZNYoZMGIuF/U5TnrMyph2ZJy88JiRs5IpGisGO7O\nJrbgQCamf1Ju3vYmklNDIw7bUZoiT0EyawBqZD5A9fi3kf87I1lYTcdHKkcR17MXZ+Fe3YOimMO7\nGqUm9RKqOPdW3O2NTAADUaEfQrkPxz5IG5iKum0fooq4SrgtBg+LedDF4MfmwElFxtZUwNpZHm8i\njblY2NI/MuY5YpFOuG1XyczC7n8eqvB/p6z0R0e8eyO7Xu+KcG9H030mPjIwGde3RiaDdYnFJZRr\nAQ5Cpow+kRbbtVAYi50Op0Rhc8rN0W5HFfsxYuS/GQE1ZwFPnG9IDBA2E967oyB3jeMktIoTNIOn\nun/HH1BPaQlka52MtLBLgc/NY3mr2sqPR43ej5C2vnO9AhDNOHmFcjB+OtJEL6fc5W8G0jhHRj6O\no9ywbCzlrofPU85eeT/y/ffAUvGu/xBCKc5/FO9+IfwsdkitFaqDkankAiSE5wzwz0N6FeMrQyln\niT3VyjP707Re34oUuFOo6REhofp31INaKvL6aaSkjI9rRa/+bKTV/gcJ2P1RnTspjudRw/MaTXuC\nzQ1QFrPShkTZMipllY+OkV1KM7PnaLph4ZxFVkXaFXWW5neX3QDJnbvjfDwafN0+nhtb865q3u4A\nrFtvPi6q7Qfm7BMe508igXwOsoHvihL/Q6SZgLSsFVCCnFHj5e8o55auhArHckjL7IC06deQZv8s\n0pLXNLMLkOb0T1RolkFdzuXdfZOaMI9DQnBCPPNFpNm3Qxl0fjj9FipcvZBQmokydGy8pxhMXjbc\n/h7Zkd9HhfVEVHnXRSaal4Ge3tRuX/BF4FFvuj7gQHe/qXLeDhXGz9fEZ85y6co1Q43CL5EQWiPS\nba0Ib0+kCT2JTDEvxr3n0dzbzcOPx70ZG7eZPeWxTUTE5exIm6mU++JPjzRbJq4fihr27cLtOKTl\njI20Xx2NBTxXvKeZeF2LzDt9UYX+LipL01FZugG4xt13qg1zM3HYEplNfory54W4tSRqVD9EJoi9\nkCY+BU0Q2BOVy6NQGT236q+7v2iV7xyg9N4KpfEYNM7wsLvvXhOeZZEi8GHMSS+2qvgQ1ZkVUcNc\n3Ye8tTg+5tq+YCgSThegBuuBuTz2RoTxQZSed6J86Yka8nsiLF9HY2ono17jaFSvdkXjBH+MdLsa\nKRFLo97M+PDndVQ3+qJG8tR4/52oUVsPlZl1Ud2fBhqrMLOL0IrQ25B2/R7q2R8e8a5ui9Ae5eGW\nEb7uKD/WQD3QYrrixki7H4Zk16+RAvh9VEZOR/PX30B2+M9STtM+AjXuU5E9fzVKM14xRrMmatTn\nlGuLb0/MJS+Ahb+IqViheg0SkMVmVf9CFW0EqoTFPGVHQsYotaKnkEZe8DhKyNtQpd/O3adVFzOZ\nVhp2AB5wbe7zfAjLQ5ENvnf4tSIyiRSLX65z99/FKrgH3P1zZnYlcIxrQ7GOKFNfil/Q3jfFXhsj\nUQUbh1rq3VElmRmCcJq7Lx3uQAV0ecpW/dgIUwd3/1llgUaRafuhvWZ+UEnjJ5A2fTSyE56NKtk0\nygKzBGo07iqe89i3IuI0Ie4vhxqcmyI8WyGhW3xIAVRIi5WyryMbd7GAh5qG59zIrxtRI3shGvQ6\ngRqKhsDMLkfmoFFI09uj4uw+JAxHR5iLuLxY49eDwJ7u/l6cLxfhLVZhboq6/NW9V+ZKLMK6gXKP\npG0o965ZAQnFu1D+beHuR1SefRsJwVq2jXgUU+aKFc+HI+32t+5+V/UB08dTdkK9y0eQBjzF3Q+J\n+0tT+eBHnXG7GQmktVHZ3Rj15l6J+HRA6wlA41WvuvuxplWYeyClZcdIm1uJ9SMoD29A+X8PEvJL\nunt3M3vStTBqSdRT7W5aWb0Tarymmz7qcw4SgMWiNlCdeQT1yh5F9edhyk3FQEri2qhcD4vwreDu\nwyvxrn6cpl/EeTvKPfSLcbKvxjvOi0e7ocb766jHVqRTDySzdkdWg2L673/QuGAxNfaK8KcP5eZr\nS8W7b0eNzJxy3Zxi1hwLe7bMbsg8sS7KpEK4v0eprbyOutCrowR8FSVyT6S5PY5a9gPRlLCjTV/j\nWRHZ5iaZvmIzFSBmXByHNKVlTCtLO5jZ58PNSP/ozI1foylcp0ZhOxDNeOiC5h2/bWbbIA3tcdSg\nfKWZ+A5CgvltK5cOT0PL8beL37WRgNnF3Y+ramLx/vZIKwQViAPQbJdN4np307JnkFCZhezmU+Kd\nV6PC/yVk++wR6fjPeOZoYGJo0u2QwJ2BTD6roa7pMZG+hipGddn5kpFn20Z6TkaVeR/UCN1ScftD\nNOh1NRJc5wBTzexH4fYwyq9ljTYt954W8V4JzRSaI7jN7GWv+YpUC6xOuboPZBpaK5SA6eH38Xx0\nts/cuAr1cm5F3eV3kKnhQNQI3u5arr8l0DN+26H0fz3ivj/SzK4JP3dAZXgXd38g4jjFNVPqaLTs\n/4CauJhrJ9Mj0IDhQUh7LZiNGudmZwu1wNGoUVkBpV1fZF55m1jlWcmH28xsMIC7z4yZTY7KwL7u\nfnz05l5GXw3zaBi3jHg8EWEvNlgbRvllJI80e9T0daKit/8BMtucT7ln/1fjuWXi3VdRzqEvWAf1\nAlavNnZm9lnUwK9Y6R13jz
hvjBYQ1fYGd0QbkvWJevY8sh48hxbntTeznwPnuTTocWb2Aprie7KZ\n7emxKWH414lyo8NDUF0ptlq4LdxA2ZC1Tr32m7Y8qFmhWnOvmNmwI9LutkQF7R2aroBbgnLz/RVR\ni7gN0qLPRt28LZGGMANVplrb6AQ0Un8XKizjUEMzAHX3elHO8ij2dZ8Yfj0R/j9LaUOdY/Pz0h49\nK+49G++eTdnVPg618rMotZFiH/fqBy2KRRB3omlQf0b21Dcj/MXg1v6Un78rRu53jmeHoYr6BOre\nFoNMV6BK0Ac1VhdRDu7eiHow99J0oU4x02UEEuazqNnEqyZPd4jfpShnHxVf0zokzu9AhfnwOO6M\n/CoG0ZYB1qzxt7pXd+Hv/s28v9hQ6uQ43ke9tWUpVxE+0VL45xKvLVHD9yqq2MWiuBGUH+CYFnl8\nHypnlwIbxfODa/wbgGy1TyCBVZTVYixkeqTNoZWjumJ6UyqbyVX8ndcB1U5zudfsKk/KnVVfiN89\nafqlsH/UpNuHlIuMHAn/SfF/bLg5MdKu2ATwLWTq2CDy8ZZqHFH5/nbcm1Txp1gJXqzPmFATp/0o\nZzv1iWM4MkFuz0f3cz8ZCdxnUQP48wjbs0i479VC2lX3iC/k3OXxbLEuo9hwbkUke6Yj023nOOra\ny9190dncj6H8Us5lKAN6uebH3l9xWuxBsz1K+GIL1omo17ETqhDdkbBcE9mCi93UQIMXP3X3a+Pd\nyyFBsj8S7scizflyNEh6j7v3byHcuxDmBqQ1TEWC5a/U7DPhsqF2rvGiPSqYr6DpTDOjZ9CNco+U\nfsT8eS+7YUWXdY7NOq6vgOZ0z47zQss/EvU8hqEu3eFII69unjbKtRfPyqiR3Lji75aocd0YVcoh\nYc89AZlhqj2+5ZCm92N3P7KFdBviMoc9QWm+csqeG8gUdouHiSnGCsYiG+ac93k5JxkzuwbZMJ+O\nuIWT0kxVcdsDacagSrOZl1vNLo0E7XzNBTft4X0wKptfQdrmf1FD+T3KPYGKeBf8Cn0QpbDz7o0a\n1Jmoge+EVhqPQA3/je7eq+bdX0L58oi7n21mDwNvufs+cX8/4GfefM+ypfg8i4T0DWgHyLcr93ZH\nDdTYiFNnNNh7WLi/o5Kul6P9/9+r8X9DpKScGH6MRQL2m5QzZYreyXTUgH0LDTq+UvHnSY+xJDMb\n4TKbnonSfBrxoRVUJr4cPcHPIGH6crzbvbRlf9HdH43/30A9qnaUPREo93MvtPWVUCOwj7t/JkyA\nX6DsfWyCtO3JSEEpN19VwgAAIABJREFUepHF/HeLMBYN+G+QbOyIyvWNSF6s6+olrIbGBMd9NOdq\nmFdtpS0OWpgi2Iy7Yp+M6Wil2YtIs74zMudMVGGPppwnPgBp4N/0UnsuNvQ6Ds3CeTX+zwCOqzPM\nv0WV7Iuou/YKmtbVvZXn9m/mOBhp3MU2ud2AveP/IzSdubMVGjSFmpkiSFur7s8yR8uvTW/KLUgP\nj7A/G+n1DtIoi3nLA5HZ4LE4nosCNh5pZKdQ+ZYj6kX8FRX0YirZljXvL5Zkv1Z5zyg0uFysZu0P\n3FB55hbUM7mIysrLGn/rmu/bTHr8Ctlkj4jjYeCXH7NMF1p8k6X9kc/PIZPYOZHu1yDt7+pI1wGU\ny+H3QPVhM7QZVuHPGeG2WOOwCs1ocWi20kDKr0L9j8pspnmIzzZIuxyL6uF3K/fmLAKiZpXnfNSF\nJj0uZHb8bCVNi2M4mi9faOE7EPUizv8YZejVyNM7qJkySM1aDmpWD9e4HYcaGqv8L9aPVPeC2hqZ\nraZFGJ+PY+c4PzbOO6NG5W9IyTsW1cfVKu98IPw8HdnuV0E9wjspt2dYGzXkrab3olqhWmgxe6Kv\nHT0ddrlaDkKDETu6vg7/MIrov9BHKgrN9uK4VswzXhY4z/TRgXFIMBerzDZGA3Bro8T+smk14Rao\n0q+HWuuiVS92X1sVbdo1DdkA+wCXu3sxGNoS+zRzbRckQIttPSchE8p/kXC4ycxeijCsibQWkDZ9\nmGnmznTUKPwPFTzc/T0zW8a0EvdkSvv1mqgL3A4J9DvQwNMtca0r5WyTI5E9/mE0s+MNZOMvpkyu\nggpuNX7FrpDnxDVHMycK9kY20d0o82G/CEOhyS4FHGiaMeLxjofQ+AyulZm1/M/MuteRB01wabjD\nkZYNWmPRbG9tHvx8AlXMWtZFAuldANMqytvd/btxviRSXEDloBh83Dbug8ZufozSqIm91cy2RbOb\nNqVc0fk+0hjxGq15HuLzOPC4mZ2BhPyVwDWmVanHIaH4QzPrZlrR2xGZHVan1Ejd3VewcubJL1F5\nWw71AO+I1+2KyvEtEb8BUcaXQtruishkcT7Q3syKT0oeWgnvL2I84mzU47/U3W+tvLsLmnhxbbg/\nN9LvasrB4SoT0PRPN7MJqCdlIaf+a2aXIK39NjQWcRblwGgnJF+WQebTH7h68iPQFOLZZnYIarxf\nr7yzMyr736VcDbsOMjWvHeF+ycyWpw4WlVmmDwp0VyRU2wMDvJnltuF+KNpm9CvxfzYy1dwR155C\ntuZLi0fQINxFyCa2HGoJQYXnJj5qFliHchHTnHD4PEwhqxeLbUbNbKi7fyGuDUMays+QJlyYSUa7\n+8xwU2vm+ReaH98v7vdAGu57aDZGMVB3N1FYKT/YvC1aWn4xmmp5f/gxAO28t7WZFbbUq5DN+DXU\n0yr2JgcJ/BdQ9/2DVuLdF2169HbEd78I/w/CJHE+6lmAegjnE/u4ewww1vg3CmmqRWPXpJu9OGA1\nWxuHMB/uYQYzs+1RD7RaFp+n/KKRI41uYvz/Qvw+hLTA/yBzyAmoB/sjJOhfcn3qsTsy/dQz8FyE\neQWkTX8Lpe+/Ua9qiJndQHy02TWbaRkkmJdDpolRzfhXzDwZgcxn3wNudfdfxP210Jbfu8WAbB/g\nN64vK52ABrv/GGlUDOZPRXl9bs27BhBbG6MysW2881I0SHkJevDUEPwj3L07NZjZFciEegcyGa6K\n6s6pqPf/H1T2D0Ef2bk7njsS2czvQ8rr+5S2/P3RDJjTkdK6FaWNnwjXE+HPE64tNB53920q58ui\nHkvrZXxeu2ttcaDWe0tioVAkXEvbxi6F7PLFcuLiS0BdkMnF0ErHKUgwXEulS0bTjZqWQgLtIsrN\no4pNpx5rJcwbRTia3ZGxzngX2st4NAXzJS+70kVX7/F58G9ryg9kP4TGHbaiZsEJqhAjKL/d+UbE\n4zbUENxRcXs70tKvQEJzFqpI78QxjqargucsSInnu6O99ZsLb3Ul7/cj/15Bhf0ZVOmLZez3I+2s\nPzVLsCt+1N3NrnmuWAhTbC09T8vy5zHPfxNl9pQ4ngROjHvFpl8fMT3x0YV0t6Ie565xXIbMZcUG\na8UA/B1RDlrciKuOMI9D5rbtmrk3ZzO5yrVhzMVUwEcXE42qOW9HuZPnoMifYsfJ66M
cPIg02P/Q\n1LxVLMArjqk1x/NIsL8bZbnI73dRPTizhTCfXDmqK5tPjjy8JNzVDupPInZtRPVzfcoN70ZGWE6O\nNHbU8MyqHEX4ZiLZ1jvSYApqnB6l8oH6uR2LSnM3NP/7O5Sfh1ueylxlUNcpBl9/jrolk5CGPRvZ\n1v7k7hea2Yuoot6DEvnHXs41vxL4q7sPMm1g9QwaUT8NzYYZ4Ppu5FmoGzWd8gPLeNmSDkOa0hAq\nHwlw9zlfqmklzn8L/3dFBeUA1GO5BdkPD3P3AWb2ZzTYcgNNN4z6SJffNH//p8jcMQVl/AXIPlvM\nJwcNVG3j7ic0o0kejoTO/agw7YkEySORPuujilYMUE8kPsbt0sDviPec5+qCd0AV/3PNhHcYmub3\nVpx/Ec1hPxNV8J8im/Kcbj3l5mxFOjxQ4+eOQDfXYFMnNAYxrvbdNc+MoQUtc0Fg5cInaPpt0VFo\nzMYrbvdCmvexqFeFu59mWgsx3aOnF25HokZpOzPrj3o6f0BKzBQve4VzBh7rDO/WaCC7M00Hszc3\ns/8hgfaIS5PcENWXR5H5799Upmq6+y1mdjZabX5X+F/9yDaohzDGtQnYACTEd0NjBztV/GuPxpT2\nDtPE7e7+pfDzdDQOdzUqO99BZWnJ4t1mdqa7n1hvOoS/y0U83guT2p+QuaRYxLdKvOMD1PDshVbi\nHhY96ZOQSbIok+4taN2mDxT9Ccm6f6NJEM+gscERyOzX36OX0CoLQlupQzO4mHIv7pORmeQD1CLP\naZVrnumNTCx/RTbAN5HWNx6ZC74R7ooBvp1Q72AcEsbPo0GPEZSbLD1JOZ3y/jjerfyv7usx5GPG\neXjNbxdkzqjde33Ouyk38m+2h8BH9/WeQWkb/zD+z6ScdvYO0g6aaKqU+/fsR2W6IU2XPv85wroz\n5Tdaf0xlw7LKcx/ZzyeuF9r66VS09bg3Z/vlOO9KLMWP86Wp7IPupXZ1G9qlEOocbKrHzUKqBzeh\n+fbFeXUr7FfQbIm/V/J6TMXttuG22JZjs0r5/TY1H/yYx3CNRmMpXfnoZnItfUuhTzNH0Sv+BlJU\nplFqpu9HmfozUXfD7ZaUe828hert5hGmzsR3gZFAHV15ruipVHtls+OdXnn3+6ix+BPlJIZim+Tq\ncQ2SK1NQY/N+/L6IZMhXIzz/v70zj5arqtL4b8eEhBCQADKqYWiEBgRMoEEQXAgKNooTIM2gDNp0\ni4DLblRUAigLBKNLQJTRCGkcEgQB0SaADGFqCAQJRlEJg4wygwSICbv/+M5599St4VXVq3pTnW+t\nWu9V1b3n3LrDPnv8dgyWxvNzEQqUnoDiag+E/e4Pr2uQFXp1GOupsN9mSIk6GLlkY4rtfYRU4lZf\nQxVQ3c616u/g8n1FQvzHwvczgc+YCpAi9nZpMD9DguEgdAJ3QSvpnKDBrxW2Pxhpr0+gG3EW8t0d\nCJxrZlugINAqAO6+S/DB3evuu8RJrejReKWpQu4yKjWT55r8zdEfvcTM/gu5kN6GfKXbI83nfSio\n6lSmzr1kZlu7+z2lMbfwSn/hCma2yFXdtxrSjiagh3AREqbroTjHdVYUP62EXCtjUYD531FgbEIy\n9kte9JpcjiytzwIvmtlWcSNTcdaL1ICrvd58imDrx70Ihj7llZr0HBRXiahVjPMx5IO+O4zfMNhk\nRYHK/OA7rtIy6+3bSZjZlei6rgwsMtH6vo4WzhvRszADuXQOCvGIVYFxpkIYRwLlj+H/m939PmAX\nU7ruz4EJZnYLRTplK3ja3a+scdxj0KIefceGqrWfQRk/9fBdlGW20IMkC/Gjjd39WlMSwMquoPMi\n9IwtQQvY2UjZuwgJxXWDBv1RigAmqBjuAOQT3wvdF0cg7f0jSCCfjLKAYgHf0SHmEVt+7oiE7M+R\n5fQEodMYWlz3dfcdwjk+Ofx/E4pRxfs4ZsuAFrSL0LUijB27pR2PFvEYW/sTWujjb7o/eC3GAjcG\n2ZZa8sOuQjXiH0GQRnN03fB3ZYpS7slIEH0YCenoCtkTnfyPI1Kfv5vZEcjX/HMkcKJwfB4FpmLT\n6YWEFER0o6wFPGVmu6IH4N2oWXS6qBxJpbA9JvnOUdClGVxp4mKODTmeQQ/RLqYKuciXM41KHpQP\nERjyzGyOu5+WjJnyesfMifkhqHM0BYXyOkjgXozcSlckY+yLzvfaSMPfAbmFHkICflzYboqZvRx+\n874om2YVpGX9H7C8GWESHoK+7JYGAnf18NsvDfstNVEjpFjq7m5mUWCsRGPEzCVHwuMD6aFRWU3b\nTcyo8/nGyHI9FbkY70W5899Hvvm/1Njnl57kobtce48jJcZIAvIt4HhTjvp1VLtYvuTusykaswNg\nZrHbWawlmIcE/6MkmSdh28+iTKzVULxpPSTEd0XC8CX0PBxF0fbwByg+9Af0XB/iwb0VsD8qjJuC\nFshbwmcXUWS97Als7ar+ji7bBe7+1fD+P1Fm3rLwDG0TfseK7v694MKM5zjea9sD91iRwdYX1A/P\nw0Eow+91VCV+mrtfaWbT3f09ZrYkjLnMhHdRyJrXkUIa8XtawFAJ9zPQ6rymiQ5gVXThjKSU291P\nCav0VYhW4BxkFsYAxf5hvBuB77n7aWa2IYVwXBdle/waCc05KMf8NnffMGgis1BmzT+Hce+mILFy\nd9+gQ7/5j6iM+RemVK47EXVAuUFETJ2LPCjHh9+/MxLMqXCfhtIBHwnv345Mv32RafoCeuD+irTl\npWG7fwBvhIf1q0iDeiHMFwuOFrr7pPgZchFNIWQboPPkSOM/FGUStCNM6gncFdB1uTQcw0fQgphi\ndrgnVg0CI/KG14QXBFF9/EDh/WSKNM6uwwtqgTLp113oAf42xfk9y92/WW8sMxsTjj9am5OQ0Iz3\n0ztMDSlaWbgOQcVh40iKw9C1uNaUwVIRE0LP40+QIAYtLjMpGsPcYIrPRIH1PcLz62r0sWbYL7VG\nrw+L/3nI2pzhIZZWA1NREPohCt//VOSGXGhKZ1wDONbE6fRdikYzEZORwvJcOOYTkLLyFzO7FnjS\nREFyIEWzmz1KY5wOfC1c20tQnC0qlmugmOClyNLoE+TB4n2NSkK5JymUSXf3NL24XwyJcHf3i4PA\n2BX9uOeQ0N2JkO/sodsQBTvhHuE1A91E1xPcOGHVi5p9WTjugU7qzsj/PlMf2/no4n89jPMN4DPu\n/nTYr+KBD9bBxSWB8G/u/oMmf/Zx7j7HFAAcj4JJ3wWuMXVcilwdZR6UfyDip8iDkqJ8Y0VcgUzT\nu5FJugy5Xj6ALJ+rw7iXhnn/F2kgAK+HRe/PZvb5cG4moRqAw5HABZmrt6DrdRyiN30GmGdmZ3s/\naZERDQTuu4Crw0MJWqA+Vdp3hpm9H2l6m6C0ymaCTVuWtN3nw3yDjZuAncK9NBe5Ac5y90+aUhin\nIiHVCN9B1yKe7/WRxp/WV7RqlWzrScVyCb
Hm4ojS+C+5+8zksx+buFWgyK5agaKb18S4oSkIH634\nCmsU3X9XRZdgA9RTEiZTcMTfgoTlg8GVuDNKGoj4FrDAVCUf+ayeR/InWu+/QNp81OIfTvbHzNZ3\n1e0cgBbGlVC87pBgmZ5rChqvjK7bssTi3dkTIrMBox1Hfbsvksq60uunFIJ+OnVSx5JxbkBme1XQ\nCGnIaWXfMehCnYAE2j0ooHsZElRLw3gvU2o0QWW6V62mHzV51uv89tht5xQKfvkFVHe2L/OgzKfE\ng9LEXGUu7UeAufF3oBt+EbKgIiPnOeH9z8L5STniIxf8CkiDugBpSFehYNEj6MGJKXpz2rg3ajbT\nRgvLpFbH62euur1dB/l5iPfvkShF9l4KxsLr0ULcMEU37F/R7aoDxzWz1XGQC+dAlNHyJpIOSMk2\nk8LrNJSN80ek2V9Gwcf+ByRIH0LC0Sm6SvWbskrtXgw/Cuf1IOSi3Cu81q6xf0wu+CKy+Bcgd9iL\nFAkZVVzxyf4pr9Ovw7X5XfJ9rO7dFilM16CF9xjgq8l2+5TGPbnV6zjYlL8PUl085EiQjEOmzroo\n3W4iugn6UseScaYid8MWJBwc7n6vmR2HAm2Xh80/jFwgDyDtfEcUbL0BCdrlLo6V2EHnp+6+qSU0\nv2HOhUjji37DGHxtio/EzH6FtJD3I43sVZTTvlWNbVMelFvcfX55m2Zh6sv5I5SJsjRoCUeiB2o6\n0vQOQkHqPng1U+aeyC/6APLRP4M0+e94qQjEQlC3xeMsp0luhHz5jxCChsA3PCkqs4ITvqoqsp+5\nPoWES0VvV3ef1coxDxSmgrzPoYyRw1CgbSwKqi90959YUujWwrgxnbIvIO7u32hh/7rFYeHc1cKN\n6Jl8N7pet6J87L+akhdmUVB1P4Osyy3C2FeT0E/Xgpc05AbHXnG+wtx3oAUHJC8+5e5V/mszu84D\nB48pZfi/kWa/NiLv+hMNqKXDfkdR8DptgM7jMrRAGMr6mo5iYq8i7X0uclHt7e6TwzjlnhcV75tC\np7SQAWoKU5LXJUggT2mw/T7IN7Z5OFFXUcnHsg0Fz8c2yeczw4n8M1o87qIoAolpeo9TStML338b\nRcx3Da/ZSLA1+xsnoiDwxuH9OiRMc10+v32aKtIYHkT+0nlIWG+bbHsNlel5k9HD19dxCRXKvDN8\n9j8kxS6EFL02jrGcJvlKuBc2CK++rk3JPhXpky3O11Ftt81j2BkJuS+H979FWuFiZHmNp3VGxzSd\n8vgw3gUtjjGl1it8d2byOi8c6yVIYy5bQzEV8lZgl+S7PUg4kJDgndjpez28vyMc4wLkdvkhsoZT\nz8GE8DcWSq6GcuzXD/fkzW0eSzx3G4a/lyKZ8wPkiq7gS6JB97Xy+2ZeQ1LEBH0+65iqB4C732QF\nc1tM/akqKbdAVh/8199Efvjp7r5dP3OOQV1xFrsKcH4UjuECpG0cgopp7kea0xmlfQ+n4CO5BnHL\nLGeYo6Spvgnxx/wNaSLT0ML4bNj2d4hpcodk/wVI+7gdnaeYTrkK0jQnU1T8TUHnbxktUgEEP3MM\nGh3t7huXvl/oSXGUmd3i7jsySmAq5d8D3Xt/NpXlv9NLDTr6GSM+G/HvJFQ93G+XqTaPeVXkylvL\nSxZG1KItaZwTPr8dpQ9Gq3gSchumqa/tHk/ZKjsGZdC9LfwFeQceD/87CoKmhZKGFtbxyAPwMIoH\n3ossSO1YJ0hdz3KypGjNxGj6fa/MdPu1u68e3g9Ycx+SgKpVp+qled67NzFEmhZ5nrtfZWYn9beT\nKyvhKZSlMhaZWy9TEGE9GP6ORYtAed8LkBbyBsoKGfaCHaryyw9FfvizwteXIG0vZjksI8kiMOUj\nO9J2dkaa0LMou+lh5DYZj67JHWHsvmBli8fZlyZpZhua2X5UVtleHb4bFvnqA4WporZM+oWHrAh3\nfwIFWVtBWk+xLkXv327hFWRZvWZmk71wq61GIV8WB3dpdHtNQYoCUBDetTO5qVVmWUOdT1ERfDPS\nwGehRIwDgWnu/rHSPqeb2XSUdfeSqfXnFOTyfS96JqYgRQbqBKmtshL9fHTfxuSQ+5DMeYLamW6W\nJIa4mb0Uh6Wy5qQpDBX9wELkHrjdxVO+KQoYfLyfXeP+TfuvS/udiqL9iygWCPfajIOY2bHufkr4\nP/U5G7qhD3f339Tad7ii7A8PPvh1PLBfmtmRyBKajX7nTigneb9kmLHo/Dlyb2yMzN3lqLjkPHc/\nc4DHGXmz43V6E0Xq3XiUs1+O34CuZxWf+3CEmc2lkvTr06j2od/+mA3GPA6Z+7uiBdzR9ZjecMfm\nx48FWCArdzN0ryyiThwjWOknUsSR1gH2c/ebwpjTkBb77jaO59Ph37QAKc6/CHHGx7kNLX5/ctED\nbIxqZX4Vxko9AnPDGNNRgLZe9lD5eOpaTiELZ2sk7Mcku1X1QfAmYwwN0Qk/Vxu+qFiyfg+BD5oG\nnXxq7N+W/xq5C1rhn067nPf5nMP7jUg6p4+UF9U+8oOR9ZKSj+2KCog+RMI3nezTxyOPTNU1gYfD\n+5Wok0nQ4d+xOXUyI4b6HLfwGypIv8L/dw5wzH1QMwdQ5tVllPj1Bzj+e5PXjqiJRPyuZhwDxcAu\nQ37vheEeez3cbzeH99MGeFy3o5aW8f248Fk69wtIa34tbDORym5RaUbbjeH3LKCF7CGKTnG3IzfP\neAou9vfWeiX7TkDuoe8jhWpsq+chfQ1VEdOjwVf3S6rzvPuFuy8hMYm8efN1Mbro5Xzxeki1wpfd\nPa0QXEzRPWkkoZY5+DQq114OnITcU2sjDX0zUxFM2tB5ghc84YbS1N4S3i+nWpvuBmYhy3M45Ku3\ni1js9USwDB+nsvl7O0jrKd6HrLAfEvjhO4BHgCc81DGY2Yoht/shL1UfJ7gYWSf3URRFjaXobtRO\nFW0ZaQESKOVycmnuy1GyxlUgOWJW0UciLZSMadNvoALHT5vZUhSnakQt/asg206j6F1wfpivira6\nhAvDvPMQYd7mVNJrt4ShKmKK/q4TgqnyZlRI0xUkfrklqFS4XFZ9VL1DTf6fb+Klnh0+3we4M/p/\nfYT4eakufDoL+c9jscyxKD/+BiqrE1Ph/oqZTXUxVc5E2vtYKzg/muYOHwAMGNPAzzsScJKZvRlV\nbJ5J0Y9zIGgrHtUCmuH8KeNpV8n9+9z9t0nMJKKdKtoy0gIkQ/GhE1FhYmww/XeUSLA0vN+ISkUv\nNgeaEbZ5C6rUXZxs8xiNMQPVz+yE4ojzkN/9OCvoOyLKqbubeRFkvoDCV98WhuxBsGqq1vUoApqd\nRswTL/Oq9Id0VZ+AWNpi8PVppHl8mMHlJRkQvLqibkOv9MGfCDzi7ns2GOYLFN2iQOfhufAqc350\nC06ozjRROUPw8w7C3J3CPlSSfq2GhEMVaVcLSLXPU03NQcb0s08rGOvukcYCr835U0bkqpkctNrP\nh
8/Te3FAz1CQI7+hsFC+7O5PmtlfreDJmYtkwVvN7GKKmpc4RoVHIBxfqzUmFyKLPmba7U9RSd9f\nB6U+68VVdd/i1JUYqmyZ45EvbBOk+Y1DvuCupLV5KMgx8T285tUNpeshCg08lMnXQxp8HWEol3s/\ni6oDG2EhCi5HHvkrgAM8SR0dDHhjlsmRgDINwnMdcCv1aZ+udN91qCS7GyieNrO93P0KqMv5U0bk\nqvk9UoYMeNA7GPi2ogDp8vQz5K6NPDnvRzUuj6Gq+C8gZeCGTh0H1Uyt15t495vBVqUMmRXD+6aK\n88oYKs29JarWDuI6xMMc/cXvRIRGj7pahm0J7OXuJ4XjOrnOOLWwDwrEjDREH3w8JysD25jZ7kib\nij0nU9dVZO6LWvL+FBWAg4Wl4bjq+XlHAjruVhpAPKpZ/AdwsZnFTJxHKXH+1EAFV42pzWRHBLup\nYc1EYI2QlRPV3VWQN+CtXrQ0jO6VW7zIkNmmE8eRoCZTazM7untHn6GhEu6tUrV2CmkgEFS88DeC\nOeSiL/gJCiq2isEIInYD0Qf/iTrfx+Kmd6DA3FrIzN+PYiFsRTtpGmGxXZ/KbkCRAnj7OruNJIw4\nt5K7PwBsb0mHoiZ2Kzcyr8ks6c33RkhxOEUB0l3Q18HrZRTH2CaZ+wWUCXZGSOk8sI35aiKkdzuy\nEGLCglPw7g86Bl24h+h0X/dwa4KqtYNIA4EgtshnqSwQWNbm2ENT6jtAJD742A2+nuvqN8i8PweZ\n1xORgD+pFe2kWZiqh7cMc5VpZ0cFRqJbyczWQlzr63rzDbjLnOebomv5udJ2zfZG6IO7n051AdJx\nqP7lNuTfj3O/HaU2Orrfb6YoShooPtShcTqGoSxi+iKi5TRa6Qs4sHm3RaXSMRA4FbkUjnd1htob\nNXf+YBtjt0zwNBxhKg3fzQvK5EkoEDXO3bc1URFMQPGSZehcRh75likHGhxHy+RjGd1HCFrOBL7m\n7ltZg565yT5TSh+NR66czZGgnQec7e6vDuC4alKSoBhExP4UbsaHTcVTR3TS9z+cMFRumbuBF9y9\nk4GeZlAOBJ6B8kg3NbPHULZOu6banP43GRGocF15URr+REgdc+TK+SCq9j24S8dxW8mUzxgeWMPd\nZ5vZsVDVS6EmamRozaZ4/kBC90IqBXGrqJkCWpq7Iibmam4/KgU7DJ1w3w44wNroCzhA1AoEPovy\nWce4ejjWROpz7kDwdTijwnUVAk6vouYM5yKT+lbCQlh+cDuIi5CAf5IS7WyX5stoDq+Y2eoEN6Q1\n6JnbAAPJKKmHbqeAjjgMlVumbKYBHeJTaDxvbB6d9kj9EpWt63C14CrveyPB5xzdL2Z2n7tv0c1j\nHmzUcF2tA3wyaDnRJ99wIezQcfwFue4WUvjcu36PZDSGNeil0MIYtRgRj3D3/rJuGo05YEbN0Yah\nqlAdqgf07qBpxLTL9VCfwmbSMCe6+x2lwoJ2g6/DGWXX1fPArqamH32I56HWQtghPB1zqTOGFTZC\nLrm3oQyr7WhdjtTs/RszTtqxzgYhBXTEYSSVancC05BLodxQem/6v6meSXzOhODraLx5yq6rGYit\ncHbdPbqDBSEt9UpGIJ3vKEbkrolNudvhrqnX+zejgxiyZh1DgTpR+08iytrxBP9/rei5mW2IfM47\nIG02+pwf6uIhDzpqZakMReaKmc2s8bGP1syGkQIrGnCcwgBaAWZ0Hz2ludeI2s9BBQY7ov6qB6AG\nvbX2XQzsNlg+5yFEzQq7UAl4GNWNJboibL0fuoeMIUMOXI4Q9JTmXkaihcQc2XHAvLT6sRR8rUIX\nfc5DAlMrsE2bZqs+AAAEoklEQVSodl2tixod/J1kIXT3tilJ+zmOWjw1LwLz3f3yGt9lDAJy4HLk\noKc09xqILGwvmLqkP4kaT6QYDM6b4YR6/tCrUA7xle5+YfCHz+vicUxAaZexfuATyBW2lZnt4u4D\npcbNaAM5cDly0OvC/dwQGPo6YjachKra+uDuJw7FgQ0V6mUymdmSUNXXaCHsJLYEdkxoEH6IFpP3\noIyejIyMBuhp4e7u54d/b6IfXovB9jkPQ/S7EHYYk8McsUBmJWA1d19uZs120srI6Fn0dCDEzI42\ns1VMON/M7jazD9TZfBZqPbc76q/4VkZmm7224O7nu/vz7n6Tu2/o7mu6+9ldnPI0RPg008x+jAif\nvh0C2td2cd6MjFGBXg+o/i6QH+2OeKq/Dsxy96k1tu03+DqaYWZHI8KolxGD51TgK90MpIVg3b+E\nt3e6++ONts/IyCjQ05o7BQf7vwIXufvvk8/KKAdf30x3fc7DDYe6+0uIyXN14CDUt7KbGIPaGT4P\n/JOZ7dzl+TIyRg162ucO3GVmc4ENgGNN3aDeqLPtYPuchxuqFkKzATZ5bDSZ2amowKzM535T3Z0y\nMjL60OtumTHA1sBiV7/J1YH1WiFB6hWEitH10EK4FWqrd4O7T+vSfPejHqM5eJqR0QZ62i3j7m8A\nTwGbBZN/c2DVWtu2GHwdjTgM+Arqh7kEWAE1Pu4WFqOWZRkZGW2gp90yiem/iILsv57pf6i7nx6C\nr9HnPAt1KRr1cPc3zCwuhF27b8zsTHQNlqBsmeuoJA47qt6+GRkZBXpauAMfBTZp0vQfVJ/zcEOL\nC+FAEHux3oViGxkZGW2g14V7NP2bEe6tBF9HI1pZCNuGu18IDRt1Z2RkNIGeFO5tmv6HUQRfl4Tg\nay8xF7ayEHYC1wG7IaIygBWRC2yHQZo/I2NEoyeFO22Y/oPlcx5uGEIfeL1G3RkZGU2gZ4RUinZM\n/0H0OQ83DJUPvF6j7oyMjCbQ63nutwO7RQ3RzCYBc929yvTv9bzregthSIvsxnwNG3VnZGQ0Rk/n\nuVPD9Afqmf69nnd9HfJ7R6xIdwm8YqPu1xEFwTmoWjUjI6MJ9KRbJkG/pn/Ou+7DYPvAy42690d1\nBft0cc6MjFGDXhfuXwDmmFmF6V/aJuddC4PtA9+i1JT7ejNb1MX5MjJGFXpduEfTf3ekJV5ByfTP\nedd9aGYh7CRqNuru4nwZGaMKvR5QnY2E+sXho/2BVd29yvRvJfg6GhE6UR1JsRDeBpzp7q91ab56\njbqXAe7uW3Zj3oyM0YJe19xbMf17Pe96sH3g9Rp1Z2RkNIFeF+6tmP69nnc9qD7weo26MzIymkOv\nC/dpwK1mVmH6m9lCqk3/wfY5DzdkH3hGxghCr/vcpzT6PtUeB9vnPNyQfeAZGSMLPS3cW0ErwdfR\niFYWwoyMjKFHFu5NwswWlXzONT/LyMjIGA7odfqBVnC3mW0f32Sfc0ZGxnBG1tybRPY5Z2RkjCRk\n4d4kss85IyNjJCEL94yMjIxRiOxzz8jIyBiFyMI9IyMjYxQiC/eMjIyMUYgs3DMyMjJGIbJwz8jI\nyBiF+H/hdps7P6oqqwAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"tags": []
}
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "KHXlMBwI4RFO",
"colab_type": "code",
"colab": {}
},
"source": [
"#redundant columns\n",
"train_df = data.drop(columns=['permno','date', 'PERMNO', 'return'])"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "UYZyjDCi-Cdm",
"colab_type": "code",
"colab": {}
},
"source": [
"for col in train_df.columns:\n",
" train_df[col]=train_df.groupby('DATE')[col].apply(lambda x:x.fillna(x.median()))"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "dWFbH0G1Npmh",
"colab_type": "text"
},
"source": [
"Some columns still have missing values. It is because those metrics were not recorded in early years.We filled thos missing value with 0."
]
},
{
"cell_type": "code",
"metadata": {
"id": "PlqZ1xulFRtI",
"colab_type": "code",
"colab": {}
},
"source": [
"#replace all missing samples with 0\n",
"train_df.fillna(value=0, inplace=True)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "eR9k2oTS1qhv",
"colab_type": "code",
"colab": {}
},
"source": [
"#we do not want to rank normalise the return or PERMNO\n",
"tmp_df=data[[\"PERMNO\",\"return\"]]"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "YASkyyi5PUe1",
"colab_type": "code",
"colab": {}
},
"source": [
"ranked_train_df = train_df.groupby('DATE').rank(method='average', ascending=True, pct=True)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "evs3I9eiPUe3",
"colab_type": "code",
"colab": {}
},
"source": [
"ranked_train_df = pd.concat([ranked_train_df, train_df['DATE']], axis=1)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "I67q7qHctWho",
"colab_type": "code",
"colab": {}
},
"source": [
"ranked_train_df = pd.concat([ranked_train_df, tmp_df], axis=1)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "ObmLNipKPUe5",
"colab_type": "text"
},
"source": [
"Finally, in this paper, the author rank-normalize the each cross-sectional characteristics. So we did the same. We make sure the sum of the second axis==1"
]
},
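{
"cell_type": "markdown",
"metadata": {
"id": "rankCheckMd",
"colab_type": "text"
},
"source": [
"A minimal sanity check (my addition, assuming `ranked_train_df` as built above): `rank(pct=True)` should place every characteristic in (0, 1] within each DATE group."
]
},
{
"cell_type": "code",
"metadata": {
"id": "rankCheckCode",
"colab_type": "code",
"colab": {}
},
"source": [
"#sanity check (my addition): pct ranks lie in (0, 1] within each cross-section\n",
"feature_cols = [c for c in ranked_train_df.columns if c not in ('DATE', 'PERMNO', 'return')]\n",
"grp_max = ranked_train_df.groupby('DATE')[feature_cols].max()\n",
"grp_min = ranked_train_df.groupby('DATE')[feature_cols].min()\n",
"assert (grp_max <= 1.0).all().all() and (grp_min > 0.0).all().all()"
],
"execution_count": 0,
"outputs": []
},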
{
"cell_type": "markdown",
"metadata": {
"id": "_OMn7ZxhPUe7",
"colab_type": "text"
},
"source": [
"# 3. Model"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "eguSKm-MPUe8",
"colab_type": "text"
},
"source": [
"We build the model based on the PyTorch v1.4. The implementation is simple, and contains two classes.\n",
"* `Perceptron`: a sequence of Neural Network\n",
"* `Antoencoder`: The main model.\n",
"\n",
"There are several modifications:\n",
"* I am a bit confused about eq(16). Because Z is rank-normalied, so I think x = r*Z is enough.\n",
"* We can flexibly change the structure of beta nueral network by changing the parameters.\n",
"* The paper did not mention the initializer, so I just used the default setting\n",
"\n",
"Both are subclasses of torch.nn.Module. Please refer to my comments for details."
]
},
{
"cell_type": "code",
"metadata": {
"id": "oYbx_zecPUe9",
"colab_type": "code",
"colab": {}
},
"source": [
"class beta(nn.Module):\n",
" # beta model\n",
"\n",
" def __init__(self, p, k):\n",
" super().__init__()\n",
" \n",
" self.hidden = nn.Linear(p, 32)\n",
" self.output = nn.Linear(32, k)\n",
" self.batchnorm = nn.BatchNorm1d(32)\n",
" self.relu = nn.ReLU(inplace=False)\n",
" \n",
" def forward(self, x):\n",
" x = self.hidden(x)\n",
" x = self.batchnorm(x)\n",
" x = self.relu(x)\n",
" x = self.output(x)\n",
" \n",
" return x\n",
"\n",
"class Linear(nn.Module):\n",
" def __init__(self, in_, out_):\n",
" \"\"\"\n",
" This is to build a linear network\n",
" :param in_:int, input dimensions\n",
" :param out_:int, output dimensions\n",
" \"\"\"\n",
" super(Linear, self).__init__()\n",
" self.linear = nn.Linear(in_, out_)\n",
" def forward(self, x):\n",
" x = self.linear(x)\n",
" return x\n",
"\n",
"class Autoencoder(nn.Module):\n",
"\n",
" def __init__(self, P, K):\n",
" \"\"\"\n",
" This is to build the autoencoder neural netowrk with a multi-layer beta network and a single layer factor network\n",
" :param P:int, # characteristics\n",
" :param K:int, # output factors\n",
" :param hidden_: list or tuple, # neurons for each hidden layer for the beta network(exclude the input and output)\n",
" :param dropout_p: the dropout rate\n",
" \"\"\"\n",
" nn.Module.__init__(self)\n",
" self.beta_net = beta(P, K) # for beta nn, input Z: N*P, output BETA: N*(K+1)\n",
" self.factor_net = Linear(P, K) # for factor nn, only one linear layer, input r: N*1, output f: K*1\n",
"\n",
" def forward(self, z, p_):\n",
" \"\"\"\n",
" :param z: N*P tensor\n",
" :param r: 1*N tensor\n",
" :return: P*1 tensor\n",
" \"\"\"\n",
" beta = self.beta_net(z)\n",
" factor = self.factor_net(p_)\n",
" return t.mm(factor, beta.t())"
],
"execution_count": 0,
"outputs": []
},
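{
"cell_type": "markdown",
"metadata": {
"id": "shapeDemoMd",
"colab_type": "text"
},
"source": [
"To make the tensor shapes concrete, here is a minimal sketch (my addition; the sizes `P`, `K`, `N` and the random inputs are illustrative only) that traces data through `Autoencoder.forward` the same way the training helpers below do:"
]
},
{
"cell_type": "code",
"metadata": {
"id": "shapeDemoCode",
"colab_type": "code",
"colab": {}
},
"source": [
"#a minimal shape walkthrough (illustrative only; random inputs)\n",
"P, K, N = 94, 6, 100\n",
"demo = Autoencoder(P, K)\n",
"z_demo = t.rand(N, P) # N stocks, P characteristics\n",
"r_demo = t.rand(1, N) # one month's cross-section of returns\n",
"zz_demo = t.mm(z_demo.t(), z_demo) # P*P input for the beta network\n",
"zr_demo = t.mm(r_demo, z_demo) # 1*P managed-portfolio returns\n",
"print(demo(zz_demo, zr_demo).shape) # torch.Size([1, 94])"
],
"execution_count": 0,
"outputs": []
},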
{
"cell_type": "markdown",
"metadata": {
"id": "cIgj6JycPUe_",
"colab_type": "text"
},
"source": [
"# 4. Experiment"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YKM3Aig0PUfA",
"colab_type": "text"
},
"source": [
"## 4.1 Training"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "15BNNnXBPUfA",
"colab_type": "text"
},
"source": [
"We trained the CA0 - CA3 mentioned in the paper. Below are functions for the training and evaluation process.\n",
"\n",
"I choose the 0-636th month as the training set, and the 636-677th as validation set. "
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "EdfdmNV5XnOW",
"colab_type": "text"
},
"source": [
"**Define** the evaluation function. "
]
},
{
"cell_type": "code",
"metadata": {
"id": "kwA7wes7Xgkp",
"colab_type": "code",
"colab": {}
},
"source": [
"def R_square(pred, target):\n",
" return 1 - np.sum(np.square(target-pred)) / np.sum(np.square(target))"
],
"execution_count": 0,
"outputs": []
},
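{
"cell_type": "markdown",
"metadata": {
"id": "rsqNoteMd",
"colab_type": "text"
},
"source": [
"`R_square` above computes the pooled $R^2 = 1 - \\frac{\\sum_i (r_i - \\hat{r}_i)^2}{\\sum_i r_i^2}$; note that the denominator uses raw (not demeaned) targets."
]
},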
{
"cell_type": "code",
"metadata": {
"id": "MVYuy4H1F-ti",
"colab_type": "code",
"colab": {}
},
"source": [
"class EarlyStopping:\n",
" \"\"\"Early stops the training if validation loss doesn't improve after a given patience.\"\"\"\n",
" def __init__(self, out_rsq_list, patience=7, verbose=False, delta=0):\n",
" \"\"\"\n",
" Args:\n",
" patience (int): How long to wait after last time validation loss improved.\n",
" Default: 7\n",
" verbose (bool): If True, prints a message for each validation loss improvement. \n",
" Default: False\n",
" delta (float): Minimum change in the monitored quantity to qualify as an improvement.\n",
" Default: 0\n",
" \"\"\"\n",
" self.patience = patience\n",
" self.verbose = verbose\n",
" self.counter = 0\n",
" self.best_score = None\n",
" self.early_stop = False\n",
" self.delta = delta\n",
" self.out_rsq_list = out_rsq_list\n",
"\n",
" def __call__(self, val_loss, model, optimizer, epoch, year_end):\n",
"\n",
" #we now assume that val_loss is the R2\n",
" score = val_loss\n",
"\n",
" if self.best_score is None:\n",
" self.best_score = score\n",
" self.save_checkpoint(model, optimizer, epoch, year_end, self.out_rsq_list)\n",
" elif score < self.best_score + self.delta:\n",
" self.counter += 1\n",
" print(f'EarlyStopping counter: {self.counter} out of {self.patience}')\n",
" if self.counter >= self.patience:\n",
" self.early_stop = True\n",
" else:\n",
" self.best_score = score\n",
" self.save_checkpoint(model, optimizer, epoch, year_end, self.out_rsq_list)\n",
" self.counter = 0\n",
"\n",
" def save_checkpoint(self, model, optimizer, epoch, year_end, out_rsq_list):\n",
" '''\n",
" Save the checkpoint\n",
" '''\n",
" print(\"Checkpointing ...\")\n",
" t.save({\n",
" 'epoch': epoch,\n",
" 'model_state_dict': model.state_dict(),\n",
" 'optimizer_state_dict': optimizer.state_dict(),\n",
" 'year_end':year_end,\n",
" 'out_rsq_list': out_rsq_list,\n",
" }, CKP_PATH)"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "mjwmFuJev9Cm",
"colab_type": "code",
"colab": {}
},
"source": [
"ca1 = Autoencoder(94, 6).cuda()\n",
"\n",
"#try the huber loss function\n",
"loss_fn = nn.MSELoss()\n",
"\n",
"decay = 1e-5 # weight decay for L1 (or L2)\n",
"\n",
"epoch = 0\n",
"\n",
"begin_yr = 1957\n",
"\n",
"train_begin_yr = 1975\n",
"\n",
"train_end_yr = 2004\n",
"\n",
"year_end = train_begin_yr\n",
"\n",
"#number of epochs to train per window\n",
"epoch_per_window = 100\n",
"\n",
"optimizer = optim.Adam(ca1.parameters(), lr=1e-2, betas=(0.9,0.999), eps=1e-8)\n",
"\n",
"#the out-of-sample R2 list\n",
"out_rsq_list=[]\n",
"\n",
"LOAD_FROM_CKP = False\n",
"\n",
"if os.path.exists(CKP_PATH):\n",
" LOAD_FROM_CKP = True\n",
" checkpoint = t.load(CKP_PATH)\n",
" ca1.load_state_dict(checkpoint['model_state_dict'])\n",
" optimizer.load_state_dict(checkpoint['optimizer_state_dict'])\n",
" epoch = checkpoint['epoch']\n",
" year_end = checkpoint['year_end']\n",
" out_rsq_list = checkpoint['out_rsq_list']"
],
"execution_count": 0,
"outputs": []
},
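{
"cell_type": "markdown",
"metadata": {
"id": "windowNoteMd",
"colab_type": "text"
},
"source": [
"The loop below uses an expanding window: for each `year_end` from 1975 to 2004, the model trains on 1957 up to `year_end`, cross-validates on the following 12 years, and evaluates out-of-sample on the year after that. For example, for `year_end = 1975` the training range is 19570000-19750000, the validation range is 19750000-19870000, and the out-of-sample range is 19870000-19880000, as the log output below confirms."
]
},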
{
"cell_type": "code",
"metadata": {
"id": "dJxOff-9bgjE",
"colab_type": "code",
"colab": {}
},
"source": [
"def df_train_helper(current_train_df, ca1, epoch, optimizer, loss_list, loss_R2):\n",
" #training\n",
" X = current_train_df.drop(['PERMNO', 'DATE', 'return'], axis=1).values\n",
" y = current_train_df['return'].values\n",
" X = t.from_numpy(X).float().cuda()\n",
" y = t.from_numpy(y).float().cuda()\n",
" optimizer.zero_grad()\n",
" # prepare the data\n",
" z = X\n",
" r = y[np.newaxis, ...]\n",
" zz = t.mm(t.transpose(z, 0, 1), z) #P*P Matrix\n",
" zr = t.mm(r,z) # 1 * P\n",
" # forward\n",
" r_pred = ca1(zz, zr)\n",
" loss = loss_fn(r_pred, zr)\n",
" for param in ca1.parameters():\n",
" loss += decay * t.sum(t.abs(param.float())) # torch has no integrated L1 regulizations, so I manually wrote this part\n",
" loss.backward()\n",
" optimizer.step()\n",
" loss_list.append(loss.item()*X.shape[0])\n",
" loss_R2.append( (r_pred.detach().cpu().numpy(), zr.detach().cpu().numpy() ))"
],
"execution_count": 0,
"outputs": []
},
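{
"cell_type": "markdown",
"metadata": {
"id": "managedPortNoteMd",
"colab_type": "text"
},
"source": [
"Note that each monthly batch is fitted on managed-portfolio returns rather than on individual stock returns: with $Z_t$ the $N \\times P$ characteristics and $r_t$ the $1 \\times N$ returns, the helper above forms $x_t = r_t Z_t$ (a $1 \\times P$ vector) and uses it as both the factor-network input and the regression target, while the beta network receives $Z_t' Z_t$."
]
},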
{
"cell_type": "code",
"metadata": {
"id": "xCfAprbegeCR",
"colab_type": "code",
"colab": {}
},
"source": [
"def df_test_helper(current_test_df, year_end, ca1, epoch, optimizer, loss_list, loss_R2):\n",
" #testing\n",
" X_test = current_test_df.drop(['PERMNO', 'DATE', 'return'], axis=1).values\n",
" y_test = current_test_df['return'].values\n",
" X_test = t.from_numpy(X_test).float().cuda()\n",
" y_test = t.from_numpy(y_test).float().cuda()\n",
" test_loss = 0\n",
" ca1 = ca1.eval()\n",
" with t.no_grad():\n",
" # prepare the data\n",
" z = X_test\n",
" r = y_test[np.newaxis, ...]\n",
" zz = t.mm(t.transpose(z, 0, 1), z) #P*P Matrix\n",
" zr = t.mm(r,z) # 1 * p\n",
" # forward\n",
" r_pred = ca1(zz, zr) #1 * p\n",
" test_loss += loss_fn(r_pred, zr).item()\n",
" loss_list.append(test_loss * X_test.shape[0])\n",
" loss_R2.append( (r_pred.detach().cpu().numpy(), zr.detach().cpu().numpy()) )"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "KMzYWXLpiGMe",
"colab_type": "code",
"colab": {}
},
"source": [
"def df_out_helper(current_out_df, year_end, ca1, epoch, out_rsq_list):\n",
" X_out = current_out_df.drop(['PERMNO', 'DATE', 'return'], axis=1).values\n",
" y_out = current_out_df['return'].values\n",
" X_out = t.from_numpy(X_out).float().cuda()\n",
" y_out = t.from_numpy(y_out).float().cuda()\n",
" with t.no_grad():\n",
" ca1 = ca1.eval()\n",
" # prepare the data\n",
" z = X_out\n",
" r = y_out[np.newaxis, ...]\n",
" zz = t.mm(t.transpose(z, 0, 1), z) #P*P Matrix\n",
" zr = t.mm(r,z) # 1 * P\n",
" # forward\n",
" r_pred = ca1(zz, zr)\n",
" out_rsq_list.append( (r_pred.detach().cpu().numpy(), zr.detach().cpu().numpy()) )"
],
"execution_count": 0,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"scrolled": true,
"id": "aO9LkKtkPUfj",
"colab_type": "code",
"outputId": "d5207c45-866a-4fd4-bc48-ebf8f1ae8843",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 1000
}
},
"source": [
"#Trainning dataset\n",
"for year_end in range(year_end, train_end_yr+1):\n",
" #reset the epoch parameter\n",
" if LOAD_FROM_CKP == False:\n",
" epoch = 0\n",
" else:\n",
" LOAD_FROM_CKP = False\n",
" earlystopping = EarlyStopping(out_rsq_list = out_rsq_list, patience=5, verbose=True)\n",
" for epoch in range(epoch, epoch_per_window):\n",
" begin_date = int(str(begin_yr)+\"0000\")\n",
" end_date = int(str(year_end)+\"0000\")\n",
" print(\"training range\" + \" \" + str(begin_date)+ \"-\" + str(end_date)) \n",
"\n",
" current_train_df = ranked_train_df[(ranked_train_df['DATE'] <= end_date) & (ranked_train_df['DATE'] >= begin_date)]\n",
" \n",
" train_loss_list = []\n",
" train_loss_r2 = []\n",
" #training starts here. we want one batch corresponds to one date\n",
" current_train_df.groupby('DATE').apply(lambda x: df_train_helper(x, ca1, epoch, optimizer, train_loss_list, train_loss_r2))\n",
" train_loss_val = np.mean(train_loss_list)\n",
" print('Train set: loss: {:f}'.format(train_loss_val))\n",
" #calculate the train sample R2\n",
" train_r_pred_arr = np.array([])\n",
" train_r_arr = np.array([])\n",
" for o in train_loss_r2:\n",
" train_r_pred_arr = np.append(train_r_pred_arr, o[0])\n",
" train_r_arr = np.append(train_r_arr, o[1]) \n",
" R2_train = R_square(train_r_pred_arr, train_r_arr)\n",
" print('Train sample R^2 for epoch: %d is %f' % (epoch, R2_train)) \n",
" print(\"\\n\")\n",
"\n",
" test_begin_year = year_end\n",
" test_end_year = test_begin_year + 12\n",
" test_begin_date = int(str(test_begin_year)+\"0000\")\n",
" test_end_date = int(str(test_end_year)+\"0000\")\n",
" print(\"cross-validate range\" + \" \" + str(test_begin_date)+ \"-\" + str(test_end_date))\n",
"\n",
" #Testing Dataset\n",
" current_test_df = ranked_train_df[(ranked_train_df['DATE']>=test_begin_date) & (ranked_train_df['DATE']<=test_end_date) ]\n",
" #the list that stores the test rsq\n",
" loss_list = []\n",
" loss_r2 = []\n",
" #testing starts here\n",
" current_test_df.groupby('DATE').apply(lambda x: df_test_helper(x, year_end, ca1, epoch, optimizer, loss_list, loss_r2))\n",
" test_loss_val = np.mean(loss_list)\n",
" print('Test set: loss: {:f}'.format(test_loss_val))\n",
" #calculate the validation R2\n",
" val_r_pred_arr = np.array([])\n",
" val_r_arr = np.array([])\n",
" for o in loss_r2:\n",
" val_r_pred_arr = np.append(val_r_pred_arr, o[0])\n",
" val_r_arr = np.append(val_r_arr, o[1]) \n",
" R2_val = R_square(val_r_pred_arr, val_r_arr)\n",
" print('Test sample R^2 for epoch: %d is %f' % (epoch, R2_val))\n",
" print(\"\\n\")\n",
" #get the higest R2 epoch\n",
" earlystopping(R2_val, ca1, optimizer, epoch, year_end)\n",
" if (earlystopping.early_stop == True):\n",
" '''\n",
" Stop epochs if early stop is set to true\n",
" '''\n",
" print(\"Early stopping at Epoch \"+str(epoch))\n",
" break\n",
" if os.path.exists(CKP_PATH):\n",
" '''\n",
" Re-Load the checkpoints (early stopped and saved)\n",
" '''\n",
" checkpoint = t.load(CKP_PATH)\n",
" ca1.load_state_dict(checkpoint['model_state_dict'])\n",
" optimizer.load_state_dict(checkpoint['optimizer_state_dict'])\n",
" epoch = checkpoint['epoch']\n",
" year_end = checkpoint['year_end']\n",
" out_rsq_list = checkpoint['out_rsq_list']\n",
"\n",
" out_begin_date = test_end_date\n",
" out_end_year = test_end_year + 1\n",
" out_end_date = int(str(out_end_year)+\"0000\")\n",
" print(\"out-of-sample range\" + \" \" + str(out_begin_date)+ \"-\" + str(out_end_date) +\"\\n\")\n",
" \n",
" #out of sample dataset\n",
" current_out_df = ranked_train_df[(ranked_train_df['DATE']>=out_begin_date) & ((ranked_train_df['DATE']<=(out_end_date+1)))]\n",
" current_out_df.groupby('DATE').apply(lambda x: df_out_helper(x, year_end, ca1, epoch, out_rsq_list))\n",
"\n",
" #save the checkpoint manually again to preserve the out_rsq_list\n",
" earlystopping.save_checkpoint(ca1, optimizer, epoch, year_end, out_rsq_list) "
],
"execution_count": 0,
"outputs": [
{
"output_type": "stream",
"text": [
"training range 19570000-19750000\n",
"Train set: loss: 18072481.316652\n",
"Train sample R^2 for epoch: 0 is 0.028363\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 928678236.668306\n",
"Test sample R^2 for epoch: 0 is -5.332274\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19750000\n",
"Train set: loss: 1580614195.619510\n",
"Train sample R^2 for epoch: 1 is -40.299233\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 75634279.730260\n",
"Test sample R^2 for epoch: 1 is 0.391629\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19750000\n",
"Train set: loss: 47474325.178581\n",
"Train sample R^2 for epoch: 2 is -2.384605\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 860678.086522\n",
"Test sample R^2 for epoch: 2 is 0.993226\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19750000\n",
"Train set: loss: 1260442.763660\n",
"Train sample R^2 for epoch: 3 is 0.865425\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 936861.099471\n",
"Test sample R^2 for epoch: 3 is 0.992602\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19750000\n",
"Train set: loss: 1290258.479395\n",
"Train sample R^2 for epoch: 4 is 0.961050\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 862593.137064\n",
"Test sample R^2 for epoch: 4 is 0.993210\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19750000\n",
"Train set: loss: 1816842.183911\n",
"Train sample R^2 for epoch: 5 is 0.771506\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 1067275.250937\n",
"Test sample R^2 for epoch: 5 is 0.991543\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19750000\n",
"Train set: loss: 149813.767721\n",
"Train sample R^2 for epoch: 6 is 0.994449\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 1154623.995841\n",
"Test sample R^2 for epoch: 6 is 0.990835\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19750000\n",
"Train set: loss: 149736.430845\n",
"Train sample R^2 for epoch: 7 is 0.994462\n",
"\n",
"\n",
"cross-validate range 19750000-19870000\n",
"Test set: loss: 1200795.560006\n",
"Test sample R^2 for epoch: 7 is 0.990461\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 7\n",
"out-of-sample range 19870000-19880000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19760000\n",
"Train set: loss: 1295030.939441\n",
"Train sample R^2 for epoch: 0 is 0.903085\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 2200186.223672\n",
"Test sample R^2 for epoch: 0 is 0.986685\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19760000\n",
"Train set: loss: 1668366.509905\n",
"Train sample R^2 for epoch: 1 is 0.963994\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 1053664.913373\n",
"Test sample R^2 for epoch: 1 is 0.993506\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19760000\n",
"Train set: loss: 4181908.075517\n",
"Train sample R^2 for epoch: 2 is 0.606628\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 3605232.203519\n",
"Test sample R^2 for epoch: 2 is 0.978266\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 354598.514407\n",
"Train sample R^2 for epoch: 3 is 0.991514\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 3684544.362951\n",
"Test sample R^2 for epoch: 3 is 0.977790\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 405494.397345\n",
"Train sample R^2 for epoch: 4 is 0.990422\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 2545791.569017\n",
"Test sample R^2 for epoch: 4 is 0.984616\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 427347.781831\n",
"Train sample R^2 for epoch: 5 is 0.989841\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 1192148.108052\n",
"Test sample R^2 for epoch: 5 is 0.992690\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 420508.096651\n",
"Train sample R^2 for epoch: 6 is 0.989735\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 1005262.840475\n",
"Test sample R^2 for epoch: 6 is 0.993691\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19760000\n",
"Train set: loss: 419246.889850\n",
"Train sample R^2 for epoch: 7 is 0.989471\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 2407455.130362\n",
"Test sample R^2 for epoch: 7 is 0.985085\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 462903.531770\n",
"Train sample R^2 for epoch: 8 is 0.988219\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 6343024.765234\n",
"Test sample R^2 for epoch: 8 is 0.961136\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 569125.089957\n",
"Train sample R^2 for epoch: 9 is 0.985375\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 16145542.716463\n",
"Test sample R^2 for epoch: 9 is 0.901656\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 740469.144640\n",
"Train sample R^2 for epoch: 10 is 0.980647\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 36203656.916734\n",
"Test sample R^2 for epoch: 10 is 0.780120\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19760000\n",
"Train set: loss: 880562.607650\n",
"Train sample R^2 for epoch: 11 is 0.976113\n",
"\n",
"\n",
"cross-validate range 19760000-19880000\n",
"Test set: loss: 59585969.219656\n",
"Test sample R^2 for epoch: 11 is 0.638542\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 11\n",
"out-of-sample range 19880000-19890000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19770000\n",
"Train set: loss: 845451.577501\n",
"Train sample R^2 for epoch: 0 is 0.982314\n",
"\n",
"\n",
"cross-validate range 19770000-19890000\n",
"Test set: loss: 18683370.793108\n",
"Test sample R^2 for epoch: 0 is 0.885675\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19770000\n",
"Train set: loss: 2043990.504365\n",
"Train sample R^2 for epoch: 1 is 0.956173\n",
"\n",
"\n",
"cross-validate range 19770000-19890000\n",
"Test set: loss: 60155939.401322\n",
"Test sample R^2 for epoch: 1 is 0.631588\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19770000\n",
"Train set: loss: 4035258.795535\n",
"Train sample R^2 for epoch: 2 is 0.903590\n",
"\n",
"\n",
"cross-validate range 19770000-19890000\n",
"Test set: loss: 116993177.388186\n",
"Test sample R^2 for epoch: 2 is 0.283189\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19770000\n",
"Train set: loss: 4877574.834118\n",
"Train sample R^2 for epoch: 3 is 0.862769\n",
"\n",
"\n",
"cross-validate range 19770000-19890000\n",
"Test set: loss: 129039560.630639\n",
"Test sample R^2 for epoch: 3 is 0.209315\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19770000\n",
"Train set: loss: 4607047.680025\n",
"Train sample R^2 for epoch: 4 is 0.860951\n",
"\n",
"\n",
"cross-validate range 19770000-19890000\n",
"Test set: loss: 115163053.932381\n",
"Test sample R^2 for epoch: 4 is 0.294357\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19770000\n",
"Train set: loss: 4233891.969892\n",
"Train sample R^2 for epoch: 5 is 0.872622\n",
"\n",
"\n",
"cross-validate range 19770000-19890000\n",
"Test set: loss: 104643471.876803\n",
"Test sample R^2 for epoch: 5 is 0.358819\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 19890000-19900000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19780000\n",
"Train set: loss: 3249967.383368\n",
"Train sample R^2 for epoch: 0 is 0.931192\n",
"\n",
"\n",
"cross-validate range 19780000-19900000\n",
"Test set: loss: 136167111.518131\n",
"Test sample R^2 for epoch: 0 is 0.178463\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19780000\n",
"Train set: loss: 6939584.874485\n",
"Train sample R^2 for epoch: 1 is 0.859800\n",
"\n",
"\n",
"cross-validate range 19780000-19900000\n",
"Test set: loss: 320903675.499297\n",
"Test sample R^2 for epoch: 1 is -0.935038\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19780000\n",
"Train set: loss: 7137576.351382\n",
"Train sample R^2 for epoch: 2 is 0.852422\n",
"\n",
"\n",
"cross-validate range 19780000-19900000\n",
"Test set: loss: 328095487.625639\n",
"Test sample R^2 for epoch: 2 is -0.978315\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19780000\n",
"Train set: loss: 6414553.363636\n",
"Train sample R^2 for epoch: 3 is 0.866569\n",
"\n",
"\n",
"cross-validate range 19780000-19900000\n",
"Test set: loss: 291999236.841419\n",
"Test sample R^2 for epoch: 3 is -0.760691\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19780000\n",
"Train set: loss: 5971070.649147\n",
"Train sample R^2 for epoch: 4 is 0.875914\n",
"\n",
"\n",
"cross-validate range 19780000-19900000\n",
"Test set: loss: 270800806.563032\n",
"Test sample R^2 for epoch: 4 is -0.632858\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19780000\n",
"Train set: loss: 5452607.592914\n",
"Train sample R^2 for epoch: 5 is 0.886497\n",
"\n",
"\n",
"cross-validate range 19780000-19900000\n",
"Test set: loss: 245711663.143625\n",
"Test sample R^2 for epoch: 5 is -0.481579\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 19900000-19910000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19790000\n",
"Train set: loss: 11011584.086177\n",
"Train sample R^2 for epoch: 0 is 0.794859\n",
"\n",
"\n",
"cross-validate range 19790000-19910000\n",
"Test set: loss: 556619672.222421\n",
"Test sample R^2 for epoch: 0 is -2.237417\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19790000\n",
"Train set: loss: 4689887.371885\n",
"Train sample R^2 for epoch: 1 is 0.690423\n",
"\n",
"\n",
"cross-validate range 19790000-19910000\n",
"Test set: loss: 4809420.427822\n",
"Test sample R^2 for epoch: 1 is 0.971816\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19790000\n",
"Train set: loss: 1699872.239801\n",
"Train sample R^2 for epoch: 2 is 0.966155\n",
"\n",
"\n",
"cross-validate range 19790000-19910000\n",
"Test set: loss: 78294376.817141\n",
"Test sample R^2 for epoch: 2 is 0.544196\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19790000\n",
"Train set: loss: 3686075.420577\n",
"Train sample R^2 for epoch: 3 is 0.904006\n",
"\n",
"\n",
"cross-validate range 19790000-19910000\n",
"Test set: loss: 194241991.482549\n",
"Test sample R^2 for epoch: 3 is -0.130142\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19790000\n",
"Train set: loss: 3444003.063188\n",
"Train sample R^2 for epoch: 4 is 0.871550\n",
"\n",
"\n",
"cross-validate range 19790000-19910000\n",
"Test set: loss: 131343472.671646\n",
"Test sample R^2 for epoch: 4 is 0.235663\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19790000\n",
"Train set: loss: 3164003.273211\n",
"Train sample R^2 for epoch: 5 is 0.897520\n",
"\n",
"\n",
"cross-validate range 19790000-19910000\n",
"Test set: loss: 142246215.794157\n",
"Test sample R^2 for epoch: 5 is 0.172263\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19790000\n",
"Train set: loss: 2810743.018543\n",
"Train sample R^2 for epoch: 6 is 0.901054\n",
"\n",
"\n",
"cross-validate range 19790000-19910000\n",
"Test set: loss: 114099880.668261\n",
"Test sample R^2 for epoch: 6 is 0.335967\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 6\n",
"out-of-sample range 19910000-19920000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19800000\n",
"Train set: loss: 2598668.684905\n",
"Train sample R^2 for epoch: 0 is 0.950874\n",
"\n",
"\n",
"cross-validate range 19800000-19920000\n",
"Test set: loss: 14940129.474352\n",
"Test sample R^2 for epoch: 0 is 0.918145\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19800000\n",
"Train set: loss: 5583759.568308\n",
"Train sample R^2 for epoch: 1 is 0.897083\n",
"\n",
"\n",
"cross-validate range 19800000-19920000\n",
"Test set: loss: 34141070.837267\n",
"Test sample R^2 for epoch: 1 is 0.812935\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19800000\n",
"Train set: loss: 3974247.331848\n",
"Train sample R^2 for epoch: 2 is 0.924365\n",
"\n",
"\n",
"cross-validate range 19800000-19920000\n",
"Test set: loss: 23388372.029937\n",
"Test sample R^2 for epoch: 2 is 0.871852\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19800000\n",
"Train set: loss: 4279606.643012\n",
"Train sample R^2 for epoch: 3 is 0.920198\n",
"\n",
"\n",
"cross-validate range 19800000-19920000\n",
"Test set: loss: 25307641.791886\n",
"Test sample R^2 for epoch: 3 is 0.861331\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19800000\n",
"Train set: loss: 3556724.168951\n",
"Train sample R^2 for epoch: 4 is 0.933225\n",
"\n",
"\n",
"cross-validate range 19800000-19920000\n",
"Test set: loss: 20622033.602481\n",
"Test sample R^2 for epoch: 4 is 0.887001\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19800000\n",
"Train set: loss: 3383213.141198\n",
"Train sample R^2 for epoch: 5 is 0.936898\n",
"\n",
"\n",
"cross-validate range 19800000-19920000\n",
"Test set: loss: 19410221.187400\n",
"Test sample R^2 for epoch: 5 is 0.893636\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 19920000-19930000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19810000\n",
"Train set: loss: 6981594.424473\n",
"Train sample R^2 for epoch: 0 is 0.881304\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1550299.495193\n",
"Test sample R^2 for epoch: 0 is 0.991661\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19810000\n",
"Train set: loss: 3313258.550643\n",
"Train sample R^2 for epoch: 1 is 0.937453\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1719346.320100\n",
"Test sample R^2 for epoch: 1 is 0.990764\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 4855362.982882\n",
"Train sample R^2 for epoch: 2 is 0.915172\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1614599.088929\n",
"Test sample R^2 for epoch: 2 is 0.991320\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 3413236.767102\n",
"Train sample R^2 for epoch: 3 is 0.937836\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1627860.911300\n",
"Test sample R^2 for epoch: 3 is 0.991249\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 3644106.719189\n",
"Train sample R^2 for epoch: 4 is 0.935358\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1566183.207289\n",
"Test sample R^2 for epoch: 4 is 0.991576\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 2940909.079100\n",
"Train sample R^2 for epoch: 5 is 0.946928\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1543300.121933\n",
"Test sample R^2 for epoch: 5 is 0.991697\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19810000\n",
"Train set: loss: 2775626.449499\n",
"Train sample R^2 for epoch: 6 is 0.950377\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1529817.280056\n",
"Test sample R^2 for epoch: 6 is 0.991768\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19810000\n",
"Train set: loss: 2344177.811287\n",
"Train sample R^2 for epoch: 7 is 0.957775\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1543728.567550\n",
"Test sample R^2 for epoch: 7 is 0.991693\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 2090406.272488\n",
"Train sample R^2 for epoch: 8 is 0.962456\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1591173.693660\n",
"Test sample R^2 for epoch: 8 is 0.991440\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 1777671.545768\n",
"Train sample R^2 for epoch: 9 is 0.967958\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1658142.708490\n",
"Test sample R^2 for epoch: 9 is 0.991083\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 1547146.611783\n",
"Train sample R^2 for epoch: 10 is 0.972148\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1746864.433671\n",
"Test sample R^2 for epoch: 10 is 0.990609\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19810000\n",
"Train set: loss: 1318322.132734\n",
"Train sample R^2 for epoch: 11 is 0.976231\n",
"\n",
"\n",
"cross-validate range 19810000-19930000\n",
"Test set: loss: 1828854.909081\n",
"Test sample R^2 for epoch: 11 is 0.990172\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 11\n",
"out-of-sample range 19930000-19940000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 2537049.634084\n",
"Train sample R^2 for epoch: 0 is 0.955569\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 31678718.412274\n",
"Test sample R^2 for epoch: 0 is 0.831228\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 1745471.985134\n",
"Train sample R^2 for epoch: 1 is 0.965228\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 21036802.382253\n",
"Test sample R^2 for epoch: 1 is 0.887904\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 1841707.336385\n",
"Train sample R^2 for epoch: 2 is 0.965779\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 21555701.580214\n",
"Test sample R^2 for epoch: 2 is 0.885140\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19820000\n",
"Train set: loss: 1329145.427320\n",
"Train sample R^2 for epoch: 3 is 0.973901\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 14793711.707879\n",
"Test sample R^2 for epoch: 3 is 0.921153\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 1323711.383127\n",
"Train sample R^2 for epoch: 4 is 0.975325\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 14156002.991750\n",
"Test sample R^2 for epoch: 4 is 0.924550\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 994419.554807\n",
"Train sample R^2 for epoch: 5 is 0.980720\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 9861404.419038\n",
"Test sample R^2 for epoch: 5 is 0.947422\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 985291.008426\n",
"Train sample R^2 for epoch: 6 is 0.981741\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 9297927.967342\n",
"Test sample R^2 for epoch: 6 is 0.950423\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 755556.542569\n",
"Train sample R^2 for epoch: 7 is 0.985526\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 6483121.312666\n",
"Test sample R^2 for epoch: 7 is 0.965415\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 776753.692400\n",
"Train sample R^2 for epoch: 8 is 0.985786\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 6341684.170603\n",
"Test sample R^2 for epoch: 8 is 0.966168\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 595400.412941\n",
"Train sample R^2 for epoch: 9 is 0.988708\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 4380205.831534\n",
"Test sample R^2 for epoch: 9 is 0.976616\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 661814.156304\n",
"Train sample R^2 for epoch: 10 is 0.988087\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 4703362.026458\n",
"Test sample R^2 for epoch: 10 is 0.974894\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19820000\n",
"Train set: loss: 492753.893670\n",
"Train sample R^2 for epoch: 11 is 0.990703\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 3177454.607563\n",
"Test sample R^2 for epoch: 11 is 0.983022\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 615289.181936\n",
"Train sample R^2 for epoch: 12 is 0.989105\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 3977964.461282\n",
"Test sample R^2 for epoch: 12 is 0.978758\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19820000\n",
"Train set: loss: 441698.047399\n",
"Train sample R^2 for epoch: 13 is 0.991665\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 2678439.898159\n",
"Test sample R^2 for epoch: 13 is 0.985681\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19820000\n",
"Train set: loss: 609434.029453\n",
"Train sample R^2 for epoch: 14 is 0.989314\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 4243658.379351\n",
"Test sample R^2 for epoch: 14 is 0.977343\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19820000\n",
"Train set: loss: 488678.865752\n",
"Train sample R^2 for epoch: 15 is 0.990816\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 3554919.100190\n",
"Test sample R^2 for epoch: 15 is 0.981012\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19820000\n",
"Train set: loss: 604661.813249\n",
"Train sample R^2 for epoch: 16 is 0.989191\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 6815220.027102\n",
"Test sample R^2 for epoch: 16 is 0.963648\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19820000\n",
"Train set: loss: 526881.494585\n",
"Train sample R^2 for epoch: 17 is 0.989727\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 7500874.986397\n",
"Test sample R^2 for epoch: 17 is 0.959997\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19820000\n",
"Train set: loss: 431960.718148\n",
"Train sample R^2 for epoch: 18 is 0.991405\n",
"\n",
"\n",
"cross-validate range 19820000-19940000\n",
"Test set: loss: 4915376.112515\n",
"Test sample R^2 for epoch: 18 is 0.973768\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 18\n",
"out-of-sample range 19940000-19950000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19830000\n",
"Train set: loss: 665611.176161\n",
"Train sample R^2 for epoch: 0 is 0.988856\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 4322131.039233\n",
"Test sample R^2 for epoch: 0 is 0.976788\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19830000\n",
"Train set: loss: 619247.819148\n",
"Train sample R^2 for epoch: 1 is 0.988920\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 3510362.165496\n",
"Test sample R^2 for epoch: 1 is 0.981134\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19830000\n",
"Train set: loss: 962239.289398\n",
"Train sample R^2 for epoch: 2 is 0.983703\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 7015832.746139\n",
"Test sample R^2 for epoch: 2 is 0.962374\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19830000\n",
"Train set: loss: 9016980.196427\n",
"Train sample R^2 for epoch: 3 is 0.857332\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 1644027.164099\n",
"Test sample R^2 for epoch: 3 is 0.991121\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19830000\n",
"Train set: loss: 5500365.974462\n",
"Train sample R^2 for epoch: 4 is 0.906884\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 3103833.173996\n",
"Test sample R^2 for epoch: 4 is 0.983293\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19830000\n",
"Train set: loss: 352168.561292\n",
"Train sample R^2 for epoch: 5 is 0.992468\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 1645674.521377\n",
"Test sample R^2 for epoch: 5 is 0.991112\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19830000\n",
"Train set: loss: 312329.534645\n",
"Train sample R^2 for epoch: 6 is 0.994660\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 1645434.648932\n",
"Test sample R^2 for epoch: 6 is 0.991113\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19830000\n",
"Train set: loss: 323899.886539\n",
"Train sample R^2 for epoch: 7 is 0.994478\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 1700255.066778\n",
"Test sample R^2 for epoch: 7 is 0.990820\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19830000\n",
"Train set: loss: 344575.487688\n",
"Train sample R^2 for epoch: 8 is 0.994135\n",
"\n",
"\n",
"cross-validate range 19830000-19950000\n",
"Test set: loss: 1766494.966943\n",
"Test sample R^2 for epoch: 8 is 0.990465\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 8\n",
"out-of-sample range 19950000-19960000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19840000\n",
"Train set: loss: 5414736.521070\n",
"Train sample R^2 for epoch: 0 is 0.911816\n",
"\n",
"\n",
"cross-validate range 19840000-19960000\n",
"Test set: loss: 2526470.739393\n",
"Test sample R^2 for epoch: 0 is 0.986355\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19840000\n",
"Train set: loss: 359007.528448\n",
"Train sample R^2 for epoch: 1 is 0.994093\n",
"\n",
"\n",
"cross-validate range 19840000-19960000\n",
"Test set: loss: 1681121.301134\n",
"Test sample R^2 for epoch: 1 is 0.990928\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19840000\n",
"Train set: loss: 355957.705760\n",
"Train sample R^2 for epoch: 2 is 0.994300\n",
"\n",
"\n",
"cross-validate range 19840000-19960000\n",
"Test set: loss: 1681842.254169\n",
"Test sample R^2 for epoch: 2 is 0.990924\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19840000\n",
"Train set: loss: 372673.987266\n",
"Train sample R^2 for epoch: 3 is 0.994049\n",
"\n",
"\n",
"cross-validate range 19840000-19960000\n",
"Test set: loss: 1752029.447518\n",
"Test sample R^2 for epoch: 3 is 0.990544\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19840000\n",
"Train set: loss: 410817.013610\n",
"Train sample R^2 for epoch: 4 is 0.993466\n",
"\n",
"\n",
"cross-validate range 19840000-19960000\n",
"Test set: loss: 1699786.519806\n",
"Test sample R^2 for epoch: 4 is 0.990827\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19840000\n",
"Train set: loss: 582260.123275\n",
"Train sample R^2 for epoch: 5 is 0.990834\n",
"\n",
"\n",
"cross-validate range 19840000-19960000\n",
"Test set: loss: 1721715.786633\n",
"Test sample R^2 for epoch: 5 is 0.990708\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19840000\n",
"Train set: loss: 545983.078294\n",
"Train sample R^2 for epoch: 6 is 0.991462\n",
"\n",
"\n",
"cross-validate range 19840000-19960000\n",
"Test set: loss: 1900757.191479\n",
"Test sample R^2 for epoch: 6 is 0.989740\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 6\n",
"out-of-sample range 19960000-19970000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19850000\n",
"Train set: loss: 383047.874442\n",
"Train sample R^2 for epoch: 0 is 0.994084\n",
"\n",
"\n",
"cross-validate range 19850000-19970000\n",
"Test set: loss: 2026773.431515\n",
"Test sample R^2 for epoch: 0 is 0.990101\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19850000\n",
"Train set: loss: 400575.677784\n",
"Train sample R^2 for epoch: 1 is 0.993824\n",
"\n",
"\n",
"cross-validate range 19850000-19970000\n",
"Test set: loss: 2029146.078012\n",
"Test sample R^2 for epoch: 1 is 0.990089\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19850000\n",
"Train set: loss: 438429.624677\n",
"Train sample R^2 for epoch: 2 is 0.993248\n",
"\n",
"\n",
"cross-validate range 19850000-19970000\n",
"Test set: loss: 2053886.848894\n",
"Test sample R^2 for epoch: 2 is 0.989964\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19850000\n",
"Train set: loss: 643046.742600\n",
"Train sample R^2 for epoch: 3 is 0.990166\n",
"\n",
"\n",
"cross-validate range 19850000-19970000\n",
"Test set: loss: 2146499.342873\n",
"Test sample R^2 for epoch: 3 is 0.989504\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19850000\n",
"Train set: loss: 508913.403089\n",
"Train sample R^2 for epoch: 4 is 0.992227\n",
"\n",
"\n",
"cross-validate range 19850000-19970000\n",
"Test set: loss: 2234670.393931\n",
"Test sample R^2 for epoch: 4 is 0.989059\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19850000\n",
"Train set: loss: 501627.004762\n",
"Train sample R^2 for epoch: 5 is 0.992287\n",
"\n",
"\n",
"cross-validate range 19850000-19970000\n",
"Test set: loss: 2061633.719990\n",
"Test sample R^2 for epoch: 5 is 0.989925\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 19970000-19980000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19860000\n",
"Train set: loss: 422400.948576\n",
"Train sample R^2 for epoch: 0 is 0.993698\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 2589663.320649\n",
"Test sample R^2 for epoch: 0 is 0.988898\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19860000\n",
"Train set: loss: 462322.159085\n",
"Train sample R^2 for epoch: 1 is 0.993096\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 2560769.963514\n",
"Test sample R^2 for epoch: 1 is 0.989031\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19860000\n",
"Train set: loss: 711403.859936\n",
"Train sample R^2 for epoch: 2 is 0.989466\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 2558223.248700\n",
"Test sample R^2 for epoch: 2 is 0.989042\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19860000\n",
"Train set: loss: 603977.699530\n",
"Train sample R^2 for epoch: 3 is 0.990989\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 2555619.244408\n",
"Test sample R^2 for epoch: 3 is 0.989054\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19860000\n",
"Train set: loss: 467260.388422\n",
"Train sample R^2 for epoch: 4 is 0.992960\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 2569384.251913\n",
"Test sample R^2 for epoch: 4 is 0.988990\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19860000\n",
"Train set: loss: 471074.221880\n",
"Train sample R^2 for epoch: 5 is 0.993046\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 3278990.678134\n",
"Test sample R^2 for epoch: 5 is 0.985817\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19860000\n",
"Train set: loss: 11567620.560518\n",
"Train sample R^2 for epoch: 6 is 0.840121\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 2787631.193097\n",
"Test sample R^2 for epoch: 6 is 0.988017\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19860000\n",
"Train set: loss: 73156305.909912\n",
"Train sample R^2 for epoch: 7 is -0.097275\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 11724762.347860\n",
"Test sample R^2 for epoch: 7 is 0.948350\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19860000\n",
"Train set: loss: 417830.052768\n",
"Train sample R^2 for epoch: 8 is 0.993638\n",
"\n",
"\n",
"cross-validate range 19860000-19980000\n",
"Test set: loss: 2671878.620688\n",
"Test sample R^2 for epoch: 8 is 0.988542\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 8\n",
"out-of-sample range 19980000-19990000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19870000\n",
"Train set: loss: 474857.188629\n",
"Train sample R^2 for epoch: 0 is 0.992990\n",
"\n",
"\n",
"cross-validate range 19870000-19990000\n",
"Test set: loss: 3105418.228833\n",
"Test sample R^2 for epoch: 0 is 0.989539\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19870000\n",
"Train set: loss: 521866.874709\n",
"Train sample R^2 for epoch: 1 is 0.992503\n",
"\n",
"\n",
"cross-validate range 19870000-19990000\n",
"Test set: loss: 3458181.297692\n",
"Test sample R^2 for epoch: 1 is 0.988342\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19870000\n",
"Train set: loss: 5013160.151143\n",
"Train sample R^2 for epoch: 2 is 0.925344\n",
"\n",
"\n",
"cross-validate range 19870000-19990000\n",
"Test set: loss: 3116985.446757\n",
"Test sample R^2 for epoch: 2 is 0.989495\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19870000\n",
"Train set: loss: 12512561.219938\n",
"Train sample R^2 for epoch: 3 is 0.817187\n",
"\n",
"\n",
"cross-validate range 19870000-19990000\n",
"Test set: loss: 41635305.164954\n",
"Test sample R^2 for epoch: 3 is 0.857310\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19870000\n",
"Train set: loss: 895528.277543\n",
"Train sample R^2 for epoch: 4 is 0.986786\n",
"\n",
"\n",
"cross-validate range 19870000-19990000\n",
"Test set: loss: 5070422.170510\n",
"Test sample R^2 for epoch: 4 is 0.982774\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19870000\n",
"Train set: loss: 453664.034994\n",
"Train sample R^2 for epoch: 5 is 0.993330\n",
"\n",
"\n",
"cross-validate range 19870000-19990000\n",
"Test set: loss: 3784483.986015\n",
"Test sample R^2 for epoch: 5 is 0.987227\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 19990000-20000000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19880000\n",
"Train set: loss: 1030861.947816\n",
"Train sample R^2 for epoch: 0 is 0.989637\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 236231478.241738\n",
"Test sample R^2 for epoch: 0 is 0.100025\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19880000\n",
"Train set: loss: 4804575.849872\n",
"Train sample R^2 for epoch: 1 is 0.953284\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 258593898.241567\n",
"Test sample R^2 for epoch: 1 is 0.014716\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19880000\n",
"Train set: loss: 15840435.974263\n",
"Train sample R^2 for epoch: 2 is 0.713744\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 4148733.035815\n",
"Test sample R^2 for epoch: 2 is 0.984250\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19880000\n",
"Train set: loss: 467374.526373\n",
"Train sample R^2 for epoch: 3 is 0.994603\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 4271262.203129\n",
"Test sample R^2 for epoch: 3 is 0.983785\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19880000\n",
"Train set: loss: 478929.464607\n",
"Train sample R^2 for epoch: 4 is 0.994442\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 4637358.287218\n",
"Test sample R^2 for epoch: 4 is 0.982392\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19880000\n",
"Train set: loss: 491931.815742\n",
"Train sample R^2 for epoch: 5 is 0.994237\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 5274424.581101\n",
"Test sample R^2 for epoch: 5 is 0.979968\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19880000\n",
"Train set: loss: 496663.897124\n",
"Train sample R^2 for epoch: 6 is 0.994105\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 5203963.276953\n",
"Test sample R^2 for epoch: 6 is 0.980238\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19880000\n",
"Train set: loss: 493723.278694\n",
"Train sample R^2 for epoch: 7 is 0.994173\n",
"\n",
"\n",
"cross-validate range 19880000-20000000\n",
"Test set: loss: 5986416.152587\n",
"Test sample R^2 for epoch: 7 is 0.977262\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 7\n",
"out-of-sample range 20000000-20010000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19890000\n",
"Train set: loss: 484677.746308\n",
"Train sample R^2 for epoch: 0 is 0.994494\n",
"\n",
"\n",
"cross-validate range 19890000-20010000\n",
"Test set: loss: 7653744.857355\n",
"Test sample R^2 for epoch: 0 is 0.976026\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19890000\n",
"Train set: loss: 499553.142932\n",
"Train sample R^2 for epoch: 1 is 0.994333\n",
"\n",
"\n",
"cross-validate range 19890000-20010000\n",
"Test set: loss: 7637717.447577\n",
"Test sample R^2 for epoch: 1 is 0.976076\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19890000\n",
"Train set: loss: 516405.513645\n",
"Train sample R^2 for epoch: 2 is 0.994153\n",
"\n",
"\n",
"cross-validate range 19890000-20010000\n",
"Test set: loss: 7691285.598554\n",
"Test sample R^2 for epoch: 2 is 0.975906\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19890000\n",
"Train set: loss: 509427.791769\n",
"Train sample R^2 for epoch: 3 is 0.994205\n",
"\n",
"\n",
"cross-validate range 19890000-20010000\n",
"Test set: loss: 7811010.826746\n",
"Test sample R^2 for epoch: 3 is 0.975528\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19890000\n",
"Train set: loss: 520780.904026\n",
"Train sample R^2 for epoch: 4 is 0.994136\n",
"\n",
"\n",
"cross-validate range 19890000-20010000\n",
"Test set: loss: 8849861.818269\n",
"Test sample R^2 for epoch: 4 is 0.972250\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19890000\n",
"Train set: loss: 503013.594770\n",
"Train sample R^2 for epoch: 5 is 0.994336\n",
"\n",
"\n",
"cross-validate range 19890000-20010000\n",
"Test set: loss: 9171556.420453\n",
"Test sample R^2 for epoch: 5 is 0.971233\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19890000\n",
"Train set: loss: 1838763.190242\n",
"Train sample R^2 for epoch: 6 is 0.982587\n",
"\n",
"\n",
"cross-validate range 19890000-20010000\n",
"Test set: loss: 158102773.526042\n",
"Test sample R^2 for epoch: 6 is 0.502378\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 6\n",
"out-of-sample range 20010000-20020000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19900000\n",
"Train set: loss: 522218.870303\n",
"Train sample R^2 for epoch: 0 is 0.994056\n",
"\n",
"\n",
"cross-validate range 19900000-20020000\n",
"Test set: loss: 10754057.718745\n",
"Test sample R^2 for epoch: 0 is 0.973363\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19900000\n",
"Train set: loss: 514777.850728\n",
"Train sample R^2 for epoch: 1 is 0.994120\n",
"\n",
"\n",
"cross-validate range 19900000-20020000\n",
"Test set: loss: 10587780.252705\n",
"Test sample R^2 for epoch: 1 is 0.973776\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19900000\n",
"Train set: loss: 527827.990434\n",
"Train sample R^2 for epoch: 2 is 0.994032\n",
"\n",
"\n",
"cross-validate range 19900000-20020000\n",
"Test set: loss: 10686551.690445\n",
"Test sample R^2 for epoch: 2 is 0.973532\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19900000\n",
"Train set: loss: 539344.045405\n",
"Train sample R^2 for epoch: 3 is 0.993970\n",
"\n",
"\n",
"cross-validate range 19900000-20020000\n",
"Test set: loss: 11603846.094716\n",
"Test sample R^2 for epoch: 3 is 0.971258\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19900000\n",
"Train set: loss: 1025956.975837\n",
"Train sample R^2 for epoch: 4 is 0.988203\n",
"\n",
"\n",
"cross-validate range 19900000-20020000\n",
"Test set: loss: 11820730.485133\n",
"Test sample R^2 for epoch: 4 is 0.970717\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19900000\n",
"Train set: loss: 42515831.818413\n",
"Train sample R^2 for epoch: 5 is 0.587481\n",
"\n",
"\n",
"cross-validate range 19900000-20020000\n",
"Test set: loss: 11104266.609157\n",
"Test sample R^2 for epoch: 5 is 0.972450\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19900000\n",
"Train set: loss: 4487181.669102\n",
"Train sample R^2 for epoch: 6 is 0.932104\n",
"\n",
"\n",
"cross-validate range 19900000-20020000\n",
"Test set: loss: 10722782.639907\n",
"Test sample R^2 for epoch: 6 is 0.973438\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 6\n",
"out-of-sample range 20020000-20030000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19910000\n",
"Train set: loss: 559586.763145\n",
"Train sample R^2 for epoch: 0 is 0.993949\n",
"\n",
"\n",
"cross-validate range 19910000-20030000\n",
"Test set: loss: 11430123.063938\n",
"Test sample R^2 for epoch: 0 is 0.972408\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19910000\n",
"Train set: loss: 608679.507016\n",
"Train sample R^2 for epoch: 1 is 0.993555\n",
"\n",
"\n",
"cross-validate range 19910000-20030000\n",
"Test set: loss: 12567908.135118\n",
"Test sample R^2 for epoch: 1 is 0.969675\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19910000\n",
"Train set: loss: 3044621.824354\n",
"Train sample R^2 for epoch: 2 is 0.970056\n",
"\n",
"\n",
"cross-validate range 19910000-20030000\n",
"Test set: loss: 68422845.883516\n",
"Test sample R^2 for epoch: 2 is 0.835534\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19910000\n",
"Train set: loss: 149125092.106775\n",
"Train sample R^2 for epoch: 3 is -0.398842\n",
"\n",
"\n",
"cross-validate range 19910000-20030000\n",
"Test set: loss: 7535696331.261152\n",
"Test sample R^2 for epoch: 3 is -17.109599\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19910000\n",
"Train set: loss: 37845153.443947\n",
"Train sample R^2 for epoch: 4 is 0.501659\n",
"\n",
"\n",
"cross-validate range 19910000-20030000\n",
"Test set: loss: 11555674.397147\n",
"Test sample R^2 for epoch: 4 is 0.972102\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19910000\n",
"Train set: loss: 517774.476436\n",
"Train sample R^2 for epoch: 5 is 0.994361\n",
"\n",
"\n",
"cross-validate range 19910000-20030000\n",
"Test set: loss: 11545107.991163\n",
"Test sample R^2 for epoch: 5 is 0.972127\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 20030000-20040000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19920000\n",
"Train set: loss: 689936.278206\n",
"Train sample R^2 for epoch: 0 is 0.993048\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 16630121.525873\n",
"Test sample R^2 for epoch: 0 is 0.960594\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19920000\n",
"Train set: loss: 3867902.854208\n",
"Train sample R^2 for epoch: 1 is 0.966781\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 31644457.290953\n",
"Test sample R^2 for epoch: 1 is 0.925206\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19920000\n",
"Train set: loss: 20887075.446927\n",
"Train sample R^2 for epoch: 2 is 0.825217\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 13465944.165668\n",
"Test sample R^2 for epoch: 2 is 0.968030\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19920000\n",
"Train set: loss: 5981515.462486\n",
"Train sample R^2 for epoch: 3 is 0.935065\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 47050404.221780\n",
"Test sample R^2 for epoch: 3 is 0.888965\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19920000\n",
"Train set: loss: 579241.319875\n",
"Train sample R^2 for epoch: 4 is 0.993934\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 12297991.747287\n",
"Test sample R^2 for epoch: 4 is 0.970800\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19920000\n",
"Train set: loss: 576360.581786\n",
"Train sample R^2 for epoch: 5 is 0.994026\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 11984228.508457\n",
"Test sample R^2 for epoch: 5 is 0.971546\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19920000\n",
"Train set: loss: 583641.675381\n",
"Train sample R^2 for epoch: 6 is 0.993959\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 12669851.350967\n",
"Test sample R^2 for epoch: 6 is 0.969919\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19920000\n",
"Train set: loss: 650145.490607\n",
"Train sample R^2 for epoch: 7 is 0.993394\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 14686856.232098\n",
"Test sample R^2 for epoch: 7 is 0.965202\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19920000\n",
"Train set: loss: 1531122.610155\n",
"Train sample R^2 for epoch: 8 is 0.985660\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 63467451.811146\n",
"Test sample R^2 for epoch: 8 is 0.849908\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19920000\n",
"Train set: loss: 43979645.167966\n",
"Train sample R^2 for epoch: 9 is 0.612461\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 1132968362.450678\n",
"Test sample R^2 for epoch: 9 is -1.676703\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19920000\n",
"Train set: loss: 1128445.481096\n",
"Train sample R^2 for epoch: 10 is 0.977541\n",
"\n",
"\n",
"cross-validate range 19920000-20040000\n",
"Test set: loss: 12371906.341271\n",
"Test sample R^2 for epoch: 10 is 0.970668\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 10\n",
"out-of-sample range 20040000-20050000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19930000\n",
"Train set: loss: 759296.681632\n",
"Train sample R^2 for epoch: 0 is 0.992681\n",
"\n",
"\n",
"cross-validate range 19930000-20050000\n",
"Test set: loss: 11537761.157308\n",
"Test sample R^2 for epoch: 0 is 0.972110\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19930000\n",
"Train set: loss: 750682.175204\n",
"Train sample R^2 for epoch: 1 is 0.992415\n",
"\n",
"\n",
"cross-validate range 19930000-20050000\n",
"Test set: loss: 11464215.874684\n",
"Test sample R^2 for epoch: 1 is 0.972281\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19930000\n",
"Train set: loss: 2311220.627427\n",
"Train sample R^2 for epoch: 2 is 0.979263\n",
"\n",
"\n",
"cross-validate range 19930000-20050000\n",
"Test set: loss: 16053282.547824\n",
"Test sample R^2 for epoch: 2 is 0.961254\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19930000\n",
"Train set: loss: 41428215.054807\n",
"Train sample R^2 for epoch: 3 is 0.655907\n",
"\n",
"\n",
"cross-validate range 19930000-20050000\n",
"Test set: loss: 300210111.132189\n",
"Test sample R^2 for epoch: 3 is 0.279847\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19930000\n",
"Train set: loss: 4646329.205815\n",
"Train sample R^2 for epoch: 4 is 0.948242\n",
"\n",
"\n",
"cross-validate range 19930000-20050000\n",
"Test set: loss: 204211094.525989\n",
"Test sample R^2 for epoch: 4 is 0.509243\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19930000\n",
"Train set: loss: 1064543.182196\n",
"Train sample R^2 for epoch: 5 is 0.989165\n",
"\n",
"\n",
"cross-validate range 19930000-20050000\n",
"Test set: loss: 19089541.711975\n",
"Test sample R^2 for epoch: 5 is 0.953981\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19930000\n",
"Train set: loss: 1424441.954625\n",
"Train sample R^2 for epoch: 6 is 0.986798\n",
"\n",
"\n",
"cross-validate range 19930000-20050000\n",
"Test set: loss: 85267252.809569\n",
"Test sample R^2 for epoch: 6 is 0.794872\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 6\n",
"out-of-sample range 20050000-20060000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19940000\n",
"Train set: loss: 2824964.597798\n",
"Train sample R^2 for epoch: 0 is 0.975063\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 68583134.085480\n",
"Test sample R^2 for epoch: 0 is 0.835006\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19940000\n",
"Train set: loss: 49399411.495547\n",
"Train sample R^2 for epoch: 1 is 0.432191\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11619884.183319\n",
"Test sample R^2 for epoch: 1 is 0.971942\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19940000\n",
"Train set: loss: 733922.551575\n",
"Train sample R^2 for epoch: 2 is 0.992866\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11522432.502702\n",
"Test sample R^2 for epoch: 2 is 0.972178\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19940000\n",
"Train set: loss: 743024.834230\n",
"Train sample R^2 for epoch: 3 is 0.992778\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11453969.236815\n",
"Test sample R^2 for epoch: 3 is 0.972346\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19940000\n",
"Train set: loss: 752846.937399\n",
"Train sample R^2 for epoch: 4 is 0.992663\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11776104.760279\n",
"Test sample R^2 for epoch: 4 is 0.971565\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19940000\n",
"Train set: loss: 150844953.533677\n",
"Train sample R^2 for epoch: 5 is -0.727463\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11565007.781848\n",
"Test sample R^2 for epoch: 5 is 0.972075\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19940000\n",
"Train set: loss: 745228.747443\n",
"Train sample R^2 for epoch: 6 is 0.992684\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11495476.963437\n",
"Test sample R^2 for epoch: 6 is 0.972242\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19940000\n",
"Train set: loss: 756822.835966\n",
"Train sample R^2 for epoch: 7 is 0.992658\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11800850.156441\n",
"Test sample R^2 for epoch: 7 is 0.971505\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19940000\n",
"Train set: loss: 751690.267192\n",
"Train sample R^2 for epoch: 8 is 0.992673\n",
"\n",
"\n",
"cross-validate range 19940000-20060000\n",
"Test set: loss: 11993424.205182\n",
"Test sample R^2 for epoch: 8 is 0.971039\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 8\n",
"out-of-sample range 20060000-20070000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19950000\n",
"Train set: loss: 769811.308583\n",
"Train sample R^2 for epoch: 0 is 0.992540\n",
"\n",
"\n",
"cross-validate range 19950000-20070000\n",
"Test set: loss: 11390834.640749\n",
"Test sample R^2 for epoch: 0 is 0.972456\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19950000\n",
"Train set: loss: 502326050.691256\n",
"Train sample R^2 for epoch: 1 is -4.861793\n",
"\n",
"\n",
"cross-validate range 19950000-20070000\n",
"Test set: loss: 11535177.010251\n",
"Test sample R^2 for epoch: 1 is 0.972108\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19950000\n",
"Train set: loss: 1899830.755135\n",
"Train sample R^2 for epoch: 2 is 0.947051\n",
"\n",
"\n",
"cross-validate range 19950000-20070000\n",
"Test set: loss: 11571827.689482\n",
"Test sample R^2 for epoch: 2 is 0.972016\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19950000\n",
"Train set: loss: 869999.538308\n",
"Train sample R^2 for epoch: 3 is 0.987016\n",
"\n",
"\n",
"cross-validate range 19950000-20070000\n",
"Test set: loss: 11577560.467463\n",
"Test sample R^2 for epoch: 3 is 0.972002\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19950000\n",
"Train set: loss: 785144.613212\n",
"Train sample R^2 for epoch: 4 is 0.991283\n",
"\n",
"\n",
"cross-validate range 19950000-20070000\n",
"Test set: loss: 11550346.736665\n",
"Test sample R^2 for epoch: 4 is 0.972070\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19950000\n",
"Train set: loss: 14494272.651251\n",
"Train sample R^2 for epoch: 5 is 0.617297\n",
"\n",
"\n",
"cross-validate range 19950000-20070000\n",
"Test set: loss: 11747488.903618\n",
"Test sample R^2 for epoch: 5 is 0.971596\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 20070000-20080000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19960000\n",
"Train set: loss: 489440053.022974\n",
"Train sample R^2 for epoch: 0 is -4.733220\n",
"\n",
"\n",
"cross-validate range 19960000-20080000\n",
"Test set: loss: 11449530.993289\n",
"Test sample R^2 for epoch: 0 is 0.972101\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19960000\n",
"Train set: loss: 7455084.398233\n",
"Train sample R^2 for epoch: 1 is 0.790779\n",
"\n",
"\n",
"cross-validate range 19960000-20080000\n",
"Test set: loss: 11491787.441738\n",
"Test sample R^2 for epoch: 1 is 0.971998\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19960000\n",
"Train set: loss: 1031816.928771\n",
"Train sample R^2 for epoch: 2 is 0.979572\n",
"\n",
"\n",
"cross-validate range 19960000-20080000\n",
"Test set: loss: 11493727.394849\n",
"Test sample R^2 for epoch: 2 is 0.971992\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19960000\n",
"Train set: loss: 826363.178464\n",
"Train sample R^2 for epoch: 3 is 0.990577\n",
"\n",
"\n",
"cross-validate range 19960000-20080000\n",
"Test set: loss: 11490986.499350\n",
"Test sample R^2 for epoch: 3 is 0.972000\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19960000\n",
"Train set: loss: 6141911.979843\n",
"Train sample R^2 for epoch: 4 is 0.931765\n",
"\n",
"\n",
"cross-validate range 19960000-20080000\n",
"Test set: loss: 11500334.823399\n",
"Test sample R^2 for epoch: 4 is 0.971978\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19960000\n",
"Train set: loss: 1007696.111793\n",
"Train sample R^2 for epoch: 5 is 0.980932\n",
"\n",
"\n",
"cross-validate range 19960000-20080000\n",
"Test set: loss: 11496530.312631\n",
"Test sample R^2 for epoch: 5 is 0.971988\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 5\n",
"out-of-sample range 20080000-20090000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19970000\n",
"Train set: loss: 7401144.685305\n",
"Train sample R^2 for epoch: 0 is 0.799711\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11254094.326828\n",
"Test sample R^2 for epoch: 0 is 0.974325\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19970000\n",
"Train set: loss: 1137929.699072\n",
"Train sample R^2 for epoch: 1 is 0.979740\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11255635.955855\n",
"Test sample R^2 for epoch: 1 is 0.974321\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 950639.812950\n",
"Train sample R^2 for epoch: 2 is 0.989531\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11247500.120720\n",
"Test sample R^2 for epoch: 2 is 0.974339\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19970000\n",
"Train set: loss: 21071743.055229\n",
"Train sample R^2 for epoch: 3 is 0.452381\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11282925.414856\n",
"Test sample R^2 for epoch: 3 is 0.974252\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 1060356.081994\n",
"Train sample R^2 for epoch: 4 is 0.985233\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11276072.741017\n",
"Test sample R^2 for epoch: 4 is 0.974267\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 969144.745387\n",
"Train sample R^2 for epoch: 5 is 0.989361\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11261897.676145\n",
"Test sample R^2 for epoch: 5 is 0.974299\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 963962.435316\n",
"Train sample R^2 for epoch: 6 is 0.989845\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11235913.684353\n",
"Test sample R^2 for epoch: 6 is 0.974359\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19970000\n",
"Train set: loss: 929300.419805\n",
"Train sample R^2 for epoch: 7 is 0.991047\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11347632.381363\n",
"Test sample R^2 for epoch: 7 is 0.974093\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 917852.512693\n",
"Train sample R^2 for epoch: 8 is 0.991520\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11214915.342675\n",
"Test sample R^2 for epoch: 8 is 0.974406\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19970000\n",
"Train set: loss: 1259955.306860\n",
"Train sample R^2 for epoch: 9 is 0.988820\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 12385407.124365\n",
"Test sample R^2 for epoch: 9 is 0.971706\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 955076.899598\n",
"Train sample R^2 for epoch: 10 is 0.990875\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11234040.078968\n",
"Test sample R^2 for epoch: 10 is 0.974371\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 4516540.939976\n",
"Train sample R^2 for epoch: 11 is 0.962621\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 11966958.034183\n",
"Test sample R^2 for epoch: 11 is 0.972622\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 1948186.220063\n",
"Train sample R^2 for epoch: 12 is 0.969061\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 13160608.260418\n",
"Test sample R^2 for epoch: 12 is 0.969822\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19970000\n",
"Train set: loss: 1393263.828842\n",
"Train sample R^2 for epoch: 13 is 0.987991\n",
"\n",
"\n",
"cross-validate range 19970000-20090000\n",
"Test set: loss: 17478764.366082\n",
"Test sample R^2 for epoch: 13 is 0.959734\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 13\n",
"out-of-sample range 20090000-20100000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19980000\n",
"Train set: loss: 1420576.850392\n",
"Train sample R^2 for epoch: 0 is 0.988357\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 11358211.954149\n",
"Test sample R^2 for epoch: 0 is 0.974688\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19980000\n",
"Train set: loss: 1094668.995753\n",
"Train sample R^2 for epoch: 1 is 0.990459\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 10973537.627096\n",
"Test sample R^2 for epoch: 1 is 0.975557\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19980000\n",
"Train set: loss: 8845060.312128\n",
"Train sample R^2 for epoch: 2 is 0.930644\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 12885143.431101\n",
"Test sample R^2 for epoch: 2 is 0.971053\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19980000\n",
"Train set: loss: 2679886.934472\n",
"Train sample R^2 for epoch: 3 is 0.965729\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 11089154.788952\n",
"Test sample R^2 for epoch: 3 is 0.975301\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19980000\n",
"Train set: loss: 1046736.888628\n",
"Train sample R^2 for epoch: 4 is 0.991402\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 10885869.305163\n",
"Test sample R^2 for epoch: 4 is 0.975755\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19980000\n",
"Train set: loss: 1076748.481969\n",
"Train sample R^2 for epoch: 5 is 0.991280\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 10879786.492234\n",
"Test sample R^2 for epoch: 5 is 0.975755\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19980000\n",
"Train set: loss: 34414601.551947\n",
"Train sample R^2 for epoch: 6 is 0.751649\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 26951868.846899\n",
"Test sample R^2 for epoch: 6 is 0.938871\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19980000\n",
"Train set: loss: 1348833.732044\n",
"Train sample R^2 for epoch: 7 is 0.988810\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 11167698.353535\n",
"Test sample R^2 for epoch: 7 is 0.975124\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19980000\n",
"Train set: loss: 1391992.004633\n",
"Train sample R^2 for epoch: 8 is 0.988308\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 11164102.564008\n",
"Test sample R^2 for epoch: 8 is 0.975136\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19980000\n",
"Train set: loss: 2304902.896497\n",
"Train sample R^2 for epoch: 9 is 0.981448\n",
"\n",
"\n",
"cross-validate range 19980000-20100000\n",
"Test set: loss: 13029142.408294\n",
"Test sample R^2 for epoch: 9 is 0.970885\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 9\n",
"out-of-sample range 20100000-20110000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19990000\n",
"Train set: loss: 1224634.668966\n",
"Train sample R^2 for epoch: 0 is 0.991366\n",
"\n",
"\n",
"cross-validate range 19990000-20110000\n",
"Test set: loss: 10620543.673601\n",
"Test sample R^2 for epoch: 0 is 0.973228\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19990000\n",
"Train set: loss: 1635484.113191\n",
"Train sample R^2 for epoch: 1 is 0.988556\n",
"\n",
"\n",
"cross-validate range 19990000-20110000\n",
"Test set: loss: 10370219.633237\n",
"Test sample R^2 for epoch: 1 is 0.973930\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-19990000\n",
"Train set: loss: 496738989.894573\n",
"Train sample R^2 for epoch: 2 is -1.869368\n",
"\n",
"\n",
"cross-validate range 19990000-20110000\n",
"Test set: loss: 19435992166.565186\n",
"Test sample R^2 for epoch: 2 is -51.632422\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-19990000\n",
"Train set: loss: 87311917.595520\n",
"Train sample R^2 for epoch: 3 is 0.368740\n",
"\n",
"\n",
"cross-validate range 19990000-20110000\n",
"Test set: loss: 28035788.318681\n",
"Test sample R^2 for epoch: 3 is 0.925812\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-19990000\n",
"Train set: loss: 1553130.188995\n",
"Train sample R^2 for epoch: 4 is 0.988054\n",
"\n",
"\n",
"cross-validate range 19990000-20110000\n",
"Test set: loss: 17475291.033081\n",
"Test sample R^2 for epoch: 4 is 0.954461\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-19990000\n",
"Train set: loss: 12288215.275046\n",
"Train sample R^2 for epoch: 5 is 0.888489\n",
"\n",
"\n",
"cross-validate range 19990000-20110000\n",
"Test set: loss: 12860110.502539\n",
"Test sample R^2 for epoch: 5 is 0.966953\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-19990000\n",
"Train set: loss: 1522800.924188\n",
"Train sample R^2 for epoch: 6 is 0.988815\n",
"\n",
"\n",
"cross-validate range 19990000-20110000\n",
"Test set: loss: 12540018.665091\n",
"Test sample R^2 for epoch: 6 is 0.968242\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 6\n",
"out-of-sample range 20110000-20120000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 674767985.168588\n",
"Train sample R^2 for epoch: 0 is -2.770347\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 116056932.028460\n",
"Test sample R^2 for epoch: 0 is 0.670766\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 12931711.838765\n",
"Train sample R^2 for epoch: 1 is 0.822767\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 12035514.004579\n",
"Test sample R^2 for epoch: 1 is 0.967723\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 2332893.570597\n",
"Train sample R^2 for epoch: 2 is 0.985019\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 10270767.189782\n",
"Test sample R^2 for epoch: 2 is 0.972665\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 38944945.595127\n",
"Train sample R^2 for epoch: 3 is 0.767437\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 18583367.076628\n",
"Test sample R^2 for epoch: 3 is 0.948798\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 3282349.402457\n",
"Train sample R^2 for epoch: 4 is 0.979536\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 14711238.393150\n",
"Test sample R^2 for epoch: 4 is 0.959880\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1542726.323009\n",
"Train sample R^2 for epoch: 5 is 0.989663\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9360385.535732\n",
"Test sample R^2 for epoch: 5 is 0.975170\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 1529487.587601\n",
"Train sample R^2 for epoch: 6 is 0.989937\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9366400.513867\n",
"Test sample R^2 for epoch: 6 is 0.975152\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1531148.121270\n",
"Train sample R^2 for epoch: 7 is 0.989933\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9437605.183760\n",
"Test sample R^2 for epoch: 7 is 0.974948\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1527081.420571\n",
"Train sample R^2 for epoch: 8 is 0.989970\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9589216.933569\n",
"Test sample R^2 for epoch: 8 is 0.974514\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1517531.525162\n",
"Train sample R^2 for epoch: 9 is 0.990035\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9501842.051158\n",
"Test sample R^2 for epoch: 9 is 0.974763\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1528688.404265\n",
"Train sample R^2 for epoch: 10 is 0.989935\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9328944.562126\n",
"Test sample R^2 for epoch: 10 is 0.975256\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 2114841.476749\n",
"Train sample R^2 for epoch: 11 is 0.984263\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9380427.463309\n",
"Test sample R^2 for epoch: 11 is 0.975097\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1534045.100885\n",
"Train sample R^2 for epoch: 12 is 0.989862\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9493633.822349\n",
"Test sample R^2 for epoch: 12 is 0.974771\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1528401.568639\n",
"Train sample R^2 for epoch: 13 is 0.989817\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9584887.347858\n",
"Test sample R^2 for epoch: 13 is 0.974508\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 3425220.230495\n",
"Train sample R^2 for epoch: 14 is 0.977903\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9321910.961169\n",
"Test sample R^2 for epoch: 14 is 0.975272\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 1530781.791986\n",
"Train sample R^2 for epoch: 15 is 0.989948\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 10028111.475712\n",
"Test sample R^2 for epoch: 15 is 0.973177\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 3256220.618675\n",
"Train sample R^2 for epoch: 16 is 0.977714\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 10219520.566430\n",
"Test sample R^2 for epoch: 16 is 0.972588\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 8059690.254144\n",
"Train sample R^2 for epoch: 17 is 0.947171\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 118022362.000227\n",
"Test sample R^2 for epoch: 17 is 0.661016\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 79631811.714590\n",
"Train sample R^2 for epoch: 18 is 0.238784\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9465883.361652\n",
"Test sample R^2 for epoch: 18 is 0.974897\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 14851837.798809\n",
"Train sample R^2 for epoch: 19 is 0.846751\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9282186.448808\n",
"Test sample R^2 for epoch: 19 is 0.975384\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 1514509.644062\n",
"Train sample R^2 for epoch: 20 is 0.990025\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9295834.938692\n",
"Test sample R^2 for epoch: 20 is 0.975342\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 9047325.970578\n",
"Train sample R^2 for epoch: 21 is 0.941021\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 11451838.404683\n",
"Test sample R^2 for epoch: 21 is 0.969347\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1627101.508580\n",
"Train sample R^2 for epoch: 22 is 0.988360\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9270749.331935\n",
"Test sample R^2 for epoch: 22 is 0.975423\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20000000\n",
"Train set: loss: 3655064.384143\n",
"Train sample R^2 for epoch: 23 is 0.978140\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9643062.546829\n",
"Test sample R^2 for epoch: 23 is 0.974337\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1661585.047256\n",
"Train sample R^2 for epoch: 24 is 0.986551\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 10815701.243260\n",
"Test sample R^2 for epoch: 24 is 0.970881\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 9166714.525726\n",
"Train sample R^2 for epoch: 25 is 0.936094\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9351286.656285\n",
"Test sample R^2 for epoch: 25 is 0.975174\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 1515497.889144\n",
"Train sample R^2 for epoch: 26 is 0.990052\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 10197400.374683\n",
"Test sample R^2 for epoch: 26 is 0.972712\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20000000\n",
"Train set: loss: 3411027.945256\n",
"Train sample R^2 for epoch: 27 is 0.977255\n",
"\n",
"\n",
"cross-validate range 20000000-20120000\n",
"Test set: loss: 9713548.302075\n",
"Test sample R^2 for epoch: 27 is 0.974176\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 27\n",
"out-of-sample range 20120000-20130000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20010000\n",
"Train set: loss: 5792648.018533\n",
"Train sample R^2 for epoch: 0 is 0.968395\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 8152337.793955\n",
"Test sample R^2 for epoch: 0 is 0.973066\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20010000\n",
"Train set: loss: 3781274.890321\n",
"Train sample R^2 for epoch: 1 is 0.972829\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 12298871.755256\n",
"Test sample R^2 for epoch: 1 is 0.958726\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 2625151.861320\n",
"Train sample R^2 for epoch: 2 is 0.982627\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 6186232.802458\n",
"Test sample R^2 for epoch: 2 is 0.979917\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20010000\n",
"Train set: loss: 2593621.277249\n",
"Train sample R^2 for epoch: 3 is 0.984773\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 8286176.969715\n",
"Test sample R^2 for epoch: 3 is 0.972675\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 2928660.076368\n",
"Train sample R^2 for epoch: 4 is 0.982388\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 27354049.226286\n",
"Test sample R^2 for epoch: 4 is 0.906998\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 26001620.574024\n",
"Train sample R^2 for epoch: 5 is 0.845809\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 203781239.533134\n",
"Test sample R^2 for epoch: 5 is 0.295543\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 49113375.520135\n",
"Train sample R^2 for epoch: 6 is 0.649183\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 6560977.815687\n",
"Test sample R^2 for epoch: 6 is 0.978566\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 2836053.316261\n",
"Train sample R^2 for epoch: 7 is 0.982903\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 6113340.241481\n",
"Test sample R^2 for epoch: 7 is 0.980156\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20010000\n",
"Train set: loss: 2578804.845691\n",
"Train sample R^2 for epoch: 8 is 0.984769\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 5808694.912909\n",
"Test sample R^2 for epoch: 8 is 0.981218\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20010000\n",
"Train set: loss: 2552576.118266\n",
"Train sample R^2 for epoch: 9 is 0.984996\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 5935887.765419\n",
"Test sample R^2 for epoch: 9 is 0.980766\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 22443177.823388\n",
"Train sample R^2 for epoch: 10 is 0.820755\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 5823950.808522\n",
"Test sample R^2 for epoch: 10 is 0.981172\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 2506132.025018\n",
"Train sample R^2 for epoch: 11 is 0.985487\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 5916677.703522\n",
"Test sample R^2 for epoch: 11 is 0.980826\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 2534479.992233\n",
"Train sample R^2 for epoch: 12 is 0.985156\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 5772925.275564\n",
"Test sample R^2 for epoch: 12 is 0.981338\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20010000\n",
"Train set: loss: 2595110.928626\n",
"Train sample R^2 for epoch: 13 is 0.984761\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 6192740.614382\n",
"Test sample R^2 for epoch: 13 is 0.979890\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 4132907.652400\n",
"Train sample R^2 for epoch: 14 is 0.974974\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 6314123.608425\n",
"Test sample R^2 for epoch: 14 is 0.979441\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 1144512162.767946\n",
"Train sample R^2 for epoch: 15 is -6.454415\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 8689486.745303\n",
"Test sample R^2 for epoch: 15 is 0.971507\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 245507602.366309\n",
"Train sample R^2 for epoch: 16 is -0.215431\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 15709583.955876\n",
"Test sample R^2 for epoch: 16 is 0.947053\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20010000\n",
"Train set: loss: 2944116.591615\n",
"Train sample R^2 for epoch: 17 is 0.981210\n",
"\n",
"\n",
"cross-validate range 20010000-20130000\n",
"Test set: loss: 6114044.547440\n",
"Test sample R^2 for epoch: 17 is 0.980155\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 17\n",
"out-of-sample range 20130000-20140000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20020000\n",
"Train set: loss: 3434743.947575\n",
"Train sample R^2 for epoch: 0 is 0.982137\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 3317656.813914\n",
"Test sample R^2 for epoch: 0 is 0.984246\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20020000\n",
"Train set: loss: 3757455.125902\n",
"Train sample R^2 for epoch: 1 is 0.979874\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 3477806.970044\n",
"Test sample R^2 for epoch: 1 is 0.983490\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20020000\n",
"Train set: loss: 4181279.064385\n",
"Train sample R^2 for epoch: 2 is 0.978179\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 2907488.984630\n",
"Test sample R^2 for epoch: 2 is 0.986272\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20020000\n",
"Train set: loss: 47013530.080519\n",
"Train sample R^2 for epoch: 3 is 0.760000\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 171992392.772154\n",
"Test sample R^2 for epoch: 3 is 0.161403\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20020000\n",
"Train set: loss: 8818856.380305\n",
"Train sample R^2 for epoch: 4 is 0.864289\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 3920959.212694\n",
"Test sample R^2 for epoch: 4 is 0.981429\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20020000\n",
"Train set: loss: 3600895.206146\n",
"Train sample R^2 for epoch: 5 is 0.980598\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 2954862.003729\n",
"Test sample R^2 for epoch: 5 is 0.986055\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20020000\n",
"Train set: loss: 3443035.147235\n",
"Train sample R^2 for epoch: 6 is 0.981676\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 3363186.270371\n",
"Test sample R^2 for epoch: 6 is 0.984057\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20020000\n",
"Train set: loss: 5477104.515764\n",
"Train sample R^2 for epoch: 7 is 0.971260\n",
"\n",
"\n",
"cross-validate range 20020000-20140000\n",
"Test set: loss: 3661076.490098\n",
"Test sample R^2 for epoch: 7 is 0.982553\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 7\n",
"out-of-sample range 20140000-20150000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20030000\n",
"Train set: loss: 74367143.574283\n",
"Train sample R^2 for epoch: 0 is 0.618268\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 299445352.768688\n",
"Test sample R^2 for epoch: 0 is -0.692331\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20030000\n",
"Train set: loss: 4988443.026908\n",
"Train sample R^2 for epoch: 1 is 0.973073\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 2079497.233853\n",
"Test sample R^2 for epoch: 1 is 0.988318\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20030000\n",
"Train set: loss: 6225328.305656\n",
"Train sample R^2 for epoch: 2 is 0.968258\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 3388300.002760\n",
"Test sample R^2 for epoch: 2 is 0.980975\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20030000\n",
"Train set: loss: 3489382.708347\n",
"Train sample R^2 for epoch: 3 is 0.982377\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 2043948.844321\n",
"Test sample R^2 for epoch: 3 is 0.988548\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20030000\n",
"Train set: loss: 24334646.866476\n",
"Train sample R^2 for epoch: 4 is 0.868400\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 30919584.348916\n",
"Test sample R^2 for epoch: 4 is 0.825283\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20030000\n",
"Train set: loss: 361610796.569422\n",
"Train sample R^2 for epoch: 5 is -0.858040\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 348589679.730589\n",
"Test sample R^2 for epoch: 5 is -0.966769\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20030000\n",
"Train set: loss: 6551634.103237\n",
"Train sample R^2 for epoch: 6 is 0.945815\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 8309951.485239\n",
"Test sample R^2 for epoch: 6 is 0.952835\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20030000\n",
"Train set: loss: 4303750.281476\n",
"Train sample R^2 for epoch: 7 is 0.976587\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 4578999.561442\n",
"Test sample R^2 for epoch: 7 is 0.974071\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20030000\n",
"Train set: loss: 6950318.072606\n",
"Train sample R^2 for epoch: 8 is 0.964061\n",
"\n",
"\n",
"cross-validate range 20030000-20150000\n",
"Test set: loss: 41631527.848961\n",
"Test sample R^2 for epoch: 8 is 0.768560\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 8\n",
"out-of-sample range 20150000-20160000\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20040000\n",
"Train set: loss: 24645867.130570\n",
"Train sample R^2 for epoch: 0 is 0.868531\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 25861525.970971\n",
"Test sample R^2 for epoch: 0 is 0.835041\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20040000\n",
"Train set: loss: 94146509.948580\n",
"Train sample R^2 for epoch: 1 is 0.492055\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 1938703.346027\n",
"Test sample R^2 for epoch: 1 is 0.987598\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20040000\n",
"Train set: loss: 3593286.189532\n",
"Train sample R^2 for epoch: 2 is 0.982020\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 1571821.190237\n",
"Test sample R^2 for epoch: 2 is 0.989956\n",
"\n",
"\n",
"Checkpointing ...\n",
"training range 19570000-20040000\n",
"Train set: loss: 4655653.356493\n",
"Train sample R^2 for epoch: 3 is 0.975720\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 2361846.470733\n",
"Test sample R^2 for epoch: 3 is 0.984900\n",
"\n",
"\n",
"EarlyStopping counter: 1 out of 5\n",
"training range 19570000-20040000\n",
"Train set: loss: 271491262.758802\n",
"Train sample R^2 for epoch: 4 is -0.306280\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 994087344.780224\n",
"Test sample R^2 for epoch: 4 is -5.335041\n",
"\n",
"\n",
"EarlyStopping counter: 2 out of 5\n",
"training range 19570000-20040000\n",
"Train set: loss: 7962871.328029\n",
"Train sample R^2 for epoch: 5 is 0.943198\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 38369436.638161\n",
"Test sample R^2 for epoch: 5 is 0.755628\n",
"\n",
"\n",
"EarlyStopping counter: 3 out of 5\n",
"training range 19570000-20040000\n",
"Train set: loss: 5177534.439807\n",
"Train sample R^2 for epoch: 6 is 0.972929\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 17224681.364741\n",
"Test sample R^2 for epoch: 6 is 0.890323\n",
"\n",
"\n",
"EarlyStopping counter: 4 out of 5\n",
"training range 19570000-20040000\n",
"Train set: loss: 3956097.281713\n",
"Train sample R^2 for epoch: 7 is 0.980000\n",
"\n",
"\n",
"cross-validate range 20040000-20160000\n",
"Test set: loss: 2728444.581557\n",
"Test sample R^2 for epoch: 7 is 0.982620\n",
"\n",
"\n",
"EarlyStopping counter: 5 out of 5\n",
"Early stopping at Epoch 7\n",
"out-of-sample range 20160000-20170000\n",
"\n",
"Checkpointing ...\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "4SZC-wFqqOk9",
"colab_type": "text"
},
"source": [
"Calculate the out-of-sample Total R_square"
]
},
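{
"cell_type": "markdown",
"metadata": {},
"source": [
"For reference, the total $R^2$ in Gu, Kelly, and Xiu (2019) benchmarks the pooled squared prediction errors against the pooled squared returns, without demeaning:\n",
"\n",
"$$R^2_{total} = 1 - \\frac{\\sum_{i,t} (r_{i,t} - \\hat{r}_{i,t})^2}{\\sum_{i,t} r_{i,t}^2}$$"
]
},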
{
"cell_type": "code",
"metadata": {
"id": "7PSqK6GaOkjP",
"colab_type": "code",
"colab": {}
},
"source": [
"r_pred_arr = np.array([])\n",
"r_arr = np.array([])\n",
"for o in out_rsq_list:\n",
" r_pred_arr = np.append(r_pred_arr, o[0])\n",
" r_arr = np.append(r_arr, o[1]) "
],
"execution_count": 0,
"outputs": []
},
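{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `R_square` helper called below is defined earlier in this notebook. A minimal sketch of what it is assumed to compute (the total $R^2$ above) could look like this:"
]
},
{
"cell_type": "code",
"metadata": {},
"source": [
"# Sketch only: the notebook's own R_square (defined earlier) is assumed to\n",
"# implement the total R^2 of Gu, Kelly, and Xiu (2019), which compares\n",
"# prediction errors against raw (un-demeaned) squared returns.\n",
"def R_square_sketch(r_pred, r):\n",
"    r = np.asarray(r, dtype=float)\n",
"    resid = r - np.asarray(r_pred, dtype=float)  # prediction errors\n",
"    return 1.0 - np.sum(resid ** 2) / np.sum(r ** 2)"
],
"execution_count": 0,
"outputs": []
},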
{
"cell_type": "code",
"metadata": {
"id": "WGnoeczzJkvR",
"colab_type": "code",
"outputId": "8da3ed6b-77aa-4e24-e4eb-41da8f1290b4",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 34
}
},
"source": [
"R_square(r_pred_arr, r_arr)"
],
"execution_count": 0,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"0.9556648466191366"
]
},
"metadata": {
"tags": []
},
"execution_count": 39
}
]
}
]
}