@mirrornerror
Created November 3, 2018 18:32
Titanic One-Hot Keras CNN
{
"cells": [
{
"metadata": {},
"cell_type": "markdown",
"source": "## Kaggle: Titanic ,Keras CNN, One-Hot, Early-Stopping \nhttps://www.kaggle.com/c/titanic"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "import pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\n\n# random seed\nimport tensorflow as tf\nimport random as rn\nimport os\nos.environ['PYTHONHASHSEED'] = '0'\nrandom_n = 123\nnp.random.seed(random_n)\nrn.seed(random_n)\nsession_conf = tf.ConfigProto(intra_op_parallelism_threads=1, inter_op_parallelism_threads=1)\nfrom keras import backend as K\ntf.set_random_seed(random_n)\nsess = tf.Session(graph=tf.get_default_graph(), config=session_conf)\nK.set_session(sess)\n\ntrain = pd.read_csv('train.csv', index_col=0)\ntest = pd.read_csv('test.csv', index_col=0)",
"execution_count": 369,
"outputs": []
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "train.head()",
"execution_count": 370,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Survived</th>\n <th>Pclass</th>\n <th>Name</th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Ticket</th>\n <th>Fare</th>\n <th>Cabin</th>\n <th>Embarked</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>0</td>\n <td>3</td>\n <td>Braund, Mr. Owen Harris</td>\n <td>male</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>A/5 21171</td>\n <td>7.2500</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n <tr>\n <th>2</th>\n <td>1</td>\n <td>1</td>\n <td>Cumings, Mrs. John Bradley (Florence Briggs Th...</td>\n <td>female</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>PC 17599</td>\n <td>71.2833</td>\n <td>C85</td>\n <td>C</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>3</td>\n <td>Heikkinen, Miss. Laina</td>\n <td>female</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>STON/O2. 3101282</td>\n <td>7.9250</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n <tr>\n <th>4</th>\n <td>1</td>\n <td>1</td>\n <td>Futrelle, Mrs. Jacques Heath (Lily May Peel)</td>\n <td>female</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>113803</td>\n <td>53.1000</td>\n <td>C123</td>\n <td>S</td>\n </tr>\n <tr>\n <th>5</th>\n <td>0</td>\n <td>3</td>\n <td>Allen, Mr. William Henry</td>\n <td>male</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>373450</td>\n <td>8.0500</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " Survived Pclass \\\nPassengerId \n1 0 3 \n2 1 1 \n3 1 3 \n4 1 1 \n5 0 3 \n\n Name Sex Age \\\nPassengerId \n1 Braund, Mr. Owen Harris male 22.0 \n2 Cumings, Mrs. John Bradley (Florence Briggs Th... female 38.0 \n3 Heikkinen, Miss. Laina female 26.0 \n4 Futrelle, Mrs. Jacques Heath (Lily May Peel) female 35.0 \n5 Allen, Mr. William Henry male 35.0 \n\n SibSp Parch Ticket Fare Cabin Embarked \nPassengerId \n1 1 0 A/5 21171 7.2500 NaN S \n2 1 0 PC 17599 71.2833 C85 C \n3 0 0 STON/O2. 3101282 7.9250 NaN S \n4 1 0 113803 53.1000 C123 S \n5 0 0 373450 8.0500 NaN S "
},
"execution_count": 370,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Drop Survived and Ticket, then combine train with test "
},
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "train_tmp = train.drop(['Survived', 'Ticket'], axis=1)\ntest_tmp = test.drop(['Ticket'], axis=1)\ndf = pd.concat([train_tmp, test_tmp])\ndf_orig = df.copy()\ndf.info()",
"execution_count": 371,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": "<class 'pandas.core.frame.DataFrame'>\nInt64Index: 1309 entries, 1 to 1309\nData columns (total 9 columns):\nPclass 1309 non-null int64\nName 1309 non-null object\nSex 1309 non-null object\nAge 1046 non-null float64\nSibSp 1309 non-null int64\nParch 1309 non-null int64\nFare 1308 non-null float64\nCabin 295 non-null object\nEmbarked 1307 non-null object\ndtypes: float64(2), int64(3), object(4)\nmemory usage: 102.3+ KB\n"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df.head()",
"execution_count": 372,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Pclass</th>\n <th>Name</th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Cabin</th>\n <th>Embarked</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>3</td>\n <td>Braund, Mr. Owen Harris</td>\n <td>male</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>7.2500</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n <tr>\n <th>2</th>\n <td>1</td>\n <td>Cumings, Mrs. John Bradley (Florence Briggs Th...</td>\n <td>female</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>71.2833</td>\n <td>C85</td>\n <td>C</td>\n </tr>\n <tr>\n <th>3</th>\n <td>3</td>\n <td>Heikkinen, Miss. Laina</td>\n <td>female</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>7.9250</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n <tr>\n <th>4</th>\n <td>1</td>\n <td>Futrelle, Mrs. Jacques Heath (Lily May Peel)</td>\n <td>female</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>53.1000</td>\n <td>C123</td>\n <td>S</td>\n </tr>\n <tr>\n <th>5</th>\n <td>3</td>\n <td>Allen, Mr. William Henry</td>\n <td>male</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>8.0500</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " Pclass Name \\\nPassengerId \n1 3 Braund, Mr. Owen Harris \n2 1 Cumings, Mrs. John Bradley (Florence Briggs Th... \n3 3 Heikkinen, Miss. Laina \n4 1 Futrelle, Mrs. Jacques Heath (Lily May Peel) \n5 3 Allen, Mr. William Henry \n\n Sex Age SibSp Parch Fare Cabin Embarked \nPassengerId \n1 male 22.0 1 0 7.2500 NaN S \n2 female 38.0 1 0 71.2833 C85 C \n3 female 26.0 0 0 7.9250 NaN S \n4 female 35.0 1 0 53.1000 C123 S \n5 male 35.0 0 0 8.0500 NaN S "
},
"execution_count": 372,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Pclass: [1, 2, 3] --> [0, 1, 2]"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# change pclass value [1,2,3] to [0,1,2]\ndf.Pclass -= 1\ndf.head()",
"execution_count": 373,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Pclass</th>\n <th>Name</th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Cabin</th>\n <th>Embarked</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>2</td>\n <td>Braund, Mr. Owen Harris</td>\n <td>male</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>7.2500</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n <tr>\n <th>2</th>\n <td>0</td>\n <td>Cumings, Mrs. John Bradley (Florence Briggs Th...</td>\n <td>female</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>71.2833</td>\n <td>C85</td>\n <td>C</td>\n </tr>\n <tr>\n <th>3</th>\n <td>2</td>\n <td>Heikkinen, Miss. Laina</td>\n <td>female</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>7.9250</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n <tr>\n <th>4</th>\n <td>0</td>\n <td>Futrelle, Mrs. Jacques Heath (Lily May Peel)</td>\n <td>female</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>53.1000</td>\n <td>C123</td>\n <td>S</td>\n </tr>\n <tr>\n <th>5</th>\n <td>2</td>\n <td>Allen, Mr. William Henry</td>\n <td>male</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>8.0500</td>\n <td>NaN</td>\n <td>S</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " Pclass Name \\\nPassengerId \n1 2 Braund, Mr. Owen Harris \n2 0 Cumings, Mrs. John Bradley (Florence Briggs Th... \n3 2 Heikkinen, Miss. Laina \n4 0 Futrelle, Mrs. Jacques Heath (Lily May Peel) \n5 2 Allen, Mr. William Henry \n\n Sex Age SibSp Parch Fare Cabin Embarked \nPassengerId \n1 male 22.0 1 0 7.2500 NaN S \n2 female 38.0 1 0 71.2833 C85 C \n3 female 26.0 0 0 7.9250 NaN S \n4 female 35.0 1 0 53.1000 C123 S \n5 male 35.0 0 0 8.0500 NaN S "
},
"execution_count": 373,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# make pclass_one_hot\npclass_one_hot = pd.get_dummies(df.Pclass, prefix='Pclass')\n\n# add pclass_one_hot\ndf = pd.concat([df, pclass_one_hot], axis=1)\n\n# drop Pclass\ndf = df.drop(['Pclass'], axis=1)\ndf.head()",
"execution_count": 374,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Name</th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Cabin</th>\n <th>Embarked</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>Braund, Mr. Owen Harris</td>\n <td>male</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>7.2500</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n </tr>\n <tr>\n <th>2</th>\n <td>Cumings, Mrs. John Bradley (Florence Briggs Th...</td>\n <td>female</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>71.2833</td>\n <td>C85</td>\n <td>C</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>Heikkinen, Miss. Laina</td>\n <td>female</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>7.9250</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n </tr>\n <tr>\n <th>4</th>\n <td>Futrelle, Mrs. Jacques Heath (Lily May Peel)</td>\n <td>female</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>53.1000</td>\n <td>C123</td>\n <td>S</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>Allen, Mr. William Henry</td>\n <td>male</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>8.0500</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " Name Sex Age \\\nPassengerId \n1 Braund, Mr. Owen Harris male 22.0 \n2 Cumings, Mrs. John Bradley (Florence Briggs Th... female 38.0 \n3 Heikkinen, Miss. Laina female 26.0 \n4 Futrelle, Mrs. Jacques Heath (Lily May Peel) female 35.0 \n5 Allen, Mr. William Henry male 35.0 \n\n SibSp Parch Fare Cabin Embarked Pclass_0 Pclass_1 \\\nPassengerId \n1 1 0 7.2500 NaN S 0 0 \n2 1 0 71.2833 C85 C 1 0 \n3 0 0 7.9250 NaN S 0 0 \n4 1 0 53.1000 C123 S 1 0 \n5 0 0 8.0500 NaN S 0 0 \n\n Pclass_2 \nPassengerId \n1 1 \n2 0 \n3 1 \n4 0 \n5 1 "
},
"execution_count": 374,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Name --> Title --> Number --> one-hot"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "title = df.Name.str.extract(' ([A-Za-z]+)\\..', expand=False)\ntitle_unique = title.unique()\ntitle = title.replace(title_unique, np.arange(len(title_unique)))\n\ntitle_one_hot = pd.get_dummies(title, prefix='Title')\ntitle_one_hot.head()",
"execution_count": 375,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Title_0</th>\n <th>Title_1</th>\n <th>Title_2</th>\n <th>Title_3</th>\n <th>Title_4</th>\n <th>Title_5</th>\n <th>Title_6</th>\n <th>Title_7</th>\n <th>Title_8</th>\n <th>Title_9</th>\n <th>Title_10</th>\n <th>Title_11</th>\n <th>Title_12</th>\n <th>Title_13</th>\n <th>Title_14</th>\n <th>Title_15</th>\n <th>Title_16</th>\n <th>Title_17</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " Title_0 Title_1 Title_2 Title_3 Title_4 Title_5 Title_6 \\\nPassengerId \n1 1 0 0 0 0 0 0 \n2 0 1 0 0 0 0 0 \n3 0 0 1 0 0 0 0 \n4 0 1 0 0 0 0 0 \n5 1 0 0 0 0 0 0 \n\n Title_7 Title_8 Title_9 Title_10 Title_11 Title_12 \\\nPassengerId \n1 0 0 0 0 0 0 \n2 0 0 0 0 0 0 \n3 0 0 0 0 0 0 \n4 0 0 0 0 0 0 \n5 0 0 0 0 0 0 \n\n Title_13 Title_14 Title_15 Title_16 Title_17 \nPassengerId \n1 0 0 0 0 0 \n2 0 0 0 0 0 \n3 0 0 0 0 0 \n4 0 0 0 0 0 \n5 0 0 0 0 0 "
},
"execution_count": 375,
"metadata": {},
"output_type": "execute_result"
}
]
},
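{
"metadata": {},
"cell_type": "markdown",
"source": "A small optional check (not in the original notebook): the Title_N columns are anonymous, so the mapping from raw title string to column index can be recovered from title_unique defined above."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Optional sketch: show which raw title maps to which Title_N column.\n# Assumes title_unique from the previous cell is still in scope.\ndict(zip(title_unique, range(len(title_unique))))  # e.g. {'Mr': 0, 'Mrs': 1, 'Miss': 2, ...}",
"execution_count": null,
"outputs": []
},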
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# add title_one_hot\ndf = pd.concat([df, title_one_hot], axis=1)\n\n# Drop Name\ndf = df.drop(['Name'], axis=1)\ndf.head()",
"execution_count": 376,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Cabin</th>\n <th>Embarked</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>...</th>\n <th>Title_8</th>\n <th>Title_9</th>\n <th>Title_10</th>\n <th>Title_11</th>\n <th>Title_12</th>\n <th>Title_13</th>\n <th>Title_14</th>\n <th>Title_15</th>\n <th>Title_16</th>\n <th>Title_17</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>male</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>7.2500</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>female</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>71.2833</td>\n <td>C85</td>\n <td>C</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>female</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>7.9250</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>female</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>53.1000</td>\n <td>C123</td>\n <td>S</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>male</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>8.0500</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n<p>5 rows × 28 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Cabin Embarked Pclass_0 \\\nPassengerId \n1 male 22.0 1 0 7.2500 NaN S 0 \n2 female 38.0 1 0 71.2833 C85 C 1 \n3 female 26.0 0 0 7.9250 NaN S 0 \n4 female 35.0 1 0 53.1000 C123 S 1 \n5 male 35.0 0 0 8.0500 NaN S 0 \n\n Pclass_1 Pclass_2 ... Title_8 Title_9 Title_10 \\\nPassengerId ... \n1 0 1 ... 0 0 0 \n2 0 0 ... 0 0 0 \n3 0 1 ... 0 0 0 \n4 0 0 ... 0 0 0 \n5 0 1 ... 0 0 0 \n\n Title_11 Title_12 Title_13 Title_14 Title_15 Title_16 \\\nPassengerId \n1 0 0 0 0 0 0 \n2 0 0 0 0 0 0 \n3 0 0 0 0 0 0 \n4 0 0 0 0 0 0 \n5 0 0 0 0 0 0 \n\n Title_17 \nPassengerId \n1 0 \n2 0 \n3 0 \n4 0 \n5 0 \n\n[5 rows x 28 columns]"
},
"execution_count": 376,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Sex --> male:0, female:1"
},
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "df.Sex = df.Sex.replace({'male': 0, 'female': 1})\ndf.head()",
"execution_count": 377,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Cabin</th>\n <th>Embarked</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>...</th>\n <th>Title_8</th>\n <th>Title_9</th>\n <th>Title_10</th>\n <th>Title_11</th>\n <th>Title_12</th>\n <th>Title_13</th>\n <th>Title_14</th>\n <th>Title_15</th>\n <th>Title_16</th>\n <th>Title_17</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>0</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>7.2500</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>1</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>71.2833</td>\n <td>C85</td>\n <td>C</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>7.9250</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>1</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>53.1000</td>\n <td>C123</td>\n <td>S</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>0</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>8.0500</td>\n <td>NaN</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n<p>5 rows × 28 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Cabin Embarked Pclass_0 \\\nPassengerId \n1 0 22.0 1 0 7.2500 NaN S 0 \n2 1 38.0 1 0 71.2833 C85 C 1 \n3 1 26.0 0 0 7.9250 NaN S 0 \n4 1 35.0 1 0 53.1000 C123 S 1 \n5 0 35.0 0 0 8.0500 NaN S 0 \n\n Pclass_1 Pclass_2 ... Title_8 Title_9 Title_10 \\\nPassengerId ... \n1 0 1 ... 0 0 0 \n2 0 0 ... 0 0 0 \n3 0 1 ... 0 0 0 \n4 0 0 ... 0 0 0 \n5 0 1 ... 0 0 0 \n\n Title_11 Title_12 Title_13 Title_14 Title_15 Title_16 \\\nPassengerId \n1 0 0 0 0 0 0 \n2 0 0 0 0 0 0 \n3 0 0 0 0 0 0 \n4 0 0 0 0 0 0 \n5 0 0 0 0 0 0 \n\n Title_17 \nPassengerId \n1 0 \n2 0 \n3 0 \n4 0 \n5 0 \n\n[5 rows x 28 columns]"
},
"execution_count": 377,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Cabin --> Number: nan:0, C:1, E:2, G:3, D:4, A:5, B:6, F:7, T:8"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Cabin to Number(null:0)\ncabin_str = df.Cabin.str[0]\ncabin_list = cabin_str.unique()\ncabin_num = cabin_str.replace(cabin_list, np.arange(len(cabin_list)))\ncabin_num.unique()",
"execution_count": 378,
"outputs": [
{
"data": {
"text/plain": "array([0, 1, 2, 3, 4, 5, 6, 7, 8])"
},
"execution_count": 378,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "cabin_one_hot = pd.get_dummies(cabin_num, prefix='Cabin')\ncabin_one_hot.head()",
"execution_count": 379,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Cabin_0</th>\n <th>Cabin_1</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " Cabin_0 Cabin_1 Cabin_2 Cabin_3 Cabin_4 Cabin_5 Cabin_6 \\\nPassengerId \n1 1 0 0 0 0 0 0 \n2 0 1 0 0 0 0 0 \n3 1 0 0 0 0 0 0 \n4 0 1 0 0 0 0 0 \n5 1 0 0 0 0 0 0 \n\n Cabin_7 Cabin_8 \nPassengerId \n1 0 0 \n2 0 0 \n3 0 0 \n4 0 0 \n5 0 0 "
},
"execution_count": 379,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "# add cabin_one_hot\ndf = pd.concat([df, cabin_one_hot], axis=1)\n\n# drop Cabin\ndf = df.drop(['Cabin'], axis=1)\ndf.head()",
"execution_count": 380,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Embarked</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>Title_0</th>\n <th>...</th>\n <th>Title_17</th>\n <th>Cabin_0</th>\n <th>Cabin_1</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>0</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>7.2500</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>1</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>71.2833</td>\n <td>C</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>7.9250</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>1</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>53.1000</td>\n <td>S</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>0</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>8.0500</td>\n <td>S</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n<p>5 rows × 36 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Embarked Pclass_0 Pclass_1 \\\nPassengerId \n1 0 22.0 1 0 7.2500 S 0 0 \n2 1 38.0 1 0 71.2833 C 1 0 \n3 1 26.0 0 0 7.9250 S 0 0 \n4 1 35.0 1 0 53.1000 S 1 0 \n5 0 35.0 0 0 8.0500 S 0 0 \n\n Pclass_2 Title_0 ... Title_17 Cabin_0 Cabin_1 Cabin_2 \\\nPassengerId ... \n1 1 1 ... 0 1 0 0 \n2 0 0 ... 0 0 1 0 \n3 1 0 ... 0 1 0 0 \n4 0 0 ... 0 0 1 0 \n5 1 1 ... 0 1 0 0 \n\n Cabin_3 Cabin_4 Cabin_5 Cabin_6 Cabin_7 Cabin_8 \nPassengerId \n1 0 0 0 0 0 0 \n2 0 0 0 0 0 0 \n3 0 0 0 0 0 0 \n4 0 0 0 0 0 0 \n5 0 0 0 0 0 0 \n\n[5 rows x 36 columns]"
},
"execution_count": 380,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Embarked --> S:0, C:1, Q:2, nan"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df.Embarked.unique()",
"execution_count": 381,
"outputs": [
{
"data": {
"text/plain": "array(['S', 'C', 'Q', nan], dtype=object)"
},
"execution_count": 381,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# find null in Embarked\nEmbarked_null = df.Embarked[df.Embarked.isnull()]\nEmbarked_null.index.values",
"execution_count": 382,
"outputs": [
{
"data": {
"text/plain": "array([ 62, 830])"
},
"execution_count": 382,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df.Embarked = df.Embarked.replace({'S':0, 'C':1, 'Q':2})\n\n# fill 3 for null\ndf.Embarked = df.Embarked.fillna(3)\n\n# float to integer\ndf.Embarked = df.Embarked.values.astype('int')",
"execution_count": 383,
"outputs": []
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "embarked_one_hot = pd.get_dummies(df.Embarked, prefix='Embarked')\ndf = pd.concat([df, embarked_one_hot], axis=1)\n\n#df = pd.get_dummies(df, columns=['Embarked'])\ndf.columns[df.columns.str.contains('Embarked')]",
"execution_count": 384,
"outputs": [
{
"data": {
"text/plain": "Index(['Embarked', 'Embarked_0', 'Embarked_1', 'Embarked_2', 'Embarked_3'], dtype='object')"
},
"execution_count": 384,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"scrolled": false,
"trusted": true
},
"cell_type": "code",
"source": "df = df.drop(['Embarked', 'Embarked_3'], axis=1) # delete the original and null columns\n#df.info()",
"execution_count": 385,
"outputs": []
},
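{
"metadata": {},
"cell_type": "markdown",
"source": "Aside (illustration only, not used in this notebook): pandas can also one-hot encode the missing Embarked values directly with dummy_na=True, which avoids the temporary fill value of 3."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Alternative sketch, for comparison only: encode NaN as its own dummy column,\n# working from the untouched copy of the data (df_orig).\nalt = pd.get_dummies(df_orig.Embarked, prefix='Embarked', dummy_na=True)\nalt.columns  # one column per port plus one for the missing values",
"execution_count": null,
"outputs": []
},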
{
"metadata": {},
"cell_type": "markdown",
"source": "### SibSp to one-hot"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df.SibSp.unique()",
"execution_count": 386,
"outputs": [
{
"data": {
"text/plain": "array([1, 0, 3, 4, 2, 5, 8])"
},
"execution_count": 386,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "sibsp_one_hot = pd.get_dummies(df.SibSp, prefix='SibSp')\nsibsp_one_hot.head()",
"execution_count": 387,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>SibSp_0</th>\n <th>SibSp_1</th>\n <th>SibSp_2</th>\n <th>SibSp_3</th>\n <th>SibSp_4</th>\n <th>SibSp_5</th>\n <th>SibSp_8</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " SibSp_0 SibSp_1 SibSp_2 SibSp_3 SibSp_4 SibSp_5 SibSp_8\nPassengerId \n1 0 1 0 0 0 0 0\n2 0 1 0 0 0 0 0\n3 1 0 0 0 0 0 0\n4 0 1 0 0 0 0 0\n5 1 0 0 0 0 0 0"
},
"execution_count": 387,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Parch to one-hot"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df.Parch.unique()",
"execution_count": 388,
"outputs": [
{
"data": {
"text/plain": "array([0, 1, 2, 5, 3, 4, 6, 9])"
},
"execution_count": 388,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "parch_one_hot = pd.get_dummies(df.Parch, prefix='Parch')\nparch_one_hot.head()",
"execution_count": 389,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Parch_0</th>\n <th>Parch_1</th>\n <th>Parch_2</th>\n <th>Parch_3</th>\n <th>Parch_4</th>\n <th>Parch_5</th>\n <th>Parch_6</th>\n <th>Parch_9</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n</div>",
"text/plain": " Parch_0 Parch_1 Parch_2 Parch_3 Parch_4 Parch_5 Parch_6 \\\nPassengerId \n1 1 0 0 0 0 0 0 \n2 1 0 0 0 0 0 0 \n3 1 0 0 0 0 0 0 \n4 1 0 0 0 0 0 0 \n5 1 0 0 0 0 0 0 \n\n Parch_9 \nPassengerId \n1 0 \n2 0 \n3 0 \n4 0 \n5 0 "
},
"execution_count": 389,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# add sibsp_one_hot and parch_one_hot\n#df = pd.concat([df, sibsp_one_hot, parch_one_hot], axis=1)\n\n# drop Cabin\n#df = df.drop(['SibSp', 'Parch'], axis=1)\ndf.head()",
"execution_count": 390,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>Title_0</th>\n <th>Title_1</th>\n <th>...</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n <th>Embarked_0</th>\n <th>Embarked_1</th>\n <th>Embarked_2</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>0</td>\n <td>22.0</td>\n <td>1</td>\n <td>0</td>\n <td>7.2500</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>1</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>1</td>\n <td>38.0</td>\n <td>1</td>\n <td>0</td>\n <td>71.2833</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>26.0</td>\n <td>0</td>\n <td>0</td>\n <td>7.9250</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>1</td>\n <td>35.0</td>\n <td>1</td>\n <td>0</td>\n <td>53.1000</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>0</td>\n <td>35.0</td>\n <td>0</td>\n <td>0</td>\n <td>8.0500</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>1</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n<p>5 rows × 38 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Pclass_0 Pclass_1 Pclass_2 \\\nPassengerId \n1 0 22.0 1 0 7.2500 0 0 1 \n2 1 38.0 1 0 71.2833 1 0 0 \n3 1 26.0 0 0 7.9250 0 0 1 \n4 1 35.0 1 0 53.1000 1 0 0 \n5 0 35.0 0 0 8.0500 0 0 1 \n\n Title_0 Title_1 ... Cabin_2 Cabin_3 Cabin_4 Cabin_5 \\\nPassengerId ... \n1 1 0 ... 0 0 0 0 \n2 0 1 ... 0 0 0 0 \n3 0 0 ... 0 0 0 0 \n4 0 1 ... 0 0 0 0 \n5 1 0 ... 0 0 0 0 \n\n Cabin_6 Cabin_7 Cabin_8 Embarked_0 Embarked_1 Embarked_2 \nPassengerId \n1 0 0 0 1 0 0 \n2 0 0 0 0 1 0 \n3 0 0 0 1 0 0 \n4 0 0 0 1 0 0 \n5 0 0 0 1 0 0 \n\n[5 rows x 38 columns]"
},
"execution_count": 390,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df.info()",
"execution_count": 391,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": "<class 'pandas.core.frame.DataFrame'>\nInt64Index: 1309 entries, 1 to 1309\nData columns (total 38 columns):\nSex 1309 non-null int64\nAge 1046 non-null float64\nSibSp 1309 non-null int64\nParch 1309 non-null int64\nFare 1308 non-null float64\nPclass_0 1309 non-null uint8\nPclass_1 1309 non-null uint8\nPclass_2 1309 non-null uint8\nTitle_0 1309 non-null uint8\nTitle_1 1309 non-null uint8\nTitle_2 1309 non-null uint8\nTitle_3 1309 non-null uint8\nTitle_4 1309 non-null uint8\nTitle_5 1309 non-null uint8\nTitle_6 1309 non-null uint8\nTitle_7 1309 non-null uint8\nTitle_8 1309 non-null uint8\nTitle_9 1309 non-null uint8\nTitle_10 1309 non-null uint8\nTitle_11 1309 non-null uint8\nTitle_12 1309 non-null uint8\nTitle_13 1309 non-null uint8\nTitle_14 1309 non-null uint8\nTitle_15 1309 non-null uint8\nTitle_16 1309 non-null uint8\nTitle_17 1309 non-null uint8\nCabin_0 1309 non-null uint8\nCabin_1 1309 non-null uint8\nCabin_2 1309 non-null uint8\nCabin_3 1309 non-null uint8\nCabin_4 1309 non-null uint8\nCabin_5 1309 non-null uint8\nCabin_6 1309 non-null uint8\nCabin_7 1309 non-null uint8\nCabin_8 1309 non-null uint8\nEmbarked_0 1309 non-null uint8\nEmbarked_1 1309 non-null uint8\nEmbarked_2 1309 non-null uint8\ndtypes: float64(2), int64(3), uint8(33)\nmemory usage: 143.5 KB\n"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "## zscore or normalization: \n* Age: including NaN\n* Fare: including NaN \n \nZ = (x - x.mean) / x.std \nN = (x - x.min)/(x.max - x.min) \n \nsklearn.preprocessing.MinMaxScaler causes error with Null data."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Normalize Function\ndef normalize(df_col):\n df_col = (df_col - df_col.min()) / (df_col.max() - df_col.min())\n return df_col",
"execution_count": 392,
"outputs": []
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Standardization(zscore)\ndef zscore(df_col):\n df_col = (df_col - df_col.mean()) / df_col.std()\n return df_col",
"execution_count": 393,
"outputs": []
},
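{
"metadata": {},
"cell_type": "markdown",
"source": "A minimal sketch (added for illustration, not part of the original run): pandas reductions such as min/max/mean skip NaN, so normalize and zscore above leave missing Age and Fare values as NaN instead of raising an error."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Toy example of the NaN-preserving behaviour used by normalize():\n# min/max ignore the NaN, and the NaN itself stays NaN after scaling.\ns = pd.Series([10.0, np.nan, 30.0, 50.0])\n(s - s.min()) / (s.max() - s.min())  # -> 0.0, NaN, 0.5, 1.0",
"execution_count": null,
"outputs": []
},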
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# df.Age = zscore(df.Age)\n# df.Fare = zscore(df.Fare)\n\ndf.Age = normalize(df.Age)\ndf.Fare = normalize(df.Fare)\ndf.SibSp = normalize(df.SibSp)\ndf.Parch = normalize(df.Parch)\n\n# for col in df.columns:\n# df[col] = zscore(df[col])\n\ndf.describe()",
"execution_count": 394,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>Title_0</th>\n <th>Title_1</th>\n <th>...</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n <th>Embarked_0</th>\n <th>Embarked_1</th>\n <th>Embarked_2</th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>count</th>\n <td>1309.000000</td>\n <td>1046.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1308.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>...</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n <td>1309.000000</td>\n </tr>\n <tr>\n <th>mean</th>\n <td>0.355997</td>\n <td>0.372180</td>\n <td>0.062357</td>\n <td>0.042781</td>\n <td>0.064988</td>\n <td>0.246753</td>\n <td>0.211612</td>\n <td>0.541635</td>\n <td>0.578304</td>\n <td>0.150497</td>\n <td>...</td>\n <td>0.031322</td>\n <td>0.003820</td>\n <td>0.035141</td>\n <td>0.016807</td>\n <td>0.049656</td>\n <td>0.016043</td>\n <td>0.000764</td>\n <td>0.698243</td>\n <td>0.206264</td>\n <td>0.093965</td>\n </tr>\n <tr>\n <th>std</th>\n <td>0.478997</td>\n <td>0.180552</td>\n <td>0.130207</td>\n <td>0.096173</td>\n <td>0.101026</td>\n <td>0.431287</td>\n <td>0.408607</td>\n <td>0.498454</td>\n <td>0.494019</td>\n <td>0.357694</td>\n <td>...</td>\n <td>0.174252</td>\n <td>0.061709</td>\n <td>0.184207</td>\n <td>0.128596</td>\n <td>0.217317</td>\n <td>0.125688</td>\n <td>0.027639</td>\n <td>0.459196</td>\n <td>0.404777</td>\n <td>0.291891</td>\n </tr>\n <tr>\n <th>min</th>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>...</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n </tr>\n <tr>\n <th>25%</th>\n <td>0.000000</td>\n <td>0.260929</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.015412</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>...</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n </tr>\n <tr>\n <th>50%</th>\n <td>0.000000</td>\n <td>0.348616</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.028213</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>0.000000</td>\n <td>...</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>1.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n </tr>\n <tr>\n <th>75%</th>\n <td>1.000000</td>\n <td>0.486409</td>\n 
<td>0.125000</td>\n <td>0.000000</td>\n <td>0.061045</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>0.000000</td>\n <td>...</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n <td>1.000000</td>\n <td>0.000000</td>\n <td>0.000000</td>\n </tr>\n <tr>\n <th>max</th>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>...</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n <td>1.000000</td>\n </tr>\n </tbody>\n</table>\n<p>8 rows × 38 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare \\\ncount 1309.000000 1046.000000 1309.000000 1309.000000 1308.000000 \nmean 0.355997 0.372180 0.062357 0.042781 0.064988 \nstd 0.478997 0.180552 0.130207 0.096173 0.101026 \nmin 0.000000 0.000000 0.000000 0.000000 0.000000 \n25% 0.000000 0.260929 0.000000 0.000000 0.015412 \n50% 0.000000 0.348616 0.000000 0.000000 0.028213 \n75% 1.000000 0.486409 0.125000 0.000000 0.061045 \nmax 1.000000 1.000000 1.000000 1.000000 1.000000 \n\n Pclass_0 Pclass_1 Pclass_2 Title_0 Title_1 \\\ncount 1309.000000 1309.000000 1309.000000 1309.000000 1309.000000 \nmean 0.246753 0.211612 0.541635 0.578304 0.150497 \nstd 0.431287 0.408607 0.498454 0.494019 0.357694 \nmin 0.000000 0.000000 0.000000 0.000000 0.000000 \n25% 0.000000 0.000000 0.000000 0.000000 0.000000 \n50% 0.000000 0.000000 1.000000 1.000000 0.000000 \n75% 0.000000 0.000000 1.000000 1.000000 0.000000 \nmax 1.000000 1.000000 1.000000 1.000000 1.000000 \n\n ... Cabin_2 Cabin_3 Cabin_4 Cabin_5 \\\ncount ... 1309.000000 1309.000000 1309.000000 1309.000000 \nmean ... 0.031322 0.003820 0.035141 0.016807 \nstd ... 0.174252 0.061709 0.184207 0.128596 \nmin ... 0.000000 0.000000 0.000000 0.000000 \n25% ... 0.000000 0.000000 0.000000 0.000000 \n50% ... 0.000000 0.000000 0.000000 0.000000 \n75% ... 0.000000 0.000000 0.000000 0.000000 \nmax ... 1.000000 1.000000 1.000000 1.000000 \n\n Cabin_6 Cabin_7 Cabin_8 Embarked_0 Embarked_1 \\\ncount 1309.000000 1309.000000 1309.000000 1309.000000 1309.000000 \nmean 0.049656 0.016043 0.000764 0.698243 0.206264 \nstd 0.217317 0.125688 0.027639 0.459196 0.404777 \nmin 0.000000 0.000000 0.000000 0.000000 0.000000 \n25% 0.000000 0.000000 0.000000 0.000000 0.000000 \n50% 0.000000 0.000000 0.000000 1.000000 0.000000 \n75% 0.000000 0.000000 0.000000 1.000000 0.000000 \nmax 1.000000 1.000000 1.000000 1.000000 1.000000 \n\n Embarked_2 \ncount 1309.000000 \nmean 0.093965 \nstd 0.291891 \nmin 0.000000 \n25% 0.000000 \n50% 0.000000 \n75% 0.000000 \nmax 1.000000 \n\n[8 rows x 38 columns]"
},
"execution_count": 394,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "len(df.columns)",
"execution_count": 395,
"outputs": [
{
"data": {
"text/plain": "38"
},
"execution_count": 395,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "## Split into Notnull data and Null data\n\n* Age\n* Embarked\n* Fare\n"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "def null_data(col):\n return pd.DataFrame([df.loc[i] for i in df_orig[col][df_orig[col].isnull()].index])",
"execution_count": 396,
"outputs": []
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "Embarked_null = null_data('Embarked')\nEmbarked_null",
"execution_count": 397,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>Title_0</th>\n <th>Title_1</th>\n <th>...</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n <th>Embarked_0</th>\n <th>Embarked_1</th>\n <th>Embarked_2</th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>62</th>\n <td>1.0</td>\n <td>0.473882</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.15615</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n </tr>\n <tr>\n <th>830</th>\n <td>1.0</td>\n <td>0.774521</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.15615</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n </tr>\n </tbody>\n</table>\n<p>2 rows × 38 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Pclass_0 Pclass_1 Pclass_2 \\\n62 1.0 0.473882 0.0 0.0 0.15615 1.0 0.0 0.0 \n830 1.0 0.774521 0.0 0.0 0.15615 1.0 0.0 0.0 \n\n Title_0 Title_1 ... Cabin_2 Cabin_3 Cabin_4 Cabin_5 \\\n62 0.0 0.0 ... 0.0 0.0 0.0 0.0 \n830 0.0 1.0 ... 0.0 0.0 0.0 0.0 \n\n Cabin_6 Cabin_7 Cabin_8 Embarked_0 Embarked_1 Embarked_2 \n62 1.0 0.0 0.0 0.0 0.0 0.0 \n830 1.0 0.0 0.0 0.0 0.0 0.0 \n\n[2 rows x 38 columns]"
},
"execution_count": 397,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "Age_null = null_data('Age')\nAge_null.head()",
"execution_count": 398,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>Title_0</th>\n <th>Title_1</th>\n <th>...</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n <th>Embarked_0</th>\n <th>Embarked_1</th>\n <th>Embarked_2</th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>6</th>\n <td>0.0</td>\n <td>NaN</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.016510</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n </tr>\n <tr>\n <th>18</th>\n <td>0.0</td>\n <td>NaN</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.025374</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>0.0</td>\n </tr>\n <tr>\n <th>20</th>\n <td>1.0</td>\n <td>NaN</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.014102</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n </tr>\n <tr>\n <th>27</th>\n <td>0.0</td>\n <td>NaN</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.014102</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n </tr>\n <tr>\n <th>29</th>\n <td>1.0</td>\n <td>NaN</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.015379</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n </tr>\n </tbody>\n</table>\n<p>5 rows × 38 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Pclass_0 Pclass_1 Pclass_2 Title_0 \\\n6 0.0 NaN 0.0 0.0 0.016510 0.0 0.0 1.0 1.0 \n18 0.0 NaN 0.0 0.0 0.025374 0.0 1.0 0.0 1.0 \n20 1.0 NaN 0.0 0.0 0.014102 0.0 0.0 1.0 0.0 \n27 0.0 NaN 0.0 0.0 0.014102 0.0 0.0 1.0 1.0 \n29 1.0 NaN 0.0 0.0 0.015379 0.0 0.0 1.0 0.0 \n\n Title_1 ... Cabin_2 Cabin_3 Cabin_4 Cabin_5 Cabin_6 Cabin_7 \\\n6 0.0 ... 0.0 0.0 0.0 0.0 0.0 0.0 \n18 0.0 ... 0.0 0.0 0.0 0.0 0.0 0.0 \n20 1.0 ... 0.0 0.0 0.0 0.0 0.0 0.0 \n27 0.0 ... 0.0 0.0 0.0 0.0 0.0 0.0 \n29 0.0 ... 0.0 0.0 0.0 0.0 0.0 0.0 \n\n Cabin_8 Embarked_0 Embarked_1 Embarked_2 \n6 0.0 0.0 0.0 1.0 \n18 0.0 1.0 0.0 0.0 \n20 0.0 0.0 1.0 0.0 \n27 0.0 0.0 1.0 0.0 \n29 0.0 0.0 0.0 1.0 \n\n[5 rows x 38 columns]"
},
"execution_count": 398,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "Fare_null = null_data('Fare')\nFare_null",
"execution_count": 399,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>Title_0</th>\n <th>Title_1</th>\n <th>...</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n <th>Embarked_0</th>\n <th>Embarked_1</th>\n <th>Embarked_2</th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1044</th>\n <td>0.0</td>\n <td>0.755731</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>NaN</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>...</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>0.0</td>\n <td>1.0</td>\n <td>0.0</td>\n <td>0.0</td>\n </tr>\n </tbody>\n</table>\n<p>1 rows × 38 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Pclass_0 Pclass_1 Pclass_2 \\\n1044 0.0 0.755731 0.0 0.0 NaN 0.0 0.0 1.0 \n\n Title_0 Title_1 ... Cabin_2 Cabin_3 Cabin_4 Cabin_5 \\\n1044 1.0 0.0 ... 0.0 0.0 0.0 0.0 \n\n Cabin_6 Cabin_7 Cabin_8 Embarked_0 Embarked_1 Embarked_2 \n1044 0.0 0.0 0.0 1.0 0.0 0.0 \n\n[1 rows x 38 columns]"
},
"execution_count": 399,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "## Notnull Data: df_drop.shape = (1043, 51)"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Drop null rows\nnull_index = np.concatenate([Embarked_null.index, Age_null.index, Fare_null.index])\ndf_drop = df.drop(null_index)\ndf_drop.shape",
"execution_count": 400,
"outputs": [
{
"data": {
"text/plain": "(1043, 38)"
},
"execution_count": 400,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "Age_n = len(df_drop.columns[df_drop.columns.str.contains('Age')])\nFare_n = len(df_drop.columns[df_drop.columns.str.contains('Fare')])\nEmbarked_n = len(df_drop.columns[df_drop.columns.str.contains('Embarked')])\n[Age_n, Fare_n, Embarked_n]",
"execution_count": 401,
"outputs": [
{
"data": {
"text/plain": "[1, 1, 3]"
},
"execution_count": 401,
"metadata": {},
"output_type": "execute_result"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "## Model to fill NaN in Fare, Embarked, Age"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "from keras.models import Sequential\nfrom keras.layers import Flatten, Dense, Dropout, BatchNormalization\nimport keras\nfrom keras.callbacks import EarlyStopping, ModelCheckpoint, ReduceLROnPlateau\n\ninitializer = keras.initializers.glorot_uniform(seed=random_n)\n\n# model for Fare, Embarked, Age\ndef fill_data(col):\n drop_columns = df_drop.columns[df_drop.columns.str.contains(col)]\n output_num = len(drop_columns)\n input_cols = len(df_drop.columns) - output_num\n \n model = Sequential()\n model.add(Dense(64*2, activation='relu', input_shape=(input_cols,), kernel_initializer=initializer))\n model.add(Dropout(0.5, seed=random_n))\n# model.add(Dense(64, activation='relu', kernel_initializer=initializer))\n# model.add(Dropout(0.5, seed=random_n))\n\n opt = 'rmsprop'\n \n if col == 'Embarked':\n model.add(Dense(output_num, activation='sigmoid', kernel_initializer=initializer))\n model.compile(optimizer=opt, loss='categorical_crossentropy', metrics=['acc'])\n \n else: # 'Fare', 'Age'\n model.add(Dense(1, activation='sigmoid', kernel_initializer=initializer))\n #model.compile(optimizer='adam', loss='mse', metrics=['mae'])\n model.compile(optimizer=opt, loss='binary_crossentropy', metrics=[])\n \n reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2, patience=3, min_lr=0.000001,verbose=1)\n checkpointer = ModelCheckpoint(filepath='one_hot_'+col+'.hdf5', verbose=1, save_best_only=True)\n early_stopping = EarlyStopping(patience=10, verbose=1)\n \n x_data = df_drop.drop(drop_columns, axis=1)\n y_data = df_drop[drop_columns]\n epochs = 100\n\n hist = model.fit(x_data, y_data,\n epochs=epochs, \n batch_size=32,\n verbose=1,\n validation_split=0.15,\n callbacks=[reduce_lr, early_stopping, checkpointer])\n\n null_rows = null_data(col)\n null_rows = null_rows.drop(drop_columns, axis=1)\n \n model.load_weights('one_hot_'+col+'.hdf5')\n pred = model.predict(null_rows)\n\n null_index = null_data(col).index\n \n if col == 'Embarked':\n for i, ni in enumerate(null_index):\n imax = pred[i].argmax()\n for j, dc in enumerate(drop_columns):\n if j == imax:\n df.loc[ni, dc] = 1\n else:\n df.loc[ni, dc] = 0\n \n plt.plot(hist.history['acc'], 'b-', label='acc' )\n plt.plot(hist.history['loss'], 'r-', label='loss' )\n plt.xlabel('epochs')\n plt.legend()\n plt.show()\n else: # 'Fare', 'Age'\n for i, idx in enumerate(null_index):\n df.loc[idx, col] = pred[i]\n",
"execution_count": 402,
"outputs": []
},
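{
"metadata": {},
"cell_type": "markdown",
"source": "To make the column selection inside `fill_data` explicit: the prediction targets are picked by a substring match on the column names, so 'Embarked' trains a 3-output classifier while 'Fare' and 'Age' each train a single sigmoid output. The added check below just lists those target columns; it is not required by the pipeline."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Added check: the columns fill_data() treats as prediction targets,\n# selected by substring match exactly as inside fill_data.\nfor col in ['Embarked', 'Fare', 'Age']:\n    target_cols = df_drop.columns[df_drop.columns.str.contains(col)]\n    print(col, '->', list(target_cols))",
"execution_count": null,
"outputs": []
},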
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "fill_data('Embarked') # id:62,830",
"execution_count": 403,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": "Train on 886 samples, validate on 157 samples\nEpoch 1/100\n886/886 [==============================] - 1s 912us/step - loss: 0.9259 - acc: 0.6625 - val_loss: 0.8148 - val_acc: 0.7134\n\nEpoch 00001: val_loss improved from inf to 0.81476, saving model to one_hot_Embarked.hdf5\nEpoch 2/100\n886/886 [==============================] - 0s 76us/step - loss: 0.7331 - acc: 0.7551 - val_loss: 0.7105 - val_acc: 0.7134\n\nEpoch 00002: val_loss improved from 0.81476 to 0.71048, saving model to one_hot_Embarked.hdf5\nEpoch 3/100\n886/886 [==============================] - 0s 84us/step - loss: 0.6530 - acc: 0.7551 - val_loss: 0.6770 - val_acc: 0.7134\n\nEpoch 00003: val_loss improved from 0.71048 to 0.67697, saving model to one_hot_Embarked.hdf5\nEpoch 4/100\n886/886 [==============================] - 0s 78us/step - loss: 0.6399 - acc: 0.7540 - val_loss: 0.6619 - val_acc: 0.7134\n\nEpoch 00004: val_loss improved from 0.67697 to 0.66191, saving model to one_hot_Embarked.hdf5\nEpoch 5/100\n886/886 [==============================] - 0s 81us/step - loss: 0.6273 - acc: 0.7585 - val_loss: 0.6533 - val_acc: 0.7134\n\nEpoch 00005: val_loss improved from 0.66191 to 0.65331, saving model to one_hot_Embarked.hdf5\nEpoch 6/100\n886/886 [==============================] - 0s 86us/step - loss: 0.6172 - acc: 0.7517 - val_loss: 0.6454 - val_acc: 0.7134\n\nEpoch 00006: val_loss improved from 0.65331 to 0.64538, saving model to one_hot_Embarked.hdf5\nEpoch 7/100\n886/886 [==============================] - 0s 84us/step - loss: 0.6135 - acc: 0.7540 - val_loss: 0.6417 - val_acc: 0.7197\n\nEpoch 00007: val_loss improved from 0.64538 to 0.64170, saving model to one_hot_Embarked.hdf5\nEpoch 8/100\n886/886 [==============================] - 0s 77us/step - loss: 0.6105 - acc: 0.7585 - val_loss: 0.6411 - val_acc: 0.7197\n\nEpoch 00008: val_loss improved from 0.64170 to 0.64108, saving model to one_hot_Embarked.hdf5\nEpoch 9/100\n886/886 [==============================] - 0s 82us/step - loss: 0.6143 - acc: 0.7562 - val_loss: 0.6396 - val_acc: 0.7197\n\nEpoch 00009: val_loss improved from 0.64108 to 0.63957, saving model to one_hot_Embarked.hdf5\nEpoch 10/100\n886/886 [==============================] - 0s 80us/step - loss: 0.6036 - acc: 0.7607 - val_loss: 0.6373 - val_acc: 0.7197\n\nEpoch 00010: val_loss improved from 0.63957 to 0.63731, saving model to one_hot_Embarked.hdf5\nEpoch 11/100\n886/886 [==============================] - 0s 76us/step - loss: 0.5953 - acc: 0.7664 - val_loss: 0.6369 - val_acc: 0.7197\n\nEpoch 00011: val_loss improved from 0.63731 to 0.63685, saving model to one_hot_Embarked.hdf5\nEpoch 12/100\n886/886 [==============================] - 0s 77us/step - loss: 0.5983 - acc: 0.7630 - val_loss: 0.6359 - val_acc: 0.7261\n\nEpoch 00012: val_loss improved from 0.63685 to 0.63589, saving model to one_hot_Embarked.hdf5\nEpoch 13/100\n886/886 [==============================] - 0s 79us/step - loss: 0.5942 - acc: 0.7619 - val_loss: 0.6363 - val_acc: 0.7261\n\nEpoch 00013: val_loss did not improve from 0.63589\nEpoch 14/100\n886/886 [==============================] - 0s 76us/step - loss: 0.6033 - acc: 0.7551 - val_loss: 0.6369 - val_acc: 0.7197\n\nEpoch 00014: val_loss did not improve from 0.63589\nEpoch 15/100\n886/886 [==============================] - 0s 87us/step - loss: 0.5901 - acc: 0.7709 - val_loss: 0.6421 - val_acc: 0.7197\n\nEpoch 00015: ReduceLROnPlateau reducing learning rate to 0.00020000000949949026.\n\nEpoch 00015: val_loss did not improve from 0.63589\nEpoch 16/100\n886/886 
[==============================] - 0s 75us/step - loss: 0.5828 - acc: 0.7675 - val_loss: 0.6412 - val_acc: 0.7261\n\nEpoch 00016: val_loss did not improve from 0.63589\nEpoch 17/100\n886/886 [==============================] - 0s 81us/step - loss: 0.5856 - acc: 0.7652 - val_loss: 0.6408 - val_acc: 0.7261\n\nEpoch 00017: val_loss did not improve from 0.63589\nEpoch 18/100\n886/886 [==============================] - 0s 75us/step - loss: 0.5968 - acc: 0.7664 - val_loss: 0.6403 - val_acc: 0.7197\n\nEpoch 00018: ReduceLROnPlateau reducing learning rate to 4.0000001899898055e-05.\n\nEpoch 00018: val_loss did not improve from 0.63589\nEpoch 19/100\n886/886 [==============================] - 0s 84us/step - loss: 0.5898 - acc: 0.7675 - val_loss: 0.6403 - val_acc: 0.7197\n\nEpoch 00019: val_loss did not improve from 0.63589\nEpoch 20/100\n886/886 [==============================] - 0s 80us/step - loss: 0.5836 - acc: 0.7619 - val_loss: 0.6402 - val_acc: 0.7197\n\nEpoch 00020: val_loss did not improve from 0.63589\nEpoch 21/100\n886/886 [==============================] - 0s 78us/step - loss: 0.5832 - acc: 0.7562 - val_loss: 0.6402 - val_acc: 0.7197\n\nEpoch 00021: ReduceLROnPlateau reducing learning rate to 8.000000525498762e-06.\n\nEpoch 00021: val_loss did not improve from 0.63589\nEpoch 22/100\n886/886 [==============================] - 0s 80us/step - loss: 0.5745 - acc: 0.7765 - val_loss: 0.6402 - val_acc: 0.7197\n\nEpoch 00022: val_loss did not improve from 0.63589\nEpoch 00022: early stopping\n"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYQAAAEKCAYAAAASByJ7AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xl4VOXZ+PHvbdhUZDUKZZGoICJbQgBbFBQFI1VR3KBWRa3UvlKr71uKvtq3FttqtRW1Uis/KiqiqLiAiuJGRS0ogYJsIhFEAsga9jXJ/fvjPjHDMEkmMEuSuT/XNdfMnPOcM8+Z5dxznlVUFeecc+6oZGfAOedc1eABwTnnHOABwTnnXMADgnPOOcADgnPOuYAHBOecc4AHBOeccwEPCM455wAPCM455wK1kp2Byjj++OO1TZs2yc6Gc85VK3Pnzt2kqukVpatWAaFNmzbk5uYmOxvOOVetiMiqaNJ5kZFzzjnAA4JzzrmABwTnnHOABwTnnHMBDwjOOecADwjOOecCUQUEEckRkWUikicid0ZYP1pE5ge3r0Rka7D83JDl80Vkr4hcGqx7WkRWhqzrGttDc845VxkVBgQRSQPGABcCHYAhItIhNI2q3qGqXVW1K/A34NVg+YyQ5X2B3cC7IZuOKFmvqvNjc0jOOVdzbNgAd9wBu3fH/7WiuULoAeSp6gpV3Q9MAgaWk34I8EKE5VcAb6tqAg7LOeeqv6IiuOYaeOIJyMuL/+tFExBaAKtDnucHyw4hIicBGcCHEVYP5tBA8UcR+SIocqobRV6ccy5ljBoF778PY8ZA587xf71oAoJEWKZlpB0MTFbVooN2INIc6ARMD1l8F9Ae6A40AUZGfHGRYSKSKyK5GzdujCK7zjlX/b3zDtx3HwwdCjfemJjXjCYg5AOtQp63BNaWkTbSVQDAVcBrqnqgZIGqrlOzDxiPFU0dQlXHqmq2qmanp1c4NpNzzlV7334LP/0pdOxoVwcS6W95HEQTEOYAbUUkQ0TqYCf9qeGJROQ0oDEwK8I+DqlXCK4aEBEBLgUWVS7rzjlX8+zfD1ddZfeTJ8MxxyTutSsc7VRVC0VkOFbckwY8paqLRWQUkKuqJcFhCDBJVQ8qThKRNtgVxkdhu54oIulYkdR84JYjORDnXPQ2bIDLLoO1a+HMM+3WsydkZkJdr81LqhEj4LPP4OWXoV27xL62hJ2/q7Ts7Gz14a+dOzKbNsG558LXX8OFF8KcObA6aDZSp44FhZIAceaZ0KZN4oosUt1LL8HVV8Ptt8Po0bHbr4jMVdXsCtN5QHAudWzeDOedB8uWwZtv2mOANWvsX+ns2XY/Zw7s2WPrTjjh4ADRvTscd1zyjqGmWrYMsrOhUyf4178sOMeKBwTn3EEKCiwALFkCU6dC//5lpz1wABYtKg0Qs2fbCQvsaqFjR8jKgrZtD77Vr5+YY4mksBC2brWgt3kzbNly6P3WrdCtG/zkJ9C8efLyGm7XLgu469fDvHnQqlXF21SGBwTnorR+vV2q79hhP8ru3aFBg2TnKra2boV+/eCLL+D1162oqLIKCuDzzy04zJ5t+1ob1t6wefNDg0TbtnDqqXD00dG9TnExbN8e+aRe3rKtW8ve51FHQZMmcOyxsGqVPT//fGvJc9llyQ1kqnD99fDcc9bUtLxAfbiiDQjVagpN52Jl926YMgUmTIB337UeoSVEoEOHgytbO3SAtLTk5fdIbNsGF1wACxbAq68eXjAAaNzY9nPBBaXLdu2yHrTLl9vtq6/s/o03rOI6VKtWpQGieXPLV6QTe0HBwZ9HuIYNoWlTuzVpYvsreRx6H/q4QQMLAmB5fO45u113nQWJyy6Da6+1K6hEf87jxtn38He/i08wqAy/QnApo7jYymYnTIBXXrErglat7F/iT39qJ6k5cw4uJtmyxbatXx969CgNED17woknJvVworJjh53A58yxJowDyxt0Jsa2bYscLJYvt/f12GPLPoGXtaxxY6gVo7+xqvDpp/Z9eOklu8Jo3tyKk669Frp0ic3rlGfePPjRj6B3b3j77fgFIy8yci6weLH96CdOhPx8qxC98kr70ffuXfrPMZyqndBCA8SCBVZWDZCRUVrR2qULpKeXnrhiWSF4uHbutKuBWbPshDdoULJzVOrAAahdO9m5KLV3L7z1ln1Ppk2z/HXqZH8UrrkGWkQcrOfIFBRYfcb+/fCf/9j3J148ILiU9t138Pzz9gOfP9/+eeXkWBC45JLoy7PD7dlj/+pKytFnz7YgE65+/fL/+Yaua97cmnbG0q5dMGCA/QN+4QULgC46mzfDiy/ad2f2bCtC7NvXvjuDBsWmhZWqFVO99RZ89JFdJcSTBwSXcnbtsgrTCRPgvfesiCg7237Igwdb88l4WLMGli6tuBJ082b7V1hcfOg+zjrL2p4PHHjkRSK7d8NFF9mJ5rnnYMiQI9tfKsvLs/dwwgRYscL+SFx2mV059Ot3+J/VX/5iHdBGj7bPPd48ILiU8M03dok/bRp8+KH9gz/ppNJ6gfbtk53DgxUXW9l6aJBYtAj+/nc7ltatYfhw+NnPrLy8svbssSugDz6AZ5+198AdOVUrepswwa4eCgqsDmnIEPvDkZkZfee9jz+2joGXXmq9kRPR6S/agICqVptbt27d1KW2fftUP/xQ9de/Vj39dFX7qaqecorqbbepfvSRalFRsnNZeYWFqq+9ptqnjx3PMceo/uIXqkuXRr+PPXtUc3JURVSffjpuWU15e/eqvvqq6mWXqdaubZ9Xhw6q99+v+u235W/73XeqzZurtm2rum1bYvKrqooNM1ThOdavEKqYwkIrt5w2DWbOtJYtw4fDyScnO2fl27rVmm9OmwYLF9q/9PC26M2bH96/oXXrrAXGtGn2Gjt2WKVtnz5WTj5gQOLHfImn+fPh0UetDmT/fqv7uP12a5JY1vu3bx9cfrmVSY8bBzfdlNg8p6otW+xf/oQJVl8jAuecY1cNl19+cH+WoiIrZpo1yxopJGJ+gxJ+hVCNbNig+uyzqldfrdqokf3jSEtT7dZNtVYt+8c3cKDqjBmqxcXJzq0pLlZduFD1gQdUe/e2/IJq48aq55+v2r69ap06pf/gQfXYY1W7dlW98krV//1f1fHjVT/5RHX9+oOPq7BQ9d//Vr3nHtXMzNLtW7ZUHTZM9fXXVXfsSNqhJ8x336n+/veqzZrZ8bdvr/rEE6o7dx6cbt8+1UsusTRPPpmcvDrVr7+2z+vUU+2zqFdPdfBg1bfeUj1wQPXuu235+PGJzxtRXiEk/SRfmVtNCQhFRapz5tiXp0cPO+GD6oknqt5wg+rLL6tu3Wpp16yxE+Pxx1uazp1Vn3rKigcSbedO1SlTVH/+c9VWrUpP1F262An+k0/si1+isFB1xQrV6dNVH39c9Ve/Uh0wwH4wJQGk5NaggQXAH/9YtUmT0qB49tl2Kb5gQdUJhom2d6/9YejWrTTo/uY3qqtWqe7fb0UXoDpmTLJz6lTte
zp7tuqtt6o2bWqfTXq63d90U3LyFG1ASIkio2eeKR2H5XDUqmXNBCM1HWzcOLrOJFu3WsuXadOs+GP9eru87NHDijx+/GOrmCqrTfyePVaE8OijViSTng4//zn813/Fd0yWvDwrhpg2zTp17d9vHYr69bM8X3jh4bXRPnDAKlFLOiqV3PLzrWXQgAFWRHI4Fas1lSr8+9/wyCPW41jEiuK+/NK+F7fdluwcunD799twFBMm2G/45ZcPv8nzkfBWRiEGDrST8OEqLLQfY1kaNSq7d6WItfj49FMrQyzp/j9ggJUNV7YziirMmGEngDfesGB11VVWxpxdcQlhuXbsKD0xz5plQWD5clt32mkWAAYMsCaSPmZ+cq1aZTNpTZxozRcT0XTRVV8eEGKopKlgRYNshd9v22bbd+lSehXQs2fsut5//TX87W/w1FN2Mv/Rj+zEcNllZb/G7t0HDycQOqTA+vWl6erWtaZxJVcBp5wSmzw75xIvpgFBRHKAR7EZ08ap6gNh60cD5wZPjwFOUNVGwboiYGGw7ltVvSRYngFMApoA84BrVXV/efmobq2MCgvtMjHeY8dv3w7jx1tw+PprG5/n1lutDX74iX/NmoO3bdYs8uiUbdsm59LWORd7MQsIIpIGfAX0A/KxOZaHqOqSMtL/EshU1RuD5ztV9ZDBZUXkJeBVVZ0kIv8AFqjqE+XlpboFhEQrKrLy/kcftU5aJZo2tRN8u3aHDklc04Z5ds4dKpbDX/cA8lR1RbDjScBAIGJAwOZW/l0FmROgL/CTYNEzwL1AuQHBlS8tzXqpXnKJVTRu324nfq+Ydc5FI5qA0AJYHfI8H+gZKaGInARkACH/T6knIrlAIfCAqr4ONAW2qmphyD4jtlURkWHAMIDWrVtHkV0HVW/IBudc1RdNQIjUN7KscqbBwGRVDZ3eorWqrhWRk4EPRWQhsD3afarqWGAsWJFRFPl1zjl3GMpo9X6QfCB0hs+WwNoy0g4GXghdoKprg/sVwL+ATGAT0EhESgJSeft0zjmXANEEhDlAWxHJEJE62El/angiETkNaAzMClnWWETqBo+PB3oBS4KeczOAK4Kk1wNTjuRAnHPOHZkKA0JQzj8cmA4sBV5S1cUiMkpELglJOgSYpAc3WzodyBWRBVgAeCCkddJI4L9FJA+rU/jnkR+Oc865w+Ud05xzroaLttlpNEVGzjnnUoAHBOecc4AHBOeccwEPCM455wAPCM455wIeEJxzzgEeEJxzzgU8IDjnnAM8IDjnnAukRkBQhYKCZOfCOeeqtBjN7lvF5eTYZMIff5zsnDjnXJWVGlcI7drBf/4DxcXJzolzzlVZqREQsrJg1y6bad4551xEqRMQwK4SnHPORZQaAaFDB6hTB+bNS3ZOnHOuyooqIIhIjogsE5E8EbkzwvrRIjI/uH0lIluD5V1FZJaILBaRL0Tk6pBtnhaRlSHbdY3dYYWpXRs6dfKA4Jxz5aiwlZGIpAFjgH7Y/MpzRGRqyMxnqOodIel/ic2bDLAbuE5Vl4vID4C5IjJdVbcG60eo6uQYHUv5srLglVesCapIQl7SOeeqk2iuEHoAeaq6QlX3A5OAgeWkHwK8AKCqX6nq8uDxWmADkH5kWT5MWVmwZQt8+21SXt4556q6aAJCC2B1yPP8YNkhROQkIAP4MMK6HkAd4OuQxX8MipJGi0jdqHN9ODKDixYvNnLOuYiiCQiRylfKmoh5MDBZVYsO2oFIc2ACcIOqlnQGuAtoD3QHmgAjI764yDARyRWR3I0bN0aR3TJ07gxpaR4QnHOuDNEEhHygVcjzlsDaMtIOJiguKiEiDYC3gHtUdXbJclVdp2YfMB4rmjqEqo5V1WxVzU5PP4LSpqOPhtNP96anzjlXhmgCwhygrYhkiEgd7KQ/NTyRiJwGNAZmhSyrA7wGPKuqL4elbx7cC3ApsOhwDyJqmZl+heCcc2WoMCCoaiEwHJgOLAVeUtXFIjJKRC4JSToEmKSqocVJVwG9gaERmpdOFJGFwELgeOAPMTie8mVlwbp18N13cX8p55yrbuTg83fVlp2drbm5uYe/g5kzoU8fmDYNLrwwdhlzzrkqTETmqmp2RelSo6dyia7BxYkXGznn3CFSKyA0aACnnuoBwTnnIkitgABWj+AtjZxz7hCpGRBWrvQZ1JxzLkzqBYSSHst+leCccwfxgOCccw5IxYCQng6tWnnFsnPOhUm9gADeY9k55yJIzYCQlQXLlsHOncnOiXPOVRmpGxBU4Ysvkp0T55yrMlIzIPjcCM45d4jUDAgtWljlsgcE55z7XmoGBBHvseycc2FSMyCABYRFi2DfvmTnxDnnqoTUDQiZmVBYaEHBOedcdAFBRHJEZJmI5InInRHWjw6ZAOcrEdkasu56EVke3K4PWd5NRBYG+3wsmDktcbKy7N6LjZxzDoBaFSUQkTRgDNAPm195johMVdUlJWlU9Y6Q9L8EMoPHTYDfAdmAAnODbQuAJ4BhwGxgGpADvB2j46rYySdDw4Zeseycc4ForhB6AHmqukJV9wOTgIHlpB8CvBA8vgB4T1W3BEHgPSAnmE+5garOCqbcfBabVzlxRGzCHA8IzjkHRBcQWgCrQ57nB8sOISInARnAhxVs2yJ4XOE+4yorCxYssLoE55xLcdEEhEhl+2VNxDwYmKyqRRVsG/U+RWSYiOSKSO7GjRsrzGylZGXB3r02jIVzzqW4aAJCPtAq5HlLYG0ZaQdTWlxU3rb5weMK96mqY1U1W1Wz09PTo8huJXiPZeec+140AWEO0FZEMkSkDnbSnxqeSEROAxoDs0IWTwf6i0hjEWkM9Aemq+o6YIeInBm0LroOmHKEx1J5p50GRx/tAcE554iilZGqForIcOzkngY8paqLRWQUkKuqJcFhCDApqCQu2XaLiNyHBRWAUaq6JXj8C+Bp4GisdVHiWhiVqFULunTxpqfOOQdIyPm7ysvOztbc3NzY7vTWW+G552yO5aNSt5+ec67mEpG5qppdUTo/A2ZmwvbtsGJFsnPinHNJ5QHBeyw75xzgAQHOOANq1/aKZedcyvOAULeuBQUPCM65FOcBAUrnRqhGFezOORdrHhDAAsLGjbBmTbJz4pxzSeMBAbzHsnPO4QHBdOlio596QHDOpTAPCADHHgvt23vTU+dcSvOAUCIry68QnHMpzQNCicxMyM+3ymXnnEtBHhBKeI9l51yK84BQwlsaOedSnAeEEo0aQUaGBwTnXMrygBCqpMeyc86lIA8IobKyIC8Ptm1Ldk6ccy7hogoIIpIjIstEJE9E7iwjzVUiskREFovI88Gyc0Vkfshtr4hcGqx7WkRWhqzrGrvDOkwl9Qjz5yc3H845lwQVTqEpImnAGKAfkA/MEZGpqrokJE1b4C6gl6oWiMgJAKo6A+gapGkC5AHvhux+hKpOjtXBHLGSlkbz5kGfPsnNi3POJVg0Vwg9gDxVXaGq+4FJwMCwNDcDY1S1AEBVN0TYzxXA26q6+0gyHFcnngg/+IHXIzjnUlI0AaEFsDrkeX6w
LFQ7oJ2IfCois0UkJ8J+BgMvhC37o4h8ISKjRaRu1LmOp8xMb2nknEtJ0QQEibAsfOKAWkBb4BxgCDBORBp9vwOR5kAnYHrINncB7YHuQBNgZMQXFxkmIrkikrsxEb2Is7Jg6VLYXXUvZJxzLh6iCQj5QKuQ5y2BtRHSTFHVA6q6EliGBYgSVwGvqeqBkgWquk7NPmA8VjR1CFUdq6rZqpqdnp4eRXaPUFYWFBfDwoXxfy3nnKtCogkIc4C2IpIhInWwop+pYWleB84FEJHjsSKkFSHrhxBWXBRcNSAiAlwKLDqcA4i50Ipl55xLIRW2MlLVQhEZjhX3pAFPqepiERkF5Krq1GBdfxFZAhRhrYc2A4hIG+wK46OwXU8UkXSsSGo+cEtsDukItWoFTZp4QHDOpRzRajSPcHZ2tubm5sb/hfr1g4ICSMRrOedcnInIXFXNriid91SOJCvL6hD27092TpxzLmE8IESSmWnBYMmSitM651wN4QEhEq9Yds6lIA8IkZx6KtSv7z2WnXMpxQNCJEcdBV27+hWCcy6leEAoS1aWjXpaVJTsnDjnXEJ4QChLVpYNX7F8ebJz4pxzCeEBoSxeseycSzEeEMrSvj3UresBwTmXMjwglKV2bejc2VsaOedShgeE8mRl2RVCNRrewznnDpcHhPJkZsLWrfDNN8nOiXPOxZ0HhPKUVCx7sZFzLgV4QChPp06QluYVy865lOABoTz16kGHDh4QnHMpIaqAICI5IrJMRPJE5M4y0lwlIktEZLGIPB+yvEhE5ge3qSHLM0TkMxFZLiIvBrOxVT1ZWTYvgs+x7Jyr4SoMCCKSBowBLgQ6AENEpENYmrbAXUAvVT0DuD1k9R5V7RrcLglZ/mdgtKq2BQqAm47sUOLkiitg0ybo0wfWrUt2bpxzLm6iuULoAeSp6gpV3Q9MAgaGpbkZGKOqBQCquqG8HQbzKPcFJgeLnsHmVa56LroIpkyBpUuhRw+vYHbO1VjRBIQWwOqQ5/nBslDtgHYi8qmIzBaRnJB19UQkN1hectJvCmxV1cJy9ll1XHwxfPopiMBZZ8Hrryc7R845F3PRBASJsCy8p1YtoC1wDjAEGCcijYJ1rYO5PH8CPCIip0S5T3txkWFBQMnduHFjFNmNky5d4PPPoWNHGDQIHnzQO6w552qUaAJCPtAq5HlLYG2ENFNU9YCqrgSWYQECVV0b3K8A/gVkApuARiJSq5x9Emw3VlWzVTU7PT09qoOKm2bN4F//giuvhJEj4aabfN5l51yNEU1AmAO0DVoF1QEGA1PD0rwOnAsgIsdjRUgrRKSxiNQNWd4LWKKqCswArgi2vx6YcqQHkxBHHw0vvAD/938wfjz07w+bNyc7V845d8QqDAhBOf9wYDqwFHhJVReLyCgRKWk1NB3YLCJLsBP9CFXdDJwO5IrIgmD5A6paMnP9SOC/RSQPq1P4ZywPLK6OOgp+/3uYOBFmz4aePeHLL5OdK+ecOyKi1agcPDs7W3Nzc5OdjYPNmgWXXgr79sHkyXD++cnOkXPOHURE5gZ1ueXynspH6oc/hM8+g1atICcHnnwy2TlyzrnD4gEhFtq0sWapF1wAt9wCd9zhczE756odDwix0qABTJ0Kt98OjzwCAwfC9u3JzpVzzkXNA0IspaXB6NHwxBPwzjvQq5fPpeCcqzY8IMTDLbdYQFi92logffJJsnPknHMV8oAQL+efb01SGzaEvn29stk5V+V5QIin9u1tuIvzzrOrhltu8Z7NzrkqywNCvDVqBG++aUNdPPmkBYf165OdK+ecO4QHhERIS4MHHoBJk2DuXMjOtkl3nHOuCvGAkEhXXw3//rcFiLPPhueeS3aOnHPuex4QEq1rV5gzB848E669Fv7nf6CwsOLtnHMuzjwgJEN6Orz7Lvzyl/Dww3DhhT5iqnMu6TwgJEvt2vDYY/DUUzBzJnTvDgsXJjtXzrkU5gEh2W64AT76CPbutYHyXnkl2TlyzqUoDwhVwZlnWqujTp3giivgnnuguDjZuXLOpRgPCFXFD35g03PeeCP88Y82ON62bcnOlXMuhUQVEEQkR0SWiUieiNxZRpqrRGSJiCwWkeeDZV1FZFaw7AsRuTok/dMislJE5ge3rrE5pGqsbl0YNw4ef9zGQurZExYtSnaunHMposKAICJpwBjgQqADMEREOoSlaQvcBfRS1TOA24NVu4HrgmU5wCMi0ihk0xGq2jW4zT/yw6kBRODWW+H992HLFsjKsuk6fcgL51ycRXOF0APIU9UVqrofmAQMDEtzMzBGVQsAVHVDcP+Vqi4PHq8FNgDpscp8jdanDyxZAldeCffeC926Wf8F55yLk2gCQgtgdcjz/GBZqHZAOxH5VERmi0hO+E5EpAdQB/g6ZPEfg6Kk0SJSN9KLi8gwEckVkdyNGzdGkd0a5PjjYeJEeOMNKCiwyucRI2D37mTnzDlXA0UTECTCMg17XgtoC5wDDAHGhRYNiUhzYAJwg6qWNJ+5C2gPdAeaACMjvbiqjlXVbFXNTk9P0YuLiy6CxYvhZz+Dv/wFunSxpqrOORdD0QSEfKBVyPOWwNoIaaao6gFVXQkswwIEItIAeAu4R1Vnl2ygquvU7APGY0VTriwNG9poqR9+aE1SzzkHfvELn6bTORcz0QSEOUBbEckQkTrAYGBqWJrXgXMBROR4rAhpRZD+NeBZVX05dIPgqgEREeBSwJvTROPcc61H83//N4wdC2ecAdOmJTtXzrkaoMKAoKqFwHBgOrAUeElVF4vIKBG5JEg2HdgsIkuAGVjroc3AVUBvYGiE5qUTRWQhsBA4HvhDTI+sJjvmGPjrX23k1IYN4cc/toHyfDwk59wRENXw6oCqKzs7W3N9HoGD7dsHf/qT3Ro3tj4MV15pzVedcw4Qkbmqml1ROu+pXN3VrWv9FObOhZNOsjkXBg2CteHVPM45Vz4PCDVF584waxY89JD1cu7QAX77W/jgA9i1K9m5c85VA15kVBMtXw7Dh1tv5+JiqFXLOradfbbdzjoLmjRJdi6dcwkSbZGRB4SabPt2q3j++GO7ffZZ6RAYHTuWBoizz4aWLZObV+dc3HhAcIfau9eGv/j4Y5uU59//hh07bF1GRmlw6N0b2rZNXMX05s3wySeWpyVL4JZbbLRX51xMeEBwFSsshC++KA0QH38MJcODNG0KmZk2B3TXrva4XTsrfjpSa9eWvubMmaUjutata8N1rFkDQ4fCI49Ys1rn3BHxgOAqTxW++spO0p9/Dv/5j52s9+2z9fXq2SQ+oYGic2c49tjy9/nNN6Un/5kzIS/P1tWvD7162RXJ2WfbNKJHHQWjRsH991sx1tNPW2c859xh84DgYuPAAVi2zILD/Pl2+89/bLA9sGKldu1KryK6doVmzWD27NIAkJ9vaZs0KS2S6t3b0pZ1xTF7Nlx3nVWQ33679bM4+ujEHLNzNYwHBBc/qrB6dWlwKLlftergdM2a2TDeJQGgQwe7AojWrl0wciSMGQPt28OECZBd4XfaORfGA4JLvIICWLD
A6gh69IBTTolNxfR778ENN8B331nfiv/9X6hd+8j361yK8IDgapaCArjtNnjuObtKmDDBrhqccxXyoStczdK4sQWBl1+GlSutvuLRR63jnXMuJjwguOrliius5dP551tl8/nnw7ffJjtXztUIHhBc9dOsGUydCuPGWUe7Tp3gmWessts5d9g8ILjqSQRuusk61nXpYh3ZBg2yiudU4kHQxVBUAUFEckRkmYjkicidZaS5SkSWiMhiEXk+ZPn1IrI8uF0fsrybiCwM9vlYMHOac5WTkQEzZthc09OmwamnWkukmj616P79dswnnAB33eV1KS4mKmxlJCJpwFdAP2zu5DnAEFVdEpKmLfAS0FdVC0TkBFXdICJNgFwgG1BgLtAtSPM58CtgNjANeExV3y4vL5FaGR04cID8/Hz27t1bmeOu8urVq0fLli2p7c0ro5eXB/fcAy++aENv3H23zTthGa/+AAAU5ElEQVRdr16ycxZb77wDv/qV9So//XRYutSujiZMsNn0nAsTbSsjVLXcG/BDYHrI87uAu8LSPAj8LMK2Q4AnQ54/GSxrDnxZVrqybt26ddNwK1as0I0bN2pxcfEh66qr4uJi3bhxo65YsSLZWamecnNV+/VTBdXWrVXHj1ctLEx2ro7c8uWqF19sx9W2reqbb6oWF6s+/LCqiGq3bqpr1iQ7l64KAnK1gvOrqkZVZNQCWB3yPD9YFqod0E5EPhWR2SKSU8G2LYLH5e0zKnv37qVp06bUpBInEaFp06Y17qonYbp1g3fftfkgTjjBOrV16WIV0dWxzH3nTisWOuMMKx77859h4UKbS1sE7rgDpkyBL7+Enj2t57hzhyGagBDpTBv+q6oFtAXOwf7tjxORRuVsG80+7cVFholIrojkbiwZifPQNJFzXo3VxGNKuPPOs0H6XnrJytwHDrSxlD75JLavs38/bN0a232CBa+JE+G00+CBB2x61GXL4De/sZFhQ118celxnXUWvPFG7PPjarxoAkI+0CrkeUsgfMLefGCKqh5Q1ZXAMixAlLVtfvC4vH0CoKpjVTVbVbPT09OjyK5zIUTgyith8WL4xz9gxQoLChdfbP+yK2vnTvj0U3j8cWvllJlpo7Y2bmyD9f3P/8Dbb1u6IzFvnuXzpz+F5s1t7opnn4Uf/KDsbbp2tUmQ2re34Dd6dPW8InLJU1GZEvbvfwWQAdQBFgBnhKXJAZ4JHh+PFRM1BZoAK4HGwW0l0CRINwc4E7taeBsYUFFeItUhLFmyJNbFbVVGTT62pNm1S/VPf1Jt2NDK3a+7TvWbbyKn3bxZ9f33VR98UHXIENXTTrNt7DSrmp6u2r+/6p13qt53n2rfvqp169q62rVVzz5b9fe/V/30U9X9+6PL34YNqsOG2eukp6uOG6daVFS5Y9y5U3XQIMvHz38e/WtXBwsXqt58s+rzz9es44ozoqxDqDCB7YsBWEujr4G7g2WjgEuCxwI8DCwBFgKDQ7a9EcgLbjeELM8GFgX7fJygxVN5t6oaEAYOHKhZWVnaoUMHffLJJ1VV9e2339bMzEzt3Lmz9u3bV1VVd+zYoUOHDtWOHTtqp06ddPLkyeXutyocW421ebPqr39tJ/A6dVRvv111yhQ7gV96qVVGl5z4SyqnBw609VOnqq5ebRW64XbvVn3vPdWRI62StySA1K+vetFFqo88Yie18G0PHFB97DHVRo1Ua9VSveMO1YKCwz++oiLLA1gF+5HsqyooLlYdM0a1Xj3VtLTSz+Thh1W3b0927qq8aANCtR/cbunSpZx++umAjWQQ6/q0rl1t4q7ybNmyhSZNmrBnzx66d+/OBx98QHZ2NjNnziQjI+P79SNHjmTfvn08EuywoKCAxo0bl7nf0GNzcbJ6Ndx7r03EU1xcOr9DZqbdsrLsvmnTw9v/li1WEfz++/DBBza/A1hv6759beiNpk1tBNfFi+35o4/aUOGxMH48DBtmU6K++SacfHJs9ptImzdb8dyUKZCTY8c0Z471w5g502bVu+UWG/ywvCK1FBZts9MYzIfoHnvsMV577TUAVq9ezdixY+nduzcZGRkANGnSBID333+fSZMmfb9decHAJUirVvDPf9oJ+bvvrDVS/fqx23+TJnD55XYDmzPigw/s9v778HzQhzMjA157zcr+Y9mg4IYbbN+DBlkLpNdft1nqqouPPoJrroENG+Dhh63/xVFHWR3QxRdbnclf/woPPWTrr7nG6nE6dkx2zqunaC4jqsqtKhYZzZgxQ3v16qW7du1SVdU+ffrolClT9JprrjkkbWZmpi5fvjzqfSf72FycFRerfvGF6uTJqnv2xPe1li1TPfVUKx577rn4vlYsHDig+tvfWpFbu3aqc+eWn/7rr1WHD1c95hgrTsrJUf3gg8jFeimIGPZDcOXYtm0bjRs35phjjuHLL79k9uzZ7Nu3j48++oiVK1cCVqQE0L9/fx5//PHvty0omYbSpSYRG5jv8svj35u6XTublvSHP7SWS7/7XdVtgbRqlc20d999NkbV3LlWdFeek0+Gv/3NRr79wx9sBr/zzrM+Kc8/b1PBugp5QDhCOTk5FBYW0rlzZ377299y5plnkp6eztixYxk0aBBdunTh6quvBuCee+6hoKCAjh070qVLF2bMmJHk3LuU0rSpddgbOhRGjYKf/ASqWufHl1+2YrtFi+xE/tRTlSvCKxmy5JtvbDTcPXusGOnUU60Z7o4dcct6jRDNZURVuVXFIqN4qsnH5pKouFj1/vutaKVNG9XHH7fmuMm0c6c1JwXVnj2tCCgWiopU33hDtU8f23fDhtaibPHi2Oy/msCLjJxzEYnAnXfa1ULz5jB8OLRpA3/8o01VmmgLFti0qOPG2RAdH38cu9ZQRx0FF10E//qX9Vq/8EIYM8aGATn7bJuSdc+e2LxWDeABwblU1a+f9bqeORO6d7eRYlu3hhEjYG3EgQNiS9XK/Xv2hG3b4L334E9/gniN8Nu9O7zwAqxZAw8+aK3Krr0WWrSwNutLllS8jxrOA4JzqUzE/im/9ZZ14rnkEmu+mZEBN99sQ2zHw6ZN1sT2ttus78WCBVYJnAjp6Rb0vvoKPvwQLrgA/v53u2o46ywbRjxFrxo8IDjnTJcuNpje8uXws59ZcUr79jYWVFiH0Erbv9/Gjnr+eSuu6tIFpk+Hxx6zgfiSMU6ZCJx7bulVw0MPWX+H666zDm6/+pV1FkwhNaqnck1Tk4/NVQPr19sJe8wYK9I5/3w7mfftW3bnOVXIz7epTRcutNsXX9gorSVNP2vXtjqDv//dhgKoSlStM9zYsfDKKxbIevWy3t5XXglHH53sHB6WaHsqe0CowmrysblqZPt2ePJJK0r67js7mZcEhiVLDj7xL1xowaNE69bW16JTJ+jc2e5POy1+9QSxtGkTPPOMBYevvoJGjay+o25dqFPHjiHSfVnrMjKseCoJQ9t7QEig+vXrs/NIhzuOoCocm3Pf27vXytcffNCmKw3VoMGhJ/6OHe0kWt2pWsX7uHGlVzr795d/X1gYeV
+9ellg7dEjoYfgYxk552KrXj2raL7xRht3KS/PTvqdOtmVQE2d1EnEek736RP9NsXFFhhKgsT+/Ta44D332FXGNddYi6rWreOX78PglcoxpKqMGDGCjh070qlTJ1588UUA1q1bR+/evenatSsdO3bk448/pqioiKFDh36fdvTo0UnOvXNRSkuDK66wYqOLLoKTTqq5weBwHXWUFS3Vr28DHDZrZhX1y5dbT+pXXrGis7vvrlK9p2vWFUKyxr8OvPrqq8yfP58FCxawadMmunfvTu/evXn++ee54IILuPvuuykqKmL37t3Mnz+fNWvWsGjRIgC2xmMKRudc1XLccTbW0rBhNsLun/5ko+3ed59deaWlJTV7foUQQ5988glDhgwhLS2NE088kT59+jBnzhy6d+/O+PHjuffee1m4cCHHHXccJ598MitWrOCXv/wl77zzDg0aNEh29p1zidK6tTXr/ewzG2dp2DCbd+O995KaraiuEEQkB3gUSAPGqeoDYeuHAg8Ba4JFj6vqOBE5FwgtC2mPzab2uog8DfQBSpokDFXVI/t7H+U/+Xgpq4K+d+/ezJw5k7feeotrr72WESNGcN1117FgwQKmT5/OmDFjeOmll3jqqacSnGPnXFL16GFDdbzyCvzmN9C/PwwYYH0iYjVJUiVUeIUgImnAGOBCoAMwREQi5fRFVe0a3MYBqOqMkmVAX2A38G7INiNCtolxWU/i9e7dmxdffJGioiI2btzIzJkz6dGjB6tWreKEE07g5ptv5qabbmLevHls2rSJ4uJiLr/8cu677z7mzZuX7Ow755JBxOpkli61QPDJJ9ZS69ZbYePGhGYlmiuEHkCeqq4AEJFJwEBs/uTKuAJ4W1V3V3K7auOyyy5j1qxZdOnSBRHhwQcfpFmzZjzzzDM89NBD1K5dm/r16/Pss8+yZs0abrjhBoqLiwG4//77k5x751xS1a0Lv/41XH89/P738I9/WLHSPffYEB9168Y9CxX2QxCRK4AcVf1Z8PxaoKeqDg9JMxS4H9gIfAXcoaqrw/bzIfCwqr4ZPH8a+CGwD/gAuFNV90V4/WHAMIDWrVt3W7Vq1UHra3Jb/Zp8bM65CixdamMuvfWWdWqbMsWa+B6GaPshRFOpHKk9WXgUeQNoo6qdgfeBZ8Iy0xzoBEwPWXwXVqfQHWgCjIz04qo6VlWzVTU7PRnjnTjnXDKcfrr1XXj3XZvxLlZDgpcjmoCQD7QKed4SOGhsXFXdHPLv/v8B3cL2cRXwmqoeCNlmXTB3wz5gPFY05ZxzLlS/fvDOO3DssXF/qWgCwhygrYhkiEgdYDAwNTRBcAVQ4hJgadg+hgAvRNpGRAS4FFhUuaw755yLpQorlVW1UESGY8U9acBTqrpYREZh07JNBW4TkUuAQmALMLRkexFpg11hfBS264kiko4VSc0Hbjncg1BVpIb1lKxOY0w552qGaj+43cqVKznuuONo2rRpjQkKqsrmzZvZsWMHGRkZyc6Oc66aS5nB7Vq2bEl+fj4bE9xeN97q1atHy5Ytk50N51wKqfYBoXbt2v4v2jnnYsDHMnLOOQd4QHDOORfwgOCccw6oZq2MRGQHsCzZ+YjgeGBTsjMRgeercjxfleP5qpxk5uskVa1wqIfqVqm8LJqmU4kmIrmer+h5virH81U5nq/D50VGzjnnAA8IzjnnAtUtIIxNdgbK4PmqHM9X5Xi+KsfzdZiqVaWyc865+KluVwjOOefipEoGBBHJEZFlIpInIndGWF9XRF4M1n8WjKga7zy1EpEZIrJURBaLyK8ipDlHRLaJyPzg9n/xzlfwut+IyMLgNXMjrBcReSx4v74QkawE5Om0kPdhvohsF5Hbw9Ik5P0SkadEZIOILApZ1kRE3hOR5cF94zK2vT5Is1xErk9Avh4SkS+Dz+k1EWlUxrblfuZxyNe9IrIm5LMaUMa25f5245CvF0Py9I2IRJybPc7vV8RzQ1X4jlWaqlapGzbE9tfAyUAdYAHQISzNfwH/CB4PBl5MQL6aA1nB4+OwqULD83UO8GYS3rNvgOPLWT8AeBsbavxM4LMkfKbfYW2hE/5+Ab2BLGBRyLIHsWlbAe4E/hxhuybAiuC+cfC4cZzz1R+oFTz+c6R8RfOZxyFf9wK/juJzLve3G+t8ha3/K/B/SXi/Ip4bqsJ3rLK3qniF0APIU9UVqrofmAQMDEszkNJpOicD50mcx75Wm+FtXvB4BzYJUIt4vmYMDQSeVTMbaBQ2qVG8nQd8raqrKkwZB6o6E5unI1Tod+gZbJKmcBcA76nqFlUtAN4DcuKZL1V9V1ULg6ezsRkKE6qM9ysa0fx245Kv4Pd/FWETcSVCOeeGpH/HKqsqBoQWwOqQ5/kceuL9Pk3w49kGNE1I7vh+0p9M4LMIq38oIgtE5G0ROSNBWVLgXRGZKyLDIqyP5j2Np8GU/UNNxvsFcKKqrgP7QQMnREiT7PftRuzKLpKKPvN4GB4UZT1VRvFHMt+vs4H1qrq8jPUJeb/Czg3V4Tt2kKoYECL90w9vChVNmrgQkfrAK8Dtqro9bPU8rFikC/A34PVE5AnopapZwIXArSLSO2x9Mt+vOti0qi9HWJ2s9ytayXzf7sZmIJxYRpKKPvNYewI4BegKrMOKZ8Il7f0iwjS9YeL+flVwbihzswjLktb0syoGhHxsys0SLYG1ZaURkVpAQw7vErdSRKQ29oFPVNVXw9er6nZV3Rk8ngbUFpHj450vVV0b3G8AXsMu3UNF857Gy4XAPFVdH74iWe9XYL2UzuvdHNgQIU1S3regYvEi4BoNCprDRfGZx5SqrlfVIlUtBv5fGa+XrPerFjAIeLGsNPF+v8o4N1TZ71hZqmJAmAO0FZGM4N/lYGBqWJqpQElt/BXAh2X9cGIlKKP8J7BUVR8uI02zkroMEemBvb+b45yvY0XkuJLHWKXkorBkU4HrxJwJbCu5lE2AMv+5JeP9ChH6HboemBIhzXSgv4g0DopI+gfL4kZEcoCRwCWquruMNNF85rHOV2id02VlvF40v914OB/4UlXzI62M9/tVzrmhSn7HypWs2uzyblirmK+wFgt3B8tGYT8SgHpYEUQe8DlwcgLydBZ2KfcFMD+4DQBuAW4J0gwHFmOtK2YDP0pAvk4OXm9B8Nol71dovgQYE7yfC4HsBH2Ox2An+IYhyxL+fmEBaR1wAPtHdhNW5/QBsDy4bxKkzQbGhWx7Y/A9ywNuSEC+8rAy5ZLvWElruh8A08r7zOOcrwnBd+cL7ETXPDxfwfNDfrvxzFew/OmS71RI2kS+X2WdG5L+HavszXsqO+ecA6pmkZFzzrkk8IDgnHMO8IDgnHMu4AHBOecc4AHBOedcwAOCc3EkNqLrm8nOh3PR8IDgnHMO8IDgHAAi8lMR+TwYL/9JEUkTkZ0i8lcRmSciH4hIepC2q4jMltI5CxoHy08VkfeDwfrmicgpwe7ri8hksXkOJob0zn5ARJYE+/lLkg7due95QHApT0ROB67GBkDrChQB1wDHYuMwZQEfAb8LNnkWG
KmqnbHeuyXLJwJj1Abr+xHWqxZs9MvbsTHyTwZ6iUgTbAiIM4L9/CG+R+lcxTwgOGfzNXQD5ojNuHUeduIupnTAtOeAs0SkIdBIVT8Klj8D9A7Gymmhqq8BqOpeLR2L6HNVzVcbGG4+0AbYDuwFxonIICDiuEXOJZIHBOdsrKdnVLVrcDtNVe+NkK68cV7Km6BpX8jjImxGtEJsxM1XsIlT3qlknp2LOQ8IztnAY1eIyAnw/Vy4J2G/jyuCND8BPlHVbUCBiJwdLL8W+Eht/Pt8Ebk02EddETmmrBcMxs5vqDbs9+3YPAPOJVWtZGfAuWRT1SUicg82o9ZR2GiatwK7gDNEZC42K9/VwSbXA/8ITvgrgBuC5dcCT4rIqGAfV5bzsscBU0SkHnZ1cUeMD8u5SvPRTp0rg4jsVNX6yc6Hc4niRUbOOecAv0JwzjkX8CsE55xzgAcE55xzAQ8IzjnnAA8IzjnnAh4QnHPOAR4QnHPOBf4/iY0gcJGY2VsAAAAASUVORK5CYII=\n",
"text/plain": "<matplotlib.figure.Figure at 0x7f118b27b4a8>"
},
"metadata": {},
"output_type": "display_data"
}
]
},
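{
"metadata": {},
"cell_type": "markdown",
"source": "Added check: the two passengers with missing Embarked (PassengerId 62 and 830, per the comment above) should now have exactly one of the three one-hot Embarked columns set to 1."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Added check: the imputed one-hot Embarked values for PassengerId 62 and 830.\nembarked_cols = df.columns[df.columns.str.contains('Embarked')]\ndf.loc[[62, 830], embarked_cols]",
"execution_count": null,
"outputs": []
},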
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "fill_data('Fare') # id:1044",
"execution_count": 404,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": "Train on 886 samples, validate on 157 samples\nEpoch 1/100\n886/886 [==============================] - 1s 967us/step - loss: 0.5537 - val_loss: 0.3452\n\nEpoch 00001: val_loss improved from inf to 0.34516, saving model to one_hot_Fare.hdf5\nEpoch 2/100\n886/886 [==============================] - 0s 86us/step - loss: 0.2847 - val_loss: 0.2550\n\nEpoch 00002: val_loss improved from 0.34516 to 0.25497, saving model to one_hot_Fare.hdf5\nEpoch 3/100\n886/886 [==============================] - 0s 81us/step - loss: 0.2418 - val_loss: 0.2436\n\nEpoch 00003: val_loss improved from 0.25497 to 0.24355, saving model to one_hot_Fare.hdf5\nEpoch 4/100\n886/886 [==============================] - 0s 79us/step - loss: 0.2335 - val_loss: 0.2393\n\nEpoch 00004: val_loss improved from 0.24355 to 0.23935, saving model to one_hot_Fare.hdf5\nEpoch 5/100\n886/886 [==============================] - 0s 88us/step - loss: 0.2276 - val_loss: 0.2376\n\nEpoch 00005: val_loss improved from 0.23935 to 0.23763, saving model to one_hot_Fare.hdf5\nEpoch 6/100\n886/886 [==============================] - 0s 82us/step - loss: 0.2287 - val_loss: 0.2370\n\nEpoch 00006: val_loss improved from 0.23763 to 0.23698, saving model to one_hot_Fare.hdf5\nEpoch 7/100\n886/886 [==============================] - 0s 80us/step - loss: 0.2256 - val_loss: 0.2358\n\nEpoch 00007: val_loss improved from 0.23698 to 0.23581, saving model to one_hot_Fare.hdf5\nEpoch 8/100\n886/886 [==============================] - 0s 82us/step - loss: 0.2274 - val_loss: 0.2353\n\nEpoch 00008: val_loss improved from 0.23581 to 0.23531, saving model to one_hot_Fare.hdf5\nEpoch 9/100\n886/886 [==============================] - 0s 79us/step - loss: 0.2232 - val_loss: 0.2352\n\nEpoch 00009: val_loss improved from 0.23531 to 0.23519, saving model to one_hot_Fare.hdf5\nEpoch 10/100\n886/886 [==============================] - 0s 83us/step - loss: 0.2240 - val_loss: 0.2344\n\nEpoch 00010: val_loss improved from 0.23519 to 0.23441, saving model to one_hot_Fare.hdf5\nEpoch 11/100\n886/886 [==============================] - 0s 82us/step - loss: 0.2229 - val_loss: 0.2346\n\nEpoch 00011: val_loss did not improve from 0.23441\nEpoch 12/100\n886/886 [==============================] - 0s 84us/step - loss: 0.2237 - val_loss: 0.2347\n\nEpoch 00012: val_loss did not improve from 0.23441\nEpoch 13/100\n886/886 [==============================] - 0s 85us/step - loss: 0.2211 - val_loss: 0.2338\n\nEpoch 00013: val_loss improved from 0.23441 to 0.23384, saving model to one_hot_Fare.hdf5\nEpoch 14/100\n886/886 [==============================] - 0s 86us/step - loss: 0.2196 - val_loss: 0.2334\n\nEpoch 00014: val_loss improved from 0.23384 to 0.23336, saving model to one_hot_Fare.hdf5\nEpoch 15/100\n886/886 [==============================] - 0s 86us/step - loss: 0.2217 - val_loss: 0.2330\n\nEpoch 00015: val_loss improved from 0.23336 to 0.23297, saving model to one_hot_Fare.hdf5\nEpoch 16/100\n886/886 [==============================] - 0s 88us/step - loss: 0.2207 - val_loss: 0.2336\n\nEpoch 00016: val_loss did not improve from 0.23297\nEpoch 17/100\n886/886 [==============================] - 0s 86us/step - loss: 0.2200 - val_loss: 0.2332\n\nEpoch 00017: val_loss did not improve from 0.23297\nEpoch 18/100\n886/886 [==============================] - 0s 85us/step - loss: 0.2212 - val_loss: 0.2329\n\nEpoch 00018: ReduceLROnPlateau reducing learning rate to 0.00020000000949949026.\n\nEpoch 00018: val_loss improved from 0.23297 to 0.23293, saving model to one_hot_Fare.hdf5\nEpoch 
19/100\n886/886 [==============================] - 0s 80us/step - loss: 0.2205 - val_loss: 0.2328\n\nEpoch 00019: val_loss improved from 0.23293 to 0.23278, saving model to one_hot_Fare.hdf5\nEpoch 20/100\n886/886 [==============================] - 0s 88us/step - loss: 0.2195 - val_loss: 0.2329\n\nEpoch 00020: val_loss did not improve from 0.23278\nEpoch 21/100\n886/886 [==============================] - 0s 82us/step - loss: 0.2199 - val_loss: 0.2329\n\nEpoch 00021: val_loss did not improve from 0.23278\nEpoch 22/100\n886/886 [==============================] - 0s 88us/step - loss: 0.2206 - val_loss: 0.2329\n\nEpoch 00022: ReduceLROnPlateau reducing learning rate to 4.0000001899898055e-05.\n\nEpoch 00022: val_loss did not improve from 0.23278\nEpoch 23/100\n886/886 [==============================] - 0s 81us/step - loss: 0.2209 - val_loss: 0.2329\n\nEpoch 00023: val_loss did not improve from 0.23278\nEpoch 24/100\n886/886 [==============================] - 0s 90us/step - loss: 0.2200 - val_loss: 0.2329\n\nEpoch 00024: val_loss did not improve from 0.23278\nEpoch 25/100\n886/886 [==============================] - 0s 83us/step - loss: 0.2195 - val_loss: 0.2329\n\nEpoch 00025: ReduceLROnPlateau reducing learning rate to 8.000000525498762e-06.\n\nEpoch 00025: val_loss did not improve from 0.23278\nEpoch 26/100\n886/886 [==============================] - 0s 88us/step - loss: 0.2207 - val_loss: 0.2329\n\nEpoch 00026: val_loss did not improve from 0.23278\nEpoch 27/100\n886/886 [==============================] - 0s 88us/step - loss: 0.2198 - val_loss: 0.2329\n\nEpoch 00027: val_loss did not improve from 0.23278\nEpoch 28/100\n886/886 [==============================] - 0s 84us/step - loss: 0.2198 - val_loss: 0.2329\n\nEpoch 00028: ReduceLROnPlateau reducing learning rate to 1.6000001778593287e-06.\n\nEpoch 00028: val_loss did not improve from 0.23278\nEpoch 29/100\n886/886 [==============================] - 0s 77us/step - loss: 0.2200 - val_loss: 0.2329\n\nEpoch 00029: val_loss did not improve from 0.23278\nEpoch 00029: early stopping\n"
}
]
},
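{
"metadata": {},
"cell_type": "markdown",
"source": "Added check: the single missing Fare (PassengerId 1044) is now filled with the model's prediction, on the same 0-1 scale as the rest of the Fare column."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Added check: the imputed (scaled) Fare for PassengerId 1044.\ndf.loc[[1044], 'Fare']",
"execution_count": null,
"outputs": []
},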
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "fill_data('Age') # id: 6,18,20,27,29,30",
"execution_count": 405,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": "Train on 886 samples, validate on 157 samples\nEpoch 1/100\n886/886 [==============================] - 1s 1ms/step - loss: 0.6988 - val_loss: 0.6486\n\nEpoch 00001: val_loss improved from inf to 0.64860, saving model to one_hot_Age.hdf5\nEpoch 2/100\n886/886 [==============================] - 0s 85us/step - loss: 0.6557 - val_loss: 0.6375\n\nEpoch 00002: val_loss improved from 0.64860 to 0.63746, saving model to one_hot_Age.hdf5\nEpoch 3/100\n886/886 [==============================] - 0s 82us/step - loss: 0.6431 - val_loss: 0.6320\n\nEpoch 00003: val_loss improved from 0.63746 to 0.63196, saving model to one_hot_Age.hdf5\nEpoch 4/100\n886/886 [==============================] - 0s 84us/step - loss: 0.6398 - val_loss: 0.6298\n\nEpoch 00004: val_loss improved from 0.63196 to 0.62981, saving model to one_hot_Age.hdf5\nEpoch 5/100\n886/886 [==============================] - 0s 87us/step - loss: 0.6365 - val_loss: 0.6302\n\nEpoch 00005: val_loss did not improve from 0.62981\nEpoch 6/100\n886/886 [==============================] - 0s 82us/step - loss: 0.6338 - val_loss: 0.6290\n\nEpoch 00006: val_loss improved from 0.62981 to 0.62900, saving model to one_hot_Age.hdf5\nEpoch 7/100\n886/886 [==============================] - 0s 78us/step - loss: 0.6327 - val_loss: 0.6290\n\nEpoch 00007: val_loss did not improve from 0.62900\nEpoch 8/100\n886/886 [==============================] - 0s 79us/step - loss: 0.6329 - val_loss: 0.6290\n\nEpoch 00008: val_loss improved from 0.62900 to 0.62897, saving model to one_hot_Age.hdf5\nEpoch 9/100\n886/886 [==============================] - 0s 79us/step - loss: 0.6311 - val_loss: 0.6288\n\nEpoch 00009: val_loss improved from 0.62897 to 0.62876, saving model to one_hot_Age.hdf5\nEpoch 10/100\n886/886 [==============================] - 0s 82us/step - loss: 0.6314 - val_loss: 0.6292\n\nEpoch 00010: val_loss did not improve from 0.62876\nEpoch 11/100\n886/886 [==============================] - 0s 79us/step - loss: 0.6277 - val_loss: 0.6291\n\nEpoch 00011: val_loss did not improve from 0.62876\nEpoch 12/100\n886/886 [==============================] - 0s 80us/step - loss: 0.6295 - val_loss: 0.6306\n\nEpoch 00012: ReduceLROnPlateau reducing learning rate to 0.00020000000949949026.\n\nEpoch 00012: val_loss did not improve from 0.62876\nEpoch 13/100\n886/886 [==============================] - 0s 75us/step - loss: 0.6283 - val_loss: 0.6300\n\nEpoch 00013: val_loss did not improve from 0.62876\nEpoch 14/100\n886/886 [==============================] - 0s 78us/step - loss: 0.6293 - val_loss: 0.6299\n\nEpoch 00014: val_loss did not improve from 0.62876\nEpoch 15/100\n886/886 [==============================] - 0s 78us/step - loss: 0.6278 - val_loss: 0.6297\n\nEpoch 00015: ReduceLROnPlateau reducing learning rate to 4.0000001899898055e-05.\n\nEpoch 00015: val_loss did not improve from 0.62876\nEpoch 16/100\n886/886 [==============================] - 0s 85us/step - loss: 0.6270 - val_loss: 0.6297\n\nEpoch 00016: val_loss did not improve from 0.62876\nEpoch 17/100\n886/886 [==============================] - 0s 80us/step - loss: 0.6284 - val_loss: 0.6297\n\nEpoch 00017: val_loss did not improve from 0.62876\nEpoch 18/100\n886/886 [==============================] - 0s 82us/step - loss: 0.6269 - val_loss: 0.6297\n\nEpoch 00018: ReduceLROnPlateau reducing learning rate to 8.000000525498762e-06.\n\nEpoch 00018: val_loss did not improve from 0.62876\nEpoch 19/100\n886/886 [==============================] - 0s 84us/step - loss: 0.6271 - val_loss: 0.6297\n\nEpoch 00019: val_loss did 
not improve from 0.62876\nEpoch 00019: early stopping\n"
}
]
},
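{
"metadata": {},
"cell_type": "markdown",
"source": "Added check: the previously missing Age values (e.g. PassengerId 6, 18, 20, 27, 29 shown earlier) now hold predicted ages on the notebook's scaled 0-1 range."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Added check: a few of the imputed (scaled) Age values,\n# using the PassengerIds listed in the comment above.\ndf.loc[[6, 18, 20, 27, 29, 30], 'Age']",
"execution_count": null,
"outputs": []
},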
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Drop Cabin\n# cabin_columns = df.columns[df.columns.str.contains('Cabin')]\n# df = df.drop(cabin_columns, axis=1)\n# df.columns",
"execution_count": 406,
"outputs": []
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Drop Title\n# title_columns = df.columns[df.columns.str.contains('Title')]\n# df = df.drop(title_columns, axis=1)\n# df.columns",
"execution_count": 407,
"outputs": []
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df_train = df[0:891].copy()\ndf_test = df[891:].copy()",
"execution_count": 408,
"outputs": []
},
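{
"metadata": {},
"cell_type": "markdown",
"source": "Added shape check: the first 891 rows of `df` correspond to the original training set and the remainder to the test set, so the split should line up with the `train` and `test` frames loaded at the top of the notebook."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Added check: the split should match the original train/test row counts.\n[df_train.shape, df_test.shape, train.shape, test.shape]",
"execution_count": null,
"outputs": []
},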
{
"metadata": {},
"cell_type": "markdown",
"source": "## Model to estimate Survived for submission"
},
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "df_cols = len(df_train.columns)\n\nmodel = Sequential()\nmodel.add(Dense(64*2, activation='relu', input_shape=(df_cols,), kernel_initializer=initializer))\nmodel.add(Dropout(0.5, seed=random_n))\n\nmodel.add(Dense(2, activation='softmax'))\nmodel.compile(optimizer='rmsprop', loss='sparse_categorical_crossentropy', metrics=['acc'])\n\nreduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2, patience=3, min_lr=0.000001,verbose=1)\ncheckpointer = ModelCheckpoint(filepath='one_hot_weights.hdf5', verbose=1, save_best_only=True)\nearly_stopping = EarlyStopping(patience=10, verbose=1)\n\nepochs=100\nhist = model.fit(df_train, train.Survived, \n epochs=epochs, \n batch_size=32,\n verbose=1,\n validation_split=0.15,\n callbacks=[reduce_lr, early_stopping, checkpointer])\n\nmodel.load_weights('one_hot_weights.hdf5')\npred = model.predict(df_test)",
"execution_count": 409,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": "Train on 757 samples, validate on 134 samples\nEpoch 1/100\n757/757 [==============================] - 1s 1ms/step - loss: 0.6053 - acc: 0.6777 - val_loss: 0.4851 - val_acc: 0.8134\n\nEpoch 00001: val_loss improved from inf to 0.48510, saving model to one_hot_weights.hdf5\nEpoch 2/100\n757/757 [==============================] - 0s 72us/step - loss: 0.5064 - acc: 0.7688 - val_loss: 0.4369 - val_acc: 0.8209\n\nEpoch 00002: val_loss improved from 0.48510 to 0.43689, saving model to one_hot_weights.hdf5\nEpoch 3/100\n757/757 [==============================] - 0s 85us/step - loss: 0.4799 - acc: 0.7913 - val_loss: 0.4156 - val_acc: 0.8209\n\nEpoch 00003: val_loss improved from 0.43689 to 0.41557, saving model to one_hot_weights.hdf5\nEpoch 4/100\n757/757 [==============================] - 0s 83us/step - loss: 0.4581 - acc: 0.7926 - val_loss: 0.4054 - val_acc: 0.8209\n\nEpoch 00004: val_loss improved from 0.41557 to 0.40544, saving model to one_hot_weights.hdf5\nEpoch 5/100\n757/757 [==============================] - 0s 84us/step - loss: 0.4520 - acc: 0.7847 - val_loss: 0.4035 - val_acc: 0.8209\n\nEpoch 00005: val_loss improved from 0.40544 to 0.40347, saving model to one_hot_weights.hdf5\nEpoch 6/100\n757/757 [==============================] - 0s 83us/step - loss: 0.4268 - acc: 0.7966 - val_loss: 0.3965 - val_acc: 0.8284\n\nEpoch 00006: val_loss improved from 0.40347 to 0.39645, saving model to one_hot_weights.hdf5\nEpoch 7/100\n757/757 [==============================] - 0s 74us/step - loss: 0.4415 - acc: 0.8045 - val_loss: 0.3943 - val_acc: 0.8284\n\nEpoch 00007: val_loss improved from 0.39645 to 0.39430, saving model to one_hot_weights.hdf5\nEpoch 8/100\n757/757 [==============================] - 0s 76us/step - loss: 0.4311 - acc: 0.8124 - val_loss: 0.3896 - val_acc: 0.8433\n\nEpoch 00008: val_loss improved from 0.39430 to 0.38964, saving model to one_hot_weights.hdf5\nEpoch 9/100\n757/757 [==============================] - 0s 93us/step - loss: 0.4303 - acc: 0.8071 - val_loss: 0.3896 - val_acc: 0.8433\n\nEpoch 00009: val_loss improved from 0.38964 to 0.38963, saving model to one_hot_weights.hdf5\nEpoch 10/100\n757/757 [==============================] - 0s 90us/step - loss: 0.4305 - acc: 0.8085 - val_loss: 0.3847 - val_acc: 0.8358\n\nEpoch 00010: val_loss improved from 0.38963 to 0.38473, saving model to one_hot_weights.hdf5\nEpoch 11/100\n757/757 [==============================] - 0s 92us/step - loss: 0.4362 - acc: 0.8085 - val_loss: 0.3824 - val_acc: 0.8358\n\nEpoch 00011: val_loss improved from 0.38473 to 0.38244, saving model to one_hot_weights.hdf5\nEpoch 12/100\n757/757 [==============================] - 0s 87us/step - loss: 0.4078 - acc: 0.8322 - val_loss: 0.3867 - val_acc: 0.8284\n\nEpoch 00012: val_loss did not improve from 0.38244\nEpoch 13/100\n757/757 [==============================] - 0s 87us/step - loss: 0.4159 - acc: 0.8230 - val_loss: 0.3876 - val_acc: 0.8284\n\nEpoch 00013: val_loss did not improve from 0.38244\nEpoch 14/100\n757/757 [==============================] - 0s 92us/step - loss: 0.4215 - acc: 0.8151 - val_loss: 0.3780 - val_acc: 0.8657\n\nEpoch 00014: val_loss improved from 0.38244 to 0.37801, saving model to one_hot_weights.hdf5\nEpoch 15/100\n757/757 [==============================] - 0s 81us/step - loss: 0.4088 - acc: 0.8256 - val_loss: 0.3764 - val_acc: 0.8657\n\nEpoch 00015: val_loss improved from 0.37801 to 0.37642, saving model to one_hot_weights.hdf5\nEpoch 16/100\n757/757 [==============================] - 0s 88us/step - loss: 0.4151 - acc: 0.8269 - 
val_loss: 0.3832 - val_acc: 0.8433\n\nEpoch 00016: val_loss did not improve from 0.37642\nEpoch 17/100\n757/757 [==============================] - 0s 90us/step - loss: 0.3984 - acc: 0.8349 - val_loss: 0.3739 - val_acc: 0.8433\n\nEpoch 00017: val_loss improved from 0.37642 to 0.37389, saving model to one_hot_weights.hdf5\nEpoch 18/100\n757/757 [==============================] - 0s 85us/step - loss: 0.4194 - acc: 0.8217 - val_loss: 0.3759 - val_acc: 0.8731\n\nEpoch 00018: val_loss did not improve from 0.37389\nEpoch 19/100\n757/757 [==============================] - 0s 84us/step - loss: 0.4019 - acc: 0.8283 - val_loss: 0.3750 - val_acc: 0.8657\n\nEpoch 00019: val_loss did not improve from 0.37389\nEpoch 20/100\n757/757 [==============================] - 0s 83us/step - loss: 0.4010 - acc: 0.8468 - val_loss: 0.3751 - val_acc: 0.8507\n\nEpoch 00020: ReduceLROnPlateau reducing learning rate to 0.00020000000949949026.\n\nEpoch 00020: val_loss did not improve from 0.37389\nEpoch 21/100\n757/757 [==============================] - 0s 82us/step - loss: 0.3961 - acc: 0.8388 - val_loss: 0.3748 - val_acc: 0.8582\n\nEpoch 00021: val_loss did not improve from 0.37389\nEpoch 22/100\n757/757 [==============================] - 0s 87us/step - loss: 0.3999 - acc: 0.8296 - val_loss: 0.3752 - val_acc: 0.8582\n\nEpoch 00022: val_loss did not improve from 0.37389\nEpoch 23/100\n757/757 [==============================] - 0s 87us/step - loss: 0.4089 - acc: 0.8402 - val_loss: 0.3752 - val_acc: 0.8582\n\nEpoch 00023: ReduceLROnPlateau reducing learning rate to 4.0000001899898055e-05.\n\nEpoch 00023: val_loss did not improve from 0.37389\nEpoch 24/100\n757/757 [==============================] - 0s 93us/step - loss: 0.3984 - acc: 0.8283 - val_loss: 0.3750 - val_acc: 0.8507\n\nEpoch 00024: val_loss did not improve from 0.37389\nEpoch 25/100\n757/757 [==============================] - 0s 86us/step - loss: 0.4027 - acc: 0.8296 - val_loss: 0.3748 - val_acc: 0.8507\n\nEpoch 00025: val_loss did not improve from 0.37389\nEpoch 26/100\n757/757 [==============================] - 0s 86us/step - loss: 0.4051 - acc: 0.8388 - val_loss: 0.3747 - val_acc: 0.8507\n\nEpoch 00026: ReduceLROnPlateau reducing learning rate to 8.000000525498762e-06.\n\nEpoch 00026: val_loss did not improve from 0.37389\nEpoch 27/100\n757/757 [==============================] - 0s 74us/step - loss: 0.4155 - acc: 0.8283 - val_loss: 0.3747 - val_acc: 0.8507\n\nEpoch 00027: val_loss did not improve from 0.37389\nEpoch 00027: early stopping\n"
}
]
},
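{
"metadata": {},
"cell_type": "markdown",
"source": "`pred` holds one softmax probability pair per test passenger; taking the argmax turns it into 0/1 Survived predictions (the values a submission file would use). A small added peek:"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "import numpy as np  # already imported earlier; repeated only so this cell stands alone\n\n# Added check: convert softmax probabilities into 0/1 Survived predictions\n# and count how many of each class were predicted.\npred_classes = pred.argmax(axis=1)\n[pred.shape, np.bincount(pred_classes)]",
"execution_count": null,
"outputs": []
},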
{
"metadata": {
"scrolled": false,
"trusted": true
},
"cell_type": "code",
"source": "plt.plot(hist.history['loss'], 'b-', label='acc' )\nplt.plot(hist.history['val_loss'], 'g-', label='val_loss' )\nplt.xlabel('epochs')\nplt.legend()\nplt.show()\n\nplt.plot(hist.history['acc'], 'r-', label='loss' )\nplt.plot(hist.history['val_acc'], 'm-', label='val_acc' )\nplt.xlabel('epochs')\nplt.legend()\nplt.show()",
"execution_count": 424,
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAEKCAYAAAD+XoUoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xl4VOX5//H3nZ0lYgJB2Zck7JsaERdwR7AKFhdQUaRVK2oVbS0uWCmotVop5SsuaK3WFZefQlsWUVncUIJFIlvCngBiCGEn+/374xkghIRMkplMMnO/rmuuZM6cOec+TPicM895znNEVTHGGBMawgJdgDHGmNpjoW+MMSHEQt8YY0KIhb4xxoQQC31jjAkhFvrGGBNCLPSNMSaEWOgbY0wIsdA3xpgQEhHoAspq1qyZtm/fPtBlGGNMvbJs2bKdqppQ2Xx1LvTbt29PampqoMswxph6RUQ2ezOfNe8YY0wIsdA3xpgQYqFvjDEhpM616RtjQlNhYSFZWVnk5eUFupQ6LSYmhtatWxMZGVmt91voG2PqhKysLGJjY2nfvj0iEuhy6iRVJScnh6ysLDp06FCtZVjzjjGmTsjLy6Np06YW+CcgIjRt2rRG34Ys9I0xdYYFfuVq+m8UNKG/ezdMnAhLlwa6EmOMqbuCpk1fBB57DGJi4MwzA12NMcbUTV4d6YvIIBFZKyLrROTBCua5TkRWichKEXm71PRRIpLheYzyVeFlNWkCzZrB+vX+WoMxxtR/lYa+iIQD04DBQDfgehHpVmaeZOAh4FxV7Q6M9UyPBx4DzgL6Ao+JSJxPt6CUpCRYt85fSzfGhIKrrrqKM844g+7duzN9+nQA5s6dy+mnn07v3r25+OKLAdi/fz+jR4+mZ8+e9OrViw8//DCQZXvNm+advsA6Vd0AICLvAkOBVaXmuQ2Ypqq5AKr6s2f6ZcB8Vd3lee98YBDwjm/KP1ZiInz5pT+WbIypTWPHwvLlvl1mnz4wZUrl87366qvEx8dz6NAhzjzzTIYOHcptt93G4sWL6dChA7t27QJg0qRJNGnShLS0NAByc3N9W7CfeNO80wrILPU8yzOttE5AJxH5SkSWiMigKrzXZ5KSYMsWyM/31xqMMcFu6tSp9O7dm379+pGZmcn06dMZMGDAkX7x8fHxAHz66afcddddR94XF+e3Rgyf8uZIv7z+QVrOcpKBC4DWwBci0sPL9yIitwO3A7Rt29aLksqXmAiqsGkTdO5c7cUYYwLMmyNyf1i4cCGffvop33zzDQ0bNuSCCy6gd+/erF279rh5VbVedjH15kg/C2hT6nlrYFs588xU1UJV3Qisxe0EvHkvqjpdVVNUNSUhodLhoCuUlOR+Wru+MaY69uzZQ1xcHA0bNmTNmjUsWbKE/Px8Fi1axMaNGwGONO8MHDiQ55577sh7g6l5ZymQLCIdRCQKGAHMKjPPx8CFACLSDNfcswGYBwwUkTjPCdyBnml+kZjofloPHmNMdQwaNIiioiJ69erFo48+Sr9+/UhISGD69OkMGzaM3r17M3z4cADGjx9Pbm4uPXr0oHfv3ixYsCDA1Xun0uYdVS0SkbtxYR0OvKqqK0VkIpCqqrM4Gu6rgGLgAVXNARCRSbgdB8DEwyd1/SEhAWJj7UjfGFM90dHRzJkzp9zXBg8efMzzxo0b8/rrr9dGWT7l1cVZqjobmF1m2h9L/a7A/Z5H2fe+CrxaszK9I+KO9u1I3xhjyhc0wzAcZn31jTGmYkEX+omJsHEjFBcHuhJjjKl7gi70k5KgsBAyMyuf1xhjQk3Qhb714DHGmIoFXegf7qtvoW+MMccLutBv1Qqio+1krjHGlCfoQj8sDDp0sCN9Y4x/NW7cuMLXNm3aRI8ePWqxGu8FXeiDdds0xpiKBM2ds0pLTIQFC9zga/VwPCRjQt7YuWNZ/pNvx1buc2ofpgyqeCS3cePG0a5dO+68804AJkyYgIiwePFicnNzKSws5PHHH2fo0KFVWm9eXh5jxowhNTWViIgIJk+ezIUXXsjKlSsZPXo0BQUFlJSU8OGHH9KyZUuuu+46srKyKC4u5tFHHz0y7IOvBGXoJyXBgQOwYwecemqgqzHG1AcjRoxg7NixR0L/vffeY+7cudx3332cdNJJ7Ny5k379+jFkyJAqja45bdo0ANLS0lizZg0DBw4kPT2dF198kXvvvZcbb7yRgoICiouLmT17Ni1btuS///0v4AaA87WgDP3S3TYt9I2pf050RO4vp512Gj///DPbtm0jOzubuLg4WrRowX333cfixYsJCwtj69at7Nixg1OrECxffvklv/3tbwHo0qUL7dq1Iz09nbPPPpsnnniCrKwshg0bRnJyMj179uT3v/8948aN44orrqB///4+386gbdMHa9c3xlTNNddcwwcffMCMGTMYMWIEb731FtnZ2Sxbtozly5dzyimnkJeXV6VluqHJjnfDDTcwa9YsGjRowGWXXcbnn39Op06dWLZsGT179uShhx5i4sSJvtisYwTlkX67dq4Xj/XgMcZUxYgRI7jtttvYuXMnixYt4r333qN58+ZERkayYMECNm/eXOVlDhgwgLfeeouLLrqI9PR0tmzZQufOndmwYQMdO3bknnvuYcOGDaxYsYIuXboQHx/PyJEjady4Ma+99prPtzEoQz8qygW/HekbY6qie/fu7Nu3j1atWtGiRQtuvPFGrrzySlJSUujTpw9dunSp8jLvvPNO7rjjDnr27ElERASvvfYa0dHRzJgxgzfffJPIyEhOPfVU/vjHP7J06VIeeOABwsLCiIyM5IUXXvD5NkpFXz0CJSUlRVNTU2u8nEsvhb174dtvfVCUMcbvVq9eTdeuXQNdRr1Q3r+ViCxT1ZTK3huUbfpgffWNMaY8Qdm8A64Hz65dkJsL9eQm9caYeiYtLY2bbrrpmGnR0dF8W4ebGII29EsPvJZS6RceY0xdoKpV6gMfaD179mT5ct9eRFaZmjbJB23zjg2xbEz9EhMTQ05OTo1DLZipKjk5OcTExFR7GUF7pN+xo/tp7frG1A+tW7cmKyuL7OzsQJdSp8XExNC6detqvz9oQ79RI2jRwo70jakvIiMj6dChQ6DLCHpB27wDronHjvSNMeaooA79pCQ70jfGmNKCOvQTE2HbNjh4MNCVGGNM3RDUoX+42+aGDYGtwxhj6oqgDn3rtmmMMccK6tC3IZaNMeZYQR36cXHuYUf6xhjjBHXogw28ZowxpQV96Ccm2pG+McYcFvShn5QEmzdDYWGgKzHGmMAL+tBPTITiYhf8xhgT6oI+9K0HjzHGHBX0oW999Y0x5iivQl9EBonIWhFZJyIPlvP6LSKSLSLLPY9bS71WXGr6LF8W741TT4WGDe1I3xhjwIuhlUUkHJgGXApkAUtFZJaqrioz6wxVvbucRRxS1T41L7V6RKwHjzHGHObNkX5fYJ2qblDVAuBdYKh/y/It66tvjDGON6HfCsgs9TzLM62sq0VkhYh8ICJtS
k2PEZFUEVkiIlfVpNjqSkx0g66VlARi7cYYU3d4E/rl3aW47E0s/w20V9VewKfA66Vea6uqKcANwBQRSTxuBSK3e3YMqf64VVpiIuTnw9atPl+0McbUK96EfhZQ+si9NbCt9AyqmqOq+Z6nLwNnlHptm+fnBmAhcFrZFajqdFVNUdWUhISEKm2ANw5327R2fWNMqPMm9JcCySLSQUSigBHAMb1wRKRFqadDgNWe6XEiEu35vRlwLlD2BLDfHe62ae36xphQV2nvHVUtEpG7gXlAOPCqqq4UkYlAqqrOAu4RkSFAEbALuMXz9q7ASyJSgtvBPFVOrx+/a9MGIiPtSN8YY0S1bPN8YKWkpGhqaqrPl9upE/TpA++95/NFG2NMwInIMs/50xMK+ityD7Num8YYE0Khf/gCrTr2xcYYY2pVyIR+UhLs3Qs7dwa6EmOMCZyQCX0beM0YY0Io9G2IZWOMCaHQ79DBDb5mR/rGmFAWMqEfHe3669uRvjEmlIVM6IMNsWyMMSEV+tZX3xgT6kIq9BMTITvbdd00xphQFHKhD9bEY4wJXSEV+jbEsjEm1IVU6NsQy8aYUBdSoR8bC82b25G+MSZ0hVTogzvatyN9Y0yoCrnQT0qyI31jTOgKudBPTISsLMjLC3QlxhhT+0Iu9JOS3Jj6GzcGuhJjjKl9IRf61lffGBPKQi70bYhlY0woC7nQb9oUTjrJjvSNMaEp5EJfxAZeM8aErpALfbAhlo0xoSskQz8pyfXeKSoKdCXGGFO7QjL0ExNd4GdmBroSY4ypXSEZ+taDxxgTqkIy9K2vvjEmVIVk6Lds6W6Ubkf6xphQE5KhHxZmPXiMMaEpJEMfbIhlY0xoCtnQPzzEsmqgKzHGmNoT0qF/6BBkZAS6EmOMqT0hG/pDhriTuX/5S6ArMcaY2hOyod+6NdxxB7z+OqSnB7oaY4ypHV6FvogMEpG1IrJORB4s5/VbRCRbRJZ7HreWem2UiGR4HqN8WXxNPfSQO9qfMCHQlRhjTO2oNPRFJByYBgwGugHXi0i3cmadoap9PI9XPO+NBx4DzgL6Ao+JSJzPqq+hU06Be+6Bd9+FtLRAV2OMMf7nzZF+X2Cdqm5Q1QLgXWCol8u/DJivqrtUNReYDwyqXqn+8cADEBsLjz0W6EqMMcb/vAn9VkDpocmyPNPKulpEVojIByLSpirvFZHbRSRVRFKzs7O9LN034uPh/vvho49g2bJaXbUxxtQ6b0JfyplWtnf7v4H2qtoL+BR4vQrvRVWnq2qKqqYkJCR4UZJvjR3rwv/RR2t91cYYU6u8Cf0soE2p562BbaVnUNUcVc33PH0ZOMPb99YFTZrAH/4Ac+bAV18FuhpjjPEfb0J/KZAsIh1EJAoYAcwqPYOItCj1dAiw2vP7PGCgiMR5TuAO9Eyrc+6+253YtaN9Y0wwqzT0VbUIuBsX1quB91R1pYhMFJEhntnuEZGVIvIDcA9wi+e9u4BJuB3HUmCiZ1qd06gRPPwwLFgAn30W6GqMMcY/ROvY4DMpKSmampoakHXn5UFysrtw6+uv3U3UjTGmPhCRZaqaUtl8IXtFbnliYlzzzpIlMHt2oKsxxhjfs9AvY/Ro6NjRhX9JSaCrMcYY37LQLyMy0g3L8L//ub77xhgTTCz0y3HDDdClC/zxj1BcHOhqjDHGdyz0yxEeDhMnwqpVblweY4wJFhb6Fbj6aujd243JU1gY6GqMMcY3gib0M/dkkjI9hY9W+6YhPiwMJk1yt1R8/fXK5zfGmPogaEK/RWwL1u1ax+wM3/W1vOIK6NvXNfXk51c+vzHG1HVBE/oRYRFc0vES5qybg68uOBOBJ56AzEx4+WWfLNIYYwIqaEIfYHDSYLbu28qPP//os2VefDGcf74L/4MHfbZYY4wJiKAK/UFJ7v4sc9bN8dkyRVzb/k8/wfPP+2yxxhgTEEEV+q1OakWvU3oxd91cny63f38YNMgd7f/0k08XbYwxtSqoQh9gUOIgvtzyJfvy9/l0uVOmuOad++7z6WKNMaZWBV3oD04eTGFJIZ9t9O34yJ07wyOPuIu15viu9cgYY2pV0IX+uW3OJTYqljkZvk/mcePc8AxjxsCBAz5fvDHG+F3QhX5keKTPu24eFh0N06fD5s1uUDZjjKlvgi70wXXdzNybyeqdqyufuYr694fbboO//Q2WL/f54o0xxq+CMvSPdN30QxMPwF/+As2aufC3UTiNMfVJUIZ+myZt6J7Q3af99UuLi3O9eVJT4bnn/LIKY4zxi6AMfXBNPF9s+YL9Bfv9svzhw13f/fHj3TANxhhTHwRv6CcPpqC4gAUbF/hl+SLuCt2SErj7bqhj95c3xphyBW3on9vmXBpFNvJbEw9Ahw7wpz/BrFl2a0VjTP0QtKEfHRHNxR0v9kvXzdLGjoU+fdzR/p49fluNMcb4RNCGPrh2/U27N7E2Z63f1hER4fru79gBDz/st9UYY4xPBHXo+7vr5mFnnumO9F94Ab75xq+rMsaYGgnq0G9/cnu6NOvC3PW+HXWzPI8/Dq1awe232z11jTF1V1CHPrgmnkWbFnGw0L93QImNhWnT4Mcf4dln/boqY4yptpAI/fzifL913SxtyBAYNsz16Fm/3u+rM8aYKgv60B/QbgANIxv6tetmaVOnQmQk3HGH9d03xtQ9QR/60RHRXNThIr933TysVSv485/h00/h/vvtTlvGmLol6EMf3N20NuRuYN2udbWyvjvugJEj4e9/h3bt4Fe/grS0Wlm1McacUEiE/uDkwYBvb5h+IuHh8MYbsGYN3Hqru9tWr14wcCDMnWvNPsaYwAmJ0O8Y15FOTTvVWugf1qmT69GTlQVPPgkrV8LgwdCjB7zyCuTl1Wo5xhjjXeiLyCARWSsi60TkwRPMd42IqIikeJ63F5FDIrLc83jRV4VX1eCkwSzctJBDhYdqfd3x8fDQQ7Bxo/sGEBXlxuJv29bdgevnn2u9JGNMiKo09EUkHJgGDAa6AdeLSLdy5osF7gG+LfPSelXt43nc4YOaq2VQ0iDyivJYtHlRoEogKsq19X//PSxYAP36ue6dbdu6i7rsvrvGGH/z5ki/L7BOVTeoagHwLjC0nPkmAU8DdbLR4vx25xMTEeP3IRm8IQIXXOBG51yzBkaPhpdfdnfkMsYYf/Im9FsBpW8TkuWZdoSInAa0UdX/lPP+DiLyPxFZJCL9q19qzTSIbMCF7S+s9Xb9ynTu7Mbsue46dyXv9u2BrsgYE8y8CX0pZ9qR/iciEgb8DfhdOfNtB9qq6mnA/cDbInLScSsQuV1EUkUkNTs727vKq2Fw0mAydmWwflfdu1z2iSegoMA19xhjjL94E/pZQJtSz1sD20o9jwV6AAtFZBPQD5glIimqmq+qOQCqugxYD3QquwJVna6qKaqakpCQUL0t8cKRUTfr2NE+QFISjBnjevWsWRPoaowxwcqb0F8KJItIBxGJAkYAsw6/qKp7VLWZqrZX1fbAEmCIqqaKSILnRDAi0hFIBjb4fCu8lNw0mcS4ROau8/+o
m9Xx6KPQsKHr6WOMMf5QaeirahFwNzAPWA28p6orRWSiiAyp5O0DgBUi8gPwAXCHqu6qadE1MThpMJ9v/Jy8orp3vjkhAf7wB/j4Y/jqq0BXY4wJRlIb49FURUpKiqampvpt+bMzZvOLt3/BvJHzGJg40G/rqa4DByA5Gdq3d8Ev5Z1RMcaYMkRkmaqmVDZfSFyRW9oF7S8gOjy6TnTdLE+jRu5k7jffuCN+Y4zxpZAL/YaRDTm//fl18mTuYaNHQ5cu8OCDvrkL1wcfwOef13w5xpj6L+RCH1y7/tqctWzM3RjoUsoVEQFPPQXp6fCPf9RsWdOnw7XXwsUXuxPERUW+qdEYUz+FbOgDdbYXD7i7cJ13nhubZ//+6i1jxgw3zPPgwW6sn6eecuG/bVvl7zXGBKeQDP1OTTvR4eQOdbqJRwSefhp27IDJk6v+/tmz3Tg/553nmnemT3eDvaWmQp8+MH++72s2xtR9IRn6IsKgpEF8tvGzgIy66a2zz3b33H3mGRf+3lq8GK6+2o3h/+9/u77/4HYCS5e6rqGXXea+RRQX+6V0Y0wdFZKhD3B9j+s5WHiQJ794MtClnNCTT8KhQzBxonfzf/89XHml6/I5dy40aXLs6926wXffwU03uV5Cl11WtR1KbVm9Gu65B5o2hZdeCnQ1xgSPkA39/u36c3Pvm3nqq6dYsWNFoMupUOfObtjl6dMhI+PE865Z40I8Ls4131Q0okWjRvDaa+4k8VdfueaehQt9XXnVFRXBhx+68w7durmwj45230jshjPG+EbIhj7A5IGTiYuJ49ZZt1JcUnfbOR57zIXfww9XPM+mTXDJJe5WjfPnQ+vWJ16miLt377ffwkknuaB98kkoKfFp6V7Zvh0mTXLfTq65Btatc7VkZsJbb7mby7/6au3XZUxQUtU69TjjjDO0Nr2T9o4yAZ389eRaXW9VPfaYKqh+883xr23frpqUpHryyao//FD1Ze/dqzpihFv+oEGq2dk1LrdSJSWqixerDh+uGhHh1j1woOrMmapFRcfOd845qm3bqubn+78uY+orIFW9yNiQG4ahLFXlyneuZMGmBfw45kc6xHWotXVXxb59bniGTp1g0aKjwzPk5robsqxbB59+6k7+Voeqa065915o1swts0kT7x6NGrlvCMXFlT+KiuCzz+D55yEtDU4+2V2MdscdbtvKM3s2/OIXrjnqV7+q3vYZE+y8HYYh5EMfIHNPJt2e78bZrc9m3sh5SB0d8OaFF+DOO90dt6680o3Tc+mlsGwZ/Oc/7vea+v57uO8+dzP3PXvcwx8XdJ12Gtx1F1x//dHeRRVRhTPOcDu+NWtcE5Yx5lgW+lU07btp3D3nbl4b+hqj+oyq9fV7o7AQevRwoZeaCldd5Y6a33/fde30B1XXe+jwDqC8x4EDrqayj7Cw8qd37gx9+1ZtMLkPP3Tt/W+/7XYUxphjWehXUYmWMOCfA1iVvYrVd63mlMan1HoN3jgcfsnJrjfPP/8Jt9wS6Kr8r6Tk6A7vhx/cDsUYc5SNsllFYRLGy1e+zIHCA9w7995Al1OhYcOgXz8X+FOmhEbggwv5Rx6BH390zVvGmOqx0C+la0JXxvcfz4yVM/j32n8HupxyicB777krbe+tu/smvxg+HDp2dPcTrmNfUI2pNyz0yxh33jh6NO/BmP+OYW/+3kCXU642beCKKwJdRe2LiHAjhaamwiefBLoaY+onC/0yosKjeOXKV9i2bxsPfvpgoMsxZdx8s7vw7IknAl2JMfWThX45zmp9FveedS8vpL7Al1u+DHQ5ppSoKHcf4S++cAPL1ZbcXHjxRdc91pj6zEK/ApMumkS7Ju24ddatdfIm6qHs1luheXN4/HH/rkfVDVMxejS0bAljxkD//jBvnn/Xa4w/WehXoHFUY6ZfOZ21OWt5fLGf08VUSYMG8LvfuTGGvvvO98vfv98NcHfGGa6n1Pvvw6hR7paTnTu7C+Pee8/36/WXggLX8yktLdCVmDrBm7EaavNR22PvVObmj27WiIkR+sNP1RjUxvjN3r2qcXGqQ4b4bpkrVqjeeadqbKwbC6hnT9Xnn1fds+foPLt3q553nqqI6vTpvlu3P02Y4LanbdvaGVfJBAZejr1jR/qVqC8jcYaa2FjXZXXWLFhRg5Gx8/LcSJ7nneduOvOPf7grnb/6yl0ENmaMG4X0sCZNXPPOoEFuyOunn675tvhTWpprBjv/fHffhBEj7D7Joc5CvxJNGzZl6uCpLN22lGe/eTbQ5ZhSfvtbF/5PVuM+OHl5rgdQ69bujmI7dsBf/+rGHPrXv+CccyoeJqJhQ/j4Yxeg48bBgw/65roBXw9rXVTkBqiLi3O3zHz+eTdsx/jxvl2P8Y20NPf5+J03Xwdq81HXmndUVUtKSnToO0OVCei9c+7VvMK8QJdkPMaNc00ta9Z4/57//Ec1MdE1eVxxher8+arFxVVfd1GR6h13uOXcfvuxQ0J7q7hY9aOPVPv1U23eXDU9verLqMhf/uJqe++9o9Nuv91N++AD363H1Nyrr6rGxKh27169v0VV75t3Ah7yZR91MfRVVfMK8/Se2fcoE9DTXzpdM3IyAl2SUdUdO1QbNFC95ZbK512/3oU8qHburPrJJzVff0mJ6sMPu2Ved533Y/7n5am+8oqrA1Q7dFCNj1ft2tWdN6ipNWtUo6NVf/lLV2Pp9fbtq9q4seqqVTVfj6mZ/ftVR41yfwMXXaT600/VX5aFvp98vPpjjXsqTmOfjNW3V7wd6HKMqt5zj7sRy8aN5b9+8KDqH//oQrBRI9Wnn/b9DVmeeUaP3ITmwIGK59uzx62/ZUs3/2mnqb77rmphoeqCBW47Lr+8et8aDisuVj33XHeie9u241/PzFRNSFDt0uXYk9Smdq1a5Y7sRdxNkmrymata6PvV5t2b9Zx/nKNMQH8989d6oOAE/8uN32VmqkZGqo4Zc+z0khLXdNK+vftLv/561aws/9XxyiuqYWEucHNzj31t+3bVBx9UbdLE1XLxxe6bRumjcFXVF15wr//hD9WvY+pUt4zXX694ns8/Vw0PVx027PgajP+9+aY7AElI8M03TlULfb8rKCrQhz99WGWCaLdp3TRtR1qgSwppt92mGhWlunWre56e7o66wR1NLVhQO3W8/77bAfXu7b6qp6e7dvToaLdDuPZa1dTUEy9jzBhX9xtvVH39GzaoNmyoOnhw5WH+7LNuPU89VfX1mOo5dOjoeZXzzvPtQYiFfi35ZN0n2vyZ5hrzeIxOT52uJXbYFBDr17sj1zFjXBt7VJTqSSep/u1vqgUFtVvLJ5+44G3WzH11j452J3wzvDwNVFCgesEF7n1Llni/3pIS1y4cG6u6ZYt38w8f7nZG8+d7v566qrhY9aWX3N/Arl2BruZ4GRmqffq41B03zjXp+ZKFfi3avm+7XvKvS5QJ6PD3h+uePGsoDYSRI91fNKjefLNrUgmUr7927fUPP1y9k3PZ2e7kbosW3h8NTp/utv3FF71fz7597ptQ06aqmzZVvc6
...[base64-encoded PNG plot data truncated]...",
"text/plain": "<matplotlib.figure.Figure at 0x7f118a0f5978>"
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYQAAAEKCAYAAAASByJ7AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xd4leX5wPHvnUXYBJJAJIwwFBkKEpwVV52t4hZqXUWpdVWlVq2LKmodLY4idQFupDigFotWIsivoAQIMjQyEiAQYhL2CFn374/nRA4hyTlJzslJcu7PdZ0r57zjeZ83gfc+zxZVxRhjjIkIdQaMMcY0DhYQjDHGABYQjDHGeFhAMMYYA1hAMMYY42EBwRhjDGABwRhjjIcFBGOMMYAFBGOMMR5Roc5AbcTHx2vPnj1DnQ1jjGlSlixZUqCqCb6Oa1IBoWfPnqSnp4c6G8YY06SIyAZ/jrMqI2OMMYAFBGOMMR4WEIwxxgB+tiGIyHnA80Ak8Jqq/qXS/u7AG0AHzzH3qepsEbkauMfr0GOA41Q1Q0S+BJKA/Z5956jqj7W9gZKSEnJycigqKqrtqWEhNjaW5ORkoqOjQ50VY0wj5zMgiEgkMBE4G8gBFovILFVd7XXYg8B0VZ0kIv2B2UBPVX0HeMeTziBgpqpmeJ13tarWq5U4JyeHtm3b0rNnT0SkPkk1O6pKYWEhOTk5pKSkhDo7xphGzp8qo+OBtaq6XlWLgWnAiErHKNDO8749sKWKdEYB79U1o9UpKiqiU6dOFgyqICJ06tTJSk/GGL/4ExC6Apu8Pud4tnkbB/xaRHJwpYPbq0jnKg4PCFNEJENEHpJqnugiMkZE0kUkPT8/v8oMWjConv1ujDH+8icgVPVEqbzu5ihgqqomAxcAb4nIT2mLyAnAPlVd6XXO1ao6CDjV87qmqour6iuqmqqqqQkJPsdVGGNqYfeS3RTMLMCW0jXgX0DIAbp5fU7m8Cqh0cB0AFVdCMQC8V77R1KpdKCqmz0/dwPv4qqmmqQ2bdqEOgvG1JqqsnrUalZevJIVF6ygaINVLYY7fwLCYqCviKSISAzu4T6r0jEbgbMARORoXEDI93yOAK7AtT3g2RYlIvGe99HAL4GVGGMazM6vdrJ/zX4SLk9gx1c7+GbAN+S8mIOWWWkhXPkMCKpaCtwGzAG+w/UmWiUij4rIRZ7DxgI3ichyXEngej1YBh0O5Kjqeq9kWwBzRORbIAPYDLwakDsKIVXlnnvuYeDAgQwaNIj3338fgNzcXIYPH87gwYMZOHAgX331FWVlZVx//fU/HTthwoQQ596Em9zXcolsF0m/N/px/Krj6XBqB9besZZlpy5j7+q9oc6eCQG/xiGo6mxcY7H3toe93q8GTqnm3C+BEytt2wsMrWVefbvzTsjI8H1cbQweDM8959ehH374IRkZGSxfvpyCggKGDRvG8OHDeffddzn33HN54IEHKCsrY9++fWRkZLB582ZWrnQFox07dgQ238bUoHRnKfkz8ul8bWciW0US2SOSQbMHkfdOHmvvXEv6kHR6PNCD7vd1JyLGxq+GC/tLB9CCBQsYNWoUkZGRdO7cmdNOO43FixczbNgwpkyZwrhx41ixYgVt27alV69erF+/nttvv53//Oc/tGvXzvcFjAmQvPfyKN9fTtKNST9tExG6/LoLx68+noTLEsh+JJv049LZ9fWuEObUNKQmNdupT35+kw+W6npqDB8+nPnz5/Pvf/+ba665hnvuuYdrr72W5cuXM2fOHCZOnMj06dOZPHlyA+fYhKutr2+l9TGtaTu07WH7YhJj6P9ufxJ/lcia361h6UlLSf59MinjU4hsHRmC3JqGYiWEABo+fDjvv/8+ZWVl5OfnM3/+fI4//ng2bNhAYmIiN910E6NHj2bp0qUUFBRQXl7OZZddxmOPPcbSpUtDnX0TJvYs38Pu9N0kjU6qcZxK/C/jGbZqGEfcfAQ5z+WweOBitn22rQFzahpa8yohhNgll1zCwoULOfbYYxERnn76abp06cIbb7zBM888Q3R0NG3atOHNN99k8+bN3HDDDZSXlwPw5JNPhjj3pq5yX89lze/XQLn/5ySNSaLvc32Dl6ka5L6ei7QQOv+6s89jo9pFceRLR5I4KpHMGzP59txv6fVUL7r/sXsD5NQ0NGlKA1JSU1O18gI53333HUcffXSIctQ02O8ouJacsISS/BISLvdv4OSejD1s/2I7w1YNo3W/1kHO3aHKispYeMRCOp7bkf7v9a/1ud9f+z35H+RzzJxj6PjzjkHKpQk0EVmiqqm+jrMSgjH1cGDLAXZ/s5uUx1Po8acefp1TnF/M172+JvuRbAa8PyDIOTxUwUcFlG4vpcvoLrU+NzI2kn5T+rF39V6+G/UdQ5cOJbZbbBByaULF2hCMqYfCfxUCED8i3seRB8UkxJB8ZzL50/PZs3xPsLJWpdzXc4ntGUvcmXF1Oj+ydSQDPxhI+YFyVl2xivIDtagnM42eBQRj6qHg4wJie8fSqn+rWp2XPDaZqA5RZD2cFaScHW5/1n52fLGDLr/pgkTUfdLDVke1ot+Ufuz+ejdrx64NYA5NqFlAMKaOSneVsn3uduIvjq/1rLLRHaLpdk83CmcVNlg//61TtoJAl+trX11UWcJlCXT7Qze2TNxC3jt5AcidaQwsIBhTR9v+sw0t1lpVF3nrekdXouOjyXoo+KUELVO2TtlKx3M7BqzeP+XJFNoPb0/mTZnsWdGwVV8mOKxR2Zg6KphZQHR8NO1Pbl+n86PaRNH9/u6sG7uOHfN20OG0DgHO4UHbPtvGgZwD9HmuT8DSjIiKoP/7/Vly3BJWXbaKoYuHEtW+7o+UfT/sY/vc7YdPrh8gETERxF8aT3ScLSdbHQsIxtRBeUk5hf8uJOHSBCSy7vXxR/zuCDb9dRNZD2YxeP7goC1olPt6LtEJ0XS6sFNA023RpQX9p/cn4/QMvr/hewZ8MKDW91BeXM7Gpzey4bENaHFwu8FnPZhF34l9SbjU1lapigWEBtamTRv27LHidVO3Y94OynaW1bm6qEJky0h6PNiDNbesYftn2+l4buD79hf/WEzhrEK63t41KBPVdfhZB3o/05t1d69j07Ob6H6P/4PWdi3eReboTPau2EvCFQmkPJFCVNvgPJb2r9/PmlvWsOqyVcRfGk/fv/elRVKLoFyrqbKAYEwdFM4sJKJlBHFn1637prek0UlsetqVEuLOiQt4KSHvrTy0REkaneT74DpKvjOZXQt3sf6+9bQd1pa402v+vZTtLSPr4SxynsshpksMAz8eWO/g6ktM5xiO++Y4Nj27iew/Z7Nj7g56P9vb9bqypWaBZhYQ1ty5hj0Zgf323WZwmxqnGLj33nvp0aMHt9xyCwDjxo1DRJg/fz7bt2+npKSE8ePHM2LECJ/X2rNnDyNGjKjyvDfffJNnn30WEeGYY47hrbfeIi8vj5tvvpn1691SE5MmTeLkk08OwF2bmqgqBTMLiDsnj
shW9Z/sLSImgh6P9CDzhkwKZhaQcHHgqjNUldzXc2l3Ujta9w/eqGgR4ajXj2Lvir2svmo1qctSaXFE1d++t3+xncybMinKKiJpTBK9n+5dr7aH2oiIjqDH/T1IuCyBzJsyybwxk7x38zjqlaNo2btlg+ShMbNeRvU0cuTInxbCAZg+fTo33HADH330EUuXLiUtLY2xY8f6tWZtbGxsleetWrWKxx9/nLlz57J8+XKef/55AO644w5OO+00li9fztKlSxkwoGFHvYarPcv2cGDTAeIvDtw32s6/7kzLI1uS/VA2Wh64evRdi3ax77t9QS0dVIhqG8WADwdQtreMVVeuorzk0EFrJdtL+P4337P858uRSGHwl4M56uWjGiwYeGt1ZCsGpw3myH8cye703SwetJhNf91EeWmYD7RTVZ8v4DwgE1gL3FfF/u5AGrAM+Ba4wLO9J7AftypaBvAPr3OGAis8ab6AZ16lml5Dhw7VylavXn3YtobWr18/3bx5s2ZkZOjJJ5+sxcXFeuutt+qgQYP02GOP1djYWM3NzVVV1datW1ebTnXnvfDCC/qnP/3psOPj4+O1qKjIZ/4aw++oOVn/0HpNi0jTA/kHApru1ve2ahppuvW9rQFL87vR3+m81vO0ZFdJwNL0JW9anqaRpmvuXPPTth9n/KgLOi/QtMg0XXvvWi3dV9pg+fGlKKdIv73oW00jTdNT03X38t2hzlLAAenqx7PeZ2gWkUhgInA2kAMsFpFZ6lZJq/AgbmnNSSLSH7e6Wk/PvnWqOriKpCcBY4BFnuPPAz71lZ/G6PLLL2fGjBls3bqVkSNH8s4775Cfn8+SJUuIjo6mZ8+eFBX5XsC8uvNU1eo4G5GCmQW0P6U9MfExAU038cpENj6xkexHskm4PIGIqPoV4Et3l/LjtB9JvCoxaA21VUm8KpGdC3eS81wOsb1j2TF3BwUfFdBmSBuOmX0MbY87fA2GUGrRtQUDPx5I/j/zWXP7GpYMXUK3e7vR48EeRMaG1/oP/vwrOR5Yq541kUVkGjAC8A4IClQs+dUe2FJTgiKSBLRT1YWez28CF9NEA8LIkSO56aabKCgoYN68eUyfPp3ExESio6NJS0tjw4YNfqWzc+fOKs8766yzuOSSS7jrrrvo1KkT27Zto2PHjpx11llMmjSJO++8k7KyMvbu3WsrrwXZ/qz97P12L73/2jvgaUuEkPJYCisvXkneW3kk3VC/ap786fmU7z10VbSG0vuZ3uxO383a29cSERtBr7/0Inlscr2DXLCICIlXJhJ3Vhxrx65l4+Mb2fKPLSGpzqrOsZ8dG/R2Dn/utiuwyetzDnBCpWPGAZ+JyO1Aa+DnXvtSRGQZsAt4UFW/8qSZUynNrrXLeuMxYMAAdu/eTdeuXUlKSuLqq6/mwgsvJDU1lcGDB9OvXz+/0qnuvAEDBvDAAw9w2mmnERkZyZAhQ5g6dSrPP/88Y8aM4fXXXycyMpJJkyZx0kknBfNWw17BzAKgdpPZ1UanizrRdlhbsv+cTeerO9erm2ju67m0OroV7U5s+C8JEdERDPjnAHL+lkPSmCRa9a3dXE+hEt0pmqOnHk3nqzu7KTnKQp2jgyJaNkAw9VWnBFwBvOb1+RrgxUrH3A2M9bw/CVd6iABaAJ30YJvBJlxJYhjwX6/zTwX+Vc31xwDpQHr37t0Pqxuz+nHf7HcUOMtOX6bfDPwmqNconFOoaaRpzsScOqexZ9UeTSNNNz67MYA5M00VfrYh+BNycoBuXp+TObxKaDQw3RNgFgKxQLyqHlDVQs/2JcA64EhPmsk+0qwIWK+oaqqqpiYk2OhCEzolhSXsmL+DTiMCO9q3sriz42h/ans2jN9A2f66fUXNfT0XiRY6X+N7VTRjKvgTEBYDfUUkRURigJHArErHbATOAhCRo3EBIV9EEjyN0ohIL6AvsF5Vc4HdInKiuNbSa4GZAbmjJmDFihUMHjz4kNcJJ1SuhTONTeG/C6GcgHY3rYqIkDI+heLcYrZMqrE5rkrlxeXkvZlHp4s6EZMY2IbvWlm1Cs46C/7979DlwdSKzzYEVS0VkduAOUAkMFlVV4nIo7hiyCxgLPCqiNyFa2C+XlVVRIYDj4pIKa427mZVrVil+3fAVKAlrjG5zg3K2sR64QwaNIiMjIwGuZY2oSVSG7uCjwuI6RpD26HB7yXTYXgH4s6JY+OTG0m6KalWvYQK/1VISUFJg4w9qFZZGYweDV9/DXPnwo03wt/+Bm0bVw8jcyi//pWp6mxc11DvbQ97vV8NnFLFeR8AH1STZjowsDaZrUpsbCyFhYV06tSpSQWFhqCqFBYWEhtryxzWV9n+MrbN2UaX6xpumoOUx1JYesJSNr+wmR4P+Lc8J7jqohbJLeh4TgjXPP7HP1wwmDwZMjPhmWfgv/+FqVPhtNNCly9To8bTp6qOkpOTycnJIT8/P9RZaZRiY2NJTk72faAfinKK2L14t9/HS6TQ4fQORLVr8v/M2P7Fdsr3lQe9ushbu+Pb0emiTmx8ZiMtj2rp16yq5fvK2fafbfR4sEe9ZmGtly1b4P774eyz4frrQQQuvBCuuw7OOAPuugsefxzsi0qj0+T/p0ZHR5OSkhLqbDR7BzYfYMlxSyjJL6nVeTFdYzhy0pHEX9hwD9JgKPi4gMh2kXQ4PXhrFlQl5bEU0o9LZ/UVq30f7CHRQpcb6r8qWp39/vdQUgKTJrlgAHDKKZCRAX/8o6s6+vRTePNNSE0NXT7NYZp8QDDBV17sWVB9fznHfH4MMQn+NVQW5xWzbuw6Vl60ksSRifR5vk9oGznrSMuUwn8V0umCTkGZPrombY5pw4nrTqR0R6nf50TFRRHbPUTfvj/5BGbMgCeegN6VBu+1aQMvvQQjRsBvfgMnnggPPQR/+hNE26I1jYE0pUbH1NRUTU9PD3U2ws6a369h8wub6f9+fxKvTKzVueXF5Wz8y0Y2jN9AZNtI+jzXh86/7tyk2nt2/t9Olv1sGUe/dzSdR1o3zmrt3Qv9+7uG46VLIaaG4L99O9x+O7zzjislvPkmHH10cPO3Ywd0aNgSXmMhIktU1WdxrHGOIzeNRt57eWx+YTPJdybXOhiAm9q558M9Sc1IpdVRrfj+2u/59vxvKdrge26nxqJgZgESLXQ6P7jjD5q8Rx6BjRvh5ZdrDgYAcXHw9tvwz39CVhYMGQITJkB5kGYbnTDBXfPee6EJfQluaBYQTLX2rtpL5o2ZtDulHb2e7lWvtFr3b82Qr4bQ54U+7Fywk28GfEPOCzloWeP+z6mqFHxcQIczOjSqeW0anWXL4LnnYMwY117gr8svh5Ur4Zxz4O674ec/hx9/DFy+VF0D9913Q0oKPP20C1yhtGYN/OIXrnTU2PgznLmxvKqa/toER8nOEl105CJd0HmBFm32PcV2bezP3q/Lz1uuaaTpkhOX6J5VewKafiDtWb2n3tNINHulparDhql27qy6bVvd0igvV339ddXYWNXkZNXFi+ufr5IS1d/8RhVUf/tb9/nGG93nxx6rf/p1
MXu2avv2qiIuH3/9a4NcFj+nrgj5Q742LwsIDaO8vFxXXLZC0yLTdPuX24N2jdy3cvWrTl/pl9Ffata4LC07UBaUa9VH9pPZmkaa7t+0P9RZabxefNE9St57r/5pLV2q2qOHaosWqlOm1D2dfftUL7rI5evhh13AUVUtK1O95hq3/emn659ff5WXq44f7wLB4MGq33+vevnlLh9//OPB/AWJvwHBysDmMDkTcij4oIBeT/eiw2nBaYQTEbr8ugsdz+nI2jvXkj0um9wpubTo6v+i5636taLXE72I6Ry8nksFHxfQNrUtscnWZ75Kmze7XkLnngtXXVX/9IYMgfR0l9YNN8CSJa6bam16Ie3YARddBAsWwIsvwm23HdwXEQFTprhusX/8o2vr+P3v65/vmuze7cZjfPgh/OpX8Oqr0KoVTJvm8vb005CfD6+8AlGhfSRbG4I5xI75O1j3x3XEXxpPtz90831CPcUkxtD/3f4M/NdAWg9oTWSrSL9eEbER5L2dxzdHf0Pu1NygTNFxIPcAu7/eHfTJ7AJG1T1sLrwQ8vIa5pp33OEeri+9dHDMQX3Fx8OcOa7e/+9/d/Mh+Xs/ubluJPSiRfDuu4cGgwqRka5X06WXwp13uvESwbJmjete+/HH8Ne/uob0Vq0O5uOll1ybxpQpcNllsH9/8PLiD3+KEY3lZVVGwVW0pUj/r8v/6aK+i7RkR8MtuVhXe77bo0tOWaJppGnG2Rm6b/2+gKa/+R+bNY003b2iCSypuHmz6vnnuyoIUE1NVd0T5LaZmTPdtZ58MnjXeOcd1ZYtVbt2Vf3665qPXbNGNSVFtXVr1TlzfKd94IDqhRe6e3jttcDk19snn7j2gk6dVL/4ouZj//53V5106qmq2wNfTYu1IZjaKCsu06WnLtV5reY1jQegR3lZueZMzNH5bebrvFbzdOPfNmp5aRX1sXv31rqedvn5y3Vhr4VaHuT63Xp77z3VuDj34Pz731VnzVKNiFD95S9dQ2ow7N6t2q2b6sCBqsXFwblGBe92hcmTqz5myRLVxET38PUVOLwVFamee657GL/xRkCyq2VlrtFaRHXIENXsbP/OmzZNNTpaddAg1S1bApMXDwsIplbWjF3jFnh/O3ALvDek/Rv26/ILXM+l9BPSDw1qu3apHnGE6oknum/SfijZVaJfxnypa+5e4/vgUCkoUL3ySvff+MQTVTMzD+6bNMlt/93vgtNgeffdLv3//S/waVclP1/1rLPcNW+5xX27rzB3rmrbtqrdu6t+913t0963T/XMM10QnTatfvnctUv14otdPq++2n0RqY3PPnMlnJ49VX/4oX558WIBwfgt7595mkaaZt6a6fvgRqy8vFy3vrNVF8Qv0C+jv9T1D6/XsqIy1aeecv/UW7ZU7dJFdcECn2nlTXe/k+3zgtPLqt4++cTdS3S06uOPV10SuPded99PPRXYay9Z4h6eN98c2HR9KSlRHTvW3dPPfqaam6v6wQeqMTGq/furbtpU97T37HHVNZGRLs26yMxUPfpol8aECXUPxN98oxofr5qQ4H7XAWABwfhl7/d7dX6b+Zp+Qrp7eDYDB/IP6KqrV2kaafp1v4W6I+4U1bPPVl25UrVPH/cQnTSpxv+wq65epQviF2hZSSP7nezcqTp6tPuvO2iQakZG9ceWlamOHOmOfffdwFy/tFR16FAXjIJQ1+2Xd991wT0hwQWmk05SLSysf7q7drm0oqJctVtNystVf/zRPbA//lj1mWdU27VzD/K5c+ufl++/dyWetm19tz/4wd+AYHMZNTN7Vu6hbLefyy6WQ+aYTEp+LGHo0qHEdmteXSsLPy3kh18t4cCOKLpeGkHi2CGuC+C4cbBoIfzyQvjDHyCmUpdGhRW/WEH8xfH0m9IvJHmv0rx5rvvixo2uy+S4cdDCRzfdAwfcKOBFi+Czz+q/FsELL7humu+/D1deWb+06mP5cjfKuX9/15uodevApLtzpxst/e23rudPx46waZP7nW/adOirqNL0K6mpbmK/Hv6vXVGjzZtdd941a9yo5ssvr3NS/s5l5FdAEJHzgOdxK6a9pqp/qbS/O/AG0MFzzH2qOltEzgb+AsQAxcA9qjrXc86XQBJQ0c/qHFWtccy6BYTqHcg9wJpb11DwUUHtToyAY+YcQ8efh3AxlWApKqI0ZQBZEWPYnHuCW8uvFgbOGtg4pu3evx8eeMBNDdGrl+syefLJ/p+/fbubTiI3F/73v7pNIrdrl5vBdMKEg8tihnqCQtXg5GHbNneP3qsaRkRAUhJ063bw1b37oZ87dw58frZtc92IFy50rzoutetvQPA5CsKzJvJE4GwgB1gsIrPUrZJW4UFguqpOEpH+uNXVegIFwIWqukVEBuKW4ezqdd7V6lZOM3WkquS+nsu6P6yjvKicno/1pN2wdn6f36J7C1ofHaBvV43N1KlEbV1P38+HckTXYRzYeODQ/Qv+z63k1TIWHnwQBh5cwC+iVQTtf9a+4fJaVgZbtx7+TXTjRjdz6IYNcMstbhBTbb8Nx8XB7NmuP/z557sHS5Kfy2uWlblvyg884OYYuu46ePbZ0AcDCF4eOnaEL790S38mJLiH/RFHhGaK7o4d4fPP3d/g+OODfz1fdUrAScAcr8/3A/dXOuZl4F6v4/9XRToCFAItPJ+/BFL9qdeqeFkbwqH2rtmry85Ypmmk6dLTlureH2rZo6E5Ky52XRVPOKHmxr2VK1X79nX1xhMnBn0KAVVV/fxzN13BqFGucbR7d3f9ijEEFa/WrVX79VM97zzV//yn/tdNT1dt1Ur1uONct1Ff5s5VPfZYl5dTTnGNnaZJIoBTV3QFNnl9zgEql1vGAZ+JyO1Aa+DnVaRzGbBMVb2/pk0RkTLcusvjPRk3PpSXlpPzXA7ZD2cj0cKRLx9J0o1JSEQj+NbWWLz9tvtWPXFizd8kBwyAb76Bq6+GW2910ya89FLwlnd88UVXBx8VBcnJrtph+PCqqyE6dAjst+ChQ2H6dDetw1VXwcyZVU+VsG4d3HMPfPSRqw9//3244orGUSowweUrYgBX4NoNKj5fA7xY6Zi7gbF6sISwGojw2j8AWAf09trW1fOzLfAZcG011x8DpAPp3bt3D24YbQJ2Z+zWxUMXaxpp+u1F32pRTmBnIm0WSkpcb6IhQ/z/xl9WpvrQQ+7b8LBh9evCWJXyctUHH3Tpjxjh+r6Hyssvu3zcdNOhv58dO1Tvucd142zd2k3GFsp8moAhUN1O8a/KaBXQzevzeiDR8z4Z+AE4pYZrXA/83VdewrnKqHR/qa770zr9MupLXZC4QPOm5zX+EbSh8vbb7p92XfqTf/ihaps2btTrp58GJj+lpapjxrg8jR4dvNHDtXH//S4/Tzzh8vfyy64bJ6hef73fA/hM0xDIgBDlecCn4HoLLQcGVDrmU+B6z/ujgS24NoMOnuMvqyLNeM/7aGAGcLOvvIRrQNg+f7suOmqRppGm313/nRYXBnmqgKasrMwNDho40L2vi1WrVAcM0J/m0fenvr0
6+/erXnqpS+v++xumjcIfZWWqv/qVy1efPvrTYK/09FDnzASBvwHBZxuCqpaKyG24HkKRwGRVXSUij3ouMgsYC7wqInfhOvddr6rqOa8P8JCIPORJ8hxgLzBHRKI9af4XeNVXXhpCcV4xG8Zv4MCWA74PbgBle8vYPmc7sT1jOeazY+h4djPsHhpIH34I330H773nugrWRf/+ri3hoYfcDJWffw5vvAE/+1nt0tm1Cy6+GNLSXHfNO++sW36CISICJk920y6vXevaFi6/3NoJwpwNTPNQVfLezGPtXWsp21tGqyNbBeU6dRF3bhwpf04hsnVkqLPSuKm6+fT374fVq930wvX11Veuq2V2thvE9uij/jU45+W5Lp4rVsDUqa7RujGq+P9vgaBZC9g4hHCwP3s/P/z2B7Z/tp12p7TjqNeOonW/Zto3vzn75BM3gnXq1MAEA4BTT3Vp/uEPbswpIvWFAAAaFElEQVTC7Nnw1lsu8FRn/Xo3Ojg3F2bNcoGhsbJAYLyE9QI5WqZsem4TiwcsZtf/dtF3Yl+GzB9iwaApUoXHHoOePd2qVIHUti28/LILBtu2uQFC48dDaenhx377rRsVvH07fPFF4w4GxlQStgFhz8o9LD1lKevuWkeHMzowbPUwut7S1fryN1Wffw6LF8P99wdvROn558PKla5P/kMPuQd/ZubB/fPnuzEFUVGuqunEE4OTD2OCJOzaEMoPlLPhiQ1sfHIjUe2j6PNCHxJHJiJWdG66VN2DODvbNZD6mvAtEKZPh9/9Dvbtg6eecgPKRo6ElBS3/GP37sHPgzF+sjaEKuxcuJPMGzPZt3ofnX/dmd4TehMTH7wF2k0DmT/fLaj+wgsNEwzAzfR56qlw000HF2k/4QTXjhHfCCbEM6YOwiIglO4pJeuBLDa/uJkW3VowaPYgOp3fRBZON7499pibafLGGxv2uklJ8K9/uYnH0tPdxHNt2jRsHowJoLAICCt+sYKdX+2k621dSXk8hai2YXHbgacK5eWB68ETCAsXusbbZ5+Fli0b/voi8JvfuJcxTVxYNCr3/HNPhiwYQt8X+lowqKu0NFc/npTkvpEXFoY6R8748dCpE/z2t6HOiTFNXlgEhLjT42h/cgPObd+c7Nvn6sjPPBNiYtyqUA8/7BpNb7/d9bkPlSVLXFfQu++2qhpjAiAsAoKpo2++geOOc421t90Gy5a5B/DKla5R9eWXoW9f937x4obP3/jxboro225r+Gsb0wxZQDCHKy52pYCTT4a9e10f/xdfPLhS14ABriE1K8uN4J0zxw3WOv10t7RieXntr1la6qZ72LrVv9eCBfDxx3DHHdDO/xXijDHVC7txCMaHlSvh2mtdaeDaa+H559238Jrs2gWvveYmcMvJcZPDjR3r5u9p0cI1RufnV71QecW2LVtqH0jatHGL4HS0Cf+MqYm/4xAsIBinrAz+9je3tnD79vDKK26mztooKXGraz3zjJvCITHRTfuQkwMHKs0e26LF4QuWd+lSux5Mxx4LJ51UuzwaE4YsIBj/rV/vZvRcsAAuuQT+8Q/3MK8rVVfN9Oqr7gHvvSxkxcM/Pt4mVjOmgdhIZeObqnto3323e3C/8QZcc039H9QibrbPc84JTD6NMQ3CAkK42rLFjez99FP4+c/dYinduoU6V8aYELJeRuFo2jQYOBC+/NL1Hpozx4KBMca/gCAi54lIpoisFZH7qtjfXUTSRGSZiHwrIhd47bvfc16miJzrb5omCAoL4aqrYNQoOOooyMhwffjrutSkMaZZ8fkkEJFIYCJwPtAfGCUi/Ssd9iAwXVWHACOBlzzn9vd8HgCcB7wkIpF+pmkCafZsVyr46CN44gk3X/+RR4Y6V8aYRsSfr4bHA2tVdb2qFgPTgBGVjlGgYnRQe2CL5/0IYJqqHlDVLGCtJz1/0jSBsHu3m6L5F7+AhISDi8hEWfORMeZQ/gSErsAmr885nm3exgG/FpEcYDZwu49z/UkTABEZIyLpIpKen5/vR3bNT+bNg2OOcQ3G993ngsGxx4Y6V8aYRsqfgFBVH8TKgxdGAVNVNRm4AHhLRCJqONefNN1G1VdUNVVVUxMSEvzIrqGoyI0UPuMM1530q6/gyScbbvEYY0yT5E+9QQ7g3QUlmYNVQhVG49oIUNWFIhILxPs411eapi6WLHFTTqxeDbfc4hZtqZiDyBhjauBPCWEx0FdEUkQkBtdIPKvSMRuBswBE5GggFsj3HDdSRFqISArQF/jGzzRNbX30kVvYfedO15V04kQLBsYYv/ksIahqqYjcBswBIoHJqrpKRB4F0lV1FjAWeFVE7sJV/Vyvbk6MVSIyHVgNlAK3qmoZQFVpBuH+wscnn7gupcOGuRlH4+JCnSNjTBNjcxk1B3PmwEUXuQbk//7XTU5njDEe/s5lZCOSGlpZmXsFyhdfuFlJ+/eHzz6zYGCMqTPrjB5IqlBQ4Hve//h4t9rYiHoOvfjqK1cy6NPHzS5q1UTGmHqwgFBfmza53jzff+/m/S8qOnR/TMzBaZ9PP939nD3bfau//np47rm6fatfuBAuuMBNJf3f/7ogY4wx9WBtCPV1zTUwY4b7tu8953/FvP8JCYdPJ11cDI895qaQSE52y1Geeab/11y82M1QmpjoBp8dcURg78kY06zYAjkNYfVqNz/QPffAU0/V/vyvv3ZjBn74wa0N/OST0KpVzedkZLgBZ3FxLhjYLKXGGB+sUbkhPPKIW9f3j3+s2/knnODWLr7jDnjhBRgyxAWJ6qxc6UoGbdvC3LkWDIwxAWUBoa6WLXNVRXfdBZ061T2dVq3cQvZffOHaH04+2a1rXFx86HHffw9nneWmn5g7F3r2rFf2jTGmMgsIdfXww67a5q67ApPemWe6hemvvRYef9yVHlaudPvWrHH7RVzg6NMnMNc0xhgvFhDqYtEiNzL4nnugQ4fApdu+vWtgnjnTdU8dOtRVS515JpSUuGDQr1/grmeMMV4sINTFQw+53kO33+772Lq46CJXOvjlL+HRR2HvXte1dMCA4FzPGGOwcQi1N2+eezj/7W+uQTlYEhJcG8Xs2dC7t5UMjDFBZwGhNlRd6eCII+Dmm4N/PRG30pkxxjQACwi18fnnbrqIl16Cli1DnRtjjAkoa0Pwl6rrDtqjB4weHercGGNMwFkJwV//+pebMuL11938RMYY08xYCcEf5eWu7aBPHzdOwBhjmiG/AoKInCcimSKyVkTuq2L/BBHJ8Lx+EJEdnu1neG3PEJEiEbnYs2+qiGR57Rsc2FsLoBkz3KCxP/8ZoqxQZYxpnnxObicikcAPwNlADm495FGqurqa428Hhqjqbypt7wisBZJVdZ+ITAU+UdUZ/mY2JJPblZa6CeyiomD5coiMbNjrG2NMPfk7uZ0/X3ePB9aq6npPwtOAEbh1kqsyCnikiu2XA5+q6j4/rtl4vPsuZGbCBx9YMDDGNGv+VBl1BTZ5fc7xbDuMiPQAUoC5VeweCbxXadvjIvKtp8qpRTVpjhGRdBFJz8/P9yO7AVRSAu
PGuVlIL7mkYa9tjDENzJ+AIFVsq66eaSQwQ1UPWTRYRJKAQcAcr833A/2AYUBH4N6qElTVV1Q1VVVTExIS/MhuAE2ZAllZMH784YvcGGNMM+NPQMgBvCfeTwa2VHNsVaUAgCuBj1S1pGKDquaqcwCYgquaajyKityqZiedBOefH+rcGGNM0PkTEBYDfUUkRURicA/9WZUPEpGjgDhgYRVpjKJSoPCUGhARAS4GVtYu60H2yitujWQrHRhjwoTPRmVVLRWR23DVPZHAZFVdJSKPAumqWhEcRgHTtFK3JRHpiSthzKuU9DsikoCrksoAGmByID/t3evWOz7jjNqtdWyMMU2YX53qVXU2MLvStocrfR5XzbnZVNEIraqN90k7cSLk5bmeRcYYEyZspHJl5eXw9NNw3nlwyimhzo0xxjQYCwiVbdkChYUwYkSoc2KMMQ3KAkJl2dnupy1ib4wJMxYQKsvKcj9TUkKbD2OMaWAWECqrKCH06BHSbBhjTEOzgFBZVhYkJUFsbKhzYowxDcoCQmXZ2dZ+YIwJSxYQKsvKsvYDY0xYsoDgrbQUNm2yEoIxJixZQPCWkwNlZVZCMMaEJQsI3mwMgjEmjFlA8GZjEIwxYcwCgrfsbDfVdbduPg81xpjmxgKCt6wsSE6GmJhQ58QYYxqcBQRvNgbBGBPGLCB4szEIxpgw5ldAEJHzRCRTRNaKyH1V7J8gIhme1w8issNrX5nXvlle21NE5GsRWSMi73uW5wyd4mLYvNlKCMaYsOUzIIhIJDAROB/oD4wSkf7ex6jqXao6WFUHAy8CH3rt3l+xT1Uv8tr+FDBBVfsC24HR9byX+tm4EVSthGCMCVv+lBCOB9aq6npVLQamATWtHjMKeK+mBEVEgDOBGZ5NbwAX+5GX4LExCMaYMOdPQOgKbPL6nEMVayQDiEgPIAWY67U5VkTSRWSRiFQ89DsBO1S11FeaDcbGIBhjwlyUH8dIFdu0mmNHAjNUtcxrW3dV3SIivYC5IrIC2OVvmiIyBhgD0L17dz+yW0fZ2RAVBV1DG5eMMSZU/Ckh5ADeI7WSgS3VHDuSStVFqrrF83M98CUwBCgAOohIRUCqNk1VfUVVU1U1NSEhwY/s1lFWlhuQFuVPjDTGmObHn4CwGOjr6RUUg3voz6p8kIgcBcQBC722xYlIC8/7eOAUYLWqKpAGXO459DpgZn1upN6ys626yBgT1nwGBE89/23AHOA7YLqqrhKRR0XEu9fQKGCa52Ff4WggXUSW4wLAX1R1tWffvcDdIrIW16bwev1vpx6ysqxB2RgT1vyqH1HV2cDsStservR5XBXn/Q8YVE2a63E9mEJv/37YutVKCMaYsGYjlQE2bHA/rYRgjAljFhDg4BgEKyEYY8KYBQQ4OAbBSgjGmDBmAQFcCSEmBpKSQp0TY4wJGQsI4EoIPXpAhP06jDHhy56AYGMQjDEGCwiOjUEwxhgLCOzZAwUFVkIwxoQ9Cwg27bUxxgAWEGzaa2OM8bCAYCUEY4wBLCC4EkLLlpCYGOqcGGNMSFlAyM52pQOpah0gY4wJHxYQsrKs/cAYY7CAcLCEYIwxYS68A8KOHe5lJQRjjPEvIIjIeSKSKSJrReS+KvZPEJEMz+sHEdnh2T5YRBaKyCoR+VZErvI6Z6qIZHmdNzhwt+Un62FkjDE/8blimohEAhOBs4EcYLGIzPJaChNVvcvr+NuBIZ6P+4BrVXWNiBwBLBGROaq6w7P/HlWdEaB7qT0bg2CMMT/xp4RwPLBWVderajEwDRhRw/GjgPcAVPUHVV3jeb8F+BFIqF+WA8hKCMYY8xN/AkJXYJPX5xzPtsOISA8gBZhbxb7jgRhgndfmxz1VSRNEpIXfuQ6UrCxo2xY6dmzwSxtjTGPjT0CoqoO+VnPsSGCGqpYdkoBIEvAWcIOqlns23w/0A4YBHYF7q7y4yBgRSReR9Pz8fD+yWws2BsEYY37iT0DIAbp5fU4GtlRz7Eg81UUVRKQd8G/gQVVdVLFdVXPVOQBMwVVNHUZVX1HVVFVNTUgIcG2TjUEwxpif+BMQFgN9RSRFRGJwD/1ZlQ8SkaOAOGCh17YY4CPgTVX9Z6Xjkzw/BbgYWFnXm6gTVRuDYIwxXnz2MlLVUhG5DZgDRAKTVXWViDwKpKtqRXAYBUxTVe/qpCuB4UAnEbnes+16Vc0A3hGRBFyVVAZwc0DuyF+FhW4tBCshGGMM4EdAAFDV2cDsStservR5XBXnvQ28XU2aZ/qdy2CwHkbGGHOI8B2pbGMQjDHmEOEbEKyEYIwxhwjfgJCVBXFx0L59qHNijDGNQvgGBOthZIwxhwjfgGBjEIwx5hDhGRBsDIIxxhwmPANCXh4UFVkJwRhjvIRnQKjoYWQBwRhjfhKeAaFiDIJVGRljzE/CMyDYGARjjDlMeAaErCxISIDWrUOdE2OMaTTCMyBkZ1v7gTHGVBKeASEry6qLjDGmkvALCOXlsGGDlRCMMaaS8AsIW7ZASYmVEIwxppLwCwg2BsEYY6oUfgHBxiAYY0yV/AoIInKeiGSKyFoRua+K/RNEJMPz+kFEdnjtu05E1nhe13ltHyoiKzxpvuBZWzn4KkoIPXo0yOWMMaap8LmEpohEAhOBs4EcYLGIzFLV1RXHqOpdXsffDgzxvO8IPAKkAgos8Zy7HZgEjAEW4ZbnPA/4NED3Vb2sLEhKgtjYoF/KGGOaEn9KCMcDa1V1vaoWA9OAETUcPwp4z/P+XOBzVd3mCQKfA+eJSBLQTlUXqqoCbwIX1/kuasPGIBhjTJX8CQhdgU1en3M82w4jIj2AFGCuj3O7et77k+YYEUkXkfT8/Hw/suuDjUEwxpgq+RMQqqrb12qOHQnMUNUyH+f6naaqvqKqqaqampCQ4DOzNSothU2brIRgjDFV8Ccg5ADdvD4nA1uqOXYkB6uLajo3x/PenzQDJycHysqshGCMMVXwJyAsBvqKSIqIxOAe+rMqHyQiRwFxwEKvzXOAc0QkTkTigHOAOaqaC+wWkRM9vYuuBWbW8158szEIxhhTLZ+9jFS1VERuwz3cI4HJqrpKRB4F0lW1IjiMAqZ5Gokrzt0mIo/hggrAo6q6zfP+d8BUoCWud1HD9DACKyEYY0wVfAYEAFWdjesa6r3t4Uqfx1Vz7mRgchXb04GB/mY0ILKzISICunXzeagxxoSb8BqpnJUFXbtCTEyoc2KMMY1OeAUEG4NgjDHVCq+AYGMQjDGmWuETEIqLYfNmKyEYY0w1wicgbNwIqlZCMMaYaoRPQLAxCMYYU6PwCQg2BsEYY2oUPgEhOxuioly3U2OMMYcJn4CQleUGpEX5NRbPGGPCTngFBGs/MMaYaoVPQMjOtvYDY4ypQXgEhP37YetWKyEYY0wNwiMgbNjgfloJwRhjqhUeAaGiy6mVEIwxplrhERAqBqVZCcEYY6oVHgEhK8tNeZ2UFOqcGGNMo+VXQBCR80QkU0TWish91RxzpYisFpFVIvKuZ9sZIpLh9SoSkYs9+6aKSJbXvsGBu61KsrOhRw+3OI4xx
pgq+RylJSKRwETgbCAHWCwis1R1tdcxfYH7gVNUdbuIJAKoahow2HNMR2At8JlX8veo6oxA3Uy1hgyB3r2DfhljjGnK/Bm2ezywVlXXA4jINGAEsNrrmJuAiaq6HUBVf6wincuBT1V1X/2yXAf339/glzTGmKbGnzqUrsAmr885nm3ejgSOFJH/E5FFInJeFemMBN6rtO1xEflWRCaISAu/c22MMSbg/AkIUsU2rfQ5CugLnA6MAl4TkQ4/JSCSBAwC5nidcz/QDxgGdATurfLiImNEJF1E0vPz8/3IrjHGmLrwJyDkAN28PicDW6o4ZqaqlqhqFpCJCxAVrgQ+UtWSig2qmqvOAWAKrmrqMKr6iqqmqmpqQkKCH9k1xhhTF/4EhMVAXxFJEZEYXNXPrErHfAycASAi8bgqpPVe+0dRqbrIU2pARAS4GFhZlxswxhgTGD4blVW1VERuw1X3RAKTVXWViDwKpKvqLM++c0RkNVCG6z1UCCAiPXEljHmVkn5HRBJwVVIZwM2BuSVjjDF1IaqVmwMar9TUVE1PTw91NowxpkkRkSWqmurrOBupZYwxBrCAYIwxxqNJVRmJSD6woY6nxwMFAcxOYxYu9xou9wnhc6/hcp/QsPfaQ1V9dtNsUgGhPkQk3Z86tOYgXO41XO4Twudew+U+oXHeq1UZGWOMASwgGGOM8QingPBKqDPQgMLlXsPlPiF87jVc7hMa4b2GTRuCMcaYmoVTCcEYY0wNwiIg+LPiW3MgItkissKzAl2zGtItIpNF5EcRWem1raOIfC4iazw/40KZx0Co5j7Hichmr9UFLwhlHgNFRLqJSJqIfOdZafH3nu3N6u9aw302ur9rs68y8qz49gNeK74Bo7xXfGsuRCQbSFXVZtePW0SGA3uAN1V1oGfb08A2Vf2LJ9DHqWqV06g3FdXc5zhgj6o+G8q8BZpngsskVV0qIm2BJbiJLq+nGf1da7jPK2lkf9dwKCH8tOKbqhYDFSu+mSZEVecD2yptHgG84Xn/Bu4/WZNWzX02S54p8Jd63u8GvsMtvtWs/q413GejEw4BwZ8V35oLBT4TkSUiMibUmWkAnVU1F9x/OiAxxPkJpts8qwtObupVKFXxzIo8BPiaZvx3rXSf0Mj+ruEQEPxZ8a25OEVVjwPOB271VD+Ypm8S0BsYDOQCfw1tdgJLRNoAHwB3ququUOcnWKq4z0b3dw2HgODPim/Ngqpu8fz8EfiIalaha0byvBZaSgJ+DHF+gkJV81S1TFXLgVdpRn9XEYnGPSTfUdUPPZub3d+1qvtsjH/XcAgI/qz41uSJSGtPgxUi0ho4h+a/Ct0s4DrP++uAmSHMS9BUPBw9LqGZ/F09qyW+Dnynqn/z2tWs/q7V3Wdj/Ls2+15GAJ7uXM9xcMW3x0OcpYATkV64UgG4lfDebU73KSLvAafjZojMAx7BLd06HegObASuUNUm3SBbzX2ejqtWUCAb+G1FHXtTJiI/A74CVgDlns1/wtWvN5u/aw33OYpG9ncNi4BgjDHGt3CoMjLGGOMHCwjGGGMACwjGGGM8LCAYY4wBLCAYY4zxsIBgTBCJyOki8kmo82GMPywgGGOMASwgGAOAiPxaRL7xzEv/sohEisgeEfmriCwVkS9EJMFz7GARWeSZlOyjiknJRKSPiPxXRJZ7zuntSb6NiMwQke9F5B3PyFVE5C8istqTTqOZAtmELwsIJuyJyNHAVbjJAQcDZcDVQGtgqWfCwHm4UcMAbwL3quoxuNGnFdvfASaq6rHAybgJy8DNbnkn0B/oBZwiIh1x0xUM8KQzPrh3aYxvFhCMgbOAocBiEcnwfO6Fm2bgfc8xbwM/E5H2QAdVnefZ/gYw3DOPVFdV/QhAVYtUdZ/nmG9UNccziVkG0BPYBRQBr4nIpUDFscaEjAUEY9wU6W+o6mDP6yhVHVfFcTXN81LVNOsVDni9LwOiVLUUN7vlB7gFYP5TyzwbE3AWEIyBL4DLRSQRflrTtwfu/8flnmN+BSxQ1Z3AdhE51bP9GmCeZ377HBG52JNGCxFpVd0FPXPjt1fV2bjqpMHBuDFjaiMq1BkwJtRUdbWIPIhbbS4CKAFuBfYCA0RkCbAT184Abkrmf3ge+OuBGzzbrwFeFpFHPWlcUcNl2wIzRSQWV7q4K8C3ZUyt2WynxlRDRPaoaptQ58OYhmJVRsYYYwArIRhjjPGwEoIxxhjAAoIxxhgPCwjGGGMACwjGGGM8LCAYY4wBLCAYY4zx+H9ECRqME3y+gwAAAABJRU5ErkJggg==\n",
"text/plain": "<matplotlib.figure.Figure at 0x7f118a049ef0>"
},
"metadata": {},
"output_type": "display_data"
}
]
},
{
"metadata": {
"scrolled": true,
"trusted": true
},
"cell_type": "code",
"source": "result = pred.argmax(axis=1)\nresult",
"execution_count": 411,
"outputs": [
{
"data": {
"text/plain": "array([0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0,\n 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1,\n 1, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 1,\n 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0,\n 1, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0,\n 0, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1,\n 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0,\n 0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1,\n 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0,\n 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0,\n 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1,\n 1, 0, 0, 0, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0,\n 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0,\n 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0,\n 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0,\n 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0,\n 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0,\n 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1,\n 0, 1, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1])"
},
"execution_count": 411,
"metadata": {},
"output_type": "execute_result"
}
]
},
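  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "The network output `pred` appears to hold one column of class scores per class (not-survived / survived), so `argmax(axis=1)` picks the higher-scoring class for each passenger. A minimal sketch with a hypothetical toy array (`probs`, not the notebook's `pred`) to illustrate the conversion:"
  },
  {
   "metadata": {
    "trusted": true
   },
   "cell_type": "code",
   "source": "# Sketch only: converting a two-column score/probability array to 0/1 labels.\n# `probs` is a hypothetical toy array, not the notebook's `pred`.\nimport numpy as np\n\nprobs = np.array([[0.9, 0.1],   # -> 0 (not survived)\n                  [0.3, 0.7],   # -> 1 (survived)\n                  [0.5, 0.5]])  # tie -> argmax returns the first index, 0\nprint(probs.argmax(axis=1))     # [0 1 0]",
   "execution_count": null,
   "outputs": []
  },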
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# compare with the previous result\nprev = pd.read_csv('one_hot_submission.csv', index_col=0) # best ever\nprint('Diff: ',np.sum(prev.Survived.values != result))\n\n# The Number of Survived\nprint(result.sum(),'/',len(result),' :',round(result.sum()/len(result),2))",
"execution_count": 412,
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": "Diff: 9\n129 / 418 : 0.31\n"
}
]
},
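  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "A further sanity check (not in the original notebook): the predicted survival rate on the test set can be compared with the rate observed in the training data, and the two should be broadly similar. The sketch below reuses `result` from the cell above and re-reads `train.csv`."
  },
  {
   "metadata": {
    "trusted": true
   },
   "cell_type": "code",
   "source": "# Sketch: compare the predicted test-set survival rate with the training rate.\nimport pandas as pd\n\ntrain_rate = pd.read_csv('train.csv')['Survived'].mean()  # observed rate in train.csv\npred_rate = result.mean()                                  # predicted rate on the test set\nprint('train: {:.2f}  predicted test: {:.2f}'.format(train_rate, pred_rate))",
   "execution_count": null,
   "outputs": []
  },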
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "submission = pd.DataFrame({'PassengerId': test.index, 'Survived': result})\nsubmission.to_csv('one_hot_submission.csv', index=False)",
"execution_count": 413,
"outputs": []
},
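  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "Before uploading, a quick format check of the file just written can catch shape or column mistakes. The sketch below assumes the Kaggle Titanic submission format of 418 rows with `PassengerId` and `Survived` columns."
  },
  {
   "metadata": {
    "trusted": true
   },
   "cell_type": "code",
   "source": "# Sketch: verify the submission file written above has the expected format.\nimport pandas as pd\n\ncheck = pd.read_csv('one_hot_submission.csv')\nprint(check.shape)                            # expected: (418, 2)\nprint(list(check.columns))                    # expected: ['PassengerId', 'Survived']\nprint(check['Survived'].isin([0, 1]).all())   # predictions should all be 0 or 1",
   "execution_count": null,
   "outputs": []
  },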
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "df.head()",
"execution_count": 414,
"outputs": [
{
"data": {
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>Sex</th>\n <th>Age</th>\n <th>SibSp</th>\n <th>Parch</th>\n <th>Fare</th>\n <th>Pclass_0</th>\n <th>Pclass_1</th>\n <th>Pclass_2</th>\n <th>Title_0</th>\n <th>Title_1</th>\n <th>...</th>\n <th>Cabin_2</th>\n <th>Cabin_3</th>\n <th>Cabin_4</th>\n <th>Cabin_5</th>\n <th>Cabin_6</th>\n <th>Cabin_7</th>\n <th>Cabin_8</th>\n <th>Embarked_0</th>\n <th>Embarked_1</th>\n <th>Embarked_2</th>\n </tr>\n <tr>\n <th>PassengerId</th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n <th></th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>1</th>\n <td>0</td>\n <td>0.273456</td>\n <td>0.125</td>\n <td>0.0</td>\n <td>0.014151</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>1</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>2</th>\n <td>1</td>\n <td>0.473882</td>\n <td>0.125</td>\n <td>0.0</td>\n <td>0.139136</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n </tr>\n <tr>\n <th>3</th>\n <td>1</td>\n <td>0.323563</td>\n <td>0.000</td>\n <td>0.0</td>\n <td>0.015469</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>4</th>\n <td>1</td>\n <td>0.436302</td>\n <td>0.125</td>\n <td>0.0</td>\n <td>0.103644</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n <tr>\n <th>5</th>\n <td>0</td>\n <td>0.436302</td>\n <td>0.000</td>\n <td>0.0</td>\n <td>0.015713</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>1</td>\n <td>0</td>\n <td>...</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>0</td>\n <td>1</td>\n <td>0</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n<p>5 rows × 38 columns</p>\n</div>",
"text/plain": " Sex Age SibSp Parch Fare Pclass_0 Pclass_1 \\\nPassengerId \n1 0 0.273456 0.125 0.0 0.014151 0 0 \n2 1 0.473882 0.125 0.0 0.139136 1 0 \n3 1 0.323563 0.000 0.0 0.015469 0 0 \n4 1 0.436302 0.125 0.0 0.103644 1 0 \n5 0 0.436302 0.000 0.0 0.015713 0 0 \n\n Pclass_2 Title_0 Title_1 ... Cabin_2 Cabin_3 \\\nPassengerId ... \n1 1 1 0 ... 0 0 \n2 0 0 1 ... 0 0 \n3 1 0 0 ... 0 0 \n4 0 0 1 ... 0 0 \n5 1 1 0 ... 0 0 \n\n Cabin_4 Cabin_5 Cabin_6 Cabin_7 Cabin_8 Embarked_0 \\\nPassengerId \n1 0 0 0 0 0 1 \n2 0 0 0 0 0 0 \n3 0 0 0 0 0 1 \n4 0 0 0 0 0 1 \n5 0 0 0 0 0 1 \n\n Embarked_1 Embarked_2 \nPassengerId \n1 0 0 \n2 1 0 \n3 0 0 \n4 0 0 \n5 0 0 \n\n[5 rows x 38 columns]"
},
"execution_count": 414,
"metadata": {},
"output_type": "execute_result"
}
]
},
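  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "A few hedged checks on the preprocessed one-hot frame `df` shown above: after scaling and one-hot encoding, every column should be numeric and free of missing values."
  },
  {
   "metadata": {
    "trusted": true
   },
   "cell_type": "code",
   "source": "# Sketch: basic integrity checks on the preprocessed frame `df` from above.\nprint(df.shape)                   # rows x one-hot feature columns\nprint(df.isnull().sum().sum())    # should be 0 after preprocessing\nprint(df.dtypes.value_counts())   # confirm every column is numeric",
   "execution_count": null,
   "outputs": []
  },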
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "",
"execution_count": null,
"outputs": []
}
],
"metadata": {
"kernelspec": {
"name": "py36",
"display_name": "py36",
"language": "python"
},
"language_info": {
"name": "python",
"version": "3.6.4",
"mimetype": "text/x-python",
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"pygments_lexer": "ipython3",
"nbconvert_exporter": "python",
"file_extension": ".py"
},
"gist": {
"id": "",
"data": {
"description": "Titanic One-Hot Keras CNN",
"public": true
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}