@smfullman
Created July 15, 2020 03:52
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Simple Snorkel Demo\n",
"\n",
"### Goals\n",
"Demonstrate basic capabilities of the [Snorkel](https://www.snorkel.org/) library with a simple NLP classification problem.\n",
"\n",
"### Approach\n",
"Use the classic IMDB movie review data set from [Andrew Maas](http://ai.stanford.edu/~amaas/data/sentiment/), re-group to simulate having only 1000 labeled examples to start, and then develop a sentiment classification model.\n",
"\n",
"Develop Snorkel labeling functions to create weak supervision signals over the remaining unlabeled data, use that information to train a model, and test performance on our \"hand labeled\" data.\n",
"\n",
"### Notes:\n",
"1. This notebook is meant to demonstrate basic Snorkel capabilities such as labeling functions, and not meant to be a full Snorkel tutorial. To learn more head to the Snorkel tutorials page: https://www.snorkel.org/use-cases\n",
"1. This notebook is also not meant to be a performance baseline or prove the usefulness of weak supervision. To that end, we just show how to create a simple model starting with limited data. There are likely better data sets on which to develop benchmarks and show the power of weak supervision. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Install Requirements"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"!echo \"numpy>=1.18.1\\nscipy>=1.4.1\\npandas>=0.25.0,<0.26.0\\ntqdm>=4.33.0,<5.0.0\\nscikit-learn>=0.21.3,<0.22.0\\ntorch>=1.2.0,<2.0.0\\ntensorflow==1.15.0\\nnetworkx>=2.2,<2.4\\nsnorkel>=0.9.5\\nsklearn-crfsuite>=0.3.6\\nspacy>=2.3.0\" > requirements.txt"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install -r requirements.txt"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Download Data"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import os, urllib, tarfile"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "Data already downloaded.\nTar file already extracted.\n"
}
],
"source": [
"\n",
"DATA_URL = 'http://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz'\n",
"DATA_DIR = './data'\n",
"\n",
"if not os.path.exists(DATA_DIR):\n",
" os.makedirs(DATA_DIR)\n",
"\n",
"if not os.path.isfile(os.path.join(DATA_DIR,'movie_data.tar.gz')):\n",
" urllib.request.urlretrieve(DATA_URL, os.path.join(DATA_DIR,'movie_data.tar.gz'))\n",
"else:\n",
" print(\"Data already downloaded.\")\n",
"\n",
"if os.path.isfile(os.path.join(DATA_DIR,'movie_data.tar.gz')) and not os.path.exists(os.path.join(DATA_DIR,'aclImdb')):\n",
" f = tarfile.open(os.path.join(DATA_DIR,'movie_data.tar.gz'))\n",
" f.extractall(path=DATA_DIR)\n",
" f.close()\n",
"else:\n",
" print(\"Tar file already extracted.\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create IMDB Dataframes\n",
"\n",
"Here we split our 50,000 examples into two batches:\n",
"\n",
"1. our simulated \"labeled\" data with 1000 rows (to be used as test set)\n",
"1. our \"unlabeled\" data with 49,000 rows (to be used with Snorkel)\n",
" \n",
"We will try to use Snorkel to create labeles for the unlabeled data, train a model using this information, and then test the performance on our 1000 labels."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import pandas as pd"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"TRAIN_DATA_FOLDER = './data/aclImdb/train/'\n",
"TEST_DATA_FOLDER = './data/aclImdb/test/'"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "(50000, 4)\n(1000, 4) (49000, 4)\n"
}
],
"source": [
"def create_dataframe_from_files(data_folder):\n",
" examples = list()\n",
" for d in ['pos','neg']:\n",
" for f in os.listdir(os.path.join(data_folder,d)):\n",
" _tmp = open(os.path.join(data_folder,d,f),'r', encoding='utf-8')\n",
" if d=='pos':\n",
" examples += [(_tmp.read(),f,1)]\n",
" else:\n",
" examples += [(_tmp.read(),f,0)]\n",
" df_tmp = pd.DataFrame(examples, columns=['text','file','target'])\n",
" df_tmp['original_assignment'] = data_folder\n",
" df_tmp = df_tmp.sample(frac=1)\n",
" df_tmp = df_tmp.reset_index(drop=True)\n",
" return df_tmp\n",
" \n",
"df_original = pd.concat([create_dataframe_from_files(TRAIN_DATA_FOLDER), create_dataframe_from_files(TEST_DATA_FOLDER)])\n",
"df_original = df_original.reset_index(drop=True)\n",
"print(df_original.shape)\n",
"\n",
"df_original['idx'] = df_original.index\n",
"label_index = df_original[df_original.original_assignment=='./data/aclImdb/test/'].sample(n=1000, random_state=8675309).idx.values\n",
"df_original['simulated_assignment'] = df_original.idx.apply(lambda x: 'labeled' if x in label_index else 'unlabeled')\n",
"df_original = df_original[['idx', 'text','simulated_assignment','target']]\n",
"\n",
"df_labeled = df_original[df_original.simulated_assignment=='labeled']\n",
"df_unlabeled = df_original[df_original.simulated_assignment=='unlabeled']\n",
"print(df_labeled.shape, df_unlabeled.shape)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": " idx text \\\n25008 25008 I think my summary says it all. This MTV-ish a... \n25009 25009 I went to this film full of hope. With so many... \n25011 25011 The film starts out with a narration of the pr... \n25042 25042 This low-budget indie film redefines the word ... \n25044 25044 On the heels of the well received and beloved ... \n\n simulated_assignment target \n25008 labeled 0 \n25009 labeled 0 \n25011 labeled 0 \n25042 labeled 0 \n25044 labeled 0 ",
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>idx</th>\n <th>text</th>\n <th>simulated_assignment</th>\n <th>target</th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>25008</th>\n <td>25008</td>\n <td>I think my summary says it all. This MTV-ish a...</td>\n <td>labeled</td>\n <td>0</td>\n </tr>\n <tr>\n <th>25009</th>\n <td>25009</td>\n <td>I went to this film full of hope. With so many...</td>\n <td>labeled</td>\n <td>0</td>\n </tr>\n <tr>\n <th>25011</th>\n <td>25011</td>\n <td>The film starts out with a narration of the pr...</td>\n <td>labeled</td>\n <td>0</td>\n </tr>\n <tr>\n <th>25042</th>\n <td>25042</td>\n <td>This low-budget indie film redefines the word ...</td>\n <td>labeled</td>\n <td>0</td>\n </tr>\n <tr>\n <th>25044</th>\n <td>25044</td>\n <td>On the heels of the well received and beloved ...</td>\n <td>labeled</td>\n <td>0</td>\n </tr>\n </tbody>\n</table>\n</div>"
},
"metadata": {},
"execution_count": 6
}
],
"source": [
"df_labeled.head()"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": "1 0.502\n0 0.498\nName: target, dtype: float64"
},
"metadata": {},
"execution_count": 7
}
],
"source": [
"df_labeled.target.value_counts(normalize=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create Labeling Functions for Weak Supervision\n",
"\n",
"In this section we create our labeling functions and do some basic analysis of their coverage and overlap. These functions will provide weak signals (or abstain from signaling) as to the labels of each row in our unlabeled data."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"import re\n",
"\n",
"from snorkel.labeling import labeling_function, LabelingFunction\n",
"from snorkel.labeling import PandasLFApplier, LFAnalysis\n",
"from snorkel.labeling import filter_unlabeled_dataframe\n",
"from snorkel.labeling.model import LabelModel\n",
"from snorkel.utils import probs_to_preds"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"ABSTAIN = -1\n",
"NEG = 0\n",
"POS = 1\n",
"\n",
"@labeling_function()\n",
"def check_for_great(row):\n",
" result = ABSTAIN\n",
" finder = re.search(r'(was|is)\\sgreat', row.text, flags=re.I)\n",
" if finder:\n",
" result = POS\n",
" return result\n",
"\n",
"@labeling_function()\n",
"def check_for_recommend(row):\n",
" result = ABSTAIN\n",
" finder_pos = re.search(r'would\\srecommend|would\\sreccommend|highly\\srecommend',\n",
" row.text, flags=re.I)\n",
" if finder_pos:\n",
" result = POS\n",
" finder_neg = re.search(r'(can\\'t|wouldn\\'t|would\\snot)\\s(recommend|reccommend)',\n",
" row.text, flags=re.I)\n",
" if finder_neg:\n",
" result = NEG\n",
"\n",
" return result\n",
"\n",
"@labeling_function()\n",
"def check_for_negative_phrases(row):\n",
" result = ABSTAIN\n",
" finder = re.search(r'movie was terrible|hated it|so bad|pretty bad|\\\n",
" didn\\'t like|not good|not funny|not worth watching|sucks', row.text, flags=re.I)\n",
" if finder:\n",
" result = NEG\n",
" return result\n",
"\n",
"@labeling_function()\n",
"def check_for_positive_phrases(row):\n",
" result = ABSTAIN\n",
" finder = re.search(r'movie was pretty good|loved it|not bad|really liked', row.text, flags=re.I)\n",
" if finder:\n",
" result = POS\n",
" return result"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"def keyword_lookup(x, keywords, label):\n",
" if any(word in x.text.lower() for word in keywords):\n",
" return label\n",
" return ABSTAIN\n",
"\n",
"\n",
"def make_keyword_lf(keywords, label=POS):\n",
" return LabelingFunction(\n",
" name=f\"keyword_{keywords[0]}\",\n",
" f=keyword_lookup,\n",
" resources=dict(keywords=keywords, label=label),\n",
" )\n",
"\n",
"keyword_worst = make_keyword_lf(keywords=[\"the worst\", \"worst!\"], label=NEG)\n",
"keyword_terrible = make_keyword_lf(keywords=[\"terrible\", \"terible\"], label=NEG)\n",
"keyword_sucks = make_keyword_lf(keywords=[\"sucks\", \"sucked\"], label=NEG)\n",
"keyword_delightful = make_keyword_lf(keywords=[\"delightful\"], label=POS)"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": "100%|██████████| 49000/49000 [00:25<00:00, 1921.17it/s]\n"
}
],
"source": [
"lfs = [check_for_great,\n",
" check_for_recommend,\n",
" check_for_negative_phrases,\n",
" check_for_positive_phrases,\n",
" keyword_worst,\n",
" keyword_terrible,\n",
" keyword_sucks,\n",
" keyword_delightful\n",
" ]\n",
"\n",
"applier = PandasLFApplier(lfs=lfs)\n",
"L_train = applier.apply(df=df_unlabeled)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": "array([[-1, 1, -1, -1, -1, 0, -1, -1],\n [-1, -1, -1, -1, -1, -1, -1, -1],\n [-1, -1, -1, -1, -1, -1, -1, -1],\n [-1, -1, -1, -1, -1, -1, -1, -1],\n [-1, -1, -1, -1, 0, 0, -1, -1],\n [-1, -1, -1, -1, -1, -1, -1, -1],\n [-1, -1, -1, -1, 0, -1, -1, -1],\n [-1, -1, -1, -1, -1, -1, -1, -1],\n [-1, -1, -1, -1, -1, -1, -1, -1],\n [-1, -1, -1, -1, -1, -1, -1, -1]])"
},
"metadata": {},
"execution_count": 12
}
],
"source": [
"L_train[:10,:]"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": " j Polarity Coverage Overlaps Conflicts\ncheck_for_great 0 [1] 0.039429 0.007816 0.004245\ncheck_for_recommend 1 [0, 1] 0.032286 0.006571 0.002061\ncheck_for_negative_phrases 2 [0] 0.060265 0.028408 0.003918\ncheck_for_positive_phrases 3 [1] 0.022571 0.006163 0.003673\nkeyword_the worst 4 [0] 0.071755 0.022469 0.002939\nkeyword_terrible 5 [0] 0.054286 0.020184 0.003224\nkeyword_sucks 6 [0] 0.019286 0.013980 0.001408\nkeyword_delightful 7 [1] 0.010796 0.001796 0.000592",
"text/html": "<div>\n<style scoped>\n .dataframe tbody tr th:only-of-type {\n vertical-align: middle;\n }\n\n .dataframe tbody tr th {\n vertical-align: top;\n }\n\n .dataframe thead th {\n text-align: right;\n }\n</style>\n<table border=\"1\" class=\"dataframe\">\n <thead>\n <tr style=\"text-align: right;\">\n <th></th>\n <th>j</th>\n <th>Polarity</th>\n <th>Coverage</th>\n <th>Overlaps</th>\n <th>Conflicts</th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <th>check_for_great</th>\n <td>0</td>\n <td>[1]</td>\n <td>0.039429</td>\n <td>0.007816</td>\n <td>0.004245</td>\n </tr>\n <tr>\n <th>check_for_recommend</th>\n <td>1</td>\n <td>[0, 1]</td>\n <td>0.032286</td>\n <td>0.006571</td>\n <td>0.002061</td>\n </tr>\n <tr>\n <th>check_for_negative_phrases</th>\n <td>2</td>\n <td>[0]</td>\n <td>0.060265</td>\n <td>0.028408</td>\n <td>0.003918</td>\n </tr>\n <tr>\n <th>check_for_positive_phrases</th>\n <td>3</td>\n <td>[1]</td>\n <td>0.022571</td>\n <td>0.006163</td>\n <td>0.003673</td>\n </tr>\n <tr>\n <th>keyword_the worst</th>\n <td>4</td>\n <td>[0]</td>\n <td>0.071755</td>\n <td>0.022469</td>\n <td>0.002939</td>\n </tr>\n <tr>\n <th>keyword_terrible</th>\n <td>5</td>\n <td>[0]</td>\n <td>0.054286</td>\n <td>0.020184</td>\n <td>0.003224</td>\n </tr>\n <tr>\n <th>keyword_sucks</th>\n <td>6</td>\n <td>[0]</td>\n <td>0.019286</td>\n <td>0.013980</td>\n <td>0.001408</td>\n </tr>\n <tr>\n <th>keyword_delightful</th>\n <td>7</td>\n <td>[1]</td>\n <td>0.010796</td>\n <td>0.001796</td>\n <td>0.000592</td>\n </tr>\n </tbody>\n</table>\n</div>"
},
"metadata": {},
"execution_count": 13
}
],
"source": [
"LFAnalysis(L=L_train, lfs=lfs).lf_summary()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create Label Model\n",
"\n",
"This section takes all of the weak signals from the labeling functions and trains a model to determine a \"final\" label suggestion. This is where we resolve the potential conflicts and overlaps, as well as provide a confidence in the predicted label.\n",
"\n",
"This is the most interesting part of the Snorkel library (effectively answering the question: how you determine a single label from multiple signal inputs, without any supervised signal from real labels). For more information on this, check out: [Training Complex Models with Multi-Task Weak Supervision](https://arxiv.org/abs/1810.02840)."
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [],
"source": [
"label_model = LabelModel(cardinality=2, verbose=True)\n",
"label_model.fit(L_train=L_train, n_epochs=500, log_freq=100, seed=123)"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": "array([[9.99630956e-01, 3.69044256e-04],\n [5.00000000e-01, 5.00000000e-01],\n [5.00000000e-01, 5.00000000e-01],\n [5.00000000e-01, 5.00000000e-01],\n [9.99998200e-01, 1.80048745e-06],\n [5.00000000e-01, 5.00000000e-01],\n [9.79901657e-01, 2.00983432e-02],\n [5.00000000e-01, 5.00000000e-01],\n [5.00000000e-01, 5.00000000e-01],\n [5.00000000e-01, 5.00000000e-01]])"
},
"metadata": {},
"execution_count": 15
}
],
"source": [
"probs_train = label_model.predict_proba(L=L_train)\n",
"probs_train[:10]"
]
},
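{
"cell_type": "markdown",
"metadata": {},
"source": [
"As an optional sanity check (not part of the original notebook), the label model can be compared against a simple majority vote over the same labeling-function matrix. The cell below is a minimal sketch; it assumes `MajorityLabelVoter` is importable from `snorkel.labeling.model` (the case for snorkel >= 0.9.5, the version pinned above)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch (not an original cell): compare the trained LabelModel\n",
"# against a simple majority vote over the same labeling function outputs.\n",
"from snorkel.labeling.model import MajorityLabelVoter\n",
"\n",
"majority_model = MajorityLabelVoter(cardinality=2)\n",
"preds_majority = majority_model.predict(L=L_train)\n",
"preds_label_model = label_model.predict(L=L_train)\n",
"\n",
"# Only look at rows where at least one labeling function voted (the rest are all ABSTAIN).\n",
"covered = (L_train != ABSTAIN).any(axis=1)\n",
"agreement = (preds_majority[covered] == preds_label_model[covered]).mean()\n",
"print(f\"LabelModel vs. majority vote agreement on covered rows: {agreement:.3f}\")"
]
},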
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Train a Linear Model (non-weighted labels)\n",
"\n",
"In this section, we train a simple linear model using the labels created in the previous section.\n",
"\n",
"First, we filter the unlabeled data so that we only include rows where we have some confidence in the Snorkel output (i.e., all of the ABSTAIN or low confidence rows get removed). Then, we use the `probs_to_preds` function to effectively brute force the probability weighted signals to a single class label. Note: when we do this we are throwing away information on the confidence of the label, but this will allow us to use the labels with model libraries such as scikit learn."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "(12373, 4)\n[0 0 0 1 1 0 0 0 0 0]\n"
}
],
"source": [
"df_train_filtered, probs_train_filtered = filter_unlabeled_dataframe(\n",
" X=df_unlabeled, y=probs_train, L=L_train\n",
")\n",
"preds_train_filtered = probs_to_preds(probs=probs_train_filtered)\n",
"print(df_train_filtered.shape)\n",
"print(preds_train_filtered[:10])"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "precision recall f1-score support\n\n 0 0.71 0.63 0.67 498\n 1 0.67 0.75 0.71 502\n\n accuracy 0.69 1000\n macro avg 0.69 0.69 0.69 1000\nweighted avg 0.69 0.69 0.69 1000\n\n"
}
],
"source": [
"from sklearn.feature_extraction.text import CountVectorizer\n",
"from sklearn.linear_model import LogisticRegression\n",
"from sklearn.metrics import classification_report\n",
"\n",
"MAX_FEATURES = 1000\n",
"vectorizer = CountVectorizer(ngram_range=(1, 2), max_features=MAX_FEATURES)\n",
"X_train = vectorizer.fit_transform(df_train_filtered.text.tolist())\n",
"X_test = vectorizer.transform(df_labeled.text.tolist())\n",
"\n",
"sklearn_model_base = LogisticRegression(C=1e3, solver=\"liblinear\")\n",
"sklearn_model_base.fit(X=X_train, y=preds_train_filtered)\n",
"\n",
"y_hat_test_base = sklearn_model_base.predict(X_test)\n",
"print(classification_report(df_labeled.target.values, y_hat_test_base))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Entropy Weighted Labels Model\n",
"\n",
"Another way to use the output from Snorkel labeling functions is to incorporate the label model prediction confidence into the final model loss calculation. In this case we weight the label model prediction by 1-entropy of the label prediction (i.e., low confidence label predictions from the Snorkel label model get pushed to 0), and pass that to the `sample_weight` [argument](https://github.com/tensorflow/tensorflow/blob/ed0faac790902a2ea17262e0f37b3ef7faa20f0a/tensorflow/python/keras/losses.py#L124) in tensorflow. This will adjust to loss so uncertain labels will have a smaller impact on the final model weight adjustments. It is also a useful approach if you would like to combine high confidence, hand-labeled data with Snorkel weak supervision.\n",
"\n",
"Note: We moved from a linear model in the previous example to a MLP here, which will take into account feature interactions. Therefore, we can't directly attribute all the performance gain on the test to using weighted labels."
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": "Using TensorFlow backend.\n"
}
],
"source": [
"from scipy.stats import entropy\n",
"from keras.models import Model\n",
"from keras.regularizers import l2\n",
"from keras.layers import Dense, Dropout, Input\n",
"from keras.optimizers import Adam, SGD\n",
"from keras.metrics import binary_accuracy, binary_crossentropy\n",
"\n",
"import tensorflow as tf"
]
},
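{
"cell_type": "markdown",
"metadata": {},
"source": [
"The cell below is only a minimal sketch of the weighting described above, not the original training code: the `label_weights` name, the MLP layer sizes, and the training hyperparameters are illustrative assumptions. Each row's weight is 1 minus the base-2 entropy of its label model probabilities, passed to Keras via `sample_weight`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch (not an original cell): weight each weakly labeled row by\n",
"# 1 - entropy of its label model probabilities and pass the weights to Keras.\n",
"\n",
"# Imports repeated from the cell above so this sketch stands alone.\n",
"from scipy.stats import entropy\n",
"from keras.models import Model\n",
"from keras.layers import Dense, Dropout, Input\n",
"from keras.optimizers import Adam\n",
"\n",
"# Base-2 entropy of each row of probs_train_filtered (shape: n_rows x 2);\n",
"# confident rows get weights near 1, 50/50 rows get weights near 0.\n",
"label_entropy = entropy(probs_train_filtered.T, base=2)\n",
"label_weights = 1.0 - label_entropy\n",
"\n",
"inputs = Input(shape=(MAX_FEATURES,))\n",
"hidden = Dropout(0.5)(Dense(64, activation='relu')(inputs))\n",
"outputs = Dense(1, activation='sigmoid')(hidden)\n",
"\n",
"keras_model = Model(inputs=inputs, outputs=outputs)\n",
"keras_model.compile(optimizer=Adam(lr=1e-3), loss='binary_crossentropy', metrics=['accuracy'])\n",
"\n",
"# CountVectorizer returns a sparse matrix; densify it for the Dense input layer.\n",
"keras_model.fit(X_train.toarray(), preds_train_filtered,\n",
"                sample_weight=label_weights, epochs=5, batch_size=128, verbose=1)"
]
},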
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": "array([[<matplotlib.axes._subplots.AxesSubplot object at 0x1a2c5fad90>]],\n dtype=object)"
},
"metadata": {},
"execution_count": 19
},
{
"output_type": "display_data",
"data": {
"text/plain": "<Figure size 432x288 with 1 Axes>",
"image/svg+xml": "<?xml version=\"1.0\" encoding=\"utf-8\" standalone=\"no\"?>\n<!DOCTYPE svg PUBLIC \"-//W3C//DTD SVG 1.1//EN\"\n \"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd\">\n<!-- Created with matplotlib (https://matplotlib.org/) -->\n<svg height=\"263.63625pt\" version=\"1.1\" viewBox=\"0 0 381.65 263.63625\" width=\"381.65pt\" xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\">\n <defs>\n <style type=\"text/css\">\n*{stroke-linecap:butt;stroke-linejoin:round;}\n </style>\n </defs>\n <g id=\"figure_1\">\n <g id=\"patch_1\">\n <path d=\"M -0 263.63625 \nL 381.65 263.63625 \nL 381.65 0 \nL -0 0 \nz\n\" style=\"fill:none;\"/>\n </g>\n <g id=\"axes_1\">\n <g id=\"patch_2\">\n <path d=\"M 39.65 239.758125 \nL 374.45 239.758125 \nL 374.45 22.318125 \nL 39.65 22.318125 \nz\n\" style=\"fill:#ffffff;\"/>\n </g>\n <g id=\"patch_3\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 54.868182 239.758125 \nL 60.955455 239.758125 \nL 60.955455 239.457946 \nL 54.868182 239.457946 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_4\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 60.955455 239.758125 \nL 67.042727 239.758125 \nL 67.042727 239.720603 \nL 60.955455 239.720603 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_5\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 67.042727 239.758125 \nL 73.13 239.758125 \nL 73.13 209.477608 \nL 67.042727 209.477608 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_6\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 73.13 239.758125 \nL 79.217273 239.758125 \nL 79.217273 239.758125 \nL 73.13 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_7\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 79.217273 239.758125 \nL 85.304545 239.758125 \nL 85.304545 239.758125 \nL 79.217273 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_8\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 85.304545 239.758125 \nL 91.391818 239.758125 \nL 91.391818 239.758125 \nL 85.304545 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_9\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 91.391818 239.758125 \nL 97.479091 239.758125 \nL 97.479091 239.720603 \nL 91.391818 239.720603 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_10\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 97.479091 239.758125 \nL 103.566364 239.758125 \nL 103.566364 239.758125 \nL 97.479091 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_11\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 103.566364 239.758125 \nL 109.653636 239.758125 \nL 109.653636 181.636043 \nL 103.566364 181.636043 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_12\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 109.653636 239.758125 \nL 115.740909 239.758125 \nL 115.740909 239.758125 \nL 109.653636 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_13\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 115.740909 239.758125 \nL 121.828182 239.758125 \nL 121.828182 231.953481 \nL 115.740909 231.953481 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_14\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 121.828182 239.758125 \nL 127.915455 239.758125 \nL 127.915455 239.758125 \nL 121.828182 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_15\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 127.915455 239.758125 \nL 134.002727 239.758125 \nL 134.002727 239.758125 \nL 127.915455 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_16\">\n <path 
clip-path=\"url(#p8fb245b333)\" d=\"M 134.002727 239.758125 \nL 140.09 239.758125 \nL 140.09 239.758125 \nL 134.002727 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_17\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 140.09 239.758125 \nL 146.177273 239.758125 \nL 146.177273 200.209594 \nL 140.09 200.209594 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_18\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 146.177273 239.758125 \nL 152.264545 239.758125 \nL 152.264545 223.210779 \nL 146.177273 223.210779 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_19\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 152.264545 239.758125 \nL 158.351818 239.758125 \nL 158.351818 239.758125 \nL 152.264545 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_20\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 158.351818 239.758125 \nL 164.439091 239.758125 \nL 164.439091 237.056518 \nL 158.351818 237.056518 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_21\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 164.439091 239.758125 \nL 170.526364 239.758125 \nL 170.526364 239.758125 \nL 164.439091 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_22\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 170.526364 239.758125 \nL 176.613636 239.758125 \nL 176.613636 239.758125 \nL 170.526364 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_23\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 176.613636 239.758125 \nL 182.700909 239.758125 \nL 182.700909 239.758125 \nL 176.613636 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_24\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 182.700909 239.758125 \nL 188.788182 239.758125 \nL 188.788182 239.720603 \nL 182.700909 239.720603 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_25\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 188.788182 239.758125 \nL 194.875455 239.758125 \nL 194.875455 239.758125 \nL 188.788182 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_26\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 194.875455 239.758125 \nL 200.962727 239.758125 \nL 200.962727 238.407321 \nL 194.875455 238.407321 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_27\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 200.962727 239.758125 \nL 207.05 239.758125 \nL 207.05 239.495469 \nL 200.962727 239.495469 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_28\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 207.05 239.758125 \nL 213.137273 239.758125 \nL 213.137273 239.758125 \nL 207.05 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_29\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 213.137273 239.758125 \nL 219.224545 239.758125 \nL 219.224545 239.758125 \nL 213.137273 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_30\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 219.224545 239.758125 \nL 225.311818 239.758125 \nL 225.311818 239.645558 \nL 219.224545 239.645558 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_31\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 225.311818 239.758125 \nL 231.399091 239.758125 \nL 231.399091 239.758125 \nL 225.311818 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_32\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 231.399091 239.758125 \nL 237.486364 239.758125 \nL 237.486364 238.7075 \nL 231.399091 238.7075 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_33\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 
237.486364 239.758125 \nL 243.573636 239.758125 \nL 243.573636 236.868906 \nL 237.486364 236.868906 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_34\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 243.573636 239.758125 \nL 249.660909 239.758125 \nL 249.660909 239.082723 \nL 243.573636 239.082723 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_35\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 249.660909 239.758125 \nL 255.748182 239.758125 \nL 255.748182 239.758125 \nL 249.660909 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_36\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 255.748182 239.758125 \nL 261.835455 239.758125 \nL 261.835455 239.758125 \nL 255.748182 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_37\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 261.835455 239.758125 \nL 267.922727 239.758125 \nL 267.922727 238.7075 \nL 261.835455 238.7075 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_38\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 267.922727 239.758125 \nL 274.01 239.758125 \nL 274.01 238.594933 \nL 267.922727 238.594933 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_39\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 274.01 239.758125 \nL 280.097273 239.758125 \nL 280.097273 239.570513 \nL 274.01 239.570513 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_40\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 280.097273 239.758125 \nL 286.184545 239.758125 \nL 286.184545 239.68308 \nL 280.097273 239.68308 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_41\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 286.184545 239.758125 \nL 292.271818 239.758125 \nL 292.271818 239.758125 \nL 286.184545 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_42\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 292.271818 239.758125 \nL 298.359091 239.758125 \nL 298.359091 238.407321 \nL 292.271818 238.407321 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_43\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 298.359091 239.758125 \nL 304.446364 239.758125 \nL 304.446364 239.758125 \nL 298.359091 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_44\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 304.446364 239.758125 \nL 310.533636 239.758125 \nL 310.533636 239.758125 \nL 304.446364 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_45\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 310.533636 239.758125 \nL 316.620909 239.758125 \nL 316.620909 149.141709 \nL 310.533636 149.141709 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_46\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 316.620909 239.758125 \nL 322.708182 239.758125 \nL 322.708182 239.68308 \nL 316.620909 239.68308 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_47\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 322.708182 239.758125 \nL 328.795455 239.758125 \nL 328.795455 239.758125 \nL 322.708182 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_48\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 328.795455 239.758125 \nL 334.882727 239.758125 \nL 334.882727 239.758125 \nL 328.795455 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_49\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 334.882727 239.758125 \nL 340.97 239.758125 \nL 340.97 239.758125 \nL 334.882727 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_50\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 340.97 239.758125 \nL 347.057273 
239.758125 \nL 347.057273 238.857589 \nL 340.97 238.857589 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_51\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 347.057273 239.758125 \nL 353.144545 239.758125 \nL 353.144545 239.758125 \nL 347.057273 239.758125 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"patch_52\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 353.144545 239.758125 \nL 359.231818 239.758125 \nL 359.231818 32.672411 \nL 353.144545 32.672411 \nz\n\" style=\"fill:#1f77b4;\"/>\n </g>\n <g id=\"matplotlib.axis_1\">\n <g id=\"xtick_1\">\n <g id=\"line2d_1\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 54.199277 239.758125 \nL 54.199277 22.318125 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_2\">\n <defs>\n <path d=\"M 0 0 \nL 0 3.5 \n\" id=\"m6fa45f7b26\" style=\"stroke:#000000;stroke-width:0.8;\"/>\n </defs>\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"54.199277\" xlink:href=\"#m6fa45f7b26\" y=\"239.758125\"/>\n </g>\n </g>\n <g id=\"text_1\">\n <!-- 0.0 -->\n <defs>\n <path d=\"M 31.78125 66.40625 \nQ 24.171875 66.40625 20.328125 58.90625 \nQ 16.5 51.421875 16.5 36.375 \nQ 16.5 21.390625 20.328125 13.890625 \nQ 24.171875 6.390625 31.78125 6.390625 \nQ 39.453125 6.390625 43.28125 13.890625 \nQ 47.125 21.390625 47.125 36.375 \nQ 47.125 51.421875 43.28125 58.90625 \nQ 39.453125 66.40625 31.78125 66.40625 \nz\nM 31.78125 74.21875 \nQ 44.046875 74.21875 50.515625 64.515625 \nQ 56.984375 54.828125 56.984375 36.375 \nQ 56.984375 17.96875 50.515625 8.265625 \nQ 44.046875 -1.421875 31.78125 -1.421875 \nQ 19.53125 -1.421875 13.0625 8.265625 \nQ 6.59375 17.96875 6.59375 36.375 \nQ 6.59375 54.828125 13.0625 64.515625 \nQ 19.53125 74.21875 31.78125 74.21875 \nz\n\" id=\"DejaVuSans-48\"/>\n <path d=\"M 10.6875 12.40625 \nL 21 12.40625 \nL 21 0 \nL 10.6875 0 \nz\n\" id=\"DejaVuSans-46\"/>\n </defs>\n <g transform=\"translate(46.247714 254.356563)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\n <use x=\"95.410156\" xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n <g id=\"xtick_2\">\n <g id=\"line2d_3\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 115.205785 239.758125 \nL 115.205785 22.318125 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_4\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"115.205785\" xlink:href=\"#m6fa45f7b26\" y=\"239.758125\"/>\n </g>\n </g>\n <g id=\"text_2\">\n <!-- 0.2 -->\n <defs>\n <path d=\"M 19.1875 8.296875 \nL 53.609375 8.296875 \nL 53.609375 0 \nL 7.328125 0 \nL 7.328125 8.296875 \nQ 12.9375 14.109375 22.625 23.890625 \nQ 32.328125 33.6875 34.8125 36.53125 \nQ 39.546875 41.84375 41.421875 45.53125 \nQ 43.3125 49.21875 43.3125 52.78125 \nQ 43.3125 58.59375 39.234375 62.25 \nQ 35.15625 65.921875 28.609375 65.921875 \nQ 23.96875 65.921875 18.8125 64.3125 \nQ 13.671875 62.703125 7.8125 59.421875 \nL 7.8125 69.390625 \nQ 13.765625 71.78125 18.9375 73 \nQ 24.125 74.21875 28.421875 74.21875 \nQ 39.75 74.21875 46.484375 68.546875 \nQ 53.21875 62.890625 53.21875 53.421875 \nQ 53.21875 48.921875 51.53125 44.890625 \nQ 49.859375 40.875 45.40625 35.40625 \nQ 44.1875 33.984375 37.640625 27.21875 \nQ 31.109375 20.453125 19.1875 8.296875 \nz\n\" id=\"DejaVuSans-50\"/>\n </defs>\n <g transform=\"translate(107.254223 254.356563)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\n <use 
x=\"95.410156\" xlink:href=\"#DejaVuSans-50\"/>\n </g>\n </g>\n </g>\n <g id=\"xtick_3\">\n <g id=\"line2d_5\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 176.212293 239.758125 \nL 176.212293 22.318125 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_6\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"176.212293\" xlink:href=\"#m6fa45f7b26\" y=\"239.758125\"/>\n </g>\n </g>\n <g id=\"text_3\">\n <!-- 0.4 -->\n <defs>\n <path d=\"M 37.796875 64.3125 \nL 12.890625 25.390625 \nL 37.796875 25.390625 \nz\nM 35.203125 72.90625 \nL 47.609375 72.90625 \nL 47.609375 25.390625 \nL 58.015625 25.390625 \nL 58.015625 17.1875 \nL 47.609375 17.1875 \nL 47.609375 0 \nL 37.796875 0 \nL 37.796875 17.1875 \nL 4.890625 17.1875 \nL 4.890625 26.703125 \nz\n\" id=\"DejaVuSans-52\"/>\n </defs>\n <g transform=\"translate(168.260731 254.356563)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\n <use x=\"95.410156\" xlink:href=\"#DejaVuSans-52\"/>\n </g>\n </g>\n </g>\n <g id=\"xtick_4\">\n <g id=\"line2d_7\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 237.218802 239.758125 \nL 237.218802 22.318125 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_8\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"237.218802\" xlink:href=\"#m6fa45f7b26\" y=\"239.758125\"/>\n </g>\n </g>\n <g id=\"text_4\">\n <!-- 0.6 -->\n <defs>\n <path d=\"M 33.015625 40.375 \nQ 26.375 40.375 22.484375 35.828125 \nQ 18.609375 31.296875 18.609375 23.390625 \nQ 18.609375 15.53125 22.484375 10.953125 \nQ 26.375 6.390625 33.015625 6.390625 \nQ 39.65625 6.390625 43.53125 10.953125 \nQ 47.40625 15.53125 47.40625 23.390625 \nQ 47.40625 31.296875 43.53125 35.828125 \nQ 39.65625 40.375 33.015625 40.375 \nz\nM 52.59375 71.296875 \nL 52.59375 62.3125 \nQ 48.875 64.0625 45.09375 64.984375 \nQ 41.3125 65.921875 37.59375 65.921875 \nQ 27.828125 65.921875 22.671875 59.328125 \nQ 17.53125 52.734375 16.796875 39.40625 \nQ 19.671875 43.65625 24.015625 45.921875 \nQ 28.375 48.1875 33.59375 48.1875 \nQ 44.578125 48.1875 50.953125 41.515625 \nQ 57.328125 34.859375 57.328125 23.390625 \nQ 57.328125 12.15625 50.6875 5.359375 \nQ 44.046875 -1.421875 33.015625 -1.421875 \nQ 20.359375 -1.421875 13.671875 8.265625 \nQ 6.984375 17.96875 6.984375 36.375 \nQ 6.984375 53.65625 15.1875 63.9375 \nQ 23.390625 74.21875 37.203125 74.21875 \nQ 40.921875 74.21875 44.703125 73.484375 \nQ 48.484375 72.75 52.59375 71.296875 \nz\n\" id=\"DejaVuSans-54\"/>\n </defs>\n <g transform=\"translate(229.267239 254.356563)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\n <use x=\"95.410156\" xlink:href=\"#DejaVuSans-54\"/>\n </g>\n </g>\n </g>\n <g id=\"xtick_5\">\n <g id=\"line2d_9\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 298.22531 239.758125 \nL 298.22531 22.318125 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_10\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"298.22531\" xlink:href=\"#m6fa45f7b26\" y=\"239.758125\"/>\n </g>\n </g>\n <g id=\"text_5\">\n <!-- 0.8 -->\n <defs>\n <path d=\"M 31.78125 34.625 \nQ 24.75 34.625 20.71875 30.859375 \nQ 16.703125 27.09375 16.703125 20.515625 \nQ 16.703125 13.921875 20.71875 10.15625 \nQ 24.75 6.390625 31.78125 6.390625 \nQ 38.8125 6.390625 42.859375 10.171875 \nQ 46.921875 13.96875 46.921875 20.515625 \nQ 
46.921875 27.09375 42.890625 30.859375 \nQ 38.875 34.625 31.78125 34.625 \nz\nM 21.921875 38.8125 \nQ 15.578125 40.375 12.03125 44.71875 \nQ 8.5 49.078125 8.5 55.328125 \nQ 8.5 64.0625 14.71875 69.140625 \nQ 20.953125 74.21875 31.78125 74.21875 \nQ 42.671875 74.21875 48.875 69.140625 \nQ 55.078125 64.0625 55.078125 55.328125 \nQ 55.078125 49.078125 51.53125 44.71875 \nQ 48 40.375 41.703125 38.8125 \nQ 48.828125 37.15625 52.796875 32.3125 \nQ 56.78125 27.484375 56.78125 20.515625 \nQ 56.78125 9.90625 50.3125 4.234375 \nQ 43.84375 -1.421875 31.78125 -1.421875 \nQ 19.734375 -1.421875 13.25 4.234375 \nQ 6.78125 9.90625 6.78125 20.515625 \nQ 6.78125 27.484375 10.78125 32.3125 \nQ 14.796875 37.15625 21.921875 38.8125 \nz\nM 18.3125 54.390625 \nQ 18.3125 48.734375 21.84375 45.5625 \nQ 25.390625 42.390625 31.78125 42.390625 \nQ 38.140625 42.390625 41.71875 45.5625 \nQ 45.3125 48.734375 45.3125 54.390625 \nQ 45.3125 60.0625 41.71875 63.234375 \nQ 38.140625 66.40625 31.78125 66.40625 \nQ 25.390625 66.40625 21.84375 63.234375 \nQ 18.3125 60.0625 18.3125 54.390625 \nz\n\" id=\"DejaVuSans-56\"/>\n </defs>\n <g transform=\"translate(290.273747 254.356563)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\n <use x=\"95.410156\" xlink:href=\"#DejaVuSans-56\"/>\n </g>\n </g>\n </g>\n <g id=\"xtick_6\">\n <g id=\"line2d_11\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 359.231818 239.758125 \nL 359.231818 22.318125 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_12\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"359.231818\" xlink:href=\"#m6fa45f7b26\" y=\"239.758125\"/>\n </g>\n </g>\n <g id=\"text_6\">\n <!-- 1.0 -->\n <defs>\n <path d=\"M 12.40625 8.296875 \nL 28.515625 8.296875 \nL 28.515625 63.921875 \nL 10.984375 60.40625 \nL 10.984375 69.390625 \nL 28.421875 72.90625 \nL 38.28125 72.90625 \nL 38.28125 8.296875 \nL 54.390625 8.296875 \nL 54.390625 0 \nL 12.40625 0 \nz\n\" id=\"DejaVuSans-49\"/>\n </defs>\n <g transform=\"translate(351.280256 254.356563)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-49\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\n <use x=\"95.410156\" xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n </g>\n <g id=\"matplotlib.axis_2\">\n <g id=\"ytick_1\">\n <g id=\"line2d_13\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 39.65 239.758125 \nL 374.45 239.758125 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_14\">\n <defs>\n <path d=\"M 0 0 \nL -3.5 0 \n\" id=\"m18883d2aba\" style=\"stroke:#000000;stroke-width:0.8;\"/>\n </defs>\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"39.65\" xlink:href=\"#m18883d2aba\" y=\"239.758125\"/>\n </g>\n </g>\n <g id=\"text_7\">\n <!-- 0 -->\n <g transform=\"translate(26.2875 243.557344)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n <g id=\"ytick_2\">\n <g id=\"line2d_15\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 39.65 202.2358 \nL 374.45 202.2358 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_16\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"39.65\" xlink:href=\"#m18883d2aba\" y=\"202.2358\"/>\n </g>\n </g>\n <g id=\"text_8\">\n <!-- 1000 -->\n <g transform=\"translate(7.2 206.035018)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-49\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\n <use 
x=\"127.246094\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"190.869141\" xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n <g id=\"ytick_3\">\n <g id=\"line2d_17\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 39.65 164.713474 \nL 374.45 164.713474 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_18\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"39.65\" xlink:href=\"#m18883d2aba\" y=\"164.713474\"/>\n </g>\n </g>\n <g id=\"text_9\">\n <!-- 2000 -->\n <g transform=\"translate(7.2 168.512693)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-50\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"127.246094\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"190.869141\" xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n <g id=\"ytick_4\">\n <g id=\"line2d_19\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 39.65 127.191149 \nL 374.45 127.191149 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_20\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"39.65\" xlink:href=\"#m18883d2aba\" y=\"127.191149\"/>\n </g>\n </g>\n <g id=\"text_10\">\n <!-- 3000 -->\n <defs>\n <path d=\"M 40.578125 39.3125 \nQ 47.65625 37.796875 51.625 33 \nQ 55.609375 28.21875 55.609375 21.1875 \nQ 55.609375 10.40625 48.1875 4.484375 \nQ 40.765625 -1.421875 27.09375 -1.421875 \nQ 22.515625 -1.421875 17.65625 -0.515625 \nQ 12.796875 0.390625 7.625 2.203125 \nL 7.625 11.71875 \nQ 11.71875 9.328125 16.59375 8.109375 \nQ 21.484375 6.890625 26.8125 6.890625 \nQ 36.078125 6.890625 40.9375 10.546875 \nQ 45.796875 14.203125 45.796875 21.1875 \nQ 45.796875 27.640625 41.28125 31.265625 \nQ 36.765625 34.90625 28.71875 34.90625 \nL 20.21875 34.90625 \nL 20.21875 43.015625 \nL 29.109375 43.015625 \nQ 36.375 43.015625 40.234375 45.921875 \nQ 44.09375 48.828125 44.09375 54.296875 \nQ 44.09375 59.90625 40.109375 62.90625 \nQ 36.140625 65.921875 28.71875 65.921875 \nQ 24.65625 65.921875 20.015625 65.03125 \nQ 15.375 64.15625 9.8125 62.3125 \nL 9.8125 71.09375 \nQ 15.4375 72.65625 20.34375 73.4375 \nQ 25.25 74.21875 29.59375 74.21875 \nQ 40.828125 74.21875 47.359375 69.109375 \nQ 53.90625 64.015625 53.90625 55.328125 \nQ 53.90625 49.265625 50.4375 45.09375 \nQ 46.96875 40.921875 40.578125 39.3125 \nz\n\" id=\"DejaVuSans-51\"/>\n </defs>\n <g transform=\"translate(7.2 130.990367)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-51\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"127.246094\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"190.869141\" xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n <g id=\"ytick_5\">\n <g id=\"line2d_21\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 39.65 89.668823 \nL 374.45 89.668823 \n\" style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_22\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"39.65\" xlink:href=\"#m18883d2aba\" y=\"89.668823\"/>\n </g>\n </g>\n <g id=\"text_11\">\n <!-- 4000 -->\n <g transform=\"translate(7.2 93.468042)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-52\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"127.246094\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"190.869141\" xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n <g id=\"ytick_6\">\n <g id=\"line2d_23\">\n <path clip-path=\"url(#p8fb245b333)\" d=\"M 39.65 52.146498 \nL 374.45 52.146498 \n\" 
style=\"fill:none;stroke:#b0b0b0;stroke-linecap:square;stroke-width:0.8;\"/>\n </g>\n <g id=\"line2d_24\">\n <g>\n <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"39.65\" xlink:href=\"#m18883d2aba\" y=\"52.146498\"/>\n </g>\n </g>\n <g id=\"text_12\">\n <!-- 5000 -->\n <defs>\n <path d=\"M 10.796875 72.90625 \nL 49.515625 72.90625 \nL 49.515625 64.59375 \nL 19.828125 64.59375 \nL 19.828125 46.734375 \nQ 21.96875 47.46875 24.109375 47.828125 \nQ 26.265625 48.1875 28.421875 48.1875 \nQ 40.625 48.1875 47.75 41.5 \nQ 54.890625 34.8125 54.890625 23.390625 \nQ 54.890625 11.625 47.5625 5.09375 \nQ 40.234375 -1.421875 26.90625 -1.421875 \nQ 22.3125 -1.421875 17.546875 -0.640625 \nQ 12.796875 0.140625 7.71875 1.703125 \nL 7.71875 11.625 \nQ 12.109375 9.234375 16.796875 8.0625 \nQ 21.484375 6.890625 26.703125 6.890625 \nQ 35.15625 6.890625 40.078125 11.328125 \nQ 45.015625 15.765625 45.015625 23.390625 \nQ 45.015625 31 40.078125 35.4375 \nQ 35.15625 39.890625 26.703125 39.890625 \nQ 22.75 39.890625 18.8125 39.015625 \nQ 14.890625 38.140625 10.796875 36.28125 \nz\n\" id=\"DejaVuSans-53\"/>\n </defs>\n <g transform=\"translate(7.2 55.945716)scale(0.1 -0.1)\">\n <use xlink:href=\"#DejaVuSans-53\"/>\n <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"127.246094\" xlink:href=\"#DejaVuSans-48\"/>\n <use x=\"190.869141\" xlink:href=\"#DejaVuSans-48\"/>\n </g>\n </g>\n </g>\n </g>\n <g id=\"patch_53\">\n <path d=\"M 39.65 239.758125 \nL 39.65 22.318125 \n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\n </g>\n <g id=\"patch_54\">\n <path d=\"M 374.45 239.758125 \nL 374.45 22.318125 \n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\n </g>\n <g id=\"patch_55\">\n <path d=\"M 39.65 239.758125 \nL 374.45 239.758125 \n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\n </g>\n <g id=\"patch_56\">\n <path d=\"M 39.65 22.318125 \nL 374.45 22.318125 \n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\n </g>\n <g id=\"text_13\">\n <!-- label_entropy -->\n <defs>\n <path d=\"M 9.421875 75.984375 \nL 18.40625 75.984375 \nL 18.40625 0 \nL 9.421875 0 \nz\n\" id=\"DejaVuSans-108\"/>\n <path d=\"M 34.28125 27.484375 \nQ 23.390625 27.484375 19.1875 25 \nQ 14.984375 22.515625 14.984375 16.5 \nQ 14.984375 11.71875 18.140625 8.90625 \nQ 21.296875 6.109375 26.703125 6.109375 \nQ 34.1875 6.109375 38.703125 11.40625 \nQ 43.21875 16.703125 43.21875 25.484375 \nL 43.21875 27.484375 \nz\nM 52.203125 31.203125 \nL 52.203125 0 \nL 43.21875 0 \nL 43.21875 8.296875 \nQ 40.140625 3.328125 35.546875 0.953125 \nQ 30.953125 -1.421875 24.3125 -1.421875 \nQ 15.921875 -1.421875 10.953125 3.296875 \nQ 6 8.015625 6 15.921875 \nQ 6 25.140625 12.171875 29.828125 \nQ 18.359375 34.515625 30.609375 34.515625 \nL 43.21875 34.515625 \nL 43.21875 35.40625 \nQ 43.21875 41.609375 39.140625 45 \nQ 35.0625 48.390625 27.6875 48.390625 \nQ 23 48.390625 18.546875 47.265625 \nQ 14.109375 46.140625 10.015625 43.890625 \nL 10.015625 52.203125 \nQ 14.9375 54.109375 19.578125 55.046875 \nQ 24.21875 56 28.609375 56 \nQ 40.484375 56 46.34375 49.84375 \nQ 52.203125 43.703125 52.203125 31.203125 \nz\n\" id=\"DejaVuSans-97\"/>\n <path d=\"M 48.6875 27.296875 \nQ 48.6875 37.203125 44.609375 42.84375 \nQ 40.53125 48.484375 33.40625 48.484375 \nQ 26.265625 48.484375 22.1875 42.84375 \nQ 18.109375 37.203125 18.109375 27.296875 \nQ 18.109375 
17.390625 22.1875 11.75 \nQ 26.265625 6.109375 33.40625 6.109375 \nQ 40.53125 6.109375 44.609375 11.75 \nQ 48.6875 17.390625 48.6875 27.296875 \nz\nM 18.109375 46.390625 \nQ 20.953125 51.265625 25.265625 53.625 \nQ 29.59375 56 35.59375 56 \nQ 45.5625 56 51.78125 48.09375 \nQ 58.015625 40.1875 58.015625 27.296875 \nQ 58.015625 14.40625 51.78125 6.484375 \nQ 45.5625 -1.421875 35.59375 -1.421875 \nQ 29.59375 -1.421875 25.265625 0.953125 \nQ 20.953125 3.328125 18.109375 8.203125 \nL 18.109375 0 \nL 9.078125 0 \nL 9.078125 75.984375 \nL 18.109375 75.984375 \nz\n\" id=\"DejaVuSans-98\"/>\n <path d=\"M 56.203125 29.59375 \nL 56.203125 25.203125 \nL 14.890625 25.203125 \nQ 15.484375 15.921875 20.484375 11.0625 \nQ 25.484375 6.203125 34.421875 6.203125 \nQ 39.59375 6.203125 44.453125 7.46875 \nQ 49.3125 8.734375 54.109375 11.28125 \nL 54.109375 2.78125 \nQ 49.265625 0.734375 44.1875 -0.34375 \nQ 39.109375 -1.421875 33.890625 -1.421875 \nQ 20.796875 -1.421875 13.15625 6.1875 \nQ 5.515625 13.8125 5.515625 26.8125 \nQ 5.515625 40.234375 12.765625 48.109375 \nQ 20.015625 56 32.328125 56 \nQ 43.359375 56 49.78125 48.890625 \nQ 56.203125 41.796875 56.203125 29.59375 \nz\nM 47.21875 32.234375 \nQ 47.125 39.59375 43.09375 43.984375 \nQ 39.0625 48.390625 32.421875 48.390625 \nQ 24.90625 48.390625 20.390625 44.140625 \nQ 15.875 39.890625 15.1875 32.171875 \nz\n\" id=\"DejaVuSans-101\"/>\n <path d=\"M 50.984375 -16.609375 \nL 50.984375 -23.578125 \nL -0.984375 -23.578125 \nL -0.984375 -16.609375 \nz\n\" id=\"DejaVuSans-95\"/>\n <path d=\"M 54.890625 33.015625 \nL 54.890625 0 \nL 45.90625 0 \nL 45.90625 32.71875 \nQ 45.90625 40.484375 42.875 44.328125 \nQ 39.84375 48.1875 33.796875 48.1875 \nQ 26.515625 48.1875 22.3125 43.546875 \nQ 18.109375 38.921875 18.109375 30.90625 \nL 18.109375 0 \nL 9.078125 0 \nL 9.078125 54.6875 \nL 18.109375 54.6875 \nL 18.109375 46.1875 \nQ 21.34375 51.125 25.703125 53.5625 \nQ 30.078125 56 35.796875 56 \nQ 45.21875 56 50.046875 50.171875 \nQ 54.890625 44.34375 54.890625 33.015625 \nz\n\" id=\"DejaVuSans-110\"/>\n <path d=\"M 18.3125 70.21875 \nL 18.3125 54.6875 \nL 36.8125 54.6875 \nL 36.8125 47.703125 \nL 18.3125 47.703125 \nL 18.3125 18.015625 \nQ 18.3125 11.328125 20.140625 9.421875 \nQ 21.96875 7.515625 27.59375 7.515625 \nL 36.8125 7.515625 \nL 36.8125 0 \nL 27.59375 0 \nQ 17.1875 0 13.234375 3.875 \nQ 9.28125 7.765625 9.28125 18.015625 \nL 9.28125 47.703125 \nL 2.6875 47.703125 \nL 2.6875 54.6875 \nL 9.28125 54.6875 \nL 9.28125 70.21875 \nz\n\" id=\"DejaVuSans-116\"/>\n <path d=\"M 41.109375 46.296875 \nQ 39.59375 47.171875 37.8125 47.578125 \nQ 36.03125 48 33.890625 48 \nQ 26.265625 48 22.1875 43.046875 \nQ 18.109375 38.09375 18.109375 28.8125 \nL 18.109375 0 \nL 9.078125 0 \nL 9.078125 54.6875 \nL 18.109375 54.6875 \nL 18.109375 46.1875 \nQ 20.953125 51.171875 25.484375 53.578125 \nQ 30.03125 56 36.53125 56 \nQ 37.453125 56 38.578125 55.875 \nQ 39.703125 55.765625 41.0625 55.515625 \nz\n\" id=\"DejaVuSans-114\"/>\n <path d=\"M 30.609375 48.390625 \nQ 23.390625 48.390625 19.1875 42.75 \nQ 14.984375 37.109375 14.984375 27.296875 \nQ 14.984375 17.484375 19.15625 11.84375 \nQ 23.34375 6.203125 30.609375 6.203125 \nQ 37.796875 6.203125 41.984375 11.859375 \nQ 46.1875 17.53125 46.1875 27.296875 \nQ 46.1875 37.015625 41.984375 42.703125 \nQ 37.796875 48.390625 30.609375 48.390625 \nz\nM 30.609375 56 \nQ 42.328125 56 49.015625 48.375 \nQ 55.71875 40.765625 55.71875 27.296875 \nQ 55.71875 13.875 49.015625 6.21875 \nQ 42.328125 -1.421875 30.609375 -1.421875 \nQ 18.84375 -1.421875 
12.171875 6.21875 \nQ 5.515625 13.875 5.515625 27.296875 \nQ 5.515625 40.765625 12.171875 48.375 \nQ 18.84375 56 30.609375 56 \nz\n\" id=\"DejaVuSans-111\"/>\n <path d=\"M 18.109375 8.203125 \nL 18.109375 -20.796875 \nL 9.078125 -20.796875 \nL 9.078125 54.6875 \nL 18.109375 54.6875 \nL 18.109375 46.390625 \nQ 20.953125 51.265625 25.265625 53.625 \nQ 29.59375 56 35.59375 56 \nQ 45.5625 56 51.78125 48.09375 \nQ 58.015625 40.1875 58.015625 27.296875 \nQ 58.015625 14.40625 51.78125 6.484375 \nQ 45.5625 -1.421875 35.59375 -1.421875 \nQ 29.59375 -1.421875 25.265625 0.953125 \nQ 20.953125 3.328125 18.109375 8.203125 \nz\nM 48.6875 27.296875 \nQ 48.6875 37.203125 44.609375 42.84375 \nQ 40.53125 48.484375 33.40625 48.484375 \nQ 26.265625 48.484375 22.1875 42.84375 \nQ 18.109375 37.203125 18.109375 27.296875 \nQ 18.109375 17.390625 22.1875 11.75 \nQ 26.265625 6.109375 33.40625 6.109375 \nQ 40.53125 6.109375 44.609375 11.75 \nQ 48.6875 17.390625 48.6875 27.296875 \nz\n\" id=\"DejaVuSans-112\"/>\n <path d=\"M 32.171875 -5.078125 \nQ 28.375 -14.84375 24.75 -17.8125 \nQ 21.140625 -20.796875 15.09375 -20.796875 \nL 7.90625 -20.796875 \nL 7.90625 -13.28125 \nL 13.1875 -13.28125 \nQ 16.890625 -13.28125 18.9375 -11.515625 \nQ 21 -9.765625 23.484375 -3.21875 \nL 25.09375 0.875 \nL 2.984375 54.6875 \nL 12.5 54.6875 \nL 29.59375 11.921875 \nL 46.6875 54.6875 \nL 56.203125 54.6875 \nz\n\" id=\"DejaVuSans-121\"/>\n </defs>\n <g transform=\"translate(166.32875 16.318125)scale(0.12 -0.12)\">\n <use xlink:href=\"#DejaVuSans-108\"/>\n <use x=\"27.783203\" xlink:href=\"#DejaVuSans-97\"/>\n <use x=\"89.0625\" xlink:href=\"#DejaVuSans-98\"/>\n <use x=\"152.539062\" xlink:href=\"#DejaVuSans-101\"/>\n <use x=\"214.0625\" xlink:href=\"#DejaVuSans-108\"/>\n <use x=\"241.845703\" xlink:href=\"#DejaVuSans-95\"/>\n <use x=\"291.845703\" xlink:href=\"#DejaVuSans-101\"/>\n <use x=\"353.369141\" xlink:href=\"#DejaVuSans-110\"/>\n <use x=\"416.748047\" xlink:href=\"#DejaVuSans-116\"/>\n <use x=\"455.957031\" xlink:href=\"#DejaVuSans-114\"/>\n <use x=\"494.820312\" xlink:href=\"#DejaVuSans-111\"/>\n <use x=\"556.001953\" xlink:href=\"#DejaVuSans-112\"/>\n <use x=\"619.478516\" xlink:href=\"#DejaVuSans-121\"/>\n </g>\n </g>\n </g>\n </g>\n <defs>\n <clipPath id=\"p8fb245b333\">\n <rect height=\"217.44\" width=\"334.8\" x=\"39.65\" y=\"22.318125\"/>\n </clipPath>\n </defs>\n</svg>\n",
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAEICAYAAACzliQjAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAWGUlEQVR4nO3df7RlZX3f8fdHUDGgDIpOzTBxSB1TUVYIuQtJXau5iIURq9BWWizqYCZOliFdxtJETJvQKqaY2JJYjToJlMFGB2qiTJCETNC7TFJRmGqQH3ExQQIjVKoD6Igax3z7x3nGHCb3x7l37j0zd573a6277t7PfvY+z3fOnc/Z5zn7nJOqQpLUhycc6AFIksbH0Jekjhj6ktQRQ1+SOmLoS1JHDH1J6oihr4NWknuTvHSEfpXkuQu8jQXvKy1Hhr60hJJcleTSAz0OaS9DXzqAkhx+oMegvhj6OuglOSXJp5M8kuTBJO9J8qR9up2V5J4kX03y60meMLT/TyW5K8nDSW5M8px53v6Tk7wryX1JvpLk/Ume0rZNJtmZ5KIkD7Xxvb5t2wicD/xikt1J/qC135vkLUluA76Z5PAkr0xyR6txKsnzh27/3iRvTXJnq+F/JDmibbs9ySuG+j6x/RucNL9/ZfXC0Ndy8D3gzcCxwE8ApwM/u0+ffw5MACcDZwM/BZDkHOCXgH8BPBP4U+DD87z9dwLPA04CngusAn5laPs/AI5u7RuA9yY5pqo2Ab8L/FpVHVVVrxja59XAy4EVwA+3Mf18G+MNwB/s88B2PnAm8A/bWP5ja78aeM1Qv7OAB6vq8/OsUZ0w9HXQq6rtVXVzVe2pqnuBDwA/uU+3d1bVrqq6D/gNBqEK8DPAf6mqu6pqD/CrwEmjnu0nCfAG4M3t+N9oxzhvqNt3gbdV1Xer6gZgN/Ajcxz63VV1f1V9C/jXwMeraltVfRd4F/AU4B8P9X9P678LeMdQff+TwbOcp7X11wIfHKU29cnQ10EvyfOSXJ/k/yb5OoPQPXafbvcPLf818INt+TnAb7Zpk0eAXUAYnJWP4pnADwDbh47xR619r6+1B5S9HgOOmuO4w+P9wTZmAKrqb9v2VTP0/359VfUA8OfAv0yyAngZg2cX0rQMfS0H7wP+ElhbVU9jMF2TffqsHlr+IeCBtnw/8DNVtWLo5ylV9b9HvO2vAt8CXjC0/9FVNVeo7zXTx9gOtz/A4MEJ+P6zi9XAl4f6zFQfwGYGUzznAp+uquH9pMcx9LUcPBX4OrA7yT8C3jhNn19IckyS1cCbgGta+/uBtyZ5AUCSo5OcO+oNt7Pu3wYuT/KsdoxVSc4c8RBfYTBnP5trgZcnOT3JE4GLgO8Aww9MFyY5LsnTGTzoXTO07WMMXst4E4M5fmlGhr6Wg38P/BvgGwwC+Jpp+lwHbAc+D3wcuAKgqj7K4IXYLW1q6HYGUyDz8RZgB3BzO8afMPec/V5XACe0qaGPTdehqr7I4Ez9vzN4ZvEK4BVV9TdD3T4E/DFwT/u5dGj/bwG/BxwP/P486lKH4peoSAe3JPcCP11VfzJLn18BnldVr5mpjwTgG0OkZa5N+WxgcOWONCundySgvTFq9zQ/5x/osc0myRsYvFj9h1X1qQM9Hh38nN6RpI54pi9JHTmo5/SPPfbYWrNmzYL2/eY3v8mRRx65uAM6yFlzH6y5D/tT8/bt279aVc+cbttBHfpr1qzh1ltvXdC+U1NTTE5OLu6ADnLW3Adr7sP+1Jzkr2fa5vSOJHXE0Jekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR15KB+R64kHerWXPzxaduvWrc0Hzvhmb4kdcTQl6SOGPqS1BFDX5I6YuhLUkcMfUnqiKEvSR0x9CWpI4a+JHXE0Jekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdGSn0k9yb5AtJPp/k1tb29CTbktzdfh/T2pPk3Ul2JLktyclDx1nf+t+dZP3SlCRJmsl8zvRPq6qTqmqirV8M3FRVa4Gb2jrAy4C17Wcj8D4YPEgAlwAvAk4BLtn7QCFJGo/9md45G9jcljcD5wy1X10DNwMrkjwbOBPYVlW7quphYBuwbj9uX5I0T6N+XWIBf5ykgA9U1SZgZVU9CFBVDyZ5Vuu7Crh/aN+drW2m9sdJspHBMwRWrlzJ1NTU6NUM2b1794L3Xa6suQ/WfGi56MQ907YvVc2jhv6Lq+qBFuzbkvzlLH0zTVvN0v74hsEDyiaAiYmJmpycHHGIjzc1NcVC912urLkP1nxouWCW78hdippHmt6pqgfa74eAjzKYk/9Km7ah/X6odd8JrB7a/TjggVnaJUljMmfoJzkyyVP3LgNnALcDW4G9V+CsB65ry1uB17WreE4FHm3TQDcCZyQ5pr2Ae0ZrkySNySjTOyuBjybZ2/9DVfVHSW4Brk2yAbgPOLf1vwE4C9gBPAa8HqCqdiV5O3BL6/e2qtq1aJVIkuY0Z+hX1T3Aj07T/jXg9GnaC7hwhmNdCVw5/2FKkhaD78iVpI4Y+pLUEUNfkjpi6EtSRwx9SeqIoS9JHTH0Jakjhr4kdcTQl6SOGPqS1BFDX5I6YuhLUkcMfUnqiKEvSR0x9CWpI4a+JHXE0Jekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR1xNCXpI4Y+pLUEUNfkjpi6EtSRwx9SerIyKGf5LAkn0tyfVs/Pslnktyd5JokT2rtT27rO9r2NUPHeGtr/2KSMxe7GEnS7OZzpv8m4K6h9XcCl1fVWuBhYENr3wA8XFXPBS5v/UhyAnAe8AJgHfBbSQ7bv+FLkuZjpNBPchzwcuB32nqAlwAfaV02A+e05bPbOm376a3/2cCWqvpOVX0J2AGcshhFSJJGc/iI/X4D+EXgqW39GcAjVbWnre8EVrXlVcD9AFW1J8mjrf8q4OahYw7v831JNgIbAVauXMnU1NSotTzO7t27F7zvcmXNfbDmQ8tFJ+6Ztn2pap4z9JP8M+ChqtqeZHJv8zRda45ts+3zdw1Vm4BNABMTEzU5Oblvl5FMTU2x0H2XK2vugzUfWi64+OPTtl+17sglqXmUM/0XA69MchZwBPA0Bmf+K5Ic3s72jwMeaP13AquBnUkOB44Gdg217zW8jyRpDOac06+qt1bVcVW1hsELsZ+oqvOBTwKvat3WA9e15a1tnbb9E1VVrf28dnXP8cBa4LOLVokkaU6jzulP5y3AliSXAp8DrmjtVwAfTLKDwRn+eQBVdUeSa4E7gT3AhVX1vf24fUnSPM0r9KtqCphqy/cwzdU3VfVt4NwZ9n8H8I75DlKStDh8R64kdcTQl6SOGPqS1BFDX5I6YuhLUkcMfUnqiKEvSR0x9CWpI4a+JHXE0Jekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR1xNCXpI4Y+pLUEUNfkjpi6EtSRwx9SeqIoS9JHTH0Jakjhr4kdcTQl6SOGPqS1
BFDX5I6MmfoJzkiyWeT/EWSO5L859Z+fJLPJLk7yTVJntTan9zWd7Tta4aO9dbW/sUkZy5VUZKk6Y1ypv8d4CVV9aPAScC6JKcC7wQur6q1wMPAhtZ/A/BwVT0XuLz1I8kJwHnAC4B1wG8lOWwxi5EkzW7O0K+B3W31ie2ngJcAH2ntm4Fz2vLZbZ22/fQkae1bquo7VfUlYAdwyqJUIUkayeGjdGpn5NuB5wLvBf4KeKSq9rQuO4FVbXkVcD9AVe1J8ijwjNZ+89Bhh/cZvq2NwEaAlStXMjU1Nb+Kmt27dy943+XKmvtgzYeWi07cM237UtU8UuhX1feAk5KsAD4KPH+6bu13Ztg2U/u+t7UJ2AQwMTFRk5OTowzx75mammKh+y5X1twHaz60XHDxx6dtv2rdkUtS87yu3qmqR4Ap4FRgRZK9DxrHAQ+05Z3AaoC2/Whg13D7NPtIksZglKt3ntnO8EnyFOClwF3AJ4FXtW7rgeva8ta2Ttv+iaqq1n5eu7rneGAt8NnFKkSSNLdRpneeDWxu8/pPAK6tquuT3AlsSXIp8Dngitb/CuCDSXYwOMM/D6Cq7khyLXAnsAe4sE0bSZLGZM7Qr6rbgB+bpv0eprn6pqq+DZw7w7HeAbxj/sOUJC0G35ErSR0x9CWpI4a+JHXE0Jekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR1xNCXpI4Y+pLUEUNfkjpi6EtSRwx9SerISF+MLknLyZoZvmz83stePuaRHHw805ekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR1xNCXpI4Y+pLUEUNfkjpi6EtSR+YM/SSrk3wyyV1J7kjyptb+9CTbktzdfh/T2pPk3Ul2JLktyclDx1rf+t+dZP3SlSVJms4oZ/p7gIuq6vnAqcCFSU4ALgZuqqq1wE1tHeBlwNr2sxF4HwweJIBLgBcBpwCX7H2gkCSNx5yhX1UPVtX/acvfAO4CVgFnA5tbt83AOW35bODqGrgZWJHk2cCZwLaq2lVVDwPbgHWLWo0kaVapqtE7J2uATwEvBO6rqhVD2x6uqmOSXA9cVlV/1tpvAt4CTAJHVNWlrf2XgW9V1bv2uY2NDJ4hsHLlyh/fsmXLggrbvXs3Rx111IL2Xa6suQ/WPLcvfPnRadtPXHX0Yg1p0cw01uOPPmzB9/Npp522vaompts28peoJDkK+D3g56vq60lm7DpNW83S/viGqk3AJoCJiYmanJwcdYiPMzU1xUL3Xa6suQ/WPLcLZvoSlfNHP8a4zDTWq9YduST380hX7yR5IoPA/92q+v3W/JU2bUP7/VBr3wmsHtr9OOCBWdolSWMyytU7Aa4A7qqq/za0aSuw9wqc9cB1Q+2va1fxnAo8WlUPAjcCZyQ5pr2Ae0ZrkySNySjTOy8GXgt8IcnnW9svAZcB1ybZANwHnNu23QCcBewAHgNeD1BVu5K8Hbil9XtbVe1alCokSSOZM/TbC7IzTeCfPk3/Ai6c4VhXAlfOZ4CSpMXjO3IlqSOGviR1xNCXpI4Y+pLUEUNfkjpi6EtSRwx9SeqIoS9JHTH0Jakjhr4kdcTQl6SOjPx5+hq/NTN9JvhlLx/zSCQdKjzTl6SOGPqS1BFDX5I6YuhLUkcMfUnqiKEvSR0x9CWpI4a+JHXE0Jekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR1ZM7QT3JlkoeS3D7U9vQk25Lc3X4f09qT5N1JdiS5LcnJQ/usb/3vTrJ+acqRJM1mlG/Ougp4D3D1UNvFwE1VdVmSi9v6W4CXAWvbz4uA9wEvSvJ04BJgAihge5KtVfXwYhWi8fDbvKTlbc4z/ar6FLBrn+azgc1teTNwzlD71TVwM7AiybOBM4FtVbWrBf02YN1iFCBJGl2qau5OyRrg+qp6YVt/pKpWDG1/uKqOSXI9cFlV/Vlrv4nBM4BJ4IiqurS1/zLwrap61zS3tRHYCLBy5cof37Jly4IK2717N0cdddSC9j1YfOHLj07bfuKqo6dtH0fN8x3TUjsU7uf5sua5HWx/p7OZaazHH33Ygu/n0047bXtVTUy3bbG/GD3TtNUs7X+/sWoTsAlgYmKiJicnFzSQqakpFrrvweKCmaZSzp+ctn0cNc93TEvtULif58ua53aw/Z3OZqaxXrXuyCW5nxd69c5X2rQN7fdDrX0nsHqo33HAA7O0S5LGaKGhvxXYewXOeuC6ofbXtat4TgUeraoHgRuBM5Ic0670OaO1SZLGaM7pnSQfZjAnf2ySnQyuwrkMuDbJBuA+4NzW/QbgLGAH8BjweoCq2pXk7cAtrd/bqmrfF4clSUtsztCvqlfPsOn0afoWcOEMx7kSuHJeo5MkLSrfkStJHTH0Jakji33J5rLmu00lHeo805ekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR1xNCXpI4Y+pLUEUNfkjpi6EtSR/zANS0KP6xOWh4805ekjhj6ktQRQ1+SOmLoS1JHDH1J6oihL0kdMfQlqSOGviR1xDdnHUK+8OVHucA3SUmahWf6ktQRQ1+SOmLoS1JHDH1J6sjYX8hNsg74TeAw4Heq6rJxj0EHnp/KuXAzvWDvv51GMdbQT3IY8F7gnwI7gVuSbK2qO8c5Dmk6PhAdeDPdB1etO3LMIzl0jftM/xRgR1XdA5BkC3A2sCSh739iLWcz/f1edOL8+i/k732mY83E/1N/52DPnVTV+G4seRWwrqp+uq2/FnhRVf3cUJ+NwMa2+iPAFxd4c8cCX92P4S5H1twHa+7D/tT8nKp65nQbxn2mn2naHveoU1WbgE37fUPJrVU1sb/HWU6suQ/W3IelqnncV+/sBFYPrR8HPDDmMUhSt8Yd+rcAa5Mcn+RJwHnA1jGPQZK6Ndbpnarak+TngBsZXLJ5ZVXdsUQ3t99TRMuQNffBmvuwJDWP9YVcSdKB5TtyJakjhr4kdWTZh36SdUm+mGRHkoun2f7kJNe07Z9Jsmb8o1xcI9T875LcmeS2JDclec6BGOdimqvmoX6vSlJJlv3lfaPUnORftfv6jiQfGvcYF9sIf9s/lOSTST7X/r7POhDjXCxJrkzyUJLbZ9ieJO9u/x63JTl5v2+0qpbtD4MXg/8K+GHgScBfACfs0+dngfe35fOAaw70uMdQ82nAD7TlN/ZQc+v3VOBTwM3AxIEe9xju57XA54Bj2vqzDvS4x1DzJuCNbfkE4N4DPe79rPmfACcDt8+w/SzgDxm8x+lU4DP7e5vL/Uz/+x/rUFV/A+z9WIdhZwOb2/JHgNOTTPcmseVizpqr6pNV9VhbvZnB+yGWs1HuZ4C3A78GfHucg1sio9T8BuC9VfUwQFU9NOYxLrZRai7gaW35aJb5+3yq6lPArlm6nA1cXQM3AyuSPHt/bnO5h/4q4P6h9Z2tbdo+VbUHeBR4xlhGtzRGqXnYBgZnCsvZnDUn+TFgdVVdP86BLaFR7ufnAc9L8udJbm6fYLucjVLzfwJe
k2QncAPwb8cztANmvv/f57TcvyN3zo91GLHPcjJyPUleA0wAP7mkI1p6s9ac5AnA5cAF4xrQGIxyPx/OYIpnksGzuT9N8sKqemSJx7ZURqn51cBVVfVfk/wE8MFW898u/fAOiEXPr+V+pj/Kxzp8v0+Swxk8JZzt6dTBbqSPskjyUuA/AK+squ+MaWxLZa6anwq8EJhKci+Duc+ty/zF3FH/tq+rqu9W1ZcYfDjh2jGNbymMUvMG4FqAqvo0cASDDyY7VC36R9cs99Af5WMdtgLr2/KrgE9Ue4VkmZqz5jbV8QEGgb/c53lhjpqr6tGqOraq1lTVGgavY7yyqm49MMNdFKP8bX+MwYv2JDmWwXTPPWMd5eIapeb7gNMBkjyfQej/v7GOcry2Aq9rV/GcCjxaVQ/uzwGX9fROzfCxDkneBtxaVVuBKxg8BdzB4Az/vAM34v03Ys2/DhwF/K/2mvV9VfXKAzbo/TRizYeUEWu+ETgjyZ3A94BfqKqvHbhR758Ra74I+O0kb2YwzXHBcj6JS/JhBtNzx7bXKS4BnghQVe9n8LrFWcAO4DHg9ft9m8v430uSNE/LfXpHkjQPhr4kdcTQl6SOGPqS1BFDX5I6YuhLUkcMfUnqyP8HtGYps5172lYAAAAASUVORK5CYII=\n"
},
"metadata": {
"needs_background": "light"
}
}
],
"source": [
"entropy_weighted_labels = (1 - entropy(probs_train_filtered, base=2, axis=1))\n",
"pd.DataFrame(entropy_weighted_labels, columns=['label_entropy']).hist(bins=50)"
]
},
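{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Illustrative aside (not part of the original gist):* the cell above maps the label model's class probabilities to sample weights via `1 - entropy(p, base=2)`, so confident examples count more during training and near-uniform ones count for almost nothing. The toy probability vectors below are made up purely to show how that mapping behaves."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Hypothetical probabilities, for illustration only.\n",
"import numpy as np\n",
"from scipy.stats import entropy\n",
"\n",
"toy_probs = np.array([[0.99, 0.01],   # very confident -> weight near 1\n",
"                      [0.75, 0.25],   # somewhat confident\n",
"                      [0.50, 0.50]])  # uninformative -> weight 0\n",
"toy_weights = 1 - entropy(toy_probs, base=2, axis=1)\n",
"print(toy_weights)  # roughly [0.92, 0.19, 0.0]"
]
},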
{
"cell_type": "code",
"execution_count": 27,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"def get_model():\n",
" input_layer = Input(shape=(MAX_FEATURES,))\n",
" x = Dense(256, activation='relu')(input_layer)\n",
" x = Dropout(0.1)(x)\n",
" x = Dense(16, activation='relu')(x)\n",
" out_layer = Dense(1, activation='sigmoid')(x)\n",
"\n",
" opt = Adam(lr=0.0005, beta_1=0.9, beta_2=0.999, amsgrad=False)\n",
"\n",
" model = Model(inputs=[input_layer], outputs=[out_layer])\n",
" model.compile(loss='binary_crossentropy', optimizer=opt,\n",
" metrics=[binary_accuracy, binary_crossentropy])\n",
" return model"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "Train on 11754 samples, validate on 619 samples\nEpoch 1/25\n11754/11754 [==============================] - 1s 44us/step - loss: 0.2266 - binary_accuracy: 0.6656 - binary_crossentropy: 0.7487 - val_loss: 0.1720 - val_binary_accuracy: 0.6688 - val_binary_crossentropy: 0.5961\nEpoch 2/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.1410 - binary_accuracy: 0.6890 - binary_crossentropy: 0.5189 - val_loss: 0.1365 - val_binary_accuracy: 0.7415 - val_binary_crossentropy: 0.5093\nEpoch 3/25\n11754/11754 [==============================] - 0s 18us/step - loss: 0.1137 - binary_accuracy: 0.7678 - binary_crossentropy: 0.4422 - val_loss: 0.1218 - val_binary_accuracy: 0.8045 - val_binary_crossentropy: 0.4652\nEpoch 4/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0954 - binary_accuracy: 0.8434 - binary_crossentropy: 0.3830 - val_loss: 0.1091 - val_binary_accuracy: 0.8546 - val_binary_crossentropy: 0.3897\nEpoch 5/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0784 - binary_accuracy: 0.8705 - binary_crossentropy: 0.3315 - val_loss: 0.1113 - val_binary_accuracy: 0.8352 - val_binary_crossentropy: 0.5109\nEpoch 6/25\n11754/11754 [==============================] - 0s 20us/step - loss: 0.0654 - binary_accuracy: 0.8918 - binary_crossentropy: 0.2885 - val_loss: 0.1043 - val_binary_accuracy: 0.8805 - val_binary_crossentropy: 0.3431\nEpoch 7/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0552 - binary_accuracy: 0.9091 - binary_crossentropy: 0.2486 - val_loss: 0.0929 - val_binary_accuracy: 0.8724 - val_binary_crossentropy: 0.3730\nEpoch 8/25\n11754/11754 [==============================] - 0s 18us/step - loss: 0.0451 - binary_accuracy: 0.9244 - binary_crossentropy: 0.2147 - val_loss: 0.0972 - val_binary_accuracy: 0.8901 - val_binary_crossentropy: 0.3023\nEpoch 9/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0391 - binary_accuracy: 0.9314 - binary_crossentropy: 0.1927 - val_loss: 0.0849 - val_binary_accuracy: 0.8950 - val_binary_crossentropy: 0.3215\nEpoch 10/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0319 - binary_accuracy: 0.9448 - binary_crossentropy: 0.1662 - val_loss: 0.0965 - val_binary_accuracy: 0.8982 - val_binary_crossentropy: 0.3181\nEpoch 11/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0256 - binary_accuracy: 0.9539 - binary_crossentropy: 0.1411 - val_loss: 0.0883 - val_binary_accuracy: 0.9015 - val_binary_crossentropy: 0.3370\nEpoch 12/25\n11754/11754 [==============================] - 0s 18us/step - loss: 0.0214 - binary_accuracy: 0.9614 - binary_crossentropy: 0.1240 - val_loss: 0.0989 - val_binary_accuracy: 0.8998 - val_binary_crossentropy: 0.3348\nEpoch 13/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0172 - binary_accuracy: 0.9686 - binary_crossentropy: 0.1025 - val_loss: 0.0940 - val_binary_accuracy: 0.9031 - val_binary_crossentropy: 0.3454\nEpoch 14/25\n11754/11754 [==============================] - 0s 16us/step - loss: 0.0143 - binary_accuracy: 0.9735 - binary_crossentropy: 0.0902 - val_loss: 0.1038 - val_binary_accuracy: 0.9079 - val_binary_crossentropy: 0.3537\nEpoch 15/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0108 - binary_accuracy: 0.9786 - binary_crossentropy: 0.0711 - val_loss: 0.0989 - val_binary_accuracy: 0.9015 - val_binary_crossentropy: 0.3655\nEpoch 16/25\n11754/11754 [==============================] - 0s 18us/step - loss: 
0.0088 - binary_accuracy: 0.9832 - binary_crossentropy: 0.0601 - val_loss: 0.1094 - val_binary_accuracy: 0.9031 - val_binary_crossentropy: 0.3574\nEpoch 17/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0073 - binary_accuracy: 0.9857 - binary_crossentropy: 0.0502 - val_loss: 0.1116 - val_binary_accuracy: 0.9015 - val_binary_crossentropy: 0.3591\nEpoch 18/25\n11754/11754 [==============================] - 0s 17us/step - loss: 0.0054 - binary_accuracy: 0.9889 - binary_crossentropy: 0.0394 - val_loss: 0.1117 - val_binary_accuracy: 0.9079 - val_binary_crossentropy: 0.3912\nEpoch 19/25\n11754/11754 [==============================] - 0s 16us/step - loss: 0.0063 - binary_accuracy: 0.9888 - binary_crossentropy: 0.0412 - val_loss: 0.1108 - val_binary_accuracy: 0.9111 - val_binary_crossentropy: 0.3620\nEpoch 20/25\n11754/11754 [==============================] - 0s 16us/step - loss: 0.0045 - binary_accuracy: 0.9914 - binary_crossentropy: 0.0302 - val_loss: 0.1185 - val_binary_accuracy: 0.9079 - val_binary_crossentropy: 0.3944\nEpoch 21/25\n11754/11754 [==============================] - 0s 16us/step - loss: 0.0036 - binary_accuracy: 0.9928 - binary_crossentropy: 0.0259 - val_loss: 0.1286 - val_binary_accuracy: 0.9031 - val_binary_crossentropy: 0.3607\nEpoch 22/25\n11754/11754 [==============================] - 0s 16us/step - loss: 0.0027 - binary_accuracy: 0.9951 - binary_crossentropy: 0.0186 - val_loss: 0.1234 - val_binary_accuracy: 0.9111 - val_binary_crossentropy: 0.4055\nEpoch 23/25\n11754/11754 [==============================] - 0s 16us/step - loss: 0.0032 - binary_accuracy: 0.9949 - binary_crossentropy: 0.0215 - val_loss: 0.1297 - val_binary_accuracy: 0.9015 - val_binary_crossentropy: 0.3772\nEpoch 24/25\n11754/11754 [==============================] - 0s 15us/step - loss: 0.0027 - binary_accuracy: 0.9956 - binary_crossentropy: 0.0168 - val_loss: 0.1308 - val_binary_accuracy: 0.8998 - val_binary_crossentropy: 0.4213\nEpoch 25/25\n11754/11754 [==============================] - 0s 16us/step - loss: 0.0017 - binary_accuracy: 0.9964 - binary_crossentropy: 0.0120 - val_loss: 0.1388 - val_binary_accuracy: 0.9015 - val_binary_crossentropy: 0.4272\n"
},
{
"output_type": "execute_result",
"data": {
"text/plain": "<keras.callbacks.callbacks.History at 0x1a385e3390>"
},
"metadata": {},
"execution_count": 28
}
],
"source": [
"model = get_model()\n",
"model.fit(x=X_train, y=preds_train_filtered,\n",
" sample_weight=entropy_weighted_labels,\n",
" batch_size=256,\n",
" epochs=25,\n",
" validation_split=0.05)"
]
},
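{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Optional aside, not part of the original gist:* the log above shows `val_loss` climbing after roughly epoch 9 while the training loss keeps falling, i.e. the end model starts to overfit the weak labels. A minimal way to curb that, assuming the same standalone `keras` API used in `get_model()`, is an `EarlyStopping` callback; the `patience` value below is an arbitrary choice, and the result is stored under a separate name so it does not overwrite `model`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch only: same training setup, but stop when validation loss stops improving.\n",
"from keras.callbacks import EarlyStopping\n",
"\n",
"early_stop = EarlyStopping(monitor='val_loss', patience=3,\n",
"                           restore_best_weights=True)\n",
"\n",
"model_es = get_model()\n",
"model_es.fit(x=X_train, y=preds_train_filtered,\n",
"             sample_weight=entropy_weighted_labels,\n",
"             batch_size=256,\n",
"             epochs=25,\n",
"             validation_split=0.05,\n",
"             callbacks=[early_stop])"
]
},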
{
"cell_type": "code",
"execution_count": 29,
"metadata": {
"tags": []
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "precision recall f1-score support\n\n 0 0.70 0.80 0.75 498\n 1 0.77 0.66 0.71 502\n\n accuracy 0.73 1000\n macro avg 0.73 0.73 0.73 1000\nweighted avg 0.73 0.73 0.73 1000\n\n"
}
],
"source": [
"y_hat_test = model.predict(X_test) > 0.5\n",
"print(classification_report(df_labeled.target.values, y_hat_test))"
]
},
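{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Illustrative extra, not part of the original gist:* if a finer-grained view of the errors above is useful, a confusion matrix over the same hand-labeled examples is a one-line addition (scikit-learn is already a dependency)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.metrics import confusion_matrix\n",
"\n",
"# Rows are true classes, columns are predicted classes.\n",
"print(confusion_matrix(df_labeled.target.values, y_hat_test))"
]
},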
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": 3
},
"orig_nbformat": 2,
"kernelspec": {
"name": "python_defaultSpec_1594782180757",
"display_name": "Python 3.7.7 64-bit ('anaconda3': virtualenv)"
}
},
"nbformat": 4,
"nbformat_minor": 2
}