@ecjang
Created February 18, 2018 11:13
006. Logistic_Regression
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### `11. ML lec 5-1 Defining the Hypothesis Function for Logistic Classification`"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Regression (HCG)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- H : Hypothesis / C : Cost Function / G : Gradient Descent\n",
"- Build a hypothesis from the given data, then gradually reduce the cost.\n",
"- The cost is the difference between the hypothesis and the actual values.\n",
"- Gradient Descent is the method for finding the minimum of the cost.\n",
"- In Gradient Descent, the alpha value is the step size for each move."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"1. Hypothesis : $ H(x) = WX $\n",
"2. Cost : $ cost(W) = \\frac{1}{m}\\sum (H(x) - y)^2 $\n",
"3. Gradient Descent : $ W := W - \\alpha \\frac{\\partial}{\\partial W} cost(W) $"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Classification : 0 or 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- Spam Detection : Spam or Ham\n",
"- Facebook feed : Show or Hide\n",
"- Credit Card Fraudulent Transaction detection : Legitimate or Fraud"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- Trying to express many kinds of data as just 0 or 1 causes problems.\n",
"- Very large inputs and ambiguous mid-range values are hard to handle.\n",
"- So we look for a function that maps any input precisely into the range between 0 and 1."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Sigmoid : Curved in two directions"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$ g(z) = \\frac{1}{1+e^{-z}}, \\qquad z = W^{T}X $$ "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- This draws a gentle S-shaped curve: as the value grows it approaches 1, and as it shrinks it approaches 0.\n",
"- When $e^{-W^{T}X}$ goes to 0 -> the fraction becomes 1/1, the maximum value 1.\n",
"- When $e^{-W^{T}X}$ is very large -> the fraction takes a $1/e$ form and approaches the minimum value 0.\n",
"- When $W^{T}X$ is 0 -> the exponent is 0, so the fraction is 1/2, the middle value 0.5."
]
},
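{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch of these three cases in plain Python (the input values below are made-up examples, not from the lecture):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import math\n",
"\n",
"def sigmoid(z):\n",
"    # g(z) = 1 / (1 + e^(-z))\n",
"    return 1.0 / (1.0 + math.exp(-z))\n",
"\n",
"print(sigmoid(0.0))    # exponent 0 -> 1/2 = 0.5, the middle value\n",
"print(sigmoid(10.0))   # large input -> close to the maximum 1\n",
"print(sigmoid(-10.0))  # very negative input -> close to the minimum 0"
]
},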
{
"cell_type": "markdown",
"metadata": {},
"source": [
"----"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### `12. ML lec 5-2 Explaining the Cost Function of Logistic Regression`"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Original function : $ H(x) = Wx + b $ \n",
"<BR>\n",
"With sigmoid applied : $ H(X) = \\frac{1}{1 + e^{-W^{T}X}} $"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img width=\"70%\" align=\"left\" src=\"img/cost_function.png\">"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- Plotting the cost of the sigmoid-applied hypothesis gives a bumpy, non-convex curve.\n",
"- Running the optimizer on it as-is can mistake a bump for the lowest point.\n",
"- We need the global minimum, not just a local minimum."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### New Cost Function for Logistic"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$ cost(W) = \\frac{1}{m} \\sum c(H(x), y) $$"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\\begin{equation}\n",
"c(H(x), y)=\\left \\{\\begin{array}{ll}\n",
"-log(H(x)) & : y = 1 \\\\\n",
"-log(1- H(x)) & : y = 0 \\\\\n",
"\\end{array}\n",
"\\right.\n",
"\\end{equation}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"$$ c(H(x), y) = -ylog(H(x)) - (1-y)log(1 - H(x)) $$"
]
},
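{
"cell_type": "markdown",
"metadata": {},
"source": [
"This per-sample cost can be checked numerically; a small plain-Python sketch (the probabilities below are made-up examples):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import math\n",
"\n",
"def logistic_cost(h, y):\n",
"    # c(H(x), y) = -y*log(H(x)) - (1-y)*log(1-H(x))\n",
"    return -y * math.log(h) - (1 - y) * math.log(1 - h)\n",
"\n",
"print(logistic_cost(0.99, 1))  # confident and correct -> cost near 0\n",
"print(logistic_cost(0.01, 1))  # confident but wrong -> cost is large\n",
"print(logistic_cost(0.01, 0))  # confident and correct -> cost near 0"
]
},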
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Minimize Cost : Gradient Descent Algorithm"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import tensorflow as tf\n",
"\n",
"# cost function : cross-entropy averaged over the batch\n",
"cost = -tf.reduce_mean(Y*tf.log(hypothesis) + (1-Y)*tf.log(1-hypothesis))\n",
"\n",
"# Minimize\n",
"a = tf.Variable(0.1) # Learning rate, alpha\n",
"optimizer = tf.train.GradientDescentOptimizer(a)\n",
"train = optimizer.minimize(cost)"
]
},
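{
"cell_type": "markdown",
"metadata": {},
"source": [
"The same update rule $W := W - \\alpha \\frac{\\partial}{\\partial W} cost(W)$ can be sketched without TensorFlow; a NumPy version on a tiny made-up dataset (data, step count, and learning rate are assumptions for illustration):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"X = np.array([[1.0], [2.0], [3.0], [4.0]])  # made-up inputs\n",
"y = np.array([[0.0], [0.0], [1.0], [1.0]])  # made-up labels\n",
"W = np.zeros((1, 1))\n",
"b = 0.0\n",
"alpha = 0.1  # learning rate\n",
"\n",
"for _ in range(2000):\n",
"    h = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # sigmoid hypothesis\n",
"    # gradient of the cross-entropy cost w.r.t. W and b\n",
"    grad_W = X.T @ (h - y) / len(X)\n",
"    grad_b = float(np.mean(h - y))\n",
"    W -= alpha * grad_W  # W := W - alpha * dcost/dW\n",
"    b -= alpha * grad_b\n",
"\n",
"print(h.round(2))  # probabilities rise with x"
]
},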
{
"cell_type": "markdown",
"metadata": {},
"source": [
"-----"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### `13. ML lab 05 Implementing Logistic Classification with TensorFlow (new)`"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- Logistic Regression\n",
"\n",
"$$ sigmoid: H(X) = \\frac{1}{1 + e^{-W^{T}X}} $$ <BR>\n",
"$$ cost(W) = -\\frac{1}{m}\\sum \\left[ ylog(H(x)) + (1-y)log(1 - H(x)) \\right] $$ <BR>\n",
"$$ W := W - \\alpha\\frac{\\partial}{\\partial W}cost(W) $$"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import tensorflow as tf"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- Checking the data\n",
" - x_data comes as pairs x1, x2, meaning hours studied.\n",
" - y_data is 0 or 1: 1 means pass, 0 means fail.\n",
" - [None, 2] : None means the number of rows is not fixed in advance; 2 means each row has 2 values."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"x_data = [[1,2], [2,3], [3,1], [4,3], [5,3], [6,2]]\n",
"y_data = [[0], [0], [0], [1], [1], [1]]\n",
"\n",
"# placeholders for tensors that will always be fed.\n",
"X = tf.placeholder(tf.float32, shape=[None, 2])\n",
"Y = tf.placeholder(tf.float32, shape=[None, 1])\n",
"W = tf.Variable(tf.random_normal([2, 1]), name='weight')\n",
"b = tf.Variable(tf.random_normal([1]), name='bias')"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
"# Hypothesis using sigmoid : tf.div(1., 1. + tf.exp(-(tf.matmul(X, W) + b)))\n",
"hypothesis = tf.sigmoid(tf.matmul(X, W) + b)\n",
"\n",
"# cost/loss function\n",
"cost = -tf.reduce_mean(Y * tf.log(hypothesis) + (1 - Y) * tf.log(1 - hypothesis))\n",
"\n",
"train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cost)\n",
"\n",
"# Accuracy computation\n",
"# True if hypothesis > 0.5 else False\n",
"predicted = tf.cast(hypothesis > 0.5, dtype=tf.float32)\n",
"accuracy = tf.reduce_mean(tf.cast(tf.equal(predicted, Y), dtype=tf.float32))"
]
},
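{
"cell_type": "markdown",
"metadata": {},
"source": [
"The predicted/accuracy step is just thresholding at 0.5; a plain-Python sketch of what `tf.cast(hypothesis > 0.5, tf.float32)` and the mean of the equality check compute (the probabilities are made-up):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"probs  = [0.03, 0.16, 0.32, 0.77, 0.93, 0.97]  # example hypothesis outputs\n",
"labels = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]\n",
"\n",
"predicted = [1.0 if p > 0.5 else 0.0 for p in probs]  # cast(h > 0.5) to float\n",
"accuracy = sum(p == t for p, t in zip(predicted, labels)) / len(labels)\n",
"print(predicted, accuracy)  # all six match -> accuracy 1.0"
]
},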
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Train the Model"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0 2.6321828 / 200 0.672776 / 400 0.5846116 / 600 0.5382804 / 800 0.508029 / 1000 0.48489007 / 1200 0.46537724 / 1400 0.4479871 / 1600 0.43201455 / 1800 0.4171041 / 2000 0.40306374 / 2200 0.3897808 / 2400 0.377183 / 2600 0.36521864 / 2800 0.35384724 / 3000 0.34303418 / 3200 0.33274838 / 3400 0.32296112 / 3600 0.31364527 / 3800 0.304775 / 4000 0.2963256 / 4200 0.28827378 / 4400 0.2805973 / 4600 0.27327493 / 4800 0.26628685 / 5000 0.259614 / 5200 0.2532386 / 5400 0.24714382 / 5600 0.24131374 / 5800 0.23573364 / 6000 0.23038949 / 6200 0.22526823 / 6400 0.22035746 / 6600 0.21564586 / 6800 0.21112241 / 7000 0.20677702 / 7200 0.20260035 / 7400 0.19858326 / 7600 0.19471753 / 7800 0.19099534 / 8000 0.18740934 / 8200 0.18395253 / 8400 0.18061858 / 8600 0.17740141 / 8800 0.17429513 / 9000 0.17129444 / 9200 0.16839428 / 9400 0.16559003 / 9600 0.16287702 / 9800 0.16025119 / 10000 0.15770836 / \n",
"\n",
" - Hypothesis: \n",
" [[0.03426922]\n",
" [0.16337043]\n",
" [0.32133013]\n",
" [0.7739592 ]\n",
" [0.9348008 ]\n",
" [0.97858 ]] \n",
" - Correct(Y): \n",
" [[0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]] \n",
" - Accuracy: \n",
" 1.0\n"
]
}
],
"source": [
"# Launch Graph\n",
"with tf.Session() as sess:\n",
" # Initialize TensorFlow variables\n",
" sess.run(tf.global_variables_initializer())\n",
" \n",
" for step in range(10001):\n",
" cost_val, _ = sess.run([cost, train], feed_dict={X: x_data, Y:y_data})\n",
" if step % 200 == 0:\n",
" print(step, cost_val , end=\" / \")\n",
" \n",
" # Accuracy report\n",
" h, c, a = sess.run([hypothesis, predicted, accuracy], feed_dict={X: x_data, Y:y_data})\n",
" \n",
" print(\"\\n\\n\",\" - Hypothesis: \\n\", h, \"\\n - Correct(Y): \\n\", c, \"\\n - Accuracy: \\n\", a)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Classifying Diabetes\n",
"- Diabetes dataset\n",
"- The numbers are diabetes-related measurements\n",
"- In the last column, 1 means diabetic and 0 means non-diabetic"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style>\n",
" .dataframe thead tr:only-child th {\n",
" text-align: right;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: left;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>-0.294118</th>\n",
" <th>0.487437</th>\n",
" <th>0.180328</th>\n",
" <th>-0.292929</th>\n",
" <th>0</th>\n",
" <th>0.00149028</th>\n",
" <th>-0.53117</th>\n",
" <th>-0.0333333</th>\n",
" <th>0.1</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>-0.882353</td>\n",
" <td>-0.145729</td>\n",
" <td>0.081967</td>\n",
" <td>-0.414141</td>\n",
" <td>0.000000</td>\n",
" <td>-0.207153</td>\n",
" <td>-0.766866</td>\n",
" <td>-0.666667</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>-0.058824</td>\n",
" <td>0.839196</td>\n",
" <td>0.049180</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>-0.305514</td>\n",
" <td>-0.492741</td>\n",
" <td>-0.633333</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>-0.882353</td>\n",
" <td>-0.105528</td>\n",
" <td>0.081967</td>\n",
" <td>-0.535354</td>\n",
" <td>-0.777778</td>\n",
" <td>-0.162444</td>\n",
" <td>-0.923997</td>\n",
" <td>0.000000</td>\n",
" <td>1</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" -0.294118 0.487437 0.180328 -0.292929 0 0.00149028 -0.53117 \\\n",
"0 -0.882353 -0.145729 0.081967 -0.414141 0.000000 -0.207153 -0.766866 \n",
"1 -0.058824 0.839196 0.049180 0.000000 0.000000 -0.305514 -0.492741 \n",
"2 -0.882353 -0.105528 0.081967 -0.535354 -0.777778 -0.162444 -0.923997 \n",
"\n",
" -0.0333333 0.1 \n",
"0 -0.666667 1 \n",
"1 -0.633333 0 \n",
"2 0.000000 1 "
]
},
"execution_count": 28,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import pandas as pd\n",
"\n",
"# The CSV has no header row, so the first data row appears as column names here.\n",
"df = pd.read_csv('Data/data-03-diabetes.csv')\n",
"df.head(3)"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"xy = np.loadtxt('Data/data-03-diabetes.csv', delimiter=',', dtype=np.float32)\n",
"x_data = xy[:, 0:-1]\n",
"y_data = xy[:, [-1]]"
]
},
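{
"cell_type": "markdown",
"metadata": {},
"source": [
"The slicing above splits the features from the label column; a small sketch with made-up rows:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"# made-up rows: each row is features followed by a 0/1 label\n",
"xy = np.array([[0.1, 0.2, 1.0],\n",
"               [0.3, 0.4, 0.0]], dtype=np.float32)\n",
"x_data = xy[:, 0:-1]  # every column except the last\n",
"y_data = xy[:, [-1]]  # the last column, kept 2-D to match shape [None, 1]\n",
"print(x_data.shape, y_data.shape)  # (2, 2) (2, 1)"
]
},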
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [],
"source": [
"# placeholders for tensors that will always be fed.\n",
"X = tf.placeholder(tf.float32, shape=[None, 8])\n",
"Y = tf.placeholder(tf.float32, shape=[None, 1])\n",
"W = tf.Variable(tf.random_normal([8, 1]), name='weight')\n",
"b = tf.Variable(tf.random_normal([1]), name='bias')"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [],
"source": [
"# Hypothesis using sigmoid : tf.div(1., 1. + tf.exp(-(tf.matmul(X, W) + b)))\n",
"hypothesis = tf.sigmoid(tf.matmul(X, W) + b)\n",
"\n",
"# cost/loss function\n",
"cost = -tf.reduce_mean(Y * tf.log(hypothesis) + (1 - Y) * tf.log(1 - hypothesis))\n",
"train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cost)\n",
"\n",
"# Accuracy computation\n",
"# True if hypothesis > 0.5 else False\n",
"predicted = tf.cast(hypothesis > 0.5, dtype=tf.float32)\n",
"accuracy = tf.reduce_mean(tf.cast(tf.equal(predicted, Y), dtype=tf.float32))"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0 0.9145897 / 200 0.6217353 / 400 0.5711805 / 600 0.5571511 / 800 0.54969805 / 1000 0.54386365 / 1200 0.53870153 / 1400 0.53399694 / 1600 0.5296793 / 1800 0.52570844 / 2000 0.5220531 / 2200 0.51868516 / 2400 0.51558 / 2600 0.51271427 / 2800 0.5100674 / 3000 0.50762063 / 3200 0.50535667 / 3400 0.5032599 / 3600 0.5013163 / 3800 0.49951324 / 4000 0.49783888 / 4200 0.4962824 / 4400 0.49483478 / 4600 0.49348676 / 4800 0.49223068 / 5000 0.49105918 / 5200 0.48996592 / 5400 0.48894438 / 5600 0.48798954 / 5800 0.4870961 / 6000 0.4862596 / 6200 0.48547593 / 6400 0.48474097 / 6600 0.4840514 / 6800 0.48340395 / 7000 0.48279542 / 7200 0.48222357 / 7400 0.48168546 / 7600 0.48117882 / 7800 0.4807017 / 8000 0.48025212 / 8200 0.47982812 / 8400 0.47942802 / 8600 0.47905034 / 8800 0.47869363 / 9000 0.47835648 / 9200 0.47803774 / 9400 0.47773623 / 9600 0.47745106 / 9800 0.47718093 / 10000 0.47692513 / \n",
"\n",
" - Hypothesis: \n",
" [[0.37795937]\n",
" [0.93710846]\n",
" [0.2772684 ]\n",
" [0.94427884]\n",
" [0.08024111]\n",
" [0.8161948 ]\n",
" [0.93239474]\n",
" [0.5422459 ]\n",
" [0.26880753]\n",
" [0.5600177 ]\n",
" [0.73212385]\n",
" [0.15082034]\n",
" [0.2643727 ]\n",
" [0.25788882]\n",
" [0.7307969 ]\n",
" [0.37741715]\n",
" [0.7592269 ]\n",
" [0.7322997 ]\n",
" [0.8136358 ]\n",
" [0.5986698 ]\n",
" [0.64691615]\n",
" [0.10534783]\n",
" [0.701169 ]\n",
" [0.6604219 ]\n",
" [0.3165895 ]\n",
" [0.9538011 ]\n",
" [0.6144037 ]\n",
" [0.6854153 ]\n",
" [0.6379973 ]\n",
" [0.4849738 ]\n",
" [0.95931435]\n",
" [0.9331794 ]\n",
" [0.60776037]\n",
" [0.8443987 ]\n",
" [0.36229703]\n",
" [0.6154204 ]\n",
" [0.79683864]\n",
" [0.46518764]\n",
" [0.46110174]\n",
" [0.33292928]\n",
" [0.8835323 ]\n",
" [0.13436048]\n",
" [0.43114787]\n",
" [0.03980754]\n",
" [0.5946765 ]\n",
" [0.9426602 ]\n",
" [0.6372723 ]\n",
" [0.6930083 ]\n",
" [0.9666622 ]\n",
" [0.9395554 ]\n",
" [0.94415545]\n",
" [0.24174732]\n",
" [0.31210196]\n",
" [0.9628792 ]\n",
" [0.17701541]\n",
" [0.39859533]\n",
" [0.0988318 ]\n",
" [0.6342522 ]\n",
" [0.8446446 ]\n",
" [0.49572718]\n",
" [0.9523375 ]\n",
" [0.7556712 ]\n",
" [0.63299966]\n",
" [0.8689161 ]\n",
" [0.6120836 ]\n",
" [0.52707195]\n",
" [0.97274303]\n",
" [0.7715598 ]\n",
" [0.8486615 ]\n",
" [0.70004165]\n",
" [0.20912507]\n",
" [0.75264645]\n",
" [0.92826045]\n",
" [0.9342586 ]\n",
" [0.8759662 ]\n",
" [0.76235896]\n",
" [0.31451526]\n",
" [0.89634657]\n",
" [0.9071251 ]\n",
" [0.91639215]\n",
" [0.8765216 ]\n",
" [0.8530673 ]\n",
" [0.34343988]\n",
" [0.82991475]\n",
" [0.45937157]\n",
" [0.85160947]\n",
" [0.33583802]\n",
" [0.920775 ]\n",
" [0.9528135 ]\n",
" [0.79035646]\n",
" [0.7167408 ]\n",
" [0.7116038 ]\n",
" [0.7822815 ]\n",
" [0.56358117]\n",
" [0.9039538 ]\n",
" [0.98123646]\n",
" [0.88511926]\n",
" [0.536103 ]\n",
" [0.22637509]\n",
" [0.71892756]\n",
" [0.73735535]\n",
" [0.9696666 ]\n",
" [0.71954125]\n",
" [0.7444778 ]\n",
" [0.95869905]\n",
" [0.68527675]\n",
" [0.91050303]\n",
" [0.81628585]\n",
" [0.53486943]\n",
" [0.28905788]\n",
" [0.9457995 ]\n",
" [0.86101824]\n",
" [0.39193827]\n",
" [0.48934793]\n",
" [0.6329919 ]\n",
" [0.79366595]\n",
" [0.8739392 ]\n",
" [0.9403488 ]\n",
" [0.08383666]\n",
" [0.7074521 ]\n",
" [0.86614454]\n",
" [0.6623714 ]\n",
" [0.6470854 ]\n",
" [0.56595564]\n",
" [0.62801707]\n",
" [0.83730114]\n",
" [0.83477175]\n",
" [0.6365428 ]\n",
" [0.54313564]\n",
" [0.34555244]\n",
" [0.42444506]\n",
" [0.7174299 ]\n",
" [0.95519495]\n",
" [0.81755775]\n",
" [0.7643272 ]\n",
" [0.84944606]\n",
" [0.49141642]\n",
" [0.7789059 ]\n",
" [0.81553876]\n",
" [0.693049 ]\n",
" [0.86960965]\n",
" [0.59563094]\n",
" [0.57189447]\n",
" [0.65448457]\n",
" [0.90315837]\n",
" [0.7422333 ]\n",
" [0.43485993]\n",
" [0.9456318 ]\n",
" [0.6467768 ]\n",
" [0.83343697]\n",
" [0.2603795 ]\n",
" [0.38863903]\n",
" [0.07475116]\n",
" [0.1735873 ]\n",
" [0.90827334]\n",
" [0.8928592 ]\n",
" [0.9542492 ]\n",
" [0.0702714 ]\n",
" [0.6148026 ]\n",
" [0.7352086 ]\n",
" [0.53109455]\n",
" [0.857535 ]\n",
" [0.47192407]\n",
" [0.8049551 ]\n",
" [0.630961 ]\n",
" [0.66488993]\n",
" [0.7316392 ]\n",
" [0.90458274]\n",
" [0.7998106 ]\n",
" [0.59709966]\n",
" [0.88944507]\n",
" [0.82682306]\n",
" [0.95168614]\n",
" [0.2287236 ]\n",
" [0.8257307 ]\n",
" [0.13451843]\n",
" [0.30367076]\n",
" [0.36909786]\n",
" [0.9218279 ]\n",
" [0.62105864]\n",
" [0.93317574]\n",
" [0.91744894]\n",
" [0.6576063 ]\n",
" [0.11936923]\n",
" [0.17696397]\n",
" [0.69487363]\n",
" [0.7718658 ]\n",
" [0.68616956]\n",
" [0.86235774]\n",
" [0.5961719 ]\n",
" [0.37137172]\n",
" [0.09572136]\n",
" [0.8961862 ]\n",
" [0.38607273]\n",
" [0.8829661 ]\n",
" [0.92182535]\n",
" [0.69729227]\n",
" [0.6461459 ]\n",
" [0.5990155 ]\n",
" [0.57422554]\n",
" [0.71598125]\n",
" [0.9662507 ]\n",
" [0.7021931 ]\n",
" [0.8594281 ]\n",
" [0.10701558]\n",
" [0.35920337]\n",
" [0.89112985]\n",
" [0.21501848]\n",
" [0.93919015]\n",
" [0.25147507]\n",
" [0.27545503]\n",
" [0.36970422]\n",
" [0.705037 ]\n",
" [0.14396234]\n",
" [0.72322834]\n",
" [0.73252666]\n",
" [0.8139281 ]\n",
" [0.61029303]\n",
" [0.10872318]\n",
" [0.38898095]\n",
" [0.73816746]\n",
" [0.47879565]\n",
" [0.9434648 ]\n",
" [0.9442081 ]\n",
" [0.6863355 ]\n",
" [0.3123149 ]\n",
" [0.03157235]\n",
" [0.5786235 ]\n",
" [0.33348063]\n",
" [0.34757516]\n",
" [0.9671068 ]\n",
" [0.62814426]\n",
" [0.9549243 ]\n",
" [0.18901575]\n",
" [0.1208536 ]\n",
" [0.30700433]\n",
" [0.84542763]\n",
" [0.91069776]\n",
" [0.8851939 ]\n",
" [0.6985389 ]\n",
" [0.6346558 ]\n",
" [0.5868146 ]\n",
" [0.15549047]\n",
" [0.53515846]\n",
" [0.08832238]\n",
" [0.53128666]\n",
" [0.90445924]\n",
" [0.60817254]\n",
" [0.79231834]\n",
" [0.9676633 ]\n",
" [0.77936167]\n",
" [0.72185457]\n",
" [0.7304513 ]\n",
" [0.76273257]\n",
" [0.88136786]\n",
" [0.4293414 ]\n",
" [0.44117206]\n",
" [0.54167277]\n",
" [0.8216771 ]\n",
" [0.6435322 ]\n",
" [0.6813509 ]\n",
" [0.78657585]\n",
" [0.2735847 ]\n",
" [0.42835858]\n",
" [0.56729984]\n",
" [0.6806193 ]\n",
" [0.3063216 ]\n",
" [0.9149597 ]\n",
" [0.8390944 ]\n",
" [0.9303254 ]\n",
" [0.4810903 ]\n",
" [0.74532145]\n",
" [0.8225574 ]\n",
" [0.8304454 ]\n",
" [0.71146333]\n",
" [0.8824572 ]\n",
" [0.31593633]\n",
" [0.5425979 ]\n",
" [0.7108043 ]\n",
" [0.41706115]\n",
" [0.85020405]\n",
" [0.2851396 ]\n",
" [0.56103814]\n",
" [0.95121616]\n",
" [0.7710139 ]\n",
" [0.82701033]\n",
" [0.6277374 ]\n",
" [0.42474744]\n",
" [0.5620227 ]\n",
" [0.40304694]\n",
" [0.4203658 ]\n",
" [0.7070051 ]\n",
" [0.67328507]\n",
" [0.5640131 ]\n",
" [0.64857405]\n",
" [0.19510339]\n",
" [0.68931174]\n",
" [0.90669256]\n",
" [0.41167995]\n",
" [0.75081056]\n",
" [0.7492995 ]\n",
" [0.4923264 ]\n",
" [0.7356611 ]\n",
" [0.4897159 ]\n",
" [0.70669484]\n",
" [0.903133 ]\n",
" [0.6016407 ]\n",
" [0.76634103]\n",
" [0.84815216]\n",
" [0.50462264]\n",
" [0.85525745]\n",
" [0.9668043 ]\n",
" [0.36859626]\n",
" [0.7620204 ]\n",
" [0.3257778 ]\n",
" [0.782065 ]\n",
" [0.83320785]\n",
" [0.71234274]\n",
" [0.4161246 ]\n",
" [0.7681016 ]\n",
" [0.8143962 ]\n",
" [0.7316683 ]\n",
" [0.17010544]\n",
" [0.79259294]\n",
" [0.8540006 ]\n",
" [0.59405035]\n",
" [0.93724585]\n",
" [0.19713454]\n",
" [0.749837 ]\n",
" [0.9614026 ]\n",
" [0.1562411 ]\n",
" [0.4022885 ]\n",
" [0.70280004]\n",
" [0.3219665 ]\n",
" [0.16587648]\n",
" [0.88452494]\n",
" [0.93495846]\n",
" [0.8707899 ]\n",
" [0.6532771 ]\n",
" [0.5894291 ]\n",
" [0.5406407 ]\n",
" [0.76717937]\n",
" [0.8634216 ]\n",
" [0.9517447 ]\n",
" [0.68938947]\n",
" [0.7326299 ]\n",
" [0.65778136]\n",
" [0.93202716]\n",
" [0.94816166]\n",
" [0.6574522 ]\n",
" [0.31390947]\n",
" [0.65676135]\n",
" [0.2319979 ]\n",
" [0.7022656 ]\n",
" [0.22070657]\n",
" [0.26439962]\n",
" [0.42598286]\n",
" [0.6901499 ]\n",
" [0.31937966]\n",
" [0.5699445 ]\n",
" [0.80860233]\n",
" [0.70385116]\n",
" [0.8648985 ]\n",
" [0.96377945]\n",
" [0.81690145]\n",
" [0.09115198]\n",
" [0.37544614]\n",
" [0.7756784 ]\n",
" [0.84857535]\n",
" [0.62503386]\n",
" [0.21764183]\n",
" [0.91265535]\n",
" [0.88100284]\n",
" [0.22358856]\n",
" [0.57388914]\n",
" [0.86320543]\n",
" [0.88981074]\n",
" [0.8665406 ]\n",
" [0.91799086]\n",
" [0.91348284]\n",
" [0.9386834 ]\n",
" [0.6811876 ]\n",
" [0.5689397 ]\n",
" [0.5555614 ]\n",
" [0.83738625]\n",
" [0.87341356]\n",
" [0.18896522]\n",
" [0.8340985 ]\n",
" [0.907041 ]\n",
" [0.36520222]\n",
" [0.630163 ]\n",
" [0.87363076]\n",
" [0.53375846]\n",
" [0.9442541 ]\n",
" [0.289904 ]\n",
" [0.8345173 ]\n",
" [0.6495181 ]\n",
" [0.8700337 ]\n",
" [0.3561951 ]\n",
" [0.5902587 ]\n",
" [0.7446253 ]\n",
" [0.8343196 ]\n",
" [0.12555777]\n",
" [0.19196647]\n",
" [0.70027816]\n",
" [0.82818925]\n",
" [0.43644285]\n",
" [0.82671773]\n",
" [0.40873873]\n",
" [0.41087452]\n",
" [0.869887 ]\n",
" [0.4384703 ]\n",
" [0.953512 ]\n",
" [0.849827 ]\n",
" [0.64327544]\n",
" [0.9243863 ]\n",
" [0.54490864]\n",
" [0.7815766 ]\n",
" [0.26785332]\n",
" [0.29494554]\n",
" [0.725429 ]\n",
" [0.4303885 ]\n",
" [0.4709099 ]\n",
" [0.8993365 ]\n",
" [0.92936754]\n",
" [0.9150995 ]\n",
" [0.9575455 ]\n",
" [0.7785382 ]\n",
" [0.90548515]\n",
" [0.27683783]\n",
" [0.32669997]\n",
" [0.5412604 ]\n",
" [0.9600275 ]\n",
" [0.6077301 ]\n",
" [0.20987399]\n",
" [0.9310469 ]\n",
" [0.81238526]\n",
" [0.57714677]\n",
" [0.80058765]\n",
" [0.00914798]\n",
" [0.9392989 ]\n",
" [0.76444775]\n",
" [0.7349592 ]\n",
" [0.77732754]\n",
" [0.9754041 ]\n",
" [0.69923156]\n",
" [0.71213377]\n",
" [0.76720345]\n",
" [0.81695783]\n",
" [0.15388079]\n",
" [0.6192658 ]\n",
" [0.9148866 ]\n",
" [0.5923306 ]\n",
" [0.7882242 ]\n",
" [0.9607686 ]\n",
" [0.81289244]\n",
" [0.90075034]\n",
" [0.6373234 ]\n",
" [0.7856944 ]\n",
" [0.9348275 ]\n",
" [0.6976369 ]\n",
" [0.6323808 ]\n",
" [0.25179172]\n",
" [0.4590021 ]\n",
" [0.53321874]\n",
" [0.5476122 ]\n",
" [0.5848263 ]\n",
" [0.8082101 ]\n",
" [0.60351396]\n",
" [0.80474985]\n",
" [0.86732197]\n",
" [0.78032994]\n",
" [0.68837845]\n",
" [0.4809031 ]\n",
" [0.60566264]\n",
" [0.9374285 ]\n",
" [0.82832396]\n",
" [0.17171216]\n",
" [0.36019385]\n",
" [0.4085892 ]\n",
" [0.06221687]\n",
" [0.90959984]\n",
" [0.17426167]\n",
" [0.88597685]\n",
" [0.8900917 ]\n",
" [0.8309586 ]\n",
" [0.6838563 ]\n",
" [0.8871009 ]\n",
" [0.38816532]\n",
" [0.8170053 ]\n",
" [0.94545084]\n",
" [0.35218954]\n",
" [0.46711135]\n",
" [0.9113494 ]\n",
" [0.8593314 ]\n",
" [0.5737389 ]\n",
" [0.7725181 ]\n",
" [0.7980252 ]\n",
" [0.7991724 ]\n",
" [0.2895928 ]\n",
" [0.75676906]\n",
" [0.8699282 ]\n",
" [0.6550036 ]\n",
" [0.7666387 ]\n",
" [0.72765034]\n",
" [0.8192832 ]\n",
" [0.8885327 ]\n",
" [0.9390206 ]\n",
" [0.6348476 ]\n",
" [0.4370818 ]\n",
" [0.71346104]\n",
" [0.82507116]\n",
" [0.9761328 ]\n",
" [0.79800516]\n",
" [0.6819162 ]\n",
" [0.3693394 ]\n",
" [0.7269185 ]\n",
" [0.9301354 ]\n",
" [0.9643725 ]\n",
" [0.9206656 ]\n",
" [0.70141554]\n",
" [0.69950473]\n",
" [0.81815815]\n",
" [0.37867257]\n",
" [0.77296805]\n",
" [0.7962692 ]\n",
" [0.8801436 ]\n",
" [0.6199016 ]\n",
" [0.745664 ]\n",
" [0.8890186 ]\n",
" [0.4838251 ]\n",
" [0.5768089 ]\n",
" [0.58128935]\n",
" [0.7449418 ]\n",
" [0.6053395 ]\n",
" [0.90360266]\n",
" [0.9343584 ]\n",
" [0.23511697]\n",
" [0.09677152]\n",
" [0.762541 ]\n",
" [0.57083344]\n",
" [0.36570165]\n",
" [0.8568496 ]\n",
" [0.9066361 ]\n",
" [0.73334014]\n",
" [0.9411616 ]\n",
" [0.90161395]\n",
" [0.79620683]\n",
" [0.8238949 ]\n",
" [0.7058908 ]\n",
" [0.3979287 ]\n",
" [0.764102 ]\n",
" [0.59197485]\n",
" [0.09296317]\n",
" [0.8944104 ]\n",
" [0.88313854]\n",
" [0.73019636]\n",
" [0.92364657]\n",
" [0.8337299 ]\n",
" [0.86383533]\n",
" [0.61081225]\n",
" [0.65693104]\n",
" [0.8844273 ]\n",
" [0.8344419 ]\n",
" [0.86532015]\n",
" [0.89997697]\n",
" [0.6800423 ]\n",
" [0.77088726]\n",
" [0.8171205 ]\n",
" [0.47614524]\n",
" [0.5556315 ]\n",
" [0.10257592]\n",
" [0.21962121]\n",
" [0.8510823 ]\n",
" [0.6260974 ]\n",
" [0.6216275 ]\n",
" [0.5489324 ]\n",
" [0.9490576 ]\n",
" [0.4036259 ]\n",
" [0.84931296]\n",
" [0.2926572 ]\n",
" [0.9179985 ]\n",
" [0.22670576]\n",
" [0.7796923 ]\n",
" [0.5914683 ]\n",
" [0.84867686]\n",
" [0.57973963]\n",
" [0.270539 ]\n",
" [0.6858667 ]\n",
" [0.9078612 ]\n",
" [0.3955355 ]\n",
" [0.92406005]\n",
" [0.9131948 ]\n",
" [0.8785715 ]\n",
" [0.8552462 ]\n",
" [0.38327697]\n",
" [0.35887775]\n",
" [0.63511676]\n",
" [0.18781431]\n",
" [0.9672322 ]\n",
" [0.3113753 ]\n",
" [0.9434575 ]\n",
" [0.8740429 ]\n",
" [0.3989558 ]\n",
" [0.20178707]\n",
" [0.7087839 ]\n",
" [0.3667823 ]\n",
" [0.87813336]\n",
" [0.7914977 ]\n",
" [0.985504 ]\n",
" [0.54879296]\n",
" [0.62076485]\n",
" [0.7976871 ]\n",
" [0.84230787]\n",
" [0.06980098]\n",
" [0.6222425 ]\n",
" [0.80295783]\n",
" [0.85275453]\n",
" [0.66306746]\n",
" [0.4576119 ]\n",
" [0.60830706]\n",
" [0.9098962 ]\n",
" [0.6511661 ]\n",
" [0.78102815]\n",
" [0.8516409 ]\n",
" [0.9042146 ]\n",
" [0.81267095]\n",
" [0.54395086]\n",
" [0.80880046]\n",
" [0.911878 ]\n",
" [0.6347811 ]\n",
" [0.9744678 ]\n",
" [0.82440275]\n",
" [0.6351552 ]\n",
" [0.53339505]\n",
" [0.84063 ]\n",
" [0.87517554]\n",
" [0.4644057 ]\n",
" [0.7519921 ]\n",
" [0.18734577]\n",
" [0.60566497]\n",
" [0.8212925 ]\n",
" [0.95582074]\n",
" [0.8257597 ]\n",
" [0.74081105]\n",
" [0.78266484]\n",
" [0.8945565 ]\n",
" [0.43082938]\n",
" [0.9512448 ]\n",
" [0.5519473 ]\n",
" [0.8520662 ]\n",
" [0.35706937]\n",
" [0.080543 ]\n",
" [0.3247849 ]\n",
" [0.3195097 ]\n",
" [0.65610224]\n",
" [0.8541871 ]\n",
" [0.5704279 ]\n",
" [0.74372727]\n",
" [0.763321 ]\n",
" [0.5134152 ]\n",
" [0.3448996 ]\n",
" [0.89952743]\n",
" [0.893919 ]\n",
" [0.30167562]\n",
" [0.61273265]\n",
" [0.2153466 ]\n",
" [0.47336623]\n",
" [0.7006797 ]\n",
" [0.6705483 ]\n",
" [0.91744107]\n",
" [0.9810368 ]\n",
" [0.12814437]\n",
" [0.61753565]\n",
" [0.6573522 ]\n",
" [0.4239442 ]\n",
" [0.7450129 ]\n",
" [0.8066857 ]\n",
" [0.8915115 ]\n",
" [0.8073476 ]\n",
" [0.42705446]\n",
" [0.7200018 ]\n",
" [0.1605227 ]\n",
" [0.6515838 ]\n",
" [0.465807 ]\n",
" [0.93238944]\n",
" [0.61629254]\n",
" [0.6220373 ]\n",
" [0.8206705 ]\n",
" [0.7471273 ]\n",
" [0.38452914]\n",
" [0.75859165]\n",
" [0.67423636]\n",
" [0.34685174]\n",
" [0.54977757]\n",
" [0.9067282 ]\n",
" [0.8147883 ]\n",
" [0.5471888 ]\n",
" [0.74886674]\n",
" [0.29771686]\n",
" [0.82082725]\n",
" [0.61191595]\n",
" [0.78250843]\n",
" [0.29951912]\n",
" [0.62855256]\n",
" [0.8630617 ]\n",
" [0.13828118]\n",
" [0.32947308]\n",
" [0.8089388 ]\n",
" [0.8079519 ]\n",
" [0.75572515]\n",
" [0.92256767]\n",
" [0.74288034]\n",
" [0.7283297 ]\n",
" [0.70836693]\n",
" [0.8177371 ]\n",
" [0.664139 ]\n",
" [0.80503076]\n",
" [0.5266353 ]\n",
" [0.6317057 ]\n",
" [0.8877591 ]\n",
" [0.83542985]\n",
" [0.72260445]\n",
" [0.28004587]\n",
" [0.8862402 ]\n",
" [0.85822606]\n",
" [0.8062282 ]\n",
" [0.7029034 ]\n",
" [0.8630252 ]\n",
" [0.81893337]\n",
" [0.7442802 ]\n",
" [0.31600982]\n",
" [0.8737201 ]\n",
" [0.92043906]\n",
" [0.36441725]\n",
" [0.12854618]\n",
" [0.72554684]\n",
" [0.36542156]\n",
" [0.7283739 ]\n",
" [0.31442586]\n",
" [0.49471995]\n",
" [0.4677952 ]\n",
" [0.7742698 ]\n",
" [0.88916826]\n",
" [0.13080047]\n",
" [0.3739761 ]\n",
" [0.6330735 ]\n",
" [0.5478768 ]\n",
" [0.45646703]\n",
" [0.79521585]\n",
" [0.16692732]\n",
" [0.9223253 ]\n",
" [0.12836601]\n",
" [0.8724258 ]\n",
" [0.68219376]\n",
" [0.71421254]\n",
" [0.8614742 ]\n",
" [0.717938 ]\n",
" [0.9099425 ]] \n",
" - Correct(Y): \n",
" [[0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [0.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]\n",
" [1.]] \n",
" - Accuracy: \n",
" 0.77338606\n"
]
}
],
"source": [
"# Launch Graph\n",
"with tf.Session() as sess:\n",
" # Initialize TensorFlow variables\n",
" sess.run(tf.global_variables_initializer())\n",
" \n",
"    feed = {X: x_data, Y: y_data}\n",
" for step in range(10001):\n",
" cost_val, _ = sess.run([cost, train], feed_dict=feed)\n",
" if step % 200 == 0:\n",
"            print(step, cost_val, end=\" / \")\n",
" \n",
" # Accuracy report\n",
" h, c, a = sess.run([hypothesis, predicted, accuracy], feed_dict=feed)\n",
" \n",
" print(\"\\n\\n\",\" - Hypothesis: \\n\", h, \"\\n - Correct(Y): \\n\", c, \"\\n - Accuracy: \\n\", a)\n",
" \n"
]
}
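,
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Below is a minimal NumPy sketch of the same computation the graph above performs: `sigmoid(XW + b)`, thresholding at 0.5, and accuracy. The toy data and weights (`x_toy`, `y_toy`, `w`, `b`) are made-up assumptions for illustration, not the notebook's actual data or trained values."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "# Tiny hand-made dataset -- an assumption for illustration,\n",
    "# not the notebook's actual training data.\n",
    "x_toy = np.array([[1., 2.], [2., 3.], [3., 1.], [4., 3.]])\n",
    "y_toy = np.array([[0.], [0.], [1.], [1.]])\n",
    "\n",
    "# Hand-picked weight vector and bias, also assumptions (not trained values).\n",
    "w = np.array([[1.], [-1.]])\n",
    "b = 0.5\n",
    "\n",
    "# Hypothesis: sigmoid(XW + b), the same form as the TensorFlow graph above.\n",
    "h = 1. / (1. + np.exp(-(x_toy @ w + b)))\n",
    "\n",
    "# Threshold at 0.5 to get 0/1 predictions, then compare with the labels.\n",
    "predicted = (h > 0.5).astype(np.float32)\n",
    "accuracy = np.mean(predicted == y_toy)\n",
    "print(predicted.ravel(), accuracy)"
   ]
  }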
],
"metadata": {
"kernelspec": {
"display_name": "python3.5",
"language": "python",
"name": "python3.5"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}