TensorFlow - Basic Minimisation with Gradient Descent using placeholder data
{
"cells": [
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "import numpy as np\nimport tensorflow as tf",
"execution_count": 1,
"outputs": [
{
"output_type": "stream",
"text": "/home/karl/anaconda2/envs/py36-test/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\n",
"name": "stderr"
}
]
},
{
"metadata": {},
"cell_type": "markdown",
"source": "### Using Gradient Descent, find the value of w when\n10*W**2 - 20*W + 100 = 0"
},
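{
"metadata": {},
"cell_type": "markdown",
"source": "A quick check of what to expect: the derivative of 10*w**2 - 20*w + 100 is 20*w - 20, which is zero at w = 1, so gradient descent should converge towards w = 1 (where the cost is 90)."
},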
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "coefficients = np.array([[10.0], [-20.0], [100.0]] )\nw = tf.Variable(0, dtype=tf.float32)\nx = tf.placeholder(tf.float32, [3,1])",
"execution_count": 33,
"outputs": []
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "cost = x[0][0]*w**2 + x[1][0]*w+x[2][0]\ntrain = tf.train.GradientDescentOptimizer(0.01).minimize(cost)\ninit = tf.global_variables_initializer()\nsession = tf.Session()\nsession.run(init)\nprint(session.run(w))",
"execution_count": 34,
"outputs": [
{
"output_type": "stream",
"text": "0.0\n",
"name": "stdout"
}
]
},
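{
"metadata": {},
"cell_type": "markdown",
"source": "As a sketch of an optional check, reusing the `cost`, `x` and `coefficients` names defined above: the cost can be evaluated before any training step, and with `w` initialised to 0 it should come out as 100.0."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Optional check: cost at the initial w = 0 is 10*0 - 20*0 + 100 = 100\nprint(session.run(cost, feed_dict={x: coefficients}))",
"execution_count": null,
"outputs": []
},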
{
"metadata": {},
"cell_type": "markdown",
"source": "### Perform 1 round of Gradient Descent"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "session.run(train, feed_dict={x:coefficients})\nprint(session.run(w))",
"execution_count": 35,
"outputs": [
{
"output_type": "stream",
"text": "0.19999999\n",
"name": "stdout"
}
]
},
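{
"metadata": {},
"cell_type": "markdown",
"source": "This matches a hand calculation: at w = 0 the gradient of the cost is 20*w - 20 = -20, so one step with learning rate 0.01 gives w = 0 - 0.01*(-20) = 0.2 (printed as 0.19999999 in float32)."
},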
{
"metadata": {},
"cell_type": "markdown",
"source": "### Perform 1000 round of Gradient Descent"
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "for i in range(1_000):\n session.run(train, feed_dict={x:coefficients})\nprint(session.run(w))",
"execution_count": 36,
"outputs": [
{
"output_type": "stream",
"text": "0.9999999\n",
"name": "stdout"
}
]
},
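{
"metadata": {},
"cell_type": "markdown",
"source": "As a final sketch, again reusing the names defined above: the cost at the converged w (close to 1) should be roughly 10 - 20 + 100 = 90, and the session can be closed once training is done."
},
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "# Cost at the converged w (approximately 1) should be close to 90.0\nprint(session.run(cost, feed_dict={x: coefficients}))\n\n# Release the session's resources now that training is finished\nsession.close()",
"execution_count": null,
"outputs": []
},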
{
"metadata": {
"trusted": true
},
"cell_type": "code",
"source": "",
"execution_count": null,
"outputs": []
}
],
"metadata": {
"_draft": {
"nbviewer_url": "https://gist.github.com/4b6c4b0abda338d6580e1aca6946c183"
},
"gist": {
"id": "4b6c4b0abda338d6580e1aca6946c183",
"data": {
"description": "TensorFlow - Basic Minimisation with Gradient Descent using placeholder data",
"public": true
}
},
"kernelspec": {
"name": "py36-test",
"display_name": "py36-test",
"language": "python"
},
"language_info": {
"name": "python",
"version": "3.6.4",
"mimetype": "text/x-python",
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"pygments_lexer": "ipython3",
"nbconvert_exporter": "python",
"file_extension": ".py"
},
"toc": {
"nav_menu": {},
"number_sections": true,
"sideBar": true,
"skip_h1_title": false,
"toc_cell": false,
"toc_position": {},
"toc_section_display": "block",
"toc_window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 2
}