TensorFlow.js
<?xml version="1.0" encoding="UTF-8"?>
<!--Xholon Workbook http://www.primordion.com/Xholon/gwt/ MIT License, Copyright (C) Ken Webb, Sun Feb 17 2019 21:45:43 GMT-0500 (Eastern Standard Time)-->
<XholonWorkbook>
<Notes><![CDATA[
Xholon
------
Title: TensorFlow.js
Description:
Url: http://www.primordion.com/Xholon/gwt/
InternalName: 7d31dbd88ec3588f3d9508a7243d9b4d
Keywords:
My Notes
--------
February 14, 2019
see also "deeplearn.js":
http://127.0.0.1:8888/wb/editwb.html?app=7d31dbd88ec3588f3d9508a7243d9b4d&src=gist
TensorFlow.js combines deeplearn.js (now the tfjs-core package) with a higher-level Layers API and related tooling.
use XholonTensorFlow.html
http://127.0.0.1:8888/XholonTensorFlow.html?app=TensorFlow.js&src=lstr&gui=clsc
http://127.0.0.1:8888/XholonTensorFlow.html?app=7d31dbd88ec3588f3d9508a7243d9b4d&src=gist&gui=clsc
GPU: NVIDIA on Ubuntu (command line)
------------------------------------
nvidia-smi   (reports the driver version and current GPU status)
TODO
----
1.
Look closer at ref[4]
Does this mean I can build a Xholon model with lots of probabilities, and then run it with TensorFlow?
ex: in BGCO, what is the probability that a 10-year-old will attend the "Eat Cookies" program?
convert all attendances to probabilities of attending
This might be a really useful technique to use in Xholon.
see my existing models:
ceb4083319ebfa4a7bd0 Bayesian inference
036d7c299b7c96e4838059e06b6d8334 Web Cryptography API (includes an example using IndexedDB)
convert captured data into probabilities of each action given a certain state
this would be a good Xholon model for exploring the use of TFP with Xholon
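A minimal sketch (plain JavaScript; the {state, action} record format is a hypothetical stand-in for captured Xholon data) of how observed counts could be converted into conditional probabilities P(action | state):

// Sketch only; assumes each captured record looks like {state: "age10", action: "EatCookies"}.
function toConditionalProbabilities(records) {
  const counts = {};  // counts[state][action] = times the action was observed in that state
  const totals = {};  // totals[state] = total records observed in that state
  records.forEach(function(rec) {
    counts[rec.state] = counts[rec.state] || {};
    counts[rec.state][rec.action] = (counts[rec.state][rec.action] || 0) + 1;
    totals[rec.state] = (totals[rec.state] || 0) + 1;
  });
  const probs = {};
  Object.keys(counts).forEach(function(state) {
    probs[state] = {};
    Object.keys(counts[state]).forEach(function(action) {
      probs[state][action] = counts[state][action] / totals[state];
    });
  });
  return probs;  // probs[state][action] approximates P(action | state)
}

For example, toConditionalProbabilities([{state:"age10", action:"EatCookies"}, {state:"age10", action:"Crafts"}]) returns {age10: {EatCookies: 0.5, Crafts: 0.5}}.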
References
----------
(1) https://js.tensorflow.org/
A JavaScript library for training and deploying ML models in the browser and on Node.js
(2) https://github.com/tensorflow/tfjs
A WebGL accelerated JavaScript library for training and deploying ML models.
(3) https://js.tensorflow.org/tutorials/
(4) https://github.com/vega
Data Visualization Languages & Tools
) https://github.com/vega/vega-embed
Publish Vega visualizations as embedded web components with interactive parameters.
(5) https://vega.github.io/editor/#/edited
(6) https://github.com/tensorflow/tfjs-examples
numerous examples
(7) https://www.tensorflow.org/install/gpu
(8) https://www.tensorflow.org/
(9) https://tensorflow.rstudio.com/
(10) https://www.tensorflow.org/probability/
TFP
TensorFlow Probability is a library for probabilistic reasoning and statistical analysis.
TensorFlow Probability (TFP) is a Python library built on TensorFlow that makes it easy to combine probabilistic models and deep learning on modern hardware (TPU, GPU).
It's for data scientists, statisticians, ML researchers, and practitioners who want to encode domain knowledge to understand data and make predictions.
TFP includes: ...
) https://camdavidsonpilon.github.io/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/
printed and online book
) https://docs.google.com/presentation/d/1BWhNVHzhFfYiFL8ynX1wFmag8YjzvHeqKkd8Z0G8H7Y
slide4:
Take Home Message
Express your domain knowledge as a probabilistic model.
Use TFP to execute it.
Presentation Notes:
Probability is a universal language that you can use to encode your domain knowledge.
TFP encodes this language for computation.
TFP releases you from the expressive limitations of other toolsets. Build your best model, with your best knowledge of how the world works
(11) https://www.nature.com/articles/s41467-018-08089-7
Approximate Bayesian computation with deep learning supports a third archaic introgression in Asia and Oceania
Mayukh Mondal, Jaume Bertranpetit & Oscar Lao
]]></Notes>
<_-.XholonClass>
<PhysicalSystem/>
<Testing/>
<!-- Xholon Container
https://github.com/tensorflow/tfjs-examples/tree/master/polynomial-regression-core
https://js.tensorflow.org/tutorials/fit-curve.html
-->
<PolynomialRegression/>
<!-- Xholon Active Object
https://github.com/tensorflow/tfjs-examples/blob/master/polynomial-regression-core/index.js
-->
<FitCurve/>
<!-- Xholon Passive Object
https://github.com/tensorflow/tfjs-examples/blob/master/polynomial-regression-core/data.js
-->
<GenerateData/>
<!-- UI
https://github.com/tensorflow/tfjs-examples/blob/master/polynomial-regression-core/ui.js
-->
<VegaUI/>
<!-- https://github.com/tensorflow/tfjs-examples/blob/master/getting-started/index.js -->
<GettingStarted/>
</_-.XholonClass>
<xholonClassDetails>
</xholonClassDetails>
<PhysicalSystem>
<Testing roleName="123"/>
<PolynomialRegression>
<GenerateData/>
<VegaUI/>
<FitCurve/>
</PolynomialRegression>
<GettingStarted/>
</PhysicalSystem>
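<!-- Runtime flow (as coded in the behaviors below): Testing runs at TimeStep 1, FitCurve at TimeStep 3, and GettingStarted at TimeStep 5. FitCurve requests training data from GenerateData (signal 101) and asks VegaUI to plot data, predictions, and coefficients (signals 301, 302, 303) via synchronous Xholon messages. -->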
<Testingbehavior implName="org.primordion.xholon.base.Behavior_gwtjs"><![CDATA[
// Testing
var me, tf, beh = {
postConfigure: function() {
me = this.cnode.parent();
},
act: function() {
if ($wnd.xh.param("TimeStep") == "1") {
this.test();
}
},
test: function() {
// example from source [1]
// var $wnd=window;
tf = $wnd.tf;
$wnd.console.log(tf);
$wnd.console.log(tf.tensor1d);
var arr = [1,2,3];
var a = tf.tensor1d(arr);
var b = tf.scalar(2);
a.add(b).print();
// tf.tidy takes a function to tidy up after
const average = tf.tidy(() => {
// tf.tidy will clean up all the GPU memory used by tensors inside
// this function, other than the tensor that is returned.
//
// Even in a short sequence of operations like the one below, a number
// of intermediate tensors get created. So it is a good practice to
// put your math ops in a tidy!
const y = tf.tensor1d([1.0, 2.0, 3.0, 4.0]);
const z = tf.ones([4]);
return y.sub(z).square().mean();
});
average.print(); // Output: 3.5
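// Optional check (not part of the original example): tf.memory() reports live tensor
// counts, so logging it here should show that tf.tidy() disposed the intermediates.
$wnd.console.log("numTensors after tidy: " + tf.memory().numTensors);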
}
}
//# sourceURL=Testingbehavior.js
]]></Testingbehavior>
<GenerateDatabehavior implName="org.primordion.xholon.base.Behavior_gwtjs"><![CDATA[
// GenerateData
// msg.data array indexes
const IX_NUMPOINTS = 0;
const IX_COEFF = 1;
const IX_SIGMA = 2;
var me, tf, xs, ys, beh = {
postConfigure: function() {
me = this.cnode.parent();
tf = $wnd.tf;
},
act: function() {
if ($wnd.xh.param("TimeStep") == "2") {
//this.test();
}
},
test: function() {
const trueCoefficients = {a: -.8, b: -.2, c: .9, d: .5};
const result = this.generateData(100, trueCoefficients, 0.04);
$wnd.console.log(result);
var promise1 = result.xs.array();
promise1.then(function(value) {
$wnd.console.log(value);
});
var promise2 = result.ys.array();
promise2.then(function(value) {
$wnd.console.log(value);
});
},
processReceivedSyncMessage: function(msg) {
// msg.data is a JS array
$wnd.console.log("processReceivedSyncMessage " + msg.signal);
var numPoints = msg.data[IX_NUMPOINTS];
var coeff = msg.data[IX_COEFF];
var sigma = msg.data[IX_SIGMA];
//return this.generateData(100, {a: -.8, b: -.2, c: .9, d: .5}, 0.04);
return this.generateData(numPoints, coeff, sigma);
},
generateData: function(numPoints, coeff, sigma = 0.04) {
return tf.tidy(() => {
const [a, b, c, d] = [
tf.scalar(coeff.a), tf.scalar(coeff.b), tf.scalar(coeff.c),
tf.scalar(coeff.d)
];
const xs = tf.randomUniform([numPoints], -1, 1);
// Generate polynomial data
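// y = a * x^3 + b * x^2 + c * x + d (the same cubic that FitCurve's predict() tries to recover)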
const three = tf.scalar(3, 'int32');
const ys = a.mul(xs.pow(three))
.add(b.mul(xs.square()))
.add(c.mul(xs))
.add(d)
// Add random noise to the generated data
// to make the problem a bit more interesting
.add(tf.randomNormal([numPoints], 0, sigma));
// Normalize the y values to the range 0 to 1.
const ymin = ys.min();
const ymax = ys.max();
const yrange = ymax.sub(ymin);
const ysNormalized = ys.sub(ymin).div(yrange);
return {
xs,
ys: ysNormalized
};
})
}
}
//# sourceURL=GenerateDatabehavior.js
]]></GenerateDatabehavior>
<VegaUIbehavior implName="org.primordion.xholon.base.Behavior_gwtjs"><![CDATA[
// VegaUI
var me, beh = {
postConfigure: function() {
me = this.cnode.parent();
me.println(me.name());
},
processReceivedSyncMessage: function(msg) {
return this.genui(msg);
},
genui: function(msg) {
//import renderChart from 'vega-embed';
switch (msg.signal) {
case 301: // plotData
plotData("div#xhimg", msg.data[0], msg.data[1]);
break;
case 302: // plotDataAndPredictions
plotDataAndPredictions("div#xhimg", msg.data[0], msg.data[1], msg.data[2]);
break;
case 303: // renderCoefficients
renderCoefficients("div#xhgraph", msg.data);
break;
default: break;
}
return "testing genui() " + msg.signal;
async function plotData(container, xs, ys) {
const xvals = await xs.data();
const yvals = await ys.data();
const values = Array.from(yvals).map((y, i) => {
return {'x': xvals[i], 'y': yvals[i]};
});
const spec = {
'$schema': 'https://vega.github.io/schema/vega-lite/v2.json',
'width': 300,
'height': 300,
'data': {'values': values},
'mark': 'point',
'encoding': {
'x': {'field': 'x', 'type': 'quantitative'},
'y': {'field': 'y', 'type': 'quantitative'}
}
};
// test with https://vega.github.io/editor/#/examples/vega-lite/line
me.println(JSON.stringify(spec));
// it draws a correct-looking graph
return renderChart(container, spec, {actions: false, renderer: "svg"});
}
async function plotDataAndPredictions(container, xs, ys, preds) {
const xvals = await xs.data();
const yvals = await ys.data();
let predVals;
try {
predVals = await preds.data(); // second call to this method: Error: Tensor is disposed.
} catch(e) {
$wnd.console.log(e);
predVals = [];
}
const values = Array.from(yvals).map((y, i) => {
return {'x': xvals[i], 'y': yvals[i], pred: predVals[i]};
});
const spec = {
'$schema': 'https://vega.github.io/schema/vega-lite/v2.json',
'width': 300,
'height': 300,
'data': {'values': values},
'layer': [
{
'mark': 'point',
'encoding': {
'x': {'field': 'x', 'type': 'quantitative'},
'y': {'field': 'y', 'type': 'quantitative'}
}
},
{
'mark': 'line',
'encoding': {
'x': {'field': 'x', 'type': 'quantitative'},
'y': {'field': 'pred', 'type': 'quantitative'},
'color': {'value': 'tomato'}
},
}
]
};
// test with https://vega.github.io/editor/#/examples/vega-lite/line
me.println(JSON.stringify(spec));
// it draws a correct-looking graph
return renderChart(container, spec, {actions: false, renderer: "svg"});
}
function renderCoefficients(container, coeff) {
$doc.querySelector(container).insertAdjacentHTML('beforeend',
`<div>a=${coeff.a.toFixed(3)}, b=${coeff.b.toFixed(3)}, c=${coeff.c.toFixed(3)}, d=${coeff.d.toFixed(3)}</div>`);
}
function renderChart(container, spec, options) {
$wnd.console.log("renderChart " + container);
// TODO
return $wnd.vegaEmbed(container, spec, options).then(function(result) {
// Access the Vega view instance (https://vega.github.io/vega/docs/api/view/) as result.view
}).catch(console.error);
//$wnd.vegaEmbed(container, spec);
}
}
}
//# sourceURL=VegaUIbehavior.js
]]></VegaUIbehavior>
<FitCurvebehavior implName="org.primordion.xholon.base.Behavior_gwtjs"><![CDATA[
// FitCurve
var me, tf, beh = {
postConfigure: function() {
me = this.cnode.parent();
tf = $wnd.tf;
},
act: function() {
if ($wnd.xh.param("TimeStep") == "3") {
const result = this.fitCurve();
$wnd.console.log(result);
}
},
// https://github.com/tensorflow/tfjs-examples/blob/master/polynomial-regression-core/index.js
// Note: import {plotData, plotDataAndPredictions, renderCoefficients} from './ui';
fitCurve: function() {
var $this = this;
// Step 1: Set up Variables
const a = tf.variable(tf.scalar(Math.random()));
const b = tf.variable(tf.scalar(Math.random()));
const c = tf.variable(tf.scalar(Math.random()));
const d = tf.variable(tf.scalar(Math.random()));
// Step 2: create an optimizer
const numIterations = 75;
const learningRate = 0.5;
const optimizer = tf.train.sgd(learningRate);
// Step 3. Write our training process functions.
function predict(x) {
// y = a * x ^ 3 + b * x ^ 2 + c * x + d
return tf.tidy(() => {
return a.mul(x.pow(tf.scalar(3, 'int32')))
.add(b.mul(x.square())) // + b * x ^ 2
.add(c.mul(x)) // + c * x
.add(d); // + d
});
} // end function predict(x)
// loss
function loss(prediction, labels) {
// Having a good error function is key for training a machine learning model
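// Mean squared error: mean((prediction - label)^2) over all data points.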
const error = prediction.sub(labels).square().mean();
return error;
}
// train
async function train(xs, ys, numIterations) {
for (let iter = 0; iter < numIterations; iter++) {
// optimizer.minimize is where the training happens.
// The function it takes must return a numerical estimate (i.e. loss)
// of how well we are doing using the current state of
// the variables we created at the start.
// This optimizer does the 'backward' step of our training process
// updating variables defined previously in order to minimize the
// loss.
optimizer.minimize(() => {
// Feed the examples into the model
const pred = predict(xs);
return loss(pred, ys);
});
// Use tf.nextFrame to not block the browser.
await tf.nextFrame();
} // end for
} // end train
async function learnCoefficients() {
const trueCoefficients = {a: -.8, b: -.2, c: .9, d: .5};
const trainingData = $this.generateData(100, trueCoefficients);
$wnd.console.log(trainingData);
// Plot original data
$this.renderCoefficients('#data .coeff', trueCoefficients);
await $this.plotData('#data .plot', trainingData.xs, trainingData.ys)
// See what the predictions look like with random coefficients
$this.renderCoefficients('#random .coeff', {
a: a.dataSync()[0],
b: b.dataSync()[0],
c: c.dataSync()[0],
d: d.dataSync()[0],
});
const predictionsBefore = predict(trainingData.xs);
await $this.plotDataAndPredictions(
'#random .plot', trainingData.xs, trainingData.ys, predictionsBefore);
// Train the model!
await train(trainingData.xs, trainingData.ys, numIterations);
// See what the final results predictions are after training.
$this.renderCoefficients('#trained .coeff', {
a: a.dataSync()[0],
b: b.dataSync()[0],
c: c.dataSync()[0],
d: d.dataSync()[0],
});
const predictionsAfter = predict(trainingData.xs);
await $this.plotDataAndPredictions(
'#trained .plot', trainingData.xs, trainingData.ys, predictionsAfter);
predictionsBefore.dispose();
predictionsAfter.dispose();
} // end learnCoefficients
learnCoefficients();
return "fitCurve() TODO";
}, // end fitCurve()
// ui methods
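// These helpers delegate to sibling nodes via synchronous Xholon messages:
// me.prev() / me.prev().prev() walk back to the VegaUI and GenerateData siblings
// (see the composite structure above), .first() selects their behavior child, and
// .call(signal, data, me) invokes that behavior's processReceivedSyncMessage();
// the reply's .data is returned (signals 301/302/303 go to VegaUI, 101 to GenerateData).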
plotData: function(text, xs, ys) {
me.println(text);
$wnd.console.log("about to call(301)");
return me.prev().first().call(301, [xs, ys], me).data;
},
plotDataAndPredictions: function(text, xs, ys, preds) {
me.println(text);
$wnd.console.log("about to call(302)");
return me.prev().first().call(302, [xs, ys, preds], me).data;
},
renderCoefficients: function(text, data) {
me.println(text);
$wnd.console.log("about to call(303)");
return me.prev().first().call(303, data, me).data;
},
generateData: function(numPoints, coeff, sigma) {
$wnd.console.log("about to call(101)");
return me.prev().prev().first().call(101, [numPoints, coeff, sigma], me).data;
}
}
//# sourceURL=FitCurvebehavior.js
]]></FitCurvebehavior>
<GettingStartedbehavior implName="org.primordion.xholon.base.Behavior_gwtjs"><![CDATA[
// GettingStarted
// https://github.com/tensorflow/tfjs-examples/blob/master/getting-started/index.js
var me, tf, beh = {
postConfigure: function() {
me = this.cnode.parent();
tf = $wnd.tf;
},
act: function() {
if ($wnd.xh.param("TimeStep") == "5") {
this.gettingStarted();
}
},
gettingStarted: function() {
me.println("gettingStarted");
// Tiny TFJS train / predict example.
async function run() {
// Create a simple model.
const model = tf.sequential();
model.add(tf.layers.dense({units: 1, inputShape: [1]}));
// Prepare the model for training: Specify the loss and the optimizer.
model.compile({loss: 'meanSquaredError', optimizer: 'sgd'});
// Generate some synthetic data for training. (y = 2x - 1)
const xs = tf.tensor2d([-1, 0, 1, 2, 3, 4], [6, 1]);
const ys = tf.tensor2d([-3, -1, 1, 3, 5, 7], [6, 1]);
// Train the model using the data.
await model.fit(xs, ys, {epochs: 250});
// Use the model to do inference on a data point the model hasn't seen.
// Should print approximately 39.
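// dataSync() synchronously copies the prediction tensor's values into a typed array.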
me.println(model.predict(tf.tensor2d([20], [1, 1])).dataSync());
}
run();
}
}
//# sourceURL=GettingStartedbehavior.js
]]></GettingStartedbehavior>
<SvgClient><Attribute_String roleName="svgUri"><![CDATA[data:image/svg+xml,
<svg width="100" height="50" xmlns="http://www.w3.org/2000/svg">
<g>
<title>Testing</title>
<rect id="PhysicalSystem/Testing" fill="#98FB98" height="50" width="50" x="25" y="0"/>
<g>
<title>PolynomialRegression example</title>
<rect id="PhysicalSystem/PolynomialRegression" fill="#6AB06A" height="50" width="10" x="80" y="0"/>
</g>
</g>
</svg>
]]></Attribute_String><Attribute_String roleName="setup">${MODELNAME_DEFAULT},${SVGURI_DEFAULT}</Attribute_String></SvgClient>
</XholonWorkbook>