Tom Martin (tpgmartin)


The easiest way to transition between pie charts with differently-sized datasets (while maintaining object constancy) is to set the missing values to zero.

function type(d) {
  d.apples = +d.apples || 0;
  d.oranges = +d.oranges || 0;
  return d;
}
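For context, a minimal sketch of how an accessor like this is typically wired up (the file name and the D3 v4 callback-style d3.csv are assumptions, not taken from the gist):

// Hypothetical usage: coerce every CSV row with `type` so both fields always exist,
// letting the pie layout keep a constant set of keys when the dataset changes.
d3.csv('fruits.csv', type, (error, data) => {
  if (error) throw error
  console.log(data) // every row now has numeric apples and oranges fields
})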
@tpgmartin
tpgmartin / introrx.md
Created October 5, 2016 08:36 — forked from staltz/introrx.md
The introduction to Reactive Programming you've been missing
@tpgmartin
tpgmartin / getElementsByClassName.js
Created May 2, 2017 00:50
Implementing 'getElementsByClassName' from scratch in JavaScript
function getElementsByClassName2(className) {
  const nodes = []

  function crawl(node) {
    // contains() checks for an exact class match (a plain substring test would also match e.g. "classNameFoo")
    if (node.classList && node.classList.contains(className)) {
      nodes.push(node)
    }
    node.childNodes.forEach((child) => crawl(child))
  }

  // Assumed completion (the preview cuts off here): walk the whole document and return the matches
  crawl(document.documentElement)
  return nodes
}
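A quick usage sketch, not part of the gist, assuming a browser page containing elements with a hypothetical class 'highlight':

// Compare the hand-rolled crawl with the built-in lookup
const custom = getElementsByClassName2('highlight')
const builtIn = Array.from(document.getElementsByClassName('highlight'))
console.log(custom.length === builtIn.length) // true when the crawl finds the same elements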
@tpgmartin
tpgmartin / flattenArray.js
Created May 2, 2017 01:39
Flatten array sample assuming input will only ever contain nested arrays of numbers
function flatten(arr, flat) {
  // Reuse the accumulator on recursive calls; start with a fresh array on the first call
  const output = flat || []
  arr.forEach((el) => {
    if (typeof el === 'number') {
      output.push(el)
    } else {
      flatten(el, output)
    }
  })
  // Assumed completion (the preview cuts off here)
  return output
}
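A short usage example, not part of the gist, under the stated assumption that the input only nests arrays of numbers:

console.log(flatten([1, [2, [3, 4]], 5])) // [ 1, 2, 3, 4, 5 ]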
@tpgmartin
tpgmartin / feedforward.js
Last active May 16, 2017 21:49
Basic feedforward neural network in JavaScript
class Network {
  constructor(neuronsPerLayer = []) {
    this.layers = neuronsPerLayer.length
    this.neuronsPerLayer = neuronsPerLayer
    this.biases = []  // call initialiseValues()
    this.weights = [] // call initialiseValues()
  }

  // Train network with SGD
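The preview stops before the initialisation and training code. As a rough sketch only (the layout below is an assumption, not the gist's actual initialiseValues), biases and weights for this layer structure could be seeded like so:

// Hypothetical initialiser mirroring the constructor's comments above:
// one bias vector per non-input layer, one weight matrix per pair of adjacent layers.
function initialiseValues(network) {
  const rand = () => Math.random() * 2 - 1 // assumed: small random values in [-1, 1)
  network.biases = network.neuronsPerLayer.slice(1)
    .map((n) => Array.from({ length: n }, rand))
  network.weights = network.neuronsPerLayer.slice(1)
    .map((n, i) => Array.from({ length: n }, () =>
      Array.from({ length: network.neuronsPerLayer[i] }, rand)))
}

// e.g. a 2-3-1 network: bias vectors of length 3 and 1, weight matrices of shape 3x2 and 1x3
const net = { neuronsPerLayer: [2, 3, 1], biases: [], weights: [] }
initialiseValues(net)
console.log(net.weights[0].length, net.weights[0][0].length) // 3 2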
@tpgmartin
tpgmartin / network.js
Last active July 2, 2017 15:18
Example feed forward network learning XOR
const nj = require('numjs')

// The activation function of choice: for a given input x it returns 0 if x < 0, otherwise x.
// This is used to find the activations of the hidden-layer nodes during forward propagation.
function relu(x) {
  return iterator(x, x => (x > 0) * x)
}

// The derivative of the activation function above, used during backpropagation and gradient descent
// to find the updates for the weights between the input and hidden-layer nodes.
function reluDeriv(x) {
  // Assumed completion (the preview cuts off here): ReLU's derivative is 1 for x > 0, otherwise 0
  return iterator(x, x => (x > 0) * 1)
}
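Both relu and reluDeriv lean on an iterator helper that the preview cuts off before showing. A guess at its shape, assuming 2-D numjs arrays (this is not the gist's own definition):

// Hypothetical elementwise helper: apply fn to every entry of a 2-D numjs array
function iterator(x, fn) {
  const out = x.clone()
  const [rows, cols] = out.shape
  for (let i = 0; i < rows; i++) {
    for (let j = 0; j < cols; j++) {
      out.set(i, j, fn(out.get(i, j)))
    }
  }
  return out
}

// e.g. reluDeriv(nj.array([[-1.0, 2.0]])) gives the elementwise 0/1 mask [[0, 1]]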
@tpgmartin
tpgmartin / blockchain.js
Created July 20, 2017 13:06
Simple blockchain implementation in JavaScript
const crypto = require('crypto')
class Block {
  constructor(index, timestamp, data, previousHash) {
    this.index = index
    this.timestamp = timestamp
    this.data = data
    this.previousHash = previousHash
    this.hash = this.hashBlock()
  }

  // Assumed completion (the preview cuts off here): SHA-256 over the block's contents
  hashBlock() {
    const payload = `${this.index}${this.timestamp}${JSON.stringify(this.data)}${this.previousHash}`
    return crypto.createHash('sha256').update(payload).digest('hex')
  }
}
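A brief usage sketch, not from the gist: chaining two blocks by handing each new block the previous block's hash.

// Hypothetical usage of the Block class above
const genesis = new Block(0, Date.now(), { amount: 0 }, '0')
const next = new Block(1, Date.now(), { amount: 42 }, genesis.hash)
console.log(next.previousHash === genesis.hash) // true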
@tpgmartin
tpgmartin / frame.scala
Last active July 22, 2017 13:00
Example of finding mean of multiple features in frame object grouped by custom index
import org.saddle._

object Main {
  def main(args: Array[String]) {
    // Requires an *.sbt file including the Saddle dependency. Run from the command line with `sbt run`.
    val height: Series[String, Double] = Series("Male" -> 6.0, "Male" -> 5.92, "Male" -> 5.58, "Male" -> 5.92,
      "Female" -> 5.0, "Female" -> 5.5, "Female" -> 5.42, "Female" -> 5.75)
    val weight: Series[String, Double] = Series("Male" -> 180.0, "Male" -> 190.0, "Male" -> 170.0, "Male" -> 165.0,
      "Female" -> 100.0, "Female" -> 150.0, "Female" -> 130.0, "Female" -> 150.0)
@tpgmartin
tpgmartin / simple_knn_classifier.py
Last active November 14, 2017 08:59
Implementation of KNN classifier from scratch using Euclidean distance metric
# main.py
from scipy.spatial import distance
from collections import Counter

class KNN():
    def __init__(self, n_neighbors=1):
        self.n_neighbors = n_neighbors

    def fit(self, X_train, y_train):
        # Assumed completion (the preview cuts off here): keep the training set for lookup at predict time
        self.X_train = X_train
        self.y_train = y_train
@tpgmartin
tpgmartin / keybase.md
Created November 28, 2017 03:57
keybase.md

Keybase proof

I hereby claim:

  • I am tpgmartin on github.
  • I am tpgmartin (https://keybase.io/tpgmartin) on keybase.
  • I have a public key whose fingerprint is CF18 67A4 E835 30AE B77E A373 02C9 9A77 4754 85B0

To claim this, I am signing this object: