kdb/q neural network on XOR
/ An introduction to neural networks with kdb+
/ by James Neill
/ https://code.kx.com/q/wp/neural-networks/
/ shape of a matrix: (rows;columns)
shape:{(count x;count last x)}
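// Illustrative check (not in the original whitepaper): rows and columns of a 2x3 matrix
show shape (1 2 3f;4 5 6f)                 / 2 3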
// Initialise an x-by-y weight matrix: uniform random values between 0 and 1,
// with each column re-centred so that it averages zero
wInit:{
  // If only one input neuron is detected exit
  // This is most likely due to a missing bias neuron
  if[1=x;:"Number of input neurons must be greater than 1."];
  flip flip[r]-avg r:{[x;y]x?1.0}[y]each til x
  }
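// Illustrative check (assumed usage, not from the whitepaper): a 3x4 weight
// matrix whose column means are (numerically) zero
show shape wInit[3;4]                      / 3 4
show avg wInit[3;4]                        / close to 0 0 0 0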
// activation function
sigmoid:{1%1+exp neg x}
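// Illustrative check: sigmoid squashes any real number into the interval (0;1)
show sigmoid -5 0 5f                       / approx 0.0067 0.5 0.9933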
// ffn arguments:
// x  - the input data set with bias node
// y  - known outputs corresponding to inputs
// lr - learning rate 'alpha'
// d  - dictionary with 3 items:
//      `o - output of the network
//      `w - weights between input and hidden layers
//      `v - weights between hidden and output layers
ffn:{[x;y;lr;d]
  // Feed forward: hidden-layer outputs, with a bias node prepended to each row
  z:1.0,/:sigmoid[x mmu d`w];
  // Output-layer activations
  o:sigmoid[z mmu d`v];
  // Error of output neurons
  deltaO:y-o;
  // Error of hidden neurons; $ on two float matrices is matrix multiplication
  // (equivalent to mmu), and 1_/: drops the bias column
  deltaZ:1_/:$[deltaO;flip d`v]*z*1-z;
  // Return dictionary of outputs and updated weights
  `o`v`w!(o;d[`v]+lr*flip[z] mmu deltaO;d[`w]+lr*flip[x] mmu deltaZ)
  }
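// Illustrative single training step on hypothetical toy data (one sample with
// two features plus a bias, two hidden neurons, one output) - not part of the
// original whitepaper
toyX:enlist 0 1 1f                         / 1x3 input row
toyD:`o`w`v!(();wInit[3;2];wInit[3;1])     / 3x2 input->hidden, 3x1 hidden(+bias)->output
show ffn[toyX;enlist 1f;0.1;toyD]          / dictionary of output and updated weights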
// Inputs and expected target values for XOR problem
inputs:((0 0f);(0 1f);(1 0f);(1 1f))
targets:0 1 1 0f
// Add a bias neuron to each input
inputs:inputs,'1.0
/ targets:targets,1.0
// Initialise the weights:
// w is 3x4 - 3 input neurons (2 features + bias) to 4 hidden neurons
//   (here the hidden-layer size is taken from the number of training rows)
// v is 5x1 - 4 hidden neurons + bias to 1 output neuron
w:wInit . reverse shape inputs
v:wInit . (1 0)+shape targets
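// Illustrative shape check (assumed usage, not from the whitepaper)
show shape each (w;v)                      / (3 4;5 1)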
// Train: apply ffn 10000 times with learning rate 0.1, starting from the
// initial weights and a placeholder output
finalResult:(ffn[inputs;targets;0.1]/)[10000;`o`w`v!(0,();w;v)]
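// Illustrative check (not part of the original whitepaper): after training,
// the outputs should sit close to the XOR targets 0 1 1 0; exact values
// depend on the random weight initialisation
show raze finalResult`o                    / values driven towards 0 1 1 0
show targets=floor 0.5+raze finalResult`o  / 1111b when every pattern is learned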