
@jpjacobs
jpjacobs / nn.ijs
Last active Aug 8, 2022
JPlayground Neural Network
NB. Neural Network Demo
NB. (roughly based on chapters 1-3 of http://neuralnetworksanddeeplearning.com/ by Michael Nielsen)
NB. The main differences from the book are:
NB. - the removal of the minibatch function and loops in favor of matrix functions
NB. - joining bias and weight calculations.
NB. *** Preparations ***
NB. install addons needed from github for reading CSV files
{{install^:(-.fexist'~addons/tables/',y)'github:jsoftware/tables_',y}}&>;:'dsv csv'
NB. require the needed addons
jpjacobs / knn.pi
Last active Aug 29, 2015
Picat machine learning
% Machine learning toolbox
module ml.
import util.
% calculates only half of the distances: symmetry
pdist(M) = M3 =>
M3 = new_array(M.length, M.length),
    foreach(I in 1..M.length, J in I..M.length)
        M3[I,J] = sum([(M[I,K]-M[J,K])**2 : K in 1..M[1].length]),
        M3[J,I] = M3[I,J]
    end.
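The Picat function above fills a symmetric matrix of squared Euclidean distances between all row pairs, computing each distance once and mirroring it. An equivalent NumPy sketch (illustrative translation, not code from the gist):

```python
import numpy as np

def pdist_sq(M):
    # Squared Euclidean distances between all row pairs of M.
    # Broadcasting replaces the explicit double loop; the result is
    # symmetric with a zero diagonal, like pdist/1 above.
    diff = M[:, None, :] - M[None, :, :]
    return (diff ** 2).sum(axis=2)

M = np.array([[0.0, 0.0], [3.0, 4.0]])
D = pdist_sq(M)   # D[0,1] == 25.0, symmetric, zero diagonal
```

Broadcasting computes both halves, so it trades the symmetry saving for vectorised speed.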
knn.ijs
NB. knn graph
knn =: 4 : 0
d =. (+/@:*:@:-)"1/~ y NB. distances between vectors in y
n =. x ({. }.@/:)"1 d NB. get nearest neighbors (exclude self)
W =. n (_"_)`(<@<@<@[)`]}"1 d NB. set all non-neighbors to _
W =. (<.|:) W NB. make symmetric
)
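The J verb builds a k-nearest-neighbour distance graph: grade each row of the distance table, drop the point itself, keep the x nearest, set every non-neighbour entry to infinity, and symmetrise with a minimum against the transpose. A NumPy sketch of my reading of those steps (not code from the gist):

```python
import numpy as np

def knn_graph(k, Y):
    # Squared distances between all rows of Y.
    d = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
    # Indices of the k nearest neighbours per row, excluding self
    # (column 0 of the argsort is the point itself, distance 0).
    n = np.argsort(d, axis=1)[:, 1:k + 1]
    # Set all non-neighbour entries to infinity, like the amend with _.
    W = np.full_like(d, np.inf)
    rows = np.arange(len(Y))[:, None]
    W[rows, n] = d[rows, n]
    # Symmetrise: keep an edge if either endpoint selected it (min with transpose).
    return np.minimum(W, W.T)
```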
plotgraph.ijs
NB. Make square grid of side y
grid =: |:@:(,"2)@(,: |:)@(#"0 i.)
plotgraph =: 4 : 0 NB. todo: implement autoplacement of locations not given
pd 'reset'
pd 'type line;pensize 1;color blue'
edges =. ; (i.#x) ;/@,.&({&x)"0 &.> y
edges =. <"1 ~.@:(({~"1 /:@{.)"2) > edges NB. normalize: sort points on ascending x, eliminate double edges.
edges =. (#~ -.@({.&>-:{:&>)"1) edges NB. eliminate loops. Empties automagically disappear.
pd"1 edges
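The normalisation step above canonicalises each edge, drops duplicates, and removes self-loops before plotting (the J version sorts on the endpoints' coordinates; the sketch below sorts on vertex indices instead, which has the same dedup effect). A hypothetical Python equivalent of just that clean-up, names mine:

```python
def normalize_edges(edges):
    # Order each edge's endpoints so (a, b) and (b, a) compare equal,
    # drop duplicate edges, and remove self-loops.
    seen = set()
    out = []
    for a, b in edges:
        e = (a, b) if a <= b else (b, a)
        if e[0] != e[1] and e not in seen:
            seen.add(e)
            out.append(e)
    return out
```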
jpjacobs / som.ijs
Last active Feb 29, 2016
Kohonen SOM
NB. Self-Organising map
require 'math/lapack/geev'
require 'tables/csv'
require 'viewmat'
NB. closeall_jviewmat_ ''
NB. parameters:
N =: 30 NB. side of neuron map. (N -> (*:N) neurons)
ntrain =: 25 NB. number of training samples per class
niter =: 10000 NB. number of iterations
'a0 an' =: 25 1 NB. Starting and ending learning factor
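The snippet is cut off before the training loop, but a0 and an suggest a learning factor decayed from 25 down to 1 over niter iterations. One common SOM schedule is exponential interpolation; this is an assumption about the missing code, not taken from the gist:

```python
def learn_rate(t, a0=25.0, an=1.0, niter=10000):
    # Exponentially interpolate the learning factor from a0 at t=0
    # down to an at t=niter-1 (a common SOM schedule; assumed here).
    return a0 * (an / a0) ** (t / (niter - 1))
```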
fastica
NB. Fast ICA
NB. http://en.wikipedia.org/wiki/FastICA
require 'math/lapack math/lapack/geev'
require 'plot'
center =: -"1 +/%# NB. subtract column means
mp =: +/ .* NB. matrix product
cov =: (mp"2 1)~&.|: ] % <:^:(1&<)@# NB. Covariance (note, does NOT center)
I =: =@:i. NB. unit matrix
clean =: * |@* NB. clean near-to-zero values
whiten =: (mp ([ mp (I@# % %:)@] mp |:@[)&>/@(2b110&geev_jlapack_)@cov)@center
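The whiten verb centers the data, eigendecomposes the covariance (via geev), and rescales by the inverse square roots of the eigenvalues, so the whitened data has identity covariance. A NumPy sketch of the same pipeline (using eigh rather than a general geev, since the covariance is symmetric):

```python
import numpy as np

def whiten(X):
    # Center columns, then rotate/scale so the sample covariance
    # of the result is the identity (ZCA-style whitening).
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(Xc) - 1)
    vals, vecs = np.linalg.eigh(cov)          # symmetric: eigh suffices
    W = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    return Xc @ W
```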
gist:d0e92be3dfed7ed2df92
NB. Signal Processing in J
NB. ========================
NB. Mainly written as exercise, so no guarantees
NB. cocurrent <'jimproc'
NB. Assumptions: a 1-cell is 1 sample of dimensionality $ samp
NB. a set of samples is a 2-cell
NB. Monad mean y
NB. Mean of data samples in y
mean =: +/ % #
gist:63292d6d5b7ac34ceba3
NB. Listen to a UDP stream (e.g. a phone IMU)
load'socket'
load'graph'
(;:'jsocket jdefs z') copath 'base'
sk =: 0 pick sdcheck sdsocket AF_INET_jdefs_,SOCK_DGRAM_jdefs_,0
sdcheck sdbind sk;AF_INET;'';5555
NB. sensor_number check socket
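The J code opens a UDP datagram socket and binds it to port 5555 on all interfaces, presumably to receive and plot incoming IMU samples. A minimal Python equivalent of just the socket setup (the receive call is sketched as a comment so the snippet does not block):

```python
import socket

# Open a UDP socket and bind it to all interfaces on port 5555,
# mirroring sdsocket AF_INET,SOCK_DGRAM and sdbind above.
sk = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sk.bind(("", 5555))

# data, addr = sk.recvfrom(4096)  # one datagram, e.g. a line of IMU readings
```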