Ben Bolte codekansas

🏠
Working from home
View GitHub Profile
thickness = 3;
padding = 1;
slat_size = 3;
short_length = 46.36;
long_length = 61.76;
height = 51.76;
first_indent = 15.92;
second_indent = 29.97;
codekansas / maximum_noise_entropy.py
Last active February 21, 2019 08:41
Maximum Noise Entropy implementation using TensorFlow
"""Implementation of maximum noise entropy using TensorFlow.
Paper: http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1002249
"""
# For Python 3 compatibility.
from __future__ import print_function
# For building the algorithm.
import tensorflow as tf
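The gist implements the second-order maximum noise entropy (MNE) model from the linked paper, in which the spike probability is a logistic function of a quadratic form of the stimulus. A minimal NumPy sketch of that response model (function and parameter names are mine, not from the gist):

```python
import numpy as np

def mne_spike_prob(s, a, h, J):
    """Second-order MNE response model:
    P(spike | s) = 1 / (1 + exp(a + h.s + s^T J s)),
    where a is a scalar offset, h a linear filter, and J a
    quadratic interaction matrix over the stimulus s."""
    return 1.0 / (1.0 + np.exp(a + h @ s + s @ J @ s))
```

With all parameters at zero the model is maximally uncertain, predicting a spike probability of exactly 0.5.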
codekansas / binarized_nn_inference.cpp
Created November 1, 2017 02:25
Efficient binarized neural network inference
/* Binarized neural network inference example.
This shows a simple C++ program for doing inference on
binarized neural networks. To do this efficiently, the code
below makes use of the "bitset" class, whose count() method
compiles down to the "popcnt" instruction, counting the 1's
in a word with a single instruction. Each dot product then
reduces to an XOR followed by a popcount, so a matrix
multiplication between an (A, B) and a (B, C) matrix takes
O(A * C) word operations; in other words, each value in the
output matrix is computed in effectively constant time.
*/
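The popcount trick in the comment above is easy to illustrate outside C++. Packing a vector of ±1 values into the bits of an integer (1 → +1, 0 → −1), the dot product of two such length-n vectors is n minus twice the number of differing bits. A Python sketch (the function name is mine):

```python
def binary_dot(a_bits: int, b_bits: int, n: int) -> int:
    """Dot product of two length-n vectors with entries in {-1, +1},
    packed as bits (1 -> +1, 0 -> -1). Matching bits contribute +1,
    differing bits contribute -1, so dot = n - 2 * popcount(a XOR b)."""
    return n - 2 * bin(a_bits ^ b_bits).count("1")
```

For example, [+1, +1, -1] (0b110) against [+1, -1, -1] (0b100) differs in one position, giving 3 − 2·1 = 1.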
codekansas / freeway.py
Created August 13, 2016 01:26
More general version of the Highway Network
from keras.engine import InputSpec
from keras.layers import Dense
from keras.layers.wrappers import Wrapper, TimeDistributed
class Freeway(Wrapper):
    def __init__(self, layer, gate=None, **kwargs):
        self.supports_masking = True
        self.gate = gate
        super(Freeway, self).__init__(layer, **kwargs)
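The highway computation this wrapper generalizes interpolates between a transformed input and the raw input via a learned gate: y = t ⊙ h(x) + (1 − t) ⊙ x. A NumPy sketch of that combination (weight names and shapes are my assumptions, not the gist's API):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway(x, W_h, b_h, W_t, b_t):
    """Highway combination: the gate t in (0, 1) interpolates
    between the transformed input h(x) and the raw input x."""
    h = np.tanh(x @ W_h + b_h)    # transform path
    t = sigmoid(x @ W_t + b_t)    # transform gate
    return t * h + (1.0 - t) * x  # carry gate is 1 - t
```

With a strongly negative gate bias the layer passes its input through almost unchanged, which is why highway networks are easy to train at depth.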
codekansas / bert_pytorch.py
Created November 1, 2018 10:25
Implementation of the transformer block used by BERT
#!/usr/bin/env python3
"""Implementation of the transformer block used by BERT.
I saw an excellent implementation of the complete BERT model here:
https://github.com/codertimo/BERT-pytorch
I re-wrote a simplified version of the transformer block below. This was mainly
for my own understanding (so that I could get a grasp of the dimensions and
how the whole attention mechanism works), but I tried to document it pretty
thoroughly so that other people can understand it without having to go too far
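The core of any transformer block is scaled dot-product attention, softmax(QKᵀ/√d_k)V. A single-head NumPy sketch of just that piece (not the gist's full block):

```python
import numpy as np

def scaled_dot_attention(Q, K, V):
    """Single-head scaled dot-product attention:
    softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

When every key is identical the softmax is uniform and the output is simply the mean of the value rows, a handy sanity check on the dimensions.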
codekansas / keras_gensim_embeddings.py
Last active July 23, 2018 09:17
Using Word2Vec embeddings in Keras models
from __future__ import print_function
import json
import os
import numpy as np
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess
from keras.engine import Input
from keras.layers import Embedding, merge
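The usual recipe for wiring Word2Vec vectors into a Keras `Embedding` layer is to build a weight matrix indexed by the tokenizer's word indices. A sketch of that step (function name and the padding-row convention are my assumptions; `word_vectors` can be a gensim `KeyedVectors` or any word-to-vector mapping):

```python
import numpy as np

def build_embedding_matrix(word_vectors, word_index, dim):
    """Build an (n_words + 1, dim) matrix whose row i holds the
    pretrained vector for the word with index i. Row 0 is reserved
    for padding; words missing from word_vectors stay at zero."""
    matrix = np.zeros((len(word_index) + 1, dim))
    for word, i in word_index.items():
        if word in word_vectors:
            matrix[i] = word_vectors[word]
    return matrix
```

The resulting matrix is then passed as `weights=[matrix]` to the `Embedding` layer, typically with training frozen so the pretrained vectors are preserved.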
codekansas / .vimrc
Last active May 17, 2018 00:07
vimrc file that i like to use
execute pathogen#infect()
colorscheme badwolf
" turns of syntax highlighting
syntax enable
" use spaces not tabs
set tabstop=8 softtabstop=0 expandtab shiftwidth=2 smarttab
" show line numbers
#!/usr/bin/env python
"""The training script for the DANN model."""
from __future__ import division
from __future__ import print_function
import csv
import os
import itertools
import sys
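The distinguishing piece of a DANN (domain-adversarial neural network) is the gradient reversal layer sitting between the feature extractor and the domain classifier: an identity on the forward pass that flips (and scales) gradients on the backward pass. A minimal manual-backprop sketch of that idea, not the gist's actual implementation:

```python
import numpy as np

class GradientReversal:
    """Gradient reversal layer: identity on the forward pass,
    multiplies incoming gradients by -lam on the backward pass,
    so the feature extractor learns to *confuse* the domain
    classifier while the classifier itself trains normally."""

    def __init__(self, lam=1.0):
        self.lam = lam

    def forward(self, x):
        return x  # pass features through unchanged

    def backward(self, grad_output):
        return -self.lam * grad_output  # reversed, scaled gradient
```

In an autograd framework this is implemented as a custom op whose backward rule is exactly the `backward` method above.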
codekansas / ballpark.py
Last active January 16, 2018 10:56
Adding noise to gradients as a regularizer
from keras.optimizers import SGD, Adagrad, RMSprop, Adadelta, Adam, Adamax
import keras.backend as K
DEFAULT_NOISE = 0.05
def ballpark_gradient(gradient, noise):
    return [g * K.random_normal(shape=K.shape(g), mean=1.0, stddev=noise) for g in gradient]
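The same multiplicative-noise regularizer can be written without Keras. A NumPy sketch (the function name is mine): each gradient is scaled elementwise by Gaussian noise with mean 1, so on average the update is unchanged but individual steps are jittered.

```python
import numpy as np

def ballpark_gradient_np(grads, noise, rng=None):
    """Multiply each gradient elementwise by Gaussian noise with
    mean 1.0 and standard deviation `noise`; with noise=0 the
    gradients pass through unchanged."""
    rng = np.random.default_rng() if rng is None else rng
    return [g * rng.normal(1.0, noise, size=g.shape) for g in grads]
```

Setting `noise=0` recovers plain SGD, which makes the regularizer easy to sanity-check and to anneal over training.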
codekansas / factors.scm
Created September 18, 2017 19:52
Program for finding all the factors of a number in Scheme.
; Finds all factors of a number in O(sqrt n) time.
(define (factors n)
  (define (@factors n i a)
    (cond ((= (modulo n i) 0) (@factors (quotient n i) i (cons i a)))
          ((>= (* i i) n) (if (= 1 n) a (cons n a)))
          (else (@factors n (+ i 1) a))))
  (@factors n 2 `()))
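The same O(√n) trial-division factorization reads naturally in Python too (the function name is mine; unlike the Scheme version, which conses factors onto the front of the list, this one returns them in ascending order):

```python
def factorize(n):
    """Prime factorization of n by trial division up to sqrt(n).
    Any divisor found below sqrt(n) is prime, and whatever remains
    above sqrt(n) at the end is itself prime."""
    factors, i = [], 2
    while i * i <= n:
        while n % i == 0:
            factors.append(i)
            n //= i
        i += 1
    if n > 1:
        factors.append(n)  # leftover prime factor > sqrt(original n)
    return factors
```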
; Multiplies all the elements in a list.
(define (mult l)