Ben Bolte codekansas

@codekansas
codekansas / .vimrc
Last active May 17, 2018 00:07
vimrc file that i like to use
execute pathogen#infect()
colorscheme badwolf
" turn on syntax highlighting
syntax enable
" use spaces not tabs
set tabstop=8 softtabstop=0 expandtab shiftwidth=2 smarttab
" show line numbers
@codekansas
codekansas / keras_gensim_embeddings.py
Last active July 23, 2018 09:17
Using Word2Vec embeddings in Keras models
from __future__ import print_function
import json
import os
import numpy as np
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess
from keras.engine import Input
from keras.layers import Embedding, merge
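The preview cuts off before the interesting part, but the core idea of the gist, copying trained word vectors into a fixed weight matrix for an `Embedding` layer, can be sketched without Keras at all. The names and toy vectors below are illustrative, not from the gist:

```python
import numpy as np

# Toy word vectors standing in for a trained gensim Word2Vec model.
word_vectors = {
    "cat": np.array([0.1, 0.2, 0.3]),
    "dog": np.array([0.2, 0.1, 0.0]),
}

def build_embedding_matrix(word_vectors, word_index, dim):
    """Row i of the result holds the vector for the word with index i.

    Index 0 is reserved for padding/unknown words and stays all-zero,
    matching the usual Keras Embedding convention.
    """
    matrix = np.zeros((len(word_index) + 1, dim))
    for word, idx in word_index.items():
        vec = word_vectors.get(word)
        if vec is not None:
            matrix[idx] = vec
    return matrix

word_index = {"cat": 1, "dog": 2}
embedding_matrix = build_embedding_matrix(word_vectors, word_index, dim=3)
# In (older) Keras this matrix would then be passed as
# Embedding(..., weights=[embedding_matrix], trainable=False).
```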
@codekansas
codekansas / bert_pytorch.py
Created November 1, 2018 10:25
Implementation of the transformer block used by BERT
#!/usr/bin/env python3
"""Implementation of the transformer block used by BERT.
I saw an excellent implementation of the complete BERT model here:
https://github.com/codertimo/BERT-pytorch
I re-wrote a simplified version of the transformer block below. This was mainly
for my own understanding (so that I could get a grasp of the dimensions and
how the whole attention mechanism works), but I tried to document it pretty
thoroughly so that other people can understand it without having to go too far
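The dimension-juggling the docstring mentions is easiest to see in the attention core itself. A minimal single-head scaled dot-product attention in NumPy (a sketch of the mechanism, not the gist's PyTorch code):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Scaled dot-product attention for single-head (seq, d_k) inputs."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)     # pairwise query/key similarities
    weights = softmax(scores, axis=-1)  # each row is a distribution over keys
    return weights @ v                  # weighted average of the values

# With all-zero queries the weights are uniform, so every output row
# is just the mean of the value rows.
q = np.zeros((3, 2))
v = np.arange(6, dtype=float).reshape(3, 2)
out = attention(q, v, v)
```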
@codekansas
codekansas / freeway.py
Created August 13, 2016 01:26
More general version of the Highway Network
from keras.engine import InputSpec
from keras.layers import Dense
from keras.layers.wrappers import Wrapper, TimeDistributed
class Freeway(Wrapper):
def __init__(self, layer, gate=None, **kwargs):
self.supports_masking = True
self.gate = gate
super(Freeway, self).__init__(layer, **kwargs)
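The transform the wrapper generalizes is the standard highway mix `y = g * H(x) + (1 - g) * x`, where the gate `g` decides how much of the raw input to carry through. A NumPy sketch of that equation (illustrative names, not the gist's Keras code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway(x, W_h, b_h, W_g, b_g):
    """Highway transform: a learned gate mixes transformed and raw input.

    h = tanh(x @ W_h + b_h)     # candidate transformation
    g = sigmoid(x @ W_g + b_g)  # gate in (0, 1)
    y = g * h + (1 - g) * x     # carry the input through where g is small
    """
    h = np.tanh(x @ W_h + b_h)
    g = sigmoid(x @ W_g + b_g)
    return g * h + (1.0 - g) * x

# A strongly negative gate bias keeps the layer close to the identity,
# the usual initialization trick for deep highway stacks.
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4))
W = rng.normal(size=(4, 4))
y = highway(x, W, np.zeros(4), np.zeros((4, 4)), np.full(4, -50.0))
```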
@codekansas
codekansas / binarized_nn_inference.cpp
Created November 1, 2017 02:25
Efficient binarized neural network inference
/* Binarized neural network inference example.
This shows a simple C++ program for doing inference on
binarized neural networks. To do this efficiently, the code
below makes use of the "bitset" class, which uses the "popcnt"
instruction to count the number of 1's that show up in the
matrix product, in a single instruction per machine word. This
means that a matrix multiplication between an (A, B) matrix and a
(B, C) matrix takes O(A * C) time when B fits in a constant number
of machine words; each output value is computed in constant time.
*/
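The popcount trick translates directly to Python. Packing a +/-1 vector into the bits of an integer, the dot product reduces to one XOR and one popcount, since matching bits contribute +1 and mismatched bits -1:

```python
def pack_bits(signs):
    """Pack a +/-1 vector into an int: bit i is set iff signs[i] == +1."""
    word = 0
    for i, s in enumerate(signs):
        if s == 1:
            word |= 1 << i
    return word

def binarized_dot(a_bits, b_bits, n):
    """Dot product of two packed +/-1 vectors of length n.

    dot = (matches) - (mismatches) = n - 2 * popcount(a XOR b).
    """
    mismatches = bin(a_bits ^ b_bits).count("1")
    return n - 2 * mismatches

a = [1, -1, 1, 1]
b = [1, 1, -1, 1]
# Plain dot product is 1 - 1 - 1 + 1 = 0.
assert binarized_dot(pack_bits(a), pack_bits(b), len(a)) == 0
```

(`bin(x).count("1")` works on every Python 3; on 3.10+ `int.bit_count()` does the same thing faster.)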
@codekansas
codekansas / maximum_noise_entropy.py
Last active February 21, 2019 08:41
Maximum Noise Entropy implementation using TensorFlow
"""Implementation of maximum noise entropy using TensorFlow.
Paper: http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1002249
"""
# For Python 3 compatibility.
from __future__ import print_function
# For building the algorithm.
import tensorflow as tf
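The model being fit is a logistic function of linear and quadratic stimulus features. A NumPy sketch of that response function (one common sign convention; write-ups of MNE differ on signs, and the parameter names here are mine, not the gist's):

```python
import numpy as np

def mne_spike_prob(s, a, h, J):
    """Second-order MNE response model: logistic in linear + quadratic
    stimulus features.

    P(spike | s) = 1 / (1 + exp(-(a + h @ s + s @ J @ s)))
    """
    return 1.0 / (1.0 + np.exp(-(a + h @ s + s @ J @ s)))

# With all parameters zero the model is maximally uncertain: P = 0.5.
s = np.array([1.0, -1.0, 0.5])
p = mne_spike_prob(s, 0.0, np.zeros(3), np.zeros((3, 3)))
```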
thickness = 3;
padding = 1;
slat_size = 3;
short_length = 46.36;
long_length = 61.76;
height = 51.76;
first_indent = 15.92;
second_indent = 29.97;
#!/usr/bin/env python3
"""Problem statement:
There is a teacher and 2 students in a classroom. The students are A and B.
The teacher thinks of 2 positive integers and tells the sum of those numbers
to student A without student B hearing it. Then tells their product to student
B without student A hearing it. After this, the teacher asks the 2 students
what the 2 numbers were.
First, student A says: I don't know.
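The preview stops here, but the first deduction already enumerates cleanly: A (who knows only the sum) can say "I don't know" only when the sum is consistent with more than one pair. A short brute-force sketch of just that step (my own illustration, not from the gist):

```python
def pairs_with_sum(s):
    """Unordered pairs of positive integers (x, y), x <= y, with x + y == s."""
    return [(x, s - x) for x in range(1, s // 2 + 1)]

# "I don't know" means the sum admits more than one pair, which rules out
# sums 2 and 3 -- the only sums with a unique decomposition.
ambiguous = [s for s in range(2, 12) if len(pairs_with_sum(s)) > 1]
```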
/*
A combination for a lock has 3 wheels, X, Y, and Z, each of which
can be set to eight different positions. The lock is broken and when
any two wheels of the lock are in the correct position, the lock
will open. Thus, anyone can open the lock after 64 tries (let X and
Y run through all possible combinations). However, the lock can be
opened in fewer tries! What is the minimum number of tries that can
be guaranteed to open the lock?
*/
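A script can at least verify an upper bound (it does not prove minimality). One known construction: within each half of the positions, {0..3} and {4..7}, try every triple whose coordinates sum to 0 mod 4. Any secret has two wheels in the same half by pigeonhole, and the try that fixes those two wheels and completes the mod-4 sum matches both. This is my own illustration of the construction, not the gist's code:

```python
from itertools import product

def covering_tries():
    """32 tries: for each half of the 8 positions, the 16 triples from that
    half whose coordinates sum to 0 mod 4."""
    tries = []
    for base in (0, 4):
        for x, y in product(range(4), repeat=2):
            z = (-x - y) % 4
            tries.append((base + x, base + y, base + z))
    return tries

def opens(secret, tries):
    """The broken lock opens when any try matches at least 2 wheels."""
    return any(sum(a == b for a, b in zip(t, secret)) >= 2 for t in tries)

tries = covering_tries()
# Exhaustively check all 8^3 secrets against the 32 tries.
assert all(opens(s, tries) for s in product(range(8), repeat=3))
```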
// Total dimensions.
height = 120;
width = 40;
padding = 10;
module col(h, sph=false) {
difference() {
cube([width, width, h]);
if (sph)
translate([width / 2, width / 2, h])