Jacob Kahn (jacobkahn)

syhw / dnn_compare_optims.py
Created July 16, 2014 19:21
to compare plain SGD vs. SAG vs. Adagrad vs. Adadelta
"""
A deep neural network with or w/o dropout in one file.
"""
import sys
import math
import numpy
import theano
from theano import tensor as T  # symbolic tensor operations
from theano import shared       # shared variables for model parameters
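
For reference, the update rules being compared look roughly like the following minimal numpy sketch of plain SGD, Adagrad, and Adadelta (SAG is omitted, since it additionally keeps a per-example gradient memory). The function names and hyperparameter defaults are illustrative and not taken from the gist's Theano code.

import numpy as np

def sgd_step(w, g, lr=0.01):
    # Plain SGD: step against the current gradient with a fixed learning rate.
    return w - lr * g

def adagrad_step(w, g, accum, lr=0.01, eps=1e-6):
    # Adagrad: per-parameter step size shrinks as squared gradients accumulate.
    accum = accum + g ** 2
    return w - lr * g / (np.sqrt(accum) + eps), accum

def adadelta_step(w, g, acc_g, acc_dx, rho=0.95, eps=1e-6):
    # Adadelta: decaying averages of squared gradients and squared updates,
    # so no global learning rate is needed.
    acc_g = rho * acc_g + (1 - rho) * g ** 2
    dx = -np.sqrt(acc_dx + eps) / np.sqrt(acc_g + eps) * g
    acc_dx = rho * acc_dx + (1 - rho) * dx ** 2
    return w + dx, acc_g, acc_dx

Each step returns the updated weights along with any accumulator state, which the caller carries between iterations.
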
staltz / introrx.md
Last active July 25, 2024 13:33
The introduction to Reactive Programming you've been missing
jboner / latency.txt
Last active July 25, 2024 11:30
Latency Numbers Every Programmer Should Know
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                           0.5 ns
Branch mispredict                            5   ns
L2 cache reference                           7   ns                    14x L1 cache
Mutex lock/unlock                           25   ns
Main memory reference                      100   ns                    20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy             3,000   ns        3 us
Send 1K bytes over 1 Gbps network       10,000   ns       10 us
Read 4K randomly from SSD*             150,000   ns      150 us        ~1GB/sec SSD