@eamartin
eamartin / notebook.ipynb
Last active November 6, 2022 18:53
Understanding & Visualizing Self-Normalizing Neural Networks
@eamartin
eamartin / losses.txt
Last active September 14, 2016 03:54
Binary classification losses
Consider a binary classification problem with target t and output probability p.
Further, consider the batched version of this problem, with targets t_i and outputs p_i.
Assume independence between all elements in the batch.
The likelihood is
prod_i p_i^t_i * (1-p_i)^(1-t_i) = prod_i (p_i if t_i == 1 else (1-p_i))
Rather than maximizing the likelihood directly, we maximize a monotonically increasing function of it,
such as the log likelihood. If the problem were convex, or if we knew optimization would end at a global
optimum, the maximizer of the log likelihood would be the same as the maximizer of the likelihood.
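In code, maximizing the log likelihood is usually phrased as minimizing the negative log likelihood
(binary cross-entropy). A minimal NumPy sketch of that loss, assuming t and p are arrays of targets and
predicted probabilities of the same shape (the eps clipping is an added numerical guard, not part of the
derivation above):

import numpy as np

def binary_nll(t, p, eps=1e-12):
    # Negative log likelihood of independent Bernoulli observations:
    # -sum_i [ t_i * log(p_i) + (1 - t_i) * log(1 - p_i) ]
    p = np.clip(p, eps, 1.0 - eps)  # avoid log(0)
    return -np.sum(t * np.log(p) + (1.0 - t) * np.log(1.0 - p))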
@eamartin
eamartin / gist:f73782a0adaf6ba4b9c6995f5673a3a1
Last active October 8, 2020 08:01
Straight-through estimator question
I recently read https://arxiv.org/abs/1308.3432, and want to make sure I'm understanding the
straight-through gradient estimator correctly. In general, I'm interested in conditional computation
and propagating gradients back through non-smooth functions (or discrete distributions).
My understanding:
Let HT(x) = int(x >= 0) be the hard threshold function. For forward propagation, use the hard threshold
function. For backward propagation, replace every instance of HT(x) with some G(x) that has non-zero
gradient on a set of positive measure and that approximates HT over the domain of the x's. For instance,
G can be the identity function when x lies in [0, 1], or otherwise the sigmoid function.
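To make that reading concrete, here is a minimal NumPy sketch of the forward/backward pair (the function
names and the specific choice of G as the identity on [0, 1] with zero gradient elsewhere are my own
illustration, not taken from the paper):

import numpy as np

def ht_forward(x):
    # Forward pass: hard threshold HT(x) = 1 if x >= 0 else 0, applied elementwise.
    return (x >= 0).astype(float)

def ht_backward_ste(x, grad_out):
    # Straight-through backward pass: differentiate the surrogate G instead of HT.
    # With G(x) = x on [0, 1], dG/dx = 1 there and 0 elsewhere, so the upstream
    # gradient passes through on [0, 1] and is zeroed outside it.
    return grad_out * ((x >= 0) & (x <= 1)).astype(float)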
@eamartin
eamartin / gist:d819cb3f63ed34649ab7
Created July 8, 2015 22:53
Qtile 0.10.0 pypi bug
[emartin@eric-Thinkpad:~] virtualenv --no-site-packages qtile_bug
New python executable in qtile_bug/bin/python
Installing setuptools............done.
Installing pip...............done.
[emartin@eric-Thinkpad:~] source qtile_bug/bin/activate
(qtile_bug)[emartin@eric-Thinkpad:~] pip install xcffib
Downloading/unpacking xcffib
Downloading xcffib-0.3.2.tar.gz (65kB): 65kB downloaded
@eamartin
eamartin / ipython_notebook
Last active August 29, 2015 14:02
American Options Pricing project
{
 "metadata": {
  "language": "Julia",
  "name": "",
  "signature": "sha256:52758898c2403685db8d857c76083099a23d7889718b8185d62e48903a88e488"
 },
 "nbformat": 3,
 "nbformat_minor": 0,
 "worksheets": [
  {
from scipy.integrate import simps

def predict_expectation(aaf, covariates):
    # Expected time-to-event: integrate the predicted survival curve over its time grid.
    sfs = aaf.predict_survival_function(covariates)
    xvals = sfs.index.values
    return simps(sfs.values, xvals, axis=0)
FAIL tests/filesys/extended/dir-vine
pass tests/filesys/extended/grow-create
pass tests/filesys/extended/grow-dir-lg
pass tests/filesys/extended/grow-file-size
pass tests/filesys/extended/grow-root-lg
pass tests/filesys/extended/grow-root-sm
FAIL tests/filesys/extended/grow-seq-lg
pass tests/filesys/extended/grow-seq-sm
FAIL tests/filesys/extended/grow-sparse
pass tests/filesys/extended/grow-tell
@eamartin
eamartin / gist:8573874
Created January 23, 2014 06:29
Image sources for problem 1 on 2014 CNS 186 set #1
Looming motion
looming_1: http://www.hypnobeast.com/spirals/Double_Spiral_1280x1024.jpg
looming_2: http://static3.wikia.nocookie.net/__cb20130317010810/fallout/images/9/95/-pictures.4ever.eu-_optical_illusion,_spiral_158747.jpg
Binary disparity
bindis_1: http://www.indiana.edu/~p1013447/images/juleszrd.gif
bindis_2: http://www.cns.nyu.edu/~msl/3d.gif
Texture gradient
tg_1: http://www.cmap.polytechnique.fr/~maureen/vasarely3.jpg
@eamartin
eamartin / gdb_bl_init
Created January 21, 2014 09:59
Useful scripts for pintos bootloader assignment
# Attach to QEMU's GDB stub and debug the 16-bit boot sector.
target remote localhost:1234
set architecture i8086
# Break at the address where the BIOS loads the boot sector.
break *0x7c00
# TUI views for disassembly and registers.
layout asm
layout reg
# On every stop, show seven instructions starting just before $pc.
display/7i $pc - 4
@eamartin
eamartin / td_qr.jl
Last active November 7, 2023 21:03
QR decomposition by Householder projection for tridiagonal matrices in Julia and Python.
function householder!(x)
x[1] = x[1] + sign(x[1]) .* norm(x)
x ./= norm(x);
end
function tridiag_qr(T)
Q = eye(size(T)...)
R = copy(T)
for i in 1:(size(R, 1) - 1)
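The preview above cuts off inside the elimination loop. Since the gist's description also mentions a
Python version, here is a minimal NumPy sketch of the same idea (2x2 Householder reflectors acting only
on rows i and i+1); it is my own reconstruction under those assumptions, not the gist's actual
continuation:

import numpy as np

def tridiag_qr(T):
    # QR factorization of a tridiagonal matrix via 2x2 Householder reflectors.
    T = np.asarray(T, dtype=float)
    n = T.shape[0]
    Q = np.eye(n)
    R = T.copy()
    for i in range(n - 1):
        # Householder vector that zeroes the subdiagonal entry R[i+1, i].
        v = R[i:i+2, i].copy()
        s = 1.0 if v[0] >= 0 else -1.0   # avoid the sign(0) = 0 edge case
        v[0] += s * np.linalg.norm(v)
        v /= np.linalg.norm(v)
        # H = I - 2 v v^T only touches rows i and i+1, and (for a tridiagonal
        # matrix) at most columns i..i+2; accumulate Q <- Q H on the right.
        cols = slice(i, min(i + 3, n))
        R[i:i+2, cols] -= 2.0 * np.outer(v, v @ R[i:i+2, cols])
        Q[:, i:i+2] -= 2.0 * np.outer(Q[:, i:i+2] @ v, v)
    return Q, R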