Tim Vieira (timvieira)

timvieira / Exponential-jumps.ipynb
timvieira / Notes-on-scipy.special.digamma.ipynb
Last active Mar 4, 2018
Don't use scipy.special.digamma if you care about speed.
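The notebook preview is unavailable, but its claim is about call overhead for scalar arguments. For background, a hedged sketch of the standard fast approximation (shift-up recurrence plus asymptotic series; my sketch, not necessarily the notebook's code):

```python
import math

def digamma(x):
    """Approximate digamma for x > 0 via the recurrence
    psi(x) = psi(x+1) - 1/x and an asymptotic series (sketch only)."""
    r = 0.0
    while x < 6.0:            # shift x up until the series is accurate
        r -= 1.0 / x
        x += 1.0
    x2 = 1.0 / (x * x)
    # psi(x) ~ ln(x) - 1/(2x) - 1/(12x^2) + 1/(120x^4) - 1/(252x^6)
    return r + math.log(x) - 0.5 / x - x2 * (1/12.0 - x2 * (1/120.0 - x2 / 252.0))
```

For example, `digamma(1.0)` is about `-0.5772` (minus the Euler–Mascheroni constant), accurate to roughly 1e-9 here.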
timvieira / lagrangeprop.py
Last active Oct 1, 2018
Automatic differentiation as the method of Lagrange multipliers. Code accompanies this blog post: http://timvieira.github.io/blog/post/2017/08/18/backprop-is-not-just-the-chain-rule/
# -*- coding: utf-8 -*-
"""
Backprop as the method of Lagrange multipliers (and even the implicit function
theorem).
"""
from __future__ import division
import numpy as np
from arsenal.alphabet import Alphabet
from arsenal.math.checkgrad import finite_difference
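The gist itself depends on the author's `arsenal` utilities. As a minimal self-contained sketch of the idea on the toy computation y = sin(exp(x)) (my example, not the gist's): treat each intermediate as a constrained variable; stationarity of the Lagrangian in the intermediates yields the multipliers, which are exactly backprop's adjoints.

```python
import numpy as np

def grad_via_lagrange(x):
    # Forward pass: each intermediate is a constrained variable.
    z1 = np.exp(x)            # constraint: z1 = exp(x)
    z2 = np.sin(z1)           # constraint: z2 = sin(z1); objective is z2
    # Stationarity w.r.t. z2:  1 - lam2 = 0
    lam2 = 1.0
    # Stationarity w.r.t. z1:  lam2 * cos(z1) - lam1 = 0
    lam1 = lam2 * np.cos(z1)
    # Gradient w.r.t. the input: lam1 * d(exp(x))/dx
    return z2, lam1 * np.exp(x)

y, g = grad_via_lagrange(0.3)
```

The multipliers come out in reverse topological order, which is precisely why this recovers backprop.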
timvieira / simple-backprop.py
Last active Jun 17, 2018
Simple example of manually performing "automatic" differentiation.
"""
Simple example of manually performing "automatic" differentiation
"""
import numpy as np
from numpy import exp, sin, cos
def f(x, with_grad=False):
# Need to cache intermediates from forward pass (might not use all of them).
a = exp(x)
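The preview cuts off after the first intermediate. A hedged, self-contained version of the same pattern for the toy function f(x) = exp(x)·sin(x) (my choice of f, not necessarily the gist's):

```python
import numpy as np

def f(x, with_grad=False):
    # Forward pass: cache intermediates needed by the reverse pass.
    a = np.exp(x)
    b = np.sin(x)
    c = a * b                      # f(x) = exp(x) * sin(x)
    if not with_grad:
        return c
    # Reverse pass: push the adjoint of the output back to the input.
    dc = 1.0
    da = dc * b                    # d(a*b)/da
    db = dc * a                    # d(a*b)/db
    dx = da * a + db * np.cos(x)   # d(exp)/dx = exp(x); d(sin)/dx = cos(x)
    return c, dx
```
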
timvieira / squishing_multiclass.py
# Efficient passive aggressive updates for multi-class classification
#
# Original article:
# "Column squishing for multiclass updates"
# https://nlpers.blogspot.com/2017/08/column-squishing-for-multiclass-updates.html
from __future__ import division
import numpy as np
import scipy.optimize
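For context (this is background, not the column-squishing algorithm itself): a plain one-vs-top multiclass passive-aggressive update, the kind of update the squishing trick generalizes and accelerates, might look like:

```python
import numpy as np

def pa_update(W, x, y):
    """One passive-aggressive step for multiclass weights W (K x D),
    penalizing only the highest-scoring wrong label (a common simplification)."""
    scores = W @ x
    wrong = scores.copy()
    wrong[y] = -np.inf
    r = int(np.argmax(wrong))              # best wrong label
    loss = max(0.0, 1.0 - scores[y] + scores[r])
    if loss > 0.0:
        tau = loss / (2.0 * (x @ x))       # closed-form step size
        W[y] += tau * x
        W[r] -= tau * x
    return W
```

The step size `tau` is the closed-form solution of the PA objective; the update never increases the hinge loss on the current example.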
timvieira / memory-efficient-backprop.py
"""
Memory-efficient backpropagation in an RNN.
Accompanies blog post:
http://timvieira.github.io/blog/post/2016/10/01/reversing-a-sequence-with-sublinear-space/
"""
import numpy as np
from arsenal.math.checkgrad import fdcheck
timvieira / memory-efficient-backprop.py
Created Aug 8, 2017
Memory-efficient backpropagation through time in a recurrent neural network. Accompanies blog post: http://timvieira.github.io/blog/post/2016/10/01/reversing-a-sequence-with-sublinear-space/
"""
Memory-efficient backpropagation in an RNN.
Accompanies blog post:
http://timvieira.github.io/blog/post/2016/10/01/reversing-a-sequence-with-sublinear-space/
"""
import numpy as np
from arsenal.math.checkgrad import fdcheck
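The blog post's core trick can be sketched without any RNN machinery: emit the states of a one-way recurrence in reverse by divide-and-conquer replay, keeping O(log T) checkpoints at the cost of O(T log T) recomputation. A minimal sketch (my code, independent of the gist's):

```python
def reverse(f, s0, T):
    """Yield s_{T-1}, ..., s_0 for the recurrence s_{t+1} = f(s_t),
    storing only O(log T) states at a time."""
    if T == 1:
        yield s0
        return
    m = T // 2
    s = s0
    for _ in range(m):                # replay forward to the midpoint checkpoint
        s = f(s)
    yield from reverse(f, s, T - m)   # reversed second half
    yield from reverse(f, s0, m)      # reversed first half

states_reversed = list(reverse(lambda s: s + 1, 0, 5))
```

For the counter recurrence above, `states_reversed` is `[4, 3, 2, 1, 0]`; in the RNN setting this is exactly the order in which backprop-through-time consumes the hidden states.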
timvieira / jiawei.py
Created Feb 18, 2017
Cartoon version of Jiawei's optimization problem.
"""
Cartoon version of Jiawei's optimization problem.
Created [2017-02-17 Fri]
"""
import numpy as np
from scipy.optimize import fmin_bfgs
import autograd
timvieira / counterfactual-demo.ipynb
Last active Aug 26, 2018
Counterfactual reasoning demo. Accompanies blog post "Counterfactual reasoning and learning from logged data" http://timvieira.github.io/blog/post/2016/12/19/counterfactual-reasoning-and-learning-from-logged-data/
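The notebook preview is unavailable, but the post's central estimator, inverse propensity scoring (IPS), can be sketched on a synthetic context-free bandit (all numbers below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

K = 3
p_log = np.array([0.2, 0.5, 0.3])        # logging policy's action propensities
true_reward = np.array([0.1, 0.4, 0.8])  # hidden from the estimator

# Logged data: actions drawn from the logging policy, rewards observed.
n = 100_000
a = rng.choice(K, size=n, p=p_log)
r = (rng.random(n) < true_reward[a]).astype(float)

# Offline estimate of a *different* target policy via importance weighting.
p_new = np.array([0.1, 0.1, 0.8])
ips = np.mean(r * p_new[a] / p_log[a])
truth = p_new @ true_reward              # what we hope to recover
```

The reweighting `p_new[a] / p_log[a]` makes the logged data look as if the target policy had collected it, which is what lets us evaluate a policy we never ran.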
timvieira / heap-sample.py
Created Nov 26, 2016
Fast sampling from an evolving distribution
import numpy as np
from numpy.random import uniform
def update(S, k, v):
    "Update the value at position `k` in time O(log n)."
    d = S.shape[0]
    i = d//2 + k              # leaf index for key k
    S[i] = v
    while i > 1:              # walk up, refreshing the partial sums
        i //= 2
        S[i] = S[2*i] + S[2*i + 1]
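Assuming the layout the snippet sets up (an array of size 2n with the n leaf weights at positions n..2n-1 and the total mass at S[1]), a hedged sketch of the matching O(log n) sampler:

```python
import numpy as np

def sample(S):
    "Draw a key with probability proportional to its weight, in O(log n)."
    d = S.shape[0]
    p = np.random.uniform(0, S[1])   # S[1] holds the total mass
    i = 1
    while 2*i < d:                   # walk down to a leaf
        left = 2*i
        if p < S[left]:
            i = left
        else:
            p -= S[left]             # skip the left subtree's mass
            i = left + 1
    return i - d//2                  # convert heap index back to key

# Build a sum-heap over weights [1, 2, 3, 4] (leaves at positions n..2n-1).
w = np.array([1.0, 2.0, 3.0, 4.0])
S = np.zeros(2 * len(w))
S[len(w):] = w
for i in range(len(w) - 1, 0, -1):
    S[i] = S[2*i] + S[2*i + 1]

np.random.seed(0)
draws = [sample(S) for _ in range(10_000)]
```

Combined with `update`, this supports sampling from a distribution whose weights change over time, both operations in O(log n).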