Wesley Tansey (tansey)
tansey /
Created Aug 9, 2019
Heterogeneous (AKA multi-view) factor modeling in pytorch.
Heterogeneous factor modeling.
This model fits a heterogeneous factor model where columns may be:
1) Binary
2) Categorical
3) Gaussian
Everything is fit via alternating minimization and stochastic gradient descent.
The code relies on pytorch for SGD and a demo is included.
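The pytorch/SGD code itself is not shown in the preview. As a hedged illustration of the alternating-minimization idea for the purely Gaussian case, here is a minimal numpy sketch (the function name `als_factorize` is mine, not the gist's; the gist swaps these closed-form solves for SGD and adds binary/categorical likelihoods):

```python
import numpy as np

def als_factorize(M, k, steps=50):
    """Alternating least squares for M ~ W @ V.T (Gaussian columns only).

    Each step solves one factor in closed form while holding the other
    fixed; a small ridge term keeps the normal equations well-conditioned.
    """
    n, d = M.shape
    rng = np.random.default_rng(0)
    W = rng.normal(size=(n, k))
    V = rng.normal(size=(d, k))
    for _ in range(steps):
        # Solve for W with V fixed, then for V with W fixed.
        W = M @ V @ np.linalg.inv(V.T @ V + 1e-6 * np.eye(k))
        V = M.T @ W @ np.linalg.inv(W.T @ W + 1e-6 * np.eye(k))
    return W, V
```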
tansey /
Last active May 11, 2019
Pool adjacent violators algorithm for monotone matrix factorization
'''Pool adjacent violators algorithm for (column-)monotone matrix factorization.
Applies the PAV algorithm to column factors of a matrix factorization:
Given: M = W.V'
Returns: V_proj, a projected version of V such that M[i] is monotone decreasing
for all i.
Author: Wesley Tansey
Date: May 2019
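The gist applies the projection column-wise to V; the core PAV step for projecting a single sequence onto the set of monotone *decreasing* (nonincreasing) sequences can be sketched as follows (`pav_decreasing` is my name for it, not the gist's):

```python
import numpy as np

def pav_decreasing(y):
    """Least-squares projection of y onto nonincreasing sequences.

    Pool adjacent violators: scan left to right, merging adjacent blocks
    whenever a later block mean exceeds an earlier one, so that block
    means end up nonincreasing.
    """
    sums, counts = [], []
    for v in y:
        sums.append(float(v))
        counts.append(1)
        # Merge while the newest block's mean violates monotonicity.
        while len(sums) > 1 and sums[-1] / counts[-1] > sums[-2] / counts[-2]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c
    out = []
    for s, c in zip(sums, counts):
        out.extend([s / c] * c)
    return np.array(out)
```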
tansey /
Last active May 8, 2019
Fast multivariate normal sampling for some common cases
'''Fast sampling from a multivariate normal with covariance or precision
parameterization. Supports sparse arrays. Params:
- mu: If provided, assumes the model is N(mu, Q)
- mu_part: If provided, assumes the model is N(Q mu_part, Q).
This is common in many conjugate Gibbs steps.
- sparse: If true, assumes we are working with a sparse Q
- precision: If true, assumes Q is a precision matrix (inverse covariance)
- chol_factor: If true, assumes Q is a (lower triangular) Cholesky
decomposition of the covariance matrix
(or of the precision matrix if precision=True).
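The sparse and `chol_factor` paths are not shown in the preview. A dense-only sketch of the two main cases (my reconstruction of the interface, not the gist's code; `mu_part` handling under the precision parameterization follows the usual conjugate-Gibbs convention, mean = Q⁻¹ mu_part):

```python
import numpy as np
from scipy.linalg import cho_solve, solve_triangular

def sample_mvn(Q, mu=None, mu_part=None, precision=False, rng=None):
    """Draw one sample from a multivariate normal.

    precision=False: Q is the covariance, x = mu + L z with Q = L L^T.
    precision=True:  Q is the precision; solving L^T x = z gives noise
                     with covariance Q^{-1}, and mean = Q^{-1} mu_part.
    """
    rng = np.random.default_rng(rng)
    z = rng.standard_normal(Q.shape[0])
    L = np.linalg.cholesky(Q)  # lower triangular, Q = L L^T
    if precision:
        if mu_part is not None:
            mu = cho_solve((L, True), mu_part)  # mean = Q^{-1} mu_part
        noise = solve_triangular(L.T, z, lower=False)
    else:
        if mu_part is not None:
            mu = Q @ mu_part  # the gist's N(Q mu_part, Q) convention
        noise = L @ z
    return (0.0 if mu is None else mu) + noise
```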
tansey /
Last active Feb 28, 2018
Solver for a nurse scheduling problem
Program to generate valid time allocations for a mental ward's staff.
There are two staff lists, each applying to a specific window of time; the
lists may share employees.
Each employee is designated as either an RMN or an HCA.
tansey / fitWeightedNegativeBinomial.R
Created Jul 2, 2017
Fit negative binomial with weighted observations in R
# Fit using a simple EM algorithm
# observations are x
# weights are w (must be same length as x)
# returns (r, p)
# r - dispersion parameter
# p - probability of success
weightedNegBinomFit <- function(x, w, maxsteps=30) {
  sum.wx = sum(x*w)
  sum.w = sum(w)
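The R preview cuts off before the EM updates. As a hedged cross-check in Python, the same weighted fit can be obtained by maximizing the weighted negative binomial log-likelihood directly with scipy (this is direct MLE, not the gist's EM; the function name is mine):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import nbinom

def weighted_negbinom_fit(x, w):
    """Return (r, p) maximizing sum_i w_i * log NB(x_i; r, p)."""
    x, w = np.asarray(x, float), np.asarray(w, float)

    def nll(theta):
        r = np.exp(theta[0])                 # dispersion, kept positive
        p = 1.0 / (1.0 + np.exp(-theta[1]))  # success prob, kept in (0, 1)
        return -np.sum(w * nbinom.logpmf(x, r, p))

    res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead",
                   options={"maxiter": 2000})
    return np.exp(res.x[0]), 1.0 / (1.0 + np.exp(-res.x[1]))
```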
tansey /
Created Feb 10, 2017
Unpack a lower triangular Cholesky matrix from a neural network output in Tensorflow
import tensorflow as tf

def unpack_cholesky(q, ndims):
    # Build the lower-triangular Cholesky from the flat buffer
    # (assumes q has shape [batchsize, cholsize])
    chol_diag = tf.nn.softplus(q[:, :ndims])
    chol_offdiag = q[:, ndims:]
    chol_rows = []
    chol_start = 0
    chol_end = 1
    for i in range(ndims):
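The TF1 snippet is truncated mid-loop. The same unpacking idea, sketched in plain numpy for a single (unbatched) vector: softplus keeps the diagonal strictly positive, and the remaining entries fill the strict lower triangle. This is my reconstruction of the idea, not the gist's exact code:

```python
import numpy as np

def unpack_cholesky_np(q, ndims):
    """Turn a flat vector of ndims + ndims*(ndims-1)/2 values into a
    lower-triangular Cholesky factor with a positive diagonal."""
    q = np.asarray(q, float)
    L = np.zeros((ndims, ndims))
    # Softplus of the first ndims entries forms the diagonal.
    L[np.diag_indices(ndims)] = np.log1p(np.exp(q[:ndims]))
    # The rest fill the strictly-lower triangle, row by row.
    L[np.tril_indices(ndims, k=-1)] = q[ndims:]
    return L
```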
tansey / wtfset.cpp
Created Nov 9, 2016
terrible STL set iterator behavior
#include <set>
#include <iostream>

// getSet() returns the set *by value*, so each call yields a distinct
// temporary; comparing iterators from two different calls is undefined.
class A {
public:
    A() { s.insert(0); s.insert(1); }
    std::set<int> getSet() { return s; }
private:
    std::set<int> s;
};
tansey /
Created Jul 31, 2015
ADMM convergence checker in CVXPY
Implementation of the ADMM convergence rate SDP from Nishihara et al.,
ICML 2015, equation 11.
Code by Wesley Tansey and Sanmi Koyejo
import cvxpy as cvx
import numpy as np
tansey / gist:5b59d73c58587973844d
Created Sep 24, 2014
Python-style string formatting in Julia with floating point support
function format(s, args...)
    # Python-style string formatting with floating point support
    # Note that this is 1-based to be more Julian
    result = deepcopy(s)
    for (i, x) in enumerate(args)
        # Braces and the dot must be escaped in the pattern, and the
        # precision group should capture all digits, not just the last.
        q = Regex("\\{$i(:\\.([0-9]+)f)?\\}")
        next = result
        for m in eachmatch(q, result)
            val = x
            if m.captures[2] != nothing
tansey / gist:9753126
Created Mar 25, 2014
Numpy Array vs. Numpy Masked Array -- Madness
import numpy as np
import numpy.ma as ma
# Create a 3x3 array in regular numpy
a = np.arange(9).reshape((3,3))
# Get the middle row
b = a[1]
# Change the middle value in the middle row