Leonid Keselman (leonidk)

@leonidk
leonidk / sample.py
Created April 14, 2023 19:14
Low discrepancy sequences in arbitrary dimensions
import numpy as np

class QuasiRandom():
    def __init__(self, dim=1, seed=None):
        self.dim = dim
        self.x = np.random.rand(dim) if seed is None else seed
        root_sys = [1] + [0 for i in range(dim - 1)] + [-1, -1]
        self.const = sorted(np.roots(root_sys))[-1].real
        self.phi = np.array([1 / (self.const) ** (i + 1) for i in range(dim)])

    def generate(self, n_points=1):
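        # Assumed completion (the gist preview cuts off here): the standard
        # additive recurrence x_{n+1} = (x_n + phi) mod 1 used by this family
        # of low-discrepancy sequences; the names below are illustrative.
        pts = []
        for _ in range(n_points):
            self.x = (self.x + self.phi) % 1.0
            pts.append(self.x.copy())
        return np.array(pts)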
@leonidk
leonidk / process.ipynb
Last active November 4, 2021 15:57
just some voting stuff
@leonidk
leonidk / simple_pend.py
Last active February 15, 2018 21:19
Random Controllers for OpenAI Gym
import gym
import numpy as np
import random
env = gym.make('Pendulum-v0')
dim = env.observation_space.shape[0] + 1
params = int(dim + (dim*(dim-1))/2)
# linear controller with pairwise features
def quad_control(w,ob,t):
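    # Assumed completion (the gist preview cuts off here): score the observation,
    # a bias term, and the distinct pairwise products with the weight vector w;
    # the clipping range matches Pendulum-v0's torque limits.
    feats = list(ob) + [1.0]
    n = len(feats)
    feats += [feats[i] * feats[j] for i in range(n) for j in range(i + 1, n)]
    u = float(np.dot(w, feats))          # len(feats) == params
    return [np.clip(u, -2.0, 2.0)]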
import argparse
import torch
import torch.nn as nn
from torch.autograd import Variable
from torch.utils.data import DataLoader
import torchvision
import torchvision.transforms as T
from torchvision.datasets import ImageFolder
@leonidk
leonidk / tensorflow_finetune.py
Created June 15, 2017 03:14 — forked from omoindrot/tensorflow_finetune.py
Example TensorFlow script for fine-tuning a VGG model (uses tf.contrib.data)
"""
Example TensorFlow script for finetuning a VGG model on your own data.
Uses the tf.contrib.data module, which is in release candidate 1.2.0rc0.
Based on:
- PyTorch example from Justin Johnson:
https://gist.github.com/jcjohnson/6e41e8512c17eae5da50aebef3378a4c
Required packages: tensorflow (v1.2)
You can install the release candidate 1.2.0rc0 here:
https://www.tensorflow.org/versions/r1.2/install/
Download the weights trained on ImageNet for VGG:
@leonidk
leonidk / tf_lstm.py
Created December 22, 2016 09:08 — forked from siemanko/tf_lstm.py
Simple implementation of LSTM in Tensorflow in 50 lines (+ 130 lines of data generation and comments)
"""Short and sweet LSTM implementation in Tensorflow.
Motivation:
When Tensorflow was released, adding RNNs was a bit of a hack - it required
building separate graphs for every number of timesteps and was a bit obscure
to use. Since then TF devs added things like `dynamic_rnn`, `scan` and `map_fn`.
Currently the APIs are decent, but none of the tutorials I am aware of make
the best use of them.
Advantages of this implementation:
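As a side note, here is a minimal sketch (not the gist's code) of the `dynamic_rnn` pattern the docstring refers to, assuming a TensorFlow 1.x-era install where `tf.nn.rnn_cell` and `tf.nn.dynamic_rnn` are available:
import tensorflow as tf

inputs = tf.placeholder(tf.float32, [None, None, 4])   # (batch, time, features)
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=16)
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
# outputs has shape (batch, time, 16); no per-length graph rebuilding is needed.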
@leonidk
leonidk / dog_ex.py
Last active March 24, 2024 03:23
Difference of Gaussians example in Python
from skimage import data, feature, color, filters, img_as_float  # module is 'filters' (not 'filter') in current scikit-image
from matplotlib import pyplot as plt
original_image = img_as_float(data.chelsea())
img = color.rgb2gray(original_image)
k = 1.6
plt.subplot(2,3,1)
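# Assumed continuation (the gist preview cuts off here): blur at two scales
# related by k and subtract, giving the difference-of-Gaussians response;
# the sigma value and the plot layout are illustrative guesses.
from skimage.filters import gaussian
plt.imshow(original_image)
dog = gaussian(img, sigma=1.0) - gaussian(img, sigma=k * 1.0)
plt.subplot(2, 3, 2)
plt.imshow(dog, cmap='gray')
plt.show()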
@leonidk
leonidk / conv_test.py
Created October 13, 2016 19:03
Convolution vs Correlation Example
# simple correlation vs convolution example
import numpy as np
from matplotlib.pyplot import *

style.use('seaborn-ticks')
for a in [[1, 2, 3], [1, 2, 1]]:
    figure()
    x = [-1, 0, 1]
    b = [0, 0, 1]
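    # Assumed continuation (the gist preview cuts off here): np.convolve reverses
    # b relative to np.correlate, so the two outputs only coincide when the
    # flipped and unflipped kernels agree.
    conv = np.convolve(a, b, mode='full')
    corr = np.correlate(a, b, mode='full')
    plot(conv, 'o-', label='convolution')
    plot(corr, 's--', label='correlation')
    legend()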
import tensorflow as tf
import numpy as np

# takes a pair, returns a projected pair
def distort_func(p):
    x = p[0]
    y = p[1]
    r = p[2]
    # welcome to my hat picking
    K1 = 0.1
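    # Assumed continuation (the gist preview cuts off here): a one-term radial
    # distortion is the usual reason for a K1 constant like this; the exact
    # formula and return shape are guesses.
    d = 1.0 + K1 * r * r
    return tf.stack([x * d, y * d])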
@leonidk
leonidk / tif.py
Last active June 6, 2016 05:14
ImageJ TIFF Stack Reader
import sys, os, csv
from pylab import *
from PIL import Image
import time
# from mayavi import mlab

# returns a list (length = 3) of channels, each of which is a list (length = Z stack depth) of images
def getImages(fN):
    returnArr = [[], [], []]
    tiff = Image.open(fN)
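    # Assumed continuation (the gist preview cuts off here): step through the
    # stack with PIL's seek() until EOFError; the interleaved channel order
    # (frame % 3) is a guess about how the ImageJ stack is laid out.
    frame = 0
    try:
        while True:
            tiff.seek(frame)
            returnArr[frame % 3].append(array(tiff))
            frame += 1
    except EOFError:
        pass
    return returnArr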