Pramod Srinivasan (domarps)
@domarps
domarps / The Pull Request Process
Created February 5, 2016 21:20
Submitting a Pull Request
git clone https://github.com/domarps/meta.git
cd meta/
git checkout develop
# initialize and fetch nested submodules in one step
git submodule update --init --recursive
ls
cd deps/bandit/
# strip the submodule's .git directory so its files can be committed to the parent repo
rm -rf .git
ls
@domarps
domarps / file-remover.py
Created August 10, 2017 22:33
snippet to remove files
import glob
import os

def remove_blacklisted_lines(remove_urls):
    # write kept lines to a separate file: opening the same path for both
    # reading and writing at once would truncate it before it could be read
    with open('some.csv') as oldfile, open('some_filtered.csv', 'w') as newfile:
        for line in oldfile:
            ss = line.split('_')
            # drop the line if its second-to-last '_'-separated field
            # contains any blacklisted token
            if not any(t in ss[-2] for t in remove_urls):
                newfile.write(line)
@domarps
domarps / ImportModulesNotebook.ipynb
Created March 7, 2018 17:19 — forked from Breta01/ImportModulesNotebook.ipynb
Example of importing multiple TensorFlow modules
@domarps
domarps / knights_tour.py
Created April 7, 2018 22:26
knight's tour on a chessboard
import queue

class knightsTour:
    def __init__(self):
        pass

    def is_within_limits(self, i, j, M, N):
        '''
        :param i: row index of the candidate square
        :param j: column index of the candidate square
        :param M: number of rows on the board
        :param N: number of columns on the board
        :return: True if (i, j) lies on the M x N board
        '''
        return 0 <= i < M and 0 <= j < N
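    # A hedged sketch, not from the original gist: the `import queue` above
    # suggests a breadth-first search; this method (its name is an assumption)
    # returns the minimum number of knight moves from `start` to `target`
    # on an M x N board, or -1 if the target is unreachable.
    def min_knight_moves(self, start, target, M, N):
        moves = [(2, 1), (1, 2), (-1, 2), (-2, 1),
                 (-2, -1), (-1, -2), (1, -2), (2, -1)]
        q = queue.Queue()
        q.put((start, 0))
        seen = {start}
        while not q.empty():
            (i, j), dist = q.get()
            if (i, j) == target:
                return dist
            for di, dj in moves:
                ni, nj = i + di, j + dj
                if self.is_within_limits(ni, nj, M, N) and (ni, nj) not in seen:
                    seen.add((ni, nj))
                    q.put(((ni, nj), dist + 1))
        return -1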
import numpy as np

# assumes X (N, D), W1 (D, H), b1 (H,), W2 (H, C), b2 (C,) are defined upstream

# FC1 layer: affine transform of the input
z1 = np.dot(X, W1) + b1   # (N, H)
# ReLU layer: zero out negative activations
h1 = z1.copy()            # (N, H)
h1[h1 < 0] = 0            # (N, H)
# FC2 layer: affine transform producing class scores
z2 = np.dot(h1, W2) + b2  # (N, C)
def layernorm_forward(x, gamma, beta, ln_param):
    """
    Forward pass for layer normalization.

    During both training and test time, the incoming data is normalized per
    data point before being scaled by gamma and beta parameters, identical in
    form to batch normalization. In contrast to batch normalization, layer
    normalization behaves identically at train and test time, so no running
    averages of any sort need to be tracked.
    """
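    # A minimal sketch of the body, not from the original gist: assumes x has
    # shape (N, D) and that ln_param may carry an 'eps' key (an assumption).
    eps = ln_param.get('eps', 1e-5)
    mu = x.mean(axis=1, keepdims=True)       # per-example mean, shape (N, 1)
    var = x.var(axis=1, keepdims=True)       # per-example variance, shape (N, 1)
    x_hat = (x - mu) / np.sqrt(var + eps)    # normalize each row
    out = gamma * x_hat + beta               # scale and shift per feature
    cache = (x_hat, gamma, mu, var, eps)
    return out, cache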
def batchnorm_forward(x, gamma, beta, bn_param):
    """
    Forward pass for batch normalization.

    During training, the sample mean and (uncorrected) sample variance are
    computed from minibatch statistics and used to normalize the incoming
    data. We also keep an exponentially decaying running mean of the mean and
    variance of each feature, and these averages are used to normalize data
    at test time.

    At each timestep we update the running averages for mean and variance
    using an exponential decay based on the momentum parameter:

        running_mean = momentum * running_mean + (1 - momentum) * sample_mean
        running_var = momentum * running_var + (1 - momentum) * sample_var
    """
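    # A minimal sketch of the training-time body, not from the original gist:
    # the bn_param key names ('eps', 'momentum', 'running_mean', 'running_var')
    # follow the docstring above but are otherwise assumptions.
    eps = bn_param.get('eps', 1e-5)
    momentum = bn_param.get('momentum', 0.9)
    running_mean = bn_param.get('running_mean', np.zeros(x.shape[1]))
    running_var = bn_param.get('running_var', np.zeros(x.shape[1]))

    mu = x.mean(axis=0)                      # per-feature minibatch mean
    var = x.var(axis=0)                      # uncorrected minibatch variance
    x_hat = (x - mu) / np.sqrt(var + eps)    # normalize each feature
    out = gamma * x_hat + beta               # scale and shift

    # exponentially decaying running averages, used at test time
    bn_param['running_mean'] = momentum * running_mean + (1 - momentum) * mu
    bn_param['running_var'] = momentum * running_var + (1 - momentum) * var
    return out, (x_hat, gamma, mu, var, eps)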
@domarps
domarps / kmeans.ipynb
Created May 13, 2018 21:17 — forked from jakevdp/kmeans.ipynb
Performance Python: 7 Strategies for Optimizing Your Numerical Code (PyCon 2018)
import java.util.*;

public class GenerateSubsets {
    /*
     * Given an input array of n elements, find all the subsets.
     */
    public static String getString(char[] subset, int subStIndex) {
        // build a string from the first subStIndex characters of the working buffer
        return new String(subset, 0, subStIndex);
    }
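    // A hedged sketch, not from the original gist: a standard recursive
    // include/exclude enumeration that would pair with getString above
    // (the method name generateSubsets is an assumption).
    public static void generateSubsets(char[] set, char[] subset, int index, int subStIndex) {
        if (index == set.length) {
            System.out.println(getString(subset, subStIndex));
            return;
        }
        subset[subStIndex] = set[index];
        generateSubsets(set, subset, index + 1, subStIndex + 1); // include set[index]
        generateSubsets(set, subset, index + 1, subStIndex);     // exclude set[index]
    }

    public static void main(String[] args) {
        char[] set = {'a', 'b', 'c'};
        generateSubsets(set, new char[set.length], 0, 0);
    }
}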
prev = None

def convert(node, head, entered):
    # in-order traversal that relinks a BST into a doubly linked list,
    # using `left` as the previous pointer and `right` as the next pointer
    global prev
    if node is not None:
        convert(node.left, head, entered)
        node.left = prev
        if prev is None:
            head[0] = node  # leftmost node becomes the head of the list
        else:
            prev.right = node
        prev = node
        convert(node.right, head, entered)
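# A minimal usage sketch, not from the original gist: the Node class and the
# role of the `entered` argument are assumptions (the original does not show them).
class Node:
    def __init__(self, val):
        self.val = val
        self.left = None
        self.right = None

root = Node(2)
root.left = Node(1)
root.right = Node(3)

head = [None]
convert(root, head, False)
# head[0] now refers to the leftmost node (value 1); follow .right to traverse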