
Andrew Schreiber (andrewschreiber)

@andrewschreiber
andrewschreiber / vg_logic.py
Created Aug 16, 2019
def save_vanilla_gradient(network, data, labels), see https://github.com/andrewschreiber/numpy-saliency
View vg_logic.py
# Create a saliency map for each data point
for i, image in enumerate(data):
    # Forward pass on image
    # Note: the activations from this are saved on each layer
    output = image
    for l in range(len(network.layers)):
        output = network.layers[l].forward(output)

    # Backprop to get gradient
    label_one_hot = labels[i]
View vg_p2.py
# Backprop to get gradient
label_one_hot = labels[i]
dy = np.array(label_one_hot)
for l in range(len(network.layers) - 1, -1, -1):
    dout = network.layers[l].backward(dy)
    dy = dout
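
Taken together, the two previews above are halves of one routine. As a rough sketch, assuming each layer caches its activations on forward, that backprop through the first layer returns the gradient with respect to the input pixels, and a hypothetical save_gradient_image helper for writing the map out, they assemble like this:

import numpy as np

def save_vanilla_gradient(network, data, labels):
    # Create a saliency map for each data point
    for i, image in enumerate(data):
        # Forward pass on image; activations are saved on each layer
        output = image
        for l in range(len(network.layers)):
            output = network.layers[l].forward(output)

        # Backprop the one-hot label to get the gradient w.r.t. the input
        dy = np.array(labels[i])
        for l in range(len(network.layers) - 1, -1, -1):
            dy = network.layers[l].backward(dy)

        # The magnitude of the input gradient is the saliency map
        saliency = np.abs(dy).reshape(image.shape)
        save_gradient_image(saliency, i)  # hypothetical visualization helper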
@andrewschreiber
andrewschreiber / vg_p1.py
Last active Aug 16, 2019
def save_vanilla_gradient(network, data, labels), see https://github.com/andrewschreiber/numpy-saliency
View vg_p1.py
# Create a saliency map for each data point
for i, image in enumerate(data):
    # Run a forward pass with an image
    output = image
    for l in range(len(network.layers)):
        output = network.layers[l].forward(output)
View main.py
from model.data import mnist_train_test_sets
from model.network import LeNet5
from saliency.vanilla_gradient import save_vanilla_gradient

# Get MNIST dataset, preprocessed
train_images, train_labels, test_images, test_labels = mnist_train_test_sets()

# Load net with 98% acc weights
net = LeNet5(weights_path="15epoch_weights.pkl")

# Generate saliency maps for the first 10 images
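# (The gist preview cuts off at the comment above. Given the signature
# save_vanilla_gradient(network, data, labels) quoted in the descriptions,
# the missing call is presumably along these lines; slice bounds assumed.)
save_vanilla_gradient(net, test_images[:10], test_labels[:10])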
@andrewschreiber
andrewschreiber / road_sign_classifier.py
Last active Aug 19, 2017
Example of the expanded Hyperdash SDK
View road_sign_classifier.py
# From CLI:
#   hyperdash run -n 'mymodel' python mymodel.py
import hyperdash as hd

learning_rate = hd.param('learning rate', default=0.01)  # Set up hyperparameters

# Model code here

hd.metric('loss', training_loss)  # Record a metric

# Params and metrics are pretty printed at the end of the experiment
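
For context, a minimal sketch of a script using this expanded SDK end to end; train_one_epoch stands in for real model code and is hypothetical:

import hyperdash as hd

learning_rate = hd.param('learning rate', default=0.01)
num_epochs = hd.param('epochs', default=10)

for epoch in range(num_epochs):
    training_loss = train_one_epoch(learning_rate)  # hypothetical model code
    hd.metric('loss', training_loss)
# Params and metrics are then pretty printed when the experiment ends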
@andrewschreiber
andrewschreiber / jupyter_gym_render.md
Last active Jun 15, 2019
How to stream OpenAI Gym environment rendering within a Jupyter Notebook
View jupyter_gym_render.md

Open Jupyter with:

$ xvfb-run -s "-screen 0 1400x900x24" jupyter notebook

In Jupyter:

import matplotlib.pyplot as plt
%matplotlib inline

After each step:

def show_state(env, step=0):
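    # The gist preview truncates at the signature above. A minimal body
    # consistent with the setup, assuming gym's 'rgb_array' render mode
    # and IPython.display for in-place frame updates (both assumptions):
    from IPython import display
    plt.figure(3)
    plt.clf()
    plt.imshow(env.render(mode='rgb_array'))  # needs the xvfb virtual display
    plt.title("Step: %d" % step)
    plt.axis('off')
    display.clear_output(wait=True)
    display.display(plt.gcf())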
View keybase.md

Keybase proof

I hereby claim:

  • I am andrewschreiber on github.
  • I am andrewschreiber (https://keybase.io/andrewschreiber) on keybase.
  • I have a public key whose fingerprint is B124 EC8F E431 5EC0 7D0C 6205 2824 08D7 8326 AB72

To claim this, I am signing this object:

@andrewschreiber
andrewschreiber / mac_gym_installer.sh
Created Apr 12, 2017
Installs OpenAI Gym on macOS
View mac_gym_installer.sh
#!/bin/sh
# See video: https://www.youtube.com/watch?v=7PO27i2lEOs
set -e

# POSIX-portable existence check ('&>' is a bashism and misparses under /bin/sh)
command_exists () {
    type "$1" > /dev/null 2>&1
}
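
The preview ends at the helper. Presumably it gates the installs; a sketch of the likely pattern, not the gist's actual continuation:

# Sketch: bootstrap Homebrew only if it is missing
if ! command_exists brew ; then
    /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
fi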
@andrewschreiber
andrewschreiber / ReactiveCocoa.podspec.json
Created Oct 7, 2016
RAC 4.2.2 podspec for Swift 2.3 on Xcode 8.0
View ReactiveCocoa.podspec.json
{
  "name": "ReactiveCocoa",
  "version": "4.2.2",
  "summary": "A framework for composing and transforming streams of values.",
  "description": "ReactiveCocoa (RAC) is an Objective-C framework for Functional Reactive Programming.\nIt provides APIs for composing and transforming streams of values.",
  "homepage": "https://github.com/ReactiveCocoa/ReactiveCocoa",
  "license": {
    "type": "MIT",
    "file": "LICENSE.md"
  },
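
Since this spec lives in a gist rather than the CocoaPods trunk, a Podfile would consume it via :podspec pointing at the raw JSON; the URL below is a hypothetical placeholder:

# Hypothetical Podfile entry pinning RAC 4.2.2 through this gist's podspec
pod 'ReactiveCocoa', :podspec => 'https://gist.githubusercontent.com/andrewschreiber/<gist-id>/raw/ReactiveCocoa.podspec.json'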