Dillon Erb dte

View keybase.md

Keybase proof

I hereby claim:

  • I am dte on github.
  • I am dte (https://keybase.io/dte) on keybase.
  • I have a public key ASBUHk4r10yK56bUqf0cZ9iYkVdIqehPtxMHsTcvtYp4awo

To claim this, I am signing this object:

@dte
dte / install-CUDA-docker-nvidia-docker.sh
Created Jul 19, 2017
Install CUDA, Docker, and Nvidia Docker on a new Paperspace GPU machine
#!/bin/bash
# 1. Install CUDA
echo "Installing CUDA..."
# Only install if CUDA is not already installed.
if ! dpkg-query -W cuda; then
  # The 16.04 installer works with 16.10.
  curl -O http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/cuda-repo-ubuntu1604_8.0.61-1_amd64.deb
  dpkg -i ./cuda-repo-ubuntu1604_8.0.61-1_amd64.deb
  apt-get update
  apt-get install cuda -y
fi
@dte
dte / install-cuda.sh
Created Jul 19, 2017
Install CUDA 8 on Ubuntu 16.04 / 16.10
#!/bin/bash
echo "Installing CUDA..."
# Only install if CUDA is not already installed.
if ! dpkg-query -W cuda; then
  # The 16.04 installer works with 16.10.
  curl -O http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/cuda-repo-ubuntu1604_8.0.61-1_amd64.deb
  dpkg -i ./cuda-repo-ubuntu1604_8.0.61-1_amd64.deb
  apt-get update
  apt-get install cuda -y
fi
View releaseTest.md

Release Notes

New Thing 1 (7/16/2018)

About this

New Thing 2 (7/14/2018)

About this

@dte
dte / loopy_cosine_similarities.py
Created May 9, 2018
Vectorization and Broadcasting with Pytorch
import torch
from torch.nn.functional import cosine_similarity

def embeddings_to_cosine_similarity_matrix(E):
    """
    Converts a tensor of n embeddings to an (n, n) tensor of similarities.
    """
    similarities = [[cosine_similarity(a, b, dim=0) for a in E] for b in E]
    similarities = list(map(torch.stack, similarities))
    return torch.stack(similarities)
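The nested loop above makes n² Python-level calls, which is what the gist's title ("Vectorization and Broadcasting") contrasts against. The same matrix can be computed in one shot by normalizing the rows and taking a matmul; a minimal sketch (the function name and `eps` guard are mine):

```python
import torch

def cosine_similarity_matrix(E, eps=1e-8):
    """Vectorized (n, n) cosine similarity for an (n, d) embedding tensor."""
    # Normalize each row to unit length, guarding against zero vectors.
    norms = E.norm(dim=1, keepdim=True).clamp_min(eps)
    E_hat = E / norms
    # Dot products of unit vectors are cosine similarities.
    return E_hat @ E_hat.T
```

The diagonal of the result is all ones and the matrix is symmetric, which makes a quick sanity check against the loopy version easy.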
View serve.py
import os
from flask import Flask, redirect, url_for, request, render_template, send_from_directory
from werkzeug.utils import secure_filename
import not_hotdog_model

# TODO: prettier interface
# Folder to store uploaded pictures.
UPLOAD_FOLDER = 'uploads/'
# Which file types may be uploaded.
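The preview cuts off before the allowed-file check. A common Flask pattern for restricting uploads looks like the sketch below; the extension set and helper name are assumptions, not taken from the original gist:

```python
# Hypothetical whitelist of image extensions; adjust to the model's inputs.
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif'}

def allowed_file(filename):
    """Return True if the filename has an allowed image extension."""
    return '.' in filename and \
        filename.rsplit('.', 1)[1].lower() in ALLOWED_EXTENSIONS
```

The upload route would then call `allowed_file` (together with `secure_filename`, imported above) before saving anything into `UPLOAD_FOLDER`.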
View Adversarial2.py
torch.manual_seed(10)

Q, P = Q_net(), P_net()      # Encoder / Decoder
D_gauss = D_net_gauss()      # Adversarial discriminator

if torch.cuda.is_available():
    Q = Q.cuda()
    P = P.cuda()
    D_gauss = D_gauss.cuda()

# Set learning rates
gen_lr, reg_lr = 0.0006, 0.0008
View Adversarial.py
# Encoder
class Q_net(nn.Module):
    def __init__(self):
        super(Q_net, self).__init__()
        self.lin1 = nn.Linear(X_dim, N)
        self.lin2 = nn.Linear(N, N)
        self.lin3gauss = nn.Linear(N, z_dim)

    def forward(self, x):
        x = F.dropout(self.lin1(x), p=0.25, training=self.training)
        x = F.relu(x)
@dte
dte / get_available_gpus.py
Created Nov 16, 2017 — forked from jovianlin/get_available_gpus.py
Get List of Devices in TensorFlow
from tensorflow.python.client import device_lib

def get_available_gpus():
    local_device_protos = device_lib.list_local_devices()
    return [x.name for x in local_device_protos if x.device_type == 'GPU']

get_available_gpus()
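`device_lib` is a private TF1-era module. In TensorFlow 2 the supported way to do the same thing is the public `tf.config` API; a sketch, assuming TensorFlow 2 is installed:

```python
import tensorflow as tf

def get_available_gpus():
    """Return the names of visible GPU devices via the public TF2 API."""
    return [d.name for d in tf.config.list_physical_devices('GPU')]
```

On a machine with no GPU this returns an empty list rather than raising.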
@dte
dte / tensorflow_opencv_ubuntu_deps.sh.txt
Created Jul 25, 2017 — forked from CapCap/tensorflow_opencv_ubuntu_deps.sh.txt
Paperspace tensorflow+opencv setup for both python2 and python3 on ubuntu 16
#!/bin/bash
# Don't require you to constantly enter a password for sudo:
sudo visudo
# At the bottom of the file, paste the following (without the `#`):
# paperspace ALL=(ALL) NOPASSWD: ALL
# Then press `ctrl+o` then `enter` to save your changes, and `ctrl+x` to exit nano
# Allow connections from your IP to any port - the default seems to be just 22 (ssh)