Grant Van Horn gvanhorn38

gvanhorn38 / inat_classifier_from_prelogits.py
Created October 11, 2018 16:40
Simple iNaturalist Classifier from Features
import json
import os
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.svm import LinearSVC
# File Paths
DATA_FOLDER = '' # Fill in
TRAIN_DATASET_FP = os.path.join(DATA_FOLDER, 'train2018.json')
assert os.path.exists(TRAIN_DATASET_FP), "Train json file not found"
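The preview stops after the file-path setup. As a minimal, hypothetical sketch of how such a classifier on precomputed features might continue (the .npy filenames and array layout below are assumptions, not part of the gist), using the imports already shown above:

# Hypothetical continuation: train a linear SVM on precomputed (prelogit) features.
# The feature/label file names and shapes are assumptions.
train_features = np.load(os.path.join(DATA_FOLDER, 'train_features.npy'))  # [num_train, feature_dim]
train_labels = np.load(os.path.join(DATA_FOLDER, 'train_labels.npy'))      # [num_train]
val_features = np.load(os.path.join(DATA_FOLDER, 'val_features.npy'))
val_labels = np.load(os.path.join(DATA_FOLDER, 'val_labels.npy'))

clf = LinearSVC(C=1.0)
clf.fit(train_features, train_labels)
val_predictions = clf.predict(val_features)
print("Validation accuracy: %0.4f" % accuracy_score(val_labels, val_predictions))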
gvanhorn38 / aws_p2_tensorflow_r1_1_install.md
Created April 17, 2017 20:55
P2 Instance, Tensorflow r1.1

Installing TensorFlow r1.1 on an AWS P2 instance.

Following this guide with a few modifications.

Built on Ubuntu 16.04 on a p2.xlarge instance with 25 GB of standard SSD storage.

# Update and upgrade installed packages
sudo apt-get update
sudo apt-get upgrade
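After the remaining install steps complete, a quick sanity check (not part of the original gist) is to import TensorFlow from Python and confirm the version and that the GPU is visible:

# Hypothetical post-install check for TensorFlow r1.1 on a GPU instance.
import tensorflow as tf
from tensorflow.python.client import device_lib

print(tf.__version__)                                     # expect 1.1.x
print([d.name for d in device_lib.list_local_devices()])  # should include a /gpu:0 entry on a P2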
gvanhorn38 / aws_c4_tensorflow_r1_1_install.md
Last active April 17, 2017 20:55
C4 Instance, Tensorflow r1.1

Installing TensorFlow r1.1 on an AWS C4 instance.

Following this guide with a few modifications.

Built on Ubuntu 16.04 on a c4.2xlarge instance with 25 GB of standard SSD storage.

# Update and upgrade installed packages
sudo apt-get update
sudo apt-get upgrade
gvanhorn38 / cub_image_cmds.txt
Created March 10, 2017 16:18
Commands for the CUB-200 whole-image experiment using tf_classification
export DATASET_DIR=/media/drive2/tensorflow_datasets/cub/with_200_val_split
export EXPERIMENT_DIR=/media/drive2/tensorflow_experiments/ebird/cub_image_experiment
export IMAGENET_PRETRAINED_MODEL=/media/drive3/tensorflow_models/inception_v3.ckpt
# Visualize the inputs to the network
CUDA_VISIBLE_DEVICES=1 python visualize_train_inputs.py \
--tfrecords $DATASET_DIR/train* \
--config $EXPERIMENT_DIR/config_train.yaml \
--text_labels
gvanhorn38 / tensorflow_serving_ubuntu_14.md
Last active December 25, 2022 01:22
TensorFlow Serving Ubuntu 14.04

Instructions for installing TensorFlow Serving on Ubuntu 14.04. I am following the instructions from here.

Install Bazel

Installation instructions can be found here

If you have a previous version of Bazel and want a fresh install, remove the old version first. If you installed it through apt, you can run sudo apt-get purge bazel. If you installed it from source, you probably have a ~/bin directory containing the bazel command, which you should delete, and a ~/.bazel directory that you should also delete. Also check your ~/.bashrc file for any references to ~/.bazel.

  1. Install JDK 8. You'll need the add-apt-repository command, which you can get with sudo apt-get install software-properties-common.
gvanhorn38 / cub_image_config_test.yaml
Last active March 10, 2017 16:08
CUB-200 Image Classification Test Configuration
# Testing specific configuration
RANDOM_SEED : 1.0
SESSION_CONFIG : {
LOG_DEVICE_PLACEMENT : false,
PER_PROCESS_GPU_MEMORY_FRACTION : 0.9
}
#################################################
# Metrics
gvanhorn38 / cub_image_config_train.yaml
Last active September 21, 2018 12:44
CUB-200 Image Classification Train Configuration
# Training specific configuration
RANDOM_SEED : 1.0
SESSION_CONFIG : {
LOG_DEVICE_PLACEMENT : false,
PER_PROCESS_GPU_MEMORY_FRACTION : 0.94
}
#################################################
# Dataset Info
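Both config previews are truncated. As a rough, hypothetical illustration of how settings like these could be consumed (this is not the actual tf_classification loading code), the SESSION_CONFIG block maps naturally onto a TF 1.x session config:

# Hypothetical sketch: load the YAML and build a TF 1.x session config from it.
# Key names follow the preview above; nothing here is taken from tf_classification itself.
import yaml
import tensorflow as tf

with open('config_train.yaml') as f:
    cfg = yaml.safe_load(f)

session_config = tf.ConfigProto(
    log_device_placement=cfg['SESSION_CONFIG']['LOG_DEVICE_PLACEMENT'],
    gpu_options=tf.GPUOptions(
        per_process_gpu_memory_fraction=cfg['SESSION_CONFIG']['PER_PROCESS_GPU_MEMORY_FRACTION']
    )
)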
gvanhorn38 / parse_inat_dataset_ex.py
Created January 4, 2017 16:20
Example of parsing an iNaturalist dataset
import cPickle as pickle
import os
dataset_dir = '.'
train_file = os.path.join(dataset_dir, "train_data.pkl")
with open(train_file, 'rb') as f:  # open the pickle in binary mode
    train_data = pickle.load(f)
image_dir = "/home/gvanhorn/datasets/inaturalist/images"
train_images = []
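The preview cuts off here. A purely hypothetical continuation (the structure of train_data is an assumption; the gist does not document it) might build image path / label pairs like this:

# Hypothetical continuation: the 'image_id' and 'class' keys are assumptions about the pickle's contents.
for entry in train_data:
    image_path = os.path.join(image_dir, "%s.jpg" % entry['image_id'])
    train_images.append((image_path, entry['class']))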
gvanhorn38 / parallel_wget.md
Last active July 20, 2018 21:29
Download Many Images Using wget and parallel

Assume you have a file urls.txt where each row contains the name of the file to save and the URL to fetch, separated by a space. You can then use one of the following commands to download the URLs.

parallel -j8 --colsep " " "wget -q -O {1} {2}" < urls.txt
parallel --eta -j4 --colsep " " "wget -q -N -t 1 -T 10 -O {1} {2}" < urls.txt

-q for quiet, -O to set the output filename, -N for timestamping (re-download only if the remote file is newer), -t 1 to attempt each download only once, and -T 10 for a 10 second timeout; on the parallel side, -j sets the number of parallel jobs, --colsep " " splits each input line on the space, and --eta prints a progress estimate.
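As an alternative sketch in Python (not from the gist), the same bulk download could be done with concurrent.futures and urllib; the urls.txt format is assumed to match the description above:

# Hypothetical Python 3 alternative to the parallel + wget one-liner.
# Assumes urls.txt has "<filename> <url>" on each line.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlretrieve

def download(line):
    filename, url = line.strip().split(' ', 1)
    try:
        urlretrieve(url, filename)
    except IOError:
        pass  # skip failures quietly, mirroring wget -q

with open('urls.txt') as f:
    lines = [line for line in f if line.strip()]

with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(download, lines))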

gvanhorn38 / format_cub_dataset_parts.py
Last active March 4, 2018 22:56
Format a CUB-style dataset for tfrecord storage, including class labels, bounding boxes, and parts.
import os
import random
import sys
from collections import Counter
def format_labels(image_labels):
"""
Convert the image labels to be integers between [0, num classes)
Returns :
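The preview cuts off mid-docstring. A minimal hypothetical sketch of what such a label-formatting step usually looks like (not the gist's actual implementation) is:

# Hypothetical sketch: map arbitrary class labels to contiguous integer ids in [0, num_classes).
# The return format (relabeled list plus mapping dict) is an assumption.
def format_labels_sketch(image_labels):
    unique_labels = sorted(set(image_labels))
    label_to_id = {label: i for i, label in enumerate(unique_labels)}
    return [label_to_id[label] for label in image_labels], label_to_id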