@samithaj
samithaj / batch_image_resize.sh
Created May 1, 2017 18:07
resizes all the images it finds in a folder and its subfolders
#!/bin/bash
# This script resizes all the images it finds in a folder (and its subfolders)
# The resized image is placed in the /resized folder which will reside in the same directory as the image
#
# Usage: > ./batch_resize.sh
initial_folder="./" # e.g. use "." to target the folder you are running the script from
resized_folder_name="./resized"
all_images=$(find -E "$initial_folder" -iregex ".*\.(jpg|gif|png|jpeg)") # -E enables extended regex on BSD/macOS find
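
The preview cuts off before the actual resize loop, so here is a loose Python sketch of the behaviour the header comments describe, using Pillow and an assumed target width of 1024 px (neither the library choice nor the size comes from the gist itself):

# Hedged Python sketch of the script's behaviour (assumption: pip install Pillow).
from pathlib import Path
from PIL import Image

TARGET_WIDTH = 1024  # assumed size; the gist's actual dimensions are not shown in the preview

for path in Path(".").rglob("*"):
    if path.suffix.lower() not in {".jpg", ".jpeg", ".png", ".gif"}:
        continue
    if "resized" in path.parts:          # skip images we already produced
        continue
    out_dir = path.parent / "resized"    # /resized folder next to the original image
    out_dir.mkdir(exist_ok=True)
    with Image.open(path) as img:
        ratio = TARGET_WIDTH / img.width
        resized = img.resize((TARGET_WIDTH, max(1, int(img.height * ratio))))
        resized.save(out_dir / path.name)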

TensorFlow Serving in 10 minutes!

TensorFlow Serving is Google's recommended way to deploy TensorFlow models. Without a proper computer-engineering background it can be quite intimidating, even for people who feel comfortable with TensorFlow itself. A few things I found particularly hard were:

  • Tutorial examples have C++ code (which I don't know)
  • Tutorials involve Kubernetes, gRPC, and Bazel (some of which I saw for the first time)
  • It needs to be compiled. That process takes forever!

In the end, it worked just fine. Here I present the easiest possible way to deploy your models with TensorFlow Serving. By the end of this tutorial you will have a self-built model running inside TF Serving; it will be scalable, and you will be able to query it via REST.
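Later versions of TF Serving also expose an HTTP predict endpoint, so once the model is up, querying it over REST can be as simple as the sketch below. The port 8501, the model name my_model, and the input shape are assumptions for illustration, not values from this tutorial.

# Sketch of a REST call to TF Serving's HTTP predict endpoint.
# Assumptions: the server was started with --rest_api_port=8501 and the model
# was exported under the name "my_model"; adjust both for your own setup.
import requests

payload = {"instances": [[1.0, 2.0, 5.0]]}   # input shape depends on your model's signature
resp = requests.post("http://localhost:8501/v1/models/my_model:predict", json=payload)
resp.raise_for_status()
print(resp.json()["predictions"])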

@samithaj
samithaj / seq2seq.py
Created September 10, 2017 04:00 — forked from ilblackdragon/seq2seq.py
Example of Seq2Seq with Attention using all the latest APIs
import logging
import numpy as np
import tensorflow as tf
from tensorflow.contrib import layers
GO_TOKEN = 0   # id fed to the decoder to start generation
END_TOKEN = 1  # id marking the end of a decoded sequence
UNK_TOKEN = 2  # id used for out-of-vocabulary symbols
@samithaj
samithaj / tf_seq2seq_single_str_inference.py
Created January 20, 2018 06:27 — forked from noname01/tf_seq2seq_single_str_inference.py
Quick hack for loading seq2seq model and inference via feed_dict.
from pydoc import locate
import tensorflow as tf
import numpy as np
from seq2seq import tasks, models
from seq2seq.training import utils as training_utils
from seq2seq.tasks.inference_task import InferenceTask, unbatch_dict
class DecodeOnce(InferenceTask):
'''
@samithaj
samithaj / mem-ts-RNN.py
Created June 16, 2018 17:36 — forked from lukovkin/mem-ts-RNN.py
Memory-efficient training of RNNs (modified example) - see https://groups.google.com/forum/#!topic/keras-users/vnGMtKPu1Xc for the discussion
# Code for Jupyter/IPython Notebook environment
from keras.models import Sequential
from keras.layers.core import TimeDistributedDense, Activation, Dropout
from keras.layers.recurrent import GRU
import numpy as np
from keras.utils.layer_utils import print_layer_shapes
%matplotlib inline
import matplotlib.pyplot as plt
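The thread linked in the description is about training RNNs on long sequences piece by piece while carrying the hidden state forward, instead of unrolling the whole sequence at once. The forked gist uses the old Keras API; a rough re-sketch of the same pattern in current tf.keras (the layer sizes, window length, and toy sine data are my own choices, not taken from the gist):

import numpy as np
import tensorflow as tf

# Toy setup (assumed): one long sine series, predicted one step ahead,
# fed to the network in 50-step windows so only a small slice is in memory at once.
batch, window, features = 1, 50, 1
series = np.sin(np.linspace(0, 100, 5001)).astype("float32")
x = series[:-1].reshape(batch, -1, features)   # (1, 5000, 1)
y = series[1:].reshape(batch, -1, features)

model = tf.keras.Sequential([
    tf.keras.layers.GRU(32, stateful=True, return_sequences=True,
                        batch_input_shape=(batch, window, features)),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])
model.compile(optimizer="adam", loss="mse")

for epoch in range(3):
    model.reset_states()                          # fresh hidden state for each pass over the series
    for start in range(0, x.shape[1], window):
        xb = x[:, start:start + window, :]
        yb = y[:, start:start + window, :]
        model.train_on_batch(xb, yb)              # hidden state carries over between consecutive windows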
@samithaj
samithaj / hyperband.py
Created November 26, 2019 11:48 — forked from PetrochukM/hyperband.py
Here we implement hyperband and successive-halving adaptations. We found the original hyperband implementation messy and untested, and we also wanted to adapt it to allow model reuse.
"""
We implement additional hyperparameter optimization methods not present in
https://scikit-optimize.github.io/.
Gist: https://gist.github.com/Deepblue129/2c5fae9daf0529ed589018c6353c9f7b
"""
import math
import logging
import random
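
For orientation, successive halving is the simple core of hyperband: evaluate many configurations on a small budget, keep the best fraction, and re-evaluate the survivors with a larger budget. A minimal sketch of that loop (the evaluate(config, budget) callable, the eta=3 schedule, and the toy learning-rate search are placeholders, not the gist's actual API):

import math
import random

def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Keep the best 1/eta of `configs` each round, multiplying the budget by eta."""
    budget = min_budget
    while len(configs) > 1:
        # Score every surviving configuration at the current budget (lower loss is better).
        scored = sorted(((evaluate(cfg, budget), cfg) for cfg in configs),
                        key=lambda pair: pair[0])
        keep = max(1, len(configs) // eta)
        configs = [cfg for _, cfg in scored[:keep]]
        budget *= eta                              # survivors get more resources next round
    return configs[0]

# Toy usage: noisy evaluations that get more reliable as the budget grows.
def evaluate(cfg, budget):
    return abs(math.log10(cfg["lr"]) + 3) + random.gauss(0, 1.0 / budget)

best = successive_halving([{"lr": 10 ** random.uniform(-5, -1)} for _ in range(27)], evaluate)
print(best)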