# Some Google Cloud CLI commands to set up Dataproc, BigQuery, etc.
cluster_name="cluster-1"
# create a dataproc cluster with a Jupyter notebook https://cloud.google.com/dataproc/docs/tutorials/jupyter-notebook
gcloud dataproc clusters create "${cluster_name}" \
  --zone="us-central1-c" \
  --initialization-actions \
  gs://dataproc-initialization-actions/jupyter/jupyter.sh

# Proxy and access the web UIs of the cluster https://cloud.google.com/dataproc/docs/concepts/cluster-web-interfaces
# open a SOCKS proxy (port 10000) to the cluster's master node
gcloud compute ssh --zone="us-central1-c" \
  --ssh-flag="-D" --ssh-flag="10000" --ssh-flag="-N" "${cluster_name}-m"
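With the SOCKS proxy open, the cluster web UIs can be reached through a browser pointed at it; a minimal sketch, assuming Chrome on the local machine and the YARN UI on port 8088 of the master node:
# route a throwaway Chrome profile through the SOCKS proxy and open the YARN UI
google-chrome --proxy-server="socks5://localhost:10000" \
  --user-data-dir="/tmp/${cluster_name}-m" "http://${cluster_name}-m:8088"
The gist title also mentions BigQuery; a quick smoke test with the bq CLI could be:
# run a trivial standard-SQL query to confirm BigQuery access
bq query --use_legacy_sql=false 'SELECT 1'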
# Run a single Tensorflow Serving test target with Bazel
bazel test -c opt //tensorflow_serving/sources/storage_path:file_system_storage_path_source_test
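To run the whole Tensorflow Serving suite rather than a single target, Bazel's package wildcard can be used (same -c opt setting assumed):
# run every test target under tensorflow_serving
bazel test -c opt //tensorflow_serving/...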
#!/bin/bash
# This script is used to compile Tensorflow Serving protobuf definition into Python
# The generated definitions in Python are stored in the tensorflow_serving_apis folder
# Usage:
# ./compile_ts_serving_proto.sh 333325e413e9680d67ae90196fa123f5271fcf615
#
: ${1?"Error. Please provide the Tensorflow Serving git commit hash/branch name. Usage: ./compile_ts_serving_proto.sh my_awesome_branch"}
script_dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
ts_git_revision=$1 # branch/release or commit hash
local_ts_api_dir="${script_dir}/tensorflow_serving_apis/" # directory that stores the compiled Python proto definitions
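The gist stops at the variable setup; one plausible continuation, assuming grpcio-tools is installed and that the Tensorflow protos imported by tensorflow_serving/apis are also reachable on the include path, is sketched below:
# fetch Tensorflow Serving at the requested revision into a temporary directory
ts_src_dir="$(mktemp -d)"
git clone https://github.com/tensorflow/serving "${ts_src_dir}/serving"
cd "${ts_src_dir}/serving" && git checkout "${ts_git_revision}"

# compile the API .proto files into Python modules and gRPC stubs
mkdir -p "${local_ts_api_dir}"
python -m grpc_tools.protoc \
  -I"${ts_src_dir}/serving" \
  --python_out="${local_ts_api_dir}" \
  --grpc_python_out="${local_ts_api_dir}" \
  "${ts_src_dir}/serving"/tensorflow_serving/apis/*.proto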
waichee / build_ts_serving_source.sh
Last active January 13, 2019 05:45
Code to Build Tensorflow Serving from source within a Docker container
mkdir -p /work/
# Clone the source from Github
cd /work/ && git clone --recurse-submodules https://github.com/tensorflow/serving
# Pin the version of Tensorflow Serving and its submodule
TENSOR_SERVING_COMMIT_HASH=85db9d3
TENSORFLOW_COMMIT_HASH=dbe5e17
cd /work/serving && git checkout $TENSOR_SERVING_COMMIT_HASH
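The snippet pins TENSORFLOW_COMMIT_HASH but never uses it; a sketch of the likely remaining steps, assuming the tensorflow submodule sits at serving/tensorflow and the standard model-server Bazel target:
# pin the Tensorflow submodule to the matching revision (submodule path assumed)
cd /work/serving/tensorflow && git checkout $TENSORFLOW_COMMIT_HASH && cd ..
# build the model server from source (expect a long build)
bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server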
# install the required packages
pip install memory_profiler
pip install matplotlib
# run the profiler to record the memory usage
# samples every 0.1s by default
mprof run --include-children python fantastic_model_building_code.py
# plot the recorded memory usage
mprof plot --output memory-profile.png
waichee / create_docker_container.sh
Created February 14, 2017 10:12
Steps to create a docker container with dependencies required for compiling Tensorflow Serving
# Clone the Tensorflow Serving source
git clone https://github.com/tensorflow/serving
(cd serving && git checkout <commit_hash>) # check out the desired revision in a subshell so later paths stay relative to the parent directory
# Build the docker image (time to go get yourself a coffee, maybe a meal as well, this will take a while.)
docker build -t some_user_namespace/tensorflow-serving:latest -f ./serving/tensorflow_serving/tools/docker/Dockerfile.devel .
# Run up the Docker container in terminal
docker run -ti some_user_namespace/tensorflow-serving:latest
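To have the freshly cloned source available inside the running container rather than only what the image ships with, the repository can be bind-mounted at start-up; the /work/serving path here is an assumption, not from the gist:
# mount the local clone into the container
docker run -ti -v "$(pwd)/serving:/work/serving" some_user_namespace/tensorflow-serving:latest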
waichee / leaky.py
Last active September 24, 2020 00:25
import time
import random
# A dummy script that keeps growing the list `a` by appending ever larger strings
thing = "hi"
a = []
for i in range(1000):
    a.append(thing * random.randint(1000, 2000))
    time.sleep(0.1)
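This is the kind of script the memory-profiler snippet above targets; assuming memory_profiler is installed, its growth can be recorded and plotted with:
# record the leaky script's memory usage and render a plot
mprof run python leaky.py
mprof plot --output leaky-memory-profile.png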