Tutorial for Running Tensorflow Serving

Tensorflow Serving Tutorial - 01 - Public Inception Model

In this tutorial:

  • Start with a blank Ubuntu
  • Install requirements
  • Download code & pre-trained inception-v3 model from Google
  • Export the downloaded "checkpoint" format to a TF Graph that is servable with Tensorflow Model Serving
  • Query the server with images of a cat and a dog

General Notes

  • Compiling Tensorflow Serving from source (in Docker, following the official instructions) produced an internal gcc error (probably specific to the Tensorflow commit / gcc version used at the time of writing; tf serving commit: c1ec43508ee57a5d6269116aba82d2a16d383c8a)
  • If you need to debug or write code, set yourself up with a decent IDE such as PyCharm on a local machine that is able to execute the code.

Assumptions

  • You have an existing model that can be loaded via Tensorflow
  • Work on *nix (specific instructions are for Ubuntu). Tensorflow Serving is available as a pre-compiled binary via apt-get, which is a big advantage (it should probably work on RHEL / CentOS as well).

Installations

  • Get the Tensorflow Serving source (this also pulls in the submodules with the Python code used below)
cd ~
git clone --recurse-submodules https://github.com/tensorflow/serving
  • Install Python 2.7
  • Install tensorflow-serving-api - to run the python client for TF Serving
# Note: You may need to 'sudo bash' before running this,
#  depending on your Python installation
pip install tensorflow-serving-api
# Add TensorFlow Serving distribution URI as a package source (one time setup)
echo "deb [arch=amd64] http://storage.googleapis.com/tensorflow-serving-apt stable tensorflow-model-server tensorflow-model-server-universal" | sudo tee /etc/apt/sources.list.d/tensorflow-serving.list
curl https://storage.googleapis.com/tensorflow-serving-apt/tensorflow-serving.release.pub.gpg | sudo apt-key add -

# Install and update TensorFlow ModelServer
sudo apt-get update && sudo apt-get install tensorflow-model-server

# It is now available via global CLI: tensorflow_model_server
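
A quick way to confirm that the Python client package installed correctly is to import its gRPC request/stub modules. A minimal check, assuming a tensorflow-serving-api release that ships the modules named below:

# sanity_check_serving_api.py - confirm the client package imports cleanly
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2

print('tensorflow-serving-api import OK')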

Sanity check - Existing model from Google

Test that the official Inception-v3 works with TF Serving. Full steps are available at: Tensorflow Serving - Inception

  • Download & Extract a checkpoint of the model.
root@c97d8e820ced:~/serving$ curl -O http://download.tensorflow.org/models/image/imagenet/inception-v3-2016-03-01.tar.gz
root@c97d8e820ced:~/serving$ tar xzf inception-v3-2016-03-01.tar.gz
root@c97d8e820ced:~/serving$ ls inception-v3
  • Convert the checkpoint to a model that TF Serving can handle (wrap it with an input & output signature) - instructions. A toy sketch of this export pattern is shown after the commands below.
# Fix: the 'inception' module is not on the Python path (applies to the current process only)
export PYTHONPATH=~/serving/tf_models/research/inception/

# Fix packages - the Python files won't import because of missing __init__.py files:
echo > tf_models/research/inception/inception/__init__.py
echo > tf_models/research/inception/inception/slim/__init__.py

# Export inception from checkpoint to servable model
python tensorflow_serving/example/inception_saved_model.py --checkpoint_dir=inception-v3 --output_dir=/tmp/inception-export
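
For reference, the essence of what inception_saved_model.py does is: restore the graph weights and write a versioned SavedModel directory with named input/output signatures, which tensorflow_model_server can load. The toy sketch below demonstrates the same export pattern with a trivial graph (not Inception); the paths, tensor names and signature name are illustrative only, and it assumes TensorFlow 1.x:

# Toy sketch of the "wrap with input & output signature" export pattern.
# Not the actual Inception export - the graph, paths and names are made up.
import tensorflow as tf

export_dir = '/tmp/toy-export/1'  # TF Serving expects a numeric version subdirectory

with tf.Session(graph=tf.Graph()) as sess:
    x = tf.placeholder(tf.float32, shape=[None, 1], name='x')
    w = tf.Variable([[2.0]], name='w')
    y = tf.matmul(x, w, name='y')
    sess.run(tf.global_variables_initializer())

    # In the real script, this is where the Inception checkpoint is restored,
    # e.g. tf.train.Saver().restore(sess, checkpoint_path)

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'x': x}, outputs={'y': y})
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'predict': signature})
    builder.save()

After the real export command above, /tmp/inception-export should contain a numeric version folder (e.g. 1/) with saved_model.pb and a variables/ directory - that versioned layout is what the model server watches.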
  • Start the TF Serving model server (this will block the current bash; a quick readiness check to run from another terminal is sketched after the command below)
tensorflow_model_server --port=9000 --model_name=inception --model_base_path=/tmp/inception-export
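
Since the server blocks the current shell, open another terminal to continue. A minimal readiness check (pure standard library; it only verifies the gRPC port is accepting connections, not that the model actually loaded):

# check_server_port.py - verify that something is listening on port 9000
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(2)
result = sock.connect_ex(('localhost', 9000))
sock.close()
print('server is up' if result == 0 else 'server not reachable yet')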
  • Get some samples for the query:
# Download images to /tmp/images/
mkdir /tmp/images
curl -o /tmp/images/cat-sphynx.jpg https://www.pets4homes.co.uk/images/breeds/23/large/3514efe61d990b82bbc37bed00eea52a.jpg
curl -o /tmp/images/dog-husky.jpg https://www.pets4homes.co.uk/images/articles/3994/large/10-reasons-why-everyone-loves-the-siberian-husky-58cf9f7a1137e.jpg
root@c97d8e820ced:~/serving$ python tensorflow_serving/example/inception_client.py --server=localhost:9000 --image=/tmp/images/cat-sphynx.jpg
outputs {
  key: "classes"
  value {
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 5
      }
    }
    string_val: "Mexican hairless"
    string_val: "toy terrier"
    string_val: "Italian greyhound"
    string_val: "Siamese cat, Siamese"
    string_val: "Egyptian cat"
  }
}
root@c97d8e820ced:~/serving$ python tensorflow_serving/example/inception_client.py --server=localhost:9000 --image=/tmp/images/dog-husky.jpg
outputs {
  key: "classes"
  value {
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 5
      }
    }
    string_val: "Siberian husky"
    string_val: "Eskimo dog, husky"
    string_val: "malamute, malemute, Alaskan malamute"
    string_val: "timber wolf, grey wolf, gray wolf, Canis lupus"
    string_val: "dogsled, dog sled, dog sleigh"
  }
}
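
For completeness, a rough sketch of what inception_client.py does under the hood: read the raw JPEG bytes, wrap them in a PredictRequest for the 'inception' model, and send it over gRPC. This assumes TensorFlow 1.x and a tensorflow-serving-api release that ships the prediction_service_pb2_grpc module (older releases exposed the stub through the grpc.beta API instead); the signature name below is the one the export script is expected to register.

# Rough equivalent of inception_client.py (not the original script)
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:9000')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

with open('/tmp/images/cat-sphynx.jpg', 'rb') as f:
    image_bytes = f.read()

request = predict_pb2.PredictRequest()
request.model_spec.name = 'inception'
request.model_spec.signature_name = 'predict_images'  # signature registered by the export script
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(image_bytes, shape=[1]))

result = stub.Predict(request, 10.0)  # 10 second timeout
print(result)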