To install TensorFlow Serving dependencies, execute the following:
sudo /usr/local/bin/pip2 install mock
sudo apt-get update && sudo apt-get install -y \
build-essential \
curl \
git \
libfreetype6-dev \
libpng12-dev \
libzmq3-dev \
pkg-config \
python-dev \
python-numpy \
python-pip \
software-properties-common \
swig \
zip \
zlib1g-dev
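Before moving on, it can save a failed build later to confirm the tools the steps below rely on are actually on PATH. The snippet below is a sketch, not part of the official guide; adjust the tool list to match your system:

```shell
# Sanity check: verify the build tools installed above are reachable on PATH.
missing=0
for tool in git curl pkg-config python pip swig; do
  command -v "$tool" >/dev/null 2>&1 || { echo "missing: $tool" >&2; missing=1; }
done
if [ "$missing" -eq 0 ]; then
  echo "all build tools found"
else
  echo "install the missing tools before building" >&2
fi
```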
Clone TensorFlow Serving:

git clone --recurse-submodules https://github.com/tensorflow/serving
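TensorFlow itself is vendored into the serving repository as a git submodule, which is why the clone needs --recurse-submodules. A guarded check (a sketch; it no-ops if you are not next to the checkout) that the recursive clone actually populated the submodules:

```shell
# Verify submodules were fetched; entries prefixed with "-" are uninitialized.
repo=serving
if [ -d "$repo/.git" ]; then
  git -C "$repo" submodule status
fi
```

If any entry starts with "-", run `git submodule update --init --recursive` inside the checkout.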
Configure TensorFlow with the following settings:
cd serving/tensorflow
# Enable TensorFlow Google Cloud Platform support
export TF_NEED_GCP=1
export TF_CUDNN_VERSION=5
export CUDNN_INSTALL_PATH=/usr/local/cuda/lib64
export TF_UNOFFICIAL_SETTING=1
# The URL below lists CUDA compute capabilities per card. AWS GPU instances
# currently offer K520s, which are 3.0-capable.
# https://developer.nvidia.com/cuda-gpus
export TF_CUDA_COMPUTE_CAPABILITIES=3.0
export CUDA_TOOLKIT_PATH=/usr/local/cuda
export TF_CUDA_VERSION=7.5
export GCC_HOST_COMPILER_PATH=/usr/bin/gcc
export TF_NEED_CUDA=1
export PYTHON_BIN_PATH=/usr/bin/python
# Clean out old pip packages
/bin/rm -rf /tmp/tensorflow_pkg/*
./configure
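With TF_NEED_CUDA=1, ./configure will fail if the CUDA toolkit and cuDNN are not where the exports above point. This guarded pre-check (a sketch; the fallback paths mirror the exports above) confirms those directories exist before you kick off configuration:

```shell
# Confirm the CUDA paths referenced by the exports actually exist.
CUDA_TOOLKIT_PATH="${CUDA_TOOLKIT_PATH:-/usr/local/cuda}"
CUDNN_INSTALL_PATH="${CUDNN_INSTALL_PATH:-/usr/local/cuda/lib64}"
for p in "$CUDA_TOOLKIT_PATH" "$CUDNN_INSTALL_PATH"; do
  if [ -d "$p" ]; then
    echo "ok: $p"
  else
    echo "not found: $p (building CPU-only? set TF_NEED_CUDA=0)" >&2
  fi
done
```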
cd ..
bazel build tensorflow_serving/...
Binaries are placed in the bazel-bin directory and can be run with a command like:
./bazel-bin/tensorflow_serving/example/mnist_inference
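To see what else the build produced, you can list the executables under bazel-bin. The snippet below is a guarded sketch (it no-ops if the build has not completed or you are outside the serving directory):

```shell
# List executable example binaries produced by the bazel build, if present.
bin_dir=bazel-bin/tensorflow_serving/example
if [ -d "$bin_dir" ]; then
  find "$bin_dir" -maxdepth 1 -type f -perm -u+x
fi
```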
To test your installation, execute:
bazel test tensorflow_serving/...
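Running the full test tree can take a long time. Bazel also accepts narrower target patterns, so a single package can be tested on its own; the package name below is illustrative, and the command is guarded so it no-ops outside a Bazel workspace:

```shell
# Run tests for one package instead of the whole tree (illustrative target).
target="tensorflow_serving/core/..."
if command -v bazel >/dev/null 2>&1 && [ -f WORKSPACE ]; then
  bazel test "$target"
fi
```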