TensorFlow Model Server and gRPC Client
#!/usr/bin/env python3
"""gRPC client for a ResNet model served by TensorFlow Model Server."""
import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Connect to the model server's gRPC port.
channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build a PredictRequest addressed to the "resnet" model's default signature.
request = predict_pb2.PredictRequest()
request.model_spec.name = "resnet"
request.model_spec.signature_name = "serving_default"

# Random uint8 image batch of shape (1, 64, 64, 3) sent as the "inputs" tensor.
x = np.random.rand(1, 64, 64, 3)
x = (x * 255).astype("uint8")
request.inputs["inputs"].CopyFrom(tf.make_tensor_proto(x))

# Send the request with a 10-second timeout and print the PredictResponse.
result = stub.Predict(request, 10.0)
print(result)
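
The PredictResponse carries one TensorProto per output of the signature. A minimal sketch of turning it back into NumPy, appended to the end of the script above (the output keys depend on how the SavedModel was exported, e.g. "classes" and "probabilities" for the official ResNet export, so inspect result.outputs.keys() first):

# Convert every output TensorProto in the response back to a NumPy array.
for key, tensor_proto in result.outputs.items():
    array = tf.make_ndarray(tensor_proto)
    print(key, array.shape, array.dtype)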
  • Install TF ModelServer and Serving API Documentation.

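One possible setup, assuming a Debian/Ubuntu host on which the tensorflow-serving apt repository has already been added as described in the Serving documentation (the tensorflow/serving Docker image is an alternative):

# Serving binary from the tensorflow-serving apt repository:
apt-get install tensorflow-model-server
# Python client stubs (and the gRPC runtime) used by the script above:
pip install tensorflow-serving-api grpcio
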
  • Start a TensorFlow Model Server for the exported SavedModel.

tensorflow_model_server \
    --port=8500 \
    --model_name=resnet \
    --model_base_path=<path> \
    --saved_model_tags="serve"
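--model_base_path must point at a directory containing one numbered subdirectory per SavedModel version; the server loads the highest version it finds. A sketch of the expected layout (version 1 is illustrative):

<path>/
└── 1/
    ├── saved_model.pb
    └── variables/
        ├── variables.data-00000-of-00001
        └── variables.index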
  • Check that the server is listening on port 8500, e.g.:
nmap localhost
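If nmap is not installed, the same readiness check can be done from Python with gRPC's channel readiness future (a minimal sketch; the 5-second timeout is arbitrary):

import grpc

# Block until the channel is ready; raises grpc.FutureTimeoutError otherwise.
channel = grpc.insecure_channel("localhost:8500")
grpc.channel_ready_future(channel).result(timeout=5)
print("tensorflow_model_server is accepting connections on port 8500")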
  • Send gRPC requests using the above code.