keras_06_model_serving.ipynb
Last active: August 26, 2023
{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "colab": {
      "provenance": [],
      "name": "keras_06_model_serving.ipynb",
      "authorship_tag": "ABX9TyNitQh9qBxUgk9OB9eAUF6R",
      "include_colab_link": true
    },
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    },
    "language_info": {
      "name": "python"
    }
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "view-in-github",
        "colab_type": "text"
      },
      "source": [
        "<a href=\"https://colab.research.google.com/gist/Muhammad-Yunus/0a3141580ce29fc0199f6589298df60f/keras_06_model_serving.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "# Serve a Keras Model Using TensorFlow Serving\n",
        "\n",
        "### 1. Upload `my_model.zip` to Colab"
      ],
      "metadata": {
        "id": "339ch4xSyanP"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "o78w7Ot_ySA1"
      },
      "outputs": [],
      "source": [
        "# upload my_model.zip to the Colab notebook\n",
        "%cd /content\n",
        "%mkdir -p my_model/1/\n",
        "%cd my_model/1/\n",
        "\n",
        "import os\n",
        "from zipfile import ZipFile\n",
        "from google.colab import files\n",
        "\n",
        "print(\"Upload `my_model.zip` to Colab:\")\n",
        "uploaded = files.upload()\n",
        "\n",
        "for fileName, data in uploaded.items():\n",
        "    # save the uploaded bytes to disk\n",
        "    with open(fileName, 'wb') as f:\n",
        "        f.write(data)\n",
        "    print('Saved (.zip) file ' + fileName)\n",
        "\n",
        "    # extract the archive, then remove it\n",
        "    with ZipFile(fileName) as ds:\n",
        "        ds.extractall()\n",
        "    os.remove(fileName)\n",
        "    print('Extracted zip file ' + fileName)\n",
        "\n",
        "%cd .."
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### 2. Add the `tensorflow-model-server` package source to our list of APT sources"
      ],
      "metadata": {
        "id": "LQzSBEHj1iaa"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "!echo \"deb http://storage.googleapis.com/tensorflow-serving-apt stable tensorflow-model-server tensorflow-model-server-universal\" | tee /etc/apt/sources.list.d/tensorflow-serving.list && \\\n",
        "curl https://storage.googleapis.com/tensorflow-serving-apt/tensorflow-serving.release.pub.gpg | apt-key add -\n",
        "!apt update"
      ],
      "metadata": {
        "id": "C6tXdQB-1mXR"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "source": [
        "### 3. Install TensorFlow Model Server"
      ],
      "metadata": {
        "id": "2vZiB1D71sT5"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "!apt-get install tensorflow-model-server"
      ],
      "metadata": {
        "id": "i84MCcAr1r_f"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "source": [
        "### 4. Run TensorFlow Serving"
      ],
      "metadata": {
        "id": "z6M4fwlp1x-j"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "- TensorFlow Serving requires the model to live in a versioned directory: `/content/my_model/{version}`"
      ],
      "metadata": {
        "id": "rCH3f835BXn1"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "os.environ[\"MODEL_DIR\"] = \"/content/my_model\""
      ],
      "metadata": {
        "id": "7LqrDdMA10nw"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "%%bash --bg\n",
        "nohup tensorflow_model_server \\\n",
        "  --rest_api_port=8501 \\\n",
        "  --model_name=my_model \\\n",
        "  --model_base_path=\"${MODEL_DIR}\" >server.log 2>&1\n"
      ],
      "metadata": {
        "id": "loqSVGes15m5"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "!tail server.log"
      ],
      "metadata": {
        "id": "tI83ieh_2dXU"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "source": [
        "- TensorFlow Serving exposes the prediction API in the following URI format\n",
        "`http://<host>:<port>/v1/models/<model_name>:predict`\\\n",
        "In our case, that is\\\n",
        "`http://localhost:8501/v1/models/my_model:predict`"
      ],
      "metadata": {
        "id": "kS2VmYH4Avl_"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "---\n",
        "- If you want to **stop TensorFlow Serving**, find its process ID and kill it; don't forget to uncomment the commands below (remove the leading `#`)."
      ],
      "metadata": {
        "id": "Q_6XGbpm7pwZ"
      }
    },
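    {
      "cell_type": "markdown",
      "source": [
        "- As a quick sanity check (an added sketch, not part of the original walkthrough), you can query TensorFlow Serving's standard model-status endpoint, `http://localhost:8501/v1/models/my_model`, which reports whether the model version loaded successfully:"
      ],
      "metadata": {}
    },
    {
      "cell_type": "code",
      "source": [
        "# query the model status endpoint; a healthy server reports \"state\": \"AVAILABLE\"\n",
        "!curl http://localhost:8501/v1/models/my_model"
      ],
      "metadata": {},
      "execution_count": null,
      "outputs": []
    },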
    {
      "cell_type": "code",
      "source": [
        "#!ps aux | grep tensorflow_model_server"
      ],
      "metadata": {
        "id": "x1uN4vMN7LGE"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "#!kill -9 <pid>"
      ],
      "metadata": {
        "id": "jFGe-dQF7P0y"
      },
      "execution_count": null,
      "outputs": []
    },
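    {
      "cell_type": "markdown",
      "source": [
        "- Alternatively (an added note), `pkill` can match the process by name and kill it in one step, without looking up the PID manually:"
      ],
      "metadata": {}
    },
    {
      "cell_type": "code",
      "source": [
        "# -f matches against the full command line; uncomment to use\n",
        "#!pkill -9 -f tensorflow_model_server"
      ],
      "metadata": {},
      "execution_count": null,
      "outputs": []
    },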
    {
      "cell_type": "markdown",
      "source": [
        "---"
      ],
      "metadata": {
        "id": "icf3Rc01ACNU"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "### 5. Making requests to TensorFlow Serving\n",
        "\n",
        "- Load the MNIST dataset for testing"
      ],
      "metadata": {
        "id": "tUKN96fB3KXQ"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "import tensorflow as tf\n",
        "\n",
        "_, (X_test, y_test) = tf.keras.datasets.mnist.load_data()\n",
        "\n",
        "# normalize the data -> scale pixel values into [0, 1]\n",
        "X_test = X_test / 255.0\n",
        "\n",
        "# reshape the dataset to (num_samples, 28, 28, 1)\n",
        "X_test = X_test.reshape(X_test.shape[0], 28, 28, 1)\n",
        "\n",
        "class_names = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9']"
      ],
      "metadata": {
        "id": "KmkmpeV93uK-"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "source": [
        "- Create a utility function to show the prediction result"
      ],
      "metadata": {
        "id": "m4-qMV7sAMjD"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "import numpy as np\n",
        "import matplotlib.pyplot as plt\n",
        "\n",
        "def show(idx, title):\n",
        "    plt.figure()\n",
        "    plt.imshow(X_test[idx].reshape(28, 28))\n",
        "    plt.title('\\n\\n{}'.format(title), fontdict={'size': 16})\n"
      ],
      "metadata": {
        "id": "tuF-Eu0v3L7D"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "import random\n",
        "rando = random.randint(0, len(X_test) - 1)\n",
        "show(rando, 'An Example Image: {}'.format(class_names[y_test[rando]]))"
      ],
      "metadata": {
        "id": "g5d0iNx43UBL"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "source": [
        "- Create the request body and call your model API served by TensorFlow Serving"
      ],
      "metadata": {
        "id": "atZ7ndtwAStq"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "import json\n",
        "\n",
        "# create a JSON request body holding 3 test instances (one request, three predictions)\n",
        "data = json.dumps({\n",
        "    \"signature_name\": \"serving_default\",\n",
        "    \"instances\": X_test[0:3].tolist()\n",
        "})\n",
        "\n",
        "print('Data: {} ... {}'.format(data[:50], data[len(data)-52:]))"
      ],
      "metadata": {
        "id": "Immh0aQB5b8r"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "!pip install -q requests"
      ],
      "metadata": {
        "id": "wKwW9hf85reJ"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "import requests"
      ],
      "metadata": {
        "id": "rANJYRzR8quB"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "# make an HTTP POST request to the prediction API\n",
        "headers = {\"content-type\": \"application/json\"}\n",
        "json_response = requests.post('http://localhost:8501/v1/models/my_model:predict', data=data, headers=headers)\n",
        "predictions = json.loads(json_response.text)['predictions']\n"
      ],
      "metadata": {
        "id": "JFcCn4fH5wZi"
      },
      "execution_count": null,
      "outputs": []
    },
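    {
      "cell_type": "markdown",
      "source": [
        "- Each entry in `predictions` is a list of 10 class probabilities, one per digit; `np.argmax` recovers the predicted class. As an added sketch, compare all three predictions with the true labels:"
      ],
      "metadata": {}
    },
    {
      "cell_type": "code",
      "source": [
        "# argmax over the 10 probabilities gives the predicted digit for each instance\n",
        "for i, pred in enumerate(predictions):\n",
        "    print('instance {}: predicted {}, actual {}'.format(i, np.argmax(pred), y_test[i]))"
      ],
      "metadata": {},
      "execution_count": null,
      "outputs": []
    },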
    {
      "cell_type": "code",
      "source": [
        "# show and check the result\n",
        "N = 2\n",
        "predicted = int(np.argmax(predictions[N]))\n",
        "show(N, 'The model thought this was a {} (class {}), and it was actually a {} (class {})'.format(\n",
        "    class_names[predicted], predicted, class_names[y_test[N]], y_test[N]))"
      ],
      "metadata": {
        "id": "8AvdhYT0542n"
      },
      "execution_count": null,
      "outputs": []
    }
  ]
}