Created December 24, 2020 02:02
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "Publishing.ipynb",
"provenance": []
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
}
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "8R_znCjFXdhj"
},
"source": [
"### Export the Inference Graph\n",
"\n",
"The code cell below adds a line to the tf_utils.py file. This is a temporary fix for an exporting issue that occurs when using the API with TensorFlow 2. This code will be removed as soon as the TF team releases a fix.\n",
"\n",
"All credit goes to the GitHub users [Jacobsolawetz](https://github.com/Jacobsolawetz) and [Tanner Gilbert](https://github.com/TannerGilbert), who provided this [temporary fix](https://github.com/tensorflow/models/issues/8841#issuecomment-657647648)."
]
},
{
"cell_type": "code",
"metadata": {
"id": "AsrZNbomXfzA"
},
"source": [
"with open('/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/utils/tf_utils.py') as f:\n",
" tf_utils = f.read()\n",
"\n",
"with open('/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/utils/tf_utils.py', 'w') as f:\n",
" # Guard the TypeError so string inputs are skipped instead of raising\n",
" throw_statement = \"raise TypeError('Expected Operation, Variable, or Tensor, got ' + str(x))\"\n",
" tf_utils = tf_utils.replace(throw_statement, \"if not isinstance(x, str):\" + throw_statement)\n",
" f.write(tf_utils)"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "cRJxDZ36XdKw",
"outputId": "5117a940-237d-4e9e-ca0b-8b1a63f0b43f"
},
"source": [
"output_directory = 'inference_graph'\n",
"\n",
"!python /content/models/research/object_detection/exporter_main_v2.py \\\n",
" --trained_checkpoint_dir {model_dir} \\\n",
" --output_directory {output_directory} \\\n",
" --pipeline_config_path {pipeline_config_path}"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"text": [
"2020-12-20 17:30:11.903455: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n",
"2020-12-20 17:30:17.311425: I tensorflow/compiler/jit/xla_cpu_device.cc:41] Not creating XLA devices, tf_xla_enable_xla_devices not set\n",
"2020-12-20 17:30:17.325027: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcuda.so.1\n",
"INFO:tensorflow:Assets written to: inference_graph/saved_model/assets\n",
"I1220 17:30:57.785869 139885737965440 builder_impl.py:775] Assets written to: inference_graph/saved_model/assets\n",
"INFO:tensorflow:Writing pipeline config file to inference_graph/pipeline.config\n",
"I1220 17:30:58.405525 139885737965440 config_util.py:254] Writing pipeline config file to inference_graph/pipeline.config\n"
],
"name": "stdout"
}
]
}
]
}
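One caveat with the patch cell above: it rewrites `tf_utils.py` in place, so running it a second time would prepend the guard again and produce a syntax error. Below is a minimal idempotent variant of the same idea — `patch_tf_utils` is a hypothetical helper name, and the demonstration runs on a throwaway file rather than the real `tf_utils.py`:

```python
import os
import tempfile

THROW = "raise TypeError('Expected Operation, Variable, or Tensor, got ' + str(x))"
GUARD = "if not isinstance(x, str): " + THROW

def patch_tf_utils(path: str) -> bool:
    """Insert a string-type guard before the TypeError.

    Skips files that are already patched (or don't contain the statement),
    so repeated runs are safe. Returns True when the file was modified.
    """
    with open(path) as f:
        src = f.read()
    if GUARD in src or THROW not in src:
        return False
    with open(path, "w") as f:
        f.write(src.replace(THROW, GUARD))
    return True

# Demonstration on a throwaway copy rather than the real tf_utils.py:
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as tmp:
    tmp.write("    " + THROW + "\n")
demo = tmp.name
assert patch_tf_utils(demo) is True   # first call applies the patch
assert patch_tf_utils(demo) is False  # second call is a no-op
os.unlink(demo)
```

The `GUARD in src` check is what makes the operation safe to re-run; the notebook's one-shot version relies on the cell being executed exactly once.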
To anyone wondering why their code doesn't work: feel free to change those paths to wherever those .py files are. For me, this meant changing the directory name 'python3.6' to 'python3.7'. Worst case, you can dig around your Google Colab machine for them :)
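As the comment above notes, the hardcoded `python3.6` segment breaks whenever the Colab runtime ships a different Python. A small sketch that derives the path from the running interpreter instead, using only the standard library (the `tensorflow/...` suffix is the one used in the notebook):

```python
import os
import sysconfig

# Ask the running interpreter where pure-Python packages are installed,
# instead of hardcoding /usr/local/lib/python3.X/dist-packages.
site_packages = sysconfig.get_paths()["purelib"]
tf_utils_path = os.path.join(
    site_packages, "tensorflow", "python", "keras", "utils", "tf_utils.py"
)
print(tf_utils_path)
```

Substituting `tf_utils_path` for the literal path in the patch cell keeps the fix working across runtime upgrades.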