Daniel Gleason dnlglsn

dnlglsn / export_inference_graph_unfrozen.py
Created September 14, 2017 22:21
Export a checkpointed object_detection model for serving with TensorFlow Serving
"""
References:
https://github.com/tensorflow/models/blob/master/object_detection/g3doc/exporting_models.md
https://github.com/tensorflow/models/issues/1988
Unfortunately, the tutorial for saving a model for inference "freezes" the
variables into constants, which makes the exported graph unservable by
tensorflow_serving. export_inference_graph.py exports an empty "variables"
directory, which needs to be populated with the checkpoint variables before
the model can be served.