@bmabir17

bmabir17/convert_load.py

Last active Aug 13, 2020
import tensorflow as tf

MODEL_FILE = "frozen_inference_graph_257.pb"

# Load the frozen TensorFlow graph into a TFLite converter
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file=MODEL_FILE,
    # For the Xception model the input tensor is `sub_7`; for MobileNet it is `sub_2`
    input_arrays=['sub_2'],
    output_arrays=['ResizeBilinear_2'],
    input_shapes={'sub_2': [1, 257, 257, 3]},
)
# Convert and write out the TFLite flatbuffer (output filename is illustrative)
tflite_model = converter.convert()
with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)