@hmahadik
Last active May 17, 2021 17:36
Inference using tflite_runtime's Interpreter with input data generated using NumPy's random_sample
from tflite_runtime import interpreter
import numpy as np
import time

# Load the MobileNet V1 TFLite model and allocate tensors before running inference.
i = interpreter.Interpreter("mobilenet_v1_1_224.tflite")
i.allocate_tensors()

input_details = i.get_input_details()
output_details = i.get_output_details()

# Fill the input tensor with random data matching the model's expected input shape.
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
i.set_tensor(input_details[0]['index'], input_data)
# The first invoke() includes one-time setup cost, so time it separately as a warm-up.
t1 = time.time(); i.invoke(); t2 = time.time()
print(f"Warm-up time: {1000.0*(t2-t1)} ms")
# A second invoke() gives a steady-state inference time.
t1 = time.time(); i.invoke(); t2 = time.time()
print(f"Inference time: {1000.0*(t2-t1)} ms")
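# Optional sketch (not part of the original gist): average several runs for a more
# stable latency estimate; the run count of 10 is an arbitrary choice.
times = []
for _ in range(10):
    t1 = time.time(); i.invoke(); t2 = time.time()
    times.append(1000.0 * (t2 - t1))
print(f"Mean inference time over {len(times)} runs: {sum(times)/len(times):.2f} ms")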
# Read back the output tensor (class scores for MobileNet V1) and print it.
output_data = i.get_tensor(output_details[0]['index'])
print(output_data)
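A minimal post-processing sketch, assuming the model is the standard ImageNet MobileNet V1 classifier whose output is a (1, 1001) vector of class scores; it simply reports the highest-scoring class index.
# Assumes output_data has shape (1, 1001), as the ImageNet MobileNet V1 model produces.
top_index = int(np.argmax(output_data[0]))
top_score = float(output_data[0][top_index])
print(f"Top class index: {top_index}, score: {top_score:.4f}")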