@Hyrtsi
Created April 12, 2022 13:11
Python code to analyze ONNX graph data types to debug the error "Unsupported ONNX data type: UINT8 (2)"
"""
What:
Upon converting my .onnx model to TensorRT .engine I got this error
Unsupported ONNX data type: UINT8 (2)
So I created this short python function to check which inputs cause the error.
After some googling I found out that uint8 is not supported in TensorRT.
I plan on replacing the uint8's with something else so that I'm able to
convert the model successfully.
"""
import onnx
def main():
model = onnx.load("onnx_model.onnx")
try:
onnx.checker.check_model(model)
except onnx.checker.ValidationError as e:
print("The model is invalid: %s" % e)
"""
# 1 = float32
# 2 = uint8
# 3 = int8
# 4 = uint16
# 5 = int16
# 6 = int32
# 7 = int64
"""
inputs = model.graph.input
for input in inputs:
dtype = input.type.tensor_type.elem_type
if dtype == 2:
print(input.name, ">>>", input.type.tensor_type.elem_type)
main()
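
For readability, the same check can also be written with the named constants from onnx.TensorProto instead of the bare code 2 (onnx.TensorProto.UINT8 equals 2). A minimal sketch, assuming the same onnx_model.onnx path:

    import onnx

    model = onnx.load("onnx_model.onnx")
    for input in model.graph.input:
        # Compare against the named constant rather than the magic number.
        if input.type.tensor_type.elem_type == onnx.TensorProto.UINT8:
            print(input.name, ">>>", input.type.tensor_type.elem_type)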

Hyrtsi commented Apr 12, 2022

The error may just as well come from the outputs. In theory you can change the declared inputs and outputs with code like this (see the sketch below), but that doesn't solve the issue: the whole model must be built without uint8 for it to be compatible with TensorRT.
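
For reference, a minimal sketch of that "in theory" rewrite, assuming the same onnx_model.onnx path as the script above (the output filename is mine). It only changes the declared element type of the graph's inputs and outputs, not the tensors produced inside the graph, which is why the conversion can still fail:

    import onnx

    model = onnx.load("onnx_model.onnx")

    # Rewrite the declared element type of every uint8 graph input and
    # output to float32. This touches only the graph interface, not the
    # ops inside it, so internal uint8 tensors remain.
    for value_info in list(model.graph.input) + list(model.graph.output):
        tensor_type = value_info.type.tensor_type
        if tensor_type.elem_type == onnx.TensorProto.UINT8:
            tensor_type.elem_type = onnx.TensorProto.FLOAT

    onnx.save(model, "onnx_model_float32.onnx")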
