@kali
Created May 29, 2020 16:33
TSAR 29/05 18:28 ~/dev/snips/tract/cli% cargo run -- ../.cached/onnx/onnx-1.5.0/onnx/backend/test/data/real/test_squeezenet/squeezenet/model.onnx -i 2x3x224x224xf32 dump --info
Finished dev [unoptimized + debuginfo] target(s) in 0.05s
Running `/home/kali/dev/snips/tract/target/debug/tract ../.cached/onnx/onnx-1.5.0/onnx/backend/test/data/real/test_squeezenet/squeezenet/model.onnx -i 2x3x224x224xf32 dump --info`
┏ 52 Source data_0
┃ ━━━ 2x3x224x224xF32
┣┻┻ 53 Conv conv1_1
┃ ━━━ 2x64x111x111xF32
┣ 54 ScalarMax conv1_2
┣ 55 MaxPool pool1_1
┃ ━━━ 2x64x55x55xF32
┃ * Data format: NCHW
┃ * Kernel shape:[3, 3] (strides:Some([2, 2]), padding:Explicit([0, 0], [0, 0]), dilations:None)
┣┻┻ 56 Conv fire2/squeeze1x1_1
┃ ━━━ 2x16x55x55xF32
┣ 57 ScalarMax fire2/squeeze1x1_2
┣┓
┃┣┻┻ 58 Conv fire2/expand1x1_1
┃┃ ━━━ 2x64x55x55xF32
┃┣ 59 ScalarMax fire2/expand1x1_2
┗┓
┃┣┻┻ 60 Conv fire2/expand3x3_1
┃┃ ━━━ 2x64x55x55xF32
┃┣ 61 ScalarMax fire2/expand3x3_2
┣┻ 62 InferenceConcat fire2/concat_1
┃ ━━━ 2x128x55x55xF32
┣┻┻ 63 Conv fire3/squeeze1x1_1
┃ ━━━ 2x16x55x55xF32
┣ 64 ScalarMax fire3/squeeze1x1_2
┣┓
┃┣┻┻ 65 Conv fire3/expand1x1_1
┃┃ ━━━ 2x64x55x55xF32
┃┣ 66 ScalarMax fire3/expand1x1_2
┗┓
┃┣┻┻ 67 Conv fire3/expand3x3_1
┃┃ ━━━ 2x64x55x55xF32
┃┣ 68 ScalarMax fire3/expand3x3_2
┣┻ 69 InferenceConcat fire3/concat_1
┃ ━━━ 2x128x55x55xF32
┣ 70 MaxPool pool3_1
┃ ━━━ 2x128x27x27xF32
┃ * Data format: NCHW
┃ * Kernel shape:[3, 3] (strides:Some([2, 2]), padding:Explicit([0, 0], [0, 0]), dilations:None)
┣┻┻ 71 Conv fire4/squeeze1x1_1
┃ ━━━ 2x32x27x27xF32
┣ 72 ScalarMax fire4/squeeze1x1_2
┣┓
┃┣┻┻ 73 Conv fire4/expand1x1_1
┃┃ ━━━ 2x128x27x27xF32
┃┣ 74 ScalarMax fire4/expand1x1_2
┗┓
┃┣┻┻ 75 Conv fire4/expand3x3_1
┃┃ ━━━ 2x128x27x27xF32
┃┣ 76 ScalarMax fire4/expand3x3_2
┣┻ 77 InferenceConcat fire4/concat_1
┃ ━━━ 2x256x27x27xF32
┣┻┻ 78 Conv fire5/squeeze1x1_1
┃ ━━━ 2x32x27x27xF32
┣ 79 ScalarMax fire5/squeeze1x1_2
┣┓
┃┣┻┻ 80 Conv fire5/expand1x1_1
┃┃ ━━━ 2x128x27x27xF32
┃┣ 81 ScalarMax fire5/expand1x1_2
┗┓
┃┣┻┻ 82 Conv fire5/expand3x3_1
┃┃ ━━━ 2x128x27x27xF32
┃┣ 83 ScalarMax fire5/expand3x3_2
┣┻ 84 InferenceConcat fire5/concat_1
┃ ━━━ 2x256x27x27xF32
┣ 85 MaxPool pool5_1
┃ ━━━ 2x256x13x13xF32
┃ * Data format: NCHW
┃ * Kernel shape:[3, 3] (strides:Some([2, 2]), padding:Explicit([0, 0], [0, 0]), dilations:None)
┣┻┻ 86 Conv fire6/squeeze1x1_1
┃ ━━━ 2x48x13x13xF32
┣ 87 ScalarMax fire6/squeeze1x1_2
┣┓
┃┣┻┻ 88 Conv fire6/expand1x1_1
┃┃ ━━━ 2x192x13x13xF32
┃┣ 89 ScalarMax fire6/expand1x1_2
┗┓
┃┣┻┻ 90 Conv fire6/expand3x3_1
┃┃ ━━━ 2x192x13x13xF32
┃┣ 91 ScalarMax fire6/expand3x3_2
┣┻ 92 InferenceConcat fire6/concat_1
┃ ━━━ 2x384x13x13xF32
┣┻┻ 93 Conv fire7/squeeze1x1_1
┃ ━━━ 2x48x13x13xF32
┣ 94 ScalarMax fire7/squeeze1x1_2
┣┓
┃┣┻┻ 95 Conv fire7/expand1x1_1
┃┃ ━━━ 2x192x13x13xF32
┃┣ 96 ScalarMax fire7/expand1x1_2
┗┓
┃┣┻┻ 97 Conv fire7/expand3x3_1
┃┃ ━━━ 2x192x13x13xF32
┃┣ 98 ScalarMax fire7/expand3x3_2
┣┻ 99 InferenceConcat fire7/concat_1
┃ ━━━ 2x384x13x13xF32
┣┻┻ 100 Conv fire8/squeeze1x1_1
┃ ━━━ 2x64x13x13xF32
┣ 101 ScalarMax fire8/squeeze1x1_2
┣┓
┃┣┻┻ 102 Conv fire8/expand1x1_1
┃┃ ━━━ 2x256x13x13xF32
┃┣ 103 ScalarMax fire8/expand1x1_2
┗┓
┃┣┻┻ 104 Conv fire8/expand3x3_1
┃┃ ━━━ 2x256x13x13xF32
┃┣ 105 ScalarMax fire8/expand3x3_2
┣┻ 106 InferenceConcat fire8/concat_1
┃ ━━━ 2x512x13x13xF32
┣┻┻ 107 Conv fire9/squeeze1x1_1
┃ ━━━ 2x64x13x13xF32
┣ 108 ScalarMax fire9/squeeze1x1_2
┣┓
┃┣┻┻ 109 Conv fire9/expand1x1_1
┃┃ ━━━ 2x256x13x13xF32
┃┣ 110 ScalarMax fire9/expand1x1_2
┗┓
┃┣┻┻ 111 Conv fire9/expand3x3_1
┃┃ ━━━ 2x256x13x13xF32
┃┣ 112 ScalarMax fire9/expand3x3_2
┣┻ 113 InferenceConcat fire9/concat_1
┃ ━━━ 2x512x13x13xF32
┣┳ 114 Dropout fire9/concat_2
┃ ━━━ 2x512x13x13xF32
┃ ━━━ 2x512x13x13xBool
┣┻┻ 115 Conv conv10_1
┃ ━━━ 2x1000x13x13xF32
┣ 116 ScalarMax conv10_2
┣ 117 GlobalAvgPool pool10_1
┃ ━━━ 2x1000x1x1xF32
┣ 118 LayerSoftmax softmaxout_1
━━━ 1x1000x1x1xF32
* axis: 1
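
The spatial sizes in the dump above (224 → 111 at conv1, 111 → 55, 55 → 27, 27 → 13 at the three MaxPool layers, each with kernel [3, 3], stride [2, 2], explicit zero padding) follow the standard pooling output formula. A minimal sketch, independent of tract's actual implementation, reproducing those numbers:

```python
def pool_out(size, kernel=3, stride=2, pad=0):
    # Standard output-size formula for a pooling/conv window:
    # floor((size + 2*pad - kernel) / stride) + 1
    return (size + 2 * pad - kernel) // stride + 1

# Spatial sizes reported in the dump for the three MaxPool nodes:
print(pool_out(111))  # pool1: 111 -> 55
print(pool_out(55))   # pool3: 55 -> 27
print(pool_out(27))   # pool5: 27 -> 13
```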
[2020-05-29T16:32:46.080169044Z ERROR tract] Failed analyse for node #118 "softmaxout_1" LayerSoftmax
Error: Failed analyse for node #118 "softmaxout_1" LayerSoftmax
Caused by: Infering facts
Caused by: Applying rule outputs[0].shape == inputs[0].shape: Unifying shapes 1x1000x1x1 and 2x1000x1x1, Impossible to unify Val(1) with Val(2).
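The failure is a shape-unification conflict: the model's declared output shape has batch dimension 1 (`1x1000x1x1`), while the CLI was given a batch-2 input (`-i 2x3x224x224xf32`), so the rule `outputs[0].shape == inputs[0].shape` cannot unify `1` with `2` on the batch axis. A minimal sketch of that kind of dimension unification (not tract's actual code; `None` here stands for an unknown dimension, an assumption of this illustration):

```python
def unify_dim(a, b):
    # An unknown dimension (None) unifies with anything;
    # two concrete dimensions must be equal.
    if a is None:
        return b
    if b is None or a == b:
        return a
    raise ValueError(f"Impossible to unify Val({a}) with Val({b})")

def unify_shape(s1, s2):
    # Shapes unify element-wise; rank must already agree.
    if len(s1) != len(s2):
        raise ValueError(f"Rank mismatch: {s1} vs {s2}")
    return [unify_dim(a, b) for a, b in zip(s1, s2)]

# An unknown batch dim would unify fine with batch 2:
print(unify_shape([None, 1000, 1, 1], [2, 1000, 1, 1]))  # [2, 1000, 1, 1]

# But the fixed batch 1 in the model's output fact conflicts with batch 2,
# mirroring the error in the log:
try:
    unify_shape([1, 1000, 1, 1], [2, 1000, 1, 1])
except ValueError as e:
    print(e)  # Impossible to unify Val(1) with Val(2)
```

Re-running with a batch-1 input (`-i 1x3x224x224xf32`) would avoid the conflict, since every concrete dimension then matches the model's declared facts.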