
@petewarden
Created July 24, 2017 19:44
bazel run tensorflow/tools/graph_transforms:transform_graph -- \
--in_graph=tensorflow_inception_graph.pb \
--out_graph=optimized_inception_graph.pb --inputs='Mul' \
--outputs='softmax' --transforms='strip_unused_nodes(type=float, shape="1,299,299,3") fold_constants(ignore_errors=true) fold_batch_norms fold_old_batch_norms'
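
After running the transform, a quick way to sanity-check the result is to load the optimized GraphDef and resolve the input and output tensors named above. This is a minimal sketch assuming a TF 1.x-style frozen graph (which is what the graph_transforms tooling produces); the file name matches the --out_graph argument above:

import tensorflow as tf

# Load the transformed GraphDef (TF 1.x-style frozen graph).
graph_def = tf.GraphDef()
with tf.gfile.GFile("optimized_inception_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Import it into a fresh graph and resolve the named input/output tensors.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
    input_tensor = graph.get_tensor_by_name("Mul:0")
    output_tensor = graph.get_tensor_by_name("softmax:0")
    print(input_tensor.shape, output_tensor.shape)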

psyhtest commented Aug 22, 2017

I had to specify absolute paths to *.pb files:

$ bazel run tensorflow/tools/graph_transforms:transform_graph -- \
  --in_graph=`pwd`/tensorflow_inception_graph.pb \
  --out_graph=`pwd`/optimized_inception_graph.pb \
  --inputs='Mul' \ 
  --outputs='softmax' \
  --transforms='strip_unused_nodes(type=float, shape="1,299,299,3") fold_constants(ignore_errors=true) fold_batch_norms fold_old_batch_norms'

and got an error:

Input node softmax not found in graph

I checked that I was using the same input file as the one in your repo.
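
One way to confirm what the output node is actually called is to dump the node names from the GraphDef before running transform_graph. A rough sketch, assuming a TF 1.x frozen graph and the same input file as above:

import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile("tensorflow_inception_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Print each node's name and op so the real output node
# (e.g. 'softmax' vs. 'output2') can be confirmed.
for node in graph_def.node:
    print(node.name, node.op)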

I replaced --outputs='softmax' with --outputs='output2':

$ bazel run tensorflow/tools/graph_transforms:transform_graph -- \
  --in_graph=`pwd`/tensorflow_inception_graph.pb \
  --out_graph=`pwd`/optimized_inception_graph.pb \
  --inputs='Mul' \ 
  --outputs='output2' \
  --transforms='strip_unused_nodes(type=float, shape="1,299,299,3") fold_constants(ignore_errors=true) fold_batch_norms fold_old_batch_norms'

but still got a warning:

fold_constants: Ignoring error FeedInputs: unable to find feed output Mul

Is this expected? In other words, is this what fold_constants(ignore_errors=true) is for?
