Change the executable subl (`$(which subl)`) so that sublime_text doesn't pick up dynamically linked libs from anaconda, e.g. set `LD_LIBRARY_PATH=/usr/local/lib`.
Found 1 possible inputs: (name=sig_input, type=float(1), shape=[?,2])
No variables spotted.
Found 2 possible outputs: (name=instr_sig, op=Transpose) (name=voice_sig, op=Transpose)
Found 19652425 (19.65M) const parameters, 0 (0) variable parameters, and 6 control_edges
Op types used: 716 Const, 115 StridedSlice, 74 Mul, 66 Add, 62 Pack, 60 Transpose, 29 ConcatV2, 26 Reshape, 24 Shape,
23 Dequantize, 18 FloorDiv, 14 Conv2D, 14 Maximum, 12 Relu, 12 Conv2DBackpropInput, 9 Sub, 8 FloorMod, 8 Merge, 8 NextIteration,
8 Enter, 8 Range, 8 Switch, 4 GatherV2, 4 Greater, 4 LoopCond, 4 Exit, 4 SplitV, 2 SpaceToBatchND, 2 Sigmoid,
2 UnsortedSegmentSum, 2 RealDiv, 2 Neg, 2 IRFFT, 2 Cast, 2 BatchToSpaceND, 1 RFFT, 1 Placeholder, 1 PadV2, 1 Fill, 1 ComplexAbs
bazel-bin/tensorflow/tools/graph_transforms/summarize_graph --in_graph=in_graph.pb
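A rough Python equivalent of part of this summary (placeholder inputs and op-type counts) can be scripted against the GraphDef directly; a minimal sketch, assuming TensorFlow 1.x and the same in_graph.pb:

import collections
import tensorflow as tf  # TF 1.x

def summarize(path="in_graph.pb"):
    # Parse the frozen GraphDef from disk.
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(path, "rb") as f:
        graph_def.ParseFromString(f.read())
    # Placeholders are the graph's feedable inputs.
    placeholders = [n.name for n in graph_def.node if n.op == "Placeholder"]
    print("possible inputs:", placeholders)
    # Count op types, roughly matching the "Op types used" line above.
    for op, count in collections.Counter(n.op for n in graph_def.node).most_common():
        print("%d %s" % (count, op))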
allprojects {
    repositories {
        jcenter()
    }
}

dependencies {
    ...
    compile 'org.tensorflow:tensorflow-android:+'
    ...
}
bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
  --in_graph=in_graph.pb \
  --out_graph=out_graph.pb \
  --inputs='input1,input2' \
  --outputs='output1,output2' \
  --transforms='
    strip_unused_nodes
    remove_nodes(op=Identity, op=CheckNumerics)
    fold_constants(ignore_errors=true)
    remove_attribute(attribute_name=_class)'
from tensorflow.tools.graph_transforms import TransformGraph

def transform_graph(graph_def, input_node_names, output_node_names):
    transforms = [
        "strip_unused_nodes",
        "remove_nodes(op=Identity, op=CheckNumerics)",
        "fold_constants(ignore_errors=true)",
        # remove colocation attributes
        "remove_attribute(attribute_name=_class)",
        "fold_batch_norms",
    ]
    return TransformGraph(graph_def, input_node_names,
                          output_node_names, transforms)
bazel build tensorflow/python/tools:freeze_graph
bazel-bin/tensorflow/python/tools/freeze_graph \
  --input_graph=/tmp/model/my_graph.pb \
  --input_checkpoint=/tmp/model/model.ckpt-1000 \
  --output_graph=/tmp/frozen_graph.pb \
  --output_node_names=output_node
import tensorflow as tf
from tensorflow.python.framework import graph_util

# Suppose you have obtained, one way or another, a graph object, and suppose
# you have a list of the output node names (collected manually after inspection
# with TensorBoard, for example). Then one way to build a frozen graph is the following:
with tf.Session(graph=graph) as sess:
    graph_def = graph.as_graph_def()
    frozen_graph_def = graph_util.convert_variables_to_constants(
        sess, graph_def, output_node_names)
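To actually get the .pb on disk, the frozen GraphDef can then be serialized; the path below just reuses the one from the freeze_graph example above:

    # Write the frozen GraphDef to disk (illustrative path).
    with tf.gfile.GFile("/tmp/frozen_graph.pb", "wb") as f:
        f.write(frozen_graph_def.SerializeToString())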
bazel build tensorflow/tools/graph_transforms:summarize_graph
# While you are at it, you can also build other very helpful utilities that you may need:
bazel build tensorflow/python/tools:freeze_graph
bazel build tensorflow/tools/graph_transforms:summarize_graph
bazel build -c opt tensorflow/tools/benchmark:benchmark_model
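benchmark_model reports proper stats, but a quick timing check can also be done from Python; a minimal sketch, assuming TF 1.x, the node names reported by summarize_graph above, and an arbitrary input length of 16384 frames:

import time
import numpy as np
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile("in_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

with tf.Session(graph=graph) as sess:
    # sig_input has shape [?, 2]; 16384 frames is an arbitrary choice.
    signal = np.zeros((16384, 2), dtype=np.float32)
    start = time.time()
    sess.run(["instr_sig:0", "voice_sig:0"], feed_dict={"sig_input:0": signal})
    print("one run: %.3f s" % (time.time() - start))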
alreadytaikeune / logs_new_word2vec_loss (created July 23, 2018):
2018-07-23 10:34:49,659 : MainThread : INFO : running /usr/local/lib/python2.7/site-packages/gensim-3.5.0-py2.7-linux-x86_64.egg/gensim/scripts/word2vec_standalone.py -train data/text9 -output /tmp/test -window 5 -negative 5 -threads 4 -min_count 5 -iter 5 -cbow 0 -loss
2018-07-23 10:34:49,661 : MainThread : INFO : collecting all words and their counts
2018-07-23 10:35:01,095 : MainThread : INFO : PROGRESS: at sentence #0, processed 0 words, keeping 0 word types
2018-07-23 10:35:18,754 : MainThread : INFO : PROGRESS: at sentence #10000, processed 100000000 words, keeping 694463 word types
2018-07-23 10:35:29,348 : MainThread : INFO : collected 833184 word types from a corpus of 124301826 raw words and 12431 sentences
2018-07-23 10:35:29,349 : MainThread : INFO : Loading a fresh vocabulary
2018-07-23 10:35:30,889 : MainThread : INFO : effective_min_count=5 retains 218316 unique words (26% of original 833184, drops 614868)
2018-07-23 10:35:30,889 : MainThread : INFO : effective_min_count=5 leaves 123353509 word