@Bartvelp
Created May 31, 2020 11:50
2020-05-31 13:49:11.322663: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
I0531 13:49:11.325103 46912496414336 tf_logging.py:115] Using config: {'_model_dir': 'seq2species_new_weights_small_short', '_tf_random_seed': None, '_save_summary_steps': 100, '_save_checkpoints_steps': 100000, '_save_checkpoints_secs': None, '_session_config': None, '_keep_checkpoint_max': 1000, '_keep_checkpoint_every_n_hours': 10000, '_log_step_count_steps': 100, '_train_distribute': None, '_device_fn': None, '_service': None, '_cluster_spec': <tensorflow.python.training.server_lib.ClusterSpec object at 0x2aaac308cda0>, '_task_type': 'worker', '_task_id': 0, '_global_id_in_cluster': 0, '_master': '', '_evaluation_master': '', '_is_chief': True, '_num_ps_replicas': 0, '_num_worker_replicas': 1}
W0531 13:49:11.326227 46912496414336 tf_logging.py:120] 'cpuinfo' not imported. CPU info will not be logged.
W0531 13:49:11.326476 46912496414336 tf_logging.py:120] 'psutil' not imported. Memory info will not be logged.
I0531 13:49:11.326534 46912496414336 tf_logging.py:115] Benchmark run: {'model_name': 'model', 'dataset': {'name': 'dataset_name'}, 'machine_config': {'gpu_info': {'count': 0}}, 'run_date': '2020-05-31T11:49:11.325586Z', 'tensorflow_version': {'version': '1.9.0', 'git_hash': 'v1.9.0-0-g25c197e023'}, 'tensorflow_environment_variables': [], 'run_parameters': [{'name': 'batch_size', 'long_value': 32}, {'name': 'train_epochs', 'long_value': 1}]}
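For reference, the configuration logged above corresponds to a plain TensorFlow 1.x tf.estimator.RunConfig. The sketch below is illustrative only: the values are read off the log line and the snippet is not taken from the DeepMicrobes source.

import tensorflow as tf  # TensorFlow 1.x

# Minimal RunConfig matching the logged settings (illustrative, not DeepMicrobes code).
run_config = tf.estimator.RunConfig(
    model_dir='seq2species_new_weights_small_short',
    save_summary_steps=100,
    save_checkpoints_steps=100000,
    keep_checkpoint_max=1000,
    keep_checkpoint_every_n_hours=10000,
    log_step_count_steps=100,
)
# An Estimator would then be built as
# tf.estimator.Estimator(model_fn=model_fn, config=run_config),
# with model_fn being the project's model function.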
I0531 13:49:11.378488 46912496414336 tf_logging.py:115] Calling model_fn.
I0531 13:49:11.802272 46912496414336 tf_logging.py:115] Done calling model_fn.
I0531 13:49:11.803105 46912496414336 tf_logging.py:115] Create CheckpointSaverHook.
I0531 13:49:11.998456 46912496414336 tf_logging.py:115] Graph was finalized.
I0531 13:49:12.000111 46912496414336 tf_logging.py:115] Restoring parameters from seq2species_new_weights_small_short/model.ckpt-0
I0531 13:49:13.769863 46912496414336 tf_logging.py:115] Running local_init_op.
I0531 13:49:13.777753 46912496414336 tf_logging.py:115] Done running local_init_op.
I0531 13:49:14.134858 46912496414336 tf_logging.py:115] Saving checkpoints for 0 into seq2species_new_weights_small_short/model.ckpt.
Traceback (most recent call last):
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1322, in _do_call
return fn(*args)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1307, in _run_fn
options, feed_dict, fetch_list, target_list, run_metadata)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1409, in _call_tf_sessionrun
run_metadata)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Tried to explicitly squeeze dimension 1 but dimension was not 1: 0
[[Node: sparse_softmax_cross_entropy_loss/remove_squeezable_dimensions/Squeeze = Squeeze[T=DT_INT64, squeeze_dims=[-1], _device="/job:localhost/replica:0/task:0/device:CPU:0"](IteratorGetNext:1)]]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/WUR/grosm002/DeepMicrobes/DeepMicrobes.py", line 365, in <module>
absl_app.run(main)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/absl/app.py", line 278, in run
_run_main(main, args)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/absl/app.py", line 239, in _run_main
sys.exit(main(argv))
File "/home/WUR/grosm002/DeepMicrobes/DeepMicrobes.py", line 357, in main
train(flags.FLAGS, model_fn, 'dataset_name')
File "/home/WUR/grosm002/DeepMicrobes/DeepMicrobes.py", line 214, in train
classifier.train(input_fn=input_fn_train, hooks=train_hooks)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 366, in train
loss = self._train_model(input_fn, hooks, saving_listeners)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 1119, in _train_model
return self._train_model_default(input_fn, hooks, saving_listeners)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 1135, in _train_model_default
saving_listeners)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 1336, in _train_with_estimator_spec
_, loss = mon_sess.run([estimator_spec.train_op, estimator_spec.loss])
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 577, in run
run_metadata=run_metadata)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1053, in run
run_metadata=run_metadata)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1144, in run
raise six.reraise(*original_exc_info)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/six.py", line 703, in reraise
raise value
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1129, in run
return self._sess.run(*args, **kwargs)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1201, in run
run_metadata=run_metadata)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 981, in run
return self._sess.run(*args, **kwargs)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 900, in run
run_metadata_ptr)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1135, in _run
feed_dict_tensor, options, run_metadata)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1316, in _do_run
run_metadata)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1335, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Tried to explicitly squeeze dimension 1 but dimension was not 1: 0
[[Node: sparse_softmax_cross_entropy_loss/remove_squeezable_dimensions/Squeeze = Squeeze[T=DT_INT64, squeeze_dims=[-1], _device="/job:localhost/replica:0/task:0/device:CPU:0"](IteratorGetNext:1)]]
Caused by op 'sparse_softmax_cross_entropy_loss/remove_squeezable_dimensions/Squeeze', defined at:
File "/home/WUR/grosm002/DeepMicrobes/DeepMicrobes.py", line 365, in <module>
absl_app.run(main)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/absl/app.py", line 278, in run
_run_main(main, args)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/absl/app.py", line 239, in _run_main
sys.exit(main(argv))
File "/home/WUR/grosm002/DeepMicrobes/DeepMicrobes.py", line 357, in main
train(flags.FLAGS, model_fn, 'dataset_name')
File "/home/WUR/grosm002/DeepMicrobes/DeepMicrobes.py", line 214, in train
classifier.train(input_fn=input_fn_train, hooks=train_hooks)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 366, in train
loss = self._train_model(input_fn, hooks, saving_listeners)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 1119, in _train_model
return self._train_model_default(input_fn, hooks, saving_listeners)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 1132, in _train_model_default
features, labels, model_fn_lib.ModeKeys.TRAIN, self.config)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py", line 1107, in _call_model_fn
model_fn_results = self._model_fn(features=features, **kwargs)
File "/home/WUR/grosm002/DeepMicrobes/DeepMicrobes.py", line 118, in model_fn
logits=logits, labels=labels)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/ops/losses/losses_impl.py", line 856, in sparse_softmax_cross_entropy
labels, logits, weights, expected_rank_diff=1)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/ops/losses/losses_impl.py", line 785, in _remove_squeezable_dimensions
labels, predictions, expected_rank_diff=expected_rank_diff)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/ops/confusion_matrix.py", line 72, in remove_squeezable_dimensions
labels = array_ops.squeeze(labels, [-1])
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 432, in new_func
return func(*args, **kwargs)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/ops/array_ops.py", line 2556, in squeeze
return gen_array_ops.squeeze(input, axis, name)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/ops/gen_array_ops.py", line 7946, in squeeze
"Squeeze", input=input, squeeze_dims=axis, name=name)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 787, in _apply_op_helper
op_def=op_def)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3414, in create_op
op_def=op_def)
File "/home/WUR/grosm002/miniconda2/envs/DeepMicrobes/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1740, in __init__
self._traceback = self._graph._extract_stack() # pylint: disable=protected-access
InvalidArgumentError (see above for traceback): Tried to explicitly squeeze dimension 1 but dimension was not 1: 0
[[Node: sparse_softmax_cross_entropy_loss/remove_squeezable_dimensions/Squeeze = Squeeze[T=DT_INT64, squeeze_dims=[-1], _device="/job:localhost/replica:0/task:0/device:CPU:0"](IteratorGetNext:1)]]
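The failure happens inside tf.losses.sparse_softmax_cross_entropy: the labels tensor coming out of IteratorGetNext apparently has the same rank as the logits, so the loss tries to squeeze the last label dimension, and at run time that dimension has size 0 ("dimension was not 1: 0"). In other words, the batch reached the loss with an empty label field. One plausible cause (an assumption, not something this log confirms) is that the training TFRecords carry no labels, for example because they were converted with a prediction-mode, label-free encoder or with mismatched label settings; regenerating the TFRecords with labels included would be the first thing to check.

A minimal sketch that reproduces the same InvalidArgumentError in TensorFlow 1.x, assuming labels arrive with a dynamic last dimension that is empty at run time (hypothetical shapes, not DeepMicrobes code):

import numpy as np
import tensorflow as tf  # TensorFlow 1.x graph mode

# Labels with an unknown last dimension, standing in for IteratorGetNext:1.
labels = tf.placeholder(tf.int64, shape=[None, None])
logits = tf.zeros([4, 10])  # [batch, num_classes]

# Because labels and logits have the same rank, the loss internally squeezes
# labels' last dimension; the squeeze fails at run time when that dimension
# is 0 instead of 1.
loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)

with tf.Session() as sess:
    # Feeding an empty label dimension triggers:
    # InvalidArgumentError: Tried to explicitly squeeze dimension 1 but dimension was not 1: 0
    sess.run(loss, feed_dict={labels: np.zeros((4, 0), dtype=np.int64)})

Reshaping labels to rank 1 before the loss (labels = tf.reshape(labels, [-1])) is a common generic fix for label-rank mismatches, but since the squeezed dimension here is 0 rather than 1, the shape problem points at the input pipeline and the data it parses rather than at the loss call itself.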