@vanbasten23
Created December 19, 2022 18:21
root@t1v-n-621e873b-w-0:/workspaces/work# python3 pytorch/xla/test/test_dynamic_shape_backward_models.py
pred.size()= torch.Size([<=80, 1])
xw32, file=torch_xla/csrc/tensor.cpp, line=665function=eq:
xw32, file=torch_xla/csrc/tensor.cpp, line=757function=bool_:
xw32, file=/workspaces/work/pytorch/c10/core/SymInt.cpp, line=99function=operator==: is_symbolic()=0, sci.is_symbolic()=0
xw32, file=/workspaces/work/pytorch/c10/core/SymInt.cpp, line=99function=operator==: is_symbolic()=0, sci.is_symbolic()=0
xw32, file=torch_xla/csrc/ops/dynamic_ir.cpp, line=113function=getDynamicValue: dim_node_0->getDynamicValue()=79, dim_node_1->getDynamicValue()=79
pred.size()= torch.Size([<=80, 1])
xw32, file=/workspaces/work/pytorch/torch/csrc/autograd/input_metadata.h, line=102function=is_same_shape: typeid(gradSymSizes).name()=N3c108ArrayRefINS_6SymIntEEE, typeid(shapeAsDimVector).name()=N3c108ArrayRefINS_6SymIntEEE
xw32, file=/workspaces/work/pytorch/torch/csrc/autograd/input_metadata.h, line=102function=is_same_shape: typeid(gradSymSizes).name()=N3c108ArrayRefINS_6SymIntEEE, typeid(shapeAsDimVector).name()=N3c108ArrayRefINS_6SymIntEEE
xw32, file=/workspaces/work/pytorch/c10/core/SymInt.cpp, line=99function=operator==: is_symbolic()=1, sci.is_symbolic()=1
xw32, file=torch_xla/csrc/tensor.cpp, line=665function=eq:
xw32, file=torch_xla/csrc/tensor.cpp, line=757function=bool_:
2022-12-19 17:53:35.980394: F ./tensorflow/compiler/xla/stream_executor/stream.h:1188] Check failed: gpu_src.size() == 0 || host_size >= gpu_src.size()
https://symbolize.stripped_domain/r/?trace=7ff55c4768eb,7ff55c7a072f,7ff3771c7921,7ff37d999ac3,7ff382d7eea8,7ff382d7f3b3,7ff37d99ac94,7ff3797e55cd,7ff3797e61fc,7ff3797e4b8e,7ff3797e6e8e,7ff3797e832e,7ff37eedeeb1,7ff3835b47c0,7ff3835b6d27,7ff383cc2015,7ff383cbff12,7ff383caa414,7ff55c795fa2&map=f64270d911f2751a0e4b23805c049d9fd9768a08:7ff374631000-7ff38750c2d0
*** SIGABRT received by PID 3056271 (TID 3057434) on cpu 78 from PID 3056271; stack trace: ***
PC: @ 0x7ff55c4768eb (unknown) raise
@ 0x7ff37397763c 1136 (unknown)
@ 0x7ff55c7a0730 3888 (unknown)
@ 0x7ff3771c7922 1904 tensorflow::(anonymous namespace)::UpdateDynamicInputs()::{lambda()#1}::operator()()
@ 0x7ff37d999ac4 32 std::_Function_handler<>::_M_invoke()
@ 0x7ff382d7eea9 96 xla::(anonymous namespace)::ForEachSubshapeHelper()
@ 0x7ff382d7f3b4 80 xla::ShapeUtil::ForEachSubshapeWithStatus()
@ 0x7ff37d99ac95 5104 tensorflow::TPUExecute()
@ 0x7ff3797e55ce 2400 tensorflow::(anonymous namespace)::RunExecutable()
@ 0x7ff3797e61fd 256 std::_Function_handler<>::_M_invoke()
@ 0x7ff3797e4b8f 1520 tensorflow::(anonymous namespace)::ExecuteTPUProgram()
@ 0x7ff3797e6e8f 4272 tensorflow::(anonymous namespace)::XRTExecuteOp::DoWork()
@ 0x7ff3797e832f 64 tensorflow::(anonymous namespace)::XRTExecuteOp::ComputeAsync()
@ 0x7ff37eedeeb2 464 tensorflow::XlaDevice::ComputeAsync()
@ 0x7ff3835b47c1 2992 tensorflow::(anonymous namespace)::ExecutorState<>::Process()
@ 0x7ff3835b6d28 48 std::_Function_handler<>::_M_invoke()
@ 0x7ff383cc2016 160 Eigen::ThreadPoolTempl<>::WorkerLoop()
@ 0x7ff383cbff13 64 std::_Function_handler<>::_M_invoke()
@ 0x7ff383caa415 80 tsl::(anonymous namespace)::PThread::ThreadFn()
@ 0x7ff55c795fa3 (unknown) start_thread
https://symbolize.stripped_domain/r/?trace=7ff55c4768eb,7ff37397763b,7ff55c7a072f,7ff3771c7921,7ff37d999ac3,7ff382d7eea8,7ff382d7f3b3,7ff37d99ac94,7ff3797e55cd,7ff3797e61fc,7ff3797e4b8e,7ff3797e6e8e,7ff3797e832e,7ff37eedeeb1,7ff3835b47c0,7ff3835b6d27,7ff383cc2015,7ff383cbff12,7ff383caa414,7ff55c795fa2&map=f64270d911f2751a0e4b23805c049d9fd9768a08:7ff374631000-7ff38750c2d0,b023a102200526e4af0d08cd19f17287:7ff3639c5000-7ff373c3a920
E1219 17:53:36.342857 3057434 coredump_hook.cc:395] RAW: Remote crash data gathering hook invoked.
E1219 17:53:36.342882 3057434 client.cc:243] RAW: Coroner client retries enabled (b/136286901), will retry for up to 30 sec.
E1219 17:53:36.342887 3057434 coredump_hook.cc:502] RAW: Sending fingerprint to remote end.
E1219 17:53:36.342897 3057434 coredump_socket.cc:120] RAW: Stat failed errno=2 on socket /var/google/services/logmanagerd/remote_coredump.socket
E1219 17:53:36.342919 3057434 coredump_hook.cc:506] RAW: Cannot send fingerprint to Coroner: [NOT_FOUND] Missing crash reporting socket. Is the listener running?
E1219 17:53:36.342923 3057434 coredump_hook.cc:577] RAW: Dumping core locally.
E1219 17:54:12.864488 3057434 process_state.cc:775] RAW: Raising signal 6 with default behavior
Aborted (core dumped)
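
For context, a minimal sketch of the kind of program that triggers this failure mode. This is not the actual pytorch/xla/test/test_dynamic_shape_backward_models.py (that file lives in the pytorch/xla repo and is not shown here); it only assumes that bounded dynamic shapes are enabled (e.g. via XLA_EXPERIMENTAL="nonzero:masked_select") so that torch.nonzero produces a tensor whose size prints as torch.Size([<=80, 1]), and then runs a backward pass over it, which is roughly where the log above aborts during device execution.

# Hypothetical repro sketch, not the original test file.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()

# torch.nonzero on an XLA tensor yields a bounded dynamic first dimension,
# e.g. torch.Size([<=80, 1]) as seen in the log above.
x = torch.rand(80, device=device)
pred = torch.nonzero(x > 0.5).float()
print("pred.size()=", pred.size())

# A toy linear layer and a backward pass over the dynamically sized tensor;
# the Check failed / SIGABRT above fires when the compiled graph is executed
# on the device.
w = torch.rand(1, 1, device=device, requires_grad=True)
loss = (pred @ w).sum()
loss.backward()
xm.mark_step()
print(w.grad)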