@PhilipVinc
Created November 25, 2022 00:11
filippovicentini in mpi4jax at cqslpc1 on  mlir [$] via python-3.10.6 via 🐍 3.10.6 took 2s
➜ JAX_PLATFORMS="cpu" pytest --tb=short
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.10.6, pytest-7.2.0, pluggy-1.0.0
MPI vendor: ('Open MPI', (4, 1, 2))
MPI rank: 0
MPI size: 1
rootdir: /home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax, configfile: pyproject.toml, testpaths: tests
plugins: cov-4.0.0
collected 121 items
tests/test_decorators.py .. [ 1%]
tests/test_examples.py . [ 2%]
tests/test_flush.py . [ 3%]
tests/test_has_cuda.py . [ 4%]
tests/test_jax_compat.py ....... [ 9%]
tests/test_validation.py .... [ 13%]
tests/collective_ops/test_allgather.py .... [ 16%]
tests/collective_ops/test_allreduce.py ............... [ 28%]
tests/collective_ops/test_allreduce_matvec.py ........................ [ 48%]
tests/collective_ops/test_alltoall.py ... [ 51%]
tests/collective_ops/test_barrier.py . [ 52%]
tests/collective_ops/test_bcast.py .... [ 55%]
tests/collective_ops/test_common.py FF.. [ 58%]
tests/collective_ops/test_gather.py .... [ 61%]
tests/collective_ops/test_reduce.py .... [ 65%]
tests/collective_ops/test_scan.py .... [ 68%]
tests/collective_ops/test_scatter.py ... [ 71%]
tests/collective_ops/test_send_and_recv.py sssssss [ 76%]
tests/collective_ops/test_sendrecv.py sssssssssss [ 85%]
tests/experimental/test_auto_tokenize.py ..ss..........sss [100%]
================================================================================= FAILURES =================================================================================
___________________________________________________________________________ test_abort_on_error ____________________________________________________________________________
tests/collective_ops/test_common.py:88: in test_abort_on_error
assert "r0 | MPI_Send returned error code" in proc.stderr
E assert 'r0 | MPI_Send returned error code' in '/home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUD...0))\njaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_send"\n'
E + where '/home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUD...0))\njaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_send"\n' = CompletedProcess(args=['/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/bin/python', PosixPath('/tmp...))\njaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_send"\n').stderr
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
/home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUDA-enabled MPI. If you are sure that your MPI library is built with CUDA support, set MPI4JAX_USE_CUDA_MPI=1. To silence this warning, set MPI4JAX_USE_CUDA_MPI=0.
warnings.warn(warn_msg)
Traceback (most recent call last):
File "/tmp/pytest-of-filippovicentini/pytest-9/test_abort_on_error0/abort.py", line 23, in <module>
send_jit(jnp.ones(10))
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/traceback_util.py", line 162, in reraise_with_filtered_traceback
return fun(*args, **kwargs)
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/api.py", line 622, in cache_miss
execute = dispatch._xla_call_impl_lazy(fun_, *tracers, **params)
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 236, in _xla_call_impl_lazy
return xla_callable(fun, device, backend, name, donated_invars, keep_unused,
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/linear_util.py", line 303, in memoized_fun
ans = call(fun, *args)
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 360, in _xla_callable_uncached
keep_unused, *arg_specs).compile().unsafe_call
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 996, in compile
self._executable = XlaCompiledComputation.from_xla_computation(
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1194, in from_xla_computation
compiled = compile_or_get_cached(backend, xla_computation, options,
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1077, in compile_or_get_cached
return backend_compile(backend, serialized_computation, compile_options,
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/profiler.py", line 314, in wrapper
return func(*args, **kwargs)
File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1012, in backend_compile
return backend.compile(built_c, compile_options=options)
jax._src.traceback_util.UnfilteredStackTrace: jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_send"
The stack trace below excludes JAX-internal frames.
The preceding is the original exception that occurred, unmodified.
--------------------
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/tmp/pytest-of-filippovicentini/pytest-9/test_abort_on_error0/abort.py", line 23, in <module>
send_jit(jnp.ones(10))
jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_send"
__________________________________________________________________________ test_deadlock_on_exit ___________________________________________________________________________
tests/collective_ops/test_common.py:115: in test_deadlock_on_exit
assert proc.returncode == 0, proc.stderr
E AssertionError: /home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUDA-enabled MPI. If you are sure that your MPI library is built with CUDA support, set MPI4JAX_USE_CUDA_MPI=1. To silence this warning, set MPI4JAX_USE_CUDA_MPI=0.
E warnings.warn(warn_msg)
E Traceback (most recent call last):
E File "/tmp/pytest-of-filippovicentini/pytest-9/test_deadlock_on_exit0/deadlock_on_exit.py", line 19, in <module>
E jax.jit(
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/traceback_util.py", line 162, in reraise_with_filtered_traceback
E return fun(*args, **kwargs)
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/api.py", line 622, in cache_miss
E execute = dispatch._xla_call_impl_lazy(fun_, *tracers, **params)
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 236, in _xla_call_impl_lazy
E return xla_callable(fun, device, backend, name, donated_invars, keep_unused,
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/linear_util.py", line 303, in memoized_fun
E ans = call(fun, *args)
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 360, in _xla_callable_uncached
E keep_unused, *arg_specs).compile().unsafe_call
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 996, in compile
E self._executable = XlaCompiledComputation.from_xla_computation(
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1194, in from_xla_computation
E compiled = compile_or_get_cached(backend, xla_computation, options,
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1077, in compile_or_get_cached
E return backend_compile(backend, serialized_computation, compile_options,
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/profiler.py", line 314, in wrapper
E return func(*args, **kwargs)
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1012, in backend_compile
E return backend.compile(built_c, compile_options=options)
E jax._src.traceback_util.UnfilteredStackTrace: jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_sendrecv"
E
E The stack trace below excludes JAX-internal frames.
E The preceding is the original exception that occurred, unmodified.
E
E --------------------
E
E The above exception was the direct cause of the following exception:
E
E Traceback (most recent call last):
E File "/tmp/pytest-of-filippovicentini/pytest-9/test_deadlock_on_exit0/deadlock_on_exit.py", line 19, in <module>
E jax.jit(
E jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_sendrecv"
E
E assert 1 == 0
E + where 1 = CompletedProcess(args=['/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/bin/python', PosixPath('/tmp...jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_sendrecv"\n').returncode
========================================================================= short test summary info ==========================================================================
FAILED tests/collective_ops/test_common.py::test_abort_on_error - assert 'r0 | MPI_Send returned error code' in '/home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUD......
FAILED tests/collective_ops/test_common.py::test_deadlock_on_exit - AssertionError: /home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUDA-enabled MPI. If you are sure tha...
================================================================ 2 failed, 96 passed, 23 skipped in 14.43s =================================================================
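Both failures share the same harness pattern: `test_common.py` runs a small script in a subprocess and then asserts on its captured `stderr` and return code (here the expected `"r0 | MPI_Send returned error code"` marker never appears, because on this branch the `mpi_send`/`mpi_sendrecv` XLA custom calls are unregistered and compilation fails first). A minimal, self-contained sketch of that subprocess-assertion pattern — the child script below is illustrative, not mpi4jax's actual `abort.py`:

```python
import subprocess
import sys

# Illustrative stand-in for the script the test launches; it writes the
# expected marker to stderr and exits non-zero, as an aborting rank would.
script = r"""
import sys
sys.stderr.write("r0 | MPI_Send returned error code 1\n")
sys.exit(1)
"""

# Run the child and capture its output, mirroring the test harness.
proc = subprocess.run(
    [sys.executable, "-c", script],
    capture_output=True,
    text=True,
)

# The real test asserts this substring is present in proc.stderr;
# when compilation fails early, stderr holds a traceback instead.
found = "MPI_Send returned error code" in proc.stderr
print(found)
print(proc.returncode)
```

When the custom call is registered and the send actually executes, the marker reaches `stderr` and the assertion passes; in the log above the subprocess dies in `backend.compile` before any MPI call runs, so only the `XlaRuntimeError` traceback is captured.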
filippovicentini in mpi4jax at cqslpc1 on  mlir [$] via python-3.10.6 via 🐍 3.10.6 took 16s
filippovicentini in mpi4jax at cqslpc1 on  mlir [$] via python-3.10.6 via 🐍 3.10.6 took 18s
➜ >....
E ans = call(fun, *args)
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 360, in _xla_callable_uncached
E keep_unused, *arg_specs).compile().unsafe_call
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 996, in compile
E self._executable = XlaCompiledComputation.from_xla_computation(
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1194, in from_xla_computation
E compiled = compile_or_get_cached(backend, xla_computation, options,
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1077, in compile_or_get_cached
E return backend_compile(backend, serialized_computation, compile_options,
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/profiler.py", line 314, in wrapper
E return func(*args, **kwargs)
E File "/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/lib/python3.10/site-packages/jax/_src/dispatch.py", line 1012, in backend_compile
E return backend.compile(built_c, compile_options=options)
E jax._src.traceback_util.UnfilteredStackTrace: jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_sendrecv"
E
E The stack trace below excludes JAX-internal frames.
E The preceding is the original exception that occurred, unmodified.
E
E --------------------
E
E The above exception was the direct cause of the following exception:
E
E Traceback (most recent call last):
E File "/tmp/pytest-of-filippovicentini/pytest-10/test_deadlock_on_exit0/deadlock_on_exit.py", line 19, in <module>
E jax.jit(
E jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_sendrecv"
E
E assert 1 == 0
E + where 1 = CompletedProcess(args=['/home/filippovicentini/Documents/pythonenvs/mpi4jax/python-3.10.6/bin/python', PosixPath('/tmp...jaxlib.xla_extension.XlaRuntimeError: UNIMPLEMENTED: No registered implementation for custom call to "mpi_sendrecv"\n').returncode
========================================================================= short test summary info ==========================================================================
FAILED tests/collective_ops/test_common.py::test_abort_on_error - assert 'r0 | MPI_Send returned error code' in '/home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUD......
FAILED tests/collective_ops/test_common.py::test_deadlock_on_exit - AssertionError: /home/filippovicentini/Dropbox/Ricerca/Codes/Python/mpi4jax/mpi4jax/_src/decorators.py:49: UserWarning: Not using CUDA-enabled MPI. If you are sure tha...
====================================================================== 2 failed, 119 passed in 15.49s ======================================================================
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[6922,1],0]
Exit code: 1
--------------------------------------------------------------------------