Created July 4, 2016 09:19
Rmpi install error at the test-load step
R CMD INSTALL /home/sagon/.local/easybuild/sources/r/R/extensions/Rmpi_0.6-6.tar.gz --configure-args="--with-Rmpi-include=/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenMPI/1.10.2/include --with-Rmpi-libpath=/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenMPI/1.10.2/lib --with-mpi=/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenMPI/1.10.2 --with-Rmpi-type=OPENMPI" --library=/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/R/3.3.1/lib64/R/library --no-clean-on-error
* installing *source* package ‘Rmpi’ ...
** package ‘Rmpi’ successfully unpacked and MD5 sums checked
checking for openpty in -lutil... no
checking for main in -lpthread... no
configure: creating ./config.status
config.status: creating src/Makevars
** libs
gcc -std=gnu99 -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/R/3.2.3/lib64/R/include -DNDEBUG -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -I/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenMPI/1.10.2/include -DMPI2 -DOPENMPI -I/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenBLAS/0.2.15-LAPACK-3.6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ScaLAPACK/2.0.2-OpenBLAS-0.2.15-LAPACK-3.6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/FFTW/3.3.4/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libreadline/6.3/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ncurses/6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libpng/1.6.21/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libjpeg-turbo/1.4.2/include -I/opt/ebsofts/Core/Java/1.8.0_72/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tcl/8.6.4/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tk/8.6.4-no-X11/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/cURL/7.47.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libxml2/2.9.3/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/GDAL/2.0.2/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/PROJ/4.9.2/include -fpic -O2 -march=native -c Rmpi.c -o Rmpi.o
gcc -std=gnu99 -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/R/3.2.3/lib64/R/include -DNDEBUG -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -I/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenMPI/1.10.2/include -DMPI2 -DOPENMPI -I/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenBLAS/0.2.15-LAPACK-3.6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ScaLAPACK/2.0.2-OpenBLAS-0.2.15-LAPACK-3.6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/FFTW/3.3.4/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libreadline/6.3/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ncurses/6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libpng/1.6.21/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libjpeg-turbo/1.4.2/include -I/opt/ebsofts/Core/Java/1.8.0_72/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tcl/8.6.4/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tk/8.6.4-no-X11/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/cURL/7.47.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libxml2/2.9.3/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/GDAL/2.0.2/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/PROJ/4.9.2/include -fpic -O2 -march=native -c conversion.c -o conversion.o
gcc -std=gnu99 -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/R/3.2.3/lib64/R/include -DNDEBUG -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -I/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenMPI/1.10.2/include -DMPI2 -DOPENMPI -I/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenBLAS/0.2.15-LAPACK-3.6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ScaLAPACK/2.0.2-OpenBLAS-0.2.15-LAPACK-3.6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/FFTW/3.3.4/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libreadline/6.3/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ncurses/6.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libpng/1.6.21/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libjpeg-turbo/1.4.2/include -I/opt/ebsofts/Core/Java/1.8.0_72/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tcl/8.6.4/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tk/8.6.4-no-X11/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/cURL/7.47.0/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libxml2/2.9.3/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/GDAL/2.0.2/include -I/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/PROJ/4.9.2/include -fpic -O2 -march=native -c internal.c -o internal.o
gcc -std=gnu99 -shared -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/R/3.2.3/lib64/R/lib -L/opt/ebsofts/Core/GCCcore/4.9.3/lib64 -L/opt/ebsofts/Core/GCCcore/4.9.3/lib -L/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenBLAS/0.2.15-LAPACK-3.6.0/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ScaLAPACK/2.0.2-OpenBLAS-0.2.15-LAPACK-3.6.0/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/FFTW/3.3.4/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libreadline/6.3/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/ncurses/6.0/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libpng/1.6.21/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libjpeg-turbo/1.4.2/lib -L/opt/ebsofts/Core/Java/1.8.0_72/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tcl/8.6.4/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/Tk/8.6.4-no-X11/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/cURL/7.47.0/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/libxml2/2.9.3/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/GDAL/2.0.2/lib -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/PROJ/4.9.2/lib -o Rmpi.so Rmpi.o conversion.o internal.o -L/opt/ebsofts/Compiler/GCC/4.9.3-2.25/OpenMPI/1.10.2/lib -lmpi -L/opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/R/3.2.3/lib64/R/lib -lR
installing to /opt/ebsofts/MPI/GCC/4.9.3-2.25/OpenMPI/1.10.2/R/3.3.1/lib64/R/library/Rmpi/libs
** R
** demo
** inst
** preparing package for lazy loading
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
PMI2_Job_GetId failed failed
--> Returned value (null) (14) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
orte_ess_init failed
--> Returned value (null) (14) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):
ompi_mpi_init: ompi_rte_init failed
--> Returned "(null)" (14) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[master.cluster:165227] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ERROR: loading failed
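
The build and install steps succeed; it is the final "testing if installed package can be loaded" step that fails, because loading Rmpi calls MPI_Init and Open MPI's runtime (ORTE) tries to bootstrap through SLURM's PMI2 interface (PMI2_Job_GetId). That typically fails when the process runs outside a job allocation, which seems to be the case here on master.cluster. A minimal sketch of a possible workaround, assuming the cluster uses SLURM (suggested by the PMI2 reference); the placeholders stand for the same --configure-args and --library values used above, and the mpi.universe.size() check is only illustrative:

# Skip the load test at install time (R CMD INSTALL supports --no-test-load)
R CMD INSTALL Rmpi_0.6-6.tar.gz \
    --configure-args="<same arguments as above>" \
    --library=<same library path as above> \
    --no-test-load

# Then verify the package where PMI2 can actually initialise, inside a SLURM allocation
salloc -n 1 srun Rscript -e 'library(Rmpi); print(mpi.universe.size()); mpi.quit()'

Alternatively, running the original R CMD INSTALL command itself inside such an allocation should let the existing load test pass without changing any arguments.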