Gist cwsmith/17394609980363e12c38 (last active August 29, 2015 14:20)
5850-node, 8-process test run on two Phis, job id 5198658
TACC: Starting up job 5198658
TACC: Starting parallel tasks...
# Command used to run job in: /home1/02422/cwsmith/.slurm/job.5198658.runcmd.QH2ZVF1r
[0] MPI startup(): Multi-threaded optimized library
[0] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[1] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[2] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[3] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[4] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[5] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[6] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[7] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-mlx4_0-1u
[1] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[2] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[3] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[0] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[0] MPI startup(): shm and dapl data transfer modes
[2] MPI startup(): shm and dapl data transfer modes
[1] MPI startup(): shm and dapl data transfer modes
[3] MPI startup(): shm and dapl data transfer modes
[6] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[5] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[4] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[7] MPI startup(): DAPL provider ofa-v2-mlx4_0-1u
[4] MPI startup(): shm and dapl data transfer modes
[5] MPI startup(): shm and dapl data transfer modes
[6] MPI startup(): shm and dapl data transfer modes
[7] MPI startup(): shm and dapl data transfer modes
[0] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[0] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[4] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[4] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[5] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[5] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[1] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[1] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[2] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[2] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[7] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[7] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[6] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[6] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[3] MPID_nem_init_dapl_coll_fns(): User set DAPL collective mask = 0000
[3] MPID_nem_init_dapl_coll_fns(): Effective DAPL collective mask = 0000
[0] MPI startup(): Rank    Pid      Node name                                Pin cpu
[0] MPI startup(): 0       4094     c557-304-mic0.stampede.tacc.utexas.edu   {1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61}
[0] MPI startup(): 1       4095     c557-304-mic0.stampede.tacc.utexas.edu   {62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122}
[0] MPI startup(): 2       4096     c557-304-mic0.stampede.tacc.utexas.edu   {123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183}
[0] MPI startup(): 3       4097     c557-304-mic0.stampede.tacc.utexas.edu   {0,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,239,240,241,242,243}
[0] MPI startup(): 4       4095     c557-401-mic0.stampede.tacc.utexas.edu   {1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61}
[0] MPI startup(): 5       4096     c557-401-mic0.stampede.tacc.utexas.edu   {62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122}
[0] MPI startup(): 6       4097     c557-401-mic0.stampede.tacc.utexas.edu   {123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183}
[0] MPI startup(): 7       4098     c557-401-mic0.stampede.tacc.utexas.edu   {0,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,239,240,241,242,243}
[0] MPI startup(): I_MPI_DAPL_PROVIDER=ofa-v2-mlx4_0-1u
[0] MPI startup(): I_MPI_DEBUG=5
[0] MPI startup(): I_MPI_DYNAMIC_CONNECTION=1
[0] MPI startup(): I_MPI_FABRICS=shm:dapl
[0] MPI startup(): I_MPI_MIC=1
[0] MPI startup(): I_MPI_PIN_MAPPING=4:0 1,1 62,2 123,3 0
Complete Filename: ./input.config
Local Config: solver.inp
Directory 8-procs_case/ already exists
using the existing inputfiles
changing to the problem directory 8-procs_case/
Number of geombc-dat and restart-dat files to read: 1
Number of fields in geombc-dat: 25
Number of parts per file geombc-dat: 8
Bypassing subcommunicator
[4] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
[5] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
[6] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
[7] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
[0] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
[1] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
[2] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
[3] ERROR - ADIO_Init(): Can't load libmpi_lustre.so library: libmpi_lustre.so: cannot open shared object file: No such file or directory
Number of interior topologies: 1
Number of boundary topologies: 2
Number of fields in restart-dat: 2
Number of parts per file restart-dat: 8
Bypassing subcommunicator
WARNING readheader: Not found time derivative of solution@1?
Time derivative of solution is set to zero (SAFE)
Element block size = 128
Domain size (x,y,z): 0.1000000000 0.0200000000 0.0005000000
Total number of modes = 5850
Number of global nonzeros 82088
1 2.818E+00 1.711E-05 (   0) < 244| 12> [ 0- 36]  36
1 3.830E+00 2.402E-07 ( -18) < 181| 12> [ 0- 25]  61
1 4.824E+00 7.029E-10 ( -43) < 159| 17> [ 0- 28]  89
1 5.851E+00 3.466E-12 ( -66) <  91| 18> [ 0- 27] 116
Number of fields to write in restart files: 2
Bypassing subcommunicator
Filename is restart-dat.1.1
2 7.957E+00 1.507E-05 (   0) < 495| 10> [ 0- 34] 150
2 8.932E+00 1.268E-07 ( -21) < 244| 11> [ 0- 25] 175
2 9.934E+00 3.950E-10 ( -46) <  34| 13> [ 0- 28] 203
2 1.087E+01 3.406E-11 ( -57) < 304| 17> [ 0- 20] 223
Number of fields to write in restart files: 2
Bypassing subcommunicator
Filename is restart-dat.2.1
3 1.314E+01 1.382E-05 (   0) < 495| 10> [ 0- 33] 256
3 1.411E+01 1.139E-07 ( -21) < 707| 12> [ 0- 24] 280
3 1.512E+01 3.607E-10 ( -46) < 130| 13> [ 0- 29] 309
3 1.607E+01 2.173E-11 ( -58) < 430| 15> [ 0- 22] 331
Number of fields to write in restart files: 2
Bypassing subcommunicator
Filename is restart-dat.3.1
4 1.814E+01 1.183E-05 (  -1) < 495| 10> [ 0- 32] 363
4 1.911E+01 1.203E-07 ( -21) < 707| 12> [ 0- 25] 388
4 2.012E+01 3.072E-10 ( -47) < 127| 13> [ 0- 29] 417
4 2.109E+01 5.630E-12 ( -64) < 357| 15> [ 0- 24] 441
Number of fields to write in restart files: 2
Bypassing subcommunicator
Filename is restart-dat.4.1
5 2.467E+01 1.041E-05 (  -2) < 495| 10> [ 0- 32] 473
5 2.564E+01 1.243E-07 ( -21) < 707| 11> [ 0- 24] 497
5 2.666E+01 2.808E-10 ( -47) < 127| 12> [ 0- 30] 527
5 2.764E+01 2.098E-12 ( -69) < 708| 14> [ 0- 26] 553
Number of fields to write in restart files: 2
Bypassing subcommunicator
Filename is restart-dat.5.1
6 2.980E+01 9.375E-06 (  -2) < 495| 10> [ 0- 32] 585
6 3.076E+01 1.272E-07 ( -21) < 707| 11> [ 0- 24] 609
6 3.178E+01 2.547E-10 ( -48) < 127| 11> [ 0- 30] 639
6 3.273E+01 8.420E-12 ( -63) < 708| 15> [ 0- 22] 661
Number of fields to write in restart files: 4
Bypassing subcommunicator
Filename is restart-dat.6.1
T(core) cpu = 31.8976049423218
process - before closing iecho
process - after closing iecho
phasta.cc - last call before finalize!
TACC: Shutdown complete. Exiting.