@bbpbuildbot
Created June 22, 2022 09:34
Logfiles for GitLab pipeline https://bbpgitlab.epfl.ch/hpc/coreneuron/-/pipelines/61704 (:white_check_mark:) running on GitHub PR BlueBrain/CoreNeuron#830.
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886532:resolve_secrets Resolving secrets
section_end:1655886532:resolve_secrets section_start:1655886532:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor471265187, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272569
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272569_PROD_P112_CP7_C11
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563718
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272569_PROD_P112_CP7_C11 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563718 --cpus-per-task=8 --mem=76G
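(Note: the two command lines above show the custom executor's allocation pattern as logged: sbatch holds a node allocation alive with a wrapped "sleep infinity" job, and each CI step is then attached to that allocation with srun --jobid. A minimal standalone Python sketch of that flow, assuming a Slurm environment and using hypothetical helper names allocate/attach; it is not part of the runner itself:)

# Illustrative sketch only: hold a Slurm allocation, then attach commands to it,
# mirroring the sbatch/srun lines logged above.
import re
import subprocess

def allocate(ntasks=2, cpus=8, mem="76G", walltime="1:00:00"):
    # Submit a placeholder job that just sleeps, so the allocation stays alive.
    out = subprocess.run(
        ["sbatch", "-p", "prod", f"--ntasks={ntasks}", f"--cpus-per-task={cpus}",
         f"--mem={mem}", f"--time={walltime}", "--no-requeue",
         "--wrap", "sleep infinity"],
        check=True, capture_output=True, text=True).stdout
    return re.search(r"Submitted batch job (\d+)", out).group(1)   # e.g. "563718"

def attach(jobid, command, ntasks=2, cpus=8, mem="76G"):
    # Run one CI step inside the held allocation.
    subprocess.run(["srun", "--mpi=none", f"--jobid={jobid}", f"--ntasks={ntasks}",
                    f"--cpus-per-task={cpus}", f"--mem={mem}"] + command, check=True)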
section_end:1655886534:prepare_executor section_start:1655886534:prepare_script Preparing environment
Running on r2i5n16 via bbpv1.epfl.ch...
section_end:1655886539:prepare_script section_start:1655886539:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886539:get_sources section_start:1655886539:restore_cache Restoring cache
Checking cache for build:coreneuron:mod2c:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=191936 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886541:restore_cache section_start:1655886541:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (272556)...
Runtime platform  arch=amd64 os=linux pid=192585 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272556 responseStatus=200 OK token=uHoCNEmA
section_end:1655886542:download_artifacts section_start:1655886542:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272569/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272569/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272569/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272569/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272569/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272569_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%intel +tests~legacy-unit build_type=Debug
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be iw7f7wskesewfnyvmtmk3lfqta3e4jpw
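(Note: the SPACK_INSTALLED_HASH step above pipes spack spec --json into a Python one-liner. An equivalent standalone sketch of that parsing, assuming the JSON layout the one-liner relies on, namely a top-level "spec" object whose "nodes" list starts with the root package:)

# Illustrative sketch of the hash-extraction one-liner above: read
# `spack spec --json` output from stdin and print the root node's hash.
import json
import sys

spec = json.load(sys.stdin)        # e.g. piped from: spack spec --json coreneuron%intel ...
root = spec["spec"]["nodes"][0]    # the first node is the package being installed
print(root["hash"])                # e.g. iw7f7wskesewfnyvmtmk3lfqta3e4jpw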
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563718/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563718/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:29:32 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.14 / 0.51 (28.31 %)
  Files: 1473
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'iw7f7wskesewfnyvmtmk3lfqta3e4jpw'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%intel~legacy-unit+tests build_type=Debug
Concretized
--------------------------------
- iw7f7ws coreneuron@develop%intel@2021.4.0~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
- dwdch6b ^bison@3.8.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] dl7pfht ^boost@1.78.0%intel@2021.4.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
- z3q5f3x ^cmake@3.21.4%intel@2021.4.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
- s4ueg72 ^flex@2.6.3%intel@2021.4.0+lex~nls arch=linux-rhel7-skylake
[^] 3narjkw ^hpe-mpi@2.25.hmpt%intel@2021.4.0 arch=linux-rhel7-skylake
[^] podtqd6 ^libsonata-report@1.1.1%intel@2021.4.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ofveh3j ^hdf5@1.10.7%intel@2021.4.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] xys6npz ^pkgconf@1.8.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] e7w5nez ^zlib@1.2.11%intel@2021.4.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] kxkrl2g ^spdlog@1.9.2%intel@2021.4.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] unnpvpo ^fmt@8.1.1%intel@2021.4.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- 6ggc5yr ^ninja@1.10.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] f27a7nn ^python@3.9.7%intel@2021.4.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] gdhqypa ^reportinglib@2.5.6%intel@2021.4.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-z3q5f3xwpuibd3qbgdscqmu3efarbu42)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-3narjkwjh6i2jr3zl3g5wdjlqi52hkwh)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/pkgconf-1.8.0-xys6np
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-e7w5nezpwf572epfhbosqfzboztysout)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f27a7nnhppeztxijpispnaedcfo2vjxi)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/fmt-8.1.1-unnpvp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/reportinglib-2.5.6-gdhqyp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/hdf5-1.10.7-ofveh3
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/spdlog-1.9.2-kxkrl2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libsonata-report-1.1.1-podtqd
==> Installing coreneuron-develop-iw7f7wskesewfnyvmtmk3lfqta3e4jpw
==> No binary for coreneuron-develop-iw7f7wskesewfnyvmtmk3lfqta3e4jpw found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-iw7f7wskesewfnyvmtmk3lfqta3e4jpw
Fetch: 3.98s. Build: 42.46s. Total: 46.45s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_intel-2021.4.0-skylake/coreneuron-develop-iw7f7w
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563718/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563718/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:31:05 2022
  Hits: 107 / 112 (95.54 %)
    Direct: 89 / 112 (79.46 %)
    Preprocessed: 18 / 23 (78.26 %)
  Misses: 5
    Direct: 23
    Preprocessed: 5
  Uncacheable: 18
Primary storage:
  Hits: 196 / 224 (87.50 %)
  Misses: 28
  Cache size (GB): 0.15 / 0.51 (28.44 %)
  Files: 1483
Uncacheable:
  Called for linking: 15
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
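(Note: num_failures above is computed from the JUnit report written by spack install --log-format=junit. An equivalent standalone sketch using lxml, counting <failure> and <error> elements; the install.xml path below stands in for the ${CI_PROJECT_DIR}/install.xml the job actually writes:)

# Illustrative sketch of the failure-count one-liner above: count <failure> and
# <error> elements in the JUnit report produced by `spack install --log-format=junit`.
from lxml import etree

xml = etree.parse("install.xml")   # the job uses ${CI_PROJECT_DIR}/install.xml
root = xml.getroot()
num_failures = sum(1 for _ in root.iter("failure")) + sum(1 for _ in root.iter("error"))
print(num_failures)                # the job exits with this count if it is non-zero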
section_end:1655886668:step_script section_start:1655886668:archive_cache Saving cache for successful job
Creating cache build:coreneuron:mod2c:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=200014 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Archive is up to date! 
Created cache
section_end:1655886669:archive_cache section_start:1655886669:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=200094 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272569 responseStatus=201 Created token=myidMhnG
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=200152 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272569 responseStatus=201 Created token=myidMhnG
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=200212 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272569 responseStatus=201 Created token=myidMhnG
section_end:1655886671:upload_artifacts_on_success section_start:1655886671:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886672:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886531:resolve_secrets Resolving secrets
section_end:1655886531:resolve_secrets section_start:1655886531:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor076705279, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272563
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272563_PROD_P112_CP5_C9
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563716
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272563_PROD_P112_CP5_C9 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563716 --cpus-per-task=8 --mem=76G
section_end:1655886534:prepare_executor section_start:1655886534:prepare_script Preparing environment
Running on r2i5n16 via bbpv1.epfl.ch...
section_end:1655886539:prepare_script section_start:1655886539:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886539:get_sources section_start:1655886539:restore_cache Restoring cache
Checking cache for build:coreneuron:mod2c:nvhpc:acc:unified-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=192026 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886541:restore_cache section_start:1655886541:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (272556)...
Runtime platform  arch=amd64 os=linux pid=192515 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272556 responseStatus=200 OK token=uHoCNEmA
section_end:1655886542:download_artifacts section_start:1655886542:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272563/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272563/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272563/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272563/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272563/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272563_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +gpu+unified+openmp+tests~legacy-unit build_type=RelWithDebInfo
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 7xhnxahexeffbne3dpzxwo5qcrumhnar
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563716/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563716/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:29:32 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.13 / 0.51 (24.75 %)
  Files: 1689
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '7xhnxahexeffbne3dpzxwo5qcrumhnar'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+openmp+tests+unified build_type=RelWithDebInfo
Concretized
--------------------------------
- 7xhnxah coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests+unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
- ucwiakr ^cmake@3.21.4%nvhpc@22.3~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
[^] pjmdwuu ^hpe-mpi@2.25.hmpt%nvhpc@22.3 arch=linux-rhel7-skylake
[^] 6d2jeoj ^libsonata-report@1.1.1%nvhpc@22.3~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] tbfoeg7 ^hdf5@1.10.7%nvhpc@22.3+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 6dfyugs ^pkgconf@1.8.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] h7lotu6 ^zlib@1.2.11%nvhpc@22.3+optimize+pic+shared arch=linux-rhel7-skylake
[^] v5zxpfc ^spdlog@1.9.2%nvhpc@22.3~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jeloytj ^fmt@8.1.1%nvhpc@22.3~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] f62u3oh ^python@3.9.7%nvhpc@22.3+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] bdkvweu ^reportinglib@2.5.6%nvhpc@22.3~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-pjmdwuu36ioiwqyrg6lrj7nrq3waqjj2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/pkgconf-1.8.0-6dfyug
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-h7lotu6jszcvumwucry2y5jnhvxw5x2d)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f62u3ohchswt5q2b63chawohzqrl6wvy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/fmt-8.1.1-jeloyt
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/reportinglib-2.5.6-bdkvwe
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/hdf5-1.10.7-tbfoeg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/spdlog-1.9.2-v5zxpf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libsonata-report-1.1.1-6d2jeo
==> Installing coreneuron-develop-7xhnxahexeffbne3dpzxwo5qcrumhnar
==> No binary for coreneuron-develop-7xhnxahexeffbne3dpzxwo5qcrumhnar found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-7xhnxahexeffbne3dpzxwo5qcrumhnar
Fetch: 4.87s. Build: 1m 11.22s. Total: 1m 16.09s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/coreneuron-develop-7xhnxa
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563716/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563716/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:31:38 2022
  Hits: 107 / 115 (93.04 %)
    Direct: 89 / 115 (77.39 %)
    Preprocessed: 18 / 26 (69.23 %)
  Misses: 8
    Direct: 26
    Preprocessed: 8
  Uncacheable: 19
Primary storage:
  Hits: 196 / 230 (85.22 %)
  Misses: 34
  Cache size (GB): 0.13 / 0.51 (24.73 %)
  Files: 1705
Uncacheable:
  Called for linking: 16
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1655886699:step_script section_start:1655886699:archive_cache Saving cache for successful job
Creating cache build:coreneuron:mod2c:nvhpc:acc:unified-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=201472 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Anvhpc%3Aacc%3Aunified-8-non_protected
Created cache
section_end:1655886705:archive_cache section_start:1655886705:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=201664 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272563 responseStatus=201 Created token=15snGyEb
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=201739 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272563 responseStatus=201 Created token=15snGyEb
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=201820 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272563 responseStatus=201 Created token=15snGyEb
section_end:1655886707:upload_artifacts_on_success section_start:1655886707:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886707:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886531:resolve_secrets Resolving secrets
section_end:1655886531:resolve_secrets section_start:1655886531:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor227826565, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272561
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272561_PROD_P112_CP4_C8
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563715
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272561_PROD_P112_CP4_C8 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563715 --cpus-per-task=8 --mem=76G
section_end:1655886533:prepare_executor section_start:1655886533:prepare_script Preparing environment
Running on r2i5n16 via bbpv1.epfl.ch...
section_end:1655886539:prepare_script section_start:1655886539:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886539:get_sources section_start:1655886539:restore_cache Restoring cache
Checking cache for build:coreneuron:mod2c:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=191987 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886541:restore_cache section_start:1655886541:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (272556)...
Runtime platform  arch=amd64 os=linux pid=192463 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272556 responseStatus=200 OK token=uHoCNEmA
section_end:1655886542:download_artifacts section_start:1655886542:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272561/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272561/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272561/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272561/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272561/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272561_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
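Note: the insteadOf rewrite written above makes any SSH-style remote on bbpgitlab.epfl.ch resolve to an HTTPS URL that authenticates with the job token, so Spack can clone private repositories without deploy keys. A minimal illustration of the effect (the repository path is hypothetical, not taken from this pipeline):

# with the config above in ${XDG_CONFIG_HOME}/git/config, a clone such as
git clone git@bbpgitlab.epfl.ch:hpc/example-repo.git
# is transparently fetched from
# https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/hpc/example-repo.git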
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +gpu+openmp+tests~legacy-unit build_type=RelWithDebInfo
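Note: the spec string is assembled from job variables, and the ${SPACK_PACKAGE_COMPILER:+%} expansion only inserts the % separator when a compiler is actually set, so the spec stays valid if that variable is empty. A small sketch of the behaviour (the non-empty values are the ones used in this job; the empty case is hypothetical):

SPACK_PACKAGE=coreneuron
SPACK_PACKAGE_COMPILER=nvhpc
echo "${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"   # coreneuron%nvhpc
SPACK_PACKAGE_COMPILER=
echo "${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"   # coreneuron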
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 6qi7xiczl6qfolioernjyooskeukbk4m
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
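Note: concretizing once with spack spec --json and reading the first node's hash pins the exact build that the later spack install will produce; the hash is reused to name the stage and build directories above and is exported later so downstream jobs can depend on ^/<hash> instead of re-concretizing. A standalone sketch of the same extraction, assuming the JSON is saved to a file named spec.json (that file is not part of this job, which pipes the JSON through a variable instead):

spack spec --json coreneuron%nvhpc > spec.json
python -c "import json; print(json.load(open('spec.json'))['spec']['nodes'][0]['hash'])"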
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563715/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563715/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:29:32 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.13 / 0.51 (24.72 %)
  Files: 1672
$ fi
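Note: the ccache block above is a warm-cache cycle: restore the tarball saved by the previous pipeline into node-local ${TMPDIR}, zero the statistics, let Spack compile with ccache enabled (the "ccache: True" entry in the config blame above), then report the hit rate and re-archive the cache after the install. Pointing CCACHE_BASEDIR at the per-pipeline root makes ccache rewrite absolute paths below it as relative before hashing, which is what lets objects compiled in one job's J<id> directory hit in the next job's different directory. A condensed sketch of the same pattern, with a plain make standing in for the spack install step:

export CCACHE_DIR="${TMPDIR}/ccache" CCACHE_BASEDIR="${CI_BUILDS_DIR}" CCACHE_MAXSIZE=512M
mkdir -p "${CCACHE_DIR}"
[ -f ccache.tar ] && tar -C "${CCACHE_DIR}" -xf ccache.tar    # restore the cache from the previous run
ccache --zero-stats
make -j"${SLURM_CPUS_PER_TASK}"                                # placeholder for the instrumented build
ccache --show-stats                                            # hit rate for this run only
tar -C "${CCACHE_DIR}" -cf ccache.tar .                        # persist the warmed cache for the next run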
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '6qi7xiczl6qfolioernjyooskeukbk4m'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+openmp+tests build_type=RelWithDebInfo
Concretized
--------------------------------
- 6qi7xic coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
- ucwiakr ^cmake@3.21.4%nvhpc@22.3~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
[^] pjmdwuu ^hpe-mpi@2.25.hmpt%nvhpc@22.3 arch=linux-rhel7-skylake
[^] 6d2jeoj ^libsonata-report@1.1.1%nvhpc@22.3~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] tbfoeg7 ^hdf5@1.10.7%nvhpc@22.3+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 6dfyugs ^pkgconf@1.8.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] h7lotu6 ^zlib@1.2.11%nvhpc@22.3+optimize+pic+shared arch=linux-rhel7-skylake
[^] v5zxpfc ^spdlog@1.9.2%nvhpc@22.3~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jeloytj ^fmt@8.1.1%nvhpc@22.3~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] f62u3oh ^python@3.9.7%nvhpc@22.3+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] bdkvweu ^reportinglib@2.5.6%nvhpc@22.3~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-pjmdwuu36ioiwqyrg6lrj7nrq3waqjj2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/pkgconf-1.8.0-6dfyug
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-h7lotu6jszcvumwucry2y5jnhvxw5x2d)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f62u3ohchswt5q2b63chawohzqrl6wvy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/fmt-8.1.1-jeloyt
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/reportinglib-2.5.6-bdkvwe
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/hdf5-1.10.7-tbfoeg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/spdlog-1.9.2-v5zxpf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libsonata-report-1.1.1-6d2jeo
==> Installing coreneuron-develop-6qi7xiczl6qfolioernjyooskeukbk4m
==> No binary for coreneuron-develop-6qi7xiczl6qfolioernjyooskeukbk4m found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-6qi7xiczl6qfolioernjyooskeukbk4m
Fetch: 4.75s. Build: 1m 11.42s. Total: 1m 16.16s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/coreneuron-develop-6qi7xi
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563715/ccache
  Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563715/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:31:38 2022
  Hits: 107 / 115 (93.04 %)
    Direct: 89 / 115 (77.39 %)
    Preprocessed: 18 / 26 (69.23 %)
  Misses: 8
    Direct: 26
    Preprocessed: 8
  Uncacheable: 19
Primary storage:
  Hits: 196 / 230 (85.22 %)
  Misses: 34
  Cache size (GB): 0.13 / 0.51 (24.70 %)
  Files: 1688
Uncacheable:
  Called for linking: 16
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
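Note: the job's exit status comes from the JUnit report that spack install --log-format=junit wrote to install.xml: the step exits with the number of <failure> plus <error> elements, so a clean install exits 0. The same count can be reproduced with the Python standard library when lxml is not available (equivalent logic, not taken from this pipeline):

python -c "import xml.etree.ElementTree as ET; r = ET.parse('install.xml').getroot(); print(sum(1 for _ in r.iter('failure')) + sum(1 for _ in r.iter('error')))"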
section_end:1655886699:step_script section_start:1655886699:archive_cache Saving cache for successful job
Creating cache build:coreneuron:mod2c:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=201524 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Anvhpc%3Aacc-8-non_protected
Created cache
section_end:1655886705:archive_cache section_start:1655886705:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=201602 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272561 responseStatus=201 Created token=vewM1syi
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=201702 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272561 responseStatus=201 Created token=vewM1syi
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=201779 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272561 responseStatus=201 Created token=vewM1syi
section_end:1655886707:upload_artifacts_on_success section_start:1655886707:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886707:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886796:resolve_secrets Resolving secrets
section_end:1655886796:resolve_secrets section_start:1655886796:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor390998597, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272572
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272572_PROD_P112_CP13_C6
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563742
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272572_PROD_P112_CP13_C6 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563742 --cpus-per-task=8 --mem=76G
section_end:1655886798:prepare_executor section_start:1655886798:prepare_script Preparing environment
Running on r1i6n30 via bbpv1.epfl.ch...
section_end:1655886802:prepare_script section_start:1655886802:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886802:get_sources section_start:1655886802:restore_cache Restoring cache
Checking cache for build:coreneuron:nmodl:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=254545 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886804:restore_cache section_start:1655886804:download_artifacts Downloading artifacts
Downloading artifacts for build:nmodl (272559)...
Runtime platform  arch=amd64 os=linux pid=255061 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272559 responseStatus=200 OK token=TUUbHxzw
section_end:1655886805:download_artifacts section_start:1655886805:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272572/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272572/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272572/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272572/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272572/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272572_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%intel +nmodl+tests~legacy-unit build_type=Debug ^hpe-mpi%gcc ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be rh5oq7ygplbbyryh6avavlwff7ieo23w
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563742/ccache
  Primary config: /nvme/bbpcihpcproj12/563742/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:33:49 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.15 / 0.51 (28.95 %)
  Files: 1651
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'rh5oq7ygplbbyryh6avavlwff7ieo23w'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%intel~legacy-unit+nmodl+tests build_type=Debug
- ^hpe-mpi%gcc
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[^] ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- rh5oq7y coreneuron@develop%intel@2021.4.0~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
- dwdch6b ^bison@3.8.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] dl7pfht ^boost@1.78.0%intel@2021.4.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
- s4ueg72 ^flex@2.6.3%intel@2021.4.0+lex~nls arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] lbajih4 ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- 6ggc5yr ^ninja@1.10.2%intel@2021.4.0 arch=linux-rhel7-skylake
[+] 4dzxcps ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1.1-lbajih
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
==> Installing coreneuron-develop-rh5oq7ygplbbyryh6avavlwff7ieo23w
==> No binary for coreneuron-develop-rh5oq7ygplbbyryh6avavlwff7ieo23w found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-rh5oq7ygplbbyryh6avavlwff7ieo23w
Fetch: 4.72s. Build: 50.14s. Total: 54.86s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_intel-2021.4.0-skylake/coreneuron-develop-rh5oq7
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563742/ccache
  Primary config: /nvme/bbpcihpcproj12/563742/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:35:34 2022
  Hits: 86 / 91 (94.51 %)
    Direct: 77 / 91 (84.62 %)
    Preprocessed: 9 / 14 (64.29 %)
  Misses: 5
    Direct: 14
    Preprocessed: 5
  Uncacheable: 16
Primary storage:
  Hits: 163 / 182 (89.56 %)
  Misses: 19
  Cache size (GB): 0.15 / 0.51 (28.98 %)
  Files: 1670
Uncacheable:
  Called for linking: 13
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1655886934:step_script section_start:1655886934:archive_cache Saving cache for successful job
Creating cache build:coreneuron:nmodl:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=260731 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Anmodl%3Aintel-8-non_protected
Created cache
section_end:1655886942:archive_cache section_start:1655886942:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=260939 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272572 responseStatus=201 Created token=mjvHJvjk
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=260976 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272572 responseStatus=201 Created token=mjvHJvjk
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=261017 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272572 responseStatus=201 Created token=mjvHJvjk
section_end:1655886944:upload_artifacts_on_success section_start:1655886944:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886944:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886796:resolve_secrets Resolving secrets
section_end:1655886796:resolve_secrets section_start:1655886796:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor459020635, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272567
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272567_PROD_P112_CP12_C4
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563740
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272567_PROD_P112_CP12_C4 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563740 --cpus-per-task=8 --mem=76G
section_end:1655886798:prepare_executor section_start:1655886798:prepare_script Preparing environment
Running on r1i6n30 via bbpv1.epfl.ch...
section_end:1655886802:prepare_script section_start:1655886802:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886802:get_sources section_start:1655886802:restore_cache Restoring cache
Checking cache for build:coreneuron:nmodl:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=254506 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886804:restore_cache section_start:1655886804:download_artifacts Downloading artifacts
Downloading artifacts for build:nmodl (272559)...
Runtime platform  arch=amd64 os=linux pid=254989 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272559 responseStatus=200 OK token=TUUbHxzw
section_end:1655886805:download_artifacts section_start:1655886805:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272567/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272567/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272567/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272567/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272567/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272567_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
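The two steps above route all `git@bbpgitlab.epfl.ch:` access through the CI job token over HTTPS by writing a `url.insteadOf` rewrite into a job-local git config (git reads `$XDG_CONFIG_HOME/git/config`). A minimal standalone sketch of the same idea, with an illustrative config location:

    export XDG_CONFIG_HOME="${PWD}/job_local_config"   # illustrative; the job uses a per-job dir
    mkdir -p "${XDG_CONFIG_HOME}/git"
    cat > "${XDG_CONFIG_HOME}/git/config" <<EOF
    [url "https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/"]
        insteadOf = git@bbpgitlab.epfl.ch:
    EOF
    # Any later `git clone git@bbpgitlab.epfl.ch:group/repo.git` is rewritten to the
    # token-authenticated HTTPS URL, so no SSH keys are needed inside the CI job.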
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +nmodl~openmp+gpu+tests~legacy-unit+sympy build_type=RelWithDebInfo ^hpe-mpi%gcc ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
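The spec string assembled here combines a compiler request, variant toggles, and dependency constraints, including a dependency pinned by hash to the nmodl installation produced by an earlier job in this pipeline. An annotated reading of it, following standard Spack spec syntax:

    # coreneuron%nvhpc                     build coreneuron with the nvhpc compiler
    # +nmodl ~openmp +gpu +tests ~legacy-unit +sympy
    #                                      boolean variants switched on (+) or off (~)
    # build_type=RelWithDebInfo            a key=value variant
    # ^hpe-mpi%gcc                         dependency constraint: hpe-mpi built with gcc
    # ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb   dependency pinned to an existing install by hash
    spack spec -Il coreneuron%nvhpc +nmodl~openmp+gpu+tests~legacy-unit+sympy \
        build_type=RelWithDebInfo ^hpe-mpi%gcc ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb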
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 6q3vzbkyvzxputeeg6himnv7tcfu7ge7
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
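`spack spec --json` is used to discover, before anything is installed, the DAG hash Spack will assign to this build; the stage and build directory names are then derived from it so that the build logs can be collected afterwards. A condensed sketch of that logic (variable names follow the job above; the `-develop-` part assumes a develop-version package, as here):

    JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
    # The root of the DAG is the first node in the JSON output; its hash names the install.
    HASH=$(echo "${JSON_SPEC}" | python -c \
        'import json, sys; print(json.loads(sys.stdin.read())["spec"]["nodes"][0]["hash"])')
    STAGE_DIR="${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${HASH}"
    BUILD_DIR="${STAGE_DIR}/spack-build-${HASH:0:7}"   # ${HASH:0:7}: first seven characters
    SOURCE_DIR="${STAGE_DIR}/spack-src"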
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563740/ccache
  Primary config: /nvme/bbpcihpcproj12/563740/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:33:49 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.11 / 0.51 (21.75 %)
  Files: 1551
$ fi
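When SPACK_USE_CCACHE is set, the block above restores a compiler cache from a tarball carried between pipeline runs (`ccache.tar`), places the live cache on node-local storage under ${TMPDIR}, and zeroes the statistics so the hit rate printed after the build covers this job alone. The same pattern in isolation (a sketch; module names and the size limit follow the job above):

    module load unstable ccache                        # site-provided ccache module
    export CCACHE_BASEDIR=$(realpath -P "${CI_BUILDS_DIR}")   # make cached paths relative
    export CCACHE_MAXSIZE=512M
    export CCACHE_DIR="${TMPDIR}/ccache"               # node-local scratch, fast I/O
    mkdir -p "${CCACHE_DIR}"
    if [ -f "${CI_PROJECT_DIR}/ccache.tar" ]; then     # warm start from the previous run
        tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
    fi
    ccache --zero-stats                                # stats printed later cover only this build
    # ... build ...
    ccache --cleanup                                   # trim the cache to CCACHE_MAXSIZE
    tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .   # persist for the next run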
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '6q3vzbkyvzxputeeg6himnv7tcfu7ge7'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+nmodl~openmp+sympy+tests build_type=RelWithDebInfo
- ^hpe-mpi%gcc
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[^] ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- 6q3vzbk coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared+sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] lbajih4 ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[+] 4dzxcps ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1.1-lbajih
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
==> Installing coreneuron-develop-6q3vzbkyvzxputeeg6himnv7tcfu7ge7
==> No binary for coreneuron-develop-6q3vzbkyvzxputeeg6himnv7tcfu7ge7 found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-6q3vzbkyvzxputeeg6himnv7tcfu7ge7
Fetch: 4.09s. Build: 3m 41.84s. Total: 3m 45.93s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/coreneuron-develop-6q3vzb
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563740/ccache
  Primary config: /nvme/bbpcihpcproj12/563740/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:38:27 2022
  Hits: 86 / 93 (92.47 %)
    Direct: 77 / 93 (82.80 %)
    Preprocessed: 9 / 16 (56.25 %)
  Misses: 7
    Direct: 16
    Preprocessed: 7
  Uncacheable: 16
Primary storage:
  Hits: 163 / 186 (87.63 %)
  Misses: 23
  Cache size (GB): 0.11 / 0.51 (21.82 %)
  Files: 1574
Uncacheable:
  Called for linking: 13
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
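Since `spack install` was run with `--log-format=junit`, per-package failures are recorded in install.xml; as a final gate the job counts `<failure>` and `<error>` elements there and uses the count as its exit status, so a partially failed install cannot pass as a green job. A minimal sketch of that check (lxml is what the job loads; the standard-library xml.etree.ElementTree would work equally well):

    num_failures=$(python -c "
    from lxml import etree                      # assumption: provided by the python-dev module
    root = etree.parse('install.xml').getroot()
    print(sum(1 for _ in root.iter('failure')) + sum(1 for _ in root.iter('error')))
    ")
    # A non-zero count becomes the job's exit code.
    if [ "${num_failures}" -gt 0 ]; then exit "${num_failures}"; fi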
section_end:1655887108:step_script section_start:1655887108:archive_cache Saving cache for successful job
Creating cache build:coreneuron:nmodl:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=268596 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Anmodl%3Anvhpc%3Aacc-8-non_protected
Created cache
section_end:1655887115:archive_cache section_start:1655887115:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=269219 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272567 responseStatus=201 Created token=cPTxqcyz
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=269281 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272567 responseStatus=201 Created token=cPTxqcyz
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=269322 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272567 responseStatus=201 Created token=cPTxqcyz
section_end:1655887116:upload_artifacts_on_success section_start:1655887116:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655887117:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886795:resolve_secrets Resolving secrets
section_end:1655886795:resolve_secrets section_start:1655886795:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor860802433, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272565
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272565_PROD_P112_CP2_C2
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563739
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272565_PROD_P112_CP2_C2 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563739 --cpus-per-task=8 --mem=76G
section_end:1655886797:prepare_executor section_start:1655886797:prepare_script Preparing environment
Running on r1i6n30 via bbpv1.epfl.ch...
section_end:1655886799:prepare_script section_start:1655886799:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886800:get_sources section_start:1655886800:restore_cache Restoring cache
Checking cache for build:coreneuron:nmodl:nvhpc:omp-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=254224 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886802:restore_cache section_start:1655886802:download_artifacts Downloading artifacts
Downloading artifacts for build:nmodl (272559)...
Runtime platform  arch=amd64 os=linux pid=254413 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272559 responseStatus=200 OK token=TUUbHxzw
section_end:1655886803:download_artifacts section_start:1655886803:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272565/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272565/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272565/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272565/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272565/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272565_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +nmodl+openmp+gpu+tests~legacy-unit~sympy build_type=RelWithDebInfo ^hpe-mpi%gcc ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be rjmenmrpgml6pswauzum5llfbry6ngnh
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563739/ccache
  Primary config: /nvme/bbpcihpcproj12/563739/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:33:49 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.13 / 0.51 (25.48 %)
  Files: 1719
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'rjmenmrpgml6pswauzum5llfbry6ngnh'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+gpu~legacy-unit+nmodl+openmp~sympy+tests build_type=RelWithDebInfo
- ^hpe-mpi%gcc
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[^] ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- rjmenmr coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] lbajih4 ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[+] 4dzxcps ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1.1-lbajih
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
==> Installing coreneuron-develop-rjmenmrpgml6pswauzum5llfbry6ngnh
==> No binary for coreneuron-develop-rjmenmrpgml6pswauzum5llfbry6ngnh found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-rjmenmrpgml6pswauzum5llfbry6ngnh
Fetch: 4.13s. Build: 3m 39.65s. Total: 3m 43.78s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/coreneuron-develop-rjmenm
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563739/ccache
  Primary config: /nvme/bbpcihpcproj12/563739/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:38:25 2022
  Hits: 86 / 94 (91.49 %)
    Direct: 77 / 94 (81.91 %)
    Preprocessed: 9 / 17 (52.94 %)
  Misses: 8
    Direct: 17
    Preprocessed: 8
  Uncacheable: 17
Primary storage:
  Hits: 163 / 188 (86.70 %)
  Misses: 25
  Cache size (GB): 0.13 / 0.51 (25.54 %)
  Files: 1744
Uncacheable:
  Called for linking: 14
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1655887106:step_script section_start:1655887106:archive_cache Saving cache for successful job
Creating cache build:coreneuron:nmodl:nvhpc:omp-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=268331 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Anmodl%3Anvhpc%3Aomp-8-non_protected
Created cache
section_end:1655887113:archive_cache section_start:1655887113:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=268919 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272565 responseStatus=201 Created token=W7PLLsmx
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=268977 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272565 responseStatus=201 Created token=W7PLLsmx
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=269028 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272565 responseStatus=201 Created token=W7PLLsmx
section_end:1655887114:upload_artifacts_on_success section_start:1655887114:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655887115:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886676:resolve_secrets Resolving secrets
section_end:1655886676:resolve_secrets section_start:1655886676:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor884222818, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272579
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272579_PROD_P112_CP7_C11
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563725
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272579_PROD_P112_CP7_C11 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563725 --cpus-per-task=8 --mem=76G
section_end:1655886677:prepare_executor section_start:1655886677:prepare_script Preparing environment
Running on r1i7n22 via bbpv1.epfl.ch...
section_end:1655886680:prepare_script section_start:1655886680:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886681:get_sources section_start:1655886681:restore_cache Restoring cache
Checking cache for build:neuron:mod2c:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=188919 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886685:restore_cache section_start:1655886685:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:intel (272569)...
Runtime platform  arch=amd64 os=linux pid=189172 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272569 responseStatus=200 OK token=myidMhnG
section_end:1655886686:download_artifacts section_start:1655886686:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272579/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272579/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272579/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272579/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272579/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
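
The only job-specific part of this configuration is the small scope written by the collapsed heredoc earlier (its build_stage and source_cache entries head each job's blame output); everything else comes from the site scope under /gpfs/.../apps/bsd/config and Spack's bundled defaults. A minimal Python sketch of writing such a per-job scope, with illustrative paths:

    import os
    from pathlib import Path

    # Sketch only: keep build stages and the source cache inside this job's
    # working directory so concurrent jobs sharing the project path do not
    # collide. SPACK_USER_CONFIG_PATH makes Spack pick this scope up.
    config_dir = Path.cwd() / "spack-config"
    config_dir.mkdir(exist_ok=True)
    (config_dir / "config.yaml").write_text(
        "config:\n"
        "  build_stage:\n"
        f"  - {Path.cwd() / 'spack-build'}\n"
        f"  source_cache: {Path.cwd() / 'spack-source-cache'}\n"
    )
    os.environ["SPACK_USER_CONFIG_PATH"] = str(config_dir)
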
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272579_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
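
The two commands above drop a user-level git configuration that rewrites SSH-style remotes for bbpgitlab.epfl.ch to token-authenticated HTTPS, so clones inside the job need neither SSH keys nor credential prompts. A minimal Python sketch of the same rewrite (CI_JOB_TOKEN and XDG_CONFIG_HOME are taken from the job environment):

    import os
    from pathlib import Path

    # Sketch of the rewrite configured above: any git@bbpgitlab.epfl.ch:
    # remote is transparently fetched over HTTPS with the job token.
    git_dir = Path(os.environ["XDG_CONFIG_HOME"]) / "git"
    git_dir.mkdir(parents=True, exist_ok=True)
    token = os.environ["CI_JOB_TOKEN"]
    (git_dir / "config").write_text(
        f'[url "https://gitlab-ci-token:{token}@bbpgitlab.epfl.ch/"]\n'
        "\tinsteadOf = git@bbpgitlab.epfl.ch:\n"
    )
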
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/iw7f7wskesewfnyvmtmk3lfqta3e4jpw
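
SPACK_FULL_SPEC is assembled from job variables: the bash expansion ${SPACK_PACKAGE_COMPILER:+%} inserts the "%" separator only when a compiler is set, and the trailing "^/<hash>" pins the coreneuron build produced by the upstream job. A small Python sketch of the same assembly:

    # Sketch of how the full spec string above is put together.
    def full_spec(package, compiler="", spec="", deps="", previous=""):
        parts = [package + (f"%{compiler}" if compiler else ""), spec, deps, previous]
        return " ".join(p for p in parts if p)

    print(full_spec(
        "neuron", "intel",
        "+coreneuron+debug+tests~legacy-unit~rx3d "
        "model_tests=channel-benchmark,olfactory,tqperf-heavy",
        previous="^/iw7f7wskesewfnyvmtmk3lfqta3e4jpw",
    ))
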
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 5rhr7pzdmuyym5ch4jvmvyd7544mkxl4
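
The concretized hash is read out of the `spack spec --json` output with an inline python call; the same extraction written out as a short sketch (it assumes the spec/nodes[0]/hash layout used by the one-liner above):

    import json
    import subprocess
    import sys

    # Sketch: concretize the spec passed on the command line and print the
    # hash of the first (root) node of the resulting JSON document.
    full_spec = sys.argv[1:]   # e.g. the tokens echoed in "Preparing to install ..."
    out = subprocess.run(["spack", "spec", "--json", *full_spec],
                         check=True, capture_output=True, text=True).stdout
    print(json.loads(out)["spec"]["nodes"][0]["hash"])
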
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563725/ccache
  Primary config: /nvme/bbpcihpcproj12/563725/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:31:53 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.26 / 0.51 (51.42 %)
  Files: 13149
$ fi
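
The ccache block above primes the compiler cache on node-local storage (TMPDIR), caps it at 512M, and seeds it from a ccache.tar restored out of the GitLab cache by an earlier run of this job. A rough Python equivalent of that priming, under the same environment variables:

    import os
    import tarfile
    from pathlib import Path

    # Sketch of the ccache setup above: node-local cache directory, size cap,
    # base directory for relocatable hits, and restore of a saved tarball.
    ccache_dir = Path(os.environ["TMPDIR"]) / "ccache"
    ccache_dir.mkdir(parents=True, exist_ok=True)
    os.environ["CCACHE_DIR"] = str(ccache_dir)
    os.environ["CCACHE_MAXSIZE"] = "512M"
    os.environ["CCACHE_BASEDIR"] = os.path.realpath(os.environ["CI_BUILDS_DIR"])
    saved = Path(os.environ["CI_PROJECT_DIR"]) / "ccache.tar"
    if saved.is_file():
        with tarfile.open(saved) as tar:
            tar.extractall(ccache_dir)
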
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '5rhr7pzdmuyym5ch4jvmvyd7544mkxl4'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%intel+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%intel@2021.4.0~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] ^boost@1.78.0%intel@2021.4.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^hpe-mpi@2.25.hmpt%intel@2021.4.0 arch=linux-rhel7-skylake
[^] ^libsonata-report@1.1.1%intel@2021.4.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^hdf5@1.10.7%intel@2021.4.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^pkgconf@1.8.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] ^zlib@1.2.11%intel@2021.4.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%intel@2021.4.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%intel@2021.4.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[^] ^python@3.9.7%intel@2021.4.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^reportinglib@2.5.6%intel@2021.4.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- 5rhr7pz neuron@develop%intel@2021.4.0+binary~caliper+coreneuron+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi~profile+python~rx3d+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c sanitizers=None arch=linux-rhel7-skylake
- dwdch6b ^bison@3.8.2%intel@2021.4.0 arch=linux-rhel7-skylake
- z3q5f3x ^cmake@3.21.4%intel@2021.4.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] iw7f7ws ^coreneuron@develop%intel@2021.4.0~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] dl7pfht ^boost@1.78.0%intel@2021.4.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] 3narjkw ^hpe-mpi@2.25.hmpt%intel@2021.4.0 arch=linux-rhel7-skylake
[^] podtqd6 ^libsonata-report@1.1.1%intel@2021.4.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ofveh3j ^hdf5@1.10.7%intel@2021.4.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] xys6npz ^pkgconf@1.8.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] e7w5nez ^zlib@1.2.11%intel@2021.4.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] kxkrl2g ^spdlog@1.9.2%intel@2021.4.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] unnpvpo ^fmt@8.1.1%intel@2021.4.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[^] f27a7nn ^python@3.9.7%intel@2021.4.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] gdhqypa ^reportinglib@2.5.6%intel@2021.4.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- s4ueg72 ^flex@2.6.3%intel@2021.4.0+lex~nls arch=linux-rhel7-skylake
[^] bjxwlfq ^gettext@0.21%intel@2021.4.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] dvyfbbf ^bzip2@1.0.8%intel@2021.4.0~debug~pic+shared arch=linux-rhel7-skylake
[^] ka2cfu4 ^libiconv@1.16%intel@2021.4.0 libs=shared,static arch=linux-rhel7-skylake
[^] w3ob7m2 ^libxml2@2.9.12%intel@2021.4.0~python arch=linux-rhel7-skylake
[^] ahg6hq4 ^xz@5.2.5%intel@2021.4.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] q3rwuez ^ncurses@6.2%intel@2021.4.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] 2fli66n ^tar@1.28%intel@2021.4.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- 6ggc5yr ^ninja@1.10.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] qj3vvxk ^py-mpi4py@3.1.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] 5flpilw ^py-setuptools@57.4.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] ipsbwzp ^py-numpy@1.19.5%intel@2021.4.0+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] rl5ojfn ^intel-mkl@2020.4.304%intel@2021.4.0~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] taxuisw ^py-cython@0.29.24%intel@2021.4.0 arch=linux-rhel7-skylake
[^] tzmlo5v ^py-pytest@6.2.4%intel@2021.4.0 arch=linux-rhel7-skylake
[^] d5szgli ^py-attrs@21.2.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] huky5p4 ^py-iniconfig@1.1.1%intel@2021.4.0 arch=linux-rhel7-skylake
[^] wli7caj ^py-packaging@21.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] 3a3ralt ^py-pyparsing@2.4.7%intel@2021.4.0 arch=linux-rhel7-skylake
[^] rh6zvev ^py-pluggy@0.13.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] kpwupa3 ^py-setuptools-scm@6.3.2%intel@2021.4.0+toml arch=linux-rhel7-skylake
[^] apu667d ^py-tomli@1.2.1%intel@2021.4.0 arch=linux-rhel7-skylake
[^] jesa65u ^py-pip@21.1.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] skrpmyr ^py-py@1.9.0%intel@2021.4.0 arch=linux-rhel7-skylake
[^] ktthpzx ^py-toml@0.10.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] tzl5sq5 ^py-pytest-cov@2.8.1%intel@2021.4.0 arch=linux-rhel7-skylake
[^] zkd7o6d ^py-coverage@5.5%intel@2021.4.0 arch=linux-rhel7-skylake
[^] 5abs2t7 ^readline@8.1%intel@2021.4.0 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-z3q5f3xwpuibd3qbgdscqmu3efarbu42)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-3narjkwjh6i2jr3zl3g5wdjlqi52hkwh)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/pkgconf-1.8.0-xys6np
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-e7w5nezpwf572epfhbosqfzboztysout)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/fmt-8.1.1-unnpvp
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f27a7nnhppeztxijpispnaedcfo2vjxi)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-dvyfbbfdan33i7wyydyeskb3legztopm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libiconv-1.16-ka2cfu
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/xz-5.2.5-ahg6hq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-q3rwuezhc2j56kexunypjpejzketozmc)
[+] /usr (external tar-1.28-2fli66nkkv35g34exwd2xifp5w3gtgo6)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-rl5ojfngtc7sffwyz7v6ek6eceq53wed)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-5abs2t7f6dtgbejv6kjzd4gjwhdlr7qr)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/reportinglib-2.5.6-gdhqyp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/hdf5-1.10.7-ofveh3
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/spdlog-1.9.2-kxkrl2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-setuptools-57.4.0-5flpil
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libxml2-2.9.12-w3ob7m
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/libsonata-report-1.1.1-podtqd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-cython-0.29.24-taxuis
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-toml-0.10.2-ktthpz
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-coverage-5.5-zkd7o6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pip-21.1.2-jesa65
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-attrs-21.2.0-d5szgl
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pyparsing-2.4.7-3a3ral
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-iniconfig-1.1.1-huky5p
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-mpi4py-3.1.2-qj3vvx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/gettext-0.21-bjxwlf
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_intel-2021.4.0-skylake/coreneuron-develop-iw7f7w
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-numpy-1.19.5-ipsbwz
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-tomli-1.2.1-apu667
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-packaging-21.0-wli7ca
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-setuptools-scm-6.3.2-kpwupa
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pluggy-0.13.0-rh6zve
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-py-1.9.0-skrpmy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pytest-6.2.4-tzmlo5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_intel-2021.4.0-skylake/py-pytest-cov-2.8.1-tzl5sq
==> Installing neuron-develop-5rhr7pzdmuyym5ch4jvmvyd7544mkxl4
==> No binary for neuron-develop-5rhr7pzdmuyym5ch4jvmvyd7544mkxl4 found: installing from source
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-5rhr7pzdmuyym5ch4jvmvyd7544mkxl4
Fetch: 25.20s. Build: 11m 59.35s. Total: 12m 24.55s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_intel-2021.4.0-skylake/neuron-develop-5rhr7p
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563725/ccache
  Primary config: /nvme/bbpcihpcproj12/563725/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:45:11 2022
  Hits: 710 / 808 (87.87 %)
    Direct: 334 / 818 (40.83 %)
    Preprocessed: 376 / 478 (78.66 %)
  Misses: 98
    Direct: 484
    Preprocessed: 102
  Uncacheable: 133
Primary storage:
  Hits: 1345 / 1630 (82.52 %)
  Misses: 285
  Cache size (GB): 0.27 / 0.51 (52.96 %)
  Files: 13341
Uncacheable:
  Autoconf compile/link: 7
  Called for linking: 109
  Called for preprocessing: 1
  Compilation failed: 4
  No input file: 6
  Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
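
The job's exit status is derived from the junit report written by `spack install --log-format=junit`: every <failure> or <error> element counts as one failure. A standalone sketch of that count, mirroring the inline lxml call above:

    from lxml import etree

    # Sketch: scan install.xml for junit failure/error elements; a non-zero
    # total makes the job exit with that number.
    xml = etree.parse("install.xml")
    num_failures = (sum(1 for _ in xml.getroot().iter("failure"))
                    + sum(1 for _ in xml.getroot().iter("error")))
    print(num_failures)
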
section_end:1655887512:step_script section_start:1655887512:archive_cache Saving cache for successful job
Creating cache build:neuron:mod2c:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=224398 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Amod2c%3Aintel-8-non_protected
Created cache
section_end:1655887525:archive_cache section_start:1655887525:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=225686 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272579 responseStatus=201 Created token=nyWmD2tP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=225760 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272579 responseStatus=201 Created token=nyWmD2tP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=225862 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272579 responseStatus=201 Created token=nyWmD2tP
section_end:1655887527:upload_artifacts_on_success section_start:1655887527:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655887528:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886711:resolve_secrets Resolving secrets
section_end:1655886711:resolve_secrets section_start:1655886711:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor018319654, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272574
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272574_PROD_P112_CP4_C8
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563732
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272574_PROD_P112_CP4_C8 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563732 --cpus-per-task=8 --mem=76G
section_end:1655886715:prepare_executor section_start:1655886715:prepare_script Preparing environment
Running on r1i7n28 via bbpv1.epfl.ch...
section_end:1655886717:prepare_script section_start:1655886717:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886718:get_sources section_start:1655886718:restore_cache Restoring cache
Checking cache for build:neuron:mod2c:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=169933 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886722:restore_cache section_start:1655886722:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc (272561)...
Runtime platform  arch=amd64 os=linux pid=170117 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272561 responseStatus=200 OK token=vewM1syi
section_end:1655886723:download_artifacts section_start:1655886723:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272574/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272574/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272574/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272574/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272574/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272574_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/6qi7xiczl6qfolioernjyooskeukbk4m
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be mqy2bmic4al44uogrem4llrvfdch65dy
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563732/ccache
  Primary config: /nvme/bbpcihpcproj12/563732/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:32:28 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.39 / 0.51 (75.51 %)
  Files: 10862
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'mqy2bmic4al44uogrem4llrvfdch65dy'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%nvhpc+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] ^hpe-mpi@2.25.hmpt%nvhpc@22.3 arch=linux-rhel7-skylake
[^] ^libsonata-report@1.1.1%nvhpc@22.3~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^hdf5@1.10.7%nvhpc@22.3+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^pkgconf@1.8.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] ^zlib@1.2.11%nvhpc@22.3+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%nvhpc@22.3~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%nvhpc@22.3~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[^] ^python@3.9.7%nvhpc@22.3+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^reportinglib@2.5.6%nvhpc@22.3~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- mqy2bmi neuron@develop%nvhpc@22.3+binary~caliper+coreneuron+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi~profile+python~rx3d+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c sanitizers=None arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
- ucwiakr ^cmake@3.21.4%nvhpc@22.3~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] 6qi7xic ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] pjmdwuu ^hpe-mpi@2.25.hmpt%nvhpc@22.3 arch=linux-rhel7-skylake
[^] 6d2jeoj ^libsonata-report@1.1.1%nvhpc@22.3~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] tbfoeg7 ^hdf5@1.10.7%nvhpc@22.3+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 6dfyugs ^pkgconf@1.8.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] h7lotu6 ^zlib@1.2.11%nvhpc@22.3+optimize+pic+shared arch=linux-rhel7-skylake
[^] v5zxpfc ^spdlog@1.9.2%nvhpc@22.3~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jeloytj ^fmt@8.1.1%nvhpc@22.3~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[^] f62u3oh ^python@3.9.7%nvhpc@22.3+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] bdkvweu ^reportinglib@2.5.6%nvhpc@22.3~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
[^] lswrnpk ^gettext@0.21%nvhpc@22.3+bzip2+curses+git~libunistring+libxml2+tar+xz patches=6e530daaae14725d578d6cafbf7d523accc9ed29fd817bd421cf98a5f51e9e1b,bbe9f0539aa504966ac104224d25a9260faa1015ed3adda936467be9c7de4eae,fb27a3fb5e414bdc50ffebbbe2da986473df70a493caa4396226f51a67c55424 arch=linux-rhel7-skylake
[^] h3k3sur ^bzip2@1.0.8%nvhpc@22.3~debug~pic+shared arch=linux-rhel7-skylake
[^] f4hjvxq ^libiconv@1.16%nvhpc@22.3 libs=shared,static arch=linux-rhel7-skylake
[^] i3pi6vv ^libxml2@2.9.12%nvhpc@22.3~python patches=05ff238cf435825ef835c7ae39376b52dc83d8caf19e962f0766c841386a305a,10a88ad47f9797cf7cf2d7d07241f665a3b6d1f31fa026728c8c2ae93e1664e9 arch=linux-rhel7-skylake
[^] cqwvvjz ^xz@5.2.5%nvhpc@22.3~pic libs=shared,static arch=linux-rhel7-skylake
[^] jpe3but ^ncurses@6.2%nvhpc@22.3~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] zmpra4b ^tar@1.28%nvhpc@22.3 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] v2mlx7s ^py-mpi4py@3.1.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] l4z6bjq ^py-setuptools@57.4.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] kohlctm ^py-numpy@1.17.5%gcc@11.2.0+blas+lapack patches=cf407c1024b0878c4222dd352aa9dece412073bb15b138243a2893725434c7b6 arch=linux-rhel7-skylake
[^] r5cvgru ^intel-mkl@2020.4.304%gcc@11.2.0~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] dzkzhqs ^py-pytest@6.2.4%nvhpc@22.3 arch=linux-rhel7-skylake
[^] iopgs7j ^py-attrs@21.2.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] prphkb7 ^py-iniconfig@1.1.1%nvhpc@22.3 arch=linux-rhel7-skylake
[^] iwg6vas ^py-packaging@21.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] o7j2crb ^py-pyparsing@2.4.7%nvhpc@22.3 arch=linux-rhel7-skylake
[^] uecslka ^py-pluggy@0.13.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] fjgegze ^py-setuptools-scm@6.3.2%nvhpc@22.3+toml arch=linux-rhel7-skylake
[^] xsvykjn ^py-tomli@1.2.1%nvhpc@22.3 arch=linux-rhel7-skylake
[^] w6kat5w ^py-pip@21.1.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] qhehf4e ^py-py@1.9.0%nvhpc@22.3 arch=linux-rhel7-skylake
[^] yg6s277 ^py-toml@0.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] l6udtg7 ^py-pytest-cov@2.8.1%nvhpc@22.3 arch=linux-rhel7-skylake
[^] lfjn2o4 ^py-coverage@5.5%nvhpc@22.3 arch=linux-rhel7-skylake
[^] 3333cwm ^readline@8.1%nvhpc@22.3 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-pjmdwuu36ioiwqyrg6lrj7nrq3waqjj2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/pkgconf-1.8.0-6dfyug
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-h7lotu6jszcvumwucry2y5jnhvxw5x2d)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/fmt-8.1.1-jeloyt
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-f62u3ohchswt5q2b63chawohzqrl6wvy)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-h3k3surwc24macvlxbcme35uztk6dqiw)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libiconv-1.16-f4hjvx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/xz-5.2.5-cqwvvj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-jpe3butfegqspi6dnm7cyikpjt5nphfo)
[+] /usr (external tar-1.28-zmpra4bla6bcjnayn6eze4rnd5u4txfm)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-3333cwmykvahrsdydir4qeyasic3liq6)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/reportinglib-2.5.6-bdkvwe
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/hdf5-1.10.7-tbfoeg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/spdlog-1.9.2-v5zxpf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-setuptools-57.4.0-l4z6bj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libxml2-2.9.12-i3pi6v
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/libsonata-report-1.1.1-6d2jeo
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.17.5-kohlct
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-toml-0.10.2-yg6s27
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-mpi4py-3.1.2-v2mlx7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-iniconfig-1.1.1-prphkb
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pyparsing-2.4.7-o7j2cr
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-coverage-5.5-lfjn2o
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pip-21.1.2-w6kat5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-attrs-21.2.0-iopgs7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/gettext-0.21-lswrnp
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/coreneuron-develop-6qi7xi
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-packaging-21.0-iwg6va
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-tomli-1.2.1-xsvykj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-setuptools-scm-6.3.2-fjgegz
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-py-1.9.0-qhehf4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pluggy-0.13.0-uecslk
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pytest-6.2.4-dzkzhq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_nvhpc-22.3-skylake/py-pytest-cov-2.8.1-l6udtg
==> Installing neuron-develop-mqy2bmic4al44uogrem4llrvfdch65dy
==> No binary for neuron-develop-mqy2bmic4al44uogrem4llrvfdch65dy found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-mqy2bmic4al44uogrem4llrvfdch65dy
Fetch: 13.63s. Build: 14m 12.38s. Total: 14m 26.01s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/neuron-develop-mqy2bm
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563732/ccache
  Primary config: /nvme/bbpcihpcproj12/563732/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:47:45 2022
  Hits: 710 / 807 (87.98 %)
    Direct: 334 / 817 (40.88 %)
    Preprocessed: 376 / 477 (78.83 %)
  Misses: 97
    Direct: 483
    Preprocessed: 101
  Uncacheable: 125
Primary storage:
  Hits: 1345 / 1628 (82.62 %)
  Misses: 283
  Cache size (GB): 0.39 / 0.51 (75.74 %)
  Files: 11052
Uncacheable:
  Autoconf compile/link: 7
  Called for linking: 101
  Called for preprocessing: 1
  Compilation failed: 4
  No input file: 6
  Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1655887666:step_script section_start:1655887666:archive_cache Saving cache for successful job
Creating cache build:neuron:mod2c:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=198795 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Amod2c%3Anvhpc%3Aacc-8-non_protected
Created cache
section_end:1655887683:archive_cache section_start:1655887683:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=198886 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272574 responseStatus=201 Created token=SFnJvts6
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=198927 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272574 responseStatus=201 Created token=SFnJvts6
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=198970 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272574 responseStatus=201 Created token=SFnJvts6
section_end:1655887685:upload_artifacts_on_success section_start:1655887685:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655887685:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886948:resolve_secrets Resolving secrets
section_end:1655886948:resolve_secrets section_start:1655886948:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor323969639, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272581
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272581_PROD_P112_CP9_C6
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563749
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272581_PROD_P112_CP9_C6 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563749 --cpus-per-task=8 --mem=76G
section_end:1655886956:prepare_executor section_start:1655886956:prepare_script Preparing environment
Running on r1i6n30 via bbpv1.epfl.ch...
section_end:1655886959:prepare_script section_start:1655886959:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886960:get_sources section_start:1655886960:restore_cache Restoring cache
Checking cache for build:neuron:nmodl:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=261621 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886965:restore_cache section_start:1655886965:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:intel (272572)...
Runtime platform  arch=amd64 os=linux pid=262200 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272572 responseStatus=200 OK token=mjvHJvjk
section_end:1655886966:download_artifacts section_start:1655886966:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272581/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272581/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272581/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272581/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272581/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272581_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
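The commands above give the job a throwaway git configuration: XDG_CONFIG_HOME is pointed at a per-job directory and a url.insteadOf rule rewrites git@bbpgitlab.epfl.ch SSH remotes to HTTPS remotes authenticated with CI_JOB_TOKEN, so Spack can clone internal repositories without SSH keys. A rough self-contained Python sketch of the same setup (paths and variable names follow the job script; the helper itself is illustrative):

import os
from pathlib import Path

# Per-job config home, e.g. .../P61704/J272581_local_config
config_home = Path(os.environ["CI_BUILDS_DIR"]) / f"J{os.environ['CI_JOB_ID']}_local_config"
git_dir = config_home / "git"
git_dir.mkdir(parents=True, exist_ok=True)

# Rewrite SSH remotes to token-authenticated HTTPS for this job only
token = os.environ["CI_JOB_TOKEN"]
(git_dir / "config").write_text(
    f'[url "https://gitlab-ci-token:{token}@bbpgitlab.epfl.ch/"]\n'
    "\tinsteadOf = git@bbpgitlab.epfl.ch:\n"
)

os.environ["XDG_CONFIG_HOME"] = str(config_home)  # git reads its config from here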
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/rh5oq7ygplbbyryh6avavlwff7ieo23w
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be jhl3fvxgrrds5w2v5yyiqgn3n4pxwftn
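The hash printed above is obtained by concretizing the spec with spack spec --json and reading the hash of the first node; the job uses it to predict the stage/install paths before spack install runs and to let downstream jobs depend on exactly this build via ^/<hash>. A standalone sketch of that extraction (the example spec string is illustrative):

import json
import subprocess

spec = "neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d"  # illustrative spec
out = subprocess.run(
    ["spack", "spec", "--json"] + spec.split(),
    check=True, capture_output=True, text=True,
).stdout

# The root package is the first node of the concretized spec
installed_hash = json.loads(out)["spec"]["nodes"][0]["hash"]
print(installed_hash)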
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563749/ccache
  Primary config: /nvme/bbpcihpcproj12/563749/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:36:31 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.34 / 0.51 (66.96 %)
  Files: 15837
$ fi
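The block above (guarded by SPACK_USE_CCACHE) restores a ccache directory from the ccache.tar kept in the GitLab cache, points CCACHE_DIR at ${TMPDIR}/ccache (an /nvme path on the allocated node, per the stats above), and caps it at CCACHE_MAXSIZE=512M; after the build the directory is cleaned and re-packed by the tar -cf step further down. A sketch of that restore/build/save cycle (helper structure is illustrative):

import os
import subprocess
import tarfile
from pathlib import Path

ccache_dir = Path(os.environ["TMPDIR"]) / "ccache"        # node-local cache location
ccache_tar = Path(os.environ["CI_PROJECT_DIR"]) / "ccache.tar"

ccache_dir.mkdir(parents=True, exist_ok=True)
if ccache_tar.exists():                                    # restore the previous cache
    with tarfile.open(ccache_tar) as tar:
        tar.extractall(ccache_dir)
os.environ["CCACHE_DIR"] = str(ccache_dir)
os.environ["CCACHE_MAXSIZE"] = "512M"
subprocess.run(["ccache", "--zero-stats"], check=True)

# ... the compilation happens here (Spack is configured with ccache: True) ...

subprocess.run(["ccache", "--cleanup"], check=True)        # enforce CCACHE_MAXSIZE
with tarfile.open(ccache_tar, "w") as tar:                 # save for the next job
    tar.add(ccache_dir, arcname=".")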
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'jhl3fvxgrrds5w2v5yyiqgn3n4pxwftn'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%intel+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%intel@2021.4.0~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] ^boost@1.78.0%intel@2021.4.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- jhl3fvx neuron@develop%intel@2021.4.0+binary~caliper+coreneuron+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi~profile+python~rx3d+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c sanitizers=None arch=linux-rhel7-skylake
- dwdch6b ^bison@3.8.2%intel@2021.4.0 arch=linux-rhel7-skylake
- z3q5f3x ^cmake@3.21.4%intel@2021.4.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] rh5oq7y ^coreneuron@develop%intel@2021.4.0~caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
[^] dl7pfht ^boost@1.78.0%intel@2021.4.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] lbajih4 ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[+] 4dzxcps ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- s4ueg72 ^flex@2.6.3%intel@2021.4.0+lex~nls arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- 6ggc5yr ^ninja@1.10.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] n6q4vfz ^py-mpi4py@3.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] gjm7kkq ^py-numpy@1.19.5%gcc@11.2.0+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] r5cvgru ^intel-mkl@2020.4.304%gcc@11.2.0~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] h2fsi6i ^py-cython@0.29.24%gcc@11.2.0 arch=linux-rhel7-skylake
[^] q5n7ofc ^py-pytest-cov@2.8.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] fudvy5v ^py-coverage@5.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5abs2t7 ^readline@8.1%intel@2021.4.0 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-z3q5f3xwpuibd3qbgdscqmu3efarbu42)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-dl7pfhtsvyaq3273gr2g3l5vr37eeydc)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-5abs2t7f6dtgbejv6kjzd4gjwhdlr7qr)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1.1-lbajih
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpi4py-3.1.2-n6q4vf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-cython-0.29.24-h2fsi6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-coverage-5.5-fudvy5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.19.5-gjm7kk
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-cov-2.8.1-q5n7of
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_intel-2021.4.0-skylake/coreneuron-develop-rh5oq7
==> Installing neuron-develop-jhl3fvxgrrds5w2v5yyiqgn3n4pxwftn
==> No binary for neuron-develop-jhl3fvxgrrds5w2v5yyiqgn3n4pxwftn found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-jhl3fvxgrrds5w2v5yyiqgn3n4pxwftn
Fetch: 27.91s. Build: 22m 14.09s. Total: 22m 42.00s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_intel-2021.4.0-skylake/neuron-develop-jhl3fv
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563749/ccache
  Primary config: /nvme/bbpcihpcproj12/563749/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 11:00:13 2022
  Hits: 695 / 793 (87.64 %)
    Direct: 168 / 803 (20.92 %)
    Preprocessed: 527 / 629 (83.78 %)
  Misses: 98
    Direct: 635
    Preprocessed: 102
  Uncacheable: 129
Primary storage:
  Hits: 1164 / 1600 (72.75 %)
  Misses: 436
  Cache size (GB): 0.35 / 0.51 (67.56 %)
  Files: 16174
Uncacheable:
  Autoconf compile/link: 7
  Called for linking: 105
  Called for preprocessing: 1
  Compilation failed: 4
  No input file: 6
  Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
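The echo commands above assemble spack_build_info.env, the dotenv artifact consumed by downstream test jobs: it records the build and source directories, the full spec, the installed hash, and a ready-made ^/<hash> dependency string. A sketch of writing the same file from Python (values are placeholders standing in for the shell variables of the surrounding script):

import os
from pathlib import Path

installed_hash = "jhl3fvxgrrds5w2v5yyiqgn3n4pxwftn"        # from the job above
build_info = {
    "SPACK_BUILD_DIR": "<stage dir>/spack-build-" + installed_hash[:7],
    "SPACK_FULL_SPEC": "neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d ...",
    "SPACK_SOURCE_DIR": "<stage dir>/spack-src",
    "SPACK_INSTALLED_HASH": installed_hash,
    # Lets the downstream job depend on exactly this installation
    "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB": "^/" + installed_hash,
}
dotenv = Path(os.environ.get("CI_PROJECT_DIR", ".")) / "spack_build_info.env"
dotenv.write_text("".join(f"{key}={value}\n" for key, value in build_info.items()))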
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1655888417:step_script section_start:1655888417:archive_cache Saving cache for successful job
Creating cache build:neuron:nmodl:intel-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=299485 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Anmodl%3Aintel-8-non_protected
Created cache
section_end:1655888432:archive_cache section_start:1655888432:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=299633 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272581 responseStatus=201 Created token=ZLXYD-d_
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=299675 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272581 responseStatus=201 Created token=ZLXYD-d_
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=299717 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272581 responseStatus=201 Created token=ZLXYD-d_
section_end:1655888434:upload_artifacts_on_success section_start:1655888434:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655888435:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655887120:resolve_secrets Resolving secrets
section_end:1655887120:resolve_secrets section_start:1655887120:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor130627239, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272577
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272577_PROD_P112_CP12_C4
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563757
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272577_PROD_P112_CP12_C4 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563757 --cpus-per-task=8 --mem=76G
section_end:1655887123:prepare_executor section_start:1655887123:prepare_script Preparing environment
Running on r1i6n30 via bbpv1.epfl.ch...
section_end:1655887126:prepare_script section_start:1655887126:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655887126:get_sources section_start:1655887126:restore_cache Restoring cache
Checking cache for build:neuron:nmodl:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=270784 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655887131:restore_cache section_start:1655887131:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:acc (272567)...
Runtime platform  arch=amd64 os=linux pid=271452 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272567 responseStatus=200 OK token=cPTxqcyz
section_end:1655887132:download_artifacts section_start:1655887132:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272577/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272577/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272577/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272577/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272577/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272577_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/6q3vzbkyvzxputeeg6himnv7tcfu7ge7
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be f6nikkkl3jokiiz56xke2nvdgfsrjose
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/563757/ccache
  Primary config: /nvme/bbpcihpcproj12/563757/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Jun 22 10:39:25 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.36 / 0.51 (70.94 %)
  Files: 14131
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'f6nikkkl3jokiiz56xke2nvdgfsrjose'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%nvhpc+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared+sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- f6nikkk neuron@develop%nvhpc@22.3+binary~caliper+coreneuron+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi~profile+python~rx3d+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c sanitizers=None arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
- ucwiakr ^cmake@3.21.4%nvhpc@22.3~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] 6q3vzbk ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl~openmp~profile+report+shared+sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] lbajih4 ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[+] 4dzxcps ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] n6q4vfz ^py-mpi4py@3.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] gjm7kkq ^py-numpy@1.19.5%gcc@11.2.0+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] r5cvgru ^intel-mkl@2020.4.304%gcc@11.2.0~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] h2fsi6i ^py-cython@0.29.24%gcc@11.2.0 arch=linux-rhel7-skylake
[^] q5n7ofc ^py-pytest-cov@2.8.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] fudvy5v ^py-coverage@5.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 3333cwm ^readline@8.1%nvhpc@22.3 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-3333cwmykvahrsdydir4qeyasic3liq6)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1.1-lbajih
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-cython-0.29.24-h2fsi6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpi4py-3.1.2-n6q4vf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-coverage-5.5-fudvy5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.19.5-gjm7kk
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-cov-2.8.1-q5n7of
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/coreneuron-develop-6q3vzb
==> Installing neuron-develop-f6nikkkl3jokiiz56xke2nvdgfsrjose
==> No binary for neuron-develop-f6nikkkl3jokiiz56xke2nvdgfsrjose found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-f6nikkkl3jokiiz56xke2nvdgfsrjose
Fetch: 25.45s. Build: 37m 0.75s. Total: 37m 26.20s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/neuron-develop-f6nikk
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/563757/ccache
Primary config: /nvme/bbpcihpcproj12/563757/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Jun 22 11:18:30 2022
Hits: 695 / 792 (87.75 %)
Direct: 165 / 802 (20.57 %)
Preprocessed: 530 / 631 (83.99 %)
Misses: 97
Direct: 637
Preprocessed: 101
Uncacheable: 122
Primary storage:
Hits: 1161 / 1598 (72.65 %)
Misses: 437
Cache size (GB): 0.37 / 0.51 (71.69 %)
Files: 14475
Uncacheable:
Autoconf compile/link: 7
Called for linking: 98
Called for preprocessing: 1
Compilation failed: 4
No input file: 6
Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
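The num_failures one-liner above counts the failure and error elements in the JUnit report that spack install --log-format=junit wrote, and the job exits non-zero if any were found. Expanded for readability, a roughly equivalent stand-alone sketch (using the standard-library ElementTree instead of lxml; the report path is a placeholder) looks like this:

# count_junit_failures.py - minimal sketch of the failure check performed above.
import sys
import xml.etree.ElementTree as ET

def count_failures(junit_path: str) -> int:
    root = ET.parse(junit_path).getroot()
    # A build counts as failed if the report contains any <failure> or <error> element.
    return sum(1 for _ in root.iter("failure")) + sum(1 for _ in root.iter("error"))

if __name__ == "__main__":
    n = count_failures(sys.argv[1] if len(sys.argv) > 1 else "install.xml")
    print(n)
    sys.exit(n if n > 0 else 0)  # mirrors the `exit ${num_failures}` guard in the job script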
section_end:1655889511:step_script section_start:1655889511:archive_cache Saving cache for successful job
Creating cache build:neuron:nmodl:nvhpc:acc-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=18178 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Anmodl%3Anvhpc%3Aacc-8-non_protected
Created cache
section_end:1655889527:archive_cache section_start:1655889527:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=18283 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272577 responseStatus=201 Created token=BqwBC8nC
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=18330 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272577 responseStatus=201 Created token=BqwBC8nC
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=18380 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272577 responseStatus=201 Created token=BqwBC8nC
section_end:1655889529:upload_artifacts_on_success section_start:1655889529:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655889530:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655887116:resolve_secrets Resolving secrets
section_end:1655887116:resolve_secrets section_start:1655887116:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor153598121, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272575
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272575_PROD_P112_CP2_C2
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563755
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272575_PROD_P112_CP2_C2 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563755 --cpus-per-task=8 --mem=76G
section_end:1655887118:prepare_executor section_start:1655887118:prepare_script Preparing environment
Running on r1i6n30 via bbpv1.epfl.ch...
section_end:1655887120:prepare_script section_start:1655887120:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655887121:get_sources section_start:1655887121:restore_cache Restoring cache
Checking cache for build:neuron:nmodl:nvhpc:omp-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=270003 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655887126:restore_cache section_start:1655887126:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:omp (272565)...
Runtime platform  arch=amd64 os=linux pid=270622 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272565 responseStatus=200 OK token=W7PLLsmx
section_end:1655887126:download_artifacts section_start:1655887126:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272575/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272575/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272575/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272575/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272575/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272575_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
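The commands above point XDG_CONFIG_HOME at a per-job directory and drop a git config there, so any git@bbpgitlab.epfl.ch: URL fetched during the build is transparently rewritten to HTTPS authenticated with the masked CI_JOB_TOKEN. A minimal sketch that produces the same kind of rewrite rule from Python (paths and token value are placeholders, not the real job values) might look like:

# git_token_config.py - sketch: write an insteadOf rewrite rule as the job script does.
from pathlib import Path

def write_git_rewrite(xdg_config_home: str, token: str, host: str = "bbpgitlab.epfl.ch") -> Path:
    cfg_dir = Path(xdg_config_home) / "git"
    cfg_dir.mkdir(parents=True, exist_ok=True)
    cfg = cfg_dir / "config"
    # git reads $XDG_CONFIG_HOME/git/config; insteadOf rewrites SSH remotes to HTTPS + token.
    cfg.write_text(
        f'[url "https://gitlab-ci-token:{token}@{host}/"]\n'
        f"    insteadOf = git@{host}:\n"
    )
    return cfg

if __name__ == "__main__":
    path = write_git_rewrite("/tmp/J_local_config", token="dummy-token")
    print(path.read_text())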
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/rjmenmrpgml6pswauzum5llfbry6ngnh
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 2supl4esjx3ghhb3azj32rf6llwrd6wk
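The two steps above concretize the full spec to JSON and extract the hash of the root node; that hash names the stage directory below and is written out as the ^/<hash> dependency for the next job. A stand-alone sketch of the same extraction (assuming spack is on PATH, and using the same JSON layout the job script relies on, which may differ between Spack versions) could be:

# spec_hash.py - sketch: reproduce the SPACK_INSTALLED_HASH extraction above.
import json
import subprocess

def concretized_root_hash(spec: str) -> str:
    # `spack spec --json` prints the concretized DAG; the root package is the first node.
    out = subprocess.run(["spack", "spec", "--json"] + spec.split(),
                         check=True, capture_output=True, text=True).stdout
    return json.loads(out)["spec"]["nodes"][0]["hash"]

if __name__ == "__main__":
    example_spec = "neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d"  # trimmed example
    print(concretized_root_hash(example_spec))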
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/563755/ccache
Primary config: /nvme/bbpcihpcproj12/563755/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Jun 22 10:39:14 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.39 / 0.51 (76.03 %)
Files: 13350
$ fi
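This block restores the ccache directory from the ccache.tar kept with the project cache, and after the build the same directory is packed up again (see the tar -cf step further down), so object files are reused across pipeline runs. A minimal sketch of that round trip (paths are placeholders; the real job uses ${CI_PROJECT_DIR}/ccache.tar and ${TMPDIR}/ccache) might be:

# ccache_roundtrip.py - sketch of the ccache.tar restore/save pattern used in this job.
import os
import tarfile

def restore(tarball: str, ccache_dir: str) -> None:
    os.makedirs(ccache_dir, exist_ok=True)
    if os.path.isfile(tarball):
        with tarfile.open(tarball) as tf:
            tf.extractall(ccache_dir)  # like `tar -C "$CCACHE_DIR" -xf ccache.tar`

def save(tarball: str, ccache_dir: str) -> None:
    with tarfile.open(tarball, "w") as tf:
        tf.add(ccache_dir, arcname=".")  # like `tar -C "$CCACHE_DIR" -cf ccache.tar .`

if __name__ == "__main__":
    restore("/tmp/ccache.tar", "/tmp/ccache")
    save("/tmp/ccache.tar", "/tmp/ccache")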
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '2supl4esjx3ghhb3azj32rf6llwrd6wk'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- neuron%nvhpc+coreneuron+debug~legacy-unit~rx3d+tests model_tests=channel-benchmark,olfactory,tqperf-heavy
[+] ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[+] ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
Concretized
--------------------------------
- 2supl4e neuron@develop%nvhpc@22.3+binary~caliper+coreneuron+debug~interviews~ipo+legacy-fr~legacy-unit+memacs+mod-compatibility+mpi~profile+python~rx3d+tests build_type=RelWithDebInfo model_tests=channel-benchmark,olfactory,tqperf-heavy patches=708cb04826b394a858069d93e8c08e1e81e914c23e1ef3da0486e8233834ff6c sanitizers=None arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
- ucwiakr ^cmake@3.21.4%nvhpc@22.3~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[+] rjmenmr ^coreneuron@develop%nvhpc@22.3~caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi+nmodl+openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] vn2t5vi ^boost@1.78.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] gi5x2dn ^cuda@11.6.1%nvhpc@22.3~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] lbajih4 ^libsonata-report@1.1.1%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
[+] 4dzxcps ^nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] n6q4vfz ^py-mpi4py@3.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] gjm7kkq ^py-numpy@1.19.5%gcc@11.2.0+blas+lapack patches=8a9d5d1b3f145c043b8b04869e7d46c6ff95c3f486d84f69693017c7e6190c7d arch=linux-rhel7-skylake
[^] r5cvgru ^intel-mkl@2020.4.304%gcc@11.2.0~ilp64+shared threads=none arch=linux-rhel7-skylake
[^] h2fsi6i ^py-cython@0.29.24%gcc@11.2.0 arch=linux-rhel7-skylake
[^] q5n7ofc ^py-pytest-cov@2.8.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] fudvy5v ^py-coverage@5.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 3333cwm ^readline@8.1%nvhpc@22.3 arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-ucwiakreeghdgbo22kbqqhgnnlwxqtnn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.78.0-hiorc7 (external boost-1.78.0-vn2t5viljvtb5wyfhgmx57txyeiznxo4)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-gi5x2dn47i7nv76flbq7aatllxpaqmjp)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/intel-mkl-2020.4.304-rzr3hj (external intel-mkl-2020.4.304-r5cvgrurdnt6y267arsmmmgqi75ouxd2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/readline-8.1-we2frg (external readline-8.1-3333cwmykvahrsdydir4qeyasic3liq6)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libsonata-report-1.1.1-lbajih
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpi4py-3.1.2-n6q4vf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-cython-0.29.24-h2fsi6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-coverage-5.5-fudvy5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-numpy-1.19.5-gjm7kk
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-cov-2.8.1-q5n7of
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/coreneuron-develop-rjmenm
==> Installing neuron-develop-2supl4esjx3ghhb3azj32rf6llwrd6wk
==> No binary for neuron-develop-2supl4esjx3ghhb3azj32rf6llwrd6wk found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> Applied patch /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/bluebrain/repo-patches/packages/neuron/revert_Import3d_numerical_format.master.patch
==> neuron: Executing phase: 'cmake'
==> neuron: Executing phase: 'build'
==> neuron: Executing phase: 'install'
==> neuron: Successfully installed neuron-develop-2supl4esjx3ghhb3azj32rf6llwrd6wk
Fetch: 26.78s. Build: 37m 7.94s. Total: 37m 34.71s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_nvhpc-22.3-skylake/neuron-develop-2supl4
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/563755/ccache
Primary config: /nvme/bbpcihpcproj12/563755/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Jun 22 11:18:30 2022
Hits: 695 / 792 (87.75 %)
Direct: 167 / 802 (20.82 %)
Preprocessed: 528 / 629 (83.94 %)
Misses: 97
Direct: 635
Preprocessed: 101
Uncacheable: 122
Primary storage:
Hits: 1163 / 1598 (72.78 %)
Misses: 435
Cache size (GB): 0.39 / 0.51 (76.74 %)
Files: 13692
Uncacheable:
Autoconf compile/link: 7
Called for linking: 98
Called for preprocessing: 1
Compilation failed: 4
No input file: 6
Preprocessing failed: 6
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
section_end:1655889511:step_script section_start:1655889511:archive_cache Saving cache for successful job
Creating cache build:neuron:nmodl:nvhpc:omp-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=18130 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Aneuron%3Anmodl%3Anvhpc%3Aomp-8-non_protected
Created cache
section_end:1655889528:archive_cache section_start:1655889528:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=18436 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272575 responseStatus=201 Created token=pexsvbLh
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=18483 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272575 responseStatus=201 Created token=pexsvbLh
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=18528 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272575 responseStatus=201 Created token=pexsvbLh
section_end:1655889530:upload_artifacts_on_success section_start:1655889530:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655889531:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886529:resolve_secrets Resolving secrets
section_end:1655886529:resolve_secrets section_start:1655886529:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor531322231, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272559
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
A slurm job will be created with name GL_J272559_PROD_P112_CP2_C2
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563713
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J272559_PROD_P112_CP2_C2 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=2 --jobid=563713 --cpus-per-task=8 --mem=76G
section_end:1655886533:prepare_executor section_start:1655886533:prepare_script Preparing environment
Running on r2i5n23 via bbpv1.epfl.ch...
section_end:1655886536:prepare_script section_start:1655886536:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886536:get_sources section_start:1655886536:restore_cache Restoring cache
Checking cache for build:nmodl-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=200449 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1655886541:restore_cache section_start:1655886541:download_artifacts Downloading artifacts
Downloading artifacts for spack_setup (272556)...
Runtime platform  arch=amd64 os=linux pid=201590 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272556 responseStatus=200 OK token=uHoCNEmA
section_end:1655886542:download_artifacts section_start:1655886542:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272559/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272559/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272559/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272559/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272559/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272559_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
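The two commands above make git transparently rewrite SSH-style remotes (git@bbpgitlab.epfl.ch:...) into token-authenticated HTTPS URLs, so Spack can clone private repositories with the job token. A minimal Python sketch of the same configuration step; the paths and token handling mirror the shell commands above, nothing beyond that is implied:

import os
from pathlib import Path

# git reads ${XDG_CONFIG_HOME}/git/config as part of its global configuration, so a
# url.<base>.insteadOf mapping written there affects every clone made by this job.
token = os.environ.get("CI_JOB_TOKEN", "<job-token>")
cfg_dir = Path(os.environ.get("XDG_CONFIG_HOME", ".")) / "git"
cfg_dir.mkdir(parents=True, exist_ok=True)
(cfg_dir / "config").write_text(
    f'[url "https://gitlab-ci-token:{token}@bbpgitlab.epfl.ch/"]\n'
    "\tinsteadOf = git@bbpgitlab.epfl.ch:\n"
)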
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install nmodl%gcc ~legacy-unit
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 4dzxcpsuksxgtuoesheax4sf76wrhkqb
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
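The hash determination above boils down to concretizing the spec to JSON and reading the hash of the first node, which then names the stage and build directories. A minimal sketch of that extraction, assuming the output of `spack spec --json` has been saved to spec.json (a hypothetical file name):

import json

with open("spec.json") as f:   # output of: spack spec --json nmodl%gcc ~legacy-unit
    spec = json.load(f)

# The concretized root package is the first entry in "nodes"; its hash is what the
# job uses to locate the Spack stage and build directories.
installed_hash = spec["spec"]["nodes"][0]["hash"]
stage_dir = f"spack-build/spack-stage-nmodl-develop-{installed_hash}"
build_dir = f"{stage_dir}/spack-build-{installed_hash[:7]}"
print(installed_hash)
print(build_dir)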
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563713/ccache
Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563713/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Jun 22 10:29:24 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.43 / 0.51 (84.03 %)
Files: 490
$ fi
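Because CCACHE_DIR points at node-local storage (${TMPDIR}/ccache), the compiler cache only survives between jobs through the ccache.tar file kept in the project directory, as the tar commands above and further below show. A rough Python equivalent of the restore step, reading the same environment variables (a sketch only, not the CI script itself):

import os
import tarfile

ccache_dir = os.path.join(os.environ.get("TMPDIR", "/tmp"), "ccache")
ccache_tar = os.path.join(os.environ.get("CI_PROJECT_DIR", "."), "ccache.tar")

os.makedirs(ccache_dir, exist_ok=True)
if os.path.isfile(ccache_tar):
    # equivalent of: tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
    with tarfile.open(ccache_tar) as tar:
        tar.extractall(ccache_dir)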
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '4dzxcpsuksxgtuoesheax4sf76wrhkqb'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- nmodl%gcc~legacy-unit
Concretized
--------------------------------
- 4dzxcps nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
- cppb7al ^bison@3.8.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 4bt76dp ^flex@2.6.3%gcc@11.2.0+lex~nls arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- utrxbc3 ^ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ciusbmc ^py-setuptools-scm@6.3.2%gcc@11.2.0+toml arch=linux-rhel7-skylake
[^] hmcew6w ^py-tomli@1.2.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] sxd7srs ^py-pip@21.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-cppb7alftvhxbedsuxqv72z2thjuoizw)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-4bt76dpxbix6ep4qtz3mv5i2iddilv53)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-utrxbc3aohnru5eynalc3hyv4ca4jqte)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pip-21.1.2-sxd7sr
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-tomli-1.2.1-hmcew6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-scm-6.3.2-ciusbm
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
==> Installing nmodl-develop-4dzxcpsuksxgtuoesheax4sf76wrhkqb
==> No binary for nmodl-develop-4dzxcpsuksxgtuoesheax4sf76wrhkqb found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704, but it is owned by 0
==> No patches needed for nmodl
==> nmodl: Executing phase: 'cmake'
==> nmodl: Executing phase: 'build'
==> nmodl: Executing phase: 'install'
==> nmodl: Successfully installed nmodl-develop-4dzxcpsuksxgtuoesheax4sf76wrhkqb
Fetch: 57.44s. Build: 1m 32.70s. Total: 2m 30.13s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563713/ccache
Primary config: /gpfs/bbp.cscs.ch/ssd/slurmTmpFS/bbpcihpcproj12/563713/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Jun 22 10:32:50 2022
Hits: 129 / 131 (98.47 %)
Direct: 28 / 131 (21.37 %)
Preprocessed: 101 / 103 (98.06 %)
Misses: 2
Direct: 103
Preprocessed: 2
Uncacheable: 26
Primary storage:
Hits: 157 / 262 (59.92 %)
Misses: 105
Cache size (GB): 0.43 / 0.51 (84.05 %)
Files: 494
Uncacheable:
Called for linking: 25
No input file: 1
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
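The failure count above comes from the JUnit report that `spack install --log-format=junit` wrote to install.xml: every <failure> and <error> element counts as one failed package, and the job exits with that count. The same count with the standard library instead of lxml (a sketch; the pipeline's own one-liner is shown above):

import sys
import xml.etree.ElementTree as ET

root = ET.parse("install.xml").getroot()   # JUnit report produced by spack install
num_failures = sum(1 for _ in root.iter("failure")) + sum(1 for _ in root.iter("error"))
sys.exit(num_failures)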
section_end:1655886771:step_script section_start:1655886771:archive_cache Saving cache for successful job
Creating cache build:nmodl-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=210094 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Anmodl-8-non_protected
Created cache
section_end:1655886790:archive_cache section_start:1655886790:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=210182 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272559 responseStatus=201 Created token=TUUbHxzw
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=210219 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=272559 responseStatus=201 Created token=TUUbHxzw
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=210262 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272559 responseStatus=201 Created token=TUUbHxzw
section_end:1655886791:upload_artifacts_on_success section_start:1655886791:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886792:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886445:resolve_secrets Resolving secrets
section_end:1655886445:resolve_secrets section_start:1655886445:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor556265222, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272556
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=6, optional qos flag
A slurm job will be created with name GL_J272556_PROD_P112_CP1_C7
Job parameters: memory=30750M, cpus_per_task=6, duration=1:00:00, constraint=cpu ntasks=1 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563710
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=1 --cpus-per-task=6 --mem=30750M --job-name=GL_J272556_PROD_P112_CP1_C7 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=1 --jobid=563710 --cpus-per-task=6 --mem=30750M
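As in the previous job, the custom executor first allocates a Slurm job that merely sleeps, then attaches each stage of the CI script to that allocation with srun and the job ID. A toy sketch of this allocate-then-attach pattern, with flags copied from the log; it illustrates the idea only and is not the BB5 runner driver:

import subprocess

def allocate(build_dir: str) -> str:
    """Submit a placeholder job so later steps can srun inside its allocation."""
    out = subprocess.run(
        ["sbatch", "--parsable", "-p", "prod", "--ntasks=1", "--cpus-per-task=6",
         "--mem=30750M", "--time=1:00:00", "--no-requeue", "-D", build_dir,
         "--wrap=sleep infinity"],
        check=True, capture_output=True, text=True,
    )
    return out.stdout.strip().split(";")[0]   # --parsable prints "<jobid>[;cluster]"

def run_step(job_id: str, build_dir: str, command: list) -> None:
    """Run one CI step inside the placeholder allocation."""
    subprocess.run(
        ["srun", "--mpi=none", f"--jobid={job_id}", f"--chdir={build_dir}",
         "--ntasks=1", "--cpus-per-task=6", "--mem=30750M", *command],
        check=True,
    )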
section_end:1655886449:prepare_executor section_start:1655886449:prepare_script Preparing environment
Running on r2i5n23 via bbpv1.epfl.ch...
section_end:1655886450:prepare_script section_start:1655886450:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886451:get_sources section_start:1655886451:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ if [[ -n "${SPACK_ENV_FILE_URL}" && "${PARSE_GITHUB_PR_DESCRIPTIONS,,}" == "true" ]]; then
$ cat > parse_description.py << END_SCRIPT # collapsed multi-line command
$ cat parse_description.py
import os
import re
import requests
pr_info = requests.get("https://api.github.com/repos/{}/pulls/{}".format(
os.environ['CI_EXTERNAL_PULL_REQUEST_TARGET_REPOSITORY'],
os.environ['CI_EXTERNAL_PULL_REQUEST_IID']),
headers={'Accept': 'application/vnd.github.v3+json'})
pr_body = pr_info.json()["body"]
# match something like NEURON_BRANCH=foo/bar
pat = re.compile('^([A-Z0-9_]+)_([A-Z]+)=([A-Z0-9\-\_\/\+]+)$', re.IGNORECASE)
def parse_term(m):
ref_type = m.group(2).lower()
if ref_type not in {'branch', 'tag', 'ref'}: return
print(m.group(1).upper() + '_' + ref_type.upper() + '=' + m.group(3))
if pr_body is not None:
for pr_body_line in pr_body.splitlines():
if not pr_body_line.startswith('CI_BRANCHES:'): continue
for config_term in pr_body_line[12:].split(','):
pat.sub(parse_term, config_term)
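parse_description.py, listed above, scans the pull-request body for a line starting with CI_BRANCHES: and turns each NAME_BRANCH/TAG/REF=value term into an environment assignment; terms whose suffix is not branch, tag or ref are dropped. A self-contained illustration with a made-up PR body line (the regular expression is copied from the script):

import re

pat = re.compile(r'^([A-Z0-9_]+)_([A-Z]+)=([A-Z0-9\-\_\/\+]+)$', re.IGNORECASE)
sample = "CI_BRANCHES:NEURON_BRANCH=master,NMODL_BRANCH=llvm,SOME_THING=dropped"

for term in sample[len("CI_BRANCHES:"):].split(','):
    m = pat.match(term)
    if m and m.group(2).lower() in {'branch', 'tag', 'ref'}:
        print(m.group(1).upper() + '_' + m.group(2).upper() + '=' + m.group(3))

# Prints:
#   NEURON_BRANCH=master
#   NMODL_BRANCH=llvm
# SOME_THING=dropped is skipped because THING is not branch/tag/ref.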
$ (module load unstable python-dev; python parse_description.py) > input_variables.env
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ else
$ cat input_variables.env
$ for var_to_unset in $(sed 's/^\(.*\?\)_\(BRANCH\|COMMIT\|TAG\)=.*$/\1_BRANCH\n\1_COMMIT\n\1_TAG/' input_variables.env); do # collapsed multi-line command
$ set -o allexport
$ . input_variables.env
$ set +o allexport
$ unset MODULEPATH
$ . /gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config/modules.sh
$ echo "MODULEPATH=${MODULEPATH}" > spack_clone_variables.env
$ echo Preparing to clone Spack into ${PWD}
Preparing to clone Spack into /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272556
$ if [[ -z "${SPACK_BRANCH}" && ( -n "${SPACK_COMMIT}" || -n "${SPACK_TAG}" ) ]]; then
$ echo Checking out the ${SPACK_BRANCH} of Spack...
Checking out the develop of Spack...
$ module load unstable git
$ git clone -c feature.manyFiles=true --depth 1 --single-branch --branch ${SPACK_BRANCH} ${SPACK_URL} spack
Cloning into 'spack'...
Updating files: 100% (9129/9129), done.
$ export SPACK_ROOT=${PWD}/spack
$ export SPACK_USER_CACHE_PATH="${CI_BUILDS_DIR}"
$ export SPACK_SYSTEM_CONFIG_PATH="/gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config"
$ echo "SPACK_ROOT=${SPACK_ROOT}" >> spack_clone_variables.env
$ echo "SPACK_USER_CACHE_PATH=${SPACK_USER_CACHE_PATH}" >> spack_clone_variables.env
$ echo "SPACK_SYSTEM_CONFIG_PATH=${SPACK_SYSTEM_CONFIG_PATH}" >> spack_clone_variables.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704/J272556_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
$ env -0 | sed -nz '/^CUSTOM_ENV_/d;/^[^=]\+_\(BRANCH\|COMMIT\|TAG\)=.\+/p' | xargs -0t spack configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK ${SPACK_SETUP_IGNORE_PACKAGE_VARIABLES} --write-commit-file=commit-mapping.env
spack configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK --write-commit-file=commit-mapping.env CI_COMMIT_BRANCH=pramodk/omp-simd GITLAB_PIPELINES_BRANCH=main NEURON_BRANCH=master NMODL_BRANCH=master SPACK_BRANCH=develop CI_DEFAULT_BRANCH=master CORENEURON_COMMIT=6f9c31e54f65ead9165e43fe268144873a37c0c1
==> CI_COMMIT: ignoring CI_COMMIT_BRANCH=pramodk/omp-simd
==> GITLAB_PIPELINES: ignoring GITLAB_PIPELINES_BRANCH=main
==> SPACK: ignoring SPACK_BRANCH=develop
==> CI_DEFAULT: ignoring CI_DEFAULT_BRANCH=master
==> neuron: resolved branch master to f242cc8d4ddefb74f71f7f613a4b7b4043c2b641
==> nmodl: resolved branch master to 426b508730185f8d3475c63b9d5a1aba02be9848
==> neuron@develop: remove branch/commit/tag
==> neuron@develop: use commit="f242cc8d4ddefb74f71f7f613a4b7b4043c2b641"
==> neuron@develop: add preferred=True
==> nmodl@develop: remove branch/commit/tag
==> nmodl@develop: use commit="426b508730185f8d3475c63b9d5a1aba02be9848"
==> nmodl@develop: add preferred=True
==> coreneuron@develop: remove branch/commit/tag
==> coreneuron@develop: use commit="6f9c31e54f65ead9165e43fe268144873a37c0c1"
==> coreneuron@develop: add preferred=True
$ (cd "${SPACK_ROOT}" && git diff)
diff --git a/bluebrain/repo-bluebrain/packages/coreneuron/package.py b/bluebrain/repo-bluebrain/packages/coreneuron/package.py
index 4ac1697..86ea33d 100644
--- a/bluebrain/repo-bluebrain/packages/coreneuron/package.py
+++ b/bluebrain/repo-bluebrain/packages/coreneuron/package.py
@@ -20,7 +20,7 @@ class Coreneuron(CMakePackage):
# This simplifies testing the gitlab-pipelines repository:
git = "git@bbpgitlab.epfl.ch:hpc/coreneuron.git"
- version('develop', branch='master')
+ version('develop', preferred=True, commit='6f9c31e54f65ead9165e43fe268144873a37c0c1') # old: branch='master'
# 1.0.1 > 1.0.0.20210519 > 1.0 as far as Spack is concerned
version('1.0.0.20220304', commit='2d08705')
version('1.0.0.20220218', commit='102ebde')
diff --git a/bluebrain/repo-bluebrain/packages/nmodl/package.py b/bluebrain/repo-bluebrain/packages/nmodl/package.py
index 814c56b..4e699a9 100644
--- a/bluebrain/repo-bluebrain/packages/nmodl/package.py
+++ b/bluebrain/repo-bluebrain/packages/nmodl/package.py
@@ -14,7 +14,7 @@ class Nmodl(CMakePackage):
git = "https://github.com/BlueBrain/nmodl.git"
# 0.3.1 > 0.3.0.20220110 > 0.3.0 > 0.3b > 0.3 to Spack
- version("develop", branch="master", submodules=True)
+ version('develop', preferred=True, commit='426b508730185f8d3475c63b9d5a1aba02be9848', submodules=True) # old: branch="master"
version("llvm", branch="llvm", submodules=True)
# This is the merge commit of #875, which allows catch2 etc. to be dependencies
version("0.3.0.20220531", commit="d63a061ee01b1fd6b14971644bb7fa3efeee20b0")
diff --git a/bluebrain/repo-patches/packages/neuron/package.py b/bluebrain/repo-patches/packages/neuron/package.py
index 8a74d23..74e1fc5 100644
--- a/bluebrain/repo-patches/packages/neuron/package.py
+++ b/bluebrain/repo-patches/packages/neuron/package.py
@@ -30,7 +30,7 @@ class Neuron(CMakePackage):
# Patch for recent CMake versions that don't identify NVHPC as PGI
patch("patch-v800-cmake-nvhpc.patch", when="@8.0.0%nvhpc^cmake@3.20:")
- version("develop", branch="master")
+ version('develop', preferred=True, commit='f242cc8d4ddefb74f71f7f613a4b7b4043c2b641') # old: branch="master"
version("8.0.2", tag="8.0.2")
version("8.0.1", tag="8.0.1")
version("8.0.0", tag="8.0.0")
$ cat commit-mapping.env
NEURON_COMMIT=f242cc8d4ddefb74f71f7f613a4b7b4043c2b641
NMODL_COMMIT=426b508730185f8d3475c63b9d5a1aba02be9848
CORENEURON_COMMIT=6f9c31e54f65ead9165e43fe268144873a37c0c1
$ echo "SPACK_BRANCH=${SPACK_BRANCH}" >> commit-mapping.env
$ echo "SPACK_DEPLOYMENT_SUFFIX=${SPACK_DEPLOYMENT_SUFFIX}" >> commit-mapping.env
$ cat commit-mapping.env >> spack_clone_variables.env
$ spack spec -IL ninja
Input spec
--------------------------------
- ninja
Concretized
--------------------------------
==> Bootstrapping clingo from pre-built binaries
- utrxbc3aohnru5eynalc3hyv4ca4jqte ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
$ echo "SPACK_SETUP_COMMIT_MAPPING_URL=${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/jobs/${CI_JOB_ID}/artifacts/commit-mapping.env" >> spack_clone_variables.env
$ spack config --scope site add "config:ccache:true"
$ echo "SPACK_USE_CCACHE=true" >> spack_clone_variables.env
section_end:1655886520:step_script section_start:1655886520:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=199221 revision=58ba2b95 version=14.2.0
commit-mapping.env: found 1 matching files and directories
input_variables.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=272556 responseStatus=201 Created token=uHoCNEmA
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=199279 revision=58ba2b95 version=14.2.0
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=272556 responseStatus=201 Created token=uHoCNEmA
section_end:1655886522:upload_artifacts_on_success section_start:1655886522:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886522:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886677:resolve_secrets Resolving secrets
section_end:1655886677:resolve_secrets section_start:1655886677:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor046556631, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272592
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272592_PROD_P112_CP9_C15
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563728
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J272592_PROD_P112_CP9_C15 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=8 --jobid=563728 --cpus-per-task=1 --mem=30750M
section_end:1655886679:prepare_executor section_start:1655886679:prepare_script Preparing environment
Running on r1i5n4 via bbpv1.epfl.ch...
section_end:1655886682:prepare_script section_start:1655886682:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886683:get_sources section_start:1655886683:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:intel (272569)...
Runtime platform  arch=amd64 os=linux pid=5739 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272569 responseStatus=200 OK token=myidMhnG
section_end:1655886684:download_artifacts section_start:1655886684:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r2i5n16
Build name: Linux-icpc
Create new tag: 20220622-0831 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272569/spack-build/spack-stage-coreneuron-develop-iw7f7wskesewfnyvmtmk3lfqta3e4jpw/spack-build-iw7f7ws
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 17: reporting_1
1/17 Test #17: reporting_1 ........................ Passed 3.18 sec
2/17 Test #6: ring_TEST .......................... Passed 3.20 sec
Start 7: ring_binqueue_TEST
3/17 Test #1: cmd_interface_test ................. Passed 3.34 sec
Start 8: ring_multisend_TEST
4/17 Test #3: alignment_test ..................... Passed 3.54 sec
5/17 Test #4: queuing_test ....................... Passed 3.61 sec
Start 9: ring_spike_buffer_TEST
6/17 Test #2: interleave_info_constructor_test ... Passed 3.72 sec
7/17 Test #5: lfp_test ........................... Passed 3.88 sec
Start 10: ring_permute1_TEST
8/17 Test #8: ring_multisend_TEST ................ Passed 2.85 sec
Start 11: ring_permute2_TEST
9/17 Test #7: ring_binqueue_TEST ................. Passed 3.05 sec
Start 12: ring_gap_TEST
10/17 Test #9: ring_spike_buffer_TEST ............. Passed 2.66 sec
Start 13: ring_gap_binqueue_TEST
11/17 Test #11: ring_permute2_TEST ................. Passed 2.09 sec
Start 14: ring_gap_multisend_TEST
12/17 Test #12: ring_gap_TEST ...................... Passed 2.40 sec
Start 15: ring_gap_permute1_TEST
13/17 Test #13: ring_gap_binqueue_TEST ............. Passed 2.45 sec
Start 16: ring_gap_permute2_TEST
14/17 Test #14: ring_gap_multisend_TEST ............ Passed 0.79 sec
15/17 Test #15: ring_gap_permute1_TEST ............. Passed 1.52 sec
16/17 Test #16: ring_gap_permute2_TEST ............. Passed 1.52 sec
17/17 Test #10: ring_permute1_TEST ................. Passed 6.80 sec
100% tests passed, 0 tests failed out of 17
Total Test time (real) = 10.81 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
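cmake2junit (from the unit-test-translator module) converts the CTest results under Testing/ into a JUnit report that GitLab can display. A minimal sketch of that kind of conversion, not the tool's actual implementation: CTest writes the tag of the latest run to Testing/TAG and the per-test results to Testing/<tag>/Test.xml.

import xml.etree.ElementTree as ET
from pathlib import Path

tag = Path("Testing/TAG").read_text().splitlines()[0]       # e.g. "20220622-0831"
ctest = ET.parse(f"Testing/{tag}/Test.xml").getroot()

suite = ET.Element("testsuite", name="ctest")
for test in ctest.iter("Test"):
    name = test.findtext("Name")
    if name is None:
        continue                     # skip the bare <Test> entries inside <TestList>
    case = ET.SubElement(suite, "testcase", name=name)
    if test.get("Status") != "passed":
        ET.SubElement(case, "failure", message=test.get("Status", "failed"))

ET.ElementTree(suite).write("ctest.xml", encoding="utf-8", xml_declaration=True)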
section_end:1655886727:step_script section_start:1655886727:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=8508 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272592 responseStatus=201 Created token=yyFEPF_G
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=8559 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272592 responseStatus=201 Created token=yyFEPF_G
section_end:1655886728:upload_artifacts_on_success section_start:1655886728:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886729:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886712:resolve_secrets Resolving secrets
section_end:1655886712:resolve_secrets section_start:1655886712:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor389294522, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272586
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272586_PROD_P112_CP11_C17
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 563734
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J272586_PROD_P112_CP11_C17 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=8 --jobid=563734 --cpus-per-task=1 --mem=30750M
section_end:1655886715:prepare_executor section_start:1655886715:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1655886725:prepare_script section_start:1655886725:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886726:get_sources section_start:1655886726:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc:unified (272563)...
Runtime platform  arch=amd64 os=linux pid=53334 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272563 responseStatus=200 OK token=15snGyEb
section_end:1655886727:download_artifacts section_start:1655886727:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r2i5n16
Build name: Linux-nvc++
Create new tag: 20220622-0832 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272563/spack-build/spack-stage-coreneuron-develop-7xhnxahexeffbne3dpzxwo5qcrumhnar/spack-build-7xhnxah
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #1: cmd_interface_test ..................... Passed 1.38 sec
2/19 Test #4: queuing_test ........................... Passed 3.05 sec
Start 7: ring_binqueue_TEST
3/19 Test #5: lfp_test ............................... Passed 3.25 sec
4/19 Test #19: reporting_1 ............................ Passed 34.07 sec
Start 8: ring_multisend_TEST
5/19 Test #3: alignment_test ......................... Passed 40.86 sec
6/19 Test #2: interleave_info_constructor_test ....... Passed 43.31 sec
Start 9: ring_spike_buffer_TEST
7/19 Test #6: ring_TEST .............................. Passed 54.85 sec
Start 10: ring_permute1_TEST
8/19 Test #7: ring_binqueue_TEST ..................... Passed 52.13 sec
Start 11: ring_permute2_TEST
9/19 Test #8: ring_multisend_TEST .................... Passed 36.77 sec
Start 12: ring_gap_TEST
10/19 Test #9: ring_spike_buffer_TEST ................. Passed 41.62 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #10: ring_permute1_TEST ..................... Passed 36.60 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #11: ring_permute2_TEST ..................... Passed 43.31 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #12: ring_gap_TEST .......................... Passed 35.73 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #13: ring_gap_binqueue_TEST ................. Passed 27.28 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #14: ring_gap_multisend_TEST ................ Passed 33.72 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #15: ring_gap_permute1_TEST ................. Passed 36.69 sec
17/19 Test #16: ring_gap_permute2_TEST ................. Passed 32.56 sec
18/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 29.87 sec
19/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 23.38 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 148.59 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655886903:step_script section_start:1655886903:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=58793 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272586 responseStatus=201 Created token=nxh-MFfk
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=58887 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272586 responseStatus=201 Created token=nxh-MFfk
section_end:1655886904:upload_artifacts_on_success section_start:1655886904:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886905:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886712:resolve_secrets Resolving secrets
section_end:1655886712:resolve_secrets section_start:1655886712:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor257634120, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272584
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272584_PROD_P112_CP5_C9
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 563733
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J272584_PROD_P112_CP5_C9 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=8 --jobid=563733 --cpus-per-task=1 --mem=30750M
section_end:1655886714:prepare_executor section_start:1655886714:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1655886726:prepare_script section_start:1655886726:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886727:get_sources section_start:1655886727:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc (272561)...
Runtime platform  arch=amd64 os=linux pid=53411 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272561 responseStatus=200 OK token=vewM1syi
section_end:1655886728:download_artifacts section_start:1655886728:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r2i5n16
Build name: Linux-nvc++
Create new tag: 20220622-0832 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272561/spack-build/spack-stage-coreneuron-develop-6qi7xiczl6qfolioernjyooskeukbk4m/spack-build-6qi7xic
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #1: cmd_interface_test ..................... Passed 2.51 sec
2/19 Test #2: interleave_info_constructor_test ....... Passed 2.66 sec
Start 7: ring_binqueue_TEST
3/19 Test #3: alignment_test ......................... Passed 2.79 sec
4/19 Test #4: queuing_test ........................... Passed 3.05 sec
Start 8: ring_multisend_TEST
5/19 Test #5: lfp_test ............................... Passed 3.26 sec
6/19 Test #8: ring_multisend_TEST .................... Passed 34.31 sec
Start 9: ring_spike_buffer_TEST
7/19 Test #19: reporting_1 ............................ Passed 42.50 sec
Start 10: ring_permute1_TEST
8/19 Test #6: ring_TEST .............................. Passed 55.10 sec
Start 11: ring_permute2_TEST
9/19 Test #7: ring_binqueue_TEST ..................... Passed 52.56 sec
Start 12: ring_gap_TEST
10/19 Test #9: ring_spike_buffer_TEST ................. Passed 40.01 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #10: ring_permute1_TEST ..................... Passed 42.62 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #11: ring_permute2_TEST ..................... Passed 34.32 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #12: ring_gap_TEST .......................... Passed 43.34 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #13: ring_gap_binqueue_TEST ................. Passed 35.87 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #14: ring_gap_multisend_TEST ................ Passed 30.96 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #15: ring_gap_permute1_TEST ................. Passed 35.95 sec
17/19 Test #16: ring_gap_permute2_TEST ................. Passed 36.87 sec
18/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 35.17 sec
19/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 32.65 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 148.77 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655886903:step_script section_start:1655886903:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=58855 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272584 responseStatus=201 Created token=SFrEsrLR
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=58913 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272584 responseStatus=201 Created token=SFrEsrLR
section_end:1655886904:upload_artifacts_on_success section_start:1655886904:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655886905:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655886949:resolve_secrets Resolving secrets
section_end:1655886949:resolve_secrets section_start:1655886949:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor838222641, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272594
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272594_PROD_P112_CP11_C14
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563750
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J272594_PROD_P112_CP11_C14 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=8 --jobid=563750 --cpus-per-task=1 --mem=30750M
section_end:1655886956:prepare_executor section_start:1655886956:prepare_script Preparing environment
Running on r1i6n30 via bbpv1.epfl.ch...
section_end:1655886960:prepare_script section_start:1655886960:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655886961:get_sources section_start:1655886961:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:intel (272572)...
Runtime platform  arch=amd64 os=linux pid=261736 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272572 responseStatus=200 OK token=mjvHJvjk
section_end:1655886962:download_artifacts section_start:1655886962:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i6n30
Build name: Linux-icpc
Create new tag: 20220622-0836 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272572/spack-build/spack-stage-coreneuron-develop-rh5oq7ygplbbyryh6avavlwff7ieo23w/spack-build-rh5oq7y
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 17: reporting_1
1/17 Test #17: reporting_1 ........................ Passed 3.32 sec
2/17 Test #6: ring_TEST .......................... Passed 3.33 sec
Start 7: ring_binqueue_TEST
3/17 Test #4: queuing_test ....................... Passed 3.52 sec
Start 8: ring_multisend_TEST
4/17 Test #3: alignment_test ..................... Passed 3.83 sec
5/17 Test #1: cmd_interface_test ................. Passed 4.03 sec
Start 9: ring_spike_buffer_TEST
6/17 Test #2: interleave_info_constructor_test ... Passed 4.12 sec
7/17 Test #5: lfp_test ........................... Passed 4.17 sec
Start 10: ring_permute1_TEST
8/17 Test #7: ring_binqueue_TEST ................. Passed 3.13 sec
Start 11: ring_permute2_TEST
9/17 Test #8: ring_multisend_TEST ................ Passed 4.02 sec
Start 12: ring_gap_TEST
10/17 Test #10: ring_permute1_TEST ................. Passed 3.42 sec
Start 13: ring_gap_binqueue_TEST
11/17 Test #9: ring_spike_buffer_TEST ............. Passed 3.99 sec
Start 14: ring_gap_multisend_TEST
12/17 Test #11: ring_permute2_TEST ................. Passed 1.99 sec
Start 15: ring_gap_permute1_TEST
13/17 Test #12: ring_gap_TEST ...................... Passed 3.39 sec
Start 16: ring_gap_permute2_TEST
14/17 Test #13: ring_gap_binqueue_TEST ............. Passed 3.53 sec
15/17 Test #14: ring_gap_multisend_TEST ............ Passed 4.47 sec
16/17 Test #15: ring_gap_permute1_TEST ............. Passed 4.71 sec
17/17 Test #16: ring_gap_permute2_TEST ............. Passed 2.29 sec
100% tests passed, 0 tests failed out of 17
Total Test time (real) = 13.22 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655887002:step_script section_start:1655887002:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=263838 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272594 responseStatus=201 Created token=UyKayb69
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=263872 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272594 responseStatus=201 Created token=UyKayb69
section_end:1655887003:upload_artifacts_on_success section_start:1655887003:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655887004:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655887120:resolve_secrets Resolving secrets
section_end:1655887120:resolve_secrets section_start:1655887120:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor958115185, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272590
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272590_PROD_P112_CP13_C15
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 563758
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J272590_PROD_P112_CP13_C15 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=8 --jobid=563758 --cpus-per-task=1 --mem=30750M
section_end:1655887123:prepare_executor section_start:1655887123:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1655887131:prepare_script section_start:1655887131:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655887132:get_sources section_start:1655887132:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:acc (272567)...
Runtime platform  arch=amd64 os=linux pid=63602 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272567 responseStatus=200 OK token=cPTxqcyz
section_end:1655887133:download_artifacts section_start:1655887133:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i6n30
Build name: Linux-nvc++
Create new tag: 20220622-0839 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272567/spack-build/spack-stage-coreneuron-develop-6q3vzbkyvzxputeeg6himnv7tcfu7ge7/spack-build-6q3vzbk
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #2: interleave_info_constructor_test ....... Passed 1.75 sec
2/19 Test #1: cmd_interface_test ..................... Passed 1.90 sec
Start 7: ring_binqueue_TEST
3/19 Test #3: alignment_test ......................... Passed 2.01 sec
4/19 Test #4: queuing_test ........................... Passed 2.12 sec
Start 8: ring_multisend_TEST
5/19 Test #5: lfp_test ............................... Passed 2.26 sec
6/19 Test #19: reporting_1 ............................ Passed 24.48 sec
Start 9: ring_spike_buffer_TEST
7/19 Test #7: ring_binqueue_TEST ..................... Passed 41.74 sec
Start 10: ring_permute1_TEST
8/19 Test #6: ring_TEST .............................. Passed 44.96 sec
Start 11: ring_permute2_TEST
9/19 Test #8: ring_multisend_TEST .................... Passed 48.64 sec
Start 12: ring_gap_TEST
10/19 Test #9: ring_spike_buffer_TEST ................. Passed 27.99 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #10: ring_permute1_TEST ..................... Passed 26.48 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #11: ring_permute2_TEST ..................... Passed 31.26 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #12: ring_gap_TEST .......................... Passed 39.41 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #13: ring_gap_binqueue_TEST ................. Passed 37.79 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #14: ring_gap_multisend_TEST ................ Passed 47.14 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #15: ring_gap_permute1_TEST ................. Passed 51.95 sec
17/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 45.43 sec
18/19 Test #16: ring_gap_permute2_TEST ................. Passed 54.97 sec
19/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 28.15 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 145.44 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655887303:step_script section_start:1655887303:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=67550 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272590 responseStatus=201 Created token=2ihdmEon
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=67581 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272590 responseStatus=201 Created token=2ihdmEon
section_end:1655887305:upload_artifacts_on_success section_start:1655887305:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655887305:cleanup_file_variables Job succeeded
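For context, the "Preparing the custom executor" blocks above follow the same pattern in every job: submit a sleeping Slurm allocation with sbatch, poll its state until it is running, then launch the actual work inside it with srun. A rough, hypothetical reconstruction of that flow using standard Slurm commands (the real logic of "BB5 PROD runner v0.0.3" is not shown in this log, and job_script.sh stands in for whatever the driver executes):

# Allocate a placeholder job; --parsable prints just the job id.
jobid=$(sbatch --parsable -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 \
               --mem=30750M -C volta --no-requeue --time=1:00:00 \
               --wrap="sleep infinity")
# Poll the short job state (the log prints "job state: PD" / "job state: R") until it runs.
while [ "$(squeue -h -j "${jobid}" -o %t)" != "R" ]; do sleep 5; done
# Execute the GitLab job script inside the allocation; --mpi=none leaves MPI launching to the tests themselves.
srun --mpi=none --jobid="${jobid}" --ntasks=8 --cpus-per-task=1 --mem=30750M bash ./job_script.sh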
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655887117:resolve_secrets Resolving secrets
section_end:1655887117:resolve_secrets section_start:1655887117:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor154594499, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272588
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272588_PROD_P112_CP10_C14
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 563756
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J272588_PROD_P112_CP10_C14 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=8 --jobid=563756 --cpus-per-task=1 --mem=30750M
section_end:1655887118:prepare_executor section_start:1655887118:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1655887125:prepare_script section_start:1655887125:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655887126:get_sources section_start:1655887126:download_artifacts Downloading artifacts
Downloading artifacts for build:coreneuron:nmodl:nvhpc:omp (272565)...
Runtime platform  arch=amd64 os=linux pid=62956 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272565 responseStatus=200 OK token=W7PLLsmx
section_end:1655887127:download_artifacts section_start:1655887127:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i6n30
Build name: Linux-nvc++
Create new tag: 20220622-0839 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272565/spack-build/spack-stage-coreneuron-develop-rjmenmrpgml6pswauzum5llfbry6ngnh/spack-build-rjmenmr
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: lfp_test
Start 6: ring_TEST
Start 19: reporting_1
1/19 Test #2: interleave_info_constructor_test ....... Passed 1.69 sec
2/19 Test #3: alignment_test ......................... Passed 1.80 sec
Start 7: ring_binqueue_TEST
3/19 Test #1: cmd_interface_test ..................... Passed 1.92 sec
4/19 Test #4: queuing_test ........................... Passed 2.05 sec
Start 8: ring_multisend_TEST
5/19 Test #5: lfp_test ............................... Passed 2.19 sec
6/19 Test #19: reporting_1 ............................ Passed 26.84 sec
Start 9: ring_spike_buffer_TEST
7/19 Test #6: ring_TEST .............................. Passed 73.28 sec
Start 10: ring_permute1_TEST
8/19 Test #7: ring_binqueue_TEST ..................... Passed 83.59 sec
Start 11: ring_permute2_TEST
9/19 Test #8: ring_multisend_TEST .................... Passed 85.41 sec
Start 12: ring_gap_TEST
10/19 Test #9: ring_spike_buffer_TEST ................. Passed 64.60 sec
Start 13: ring_gap_binqueue_TEST
11/19 Test #10: ring_permute1_TEST ..................... Passed 45.94 sec
Start 14: ring_gap_multisend_TEST
12/19 Test #11: ring_permute2_TEST ..................... Passed 54.12 sec
Start 15: ring_gap_permute1_TEST
13/19 Test #12: ring_gap_TEST .......................... Passed 195.79 sec
Start 16: ring_gap_permute2_TEST
14/19 Test #13: ring_gap_binqueue_TEST ................. Passed 200.38 sec
Start 17: ring_permute2_cudaInterface_TEST
15/19 Test #14: ring_gap_multisend_TEST ................ Passed 176.63 sec
Start 18: ring_gap_permute2_cudaInterface_TEST
16/19 Test #15: ring_gap_permute1_TEST ................. Passed 157.83 sec
17/19 Test #16: ring_gap_permute2_TEST ................. Passed 14.90 sec
18/19 Test #17: ring_permute2_cudaInterface_TEST ....... Passed 7.75 sec
19/19 Test #18: ring_gap_permute2_cudaInterface_TEST ... Passed 6.87 sec
100% tests passed, 0 tests failed out of 19
Total Test time (real) = 302.75 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655887456:step_script section_start:1655887456:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=68445 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272588 responseStatus=201 Created token=xx81umMX
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=68477 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272588 responseStatus=201 Created token=xx81umMX
section_end:1655887457:upload_artifacts_on_success section_start:1655887457:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655887458:cleanup_file_variables Job succeeded
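The test stage itself is identical across these jobs: snapshot the environment, point CTest's parallelism at the Slurm allocation, and run ctest inside the Spack build environment of the artifact that was downloaded. A condensed sketch of those echoed commands (SPACK_ROOT, SPACK_BUILD_DIR and SPACK_FULL_SPEC are CI variables produced by the earlier build jobs; their values are not shown here):

# Run as many tests concurrently as the allocation has tasks (8 or 16 in the jobs above).
# Note: on multi-node allocations SLURM_TASKS_PER_NODE can look like "8(x2)" and would need parsing.
export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
export BOOST_TEST_COLOR_OUTPUT=no   # keep Boost.Test output plain in the log
. "${SPACK_ROOT}/share/spack/setup-env.sh"
cd "${SPACK_BUILD_DIR}"
# spack build-env re-creates the compiler/MPI environment the spec was built with,
# so ctest runs against the same toolchain that produced the downloaded build artifacts.
spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test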
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655887531:resolve_secrets Resolving secrets
section_end:1655887531:resolve_secrets section_start:1655887531:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor800498274, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272602
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272602_PROD_P112_CP6_C9
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563808
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J272602_PROD_P112_CP6_C9 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=16 --jobid=563808 --cpus-per-task=1 --mem=76G
section_end:1655887532:prepare_executor section_start:1655887532:prepare_script Preparing environment
Running on r1i7n22 via bbpv1.epfl.ch...
section_end:1655887536:prepare_script section_start:1655887536:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655887538:get_sources section_start:1655887538:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:mod2c:intel (272579)...
Runtime platform  arch=amd64 os=linux pid=227134 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272579 responseStatus=200 OK token=nyWmD2tP
section_end:1655887542:download_artifacts section_start:1655887542:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i7n22
Build name: Linux-icpc
Create new tag: 20220622-0846 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272579/spack-build/spack-stage-neuron-develop-5rhr7pzdmuyym5ch4jvmvyd7544mkxl4/spack-build-5rhr7pz
Start 49: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 56: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 62: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 68: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 74: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 80: testcorenrn_kin::coreneuron_cpu_offline::preparation
Start 85: testcorenrn_patstim::coreneuron_cpu_offline::preparation
Start 91: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
Start 97: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
1/121 Test #62: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 2.69 sec
Start 44: external_ringtest::neuron
2/121 Test #68: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 2.88 sec
Start 52: testcorenrn_bbcore::neuron
3/121 Test #85: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 7.36 sec
Start 103: testcorenrn_watch::coreneuron_cpu_offline::preparation
4/121 Test #56: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 7.72 sec
Start 53: testcorenrn_bbcore::coreneuron_cpu_online
5/121 Test #80: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 7.75 sec
Start 54: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
6/121 Test #74: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 8.03 sec
Start 45: external_ringtest::neuron_mpi
7/121 Test #91: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 8.19 sec
Start 46: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
8/121 Test #49: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 11.52 sec
Start 47: external_ringtest::coreneuron_cpu_mpi
9/121 Test #97: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 12.23 sec
Start 118: olfactory-bulb-3d::neuron::preparation
10/121 Test #44: external_ringtest::neuron .......................................... Passed 10.48 sec
Start 55: testcorenrn_bbcore::coreneuron_cpu_offline
11/121 Test #118: olfactory-bulb-3d::neuron::preparation ............................. Passed 1.49 sec
Start 120: olfactory-bulb-3d::coreneuron_cpu_online::preparation
12/121 Test #120: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.53 sec
Start 41: reduced_dentate::neuron
13/121 Test #52: testcorenrn_bbcore::neuron ......................................... Passed 11.94 sec
Start 58: testcorenrn_conc::neuron
14/121 Test #55: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 7.15 sec
Start 59: testcorenrn_conc::coreneuron_cpu_online
15/121 Test #103: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 15.43 sec
Start 48: external_ringtest::coreneuron_cpu_mpi_offline
16/121 Test #53: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 16.82 sec
Start 60: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
17/121 Test #58: testcorenrn_conc::neuron ........................................... Passed 12.88 sec
Start 61: testcorenrn_conc::coreneuron_cpu_offline
18/121 Test #45: external_ringtest::neuron_mpi ...................................... Passed 20.66 sec
Start 64: testcorenrn_deriv::neuron
Start 65: testcorenrn_deriv::coreneuron_cpu_online
19/121 Test #54: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 21.14 sec
Start 66: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
20/121 Test #48: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 6.54 sec
Start 67: testcorenrn_deriv::coreneuron_cpu_offline
Start 76: testcorenrn_kin::neuron
21/121 Test #61: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 3.21 sec
Start 77: testcorenrn_kin::coreneuron_cpu_online
22/121 Test #67: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 2.96 sec
Start 78: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
23/121 Test #64: testcorenrn_deriv::neuron .......................................... Passed 6.22 sec
Start 79: testcorenrn_kin::coreneuron_cpu_offline
24/121 Test #76: testcorenrn_kin::neuron ............................................ Passed 6.07 sec
Start 109: channel_benchmark_hippo::neuron
25/121 Test #59: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 15.51 sec
Start 110: channel_benchmark_hippo::coreneuron_cpu_online
26/121 Test #60: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 12.67 sec
Start 111: channel_benchmark_hippo::coreneuron_cpu_filemode
27/121 Test #79: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 3.85 sec
Start 113: channel_benchmark_sscx::neuron
28/121 Test #65: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 10.94 sec
Start 114: channel_benchmark_sscx::coreneuron_cpu_online
29/121 Test #66: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 12.29 sec
Start 115: channel_benchmark_sscx::coreneuron_cpu_filemode
30/121 Test #77: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 9.98 sec
Start 1: testneuron
31/121 Test #1: testneuron ......................................................... Passed 0.09 sec
Start 2: ringtest
32/121 Test #78: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 9.04 sec
Start 3: connect_dend
33/121 Test #2: ringtest ........................................................... Passed 0.62 sec
Start 4: mpi_init::nrniv_mpiopt
34/121 Test #3: connect_dend ....................................................... Passed 0.70 sec
Start 5: mpi_init::nrniv_nrnmpi_init
35/121 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 1.06 sec
Start 6: mpi_init::python_nrnmpi_init
36/121 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 1.10 sec
Start 7: mpi_init::python_mpienv
37/121 Test #7: mpi_init::python_mpienv ............................................ Passed 2.81 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
38/121 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 2.27 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
39/121 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.88 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
40/121 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 2.49 sec
Start 11: mpi_init::python_mpiexec_mpienv
41/121 Test #46: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 48.26 sec
Start 70: testcorenrn_gf::neuron
42/121 Test #47: external_ringtest::coreneuron_cpu_mpi .............................. Passed 47.68 sec
Start 71: testcorenrn_gf::coreneuron_cpu_online
43/121 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 21.73 sec
Start 12: pynrn::basic_tests
44/121 Test #70: testcorenrn_gf::neuron ............................................. Passed 8.07 sec
Start 72: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
45/121 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 16.15 sec
Start 13: coverage_tests::cover_tests
46/121 Test #71: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 15.01 sec
Start 73: testcorenrn_gf::coreneuron_cpu_offline
47/121 Test #73: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 17.64 sec
Start 82: testcorenrn_patstim::neuron
48/121 Test #72: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 42.12 sec
Start 83: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
49/121 Test #13: coverage_tests::cover_tests ........................................ Passed 42.19 sec
Start 14: parallel_tests
50/121 Test #82: testcorenrn_patstim::neuron ........................................ Passed 30.80 sec
Start 84: testcorenrn_patstim::coreneuron_cpu_offline
51/121 Test #41: reduced_dentate::neuron ............................................ Passed 125.51 sec
Start 42: reduced_dentate::coreneuron_cpu
52/121 Test #84: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 17.44 sec
Start 87: testcorenrn_vecplay::neuron
53/121 Test #83: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 56.10 sec
Start 88: testcorenrn_vecplay::coreneuron_cpu_online
54/121 Test #12: pynrn::basic_tests ................................................. Passed 99.46 sec
Start 15: parallel_partrans
55/121 Test #87: testcorenrn_vecplay::neuron ........................................ Passed 31.51 sec
Start 89: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
56/121 Test #14: parallel_tests ..................................................... Passed 67.99 sec
Start 16: parallel_netpar
57/121 Test #114: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 145.31 sec
Start 17: parallel_bas
58/121 Test #113: channel_benchmark_sscx::neuron ..................................... Passed 149.14 sec
Start 18: coreneuron_modtests::version_macros
59/121 Test #110: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 154.55 sec
Start 19: coreneuron_modtests::fornetcon_py_cpu
60/121 Test #111: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 157.71 sec
Start 20: coreneuron_modtests::direct_py_cpu
61/121 Test #15: parallel_partrans .................................................. Passed 32.33 sec
Start 21: coreneuron_modtests::direct_hoc_cpu
62/121 Test #109: channel_benchmark_hippo::neuron .................................... Passed 166.15 sec
Start 22: coreneuron_modtests::spikes_py_cpu
63/121 Test #115: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 166.52 sec
Start 23: coreneuron_modtests::spikes_file_mode_py_cpu
64/121 Test #88: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 45.12 sec
Start 90: testcorenrn_vecplay::coreneuron_cpu_offline
65/121 Test #89: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 40.36 sec
Start 99: testcorenrn_watch::neuron
66/121 Test #16: parallel_netpar .................................................... Passed 38.43 sec
Start 24: coreneuron_modtests::fast_imem_py_cpu
67/121 Test #21: coreneuron_modtests::direct_hoc_cpu ................................ Passed 24.65 sec
Start 25: coreneuron_modtests::datareturn_py_cpu
68/121 Test #90: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 17.00 sec
Start 100: testcorenrn_watch::coreneuron_cpu_online
69/121 Test #42: reduced_dentate::coreneuron_cpu .................................... Passed 99.48 sec
Start 93: testcorenrn_vecevent::neuron
70/121 Test #99: testcorenrn_watch::neuron .......................................... Passed 29.17 sec
Start 101: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
71/121 Test #18: coreneuron_modtests::version_macros ................................ Passed 61.59 sec
Start 26: coreneuron_modtests::test_units_py_cpu
72/121 Test #93: testcorenrn_vecevent::neuron ....................................... Passed 26.30 sec
Start 94: testcorenrn_vecevent::coreneuron_cpu_online
73/121 Test #19: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 76.59 sec
Start 27: coreneuron_modtests::test_netmove_py_cpu
74/121 Test #22: coreneuron_modtests::spikes_py_cpu ................................. Passed 69.01 sec
Start 28: coreneuron_modtests::test_pointer_py_cpu
75/121 Test #20: coreneuron_modtests::direct_py_cpu ................................. Passed 76.36 sec
Start 29: coreneuron_modtests::test_watchrange_py_cpu
76/121 Test #23: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 66.80 sec
Start 30: coreneuron_modtests::test_psolve_py_cpu
77/121 Test #100: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 49.64 sec
Start 102: testcorenrn_watch::coreneuron_cpu_offline
78/121 Test #102: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 15.44 sec
Start 105: testcorenrn_netstimdirect::direct_netstimdirect
79/121 Test #25: coreneuron_modtests::datareturn_py_cpu ............................. Passed 74.09 sec
Start 31: coreneuron_modtests::test_ba_py_cpu
80/121 Test #101: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 55.19 sec
Start 106: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
81/121 Test #24: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 78.81 sec
Start 32: coreneuron_modtests::test_natrans_py_cpu
82/121 Test #26: coreneuron_modtests::test_units_py_cpu ............................. Passed 57.63 sec
Start 36: modlunit_unitstest
83/121 Test #36: modlunit_unitstest ................................................. Passed 0.48 sec
Start 37: modlunit_hh
84/121 Test #37: modlunit_hh ........................................................ Passed 0.07 sec
Start 38: modlunit_stim
85/121 Test #38: modlunit_stim ...................................................... Passed 0.06 sec
Start 39: modlunit_pattern
86/121 Test #39: modlunit_pattern ................................................... Passed 0.28 sec
Start 40: external_nrntest
87/121 Test #94: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 53.22 sec
Start 95: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
88/121 Test #27: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 58.97 sec
Start 43: reduced_dentate::compare_results
89/121 Test #105: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 38.29 sec
Start 33: coreneuron_modtests::spikes_mpi_py_cpu
90/121 Test #43: reduced_dentate::compare_results ................................... Passed 2.44 sec
Start 57: testcorenrn_bbcore::compare_results
91/121 Test #57: testcorenrn_bbcore::compare_results ................................ Passed 2.25 sec
Start 63: testcorenrn_conc::compare_results
92/121 Test #17: parallel_bas ....................................................... Passed 148.01 sec
Start 69: testcorenrn_deriv::compare_results
93/121 Test #106: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 36.26 sec
Start 34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
94/121 Test #63: testcorenrn_conc::compare_results .................................. Passed 3.14 sec
Start 75: testcorenrn_gf::compare_results
95/121 Test #29: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 63.54 sec
Start 81: testcorenrn_kin::compare_results
96/121 Test #69: testcorenrn_deriv::compare_results ................................. Passed 3.26 sec
Start 86: testcorenrn_patstim::compare_results
97/121 Test #75: testcorenrn_gf::compare_results .................................... Passed 3.46 sec
Start 92: testcorenrn_vecplay::compare_results
98/121 Test #81: testcorenrn_kin::compare_results ................................... Passed 2.93 sec
Start 104: testcorenrn_watch::compare_results
99/121 Test #104: testcorenrn_watch::compare_results ................................. Passed 0.77 sec
Start 107: testcorenrn_netstimdirect::compare_results
100/121 Test #92: testcorenrn_vecplay::compare_results ............................... Passed 1.80 sec
Start 112: channel_benchmark_hippo::compare_results
101/121 Test #86: testcorenrn_patstim::compare_results ............................... Passed 3.75 sec
Start 116: channel_benchmark_sscx::compare_results
102/121 Test #107: testcorenrn_netstimdirect::compare_results ......................... Passed 2.81 sec
103/121 Test #30: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 67.74 sec
Start 35: coreneuron_modtests::inputpresyn_py_cpu
104/121 Test #116: channel_benchmark_sscx::compare_results ............................ Passed 2.22 sec
105/121 Test #31: coreneuron_modtests::test_ba_py_cpu ................................ Passed 47.72 sec
106/121 Test #32: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 45.29 sec
107/121 Test #112: channel_benchmark_hippo::compare_results ........................... Passed 3.79 sec
Start 96: testcorenrn_vecevent::coreneuron_cpu_offline
108/121 Test #95: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 27.12 sec
Start 117: olfactory-bulb-3d::neuron
109/121 Test #28: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 77.40 sec
110/121 Test #33: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 20.22 sec
111/121 Test #34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 15.32 sec
Start 119: olfactory-bulb-3d::coreneuron_cpu_online
112/121 Test #96: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 5.70 sec
Start 98: testcorenrn_vecevent::compare_results
113/121 Test #98: testcorenrn_vecevent::compare_results .............................. Passed 0.70 sec
114/121 Test #35: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 8.04 sec
Start 50: external_ringtest::coreneuron_cpu_mpi_threads
115/121 Test #50: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 15.70 sec
Start 51: external_ringtest::compare_results
116/121 Test #51: external_ringtest::compare_results ................................. Passed 0.46 sec
117/121 Test #117: olfactory-bulb-3d::neuron .......................................... Passed 157.38 sec
118/121 Test #119: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 187.13 sec
Start 121: olfactory-bulb-3d::compare_results
119/121 Test #121: olfactory-bulb-3d::compare_results ................................. Passed 0.22 sec
120/121 Test #40: external_nrntest ................................................... Passed 265.54 sec
Start 108: tqperf::coreneuron
121/121 Test #108: tqperf::coreneuron ................................................. Passed 7.66 sec
100% tests passed, 0 tests failed out of 121
Total Test time (real) = 581.49 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655888197:step_script section_start:1655888197:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=249848 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272602 responseStatus=201 Created token=ZVkACDK1
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=249902 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272602 responseStatus=201 Created token=ZVkACDK1
section_end:1655888199:upload_artifacts_on_success section_start:1655888199:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655888201:cleanup_file_variables Job succeeded
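One more detail shared by these jobs: the first echoed command writes a sorted snapshot of the environment to initial_environment.env, which is then uploaded with the "archive" artifacts. Handling the entries NUL-delimited (env -0, sort -z, xargs -0) keeps the sort correct even for variables whose values contain newlines. A small usage sketch; only the first line actually appears in the log, and the diff step is an assumption about why the snapshot is archived:

# Snapshot the environment exactly as the job scripts above do.
env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
# Later (e.g. when debugging a failing job) the same command can be re-run and
# compared against the archived snapshot to see what module/spack setup changed.
env -0 | sort -z | xargs -0 -L 1 echo > current_environment.env
diff initial_environment.env current_environment.env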
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655887687:resolve_secrets Resolving secrets
section_end:1655887687:resolve_secrets section_start:1655887687:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor487004905, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272596
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272596_PROD_P112_CP4_C8
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 563812
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J272596_PROD_P112_CP4_C8 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=16 --jobid=563812 --cpus-per-task=1 --mem=76G
section_end:1655887688:prepare_executor section_start:1655887688:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1655887695:prepare_script section_start:1655887695:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655887697:get_sources section_start:1655887697:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:mod2c:nvhpc:acc (272574)...
Runtime platform  arch=amd64 os=linux pid=73489 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272574 responseStatus=200 OK token=SFnJvts6
section_end:1655887698:download_artifacts section_start:1655887698:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i7n28
Build name: Linux-nvc++
Create new tag: 20220622-0848 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272574/spack-build/spack-stage-neuron-develop-mqy2bmic4al44uogrem4llrvfdch65dy/spack-build-mqy2bmi
Start 67: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 72: external_ringtest::coreneuron_gpu_mpi_offline::preparation
Start 79: testcorenrn_bbcore::coreneuron_gpu_offline::preparation
Start 83: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 89: testcorenrn_conc::coreneuron_gpu_offline::preparation
Start 93: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 99: testcorenrn_deriv::coreneuron_gpu_offline::preparation
Start 103: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 109: testcorenrn_gf::coreneuron_gpu_offline::preparation
Start 113: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 119: testcorenrn_kin::coreneuron_gpu_offline::preparation
Start 123: testcorenrn_kin::coreneuron_cpu_offline::preparation
1/185 Test #79: testcorenrn_bbcore::coreneuron_gpu_offline::preparation ............ Passed 2.20 sec
Start 128: testcorenrn_patstim::coreneuron_gpu_offline::preparation
2/185 Test #83: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 2.25 sec
Start 62: external_ringtest::neuron
3/185 Test #93: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 2.68 sec
Start 75: testcorenrn_bbcore::neuron
4/185 Test #103: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 3.30 sec
Start 76: testcorenrn_bbcore::coreneuron_gpu_online
5/185 Test #123: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 3.49 sec
Start 77: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate
6/185 Test #89: testcorenrn_conc::coreneuron_gpu_offline::preparation .............. Passed 3.56 sec
Start 78: testcorenrn_bbcore::coreneuron_gpu_offline
7/185 Test #119: testcorenrn_kin::coreneuron_gpu_offline::preparation ............... Passed 3.67 sec
Start 80: testcorenrn_bbcore::coreneuron_cpu_online
8/185 Test #99: testcorenrn_deriv::coreneuron_gpu_offline::preparation ............. Passed 3.80 sec
Start 81: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
9/185 Test #109: testcorenrn_gf::coreneuron_gpu_offline::preparation ................ Passed 4.05 sec
Start 131: testcorenrn_patstim::coreneuron_cpu_offline::preparation
10/185 Test #62: external_ringtest::neuron .......................................... Passed 1.90 sec
Start 82: testcorenrn_bbcore::coreneuron_cpu_offline
11/185 Test #113: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 4.19 sec
Start 137: testcorenrn_vecplay::coreneuron_gpu_offline::preparation
12/185 Test #128: testcorenrn_patstim::coreneuron_gpu_offline::preparation ........... Passed 2.04 sec
Start 85: testcorenrn_conc::neuron
13/185 Test #75: testcorenrn_bbcore::neuron ......................................... Passed 1.68 sec
Start 86: testcorenrn_conc::coreneuron_gpu_online
14/185 Test #85: testcorenrn_conc::neuron ........................................... Passed 2.12 sec
Start 87: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate
15/185 Test #131: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 2.37 sec
Start 141: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
16/185 Test #137: testcorenrn_vecplay::coreneuron_gpu_offline::preparation ........... Passed 2.28 sec
Start 157: testcorenrn_watch::coreneuron_gpu_offline::preparation
17/185 Test #141: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 1.75 sec
Start 161: testcorenrn_watch::coreneuron_cpu_offline::preparation
18/185 Test #157: testcorenrn_watch::coreneuron_gpu_offline::preparation ............. Passed 1.72 sec
Start 63: external_ringtest::neuron_mpi
19/185 Test #63: external_ringtest::neuron_mpi ...................................... Passed 2.35 sec
Start 64: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
20/185 Test #161: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 2.41 sec
Start 65: external_ringtest::coreneuron_cpu_mpi
21/185 Test #82: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 21.57 sec
Start 88: testcorenrn_conc::coreneuron_gpu_offline
22/185 Test #80: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 22.50 sec
Start 90: testcorenrn_conc::coreneuron_cpu_online
23/185 Test #81: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 22.64 sec
Start 91: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
24/185 Test #72: external_ringtest::coreneuron_gpu_mpi_offline::preparation ......... Passed 27.80 sec
Start 69: external_ringtest::coreneuron_gpu_mpi_offline_saverestore
25/185 Test #78: testcorenrn_bbcore::coreneuron_gpu_offline ......................... Passed 24.30 sec
Start 92: testcorenrn_conc::coreneuron_cpu_offline
26/185 Test #76: testcorenrn_bbcore::coreneuron_gpu_online .......................... Passed 24.92 sec
Start 95: testcorenrn_deriv::neuron
27/185 Test #67: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 28.30 sec
Start 66: external_ringtest::coreneuron_cpu_mpi_offline
28/185 Test #86: testcorenrn_conc::coreneuron_gpu_online ............................ Passed 24.76 sec
Start 96: testcorenrn_deriv::coreneuron_gpu_online
29/185 Test #95: testcorenrn_deriv::neuron .......................................... Passed 1.41 sec
Start 97: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate
30/185 Test #65: external_ringtest::coreneuron_cpu_mpi .............................. Passed 19.58 sec
Start 70: external_ringtest::coreneuron_gpu_mpi
31/185 Test #87: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate ........... Passed 24.72 sec
Start 98: testcorenrn_deriv::coreneuron_gpu_offline
32/185 Test #88: testcorenrn_conc::coreneuron_gpu_offline ........................... Passed 17.06 sec
Start 100: testcorenrn_deriv::coreneuron_cpu_online
33/185 Test #66: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 16.78 sec
Start 71: external_ringtest::coreneuron_gpu_mpi_offline
34/185 Test #90: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 18.96 sec
Start 101: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
35/185 Test #91: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 18.72 sec
Start 102: testcorenrn_deriv::coreneuron_cpu_offline
36/185 Test #77: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate ......... Passed 59.40 sec
Start 115: testcorenrn_kin::neuron
37/185 Test #115: testcorenrn_kin::neuron ............................................ Passed 1.46 sec
Start 116: testcorenrn_kin::coreneuron_gpu_online
38/185 Test #96: testcorenrn_deriv::coreneuron_gpu_online ........................... Passed 43.39 sec
Start 117: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate
39/185 Test #70: external_ringtest::coreneuron_gpu_mpi .............................. Passed 55.95 sec
Start 105: testcorenrn_gf::neuron
40/185 Test #105: testcorenrn_gf::neuron ............................................. Passed 4.11 sec
Start 106: testcorenrn_gf::coreneuron_gpu_online
41/185 Test #98: testcorenrn_deriv::coreneuron_gpu_offline .......................... Passed 66.56 sec
Start 118: testcorenrn_kin::coreneuron_gpu_offline
42/185 Test #92: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 70.36 sec
Start 120: testcorenrn_kin::coreneuron_cpu_online
43/185 Test #97: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate .......... Passed 68.76 sec
Start 121: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
44/185 Test #71: external_ringtest::coreneuron_gpu_mpi_offline ...................... Passed 53.37 sec
Start 107: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate
45/185 Test #102: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 53.62 sec
Start 122: testcorenrn_kin::coreneuron_cpu_offline
46/185 Test #101: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 53.68 sec
Start 126: testcorenrn_patstim::coreneuron_gpu_offline_saverestore
47/185 Test #100: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 59.30 sec
Start 127: testcorenrn_patstim::coreneuron_gpu_offline
48/185 Test #116: testcorenrn_kin::coreneuron_gpu_online ............................. Passed 55.08 sec
Start 167: channel_benchmark_hippo::neuron
49/185 Test #167: channel_benchmark_hippo::neuron .................................... Passed 28.69 sec
Start 168: channel_benchmark_hippo::coreneuron_gpu_online
50/185 Test #106: testcorenrn_gf::coreneuron_gpu_online .............................. Passed 65.51 sec
Start 108: testcorenrn_gf::coreneuron_gpu_offline
51/185 Test #118: testcorenrn_kin::coreneuron_gpu_offline ............................ Passed 81.16 sec
Start 169: channel_benchmark_hippo::coreneuron_gpu_filemode
52/185 Test #122: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 80.05 sec
Start 170: channel_benchmark_hippo::coreneuron_cpu_online
53/185 Test #121: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 80.48 sec
Start 171: channel_benchmark_hippo::coreneuron_cpu_filemode
54/185 Test #127: testcorenrn_patstim::coreneuron_gpu_offline ........................ Passed 76.94 sec
Start 173: channel_benchmark_sscx::neuron
55/185 Test #117: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate ............ Passed 109.64 sec
Start 174: channel_benchmark_sscx::coreneuron_gpu_online
56/185 Test #120: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 84.88 sec
Start 175: channel_benchmark_sscx::coreneuron_gpu_filemode
57/185 Test #107: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate ............. Passed 108.44 sec
Start 110: testcorenrn_gf::coreneuron_cpu_online
58/185 Test #64: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 223.55 sec
Start 111: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
59/185 Test #108: testcorenrn_gf::coreneuron_gpu_offline ............................. Passed 104.29 sec
Start 112: testcorenrn_gf::coreneuron_cpu_offline
60/185 Test #168: channel_benchmark_hippo::coreneuron_gpu_online ..................... Passed 114.48 sec
Start 176: channel_benchmark_sscx::coreneuron_cpu_online
61/185 Test #69: external_ringtest::coreneuron_gpu_mpi_offline_saverestore .......... Passed 248.14 sec
Start 125: testcorenrn_patstim::neuron
62/185 Test #126: testcorenrn_patstim::coreneuron_gpu_offline_saverestore ............ Passed 177.30 sec
Start 177: channel_benchmark_sscx::coreneuron_cpu_filemode
63/185 Test #125: testcorenrn_patstim::neuron ........................................ Passed 3.89 sec
Start 129: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
64/185 Test #110: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 101.25 sec
Start 130: testcorenrn_patstim::coreneuron_cpu_offline
65/185 Test #173: channel_benchmark_sscx::neuron ..................................... Passed 131.90 sec
Start 1: testneuron
66/185 Test #1: testneuron ......................................................... Passed 0.05 sec
Start 2: ringtest
67/185 Test #2: ringtest ........................................................... Passed 0.23 sec
Start 3: connect_dend
68/185 Test #3: connect_dend ....................................................... Passed 0.13 sec
Start 4: mpi_init::nrniv_mpiopt
69/185 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.39 sec
Start 5: mpi_init::nrniv_nrnmpi_init
70/185 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 0.15 sec
Start 6: mpi_init::python_nrnmpi_init
71/185 Test #111: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 77.95 sec
Start 133: testcorenrn_vecplay::neuron
72/185 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 0.49 sec
Start 7: mpi_init::python_mpienv
73/185 Test #7: mpi_init::python_mpienv ............................................ Passed 12.85 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
74/185 Test #133: testcorenrn_vecplay::neuron ........................................ Passed 13.29 sec
Start 134: testcorenrn_vecplay::coreneuron_gpu_online
75/185 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 1.50 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
76/185 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.37 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
77/185 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 2.02 sec
Start 11: mpi_init::python_mpiexec_mpienv
78/185 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 2.17 sec
Start 12: pynrn::basic_tests
79/185 Test #12: pynrn::basic_tests ................................................. Passed 8.62 sec
Start 13: coverage_tests::cover_tests
80/185 Test #13: coverage_tests::cover_tests ........................................ Passed 3.14 sec
Start 14: parallel_tests
81/185 Test #14: parallel_tests ..................................................... Passed 4.97 sec
Start 15: parallel_partrans
82/185 Test #15: parallel_partrans .................................................. Passed 1.75 sec
Start 16: parallel_netpar
83/185 Test #112: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 92.47 sec
Start 135: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate
84/185 Test #16: parallel_netpar .................................................... Passed 2.07 sec
Start 17: parallel_bas
85/185 Test #17: parallel_bas ....................................................... Passed 3.23 sec
Start 18: coreneuron_modtests::version_macros
86/185 Test #174: channel_benchmark_sscx::coreneuron_gpu_online ...................... Passed 208.06 sec
Start 19: coreneuron_modtests::fornetcon_py_cpu
87/185 Test #170: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 219.44 sec
Start 20: coreneuron_modtests::direct_py_cpu
88/185 Test #171: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 221.37 sec
Start 21: coreneuron_modtests::direct_hoc_cpu
89/185 Test #176: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 137.67 sec
Start 22: coreneuron_modtests::spikes_py_cpu
90/185 Test #169: channel_benchmark_hippo::coreneuron_gpu_filemode ................... Passed 221.99 sec
Start 23: coreneuron_modtests::spikes_file_mode_py_cpu
91/185 Test #175: channel_benchmark_sscx::coreneuron_gpu_filemode .................... Passed 218.00 sec
Start 24: coreneuron_modtests::fast_imem_py_cpu
92/185 Test #24: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 1.51 sec
Start 25: coreneuron_modtests::datareturn_py_cpu
93/185 Test #177: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 127.89 sec
Start 26: coreneuron_modtests::test_units_py_cpu
94/185 Test #130: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 96.40 sec
Start 136: testcorenrn_vecplay::coreneuron_gpu_offline
95/185 Test #134: testcorenrn_vecplay::coreneuron_gpu_online ......................... Passed 84.14 sec
Start 138: testcorenrn_vecplay::coreneuron_cpu_online
96/185 Test #135: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate ........ Passed 91.06 sec
Start 139: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
97/185 Test #18: coreneuron_modtests::version_macros ................................ Passed 104.27 sec
Start 27: coreneuron_modtests::test_netmove_py_cpu
98/185 Test #21: coreneuron_modtests::direct_hoc_cpu ................................ Passed 82.28 sec
Start 28: coreneuron_modtests::test_pointer_py_cpu
99/185 Test #19: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 92.33 sec
Start 29: coreneuron_modtests::test_watchrange_py_cpu
100/185 Test #129: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 203.76 sec
Start 140: testcorenrn_vecplay::coreneuron_cpu_offline
101/185 Test #26: coreneuron_modtests::test_units_py_cpu ............................. Passed 80.98 sec
Start 30: coreneuron_modtests::test_psolve_py_cpu
102/185 Test #22: coreneuron_modtests::spikes_py_cpu ................................. Passed 85.30 sec
Start 31: coreneuron_modtests::test_ba_py_cpu
103/185 Test #138: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 77.20 sec
Start 153: testcorenrn_watch::neuron
104/185 Test #153: testcorenrn_watch::neuron .......................................... Passed 1.58 sec
Start 154: testcorenrn_watch::coreneuron_gpu_online
105/185 Test #20: coreneuron_modtests::direct_py_cpu ................................. Passed 99.27 sec
Start 32: coreneuron_modtests::test_natrans_py_cpu
106/185 Test #25: coreneuron_modtests::datareturn_py_cpu ............................. Passed 94.97 sec
Start 36: coreneuron_modtests::fornetcon_py_gpu
107/185 Test #23: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 96.94 sec
Start 37: coreneuron_modtests::direct_py_gpu
108/185 Test #136: testcorenrn_vecplay::coreneuron_gpu_offline ........................ Passed 94.34 sec
Start 155: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate
109/185 Test #139: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 158.47 sec
Start 156: testcorenrn_watch::coreneuron_gpu_offline
110/185 Test #140: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 160.59 sec
Start 158: testcorenrn_watch::coreneuron_cpu_online
111/185 Test #31: coreneuron_modtests::test_ba_py_cpu ................................ Passed 158.85 sec
Start 38: coreneuron_modtests::direct_hoc_gpu
112/185 Test #27: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 184.15 sec
Start 39: coreneuron_modtests::spikes_py_gpu
113/185 Test #30: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 159.54 sec
Start 40: coreneuron_modtests::spikes_file_mode_py_gpu
114/185 Test #29: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 162.61 sec
Start 41: coreneuron_modtests::fast_imem_py_gpu
115/185 Test #41: coreneuron_modtests::fast_imem_py_gpu .............................. Passed 1.71 sec
Start 42: coreneuron_modtests::datareturn_py_gpu
116/185 Test #32: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 150.04 sec
Start 43: coreneuron_modtests::test_units_py_gpu
117/185 Test #37: coreneuron_modtests::direct_py_gpu ................................. Passed 152.41 sec
Start 44: coreneuron_modtests::test_netmove_py_gpu
118/185 Test #154: testcorenrn_watch::coreneuron_gpu_online ........................... Passed 166.12 sec
Start 159: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
119/185 Test #36: coreneuron_modtests::fornetcon_py_gpu .............................. Passed 199.86 sec
Start 45: coreneuron_modtests::test_pointer_py_gpu
120/185 Test #155: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate .......... Passed 199.60 sec
Start 160: testcorenrn_watch::coreneuron_cpu_offline
121/185 Test #156: testcorenrn_watch::coreneuron_gpu_offline .......................... Passed 96.99 sec
Start 163: testcorenrn_netstimdirect::direct_netstimdirect
122/185 Test #39: coreneuron_modtests::spikes_py_gpu ................................. Passed 88.86 sec
Start 46: coreneuron_modtests::test_watchrange_py_gpu
123/185 Test #43: coreneuron_modtests::test_units_py_gpu ............................. Passed 85.89 sec
Start 47: coreneuron_modtests::test_psolve_py_gpu
124/185 Test #40: coreneuron_modtests::spikes_file_mode_py_gpu ....................... Passed 90.03 sec
Start 48: coreneuron_modtests::test_ba_py_gpu
125/185 Test #38: coreneuron_modtests::direct_hoc_gpu ................................ Passed 90.85 sec
Start 49: coreneuron_modtests::test_natrans_py_gpu
126/185 Test #158: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 91.33 sec
Start 164: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
127/185 Test #42: coreneuron_modtests::datareturn_py_gpu ............................. Passed 88.91 sec
Start 53: modlunit_unitstest
128/185 Test #53: modlunit_unitstest ................................................. Passed 0.02 sec
Start 54: modlunit_hh
129/185 Test #54: modlunit_hh ........................................................ Passed 0.01 sec
Start 55: modlunit_stim
130/185 Test #55: modlunit_stim ...................................................... Passed 0.01 sec
Start 56: modlunit_pattern
131/185 Test #56: modlunit_pattern ................................................... Passed 0.02 sec
Start 57: external_nrntest
132/185 Test #159: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 85.85 sec
Start 33: coreneuron_modtests::spikes_mpi_py_cpu
133/185 Test #44: coreneuron_modtests::test_netmove_py_gpu ........................... Passed 96.65 sec
Start 84: testcorenrn_bbcore::compare_results
134/185 Test #84: testcorenrn_bbcore::compare_results ................................ Passed 0.06 sec
Start 94: testcorenrn_conc::compare_results
135/185 Test #94: testcorenrn_conc::compare_results .................................. Passed 0.20 sec
Start 104: testcorenrn_deriv::compare_results
136/185 Test #104: testcorenrn_deriv::compare_results ................................. Passed 0.05 sec
Start 114: testcorenrn_gf::compare_results
137/185 Test #114: testcorenrn_gf::compare_results .................................... Passed 0.06 sec
Start 124: testcorenrn_kin::compare_results
138/185 Test #124: testcorenrn_kin::compare_results ................................... Passed 0.17 sec
Start 132: testcorenrn_patstim::compare_results
139/185 Test #132: testcorenrn_patstim::compare_results ............................... Passed 0.07 sec
Start 142: testcorenrn_vecplay::compare_results
140/185 Test #142: testcorenrn_vecplay::compare_results ............................... Passed 0.05 sec
Start 172: channel_benchmark_hippo::compare_results
141/185 Test #172: channel_benchmark_hippo::compare_results ........................... Passed 0.05 sec
Start 178: channel_benchmark_sscx::compare_results
142/185 Test #178: channel_benchmark_sscx::compare_results ............................ Passed 0.05 sec
143/185 Test #160: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 59.87 sec
Start 34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
Start 162: testcorenrn_watch::compare_results
144/185 Test #162: testcorenrn_watch::compare_results ................................. Passed 0.13 sec
145/185 Test #163: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 68.69 sec
Start 35: coreneuron_modtests::inputpresyn_py_cpu
146/185 Test #49: coreneuron_modtests::test_natrans_py_gpu ........................... Passed 51.61 sec
Start 50: coreneuron_modtests::spikes_mpi_py_gpu
147/185 Test #164: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 53.37 sec
Start 51: coreneuron_modtests::spikes_mpi_file_mode_py_gpu
148/185 Test #46: coreneuron_modtests::test_watchrange_py_gpu ........................ Passed 66.77 sec
Start 165: testcorenrn_netstimdirect::compare_results
149/185 Test #165: testcorenrn_netstimdirect::compare_results ......................... Passed 0.07 sec
150/185 Test #48: coreneuron_modtests::test_ba_py_gpu ................................ Passed 65.71 sec
Start 52: coreneuron_modtests::inputpresyn_py_gpu
151/185 Test #47: coreneuron_modtests::test_psolve_py_gpu ............................ Passed 66.88 sec
152/185 Test #33: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 61.77 sec
153/185 Test #34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 55.30 sec
Start 147: testcorenrn_vecevent::coreneuron_gpu_offline::preparation
154/185 Test #147: testcorenrn_vecevent::coreneuron_gpu_offline::preparation .......... Passed 2.76 sec
Start 151: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
155/185 Test #151: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 2.74 sec
Start 180: olfactory-bulb-3d::neuron::preparation
156/185 Test #180: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.05 sec
Start 182: olfactory-bulb-3d::coreneuron_gpu_online::preparation
157/185 Test #182: olfactory-bulb-3d::coreneuron_gpu_online::preparation .............. Passed 0.05 sec
Start 184: olfactory-bulb-3d::coreneuron_cpu_online::preparation
158/185 Test #184: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.03 sec
Start 58: reduced_dentate::neuron
159/185 Test #57: external_nrntest ................................................... Passed 88.89 sec
160/185 Test #58: reduced_dentate::neuron ............................................ Passed 12.36 sec
Start 59: reduced_dentate::coreneuron_cpu
161/185 Test #35: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 65.72 sec
Start 60: reduced_dentate::coreneuron_gpu
162/185 Test #51: coreneuron_modtests::spikes_mpi_file_mode_py_gpu ................... Passed 44.96 sec
163/185 Test #50: coreneuron_modtests::spikes_mpi_py_gpu ............................. Passed 47.07 sec
Start 143: testcorenrn_vecevent::neuron
164/185 Test #143: testcorenrn_vecevent::neuron ....................................... Passed 5.73 sec
Start 144: testcorenrn_vecevent::coreneuron_gpu_online
165/185 Test #52: coreneuron_modtests::inputpresyn_py_gpu ............................ Passed 47.41 sec
166/185 Test #144: testcorenrn_vecevent::coreneuron_gpu_online ........................ Passed 73.73 sec
Start 68: external_ringtest::coreneuron_cpu_mpi_threads
167/185 Test #59: reduced_dentate::coreneuron_cpu .................................... Passed 82.46 sec
Start 145: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate
168/185 Test #60: reduced_dentate::coreneuron_gpu .................................... Passed 80.98 sec
Start 146: testcorenrn_vecevent::coreneuron_gpu_offline
169/185 Test #68: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 59.61 sec
Start 73: external_ringtest::coreneuron_gpu_mpi_threads
170/185 Test #146: testcorenrn_vecevent::coreneuron_gpu_offline ....................... Passed 58.76 sec
Start 148: testcorenrn_vecevent::coreneuron_cpu_online
171/185 Test #145: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate ....... Passed 70.05 sec
Start 149: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
172/185 Test #28: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 507.67 sec
Start 61: reduced_dentate::compare_results
173/185 Test #61: reduced_dentate::compare_results ................................... Passed 0.21 sec
174/185 Test #45: coreneuron_modtests::test_pointer_py_gpu ........................... Passed 333.59 sec
175/185 Test #148: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 58.55 sec
Start 150: testcorenrn_vecevent::coreneuron_cpu_offline
176/185 Test #149: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 48.08 sec
Start 179: olfactory-bulb-3d::neuron
177/185 Test #73: external_ringtest::coreneuron_gpu_mpi_threads ...................... Passed 59.52 sec
Start 181: olfactory-bulb-3d::coreneuron_gpu_online
Start 183: olfactory-bulb-3d::coreneuron_cpu_online
178/185 Test #150: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 35.35 sec
Start 74: external_ringtest::compare_results
Start 152: testcorenrn_vecevent::compare_results
179/185 Test #152: testcorenrn_vecevent::compare_results .............................. Passed 0.06 sec
180/185 Test #74: external_ringtest::compare_results ................................. Passed 0.07 sec
181/185 Test #179: olfactory-bulb-3d::neuron .......................................... Passed 112.11 sec
182/185 Test #181: olfactory-bulb-3d::coreneuron_gpu_online ........................... Passed 141.13 sec
183/185 Test #183: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 141.56 sec
Start 166: tqperf::coreneuron
184/185 Test #166: tqperf::coreneuron ................................................. Passed 30.18 sec
Start 185: olfactory-bulb-3d::compare_results
185/185 Test #185: olfactory-bulb-3d::compare_results ................................. Passed 0.23 sec
100% tests passed, 0 tests failed out of 185
Total Test time (real) = 1204.74 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655888928:step_script section_start:1655888928:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=30730 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272596 responseStatus=201 Created token=zSpZEv_B
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=30783 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272596 responseStatus=201 Created token=zSpZEv_B
section_end:1655888930:upload_artifacts_on_success section_start:1655888930:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655888931:cleanup_file_variables Job succeeded
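The test stage echoed above reduces to a short shell sequence; the following is a minimal sketch reconstructed only from the `$` command echoes in this log, not a verbatim copy of the job script. Variables such as SPACK_ROOT, SPACK_BUILD_DIR, SPACK_FULL_SPEC and CI_PROJECT_DIR are supplied by the pipeline environment, and cmake2junit comes from the unit-test-translator module loaded just before it is used.

# Minimal sketch of the ctest step, assuming the pipeline exports
# SPACK_ROOT, SPACK_BUILD_DIR, SPACK_FULL_SPEC and CI_PROJECT_DIR.
. ${SPACK_ROOT}/share/spack/setup-env.sh
cd ${SPACK_BUILD_DIR}
export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}   # let ctest run as many tests in parallel as there are slurm tasks
export BOOST_TEST_COLOR_OUTPUT=no                     # keep the log free of ANSI colour codes
i_am_a_failure=0
# Run ctest inside the package's spack build environment; remember a failure
# instead of aborting, so the result files below are still produced.
spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
cp -r Testing/ ${CI_PROJECT_DIR}/                     # raw CTest/CDash output, kept as a job artifact
module load unstable unit-test-translator             # provides cmake2junit
cmake2junit > ${CI_PROJECT_DIR}/ctest.xml             # JUnit report for GitLab's test tab
exit ${i_am_a_failure}                                # only now report success or failure to the runner

Deferring the exit status this way is what allows the Testing/ directory and ctest.xml to be uploaded even when individual tests fail.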
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655888437:resolve_secrets Resolving secrets
section_end:1655888437:resolve_secrets section_start:1655888437:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor176755817, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272603
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272603_PROD_P112_CP5_C3
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 563992
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J272603_PROD_P112_CP5_C3 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=16 --jobid=563992 --cpus-per-task=1 --mem=76G
section_end:1655888438:prepare_executor section_start:1655888438:prepare_script Preparing environment
Running on r4i6n4 via bbpv1.epfl.ch...
section_end:1655888441:prepare_script section_start:1655888441:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655888442:get_sources section_start:1655888442:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:nmodl:intel (272581)...
Runtime platform  arch=amd64 os=linux pid=8432 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272581 responseStatus=200 OK token=ZLXYD-d_
section_end:1655888443:download_artifacts section_start:1655888443:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i6n30
Build name: Linux-icpc
Create new tag: 20220622-0901 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272581/spack-build/spack-stage-neuron-develop-jhl3fvxgrrds5w2v5yyiqgn3n4pxwftn/spack-build-jhl3fvx
Start 49: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 56: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 62: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 68: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 74: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 80: testcorenrn_kin::coreneuron_cpu_offline::preparation
Start 85: testcorenrn_patstim::coreneuron_cpu_offline::preparation
Start 91: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
Start 97: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
1/120 Test #62: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 2.14 sec
Start 44: external_ringtest::neuron
2/120 Test #56: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 2.19 sec
Start 52: testcorenrn_bbcore::neuron
3/120 Test #68: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 2.73 sec
Start 53: testcorenrn_bbcore::coreneuron_cpu_online
4/120 Test #80: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 2.89 sec
Start 54: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
5/120 Test #91: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 3.85 sec
Start 103: testcorenrn_watch::coreneuron_cpu_offline::preparation
6/120 Test #74: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 3.96 sec
Start 45: external_ringtest::neuron_mpi
7/120 Test #85: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 4.21 sec
Start 46: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
8/120 Test #97: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 4.44 sec
Start 117: olfactory-bulb-3d::neuron::preparation
9/120 Test #117: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.03 sec
Start 119: olfactory-bulb-3d::coreneuron_cpu_online::preparation
10/120 Test #49: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 4.50 sec
Start 47: external_ringtest::coreneuron_cpu_mpi
11/120 Test #119: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.06 sec
Start 41: reduced_dentate::neuron
12/120 Test #52: testcorenrn_bbcore::neuron ......................................... Passed 2.51 sec
Start 55: testcorenrn_bbcore::coreneuron_cpu_offline
13/120 Test #44: external_ringtest::neuron .......................................... Passed 2.68 sec
Start 58: testcorenrn_conc::neuron
14/120 Test #53: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 2.49 sec
Start 59: testcorenrn_conc::coreneuron_cpu_online
15/120 Test #54: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 2.45 sec
Start 60: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
16/120 Test #103: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 1.79 sec
Start 48: external_ringtest::coreneuron_cpu_mpi_offline
17/120 Test #55: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 2.12 sec
Start 61: testcorenrn_conc::coreneuron_cpu_offline
18/120 Test #58: testcorenrn_conc::neuron ........................................... Passed 3.01 sec
Start 64: testcorenrn_deriv::neuron
19/120 Test #61: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 1.86 sec
Start 65: testcorenrn_deriv::coreneuron_cpu_online
20/120 Test #45: external_ringtest::neuron_mpi ...................................... Passed 5.26 sec
Start 66: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
Start 67: testcorenrn_deriv::coreneuron_cpu_offline
21/120 Test #59: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 4.45 sec
Start 76: testcorenrn_kin::neuron
22/120 Test #64: testcorenrn_deriv::neuron .......................................... Passed 1.90 sec
Start 77: testcorenrn_kin::coreneuron_cpu_online
23/120 Test #60: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 4.59 sec
Start 78: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
24/120 Test #67: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 1.66 sec
Start 79: testcorenrn_kin::coreneuron_cpu_offline
25/120 Test #65: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 2.48 sec
Start 108: channel_benchmark_hippo::neuron
26/120 Test #76: testcorenrn_kin::neuron ............................................ Passed 1.88 sec
Start 109: channel_benchmark_hippo::coreneuron_cpu_online
27/120 Test #66: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 3.60 sec
Start 110: channel_benchmark_hippo::coreneuron_cpu_filemode
28/120 Test #79: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 4.04 sec
Start 112: channel_benchmark_sscx::neuron
29/120 Test #48: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 9.38 sec
Start 70: testcorenrn_gf::neuron
30/120 Test #78: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 5.26 sec
Start 113: channel_benchmark_sscx::coreneuron_cpu_online
31/120 Test #77: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 5.60 sec
Start 114: channel_benchmark_sscx::coreneuron_cpu_filemode
32/120 Test #47: external_ringtest::coreneuron_cpu_mpi .............................. Passed 11.50 sec
Start 71: testcorenrn_gf::coreneuron_cpu_online
33/120 Test #70: testcorenrn_gf::neuron ............................................. Passed 8.34 sec
Start 72: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
34/120 Test #46: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 19.21 sec
Start 73: testcorenrn_gf::coreneuron_cpu_offline
35/120 Test #71: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 12.20 sec
Start 82: testcorenrn_patstim::neuron
36/120 Test #73: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 7.99 sec
Start 83: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
37/120 Test #82: testcorenrn_patstim::neuron ........................................ Passed 3.87 sec
Start 84: testcorenrn_patstim::coreneuron_cpu_offline
38/120 Test #84: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 4.68 sec
Start 87: testcorenrn_vecplay::neuron
39/120 Test #87: testcorenrn_vecplay::neuron ........................................ Passed 3.53 sec
Start 88: testcorenrn_vecplay::coreneuron_cpu_online
40/120 Test #83: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 9.65 sec
Start 89: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
41/120 Test #72: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 20.54 sec
Start 90: testcorenrn_vecplay::coreneuron_cpu_offline
42/120 Test #90: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 3.31 sec
Start 99: testcorenrn_watch::neuron
43/120 Test #88: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 7.19 sec
Start 100: testcorenrn_watch::coreneuron_cpu_online
44/120 Test #99: testcorenrn_watch::neuron .......................................... Passed 3.74 sec
Start 101: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
45/120 Test #89: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 11.40 sec
Start 102: testcorenrn_watch::coreneuron_cpu_offline
46/120 Test #100: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 7.20 sec
Start 105: testcorenrn_netstimdirect::direct_netstimdirect
47/120 Test #102: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 3.01 sec
Start 106: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
48/120 Test #105: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 7.49 sec
Start 1: testneuron
Start 2: ringtest
49/120 Test #1: testneuron ......................................................... Passed 0.09 sec
Start 3: connect_dend
50/120 Test #3: connect_dend ....................................................... Passed 0.16 sec
Start 4: mpi_init::nrniv_mpiopt
51/120 Test #2: ringtest ........................................................... Passed 0.29 sec
Start 5: mpi_init::nrniv_nrnmpi_init
52/120 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 0.43 sec
Start 6: mpi_init::python_nrnmpi_init
53/120 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.77 sec
Start 7: mpi_init::python_mpienv
54/120 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 0.96 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
55/120 Test #101: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 12.97 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
Start 10: mpi_init::python_mpiexec_nrnmpi_init
56/120 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.59 sec
Start 11: mpi_init::python_mpiexec_mpienv
57/120 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 1.77 sec
Start 12: pynrn::basic_tests
58/120 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 2.14 sec
Start 13: coverage_tests::cover_tests
59/120 Test #106: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 10.88 sec
Start 14: parallel_tests
Start 15: parallel_partrans
60/120 Test #7: mpi_init::python_mpienv ............................................ Passed 3.41 sec
Start 16: parallel_netpar
61/120 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 3.95 sec
Start 17: parallel_bas
62/120 Test #15: parallel_partrans .................................................. Passed 6.81 sec
Start 18: coreneuron_modtests::version_macros
63/120 Test #16: parallel_netpar .................................................... Passed 6.75 sec
Start 19: coreneuron_modtests::fornetcon_py_cpu
64/120 Test #13: coverage_tests::cover_tests ........................................ Passed 7.76 sec
Start 20: coreneuron_modtests::direct_py_cpu
65/120 Test #12: pynrn::basic_tests ................................................. Passed 11.71 sec
Start 21: coreneuron_modtests::direct_hoc_cpu
66/120 Test #14: parallel_tests ..................................................... Passed 11.69 sec
Start 22: coreneuron_modtests::spikes_py_cpu
67/120 Test #21: coreneuron_modtests::direct_hoc_cpu ................................ Passed 12.84 sec
Start 23: coreneuron_modtests::spikes_file_mode_py_cpu
68/120 Test #17: parallel_bas ....................................................... Passed 25.60 sec
Start 24: coreneuron_modtests::fast_imem_py_cpu
69/120 Test #18: coreneuron_modtests::version_macros ................................ Passed 73.30 sec
Start 25: coreneuron_modtests::datareturn_py_cpu
70/120 Test #19: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 74.30 sec
Start 26: coreneuron_modtests::test_units_py_cpu
71/120 Test #20: coreneuron_modtests::direct_py_cpu ................................. Passed 75.73 sec
Start 27: coreneuron_modtests::test_netmove_py_cpu
72/120 Test #41: reduced_dentate::neuron ............................................ Passed 148.45 sec
Start 42: reduced_dentate::coreneuron_cpu
73/120 Test #22: coreneuron_modtests::spikes_py_cpu ................................. Passed 75.10 sec
Start 28: coreneuron_modtests::test_pointer_py_cpu
74/120 Test #23: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 76.85 sec
Start 29: coreneuron_modtests::test_watchrange_py_cpu
75/120 Test #24: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 79.24 sec
Start 30: coreneuron_modtests::test_psolve_py_cpu
76/120 Test #113: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 160.29 sec
Start 31: coreneuron_modtests::test_ba_py_cpu
77/120 Test #114: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 162.14 sec
Start 32: coreneuron_modtests::test_natrans_py_cpu
78/120 Test #109: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 167.61 sec
Start 36: modlunit_unitstest
79/120 Test #36: modlunit_unitstest ................................................. Passed 0.03 sec
Start 37: modlunit_hh
80/120 Test #37: modlunit_hh ........................................................ Passed 0.04 sec
Start 38: modlunit_stim
81/120 Test #38: modlunit_stim ...................................................... Passed 0.01 sec
Start 39: modlunit_pattern
82/120 Test #39: modlunit_pattern ................................................... Passed 0.01 sec
Start 40: external_nrntest
83/120 Test #112: channel_benchmark_sscx::neuron ..................................... Passed 164.34 sec
Start 57: testcorenrn_bbcore::compare_results
84/120 Test #57: testcorenrn_bbcore::compare_results ................................ Passed 0.27 sec
Start 63: testcorenrn_conc::compare_results
85/120 Test #63: testcorenrn_conc::compare_results .................................. Passed 0.24 sec
Start 69: testcorenrn_deriv::compare_results
86/120 Test #69: testcorenrn_deriv::compare_results ................................. Passed 0.08 sec
Start 75: testcorenrn_gf::compare_results
87/120 Test #110: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 167.16 sec
Start 81: testcorenrn_kin::compare_results
88/120 Test #75: testcorenrn_gf::compare_results .................................... Passed 0.14 sec
Start 86: testcorenrn_patstim::compare_results
89/120 Test #81: testcorenrn_kin::compare_results ................................... Passed 0.24 sec
Start 92: testcorenrn_vecplay::compare_results
90/120 Test #108: channel_benchmark_hippo::neuron .................................... Passed 169.04 sec
Start 104: testcorenrn_watch::compare_results
91/120 Test #86: testcorenrn_patstim::compare_results ............................... Passed 0.41 sec
Start 107: testcorenrn_netstimdirect::compare_results
92/120 Test #107: testcorenrn_netstimdirect::compare_results ......................... Passed 0.86 sec
Start 111: channel_benchmark_hippo::compare_results
93/120 Test #104: testcorenrn_watch::compare_results ................................. Passed 1.36 sec
Start 115: channel_benchmark_sscx::compare_results
94/120 Test #111: channel_benchmark_hippo::compare_results ........................... Passed 1.02 sec
95/120 Test #92: testcorenrn_vecplay::compare_results ............................... Passed 2.53 sec
Start 33: coreneuron_modtests::spikes_mpi_py_cpu
96/120 Test #115: channel_benchmark_sscx::compare_results ............................ Passed 1.19 sec
97/120 Test #33: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 5.63 sec
Start 34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
98/120 Test #26: coreneuron_modtests::test_units_py_cpu ............................. Passed 42.22 sec
Start 35: coreneuron_modtests::inputpresyn_py_cpu
99/120 Test #27: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 44.02 sec
100/120 Test #25: coreneuron_modtests::datareturn_py_cpu ............................. Passed 47.61 sec
101/120 Test #34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 6.32 sec
Start 93: testcorenrn_vecevent::neuron
102/120 Test #35: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 5.13 sec
103/120 Test #29: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 28.03 sec
104/120 Test #31: coreneuron_modtests::test_ba_py_cpu ................................ Passed 20.05 sec
Start 94: testcorenrn_vecevent::coreneuron_cpu_online
105/120 Test #30: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 21.25 sec
106/120 Test #32: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 18.24 sec
107/120 Test #28: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 42.99 sec
108/120 Test #93: testcorenrn_vecevent::neuron ....................................... Passed 3.39 sec
Start 50: external_ringtest::coreneuron_cpu_mpi_threads
109/120 Test #94: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 5.84 sec
Start 95: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
110/120 Test #50: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 3.93 sec
Start 96: testcorenrn_vecevent::coreneuron_cpu_offline
Start 51: external_ringtest::compare_results
111/120 Test #51: external_ringtest::compare_results ................................. Passed 0.63 sec
112/120 Test #96: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 4.13 sec
Start 116: olfactory-bulb-3d::neuron
113/120 Test #95: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 5.84 sec
Start 118: olfactory-bulb-3d::coreneuron_cpu_online
Start 98: testcorenrn_vecevent::compare_results
114/120 Test #98: testcorenrn_vecevent::compare_results .............................. Passed 0.85 sec
115/120 Test #42: reduced_dentate::coreneuron_cpu .................................... Passed 56.37 sec
Start 43: reduced_dentate::compare_results
116/120 Test #43: reduced_dentate::compare_results ................................... Passed 0.15 sec
117/120 Test #40: external_nrntest ................................................... Passed 120.46 sec
118/120 Test #118: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 119.57 sec
119/120 Test #116: olfactory-bulb-3d::neuron .......................................... Passed 123.43 sec
Start 120: olfactory-bulb-3d::compare_results
120/120 Test #120: olfactory-bulb-3d::compare_results ................................. Passed 0.17 sec
100% tests passed, 0 tests failed out of 120
Total Test time (real) = 329.92 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655888800:step_script section_start:1655888800:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=15923 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272603 responseStatus=201 Created token=64KrnGWZ
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=15981 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272603 responseStatus=201 Created token=64KrnGWZ
section_end:1655888801:upload_artifacts_on_success section_start:1655888801:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655888802:cleanup_file_variables Job succeeded
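Each of these jobs is scheduled by the BB5 custom executor with the same two-step slurm pattern visible in the sbatch/srun echoes above: a batch job that merely sleeps reserves the node resources, and the CI steps are then launched into that existing allocation with srun --jobid. A minimal sketch follows, using the sizes of the CPU job above; BUILD_DIR and the awk parse of the sbatch output are illustrative stand-ins, not commands taken from this log.

# Minimal sketch of the allocation pattern used by the custom executor.
# "Submitted batch job <id>" is printed by sbatch; the 4th field is the job id.
jobid=$(sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G \
        -C cpu --no-requeue --time=1:00:00 --wrap="sleep infinity" | awk '{print $4}')
# Every subsequent CI step runs inside that reservation instead of
# submitting a new batch job each time:
srun --mpi=none --jobid=${jobid} --ntasks=16 --cpus-per-task=1 --mem=76G \
     --chdir=${BUILD_DIR} <step command>
# The GPU jobs in this pipeline differ only in the constraint (-C volta instead of -C cpu).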
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655889532:resolve_secrets Resolving secrets
section_end:1655889532:resolve_secrets section_start:1655889532:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor343096201, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272600
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272600_PROD_P112_CP0_C0
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 564017
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J272600_PROD_P112_CP0_C0 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=16 --jobid=564017 --cpus-per-task=1 --mem=76G
section_end:1655889533:prepare_executor section_start:1655889533:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1655889540:prepare_script section_start:1655889540:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655889541:get_sources section_start:1655889541:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:nmodl:nvhpc:acc (272577)...
Runtime platform  arch=amd64 os=linux pid=33327 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272577 responseStatus=200 OK token=BqwBC8nC
section_end:1655889542:download_artifacts section_start:1655889542:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i6n30
Build name: Linux-nvc++
Create new tag: 20220622-0919 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272577/spack-build/spack-stage-neuron-develop-f6nikkkl3jokiiz56xke2nvdgfsrjose/spack-build-f6nikkk
Start 67: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 72: external_ringtest::coreneuron_gpu_mpi_offline::preparation
Start 79: testcorenrn_bbcore::coreneuron_gpu_offline::preparation
Start 83: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 89: testcorenrn_conc::coreneuron_gpu_offline::preparation
Start 93: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 99: testcorenrn_deriv::coreneuron_gpu_offline::preparation
Start 103: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 109: testcorenrn_gf::coreneuron_gpu_offline::preparation
Start 113: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 119: testcorenrn_kin::coreneuron_gpu_offline::preparation
Start 123: testcorenrn_kin::coreneuron_cpu_offline::preparation
1/184 Test #79: testcorenrn_bbcore::coreneuron_gpu_offline::preparation ............ Passed 1.96 sec
Start 128: testcorenrn_patstim::coreneuron_gpu_offline::preparation
2/184 Test #83: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 2.01 sec
Start 62: external_ringtest::neuron
3/184 Test #119: testcorenrn_kin::coreneuron_gpu_offline::preparation ............... Passed 3.50 sec
Start 75: testcorenrn_bbcore::neuron
4/184 Test #89: testcorenrn_conc::coreneuron_gpu_offline::preparation .............. Passed 3.66 sec
Start 76: testcorenrn_bbcore::coreneuron_gpu_online
5/184 Test #93: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 3.75 sec
Start 77: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate
6/184 Test #109: testcorenrn_gf::coreneuron_gpu_offline::preparation ................ Passed 3.79 sec
Start 131: testcorenrn_patstim::coreneuron_cpu_offline::preparation
7/184 Test #123: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 3.90 sec
Start 78: testcorenrn_bbcore::coreneuron_gpu_offline
8/184 Test #103: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 4.00 sec
Start 80: testcorenrn_bbcore::coreneuron_cpu_online
9/184 Test #99: testcorenrn_deriv::coreneuron_gpu_offline::preparation ............. Passed 4.05 sec
Start 81: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
10/184 Test #113: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 4.28 sec
Start 137: testcorenrn_vecplay::coreneuron_gpu_offline::preparation
11/184 Test #62: external_ringtest::neuron .......................................... Passed 2.37 sec
Start 82: testcorenrn_bbcore::coreneuron_cpu_offline
12/184 Test #128: testcorenrn_patstim::coreneuron_gpu_offline::preparation ........... Passed 2.82 sec
Start 85: testcorenrn_conc::neuron
13/184 Test #75: testcorenrn_bbcore::neuron ......................................... Passed 3.15 sec
Start 86: testcorenrn_conc::coreneuron_gpu_online
14/184 Test #131: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 4.06 sec
Start 141: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
15/184 Test #85: testcorenrn_conc::neuron ........................................... Passed 3.29 sec
Start 87: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate
16/184 Test #137: testcorenrn_vecplay::coreneuron_gpu_offline::preparation ........... Passed 4.09 sec
Start 157: testcorenrn_watch::coreneuron_gpu_offline::preparation
17/184 Test #82: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 7.11 sec
Start 88: testcorenrn_conc::coreneuron_gpu_offline
18/184 Test #72: external_ringtest::coreneuron_gpu_mpi_offline::preparation ......... Passed 14.08 sec
Start 161: testcorenrn_watch::coreneuron_cpu_offline::preparation
19/184 Test #78: testcorenrn_bbcore::coreneuron_gpu_offline ......................... Passed 11.67 sec
Start 90: testcorenrn_conc::coreneuron_cpu_online
20/184 Test #76: testcorenrn_bbcore::coreneuron_gpu_online .......................... Passed 12.88 sec
Start 91: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
21/184 Test #80: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 14.73 sec
Start 92: testcorenrn_conc::coreneuron_cpu_offline
22/184 Test #81: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 14.95 sec
Start 95: testcorenrn_deriv::neuron
23/184 Test #67: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 19.33 sec
Start 63: external_ringtest::neuron_mpi
24/184 Test #141: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 11.55 sec
Start 64: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
25/184 Test #157: testcorenrn_watch::coreneuron_gpu_offline::preparation ............. Passed 11.58 sec
Start 65: external_ringtest::coreneuron_cpu_mpi
26/184 Test #161: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 8.54 sec
Start 66: external_ringtest::coreneuron_cpu_mpi_offline
27/184 Test #95: testcorenrn_deriv::neuron .......................................... Passed 4.39 sec
Start 96: testcorenrn_deriv::coreneuron_gpu_online
28/184 Test #77: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate ......... Passed 22.46 sec
Start 97: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate
29/184 Test #63: external_ringtest::neuron_mpi ...................................... Passed 7.19 sec
Start 69: external_ringtest::coreneuron_gpu_mpi_offline_saverestore
30/184 Test #88: testcorenrn_conc::coreneuron_gpu_offline ........................... Passed 27.75 sec
Start 98: testcorenrn_deriv::coreneuron_gpu_offline
31/184 Test #86: testcorenrn_conc::coreneuron_gpu_online ............................ Passed 34.56 sec
Start 100: testcorenrn_deriv::coreneuron_cpu_online
32/184 Test #87: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate ........... Passed 33.27 sec
Start 101: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
33/184 Test #92: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 24.91 sec
Start 102: testcorenrn_deriv::coreneuron_cpu_offline
34/184 Test #90: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 28.06 sec
Start 115: testcorenrn_kin::neuron
35/184 Test #91: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 27.17 sec
Start 116: testcorenrn_kin::coreneuron_gpu_online
36/184 Test #115: testcorenrn_kin::neuron ............................................ Passed 1.51 sec
Start 117: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate
37/184 Test #66: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 27.04 sec
Start 70: external_ringtest::coreneuron_gpu_mpi
38/184 Test #65: external_ringtest::coreneuron_cpu_mpi .............................. Passed 39.60 sec
Start 71: external_ringtest::coreneuron_gpu_mpi_offline
39/184 Test #96: testcorenrn_deriv::coreneuron_gpu_online ........................... Passed 47.45 sec
Start 118: testcorenrn_kin::coreneuron_gpu_offline
40/184 Test #100: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 29.70 sec
Start 120: testcorenrn_kin::coreneuron_cpu_online
41/184 Test #101: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 29.68 sec
Start 121: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
42/184 Test #98: testcorenrn_deriv::coreneuron_gpu_offline .......................... Passed 31.25 sec
Start 122: testcorenrn_kin::coreneuron_cpu_offline
43/184 Test #97: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate .......... Passed 44.96 sec
Start 126: testcorenrn_patstim::coreneuron_gpu_offline_saverestore
44/184 Test #102: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 31.44 sec
Start 127: testcorenrn_patstim::coreneuron_gpu_offline
45/184 Test #116: testcorenrn_kin::coreneuron_gpu_online ............................. Passed 31.66 sec
Start 166: channel_benchmark_hippo::neuron
46/184 Test #117: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate ............ Passed 48.10 sec
Start 167: channel_benchmark_hippo::coreneuron_gpu_online
47/184 Test #70: external_ringtest::coreneuron_gpu_mpi .............................. Passed 43.74 sec
Start 105: testcorenrn_gf::neuron
48/184 Test #105: testcorenrn_gf::neuron ............................................. Passed 4.15 sec
Start 106: testcorenrn_gf::coreneuron_gpu_online
49/184 Test #71: external_ringtest::coreneuron_gpu_mpi_offline ...................... Passed 62.26 sec
Start 107: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate
50/184 Test #120: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 54.99 sec
Start 168: channel_benchmark_hippo::coreneuron_gpu_filemode
51/184 Test #121: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 61.40 sec
Start 169: channel_benchmark_hippo::coreneuron_cpu_online
52/184 Test #166: channel_benchmark_hippo::neuron .................................... Passed 59.60 sec
Start 170: channel_benchmark_hippo::coreneuron_cpu_filemode
53/184 Test #118: testcorenrn_kin::coreneuron_gpu_offline ............................ Passed 65.44 sec
Start 172: channel_benchmark_sscx::neuron
54/184 Test #122: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 65.73 sec
Start 173: channel_benchmark_sscx::coreneuron_gpu_online
55/184 Test #127: testcorenrn_patstim::coreneuron_gpu_offline ........................ Passed 75.94 sec
Start 174: channel_benchmark_sscx::coreneuron_gpu_filemode
56/184 Test #106: testcorenrn_gf::coreneuron_gpu_online .............................. Passed 91.01 sec
Start 108: testcorenrn_gf::coreneuron_gpu_offline
57/184 Test #126: testcorenrn_patstim::coreneuron_gpu_offline_saverestore ............ Passed 144.63 sec
Start 175: channel_benchmark_sscx::coreneuron_cpu_online
58/184 Test #107: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate ............. Passed 119.22 sec
Start 110: testcorenrn_gf::coreneuron_cpu_online
59/184 Test #172: channel_benchmark_sscx::neuron ..................................... Passed 120.03 sec
Start 176: channel_benchmark_sscx::coreneuron_cpu_filemode
60/184 Test #64: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 245.52 sec
Start 111: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
61/184 Test #69: external_ringtest::coreneuron_gpu_mpi_offline_saverestore .......... Passed 238.93 sec
Start 112: testcorenrn_gf::coreneuron_cpu_offline
62/184 Test #167: channel_benchmark_hippo::coreneuron_gpu_online ..................... Passed 177.82 sec
Start 1: testneuron
63/184 Test #1: testneuron ......................................................... Passed 0.06 sec
Start 2: ringtest
64/184 Test #2: ringtest ........................................................... Passed 0.29 sec
Start 3: connect_dend
65/184 Test #3: connect_dend ....................................................... Passed 0.14 sec
Start 4: mpi_init::nrniv_mpiopt
66/184 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.40 sec
Start 5: mpi_init::nrniv_nrnmpi_init
67/184 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 0.15 sec
Start 6: mpi_init::python_nrnmpi_init
68/184 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 0.63 sec
Start 7: mpi_init::python_mpienv
69/184 Test #7: mpi_init::python_mpienv ............................................ Passed 1.47 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
70/184 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 1.57 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
71/184 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.39 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
72/184 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 1.84 sec
Start 11: mpi_init::python_mpiexec_mpienv
73/184 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 3.08 sec
Start 12: pynrn::basic_tests
74/184 Test #108: testcorenrn_gf::coreneuron_gpu_offline ............................. Passed 94.97 sec
Start 125: testcorenrn_patstim::neuron
75/184 Test #125: testcorenrn_patstim::neuron ........................................ Passed 1.93 sec
Start 129: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
76/184 Test #12: pynrn::basic_tests ................................................. Passed 4.59 sec
Start 13: coverage_tests::cover_tests
77/184 Test #13: coverage_tests::cover_tests ........................................ Passed 2.13 sec
Start 14: parallel_tests
78/184 Test #14: parallel_tests ..................................................... Passed 3.02 sec
Start 15: parallel_partrans
79/184 Test #15: parallel_partrans .................................................. Passed 1.72 sec
Start 16: parallel_netpar
80/184 Test #16: parallel_netpar .................................................... Passed 1.89 sec
Start 17: parallel_bas
81/184 Test #17: parallel_bas ....................................................... Passed 2.67 sec
Start 18: coreneuron_modtests::version_macros
82/184 Test #168: channel_benchmark_hippo::coreneuron_gpu_filemode ................... Passed 191.54 sec
Start 19: coreneuron_modtests::fornetcon_py_cpu
83/184 Test #169: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 185.60 sec
Start 20: coreneuron_modtests::direct_py_cpu
84/184 Test #173: channel_benchmark_sscx::coreneuron_gpu_online ...................... Passed 181.47 sec
Start 21: coreneuron_modtests::direct_hoc_cpu
85/184 Test #170: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 193.14 sec
Start 22: coreneuron_modtests::spikes_py_cpu
86/184 Test #110: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 87.24 sec
Start 130: testcorenrn_patstim::coreneuron_cpu_offline
87/184 Test #112: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 63.01 sec
Start 133: testcorenrn_vecplay::neuron
88/184 Test #133: testcorenrn_vecplay::neuron ........................................ Passed 2.37 sec
Start 134: testcorenrn_vecplay::coreneuron_gpu_online
89/184 Test #111: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 74.85 sec
Start 135: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate
90/184 Test #18: coreneuron_modtests::version_macros ................................ Passed 52.96 sec
Start 23: coreneuron_modtests::spikes_file_mode_py_cpu
91/184 Test #175: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 135.58 sec
Start 24: coreneuron_modtests::fast_imem_py_cpu
92/184 Test #174: channel_benchmark_sscx::coreneuron_gpu_filemode .................... Passed 200.66 sec
Start 25: coreneuron_modtests::datareturn_py_cpu
93/184 Test #24: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 0.43 sec
Start 26: coreneuron_modtests::test_units_py_cpu
94/184 Test #176: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 99.80 sec
Start 27: coreneuron_modtests::test_netmove_py_cpu
95/184 Test #20: coreneuron_modtests::direct_py_cpu ................................. Passed 44.45 sec
Start 28: coreneuron_modtests::test_pointer_py_cpu
96/184 Test #19: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 45.07 sec
Start 29: coreneuron_modtests::test_watchrange_py_cpu
97/184 Test #21: coreneuron_modtests::direct_hoc_cpu ................................ Passed 55.25 sec
Start 30: coreneuron_modtests::test_psolve_py_cpu
98/184 Test #22: coreneuron_modtests::spikes_py_cpu ................................. Passed 46.56 sec
Start 31: coreneuron_modtests::test_ba_py_cpu
99/184 Test #130: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 54.90 sec
Start 136: testcorenrn_vecplay::coreneuron_gpu_offline
100/184 Test #134: testcorenrn_vecplay::coreneuron_gpu_online ......................... Passed 64.34 sec
Start 138: testcorenrn_vecplay::coreneuron_cpu_online
101/184 Test #26: coreneuron_modtests::test_units_py_cpu ............................. Passed 43.35 sec
Start 32: coreneuron_modtests::test_natrans_py_cpu
102/184 Test #23: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 44.07 sec
Start 36: coreneuron_modtests::fornetcon_py_gpu
103/184 Test #27: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 39.09 sec
Start 37: coreneuron_modtests::direct_py_gpu
104/184 Test #25: coreneuron_modtests::datareturn_py_cpu ............................. Passed 43.58 sec
Start 38: coreneuron_modtests::direct_hoc_gpu
105/184 Test #129: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 110.40 sec
Start 139: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
106/184 Test #29: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 35.99 sec
Start 39: coreneuron_modtests::spikes_py_gpu
107/184 Test #135: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate ........ Passed 60.85 sec
Start 140: testcorenrn_vecplay::coreneuron_cpu_offline
108/184 Test #30: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 36.42 sec
Start 40: coreneuron_modtests::spikes_file_mode_py_gpu
109/184 Test #31: coreneuron_modtests::test_ba_py_cpu ................................ Passed 42.35 sec
Start 41: coreneuron_modtests::fast_imem_py_gpu
110/184 Test #41: coreneuron_modtests::fast_imem_py_gpu .............................. Passed 0.54 sec
Start 42: coreneuron_modtests::datareturn_py_gpu
111/184 Test #136: testcorenrn_vecplay::coreneuron_gpu_offline ........................ Passed 43.72 sec
Start 153: testcorenrn_watch::neuron
112/184 Test #153: testcorenrn_watch::neuron .......................................... Passed 2.90 sec
Start 154: testcorenrn_watch::coreneuron_gpu_online
113/184 Test #32: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 38.60 sec
Start 43: coreneuron_modtests::test_units_py_gpu
114/184 Test #38: coreneuron_modtests::direct_hoc_gpu ................................ Passed 40.33 sec
Start 44: coreneuron_modtests::test_netmove_py_gpu
115/184 Test #37: coreneuron_modtests::direct_py_gpu ................................. Passed 40.63 sec
Start 45: coreneuron_modtests::test_pointer_py_gpu
116/184 Test #39: coreneuron_modtests::spikes_py_gpu ................................. Passed 37.39 sec
Start 46: coreneuron_modtests::test_watchrange_py_gpu
117/184 Test #36: coreneuron_modtests::fornetcon_py_gpu .............................. Passed 42.29 sec
Start 47: coreneuron_modtests::test_psolve_py_gpu
118/184 Test #139: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 43.58 sec
Start 155: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate
119/184 Test #140: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 38.95 sec
Start 156: testcorenrn_watch::coreneuron_gpu_offline
120/184 Test #138: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 44.61 sec
Start 158: testcorenrn_watch::coreneuron_cpu_online
121/184 Test #40: coreneuron_modtests::spikes_file_mode_py_gpu ....................... Passed 41.25 sec
Start 48: coreneuron_modtests::test_ba_py_gpu
122/184 Test #42: coreneuron_modtests::datareturn_py_gpu ............................. Passed 36.49 sec
Start 49: coreneuron_modtests::test_natrans_py_gpu
123/184 Test #43: coreneuron_modtests::test_units_py_gpu ............................. Passed 30.92 sec
Start 53: modlunit_unitstest
124/184 Test #53: modlunit_unitstest ................................................. Passed 0.02 sec
Start 54: modlunit_hh
125/184 Test #54: modlunit_hh ........................................................ Passed 0.02 sec
Start 55: modlunit_stim
126/184 Test #55: modlunit_stim ...................................................... Passed 0.01 sec
Start 56: modlunit_pattern
127/184 Test #56: modlunit_pattern ................................................... Passed 0.02 sec
Start 57: external_nrntest
128/184 Test #154: testcorenrn_watch::coreneuron_gpu_online ........................... Passed 35.16 sec
Start 159: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
129/184 Test #47: coreneuron_modtests::test_psolve_py_gpu ............................ Passed 27.94 sec
Start 84: testcorenrn_bbcore::compare_results
130/184 Test #44: coreneuron_modtests::test_netmove_py_gpu ........................... Passed 30.14 sec
Start 94: testcorenrn_conc::compare_results
131/184 Test #84: testcorenrn_bbcore::compare_results ................................ Passed 0.39 sec
Start 104: testcorenrn_deriv::compare_results
132/184 Test #104: testcorenrn_deriv::compare_results ................................. Passed 0.36 sec
Start 114: testcorenrn_gf::compare_results
133/184 Test #46: coreneuron_modtests::test_watchrange_py_gpu ........................ Passed 30.34 sec
Start 124: testcorenrn_kin::compare_results
134/184 Test #94: testcorenrn_conc::compare_results .................................. Passed 0.76 sec
Start 132: testcorenrn_patstim::compare_results
135/184 Test #114: testcorenrn_gf::compare_results .................................... Passed 0.44 sec
Start 142: testcorenrn_vecplay::compare_results
136/184 Test #124: testcorenrn_kin::compare_results ................................... Passed 0.51 sec
Start 171: channel_benchmark_hippo::compare_results
137/184 Test #171: channel_benchmark_hippo::compare_results ........................... Passed 0.21 sec
Start 177: channel_benchmark_sscx::compare_results
138/184 Test #132: testcorenrn_patstim::compare_results ............................... Passed 0.60 sec
139/184 Test #142: testcorenrn_vecplay::compare_results ............................... Passed 0.56 sec
Start 160: testcorenrn_watch::coreneuron_cpu_offline
140/184 Test #177: channel_benchmark_sscx::compare_results ............................ Passed 0.43 sec
141/184 Test #158: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 42.14 sec
Start 163: testcorenrn_netstimdirect::direct_netstimdirect
142/184 Test #156: testcorenrn_watch::coreneuron_gpu_offline .......................... Passed 42.49 sec
Start 164: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
143/184 Test #155: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate .......... Passed 43.39 sec
Start 33: coreneuron_modtests::spikes_mpi_py_cpu
144/184 Test #48: coreneuron_modtests::test_ba_py_gpu ................................ Passed 35.33 sec
Start 34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
145/184 Test #49: coreneuron_modtests::test_natrans_py_gpu ........................... Passed 33.15 sec
146/184 Test #160: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 33.39 sec
Start 35: coreneuron_modtests::inputpresyn_py_cpu
147/184 Test #159: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 40.04 sec
Start 50: coreneuron_modtests::spikes_mpi_py_gpu
Start 162: testcorenrn_watch::compare_results
148/184 Test #162: testcorenrn_watch::compare_results ................................. Passed 0.06 sec
149/184 Test #163: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 32.49 sec
Start 51: coreneuron_modtests::spikes_mpi_file_mode_py_gpu
150/184 Test #33: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 37.75 sec
Start 52: coreneuron_modtests::inputpresyn_py_gpu
151/184 Test #164: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 38.65 sec
Start 165: testcorenrn_netstimdirect::compare_results
152/184 Test #34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 34.28 sec
Start 147: testcorenrn_vecevent::coreneuron_gpu_offline::preparation
153/184 Test #165: testcorenrn_netstimdirect::compare_results ......................... Passed 0.68 sec
154/184 Test #35: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 25.76 sec
155/184 Test #147: testcorenrn_vecevent::coreneuron_gpu_offline::preparation .......... Passed 8.15 sec
Start 151: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
156/184 Test #151: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 3.12 sec
Start 179: olfactory-bulb-3d::neuron::preparation
157/184 Test #179: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.02 sec
Start 181: olfactory-bulb-3d::coreneuron_gpu_online::preparation
158/184 Test #181: olfactory-bulb-3d::coreneuron_gpu_online::preparation .............. Passed 0.02 sec
Start 183: olfactory-bulb-3d::coreneuron_cpu_online::preparation
159/184 Test #183: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.01 sec
Start 58: reduced_dentate::neuron
160/184 Test #51: coreneuron_modtests::spikes_mpi_file_mode_py_gpu ................... Passed 19.93 sec
Start 59: reduced_dentate::coreneuron_cpu
161/184 Test #50: coreneuron_modtests::spikes_mpi_py_gpu ............................. Passed 32.64 sec
162/184 Test #58: reduced_dentate::neuron ............................................ Passed 21.35 sec
Start 60: reduced_dentate::coreneuron_gpu
163/184 Test #52: coreneuron_modtests::inputpresyn_py_gpu ............................ Passed 38.10 sec
Start 143: testcorenrn_vecevent::neuron
164/184 Test #143: testcorenrn_vecevent::neuron ....................................... Passed 5.65 sec
Start 144: testcorenrn_vecevent::coreneuron_gpu_online
165/184 Test #28: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 210.76 sec
166/184 Test #59: reduced_dentate::coreneuron_cpu .................................... Passed 44.99 sec
Start 68: external_ringtest::coreneuron_cpu_mpi_threads
167/184 Test #60: reduced_dentate::coreneuron_gpu .................................... Passed 26.00 sec
Start 145: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate
168/184 Test #45: coreneuron_modtests::test_pointer_py_gpu ........................... Passed 150.40 sec
Start 61: reduced_dentate::compare_results
169/184 Test #61: reduced_dentate::compare_results ................................... Passed 1.82 sec
170/184 Test #144: testcorenrn_vecevent::coreneuron_gpu_online ........................ Passed 24.65 sec
Start 146: testcorenrn_vecevent::coreneuron_gpu_offline
171/184 Test #68: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 11.15 sec
Start 73: external_ringtest::coreneuron_gpu_mpi_threads
172/184 Test #57: external_nrntest ................................................... Passed 134.40 sec
173/184 Test #146: testcorenrn_vecevent::coreneuron_gpu_offline ....................... Passed 19.06 sec
Start 148: testcorenrn_vecevent::coreneuron_cpu_online
174/184 Test #73: external_ringtest::coreneuron_gpu_mpi_threads ...................... Passed 20.72 sec
Start 149: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
Start 150: testcorenrn_vecevent::coreneuron_cpu_offline
175/184 Test #145: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate ....... Passed 35.64 sec
Start 178: olfactory-bulb-3d::neuron
176/184 Test #148: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 32.31 sec
Start 180: olfactory-bulb-3d::coreneuron_gpu_online
177/184 Test #150: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 29.66 sec
Start 182: olfactory-bulb-3d::coreneuron_cpu_online
178/184 Test #149: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 43.58 sec
Start 74: external_ringtest::compare_results
Start 152: testcorenrn_vecevent::compare_results
179/184 Test #152: testcorenrn_vecevent::compare_results .............................. Passed 0.30 sec
180/184 Test #74: external_ringtest::compare_results ................................. Passed 0.30 sec
181/184 Test #178: olfactory-bulb-3d::neuron .......................................... Passed 120.41 sec
182/184 Test #180: olfactory-bulb-3d::coreneuron_gpu_online ........................... Passed 122.44 sec
183/184 Test #182: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 123.79 sec
Start 184: olfactory-bulb-3d::compare_results
184/184 Test #184: olfactory-bulb-3d::compare_results ................................. Passed 0.23 sec
100% tests passed, 0 tests failed out of 184
Total Test time (real) = 765.00 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655890333:step_script section_start:1655890333:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=54581 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272600 responseStatus=201 Created token=jyxMPny-
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=54672 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272600 responseStatus=201 Created token=jyxMPny-
section_end:1655890335:upload_artifacts_on_success section_start:1655890335:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655890336:cleanup_file_variables Job succeeded
Running with gitlab-runner 14.3.2 (e0218c92)
 on BB5 map runner pnPo3yJy
section_start:1655889535:resolve_secrets Resolving secrets
section_end:1655889535:resolve_secrets section_start:1655889535:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 14.3.2, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor826182054, slurm job id
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 61704, build ref 6f9c31e54f65ead9165e43fe268144873a37c0c1, job ID 272598
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
A slurm job will be created with name GL_J272598_PROD_P112_CP1_C1
Job parameters: memory=76G, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=16 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 564018
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=16 --cpus-per-task=1 --mem=76G --job-name=GL_J272598_PROD_P112_CP1_C1 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P61704 --ntasks=16 --jobid=564018 --cpus-per-task=1 --mem=76G
section_end:1655889536:prepare_executor section_start:1655889536:prepare_script Preparing environment
Running on ldir01u09.bbp.epfl.ch via bbpv1.epfl.ch...
section_end:1655889543:prepare_script section_start:1655889543:get_sources Getting source from Git repository
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1655889544:get_sources section_start:1655889544:download_artifacts Downloading artifacts
Downloading artifacts for build:neuron:nmodl:nvhpc:omp (272575)...
Runtime platform  arch=amd64 os=linux pid=34061 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=272575 responseStatus=200 OK token=pexsvbLh
section_end:1655889545:download_artifacts section_start:1655889545:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i6n30
Build name: Linux-nvc++
Create new tag: 20220622-0919 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P61704/J272575/spack-build/spack-stage-neuron-develop-2supl4esjx3ghhb3azj32rf6llwrd6wk/spack-build-2supl4e
Start 67: external_ringtest::coreneuron_cpu_mpi_offline::preparation
Start 72: external_ringtest::coreneuron_gpu_mpi_offline::preparation
Start 79: testcorenrn_bbcore::coreneuron_gpu_offline::preparation
Start 83: testcorenrn_bbcore::coreneuron_cpu_offline::preparation
Start 89: testcorenrn_conc::coreneuron_gpu_offline::preparation
Start 93: testcorenrn_conc::coreneuron_cpu_offline::preparation
Start 99: testcorenrn_deriv::coreneuron_gpu_offline::preparation
Start 103: testcorenrn_deriv::coreneuron_cpu_offline::preparation
Start 109: testcorenrn_gf::coreneuron_gpu_offline::preparation
Start 113: testcorenrn_gf::coreneuron_cpu_offline::preparation
Start 119: testcorenrn_kin::coreneuron_gpu_offline::preparation
Start 123: testcorenrn_kin::coreneuron_cpu_offline::preparation
1/184 Test #79: testcorenrn_bbcore::coreneuron_gpu_offline::preparation ............ Passed 2.53 sec
Start 128: testcorenrn_patstim::coreneuron_gpu_offline::preparation
2/184 Test #89: testcorenrn_conc::coreneuron_gpu_offline::preparation .............. Passed 2.63 sec
Start 62: external_ringtest::neuron
3/184 Test #93: testcorenrn_conc::coreneuron_cpu_offline::preparation .............. Passed 2.94 sec
Start 75: testcorenrn_bbcore::neuron
4/184 Test #123: testcorenrn_kin::coreneuron_cpu_offline::preparation ............... Passed 4.15 sec
Start 76: testcorenrn_bbcore::coreneuron_gpu_online
5/184 Test #119: testcorenrn_kin::coreneuron_gpu_offline::preparation ............... Passed 4.19 sec
Start 77: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate
6/184 Test #83: testcorenrn_bbcore::coreneuron_cpu_offline::preparation ............ Passed 5.54 sec
Start 78: testcorenrn_bbcore::coreneuron_gpu_offline
7/184 Test #103: testcorenrn_deriv::coreneuron_cpu_offline::preparation ............. Passed 7.73 sec
Start 80: testcorenrn_bbcore::coreneuron_cpu_online
8/184 Test #109: testcorenrn_gf::coreneuron_gpu_offline::preparation ................ Passed 11.05 sec
Start 131: testcorenrn_patstim::coreneuron_cpu_offline::preparation
9/184 Test #113: testcorenrn_gf::coreneuron_cpu_offline::preparation ................ Passed 11.51 sec
Start 137: testcorenrn_vecplay::coreneuron_gpu_offline::preparation
10/184 Test #128: testcorenrn_patstim::coreneuron_gpu_offline::preparation ........... Passed 9.90 sec
Start 81: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate
11/184 Test #75: testcorenrn_bbcore::neuron ......................................... Passed 9.79 sec
Start 82: testcorenrn_bbcore::coreneuron_cpu_offline
12/184 Test #99: testcorenrn_deriv::coreneuron_gpu_offline::preparation ............. Passed 12.77 sec
Start 85: testcorenrn_conc::neuron
13/184 Test #62: external_ringtest::neuron .......................................... Passed 12.16 sec
Start 86: testcorenrn_conc::coreneuron_gpu_online
14/184 Test #78: testcorenrn_bbcore::coreneuron_gpu_offline ......................... Passed 10.55 sec
Start 87: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate
15/184 Test #131: testcorenrn_patstim::coreneuron_cpu_offline::preparation ........... Passed 6.53 sec
Start 141: testcorenrn_vecplay::coreneuron_cpu_offline::preparation
16/184 Test #85: testcorenrn_conc::neuron ........................................... Passed 4.83 sec
Start 88: testcorenrn_conc::coreneuron_gpu_offline
17/184 Test #137: testcorenrn_vecplay::coreneuron_gpu_offline::preparation ........... Passed 7.34 sec
Start 157: testcorenrn_watch::coreneuron_gpu_offline::preparation
18/184 Test #141: testcorenrn_vecplay::coreneuron_cpu_offline::preparation ........... Passed 3.05 sec
Start 161: testcorenrn_watch::coreneuron_cpu_offline::preparation
19/184 Test #157: testcorenrn_watch::coreneuron_gpu_offline::preparation ............. Passed 1.81 sec
Start 63: external_ringtest::neuron_mpi
20/184 Test #161: testcorenrn_watch::coreneuron_cpu_offline::preparation ............. Passed 1.54 sec
Start 64: external_ringtest::coreneuron_cpu_mpi_offline_saverestore
21/184 Test #63: external_ringtest::neuron_mpi ...................................... Passed 2.21 sec
Start 65: external_ringtest::coreneuron_cpu_mpi
22/184 Test #76: testcorenrn_bbcore::coreneuron_gpu_online .......................... Passed 30.06 sec
Start 90: testcorenrn_conc::coreneuron_cpu_online
23/184 Test #67: external_ringtest::coreneuron_cpu_mpi_offline::preparation ......... Passed 34.45 sec
Start 66: external_ringtest::coreneuron_cpu_mpi_offline
24/184 Test #72: external_ringtest::coreneuron_gpu_mpi_offline::preparation ......... Passed 35.69 sec
Start 69: external_ringtest::coreneuron_gpu_mpi_offline_saverestore
25/184 Test #80: testcorenrn_bbcore::coreneuron_cpu_online .......................... Passed 30.10 sec
Start 91: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate
26/184 Test #82: testcorenrn_bbcore::coreneuron_cpu_offline ......................... Passed 29.94 sec
Start 92: testcorenrn_conc::coreneuron_cpu_offline
27/184 Test #77: testcorenrn_bbcore::coreneuron_gpu_online_psolve_alternate ......... Passed 38.65 sec
Start 95: testcorenrn_deriv::neuron
28/184 Test #81: testcorenrn_bbcore::coreneuron_cpu_online_psolve_alternate ......... Passed 30.53 sec
Start 96: testcorenrn_deriv::coreneuron_gpu_online
29/184 Test #95: testcorenrn_deriv::neuron .......................................... Passed 1.58 sec
Start 97: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate
30/184 Test #87: testcorenrn_conc::coreneuron_gpu_online_psolve_alternate ........... Passed 36.23 sec
Start 98: testcorenrn_deriv::coreneuron_gpu_offline
31/184 Test #86: testcorenrn_conc::coreneuron_gpu_online ............................ Passed 49.17 sec
Start 100: testcorenrn_deriv::coreneuron_cpu_online
32/184 Test #88: testcorenrn_conc::coreneuron_gpu_offline ........................... Passed 46.34 sec
Start 101: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate
33/184 Test #90: testcorenrn_conc::coreneuron_cpu_online ............................ Passed 29.91 sec
Start 102: testcorenrn_deriv::coreneuron_cpu_offline
34/184 Test #65: external_ringtest::coreneuron_cpu_mpi .............................. Passed 41.16 sec
Start 70: external_ringtest::coreneuron_gpu_mpi
35/184 Test #66: external_ringtest::coreneuron_cpu_mpi_offline ...................... Passed 29.98 sec
Start 71: external_ringtest::coreneuron_gpu_mpi_offline
36/184 Test #91: testcorenrn_conc::coreneuron_cpu_online_psolve_alternate ........... Passed 30.46 sec
Start 115: testcorenrn_kin::neuron
37/184 Test #115: testcorenrn_kin::neuron ............................................ Passed 2.10 sec
Start 116: testcorenrn_kin::coreneuron_gpu_online
38/184 Test #96: testcorenrn_deriv::coreneuron_gpu_online ........................... Passed 43.35 sec
Start 117: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate
39/184 Test #92: testcorenrn_conc::coreneuron_cpu_offline ........................... Passed 43.66 sec
Start 118: testcorenrn_kin::coreneuron_gpu_offline
40/184 Test #97: testcorenrn_deriv::coreneuron_gpu_online_psolve_alternate .......... Passed 42.07 sec
Start 120: testcorenrn_kin::coreneuron_cpu_online
41/184 Test #98: testcorenrn_deriv::coreneuron_gpu_offline .......................... Passed 46.14 sec
Start 121: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate
42/184 Test #100: testcorenrn_deriv::coreneuron_cpu_online ........................... Passed 50.27 sec
Start 122: testcorenrn_kin::coreneuron_cpu_offline
43/184 Test #101: testcorenrn_deriv::coreneuron_cpu_online_psolve_alternate .......... Passed 50.25 sec
Start 126: testcorenrn_patstim::coreneuron_gpu_offline_saverestore
44/184 Test #102: testcorenrn_deriv::coreneuron_cpu_offline .......................... Passed 50.98 sec
Start 127: testcorenrn_patstim::coreneuron_gpu_offline
45/184 Test #116: testcorenrn_kin::coreneuron_gpu_online ............................. Passed 60.37 sec
Start 166: channel_benchmark_hippo::neuron
46/184 Test #70: external_ringtest::coreneuron_gpu_mpi .............................. Passed 79.80 sec
Start 105: testcorenrn_gf::neuron
47/184 Test #71: external_ringtest::coreneuron_gpu_mpi_offline ...................... Passed 79.73 sec
Start 106: testcorenrn_gf::coreneuron_gpu_online
48/184 Test #120: testcorenrn_kin::coreneuron_cpu_online ............................. Passed 58.11 sec
Start 167: channel_benchmark_hippo::coreneuron_gpu_online
49/184 Test #105: testcorenrn_gf::neuron ............................................. Passed 3.46 sec
Start 107: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate
50/184 Test #118: testcorenrn_kin::coreneuron_gpu_offline ............................ Passed 64.84 sec
Start 168: channel_benchmark_hippo::coreneuron_gpu_filemode
51/184 Test #117: testcorenrn_kin::coreneuron_gpu_online_psolve_alternate ............ Passed 92.41 sec
Start 169: channel_benchmark_hippo::coreneuron_cpu_online
52/184 Test #121: testcorenrn_kin::coreneuron_cpu_online_psolve_alternate ............ Passed 80.33 sec
Start 170: channel_benchmark_hippo::coreneuron_cpu_filemode
53/184 Test #122: testcorenrn_kin::coreneuron_cpu_offline ............................ Passed 64.97 sec
Start 172: channel_benchmark_sscx::neuron
54/184 Test #127: testcorenrn_patstim::coreneuron_gpu_offline ........................ Passed 79.19 sec
Start 173: channel_benchmark_sscx::coreneuron_gpu_online
55/184 Test #166: channel_benchmark_hippo::neuron .................................... Passed 65.84 sec
Start 174: channel_benchmark_sscx::coreneuron_gpu_filemode
56/184 Test #64: external_ringtest::coreneuron_cpu_mpi_offline_saverestore .......... Passed 243.62 sec
Start 108: testcorenrn_gf::coreneuron_gpu_offline
57/184 Test #126: testcorenrn_patstim::coreneuron_gpu_offline_saverestore ............ Passed 171.20 sec
Start 175: channel_benchmark_sscx::coreneuron_cpu_online
58/184 Test #106: testcorenrn_gf::coreneuron_gpu_online .............................. Passed 147.37 sec
Start 110: testcorenrn_gf::coreneuron_cpu_online
59/184 Test #107: testcorenrn_gf::coreneuron_gpu_online_psolve_alternate ............. Passed 146.65 sec
Start 111: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate
60/184 Test #172: channel_benchmark_sscx::neuron ..................................... Passed 131.06 sec
Start 176: channel_benchmark_sscx::coreneuron_cpu_filemode
61/184 Test #69: external_ringtest::coreneuron_gpu_mpi_offline_saverestore .......... Passed 284.24 sec
Start 112: testcorenrn_gf::coreneuron_cpu_offline
62/184 Test #167: channel_benchmark_hippo::coreneuron_gpu_online ..................... Passed 177.36 sec
Start 1: testneuron
63/184 Test #1: testneuron ......................................................... Passed 0.06 sec
Start 2: ringtest
64/184 Test #2: ringtest ........................................................... Passed 0.24 sec
Start 3: connect_dend
65/184 Test #3: connect_dend ....................................................... Passed 0.16 sec
Start 4: mpi_init::nrniv_mpiopt
66/184 Test #4: mpi_init::nrniv_mpiopt ............................................. Passed 0.44 sec
Start 5: mpi_init::nrniv_nrnmpi_init
67/184 Test #5: mpi_init::nrniv_nrnmpi_init ........................................ Passed 0.16 sec
Start 6: mpi_init::python_nrnmpi_init
68/184 Test #6: mpi_init::python_nrnmpi_init ....................................... Passed 0.66 sec
Start 7: mpi_init::python_mpienv
69/184 Test #7: mpi_init::python_mpienv ............................................ Passed 1.94 sec
Start 8: mpi_init::nrniv_mpiexec_mpiopt
70/184 Test #8: mpi_init::nrniv_mpiexec_mpiopt ..................................... Passed 1.59 sec
Start 9: mpi_init::nrniv_mpiexec_nrnmpi_init
71/184 Test #9: mpi_init::nrniv_mpiexec_nrnmpi_init ................................ Passed 1.36 sec
Start 10: mpi_init::python_mpiexec_nrnmpi_init
72/184 Test #10: mpi_init::python_mpiexec_nrnmpi_init ............................... Passed 1.85 sec
Start 11: mpi_init::python_mpiexec_mpienv
73/184 Test #11: mpi_init::python_mpiexec_mpienv .................................... Passed 2.61 sec
Start 12: pynrn::basic_tests
74/184 Test #12: pynrn::basic_tests ................................................. Passed 4.31 sec
Start 13: coverage_tests::cover_tests
75/184 Test #13: coverage_tests::cover_tests ........................................ Passed 2.11 sec
Start 14: parallel_tests
76/184 Test #168: channel_benchmark_hippo::coreneuron_gpu_filemode ................... Passed 194.36 sec
Start 15: parallel_partrans
77/184 Test #14: parallel_tests ..................................................... Passed 7.28 sec
Start 16: parallel_netpar
78/184 Test #15: parallel_partrans .................................................. Passed 3.91 sec
Start 17: parallel_bas
79/184 Test #108: testcorenrn_gf::coreneuron_gpu_offline ............................. Passed 84.85 sec
Start 125: testcorenrn_patstim::neuron
80/184 Test #16: parallel_netpar .................................................... Passed 3.98 sec
Start 18: coreneuron_modtests::version_macros
81/184 Test #110: testcorenrn_gf::coreneuron_cpu_online .............................. Passed 60.88 sec
Start 129: testcorenrn_patstim::coreneuron_cpu_offline_saverestore
82/184 Test #125: testcorenrn_patstim::neuron ........................................ Passed 3.26 sec
Start 130: testcorenrn_patstim::coreneuron_cpu_offline
83/184 Test #111: testcorenrn_gf::coreneuron_cpu_online_psolve_alternate ............. Passed 59.82 sec
Start 133: testcorenrn_vecplay::neuron
84/184 Test #133: testcorenrn_vecplay::neuron ........................................ Passed 2.83 sec
Start 134: testcorenrn_vecplay::coreneuron_gpu_online
85/184 Test #169: channel_benchmark_hippo::coreneuron_cpu_online ..................... Passed 178.20 sec
Start 19: coreneuron_modtests::fornetcon_py_cpu
86/184 Test #17: parallel_bas ....................................................... Passed 13.99 sec
Start 20: coreneuron_modtests::direct_py_cpu
87/184 Test #170: channel_benchmark_hippo::coreneuron_cpu_filemode ................... Passed 189.03 sec
Start 21: coreneuron_modtests::direct_hoc_cpu
88/184 Test #112: testcorenrn_gf::coreneuron_cpu_offline ............................. Passed 48.08 sec
Start 135: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate
89/184 Test #174: channel_benchmark_sscx::coreneuron_gpu_filemode .................... Passed 171.54 sec
Start 22: coreneuron_modtests::spikes_py_cpu
90/184 Test #173: channel_benchmark_sscx::coreneuron_gpu_online ...................... Passed 174.16 sec
Start 23: coreneuron_modtests::spikes_file_mode_py_cpu
91/184 Test #175: channel_benchmark_sscx::coreneuron_cpu_online ...................... Passed 104.11 sec
Start 24: coreneuron_modtests::fast_imem_py_cpu
92/184 Test #24: coreneuron_modtests::fast_imem_py_cpu .............................. Passed 0.80 sec
Start 25: coreneuron_modtests::datareturn_py_cpu
93/184 Test #19: coreneuron_modtests::fornetcon_py_cpu .............................. Passed 35.62 sec
Start 26: coreneuron_modtests::test_units_py_cpu
94/184 Test #18: coreneuron_modtests::version_macros ................................ Passed 41.72 sec
Start 27: coreneuron_modtests::test_netmove_py_cpu
95/184 Test #130: testcorenrn_patstim::coreneuron_cpu_offline ........................ Passed 39.32 sec
Start 136: testcorenrn_vecplay::coreneuron_gpu_offline
96/184 Test #134: testcorenrn_vecplay::coreneuron_gpu_online ......................... Passed 41.69 sec
Start 138: testcorenrn_vecplay::coreneuron_cpu_online
97/184 Test #20: coreneuron_modtests::direct_py_cpu ................................. Passed 39.75 sec
Start 28: coreneuron_modtests::test_pointer_py_cpu
98/184 Test #21: coreneuron_modtests::direct_hoc_cpu ................................ Passed 35.36 sec
Start 29: coreneuron_modtests::test_watchrange_py_cpu
99/184 Test #23: coreneuron_modtests::spikes_file_mode_py_cpu ....................... Passed 34.81 sec
Start 30: coreneuron_modtests::test_psolve_py_cpu
100/184 Test #22: coreneuron_modtests::spikes_py_cpu ................................. Passed 35.06 sec
Start 31: coreneuron_modtests::test_ba_py_cpu
101/184 Test #176: channel_benchmark_sscx::coreneuron_cpu_filemode .................... Passed 94.01 sec
Start 32: coreneuron_modtests::test_natrans_py_cpu
102/184 Test #27: coreneuron_modtests::test_netmove_py_cpu ........................... Passed 28.87 sec
Start 36: coreneuron_modtests::fornetcon_py_gpu
103/184 Test #135: testcorenrn_vecplay::coreneuron_gpu_online_psolve_alternate ........ Passed 61.07 sec
Start 139: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate
104/184 Test #25: coreneuron_modtests::datareturn_py_cpu ............................. Passed 41.17 sec
Start 37: coreneuron_modtests::direct_py_gpu
105/184 Test #26: coreneuron_modtests::test_units_py_cpu ............................. Passed 39.48 sec
Start 38: coreneuron_modtests::direct_hoc_gpu
106/184 Test #129: testcorenrn_patstim::coreneuron_cpu_offline_saverestore ............ Passed 80.57 sec
Start 140: testcorenrn_vecplay::coreneuron_cpu_offline
107/184 Test #138: testcorenrn_vecplay::coreneuron_cpu_online ......................... Passed 36.29 sec
Start 153: testcorenrn_watch::neuron
108/184 Test #136: testcorenrn_vecplay::coreneuron_gpu_offline ........................ Passed 41.69 sec
Start 154: testcorenrn_watch::coreneuron_gpu_online
109/184 Test #153: testcorenrn_watch::neuron .......................................... Passed 1.58 sec
Start 155: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate
110/184 Test #31: coreneuron_modtests::test_ba_py_cpu ................................ Passed 40.70 sec
Start 39: coreneuron_modtests::spikes_py_gpu
111/184 Test #32: coreneuron_modtests::test_natrans_py_cpu ........................... Passed 39.77 sec
Start 40: coreneuron_modtests::spikes_file_mode_py_gpu
112/184 Test #29: coreneuron_modtests::test_watchrange_py_cpu ........................ Passed 40.82 sec
Start 41: coreneuron_modtests::fast_imem_py_gpu
113/184 Test #30: coreneuron_modtests::test_psolve_py_cpu ............................ Passed 40.85 sec
Start 42: coreneuron_modtests::datareturn_py_gpu
114/184 Test #41: coreneuron_modtests::fast_imem_py_gpu .............................. Passed 1.10 sec
Start 43: coreneuron_modtests::test_units_py_gpu
115/184 Test #36: coreneuron_modtests::fornetcon_py_gpu .............................. Passed 34.36 sec
Start 44: coreneuron_modtests::test_netmove_py_gpu
116/184 Test #37: coreneuron_modtests::direct_py_gpu ................................. Passed 25.40 sec
Start 45: coreneuron_modtests::test_pointer_py_gpu
117/184 Test #139: testcorenrn_vecplay::coreneuron_cpu_online_psolve_alternate ........ Passed 35.22 sec
Start 156: testcorenrn_watch::coreneuron_gpu_offline
118/184 Test #38: coreneuron_modtests::direct_hoc_gpu ................................ Passed 42.15 sec
Start 46: coreneuron_modtests::test_watchrange_py_gpu
119/184 Test #140: testcorenrn_vecplay::coreneuron_cpu_offline ........................ Passed 42.30 sec
Start 158: testcorenrn_watch::coreneuron_cpu_online
120/184 Test #154: testcorenrn_watch::coreneuron_gpu_online ........................... Passed 40.37 sec
Start 159: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate
121/184 Test #43: coreneuron_modtests::test_units_py_gpu ............................. Passed 31.93 sec
Start 47: coreneuron_modtests::test_psolve_py_gpu
122/184 Test #155: testcorenrn_watch::coreneuron_gpu_online_psolve_alternate .......... Passed 41.31 sec
Start 160: testcorenrn_watch::coreneuron_cpu_offline
123/184 Test #40: coreneuron_modtests::spikes_file_mode_py_gpu ....................... Passed 36.45 sec
Start 48: coreneuron_modtests::test_ba_py_gpu
124/184 Test #39: coreneuron_modtests::spikes_py_gpu ................................. Passed 38.01 sec
Start 49: coreneuron_modtests::test_natrans_py_gpu
125/184 Test #42: coreneuron_modtests::datareturn_py_gpu ............................. Passed 39.55 sec
Start 53: modlunit_unitstest
126/184 Test #53: modlunit_unitstest ................................................. Passed 0.02 sec
Start 54: modlunit_hh
127/184 Test #54: modlunit_hh ........................................................ Passed 0.02 sec
Start 55: modlunit_stim
128/184 Test #55: modlunit_stim ...................................................... Passed 0.01 sec
Start 56: modlunit_pattern
129/184 Test #56: modlunit_pattern ................................................... Passed 0.01 sec
Start 57: external_nrntest
130/184 Test #44: coreneuron_modtests::test_netmove_py_gpu ........................... Passed 36.72 sec
Start 84: testcorenrn_bbcore::compare_results
131/184 Test #84: testcorenrn_bbcore::compare_results ................................ Passed 0.52 sec
Start 94: testcorenrn_conc::compare_results
132/184 Test #94: testcorenrn_conc::compare_results .................................. Passed 0.17 sec
Start 104: testcorenrn_deriv::compare_results
133/184 Test #104: testcorenrn_deriv::compare_results ................................. Passed 0.18 sec
Start 114: testcorenrn_gf::compare_results
134/184 Test #114: testcorenrn_gf::compare_results .................................... Passed 0.06 sec
Start 124: testcorenrn_kin::compare_results
135/184 Test #124: testcorenrn_kin::compare_results ................................... Passed 0.05 sec
Start 132: testcorenrn_patstim::compare_results
136/184 Test #132: testcorenrn_patstim::compare_results ............................... Passed 0.05 sec
Start 142: testcorenrn_vecplay::compare_results
137/184 Test #142: testcorenrn_vecplay::compare_results ............................... Passed 0.06 sec
Start 171: channel_benchmark_hippo::compare_results
138/184 Test #171: channel_benchmark_hippo::compare_results ........................... Passed 0.05 sec
Start 177: channel_benchmark_sscx::compare_results
139/184 Test #177: channel_benchmark_sscx::compare_results ............................ Passed 0.05 sec
140/184 Test #156: testcorenrn_watch::coreneuron_gpu_offline .......................... Passed 34.11 sec
Start 163: testcorenrn_netstimdirect::direct_netstimdirect
141/184 Test #46: coreneuron_modtests::test_watchrange_py_gpu ........................ Passed 31.89 sec
Start 164: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate
142/184 Test #47: coreneuron_modtests::test_psolve_py_gpu ............................ Passed 36.46 sec
143/184 Test #49: coreneuron_modtests::test_natrans_py_gpu ........................... Passed 31.62 sec
Start 33: coreneuron_modtests::spikes_mpi_py_cpu
144/184 Test #48: coreneuron_modtests::test_ba_py_gpu ................................ Passed 33.23 sec
145/184 Test #158: testcorenrn_watch::coreneuron_cpu_online ........................... Passed 38.72 sec
Start 34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu
146/184 Test #159: testcorenrn_watch::coreneuron_cpu_online_psolve_alternate .......... Passed 39.35 sec
Start 35: coreneuron_modtests::inputpresyn_py_cpu
147/184 Test #160: testcorenrn_watch::coreneuron_cpu_offline .......................... Passed 37.01 sec
Start 50: coreneuron_modtests::spikes_mpi_py_gpu
Start 162: testcorenrn_watch::compare_results
148/184 Test #162: testcorenrn_watch::compare_results ................................. Passed 2.62 sec
149/184 Test #163: testcorenrn_netstimdirect::direct_netstimdirect .................... Passed 29.45 sec
Start 51: coreneuron_modtests::spikes_mpi_file_mode_py_gpu
150/184 Test #164: testcorenrn_netstimdirect::direct_netstimdirect_psolve_alternate ... Passed 25.10 sec
Start 52: coreneuron_modtests::inputpresyn_py_gpu
Start 165: testcorenrn_netstimdirect::compare_results
151/184 Test #165: testcorenrn_netstimdirect::compare_results ......................... Passed 0.15 sec
152/184 Test #33: coreneuron_modtests::spikes_mpi_py_cpu ............................. Passed 38.55 sec
153/184 Test #35: coreneuron_modtests::inputpresyn_py_cpu ............................ Passed 37.65 sec
Start 147: testcorenrn_vecevent::coreneuron_gpu_offline::preparation
154/184 Test #34: coreneuron_modtests::spikes_mpi_file_mode_py_cpu ................... Passed 38.42 sec
155/184 Test #50: coreneuron_modtests::spikes_mpi_py_gpu ............................. Passed 37.78 sec
Start 151: testcorenrn_vecevent::coreneuron_cpu_offline::preparation
156/184 Test #151: testcorenrn_vecevent::coreneuron_cpu_offline::preparation .......... Passed 4.57 sec
Start 179: olfactory-bulb-3d::neuron::preparation
157/184 Test #179: olfactory-bulb-3d::neuron::preparation ............................. Passed 0.05 sec
Start 181: olfactory-bulb-3d::coreneuron_gpu_online::preparation
158/184 Test #147: testcorenrn_vecevent::coreneuron_gpu_offline::preparation .......... Passed 4.85 sec
Start 183: olfactory-bulb-3d::coreneuron_cpu_online::preparation
159/184 Test #181: olfactory-bulb-3d::coreneuron_gpu_online::preparation .............. Passed 0.09 sec
Start 58: reduced_dentate::neuron
160/184 Test #183: olfactory-bulb-3d::coreneuron_cpu_online::preparation .............. Passed 0.08 sec
Start 59: reduced_dentate::coreneuron_cpu
161/184 Test #51: coreneuron_modtests::spikes_mpi_file_mode_py_gpu ................... Passed 31.16 sec
162/184 Test #52: coreneuron_modtests::inputpresyn_py_gpu ............................ Passed 27.90 sec
Start 60: reduced_dentate::coreneuron_gpu
163/184 Test #58: reduced_dentate::neuron ............................................ Passed 14.69 sec
Start 143: testcorenrn_vecevent::neuron
164/184 Test #143: testcorenrn_vecevent::neuron ....................................... Passed 7.39 sec
Start 144: testcorenrn_vecevent::coreneuron_gpu_online
165/184 Test #28: coreneuron_modtests::test_pointer_py_cpu ........................... Passed 176.97 sec
166/184 Test #59: reduced_dentate::coreneuron_cpu .................................... Passed 25.29 sec
Start 68: external_ringtest::coreneuron_cpu_mpi_threads
167/184 Test #45: coreneuron_modtests::test_pointer_py_gpu ........................... Passed 128.53 sec
168/184 Test #60: reduced_dentate::coreneuron_gpu .................................... Passed 35.74 sec
Start 145: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate
Start 61: reduced_dentate::compare_results
169/184 Test #61: reduced_dentate::compare_results ................................... Passed 0.10 sec
170/184 Test #57: external_nrntest ................................................... Passed 118.92 sec
171/184 Test #68: external_ringtest::coreneuron_cpu_mpi_threads ...................... Passed 22.24 sec
Start 73: external_ringtest::coreneuron_gpu_mpi_threads
172/184 Test #144: testcorenrn_vecevent::coreneuron_gpu_online ........................ Passed 25.44 sec
Start 146: testcorenrn_vecevent::coreneuron_gpu_offline
173/184 Test #145: testcorenrn_vecevent::coreneuron_gpu_online_psolve_alternate ....... Passed 21.00 sec
Start 148: testcorenrn_vecevent::coreneuron_cpu_online
174/184 Test #146: testcorenrn_vecevent::coreneuron_gpu_offline ....................... Passed 29.42 sec
Start 149: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate
175/184 Test #148: testcorenrn_vecevent::coreneuron_cpu_online ........................ Passed 22.14 sec
Start 150: testcorenrn_vecevent::coreneuron_cpu_offline
176/184 Test #73: external_ringtest::coreneuron_gpu_mpi_threads ...................... Passed 48.21 sec
Start 178: olfactory-bulb-3d::neuron
Start 180: olfactory-bulb-3d::coreneuron_gpu_online
177/184 Test #150: testcorenrn_vecevent::coreneuron_cpu_offline ....................... Passed 30.52 sec
Start 182: olfactory-bulb-3d::coreneuron_cpu_online
178/184 Test #149: testcorenrn_vecevent::coreneuron_cpu_online_psolve_alternate ....... Passed 65.95 sec
Start 74: external_ringtest::compare_results
Start 152: testcorenrn_vecevent::compare_results
179/184 Test #152: testcorenrn_vecevent::compare_results .............................. Passed 0.06 sec
180/184 Test #74: external_ringtest::compare_results ................................. Passed 0.07 sec
181/184 Test #178: olfactory-bulb-3d::neuron .......................................... Passed 200.29 sec
182/184 Test #180: olfactory-bulb-3d::coreneuron_gpu_online ........................... Passed 204.55 sec
183/184 Test #182: olfactory-bulb-3d::coreneuron_cpu_online ........................... Passed 194.57 sec
Start 184: olfactory-bulb-3d::compare_results
184/184 Test #184: olfactory-bulb-3d::compare_results ................................. Passed 0.23 sec
100% tests passed, 0 tests failed out of 184
Total Test time (real) = 863.42 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1655890438:step_script section_start:1655890438:upload_artifacts_on_success Uploading artifacts for successful job
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=55364 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=272598 responseStatus=201 Created token=W_3qFLr1
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=55425 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=272598 responseStatus=201 Created token=W_3qFLr1
section_end:1655890440:upload_artifacts_on_success section_start:1655890440:cleanup_file_variables Cleaning up project directory and file based variables
section_end:1655890441:cleanup_file_variables Job succeeded