@bbpbuildbot
Created December 7, 2022 12:27
Logfiles for GitLab pipeline https://bbpgitlab.epfl.ch/hpc/coreneuron/-/pipelines/89112 (:no_entry:) running on GitHub PR BlueBrain/CoreNeuron#844.
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413418:resolve_secrets Resolving secrets
section_end:1670413418:resolve_secrets section_start:1670413418:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor806463168, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462679
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462679_PROD_P112_CP1_C15
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054620
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462679_PROD_P112_CP1_C15 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054620 --cpus-per-task=8 --mem=76G
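The two commands above show how the custom executor reserves a node: `sbatch --wrap="sleep infinity"` holds an allocation open, and subsequent `srun --jobid=…` steps run inside it. A minimal sketch of how such a submission line could be assembled from the logged job parameters (`build_sbatch_cmd` is a hypothetical helper, not part of the real runner driver):

```python
# Hypothetical sketch: assemble an sbatch command line from the job
# parameters reported in the log. Values mirror the logged job; the
# function name and structure are illustrative, not the runner's code.
def build_sbatch_cmd(partition, account, ntasks, cpus, mem,
                     job_name, constraint, chdir, time):
    return (
        f'sbatch -p {partition} -A {account} --ntasks={ntasks} '
        f'--cpus-per-task={cpus} --mem={mem} --job-name={job_name} '
        f'-C {constraint} --no-requeue -D {chdir} --time={time} '
        f'--wrap="sleep infinity"'
    )

cmd = build_sbatch_cmd(
    "prod", "proj9998", 2, 8, "76G",
    "GL_J462679_PROD_P112_CP1_C15", "cpu",
    "/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112",
    "1:00:00")
print(cmd)
```

The `sleep infinity` wrap keeps the allocation alive for the whole CI job; each script step then attaches to it with `srun --jobid`.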
section_end:1670413420:prepare_executor section_start:1670413420:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n11 via bbpv1.epfl.ch...
section_end:1670413422:prepare_script section_start:1670413422:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413423:get_sources section_start:1670413423:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:mod2c:intel:shared:debug-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=155475 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413425:restore_cache section_start:1670413425:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for spack_setup (462676)...
Runtime platform  arch=amd64 os=linux pid=155875 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462676 responseStatus=200 OK token=AUepdsWR
section_end:1670413426:download_artifacts section_start:1670413426:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462679/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462679/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462679/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462679/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462679/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462679_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
 insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
 insteadOf = ssh://git@bbpgitlab.epfl.ch/
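The `url.<base>.insteadOf` mappings written above make git transparently rewrite SSH-style remote URLs to token-authenticated HTTPS ones, so jobs can fetch private repositories without SSH keys. A minimal sketch of the substitution git performs (longest matching prefix wins; `apply_instead_of` is a hypothetical helper and `TOKEN` stands in for the masked `CI_JOB_TOKEN`):

```python
# Sketch of git's insteadOf rewriting: replace the longest matching
# URL prefix with the configured base URL. Illustrative only.
def apply_instead_of(url, rules):
    # rules maps a base URL to the prefixes it replaces.
    candidates = [(prefix, base)
                  for base, prefixes in rules.items()
                  for prefix in prefixes
                  if url.startswith(prefix)]
    if not candidates:
        return url
    prefix, base = max(candidates, key=lambda pb: len(pb[0]))
    return base + url[len(prefix):]

rules = {"https://gitlab-ci-token:TOKEN@bbpgitlab.epfl.ch/":
         ["git@bbpgitlab.epfl.ch:", "ssh://git@bbpgitlab.epfl.ch/"]}
rewritten = apply_instead_of("git@bbpgitlab.epfl.ch:hpc/coreneuron.git", rules)
print(rewritten)
# -> https://gitlab-ci-token:TOKEN@bbpgitlab.epfl.ch/hpc/coreneuron.git
```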
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%intel +caliper~gpu~legacy-unit~nmodl~openmp+shared+tests~unified build_type=Debug ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Warning: Missing a source id for libsonata-report@1.1.1_lfp
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be udoqinvar6cp75viyxkh44mgzbht4q76
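The hash is extracted from the `spack spec --json` output, whose `spec.nodes` list begins with the root package of the concretized DAG. A sketch of the same extraction, using a hand-written stand-in document rather than real spack output:

```python
import json

# Stand-in for `spack spec --json` output: the first entry of
# spec.nodes is the root package; only the fields used here are shown.
json_spec = ('{"spec": {"nodes": ['
             '{"name": "coreneuron",'
             ' "hash": "udoqinvar6cp75viyxkh44mgzbht4q76"}]}}')

# Same lookup as the python one-liner in the job script.
installed_hash = json.loads(json_spec)["spec"]["nodes"][0]["hash"]
print(installed_hash)  # udoqinvar6cp75viyxkh44mgzbht4q76
```

Knowing the hash before installation lets the job predict the stage and build directory names (`spack-stage-…-<hash>`, `spack-build-<hash[:7]>`) used in the following steps.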
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/1054620/ccache
  Primary config: /nvme/bbpcihpcproj12/1054620/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Dec 7 12:44:17 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.22 / 0.51 (42.33 %)
  Files: 3142
$ fi
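The ccache block above persists the compiler cache between jobs by packing `CCACHE_DIR` into `ccache.tar` after the build and unpacking it at the start of the next one. A sketch of that save/restore round-trip with temporary directories standing in for `CCACHE_DIR` and `CI_PROJECT_DIR`:

```python
import os
import tarfile
import tempfile

# Placeholder directories; in the job these are $CCACHE_DIR (node-local
# NVMe) and $CI_PROJECT_DIR (GPFS, carried between jobs as a cache).
ccache_dir = tempfile.mkdtemp()
project_dir = tempfile.mkdtemp()
open(os.path.join(ccache_dir, "ccache.conf"), "w").close()

# Save after the build: tar -C "$CCACHE_DIR" -cf ccache.tar .
archive = os.path.join(project_dir, "ccache.tar")
with tarfile.open(archive, "w") as tar:
    tar.add(ccache_dir, arcname=".")

# Restore in the next job, only if a previous archive exists:
# if [ -f ccache.tar ]; then tar -C "$CCACHE_DIR" -xf ccache.tar; fi
restored_dir = tempfile.mkdtemp()
if os.path.isfile(archive):
    with tarfile.open(archive) as tar:
        tar.extractall(restored_dir)
```

Keeping the live cache on node-local storage and only the tarball on shared storage avoids hammering GPFS with many small ccache files during the build.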
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'udoqinvar6cp75viyxkh44mgzbht4q76'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%intel+caliper~gpu~legacy-unit~nmodl~openmp+shared+tests~unified build_type=Debug
- ^caliper%gcc+cuda cuda_arch=70
- ^hpe-mpi%gcc
Concretized
--------------------------------
- udoqinv coreneuron@develop%intel@2021.4.0+caliper~codegenopt~gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=Debug arch=linux-rhel7-skylake
- dwdch6b ^bison@3.8.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] rdrurry ^boost@1.79.0%intel@2021.4.0+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] pfqy3p5 ^caliper@2.7.0%gcc@11.2.0+adiak+cuda~fortran+gotcha~ipo+libdw~libpfm+libunwind+mpi+papi+sampler+shared~sosflow build_type=RelWithDebInfo cuda_arch=70 arch=linux-rhel7-skylake
[^] at3kmf2 ^adiak@0.2.1%gcc@11.2.0~ipo+mpi+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] wu42aul ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] p6adpyi ^elfutils@0.186%gcc@11.2.0~bzip2~debuginfod+nls~xz arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- bmpk6ez ^m4@1.4.16%gcc@11.2.0+sigsegv arch=linux-rhel7-skylake
[^] gc2ivgg ^libunwind@1.5.0%gcc@11.2.0~block_signals~conservative_checks~cxx_exceptions~debug~debug_frame+docs~pic+tests+weak_backtrace~xz~zlib components=none libs=shared,static arch=linux-rhel7-skylake
[^] vcit7s5 ^papi@6.0.0.1%gcc@11.2.0~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools amdgpu_target=none arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
- s4ueg72 ^flex@2.6.3%intel@2021.4.0+lex~nls arch=linux-rhel7-skylake
- za4z336 ^libsonata-report@1.1.1_lfp%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- 6ggc5yr ^ninja@1.10.2%intel@2021.4.0 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-dwdch6bmdeclr2novthsywtrryotawwz)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.79.0-zoxhho (external boost-1.79.0-rdrurryqe5eahijb4xf6mbqfryg7ezod)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-wu42aulcivrayjquerqpqvvjeadgosp2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
[+] /usr (external m4-1.4.16-bmpk6ezkxo3553jsm5thxh7ygtrom3dn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libunwind-1.5.0-gc2ivg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/papi-6.0.0.1-vcit7s
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-s4ueg72j7l6vkdyvfxj2tweo7v7s3otx)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-6ggc5yre7qddwxdjmn7sfptpdoiy4dtp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/adiak-0.2.1-at3kmf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
==> Installing libsonata-report-1.1.1_lfp-za4z336pgqylgpjq4szfzmkquwej2xtz
==> No binary for libsonata-report-1.1.1_lfp-za4z336pgqylgpjq4szfzmkquwej2xtz found: installing from source
==> Warning: Missing a source id for libsonata-report@1.1.1_lfp
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112, but it is owned by 0
==> No patches needed for libsonata-report
==> libsonata-report: Executing phase: 'cmake'
==> libsonata-report: Executing phase: 'build'
==> libsonata-report: Executing phase: 'install'
==> libsonata-report: Successfully installed libsonata-report-1.1.1_lfp-za4z336pgqylgpjq4szfzmkquwej2xtz
Fetch: 2.70s. Build: 20.14s. Total: 22.83s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/software/install_gcc-11.2.0-skylake/libsonata-report-1.1.1_lfp-za4z33
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/elfutils-0.186-p6adpy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/caliper-2.7.0-pfqy3p
==> Installing coreneuron-develop-udoqinvar6cp75viyxkh44mgzbht4q76
==> No binary for coreneuron-develop-udoqinvar6cp75viyxkh44mgzbht4q76 found: installing from source
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-udoqinvar6cp75viyxkh44mgzbht4q76
Fetch: 5.01s. Build: 1m 11.70s. Total: 1m 16.70s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/software/install_intel-2021.4.0-skylake/coreneuron-develop-udoqin
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/1054620/ccache
  Primary config: /nvme/bbpcihpcproj12/1054620/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Dec 7 12:46:56 2022
  Hits: 108 / 137 (78.83 %)
    Direct: 90 / 137 (65.69 %)
    Preprocessed: 18 / 47 (38.30 %)
  Misses: 29
    Direct: 47
    Preprocessed: 29
  Uncacheable: 43
Primary storage:
  Hits: 198 / 274 (72.26 %)
  Misses: 76
  Cache size (GB): 0.23 / 0.51 (44.45 %)
  Files: 3200
Uncacheable:
  Called for linking: 32
  No input file: 11
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ SPACK_PACKAGE_SLUGIFY=$(echo -n ${SPACK_PACKAGE} | tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]')
$ echo "${SPACK_PACKAGE_SLUGIFY}_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
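The `tr` pipeline above turns a package name into a dotenv-safe variable prefix: every non-alphanumeric character becomes `_`, then everything is upper-cased. The same transformation in python, for clarity:

```python
import re

# Python equivalent of: tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]'
# (applied to a string without a trailing newline, as echo -n does).
def slugify(package):
    return re.sub(r"[^0-9A-Za-z]", "_", package).upper()

print(slugify("coreneuron"))        # CORENEURON
print(slugify("libsonata-report"))  # LIBSONATA_REPORT
```

This is what produces variable names like `CORENEURON_INSTALLED_HASH` in `spack_build_info.env`, which downstream jobs pick up via the dotenv artifact.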
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
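The failure count above is derived from the JUnit XML that `spack install --log-format=junit` wrote, by counting `<failure>` and `<error>` elements; the job then exits non-zero if any were found. The same counting with the stdlib `ElementTree` instead of lxml (the `iter()` calls behave identically for this purpose), run against a toy report:

```python
import xml.etree.ElementTree as etree

# Toy JUnit report: one passing case, one <failure>, one <error>.
report = """<testsuites><testsuite>
  <testcase name="ok"/>
  <testcase name="bad"><failure message="boom"/></testcase>
  <testcase name="err"><error message="crash"/></testcase>
</testsuite></testsuites>"""

root = etree.fromstring(report)
# Same expression as the one-liner in the job script.
num_failures = (sum(1 for _ in root.iter("failure"))
                + sum(1 for _ in root.iter("error")))
print(num_failures)  # 2
```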
section_end:1670413617:step_script section_start:1670413617:archive_cache Saving cache for successful job
Using git from spack modules
Creating cache build:coreneuron:mod2c:intel:shared:debug-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=164607 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Aintel%3Ashared%3Adebug-8-non_protected
Created cache
section_end:1670413632:archive_cache section_start:1670413632:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=165038 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462679 responseStatus=201 Created token=AUepdsWR
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=165110 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462679 responseStatus=201 Created token=AUepdsWR
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=165200 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462679 responseStatus=201 Created token=AUepdsWR
section_end:1670413633:upload_artifacts_on_success section_start:1670413633:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413634:cleanup_file_variables Job succeeded
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413419:resolve_secrets Resolving secrets
section_end:1670413419:resolve_secrets section_start:1670413419:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor1864375614, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462682
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462682_PROD_P112_CP2_C16
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054621
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462682_PROD_P112_CP2_C16 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054621 --cpus-per-task=8 --mem=76G
section_end:1670413422:prepare_executor section_start:1670413422:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n11 via bbpv1.epfl.ch...
section_end:1670413426:prepare_script section_start:1670413426:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413427:get_sources section_start:1670413427:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:mod2c:nvhpc:acc:debug:unified-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=156317 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413430:restore_cache section_start:1670413430:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for spack_setup (462676)...
Runtime platform  arch=amd64 os=linux pid=157001 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462676 responseStatus=200 OK token=atFLeepP
section_end:1670413430:download_artifacts section_start:1670413430:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462682/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462682/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462682/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462682/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462682/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462682_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
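(Aside: the two `echo -e` lines above build a git config that rewrites both SSH remote forms to token-authenticated HTTPS. A minimal Python sketch of that generation step follows; `git_url_rewrites` is a hypothetical helper name and `"dummy-token"` stands in for the masked `CI_JOB_TOKEN`.)

```python
def git_url_rewrites(token, host="bbpgitlab.epfl.ch"):
    """Return a git config snippet that rewrites SSH remotes to
    token-authenticated HTTPS, covering both the scp-style form
    (git@host:) and the ssh:// URL form, as in the job script above."""
    https = f"https://gitlab-ci-token:{token}@{host}/"
    lines = []
    for ssh_form in (f"git@{host}:", f"ssh://git@{host}/"):
        lines.append(f'[url "{https}"]')
        lines.append(f"\tinsteadOf = {ssh_form}")
    return "\n".join(lines) + "\n"

print(git_url_rewrites("dummy-token"))
```

With this in `${XDG_CONFIG_HOME}/git/config`, any clone or submodule fetch addressed via SSH transparently uses the CI job token instead.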
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +caliper+gpu~legacy-unit~nmodl+openmp~shared+tests+unified build_type=Debug ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Warning: Missing a source id for libsonata-report@1.1.1_lfp
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 2asi2me356b2upp2q2whulfko55yuxye
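(Aside: the `SPACK_INSTALLED_HASH` one-liner above pulls the root node's hash out of the `spack spec --json` document. A plain-Python sketch, using a made-up minimal spec document rather than real pipeline output:)

```python
import json

# Minimal stand-in for `spack spec --json` output; the real document
# carries many more fields, but only spec.nodes[0].hash is read here.
json_spec = """
{"spec": {"nodes": [{"name": "coreneuron",
                     "hash": "2asi2me356b2upp2q2whulfko55yuxye"}]}}
"""

def installed_hash(spec_json):
    """Replicate the job-script one-liner: the first node of the
    concretized spec is the root package, and its hash identifies
    the install (usable later as /<hash> in spack commands)."""
    return json.loads(spec_json)["spec"]["nodes"][0]["hash"]

print(installed_hash(json_spec))  # 2asi2me356b2upp2q2whulfko55yuxye
```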
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/1054621/ccache
  Primary config: /nvme/bbpcihpcproj12/1054621/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Dec 7 12:44:18 2022
  Hits: 0 / 0
    Direct: 0 / 0
    Preprocessed: 0 / 0
  Misses: 0
    Direct: 0
    Preprocessed: 0
Primary storage:
  Hits: 0 / 0
  Misses: 0
  Cache size (GB): 0.23 / 0.51 (45.81 %)
  Files: 3102

$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '2asi2me356b2upp2q2whulfko55yuxye'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+caliper+gpu~legacy-unit~nmodl+openmp~shared+tests+unified build_type=Debug
- ^caliper%gcc+cuda cuda_arch=70
- ^hpe-mpi%gcc
Concretized
--------------------------------
- 2asi2me coreneuron@develop%nvhpc@22.3+caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl+openmp~profile+report~shared~sympy~sympyopt+tests+unified build_type=Debug arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] fnw44jf ^boost@1.79.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] pfqy3p5 ^caliper@2.7.0%gcc@11.2.0+adiak+cuda~fortran+gotcha~ipo+libdw~libpfm+libunwind+mpi+papi+sampler+shared~sosflow build_type=RelWithDebInfo cuda_arch=70 arch=linux-rhel7-skylake
[^] at3kmf2 ^adiak@0.2.1%gcc@11.2.0~ipo+mpi+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] wu42aul ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] p6adpyi ^elfutils@0.186%gcc@11.2.0~bzip2~debuginfod+nls~xz arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- bmpk6ez ^m4@1.4.16%gcc@11.2.0+sigsegv arch=linux-rhel7-skylake
[^] gc2ivgg ^libunwind@1.5.0%gcc@11.2.0~block_signals~conservative_checks~cxx_exceptions~debug~debug_frame+docs~pic+tests+weak_backtrace~xz~zlib components=none libs=shared,static arch=linux-rhel7-skylake
[^] vcit7s5 ^papi@6.0.0.1%gcc@11.2.0~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools amdgpu_target=none arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
- za4z336 ^libsonata-report@1.1.1_lfp%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.79.0-zoxhho (external boost-1.79.0-fnw44jfbxyivagsnavjnk6zdaghmffbt)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-wu42aulcivrayjquerqpqvvjeadgosp2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
[+] /usr (external m4-1.4.16-bmpk6ezkxo3553jsm5thxh7ygtrom3dn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libunwind-1.5.0-gc2ivg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/papi-6.0.0.1-vcit7s
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/adiak-0.2.1-at3kmf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/elfutils-0.186-p6adpy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/caliper-2.7.0-pfqy3p
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/software/install_gcc-11.2.0-skylake/libsonata-report-1.1.1_lfp-za4z33
==> Installing coreneuron-develop-2asi2me356b2upp2q2whulfko55yuxye
==> No binary for coreneuron-develop-2asi2me356b2upp2q2whulfko55yuxye found: installing from source
==> Warning: Missing a source id for libsonata-report@1.1.1_lfp
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-2asi2me356b2upp2q2whulfko55yuxye
Fetch: 4.43s. Build: 1m 44.63s. Total: 1m 49.06s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/software/install_nvhpc-22.3-skylake/coreneuron-develop-2asi2m
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
  Cache directory: /nvme/bbpcihpcproj12/1054621/ccache
  Primary config: /nvme/bbpcihpcproj12/1054621/ccache/ccache.conf
  Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
  Stats updated: Wed Dec 7 12:47:27 2022
  Hits: 106 / 116 (91.38 %)
    Direct: 88 / 116 (75.86 %)
    Preprocessed: 18 / 28 (64.29 %)
  Misses: 10
    Direct: 28
    Preprocessed: 10
  Uncacheable: 21
Primary storage:
  Hits: 194 / 232 (83.62 %)
  Misses: 38
  Cache size (GB): 0.24 / 0.51 (45.94 %)
  Files: 3122
Uncacheable:
  Called for linking: 18
  No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ SPACK_PACKAGE_SLUGIFY=$(echo -n ${SPACK_PACKAGE} | tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]')
$ echo "${SPACK_PACKAGE_SLUGIFY}_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
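(Aside: `SPACK_PACKAGE_SLUGIFY` above turns the package name into a shell-safe, uppercase variable prefix via `tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]'`. A Python sketch of the same transformation; `slugify` is a hypothetical name for illustration:)

```python
import re

def slugify(package):
    """Mirror the tr pipeline from the job script: replace every
    non-alphanumeric character with '_', then uppercase the result."""
    return re.sub(r"[^0-9A-Za-z]", "_", package).upper()

print(slugify("coreneuron"))        # CORENEURON
print(slugify("libsonata-report"))  # LIBSONATA_REPORT
```

The result prefixes dotenv keys like `CORENEURON_INSTALLED_HASH` so downstream jobs can reference the installed hash by package name.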
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
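(Aside: the `num_failures` step above counts `<failure>` and `<error>` elements in the junit `install.xml` and exits non-zero if any exist. The job script uses lxml, but the stdlib `ElementTree` walk is equivalent for this count; the inline XML below is a made-up minimal report, not real pipeline output.)

```python
import xml.etree.ElementTree as ET
from io import StringIO

# Tiny stand-in for install.xml: one pass, one failure, one error.
junit = StringIO("""<testsuite>
  <testcase name="ok"/>
  <testcase name="bad"><failure message="boom"/></testcase>
  <testcase name="broken"><error message="crash"/></testcase>
</testsuite>""")

root = ET.parse(junit).getroot()
# iter() walks the whole subtree, so nested failure/error nodes are counted.
num_failures = (sum(1 for _ in root.iter("failure"))
                + sum(1 for _ in root.iter("error")))
print(num_failures)  # 2
```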
section_end:1670413648:step_script section_start:1670413648:archive_cache Saving cache for successful job
Using git from spack modules
Creating cache build:coreneuron:mod2c:nvhpc:acc:debug:unified-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=165875 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Anvhpc%3Aacc%3Adebug%3Aunified-8-non_protected
Created cache
section_end:1670413663:archive_cache section_start:1670413663:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=166183 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462682 responseStatus=201 Created token=atFLeepP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=166232 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462682 responseStatus=201 Created token=atFLeepP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=166298 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462682 responseStatus=201 Created token=atFLeepP
section_end:1670413665:upload_artifacts_on_success section_start:1670413665:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413666:cleanup_file_variables Job succeeded
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413419:resolve_secrets Resolving secrets
section_end:1670413419:resolve_secrets section_start:1670413419:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor1083859080, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462683
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462683_PROD_P112_CP3_C17
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054622
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462683_PROD_P112_CP3_C17 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054622 --cpus-per-task=8 --mem=76G
section_end:1670413423:prepare_executor section_start:1670413423:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n11 via bbpv1.epfl.ch...
section_end:1670413426:prepare_script section_start:1670413426:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413427:get_sources section_start:1670413427:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:mod2c:nvhpc:acc:shared-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=156367 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413431:restore_cache section_start:1670413431:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for spack_setup (462676)...
Runtime platform  arch=amd64 os=linux pid=157441 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462676 responseStatus=200 OK token=XyDh5uKX
section_end:1670413432:download_artifacts section_start:1670413432:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462683/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462683/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462683/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462683/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462683/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462683_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
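The two `insteadOf` rules written above make git transparently rewrite both SSH-style remote URLs to token-authenticated HTTPS. A minimal sketch of that rewriting logic in Python (the `rewrite` helper and the `TOKEN` placeholder are illustrative assumptions, not part of the CI script):

```python
# Sketch of the URL rewriting that the git insteadOf rules above perform.
# TOKEN is a placeholder, not a real credential.
PREFIXES = (
    "git@bbpgitlab.epfl.ch:",
    "ssh://git@bbpgitlab.epfl.ch/",
)
TARGET = "https://gitlab-ci-token:TOKEN@bbpgitlab.epfl.ch/"

def rewrite(url: str) -> str:
    """Replace a matching insteadOf prefix with the HTTPS target."""
    # git prefers the longest matching prefix; sort candidates accordingly.
    for prefix in sorted(PREFIXES, key=len, reverse=True):
        if url.startswith(prefix):
            return TARGET + url[len(prefix):]
    return url  # URLs that match no rule are left untouched

print(rewrite("git@bbpgitlab.epfl.ch:hpc/coreneuron.git"))
```

This lets CI jobs clone dependencies declared with SSH remotes without any SSH key, using only the ephemeral `CI_JOB_TOKEN`.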
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +caliper+gpu~legacy-unit~nmodl~openmp+shared+tests~unified build_type=RelWithDebInfo ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Warning: Missing a source id for libsonata-report@1.1.1_lfp
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be w2z6dfwdq6k5tkwudhpbj7owhrne5qzi
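The hash determined above is pulled out of the `spack spec --json` output by the Python one-liner two lines earlier. A self-contained sketch of the same extraction on a mock spec document (the JSON below is a toy stand-in, not real spack output):

```python
import json

# Toy stand-in for `spack spec --json` output: the root package is the
# first entry of spec.nodes, and its DAG hash lives under "hash".
mock_spec = json.dumps({
    "spec": {
        "nodes": [
            {"name": "coreneuron", "hash": "w2z6dfwdq6k5tkwudhpbj7owhrne5qzi"},
            {"name": "bison", "hash": "6s6wcfeanp2mkdib4c3n3ivkcuosopgm"},
        ]
    }
})

installed_hash = json.loads(mock_spec)["spec"]["nodes"][0]["hash"]
print(installed_hash)
# The job later uses only the first 7 characters to name the build dir
# (see SPACK_BUILD_DIR and the ${SPACK_INSTALLED_HASH:0:7} expansion).
print(installed_hash[:7])
```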
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/1054622/ccache
Primary config: /nvme/bbpcihpcproj12/1054622/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Dec 7 12:44:18 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.36 / 0.51 (71.07 %)
Files: 4592
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: 'w2z6dfwdq6k5tkwudhpbj7owhrne5qzi'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- coreneuron%nvhpc+caliper+gpu~legacy-unit~nmodl~openmp+shared+tests~unified build_type=RelWithDebInfo
- ^caliper%gcc+cuda cuda_arch=70
- ^hpe-mpi%gcc
Concretized
--------------------------------
- w2z6dfw coreneuron@develop%nvhpc@22.3+caliper~codegenopt+gpu~ipo~ispc~knl~legacy-unit+mpi~nmodl~openmp~profile+report+shared~sympy~sympyopt+tests~unified build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 6s6wcfe ^bison@3.8.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] fnw44jf ^boost@1.79.0%nvhpc@22.3+atomic+chrono~clanglibcpp~container~context~coroutine+date_time~debug+exception~fiber+filesystem+graph~icu+iostreams+locale+log+math~mpi+multithreaded~numpy+pic+program_options~python+random+regex+serialization+shared+signals~singlethreaded+system~taggedlayout+test+thread+timer~versionedlayout+wave cxxstd=98 visibility=hidden arch=linux-rhel7-skylake
[^] pfqy3p5 ^caliper@2.7.0%gcc@11.2.0+adiak+cuda~fortran+gotcha~ipo+libdw~libpfm+libunwind+mpi+papi+sampler+shared~sosflow build_type=RelWithDebInfo cuda_arch=70 arch=linux-rhel7-skylake
[^] at3kmf2 ^adiak@0.2.1%gcc@11.2.0~ipo+mpi+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] hyunzkn ^hpe-mpi@2.25.hmpt%gcc@11.2.0 arch=linux-rhel7-skylake
[^] wu42aul ^cuda@11.6.1%gcc@11.2.0~allow-unsupported-compilers~dev arch=linux-rhel7-skylake
[^] p6adpyi ^elfutils@0.186%gcc@11.2.0~bzip2~debuginfod+nls~xz arch=linux-rhel7-skylake
[^] iu2b5hx ^gettext@0.21%gcc@11.2.0+bzip2+curses+git~libunistring+libxml2+tar+xz arch=linux-rhel7-skylake
[^] 3rmq3zx ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=linux-rhel7-skylake
[^] hxxlexb ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=linux-rhel7-skylake
[^] dnxqn2k ^libxml2@2.9.12%gcc@11.2.0~python arch=linux-rhel7-skylake
[^] ihxi5rl ^pkgconf@1.8.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] jzpqn5y ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=linux-rhel7-skylake
[^] evtnqzd ^zlib@1.2.11%gcc@11.2.0+optimize+pic+shared arch=linux-rhel7-skylake
[^] ams67cx ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=linux-rhel7-skylake
[^] ir7xtbl ^tar@1.28%gcc@11.2.0 patches=08921fcbd732050c74ddf1de7d8ad95ffdbc09f8b4342456fa2f6a0dd02a957c,125cd6142fac2cc339e9aebfe79e40f90766022b8e8401532b1729e84fc148c2,5c314db58d005043bf407abaf25eb9823b9032a22fd12a0b142d4bf548130fa4,d428578be7fb99b831eb61e53b8d88a859afe08b479a21238180899707d79ce4 arch=linux-rhel7-skylake
- bmpk6ez ^m4@1.4.16%gcc@11.2.0+sigsegv arch=linux-rhel7-skylake
[^] gc2ivgg ^libunwind@1.5.0%gcc@11.2.0~block_signals~conservative_checks~cxx_exceptions~debug~debug_frame+docs~pic+tests+weak_backtrace~xz~zlib components=none libs=shared,static arch=linux-rhel7-skylake
[^] vcit7s5 ^papi@6.0.0.1%gcc@11.2.0~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools amdgpu_target=none arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
- ajxdymq ^flex@2.6.3%nvhpc@22.3+lex~nls arch=linux-rhel7-skylake
- za4z336 ^libsonata-report@1.1.1_lfp%gcc@11.2.0~ipo+mpi build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] utj47fz ^hdf5@1.10.7%gcc@11.2.0+cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- cp3ofsp ^ninja@1.10.2%nvhpc@22.3 arch=linux-rhel7-skylake
[^] 5fkun4i ^reportinglib@2.5.6%gcc@11.2.0~ipo~profile+shared~tests build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-6s6wcfeanp2mkdib4c3n3ivkcuosopgm)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/boost-1.79.0-zoxhho (external boost-1.79.0-fnw44jfbxyivagsnavjnk6zdaghmffbt)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> hpe-mpi@2.25.hmpt : has external module in ['hpe-mpi/2.25.hmpt']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/hpe-mpi-2.25.hmpt-4ukyxt (external hpe-mpi-2.25.hmpt-hyunzknktdzri34bx26bupkvapehonql)
==> cuda@11.6.1 : has external module in ['cuda/11.6.1']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cuda-11.6.1-ngetva (external cuda-11.6.1-wu42aulcivrayjquerqpqvvjeadgosp2)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bzip2-1.0.8-eu2d6e (external bzip2-1.0.8-3rmq3zxuntsiphthnvty6gtydbmbkwr5)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libiconv-1.16-hxxlex
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/pkgconf-1.8.0-ihxi5r
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/xz-5.2.5-jzpqn5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/zlib-1.2.11-mswjss (external zlib-1.2.11-evtnqzdtuvprwnbd2nljimpzywrhw4uo)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ncurses-6.2-nsve5x (external ncurses-6.2-ams67cxbq5vc7wiay2ndr2ksce2igbfw)
[+] /usr (external tar-1.28-ir7xtblauhq3vtkpjrl7ou3nzevcsi3u)
[+] /usr (external m4-1.4.16-bmpk6ezkxo3553jsm5thxh7ygtrom3dn)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libunwind-1.5.0-gc2ivg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/papi-6.0.0.1-vcit7s
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-ajxdymqhnzqgeuhvk5nu5zfymzq35n6i)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-cp3ofspgkdqaqih5vrhotmdwvkozfswp)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/adiak-0.2.1-at3kmf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/reportinglib-2.5.6-5fkun4
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/hdf5-1.10.7-utj47f
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libxml2-2.9.12-dnxqn2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/gettext-0.21-iu2b5h
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/elfutils-0.186-p6adpy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/caliper-2.7.0-pfqy3p
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/software/install_gcc-11.2.0-skylake/libsonata-report-1.1.1_lfp-za4z33
==> Installing coreneuron-develop-w2z6dfwdq6k5tkwudhpbj7owhrne5qzi
==> No binary for coreneuron-develop-w2z6dfwdq6k5tkwudhpbj7owhrne5qzi found: installing from source
==> Warning: Missing a source id for libsonata-report@1.1.1_lfp
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112, but it is owned by 0
==> No patches needed for coreneuron
==> coreneuron: Executing phase: 'cmake'
==> coreneuron: Executing phase: 'build'
==> coreneuron: Executing phase: 'install'
==> coreneuron: Successfully installed coreneuron-develop-w2z6dfwdq6k5tkwudhpbj7owhrne5qzi
Fetch: 4.38s. Build: 1m 39.36s. Total: 1m 43.74s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/software/install_nvhpc-22.3-skylake/coreneuron-develop-w2z6df
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/1054622/ccache
Primary config: /nvme/bbpcihpcproj12/1054622/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Dec 7 12:47:22 2022
Hits: 106 / 115 (92.17 %)
Direct: 88 / 115 (76.52 %)
Preprocessed: 18 / 27 (66.67 %)
Misses: 9
Direct: 27
Preprocessed: 9
Uncacheable: 21
Primary storage:
Hits: 194 / 230 (84.35 %)
Misses: 36
Cache size (GB): 0.36 / 0.51 (71.16 %)
Files: 4610
Uncacheable:
Called for linking: 18
No input file: 3
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
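The 92.17 % figure in the ccache summary above is simply hits over cacheable lookups. A quick sketch of that arithmetic using the counts reported in the stats:

```python
# Hit-rate arithmetic behind the ccache summary above.
hits, misses = 106, 9      # "Hits: 106 / 115" and "Misses: 9"
lookups = hits + misses    # 115 cacheable compilations this run
hit_rate = round(100 * hits / lookups, 2)
print(f"{hits} / {lookups} ({hit_rate} %)")
```

Linker invocations and calls with no input file (21 in total here) are counted separately as "Uncacheable" and do not enter the ratio.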
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ SPACK_PACKAGE_SLUGIFY=$(echo -n ${SPACK_PACKAGE} | tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]')
$ echo "${SPACK_PACKAGE_SLUGIFY}_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
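The `tr` pipeline above turns the package name into an environment-variable-safe, upper-case slug: every non-alphanumeric character becomes `_`, then the result is upper-cased. An equivalent sketch in Python:

```python
import re

def slugify(package: str) -> str:
    # Mirror `tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]'`:
    # replace non-alphanumerics with '_', then upper-case (ASCII names).
    return re.sub(r"[^0-9A-Za-z]", "_", package).upper()

print(slugify("coreneuron"))
print(slugify("libsonata-report"))
```

For this job the slug is `CORENEURON`, so downstream jobs can read `CORENEURON_INSTALLED_HASH` from the dotenv artifact.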
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
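The failure count checked above is extracted from the JUnit XML that `spack install --log-format=junit` wrote to `install.xml`: the job sums the `<failure>` and `<error>` elements and exits non-zero if any exist. A minimal sketch using the standard library (the XML string is a toy example; the actual job parses the real report with `lxml`):

```python
import xml.etree.ElementTree as ET

# Toy JUnit report: one passing case, one <failure>, one <error>.
junit_xml = """
<testsuite tests="3">
  <testcase name="pkg-a"/>
  <testcase name="pkg-b"><failure message="build failed"/></testcase>
  <testcase name="pkg-c"><error message="fetch error"/></testcase>
</testsuite>
"""

root = ET.fromstring(junit_xml)
num_failures = (sum(1 for _ in root.iter("failure"))
                + sum(1 for _ in root.iter("error")))
print(num_failures)
```

Using the count itself as the exit code (as the script does) makes the job fail while recording how many specs broke.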
section_end:1670413643:step_script section_start:1670413643:archive_cache Saving cache for successful job
Using git from spack modules
Creating cache build:coreneuron:mod2c:nvhpc:acc:shared-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=165760 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Amod2c%3Anvhpc%3Aacc%3Ashared-8-non_protected
Created cache
section_end:1670413662:archive_cache section_start:1670413662:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=165989 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462683 responseStatus=201 Created token=XyDh5uKX
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=166032 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462683 responseStatus=201 Created token=XyDh5uKX
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=166071 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462683 responseStatus=201 Created token=XyDh5uKX
section_end:1670413663:upload_artifacts_on_success section_start:1670413663:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413664:cleanup_file_variables Job succeeded
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413766:resolve_secrets Resolving secrets
section_end:1670413766:resolve_secrets section_start:1670413766:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor521915566, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462680
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462680_PROD_P112_CP0_C0
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054658
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462680_PROD_P112_CP0_C0 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054658 --cpus-per-task=8 --mem=76G
section_end:1670413777:prepare_executor section_start:1670413777:prepare_script Preparing environment
Using git from spack modules
Running on r1i7n20 via bbpv1.epfl.ch...
section_end:1670413781:prepare_script section_start:1670413781:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413781:get_sources section_start:1670413781:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:nmodl:intel:debug:legacy-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=104661 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413784:restore_cache section_start:1670413784:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:nmodl (462678)...
Runtime platform  arch=amd64 os=linux pid=105018 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462678 responseStatus=200 OK token=mvokuhq6
section_end:1670413785:download_artifacts section_start:1670413785:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462680/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462680/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462680/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462680/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462680/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462680_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
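The two `echo -e` commands above write git `insteadOf` rules so that SSH-style remotes are transparently rewritten to token-authenticated HTTPS URLs. A minimal local reproduction of the same mechanism, using a throwaway `XDG_CONFIG_HOME` and placeholder host/token (`example.com`, `TOKEN` are illustrative, not the CI values):

```shell
# Write an insteadOf rule into a scratch global git config, then read it back.
export XDG_CONFIG_HOME="$(mktemp -d)"
mkdir -p "${XDG_CONFIG_HOME}/git"
printf '[url "https://gitlab-ci-token:TOKEN@example.com/"]\n\tinsteadOf = git@example.com:\n' \
    > "${XDG_CONFIG_HOME}/git/config"
# git now resolves git@example.com:group/repo.git to the HTTPS URL for any command.
git config --get 'url.https://gitlab-ci-token:TOKEN@example.com/.insteadof'
# → git@example.com:
```

Because the rewrite lives in config rather than in each remote URL, the job token never needs to appear in `.gitmodules` or checked-out remotes.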
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%intel +caliper~gpu~legacy-unit+nmodl~openmp~shared~sympy+tests~unified build_type=Debug ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70 ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
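The `SPACK_FULL_SPEC` assignment above uses the `${VAR:+word}` parameter expansion so that the `%` compiler separator only appears when `SPACK_PACKAGE_COMPILER` is set and non-empty. A runnable sketch of just that expansion (variable values are illustrative):

```shell
# ${VAR:+%} expands to "%" only when VAR is set and non-empty.
SPACK_PACKAGE=coreneuron
SPACK_PACKAGE_COMPILER=intel
echo "${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"
# → coreneuron%intel
SPACK_PACKAGE_COMPILER=
echo "${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"
# → coreneuron
```

This keeps the spec well-formed (`coreneuron` rather than `coreneuron%`) when no compiler is requested.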
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i7n20: task 0: Exited with exit code 1
section_end:1670413817:step_script section_start:1670413817:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=108627 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462680 responseStatus=201 Created token=mvokuhq6
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=108663 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462680 responseStatus=201 Created token=mvokuhq6
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=108744 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462680 responseStatus=201 Created token=mvokuhq6
section_end:1670413818:upload_artifacts_on_failure section_start:1670413818:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413819:cleanup_file_variables ERROR: Job failed: exit status 1

Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413767:resolve_secrets Resolving secrets
section_end:1670413767:resolve_secrets section_start:1670413767:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor632560793, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462681
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462681_PROD_P112_CP1_C5
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054659
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
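The `job state: PD` … `job state: R` lines above show the executor polling the batch job until Slurm reports it running. A hypothetical sketch of such a loop; the real runner would query something like `squeue -h -o %t -j <jobid>`, which is replaced here by a counter so the sketch runs anywhere:

```shell
# Poll a stand-in "scheduler" until the job leaves pending (PD) for running (R).
i=0
while :; do
  i=$((i+1))
  if [ "$i" -lt 3 ]; then state=PD; else state=R; fi  # stand-in for squeue output
  echo "job state: $state"
  [ "$state" = "R" ] && break
  # the real runner would sleep between polls here
done
# → job state: PD
# → job state: PD
# → job state: R
```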
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462681_PROD_P112_CP1_C5 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054659 --cpus-per-task=8 --mem=76G
section_end:1670413777:prepare_executor section_start:1670413777:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n17 via bbpv1.epfl.ch...
section_end:1670413782:prepare_script section_start:1670413782:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413782:get_sources section_start:1670413782:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:nmodl:intel:shared:debug-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=48956 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413785:restore_cache section_start:1670413785:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:nmodl (462678)...
Runtime platform  arch=amd64 os=linux pid=49129 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462678 responseStatus=200 OK token=vaC6SEfN
section_end:1670413786:download_artifacts section_start:1670413786:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
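The pipeline above snapshots the environment using NUL delimiters (`env -0`, `sort -z`, `xargs -0`) so that variable values containing newlines cannot split or reorder entries. A self-contained sketch with `printf` standing in for `env -0` (requires GNU `sort`/`xargs`):

```shell
# NUL-delimited sort keeps multi-line values intact; echo emits one entry per line.
printf 'B=2\0A=1\0' | sort -z | xargs -0 -L 1 echo
# → A=1
# → B=2
```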
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
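The collapsed heredoc above writes the per-job `config.yaml`; its body is not shown in the log. The shape of such a write, with an illustrative body loosely based on the `build_stage` entry that `spack config blame` reports below (the exact CI contents are unknown):

```shell
# Write a YAML fragment via an unquoted heredoc, so ${PWD} expands at write time.
SPACK_USER_CONFIG_PATH="$(mktemp -d)"
cat > "${SPACK_USER_CONFIG_PATH}/config.yaml" << END_SCRIPT
config:
  build_stage:
    - ${PWD}/spack-build
END_SCRIPT
grep -c build_stage "${SPACK_USER_CONFIG_PATH}/config.yaml"
# → 1
```

Pointing `SPACK_USER_CONFIG_PATH` at a job-local directory keeps each CI job's Spack configuration isolated from other jobs sharing the same user.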
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462681/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462681/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462681/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462681/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462681/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462681_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%intel ~caliper~gpu~legacy-unit+nmodl~openmp+shared+sympy+tests~unified build_type=Debug ^hpe-mpi%gcc ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i4n17: task 0: Exited with exit code 1
section_end:1670413813:step_script section_start:1670413813:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51145 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462681 responseStatus=201 Created token=vaC6SEfN
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51188 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462681 responseStatus=201 Created token=vaC6SEfN
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51235 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462681 responseStatus=201 Created token=vaC6SEfN
section_end:1670413815:upload_artifacts_on_failure section_start:1670413815:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413815:cleanup_file_variables ERROR: Job failed: exit status 1

Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413767:resolve_secrets Resolving secrets
section_end:1670413767:resolve_secrets section_start:1670413767:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor2598914807, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462684
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462684_PROD_P112_CP2_C6
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054660
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462684_PROD_P112_CP2_C6 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054660 --cpus-per-task=8 --mem=76G
section_end:1670413778:prepare_executor section_start:1670413778:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n17 via bbpv1.epfl.ch...
section_end:1670413782:prepare_script section_start:1670413782:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413783:get_sources section_start:1670413783:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:nmodl:nvhpc:acc:debug:legacy-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=49054 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413788:restore_cache section_start:1670413788:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:nmodl (462678)...
Runtime platform  arch=amd64 os=linux pid=49678 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462678 responseStatus=200 OK token=_CbouScf
section_end:1670413789:download_artifacts section_start:1670413789:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462684/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462684/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462684/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462684/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462684/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462684_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
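The two insteadOf rules above make git transparently rewrite both the SCP-style (`git@host:path`) and `ssh://` remote URL forms to token-authenticated HTTPS, so any clone or fetch inside the job uses `CI_JOB_TOKEN` without touching SSH keys. A minimal standalone sketch of the same mechanism, using a placeholder `TOKEN` where the job writes the (masked) `${CI_JOB_TOKEN}`:

```shell
# Hypothetical reproduction of the rewrite rule from this job log; TOKEN is
# a placeholder value, not a real credential.
export XDG_CONFIG_HOME="$(mktemp -d)"
mkdir -p "${XDG_CONFIG_HOME}/git"
printf '[url "https://gitlab-ci-token:TOKEN@bbpgitlab.epfl.ch/"]\n\tinsteadOf = git@bbpgitlab.epfl.ch:\n' \
    > "${XDG_CONFIG_HOME}/git/config"
# `git ls-remote --get-url` expands insteadOf rules without contacting the
# server, so we can see the rewritten URL directly (HOME is pointed at an
# empty directory to avoid picking up a local ~/.gitconfig):
HOME="$(mktemp -d)" git ls-remote --get-url git@bbpgitlab.epfl.ch:hpc/coreneuron.git
```

With the rule in place, the command prints the HTTPS form with the embedded token instead of the SCP-style URL it was given.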
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +caliper+gpu~legacy-unit+nmodl~openmp~shared~sympy+tests~unified build_type=Debug ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70 ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
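The `${SPACK_PACKAGE_COMPILER:+%}` expansion in the `SPACK_FULL_SPEC` assignment above is what splices in the `%` compiler separator only when a compiler is actually set; with the variable unset or empty, the spec degrades to the bare package name instead of ending up with a dangling `%`. A small sketch of that shell idiom:

```shell
# ${VAR:+word} expands to 'word' only when VAR is set and non-empty.
SPACK_PACKAGE=coreneuron
SPACK_PACKAGE_COMPILER=nvhpc
echo "${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"  # coreneuron%nvhpc
SPACK_PACKAGE_COMPILER=
echo "${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"  # coreneuron
```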
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i4n17: task 0: Exited with exit code 1
section_end:1670413814:step_script section_start:1670413814:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51332 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462684 responseStatus=201 Created token=_CbouScf
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51476 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462684 responseStatus=201 Created token=_CbouScf
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51620 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462684 responseStatus=201 Created token=_CbouScf
section_end:1670413816:upload_artifacts_on_failure section_start:1670413816:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413816:cleanup_file_variables ERROR: Job failed: exit status 1
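The `section_start:<unix-ts>:<name>` / `section_end:<unix-ts>:<name>` markers scattered through this log are the GitLab runner's collapsible-section delimiters; diffing the two timestamps of a matching pair gives that stage's wall-clock duration. For example, using the artifact-upload markers of the job above:

```shell
# Compute a section duration from the raw markers as they appear in the log.
start='section_start:1670413814:upload_artifacts_on_failure'
end='section_end:1670413816:upload_artifacts_on_failure'
t0=${start#section_start:}; t0=${t0%%:*}
t1=${end#section_end:};     t1=${t1%%:*}
echo "upload_artifacts_on_failure: $((t1 - t0))s"  # upload_artifacts_on_failure: 2s
```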

Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413767:resolve_secrets Resolving secrets
section_end:1670413767:resolve_secrets section_start:1670413767:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor322120664, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462685
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462685_PROD_P112_CP4_C9
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054661
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462685_PROD_P112_CP4_C9 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054661 --cpus-per-task=8 --mem=76G
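The sbatch/srun pair logged above shows the executor's allocate-then-exec pattern: `sbatch --wrap="sleep infinity"` only holds a node allocation open, and every subsequent job step is injected into it with `srun --jobid=<id>`. A sketch of that two-step pattern (nothing is actually submitted here; the flag values are copied from this log):

```shell
# Sketch only: construct the two commands the runner issues, without
# submitting anything to Slurm.
jobid=1054661   # in the real job this is parsed from "Submitted batch job ..."
alloc='sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --time=1:00:00 --wrap="sleep infinity"'
step="srun --mpi=none --jobid=$jobid --ntasks=2 --cpus-per-task=8 --mem=76G"
echo "$alloc"
echo "$step"
```

The `sleep infinity` wrapper is what keeps the allocation alive between steps; cancelling the batch job tears everything down at once.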
section_end:1670413777:prepare_executor section_start:1670413777:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n17 via bbpv1.epfl.ch...
section_end:1670413782:prepare_script section_start:1670413782:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413783:get_sources section_start:1670413783:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:nmodl:nvhpc:acc:shared-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=49001 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413788:restore_cache section_start:1670413788:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:nmodl (462678)...
Runtime platform  arch=amd64 os=linux pid=49626 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462678 responseStatus=200 OK token=HduP6Rqn
section_end:1670413789:download_artifacts section_start:1670413789:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
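The environment snapshot above uses NUL delimiters (`env -0 | sort -z | xargs -0 -L 1 echo`) so that variable values containing embedded newlines survive the sort as single records; only the final `xargs ... echo` converts each record back to text. A standalone sketch:

```shell
# A value with an embedded newline stays one record through the NUL-delimited
# sort; a plain `env | sort` would interleave its second line with unrelated
# variables.
multi_val="$(printf 'first\nsecond')"
snapshot=$(MULTI="$multi_val" env -0 | sort -z | xargs -0 -L 1 echo)
echo "$snapshot" | grep '^MULTI=first'  # MULTI=first
```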
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462685/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462685/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462685/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462685/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462685/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462685_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +caliper+gpu~legacy-unit+nmodl~openmp+shared+sympy+tests~unified build_type=RelWithDebInfo ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70 ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i4n17: task 0: Exited with exit code 1
section_end:1670413814:step_script section_start:1670413814:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51389 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462685 responseStatus=201 Created token=HduP6Rqn
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51519 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462685 responseStatus=201 Created token=HduP6Rqn
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51687 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462685 responseStatus=201 Created token=HduP6Rqn
section_end:1670413816:upload_artifacts_on_failure section_start:1670413816:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413816:cleanup_file_variables ERROR: Job failed: exit status 1

Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413768:resolve_secrets Resolving secrets
section_end:1670413768:resolve_secrets section_start:1670413768:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor2500334939, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462687
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462687_PROD_P112_CP7_C13
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054663
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462687_PROD_P112_CP7_C13 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054663 --cpus-per-task=8 --mem=76G
section_end:1670413777:prepare_executor section_start:1670413777:prepare_script Preparing environment
Using git from spack modules
Running on r1i6n18 via bbpv1.epfl.ch...
section_end:1670413779:prepare_script section_start:1670413779:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413780:get_sources section_start:1670413780:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:nmodl:nvhpc:omp:debug-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=296859 revision=58ba2b95 version=14.2.0
Downloading cache.zip from https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Acoreneuron%3Anmodl%3Anvhpc%3Aomp%3Adebug-8-non_protected
Successfully extracted cache
section_end:1670413787:restore_cache section_start:1670413787:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:nmodl (462678)...
Runtime platform  arch=amd64 os=linux pid=296946 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462678 responseStatus=200 OK token=5t34smhr
section_end:1670413788:download_artifacts section_start:1670413788:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462687/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462687/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462687/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462687/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462687/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462687_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
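The `insteadOf` rewrite shown above can be reproduced outside CI; a minimal sketch, assuming `git` is available and using a dummy token in place of the masked `CI_JOB_TOKEN`:

```shell
# Sketch of the runner's URL-rewrite setup; the token value is a placeholder.
export XDG_CONFIG_HOME="$(mktemp -d)"
CI_JOB_TOKEN="dummy-token"
mkdir -p "${XDG_CONFIG_HOME}/git"
printf '[url "https://gitlab-ci-token:%s@bbpgitlab.epfl.ch/"]\n\tinsteadOf = git@bbpgitlab.epfl.ch:\n' \
  "${CI_JOB_TOKEN}" > "${XDG_CONFIG_HOME}/git/config"
# With XDG_CONFIG_HOME exported, git invocations in the job rewrite SSH-style
# remotes to token-authenticated HTTPS, so cloning needs no SSH key.
git config --file "${XDG_CONFIG_HOME}/git/config" \
  --get "url.https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/.insteadOf"
```

The config key is addressed as `url.<base>.insteadOf`, which is why the `--get` query above recovers the `git@bbpgitlab.epfl.ch:` prefix.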
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +caliper+gpu~legacy-unit+nmodl+openmp~shared+sympy+tests~unified build_type=Debug ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70 ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
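The `SPACK_FULL_SPEC` assembly above relies on the `${VAR:+word}` parameter expansion; a small illustration with made-up values (the variable names mirror the job script, the contents are examples):

```shell
# ${VAR:+word} expands to word only when VAR is set and non-empty, so the
# "%" separator vanishes cleanly when no compiler is requested.
SPACK_PACKAGE="coreneuron"
SPACK_PACKAGE_COMPILER="nvhpc"
with_compiler="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"
SPACK_PACKAGE_COMPILER=""
without_compiler="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"
echo "$with_compiler"     # coreneuron%nvhpc
echo "$without_compiler"  # coreneuron
```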
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i6n18: task 0: Exited with exit code 1
section_end:1670413814:step_script section_start:1670413814:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=297483 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462687 responseStatus=201 Created token=5t34smhr
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=297519 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462687 responseStatus=201 Created token=5t34smhr
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=297560 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462687 responseStatus=201 Created token=5t34smhr
section_end:1670413816:upload_artifacts_on_failure section_start:1670413816:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413816:cleanup_file_variables ERROR: Job failed: exit status 1

Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413768:resolve_secrets Resolving secrets
section_end:1670413768:resolve_secrets section_start:1670413768:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor4145081594, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462686
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462686_PROD_P112_CP6_C10
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054662
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462686_PROD_P112_CP6_C10 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054662 --cpus-per-task=8 --mem=76G
section_end:1670413777:prepare_executor section_start:1670413777:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n17 via bbpv1.epfl.ch...
section_end:1670413782:prepare_script section_start:1670413782:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413782:get_sources section_start:1670413782:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:coreneuron:nmodl:nvhpc:omp:legacy-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=48900 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413788:restore_cache section_start:1670413788:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:nmodl (462678)...
Runtime platform  arch=amd64 os=linux pid=49750 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462678 responseStatus=200 OK token=dFRfDJf8
section_end:1670413789:download_artifacts section_start:1670413789:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
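The NUL-delimited pipeline above is a robustness idiom worth noting; a standalone sketch of the same snapshot step (file name is illustrative):

```shell
# env -0 emits NUL-separated NAME=VALUE entries, sort -z orders them without
# splitting on newlines embedded in values, and xargs -0 -L 1 echo writes one
# entry per line into the snapshot file.
snapshot="$(mktemp)"
env -0 | sort -z | xargs -0 -L 1 echo > "$snapshot"
grep -c '^PATH=' "$snapshot"
```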
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462686/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462686/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462686/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462686/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462686/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462686_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install coreneuron%nvhpc +caliper+gpu~legacy-unit+nmodl+openmp~shared~sympy+tests~unified build_type=RelWithDebInfo ^hpe-mpi%gcc ^caliper%gcc+cuda cuda_arch=70 ^/4dzxcpsuksxgtuoesheax4sf76wrhkqb
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i4n17: task 0: Exited with exit code 1
section_end:1670413814:step_script section_start:1670413814:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51431 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462686 responseStatus=201 Created token=dFRfDJf8
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51558 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462686 responseStatus=201 Created token=dFRfDJf8
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=51696 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462686 responseStatus=201 Created token=dFRfDJf8
section_end:1670413816:upload_artifacts_on_failure section_start:1670413816:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413816:cleanup_file_variables ERROR: Job failed: exit status 1

Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413637:resolve_secrets Resolving secrets
section_end:1670413637:resolve_secrets section_start:1670413637:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor781503776, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462688
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462688_PROD_P112_CP1_C9
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054649
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462688_PROD_P112_CP1_C9 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054649 --cpus-per-task=8 --mem=76G
section_end:1670413640:prepare_executor section_start:1670413640:prepare_script Preparing environment
Using git from spack modules
Running on r1i7n20 via bbpv1.epfl.ch...
section_end:1670413643:prepare_script section_start:1670413643:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413644:get_sources section_start:1670413644:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:neuron:mod2c:intel:shared:debug-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=90635 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413650:restore_cache section_start:1670413650:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:coreneuron:mod2c:intel:shared:debug (462679)...
Runtime platform  arch=amd64 os=linux pid=91144 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462679 responseStatus=200 OK token=uT-YHxag
section_end:1670413650:download_artifacts section_start:1670413650:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462688/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462688/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462688/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462688/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462688/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP`` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462688_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
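The config printed above rewrites both SSH remote spellings (`git@host:` and `ssh://git@host/`) to token-authenticated HTTPS. A minimal sketch of the same file generation plus a sanity check — the token value and temp paths here are placeholders, not real credentials:

```shell
# Write a git config that rewrites SSH-style remotes to HTTPS with a CI token.
# CI_JOB_TOKEN is a dummy value for illustration only.
CI_JOB_TOKEN="dummy-token"
CFG_DIR="$(mktemp -d)/git"
mkdir -p "${CFG_DIR}"
printf '[url "https://gitlab-ci-token:%s@bbpgitlab.epfl.ch/"]\n insteadOf = git@bbpgitlab.epfl.ch:\n' "${CI_JOB_TOKEN}" > "${CFG_DIR}/config"
printf '[url "https://gitlab-ci-token:%s@bbpgitlab.epfl.ch/"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/\n' "${CI_JOB_TOKEN}" >> "${CFG_DIR}/config"
# Both SSH remote spellings should now be covered by a rewrite rule.
grep -c 'insteadOf' "${CFG_DIR}/config"   # 2
```

Pointing `XDG_CONFIG_HOME` at a per-job directory, as the job does, keeps this token-bearing config out of any shared or persistent location.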
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%intel +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/udoqinvar6cp75viyxkh44mgzbht4q76
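The spec string above is assembled with `${SPACK_PACKAGE_COMPILER:+%}`, a parameter expansion that emits the `%` separator only when a compiler was requested, so an unset compiler does not leave a dangling `%` in the spec. A small sketch of that expansion (variable names match the job script; values are dummies):

```shell
# ${VAR:+word} expands to "word" only when VAR is set and non-empty,
# so "%" joins package and compiler only when a compiler is given.
SPACK_PACKAGE="neuron"

SPACK_PACKAGE_COMPILER="intel"
SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"
echo "${SPEC}"   # neuron%intel

SPACK_PACKAGE_COMPILER=""
SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER}"
echo "${SPEC}"   # neuron
```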
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i7n20: task 0: Exited with exit code 1
section_end:1670413677:step_script section_start:1670413677:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=94297 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462688 responseStatus=201 Created token=uT-YHxag
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=94349 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462688 responseStatus=201 Created token=uT-YHxag
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=94393 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462688 responseStatus=201 Created token=uT-YHxag
section_end:1670413679:upload_artifacts_on_failure section_start:1670413679:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413680:cleanup_file_variables ERROR: Job failed: exit status 1

Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413667:resolve_secrets Resolving secrets
section_end:1670413667:resolve_secrets section_start:1670413667:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor89935053, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462691
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462691_PROD_P112_CP2_C14
Job parameters: memory=76G, cpus_per_task=8, duration=2:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054653
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462691_PROD_P112_CP2_C14 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=2:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054653 --cpus-per-task=8 --mem=76G
section_end:1670413669:prepare_executor section_start:1670413669:prepare_script Preparing environment
Using git from spack modules
Running on r1i7n22 via bbpv1.epfl.ch...
section_end:1670413671:prepare_script section_start:1670413671:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413671:get_sources section_start:1670413671:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:neuron:mod2c:nvhpc:acc:shared-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=205158 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413677:restore_cache section_start:1670413677:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc:shared (462683)...
Runtime platform  arch=amd64 os=linux pid=205928 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462683 responseStatus=200 OK token=sz1qLkeP
section_end:1670413678:download_artifacts section_start:1670413678:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
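The `env -0 | sort -z` pipeline above snapshots the environment using NUL-delimited records, so sorting stays correct even when a variable's value contains embedded newlines (a plain `env | sort` would shuffle the fragments of such values). A reproducible sketch of the same snapshot:

```shell
# Snapshot the environment: sort NUL-delimited records so multi-line
# values cannot corrupt the ordering, then write one record per echo.
OUT="$(mktemp)"
env -0 | sort -z | xargs -0 -L 1 echo > "${OUT}"
# The snapshot should be non-empty and hold NAME=VALUE records.
grep -q '=' "${OUT}"
```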
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462691/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462691/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462691/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462691/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462691/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462691_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install neuron%nvhpc +coreneuron+debug+tests~legacy-unit~rx3d model_tests=channel-benchmark,olfactory,tqperf-heavy ^/w2z6dfwdq6k5tkwudhpbj7owhrne5qzi
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
==> Error: Package 'hdf5' not found in repository '/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/bluebrain/repo-bluebrain'
srun: error: r1i7n22: task 0: Exited with exit code 1
section_end:1670413702:step_script section_start:1670413702:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=206599 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462691 responseStatus=201 Created token=sz1qLkeP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=206640 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462691 responseStatus=201 Created token=sz1qLkeP
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=206682 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462691 responseStatus=201 Created token=sz1qLkeP
section_end:1670413703:upload_artifacts_on_failure section_start:1670413703:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413704:cleanup_file_variables ERROR: Job failed: exit status 1

Log was not fetched because job had status: skipped
Log was not fetched because job had status: skipped
Log was not fetched because job had status: skipped
Log was not fetched because job had status: skipped
Log was not fetched because job had status: skipped
Log was not fetched because job had status: skipped
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413418:resolve_secrets Resolving secrets
section_end:1670413418:resolve_secrets section_start:1670413418:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor1541314778, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462678
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=8, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462678_PROD_P112_CP0_C0
Job parameters: memory=76G, cpus_per_task=8, duration=1:00:00, constraint=cpu ntasks=2 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054619
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=2 --cpus-per-task=8 --mem=76G --job-name=GL_J462678_PROD_P112_CP0_C0 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=2 --jobid=1054619 --cpus-per-task=8 --mem=76G
section_end:1670413420:prepare_executor section_start:1670413420:prepare_script Preparing environment
Using git from spack modules
Running on r1i4n11 via bbpv1.epfl.ch...
section_end:1670413422:prepare_script section_start:1670413422:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413422:get_sources section_start:1670413422:restore_cache Restoring cache
Using git from spack modules
Checking cache for build:nmodl-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=155413 revision=58ba2b95 version=14.2.0
cache.zip is up to date 
Successfully extracted cache
section_end:1670413427:restore_cache section_start:1670413427:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for spack_setup (462676)...
Runtime platform  arch=amd64 os=linux pid=156468 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462676 responseStatus=200 OK token=7o7VaR3L
section_end:1670413428:download_artifacts section_start:1670413428:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ SPACK_BUILD="${PWD}/spack-build"
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export SPACK_USER_CONFIG_PATH=${PWD}/spack-config
$ mkdir ${SPACK_USER_CONFIG_PATH}
$ cat > ${SPACK_USER_CONFIG_PATH}/config.yaml << END_SCRIPT # collapsed multi-line command
$ spack config blame config
--- config:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462678/spack-config/config.yaml:2 build_stage:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462678/spack-config/config.yaml:3 - /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462678/spack-build
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462678/spack-config/config.yaml:4 source_cache: /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462678/spack-source-cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/config.yaml:2 ccache: True
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:2 install_tree:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:3 root: $user_cache_path/software
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:4 projections:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:5 all: install_{compiler.name}-{compiler.version}-{target}/{name}-{version}-{hash:6}
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:6 module_roots:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:7 tcl: $user_cache_path/modules
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:9 environments:
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:10 root: $user_cache_path/environments
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:88 concretization: separately
/gpfs/bbp.cscs.ch/ssd/apps/bsd//config/config.yaml:12 build_jobs: 8
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 extensions:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:18 - $spack/bluebrain/spack-scripting
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # This is the path to the root of the Spack install tree.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:19 # You can use $spack here to refer to the root of the spack instance.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 template_dirs:
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:37 - $spack/share/spack/templates
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Temporary locations Spack can try to use for builds.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Recommended options are given below.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Builds can be faster in temporary directories on some (e.g., HPC) systems.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Specifying `$tempdir` will ensure use of the default temporary directory
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # (i.e., ``$TMP` or ``$TMPDIR``).
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # Another option that prevents conflicts and potential permission issues is
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # to specify `$user_cache_path/stage`, which ensures each user builds in their
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # home directory.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # A more traditional path uses the value of `$spack/var/spack/stage`, which
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # builds directly inside Spack's instance without staging them in a
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # temporary space. Problems with specifying a path inside a Spack instance
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # are that it precludes its use as a system package and its ability to be
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # pip installable.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # In any case, if the username is not already in the path, Spack will append
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # the value of `$user` in an attempt to avoid potential conflicts between
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # users in shared temporary spaces.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 #
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # The build stage can be purged with `spack clean --stage` and
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # `spack clean -a`, so it is important that the specified directory uniquely
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:38 # identifies Spack staging to avoid accidentally wiping out non-Spack work.
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:73 test_stage: $user_cache_path/test
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:82 misc_cache: $user_cache_path/cache
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:94 connect_timeout: 10
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:99 verify_ssl: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:107 suppress_gpg_warnings: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:113 install_missing_compilers: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:118 checksum: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:123 deprecated: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:128 dirty: False
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:136 build_language: C
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:145 locks: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:150 url_fetch_method: urllib
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:176 concretizer: clingo
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:183 db_lock_timeout: 300
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:191 package_lock_timeout: null
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:197 shared_linking: rpath
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:202 allow_sgid: True
/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676/spack/etc/spack/defaults/config.yaml:207 terminal_title: False
_builtin debug: False
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462678_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
$ SPACK_FULL_SPEC="${SPACK_PACKAGE}${SPACK_PACKAGE_COMPILER:+%}${SPACK_PACKAGE_COMPILER} ${SPACK_PACKAGE_SPEC} ${SPACK_PACKAGE_DEPENDENCIES} ${SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB}"
$ echo "Preparing to install ${SPACK_FULL_SPEC}"
Preparing to install nmodl%gcc ~legacy-unit
$ JSON_SPEC=$(spack spec --json ${SPACK_FULL_SPEC})
$ SPACK_INSTALLED_HASH=$(module load unstable python; echo "${JSON_SPEC}" | python -c "import json, sys; print(json.loads(sys.stdin.read())[\"spec\"][\"nodes\"][0][\"hash\"])")
$ echo "Determined its hash will be ${SPACK_INSTALLED_HASH}"
Determined its hash will be 4dzxcpsuksxgtuoesheax4sf76wrhkqb
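The one-liner above pipes `spack spec --json` into Python and reads the hash of the root node (index 0), which becomes `SPACK_INSTALLED_HASH` and, truncated to 7 characters, names the `spack-build-<hash>` directory. A sketch with a minimal stand-in document (the real JSON from spack carries many more fields):

```python
import json

# Minimal stand-in for the output of `spack spec --json`; only the fields
# the pipeline actually reads are included here.
fake_spec_json = json.dumps(
    {"spec": {"nodes": [{"name": "nmodl",
                         "hash": "4dzxcpsuksxgtuoesheax4sf76wrhkqb"}]}}
)

installed_hash = json.loads(fake_spec_json)["spec"]["nodes"][0]["hash"]
print(installed_hash)      # full 32-character hash -> SPACK_INSTALLED_HASH
print(installed_hash[:7])  # short form used in the spack-build-<hash> dir name
```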
$ SPACK_STAGE_DIR=${SPACK_BUILD}/spack-stage-${SPACK_PACKAGE}-develop-${SPACK_INSTALLED_HASH}
$ SPACK_BUILD_DIR=${SPACK_STAGE_DIR}/spack-build-${SPACK_INSTALLED_HASH:0:7}
$ SPACK_SOURCE_DIR=${SPACK_STAGE_DIR}/spack-src
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ module load unstable ccache
$ export CCACHE_BASEDIR=$(realpath -P ${CI_BUILDS_DIR})
$ echo CCACHE_BASEDIR=${CCACHE_BASEDIR}
CCACHE_BASEDIR=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112
$ export CCACHE_MAXSIZE=512M
$ export CCACHE_DIR="${TMPDIR}/ccache"
$ mkdir -p ${CCACHE_DIR}
$ if [ -f ${CI_PROJECT_DIR}/ccache.tar ]; then
$ tar -C "${CCACHE_DIR}" -xf "${CI_PROJECT_DIR}/ccache.tar"
$ fi
$ ccache --zero-stats
Statistics zeroed
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/1054619/ccache
Primary config: /nvme/bbpcihpcproj12/1054619/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Dec 7 12:44:15 2022
Hits: 0 / 0
Direct: 0 / 0
Preprocessed: 0 / 0
Misses: 0
Direct: 0
Preprocessed: 0
Primary storage:
Hits: 0 / 0
Misses: 0
Cache size (GB): 0.44 / 0.51 (86.80 %)
Files: 584
$ fi
$ module load unstable git ${SPACK_EXTRA_MODULES}
$ spack uninstall -y --dependents /${SPACK_INSTALLED_HASH} || true
==> Error: No installed spec matches the hash: '4dzxcpsuksxgtuoesheax4sf76wrhkqb'
$ spack spec -Il ${SPACK_FULL_SPEC}
Input spec
--------------------------------
- nmodl%gcc~legacy-unit
Concretized
--------------------------------
- 4dzxcps nmodl@develop%gcc@11.2.0~ipo~legacy-unit~llvm~llvm_cuda~python build_type=RelWithDebInfo arch=linux-rhel7-skylake
- cppb7al ^bison@3.8.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] atktt2p ^catch2@2.13.8%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 2qmvlfy ^cmake@3.21.4%gcc@11.2.0~doc+ncurses+openssl+ownlibs~qt build_type=Release arch=linux-rhel7-skylake
[^] e5qqxxq ^cli11@2.1.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
- 4bt76dp ^flex@2.6.3%gcc@11.2.0+lex~nls arch=linux-rhel7-skylake
[^] 7lotjqk ^fmt@8.1.1%gcc@11.2.0~ipo+pic~shared build_type=RelWithDebInfo cxxstd=11 arch=linux-rhel7-skylake
- utrxbc3 ^ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] jjl6cjc ^nlohmann-json@3.10.4%gcc@11.2.0~ipo~multiple_headers build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] onq3mhg ^py-jinja2@3.0.1%gcc@11.2.0~i18n arch=linux-rhel7-skylake
[^] 4ectow5 ^py-markupsafe@2.0.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] v4z3s5e ^py-setuptools@57.4.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 72xzp3v ^python@3.9.7%gcc@11.2.0+bz2+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tix~tkinter~ucs4+uuid+zlib patches=0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87,4c2457325f2b608b1b6a2c63087df8c26e07db3e3d493caf36a56f0ecf6fb768,f2fd060afc4b4618fe8104c4c5d771f36dc55b1db5a4623785a4ea707ec72fb4 arch=linux-rhel7-skylake
[^] z5kzef6 ^py-pybind11@2.9.1%gcc@11.2.0~ipo build_type=RelWithDebInfo arch=linux-rhel7-skylake
[^] 22arfs4 ^py-pytest@6.2.4%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ascbeii ^py-attrs@21.2.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] kvw3vhm ^py-iniconfig@1.1.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] 7iyiygo ^py-packaging@21.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] c7qvw2q ^py-pyparsing@2.4.7%gcc@11.2.0 arch=linux-rhel7-skylake
[^] mazoiox ^py-pluggy@0.13.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ciusbmc ^py-setuptools-scm@6.3.2%gcc@11.2.0+toml arch=linux-rhel7-skylake
[^] hmcew6w ^py-tomli@1.2.1%gcc@11.2.0 arch=linux-rhel7-skylake
[^] sxd7srs ^py-pip@21.1.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] y7rfzdj ^py-py@1.9.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] w4gddqx ^py-toml@0.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
[^] ci5oe5b ^py-pyyaml@5.3.1%gcc@11.2.0+libyaml arch=linux-rhel7-skylake
[^] xj2wlac ^libyaml@0.2.5%gcc@11.2.0 arch=linux-rhel7-skylake
[^] vt2or7v ^py-sympy@1.9%gcc@11.2.0 arch=linux-rhel7-skylake
[^] dzb2mfs ^py-mpmath@1.1.0%gcc@11.2.0 arch=linux-rhel7-skylake
[^] r74vcyb ^spdlog@1.9.2%gcc@11.2.0~ipo+shared build_type=RelWithDebInfo arch=linux-rhel7-skylake
$ spack ${SPACK_INSTALL_EXTRA_FLAGS} install -j${SLURM_CPUS_PER_TASK} --log-format=junit --log-file=${CI_PROJECT_DIR}/install.xml --keep-stage ${SPACK_FULL_SPEC} || install_failed=1
==> bison@3.8.2 : has external module in ['bison/3.8.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/bison-3.8.2-ej2cew (external bison-3.8.2-cppb7alftvhxbedsuxqv72z2thjuoizw)
==> cmake@3.21.4 : has external module in ['cmake/3.21.4']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/cmake-3.21.4-cdyb7k (external cmake-3.21.4-2qmvlfyylrv3t5ankluyr5cqey2nlfzd)
==> flex@2.6.3 : has external module in ['flex/2.6.3']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/flex-2.6.3-3mtl7j (external flex-2.6.3-4bt76dpxbix6ep4qtz3mv5i2iddilv53)
==> ninja@1.10.2 : has external module in ['ninja/1.10.2']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ninja-1.10.2-eh33rh (external ninja-1.10.2-utrxbc3aohnru5eynalc3hyv4ca4jqte)
==> python@3.9.7 : has external module in ['python/3.9.7']
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/python-3.9.7-yj5alh (external python-3.9.7-72xzp3vgzh5b424qevkfruhs3ajzqbiy)
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/libyaml-0.2.5-xj2wla
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/cli11-2.1.1-e5qqxx
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/fmt-8.1.1-7lotjq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/nlohmann-json-3.10.4-jjl6cj
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/catch2-2.13.8-atktt2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-57.4.0-v4z3s5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-mpmath-1.1.0-dzb2mf
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyyaml-5.3.1-ci5oe5
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/spdlog-1.9.2-r74vcy
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-toml-0.10.2-w4gddq
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pyparsing-2.4.7-c7qvw2
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-markupsafe-2.0.1-4ectow
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-iniconfig-1.1.1-kvw3vh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pip-21.1.2-sxd7sr
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pybind11-2.9.1-z5kzef
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-attrs-21.2.0-ascbei
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-sympy-1.9-vt2or7
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-packaging-21.0-7iyiyg
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-jinja2-3.0.1-onq3mh
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-tomli-1.2.1-hmcew6
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-setuptools-scm-6.3.2-ciusbm
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pluggy-0.13.0-mazoio
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-py-1.9.0-y7rfzd
[+] /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_applications/install_gcc-11.2.0-skylake/py-pytest-6.2.4-22arfs
==> Installing nmodl-develop-4dzxcpsuksxgtuoesheax4sf76wrhkqb
==> No binary for nmodl-develop-4dzxcpsuksxgtuoesheax4sf76wrhkqb found: installing from source
==> Warning: Expected user 904556 to own /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112, but it is owned by 0
==> No patches needed for nmodl
==> nmodl: Executing phase: 'cmake'
==> nmodl: Executing phase: 'build'
==> nmodl: Executing phase: 'install'
==> nmodl: Successfully installed nmodl-develop-4dzxcpsuksxgtuoesheax4sf76wrhkqb
Fetch: 57.37s. Build: 2m 57.09s. Total: 3m 54.45s.
[+] /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/software/install_gcc-11.2.0-skylake/nmodl-develop-4dzxcp
$ chmod -R g+rX "${SPACK_BUILD}"
$ if [[ ${install_failed} == 1 ]]; then exit 1; fi
$ if [ ${SPACK_USE_CCACHE+x} ]; then
$ ccache --cleanup
$ ccache --show-stats --verbose
Summary:
Cache directory: /nvme/bbpcihpcproj12/1054619/ccache
Primary config: /nvme/bbpcihpcproj12/1054619/ccache/ccache.conf
Secondary config: /gpfs/bbp.cscs.ch/ssd/apps/bsd/2022-01-10/stage_externals/install_gcc-11.2.0-skylake/ccache-4.4.2-6qbndl/etc/ccache.conf
Stats updated: Wed Dec 7 12:48:57 2022
Hits: 88 / 132 (66.67 %)
Direct: 1 / 132 (0.76 %)
Preprocessed: 87 / 131 (66.41 %)
Misses: 44
Direct: 131
Preprocessed: 44
Uncacheable: 27
Primary storage:
Hits: 89 / 264 (33.71 %)
Misses: 175
Cache size (GB): 0.43 / 0.51 (84.16 %)
Files: 495
Cleanups: 11
Uncacheable:
Called for linking: 26
No input file: 1
$ tar -C "${CCACHE_DIR}" -cf "${CI_PROJECT_DIR}/ccache.tar" .
$ fi
$ touch ${SPACK_STAGE_DIR}/spack-configure-args.txt
$ cp ${SPACK_STAGE_DIR}/spack-{build-env,build-out,configure-args}.txt ${CI_PROJECT_DIR}/
$ echo "SPACK_BUILD_DIR=${SPACK_BUILD_DIR}" > ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_FULL_SPEC=${SPACK_FULL_SPEC}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_SOURCE_DIR=${SPACK_SOURCE_DIR}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ SPACK_PACKAGE_SLUGIFY=$(echo -n ${SPACK_PACKAGE} | tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]')
$ echo "${SPACK_PACKAGE_SLUGIFY}_INSTALLED_HASH=${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
$ echo "SPACK_PACKAGE_DEPENDENCY_ON_PREVIOUS_JOB=^/${SPACK_INSTALLED_HASH}" >> ${CI_PROJECT_DIR}/spack_build_info.env
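The `tr` pipeline above turns the package name into an environment-variable prefix: every non-alphanumeric character becomes `_` and the result is uppercased. A Python equivalent, as a sketch:

```python
import re

def slugify(package: str) -> str:
    """Python sketch of:
    echo -n ${SPACK_PACKAGE} | tr -c '[:alnum:]' '_' | tr '[:lower:]' '[:upper:]'
    (echo -n emits no trailing newline, so no trailing underscore appears)."""
    return re.sub(r"[^0-9A-Za-z]", "_", package).upper()
```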
$ num_failures=$(module load unstable python-dev; python -c "from lxml import etree; xml = etree.parse('${CI_PROJECT_DIR}/install.xml'); print(sum(1 for _ in xml.getroot().iter('failure')) + sum(1 for _ in xml.getroot().iter('error')))")
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ if [[ ${num_failures} > 0 ]]; then exit ${num_failures}; fi
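The failure count above is the number of `<failure>` plus `<error>` elements in the JUnit report written by `spack install --log-format=junit`; the job then exits with that count if it is non-zero. The same logic with the standard-library XML parser (the job itself uses lxml, which exposes the same `iter` API):

```python
import xml.etree.ElementTree as etree

# Minimal stand-in for install.xml: one passing case, one failure, one error.
junit = """<testsuite>
  <testcase name="ok"/>
  <testcase name="bad"><failure message="boom"/></testcase>
  <testcase name="worse"><error message="crash"/></testcase>
</testsuite>"""

root = etree.fromstring(junit)
num_failures = (sum(1 for _ in root.iter('failure'))
                + sum(1 for _ in root.iter('error')))
print(num_failures)  # the job exits with this value when it is non-zero
```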
section_end:1670413738:step_script section_start:1670413738:archive_cache Saving cache for successful job
Using git from spack modules
Creating cache build:nmodl-8-non_protected...
Runtime platform  arch=amd64 os=linux pid=167027 revision=58ba2b95 version=14.2.0
ccache.tar: found 1 matching files and directories
Uploading cache.zip to https://bbpobjectstorage.epfl.ch/gitlab-runners-cache/project/112/build%3Anmodl-8-non_protected
Created cache
section_end:1670413762:archive_cache section_start:1670413762:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=167129 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
initial_environment.env: found 1 matching files and directories
spack-build-env.txt: found 1 matching files and directories
spack-build-out.txt: found 1 matching files and directories
spack-configure-args.txt: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462678 responseStatus=201 Created token=7o7VaR3L
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=167171 revision=58ba2b95 version=14.2.0
install.xml: found 1 matching files and directories
Uploading artifacts as "junit" to coordinator... ok id=462678 responseStatus=201 Created token=7o7VaR3L
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=167214 revision=58ba2b95 version=14.2.0
spack_build_info.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462678 responseStatus=201 Created token=7o7VaR3L
section_end:1670413763:upload_artifacts_on_success section_start:1670413763:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413764:cleanup_file_variables Job succeeded
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413313:resolve_secrets Resolving secrets
section_end:1670413313:resolve_secrets section_start:1670413313:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor994565005, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462676
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=6, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462676_PROD_P112_CP0_C0
Job parameters: memory=30750M, cpus_per_task=6, duration=1:00:00, constraint=cpu ntasks=1 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054593
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=1 --cpus-per-task=6 --mem=30750M --job-name=GL_J462676_PROD_P112_CP0_C0 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=1 --jobid=1054593 --cpus-per-task=6 --mem=30750M
section_end:1670413315:prepare_executor section_start:1670413315:prepare_script Preparing environment
Using git from spack modules
Running on r1i7n20 via bbpv1.epfl.ch...
section_end:1670413318:prepare_script section_start:1670413318:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413319:get_sources section_start:1670413319:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ if [[ -n "${SPACK_ENV_FILE_URL}" && "${PARSE_GITHUB_PR_DESCRIPTIONS,,}" == "true" ]]; then
$ cat > parse_description.py << END_SCRIPT # collapsed multi-line command
$ cat parse_description.py
import os
import re
import requests
pr_info = requests.get("https://api.github.com/repos/{}/pulls/{}".format(
os.environ['CI_EXTERNAL_PULL_REQUEST_TARGET_REPOSITORY'],
os.environ['CI_EXTERNAL_PULL_REQUEST_IID']),
headers={'Accept': 'application/vnd.github.v3+json'})
pr_body = pr_info.json()["body"]
# match something like NEURON_BRANCH=foo/bar
pat = re.compile('^([A-Z0-9_]+)_([A-Z]+)=([A-Z0-9\-\_\/\+\.]+)$', re.IGNORECASE)
def parse_term(m):
ref_type = m.group(2).lower()
if ref_type not in {'branch', 'tag', 'ref'}: return
print(m.group(1).upper() + '_' + ref_type.upper() + '=' + m.group(3))
if pr_body is not None:
for pr_body_line in pr_body.splitlines():
if not pr_body_line.startswith('CI_BRANCHES:'): continue
for config_term in pr_body_line[12:].split(','):
pat.sub(parse_term, config_term)
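The script above scans the GitHub PR body for a `CI_BRANCHES:` line and emits one `NAME_TYPE=value` pair per comma-separated term whose type is BRANCH, TAG or REF. A self-contained sketch of that matching logic on a hypothetical PR body line (the real script prints the pairs via `pat.sub`):

```python
import re

# The same pattern the script compiles: NAME_TYPE=value, case-insensitive.
pat = re.compile(r'^([A-Z0-9_]+)_([A-Z]+)=([A-Z0-9\-\_\/\+\.]+)$', re.IGNORECASE)

def parse_terms(pr_body_line):
    """Return NAME_TYPE=value pairs from one 'CI_BRANCHES:' line (sketch)."""
    results = []
    for term in pr_body_line[len('CI_BRANCHES:'):].split(','):
        m = pat.match(term)
        if m and m.group(2).lower() in {'branch', 'tag', 'ref'}:
            results.append(f"{m.group(1).upper()}_{m.group(2).upper()}={m.group(3)}")
    return results

# Hypothetical PR body line, mirroring the input_variables.env seen later:
line = "CI_BRANCHES:NEURON_BRANCH=sandbox/jblanco/lfp,NMODL_BRANCH=master"
```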
$ (module load unstable python-dev; python parse_description.py) > input_variables.env
Autoloading python/3.9.7
Autoloading hpe-mpi/2.25.hmpt
$ else
$ cat input_variables.env
NEURON_BRANCH=sandbox/jblanco/lfp
NMODL_BRANCH=master
SPACK_BRANCH=sandbox/jblanco/lfp
$ for var_to_unset in $(sed 's/^\(.*\?\)_\(BRANCH\|COMMIT\|TAG\)=.*$/\1_BRANCH\n\1_COMMIT\n\1_TAG/' input_variables.env); do # collapsed multi-line command
Unsetting NEURON_BRANCH
Unsetting NMODL_BRANCH
Unsetting SPACK_BRANCH
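The `sed` expression above expands each `VAR_BRANCH`, `VAR_COMMIT` or `VAR_TAG` assignment in input_variables.env into all three variable names, so any previously exported variant is unset before the file is re-sourced. A Python sketch of that expansion:

```python
import re

def vars_to_unset(env_line):
    """Sketch of: sed 's/^\\(.*\\?\\)_\\(BRANCH\\|COMMIT\\|TAG\\)=.*$/.../'
    For VAR_<TYPE>=value, return all three VAR_{BRANCH,COMMIT,TAG} names."""
    m = re.match(r'^(.*?)_(BRANCH|COMMIT|TAG)=.*$', env_line)
    if not m:
        return []
    base = m.group(1)
    return [f"{base}_BRANCH", f"{base}_COMMIT", f"{base}_TAG"]
```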
$ set -o allexport
$ . input_variables.env
$ set +o allexport
$ unset MODULEPATH
$ . /gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config/modules.sh
$ echo "MODULEPATH=${MODULEPATH}" > spack_clone_variables.env
$ echo Preparing to clone Spack into ${PWD}
Preparing to clone Spack into /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462676
$ if [[ -z "${SPACK_BRANCH}" && ( -n "${SPACK_COMMIT}" || -n "${SPACK_TAG}" ) ]]; then
$ echo Checking out the ${SPACK_BRANCH} of Spack...
Checking out the sandbox/jblanco/lfp of Spack...
$ module load unstable git
$ git clone -c feature.manyFiles=true --depth 1 --single-branch --branch ${SPACK_BRANCH} ${SPACK_URL} spack
Cloning into 'spack'...
Updating files: 100% (9134/9134), done.
$ export SPACK_ROOT=${PWD}/spack
$ export SPACK_USER_CACHE_PATH="${CI_BUILDS_DIR}"
$ export SPACK_SYSTEM_CONFIG_PATH="/gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config"
$ echo "SPACK_ROOT=${SPACK_ROOT}" >> spack_clone_variables.env
$ echo "SPACK_USER_CACHE_PATH=${SPACK_USER_CACHE_PATH}" >> spack_clone_variables.env
$ echo "SPACK_SYSTEM_CONFIG_PATH=${SPACK_SYSTEM_CONFIG_PATH}" >> spack_clone_variables.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112/J462676_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
$ env -0 | sed -nz '/^CUSTOM_ENV_/d;/^[^=]\+_\(BRANCH\|COMMIT\|TAG\)=.\+/p' | xargs -0t spack configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK ${SPACK_SETUP_IGNORE_PACKAGE_VARIABLES} --write-commit-file=commit-mapping.env
spack configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK --write-commit-file=commit-mapping.env CI_COMMIT_BRANCH=sandbox/jblanco/lfp GITLAB_PIPELINES_BRANCH=main NEURON_BRANCH=sandbox/jblanco/lfp NMODL_BRANCH=master SPACK_BRANCH=sandbox/jblanco/lfp CI_DEFAULT_BRANCH=master CORENEURON_COMMIT=1ba5e64356655e0a1488e5f95def3d7c6952e7b1
==> CI_COMMIT: ignoring CI_COMMIT_BRANCH=sandbox/jblanco/lfp
==> GITLAB_PIPELINES: ignoring GITLAB_PIPELINES_BRANCH=main
==> SPACK: ignoring SPACK_BRANCH=sandbox/jblanco/lfp
==> CI_DEFAULT: ignoring CI_DEFAULT_BRANCH=master
==> neuron: resolved branch sandbox/jblanco/lfp to 42164ad2174136fe969c50d6fb31076ea6a52ca3
==> nmodl: resolved branch master to 7cf95511a804590681e1a5b9f9088b7860baca5d
==> neuron@develop: remove branch/commit/tag
==> neuron@develop: use commit="42164ad2174136fe969c50d6fb31076ea6a52ca3"
==> neuron@develop: add preferred=True
==> nmodl@develop: remove branch/commit/tag
==> nmodl@develop: use commit="7cf95511a804590681e1a5b9f9088b7860baca5d"
==> nmodl@develop: add preferred=True
==> coreneuron@develop: remove branch/commit/tag
==> coreneuron@develop: use commit="1ba5e64356655e0a1488e5f95def3d7c6952e7b1"
==> coreneuron@develop: add preferred=True
$ (cd "${SPACK_ROOT}" && git diff)
diff --git a/bluebrain/repo-bluebrain/packages/coreneuron/package.py b/bluebrain/repo-bluebrain/packages/coreneuron/package.py
index 1061e40..154f09f 100644
--- a/bluebrain/repo-bluebrain/packages/coreneuron/package.py
+++ b/bluebrain/repo-bluebrain/packages/coreneuron/package.py
@@ -20,7 +20,7 @@ class Coreneuron(CMakePackage):
# This simplifies testing the gitlab-pipelines repository:
git = "git@bbpgitlab.epfl.ch:hpc/coreneuron.git"
- version('develop', branch='master')
+ version('develop', preferred=True, commit='1ba5e64356655e0a1488e5f95def3d7c6952e7b1') # old: branch='master'
version('8.2.1.2022.03.10_lfp', branch='sandbox/jblanco/lfp')
version('8.2.1_lfp', branch='sandbox/jblanco/lfp')
version('8.2.1', tag='8.2.1')
diff --git a/bluebrain/repo-bluebrain/packages/nmodl/package.py b/bluebrain/repo-bluebrain/packages/nmodl/package.py
index a79f28b..e210189 100644
--- a/bluebrain/repo-bluebrain/packages/nmodl/package.py
+++ b/bluebrain/repo-bluebrain/packages/nmodl/package.py
@@ -14,7 +14,7 @@ class Nmodl(CMakePackage):
git = "https://github.com/BlueBrain/nmodl.git"
# 0.3.1 > 0.3.0.20220110 > 0.3.0 > 0.3b > 0.3 to Spack
- version("develop", branch="master", submodules=True)
+ version('develop', preferred=True, commit='7cf95511a804590681e1a5b9f9088b7860baca5d', submodules=True) # old: branch="master"
version("llvm", branch="llvm", submodules=True)
version("0.4.0", tag="0.4")
# This is the merge commit of #875, which allows catch2 etc. to be dependencies
diff --git a/bluebrain/repo-patches/packages/neuron/package.py b/bluebrain/repo-patches/packages/neuron/package.py
index 5092190..7513f92 100644
--- a/bluebrain/repo-patches/packages/neuron/package.py
+++ b/bluebrain/repo-patches/packages/neuron/package.py
@@ -30,7 +30,7 @@ class Neuron(CMakePackage):
# Patch for recent CMake versions that don't identify NVHPC as PGI
patch("patch-v800-cmake-nvhpc.patch", when="@8.0.0%nvhpc^cmake@3.20:")
- version("develop", branch="master")
+ version('develop', preferred=True, commit='42164ad2174136fe969c50d6fb31076ea6a52ca3') # old: branch="master"
version("8.2.1.2022.03.10_lfp", branch="sandbox/jblanco/lfp")
version("8.2.1_lfp", branch="sandbox/jblanco/lfp")
version("8.2.1", tag="8.2.1")
$ cat commit-mapping.env
NEURON_COMMIT=42164ad2174136fe969c50d6fb31076ea6a52ca3
NMODL_COMMIT=7cf95511a804590681e1a5b9f9088b7860baca5d
CORENEURON_COMMIT=1ba5e64356655e0a1488e5f95def3d7c6952e7b1
$ echo "SPACK_BRANCH=${SPACK_BRANCH}" >> commit-mapping.env
$ echo "SPACK_DEPLOYMENT_SUFFIX=${SPACK_DEPLOYMENT_SUFFIX}" >> commit-mapping.env
$ cat commit-mapping.env >> spack_clone_variables.env
$ spack spec -IL ninja
Input spec
--------------------------------
- ninja
Concretized
--------------------------------
==> Bootstrapping clingo from pre-built binaries
- utrxbc3aohnru5eynalc3hyv4ca4jqte ninja@1.10.2%gcc@11.2.0 arch=linux-rhel7-skylake
$ echo "SPACK_SETUP_COMMIT_MAPPING_URL=${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/jobs/${CI_JOB_ID}/artifacts/commit-mapping.env" >> spack_clone_variables.env
$ spack config --scope site add "config:ccache:true"
$ echo "SPACK_USE_CCACHE=true" >> spack_clone_variables.env
section_end:1670413414:step_script section_start:1670413414:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=70529 revision=58ba2b95 version=14.2.0
commit-mapping.env: found 1 matching files and directories
input_variables.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=462676 responseStatus=201 Created token=VPCPogdo
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=70551 revision=58ba2b95 version=14.2.0
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=462676 responseStatus=201 Created token=VPCPogdo
section_end:1670413415:upload_artifacts_on_success section_start:1670413415:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413416:cleanup_file_variables Job succeeded
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413638:resolve_secrets Resolving secrets
section_end:1670413638:resolve_secrets section_start:1670413638:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor491612479, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462696
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462696_PROD_P112_CP4_C10
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=cpu ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 1054650
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J462696_PROD_P112_CP4_C10 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=8 --jobid=1054650 --cpus-per-task=1 --mem=30750M
section_end:1670413640:prepare_executor section_start:1670413640:prepare_script Preparing environment
Using git from spack modules
Running on r1i7n22 via bbpv1.epfl.ch...
section_end:1670413642:prepare_script section_start:1670413642:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670413643:get_sources section_start:1670413643:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:coreneuron:mod2c:intel:shared:debug (462679)...
Runtime platform  arch=amd64 os=linux pid=204380 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462679 responseStatus=200 OK token=2ALRP-_K
section_end:1670413644:download_artifacts section_start:1670413644:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n11
Build name: Linux-icpc
Create new tag: 20221207-1147 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462679/spack-build/spack-stage-coreneuron-develop-udoqinvar6cp75viyxkh44mgzbht4q76/spack-build-udoqinv
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: test-solver
Start 6: lfp_test
Start 7: ring_TEST
1/18 Test #2: interleave_info_constructor_test ... Passed 0.18 sec
Start 18: reporting_1
2/18 Test #4: queuing_test ....................... Passed 0.18 sec
3/18 Test #3: alignment_test ..................... Passed 0.18 sec
Start 8: ring_binqueue_TEST
4/18 Test #1: cmd_interface_test ................. Passed 0.20 sec
5/18 Test #6: lfp_test ........................... Passed 0.19 sec
Start 9: ring_multisend_TEST
6/18 Test #18: reporting_1 ........................ Passed 0.72 sec
7/18 Test #7: ring_TEST .......................... Passed 0.98 sec
Start 10: ring_spike_buffer_TEST
8/18 Test #5: test-solver ........................ Passed 1.09 sec
Start 11: ring_gap_TEST
9/18 Test #8: ring_binqueue_TEST ................. Passed 1.08 sec
Start 12: ring_gap_binqueue_TEST
10/18 Test #9: ring_multisend_TEST ................ Passed 1.14 sec
Start 13: ring_gap_multisend_TEST
11/18 Test #10: ring_spike_buffer_TEST ............. Passed 0.88 sec
Start 14: ring_permute0_TEST
12/18 Test #11: ring_gap_TEST ...................... Passed 1.00 sec
Start 15: ring_gap_permute0_TEST
13/18 Test #12: ring_gap_binqueue_TEST ............. Passed 1.00 sec
Start 16: ring_permute1_TEST
14/18 Test #13: ring_gap_multisend_TEST ............ Passed 0.95 sec
Start 17: ring_gap_permute1_TEST
15/18 Test #14: ring_permute0_TEST ................. Passed 0.79 sec
16/18 Test #15: ring_gap_permute0_TEST ............. Passed 0.92 sec
17/18 Test #16: ring_permute1_TEST ................. Passed 0.87 sec
18/18 Test #17: ring_gap_permute1_TEST ............. Passed 0.90 sec
100% tests passed, 0 tests failed out of 18
Total Test time (real) = 3.24 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
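The `i_am_a_failure` lines above use a deferred-failure pattern: the test step's status is recorded, the artifact steps (`cp Testing/`, `cmake2junit`) still run, and the saved status is only propagated at the very end. A minimal sketch of that pattern, with a hypothetical failing command standing in for the `ctest` invocation:

```shell
#!/bin/sh
# Deferred-failure pattern: run the tests, remember whether they failed,
# let the post-processing steps run regardless, then exit with the saved
# status so the CI job is still marked failed.
deferred_status=$(
    i_am_a_failure=0
    sh -c 'exit 1' || i_am_a_failure=1   # stand-in for the real test command
    echo "post-processing (artifact copy, report conversion) runs here" >&2
    exit "${i_am_a_failure}"
    )
# Capture the subshell's exit code the same way GitLab would see it.
echo "job exit status would be: $?"
```

Without the `|| i_am_a_failure=1` guard, a GitLab job script running with errexit semantics would abort at the first failing command and skip the JUnit report upload.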
section_end:1670413676:step_script section_start:1670413676:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=205840 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=462696 responseStatus=201 Created token=2ALRP-_K
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=205880 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=462696 responseStatus=201 Created token=2ALRP-_K
section_end:1670413678:upload_artifacts_on_success section_start:1670413678:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670413679:cleanup_file_variables Job succeeded
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413668:resolve_secrets Resolving secrets
section_end:1670413668:resolve_secrets section_start:1670413668:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor2265344353, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462699
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462699_PROD_P112_CP3_C15
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 1054654
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J462699_PROD_P112_CP3_C15 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=8 --jobid=1054654 --cpus-per-task=1 --mem=30750M
section_end:1670415862:prepare_executor section_start:1670415862:prepare_script Preparing environment
Using git from spack modules
Running on r2i3n1 via bbpv1.epfl.ch...
section_end:1670415864:prepare_script section_start:1670415864:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670415865:get_sources section_start:1670415865:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc:debug:unified (462682)...
Runtime platform  arch=amd64 os=linux pid=183260 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462682 responseStatus=200 OK token=mbj7ToSh
section_end:1670415866:download_artifacts section_start:1670415866:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n11
Build name: Linux-nvc++
Create new tag: 20221207-1225 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462682/spack-build/spack-stage-coreneuron-develop-2asi2me356b2upp2q2whulfko55yuxye/spack-build-2asi2me
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: test-solver
Start 6: lfp_test
Start 7: ring_TEST
1/20 Test #1: cmd_interface_test ..................... Passed 0.45 sec
Start 20: reporting_1
2/20 Test #4: queuing_test ........................... Passed 0.44 sec
3/20 Test #6: lfp_test ............................... Passed 0.44 sec
Start 8: ring_binqueue_TEST
4/20 Test #2: interleave_info_constructor_test ....... Passed 2.15 sec
5/20 Test #3: alignment_test ......................... Passed 2.17 sec
Start 9: ring_multisend_TEST
6/20 Test #5: test-solver ............................ Passed 3.90 sec
7/20 Test #20: reporting_1 ............................ Passed 4.00 sec
Start 10: ring_spike_buffer_TEST
8/20 Test #7: ring_TEST .............................. Passed 33.76 sec
Start 11: ring_gap_TEST
9/20 Test #8: ring_binqueue_TEST ..................... Passed 34.51 sec
Start 12: ring_gap_binqueue_TEST
10/20 Test #9: ring_multisend_TEST .................... Passed 34.04 sec
Start 13: ring_gap_multisend_TEST
11/20 Test #10: ring_spike_buffer_TEST ................. Passed 33.87 sec
Start 14: ring_permute1_TEST
12/20 Test #14: ring_permute1_TEST ..................... Passed 32.07 sec
Start 15: ring_gap_permute1_TEST
13/20 Test #11: ring_gap_TEST .......................... Passed 38.41 sec
Start 16: ring_permute2_TEST
14/20 Test #12: ring_gap_binqueue_TEST ................. Passed 38.55 sec
Start 17: ring_gap_permute2_TEST
15/20 Test #13: ring_gap_multisend_TEST ................ Passed 38.82 sec
Start 18: ring_permute2_cudaInterface_TEST
16/20 Test #16: ring_permute2_TEST ..................... Passed 28.07 sec
Start 19: ring_gap_permute2_cudaInterface_TEST
17/20 Test #18: ring_permute2_cudaInterface_TEST ....... Passed 29.44 sec
18/20 Test #15: ring_gap_permute1_TEST ................. Passed 36.71 sec
19/20 Test #17: ring_gap_permute2_TEST ................. Passed 36.18 sec
20/20 Test #19: ring_gap_permute2_cudaInterface_TEST ... Passed 19.28 sec
100% tests passed, 0 tests failed out of 20
Total Test time (real) = 119.60 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1670416020:step_script section_start:1670416020:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=186278 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=462699 responseStatus=201 Created token=mbj7ToSh
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=186313 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=462699 responseStatus=201 Created token=mbj7ToSh
section_end:1670416021:upload_artifacts_on_success section_start:1670416021:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670416022:cleanup_file_variables Job succeeded
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1670413668:resolve_secrets Resolving secrets
section_end:1670413668:resolve_secrets section_start:1670413668:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor4263332851, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
Runner ID 29, project root hpc, project name coreneuron
Pipeline ID 89112, build ref 1ba5e64356655e0a1488e5f95def3d7c6952e7b1, job ID 462700
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112, optional exclusive flag , optional cpus per task flag --cpus-per-task=1, optional qos flag
CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
uid=0(root) gid=0(root) groups=0(root)
root
UID is 0
A slurm job will be created with name GL_J462700_PROD_P112_CP5_C16
Job parameters: memory=30750M, cpus_per_task=1, duration=1:00:00, constraint=volta ntasks=8 account=proj9998 user=bbpcihpcproj12 partition=prod qos=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
Submitted batch job 1054655
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=8 --cpus-per-task=1 --mem=30750M --job-name=GL_J462700_PROD_P112_CP5_C16 -C volta --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P89112 --ntasks=8 --jobid=1054655 --cpus-per-task=1 --mem=30750M
section_end:1670415862:prepare_executor section_start:1670415862:prepare_script Preparing environment
Using git from spack modules
Running on r2i3n1 via bbpv1.epfl.ch...
section_end:1670415864:prepare_script section_start:1670415864:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1670415865:get_sources section_start:1670415865:download_artifacts Downloading artifacts
Using git from spack modules
Downloading artifacts for build:coreneuron:mod2c:nvhpc:acc:shared (462683)...
Runtime platform  arch=amd64 os=linux pid=183203 revision=58ba2b95 version=14.2.0
Downloading artifacts from coordinator... ok  id=462683 responseStatus=200 OK token=F_4UAkLT
section_end:1670415866:download_artifacts section_start:1670415866:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ env -0 | sort -z | xargs -0 -L 1 echo > initial_environment.env
$ export CTEST_PARALLEL_LEVEL=${SLURM_TASKS_PER_NODE}
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ cd ${SPACK_BUILD_DIR}
$ export BOOST_TEST_COLOR_OUTPUT=no
$ i_am_a_failure=0
$ spack build-env ${SPACK_FULL_SPEC} -- ctest --output-on-failure -T Test || i_am_a_failure=1
Site: r1i4n11
Build name: Linux-nvc++
Create new tag: 20221207-1225 - Experimental
Test project /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P89112/J462683/spack-build/spack-stage-coreneuron-develop-w2z6dfwdq6k5tkwudhpbj7owhrne5qzi/spack-build-w2z6dfw
Start 1: cmd_interface_test
Start 2: interleave_info_constructor_test
Start 3: alignment_test
Start 4: queuing_test
Start 5: test-solver
Start 6: lfp_test
Start 7: ring_TEST
1/20 Test #1: cmd_interface_test ..................... Passed 0.44 sec
Start 20: reporting_1
2/20 Test #3: alignment_test ......................... Passed 0.44 sec
3/20 Test #2: interleave_info_constructor_test ....... Passed 0.45 sec
Start 8: ring_binqueue_TEST
4/20 Test #4: queuing_test ........................... Passed 0.44 sec
5/20 Test #6: lfp_test ............................... Passed 0.43 sec
Start 9: ring_multisend_TEST
6/20 Test #5: test-solver ............................ Passed 2.45 sec
7/20 Test #20: reporting_1 ............................ Passed 3.36 sec
Start 10: ring_spike_buffer_TEST
8/20 Test #7: ring_TEST .............................. Passed 38.61 sec
Start 11: ring_gap_TEST
9/20 Test #8: ring_binqueue_TEST ..................... Passed 38.93 sec
Start 12: ring_gap_binqueue_TEST
10/20 Test #9: ring_multisend_TEST .................... Passed 39.08 sec
Start 13: ring_gap_multisend_TEST
11/20 Test #10: ring_spike_buffer_TEST ................. Passed 37.30 sec
Start 14: ring_permute1_TEST
12/20 Test #14: ring_permute1_TEST ..................... Passed 35.01 sec
Start 15: ring_gap_permute1_TEST
13/20 Test #11: ring_gap_TEST .......................... Passed 48.55 sec
Start 16: ring_permute2_TEST
14/20 Test #12: ring_gap_binqueue_TEST ................. Passed 49.01 sec
Start 17: ring_gap_permute2_TEST
15/20 Test #13: ring_gap_multisend_TEST ................ Passed 48.98 sec
Start 18: ring_permute2_cudaInterface_TEST
16/20 Test #16: ring_permute2_TEST ..................... Passed 27.88 sec
Start 19: ring_gap_permute2_cudaInterface_TEST
17/20 Test #18: ring_permute2_cudaInterface_TEST ....... Passed 27.24 sec
18/20 Test #15: ring_gap_permute1_TEST ................. Passed 39.92 sec
19/20 Test #17: ring_gap_permute2_TEST ................. Passed 30.16 sec
20/20 Test #19: ring_gap_permute2_cudaInterface_TEST ... Passed 5.26 sec
100% tests passed, 0 tests failed out of 20
Total Test time (real) = 120.38 sec
$ cp -r Testing/ ${CI_PROJECT_DIR}/
$ module load unstable unit-test-translator
Autoloading python/3.9.7
$ cmake2junit > ${CI_PROJECT_DIR}/ctest.xml
$ exit ${i_am_a_failure}
section_end:1670416021:step_script section_start:1670416021:upload_artifacts_on_success Uploading artifacts for successful job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=186382 revision=58ba2b95 version=14.2.0
initial_environment.env: found 1 matching files and directories
Testing/: found 7 matching files and directories 
Uploading artifacts as "archive" to coordinator... ok id=462700 responseStatus=201 Created token=F_4UAkLT
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=186417 revision=58ba2b95 version=14.2.0
ctest.xml: found 1 matching files and directories 
Uploading artifacts as "junit" to coordinator... ok id=462700 responseStatus=201 Created token=F_4UAkLT
section_end:1670416022:upload_artifacts_on_success section_start:1670416022:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1670416023:cleanup_file_variables Job succeeded
Log was not fetched because job had status: skipped