@bbpbuildbot
Created June 18, 2024 07:33
Logfiles for GitLab pipeline https://bbpgitlab.epfl.ch/hpc/nmodl/-/pipelines/217533 (:no_entry:) running on GitHub PR BlueBrain/nmodl#1323.
Log was not fetched because job had status: skipped
Log was not fetched because job had status: skipped
Running with gitlab-runner 15.5.0 (0d4137b8)
 on BB5 map runner pnPo3yJy
section_start:1718695872:resolve_secrets Resolving secrets
section_end:1718695872:resolve_secrets section_start:1718695872:prepare_executor Preparing the "custom" executor
Using Custom executor with driver BB5 PROD runner v0.0.3...
BB5 PROD runner running on bbpv1.epfl.ch, version 15.5.0, user
TMPDIR is /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/tmp/custom-executor1102702084, slurm job id , CUSTOM_ENV_CI_RUNNER_TAGS is ["bb5_map"]
Runner ID 29, project root hpc, project name nmodl
Pipeline ID 217533, build ref , job ID 1322858
Build dir /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P217533, optional exclusive flag , optional cpus per task flag --cpus-per-task=6, optional qos flag , optional reservation flag
A slurm job will be created with name GL_J1322858_PROD_P30_CP3_C1
Job parameters: memory=30750M, cpus_per_task=6, duration=1:00:00, constraint=cpu ntasks=1 account=proj9998 user=bbpcihpcproj12 partition=prod qos= reservation=
Not executing the chown -R
sbatch: INFO: Activating auto partition selection plugin, please report errors to HPC/CS
sbatch: INFO: Job specifies cpu constraint, setting --constraint=[skl|clx]
Submitted batch job 2256646
job state: PD
job state: R
sbatch: sbatch -p prod -A proj9998 --ntasks=1 --cpus-per-task=6 --mem=30750M --job-name=GL_J1322858_PROD_P30_CP3_C1 -C cpu --no-requeue -D /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P217533 --time=1:00:00 --wrap="sleep infinity"
srun: srun --mpi=none --chdir=/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P217533 --ntasks=1 --jobid=2256646 --cpus-per-task=6 --mem=30750M
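As an aside, the sbatch invocation above is a direct translation of the job parameters logged earlier (memory, cpus_per_task, duration, constraint, …). A minimal sketch of that mapping, with a hypothetical helper name (`sbatch_command` is not part of the runner), where empty parameters such as qos and reservation are simply omitted:

```python
# Hypothetical helper (not the runner's actual code): build the sbatch
# argument list from the job parameters shown in the log above.
def sbatch_command(name, memory, cpus, duration, account, partition,
                   constraint, qos='', reservation=''):
    cmd = ['sbatch', '-p', partition, '-A', account, '--ntasks=1',
           f'--cpus-per-task={cpus}', f'--mem={memory}',
           f'--job-name={name}', '-C', constraint, '--no-requeue',
           f'--time={duration}', '--wrap=sleep infinity']
    # Empty qos/reservation fields (as in this job) produce no flag at all.
    if qos:
        cmd.append(f'--qos={qos}')
    if reservation:
        cmd.append(f'--reservation={reservation}')
    return cmd

print(' '.join(sbatch_command('GL_J1322858_PROD_P30_CP3_C1', '30750M', 6,
                              '1:00:00', 'proj9998', 'prod', 'cpu')))
```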
section_end:1718695875:prepare_executor section_start:1718695875:prepare_script Preparing environment
Using git from spack modules
Running on r1i6n1 via bbpv1.epfl.ch...
section_end:1718695881:prepare_script section_start:1718695881:get_sources Getting source from Git repository
Using git from spack modules
Skipping Git repository setup
Skipping Git checkout
Skipping Git submodules setup
section_end:1718695882:get_sources section_start:1718695882:step_script Executing "step_script" stage of the job script
WARNING: Starting with version 14.0 the 'build_script' stage will be replaced with 'step_script': https://gitlab.com/gitlab-org/gitlab-runner/-/issues/26426
Using git from spack modules
$ if [[ -n "${SPACK_ENV_FILE_URL}" && "${PARSE_GITHUB_PR_DESCRIPTIONS,,}" == "true" ]]; then
$ cat > parse_description.py << END_SCRIPT # collapsed multi-line command
$ cat parse_description.py
import os
import re
import requests

pr_info = requests.get("https://api.github.com/repos/{}/pulls/{}".format(
        os.environ['CI_EXTERNAL_PULL_REQUEST_TARGET_REPOSITORY'],
        os.environ['CI_EXTERNAL_PULL_REQUEST_IID']),
    headers={'Accept': 'application/vnd.github.v3+json'})
pr_body = pr_info.json()["body"]

# match something like NEURON_BRANCH=foo/bar
# special case for SPACK_DEPLOYMENT_SUFFIX=foo/bar
pat = re.compile(r'^([A-Z0-9_]+)_([A-Z]+)=([A-Z0-9\-_/+.]+)$', re.IGNORECASE)

def parse_term(m):
    ref_type = m.group(2).lower()
    is_deployment_suffix = ref_type == 'suffix' and m.group(1).lower() == 'spack_deployment'
    if ref_type not in {'branch', 'tag', 'ref'} and not is_deployment_suffix:
        return
    print(m.group(1).upper() + '_' + ref_type.upper() + '=' + m.group(3))

if pr_body is not None:
    for pr_body_line in pr_body.splitlines():
        if not pr_body_line.startswith('CI_BRANCHES:'):
            continue
        for config_term in pr_body_line[12:].split(','):
            pat.sub(parse_term, config_term)
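For illustration, a self-contained sketch of what the regex above extracts from a `CI_BRANCHES:` line. The PR body here is hypothetical, and `parse_term` is adapted to return its result instead of printing it; the pattern and the recognized ref types (`branch`, `tag`, `ref`, plus the `SPACK_DEPLOYMENT_SUFFIX` special case) are taken from the script:

```python
import re

# Same pattern as parse_description.py above, as a raw string.
pat = re.compile(r'^([A-Z0-9_]+)_([A-Z]+)=([A-Z0-9\-_/+.]+)$', re.IGNORECASE)

def parse_term(term):
    """Return 'NAME_TYPE=value' for recognized ref types, else None."""
    m = pat.match(term)
    if m is None:
        return None
    ref_type = m.group(2).lower()
    is_deployment_suffix = (ref_type == 'suffix'
                            and m.group(1).lower() == 'spack_deployment')
    if ref_type not in {'branch', 'tag', 'ref'} and not is_deployment_suffix:
        return None
    return m.group(1).upper() + '_' + ref_type.upper() + '=' + m.group(3)

# Hypothetical PR description line (not taken from this PR).
line = 'CI_BRANCHES:NMODL_BRANCH=1uc/optimize-diam-area,NEURON_TAG=8.2.4'
terms = [parse_term(t) for t in line[len('CI_BRANCHES:'):].split(',')]
print([t for t in terms if t])
# → ['NMODL_BRANCH=1uc/optimize-diam-area', 'NEURON_TAG=8.2.4']
```

Terms whose suffix is not a recognized ref type (e.g. `FOO_VERSION=1.2`) are silently dropped.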
$ (module load unstable python-dev; python parse_description.py) > input_variables.env
$ else
$ cat input_variables.env
NMODL_BRANCH=1uc/optimize-diam-area
$ for var_to_unset in $(sed 's/^\(.*\?\)_\(BRANCH\|COMMIT\|TAG\)=.*$/\1_BRANCH\n\1_COMMIT\n\1_TAG/' input_variables.env); do # collapsed multi-line command
Unsetting NMODL_COMMIT
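The sed loop above derives, for every `FOO_{BRANCH|COMMIT|TAG}=…` entry in `input_variables.env`, all three sibling variable names so that stale `BRANCH`/`COMMIT`/`TAG` overrides from the environment can be unset before the file is re-sourced (here, `NMODL_BRANCH` in the file causes `NMODL_COMMIT` to be unset). A sketch of the same derivation in Python, not the CI script itself:

```python
import re

def vars_to_unset(env_lines):
    """Mirror the sed expression: for each NAME_{BRANCH|COMMIT|TAG}=... line,
    emit all three sibling variable names for unsetting."""
    to_unset = []
    for line in env_lines:
        m = re.match(r'^(.*?)_(BRANCH|COMMIT|TAG)=', line)
        if m:
            base = m.group(1)
            to_unset += [base + '_BRANCH', base + '_COMMIT', base + '_TAG']
    return to_unset

print(vars_to_unset(['NMODL_BRANCH=1uc/optimize-diam-area']))
# → ['NMODL_BRANCH', 'NMODL_COMMIT', 'NMODL_TAG']
```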
$ set -o allexport
$ . input_variables.env
$ set +o allexport
$ unset MODULEPATH
$ . /gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config/modules.sh
$ echo "MODULEPATH=${MODULEPATH}" > spack_clone_variables.env
$ echo Preparing to clone Spack into ${PWD}
Preparing to clone Spack into /gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs/bbpcihpcproj12/P217533/J1322858
$ if [[ -z "${SPACK_BRANCH}" && ( -n "${SPACK_COMMIT}" || -n "${SPACK_TAG}" ) ]]; then
$ echo Checking out the ${SPACK_BRANCH} of Spack...
Checking out the develop of Spack...
$ module load unstable git
$ git clone -c feature.manyFiles=true --depth 1 --single-branch --branch ${SPACK_BRANCH} ${SPACK_URL} spack
Cloning into 'spack'...
Updating files: 100% (11305/11305), done.
$ export SPACK_ROOT=${PWD}/spack
$ export SPACK_RESOLVED_COMMIT=$(cd "${SPACK_ROOT}" && git rev-parse HEAD)
$ export SPACK_USER_CACHE_PATH="${CI_BUILDS_DIR}"
$ export SPACK_SYSTEM_CONFIG_PATH="/gpfs/bbp.cscs.ch/ssd/apps/bsd/${SPACK_DEPLOYMENT_SUFFIX}/config"
$ echo "SPACK_ROOT=${SPACK_ROOT}" >> spack_clone_variables.env
$ echo "SPACK_RESOLVED_COMMIT=${SPACK_RESOLVED_COMMIT}" >> spack_clone_variables.env
$ echo "SPACK_USER_CACHE_PATH=${SPACK_USER_CACHE_PATH}" >> spack_clone_variables.env
$ echo "SPACK_SYSTEM_CONFIG_PATH=${SPACK_SYSTEM_CONFIG_PATH}" >> spack_clone_variables.env
$ . ${SPACK_ROOT}/share/spack/setup-env.sh
$ export XDG_CONFIG_HOME=${CI_BUILDS_DIR}/J${CI_JOB_ID}_local_config
$ echo "Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (${XDG_CONFIG_HOME})"
Configuring git to use CI_JOB_TOKEN to access git@bbpgitlab.epfl.ch (/gpfs/bbp.cscs.ch/ssd/gitlab_map_jobs//bbpcihpcproj12/P217533/J1322858_local_config)
$ mkdir -p "${XDG_CONFIG_HOME}/git"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = git@bbpgitlab.epfl.ch:" > "${XDG_CONFIG_HOME}/git/config"
$ echo -e "[url \"https://gitlab-ci-token:${CI_JOB_TOKEN}@bbpgitlab.epfl.ch/\"]\n insteadOf = ssh://git@bbpgitlab.epfl.ch/" >> "${XDG_CONFIG_HOME}/git/config"
$ cat "${XDG_CONFIG_HOME}/git/config"
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = git@bbpgitlab.epfl.ch:
[url "https://gitlab-ci-token:[MASKED]@bbpgitlab.epfl.ch/"]
insteadOf = ssh://git@bbpgitlab.epfl.ch/
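The two `url.<base>.insteadOf` rules above make git rewrite any remote URL that starts with one of the SSH prefixes into the token-carrying HTTPS form. A sketch of that prefix rewriting (the `TOKEN` placeholder stands in for the masked `CI_JOB_TOKEN`):

```python
# Prefix rewrites equivalent to the two insteadOf rules in the git config
# above; 'TOKEN' is a placeholder for the masked CI_JOB_TOKEN.
REWRITES = {
    'git@bbpgitlab.epfl.ch:': 'https://gitlab-ci-token:TOKEN@bbpgitlab.epfl.ch/',
    'ssh://git@bbpgitlab.epfl.ch/': 'https://gitlab-ci-token:TOKEN@bbpgitlab.epfl.ch/',
}

def rewrite(url):
    """Return url with any matching insteadOf prefix replaced, else unchanged."""
    for prefix, replacement in REWRITES.items():
        if url.startswith(prefix):
            return replacement + url[len(prefix):]
    return url

print(rewrite('git@bbpgitlab.epfl.ch:hpc/nmodl.git'))
# → https://gitlab-ci-token:TOKEN@bbpgitlab.epfl.ch/hpc/nmodl.git
```

URLs pointing at other hosts (e.g. github.com) are left untouched.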
$ env -0 | sed -nz '/^CUSTOM_ENV_/d;/^[^=]\+_\(BRANCH\|COMMIT\|TAG\)=.\+/p' | xargs -0t spack ${SPACK_EXTRA_FLAGS} configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK SPACK_RESOLVED ${SPACK_SETUP_IGNORE_PACKAGE_VARIABLES} --write-commit-file=commit-mapping.env
spack configure-pipeline --ignore-packages CI_BUILD CI_COMMIT CI_DEFAULT GITLAB_PIPELINES SPACK SPACK_RESOLVED CVF BLUECONFIGS --write-commit-file=commit-mapping.env CI_COMMIT_BRANCH=1uc/morphology SPACK_RESOLVED_COMMIT=b9849b16deda184445da115c59d31ec5303b460a NEURON_BRANCH=master NMODL_BRANCH=1uc/optimize-diam-area LIBSONATA_REPORT_BRANCH=master SPACK_BRANCH=develop CI_DEFAULT_BRANCH=master CVF_BRANCH=main BLUECONFIGS_BRANCH=main
==> CI_COMMIT: ignoring CI_COMMIT_BRANCH=1uc/morphology
==> SPACK_RESOLVED: ignoring SPACK_RESOLVED_COMMIT=b9849b16deda184445da115c59d31ec5303b460a
==> SPACK: ignoring SPACK_BRANCH=develop
==> CI_DEFAULT: ignoring CI_DEFAULT_BRANCH=master
==> CVF: ignoring CVF_BRANCH=main
==> BLUECONFIGS: ignoring BLUECONFIGS_BRANCH=main
==> neuron: resolved branch master to 8cb9ec8aee8913e05f3fe4f2e66e4691d7bac610
==> Error: Could not find branch 1uc/optimize-diam-area on remote https://github.com/BlueBrain/nmodl.git (tried refs/heads/1uc/optimize-diam-area)
srun: error: r1i6n1: task 0: Exited with exit code 1
section_end:1718695946:step_script section_start:1718695946:upload_artifacts_on_failure Uploading artifacts for failed job
Using git from spack modules
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=15916 revision=58ba2b95 version=14.2.0
WARNING: commit-mapping.env: no matching files 
input_variables.env: found 1 matching files and directories
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=1322858 responseStatus=201 Created token=glcbt-64
Uploading artifacts...
Runtime platform  arch=amd64 os=linux pid=15938 revision=58ba2b95 version=14.2.0
spack_clone_variables.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok id=1322858 responseStatus=201 Created token=glcbt-64
section_end:1718695947:upload_artifacts_on_failure section_start:1718695947:cleanup_file_variables Cleaning up project directory and file based variables
Using git from spack modules
section_end:1718695948:cleanup_file_variables ERROR: Job failed: exit status 1

Log was not fetched because job had status: skipped
Log was not fetched because job had status: skipped