[2019-08-15 08:31:02] [marian] Marian v1.7.8 c65c26d6 2019-08-11 18:27:00 +0100
[2019-08-15 08:31:02] [marian] Running on walle3 as process 24138 with command line:
[2019-08-15 08:31:02] [marian] /home/xyz/marian-dev/build/marian --model /disk2/models/xx-yy-r0/model.npz --type transformer --train-sets /disk2/data/xx-yy/train.sk /disk2/data/xx-yy/train.en --vocabs /disk2/models/xx-yy-r0/vocab.src.spm /disk2/models/xx-yy-r0/vocab.trg.spm --dim-vocabs 32000 32000 --mini-batch-fit --mini-batch 1000 --maxi-batch 1000 --valid-freq 10000 --save-freq 10000 --disp-freq 500 --valid-metrics ce-mean-words perplexity bleu-detok --valid-sets /disk2/data/xx-yy/valid.sk /disk2/data/xx-yy/valid.en --quiet-translation --beam-size 6 --normalize=0.6 --valid-mini-batch 16 --early-stopping 5 --cost-type=ce-mean-words --log /disk2/models/xx-yy-r0/train.log --valid-log /disk2/models/xx-yy-r0/valid.log --enc-depth 6 --dec-depth 6 --transformer-preprocess n --transformer-postprocess da --tied-embeddings-all --dim-emb 1024 --transformer-dim-ffn 4096 --transformer-dropout 0.1 --transformer-dropout-attention 0.1 --transformer-dropout-ffn 0.1 --label-smoothing 0.1 --learn-rate 0.0001 --lr-warmup 8000 --lr-decay-inv-sqrt 8000 --lr-report --optimizer-params 0.9 0.98 1e-09 --clip-norm 5 --devices 0 1 2 3 --workspace 5000 --optimizer-delay 2 --sync-sgd --seed 0 --exponential-smoothing --shuffle-in-ram --transformer-train-position-embeddings
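For readability, the same invocation with the options broken across lines; everything below is copied verbatim from the command line logged above, with only whitespace and line continuations added:

    /home/xyz/marian-dev/build/marian \
        --model /disk2/models/xx-yy-r0/model.npz \
        --type transformer \
        --train-sets /disk2/data/xx-yy/train.sk /disk2/data/xx-yy/train.en \
        --vocabs /disk2/models/xx-yy-r0/vocab.src.spm /disk2/models/xx-yy-r0/vocab.trg.spm \
        --dim-vocabs 32000 32000 \
        --mini-batch-fit --mini-batch 1000 --maxi-batch 1000 \
        --valid-freq 10000 --save-freq 10000 --disp-freq 500 \
        --valid-metrics ce-mean-words perplexity bleu-detok \
        --valid-sets /disk2/data/xx-yy/valid.sk /disk2/data/xx-yy/valid.en \
        --quiet-translation --beam-size 6 --normalize=0.6 --valid-mini-batch 16 \
        --early-stopping 5 --cost-type=ce-mean-words \
        --log /disk2/models/xx-yy-r0/train.log --valid-log /disk2/models/xx-yy-r0/valid.log \
        --enc-depth 6 --dec-depth 6 \
        --transformer-preprocess n --transformer-postprocess da \
        --tied-embeddings-all --dim-emb 1024 --transformer-dim-ffn 4096 \
        --transformer-dropout 0.1 --transformer-dropout-attention 0.1 --transformer-dropout-ffn 0.1 \
        --label-smoothing 0.1 \
        --learn-rate 0.0001 --lr-warmup 8000 --lr-decay-inv-sqrt 8000 --lr-report \
        --optimizer-params 0.9 0.98 1e-09 --clip-norm 5 \
        --devices 0 1 2 3 --workspace 5000 --optimizer-delay 2 --sync-sgd \
        --seed 0 --exponential-smoothing --shuffle-in-ram \
        --transformer-train-position-embeddings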
[2019-08-15 08:31:02] [config] after-batches: 0
[2019-08-15 08:31:02] [config] after-epochs: 0
[2019-08-15 08:31:02] [config] allow-unk: false
[2019-08-15 08:31:02] [config] beam-size: 6
[2019-08-15 08:31:02] [config] bert-class-symbol: "[CLS]"
[2019-08-15 08:31:02] [config] bert-mask-symbol: "[MASK]"
[2019-08-15 08:31:02] [config] bert-masking-fraction: 0.15
[2019-08-15 08:31:02] [config] bert-sep-symbol: "[SEP]"
[2019-08-15 08:31:02] [config] bert-train-type-embeddings: true
[2019-08-15 08:31:02] [config] bert-type-vocab-size: 2
[2019-08-15 08:31:02] [config] clip-gemm: 0
[2019-08-15 08:31:02] [config] clip-norm: 5
[2019-08-15 08:31:02] [config] cost-type: ce-mean-words
[2019-08-15 08:31:02] [config] cpu-threads: 0
[2019-08-15 08:31:02] [config] data-weighting: ""
[2019-08-15 08:31:02] [config] data-weighting-type: sentence
[2019-08-15 08:31:02] [config] dec-cell: gru
[2019-08-15 08:31:02] [config] dec-cell-base-depth: 2
[2019-08-15 08:31:02] [config] dec-cell-high-depth: 1
[2019-08-15 08:31:02] [config] dec-depth: 6
[2019-08-15 08:31:02] [config] devices:
[2019-08-15 08:31:02] [config] - 0
[2019-08-15 08:31:02] [config] - 1
[2019-08-15 08:31:02] [config] - 2
[2019-08-15 08:31:02] [config] - 3
[2019-08-15 08:31:02] [config] dim-emb: 1024
[2019-08-15 08:31:02] [config] dim-rnn: 1024
[2019-08-15 08:31:02] [config] dim-vocabs:
[2019-08-15 08:31:02] [config] - 32000
[2019-08-15 08:31:02] [config] - 32000
[2019-08-15 08:31:02] [config] disp-first: 0
[2019-08-15 08:31:02] [config] disp-freq: 500
[2019-08-15 08:31:02] [config] disp-label-counts: false
[2019-08-15 08:31:02] [config] dropout-rnn: 0
[2019-08-15 08:31:02] [config] dropout-src: 0
[2019-08-15 08:31:02] [config] dropout-trg: 0
[2019-08-15 08:31:02] [config] dump-config: ""
[2019-08-15 08:31:02] [config] early-stopping: 5
[2019-08-15 08:31:02] [config] embedding-fix-src: false
[2019-08-15 08:31:02] [config] embedding-fix-trg: false
[2019-08-15 08:31:02] [config] embedding-normalization: false
[2019-08-15 08:31:02] [config] embedding-vectors:
[2019-08-15 08:31:02] [config] []
[2019-08-15 08:31:02] [config] enc-cell: gru
[2019-08-15 08:31:02] [config] enc-cell-depth: 1
[2019-08-15 08:31:02] [config] enc-depth: 6
[2019-08-15 08:31:02] [config] enc-type: bidirectional
[2019-08-15 08:31:02] [config] exponential-smoothing: 0.0001
[2019-08-15 08:31:02] [config] grad-dropping-momentum: 0
[2019-08-15 08:31:02] [config] grad-dropping-rate: 0
[2019-08-15 08:31:02] [config] grad-dropping-warmup: 100
[2019-08-15 08:31:02] [config] guided-alignment: none
[2019-08-15 08:31:02] [config] guided-alignment-cost: mse
[2019-08-15 08:31:02] [config] guided-alignment-weight: 0.1
[2019-08-15 08:31:02] [config] ignore-model-config: false
[2019-08-15 08:31:02] [config] input-types:
[2019-08-15 08:31:02] [config] []
[2019-08-15 08:31:02] [config] interpolate-env-vars: false
[2019-08-15 08:31:02] [config] keep-best: false
[2019-08-15 08:31:02] [config] label-smoothing: 0.1
[2019-08-15 08:31:02] [config] layer-normalization: false
[2019-08-15 08:31:02] [config] learn-rate: 0.0001
[2019-08-15 08:31:02] [config] log: /disk2/models/xx-yy-r0/train.log
[2019-08-15 08:31:02] [config] log-level: info
[2019-08-15 08:31:02] [config] log-time-zone: ""
[2019-08-15 08:31:02] [config] lr-decay: 0
[2019-08-15 08:31:02] [config] lr-decay-freq: 50000
[2019-08-15 08:31:02] [config] lr-decay-inv-sqrt:
[2019-08-15 08:31:02] [config] - 8000
[2019-08-15 08:31:02] [config] lr-decay-repeat-warmup: false
[2019-08-15 08:31:02] [config] lr-decay-reset-optimizer: false
[2019-08-15 08:31:02] [config] lr-decay-start:
[2019-08-15 08:31:02] [config] - 10
[2019-08-15 08:31:02] [config] - 1
[2019-08-15 08:31:02] [config] lr-decay-strategy: epoch+stalled
[2019-08-15 08:31:02] [config] lr-report: true
[2019-08-15 08:31:02] [config] lr-warmup: 8000
[2019-08-15 08:31:02] [config] lr-warmup-at-reload: false
[2019-08-15 08:31:02] [config] lr-warmup-cycle: false
[2019-08-15 08:31:02] [config] lr-warmup-start-rate: 0
[2019-08-15 08:31:02] [config] max-length: 50
[2019-08-15 08:31:02] [config] max-length-crop: false
[2019-08-15 08:31:02] [config] max-length-factor: 3
[2019-08-15 08:31:02] [config] maxi-batch: 1000
[2019-08-15 08:31:02] [config] maxi-batch-sort: trg
[2019-08-15 08:31:02] [config] mini-batch: 1000
[2019-08-15 08:31:02] [config] mini-batch-fit: true
[2019-08-15 08:31:02] [config] mini-batch-fit-step: 10
[2019-08-15 08:31:02] [config] mini-batch-overstuff: 1
[2019-08-15 08:31:02] [config] mini-batch-track-lr: false
[2019-08-15 08:31:02] [config] mini-batch-understuff: 1
[2019-08-15 08:31:02] [config] mini-batch-warmup: 0
[2019-08-15 08:31:02] [config] mini-batch-words: 0
[2019-08-15 08:31:02] [config] mini-batch-words-ref: 0
[2019-08-15 08:31:02] [config] model: /disk2/models/xx-yy-r0/model.npz
[2019-08-15 08:31:02] [config] multi-loss-type: sum
[2019-08-15 08:31:02] [config] multi-node: false
[2019-08-15 08:31:02] [config] multi-node-overlap: true
[2019-08-15 08:31:02] [config] n-best: false
[2019-08-15 08:31:02] [config] no-nccl: false
[2019-08-15 08:31:02] [config] no-reload: false
[2019-08-15 08:31:02] [config] no-restore-corpus: false
[2019-08-15 08:31:02] [config] no-shuffle: false
[2019-08-15 08:31:02] [config] normalize: 0.6
[2019-08-15 08:31:02] [config] num-devices: 0
[2019-08-15 08:31:02] [config] optimizer: adam
[2019-08-15 08:31:02] [config] optimizer-delay: 2
[2019-08-15 08:31:02] [config] optimizer-params:
[2019-08-15 08:31:02] [config] - 0.9
[2019-08-15 08:31:02] [config] - 0.98
[2019-08-15 08:31:02] [config] - 1e-09
[2019-08-15 08:31:02] [config] overwrite: false
[2019-08-15 08:31:02] [config] pretrained-model: ""
[2019-08-15 08:31:02] [config] quiet: false
[2019-08-15 08:31:02] [config] quiet-translation: true
[2019-08-15 08:31:02] [config] relative-paths: false
[2019-08-15 08:31:02] [config] right-left: false
[2019-08-15 08:31:02] [config] save-freq: 10000
[2019-08-15 08:31:02] [config] seed: 0
[2019-08-15 08:31:02] [config] sentencepiece-alphas:
[2019-08-15 08:31:02] [config] []
[2019-08-15 08:31:02] [config] sentencepiece-max-lines: 10000000
[2019-08-15 08:31:02] [config] sentencepiece-options: ""
[2019-08-15 08:31:02] [config] shuffle-in-ram: true
[2019-08-15 08:31:02] [config] skip: false
[2019-08-15 08:31:02] [config] sqlite: ""
[2019-08-15 08:31:02] [config] sqlite-drop: false
[2019-08-15 08:31:02] [config] sync-sgd: true
[2019-08-15 08:31:02] [config] tempdir: /tmp
[2019-08-15 08:31:02] [config] tied-embeddings: false
[2019-08-15 08:31:02] [config] tied-embeddings-all: true
[2019-08-15 08:31:02] [config] tied-embeddings-src: false
[2019-08-15 08:31:02] [config] train-sets:
[2019-08-15 08:31:02] [config] - /disk2/data/xx-yy/train.sk
[2019-08-15 08:31:02] [config] - /disk2/data/xx-yy/train.en
[2019-08-15 08:31:02] [config] transformer-aan-activation: swish
[2019-08-15 08:31:02] [config] transformer-aan-depth: 2
[2019-08-15 08:31:02] [config] transformer-aan-nogate: false
[2019-08-15 08:31:02] [config] transformer-decoder-autoreg: self-attention
[2019-08-15 08:31:02] [config] transformer-dim-aan: 2048
[2019-08-15 08:31:02] [config] transformer-dim-ffn: 4096
[2019-08-15 08:31:02] [config] transformer-dropout: 0.1
[2019-08-15 08:31:02] [config] transformer-dropout-attention: 0.1
[2019-08-15 08:31:02] [config] transformer-dropout-ffn: 0.1
[2019-08-15 08:31:02] [config] transformer-ffn-activation: swish
[2019-08-15 08:31:02] [config] transformer-ffn-depth: 2
[2019-08-15 08:31:02] [config] transformer-guided-alignment-layer: last
[2019-08-15 08:31:02] [config] transformer-heads: 8
[2019-08-15 08:31:02] [config] transformer-no-projection: false
[2019-08-15 08:31:02] [config] transformer-postprocess: da
[2019-08-15 08:31:02] [config] transformer-postprocess-emb: d
[2019-08-15 08:31:02] [config] transformer-preprocess: n
[2019-08-15 08:31:02] [config] transformer-tied-layers:
[2019-08-15 08:31:02] [config] []
[2019-08-15 08:31:02] [config] transformer-train-position-embeddings: true
[2019-08-15 08:31:02] [config] type: transformer
[2019-08-15 08:31:02] [config] ulr: false
[2019-08-15 08:31:02] [config] ulr-dim-emb: 0
[2019-08-15 08:31:02] [config] ulr-dropout: 0
[2019-08-15 08:31:02] [config] ulr-keys-vectors: ""
[2019-08-15 08:31:02] [config] ulr-query-vectors: ""
[2019-08-15 08:31:02] [config] ulr-softmax-temperature: 1
[2019-08-15 08:31:02] [config] ulr-trainable-transformation: false
[2019-08-15 08:31:02] [config] valid-freq: 10000
[2019-08-15 08:31:02] [config] valid-log: /disk2/models/xx-yy-r0/valid.log
[2019-08-15 08:31:02] [config] valid-max-length: 1000
[2019-08-15 08:31:02] [config] valid-metrics:
[2019-08-15 08:31:02] [config] - ce-mean-words
[2019-08-15 08:31:02] [config] - perplexity
[2019-08-15 08:31:02] [config] - bleu-detok
[2019-08-15 08:31:02] [config] valid-mini-batch: 16
[2019-08-15 08:31:02] [config] valid-script-path: ""
[2019-08-15 08:31:02] [config] valid-sets:
[2019-08-15 08:31:02] [config] - /disk2/data/xx-yy/valid.sk
[2019-08-15 08:31:02] [config] - /disk2/data/xx-yy/valid.en
[2019-08-15 08:31:02] [config] valid-translation-output: ""
[2019-08-15 08:31:02] [config] vocabs:
[2019-08-15 08:31:02] [config] - /disk2/models/xx-yy-r0/vocab.src.spm
[2019-08-15 08:31:02] [config] - /disk2/models/xx-yy-r0/vocab.trg.spm
[2019-08-15 08:31:02] [config] word-penalty: 0
[2019-08-15 08:31:02] [config] workspace: 5000
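The block above is the full configuration echoed by Marian in YAML form. A minimal sketch of reusing it, assuming the YAML is stripped of its log prefixes and saved to a file (the path below is hypothetical), relying on Marian's --config / -c option to load options from a YAML file instead of the command line:

    # Hypothetical: remove the "[timestamp] [config]" prefixes from the echo above,
    # save the remaining YAML to a file, and point Marian at it.
    /home/xyz/marian-dev/build/marian --config /disk2/models/xx-yy-r0/config.yml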
[2019-08-15 08:31:02] [config] Model is being created with Marian v1.7.8 c65c26d6 2019-08-11 18:27:00 +0100
[2019-08-15 08:31:02] Using synchronous training
[2019-08-15 08:31:02] [SentencePiece] Training SentencePiece vocabulary /disk2/models/xx-yy-r0/vocab.src.spm
[2019-08-15 08:31:02] [SentencePiece] Creating temporary file /tmp/marian.oRd1t0
[2019-08-15 08:31:02] [SentencePiece] Sampling at most 10000000 lines from /disk2/data/xx-yy/train.sk
[2019-08-15 08:31:08] [SentencePiece] Selected 3758502 lines
[2019-08-15 08:35:01] [SentencePiece] Removing /disk2/models/xx-yy-r0/vocab.src.spm.vocab
[2019-08-15 08:35:01] [SentencePiece] Renaming /disk2/models/xx-yy-r0/vocab.src.spm.model to /disk2/models/xx-yy-r0/vocab.src.spm
[2019-08-15 08:35:01] [data] Loading SentencePiece vocabulary from file /disk2/models/xx-yy-r0/vocab.src.spm
[2019-08-15 08:35:01] [data] Setting vocabulary size for input 0 to 32000
[2019-08-15 08:35:01] [SentencePiece] Training SentencePiece vocabulary /disk2/models/xx-yy-r0/vocab.trg.spm
[2019-08-15 08:35:01] [SentencePiece] Creating temporary file /tmp/marian.a4mi4N
[2019-08-15 08:35:01] [SentencePiece] Sampling at most 10000000 lines from /disk2/data/xx-yy/train.en
[2019-08-15 08:35:08] [SentencePiece] Selected 3758505 lines
[2019-08-15 08:38:32] [SentencePiece] Removing /disk2/models/xx-yy-r0/vocab.trg.spm.vocab
[2019-08-15 08:38:32] [SentencePiece] Renaming /disk2/models/xx-yy-r0/vocab.trg.spm.model to /disk2/models/xx-yy-r0/vocab.trg.spm
[2019-08-15 08:38:32] [data] Loading SentencePiece vocabulary from file /disk2/models/xx-yy-r0/vocab.trg.spm
[2019-08-15 08:38:32] [data] Setting vocabulary size for input 1 to 32000
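The two vocabularies above are trained by Marian's built-in SentencePiece integration because the .spm files did not exist yet. A rough standalone equivalent for the source side is sketched below; the 32000 vocabulary size and the 10000000-line sampling cap come from the log, while the remaining spm_train flags are assumptions, since the exact options Marian passes to SentencePiece are not shown here:

    # Sketch of a standalone spm_train call approximating what Marian did internally
    # for the source vocabulary (flags other than vocab_size and the sampling cap are assumed).
    spm_train \
        --input=/disk2/data/xx-yy/train.sk \
        --model_prefix=/disk2/models/xx-yy-r0/vocab.src.spm \
        --vocab_size=32000 \
        --input_sentence_size=10000000 \
        --shuffle_input_sentence=true
    # As in the log, the resulting vocab.src.spm.model would then stand in for vocab.src.spm
    # (Marian renames it and removes the accompanying .vocab file).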
[2019-08-15 08:38:32] Compiled without MPI support. Falling back to FakeMPIWrapper
[2019-08-15 08:38:32] [batching] Collecting statistics for batch fitting with step size 10
[2019-08-15 08:38:34] [memory] Extending reserved space to 5120 MB (device gpu0)
[2019-08-15 08:38:35] [memory] Extending reserved space to 5120 MB (device gpu1)
[2019-08-15 08:38:36] [memory] Extending reserved space to 5120 MB (device gpu2)
[2019-08-15 08:38:36] [memory] Extending reserved space to 5120 MB (device gpu3)
[2019-08-15 08:38:37] [comm] Using NCCL 2.4.2 for GPU communication
[2019-08-15 08:38:37] [comm] NCCLCommunicator constructed successfully.
[2019-08-15 08:38:37] [training] Using 4 GPUs
[2019-08-15 08:38:37] [memory] Reserving 797 MB, device gpu0
[2019-08-15 08:38:37] [gpu] 16-bit TensorCores enabled for float32 matrix operations
[2019-08-15 08:38:37] [memory] Reserving 797 MB, device gpu0
[2019-08-15 08:38:43] [batching] Done. Typical MB size is 15680 target words
[2019-08-15 08:38:44] [memory] Extending reserved space to 5120 MB (device gpu0)
[2019-08-15 08:38:44] [memory] Extending reserved space to 5120 MB (device gpu1)
[2019-08-15 08:38:44] [memory] Extending reserved space to 5120 MB (device gpu2)
[2019-08-15 08:38:44] [memory] Extending reserved space to 5120 MB (device gpu3)
[2019-08-15 08:38:44] [comm] Using NCCL 2.4.2 for GPU communication
[2019-08-15 08:38:44] [comm] NCCLCommunicator constructed successfully.
[2019-08-15 08:38:44] [training] Using 4 GPUs
[2019-08-15 08:38:44] Training started
[2019-08-15 08:38:44] [data] Shuffling data
[2019-08-15 08:38:45] [data] Done reading 3758511 sentences
[2019-08-15 08:38:45] [data] Done shuffling 3758511 sentences (cached in RAM)
[2019-08-15 08:40:11] [training] Batches are processed as 1 process(es) x 4 devices/process
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu2
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu3
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu0
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu1
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu3
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu2
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu0
[2019-08-15 08:40:11] [memory] Reserving 797 MB, device gpu1
[2019-08-15 08:40:11] [memory] Reserving 199 MB, device gpu0
[2019-08-15 08:40:11] [memory] Reserving 199 MB, device gpu1
[2019-08-15 08:40:11] [memory] Reserving 199 MB, device gpu2
[2019-08-15 08:40:11] [memory] Reserving 199 MB, device gpu3
[2019-08-15 08:40:12] [memory] Reserving 398 MB, device gpu3
[2019-08-15 08:40:12] [memory] Reserving 398 MB, device gpu0
[2019-08-15 08:40:12] [memory] Reserving 398 MB, device gpu2
[2019-08-15 08:40:12] [memory] Reserving 398 MB, device gpu1
[2019-08-15 08:40:14] Error: CUDA error 700 'an illegal memory access was encountered' - /home/xyz/marian-dev/src/tensors/gpu/algorithm.cu:55: cudaStreamSynchronize(0)
[2019-08-15 08:40:14] Error: CUDA error 700 'an illegal memory access was encountered' - /home/xyz/marian-dev/src/tensors/gpu/algorithm.cu:55: cudaStreamSynchronize(0)
[2019-08-15 08:40:14] Error: CUDA error 700 'an illegal memory access was encountered' - /home/xyz/marian-dev/src/tensors/gpu/cuda_helpers.h:51: cudaMemcpy(dest, start, (end - start) * sizeof(T), cudaMemcpyDefault)
[2019-08-15 08:40:14] Error: Aborted from void marian::gpu::fill(marian::Ptr<marian::Backend>, T*, T*, T) [with T = float; marian::Ptr<marian::Backend> = std::shared_ptr<marian::Backend>] in /home/xyz/marian-dev/src/tensors/gpu/algorithm.cu:55
[2019-08-15 08:40:14] Error: Aborted from void marian::gpu::fill(marian::Ptr<marian::Backend>, T*, T*, T) [with T = float; marian::Ptr<marian::Backend> = std::shared_ptr<marian::Backend>] in /home/xyz/marian-dev/src/tensors/gpu/algorithm.cu:55
[2019-08-15 08:40:14] Error: CUDA error 700 'an illegal memory access was encountered' - /home/xyz/marian-dev/src/tensors/gpu/algorithm.cu:55: cudaStreamSynchronize(0)
[2019-08-15 08:40:14] Error: Aborted from void CudaCopy(const T*, const T*, T*) [with T = const float*] in /home/xyz/marian-dev/src/tensors/gpu/cuda_helpers.h:51
Aborted from void marian::gpu::fill(marian::Ptr<marian::Backend>, T*, T*, T) [with T = float; marian::Ptr<marian::Backend> = std::shared_ptr<marian::Backend>] in /home/xyz/marian-dev/src/tensors/gpu/algorithm.cu:55
[CALL STACK]
[0x5579f783227e] void CudaCopy <float const*>(float const* const*, float const* const*, float const**) + 0x20e
[0x5579f782d321] marian::gpu:: ProdBatched (std::shared_ptr<marian::TensorBase>, std::shared_ptr<marian::Allocator>, std::shared_ptr<marian::TensorBase>, std::shared_ptr<marian::TensorBase>, bool, bool, float, float) + 0xac1
[0x5579f655a5a1] + 0x3d15a1
[0x5579f65808ef] std::_Function_handler<void (),marian::DotBatchedNodeOp::backwardOps()::{lambda()#8}>:: _M_invoke (std::_Any_data const&) + 0x22f
[0x5579f656eb65] marian::Node:: runBackward (std::vector<std::function<void ()>,std::allocator<std::function<void ()>>> const&) + 0x295
[0x5579f65fa7cd] marian::Node:: backward () + 0x6d
[0x5579f6470123] marian::ExpressionGraph:: backward (bool) + 0x293
[0x5579f670668a] + 0x57d68a
[0x5579f6771c14] marian::ThreadPool::enqueue<std::function<void (unsigned long,unsigned long,unsigned long)> const&,unsigned long&,unsigned long&,unsigned long&>(std::function<void (unsigned long,unsigned long,unsigned long)> const&,unsigned long&,unsigned long&,unsigned long&)::{lambda()#1}:: operator() () const + 0x54
[0x5579f6772510] std::_Function_handler<std::unique_ptr<std::__future_base::_Result_base,std::__future_base::_Result_base::_Deleter> (),std::__future_base::_Task_setter<std::unique_ptr<std::__future_base::_Result<void>,std::__future_base::_Result_base::_Deleter>,std::__future_base::_Task_state<marian::ThreadPool::enqueue<std::function<void (unsigned long,unsigned long,unsigned long)> const&,unsigned long&,unsigned long&,unsigned long&>(std::function<void (unsigned long,unsigned long,unsigned long)> const&,unsigned long&,unsigned long&,unsigned long&)::{lambda()#1},std::allocator<int>,void ()>::_M_run()::{lambda()#1},void>>:: _M_invoke (std::_Any_data const&) + 0x30
[0x5579f63b42d9] std::__future_base::_State_baseV2:: _M_do_set (std::function<std::unique_ptr<std::__future_base::_Result_base,std::__future_base::_Result_base::_Deleter> ()>*, bool*) + 0x29
[0x7f5170641197] + 0x11197
[0x5579f6761b49] std::_Function_handler<void (),marian::ThreadPool::enqueue<std::function<void (unsigned long,unsigned long,unsigned long)> const&,unsigned long&,unsigned long&,unsigned long&>(std::function<void (unsigned long,unsigned long,unsigned long)> const&,unsigned long&,unsigned long&,unsigned long&)::{lambda()#3}>:: _M_invoke (std::_Any_data const&) + 0x139
[0x5579f63b8252] std::thread::_State_impl<std::thread::_Invoker<std::tuple<marian::ThreadPool::reserve(unsigned long)::{lambda()#1}>>>:: _M_run () + 0x142
[0x7f5170247630] + 0xd0630
[0x7f5170639182] + 0x9182
[0x7f516ff13b1f] clone + 0x3f
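The crash is reported from a cudaStreamSynchronize inside marian::gpu::fill and from ProdBatched during the backward pass, so the illegal access itself may originate in an earlier asynchronous kernel. A generic way to localize it (standard CUDA tooling, not taken from this log; the trailing options are the same as in the command at the top):

    # Re-run with synchronous kernel launches under cuda-memcheck so the faulting
    # kernel is reported at its actual launch site instead of at a later sync point.
    CUDA_LAUNCH_BLOCKING=1 cuda-memcheck \
        /home/xyz/marian-dev/build/marian --model /disk2/models/xx-yy-r0/model.npz ...
    # "..." stands for the remaining training options shown at the top of this log.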