
@hamletbatista
Created October 29, 2019 06:08
Working directory: /content/PreSumm/src
Abstractive summarization ("abs") test run:
[2019-10-29 04:11:27,312 INFO] Loading checkpoint from /content/PreSumm/models/CNN_DailyMail_Abstractive/model_step_148000.pt
Namespace(accum_count=1, alpha=0.95, batch_size=32, beam_size=5,
          bert_data_path='../bert_data/cnndm', beta1=0.9, beta2=0.999,
          block_trigram=True, dec_dropout=0.2, dec_ff_size=2048, dec_heads=8,
          dec_hidden_size=768, dec_layers=6, enc_dropout=0.2, enc_ff_size=512,
          enc_hidden_size=512, enc_layers=6, encoder='bert', ext_dropout=0.2,
          ext_ff_size=2048, ext_heads=8, ext_hidden_size=768, ext_layers=2,
          finetune_bert=True, generator_shard_size=32, gpu_ranks=[0],
          label_smoothing=0.1, large=False, load_from_extractive='',
          log_file='../logs/val_abs_bert_cnndm', lower=True, lr=1,
          lr_bert=0.002, lr_dec=0.002, max_grad_norm=0, max_length=200,
          max_pos=512, max_src_nsents=100, max_src_ntokens_per_sent=200,
          max_tgt_len=140, max_tgt_ntokens=500, min_length=50, min_src_nsents=3,
          min_src_ntokens_per_sent=5, min_tgt_ntokens=5, mode='test',
          model_path='../../models/', optim='adam', param_init=0,
          param_init_glorot=True, recall_eval=False, report_every=1,
          report_rouge=False, result_path='../results/abs_bert_cnndm_sample',
          save_checkpoint_steps=5, seed=666, sep_optim=True, shard_size=2000,
          share_emb=False, task='abs', temp_dir='../../temp', test_all=False,
          test_batch_size=500,
          test_from='/content/PreSumm/models/CNN_DailyMail_Abstractive/model_step_148000.pt',
          test_start_from=-1, train_from='', train_steps=1000,
          use_bert_basic_tokenizer=False, use_bert_emb=False, use_interval=True,
          visible_gpus='-1', warmup_steps=8000, warmup_steps_bert=8000,
          warmup_steps_dec=8000, world_size=1)
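The Namespace above is the argument set PreSumm's src/train.py was launched with. The actual command line is not part of this log, so the following is a sketch reconstructed from the flags shown (visible_gpus='-1' means the run is on CPU):

    python train.py -task abs -mode test \
        -test_from /content/PreSumm/models/CNN_DailyMail_Abstractive/model_step_148000.pt \
        -bert_data_path ../bert_data/cnndm \
        -model_path ../../models/ -temp_dir ../../temp \
        -log_file ../logs/val_abs_bert_cnndm \
        -result_path ../results/abs_bert_cnndm_sample \
        -batch_size 32 -test_batch_size 500 \
        -sep_optim true -use_interval true -visible_gpus -1 \
        -max_pos 512 -max_length 200 -min_length 50 \
        -alpha 0.95 -beam_size 5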
[2019-10-29 04:11:38,809 INFO] loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at ../../temp/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.bf3b9ea126d8c0001ee8a1e8b92229871d06d36d8808208cc2449280da87785c
[2019-10-29 04:11:38,814 INFO] Model config {
"attention_probs_dropout_prob": 0.1,
"finetuning_task": null,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"layer_norm_eps": 1e-12,
"max_position_embeddings": 512,
"num_attention_heads": 12,
"num_hidden_layers": 12,
"num_labels": 2,
"output_attentions": false,
"output_hidden_states": false,
"pruned_heads": {},
"torchscript": false,
"type_vocab_size": 2,
"vocab_size": 30522
}
[2019-10-29 04:11:39,560 INFO] loading weights file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin from cache at ../../temp/aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
[2019-10-29 04:11:47,345 INFO] loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at ../../temp/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
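The three cache hits above (config, weights, vocabulary) come from pytorch_transformers, the HuggingFace library PreSumm builds on; the /root/.cache/torch/pytorch_transformers path later in this log confirms it. A minimal standalone sketch of the same load, assuming that library (PreSumm performs it inside its own model wrapper):

    from pytorch_transformers import BertModel, BertTokenizer

    # Fetches (or reuses cached copies of) bert-base-uncased's config,
    # weights, and vocab; cache_dir matches temp_dir='../../temp' above.
    model = BertModel.from_pretrained('bert-base-uncased', cache_dir='../../temp')
    tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', cache_dir='../../temp')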
Files In Input Dir: 1
[2019-10-29 04:11:48,279 INFO] loading vocabulary file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/pytorch_transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
[2019-10-29 04:11:48,340 INFO] Processing ...
[2019-10-29 04:11:48,512 INFO] Saving to ../bert_data_test/cnndm.test.0.bert.pt
[2019-10-29 04:11:48,530 INFO] Loading test dataset from ../bert_data_test/cnndm.test.0.bert.pt, number of examples: 1
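The .bert.pt shard written and immediately reloaded above is an ordinary torch pickle: a list of preprocessed examples, one here. A quick way to inspect it; the field names follow PreSumm's BertData format, so treat them as an assumption if your checkout differs:

    import torch

    # Each shard is a list of dicts, one dict per example.
    data = torch.load('../bert_data_test/cnndm.test.0.bert.pt')
    print(len(data))               # 1, matching "number of examples: 1"
    print(sorted(data[0].keys()))  # expected: clss, segs, src,
                                   # src_sent_labels, src_txt, tgt, tgt_txt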