test_doggy
2017-07-16 01:22:40,678 : INFO : LOG_FILE
2017-07-16 01:22:50,672 : INFO : freeze encoder
2017-07-16 01:22:50,864 : INFO : not use pretrained glove
2017-07-16 01:22:50,864 : INFO : --model info --
2017-07-16 01:22:50,865 : INFO : ModelWrapper (
2017-07-16 01:22:50,865 : INFO :   (drop): Dropout (p = 0.5)
2017-07-16 01:22:50,865 : INFO :   (encoder): Embedding(10000, 300)
2017-07-16 01:22:50,866 : INFO :   (conv_module): MultiConvModule (
2017-07-16 01:22:50,866 : INFO :     (convList): ModuleList (
2017-07-16 01:22:50,866 : INFO :       (0): Conv2d(1, 100, kernel_size=(5, 300), stride=(1, 1), padding=(2, 0))
2017-07-16 01:22:50,867 : INFO :       (1): Conv2d(1, 100, kernel_size=(3, 300), stride=(1, 1), padding=(1, 0))
2017-07-16 01:22:50,867 : INFO :     )
2017-07-16 01:22:50,867 : INFO :     (in_dropout): Dropout (p = 0.5)
2017-07-16 01:22:50,867 : INFO :     (out_dropout): Dropout (p = 0.2)
2017-07-16 01:22:50,867 : INFO :   )
2017-07-16 01:22:50,868 : INFO :   (rnn): LSTM(200, 168)
2017-07-16 01:22:50,868 : INFO :   (decoder): Linear (168 -> 10000)
2017-07-16 01:22:50,868 : INFO : )
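The training script itself is not part of this gist, only the printed module summary above. As a point of reference, a minimal PyTorch sketch consistent with that summary might look like the following; the class names and field names are taken from the log, but the forward wiring (channel unsqueeze, concatenation of the two conv branches into the 200-dim LSTM input, sequence-first transpose) is an assumption, not the author's actual code. The "freeze encoder" line is interpreted here as disabling gradients on the embedding weights.

```python
import torch
import torch.nn as nn

class MultiConvModule(nn.Module):
    """Hypothetical reconstruction of the logged conv block."""
    def __init__(self, emb_dim=300, n_filters=100):
        super().__init__()
        # One Conv2d per kernel height (5 and 3); padding keeps seq length.
        self.convList = nn.ModuleList([
            nn.Conv2d(1, n_filters, kernel_size=(5, emb_dim), padding=(2, 0)),
            nn.Conv2d(1, n_filters, kernel_size=(3, emb_dim), padding=(1, 0)),
        ])
        self.in_dropout = nn.Dropout(p=0.5)
        self.out_dropout = nn.Dropout(p=0.2)

    def forward(self, x):                     # x: (batch, seq_len, emb_dim)
        x = self.in_dropout(x).unsqueeze(1)   # add channel dim for Conv2d
        feats = [conv(x).squeeze(3) for conv in self.convList]
        out = torch.cat(feats, dim=1)         # (batch, 2*n_filters, seq_len)
        return self.out_dropout(out.transpose(1, 2))  # (batch, seq_len, 200)

class ModelWrapper(nn.Module):
    """Hypothetical reconstruction of the logged top-level model."""
    def __init__(self, vocab=10000, emb_dim=300, hidden=168):
        super().__init__()
        self.drop = nn.Dropout(p=0.5)
        self.encoder = nn.Embedding(vocab, emb_dim)
        # "freeze encoder" per the log: exclude embeddings from updates.
        self.encoder.weight.requires_grad = False
        self.conv_module = MultiConvModule(emb_dim)
        self.rnn = nn.LSTM(200, hidden)       # 200 = 2 conv branches x 100
        self.decoder = nn.Linear(hidden, vocab)

    def forward(self, tokens):                # tokens: (batch, seq_len)
        emb = self.drop(self.encoder(tokens))
        conv_out = self.conv_module(emb)      # (batch, seq_len, 200)
        rnn_out, _ = self.rnn(conv_out.transpose(0, 1))  # seq-first LSTM
        return self.decoder(rnn_out)          # (seq_len, batch, vocab)
```

The shapes line up with the summary: Conv2d kernels span the full 300-dim embedding, so each branch collapses to 100 feature maps per position, and their concatenation matches the LSTM's input size of 200.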
2017-07-16 01:22:57,354 : INFO : | epoch 1 | 200/ 1327 batches | lr 20.00 | ms/batch 32.43 | loss 5.54 | ppl 254.01
2017-07-16 01:23:02,103 : INFO : | epoch 1 | 400/ 1327 batches | lr 20.00 | ms/batch 23.74 | loss 4.06 | ppl 57.80
2017-07-16 01:23:07,038 : INFO : | epoch 1 | 600/ 1327 batches | lr 20.00 | ms/batch 24.67 | loss 3.68 | ppl 39.45
2017-07-16 01:23:11,950 : INFO : | epoch 1 | 800/ 1327 batches | lr 20.00 | ms/batch 24.56 | loss 3.37 | ppl 29.15
2017-07-16 01:23:16,955 : INFO : | epoch 1 | 1000/ 1327 batches | lr 20.00 | ms/batch 25.02 | loss 3.22 | ppl 24.98
2017-07-16 01:23:21,799 : INFO : | epoch 1 | 1200/ 1327 batches | lr 20.00 | ms/batch 24.22 | loss 3.12 | ppl 22.72
2017-07-16 01:23:26,093 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:23:26,093 : INFO : | end of epoch 1 | time: 35.22s | valid loss 2.68 | valid ppl 14.65
2017-07-16 01:23:26,093 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:23:31,126 : INFO : | epoch 2 | 200/ 1327 batches | lr 20.00 | ms/batch 24.79 | loss 3.01 | ppl 20.21
2017-07-16 01:23:35,856 : INFO : | epoch 2 | 400/ 1327 batches | lr 20.00 | ms/batch 23.64 | loss 2.96 | ppl 19.26
2017-07-16 01:23:40,841 : INFO : | epoch 2 | 600/ 1327 batches | lr 20.00 | ms/batch 24.93 | loss 2.93 | ppl 18.66
2017-07-16 01:23:45,835 : INFO : | epoch 2 | 800/ 1327 batches | lr 20.00 | ms/batch 24.93 | loss 2.82 | ppl 16.73
2017-07-16 01:23:50,793 : INFO : | epoch 2 | 1000/ 1327 batches | lr 20.00 | ms/batch 24.78 | loss 2.78 | ppl 16.18
2017-07-16 01:23:55,753 : INFO : | epoch 2 | 1200/ 1327 batches | lr 20.00 | ms/batch 24.80 | loss 2.75 | ppl 15.70
2017-07-16 01:24:00,011 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:24:00,012 : INFO : | end of epoch 2 | time: 33.84s | valid loss 2.26 | valid ppl 9.61
2017-07-16 01:24:00,012 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:24:05,003 : INFO : | epoch 3 | 200/ 1327 batches | lr 20.00 | ms/batch 24.23 | loss 2.74 | ppl 15.43
2017-07-16 01:24:09,954 : INFO : | epoch 3 | 400/ 1327 batches | lr 20.00 | ms/batch 24.75 | loss 2.72 | ppl 15.12
2017-07-16 01:24:14,908 : INFO : | epoch 3 | 600/ 1327 batches | lr 20.00 | ms/batch 24.76 | loss 2.72 | ppl 15.25
2017-07-16 01:24:19,935 : INFO : | epoch 3 | 800/ 1327 batches | lr 20.00 | ms/batch 25.13 | loss 2.65 | ppl 14.11
2017-07-16 01:24:24,956 : INFO : | epoch 3 | 1000/ 1327 batches | lr 20.00 | ms/batch 25.11 | loss 2.63 | ppl 13.93
2017-07-16 01:24:30,009 : INFO : | epoch 3 | 1200/ 1327 batches | lr 20.00 | ms/batch 25.26 | loss 2.62 | ppl 13.67
2017-07-16 01:24:34,232 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:24:34,232 : INFO : | end of epoch 3 | time: 34.07s | valid loss 2.15 | valid ppl 8.55
2017-07-16 01:24:34,233 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:24:39,277 : INFO : | epoch 4 | 200/ 1327 batches | lr 20.00 | ms/batch 24.95 | loss 2.61 | ppl 13.64
2017-07-16 01:24:44,379 : INFO : | epoch 4 | 400/ 1327 batches | lr 20.00 | ms/batch 25.51 | loss 2.62 | ppl 13.74
2017-07-16 01:24:49,351 : INFO : | epoch 4 | 600/ 1327 batches | lr 20.00 | ms/batch 24.85 | loss 2.63 | ppl 13.93
2017-07-16 01:24:54,245 : INFO : | epoch 4 | 800/ 1327 batches | lr 20.00 | ms/batch 24.47 | loss 2.56 | ppl 12.97
2017-07-16 01:24:59,232 : INFO : | epoch 4 | 1000/ 1327 batches | lr 20.00 | ms/batch 24.93 | loss 2.55 | ppl 12.84
2017-07-16 01:25:03,997 : INFO : | epoch 4 | 1200/ 1327 batches | lr 20.00 | ms/batch 23.82 | loss 2.54 | ppl 12.67
2017-07-16 01:25:08,205 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:25:08,206 : INFO : | end of epoch 4 | time: 33.92s | valid loss 2.04 | valid ppl 7.67
2017-07-16 01:25:08,206 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:25:13,354 : INFO : | epoch 5 | 200/ 1327 batches | lr 20.00 | ms/batch 25.52 | loss 2.54 | ppl 12.66
2017-07-16 01:25:18,251 : INFO : | epoch 5 | 400/ 1327 batches | lr 20.00 | ms/batch 24.48 | loss 2.56 | ppl 12.90
2017-07-16 01:25:23,089 : INFO : | epoch 5 | 600/ 1327 batches | lr 20.00 | ms/batch 24.18 | loss 2.57 | ppl 13.07
2017-07-16 01:25:28,009 : INFO : | epoch 5 | 800/ 1327 batches | lr 20.00 | ms/batch 24.60 | loss 2.50 | ppl 12.22
2017-07-16 01:25:32,917 : INFO : | epoch 5 | 1000/ 1327 batches | lr 20.00 | ms/batch 24.54 | loss 2.50 | ppl 12.16
2017-07-16 01:25:37,909 : INFO : | epoch 5 | 1200/ 1327 batches | lr 20.00 | ms/batch 24.96 | loss 2.49 | ppl 12.08
2017-07-16 01:25:42,103 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:25:42,103 : INFO : | end of epoch 5 | time: 33.85s | valid loss 1.94 | valid ppl 6.98
2017-07-16 01:25:42,103 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:25:47,060 : INFO : | epoch 6 | 200/ 1327 batches | lr 20.00 | ms/batch 24.53 | loss 2.51 | ppl 12.25
2017-07-16 01:25:51,840 : INFO : | epoch 6 | 400/ 1327 batches | lr 20.00 | ms/batch 23.89 | loss 2.51 | ppl 12.35
2017-07-16 01:25:56,852 : INFO : | epoch 6 | 600/ 1327 batches | lr 20.00 | ms/batch 25.06 | loss 2.53 | ppl 12.55
2017-07-16 01:26:01,814 : INFO : | epoch 6 | 800/ 1327 batches | lr 20.00 | ms/batch 24.80 | loss 2.47 | ppl 11.82
2017-07-16 01:26:06,851 : INFO : | epoch 6 | 1000/ 1327 batches | lr 20.00 | ms/batch 25.19 | loss 2.46 | ppl 11.68
2017-07-16 01:26:11,737 : INFO : | epoch 6 | 1200/ 1327 batches | lr 20.00 | ms/batch 24.42 | loss 2.46 | ppl 11.65
2017-07-16 01:26:15,995 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:26:15,995 : INFO : | end of epoch 6 | time: 33.84s | valid loss 1.82 | valid ppl 6.20
2017-07-16 01:26:15,995 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:26:20,924 : INFO : | epoch 7 | 200/ 1327 batches | lr 20.00 | ms/batch 24.22 | loss 2.48 | ppl 11.90
2017-07-16 01:26:25,858 : INFO : | epoch 7 | 400/ 1327 batches | lr 20.00 | ms/batch 24.66 | loss 2.49 | ppl 12.03
2017-07-16 01:26:30,752 : INFO : | epoch 7 | 600/ 1327 batches | lr 20.00 | ms/batch 24.47 | loss 2.50 | ppl 12.22
2017-07-16 01:26:35,677 : INFO : | epoch 7 | 800/ 1327 batches | lr 20.00 | ms/batch 24.63 | loss 2.44 | ppl 11.52
2017-07-16 01:26:40,713 : INFO : | epoch 7 | 1000/ 1327 batches | lr 20.00 | ms/batch 25.18 | loss 2.44 | ppl 11.51
2017-07-16 01:26:45,593 : INFO : | epoch 7 | 1200/ 1327 batches | lr 20.00 | ms/batch 24.40 | loss 2.44 | ppl 11.44
2017-07-16 01:26:49,832 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:26:49,832 : INFO : | end of epoch 7 | time: 33.75s | valid loss 1.73 | valid ppl 5.62
2017-07-16 01:26:49,832 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:26:54,745 : INFO : | epoch 8 | 200/ 1327 batches | lr 20.00 | ms/batch 24.26 | loss 2.44 | ppl 11.53
2017-07-16 01:26:59,736 : INFO : | epoch 8 | 400/ 1327 batches | lr 20.00 | ms/batch 24.95 | loss 2.46 | ppl 11.73
2017-07-16 01:27:04,629 : INFO : | epoch 8 | 600/ 1327 batches | lr 20.00 | ms/batch 24.46 | loss 2.48 | ppl 12.00
2017-07-16 01:27:09,624 : INFO : | epoch 8 | 800/ 1327 batches | lr 20.00 | ms/batch 24.97 | loss 2.43 | ppl 11.35
2017-07-16 01:27:14,657 : INFO : | epoch 8 | 1000/ 1327 batches | lr 20.00 | ms/batch 25.16 | loss 2.42 | ppl 11.22
2017-07-16 01:27:19,600 : INFO : | epoch 8 | 1200/ 1327 batches | lr 20.00 | ms/batch 24.72 | loss 2.41 | ppl 11.15
2017-07-16 01:27:24,005 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:27:24,006 : INFO : | end of epoch 8 | time: 34.11s | valid loss 1.59 | valid ppl 4.92
2017-07-16 01:27:24,006 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:27:28,926 : INFO : | epoch 9 | 200/ 1327 batches | lr 20.00 | ms/batch 24.26 | loss 2.43 | ppl 11.41
2017-07-16 01:27:33,832 : INFO : | epoch 9 | 400/ 1327 batches | lr 20.00 | ms/batch 24.53 | loss 2.46 | ppl 11.68
2017-07-16 01:27:38,818 : INFO : | epoch 9 | 600/ 1327 batches | lr 20.00 | ms/batch 24.93 | loss 2.47 | ppl 11.81
2017-07-16 01:27:43,865 : INFO : | epoch 9 | 800/ 1327 batches | lr 20.00 | ms/batch 25.23 | loss 2.41 | ppl 11.11
2017-07-16 01:27:48,831 : INFO : | epoch 9 | 1000/ 1327 batches | lr 20.00 | ms/batch 24.83 | loss 2.41 | ppl 11.09
2017-07-16 01:27:53,943 : INFO : | epoch 9 | 1200/ 1327 batches | lr 20.00 | ms/batch 25.56 | loss 2.40 | ppl 11.07
2017-07-16 01:27:58,466 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:27:58,466 : INFO : | end of epoch 9 | time: 34.39s | valid loss 1.46 | valid ppl 4.29
2017-07-16 01:27:58,467 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:28:03,310 : INFO : | epoch 10 | 200/ 1327 batches | lr 20.00 | ms/batch 23.93 | loss 2.41 | ppl 11.19
2017-07-16 01:28:08,289 : INFO : | epoch 10 | 400/ 1327 batches | lr 20.00 | ms/batch 24.89 | loss 2.44 | ppl 11.50
2017-07-16 01:28:13,239 : INFO : | epoch 10 | 600/ 1327 batches | lr 20.00 | ms/batch 24.75 | loss 2.46 | ppl 11.70
2017-07-16 01:28:17,990 : INFO : | epoch 10 | 800/ 1327 batches | lr 20.00 | ms/batch 23.75 | loss 2.40 | ppl 10.98
2017-07-16 01:28:22,974 : INFO : | epoch 10 | 1000/ 1327 batches | lr 20.00 | ms/batch 24.92 | loss 2.39 | ppl 10.91
2017-07-16 01:28:27,985 : INFO : | epoch 10 | 1200/ 1327 batches | lr 20.00 | ms/batch 25.05 | loss 2.39 | ppl 10.96
2017-07-16 01:28:32,150 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:28:32,150 : INFO : | end of epoch 10 | time: 33.63s | valid loss 1.38 | valid ppl 3.96
2017-07-16 01:28:32,150 : INFO : -----------------------------------------------------------------------------------------
2017-07-16 01:28:33,675 : INFO : =========================================================================================
2017-07-16 01:28:33,676 : INFO : | End of training | test loss 1.31 | test ppl 3.70
2017-07-16 01:28:33,676 : INFO : =========================================================================================
2017-07-16 01:28:33,681 : INFO : save state file to debug11
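The ppl columns in the log are just the exponential of the reported cross-entropy loss, which is how perplexity is conventionally computed. This can be checked against the final test line (the small discrepancy comes from the loss being printed to only two decimals):

```python
import math

# Perplexity is exp(cross-entropy loss); checking the logged
# "test loss 1.31 | test ppl 3.70" line:
loss = 1.31
ppl = math.exp(loss)
print(round(ppl, 2))  # → 3.71, matching the logged 3.70 up to rounding
```

This also makes it easy to sanity-check any line, e.g. exp(2.39) ≈ 10.9 for the last epoch-10 batches.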
Hi, did you find your problem? I get a low ppl too, and I cannot find out why.