Gist felixgwu/8c74a28b040b95635fcf28c5c1e3e078, created November 16, 2018 19:17
Can not use cuDNN on context None: Disabled by dnn.enabled flag
Mapped name None to device cuda0: GeForce GTX 1080 Ti (0000:61:00.0)
11/16/2018 02:06:13 PM In order to work for big datasets fix https://github.com/Theano/Theano/pull/5721 should be applied to theano.
11/16/2018 02:06:13 PM loading the dataset from ./data/cmu/
11/16/2018 02:06:14 PM #labels: 129
11/16/2018 02:06:16 PM TfidfVectorizer(analyzer=u'word', binary=True, decode_error=u'strict',
        dtype='float32', encoding='latin1', input=u'content',
        lowercase=True, max_df=0.2, max_features=None, min_df=10,
        ngram_range=(1, 1), norm='l2', preprocessor=None, smooth_idf=True,
        stop_words='english', strip_accents=None, sublinear_tf=False,
        token_pattern='(?u)(?<![@])#?\\b\\w\\w+\\b', tokenizer=None,
        use_idf=True, vocabulary=None)
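The vectorizer printed above uses an old (Python 2) scikit-learn. A sketch of the same configuration in a modern scikit-learn follows; the notable detail is the `token_pattern`, which keeps hashtag tokens (optional leading `#`) while the negative lookbehind `(?<![@])` drops @-mentions:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

# Rough modern-sklearn equivalent of the vectorizer in the log above.
# binary=True makes term counts 0/1 before the idf weighting;
# max_df=0.2 and min_df=10 prune very common and very rare tokens.
vectorizer = TfidfVectorizer(
    analyzer='word', binary=True, dtype=np.float32, encoding='latin1',
    lowercase=True, max_df=0.2, min_df=10, ngram_range=(1, 1), norm='l2',
    smooth_idf=True, stop_words='english', sublinear_tf=False,
    token_pattern=r'(?u)(?<![@])#?\b\w\w+\b', use_idf=True,
)
```

For example, the tokenizer built from this pattern keeps `#nlp` but skips `@user` entirely.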
/home/felixgwu/.local/lib/python2.7/site-packages/sklearn/feature_extraction/text.py:1089: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  if hasattr(X, 'dtype') and np.issubdtype(X.dtype, np.float):
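The FutureWarning above comes from passing `np.float` (an alias for the builtin `float`, removed in NumPy 1.24) as the second argument of `np.issubdtype`. The robust form tests against the abstract floating-point supertype instead:

```python
import numpy as np

X = np.zeros((2, 3), dtype=np.float32)

# Deprecated (and an AttributeError on NumPy >= 1.24):
#   np.issubdtype(X.dtype, np.float)
# Correct: compare against the abstract np.floating supertype,
# which matches float16/float32/float64 alike.
is_float = np.issubdtype(X.dtype, np.floating)
```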
11/16/2018 02:06:20 PM training n_samples: 5685, n_features: 9467
11/16/2018 02:06:20 PM development n_samples: 1895, n_features: 9467
11/16/2018 02:06:20 PM test n_samples: 1895, n_features: 9467
11/16/2018 02:06:20 PM saving vocab in ./data/cmu/vocab.pkl
11/16/2018 02:06:20 PM vocab dumped successfully!
11/16/2018 02:06:20 PM adding the train graph
11/16/2018 02:06:22 PM adding the dev graph
11/16/2018 02:06:22 PM adding the test graph
11/16/2018 02:06:23 PM removing 91477 celebrity nodes with degree higher than 5
11/16/2018 02:06:23 PM projecting the graph
11/16/2018 02:06:23 PM 0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100%
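The graph steps above build a user-mention graph, drop "celebrity" nodes whose degree exceeds 5, and project the rest onto user-user edges (two users become connected when they share a mentioned account). A minimal pure-Python sketch of that projection, under the assumption that this is the intent of the log messages; names and the helper are illustrative, not the repository's actual code:

```python
from collections import defaultdict
from itertools import combinations

def project_mention_graph(mentions, celebrity_threshold=5):
    """mentions: dict mapping user -> set of mentioned accounts.
    Returns user-user edges after dropping high-degree ('celebrity')
    mention nodes and projecting the bipartite graph onto users."""
    # Invert the relation: mentioned account -> users who mention it.
    mentioned_by = defaultdict(set)
    for user, accounts in mentions.items():
        for account in accounts:
            mentioned_by[account].add(user)
    edges = set()
    for account, users in mentioned_by.items():
        if len(users) > celebrity_threshold:
            continue  # celebrity node: connects too many users, skip it
        for u, v in combinations(sorted(users), 2):
            edges.add((u, v))
    return edges
```

Dropping celebrity nodes keeps the projected graph sparse: a single account mentioned by k users would otherwise contribute k*(k-1)/2 edges.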
11/16/2018 02:06:23 PM #nodes: 9475, #edges: 77155
11/16/2018 02:06:23 PM creating adjacency matrix...
/home/felixgwu/.local/lib/python2.7/site-packages/scipy/sparse/compressed.py:730: SparseEfficiencyWarning: Changing the sparsity structure of a csr_matrix is expensive. lil_matrix is more efficient.
  SparseEfficiencyWarning)
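The SparseEfficiencyWarning above fires when entries are assigned one by one into a CSR matrix, which forces repeated reallocation of its internal arrays. The usual remedy, as the warning suggests, is to accumulate in LIL format and convert to CSR once at the end:

```python
import numpy as np
import scipy.sparse as sp

n = 4
# Inefficient: A = sp.csr_matrix((n, n)); A[0, 1] = 1.0  -> triggers the warning.
# Efficient: build in LIL (cheap per-element writes), convert once.
A = sp.lil_matrix((n, n), dtype=np.float32)
A[0, 1] = 1.0
A[2, 3] = 1.0
A = A.tocsr()  # CSR is the right format for the matmuls that follow
```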
11/16/2018 02:06:24 PM adjacency matrix created.
11/16/2018 02:06:24 PM dumping data in ./data/cmu/dump.pkl ...
11/16/2018 02:06:26 PM data dump finished!
11/16/2018 02:06:26 PM stacking training, dev and test features and creating indices...
11/16/2018 02:06:26 PM running mlp with graph conv...
11/16/2018 02:06:26 PM highway is True
11/16/2018 02:06:26 PM Graphconv model input size 9467, output size 129 and hidden layers [300, 300, 300] regul 0.0 dropout 0.5.
11/16/2018 02:06:27 PM 3 gconv layers
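The model line above describes three 300-unit graph-convolution layers with highway gates. As an illustration only (a NumPy sketch of one such layer, assuming the standard highway formulation T*conv + (1-T)*input; the exact gating in the repository's `gcnmodel.py` may differ):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway_gconv_layer(A_hat, H, W, W_gate, b_gate):
    """One highway graph-convolution layer (illustrative sketch).
    A_hat: normalized adjacency (n, n); H: node features (n, d);
    W: conv weights (d, d); W_gate, b_gate: gate parameters.
    A learned gate mixes the graph-convolved features with the layer
    input, which lets deep stacks keep useful shallow features."""
    conv = np.tanh(A_hat @ H @ W)          # graph convolution A_hat * H * W
    gate = sigmoid(H @ W_gate + b_gate)    # transform gate T in (0, 1)
    return gate * conv + (1.0 - gate) * H  # highway mix: T*conv + (1-T)*H
```

Note the highway mix requires matching input/output widths, consistent with the equal-sized [300, 300, 300] hidden layers in the log.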
/home/felixgwu/.conda/envs/geo/lib/python2.7/site-packages/lasagne/layers/helper.py:216: UserWarning: get_output() was called with unused kwargs:
        A
  % "\n\t".join(suggestions))
/share/felixgwu/projects/gcn_project/geographconv/gcnmodel.py:402: UserWarning: theano.function was asked to create a function computing outputs given certain inputs, but the provided input variable at index 1 is not part of the computational graph needed to compute the outputs: SparseVariable{csr,float32}.
To make this warning into an error, you can pass the parameter on_unused_input='raise' to theano.function. To disable it completely, use on_unused_input='ignore'.
  self.f_gates.append(theano.function([self.X_sym, self.A_sym], self.gate_outputs[i], on_unused_input='warn'))
11/16/2018 02:06:48 PM ***********percentile 1.000000 ******************
11/16/2018 02:06:48 PM 5685 training samples
11/16/2018 02:06:48 PM training for 10000 epochs with batch size 500
/home/felixgwu/.conda/envs/geo/lib/python2.7/site-packages/theano/tensor/subtensor.py:2339: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.
  out[0][inputs[2:]] = inputs[1]
11/16/2018 02:06:48 PM epoch 0 train loss 4.86 train acc 0.01 val loss 4.86 val acc 0.01 best val acc 0.01 maxdown 0
11/16/2018 02:06:49 PM epoch 1 train loss 4.85 train acc 0.08 val loss 4.85 val acc 0.03 best val acc 0.03 maxdown 0
11/16/2018 02:06:50 PM epoch 2 train loss 4.83 train acc 0.14 val loss 4.85 val acc 0.04 best val acc 0.04 maxdown 0
11/16/2018 02:06:51 PM epoch 3 train loss 4.82 train acc 0.15 val loss 4.84 val acc 0.04 best val acc 0.04 maxdown 0
11/16/2018 02:06:51 PM epoch 4 train loss 4.80 train acc 0.16 val loss 4.84 val acc 0.03 best val acc 0.03 maxdown 0
11/16/2018 02:06:52 PM epoch 5 train loss 4.79 train acc 0.16 val loss 4.83 val acc 0.04 best val acc 0.04 maxdown 0
11/16/2018 02:06:53 PM epoch 6 train loss 4.77 train acc 0.17 val loss 4.82 val acc 0.04 best val acc 0.04 maxdown 0
11/16/2018 02:06:53 PM epoch 7 train loss 4.75 train acc 0.18 val loss 4.81 val acc 0.04 best val acc 0.04 maxdown 0
11/16/2018 02:06:54 PM epoch 8 train loss 4.73 train acc 0.20 val loss 4.80 val acc 0.04 best val acc 0.04 maxdown 0
11/16/2018 02:06:55 PM epoch 9 train loss 4.71 train acc 0.23 val loss 4.79 val acc 0.06 best val acc 0.06 maxdown 0
11/16/2018 02:06:56 PM epoch 10 train loss 4.69 train acc 0.25 val loss 4.78 val acc 0.08 best val acc 0.08 maxdown 0
11/16/2018 02:06:56 PM epoch 11 train loss 4.67 train acc 0.27 val loss 4.76 val acc 0.10 best val acc 0.10 maxdown 0
11/16/2018 02:06:57 PM epoch 12 train loss 4.65 train acc 0.30 val loss 4.75 val acc 0.11 best val acc 0.11 maxdown 0
11/16/2018 02:06:58 PM epoch 13 train loss 4.62 train acc 0.32 val loss 4.74 val acc 0.12 best val acc 0.12 maxdown 0
11/16/2018 02:06:59 PM epoch 14 train loss 4.59 train acc 0.33 val loss 4.72 val acc 0.13 best val acc 0.13 maxdown 0
11/16/2018 02:06:59 PM epoch 15 train loss 4.56 train acc 0.34 val loss 4.70 val acc 0.14 best val acc 0.14 maxdown 0
11/16/2018 02:07:00 PM epoch 16 train loss 4.53 train acc 0.36 val loss 4.68 val acc 0.15 best val acc 0.15 maxdown 0
11/16/2018 02:07:01 PM epoch 17 train loss 4.50 train acc 0.38 val loss 4.66 val acc 0.15 best val acc 0.15 maxdown 0
11/16/2018 02:07:02 PM epoch 18 train loss 4.47 train acc 0.39 val loss 4.64 val acc 0.16 best val acc 0.16 maxdown 0
11/16/2018 02:07:02 PM epoch 19 train loss 4.44 train acc 0.40 val loss 4.62 val acc 0.17 best val acc 0.17 maxdown 0
11/16/2018 02:07:03 PM epoch 20 train loss 4.40 train acc 0.40 val loss 4.60 val acc 0.17 best val acc 0.17 maxdown 0
11/16/2018 02:07:04 PM epoch 21 train loss 4.37 train acc 0.42 val loss 4.58 val acc 0.17 best val acc 0.17 maxdown 0
11/16/2018 02:07:05 PM epoch 22 train loss 4.33 train acc 0.42 val loss 4.55 val acc 0.18 best val acc 0.18 maxdown 0
11/16/2018 02:07:05 PM epoch 23 train loss 4.30 train acc 0.43 val loss 4.53 val acc 0.18 best val acc 0.18 maxdown 0
11/16/2018 02:07:06 PM epoch 24 train loss 4.26 train acc 0.44 val loss 4.51 val acc 0.18 best val acc 0.18 maxdown 0
11/16/2018 02:07:07 PM epoch 25 train loss 4.22 train acc 0.44 val loss 4.48 val acc 0.18 best val acc 0.18 maxdown 0
11/16/2018 02:07:08 PM epoch 26 train loss 4.18 train acc 0.45 val loss 4.46 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:08 PM epoch 27 train loss 4.15 train acc 0.46 val loss 4.44 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:09 PM epoch 28 train loss 4.11 train acc 0.46 val loss 4.42 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:10 PM epoch 29 train loss 4.07 train acc 0.47 val loss 4.39 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:11 PM epoch 30 train loss 4.04 train acc 0.47 val loss 4.37 val acc 0.19 best val acc 0.19 maxdown 0
11/16/2018 02:07:11 PM epoch 31 train loss 4.00 train acc 0.48 val loss 4.36 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:12 PM epoch 32 train loss 3.97 train acc 0.48 val loss 4.34 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:13 PM epoch 33 train loss 3.93 train acc 0.49 val loss 4.32 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:14 PM epoch 34 train loss 3.90 train acc 0.49 val loss 4.31 val acc 0.20 best val acc 0.20 maxdown 0
11/16/2018 02:07:14 PM epoch 35 train loss 3.87 train acc 0.50 val loss 4.28 val acc 0.21 best val acc 0.21 maxdown 0
11/16/2018 02:07:15 PM epoch 36 train loss 3.83 train acc 0.50 val loss 4.27 val acc 0.21 best val acc 0.21 maxdown 0
11/16/2018 02:07:16 PM epoch 37 train loss 3.80 train acc 0.51 val loss 4.26 val acc 0.21 best val acc 0.21 maxdown 0
11/16/2018 02:07:17 PM epoch 38 train loss 3.76 train acc 0.52 val loss 4.24 val acc 0.22 best val acc 0.22 maxdown 0
11/16/2018 02:07:17 PM epoch 39 train loss 3.73 train acc 0.52 val loss 4.22 val acc 0.21 best val acc 0.21 maxdown 0
11/16/2018 02:07:18 PM epoch 40 train loss 3.69 train acc 0.53 val loss 4.20 val acc 0.22 best val acc 0.22 maxdown 0
11/16/2018 02:07:19 PM epoch 41 train loss 3.65 train acc 0.52 val loss 4.19 val acc 0.22 best val acc 0.22 maxdown 0
11/16/2018 02:07:20 PM epoch 42 train loss 3.61 train acc 0.53 val loss 4.17 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:20 PM epoch 43 train loss 3.58 train acc 0.54 val loss 4.15 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:21 PM epoch 44 train loss 3.54 train acc 0.54 val loss 4.13 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:22 PM epoch 45 train loss 3.50 train acc 0.54 val loss 4.11 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:22 PM epoch 46 train loss 3.46 train acc 0.54 val loss 4.10 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:23 PM epoch 47 train loss 3.41 train acc 0.55 val loss 4.07 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:24 PM epoch 48 train loss 3.38 train acc 0.55 val loss 4.05 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:25 PM epoch 49 train loss 3.34 train acc 0.55 val loss 4.04 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:25 PM epoch 50 train loss 3.29 train acc 0.56 val loss 4.01 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:26 PM epoch 51 train loss 3.25 train acc 0.56 val loss 3.98 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:27 PM epoch 52 train loss 3.21 train acc 0.57 val loss 3.96 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:28 PM epoch 53 train loss 3.16 train acc 0.57 val loss 3.94 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:28 PM epoch 54 train loss 3.12 train acc 0.58 val loss 3.92 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:29 PM epoch 55 train loss 3.07 train acc 0.57 val loss 3.89 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:30 PM epoch 56 train loss 3.03 train acc 0.58 val loss 3.87 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:31 PM epoch 57 train loss 2.98 train acc 0.58 val loss 3.84 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:31 PM epoch 58 train loss 2.94 train acc 0.58 val loss 3.81 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:32 PM epoch 59 train loss 2.89 train acc 0.59 val loss 3.80 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:33 PM epoch 60 train loss 2.85 train acc 0.59 val loss 3.77 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:34 PM epoch 61 train loss 2.80 train acc 0.58 val loss 3.75 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:34 PM epoch 62 train loss 2.75 train acc 0.59 val loss 3.73 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:35 PM epoch 63 train loss 2.70 train acc 0.59 val loss 3.72 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:36 PM epoch 64 train loss 2.66 train acc 0.58 val loss 3.69 val acc 0.23 best val acc 0.23 maxdown 0
11/16/2018 02:07:37 PM epoch 65 train loss 2.61 train acc 0.58 val loss 3.67 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:37 PM epoch 66 train loss 2.57 train acc 0.58 val loss 3.65 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:38 PM epoch 67 train loss 2.52 train acc 0.58 val loss 3.63 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:39 PM epoch 68 train loss 2.48 train acc 0.59 val loss 3.61 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:40 PM epoch 69 train loss 2.44 train acc 0.59 val loss 3.60 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:40 PM epoch 70 train loss 2.40 train acc 0.59 val loss 3.58 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:41 PM epoch 71 train loss 2.36 train acc 0.59 val loss 3.56 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:42 PM epoch 72 train loss 2.31 train acc 0.61 val loss 3.56 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:42 PM epoch 73 train loss 2.28 train acc 0.61 val loss 3.53 val acc 0.24 best val acc 0.24 maxdown 0
11/16/2018 02:07:43 PM epoch 74 train loss 2.24 train acc 0.61 val loss 3.51 val acc 0.26 best val acc 0.26 maxdown 0
11/16/2018 02:07:44 PM epoch 75 train loss 2.20 train acc 0.61 val loss 3.50 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:45 PM epoch 76 train loss 2.16 train acc 0.62 val loss 3.47 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:45 PM epoch 77 train loss 2.13 train acc 0.63 val loss 3.48 val acc 0.25 best val acc 0.25 maxdown 1
11/16/2018 02:07:46 PM epoch 78 train loss 2.09 train acc 0.63 val loss 3.45 val acc 0.26 best val acc 0.26 maxdown 0
11/16/2018 02:07:47 PM epoch 79 train loss 2.05 train acc 0.64 val loss 3.46 val acc 0.26 best val acc 0.26 maxdown 1
11/16/2018 02:07:48 PM epoch 80 train loss 2.01 train acc 0.64 val loss 3.45 val acc 0.26 best val acc 0.26 maxdown 0
11/16/2018 02:07:48 PM epoch 81 train loss 1.98 train acc 0.64 val loss 3.43 val acc 0.25 best val acc 0.25 maxdown 0
11/16/2018 02:07:49 PM epoch 82 train loss 1.94 train acc 0.65 val loss 3.44 val acc 0.26 best val acc 0.25 maxdown 1
11/16/2018 02:07:50 PM epoch 83 train loss 1.91 train acc 0.66 val loss 3.42 val acc 0.26 best val acc 0.26 maxdown 0
11/16/2018 02:07:51 PM epoch 84 train loss 1.87 train acc 0.66 val loss 3.42 val acc 0.25 best val acc 0.26 maxdown 1
11/16/2018 02:07:51 PM epoch 85 train loss 1.84 train acc 0.66 val loss 3.43 val acc 0.25 best val acc 0.26 maxdown 2
11/16/2018 02:07:52 PM epoch 86 train loss 1.81 train acc 0.67 val loss 3.42 val acc 0.26 best val acc 0.26 maxdown 3
11/16/2018 02:07:53 PM epoch 87 train loss 1.77 train acc 0.67 val loss 3.42 val acc 0.25 best val acc 0.26 maxdown 4
11/16/2018 02:07:54 PM epoch 88 train loss 1.74 train acc 0.67 val loss 3.43 val acc 0.26 best val acc 0.26 maxdown 5
11/16/2018 02:07:54 PM epoch 89 train loss 1.71 train acc 0.68 val loss 3.43 val acc 0.25 best val acc 0.26 maxdown 6
11/16/2018 02:07:55 PM epoch 90 train loss 1.68 train acc 0.68 val loss 3.43 val acc 0.25 best val acc 0.26 maxdown 7
11/16/2018 02:07:56 PM epoch 91 train loss 1.65 train acc 0.69 val loss 3.45 val acc 0.25 best val acc 0.26 maxdown 8
11/16/2018 02:07:57 PM epoch 92 train loss 1.62 train acc 0.69 val loss 3.45 val acc 0.25 best val acc 0.26 maxdown 9
11/16/2018 02:07:57 PM epoch 93 train loss 1.59 train acc 0.70 val loss 3.45 val acc 0.24 best val acc 0.26 maxdown 10
11/16/2018 02:07:58 PM epoch 94 train loss 1.56 train acc 0.71 val loss 3.45 val acc 0.26 best val acc 0.26 maxdown 11
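The `maxdown` column behaves as a patience counter: it resets to 0 on a new best validation accuracy, increments otherwise, and training stops once it passes a threshold. A minimal sketch of that early-stopping loop, with illustrative names (the actual patience value used in this run is not stated in the log, though it stopped after `maxdown` reached 11):

```python
def train_with_early_stopping(epochs, evaluate, patience=10):
    """evaluate(epoch) -> validation accuracy for that epoch.
    Stops once accuracy has failed to improve for more than
    `patience` consecutive epochs (the log's 'maxdown' counter)."""
    best_acc, maxdown = -1.0, 0
    for epoch in range(epochs):
        val_acc = evaluate(epoch)
        if val_acc > best_acc:
            best_acc, maxdown = val_acc, 0  # new best: reset the counter
        else:
            maxdown += 1
            if maxdown > patience:
                break  # validation results went down: early stopping
    return best_acc
```

In practice the model parameters from the best-validation epoch are also checkpointed and restored before reporting test results.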
11/16/2018 02:07:58 PM validation results went down. early stopping ...
11/16/2018 02:07:58 PM dev results:
11/16/2018 02:07:58 PM Mean: 520 Median: 46 Acc@161: 60
11/16/2018 02:07:58 PM test results:
11/16/2018 02:07:59 PM Mean: 540 Median: 47 Acc@161: 60
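The final numbers are the standard geolocation metrics: mean and median prediction error in kilometres, and Acc@161, the percentage of users placed within 161 km (about 100 miles) of their true location. A self-contained sketch of how these are computed from predicted and gold coordinates (function names are illustrative):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres (mean Earth radius 6371 km)."""
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def geo_metrics(pred, gold):
    """pred, gold: lists of (lat, lon) pairs.
    Returns (mean error km, median error km, Acc@161 as a percentage)."""
    dists = sorted(haversine_km(p[0], p[1], g[0], g[1])
                   for p, g in zip(pred, gold))
    n = len(dists)
    mean = sum(dists) / n
    median = dists[n // 2] if n % 2 else (dists[n // 2 - 1] + dists[n // 2]) / 2
    acc161 = 100.0 * sum(d <= 161 for d in dists) / n
    return mean, median, acc161
```

The large gap between mean (540 km) and median (47 km) on the test set indicates a long tail of badly mislocated users alongside mostly accurate predictions.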