Sam Gross colesbury

@colesbury
colesbury / vgg11.stdout.txt
Created March 13, 2017 05:39
PyTorch VGG-11 training log
This file has been truncated, but you can view the full file.
=> creating model 'vgg11'
Epoch: [0][0/5005] Time 25.966 (25.966) Data 3.152 (3.152) Loss 6.9137 (6.9137) Prec@1 0.000 (0.000) Prec@5 0.391 (0.391)
Epoch: [0][10/5005] Time 0.314 (2.637) Data 0.000 (0.287) Loss 6.9071 (6.9136) Prec@1 0.000 (0.142) Prec@5 1.172 (0.568)
Epoch: [0][20/5005] Time 0.320 (1.534) Data 0.000 (0.150) Loss 6.9173 (6.9126) Prec@1 0.000 (0.130) Prec@5 0.781 (0.484)
Epoch: [0][30/5005] Time 0.320 (1.143) Data 0.000 (0.102) Loss 6.8996 (6.9119) Prec@1 0.781 (0.126) Prec@5 0.781 (0.466)
Epoch: [0][40/5005] Time 0.320 (0.942) Data 0.000 (0.077) Loss 6.9102 (6.9114) Prec@1 0.000 (0.133) Prec@5 0.391 (0.553)
Epoch: [0][50/5005] Time 0.318 (0.821) Data 0.000 (0.062) Loss 6.9081 (6.9111) Prec@1 0.000 (0.115) Prec@5 0.781 (0.521)
Epoch: [0][60/5005] Time 0.323 (0.739) Data 0.000 (0.052) Loss 6.9033 (6.9108) Prec@1 0.000 (0.109) Prec@5 0.391 (0.519)
Epoch: [0][70/5005] Time 0.321 (0.680) Data 0.000 (0.045) Loss 6.9115 (6.9102) Prec@1 0.000 (0.105) Prec@5 0.391 (0.534)
Epoch: [0][80/5005] Time 0.32
This file has been truncated, but you can view the full file.
=> creating model 'vgg13'
Epoch: [0][0/5005] Time 26.690 (26.690) Data 2.100 (2.100) Loss 6.9093 (6.9093) Prec@1 0.000 (0.000) Prec@5 0.781 (0.781)
Epoch: [0][10/5005] Time 0.460 (2.827) Data 0.000 (0.191) Loss 6.9168 (6.9125) Prec@1 0.000 (0.071) Prec@5 0.391 (0.426)
Epoch: [0][20/5005] Time 0.457 (1.698) Data 0.000 (0.100) Loss 6.9116 (6.9104) Prec@1 0.000 (0.112) Prec@5 0.391 (0.428)
Epoch: [0][30/5005] Time 0.457 (1.298) Data 0.000 (0.068) Loss 6.9090 (6.9097) Prec@1 0.000 (0.076) Prec@5 0.391 (0.428)
Epoch: [0][40/5005] Time 0.453 (1.093) Data 0.000 (0.052) Loss 6.9074 (6.9097) Prec@1 0.000 (0.124) Prec@5 0.391 (0.467)
Epoch: [0][50/5005] Time 0.452 (0.968) Data 0.000 (0.042) Loss 6.9094 (6.9094) Prec@1 0.000 (0.107) Prec@5 0.000 (0.475)
Epoch: [0][60/5005] Time 0.453 (0.884) Data 0.000 (0.035) Loss 6.9074 (6.9094) Prec@1 0.391 (0.109) Prec@5 1.562 (0.455)
Epoch: [0][70/5005] Time 0.460 (0.824) Data 0.000 (0.030) Loss 6.9094 (6.9092) Prec@1 0.000 (0.105) Prec@5 0.391 (0.484)
Epoch: [0][80/5005] Time 0.45
This file has been truncated, but you can view the full file.
=> creating model 'vgg16'
Epoch: [0][0/5005] Time 28.278 (28.278) Data 2.122 (2.122) Loss 6.9099 (6.9099) Prec@1 0.000 (0.000) Prec@5 0.391 (0.391)
Epoch: [0][10/5005] Time 0.540 (3.038) Data 0.000 (0.193) Loss 6.9197 (6.9131) Prec@1 0.391 (0.036) Prec@5 1.172 (0.604)
Epoch: [0][20/5005] Time 0.538 (1.845) Data 0.000 (0.101) Loss 6.9122 (6.9120) Prec@1 0.391 (0.093) Prec@5 0.391 (0.558)
Epoch: [0][30/5005] Time 0.534 (1.422) Data 0.000 (0.069) Loss 6.9107 (6.9109) Prec@1 0.000 (0.076) Prec@5 0.000 (0.479)
Epoch: [0][40/5005] Time 0.538 (1.206) Data 0.000 (0.052) Loss 6.9119 (6.9104) Prec@1 0.000 (0.076) Prec@5 0.781 (0.495)
Epoch: [0][50/5005] Time 0.532 (1.074) Data 0.000 (0.042) Loss 6.9075 (6.9097) Prec@1 0.391 (0.069) Prec@5 0.391 (0.467)
Epoch: [0][60/5005] Time 0.535 (0.985) Data 0.000 (0.035) Loss 6.9083 (6.9096) Prec@1 0.391 (0.077) Prec@5 0.781 (0.455)
Epoch: [0][70/5005] Time 0.531 (0.922) Data 0.000 (0.030) Loss 6.9058 (6.9091) Prec@1 0.000 (0.077) Prec@5 0.391 (0.446)
Epoch: [0][80/5005] Time 0.53
This file has been truncated, but you can view the full file.
=> creating model 'vgg19'
Epoch: [0][0/5005] Time 26.047 (26.047) Data 3.729 (3.729) Loss 6.9086 (6.9086) Prec@1 0.000 (0.000) Prec@5 0.781 (0.781)
Epoch: [0][10/5005] Time 0.618 (2.901) Data 0.000 (0.339) Loss 6.9131 (6.9109) Prec@1 0.000 (0.107) Prec@5 0.781 (0.355)
Epoch: [0][20/5005] Time 0.606 (1.811) Data 0.000 (0.178) Loss 6.9147 (6.9107) Prec@1 0.000 (0.074) Prec@5 0.391 (0.465)
Epoch: [0][30/5005] Time 0.614 (1.425) Data 0.000 (0.121) Loss 6.9071 (6.9101) Prec@1 0.000 (0.088) Prec@5 0.391 (0.466)
Epoch: [0][40/5005] Time 0.614 (1.226) Data 0.001 (0.091) Loss 6.9108 (6.9097) Prec@1 0.000 (0.076) Prec@5 0.391 (0.467)
Epoch: [0][50/5005] Time 0.620 (1.106) Data 0.000 (0.074) Loss 6.9059 (6.9094) Prec@1 0.000 (0.061) Prec@5 0.000 (0.414)
Epoch: [0][60/5005] Time 0.616 (1.025) Data 0.000 (0.062) Loss 6.9076 (6.9091) Prec@1 0.391 (0.077) Prec@5 0.781 (0.474)
Epoch: [0][70/5005] Time 0.621 (0.967) Data 0.000 (0.053) Loss 6.9082 (6.9089) Prec@1 0.000 (0.072) Prec@5 0.781 (0.462)
Epoch: [0][80/5005] Time 0.61
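Each column in these logs is printed as a `current (running average)` pair — Time, Data, Loss, Prec@1, Prec@5 — which matches the AverageMeter pattern used in the PyTorch ImageNet example scripts. A minimal sketch of that pattern (an assumption about the training script, which is not shown in this gist):

```python
class AverageMeter:
    """Tracks the most recent value and the running average."""

    def __init__(self):
        self.val = 0.0   # most recent value
        self.sum = 0.0
        self.count = 0

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / self.count if self.count else 0.0


# Reproduce the first three Time entries of the vgg11 log.
batch_time = AverageMeter()
for t in [25.966, 0.314, 0.320]:
    batch_time.update(t)

# Printed as "Time {val:.3f} ({avg:.3f})" — the paired columns above.
print(f"Time {batch_time.val:.3f} ({batch_time.avg:.3f})")
```

The first batch is slow (25.966 s here) because it includes model setup and data-loader warm-up, which is why the running average starts high and decays toward the steady-state per-batch time.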
import torch
import sys
import subprocess
import tempfile
import tinys3
import shutil
import os

def main():
    ...  # body truncated in the gist preview
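The import list (tempfile, shutil, tinys3) suggests a script that stages files, archives them, and uploads the result to S3. The actual body is truncated, so the following is only a hypothetical sketch of that flow — the function name, bucket, and archive layout are invented, and the tinys3 call is left as a comment:

```python
import os
import shutil
import tempfile

def package(src_dir, name="artifact"):
    # Stage the archive in a temporary directory and zip up src_dir.
    workdir = tempfile.mkdtemp()
    archive = shutil.make_archive(os.path.join(workdir, name), "zip", src_dir)
    # With tinys3, the upload would look roughly like:
    #   conn = tinys3.Connection(ACCESS_KEY, SECRET_KEY)
    #   with open(archive, "rb") as f:
    #       conn.upload(name + ".zip", f, "some-bucket")
    return archive

# Demo on a throwaway directory.
src = tempfile.mkdtemp()
with open(os.path.join(src, "hello.txt"), "w") as f:
    f.write("hi")
archive = package(src)
print(os.path.basename(archive))  # artifact.zip
```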
--- VariableType.cpp 2017-10-21 04:32:34.984221040 -0700
+++ ./torch/csrc/autograd/generated/VariableType.cpp 2017-10-21 04:32:46.720255617 -0700
@@ -173,12 +173,61 @@
return make_variable(std::move(tensor));
}
-static void check_inplace(const VariableImpl& pImpl) {
- if (pImpl.requires_grad && !pImpl.grad_fn) {
+struct VariableFlags {
+ bool requires_grad;
--- ./torch/csrc/autograd/generated/VariableType.cpp 2017-11-02 11:42:00.179695310 -0700
+++ VariableType.cpp 2017-11-02 11:41:41.932641407 -0700
@@ -181,6 +181,13 @@
return make_variable(std::move(tensor));
}
+Variable VariableType::maybe_wrap(Tensor data, const Variable & self, bool inplace) const {
+ if (inplace) {
+ return self;
+ }
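The two hunks above both concern how generated autograd code wraps op results: `check_inplace` rejects in-place writes to a leaf Variable that requires grad (there is no `grad_fn` to rewrite, so the recorded graph would be corrupted), and `maybe_wrap` returns the existing `self` for in-place ops instead of allocating a fresh Variable. A rough Python model of that logic (names are illustrative, not PyTorch's real API):

```python
class Var:
    def __init__(self, data, requires_grad=False, grad_fn=None):
        self.data = data
        self.requires_grad = requires_grad
        self.grad_fn = grad_fn  # None => leaf variable

def check_inplace(v):
    # A leaf that requires grad has no grad_fn to rewrite, so an
    # in-place modification would invalidate the recorded graph.
    if v.requires_grad and v.grad_fn is None:
        raise RuntimeError("a leaf Variable that requires grad "
                           "is being used in an in-place operation")

def maybe_wrap(data, self_var, inplace):
    # In-place ops hand back the existing Variable; out-of-place
    # results get wrapped in a fresh one.
    if inplace:
        return self_var
    return Var(data)

leaf = Var([1.0], requires_grad=True)
assert maybe_wrap([2.0], leaf, inplace=True) is leaf
assert maybe_wrap([2.0], leaf, inplace=False) is not leaf
```

This mirrors the behavior users see in PyTorch today: calling an in-place op like `x.add_(1)` on a leaf tensor with `requires_grad=True` raises a runtime error.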
commit 8f491cadf7b01468acc37ab749cdcac8133b8856
Author: Sam Gross <sgross@fb.com>
Date: Thu Nov 9 08:34:32 2017 -0800
WIP: packed sequence
diff --git a/test/test_nn.py b/test/test_nn.py
index 3e65a25..995af82 100644
--- a/test/test_nn.py
+++ b/test/test_nn.py
@colesbury
colesbury / -
Created November 10, 2017 19:50
diff --git a/tools/autograd/gen_variable_type.py b/tools/autograd/gen_variable_type.py
index 3378d3b..05ac18f 100644
--- a/tools/autograd/gen_variable_type.py
+++ b/tools/autograd/gen_variable_type.py
@@ -401,7 +401,7 @@ def preprocess_nn_functions(declarations):
autograd_functions = []
for declaration in declarations:
name = declaration['name']
- if name == 'batch_norm' or 'conv' in name:
+ if name == 'batch_norm' or 'conv3d' in name:
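The one-line change above narrows a substring filter: `'conv' in name` matches every conv variant (conv1d, conv2d, conv3d, conv_transpose2d, ...), while `'conv3d' in name` matches only the 3-D ones. Python's `in` on strings is plain substring containment, which a small example (with a made-up name list) makes concrete:

```python
names = ["batch_norm", "conv1d", "conv2d", "conv3d", "conv_transpose2d", "relu"]

# Before the patch: batch_norm plus every op containing "conv".
before = [n for n in names if n == "batch_norm" or "conv" in n]
# After the patch: batch_norm plus only the 3-D conv.
after = [n for n in names if n == "batch_norm" or "conv3d" in n]

print(before)  # ['batch_norm', 'conv1d', 'conv2d', 'conv3d', 'conv_transpose2d']
print(after)   # ['batch_norm', 'conv3d']
```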
diff --git a/autograd.cpp b/autograd.cpp
index aa849d6..5c3896b 100644
--- a/autograd.cpp
+++ b/autograd.cpp
@@ -9,7 +9,7 @@ tag::Engine engine;
void backward(Variable loss, bool keep_graph) {
tag::function_list funclst;
funclst.emplace_back(loss.grad_fn(), loss.output_nr());
- detail::engine.execute(funclst, {Var(at::ones_like(loss.data()))}, false);
+ detail::engine.execute(funclst, {Var(at::ones_like(loss.data()), false, true)}, false);
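The final diff keeps the engine seeded with `ones_like(loss.data())` — the standard reverse-mode convention that d(loss)/d(loss) = 1 — while passing extra flags to the `Var` wrapper. A toy scalar reverse-mode sketch of that seeding (a simplified chain, not PyTorch's engine):

```python
class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # sequence of (parent, local_grad)
        self.grad = 0.0

def backward(loss):
    # Seed with one, mirroring ones_like(loss.data()) in the diff.
    loss.grad = 1.0
    stack = [loss]
    while stack:
        node = stack.pop()
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad
            stack.append(parent)

x = Node(3.0)
y = Node(x.value * 2.0, parents=[(x, 2.0)])              # y = 2x = 6
loss = Node(y.value ** 2, parents=[(y, 2.0 * y.value)])  # loss = y^2 = 36
backward(loss)
print(x.grad)  # d(loss)/dx = 2y * 2 = 24.0
```

Seeding with ones is why `loss.backward()` on a scalar needs no explicit gradient argument: the engine supplies the implicit `d(loss)/d(loss) = 1` vector itself.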