@ilkerkesen
Last active October 4, 2016 19:21
Caffe Inconsistent Loss
$ sh leakage.sh
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1004 19:15:11.682617 8069 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
level: 0
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:11.682698 8069 layer_factory.hpp:77] Creating layer
I1004 19:15:11.682723 8069 net.cpp:100] Creating Layer
I1004 19:15:11.682735 8069 net.cpp:408] -> data
I1004 19:15:11.682749 8069 net.cpp:408] -> label
I1004 19:15:11.693647 8069 net.cpp:150] Setting up
I1004 19:15:11.693671 8069 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:11.693677 8069 net.cpp:157] Top shape: 506 (506)
I1004 19:15:11.693688 8069 net.cpp:165] Memory required for data: 28336
I1004 19:15:11.693696 8069 layer_factory.hpp:77] Creating layer ip
I1004 19:15:11.693711 8069 net.cpp:100] Creating Layer ip
I1004 19:15:11.693720 8069 net.cpp:434] ip <- data
I1004 19:15:11.693727 8069 net.cpp:408] ip -> ip
I1004 19:15:11.694136 8069 net.cpp:150] Setting up ip
I1004 19:15:11.694154 8069 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:11.694159 8069 net.cpp:165] Memory required for data: 30360
I1004 19:15:11.694175 8069 layer_factory.hpp:77] Creating layer loss
I1004 19:15:11.694188 8069 net.cpp:100] Creating Layer loss
I1004 19:15:11.694198 8069 net.cpp:434] loss <- ip
I1004 19:15:11.694205 8069 net.cpp:434] loss <- label
I1004 19:15:11.694213 8069 net.cpp:408] loss -> loss
I1004 19:15:11.694254 8069 net.cpp:150] Setting up loss
I1004 19:15:11.694269 8069 net.cpp:157] Top shape: (1)
I1004 19:15:11.694278 8069 net.cpp:160] with loss weight 1
I1004 19:15:11.694288 8069 net.cpp:165] Memory required for data: 30364
I1004 19:15:11.694298 8069 net.cpp:226] loss needs backward computation.
I1004 19:15:11.694303 8069 net.cpp:226] ip needs backward computation.
I1004 19:15:11.694308 8069 net.cpp:228] does not need backward computation.
I1004 19:15:11.694313 8069 net.cpp:270] This network produces output loss
I1004 19:15:11.694324 8069 net.cpp:283] Network initialization done.
I1004 19:15:11.694411 8069 solver.cpp:48] Initializing solver from parameters:
base_lr: 0.1
display: 2000
max_iter: 10000
lr_policy: "fixed"
solver_mode: GPU
random_seed: 1
net: "housing.prototxt"
snapshot_after_train: false
I1004 19:15:11.694440 8069 solver.cpp:91] Creating training net from net file: housing.prototxt
I1004 19:15:11.694561 8069 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:11.694602 8069 layer_factory.hpp:77] Creating layer
I1004 19:15:11.694615 8069 net.cpp:100] Creating Layer
I1004 19:15:11.694622 8069 net.cpp:408] -> data
I1004 19:15:11.694633 8069 net.cpp:408] -> label
I1004 19:15:11.694720 8069 net.cpp:150] Setting up
I1004 19:15:11.694736 8069 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:11.694747 8069 net.cpp:157] Top shape: 506 (506)
I1004 19:15:11.694752 8069 net.cpp:165] Memory required for data: 28336
I1004 19:15:11.694757 8069 layer_factory.hpp:77] Creating layer ip
I1004 19:15:11.694766 8069 net.cpp:100] Creating Layer ip
I1004 19:15:11.694772 8069 net.cpp:434] ip <- data
I1004 19:15:11.694782 8069 net.cpp:408] ip -> ip
I1004 19:15:11.694880 8069 net.cpp:150] Setting up ip
I1004 19:15:11.694895 8069 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:11.694905 8069 net.cpp:165] Memory required for data: 30360
I1004 19:15:11.694921 8069 layer_factory.hpp:77] Creating layer loss
I1004 19:15:11.694931 8069 net.cpp:100] Creating Layer loss
I1004 19:15:11.694936 8069 net.cpp:434] loss <- ip
I1004 19:15:11.694942 8069 net.cpp:434] loss <- label
I1004 19:15:11.694947 8069 net.cpp:408] loss -> loss
I1004 19:15:11.694984 8069 net.cpp:150] Setting up loss
I1004 19:15:11.694998 8069 net.cpp:157] Top shape: (1)
I1004 19:15:11.695003 8069 net.cpp:160] with loss weight 1
I1004 19:15:11.695009 8069 net.cpp:165] Memory required for data: 30364
I1004 19:15:11.695014 8069 net.cpp:226] loss needs backward computation.
I1004 19:15:11.695019 8069 net.cpp:226] ip needs backward computation.
I1004 19:15:11.695024 8069 net.cpp:228] does not need backward computation.
I1004 19:15:11.695029 8069 net.cpp:270] This network produces output loss
I1004 19:15:11.695037 8069 net.cpp:283] Network initialization done.
I1004 19:15:11.695051 8069 solver.cpp:60] Solver scaffolding done.
I1004 19:15:11.696130 8069 solver.cpp:228] Iteration 0, loss = 295.306
I1004 19:15:11.696158 8069 solver.cpp:244] Train net output #0: loss = 295.306 (* 1 = 295.306 loss)
I1004 19:15:11.696167 8069 sgd_solver.cpp:106] Iteration 0, lr = 0.1
I1004 19:15:12.173707 8069 solver.cpp:228] Iteration 2000, loss = 10.9474
I1004 19:15:12.173734 8069 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:12.173741 8069 sgd_solver.cpp:106] Iteration 2000, lr = 0.1
I1004 19:15:12.652598 8069 solver.cpp:228] Iteration 4000, loss = 10.9474
I1004 19:15:12.652654 8069 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:12.652669 8069 sgd_solver.cpp:106] Iteration 4000, lr = 0.1
I1004 19:15:13.131041 8069 solver.cpp:228] Iteration 6000, loss = 10.9474
I1004 19:15:13.131072 8069 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:13.131085 8069 sgd_solver.cpp:106] Iteration 6000, lr = 0.1
I1004 19:15:13.609783 8069 solver.cpp:228] Iteration 8000, loss = 10.9474
I1004 19:15:13.609846 8069 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:13.609861 8069 sgd_solver.cpp:106] Iteration 8000, lr = 0.1
Time: 2.3930
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1004 19:15:15.127089 8080 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
level: 0
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:15.127167 8080 layer_factory.hpp:77] Creating layer
I1004 19:15:15.127192 8080 net.cpp:100] Creating Layer
I1004 19:15:15.127204 8080 net.cpp:408] -> data
I1004 19:15:15.127223 8080 net.cpp:408] -> label
I1004 19:15:15.138221 8080 net.cpp:150] Setting up
I1004 19:15:15.138244 8080 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:15.138257 8080 net.cpp:157] Top shape: 506 (506)
I1004 19:15:15.138267 8080 net.cpp:165] Memory required for data: 28336
I1004 19:15:15.138278 8080 layer_factory.hpp:77] Creating layer ip
I1004 19:15:15.138293 8080 net.cpp:100] Creating Layer ip
I1004 19:15:15.138298 8080 net.cpp:434] ip <- data
I1004 19:15:15.138309 8080 net.cpp:408] ip -> ip
I1004 19:15:15.138716 8080 net.cpp:150] Setting up ip
I1004 19:15:15.138736 8080 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:15.138746 8080 net.cpp:165] Memory required for data: 30360
I1004 19:15:15.138758 8080 layer_factory.hpp:77] Creating layer loss
I1004 19:15:15.138773 8080 net.cpp:100] Creating Layer loss
I1004 19:15:15.138780 8080 net.cpp:434] loss <- ip
I1004 19:15:15.138787 8080 net.cpp:434] loss <- label
I1004 19:15:15.138792 8080 net.cpp:408] loss -> loss
I1004 19:15:15.138833 8080 net.cpp:150] Setting up loss
I1004 19:15:15.138847 8080 net.cpp:157] Top shape: (1)
I1004 19:15:15.138852 8080 net.cpp:160] with loss weight 1
I1004 19:15:15.138862 8080 net.cpp:165] Memory required for data: 30364
I1004 19:15:15.138872 8080 net.cpp:226] loss needs backward computation.
I1004 19:15:15.138882 8080 net.cpp:226] ip needs backward computation.
I1004 19:15:15.138891 8080 net.cpp:228] does not need backward computation.
I1004 19:15:15.138900 8080 net.cpp:270] This network produces output loss
I1004 19:15:15.138906 8080 net.cpp:283] Network initialization done.
I1004 19:15:15.138989 8080 solver.cpp:48] Initializing solver from parameters:
base_lr: 0.1
display: 2000
max_iter: 10000
lr_policy: "fixed"
solver_mode: GPU
random_seed: 1
net: "housing.prototxt"
snapshot_after_train: false
I1004 19:15:15.139013 8080 solver.cpp:91] Creating training net from net file: housing.prototxt
I1004 19:15:15.139118 8080 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:15.139165 8080 layer_factory.hpp:77] Creating layer
I1004 19:15:15.139179 8080 net.cpp:100] Creating Layer
I1004 19:15:15.139189 8080 net.cpp:408] -> data
I1004 19:15:15.139197 8080 net.cpp:408] -> label
I1004 19:15:15.139282 8080 net.cpp:150] Setting up
I1004 19:15:15.139298 8080 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:15.139305 8080 net.cpp:157] Top shape: 506 (506)
I1004 19:15:15.139310 8080 net.cpp:165] Memory required for data: 28336
I1004 19:15:15.139315 8080 layer_factory.hpp:77] Creating layer ip
I1004 19:15:15.139325 8080 net.cpp:100] Creating Layer ip
I1004 19:15:15.139338 8080 net.cpp:434] ip <- data
I1004 19:15:15.139349 8080 net.cpp:408] ip -> ip
I1004 19:15:15.139446 8080 net.cpp:150] Setting up ip
I1004 19:15:15.139462 8080 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:15.139473 8080 net.cpp:165] Memory required for data: 30360
I1004 19:15:15.139489 8080 layer_factory.hpp:77] Creating layer loss
I1004 19:15:15.139502 8080 net.cpp:100] Creating Layer loss
I1004 19:15:15.139508 8080 net.cpp:434] loss <- ip
I1004 19:15:15.139519 8080 net.cpp:434] loss <- label
I1004 19:15:15.139529 8080 net.cpp:408] loss -> loss
I1004 19:15:15.139569 8080 net.cpp:150] Setting up loss
I1004 19:15:15.139582 8080 net.cpp:157] Top shape: (1)
I1004 19:15:15.139590 8080 net.cpp:160] with loss weight 1
I1004 19:15:15.139602 8080 net.cpp:165] Memory required for data: 30364
I1004 19:15:15.139606 8080 net.cpp:226] loss needs backward computation.
I1004 19:15:15.139616 8080 net.cpp:226] ip needs backward computation.
I1004 19:15:15.139624 8080 net.cpp:228] does not need backward computation.
I1004 19:15:15.139628 8080 net.cpp:270] This network produces output loss
I1004 19:15:15.139636 8080 net.cpp:283] Network initialization done.
I1004 19:15:15.139652 8080 solver.cpp:60] Solver scaffolding done.
I1004 19:15:15.140749 8080 solver.cpp:228] Iteration 0, loss = 295.306
I1004 19:15:15.140779 8080 solver.cpp:244] Train net output #0: loss = 295.306 (* 1 = 295.306 loss)
I1004 19:15:15.140790 8080 sgd_solver.cpp:106] Iteration 0, lr = 0.1
I1004 19:15:15.620554 8080 solver.cpp:228] Iteration 2000, loss = nan
I1004 19:15:15.620587 8080 solver.cpp:244] Train net output #0: loss = nan (* 1 = nan loss)
I1004 19:15:15.620595 8080 sgd_solver.cpp:106] Iteration 2000, lr = 0.1
I1004 19:15:16.098973 8080 solver.cpp:228] Iteration 4000, loss = nan
I1004 19:15:16.099009 8080 solver.cpp:244] Train net output #0: loss = nan (* 1 = nan loss)
I1004 19:15:16.099021 8080 sgd_solver.cpp:106] Iteration 4000, lr = 0.1
I1004 19:15:16.578022 8080 solver.cpp:228] Iteration 6000, loss = nan
I1004 19:15:16.578064 8080 solver.cpp:244] Train net output #0: loss = nan (* 1 = nan loss)
I1004 19:15:16.578078 8080 sgd_solver.cpp:106] Iteration 6000, lr = 0.1
I1004 19:15:17.057282 8080 solver.cpp:228] Iteration 8000, loss = nan
I1004 19:15:17.057337 8080 solver.cpp:244] Train net output #0: loss = nan (* 1 = nan loss)
I1004 19:15:17.057346 8080 sgd_solver.cpp:106] Iteration 8000, lr = 0.1
Time: 2.3938
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1004 19:15:18.572724 8091 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
level: 0
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:18.572806 8091 layer_factory.hpp:77] Creating layer
I1004 19:15:18.572845 8091 net.cpp:100] Creating Layer
I1004 19:15:18.572856 8091 net.cpp:408] -> data
I1004 19:15:18.572873 8091 net.cpp:408] -> label
I1004 19:15:18.583819 8091 net.cpp:150] Setting up
I1004 19:15:18.583844 8091 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:18.583853 8091 net.cpp:157] Top shape: 506 (506)
I1004 19:15:18.583858 8091 net.cpp:165] Memory required for data: 28336
I1004 19:15:18.583865 8091 layer_factory.hpp:77] Creating layer ip
I1004 19:15:18.583887 8091 net.cpp:100] Creating Layer ip
I1004 19:15:18.583897 8091 net.cpp:434] ip <- data
I1004 19:15:18.583905 8091 net.cpp:408] ip -> ip
I1004 19:15:18.584298 8091 net.cpp:150] Setting up ip
I1004 19:15:18.584316 8091 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:18.584326 8091 net.cpp:165] Memory required for data: 30360
I1004 19:15:18.584342 8091 layer_factory.hpp:77] Creating layer loss
I1004 19:15:18.584357 8091 net.cpp:100] Creating Layer loss
I1004 19:15:18.584367 8091 net.cpp:434] loss <- ip
I1004 19:15:18.584373 8091 net.cpp:434] loss <- label
I1004 19:15:18.584379 8091 net.cpp:408] loss -> loss
I1004 19:15:18.584424 8091 net.cpp:150] Setting up loss
I1004 19:15:18.584440 8091 net.cpp:157] Top shape: (1)
I1004 19:15:18.584450 8091 net.cpp:160] with loss weight 1
I1004 19:15:18.584465 8091 net.cpp:165] Memory required for data: 30364
I1004 19:15:18.584470 8091 net.cpp:226] loss needs backward computation.
I1004 19:15:18.584475 8091 net.cpp:226] ip needs backward computation.
I1004 19:15:18.584484 8091 net.cpp:228] does not need backward computation.
I1004 19:15:18.584492 8091 net.cpp:270] This network produces output loss
I1004 19:15:18.584503 8091 net.cpp:283] Network initialization done.
I1004 19:15:18.584583 8091 solver.cpp:48] Initializing solver from parameters:
base_lr: 0.1
display: 2000
max_iter: 10000
lr_policy: "fixed"
solver_mode: GPU
random_seed: 1
net: "housing.prototxt"
snapshot_after_train: false
I1004 19:15:18.584614 8091 solver.cpp:91] Creating training net from net file: housing.prototxt
I1004 19:15:18.584738 8091 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:18.584789 8091 layer_factory.hpp:77] Creating layer
I1004 19:15:18.584806 8091 net.cpp:100] Creating Layer
I1004 19:15:18.584816 8091 net.cpp:408] -> data
I1004 19:15:18.584827 8091 net.cpp:408] -> label
I1004 19:15:18.584914 8091 net.cpp:150] Setting up
I1004 19:15:18.584929 8091 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:18.584935 8091 net.cpp:157] Top shape: 506 (506)
I1004 19:15:18.584939 8091 net.cpp:165] Memory required for data: 28336
I1004 19:15:18.584945 8091 layer_factory.hpp:77] Creating layer ip
I1004 19:15:18.584954 8091 net.cpp:100] Creating Layer ip
I1004 19:15:18.584964 8091 net.cpp:434] ip <- data
I1004 19:15:18.584974 8091 net.cpp:408] ip -> ip
I1004 19:15:18.585069 8091 net.cpp:150] Setting up ip
I1004 19:15:18.585085 8091 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:18.585095 8091 net.cpp:165] Memory required for data: 30360
I1004 19:15:18.585106 8091 layer_factory.hpp:77] Creating layer loss
I1004 19:15:18.585119 8091 net.cpp:100] Creating Layer loss
I1004 19:15:18.585124 8091 net.cpp:434] loss <- ip
I1004 19:15:18.585134 8091 net.cpp:434] loss <- label
I1004 19:15:18.585142 8091 net.cpp:408] loss -> loss
I1004 19:15:18.585181 8091 net.cpp:150] Setting up loss
I1004 19:15:18.585196 8091 net.cpp:157] Top shape: (1)
I1004 19:15:18.585201 8091 net.cpp:160] with loss weight 1
I1004 19:15:18.585209 8091 net.cpp:165] Memory required for data: 30364
I1004 19:15:18.585216 8091 net.cpp:226] loss needs backward computation.
I1004 19:15:18.585222 8091 net.cpp:226] ip needs backward computation.
I1004 19:15:18.585227 8091 net.cpp:228] does not need backward computation.
I1004 19:15:18.585232 8091 net.cpp:270] This network produces output loss
I1004 19:15:18.585237 8091 net.cpp:283] Network initialization done.
I1004 19:15:18.585255 8091 solver.cpp:60] Solver scaffolding done.
I1004 19:15:18.586334 8091 solver.cpp:228] Iteration 0, loss = 295.306
I1004 19:15:18.586362 8091 solver.cpp:244] Train net output #0: loss = 295.306 (* 1 = 295.306 loss)
I1004 19:15:18.586371 8091 sgd_solver.cpp:106] Iteration 0, lr = 0.1
I1004 19:15:19.067911 8091 solver.cpp:228] Iteration 2000, loss = 10.9474
I1004 19:15:19.067946 8091 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:19.067955 8091 sgd_solver.cpp:106] Iteration 2000, lr = 0.1
I1004 19:15:19.547925 8091 solver.cpp:228] Iteration 4000, loss = 10.9474
I1004 19:15:19.547953 8091 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:19.547963 8091 sgd_solver.cpp:106] Iteration 4000, lr = 0.1
I1004 19:15:20.028372 8091 solver.cpp:228] Iteration 6000, loss = 10.9474
I1004 19:15:20.028408 8091 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:20.028420 8091 sgd_solver.cpp:106] Iteration 6000, lr = 0.1
I1004 19:15:20.509683 8091 solver.cpp:228] Iteration 8000, loss = 10.9474
I1004 19:15:20.509727 8091 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:20.509749 8091 sgd_solver.cpp:106] Iteration 8000, lr = 0.1
Time: 2.4033
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1004 19:15:22.030012 8102 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
level: 0
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:22.030091 8102 layer_factory.hpp:77] Creating layer
I1004 19:15:22.030114 8102 net.cpp:100] Creating Layer
I1004 19:15:22.030125 8102 net.cpp:408] -> data
I1004 19:15:22.030146 8102 net.cpp:408] -> label
I1004 19:15:22.041023 8102 net.cpp:150] Setting up
I1004 19:15:22.041049 8102 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:22.041062 8102 net.cpp:157] Top shape: 506 (506)
I1004 19:15:22.041071 8102 net.cpp:165] Memory required for data: 28336
I1004 19:15:22.041077 8102 layer_factory.hpp:77] Creating layer ip
I1004 19:15:22.041090 8102 net.cpp:100] Creating Layer ip
I1004 19:15:22.041095 8102 net.cpp:434] ip <- data
I1004 19:15:22.041105 8102 net.cpp:408] ip -> ip
I1004 19:15:22.041497 8102 net.cpp:150] Setting up ip
I1004 19:15:22.041514 8102 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:22.041518 8102 net.cpp:165] Memory required for data: 30360
I1004 19:15:22.041538 8102 layer_factory.hpp:77] Creating layer loss
I1004 19:15:22.041553 8102 net.cpp:100] Creating Layer loss
I1004 19:15:22.041563 8102 net.cpp:434] loss <- ip
I1004 19:15:22.041568 8102 net.cpp:434] loss <- label
I1004 19:15:22.041574 8102 net.cpp:408] loss -> loss
I1004 19:15:22.041616 8102 net.cpp:150] Setting up loss
I1004 19:15:22.041631 8102 net.cpp:157] Top shape: (1)
I1004 19:15:22.041635 8102 net.cpp:160] with loss weight 1
I1004 19:15:22.041645 8102 net.cpp:165] Memory required for data: 30364
I1004 19:15:22.041649 8102 net.cpp:226] loss needs backward computation.
I1004 19:15:22.041656 8102 net.cpp:226] ip needs backward computation.
I1004 19:15:22.041659 8102 net.cpp:228] does not need backward computation.
I1004 19:15:22.041666 8102 net.cpp:270] This network produces output loss
I1004 19:15:22.041674 8102 net.cpp:283] Network initialization done.
I1004 19:15:22.041762 8102 solver.cpp:48] Initializing solver from parameters:
base_lr: 0.1
display: 2000
max_iter: 10000
lr_policy: "fixed"
solver_mode: GPU
random_seed: 1
net: "housing.prototxt"
snapshot_after_train: false
I1004 19:15:22.041792 8102 solver.cpp:91] Creating training net from net file: housing.prototxt
I1004 19:15:22.041919 8102 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:22.041960 8102 layer_factory.hpp:77] Creating layer
I1004 19:15:22.041970 8102 net.cpp:100] Creating Layer
I1004 19:15:22.041975 8102 net.cpp:408] -> data
I1004 19:15:22.041983 8102 net.cpp:408] -> label
I1004 19:15:22.042074 8102 net.cpp:150] Setting up
I1004 19:15:22.042090 8102 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:22.042096 8102 net.cpp:157] Top shape: 506 (506)
I1004 19:15:22.042101 8102 net.cpp:165] Memory required for data: 28336
I1004 19:15:22.042111 8102 layer_factory.hpp:77] Creating layer ip
I1004 19:15:22.042119 8102 net.cpp:100] Creating Layer ip
I1004 19:15:22.042129 8102 net.cpp:434] ip <- data
I1004 19:15:22.042136 8102 net.cpp:408] ip -> ip
I1004 19:15:22.042237 8102 net.cpp:150] Setting up ip
I1004 19:15:22.042251 8102 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:22.042261 8102 net.cpp:165] Memory required for data: 30360
I1004 19:15:22.042276 8102 layer_factory.hpp:77] Creating layer loss
I1004 19:15:22.042289 8102 net.cpp:100] Creating Layer loss
I1004 19:15:22.042294 8102 net.cpp:434] loss <- ip
I1004 19:15:22.042299 8102 net.cpp:434] loss <- label
I1004 19:15:22.042305 8102 net.cpp:408] loss -> loss
I1004 19:15:22.042342 8102 net.cpp:150] Setting up loss
I1004 19:15:22.042352 8102 net.cpp:157] Top shape: (1)
I1004 19:15:22.042356 8102 net.cpp:160] with loss weight 1
I1004 19:15:22.042369 8102 net.cpp:165] Memory required for data: 30364
I1004 19:15:22.042374 8102 net.cpp:226] loss needs backward computation.
I1004 19:15:22.042379 8102 net.cpp:226] ip needs backward computation.
I1004 19:15:22.042388 8102 net.cpp:228] does not need backward computation.
I1004 19:15:22.042397 8102 net.cpp:270] This network produces output loss
I1004 19:15:22.042404 8102 net.cpp:283] Network initialization done.
I1004 19:15:22.042419 8102 solver.cpp:60] Solver scaffolding done.
I1004 19:15:22.043507 8102 solver.cpp:228] Iteration 0, loss = 295.306
I1004 19:15:22.043536 8102 solver.cpp:244] Train net output #0: loss = 295.306 (* 1 = 295.306 loss)
I1004 19:15:22.043550 8102 sgd_solver.cpp:106] Iteration 0, lr = 0.1
I1004 19:15:22.518801 8102 solver.cpp:228] Iteration 2000, loss = 10.9474
I1004 19:15:22.518841 8102 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:22.518853 8102 sgd_solver.cpp:106] Iteration 2000, lr = 0.1
I1004 19:15:22.994117 8102 solver.cpp:228] Iteration 4000, loss = 10.9474
I1004 19:15:22.994145 8102 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:22.994154 8102 sgd_solver.cpp:106] Iteration 4000, lr = 0.1
I1004 19:15:23.468544 8102 solver.cpp:228] Iteration 6000, loss = 10.9474
I1004 19:15:23.468570 8102 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:23.468580 8102 sgd_solver.cpp:106] Iteration 6000, lr = 0.1
I1004 19:15:23.944757 8102 solver.cpp:228] Iteration 8000, loss = 10.9474
I1004 19:15:23.944787 8102 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:23.944800 8102 sgd_solver.cpp:106] Iteration 8000, lr = 0.1
Time: 2.3767
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1004 19:15:25.459405 8113 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
level: 0
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:25.459491 8113 layer_factory.hpp:77] Creating layer
I1004 19:15:25.459522 8113 net.cpp:100] Creating Layer
I1004 19:15:25.459540 8113 net.cpp:408] -> data
I1004 19:15:25.459554 8113 net.cpp:408] -> label
I1004 19:15:25.470367 8113 net.cpp:150] Setting up
I1004 19:15:25.470394 8113 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:25.470403 8113 net.cpp:157] Top shape: 506 (506)
I1004 19:15:25.470407 8113 net.cpp:165] Memory required for data: 28336
I1004 19:15:25.470414 8113 layer_factory.hpp:77] Creating layer ip
I1004 19:15:25.470432 8113 net.cpp:100] Creating Layer ip
I1004 19:15:25.470443 8113 net.cpp:434] ip <- data
I1004 19:15:25.470453 8113 net.cpp:408] ip -> ip
I1004 19:15:25.470850 8113 net.cpp:150] Setting up ip
I1004 19:15:25.470870 8113 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:25.470878 8113 net.cpp:165] Memory required for data: 30360
I1004 19:15:25.470892 8113 layer_factory.hpp:77] Creating layer loss
I1004 19:15:25.470909 8113 net.cpp:100] Creating Layer loss
I1004 19:15:25.470919 8113 net.cpp:434] loss <- ip
I1004 19:15:25.470924 8113 net.cpp:434] loss <- label
I1004 19:15:25.470930 8113 net.cpp:408] loss -> loss
I1004 19:15:25.470974 8113 net.cpp:150] Setting up loss
I1004 19:15:25.470989 8113 net.cpp:157] Top shape: (1)
I1004 19:15:25.470993 8113 net.cpp:160] with loss weight 1
I1004 19:15:25.471005 8113 net.cpp:165] Memory required for data: 30364
I1004 19:15:25.471011 8113 net.cpp:226] loss needs backward computation.
I1004 19:15:25.471016 8113 net.cpp:226] ip needs backward computation.
I1004 19:15:25.471026 8113 net.cpp:228] does not need backward computation.
I1004 19:15:25.471031 8113 net.cpp:270] This network produces output loss
I1004 19:15:25.471040 8113 net.cpp:283] Network initialization done.
I1004 19:15:25.471125 8113 solver.cpp:48] Initializing solver from parameters:
base_lr: 0.1
display: 2000
max_iter: 10000
lr_policy: "fixed"
solver_mode: GPU
random_seed: 1
net: "housing.prototxt"
snapshot_after_train: false
I1004 19:15:25.471148 8113 solver.cpp:91] Creating training net from net file: housing.prototxt
I1004 19:15:25.471248 8113 net.cpp:58] Initializing net from parameters:
name: "Housing"
state {
phase: TRAIN
}
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
I1004 19:15:25.471287 8113 layer_factory.hpp:77] Creating layer
I1004 19:15:25.471302 8113 net.cpp:100] Creating Layer
I1004 19:15:25.471309 8113 net.cpp:408] -> data
I1004 19:15:25.471319 8113 net.cpp:408] -> label
I1004 19:15:25.471407 8113 net.cpp:150] Setting up
I1004 19:15:25.471418 8113 net.cpp:157] Top shape: 506 1 1 13 (6578)
I1004 19:15:25.471423 8113 net.cpp:157] Top shape: 506 (506)
I1004 19:15:25.471434 8113 net.cpp:165] Memory required for data: 28336
I1004 19:15:25.471444 8113 layer_factory.hpp:77] Creating layer ip
I1004 19:15:25.471457 8113 net.cpp:100] Creating Layer ip
I1004 19:15:25.471462 8113 net.cpp:434] ip <- data
I1004 19:15:25.471472 8113 net.cpp:408] ip -> ip
I1004 19:15:25.471565 8113 net.cpp:150] Setting up ip
I1004 19:15:25.471580 8113 net.cpp:157] Top shape: 506 1 (506)
I1004 19:15:25.471590 8113 net.cpp:165] Memory required for data: 30360
I1004 19:15:25.471601 8113 layer_factory.hpp:77] Creating layer loss
I1004 19:15:25.471616 8113 net.cpp:100] Creating Layer loss
I1004 19:15:25.471621 8113 net.cpp:434] loss <- ip
I1004 19:15:25.471627 8113 net.cpp:434] loss <- label
I1004 19:15:25.471633 8113 net.cpp:408] loss -> loss
I1004 19:15:25.471669 8113 net.cpp:150] Setting up loss
I1004 19:15:25.471678 8113 net.cpp:157] Top shape: (1)
I1004 19:15:25.471681 8113 net.cpp:160] with loss weight 1
I1004 19:15:25.471690 8113 net.cpp:165] Memory required for data: 30364
I1004 19:15:25.471695 8113 net.cpp:226] loss needs backward computation.
I1004 19:15:25.471701 8113 net.cpp:226] ip needs backward computation.
I1004 19:15:25.471706 8113 net.cpp:228] does not need backward computation.
I1004 19:15:25.471710 8113 net.cpp:270] This network produces output loss
I1004 19:15:25.471717 8113 net.cpp:283] Network initialization done.
I1004 19:15:25.471731 8113 solver.cpp:60] Solver scaffolding done.
I1004 19:15:25.472784 8113 solver.cpp:228] Iteration 0, loss = 295.306
I1004 19:15:25.472812 8113 solver.cpp:244] Train net output #0: loss = 295.306 (* 1 = 295.306 loss)
I1004 19:15:25.472821 8113 sgd_solver.cpp:106] Iteration 0, lr = 0.1
I1004 19:15:25.951110 8113 solver.cpp:228] Iteration 2000, loss = 10.9474
I1004 19:15:25.951145 8113 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:25.951154 8113 sgd_solver.cpp:106] Iteration 2000, lr = 0.1
I1004 19:15:26.429363 8113 solver.cpp:228] Iteration 4000, loss = 10.9474
I1004 19:15:26.429394 8113 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:26.429404 8113 sgd_solver.cpp:106] Iteration 4000, lr = 0.1
I1004 19:15:26.908542 8113 solver.cpp:228] Iteration 6000, loss = 10.9474
I1004 19:15:26.908572 8113 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:26.908586 8113 sgd_solver.cpp:106] Iteration 6000, lr = 0.1
I1004 19:15:27.386399 8113 solver.cpp:228] Iteration 8000, loss = 10.9474
I1004 19:15:27.386426 8113 solver.cpp:244] Train net output #0: loss = 10.9474 (* 1 = 10.9474 loss)
I1004 19:15:27.386440 8113 sgd_solver.cpp:106] Iteration 8000, lr = 0.1
Time: 2.3920
# housing.prototxt
name: "Housing"
layer {
type: "MemoryData"
top: "data"
top: "label"
memory_data_param {
batch_size: 506
channels: 1
height: 1
width: 13
}
}
layer {
name: "ip"
type: "InnerProduct"
bottom: "data"
top: "ip"
inner_product_param {
num_output: 1
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "ip"
bottom: "label"
top: "loss"
}
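For reference, the net defined above is plain linear regression: the `InnerProduct` layer with `num_output: 1` computes `w·x + b`, and `EuclideanLoss` computes `1/(2N) Σ‖pred − label‖²`. A minimal numpy sketch of one forward pass, with random data standing in for `housing.data` and shapes taken from the prototxt (`batch_size: 506`, `width: 13`):

```python
import numpy as np

rng = np.random.default_rng(1)
N, D = 506, 13  # batch_size and width from the prototxt

x = rng.standard_normal((N, D)).astype(np.float32)  # stand-in features
y = rng.standard_normal((N, 1)).astype(np.float32)  # stand-in labels

w = (0.1 * rng.standard_normal((D, 1))).astype(np.float32)  # gaussian filler, std 0.1
b = np.zeros((1,), dtype=np.float32)                         # constant filler (defaults to 0)

pred = x @ w + b                            # InnerProduct, num_output: 1
loss = np.sum((pred - y) ** 2) / (2 * N)    # EuclideanLoss: 1/(2N) * sum of squared errors
print(pred.shape, float(loss) >= 0.0)
```

With deterministic data, weights, and a fixed solver seed, every run of this computation should produce the same loss, which is what makes the `nan` in the second run above surprising.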
# housing.py
import os
# os.environ['GLOG_minloglevel'] = '2'
import numpy as np
import caffe
import six.moves.cPickle as pickle
import gzip
from timeit import default_timer as timer
import gc

# load the Boston housing dataset: 506 rows, 13 features + 1 target
file_path = "housing.data"
data = np.loadtxt(file_path, dtype=np.float32)

# split into features and target
x = data[:, 0:13]
y = data[:, 13:]

# standardize each feature column (zero mean, unit variance)
x = (x - x.mean(0)) / x.std(0)

# reshape to the NCHW layout expected by the MemoryData layer
xtrn = np.zeros((506, 1, 1, 13), dtype=np.float32)
xtrn[:, 0, 0, :] = x
ytrn = y.reshape((506, 1, 1, 1))

# caffe.set_mode_cpu()
caffe.set_mode_gpu()
caffe.set_device(0)
net = caffe.Net("housing.prototxt", caffe.TRAIN)
solver = caffe.SGDSolver('housing_solver.prototxt')

gc.disable()
t0 = timer()
for i in range(10000):
    solver.net.set_input_arrays(xtrn.astype(np.float32), ytrn.astype(np.float32))
    solver.step(1)
t1 = timer()
gc.enable()
print "Time: %.4f" % (t1 - t0)
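The inconsistency in the five runs above (one run diverges to `nan` despite `random_seed: 1` and identical inputs) can be checked mechanically by parsing the solver output. A minimal sketch, assuming the captured log text is available as a string; runs are delimited by the glog warning that Caffe prints once per process:

```python
import re

def losses_per_run(log_text):
    """Return the displayed loss values for each training run in a combined log."""
    # Each process start emits this warning once, so it delimits runs.
    runs = log_text.split("WARNING: Logging before InitGoogleLogging()")[1:]
    return [re.findall(r"Iteration \d+, loss = (\S+)", run) for run in runs]

# Two abbreviated runs from the log above: the first converges, the second hits nan.
sample = """WARNING: Logging before InitGoogleLogging() is written to STDERR
I1004 19:15:11.696130 8069 solver.cpp:228] Iteration 0, loss = 295.306
I1004 19:15:12.173707 8069 solver.cpp:228] Iteration 2000, loss = 10.9474
WARNING: Logging before InitGoogleLogging() is written to STDERR
I1004 19:15:15.140749 8080 solver.cpp:228] Iteration 0, loss = 295.306
I1004 19:15:15.620554 8080 solver.cpp:228] Iteration 2000, loss = nan
"""
losses = losses_per_run(sample)
print(losses)  # [['295.306', '10.9474'], ['295.306', 'nan']]
```

The pattern only matches `loss = ...` lines, so the `lr = 0.1` lines from `sgd_solver.cpp` are ignored.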
# housing_solver.prototxt
net: "housing.prototxt"
base_lr: 0.1
lr_policy: "fixed"
display: 2000
random_seed: 1
max_iter: 10000
snapshot_after_train: false
solver_mode: GPU
#!/bin/bash
# leakage.sh -- run the experiment five times
for i in `seq 1 5`; do
    python housing.py
done