@lzamparo, created May 22, 2013 18:44
SdA pickling test script output. The first set of pre-training statements shows a consistent reduction in reconstruction error. The second set (after unpickling) shows that un-pickling introduces some problem that manifests in the upper layers.
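For context, here is a minimal sketch of the round trip this script exercises, assuming the SdA class and load_data helper from the Theano deep learning tutorial; the checkpoint file name, layer sizes, and batch size below are hypothetical and not taken from test_pretrain_SdA.py.

```python
# Sketch of the pre-train -> pickle -> unpickle -> resume cycle.
# Assumptions: tutorial SdA.py and logistic_sgd.py are on the path, MNIST data
# is available; 'sda_checkpoint.pkl' and the hyperparameters are hypothetical.
import cPickle
import numpy

from SdA import SdA                   # Theano deep learning tutorial class
from logistic_sgd import load_data    # tutorial MNIST loader

datasets = load_data('mnist.pkl.gz')
train_set_x, train_set_y = datasets[0]

numpy_rng = numpy.random.RandomState(89677)
sda = SdA(numpy_rng=numpy_rng, n_ins=28 * 28,
          hidden_layers_sizes=[1000, 1000, 1000], n_outs=10)

pretraining_fns = sda.pretraining_functions(train_set_x=train_set_x,
                                            batch_size=1)
# ... run the first five pre-training epochs per layer, as logged below ...

# Pickling the model...
with open('sda_checkpoint.pkl', 'wb') as f:
    cPickle.dump(sda, f, protocol=cPickle.HIGHEST_PROTOCOL)

# Unpickling the model...
with open('sda_checkpoint.pkl', 'rb') as f:
    sda = cPickle.load(f)

# Resume training: the pre-training functions are rebuilt from the restored
# object before running the remaining five epochs per layer.
pretraining_fns = sda.pretraining_functions(train_set_x=train_set_x,
                                            batch_size=1)
```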
Run on 2013-05-21 18:38:36.893592
Pre-training layer 0, epoch 0, cost 469.74511106
Pre-training layer 0, epoch 1, cost 444.779148771
Pre-training layer 0, epoch 2, cost 439.117536989
Pre-training layer 0, epoch 3, cost 435.99338236
Pre-training layer 0, epoch 4, cost 433.863444521
Pre-training layer 1, epoch 0, cost 388.678039978
Pre-training layer 1, epoch 1, cost 319.358514327
Pre-training layer 1, epoch 2, cost 302.30478766
Pre-training layer 1, epoch 3, cost 293.561204112
Pre-training layer 1, epoch 4, cost 287.946628352
Pre-training layer 2, epoch 0, cost 238.768491788
Pre-training layer 2, epoch 1, cost 193.903376996
Pre-training layer 2, epoch 2, cost 177.507741406
Pre-training layer 2, epoch 3, cost 168.110121471
Pre-training layer 2, epoch 4, cost 161.643049992
Pretraining time for file test_pretrain_SdA.py was 10.00m to go through 5 epochs
Pickling the model...
Unpickling the model...
Resume training...
Pre-training layer 0, epoch 0, cost 432.292523293 <----- This layer picks up fine, and achieves modest improvement.
Pre-training layer 0, epoch 1, cost 431.07932855
Pre-training layer 0, epoch 2, cost 430.103108182
Pre-training layer 0, epoch 3, cost 429.253838325
Pre-training layer 0, epoch 4, cost 428.536139819
Pre-training layer 1, epoch 0, cost -7957.39133052 <----- Something definitely wrong starting here, and in each subsequent higher layer.
Pre-training layer 1, epoch 1, cost -22539.5159749
Pre-training layer 1, epoch 2, cost -37921.1679192
Pre-training layer 1, epoch 3, cost -53617.1158488
Pre-training layer 1, epoch 4, cost -69493.0764062
Pre-training layer 2, epoch 0, cost -1399.65590392
Pre-training layer 2, epoch 1, cost -4090.97426465
Pre-training layer 2, epoch 2, cost -6897.76328025
Pre-training layer 2, epoch 3, cost -9785.7599264
Pre-training layer 2, epoch 4, cost -12720.5077254
Pretraining time for file test_pretrain_SdA.py was 9.90m to go through the remaining 5 epochs