import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size, padding=1):
        super(ConvLSTMCell, self).__init__()
        self.k = kernel_size
        self.in_channels = in_channels
        self.out_channels = out_channels
        self.padding = padding
        # Input-to-hidden weights for the four LSTM gates, stacked along dim 0
        self.w_i = nn.Parameter(torch.Tensor(4 * out_channels, in_channels, kernel_size, kernel_size))
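The gist is truncated after the input-to-hidden weights. A minimal runnable sketch of how such a cell is typically completed, assuming the standard (i, f, g, o) gate ordering and a single convolution over the concatenated input and hidden state (both assumptions, not taken from the original snippet):

```python
import torch
import torch.nn as nn

class ConvLSTMCellSketch(nn.Module):
    """Hedged sketch of a full ConvLSTM cell; gate ordering is assumed."""
    def __init__(self, in_channels, out_channels, kernel_size, padding=1):
        super().__init__()
        # One convolution produces all four gates from [x, h] concatenated
        self.conv = nn.Conv2d(in_channels + out_channels, 4 * out_channels,
                              kernel_size, padding=padding)

    def forward(self, x, state):
        h, c = state
        gates = self.conv(torch.cat([x, h], dim=1))
        i, f, g, o = torch.chunk(gates, 4, dim=1)
        # Standard LSTM update, applied per spatial location
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

# One step on a small feature map
cell = ConvLSTMCellSketch(in_channels=3, out_channels=8, kernel_size=3)
x = torch.randn(2, 3, 16, 16)
h = torch.zeros(2, 8, 16, 16)
c = torch.zeros(2, 8, 16, 16)
h, c = cell(x, (h, c))
print(tuple(h.shape))  # (2, 8, 16, 16)
```

With `kernel_size=3` and `padding=1` the spatial size is preserved, so the hidden state can be fed straight into the next time step.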
local sys = require 'sys'
local image = require 'image'
local paths = require 'paths'

local M = {}

local function loadAndConvertImages(imageList, baseDir, numExamples)
    local data = torch.Tensor(numExamples, 3, 256, 256)
    local labels = torch.Tensor(numExamples)

Train a model

caffe_root/build/tools/caffe train -solver [path_to_solver_file]

Trained models are saved in the location specified in the solver file. Training also generates a file of (iterations, accuracy) pairs.

python [network_definition] [model_prefix] [gpu]

network_definition - the .prototxt file referenced in the solver file; it contains the network that was trained.
model_prefix - the prefix with which models were saved throughout the training phase.

For both network_definition and model_prefix, the root folders are specified inside the script.
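Since training writes a file of (iterations, accuracy) pairs, choosing which saved model to evaluate can be automated. A small sketch, assuming a whitespace-separated two-column layout for that file (the exact format is not specified above):

```python
import os
import tempfile

def best_checkpoint(log_path):
    """Return (iteration, accuracy) of the best entry in the training log.
    The two-column whitespace-separated format is an assumption."""
    best_iter, best_acc = None, float("-inf")
    with open(log_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 2:
                continue  # skip malformed lines
            it, acc = int(parts[0]), float(parts[1])
            if acc > best_acc:
                best_iter, best_acc = it, acc
    return best_iter, best_acc

# Demo on a synthetic log file
with tempfile.NamedTemporaryFile("w", delete=False) as f:
    f.write("1000 0.52\n2000 0.71\n3000 0.68\n")
    path = f.name
print(best_checkpoint(path))  # (2000, 0.71)
os.remove(path)
```

The winning iteration number then identifies which model_prefix snapshot to pass to the test script.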

Chen, Tianqi, Ian Goodfellow, and Jonathon Shlens. "Net2Net: Accelerating Learning via Knowledge Transfer." (2016).

  • Knowledge-transfer scheme from one model to another; knowledge is transferred by copying weights.

  • The aim is to reduce training time and give the 'student' model a good starting point for its learning process: the student begins learning from where the teacher left off. [Critical in real life when several models are stored, especially when 'searching' for the best model.]

  • Significant for 'life-long' learning, where model data changes over time (more data for existing categories, or the addition of new categories).
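The weight-copying idea can be illustrated with the paper's Net2WiderNet operation: a hidden layer is widened by duplicating randomly chosen units and splitting their outgoing weights among the copies, so the student starts out computing exactly the teacher's function. A NumPy sketch (shapes and variable names are my assumptions, not from the paper's code):

```python
import numpy as np

def net2wider(w1, b1, w2, new_width, rng):
    """Widen a hidden layer from w1.shape[0] to new_width units,
    function-preservingly (Net2WiderNet-style weight copying)."""
    old_width = w1.shape[0]
    # Keep every original unit, then duplicate random ones
    mapping = np.concatenate([np.arange(old_width),
                              rng.integers(0, old_width, new_width - old_width)])
    counts = np.bincount(mapping, minlength=old_width)
    u1 = w1[mapping]                       # copy incoming weights
    c1 = b1[mapping]                       # ...and biases
    u2 = w2[:, mapping] / counts[mapping]  # split outgoing weights among copies
    return u1, c1, u2

rng = np.random.default_rng(0)
w1 = rng.standard_normal((4, 5)); b1 = rng.standard_normal(4)
w2 = rng.standard_normal((3, 4))
u1, c1, u2 = net2wider(w1, b1, w2, new_width=6, rng=rng)

x = rng.standard_normal(5)
h_old = np.maximum(w1 @ x + b1, 0)  # teacher's ReLU hidden layer
h_new = np.maximum(u1 @ x + c1, 0)  # student's widened hidden layer
# Student and teacher produce identical outputs before any training
print(np.allclose(w2 @ h_old, u2 @ h_new))  # True
```

Because each duplicated unit's outgoing weight is divided by its replication count, the widened network's output matches the teacher's exactly, which is what lets the student "begin learning from where the teacher left off."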

function up {
    # Jump back to the nearest ancestor directory whose name contains $1
    cd `expr "$PWD" : "^\(.*$1[^/]*\)"`
}