
@toshi-k
Created August 29, 2015 11:37
Random Weight Sharing (Torch 7)
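As I read the code, this module sketches the random weight-sharing idea from [1]: the constructor draws one fixed random permutation of the inputs; each forward pass shuffles the input by that permutation, views it as factor planes of length inputSize / factor, and runs it through a single SpatialConvolutionMM, producing one value per output unit. Note that inputSize must be divisible by factor.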
require 'nn'

local WeightSharing, parent = torch.class('nn.WeightSharing', 'nn.Module')

function WeightSharing:__init(inputSize, outputSize, factor)
   -- initialize (factor must divide inputSize)
   parent.__init(self)
   self.factor = factor
   self.inputSize = inputSize
   self.outputSize = outputSize
   -- fixed random permutation (forward) and its inverse (backward)
   self.sortf = torch.randperm(inputSize):long()
   self.sortb = torch.LongTensor(inputSize)
   for i = 1, inputSize do
      self.sortb[self.sortf[i]] = i -- invert the permutation
   end
   -- network parameter: one convolution applied to the reshaped input
   self.conv = nn.SpatialConvolutionMM(factor, outputSize, 1, inputSize / factor)
end
function WeightSharing:updateOutput(input)
   -- shuffle: index() returns a new tensor rather than working in place,
   -- so keep the result (also saved for the backward pass)
   self.shuffled = input:index(1, self.sortf)
   -- conv: view the shuffled vector as factor planes of length inputSize/factor
   self.shuffled = self.shuffled:view(self.factor, self.inputSize / self.factor, 1)
   self.output = self.conv:updateOutput(self.shuffled):view(self.outputSize)
   return self.output
end
function WeightSharing:updateGradInput(input, gradOutput)
   -- conv (backward), fed with the shuffled input saved in updateOutput
   local gradOutput3d = gradOutput:contiguous():view(self.outputSize, 1, 1)
   local gradShuffled = self.conv:updateGradInput(self.shuffled, gradOutput3d)
   -- shuffle (backward): undo the forward permutation with its inverse
   self.gradInput = gradShuffled:contiguous():view(self.inputSize):index(1, self.sortb)
   return self.gradInput
end
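-- The gist as posted never accumulates weight gradients: nn.Module's default
-- accGradParameters is a no-op, and the conv's parameters are invisible to
-- getParameters(). The two methods below are an assumed completion so the
-- shared weights can actually train; they are not part of the original gist.
function WeightSharing:accGradParameters(input, gradOutput, scale)
   local gradOutput3d = gradOutput:contiguous():view(self.outputSize, 1, 1)
   self.conv:accGradParameters(self.shuffled, gradOutput3d, scale)
end

function WeightSharing:parameters()
   -- expose the convolution's weights to getParameters() and optimizers
   return self.conv:parameters()
end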
function WeightSharing:__tostring__()
   return torch.type(self) .. string.format('(%d -> %d, share: %d)', self.inputSize, self.outputSize, self.factor)
end
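-- A minimal usage sketch (sizes are illustrative, not from the original gist;
-- factor must divide inputSize):
--
--    local x = torch.randn(12)
--    local layer = nn.WeightSharing(12, 5, 3)    -- 12 inputs -> 5 outputs, share factor 3
--    local y = layer:forward(x)                  -- 5 elements
--    local gx = layer:backward(x, torch.randn(5)) -- 12 elements
--    print(layer)                                -- nn.WeightSharing(12 -> 5, share: 3)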
--[[
<<References>>
[1] Compressing Neural Networks with the Hashing Trick
Wenlin Chen, James Wilson, Stephen Tyree, Kilian Weinberger, Yixin Chen
http://jmlr.org/proceedings/papers/v37/chenc15.html
--]]