
M. Farrajota (farrajota)

farrajota / parabola.lua
Last active December 3, 2016 20:00
Sub-pixel precision using 1D parabola fitting
-- Fit a 1D parabola through three (consecutive) points and return the
-- location of its extremum. To get sub-pixel precision for a 2D maximum
-- (x,y), fit the parabola over the x and y coordinates separately, each
-- with its two neighboring points.
local function fitParabola(x1,x2,x3,y1,y2,y3)
   local x1_sqr = x1*x1
   local x2_sqr = x2*x2
   local x3_sqr = x3*x3
   local div = (x1_sqr-x1*(x2+x3)+x2*x3)*(x2-x3)
   -- closed-form coefficients of y = a*x^2 + b*x + c through the three points
   local a = (x3*(y2-y1) + x2*(y1-y3) + x1*(y3-y2)) / div
   local b = (x3_sqr*(y1-y2) + x2_sqr*(y3-y1) + x1_sqr*(y2-y3)) / div
   -- the extremum (a maximum when a < 0) lies at the parabola's vertex
   return -b / (2*a)
end
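A minimal usage sketch of the 2D case described above, assuming scores is a 2D torch tensor and (px, py) a hypothetical integer peak location:
require 'torch'

local scores = torch.rand(32, 32)   -- hypothetical 2D score map
local px, py = 17, 9                -- pretend integer peak location (column, row)
-- refine each coordinate with its two neighbours along that axis
local subx = fitParabola(px-1, px, px+1,
                         scores[py][px-1], scores[py][px], scores[py][px+1])
local suby = fitParabola(py-1, py, py+1,
                         scores[py-1][px], scores[py][px], scores[py+1][px])
print(subx, suby)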
farrajota / padarray.lua
Created November 8, 2016 21:25
Pad arrays in torch
-- Return a view of x narrowed along every dimension above `dim`,
-- skipping the leading pad[i] entries and keeping sz[i] entries.
local function dimnarrow(x,sz,pad,dim)
   local xn = x
   for i=1,x:dim() do
      if i > dim then
         xn = xn:narrow(i,pad[i]+1,sz[i])
      end
   end
   return xn
end
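The preview ends at the helper. A sketch of the padding routine built on dimnarrow could look like this, assuming symmetric zero-padding of pad[i] entries on each side of every dimension above dim (the gist's actual padarray may differ):
require 'torch'

local function padarray(x, pad, dim)
   dim = dim or 0
   local sz = x:size()
   local padded_sz = x:size()
   for i = 1, x:dim() do
      if i > dim then
         padded_sz[i] = padded_sz[i] + 2*pad[i]  -- room for both sides
      end
   end
   local out = x.new(padded_sz):zero()
   -- copy x into the centre of the zero-padded tensor
   dimnarrow(out, sz, pad, dim):copy(x)
   return out
end

-- e.g. pad a 3x4x4 tensor by one entry on each side of dims 2 and 3
local y = padarray(torch.rand(3,4,4), {0,1,1}, 1)  -- y is 3x6x6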
farrajota / svhn_convert_json.m
Last active September 29, 2016 21:19
Convert SVHN metadata .mat file format to .json
function svhn_convert_json(path)
% convert .mat to json files
%% Create .json file
if isempty(path)
    error('Must specify a path to the dataset')
end
farrajota / convert_module.lua
Last active April 15, 2017 00:10
Convert a cudnn batchnorm module to the nn backend (cycles through all modules of a network).
local function ConvertBNcudnn2nn(net)
   local function ConvertModule(net)
      -- :replace() visits every module of the network recursively
      return net:replace(function(x)
         if torch.type(x) == 'cudnn.BatchNormalization' then
            return cudnn.convert(x, nn)
         else
            return x
         end
      end)
   end
   return ConvertModule(net)
end
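A usage sketch (the network below is made up and assumes a working cutorch/cudnn install):
require 'nn'
require 'cudnn'

local net = nn.Sequential()
   :add(nn.Linear(16, 32))
   :add(cudnn.BatchNormalization(32))

net = ConvertBNcudnn2nn(net)
print(net)  -- the batchnorm layer is now nn.BatchNormalization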
farrajota / freeze.lua
Created September 6, 2016 14:17
Freeze the parameters of a layer
model.modules[1].parameters = function() return nil end -- layer reports no parameters, so optim/getParameters() skip it
model.modules[1].accGradParameters = function() end -- no-op gradient accumulation to save computation in the backward pass
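A small self-contained sketch (the two-layer model is hypothetical) showing the effect on getParameters():
require 'nn'

local model = nn.Sequential()
   :add(nn.Linear(10, 10))
   :add(nn.Linear(10, 2))

-- freeze layer 1 with the two overrides above
model.modules[1].parameters = function() return nil end
model.modules[1].accGradParameters = function() end

local params, gradParams = model:getParameters()
print(params:nElement())  -- 22: only layer 2's 10*2 weights + 2 biases remain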
farrajota / mpii_mat2json.m
Last active December 27, 2023 13:45
Convert mpii annotations from .mat to .json format
function mpii_convert_json( )
% convert mpii annotations .mat file to .json
%% load annotation file
fprintf('Load annotations... ')
data = load('/media/HDD2/Datasets/Human_Pose/mpii/mpii_human_pose_v1_u12_2/mpii_human_pose_v1_u12_1.mat');
fprintf('Done.\n')
%% open file
fprintf('Open file mpii_human_pose_annotations.json\n')
farrajota / data_transforms.lua
Last active January 8, 2018 03:25
Commonly used data augmentation techniques for torch7.
require 'image'

local M = {}

-- Compose a list of transforms into a single function that applies
-- them in order to an input image.
function M.Compose(transforms)
   return function(input)
      for _, transform in ipairs(transforms) do
         input = transform(input)
      end
      return input
   end
end
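A usage sketch with two hypothetical transforms built from stock image functions:
local augment = M.Compose{
   function(img) return image.hflip(img) end,           -- horizontal flip
   function(img) return image.scale(img, 224, 224) end, -- resize to 224x224
}
local img = torch.rand(3, 256, 256)   -- fake 3-channel image
local out = augment(img)              -- 3x224x224 after both transforms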
farrajota / binary_tree_torch.lua
Last active May 1, 2017 14:48
Small example of how to create a binary tree in Torch7 using nn containers and nngraph.
--[[
Create a binary tree in two ways: with nn containers and with nn.gModule
(nngraph) containers. The example is deliberately simple: every
fully-connected layer has size 100. It is easy to extend to fc layers with
different input/output sizes (for example, by passing in a table of
input/output sizes for each level of the sub-branches).
]]
require 'nn'
require 'nngraph'
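A minimal sketch of the container-based variant (the recursive helper, depth, and layer size below are assumptions, not the gist's exact code): each node is a Linear + ReLU whose output feeds two child sub-trees through nn.ConcatTable.
local function buildTree(levels, size)
   local node = nn.Sequential()
   node:add(nn.Linear(size, size))
   node:add(nn.ReLU())
   if levels > 1 then
      local children = nn.ConcatTable()  -- both branches see the same input
      children:add(buildTree(levels - 1, size))
      children:add(buildTree(levels - 1, size))
      node:add(children)
   end
   return node
end

local tree = buildTree(3, 100)               -- depth-3 tree of 100-d layers
local out = tree:forward(torch.rand(100))    -- nested table of leaf outputs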
farrajota / multiple_learning_rates.lua
Last active April 10, 2018 16:47
Example code for setting a different learning rate per layer. Note that :parameters() returns the weights and bias of each layer as separate, consecutive tensors, so a network with N parameterized layers yields a table of N*2 tensors, where the (2i-1)-th and 2i-th tensors belong to layer i.
-- multiple learning rates per network. Optimizes two copies of a model network and checks if the optimization steps (2) and (3) produce the same weights/parameters.
require 'torch'
require 'nn'
require 'optim'
torch.setdefaulttensortype('torch.FloatTensor')
-- (1) Define a model for this example.
local model = nn.Sequential()
model:add(nn.Linear(10,20))
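A minimal sketch of the per-layer update itself (a hand-rolled SGD step rather than the gist's optim-based comparison; the rates, input, and target are made up):
local params, gradParams = model:parameters()
local lrs = {}
for i = 1, #params do
   lrs[i] = (i <= 2) and 0.1 or 0.01   -- layer 1 learns faster than the rest
end

local input, target = torch.rand(10), torch.rand(20)
local criterion = nn.MSECriterion()

model:zeroGradParameters()
local output = model:forward(input)
criterion:forward(output, target)
model:backward(input, criterion:backward(output, target))

for i, p in ipairs(params) do
   p:add(-lrs[i], gradParams[i])   -- p <- p - lr_i * grad_i
end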
farrajota / table2file.lua
Last active August 25, 2016 09:15
Save Lua table contents to a file (useful for logging)
function save_configs(filename, opt)
   -- writes the contents of a table to a file
   local function table_print (tt, indent, done)
      done = done or {}
      indent = indent or 0
      if type(tt) == "table" then
         local sb = {}
         for key, value in pairs (tt) do
            table.insert(sb, string.rep (" ", indent)) -- indent it
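Assuming the full gist writes the rendered table to filename, usage would look like:
local opt = {learningRate = 0.1, batchSize = 32, model = 'resnet-50'}  -- hypothetical options
save_configs('train_configs.log', opt)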