Sparse tensors for Torch7

All dense types (torch.DoubleTensor, torch.FloatTensor, etc.) should have sparse variants: torch.SparseDoubleTensor, torch.SparseFloatTensor, etc.
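Type conversion should mirror the dense API; a minimal sketch under the proposed (not yet existing) class names:

require 'torch'

-- hypothetical class from this proposal
local s = torch.SparseDoubleTensor(1000, 1000)

-- :float() on a sparse tensor would be expected to return the
-- corresponding sparse type, mirroring the dense behaviour
local sf = s:float()  -- torch.SparseFloatTensor under this proposal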

Copying between dense and sparse tensors should be done with the :copy() function.
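With this convention, :copy() is the single conversion point in both directions; a sketch, again using the proposed class names:

-- sparse -> dense: allocate a dense tensor of the same size and copy;
-- zeros are materialized explicitly
local s = torch.SparseFloatTensor(3, 3)  -- hypothetical class
local d = torch.FloatTensor(3, 3):copy(s)

-- dense -> sparse: only the non-zero entries of d would be stored
s:copy(d)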

The underlying BLAS has to be swappable (MKL/OpenBLAS/ATLAS, etc.). Other math operations have to be implemented with CSPARSE.
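From the user's side the backend should stay invisible: the existing torch.mv/torch.mm entry points would dispatch to a CSPARSE routine when an operand is sparse, instead of the dense BLAS. A hedged sketch:

local A = torch.SparseFloatTensor(100, 50)  -- hypothetical sparse matrix
local x = torch.FloatTensor(50):uniform()
local y = torch.FloatTensor(100)

-- same signature as the dense call; internally this would invoke a
-- CSPARSE kernel (e.g. cs_gaxpy) rather than the BLAS sgemv
torch.mv(y, A, x)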

The CUDA version has to be built on CUSPARSE, e.g. torch.SparseCudaTensor or torch.SparseFloatCudaTensor.
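Moving to the GPU would mirror the dense :cuda() method, with the math backed by CUSPARSE; sketch (class names as proposed above):

require 'cutorch'  -- assumed available for the CUDA types

local s = torch.SparseFloatTensor(1000, 1000)  -- hypothetical class
local sc = s:cuda()  -- expected to return a torch.SparseCudaTensor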

Sparse tensors have to be serializable.
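torch.save and torch.load already handle any object that implements the read/write hooks, so usage would be the standard Torch7 one:

local s = torch.SparseFloatTensor(4, 6)  -- hypothetical class
torch.save('sparse.t7', s)               -- standard Torch7 serialization
local s2 = torch.load('sparse.t7')       -- round-trips indices and values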

New constructors have to be added.
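Plausible constructor signatures, by analogy with the dense ones (all hypothetical; the last form matches the Example section below):

-- empty sparse tensor with the given sizes
local a = torch.SparseFloatTensor(4, 6, 9)

-- from an existing dense tensor, keeping only its non-zeros
local b = torch.SparseFloatTensor(torch.rand(4, 6, 9):float())

-- from a table of {indices..., value} entries
local c = torch.SparseFloatTensor{{1, 2, 3, 1.5}, {2, 2, 2, -0.5}}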

Priority

High

  • Constructors
  • Serialization
  • Type-conversion
  • Full (dense) ↔ sparse conversion
  • Pointwise ops: add, mul, div, cadd, cmul, cdiv, fill (see the sketch after these lists)
  • min, max, dot
  • BLAS levels 1, 2, 3
  • Indexing

Mid

  • LAPACK
  • comparison
  • mean, std

Low

  • scatter, gather, norm, dist, renorm
  • Batch BLAS
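A sketch of how the pointwise and reduction ops from the lists above could look; all of the sparse overloads are proposed, not existing, API:

local a = torch.SparseFloatTensor(4, 6)  -- hypothetical class
local b = torch.SparseFloatTensor(4, 6)

a:mul(2)                    -- scales the stored non-zeros in place
local c = torch.cmul(a, b)  -- pointwise product; the result stays sparse
local d = torch.dot(a, b)   -- reduces over the intersection of non-zeros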

Example

require 'torch'

-- non-zero entries of a 3D tensor, given as {i, j, k, value} tuples
local indexes = {
  {3, 2, 4, 5.63},
  {1, 3, 4, 3.43},
  {3, 5, 6, 1.23},
}

-- 4x6x9 sparse tensor (hypothetical constructor, see above)
a = torch.SparseFloatTensor(4, 6, 9)
a:set(indexes)  -- fill from the index/value table (proposed API)

-- dense copy of the same data; zeros are materialized explicitly
b = torch.FloatTensor(a:size()):copy(a)

-- print non-zero elements
print(a)