Modar M. Alfadly ModarTensai

ModarTensai /
Last active Jan 15, 2020
Layer-wise Adaptive Rate Control (LARC) in PyTorch. It is LARS with clipping support in addition to scaling.
class LARC:
    """Layer-wise Adaptive Rate Control.

    LARC is LARS that supports clipping along with scaling.

    This implementation is inspired by:

    See also:
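The preview stops at the docstring. As a rough, hedged sketch of the technique the description names (LARS's layer-wise trust ratio, with LARC's clipping on top), where the function name, defaults, and return convention are illustrative assumptions rather than the gist's API:

```python
import torch


def larc_scale(param, grad, lr, trust_coefficient=1e-3, clip=True, eps=1e-8):
    """Gradient scale factor for one layer under LARS/LARC (illustrative).

    LARS: local rate = trust_coefficient * ||w|| / ||g||.
    LARC: additionally clip the local rate so it never exceeds `lr`.
    """
    local_lr = trust_coefficient * param.norm() / (grad.norm() + eps)
    if clip:
        # the LARC part: cap the effective rate at the global learning rate
        local_lr = torch.clamp(local_lr, max=lr)
    # multiply the gradient by this factor before the usual `lr * grad` step
    return local_lr / lr
```

With clipping enabled, layers whose trust ratio exceeds the global rate simply fall back to plain SGD scaling.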
import torch
from torch import nn
from torch.nn import functional as F


class Expression:

    def __init__(self, out=None, **units):
        self.out = out
        self.terms = {}
        self.coeffs = {}
ModarTensai /
Created Dec 4, 2019
Coarse-to-fine grid search
from argparse import Namespace

import torch


def grid_search(objective, *bounds, density=10, eps=1e-5, max_steps=None):
    """Perform coarse-to-fine grid search for the minimum objective.

    >>> def f(x, y):
    ...     x = x + 0.5
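Since the gist's body is truncated, here is a minimal sketch of what a coarse-to-fine grid search can look like: evaluate the objective on a uniform grid, re-center a tighter grid on the best point, and repeat until the grid spacing drops below `eps`. The name `coarse_to_fine` and all details below are illustrative, not the gist's code:

```python
import torch


def coarse_to_fine(objective, *bounds, density=10, eps=1e-5, max_steps=50):
    """Minimize `objective` by repeatedly refining a uniform grid.

    Each item of `bounds` is a (low, high) pair for one argument.
    Illustrative sketch only; not the gist's implementation.
    """
    lows = [float(lo) for lo, hi in bounds]
    highs = [float(hi) for lo, hi in bounds]
    best = [0.5 * (lo + hi) for lo, hi in zip(lows, highs)]
    for _ in range(max_steps):
        axes = [torch.linspace(lo, hi, density) for lo, hi in zip(lows, highs)]
        grids = torch.meshgrid(*axes, indexing='ij')
        values = objective(*grids)
        flat = int(values.flatten().argmin())
        # unravel the flat argmin into one index per dimension
        coords = []
        for size in reversed(values.shape):
            coords.append(flat % size)
            flat //= size
        coords.reverse()
        best = [float(axis[c]) for axis, c in zip(axes, coords)]
        steps = [(hi - lo) / (density - 1) for lo, hi in zip(lows, highs)]
        if max(steps) < eps:
            break
        # zoom: new grid spans one old cell on each side of the best point
        lows = [b - s for b, s in zip(best, steps)]
        highs = [b + s for b, s in zip(best, steps)]
    return best
```

Each refinement shrinks the search window by a factor of `(density - 1) / 2`, so convergence to `eps` takes only a handful of passes.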
def cov(m, rowvar=True, inplace=False, unbiased=True):
    """Estimate the covariance matrix for the given data.

    m: Variables and observations data tensor (accepts batches).
    rowvar: Whether rows are variables and columns are observations.
    inplace: Whether to subtract the variable means in place or not.
    unbiased: Whether to use the unbiased estimation or not.
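The docstring above fully specifies the parameters, so a hedged re-implementation is straightforward (named `cov_sketch` to make clear it is an illustration, not the gist's truncated body):

```python
import torch


def cov_sketch(m, rowvar=True, inplace=False, unbiased=True):
    """Covariance estimate following the documented parameters (illustrative)."""
    if not rowvar:
        m = m.transpose(-1, -2)  # make rows the variables
    n = m.size(-1)  # number of observations per variable
    mean = m.mean(-1, keepdim=True)
    m = m.sub_(mean) if inplace else m - mean
    factor = 1.0 / (n - 1 if unbiased else n)
    return factor * m @ m.transpose(-1, -2)
```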
import torch


def unravel_index(index, shape):
    """Convert a flat index into a tuple of per-dimension indices."""
    out = []
    for dim in reversed(shape):
        out.append(index % dim)
        index = index // dim
    return tuple(reversed(out))
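A quick usage example, repeating the helper so the snippet runs on its own: recover the 2-D position of a flat `argmax`.

```python
import torch


def unravel_index(index, shape):
    # same helper as above, repeated so this snippet is self-contained
    out = []
    for dim in reversed(shape):
        out.append(index % dim)
        index = index // dim
    return tuple(reversed(out))


# locate the 2-D position of the largest entry from its flat index
x = torch.tensor([[3., 9., 1.], [4., 0., 2.]])
row, col = unravel_index(x.argmax().item(), x.shape)  # -> (0, 1)
```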
from math import log

import torch
from torch import nn


class L0Sparse(nn.Module):

    def __init__(self, layer, init_sparsity=0.5, heat=2 / 3, stretch=0.1):
        assert all(0 < x < 1 for x in [init_sparsity, heat, stretch])
        super().__init__()  # required before registering `layer` as a submodule
        self.layer = layer
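The preview ends at the constructor. The `heat` and `stretch` arguments suggest the hard-concrete gates used for L0 regularization; a hedged sketch of such a gate (illustrative of the technique, not the gist's code):

```python
import torch


def hard_concrete_gate(log_alpha, heat=2 / 3, stretch=0.1):
    """Sample a differentiable gate in [0, 1] (hard-concrete distribution).

    Parameter names mirror the gist's __init__; the body is an assumption.
    """
    gamma, zeta = -stretch, 1.0 + stretch  # stretch (0, 1) to (gamma, zeta)
    u = torch.rand_like(log_alpha).clamp(1e-6, 1 - 1e-6)
    # reparameterized binary-concrete sample, then stretch and clamp
    s = torch.sigmoid((u.log() - (1 - u).log() + log_alpha) / heat)
    return (s * (zeta - gamma) + gamma).clamp(0.0, 1.0)
```

Stretching past [0, 1] before clamping lets the gate hit exactly 0 or 1 with nonzero probability, which is what produces true sparsity.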
View gaussian_relu_moments.ipynb
ModarTensai /
Last active Jan 21, 2020
Compute the square root of a positive definite matrix with differentiable operations in PyTorch (supports batching).
"""Matrix square root."""
import torch


def sqrtm(matrix):
    """Compute the square root of a positive definite matrix."""
    # s, v = matrix.symeig(eigenvectors=True)
    _, s, v = matrix.svd()
    # keep only numerically significant components
    good = s > s.max(-1, True).values * s.size(-1) * torch.finfo(s.dtype).eps
    components = good.sum(-1)
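The preview cuts off after the rank test. One plausible way to finish from the quantities shown (a sketch, not the gist's exact code): rebuild the matrix from the significant components with square-rooted singular values, which works because a symmetric PSD matrix has matching left and right singular vectors.

```python
import torch


def sqrtm_sketch(matrix):
    """Square root of a symmetric positive (semi-)definite matrix (sketch)."""
    _, s, v = matrix.svd()
    # zero out numerically negligible components, as in the preview above
    good = s > s.max(-1, True).values * s.size(-1) * torch.finfo(s.dtype).eps
    s = torch.where(good, s, torch.zeros_like(s))
    # for symmetric PSD input, U == V, so V diag(sqrt(s)) V^T squares to the input
    return v @ torch.diag_embed(s.sqrt()) @ v.transpose(-2, -1)
```

`diag_embed` and batched `svd` keep this differentiable and batch-friendly, matching the description above.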
ModarTensai /
Last active Sep 19, 2019
Principal Component Analysis (PCA) with PyTorch and NumPy
from collections import namedtuple

import numpy as np


class PCA:

    def __init__(self, features, variance=None):
        if variance is None:
pca =
features = pca.projection_matrix
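With the class body truncated, a minimal functional sketch of PCA in PyTorch may help; `pca_sketch` and its return values are illustrative assumptions, not the gist's API:

```python
import torch


def pca_sketch(data, k):
    """Project `data` (n_samples x n_features) onto its top-k principal axes.

    Minimal illustration; names and return values are assumptions.
    """
    centered = data - data.mean(0, keepdim=True)
    # right singular vectors of the centered data are the principal axes
    _, s, vh = torch.linalg.svd(centered, full_matrices=False)
    components = vh[:k]  # (k, n_features)
    explained_variance = s[:k] ** 2 / (data.size(0) - 1)
    return centered @ components.T, components, explained_variance
```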
ModarTensai /
Created Sep 18, 2019
Create namedtuple types for function outputs only once with nice type name.
from collections import namedtuple
from types import FunctionType, MethodType


# using neither typing.NamedTuple nor dataclasses.dataclass
def named_tuple(function, field_names, *args, name=None, **kwargs):
    """Memoize namedtuple types for function outputs."""
    if isinstance(function, MethodType):
        function = function.__func__
    assert isinstance(function, FunctionType)
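The preview shows the memoization guardrails but not the rest. A hedged sketch of the same idea as a decorator (the API shape here is an assumption, not the gist's signature): build each namedtuple type once, with a readable name derived from the function.

```python
from collections import namedtuple
from functools import wraps

_TYPE_CACHE = {}  # memoize namedtuple types per (function, fields)


def named_output(*field_names):
    """Return a decorator wrapping a function's outputs in a namedtuple.

    Sketch of the gist's idea; the decorator form is an assumption.
    """
    def decorator(function):
        key = (function.__qualname__, field_names)
        if key not in _TYPE_CACHE:
            # derive a nice type name, e.g. minmax -> MinmaxOutput
            _TYPE_CACHE[key] = namedtuple(
                function.__name__.title() + 'Output', field_names)
        tuple_type = _TYPE_CACHE[key]

        @wraps(function)
        def wrapper(*args, **kwargs):
            return tuple_type(*function(*args, **kwargs))
        return wrapper
    return decorator


@named_output('lo', 'hi')
def minmax(values):
    """Example: return the extremes as a MinmaxOutput namedtuple."""
    return min(values), max(values)
```

Caching by qualified name and fields means repeated calls (or re-decoration of the same function) reuse one type instead of allocating a new class per call.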