Modar M. Alfadly (ModarTensai)

View gaussian_relu_moments.ipynb
ModarTensai /
Created Oct 21, 2019
Compute the square root of a positive definite matrix with differentiable operations in pytorch (supports batching).
"""Matrix square root."""
import torch


def sqrtm(matrix):
    """Compute the square root of a positive definite matrix."""
    # s, v = matrix.symeig(eigenvectors=True)
    _, s, v = matrix.svd()
    # drop numerically negligible components
    above_cutoff = s > s.max() * s.size(-1) * torch.finfo(s.dtype).eps
    s, v = s[..., above_cutoff], v[..., above_cutoff]
    # compose v @ diag(sqrt(s)) @ v^T (assumed completion; the preview cuts off here)
    return (v * s.sqrt().unsqueeze(-2)) @ v.transpose(-2, -1)
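For intuition, the same eigendecomposition idea can be sketched in NumPy (a hedged illustration, not the gist's code, assuming a symmetric positive semi-definite input): if A = V diag(s) Vᵀ, then its square root is V diag(√s) Vᵀ.

```python
import numpy as np

def sqrtm_np(matrix):
    """Square root of a symmetric PSD matrix via eigendecomposition (sketch)."""
    s, v = np.linalg.eigh(matrix)   # eigenvalues ascending, eigenvectors in columns
    s = np.clip(s, 0.0, None)       # guard against tiny negative eigenvalues
    return (v * np.sqrt(s)) @ v.T   # v @ diag(sqrt(s)) @ v.T

a = np.array([[4.0, 1.0], [1.0, 3.0]])
root = sqrtm_np(a)
# root @ root recovers a up to floating-point error
```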
ModarTensai /
Last active Sep 19, 2019
Principal Component Analysis (PCA) with pytorch and numpy
from collections import namedtuple

import numpy as np


class PCA:
    def __init__(self, features, variance=None):
        if variance is None:
            pca = self.fit(features)  # assumed call; the preview truncates this line
            features = pca.projection_matrix
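As a rough sketch of what such a class computes (hypothetical helper, assuming rows are samples), PCA reduces to an eigendecomposition of the sample covariance matrix:

```python
import numpy as np

def pca_fit(features):
    """Return (variance, projection_matrix), sorted by variance (sketch)."""
    centered = features - features.mean(axis=0)        # remove the mean
    cov = centered.T @ centered / (len(features) - 1)  # sample covariance
    variance, projection_matrix = np.linalg.eigh(cov)  # symmetric eigendecomposition
    order = np.argsort(variance)[::-1]                 # largest variance first
    return variance[order], projection_matrix[:, order]

rng = np.random.default_rng(0)
data = rng.standard_normal((100, 3))
variance, projection = pca_fit(data)
projected = (data - data.mean(axis=0)) @ projection    # coordinates in the PCA basis
```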
ModarTensai /
Created Sep 18, 2019
Create namedtuple types for function outputs only once with nice type name.
from collections import namedtuple
from types import MethodType, FunctionType


# using neither typing.NamedTuple nor dataclasses.dataclass
def named_tuple(function, field_names, *args, name=None, **kwargs):
    """Memoize namedtuple types for function outputs."""
    if isinstance(function, MethodType):
        function = function.__func__
    assert isinstance(function, FunctionType)
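A minimal sketch of the memoization idea (hypothetical helper names, not the gist's full implementation): create the nicely named output type once per function and reuse it on later calls.

```python
from collections import namedtuple

_types = {}  # cache: one namedtuple type per function

def output_type(function, field_names):
    """Create a nicely named namedtuple type for a function's output, once."""
    key = function.__qualname__
    if key not in _types:
        # e.g. min_max -> 'MinMax'
        nice_name = ''.join(part.title() for part in function.__name__.split('_'))
        _types[key] = namedtuple(nice_name, field_names)
    return _types[key]

def min_max(values):
    result = output_type(min_max, ('minimum', 'maximum'))
    return result(min(values), max(values))
```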
ModarTensai /
Created Sep 10, 2019
Compute polynomials efficiently with numpy and pytorch (differentiable).
import torch


def polynomial(coefficients, x):
    """Evaluate polynomials using Horner's method.

    The coefficients are from highest to lowest order.

    Args:
        coefficients: Tensor of size (N, *K).
            K is any broadcastable size to `x.size()`.
    """
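The core of Horner's method, sketched in plain Python (an illustration of the technique, not the gist's batched version): rewrite a_n*x^n + ... + a_0 as (...(a_n*x + a_{n-1})*x + ...)*x + a_0, which needs only N multiplies and N adds.

```python
def horner(coefficients, x):
    """Evaluate a polynomial with coefficients from highest to lowest order."""
    result = 0
    for c in coefficients:
        result = result * x + c  # one multiply-add per coefficient
    return result

# x**2 - 3*x + 2 evaluated at x = 5
value = horner([1, -3, 2], 5)  # → 12
```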
ModarTensai /
Created May 22, 2019
Change the weights of a conv2d in pytorch to incorporate the mean and std and allow the input range to be in [0, 1]
import torch
from torch.nn import functional as F
from torchvision.transforms.functional import normalize


def denormalize_conv2d(weight, bias, mean, std):
    weight, bias = weight.data, bias.data  # assumed completion; the preview truncates this line
    std = torch.as_tensor(std).data.view(1, -1, 1, 1)
    mean = torch.as_tensor(mean).data.view(1, -1, 1, 1)
    w = weight / std
    b = bias - (w * mean).flatten(1).sum(1)
    return w, b
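The identity being exploited: if the input is normalized per channel as (x - mean) / std, the normalization can be folded into the layer as w' = w / std and b' = b - Σ(w'·mean). A NumPy sketch on a linear-layer analogue (hypothetical shapes, not the conv2d case):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))             # 4 outputs, 3 input channels
b = rng.standard_normal(4)
mean = np.array([0.5, 0.4, 0.3])
std = np.array([0.2, 0.3, 0.4])
x = rng.standard_normal(3)                  # input in the original (unnormalized) range

w_folded = W / std                          # scale each input channel
b_folded = b - w_folded @ mean              # absorb the mean shift into the bias

y_normalized = W @ ((x - mean) / std) + b   # normalize, then apply the layer
y_folded = w_folded @ x + b_folded          # folded layer applied to the raw input
```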
ModarTensai /
Last active Oct 28, 2019
Seamless running stats for (native python, numpy.ndarray, torch.tensor).
"""Seamless running stats for (native python, numpy.ndarray, torch.tensor)."""
from collections import namedtuple


class MeanMeter:
    """Estimate the mean for a stream of values."""

    def __init__(self):
        """Initialize the meter."""
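The incremental update such a meter typically performs (a sketch of the idea, assuming an `update` method the preview doesn't show): mean ← mean + (value - mean) / count, which avoids storing the stream and works elementwise for arrays and tensors too.

```python
class RunningMean:
    """Minimal running-mean meter (sketch, not the gist's class)."""

    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, value):
        # incremental mean: no need to keep the values seen so far
        self.count += 1
        self.mean = self.mean + (value - self.mean) / self.count
        return self.mean

meter = RunningMean()
for v in [1.0, 2.0, 3.0, 4.0]:
    meter.update(v)
# meter.mean is now 2.5
```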
ModarTensai /
Last active Feb 4, 2019
A cheat sheet for Python I made a while ago.
# Python Cheat Sheet
# Anything preceded by a '#' is a comment and won't be executed, like this line here
# Variables are names defined by the programmer that hold values
WonderfulVariableByMe = 5
# There are some reserved words like: for, if, def, while, break, ... that cannot be used to name a variable
# A variable name must not contain spaces and must never start with a number
# Variable names are case-sensitive: modar does not equal Modar
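For instance, the case-sensitivity rule means differently capitalized names are two independent variables:

```python
modar = 1
Modar = 2
# different capitalization, different variables
print(modar == Modar)  # → False
```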
ModarTensai / sfbvc
Created Jan 24, 2019
A shell script for single file branch version control (sfbvc)
# sfbvc stands for single file branch version control.
# By Modar Alfadly <> on the 24th Jan 2019
# sfbvc is a convention that creates an orphan branch for each file in a repository,
# with the master branch being the merge of all the other branches.
# This is a niche convention that tries to simulate per-file history
# as seen in cloud storage services like Google Drive and Dropbox.
# We are using git under the hood and assume that it is installed and in PATH.
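A minimal sketch of the convention for a single file (hypothetical file name `notes.txt`; the real sfbvc script automates the whole workflow):

```shell
# create a repository and an orphan branch named after the file
git init -q sfbvc-demo && cd sfbvc-demo
git config user.email sfbvc@example.com
git config user.name sfbvc
git checkout -q --orphan notes.txt        # one orphan branch per tracked file
echo "first draft" > notes.txt
git add notes.txt
git commit -qm "notes.txt: first draft"
git branch --show-current                 # prints: notes.txt
```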
ModarTensai /
Created Oct 23, 2018
If you want to dump histograms in tensorboard while using pytorch, following [this tutorial](, you can use this.
import torch
import tensorflow as tf


def histogram_summary(name, array):
    if not hasattr(histogram_summary, 'session'):
        histogram_summary.placeholder = tf.placeholder(tf.float32)
        histogram_summary.session = tf.Session()
        histogram_summary.histograms = {}
    if name not in histogram_summary.histograms:
        # assumed completion; the preview cuts off here
        histogram_summary.histograms[name] = tf.summary.histogram(
            name, histogram_summary.placeholder)
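The snippet's trick is stashing lazily created state as attributes on the function object itself, so no globals or classes are needed. The pattern in isolation, without TensorFlow (a hedged sketch with hypothetical names):

```python
def summarize(name, value):
    """Record values per name; shared state lives on the function object."""
    if not hasattr(summarize, 'store'):
        summarize.store = {}            # created once, on the first call
    summarize.store.setdefault(name, []).append(value)
    return len(summarize.store[name])

summarize('loss', 0.50)
steps = summarize('loss', 0.42)  # → 2
```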