Modar M. Alfadly (@ModarTensai)

ModarTensai / polynomial.py (Created Sep 10, 2019)
Compute polynomials efficiently with NumPy and PyTorch (differentiable).
import torch

def polynomial(coefficients, x):
    """Evaluate polynomials using Horner's method.

    Args:
        coefficients: Tensor of size (N, *K), ordered from highest
            to lowest order. K is any size broadcastable to `x.size()`.
        x: Points at which to evaluate the polynomial.
    """
    result = coefficients[0] * torch.ones_like(x)
    for c in coefficients[1:]:
        result = result * x + c
    return result
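Horner's scheme evaluates a_n*x^n + ... + a_0 as (((a_n)x + a_{n-1})x + ...)x + a_0, using only N-1 multiply-adds. A minimal self-contained sketch of the idea (the helper name `horner` is mine, not the gist's):

```python
import torch

def horner(coefficients, x):
    # Evaluate sum(c[i] * x**(N-1-i)) with N-1 multiplies and adds.
    result = coefficients[0] * torch.ones_like(x)
    for c in coefficients[1:]:
        result = result * x + c
    return result

coeffs = torch.tensor([3.0, 2.0, 1.0])  # 3x^2 + 2x + 1
x = torch.tensor([0.0, 1.0, 2.0])
print(horner(coeffs, x))  # tensor([ 1.,  6., 17.])
```

Because every step is a tensor multiply-add, the result stays differentiable with respect to both the coefficients and `x`.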
ModarTensai / denormalize_conv2d.py (Created May 22, 2019)
Change the weights of a conv2d in PyTorch to incorporate the normalization mean and std, allowing the input range to be [0, 1].
import torch
from torch.nn import functional as F

def denormalize_conv2d(weight, bias, mean, std):
    """Fold input normalization (mean/std) into the conv weights and bias."""
    weight, bias = weight.data, bias.data
    std = torch.as_tensor(std).view(1, -1, 1, 1)
    mean = torch.as_tensor(mean).view(1, -1, 1, 1)
    w = weight / std
    b = bias - (w * mean).flatten(1).sum(1)
    return w, b
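A quick check of the identity: convolving a normalized input with the original weights should match convolving the raw [0, 1] input with the folded weights. A standalone sketch (`denormalize_conv2d` is reproduced here so the example runs on its own; it assumes no padding):

```python
import torch
import torch.nn.functional as F

def denormalize_conv2d(weight, bias, mean, std):
    # Fold (x - mean) / std into the weights and bias.
    std = torch.as_tensor(std).view(1, -1, 1, 1)
    mean = torch.as_tensor(mean).view(1, -1, 1, 1)
    w = weight / std
    b = bias - (w * mean).flatten(1).sum(1)
    return w, b

x = torch.rand(1, 3, 8, 8)  # raw input in [0, 1]
mean, std = [0.5, 0.5, 0.5], [0.2, 0.2, 0.2]
weight, bias = torch.randn(4, 3, 3, 3), torch.randn(4)

x_norm = (x - torch.tensor(mean).view(1, -1, 1, 1)) \
         / torch.tensor(std).view(1, -1, 1, 1)
y1 = F.conv2d(x_norm, weight, bias)          # normalize, then convolve
w2, b2 = denormalize_conv2d(weight, bias, mean, std)
y2 = F.conv2d(x, w2, b2)                     # convolve the raw input
assert torch.allclose(y1, y2, atol=1e-5)
```

This works because division by `std` distributes over the convolution per input channel, and the constant shift by `mean` collapses into a per-output-channel bias term.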
ModarTensai / running_metrics.py
"""A collection of running metrics."""

__all__ = ['MeanMetric', 'MeanVarianceMetric']


class MeanMetric:
    """Computes and stores the average of a sequence of numbers."""

    def __init__(self):
        """Initialize the meter."""
        self.count = 0
        self.total = 0.0
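For the variance half, Welford's online algorithm is the standard numerically stable choice. This sketch of a `MeanVarianceMetric` is an assumption about what the gist implements, not a copy of it:

```python
class MeanVarianceMetric:
    """Running mean and variance via Welford's online algorithm (sketch)."""

    def __init__(self):
        self.count, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, value):
        self.count += 1
        delta = value - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (value - self.mean)  # uses the *updated* mean

    @property
    def variance(self):
        return self.m2 / (self.count - 1)  # sample (unbiased) variance

m = MeanVarianceMetric()
for v in [2.0, 4.0, 6.0]:
    m.update(v)
print(m.mean, m.variance)  # 4.0 4.0
```

Unlike the naive sum-of-squares formula, this never subtracts two large nearly equal numbers, so it stays accurate over long sequences.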
ModarTensai / python_cheat_sheet.py (Last active Feb 4, 2019)
A cheat sheet for Python I made a while ago.
# Python Cheat Sheet
# Anything preceded by a '#' is a comment and won't be executed, like this line here.
# Variables are names defined by the programmer that hold values.
WonderfulVariableByMe = 5
# There are some reserved words, like for, if, def, while, break, ..., that cannot be used to name a variable.
# A variable name must not contain spaces and must never start with a number.
# Variable names are case-sensitive: modar does not equal Modar.
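In the same cheat-sheet style, the case-sensitivity rule can be demonstrated directly:

```python
# Variable names are case-sensitive, so these are two different variables:
modar = 1
Modar = 2
print(modar == Modar)  # False

# Reserved words cannot be used as names: `for = 3` would be a SyntaxError,
# as would a name starting with a number, like `2cool = 4`.
```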
ModarTensai / sfbvc (Created Jan 24, 2019)
A shell script for single file branch version control (sfbvc).
# sfbvc stands for single file branch version control.
#
# By Modar Alfadly <https://modar.me> on 24 Jan 2019.
#
# sfbvc is a convention that creates an orphan branch for each file in a
# repository, with the master branch being the merge of all the other branches.
# This is a niche convention that tries to simulate per-file history as
# offered by cloud-storage applications like Google Drive and Dropbox.
# We use git under the hood and assume it is installed and on PATH.
#
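The convention above boils down to a short sequence of git commands per file: create an orphan branch, unstage everything, then stage and commit only that file. This helper only builds the command list (the path-to-branch-name mapping is my assumption, not the gist's):

```python
def sfbvc_commands(path):
    """Git commands to track `path` on its own orphan branch (sketch)."""
    branch = path.replace('/', '__')  # assumed branch-naming scheme
    return [
        ['git', 'checkout', '--orphan', branch],
        ['git', 'rm', '-r', '--cached', '.'],            # unstage everything
        ['git', 'add', path],                            # stage only this file
        ['git', 'commit', '-m', f'sfbvc: track {path}'],
    ]

for cmd in sfbvc_commands('notes/todo.txt'):
    print(' '.join(cmd))
```

These could be run with `subprocess.run(cmd, check=True)` inside a repository; `git checkout --orphan` starts a branch with no parent commits, which is what gives each file an independent history.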
ModarTensai / histogram_summary.py (Created Oct 23, 2018)
If you want to dump histograms to TensorBoard while using PyTorch, following [this tutorial](https://nbviewer.jupyter.org/gist/ModarTensai/b081dcf6c87f9134f29abe3a77be1ab5), you can use this.
import torch
import tensorflow as tf  # TF1-style API (tf.placeholder, tf.Session)

def histogram_summary(name, array, step=0):
    self = histogram_summary  # cache TF state on the function itself
    if not hasattr(self, 'session'):
        self.placeholder = tf.placeholder(tf.float32)
        self.session = tf.Session()
        self.writer = tf.summary.FileWriter('./logs')
        self.histograms = {}
    if name not in self.histograms:
        self.histograms[name] = tf.summary.histogram(name, self.placeholder)
    if isinstance(array, torch.Tensor):
        array = array.detach().cpu().numpy()
    summary = self.session.run(self.histograms[name],
                               feed_dict={self.placeholder: array})
    self.writer.add_summary(summary, step)
ModarTensai / rng.py (Last active Oct 23, 2018)
A more flexible context manager than `torch.random.fork_rng()` to preserve the state of the random number generator in PyTorch for the desired devices.
import torch

class RNG:
    """Preserve the state of torch's random number generators.

    Inspired by `torch.random.fork_rng()`.
    """

    def __enter__(self):
        self.state = torch.random.get_rng_state()
        return self

    def __exit__(self, *exc):
        torch.random.set_rng_state(self.state)
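Usage: anything drawn inside the `with` block does not disturb the random stream outside it. A self-contained sketch of the CPU-only case (the gist also handles other devices):

```python
import torch

class RNG:
    """Save and restore torch's CPU RNG state (minimal sketch)."""

    def __enter__(self):
        self.state = torch.random.get_rng_state()
        return self

    def __exit__(self, *exc):
        torch.random.set_rng_state(self.state)

torch.manual_seed(0)
with RNG():
    inside = torch.rand(3)   # advances the RNG inside the block...
outside = torch.rand(3)      # ...but the state was restored on exit
assert torch.equal(inside, outside)
```

Because `__exit__` rewinds the generator to the saved state, the draw after the block replays the exact numbers drawn inside it, proving the outer stream was untouched.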
ModarTensai / num_bins_in_hist.py (Created Oct 22, 2018)
If you want to histogram samples of data, use the Freedman–Diaconis rule, which is the default method for choosing the number of bins in MATLAB. Here is an implementation in PyTorch:
import torch

def _indexer_into(x, dim=0, keepdim=False):
    """Index into x along dim."""
    def indexer(i):
        # e.g., x[:, 2, :] is indexer(2) if dim == 1
        out = x[(slice(None),) * dim + (i, ...)]
        return out.unsqueeze(dim) if keepdim and x.dim() != out.dim() else out
    return indexer
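For the bin count itself, the Freedman–Diaconis rule sets the bin width to 2·IQR/n^(1/3) and divides the data range by it. A standalone sketch using `torch.quantile` (available in recent PyTorch; the gist's indexing helper predates it and computes quantiles manually):

```python
import torch

def freedman_diaconis_bins(x):
    """Number of histogram bins by the Freedman-Diaconis rule (sketch)."""
    x = x.flatten().float()
    q1, q3 = torch.quantile(x, torch.tensor([0.25, 0.75]))
    width = 2 * (q3 - q1) / x.numel() ** (1 / 3)  # 2 * IQR / n^(1/3)
    if width <= 0:
        return 1  # degenerate data (zero IQR): fall back to a single bin
    return max(1, int(torch.ceil((x.max() - x.min()) / width)))

print(freedman_diaconis_bins(torch.arange(1000).float()))
```

The IQR-based width makes the rule robust to outliers, unlike rules based on the full data range or standard deviation.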
ModarTensai / fitting_gaussian.ipynb (Last active Sep 20, 2018)
Fitting a Gaussian to sampled data using PyTorch.
ModarTensai / pytorch_tutorial.ipynb (Last active Mar 14, 2019)
Basic PyTorch classification tutorial with links and references to useful materials to get started.