back2yes

@fschwar4
fschwar4 / sinc_interpolation.py
Last active April 15, 2024 11:22
Fast Python implementation of Whittaker–Shannon / sinc / bandlimited interpolation.
import numpy as np
from numpy.typing import NDArray

def sinc_interpolation(x: NDArray, s: NDArray, u: NDArray) -> NDArray:
    """Whittaker–Shannon or sinc or bandlimited interpolation.
    Args:
        x (NDArray): signal to be interpolated, can be 1D or 2D
        s (NDArray): time points of x (*s* for *samples*)
        u (NDArray): time points of y (*u* for *upsampled*)
    Returns:
        NDArray: interpolated signal evaluated at the time points u
    """
    # The gist preview cuts off here; the line below is the standard
    # Whittaker–Shannon reconstruction, assuming uniformly spaced s.
    return x @ np.sinc((u - s[:, None]) / (s[1] - s[0]))
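A quick usage sketch (hypothetical, not part of the gist): upsample a coarsely sampled sine by 10x and compare against the true signal.
# Hypothetical usage of sinc_interpolation on a bandlimited test signal.
s = np.linspace(0, 1, 20, endpoint=False)    # 20 samples of a 3 Hz tone (well below Nyquist)
u = np.linspace(0, 1, 200, endpoint=False)   # 10x denser query grid
x = np.sin(2 * np.pi * 3 * s)
y = sinc_interpolation(x, s, u)              # close to np.sin(2 * np.pi * 3 * u)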
@paddymccrudden
paddymccrudden / Fleishman.py
Last active February 3, 2022 17:45 — forked from zeimusu/Fleishman.py
Generate data with a given mean, standard deviation, skew, and kurtosis. Intended for Monte Carlo simulations with non-normal distributions.
import numpy as np
from numpy.linalg import solve
import logging
logging.basicConfig(level=logging.DEBUG)
from scipy.stats import moment, norm

def fleishman(b, c, d):
    """Calculate the variance, skew and kurtosis of a Fleishman distribution
    F = -c + bZ + cZ^2 + dZ^3, where Z ~ N(0,1)
    """
    # The gist preview cuts off here; the closed-form moments below are the
    # standard Fleishman power-method formulas (kurtosis is excess kurtosis).
    b2, c2, d2, bd = b * b, c * c, d * d, b * d
    var = b2 + 6 * bd + 2 * c2 + 15 * d2
    skew = 2 * c * (b2 + 24 * bd + 105 * d2 + 2)
    kurt = 24 * (bd + c2 * (1 + b2 + 28 * bd) + d2 * (12 + 48 * bd + 141 * c2 + 225 * d2))
    return var, skew, kurt
"""
Developed by Vladimir Fadeev
(https://github.com/kirlf)
Kazan, 2017 / 2020
Python 3.7
The result is uploaded in
https://commons.wikimedia.org/wiki/File:AdaptiveBeamForming.png
"""
@ihoromi4
ihoromi4 / seed_everything.py
Created February 5, 2020 11:49
PyTorch - set seed everything
def seed_everything(seed: int):
    import random, os
    import numpy as np
    import torch

    random.seed(seed)
    os.environ['PYTHONHASHSEED'] = str(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed(seed)
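A quick usage check (hypothetical, not part of the gist): the same seed reproduces the same draws.
# Hypothetical usage: identical seeds give identical random draws.
import torch

seed_everything(42)
a = torch.rand(3)
seed_everything(42)
b = torch.rand(3)
assert torch.equal(a, b)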
@guillefix
guillefix / lc.py
Last active May 10, 2020 14:00
Temporary workaround to get Conv2dLocal to work in PyTorch
# coding: utf-8
# In[1]:
import math
import torch
from torch.nn.parameter import Parameter
import torch.nn.functional as F
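The preview stops at the imports. Below is a minimal sketch (my own, not guillefix's workaround) of a locally connected 2D layer built on F.unfold, reusing the imports above; the class name LocallyConnected2d and its signature are hypothetical.
# Sketch of a locally connected layer: unlike Conv2d, each output location
# has its own weights (stride 1, no padding assumed).
class LocallyConnected2d(torch.nn.Module):
    def __init__(self, in_channels, out_channels, in_h, in_w, kernel_size):
        super().__init__()
        self.k = kernel_size
        self.out_hw = (in_h - kernel_size + 1, in_w - kernel_size + 1)
        n_locations = self.out_hw[0] * self.out_hw[1]
        self.weight = Parameter(
            0.01 * torch.randn(n_locations, out_channels, in_channels * kernel_size ** 2))

    def forward(self, x):
        patches = F.unfold(x, self.k)                          # (N, C*k*k, L), L = out_h*out_w
        out = torch.einsum('loc,ncl->nol', self.weight, patches)
        return out.reshape(x.shape[0], -1, *self.out_hw)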
@goldsborough
goldsborough / conv.cu
Last active November 27, 2023 05:59
Convolution with cuDNN
#include <cudnn.h>
#include <cassert>
#include <cstdlib>
#include <iostream>
#include <opencv2/opencv.hpp>

// Preview cuts off mid-macro; completed here in the usual way (print the
// cuDNN error string and abort on failure).
#define checkCUDNN(expression)                               \
  {                                                          \
    cudnnStatus_t status = (expression);                     \
    if (status != CUDNN_STATUS_SUCCESS) {                    \
      std::cerr << "Error on line " << __LINE__ << ": "      \
                << cudnnGetErrorString(status) << std::endl; \
      std::exit(EXIT_FAILURE);                               \
    }                                                        \
  }
# Resample.py
# Andrew Brock, 2017
# This code resamples a 3D grid using Catmull-Rom spline interpolation, and is GPU accelerated.
# Resample along the trailing dimension
# Assumes a more-than-1D array? Or just directly assumes a 3D array? We'll find out
#
# TODO: Some things could be shared (such as the mgrid call, which can presumably be done once? hmm)
# between resample1d calls.
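For reference, a minimal CPU sketch in NumPy (not Brock's GPU code) of Catmull-Rom resampling along the trailing dimension; the function name and the clamped edge handling are my assumptions.
# Hypothetical NumPy sketch: Catmull-Rom resampling of the last axis.
import numpy as np

def catmull_rom_resample(x, factor):
    """Resample the last axis of x by `factor` using Catmull-Rom splines (edges clamped)."""
    n = x.shape[-1]
    t_new = np.linspace(0, n - 1, int(n * factor))
    i1 = np.clip(np.floor(t_new).astype(int), 0, n - 2)
    t = t_new - i1                                   # fractional offset in [0, 1]
    i0, i2, i3 = np.clip(i1 - 1, 0, n - 1), i1 + 1, np.clip(i1 + 2, 0, n - 1)
    p0, p1, p2, p3 = x[..., i0], x[..., i1], x[..., i2], x[..., i3]
    return 0.5 * (2 * p1 + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t**2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t**3)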
@rtqichen
rtqichen / pytorch_weight_norm.py
Last active May 11, 2023 06:58
PyTorch weight normalization - works for all nn.Module (probably)
## Weight norm is now added to pytorch as a pre-hook, so use that instead :)

import torch
import torch.nn as nn
from torch.nn import Parameter
from functools import wraps

class WeightNorm(nn.Module):
    append_g = '_g'
    append_v = '_v'
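As the gist's own header comment notes, recent PyTorch ships weight norm as a hook; a minimal usage sketch of that built-in (newer releases also offer torch.nn.utils.parametrizations.weight_norm):
# Hypothetical usage of the built-in hook; it appends '_g' and '_v' just like the class above.
import torch
import torch.nn as nn
from torch.nn.utils import weight_norm

layer = weight_norm(nn.Linear(20, 40), name='weight')  # adds weight_g / weight_v
print(layer.weight_g.shape, layer.weight_v.shape)      # torch.Size([40, 1]) torch.Size([40, 20])
print(layer(torch.randn(8, 20)).shape)                 # torch.Size([8, 40])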