
Sung-Lin Yeh (30stomercury)

Hanrui-Wang / customized_backward.py
Created July 17, 2019 01:54
How to write a customized backward function in PyTorch.
import torch

class MyReLU(torch.autograd.Function):
    """
    We can implement our own custom autograd Functions by subclassing
    torch.autograd.Function and implementing the forward and backward passes
    which operate on Tensors.
    """
    @staticmethod
    def forward(ctx, input):
        # Save the input so the backward pass can mask the gradient.
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        # Pass the gradient through only where the input was positive.
        input, = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[input < 0] = 0
        return grad_input
yoyololicon / differentiable_lfilter.py
Last active August 25, 2022 12:59
This lfilter can propagate gradients to the filter coefficients.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchaudio.functional import lfilter as torch_lfilter
from torch.autograd import Function, gradcheck
class lfilter(Function):
@staticmethod
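
The gist is truncated here. As a minimal sketch of the underlying idea only (an illustration, not yoyololicon's implementation), an IIR difference equation written with plain PyTorch ops already propagates gradients to the coefficient vectors a and b, and gradcheck can verify them numerically; the custom Function above presumably wraps torchaudio's lfilter for speed while supplying such gradients in its backward.

import torch
from torch.autograd import gradcheck

def naive_lfilter(waveform, a_coeffs, b_coeffs):
    # Direct-form I difference equation:
    #   a[0] * y[n] = sum_k b[k] * x[n-k] - sum_{k>=1} a[k] * y[n-k]
    # Every op is differentiable, so autograd reaches a_coeffs and b_coeffs.
    T, N = waveform.shape[0], b_coeffs.shape[0]
    y = []
    for n in range(T):
        acc = waveform.new_zeros(())
        for k in range(N):
            if n - k >= 0:
                acc = acc + b_coeffs[k] * waveform[n - k]
        for k in range(1, N):
            if n - k >= 0:
                acc = acc - a_coeffs[k] * y[n - k]
        y.append(acc / a_coeffs[0])
    return torch.stack(y)

# Numerically check gradients w.r.t. the signal and both coefficient vectors
# (gradcheck needs double precision).
x = torch.randn(16, dtype=torch.double, requires_grad=True)
a = torch.tensor([1.0, -0.5, 0.2], dtype=torch.double, requires_grad=True)
b = torch.tensor([0.3, 0.2, 0.1], dtype=torch.double, requires_grad=True)
print(gradcheck(naive_lfilter, (x, a, b)))  # True if analytic grads match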