@thomasbrandon
thomasbrandon / MatchLayers.ipynb
Last active November 14, 2019 21:03
Match layers between fastai and torchvision models
thomasbrandon / Normalisation.ipynb
Last active October 16, 2019 15:31
Test of pre-trained normalisation approaches
thomasbrandon / cifar-10-squeezenet-mishcuda.ipynb
Created October 3, 2019 06:48
MishCuda test on cifar-10 squeezenet
thomasbrandon / MNIST_Stats.ipynb
Created September 27, 2019 14:30
Notebook for the MNIST stats update in fastai
thomasbrandon / error_callback.py
Created September 26, 2019 17:41
FastAI callback to find non-finite gradients and losses
from fastai.basics import *

class ErrorCallback(LearnerCallback):
    def __init__(self, lrn:Learner):
        super().__init__(lrn)
        self.err_loss,self.err_input,self.err_output = None,None,None

    def on_train_begin(self, **kwargs):
        def hook(mod, inps, outs):
            nfs = []
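The preview cuts off here. The same idea, detecting non-finite values via module hooks, can be sketched in plain PyTorch without fastai; the names (`check_finite`, `attach_error_hooks`) are hypothetical and not the gist's actual code:

```python
# Hypothetical sketch (not the gist's code): flag modules whose forward
# outputs contain NaN or +/-inf, using standard PyTorch forward hooks.
import torch
import torch.nn as nn

def check_finite(name, tensor, problems):
    """Record `name` if `tensor` contains any NaN or +/-inf values."""
    if not torch.isfinite(tensor).all():
        problems.append(name)

def attach_error_hooks(model, problems):
    """Register a forward hook on every submodule that flags non-finite outputs."""
    handles = []
    for name, mod in model.named_modules():
        def hook(m, inp, out, name=name):
            if torch.is_tensor(out):
                check_finite(name, out, problems)
        handles.append(mod.register_forward_hook(hook))
    return handles

problems = []
model = nn.Sequential(nn.Linear(4, 4))
handles = attach_error_hooks(model, problems)
x = torch.full((2, 4), float('inf'))  # inf input propagates to a non-finite output
model(x)
for h in handles:
    h.remove()
```

After the forward pass, `problems` names the modules that produced non-finite tensors, which is the kind of diagnostic the gist's description promises for losses and gradients.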
thomasbrandon / MishTrain.ipynb
Created September 26, 2019 09:53
Semi-successful training of Mish
thomasbrandon / running_stats.py
Created September 26, 2019 06:37
Collect running statistics (mean/std) efficiently in PyTorch
import torch
from torch import Tensor
from typing import Iterable
from fastprogress import progress_bar

class RunningStatistics:
    '''Records mean and variance over the final `n_dims` dimensions, keeping the others: collecting across `(l,m,n,o)` sized
    items with `n_dims=1` yields `(l,m,n)` sized statistics, while with `n_dims=2` the statistics will be of size `(l,m)`.
    Uses the algorithm from Chan, Golub, and LeVeque in "Algorithms for computing the sample variance: analysis and recommendations":
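The preview stops mid-docstring. The cited Chan, Golub & LeVeque update can be sketched independently; this is a minimal illustration of the pairwise merge formula, not the gist's own class (the helper names `combine_stats` and `chunk_stats` are assumptions):

```python
# Hedged sketch of the Chan-Golub-LeVeque pairwise update: merge the
# count, mean and sum of squared deviations (M2) of two chunks so that
# statistics can be accumulated in one pass over the data.
import torch

def combine_stats(n_a, mean_a, m2_a, n_b, mean_b, m2_b):
    """Merge the statistics of two disjoint chunks into one."""
    n = n_a + n_b
    delta = mean_b - mean_a
    mean = mean_a + delta * n_b / n
    m2 = m2_a + m2_b + delta.pow(2) * n_a * n_b / n
    return n, mean, m2

def chunk_stats(x):
    """Count, mean and M2 of one chunk, reduced over the first dimension."""
    mean = x.mean(dim=0)
    m2 = (x - mean).pow(2).sum(dim=0)
    return x.shape[0], mean, m2

torch.manual_seed(0)
data = torch.randn(100, 3)
n, mean, m2 = chunk_stats(data[:40])
n, mean, m2 = combine_stats(n, mean, m2, *chunk_stats(data[40:]))
std = (m2 / (n - 1)).sqrt()
# mean/std now match a single pass over all 100 rows
```

Because the merge only needs each chunk's `(n, mean, M2)`, the full dataset never has to be held in memory, which is what makes the running-statistics collection efficient.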
thomasbrandon / MishPerformance.ipynb
Created September 23, 2019 10:40
Profiling and experiments on Mish Performance
thomasbrandon / PyTorch-Resample.ipynb
Last active June 29, 2021 21:31
PyTorch implementation of scipy.signal.resample and scipy.signal.resample_poly
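The notebook itself does not render here, but the core of `scipy.signal.resample`, truncating or zero-padding the Fourier spectrum, can be sketched with `torch.fft`. This is a simplified illustration (the function name `fourier_resample` is an assumption, and the downsampling edge handling is less careful than scipy's):

```python
# Hedged sketch of FFT-based resampling in the spirit of
# scipy.signal.resample: resize the spectrum, then invert.
import torch

def fourier_resample(x, num):
    """Resample a real 1-D signal to `num` samples via its rFFT spectrum."""
    n = x.shape[-1]
    X = torch.fft.rfft(x)
    out_bins = num // 2 + 1
    Y = torch.zeros(out_bins, dtype=X.dtype)
    k = min(out_bins, X.shape[-1])
    Y[:k] = X[:k]  # keep the bins the new length can represent, zero-pad the rest
    if n % 2 == 0 and num > n:
        # The original Nyquist bin becomes an interior bin of the longer
        # spectrum and would otherwise be counted twice, so halve it.
        Y[n // 2] *= 0.5
    return torch.fft.irfft(Y, n=num) * (num / n)

t16 = torch.arange(16) / 16
x = torch.sin(2 * torch.pi * 3 * t16)   # band-limited test signal
y = fourier_resample(x, 32)             # upsample 16 -> 32 samples
```

For a signal band-limited below both Nyquist rates, this reproduces the ideally interpolated samples, which is the behaviour `scipy.signal.resample` is built around.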