echo "Installing conda..."
# https://waylonwalker.com/install-miniconda/
mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm -rf ~/miniconda3/miniconda.sh
~/miniconda3/bin/conda init bash
# set up pytorch
echo "Installing Python dependencies..."
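The `# set up pytorch` comment above is not followed by an install command in this preview. One plausible continuation, assuming a CPU-only install via the freshly installed conda (package names, channel, and the CPU-only choice are assumptions, not from the original script):

```shell
# Hypothetical continuation: install PyTorch through the conda just set up.
# Channel (-c pytorch) and the cpuonly package are assumptions; adjust for CUDA.
~/miniconda3/bin/conda install -y pytorch torchvision cpuonly -c pytorch
~/miniconda3/bin/pip install numpy matplotlib
```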
@sradc
sradc / 1.srp.py
Created October 5, 2022 18:30 — forked from dmmeteo/1.srp.py
SOLID Principles explained in Python with examples.
"""
Single Responsibility Principle
“…You had one job” — Loki to Skurge in Thor: Ragnarok
A class should have only one job.
If a class has more than one responsibility, it becomes coupled.
A change to one responsibility results in modification of the other responsibility.
"""
class Animal:
    def __init__(self, name: str):
        self.name = name
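The gist preview cuts off mid-definition. A minimal sketch of how the principle is usually illustrated with this `Animal` class (the split into a separate `AnimalDB` class is an assumption about the truncated continuation, not the gist's exact code):

```python
class Animal:
    """Models an animal; its only job is holding animal data."""
    def __init__(self, name: str):
        self.name = name

    def get_name(self) -> str:
        return self.name


class AnimalDB:
    """Persistence is a separate responsibility, so it lives in its own class."""
    def save(self, animal: Animal) -> str:
        # Stand-in for a real database write.
        return f"saved {animal.get_name()}"


db = AnimalDB()
print(db.save(Animal("lion")))  # saved lion
```

With this split, a change to how animals are stored never forces a change to the `Animal` class itself.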
@sradc
sradc / autodiff.py
Last active December 10, 2023 09:31
Automatic Differentiation in 26 lines of Python
import math

class Var:
    def __init__(self, val: float, local_gradients=()):
        self.val = val
        self.local_gradients = local_gradients
        self.grad = 0

    def backward(self, path_value: float = 1):
        # Accumulate the gradient along this path, then recurse into children.
        self.grad += path_value
        for child_var, local_gradient in self.local_gradients:
            child_var.backward(path_value * local_gradient)
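The preview truncates the gist, so here is a self-contained sketch of how such a `Var` is used end to end. The `add`/`mul` helper names are my own, not necessarily the gist's; each primitive records the local derivative d(output)/d(input) alongside the child node:

```python
class Var:
    def __init__(self, val: float, local_gradients=()):
        self.val = val
        self.local_gradients = local_gradients
        self.grad = 0

    def backward(self, path_value: float = 1):
        # Accumulate the gradient flowing along this path, then recurse.
        self.grad += path_value
        for child_var, local_gradient in self.local_gradients:
            child_var.backward(path_value * local_gradient)

# Hand-written primitives: each records d(output)/d(input) per child.
def add(a: Var, b: Var) -> Var:
    return Var(a.val + b.val, [(a, 1.0), (b, 1.0)])

def mul(a: Var, b: Var) -> Var:
    return Var(a.val * b.val, [(a, b.val), (b, a.val)])

x = Var(2.0)
y = Var(3.0)
z = add(mul(x, x), y)  # z = x*x + y
z.backward()
print(x.grad, y.grad)  # 4.0 1.0  (dz/dx = 2x, dz/dy = 1)
```

Note that `x` appears twice as a child of the `mul` node, so `backward` visits it twice and the two path contributions (2.0 each) sum to the correct 4.0.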
@sradc
sradc / vectorised_autodiff_example.py
Created October 11, 2021 16:43
vectorised autodiff example
# minimal example, using code from: https://sidsite.com/posts/autodiff/
from collections import defaultdict

import matplotlib.pyplot as plt
import numpy as np

class Variable:
    def __init__(self, value, local_gradients=()):  # () instead of a mutable [] default
        self.value = value
        self.local_gradients = local_gradients
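To show the "vectorised" part without the rest of the gist, here is a self-contained sketch in the same style: `Variable` wraps NumPy arrays, local gradients are stored as functions rather than scalars, and `get_gradients` accumulates path products in a `defaultdict`. The operator overloads and lambda-based gradients are assumptions modelled on the linked post, not the gist's exact code:

```python
from collections import defaultdict
import numpy as np

class Variable:
    def __init__(self, value, local_gradients=()):
        self.value = np.asarray(value, dtype=float)
        self.local_gradients = local_gradients

    def __add__(self, other):
        # d(a+b)/da and d(a+b)/db are both the identity (elementwise).
        return Variable(self.value + other.value,
                        ((self, lambda g: g), (other, lambda g: g)))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a (elementwise).
        return Variable(self.value * other.value,
                        ((self, lambda g: g * other.value),
                         (other, lambda g: g * self.value)))

def get_gradients(variable):
    """Walk the graph from the output, accumulating each path's gradient."""
    gradients = defaultdict(lambda: 0.0)
    def compute(var, path_value):
        for child, multiply_by_local_grad in var.local_gradients:
            value = multiply_by_local_grad(path_value)
            gradients[child] = gradients[child] + value
            compute(child, value)
    compute(variable, np.ones_like(variable.value))
    return gradients

a = Variable([1.0, 2.0])
b = Variable([3.0, 4.0])
y = a * b + a            # dy/da = b + 1, dy/db = a
grads = get_gradients(y)
print(grads[a])  # [4. 5.]
print(grads[b])  # [1. 2.]
```

Storing local gradients as functions (rather than precomputed arrays) is what makes the same machinery extend naturally to matrix operations, where the "multiply by local gradient" step is no longer elementwise.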
import math
import matplotlib.pyplot as plt
import numpy as np

np.random.seed(0)
N = 13
signal = np.random.random(N)  # placeholder; the gist's signal definition isn't shown in this preview

# Taking only the real part of the Fourier transform, then applying the inverse Fourier transform:
reconstructed = np.fft.ifft(np.fft.fft(signal).real).real
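One way to see what that last line computes: the real part of the DFT is the transform of the signal's circularly even component, so the reconstruction equals the average of the signal and its circular reversal, (x[n] + x[(-n) mod N]) / 2. A quick check (the `signal` definition here is a stand-in, since the gist's is not shown):

```python
import numpy as np

np.random.seed(0)
N = 13
signal = np.random.random(N)  # placeholder signal

reconstructed = np.fft.ifft(np.fft.fft(signal).real).real

# np.roll(signal[::-1], 1) gives x[(-n) mod N], the circular reversal of x.
circular_reversal = np.roll(signal[::-1], 1)
even_part = (signal + circular_reversal) / 2
print(np.allclose(reconstructed, even_part))  # True
```

So discarding the imaginary part of the spectrum is lossy unless the signal is already circularly even: only the symmetric component survives the round trip.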