Albert Buchard (albertbuchard)

@albertbuchard
albertbuchard / textrepo.py
Last active May 20, 2024 18:39
textrepo: A Python Script to Concatenate All Files in a Repository into a Single Text File, Ignoring Specified Patterns
#!/usr/bin/env python3
import fnmatch
import os
"""
This Python script, textrepo, concatenates all files within a specified repository into a single text file
while respecting .gitignore patterns and additional specified ignore patterns. It prints the formatted content
to both a specified output file and standard output. This is useful for reviewing all content within a repository
in a structured format, excluding unwanted files and directories such as node_modules, dist, build, and others.
"""
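
The preview above cuts off inside the docstring and the rest of the script is not shown. As a hedged illustration of the core idea (walking a tree, filtering relative paths with fnmatch, and concatenating what survives), a minimal sketch could look like the following. The function name, header format, and default ignore patterns are assumptions for illustration; the actual gist also honors .gitignore and echoes the result to standard output.

import fnmatch
import os

def concat_repo(root, out_path, ignore_patterns=(".git/*", "node_modules/*", "dist/*", "build/*", "*.pyc")):
    # Walk the repository and append every non-ignored file to the output, one header per file.
    with open(out_path, "w", encoding="utf-8") as out:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                rel = os.path.relpath(path, root)
                # Skip anything whose repository-relative path matches an ignore pattern.
                if any(fnmatch.fnmatch(rel, pattern) for pattern in ignore_patterns):
                    continue
                out.write(f"\n===== {rel} =====\n")
                try:
                    with open(path, "r", encoding="utf-8") as f:
                        out.write(f.read())
                except (UnicodeDecodeError, OSError):
                    out.write("[unreadable or binary file skipped]\n")
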
@albertbuchard
albertbuchard / hh_ht_win_rate.py
Last active March 19, 2024 18:25
Parallelizing Simulations to Calculate HH vs HT Win Rates
from concurrent.futures import ProcessPoolExecutor
import numpy as np
from scipy.signal import convolve2d
from tqdm import tqdm
nsims = int(1e6)
nevents = int(1e5)
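
The preview stops at the simulation constants. As a hedged sketch of the same kind of experiment, the code below races the patterns HH and HT in independent streams of fair coin flips and spreads the batches over processes with ProcessPoolExecutor; the batch sizes, helper name, and pair encoding are illustrative assumptions rather than the gist's convolve2d-based approach.

from concurrent.futures import ProcessPoolExecutor

import numpy as np

def race_batch(n_games, max_flips=10_000, seed=None):
    # Count in how many games HH appears before HT, and vice versa.
    rng = np.random.default_rng(seed)
    hh_first = ht_first = 0
    for _ in range(n_games):
        flips = rng.integers(0, 2, size=max_flips)  # 1 = heads, 0 = tails
        pairs = flips[:-1] * 2 + flips[1:]          # consecutive pairs: HH -> 3, HT -> 2
        hh_idx = np.argmax(pairs == 3) if (pairs == 3).any() else max_flips
        ht_idx = np.argmax(pairs == 2) if (pairs == 2).any() else max_flips
        if hh_idx < ht_idx:
            hh_first += 1
        elif ht_idx < hh_idx:
            ht_first += 1
    return hh_first, ht_first

if __name__ == "__main__":
    n_workers, games_per_worker = 8, 10_000
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(race_batch, [games_per_worker] * n_workers, [10_000] * n_workers, range(n_workers))
    hh_first, ht_first = map(sum, zip(*results))
    print(f"HH first: {hh_first} games, HT first: {ht_first} games")
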
@albertbuchard
albertbuchard / .bashrc
Created November 2, 2023 16:49
Useful SLURM bash functions (add to your ~/.bashrc)
tailast() {
  # Get the directory from the argument or use the current directory as default
  local dir="${1:-.}"
  # Find the most recently modified file in the directory without descending into subdirectories
  # (cut -d' ' -f2- keeps filenames that contain spaces intact)
  local latest_file=$(find "$dir" -maxdepth 1 -type f -exec stat --format='%Y %n' {} \; | sort -n | tail -1 | cut -d' ' -f2-)
  # Check if a file was found
  if [[ -z "$latest_file" ]]; then
    echo "No files found in $dir"
    return 1
  fi
  # Follow the newest file, typically the latest SLURM output log
  tail -f "$latest_file"
}
@albertbuchard
albertbuchard / mine_mwe.py
Last active October 22, 2023 11:49
MINE: Mutual Information Neural Estimation | Minimal Working Example
import math
import torch.optim as optim
import torch
from torch import nn
class MineWrapper(nn.Module):
    def __init__(self, stat_model, moving_average_rate=0.1, unbiased=False):
        super(MineWrapper, self).__init__()
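
The preview ends inside the wrapper's constructor. For context, the quantity MINE optimizes, the Donsker-Varadhan lower bound on mutual information, can be sketched as below; the StatNet architecture and function names are illustrative assumptions, not the gist's stat_model or wrapper internals.

import math

import torch
from torch import nn

class StatNet(nn.Module):
    # Tiny statistics network T(x, y); an illustrative stand-in, not the gist's model.
    def __init__(self, dim_x, dim_y, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim_x + dim_y, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1))

def mine_lower_bound(stat_net, x, y):
    # Donsker-Varadhan bound: E_joint[T(x, y)] - log E_marginals[exp(T(x, y'))].
    t_joint = stat_net(x, y).mean()
    y_shuffled = y[torch.randperm(y.size(0))]  # shuffling y approximates sampling from the product of marginals
    t_marginal = torch.logsumexp(stat_net(x, y_shuffled), dim=0) - math.log(y.size(0))
    return t_joint - t_marginal.squeeze()

Maximizing this bound over the statistics network's parameters (e.g. with torch.optim.Adam, consistent with the gist's torch.optim import) produces the MI estimate; the moving_average_rate and unbiased arguments of MineWrapper presumably implement the usual moving-average correction for the biased gradient of the log term.
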
@albertbuchard
albertbuchard / ddp_debug.py
Created September 11, 2023 13:35
Distributed Data Parallel Debugging - PyTorch / PDB - ddp_pdb()
class DistributedIdentity:
    """
    Singleton class to hold distributed identity information.
    Handles SLURM, torchrun, and local runs.
    Looks for the following environment variables:
    - RANK
    - WORLD_SIZE
    - LOCAL_RANK
    """
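
Only the identity helper is visible in the preview, so the ddp_pdb() entry point named in the title has to be guessed at. A common pattern, sketched below under that assumption, is to break into pdb on a single rank while the other ranks wait at a barrier.

import os
import pdb
import sys

import torch.distributed as dist

def ddp_pdb(rank_to_debug=0):
    # Open an interactive pdb session on one rank; hold the others at a barrier (illustrative sketch).
    rank = int(os.environ.get("RANK", os.environ.get("SLURM_PROCID", 0)))
    if rank == rank_to_debug:
        # Re-bind stdin/stdout so the debugger stays usable under launchers that redirect them.
        pdb.Pdb(stdin=sys.__stdin__, stdout=sys.__stdout__).set_trace()
    if dist.is_available() and dist.is_initialized():
        dist.barrier()
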
@albertbuchard
albertbuchard / sparsemax_torch.py
Last active April 18, 2023 18:31
This code defines a PyTorch implementation of the Sparsemax activation function. Sparsemax is an alternative to the softmax activation function that produces sparse probability distributions (a Euclidean projection onto the probability simplex). The implementation is provided as a PyTorch nn.Module, making it easy to integrate into any architecture.
import torch
import torch.nn as nn
class Sparsemax(nn.Module):
    def __init__(self, dim=-1):
        super(Sparsemax, self).__init__()
        self.dim = dim

    def forward(self, x):
        # Move the dimension to apply Sparsemax to the front
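
The preview ends just as the forward pass begins. For reference, a self-contained sketch of the standard sparsemax computation (Martins & Astudillo, 2016) over an arbitrary dimension is shown below; it uses the usual sort / cumulative-sum / threshold recipe directly rather than the gist's dimension-permutation approach.

import torch

def sparsemax(z, dim=-1):
    # Project each slice of z along `dim` onto the probability simplex.
    z_sorted, _ = torch.sort(z, dim=dim, descending=True)
    cumsum = z_sorted.cumsum(dim)
    k = torch.arange(1, z.size(dim) + 1, device=z.device, dtype=z.dtype)
    shape = [1] * z.dim()
    shape[dim] = -1
    k = k.view(shape)                                      # reshape so k broadcasts along `dim`
    support = (1 + k * z_sorted) > cumsum                  # sorted entries that stay nonzero
    k_z = support.sum(dim=dim, keepdim=True).to(z.dtype)   # support size k(z)
    tau = (torch.where(support, z_sorted, torch.zeros_like(z_sorted)).sum(dim, keepdim=True) - 1) / k_z
    return torch.clamp(z - tau, min=0)

For example, the logits [1.0, 2.0, 0.1] map to exactly [0.0, 1.0, 0.0]: the output sums to one and the smaller logits are zeroed out, unlike softmax.
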
@albertbuchard
albertbuchard / timeit_fleuret.py
Last active July 23, 2022 21:39
Comparison of tensor fill methods
setup='''
import numpy as np
import torch
V_nat = [[1, 2], [3, 4]]
U_nat = [[2, -1, 0, 0, 0, 0],
[5, 2, 8, -1, 0, 0]]
def compute_using_fleuret_1():
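
The preview stops inside the shared setup string, before either compute_using_fleuret_* implementation is visible. In the same spirit, the snippet below is an illustrative timeit comparison of two ways to fill a tensor (an element-wise Python loop versus a vectorized constructor); the data and fill rule are assumptions, not the gist's benchmark.

import timeit

setup = '''
import torch
rows, cols = 1000, 6
values = [[float(i + j) for j in range(cols)] for i in range(rows)]  # illustrative data

def fill_with_loop():
    out = torch.empty(rows, cols)
    for i, row in enumerate(values):
        for j, v in enumerate(row):
            out[i, j] = v
    return out

def fill_vectorized():
    return torch.tensor(values)
'''

for fn in ("fill_with_loop", "fill_vectorized"):
    elapsed = timeit.timeit(f"{fn}()", setup=setup, number=100)
    print(f"{fn}: {elapsed:.3f}s for 100 calls")
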
# Your init script
#
# Atom will evaluate this file each time a new window is opened. It is run
# after packages are loaded/activated and after the previous editor state
# has been restored.
#
# An example hack to log to the console when each text editor is saved.
#
# atom.workspace.observeTextEditors (editor) ->
# editor.onDidSave ->