dan mackinlay (danmackinlay)
@danmackinlay
danmackinlay / asyncAx.py
Last active April 10, 2024 06:40
Ax + SLURM via `submitit` and `asyncio`
#!/usr/bin/env python
"""
Asynchronous hyperparam search using [Ax](https://ax.dev/) and the submitit executor to run on SLURM
Refs
* [Ax Service API tutorial](https://ax.dev/tutorials/gpei_hartmann_service.html)
* [submitit/docs/examples.md](https://github.com/facebookincubator/submitit/blob/07f21fa1234e34151874c00d80c345e215af4967/docs/examples.md?plain=1#L121)
"""
@danmackinlay
danmackinlay / zot_bib.py
Last active August 15, 2021 23:51
Script to decorate .Rmd files with bibliography files from my local Zotero installation
#! /usr/bin/env python
"""
Export Zotero bibliographies for my blog using betterbibtex export support
http://retorque.re/zotero-better-bibtex/exporting/pull
Usage:
From the root of a blogdown blog, run

Git pre-commit hook for large files

This hook warns you before you accidentally commit large files to git. It's very hard to reverse such an accidental commit, so it's better to prevent it in advance.

Since you will likely want this check in all your git repos, a second script is attached that installs the hook into every repo you create or clone in the future.

Of course, you can just download it directly to the hooks in an existing git repo.
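The attached hook is a shell script; the same check can be sketched in Python (the function names are mine, and the 5 MiB threshold is an arbitrary choice):

```python
#!/usr/bin/env python
import os
import subprocess

LIMIT = 5 * 1024 * 1024  # warn above 5 MiB (arbitrary threshold)

def oversized(paths, limit=LIMIT):
    """Return the subset of `paths` that exist and exceed `limit` bytes."""
    return [p for p in paths if os.path.isfile(p) and os.path.getsize(p) > limit]

def staged_paths():
    """List files staged (added or modified) for the current commit."""
    try:
        out = subprocess.run(
            ["git", "diff", "--cached", "--name-only", "--diff-filter=AM"],
            capture_output=True, text=True, check=True,
        )
    except (OSError, subprocess.CalledProcessError):
        return []  # not in a git repo; nothing to check
    return out.stdout.splitlines()

# When installed as .git/hooks/pre-commit, finish with:
#   import sys; sys.exit(1 if oversized(staged_paths()) else 0)
# so that any oversized staged file blocks the commit.
```

Git runs any executable at `.git/hooks/pre-commit`, so a Python hook works as long as the file is executable and a nonzero exit status aborts the commit.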

@danmackinlay
danmackinlay / .Rprofile
Created February 7, 2021 07:42
.Rprofile that prevents R from using linuxbrew paths, which break compilation.
# R should not use the linuxbrew paths;
# this workaround strips them from PATH.
.pth = Sys.getenv("PATH")
.pths = unlist(strsplit(.pth, ":"))
.keep = !grepl("brew", .pths)  # grepl is vectorised, no lapply dance needed
Sys.setenv(PATH = paste(.pths[.keep], collapse = ":"))
print("Changed PATH")
print(.pth)
print("to")
print(Sys.getenv("PATH"))
@danmackinlay
danmackinlay / find-pis
Last active February 7, 2024 09:09 — forked from chr15m/find-pis
Find Raspberry Pi devices on your local networks.
#!/bin/sh
# get broadcast addresses for each network
net=`ifconfig | grep -o -E "Bcast:(.*?) " | cut -f2 -d":"`
# loop over networks running the scan
for n in $net;
do
# first find SSH machines silently to prime the arp table
nmap -T4 -n -p 22 --open --min-parallelism 100 "$n/24" | grep -e "scan report for" -e "ssh" > /dev/null

Keybase proof

I hereby claim:

  • I am danmackinlay on github.
  • I am danmackinlay (https://keybase.io/danmackinlay) on keybase.
  • I have a public key ASBW9iWVglwJj-mtImYHR5uYigzQc-aCyVHThu_sQN9umAo

To claim this, I am signing this object:

@danmackinlay
danmackinlay / README.md
Created October 18, 2016 23:52
Fork of Alfred Klomp's excellent shrinkpdf.sh

Usage

Download the script by clicking the filename at the top of the box, then make it executable. Run it with no arguments to print a usage summary. Run it with a single argument, the name of the PDF to shrink, and it writes the result to stdout:

./shrinkpdf.sh in.pdf > out.pdf

You can also provide a second filename for the output:

./shrinkpdf.sh in.pdf out.pdf

@danmackinlay
danmackinlay / gpd.py
Created October 5, 2016 06:19
Generalized Poisson Distribution for scipy.
import numpy as np
from scipy.stats import rv_discrete
from scipy.special import gamma, gammaln
class gpd_gen(rv_discrete):
"""
A Lagrangian Generalised Poisson-Poisson distribution.
``eta`` is the branching ratio,
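The class preview is truncated above; for reference, the pmf it presumably wraps is the Consul-Jain generalized Poisson (an assumption based on the docstring's "branching ratio" wording), in which ``eta = 0`` recovers the ordinary Poisson:

```python
import numpy as np
from scipy.special import gammaln

def gpd_pmf(k, theta, eta):
    """Generalized Poisson pmf: theta*(theta+eta*k)**(k-1)*exp(-theta-eta*k)/k!

    Computed on the log scale (via gammaln for the factorial) for stability.
    """
    k = np.asarray(k, dtype=float)
    logp = (np.log(theta)
            + (k - 1.0) * np.log(theta + eta * k)
            - (theta + eta * k)
            - gammaln(k + 1.0))
    return np.exp(logp)
```

A subclass of ``rv_discrete`` like the one in the gist would expose this as its ``_pmf`` method.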
@danmackinlay
danmackinlay / serialization.R
Created February 1, 2015 01:12
Pass sparse matrices between R and Python
library(rhdf5)
library(Matrix)  # provides sparseMatrix()
load.sparse.hdf = function (filename, path) {
idx = as.vector(h5read(filename, paste(path, "v_indices", sep="/")))
idxptr = as.vector(h5read(filename, paste(path, "v_indptr", sep="/")))
vals = as.vector(h5read(filename, paste(path, "v_data", sep="/")))
dims = as.vector(h5read(filename, paste(path, "v_datadims", sep="/")))
col.names = h5read(filename, paste(path, "v_col_names", sep="/"))
data = sparseMatrix(
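A possible Python counterpart to the R loader above, writing a SciPy CSC matrix under the same dataset names (`v_indices`, `v_indptr`, `v_data`, `v_datadims`); h5py is an assumption here, since the gist's Python half isn't shown:

```python
import h5py
import numpy as np
import scipy.sparse as sp

def save_sparse_hdf(filename, path, mat):
    """Write `mat` in CSC form under the dataset names the R loader reads."""
    mat = sp.csc_matrix(mat)
    with h5py.File(filename, "w") as f:
        g = f.create_group(path)
        g.create_dataset("v_indices", data=mat.indices)
        g.create_dataset("v_indptr", data=mat.indptr)
        g.create_dataset("v_data", data=mat.data)
        g.create_dataset("v_datadims", data=np.asarray(mat.shape))

def load_sparse_hdf(filename, path):
    """Round-trip counterpart: rebuild the CSC matrix from those names."""
    with h5py.File(filename, "r") as f:
        g = f[path]
        return sp.csc_matrix(
            (g["v_data"][...], g["v_indices"][...], g["v_indptr"][...]),
            shape=tuple(g["v_datadims"][...]),
        )
```

Note the indices are stored 0-based as SciPy holds them; R's `sparseMatrix()` expects 1-based indices, so the R side must offset them accordingly.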
@danmackinlay
danmackinlay / long_exposure.py
Last active August 29, 2015 14:12
Dirty hack to fake a long exposure from a QuickTime movie
import os
import tempfile
import subprocess
import skimage
from skimage import exposure
import skimage.io as io
import argparse
import shutil
import numpy as np
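The imports are all the preview shows, but the core of the trick is simple: a long exposure is approximately the per-pixel mean of many short ones. A sketch of that combining step (frame extraction from the movie, which the gist does via a subprocess, is omitted; `fake_long_exposure` is an illustrative name, not the gist's):

```python
import numpy as np

def fake_long_exposure(frames):
    """Average a sequence of HxWx3 uint8 frames into one 'long exposure'."""
    # promote to float so the running sum does not overflow uint8
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    mean = stack.mean(axis=0)  # per-pixel average over time
    return np.clip(mean, 0, 255).astype(np.uint8)
```

The clip is a no-op for a plain mean but keeps the function safe if the averaging is later swapped for a weighted sum or exposure curve.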