dan mackinlay (danmackinlay)
@danmackinlay
danmackinlay / zot_bib.py
Last active Aug 15, 2021
script to decorate .Rmd files with bibliography files from my local zotero installation
#! /usr/bin/env python
"""
Export Zotero bibliographies for my blog using betterbibtex export support
http://retorque.re/zotero-better-bibtex/exporting/pull
Usage:
From the root of a blogdown blog, run
```
Git pre-commit hook for large files

This hook warns you before you accidentally commit large files to git. It's very hard to reverse such an accidental commit, so it's better to prevent it in advance.

Since you will likely want this hook in all your git repos, a second script is attached that installs it into every repo you create or clone in the future.

Of course, you can just download it directly to the hooks in an existing git repo.
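The core of the idea can be sketched in a few lines of POSIX shell (my own illustrative version, assuming a 5 MiB limit; the attached hook may use a different threshold or different git plumbing):

```shell
#!/bin/sh
# Pre-commit hook sketch: warn when a staged file exceeds LIMIT bytes.
LIMIT=${LIMIT:-5242880}  # 5 MiB; override with the LIMIT env var

too_big() {
    # true if the file named in $1 is larger than LIMIT bytes
    [ "$(wc -c < "$1")" -gt "$LIMIT" ]
}

check_staged_sizes() {
    fail=0
    # only files that are added, copied, or modified in the index
    for f in $(git diff --cached --name-only --diff-filter=ACM); do
        [ -f "$f" ] || continue
        if too_big "$f"; then
            echo "pre-commit: $f exceeds $LIMIT bytes" >&2
            fail=1
        fi
    done
    return $fail
}

check_staged_sizes
```

Saved as `.git/hooks/pre-commit` and made executable, the last call sets the hook's exit status, so an oversized file aborts the commit; `git commit --no-verify` bypasses it.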

@danmackinlay
danmackinlay / .Rprofile
Created Feb 7, 2021
.Rprofile that prevents R using linuxbrew paths, which break compilation.
# R should not use the linuxbrew paths.
# this laborious workaround prevents that
.pth = Sys.getenv("PATH")
.pths = unlist(strsplit(.pth, ":"))
.nbrewpthi = !grepl("brew", .pths)  # grepl is already vectorised
Sys.setenv(PATH=paste(.pths[.nbrewpthi], collapse=":"))
print("Changed PATH")
print(.pth)
print("to")
print(Sys.getenv("PATH"))
@danmackinlay
danmackinlay / find-pis
Last active Feb 13, 2020 — forked from chr15m/find-pis
Find Raspberry Pi devices on your local networks.
#!/bin/sh
# get broadcast addresses for each network
net=`ifconfig | grep -o -E "Bcast:(.*?) " | cut -f2 -d":"`
# loop over networks running the scan
for n in $net;
do
# first find SSH machines silently to prime the arp table
nmap -T4 -n -p 22 --open --min-parallelism 100 "$n/24" | grep -e "scan report for" -e "ssh" > /dev/null
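Once the scan has primed the ARP cache, the Pis can be picked out of `arp -an` output by MAC prefix. A sketch of that filtering step (the OUI list here is mine: b8:27:eb is the classic Raspberry Pi Foundation prefix, dc:a6:32 and e4:5f:01 the newer ones; the original gist may match differently):

```shell
#!/bin/sh
# Match ARP entries whose MAC carries a Raspberry Pi OUI prefix.
is_pi_mac() {
    grep -i -e "b8:27:eb" -e "dc:a6:32" -e "e4:5f:01"
}
# typical use, once the scan above has filled the ARP cache:
# arp -an | is_pi_mac
```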

Keybase proof

I hereby claim:

  • I am danmackinlay on github.
  • I am danmackinlay (https://keybase.io/danmackinlay) on keybase.
  • I have a public key ASBW9iWVglwJj-mtImYHR5uYigzQc-aCyVHThu_sQN9umAo

To claim this, I am signing this object:

@danmackinlay
danmackinlay / README.md
Created Oct 18, 2016
Fork of Alfred Klomp's excellent shrinkpdf.sh

Usage

Download the script by clicking the filename at the top of the box. Make it executable. If you run it with no arguments, it prints a usage summary. If you run it with a single argument, the name of the PDF to shrink, it writes the result to stdout:

./shrinkpdf.sh in.pdf > out.pdf

You can also provide a second filename for the output:

./shrinkpdf.sh in.pdf out.pdf

@danmackinlay
danmackinlay / gpd.py
Created Oct 5, 2016
Generalized Poisson Distribution for scipy.
import numpy as np
from scipy.stats import rv_discrete
from scipy.special import gamma, gammaln
class gpd_gen(rv_discrete):
"""
A Lagrangian Generalised Poisson-Poisson distribution.
``eta`` is the branching ratio,
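The preview cuts off before the implementation; the heart of such an `rv_discrete` subclass can be sketched as follows. This is my own minimal take on the Lagrangian GPD pmf, P(k) = θ(θ+ηk)^(k−1) e^(−θ−ηk)/k! with `theta` the rate and `eta` the branching ratio, not necessarily the gist's exact code:

```python
import numpy as np
from scipy.stats import rv_discrete
from scipy.special import gammaln

class gpd_gen(rv_discrete):
    """Lagrangian Generalised Poisson:
    pmf(k) = theta * (theta + eta*k)**(k-1) * exp(-theta - eta*k) / k!,
    normalised for 0 <= eta < 1."""
    def _logpmf(self, k, theta, eta):
        mu = theta + eta * k
        return np.log(theta) + (k - 1) * np.log(mu) - mu - gammaln(k + 1)

    def _pmf(self, k, theta, eta):
        return np.exp(self._logpmf(k, theta, eta))

gpd = gpd_gen(name="gpd", a=0)
```

`gpd.pmf(np.arange(50), 2.0, 0.1)` then behaves like any scipy discrete distribution; at `eta = 0` the pmf reduces to an ordinary Poisson.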
@danmackinlay
danmackinlay / serialization.R
Created Feb 1, 2015
pass sparse matrices between R and Python
library(rhdf5)
library(Matrix)  # for sparseMatrix
load.sparse.hdf = function (filename, path) {
  idx = as.vector(h5read(filename, paste(path, "v_indices", sep="/")))
  idxptr = as.vector(h5read(filename, paste(path, "v_indptr", sep="/")))
  vals = as.vector(h5read(filename, paste(path, "v_data", sep="/")))
  dims = as.vector(h5read(filename, paste(path, "v_datadims", sep="/")))
  col.names = h5read(filename, paste(path, "v_col_names", sep="/"))
  data = sparseMatrix(
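The Python half of the round trip is not shown in the preview. A sketch of a compatible writer, assuming scipy CSC and h5py, with dataset names (`v_indices`, `v_indptr`, `v_data`, `v_datadims`, `v_col_names`) matching the R reader above, might look like:

```python
import h5py
import numpy as np
from scipy import sparse

def save_sparse_hdf(filename, path, mat, col_names):
    """Write a CSC sparse matrix under `path` with the dataset names the
    R loader expects. A sketch only: untested against rhdf5's conventions
    (e.g. rhdf5 may transpose arrays or 1-index on read)."""
    mat = sparse.csc_matrix(mat)
    with h5py.File(filename, "w") as f:
        g = f.create_group(path)
        g.create_dataset("v_indices", data=mat.indices)
        g.create_dataset("v_indptr", data=mat.indptr)
        g.create_dataset("v_data", data=mat.data)
        g.create_dataset("v_datadims", data=np.asarray(mat.shape))
        g.create_dataset("v_col_names",
                         data=np.asarray(col_names, dtype="S"))
```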
@danmackinlay
danmackinlay / long_exposure.py
Last active Aug 29, 2015
Dirty hack to fake long exposure from a QuickTime movie
import os
import tempfile
import subprocess
import skimage
from skimage import exposure
import skimage.io as io
import argparse
import shutil
import numpy as np
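The imports suggest the approach: decompose the movie into frames, then average them. The core averaging step might look like this (a sketch in plain numpy; the gist itself presumably shells out via `subprocess` to extract frames into a temp directory first):

```python
import numpy as np

def average_frames(frames):
    """Average an iterable of same-shaped uint8 frames to fake a long
    exposure; accumulates in float64 to avoid uint8 overflow."""
    acc = None
    n = 0
    for frame in frames:
        frame = np.asarray(frame, dtype=np.float64)
        acc = frame if acc is None else acc + frame
        n += 1
    if n == 0:
        raise ValueError("no frames given")
    return (acc / n).round().astype(np.uint8)
```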
danmackinlay / mysql2sqlite.sh
#!/bin/sh
# Converts a mysqldump file into a Sqlite 3 compatible file. It also extracts the MySQL `KEY xxxxx` lines from the
# CREATE block and creates them in separate commands _after_ all the INSERTs.
# Awk is chosen because it's fast and portable. You can use gawk, original awk or even the lightning-fast mawk.
# The mysqldump file is traversed only once.
# Usage: $ ./mysql2sqlite mysqldump-opts db-name | sqlite3 database.sqlite
# Example: $ ./mysql2sqlite --no-data -u root -pMySecretPassWord myDbase | sqlite3 database.sqlite
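The KEY-relocation trick the header describes can be sketched in a few lines of awk (illustrative only, far from the full converter, and it ignores details like the dangling comma the real script cleans up): inside a CREATE TABLE block, divert each `KEY name (cols)` line into a `CREATE INDEX` statement emitted at the end.

```shell
#!/bin/sh
# Sketch: pull `KEY name (cols)` out of CREATE TABLE and replay them
# as CREATE INDEX statements after everything else has printed.
relocate_keys() {
    awk '
        /^CREATE TABLE/ { tbl = $3 }
        /^ *KEY /       { idx[n++] = "CREATE INDEX " $2 " ON " tbl " " $3 ";"; next }
                        { print }
        END             { for (i = 0; i < n; i++) print idx[i] }
    '
}
```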