I hereby claim:
- I am n8henrie on github.
- I am n8henrie (https://keybase.io/n8henrie) on keybase.
- I have a public key whose fingerprint is F21A 6194 C9DB 9899 CD09 E24E 434B 2C14 B8C3 3422
To claim this, I am signing this object:
#!/bin/bash -e
# toggle_docs.sh
# Uses pandoc to change all .md to .rst and vice versa in a given directory tree.
# I don't really care for .rst, but PyPI needs it.

if [ "$(find . -iname "*.md")" ] && [ "$(find . -iname "*.rst")" ]; then
    echo "Both .md and .rst filetypes exist; please make sure only one of the two is present."
elif [ ! "$(find . -iname "*.md")" ] && [ ! "$(find . -iname "*.rst")" ]; then
    echo "Neither .md nor .rst filetypes exist; you should have one or the other."
else
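The snippet cuts off at the else branch. A minimal sketch of how the conversion step might continue — an assumption, not the author's original code — wrapped in a function, and assuming pandoc is on the PATH:

```shell
# Sketch of the else branch: convert whichever filetype is present to the
# other with pandoc, then remove the original. Assumed, not the original code.
toggle_docs() {
    if [ "$(find . -iname '*.md')" ]; then
        find . -iname '*.md' | while read -r f; do
            pandoc "$f" --from markdown --to rst --output "${f%.md}.rst" && rm "$f"
        done
    else
        find . -iname '*.rst' | while read -r f; do
            pandoc "$f" --from rst --to markdown --output "${f%.rst}.md" && rm "$f"
        done
    fi
}
```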
-- Regex from http://www.regular-expressions.info/email.html
using terms from application "Quicksilver"
	on get direct types
		return {"NSStringPboardType"}
	end get direct types
	on process text block_text
		try
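The Quicksilver action above is built around the email regex from regular-expressions.info. As a rough illustration (not the author's code), the same pattern — as published on that site, to the best of my knowledge — can be exercised in Python:

```python
import re

# The email pattern from regular-expressions.info (as I recall it being
# published there); the Quicksilver action presumably applies something similar.
EMAIL_RE = re.compile(r"\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,}\b", re.IGNORECASE)


def extract_emails(text):
    """Return all email-like substrings found in text."""
    return EMAIL_RE.findall(text)
```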
#!/bin/bash -e
# Bash script to download source and configure / make the latest Python 3 on my Raspberry Pi.

LINKS='https://www.python.org/ftp/python/'
# sort -V (version sort) keeps multi-digit versions like 3.10 after 3.9
LATEST=$(curl -s "$LINKS" | ack '<a href="([\d\.]+)/">' --output '$1' | sort -V | tail -n 1)
DL_URL="${LINKS}${LATEST}/Python-${LATEST}.tar.xz"

cd /var/tmp
wget -q "$DL_URL"
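The script stops after the download. A hedged sketch of the usual continuation — the standard unpack / configure / make steps, not the author's original — as a function:

```shell
# Assumed continuation, not from the original script: unpack, configure,
# build, and install without clobbering the system python3.
build_latest_python() {
    tar -xf "Python-$LATEST.tar.xz"
    cd "Python-$LATEST" || return 1
    ./configure --prefix=/usr/local --enable-optimizations
    make -j"$(nproc)"
    sudo make altinstall   # altinstall keeps the distro's python3 intact
}
```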
# Quick function to download a file from the AUR with just the package name
# Usage: aur_download packagename
# You will then need to do the usual AUR stuff, i.e. tar -xvf package.tar.gz; cd; makepkg -s; pacman -U
# Quick import for bash >= 4 (e.g. Arch): source <(curl -sL http://n8h.me/1BtQtGF)

function aur_download() {
    aur_url="https://aur.archlinux.org$(curl -s "https://aur.archlinux.org/packages/$1" | ack -o "(?<=href=\")/packages.*?$1.*?tar.gz(?=\">)")"
    echo "$aur_url"
    read -p "Download the above url? [yn] " response
    case $response in
        [Yy])
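The function is truncated at the first case branch. A sketch of how the case statement might finish — an assumption, not the original code:

```shell
# Hedged sketch of the rest of the case statement (not the original code).
# Expects $response and $aur_url to be set, as in aur_download above.
finish_download() {
    case $response in
        [Yy])
            curl -sO "$aur_url"       # fetch the tarball into the current directory
            ;;
        *)
            echo "Skipping $aur_url"
            ;;
    esac
}
```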
#!/usr/bin/env python3
"""sql_to_csv.py

Accepts a SQLite database and converts all tables to separate csv files.
"""
import csv
import sqlite3
import sys

conn = sqlite3.connect(sys.argv[1])
#!/usr/bin/env python3
"""everytrail_backup.py

Uses the Everytrail API to back up trip info as a .csv file and all the
.gpx files. Files will populate into whatever directory the script is in,
so I recommend you first make an "everytrail_backup" folder, move the script
there, then run."""
import csv
import urllib.parse

import requests
// google_form_to_email.gs
// Google Apps Script to take response content from a Google Form and send it to an email address.
// Make a publicly accessible Google Form, share it as a "secret link," and if desired shorten it
// with a custom Bitly link (assuming you have a free domain sitting around).
// Installation: From the *form* (not the spreadsheet with the responses), copy this into
// Tools -> Script Editor. In the Script Editor, set up your trigger to be "on form submit."
// Customize the values in SETUP, and customize the message if desired.
// The script will loop over all the rows in the spreadsheet except the header row, email
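Only the header comment of the script survives here. As a rough illustration — not the original script — the formatting half can be sketched as a plain function that builds the email body from a response's question/answer pairs; in Apps Script the result would then be handed to MailApp.sendEmail:

```javascript
// Hedged sketch (not the original google_form_to_email.gs): build the email
// body from parallel arrays of form questions and answers. In Apps Script,
// the return value would be passed to MailApp.sendEmail(recipient, subject, body).
function formatResponse(headers, answers) {
  var lines = [];
  for (var i = 0; i < headers.length; i++) {
    lines.push(headers[i] + ": " + answers[i]);
  }
  return lines.join("\n");
}
```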
def gen_groups(iterable, test, to_beginning=True):
    group = []
    for line in iterable:
        # If the line is a delimiter
        if test(line):
            # and you want delimiters to start groups
            if to_beginning:
                # And there is already a group that has
                # been accumulating non-delimiters
                if group:
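The generator is truncated mid-branch. A self-contained completed sketch — an assumption about the author's intent, not the original code — with a usage example:

```python
def gen_groups_sketch(iterable, test, to_beginning=True):
    """Yield lists of consecutive lines, split wherever test(line) is true.

    Completed sketch of the truncated gen_groups above; the branch bodies
    are assumptions, not the original code.
    """
    group = []
    for line in iterable:
        if test(line):
            if to_beginning:
                if group:               # flush the accumulated group first
                    yield group
                group = [line]          # delimiter starts the next group
            else:
                group.append(line)      # delimiter ends the current group
                yield group
                group = []
        else:
            group.append(line)
    if group:                           # flush whatever is left at the end
        yield group


# Usage: split a list of lines on comment markers.
lines = ["# a", "1", "2", "# b", "3"]
groups = list(gen_groups_sketch(lines, lambda l: l.startswith("#")))
# groups is [["# a", "1", "2"], ["# b", "3"]]
```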
my_project/__init__.py
setup.py
python3 setup.py develop
python3 setup.py test
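The `develop` and `test` commands above assume a setup.py at the project root. A minimal sketch of one — the package name comes from the `my_project/__init__.py` path above, and the `tests` layout is an assumption:

```python
# setup.py -- minimal sketch; "my_project" is the name implied above,
# and test_suite="tests" assumes a tests/ directory exists.
from setuptools import find_packages, setup

setup(
    name="my_project",
    version="0.1.0",
    packages=find_packages(),
    test_suite="tests",   # enables `python3 setup.py test` (assumed layout)
)
```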