
wget -r --spider --delete-after --force-html -D "$DOMAINS" -l "$DEPTH" "$HOME" 2>&1 \
| grep '^--' | awk '{ print $3 }' | grep -v '\.\(css\|js\|png\|gif\|jpg\)$' | sort | uniq > "$OUTPUT"
echo 'export PATH=$HOME/local/bin:$PATH' >> ~/.bashrc
. ~/.bashrc
mkdir -p ~/local
mkdir -p ~/node-latest-install
cd ~/node-latest-install
curl | tar xz --strip-components=1
./configure --prefix="$HOME/local"  # "$HOME/local" avoids relying on tilde expansion after "="
make install # ok, fine, this step probably takes more than 30 seconds...
curl | sh
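If the prefix install succeeds, the binaries land in $HOME/local/bin. A quick, portable way to confirm that directory is actually on PATH (a sketch of my own, not part of the original gist):

```shell
# Sketch: verify that $HOME/local/bin is on PATH.
# The first line mirrors what the .bashrc edit above effectively does.
PATH="$HOME/local/bin:$PATH"
case ":$PATH:" in
  *":$HOME/local/bin:"*) echo "PATH ok" ;;
  *)                     echo "PATH missing $HOME/local/bin" ;;
esac
```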

After installing Arch on my Raspberry Pi, internet worked out of the box: I could plug it into the router, turn it on, ssh in and start downloading things. But the router is in my housemate's bedroom, which isn't ideal. If I want the Pi to be connected to the internet in my room, I need it to be connected to my laptop. (Another option would be a USB wifi dongle, of course.) This is how I did it. Much credit goes to the Ubuntu wiki's Connection sharing page.

I should disclaim that I don't fully understand networking stuff, and some of what I say might be wrong. I also didn't write this as I was going; so while I've consulted my browser and shell histories, it's possible I've forgotten some steps.

My laptop is running Gentoo, and this is where most of the work has to be done. It connects to the internet through wifi, on interface wlan0. The ethernet port is eth0, and eth0 is also the name of the ethernet port on the Pi.

Step zero: plug everything in.
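The Ubuntu wiki's connection-sharing recipe boils down to three moves on the laptop: give eth0 a static address, enable IP forwarding, and NAT the Pi's traffic out through wlan0. A sketch under those assumptions (the 10.0.0.0/24 subnet is my choice, not from the original write-up; run as root):

```shell
# Sketch of the connection-sharing setup; addresses are assumptions.
ip addr add 10.0.0.1/24 dev eth0             # static address on the laptop's ethernet port
ip link set eth0 up
sysctl -w net.ipv4.ip_forward=1              # let the kernel route between interfaces
iptables -t nat -A POSTROUTING -o wlan0 -j MASQUERADE   # NAT the Pi's traffic out via wifi

# On the Pi: an address in the same subnet, with the laptop as gateway.
# ip addr add 10.0.0.2/24 dev eth0
# ip route add default via 10.0.0.1
```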


This American Life limits their podcast feed to only the most recently aired episode, but you can download every episode (or a range) using a one-liner like this:

for i in {1..600}; do wget $i.mp3; done
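The scraped one-liner above lost its URL in transit; the general shape of the loop, with a placeholder base URL (an assumption, not the real feed address), is:

```shell
# Sketch: fetch a numbered range of episodes.
# BASE is a placeholder; substitute the real episode URL prefix.
BASE="https://example.com/episodes"
for i in 1 2 3; do
  echo "would fetch: $BASE/$i.mp3"   # swap echo for: wget "$BASE/$i.mp3"
done
```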
# Script for installing tmux on systems where you don't have root access.
# tmux will be installed in $HOME/local/bin.
# It's assumed that wget and a C/C++ compiler are installed.

# exit on error
set -e

wget="wget -e robots=off -nv"
tab="$(printf '\t')"

# Construct listing.txt from url.list
# The list of archived pages, including some wildcard URLs
set -e

# Usage:
#   [--parallel=N] [rsync args...]
# Options:
#   --parallel=N   Use N parallel processes for transfer. Defaults to 10.
# Notes:
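The usage comment above describes a wrapper that fans rsync out over N processes. The snippet doesn't show how the wrapper does that, but one common implementation is to feed a file list to xargs -P. A sketch under that assumption (SRC/DEST and the file list are placeholders; echo stands in for the real transfer):

```shell
# Sketch: run up to N transfers at once via xargs -P.
# SRC/DEST are placeholders; the wrapper's actual flags aren't shown above.
PARALLEL=10
# printf simulates a file list; in practice something like: find "$SRC" -type f
printf '%s\n' a.txt b.txt c.txt \
  | xargs -P"$PARALLEL" -I{} echo rsync -a "SRC/{}" "DEST/{}"
```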
jadedgnome / remove_crw.cmd
Created Dec 25, 2017 — forked from xvitaly/remove_crw.cmd
Remove telemetry updates for Windows 7 and 8.1
@echo off
echo Uninstalling KB3075249 (telemetry for Win7/8.1)
start /w wusa.exe /uninstall /kb:3075249 /quiet /norestart
echo Uninstalling KB3080149 (telemetry for Win7/8.1)
start /w wusa.exe /uninstall /kb:3080149 /quiet /norestart
echo Uninstalling KB3021917 (telemetry for Win7)
start /w wusa.exe /uninstall /kb:3021917 /quiet /norestart
echo Uninstalling KB3022345 (telemetry)
start /w wusa.exe /uninstall /kb:3022345 /quiet /norestart
echo Uninstalling KB3068708 (telemetry)
start /w wusa.exe /uninstall /kb:3068708 /quiet /norestart
jadedgnome /
Created Feb 19, 2018 — forked from mrbar42/
bash scripts to create VOD HLS stream with ffmpeg almighty (tested on Linux and OS X)


bash beach.mkv

will produce:

      |- playlist.m3u8
      |- 360p.m3u8
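The script's core is an ffmpeg invocation per rendition. A minimal single-rendition sketch of the kind of command it builds (flags, filenames, and encoder settings here are my assumptions, not the gist's exact command):

```shell
# Sketch: one 360p VOD HLS rendition; flags are illustrative only.
cmd="ffmpeg -i beach.mkv -vf scale=-2:360 -c:a aac -c:v h264 \
-hls_time 4 -hls_playlist_type vod \
-hls_segment_filename 360p_%03d.ts 360p.m3u8"
echo "$cmd"   # echoed rather than run, since encoding needs ffmpeg and a source file
```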