
@dbeley
dbeley / doom.txt
Created May 11, 2021 21:15
doom-emacs cheatsheet
SPC
  SPC  find file in workspace
  <    switch buffer
  ,    switch workspace buffer
  .    find file
  :    M-x
  /    search in project
b
  b  switch workspace buffer
  B  switch buffer
#!/usr/bin/env bash
# prune all unused docker data (images, containers, networks, build cache)
docker system prune -a
# remove dead and exited containers (and their anonymous volumes):
docker ps --filter status=dead --filter status=exited -aq | xargs -r docker rm -v
# remove dangling (untagged) images:
docker images --no-trunc | grep '<none>' | awk '{ print $3 }' | xargs -r docker rmi
@dbeley
dbeley / download_audio_files_from_website.sh
Last active January 18, 2021 23:37
Download all audio files from a list of URLs.
#!/usr/bin/env bash
set -eEu -o pipefail
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd -P)"
# URLs.txt is a text file containing one URL per line
# read line by line instead of word-splitting `cat` output
while IFS= read -r URL; do
# you can add other extensions to --accept
wget -r -nd --accept=mp3,MP3 -k -l 1 --show-progress --progress=bar:force:noscroll -e robots=off --span-hosts -q "$URL"
done < "$DIR/URLs.txt"
@dbeley
dbeley / archive_github_user_starred.sh
Last active January 31, 2021 09:15
Archive all github starred repos of a user.
#!/usr/bin/env bash
set -eEu -o pipefail
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd -P)"
usage() { printf "%s" "\
Usage: ./archive_github_user_starred.sh [-h] USER
"; exit 0;
}
if [ "${1:-}" = "-h" ]; then
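The preview above cuts off before the actual archiving logic. A minimal sketch of the same idea in Python (not the gist's bash code; the function names, the `--mirror` clone choice, and the pagination behavior are my assumptions, built on the public GitHub API's `/users/USER/starred` endpoint, which is rate-limited when unauthenticated):

```python
"""Sketch: list a user's starred repos via the GitHub API and clone them."""
import json
import subprocess
import sys
import urllib.request


def clone_urls(repos):
    """Extract clone URLs from a GitHub API repository payload."""
    return [repo["clone_url"] for repo in repos]


def starred_pages(user):
    """Yield pages of starred repositories for `user`, 100 per page."""
    page = 1
    while True:
        url = f"https://api.github.com/users/{user}/starred?per_page=100&page={page}"
        with urllib.request.urlopen(url) as resp:
            repos = json.load(resp)
        if not repos:  # an empty page means we are past the last one
            return
        yield repos
        page += 1


def archive_starred(user):
    for repos in starred_pages(user):
        for url in clone_urls(repos):
            # --mirror keeps a bare, complete copy suitable for archiving
            subprocess.run(["git", "clone", "--mirror", url], check=False)


if __name__ == "__main__":
    archive_starred(sys.argv[1])
```

The sibling `archive_github_user_repos.sh` gist below presumably does the same against `/users/USER/repos` instead of `/users/USER/starred`.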
@dbeley
dbeley / archive_github_user_repos.sh
Last active December 27, 2022 20:20
Archive all github repos of a user.
#!/usr/bin/env bash
set -eEu -o pipefail
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd -P)"
usage() { printf "%s" "\
Usage: ./archive_github_user_repos.sh [-h] USER
"; exit 0;
}
if [ "${1:-}" = "-h" ]; then
@dbeley
dbeley / download_with_favd.sh
Last active December 17, 2020 09:59
Automatically download favicons
#!/usr/bin/env bash
./favd.sh google.com google.ico
@dbeley
dbeley / extract_urls.py
Last active September 14, 2020 00:44
Script to extract youtube urls from a json file returned by the google takeout export.
"""
Extract youtube urls from a json file returned by the google takeout export.
Usage: python extract_urls.py <name of json file>
"""
import logging
import argparse
import json
from pathlib import Path
logger = logging.getLogger()
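The preview stops after the imports. A minimal sketch of what the extraction step could look like, assuming (as Takeout's `watch-history.json` usually has) that each entry carries a `"titleUrl"` key; the `extract_urls` helper name is mine, not the gist's:

```python
"""Sketch: pull YouTube URLs out of a Google Takeout watch-history JSON."""
import json
import sys


def extract_urls(entries):
    """Return the titleUrl of every entry that has one, skipping the rest."""
    return [e["titleUrl"] for e in entries if "titleUrl" in e]


if __name__ == "__main__":
    with open(sys.argv[1], encoding="utf-8") as f:
        data = json.load(f)
    print("\n".join(extract_urls(data)))
```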
@dbeley
dbeley / 🎵 My last week in music
Last active February 19, 2020 22:00
🎵 Last week in music
The Gathering ████░░░░░░░░░░░░░ 52 plays
Paradise Lost ███▋░░░░░░░░░░░░░ 48 plays
Agalloch ██▏░░░░░░░░░░░░░░ 28 plays
Anathema █▊░░░░░░░░░░░░░░░ 24 plays
Katatonia █▋░░░░░░░░░░░░░░░ 22 plays
Riverside █▏░░░░░░░░░░░░░░░ 15 plays
Primordial ▊░░░░░░░░░░░░░░░░ 10 plays
Ride ▌░░░░░░░░░░░░░░░░ 8 plays
Myrath ▌░░░░░░░░░░░░░░░░ 7 plays
Thy Catafalque ▍░░░░░░░░░░░░░░░░ 6 plays
import requests
import pandas as pd
from bs4 import BeautifulSoup
# grab every list item inside the page's navigation box
url_base = "https://en.wikipedia.org/wiki/Napoleon"
soup_base = BeautifulSoup(requests.get(url_base).content, "lxml")
links = soup_base.find("div", {"class": "navbox"}).find_all("li")
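The snippet imports pandas but ends before using it. A plausible continuation (the column names and the `links_to_records` helper are my assumptions, not the original gist's code) turns the collected `<li>` elements into a DataFrame:

```python
# Continuation sketch: map bs4 <li> elements to (text, href) records.
import pandas as pd
from bs4 import BeautifulSoup


def links_to_records(items):
    """Keep only list items that contain a link with an href."""
    records = []
    for li in items:
        a = li.find("a")
        if a is not None and a.get("href"):
            records.append({"text": a.get_text(strip=True), "href": a["href"]})
    return records


# small self-contained demo in place of the live Wikipedia page
html = "<ul><li><a href='/wiki/France'>France</a></li><li>no link</li></ul>"
items = BeautifulSoup(html, "html.parser").find_all("li")
df = pd.DataFrame(links_to_records(items))
```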
@dbeley
dbeley / python-skeleton.py
Last active August 24, 2019 17:20
simple python script skeleton
"""
Python script skeleton
"""
import logging
import time
import argparse
logger = logging.getLogger()
temps_debut = time.time()  # start time, for elapsed-time reporting