
Automate Let's Encrypt certificate deployment in Security Onion

Prerequisites

Turn on user certs in Security Onion (do this only once)

  1. Administration -> Configuration
  2. Options -> Show all configurable settings, including advanced settings
@simonw
simonw / README.md
Created November 5, 2022 21:03
Stream activity from Mastodon into a SQLite database

Stream Mastodon activity into a SQLite database

This script subscribes to the live HTTP feed of public activity on my Mastodon instance and writes the results into SQLite database tables.

It needs sqlite-utils and httpx:

pip install sqlite-utils httpx

Then run:
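The preview ends before the command itself. As a hedged sketch of the approach the description outlines (not the gist's actual script), streaming the public timeline with httpx and writing each status with sqlite-utils might look like the following; the instance URL and database filename are placeholders:

# Hedged sketch, not the gist's script: stream a Mastodon public timeline
# and insert each status into SQLite.
import json
import httpx
import sqlite_utils

INSTANCE = "https://mastodon.example"      # placeholder instance URL
db = sqlite_utils.Database("mastodon.db")  # placeholder database file

with httpx.stream("GET", f"{INSTANCE}/api/v1/streaming/public", timeout=None) as response:
    event = None
    for line in response.iter_lines():
        if line.startswith("event:"):
            event = line.split(":", 1)[1].strip()
        elif line.startswith("data:") and event == "update":
            status = json.loads(line.split(":", 1)[1])
            # alter=True adds new columns as new fields appear in the JSON
            db["statuses"].insert(status, pk="id", alter=True, replace=True)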

@kepano
kepano / obsidian-web-clipper.js
Last active May 3, 2024 07:21
Obsidian Web Clipper Bookmarklet to save articles and pages from the web (for Safari, Chrome, Firefox, and mobile browsers)
javascript: Promise.all([
  import('https://unpkg.com/turndown@6.0.0?module'),
  import('https://unpkg.com/@tehshrike/readability@0.2.0'),
]).then(async ([{ default: Turndown }, { default: Readability }]) => {
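/* Readability extracts the main article content; Turndown converts that HTML to Markdown */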
/* Optional vault name */
const vault = "";
/* Optional folder name such as "Clippings/" */
@zuketo
zuketo / es-ccr
Last active March 21, 2019 21:14
es-ccr
# Initial Setup
#
# NOTE: SETUP FOR DEMONSTRATION PURPOSES ONLY, NOT FOR PRODUCTION USE
#
# ELASTICSEARCH
#
# 1. Download Elasticsearch 6.6 (and unzip)
# 2. Start two instances of Elasticsearch, each with a different cluster name, port, and data path:
# ./bin/elasticsearch -E cluster.name=cluster1 -E http.port=9200 -E path.data=./cluster1-data
# ./bin/elasticsearch -E cluster.name=cluster2 -E http.port=9201 -E path.data=./cluster2-data
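The preview stops at the two-cluster setup; the replication wiring itself is not shown. As a hedged sketch (not taken from the gist), the follower cluster can register cluster1 as a remote and then create a follower index through the 6.x CCR REST API, driven here from Python with requests; the index names and the transport port are assumptions, and CCR also requires at least a trial license on both clusters:

# Hedged sketch, not from the gist: make cluster2 follow an index on cluster1.
import requests

CLUSTER2 = "http://localhost:9201"  # follower cluster (HTTP port from the setup above)

# Register cluster1 as a remote cluster; 9300 is assumed to be cluster1's transport port.
requests.put(
    f"{CLUSTER2}/_cluster/settings",
    json={"persistent": {"cluster": {"remote": {"cluster1": {"seeds": ["127.0.0.1:9300"]}}}}},
)

# Create a follower index on cluster2 that replicates a placeholder leader index from cluster1.
requests.put(
    f"{CLUSTER2}/follower-index/_ccr/follow",
    json={"remote_cluster": "cluster1", "leader_index": "leader-index"},
)
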
#!/usr/bin/env python3
'''
A script to recursively compare two directories (including file size and file hash changes)
Usage: python3 compare_dirs.py DIR1 DIR2
'''
import os, sys, hashlib, unicodedata
COMPARE_FILES = True # should file sizes be compared if their names are the same?
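Only the header of the script appears in the preview. A minimal sketch of the approach the docstring describes, assuming a one-way walk of DIR1 with SHA-256 hashing (not the gist's actual implementation), might look like this:

# Hedged sketch: report files that are missing, differ in size, or differ in hash.
# A fuller version would also check files present only in DIR2.
import hashlib
import os
import sys

def file_hash(path, chunk_size=65536):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_dirs(dir1, dir2):
    for root, _dirs, files in os.walk(dir1):
        for name in files:
            p1 = os.path.join(root, name)
            p2 = os.path.join(dir2, os.path.relpath(p1, dir1))
            if not os.path.exists(p2):
                print(f"missing in {dir2}: {p2}")
            elif os.path.getsize(p1) != os.path.getsize(p2):
                print(f"size differs: {p1}")
            elif file_hash(p1) != file_hash(p2):
                print(f"hash differs: {p1}")

if __name__ == "__main__":
    compare_dirs(sys.argv[1], sys.argv[2])
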
@scrapehero
scrapehero / amazon_reviews.py
Last active November 16, 2022 03:43
Python 3 code to extract Amazon reviews
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Written as part of https://www.scrapehero.com/how-to-scrape-amazon-product-reviews-using-python/
from lxml import html
from json import dump,loads
from requests import get
import json
from re import sub
from dateutil import parser as dateparser
from time import sleep
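The preview stops at the imports. As an illustration of the general approach (fetch a review page with requests and extract fields with lxml XPath), here is a hedged sketch; the URL, the User-Agent header, and the data-hook selectors are assumptions and may not match Amazon's current markup or the gist's own selectors:

# Hedged sketch, not the gist's code: fetch one review page and print review titles and bodies.
import requests
from lxml import html

url = "https://www.amazon.com/product-reviews/ASIN_GOES_HERE/"  # placeholder ASIN
headers = {"User-Agent": "Mozilla/5.0"}  # Amazon tends to block default client headers

page = requests.get(url, headers=headers)
tree = html.fromstring(page.content)

# data-hook attributes are assumed selectors for the review title and body elements
titles = ["".join(t.itertext()).strip() for t in tree.xpath('//*[@data-hook="review-title"]')]
bodies = ["".join(b.itertext()).strip() for b in tree.xpath('//*[@data-hook="review-body"]')]

for title, body in zip(titles, bodies):
    print(title, "->", body[:80])
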
library(tidycensus)
library(leaflet)
library(sf)
library(viridis)
options(tigris_use_cache = TRUE)
il1 <- get_acs(geography = "county",
               variables = c(hhincome = "B19013_001"),
               state = "IL",
               geometry = TRUE) %>%
@BBcan177
BBcan177 / MS-4
Created January 1, 2018 20:52
pfBlockerNG: webtransparency.cs.princeton.edu - The following is the list of sites from Alexa top 1 million which embed scripts that extract email addresses from the browsers' built-in login (password) managers.
# https://webtransparency.cs.princeton.edu/no_boundaries/autofill_sites.html
# The following is the list of sites from Alexa top 1 million which embed scripts that extract email addresses from the browsers'
# built-in login (password) managers.
1000sunny.net
1001trucscoolsafaire.fr
123boutchou.com
12zawodnik.pl
1pic1day.com
2012un-nouveau-paradigme.com
247nigerianewsupdate.co
@koenrh
koenrh / gcp-gpu-vm-hashcat.md
Last active February 4, 2024 18:37
Running Hashcat on Google Cloud's new GPU-based VMs

Running Hashcat on Google Cloud's GPU-based VMs

In February 2017, Google announced the availability of GPU-based VMs. I spun up a few of these instances and ran some benchmarks. Along the way, I wrote down the steps I took to provision the VM instances and install the relevant drivers.

Update April 2019: Updated instructions to use instances with the Tesla T4 GPUs.

@GuilloOme
GuilloOme / .block
Last active December 22, 2017 18:01 — forked from mbostock/.block
Calendar View with D3 v4
license: gpl-3.0
height: 2910
border: no