@ryan-blunden
ryan-blunden / reset_sourcegraph_admin_password.md
Last active February 6, 2025 07:27
Instructions for resetting the Sourcegraph admin password

Presuming you have access to the Sourcegraph Docker container and the container name is sourcegraph:

  1. Get the id for the admin account (should be 1 in most cases):
docker container exec sourcegraph psql -U postgres sourcegraph -c 'SELECT id, username, passwd FROM users'
  2. Set the $ID variable:
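The capture stops here; a hedged sketch of the follow-up, assuming Sourcegraph stores bcrypt hashes in `passwd` (the hash placeholder must be replaced with a real bcrypt hash, e.g. from `htpasswd -bnBC 10 "" 'newpassword' | tr -d ':\n'`; the command is echoed rather than run):

```shell
#!/usr/bin/env bash
# Hypothetical continuation (not from the gist as captured): set the id
# found in step 1, then push a new bcrypt hash into the users table.
ID=1
HASH='$2a$10$REPLACE_WITH_A_REAL_BCRYPT_HASH'
CMD="docker container exec sourcegraph psql -U postgres sourcegraph -c \"UPDATE users SET passwd='${HASH}' WHERE id=${ID}\""
echo "$CMD"   # echoed here; run it once the hash is real
```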
@19317362
19317362 / andro_info.sh
Created July 30, 2019 21:45 — forked from konfou/andro_info.sh
Android Device Specifications
#! /bin/bash
if ! type adb; then
echo "adb not found"
echo "check PATH"
else
echo "============================"
echo "Android Device Specifications"
echo "============================"
adb wait-for-device
echo "> Manufacturer"
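Past this point such a script typically reads properties with `adb shell getprop`; a hedged, self-contained sketch (the property names are standard Android system properties, and it degrades gracefully when adb is absent):

```shell
#!/usr/bin/env bash
# Query a few standard system properties, collecting results in $report.
props=(ro.product.manufacturer ro.product.model ro.build.version.release)
report=""
for p in "${props[@]}"; do
    if command -v adb >/dev/null 2>&1; then
        report+="$p: $(adb shell getprop "$p" 2>/dev/null)"$'\n'
    else
        report+="$p: (adb not available)"$'\n'
    fi
done
printf '%s' "$report"
```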
{% set dest_dir = salt['pillar.get']('config:dest_dir', '/etc/opt/config') -%}
{% set dev_files = [""] -%}
{% set prod_files = [""] -%}
{% set my_env = salt['pillar.get']('config:env', pillar['common']['env_type']) -%}
{% set my_group = salt['pillar.get']('config:group', 'root') -%}
{% set my_user = salt['pillar.get']('config:user', 'root') -%}
{% set files = {"dev": dev_files, "prod": prod_files}[my_env] -%}
---
include:
- profiles.software.sops
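A hedged sketch of how the `files` list built above might be consumed in a state (the state body and `salt://` source path are assumptions, not from the gist):

```jinja
{% for f in files %}
{{ dest_dir }}/{{ f }}:
  file.managed:
    - source: salt://config/files/{{ f }}
    - user: {{ my_user }}
    - group: {{ my_group }}
    - mode: '0640'
{% endfor %}
```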
@greyhoundforty
greyhoundforty / tf-salt.md
Last active March 22, 2023 13:12
Example of using Salt with Terraform

In this example I spin up two web servers and two file servers using Terraform. During provisioning, Terraform runs a remote-exec script to bind the four new servers to the Salt master.

Export Variables

Substitute your actual username and API key for SL_USERNAME and SL_API_KEY:

export TF_VAR_slusername="SL_USERNAME"
export TF_VAR_slapikey="SL_API_KEY"
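With the credentials exported, each instance can carry a remote-exec provisioner that bootstraps salt-minion and points it at the master. A rough sketch only (the resource type, attributes, and variable names are assumptions that depend on your provider version; the Salt bootstrap script's `-A` flag sets the master address):

```hcl
resource "softlayer_virtual_guest" "web" {
  count    = 2
  hostname = "web${count.index + 1}"
  # ... image, datacenter, ssh keys ...

  provisioner "remote-exec" {
    inline = [
      "curl -L https://bootstrap.saltstack.com -o install_salt.sh",
      "sudo sh install_salt.sh -A ${var.salt_master_ip}",
    ]
  }
}
```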
@ricjcosme
ricjcosme / dump-restore
Created September 13, 2017 17:33
DUMP / RESTORE PostgreSQL Kubernetes
DUMP
# pod-name: name of the postgres pod
# postgres-user: database user that is able to access the database
# database-name: name of the database
kubectl exec [pod-name] -- bash -c "pg_dump -U [postgres-user] [database-name]" > database.sql
RESTORE
# pod-name: name of the postgres pod
# postgres-user: database user that is able to access the database
# database-name: name of the database
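The restore command itself is the mirror of the dump; a hedged sketch using the same placeholders (echoed here rather than run against a live cluster):

```shell
#!/usr/bin/env bash
# -i keeps stdin open so the dump file can be piped into psql inside the pod.
RESTORE_CMD='kubectl exec -i [pod-name] -- bash -c "psql -U [postgres-user] [database-name]" < database.sql'
echo "$RESTORE_CMD"
```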
@pescobar
pescobar / build-git.sh
Created October 5, 2015 07:14
compile git with openssl instead of gnutls
#!/usr/bin/env bash
# Clear out all previous attempts
rm -rf "/tmp/source-git/"
# Get the dependencies for git, then get openssl
sudo apt-get install build-essential fakeroot dpkg-dev -y
sudo apt-get build-dep git -y
sudo apt-get install libcurl4-openssl-dev -y
mkdir -p "/tmp/source-git/"
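From here, a Debian-style rebuild of git against OpenSSL typically proceeds as below. This is a hedged sketch, not the gist's own steps (package layout and flags vary by release), printed rather than executed:

```shell
#!/usr/bin/env bash
# Sketch of the remaining steps; echoed instead of run so nothing is built here.
steps=$(cat <<'EOF'
cd /tmp/source-git && apt-get source git && cd git-*/
# swap the gnutls build dependency for the openssl one
sed -i 's/libcurl4-gnutls-dev/libcurl4-openssl-dev/g' debian/control
dpkg-buildpackage -rfakeroot -b -uc
sudo dpkg -i ../git_*.deb
EOF
)
echo "$steps"
```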
@mjpowersjr
mjpowersjr / gist:740a9583e9ec8b49e0a3
Last active May 2, 2024 01:26
Parsing the MySQL slow query log via Logstash (the easy way?)

The MySQL slow query log is a difficult format to extract information from. After looking at various examples with mixed results, I realized that it's much easier to configure MySQL to write the slow query log to a table in CSV format!

From the MySQL documentation:

By default, the log tables use the CSV storage engine that writes data in comma-separated values format. For users who have access to the .CSV files that contain log table data, the files are easy to import into other programs such as spreadsheets that can process CSV input.

my.cnf

Note: don't forget to open up permissions on your slow query log CSV file so Logstash can read it!

# enable slow query log
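A my.cnf fragment along those lines might look like this sketch (the option names are standard MySQL system variables; the one-second threshold is just an example):

```ini
[mysqld]
# enable the slow query log and write it to the mysql.slow_log table,
# which uses the CSV storage engine by default
slow_query_log  = 1
long_query_time = 1
log_output      = TABLE
```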
anonymous
anonymous / Makefile
Created December 15, 2013 11:58
Hedging client for Ripple
HEDGE = node riphedge

all:
	npm install ripple-lib

hedge:
	-while date; do \
		$(HEDGE); \
		sleep 100; \
	done
anonymous
anonymous / lighttable
Created May 28, 2013 20:05
Light table launcher script
#!/bin/bash
LT=LightTable
# adapted from http://stackoverflow.com/questions/59895/can-a-bash-script-tell-what-directory-its-stored-in
SOURCE="${BASH_SOURCE[0]}"
# resolve $SOURCE until it is no longer a symlink
while [ -h "$SOURCE" ]; do
  HERE="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
  SOURCE="$(readlink "$SOURCE")"
  # a relative symlink is resolved against the directory it lives in
  [[ $SOURCE != /* ]] && SOURCE="$HERE/$SOURCE"
done
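The resolution loop can be exercised on its own; a small self-contained demo (the temporary paths are illustrative):

```shell
#!/usr/bin/env bash
# Build a symlink to a scratch script, then resolve it with the same loop.
tmp=$(mktemp -d)
echo 'echo hi' > "$tmp/real.sh"
ln -s "$tmp/real.sh" "$tmp/link.sh"
SOURCE="$tmp/link.sh"
while [ -h "$SOURCE" ]; do
  HERE="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
  SOURCE="$(readlink "$SOURCE")"
  [[ $SOURCE != /* ]] && SOURCE="$HERE/$SOURCE"
done
echo "$SOURCE"   # now points at real.sh, not the symlink
rm -rf "$tmp"
```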
@mhubig
mhubig / apt-config.conf
Last active February 5, 2025 11:55
Logstash config file for parsing apt history.log files (usually found at /var/log/apt/history.log).
input {
  tcp {
    type => "apt-history"
    port => 3333
  }
}

filter {
  # First, glue all lines together into one event!
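One common way to do that gluing, sketched here rather than taken from the gist: a multiline rule keyed on `Start-Date:`, the line that opens each apt history entry:

```
multiline {
  pattern => "^Start-Date"
  negate  => true
  what    => "previous"
}
```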