@jon-ruckwood
Last active January 24, 2020 10:27
Tips and Tricks

jq

Transform JSON into a table to be displayed in a terminal:

$ aws dynamodb describe-table --table-name foobar | jq -r "[ .Table.GlobalSecondaryIndexes[] 
    | { IndexName, IndexStatus, Backfilling } ] 
    | (.[0] | keys | @tsv), (.[] | map(.) | @tsv)"

Backfilling     IndexName       IndexStatus
sender_id       ACTIVE
recipient_id    CREATING        true
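The same keys-as-header plus TSV-rows trick works on any array of objects; a minimal standalone sketch with sample data inlined (no AWS needed — the objects here are made up for illustration):

```shell
# Header row from the first object's keys, then one TSV row per object.
# Note: jq's `keys` sorts alphabetically, so keep object keys in
# alphabetical order (or use `keys_unsorted`) to keep columns aligned.
echo '[{"name":"a","status":"ACTIVE"},{"name":"b","status":"CREATING"}]' \
  | jq -r '(.[0] | keys | @tsv), (.[] | map(.) | @tsv)'
```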

Terminal

Pretty-print JSON:

echo '{"foo": "lorem", "bar": "ipsum"}' | python -m json.tool

Yaml to JSON:

Using pyyaml:

python3 -c 'import sys, yaml, json; print(json.dumps(yaml.safe_load(sys.stdin)))'

Query a single JSON value from MySQL:

mysql ... -e 'SELECT json FROM table_name LIMIT 1 ' -N -B  dbname | sed 's/\\n//g' | jq '.[]'

Quick and dirty proxy (the fifo must exist first):

mkfifo /tmp/fifo
nc -lk 7931 </tmp/fifo | nc example.org 7931 >/tmp/fifo

grep over multiple directories:

grep -r foo /2015/10/{01,02,03}/env/service/something.*
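The `{01,02,03}` part is ordinary shell (bash) brace expansion, which runs before grep ever sees its arguments; you can preview the paths it generates with `echo`:

```shell
# Brace expansion produces one path per element:
echo /2015/10/{01,02,03}/env
# /2015/10/01/env /2015/10/02/env /2015/10/03/env
```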

Quick and dirty way to output a small result set from PostgreSQL in CSV format to a file (this does not escape commas etc..):

psql -h host -p 5439 -U user db -nAF, --pset=footer -c 'SELECT * FROM table LIMIT 10' -o /tmp/table_out.csv

Start all stopped Docker containers:

docker ps -aq -f 'status=exited' | xargs -L1 docker start

Reference previous arguments:

position    notation
1st         !^
last        !$
n-th        !:n
all         !*

e.g.

echo a b c

echo !^     # echo a
echo !:2    # echo b
echo !$     # echo c
echo !*     # echo a b c

Compare the lines of two files:

#find lines only in file1
comm -23 <(sort file1) <(sort file2)

#find lines only in file2
comm -13 <(sort file1) <(sort file2)

#find lines common to both files
comm -12 <(sort file1) <(sort file2)
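A quick self-contained check of the flags above (the file names and contents here are scratch data for the demo):

```shell
# comm needs sorted input; -1/-2/-3 suppress the corresponding column.
printf 'a\nb\nc\n' > /tmp/file1
printf 'b\nc\nd\n' > /tmp/file2
comm -23 /tmp/file1 /tmp/file2   # only in file1: a
comm -13 /tmp/file1 /tmp/file2   # only in file2: d
comm -12 /tmp/file1 /tmp/file2   # common to both: b, c
```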

Test for high latency:

for X in $(seq 60); do curl -Ik -w "HTTPCode=%{http_code} TotalTime=%{time_total}\n" https://example.org/ -so /dev/null; done

VI

Convert CSV into tab-delimited:

:%!column -t -s,
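`:%!` simply filters the whole buffer through a shell command, so the same `column` invocation works on its own outside vim (sample CSV below is made up):

```shell
# Align comma-separated fields into space-padded columns.
printf 'id,name\n1,alpha\n22,beta\n' | column -t -s,
```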

Delete from next line to end of file:

:+,$d

Lowercase contents:

ggVGu

MySQL

Load data from CSV into table:

LOAD DATA INFILE 'data.csv'
INTO TABLE table_name
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
;

Query JSON data type:

SELECT * FROM foobar WHERE json_col->'$.id' = 1;