Encrypt:
tar -cJvpf - inputdirectory/ | gpg --symmetric --cipher-algo aes256 | split -d -b 100m - outputfile.tar.xz.gpg
Decrypt:
cat outputfile.tar.xz.gpg* | gpg --decrypt | tar -xJvpf -
find ~/Downloads -type f -exec stat -c "%n|%s|%y" {} \; | awk -F"|" '{printf("INSERT INTO files (file_path, file_size, file_date) VALUES ('\''%s'\'', %s, '\''%s'\'');\n", $1, $2, $3)}' | sqlite3 mydb.db
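To see what the pipeline above actually feeds into sqlite3, you can run it without the final stage. This is a minimal sketch assuming GNU stat (the -c flag); /tmp/sqldemo is a scratch directory used only for the demonstration, and the files table name matches the snippet above:

```shell
# Create a scratch directory with one known file, then print the SQL
# the find | stat | awk pipeline generates for it.
mkdir -p /tmp/sqldemo
printf 'hello' > /tmp/sqldemo/a.txt
sql=$(find /tmp/sqldemo -type f -exec stat -c "%n|%s|%y" {} \; \
  | awk -F"|" '{printf("INSERT INTO files (file_path, file_size, file_date) VALUES ('\''%s'\'', %s, '\''%s'\'');\n", $1, $2, $3)}')
echo "$sql"
```

Note that file paths are interpolated into the SQL verbatim, so a path containing a single quote would break the statement; fine for a personal Downloads folder, not for untrusted input.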
#!/bin/bash
# Run this script without any params for a dry run.
# Run it as root with the "exec" param to remove old kernels, after checking
# the list printed by the dry run.
# FROM: https://askubuntu.com/questions/1253347/how-to-easily-remove-old-kernels-in-ubuntu-20-04-lts
uname -a
IN_USE=$(uname -a | awk '{ print $3 }')
echo "Your in-use kernel is $IN_USE"
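The dry-run step the script builds toward is listing installed kernel images other than the one in use. A minimal sketch, assuming Debian/Ubuntu package names like linux-image-5.4.0-42-generic; to stay self-contained it filters a hard-coded sample of dpkg --list output rather than calling dpkg itself:

```shell
#!/bin/bash
# Simulated `dpkg --list 'linux-image*'` output (sample data, not real).
IN_USE="5.4.0-42-generic"
sample_dpkg_output='ii  linux-image-5.4.0-40-generic  ...
ii  linux-image-5.4.0-42-generic  ...
ii  linux-image-5.4.0-44-generic  ...'
# Old kernels = installed images whose name does not contain the running
# kernel version. These are the removal candidates the dry run would print.
echo "$sample_dpkg_output" \
  | awk '/^ii/ { print $2 }' \
  | grep -v "$IN_USE"
```

On a real system you would replace the sample variable with the dpkg call and pass the surviving package names to apt purge in the exec branch.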
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: spark-ui-proxy-controller
spec:
  serviceName: spark-ui-proxy
  replicas: 1
  selector:
    matchLabels:
      component: spark-ui-proxy
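A StatefulSet's serviceName must refer to an existing headless Service that governs its pods' DNS identity. A minimal sketch of the matching Service, assuming the same component label; the port (80) is an assumption to be adjusted to whatever the proxy actually listens on:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: spark-ui-proxy    # must match spec.serviceName in the StatefulSet
spec:
  clusterIP: None         # headless: gives each pod a stable DNS name
  selector:
    component: spark-ui-proxy
  ports:
    - port: 80            # assumed port; change to the proxy's real port
```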
SELECT tableoid::regclass AS source, *
FROM schema.partitioned_table;
Sometimes you need to troubleshoot and inspect the details of jobs (load, query, etc.) in BigQuery. Inspecting the job history in the BigQuery web UI will only show the jobs that you have run. The same is true when you run bq ls -j
on the command line.
But what if you need to get all jobs that have been run? An example would be automated jobs run by service accounts. A quick tip is to use the --all
flag:
-a,--[no]all: Show all results. For jobs, will show jobs from all users. For datasets, will list hidden datasets. For transfer configs and runs, this flag is redundant and not necessary.
bq ls -j --all
(this tip originated from a question on Stack Overflow: https://stackoverflow.com/questions/47583485/bigquery-history-of-jobs-submitted-through-python-api)
gsutil ls gs://some-bucket/**
#!/bin/bash
#
# A simple entropy walker.
# Requirements: apt install ent
#
echo "0,File-bytes,Entropy,Chi-square,Mean,Monte-Carlo-Pi,Serial-Correlation"
for i in *; do
  echo "$i" "$(ent -t "$i" | tail -n1)"
done
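Where ent is not installed, the bits-per-byte Shannon entropy (the "Entropy" column ent reports in terse mode) can be approximated with just coreutils. A sketch, not a replacement for ent's other statistics:

```shell
#!/bin/bash
# Approximate Shannon entropy (bits per byte) of a file using od and awk.
# od dumps one unsigned decimal byte value per token; awk tallies byte
# frequencies and sums -p*log2(p) over them.
entropy() {
  od -An -v -tu1 "$1" \
    | tr -s ' ' '\n' \
    | awk 'NF { count[$1]++; n++ }
           END {
             for (b in count) {
               p = count[b] / n
               H -= p * log(p) / log(2)
             }
             printf "%.6f\n", H
           }'
}
printf 'aaaaaaaa' > /tmp/ent_demo
entropy /tmp/ent_demo    # a file of one repeated byte has entropy 0
```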
gcloud sql import csv cloud-sql-instance-name gs://somebucket/folder/2/bigolecsv.gz -d somedatabase --table sometable
How do you unmarshal nested JSON?
Given dgraph res.Json:
"q": [
{
"city": {
"uid": "0x51c7ebb",