Clement Demonchy (knil-sama)
apiVersion: batch/v1beta1
kind: CronJob
metadata:
  name: example
  namespace: default
spec:
  schedule: '*/1 * * * *'
  jobTemplate:
    spec:
      template:
        command: ["script"]
        args: ["--aws-secret-access-key=$(AWS_SECRET_ACCESS_KEY_USER)", "--aws-access-key-id=$(AWS_ACCESS_KEY_ID_USER)", "--aws-region=$(AWS_DEFAULT_REGION)",
          "--postgres=$(POSTGRES_URI)", "--s3-bucket=$(AWS_TRACKING_BUCKET)"]
        envFrom:
          - secretRef:
              name: {{ .Values.secrets_postgres_database }}
          - configMapRef:
              name: {{ .Values.config_aws }}
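Kubernetes expands $(VAR) references in command and args from the container's environment at container start, so values injected via envFrom (the secret and config map above) end up as CLI flags; an unresolved reference is left as the literal string. A minimal sketch of that mechanism, with placeholder names not taken from the gist:

```yaml
# Sketch only: "example-config" and "demo" are placeholders.
# AWS_DEFAULT_REGION comes from the config map; kubelet substitutes it into args.
containers:
  - name: demo
    image: demo-image
    command: ["script"]
    args: ["--aws-region=$(AWS_DEFAULT_REGION)"]
    envFrom:
      - configMapRef:
          name: example-config
```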
knil-sama / cronjob_run_current_catchup.sh
Last active August 11, 2019 09:44
Compare cronjob current run vs catchup run
# normal run that loads yesterday's data
command: ["etl_script"]
args: ["--bucket-name", "name"]
# catchup run that loads a specific date range
command: ["etl_script"]
args: ["--bucket-name", "name", "--start-date", "2018-02-01", "--end-date", "2018-03-01"]
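To launch one catchup run per day instead of one run over the whole range, the list of dates can be generated in plain shell. A small sketch (GNU date is assumed for the -d flag; the dates are examples):

```shell
#!/bin/sh
# Print each day in [start, end), one per line, so a per-day
# catchup job can be launched for each printed date.
start="2018-02-01"
end="2018-02-04"
d="$start"
while [ "$d" != "$end" ]; do
  echo "$d"
  d=$(date -d "$d + 1 day" +%F)
done
```

Each printed date can then be passed as both --start-date and --end-date of a single-day run.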
knil-sama / cronjob_pre_post.sh
Created August 4, 2019 10:27
Example cronjob logging the start and end of a job in kube
kind: CronJob
metadata:
  name: example
spec:
  schedule: "10 3,15,20 * * *"
  concurrencyPolicy: "Forbid"
  suspend: false
  jobTemplate:
    spec:
      template:
# list the names of jobs created more than one day ago (GNU date assumed)
kubectl get job -o json | jq --arg DATE "$(date '+%s' -d '1 day ago')" '.items[] | select(.metadata.creationTimestamp | fromdateiso8601 < ($DATE|tonumber)) | .metadata.labels."job-name" | select(. != null)'
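The age test inside that jq filter reduces to an epoch-seconds comparison: fromdateiso8601 converts the creation timestamp to epoch seconds, which is then compared against a cutoff computed one day in the past. The same check in plain shell (GNU date assumed; the timestamp is an example):

```shell
#!/bin/sh
# Mirror the jq age check: is this ISO8601 timestamp older than one day?
cutoff=$(date '+%s' -d '1 day ago')       # epoch seconds, one day ago
created="2018-01-01T00:00:00Z"            # example creation timestamp
created_epoch=$(date -d "$created" '+%s') # same conversion fromdateiso8601 does
if [ "$created_epoch" -lt "$cutoff" ]; then
  echo "older than one day"
fi
```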
# rewrite history to drop directory/large_file from every commit reachable from HEAD
git filter-branch --tree-filter 'rm -f directory/large_file' HEAD
knil-sama / duplicate_cronjob.sh
Last active July 15, 2019 10:31
Create duplicated cronjob in the same namespace
kubectl get cronjob <cronjob-name> -o json > /tmp/<cronjob-name>.json && sed -i 's/<cronjob-name>/<cronjob-name>-duplicate/' /tmp/<cronjob-name>.json && kubectl create -f /tmp/<cronjob-name>.json
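Since the sed pass rewrites every line containing the old name, including labels and annotations, it is worth exercising the rename on a throwaway file before the kubectl create step. A sketch with GNU sed and a placeholder name:

```shell
#!/bin/sh
# Dry-run of the name rewrite: every occurrence of "example" becomes
# "example-duplicate"; inspect the result before feeding it to kubectl.
printf '{"metadata":{"name":"example","labels":{"app":"example"}}}\n' > /tmp/example.json
sed -i 's/example/example-duplicate/g' /tmp/example.json
cat /tmp/example.json
```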
# run a one-off job immediately from an existing cronjob spec
kubectl create job --from=cronjob/<cronjob-name> <job-name>
map <F8> <C-E>:sleep 2500m<CR>j<F8>
" then press F8 to start auto-scrolling: the mapping scrolls one line, sleeps 2.5 s, moves the cursor down, and re-triggers itself
[app:main]
use = egg:pypicloud
pypi.storage = s3
storage.region_name = eu-west-1
storage.bucket = <placeholder>
storage.prefix = packages/
storage.redirect_urls = true
pypi.auth = sql
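Clients then point pip at the pypicloud server as an extra index. A hypothetical pip.conf (the URL is an assumption, not from the gist):

```ini
# ~/.config/pip/pip.conf -- hypothetical; replace the URL with your pypicloud host
[global]
extra-index-url = https://pypi.example.com/simple/
```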