
Start by deleting the environment repos from GitHub so you start from scratch.

The repo names are defined in jx-requirements.yml and can be listed with: echo $(grep repository jx-requirements.yml | cut -f 2 -d ":") or, better: cat jx-requirements.yml | yq '.environments[].repository'
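As a quick offline check, both extraction commands can be run against a minimal, made-up jx-requirements.yml (the repository names below are placeholders, not the real environment repos):

```shell
# Hypothetical minimal jx-requirements.yml; names are placeholders
cat > /tmp/jx-requirements.yml <<'EOF'
environments:
  - key: dev
    repository: environment-demo-dev
  - key: production
    repository: environment-demo-production
EOF

# grep/cut variant: prints one repository name per matching line
REPOS=$(grep repository /tmp/jx-requirements.yml | cut -f 2 -d ":")
echo "$REPOS"
```

The yq variant (`yq '.environments[].repository' /tmp/jx-requirements.yml`) prints the same names without the leading whitespace.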

https://cert-manager.io/docs/configuration/acme/dns01/google/

kubectl create namespace jx
feczo / knative.md
Last active September 28, 2021 02:40

Base K8s setup

export KUBERNETES_SKIP_CREATE_CLUSTER=1
curl -sS https://get.k8s.io | bash

Change the machine type, region and number of worker nodes; enable the preemptible machine type to reduce cost; request more external IPs in the region quota (IN_USE_ADDRESSES) on console.cloud.google.com.
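These knobs are plain environment variables read before curl'ing get.k8s.io; the exact variable names below are assumptions based on the kube-up GCE defaults, and the zone, size and count values are placeholders:

```shell
# Assumed kube-up/GCE knobs; values are placeholders, adjust to your quota
export KUBE_GCE_ZONE=us-central1-b     # region/zone
export NODE_SIZE=n1-standard-2         # worker machine type
export NUM_NODES=3                     # number of worker nodes
export PREEMPTIBLE_NODE=true           # preemptible workers to reduce cost
```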

The full setup, server and client included, takes about 15 minutes.

  1. GCP VM with CoreOS (or any Docker-capable) image + network / firewall config
  2. install the server image https://hub.docker.com/r/johnae/pritunl/
    sudo sh -c 'echo -n > /etc/systemd/resolved.conf' && sudo systemctl stop systemd-resolved
    docker pull johnae/pritunl
    docker run -d --privileged -p 53:53/udp -p 53:53/tcp -p 443:443/tcp johnae/pritunl
    docker ps
    docker exec -it $HEXA /bin/bash   # replace $HEXA with the container ID from docker ps
# export PROJECT=my-project-x
# export STAGE=gs://my-bucket-x/ds
mvn compile exec:java \
-Dexec.mainClass=com.google.cloud.dataflow.examples.complete.StreamingLogExtract \
-Dexec.args="--project=$PROJECT \
--stagingLocation=$STAGE \
--pubsubTopic=projects/$PROJECT/topics/custom_logs \
--bigQueryDataset='dataflow_demo' \
--bigQueryTable='custom_logs' \
--runner=DataflowPipelineRunner --workerMachineType='n1-standard-1' --streaming=true"
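Quoting $PROJECT inside -Dexec.args is easy to get wrong; building the topic path in a shell variable first keeps it simple (the project name below is a placeholder):

```shell
PROJECT=my-project-x                          # placeholder project id
TOPIC="projects/$PROJECT/topics/custom_logs"  # full Pub/Sub topic path
echo "$TOPIC"
```

The mvn invocation can then use --pubsubTopic=$TOPIC with no extra quoting.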
/*
* Copyright (C) 2015 Google Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
- name: log2monitor
  cmd: /usr/local/bin/lb_response_code_logging.py
  time: '* * * * * *'
  onError: Backoff
  notifyOnError: false
  notifyOnFailure: true
#!/usr/bin/python
import json
import re
import datetime
import logging
import pprint
import requests
from time import sleep
from googleapiclient import discovery
from googleapiclient.errors import HttpError
BLOCKS=$(nslookup -q=TXT _cloud-netblocks.googleusercontent.com 8.8.8.8 | awk -F "spf1 " '{print $2}' | sed -e "s/include:/\n/g" -e "s/?all\"//g" | grep -v '^$')
echo "$BLOCKS" | while read block; do
  nslookup -q=TXT "$block" 8.8.8.8 | awk -F "spf1 " '{print $2}' | sed -e "s/ip4:/\n/g" -e "s/?all\"//g" | grep -v '^ip6:' | grep -v '^$'
done
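The awk/sed pipeline can be sanity-checked offline against a canned TXT answer shaped like the netblocks SPF record (the record content below is a made-up example, not a live lookup):

```shell
# Made-up TXT payload in the same shape as the real SPF answer
TXT='"v=spf1 include:_cloud-netblocks1.googleusercontent.com include:_cloud-netblocks2.googleusercontent.com ?all"'
# Same pipeline as above: split on "spf1 ", one include target per line
OUT=$(echo "$TXT" | awk -F "spf1 " '{print $2}' \
  | sed -e "s/include:/\n/g" -e "s/?all\"//g" | grep -v '^$')
echo "$OUT"
```

This should print one _cloud-netblocks hostname per line, ready for the second-stage lookup.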