
Roberto Esparza rbrto

@rbrto
rbrto / google_bigquery_backup_views_scheduled_queries_git.py
Created January 6, 2023 18:09 — forked from krisjan-oldekamp/google_bigquery_backup_views_scheduled_queries_git.py
Backup BigQuery Views and Scheduled Queries to a Git repository using Python. Full article on stacktonic.com
############################################################
# Author Krisjan Oldekamp / Stacktonic.com
# Email krisjan@stacktonic.com
# Article https://stacktonic.com/article/backup-your-valuable-big-query-views-and-scheduled-queries-using-python
############################################################
import os
import git
import google.oauth2.service_account
from google.cloud import bigquery
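The gist preview cuts off after the imports. For context, a minimal sketch of the idea (not the full gist; repository path, project ID, and key file below are placeholders): dump each view's SQL to a file and commit it with GitPython.

# Minimal sketch with placeholder paths/IDs: export view definitions, then commit.
import os
import git
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("sa.json")  # placeholder key file
client = bigquery.Client(project="my-project", credentials=credentials)         # placeholder project

repo = git.Repo("/path/to/backup-repo")  # placeholder local clone
os.makedirs("/path/to/backup-repo/views", exist_ok=True)

for dataset in client.list_datasets():
    for table in client.list_tables(dataset):
        if table.table_type == "VIEW":
            view = client.get_table(table.reference)
            rel_path = f"views/{dataset.dataset_id}.{table.table_id}.sql"
            with open(f"/path/to/backup-repo/{rel_path}", "w") as f:
                f.write(view.view_query)
            repo.index.add([rel_path])

if repo.is_dirty():
    repo.index.commit("Backup BigQuery views")
    repo.remotes.origin.push()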
@rbrto
rbrto / how-to.md
Created September 13, 2021 12:48 — forked from radeksimko/how-to.md
VPC endpoint Terraform example setup

How to

ssh ec2-user@IP
aws configure set region us-west-2
aws s3 ls # listing s3 buckets over VPC endpoint privately
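The same check can be done from Python with boto3 (my addition, not part of the gist); it assumes the instance role and region are configured as in the commands above.

# Equivalent of `aws s3 ls`: list buckets from inside the VPC, so the request
# goes through the S3 VPC endpoint rather than the public internet path.
import boto3

s3 = boto3.client("s3", region_name="us-west-2")
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])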
@rbrto
rbrto / postgres_manager.py
Created May 24, 2021 17:50 — forked from valferon/postgres_manager.py
Python script to handle PostgreSQL database backup and restore
#!/usr/bin/python3
import argparse
import logging
import subprocess
import os
import tempfile
from tempfile import mkstemp
import configparser
import gzip
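The preview stops at the imports. At its core, a script like this shells out to pg_dump / pg_restore; here is a hedged sketch of the backup half (connection details are placeholders, not the gist's config format).

# Minimal sketch: dump a database to a gzipped file with pg_dump (placeholder credentials).
import gzip
import subprocess

def backup(host, db, user, dest):
    # pg_dump writes a plain-SQL dump to stdout; compress it as we read it.
    proc = subprocess.Popen(["pg_dump", "-h", host, "-U", user, db], stdout=subprocess.PIPE)
    with gzip.open(dest, "wb") as f:
        for chunk in iter(lambda: proc.stdout.read(64 * 1024), b""):
            f.write(chunk)
    if proc.wait() != 0:
        raise RuntimeError("pg_dump failed")

backup("localhost", "mydb", "postgres", "/tmp/mydb.sql.gz")  # placeholder values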
@rbrto
rbrto / docker-compose.yaml
Created April 10, 2021 01:00 — forked from darth-veitcher/docker-compose.yaml
Traefik v2.0 with Cloudflare Wildcard and OpenVPN
version: "3"

services:
  traefik:
    image: "traefik:v2.0"
    container_name: "traefik"
    command:
      # Globals
      - "--log.level=DEBUG"
      - "--api=true"
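Not part of the gist, but a quick smoke test once the stack is up is to ask Traefik's API which routers it loaded. This assumes the API/dashboard is reachable on localhost:8080 (i.e. insecure API exposure), which may differ from the gist's final configuration.

# Hypothetical smoke test: list the HTTP routers Traefik has loaded.
# Assumes the API/dashboard is exposed on port 8080.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8080/api/http/routers") as resp:
    for router in json.load(resp):
        print(router["name"], "->", router.get("rule"))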
@rbrto
rbrto / LICENSE.txt
Created July 31, 2020 01:26 — forked from DWSR/LICENSE.txt
CloudSQL MySQL instance with scheduled exports via Cloud Function. Licensed under Apache 2.0
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
@rbrto
rbrto / Jenkinsfile
Last active July 5, 2020 02:46 — forked from lferro9000/Jenkinsfile
Jenkinsfile with PHP pipeline for Jenkins 2
#!/usr/bin/env groovy
node('php') {
    stage('Get code from SCM') {
        checkout(
            [$class: 'GitSCM', branches: [[name: '*/#your-dev-branch#']],
            doGenerateSubmoduleConfigurations: false,
            extensions: [],
            submoduleCfg: [],
@rbrto
rbrto / main.workflow
Created May 25, 2020 22:19 — forked from pahud/main.workflow
GitHub Actions CI/CD with Amazon EKS
workflow "Demo workflow" {
  on = "push"
  resolves = ["SNS Notification"]
}

action "Build Image" {
  uses = "actions/docker/cli@c08a5fc9e0286844156fefff2c141072048141f6"
  runs = ["/bin/sh", "-c", "docker build -t $IMAGE_URI ."]
  env = {
    IMAGE_URI = "xxxxxxxx.dkr.ecr.ap-northeast-1.amazonaws.com/github-action-demo:latest"
@rbrto
rbrto / index.ts
Created May 3, 2020 21:42 — forked from jsdevtom/index.ts
Best practice for connecting to MongoDB from a Google Cloud Function by maintaining persistent connections
import {CustomError} from "./error/custom-error.interface";
require('dotenv').config();
import {RequestHandler} from 'express';
import {MongoClient} from 'mongodb';
let client: MongoClient;
const connectToClientIfDropped: () => Promise<void> = async () => {
    if (client && client.isConnected()) {
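The same pattern in Python, for comparison (my sketch, not from the gist): keep the client in module scope so warm function invocations reuse the existing connection instead of reconnecting on every request.

# Hypothetical Python version of the pattern: module-level client, reused across
# warm invocations; only created when missing.
import os
from pymongo import MongoClient

client = None  # lives for the lifetime of the function instance

def handler(request):
    global client
    if client is None:
        client = MongoClient(os.environ["MONGODB_URI"])  # placeholder env var
    count = client.mydb.mycollection.estimated_document_count()  # placeholder names
    return f"documents: {count}"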
@rbrto
rbrto / index.js
Created April 24, 2020 20:08 — forked from polleyg/index.js
Cloud Function for triggering templated Dataflow pipelines
//gcloud --project=grey-sort-challenge functions deploy goWithTheDataFlow --stage-bucket gs://batch-pipeline --trigger-bucket gs://batch-pipeline
const google = require('googleapis');
exports.goWithTheDataFlow = function(event, callback) {
    const file = event.data;
    const context = event.context;
    console.log("File is: ", file);
    console.log("State is: ", context.eventType);
    if (context.eventType === 'google.storage.object.finalize' && file.name.indexOf('upload/') !== -1) {
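For reference, a hedged Python sketch of the same idea — launching a templated Dataflow job when an object lands under upload/ in the bucket. Project, template path, and parameter names are placeholders, not the gist's values.

# Minimal sketch with placeholder project/template: launch a templated Dataflow job
# from a Cloud Storage "object finalize" event.
import googleapiclient.discovery

def go_with_the_dataflow(event, context):
    if not event["name"].startswith("upload/"):
        return
    dataflow = googleapiclient.discovery.build("dataflow", "v1b3")
    dataflow.projects().templates().launch(
        projectId="grey-sort-challenge",                      # placeholder project
        gcsPath="gs://batch-pipeline/templates/my-template",  # placeholder template path
        body={
            "jobName": "triggered-from-gcs",
            "parameters": {"inputFile": f"gs://{event['bucket']}/{event['name']}"},
        },
    ).execute()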