Eric E. Intelrunner

@Intelrunner
Intelrunner / soft_delete_clear.py
Last active April 24, 2024 16:45
Loops through all projects a user has available in gcloud and updates all of their buckets to clear the soft delete policy
import subprocess

# Function to run shell commands
def run_command(command):
    try:
        output = subprocess.check_output(command, shell=True, text=True)
        return output.strip()  # Return output with whitespace stripped
    except subprocess.CalledProcessError as e:
        print(f"Command failed with error: {e}")
        return None
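The preview cuts off before the main loop. A minimal sketch of how it might continue, assuming the `gcloud storage buckets update --clear-soft-delete` flag and `value(name)` list output; the helper names other than `run_command` are hypothetical:

```python
import subprocess

def run_command(command):
    """Run a shell command, returning stripped stdout or None on failure."""
    try:
        return subprocess.check_output(command, shell=True, text=True).strip()
    except subprocess.CalledProcessError as e:
        print(f"Command failed with error: {e}")
        return None

def clear_soft_delete_command(bucket_name):
    """Build the per-bucket update command (construction only, not executed)."""
    return f"gcloud storage buckets update gs://{bucket_name} --clear-soft-delete"

def clear_all_projects():
    """Loop over every visible project and clear soft delete on its buckets."""
    projects = run_command("gcloud projects list --format='value(projectId)'") or ""
    for project in projects.splitlines():
        buckets = run_command(
            f"gcloud storage buckets list --project={project} --format='value(name)'"
        ) or ""
        for bucket in buckets.splitlines():
            run_command(clear_soft_delete_command(bucket))
```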
Intelrunner / bq_check.sh
Created March 5, 2024 14:22
Check all projects in a GCP org to see if an API is enabled.
#!/bin/bash
# The Organization ID (Replace with your organization's ID)
ORG_ID="your-organization-id"
# The API to check for (replace as needed)
API_NAME="bigquery.googleapis.com"
# Gather all projects under the specified organization
PROJECT_IDS=$(gcloud projects list --organization=${ORG_ID} --format="value(projectId)")
# Loop through each project and report whether the API is enabled
# (the loop body below is a completion sketch; the preview cut off here)
for PROJECT_ID in ${PROJECT_IDS}; do
  ENABLED=$(gcloud services list --enabled --project="${PROJECT_ID}" --filter="config.name=${API_NAME}" --format="value(config.name)")
  [ -n "${ENABLED}" ] && echo "${PROJECT_ID}: enabled" || echo "${PROJECT_ID}: NOT enabled"
done
Intelrunner / findFunctionDockerRegistry.sh
Created February 28, 2024 18:24
Lists the current Docker Registry being used to deploy all Cloud Functions in a project
#!/bin/bash
# Set your GCP project ID
PROJECT_ID=$(gcloud config get-value project)
# List all functions in the project and iterate over them
for FUNCTION_NAME in $(gcloud functions list --project=$PROJECT_ID --format="value(name)")
do
  echo "Function: $FUNCTION_NAME"
  # Describe each function to get the dockerRegistry value
  # (completion sketch; 1st gen functions -- --region may be required depending on setup)
  gcloud functions describe "$FUNCTION_NAME" --project=$PROJECT_ID --format="value(dockerRegistry)"
done
Intelrunner / updateDatasetByProject.py
Last active February 28, 2024 14:36
A Google Cloud Function that takes a list of projects, gets all BQ datasets in each, and updates each dataset to the "PHYSICAL" storage billing model.
"""
This (Google Cloud) Function will scan all a list of provided projects and get all Bigquery datasets for each of those projects.
It will then update the storage billing model for each dataset to be 'PHYSICAL' instead of 'LOGICAL'. This is useful for large datasets that are frequently accessed, as it can reduce costs.
Required Permissions:
- The service account running this function must have the following roles:
to be updated---
Returns:
string: A message indicating the function has completed.
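The function body is not shown in the preview. A minimal sketch of the per-dataset update, assuming the `storage_billing_model` dataset property and field-mask style `update_dataset` of the `google-cloud-bigquery` client; the helper name is hypothetical:

```python
def set_physical_billing(client, dataset_ref):
    """Switch one dataset's storage billing model to PHYSICAL.

    `client` is expected to behave like google.cloud.bigquery.Client
    (get_dataset / update_dataset with a field mask).
    """
    dataset = client.get_dataset(dataset_ref)
    dataset.storage_billing_model = "PHYSICAL"
    # Only send the one changed field in the update mask
    return client.update_dataset(dataset, ["storage_billing_model"])
```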
Intelrunner / bq_dataset_billing_model_change.py
Last active February 27, 2024 20:46
GCP Cloud Function that takes an Eventarc event of "InsertDataset" for BQ and immediately changes the billing model to physical for that dataset. Returns a simple log output.
import functions_framework
from google.cloud import bigquery

def extract_dataset_name(full_path):
    """
    Extracts the dataset name from a full BigQuery dataset resource path.

    Args:
    - full_path (str): The full resource path of the dataset,
      formatted as 'projects/{project_id}/datasets/{dataset_id}'.
    """
    return full_path.rsplit("/", 1)[-1]  # completion sketch: last path segment
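The rest of the function is not shown in the preview. A hedged sketch of how the extraction could feed the event handler; the `protoPayload`/`resourceName` keys are assumptions about an audit-log style Eventarc payload, not taken from the gist:

```python
def extract_dataset_name(full_path):
    """Return the final path segment of 'projects/{p}/datasets/{d}'."""
    return full_path.rsplit("/", 1)[-1]

def handle_insert_dataset(event_data):
    """Pull the dataset id out of the event payload and report it.

    The payload field names below are assumed, not from the gist.
    """
    resource = event_data.get("protoPayload", {}).get("resourceName", "")
    return f"InsertDataset for: {extract_dataset_name(resource)}"
```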
Intelrunner / cloudbuild.yaml
Created December 18, 2023 16:13
cloudbuild.yaml for Go that pulls a module from Artifact Registry and builds for Cloud Functions.
steps:
- name: golang
entrypoint: go
args: [ 'run', 'github.com/GoogleCloudPlatform/artifact-registry-go-tools/cmd/auth@v0.1.0', 'add-locations', '--locations=${_LOCATION}' ]
env:
# Set GOPROXY to the public proxy to pull the credential helper tool
- 'GOPROXY=proxy.golang.org'
- name: golang
entrypoint: go
Intelrunner / project_structure_build.sh
Last active April 20, 2023 20:38
This gist builds out a simple directory structure for a user-facing app and grabs a general .gitignore from a gist
#!/bin/bash
# prompt user for monorepo name
read -p "Enter monorepo name: " monorepo_name
# check if monorepo directory already exists
if [ -d "$monorepo_name" ]; then
  echo "Monorepo directory already exists. Skipping creation process."
else
  # create the main directory (completion sketch; preview cut off here)
  mkdir -p "$monorepo_name"
fi
Intelrunner / ce_to_spanner.py
Last active April 19, 2023 17:28
Calculating Spanner Costs from GCE
class GCPCostCalculator:
    def __init__(self):
        self.sql_cores = 0
        self.sql_ram = 0
        self.sql_disk_size = 0
        self.sql_disk_type = ""
        self.sql_num_machines = 0
        self.spanner_nodes = 0
        self.spanner_region = ""

gcp_machine_family_pricing = {
    'N1': {
        'us-central1': {
            'core_cost': 0.031611,  # Cost per core-hour
            'ram_cost': 0.004237,   # Cost per GB-hour of RAM
            'cud_discount': {
                '1-year': 0.3,   # 30% discount
                '3-year': 0.57,  # 57% discount
            },
        },
    },
}
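A hedged sketch of how the pricing table might be consumed; the helper name and signature are assumptions, not from the gist:

```python
gcp_machine_family_pricing = {
    'N1': {
        'us-central1': {
            'core_cost': 0.031611,  # Cost per core-hour
            'ram_cost': 0.004237,   # Cost per GB-hour of RAM
            'cud_discount': {'1-year': 0.3, '3-year': 0.57},
        },
    },
}

def hourly_machine_cost(family, region, cores, ram_gb, commitment=None):
    """On-demand hourly cost for one machine, optionally CUD-discounted."""
    price = gcp_machine_family_pricing[family][region]
    cost = cores * price['core_cost'] + ram_gb * price['ram_cost']
    if commitment is not None:
        cost *= 1 - price['cud_discount'][commitment]
    return cost
```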
Intelrunner / reccy.sh
Last active June 15, 2022 20:05
Starting point to process all output from Google Cloud Compute Engine Recommendation Engine
#!/bin/bash
for project in $(gcloud projects list --format="value(projectId)")
do
  # --format="value(name)" yields one recommendation name per line instead of
  # a table, so the word-splitting inner loop iterates cleanly
  for recommendation in $(gcloud recommender recommendations list --project=$project --location=global --recommender=google.compute.instance.MachineTypeRecommender --format="value(name)")
  do
    # Here is where the logic for your 'what to do with each recommendation' goes
    echo "$recommendation"
  done
done