import math
from sklearn.cluster import DBSCAN
import numpy as np
import statistics
from collections import Counter

def group_by_visual_row(data, eps=5, min_samples=1):
    # Cluster report rows using the DBSCAN clustering algorithm to group OCR lines.
    # DBSCAN (Density-Based Spatial Clustering of Applications with Noise) can help
    # because it does not need the number of rows up front. Assumes each item in
    # `data` is a dict with a "y" coordinate.
    ys = np.array([[item["y"]] for item in data])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(ys).labels_
    rows = {}
    for label, item in zip(labels, data):
        rows.setdefault(label, []).append(item)
    return [rows[k] for k in sorted(rows)]
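In one dimension with min_samples=1, DBSCAN effectively reduces to splitting wherever the gap between neighboring y-coordinates exceeds eps. A minimal stdlib-only sketch of that behavior (no sklearn; the "text"/"y" dict shape is an assumption for illustration):

```python
def group_rows_by_gap(lines, eps=5):
    # Sort lines top-to-bottom, then start a new row whenever the vertical
    # gap to the previous line exceeds eps (mirrors 1-D DBSCAN, min_samples=1).
    ordered = sorted(lines, key=lambda l: l["y"])
    rows = []
    for line in ordered:
        if rows and line["y"] - rows[-1][-1]["y"] <= eps:
            rows[-1].append(line)
        else:
            rows.append([line])
    return rows

ocr = [{"text": "Qty", "y": 10}, {"text": "Item", "y": 12},
       {"text": "2", "y": 40}, {"text": "Widget", "y": 41}]
rows = group_rows_by_gap(ocr, eps=5)
# rows[0] holds the header line, rows[1] the first data line
```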
#!/bin/bash
TOKEN=$(gcloud auth application-default print-access-token)
PROJECT_ID="663188713804"
DOCUMENT_ID="3m69velh07nj0" # 05079-86913.pdf
SOURCE_FOLDER_ID="2n0md82gaqn28" # Unreviewed
DESTINATION_FOLDER_ID="5hrrjcq9h3150" # Reviewed
ENDPOINT="https://contentwarehouse.googleapis.com/v1/projects/$PROJECT_ID/locations/us/documents/$DOCUMENT_ID/linkedSources"
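The script presumably calls this endpoint with the bearer token. A hedged Python sketch of the request construction only; the HTTP verb and any request body should be checked against the Content Warehouse API reference:

```python
import urllib.request

PROJECT_ID = "663188713804"
DOCUMENT_ID = "3m69velh07nj0"
ENDPOINT = (
    f"https://contentwarehouse.googleapis.com/v1/projects/{PROJECT_ID}"
    f"/locations/us/documents/{DOCUMENT_ID}/linkedSources"
)

def linked_sources_request(token):
    # `token` comes from `gcloud auth application-default print-access-token`,
    # exactly as in the shell script above.
    return urllib.request.Request(
        ENDPOINT, headers={"Authorization": f"Bearer {token}"}
    )
```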
#!/bin/bash
TOKEN=$(gcloud auth application-default print-access-token)
PROJECT_ID="663188713804"
DOCUMENT_ID="29oiqkhlqm5o8"
ENDPOINT="https://contentwarehouse.googleapis.com/v1/projects/$PROJECT_ID/locations/us/documents/$DOCUMENT_ID:setAcl"
# Set the policy
POLICY='{
"policy": {
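The truncated POLICY above presumably continues with an IAM-style bindings list. A sketch of a plausible body; the role and member strings below are illustrative assumptions, not taken from the script, so check the Content Warehouse ACL documentation for real role names:

```python
import json

policy_body = {
    "policy": {
        "bindings": [
            {
                # Role and member are hypothetical examples only.
                "role": "roles/contentwarehouse.documentViewer",
                "members": ["user:reviewer@example.com"],
            }
        ]
    }
}
payload = json.dumps(policy_body)
```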
#!/bin/bash
# This script logs into the Looker UI using the specified credentials and generates a new API3 key. It was written on Mac OS X
# and may need to be slightly adapted if running on Linux (possibly just replacing ggrep with grep)
if [ ! -f /usr/local/bin/ggrep ]; then
echo "Install GNU grep by running 'brew install grep'"
exit 1
fi
if [ ! -f /usr/local/bin/jq ]; then
echo "Install jq by running 'brew install jq'"
exit 1
fi
"""
This is a Google Cloud Function; it can be deployed like this:
gcloud functions deploy download_full_look --env-vars-file .env.yaml --runtime python37 --trigger-http --allow-unauthenticated
You need a requirements.txt file in the same directory to specify the Flask
and Looker SDK versions, and a .env.yaml file to contain the environment
variable values for the Looker SDK.
Read logs like this:
gcloud functions logs read download_full_look
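The two support files might look like this. The package list is illustrative, and the LOOKERSDK_* names are the standard variables the Looker SDK reads from the environment; your base URL and credentials will differ:

requirements.txt:

```
flask
looker_sdk
```

.env.yaml:

```yaml
LOOKERSDK_BASE_URL: "https://yourinstance.looker.com:19999"
LOOKERSDK_CLIENT_ID: "your-client-id"
LOOKERSDK_CLIENT_SECRET: "your-client-secret"
```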

Terraform for Looker cluster on GCP

  1. Create a working directory that contains the provided setup-gcp-infrastructure-for-looker.tf and setup-looker.sh files and navigate to it in a terminal.
  2. Accept the Looker EULA for your license key so you can download the JAR programmatically. You only need to accept the agreement; you do not need to download the files.
  3. Download an appropriate Terraform binary and ensure terraform is in your $PATH.
  4. Install gcloud, then log in from the command line by typing gcloud auth application-default login, and follow these instructions to create an admin project that can provision additional projects. You will also need to add two more IAM bindings:
    gcloud organizations add-iam-policy-binding ${TF_VAR_org_id} \
    --member serviceAccount:terraform@$
    
import csv
import argparse
from looker_sdk import client
"""
The purpose of this script is to parse a CSV file containing a list of emails separated by line breaks. If
an email in the file corresponds to a user in the Looker instance referred to in looker.ini, that user will
be disabled automatically.
"""
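The core matching logic can be sketched without the SDK. Here `known_users` is a stand-in for the Looker user lookup (in the real script this would be an SDK call), and the one-email-per-line CSV layout follows the description above:

```python
import csv
import io

def emails_to_disable(csv_text, known_users):
    # Parse one email per line from the CSV and keep only those that match
    # a user in the instance (here, a stand-in dict keyed by email).
    reader = csv.reader(io.StringIO(csv_text))
    matched = []
    for row in reader:
        if not row:
            continue
        email = row[0].strip().lower()
        if email in known_users:
            matched.append(email)
    return matched

users = {"a@example.com": 1, "b@example.com": 2}
todo = emails_to_disable("a@example.com\nc@example.com\n", users)
# todo == ["a@example.com"]
```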
#!/bin/bash
# Author: drew.gillson@looker.com
#
# The purpose of this script is to copy a LookML project and some of its dependencies (connections, user attributes, models) from
# a source environment to a destination environment. This script will not copy users, groups, roles, or permission sets.
# Please update the following environment variables:
export SRC_REPO_URL="https://github.com/drewgillson/looker-data-applications-exercise.git"
export SRC_CLIENT_ID="..."
export SRC_CLIENT_SECRET="..."
drewgillson / lookml_field_label_updater.py
Created November 20, 2019 21:22
This Python script automatically adds labels and descriptions saved in a CSV file to fields within LookML project files.
import csv
import argparse
import lkml
import os
"""
This script automatically adds labels and descriptions saved in a CSV file to fields within LookML project files.
Warning! Since this script doesn't check for fully-scoped field names, if you specify an update to a field with
a common name, like "count", every instance of a field called "count" in your LookML project will be updated with
the same label and description.
"""
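The update step can be sketched over lkml-style parsed output. The parsed structure below mirrors the dict-of-lists shape that `lkml.load` produces, and the `updates` mapping (field name to label/description) is an assumption about the CSV contents:

```python
def apply_labels(parsed, updates):
    # `updates` maps field name -> {"label": ..., "description": ...}.
    # Every matching dimension/measure is updated, which is why fields with
    # common, non-fully-scoped names like "count" all get the same values.
    changed = 0
    for view in parsed.get("views", []):
        for section in ("dimensions", "measures"):
            for field in view.get(section, []):
                if field.get("name") in updates:
                    field.update(updates[field["name"]])
                    changed += 1
    return changed

parsed = {"views": [{"name": "orders",
                     "dimensions": [{"name": "id"}],
                     "measures": [{"name": "count"}]}]}
n = apply_labels(parsed, {"count": {"label": "Order Count"}})
# n == 1 and the "count" measure now carries the new label
```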
dashboard cat

The dashboard cat command is used to output the JSON document that describes the dashboard.

$ gzr dashboard cat 192 --host foo.bar.mycompany.com
JSON data for dashboard

The output document can be very long. Usually one will use the switch --dir DIRECTORY to
write the output to a file in that directory instead of printing it to the screen.