Lewis Rodgers (lewisrodgers)

@lewisrodgers
lewisrodgers / README.md
Last active March 26, 2018 01:01
App Engine boilerplate

Some starter files to get up and running with App Engine and Flask.

Place home.html in a folder called templates.

The folder structure will look something like this:

— root/
  — lib/
  — templates/
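
A minimal main.py that pairs with this layout might look like the following sketch (the route and file names are assumptions, not part of the original gist):

from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def home():
    # renders templates/home.html
    return render_template("home.html")

if __name__ == "__main__":
    # local development server; App Engine supplies its own entrypoint
    app.run(host="127.0.0.1", port=8080, debug=True)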
@lewisrodgers
lewisrodgers / README.md
Last active April 9, 2024 09:50
Python boilerplate for a service account OAuth setup that returns data from a Google API

Prerequisites

For a more detailed version of the steps below visit: https://developers.google.com/admin-sdk/directory/v1/guides/delegation

  • Enable the necessary APIs (calendar, drive, gmail, etc.) in Cloud Console
  • Create a service account, download the json key file, and enable domain wide delegation (DwD)
  • Determine the required scopes (calendar readonly, drive read/write, etc.)
  • In the G Suite console, an admin authorizes the service account (client ID) with the specified scopes
  • The admin also identifies an account, with proper admin privileges, that the service account will impersonate (useful when you want to access a user's data without manual authorization from the user)
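
With those pieces in place, a minimal Python sketch of the delegated call might look like this (the key file name, scope, and admin address are assumptions):

from google.oauth2 import service_account
from googleapiclient.discovery import build

# The scope and impersonated admin must match what was authorized
# in the G Suite console; key.json is the downloaded key file.
SCOPES = ["https://www.googleapis.com/auth/admin.directory.user.readonly"]

credentials = service_account.Credentials.from_service_account_file(
    "key.json", scopes=SCOPES)
delegated = credentials.with_subject("admin@example.com")  # impersonation

service = build("admin", "directory_v1", credentials=delegated)
users = service.users().list(customer="my_customer", maxResults=10).execute()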
@lewisrodgers
lewisrodgers / beaglebone-pubsub.py
Last active January 22, 2018 17:32
Send sensor data to Cloud Pub/Sub from a BeagleBone
import time
# Cloud Pub/Sub ###########################
from google.cloud import pubsub

PROJECT_ID = "your-project-id"  # placeholder
TOPIC_NAME = "your-topic"       # placeholder
client = pubsub.PublisherClient()
topic = client.topic_path(PROJECT_ID, TOPIC_NAME)

def send_to_pubsub(message, data):
    # payloads must be bytes; extra keyword args become message attributes
    client.publish(topic, data=data.encode("utf-8"), sensor=message)
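
A hedged usage sketch (read_sensor is a stand-in for the actual BeagleBone sensor code):

def read_sensor():
    # placeholder: replace with the real ADC/GPIO read
    return "23.5"

while True:
    send_to_pubsub("temperature", read_sensor())
    time.sleep(10)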
@lewisrodgers
lewisrodgers / gcp-exam-resources.md
Created January 19, 2018 20:53
Google Cloud Platform exam resources

Google Cloud Platform certification exam resources

A collection of case studies, white papers, articles, books, and other resources to help prepare you for a Google Cloud Platform certification or two.

If you're interested in a particular topic, a good place to start is the Tutorials and Solutions section of cloud.google.com. Search by keyword or browse around. Otherwise, I've curated some of the articles I think would be helpful and added them below.

@lewisrodgers
lewisrodgers / chrome-web-store-api-deploy-utility.sh
Last active August 14, 2021 00:20
Bash script designed to update and publish a chrome extension that already exists in the Chrome Web Store.
#!/bin/bash
# This script is designed to update and publish a chrome extension
# that already exists in the Chrome Web Store. It also depends on
# OAuth 2.0 credentials and a refresh token.
#
# Required arguments
# ==================
# There are 2 actions you can perform:
# 1. Update - upload a chrome extension to the Chrome Web Store
# 2. Publish - make the uploaded version live in the store

#!/bin/bash
# This script is used to update a chrome extension that's already
# been uploaded to the Chrome Web Store. You'll need to know the
# chrome extension's ID that you want to update.
# To find the app ID see: https://developer.chrome.com/webstore/publish#get-the-app-id
#
# usage:
# ./chrome-web-store-api-update-utility.sh $APP_ID
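
The two calls these scripts make can be sketched in Python against the documented v1.1 endpoints (APP_ID, the token value, and the zip name are placeholders):

import requests

APP_ID = "your-app-id"         # placeholder
ACCESS_TOKEN = "ya29.Glst..."  # from the OAuth refresh-token flow
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "x-goog-api-version": "2"}

# Update: upload the new package over the existing item
with open("crx.zip", "rb") as f:
    resp = requests.put(
        f"https://www.googleapis.com/upload/chromewebstore/v1.1/items/{APP_ID}",
        headers=HEADERS, data=f)
resp.raise_for_status()

# Publish: make the uploaded version live
resp = requests.post(
    f"https://www.googleapis.com/chromewebstore/v1.1/items/{APP_ID}/publish",
    headers=HEADERS)
resp.raise_for_status()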
@lewisrodgers
lewisrodgers / gcp-get-tokens-utility.sh
Last active December 29, 2017 14:41
Retrieve OAuth 2.0 access and refresh tokens for Google Cloud Platform
#!/bin/bash
# This script generates a `tokens.json` file.
#
# example:
# {
#   "access_token" : "ya29.Glst...",
#   "expires_in" : 3600,
#   "refresh_token" : "1/dd3t...",
#   "token_type" : "Bearer"
# }
# Possible versioning strategy.
# Only integers are allowed in the `version` property of manifest.json,
# so a human-readable build label goes in `version_name` instead.
script:
  - GIT_HASH=$(git rev-parse --short HEAD)
  # sponge (from moreutils) soaks up stdin before overwriting the file
  - jq ".version_name = \"build-$GIT_HASH\"" manifest.json | sponge manifest.json
@lewisrodgers
lewisrodgers / bitbucket-pipelines.yml
Last active February 26, 2019 11:36
Chrome extension basic pipeline
pipelines:
  branches:
    develop:
      - step:
          name: Update
          script:
            - apt-get update
            - apt-get -y install jq zip
            - FILE_NAME=crx.zip
            - zip -r $FILE_NAME ./app
@lewisrodgers
lewisrodgers / README.md
Last active April 4, 2018 17:00
Deploying Cloud Functions to schedule a Python Dataflow pipeline.

Use this as a guide, but instead of the Java runtime we'll use Python.

The setup can be accomplished with virtualenv to create an isolated environment.

The project folder structure will look like this:

cloudfunction/
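
A hedged sketch of the function itself, assuming the pipeline is staged as a Dataflow template (the project ID, bucket, and template path are placeholders):

from googleapiclient.discovery import build

def launch_dataflow(request):
    # launches a templated Dataflow job using the function's default credentials
    service = build("dataflow", "v1b3")
    body = {
        "jobName": "scheduled-pipeline",  # placeholder
        "environment": {"tempLocation": "gs://your-bucket/temp"},  # placeholder
    }
    service.projects().templates().launch(
        projectId="your-project-id",                         # placeholder
        gcsPath="gs://your-bucket/templates/your-template",  # placeholder
        body=body,
    ).execute()
    return "ok"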