diff --git a/internal_setup.bzl b/internal_setup.bzl
index a80099f..56eb486 100644
--- a/internal_setup.bzl
+++ b/internal_setup.bzl
@@ -21,7 +21,7 @@ load("@rules_bazel_integration_test//bazel_integration_test:deps.bzl", "bazel_in
load("@rules_bazel_integration_test//bazel_integration_test:repo_defs.bzl", "bazel_binaries")
load("@rules_proto//proto:repositories.bzl", "rules_proto_dependencies", "rules_proto_toolchains")
load("//:version.bzl", "SUPPORTED_BAZEL_VERSIONS")
-load("//python/pip_install:repositories.bzl", "pip_install_dependencies")
+load("//python/pip_install:repositories.bzl", "pip_install_dependencies", "uv_dependencies")
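The diff only shows the new load; presumably the setup macro body gains a matching call next to the existing one. A minimal sketch, with the macro name and the no-argument signature of `uv_dependencies` both assumed rather than taken from the gist:

# Sketch only: macro name and the uv_dependencies() signature are assumptions.
def rules_python_internal_setup():
    pip_install_dependencies()
    uv_dependencies()
    # ...remaining dependency calls (bazel_binaries, rules_proto, etc.) unchanged.
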
import inspect
import itertools

from python.nplusone.core import signals
from sqlalchemy.engine import ScalarResult
from sqlalchemy.orm import attributes, loading, query, strategies


def to_key(instance):
    model = type(instance)
    # The preview ends here; keying on model name plus primary-key identity is
    # an assumed completion, not necessarily the gist's exact code.
    identity = attributes.instance_state(instance).identity or ()
    return ':'.join([model.__name__] + [format(value) for value in identity])

# Bundle the @python3_9 interpreter files into a tar that unpacks under /usr/local/python.
pkg_tar(
    name = "interpreter",
    srcs = [
        "@python3_9//:files",
    ],
    mode = "0644",
    package_dir = "/usr/local/python",
    strip_prefix = ".",
)
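A tarball rooted at /usr/local/python like this is typically consumed as a container layer; a hypothetical consumer using rules_docker's `container_image` (not part of the gist, and the base image name is made up) would look like:

# Hypothetical consumer (not in the gist): unpack the interpreter tar as an image layer.
container_image(
    name = "app_image",
    base = "@ubuntu_base//image",  # base image is an assumption
    tars = [":interpreter"],
)
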
ewhauser / BUILD.bazel
Created April 19, 2022 15:31
Bazel + Gunicorn
gunicorn_binary(
    name = "hello",
    # Gunicorn's standard "module:callable" target: the `app` object in the hello module.
    args = ["hello:app"],
    deps = [
        "@pypi_flask//:pkg",
    ],
)
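Since `hello:app` names a module and a WSGI callable, the rule above presumably runs a module along these lines (a sketch; hello.py itself is not shown in the gist preview):

# hello.py: minimal Flask app matching the "hello:app" target above (sketch).
from flask import Flask

app = Flask(__name__)


@app.route("/")
def index():
    return "Hello from Bazel + Gunicorn!"
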
name: "Terraform"

on:
  push:
    branches:
      - main
    paths:
      - terraform/**
  pull_request:
    branches:
      - main
ewhauser / date_dim.sql
Created December 24, 2019 22:54
Generate Date Dimension Table for BigQuery
SELECT
  FORMAT_DATE('%F', d) AS id,
  d AS full_date,
  EXTRACT(YEAR FROM d) AS year,
  EXTRACT(WEEK FROM d) AS year_week,
  EXTRACT(DAY FROM d) AS year_day,
  EXTRACT(YEAR FROM d) AS fiscal_year,
  FORMAT_DATE('%Q', d) AS fiscal_qtr,
  EXTRACT(MONTH FROM d) AS month,
  FORMAT_DATE('%B', d) AS month_name
  -- (further derived columns are cut off in this preview)
FROM
  -- d comes from a generated calendar; the date range here is an example.
  UNNEST(GENERATE_DATE_ARRAY('2015-01-01', '2030-12-31', INTERVAL 1 DAY)) AS d

SELECT
  repo.name,
  JSON_EXTRACT(payload, '$.issue.number') AS issue_id,
  COUNT(*) AS count,
  JSON_EXTRACT(payload, '$.issue.title') AS title
FROM `githubarchive.month.*`
WHERE type = 'IssueCommentEvent'
  AND repo.name = 'angular/angular'
  AND (_TABLE_SUFFIX LIKE '2019%' OR _TABLE_SUFFIX LIKE '2018%' OR _TABLE_SUFFIX LIKE '2017%')
GROUP BY repo.name, issue_id, title
ORDER BY count DESC
ewhauser / export_audit_logs.py
Created February 3, 2019 21:03
Export LaunchDarkly audit logs to CSV
#!/usr/bin/env python3
"""
Requirements:
pip install requests python-dateutil
"""
import dateutil.parser as dp
import csv
import sys
from optparse import OptionParser
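The preview ends at the imports. The rest of the script presumably reads CLI options, calls LaunchDarkly's `/api/v2/auditlog` endpoint, and writes rows to CSV; a self-contained sketch of that flow follows, with the option names, query parameters, and CSV columns being assumptions rather than the gist's exact code:

#!/usr/bin/env python3
"""Sketch of the full export flow (the gist preview stops at the imports above).

Option names, query parameters, and CSV columns are assumptions.
"""
import csv
import sys
from optparse import OptionParser

import dateutil.parser as dp
import requests

AUDIT_LOG_URL = "https://app.launchdarkly.com/api/v2/auditlog"


def main():
    parser = OptionParser()
    parser.add_option("-k", "--api-key", dest="api_key", help="LaunchDarkly API access token")
    parser.add_option("-a", "--after", dest="after", help="only include entries after this date")
    options, _ = parser.parse_args()

    params = {"limit": 20}
    if options.after:
        # The audit log API takes epoch milliseconds for the `after` parameter.
        params["after"] = int(dp.parse(options.after).timestamp() * 1000)

    resp = requests.get(AUDIT_LOG_URL, headers={"Authorization": options.api_key}, params=params)
    resp.raise_for_status()

    writer = csv.writer(sys.stdout)
    writer.writerow(["date", "kind", "name", "description"])
    for entry in resp.json().get("items", []):
        writer.writerow([entry.get("date"), entry.get("kind"), entry.get("name"), entry.get("description")])


if __name__ == "__main__":
    main()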

Keybase proof

I hereby claim:

  • I am ewhauser on github.
  • I am ewhauser (https://keybase.io/ewhauser) on keybase.
  • I have a public key ASDgt9UmpXcLznvoNPS0LsfUbzCUQPcbaceAg8QC2CAZlgo

To claim this, I am signing this object:

ewhauser / gist:5727229
Created June 7, 2013 05:31
Notes on Facebook's Presto
These notes are from Facebook's Analytics @ Scale conference. I didn't take notes during the presentation or the discussions with developers, so feel free to correct any inconsistencies:
- Presto is an ANSI SQL engine built on top of HDFS
- Similar functionality to Cloudera's Impala
- Facebook started developing this project prior to the Impala announcement, which led to some different design choices
- Implemented in Java
- Queries execute around 10x faster than in Hive; aggregation-based queries can be 100x faster
- Bytecode generation is used for efficient predicate processing
- Efficient fixed memory data structures are used (very low GC overhead)
- Presto daemons do not have to run on data nodes