Zubin J (valiantone)
valiantone / bellhops-archive.md
Last active May 7, 2024 15:41
Work Hard, Play Harder

Transitioning from Tag Clouds to Tag Trees

In the packages paradigm, each packaged selection/offering is a bundle of semi-unique characteristics; let's term these attributes. When a package bundle is selected, it instantly informs our controller that certain attributes are ground truth for this move. The current process introduces assumptions rather than ground truth. For instance, consider Use Case 1; a code sketch follows the facts below.

Use Case 1: Studio Package Bundle

Facts:
    - Two Bellhops on the move
    - Duration of ~2 hours
    - 16-foot moving truck required
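
A minimal sketch of how a selected bundle could hand the controller its attributes as ground truth. The names below (`PackageAttributes`, `PACKAGE_BUNDLES`, `attributes_for`) are illustrative assumptions, not taken from the actual system:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PackageAttributes:
    """Attributes that become ground truth once a bundle is selected."""
    bellhop_count: int
    duration_hours: float
    truck_length_ft: int

# Tag-tree style registry: each bundle maps directly to its attribute set,
# so selecting a bundle yields facts instead of assumptions.
PACKAGE_BUNDLES = {
    "studio": PackageAttributes(bellhop_count=2, duration_hours=2.0, truck_length_ft=16),
}

def attributes_for(bundle: str) -> PackageAttributes:
    # The controller can treat the returned attributes as ground truth for the move.
    return PACKAGE_BUNDLES[bundle]

print(attributes_for("studio"))
```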
valiantone / window_functions.md
Last active June 19, 2021 08:30
SQL queries and examples

Weekly, Monthly Active Emails

WITH
    -- this is your original query, with the ISO week and month number added.
    members_log_aggr(login_date,  year_nbr, iso_week_nbr, month_nbr, email_count) AS
    (
        SELECT
            CAST(ml.login AS Date),
            DATEPART(YEAR, ml.login),
            DATEPART(ISO_WEEK, ml.login),
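
The query above is truncated in this preview. Below is a hedged sketch of one way the CTE and the weekly/monthly rollups could be completed, assuming a `members_log` source table with `login` and `email` columns (assumed names, not from the gist):

```sql
WITH
    members_log_aggr(login_date, year_nbr, iso_week_nbr, month_nbr, email_count) AS
    (
        SELECT
            CAST(ml.login AS DATE),
            DATEPART(YEAR, ml.login),
            DATEPART(ISO_WEEK, ml.login),
            DATEPART(MONTH, ml.login),
            COUNT(DISTINCT ml.email)
        FROM members_log AS ml  -- assumed source table
        GROUP BY CAST(ml.login AS DATE), DATEPART(YEAR, ml.login),
                 DATEPART(ISO_WEEK, ml.login), DATEPART(MONTH, ml.login)
    )
-- Weekly and monthly rollups via window functions. Note: summing daily
-- distinct counts can double-count an email that logs in on several days.
SELECT DISTINCT
    year_nbr,
    iso_week_nbr,
    month_nbr,
    SUM(email_count) OVER (PARTITION BY year_nbr, iso_week_nbr) AS weekly_active_emails,
    SUM(email_count) OVER (PARTITION BY year_nbr, month_nbr)    AS monthly_active_emails
FROM members_log_aggr;
```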
valiantone / pandas_s3_streaming.py
Created November 30, 2021 15:01 — forked from uhho/pandas_s3_streaming.py
Streaming pandas DataFrame to/from S3 with on-the-fly processing and GZIP compression
import gzip
import pandas as pd

def s3_to_pandas(client, bucket, key, header=None):
    # get key using boto3 client
    obj = client.get_object(Bucket=bucket, Key=key)
    # wrap the streaming body so the gzip content is decompressed on the fly
    gz = gzip.GzipFile(fileobj=obj['Body'])
    # load stream directly to DF
    return pd.read_csv(gz, header=header, dtype=str)

def s3_to_pandas_with_processing(client, bucket, key, header=None):
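
A minimal usage sketch for `s3_to_pandas`, assuming a boto3 client and a hypothetical bucket/key (neither comes from the original gist):

```python
import boto3

# Bucket and key names are illustrative placeholders.
s3 = boto3.client("s3")
df = s3_to_pandas(s3, bucket="my-data-bucket", key="logs/members_log.csv.gz")
print(df.head())
```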