quickrun

An attempt at a "quick" way to run a Python shell in an Astro project environment with a local database and a configured Airflow, but with no Airflow components or anything else running. The script must be run inside an Astro project, and on the first run `astro dev start` must be running so the script can gather its configuration; all later runs read from a cached `.quickstart` file. Delete the `.quickstart` file to regenerate it if needed (though that shouldn't be required).
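The script itself isn't reproduced in this listing, but the caching behavior described above can be sketched roughly as follows. The file contents and variable names here are illustrative assumptions, not the actual gist: the real script inspects the running Astro project for its settings.

```shell
#!/usr/bin/env sh
# Sketch: capture configuration from the running Astro project on the first
# run, then reuse the cached copy so later runs don't need `astro dev start`.
CACHE_FILE="./.quickstart"

if [ ! -f "$CACHE_FILE" ]; then
  # First run: the Astro containers must be up so settings can be captured.
  # (Illustrative value; the real script reads it from the running project.)
  echo 'AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql://postgres:postgres@localhost:5432/postgres' > "$CACHE_FILE"
fi

# Every run: load the cached settings; a Python shell launched with this
# environment sees the local database and Airflow configuration.
. "$CACHE_FILE"
export AIRFLOW__CORE__SQL_ALCHEMY_CONN
echo "Using $AIRFLOW__CORE__SQL_ALCHEMY_CONN"
```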

Usage

fritz-astronomer / README.md
Last active October 21, 2023 05:15
Telescope DAGs

Telescope

Telescope is a tool to get a bird's-eye view of one or many Airflow deployments.

Telescope Processor

Output Preview

Telescope reports are processed and visualized in a BI tool called Sigma. Some samples: (screenshot not shown)

fritz-astronomer / .env
Last active February 6, 2023 19:07
Metrics Export - place in `include/statsd_forwarder.py`
CLOUDWATCH_NAMESPACE=celestial-spaceship-2949
CLOUDWATCH_CONN_ID=cloudwatch_test
AIRFLOW__METRICS__STATSD_CUSTOM_CLIENT_PATH=include.statsd_forwarder.CloudwatchStatsdForwarderClient
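The `include/statsd_forwarder.py` module itself isn't reproduced in this listing, only the `.env` that points Airflow at it. A minimal sketch of what such a class might look like follows; the constructor signature, buffering, and method bodies are illustrative assumptions, since the real gist presumably ships each metric to CloudWatch (e.g. via boto3's `put_metric_data`) using the namespace and connection configured above.

```python
import os


class CloudwatchStatsdForwarderClient:
    """Sketch of a statsd-compatible client loadable via
    AIRFLOW__METRICS__STATSD_CUSTOM_CLIENT_PATH.

    Airflow calls the usual statsd methods (incr/decr/gauge/timing) on this
    object. Here the metrics are only buffered; the real forwarder would
    flush them to CloudWatch under CLOUDWATCH_NAMESPACE.
    """

    def __init__(self, host=None, port=None, prefix=None):
        # Namespace comes from the .env shown above (assumed default otherwise).
        self.namespace = os.environ.get("CLOUDWATCH_NAMESPACE", "airflow")
        self.prefix = prefix or "airflow"
        self.buffer = []  # (metric_name, value, unit) tuples awaiting flush

    def _record(self, stat, value, unit):
        self.buffer.append((f"{self.prefix}.{stat}", value, unit))

    def incr(self, stat, count=1, rate=1, *args, **kwargs):
        self._record(stat, count, "Count")

    def decr(self, stat, count=1, rate=1, *args, **kwargs):
        self._record(stat, -count, "Count")

    def gauge(self, stat, value, rate=1, delta=False, *args, **kwargs):
        self._record(stat, value, "None")

    def timing(self, stat, dt, *args, **kwargs):
        self._record(stat, dt, "Milliseconds")
```

With this in place, Airflow's own metric calls (e.g. `incr("dag_processing.processes")`) land in `buffer` as namespaced tuples ready to be forwarded.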
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator  # Airflow 1.x path; 2.x uses airflow.operators.python
from airflow.contrib.hooks.aws_hook import AwsHook  # Airflow 1.x path; 2.x uses airflow.providers.amazon


def export(bucket, object_name, aws_conn_id):
    # Imports deferred into the task so DAG file parsing stays light.
    import csv

    from airflow.models import Connection
    from airflow.utils import db

    # Dump every connection in the metadata DB to a local CSV, then upload it.
    with db.create_session() as session:
        connections = session.query(Connection).all()
    with open(object_name, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["conn_id", "conn_type", "host", "login"])
        for conn in connections:
            writer.writerow([conn.conn_id, conn.conn_type, conn.host, conn.login])
    AwsHook(aws_conn_id).get_client_type("s3").upload_file(object_name, bucket, object_name)