from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'arnaud',
    'start_date': datetime(2019, 1, 1),
    # note: retry_delay only takes effect together with a non-zero 'retries'
    'retry_delay': timedelta(minutes=5),
}
# Using the context manager allows you not to duplicate the dag parameter in each operator
with DAG('S3_dag_test', default_args=default_args, schedule_interval='@once') as dag:

    start_task = DummyOperator(
        task_id='dummy_start'
    )

    upload_to_S3_task = PythonOperator(
        task_id='upload_file_to_S3',
        # the callable is invoked with no positional arguments, so it must not expect any
        python_callable=lambda: print("Uploading file to S3")
    )

    # Use arrows to set dependencies between tasks
    start_task >> upload_to_S3_task
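
# The >> arrow is shorthand for set_downstream; the line above is equivalent to
# start_task.set_downstream(upload_to_S3_task).
#
# A minimal way to exercise a single task without the scheduler, assuming this
# file sits in your configured dags folder (Airflow 1.x CLI):
#   airflow test S3_dag_test upload_file_to_S3 2019-01-01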