from airflow.decorators import dag
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator


@dag()
def workflow():
    # Submit a PySpark job; the two S3 paths are forwarded to the
    # script as positional command-line arguments.
    spark_submit = SparkSubmitOperator(
        task_id='spark_submit',
        application='path/to/job/script.py',
        application_args=[
            's3://example-bucket/path/to/input',
            's3://example-bucket/path/to/output',
        ],
    )


sample_dag = workflow()
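The `application_args` list is passed through `spark-submit` to the job script as positional command-line arguments. A minimal sketch of how `script.py` might consume them (the helper name and the dict keys are assumptions for illustration, not part of the gist):

```python
import sys


def parse_job_args(argv):
    """Map the positional arguments supplied via application_args
    onto named input/output paths (hypothetical helper)."""
    if len(argv) != 2:
        raise ValueError("expected exactly two arguments: <input> <output>")
    input_path, output_path = argv
    return {"input": input_path, "output": output_path}


if __name__ == "__main__":
    paths = parse_job_args(sys.argv[1:])
    # A real job would build a SparkSession here and read from
    # paths["input"], then write results to paths["output"].
    print(paths)
```

Because the arguments are purely positional, their order in `application_args` must match the order the script expects.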