Gist by @prat3ik, created January 23, 2020 11:31: log of a local airflow scheduler run.
iMacs-iMac-2:airflow imac$ airflow scheduler
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  / _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/  /_/  /_/  \____/____/|__/
[2020-01-23 16:43:20,340] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:43:20,345] {scheduler_job.py:1460} INFO - Starting the scheduler
[2020-01-23 16:43:20,345] {scheduler_job.py:1467} INFO - Processing each file at most -1 times
[2020-01-23 16:43:20,345] {scheduler_job.py:1470} INFO - Searching for files in /Users/imac/airflow/dags
[2020-01-23 16:43:20,349] {scheduler_job.py:1472} INFO - There are 29 files in /Users/imac/airflow/dags
[2020-01-23 16:43:20,349] {scheduler_job.py:1525} INFO - Resetting orphaned tasks for active dag runs
[2020-01-23 16:43:20,359] {dag_processing.py:350} INFO - Launched DagFileProcessorManager with pid: 26531
[2020-01-23 16:43:20,364] {settings.py:51} INFO - Configured default timezone <Timezone [UTC]>
[2020-01-23 16:43:20,373] {dag_processing.py:555} WARNING - Because we cannot use more than 1 thread (max_threads = 2 ) when using sqlite. So we set parallelism to 1.
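
[Editor's note] This warning is the key constraint in the rest of the trace: with a SQLite metadata database, Airflow caps parallelism at 1 and, as logged at startup, uses the SequentialExecutor, so every task below runs one at a time. For real parallelism one would point Airflow at Postgres or MySQL and switch executors. A hedged airflow.cfg sketch (illustrative values and connection string, not taken from this gist):

    # airflow.cfg -- illustrative sketch; the connection string is an assumption
    [core]
    executor = LocalExecutor
    sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost:5432/airflow
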
[2020-01-23 16:45:08,879] {scheduler_job.py:1100} INFO - 4 tasks up for execution:
<TaskInstance: example_bash_operator.runme_0 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.also_run_this 2020-01-21 00:00:00+00:00 [scheduled]>
[2020-01-23 16:45:08,889] {scheduler_job.py:1131} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 4 task instances ready to be queued
[2020-01-23 16:45:08,889] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 0/16 running and queued tasks
[2020-01-23 16:45:08,889] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 1/16 running and queued tasks
[2020-01-23 16:45:08,890] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 2/16 running and queued tasks
[2020-01-23 16:45:08,890] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 3/16 running and queued tasks
[2020-01-23 16:45:08,893] {scheduler_job.py:1209} INFO - Setting the following tasks to queued state:
<TaskInstance: example_bash_operator.runme_0 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.also_run_this 2020-01-21 00:00:00+00:00 [scheduled]>
[2020-01-23 16:45:08,903] {scheduler_job.py:1289} INFO - Setting the following 4 tasks to queued state:
<TaskInstance: example_bash_operator.runme_0 2020-01-21 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_1 2020-01-21 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_2 2020-01-21 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.also_run_this 2020-01-21 00:00:00+00:00 [queued]>
[2020-01-23 16:45:08,903] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'runme_0', datetime.datetime(2020, 1, 21, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 3 and queue default
[2020-01-23 16:45:08,903] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_0', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:45:08,904] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'runme_1', datetime.datetime(2020, 1, 21, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 3 and queue default
[2020-01-23 16:45:08,904] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_1', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:45:08,904] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'runme_2', datetime.datetime(2020, 1, 21, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 3 and queue default
[2020-01-23 16:45:08,904] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_2', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:45:08,904] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'also_run_this', datetime.datetime(2020, 1, 21, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 2 and queue default
[2020-01-23 16:45:08,905] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'also_run_this', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
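
[Editor's note] The "priority 3" / "priority 2" values in the Sending lines above come from Airflow's default weight_rule="downstream": a task's effective priority is its own priority_weight (default 1) plus the weights of everything downstream of it. In the shipped example_bash_operator, each runme_* feeds run_after_loop, which feeds a final run_this_last task (1 + 1 + 1 = 3), while also_run_this feeds only run_this_last (1 + 1 = 2). A minimal sketch of the relevant operator arguments, assuming Airflow 2.x import paths:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator  # airflow.operators.bash_operator on 1.10.x

    with DAG(dag_id="priority_demo", start_date=datetime(2020, 1, 21)) as dag:
        runme_0 = BashOperator(
            task_id="runme_0",
            bash_command="echo run",
            priority_weight=1,         # default
            weight_rule="downstream",  # default: add the weights of all downstream tasks
            pool="default_pool",       # each queued/running instance consumes one of the
        )                              # 128 open slots reported by the scheduler above
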
[2020-01-23 16:45:08,905] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_0', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:45:10,533] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:45:10,533] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.runme_0 2020-01-21T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:45:15,722] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_1', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:45:16,825] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:45:16,825] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.runme_1 2020-01-21T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:45:22,010] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_2', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:45:23,400] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:45:23,400] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.runme_2 2020-01-21T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:45:28,612] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'also_run_this', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:45:29,961] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:45:29,961] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.also_run_this 2020-01-21T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:45:35,122] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.runme_0 execution_date=2020-01-21 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:45:35,125] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.runme_1 execution_date=2020-01-21 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:45:35,127] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.runme_2 execution_date=2020-01-21 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:45:35,128] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.also_run_this execution_date=2020-01-21 00:00:00+00:00 exited with status success for try_number 1
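
[Editor's note] The scheduler queued runme_0..2 and also_run_this together, but run_after_loop only appears in the next cycle (16:46:16 below), once all three runme_* tasks have succeeded, because it is downstream of them. A minimal sketch of that dependency shape, assuming Airflow 2.x import paths (the shipped example also chains a final run_this_last task, omitted here):

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator  # airflow.operators.bash_operator on 1.10.x

    with DAG(
        dag_id="example_bash_operator",
        start_date=datetime(2020, 1, 20),
        schedule_interval="@daily",
    ) as dag:
        run_after_loop = BashOperator(task_id="run_after_loop", bash_command="echo 1")
        for i in range(3):
            # three parallel upstream tasks, all feeding run_after_loop
            BashOperator(task_id=f"runme_{i}", bash_command="echo step") >> run_after_loop
        # independent of the loop; runs in the same cycle as the runme_* tasks
        BashOperator(task_id="also_run_this", bash_command="echo done")
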
[2020-01-23 16:45:36,145] {scheduler_job.py:1100} INFO - 1 tasks up for execution:
<TaskInstance: example_branch_operator.run_this_first 2020-01-21 00:00:00+00:00 [scheduled]>
[2020-01-23 16:45:36,147] {scheduler_job.py:1131} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2020-01-23 16:45:36,148] {scheduler_job.py:1159} INFO - DAG example_branch_operator has 0/16 running and queued tasks
[2020-01-23 16:45:36,150] {scheduler_job.py:1209} INFO - Setting the following tasks to queued state:
<TaskInstance: example_branch_operator.run_this_first 2020-01-21 00:00:00+00:00 [scheduled]>
[2020-01-23 16:45:36,155] {scheduler_job.py:1289} INFO - Setting the following 1 tasks to queued state:
<TaskInstance: example_branch_operator.run_this_first 2020-01-21 00:00:00+00:00 [queued]>
[2020-01-23 16:45:36,155] {scheduler_job.py:1325} INFO - Sending ('example_branch_operator', 'run_this_first', datetime.datetime(2020, 1, 21, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 11 and queue default
[2020-01-23 16:45:36,155] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_branch_operator', 'run_this_first', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py']
[2020-01-23 16:45:36,155] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_branch_operator', 'run_this_first', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py']
[2020-01-23 16:45:37,156] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:45:37,157] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py
Running <TaskInstance: example_branch_operator.run_this_first 2020-01-21T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:45:42,331] {scheduler_job.py:1427} INFO - Executor reports execution of example_branch_operator.run_this_first execution_date=2020-01-21 00:00:00+00:00 exited with status success for try_number 1
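
[Editor's note] run_this_first carries priority 11 and branching priority 10 because, under the same downstream weighting, they sit upstream of the whole fan-out: four branch tasks, four follow-up tasks, and a join (9 downstream plus the task itself gives 10, and one more for run_this_first gives 11). BranchPythonOperator returns the task_id to follow and the scheduler skips the unchosen branches. A minimal sketch, assuming Airflow 2.x import paths:

    import random
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator        # dummy_operator on 1.10.x
    from airflow.operators.python import BranchPythonOperator

    with DAG(
        dag_id="example_branch_operator",
        start_date=datetime(2020, 1, 20),
        schedule_interval="@daily",
    ) as dag:
        options = ["branch_a", "branch_b", "branch_c", "branch_d"]
        run_this_first = DummyOperator(task_id="run_this_first")
        branching = BranchPythonOperator(
            task_id="branching",
            python_callable=lambda: random.choice(options),  # task_id of the branch to run
        )
        # "none_failed" lets the join run even though the unchosen branches are skipped
        join = DummyOperator(task_id="join", trigger_rule="none_failed")
        run_this_first >> branching
        for option in options:
            branch = DummyOperator(task_id=option)
            follow = DummyOperator(task_id=f"follow_{option}")
            branching >> branch >> follow >> join
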
[2020-01-23 16:45:56,441] {scheduler_job.py:1100} INFO - 1 tasks up for execution:
<TaskInstance: example-e2e-dag.start 2020-01-22 00:00:00+00:00 [scheduled]>
[2020-01-23 16:45:56,445] {scheduler_job.py:1131} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2020-01-23 16:45:56,445] {scheduler_job.py:1159} INFO - DAG example-e2e-dag has 0/16 running and queued tasks
[2020-01-23 16:45:56,449] {scheduler_job.py:1209} INFO - Setting the following tasks to queued state:
<TaskInstance: example-e2e-dag.start 2020-01-22 00:00:00+00:00 [scheduled]>
[2020-01-23 16:45:56,455] {scheduler_job.py:1289} INFO - Setting the following 1 tasks to queued state:
<TaskInstance: example-e2e-dag.start 2020-01-22 00:00:00+00:00 [queued]>
[2020-01-23 16:45:56,456] {scheduler_job.py:1325} INFO - Sending ('example-e2e-dag', 'start', datetime.datetime(2020, 1, 22, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 3 and queue default
[2020-01-23 16:45:56,456] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example-e2e-dag', 'start', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Users/imac/airflow/dags/example_dag_for_e2e.py']
[2020-01-23 16:45:56,456] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example-e2e-dag', 'start', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Users/imac/airflow/dags/example_dag_for_e2e.py']
[2020-01-23 16:45:57,590] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:45:57,591] {dagbag.py:415} INFO - Filling up the DagBag from /Users/imac/airflow/dags/example_dag_for_e2e.py
Running <TaskInstance: example-e2e-dag.start 2020-01-22T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:46:02,724] {scheduler_job.py:1427} INFO - Executor reports execution of example-e2e-dag.start execution_date=2020-01-22 00:00:00+00:00 exited with status success for try_number 1
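
[Editor's note] Unlike the two bundled example DAGs above, example_dag_for_e2e.py is the gist author's own file and its source is not included here. Purely as a hypothetical reconstruction consistent with the task ids and priorities in the log (start at priority 3, section-1 at priority 2, suggesting roughly a three-task chain):

    # Hypothetical sketch only: the real example_dag_for_e2e.py is not in this gist.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator  # dummy_operator on 1.10.x

    with DAG(
        dag_id="example-e2e-dag",
        start_date=datetime(2020, 1, 22),
        schedule_interval="@daily",
    ) as dag:
        start = DummyOperator(task_id="start")
        section_1 = DummyOperator(task_id="section-1")
        end = DummyOperator(task_id="end")  # hypothetical: implied by the priority values
        start >> section_1 >> end
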
[2020-01-23 16:46:16,799] {scheduler_job.py:1100} INFO - 5 tasks up for execution:
<TaskInstance: example_bash_operator.run_after_loop 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_0 2020-01-22 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2020-01-22 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2020-01-22 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.also_run_this 2020-01-22 00:00:00+00:00 [scheduled]>
[2020-01-23 16:46:16,803] {scheduler_job.py:1131} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 5 task instances ready to be queued
[2020-01-23 16:46:16,803] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 0/16 running and queued tasks
[2020-01-23 16:46:16,804] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 1/16 running and queued tasks
[2020-01-23 16:46:16,804] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 2/16 running and queued tasks
[2020-01-23 16:46:16,804] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 3/16 running and queued tasks
[2020-01-23 16:46:16,804] {scheduler_job.py:1159} INFO - DAG example_bash_operator has 4/16 running and queued tasks
[2020-01-23 16:46:16,807] {scheduler_job.py:1209} INFO - Setting the following tasks to queued state:
<TaskInstance: example_bash_operator.runme_0 2020-01-22 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2020-01-22 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2020-01-22 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.run_after_loop 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.also_run_this 2020-01-22 00:00:00+00:00 [scheduled]>
[2020-01-23 16:46:16,813] {scheduler_job.py:1289} INFO - Setting the following 5 tasks to queued state:
<TaskInstance: example_bash_operator.run_after_loop 2020-01-21 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_0 2020-01-22 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_1 2020-01-22 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_2 2020-01-22 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.also_run_this 2020-01-22 00:00:00+00:00 [queued]>
[2020-01-23 16:46:16,813] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'run_after_loop', datetime.datetime(2020, 1, 21, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 2 and queue default
[2020-01-23 16:46:16,813] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'run_after_loop', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:16,813] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'runme_0', datetime.datetime(2020, 1, 22, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 3 and queue default
[2020-01-23 16:46:16,813] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_0', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:16,813] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'runme_1', datetime.datetime(2020, 1, 22, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 3 and queue default
[2020-01-23 16:46:16,813] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_1', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:16,813] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'runme_2', datetime.datetime(2020, 1, 22, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 3 and queue default
[2020-01-23 16:46:16,813] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_2', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:16,813] {scheduler_job.py:1325} INFO - Sending ('example_bash_operator', 'also_run_this', datetime.datetime(2020, 1, 22, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 2 and queue default
[2020-01-23 16:46:16,813] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'also_run_this', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:16,814] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_0', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:17,844] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:46:17,844] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.runme_0 2020-01-22T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:46:22,958] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_1', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:23,978] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:46:23,979] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.runme_1 2020-01-22T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:46:29,111] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_2', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:30,108] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:46:30,108] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.runme_2 2020-01-22T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:46:35,247] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'run_after_loop', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:36,246] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:46:36,246] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.run_after_loop 2020-01-21T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:46:41,394] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'also_run_this', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py']
[2020-01-23 16:46:42,392] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:46:42,393] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.also_run_this 2020-01-22T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:46:47,561] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.runme_0 execution_date=2020-01-22 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:46:47,567] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.runme_1 execution_date=2020-01-22 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:46:47,569] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.runme_2 execution_date=2020-01-22 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:46:47,571] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.run_after_loop execution_date=2020-01-21 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:46:47,572] {scheduler_job.py:1427} INFO - Executor reports execution of example_bash_operator.also_run_this execution_date=2020-01-22 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:46:48,847] {scheduler_job.py:1100} INFO - 2 tasks up for execution:
<TaskInstance: example_branch_operator.branching 2020-01-21 00:00:00+00:00 [scheduled]>
<TaskInstance: example_branch_operator.run_this_first 2020-01-22 00:00:00+00:00 [scheduled]>
[2020-01-23 16:46:48,849] {scheduler_job.py:1131} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 2 task instances ready to be queued
[2020-01-23 16:46:48,849] {scheduler_job.py:1159} INFO - DAG example_branch_operator has 0/16 running and queued tasks
[2020-01-23 16:46:48,849] {scheduler_job.py:1159} INFO - DAG example_branch_operator has 1/16 running and queued tasks
[2020-01-23 16:46:48,852] {scheduler_job.py:1209} INFO - Setting the following tasks to queued state:
<TaskInstance: example_branch_operator.run_this_first 2020-01-22 00:00:00+00:00 [scheduled]>
<TaskInstance: example_branch_operator.branching 2020-01-21 00:00:00+00:00 [scheduled]>
[2020-01-23 16:46:48,859] {scheduler_job.py:1289} INFO - Setting the following 2 tasks to queued state:
<TaskInstance: example_branch_operator.run_this_first 2020-01-22 00:00:00+00:00 [queued]>
<TaskInstance: example_branch_operator.branching 2020-01-21 00:00:00+00:00 [queued]>
[2020-01-23 16:46:48,860] {scheduler_job.py:1325} INFO - Sending ('example_branch_operator', 'run_this_first', datetime.datetime(2020, 1, 22, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 11 and queue default
[2020-01-23 16:46:48,860] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_branch_operator', 'run_this_first', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py']
[2020-01-23 16:46:48,860] {scheduler_job.py:1325} INFO - Sending ('example_branch_operator', 'branching', datetime.datetime(2020, 1, 21, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 10 and queue default
[2020-01-23 16:46:48,860] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_branch_operator', 'branching', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py']
[2020-01-23 16:46:48,860] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_branch_operator', 'run_this_first', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py']
[2020-01-23 16:46:49,946] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:46:49,946] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py
Running <TaskInstance: example_branch_operator.run_this_first 2020-01-22T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:46:55,080] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_branch_operator', 'branching', '2020-01-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py']
[2020-01-23 16:46:56,121] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:46:56,121] {dagbag.py:415} INFO - Filling up the DagBag from /Volumes/disk2/Projects/airflow/airflow/example_dags/example_branch_operator.py
Running <TaskInstance: example_branch_operator.branching 2020-01-21T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
[2020-01-23 16:47:01,267] {scheduler_job.py:1427} INFO - Executor reports execution of example_branch_operator.run_this_first execution_date=2020-01-22 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:47:01,272] {scheduler_job.py:1427} INFO - Executor reports execution of example_branch_operator.branching execution_date=2020-01-21 00:00:00+00:00 exited with status success for try_number 1
[2020-01-23 16:47:15,369] {scheduler_job.py:1100} INFO - 1 tasks up for execution:
<TaskInstance: example-e2e-dag.section-1 2020-01-22 00:00:00+00:00 [scheduled]>
[2020-01-23 16:47:15,375] {scheduler_job.py:1131} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2020-01-23 16:47:15,375] {scheduler_job.py:1159} INFO - DAG example-e2e-dag has 0/16 running and queued tasks
[2020-01-23 16:47:15,379] {scheduler_job.py:1209} INFO - Setting the following tasks to queued state:
<TaskInstance: example-e2e-dag.section-1 2020-01-22 00:00:00+00:00 [scheduled]>
[2020-01-23 16:47:15,384] {scheduler_job.py:1289} INFO - Setting the following 1 tasks to queued state:
<TaskInstance: example-e2e-dag.section-1 2020-01-22 00:00:00+00:00 [queued]>
[2020-01-23 16:47:15,384] {scheduler_job.py:1325} INFO - Sending ('example-e2e-dag', 'section-1', datetime.datetime(2020, 1, 22, 0, 0, tzinfo=<Timezone [UTC]>), 1) to executor with priority 2 and queue default
[2020-01-23 16:47:15,384] {base_executor.py:74} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example-e2e-dag', 'section-1', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Users/imac/airflow/dags/example_dag_for_e2e.py']
[2020-01-23 16:47:15,384] {sequential_executor.py:50} INFO - Executing command: ['airflow', 'tasks', 'run', 'example-e2e-dag', 'section-1', '2020-01-22T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/Users/imac/airflow/dags/example_dag_for_e2e.py']
[2020-01-23 16:47:16,506] {executor_loader.py:59} INFO - Using executor SequentialExecutor
[2020-01-23 16:47:16,506] {dagbag.py:415} INFO - Filling up the DagBag from /Users/imac/airflow/dags/example_dag_for_e2e.py
Running <TaskInstance: example-e2e-dag.section-1 2020-01-22T00:00:00+00:00 [queued]> on host iMacs-iMac-2.local
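
[Editor's note] The trace ends here; the remaining DAG runs would continue in the same serial pattern. Each "Executing command" line above is the SequentialExecutor draining its queue one subprocess at a time, roughly like this (a simplified sketch of the sync step in sequential_executor.py, not the full class):

    import subprocess

    def sync(commands_to_run):
        """Run each queued `airflow tasks run ...` command serially.

        commands_to_run is a list of (task_key, argv_list) pairs; returns
        {task_key: final_state} so the scheduler can log results, as in the
        "Executor reports execution ... exited with status success" lines.
        """
        results = {}
        for key, command in commands_to_run:
            print(f"Executing command: {command}")
            try:
                subprocess.check_call(command, close_fds=True)  # blocks until the task exits
                results[key] = "success"
            except subprocess.CalledProcessError:
                results[key] = "failed"
        return results
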