@tomfaulhaber
Created November 15, 2018 18:24
Log from the scheduler when a dummy DAG file is uploaded
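
The scheduler below is watching /efs/dags, which starts out empty; the uploaded "dummy" file itself is not part of this gist. For reference, a minimal placeholder DAG of the kind that would exercise this file-discovery path could look like the sketch below. The dag_id, start_date, and schedule are assumptions; only the imports follow the Airflow 1.10-era API this log comes from.

# Hypothetical stand-in for the "dummy" file dropped into /efs/dags.
# Everything named here is assumed, not taken from the gist.
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

dag = DAG(
    dag_id="dummy_dag",                 # assumed name
    start_date=datetime(2018, 11, 13),  # matches the run dates seen below
    schedule_interval="@daily",
)

noop = DummyOperator(task_id="noop", dag=dag)
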
[2018-11-15 18:04:42,099] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2613)
[2018-11-15 18:04:42,099] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-15 18:04:42,374] {__init__.py:42} DEBUG - Cannot import due to doesn't look like a module path
[2018-11-15 18:04:42,658] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-15 18:04:42,677] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7fbec8df5e18> to pre execution callback
[2018-11-15 18:04:42,934] {cli_action_loggers.py:64} DEBUG - Calling callbacks: [<function default_action_log at 0x7fbec8df5e18>]
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2018-11-15 18:04:43,017] {jobs.py:1545} INFO - Starting the scheduler
[2018-11-15 18:04:43,017] {jobs.py:1559} INFO - Processing files using up to 2 processes at a time
[2018-11-15 18:04:43,018] {jobs.py:1560} INFO - Running execute loop for -1 seconds
[2018-11-15 18:04:43,018] {jobs.py:1561} INFO - Processing each file at most -1 times
[2018-11-15 18:04:43,018] {jobs.py:1564} INFO - Process each file at most once every 0 seconds
[2018-11-15 18:04:43,018] {jobs.py:1568} INFO - Checking for new files in /efs/dags every 300 seconds
[2018-11-15 18:04:43,018] {jobs.py:1571} INFO - Searching for files in /efs/dags
[2018-11-15 18:04:43,018] {jobs.py:1573} INFO - There are 0 files in /efs/dags
[2018-11-15 18:04:43,018] {jobs.py:1635} INFO - Resetting orphaned tasks for active dag runs
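
The startup lines above echo the [scheduler] section of airflow.cfg: two parser processes, a -1 (unbounded) run duration, no per-file processing cap, and a 300-second directory listing interval. A quick way to confirm those values, assuming the 1.10-era key names (the mapping of each log line to a key is my reading, not stated in the log):

# Print the scheduler settings behind the startup lines above.
# Key names are from the 1.10-era default airflow.cfg.
from airflow.configuration import conf

for key in (
    "max_threads",                # "up to 2 processes at a time"
    "run_duration",               # "execute loop for -1 seconds"
    "min_file_process_interval",  # "at most once every 0 seconds"
    "dag_dir_list_interval",      # "new files ... every 300 seconds"
):
    print(key, "=", conf.get("scheduler", key))
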
[2018-11-15 18:04:43,029] {jobs.py:1653} DEBUG - Starting Loop...
[2018-11-15 18:04:43,029] {jobs.py:1675} DEBUG - Heartbeating the process manager
[2018-11-15 18:04:43,029] {dag_processing.py:502} DEBUG - 0/2 scheduler processes running
[2018-11-15 18:04:43,029] {dag_processing.py:505} DEBUG - 0 file paths queued for processing
[2018-11-15 18:04:43,029] {dag_processing.py:551} DEBUG - Queuing the following files for processing:
[2018-11-15 18:04:43,029] {jobs.py:1712} DEBUG - Heartbeating the executor
[2018-11-15 18:04:43,029] {base_executor.py:118} DEBUG - 0 running task instances
[2018-11-15 18:04:43,029] {base_executor.py:119} DEBUG - 0 in queue
[2018-11-15 18:04:43,029] {base_executor.py:120} DEBUG - 32 open slots
[2018-11-15 18:04:43,029] {base_executor.py:149} DEBUG - Calling the <class 'airflow.executors.sequential_executor.SequentialExecutor'> sync method
[2018-11-15 18:04:43,030] {jobs.py:1737} DEBUG - Ran scheduling loop in 0.00 seconds
[2018-11-15 18:04:43,030] {jobs.py:1738} DEBUG - Sleeping for 1.00 seconds
[2018-11-15 18:04:44,031] {jobs.py:1653} DEBUG - Starting Loop...
[2018-11-15 18:04:44,031] {jobs.py:1675} DEBUG - Heartbeating the process manager
[2018-11-15 18:04:44,031] {dag_processing.py:502} DEBUG - 0/2 scheduler processes running
[2018-11-15 18:04:44,031] {dag_processing.py:505} DEBUG - 0 file paths queued for processing
[2018-11-15 18:04:44,031] {dag_processing.py:551} DEBUG - Queuing the following files for processing:
[2018-11-15 18:04:44,031] {jobs.py:1712} DEBUG - Heartbeating the executor
[2018-11-15 18:04:44,031] {base_executor.py:118} DEBUG - 0 running task instances
[2018-11-15 18:04:44,032] {base_executor.py:119} DEBUG - 0 in queue
[2018-11-15 18:04:44,032] {base_executor.py:120} DEBUG - 32 open slots
[2018-11-15 18:04:44,032] {base_executor.py:149} DEBUG - Calling the <class 'airflow.executors.sequential_executor.SequentialExecutor'> sync method
[2018-11-15 18:04:44,032] {jobs.py:1737} DEBUG - Ran scheduling loop in 0.00 seconds
[2018-11-15 18:04:44,032] {jobs.py:1738} DEBUG - Sleeping for 1.00 seconds
[2018-11-15 18:04:45,033] {jobs.py:1653} DEBUG - Starting Loop...
[2018-11-15 18:04:45,033] {jobs.py:1675} DEBUG - Heartbeating the process manager
[2018-11-15 18:04:45,034] {dag_processing.py:502} DEBUG - 0/2 scheduler processes running
[2018-11-15 18:04:45,034] {dag_processing.py:505} DEBUG - 0 file paths queued for processing
[2018-11-15 18:04:45,034] {dag_processing.py:551} DEBUG - Queuing the following files for processing:
... lots of null heartbeats ...
[2018-11-15 18:09:41,843] {jobs.py:1712} DEBUG - Heartbeating the executor
[2018-11-15 18:09:41,844] {base_executor.py:118} DEBUG - 0 running task instances
[2018-11-15 18:09:41,844] {base_executor.py:119} DEBUG - 0 in queue
[2018-11-15 18:09:41,844] {base_executor.py:120} DEBUG - 32 open slots
[2018-11-15 18:09:41,844] {base_executor.py:149} DEBUG - Calling the <class 'airflow.executors.sequential_executor.SequentialExecutor'> sync method
[2018-11-15 18:09:41,844] {jobs.py:1737} DEBUG - Ran scheduling loop in 0.00 seconds
[2018-11-15 18:09:41,844] {jobs.py:1738} DEBUG - Sleeping for 1.00 seconds
[2018-11-15 18:09:42,845] {jobs.py:1653} DEBUG - Starting Loop...
[2018-11-15 18:09:42,846] {jobs.py:1675} DEBUG - Heartbeating the process manager
[2018-11-15 18:09:42,846] {dag_processing.py:502} DEBUG - 0/2 scheduler processes running
[2018-11-15 18:09:42,846] {dag_processing.py:505} DEBUG - 0 file paths queued for processing
[2018-11-15 18:09:42,846] {dag_processing.py:551} DEBUG - Queuing the following files for processing:
[2018-11-15 18:09:42,846] {jobs.py:1712} DEBUG - Heartbeating the executor
[2018-11-15 18:09:42,846] {base_executor.py:118} DEBUG - 0 running task instances
[2018-11-15 18:09:42,846] {base_executor.py:119} DEBUG - 0 in queue
[2018-11-15 18:09:42,846] {base_executor.py:120} DEBUG - 32 open slots
[2018-11-15 18:09:42,846] {base_executor.py:149} DEBUG - Calling the <class 'airflow.executors.sequential_executor.SequentialExecutor'> sync method
[2018-11-15 18:09:42,846] {jobs.py:1737} DEBUG - Ran scheduling loop in 0.00 seconds
[2018-11-15 18:09:42,846] {jobs.py:1738} DEBUG - Sleeping for 1.00 seconds
[2018-11-15 18:09:43,848] {jobs.py:1653} DEBUG - Starting Loop...
[2018-11-15 18:09:43,848] {jobs.py:1663} INFO - Searching for files in /efs/dags
[2018-11-15 18:09:43,853] {jobs.py:1667} INFO - There are 1 files in /efs/dags
[2018-11-15 18:09:43,854] {jobs.py:1671} DEBUG - Removing old import errors
[2018-11-15 18:09:43,861] {jobs.py:1675} DEBUG - Heartbeating the process manager
[2018-11-15 18:09:43,862] {dag_processing.py:502} DEBUG - 0/2 scheduler processes running
[2018-11-15 18:09:43,862] {dag_processing.py:505} DEBUG - 0 file paths queued for processing
[2018-11-15 18:09:43,862] {dag_processing.py:551} DEBUG - Queuing the following files for processing:
/efs/dags
[2018-11-15 18:09:43,872] {dag_processing.py:565} DEBUG - Started a process (PID: 2753) to generate tasks for /efs/dags
[2018-11-15 18:09:43,873] {jobs.py:1712} DEBUG - Heartbeating the executor
[2018-11-15 18:09:43,873] {base_executor.py:118} DEBUG - 0 running task instances
[2018-11-15 18:09:43,873] {base_executor.py:119} DEBUG - 0 in queue
[2018-11-15 18:09:43,874] {base_executor.py:120} DEBUG - 32 open slots
[2018-11-15 18:09:43,874] {base_executor.py:149} DEBUG - Calling the <class 'airflow.executors.sequential_executor.SequentialExecutor'> sync method
[2018-11-15 18:09:43,875] {jobs.py:1737} DEBUG - Ran scheduling loop in 0.03 seconds
[2018-11-15 18:09:43,875] {jobs.py:1738} DEBUG - Sleeping for 1.00 seconds
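
At 18:09:43 the five-minute directory listing fires, finds the uploaded file, and spawns a DagFileProcessor (PID 2753) for /efs/dags. Roughly, that child process parses the path into a DagBag and creates task instances from whatever DAGs it finds. A sketch of just the parsing step, using the public DagBag model rather than the scheduler's internal entry point:

# Reproduce the parse the spawned processor performs on /efs/dags.
from airflow.models import DagBag

bag = DagBag("/efs/dags")   # the path the scheduler queued above
print(list(bag.dags))       # dag_ids discovered in the uploaded file
print(bag.import_errors)    # filled instead if a file fails to parse
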
[2018-11-15 18:09:44,793] {settings.py:198} DEBUG - Disposing DB connection pool (PID 2753)
[2018-11-15 18:09:44,876] {jobs.py:1653} DEBUG - Starting Loop...
[2018-11-15 18:09:44,877] {jobs.py:1675} DEBUG - Heartbeating the process manager
[2018-11-15 18:09:44,877] {jobs.py:478} DEBUG - Waiting for <Process(DagFileProcessor0-Process, stopped)>
[2018-11-15 18:09:44,878] {dag_processing.py:490} DEBUG - Processor for /efs/dags finished
[2018-11-15 18:09:44,878] {dag_processing.py:502} DEBUG - 0/2 scheduler processes running
[2018-11-15 18:09:44,879] {dag_processing.py:505} DEBUG - 0 file paths queued for processing
[2018-11-15 18:09:44,879] {dag_processing.py:551} DEBUG - Queuing the following files for processing:
/efs/dags
[2018-11-15 18:09:44,881] {dag_processing.py:565} DEBUG - Started a process (PID: 2755) to generate tasks for /efs/dags
[2018-11-15 18:09:44,920] {jobs.py:1109} INFO - Tasks up for execution:
<TaskInstance: example_bash_operator.also_run_this 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.also_run_this 2018-11-15 17:57:08.810712+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_0 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_0 2018-11-15 17:57:08.810712+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2018-11-15 17:57:08.810712+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2018-11-15 17:57:08.810712+00:00 [scheduled]>
[2018-11-15 18:09:44,926] {jobs.py:1144} INFO - Figuring out tasks to run in Pool(name=None) with 128 open slots and 8 task instances in queue
[2018-11-15 18:09:44,930] {jobs.py:1180} INFO - DAG example_bash_operator has 0/16 running and queued tasks
[2018-11-15 18:09:44,931] {jobs.py:1180} INFO - DAG example_bash_operator has 1/16 running and queued tasks
[2018-11-15 18:09:44,931] {jobs.py:1180} INFO - DAG example_bash_operator has 2/16 running and queued tasks
[2018-11-15 18:09:44,931] {jobs.py:1180} INFO - DAG example_bash_operator has 3/16 running and queued tasks
[2018-11-15 18:09:44,931] {jobs.py:1180} INFO - DAG example_bash_operator has 4/16 running and queued tasks
[2018-11-15 18:09:44,932] {jobs.py:1180} INFO - DAG example_bash_operator has 5/16 running and queued tasks
[2018-11-15 18:09:44,932] {jobs.py:1180} INFO - DAG example_bash_operator has 6/16 running and queued tasks
[2018-11-15 18:09:44,932] {jobs.py:1180} INFO - DAG example_bash_operator has 7/16 running and queued tasks
[2018-11-15 18:09:44,932] {jobs.py:1218} INFO - Setting the follow tasks to queued state:
<TaskInstance: example_bash_operator.runme_0 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_0 2018-11-15 17:57:08.810712+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_1 2018-11-15 17:57:08.810712+00:00 [scheduled]>
<TaskInstance: example_bash_operator.runme_2 2018-11-15 17:57:08.810712+00:00 [scheduled]>
<TaskInstance: example_bash_operator.also_run_this 2018-11-13 00:00:00+00:00 [scheduled]>
<TaskInstance: example_bash_operator.also_run_this 2018-11-15 17:57:08.810712+00:00 [scheduled]>
[2018-11-15 18:09:44,963] {jobs.py:1301} INFO - Setting the follow tasks to queued state:
<TaskInstance: example_bash_operator.also_run_this 2018-11-13 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.also_run_this 2018-11-15 17:57:08.810712+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_0 2018-11-13 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_0 2018-11-15 17:57:08.810712+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_1 2018-11-13 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_1 2018-11-15 17:57:08.810712+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_2 2018-11-13 00:00:00+00:00 [queued]>
<TaskInstance: example_bash_operator.runme_2 2018-11-15 17:57:08.810712+00:00 [queued]>
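
The "n/16 running and queued tasks" countdown above is the per-DAG concurrency cap (16 by default in this era), and the 128 open slots come from the default allowance for tasks that run outside any named pool. A DAG can set its own cap; in this hedged sketch the dag_id is hypothetical but the concurrency parameter is the real 1.10-era one:

# Per-DAG cap behind the "n/16" lines above; dag_id is made up.
from datetime import datetime
from airflow import DAG

dag = DAG(
    dag_id="throttled_example",      # assumed
    start_date=datetime(2018, 1, 1),
    concurrency=16,                  # the 16 seen in the log
)
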
[2018-11-15 18:09:44,964] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'also_run_this', datetime.datetime(2018, 11, 13, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 2 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,964] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator also_run_this 2018-11-13T00:00:00+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,964] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'also_run_this', datetime.datetime(2018, 11, 15, 17, 57, 8, 810712, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 2 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,965] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator also_run_this 2018-11-15T17:57:08.810712+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,965] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'runme_0', datetime.datetime(2018, 11, 13, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 3 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,965] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator runme_0 2018-11-13T00:00:00+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,966] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'runme_0', datetime.datetime(2018, 11, 15, 17, 57, 8, 810712, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 3 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,966] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator runme_0 2018-11-15T17:57:08.810712+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,966] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'runme_1', datetime.datetime(2018, 11, 13, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 3 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,966] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator runme_1 2018-11-13T00:00:00+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,967] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'runme_1', datetime.datetime(2018, 11, 15, 17, 57, 8, 810712, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 3 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,970] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator runme_1 2018-11-15T17:57:08.810712+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,970] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'runme_2', datetime.datetime(2018, 11, 13, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 3 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,971] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator runme_2 2018-11-13T00:00:00+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,971] {jobs.py:1343} INFO - Sending ('example_bash_operator', 'runme_2', datetime.datetime(2018, 11, 15, 17, 57, 8, 810712, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 3 and queue AirFlow-Tasks-181UMXIJEO2ZY
[2018-11-15 18:09:44,971] {base_executor.py:56} INFO - Adding to queue: airflow run example_bash_operator runme_2 2018-11-15T17:57:08.810712+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:44,973] {jobs.py:1712} DEBUG - Heartbeating the executor
[2018-11-15 18:09:44,973] {base_executor.py:118} DEBUG - 0 running task instances
[2018-11-15 18:09:44,973] {base_executor.py:119} DEBUG - 8 in queue
[2018-11-15 18:09:44,973] {base_executor.py:120} DEBUG - 32 open slots
[2018-11-15 18:09:45,132] {base_executor.py:149} DEBUG - Calling the <class 'airflow.executors.sequential_executor.SequentialExecutor'> sync method
[2018-11-15 18:09:45,132] {sequential_executor.py:45} INFO - Executing command: airflow run example_bash_operator runme_2 2018-11-13T00:00:00+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
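
From here on, SequentialExecutor drains its queue one command at a time: each "Executing command" line is a blocking airflow run subprocess, which is why every task spins up its own DB connection pool and refills a DagBag before running. A simplified sketch of that loop, paraphrased from the behavior visible in this log rather than copied from the Airflow source:

# One-at-a-time execution of queued "airflow run ..." commands.
import subprocess

def run_queued(commands_to_run):
    # commands_to_run: iterable of (task_key, shell_command) pairs
    for key, command in commands_to_run:
        print("Executing command:", command)
        try:
            subprocess.check_call(command, shell=True)  # blocks until done
            yield key, "success"
        except subprocess.CalledProcessError:
            yield key, "failed"
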
[2018-11-15 18:09:45,803] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2756)
[2018-11-15 18:09:45,804] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-15 18:09:45,818] {settings.py:198} DEBUG - Disposing DB connection pool (PID 2755)
[2018-11-15 18:09:45,912] {__init__.py:42} DEBUG - Cannot import due to doesn't look like a module path
[2018-11-15 18:09:46,017] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-15 18:09:46,027] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7fbd953bdea0> to pre execution callback
[2018-11-15 18:09:46,125] {cli_action_loggers.py:64} DEBUG - Calling callbacks: [<function default_action_log at 0x7fbd953bdea0>]
[2018-11-15 18:09:46,163] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2756)
[2018-11-15 18:09:46,164] {settings.py:154} DEBUG - settings.configure_orm(): Using NullPool
[2018-11-15 18:09:46,165] {models.py:271} INFO - Filling up the DagBag from /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:46,168] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_skip_dag.py
[2018-11-15 18:09:46,171] {models.py:515} DEBUG - Loaded DAG <DAG: example_skip_dag>
[2018-11-15 18:09:46,171] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_passing_params_via_test_command.py
[2018-11-15 18:09:46,173] {models.py:515} DEBUG - Loaded DAG <DAG: example_passing_params_via_test_command>
[2018-11-15 18:09:46,175] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/tutorial.py
[2018-11-15 18:09:46,175] {models.py:515} DEBUG - Loaded DAG <DAG: tutorial>
[2018-11-15 18:09:46,176] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_trigger_controller_dag.py
[2018-11-15 18:09:46,177] {models.py:515} DEBUG - Loaded DAG <DAG: example_trigger_controller_dag>
[2018-11-15 18:09:46,177] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_http_operator.py
[2018-11-15 18:09:46,220] {models.py:515} DEBUG - Loaded DAG <DAG: example_http_operator>
[2018-11-15 18:09:46,221] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_branch_python_dop_operator_3.py
[2018-11-15 18:09:46,222] {models.py:515} DEBUG - Loaded DAG <DAG: example_branch_dop_operator_v3>
[2018-11-15 18:09:46,223] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_branch_operator.py
[2018-11-15 18:09:46,225] {models.py:515} DEBUG - Loaded DAG <DAG: example_branch_operator>
[2018-11-15 18:09:46,225] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_short_circuit_operator.py
[2018-11-15 18:09:46,226] {models.py:515} DEBUG - Loaded DAG <DAG: example_short_circuit_operator>
[2018-11-15 18:09:46,227] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_docker_operator.py
[2018-11-15 18:09:46,227] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:46,228] {models.py:515} DEBUG - Loaded DAG <DAG: example_bash_operator>
[2018-11-15 18:09:46,229] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_python_operator.py
[2018-11-15 18:09:46,230] {models.py:515} DEBUG - Loaded DAG <DAG: example_python_operator>
[2018-11-15 18:09:46,230] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_subdag_operator.py
[2018-11-15 18:09:46,233] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator.section-1>
[2018-11-15 18:09:46,234] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator.section-2>
[2018-11-15 18:09:46,234] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator>
[2018-11-15 18:09:46,234] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/test_utils.py
[2018-11-15 18:09:46,235] {models.py:515} DEBUG - Loaded DAG <DAG: test_utils>
[2018-11-15 18:09:46,235] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_xcom.py
[2018-11-15 18:09:46,236] {models.py:515} DEBUG - Loaded DAG <DAG: example_xcom>
[2018-11-15 18:09:46,237] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/docker_copy_data.py
[2018-11-15 18:09:46,237] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_latest_only_with_trigger.py
[2018-11-15 18:09:46,239] {models.py:515} DEBUG - Loaded DAG <DAG: latest_only_with_trigger>
[2018-11-15 18:09:46,239] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_trigger_target_dag.py
[2018-11-15 18:09:46,240] {models.py:515} DEBUG - Loaded DAG <DAG: example_trigger_target_dag>
[2018-11-15 18:09:46,240] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_latest_only.py
[2018-11-15 18:09:46,241] {models.py:515} DEBUG - Loaded DAG <DAG: latest_only>
[2018-11-15 18:09:46,241] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/subdags/subdag.py
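
Note that although the DagBag is "filled up" from example_bash_operator.py, every file under example_dags/ gets imported: with the default load_examples = True, the bundled examples are loaded alongside the requested file, and this repeats for every task subprocess. A sketch of parsing only the one file, assuming the 1.10-era include_examples parameter:

# Parse just the requested file, skipping the bundled example DAGs.
from airflow.models import DagBag

bag = DagBag(
    "/usr/local/lib/python3.5/site-packages/airflow/example_dags/"
    "example_bash_operator.py",
    include_examples=False,
)
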
[2018-11-15 18:09:46,345] {cli.py:484} INFO - Running <TaskInstance: example_bash_operator.runme_2 2018-11-13T00:00:00+00:00 [queued]> on host ip-10-0-0-233.us-east-2.compute.internal
[2018-11-15 18:09:51,439] {cli_action_loggers.py:81} DEBUG - Calling callbacks: []
[2018-11-15 18:09:51,439] {settings.py:198} DEBUG - Disposing DB connection pool (PID 2756)
[2018-11-15 18:09:51,562] {base_executor.py:153} DEBUG - Changing state: ('example_bash_operator', 'runme_2', datetime.datetime(2018, 11, 13, 0, 0, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1)
[2018-11-15 18:09:51,562] {sequential_executor.py:45} INFO - Executing command: airflow run example_bash_operator runme_2 2018-11-15T17:57:08.810712+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:51,960] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2764)
[2018-11-15 18:09:51,961] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-15 18:09:52,056] {__init__.py:42} DEBUG - Cannot import due to doesn't look like a module path
[2018-11-15 18:09:52,161] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-15 18:09:52,171] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7ff6ab48eea0> to pre execution callback
[2018-11-15 18:09:52,267] {cli_action_loggers.py:64} DEBUG - Calling callbacks: [<function default_action_log at 0x7ff6ab48eea0>]
[2018-11-15 18:09:52,303] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2764)
[2018-11-15 18:09:52,304] {settings.py:154} DEBUG - settings.configure_orm(): Using NullPool
[2018-11-15 18:09:52,305] {models.py:271} INFO - Filling up the DagBag from /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:52,308] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_skip_dag.py
[2018-11-15 18:09:52,311] {models.py:515} DEBUG - Loaded DAG <DAG: example_skip_dag>
[2018-11-15 18:09:52,311] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_passing_params_via_test_command.py
[2018-11-15 18:09:52,313] {models.py:515} DEBUG - Loaded DAG <DAG: example_passing_params_via_test_command>
[2018-11-15 18:09:52,315] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/tutorial.py
[2018-11-15 18:09:52,315] {models.py:515} DEBUG - Loaded DAG <DAG: tutorial>
[2018-11-15 18:09:52,316] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_trigger_controller_dag.py
[2018-11-15 18:09:52,317] {models.py:515} DEBUG - Loaded DAG <DAG: example_trigger_controller_dag>
[2018-11-15 18:09:52,317] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_http_operator.py
[2018-11-15 18:09:52,360] {models.py:515} DEBUG - Loaded DAG <DAG: example_http_operator>
[2018-11-15 18:09:52,360] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_branch_python_dop_operator_3.py
[2018-11-15 18:09:52,361] {models.py:515} DEBUG - Loaded DAG <DAG: example_branch_dop_operator_v3>
[2018-11-15 18:09:52,363] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_branch_operator.py
[2018-11-15 18:09:52,364] {models.py:515} DEBUG - Loaded DAG <DAG: example_branch_operator>
[2018-11-15 18:09:52,364] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_short_circuit_operator.py
[2018-11-15 18:09:52,366] {models.py:515} DEBUG - Loaded DAG <DAG: example_short_circuit_operator>
[2018-11-15 18:09:52,366] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_docker_operator.py
[2018-11-15 18:09:52,367] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:52,368] {models.py:515} DEBUG - Loaded DAG <DAG: example_bash_operator>
[2018-11-15 18:09:52,368] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_python_operator.py
[2018-11-15 18:09:52,369] {models.py:515} DEBUG - Loaded DAG <DAG: example_python_operator>
[2018-11-15 18:09:52,370] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_subdag_operator.py
[2018-11-15 18:09:52,373] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator.section-1>
[2018-11-15 18:09:52,373] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator.section-2>
[2018-11-15 18:09:52,373] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator>
[2018-11-15 18:09:52,373] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/test_utils.py
[2018-11-15 18:09:52,374] {models.py:515} DEBUG - Loaded DAG <DAG: test_utils>
[2018-11-15 18:09:52,375] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_xcom.py
[2018-11-15 18:09:52,375] {models.py:515} DEBUG - Loaded DAG <DAG: example_xcom>
[2018-11-15 18:09:52,376] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/docker_copy_data.py
[2018-11-15 18:09:52,376] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_latest_only_with_trigger.py
[2018-11-15 18:09:52,378] {models.py:515} DEBUG - Loaded DAG <DAG: latest_only_with_trigger>
[2018-11-15 18:09:52,378] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_trigger_target_dag.py
[2018-11-15 18:09:52,379] {models.py:515} DEBUG - Loaded DAG <DAG: example_trigger_target_dag>
[2018-11-15 18:09:52,379] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_latest_only.py
[2018-11-15 18:09:52,380] {models.py:515} DEBUG - Loaded DAG <DAG: latest_only>
[2018-11-15 18:09:52,380] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/subdags/subdag.py
[2018-11-15 18:09:52,467] {cli.py:484} INFO - Running <TaskInstance: example_bash_operator.runme_2 2018-11-15T17:57:08.810712+00:00 [queued]> on host ip-10-0-0-233.us-east-2.compute.internal
[2018-11-15 18:09:57,592] {cli_action_loggers.py:81} DEBUG - Calling callbacks: []
[2018-11-15 18:09:57,592] {settings.py:198} DEBUG - Disposing DB connection pool (PID 2764)
[2018-11-15 18:09:57,711] {base_executor.py:153} DEBUG - Changing state: ('example_bash_operator', 'runme_2', datetime.datetime(2018, 11, 15, 17, 57, 8, 810712, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1)
[2018-11-15 18:09:57,711] {sequential_executor.py:45} INFO - Executing command: airflow run example_bash_operator runme_0 2018-11-15T17:57:08.810712+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:58,101] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2771)
[2018-11-15 18:09:58,102] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-15 18:09:58,195] {__init__.py:42} DEBUG - Cannot import due to doesn't look like a module path
[2018-11-15 18:09:58,299] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-15 18:09:58,309] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7f1d70914ea0> to pre execution callback
[2018-11-15 18:09:58,406] {cli_action_loggers.py:64} DEBUG - Calling callbacks: [<function default_action_log at 0x7f1d70914ea0>]
[2018-11-15 18:09:58,441] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2771)
[2018-11-15 18:09:58,442] {settings.py:154} DEBUG - settings.configure_orm(): Using NullPool
[2018-11-15 18:09:58,443] {models.py:271} INFO - Filling up the DagBag from /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:58,446] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_skip_dag.py
[2018-11-15 18:09:58,449] {models.py:515} DEBUG - Loaded DAG <DAG: example_skip_dag>
[2018-11-15 18:09:58,449] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_passing_params_via_test_command.py
[2018-11-15 18:09:58,451] {models.py:515} DEBUG - Loaded DAG <DAG: example_passing_params_via_test_command>
[2018-11-15 18:09:58,452] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/tutorial.py
[2018-11-15 18:09:58,453] {models.py:515} DEBUG - Loaded DAG <DAG: tutorial>
[2018-11-15 18:09:58,454] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_trigger_controller_dag.py
[2018-11-15 18:09:58,455] {models.py:515} DEBUG - Loaded DAG <DAG: example_trigger_controller_dag>
[2018-11-15 18:09:58,455] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_http_operator.py
[2018-11-15 18:09:58,497] {models.py:515} DEBUG - Loaded DAG <DAG: example_http_operator>
[2018-11-15 18:09:58,498] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_branch_python_dop_operator_3.py
[2018-11-15 18:09:58,499] {models.py:515} DEBUG - Loaded DAG <DAG: example_branch_dop_operator_v3>
[2018-11-15 18:09:58,501] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_branch_operator.py
[2018-11-15 18:09:58,502] {models.py:515} DEBUG - Loaded DAG <DAG: example_branch_operator>
[2018-11-15 18:09:58,503] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_short_circuit_operator.py
[2018-11-15 18:09:58,504] {models.py:515} DEBUG - Loaded DAG <DAG: example_short_circuit_operator>
[2018-11-15 18:09:58,504] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_docker_operator.py
[2018-11-15 18:09:58,505] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:09:58,506] {models.py:515} DEBUG - Loaded DAG <DAG: example_bash_operator>
[2018-11-15 18:09:58,506] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_python_operator.py
[2018-11-15 18:09:58,507] {models.py:515} DEBUG - Loaded DAG <DAG: example_python_operator>
[2018-11-15 18:09:58,508] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_subdag_operator.py
[2018-11-15 18:09:58,511] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator.section-2>
[2018-11-15 18:09:58,511] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator.section-1>
[2018-11-15 18:09:58,511] {models.py:515} DEBUG - Loaded DAG <DAG: example_subdag_operator>
[2018-11-15 18:09:58,512] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/test_utils.py
[2018-11-15 18:09:58,512] {models.py:515} DEBUG - Loaded DAG <DAG: test_utils>
[2018-11-15 18:09:58,513] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_xcom.py
[2018-11-15 18:09:58,514] {models.py:515} DEBUG - Loaded DAG <DAG: example_xcom>
[2018-11-15 18:09:58,514] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/docker_copy_data.py
[2018-11-15 18:09:58,515] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_latest_only_with_trigger.py
[2018-11-15 18:09:58,516] {models.py:515} DEBUG - Loaded DAG <DAG: latest_only_with_trigger>
[2018-11-15 18:09:58,516] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_trigger_target_dag.py
[2018-11-15 18:09:58,517] {models.py:515} DEBUG - Loaded DAG <DAG: example_trigger_target_dag>
[2018-11-15 18:09:58,518] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_latest_only.py
[2018-11-15 18:09:58,518] {models.py:515} DEBUG - Loaded DAG <DAG: latest_only>
[2018-11-15 18:09:58,519] {models.py:366} DEBUG - Importing /usr/local/lib/python3.5/site-packages/airflow/example_dags/subdags/subdag.py
[2018-11-15 18:09:58,612] {cli.py:484} INFO - Running <TaskInstance: example_bash_operator.runme_0 2018-11-15T17:57:08.810712+00:00 [queued]> on host ip-10-0-0-233.us-east-2.compute.internal
[2018-11-15 18:10:03,717] {cli_action_loggers.py:81} DEBUG - Calling callbacks: []
[2018-11-15 18:10:03,718] {settings.py:198} DEBUG - Disposing DB connection pool (PID 2771)
[2018-11-15 18:10:03,848] {base_executor.py:153} DEBUG - Changing state: ('example_bash_operator', 'runme_0', datetime.datetime(2018, 11, 15, 17, 57, 8, 810712, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1)
[2018-11-15 18:10:03,849] {sequential_executor.py:45} INFO - Executing command: airflow run example_bash_operator runme_0 2018-11-13T00:00:00+00:00 --local -sd /usr/local/lib/python3.5/site-packages/airflow/example_dags/example_bash_operator.py
[2018-11-15 18:10:04,251] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2778)
[2018-11-15 18:10:04,251] {settings.py:174} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
[2018-11-15 18:10:04,349] {__init__.py:42} DEBUG - Cannot import due to doesn't look like a module path
[2018-11-15 18:10:04,453] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-11-15 18:10:04,463] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7f0b8aab1ea0> to pre execution callback
[2018-11-15 18:10:04,561] {cli_action_loggers.py:64} DEBUG - Calling callbacks: [<function default_action_log at 0x7f0b8aab1ea0>]
[2018-11-15 18:10:04,609] {settings.py:146} DEBUG - Setting up DB connection pool (PID 2778)
[2018-11-15 18:10:04,609] {settings.py:154} DEBUG - settings.configure_orm(): Using NullPool