@amandajcrawford
Created January 27, 2020 14:52
*** Reading local file: /usr/local/home/airflow/logs/programs_pipeline/intellect_program_full_pipeline_solid.download_from_s3_to_file/2020-01-27T14:47:33.817553+00:00/1.log
[2020-01-27 14:47:39,525] {{taskinstance.py:620}} INFO - Dependencies all met for <TaskInstance: programs_pipeline.intellect_program_full_pipeline_solid.download_from_s3_to_file 2020-01-27T14:47:33.817553+00:00 [queued]>
[2020-01-27 14:47:39,536] {{taskinstance.py:620}} INFO - Dependencies all met for <TaskInstance: programs_pipeline.intellect_program_full_pipeline_solid.download_from_s3_to_file 2020-01-27T14:47:33.817553+00:00 [queued]>
[2020-01-27 14:47:39,536] {{taskinstance.py:838}} INFO -
--------------------------------------------------------------------------------
[2020-01-27 14:47:39,536] {{taskinstance.py:839}} INFO - Starting attempt 1 of 1
[2020-01-27 14:47:39,536] {{taskinstance.py:840}} INFO -
--------------------------------------------------------------------------------
[2020-01-27 14:47:39,554] {{taskinstance.py:859}} INFO - Executing <Task(DagsterPythonOperator): intellect_program_full_pipeline_solid.download_from_s3_to_file> on 2020-01-27T14:47:33.817553+00:00
[2020-01-27 14:47:39,554] {{base_task_runner.py:133}} INFO - Running: ['airflow', 'run', 'programs_pipeline', 'intellect_program_full_pipeline_solid.download_from_s3_to_file', '2020-01-27T14:47:33.817553+00:00', '--job_id', '839', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/programs_pipeline.py', '--cfg_path', '/tmp/tmpllbr0x5a']
[2020-01-27 14:47:40,437] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file /usr/lib/python3.7/site-packages/airflow/utils/helpers.py:36: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
[2020-01-27 14:47:40,437] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file from collections import Iterable
[2020-01-27 14:47:40,686] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file [2020-01-27 14:47:40,685] {{settings.py:213}} INFO - settings.configure_orm(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=2051
[2020-01-27 14:47:42,846] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file [2020-01-27 14:47:42,845] {{__init__.py:51}} INFO - Using executor LocalExecutor
[2020-01-27 14:47:43,492] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file [2020-01-27 14:47:43,491] {{dagbag.py:90}} INFO - Filling up the DagBag from /usr/local/home/airflow/dags/programs_pipeline.py
[2020-01-27 14:47:43,993] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file /usr/local/home/airflow/dags/programs_pipeline.py:22: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
[2020-01-27 14:47:43,993] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file storage_env = yaml.load(STORAGE_ENVIRONMENT)
[2020-01-27 14:47:43,994] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file /usr/local/home/airflow/dags/programs_pipeline.py:34: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
[2020-01-27 14:47:43,994] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file pipeline_env = yaml.load(environment_file)
[2020-01-27 14:47:44,023] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file /usr/lib/python3.7/site-packages/dagster/utils/yaml_utils.py:36: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
[2020-01-27 14:47:44,023] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return yaml.load(ff)
[2020-01-27 14:47:44,028] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file /usr/lib/python3.7/site-packages/dagster/core/serdes/__init__.py:199: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
[2020-01-27 14:47:44,028] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file config_dict = yaml.load(self.config_yaml)
[2020-01-27 14:47:44,185] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file /usr/lib/python3.7/site-packages/dagster/core/serdes/__init__.py:199: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
[2020-01-27 14:47:44,185] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file config_dict = yaml.load(self.config_yaml)
[2020-01-27 14:47:44,190] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file Exception in thread Thread-1:
[2020-01-27 14:47:44,190] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file Traceback (most recent call last):
[2020-01-27 14:47:44,191] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
[2020-01-27 14:47:44,191] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file self.run()
[2020-01-27 14:47:44,192] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/threading.py", line 870, in run
[2020-01-27 14:47:44,192] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file self._target(*self._args, **self._kwargs)
[2020-01-27 14:47:44,192] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster_postgres/event_log/event_log.py", line 158, in watcher_thread
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file exit_event=watcher_thread_exit,
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster_postgres/pynotify.py", line 95, in await_pg_notifications
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file conn = get_conn(conn_string)
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster_postgres/utils.py", line 7, in get_conn
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file conn = psycopg2.connect(conn_string)
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/psycopg2/__init__.py", line 125, in connect
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file dsn = _ext.make_dsn(dsn, **kwargs)
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/psycopg2/extensions.py", line 155, in make_dsn
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file parse_dsn(dsn)
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file psycopg2.ProgrammingError: invalid dsn: missing "=" after "postgresql+psycopg2://airflow:airflow@postgres:5432/airflow" in connection info string
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file
[2020-01-27 14:47:44,193] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file
[2020-01-27 14:47:44,386] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file /usr/lib/python3.7/site-packages/dagster/utils/yaml_utils.py:36: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
[2020-01-27 14:47:44,386] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return yaml.load(ff)
[2020-01-27 14:47:44,831] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file [2020-01-27 14:47:44,831] {{cli.py:516}} INFO - Running <TaskInstance: programs_pipeline.intellect_program_full_pipeline_solid.download_from_s3_to_file 2020-01-27T14:47:33.817553+00:00 [running]> on host e3a89d28b010
[2020-01-27 14:47:44,857] {{python_operator.py:99}} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_ID=programs_pipeline
AIRFLOW_CTX_TASK_ID=intellect_program_full_pipeline_solid.download_from_s3_to_file
AIRFLOW_CTX_EXECUTION_DATE=2020-01-27T14:47:33.817553+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2020-01-27T14:47:33.817553+00:00
[2020-01-27 14:47:44,859] {{logging_mixin.py:95}} INFO - [2020-01-27 14:47:44,858] {{python_operator.py:57}} INFO - Executing GraphQL query:
mutation(
$executionParams: ExecutionParams!
) {
executePlan(
executionParams: $executionParams,
) {
__typename
... on InvalidStepError {
invalidStepKey
}
... on PipelineConfigValidationInvalid {
pipeline {
name
}
errors {
__typename
message
path
reason
}
}
... on PipelineNotFoundError {
message
pipelineName
}
... on PythonError {
message
stack
}
... on ExecutePlanSuccess {
pipeline {
name
}
hasFailures
stepEvents {
__typename
...stepEventFragment
}
}
}
}
fragment eventMetadataEntryFragment on EventMetadataEntry {
__typename
label
description
... on EventPathMetadataEntry {
path
}
... on EventJsonMetadataEntry {
jsonString
}
... on EventUrlMetadataEntry {
url
}
... on EventTextMetadataEntry {
text
}
... on EventMarkdownMetadataEntry {
mdStr
}
}
fragment stepEventFragment on StepEvent {
step {
key
inputs {
name
type {
key
}
dependsOn {
key
}
}
outputs {
name
type {
key
}
}
solidHandleID
kind
metadata {
key
value
}
}
... on MessageEvent {
runId
message
timestamp
level
}
... on StepExpectationResultEvent {
expectationResult {
success
label
description
metadataEntries {
...eventMetadataEntryFragment
}
}
}
... on StepMaterializationEvent {
materialization {
label
description
metadataEntries {
...eventMetadataEntryFragment
}
}
}
... on ExecutionStepInputEvent {
inputName
typeCheck {
__typename
success
label
description
metadataEntries {
...eventMetadataEntryFragment
}
}
}
... on ExecutionStepOutputEvent {
outputName
typeCheck {
__typename
success
label
description
metadataEntries {
...eventMetadataEntryFragment
}
}
}
... on ExecutionStepFailureEvent {
error {
message
}
failureMetadata {
label
description
metadataEntries {
...eventMetadataEntryFragment
}
}
}
}
with variables:
{
"executionParams": {
"environmentConfigData": "REDACTED",
"executionMetadata": {
"runId": "manual__2020-01-27T14:47:33.817553+00:00",
"tags": [
{
"key": "airflow_ts",
"value": "2020-01-27T14:47:33.817553+00:00"
},
{
"key": "execution_epoch_time",
"value": "1580136453.817553"
}
]
},
"mode": "test",
"selector": {
"name": "programs_pipeline"
},
"stepKeys": [
"intellect_program_full_pipeline_solid.download_from_s3_to_file.compute"
]
}
}
[2020-01-27 14:47:44,859] {{logging_mixin.py:95}} WARNING - /usr/lib/python3.7/site-packages/dagster/core/serdes/__init__.py:199: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
config_dict = yaml.load(self.config_yaml)
[2020-01-27 14:47:44,882] {{logging_mixin.py:95}} WARNING - Exception in thread Thread-3:
Traceback (most recent call last):
File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
self.run()
File "/usr/lib/python3.7/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File "/usr/lib/python3.7/site-packages/dagster_postgres/event_log/event_log.py", line 158, in watcher_thread
exit_event=watcher_thread_exit,
File "/usr/lib/python3.7/site-packages/dagster_postgres/pynotify.py", line 95, in await_pg_notifications
conn = get_conn(conn_string)
File "/usr/lib/python3.7/site-packages/dagster_postgres/utils.py", line 7, in get_conn
conn = psycopg2.connect(conn_string)
File "/usr/lib/python3.7/site-packages/psycopg2/__init__.py", line 125, in connect
dsn = _ext.make_dsn(dsn, **kwargs)
File "/usr/lib/python3.7/site-packages/psycopg2/extensions.py", line 155, in make_dsn
parse_dsn(dsn)
psycopg2.ProgrammingError: invalid dsn: missing "=" after "postgresql+psycopg2://airflow:airflow@postgres:5432/airflow" in connection info string
[2020-01-27 14:47:44,883] {{logging_mixin.py:95}} WARNING -
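(Editor's note on the `invalid dsn` traceback above: `psycopg2.connect()` is backed by libpq, which accepts either a `key=value` DSN or a `postgresql://` URI, but not SQLAlchemy's dialect+driver scheme `postgresql+psycopg2://`. The event-log watcher thread appears to be handed the SQLAlchemy-style URL verbatim. A minimal sketch of the conversion, using a hypothetical helper name `to_libpq_uri`, not actual dagster code:)

```python
def to_libpq_uri(sqlalchemy_url: str) -> str:
    """Convert a SQLAlchemy URL to a libpq-compatible URI.

    libpq (and therefore psycopg2.connect) rejects SQLAlchemy's
    "postgresql+psycopg2://" scheme; dropping the "+driver" suffix
    yields a URI it can parse.
    """
    scheme, sep, rest = sqlalchemy_url.partition("://")
    return scheme.split("+", 1)[0] + sep + rest

print(to_libpq_uri("postgresql+psycopg2://airflow:airflow@postgres:5432/airflow"))
# -> postgresql://airflow:airflow@postgres:5432/airflow
```

Note this thread failure is a side effect of the connection string's form; the task itself fails later for a different reason.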
[2020-01-27 14:47:44,983] {{taskinstance.py:1051}} ERROR - (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "runs_run_id_key"
DETAIL: Key (run_id)=(manual__2020-01-27T14:47:33.817553+00:00) already exists.
[SQL: INSERT INTO runs (run_id, pipeline_name, status, run_body) VALUES (%(run_id)s, %(pipeline_name)s, %(status)s, %(run_body)s) RETURNING runs.id]
[parameters: {'run_id': 'manual__2020-01-27T14:47:33.817553+00:00', 'pipeline_name': 'programs_pipeline', 'status': 'MANAGED', 'run_body': '{"__class__": "PipelineRun", "environment_dict": {"resources": {"db_info": {"config": {"postgres_db_name": "data_resource_dev", "postgres_hostname": ... (4239 characters truncated) ... ", "name": "programs_pipeline", "solid_subset": null}, "status": {"__enum__": "PipelineRunStatus.MANAGED"}, "step_keys_to_execute": null, "tags": {}}'}]
(Background on this error at: http://sqlalche.me/e/gkpj)
Traceback (most recent call last):
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1246, in _execute_context
cursor, statement, parameters, context
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 588, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "runs_run_id_key"
DETAIL: Key (run_id)=(manual__2020-01-27T14:47:33.817553+00:00) already exists.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 926, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/lib/python3.7/site-packages/dagster_airflow/vendor/python_operator.py", line 108, in execute
return_value = self.execute_callable()
File "/usr/lib/python3.7/site-packages/dagster_airflow/vendor/python_operator.py", line 113, in execute_callable
return self.python_callable(*self.op_args, **self.op_kwargs)
File "/usr/lib/python3.7/site-packages/dagster_airflow/operators/python_operator.py", line 71, in python_callable
status=PipelineRunStatus.MANAGED,
File "/usr/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 283, in get_or_create_run
return self.create_run(pipeline_run)
File "/usr/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 275, in create_run
run = self._run_storage.add_run(pipeline_run)
File "/usr/lib/python3.7/site-packages/dagster/core/storage/runs/sql_run_storage.py", line 39, in add_run
conn.execute(runs_insert)
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 2182, in execute
return connection.execute(statement, *multiparams, **params)
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 982, in execute
return meth(self, multiparams, params)
File "/usr/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 293, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1101, in _execute_clauseelement
distilled_params,
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1250, in _execute_context
e, statement, parameters, cursor, context
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1476, in _handle_dbapi_exception
util.raise_from_cause(sqlalchemy_exception, exc_info)
File "/usr/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 398, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb, cause=cause)
File "/usr/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 152, in reraise
raise value.with_traceback(tb)
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1246, in _execute_context
cursor, statement, parameters, context
File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 588, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "runs_run_id_key"
DETAIL: Key (run_id)=(manual__2020-01-27T14:47:33.817553+00:00) already exists.
[SQL: INSERT INTO runs (run_id, pipeline_name, status, run_body) VALUES (%(run_id)s, %(pipeline_name)s, %(status)s, %(run_body)s) RETURNING runs.id]
[parameters: {'run_id': 'manual__2020-01-27T14:47:33.817553+00:00', 'pipeline_name': 'programs_pipeline', 'status': 'MANAGED', 'run_body': '{"__class__": "PipelineRun", "environment_dict": {"resources": {"db_info": {"config": {"postgres_db_name": "data_resource_dev", "postgres_hostname": ... (4239 characters truncated) ... ", "name": "programs_pipeline", "solid_subset": null}, "status": {"__enum__": "PipelineRunStatus.MANAGED"}, "step_keys_to_execute": null, "tags": {}}'}]
(Background on this error at: http://sqlalche.me/e/gkpj)
[2020-01-27 14:47:45,000] {{taskinstance.py:1082}} INFO - Marking task as FAILED.
[2020-01-27 14:47:45,056] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file Traceback (most recent call last):
[2020-01-27 14:47:45,056] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1246, in _execute_context
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file cursor, statement, parameters, context
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 588, in do_execute
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file cursor.execute(statement, parameters)
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "runs_run_id_key"
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file DETAIL: Key (run_id)=(manual__2020-01-27T14:47:33.817553+00:00) already exists.
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file The above exception was the direct cause of the following exception:
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file Traceback (most recent call last):
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/bin/airflow", line 32, in <module>
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file args.func(args)
[2020-01-27 14:47:45,057] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/airflow/utils/cli.py", line 74, in wrapper
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return f(*args, **kwargs)
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/airflow/bin/cli.py", line 522, in run
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file _run(args, dag, ti)
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/airflow/bin/cli.py", line 440, in _run
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file pool=args.pool,
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/airflow/utils/db.py", line 74, in wrapper
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return func(*args, **kwargs)
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 926, in _run_raw_task
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file result = task_copy.execute(context=context)
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster_airflow/vendor/python_operator.py", line 108, in execute
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return_value = self.execute_callable()
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster_airflow/vendor/python_operator.py", line 113, in execute_callable
[2020-01-27 14:47:45,058] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return self.python_callable(*self.op_args, **self.op_kwargs)
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster_airflow/operators/python_operator.py", line 71, in python_callable
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file status=PipelineRunStatus.MANAGED,
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 283, in get_or_create_run
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return self.create_run(pipeline_run)
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster/core/instance/__init__.py", line 275, in create_run
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file run = self._run_storage.add_run(pipeline_run)
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/dagster/core/storage/runs/sql_run_storage.py", line 39, in add_run
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file conn.execute(runs_insert)
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 2182, in execute
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return connection.execute(statement, *multiparams, **params)
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 982, in execute
[2020-01-27 14:47:45,059] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return meth(self, multiparams, params)
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 293, in _execute_on_connection
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file return connection._execute_clauseelement(self, multiparams, params)
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1101, in _execute_clauseelement
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file distilled_params,
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1250, in _execute_context
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file e, statement, parameters, cursor, context
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1476, in _handle_dbapi_exception
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file util.raise_from_cause(sqlalchemy_exception, exc_info)
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 398, in raise_from_cause
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file reraise(type(exception), exception, tb=exc_tb, cause=cause)
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 152, in reraise
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file raise value.with_traceback(tb)
[2020-01-27 14:47:45,060] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1246, in _execute_context
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file cursor, statement, parameters, context
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file File "/usr/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 588, in do_execute
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file cursor.execute(statement, parameters)
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "runs_run_id_key"
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file DETAIL: Key (run_id)=(manual__2020-01-27T14:47:33.817553+00:00) already exists.
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file [SQL: INSERT INTO runs (run_id, pipeline_name, status, run_body) VALUES (%(run_id)s, %(pipeline_name)s, %(status)s, %(run_body)s) RETURNING runs.id]
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file [parameters: {'run_id': 'manual__2020-01-27T14:47:33.817553+00:00', 'pipeline_name': 'programs_pipeline', 'status': 'MANAGED', 'run_body': '{"__class__": "PipelineRun", "environment_dict": {"resources": {"db_info": {"config": {"postgres_db_name": "data_resource_dev", "postgres_hostname": ... (4239 characters truncated) ... ", "name": "programs_pipeline", "solid_subset": null}, "status": {"__enum__": "PipelineRunStatus.MANAGED"}, "step_keys_to_execute": null, "tags": {}}'}]
[2020-01-27 14:47:45,061] {{base_task_runner.py:115}} INFO - Job 839: Subtask intellect_program_full_pipeline_solid.download_from_s3_to_file (Background on this error at: http://sqlalche.me/e/gkpj)
[2020-01-27 14:47:49,521] {{logging_mixin.py:95}} INFO - [2020-01-27 14:47:49,521] {{local_task_job.py:105}} INFO - Task exited with return code 1
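(Editor's note on the fatal error: the task dies on `UniqueViolation: duplicate key value violates unique constraint "runs_run_id_key"` because a row with `run_id = manual__2020-01-27T14:47:33.817553+00:00` already exists, i.e. `get_or_create_run` ends up inserting rather than reusing the existing run, likely a retry or a second step of the same DAG run reusing the Airflow execution date as the run id. An illustrative in-memory sketch of idempotent get-or-create behavior, not dagster's actual implementation:)

```python
# In-memory stand-in for the runs table keyed by its unique run_id.
_runs = {}

def get_or_create_run(run_id, pipeline_name, status="MANAGED"):
    """Return the existing run for run_id, inserting only when absent.

    Looking up first (or, in SQL, using INSERT ... ON CONFLICT (run_id)
    DO NOTHING) means a retry with the same run_id cannot violate the
    unique constraint.
    """
    if run_id in _runs:
        return _runs[run_id]
    run = {"run_id": run_id, "pipeline_name": pipeline_name, "status": status}
    _runs[run_id] = run
    return run

first = get_or_create_run("manual__2020-01-27T14:47:33.817553+00:00", "programs_pipeline")
second = get_or_create_run("manual__2020-01-27T14:47:33.817553+00:00", "programs_pipeline")
assert first is second  # the duplicate call reuses the run instead of raising
```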