Airbyte Sync Log (full refresh)
2022-11-26 07:37:42 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.method: must be a constant value Standard
2022-11-26 07:37:42 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.credential.hmac_key_secret: object found, string expected, $.credential.hmac_key_access_id: object found, string expected
2022-11-26 07:37:42 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/13/0/logs.log
2022-11-26 07:37:42 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.21
2022-11-26 07:37:43 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:43 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2022-11-26 07:37:43 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:43 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-mysql:1.0.13 exists...
2022-11-26 07:37:43 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-mysql:1.0.13 was found locally.
2022-11-26 07:37:43 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-mysql-check-13-0-pszja with resources io.airbyte.config.ResourceRequirements@405d12cb[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2022-11-26 07:37:43 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/13/0 --log-driver none --name source-mysql-check-13-0-pszja --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-mysql:1.0.13 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.21 -e WORKER_JOB_ID=13 airbyte/source-mysql:1.0.13 check --config source_config.json
2022-11-26 07:37:43 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):97 - Reading messages from protocol version 0.2.0
2022-11-26 07:37:43 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - starting source: class io.airbyte.integrations.source.mysql.MySqlSource
2022-11-26 07:37:43 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - integration args: {check=null, config=source_config.json}
2022-11-26 07:37:43 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-11-26 07:37:43 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Command: CHECK
2022-11-26 07:37:43 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-11-26 07:37:43 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:43 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Starting connection with method: NO_TUNNEL
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - HikariPool-1 - Starting...
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - HikariPool-1 - Start completed.
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Attempting to get metadata from the database to see if we can connect.
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - HikariPool-1 - Shutdown initiated...
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - HikariPool-1 - Shutdown completed.
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-11-26 07:37:44 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - completed source: class io.airbyte.integrations.source.mysql.MySqlSource
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:44 INFO i.a.w.t.TemporalAttemptExecution(get):160 - Stopping cancellation check scheduling...
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:44 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/13/0/logs.log
2022-11-26 07:37:44 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.21
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-bigquery:1.2.7 exists...
2022-11-26 07:37:44 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-bigquery:1.2.7 was found locally.
2022-11-26 07:37:44 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = destination-bigquery-check-13-0-wusbn with resources io.airbyte.config.ResourceRequirements@405d12cb[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2022-11-26 07:37:44 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/13/0 --log-driver none --name destination-bigquery-check-13-0-wusbn --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.2.7 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.21 -e WORKER_JOB_ID=13 airbyte/destination-bigquery:1.2.7 check --config source_config.json
2022-11-26 07:37:44 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):97 - Reading messages from protocol version 0.2.0
2022-11-26 07:37:45 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - integration args: {check=null, config=source_config.json}
2022-11-26 07:37:45 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-11-26 07:37:45 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Command: CHECK
2022-11-26 07:37:45 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-11-26 07:37:45 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:45 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:45 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Selected loading method is set to: GCS
2022-11-26 07:37:50 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - S3 format config: {"format_type":"CSV","flattening":"No flattening"}
2022-11-26 07:37:50 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Started testing if all required credentials assigned to user for single file uploading
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Finished checking for normal upload mode
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Started testing if all required credentials assigned to user for multipart upload
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Initiated multipart upload to heumsi-playground-airbyte/staging/test_1669448271196 with full ID ABPnzm5zEbR-Oz_WX8O-r9HHjfKicu9D9TtIBm1Wwplj_M9pyp6rcbzGw4KkhuSqELUYEgnM
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-11-26 07:37:51 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - [MultipartOutputStream for parts 1 - 10000] is already closed
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - [Manager uploading to heumsi-playground-airbyte/staging/test_1669448271196 with id ABPnzm5zE...qELUYEgnM]: Uploading leftover stream [Part number 1 containing 3.34 MB]
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - [Manager uploading to heumsi-playground-airbyte/staging/test_1669448271196 with id ABPnzm5zE...qELUYEgnM]: Finished uploading [Part number 1 containing 3.34 MB]
2022-11-26 07:37:51 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - [Manager uploading to heumsi-playground-airbyte/staging/test_1669448271196 with id ABPnzm5zE...qELUYEgnM]: Completed
2022-11-26 07:37:52 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Finished verification for multipart upload mode
2022-11-26 07:37:54 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):120 - Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:54 INFO i.a.w.t.TemporalAttemptExecution(get):160 - Stopping cancellation check scheduling...
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:54 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/13/0/logs.log
2022-11-26 07:37:54 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.21
2022-11-26 07:37:54 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):50 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
2022-11-26 07:37:54 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to otel
2022-11-26 07:37:54 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):50 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
2022-11-26 07:37:54 INFO i.a.w.g.DefaultReplicationWorker(run):135 - start sync worker. job id: 13 attempt id: 0
2022-11-26 07:37:54 INFO i.a.w.g.DefaultReplicationWorker(run):150 - configured sync modes: {my_database.table_one=full_refresh - overwrite}
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:54 INFO i.a.w.i.DefaultAirbyteDestination(start):72 - Running destination...
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START REPLICATION -----
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-bigquery:1.2.7 exists...
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-bigquery:1.2.7 was found locally.
2022-11-26 07:37:54 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = destination-bigquery-write-13-0-azkgy with resources io.airbyte.config.ResourceRequirements@13fb8e7b[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2022-11-26 07:37:54 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/13/0 --log-driver none --name destination-bigquery-write-13-0-azkgy --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-bigquery:1.2.7 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.21 -e WORKER_JOB_ID=13 airbyte/destination-bigquery:1.2.7 write --config destination_config.json --catalog destination_catalog.json
2022-11-26 07:37:54 INFO i.a.w.i.VersionedAirbyteMessageBufferedWriterFactory(createWriter):33 - Writing messages to protocol version 0.2.0
2022-11-26 07:37:54 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):97 - Reading messages from protocol version 0.2.0
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-mysql:1.0.13 exists...
2022-11-26 07:37:54 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-mysql:1.0.13 was found locally.
2022-11-26 07:37:54 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-mysql-read-13-0-deqqm with resources io.airbyte.config.ResourceRequirements@1ff43af5[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2022-11-26 07:37:54 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/13/0 --log-driver none --name source-mysql-read-13-0-deqqm --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/source-mysql:1.0.13 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.21 -e WORKER_JOB_ID=13 airbyte/source-mysql:1.0.13 read --config source_config.json --catalog source_catalog.json
2022-11-26 07:37:54 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):97 - Reading messages from protocol version 0.2.0
2022-11-26 07:37:54 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):243 - Destination output thread started.
2022-11-26 07:37:54 INFO i.a.w.g.DefaultReplicationWorker(replicate):217 - Waiting for source and destination threads to complete.
2022-11-26 07:37:54 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):291 - Replication thread started.
2022-11-26 07:37:55 source > starting source: class io.airbyte.integrations.source.mysql.MySqlSource
2022-11-26 07:37:55 source > integration args: {read=null, catalog=source_catalog.json, config=source_config.json}
2022-11-26 07:37:55 source > Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-11-26 07:37:55 source > Command: READ
2022-11-26 07:37:55 source > Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='null'}
2022-11-26 07:37:55 destination > integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-11-26 07:37:55 destination > Running integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-11-26 07:37:55 destination > Command: WRITE
2022-11-26 07:37:55 destination > Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-11-26 07:37:55 source > Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:55 source > Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:55 destination > Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:55 destination > Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:55 destination > Selected loading method is set to: GCS
2022-11-26 07:37:55 destination > S3 format config: {"format_type":"AVRO","flattening":"No flattening"}
2022-11-26 07:37:55 destination > All tmp files will be removed from GCS when replication is finished
2022-11-26 07:37:55 source > Starting connection with method: NO_TUNNEL
2022-11-26 07:37:55 source > Stream state manager selected to manage state object with type STREAM.
2022-11-26 07:37:55 source > No cursor field set in catalog but not present in state. Stream: AirbyteStreamNameNamespacePair{name='table_one', namespace='my_database'}, New Cursor Field: null. Resetting cursor value
2022-11-26 07:37:55 source > HikariPool-1 - Starting...
2022-11-26 07:37:55 source > HikariPool-1 - Start completed.
2022-11-26 07:37:55 destination > Creating BigQuery staging message consumer with staging ID 0cae3f6a-e446-42ee-8819-d55cadbba996 at 2022-11-26T07:37:55.575Z
2022-11-26 07:37:55 destination > BigQuery write config: BigQueryWriteConfig[streamName=table_one, namespace=my_database, datasetId=my_database, datasetLocation=asia-northeast3, tmpTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}}, targetTableId=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_raw_table_one}}, tableSchema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, syncMode=overwrite, stagedFiles=[]]
2022-11-26 07:37:55 destination > class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-11-26 07:37:55 destination > Preparing tmp tables in destination started for 1 streams
2022-11-26 07:37:55 source > Internal schemas to exclude: [performance_schema, information_schema, mysql, sys]
2022-11-26 07:37:55 destination > Creating dataset my_database
2022-11-26 07:37:55 source > Table table_one column id (type INT[10], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2022-11-26 07:37:55 source > Table table_one column name (type VARCHAR[200], nullable false) -> JsonSchemaType({type=string})
2022-11-26 07:37:55 source > Table table_one column updated_at (type TIMESTAMP[19], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2022-11-26 07:37:55 source > using CDC: false
2022-11-26 07:37:55 source > Queueing query for table: table_one
2022-11-26 07:37:55 source > Set initial fetch size: -2147483648 rows
2022-11-26 07:37:56 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword airbyte_type - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-11-26 07:37:56 source > Closing database connection pool.
2022-11-26 07:37:56 source > HikariPool-1 - Shutdown initiated...
2022-11-26 07:37:56 source > HikariPool-1 - Shutdown completed.
2022-11-26 07:37:56 source > Closed database connection pool.
2022-11-26 07:37:56 source > Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2022-11-26 07:37:56 source > completed source: class io.airbyte.integrations.source.mysql.MySqlSource
2022-11-26 07:37:56 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):333 - Total records read: 2 (142 bytes)
2022-11-26 07:37:56 INFO i.a.w.g.DefaultReplicationWorker(replicate):222 - One of source or destination thread complete. Waiting on the other.
2022-11-26 07:37:56 destination > Creating tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}}
2022-11-26 07:37:57 destination > Partitioned table created successfully: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}}
2022-11-26 07:37:57 destination > Creating staging path for stream table_one (dataset my_database): staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/
2022-11-26 07:37:57 destination > Storage Object heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/ does not exist in bucket; creating...
2022-11-26 07:37:57 destination > Storage Object heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/ has been created in bucket.
2022-11-26 07:37:57 destination > Preparing tmp tables in destination completed.
2022-11-26 07:37:57 destination > Starting a new buffer for stream table_one (current state: 0 bytes in 0 buffers)
2022-11-26 07:37:57 destination > Default schema.
2022-11-26 07:37:58 destination > Airbyte message consumer: succeeded.
2022-11-26 07:37:58 destination > executing on success close procedure.
2022-11-26 07:37:58 destination > Flushing all 1 current buffers (0 bytes in total)
2022-11-26 07:37:58 destination > Flushing buffer of stream table_one (325 bytes)
2022-11-26 07:37:58 destination > Flushing buffer for stream table_one (325 bytes) to staging
2022-11-26 07:37:58 destination > Finished writing data to 1d863578-d34d-4f5f-8fd4-47df9bce525e7812583389385079125.avro (576 bytes)
2022-11-26 07:37:58 destination > Uploading records to staging for stream table_one (dataset my_database): staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/
2022-11-26 07:37:58 destination > Initiated multipart upload to heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro with full ID ABPnzm4IMJPInfiazv4D-lEwhVQncQV8zHOe8oiOtIkX8OOgrWl3LfHaesoJVc0AAZl7vOMR
2022-11-26 07:37:58 destination > Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-11-26 07:37:58 destination > [Manager uploading to heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro with id ABPnzm4IM...AAZl7vOMR]: Uploading leftover stream [Part number 1 containing 0.00 MB]
2022-11-26 07:37:58 destination > [Manager uploading to heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro with id ABPnzm4IM...AAZl7vOMR]: Finished uploading [Part number 1 containing 0.00 MB]
2022-11-26 07:37:58 destination > [Manager uploading to heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro with id ABPnzm4IM...AAZl7vOMR]: Completed
2022-11-26 07:37:59 destination > Uploaded buffer file to storage: 1d863578-d34d-4f5f-8fd4-47df9bce525e7812583389385079125.avro -> staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro (filename: 1.avro)
2022-11-26 07:37:59 destination > Added staged file: 1.avro
2022-11-26 07:37:59 destination > Deleting tempFile data 1d863578-d34d-4f5f-8fd4-47df9bce525e7812583389385079125.avro
2022-11-26 07:37:59 destination > Closing buffer for stream table_one
2022-11-26 07:37:59 destination > Copying into tables in destination started for 1 streams
2022-11-26 07:37:59 destination > Uploading records from staging files to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}} (dataset my_database): [1.avro]
2022-11-26 07:37:59 destination > Uploading staged file: gs://heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro
2022-11-26 07:37:59 destination > [JobId{project=heumsi-playground, job=d6d9cb30-99be-4149-87ee-0bda7269396e, location=asia-northeast3}] Created a new job to upload records to tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}} (dataset my_database): Job{job=JobId{project=heumsi-playground, job=d6d9cb30-99be-4149-87ee-0bda7269396e, location=asia-northeast3}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1669448279493, endTime=null, startTime=1669448279627, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@heumsi-playground.iam.gserviceaccount.com, etag=OoBYW9yB8+j8HV4hx9r81w==, generatedId=heumsi-playground:asia-northeast3.d6d9cb30-99be-4149-87ee-0bda7269396e, selfLink=https://www.googleapis.com/bigquery/v2/projects/heumsi-playground/jobs/d6d9cb30-99be-4149-87ee-0bda7269396e?location=asia-northeast3, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, projectId=heumsi-playground, tableId=_airbyte_tmp_mhi_table_one}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}
2022-11-26 07:37:59 destination > Waiting for job finish Job{job=JobId{project=heumsi-playground, job=d6d9cb30-99be-4149-87ee-0bda7269396e, location=asia-northeast3}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1669448279493, endTime=null, startTime=1669448279627, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@heumsi-playground.iam.gserviceaccount.com, etag=OoBYW9yB8+j8HV4hx9r81w==, generatedId=heumsi-playground:asia-northeast3.d6d9cb30-99be-4149-87ee-0bda7269396e, selfLink=https://www.googleapis.com/bigquery/v2/projects/heumsi-playground/jobs/d6d9cb30-99be-4149-87ee-0bda7269396e?location=asia-northeast3, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, projectId=heumsi-playground, tableId=_airbyte_tmp_mhi_table_one}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}}. Status: JobStatus{state=RUNNING, error=null, executionErrors=null}
2022-11-26 07:38:03 destination > Job finish Job{job=JobId{project=heumsi-playground, job=d6d9cb30-99be-4149-87ee-0bda7269396e, location=asia-northeast3}, status=JobStatus{state=RUNNING, error=null, executionErrors=null}, statistics=LoadStatistics{creationTime=1669448279493, endTime=null, startTime=1669448279627, numChildJobs=null, parentJobId=null, scriptStatistics=null, reservationUsage=null, inputBytes=null, inputFiles=null, outputBytes=null, outputRows=null, badRecords=null}, userEmail=airbyte@heumsi-playground.iam.gserviceaccount.com, etag=OoBYW9yB8+j8HV4hx9r81w==, generatedId=heumsi-playground:asia-northeast3.d6d9cb30-99be-4149-87ee-0bda7269396e, selfLink=https://www.googleapis.com/bigquery/v2/projects/heumsi-playground/jobs/d6d9cb30-99be-4149-87ee-0bda7269396e?location=asia-northeast3, configuration=LoadJobConfiguration{type=LOAD, destinationTable=GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, projectId=heumsi-playground, tableId=_airbyte_tmp_mhi_table_one}}, decimalTargetTypes=null, destinationEncryptionConfiguration=null, createDisposition=null, writeDisposition=WRITE_APPEND, formatOptions=FormatOptions{format=AVRO}, nullMarker=null, maxBadRecords=null, schema=Schema{fields=[Field{name=_airbyte_ab_id, type=STRING, mode=null, description=null, policyTags=null}, Field{name=_airbyte_emitted_at, type=TIMESTAMP, mode=null, description=null, policyTags=null}, Field{name=_airbyte_data, type=STRING, mode=null, description=null, policyTags=null}]}, ignoreUnknownValue=null, sourceUris=[gs://heumsi-playground-airbyte/staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro], schemaUpdateOptions=null, autodetect=null, timePartitioning=null, clustering=null, useAvroLogicalTypes=true, labels=null, jobTimeoutMs=null, rangePartitioning=null, hivePartitioningOptions=null}} with status JobStatus{state=RUNNING, error=null, executionErrors=null}
2022-11-26 07:38:03 destination > [JobId{project=heumsi-playground, job=d6d9cb30-99be-4149-87ee-0bda7269396e, location=asia-northeast3}] Tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}} (dataset my_database) is successfully appended with staging files
2022-11-26 07:38:03 destination > Copying data from tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}} to target table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_raw_table_one}} (dataset my_database, sync mode overwrite)
2022-11-26 07:38:06 destination > successfully copied table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}} to table: GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_raw_table_one}}
2022-11-26 07:38:06 destination > Finalizing tables in destination completed
2022-11-26 07:38:06 destination > Cleaning up destination started for 1 streams
2022-11-26 07:38:06 destination > Deleting tmp table GenericData{classInfo=[datasetId, projectId, tableId], {datasetId=my_database, tableId=_airbyte_tmp_mhi_table_one}} (dataset my_database)
2022-11-26 07:38:07 destination > Cleaning up staging path for stream table_one (dataset my_database): staging/my_database_table_one
2022-11-26 07:38:07 destination > Deleting object staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/
2022-11-26 07:38:07 destination > Deleting object staging/my_database_table_one/2022/11/26/07/0cae3f6a-e446-42ee-8819-d55cadbba996/1.avro
2022-11-26 07:38:07 destination > Storage bucket staging/my_database_table_one has been cleaned-up (2 objects were deleted)...
2022-11-26 07:38:07 destination > Cleaning up destination completed.
2022-11-26 07:38:07 destination > Completed integration: io.airbyte.integrations.destination.bigquery.BigQueryDestination
2022-11-26 07:38:07 INFO i.a.w.g.DefaultReplicationWorker(replicate):224 - Source and destination threads complete.
2022-11-26 07:38:07 INFO i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):477 - Source did not output any state messages
2022-11-26 07:38:07 WARN i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):488 - State capture: No state retained.
2022-11-26 07:38:07 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):408 - sync summary: {
"status" : "completed",
"recordsSynced" : 2,
"bytesSynced" : 142,
"startTime" : 1669448274461,
"endTime" : 1669448287432,
"totalStats" : {
"recordsEmitted" : 2,
"bytesEmitted" : 142,
"sourceStateMessagesEmitted" : 0,
"destinationStateMessagesEmitted" : 0,
"recordsCommitted" : 2,
"meanSecondsBeforeSourceStateMessageEmitted" : 0,
"maxSecondsBeforeSourceStateMessageEmitted" : 0,
"maxSecondsBetweenStateMessageEmittedandCommitted" : 0,
"meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
"replicationStartTime" : 1669448274461,
"replicationEndTime" : 1669448287430,
"sourceReadStartTime" : 1669448274501,
"sourceReadEndTime" : 1669448276106,
"destinationWriteStartTime" : 1669448274544,
"destinationWriteEndTime" : 1669448287429
},
"streamStats" : [ {
"streamName" : "table_one",
"streamNamespace" : "my_database",
"stats" : {
"recordsEmitted" : 2,
"bytesEmitted" : 142,
"recordsCommitted" : 2
}
} ]
}
2022-11-26 07:38:07 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):409 - failures: [ ]
2022-11-26 07:38:07 INFO i.a.w.t.TemporalAttemptExecution(get):160 - Stopping cancellation check scheduling...
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END REPLICATION -----
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:38:07 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):198 - sync summary: io.airbyte.config.StandardSyncOutput@56c65474[standardSyncSummary=io.airbyte.config.StandardSyncSummary@6203675d[status=completed,recordsSynced=2,bytesSynced=142,startTime=1669448274461,endTime=1669448287432,totalStats=io.airbyte.config.SyncStats@57b50b8[recordsEmitted=2,bytesEmitted=142,sourceStateMessagesEmitted=0,destinationStateMessagesEmitted=0,recordsCommitted=2,meanSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBetweenStateMessageEmittedandCommitted=0,meanSecondsBetweenStateMessageEmittedandCommitted=0,replicationStartTime=1669448274461,replicationEndTime=1669448287430,sourceReadStartTime=1669448274501,sourceReadEndTime=1669448276106,destinationWriteStartTime=1669448274544,destinationWriteEndTime=1669448287429,additionalProperties={}],streamStats=[io.airbyte.config.StreamSyncStats@45fe03a4[streamName=table_one,streamNamespace=my_database,stats=io.airbyte.config.SyncStats@21ff8a9b[recordsEmitted=2,bytesEmitted=142,sourceStateMessagesEmitted=<null>,destinationStateMessagesEmitted=<null>,recordsCommitted=2,meanSecondsBeforeSourceStateMessageEmitted=<null>,maxSecondsBeforeSourceStateMessageEmitted=<null>,maxSecondsBetweenStateMessageEmittedandCommitted=<null>,meanSecondsBetweenStateMessageEmittedandCommitted=<null>,replicationStartTime=<null>,replicationEndTime=<null>,sourceReadStartTime=<null>,sourceReadEndTime=<null>,destinationWriteStartTime=<null>,destinationWriteEndTime=<null>,additionalProperties={}]]]],normalizationSummary=<null>,webhookOperationSummary=<null>,state=<null>,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@583f2876[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@71e4d59[stream=io.airbyte.protocol.models.AirbyteStream@32c89d3d[name=table_one,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"name":{"type":"string"},"updated_at":{"type":"string","format":"date-time","airbyte_type":"timestamp_with_timezone"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=my_database,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=overwrite,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[]]
2022-11-26 07:38:07 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):203 - Sync summary length: 2303
2022-11-26 07:38:07 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):283 - Stopping temporal heartbeating...
2022-11-26 07:38:07 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/13/0/logs.log
2022-11-26 07:38:07 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.21
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:38:07 INFO i.a.w.n.DefaultNormalizationRunner(runProcess):134 - Running with normalization version: airbyte/normalization:0.2.24
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START DEFAULT NORMALIZATION -----
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/normalization:0.2.24 exists...
2022-11-26 07:38:07 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/normalization:0.2.24 was found locally.
2022-11-26 07:38:07 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = normalization-normalize-13-0-ufmvt with resources io.airbyte.config.ResourceRequirements@463d7049[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2022-11-26 07:38:07 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/13/0/normalize --log-driver none --name normalization-normalize-13-0-ufmvt --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_VERSION=0.40.21 airbyte/normalization:0.2.24 run --integration-type bigquery --config destination_config.json --catalog destination_catalog.json
2022-11-26 07:38:07 normalization > Running: transform-config --config destination_config.json --integration-type bigquery --out /data/13/0/normalize
2022-11-26 07:38:10 normalization > Namespace(config='destination_config.json', integration_type=<DestinationType.BIGQUERY: 'bigquery'>, out='/data/13/0/normalize')
2022-11-26 07:38:10 normalization > transform_bigquery
2022-11-26 07:38:11 normalization > Running: transform-catalog --integration-type bigquery --profile-config-dir /data/13/0/normalize --catalog destination_catalog.json --out /data/13/0/normalize/models/generated/ --json-column _airbyte_data
2022-11-26 07:38:14 normalization > Processing destination_catalog.json...
2022-11-26 07:38:14 normalization > Generating airbyte_ctes/my_database/table_one_ab1.sql from table_one
2022-11-26 07:38:14 normalization > Generating airbyte_ctes/my_database/table_one_ab2.sql from table_one
2022-11-26 07:38:14 normalization > Generating airbyte_ctes/my_database/table_one_ab3.sql from table_one
2022-11-26 07:38:14 normalization > Adding drop table hook for table_one_scd to table_one
2022-11-26 07:38:14 normalization > Generating airbyte_tables/my_database/table_one.sql from table_one
2022-11-26 07:38:14 normalization > detected no config file for ssh, assuming ssh is off.
2022-11-26 07:38:31 normalization > [--event-buffer-size EVENT_BUFFER_SIZE]
2022-11-26 07:38:31 normalization > --event-buffer-size EVENT_BUFFER_SIZE
2022-11-26 07:38:31 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-11-26 07:38:31 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2022-11-26 07:38:31 INFO i.a.w.n.NormalizationAirbyteStreamFactory(filterOutAndHandleNonAirbyteMessageLines):104 -
2022-11-26 07:38:51 normalization > Running with dbt=1.0.0
2022-11-26 07:38:51 normalization > Partial parse save file not found. Starting full parse.
2022-11-26 07:39:01 normalization > [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 2 unused configuration paths:
- models.airbyte_utils.generated.airbyte_incremental
- models.airbyte_utils.generated.airbyte_views
2022-11-26 07:39:01 normalization > Found 4 models, 0 tests, 0 snapshots, 0 analyses, 602 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2022-11-26 07:39:02 normalization > Concurrency: 8 threads (target='prod')
2022-11-26 07:39:03 normalization > 1 of 1 START table model my_database.table_one.......................................................................... [RUN]
2022-11-26 07:39:07 normalization > 1 of 1 OK created table model my_database.table_one..................................................................... [CREATE TABLE (2.0 rows, 238.0 Bytes processed) in 3.95s]
2022-11-26 07:39:07 normalization > Finished running 1 table model in 6.07s.
2022-11-26 07:39:07 normalization > Completed successfully
2022-11-26 07:39:07 normalization > Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
2022-11-26 07:39:07 INFO i.a.w.g.DefaultNormalizationWorker(run):93 - Normalization executed in 1 minute.
2022-11-26 07:39:07 INFO i.a.w.g.DefaultNormalizationWorker(run):106 - Normalization summary: io.airbyte.config.NormalizationSummary@d49030b[startTime=1669448287542,endTime=1669448347871,failures=[]]
2022-11-26 07:39:07 INFO i.a.w.t.TemporalAttemptExecution(get):160 - Stopping cancellation check scheduling...
2022-11-26 07:39:07 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):283 - Stopping temporal heartbeating...
2022-11-26 07:39:07 INFO i.a.c.i.LineGobbler(voidCall):114 -
2022-11-26 07:39:07 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END DEFAULT NORMALIZATION -----
2022-11-26 07:39:07 INFO i.a.c.i.LineGobbler(voidCall):114 -