docker-compose git:(main) ✗ docker-compose -f docker-compose-planetscale.yml up
Creating network "temporal-network" with driver "bridge"
Creating temporal ... done
Creating temporal-web ... done
Creating temporal-admin-tools ... done
Attaching to temporal, temporal-admin-tools, temporal-web
temporal | + : mysql
temporal | + : true
temporal | + : true
temporal | Temporal CLI address: 172.26.0.2:7233.
temporal | + : temporal
temporal | + : temporal_visibility
temporal | + : ''
temporal | + : 9042
temporal | + : ''
temporal | + : ''
temporal | + : ''
temporal | + : ''
temporal | + : ''
temporal | + : ''
temporal | + : 1
temporal | + : temporal
temporal | + : temporal_visibility
temporal | + : 3306
temporal | + : f0es1algq55z.us-east-1.psdb.cloud
temporal | + : 708niiqkqzaz
temporal | + : pscale_pw_************
temporal | + : false
temporal | + : ''
temporal | + : ''
temporal | + : ''
temporal | + : false
temporal | + : http
temporal | + : ''
temporal | + : 9200
temporal | + : ''
temporal | + : ''
temporal | + : v7
temporal | + : temporal_visibility_v1_dev
temporal | + : 0
temporal | + : 172.26.0.2:7233
temporal | + : false
temporal | + : default
temporal | + : 1
temporal | + : false
temporal | + [[ true != true ]]
temporal | + [[ false == true ]]
temporal | + setup_server
temporal | + echo 'Temporal CLI address: 172.26.0.2:7233.'
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | Waiting for Temporal server to start...
temporal | + sleep 1
temporal | 2022/07/05 15:01:24 Loading config; env=docker,zone=,configDir=config
temporal | 2022/07/05 15:01:24 Loading config files=[config/docker.yaml]
temporal | {"level":"info","ts":"2022-07-05T15:01:24.810Z","msg":"Build info.","git-time":"2022-06-17T22:49:04.000Z","git-revision":"79844fd704eb6fd873760503ba57cc9c6bae65d5","git-modified":false,"go-arch":"amd64","go-os":"linux","go-version":"go1.18.2","cgo-enabled":false,"server-version":"1.17.0","logging-call-at":"main.go:136"}
temporal | {"level":"info","ts":"2022-07-05T15:01:24.876Z","msg":"dynamic config changed for the key: limit.maxidlength oldValue: nil newValue: { constraints: {} value: 255 }","logging-call-at":"basic_client.go:299"}
temporal | {"level":"info","ts":"2022-07-05T15:01:24.876Z","msg":"dynamic config changed for the key: system.forcesearchattributescacherefreshonread oldValue: nil newValue: { constraints: {} value: true }","logging-call-at":"basic_client.go:299"}
temporal | {"level":"info","ts":"2022-07-05T15:01:24.876Z","msg":"Updated dynamic config","logging-call-at":"file_based_client.go:184"}
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | Waiting for Temporal server to start...
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal-web | [2022-07-05T15:01:28.776Z] Auth is disabled in config
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | Waiting for Temporal server to start...
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | Waiting for Temporal server to start...
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | Waiting for Temporal server to start...
temporal | + sleep 1
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | Waiting for Temporal server to start...
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal-web | [2022-07-05T15:01:38.579Z] will use insecure connection with Temporal server...
temporal | {"level":"info","ts":"2022-07-05T15:01:39.149Z","msg":"Created gRPC listener","service":"history","address":"172.26.0.2:7234","logging-call-at":"rpc.go:154"}
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | {"level":"info","ts":"2022-07-05T15:01:42.841Z","msg":"Created gRPC listener","service":"matching","address":"172.26.0.2:7235","logging-call-at":"rpc.go:154"}
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | Waiting for Temporal server to start...
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal-web | temporal-web ssl is not enabled
temporal-web | temporal-web up and listening on port 8088
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | + echo 'Waiting for Temporal server to start...'
temporal | Waiting for Temporal server to start...
temporal | + sleep 1
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | {"level":"info","ts":"2022-07-05T15:01:48.997Z","msg":"Created gRPC listener","service":"frontend","address":"172.26.0.2:7233","logging-call-at":"rpc.go:154"}
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | Waiting for Temporal server to start...
temporal | {"level":"info","ts":"2022-07-05T15:01:54.121Z","msg":"PProf not started due to port not set","logging-call-at":"pprof.go:67"}
temporal | {"level":"info","ts":"2022-07-05T15:01:54.121Z","msg":"Starting server for services","value":{"frontend":{},"history":{},"matching":{},"worker":{}},"logging-call-at":"server_impl.go:99"}
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | {"level":"info","ts":"2022-07-05T15:01:57.371Z","msg":"Membership heartbeat upserted successfully","service":"history","address":"172.26.0.2","port":6934,"hostId":"5dd34184-fc73-11ec-8de6-0242ac1a0002","logging-call-at":"rpMonitor.go:229"}
temporal | {"level":"info","ts":"2022-07-05T15:01:58.980Z","msg":"bootstrap hosts fetched","service":"history","bootstrap-hostports":"172.26.0.2:6934","logging-call-at":"rpMonitor.go:271"}
temporal | {"level":"info","ts":"2022-07-05T15:01:59.004Z","msg":"Current reachable members","service":"history","component":"service-resolver","service":"history","addresses":["172.26.0.2:7234"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | Waiting for Temporal server to start...
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | {"level":"info","ts":"2022-07-05T15:02:00.147Z","msg":"RuntimeMetricsReporter started","service":"history","logging-call-at":"runtime.go:138"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.148Z","msg":"history starting","service":"history","logging-call-at":"service.go:96"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.148Z","msg":"Replication task fetchers started.","logging-call-at":"task_fetcher.go:141"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.148Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-context","logging-call-at":"controller_impl.go:263"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.148Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-context","logging-call-at":"controller_impl.go:263"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.148Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-context","logging-call-at":"controller_impl.go:263"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.148Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-context","logging-call-at":"controller_impl.go:263"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.149Z","msg":"none","component":"shard-controller","address":"172.26.0.2:7234","lifecycle":"Started","logging-call-at":"controller_impl.go:118"}
temporal | {"level":"info","ts":"2022-07-05T15:02:00.149Z","msg":"Starting to serve on history listener","service":"history","logging-call-at":"service.go:107"}
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | {"level":"info","ts":"2022-07-05T15:02:03.090Z","msg":"RuntimeMetricsReporter started","service":"matching","logging-call-at":"runtime.go:138"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.252Z","msg":"Range updated for shardID","shard-id":4,"address":"172.26.0.2:7234","shard-range-id":3,"previous-shard-range-id":2,"number":0,"next-number":0,"logging-call-at":"context_impl.go:1122"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.252Z","msg":"Acquired shard","shard-id":4,"address":"172.26.0.2:7234","logging-call-at":"context_impl.go:1755"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.252Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","lifecycle":"Starting","component":"shard-engine","logging-call-at":"context_impl.go:1386"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Starting","logging-call-at":"historyEngine.go:252"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"Parallel task processor started","shard-id":4,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"interleaved weighted round robin task scheduler started","shard-id":4,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"Timer queue processor started.","shard-id":4,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","component":"timer-queue-processor","logging-call-at":"timerQueueProcessorBase.go:139"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"Parallel task processor started","shard-id":4,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"interleaved weighted round robin task scheduler started","shard-id":4,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"Parallel task processor started","shard-id":4,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"interleaved weighted round robin task scheduler started","shard-id":4,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Started","logging-call-at":"historyEngine.go:269"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.273Z","msg":"none","shard-id":4,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-engine","logging-call-at":"context_impl.go:1389"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.992Z","msg":"Range updated for shardID","shard-id":1,"address":"172.26.0.2:7234","shard-range-id":4,"previous-shard-range-id":3,"number":0,"next-number":0,"logging-call-at":"context_impl.go:1122"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"Acquired shard","shard-id":1,"address":"172.26.0.2:7234","logging-call-at":"context_impl.go:1755"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","lifecycle":"Starting","component":"shard-engine","logging-call-at":"context_impl.go:1386"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Starting","logging-call-at":"historyEngine.go:252"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"Parallel task processor started","shard-id":1,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"interleaved weighted round robin task scheduler started","shard-id":1,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"Timer queue processor started.","shard-id":1,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","component":"timer-queue-processor","logging-call-at":"timerQueueProcessorBase.go:139"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"Parallel task processor started","shard-id":1,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"interleaved weighted round robin task scheduler started","shard-id":1,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.993Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.994Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.994Z","msg":"Parallel task processor started","shard-id":1,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.994Z","msg":"interleaved weighted round robin task scheduler started","shard-id":1,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.994Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.994Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.994Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Started","logging-call-at":"historyEngine.go:269"}
temporal | {"level":"info","ts":"2022-07-05T15:02:03.994Z","msg":"none","shard-id":1,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-engine","logging-call-at":"context_impl.go:1389"}
temporal | {"level":"info","ts":"2022-07-05T15:02:05.050Z","msg":"Membership heartbeat upserted successfully","service":"matching","address":"172.26.0.2","port":6935,"hostId":"617921bf-fc73-11ec-8de6-0242ac1a0002","logging-call-at":"rpMonitor.go:229"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.089Z","msg":"Range updated for shardID","shard-id":2,"address":"172.26.0.2:7234","shard-range-id":3,"previous-shard-range-id":2,"number":0,"next-number":0,"logging-call-at":"context_impl.go:1122"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.089Z","msg":"Acquired shard","shard-id":2,"address":"172.26.0.2:7234","logging-call-at":"context_impl.go:1755"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.089Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","lifecycle":"Starting","component":"shard-engine","logging-call-at":"context_impl.go:1386"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.089Z","msg":"bootstrap hosts fetched","service":"matching","bootstrap-hostports":"172.26.0.2:6934,172.26.0.2:6935","logging-call-at":"rpMonitor.go:271"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Starting","logging-call-at":"historyEngine.go:252"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"Parallel task processor started","shard-id":2,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"interleaved weighted round robin task scheduler started","shard-id":2,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"Timer queue processor started.","shard-id":2,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","component":"timer-queue-processor","logging-call-at":"timerQueueProcessorBase.go:139"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"Parallel task processor started","shard-id":2,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"interleaved weighted round robin task scheduler started","shard-id":2,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"Parallel task processor started","shard-id":2,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"interleaved weighted round robin task scheduler started","shard-id":2,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Started","logging-call-at":"historyEngine.go:269"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.090Z","msg":"none","shard-id":2,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-engine","logging-call-at":"context_impl.go:1389"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.116Z","msg":"Current reachable members","service":"matching","component":"service-resolver","service":"history","addresses":["172.26.0.2:7234"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.117Z","msg":"Current reachable members","service":"matching","component":"service-resolver","service":"matching","addresses":["172.26.0.2:7235"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.117Z","msg":"matching starting","service":"matching","logging-call-at":"service.go:91"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.117Z","msg":"Starting to serve on matching listener","service":"matching","logging-call-at":"service.go:102"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.118Z","msg":"none","component":"shard-controller","address":"172.26.0.2:7234","shard-update":"RingMembershipChangedEvent","number-processed":1,"number-deleted":0,"number":0,"logging-call-at":"controller_impl.go:310"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.118Z","msg":"Current reachable members","service":"history","component":"service-resolver","service":"matching","addresses":["172.26.0.2:7235"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | Waiting for Temporal server to start...
temporal | + echo 'Waiting for Temporal server to start...'
temporal | + sleep 1
temporal | {"level":"info","ts":"2022-07-05T15:02:06.724Z","msg":"Range updated for shardID","shard-id":3,"address":"172.26.0.2:7234","shard-range-id":4,"previous-shard-range-id":3,"number":0,"next-number":0,"logging-call-at":"context_impl.go:1122"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.724Z","msg":"Acquired shard","shard-id":3,"address":"172.26.0.2:7234","logging-call-at":"context_impl.go:1755"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.724Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","lifecycle":"Starting","component":"shard-engine","logging-call-at":"context_impl.go:1386"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.724Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Starting","logging-call-at":"historyEngine.go:252"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"Parallel task processor started","shard-id":3,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"interleaved weighted round robin task scheduler started","shard-id":3,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"Timer queue processor started.","shard-id":3,"address":"172.26.0.2:7234","component":"timer-queue-processor","cluster-name":"active","component":"timer-queue-processor","logging-call-at":"timerQueueProcessorBase.go:139"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"Parallel task processor started","shard-id":3,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"interleaved weighted round robin task scheduler started","shard-id":3,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","component":"transfer-queue-processor","cluster-name":"active","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"Parallel task processor started","shard-id":3,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"parallel_processor.go:98"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"interleaved weighted round robin task scheduler started","shard-id":3,"address":"172.26.0.2:7234","component":"visibility-queue-processor","logging-call-at":"interleaved_weighted_round_robin.go:109"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Starting","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:128"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","component":"visibility-queue-processor","lifecycle":"Started","component":"transfer-queue-processor","logging-call-at":"queueProcessor.go:134"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","component":"history-engine","lifecycle":"Started","logging-call-at":"historyEngine.go:269"}
temporal | {"level":"info","ts":"2022-07-05T15:02:06.725Z","msg":"none","shard-id":3,"address":"172.26.0.2:7234","lifecycle":"Started","component":"shard-engine","logging-call-at":"context_impl.go:1389"}
temporal | + tctl cluster health
temporal | + grep -q SERVING
temporal | {"level":"error","ts":"2022-07-05T15:02:07.810Z","msg":"Operation failed with internal error.","error":"GetWorkflowExecution: failed to get timer info. Error: Failed to get timer info. Error: context deadline exceeded","metric-scope":5,"logging-call-at":"persistenceMetricClients.go:1424","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:142\ngo.temporal.io/server/common/persistence.(*metricEmitter).updateErrorMetric\n\t/home/builder/temporal/common/persistence/persistenceMetricClients.go:1424\ngo.temporal.io/server/common/persistence.(*executionPersistenceClient).GetWorkflowExecution\n\t/home/builder/temporal/common/persistence/persistenceMetricClients.go:241\ngo.temporal.io/server/service/history/shard.(*ContextImpl).GetWorkflowExecution\n\t/home/builder/temporal/service/history/shard/context_impl.go:840\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry.func1\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:462\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:467\ngo.temporal.io/server/service/history/workflow.(*ContextImpl).LoadWorkflowExecution\n\t/home/builder/temporal/service/history/workflow/context.go:274\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).processStartExecution\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:115\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).Execute\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:90\ngo.temporal.io/server/service/history/queues.(*executableImpl).Execute\n\t/home/builder/temporal/service/history/queues/executable.go:161\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask.func1\n\t/home/builder/temporal/common/tasks/parallel_processor.go:207\ngo.temporal.io/server/common/backoff.Retry.func1\n\t/home/builder/temporal/common/backoff/retry.go:104\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/common/backoff.Retry\n\t/home/builder/temporal/common/backoff/retry.go:105\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:217\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).processTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:195"}
temporal | {"level":"error","ts":"2022-07-05T15:02:07.810Z","msg":"Persistent fetch operation Failure","shard-id":4,"address":"172.26.0.2:7234","wf-namespace-id":"32049b68-7872-4094-8e63-d0dd59896a83","wf-id":"temporal-sys-tq-scanner","wf-run-id":"d034be6e-fac2-4bb1-bdde-a5e36749c7e2","store-operation":"get-wf-execution","error":"context deadline exceeded","logging-call-at":"transaction_impl.go:489","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:142\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:489\ngo.temporal.io/server/service/history/workflow.(*ContextImpl).LoadWorkflowExecution\n\t/home/builder/temporal/service/history/workflow/context.go:274\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).processStartExecution\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:115\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).Execute\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:90\ngo.temporal.io/server/service/history/queues.(*executableImpl).Execute\n\t/home/builder/temporal/service/history/queues/executable.go:161\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask.func1\n\t/home/builder/temporal/common/tasks/parallel_processor.go:207\ngo.temporal.io/server/common/backoff.Retry.func1\n\t/home/builder/temporal/common/backoff/retry.go:104\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/common/backoff.Retry\n\t/home/builder/temporal/common/backoff/retry.go:105\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:217\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).processTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:195"}
temporal | {"level":"error","ts":"2022-07-05T15:02:07.811Z","msg":"Fail to process task","shard-id":4,"address":"172.26.0.2:7234","component":"visibility-queue-processor","wf-namespace-id":"32049b68-7872-4094-8e63-d0dd59896a83","wf-id":"temporal-sys-tq-scanner","wf-run-id":"d034be6e-fac2-4bb1-bdde-a5e36749c7e2","queue-task-id":1048580,"queue-task-visibility-timestamp":"2022-07-04T12:28:03.160Z","queue-task-type":"VisibilityStartExecution","queue-task":{"NamespaceID":"32049b68-7872-4094-8e63-d0dd59896a83","WorkflowID":"temporal-sys-tq-scanner","RunID":"d034be6e-fac2-4bb1-bdde-a5e36749c7e2","VisibilityTimestamp":"2022-07-04T12:28:03.160160056Z","TaskID":1048580,"Version":0},"wf-history-event-id":0,"error":"context deadline exceeded","lifecycle":"ProcessingFailed","logging-call-at":"lazy_logger.go:68","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:142\ngo.temporal.io/server/common/log.(*lazyLogger).Error\n\t/home/builder/temporal/common/log/lazy_logger.go:68\ngo.temporal.io/server/service/history/queues.(*executableImpl).HandleErr\n\t/home/builder/temporal/service/history/queues/executable.go:231\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask.func1\n\t/home/builder/temporal/common/tasks/parallel_processor.go:208\ngo.temporal.io/server/common/backoff.Retry.func1\n\t/home/builder/temporal/common/backoff/retry.go:104\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/common/backoff.Retry\n\t/home/builder/temporal/common/backoff/retry.go:105\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:217\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).processTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:195"}
temporal | {"level":"info","ts":"2022-07-05T15:02:08.534Z","msg":"RuntimeMetricsReporter started","service":"worker","logging-call-at":"runtime.go:138"}
temporal | {"level":"info","ts":"2022-07-05T15:02:09.667Z","msg":"Membership heartbeat upserted successfully","service":"worker","address":"172.26.0.2","port":6939,"hostId":"6711f647-fc73-11ec-8de6-0242ac1a0002","logging-call-at":"rpMonitor.go:229"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.106Z","msg":"bootstrap hosts fetched","service":"worker","bootstrap-hostports":"172.26.0.2:6934,172.26.0.2:6935,172.26.0.2:6939","logging-call-at":"rpMonitor.go:271"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.108Z","msg":"Current reachable members","service":"worker","component":"service-resolver","service":"matching","addresses":["172.26.0.2:7235"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.108Z","msg":"Current reachable members","service":"worker","component":"service-resolver","service":"history","addresses":["172.26.0.2:7234"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.108Z","msg":"Current reachable members","service":"worker","component":"service-resolver","service":"worker","addresses":["172.26.0.2:7239"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.109Z","msg":"worker starting","service":"worker","component":"worker","logging-call-at":"service.go:332"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.109Z","msg":"none","component":"shard-controller","address":"172.26.0.2:7234","shard-update":"RingMembershipChangedEvent","number-processed":1,"number-deleted":0,"number":0,"logging-call-at":"controller_impl.go:310"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.109Z","msg":"Current reachable members","service":"history","component":"service-resolver","service":"worker","addresses":["172.26.0.2:7239"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | {"level":"info","ts":"2022-07-05T15:02:10.145Z","msg":"Current reachable members","service":"matching","component":"service-resolver","service":"worker","addresses":["172.26.0.2:7239"],"logging-call-at":"rpServiceResolver.go:266"}
temporal | {"level":"error","ts":"2022-07-05T15:02:10.812Z","msg":"Operation failed with internal error.","error":"GetWorkflowExecution: failed to get child executionsRow info. Error: Failed to get timer info. Error: context deadline exceeded","metric-scope":5,"logging-call-at":"persistenceMetricClients.go:1424","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:142\ngo.temporal.io/server/common/persistence.(*metricEmitter).updateErrorMetric\n\t/home/builder/temporal/common/persistence/persistenceMetricClients.go:1424\ngo.temporal.io/server/common/persistence.(*executionPersistenceClient).GetWorkflowExecution\n\t/home/builder/temporal/common/persistence/persistenceMetricClients.go:241\ngo.temporal.io/server/service/history/shard.(*ContextImpl).GetWorkflowExecution\n\t/home/builder/temporal/service/history/shard/context_impl.go:840\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry.func1\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:462\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:467\ngo.temporal.io/server/service/history/workflow.(*ContextImpl).LoadWorkflowExecution\n\t/home/builder/temporal/service/history/workflow/context.go:274\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).processStartExecution\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:115\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).Execute\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:90\ngo.temporal.io/server/service/history/queues.(*executableImpl).Execute\n\t/home/builder/temporal/service/history/queues/executable.go:161\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask.func1\n\t/home/builder/temporal/common/tasks/parallel_processor.go:207\ngo.temporal.io/server/common/backoff.Retry.func1\n\t/home/builder/temporal/common/backoff/retry.go:104\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/common/backoff.Retry\n\t/home/builder/temporal/common/backoff/retry.go:105\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:217\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).processTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:195"}
temporal | {"level":"error","ts":"2022-07-05T15:02:10.812Z","msg":"Persistent fetch operation Failure","shard-id":4,"address":"172.26.0.2:7234","wf-namespace-id":"32049b68-7872-4094-8e63-d0dd59896a83","wf-id":"temporal-sys-tq-scanner","wf-run-id":"d034be6e-fac2-4bb1-bdde-a5e36749c7e2","store-operation":"get-wf-execution","error":"context deadline exceeded","logging-call-at":"transaction_impl.go:489","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:142\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:489\ngo.temporal.io/server/service/history/workflow.(*ContextImpl).LoadWorkflowExecution\n\t/home/builder/temporal/service/history/workflow/context.go:274\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).processStartExecution\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:115\ngo.temporal.io/server/service/history.(*visibilityQueueTaskExecutor).Execute\n\t/home/builder/temporal/service/history/visibilityQueueTaskExecutor.go:90\ngo.temporal.io/server/service/history/queues.(*executableImpl).Execute\n\t/home/builder/temporal/service/history/queues/executable.go:161\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask.func1\n\t/home/builder/temporal/common/tasks/parallel_processor.go:207\ngo.temporal.io/server/common/backoff.Retry.func1\n\t/home/builder/temporal/common/backoff/retry.go:104\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/common/backoff.Retry\n\t/home/builder/temporal/common/backoff/retry.go:105\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:217\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).processTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:195"}
temporal | {"level":"error","ts":"2022-07-05T15:02:10.812Z","msg":"Fail to process task","shard-id":4,"address":"172.26.0.2:7234","component":"visibility-queue-processor","wf-namespace-id":"32049b68-7872-4094-8e63-d0dd59896a83","wf-id":"temporal-sys-tq-scanner","wf-run-id":"d034be6e-fac2-4bb1-bdde-a5e36749c7e2","queue-task-id":1048580,"queue-task-visibility-timestamp":"2022-07-04T12:28:03.160Z","queue-task-type":"VisibilityStartExecution","queue-task":{"NamespaceID":"32049b68-7872-4094-8e63-d0dd59896a83","WorkflowID":"temporal-sys-tq-scanner","RunID":"d034be6e-fac2-4bb1-bdde-a5e36749c7e2","VisibilityTimestamp":"2022-07-04T12:28:03.160160056Z","TaskID":1048580,"Version":0},"wf-history-event-id":0,"error":"context deadline exceeded","lifecycle":"ProcessingFailed","logging-call-at":"lazy_logger.go:68","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:142\ngo.temporal.io/server/common/log.(*lazyLogger).Error\n\t/home/builder/temporal/common/log/lazy_logger.go:68\ngo.temporal.io/server/service/history/queues.(*executableImpl).HandleErr\n\t/home/builder/temporal/service/history/queues/executable.go:231\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask.func1\n\t/home/builder/temporal/common/tasks/parallel_processor.go:208\ngo.temporal.io/server/common/backoff.Retry.func1\n\t/home/builder/temporal/common/backoff/retry.go:104\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/common/backoff.Retry\n\t/home/builder/temporal/common/backoff/retry.go:105\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:217\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).processTask\n\t/home/builder/temporal/common/tasks/parallel_processor.go:195"}
temporal | {"level":"error","ts":"2022-07-05T15:02:11.534Z","msg":"Operation failed with internal error.","error":"GetWorkflowExecution: failed to get buffered events. Error: getBufferedEvents operation failed. Select failed: context deadline exceeded","metric-scope":5,"logging-call-at":"persistenceMetricClients.go:1424","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Error\n\t/home/builder/temporal/common/log/zap_logger.go:142\ngo.temporal.io/server/common/persistence.(*metricEmitter).updateErrorMetric\n\t/home/builder/temporal/common/persistence/persistenceMetricClients.go:1424\ngo.temporal.io/server/common/persistence.(*executionPersistenceClient).GetWorkflowExecution\n\t/home/builder/temporal/common/persistence/persistenceMetricClients.go:241\ngo.temporal.io/server/service/history/shard.(*ContextImpl).GetWorkflowExecution\n\t/home/builder/temporal/service/history/shard/context_impl.go:840\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry.func1\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:462\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/service/history/workflow.getWorkflowExecutionWithRetry\n\t/home/builder/temporal/service/history/workflow/transaction_impl.go:467\ngo.temporal.io/server/service/history/workflow.(*ContextImpl).LoadWorkflowExecution\n\t/home/builder/temporal/service/history/workflow/context.go:274\ngo.temporal.io/server/service/history.LoadMutableStateForTask\n\t/home/builder/temporal/service/history/nDCTaskUtil.go:142\ngo.temporal.io/server/service/history.loadMutableStateForTimerTask\n\t/home/builder/temporal/service/history/nDCTaskUtil.go:123\ngo.temporal.io/server/service/history.(*timerQueueActiveTaskExecutor).executeWorkflowTaskTimeoutTask\n\t/home/builder/temporal/service/history/timerQueueActiveTaskExecutor.go:292\ngo.temporal.io/server/service/history.(*timerQueueActiveTaskExecutor).Execute\n\t/home/builder/temporal/service/history/timerQueueActiveTaskExecutor.go:109\ngo.temporal.io/server/service/history/queues.(*executorWrapper).Execute\n\t/home/builder/temporal/service/history/queues/executor_wrapper.go:67\ngo.temporal.io/server/service/history/queues.(*executableImpl).Execute\n\t/home/builder/temporal/service/history/queues/executable.go:161\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask.func1\n\t/home/builder/temporal/common/tasks/parallel_processor.go:207\ngo.temporal.io/server/common/backoff.Retry.func1\n\t/home/builder/temporal/common/backoff/retry.go:104\ngo.temporal.io/server/common/backoff.RetryContext\n\t/home/builder/temporal/common/backoff/retry.go:125\ngo.temporal.io/server/common/backoff.Retry\n\t/home/builder/temporal/common/backoff/retry.go:105\ngo.temporal.io/server/common/tasks.(*ParallelProcessor).executeTask\n\t/home/builder/temporal/common/tasks/parallel_proce
@Mayowa-Ojo: I tried to reproduce with your two databases and succeeded in running the TypeScript example, using the docker compose file from this very gist 😕
Now, there are at least three things that differ between my setup and yours:
1. The connection strings / env variables I used (I created additional credentials for your dbs). May I ask you to try mine? They are password-protected and shared in a format that lets you retrieve them only once, by clicking on this link. The password to unprotect them is the name of the PlanetScale org where you host the temporal DB. (For the general shape of these settings, see the sketch after this list.)
2. The specific version of https://github.com/temporalio/temporal - I am running successfully with https://github.com/planetscale/temporal/tree/main, which is currently 28 commits behind the upstream master branch. If changing the env variables does not change anything, I would suggest trying planetscale/temporal@a431b8a. I will also retest with the latest HEAD.
3. The actual machine / architecture and location you are running on - I was testing successfully on my local Mac with an Intel chipset and on amd64 Linux using GitHub Codespaces (same Go version as in your logs). If 1) and 2) are not the culprits, it may be worth checking whether you can get it to run in a GitHub Codespace or Gitpod.
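
For reference, here is the general shape of the connection settings from item 1, written as the env variables the auto-setup entrypoint reads. All values are placeholders, and the variable names should be double-checked against docker-compose-planetscale.yml in this gist:

```sh
# Placeholder values - substitute your own PlanetScale branch credentials.
# Variable names follow temporal's docker auto-setup conventions; verify
# them against docker-compose-planetscale.yml before relying on them.
export DB=mysql                               # storage driver
export DB_PORT=3306                           # PlanetScale speaks MySQL on 3306
export MYSQL_SEEDS=<branch-host>.us-east-1.psdb.cloud
export MYSQL_USER=<username>
export MYSQL_PWD=<password>
export DBNAME=temporal                        # primary store database
export VISIBILITY_DBNAME=temporal_visibility  # visibility store database
```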
Looking forward to your feedback 😊
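
PS: if the `context deadline exceeded` errors at the end of your log persist after trying 1) through 3), it may also be worth ruling out basic connectivity/TLS problems between the Docker host and PlanetScale. A minimal check with the stock `mysql` client (placeholders, not real credentials; note that PlanetScale connections require TLS):

```sh
# Minimal reachability/TLS check against the PlanetScale branch
# (placeholders - use your own host, username, and database).
mysql -h <branch-host>.us-east-1.psdb.cloud -P 3306 \
      -u <username> -p --ssl-mode=VERIFY_IDENTITY \
      -e 'SELECT 1' temporal
```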